Data Engineer

TechTiera Corporation

Butterworth

On-site

MYR 60,000 - 80,000

Full time

5 days ago

Job summary

A technology solutions company in Butterworth, Malaysia, is seeking a Data Engineer to design and manage robust data pipelines and develop ETL processes. The ideal candidate will be proficient in SQL, have experience with cloud platforms (preferably AWS), and strong communication skills. This role includes optimizing data systems and ensuring data quality and security across multiple sources.

Qualifications

  • Proficient in SQL and at least one programming language.
  • Experience with ETL tools and workflow orchestration.
  • Hands-on experience with AWS and familiarity with big data tools.

Responsibilities

  • Design, build, and manage robust and scalable data pipelines.
  • Develop ETL processes to acquire and transform data.
  • Optimize data systems for performance, reliability, and scalability.

Skills

SQL
Python
Java
Scala
Data governance
Problem-solving
Communication skills

Tools

AWS Glue
Apache Airflow
Talend
Hadoop
Spark
Kafka
Git

Job description

Responsibilities:

  • Design, build, and manage robust and scalable data pipelines (batch and real-time).
  • Develop ETL processes to acquire, transform, and integrate data from multiple sources.
  • Build and maintain data warehouses, data lakes, and other storage solutions.
  • Optimize data systems for performance, reliability, and scalability.
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Ensure data quality, consistency, and integrity across systems.
  • Implement data governance, privacy, and security best practices.
  • Monitor and troubleshoot data pipelines and flows, ensuring high availability.
  • Document data architecture, flows, and system designs.

Requirements:

  • Proficient in SQL and at least one programming language (Python, Java, or Scala).
  • Experience with ETL tools and workflow orchestration (e.g., AWS Glue, Apache Airflow, Luigi, Talend).
  • Hands-on experience with cloud platforms (AWS is a must; GCP or Azure is good to have) and services like S3, Redshift, BigQuery, or Data Factory.
  • Familiarity with big data tools such as Hadoop, Spark, and Kafka.
  • Knowledge of data modeling and warehousing concepts.
  • Experience with version control tools (Git) and CI/CD processes.
  • Strong problem-solving and communication skills.