IT Data Engineer

Tungsten Automation

Remote

MYR 100,000 - 140,000

Full time

3 days ago

Job summary

A leading automation firm is seeking an IT Data Engineer in Kuala Lumpur. This role focuses on building and maintaining robust data infrastructure to support data-driven decision-making. Responsibilities include developing scalable data pipelines and managing databases. Candidates should have cloud platform expertise, experience in big data technologies, and strong skills in SQL and Python. The position offers a fully remote work arrangement and excellent career growth opportunities.

Benefits

Fixed allowance for shift work
Remote work option
Career growth opportunities

Qualifications

  • 5+ years of experience in data engineering or BI/reporting roles with a strong back-end focus.
  • Proficiency in data warehousing design and management.

Responsibilities

  • Design, build, and maintain scalable data pipelines.
  • Develop and manage databases and data warehouses.
  • Write and maintain efficient data processing code.
  • Ensure data quality and consistency across systems.
  • Collaborate with teams to deliver aligned data solutions.
  • Leverage cloud platforms for scalable data infrastructure.

Skills

Cloud platform proficiency (Azure, AWS, GCP)
Big Data technologies (Snowflake, Hadoop, Spark)
SQL and Python skills
ETL/ELT processes
Containerization tools (Docker, Kubernetes)
Version control (Git)
Streaming data technologies (Kafka, Spark Streaming)
Data governance knowledge
Data visualization (Power BI, Tableau)
Analytical and problem-solving skills
Communication skills
Detail-oriented
Familiarity with Agile methodologies
Prompting AI systems

Education

Bachelor's degree in Computer Science or related field

Tools

SQL Server
Pandas
PySpark
Apache Airflow
Azure Data Factory
Power BI
Tableau

Job description

IT Data Engineer

Tracking Code: 222266-973

Job Location

Level 31 Menara Prestige, No 1 Jalan Pinang, Kuala Lumpur,

Job Level

Not Applicable

Category

Information Technology / Information Systems

Position Type

Full-Time/Regular

Before reading further, please take note:

  • Candidates may be required to work AMS or EMEA hours
  • A fixed allowance will be provided for working such shifts
  • Fully remote

The IT Data Engineer plays a critical role in enabling data-driven decision-making across the organization by designing, building, and maintaining robust data infrastructure. This position is responsible for developing scalable data pipelines, managing databases and data warehouses, and ensuring the integrity, accessibility, and quality of enterprise data. Through thoughtful architecture and efficient processing systems, the Data Engineer supports analytical initiatives and empowers teams with reliable, well-organized data assets.

Key Responsibilities
  • Design, build, and maintain scalable data pipelines to extract, transform, and load data from diverse sources into centralized repositories (e.g., data warehouses, data lakes); a minimal illustrative sketch follows this list
  • Develop and manage databases, data warehouses, and data lakes to support analytics and reporting needs
  • Write and maintain efficient code and scripts for data processing, transformation, and manipulation
  • Ensure data quality, consistency, and reliability across all systems and pipelines
  • Implement and uphold data security, privacy, and access controls in compliance with governance standards
  • Collaborate with cross-functional teams to deliver data solutions aligned with business objectives
  • Monitor and optimize the performance, scalability, and cost-efficiency of data infrastructure
  • Create and maintain clear documentation for data architecture, processes, and workflows
  • Stay current with emerging data engineering technologies and recommend improvements to existing systems
  • Design and implement data models and schemas to support analytical and machine learning use cases
  • Build curated datasets and data marts to enable self-service analytics for business users
  • Establish monitoring and alerting systems to proactively detect and resolve data pipeline issues
  • Support machine learning operations by collaborating with data scientists on feature engineering and model deployment
  • Leverage cloud platforms (e.g., AWS, Azure, GCP) to build and maintain modern, scalable data infrastructure
  • Contribute to data governance initiatives by ensuring compliance with internal policies and external regulations
  • Utilize AI-enabled tools (e.g., chatbots, document automation, analytics assistants) to improve efficiency and accuracy and to streamline routine tasks, while following company AI governance and data privacy standards
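
To make the pipeline responsibility in the first bullet above concrete, the sketch below shows one minimal way such an extract-transform-load job could be orchestrated with Apache Airflow and Pandas (both named in this posting). It is an illustrative sketch only: the DAG id, file paths, column names, and staging table are invented placeholders, and a production pipeline would load into the warehouse rather than print a row count.

    # Hypothetical sketch only: a minimal daily Airflow DAG illustrating an
    # extract -> transform -> load pipeline. DAG id, paths, and table names
    # are invented placeholders, not details from this posting.
    # (Uses the Airflow 2.x-style `schedule` argument.)
    from datetime import datetime

    import pandas as pd
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Read a raw source file and stage it as Parquet (placeholder paths).
        pd.read_csv("/data/landing/orders.csv").to_parquet("/data/staging/orders.parquet")

    def transform():
        # Deduplicate and normalise column names before loading.
        df = pd.read_parquet("/data/staging/orders.parquet")
        df = df.drop_duplicates(subset=["order_id"])
        df.columns = [c.strip().lower() for c in df.columns]
        df.to_parquet("/data/curated/orders.parquet")

    def load():
        # A real pipeline would write to the warehouse (e.g. via a connection hook);
        # here it only reports the row count.
        df = pd.read_parquet("/data/curated/orders.parquet")
        print(f"would load {len(df)} rows into stg_orders")

    with DAG(
        dag_id="orders_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> transform_task >> load_task
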
Required Skills
  • Proficiency in designing, building, and supporting large-scale data warehouses using cloud platforms such as Azure, AWS, or Google Cloud
  • Strong experience with Big Data technologies (e.g., Snowflake, Hadoop, Spark), data modeling (OLTP/OLAP), and database design and management
  • Skilled in writing and debugging complex SQL scripts and Python-based data processing using libraries such as Pandas and PySpark (see the sketch after this list)
  • Hands-on experience with ETL/ELT processes, orchestration tools (e.g., Apache Airflow, Azure Data Factory), and CI/CD practices
  • Familiarity with containerization and deployment tools like Docker and Kubernetes
  • Experience with version control systems such as Git
  • Knowledge of streaming data technologies (e.g., Kafka, Spark Streaming) for real-time data processing
  • Understanding of data governance, privacy practices, and compliance regulations
  • Existing technical knowledge of modern systems software, protocols, and standards, including SQL Server, Snowflake/Databricks/Fabric, and Power BI
  • Experience with data visualization tools such as Power BI or Tableau
  • Ability to translate business requirements into scalable technical solutions
  • Strong analytical, problem-solving, and troubleshooting skills
  • Excellent communication and collaboration abilities across technical and non-technical teams
  • Detail-oriented with a strong focus on data quality, consistency, and reliability
  • Comfortable working independently and in a fast-paced, team-oriented environment
  • Familiarity with Agile or Scrum methodologies is a plus
  • Skills in prompting AI systems and assessing output quality
  • Ability to leverage AI to ideate, develop, and scale solutions to the needs of the department
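
As a companion to the Pandas/PySpark skill listed above, here is a minimal, hypothetical PySpark sketch of the kind of batch processing described: reading raw files, deduplicating and filtering them, and writing a partitioned, curated dataset for analytics. The paths, column names, and application name are placeholders, not details from this role.

    # Hypothetical sketch only: PySpark batch processing that reads raw data,
    # cleans it, and writes a curated, partitioned dataset. Paths and column
    # names are invented placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_curation").getOrCreate()

    raw = spark.read.parquet("/data/landing/orders/")

    curated = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0)
    )

    # Partitioned output keeps downstream SQL/BI scans cheaper.
    curated.write.mode("overwrite").partitionBy("order_date").parquet(
        "/data/curated/orders/"
    )
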
Required Experience
  • Bachelor’s degree in Computer Science, Software Engineering, or a related field
  • 5+ years of experience in data engineering or BI/reporting roles with a strong back-end focus