Data Engineer

ASIAPAC TECHNOLOGY PTE. LTD.

Singapore

On-site

SGD 75,000 - 100,000

Full time

Yesterday

Job summary

A technology firm in Singapore is seeking a Data Engineer to design and build data integration and analytics solutions for sales and marketing. The role requires 5+ years of software development and expertise in data architectures across cloud platforms. Responsibilities include leading the development of ETL/ELT pipelines and collaborating with business teams to refine data requirements. This position demands strong skills in big data technologies and data modeling.

Qualifications

  • Minimum 5 years of software development experience.
  • 3 years of data application or data platform architecture experience.
  • Deep technology expertise in data structures and data processes.

Responsibilities

  • Develop and maintain architectural roadmap for data platform.
  • Lead identification of technologies & tools for data platform.
  • Design and support data ETL/ELT pipelines.

Skills

Data & Analytics solutions
Data Architecture
ETL/ELT pipelines
Cloud computing technologies
Data modelling
Big Data
Data visualization tools

Tools

Azure Databricks
Informatica
AWS Glue
Azure Data Factory
Tableau
Power BI
Python

Job description

Overview:

AsiaPac Technology Pte. Ltd. is looking for a Data Engineer to lead the design and build of strategic data integration and data analytics solutions for its sales and marketing reporting and analytics processes.

Required Work Experience:

At least 3 years of demonstrable experience implementing Data & Analytics solutions that solve complex data problems, including developing and testing modular, reusable, efficient and scalable solutions, plus experience designing and developing complex data architectures in a multi-cloud environment.

Job Description:

  • Develop and maintain the architectural roadmap for the data platform, data products and data services, ensuring alignment with business and enterprise architecture strategies and standards.
  • Lead the identification of technologies and tools for the data platform. Articulate technology solutions and explain the competitive advantages of the various alternatives.
  • Design, document, develop, test, install and support complex ETL/ELT data pipelines using Azure Databricks, Informatica, AWS Glue, Azure Data Factory, or similar platforms.
  • Support existing designs in Oracle, SQL Server, Azure Databricks, Snowflake and Delta Lake.
  • Participate in the design and evolution of DevOps for Client Analytics.
  • Work closely with business and product teams to collect, assess, and refine requirements for the data platform.
  • Design conceptual, logical, and physical data models suitable for reports and dashboards.
  • Define and govern data architecture artifacts, such as design blueprints, design standards, best practices, and documentation.
  • Communicate complex processes and data models, and articulate the benefits of particular designs to different target audiences (especially business users).

Qualifications:

Experienced technology implementer with a minimum of 5 years of software development experience, including 3 years of data application or data platform architecture experience, with deep technology expertise in the following:

  • Expertise in design and management of complex data structures and data processes.
  • Deep knowledge of and hands-on experience with big data and cloud computing technologies.
  • Strong service architecture and development experience with high performance and scalability.
  • Extensive hands-on experience in designing and developing data models, integrating data from multiple sources, designing data flows, and building data pipelines for the data platform (data lake and data warehouse).
  • Experience building data pipelines for BI tools and data lake implementations using cloud native tools such as AWS Glue, Azure Data Factory, Google Dataflow and/or Python.
  • Minimum 3 years of experience in designing enterprise data platforms, especially around data modelling, data warehousing or advanced analytics solutions.
  • Minimum 3 years of hands-on experience managing and implementing data-related projects.
  • Familiarity with Enterprise Resource Planning and Big Data Analytics platforms (e.g., MS Dynamics, Salesforce, SAP, Oracle, Teradata, or Snowflake).
  • Practical working experience with Hadoop, Spark, or other distributed big data frameworks.
  • Practical working experience using visualization tools like Tableau, Power BI or Qlik.