Job Search and Career Advice Platform


Senior Consultant, Data Apps Management

Singtel Group

Singapore

On-site

SGD 70,000 - 100,000

Full time

27 days ago


Job summary

A leading technology services firm in Singapore seeks an experienced Data Engineer to develop and maintain scalable data pipelines. The ideal candidate should have over 3 years of relevant experience, extensive SQL knowledge, and the ability to work with Azure and data orchestration tools. Responsibilities include data integration, governance, and ensuring data integrity. This role offers a chance to drive impactful projects in a dynamic environment.

Qualifications

  • Proven experience as a Data Engineer (3+ years) with a strong record of delivering scalable data pipelines.
  • Extensive experience designing data solutions including data modeling.
  • Hands-on experience developing data processing jobs in PySpark/SQL.
  • Experience using DevOps tools, Git workflow, and building CI/CD pipelines.

Responsibilities

  • Develop and maintain data pipelines for ETL processes.
  • Integrate data from multiple sources including APIs and databases.
  • Implement data governance and best practices.
  • Monitor and support data pipelines for reliability and integrity.

Skills

Data pipeline development
Data integration
Data transformation
SQL proficiency
Experience with data orchestration tools
Cloud computing expertise

Tools

PySpark
Azure
Apache Kafka
Airflow

Job description


NCS is a leading technology services firm that operates across the Asia Pacific region in over 20 cities, providing consulting, digital services, technology solutions, and more. We believe in harnessing the power of technology to achieve extraordinary things, creating lasting value and impact for our communities, partners, and people. Our diverse workforce of 13,000 has delivered large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region.

What will you do?

  • Data Pipeline Development: Develop and maintain data pipelines that extract, transform, and load (ETL) data from various sources into a centralized data storage system, such as a data warehouse or data lake.
  • Data Integration: Integrate data from multiple sources and systems, including databases, APIs, log files, streaming platforms, and external data providers.
  • Data Transformation and Processing: Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures, handle missing or inconsistent data, and prepare the data for analysis, reporting, or machine learning tasks.
  • Contribute to common frameworks and best practices in code development, deployment, and automation/orchestration of data pipelines.
  • Implement data governance in line with company standards.
  • Partner with Data Analytics and Product leaders to design best practices and standards for developing and productionising analytic pipelines.
  • Partner with Infrastructure leaders on architecture approaches to advance the data and analytics platform, including exploring new tools and techniques that leverage the cloud environment (Azure, Databricks, and others).
  • Monitoring and Support: Monitor data pipelines and data systems to detect and resolve issues promptly. Develop monitoring tools, alerts, and automated error-handling mechanisms to ensure data integrity and system reliability.
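The extract-transform-load flow described above can be sketched in plain Python (a stdlib stand-in for the PySpark jobs the role involves; the `region`/`amount` schema is invented for illustration, not taken from the posting):

```python
import csv
import io

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV text into records (stand-in for reading a source system)."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop records with missing amounts and normalize region codes."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # handle missing data by skipping the record
        cleaned.append({"region": row["region"].strip().upper(),
                        "amount": float(row["amount"])})
    return cleaned

def load(rows: list[dict]) -> dict[str, float]:
    """Load: aggregate totals per region (stand-in for writing to a warehouse table)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = "region,amount\nsg ,100\nmy,\nSG,50\n"
totals = load(transform(extract(raw)))
print(totals)  # {'SG': 150.0} -- the empty-amount row is dropped, 'sg ' is normalized
```

A production version would swap the stdlib pieces for PySpark DataFrames and a real sink, but the extract/transform/load separation stays the same.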

The ideal candidate should possess:

Must haves

  • Proven experience in a Data Engineering role (3+ years), with a strong track record of delivering scalable data pipelines.
  • Extensive experience designing data solutions, including data modelling.
  • Extensive hands-on experience developing data processing jobs (PySpark / SQL) that demonstrate a strong understanding of software engineering principles.
  • Experience orchestrating data pipelines with technologies such as ADF or Airflow.
  • Experience working with both real-time and batch data.
  • Experience building data pipelines on Azure; experience with AWS data pipelines is a plus.
  • Fluency in SQL (any flavour), including window functions and other advanced features.
  • Understanding of DevOps tools, Git workflows, and building CI/CD pipelines.
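As an illustration of the window-function fluency listed above, here is a minimal running-total query using Python's built-in sqlite3 (the `trades` table and its columns are invented for this sketch, not taken from the posting):

```python
import sqlite3

# Illustrative in-memory table; schema and data are assumptions for the example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trader TEXT, ts INTEGER, amount REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("alice", 1, 100.0), ("alice", 2, 40.0),
    ("bob", 1, 75.0), ("bob", 2, 25.0),
])

# Running total per trader: SUM(...) OVER a partition ordered by time.
# Requires SQLite >= 3.25 for window-function support.
rows = conn.execute("""
    SELECT trader, ts, amount,
           SUM(amount) OVER (PARTITION BY trader ORDER BY ts) AS running_total
    FROM trades
    ORDER BY trader, ts
""").fetchall()

for row in rows:
    print(row)
# ('alice', 1, 100.0, 100.0)
# ('alice', 2, 40.0, 140.0)
# ('bob', 1, 75.0, 75.0)
# ('bob', 2, 25.0, 100.0)
```

The same `PARTITION BY ... ORDER BY` pattern carries over directly to Spark SQL and other SQL dialects.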

Nice to Have

  • Domain knowledge of commodities, covering Sales, Trading, Risk, Supply Chain, Customer Interaction, and related areas.
  • Familiarity with Scrum methodology and experience working in a Scrum team, including an understanding of Scrum roles, events, artifacts, and rules, and the ability to apply them in practice.
  • Experience with streaming data processing technologies such as Apache Kafka, Apache Flink, or AWS Kinesis, including the ability to design and implement real-time data processing pipelines.

We are driven by our AEIOU beliefs of Adventure, Excellence, Integrity, Ownership, and Unity, and we seek individuals who embody these values in both their professional and personal lives. We are committed to our Impact: Valuing our clients, Growing our people, and Creating our future.

Together, we make the extraordinary happen.

Learn more about us at ncs.co and visit our LinkedIn career site.
