
Data Senior Analyst (DLP)

EVERSAFE ACADEMY PTE. LTD.

Singapore

On-site

SGD 80,000 - 120,000

Full time


Job summary

A data-focused company in Singapore is seeking an experienced professional to design, build, and maintain data pipelines. Candidates should have a Bachelor’s or Master’s degree in a relevant field, strong programming skills in Python, Java, or Scala, and at least 6 years of experience in data management. The role involves collaborating with data teams to ensure data quality and support decision-making processes. Competitive compensation and benefits are offered.

Qualifications

  • At least 6 years of experience in data analysis, data interpretation, and data management.
  • Implemented at least 4 programs involving data dashboards, data cleansing, and data synchronization.

Responsibilities

  • Design, build, and maintain data pipelines for data transformation.
  • Develop infrastructure to support data science initiatives.
  • Ensure data quality and consistency across data sources.
  • Collaborate with stakeholders to support data-driven decision-making.

Skills

Python
Java
Scala
SQL
NoSQL

Education

Bachelor’s or Master’s degree in computer science, data science, or a related field

Tools

Apache Airflow
AWS Glue
Azure Data Factory
Redshift
Snowflake
BigQuery
Hadoop
Spark
Flink

Job description

Responsibilities

  • Design, build, and maintain data pipelines to move and transform data from various sources into a target location such as a data warehouse or data lake (a minimal pipeline sketch follows this list).
  • Develop and maintain the infrastructure required to support data science initiatives, including data warehousing, ETL or ELT tools, and data integration solutions.
  • Ensure data quality, accuracy, and consistency across multiple data sources.
  • Work with data scientists, data analysts, and other stakeholders to understand data requirements and provide support for data-driven decision-making.
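
To illustrate the kind of pipeline work described above, here is a minimal sketch of a daily extract-transform-load job written as an Apache Airflow DAG (one of the ETL tools named in this posting). The DAG name, schedule, and task bodies are placeholder assumptions for illustration only, not details of the actual role or stack, and the sketch assumes Apache Airflow 2.4 or later.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical placeholder: pull raw records from a source system.
    print("extracting source records")


def transform():
    # Hypothetical placeholder: cleanse and reshape the extracted records.
    print("transforming records")


def load():
    # Hypothetical placeholder: write the result to a warehouse or lake table.
    print("loading into the target table")


# dag_id and schedule are illustrative assumptions, not part of the posting.
with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    (
        PythonOperator(task_id="extract", python_callable=extract)
        >> PythonOperator(task_id="transform", python_callable=transform)
        >> PythonOperator(task_id="load", python_callable=load)
    )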

Qualifications

  • Bachelor’s or Master’s degree in computer science, data science, or a related field.
  • Strong programming skills in one or more languages such as Python, Java, or Scala.
  • Strong experience with SQL, NoSQL, and data warehousing technologies such as Redshift, Snowflake, or BigQuery.
  • Experience with ETL tools such as Apache Airflow, AWS Glue, or Azure Data Factory.
  • Familiarity with distributed computing frameworks such as Hadoop, Spark, or Flink.
  • Knowledge of data modelling, data integration, and data quality concepts.
  • Strong communication skills and ability to work collaboratively with cross-functional teams.

Experience

  • At least 6 years of experience in data analysis, data interpretation, and data management.
  • Implemented at least 4 programs involving data dashboards, data cleansing, and data synchronization.