Databricks Developer

Unison Consulting

Singapore

On-site

SGD 60,000 - 90,000

Full time

4 days ago
Job summary

A leading company in Singapore is seeking a skilled Databricks Developer to strengthen its data engineering team. The successful candidate will apply expertise in Databricks, Informatica, and Java to design and optimize ETL processes for data integration and management. You will work with modern cloud technologies and ensure data quality while collaborating with cross-functional teams. Ideal candidates hold a degree in Computer Science and relevant certifications.

Qualifications

  • 5+ years of experience in data engineering or a related field.
  • Familiarity with Delta Lake, data lakes, and data warehousing concepts.
  • Certifications such as Databricks Certified Data Engineer are preferred.

Responsibilities

  • Design, develop, and maintain ETL pipelines using Databricks (Scala) and Informatica PowerCenter.
  • Integrate and transform data from various sources into the enterprise data platform.
  • Optimize data workflows for performance and cost-efficiency.

Skills

Databricks (Scala)
Informatica
Java
SQL
Apache Spark
Cloud environments
Data lakes

Education

Bachelor's or Master's degree in Computer Science

Tools

Databricks
Informatica PowerCenter
Azure Data Lake
AWS S3

Job description

Job Title: Databricks Developer

Location: Singapore

Experience Required: 5+ years

Job Summary:

We are looking for a skilled Databricks Developer with strong expertise in Databricks (Scala), Informatica, and Java to join our data engineering team in Singapore. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and integration workflows using a combination of modern cloud technologies and traditional ETL tools.

Key Responsibilities:

  • Design, develop, and maintain ETL pipelines using Databricks (Scala) and Informatica PowerCenter
  • Integrate and transform data from various sources into the enterprise data platform
  • Develop high-performance Spark applications in Scala for large-scale data processing
  • Build reusable components and services using Java for ETL orchestration or middleware integration
  • Optimize data workflows for performance and cost-efficiency in cloud environments (Azure or AWS)
  • Ensure data quality, consistency, and governance across all ETL processes
  • Collaborate with cross-functional teams to understand business requirements and deliver reliable data solutions
  • Implement CI/CD and monitoring processes for ETL workflows

Required Skills:

  • Strong hands-on experience with Databricks and Apache Spark (Scala)
  • Proficient in Informatica PowerCenter for traditional ETL processes
  • Solid programming knowledge in Core Java for backend data integration or utility services
  • Strong experience in SQL and performance tuning
  • Experience working in cloud environments (Azure Data Lake, ADF, AWS S3, Glue, etc.)
  • Familiarity with Delta Lake, data lakes, and data warehousing concepts
  • Excellent problem-solving, debugging, and performance tuning skills

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or related field
  • Relevant certifications (e.g., Databricks Certified Data Engineer, Informatica Developer) are highly desirable