Job Title: Databricks Developer
Location: Singapore
Experience Required: 5+ years
Job Summary:
We are looking for a skilled Databricks Developer with strong expertise in Databricks (Scala), Informatica, and Java to join our data engineering team in Singapore. The ideal candidate will design, develop, and optimize data pipelines and integration workflows using a combination of modern cloud technologies and traditional ETL tools.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Databricks (Scala) and Informatica PowerCenter
- Integrate and transform data from various sources into the enterprise data platform
- Develop high-performance Spark applications in Scala for large-scale data processing
- Build reusable components and services using Java for ETL orchestration or middleware integration
- Optimize data workflows for performance and cost-efficiency in cloud environments (Azure or AWS)
- Ensure data quality, consistency, and governance across all ETL processes
- Collaborate with cross-functional teams to understand business requirements and deliver reliable data solutions
- Implement CI/CD and monitoring processes for ETL workflows
Required Skills:
- Strong hands-on experience with Databricks and Apache Spark (Scala)
- Proficient in Informatica PowerCenter for traditional ETL processes
- Solid programming knowledge in Core Java for backend data integration or utility services
- Strong experience in SQL and performance tuning
- Experience working in cloud environments (Azure Data Lake, Azure Data Factory, AWS S3, AWS Glue, etc.)
- Familiarity with Delta Lake, data lakes, and data warehousing concepts
- Excellent problem-solving, debugging, and performance tuning skills
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Relevant certifications (e.g., Databricks Certified Data Engineer, Informatica Developer) are highly desirable