Contract: 12 months, renewable
Location: Central
Key Responsibilities:
- Design, build, and optimize scalable data pipelines and ETL processes.
- Integrate data from multiple sources using Informatica/IDMC or similar tools.
- Develop and maintain data solutions on Databricks using Python for data processing and transformation.
- Collaborate with analytics teams to deliver clean, structured data for reporting and visualization.
- Ensure data quality, security, and compliance with organizational standards.
- Troubleshoot and resolve data-related issues in production environments.
- Document processes and maintain best practices for data engineering workflows.
Required Skills & Qualifications:
Primary Skills (Must-Have):
- Strong hands‑on experience with Informatica/IDMC, or with Databricks and Python (expertise in at least one platform is mandatory).
Secondary Skills:
- Familiarity with Tableau or Oracle Analytics Server (OAS) for data visualization and reporting.
General Requirements:
- Solid understanding of data integration, ETL concepts, and data modeling.
- Strong problem‑solving and analytical skills.
- Ability to work collaboratively in cross‑functional teams.
Preferred / Nice‑to‑Have:
- AWS Cloud Practitioner certification.
- Databricks or IDMC Data Engineer certification.
- Experience working in cloud‑based data environments.