PT Japfa Comfeed Indonesia
Surabaya
On-site
IDR 200.000.000 - 300.000.000
Full time
Job summary
A leading Indonesian company is seeking a Data Engineer to design and maintain ETL/ELT pipelines, collaborate with analytics teams, and optimize data models for business intelligence. Candidates should hold a Bachelor's degree in Information Technology or a related field, with strong proficiency in SQL and familiarity with cloud data platforms. This is an excellent opportunity for fresh graduates and experienced professionals alike.
Responsibilities
- Design, develop, and maintain ETL/ELT pipelines for data ingestion, transformation, and integration.
- Collaborate with business and analytics teams to define data requirements and translate them into technical solutions.
- Build and optimize data models to support BI tools, dashboards, and advanced analytics.
- Ensure data accuracy, consistency, and compliance with data governance policies.
- Automate data workflows and monitor pipeline performance for reliability and scalability.
- Work closely with data architects and functional teams to improve overall data ecosystem efficiency.
Requirements
- Bachelor’s degree in Information Technology, Information Systems, or a related field; Master’s degree is a plus.
- Experience in data engineering, data transformation, or data integration roles is a plus; fresh graduates are welcome to apply.
- Proven experience designing, developing, and maintaining ETL/ELT data pipelines for structured and semi-structured data using tools such as dbt, Informatica, Talend, or Apache Airflow.
- Hands‑on experience working with cloud‑based data platforms such as Snowflake, Databricks, Google BigQuery, Amazon Redshift, or Azure Synapse.
- Strong proficiency in SQL and practical experience with at least one programming language (e.g., Python, Scala, or Java) for automation and data manipulation.
- Demonstrated experience integrating and preparing data for Business Intelligence (BI) tools, such as Power BI, Qlik Sense, or Tableau, ensuring optimized data structures for visualization and analysis.
- Experience collaborating closely with business analysts, functional teams, and data consumers to translate business requirements into technical data solutions.
- Exposure to version control systems (Git) and CI/CD workflows, ideally within an Agile development environment.
Technical Skills
- SQL, Python, or Scala.
- ETL/ELT tools (e.g., dbt, Informatica, Talend).
- Cloud data platforms (e.g., Snowflake, Databricks, BigQuery, Redshift).
- CI/CD and version control (e.g., Git, Jenkins).
Soft Skills
- Analytical Thinking – Ability to interpret complex data processes, identify root causes of issues, and propose practical solutions.
- Business Acumen – Understands business processes and can translate functional requirements into technical data solutions that add measurable value.
- Communication Skills – Able to communicate clearly with both technical and non‑technical stakeholders, including data architects, analysts, and business users.
- Collaboration & Teamwork – Works effectively within cross‑functional teams, fostering cooperation between IT, data, and business units.
- Problem‑Solving Mindset – Approaches challenges systematically, balancing technical precision with business needs.
- Attention to Detail – Ensures data accuracy and consistency throughout transformation and integration processes.
- Continuous Learning – Stays updated with emerging data technologies, engineering frameworks, and best practices.