Tangerang Selatan
On-site
IDR 15.000.000 - 25.000.000
Full-time
Job summary
A leading data solutions company in Tangerang Selatan is seeking a Data Engineer to design and maintain data pipelines. Responsibilities include optimizing data storage solutions and collaborating with analytics teams. Ideal candidates hold a degree in Computer Science and bring strong SQL and programming skills; experience with ETL tools and a passion for data engineering are essential. The role offers a dynamic, collaborative working environment.
Qualifications
- Strong foundation in SQL and experience with relational databases.
- Proficiency in at least one programming language such as Python.
- Experience with ETL tools and data pipeline development.
Responsibilities
- Design, develop, and maintain scalable data pipelines.
- Implement and optimize data storage solutions.
- Collaborate with Data Scientists and Analytics teams.
Skills
SQL
Python
Data engineering
Problem-solving
Communication
Education
Bachelor's degree in Computer Science
Tools
PostgreSQL
MySQL
ETL tools
Git
Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support analytics and business intelligence initiatives
- Implement and optimize data storage solutions, ensuring data quality, accessibility, and security
- Collaborate with Data Scientists, Analytics teams, and other stakeholders to understand data requirements and deliver effective solutions
- Build and maintain data warehouses and data lakes, ensuring efficient data organization and retrieval
- Write clean, maintainable, and well-documented code following team standards and best practices
- Participate in code reviews and provide constructive feedback to team members
- Monitor and optimize data pipeline performance and efficiency
- Create and maintain comprehensive documentation for data processes and architectures
- Implement data validation and quality control measures
- Contribute to technical discussions and architectural planning sessions
- Share knowledge with team members and participate in mentoring activities
Requirements
Required Skills
- Bachelor's degree in Computer Science, Data Engineering, or related field
- Strong foundation in SQL and experience with relational databases (e.g., PostgreSQL, MySQL)
- Proficiency in at least one programming language (e.g., Python, Java, Scala)
- Experience with ETL tools and data pipeline development
- Understanding of data warehouse concepts and dimensional modeling
- Familiarity with version control systems (Git) and collaborative development workflows
- Basic knowledge of data security practices and compliance requirements
- Strong problem-solving skills and analytical thinking abilities
- Excellent communication skills in both technical and non-technical contexts
- Demonstrated interest in data engineering through projects or work experience
Preferred Skills
- Experience with big data technologies (e.g., BigQuery, Spark)
- Knowledge of stream processing and messaging technologies (e.g., Kafka, RabbitMQ)
- Experience with data modeling and optimization techniques
- Knowledge of data governance principles
- Understanding of machine learning pipelines and requirements
- Experience with data visualization tools
- Understanding of CI/CD practices for data pipelines
- Knowledge of shell scripting for automation (e.g., Bash)
- Familiarity with Python web frameworks (e.g., FastAPI, Flask) is a plus
- Proficiency with AI-powered coding assistants (e.g., GitHub Copilot, Cursor) is a plus
- Familiarity with data warehouse tools or relevant Apache frameworks (e.g., Airflow, Spark)