Tangerang Selatan
On-site
IDR 200.000.000 - 300.000.000
Full time
19 days ago
Job summary
A leading company in data solutions is seeking a data specialist to manage ETL processes and databases. The role involves designing data models, ensuring data accuracy, and collaborating with data scientists. Candidates should have a strong background in programming, cloud platforms, and data visualization tools.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in programming languages (Python, Java).
- Experience with ETL processes and tools like AWS Glue.
Responsibilities
- Create and maintain ETL processes using AWS Glue.
- Manage SQL and NoSQL databases for efficient data storage.
- Develop dashboards and reports using Tableau.
Skills
Python
Java
SQL
NoSQL
ETL
Hadoop
Spark
Data visualization
Git
Problem-solving
Communication
Attention to detail
Adaptability
Education
Bachelor's degree in Computer Science
Bachelor's degree in Information Technology
Bachelor's degree in a related field
Tools
AWS Glue
Amazon Redshift
Tableau
AWS
Azure
GCP
Responsibilities
- Create and maintain ETL (Extract, Transform, Load) processes to move data from various sources to data storage systems using tools like AWS Glue.
- Manage databases (SQL and NoSQL) to ensure data is stored efficiently and securely, including working with Amazon Redshift.
- Design data models and schemas for efficient data storage and retrieval.
- Implement processes to ensure data accuracy and quality.
- Utilize big data tools like Hadoop and Spark, along with distributed computing techniques, to process large datasets.
- Work with cloud providers like AWS, Azure, or GCP to store and process data in the cloud.
- Develop dashboards and reports using visualization tools like Tableau to support data-driven decisions.
- Collaborate with data scientists and analysts to understand data requirements and deliver data solutions.
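As a rough illustration of the extract-transform-load flow described above, here is a minimal self-contained sketch. The column names, sample data, and the in-memory sqlite3 target are all hypothetical stand-ins; a production pipeline for this role would use AWS Glue jobs writing to Amazon Redshift instead.

```python
import csv
import io
import sqlite3

# Hypothetical raw source data; a real job would read from S3 or a source DB.
RAW_CSV = """order_id,amount,currency
1001,250000,IDR
1002,,IDR
1003,175000,IDR
"""

def extract(text):
    """Extract: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and cast types,
    a simple stand-in for the data-quality checks the role calls for."""
    return [
        (int(r["order_id"]), int(r["amount"]), r["currency"])
        for r in rows
        if r["amount"]
    ]

def load(records, conn):
    """Load: write cleaned records into the target table
    (sqlite3 here in place of a warehouse like Amazon Redshift)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount INTEGER, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # one row was dropped for its missing amount
```

The same extract/transform/load split maps directly onto a Glue job, where the extract and load steps would use Glue's data catalog and connectors rather than hand-written I/O.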
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proficiency in one or more programming languages (e.g., Python, Java).
- Knowledge of databases, including SQL and NoSQL.
- Experience with ETL processes and tools like AWS Glue.
- Familiarity with big data technologies like Hadoop and Spark.
- Cloud platform experience (e.g., AWS, Azure, GCP), including services like Amazon Redshift.
- Data modeling expertise.
- Experience with data visualization tools like Tableau.
- Version control using tools like Git.
- Problem-solving abilities.
- Strong communication skills for collaboration with team members and stakeholders.
- Attention to detail to maintain data accuracy.
- Adaptability to learn and use new technologies and tools.