Design, build, and measure complex ELT jobs to process disparate data sources and form high-integrity, high-quality, clean data assets.
Work on a range of projects, including batch pipelines, data modeling, and data mart solutions, as part of collaborative project teams, implementing robust data collection and processing pipelines that meet specific business needs.
Goals:
Execute and provide feedback on data modeling policies, procedures, processes, and standards.
Assist with capturing and documenting system flows and other pertinent technical information about data, database design, and systems.
Develop data quality standards and tools for ensuring accuracy.
Work across departments to understand new data patterns.
Translate high-level business requirements into technical specifications.
Required:
Bachelor’s degree in computer science or engineering.
3+ years of experience with data analytics, data modeling, and database design.
3+ years of coding, scripting (Python, Java, PySpark), and design experience.
Experience with ETL methodologies and tools.
Experience with Vertica.
Expertise in tuning and troubleshooting SQL.
Strong data integrity, analytical, and multitasking skills.
Excellent communication, problem-solving, and organizational skills.
Able to work independently.
Additional / Preferred Skills:
Familiarity with agile project delivery processes.
Experience with Airflow.
Ability to manage diverse projects impacting multiple roles and processes.
Ability to troubleshoot problem areas and identify data gaps and issues.
Ability to adapt to a fast-changing environment.
Experience with Python.
Basic knowledge of database technologies (Vertica, Redshift, etc.).
Experience designing and implementing automated ETL processes.