
Data Engineer II - Mexico

Western Governors University

Guadalajara

On-site

USD 30,000 - 50,000

Full-time

Posted 10 days ago


Job Summary

A leading university is seeking a Data Engineer to develop and maintain ETL/ELT data pipelines. The role involves ensuring data integrity, creating reports, and working in an agile environment. Ideal candidates will have a strong background in data engineering and relevant tools.

Experience

  • 4 years of experience in Data Engineering or related fields.
  • Equivalent relevant experience may substitute for education.

Responsibilities

  • Develops and builds ETL/ELT data pipelines for analysis.
  • Creates and maintains optimal data pipeline architecture.
  • Delivers ad-hoc and analytical reports to internal users.

Skills

Communication
Agile Methodology
Data Integrity

Education

Bachelor's Degree in Management Information Systems
Bachelor's Degree in Computer Science

Tools

Jira
Confluence
GitHub
Python
Java
Scala
Databricks
Hadoop
Spark
Kafka
Power BI
Tableau

Job Description

If you’re passionate about building a better future for individuals, communities, and our country—and you’re committed to working hard to play your part in building that future—consider WGU as the next step in your career.

Driven by a mission to expand access to higher education through online, competency-based degree programs, WGU is also committed to being a great place to work for a diverse workforce of student-focused professionals. The university has pioneered a new way to learn in the 21st century, one that has received praise from academic, industry, government, and media leaders. Whatever your role, working for WGU gives you a part to play in helping students graduate, creating a better tomorrow for themselves and their families.

  • Develops and builds ETL/ELT data pipelines for use in data analysis.

  • Creates and maintains optimal data pipeline architecture.

  • Keeps data separated and secure across multiple cloud environments.

  • Assembles large, complex data sets that meet functional and non-functional business requirements.

  • Delivers ad-hoc and analytical reports to internal users and teams.

  • Monitors and maintains ETL/ELT jobs and troubleshoots load issues.

  • Manages change requests/ticket queues for analytical reports and ETL/ELT jobs.

  • Performs data/technology discovery from new sources and third-party applications for data ingestion.

  • Creates complex reports and dashboards in Cognos and Tableau.

  • Ingests and transforms structured, semi-structured, and unstructured data from sources including relational databases, NoSQL, external APIs, JSON, XML, delimited files, and more.

  • Works and delivers in agile methodology for new development projects. Delivers efficient and effective solutions on time.

  • Analyzes and understands data sources and designs a data model for data capture and ETL/ELT.

  • Identifies bugs, applies fixes, and checks data quality via process/pipeline audits.

  • Uses industry best practices for code development, testing, implementation, and documentation.

  • Performs other job-related duties as assigned.

Knowledge, Skills, and Abilities

  • Excellent verbal and written communication skills, along with technical documentation skills.

  • Ability to work with team members, and across teams, for product delivery.

  • Ability to work in an agile environment with timely delivery of ETL/ELT pipelines and reports.

  • Knowledge and experience using tools like Jira, Confluence, and GitHub.

  • Ability to develop processes for the audit of data integrity.

  • Knowledge and experience with validation and testing development to analyze and debug issues.

  • Experience with relational SQL and NoSQL databases.

  • Knowledge and experience with object-oriented/object function scripting languages, like Python, Java, and Scala.

  • Knowledge and experience with big data tools, like Databricks, Hadoop, Spark, Kafka, etc.

  • Exposure to analytical reporting tools, preferably Power BI and Tableau.

Minimum Qualifications

  • 4 years of experience in Data Engineering, Data Integration, Big Data, Business Intelligence, or Software Engineering.

  • Bachelor's Degree in Management Information Systems, Computer Science, or a related field.

  • Equivalent relevant experience performing the essential functions of this job may substitute for education degree requirements. Generally, equivalent relevant experience is defined as 1 year of experience for 1 year of education, and is at the discretion of the hiring manager.

Physical Requirements

  • Prolonged periods of sitting at a desk and working on a computer.

  • Must be able to lift up to 15 pounds at times.



Location: Guadalajara


Learn more about our WGU Mexico Team on the WGU careers site.
