
Procurement Data Engineer III

Johnson Controls

Nuevo León

On-site

MXN 700,000 - 900,000

Full-time

Posted today

Job Description

A global technology company in Nuevo León is seeking a Data Engineer to support the end-to-end management of data. The role involves deploying data processes and managing complex data sets while ensuring compliance with data governance standards. Ideal candidates will have a Bachelor's degree and over 4 years of experience in data engineering or related fields. Strong SQL skills and familiarity with Azure Data Factory are essential. This position offers the opportunity to work within a diverse team in a fast-paced environment.

Background

  • 4+ years of relevant professional experience in BI Engineering or data-related roles.
  • Knowledge in DW/DL concepts and data modeling.
  • Experience building data pipelines and architectures.

Responsibilities

  • Deploy data ingestion processes using Azure Data Factory.
  • Build and design scalable ETL/ELT pipelines.
  • Manage data models and ensure compliance with architecture standards.

Skills

Strong SQL knowledge
ETL/ELT development experience
Project management skills
Problem solving
Communication skills
Familiarity with Python

Education

Bachelor’s degree in Engineering, Computer Science, Data Science or similar

Tools

Azure Data Factory
dbt
Snowflake
Azure DevOps
Overview

Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will support the end-to-end (E2E) management of our data, including ETL/ELT, DW/DL, data staging, and data governance, and will manage the different layers of data required for successful BI & reporting for the PEC. You will work with multiple types of data spanning several functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.

Responsibilities
  • Serve as the main technical resource for any data-related requirement
  • Demonstrate an ability to communicate technical knowledge through project management and contributions to product strategy
  • Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse
  • Build and design robust, modular, and scalable ETL/ELT pipelines with Azure Data Factory, Python, and/or dbt
  • Assemble large, complex, robust, and modular data sets that meet functional and non-functional business requirements
  • Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Data Lakehouse technologies and ADF
  • Develop data models that enable DataViz, Reporting and Advanced Data Analytics, striving for optimal performance across all data models
  • Maintain conceptual, logical, and physical data models along with corresponding metadata
  • Manage the DevOps pipeline deployment model, including automated testing procedures
  • Deploy data stewardship and data governance across our data warehouse, to cleanse and enhance our data, using knowledge bases and business rules
  • Ensure compliance with system architecture, methods, standards, practices and participate in their creation
  • Clearly articulate and effectively influence both business and technical teams
  • Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities
  • Support the deployment of a global data standard
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
  • Support Rate Repository management as required (including Rate Card uploads to our DW)
  • Other Procurement duties as assigned
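To make the pipeline duties above concrete, here is a minimal sketch of a modular extract-transform-load step in Python. This is purely illustrative: the vendor data, column names, and business rule are invented for the example, and a production pipeline in this role would run through Azure Data Factory or dbt rather than in-process Python.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV text (stand-in for a source system) into rows.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    # Transform: apply simple business rules, e.g. normalize vendor
    # names and cast spend to a numeric type.
    return [
        {
            "vendor": r["vendor"].strip().upper(),
            "spend_usd": round(float(r["spend"]), 2),
        }
        for r in rows
    ]

def load(rows: list[dict]) -> list[dict]:
    # Load: stand-in for a warehouse write; here we just stage rows
    # that pass a data-quality rule (positive spend).
    return [r for r in rows if r["spend_usd"] > 0]

# Hypothetical source data for illustration only.
raw = "vendor,spend\n acme ,1200.5\nGlobex,0\n"
staged = load(transform(extract(raw)))
```

Keeping each stage as a separate function mirrors the "modular pipeline" idea in the bullets: individual steps can be unit-tested and recombined without touching the rest of the flow.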
Qualifications
  • Bachelor’s degree in related field (Engineering, Computer Science, Data Science or similar)
  • 4+ years of relevant professional experience in BI Engineering, data modeling, data engineering, software engineering or other relevant roles. Strong SQL knowledge and experience working with relational databases
  • Knowledge in DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems and metadata management
  • Experience building and optimizing data pipelines, architectures, and data sets
  • Azure Data Engineering certification preferred (DP-203)
  • ETL/ELT development experience (4+ years); ADF, dbt, and Snowflake preferred
  • Ability to resolve ETL/ELT problems by proposing and implementing tactical/strategic solutions
  • Strong project management and organizational skills
  • Experience with object-oriented scripting languages: Python, Scala, R, etc.
  • Experience with NoSQL databases is a plus to support the transition from On-Prem to Cloud
  • Excellent problem solving, critical thinking, and communication skills
  • Relevant experience with Azure DevOps (CI/CD, git/repo management) is a plus
  • Due to the global nature of the role, proficiency in English language is a must