
Jr Data Engineer

AutoZone Business and Technology Store Support Center

Chihuahua

On-site

MXN 200,000 - 400,000

Full-time

Posted yesterday

Job description

A technology support center in Chihuahua is seeking a Programmer Analyst to build and maintain data pipelines in Google Cloud. This role involves transforming data sources into BigQuery datasets, contributing to LookML models, and implementing data quality checks. Ideal candidates will have a degree in Computer Science and some experience in data engineering or analytics. Familiarity with SQL, Google Cloud, and version control is essential. Proficiency in English is required for team collaboration and documentation.

Background

  • 0 to 2 years in data engineering, analytics engineering, or business intelligence.
  • Proficient in English (written and spoken).

Responsibilities

  • Build and maintain ELT pipelines in Google Cloud.
  • Design and document datasets that support marketing analytics.
  • Contribute to Looker by refining LookML models.
  • Implement data quality checks and maintain runbooks.
  • Translate requirements into technical tasks with Marketing and IT.

Skills

  • SQL in BigQuery
  • Google Cloud fundamentals
  • Version control with Git
  • Data modeling concepts

Education

BA or BS in Computer Science or equivalent

Tools

  • Looker
  • Dataform or dbt
  • Python

Full job description
SUMMARY

The Programmer Analyst will enable marketing analytics by building and maintaining our Google Cloud data foundation. They will transform vendor and internal sources into dependable BigQuery datasets and views, run scheduled jobs, and keep documentation, monitoring, and alerts in good shape. They will work with Marketing and IT to turn business questions into technical plans, support Looker models and dashboards, and make data easy to find, trust, and reuse across the organization.

They will help consolidate customer identifiers and events from multiple systems, reconcile keys, define update rules, and sustain fresh profiles that support activation and measurement. Core activities include writing SQL in BigQuery, designing tables and views for performance and usability, implementing data quality checks, and applying appropriate access controls for sensitive data. The role values clear communication, ownership of outcomes, and a consistent focus on reliability and usability. Familiarity with modern AI tools for development and general use is strongly preferred.
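
To make the identifier-consolidation work concrete, here is a minimal BigQuery sketch of one common update-rule pattern: keep the newest record per key. All dataset, table, and column names are hypothetical and do not describe our actual stack.

  -- Upsert incoming CRM rows into a unified profile table, keeping only
  -- the most recent record per customer key. Assumes the daily extract
  -- has at most one row per customer_key. All names are illustrative.
  MERGE `analytics.customer_profiles` AS target
  USING (
    SELECT customer_key, email, last_event_ts
    FROM `staging.crm_daily_extract`
  ) AS source
  ON target.customer_key = source.customer_key
  WHEN MATCHED AND source.last_event_ts > target.last_event_ts THEN
    UPDATE SET email = source.email, last_event_ts = source.last_event_ts
  WHEN NOT MATCHED THEN
    INSERT (customer_key, email, last_event_ts)
    VALUES (source.customer_key, source.email, source.last_event_ts);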

RESPONSIBILITIES
  • Build and maintain ELT pipelines in Google Cloud that land, transform, and publish data to BigQuery from internal systems and third-party sources using Cloud Storage, scheduled queries, and lightweight serverless jobs with Cloud Scheduler and Cloud Functions or Cloud Run.
  • Design and document datasets, tables, and views that support analytics use cases in marketing, including clear naming, partitioning and clustering, and performance considerations (a table sketch follows this list).
  • Contribute to Looker by adding or refining LookML models, dimensions, measures, and Explores, and by maintaining dashboards with sensible refresh and performance settings.
  • Implement data quality checks and basic alerting for pipeline health and data completeness. Maintain runbooks and respond to incidents with timely fixes and clear communication.
  • Partner with Marketing and IT to translate requirements into technical tasks, propose simple solution designs, estimate effort, and track delivery.
  • Integrate identifiers and events from multiple systems, aligning schemas and keys, defining update and merge rules, and keeping profiles current for activation and measurement needs.
  • Apply security and stewardship practices, including IAM least-privilege, careful handling of PII, and clear lineage and assumptions in documentation.
  • Use modern AI assistants when appropriate to accelerate routine tasks such as drafting SQL, documenting changes, or generating tests.
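
As referenced in the datasets bullet above, here is a minimal sketch of a partitioned and clustered BigQuery table; the names and column list are hypothetical examples only.

  -- Hypothetical daily campaign-events table. Partitioning prunes scans
  -- to the dates a query touches; clustering co-locates rows that are
  -- commonly filtered together.
  CREATE TABLE IF NOT EXISTS `marketing.campaign_events`
  (
    event_date   DATE,
    campaign_id  STRING,
    channel      STRING,
    customer_key STRING,
    revenue      NUMERIC
  )
  PARTITION BY event_date
  CLUSTER BY campaign_id, channel
  OPTIONS (description = 'Daily campaign events for marketing analytics');
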
REQUIREMENTS

Degree: BA or BS in Computer Science, Information Systems, Engineering, Statistics, or equivalent practical experience through internships, capstone projects, or personal projects.

Years of experience: 0 to 2 years in data engineering, analytics engineering, or business intelligence.

Required technical skills
  • SQL in BigQuery, including joins, window functions, common table expressions, and performance basics (a short example follows this list).
  • Google Cloud fundamentals for data work, including BigQuery, Cloud Storage, IAM basics, and scheduling with Cloud Scheduler.
  • Version control with Git and GitHub, including branching and pull requests.
  • Data modeling concepts that support analytics, including star or snowflake patterns and pragmatic table design.
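
As noted in the SQL item above, here is a short example combining a common table expression with a window function to pick the most recent order per customer; table and column names are hypothetical.

  -- Rank each customer's orders by recency, then keep the newest one.
  WITH ranked_orders AS (
    SELECT
      customer_key,
      order_id,
      order_ts,
      ROW_NUMBER() OVER (
        PARTITION BY customer_key
        ORDER BY order_ts DESC
      ) AS recency_rank
    FROM `sales.orders`
  )
  SELECT customer_key, order_id, order_ts
  FROM ranked_orders
  WHERE recency_rank = 1;
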
Preferred skills
  • Looker and LookML for semantic modeling and dashboards.
  • Dataform or dbt for ELT orchestration and testing (a test sketch follows this list).
  • Airflow or Cloud Composer familiarity.
  • Python or a similar language for utilities, API integrations, or data quality checks.
  • Working with REST and JSON for vendor data feeds.
  • Experience using AI assistants for coding and documentation and interest in Vertex AI.
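
Per the Dataform or dbt item above: both tools let you express a data quality test as a query that returns the rows violating an expectation, so the test passes when the result is empty. The names here are hypothetical.

  -- dbt-style singular test (Dataform assertions work the same way):
  -- any returned row counts as a failure.
  SELECT customer_key, last_event_ts
  FROM `analytics.customer_profiles`
  WHERE customer_key IS NULL
     OR last_event_ts > CURRENT_TIMESTAMP();  -- no events from the future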

Other: Proficient in English (written and spoken)
