
Senior Data Engineer (100% Remote) (LATAM Only)

Jobgether

Remote

MXN 200,000 - 400,000

Part-time

Posted 9 days ago


Job description

A dynamic technology partner is hiring a Senior Data Engineer to design and maintain scalable data solutions in a fully remote capacity across Latin America. The ideal candidate should have extensive experience with Databricks, strong skills in Python, ETL/ELT processes, and cloud engineering. This role offers autonomy and collaboration with a skilled international team on cutting-edge projects, providing a unique opportunity for personal and professional growth.

Benefits

Hourly rate of 23.75 to 25 USD
Flexible 35–40 hour work week
Opportunity to work with advanced tools

Qualifications

  • Hands-on experience with Databricks and its components.
  • Proficiency in Python, PySpark, and SQL.
  • Experience in cloud data engineering and building data pipelines.

Responsibilities

  • Design and optimize scalable data pipelines.
  • Collaborate with teams to deliver reliable data solutions.
  • Support ETL/ELT workflow automation.

Skills

Databricks
Spark
Python
PySpark
SQL
ETL/ELT pipelines
Cloud data engineering
Data quality frameworks

Tools

Microsoft Fabric
Oracle databases
Power BI
Azure Data Factory

Job details

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Data Engineer in Latin America.

In this role, you will design, develop, and maintain scalable, high-performance data pipelines and analytics solutions across multiple departments. You will work autonomously in a remote environment, applying modern data engineering tools and cloud-based platforms to deliver production-ready data solutions. Your work will directly impact business intelligence, analytics, and reporting, enabling teams to make data-driven decisions efficiently. Collaboration with cross-functional teams and internal stakeholders will be key to translating complex requirements into robust, reliable data workflows. This position offers opportunities to work with cutting-edge technologies and contribute to enterprise-level data modernization projects.

Accountabilities
  • Design, develop, and optimize scalable data pipelines using Databricks, Spark, Delta Lake, and notebook-based workflows.
  • Build, automate, and maintain ETL/ELT workflows aligned with organizational standards.
  • Support data modeling, pipeline orchestration, and data quality initiatives within cloud environments.
  • Collaborate with cross-functional teams to deliver reliable, production-ready data solutions.
  • Develop and maintain data integrations with Oracle database environments and Microsoft Fabric for analytics and reporting.
  • Provide support for legacy Microsoft data stack tools (SQL Server, SSIS, SSRS, SSAS) when needed.
  • Partner with stakeholders to refine requirements, optimize data workflows, and ensure accessibility and reliability of datasets.
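To give a flavor of the extract-transform-load pattern at the core of these accountabilities, here is a minimal sketch in plain Python. It is purely illustrative: the role itself would use Databricks, PySpark, and Delta Lake rather than in-memory dictionaries, and all data and field names below are hypothetical.

```python
# Minimal ETL sketch: extract raw records, transform (clean + derive),
# and load into an in-memory "table". All data here is hypothetical;
# the actual stack would be PySpark/Delta Lake on Databricks.

def extract():
    # Stand-in for reading from a source system (e.g. Oracle, flat files).
    return [
        {"id": 1, "amount": "100.50", "region": "mx"},
        {"id": 2, "amount": "250.00", "region": "br"},
        {"id": 3, "amount": "bad",    "region": "mx"},  # dirty row
    ]

def transform(rows):
    # Enforce types, drop unparseable rows, normalize region codes.
    clean = []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # data-quality rule: skip malformed amounts
        clean.append({"id": r["id"], "amount": amount,
                      "region": r["region"].upper()})
    return clean

def load(rows, table):
    # Idempotent upsert keyed on "id", so re-runs don't duplicate data.
    for r in rows:
        table[r["id"]] = r
    return table

warehouse = {}
load(transform(extract()), warehouse)
print(len(warehouse))          # 2 (the dirty row was dropped)
print(warehouse[1]["region"])  # MX
```

The same extract/transform/load separation carries over to Spark: `extract` becomes a DataFrame read, `transform` a chain of DataFrame operations, and `load` a Delta Lake `MERGE` for idempotent upserts.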
Requirements
  • Extensive hands‑on experience with Databricks, including Spark, Delta Lake, and notebook‑based development.
  • Strong proficiency in Python, PySpark, SQL, and distributed data processing.
  • Proven experience with cloud data engineering and enterprise‑scale data pipelines.
  • Familiarity with Microsoft Fabric, Power BI, or related tooling.
  • Working knowledge of legacy Microsoft data stack (SQL Server, SSIS, SSRS, SSAS) and Oracle databases.
  • Ability to develop scalable, secure ETL/ELT pipelines following best practices.
  • Strong documentation, communication, and stakeholder‑management skills.
  • Nice‑to‑haves: experience with data quality frameworks, testing, monitoring, Azure Data Factory, Synapse, or migrating legacy BI/data systems.
Benefits
  • Independent contractor role with hourly rate of 23.75 USD to 25 USD, depending on experience.
  • Fully remote position within Latin America.
  • Flexible 35–40 hour work week.
  • Opportunity to work with cutting‑edge data engineering tools and cloud platforms.
  • High degree of autonomy and ownership over technical deliverables.
  • Collaboration with a skilled, international team on impactful enterprise projects.