
Senior Data Engineer (100% Remote) (LATAM Only)

AllShore Talent

Remote

BRL 80,000 - 120,000

Part-time

Posted today

Job Summary

A leading remote staffing company is seeking a Senior Data Engineer to design and optimize scalable data pipelines using Databricks, support data modeling, and develop Oracle database integrations. The ideal candidate has extensive experience with Python, PySpark, and SQL. This is a fully remote independent contractor role at 35 to 40 hours per week, with an hourly rate between 23.75 and 25 USD.

Qualifications

  • Extensive hands-on experience with Databricks.
  • Strong proficiency in Python, PySpark, and SQL.
  • Demonstrated experience with cloud data engineering.

Responsibilities

  • Design and optimize scalable data pipelines using Databricks.
  • Support data modeling and pipeline orchestration within cloud environments.
  • Develop data integrations involving Oracle databases.

Skills

Databricks
Python
PySpark
SQL
ETL / ELT pipelines
Azure Data Factory

Tools

Microsoft Fabric
Oracle Databases
Microsoft SQL Server

Job Description

About AllShore Talent

AllShore Talent is a leading remote staffing company, offering top-tier professionals working 100% remote to businesses worldwide. Specializing in IT and software development, design, administrative support, digital marketing, and more, AllShore connects organizations with skilled talent to meet diverse business needs.

About The Role

As our client’s organization continues to grow and expand its support for internal clients, they are seeking an experienced Senior Data Engineer with strong hands-on expertise in modern data engineering tools and cloud-based analytics platforms. The role spans two internal departments, with Databricks serving as the highest-priority and most critical skillset for immediate consideration.

The ideal candidate is a proactive problem solver with deep technical experience, the ability to work autonomously, and a strong track record of building scalable data pipelines in enterprise environments.

Key Responsibilities

Facilities Department (Primary Scope)
  • Design, develop, and optimize scalable data pipelines using Databricks (highest priority)
  • Build, automate, and maintain ETL / ELT workflows aligned with organizational standards
  • Support data modeling, data quality, and pipeline orchestration within cloud environments
  • Collaborate with cross-functional teams to deliver reliable, production-ready data solutions
  • Leverage Microsoft Fabric for analytics, data integration, or reporting needs (medium priority)
  • Provide support for legacy systems when needed, including SQL Server, SSIS, SSRS, and SSAS

Family History Department (Secondary Scope)
  • Develop and maintain data integrations and pipelines involving Oracle database environments
  • Work with various Data Services to ensure accessibility, accuracy, and reliability of datasets
  • Partner with internal stakeholders to refine requirements and optimize data workflows

Required Qualifications
  • Extensive hands‑on experience with Databricks, including Spark, Delta Lake, and notebook‑based development
  • Strong proficiency in Python, PySpark, SQL, and distributed data processing
  • Demonstrated experience with cloud data engineering in enterprise settings
  • Experience with Microsoft Fabric, Power BI, or Fabric‑aligned tooling
  • Working knowledge of the legacy Microsoft data stack (SQL Server, SSIS, SSRS, SSAS)
  • Experience working with Oracle databases and associated data services
  • Ability to develop scalable, secure ETL / ELT pipelines using best practices
  • Strong documentation, communication, and stakeholder‑management skills

Nice-to-Have Skills
  • Experience migrating or modernizing legacy BI / data systems
  • Background in data quality frameworks, testing, or monitoring
  • Familiarity with Azure Data Factory, Synapse, or other Microsoft cloud analytics components

Who You Are
  • You are a senior‑level data engineer who excels in fast‑paced environments
  • You can work independently as a contractor without requiring heavy oversight
  • You are comfortable partnering with multiple stakeholder groups and adapting to shifting priorities
  • You bring a strong sense of ownership, accountability, and problem‑solving

Engagement Details
  • Independent contractor role
  • 35 to 40 hours per week
  • Hourly rate: 23.75 USD to 25 USD, depending on experience
  • Fully remote position