Job Search and Career Advice Platform


Senior Data Engineer

WatchGuard Technologies

São Paulo

On-site

BRL 160,000 - 200,000

Full-time

Yesterday
Be among the first applicants


Job summary

A leading technology company in São Paulo is seeking an experienced Senior Data Engineer to design, develop, and maintain scalable data pipelines and data systems. In this role, you will work with cross-functional teams to deliver data solutions that support analytics and business intelligence initiatives. The ideal candidate has extensive experience in data architecture, ETL/ELT processes, and cloud technologies such as AWS or Azure. The position offers a competitive salary, a benefits package, and opportunities for career growth in an innovative team environment.

Benefits

Competitive salary
Comprehensive benefits package
Opportunities for career growth

Qualifications

  • Solid experience as a Data Engineer or similar role in data architecture and pipeline development.
  • Strong experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Advanced knowledge of ETL/ELT processes, data modeling, and data warehousing (e.g., Snowflake, Redshift).

Responsibilities

  • Design, develop, and optimize data pipelines to extract, transform, and load (ETL/ELT) data from various sources.
  • Build and manage data models and data warehouses that support business intelligence.
  • Integrate APIs to connect data sources and facilitate real-time data processing.

Skills

Data architecture
ETL/ELT processes
AWS
Azure
Google Cloud
SQL
Python
Apache Airflow
Hadoop
Tableau

Education

Master’s degree in Computer Science, Engineering, Data Science, or a related field

Tools

AWS Glue
Azure Data Factory
Apache Airflow

Job description

We are looking for an experienced and passionate Senior Data Engineer to join our growing data team. In this role, you will be responsible for designing, developing, and maintaining scalable data pipelines and systems that support a wide range of analytics and business intelligence solutions. You will work closely with cross‑functional teams, including data scientists, analysts, and engineers, to provide data solutions that drive key business decisions. The ideal candidate has strong experience in data architecture, ETL/ELT processes, cloud technologies, and data warehousing.

Key Responsibilities:
  • Design, develop, and optimize data pipelines to extract, transform, and load (ETL/ELT) data from a variety of sources.
  • Build and manage data models and data warehouses that support business intelligence, reporting, and analytics needs.
  • Leverage cloud technologies such as AWS, Azure, or Google Cloud Platform for building scalable, reliable, and efficient data solutions.
  • Develop and maintain automated data workflows using tools like Airflow, AWS Glue, Azure Data Factory, or similar technologies.
  • Work with large datasets and complex data structures, ensuring data quality, integrity, and performance.
  • Write and optimize SQL queries for complex data extraction, aggregation, and transformation tasks.
  • Integrate APIs to connect data sources, extract information, and facilitate real‑time data processing.
  • Collaborate with business intelligence and data science teams to define data requirements and ensure the availability of clean, accurate data for analysis and decision‑making.
  • Implement CI/CD pipelines for automated deployment of data pipelines and models.
  • Monitor the performance of data systems, ensuring reliability, availability, and scalability of data architectures.
  • Create and maintain comprehensive documentation for data pipelines, systems, and processes.
  • Stay up to date with emerging trends and technologies in the data engineering field and continuously improve data systems.
Required Skills & Qualifications:
  • Solid experience as a Data Engineer or similar role in data architecture and pipeline development.
  • Strong experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Advanced knowledge of ETL/ELT processes, data modeling, and data warehousing (e.g., Snowflake, Redshift).
  • Proficiency in SQL for complex data transformation and querying.
  • Hands‑on experience with data pipeline orchestration tools like Azure Data Factory, Apache Airflow, AWS Glue, or similar.
  • Strong programming skills in Python for automation, data processing, and integration tasks.
  • Experience working with big data technologies such as Hadoop, Spark, or Kafka is a plus.
  • Familiarity with GitHub for version control and CI/CD pipelines for deployment automation.
  • Strong understanding of data security, governance, and compliance best practices.
  • Experience with business intelligence tools such as Tableau, Power BI, or similar for reporting and data visualization.
  • Ability to work in an agile, fast‑paced environment and manage multiple tasks simultaneously.
Preferred Qualifications:
  • Master’s degree in Computer Science, Engineering, Data Science, or a related field.
  • Experience with real‑time data streaming platforms such as Kafka or AWS Kinesis.
  • Exposure to machine learning and AI technologies and how data engineering supports these initiatives.
  • Experience with Infrastructure as Code (IaC) tools such as Terraform or CloudFormation.
  • Knowledge of data lake architectures and modern data processing frameworks.
  • Experience with Tableau for building reports, dashboards, and visual analytics.
Why Join Us?
  • Opportunity to work on cutting‑edge data engineering projects with the latest cloud technologies.
  • Be part of an innovative and collaborative team driving data‑driven decision‑making.
  • Competitive salary and comprehensive benefits package.
  • Opportunities for career growth and professional development.
  • Work in an agile environment, where your contributions make a direct impact on the business.