
Especialista - Engenheiro de Dados Sênior (AWS)

Leega

São Paulo

Hybrid

BRL 80,000 - 150,000

Full-time

7 days ago

Job summary

An innovative company is seeking a Senior Data Engineer specialized in AWS to join their dynamic team. This role focuses on developing and optimizing data pipelines, ensuring data quality, and implementing large-scale distributed processing solutions. The ideal candidate will have extensive experience with AWS services, SQL, and ETL processes. With a commitment to employee development, this forward-thinking firm offers a hybrid work model and values ethics and teamwork. If you're driven by challenges and eager to make an impact in a collaborative environment, this opportunity is perfect for you.

Qualifications

  • Over 5 years of experience with AWS services including S3, Glue, and Redshift.
  • Proficient in PySpark for distributed data processing and SQL for data queries.

Responsibilities

  • Develop and optimize data pipelines using PySpark in AWS environments.
  • Design large-scale distributed data processing solutions and ensure data quality.

Skills

AWS (S3, Glue, Redshift, Athena, Lambda)
PySpark
SQL
ETL Processes
Data Security
Data Integration
Workflow Orchestration
Python
Data Analysis

Education

Bachelor's degree in Computer Science, or equivalent experience in Data Engineering

Tools

Jupyter Notebooks
SSIS
Apache Airflow
AWS Step Functions

Job description

Especialista - Engenheiro de Dados Sênior (AWS)

Join us to apply for the Especialista - Engenheiro de Dados Sênior (AWS) role at Leega.

Leega is a company focused on providing efficient and innovative customer service. Our culture is inspiring, and our core values include ethics, transparency, quality excellence, teamwork, social and environmental responsibility, human relations, and credibility.

We seek innovative, results-oriented professionals who are driven by challenges. If you're looking for a dynamic, partner-oriented company that invests in its employees through continuous training, Leega is the place for you.

>> LEEGA IS FOR EVERYONE. We would be delighted to have you join our team. Come be part of our history and help shape our future.

Register now for our vacancies!

Responsibilities and duties
  • Develop and optimize data pipelines using PySpark in AWS environments (S3, Glue, Redshift, Athena).
  • Design and implement large-scale distributed data processing solutions.
  • Utilize Jupyter Notebooks for data analysis, algorithm testing, and visualizations.
  • Integrate data from various sources using ETL processes, ensuring quality and consistency.
  • Work with SQL for data queries and transformations, optimizing performance.
  • Use SSIS for data integration and ETL automation.
  • Employ workflow orchestration tools like Apache Airflow or AWS Step Functions to automate data processes.
  • Monitor and ensure the performance of solutions, optimizing costs and resources.
  • Apply data security and governance practices, ensuring compliance with privacy policies.
  • Collaborate with BI and Analytics teams to prepare data for analysis.
  • Maintain technical documentation of processes and solutions.
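
As an illustration of the ETL-style work described above (extract, transform with SQL, load a derived table), here is a minimal, hedged sketch using Python's built-in sqlite3 in place of a warehouse such as Redshift; the table and column names are hypothetical, not part of the role:

```python
import sqlite3

# Minimal ETL sketch. sqlite3 stands in for a data warehouse (e.g. Redshift);
# table/column names are illustrative only.
conn = sqlite3.connect(":memory:")

# Extract: land raw rows in a staging table.
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("ana", 10.0), ("ana", 5.0), ("bruno", 7.5)],
)

# Transform: aggregate per customer with plain SQL.
conn.execute(
    """
    CREATE TABLE customer_totals AS
    SELECT customer, SUM(amount) AS total
    FROM raw_orders
    GROUP BY customer
    """
)

# Load/verify: the derived table feeds downstream BI and analytics.
totals = dict(conn.execute("SELECT customer, total FROM customer_totals"))
print(totals)
```

In a real pipeline the same shape would appear as a PySpark job reading from S3 and writing to Redshift, with Glue handling the catalog.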
Requirements and qualifications
  • Over 5 years of proven, solid experience with AWS (S3, Glue, Redshift, Athena, Lambda, etc.).
  • Proven experience with DynamoDB.
  • Proficiency in PySpark for distributed data processing (over 5 years).
  • Familiarity with Jupyter Notebooks for data analysis and prototyping.
  • Strong knowledge of SQL and relational/non-relational databases.
  • Experience with SSIS for data integration and ETL automation.
  • Experience in data ETL processes and automation.
  • Knowledge of cloud data architectures and data security best practices.
  • Experience with workflow orchestration tools (Apache Airflow, AWS Step Functions).
  • Programming skills in Python or other data integration languages.
  • Bachelor’s degree in Computer Science, Data Engineering, or related fields (or equivalent experience).
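
The workflow-orchestration requirement above boils down to running tasks in dependency order. This is not the Airflow or Step Functions API, just a toy sketch of the underlying DAG idea using Python's stdlib graphlib; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Toy DAG: each key lists the tasks it depends on. Orchestrators such as
# Apache Airflow or AWS Step Functions manage exactly this kind of ordering
# (plus scheduling, retries, and monitoring, which this sketch omits).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # extract runs first, load last
```

An Airflow DAG expresses the same dependencies with operators and `>>` chaining, but the topological ordering is the core concept.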
Differentials
  • AWS certifications (Solutions Architect, Big Data, etc.).
  • Experience with Data Lakes and large-scale data processing.
  • Knowledge of data analysis frameworks like Pandas, Matplotlib, or Plotly.
Additional information
  • Work model: Hybrid, 3 days a week in Vila Olimpia - São Paulo.
