
Senior Data DevOps Engineer

EPAM Systems

Brazil

Remote

BRL 100,000 - 150,000

Full-time

Posted yesterday

Job summary

A leading company is seeking a Senior Data DevOps Engineer to join their remote team. This role involves developing and maintaining large-scale big data infrastructure, ensuring reliability and performance. The ideal candidate will work closely with data scientists and engineers to optimize data processing workflows and provide technical guidance to junior members. They will also be responsible for automating tasks and ensuring compliance with security policies.

Benefits

Healthcare benefits
Employee financial programs
Paid time off and sick leave
Upskilling and reskilling courses
Unlimited access to LinkedIn Learning
Global career opportunities
Volunteer and community involvement opportunities
Award-winning culture

Qualifications

  • At least 3 years of experience in DevOps focused on data infrastructure.
  • Hands-on experience with big data components.

Responsibilities

  • Develop and maintain infrastructure as code for big data components.
  • Collaborate with cross-functional teams to design and implement data pipelines.

Skills

DevOps
Python
SQL
Shell Scripting
Cloud Computing
Infrastructure as Code

Tools

Terraform
Kubernetes
Helm
AWS
Azure
GCP

Job description

We are seeking a Senior Data DevOps Engineer to join our remote team, working on a cutting-edge project that involves developing and maintaining large-scale big data infrastructure.

In this position, you will play a crucial part in ensuring the reliability, scalability, and performance of our big data infrastructure. You will work closely with cross-functional teams, including data scientists, data engineers, and software developers, to deploy, operate, monitor, optimize, and troubleshoot that infrastructure.

Responsibilities
  1. Develop and maintain infrastructure as code for big data components, using tools such as Terraform, Kubernetes, and Helm.
  2. Deploy, operate, monitor, optimize, and troubleshoot large-scale big data infrastructure, ensuring high availability, reliability, and performance.
  3. Collaborate with cross-functional teams, including data scientists, data engineers, and software developers, to design, implement, and maintain data pipelines and processing workflows.
  4. Automate data processing tasks, using shell scripts, Python, and other programming languages.
  5. Ensure compliance with security and data privacy policies and regulations.
  6. Participate in on-call rotation to provide 24/7 support for critical production systems.
  7. Continuously improve the performance, scalability, and reliability of our big data infrastructure, using monitoring and alerting tools.
  8. Provide technical guidance and mentorship to junior team members.
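As a loose illustration of responsibility 4 (automating data processing tasks with Python), the sketch below filters a CSV extract and computes a simple aggregate using only the standard library. The data set and field names (`service`, `latency_ms`) are invented for this example and do not come from the posting.

```python
import csv
import io

# Hypothetical CSV extract; in practice this would come from a file or a
# pipeline stage, not an inline string.
RAW = """service,latency_ms
ingest,120
ingest,80
transform,200
"""

def mean_latency(raw: str, service: str) -> float:
    """Return the mean latency_ms for one service in a CSV extract."""
    rows = [r for r in csv.DictReader(io.StringIO(raw)) if r["service"] == service]
    return sum(int(r["latency_ms"]) for r in rows) / len(rows)

print(mean_latency(RAW, "ingest"))  # 100.0
```

In a real pipeline, a script like this would typically be scheduled (e.g. by a workflow orchestrator) rather than run by hand.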
Requirements
  1. At least 3 years of experience in DevOps, with a focus on data infrastructure and operations.
  2. Hands-on experience with big data components.
  3. Expertise in deploying, operating, monitoring, optimizing, and troubleshooting large-scale big data infrastructure.
  4. Proficient in shell scripting, Python, and SQL, with a deep understanding of data pipeline and data processing concepts.
  5. Experience with cloud computing platforms, such as AWS, Azure, and GCP, with a focus on cloud operations and on developing and maintaining infrastructure as code.
  6. Proficiency in infrastructure automation tools, such as Terraform, CloudFormation, CDK, and Kubernetes.
  7. Strong knowledge of Unix-based operating systems and networking concepts.
  8. Fluent spoken and written English at an upper-intermediate level or higher.
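The Python-plus-SQL combination called for in requirement 4 can be sketched in a few lines: stage some rows in an in-memory SQLite database and aggregate them with SQL. The table and column names (`events`, `source`, `bytes`) and the sample values are hypothetical, chosen only to make the example self-contained.

```python
import sqlite3

# Stage invented event rows in an in-memory database, then aggregate with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (source TEXT, bytes INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("kafka", 512), ("kafka", 1024), ("s3", 2048)],
)
total_by_source = dict(
    conn.execute("SELECT source, SUM(bytes) FROM events GROUP BY source")
)
print(total_by_source)  # {'kafka': 1536, 's3': 2048}
```

The same pattern scales to real data-pipeline work by pointing the connection at a warehouse driver instead of SQLite.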
Nice to have
  1. Experience with machine learning frameworks and tools, such as TensorFlow, PyTorch, and Scikit-learn.
  2. Familiarity with data visualization tools, such as Tableau and Power BI.
  3. Knowledge of container orchestration platforms, such as Docker Swarm and Amazon ECS.
  4. Experience with CI/CD pipelines, using tools such as Jenkins and GitLab.
We offer
  1. International projects with top brands.
  2. Work with global teams of highly skilled, diverse peers.
  3. Healthcare benefits.
  4. Employee financial programs.
  5. Paid time off and sick leave.
  6. Upskilling, reskilling, and certification courses.
  7. Unlimited access to the LinkedIn Learning library and 22,000+ courses.
  8. Global career opportunities.
  9. Volunteer and community involvement opportunities.
  10. EPAM Employee Groups.
  11. Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn.
