We are seeking a highly skilled Senior Data DevOps Engineer to join our remote team, working on a cutting-edge project in the financial services industry.
In this role, you will be responsible for designing, implementing, and maintaining the infrastructure and tools necessary for the development, testing, and deployment of data-driven applications. You will work closely with cross-functional teams to ensure the seamless integration of data pipelines, databases, and data analytics tools. If you are passionate about DevOps and data engineering, we invite you to apply for this exciting opportunity.
Responsibilities
- Design, implement, and maintain the infrastructure required for the development, testing, and deployment of data-driven applications
- Build, configure, and manage CI/CD pipelines using tools such as Jenkins, GitLab, or CircleCI
- Deploy and manage Kubernetes clusters for containerized applications and microservices
- Develop and maintain Docker images for data processing and analytics tools
- Configure and manage Amazon Web Services resources, including EC2 instances, S3 buckets, and RDS databases
- Automate infrastructure deployment and management using Terraform and other infrastructure-as-code tools
- Monitor and troubleshoot infrastructure issues, ensuring high availability and performance of data pipelines and databases
- Collaborate with data scientists and analysts to ensure seamless integration of data pipelines and analytics tools
Requirements
- A minimum of 3 years of experience in Data DevOps, demonstrating your expertise in designing and implementing data pipelines, databases, and data analytics tools
- In-depth knowledge of CI/CD pipelines and Helm
- Strong experience with Kubernetes, Docker, Amazon Web Services, Linux, and Terraform
- Familiarity with Elastic Stack
- Experience with distributed data processing frameworks such as Apache Spark, Kafka, or Flink
- Expertise in scripting languages such as Python, Bash, or PowerShell, allowing you to automate tasks and manage infrastructure as code
- Strong interpersonal and communication skills, enabling you to collaborate effectively with cross-functional teams and stakeholders
- Ability to work independently and manage multiple projects simultaneously, while maintaining a high level of performance
- Spoken and written English at an Upper-Intermediate level or higher (B2+)
Nice to have
- Experience with data governance and security practices, including data encryption, access control, and compliance requirements
- Knowledge of machine learning frameworks and tools, including TensorFlow, PyTorch, or Scikit-learn
- Experience with big data technologies such as Hadoop, Hive, or Presto
- Familiarity with data visualization tools such as Tableau, Power BI, or Grafana
We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
Seniority level
Mid-Senior level
Employment type
Job function
Engineering, Information Technology, and Business Development
Industries
Software Development, IT Services and IT Consulting, and Media and Telecommunications