We are seeking a highly skilled remote Senior Data DevOps Engineer to join our team on a cutting-edge project built on Azure Data Factory, Azure DevOps, and Databricks.
In this role, you will design, build, and deploy data pipelines and ensure the reliability, scalability, and performance of our data systems. You will collaborate with a team of talented professionals on a project with significant impact on our business and the industry.
Responsibilities
- Design, build, and deploy data pipelines using Azure Data Factory, Databricks, and other related technologies
- Automate data operations using Python or Java, ensuring the reliability and scalability of data systems
- Collaborate with cross-functional teams to understand business requirements and design data solutions that meet those needs
- Implement DataOps and MLOps practices, ensuring the quality and accuracy of data systems
- Monitor and troubleshoot data pipelines, identifying and resolving issues promptly
- Develop and maintain documentation for data systems and processes
- Stay up to date with the latest trends and technologies in data engineering and DevOps
Requirements
- At least 3 years of experience in Data DevOps, with expertise in designing, building, and deploying data pipelines in Microsoft Azure
- In-depth knowledge of Azure Data Factory, Azure DevOps, Databricks, and related technologies
- Experience with DataOps and MLOps practices
- Strong programming skills in Python or Java
- Good understanding of cloud infrastructure and networking concepts, including security, scalability, and resilience
- Excellent communication skills and ability to work collaboratively with cross-functional teams
- Strong analytical and problem-solving skills for resolving complex issues
- Fluent spoken and written English at B2+ level
Nice to have
- Experience with other cloud platforms such as AWS or Google Cloud
- Knowledge of big data technologies like Hadoop, Spark, or Kafka
- Experience with containerization technologies such as Docker and Kubernetes
- Certifications in Microsoft Azure or related fields
We offer
- International projects with top brands
- Work with global, diverse teams of skilled professionals
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to LinkedIn Learning and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn
Additional Details
- Seniority level: Mid-Senior
- Employment type: Full-time
- Job functions: Engineering, IT, Business Development
- Industries: Software Development, IT Services and Consulting, Retail