
A leading online sportsbook and casino company in Poland is seeking an experienced DataDevOps Engineer to optimize data pipelines and manage cloud infrastructure. The ideal candidate has over 3 years of experience in DevOps or DataOps, expertise in tools like Airflow and Snowflake, and excellent communication skills. This fast-paced environment allows for immediate impact and technical improvements across the organization.
Hard Rock Digital is a team focused on becoming the best online sportsbook, casino, and social casino company in the world. We're building a team that shares a passion for learning, operating, and building new products and technologies for millions of consumers. We care about each customer's interaction, experience, behaviour, and insight, and we strive to ensure we're always acting authentically.
Rooted in the kindred spirits of Hard Rock and the Seminole Tribe of Florida, the new Hard Rock Digital taps a brand known the world over as the leader in gaming, entertainment, and hospitality. We’re taking that foundation of success and bringing it to the digital space — ready to join us?
We are seeking a passionate DataDevOps Engineer who loves optimizing pipelines, automating workflows, and scaling cloud-based data infrastructure.
You'll work as part of the DataDevOps team, collaborating with Data Science, Machine Learning, Reporting, and other data-related teams to deploy and support cutting-edge data applications. This fast-paced role lets you make an immediate impact as you grow into the team, with the opportunity to drive technical improvements across the organization.
As a DataDevOps Engineer, you will:
We are looking for a DataDevOps Engineer with experience supporting high-velocity data and development teams and designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows using tools like Airflow, dbt, Databricks, and Snowflake while maintaining data integrity, security, and performance.
The ideal candidate will have: