Join Hard Rock Digital as a DataOps Engineer to optimize and automate cloud-based data infrastructure. Work with leading tools like Airflow and Snowflake while fostering a collaborative environment focused on high-quality data management. Enjoy a flexible work culture with competitive benefits in a startup environment backed by a global brand.
What are we building?
Hard Rock Digital is a team focused on becoming the best online sportsbook, casino, and social casino company in the world. We’re building a team that shares a passion for learning, operating, and building new products and technologies for millions of consumers. We care about each customer's interaction, experience, behavior, and insight, and strive to ensure we’re always acting authentically.
Rooted in the kindred spirits of Hard Rock and the Seminole Tribe of Florida, the new Hard Rock Digital taps a brand known the world over as the leader in gaming, entertainment, and hospitality. We’re taking that foundation of success and bringing it to the digital space — ready to join us?
What’s the position?
We are seeking a passionate DataOps Engineer who loves optimizing pipelines, automating workflows, and scaling cloud-based data infrastructure.
Key Responsibilities:
Design, build, and optimize data pipelines using Airflow, DBT, and Databricks.
Monitor and improve pipeline performance to support real-time and batch processing.
Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as Snowflake.
Implement best practices for cost-efficient, secure, and scalable data processing.
Enable and optimize AWS SageMaker environments for ML teams.
Collaborate with ML, Data Science, and Reporting teams to ensure seamless data accessibility.
Implement data pipeline monitoring, alerting, and logging to detect failures and performance bottlenecks.
Build automation to ensure data quality, lineage tracking, and schema evolution management.
Participate in incident response, troubleshooting, and root cause analysis for data issues.
Advocate for DataOps best practices, driving automation, reproducibility, and scalability.
Document infrastructure, data workflows, and operational procedures.
What are we looking for?
We are looking for a DataOps Engineer with experience supporting high-velocity data/development teams and designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows using tools like Airflow, DBT, Databricks, and Snowflake while maintaining data integrity, security, and performance.
Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
Minimum of 3 years of experience in DataOps or a similar role.
Proficiency in key technologies, including Airflow, Snowflake, and SageMaker.
Certifications in AWS/Snowflake/other technologies a plus.
Excellent communication and interpersonal skills.
Ability to work in a fast-paced environment and manage multiple priorities effectively.
What’s in it for you?
We offer our employees more than just competitive compensation. Our team benefits include:
Competitive pay and benefits
Flexible vacation allowance
Flexible work from home or office hours
Startup culture backed by a secure, global brand
Opportunity to build products enjoyed by millions as part of a passionate team
Roster of Uniques
We care deeply about every interaction our customers have with us, and we trust and empower our staff to own and drive that experience. Our vision for our business and customers is built on fostering a diverse and inclusive work environment where, regardless of background or beliefs, you feel able to be authentic and bring all your talent into play. We want to celebrate you being you (we are an equal opportunity employer).