About Wynn Al Marjan Island
On schedule to open in the United Arab Emirates in early 2027, Wynn Al Marjan Island is being created as an opulent beachside destination where discerning guests can play and relax. Located less than 50 minutes from Dubai International Airport, the integrated resort, offering 1,530 rooms and suites, 22 restaurants and lounges, a theatre, a nightclub, and a five-star spa, is currently under construction on a picturesque island that curves gracefully into the Arabian Gulf.
About The Position
Wynn Al Marjan Island is currently seeking a Senior Data Engineer to join the resort’s Information Technology team.
Responsibilities
- Lead architecture development and solution design sessions, contributing strategic insights to ensure robust, scalable, and efficient data systems
- Collaborate with architects and cross-functional teams to design and implement end-to-end data solutions that address complex business needs
- Develop advanced data assets, including highly scalable ETL/ELT pipelines, optimized data structures, and dynamic data workflows on Wynn’s cloud infrastructure
- Drive data quality initiatives, including sophisticated data profiling, cleansing, and anomaly detection, leveraging advanced tools and methodologies
- Establish, enforce, and continuously refine data quality standards while designing automated processes for issue identification and resolution
- Act as a technical leader within the team, mentoring junior engineers, conducting detailed code reviews, and setting standards for best practices
- Analyze and translate complex business requirements into detailed technical specifications, ensuring solutions align with broader organizational goals
- Define, design, and implement advanced data ingestion patterns (batch, real-time streaming, and hybrid) tailored to diverse use cases and business requirements
- Lead data modeling initiatives, developing and optimizing both logical and physical data models to support high-performance analytics and operations
- Collaborate closely with stakeholders to design and deliver data models that meet advanced operational, analytical, and reporting needs
- Drive data governance initiatives by contributing to the design and implementation of robust security, compliance, and data privacy frameworks
- Ensure data integrity and quality by embedding validation, anomaly detection, and self-healing mechanisms throughout data workflows
- Create comprehensive technical documentation, including data flows, pipeline designs, and operational runbooks to support system maintainability
- Evaluate, recommend, and implement cutting-edge data ingestion, integration, and replication tools to enhance the efficiency of cloud-based analytics systems
- Lead proof-of-concept projects to assess and introduce innovative data engineering technologies, tools, and frameworks
- Design and implement highly reliable real-time data streaming solutions using platforms such as Azure Event Hubs, Kafka, or AWS Kinesis to meet time-sensitive business needs
- Develop and maintain complex, reusable ETL/ELT processes that can handle large-scale, dynamic datasets efficiently and securely
- Own the maintenance and performance optimization of data ingestion tools, ensuring they meet high availability and reliability standards
- Continuously optimize existing data pipelines, identifying and resolving performance bottlenecks to improve scalability and cost-efficiency
- Provide advanced support to analytics and data science teams by curating high-quality, well-documented, and accessible datasets tailored to their needs
- Lead efforts to monitor and proactively troubleshoot data pipelines, implementing automated alerts and self-healing mechanisms to minimize downtime
- Research and propose strategies for adopting emerging trends in data engineering, driving innovation and process improvements
- Implement and manage robust CI/CD workflows to automate pipeline deployment, testing, and version control for seamless operations
- Regularly upgrade and enhance data ingestion tools, ensuring system resilience and alignment with the latest industry best practices
- Contribute to cross-functional projects, demonstrating expertise in serverless architectures, API integration (e.g., FastAPI), and scalable cloud solutions
- Drive knowledge-sharing initiatives, such as training sessions and technical presentations, to elevate the team’s overall capabilities and expertise
About You
The ideal candidate for this position will have the following experience and qualifications:
- A Bachelor’s degree in computer science, information technology, or a related field is required; a Master’s degree is preferred but not mandatory
- 5 to 7 years of hands-on experience in data engineering, demonstrating consistent career progression and technical growth
- Proven ability to design, develop, and deploy highly scalable and efficient data solutions for complex business needs
- Extensive experience managing and optimizing complex data integrations across diverse systems, platforms, and cloud environments
- Advanced proficiency in programming languages such as Python, SQL, and shell scripting, with the ability to implement optimized and scalable code solutions
- Deep expertise in data platforms like Snowflake and Databricks, including extensive experience working with PySpark and distributed dataframes to process large-scale datasets
- Advanced knowledge of orchestration tools such as Azure Data Factory, Apache Airflow, and Databricks workflows, including the ability to design and manage complex, multi-step workflows
- Significant hands-on experience with tools like dbt for data transformation and replication solutions such as Qlik for efficient data migration and synchronization
- Strong understanding of big data systems and frameworks, with practical experience in building and optimizing solutions for high-volume and high-velocity data
- Extensive experience with version control tools such as GitHub or Azure DevOps, including implementing CI/CD pipelines for data engineering workflows
- Advanced knowledge of serverless computing, including designing and deploying scalable solutions using Azure Functions with Python
- Proficiency in API development frameworks such as FastAPI, with the ability to create robust, efficient, and secure data-driven APIs
- Comprehensive expertise in designing and implementing ETL/ELT processes with a focus on performance, scalability, and maintainability
- Proven experience in data warehouse development, including hands-on expertise with dimensional modeling and schema optimization for analytics
- Solid English language communication skills, both written and verbal, for effective collaboration across teams and stakeholders
- Snowflake SnowPro Core Certification
- Databricks Certified Data Engineer Professional
- Microsoft Azure Data Engineer Associate
About Wynn Al Marjan Island’s Benefits
We offer an attractive salary, paid in Dirhams (AED), the local currency of the UAE. In addition, we offer an excellent leave policy, a healthcare package, life insurance, incentive programs, and other employee benefits. The result is a package that makes this role highly attractive to outstanding applicants seeking a career with Wynn Resorts, one of the most renowned and celebrated brands in the global hospitality industry.