***Posting Type***

Remote

***Job Overview***

We are building a specialized team focused on enabling advanced analytics and reporting capabilities across our internal data ecosystem. As an Advanced Data Platform Engineer, you will design and implement scalable, cloud-native data platforms that integrate modern lakehouse technologies, distributed compute frameworks, and managed cloud services to support diverse analytical use cases and enterprise-scale insights. You will work on systems leveraging Apache Spark, Delta Lake, and Iceberg to process large-scale datasets efficiently, while enabling internal users to build reporting and analytics through curated data models, optimized query performance, and reliable data pipelines. This role emphasizes technical depth, performance optimization, and governance best practices to deliver secure and reliable solutions. Relativity's scale and breadth provide significant opportunities for rich data exploration and insights. Our data infrastructure ensures that vast datasets remain accessible, secure, and compliant, while enabling innovation across the organization.
We are making substantial investments in data lake technology and distributed systems to support future growth and advanced analytics.

***Job Description and Requirements***

**Your Role in Action**

* Design and implement complex data pipelines and distributed systems using Spark and Python.
* Apply software engineering best practices: clean code, modular design, CI/CD, automated testing, and code reviews.
* Develop and maintain lakehouse capabilities with Delta Lake and Iceberg, ensuring reliability and performance.
* Enable analytics workflows by integrating dbt for SQL transformations running on Spark.
* Collaborate with internal teams to deliver curated datasets and self-service analytics capabilities.
* Optimize data warehousing solutions such as Databricks and Snowflake for performance and scalability.
* Implement observability and governance frameworks, including data lineage and compliance controls.
* Drive performance tuning, scalability strategies, and cost optimization across Spark jobs and cloud-native environments.

**Core Requirements:**

* Strong programming skills in Python and SQL; experience with Apache Spark for distributed data processing.
* Expertise in Delta Lake and/or Apache Iceberg for lakehouse architecture.
* Familiarity with dbt, Databricks, and Snowflake for analytics workflows.
* Solid understanding of software engineering principles, CI/CD, and automated testing.
* Familiarity with Kubernetes, Docker, and infrastructure-as-code tools.
* Understanding of performance tuning, scalability strategies, and cost optimization for large-scale systems.

**Nice to Have:**

* Exposure to event-driven architectures and advanced analytics platforms.
* Experience enabling self-service analytics for internal stakeholders.
* Experience in any of the following languages: Java, Scala, Rust.

Relativity is a diverse workplace with different skills and life experiences, and we love and celebrate those differences.
We believe that employees are happiest when they're empowered to be their full, authentic selves, regardless of how they identify.

**Benefit Highlights:**

* Comprehensive health, dental, and vision plans
* Parental leave for primary and secondary caregivers
* Flexible work arrangements
* Two week-long company breaks per year
* Unlimited time off
* Long-term incentive program
* Training investment program

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other legally protected basis, in accordance with applicable law.

**Relativity is committed to competitive, fair, and equitable compensation practices.**

This position is eligible for total compensation that includes a competitive base salary, an annual performance bonus, and long-term incentives. The expected salary range for this role is between 146 000 and 218 000 PLN. The final offered salary will be based on several factors, including but not limited to the candidate's depth of experience, skill set, qualifications, and internal pay equity. Hiring at the top end of the range would not be typical, to allow for future meaningful salary growth in this position.

**Suggested Skills:** Engineering Principles, Hardware Integration, Innovation, Problem Solving, Process Improvements, Quality Assurance (QA), Research and Development, System Designs, Technical Documents, Troubleshooting

We're solving big data challenges in the legal tech industry, and we're always looking for more people to join us on the journey. At Relativity, you'll learn cross-functional skills to grow your career and have the chance to make a big impact on our customers, our industry, and our communities. We admire and value our employees, so it's no surprise that our hiring process is designed to help us really get to know you, and for you to get to know us, too.