Remote - Lead Data Engineer - up to $190K base - join a team building systems to make data-driven business decisions
This Jobot job is hosted by Chuck Wirtz.
Are you a fit? Easy Apply now by clicking the "Easy Apply" button and sending us your resume.
About the role
Our client in the financial services industry is seeking a Lead Data Engineer to join their team. This is a full-time, direct-hire, remote role with a salary range of $150,000 to $190,000 per year, plus benefits, depending on experience.
Why join us?
This role is ideal for someone who thrives in a dynamic, fast-paced environment, enjoys solving complex data problems, and is passionate about driving innovation in data engineering. If you're looking to make an impact on the financial landscape with cutting-edge data solutions, this could be for you!
Key Responsibilities
- Lead the design and implementation of end-to-end data pipelines, from extraction (API, scraping, pyodbc) to cleansing/transformation (Python, TSQL) and loading into SQL databases or data lakes.
- Oversee the development of robust data architectures that support efficient querying and analytics, ensuring high-performing, scalable data workflows.
- Collaborate with data scientists, software developers, business intelligence teams, and stakeholders to develop and deploy data solutions that meet business needs.
- Ensure smooth coordination between engineering and other teams to translate business requirements into technical solutions.
- Guide the development of data models and business schemas, ensuring they are optimized for relational (3NF) and dimensional (Kimball) architectures.
- Lead the creation of scalable, reliable data models and optimize them for performance and usability.
- Develop and maintain infrastructure for large-scale data solutions, leveraging cloud platforms (e.g., Azure) and containerization technologies (e.g., Docker).
- Lead the use of modern data platforms such as Snowflake and Fabric, ensuring their effective use in large-scale data solutions.
- Manage and optimize data pipelines using tools such as Apache Airflow, Prefect, DBT, and SSIS, ensuring efficiency, scalability, and reliability.
- Ensure robust testing, monitoring, and validation of all data systems and pipelines.
- Drive continuous improvement in data engineering processes, aligning with industry best practices.
- Foster a culture of clean code, best practices, and rigorous testing across the team.
Qualifications & Experience
- 5+ years of experience in data engineering roles, with a proven track record of developing and maintaining data pipelines and architectures.
- Experience with large-scale data platforms and cloud environments.
- Strong background in relational databases, dimensional data modeling, and cloud-native solutions.
- Experience with data engineering tools such as Apache Airflow, Prefect, and cloud storage platforms.
- Excellent problem-solving skills and ability to navigate complex technical challenges.
Interested? Easy Apply now by clicking the "Easy Apply" button.
Jobot is an Equal Opportunity Employer. We celebrate diversity and consider all qualified candidates without regard to race, color, religion, age, sex, national origin, disability, genetics, veteran status, sexual orientation, or gender identity or expression. Background checks may be performed with your authorization, in accordance with applicable laws.