Job Search and Career Advice Platform


Software Engineer - Data Engineering (Remote from France)

Jobgether

Remote

EUR 50,000 - 70,000

Full time

Today
Be among the first to apply


Job Summary

A leading recruitment platform is seeking a Software Engineer specializing in Data Engineering in France. This key role involves designing and scaling a cutting-edge data platform, mentoring engineers, and ensuring data quality across systems. Ideal candidates will have over 7 years of data or software engineering experience, advanced skills in Python and SQL, and a passion for resolving complex data challenges. Enjoy a competitive salary, remote-first environment, and career growth opportunities.

Benefits

Competitive salary with equity participation
Comprehensive health benefits
Unlimited vacation policy
Opportunities for professional growth
Collaborative culture

Qualifications

  • 7+ years of progressive experience in data or software engineering.
  • Advanced programming skills in Python and SQL.
  • Experience with orchestration/streaming frameworks.

Responsibilities

  • Design, implement, and maintain scalable ETL/ELT pipelines.
  • Build and optimize data models for cloud warehouses.
  • Lead large-scale data initiatives from planning through execution.

Skills

Python
SQL
Data engineering best practices
Distributed systems
Data quality assurance
Mentorship

Education

Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field

Tools

AWS
Databricks
Airflow
Kafka
Spark

Job Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Software Engineer – Data Engineering in France.

In this role, you will be a key contributor to the design, development, and scaling of a next-generation data platform that supports critical business operations and AI initiatives. You will lead the creation of high-performance data pipelines, cloud-based architectures, and data models that enable near real-time analytics and machine learning. Collaborating closely with engineers, data scientists, and product teams, you will ensure data quality, observability, and operational excellence across all systems. This position requires hands‑on technical expertise, a strong understanding of distributed systems, and a passion for solving complex data challenges in a fast-paced, innovative environment. You will also mentor other engineers and contribute to shaping the long‑term data strategy while working in a collaborative, remote‑first culture.

Accountabilities

  • Design, implement, and maintain scalable ETL/ELT pipelines using Python, SQL, and modern orchestration frameworks.
  • Build and optimize data models and schemas for cloud warehouses and relational databases, supporting AI and analytics workflows.
  • Develop and operate distributed, real‑time data systems for high‑throughput ingestion and processing.
  • Collaborate with cross‑functional teams including AI, engineering, and product to deliver high‑quality datasets and robust data products.
  • Lead large‑scale data initiatives from planning through execution, ensuring performance, cost efficiency, and reliability.
  • Mentor and provide architectural guidance to other engineers, promoting best practices in data engineering and pipeline development.
  • Support testing, debugging, and QA processes to ensure system stability and data integrity.

Requirements

  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field. Graduate degrees are a plus.
  • 7+ years of progressive experience in data or software engineering, building complex data systems.
  • Advanced programming skills in Python and SQL; experience with orchestration/streaming frameworks such as Temporal, Dagster, Airflow, Spark, or Kafka.
  • Strong knowledge of relational and NoSQL databases (Postgres, MySQL, MongoDB, ElasticSearch, Cassandra).
  • Experience with cloud computing and data warehousing platforms, preferably Databricks and AWS.
  • Familiarity with ML feature stores and productionizing ML pipelines is a plus.
  • Strong analytical skills, attention to data quality, and experience with both OLTP and OLAP systems.
  • Experience mentoring engineers and providing technical and architectural guidance.
  • Bonus: experience with energy market or weather data, dbt, DataOps practices, real‑time data technologies, or knowledge of power systems.

Benefits

  • Competitive salary with equity participation and long‑term growth potential.
  • Comprehensive health benefits including medical, dental, and vision coverage.
  • Flexible, remote‑first working environment with unlimited vacation policy.
  • Opportunities for accelerated professional growth and mentorship from industry experts.
  • Collaborative, inclusive culture focused on innovation and impact.
Why Apply Through Jobgether?

We use an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role’s core requirements. Our system identifies the top‑fitting candidates, and this shortlist is then shared directly with the hiring company. The final decision and next steps (interviews, assessments) are managed by their internal team.

Data Privacy Notice

By submitting your application, you acknowledge that Jobgether will process your personal data to evaluate your candidacy and share relevant information with the hiring employer. This processing is based on legitimate interest and pre‑contractual measures under applicable data protection laws (including GDPR). You may exercise your rights (access, rectification, erasure, objection) at any time.
