
Data Engineer (Remote from France)

Jobgether

Remote

EUR 60,000 - 80,000

Full-time

Today

Job Summary

A technology partner firm is seeking an experienced Data Engineer in France. The role involves designing and maintaining scalable data pipelines and warehouse architectures to support critical business insights. Candidates should have extensive experience with SQL, Python, and cloud platforms (AWS, GCP, Azure). The position offers competitive salary, remote-first work arrangements, and opportunities for professional growth within a collaborative environment focused on innovation.

Benefits

Competitive salary
Remote-first working arrangements
Opportunities for professional growth
Autonomy and responsibility in a team
Supportive culture fostering creativity
Exposure to modern data technologies

Qualifications

  • 7+ years of experience in data warehousing or database administration.
  • 5+ years of hands-on experience as a Data Engineer using SQL and Python.
  • Expertise with cloud-based platforms like AWS, GCP, or Azure.
  • Familiarity with containerization tools like Docker or Kubernetes.
  • Strong knowledge of ELT/ETL tools and data integration frameworks.

Responsibilities

  • Maintain and optimize the company’s cloud-based data warehouse.
  • Design and implement data integration pipelines.
  • Collaborate with teams for seamless data flow.
  • Explore and integrate new data infrastructure tools.
  • Implement data security, auditing, and monitoring processes.

Skills

SQL
Python
Data modeling
ELT/ETL tools
Cloud platforms (AWS, GCP, Azure)
Containerization (Docker, Kubernetes)
Data integration frameworks
Communication skills

Tools

Snowflake
BigQuery
Redshift
Databricks
Airflow
DBT

Job Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in France.

In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures that support critical business insights. You will work with cross-functional teams to integrate, transform, and manage high-volume datasets across multiple platforms. Your focus will be on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies. This role provides the opportunity to have a direct impact on analytics capabilities, business intelligence, and operational efficiency. You will work in a collaborative, flexible, and forward-thinking environment that encourages experimentation and continuous improvement. It is a hands‑on, technically challenging role with high visibility across the organization.

Accountabilities

  • Maintain, configure, and optimize the company’s cloud-based data warehouse platform, ensuring reliability, scalability, and performance.
  • Design and implement incremental and batch data integration pipelines with a focus on quality, efficiency, and cost-effectiveness.
  • Collaborate closely with development, analytics, and operations teams to support seamless data flow and accessibility.
  • Innovate by exploring and integrating new tools, technologies, and best practices to enhance data infrastructure.
  • Implement and maintain data security, auditing, and monitoring processes.
  • Troubleshoot and resolve data-related issues while providing ongoing support for production systems.
  • Contribute to the overall improvement of data engineering practices and platform capabilities.
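The incremental batch pipelines mentioned above typically track a "high-water mark" so that each run loads only rows newer than the last successful load. The following is a minimal, purely illustrative sketch using SQLite as a stand-in for both source and warehouse; all table and column names are hypothetical and not taken from this posting:

```python
import sqlite3

def incremental_load(conn: sqlite3.Connection) -> int:
    """Copy rows from a source table into a staging table, incrementally.

    Only rows with an event_ts newer than the current high-water mark
    (the max timestamp already in the target) are loaded.
    """
    cur = conn.cursor()
    # Read the current high-water mark; 0 when the target is still empty.
    cur.execute("SELECT COALESCE(MAX(event_ts), 0) FROM events_staging")
    high_water = cur.fetchone()[0]
    # Copy only rows strictly newer than the mark.
    cur.execute(
        "INSERT INTO events_staging (event_id, event_ts, payload) "
        "SELECT event_id, event_ts, payload FROM events WHERE event_ts > ?",
        (high_water,),
    )
    conn.commit()
    return cur.rowcount  # number of rows loaded this run

# Hypothetical demo data: two source rows, empty staging table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (event_id INTEGER, event_ts INTEGER, payload TEXT);
    CREATE TABLE events_staging (event_id INTEGER, event_ts INTEGER, payload TEXT);
    INSERT INTO events VALUES (1, 100, 'a'), (2, 200, 'b');
""")
n = incremental_load(conn)  # first run: both rows are new
m = incremental_load(conn)  # second run: nothing newer than the mark
```

In production this pattern is usually orchestrated by a scheduler such as Airflow and expressed as ELT models in a tool such as DBT, both of which appear in the Tools list above.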

Requirements

  • 7+ years of experience in data warehousing, database administration, or database development.
  • 5+ years of hands‑on experience as a Data Engineer using SQL and Python.
  • Expertise with cloud‑based platforms such as AWS, GCP, or Azure, and familiarity with containerization tools like Docker or Kubernetes.
  • Proven experience working with large‑scale datasets using technologies such as Snowflake, BigQuery, Redshift, Databricks, or similar.
  • Strong knowledge of ELT/ETL tools and data integration frameworks, including Airflow, DBT, Python, and REST APIs.
  • Solid understanding of data modeling, performance optimization, and pipeline maintainability.
  • Positive, solution‑oriented mindset with a willingness to learn and adopt new technologies.
  • Excellent English communication skills and ability to work effectively in distributed teams.

Benefits

  • Competitive salary and comprehensive compensation package.
  • Remote‑first and flexible working arrangements promoting work‑life balance.
  • Autonomy and personal responsibility within a collaborative team environment.
  • Opportunities for professional growth, learning, and skill development.
  • Supportive, people‑oriented culture that encourages creativity and innovation.
  • Exposure to modern cloud‑based data technologies and high‑impact projects.

Why Apply Through Jobgether?

We use an AI‑powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Our system identifies the top‑fitting candidates, and this shortlist is then shared directly with the hiring company. The final decision and next steps (interviews, assessments) are managed by their internal team.

We appreciate your interest and wish you the best!

Data Privacy Notice: By submitting your application, you acknowledge that Jobgether will process your personal data to evaluate your candidacy and share relevant information with the hiring employer. This processing is based on legitimate interest and pre‑contractual measures under applicable data protection laws (including GDPR). You may exercise your rights (access, rectification, erasure, objection) at any time.
