Data Engineer (Remote from Germany)

Jobgether

Remote

EUR 70,000 - 90,000

Full-time

Today
Be among the first applicants

Summary

A recruitment partner is seeking a Data Engineer in Germany to design, build, and maintain scalable data pipelines and warehouse architectures. This role involves ensuring data quality and performance, as well as integrating modern tools and technologies. Responsibilities include optimizing cloud-based platforms, collaborating with various teams, and enhancing data engineering practices. The ideal candidate has extensive experience in data warehousing and a strong knowledge of SQL, Python, and cloud environments. This position offers a competitive salary and flexible working arrangements.

Benefits

Competitive salary
Remote-first working arrangements
Professional growth opportunities
Supportive culture

Qualifications

  • 7+ years of experience in data warehousing, database administration, or database development.
  • 5+ years of hands-on experience as a Data Engineer using SQL and Python.
  • Expertise with cloud-based platforms and familiarity with containerization tools.

Responsibilities

  • Maintain and optimize cloud-based data warehouse platforms.
  • Design and implement data integration pipelines.
  • Collaborate with teams to support data flow and accessibility.

Skills

Data warehousing
SQL
Python
Cloud platforms (AWS, GCP, Azure)
Data modeling
Data integration frameworks

Tools

Docker
Kubernetes
Snowflake
BigQuery
Redshift
Databricks
Airflow
DBT

Job Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in Germany.

In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and warehouse architectures that support critical business insights. You will work with cross-functional teams to integrate, transform, and manage high-volume datasets across multiple platforms. Your focus will be on ensuring data quality, performance, and security while driving innovation through the adoption of modern tools and technologies. This role provides the opportunity to have a direct impact on analytics capabilities, business intelligence, and operational efficiency. You will work in a collaborative, flexible, and forward-thinking environment that encourages experimentation and continuous improvement. It is a hands‑on, technically challenging role with high visibility across the organization.

Accountabilities:
  • Maintain, configure, and optimize the company’s cloud‑based data warehouse platform, ensuring reliability, scalability, and performance.
  • Design and implement incremental and batch data integration pipelines with a focus on quality, efficiency, and cost‑effectiveness (an illustrative sketch follows this list).
  • Collaborate closely with development, analytics, and operations teams to support seamless data flow and accessibility.
  • Innovate by exploring and integrating new tools, technologies, and best practices to enhance data infrastructure.
  • Implement and maintain data security, auditing, and monitoring processes.
  • Troubleshoot and resolve data‑related issues while providing ongoing support for production systems.
  • Contribute to the overall improvement of data engineering practices and platform capabilities.
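
To make the "incremental" part concrete, a watermark-based incremental load in Python might look roughly like the sketch below. This is a minimal illustration, not code from the posting: sqlite3 stands in for the cloud warehouse, and the staging_orders / warehouse_orders tables and their columns are hypothetical.

    # Minimal watermark-based incremental load (hypothetical tables; sqlite3 as a stand-in warehouse).
    import sqlite3

    def incremental_load(conn: sqlite3.Connection, last_loaded_at: str) -> str:
        """Copy only rows newer than the previous watermark, then return the new watermark."""
        rows = conn.execute(
            "SELECT id, amount, updated_at FROM staging_orders WHERE updated_at > ?",
            (last_loaded_at,),
        ).fetchall()
        conn.executemany(
            "INSERT OR REPLACE INTO warehouse_orders (id, amount, updated_at) VALUES (?, ?, ?)",
            rows,
        )
        conn.commit()
        # The largest updated_at seen becomes the watermark for the next run.
        return max((r[2] for r in rows), default=last_loaded_at)

A batch pipeline would reload the full table on each run instead; the watermark approach trades that simplicity for lower cost on large, append-heavy datasets.
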
Requirements:
  • 7+ years of experience in data warehousing, database administration, or database development.
  • 5+ years of hands‑on experience as a Data Engineer using SQL and Python.
  • Expertise with cloud‑based platforms such as AWS, GCP, or Azure, and familiarity with containerization tools like Docker or Kubernetes.
  • Proven experience working with large‑scale datasets using technologies such as Snowflake, BigQuery, Redshift, Databricks, or similar.
  • Strong knowledge of ELT/ETL tools and data integration frameworks, including Airflow, DBT, Python, and REST APIs (see the orchestration sketch after this list).
  • Solid understanding of data modeling, performance optimization, and pipeline maintainability.
  • Positive, solution‑oriented mindset with a willingness to learn and adopt new technologies.
  • Excellent English communication skills and ability to work effectively in distributed teams.
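
As a rough illustration of the Airflow-plus-dbt orchestration named above, a minimal DAG might look like the sketch below. It assumes Airflow 2.4+ and an existing dbt project; the DAG id, callable, and project path are hypothetical placeholders, not details from the posting.

    # Minimal ELT orchestration sketch: extract step followed by a dbt run (hypothetical names/paths).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator

    def extract_to_staging() -> None:
        # Placeholder for an extraction step, e.g. pulling data from a REST API
        # into a staging schema in the warehouse.
        ...

    with DAG(
        dag_id="example_elt_pipeline",      # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_to_staging)
        transform = BashOperator(task_id="dbt_run", bash_command="dbt run --project-dir /opt/dbt")
        extract >> transform  # run dbt models only after extraction succeeds
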
Benefits:
  • Competitive salary and comprehensive compensation package.
  • Remote‑first and flexible working arrangements promoting work‑life balance.
  • Autonomy and personal responsibility within a collaborative team environment.
  • Opportunities for professional growth, learning, and skill development.
  • Supportive, people‑oriented culture that encourages creativity and innovation.
  • Exposure to modern cloud‑based data technologies and high‑impact projects.

Why Apply Through Jobgether?

We use an AI‑powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role’s core requirements. Our system identifies the top‑fitting candidates, and this shortlist is then shared directly with the hiring company. The final decision and next steps (interviews, assessments) are managed by their internal team.

Data Privacy Notice:

By submitting your application, you acknowledge that Jobgether will process your personal data to evaluate your candidacy and share relevant information with the hiring employer. This processing is based on legitimate interest and pre‑contractual measures under applicable data protection laws (including GDPR). You may exercise your rights (access, rectification, erasure, objection) at any time.
