
Data Engineer (f/m/x) Dortmund

NETCONOMY

Münster

Hybrid

EUR 50,000 - 60,000

Full-time

Posted 4 days ago

Summary

A leading digital solutions provider in Germany is seeking an experienced Data Engineer to build scalable data solutions using Databricks and Python. The ideal candidate should have over 3 years of experience and strong skills in data manipulation and cloud platforms. This role offers flexible working models and the opportunity to work with a dynamic team on innovative data projects.

Benefits

Flexible working models
Structured onboarding and mentoring
Monthly meal allowance
Social events

Qualifications

  • 3+ years of hands-on experience as a Data Engineer.
  • Strong programming skills in Python with data manipulation libraries.
  • Proven experience with at least one major cloud platform.

Responsibilities

  • Build scalable data solutions using Databricks and Python.
  • Design and maintain data pipelines for cloud data lakes.
  • Collaborate with data scientists and analysts to meet data needs.

Skills

Python
Databricks
Apache Spark
SQL
Data Warehousing
ETL/ELT processes
Communication skills (English, German)

Tools

Azure
Power BI
Terraform

Job Description

Salary: EUR 50,000 - 60,000 per year

Requirements:

  • 3+ years of hands-on experience as a Data Engineer working with Databricks and Apache Spark
  • Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL)
  • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables
  • Solid understanding of data warehousing principles, ETL/ELT processes, data modeling, and database systems
  • Proven experience with at least one major cloud platform (Azure, AWS, or GCP)
  • Excellent SQL skills for data querying, transformation, and analysis
  • Excellent communication and collaboration skills in English and German (min. B2 level)
  • Ability to work independently as well as part of a team in an agile environment

Responsibilities:

  • Build modern, scalable, and high-performance data solutions using Databricks, Spark, and Python
  • Design, develop, and maintain robust data pipelines for ingesting, transforming, and loading data into cloud-based data lakes and warehouses (a minimal sketch of such a pipeline step follows this list)
  • Leverage Databricks ecosystem components (SQL, Delta Lake, Workflows, Unity Catalog) for reliable and efficient data workflows
  • Integrate with cloud services like Azure, AWS, or GCP to create secure, cost-effective data solutions
  • Contribute to data modeling and architecture decisions for long-term data landscape maintainability
  • Ensure data quality through validation and governance adherence
  • Collaborate with data scientists and analysts to meet data needs
  • Stay updated on advancements in Databricks, data engineering, and cloud technologies
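
Purely as an illustration (not part of the original posting), the following minimal PySpark sketch shows the kind of ingest-transform-load step these responsibilities describe on Databricks; all paths and table names are hypothetical placeholders.

  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

  # Ingest: read raw CSV files from a (hypothetical) cloud storage mount
  raw = spark.read.option("header", "true").csv("/mnt/raw/orders/")

  # Transform: cast the amount column and aggregate per customer
  totals = (
      raw.withColumn("amount", F.col("amount").cast("double"))
         .groupBy("customer_id")
         .agg(F.sum("amount").alias("total_amount"))
  )

  # Load: write the result as a managed Delta table (Delta Lake is the
  # default table format on Databricks)
  totals.write.format("delta").mode("overwrite").saveAsTable("analytics.customer_totals")

On Databricks, a step like this would typically be scheduled via Databricks Workflows and governed through Unity Catalog, in line with the requirements above.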

Technologies:

  • Azure
  • CI/CD
  • Cloud
  • Databricks
  • DevOps
  • Support
  • Machine Learning
  • Power BI
  • Python
  • PySpark
  • Spark
  • Terraform
  • Unity Catalog

More:

NETCONOMY has grown over the past 20 years from a startup to a 500-person team across 10 European locations, emphasizing agile, cross-functional collaboration and diverse backgrounds to create outstanding digital solutions.

Our Offer:

  • Flexible working models with hybrid options
  • Structured onboarding, mentoring, and training
  • Annual company summit for professional and personal networking
  • Social events like pizza & games nights, hiking, and Christmas parties
  • Monthly meal allowance and partner restaurant discounts
  • Support for eco-friendly transportation

Contact Information:

  • Brauquartier 2, 8055 Graz, Austria
  • 43 316 81 55 44
  • office@netconomy.net

Additional Information:

NETCONOMY offers expertise in Digital Strategy, Cloud Adoption, Digital Architecture, Consulting, and Innovation. The company operates in Germany, Austria, Switzerland, Serbia, Spain, and the Netherlands, providing detailed career, industry, partnership, and news information via its website and social media channels.
