
(Senior) Data Engineer (m/f/d)

Valmet

Germany

On-site

EUR 70.000 - 90.000

Full-time


Summary

A leading tech company based in Berlin is looking for a (Senior) Data Engineer to design and implement real-time data pipelines using stream processing platforms. The ideal candidate has over five years of experience with ETL and data modeling, along with proficiency in Python and AWS. Join us to revolutionize manufacturing through advanced data solutions.

Qualifications

  • 5+ years of experience with ETL, data modeling, and data lake approaches.
  • 3+ years of experience in cloud technologies (AWS).
  • Ability to effectively communicate with both business and technical teams.

Responsibilities

  • Design and deploy real-time data pipelines using stream processing platforms.
  • Build and enhance a high-performance Data Lake using Apache Iceberg.
  • Collaborate with cross-functional teams to improve our data solutions.

Skills

ETL
Data modeling
Python
Cloud technologies (AWS)
Streaming-based systems (Kafka/Kinesis)

Education

Bachelor's degree in Management Information Systems or related field

Tools

Apache Kafka
Apache Flink
AWS Glue
Snowflake

Job Description

(Senior) Data Engineer (m/f/d)

Do you want to redefine how entire industries work by leveraging IoT, Smart Manufacturing, and Industry 4.0? Would you like to be part of the success of a digital solution that will revolutionize the manufacturing process and improve shop floor performance? If your answer is a big yes, you should keep reading!

FactoryPal is a corporate start-up headquartered in Berlin, with an additional location in Porto, Portugal. The venture is poised to become the leading end-to-end IoT solution for machine efficiency and equipment effectiveness. The digitally enabled solution is not just completely reshaping how companies produce and elevating their efficiency levels; it is fundamentally changing the way manufacturing employees do their jobs.

We are data scientists, engineers, designers, IIoT experts, product managers, and manufacturing operations consultants. We are one team, united by a shared ambition: to revolutionize manufacturing and transform the way it is done to ensure smooth operations.

Become part of an amazing journey and support our customers in their Digital Factory efforts!

Role and Responsibilities
  • Design, develop, and deploy real-time data pipelines using stream processing platforms such as Apache Kafka, Apache Flink, and AWS Glue (see the sketch after this list).
  • Build a high-performance, ACID-compliant Data Lake using Apache Iceberg.
  • Create, enhance, and optimize data models and implement data warehousing solutions within the Snowflake platform.
  • Monitor, identify, and proactively reduce technical debt to maintain system health.
  • Develop and improve the current data architecture, emphasizing data lake security, data quality and timeliness, scalability, and extensibility.
  • Deploy and use various big data technologies and run pilots to design low-latency data architectures at scale.
  • Contribute to automating and monitoring data pipelines, as well as streamlining client onboarding.
  • Collaborate with cross-functional teams, including Software Engineers, Product Owners, Data Scientists, Data Analysts, and shopfloor consultants, to build and improve our data and analytics solutions.
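To make the pipeline work concrete, here is a minimal sketch of the pattern described above: PySpark Structured Streaming reads machine events from Kafka and appends them to an Apache Iceberg table. This is an illustration under stated assumptions, not FactoryPal's actual implementation; the broker address, topic, bucket paths, catalog, table, and event schema are all hypothetical placeholders, and the production stack may equally use Flink or AWS Glue.

    # Minimal Kafka -> Iceberg streaming sketch (PySpark Structured Streaming).
    # Assumes the Iceberg runtime and Spark-Kafka connector JARs are on the
    # classpath and that the target Iceberg table exists in the catalog.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    spark = (
        SparkSession.builder.appName("machine-events")
        .config("spark.sql.extensions",
                "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
        .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
        .config("spark.sql.catalog.lake.type", "hadoop")
        .config("spark.sql.catalog.lake.warehouse",
                "s3://example-bucket/warehouse")  # hypothetical bucket
        .getOrCreate()
    )

    # Expected shape of one machine event (hypothetical).
    schema = StructType([
        StructField("machine_id", StringType()),
        StructField("metric", StringType()),
        StructField("value", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "machine-events")             # hypothetical topic
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Append each micro-batch to the Iceberg table; the checkpoint lets the
    # stream restart without duplicating commits.
    (events.writeStream.format("iceberg")
        .outputMode("append")
        .option("checkpointLocation",
                "s3://example-bucket/checkpoints/machine-events")
        .toTable("lake.factory.machine_events")
        .awaitTermination())

Iceberg's atomic commits are what make such a Data Lake ACID-compliant: each micro-batch becomes one table snapshot, so readers never see partial writes.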
Qualifications
  • Bachelor's degree in Management Information Systems, Statistics, Software Engineering, STEM, or a related technical/quantitative field.
  • 5+ years of experience with ETL, data modeling, and data lake approaches.
  • 5+ years of experience with processing multi-dimensional datasets from different sources and automating the end-to-end ETL pipeline.
  • 3+ years of experience in Python.
  • 3+ years of experience in cloud technologies (AWS).
  • 3+ years of experience with streaming-based systems (Kafka/Kinesis) and event-driven design.
  • 2+ years of experience in distributed computing systems (such as Spark/Flink).
  • Experience with continuous delivery and integration.
  • Ability to effectively communicate with both business and technical teams.
Nice to have
  • Familiarity with IoT data ingestion into any cloud system.
  • Basic understanding of Infrastructure as Code principles and experience with Terraform.
  • Proficiency in writing dbt models (e.g., sources, transformations, tests).
  • Knowledge of building data pipelines and applications to trigger or schedule jobs using Airflow (see the sketch after this list).
  • Experience with microservice architectures.
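To illustrate the Airflow item above, here is a minimal sketch of a scheduled pipeline job, assuming Airflow 2.x; the DAG id, schedule, and task body are hypothetical placeholders rather than anything taken from this posting.

    # Minimal Airflow 2.x DAG: one daily ETL task with retries.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_daily_etl():
        # Placeholder: extract from the source systems, transform,
        # and load the result into the warehouse.
        print("ETL run finished")

    with DAG(
        dag_id="daily_machine_etl",      # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",      # run once per day
        catchup=False,                   # no backfill of past runs
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        PythonOperator(task_id="run_daily_etl", python_callable=run_daily_etl)

In practice each pipeline stage (extract, transform, load, data-quality checks) would typically be its own task, with dependencies expressed via the >> operator.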
Are you interested?
Join our team and send us your complete application via the "Apply" button.


The personal data that you provide to us as part of this application process within our applicant platform (Workday) will be processed by the controller, Valmet GmbH, Marienburgstr. 35, 64297 Darmstadt, for the purpose of the selection process for the specified job advertisement. The legal basis is Art. 6 para. 1 lit. b GDPR (implementation of pre-contractual measures). The data will be deleted after completion of the procedure, at the latest after 6 months. You have the right to information, erasure, blocking, and data portability, and to lodge a complaint with the competent supervisory authority.