DataOps and Platform Engineer (m/f/d)

Future Demand

Berlin

On-site

Confidential

Full-time

26 days ago

Summary

At Future Demand, a dynamic start-up in Berlin, we're seeking a mid- to senior-level DataOps & Platform Engineer. You'll build and manage our data platform, ensuring that data flows seamlessly from our systems into actionable insights. Join a vibrant team that values innovation and collaboration, and enjoy great benefits in a supportive work environment.

Benefits

Competitive salary
Monthly learning budget
Urban Sports Club membership support
Transport subsidies for bike/public transport
Regular team events

Qualifications

  • 3+ years of experience in data engineering or DevOps.
  • Proficient in Python for scripting and SQL for data transformation.
  • Hands-on experience with cloud infrastructure and Infrastructure as Code.

Responsibilities

  • Design and maintain scalable data pipelines and ETL workflows.
  • Manage AWS cloud infrastructure with focus on containerized services.
  • Collaborate with teams to deploy machine learning models.

Skills

Python
SQL
Cloud infrastructure
CI/CD
Containerization
Agile mindset
Communication skills

Tools

PostgreSQL
Terraform
Apache Airflow
Docker

Job Description

At Future Demand we build an Audience Intelligence platform that helps our users to see what their customers really want, auto-build high-converting audiences and ads, and let AI optimise those campaigns.

To expand our product team, we are looking to hire a mid- to senior-level DataOps & Platform Engineer with a lean and agile mindset. You will join our engineering team and help develop and maintain the data infrastructure and pipelines that connect our customers’ data to insights generated by our data science and machine learning services. Join our international team and drive how campaigns are done in an AI-first world.

Tasks

Your Tasks

  • Own and evolve our data platform: Design, build, and maintain scalable data pipelines and ETL workflows (e.g. using Apache Airflow) to ensure timely, accurate data delivery across our products (a minimal pipeline sketch follows this list).
  • Cloud infrastructure management: Manage and automate our AWS cloud infrastructure (with a focus on containerized services on ECS) using Infrastructure as Code (Terraform), ensuring a reliable and scalable platform.
  • Cross-functional collaboration: Collaborate with data scientists, software engineers, and product stakeholders to design and implement new data-driven features and improvements.
  • Model deployment: Work closely with the data science team to deploy machine learning models into production, integrating them into our data pipelines and ensuring smooth operation in our platform.
  • CI/CD and automation: Implement and refine CI/CD pipelines for our data workflows, including containerization (Docker) and automated testing, to enable rapid and safe deployments of pipeline changes.
  • Monitoring and optimization: Continuously monitor pipeline performance and data quality. Troubleshoot issues proactively and optimize systems for efficiency, scalability, and reliability.
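To make the first task concrete, here is a minimal sketch of the kind of daily ETL pipeline this role would own, written as an Apache Airflow DAG. The DAG id, task functions, and schedule are illustrative assumptions and not part of Future Demand's actual codebase; the `schedule` argument assumes Airflow 2.4 or newer.

```python
# Minimal illustrative Airflow DAG: a daily extract-and-load pipeline.
# All names here (DAG id, task ids, functions) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: a real task would pull raw event data from a source system.
    print("extracting raw events")


def load_to_postgres(**context):
    # Placeholder: a real task would write transformed rows into PostgreSQL.
    print("loading transformed rows into PostgreSQL")


with DAG(
    dag_id="example_audience_etl",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day (Airflow 2.4+)
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_postgres", python_callable=load_to_postgres)

    extract >> load  # extract must finish before the load runs
```

In a production setting the placeholder callables would be replaced by real extract/transform/load logic, while Airflow's scheduler handles retries, dependencies, and monitoring.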
Requirements

Your Profile

  • Experience: 3+ years of experience in data engineering, DevOps, or a related role (mid- to senior-level). You have managed production data pipelines or platforms before.
  • Programming & databases: Proficiency in Python for scripting/automation and strong SQL skills for data transformation. Hands-on experience working with databases like PostgreSQL (a small example follows this list).
  • Cloud & Infrastructure as Code: Solid experience with cloud infrastructure (ideally AWS). You have worked with Infrastructure as Code tools (e.g. Terraform) to provision and manage resources.
  • Data pipelines: Practical experience building or orchestrating data pipelines, preferably with frameworks like Apache Airflow or similar workflow managers. You understand scheduling, monitoring, and dependency management in data processes.
  • CI/CD & Containerization: Familiarity with continuous integration/continuous delivery practices and containerization (Docker) for deploying data services.
  • Agile mindset & communication: A lean, agile approach to problem-solving and automation. Excellent communication skills, with the ability to explain complex concepts in simple terms and collaborate effectively in a cross-functional team.
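As a rough illustration of the Python and SQL work described under "Programming & databases", the sketch below runs a small aggregation against a PostgreSQL database using psycopg2. The connection string, table names, and columns are invented for the example; it also assumes a unique constraint on (event_id, sale_date) for the upsert.

```python
# Illustrative Python + SQL transformation against PostgreSQL (psycopg2).
# Table and column names are hypothetical; adapt to the real schema.
import psycopg2

TRANSFORM_SQL = """
    INSERT INTO daily_ticket_sales (event_id, sale_date, tickets_sold)
    SELECT event_id, sold_at::date AS sale_date, COUNT(*)
    FROM raw_ticket_events
    GROUP BY event_id, sold_at::date
    ON CONFLICT (event_id, sale_date) DO UPDATE
        SET tickets_sold = EXCLUDED.tickets_sold;
"""


def run_transformation(dsn: str) -> None:
    # Open a connection, run the aggregation, and commit as one transaction.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(TRANSFORM_SQL)


if __name__ == "__main__":
    # Hypothetical DSN; in practice this would come from configuration or secrets.
    run_transformation("postgresql://user:password@localhost:5432/analytics")
```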
Benefits

We offer an exciting environment at the interface of IT, live entertainment, and sports, as part of an ambitious, international, and results-oriented team in a young start-up.

  • You work with an excellent international team on varied tasks, with a wide range of great opportunities
  • An attractive and competitive salary
  • Monthly learning budget
  • Urban Sports Club membership support
  • Support for your favourite means of transport - we help finance your new bike or your public transport subscription
  • We are a start-up in one of Berlin's cultural hubs, so you'll never get bored
  • We love live events and if you do too, then you are certainly going to enjoy our regular team events!