Tagesspiegel is one of the most frequently cited newspapers in Germany and has the highest paid circulation of all newspapers in the capital region, reaching over 14 million online readers throughout Germany. The editorial team and publishing house are continuously expanding in order to meet the challenges of the media market with innovation. Engaging magazines and an extensive event and conference business round off the multimedia offering. Tagesspiegel, like Die Zeit, Handelsblatt, and WirtschaftsWoche, belongs to Dieter von Holtzbrinck's DvH Medien GmbH.
The BI and Analytics team at Tagesspiegel supports the publishing house and the editorial team with meaningful reports and precise analyses, with the aim of enabling informed, data-driven decisions and accompanying the capital's leading news medium on its way into the digital future.
Our team, currently one engineer and three analysts, covers the entire data value chain: from tracking on the web and in the app, through data processing, to the use of data for reporting, ad-hoc analyses, CRM, and data science applications.
As a Data Engineer (d/m/w), you will be an essential part of our growing team.
These are your tasks:
- ETL processes: You develop, test, and optimise data pipelines with Airflow and Python to provide a reliable basis for internal applications ranging from reporting and CRM to ML and AI (a small illustrative sketch follows this list).
- Connection of data sources: You connect data sources to our DWH (BigQuery) and automate the data transformation.
- DWH Management: You optimise data models and structures and enable efficient processing and provision of data.
- Data Operations: You analyse requirements for data-related projects and find optimal solutions for implementation.
- Data Quality & Consistency: You design solutions for the continuous improvement of the data infrastructure to ensure data quality and deliver consistent, reliable data.
- Data availability: You ensure the stability of data processing procedures and monitor data availability in order to maximise the quality of downstream applications.
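To give a feel for the kind of pipeline work described above, here is a minimal, purely illustrative sketch, not our actual setup: a daily Airflow DAG that loads a CSV export from Google Cloud Storage into a BigQuery staging table. All bucket, project, dataset, and table names are placeholders.

```python
# Illustrative sketch only, assuming Airflow 2.x with the Google provider installed.
# All bucket/project/dataset/table names below are placeholders, not real resources.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_events_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's export from Cloud Storage into a BigQuery staging table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events_to_staging",
        bucket="example-export-bucket",                       # placeholder bucket
        source_objects=["exports/events_{{ ds }}.csv"],       # templated with the run date
        destination_project_dataset_table="example-project.staging.events",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",  # replace the table contents on each run
    )
```

In practice, a load step like this would typically be followed by transformation and quality-check tasks before the data reaches reporting, CRM, or ML applications.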
What you bring to the table:
- Relevant background: You have completed a technical degree or relevant training.
- ETL skills: You have experience in using Apache Airflow and Python or comparable tools.
- Database knowledge: You are familiar with BigQuery and related Google Cloud products.
- Software development processes: You have experience working with software development processes & tools (Jira, Git, Docker).
- Communication skills: You have the ability not only to understand complex concepts, but also to communicate them clearly.
- ML experience: Ideally, you already have some experience implementing machine learning pipelines and working with ML frameworks (e.g. TensorFlow or Keras).
What we offer:
- Opportunity to work on impactful projects reaching millions of readers within an established media company
- Collaborative environment with emphasis on professional growth
- Fair compensation and additional benefits such as a BVG company ticket and direct insurance
- Work at the intersection of technology and digital journalism
- Flat structures, short decision-making processes, and direct communication
- Creative freedom to contribute and implement your own ideas
- Flexible working hours, a 38.5-hour week, and 30 vacation days (based on a full-time position)
- Home office
As an employer, Tagesspiegel stands for equal opportunities and respectful interaction with one another. Fair employment opportunities, regardless of ethnic or social background, gender, religion, worldview, age, sexual identity, or disability, are a matter of course for us. An appreciative and motivating work environment is our shared incentive and goal.
Do you think we should get to know each other?
Then apply now through our job portal https://verlagsjobs.tagesspiegel.de/ and send us your complete application documents including your resume, earliest possible availability, and salary expectations.
We look forward to getting to know you!