(Senior) Data Engineer with Data Analytics (m/f/x)

Schwarz Group

Welcome (SC)

Remote

USD 90,000 - 140,000

Full time

14 days ago

Job summary

Join a leading company striving to revolutionize marketing through data with the Schwarz Media Platform. You will design and implement data-driven features for Europe's most advanced ad network, contribute to scaling its data infrastructure, and develop cutting-edge reporting solutions that measure advertising impact effectively. This remote role requires strong technical expertise and collaboration skills to thrive in a fast-paced environment.

Qualifications

  • 5+ years of professional experience working on data-intensive applications.
  • Fluency with Python and proficiency in SQL.
  • Experience with Apache Spark and data visualization tools.

Responsibilities

  • Design and implement data-centered features for Europe's largest Ad Network.
  • Help scale data stores and optimize data processing workflows.
  • Develop market-leading reporting solutions for advertising campaigns.

Skills

Python
SQL
Data visualization
Data analysis
Collaboration
Communication

Tools

Looker
Tableau
Apache Spark
Kubernetes
Google Cloud Platform
Snowflake
BigQuery
Databricks
Apache Airflow

Job description

How can we change the world to make marketing both relevant and impactful? With your help! At Schwarz Media Platform, we are on a mission to build Europe's largest and most advanced ad network for retail - a real-life AdTech application with a big impact on consumers, stores, and advertisers. It is based on Europe's largest retail data pool from Europe's No. 1 retailer, Schwarz Group, and cutting-edge technology that understands individual consumer behavior at scale. If you are interested in this vision and are excited about how data and engineering excellence can help us get there, you will love Schwarz Media Platform.

What you'll do

  • Work in a cross-functional product team to design and implement data-centered features for Europe's largest Ad Network
  • Help scale our data stores, data pipelines, and ETLs, handling terabytes of data from one of Europe's largest retail companies
  • Design and implement efficient data processing workflows
  • Continue to develop our custom data processing pipeline and continuously search for ways to improve our technology stack along our increasing scale
  • Develop and standardize a product for measuring the incremental impact of advertising campaigns
  • Design and deliver market-leading reporting solutions
  • Leverage Business Intelligence tools to provide internal business insights, supporting strategic decision-making and driving product development initiatives
  • Extend our reporting platform for external customers and internal stakeholders to measure advertising performance
  • Work in a fully remote setup, meeting your colleagues in person at company-wide and engineering-specific onsite events

What you'll bring along

  • 5+ years of professional experience working on data-intensive applications
  • Fluency with Python and proficiency in SQL
  • Experience with developing scalable data pipelines with Apache Spark
  • Experience with data visualization tools (e.g., Looker, Tableau, Microstrategy)
  • Familiarity with statistical techniques and A/B testing methodologies
  • Good understanding of efficient algorithms and how to analyze them
  • Curiosity about how databases and other data processing tools work internally
  • Ability to write testable and maintainable code that scales
  • Ability to present findings in a clear, concise manner to both technical and non-technical stakeholders
  • Familiarity with git
  • Excellent communication skills and a team-player attitude

Great if you also have

  • Experience with Kubernetes
  • Experience with Google Cloud Platform
  • Experience with Snowflake, BigQuery, Databricks, and Dataproc
  • Knowledge of columnar databases and file formats like Apache Parquet
  • Knowledge of "Big Data" technologies like Delta Lake
  • Experience with workflow management solutions like Apache Airflow
  • Knowledge of Dataflow / Apache Beam