
SAP Data Engineer / Integration Specialist

Ernst & Young Advisory Services Sdn Bhd

Mannheim

On-site

EUR 70,000–90,000

Full-time


Summary

A leading consulting firm in Mannheim is seeking a Data Engineer / Integration Specialist to design and optimize data pipelines for SAP integration. The role requires 8–12+ years of experience in data engineering, with a focus on cloud-based data technologies. This position offers the opportunity to work with a global team and develop future-focused skills in a flexible environment.

Benefits

Flexible work environment
Diverse and inclusive culture
Future-focused skill development

Qualifications

  • 8–12+ years of experience in data engineering and system integration.
  • 3–5 years focusing on SAP data pipelines and cloud integration technologies.
  • Certifications in relevant tools or cloud platforms (preferred).

Responsibilities

  • Build and maintain ETL/ELT data pipelines and integrations.
  • Develop scalable data flows connecting SAP systems with cloud platforms.
  • Monitor data jobs and troubleshoot failures.

Skills

SAP BDC
ETL tools
cloud integration services
SQL
Python
Agile delivery

Education

BS/MS in Computer Science, Data Engineering, Information Systems

Tools

Azure Data Factory
Informatica
Databricks
AWS Glue

Job Description

Location: Mannheim

Date: Oct 27, 2025

Requisition ID: 1644844

At EY, we’re all in to shape your future with confidence.

We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go.

Join EY and help build a better working world.

Job Summary

As a Data Engineer / Integration Specialist within the EY SAP Enterprise Data Management Initiative run by SAP Platforms & Assets, you will be responsible for designing, building, and optimizing scalable data pipelines and system integrations for data management and transformation projects.

You will be part of a global team creating an advanced, cloud‑enabled data platform that uses technologies from SAP, Databricks, Snowflake, NVIDIA and Microsoft. Your focus will be to enable seamless data movement, transformation, and integration between SAP systems and modern data platforms, ensuring data availability and quality across multiple environments.

This role requires a hands‑on, technically proficient individual with deep experience in both SAP integration and modern cloud‑native data engineering.

Essential Functions of the Job
  • Build and maintain ETL/ELT data pipelines and integrations to support SAP BDC migration and transformation workflows (a minimal sketch follows this list).
  • Develop scalable data flows connecting SAP systems with cloud platforms such as Databricks, Azure Synapse, and Snowflake.
  • Integrate SAP data (IDocs, BAPIs, flat files) with modern data lakes, warehouses, and analytical tools using Databricks, NVIDIA RAPIDS, and other technologies.
  • Optimize data transformation jobs for performance, reliability, and maintainability in hybrid or multi‑cloud setups.
  • Collaborate with architects to ensure integration solutions align with enterprise data strategy and standards.
  • Apply best practices in data security, encryption, data masking, and compliance within integration pipelines.
  • Develop reusable scripts, connectors, and data wrangling logic across SAP and cloud‑native environments.
  • Monitor data jobs, troubleshoot failures, and perform root‑cause analysis to resolve complex data movement issues.
  • Use CI/CD practices to automate deployment of data jobs across dev/test/prod environments.
  • Document technical specifications, mappings, job flows, and operational procedures to support long‑term maintainability.
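
As an illustration of the kind of flow these responsibilities describe, here is a minimal PySpark sketch that lands a flat-file SAP extract in a Delta table, assuming a Databricks-style environment where the Delta format is available. The path, table names, and the choice of the MARA material master as the extract are hypothetical, chosen only to make the example concrete.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: ingest a pipe-delimited flat-file extract of the
# SAP material master (MARA) and land it in a bronze Delta table.
# Paths and names are illustrative, not taken from the job description.
spark = SparkSession.builder.appName("sap-flat-file-ingest").getOrCreate()

raw = (
    spark.read
    .option("header", True)       # extract includes a header row
    .option("delimiter", "|")     # pipe-delimited, common for SAP extracts
    .csv("/mnt/landing/sap/mara_extract.csv")
)

cleaned = (
    raw
    .withColumn("load_ts", F.current_timestamp())  # audit column
    .dropDuplicates(["MATNR"])                     # material number as key
)

# Overwrite keeps the sketch idempotent across reruns; a production job
# would more likely merge new records incrementally.
cleaned.write.format("delta").mode("overwrite").saveAsTable("bronze.sap_mara")
```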

Knowledge and Skills Requirements
  • Deep experience with SAP BDC, data migration, and SAP integration techniques (LSMW, IDocs, BAPIs, BDC recordings).
  • Strong proficiency in ETL tools and frameworks, e.g., SAP BODS, Azure Data Factory, Informatica, or Talend.
  • Hands‑on with cloud‑based integration services, e.g., AWS Glue, Google Dataflow, Snowflake Tasks/Streams, or Databricks Workflows.
  • Familiarity with cloud data platforms (Azure Synapse, Google BigQuery, Snowflake) and parallel compute frameworks (NVIDIA RAPIDS, PySpark).
  • Strong skills in SQL, scripting (Python, Shell), and version control (Git).
  • Knowledge of API integrations, message queues, and event‑driven data pipelines.
  • Experience in data quality validation and exception handling within pipelines (see the sketch after this list).
  • Comfortable working in Agile delivery and CI/CD environments.
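
To make the data-quality and exception-handling point concrete, the sketch below shows one simple way to split records into valid rows and rejects with a reason attached. It is a hypothetical illustration in plain Python; the field names and rules are invented for the example, not taken from any EY pipeline.

```python
from dataclasses import dataclass

@dataclass
class ValidationResult:
    valid: list       # rows that continue down the pipeline
    rejected: list    # rows routed to an exception table or queue

def validate_records(records):
    """Split records into valid rows and tagged rejects.

    Field names (material_id, quantity) are illustrative only.
    """
    valid, rejected = [], []
    for rec in records:
        if not rec.get("material_id"):
            rejected.append({**rec, "reject_reason": "missing material_id"})
        elif rec.get("quantity", 0) < 0:
            rejected.append({**rec, "reject_reason": "negative quantity"})
        else:
            valid.append(rec)
    return ValidationResult(valid=valid, rejected=rejected)

# Usage: good rows flow on; rejects go to an exception channel for review.
result = validate_records([
    {"material_id": "M-1001", "quantity": 5},
    {"material_id": "", "quantity": 2},      # rejected: missing key
])
print(len(result.valid), len(result.rejected))  # 1 1
```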

Other Requirements
  • Strong collaboration skills in global delivery models (onshore/offshore).
  • Certifications in relevant tools or cloud platforms (e.g., Azure Data Engineer, AWS Big Data) are a plus.
  • Working experience in regulated or large enterprise data environments is desirable.
  • Ability to travel based on project or client needs.

Job Requirements
  • Education
    • BS/MS in Computer Science, Data Engineering, Information Systems, or related field.
  • Experience
    • 8–12+ years of experience in data engineering and system integration, with 3–5 years focused on SAP data pipelines and cloud integration technologies.

What we offer you

At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams.

Are you ready to shape your future with confidence? Apply today.

To help create an equitable and inclusive experience during the recruitment process, please inform us as soon as possible about any disability‑related adjustments or accommodations you may need.
