Our client is currently seeking a Sr. SAP ETL Developer.
Responsibilities
- SAP Data Integration (70%):
- Architect and implement robust ETL pipelines to extract data from SAP ECC, SAP S/4HANA, SAP HANA, and SAP Datasphere using best-practice integration methods (ODP, CDS views, RFCs, BAPIs).
- Analyze and interpret SAP’s internal data models (tables like BKPF, BSEG, MARA, EKPO) and work closely with SAP functional and technical teams.
- Work with SAP Datasphere (formerly SAP Data Warehouse Cloud) to federate or replicate SAP data for consumption in the EDP (highly desired).
- Review or interpret ABAP logic when necessary to understand legacy transformation rules and business logic (nice to have).
- Lead the end-to-end data integration process from SAP ECC, ensuring deep alignment with the EDP’s design and downstream data usage needs.
- Leverage knowledge of SAP HANA data warehousing and SAP BW to support historical reporting and semantic modeling.
- Integrate data from non-SAP ERP systems such as JD Edwards and other legacy systems into a unified data platform.
- Transform and model data for analytics within Microsoft Fabric, including OneLake, Dataflows Gen2, and Synapse Data Warehouse.
- Establish data quality, security, and governance standards within integration workflows.
- Document technical processes and contribute to the ongoing improvement of data integration frameworks.
- Stay current on SAP and Microsoft data ecosystem developments to inform future architecture and tools.
- Enterprise ETL and Cloud Data Engineering (30%):
- Design and build robust, scalable ETL/ELT pipelines to ingest data into the Microsoft cloud using tools such as Azure Data Factory or Alteryx.
- Automate data movement from SAP into Azure Data Lake Storage/OneLake, enabling clean handoffs for consumption by Power BI, data science models, and APIs.
- Build data transformations in SQL, Python, and PySpark, leveraging distributed compute (e.g., Synapse or Spark pools).
- Work closely with cloud architects to ensure integration patterns are secure, cost-effective, and meet performance SLAs.
- Data Quality and Governance:
- Establish and enforce data quality standards and governance practices to ensure data integrity and consistency across integrated systems.
- Monitor, troubleshoot, and address data quality issues, implementing solutions as needed.
- Collaboration and Communication:
- Collaborate with data modelers to define canonical enterprise models and develop mappings from SAP source tables.
- Work closely with cross-functional teams, including data analysts, business analysts, and ERP administrators, to understand data requirements and deliver solutions.
- Provide technical expertise and guidance to team members and stakeholders regarding data transformation and integration.
- Drive pragmatic approaches to solving complex business problems by providing data models suited for business intelligence/analytics tools such as Power BI.
- Work with engineering teams to enable the appropriate capture and storage of data.
Knowledge/Skills/Abilities
- Proven track record working with SAP tables, modules, and transactional data across finance, supply chain, procurement, and production planning.
- Strong proficiency in one or more enterprise-grade ETL tools (Azure Data Factory, Informatica, Alteryx, SSIS).
- Proficient in SQL for data transformation and orchestration.
- Experience building data pipelines at scale, including partitioning, parallelization, error handling, and monitoring.
- Expert knowledge of data modeling, data warehousing, and big data technologies.
- Understanding of data privacy regulations and best practices related to storing PII.
- Experience with dimensional model design (star schema), Kimball data warehousing concepts, and conversion of data stored in 3NF to a denormalized form geared toward reporting.
- Proven track record of delivering significant business impact in Finance, Supply Chain, Sales, or other verticals.
Education and Experience
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field. Relevant certifications (e.g., Microsoft Certified: Azure Data Engineer, Azure Data Scientist; Snowflake: SnowPro Data Engineer) are a plus.
- 10-15 years of IT experience, with at least 8-10 years of SAP experience (SAP ECC and SAP S/4HANA).
- Hands-on experience with Azure cloud data services, including Synapse Analytics, Data Lake Storage, and Azure SQL Database.
- Experience building cloud-native applications on platforms such as Microsoft Azure, AWS, or GCP.
- Experience with tools such as Azure Data Factory, Informatica, Alteryx, PySpark, and Python; with SAP Datasphere, SAP Business Data Cloud, and SAP Data Services; and with Microsoft Fabric.