Azure Developer at Hyperion Technologies

Washington (District of Columbia)

Hybrid

USD 100,000 - 140,000

Full time

25 days ago

Job summary

A leading company is seeking an Azure Developer in Washington, DC, to spearhead data engineering initiatives using Azure Cloud Services and SAP BPC. The role requires designing scalable ETL pipelines and collaborating with SAP teams for data management. Ideal candidates will have substantial experience with Azure Databricks, proficiency in Python and SQL, and strong analytical skills to drive effective data delivery and compliance.

Qualifications

  • 5+ years of experience in data engineering.
  • Strong focus on Azure Databricks and understanding of data modeling.
  • Experience with SAP BPC, including data structures and logic scripts.

Responsibilities

  • Analyze current SAP BPC data models and integration points.
  • Design and implement scalable ETL/ELT pipelines in Azure Databricks.
  • Collaborate with SAP teams for data extraction and transformation.

Skills

Data engineering
Azure Databricks
Data modeling
Python
SQL
Version control
Analytical skills
Communication skills

Job description

Title: Azure Developer
Location: Washington, DC

Hybrid Onsite: 4 days onsite per week from Day 1

Premium Skills:
Azure Cloud Services (PaaS and IaaS)
SAP APO, SAP Fiori, SAP BPC, S/4HANA

Required Qualifications:
5+ years of experience in data engineering, with a strong focus on Azure Databricks.
Strong understanding of data modeling for planning and forecasting.
Experience with SAP BPC, including data structures, logic scripts, and planning process flows.
Hands-on experience with Azure services: ADLS Gen2, Data Factory, Synapse, Key Vault, etc.
Ability to translate legacy planning logic into modern, modular, and scalable data pipelines.
Experience working with cloud-based planning tools or their integration patterns.
Proficient in Python, SQL, and version control (e.g., Git).
Strong analytical and communication skills.

Preferred Qualifications:
Experience in the finance, FP&A, or enterprise performance management (EPM) domain.
Prior involvement in BPC migration projects or cloud planning platform implementations.
Familiarity with cloud-based FP&A tools.

Key Responsibilities:
Analyze the current SAP BPC data models, processes, and integration points.
Design and implement scalable ETL/ELT pipelines in Azure Databricks to support data extraction, transformation, and delivery to the new planning platform (see the illustrative sketch after this list).
Collaborate with SAP teams to extract actuals, plans, forecasts, and master data from SAP BPC (NetWeaver or MS version).
Translate BPC logic (scripts, transformations, allocations) into Databricks-based data models and logic.
Ensure accurate and timely data delivery from Databricks to the new cloud planning platform via APIs, flat files, or direct connectors.
Create reusable frameworks for data quality, lineage, and reconciliation.
Partner with solution architects and planning platform experts to ensure smooth integration and alignment.
Document technical solutions and support knowledge transfer to internal teams.
Ensure security, compliance, and performance best practices are followed across the data stack.
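
For illustration only (not part of the original posting): a minimal PySpark sketch of the kind of Databricks pipeline step these responsibilities describe, reading a SAP BPC extract from ADLS Gen2, applying a simple transformation, and writing a Delta table for downstream delivery. All storage paths, column names, and table names below are hypothetical assumptions, not details from the role.

    # Minimal sketch of a Databricks ETL step; paths, columns, and table names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

    # Hypothetical ADLS Gen2 location of a raw SAP BPC actuals extract.
    source_path = "abfss://raw@examplestorage.dfs.core.windows.net/sap_bpc/actuals/"

    # Read the raw extract (assumed CSV with a header row).
    actuals = (
        spark.read.format("csv")
        .option("header", "true")
        .option("inferSchema", "true")
        .load(source_path)
    )

    # Example transformation: derive a fiscal period and aggregate amounts by cost center.
    curated = (
        actuals
        .withColumn("fiscal_period", F.date_format(F.col("posting_date"), "yyyy-MM"))
        .groupBy("cost_center", "fiscal_period")
        .agg(F.sum("amount").alias("actual_amount"))
    )

    # Persist as a Delta table that a downstream export job or planning-platform connector can consume.
    curated.write.format("delta").mode("overwrite").saveAsTable("finance.bpc_actuals_curated")

In practice such a step would sit inside a larger orchestrated pipeline (e.g., triggered by Azure Data Factory), with secrets pulled from Key Vault and data-quality and reconciliation checks applied before delivery.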
