Azure Developer at Hyperion Technologies

Washington (District of Columbia)

Hybrid

USD 90,000 - 140,000

Full time

30+ days ago

Job summary

An established industry player is seeking an Azure Developer to lead the transformation of data solutions. This role involves designing and implementing scalable data pipelines using Azure Databricks, while collaborating with SAP teams to enhance data integration. The ideal candidate will have a strong background in data engineering, particularly with Azure services, and will play a crucial role in ensuring data accuracy and compliance. If you're passionate about leveraging cloud technologies to drive business success, this opportunity is perfect for you.

Qualifications

  • 5+ years of experience in data engineering focused on Azure Databricks.
  • Strong understanding of data modeling for planning and forecasting.

Responsibilities

  • Design and implement scalable ETL/ELT pipelines in Azure Databricks.
  • Collaborate with SAP teams to extract actuals, plans, forecasts, and master data.

Skills

Azure Cloud Services (PaaS and IaaS)
SAP BPC
Python
SQL
Data Engineering
Data Modeling
Version Control (Git)
Analytical Skills
Communication Skills

Tools

Azure Databricks
Azure Data Factory
Azure Synapse
Azure Key Vault

Job description

Job Title: Azure Developer

Location: Washington, DC

Hybrid Onsite: 4 days onsite per week from Day 1

Premium Skills:
  • Azure Cloud Services (PaaS and IaaS)
  • SAP APO, SAP Fiori, SAP BPC, S/4 HANA
Required Qualifications:
  • 5+ years of experience in data engineering, with a focus on Azure Databricks
  • Strong understanding of data modeling for planning and forecasting
  • Experience with SAP BPC, including data structures, logic scripts, and planning process flows
  • Hands-on experience with Azure services: ADLS Gen2, Data Factory, Synapse, Key Vault, etc.
  • Ability to translate legacy planning logic into modern, modular, and scalable data pipelines
  • Experience working with cloud-based planning tools or their integration patterns
  • Proficient in Python, SQL, and version control (e.g., Git)
  • Strong analytical and communication skills
Preferred Qualifications:
  • Experience in finance, FP&A, or enterprise performance management (EPM) domain
  • Prior involvement in BPC migration projects or cloud planning platform implementations
  • Familiarity with cloud-based FP&A tools
Key Responsibilities:
  • Analyze current SAP BPC data models, processes, and integration points
  • Design and implement scalable ETL/ELT pipelines in Azure Databricks to support data extraction, transformation, and delivery to the new planning platform
  • Collaborate with SAP teams to extract actuals, plans, forecasts, and master data from SAP BPC (NetWeaver or MS version)
  • Translate BPC logic (scripts, transformations, allocations) into Databricks-based data models and logic
  • Ensure accurate and timely data delivery from Databricks to the new cloud planning platform via APIs, flat files, or direct connectors
  • Create reusable frameworks for data quality, lineage, and reconciliation
  • Partner with solution architects and planning platform experts to ensure smooth integration and alignment
  • Document technical solutions and support knowledge transfer to internal teams
  • Ensure security, compliance, and performance best practices are followed across the data stack
