
Azure Architect

TestYantra Software Solutions

Böblingen

On-site

EUR 80,000 - 100,000

Full-time

Today
Be among the first applicants


Summary

A leading software company in Böblingen is seeking an experienced Azure Data Architect to design scalable and secure data solutions using Microsoft Azure technologies. Responsibilities include architecting end-to-end data solutions, ensuring data governance, and optimizing performance. Candidates should have over 10 years of experience in data architecture and strong knowledge of Azure services. This role offers competitive compensation and opportunities for collaboration with cross-functional teams.

Qualifications

  • 10+ years in data architecture and engineering; 3+ years in Azure cloud data solutions.
  • Strong knowledge of dimensional modeling, star/snowflake schemas, and NoSQL.
  • Excellent communication and stakeholder management skills.

Responsibilities

  • Architect end-to-end data solutions using Azure technologies.
  • Create data models and implement data governance.
  • Optimize Databricks clusters for performance.
  • Integrate with analytics tools for reporting and visualization.
  • Ensure data security and compliance standards.

Skills

Azure Data Factory
Azure Databricks
PySpark
Data Governance
SQL
Python
CI/CD
Terraform
Job Description

Role Overview

We are seeking an experienced Azure Data Architect to design and implement scalable, secure, and high-performing data solutions on Microsoft Azure. The ideal candidate will have strong expertise in Azure Databricks, PySpark, and modern data architecture principles to enable advanced analytics and business intelligence.

Key Responsibilities
  • Architect End-to-End Data Solutions
    • Design and implement data ingestion, transformation, and storage pipelines using Azure Data Factory (ADF), Azure Databricks, and PySpark.
    • Develop lakehouse architectures leveraging Delta Lake and medallion patterns (Bronze/Silver/Gold layers).
  • Data Modeling & Governance
    • Create conceptual, logical, and physical data models for structured and semi-structured data.
    • Implement data governance using Azure Purview and Unity Catalog for metadata and access control.
  • Performance & Optimization
    • Optimize Databricks clusters for cost and performance; implement auto-scaling and caching strategies.
    • Apply PySpark best practices for distributed data processing and transformation.
  • Integration & Analytics
    • Integrate with Azure Synapse Analytics, Power BI, and other BI tools for reporting and visualization.
    • Enable real-time and batch data processing using Event Hubs, Stream Analytics, and Spark Structured Streaming.
  • Security & Compliance
    • Ensure adherence to data security, privacy, and regulatory compliance standards.
    • Implement RBAC, VNet injection, and encryption for data at rest and in transit.
  • Collaboration & Leadership
    • Work closely with business stakeholders, data engineers, and data scientists to translate requirements into technical solutions.
    • Mentor team members on PySpark, Databricks, and Azure best practices.
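The lakehouse responsibilities above center on the medallion pattern (Bronze raw ingestion, Silver cleansing, Gold business aggregates). As a rough conceptual sketch only: plain Python lists stand in for Delta tables here so the example runs without a Spark cluster, and all record fields are hypothetical; in Azure Databricks each step would be a PySpark DataFrame transformation written to Delta Lake.

```python
# Medallion-pattern sketch: Bronze (raw) -> Silver (cleaned) -> Gold (aggregated).
# Lists of dicts stand in for Delta tables; field names are illustrative.

raw_events = [  # Bronze: raw ingested records, kept exactly as received
    {"order_id": "1", "amount": "100.0", "country": "DE"},
    {"order_id": "2", "amount": "bad", "country": "DE"},  # malformed row
    {"order_id": "3", "amount": "50.5", "country": "FR"},
]

def to_silver(rows):
    """Silver: validate and type-cast; drop records that fail parsing."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": r["order_id"],
                        "amount": float(r["amount"]),
                        "country": r["country"]})
        except ValueError:
            continue  # malformed record is quarantined/dropped
    return out

def to_gold(rows):
    """Gold: business-level aggregate (revenue per country)."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(raw_events)
gold = to_gold(silver)
print(gold)  # {'DE': 100.0, 'FR': 50.5}
```

The same layering keeps raw data replayable (Bronze) while serving analytics-ready tables (Gold) to Synapse or Power BI.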
Required Skills & Qualifications
  • Experience:
    • 10+ years in data architecture and engineering; 3+ years in Azure cloud data solutions.
  • Technical Expertise:
    • Azure Services: Data Factory, Databricks, Synapse Analytics, Data Lake Storage Gen2, Cosmos DB.
    • Big Data Tools: PySpark, Delta Lake, SparkSQL.
    • Programming: Python (with Pandas, NumPy), SQL; familiarity with Scala is a plus.
  • Data Architecture:
    • Strong knowledge of dimensional modeling, star/snowflake schemas, and NoSQL.
  • Other Skills:
    • CI/CD with Azure DevOps, Terraform for infrastructure as code.
    • Excellent communication and stakeholder management skills.
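Dimensional modeling with star schemas, named in the qualifications above, can be illustrated with a tiny self-contained example. This is a sketch only: `sqlite3` is used so it runs anywhere, and the table and column names (`fact_sales`, `dim_product`, `dim_date`) are hypothetical; in this role the equivalent schemas would live in Synapse or Databricks SQL.

```python
import sqlite3

# Tiny star schema: one fact table referencing two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    revenue     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date    VALUES (20240101, 2024);
INSERT INTO fact_sales  VALUES (1, 20240101, 100.0),
                               (2, 20240101, 60.0),
                               (1, 20240101, 40.0);
""")

# Typical star-schema query: join the fact table to its dimensions,
# then aggregate the measure by dimension attributes.
rows = con.execute("""
    SELECT p.name, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.name, d.year
    ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 2024, 60.0), ('Widget', 2024, 140.0)]
```

The design choice being shown: facts hold additive measures and foreign keys only, while descriptive attributes live in the dimensions, which keeps BI queries a simple join-and-aggregate.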