SC Cleared Azure Data Engineer

Anglia IT Recruitment

United States

Remote

USD 60,000 - 80,000

Full time

5 days ago

Job summary

Join a forward-thinking company as an SC Cleared Azure Data Engineer, where you'll leverage your expertise in Azure data services to analyze and transform data for impactful metrics. This remote role offers the chance to collaborate closely with analysts and data scientists, ensuring business requirements are met through innovative data solutions. You'll be responsible for developing ADF pipelines, creating Databricks notebooks, and refining data models. If you're passionate about data engineering and ready to make a difference, this is the perfect opportunity for you!

Qualifications

  • Solid background in Azure data services and experience with Databricks.
  • Strong skills in PySpark and SQL for data transformation.

Responsibilities

  • Analyze raw data for parsing and transformation towards metric development.
  • Create normalized data models and develop ADF pipelines for data orchestration.

Skills

Azure Data Factory (ADF)
Azure Synapse
SQL
Databricks
PySpark
Data Modeling
Power BI

Job description

Role: SC Cleared Azure Data Engineer

Location: Remote

Duration: 1-3 months

Rate: £500 per day (Inside IR35)

Responsibilities:

  • Analyse raw data (mostly in JSON format) for data parsing, schema evolution, and data transformation towards metric development.
  • Analyse reporting/metric requirements from a data engineering perspective for refinement, estimation, development, and deployment.
  • Work closely with analysts and data scientists to understand the business requirements, data sources, and logic for metric development.
  • Create normalised/dimensional data models based on the requirements.
  • Translate and refine the notebooks and logic developed as part of the prototype.
  • Transform data from the landing/staging/transformed layers into the Synapse dimensional model.
  • Create notebooks in Databricks for incremental data load and transformation.
  • Create stored procedures for data load and transformation in Azure Synapse dedicated SQL pools.
  • Create ADF pipelines for data orchestration across the different data layers of Databricks and Synapse.
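To give a feel for the first responsibility above: in practice the raw JSON would be parsed with PySpark on Databricks (e.g. `spark.read.json` or `from_json`), but the plain-Python sketch below illustrates the shape of the task -- flattening nested JSON events into tabular records for metric development. The field names (`user`, `event`, `ts`) are invented for illustration; the actual source schema is not given in this description.

```python
import json

def flatten_event(raw: str) -> dict:
    """Parse one raw JSON event and flatten the nested fields needed
    for metrics. Missing keys come back as None, so an upstream schema
    change does not break the parse (a crude nod to schema evolution)."""
    event = json.loads(raw)
    return {
        "event_type": (event.get("event") or {}).get("type"),
        "user_id": (event.get("user") or {}).get("id"),
        "event_ts": event.get("ts"),
    }

raw = '{"user": {"id": 42}, "event": {"type": "login"}, "ts": "2024-01-01T09:30:00Z"}'
print(flatten_event(raw))
# {'event_type': 'login', 'user_id': 42, 'event_ts': '2024-01-01T09:30:00Z'}
```

The same pattern scales up in PySpark by applying an explicit schema to the JSON column and selecting the flattened fields into the staging layer.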

Skills:

  • Solid background in Azure data services such as ADF, Synapse, SQL, and Azure Databricks (ADB).
  • Experience developing Databricks notebooks for data ingestion, validation, transformation, and metric build.
  • Strong PySpark and SQL skills.
  • Experience in ADF pipeline development, data orchestration techniques, monitoring, and troubleshooting.
  • Experience in stored procedure development.
  • Good knowledge of data modelling and Power BI.
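The incremental-load notebooks mentioned above typically combine a watermark filter (only rows changed since the last run) with an upsert by business key -- on Databricks this is usually a Delta Lake `MERGE INTO`. The plain-Python sketch below shows only the logic; the column names `id` and `updated_at` are illustrative, not from this specification.

```python
def incremental_merge(target: dict, incoming: list, watermark: str,
                      key: str = "id", ts: str = "updated_at"):
    """Upsert rows changed since the last load -- a plain-Python stand-in
    for the Delta MERGE an incremental Databricks notebook would run."""
    new_watermark = watermark
    for row in incoming:
        if row[ts] > watermark:          # only rows newer than the last load
            target[row[key]] = row       # upsert by business key
            new_watermark = max(new_watermark, row[ts])
    return target, new_watermark

target = {1: {"id": 1, "updated_at": "2024-01-01"}}
incoming = [{"id": 1, "updated_at": "2024-01-02"},   # changed since watermark
            {"id": 2, "updated_at": "2023-12-31"}]   # older: skipped
merged, wm = incremental_merge(target, incoming, watermark="2024-01-01")
```

After the run, row 1 is updated, the stale row 2 is ignored, and the new watermark (`2024-01-02`) is persisted for the next incremental load.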

Candidates should either hold or be willing to undergo SC Clearance.
