Databricks Developer

Stanra Tech Solutions

Agra District

Remote

INR 9,00,000 - 12,00,000

Part time

2 days ago

Job summary

A technology solutions company is seeking a skilled Databricks Developer for a 4-month remote contract. Candidates should have 4+ years of experience with Databricks, Python, and data engineering technologies. The role includes developing data pipelines and collaborating with teams to deliver data solutions in multi-cloud environments. The position involves India evening shifts (till 11:30 PM IST).

Skills

Databricks
Python
Spark (PySpark)
DBT
AWS data services
SQL
Workflow orchestration
Data modeling

Tools

Airflow
AWS S3
AWS Glue
AWS Redshift
Azure
GCP

Job description

Job Title

Databricks Developer (Contract)

Contract Duration

4 months, extendable based on performance

Job Location

Remote

Job Timings

India Evening Shift (till 11:30 PM IST)

Experience Required

4+ Years

Job Description

We are seeking a skilled Databricks Developer to join our team on a 4-month contract basis. The ideal candidate will have strong expertise in modern data engineering technologies and the ability to work in a fast-paced, remote environment.

Key Responsibilities

  • Develop, optimize, and manage large-scale data pipelines using Databricks, PySpark, DBT, and AWS S3/Glue/Redshift (a minimal sketch of such a pipeline step follows this list).
  • Work in multi-cloud environments including AWS, Azure, and GCP.
  • Implement workflow orchestration using Airflow or similar frameworks.
  • Design, implement, and manage data warehouse solutions, schema evolution, and data versioning.
  • Collaborate with cross-functional teams to deliver high-quality data solutions.
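
For illustration only, a pipeline step of the kind described above might look like the minimal PySpark sketch below. The bucket, table, and column names are hypothetical placeholders, not details from this posting, and the snippet assumes a Databricks runtime where Delta Lake and an active Spark session are available.

    # Minimal sketch of a daily batch load step (illustrative names only).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

    # Read raw JSON files landed in S3 (path is a hypothetical placeholder).
    raw = spark.read.json("s3://example-raw-bucket/orders/2024-01-01/")

    # Basic cleanup: deduplicate, cast amounts, stamp the load time.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_amount", F.col("order_amount").cast("double"))
           .withColumn("_loaded_at", F.current_timestamp())
    )

    # Persist as a Delta table, partitioned by order date.
    (
        cleaned.write.format("delta")
               .mode("append")
               .partitionBy("order_date")
               .saveAsTable("analytics.orders_daily")
    )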

Required Skills & Experience

  • 4+ years of hands-on experience in Databricks, Python, Spark (PySpark), DBT, and AWS data services.
  • Strong experience with SQL and large-scale datasets.
  • Hands-on exposure to multi-tenant environments (AWS/Azure/GCP).
  • Knowledge of data modeling, data warehouse design, and best practices.
  • Good understanding of workflow orchestration tools like Airflow (a minimal DAG sketch follows this list).
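
Again for illustration only, orchestrating a Databricks job from Airflow might look like the sketch below. The DAG ID, schedule, connection ID, and job ID are hypothetical placeholders, and the snippet assumes Airflow 2.4+ with the apache-airflow-providers-databricks package installed.

    # Minimal sketch of an Airflow DAG that triggers an existing Databricks job.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

    with DAG(
        dag_id="orders_daily_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="0 2 * * *",  # run daily at 02:00
        catchup=False,
    ) as dag:
        run_orders_load = DatabricksRunNowOperator(
            task_id="run_orders_load",
            databricks_conn_id="databricks_default",  # hypothetical connection ID
            job_id=12345,                              # hypothetical Databricks job ID
        )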