We are seeking an experienced Project Manager with strong Databricks exposure to lead end-to-end data engineering, analytics, and cloud transformation initiatives. The ideal candidate has demonstrated expertise in managing cross-functional teams, delivering data platform projects, and working closely with technical teams on Databricks, Azure/AWS, and modern data architectures.
Key Responsibilities
- Lead and manage full project lifecycle for data platform, analytics, and ML pipeline initiatives using Databricks.
- Collaborate with data engineers, architects, BI teams, and business stakeholders to gather requirements, define scope, and translate business needs into detailed project plans.
- Drive Databricks-based implementation including workspace setup, lakehouse integration, ETL/ELT pipelines, and data governance workflows.
- Oversee Agile ceremonies (sprint planning, backlog refinement, stand-ups, retrospectives) and maintain delivery cadence using Jira or Azure DevOps.
- Ensure strong project tracking, risk management, and stakeholder communication through structured reporting, dashboards, and documentation.
- Manage vendor coordination, resource allocation, and budgeting for data platform and cloud projects.
- Ensure all deliverables meet security, compliance, and data quality standards, collaborating with cloud, InfoSec, and architecture teams.
- Support UAT, deployment, and post-production stabilization activities.
- Promote best practices for Databricks governance, cluster optimization, cost management, and workspace operations.
Skills Required
- 7+ years of total experience, including 2+ years in data or cloud platform project management.
- Hands-on exposure to Databricks (lakehouse architecture, notebooks, clusters, jobs, workflows).
- Strong understanding of data engineering concepts, ETL/ELT pipelines, Delta Lake, and data governance.
- Experience delivering projects on Azure or AWS cloud environments.
- Expertise in project planning, Agile/Scrum, and standard PM frameworks (PMP or PRINCE2 certification preferred).
- Strong communication, stakeholder management, and cross-functional leadership skills.
- Proficiency with Jira, Azure DevOps, Confluence, MS Project, or similar tools.
Nice-to-Have Skills
- Knowledge of Spark, PySpark, SQL, or data modelling concepts.
- Exposure to MLflow, Unity Catalog, or Databricks cost optimization.
- Certification in Databricks or cloud platforms (Azure/AWS/GCP).
- Experience working in large enterprise or regulated environments (BFSI, Healthcare, Retail).