Big Data Solutions Architect (Professional Services)

Databricks Inc.

Greater London

On-site

GBP 80,000 - 110,000

Full time

Today

Job summary

A leading data and AI company seeks a Big Data Solutions Architect to work with clients on their big data challenges using the company's platform. The role involves guiding customers on impactful projects and requires extensive experience in data engineering and cloud ecosystems. Candidates should have proficiency in Python or Scala and Apache Spark™, along with strong client management skills. The position requires travel to customers about 30% of the time and comes with a comprehensive employee benefits package.

Benefits

Comprehensive benefits package
Professional development opportunities

Qualifications

  • 6+ years of experience in data engineering, data platforms, and analytics.
  • Comfortable writing code in either Python or Scala.
  • Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals.
  • Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP).

Responsibilities

  • Guide customers as they implement transformational big data projects.
  • Design and build reference architectures for clients.
  • Provide an escalated level of support for customer operational issues.
  • Facilitate technical workshops and customer requirement gathering.

Skills

Data engineering
Cloud ecosystems (AWS, Azure, GCP)
Apache Spark™
Python
Scala
MLOps
Technical project delivery
Documentation skills
Client management

Education

Databricks Certification

Job description

As a Big Data Solutions Architect in our Professional Services team, you will work with clients on short- to medium-term engagements, helping them solve their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data. Resident Solutions Architects (RSAs) are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.

The impact you will have:
  • Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionalizing customer use cases
  • Work with engagement managers to scope a variety of professional services work with input from the customer
  • Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications
  • Consult on architecture and design; bootstrap or implement customer projects, leading to the customer's successful understanding, evaluation, and adoption of Databricks
  • Provide an escalated level of support for customer operational issues
  • Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer's needs
  • Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues
What we look for:
  • 6+ years of experience in data engineering, data platforms, and analytics
  • Comfortable writing code in either Python or Scala
  • Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one
  • Deep experience with distributed computing with Apache Spark™ and knowledge of Spark runtime internals (see the illustrative sketch after this list)
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps
  • Design and deployment of performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines
  • Documentation and whiteboarding skills
  • Experience working with clients and managing conflicts
  • Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects
  • Travel to customers about 30% of the time
  • Databricks Certification
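
As a rough illustration of the kind of Spark and Python work described above, a minimal PySpark batch job might look like the sketch below. The application name, storage paths, and column names are hypothetical placeholders, not details from this posting.

  # Minimal PySpark sketch: read raw events, aggregate per day and user,
  # and write the result back out. Paths and column names are hypothetical.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

  # Hypothetical raw zone of event data
  events = spark.read.parquet("s3://example-bucket/raw/events/")

  daily_counts = (
      events
      .withColumn("event_date", F.to_date("event_ts"))
      .groupBy("event_date", "user_id")
      .count()
  )

  # Hypothetical curated zone, partitioned by date
  (daily_counts.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/daily_event_counts/"))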

As a Senior Big Data Solutions Architect (Sr. Resident Solutions Architect) in our Professional Services team, you will work with clients on short- to medium-term engagements, helping them solve their big data challenges using the Databricks platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data. RSAs are billable and know how to complete projects according to specification with excellent customer service. You will report to the regional Manager/Lead.

The impact you will have:
  • Work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionalizing customer use cases
  • Work with engagement managers to scope a variety of professional services work with input from the customer
  • Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications
  • Consult on architecture and design; bootstrap hands-on projects, leading to the customer's successful understanding, evaluation, and adoption of Databricks
  • Provide an escalated level of support for customer operational issues
  • Work with the Databricks technical team, Project Manager, Architect, and Customer team to ensure the technical components of the engagement are delivered to meet the customer's needs
  • Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues
What we look for:
  • 9+ years of experience in data engineering, data platforms, and analytics
  • Comfortable writing code in either Python or Scala
  • Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one
  • Deep experience with distributed computing with Apache Spark™ and knowledge of Apache Spark™ runtime internals
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps (a minimal experiment-tracking sketch follows this list)
  • Capable of designing and deploying highly performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines
  • Documentation and whiteboarding skills
  • Experience working with clients and managing conflicts
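
To illustrate the MLOps experience mentioned above, a minimal experiment-tracking run with MLflow might look like the following sketch. The model, dataset, and hyperparameters are hypothetical and shown only to outline the workflow.

  # Minimal MLflow sketch: log parameters, a metric, and a model for a
  # simple scikit-learn run. All values here are hypothetical placeholders.
  import mlflow
  import mlflow.sklearn
  from sklearn.datasets import make_classification
  from sklearn.ensemble import RandomForestClassifier
  from sklearn.metrics import accuracy_score
  from sklearn.model_selection import train_test_split

  X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

  with mlflow.start_run(run_name="rf-baseline"):
      params = {"n_estimators": 100, "max_depth": 5}
      model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

      mlflow.log_params(params)
      mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
      mlflow.sklearn.log_model(model, "model")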

You will be part of the Professional Services team and work with clients on short- to medium-term engagements, solving their big data problems using Apache Spark™ and the Databricks platform. This is a customer-facing role that requires deep hands-on expertise in Apache Spark™ and data engineering, along with broad knowledge of the big data ecosystem. You will guide our largest customers, implementing pipelines from data engineering through model building and deployment, plus other technical tasks, to help customers get value out of their data with Databricks. You will report to the regional Manager/Lead.

The impact you will have:
  • Guide customers as they implement transformational big data projects, including end-to-end development and deployment of industry-leading big data and AI applications
  • Ensure that Databricks best practices are used within all projects and that our quality of service and implementation standards are strictly followed
  • Facilitate technical workshops, discovery and design sessions, customer requirements gathering, and scoping for new and existing strategic customers
  • Assist the Professional Services leader and project managers with level-of-effort estimation and mitigation of risk within customer proposals and statements of work
  • Architect, design, develop, deploy, operationalize, and document complex customer engagements, individually or as part of an extended team, as the technical lead and overall authority
  • Transfer knowledge to, enable, and mentor other team members, customers, and partners, including developing reusable project artifacts
  • Bring experience to the consulting team and share client engagement best practices with other teams
What we look for:
  • 4+ years of experience in data engineering, data platforms, and analytics
  • Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP)
  • Comfort with object-oriented and functional programming in Scala and Python
  • Experience building scalable streaming and batch solutions using cloud-native components (see the streaming sketch after this list)
  • Strong knowledge of distributed computing with Apache Spark™
  • Travel to customers about 30% of the time
  • Nice to have: Databricks Certification
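
As a sketch of the kind of streaming solution referenced above, a minimal Spark Structured Streaming job might look like the following. The schema, storage paths, and checkpoint location are hypothetical placeholders.

  # Minimal Structured Streaming sketch: read JSON events from cloud storage,
  # count events per one-minute window, and write the results out.
  from pyspark.sql import SparkSession, functions as F
  from pyspark.sql.types import StructType, StructField, StringType, TimestampType

  spark = SparkSession.builder.appName("streaming-event-counts").getOrCreate()

  schema = StructType([
      StructField("user_id", StringType()),
      StructField("event_ts", TimestampType()),
  ])

  # Hypothetical landing zone of incoming JSON events
  stream = spark.readStream.schema(schema).json("s3://example-bucket/landing/events/")

  counts = (
      stream
      .withWatermark("event_ts", "10 minutes")
      .groupBy(F.window("event_ts", "1 minute"))
      .count()
  )

  # Hypothetical output and checkpoint locations
  query = (
      counts.writeStream
      .outputMode("append")
      .format("parquet")
      .option("path", "s3://example-bucket/curated/event_counts/")
      .option("checkpointLocation", "s3://example-bucket/checkpoints/event_counts/")
      .start()
  )
  query.awaitTermination()
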
About Databricks

Databricks is the data and AI company. More than 10,000 organizations worldwide—including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500—rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits

At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.
