Big Data Architect

Databricks Inc.

Berlin

On-site

EUR 70,000 - 100,000

Full-time

Posted 28 days ago

Summary

A leading data platform provider in Berlin is seeking a Big Data Solutions Architect to work on impactful customer projects using their innovative platform. The ideal candidate will have expertise in data engineering and analytics, proficiency in Python or Scala, and a strong understanding of cloud ecosystems. Responsibilities include guiding clients through transformational projects and providing elevated operational support.

Qualifications

  • Proven track record of successful projects and industry best practices.
  • Experience managing scope and timelines on technical projects.
  • Strong documentation and white-boarding skills.

Responsibilities

  • Design and build reference architectures for customers.
  • Guide customers in big data project implementations.
  • Provide support for customer operational issues.

Skills

Data engineering
Analytics
Python
Scala
Enterprise Data Warehousing
Cloud ecosystems (AWS, Azure, GCP)
Apache Spark
CI/CD
MLOps

Education

Databricks Certification

Job Description

We have 5 open positions based in our Germany offices.

Overview

As a Big Data Solutions Architect (Resident Solutions Architect) on our Professional Services team, you will work with clients on short- to medium-term engagements addressing their big data challenges using the Databricks Data Intelligence Platform. You will deliver data engineering, data science, and cloud technology projects that require integrating with client systems, training, and other technical tasks to help customers get the most value out of their data. RSAs are billable and complete projects to specification with excellent customer service. You will report to the regional Manager/Lead.

The impact you will have:
  • You will work on a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases
  • Work with engagement managers to scope a variety of professional services work with input from the customer
  • Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications
  • Consult on architecture and design; bootstrap or implement customer projects, leading to the customer's successful understanding, evaluation, and adoption of Databricks.
  • Provide an escalated level of support for customer operational issues.
  • You will work with the Databricks technical team, Project Manager, Architect, and customer team to ensure the technical components of the engagement are delivered to meet the customer's needs.
  • Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution for engagement specific product and support issues.
What we look for:
  • Proficient in data engineering, data platforms, and analytics with a strong track record of successful projects and in-depth knowledge of industry best practices
  • Comfortable writing code in either Python or Scala
  • Enterprise Data Warehousing experience (Teradata, Synapse, Snowflake, or SAP)
  • Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
  • Deep experience with distributed computing with Apache Spark and knowledge of Spark runtime internals
  • Familiarity with CI/CD for production deployments
  • Working knowledge of MLOps
  • Design and deployment of performant end-to-end data architectures
  • Experience with technical project delivery, including managing scope and timelines.
  • Documentation and white-boarding skills.
  • Experience working with clients and managing conflicts.
  • Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects.
  • Travel is required up to 10%, more at peak times.
  • Databricks Certification
About Databricks

If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
