
Data Engineer

Phiture

Germany

Hybrid

EUR 40,000 - 60,000

Full-time

Posted 6 days ago

Summary

A consulting company in Germany is seeking a data engineer to design and implement data pipelines using tools like PySpark and Databricks. This role involves analyzing user problems, maintaining open communication with teams, and ensuring best practices in data management. The ideal candidate is proficient in cloud storage, ETL processes, and has experience with Terraform. Benefits include a salary based on experience, remote work opportunities, and health insurance from the start of employment.

Benefits

Wage according to candidate's professional experience
Remote Work whenever possible
Health insurance
Delivery of work equipment
Others

Qualifications

  • Proficiency with PySpark and Spark SQL for data processing.
  • Experience with Databricks using Unity Catalog.
  • Knowledge of Delta Live Tables for automated ETL and workflow orchestration.

Responsibilities

  • Analyze user problems and maintain communication with the Data Architect.
  • Design and implement data pipelines and infrastructure.
  • Define, execute, and document functional and technical tests.

Tools

Terraform
Apache Airflow
Databricks
Kubernetes
Apache Kafka

Job Description

Syffer is an all-inclusive consulting company focused on talent, tech, and innovation. We exist to elevate companies and people all around the world, making change from the inside out.

We believe that technology + human kindness positively impacts every community around the world. Our approach is simple: we see a world without borders and believe in equal opportunities. We are guided by our core principles of spreading positivity and good energy, promoting equality, and caring for others.

Our hiring process is unique! People are selected for their values, education, talent and personality. We don't consider ethnicity, religion, national origin, age, gender, sexual orientation or identity.

It's time to burst the bubble, and we will do it together!

What You'll Do
  • Analyze user problems, ensure a clear understanding of the architecture, and maintain open communication with the Data Architect, peers, and Project Manager;
  • Design and implement data pipelines and infrastructure (e.g., with Terraform), follow data best practices, and manage interface contracts with version control and code reviews (a minimal orchestration sketch follows this list);
  • Apply strong knowledge of data warehousing, ETL/ELT processes, data lakes, and modeling throughout development;
  • Define, execute, and document functional and technical tests in collaboration with the Project Manager, sharing regular updates on results;
  • Participate in Deployment Reviews, monitor post-deployment behavior, log errors, and ensure proper use of deployment and monitoring strategies.
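
As a rough illustration only: a minimal, hypothetical Apache Airflow DAG that triggers a Databricks job once a day. The DAG name, job ID, connection name, and schedule below are placeholder assumptions, not details from this posting.

    # Minimal, hypothetical Airflow DAG sketch (assumes Airflow 2.4+ with the
    # Databricks provider package installed). All IDs and names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.databricks.operators.databricks import (
        DatabricksRunNowOperator,
    )

    with DAG(
        dag_id="daily_events_pipeline",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # run once per day
        catchup=False,                   # do not backfill past runs
    ) as dag:
        # Trigger a pre-existing Databricks job; 12345 is a placeholder job ID
        # and "databricks_default" a placeholder Airflow connection.
        run_pipeline = DatabricksRunNowOperator(
            task_id="run_databricks_job",
            databricks_conn_id="databricks_default",
            job_id=12345,
        )
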
What You Are
  • Proficiency with PySpark and Spark SQL for data processing.
  • Experience with Databricks using Unity Catalog.
  • Knowledge of Delta Live Tables (DLT) for automated ETL and workflow orchestration in Databricks (see the sketch after this list).
  • Familiarity with Azure Data Lake Storage.
  • Experience with orchestration tools (e.g., Apache Airflow or similar) for building and scheduling ETL/ELT pipelines.
  • Knowledge of data partitioning and data lifecycle management on cloud-based storage.
  • Familiarity with implementing data security and data privacy practices in a cloud environment.
  • Terraform: at least one year of experience with Terraform and good working knowledge of GitOps practices.
  • Additional knowledge and experience that are a plus: Databricks Asset Bundles, Kubernetes, Apache Kafka, Vault.
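
For context, here is a minimal sketch of what the PySpark and Delta Live Tables items above can look like in practice. It assumes the code runs inside a Databricks Delta Live Tables pipeline (which supplies the dlt module and the spark session); the table names, columns, and storage path are hypothetical.

    # Minimal, hypothetical Delta Live Tables sketch. Runs only inside a
    # Databricks DLT pipeline, where `dlt` and `spark` are provided.
    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Raw events ingested from Azure Data Lake Storage.")
    def raw_events():
        # abfss:// is the ADLS Gen2 URI scheme; account and container
        # names here are placeholders.
        return spark.read.format("json").load(
            "abfss://landing@examplestorage.dfs.core.windows.net/events/"
        )

    @dlt.table(comment="Cleaned events with a derived date column.")
    @dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")  # data-quality rule
    def clean_events():
        return dlt.read("raw_events").withColumn(
            "event_date", F.to_date("event_ts")
        )
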
What You'll Get
  • Wage according to candidate's professional experience;
  • Remote Work whenever possible;
  • Allocation of health insurance from the beginning of the employment;
  • Delivery of work equipment suited to the duties of the role;
  • And others.

Work with expert teams on long-term projects of large magnitude and intensity, together with our clients, all leaders in their industries.

Are you ready to step into a diverse and inclusive world with us?

Together we will promote uniqueness!
