
Freelance Data Warehouse Engineer – AWS/K8s/ClickHouse/Airflow (100% Remote, from January)

freelance.ca

Remote

EUR 40,000 - 60,000

Full-time


Summary

A freelance platform is seeking a Freelance Data Warehouse Engineer for a fully remote Data Warehouse engagement starting in January. The ideal candidate will build ETL pipelines on AWS, Kubernetes, and ClickHouse. Responsibilities include developing Airflow DAGs, optimizing ClickHouse, and implementing CI/CD processes. Candidates should have solid Python and Terraform skills. The role requires a minimum workload of 3 days per week over an initial 6-month term.


Skills

AWS
Kubernetes
ClickHouse
Airflow
Python
Terraform
GitHub Actions
Helm
TypeScript

Job Description

Dear Freelancers,

For one of our key clients, we are looking for a Freelance Data Warehouse Engineer to support a Data Warehouse environment.

Start: from January

Duration: 6 months+

Workload: minimum 3 days per week

Remote / On-site: 100% remote

Location: remote

Responsibilities
  • Build and operate production-grade ETL pipelines (ingest → transform → ClickHouse)
  • Develop and maintain Airflow DAGs in Python (idempotent, backfills, retries, tests); a DAG sketch follows this list
  • Deploy and operate workloads on Kubernetes (EKS or self-managed) incl. Helm/manifests; automate infrastructure via Terraform
  • Optimize ClickHouse (schema design, performance, query tuning)
  • Implement CI/CD with GitHub Actions and monitoring/alerting (Prometheus/Grafana), including defining SLIs/SLOs; a metrics sketch follows this list
  • Create runbooks (deploy, rollback, incident) and ensure structured knowledge transfer to the DWH team
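
The "idempotent, backfills, retries" requirement usually comes down to one pattern: each DAG run replaces exactly one partition, so any rerun produces the same result. Below is a minimal sketch of that pattern, assuming Airflow 2.x (TaskFlow API) and the clickhouse-connect client; the DAG id, table names, and host are hypothetical, not taken from this posting.

    # Minimal sketch only; all names (tables, host, DAG id) are hypothetical.
    # Assumed target schema, partitioned by day so one partition can be
    # replaced atomically per run:
    #   CREATE TABLE events (
    #       event_date Date,
    #       user_id    UInt64,
    #       payload    String
    #   ) ENGINE = MergeTree
    #   PARTITION BY event_date
    #   ORDER BY (event_date, user_id);
    from datetime import datetime, timedelta

    import clickhouse_connect
    from airflow.decorators import dag, task

    @dag(
        schedule="@daily",
        start_date=datetime(2025, 1, 1),
        catchup=True,                          # enables historical backfills
        default_args={
            "retries": 3,                      # safe: each run is idempotent
            "retry_delay": timedelta(minutes=5),
        },
    )
    def events_to_clickhouse():
        @task
        def reload_partition(ds=None):
            """Replace exactly one daily partition; dropping it before the
            insert means retries and backfills never duplicate rows."""
            client = clickhouse_connect.get_client(host="clickhouse.internal")  # hypothetical host
            client.command(f"ALTER TABLE events DROP PARTITION '{ds}'")
            client.command(
                "INSERT INTO events SELECT * FROM staging_events "
                "WHERE event_date = %(d)s",
                parameters={"d": ds},
            )

        reload_partition()

    events_to_clickhouse()

Partition-replace (or a ReplacingMergeTree keyed for deduplication) is a common way to make ClickHouse loads rerun-safe; which variant fits depends on the client's tables, which the posting does not specify.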
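
For the monitoring/alerting bullet, here is a minimal sketch of exposing pipeline metrics for Prometheus to scrape (and Grafana to chart), assuming the prometheus_client package; the metric and table names are hypothetical.

    # Hypothetical metrics sketch: serve counters/gauges over HTTP so
    # Prometheus can scrape them and Grafana dashboards/alert rules can
    # build on them.
    import time

    from prometheus_client import Counter, Gauge, start_http_server

    ROWS_LOADED = Counter(
        "dwh_rows_loaded_total", "Rows loaded into ClickHouse", ["table"]
    )
    LAST_SUCCESS = Gauge(
        "dwh_last_success_timestamp_seconds",
        "Unix time of the last successful load",
        ["table"],
    )

    def record_load(table, row_count):
        """Record one successful load; a freshness SLO alert could fire
        when LAST_SUCCESS lags too far behind the current time."""
        ROWS_LOADED.labels(table=table).inc(row_count)
        LAST_SUCCESS.labels(table=table).set_to_current_time()

    if __name__ == "__main__":
        start_http_server(8000)           # metrics at http://localhost:8000/metrics
        while True:
            record_load("events", 1_000)  # simulated load for demonstration
            time.sleep(60)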
Requirements & Skills
  • Strong hands-on experience with AWS, Kubernetes (EKS or self-managed), ClickHouse, and Airflow (Python)
  • Confident development skills in Python; TypeScript used for adjacent services/tools
  • Solid Infrastructure-as-Code experience with Terraform
  • Strong deployment practice using Helm and/or Kubernetes manifests
  • Proven CI/CD experience with GitHub Actions
  • Nice to have: AWS Lambda, MWAA, and tools such as Moose, Appsmith, or Metabase

If you are interested in this project, please get back to me with your current CV, hourly rate, and availability, and I will respond promptly.
