We’re looking for someone to join our team of data engineers to develop new data pipelines and scale existing ones for a growing number of customers and use cases.
Responsibilities:
Develop and maintain data pipelines using technologies like Python, Terraform, Kubernetes, and Pub/Sub on Google Cloud Platform.
Implement new features through automated Infrastructure as Code and CI/CD pipelines, including code reviews and automated testing.
Manage the full DevOps cycle, focusing on feature development and architecture optimization.
Benefits:
Standby flying for work/life balance
Employee benefits including old-age provision, profit sharing, flight privileges, discounts, employee events, hybrid and flexible working hours, training opportunities, and paid parental leave.
Qualifications:
Degree in computer science or related field.
At least 4 years of experience in data engineering.
Minimum 2 years of experience with cloud providers, preferably GCP.
Proficiency in Python and familiarity with tools such as Jupyter, VS Code, PyCharm, Git, virtual environments, and the command line.
Experience with automated testing, CI/CD, build pipelines, and monitoring.
Strong understanding of DevOps/SRE principles.
Experience with Kubernetes/OpenShift, Docker, SQL and NoSQL databases, messaging systems, Airflow, and Apache Beam.
Knowledge of event-driven architectures is a plus.
Legal work permit in Spain; no relocation assistance provided.