

Data Engineer / Developer

Toogeza

Remote

EUR 60,000 - 80,000

Full-Time

Posted 16 days ago


Summary

A Ukrainian recruiting company is seeking a Data Engineer/Developer to build and optimize ETL processes for a leading slot gaming company. This remote position focuses on data infrastructure, analytics collaboration, and cost-efficient operations, requiring strong Python and SQL experience, along with a solid understanding of GCP and tooling like Pulumi and Terraform. The role provides meaningful work, a flexible culture, and benefits including health insurance and a substantial vacation policy.

Job Description

We are toogeza, a Ukrainian recruiting company focused on hiring talent and building teams for tech startups worldwide. People make the difference in the big game, and we can help you find the right ones.

Job Title: Data Engineer / Developer for SpinLab.

Location

Remote

Job Type

Full-Time

About our client

We help slot gaming leaders unlock the potential of their data, enhancing business outcomes and strengthening their competitive edge in the market. We collect and process data using advanced methods and technologies to provide our clients with clear, actionable recommendations based on real metrics. SpinLab’s goal is not just to collect data but to help businesses use it for maximum efficiency and amplify their performance.

About the Role

We’re looking for a hands‑on Data Engineer with a strong focus on data infrastructure, analytics collaboration, and cost‑efficient operations. In this role, you will develop and optimize ETL processes, ensure data reliability and scalability, and work closely with analytics and product teams to support data‑driven decision‑making.

You will also contribute to the effective management of cloud resources, with a focus on automation, cost optimization, and continuous improvement.

Responsibilities
  • Understand, format and prepare data for analytics and data‑science processes.
  • Design, build, and optimize scalable ETL/ELT pipelines for batch and streaming data.
  • Collaborate with analysts to understand data needs and ensure accessible, well‑modeled data sets.
  • Dive deep into system metrics and usage patterns to identify opportunities for FinOps‑driven cost savings.
  • Manage data infrastructure on GCP (BigQuery, Cloud Composer, Vertex AI, Kubernetes, etc.).
  • Automate infrastructure provisioning using Pulumi or Terraform.
  • Set up data quality monitoring, alerting, and logging systems.
  • Collaborate with data scientists and ML engineers to productionize models and build supporting pipelines.
  • Continuously improve performance, scalability, and cost‑efficiency of data workflows.
Requirements
  • Strong experience with Python and SQL for data engineering.
  • Solid understanding of cloud platforms (ideally GCP) and data services (BigQuery, Cloud Storage, etc.).
  • Hands‑on experience with Infrastructure‑as‑Code tools like Pulumi or Terraform.
  • Experience with Airflow, dbt, or similar orchestration/transform tools.
  • Proficiency in Docker and Kubernetes for data workflows.
  • Understanding of Linux systems, cloud networking, and security best practices.
  • Experience with CI/CD pipelines and version control (GitLab or similar).
  • A mindset for continuous improvement, optimization, and working cross‑functionally.
Nice to have
  • Previous exposure to FinOps practices or cost‑optimization work in cloud environments.
  • Experience with ClickHouse.
  • Experience with AWS.
  • Familiarity with iGaming, B2B SaaS, or Fintech domains.
  • Experience supporting data science/ML workflows in production.
  • Cloud/data‑related certifications.
Benefits
  • Work on meaningful data products and shape them with your vision.
  • 25 vacation days + 15 sick days + 1 birthday leave.
  • Flexible, remote‑friendly culture with a small, dedicated team.
  • English classes with native speakers.
  • Health insurance.
  • Annual education & development budget.
What’s next

If this role sounds like a fit, we’d love to hear from you! Just send over your CV and anything else you’d like us to consider.

We’ll review everything within five working days, and if your background matches what we’re looking for, we’ll get in touch to set up a call and get to know each other better.
