
Data Enablement Specialist

Next-Link

Sant Cugat del Vallès

On-site

EUR 40,000 - 60,000

Full-time

Posted 12 days ago


Vacancy description

A leading company in data solutions is seeking a Data Developer to implement and maintain transformation logic using dbt within a Snowflake environment. This role involves collaborating with global teams to turn raw data into actionable insights, ensuring data quality, and optimizing transformation processes for performance and cost-efficiency.

Qualifications

  • 3+ years of professional experience in data engineering.
  • Proficiency in dbt and SQL for data transformation.
  • Experience with Apache Airflow and CI/CD practices.

Responsibilities

  • Implement and maintain data transformation logic using dbt.
  • Write clean, modular SQL code for Snowflake.
  • Enforce data quality standards through testing and monitoring.
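The kind of checks dbt's built-in generic tests enforce (such as not_null and unique) can be sketched in plain Python; the table rows and column names below are hypothetical examples, not part of the role description:

```python
# Minimal sketch of dbt-style data quality checks (not_null, unique);
# the rows and column names are hypothetical.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

customers = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 2, "email": "c@example.com"},
]

null_failures = check_not_null(customers, "email")   # one failing row
duplicate_ids = check_unique(customers, "customer_id")  # [2]
```

In dbt itself these checks are declared in a model's YAML schema file rather than written by hand, and a failing check fails the build.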

Skills

Data transformation
SQL
Data quality assurance
Collaboration
Data observability

Tools

dbt
Apache Airflow
Bitbucket
AWS
Jira
Confluence

Job description

As a Data Developer, you’ll play a key role in implementing and maintaining transformation logic using dbt, working within a Snowflake environment to support downstream analytics, reporting, and other use cases. You will collaborate closely with data product teams, IT, and business stakeholders around the world, helping to turn raw data into actionable insights.

You will:

  • Implement and maintain data transformation logic using dbt, following already defined models and specifications.
  • Write clean, modular, and efficient SQL code tailored for Snowflake, focusing on data cleaning, normalization, and enrichment.
  • Enforce data quality standards through testing, monitoring, and integration with data observability practices.
  • Orchestrate and manage pipeline execution using Apache Airflow, ensuring reliability and reusability.
  • Participate in documentation efforts, including technical specs, transformation logic, and metadata definitions.
  • Contribute to CI/CD pipelines, version control workflows (Bitbucket), and best practices for data development.
  • Continuously optimize transformation processes for performance, cost-efficiency, and maintainability in Snowflake.
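The cleaning, normalization, and enrichment steps such a dbt SQL model applies in Snowflake (TRIM, LOWER, COALESCE-style defaults, code lookups) can be illustrated in plain Python; the field names and country mapping here are hypothetical:

```python
# Illustrative sketch of cleaning/normalization steps a dbt model might
# express in Snowflake SQL; field names and the mapping are hypothetical.

COUNTRY_CODES = {"spain": "ES", "germany": "DE", "france": "FR"}

def clean_record(raw):
    """Normalize one raw record: trim whitespace, lowercase the email,
    map free-text country names to ISO codes, default a missing segment."""
    country = raw.get("country", "").strip().lower()
    return {
        "customer_id": raw["customer_id"],
        "email": (raw.get("email") or "").strip().lower(),   # TRIM + LOWER
        "country_code": COUNTRY_CODES.get(country, "UNKNOWN"),
        "segment": raw.get("segment") or "unassigned",       # COALESCE-style
    }

cleaned = clean_record(
    {"customer_id": 7, "email": "  Ana@Example.COM ", "country": " Spain", "segment": None}
)
```

In the actual role this logic would live in versioned dbt models as SQL, where Snowflake can execute it set-wise over full tables.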

Requirements

  • 3+ years of professional experience in data engineering or a related field, with a strong focus on data transformation and quality assurance.
  • Proficiency in dbt, including hands-on experience writing and managing models, tests, and macros.
  • Demonstrated ability to write clean, efficient, and high-performance SQL in Snowflake, particularly for complex data transformation and cleaning workflows.
  • Experience with Apache Airflow or similar pipeline orchestration tools.
  • Familiarity with Bitbucket, Git workflows, and DevOps/CI/CD practices.
  • Solid understanding of data quality frameworks, testing methodologies, and data observability principles.
  • Excellent verbal and written communication skills, with a proven ability to collaborate effectively in a remote, global, and cross-functional environment.
  • Fluency in English (both spoken and written) is required.
Preferred Qualifications:

  • Experience working with pharmaceutical datasets and applications.
  • Familiarity with Jira and Confluence for task and knowledge management.
  • Knowledge of Data Vault modeling principles.
  • Experience with AWS cloud services.
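Orchestration tools like Airflow derive a pipeline's execution order from declared task dependencies; that core idea can be sketched with Python's standard-library topological sort, using hypothetical task names rather than a real Airflow DAG:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies, like those an Airflow DAG declares:
# raw extraction must finish before staging models, which feed marts,
# which are then validated by tests.
deps = {
    "stg_orders": {"extract_raw"},
    "dbt_run_marts": {"stg_orders"},
    "dbt_test": {"dbt_run_marts"},
}

# static_order() yields tasks in a valid execution order.
order = list(TopologicalSorter(deps).static_order())
```

In Airflow itself the dependencies would be declared between operator instances (e.g. with `>>`), and the scheduler handles ordering, retries, and parallelism.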
