
Senior Data Engineer

g2 Recruitment

Barcelona

Remote

EUR 30.000 - 50.000

Full-time

Posted 2 days ago

Vacancy overview

A leading recruitment company is seeking a Senior Data Engineer to work remotely in Catalonia. The position involves designing scalable data pipelines and cloud solutions using AWS and Azure, along with mentoring junior engineers. Candidates should possess strong Python and SQL skills. This is a freelance role with a 6-month contract and the possibility of extension.

Requirements

  • 5 years of experience in data engineering or software development.
  • Experience working in agile teams.
  • Technical English (reading / writing).

Responsibilities

  • Design and build scalable and reliable ETL / ELT pipelines.
  • Develop applications and APIs for data access and manipulation.
  • Implement solutions on AWS and Azure.
  • Automate data ingestion, transformation, and loading processes.
  • Collaborate with Data Scientists and product teams.
  • Mentor junior developers.

Skills

Python
SQL
Git
Apache Spark
Apache Airflow
Docker

Education

Bachelor's degree in Engineering, Computer Science, or related field

Tools

PostgreSQL
MySQL
SQL Server
Snowflake
BigQuery
Redshift
AWS
Azure

Job description

Senior Data Engineer | Python | SQL | Git | Freelance | Remote | Catalonia

I am currently working with a respected client who is looking to bring in a freelance Data Engineer on a 6-month contract with the possibility of extension. The role entails designing and maintaining scalable data pipelines and cloud solutions (AWS / Azure), developing APIs, automating workflows, collaborating across teams, and mentoring junior engineers.

Role Details:

Job title: Senior Data Engineer
Working Model: Remote
Day Rate: D.O.E. (depending on experience)
Contract Length: 6 months initially, with possible extension
Start Date: ASAP

Key Responsibilities:

  • Design and build scalable and reliable ETL / ELT pipelines.
  • Develop applications and APIs for data access and manipulation.
  • Implement solutions on AWS and Azure (data warehousing, data lakes).
  • Automate data ingestion, transformation, and loading processes.
  • Collaborate with Data Scientists, Analysts, and product teams.
  • Mentor junior developers and participate in code reviews.
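As a purely illustrative sketch of the pipeline work described above (toy data, hypothetical table and field names, not from the posting), an extract / transform / load flow of this kind might look like:

```python
import sqlite3

# Extract: toy source records (in practice these would come from an API, file, or queue).
def extract():
    return [
        {"user_id": 1, "amount": "19.99", "currency": "EUR"},
        {"user_id": 2, "amount": "5.00", "currency": "EUR"},
    ]

# Transform: normalize types and derive an integer-cents column.
def transform(rows):
    return [
        (row["user_id"], int(round(float(row["amount"]) * 100)), row["currency"])
        for row in rows
    ]

# Load: write the transformed rows into a relational store (SQLite standing in
# for PostgreSQL / a warehouse).
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(user_id INTEGER, amount_cents INTEGER, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount_cents) FROM payments").fetchone()[0]
print(total)  # 2499
```

In a production pipeline each stage would be parameterized, tested, and orchestrated rather than run inline like this.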

REQUIRED Technical Skills:

Programming:

  • Python: 3+ years of experience with Pandas, NumPy, SQLAlchemy.
  • SQL: Complex queries, optimization, stored procedures.
  • Git: Version control and collaborative workflows.

Big Data & Streaming:

  • Apache Spark: Distributed processing and optimization.
  • Apache Airflow: Workflow orchestration and DAGs.

Databases:

  • SQL: PostgreSQL, MySQL, or SQL Server.
  • Data Warehouses: Snowflake, BigQuery, or Redshift.

Cloud (AWS or Azure):

  • AWS: S3, EMR, Glue, RDS, Lambda.
  • Azure: Data Factory, Synapse, Storage, Functions.

DevOps:

  • Docker: Application containerization.
  • CI/CD: Jenkins, GitLab CI, or GitHub Actions.

REQUIRED Experience and Education:

  • 5 years of experience in data engineering or software development.
  • Bachelor's degree in Engineering, Computer Science, or related field.
  • Experience working in agile teams.
  • Technical English (reading / writing).

Essential Skills:

  • Language Skills: Full proficiency in English and strong communication skills for client-facing engagements (Spanish is a bonus).

If this sounds interesting to you, please apply or send your CV with a contact number to
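The Airflow requirement above is about expressing pipelines as DAGs of dependent tasks. As a library-free sketch of that idea only (plain Python standing in for Airflow's scheduler; task names are hypothetical):

```python
from graphlib import TopologicalSorter

results = {}

# Task callables: stand-ins for real extract/transform/publish operators.
def ingest():
    results["ingest"] = [3, 1, 2]

def transform():
    results["transform"] = sorted(results["ingest"])

def publish():
    results["publish"] = results["transform"][-1]

tasks = {"ingest": ingest, "transform": transform, "publish": publish}

# Each task maps to the set of tasks it depends on, mirroring how an
# Airflow DAG declares upstream relationships.
dag = {"ingest": set(), "transform": {"ingest"}, "publish": {"transform"}}

# Run tasks in a valid dependency order; Airflow's scheduler does this
# plus retries, backfills, and scheduling, which this sketch omits.
order = list(TopologicalSorter(dag).static_order())
for name in order:
    tasks[name]()

print(order)               # ['ingest', 'transform', 'publish']
print(results["publish"])  # 3
```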
