
Data Engineer

HBX Group

Almería

Hybrid

EUR 40,000 - 60,000

Full-time

Posted 12 days ago

Vacancy overview

A dynamic, data-driven company is seeking a Data Engineer to manage and optimize data architectures and flows. The position involves working on projects such as data migration to Snowflake and developing real-time data pipelines. Candidates should possess strong SQL skills and experience with big data tools. The role plays a critical part in enhancing data-driven decisions and improving operational efficiency, and follows a hybrid work model.

Qualifications

  • Experience with big data tools and SQL.
  • Strong analytics skills with unstructured datasets.
  • Self-directed, supporting multiple teams.

Responsibilities

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex datasets meeting business requirements.
  • Identify and implement process improvements.
  • Build infrastructure for extraction, transformation, and loading of data.
  • Develop analytics tools for insights and metrics.

Skills

  • Apache Airflow
  • Advanced SQL
  • Big data pipeline architecture
  • Root cause analysis
  • Analytics with unstructured datasets
  • Handling large datasets
  • Message queuing
  • Project management
  • Collaboration with cross-functional teams
  • Cloud services

Tools

  • AWS
  • SQL
  • NoSQL

Job description

Join our expanding Data Delivery team, taking on new challenges to support strategic projects. We aim to put data insights at the heart of every commercial action in our e-commerce business. To do this, we need people experienced in data modelling, data pipelines, testing, and documentation. We want our team to drive our commercial success through the solutions we build.

The company offers a great working environment with plenty of challenges. Our HQ is in Palma de Mallorca, but the team is spread across multiple locations under a hybrid work model. Your internal collaborators will be global, with users around the world.

We want you to be productive, so we have a modern analytics stack that includes Snowflake for data warehousing, DBT and Airflow for data pipelines, Python for ML modelling, and Tableau for data visualizations.
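To illustrate the kind of pipeline work described above, here is a minimal, library-free sketch of an extract-transform-load (ETL) step, the sort of task an orchestrator like Airflow would schedule. All names and record shapes are illustrative, not taken from the posting:

```python
# Illustrative ETL sketch: normalise raw source records into the shape
# a warehouse BI table might expect. Record fields are hypothetical.

def extract(rows):
    """Pretend source system: yield raw records as ingested."""
    yield from rows

def transform(record):
    """Cast and clean one raw record into typed warehouse columns."""
    return {
        "booking_id": int(record["id"]),
        "amount_eur": round(float(record["amount"]), 2),
    }

def load(records):
    """Stand-in for a warehouse write; here we simply collect rows."""
    return list(records)

def run_pipeline(raw):
    """Chain the three stages lazily, materialising only at load time."""
    return load(transform(r) for r in extract(raw))

if __name__ == "__main__":
    raw = [{"id": "1", "amount": "19.991"}, {"id": "2", "amount": "5"}]
    print(run_pipeline(raw))
```

In a real deployment each stage would typically be its own scheduled task, with the load step writing to the warehouse rather than returning a list.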

This role focuses on projects impacting data engineering. The main project in the short term is to finalize the Atlas report migration, mapping new data sources from Mitra to Snowflake BI tables. Other projects include Salesforce data ingestion, updating Airflow, developing real-time data pipelines, and enhancing our architecture for better data security and permissions.

The impact of this role is to improve business efficiency. Building and maintaining stable, timely data flows enables everyone in the company to access data for insights, alerts, and reporting.

Job Responsibilities

As a Data Engineer, you will contribute to all parts of our data architecture, from optimizing data flows and designing new pipeline architectures to landing data from external sources.

You don't need to be an expert initially but should be enthusiastic and willing to learn.

You should also be comfortable working with stakeholders across the business to understand their problems and translate them into insightful, intuitive solutions. We utilize agile methods, and you can expect responsibility from day one.

See the main responsibilities below:

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex datasets meeting business requirements
  • Identify and implement process improvements: automating manual tasks, optimizing data delivery, redesigning infrastructure for scalability
  • Build infrastructure for extraction, transformation, and loading of data from various sources using SQL and AWS 'big data' technologies
  • Develop analytics tools utilizing data pipelines to provide insights into customer acquisition, operational efficiency, and key metrics
  • Collaborate with stakeholders including Executive, Product, Data, and Design teams to support data infrastructure needs
  • Ensure data is secure across borders through multiple data centers and AWS regions
  • Create data tools for analytics and data science teams to enhance product innovation
  • Work with data and analytics experts to improve system functionality

Required Skills

  • Experience with Apache Airflow or similar orchestration tools
  • Advanced SQL skills and experience with relational databases
  • Experience building and optimizing big data pipelines and architectures
  • Ability to perform root cause analysis on data and processes
  • Strong analytics skills with unstructured datasets
  • Proven experience manipulating large, disconnected datasets
  • Knowledge of message queuing, stream processing, and scalable big data stores
  • Strong project management and organizational skills
  • Experience working with cross-functional teams in dynamic environments
  • Technical skills in big data tools, SQL, NoSQL, cloud services, data pipelines, and scripting languages
  • Self-directed, supporting multiple teams, systems, and products