
Senior Data Engineer (AWS)

buscojobs España

Jaén

On-site

EUR 35,000 - 55,000

Full-time

4 days ago

Vacancy description

A leading company in agriculture and animal health is seeking a skilled Data Engineer proficient in Python and SQL to design and develop cloud-based data solutions in AWS. In this role, you will optimize data pipelines, ensure data integrity, and collaborate with teams to drive insights, contributing to the company's ambitious growth in innovative technology and sustainability.

Qualifications

  • Expert-level proficiency in Python for data engineering tasks.
  • Advanced SQL skills, including performance tuning.
  • Experience with AWS services and Databricks for data pipeline development.

Responsibilities

  • Design and maintain robust ETL / ELT pipelines using Databricks and AWS services.
  • Utilize advanced SQL for complex queries and data transformations.
  • Monitor performance of data pipelines and troubleshoot issues.

Skills

Python
SQL
AWS
Databricks
Data Governance
Performance Optimization

Education

Bachelor’s or Master’s degree in Computer Science, Engineering, or related field

Tools

Databricks
Git
Terraform
Docker

Job description

Our Company

At Kynetec, we're proud to be at the forefront of the intersection between agriculture, sustainability, and animal health. We're redefining our industry with unparalleled insights and leading technology, while pursuing an ambitious growth plan to extend our influence from the food on our plates to the health of our livestock and the care of our beloved pets at home.

We owe our success to our industry experts. They are the driving force behind our reputation as a global leader in the industry; their innovative ideas and expertise have helped us achieve new heights. From seasoned insights specialists and client leaders to innovative tech geniuses, what connects us is a shared passion for Agriculture and Animal Health! We don't settle for "business as usual".

Each day, we are taking strides towards transforming our industry and improving the lives of people and animals around the world. If you're looking for a company that challenges the norm and fosters a culture of innovation, Kynetec is the place for you.

The Role

We are seeking a Data Engineer with expert-level Python and SQL skills. The ideal candidate will have experience designing and building cloud-based data solutions in AWS (preferably with Solution Architect expertise), Databricks, and RDS / Postgres. In this role, you will be responsible for creating and optimizing scalable data pipelines, ensuring the integrity and performance of our data infrastructure, and collaborating with cross-functional teams to drive business insights.

Responsibilities

Data Pipeline Development
  • Design, develop, and maintain robust ETL / ELT pipelines using Databricks and AWS services (Glue, Lambda, S3, EMR).
  • Write efficient, maintainable code in Python for data ingestion, transformation, and automation.
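For illustration only, a minimal sketch of the kind of PySpark ingestion job this responsibility might involve; the S3 paths, column names, and Delta output location are hypothetical and not part of the posting.

    # Hypothetical PySpark ingestion sketch (paths and columns are illustrative only)
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_ingest").getOrCreate()

    # Read raw JSON events from a hypothetical S3 landing zone
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Light cleanup: deduplicate and derive a partition column
    clean = (
        raw.dropDuplicates(["event_id"])
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Append to a partitioned Delta table in a hypothetical curated zone
    (
        clean.write.mode("append")
             .partitionBy("event_date")
             .format("delta")
             .save("s3://example-bucket/curated/events/")
    )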

Database & SQL Mastery
  • Use advanced SQL skills to perform complex queries, data transformations, and performance tuning on RDS / Postgres.
  • Create and optimize database structures (tables, indexes, partitions) to support analytics and reporting needs.
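As a hedged example of the index and partition DDL this can involve on Postgres, here is a short Python sketch using psycopg2; the connection string, table, and column names are hypothetical, and it assumes an orders table already declared with PARTITION BY RANGE (created_at).

    # Hypothetical sketch: index and range-partition DDL on Postgres via psycopg2
    import psycopg2

    # Connection details are placeholders, not real infrastructure
    conn = psycopg2.connect("dbname=analytics user=etl_user host=example-rds-host")
    with conn, conn.cursor() as cur:
        # Index to speed up frequent date-range filters on a hypothetical orders table
        cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_created_at ON orders (created_at);")
        # Monthly range partition; assumes orders was created with PARTITION BY RANGE (created_at)
        cur.execute("""
            CREATE TABLE IF NOT EXISTS orders_2024_01
            PARTITION OF orders
            FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
        """)
    conn.close()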

AWS Architecture & Infrastructure
  • Leverage AWS best practices to architect secure, scalable, and high-performing cloud data environments (EC2, S3, Lambda, IAM, VPC, etc.).
  • Provide Solution Architect-level guidance, ensuring alignment with best practices for cost optimization, security, and reliability.

Performance Optimization & Troubleshooting
  • Continuously monitor the health and performance of data pipelines and databases.
  • Diagnose and resolve bottlenecks in data ingestion, transformation, and querying processes.

Collaboration & Stakeholder Management
  • Work closely with data scientists, data analysts, and business stakeholders to gather requirements and deliver data-driven solutions.
  • Communicate progress, roadblocks, and technical details to both technical and non-technical team members.

Data Governance & Security
  • Implement and maintain data governance protocols, ensuring compliance with industry regulations (GDPR, HIPAA, etc., as relevant).
  • Establish security best practices for data access, encryption, backup, and disaster recovery.
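As a rough illustration of one control in this area (encryption at rest), a small boto3 sketch that checks and, if needed, enables default S3 encryption; the bucket name is hypothetical, and real governance work covers far more than this single check.

    # Illustrative only: check and enforce S3 default encryption with boto3
    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")
    bucket = "example-curated-data"  # hypothetical bucket name

    try:
        s3.get_bucket_encryption(Bucket=bucket)
        print(f"{bucket}: default encryption already configured")
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            # Apply SSE-S3 default encryption if none is configured
            s3.put_bucket_encryption(
                Bucket=bucket,
                ServerSideEncryptionConfiguration={
                    "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
                },
            )
            print(f"{bucket}: default SSE-S3 encryption enabled")
        else:
            raise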

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field (or equivalent work experience).
  • Expert-level proficiency in Python for data engineering tasks (ETL, scripting, automation).
  • Advanced SQL skills, including performance tuning, complex queries, stored procedures, and data transformations.
  • Hands-on experience with Databricks for data pipeline development and orchestration.
  • Strong understanding of AWS services (EC2, S3, Lambda, Glue, IAM, VPC) and familiarity with Solution Architect practices.
  • Proven track record of RDS / Postgres administration, including performance optimization and schema design.
  • Familiarity with version control (Git) and CI / CD pipelines.
  • Exceptional problem-solving capabilities and strong attention to detail.
  • Effective communication and collaboration skills across technical and non-technical teams.
  • Ability to work independently, set priorities, and manage multiple projects in a fast-paced environment.
  • Eagerness to learn and adapt in an ever-evolving tech landscape.

Preferred

  • AWS certifications (e.g., AWS Certified Solutions Architect – Associate or Professional).
  • Experience with other data warehouse technologies (Snowflake) and BI / analytics tools.
  • Exposure to infrastructure-as-code (CloudFormation, Terraform) and containerization (Docker, Kubernetes).
  • Knowledge of Java at a moderate level for specific data-related use cases or integrations.

