
Senior Data Engineer

Automat-it

Madrid

On-site

EUR 50,000 - 70,000

Full-time

Posted 30+ days ago

Vacancy description

A fast-growing tech firm in Madrid is seeking a Senior Data Engineer to build their Data & Analytics practice, delivering modern data solutions on AWS. The role requires strong AWS experience, expertise in data engineering and analytics, and the ability to communicate effectively with clients. You will design end-to-end data pipelines and manage data lakes for AI/ML training, ensuring scalable and cost-effective architectures.

Benefits

Professional training and certifications
International work environment
Referral program
Company events and social gatherings
Wellbeing coaching
Soft skills training

Qualifications

  • 5+ years of experience in data engineering, data analytics, or a related field.
  • 3+ years of hands-on AWS experience.
  • Skilled in AWS analytics and dashboard tools.

Responsibilities

  • Design and deploy AWS-based data solutions.
  • Develop dashboards and analytics reports.
  • Migrate existing data workflows to AWS.

Skills

AWS services
Python
SQL
Data analytics
ETL pipelines
Communication skills

Education

5+ years in data engineering or related field
AWS certifications

Tools

AWS Glue
Amazon Redshift
Amazon QuickSight
Tableau
Power BI
Terraform

Job description

Automat-it is where high-growth startups turn when they need to move faster, scale smarter, and make the most of the cloud. As an AWS Premier Partner and Strategic Partner, we deliver hands-on DevOps, FinOps, and GenAI support that drives real results.

We work across EMEA, fueling innovation and solving complex challenges daily. Join us to grow your skills, shape bold ideas, and help build the future of tech.

We are looking for a Senior Data Engineer to play a key role in building our Data & Analytics practice and delivering modern data solutions on AWS. In this role, you'll be a customer-facing, hands-on technical engineer who designs and implements end-to-end data pipelines and analytics platforms using AWS services like AWS Glue, Amazon OpenSearch Service, Amazon Redshift, and Amazon QuickSight. From migrating legacy ETL workflows to AWS Glue to building scalable data lakes for AI/ML training, you'll ensure our customers can unlock the full value of their data. You will work closely with client stakeholders—from startup founders and CTOs to data engineers—to create secure, cost-efficient architectures that drive real business impact.

Work location: hybrid from Madrid

If you are interested in this opportunity, please submit your CV in English.

Responsibilities:

  • Design, develop, and deploy AWS-based data and analytics solutions to meet customer requirements. Ensure architectures are highly available, scalable, and cost-efficient.
  • Develop dashboards and analytics reports using Amazon QuickSight or equivalent BI tools.
  • Migrate and modernize existing data workflows to AWS. Re-architect legacy ETL pipelines to AWS Glue and move on-premises data systems to Amazon OpenSearch/Redshift for improved scalability and insights.
  • Build and manage multi-modal data lakes and data warehouses for analytics and AI. Integrate structured and unstructured data on AWS (e.g., S3, Redshift) to enable advanced analytics and generative AI model training using tools like SageMaker.
  • Implement infrastructure automation and CI/CD for data projects. Use Infrastructure as Code (Terraform) and DevOps best practices to provision AWS resources and continuously integrate/deploy data pipeline code.
  • Lead customer workshops and proof-of-concepts (POCs) to demonstrate proposed solutions. Run technical sessions (architecture whiteboards, Well-Architected reviews) to validate designs and accelerate customer adoption.
  • Collaborate with engineering teams (Data Science, DevOps, and MLOps) and stakeholders to deliver projects successfully. Ensure solutions follow AWS best practices and security guidelines, guiding client teams in implementation.
  • Stay up-to-date on emerging data technologies and mentor team members. Continuously learn new AWS services (e.g., AWS Bedrock, Lake Formation) and industry trends, sharing knowledge to improve our Data & Analytics practice.
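The ETL work described above (re-architecting legacy pipelines and loading curated data into a warehouse) follows the classic extract-transform-load pattern. A toy sketch in plain Python, with an in-memory SQLite database standing in for Amazon Redshift and a CSV string standing in for an S3 object — all names and data here are illustrative, not taken from the posting:

```python
import csv
import io
import sqlite3

# Toy extract-transform-load pipeline. In a real AWS Glue job the source
# would be an S3 object and the sink Amazon Redshift; here a CSV string
# and an in-memory SQLite database stand in for both (illustrative only).

RAW_CSV = """order_id,country,amount
1,ES,120.50
2,ES,80.00
3,FR,200.00
4,ES,bad_value
"""

def extract(raw: str) -> list[dict]:
    """Read raw rows from the 'landing zone'."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Clean and type the rows, dropping records that fail validation."""
    out = []
    for row in rows:
        try:
            out.append((int(row["order_id"]), row["country"], float(row["amount"])))
        except ValueError:
            continue  # in production, route bad rows to a quarantine table
    return out

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load cleaned rows into the 'warehouse'."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INT, country TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn

conn = load(transform(extract(RAW_CSV)))
total_es = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE country = 'ES'"
).fetchone()[0]
print(total_es)  # 200.5 (the bad_value row is quarantined, not loaded)
```

The same extract/transform/load split maps directly onto a Glue job script, where Spark DataFrames replace the Python lists and Redshift replaces SQLite.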

Benefits:

  • Professional training and certifications covered by the company (AWS, FinOps, Kubernetes, etc.)
  • International work environment
  • Referral program—enjoy collaboration with colleagues and receive bonuses
  • Company events and social gatherings (happy hours, team events, knowledge sharing, etc.)
  • Wellbeing and professional coaching
  • English classes
  • Soft skills training

Country-specific benefits will be discussed during the hiring process.

Automat-it is committed to fostering a workplace that promotes equal opportunities for all and believes that a diverse workforce is crucial to our success. Our recruitment decisions are based on your experience and skills, recognizing the value you bring to our team.


Requirements:

  • 5+ years of experience in data engineering, data analytics, or a related field, including 3+ years of hands-on AWS experience (designing, building, and maintaining data solutions on AWS)
  • Production experience with AWS cloud and data services, including building solutions at scale with tools like AWS Glue, Amazon Redshift, Amazon S3, Amazon Kinesis, Amazon OpenSearch Service, etc.
  • Skilled in AWS analytics and dashboard tools, with hands-on expertise in tools such as Amazon QuickSight, Tableau, Power BI, and Amazon Athena.
  • Experience with ETL pipelines and ability to build ETL/ELT workflows (using AWS Glue, Spark, Python, SQL).
  • Experience with data warehousing and data lakes—designing and optimizing data lakes (on S3), Amazon Redshift, and Amazon OpenSearch for log/search analytics.
  • Proficiency in programming (Python/PySpark) and SQL for data processing and analysis.
  • Understanding of cloud security and data governance best practices (encryption, IAM, data privacy).
  • Excellent communication skills, capable of explaining complex data concepts clearly. Comfortable working directly with clients and guiding technical discussions.
  • Proven ability to lead end-to-end technical engagements and work effectively in fast-paced Agile environments.
  • AWS certifications, especially in Data Analytics or Machine Learning, are a plus.
  • DevOps/MLOps knowledge, including Infrastructure as Code (Terraform), CI/CD pipelines, containerization, and AWS AI/ML services (SageMaker, Bedrock), is a plus.
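One recurring idea behind several of these requirements (data lakes on S3, querying with Athena and Glue) is Hive-style partitioning of object keys, which lets query engines prune data by partition instead of scanning everything. A minimal illustration in plain Python — the table name and key scheme are made up for the example, not part of the posting:

```python
from datetime import date

def partitioned_key(table: str, event_date: date, filename: str) -> str:
    """Build a Hive-style partitioned object key, as commonly laid out on S3
    so that engines like Athena can prune partitions at query time.
    (Table and file names are illustrative, not from the posting.)"""
    return (
        f"{table}/year={event_date.year}"
        f"/month={event_date.month:02d}"
        f"/day={event_date.day:02d}/{filename}"
    )

key = partitioned_key("orders", date(2024, 7, 3), "part-0000.parquet")
print(key)  # orders/year=2024/month=07/day=03/part-0000.parquet
```

A query filtered on `year`, `month`, and `day` then only reads objects under the matching prefixes, which is the main cost and performance lever in a lake of this kind.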

Employment Type: Full-time

Key Skills:

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala


Vacancy: 1
