Staff Data Engineer / Data Architect

Trimble

Spain

Remote

EUR 70,000 - 90,000

Full-time

Posted 2 days ago

Job Summary

A logistics technology company is seeking a Staff Data Engineer / Data Architect to design and implement cloud-based data pipelines. This remote role requires strong expertise in AWS and Azure along with leadership skills. Candidates should have a proven background in data architecture and cloud solutions. If you’re passionate about leveraging AI for efficiency, this is the perfect opportunity. Local employment contracts are available in Spain.

Qualifications

  • Strong technical background in designing and implementing data pipelines.
  • Experience with AWS and Azure data services.
  • Familiarity with data warehousing and big data technologies.

Responsibilities

  • Lead the design and implementation of cloud-based data pipelines.
  • Ensure code quality and compliance with industry standards.
  • Collaborate with teams to deliver cloud solutions.

Skills

Data architecture design
Cloud platforms (AWS, Azure)
ETL processes
SQL proficiency
Python programming
Infrastructure as Code
Leadership
Problem-solving

Experience

Minimum 8 years of experience as a Data Engineer

Tools

AWS Glue
Terraform
Docker
Kubernetes

Job Description

Transporeon is a SaaS company founded in 2000 in Ulm, Germany. The company provides logistics solutions across several areas, including:

  • Buying & selling of logistics services
  • Organizing shipment execution
  • Organizing dock, yard, truck, and driver schedules
  • Invoice auditing for logistics services

It has grown significantly over the years, reaching €150m in revenue before being acquired by Trimble for $2 billion in 2022. Transporeon has one of the largest networks of shippers and carriers in Europe, with approximately 1,400 employees.

We are looking for a highly skilled Staff Data Engineer / Data Architect for our Data and Cloud Engineering team, with expertise in AWS and Azure. The ideal candidate will have a strong technical background in designing, building, and implementing data pipelines and cloud solutions, along with excellent technical leadership and communication skills. This role also demands deep knowledge of cloud platforms, data architecture, and engineering best practices.

Your daily tasks...

  1. Lead the design and implementation of robust, scalable, and secure cloud-based data pipelines and architectures in AWS (with a possible later migration to MS Azure).
  2. Ensure best practices in code quality, architecture, and design.
  3. Design and implement secure, scalable, and high-performance cloud infrastructure.
  4. Manage cloud resources, optimize costs, and ensure high availability and disaster recovery.
  5. Automate infrastructure provisioning and deployment processes using Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, and ARM templates.
  6. Collaborate with cross-functional teams to understand data needs and deliver comprehensive cloud solutions.
  7. Oversee cloud infrastructure management, including monitoring, maintenance, and scaling of cloud resources.
  8. Ensure compliance with industry standards and regulatory requirements.
  9. Implement data governance policies and practices and ensure high data quality, integrity, and security across all cloud platforms.
  10. Identify and implement process improvements to enhance efficiency, quality, and scalability of data engineering and cloud operations.
  11. Stay current with emerging technologies and industry trends to drive innovation. Utilize AI to increase our efficiency.

Our tech-stack...

  • Infrastructure: Glue, Lambda, Step Functions, Batch, ECS, QuickSight, Machine Learning, SageMaker, Dagster
  • DevOps: CloudFormation, Terraform, Git, CodeBuild
  • Database: Redshift, PostgreSQL, DynamoDB, Athena (Trino), Snowflake, Databricks
  • Language: Bash, Python (PySpark, Pydantic, PyArrow), SQL

What do you bring to the table...

  • Min. 8 years of proven experience as a Data Engineer, with a current focus on data architecture design.
  • Extensive experience with AWS and Azure cloud platforms and their data services (AWS Redshift, AWS Glue, AWS S3, Azure Data Lake, Azure Synapse, Snowflake, Databricks).
  • Strong understanding of ETL processes, data warehousing, and big data technologies.
  • Proficiency in SQL, Python, and other relevant programming languages.
  • Experience with infrastructure as code (IaC) tools such as Terraform, CloudFormation, or ARM templates.
  • Knowledge of containerization and orchestration (Docker, Kubernetes).
  • Understanding of cloud cost management and optimization strategies.
  • Familiarity with CI/CD pipelines and DevOps practices.
  • Excellent leadership, communication, and interpersonal skills.
  • Ability to work in a fast-paced, dynamic environment.
  • Strong problem-solving and analytical skills.
  • Familiarity with data visualization tools (Power BI, QuickSight) is a plus.
  • Openness to using AI (in our case, Cursor) as a daily tool.

Job Location

Remote role in the following countries: Estonia, Latvia, Lithuania, Poland, Slovakia, Hungary, Romania, Portugal, Spain, Italy, Croatia. We do not offer freelance contracts, only local employment contracts.

Our Inclusiveness Commitment

We believe in celebrating our differences; our diversity is our strength. To us, that means actively participating in opportunities to be inclusive. Diversity, Equity, and Inclusion have guided our success and continue to drive our desire to improve. We actively seek to add members to our community who represent our customers and the places we live and work.

We have programs in place to make sure our people are seen, heard, and welcomed, and, most importantly, that they know they belong, no matter who they are or where they come from.
