Senior Data Engineer

Trimble Inc.

Tarnów

Remote

PLN 292,000 - 439,000

Full time

Job summary

A global technology firm is seeking a Senior Data Engineer with expertise in AWS and Azure. Responsibilities include designing scalable cloud data pipelines and collaborating with cross-functional teams. Candidates should have a minimum of 5 years' experience and strong skills in Python, SQL, and ETL processes. This position offers the flexibility to work remotely from select countries, including Poland.

Qualifications

  • 5+ years of proven experience as a Data Engineer.
  • Hands-on experience with AWS data services.
  • Strong understanding of ETL processes and data warehousing.

Responsibilities

  • Design and implement cloud-based data pipelines.
  • Collaborate with teams to deliver cloud solutions.
  • Manage and optimize cloud resources.

Skills

AWS cloud platform
Python
SQL
ETL processes
Terraform
Docker
Data warehousing
Big data technologies

Tools

AWS Glue
AWS Redshift
Azure Data Lake
Snowflake
Databricks

Job description

Transporeon is a SaaS company founded in 2000 in Ulm, Germany. The company provides logistics solutions across several areas, including:

  • Buying & selling of logistics services
  • Organizing shipment execution
  • Organizing dock, yard, truck, and driver schedules
  • Invoice auditing for logistics services

It has grown significantly over the years, reaching €150m in revenue before being acquired by Trimble for $2 billion in 2022. Transporeon has one of the largest networks of shippers and carriers in Europe, with approximately 1,400 employees (https://www.transporeon.com/en).

We are looking for a highly skilled Senior Data Engineer for our Data and Cloud Engineering team, with expertise in AWS and Azure. The ideal candidate will have a strong technical background in designing, building, and implementing data pipelines and cloud solutions, along with strong technical guidance and communication skills.

Your daily tasks
  • Design and implement robust, scalable, and secure cloud-based data pipelines and architectures in AWS (with a possible later migration to Microsoft Azure); a minimal sketch of such a pipeline follows this list.
  • Ensure best practices in code quality, architecture, and design.
  • Design and implement secure, scalable, and high-performance cloud infrastructure.
  • Manage cloud resources, optimize costs, ensure high availability and disaster recovery.
  • Automate infrastructure provisioning and deployment processes using Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, and ARM templates.
  • Collaborate with cross-functional teams to understand data needs and deliver comprehensive cloud solutions.
  • Oversee cloud infrastructure management, including monitoring, maintenance, and scaling of cloud resources.
  • Ensure compliance with industry standards and regulatory requirements.
  • Implement data governance policies and practices and ensure high data quality, integrity, and security across all cloud platforms.
  • Identify and implement process improvements to enhance efficiency, quality, and scalability of data engineering and cloud operations.
  • Stay current with emerging technologies and industry trends to drive innovation. Utilize AI to increase our efficiency.
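
As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of the kind of job this role might own. The bucket names, paths, and columns are hypothetical placeholders, not details taken from this posting:

    # Minimal PySpark ETL sketch: read raw JSON events from S3,
    # clean them, and write partitioned Parquet back to S3.
    # Bucket names and columns are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events-etl").getOrCreate()

    # Extract: raw events landed by an upstream producer.
    raw = spark.read.json("s3://example-raw-zone/events/")

    # Transform: drop malformed rows and derive a partition column.
    clean = (
        raw.filter(F.col("event_id").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Load: partitioned, columnar output for downstream Redshift or Athena.
    (clean.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-curated-zone/events/"))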

Our tech stack
  • Infrastructure: Glue, Lambda, Step Functions, Batch, ECS, QuickSight, Machine Learning, SageMaker, Dagster
  • DevOps: CloudFormation, Terraform, Git, CodeBuild
  • Database: Redshift, PostgreSQL, DynamoDB, Athena (Trino), Snowflake, Databricks
  • Language: Bash, Python (PySpark, Pydantic, PyArrow), SQL (a small Pydantic sketch follows this list)
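
Since Pydantic appears in the stack above, here is a small sketch of how it might be used to guard record quality inside a pipeline. The Event model and its fields are illustrative assumptions, not part of this posting:

    # Hypothetical Pydantic model used to validate incoming pipeline
    # records before they reach the warehouse; fields are illustrative.
    from datetime import datetime
    from pydantic import BaseModel, ValidationError

    class Event(BaseModel):
        event_id: str
        event_ts: datetime
        payload: dict

    def validate_records(rows):
        """Split raw dicts into valid Events and rejected rows."""
        valid, rejected = [], []
        for row in rows:
            try:
                valid.append(Event(**row))
            except ValidationError as err:
                rejected.append((row, str(err)))
        return valid, rejected

    valid, rejected = validate_records([
        {"event_id": "e1", "event_ts": "2024-01-01T00:00:00", "payload": {}},
        {"event_id": None, "event_ts": "not-a-date", "payload": {}},
    ])
    print(len(valid), len(rejected))  # prints: 1 1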

What Do You Bring To The Table
  • Min. 5 years of experience as a Data Engineer, with a proven track record of delivering production‑grade data pipelines.
  • Hands‑on experience with the AWS cloud platform and cloud data services (e.g., AWS Redshift, AWS Glue, AWS S3, Azure Data Lake, Azure Synapse, Snowflake, Databricks).
  • Strong understanding of ETL processes, data warehousing, and big data technologies.
  • Proficiency in SQL and Python, comfortable with Spark jobs.
  • Experience with infrastructure as code (IaC) tools such as Terraform, CloudFormation, or ARM templates.
  • Knowledge of containerization and orchestration (e.g., Docker, Kubernetes).
  • Understanding of cloud cost management and optimization strategies.
  • Familiarity with CI/CD pipelines and DevOps practices.
  • Strong problem‑solving and analytical skills.
  • Familiarity with data visualization tools (e.g., Power BI, QuickSight) is a plus.
  • Openness to using AI tools, in our case Cursor, in your daily work.

Job Location

On‑site in Ulm, Germany, or Tallinn, Estonia, or remote from one of the following countries: Estonia, Latvia, Lithuania, Poland, Slovakia, Hungary, Romania, Portugal, Spain, Italy, or Croatia. We do not offer freelance contracts, only local employment contracts.

How to Apply

Please submit an online application for this position by clicking the ‘Apply Now’ button in this posting.

Application Deadline

Applications will be accepted for at least 30 days from the posting date.

Join a Values-Driven Team: Belong, Grow, Innovate.

At Trimble, our core values of Belong, Grow, and Innovate aren't just words—they're the foundation of our culture. We foster an environment where you are seen, heard, and valued (Belong); where you have an opportunity to build a career and drive our collective growth (Grow); and where your innovative ideas shape the future (Innovate). We believe in empowering local teams to create impactful strategies, ensuring our global vision resonates with every individual. Become part of a team where your contributions truly matter.

Trimble’s Privacy Policy

If you need assistance or would like to request an accommodation in connection with the application process, please contact AskPX@px.trimble.com.
