Data Engineer (LatAm-based) at Tricura Insurance Group - a US-based healthcare startup speciali[...]

Hire5

Germany

Remote

EUR 60,000 - 80,000

Full-time

Posted 2 days ago

Summary

A US-based, data-driven insurance company is seeking a Data Engineer to design and maintain data pipelines. The ideal candidate has a strong data engineering background, expertise in AWS services and Snowflake, and a proactive attitude. This full-time remote role offers flexible hours aligned with EST, USD compensation, and opportunities for long-term growth.

Benefits

Flexible working hours
Learning budget for professional development
Permanent role with long-term growth opportunities

Qualifications

  • Proven experience building production-grade data pipelines and systems.
  • Hands-on experience with AWS services.
  • Deep knowledge of Snowflake, including data modeling.
  • Strong skills in SQL and comfort with Python.

Responsibilities

  • Design and maintain reliable data pipelines using AWS services.
  • Work with structured and semi-structured data.
  • Manage large-scale databases and data models.
  • Ensure pipelines are robust and monitored.
  • Collaborate with Data Science, Product, and Engineering teams.
  • Document and automate data processes.

Skills

Strong Data Engineering Background
AWS Expertise
Snowflake
SQL Mastery
Python
Version Control
Ownership Mentality

Tools

AWS (S3, Lambda, EC2, Glue, IAM)
Snowflake

Job Description

Tricura Insurance Group is a rapidly growing company redefining the insurance landscape through data-driven insights and innovative technology. Founded by industry experts with backgrounds in clinical care, risk management, and technology, Tricura focuses on providing tailored liability coverage and advanced claims management for high-risk clients. Led by Matthew Queen (CEO), Beau Walker (CTO), and Gabriel Mayer (CSO), the company leverages AI and machine learning to improve underwriting, risk analysis, and client outcomes. At Tricura, you’ll join a collaborative, fast-paced environment where innovation drives meaningful impact.

We’re looking for a strong Data Engineer who can hit the ground running and help us scale our data infrastructure. You’ll join a collaborative and agile environment where speed, clarity, and ownership are valued.

About the Role

As a Data Engineer at Tricura, you will be responsible for designing, building, and maintaining the data pipelines and infrastructure that power our products and analytics. This role is ideal for someone with deep experience in data engineering—not data science—who’s comfortable optimizing complex ETL workflows, managing cloud environments, and supporting scalable data architecture in production.

You will work closely with the CTO and partner with engineering and product teams to address key data challenges—particularly in AWS and Snowflake.

What You’ll Do
  • Own our Data Infrastructure: Design, implement, and maintain reliable data pipelines using AWS services (e.g., Lambda, S3, Glue) and Snowflake, as illustrated in the sketch after this list.

  • Build and Optimize ETL Pipelines: Work with structured and semi-structured data across a variety of sources (e.g., internal systems, PDFs, APIs).

  • Schema and Database Management: Restructure and manage large-scale databases and data models to support reporting, analytics, and product features.

  • Integrate and Monitor Pipelines: Ensure pipelines are robust, well-logged, and monitored. Identify and resolve bottlenecks and failures.

  • Collaborate Across Teams: Partner with Data Science, Product, and Engineering teams to align data engineering work with business and product needs.

  • Document and Automate: Write clear documentation, automate manual data processes, and contribute to scalable, maintainable infrastructure.
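To make the pipeline work above a bit more concrete, here is a minimal sketch of the kind of S3-to-Snowflake load step this role involves, written as an AWS Lambda-style handler in Python. It is not Tricura's actual implementation: the event wiring, the CLAIMS_RAW table, the CLAIMS_STAGE external stage, and the environment-variable credentials are all placeholder assumptions chosen to keep the example short and self-contained.

```python
# Minimal sketch: a Lambda-style handler that reacts to an S3 put event and
# loads the new object into a Snowflake raw landing table. All names
# (CLAIMS_RAW, CLAIMS_STAGE, warehouse, database) are hypothetical.
import os

import boto3                   # AWS SDK for Python
import snowflake.connector     # Snowflake Python connector


def handler(event, context):
    """Entry point for an S3-triggered load into a raw landing table."""
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Sanity-check the landed object before loading it.
    s3 = boto3.client("s3")
    head = s3.head_object(Bucket=bucket, Key=key)
    if head["ContentLength"] == 0:
        return {"status": "skipped", "reason": "empty object", "key": key}

    # Credentials would normally come from an IAM role or Secrets Manager;
    # environment variables keep the sketch self-contained.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    cur = conn.cursor()
    try:
        # The external stage is assumed to point at the landing bucket. In a
        # real pipeline the key would be validated, not interpolated directly.
        cur.execute(
            "COPY INTO CLAIMS_RAW (payload, loaded_at) "
            f"FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @CLAIMS_STAGE/{key}) "
            "FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        cur.close()
        conn.close()

    return {"status": "loaded", "bucket": bucket, "key": key}
```

In practice the connection would be secured through IAM and a secrets store, and higher-volume ingestion might use Glue or Snowpipe rather than a hand-rolled Lambda; the sketch only shows the shape of the work.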

Must-Have Skills
  • Strong Data Engineering Background: Proven experience building production-grade data pipelines and systems (not ML or DS roles).

  • AWS Expertise: Hands-on experience with AWS services, particularly S3, Lambda, EC2, Glue, and IAM.

  • Snowflake: Deep knowledge of Snowflake, including data modeling, performance tuning, and access control.

  • SQL Mastery: Strong skills in SQL (e.g., Snowflake SQL, Postgres, MySQL); see the modeling sketch after this list.

  • Python: Comfort with Python for scripting, data wrangling, and ETL development.

  • Version Control: Experience using GitHub or GitLab in a collaborative engineering environment.

  • Ownership Mentality: Proactive problem-solver who doesn’t wait for instructions and enjoys moving quickly and independently.
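As a rough illustration of the Snowflake and SQL side of the role, the sketch below builds on the hypothetical CLAIMS_RAW table from the previous example: it flattens the semi-structured payload into typed columns and deduplicates by claim_id with a MERGE. Every table and column name here is invented for illustration, not taken from Tricura's schema.

```python
# Sketch of a typical modeling step: flatten semi-structured JSON from the raw
# landing table into a typed, deduplicated staging model via MERGE.
import os

import snowflake.connector

MERGE_STG_CLAIMS = """
MERGE INTO ANALYTICS.MODELED.STG_CLAIMS AS tgt
USING (
    SELECT
        payload:claim_id::STRING            AS claim_id,
        payload:member_id::STRING           AS member_id,
        payload:claim_amount::NUMBER(12,2)  AS claim_amount,
        payload:service_date::DATE          AS service_date,
        loaded_at
    FROM ANALYTICS.RAW.CLAIMS_RAW
    -- keep only the most recent record per claim
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY payload:claim_id::STRING ORDER BY loaded_at DESC
    ) = 1
) AS src
ON tgt.claim_id = src.claim_id
WHEN MATCHED THEN UPDATE SET
    member_id    = src.member_id,
    claim_amount = src.claim_amount,
    service_date = src.service_date,
    loaded_at    = src.loaded_at
WHEN NOT MATCHED THEN INSERT
    (claim_id, member_id, claim_amount, service_date, loaded_at)
    VALUES (src.claim_id, src.member_id, src.claim_amount,
            src.service_date, src.loaded_at)
"""


def refresh_stg_claims() -> int:
    """Run the merge and return the number of rows affected."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
    )
    cur = conn.cursor()
    try:
        cur.execute(MERGE_STG_CLAIMS)
        return cur.rowcount or 0
    finally:
        cur.close()
        conn.close()
```

Keying the MERGE on a business identifier keeps the step idempotent, so it can be re-run safely after an upstream failure without producing duplicates.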

Nice to Have
  • Experience with data orchestration tools (e.g., Airflow, dbt); a minimal DAG sketch follows this list.

  • Familiarity with CI/CD pipelines for data workflows.

  • Previous exposure to healthcare, insurance, or risk data is a bonus.
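For the orchestration piece, a minimal Airflow DAG might look like the sketch below, chaining the two hypothetical steps from the earlier examples on a daily schedule. The DAG id, task names, and schedule are assumptions for illustration, not an existing Tricura pipeline.

```python
# Hypothetical Airflow DAG tying the two sketches above together: ingest
# landed files, then refresh the model. Assumes Airflow 2.4+ (which accepts
# the `schedule` argument).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_landed_files():
    """Placeholder: replay the S3 -> CLAIMS_RAW load for the latest batch."""


def refresh_models():
    """Placeholder: run the CLAIMS_RAW -> STG_CLAIMS merge."""


with DAG(
    dag_id="claims_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_landed_files",
        python_callable=ingest_landed_files,
    )
    model = PythonOperator(
        task_id="refresh_models",
        python_callable=refresh_models,
    )

    # Modeling only runs after ingestion succeeds.
    ingest >> model
```

With dbt instead of Airflow operators, the modeling half would typically live as an incremental model rather than a hand-written MERGE; either way the dependency from ingestion to modeling stays the same.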

What We Offer
  • Full-time Remote Role: Work from anywhere with a reliable internet connection and laptop.

  • Aligned Hours: Flexible schedule during EST working hours (9am–5pm EST) with daily stand-ups at 12:30pm EST.

  • Long-Term Growth: This is a permanent role (not project-based), ideal for candidates looking for a 2+ year commitment.

  • Learning Budget: Resources for professional development, mentorship, and training.

  • USD Compensation: Salary discussed during interviews.

How to Apply
  1. Submit your CV and GitHub (or portfolio) via our application form.

  2. Zoom screening interview with a Hire5 recruiter.

  3. Technical interview with Tricura’s Engineering team.

  4. Offer & onboarding.
