[Hiring] Data Platform Engineer @Kontakt

Kontakt

United States

Remote

USD 80,000 - 100,000

Full time

30+ days ago

Job summary

Join a forward-thinking company as a remote Data Platform Engineer, where you'll design and maintain scalable data infrastructures that empower AI-driven automation in healthcare. This role offers the opportunity to work with cutting-edge technologies and collaborate with talented professionals in the field. You'll be at the forefront of transforming care delivery operations, ensuring compliance with healthcare standards while optimizing data processes. If you're passionate about data systems and want to make a meaningful impact in healthcare, this position is perfect for you.

Qualifications

  • 5+ years of experience in data engineering roles with a focus on scalable systems.
  • Proficiency in Python, Scala, or Java for data processing and automation.

Responsibilities

  • Design and build scalable data pipelines for RTLS, IoT, and EHR data.
  • Implement real-time event processing architectures using Kafka.

Skills

Python
Scala
Java
Data Engineering
Kafka
ETL/ELT Processes
HIPAA Compliance
Data Processing

Education

Bachelor's Degree in Computer Science or related field

Tools

Apache Spark
AWS
Docker
Kubernetes
Terraform

Job description

Feb 12, 2025 - Kontakt.io is hiring a remote Data Platform Engineer. Location: Europe.

Kontakt.io is building the platform that care operations run on. We reduce waste, cut costs, and improve revenue by improving throughput, asset utilization, and staff productivity. Our platform uses AI, RTLS, and EHR data to enable self-learning agents to automate workflows, adapt in real time, and orchestrate care delivery operations end to end. Easy to deploy and scale, it gives a clear picture of spaces, equipment, and people, eliminating inefficiencies and enhancing the patient experience. With measurable 10X ROI and more than 20 use cases, Kontakt.io is the go-to platform for better and faster care delivery operations.
As a Data Platform Engineer, you will play a crucial role in designing, building, and maintaining scalable, high-performance data infrastructure. You will work with cutting-edge cloud and data technologies to enable real-time analytics, data ingestion, and machine learning applications. Your work will enable AI-powered automation, allowing hospitals and healthcare facilities to improve efficiency, reduce costs, and deliver better, faster care.
If you’re passionate about scalable data systems, AI-driven automation, and transforming care delivery operations, join Kontakt.io and help us redefine the future of healthcare!


Responsibilities
  1. Design and build scalable data pipelines (batch & streaming) to process RTLS, IoT, and EHR data.
  2. Implement real-time event processing architectures using Kafka.
  3. Optimize ETL/ELT processes for efficient data ingestion, transformation, and storage.
  4. Develop and maintain data lakes and warehouses, creating scalable bronze, silver, and gold data layers that support advanced analytics and machine learning applications.
  5. Partner with software engineers and cloud teams to maintain a highly available data platform.
  6. Work closely with third-party vendors (e.g., Epic, Cerner, Meditech) to integrate, standardize, and secure data from EHRs.
  7. Initiate and participate in architectural decisions to improve scalability, resiliency, and system efficiency.
  8. Ensure HIPAA and healthcare compliance standards are met in data storage and transmission.
  9. Optimize query performance, indexing, and partitioning for large-scale data processing.
  10. Implement role-based access control (RBAC), encryption, and anonymization for sensitive healthcare data.


What You Bring
  1. 5+ years of experience as a Data Engineer, Data Platform Engineer, or related role.
  2. Proficiency in Python, Scala, or Java for data processing and automation.
  3. Hands-on experience with big data frameworks like Apache Spark.
  4. Experience with streaming and batch processing using Kafka.
  5. Strong knowledge of cloud platforms (AWS) and their data processing services.
  6. Expertise in TimescaleDB.
  7. Familiarity with containerization, orchestration, and infrastructure-as-code tools (Docker, Kubernetes, Terraform).

Bonus Skills
  1. Previous experience in healthcare, hospital operations, or real-time systems.
  2. Experience working with EHR systems (Epic, Cerner, Meditech).