Integration Specialist

CUTE MARINE SERVICES PTE. LTD.

Singapore

On-site

SGD 70,000 - 100,000

Full time

Job summary

A marine data analytics company in Singapore is seeking an experienced data engineer to architect, design, and maintain KNIME-based ETL workflows for marine data integration. Candidates should have solid experience in data integration and analytics, with a focus on maritime operations and compliance. The role involves automating complex data-processing scenarios, building reliable pipelines, and working with diverse maritime datasets.

Qualifications

  • 4–7 years of experience in data integration or analytics.
  • Minimum of 3 years working with KNIME.
  • Hands-on experience with AIS feeds and maritime API integration.

Responsibilities

  • Architect and develop KNIME-based ETL workflows.
  • Implement data validation and error handling in pipelines.
  • Collaborate with marine analysts for operational intelligence.

Skills

Data integration
ETL development
KNIME
SQL
Python
Data modeling
Cloud-native ETL
Understanding of maritime operations
Cloud services (AWS, Azure, GCP)

Education

Bachelor’s or Master’s degree in Computer Science or Data Engineering

Tools

KNIME Server
Oracle
PostgreSQL
MS SQL Server
NoSQL

Job description

Responsibilities

  • Architect, design, develop, and maintain KNIME-based ETL workflows for ingestion, transformation, enrichment, and analysis of marine data.
  • Develop automated pipelines for complex data processing scenarios, including:
      • Real-time and batch vessel tracking via AIS feeds and satellite data
      • Integration of environmental sensor networks (e.g., weather stations, sea‑state monitoring, pollution sensors)
      • Port operations and cargo logistics monitoring, predictive analytics, and anomaly detection
      • Compliance reporting aligned with maritime regulations (IMO, MARPOL, SOLAS)
  • Integrate KNIME workflows with marine operational databases, streaming platforms, REST/SOAP APIs, IoT sensor feeds, and third‑party maritime data sources.
  • Implement robust data validation, cleansing, error handling, exception management, and alerting mechanisms to ensure high reliability in mission‑critical marine data pipelines.
  • Handle structured, semi‑structured, and unstructured datasets (e.g., AIS messages, JSON, XML, NetCDF, CSV) and transform them into actionable insights for dashboards and operational KPIs.
  • Build KNIME workflows to support audit trails, regulatory compliance, vessel activity tracking, cargo movement, fuel consumption, emissions, and safety incident reporting.
  • Collaborate with marine analysts, port authorities, and operations teams to design both real‑time streaming and batch processing pipelines for operational intelligence.
  • Develop modular, reusable KNIME components for reporting, maritime risk scoring, and operational monitoring.
  • Maintain workflow lifecycle management, version control (Git), CI/CD pipelines, and deployment to KNIME Server for production execution.
  • Monitor, optimize, and scale workflows for performance, throughput, and latency, particularly for high‑frequency AIS and sensor streams.
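To make the validation and error-handling duties above concrete: raw AIS data arrives as NMEA 0183 AIVDM sentences, each carrying an XOR checksum between the leading '!' and the trailing '*'. A minimal Python sketch of checksum validation follows (the sample sentence is synthetic — its checksum is valid, but the payload is not a decodable AIS message):

```python
from functools import reduce

def nmea_checksum(sentence: str) -> str:
    """XOR of every character between the leading '!' and the '*',
    rendered as two uppercase hex digits (NMEA 0183 convention)."""
    body = sentence[1:sentence.rindex('*')]
    return format(reduce(lambda acc, ch: acc ^ ord(ch), body, 0), '02X')

def validate_aivdm(sentence: str) -> bool:
    """Return True if an AIVDM sentence's checksum matches;
    raise ValueError for structurally malformed input."""
    if not sentence.startswith('!AIVDM'):
        raise ValueError('not an AIVDM sentence')
    if '*' not in sentence:
        raise ValueError('missing checksum delimiter')
    expected = sentence[sentence.rindex('*') + 1:].strip().upper()
    return nmea_checksum(sentence) == expected

# Synthetic example (checksum is correct, payload is not a real AIS message):
print(validate_aivdm('!AIVDM,1,1,,A,1,0*17'))  # True
```

In a production pipeline a failed check would route the sentence to a quarantine table and trigger the alerting mechanism rather than silently dropping it.
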

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related technical field.
  • 4–7 years of experience in data integration, ETL development, or analytics, with a minimum of 3 years working extensively with KNIME.
  • Strong understanding of maritime operations, vessel monitoring systems, port logistics, and environmental regulations.
  • Hands‑on experience integrating KNIME with databases (Oracle, PostgreSQL, MS SQL Server, NoSQL), AIS feeds, IoT/sensor networks, and maritime APIs.
  • Proficiency in SQL, data modeling, and complex data transformations across relational and non‑relational systems.
  • Advanced scripting/programming skills in Python, Shell, or Java, particularly for KNIME extensions, API integration, and automation.
  • Familiarity with marine and environmental data formats: AIS, JSON, XML, NetCDF, CSV.
  • Demonstrated ability to design and optimize high‑throughput, fault‑tolerant ETL pipelines, manage workflow performance, and implement exception handling strategies.
  • Experience working in regulated environments, handling sensitive operational and environmental datasets securely.
  • Hands‑on experience with KNIME Server (workflow scheduling, remote execution, user permissions).
  • Experience building cloud‑native ETL and analytics pipelines (AWS, Azure, GCP) for marine and sensor data.
  • Knowledge of maritime risk scoring frameworks, vessel safety standards, and environmental compliance metrics.
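The data-validation and quarantine pattern the requirements call for can be sketched outside KNIME as well. The record fields (`mmsi`, `lat`, `lon`, `timestamp`) and range checks below are illustrative assumptions, not a schema prescribed by this role:

```python
import json

# Illustrative required fields for a vessel-position record
# (an assumption for this sketch, not a mandated schema).
REQUIRED = ('mmsi', 'lat', 'lon', 'timestamp')

def clean_positions(raw_lines):
    """Split raw JSON lines into validated rows and a quarantine list of
    (original_line, reason) pairs, so bad records never reach downstream."""
    good, quarantined = [], []
    for line in raw_lines:
        try:
            rec = json.loads(line)
            row = {k: rec[k] for k in REQUIRED}  # KeyError if a field is missing
            lat, lon = float(row['lat']), float(row['lon'])
            if not (-90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0):
                raise ValueError('coordinates out of range')
            row['lat'], row['lon'] = lat, lon
            good.append(row)
        except (json.JSONDecodeError, KeyError, ValueError) as exc:
            quarantined.append((line, str(exc)))
    return good, quarantined
```

Keeping rejects alongside a human-readable reason supports the audit-trail and exception-management responsibilities listed above.
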