Senior Data Engineer

PropHero

Remote

IDR 666.777.000 - 1.000.167.000

Full time

Today

Job summary

A growing property analytics company in Indonesia is seeking a Data Engineer to build and maintain real-time data ingestion pipelines for their analytics platform. This role involves using AWS services to architect event-driven data flows, ensuring data quality, and integrating external APIs effectively. The position offers flexible remote work arrangements and the opportunity to shape data foundations that drive business decisions. Ideal candidates will have extensive experience in Python and AWS, along with strong communication skills.

Benefits

Fully remote working arrangement
Growth opportunities
Impact on business decisions

Qualifications

  • 5+ years of experience as a Data Engineer focused on data ingestion.
  • Strong proficiency in Python for building ETL pipelines.
  • Hands-on AWS experience with Lambda and Kinesis.

Responsibilities

  • Design and implement event-driven data pipelines using AWS services.
  • Build and maintain streaming data pipelines between HubSpot and PostgreSQL.
  • Ensure proper data validation and quality at the ingestion layer.

Skills

Python
AWS services
Data ingestion
Event-driven architecture
Real-time data streaming
API integration
PostgreSQL
English communication

Tools

PostgreSQL
AWS Lambda
Kinesis
AWS S3
Kafka

Job description

As a Data Engineer at PropHero, you will build and maintain real-time data ingestion pipelines that power our property analytics platform. You'll architect event-driven data flows to seamlessly stream data from external sources (HubSpot, APIs, webhooks) into our operational PostgreSQL database. Working within an AWS ecosystem, you'll ensure data is reliably ingested, validated, and ready for downstream analytics teams. You'll be the gatekeeper of data quality at the entry point, enabling our analysts to transform raw data into insights.
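
To make the shape of such a pipeline concrete, below is a minimal Python sketch of a webhook handler that validates an inbound contact event and upserts it into PostgreSQL. The crm_contacts table, payload fields, and DATABASE_URL variable are illustrative assumptions; the posting does not describe PropHero's actual schema or event format.

    import json
    import os

    import psycopg2  # assumed to be bundled with the Lambda deployment package

    # Hypothetical target table; the real schema is not given in the posting.
    UPSERT_SQL = """
        INSERT INTO crm_contacts (hubspot_id, email, updated_at)
        VALUES (%s, %s, now())
        ON CONFLICT (hubspot_id)
        DO UPDATE SET email = EXCLUDED.email, updated_at = now();
    """

    def handler(event, context):
        """Handle a webhook delivered through API Gateway."""
        payload = json.loads(event["body"])

        # Minimal quality gate: reject events missing required fields.
        hubspot_id = payload.get("objectId")
        email = payload.get("properties", {}).get("email")
        if hubspot_id is None or email is None:
            return {"statusCode": 400, "body": "missing objectId or email"}

        # Connecting per invocation keeps the sketch simple; a real
        # pipeline would reuse connections across invocations.
        conn = psycopg2.connect(os.environ["DATABASE_URL"])
        try:
            with conn, conn.cursor() as cur:
                cur.execute(UPSERT_SQL, (hubspot_id, email))
        finally:
            conn.close()

        return {"statusCode": 200, "body": "ok"}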

Responsibilities
  • Event-Based Data Streaming: Design and implement event-driven pipelines using AWS services (Lambda, EventBridge, Kinesis/MSK, SQS) to ingest data from external sources in real time (a consumer sketch follows this list).
  • HubSpot Integration: Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events, API polling, and CDC patterns for sub-minute data freshness.
  • External API Integration: Develop robust connectors for third-party APIs, webhooks, and data sources, ensuring reliable data capture with proper error handling and retry logic (a retry sketch follows this list).
  • Data Validation & Quality: Implement schema validation, data type checking, and automated quality gates at the ingestion layer to prevent bad data from entering the system (a validation sketch follows this list).
  • Database Design & Optimization: Manage relational and analytical databases, ensuring performance, scalability, and cost efficiency.
  • AWS Infrastructure Management: Deploy and manage AWS resources (Lambda, RDS, EventBridge, CloudWatch, S3) for scalable data solutions.
  • Monitoring & Alerting: Build comprehensive monitoring dashboards and alerting systems to track pipeline health, data freshness, and error rates.
  • Performance Optimization: Optimize ingestion pipelines for throughput, latency, and efficiency while handling high-volume data streams.
  • Documentation: Maintain clear documentation of pipeline architecture, data flows, API integrations, and operational runbooks.
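
For the event-based streaming item, here is a rough sketch of a Kinesis-triggered Lambda consumer in Python. It assumes the event source mapping is configured with ReportBatchItemFailures, so only failed records are retried rather than the whole batch; the ingest step is a hypothetical placeholder.

    import base64
    import json

    def handler(event, context):
        """Consume a batch from a Kinesis trigger, reporting partial failures."""
        failures = []
        for record in event["Records"]:
            try:
                # Kinesis delivers record payloads base64-encoded.
                payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
                ingest(payload)
            except Exception:
                failures.append(
                    {"itemIdentifier": record["kinesis"]["sequenceNumber"]}
                )
        return {"batchItemFailures": failures}

    def ingest(payload):
        # Hypothetical downstream step: schema validation, then a DB write.
        print(payload)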
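
For the external API integration item, a minimal retry helper with exponential backoff might look like the following. The handling of HTTP 429 (honoring Retry-After) and 5xx responses reflects common REST API conventions, not any specific third-party API named in this posting.

    import time

    import requests

    def fetch_with_retries(url, headers, max_attempts=5):
        """GET an endpoint, backing off on rate limits and server errors."""
        for attempt in range(1, max_attempts + 1):
            response = requests.get(url, headers=headers, timeout=10)
            if response.status_code == 429:
                # Honor the server's rate-limit hint when it sends one.
                delay = float(response.headers.get("Retry-After", 2 ** attempt))
            elif response.status_code >= 500:
                delay = 2 ** attempt  # exponential backoff on server errors
            else:
                response.raise_for_status()  # surface other 4xx errors
                return response.json()
            if attempt == max_attempts:
                response.raise_for_status()  # out of retries; raise the error
            time.sleep(delay)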
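
And for the data validation item, one common approach is a typed schema gate at the ingestion boundary, sketched here with pydantic. The ContactEvent fields are hypothetical; a production pipeline would route rejects to a dead-letter queue rather than logging and dropping them.

    from datetime import datetime

    from pydantic import BaseModel, ValidationError

    class ContactEvent(BaseModel):
        """Hypothetical shape of an inbound CRM contact event."""
        hubspot_id: int
        email: str
        updated_at: datetime

    def validate_or_reject(raw: dict) -> ContactEvent | None:
        """Return a parsed event, or None after routing bad data aside."""
        try:
            return ContactEvent(**raw)
        except ValidationError as exc:
            # A real pipeline would quarantine this event for inspection
            # instead of printing and discarding it.
            print(f"rejected event: {exc}")
            return None
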
Requirements
  • 5+ years of experience as a Data Engineer focused on data ingestion and pipeline development.
  • Strong proficiency in Python for building ETL/ELT pipelines, API integrations, and data validation logic.
  • Hands-on AWS experience with Lambda, EventBridge, Kinesis/SQS, RDS/PostgreSQL, CloudWatch, and S3.
  • Event-driven architecture: Proven experience with event buses, message queues, webhooks, and streaming architectures.
  • Real-time streaming data: Experience with Kafka, Kinesis, or similar streaming platforms; understanding of CDC and real-time data patterns.
  • API integration expertise: Strong experience with REST APIs, authentication methods (OAuth, API keys), rate limiting, and error handling.
  • PostgreSQL knowledge: Comfortable writing efficient SQL and understanding database constraints, indexing, and connection pooling (a pooling sketch follows this list).
  • Professional English proficiency: Strong written and verbal English communication skills for technical documentation, code comments, and daily collaboration with Australia- and Spain-based team members.
  • HubSpot or CRM API experience (bonus): Familiarity with CRM APIs (HubSpot, Salesforce, Zoho, or similar) is a strong plus.
  • Problem-solving: Analytical mindset with ability to debug complex pipeline issues and implement robust error recovery.
  • Remote collaboration: Comfortable working remotely with distributed teams across different time zones.
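
As a small illustration of the connection pooling point above, the sketch below reuses a psycopg2 connection pool across queries instead of opening a connection per request. The crm_contacts table and DATABASE_URL variable are assumptions carried over from the earlier sketches.

    import os

    from psycopg2 import pool

    # A small shared pool avoids per-query connection overhead.
    PG_POOL = pool.SimpleConnectionPool(
        minconn=1,
        maxconn=5,
        dsn=os.environ["DATABASE_URL"],
    )

    def count_recent_contacts() -> int:
        conn = PG_POOL.getconn()
        try:
            with conn.cursor() as cur:
                # An index on updated_at keeps this range predicate cheap.
                cur.execute(
                    "SELECT count(*) FROM crm_contacts "
                    "WHERE updated_at > now() - interval '1 day';"
                )
                return cur.fetchone()[0]
        finally:
            PG_POOL.putconn(conn)
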
Why PropHero
  • Impact & Ownership: Your work will directly shape the data foundation that drives our business decisions and customer experience.
  • Cutting-Edge Stack: Build on a modern, cloud-native AWS data platform.
  • Growth Opportunities: Be part of a fast-scaling team with opportunities to lead data architecture and mentor others.
  • Healthy Business: €30M revenue in 4 years, 25% QoQ growth, already profitable.
  • Fully remote working arrangement: Work flexibly while maintaining high deliverables.

Diversity Statement

At PropHero, we are committed to fostering an inclusive and equitable workplace where diverse perspectives and backgrounds are not only welcomed but celebrated. We believe that diversity drives innovation and empowers us to build stronger connections with our clients and communities.

PropHero is an equal opportunity employer and is dedicated to ensuring a hiring process free from discrimination based on race, ethnicity, gender, age, disability, religion, sexual orientation, or any other characteristic protected by law. Our mission is to create a workplace where everyone feels valued, supported, and empowered to achieve their full potential.
