Senior Data Engineer

NucleusX B.V

Denpasar

On-site

IDR 200.000.000 - 300.000.000

Full time

Job summary

A data engineering firm in Bali is looking for a Senior Data Engineer to build scalable web scrapers and lead projects. You will mentor junior engineers and ensure best practices are followed while collaborating with stakeholders. Proficiency in Python and with cloud platforms such as AWS, GCP, or Azure is essential. The position offers competitive compensation and relocation support for candidates moving within Indonesia.

Benefits

Competitive monthly salary
Relocation support

Qualifications

  • Strong proficiency in Python and SQL with hands-on web scraping experience.
  • Deep knowledge of frameworks such as Scrapy, Playwright, and Selenium, as well as anti-bot strategies.
  • Experience designing reusable libraries and modular scraper frameworks.

Responsibilities

  • Lead the development and maintenance of scalable, production-grade web scrapers.
  • Design, build, and maintain reusable scraper libraries and frameworks.
  • Manage and optimize cloud infrastructure for large-scale scraping workloads.

Skills

Python
SQL
Scrapy
Playwright
Selenium
AWS
GCP
Azure
Docker
Kubernetes

Tools

Git
Terraform
CloudFormation

Job description

We are looking for a Senior Data Engineer to join our team in Bali. In this role, you will play a key part in building and scaling production-grade web scrapers and supporting the Lead Data Engineer in technical decisions, project planning, and team coordination. You will also mentor junior and mid-level engineers while ensuring best practices are followed.

Responsibilities

Lead the development and maintenance of scalable, production-grade web scrapers.

Design, build, and maintain reusable scraper libraries and frameworks.

Define coding standards, workflows, and processes with the Lead Engineer.

Implement robust monitoring, testing, and alerting systems to ensure reliability.

Manage and optimize cloud infrastructure (AWS/GCP/Azure) for large-scale scraping workloads.

Optimize compute, storage, and proxy usage for performance and cost efficiency.

Mentor junior and mid-level engineers and review their code.

Collaborate with stakeholders to prioritize backlog items and integrate scraper outputs into pipelines.

Act as second-in-command to the Lead Data Engineer, providing technical and project support.

Must-Have Skills

Strong proficiency in Python and SQL, with hands-on web scraping experience.

Deep knowledge of frameworks such as Scrapy, Playwright, and Selenium, as well as anti-bot strategies.

Experience designing reusable libraries, coding standards, and modular scraper frameworks.

Proficiency with cloud infrastructure (AWS/GCP/Azure), including compute, storage, and security basics.

Experience with Docker & Kubernetes for containerization and orchestration.

Familiarity with CI/CD pipelines, Git workflows, and IaC tools (Terraform/CloudFormation).

Strong debugging, performance tuning, and large-scale scraper optimization skills.

Good-to-Have Skills

Workflow orchestration (Airflow, Prefect, Dagster).

Observability/monitoring tools (Prometheus, Grafana, CloudWatch, ELK).

Cost optimization in cloud and proxy-heavy workloads.

Knowledge of data governance, RBAC, and compliance.

Benefits

Competitive monthly salary.

Relocation support: one-off reimbursement for flights and moving expenses (for candidates relocating from other parts of Indonesia).

Opportunity to work in a collaborative, data-driven environment with room for growth.

Employer questions

What's your expected monthly basic salary?

Which of the following types of qualifications do you have?

How many years' experience do you have as a Data Engineer?

How would you rate your English language skills?

Which of the following programming languages are you experienced in?

How many years' experience do you have using SQL queries?
