
Associate, DATA Platform Support SRE Engineer, Technology & Operations

DBS

Indonesia

On-site

IDR 200.000.000 - 300.000.000

Full time



Job summary

A leading financial services group in Indonesia is seeking an Associate, DATA Platform Support SRE Engineer. In this pivotal role, you will ensure the stability and performance of our Data Platform, collaborating with various teams and developing essential tools. Candidates should have a minimum of 4 years of experience in data support or SRE roles, a strong grasp of data frameworks, and skills in Python and Spark SQL. Join us to support critical data-driven initiatives in a dynamic environment.

Qualifications

  • Minimum 4 years of relevant working experience.
  • Proven experience in data platform support or SRE roles.
  • Strong understanding of data ingestion, processing, and consumption frameworks.

Responsibilities

  • Collaborate with teams on various projects.
  • Develop scripts/tools to support data requirements.
  • Provide technical support for the Data Platform.

Skills

  • Python
  • Spark SQL
  • Unix Shell Scripting
  • Data governance
  • API integration
  • Communication skills

Tools

  • Airflow
  • Apache Superset
  • Sparkola
  • Cloudera Machine Learning
  • Tableau
  • Trino/Presto

Job description

Associate, DATA Platform Support SRE Engineer, Technology & Operations (WD80041)

Business Function

DBS, a leading financial services group in Asia, is seeking a highly motivated and skilled Associate, SRE Engineer for Data Platform Support. This role is crucial in ensuring the stability, performance, and reliability of our cutting-edge Data Platform (ADA) and its associated services. As a key member of our IT Applications team in Indonesia, you will play a vital role in maintaining the integrity of our data ecosystem and supporting critical data-driven initiatives.

Responsibilities
  • Collaborate with regional teams, business units, support units, and other related parties on projects.
  • Ensure user requirements on data needs are captured.
  • Develop scripts/tools to support data requirements and ingestion from BU/SU.
  • Maintain good relationships between IT and the business, manage business expectations, and negotiate compromises.
  • Provide comprehensive technical support for the Data Platform, covering various components and frameworks such as Data-as-a-Service (DaaS), Data Ingestion (Flux Framework), and Data Compute engines like Sparkola.
  • Troubleshoot and resolve data-related issues and queries, collaborating with L1/L2/L3 teams and following established Standard Operating Procedures (SOPs).
  • Work with various tools and technologies within the ecosystem, such as Airflow, Apache Superset, Celerity, Cloudera Machine Learning (CML), DALi, Datanaut Job Server, Dify, Lumen, Ray and KubeRay, SMTP Proxy Mail Solution, Sparkola, Tableau, and Trino/Presto.
  • Utilize JIRA for issue tracking and management.
  • Contribute to the continuous improvement of support processes and documentation, including SOPs and troubleshooting guides.
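As a flavour of the support tooling this role involves, here is a minimal, purely illustrative sketch of a pre-ingestion check a support engineer might script: validate that a delimited extract from a BU/SU has the expected columns and no empty mandatory fields before it reaches the ingestion framework. The schema and rules are hypothetical assumptions, not part of the actual platform.

```python
import csv
import io

# Assumed (illustrative) mandatory columns for an incoming extract.
REQUIRED_COLUMNS = {"account_id", "txn_date", "amount"}

def validate_extract(raw_text: str) -> list[str]:
    """Return a list of human-readable problems found in a CSV extract."""
    problems = []
    reader = csv.DictReader(io.StringIO(raw_text))
    # Reject the file outright if mandatory columns are absent.
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
        return problems
    # Flag rows where a mandatory field is blank (header is line 1).
    for line_no, row in enumerate(reader, start=2):
        for col in REQUIRED_COLUMNS:
            if not row[col].strip():
                problems.append(f"line {line_no}: empty {col}")
    return problems

sample = "account_id,txn_date,amount\nA1,2024-01-02,100\nA2,,250\n"
print(validate_extract(sample))  # ['line 3: empty txn_date']
```

In practice such checks would sit in front of the ingestion framework (e.g. triggered from a scheduler such as Airflow) and write their findings to a ticket or alert rather than stdout.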

Requirements
  • Minimum 4 years of relevant working experience.
  • Understanding of key data concepts, dataset development processes, and the techniques and considerations involved in machine learning and data analytics.
  • Training and relevant experience in one or more of the following areas:
    • Statistical modelling tools such as Python and Spark SQL.
    • Data manipulation using scripting languages such as Python, or using ETL tools.
    • End-to-end analytics architecture, preferably with some working knowledge of the big data stack.
    • Unix shell scripting and knowledge of job schedulers.
    • Excellent understanding of technology life cycles and the concepts and practices required to build big data solutions.
    • Ability to build re-usable data assets or features that downstream data science models can read and use.
  • Proven experience in data platform support or SRE roles, preferably within a large enterprise environment.
  • Strong understanding of data ingestion, processing, and consumption frameworks.
  • Familiarity with data governance principles and metadata management, ideally with experience using Collibra.
  • Hands-on experience with big data technologies and tools mentioned in the responsibilities (e.g., Sparkola, Trino, Airflow).
  • Ability to troubleshoot complex data issues and perform root cause analysis.
  • Experience with API integration and understanding of REST APIs.
  • Knowledge of incident management and problem resolution processes.
  • Excellent communication and interpersonal skills to collaborate with various stakeholders.
  • Ability to work in a fast-paced and dynamic environment, supporting 24/7 operations where required.
  • Experience with or understanding of AI/ML platforms and related metric monitoring is a plus.
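To illustrate the REST API integration skill listed above, here is a small stdlib-only sketch: composing a query URL for a made-up platform health endpoint and extracting failed jobs from its JSON response. The endpoint, field names, and statuses are hypothetical assumptions for illustration only, not a real DBS or ADA platform API.

```python
import json
from urllib.parse import urlencode

# Hypothetical endpoint; a real integration would use the platform's own API.
BASE_URL = "https://data-platform.example.com/api/v1/jobs"

def build_status_url(team: str, state: str) -> str:
    """Compose a query URL for jobs filtered by team and state."""
    return f"{BASE_URL}?{urlencode({'team': team, 'state': state})}"

def failed_job_names(payload: str) -> list[str]:
    """Extract names of failed jobs from a JSON response body."""
    jobs = json.loads(payload).get("jobs", [])
    return [j["name"] for j in jobs if j.get("state") == "failed"]

# Canned response standing in for an HTTP call (e.g. via urllib.request):
response_body = json.dumps({
    "jobs": [
        {"name": "daily_ingest", "state": "succeeded"},
        {"name": "flux_load", "state": "failed"},
    ]
})
print(build_status_url("ada-support", "failed"))
print(failed_job_names(response_body))  # ['flux_load']
```

Separating URL construction and response parsing from the HTTP call itself keeps the logic testable without network access, a common pattern in support tooling.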