Only native Singaporeans will be considered.
Job Description
Responsible for designing, building, and maintaining the infrastructure for data collection, storage, and processing, ensuring data is accessible and reliable for analysis.
Key Responsibilities
- Design and Build Data Pipelines: Develop and maintain scalable data pipelines that efficiently collect, transform, and store data from various sources.
- Data Management: Manage and optimize databases and data warehouses to ensure high performance and reliability.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand their data needs and provide tailored solutions.
- Data Quality Assurance: Implement processes to ensure data quality and integrity, including validation and cleaning procedures.
- Research and Development: Explore new data acquisition opportunities and technologies to enhance data processing capabilities.
- Documentation: Maintain clear documentation of data processes, architectures, and workflows to facilitate knowledge sharing and compliance.
Required Skills and Qualifications
- Minimum of a bachelor's degree in Computer Science, Information Technology, or a related field.
- Technical Proficiency: Strong experience with SQL, Python, and data processing frameworks such as Apache Spark and Hadoop (including distributions such as Cloudera).
- Data Architecture Knowledge: Understanding of data modeling, ETL (Extract, Transform, Load) processes, and data warehousing concepts.
- Problem-Solving Skills: Ability to troubleshoot data issues and optimize data workflows for efficiency.
- Collaboration and Communication: Excellent interpersonal skills to work effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
- Experience with Cloud Technologies: Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and data storage solutions.