Senior Network Consultants / Network Consultants - July 2020

ITCAN

Singapore

On-site

SGD 80,000 - 120,000

Full time


Job summary

A technology solutions company in Singapore is seeking an experienced Data Architect to lead implementation teams for large-scale Big Data solutions. This role involves translating business requirements into technical solutions, analyzing existing processes for data opportunities, and mentoring peers. Ideal candidates will have extensive experience in Hadoop ecosystems and strong data manipulation skills. Join us for a dynamic work environment with opportunities for professional growth.

Qualifications

  • More than 2 years of experience in architecting large-scale Cloudera/Hortonworks platforms.
  • Over 5 years of experience managing teams of 5 or more.
  • Experience implementing large-scale Hadoop clusters exceeding 100 TB.

Responsibilities

  • Translate business needs into technical solutions leveraging business acumen.
  • Lead implementation teams to deliver Big Data solutions.
  • Analyze current practices and identify business opportunities.

Skills

Data manipulation languages (Spark, Scala, Impala)
Hadoop ecosystem services (HBase, Kafka, Spark)
Infrastructure management (Linux shells, YARN)
Data modelling and warehousing concepts

Tools

Cloudera/Hortonworks Data Platform
Apache Ambari
AWS Lambda
Azure Data Lake

Job Description
  • Translate business requirements to technical solutions leveraging strong business acumen.
  • Lead implementation teams in the delivery of large-scale enterprise Big Data solutions in client data centres and the cloud.
  • Work in interdisciplinary teams that combine technical, business, and data science competencies, delivering work under waterfall or agile software development lifecycle methodologies.
  • Develop new solutions and accelerators that help to deploy our Data Analytics services at scale.
  • Help develop playbooks, standardised frameworks, practice guides, and other artefacts that allow our practitioners to perform their work in a structured, standard manner.
  • Identify risks and assumptions, and develop pricing estimates for the Data & Analytics solutions.
  • Provide solution oversight to delivery architects and teams.
  • Build and maintain relationships with principal vendors to ensure price guarantees and other enablement benefits.
  • Design and implement relevant data models in Big Data and NoSQL platforms.
  • Architect data pipelines that bring information from source systems, harmonising and cleansing data to support analytics initiatives for core business metrics and performance trends (a minimal pipeline sketch follows this list).
  • Work closely with project managers and technical leads to provide regular status reporting, and support them in refining issues / problem statements and proposing / evaluating relevant analytics solutions.
  • Bring your experience and ideas to effective and innovative engineering, design and strategy.
  • Analyse current business practices, processes, and procedures, and identify future business opportunities for leveraging Data & Analytics solutions on various platforms.
  • Develop solution proposals that provide details of project scope, approach, deliverables and project timeline.
  • Develop pricing for the solution proposals.
  • The range of accountability, responsibility and autonomy will depend on your experience and seniority, including:
    • Contributing to our internal networks and special interest groups.
    • Mentoring to upskill peers and juniors.
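
As a hedged illustration of the data pipeline responsibility above, here is a minimal batch sketch in Spark/Scala that reads a raw source extract, cleanses and harmonises it, and publishes it for analytics. The paths, column names, and file formats are hypothetical assumptions for illustration only, not details of this role.

    // Hypothetical batch pipeline: ingest a raw source extract, cleanse
    // it, and publish a harmonised, partitioned table for analytics.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object HarmonisePipeline {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("harmonise-pipeline")
          .getOrCreate()

        // Ingest raw CSV extracts from the source system (path and
        // columns such as order_id / order_date are assumptions).
        val raw = spark.read.option("header", "true").csv("hdfs:///landing/orders/")

        // Cleanse: drop rows missing the key, normalise types, deduplicate.
        val cleansed = raw
          .filter(col("order_id").isNotNull)
          .withColumn("amount", col("amount").cast("decimal(18,2)"))
          .dropDuplicates("order_id")

        // Publish as partitioned Parquet for downstream analytics tools.
        cleansed.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("hdfs:///warehouse/orders_harmonised/")

        spark.stop()
      }
    }
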
Essential Requirements
  • More than 2 years of experience in architecting, designing and implementing a large-scale Cloudera / Hortonworks Data Platform.
  • Experienced IT professional with more than 5 years of experience leading or managing teams of 5 or more.
  • Implemented large-scale Hadoop clusters of more than 10 data nodes, with total volume in excess of 100 TB and daily volume of 5 GB or more.
  • Certified in, and with working experience implementing, 3 or more of the following services in the Hadoop ecosystem:
    • HBase, Apache Kafka, NiFi, Spark
    • Atlas, Ambari / Cloudera Manager, Ranger
    • Data Flow Studio or Cloudera Data Science Workbench is a plus
  • Strong knowledge of data manipulation languages such as Spark, Scala, Impala, Hive SQL, and Apache NiFi, necessary to build and maintain complex queries and streaming, real-time data pipelines (a minimal streaming sketch follows this list).
  • Good appreciation of, and operational experience with, infrastructure management and administration tools and skill sets (e.g. Linux shells, Apache Ambari, YARN) to build scalable and resilient data platforms.
  • Data modelling and architecting skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling such as OLAP.
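
To make the streaming requirement concrete, here is a minimal Spark Structured Streaming sketch in Scala that consumes a Kafka topic and maintains running counts per key. The broker address, topic name, and console sink are hypothetical assumptions; a real deployment would write to a durable sink such as HBase or Kafka.

    // Hypothetical streaming pipeline: consume events from Kafka and keep
    // a continuously updated count per key with Structured Streaming.
    // Requires the spark-sql-kafka-0-10 connector on the classpath.
    import org.apache.spark.sql.SparkSession

    object StreamingCounts {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("streaming-counts")
          .getOrCreate()

        // Subscribe to the (hypothetical) "events" topic.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()

        // Kafka keys/values arrive as bytes; cast the key and count per key.
        val counts = events
          .selectExpr("CAST(key AS STRING) AS key")
          .groupBy("key")
          .count()

        // Console sink for illustration; checkpoint location is an assumption.
        val query = counts.writeStream
          .outputMode("complete")
          .format("console")
          .option("checkpointLocation", "hdfs:///checkpoints/streaming-counts/")
          .start()

        query.awaitTermination()
      }
    }
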
Nice to Have
  • Experience with other aspects of data management, such as data governance, metadata management, archival, and data lifecycle management.
  • Large scale data loading experience moving enterprise or operational data from source systems to new applications or data analytics solutions.
  • Experience in leveraging cloud-based data analytics platforms such as:
    • AWS serverless architecture with Lambda, DynamoDB, EMR
    • Azure Data Lake, HDInsight
    • GCP BigQuery / Bigtable, Cloud Dataprep / Dataflow / Dataproc