Responsibilities
A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance, scalability and efficiency, reliability and fidelity, and flexibility and portability. The main mission of a Data Architect is to turn raw data into information, creating insight and business value.
- Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform
- Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business
- Work with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management
- Participate in project planning: identify milestones, deliverables, and resource requirements; track activities and task execution
Required Skills
- Bachelor’s degree in Computer Science, Computer Engineering, or a related field
- 5-10 years’ experience in a data engineering role
- Expertise in software engineering using Scala, Java, or Python
- Advanced SQL skills, preferably with BigQuery
- Good knowledge of Google-managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion
- Experience using workflow management tools
- Good understanding of GCP architecture for batch and streaming workloads
- Strong knowledge of data technologies and data modeling
- Expertise in building modern, cloud-native data pipelines and operations with an ELT philosophy
- Experience with data migration and data warehousing
- An intuitive sense of how to organize, normalize, and store complex data, serving both ETL processes and end users
- Passion for mapping and designing the ingestion and transformation of data from multiple sources to create a cohesive data asset
- Good understanding of developer tools, CI/CD, etc.
- Excellent communication skills and empathy for end users and internal customers
Nice-to-have:
- Experience with the big data ecosystem: Hadoop, Hive, HDFS, HBase
- Experience with Agile methodologies and DevOps principles