Responsibilities
A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems with a focus on security, compliance, scalability, efficiency, reliability, fidelity, flexibility, and portability. The main mission of a Data Architect is to turn raw data into information, creating insights and business value.
- Build large-scale batch and real-time data pipelines using data processing frameworks on the GCP cloud platform (see the pipeline sketch after this list).
- Apply an analytical, data-driven approach to understand rapidly changing business needs.
- Collaborate with the team to evaluate business needs and priorities, liaise with key business partners, and address team requirements related to data systems and management.
- Participate in project planning by identifying milestones, deliverables, resource requirements, and tracking activities and task execution.
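For illustration only, a batch pipeline of the kind described above might look like the following minimal Apache Beam sketch. The project, bucket, and table names are placeholders, not an actual NTT Data pipeline, and the Dataflow runner is assumed as the execution environment.

```python
# Minimal Apache Beam batch pipeline sketch: read CSV rows from Cloud Storage,
# parse them, and load the result into BigQuery via the Dataflow runner.
# Project, bucket, and table identifiers are illustrative placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_order(line):
    """Turn a raw CSV line into a BigQuery-compatible dict."""
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DataflowRunner",          # or "DirectRunner" for local testing
    project="example-project",        # placeholder GCP project
    region="europe-west1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/orders/*.csv")
        | "ParseRows" >> beam.Map(parse_order)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.orders",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```

The same pipeline code can run in streaming mode against Pub/Sub sources, which is why Beam on Dataflow is a common choice for the batch and real-time work this role covers.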
Required Skills
- Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field.
- 5-10 years of experience in a data engineering role.
- Expertise as a software engineer using Scala, Java, or Python.
- Advanced SQL skills, preferably with BigQuery (see the query sketch after this list).
- Good knowledge of Google Managed Services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion.
- Experience with workflow management tools.
- Strong understanding of GCP architecture for batch and streaming data processing.
- Extensive knowledge of data technologies and data modeling.
- Experience in building modern, cloud-native data pipelines and operations following an ELT philosophy.
- Experience with data migration and data warehouse solutions.
- Ability to organize, normalize, and store complex data to enable both ETL processes and end-user access.
- Passion for designing ingestion and transformation processes from multiple data sources to create cohesive data assets.
- Good understanding of developer tools, CI/CD pipelines, etc.
- Excellent communication skills and empathy towards end users and internal customers.
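As a small illustration of the BigQuery-centric SQL work mentioned above, the sketch below runs an analytic window-function query through the official google-cloud-bigquery client. The dataset, table, and column names are hypothetical examples, not a real schema.

```python
# BigQuery sketch: run an analytic (window-function) query with the
# official google-cloud-bigquery client. Dataset and column names are
# hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

query = """
    SELECT
      customer_id,
      order_date,
      SUM(amount) OVER (
        PARTITION BY customer_id
        ORDER BY order_date
      ) AS running_total
    FROM `example-project.analytics.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
"""

for row in client.query(query).result():
    print(row.customer_id, row.order_date, row.running_total)
```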
Nice-to-have:
- Experience with Big Data ecosystem tools such as Hadoop, Hive, HDFS, HBase.
- Experience with Agile methodologies and DevOps principles.