A leading technology company is seeking Big Data Architects specializing in Apache Hadoop to join its London team. Candidates should have platform engineering experience and a record of open-source contributions, with a focus on designing and building robust Big Data solutions in hybrid environments. The role offers the opportunity to work with cutting-edge technologies while contributing to significant open-source projects.
Client: HCLTech
Location: London (City of London), United Kingdom
Job Category: Other
EU work permit required: Yes
HCLTech is a global technology company, home to 219,000+ people across 54 countries, delivering industry-leading capabilities centered on digital, engineering, and cloud, powered by a broad portfolio of technology services and products. We work with clients across all major verticals, providing industry solutions for Financial Services, Manufacturing, Life Sciences and Healthcare, Technology and Services, Telecom and Media, Retail and CPG, and Public Services. Consolidated revenues are over $13 billion.
Location: London
Skill: Apache Hadoop
We are looking for open-source contributors to Apache projects who have an in-depth understanding of the code behind the Apache ecosystem, experience with Cloudera or similar distributions, and deep knowledge of the big data tech stack.
Requirements:
Job description: The Apache Hadoop project requires up to 3 individuals experienced in designing and building Big Data platforms with Apache Hadoop and supporting applications in both cloud and on-premises environments. These individuals should be open-source contributors with an in-depth understanding of the Apache ecosystem, capable of identifying and fixing complex issues during delivery. They will also support developers in migrating and debugging critical applications such as RiskFinder.