Senior Big Data Engineer (Java Focus)

Global Relay

City of London

On-site

GBP 65,000 - 85,000

Full time

24 days ago

Job summary

A leading technology firm in the City of London is seeking an experienced Java Developer specialising in big data solutions. You will develop ETL/ELT processes, design microservices, and work in an agile team environment. The ideal candidate will have at least 5 years of Java experience and a passion for data processing. This role offers a dynamic work environment, competitive compensation, and opportunities for career advancement.

Benefits

Competitive salary
Career development opportunities
Inclusive work environment

Qualifications

  • 5 years of Java development experience in an Agile environment.
  • Experience in developing ETL/ELT processes.
  • Understanding of unstructured, semi-structured, and structured data processing.

Responsibilities

  • Develop ETL, ELT and streaming processes using big data frameworks in Java.
  • Design and implement microservices within an agile development team.
  • Write tests for Java code.

Skills

Java development
Big data solutions
ETL processes
Agile methodology
SQL
Problem Solving
Team player

Tools

Kubernetes
Apache Spark
Docker
Hadoop
Kafka
CI/CD

Job description

Who we are:

For over 20 years, Global Relay has set the standard in enterprise information archiving with industry-leading cloud archiving, surveillance, eDiscovery, and analytics solutions. We securely capture and preserve the communications data of the world’s most highly regulated firms, giving them greater visibility and control over their information and ensuring compliance with stringent regulations.

Though we offer competitive compensation and benefits and all the other perks one would expect from an established company, we are not your typical technology company. Global Relay is a career-building company. A place for big ideas. New challenges. Groundbreaking innovation. It’s a place where you can genuinely make an impact – and be recognized for it.

We believe great businesses thrive on diversity, inclusion, and the contributions of all employees. To that end, we recruit candidates from different backgrounds and foster a work environment that encourages employees to collaborate and learn from each other, completely free of barriers.

Your Role:

Joining the Reporting product line, you will work as a member of a highly focused team. This team specialises in Java-based data engineering, designing and delivering large-scale ELT/ETL workflows on a data lakehouse platform. You will be working with modern big data technologies to move, transform, and optimise data for high-performance analytics and regulatory reporting. The environment encourages autonomy, problem-solving, and system-level thinking. If you’re passionate about clean, well-tested, performant code and enjoy working on complex data pipelines at scale, you’ll thrive here.
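
To make the day-to-day work concrete, below is a minimal sketch of the kind of batch ETL job the role describes, using Apache Spark's Java API. The paths, column names, and aggregation are illustrative assumptions for this posting, not Global Relay's actual pipeline.

  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;
  import org.apache.spark.sql.SparkSession;
  import static org.apache.spark.sql.functions.*;

  public class DailyMessageVolumeJob {
      public static void main(String[] args) {
          SparkSession spark = SparkSession.builder()
                  .appName("daily-message-volume-etl")
                  .getOrCreate();

          // Extract: read raw events from a landing zone (path is hypothetical).
          Dataset<Row> raw = spark.read().parquet("s3a://landing/messages/");

          // Transform: derive an event date and count messages per tenant per day.
          Dataset<Row> daily = raw
                  .withColumn("event_date", to_date(col("event_ts")))
                  .groupBy(col("tenant_id"), col("event_date"))
                  .agg(count(lit(1)).alias("message_count"));

          // Load: write partitioned output for downstream reporting queries.
          daily.write()
                  .mode("overwrite")
                  .partitionBy("event_date")
                  .parquet("s3a://warehouse/daily_message_volume/");

          spark.stop();
      }
  }

In production, a job like this would typically be scheduled by an orchestrator such as Airflow, which appears in the stack below.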

Tech Stack:
  • Microservices Container Platforms (Kubernetes, CRC, Docker)
  • Big Data Technologies (Apache Spark, Flink, Hadoop, Airflow, Trino, Iceberg)
  • Dependency injection frameworks (Spring)
  • Observability (Loki/Grafana)
  • Large scale data processing (Kafka; a minimal consumer sketch follows this list)
  • CI/CD Build tools (Maven, Git, Jenkins, Ansible)
  • NoSQL DBs (Cassandra, Zookeeper, HBase)
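
On the streaming side, here is an equally minimal Kafka consumer in Java. The broker address, consumer group, and topic name are placeholders assumed for illustration; a real ingestion service would deserialise typed events and forward them into the pipeline rather than print them.

  import java.time.Duration;
  import java.util.List;
  import java.util.Properties;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;

  public class MessageStreamReader {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put("bootstrap.servers", "localhost:9092"); // illustrative address
          props.put("group.id", "reporting-etl");           // hypothetical consumer group
          props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
          props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
              consumer.subscribe(List.of("messages.raw")); // hypothetical topic name
              while (true) {
                  // Poll for new events and process each record as it arrives.
                  ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                  for (ConsumerRecord<String, String> record : records) {
                      System.out.printf("key=%s value=%s%n", record.key(), record.value());
                  }
              }
          }
      }
  }
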
Your Responsibilities:
  • Develop ETL, ELT and streaming processes using big data frameworks primarily in Java
  • Design, implement, and provide architectural guidance in deploying microservices as part of an agile development team
  • Write unit and integration tests for your Java code (a short example follows this list)
  • Collaborate with testers in development of functional test cases
  • Develop deployment systems for Java-based systems
  • Collaborate with product owners on user story generation and refinement
  • Monitor and support the operation of production systems
  • Participate in knowledge sharing activities with colleagues
  • Participate in pair programming and peer reviews
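
As referenced above, here is a minimal JUnit 5 sketch of the kind of unit test this role involves. The helper under test is hypothetical and exists only for this example.

  import static org.junit.jupiter.api.Assertions.assertEquals;
  import org.junit.jupiter.api.Test;

  class TenantIdNormaliserTest {

      // Hypothetical helper under test: trims and lower-cases a tenant id.
      static String normalise(String raw) {
          return raw == null ? "" : raw.trim().toLowerCase();
      }

      @Test
      void normalisesWhitespaceAndCase() {
          assertEquals("acme-corp", normalise("  ACME-Corp "));
      }

      @Test
      void handlesNullInput() {
          assertEquals("", normalise(null));
      }
  }
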
About you:
Required Experience:
  • Minimum 5 years of Java development experience in an Agile environment, building scalable applications and services with a focus on big data solutions and analytics
  • 3+ years of experience in developing ETL/ELT processes using relevant technologies and tools.
  • Experienced in working with data lakes and data warehouse platforms for both batch and streaming data sources.
  • Experience with ANSI SQL or other flavours of SQL
  • Experience of unstructured, semi-structured and structured data processing.
  • A good understanding of ETL/ELT principles, best practices, and patterns.
  • Experienced in some big data technologies such as Hadoop, Spark, and Flink
  • Experience in web services technologies
  • Experience in Test Driven Development
  • Experience in CI/CD

Attributes:
  • Good communication skills
  • Problem Solving
  • Self-starter
  • Team player

What you can expect:

At Global Relay, there’s no ceiling to what you can achieve. It’s the land of opportunity for the energetic, the intelligent, the driven. You’ll receive the mentoring, coaching, and support you need to reach your career goals. You’ll be part of a culture that breeds creativity and rewards perseverance and hard work. And you’ll be working alongside smart, talented individuals from diverse backgrounds, with complementary knowledge and skills.

Global Relay is an equal-opportunity employer committed to diversity, equity, and inclusion.

We seek to ensure reasonable adjustments, accommodations, and personal time are tailored to meet the unique needs of every individual.

To learn more about our business, culture, and community involvement, visit www.globalrelay.com.
