Software Engineer - Data Infrastructure - Kafka

Canonical

Remote

SAR 200,000 - 300,000

Full time

Job summary

A leading tech firm is seeking a Software Engineer specializing in Data Infrastructure. This role involves developing automation solutions for data platforms, utilizing Python and expertise in distributed systems like Kafka. The ideal candidate will collaborate with a global team, focusing on infrastructure features rather than data processing. The firm offers a fully remote work environment, a substantial personal development budget, and annual compensation reviews.

Benefits

Fully remote working environment
Personal learning and development budget of USD 2,000
Annual compensation review
Opportunity to travel to new locations to meet colleagues twice a year

Qualifications

  • Proven hands-on experience in software development using Python.
  • Experience in distributed systems like Kafka and Spark.
  • Willingness to travel up to 4 times a year.

Responsibilities

  • Collaborate proactively with a distributed team.
  • Write high-quality, idiomatic Python code to create new features.
  • Manage and integrate Big Data platforms at scale.

Skills

Software development using Python
Distributed systems (Kafka, Spark)
Strong collaboration

Education

Bachelor's degree in Computer Science or a STEM field

Tools

Linux systems administration
Kubernetes clusters
SQL databases (MySQL, PostgreSQL, Oracle)
NoSQL databases (MongoDB, Redis, ElasticSearch)

Job description

Software Engineer - Data Infrastructure - Kafka

Canonical is building a comprehensive automation suite to provide multi-cloud and on-premise data solutions for the enterprise. The data platform team is a collaborative team that develops managed solutions for a full range of data stores and data technologies, spanning from big data, through NoSQL, cache-layer capabilities, and analytics, all the way to structured SQL engines (similar to the Amazon RDS approach). We face the interesting challenges of fault-tolerant, mission-critical distributed systems and intend to deliver the world's best automation solution for managed data platforms.

Location: This role can be filled in time zones across Europe, the Middle East and Africa.

What your day will look like

The data platform team is responsible for the automation of data platform operations, with the mission of managing and integrating Big Data platforms at scale. This includes ensuring fault-tolerant replication, TLS, installation, backups and much more; the team also provides domain-specific expertise on the actual data systems to other teams within Canonical. This role is focused on the creation and automation of infrastructure features of data platforms, not on analysing and/or processing the data in them.

  • Collaborate proactively with a distributed team
  • Write high‑quality, idiomatic Python code to create new features
  • Debug issues and interact with upstream communities publicly
  • Work with helpful and talented engineers including experts in many fields
  • Discuss ideas and collaborate on finding good solutions
  • Work from home with global travel for 2 to 4 weeks per year for internal and external events

What we are looking for in you

  • Proven hands‑on experience in software development using Python
  • Proven hands‑on experience in distributed systems, such as Kafka and Spark
  • A Bachelor's degree or equivalent in Computer Science, STEM, or a similar field
  • Willingness to travel up to 4 times a year for internal events

Additional skills that you might also bring

  • Experience operating and managing other data platform technologies, SQL (MySQL, PostgreSQL, Oracle, etc.) and/or NoSQL (MongoDB, Redis, ElasticSearch, etc.), with DBA-level expertise
  • Experience with Linux systems administration, package management, and infrastructure operations
  • Experience with the public cloud or a private cloud solution like OpenStack
  • Experience with operating Kubernetes clusters and a belief that it can be used for serious persistent data services

What we offer you

Your base pay will depend on various factors including your geographical location, level of experience, knowledge and skills. In addition to the benefits above, certain roles are also eligible for additional benefits and rewards including annual bonuses and sales incentives based on revenue or utilisation. Our compensation philosophy is to ensure equity right across our global workforce.

  • Fully remote working environment - we've been working remotely since 2004!
  • Personal learning and development budget of USD 2,000 per annum
  • Annual compensation review
  • Recognition rewards
  • Annual holiday leave
  • Parental leave
  • Employee Assistance Programme
  • Opportunity to travel to new locations to meet colleagues twice a year
  • Priority Pass for travel, and travel upgrades for long-haul company events

About Canonical

Canonical is a pioneering tech firm that is at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open source projects and the platform for AI, IoT and the cloud, we are changing the world on a daily basis. Canonical is an equal‑opportunity employer. We are proud to foster a workplace free from discrimination. Whatever your identity, we will give your application fair consideration.
