
Data Software Engineer

Rippling

United States

Remote

USD 121,000 - 144,000

Full time

Today

Job summary

A tech-driven company is seeking a Data Software Engineer to build and scale the data backbone of its platform. You will design ETL pipelines and manage data lakes using Google Cloud technologies. Ideal candidates have 3+ years of experience in software engineering, strong Python skills, and familiarity with GCP services. This remote-friendly role offers a competitive salary and opportunities for professional growth.

Benefits

Competitive salary
Benefits package
Professional development support

Qualifications

  • 3+ years of professional software engineering experience, primarily in data engineering or backend systems.
  • Strong proficiency in Python for data processing and scripting.
  • Hands-on experience with GCP, specifically DataFlow, BigQuery, and Cloud Storage.

Responsibilities

  • Architect, develop, and maintain scalable ETL/ELT pipelines using Google Cloud DataFlow.
  • Contribute to backend services and data-centric libraries in Python.
  • Ingest and process data from third-party services, ensuring data integrity.
  • Contribute to the technical direction of our data domain.

Skills

Python
Google Cloud Platform
DataFlow
BigQuery
TypeScript
Node.js
Terraform
Data modeling
Database design
CI/CD

Education

Bachelor's or Master's degree in Computer Science, Engineering, or related field

Tools

Git

Job description
Data Software Engineer

Location: Remote (Canada) | Job Type: Full Time

Position Summary

We are seeking a talented Data Software Engineer to build, scale, and own the data backbone of our platform. You will be responsible for designing and implementing robust ETL pipelines, managing our data lakes, and creating the libraries that power our analytics, compliance, and product features. This role is critical to our success, as you will ensure that high-quality data is available, reliable, and accessible to drive business decisions and power our core services.

You will work with a modern tech stack on Google Cloud Platform, including DataFlow and BigQuery, and will have the opportunity to solve complex challenges in the FinTech and Web3 space.

Key Responsibilities
  • Design and Build Data Pipelines: Architect, develop, and maintain scalable and reliable ETL/ELT pipelines using Google Cloud DataFlow, Python, and BigQuery to process large volumes of structured and semi-structured data.
  • Backend & Library Development: Contribute to the development of backend services and data-centric libraries in Python and TypeScript/Node.js, ensuring they are well-tested, performant, and maintainable.
  • Platform Integration: Ingest and process data from critical third-party services, including Persona (KYC/KYB), Sardine (Fraud/Compliance), and Stytch (Authentication), ensuring data integrity and availability.
  • Data Architecture & Strategy: Contribute to the technical direction of our data domain, including event cataloging, schema design and evolution, and data governance practices.
  • System Scalability & Reliability: Gain a deep understanding of our cloud architecture to ensure the high availability and scalability of our APIs, data processing reactors, and ledger systems.
  • Mentorship & Collaboration: Act as a technical mentor for junior engineers and a subject-matter expert for business stakeholders, helping them effectively consume and interpret platform data.
  • Feature Delivery: Consistently deliver high-quality features and associated tests in alignment with our product roadmap.
What You'll Bring (Required Qualifications)
  • 3+ years of professional software engineering experience, with a significant focus on data engineering or backend systems.
  • Strong proficiency in Python for data processing, scripting, and ETL development.
  • Hands-on experience with Google Cloud Platform (GCP), specifically with DataFlow, BigQuery, and Cloud Storage.
  • Experience building or contributing to backend services and APIs using TypeScript and Node.js.
  • Solid understanding of data modeling, database design (SQL and NoSQL), and data warehousing concepts.
  • Experience with Infrastructure as Code (IaC) tools, preferably Terraform.
  • A strong foundation in software development best practices, including version control (Git), automated testing, and CI/CD.
  • Excellent problem-solving skills and the ability to work independently in a fast-paced environment.
Nice to Haves (Preferred Qualifications)
  • Familiarity with the FinTech, Blockchain, or Web3 ecosystems and concepts like smart contracts.
  • Direct experience integrating with APIs for identity verification (e.g., Persona), fraud detection (e.g., Sardine), or authentication (e.g., Stytch).
  • Experience with other GCP services such as Pub/Sub, Cloud Functions, and Composer.
  • Knowledge of containerization and orchestration technologies like Docker and Kubernetes.
  • Previous experience mentoring junior engineers or acting as a tech lead on projects.
  • A Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Why Join Us?
  • Be part of a high-impact team.
  • Work with cutting-edge technology and regulatory frameworks.
  • Work with a diverse, global team in a remote-friendly environment.
  • Competitive salary, benefits, and professional development support.

The pay range for this role is:

170,000 - 200,000 CAD per year (Remote, Canada)
