Data Engineer

Compunnel, Inc.

Harrisburg (Dauphin County)

On-site

USD 90,000 - 130,000

Full time

30+ days ago

Job summary

A leading company is seeking a skilled Data Engineer to develop cloud-native data processing systems. You will work with Google Cloud Platform, Python, and Terraform to optimize identity data synchronization. This role includes designing modular applications, integrating with APIs, and ensuring compliance with security practices.

Skills

Python
GCP
Terraform
SQL
REST APIs

Education

Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field

Tools

Terraform
BigQuery
Cloud Run
Cloud Functions
Cloud Composer
Pub/Sub
Docker
Git

Job description

We are seeking a highly skilled Data Engineer to design and implement a cloud-native data processing and API integration system.

This role focuses on ingesting identity data, detecting record-level changes, and synchronizing user metadata to downstream systems via APIs.

The ideal candidate will have strong experience with Google Cloud Platform (GCP), Python, and Terraform, and a passion for building scalable, fault-tolerant data solutions.

Key Responsibilities

  • Design and develop modular Python applications to process identity data and synchronize it with target platforms.
  • Stage identity metadata in BigQuery and implement change detection logic (create/update/delete).
  • Integrate with RESTful APIs using secure authentication methods (OAuth2, token-based).
  • Orchestrate workflows using GCP services such as Pub/Sub, Cloud Composer, and Cloud Run.
  • Deploy and manage infrastructure using Terraform with a focus on security and configuration as code.
  • Implement observability features including logging, retry logic, and incident handling.
  • Build automated test coverage for critical data processing and API logic.
  • Ensure compliance with enterprise security and change control practices.
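
The change-detection responsibility above (create/update/delete) could be sketched roughly as follows. This is an illustrative outline only, not the employer's actual implementation; the record shape (a unique `id` key mapped to a metadata dict) is an assumption, and in practice the snapshots would be staged in and read from BigQuery.

```python
# Illustrative sketch of record-level change detection between two
# snapshots of identity metadata. The record shape (unique id -> metadata
# dict) is an assumption; real snapshots would be staged in BigQuery.

def detect_changes(previous, current):
    """Classify records as created, updated, or deleted.

    previous/current: dicts mapping record id -> metadata dict.
    Returns a dict with sorted "create", "update", and "delete" id lists.
    """
    prev_ids = set(previous)
    curr_ids = set(current)
    return {
        "create": sorted(curr_ids - prev_ids),
        "delete": sorted(prev_ids - curr_ids),
        "update": sorted(
            rid for rid in prev_ids & curr_ids
            if previous[rid] != current[rid]
        ),
    }


if __name__ == "__main__":
    prev = {"u1": {"role": "admin"}, "u2": {"role": "viewer"}}
    curr = {"u1": {"role": "owner"}, "u3": {"role": "viewer"}}
    print(detect_changes(prev, curr))
    # {'create': ['u3'], 'delete': ['u2'], 'update': ['u1']}
```

The three resulting id lists would then drive the downstream API synchronization calls described above.
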

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 6+ years of experience in backend development or data engineering, with a focus on identity, security, or metadata systems.
  • Strong proficiency in Python for data processing and backend development.
  • Advanced experience with GCP services: BigQuery, Cloud Run, Cloud Functions, Cloud Composer, Pub/Sub, Cloud Storage, Secret Manager, Cloud Scheduler.
  • Experience with REST APIs and secure authentication (OAuth2, token-based).
  • Proficiency in Terraform for infrastructure automation.
  • Strong SQL skills for data transformation and validation.
  • Familiarity with CI/CD, Docker, and Git workflows.
  • Experience working with structured metadata, user roles, and directory-style data.
  • Strong documentation, debugging, and independent problem-solving skills.
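
The retry logic and logging called out under observability might look something like the sketch below. The attempt count, backoff parameters, and the idea of wrapping a generic callable (rather than a specific HTTP client) are all assumptions for illustration, not the employer's configuration.

```python
import logging
import time

# Illustrative sketch of retry-with-backoff for an API call.
# Attempt count, delays, and the generic callable are assumptions.

def call_with_retry(func, *, attempts=3, base_delay=0.1, sleep=time.sleep):
    """Call func(), retrying with exponential backoff on exception.

    Logs each failure; re-raises the last exception once attempts
    are exhausted. `sleep` is injectable to keep tests fast.
    """
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except Exception as exc:
            logging.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            sleep(base_delay * 2 ** (attempt - 1))
```

In use, `func` would be a closure around an authenticated REST call (e.g., a request carrying an OAuth2 bearer token); injecting `sleep` keeps the backoff behavior unit-testable.
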

Preferred Qualifications

  • Experience integrating with IAM or identity systems (e.g., LDAP, Okta).
  • Background in regulated or high-security environments.
  • Experience handling large-scale user datasets (millions of records).
  • Familiarity with hybrid data processing (batch + streaming).
  • GCP certifications.

Physical Demands

  • Ability to perform essential job functions in accordance with ADA and other applicable standards.
  • Primarily sedentary work with occasional movement for meetings and collaboration.
  • Frequent use of computer systems including keyboard, mouse, and monitor.