GCP Data Engineer

Onebridge

Indianapolis (IN)

On-site

USD 60,000 - 80,000

Full time

30+ days ago

Job summary

An innovative consulting firm is seeking a talented GCP Data Engineer to join its dynamic team. In this role, you will design and develop scalable data solutions that drive data-driven decision-making across the organization. Your expertise in Google Cloud Platform and big data processing frameworks will be crucial in optimizing data pipelines and ensuring data integrity. You will collaborate with cross-functional teams to create robust data infrastructures that support analytics and business intelligence. This is a fantastic opportunity to make a significant impact at a company recognized as one of the best places to work in Indianapolis.

Qualifications

  • 5+ years of experience as a Data Engineer with expertise in GCP.
  • Proficient in SQL and Python for developing data pipelines.

Responsibilities

  • Design and maintain data pipelines using GCP services.
  • Collaborate with teams to translate business needs into data solutions.

Skills

Google Cloud Platform (GCP)
SQL
Python
ETL/ELT processes
Data governance
Apache Beam
Spark

Tools

BigQuery
Dataflow
Cloud Storage
Pub/Sub
Cloud Composer
Terraform

Job description

Onebridge, a Marlabs Company, is an AI and data analytics consulting firm that strives to improve outcomes for the people we serve through data and technology. We have served some of the largest healthcare, life sciences, manufacturing, financial services, and government entities in the U.S. since 2005. We have an exciting opportunity for a highly skilled GCP Data Engineer to join an innovative and dynamic group of professionals at a company rated among the top “Best Places to Work” in Indianapolis since 2015.

GCP Data Engineer | About You

As a GCP Data Engineer, you are responsible for designing, developing, and maintaining scalable data solutions that empower data-driven decision-making across the organization. With extensive experience in GCP, you specialize in optimizing data pipelines, working with big data processing frameworks, and ensuring data integrity and availability. You thrive in fast-paced environments and excel at solving complex data challenges using innovative cloud architectures. Your ability to collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders ensures that your solutions meet business needs. You are committed to building reliable, efficient, and secure data systems that support analytics and business intelligence.

GCP Data Engineer | Day-to-Day

  • Design, develop, and maintain robust data pipelines leveraging GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage (see the pipeline sketch after this list).
  • Optimize and scale large-scale ETL workflows to ensure high performance, reliability, and efficiency in data processing.
  • Implement data modeling, warehousing solutions, and best practices for data organization and accessibility using BigQuery.
  • Ensure data quality and integrity by developing rigorous testing, validation, and monitoring mechanisms across all data workflows.
  • Collaborate effectively with data scientists, analysts, and application teams to translate business requirements into scalable data solutions.
  • Automate infrastructure deployment using Infrastructure-as-Code (IaC) tools like Terraform, and manage security, access control, and compliance standards.
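
To make the first bullet above concrete, here is a minimal, hypothetical sketch of that kind of pipeline: an Apache Beam job in Python that reads events from Pub/Sub, applies a light transform, and streams rows into BigQuery, runnable on Dataflow. The project, topic, bucket, and table names and the event schema are placeholders for illustration, not details from this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a row matching the BigQuery schema (hypothetical fields)."""
    event = json.loads(message.decode("utf-8"))
    return {"user_id": event["user_id"], "action": event["action"], "ts": event["ts"]}


def run():
    # All resource names below are placeholders.
    options = PipelineOptions(
        streaming=True,
        project="my-gcp-project",
        runner="DataflowRunner",            # use DirectRunner for local testing
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "Parse" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

The same code runs locally under the DirectRunner while developing; switching the runner option to DataflowRunner deploys it as a managed Dataflow job.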

GCP Data Engineer | Skills & Experience

  • 5+ years of experience as a Data Engineer with deep expertise in Google Cloud Platform (GCP) and a proven track record of designing scalable data solutions.
  • Proficient in SQL and Python, utilizing these skills to develop efficient data pipelines, process large datasets, and automate workflows (see the ELT sketch after this list).
  • Hands-on experience with GCP services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Cloud Composer for building robust cloud-based data infrastructure.
  • Strong understanding of ETL/ELT processes, with expertise in handling and transforming large-scale datasets in cloud environments.
  • Familiarity with distributed data processing frameworks like Apache Beam, Spark, and others to handle complex data transformations at scale.
  • In-depth knowledge of data governance, security best practices, and compliance standards, ensuring the integrity and safety of cloud-based data systems.
  • Experience in the Telco domain or a Google Cloud Professional Data Engineer certification is highly desirable.
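
To illustrate the SQL-plus-Python bullet above, here is a minimal, hypothetical ELT step using the google-cloud-bigquery client: raw events already landed in BigQuery are rolled up into a date-partitioned reporting table. The project, dataset, and table names are placeholders, not details from this posting.

```python
from google.cloud import bigquery

# Placeholder project; credentials come from the environment.
client = bigquery.Client(project="my-gcp-project")

sql = """
CREATE TABLE IF NOT EXISTS analytics.daily_actions
PARTITION BY event_date
AS
SELECT
  DATE(ts)  AS event_date,
  action,
  COUNT(*)  AS action_count
FROM analytics.events
GROUP BY event_date, action;
"""

# Run the transformation and wait for completion; job errors surface as exceptions.
client.query(sql).result()
```

Partitioning the reporting table by event_date is a common BigQuery pattern: downstream queries scan only the dates they need, which keeps cost and latency predictable.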

Similar jobs

Data Engineer (GCP)-CTH | Davita Inc. | Eagan | Remote | USD 60,000 - 80,000 | 6 days ago
Sr Data Analyst/Data Analyst (Contract to Hire) | Jobot Consulting | Indianapolis | Remote | USD 60,000 - 80,000 | Yesterday
Data Analyst | Lensa | Indianapolis | Remote | USD 79,000 - 113,000 | 13 days ago
Data Analyst | Lensa | Indianapolis | Remote | USD 70,000 - 90,000 | 12 days ago
Report/Data Analyst | Key Benefit Administrators | Indianapolis | Remote | USD 70,000 - 90,000 | 3 days ago
Data Engineer - Senior | Lumenalta | Tulsa | Remote | USD 50,000 - 100,000 | Yesterday
Data Engineer - Databricks | Lumenalta | Raleigh | Remote | USD 60,000 - 110,000 | Yesterday
Data Engineer | Trillium Health Resources | Remote | USD 62,000 - 99,000 | 3 days ago
Senior Data Engineer | Abbott | Remote | USD 75,000 - 151,000 | 3 days ago