Principal GCP Data Engineer_Remote_USC/GC or H4EAD_ONLY ON W2

Chelsoft Solutions Co.

Eagan (MN)

Remote

USD 86,000 - 136,000

Full time

2 days ago

Job summary

A leading company is seeking a Principal GCP Data Engineer to focus on building data pipelines and ETL processes. This remote role requires expertise in technologies such as Airflow and SnapLogic, with responsibilities that include developing data workflows and warehouse solutions. Applicants should bring strong problem-solving skills and a drive to contribute to migration and modernization initiatives as part of a team.

Qualifications

  • 5-6 years of experience in building data pipelines.
  • Experience with Airflow, SnapLogic, and Google BigQuery.
  • Familiarity with data warehousing and ETL processes.

Responsibilities

  • Build data pipelines and orchestration from scratch.
  • Develop data ingestion and ETL pipelines primarily using SnapLogic.
  • Support data modeling and warehouse migration to GCP.

Skills

Airflow
Cloud Composer
SnapLogic
Python
SQL
Dataflow
Spark
Data Warehousing
Google BigQuery
Data Modeling

Job description

Principal GCP Data Engineer

Remote

USC/GC or H4EAD

ONLY ON W2

Required Skills: Experience with Airflow or Cloud Composer orchestration, including development of new DAGs from scratch. Development of data ingestion and ETL pipelines from scratch, primarily using SnapLogic for data pipelines and integrations, but also Python, SQL, Dataflow, and Spark. Experience with data warehousing and Google BigQuery. An end-to-end understanding of analytics (how data moves from source to warehouse, semantic or reporting layer, models, and reporting/BI), though the hands-on focus will be on building data pipelines and orchestration.
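
For illustration only, here is a minimal sketch of the kind of Airflow DAG this role would build from scratch; the DAG id, task names, schedule, and task logic are hypothetical placeholders rather than the client's actual pipelines. The same code runs unchanged on Cloud Composer, which is managed Airflow.

```python
# Minimal illustrative Airflow 2.x DAG: extract from a source system, then load
# to a staging area. All names and logic are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull one day's worth of records from a source system.
    print(f"extracting orders for {context['ds']}")


def load_to_staging(**context):
    # Placeholder: land the extracted records in a staging location (e.g. GCS).
    print(f"loading staging data for {context['ds']}")


with DAG(
    dag_id="orders_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # Cloud Composer uses the same scheduler semantics
    catchup=False,
    tags=["ingestion", "example"],
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_staging", python_callable=load_to_staging)

    extract >> load  # simple linear dependency: extract, then load
```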

Years of experience: roughly 5-6 years for the Principal level; more important is what they have done, e.g., healthcare experience, exposure to more than one project, and strong communication.

Remote: yes; core hours are 9 AM to 3 PM CST.

Pain Points: candidates who do not have the correct focus area (building data pipelines), and candidates who have not built anything from scratch, i.e. they maintain existing data workflows or processes in production rather than building new ones.

Overview: There are multiple openings for Associate Data Engineer, Senior Data Engineer, and Principal Data Engineer to support several ongoing initiatives, including migrating data to GCP, building out the GCP BigQuery warehouse (and moving data from Redshift into it), and sunsetting legacy ETL (DataStage) and replacing it with new data ingestion solutions.

  • Experience in SnapLogic is strongly preferred; they have found this to be more amenable to upskilling.
  • Needs experience in Airflow or Cloud Composer orchestration, development of new DAGs from scratch.
  • Development of data ingestion and ETL pipelines from scratch, primarily using SnapLogic for data pipelines and integrations, but also Python, SQL, Dataflow, and Spark.
  • Needs experience in data warehousing and Google BigQuery.
  • Not responsible for building out visualizations; another team handles that.
  • Will be supporting data modeling but not owning it. Should have some experience with data modeling and data warehousing fundamentals.
  • Understanding of analytics as a whole, how data moves from source, warehouse, semantic or reporting layer, models, and reporting/BI, but their hands-on focus will be around building data pipelines and orchestration.
  • Proactive communicators, inquisitive people, and problem-solvers who are unafraid to make suggestions and ask questions. "Order taker" and "heads down" types of engineers will not be a culture fit for the team.
  • They have a list of other technologies in their environment in smaller, more dispersed amounts; any would be a nice to have, e.g., Kafka, Java, Apache Beam, Alteryx.
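
As a purely illustrative sketch of the BigQuery warehouse build-out and Redshift migration described in the overview above: assuming the data has already been unloaded from Redshift into a GCS bucket as Parquet, a load into BigQuery with the google-cloud-bigquery Python client could look roughly like this. The project, dataset, table, and bucket names are hypothetical placeholders.

```python
# Illustrative only: load Parquet files (previously unloaded from Redshift to
# GCS) into a BigQuery staging table. All resource names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # replace table contents
)

load_job = client.load_table_from_uri(
    "gs://example-migration-bucket/orders/*.parquet",  # Redshift UNLOAD output copied to GCS
    "example-project.analytics_staging.orders",        # hypothetical target staging table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("example-project.analytics_staging.orders")
print(f"Loaded {table.num_rows} rows into analytics_staging.orders")
```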

Seniority level
  • Mid-Senior level

Employment type
  • Contract

Job function
  • Information Technology

Industries
  • IT Services and IT Consulting
