Data Architect/Engineer

Compunnel Inc.

Los Angeles (CA)

On-site

USD 100,000 - 130,000

Part time

Job summary

A leading IT services company is seeking a Data Engineer to design and maintain data pipelines, ensuring scalability and reliability. The ideal candidate has a bachelor's degree in Computer Science and at least 3 years of experience with Python, PySpark, and cloud data technologies. This contract role offers an opportunity to work in a collaborative Agile environment in Los Angeles.

Qualifications

  • 2+ years’ experience with data tools including Databricks, Collibra, and Starburst.
  • 3+ years’ experience with Python and PySpark.
  • Experience working with traditional ETL and Big Data, on-premises or in the cloud.

Responsibilities

  • Design, develop, and maintain robust data pipelines.
  • Deliver high-quality data products following Agile Practices.
  • Collaborate with cross-functional teams to meet data requirements.

Skills

Python
PySpark
Data Engineering
Databricks
Collibra
Starburst
AWS
Agile methodology

Education

Bachelor’s degree in Computer Science or related field

Tools

Jupyter Notebooks
Airflow
S3
Redshift
Snowflake

Job description

Overview

Duration: Long Term

Role Overview: As a Data Engineer, this contingent worker (CW) will be responsible for collecting, parsing, managing, analyzing, and visualizing large sets of data to turn information into actionable insights. They will work across multiple platforms to ensure that data pipelines are scalable, repeatable, and secure, and capable of serving multiple users.

Qualifications
  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent experience.
  • 2+ years’ experience with tools such as Databricks, Collibra, and Starburst.
  • 3+ years’ experience with Python and PySpark.
  • Experience using Jupyter notebooks, including coding and unit testing.
  • Recent accomplishments working with relational and NoSQL data stores and with data modeling methods and approaches (star schema, dimensional modeling).
  • 2+ years of experience with a modern data stack (object stores such as S3, Spark, Airflow, Lakehouse architectures, real-time databases) and cloud data warehouses such as Redshift and Snowflake.
  • Overall data engineering experience across traditional ETL and Big Data, either on-premises or in the cloud.
  • Data engineering experience in AWS (any CFS2/EDS), highlighting the services/tools used.
  • Experience building end-to-end data pipelines to ingest and process unstructured and semi-structured data using Spark architecture.
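
The Spark pipeline experience called for above can be pictured with a minimal PySpark sketch, assuming a simple JSON-over-object-store layout; the bucket, paths, and field names (event_ts, event_date) are hypothetical placeholders, not details from this posting.

# Minimal PySpark sketch: ingest semi-structured JSON from an object store
# and write a curated, partitioned table. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-semi-structured").getOrCreate()

# Read raw semi-structured events (placeholder S3 path).
raw = spark.read.json("s3a://example-bucket/raw/events/")

# Light curation: derive a partition date and stamp the load time.
curated = (
    raw
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Persist a partitioned curated dataset (Parquet here; a lakehouse table
# format such as Delta could be used instead where available).
(
    curated.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-bucket/curated/events/")
)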
Responsibilities
  • Design, develop, and maintain robust and efficient data pipelines to ingest, transform, catalog, and deliver curated, trusted, and quality data from disparate sources into our Common Data Platform.
  • Actively participate in Agile rituals and follow Scaled Agile processes as set forth by the CDP Program team.
  • Deliver high-quality data products and services following SAFe (Scaled Agile Framework) practices.
  • Proactively identify and resolve issues with data pipelines and analytical data stores.
  • Deploy monitoring and alerting for data pipelines and data stores, implementing auto-remediation where possible to ensure system availability and reliability.
  • Employ a security-first strategy built on testing and automation, adhering to data engineering best practices.
  • Collaborate with cross-functional teams, including product management, data scientists, analysts, and business stakeholders, to understand their data requirements and provide them with the necessary infrastructure and tools.
  • Keep up with the latest trends and technologies, evaluating and recommending new tools, frameworks, and technologies to improve data engineering processes and efficiencies.
Seniority level
  • Mid-Senior level
Employment type
  • Contract
Job function
  • Information Technology
Industries
  • IT Services and IT Consulting; Banking
