Data Engineer/Architect - Databricks

ZipRecruiter

Rockville (MD)

Remote

USD 90,000 - 140,000

Full time

13 days ago

Job summary

An innovative company is seeking a Senior Data Engineer/Architect to join their team remotely. This role involves maintaining and enhancing data processing pipelines, designing ETL processes, and collaborating with Data Engineers and Data Scientists to analyze large datasets. The ideal candidate will have extensive experience with Python, Databricks, and AWS, and will play a crucial role in implementing big data technologies. With flexible work opportunities and a strong benefits package, this position offers a chance to make a significant impact in a dynamic environment.

Benefits

Competitive pay and 401K retirement plan
Flexible and remote work opportunities
Health care benefits (medical, dental, vision)
Tuition reimbursement
Employee Referral Program
Short-term and long-term insurance
Flexible Spending Account (FSA)

Qualifications

  • 7+ years of experience with Python or another OOP language.
  • 5+ years of experience with Databricks and NoSQL products.

Responsibilities

  • Maintain and enhance processing pipelines using AWS tools.
  • Design and build ETL pipelines for data ingestion and migration.

Skills

Python
NoSQL
Databricks
SQL
Data Engineering

Tools

AWS
Snowflake

Job description

Senior Data Engineer/Architect - Remote
Full-time

Responsibilities:
  • Maintain and enhance processing pipelines using tools and frameworks in the AWS ecosystem.
  • Maintain architecture specifications and detailed design documentation.
  • Conduct data engineering functions including data extraction, transformation, loading, and integration in an AWS environment leveraging Snowflake and Databricks.
  • Work with large datasets and collaborate with Data Engineers and Data Scientists on data analysis tasks.
  • Implement and configure big data technologies and tune processes for performance at scale.
  • Design and build ETL pipelines to automate ingestion and data migration of structured and unstructured data.
  • Collaborate with DevOps engineers on CI/CD and infrastructure-as-code (IaC) processes; translate specifications into code and design documents; perform code reviews and develop processes to improve code quality.
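The ETL responsibilities above (automating ingestion and migration of structured data) can be sketched in miniature. This is a hypothetical illustration using only the Python standard library; the table name, schema, and dollars-to-cents transform are invented for the example and are not part of the posting.

```python
import csv
import io
import sqlite3

# Hypothetical ETL sketch: extract CSV records, transform the amounts,
# and load them into a database table. All names are illustrative.
raw = "id,amount\n1,10.5\n2,20.25\n"

# Extract: parse the raw CSV into dictionaries.
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: convert dollar amounts to integer cents.
for r in records:
    r["amount_cents"] = round(float(r["amount"]) * 100)

# Load: insert the transformed rows into an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (id INTEGER, amount_cents INTEGER)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?)",
    [(int(r["id"]), r["amount_cents"]) for r in records],
)
total = conn.execute("SELECT SUM(amount_cents) FROM payments").fetchone()[0]
print(total)  # 1050 + 2025 = 3075
```

In a production pipeline the same extract/transform/load shape would typically target Snowflake or Databricks rather than SQLite, as the responsibilities above describe.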
Basic Qualifications:
  • 7+ years' experience with Python or another object-oriented (OOP) language.
  • 5+ years' experience with NoSQL products (e.g., JSON document stores).
  • 5+ years' experience with Databricks.
  • 5+ years' experience with advanced SQL features: regular expressions and analytical (window) functions (e.g., RANK, PARTITION BY, LEAD, LAG).
  • 4+ years' experience using query plans and database metrics to analyze and optimize queries, table structures, indices, and partitioning strategies.
  • Experience with federal government contracting work.
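The analytical SQL functions named in the qualifications above (RANK, LEAD, LAG over a PARTITION BY clause) can be demonstrated with a small, hypothetical example. SQLite supports window functions, so the sketch below runs against an in-memory database; the sales table and regions are invented for illustration.

```python
import sqlite3

# Hypothetical data to exercise the window functions from the qualifications.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('east', 100), ('east', 300), ('west', 200), ('west', 200);
""")

# RANK orders rows within each region; LAG looks back one row in that order.
rows = conn.execute("""
    SELECT region,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           LAG(amount) OVER (PARTITION BY region ORDER BY amount DESC) AS prev
    FROM sales
""").fetchall()

for row in rows:
    print(row)
conn.close()
```

Note how the two tied 'west' rows both receive rank 1, while LAG returns NULL (None) for the first row of each partition.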

Note: Strong, hands-on Databricks experience is required.

If you're interested in applying, please contact me at mayuri.s@globalalliantinc.com.

Our Benefits:
  • Competitive pay and 401K retirement plan
  • Flexible and remote work opportunities, depending on client requirements
  • Health care benefits (medical, dental, vision)
  • Tuition reimbursement
  • Employee Referral Program
  • Short-term and long-term insurance
  • Flexible Spending Account (FSA)

Global Alliant, Inc. provides equal employment opportunities (EEO) to all employees and applicants without regard to race, color, religion, sex, national origin, age, disability, genetic information, marital status, veteran status, or gender identity or expression. We especially encourage women, minorities, veterans, and individuals with disabilities to apply.
