Data Engineer

INSPYR

Myrtle Point (OR)

Remote

Full time

Job summary

A flexible technology solutions provider is looking for a Data Engineer to design and manage data pipelines remotely. The ideal candidate has strong hands-on skills in Databricks, Python, and SQL, along with experience with cloud-native data services. The role is a 6+ month remote contract paying $60.00 – 65.00/hr and is open to US citizens, green card holders, and others authorized to work in the U.S.

Benefits

Comprehensive medical benefits
Competitive pay
401(k) retirement plan

Qualifications

  • Strong hands-on experience with Databricks, including Spark and Delta Lake.
  • Proficiency in Python and SQL is essential.
  • Experience with cloud-native data services like AWS Glue or Azure Data Factory.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using Apache Spark.
  • Build and manage Delta Lake architectures for efficient data storage.
  • Collaborate with AI/ML teams to operationalize models.

Skills

Databricks
Python
SQL
ETL/ELT workflows
Data Quality
Collaboration

Education

Bachelor's or Master's degree in Computer Science or a related field

Tools

Apache Spark
Delta Lake
MLflow
Git
Cloud platforms (AWS, Azure, GCP)

Job description

Overview

Title: Data Engineer
Location: Remote
Duration: 6+ months
Compensation: $60.00 – 65.00/hr
Work Requirements: US Citizens, GC Holders, or those Authorized to Work in the U.S.

Responsibilities
  • Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks.
  • Build and manage Delta Lake architectures for efficient data storage and retrieval.
  • Implement robust ETL/ELT workflows using Databricks notebooks, SQL, and Python.
  • Collaborate with AI/ML teams to operationalize models within the Databricks environment.
  • Optimize data workflows for performance, reliability, and cost-efficiency in cloud platforms (AWS, Azure, or GCP).
  • Ensure data quality, lineage, and governance using tools like Unity Catalog and MLflow.
  • Develop CI/CD pipelines for data and ML workflows using Databricks Repos and Git integrations.
  • Monitor and troubleshoot production data pipelines and model deployments.
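
As a rough illustration of the kind of ETL/ELT work described in the responsibilities above, here is a minimal PySpark and Delta Lake sketch; the source path, key column, and target location are hypothetical, and `spark` is the SparkSession that Databricks notebooks provide automatically.

```python
# Minimal batch ETL sketch: read raw JSON, apply light cleanup, append to a Delta table.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()  # already defined in Databricks notebooks

raw = spark.read.json("/mnt/raw/orders/")             # hypothetical landing location

cleaned = (
    raw.dropDuplicates(["order_id"])                  # hypothetical business key
       .filter(F.col("amount") > 0)                   # drop invalid rows
       .withColumn("ingested_at", F.current_timestamp())
)

(
    cleaned.write.format("delta")
    .mode("append")
    .save("/mnt/curated/orders")                      # hypothetical Delta Lake path
)
```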

Key Qualifications
  • Strong hands-on experience with Databricks, including Spark, Delta Lake, and MLflow.
  • Proficiency in Python, SQL, and distributed data processing.
  • Experience with cloud-native data services (e.g., AWS Glue, Azure Data Factory, GCP Dataflow).
  • Familiarity with machine learning lifecycle and integration of models into data pipelines.
  • Understanding of data warehousing, data lakehouse architecture, and real-time streaming (Kafka, Spark Structured Streaming).
  • Experience with version control, CI/CD, and infrastructure-as-code tools.
  • Excellent communication and collaboration skills.
  • Certifications in Databricks (e.g., Databricks Certified Data Engineer Associate/Professional).
  • Experience with feature engineering and feature stores in Databricks.
  • Exposure to MLOps practices and tools.
  • Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
  • Experience leveraging Databricks for scalable AI and BI solutions, integrating large language models (Anthropic, LLaMA, Gemini) to enhance data-driven insights, and developing agentic AI workflows that automate complex decision-making.

Technology Stack
  • Databricks (Spark, Delta Lake, MLflow, Notebooks)
  • Python & SQL
  • Apache Spark (via Databricks)
  • Delta Lake (for lakehouse architecture)
  • Cloud Platforms: Azure, AWS, or GCP
  • Cloud Storage (ADLS, S3, GCS)
  • Data Integration: Kafka or Event Hubs (streaming)
  • Auto Loader (Databricks file ingestion)
  • REST APIs
  • AI/ML
  • MLflow (model tracking/deployment)
  • Hugging Face Transformers
  • LangChain / LlamaIndex (LLM integration)
  • LLMs: Anthropic Claude, Meta LLaMA, Google Gemini
  • DevOps: Git (GitHub, GitLab, Azure Repos)
  • Databricks Repos
  • CI/CD: GitHub Actions, Azure DevOps
  • Security & Governance: Unity Catalog, RBAC
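
To make the Auto Loader and streaming entries above concrete, here is a brief hypothetical sketch of cloudFiles-based ingestion into a Delta table on Databricks; the schema location, checkpoint path, landing directory, and target table name are all made up, and `spark` is the notebook-provided SparkSession.

```python
# Sketch of Databricks Auto Loader (cloudFiles) streaming new files into a Delta table.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/schemas/events")  # where Auto Loader tracks inferred schema
    .load("/mnt/landing/events/")                                # raw files landing zone
)

query = (
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .trigger(availableNow=True)      # process currently available files, then stop
    .toTable("bronze_events")        # hypothetical target Delta table
)
```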

Benefits
  • Comprehensive medical benefits
  • Competitive pay
  • 401(k) retirement plan
  • ...and much more!

About INSPYR Solutions

Technology is our focus and quality is our commitment. As a national expert in delivering flexible technology and talent solutions, we strategically align industry and technical expertise with our clients' business objectives and cultural needs. Our solutions are tailored to each client and include a wide variety of professional services, project, and talent solutions. By always striving for excellence and focusing on the human aspect of our business, we work seamlessly with our talent and clients to match the right solutions to the right opportunities. Learn more about us at inspyrsolutions.com.

INSPYR Solutions provides Equal Employment Opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. In addition to federal law requirements, INSPYR Solutions complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities.

Information collected and processed through your application with INSPYR Solutions (including any job applications you choose to submit) is subject to INSPYR Solutions’ Privacy Policy and INSPYR Solutions’ AI and Automated Employment Decision Tool Policy. By submitting an application, you are consenting to being contacted by INSPYR Solutions through phone, email, or text.
