Data Engineer (Remote) (Full-Time) (Crypto Industry)

Career Developers

Bellevue (WA)

Remote

USD 100,000 - 120,000

Full time

2 days ago
Job summary

A staffing firm seeks remote Data Engineers to join a thriving team in the crypto industry. Candidates should have strong experience with ETL, data pipeline construction, and cloud deployments. This role offers a competitive salary and the opportunity to work on cutting-edge data technologies in a fast-paced environment.

Qualifications

  • Expertise in building data pipelines and ETL processes.
  • Experience with PostgreSQL and PL/pgSQL for database performance.
  • Knowledge of data warehouse technologies such as Trino and ClickHouse.

Responsibilities

  • Build scalable and reliable data pipelines.
  • Collaborate on architecture definitions.
  • Drive data systems towards near real-time processing.

Skills

Streaming data pipelines
Batch data pipelines
Python
PostgreSQL
PL/pgSQL
Cloud deployments
Data warehouse technologies
Agile methodology
English proficiency

Education

Bachelor's degree in Computer Science or related field

Tools

Kubernetes
Docker
Kafka
Redpanda

Job description

Refer a friend: Referral fee program

Career Developers Inc., a distinguished staffing and consulting firm, is proud to celebrate 30 years of service excellence. As a GSA Contract holder, we offer comprehensive staffing solutions for both commercial and government sectors nationwide. Our dedication to candidates involves managing expectations with precision through business intelligence, thorough interview preparation, transparent communication, and exceptional feedback throughout the process.

We are committed to advancing your career and look forward to supporting your professional growth.


Data Engineer (Remote) (Full-Time) (Crypto Industry)
Location: Fully Remote
Salary: $100K - $120K Base Salary

My client needs two Data Engineers to join their collaborative, fast-moving team and work on one of the most rewarding projects in the mining and computing industry.

Basic Requirements
  • Experience building both streaming & batch data pipelines/ETL and familiarity with design principles.
  • Expertise in Python, PostgreSQL, and PL/pgSQL, including development and administration of large databases with a focus on performance and production support in native cloud deployments.
  • Experience with scalability solutions, multi-region replication, and failover solutions.
  • Experience with data warehouse technologies (Trino, ClickHouse, Airflow, etc.).
  • Bachelor's degree (or its foreign degree equivalent) in Computer Science, Engineering, or a related technical discipline or equivalent experience.
  • Deep understanding of programming and experience with at least one programming language.
  • English language proficiency.
Preferred Requirements
  • Knowledge of Kubernetes and Docker.
  • 4+ years of relevant data field experience.
  • Knowledge of blockchain technology/mining pool industry.
  • Experience with agile development methodology.
  • Experience delivering and owning web-scale data systems in production.
  • Experience working with Kafka, preferably Redpanda & Redpanda Connect.

The Ideal Candidate:

  • Passionate about cryptocurrency and public blockchain technologies.
  • Interested in creating a new market with Hashrate (compute power) as a commodity.
  • Interested in evolving the architecture of our software for robustness and maintainability.
  • Enjoys coding and pushing boundaries.
  • Brings enthusiasm to the team and can focus on quality and schedule.

Responsibilities:

  • Build scalable and reliable data pipelines for accurate data feeds.
  • Manage scalable, performant cloud-deployed databases.
  • Collaborate on scalable and secure architecture definitions.
  • Drive data systems towards near real-time processing.
  • Design, automate, and execute test plans for datasets.
  • Participate in feature generation and analysis.