Data Engineer - Remote

Monetize More Inc

Remote

BRL 80,000 - 120,000

Full-time

Posted 3 days ago

Job summary

A global ad-tech company is seeking a Data Engineer to design and maintain data pipelines and analytics infrastructure. The ideal candidate has 2–3 years of experience, strong SQL skills, and familiarity with Apache Airflow. This remote position offers competitive salary and opportunities for career growth while working with large datasets in a collaborative team culture.

Benefits

Competitive salary
Fully remote work environment
Career growth opportunities
Collaborative team culture

Qualifications

  • 2–3 years of experience as a Data Engineer or similar role.
  • Strong SQL skills for query optimization.
  • Hands-on experience with Apache Airflow.

Responsibilities

  • Design, build, and maintain ETL/ELT pipelines.
  • Develop workflows using Apache Airflow.
  • Write efficient SQL queries for analytics.

Skills

SQL query optimization
Apache Airflow
Python
Data warehousing concepts
Cloud platforms (AWS, GCP)

Tools

Apache Spark

Job description

About MonetizeMore

MonetizeMore is a global ad‑tech company helping publishers maximize their ad revenue through data‑driven optimization and advanced monetization solutions. We work with thousands of publishers worldwide and rely heavily on scalable, reliable data systems to power insights, automation, and decision‑making.

Role Overview

We are looking for a Data Engineer with 2–3 years of hands‑on experience to help design, build, and maintain robust data pipelines and analytics infrastructure. You will work closely with analytics, product, and engineering teams to ensure high‑quality, reliable, and scalable data systems.

This role is ideal for someone who enjoys working with large datasets, building automated workflows, and optimizing data performance.

Key Responsibilities

  • Design, build, and maintain ETL/ELT pipelines for structured and semi‑structured data
  • Develop and manage workflows using Apache Airflow (DAG creation, scheduling, monitoring, and optimization); a minimal DAG sketch follows this list
  • Write efficient, optimized SQL queries for analytics and reporting
  • Ensure data accuracy, integrity, and reliability across pipelines
  • Work with large datasets and optimize data storage and retrieval
  • Collaborate with data analysts and product teams to enable actionable insights
  • Troubleshoot pipeline failures and improve system performance
  • Document data workflows, schemas, and best practices
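
For a sense of what the Airflow work above looks like in practice, here is a minimal sketch of a daily extract-and-load DAG. It assumes Apache Airflow 2.x, and the DAG id, task names, and schedule are illustrative placeholders rather than MonetizeMore's actual pipelines.

  # Minimal Airflow 2.x DAG sketch: one daily extract -> load run.
  # All names and the schedule are hypothetical examples.
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract():
      # Placeholder: pull raw ad-revenue records from a source system.
      ...

  def load():
      # Placeholder: write transformed rows into the warehouse.
      ...

  with DAG(
      dag_id="example_daily_revenue_etl",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",  # "schedule" needs Airflow 2.4+; older releases use schedule_interval
      catchup=False,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      load_task = PythonOperator(task_id="load", python_callable=load)
      extract_task >> load_task  # run extract before load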

Required Qualifications

  • 2–3 years of experience as a Data Engineer or similar role
  • Strong SQL skills (query optimization, joins, window functions, data modeling); a brief window-function example follows this list
  • Hands‑on experience with Apache Airflow
  • Experience building and maintaining production‑grade data pipelines
  • Proficiency in Python for data processing and orchestration
  • Solid understanding of data warehousing concepts
  • Experience working with cloud‑based data platforms (AWS, GCP, or similar)
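
As a purely illustrative example of the window-function work mentioned above, the snippet below computes a running revenue total per publisher. The table, columns, and data are hypothetical, and it assumes a Python build whose bundled SQLite supports window functions (SQLite 3.25+).

  # Hypothetical example: a per-publisher running total via a SQL window function,
  # executed against an in-memory SQLite database for demonstration only.
  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute("CREATE TABLE ad_revenue (publisher_id INTEGER, day TEXT, revenue REAL)")
  conn.executemany(
      "INSERT INTO ad_revenue VALUES (?, ?, ?)",
      [(1, "2024-01-01", 100.0), (1, "2024-01-02", 120.0), (2, "2024-01-01", 80.0)],
  )

  rows = conn.execute(
      """
      SELECT publisher_id, day, revenue,
             SUM(revenue) OVER (PARTITION BY publisher_id ORDER BY day) AS running_revenue
      FROM ad_revenue
      ORDER BY publisher_id, day
      """
  ).fetchall()

  for row in rows:
      print(row)  # e.g. (1, '2024-01-01', 100.0, 100.0)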

Good to Have

  • Experience with Apache Spark (batch or distributed data processing)
  • Familiarity with big data ecosystems and tools
  • Experience with ad‑tech, analytics, or revenue optimization data
  • Knowledge of CI/CD practices for data pipelines

What We Offer

  • Fully remote work environment
  • Competitive salary and performance‑based growth
  • Opportunity to work with large‑scale, real‑world data
  • Learning and career growth in a fast‑paced ad‑tech company
  • Collaborative, supportive, and global team culture