Data Engineer (AI Platforms)

OpenVPN Inc.

São Paulo

Remote

BRL 100,000 - 130,000

Full-time

Today

Job summary

A tech company is looking for a skilled Data Engineer to build and optimize data pipelines and collaborate with AI/ML teams. The ideal candidate has over 4 years of experience, deep SQL expertise, and familiarity with cloud data warehouses like Google BigQuery. This role allows for a fully remote work environment with a competitive pay rate and self-managed time off.

Benefits

Competitive pay rates
Fully remote work environments
Self-managed time off

Qualifications

  • Proven experience (4+ years) in a data engineering role.
  • Deep expertise in SQL and query optimization.
  • Hands-on experience with cloud data warehouses.

Responsibilities

  • Design, build, and optimize robust, scalable data pipelines.
  • Evolve our data models and schemas for complex analytics.
  • Collaborate with AI/ML teams to productionize models.

Skills

SQL
Data engineering
Cloud data warehousing
Python
Java
Communication
Self-motivation

Tools

Google BigQuery
CloudSQL (PostgreSQL)
Docker

Job description

Are you passionate about building the data foundations for a new generation of AI? We're looking for a skilled Data Engineer to be a major contributor to our company's intelligent future. You won't just be maintaining systems; you'll be at the heart of building, scaling, and deploying the data and AI platforms that will redefine how we deliver data solutions.

This is an opportunity to make a significant impact by transforming our data landscape and enabling cutting-edge AI and agentic workflows.

What You'll Do
  • Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to our modern, AI-centric platform.

  • Evolve our data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.

  • Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support the development of sophisticated agentic systems.

  • Empower the organization by partnering with BI developers and analysts to design highly efficient queries and unlock new insights.

  • Champion data governance and compliance, ensuring our data handling practices remain secure and trustworthy as we innovate.

Challenges You'll Help Us Tackle
  • Modernize Our Data Backbone: Lead the charge in migrating our historical data flows to cutting-edge, AI-driven workflows.

  • Shape the Future of our AI: Redesign our datasets and schemas so they are well suited to training and fine-tuning next-generation models.

  • Build the Brains of the Operation: Play an important role in building the infrastructure that supports powerful, data-driven agentic systems.

  • Scale with Intelligence: Help us build a data ecosystem that is not only powerful today but is ready for the demands of tomorrow's AI.

We are a small, close-knit team, and we care deeply about you:

  • Competitive pay rates

  • Fully remote work environments

  • Self-managed time off

Requirements
  • Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.

  • Deep expertise in SQL and query optimization.

  • Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and CloudSQL (PostgreSQL); a minimal, illustrative sketch of this kind of work appears after this list.

  • Programming experience with Python or Java.

  • A proactive, self-motivated, and self-managed mindset, well suited to a fully remote environment with a high degree of autonomy.

  • Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.

  • The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.
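
To make the warehouse and programming requirements above concrete, here is a minimal, illustrative sketch (not a sample from our codebase) of a parameterized BigQuery query issued through the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders.

    # Illustrative sketch only. Assumes the google-cloud-bigquery client
    # library; all project, dataset, and column names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # placeholder project ID

    # A parameterized aggregate query, the kind of pattern involved in
    # tuning analytics workloads against a warehouse table.
    query = """
        SELECT user_id, COUNT(*) AS event_count
        FROM `example-project.analytics.events`
        WHERE event_date >= @start_date
        GROUP BY user_id
        ORDER BY event_count DESC
        LIMIT 100
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        ]
    )

    # Run the query and print the top results.
    for row in client.query(query, job_config=job_config).result():
        print(row.user_id, row.event_count)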

Bonus Points For
  • AI/ML Tooling: Experience with Google's Vertex AI platform.

  • Programming Languages: Proficiency in Go.

  • Data Engineering Tools: Familiarity with dbt and Airflow (a minimal orchestration sketch appears after this list).

  • Streaming Data: Familiarity with event-streaming platforms like Apache Kafka.

  • Streaming Analytics: Experience with real-time streaming analytics.

  • DevOps & Infrastructure: Experience with containerization (Docker) and serverless compute (Google Cloud Run).

  • Legacy Systems: Experience with Perl or PHP is a plus.
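
Likewise, the sketch below illustrates the kind of dbt and Airflow orchestration mentioned in the list above. It is a minimal example, not our production setup: it assumes Airflow 2.4 or later and the dbt CLI, and the DAG ID, schedule, and shell commands are hypothetical placeholders.

    # Illustrative sketch only. Assumes Airflow 2.4+ and the dbt CLI on the
    # worker; DAG ID, schedule, and commands are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_warehouse_refresh",  # placeholder pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # the `schedule` argument requires Airflow 2.4+
        catchup=False,
    ):
        # Placeholder extraction step that lands raw events in the warehouse.
        extract = BashOperator(
            task_id="extract_events",
            bash_command="python extract_events.py",
        )
        # Run dbt transformations once extraction has finished.
        transform = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --select staging+",
        )
        extract >> transform  # extraction runs before transformation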
