Data Engineer (AI Platforms), LATAM

OpenVPN Inc.

Brazil

Remote

BRL 80,000 - 120,000

Full-time

4 days ago

Job summary

A tech company focused on data solutions is seeking a skilled Data Engineer to design and optimize robust data pipelines. This fully remote role offers a chance to work on cutting-edge AI projects and shape the future of the industry. Ideal candidates have over 4 years of experience in data engineering and are proficient in SQL, Python, and cloud technologies. Competitive pay and a self-managed work environment are included.

Benefits

Competitive pay rates
Fully remote work environment
Self-managed time off

Qualifications

  • 4+ years of experience in a data engineering role.
  • Proven track record of building complex data systems.
  • Excellent communication and documentation skills.

Responsibilities

  • Design and optimize data pipelines for AI solutions.
  • Collaborate with AI/ML teams for model productionization.
  • Champion data governance and compliance.

Skills

Data engineering experience
SQL expertise
Cloud databases
Python or Java programming
Communication skills

Tools

Google BigQuery
CloudSQL (PostgreSQL)
Docker
Event-streaming platforms

Job description

Are you passionate about building the data foundations for a new generation of AI? We're looking for a skilled Data Engineer to be a major contributor to our company's intelligent future. You won't just be maintaining systems; you'll be at the heart of building, scaling, and deploying the data and AI platforms that will redefine how we deliver data solutions.

This is an opportunity to make a significant impact by transforming our data landscape and enabling cutting-edge AI and agentic workflows.

We are a small, close-knit team, and we care deeply about you:

  • Competitive pay rates
  • Fully remote work environment
  • Self-managed time off

Important: This is a full-time remote job, and it will be a long-term B2B contract.

The full job description on our website lists the locations eligible for this role: Argentina, Brazil, Chile, Colombia, Costa Rica, Dominican Republic, Ecuador, Mexico, Panama, Peru.

What You'll Do
  • Design, build, and optimize robust, scalable data pipelines, leading the migration from legacy systems to our modern, AI-centric platform.
  • Evolve our data models and schemas to better support complex analytics, AI training, and fine-tuning workloads.
  • Collaborate with AI/ML teams to productionize models, streamline training data delivery, and support the development of sophisticated agentic systems.
  • Empower the organization by partnering with BI developers and analysts to design highly efficient queries and unlock new insights.
  • Champion data governance and compliance, ensuring our data handling practices remain secure and trustworthy as we innovate.
Challenges You'll Help Us Tackle
  • Modernize Our Data Backbone: Lead the charge in migrating our historical data flows to cutting-edge, AI-driven workflows.
  • Shape the Future of our AI: Redesign our datasets and schemas to be well aligned for training and fine-tuning next-generation models.
  • Build the Brains of the Operation: Play an important role in building the infrastructure that supports powerful, data-driven agentic systems.
  • Scale with Intelligence: Help us build a data ecosystem that is not only powerful today but is ready for the demands of tomorrow's AI.
What We're Looking For
  • Proven experience (4+ years) in a data engineering role, with a track record of building and managing complex data systems.
  • Deep expertise in SQL and query optimization.
  • Hands-on experience with cloud data warehouses and databases, specifically Google BigQuery and CloudSQL (PostgreSQL).
  • Programming experience with Python or Java.
  • A proactive, self-motivated, and self-managed mindset, well suited to a fully remote environment with a high degree of autonomy.
  • Excellent communication and documentation skills; you can clearly articulate complex technical concepts to diverse audiences.
  • The ability to work a flexible schedule and the readiness to respond to occasional off-hours emergencies.
Bonus Points For
  • AI/ML Tooling: Experience with Google's VertexAI platform.
  • Programming Languages: Proficiency in Go.
  • Data Engineering Tools: Familiarity with dbt and Airflow.
  • Streaming Data: Familiarity with event-streaming platforms like Apache Kafka.
  • Streaming Analytics: Experience with real-time streaming analytics.
  • DevOps & Infrastructure: Experience with containerization (Docker) and serverless compute (Google Cloud Run).
  • Legacy Systems: Experience with Perl or PHP is a plus.