Job Search and Career Advice Platform

Middle Data Warehouse Engineer (#4576)

N-iX

Remote

BRL 80,000 - 120,000

Full-time

Yesterday

Job summary

A leading technology firm is seeking a motivated Middle Data Warehouse Engineer to develop and maintain robust data pipelines. You will design ETL processes, optimize queries, and automate data workflows while using AWS CLI for data management. Candidates should have at least 4 years of experience, proficiency in Java, Oracle, and PL/SQL, along with strong problem-solving skills. This position offers a flexible working format and a competitive salary package.

Benefits

Flexible working format
Competitive salary and compensation package
Professional development tools
Active tech communities

Qualifications

  • At least 4 years of experience in data engineering or a similar role.
  • Intermediate proficiency in Java, Oracle, and PL/SQL.
  • Experience with ETL processes and data pipeline development.

Responsibilities

  • Design and maintain ETL processes for data movement.
  • Optimize complex queries using PL/SQL and Oracle.
  • Automate data workflows using shell scripting.
  • Utilize AWS CLI for data management tasks.
  • Debug data pipelines to ensure accuracy and availability.

Skills

Java
Oracle
PL/SQL
ETL processes
Shell scripting
AWS CLI
Data warehousing concepts
Problem-solving skills
English (Upper-Intermediate/Advanced)

Job description

We are seeking a motivated Middle Data Warehouse Engineer to join our team. In this role, you will be responsible for developing and maintaining robust data pipelines that drive our business intelligence and analytics.

Responsibilities
  • Design, develop, and maintain ETL (Extract, Transform, Load) processes to move data from various sources to our data warehouse (a rough sketch of one such load step follows this list).
  • Write and optimize complex queries and scripts using PL/SQL and Oracle to transform and load data.
  • Automate and orchestrate data workflows using shell scripting (Unix/Korn shell).
  • Utilize AWS CLI for tasks such as managing data in S3 or interacting with other AWS services.
  • Debug and troubleshoot data pipeline issues to ensure data accuracy and availability for downstream consumers.
  • Collaborate with stakeholders and team members to understand data requirements and deliver reliable solutions.
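Purely as an illustration of the kind of step this role involves, the sketch below calls a hypothetical Oracle PL/SQL load procedure from Java over JDBC and then stages the exported file to S3 with the AWS CLI. Every specific name here (connection string, stg.load_daily_sales, file path, bucket) is an invented placeholder rather than a detail of the actual project, and the Oracle JDBC driver plus an installed AWS CLI are assumed.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

// Minimal sketch of one ETL load step, under assumed names.
public class EtlLoadStepSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- not the actual project's.
        String jdbcUrl = "jdbc:oracle:thin:@//db-host:1521/DWH";

        try (Connection conn = DriverManager.getConnection(
                     jdbcUrl, "etl_user", System.getenv("DB_PASSWORD"));
             // Hypothetical PL/SQL procedure that transforms and loads one business date.
             CallableStatement load = conn.prepareCall("{call stg.load_daily_sales(?)}")) {
            load.setString(1, "2024-01-31");
            load.execute();
        }

        // Stage the exported extract to S3 with the AWS CLI (path and bucket are examples).
        int exit = new ProcessBuilder(
                "aws", "s3", "cp",
                "/data/exports/daily_sales.csv",
                "s3://example-dwh-staging/daily_sales/")
                .inheritIO()
                .start()
                .waitFor();
        if (exit != 0) {
            throw new IllegalStateException("aws s3 cp failed with exit code " + exit);
        }
    }
}
```

In practice a step like this would more often be driven by a shell script scheduled by an orchestrator, with the Java/JDBC piece reserved for transforms that need application logic; the sketch only shows how the pieces named in the requirements fit together.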
Requirements
  • At least 4 years of experience in this or a similar role.
  • Intermediate proficiency with Java, Oracle, and PL/SQL.
  • Experience with ETL processes and data pipeline development.
  • Familiarity with shell scripting (Unix) and basic command-line debugging.
  • Working knowledge of AWS CLI.
  • A solid understanding of data warehousing concepts.
  • Strong problem-solving skills and the ability to learn from technical documentation and training materials.
  • Upper-Intermediate/Advanced English level is required.
We offer
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
Project: Global stock photography provider