BR Data Engineer

Tarmac.IO

Curitiba

Remote

BRL 20,000 - 80,000

Full-time

26 days ago

Job summary

A leading company in data engineering is seeking a skilled Data Engineer to join their team in Curitiba. The ideal candidate will have expertise in Python, cloud platforms, and ETL processes. This role offers the flexibility to work from anywhere, in a collaborative team environment. Enjoy perks like flexible hours, wellness programs, and ongoing training opportunities.

Perks

  • Fresh fruit sometimes, spoiled fruit all the time.
  • You can work from anywhere, including your home.
  • Flexible hours.
  • Team lunches, birthday celebrations, happy hours.
  • Wellness program and company retreats.
  • English lessons.
  • Courses and training.

Qualifications

  • Proficiency in Python and SQL.
  • Experience with cloud platforms (AWS, Azure, GCP).
  • Ability to collaborate with Data Scientists and Analysts.
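The first qualification, Python and SQL together, can be sketched with the standard library's sqlite3 driver; the `events` table and its columns are illustrative, not from the posting:

```python
# Minimal sketch: aggregating relational data in SQL and consuming
# the result from Python. Uses an in-memory SQLite database so the
# example is self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 10.0), (1, 5.5), (2, 7.25)],
)

# Aggregate in SQL, then work with plain Python tuples.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [(1, 15.5), (2, 7.25)]
```

In production the same pattern applies with a PostgreSQL or BigQuery client in place of sqlite3.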

Responsibilities

  • Develop and maintain data pipelines using ETL/ELT processes.
  • Build and consume RESTful APIs for data exchange.
  • Implement monitoring and logging for data quality.
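The responsibilities above can be sketched as a small extract-transform-load step with basic data-quality logging; the record shape and validation rule are hypothetical, chosen only to illustrate the pattern:

```python
# Hedged sketch of an ETL step with data-quality logging.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def extract():
    # In a real pipeline this would call a REST API or read object storage.
    return [
        {"id": 1, "amount": "19.90"},
        {"id": 2, "amount": None},   # bad record, should be rejected
        {"id": 3, "amount": "5.00"},
    ]

def transform(records):
    clean, rejected = [], 0
    for rec in records:
        if rec["amount"] is None:
            rejected += 1
            log.warning("dropping record %s: missing amount", rec["id"])
            continue
        clean.append({"id": rec["id"], "amount": float(rec["amount"])})
    log.info("transformed %d records, rejected %d", len(clean), rejected)
    return clean

def load(records, target):
    # Stand-in for a warehouse write (e.g. COPY into PostgreSQL).
    target.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2
```

The warning/info counters are the hook where a real pipeline would emit metrics for monitoring.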

Skills

Python
Data manipulation
Scripting & automation
APIs
Testing
Problem-solving
Teamwork

Tools

AWS
Azure
GCP
Terraform
Docker
Kubernetes
Git
Airflow
PostgreSQL
BigQuery

Job description

Core Skills & Knowledge Areas

  1. Python
  • Data manipulation: Proficiency with libraries like pandas, numpy, and pyarrow.
  • Scripting & automation: Writing reusable, modular scripts for data ingestion and transformation.
  • APIs: Consuming and building RESTful APIs for data exchange.
  • Testing: Unit testing with pytest or unittest.
  2. Cloud Platforms
  • AWS / Azure / GCP: Familiarity with services like:
    • AWS: S3, Lambda, Glue, Redshift, EMR
    • Azure: Data Factory, Blob Storage, Synapse
    • GCP: BigQuery, Cloud Functions, Dataflow
  • Infrastructure as Code (IaC): Tools like Terraform or CloudFormation.
  • Security & IAM: Managing access and permissions.
  3. Back-End Development
  • Databases: SQL (PostgreSQL, MySQL), NoSQL (MongoDB, DynamoDB).
  • APIs: Building data services using frameworks like Flask, FastAPI, or Django.
  • CI/CD: Familiarity with Git, Docker, Jenkins, or GitHub Actions.
  4. ETL / ELT Pipelines
  • Pipeline orchestration: Tools like Apache Airflow, Prefect, or Luigi.
  • Data transformation: Using SQL, dbt, or Python scripts.
  • Batch vs Streaming: Understanding of Kafka, Spark Streaming, or Flink.
  • Monitoring & Logging: Ensuring data quality and pipeline reliability.
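The testing bullet in the Python section can be sketched with the standard library's unittest (pytest would look similar); `normalize_amounts` is a hypothetical transformation helper, not something named in the posting:

```python
# Sketch of unit-testing a small data transformation with unittest.
import unittest

def normalize_amounts(rows):
    """Convert string amounts to floats, skipping unparseable values."""
    out = []
    for row in rows:
        try:
            out.append(float(row["amount"]))
        except (TypeError, ValueError):
            continue
    return out

class NormalizeAmountsTest(unittest.TestCase):
    def test_parses_valid_amounts(self):
        rows = [{"amount": "1.5"}, {"amount": "2"}]
        self.assertEqual(normalize_amounts(rows), [1.5, 2.0])

    def test_skips_bad_amounts(self):
        rows = [{"amount": None}, {"amount": "oops"}, {"amount": "3"}]
        self.assertEqual(normalize_amounts(rows), [3.0])
```

Run with `python -m unittest` (or `pytest`) from the module's directory; keeping transformations as pure functions like this is what makes pipelines testable in the first place.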

Tools & Technologies

  • Programming: Python, SQL.
  • Cloud: AWS, Azure, GCP.
  • Orchestration: Airflow, Prefect.
  • Databases: PostgreSQL, BigQuery, Redshift.
  • Data Lakes: S3, Azure Data Lake.
  • Containers: Docker, Kubernetes.
  • Version Control: Git, GitHub/GitLab.

Soft Skills & Other Requirements

  • Problem-solving: Ability to debug and optimize data workflows.
  • Teamwork: Collaborating with Data Scientists, Analysts, and DevOps.
