
Data Engineer Backend Developer

Coderio | Software Company

La Plata

Remote

ARS 85.785.000 - 114.381.000

Full-time

Today

Vacancy Description

A leading software solutions provider in Buenos Aires is seeking a Data Engineer Backend Developer. The role involves designing and maintaining analytical applications using Python and SQL. Ideal candidates will have over 5 years of experience in data-driven applications, strong skills in distributed processing frameworks, and familiarity with AWS. This position offers a 100% remote work environment and a clear growth path in a collaborative team.

Benefits

100% remote
Long-term commitment
Collaborative international team
Growth opportunities

Requirements

  • 5+ years of proven experience developing analytical or data-driven applications.
  • Strong proficiency in Python and SQL, with hands-on experience in DBMS.
  • Expertise working with distributed data processing frameworks.

Responsibilities

  • Contribute to all phases of the analytical application development life cycle.
  • Design, develop, and deliver high-volume data analytics applications.
  • Write efficient code that aligns with technical and business requirements.

Skills

Python
SQL
Apache Spark
Data modeling
NoSQL databases
ETL/ELT
GitHub
English (advanced)

Tools

DBMS
Apache Iceberg
Hadoop
AWS
Docker
Kubernetes

Job Description
About Coderio

Coderio designs and delivers scalable digital solutions for global businesses. With a strong technical foundation and a product mindset, our teams lead complex software projects from architecture to execution. We value autonomy, clear communication, and technical excellence. We work closely with international teams and partners, building technology that makes a difference.

In this role, as a Data Engineer Backend Developer, you will design, develop, and maintain high-performance analytical applications and services that power large-scale data pipelines and analytics platforms. Your main focus will be on building Python-based solutions for data ingestion, transformation, and modeling, ensuring efficiency, scalability, and reliability across distributed systems. You will work closely with Data Engineers, Analysts, and cross-functional teams to optimize data workflows, improve ETL/ELT processes, and deliver high-quality datasets that enable advanced analytics and business insights within a cloud-based environment.

Responsibilities
  • Contribute to all phases of the analytical application development life cycle, from design to deployment.
  • Design, develop, and deliver high-volume data analytics applications with a focus on performance and scalability.
  • Write well-structured, testable, and efficient code that aligns with technical and business requirements.
  • Ensure all solutions comply with design specifications and best engineering practices.
  • Support continuous improvement by researching emerging technologies, evaluating alternatives, and presenting recommendations for architectural review.
Requirements
  • 5+ years of proven experience developing analytical or data-driven applications.
  • 5+ years of strong proficiency in Python and SQL, with hands-on experience in DBMS and Apache Iceberg.
  • 5+ years of expertise working with distributed data processing frameworks such as Apache Spark, Hadoop, Hive, or Presto.
  • Deep understanding of data modeling techniques (Star schema, Snowflake) and data cleansing/manipulation processes.
  • Good knowledge of NoSQL databases and handling semi-structured/unstructured data.
  • Experience working within the AWS ecosystem (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar).
  • Experience with ETL/ELT batch processing workflows.
  • Proficiency with version control tools (GitHub or similar).
  • Advanced or fluent level of English (written and spoken).
Nice to Have
  • Experience in data architecture or pipeline optimization.
  • Familiarity with containerization (Docker, Kubernetes).
  • Exposure to Agile/Scrum methodologies.
  • Knowledge of data orchestration tools (Airflow, Prefect).
  • Understanding of data lakehouse architectures.
Benefits
  • 100% remote – long-term commitment, with autonomy and impact
  • Strategic and high-visibility role in a modern engineering culture
  • Collaborative international team and strong technical leadership
  • Clear path to growth and leadership within Coderio

If you are motivated to build solutions with impact, we would love to hear from you. Apply now.
