
Senior Software Engineer – Data & APIs

Medium

Maldonado

Remote

EUR 70,000 - 100,000

Full-time

Today

Job description

A digital solutions provider is seeking a Senior Software Engineer to design and maintain high-performance analytical applications and data pipelines. The position requires expertise in Python, SQL, and distributed processing frameworks. Offering a 100% remote work format, this role provides an opportunity for autonomy, impact, and growth within a collaborative international team.

Benefits

Long-term commitment
Strong technical leadership
Collaborative environment
Clear growth path

Qualifications

  • 8+ years of experience in developing data-driven applications.
  • Hands-on experience with distributed data processing frameworks.
  • Advanced proficiency in Python, SQL, and AWS ecosystem.

Responsibilities

  • Design and develop high-performance analytical applications.
  • Optimize data workflows and ETL/ELT processes.
  • Contribute to the entire application development life cycle.

Skills

Proficient in Python
Strong SQL skills
Experience with Apache Spark
Knowledge of NoSQL databases
Familiarity with data modeling techniques
Version control (Git)
Fluent in English

Tools

Apache Iceberg
AWS services (S3, Redshift, etc.)
Data orchestration tools (Airflow, Prefect)
Containerization tools (Docker, Kubernetes)

Job Description
About Us

Coderio designs and delivers scalable digital solutions for global businesses. With a strong technical foundation and a product mindset, our teams lead complex software projects from architecture to execution. We value autonomy, clear communication, and technical excellence. We work closely with international teams and partners, building technology that makes a difference.

Learn more: http://coderio.com

In this role, as a Senior Software Engineer – Data & APIs, you will design, develop, and maintain high-performance analytical applications and services that power large-scale data pipelines and analytics platforms. Your main focus will be on building Python-based solutions for data ingestion, transformation, and modeling, ensuring efficiency, scalability, and reliability across distributed systems. You will work closely with Data Engineers, Analysts, and cross-functional teams to optimize data workflows, improve ETL/ELT processes, and deliver high-quality datasets that enable advanced analytics and business insights within a cloud-based environment.
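
As a minimal sketch of this kind of pipeline step, assuming a Spark-on-S3 setup with an Apache Iceberg catalog (the bucket, catalog, namespace, and table names below are invented for illustration and are not Coderio's actual infrastructure), a Python batch job might ingest raw JSON events, clean them, and append them to an Iceberg table:

```python
# Illustrative only: all names are hypothetical; the Iceberg Spark runtime
# and an S3-capable Hadoop filesystem are assumed to be on the classpath.
from pyspark.sql import SparkSession, functions as F

# Spark session configured with a Hadoop-type Iceberg catalog backed by S3.
spark = (
    SparkSession.builder
    .appName("daily-events-batch")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3a://example-bucket/warehouse")
    .getOrCreate()
)

# Ingest raw, semi-structured events for one batch window.
raw = spark.read.json("s3a://example-bucket/raw/events/2024-01-01/")

# Basic cleansing and light modeling: deduplicate, derive a date column,
# and drop records without a user identifier.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("user_id").isNotNull())
)

# Append to an Iceberg table that is assumed to already exist.
clean.writeTo("lake.analytics.fact_events").append()
```

A job like this would typically be scheduled by an orchestrator such as Airflow or Prefect and run on EMR or a similar managed cluster, in line with the tooling listed in this posting.
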

Responsibilities
  • Contribute to all phases of the analytical application development life cycle, from design to deployment.
  • Design, develop, and deliver high-volume data analytics applications with a focus on performance and scalability.
  • Write well-structured, testable, and efficient code that aligns with technical and business requirements.
  • Ensure all solutions comply with design specifications and best engineering practices.
  • Support continuous improvement by researching emerging technologies, evaluating alternatives, and presenting recommendations for architectural review.
Requirements
  • 8+ years of proven experience developing analytical or data-driven applications.
  • 8+ years of strong proficiency in Python and SQL, with hands-on experience with DBMSs and Apache Iceberg.
  • 8+ years of expertise working with distributed data processing frameworks such as Apache Spark, Hadoop, Hive, or Presto.
  • Deep understanding of data modeling techniques (star and snowflake schemas) and data cleansing/manipulation processes (see the illustrative query after this list).
  • Good knowledge of NoSQL databases and handling semi-structured/unstructured data.
  • Experience working within the AWS ecosystem (S3, Redshift, RDS, SQS, Athena, Glue, CloudWatch, EMR, Lambda, or similar).
  • Experience with ETL/ELT batch processing workflows.
  • Proficiency with version control tools (GitHub or similar).
  • Advanced or fluent level of English (written and spoken).
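
As a purely illustrative example of working against a star-schema model (the schema, fact and dimension tables, and columns below are hypothetical and assumed to already exist in the metastore), an analytical query of the kind this role produces might look like:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-example").getOrCreate()

# Join a fact table to two dimension tables and aggregate daily revenue per country.
daily_revenue = spark.sql("""
    SELECT d.calendar_date,
           c.country,
           SUM(f.amount) AS revenue
    FROM   analytics.fact_orders   f
    JOIN   analytics.dim_date      d ON f.date_key     = d.date_key
    JOIN   analytics.dim_customer  c ON f.customer_key = c.customer_key
    GROUP BY d.calendar_date, c.country
""")

daily_revenue.show()
```
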
Nice to Have
  • Experience in data architecture or pipeline optimization.
  • Familiarity with containerization (Docker, Kubernetes).
  • Exposure to Agile/Scrum methodologies.
  • Knowledge of data orchestration tools (Airflow, Prefect).
  • Understanding of data lakehouse architectures.
Benefits
  • 100% remote: long-term commitment with autonomy and impact
  • Strategic and high-visibility role in a modern engineering culture
  • Collaborative international team and strong technical leadership
  • Clear path to growth and leadership within Coderio
Why join Coderio?

At Coderio, we value talent regardless of location. We are a remote-first company, passionate about technology, collaborative work, and fair compensation.

We offer an inclusive, challenging environment with real opportunities for growth.

If you are motivated to build solutions with impact, we would love to hear from you.

Apply now.
