Senior Data Engineer (AWS)

Lugo
EUR 40,000 - 60,000
Job Description

Our Company

At Kynetec, we're proud to be at the forefront of the intersection between agriculture, sustainability, and animal health. We're redefining our industry with unparalleled insights and leading technology, while pursuing an ambitious growth plan to influence everything from the food on our plates to the health of livestock and the care of pets at home.

We owe our success to our industry experts who drive our reputation as a global leader. Their innovative ideas and expertise have helped us reach new heights. From seasoned insights specialists and client leaders to innovative tech geniuses, what connects us is a shared passion for Agriculture and Animal Health! We don’t settle for “business as usual”.

Each day, we are transforming our industry and improving lives worldwide. If you’re looking for a company that challenges the norm and fosters innovation, Kynetec is the place for you.

The Role

We are seeking a Data Engineer with expert-level Python and SQL skills. The ideal candidate will have experience designing and building cloud-based data solutions in AWS (preferably with Solution Architect expertise), Databricks, and RDS / Postgres. You will create and optimize scalable data pipelines, ensure data infrastructure integrity and performance, and collaborate with cross-functional teams to drive insights.

Responsibilities

  1. Data Pipeline Development
     • Design, develop, and maintain robust ETL/ELT pipelines using Databricks and AWS services (Glue, Lambda, S3, EMR).
     • Write efficient, maintainable Python code for data ingestion, transformation, and automation.
  2. Database & SQL Mastery
     • Perform complex queries, data transformations, and performance tuning on RDS/Postgres using advanced SQL.
     • Create and optimize database structures to support analytics and reporting.
  3. AWS Architecture & Infrastructure
     • Architect secure, scalable, high-performing cloud data environments using AWS best practices (EC2, S3, Lambda, IAM, VPC).
     • Provide Solution Architect-level guidance, ensuring alignment with cost, security, and reliability best practices.
  4. Performance Optimization & Troubleshooting
     • Monitor data pipelines and databases; diagnose and resolve bottlenecks.
  5. Collaboration & Stakeholder Management
     • Work with data scientists, analysts, and stakeholders to gather requirements and deliver solutions.
     • Communicate progress and technical details effectively.
  6. Data Governance & Security
     • Implement data governance protocols, ensuring compliance with industry regulations (GDPR, HIPAA).
     • Maintain security best practices for data access, encryption, backup, and disaster recovery.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Expert Python skills for data engineering tasks.
  • Advanced SQL skills, including performance tuning and complex queries.
  • Hands-on experience with Databricks and AWS services.
  • Proven RDS/Postgres administration skills.
  • Familiarity with version control (Git) and CI/CD pipelines.
  • Strong problem-solving, communication, and collaboration skills.
  • Ability to work independently and manage multiple projects.

Preferred

  • AWS certifications (e.g., Solutions Architect).
  • Experience with Snowflake and BI tools.
  • Knowledge of infrastructure-as-code and containerization.
  • Moderate Java knowledge for specific use cases.