Senior Data Engineer

INNOCV Solutions S.L.U.

Pordenone

On-site

EUR 45,000 - 70,000

Full-time

3 days ago

Job description

A technology-focused company is seeking a Senior Data Engineer to design and build scalable architectures for Data Lakes and Data Warehouses. You will be responsible for managing ETL processes and databases, using cutting-edge technologies such as AWS and Azure. In a stimulating work environment, you will work with modern technologies, ensuring data efficiency and quality and contributing to the ongoing digital evolution of our clients.

Benefits

Training and professional development
Flexibility and work-life balance
Passionate colleagues and ongoing support
A personal budget for keeping up with market trends

Skills

  • Experience with ETL tools and techniques.
  • Advanced knowledge in cloud computing, especially AWS, Azure, and DataBricks.
  • Design and modeling of relational and non-relational databases.

Responsibilities

  • Design and build scalable architectures for Data Lakes and Data Warehouses.
  • Implement and manage ETL processes; manage both relational and non-relational databases.
  • Develop APIs and integrate data across systems.

Knowledge

ETL tools
Cloud computing
Data governance
Programming languages (Python, Java, Scala)
API development
CI/CD

Job description

We are a company that specializes in helping others grow through our technological expertise, solving complex technological challenges and focusing on operational excellence.

Since 2012, we have grown annually at an average rate of 30%, and for three consecutive years, we have been one of the fastest-growing companies in Europe according to the Financial Times.

We are part of Alkemy, an international company which specializes in evolving the business model of large and medium-sized enterprises and is a public company listed on the MTA STAR market of Borsa Italiana.

  • We are digital natives and technology-neutral thanks to our rapid adaptation capabilities and problem-solving skills in any business situation.
  • We are passionate about technology challenges that have a significant impact on business. To address these, we have high-performance teams and the ability to quickly and effectively combine technologies.
  • We support our clients in their digital evolution to achieve maximum efficiency in their businesses, leveraging our extensive multi-sector experience.

We invite you to get to know us in depth by exploring the various sections of our website.

What will you do?

The work focuses on designing and building scalable architectures for Data Lakes and Data Warehouses to centralize and ensure the quality and accessibility of data. It involves implementing and managing ETL processes to extract, transform, and load data from multiple sources, including platforms like Sitetracker, SAP, and Smartmeters. Advanced techniques are applied to transform and enrich data, ensuring alignment with business requirements. The infrastructure is primarily cloud-based, utilizing AWS, Azure, and DataBricks for distributed data processing. Data security measures such as encryption, access controls, and threat monitoring are implemented to protect sensitive information.
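As a rough, illustrative sketch of the kind of ETL step this involves (the storage path, column names, and target table below are assumptions made for the example, not details taken from this posting), a DataBricks job written in PySpark might extract raw exports, clean them, and load them into a curated Delta table:

    # Minimal PySpark ETL sketch (hypothetical paths, columns, and table names).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: read raw daily exports from cloud object storage.
    raw = spark.read.option("header", True).csv("s3://example-bucket/raw/smartmeters/")

    # Transform: cast types, drop invalid rows, add a load timestamp.
    curated = (
        raw.withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
           .filter(F.col("reading_kwh").isNotNull())
           .withColumn("ingested_at", F.current_timestamp())
    )

    # Load: append to a Delta table so downstream consumers see consistent data.
    curated.write.format("delta").mode("append").saveAsTable("analytics.smartmeter_readings")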

Additionally, the work encompasses managing both relational and non-relational databases (e.g., SQL Server, Couchbase, MongoDB) and implementing real-time data processing technologies like Kafka Streams, Apache Flink, and ksqlDB. Event streaming platforms, such as Kafka and Confluent, are designed to enable real-time data integration. APIs are developed to facilitate efficient data flow between systems, while microservices architecture using frameworks like Spring Boot ensures scalability. Monitoring systems like Dynatrace and Grafana are set up to track the health and performance of platforms, with error-handling mechanisms, such as DLQ, ensuring data reliability. Data governance policies are defined to maintain data quality, integrity, and compliance with regulations.
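To make the dead-letter-queue (DLQ) pattern mentioned above concrete, the sketch below shows one common way it can be wired into a consumer. It uses the confluent-kafka Python client, and the broker address, topic names, and processing step are hypothetical examples rather than details from this role:

    # Hypothetical DLQ wiring with the confluent-kafka Python client:
    # records that fail processing are forwarded to a dead-letter topic
    # instead of being dropped or blocking the stream.
    import json
    from confluent_kafka import Consumer, Producer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # assumed broker address
        "group.id": "readings-processor",
        "auto.offset.reset": "earliest",
    })
    producer = Producer({"bootstrap.servers": "localhost:9092"})
    consumer.subscribe(["meter-readings"])  # assumed source topic

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            try:
                event = json.loads(msg.value())
                # ... apply the business transformation / enrichment here ...
            except Exception:
                # Route the bad record to the DLQ for later inspection or replay.
                producer.produce("meter-readings.dlq", msg.value())
                producer.flush()
    finally:
        consumer.close()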

What do we value?

  • Experience with ETL tools and techniques.
  • Advanced knowledge in cloud computing, especially AWS, Azure, and DataBricks.
  • Design and modeling of relational and non-relational databases.
  • Proficiency in programming languages such as Python, Java, and Scala.
  • Experience with Kafka Streams, Apache Flink, and ksqlDB.
  • Knowledge of data governance principles and practices.
  • Familiarity with DevOps tools and CI/CD.
  • Development and integration of APIs.

If you are someone who likes to stay updated on the latest trends, is eager to try new technologies, and enjoys finding alternative ways to do things, this is the place for you. If you consider yourself a team player who advocates for Fair Play above all... a team is waiting for you!

What do we offer you?

In this company, you can be yourself and work with colleagues equally passionate about technology and innovation, ready to support you at all times. We want to see you evolve and grow with us. To that end, we invest in your professional development with training, tech breakfasts, and attendance at forums. We are advocates of continuous learning, which is why we provide a personal budget for staying up to date with market trends.

We care about well-being and happiness at work, which is why we listen to your needs in order to offer the best working conditions in a unique human environment, with a strong emphasis on flexibility and work-life balance.

Excellence, Passion, Integrity, and Concreteness are Alkemy’s core values, which we all share and that guide us in achieving our mission. These values define our identity and form the foundation of our organizational culture, shaping our interactions with all our people and clients and progressively building our business approach.

Suitable profiles will be contacted within 20 days. Applications without a complete CV attached will not be considered. Candidate data will be processed in accordance with the Privacy Notice, which is always available on our website.
