
A technology solutions company based in Rome is looking for a skilled professional experienced in designing scalable data architectures and managing ETL processes. You should have advanced skills in cloud technologies such as AWS and Azure, proficiency in programming languages such as Python and Java, and experience with technologies like Kafka and Databricks. The position offers opportunities for professional development and a flexible work-life balance.
We are a company that helps others grow through our technological expertise, solving complex technological challenges with a focus on operational excellence. We are part of Alkemy, an international company specializing in evolving the business models of large and medium-sized enterprises, and a public company listed on the MTA STAR market of Borsa Italiana.
The work focuses on designing and building scalable architectures for Data Lakes and Data Warehouses to centralize data and ensure its quality and accessibility. It involves implementing and managing ETL processes to extract, transform, and load data from multiple sources, including platforms like Sitetracker, SAP, and Smartmeters. Advanced techniques are applied to transform and enrich data, ensuring alignment with business requirements.

The infrastructure is primarily cloud-based, using AWS, Azure, and Databricks for distributed data processing. Data security measures such as encryption, access controls, and monitoring protect sensitive information. The role also covers managing relational and non-relational databases (e.g., SQL Server, Couchbase, MongoDB).

Real-time data processing is implemented with technologies like Kafka Streams, Apache Flink, and ksqlDB, and event streaming platforms such as Kafka and Confluent are designed to enable real-time data integration. APIs are developed to facilitate efficient data flow between systems, while a microservices architecture built on frameworks like Spring Boot ensures scalability. Monitoring systems like Dynatrace and Grafana track the health and performance of the platforms, with error-handling mechanisms such as dead-letter queues (DLQs) ensuring data reliability. Data governance policies are defined to maintain data quality, integrity, and compliance with regulations.
In this company, you can be yourself and work with colleagues who are equally passionate about technology and innovation, ready to support you at all times. We invest in your professional development with training, tech breakfasts, and attendance at industry forums. We are advocates of continuous learning, which is why we provide a personal budget for staying up to date with market trends. We care about well-being and happiness at work, offering flexible working conditions and a balanced work-life environment. The core values of Excellence, Passion, Integrity, and Concreteness guide us in achieving our mission.
Suitable profiles will be contacted within 20 days. Applications without a complete CV attached will not be considered. Candidate data will be processed in accordance with the Privacy Notice, which is always available on our website.