
ICT Business Intelligence Data Engineer

TN Italy

Calderara di Reno

On-site

EUR 40.000 - 80.000

Full time

15 days ago


Job description

An established industry player is seeking a skilled data engineer to join its ICT Business Demand & Applications team. This role involves building and maintaining data infrastructures that support business intelligence and analytics. You will design robust architectures for big data, develop ETL pipelines, and ensure data quality and security. If you are passionate about data and eager to contribute to impactful projects within a collaborative environment, this is the perfect opportunity for you.

Skills

  • 3+ years of experience in data-related roles such as data analyst or data engineer.
  • Fluent in English, both written and verbal.

Responsibilities

  • Design and implement data architectures for big data handling.
  • Develop data pipelines for ETL processes ensuring data quality.

Knowledge

Data Analysis
Data Engineering
ETL Processes
Analytical Thinking
Problem-Solving

Education

Bachelor's Degree in Computer Science
Master's Degree in Data Science

Tools

SQL Server
PostgreSQL
Azure Data Factory
Power BI
Amazon Redshift

Client: Datalogic

EU work permit required: Yes

Job Reference: 720e39790108

Posted: 04.05.2025

Expiry Date: 18.06.2025

Job Description

POSITION INFORMATION


Datalogic Group is looking for a candidate to join our ICT Business Demand & Applications team, playing a crucial role in enabling DL Functions to leverage their data assets effectively for decision-making, product development, and business growth.
The position will be based in Datalogic HQ in Lippo di Calderara (BO).

ROLE MISSION


Reporting directly to the Global BI Manager, the position is responsible for building and maintaining the data infrastructure that supports Business Intelligence and data analytics activities, and for designing, constructing, and maintaining the systems and architectures, such as databases and large-scale processing systems, that allow large volumes of data to be processed and stored.
He/She designs, builds, and maintains ETL data pipelines for data warehouses and data lakes; ensures compliance and security for sensitive data through governance policies; collaborates with solution architects and business analysts to create data models for reporting and analytics; implements data cleansing, normalization, and transformation processes to ensure data quality and consistency; and supports decisions on databases, data warehousing, and pipeline tools.


KEY RESPONSIBILITIES

  • Designs and implements efficient and robust architectures for handling big data, including choosing appropriate storage solutions such as data warehouses and data lakes.
  • Develops data pipelines to extract, transform, and load data from various sources into data storage systems, ensuring data consistency, quality, and availability.
  • Manages databases and data warehouses, ensuring their performance, scalability, and security. Optimizes queries, indexes, and data models for efficient data retrieval and storage.
  • Works closely with data scientists and analysts to provide the data needed for analysis and reporting, understanding the data requirements of different stakeholders and creating data sets that meet those needs.
  • Implements processes and tools to ensure data quality, integrity, and consistency including data validation, cleansing, and monitoring to identify and address data quality issues.
  • Collaborates with cross-functional teams, including data scientists, analysts, and software engineers, to understand their data needs and requirements.
  • Stays updated on the latest technologies and best practices in data engineering to continuously improve systems and processes.

QUALIFICATIONS/REQUIREMENTS


Education: Bachelor/Master’s Degree in Computer Science, Business Analytics, Information Technology, Data Science or a related field.
Language: Fluent English; written and verbal.
Experience: at least 3 years of experience in data-related roles such as data analyst, data scientist, or junior data engineer.

OTHER INFORMATION & SPECIFIC SKILLS


Knowledge of at least one of the following programming languages: SQL, Python, Java/Scala.
Knowledge of database technologies: SQL Server, Oracle, PostgreSQL; knowledge of NoSQL databases (MongoDB, Cassandra, Redis) would be a plus.
Familiarity with data warehousing concepts and technologies such as SAP HANA, Snowflake, Amazon Redshift, or Google BigQuery.
Experience with ETL (Extract, Transform, Load) tools such as Azure Data Factory, Talend or Microsoft SSIS.
Knowledge of BI tools such as Power BI, QlikView, Looker, or SAP BusinessObjects.
Experience with cloud services from providers like Azure or Google Cloud Platform (GCP), especially their data-related services (e.g. Azure Synapse Analytics).
Understanding of data modeling concepts and techniques.
Analytical Thinking.
Problem-Solving.
Effective communication and relational skills.
