

Business Intelligence Analyst

Datalogic

Calderara di Reno

On-site

EUR 35,000 - 55,000

Full-time

Today


Job Description

A technology firm in Emilia-Romagna is seeking a BI Data Engineer to build and maintain data infrastructure for analytics. The role involves designing data architectures, developing ETL pipelines, and managing databases for optimal performance and security. Candidates should have a degree in Computer Science or a related field and experience in data engineering, along with knowledge of SQL and cloud services. Strong communication skills and the ability to work collaboratively are essential.

Skills

  • Must hold a Bachelor's or Master's Degree in Computer Science or related field.
  • Fluent in Italian and English with strong communication skills.
  • At least 2 years in a data engineering role.

Responsibilities

  • Design robust architectures for big data handling.
  • Develop ETL pipelines to ensure data quality.
  • Manage databases for performance and security.

Knowledge

SQL
Python
Java / Scala
Analytical Thinking
Collaboration
ETL Tools

Education

Bachelor's or Master's Degree in Computer Science or a related field

Tools

SQL Server
Oracle
PostgreSQL
Azure Data Factory
Power BI

Job Description

The BI Data Engineer is responsible for building and maintaining the data infrastructure that supports Business Intelligence and data analytics activities: designing, constructing, and maintaining the systems and architectures, such as databases and large-scale processing systems, that allow large volumes of data to be processed and stored.

Key Responsibilities
  • Designs and implements efficient and robust architectures for handling big data. This includes choosing appropriate storage solutions, data warehouses, and data lakes.
  • Develops data pipelines to extract, transform, and load data from various sources into the data storage systems. These pipelines ensure data consistency, quality, and availability (a minimal pipeline sketch follows this list).
  • Manages databases and data warehouses, ensuring their performance, scalability, and security. Optimizes queries, indexes, and data models for efficient data retrieval and storage.
  • Works closely with data scientists and analysts to provide them with the data they need for analysis and reporting. This involves understanding the data requirements of different stakeholders and creating data sets that meet those needs.
  • Implements processes and tools to ensure data quality, integrity, and consistency. This includes data validation, cleansing, and monitoring to identify and address data quality issues.
  • Collaborates with cross-functional teams, including data scientists, analysts, and software engineers, to understand their data needs and requirements. Effective communication skills are essential for translating these requirements into technical solutions.
  • Stays updated on the latest technologies and best practices in data engineering to continuously improve systems and processes. This involves experimenting with new tools and techniques to enhance efficiency, scalability, and reliability.
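
As an illustration of the pipeline responsibility above, here is a minimal extract-transform-load sketch in Python: it reads a hypothetical CSV export, drops rows that fail a basic quality check, and loads the result into a local SQLite staging table. The file, table, and column names are assumptions made for illustration; a pipeline in this role would typically target the databases and ETL tools listed below (e.g. SQL Server, Oracle, PostgreSQL, Azure Data Factory) rather than SQLite.

    # Minimal, illustrative ETL sketch; source file and staging table are hypothetical.
    import csv
    import sqlite3

    def extract(path):
        """Read raw rows from a CSV source file."""
        with open(path, newline="", encoding="utf-8") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Keep rows with a valid numeric amount and normalize the region name."""
        clean = []
        for row in rows:
            try:
                amount = float(row["amount"])
            except (KeyError, ValueError):
                continue  # drop rows that fail the basic quality check
            clean.append((row.get("region", "").strip().upper(), amount))
        return clean

    def load(records, db_path="warehouse.db"):
        """Write the cleaned records into a SQLite staging table."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS sales_staging (region TEXT, amount REAL)")
        con.executemany("INSERT INTO sales_staging VALUES (?, ?)", records)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("sales_export.csv")))  # hypothetical export file
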
Requirements
  • Education (School / Specialization): Bachelor's or Master's Degree in Computer Science, Business Analytics, Information Technology, Data Science, or a related field is required.
  • Languages: Italian and fluent English, with strong written and verbal communication skills.
  • Experience: at least 2 years in a similar role; at least 1 year of prior experience in data-related roles such as data analyst, data scientist, or junior data engineer is beneficial.
Other Information and Specific Skills
  • Knowledge of at least one of the following Programming Languages: SQL, Python, Java / Scala
  • Knowledge of Database Technologies: Relational Databases (SQL Server, Oracle, PostgreSQL); knowledge of NoSQL Databases (MongoDB, Cassandra, Redis) is a plus.
  • Familiarity with data warehousing concepts and technologies such as SAP HANA, Snowflake, Amazon Redshift, or Google BigQuery.
  • Experience with ETL (Extract, Transform, Load) tools such as Azure Data Factory, Talend or Microsoft SSIS.
  • Knowledge of BI tools such as Power BI, QlikView, Looker, or SAP BusinessObjects.
  • Experience with cloud services from providers like Azure or Google Cloud Platform (GCP), especially their data-related services (e.g. Azure Synapse Analytics).
  • Understanding of data modeling concepts and techniques.
  • Analytical Thinking / Problem Solving: ability to analyze and interpret complex data sets.
  • Collaboration: ability to work collaboratively with other teams, such as data scientists, analysts, and business stakeholders.
  • Stay Current: keep up-to-date with the latest trends and technologies in the BI and data engineering field through blogs, forums, webinars, and conferences.