Job Purpose
To prepare and analyze data for the development of dashboard visualizations and corporate reports, in line with company procedures and requirements. This role demands strong analytical skills, proficiency in data tools, and the ability to deliver reports that support effective decision-making at MODENA.
Key Accountability Areas
- Design Database Structure
  - Develop optimal database structures to support the Data Warehouse.
- Perform ETL & ELT Processes
  - Extract data from various sources.
  - Transform data to meet analytical requirements.
  - Load data into the prepared Data Warehouse.
  - Maintain and schedule ETL processes using Apache Airflow (a scheduling sketch follows this list).
- Data Cleansing
  - Perform data cleansing to ensure integrity and consistency (a cleansing sketch also follows this list).
- Data Analysis
  - Conduct data analysis using statistical and business intelligence techniques.
  - Deliver insights to support strategic business initiatives.
- Dashboard & Reporting
  - Develop interactive dashboards for business stakeholders.
  - Generate reports tailored to business needs.
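
To give candidates a concrete picture of the pipelines this role maintains, here is a minimal scheduling sketch, assuming Apache Airflow 2.4+ and its TaskFlow API; the DAG name, schedule, and sample data are hypothetical and only illustrate the extract-transform-load shape described above.

```python
# Minimal TaskFlow-style DAG sketch (assumes Apache Airflow 2.4+).
# Source systems, field names, and the daily schedule are illustrative only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_elt_pipeline():
    @task
    def extract():
        # Pull raw rows from a source system (API, file drop, OLTP replica).
        return [{"order_id": 1, "amount": "120.50"}]

    @task
    def transform(rows):
        # Cast types and apply business rules before loading.
        return [{**r, "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows):
        # Write the prepared rows into the warehouse staging schema.
        print(f"Loading {len(rows)} rows")

    # Chain the tasks: extract -> transform -> load.
    load(transform(extract()))


example_elt_pipeline()
```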
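
The cleansing duty can be pictured the same way. The sketch below assumes pandas, which the posting does not mandate; the column names and rules are hypothetical stand-ins for real integrity checks.

```python
# Illustrative cleansing pass with pandas (an assumed, not required, library).
import pandas as pd


def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates(subset=["order_id"])       # drop duplicate keys
    df = df.dropna(subset=["order_id", "order_date"])  # required fields must be present
    # Coerce bad values to NaT/NaN instead of failing, then handle them explicitly.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    return df.reset_index(drop=True)
```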
Freedom to Act
- Collaborate with key users and cross-functional teams to build and optimize the Data Warehouse and analytics solutions.
- Maintain data quality, consistency, and security.
Qualification Requirements
- Education: Bachelor’s degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field.
- Experience:
  - Minimum of 2 years of experience as a Data Analyst or Data Engineer.
  - Proven experience with SQL (Snowflake, PostgreSQL, or equivalent).
  - Hands-on experience building ETL/ELT data pipelines, particularly with Apache Airflow.
- Mandatory Skills:
  - Proficiency in Python scripting.
  - Advanced SQL skills: query optimization, joins, and aggregations (a representative sketch appears at the end of this section).
  - Strong knowledge of data cleansing and data validation practices.
  - Experience with data visualization tools (Power BI, Tableau, or equivalent).
  - Strong quantitative and analytical capabilities.
- Preferred / Nice to Have:
  - Understanding of cloud infrastructure (Azure, AWS, GCP) or data infrastructure (Data Lake, Data Warehouse, Data Mart).
  - Familiarity with modern data stack technologies (Snowflake, Kafka, Airflow, dbt, etc.).
  - Knowledge of additional scripting languages (R, shell).
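
As a yardstick for the advanced SQL expectation above, the self-contained sketch below shows a representative join plus aggregation. It uses Python's built-in sqlite3 purely so it runs anywhere; the role itself targets Snowflake or PostgreSQL, and the tables and figures are made up.

```python
# Representative join + aggregation, run via Python's stdlib sqlite3 for
# portability (the role itself targets Snowflake/PostgreSQL). All data is fake.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 120.5), (2, 10, 80.0), (3, 11, 45.0);
    INSERT INTO customers VALUES (10, 'EMEA'), (11, 'APAC');
""")

# Revenue and order count per region, highest revenue first.
query = """
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.amount) AS revenue
    FROM orders AS o
    JOIN customers AS c ON c.customer_id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC;
"""
for region, n_orders, revenue in conn.execute(query):
    print(region, n_orders, revenue)  # EMEA 2 200.5, then APAC 1 45.0
```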