Summary:
The BI Developer will be responsible for designing, developing, and maintaining scalable and robust BI solutions that transform complex data into clear, actionable insights. Hands-on experience with Power BI, Azure Synapse, Azure Data Factory, Databricks, and Microsoft Fabric is required.
The role involves developing scalable data pipelines, harmonizing data, optimizing queries, results, and storage, deploying machine learning models, and leveraging predictive analytics to enhance operational efficiency. It requires a strong understanding of data warehousing, ETL processes, data modeling, and data visualization, along with excellent analytical and communication skills.
Duties and Responsibilities:
- Develop and maintain data pipelines for real-time data ingestion, cleaning, and preprocessing, wrangling raw data into usable formats for analysis.
- Implement data quality checks and monitoring to ensure accuracy and consistency.
- Work with heterogeneous systems such as CargoWise, transport management systems, warehousing systems, terminal operating systems (TOS), IoT platforms, and non-RDBMS sources for data integration.
- Analyze large datasets to extract meaningful patterns and insights. Develop and implement statistical models for predictive and prescriptive analytics, including but not limited to cargo tracking, demand forecasting, inventory management, and operational analytics.
- Collaborate with internal and external stakeholders to understand data requirements, build scalable solutions, and present insights effectively as dashboards and reports.
- Review software design, change specifications, and plans against contractual and/or process requirements.
- Verify reporting platform upgrades and patches before production rollouts.
- Develop and verify unit, QA, and UAT criteria and test cases to ensure software quality in accordance with project, process, and contract requirements and objectives.
- Support enhancement requests, business operational requests, and service request fulfillment.
- Support UAT with Business Users on projects/change initiatives.
- Work closely with cross-functional teams to collaborate on the design and implementation of data-related projects.
Information Security Management System (ISMS):
- Safeguard Gulftainer information as per information protection requirements.
- Ensure data security, compliance, and system reliability in all implementations.
- Perform IT compliance monitoring and improvement activities to ensure compliance with internal security policies as well as applicable laws and regulations.
- Participate in and support information security awareness, training, and educational activities.
- Participate in and support information security risk assessments and controls selection activities.
- Participate in and support information security audits and remediation.
- Participate in and support contingency planning, business continuity management, and IT disaster recovery activities in conjunction with relevant functions and third parties.
- Comply with IT policies (governance and IT security) and with all applicable system and security controls, clauses, and laws.
- Report information security incidents in a timely manner and act per management direction.
Qualifications:
Educational Background
Bachelor’s or Master’s in Computer Science, Data Science, or a related field.
Technical Qualification / Certification
- Programming – Power BI, Java, Python
- Cloud & DevOps – AWS, Azure (Synapse, Data Factory, Databricks), GCP, Docker, Kubernetes, CI/CD pipelines
- Data Engineering – Oracle, MySQL, NoSQL, Azure Data Factory, Microsoft Fabric
- Preferred – TensorFlow, PyTorch, scikit-learn, regression, classification, clustering, Kafka
Related Work Experience
- Minimum of 3–5 years of experience, preferably in Ports & Logistics, Supply Chain, and/or similar enterprise applications.
- Experience with CargoWise, transport management, warehousing, terminal operating systems (TOS), and IoT is a plus.
Technical/Functional Competency
- Good technical report-writing skills.
- Experience using Agile methodology to manage projects or change initiatives.