Summary:
Responsible for delivering data-driven solutions across project and change initiatives by managing the full lifecycle from requirements to implementation. This role involves building scalable data pipelines, deploying predictive models, and applying machine learning and statistical techniques to extract actionable insights from complex datasets. The engineer will also visualize insights through dashboards and support user adoption via training and workshops.
Essential Functions:
- Develop and maintain real-time data pipelines for ingestion, cleaning, and transformation, ensuring data accuracy through quality checks.
- Integrate data from multiple logistics systems and sources (CargoWise, TMS, WMS, TOS, IoT devices, non-RDBMS data stores) and apply statistical models to generate predictive insights.
- Build, deploy, and optimize machine learning models (TensorFlow, PyTorch, Scikit-learn) via REST APIs, microservices, or cloud platforms.
- Collaborate with stakeholders to gather data requirements, present insights through dashboards, and support software testing, upgrades, and rollouts.
- Assist with enhancements, service requests, and user training, and work cross-functionally on the end-to-end implementation of data-driven projects.
Information Security Management System (ISMS):
- Safeguard Gulftainer’s information, ensuring data security, compliance, and system reliability.
- Monitor IT compliance, support awareness initiatives, and assist in security risk assessments.
- Participate in security audits, remediation, and contribute to business continuity and disaster recovery.
- Adhere to IT policies, security standards, and report incidents promptly as directed.
Quality, Health and Safety, and Environment:
- Enforce health, safety, and security protocols to protect employees and the public.
- Ensure compliance with laws and industry standards, driving continuous HSSE improvements.
- Demonstrate strong leadership and commitment to HSSE objectives, providing the necessary resources.
Qualifications:
Educational Background
Graduate or postgraduate degree (or equivalent) in Data Science, Computer Science, or a related field.
Related Work Experience
3-5 years of relevant experience, preferably in Ports & Logistics, Supply Chain, or similar enterprise applications.
Technical Qualification / Certification
- Programming & Machine Learning: Python (NumPy, Pandas, Scikit-learn), Scala, Java, TensorFlow, PyTorch; regression, classification, and clustering
- Data Engineering & Cloud: Spark, Hadoop, Kafka, PostgreSQL, NoSQL, AWS, Azure (Synapse, Data Factory), Docker, Kubernetes, CI/CD
- API Development & Big Data: RESTful APIs, GraphQL, WebSockets, SOAP, HTTP; data lakes, data warehouses, real-time analytics
- Data Analysis & Statistical Techniques: Exploratory Data Analysis (EDA), Feature Engineering, Data Visualization, Hypothesis Testing (t-test, chi-square)
Technical/Functional Competency
- Proficient in software applications, programming, and cloud platforms (Azure, AWS, Google Cloud), including Microsoft Azure services (Synapse, Data Factory, Databricks, Azure ML, Power BI)
- Experienced with Power BI, Tableau, Crystal Reports, Qlik Sense, and Looker, as well as version control and project tracking tools (Git, SVN, JIRA)
- Strong analytical skills, with the ability to write detailed technical reports and work with defect management tools (Redmine, HP Quality Center, Bugzilla)
- Experienced in Agile methodologies for managing projects and change initiatives, with a solid understanding of databases (Oracle, Microsoft SQL Server)