Overview
Gentco Logistics is a leading international liquid chemical logistics provider with three core business units: chemical tankers (5th largest fleet globally), chemical tank containers (2nd largest fleet globally), and terminal storage tanks. Our global headquarters are in Shanghai, China, supported by a regional headquarters in Singapore and 40 operational offices across six continents. Gentco Logistics is wholly owned by the Junzheng Group (ticker 601216.SH).
The Opportunity
The Data Engineer position is a part of the Global IT Department, which is directly overseen by the Group’s CIO. Our team currently comprises over 40 IT, software, and data professionals primarily based in Shanghai, Moerdijk (the Netherlands), and Singapore. With our IT presence in Singapore rapidly expanding, we are excited to invite you to join our team!
Responsibilities
- Assist the senior data engineer in the implementation and maintenance of critical components of our ELT pipelines and data warehouse on Azure Databricks (PySpark, Spark SQL).
- Assist in the design and architecting of ELT pipelines, data schemas/models, and data marts for business users.
- Participate in code reviews and contribute to CI/CD pipelines for automated testing and deployment of data solutions.
- Occasionally assist data analysts in creating new semantic models and/or dashboards in Power BI.
- Ensure the reliability and availability of core data infrastructure with proper monitoring and alerting systems to guarantee continuous data delivery to the business.
- Take ownership of data quality and governance. Collaborate with business and IT teams to establish and meet data standards, including developing automated data checks and data quality monitoring dashboards.
- Promote a culture of organizational learning by facilitating knowledge sharing and implementing best practices in documentation.
- Stay updated with current industry best practices and trends, and evaluate technical decisions at a business level.
Requirements
- Bachelor's or Master's degree in Computer Science or a related engineering field from a technical university.
- Experience in data engineering with working knowledge of Python and SQL.
- Experience with Agile/DevOps tools (Git, Jira).
- Deep knowledge of data warehousing best practices, especially in the Azure cloud environment, with a strong commitment to staying current with industry trends and future advancements.
- Highly motivated, with a proactive approach to project ownership and accountability for outcomes and timelines. Committed to continuous personal growth and development within the organization.
- Fluency in both written and spoken English and Mandarin is highly desirable to effectively liaise with Chinese-speaking clients and counterparts, understanding their technical requirements.
Desirable skills
- Experience with public cloud infrastructure (Azure preferred).
- Exposure to Databricks, Snowflake, or similar big data processing platforms.
- Background in the logistics, shipping/maritime, or chemicals industries.
- Experience with Power BI or similar BI tools.