Would you like to join the leading international intergovernmental organization?
The Centre for Maritime Research and Experimentation (CMRE) is an established, world-class scientific research and experimentation facility that organizes and conducts scientific research and technology development centred on the maritime domain. It delivers innovative and field-tested science and technology (S&T) solutions to address the defence and security needs of the Alliance.
Responsibilities:
- Copy and transfer data across different storage systems, networks and locations efficiently and securely, to support experimentation campaigns (at sea or on shore)
- Configure Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) to align storage access with organizational needs
- Collect statistics and KPIs and monitor the status and availability of data management systems, addressing any issues related to storage, transfer and access
- Participate in Agile ceremonies including sprint planning, daily stand-ups, retrospectives, backlog refinement, code reviews, and technical documentation using Azure DevOps or similar platforms
- Work in cross-functional teams made up of scientists, developers and engineers
- Document processes, systems, and methodologies to facilitate knowledge sharing and continuity
- Champion security compliance and quality assurance practices throughout the data lifecycle, ensuring adherence to industry standards, organization policies and best practices
Essential Qualifications & Experience:
- A minimum of a bachelor’s degree from a nationally recognised/certified university in information systems, physics, electronics, or another relevant scientific or engineering discipline, and 2 years of related post-degree experience
- A minimum of 3 years of professional experience maintaining data management systems, data storage systems and cloud systems from major vendors
- Professional experience with backup storage systems from major vendors and related optimization techniques
- Proven knowledge of data transfer protocols and access control mechanisms
- Professional experience with one or more of Ubuntu Linux, Red Hat Linux, or Windows operating systems
- Professional experience with Agile/Scrum methodologies, Git workflows, code review processes, and collaboration tools (Azure DevOps, JIRA, GitHub, GitLab, etc.)
- Understanding of cloud services and infrastructure provided by one or more major cloud vendors, including data storage and processing capabilities
- Understanding of data security and data protection aspects
- Professional experience implementing Findable, Accessible, Interoperable, and Reusable (FAIR) principles in data management practices, and understanding of Data Mesh architecture
If you've read the description and feel this role is a great match, we'd love to hear from you! Click "Apply for this job" to be directed to a brief questionnaire. It should only take a few moments to complete, and we'll be in touch promptly if your experience aligns with our needs.