Data Platform Architect - Lisbon (remote) (m/f/d)

ClearOps

Faro

Remote

EUR 55 000 - 70 000

Full-time

Posted 30+ days ago

Job summary

A supply chain software company is seeking a Data Platform Architect to design and implement data architectures. The role involves ensuring data scalability and governance and integrating multiple data sources. Candidates should have extensive experience with AWS services, ETL/ELT processes, and data modeling. The position offers growth opportunities, flexible hours, and a supportive work culture.

Benefits

Growth opportunities
Flexible working hours
Learning & development access

Qualifications

  • Proven experience in designing and building large-scale data platforms.
  • Strong knowledge of data modeling including dimensional modeling.
  • Expertise with modern data lakes and data warehouses using AWS.

Responsibilities

  • Design and maintain overall data architecture, ensuring scalability.
  • Implement data models and organize data storage and access.
  • Build and orchestrate batch and real-time data pipelines.

Skills

Data modeling
AWS services
ETL/ELT pipelines
SQL
NoSQL systems
Data lakes
Data governance
Infrastructure as code
Communication skills

Tools

Python
Apache Airflow
Terraform
Docker
Kubernetes

Job description
Overview

ClearOps is looking for a Data Platform Architect to design, scale, and evolve the backbone of our data infrastructure. In this role, you’ll define and implement modern data architectures, ensure performance and scalability across our platform, and guide our teams to leverage data effectively. Your work will empower ClearOps to handle massive volumes of machine and supply chain data with precision and speed.

Responsibilities
  • Design and maintain the overall data architecture, ensuring scalability, security, and performance
  • Define and implement data models and organize how data is stored, integrated, and accessed
  • Select and implement technologies within AWS and the broader data ecosystem to build scalable solutions
  • Build and orchestrate batch and real-time data pipelines, integrating multiple internal and external sources
  • Ensure reliable data availability through effective ETL/ELT processes and backup strategies
  • Establish governance frameworks and standards for data quality, metadata, lifecycle management, and compliance
  • Standardize data formats and access controls across the company to ensure consistency and security
  • Drive the creation of data products, analytics models, dashboards, and curated datasets
  • Collaborate with engineers, analysts, and product teams to transform raw data into actionable insights
  • Provide best practices and mentorship to enable teams to build on the platform effectively
  • Work closely with stakeholders to align data architecture with business needs and technical requirements
  • Continuously evaluate and adopt new technologies to keep the platform modern and efficient
Qualifications
  • Proven experience in designing and building large-scale data platforms
  • Strong knowledge of data modeling, including dimensional modeling, star and snowflake schemas
  • Expertise with modern data lakes, data warehouses, and lakehouse architectures using AWS services such as S3, Redshift, and Glue
  • Hands-on experience designing and optimizing ETL/ELT pipelines with Python-based frameworks and AWS tools (PySpark on EMR, AWS Glue, AWS Data Wrangler, AWS Lambda, Step Functions)
  • Experience with distributed data processing on AWS EMR and orchestration tools like Apache Airflow or dbt
  • Familiarity with change-data-capture (CDC) on Kafka for near-real-time data pipelines
  • Strong background with SQL and NoSQL systems such as PostgreSQL, MySQL, Redshift, and cloud storage data lakes
  • Experience building or migrating data warehouse and lake architectures with Redshift, Databricks, Snowflake, or S3
  • Knowledge of AWS security best practices, networking, and scalable cloud deployments
  • Practical experience with infrastructure as code (Terraform, CloudFormation), containerization (Docker), and orchestration (Kubernetes on EKS)
  • Ability to design serverless data pipeline components (e.g., Lambda)
  • Strong communication skills to explain complex data concepts and align cross-functional teams
Nice to have
  • Familiarity with supply chain data and processes such as logistics, inventory, and supplier data
  • Domain expertise in the supply chain industry to maximize impact of data architecture
Why us?

ClearOps is a hidden champion on a strong growth path in the supply chain software industry. As part of ClearOps, you will benefit in several ways:

  • High Impact: Shape the core of our data platform and influence product direction
  • Growth Opportunities: Develop into a strategic leadership role as the platform and company scale
  • Supportive Culture: Join a collaborative team where your expertise drives real change
  • Learning & Development: Access to mentors, continuous training, and professional growth
  • Flexibility: Flexible working hours, mobile work, and workcation opportunities
About Us

At ClearOps, we keep the world of machinery moving by transforming the entire service supply chain of machine manufacturers into a seamless data ecosystem. Our platform connects manufacturers, dealers, and machines to predict the demand for parts and services in ways never seen before, ensuring that machinery never stops working. As a young, ambitious team of 50+ experts spread across Munich, Lisbon, San José, and Atlanta, we are passionate about personal growth and professional impact.
