Yapily is on a mission to enable innovative companies to create better and fairer financial services for everyone, through the power of open banking.
Yapily is an open banking infrastructure platform solving a fundamental problem in financial services today: access. Historically, card networks have monopolised the global movement of money, and banks have monopolised the ownership of, and access to, financial data.
Yapily was founded to challenge these structures and create a global open economy that works for everyone. We exist behind the scenes, securely connecting companies, from growth-stage to enterprise, to thousands of banks worldwide, enabling them to access data and initiate payments through the power of open banking.
Overview
As a Java Software Engineer specialising in Data Products at Yapily, you will play a key role in designing and implementing our next-generation data platform. Your responsibilities will include creating high-performance data pipelines, billing infrastructure, and self-serve data infrastructure and APIs. Ultimately, you will develop data systems that enable engineering teams to derive more value from their data. This is an excellent opportunity to deepen your data engineering skills on the GCP stack.
Responsibilities
- Developing and Optimising Data Pipelines: Designing, building, and maintaining scalable data ingestion and processing systems to transform raw data into actionable insights.
- Designing and Maintaining Data Products: Developing and maintaining APIs that deliver a seamless data experience for internal and external stakeholders.
- Managing Databases: Working with SQL and NoSQL databases, optimising schema design, and troubleshooting queries to support high-volume data transactions and improve database performance.
- Managing Cloud Data Resources: Developing and maintaining software products using GCP services such as Pub/Sub, BigQuery, Cloud Storage, and Dataflow.
- Contributing to Billing Infrastructure: Building and maintaining a reliable billing architecture within an event-driven environment.
- Collaborating Cross-Functionally: Partnering with Business Intelligence, infrastructure, product management, and other cross-functional teams to deliver data-centric solutions that drive business value.
- Ensuring Quality Assurance: Implementing testing, monitoring, and logging practices to ensure the performance and resilience of data systems.
- Practising Agile Development: Participating in code reviews, iterative development, and agile methodologies to enhance product functionality and reliability.
Qualifications
- Java Development: 3–5 years of hands-on experience in Java development, particularly in data-intensive environments and building data products.
- Database Management: Background in managing both SQL and NoSQL databases.
- Version Control & CI/CD: Knowledge of version control (Git) and CI/CD practices for data pipeline deployment, plus exposure to tools such as Terraform.
- Data Modelling & Schema Design: Familiarity with data modelling and schema design for operational or analytical systems.
- Batch & Streaming Processing: Experience with batch and streaming data processing frameworks (e.g. Pub/Sub, Kafka, Flink, Spark Streaming).
- BI Tools & Visualisation Platforms: Experience supporting BI tools or visualisation platforms (e.g. Looker, Grafana, Power BI).
- ETL/ELT Processes: Exposure to ETL/ELT processes in medium-to-large scale data environments (experience handling millions of records/events daily is a plus).
- Python: Working knowledge of Python for data automation and scripting.
- Docker & Kubernetes: Familiarity with containerization and orchestration.
- Workflow Orchestration: Experience with orchestration tools such as Airflow, Dagster, or Prefect.
- Cloud-based Data Services: Exposure to cloud-based data services (GCP preferred; AWS/Azure also considered).
- Data Governance & Compliance: Awareness of data governance and compliance principles (e.g. GDPR, ISO 27001).
- SaaS & API-driven Environments: Background in SaaS/API-driven environments, ideally with experience in billing or usage-based data.
- Data Pipeline Monitoring & Troubleshooting: Basic skills in monitoring and troubleshooting data pipelines.
- Data Security: Understanding of data security best practices and encryption for sensitive data.
- Data Quality: Experience ensuring data quality through validation, cleaning, and monitoring.
- You love innovation: it’s wired into your DNA. You understand the importance of attention to detail, and everything you produce is of high quality.
Benefits
- 23 days’ holiday a year (plus bank holidays) in Spain
- An additional day of holiday after each year of service, up to 5 extra days over 5 years
- Hybrid working: work from home or the office, or work abroad for up to 30 days per year (‘Nomad Working’)
- 200 annual Learning and Personal Development budget
- State health insurance
Values
- We obsess over quality
- We’re guided by our mission, and we earn and maintain trust by doing what’s right, even when it’s not easy.