Your mission
Our rapidly growing Data, AI & MarTech Department is looking for an experienced Data Platform Engineer. The position is responsible for improving and continuously evolving our cloud-based data platform – the heart of Smartbroker's technical data infrastructure for business analytics & insights. Join us and play a major role in promoting and enabling a truly data-driven culture across the organisation!
Job description:
- Develop and improve our cloud-based data platform for data analytics and business insights using the most innovative data technologies
- Build end-to-end data pipelines from raw data ingestion to consumable data: prepare and clean structured and unstructured data and develop high-quality data models for advanced analytics and AI use cases
- Implement data quality monitoring to ensure accuracy and reliability of data pipelines
- Architect, code, and deploy data infrastructure components
- Collaborate closely with highly ambitious data engineers and analysts in our growing Data, AI & MarTech Department as well as product technology colleagues
- Stay up to date with the latest market developments in cloud data architecture and share your knowledge
Your profile
- University degree in computer science, mathematics, natural sciences, or a similar field
- Several years of experience in data engineering and strong know-how in building data-native, robust, scalable, and maintainable data platforms
- Significant hands-on experience designing and operating data pipelines on cloud-based data platforms (AWS, GCP) using data-native services (S3, Athena, BigQuery…)
- Experience in data warehousing and containerization, e.g., Kubernetes, Docker…
- Advanced knowledge of cloud networking & security (IAM, security groups…)
- Proficiency and experience with Infrastructure as Code
- Deep understanding of software engineering best practices: requirements specification, version control, CI/CD, testing, deployment, and monitoring of data pipelines and services
- Excellent SQL skills and strong programming skills in Python, ideally including Airflow and PySpark
- Strong knowledge of data streaming technologies such as Kafka, Kinesis, Flink…
- Excellent English communication skills; German is a plus
- Interest in the finance and fintech industry and a sense of humor
Benefits
Why us?