We are seeking a skilled Senior Data Engineer with deep expertise in ClickHouse and streaming data, and a passion for building scalable real‑time analytics solutions. In this role, you will design, develop, and optimize our data pipelines and analytics infrastructure, empowering our teams to harness real‑time insights that enhance customer experience and drive business growth.
Key Responsibilities
- Design, implement, maintain, and document highly scalable data pipelines for real‑time and batch processing.
- Build and optimize data systems to support accurate, low‑latency analytics and reporting use cases.
- Develop and maintain solutions for streaming and serverless data processing.
- Collaborate with cross‑functional teams to implement and support end‑to‑end analytics workflows.
- Ensure data quality, reliability, and performance across the platform.
- Monitor, troubleshoot, and optimize data infrastructure to maintain high availability.
- Mentor junior engineers and contribute to the continuous improvement of engineering practices.
Qualifications
- 5+ years of experience in data engineering or a related field.
- Strong expertise in ClickHouse (schema design, ingestion optimization, query performance tuning, and cluster management).
- Proven experience with real‑time data processing using Apache Kafka, Flink, or Spark Streaming.
- Deep understanding of distributed systems architecture with emphasis on scalability, reliability, and fault tolerance.
- Proficiency in one or more programming languages:
  - Python: building data pipelines, automation scripts, and integrations.
  - Go: developing high‑performance data services or tools.
  - TypeScript: contributing to data‑related front‑end or service‑side applications.
  - Bash/shell: writing automation scripts for data operations.
  - Rust (good to have): interest or experience in building memory‑safe, high‑performance systems.
- Hands‑on experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Strong problem‑solving skills and the ability to thrive in a fast‑paced environment.
- Excellent communication and teamwork skills.