About the Role
We are looking for a data enthusiast to join us as a key contributor in building our company’s data governance framework. You will work closely with backend engineers, product managers, and data analysts from various business units to drive database standardization, data visualization, and data reliability initiatives.
You are not just a developer — you are a “data guardian” who ensures that data is clear, organized, and trustworthy.
Key Responsibilities
- Utilize data governance tools such as DataHub or Collibra to scan and manage metadata from databases (e.g., MySQL) and build an enterprise‑level data catalog.
- Define and implement automated data quality rules using data quality tools to ensure the reliability and availability of core data; a brief illustration of this kind of rule follows this list.
- Design, implement, and validate tracking strategies (using tools like Sensors Data, GrowingIO, or in‑house systems) across business modules to ensure that tracking data is accurate and traceable.
- Participate in creating and promoting database design standards (naming conventions, indexes, comments, table structures, etc.) to improve development consistency.
- Classify business databases, label primary/replica/sensitive tables, and establish a Master Data (Golden Source) system.
- Collaborate with product, engineering, analytics, and compliance teams to align on field definitions and data metrics, ensuring that governance practices are embedded into the development process.
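To give a concrete sense of the data quality responsibility above, here is a minimal sketch of an automated rule written with Great Expectations, one of the tools named in the qualifications below. The table, column names, and sample data are hypothetical, and the legacy pandas-based API is assumed for brevity.

```python
# Minimal sketch of an automated data quality rule using the legacy
# Great Expectations pandas API. Table, columns, and data are hypothetical.
import pandas as pd
import great_expectations as ge

# Sample of a core business table to be validated (hypothetical data).
df = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "email": ["a@example.com", "b@example.com", None],
})
batch = ge.from_pandas(df)

# Core rules: the primary key must be unique and non-null,
# and the email field must always be populated.
batch.expect_column_values_to_be_unique("customer_id")
batch.expect_column_values_to_not_be_null("customer_id")
batch.expect_column_values_to_not_be_null("email")

# Run all registered expectations and report whether the batch passed.
results = batch.validate()
print(results.success)  # False here, because one email is missing
```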
Qualifications
- Bachelor’s degree or above in Computer Science, Software Engineering, or related data disciplines.
- 3+ years of experience in data development, preferably with hands‑on exposure to complex data modeling and analytical logic in industries such as finance, insurance, or securities.
- Solid understanding of metadata, master data, data quality, and data lineage concepts.
- Familiarity with MySQL/PostgreSQL, able to understand table structures, field logic, and database design standards.
- Proficient in Python/Shell, capable of configuring DataHub scans or writing Great Expectations (GE) data quality rules; a sketch of a DataHub scan configuration follows this list.
- Able to use Chinese and English as daily working languages when working with Chinese-speaking clients.
- Experience in data governance or database standardization projects.
- Experience in tracking design, event modeling, or data collection SDK integration.
- Knowledge of tools such as DataHub, Atlan, Collibra, or Great Expectations.
- Familiarity with Flink, Doris, Kafka, or other data processing frameworks.
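As a rough illustration of the DataHub scan configuration referenced above, the sketch below runs a programmatic MySQL metadata ingestion with the acryl-datahub Pipeline API. The host, credentials, database name, and DataHub server address are all placeholder assumptions, not part of this posting.

```python
# Minimal sketch: scan MySQL metadata into DataHub using the
# acryl-datahub ingestion Pipeline API. All connection details below
# are placeholders, not real infrastructure.
from datahub.ingestion.run.pipeline import Pipeline

pipeline = Pipeline.create({
    "source": {
        "type": "mysql",
        "config": {
            "host_port": "localhost:3306",  # placeholder MySQL host
            "database": "orders_db",        # hypothetical database
            "username": "datahub",
            "password": "***",
        },
    },
    "sink": {
        "type": "datahub-rest",
        "config": {"server": "http://localhost:8080"},  # placeholder GMS endpoint
    },
})

pipeline.run()                # execute the metadata scan
pipeline.raise_from_status()  # fail loudly if ingestion had errors
```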