Employ tools in the GCP ecosystem (dbt, Airflow, and Looker) to improve data quality and efficiency and to deliver accurate, timely data.
Identify, investigate, and resolve data issues such as data quality problems, discrepancies, and missing data.
Contribute to the development and improvement of data solutions.
Leverage the power of dbt to address complex modeling challenges, focusing on performance, robustness, and scalability.
Work on prevention and alerting solutions.
Adopt and refine best practices, including naming conventions, data modeling, and data quality testing.
Communicate with cross-functional teams and non-technical stakeholders in a clear, structured manner.
Assist and support team members in designing, developing, and implementing data warehousing, reporting, and analytics solutions.
Take ownership of tasks and initiatives.
Demonstrate proven SQL skills: joining and manipulating various data types (String, Integer, JSON, Array), writing parameterized scripts, and debugging SQL code.
Understand ETL and data warehousing concepts.
Possess strong communication skills: timely, clear, and consistent sharing of information, progress, bottlenecks, and findings.
Maintain a curious and growth-oriented mindset towards continuous learning.
Be capable of understanding, addressing, and communicating problems from both technical and business perspectives.
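The SQL skills listed above can be illustrated with a short sketch. The example below is hypothetical (the `customers` and `orders` tables and their columns are invented for illustration); it uses Python's built-in `sqlite3` module to show a join, JSON extraction with SQLite's `json_extract()`, and a parameterized query via the `?` placeholder, assuming a SQLite build with JSON support (standard in modern Python distributions).

```python
import sqlite3

# Hypothetical schema for illustration: orders stores a JSON payload as text,
# customers is a plain lookup table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         payload TEXT);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES
        (10, 1, '{"amount": 120, "items": ["widget", "gadget"]}'),
        (11, 2, '{"amount": 80,  "items": ["sprocket"]}');
""")

# A parameterized query (the ? placeholder keeps it safe and reusable)
# that joins the two tables and pulls fields out of the JSON column.
rows = conn.execute(
    """
    SELECT c.name,
           json_extract(o.payload, '$.amount')   AS amount,
           json_extract(o.payload, '$.items[0]') AS first_item
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    WHERE json_extract(o.payload, '$.amount') >= ?
    """,
    (100,),
).fetchall()

print(rows)  # only Ada's order clears the 100 threshold
```

The same join-plus-JSON pattern carries over to BigQuery (`JSON_EXTRACT` / `JSON_VALUE`) and to dbt models, where parameterization is handled through Jinja variables rather than driver placeholders.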
Location: Fully remote.
* The salary benchmark is based on the target salaries of market leaders in the relevant sectors. It is intended as a guide to help Premium Members assess open positions and negotiate salary. The benchmark is not provided directly by the company; actual compensation could be significantly higher or lower.