Workato makes creating and implementing automations 10X faster than traditional platforms. As the leader in AI-powered enterprise automation, we enable enterprises to automate their business processes across the organization by integrating their applications, data, and experiences.
Job Description
Workato transforms technology complexity into business opportunity. As the leader in enterprise orchestration, Workato helps businesses globally streamline operations by connecting data, processes, applications, and experiences. Its AI-powered platform enables teams to navigate complex workflows in real time, driving efficiency and agility.
Trusted by a community of 400,000 global customers, Workato empowers organizations of every size to unlock new value and lead in today’s fast-changing world. Learn how Workato helps businesses of all sizes achieve more at workato.com.
Ultimately, Workato believes in fostering a flexible, trust-oriented culture that empowers everyone to take full ownership of their roles. We are driven by innovation and looking for team players who want to actively build our company.
But we also believe in balancing productivity with self-care. That’s why we offer all of our employees a vibrant and dynamic work environment along with a multitude of benefits they can enjoy inside and outside of their work lives.
If this sounds right up your alley, please submit an application. We look forward to getting to know you!
Responsibilities
At Workato, we’re redefining business automation by integrating innovative technologies that drive digital transformation. We’re seeking a highly skilled Senior Data Engineer to lead the design, development, and optimization of our modern data infrastructure. In this role, you will work extensively with advanced tools such as dbt, AutomateDV, Trino, Snowflake, Apache Iceberg, and Apache Airflow to build robust, scalable, and efficient data pipelines that empower our decision-making and analytics capabilities.
You will work closely with data scientists: providing and maintaining the data vault they rely on, integrating their models into it, and consolidating data from disparate sources into a single data warehouse. A minimal sketch of the kind of orchestration this involves follows below.
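To give a flavor of the orchestration work, here is a minimal Python sketch of an Apache Airflow DAG that lands raw data, then builds and tests AutomateDV (Data Vault 2.0) models with dbt. Every DAG, task, tag, and script name below is a hypothetical placeholder for illustration, not a description of Workato’s actual pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily pipeline: land raw source data, build Data Vault
# models (hubs/links/satellites generated via AutomateDV macros in dbt),
# then run dbt tests before the vault is consumed downstream.
with DAG(
    dag_id="raw_to_data_vault",        # placeholder name
    schedule="@daily",                  # Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_sources",
        bash_command="python extract_sources.py",  # hypothetical extraction script
    )
    build_vault = BashOperator(
        task_id="dbt_build_vault",
        # assumes the Data Vault models are tagged 'vault' in the dbt project
        bash_command="dbt run --select tag:vault",
    )
    test_vault = BashOperator(
        task_id="dbt_test_vault",
        bash_command="dbt test --select tag:vault",
    )

    extract >> build_vault >> test_vault
```

Selecting models by a shared tag keeps the DAG decoupled from individual dbt models, so the vault can grow without changes to the orchestration code.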
Requirements
Qualifications / Experience / Technical Skills
Education & Experience:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering, with a proven track record of designing and managing large-scale data infrastructures.
Technical Expertise:
Proficiency in dbt for data transformation and modeling.
Experience with AutomateDV (formerly dbtvault) for generating Data Vault 2.0 structures in dbt.
Hands-on expertise with Trino as a distributed SQL query engine.
Deep understanding of Snowflake architecture and its ecosystem.
Knowledge of Apache Iceberg for managing large analytic datasets (see the Trino + Iceberg sketch after this list).
Strong background in orchestrating workflows using Apache Airflow.
Proficiency in SQL and at least one programming language (Python preferred).
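For illustration, the Trino and Apache Iceberg items above pair naturally: Trino’s Iceberg connector exposes Iceberg table metadata (such as snapshots) through plain SQL. The sketch below uses the open-source trino Python client (DB-API); the host, user, catalog, schema, and table names are assumptions for the example only.

```python
import trino  # pip install trino

# Connect to a (hypothetical) Trino coordinator whose 'iceberg' catalog
# points at Apache Iceberg tables in object storage.
conn = trino.dbapi.connect(
    host="trino.example.internal",  # placeholder host
    port=8080,
    user="data_engineer",            # placeholder user
    catalog="iceberg",
    schema="analytics",
)

cur = conn.cursor()
# Trino's Iceberg connector exposes metadata tables like "<table>$snapshots";
# listing recent snapshots is a common operational check on commits.
cur.execute(
    'SELECT snapshot_id, committed_at '
    'FROM "orders$snapshots" ORDER BY committed_at DESC'
)
for snapshot_id, committed_at in cur.fetchmany(5):
    print(snapshot_id, committed_at)
```

Inspecting snapshot metadata this way is also the starting point for Iceberg time-travel queries from Trino.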
Analytical & Problem-Solving Skills:
Ability to analyze complex data challenges and design innovative, data-driven solutions.
Strong debugging skills and attention to detail.
Soft Skills:
Excellent communication and collaboration skills.
Demonstrated leadership and mentoring capabilities.
Ability to thrive in a fast-paced, dynamic environment.
Preferred Qualifications
Familiarity with cloud data platforms (AWS, GCP, or Azure) and containerization technologies.
Proven track record of working in automation-centric environments.