
Senior Snowflake Developer (Hybrid remote friendly)

Imagine Communications

Plano (TX)

Remote

USD 100,000 - 150,000

Full time

3 days ago

Job summary

Imagine Communications, a leader in media software and networking solutions, is looking for a Snowflake Data Engineer to design and implement robust data solutions. In this hands-on role, you will build scalable pipelines, contribute to the enterprise data warehouse, and ensure data integrity while working with cross-functional teams to transform raw data into actionable insights. Ideal candidates will have a Bachelor's degree in Computer Science and extensive experience with Snowflake, data modeling, and ETL processes.

Benefits

Medical, Dental, and Vision Insurance
HSA and 401(k) with company matching
Employee Assistance Program
Wellness programs
Paid volunteering time

Qualifications

  • 5+ years of experience in data engineering or data warehousing.
  • 2+ years of hands-on experience with Snowflake as a primary data platform.
  • Solid understanding of OLAP and dimensional modeling.

Responsibilities

  • Design and optimize ETL pipelines to load data into Snowflake.
  • Implement data quality checks and validate data integrity.
  • Collaborate with analysts and stakeholders for data requirements.

Skills

SQL
Data modeling
Data governance
Problem-solving
Communication
ETL/ELT pipeline design
Data extraction
Cloud services

Education

Bachelor's degree in Computer Science

Tools

Snowflake
ETL Tools
Python
Informatica
Oracle ODI
Power BI

Job description

Overview

Every day, Imagine Communications is delivering billions of media moments all over the world — anywhere, anytime, and on any device. Imagine Communications delivers innovative, end-to-end media software and networking solutions to over 3,000 customers in more than 185 countries, including the top broadcast facilities and the most technologically advanced sports and live-event venues.

Why Imagine?

Imagine Communications offers a generous Medical, Dental, Vision and Life Insurance package and HSA and 401(k) options with company matching. We like to make sure all our employees are safe when travelling so we’ve got travel insurance covered too. Employee Wellbeing is a priority for us, so all employees and their family have access to our EAP and Wellness programs, including LifeSpeak and Vitality. Volunteer in your community and we will pay for that too.

A Bit About The Role

We’re at the early stages of building a modern data platform using Snowflake on Azure, with Power BI as our visualization layer and Informatica + Oracle ODI for ingestion. Our aim is to support robust financial reporting. We’re seeking a Snowflake Data Engineer who can design performant models, build scalable pipelines, and contribute to a high-quality enterprise data warehouse.

This is a hands-on role with room to shape architecture and delivery standards. You’ll work closely with Power BI developers, analysts, and stakeholders across finance, product, and operations. You will play a crucial role in transforming raw data into actionable insights, enabling our business to make data-driven decisions.

  • Design & Development: Design, build, and optimize robust ETL/ELT pipelines to ingest, transform, and load data from various sources (e.g., relational databases, APIs, flat files) into Snowflake.
  • Snowflake Expertise: Leverage advanced Snowflake features, including Snowpipe, Streams, Tasks, Time Travel, Zero-Copy Cloning, and Dynamic Tables, to build efficient and cost-effective data solutions.
  • Data Modeling: Collaborate with data architects and analysts to design and implement optimal data models (e.g., Kimball, Inmon, Data Vault) within Snowflake for reporting, analytics, and machine learning initiatives.
  • Performance Tuning: Monitor, troubleshoot, and optimize Snowflake queries and data loads for performance, scalability, and cost efficiency.
  • Data Quality & Governance: Implement data quality checks, validation rules, and robust error handling mechanisms to ensure data integrity and reliability within the data warehouse.
  • Automation: Develop and maintain automation scripts (e.g., using Python, SQL, dbt) for data pipeline orchestration, monitoring, and alerting.
  • Collaboration: Work closely with cross-functional teams including architects, business analysts, ETL developers, and other engineers to understand data requirements and deliver solutions.
  • Documentation: Create and maintain comprehensive technical documentation for data pipelines, data models, and processes.
  • Best Practices: Advocate for and implement best practices in data engineering, including CI/CD, version control (Git), testing, and code review.
  • Security: Ensure data security and compliance within the Snowflake environment, implementing roles, grants, and data masking as required.
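The data quality and automation duties above often start life as small scripts. As a minimal sketch only (a generic illustration, not Imagine's actual pipeline code — the function, column names, and sample batch are all hypothetical), a pre-load validation pass in Python might look like:

```python
def validate_rows(rows, required_cols, not_null_cols):
    """Run basic data-quality checks on a batch of dict records.

    Returns (valid_rows, errors): rows passing every check, plus a list
    of (row_index, message) tuples describing each rejected row.
    """
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [c for c in required_cols if c not in row]
        if missing:
            errors.append((i, f"missing columns: {missing}"))
            continue
        nulls = [c for c in not_null_cols if row.get(c) is None]
        if nulls:
            errors.append((i, f"null values in: {nulls}"))
            continue
        valid.append(row)
    return valid, errors


# Hypothetical batch: one clean row, one with a NULL amount.
batch = [
    {"invoice_id": 1, "amount": 120.50, "currency": "USD"},
    {"invoice_id": 2, "amount": None, "currency": "USD"},
]
ok, bad = validate_rows(batch, ["invoice_id", "amount"], ["amount"])
# ok keeps only the clean row; bad records why row 1 was rejected.
```

In practice checks like these would run inside an orchestrated pipeline (dbt tests, Snowflake Tasks, or an ETL tool) rather than a standalone script, but the quarantine-and-report pattern is the same.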

About You

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field.
  • 5+ years of experience in data engineering, data warehousing, or a similar role.
  • 2+ years of hands-on experience specifically with Snowflake as a primary data platform.
  • Strong practical experience with data extraction from Oracle Fusion Cloud Applications (e.g., ERP, SCM, GL) using BICC, PVOs, or OTBI.
  • Strong proficiency in SQL, with the ability to write complex, optimized queries.
  • Proven experience designing and building scalable ETL/ELT pipelines.
  • Experience with at least one scripting/programming language (e.g., Python, Java, Scala) for data processing and automation.
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
  • Solid understanding of data warehousing concepts, dimensional modeling, and OLAP.
  • Excellent problem-solving, analytical, and communication skills.
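The dimensional-modeling requirement above refers to patterns like the Kimball star schema, in which dimension rows carry warehouse-assigned surrogate keys rather than source-system IDs. As an illustrative sketch under stated assumptions (pure in-memory Python rather than Snowflake SQL; all names hypothetical), a type-1 slowly-changing-dimension upsert works like this:

```python
def upsert_dimension(dim, rows, natural_key):
    """Type-1 SCD upsert, in-memory sketch.

    `dim` maps natural key -> record carrying an 'sk' surrogate key.
    New natural keys receive the next surrogate key; existing keys are
    overwritten in place (type-1: latest values only, no history kept).
    """
    next_sk = max((r["sk"] for r in dim.values()), default=0) + 1
    for row in rows:
        key = row[natural_key]
        if key in dim:
            sk = dim[key]["sk"]  # reuse the existing surrogate key
        else:
            sk = next_sk
            next_sk += 1
        dim[key] = {**row, "sk": sk}
    return dim


customers = {}
upsert_dimension(customers, [{"cust_id": "C1", "name": "Acme"}], "cust_id")
upsert_dimension(customers, [{"cust_id": "C1", "name": "Acme Corp"},
                             {"cust_id": "C2", "name": "Globex"}], "cust_id")
# C1 keeps surrogate key 1 with the updated name; C2 is assigned key 2.
```

In Snowflake this logic is typically expressed as a MERGE statement or a dbt incremental model; the sketch just shows the key-management idea a fact table depends on.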

Preferred Skills (Nice to have):

  • Snowflake certification (e.g., SnowPro Core, SnowPro Advanced).
  • Experience with Oracle Autonomous Data Warehouse (ADW) and its integration with Snowflake.
  • Oracle Fusion Data Extraction - Develop and maintain robust strategies for extracting data from Oracle Fusion Cloud Applications, utilizing methods such as BI Cloud Connector (BICC), Public View Objects (PVOs), OTBI Subject Areas, and potentially Fusion REST APIs.
  • Familiarity with Oracle Fusion Analytics Warehouse (FAW) and its underlying data models.
  • Experience with Data Build Tool (dbt) for data transformation and modeling in Snowflake.
  • Experience with the Informatica Intelligent Cloud Services (IICS) ETL tool.
  • Familiarity with data governance tools and practices.
  • Experience with BI tools (e.g., Tableau, Power BI, Looker) for data visualization and reporting.
  • Experience working in an Agile/Scrum development environment.

Celebrating difference, together stronger

At Imagine Communications, we don’t just accept difference — we celebrate it, we support it, and we thrive on it for the benefit of our customers, our employees, our products, and our communities. We are committed to providing an environment of mutual respect. Imagine Communications is proud to be an equal opportunity workplace and is an affirmative action employer.
