Imagine Communications

Senior Snowflake Data Developer (Hybrid remote friendly)

Job Locations US-TX-Plano
Posted Date 3 days ago (6/23/2025 5:55 PM)
Requisition ID 06252913

Overview

Every day, Imagine Communications delivers billions of media moments all over the world, anywhere, anytime, and on any device. We provide innovative, end-to-end media software and networking solutions to more than 3,000 customers in over 185 countries, including the top broadcast facilities and the most technologically advanced sports and live-event venues.

Why Imagine?

Imagine Communications offers a generous Medical, Dental, Vision and Life Insurance package and HSA and 401(k) options with company matching. We like to make sure all our employees are safe when traveling, so we've got travel insurance covered too. Employee wellbeing is a priority for us, so all employees and their families have access to our EAP and Wellness programs, including LifeSpeak and Vitality. Volunteer in your community and we will pay for that too.

A Bit About The Role

We're in the early stages of building a modern data platform using Snowflake on Azure, with Power BI as our visualization layer and Informatica + Oracle ODI for ingestion. Our aim is to support robust financial reporting. We're seeking a Snowflake Data Engineer who can design performant models, build scalable pipelines, and contribute to a high-quality enterprise data warehouse.

 

This is a hands-on role with room to shape architecture and delivery standards. You'll work closely with Power BI developers, analysts, and stakeholders across finance, product, and operations. You will play a crucial role in transforming raw data into actionable insights, enabling our business to make data-driven decisions. In this role, you will:

  • Design & Development: Design, build, and optimize robust ETL/ELT pipelines to ingest, transform, and load data from various sources (e.g., relational databases, APIs, flat files) into Snowflake.
  • Snowflake Expertise: Leverage advanced Snowflake features, including Snowpipe, Streams, Tasks, Time Travel, Zero-Copy Cloning, and Dynamic Tables, to build efficient and cost-effective data solutions (see the first sketch after this list).
  • Data Modeling: Collaborate with data architects and analysts to design and implement optimal data models (e.g., Kimball, Inmon, Data Vault) within Snowflake for reporting, analytics, and machine learning initiatives.
  • Performance Tuning: Monitor, troubleshoot, and optimize Snowflake queries and data loads for performance, scalability, and cost efficiency.
  • Data Quality & Governance: Implement data quality checks, validation rules, and robust error handling mechanisms to ensure data integrity and reliability within the data warehouse.
  • Automation: Develop and maintain automation scripts (e.g., using Python, SQL, dbt) for data pipeline orchestration, monitoring, and alerting.
  • Collaboration: Work closely with cross-functional teams, including architects, business analysts, ETL developers, and other engineers, to understand data requirements and deliver solutions.
  • Documentation: Create and maintain comprehensive technical documentation for data pipelines, data models, and processes.
  • Best Practices: Advocate for and implement best practices in data engineering, including CI/CD, version control (Git), testing, and code review.
  • Security: Ensure data security and compliance within the Snowflake environment, implementing roles, grants, and data masking as required (see the second sketch after this list).
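
To give a concrete flavor of the Snowflake Expertise bullet above, here is a minimal sketch of an incremental load built on Streams and Tasks. All object names (raw.orders, analytics.fct_orders, transform_wh) are hypothetical, and a real pipeline would also handle deletes, errors, and monitoring:

    -- Capture row-level changes on a hypothetical raw orders table
    CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

    -- Scheduled task that merges changes into the reporting table,
    -- running only when the stream has data, to keep compute costs down
    CREATE OR REPLACE TASK analytics.load_fct_orders
      WAREHOUSE = transform_wh
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      MERGE INTO analytics.fct_orders t
      -- Keep the new row images (inserts and the insert side of updates)
      USING (SELECT * FROM raw.orders_stream
             WHERE METADATA$ACTION = 'INSERT') s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE
        SET t.amount = s.amount, t.updated_at = s.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
        VALUES (s.order_id, s.amount, s.updated_at);

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK analytics.load_fct_orders RESUME;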
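
And for the Security bullet, a minimal sketch of column-level protection with a masking policy; the role, table, and column names are likewise hypothetical:

    -- Hide account numbers from everyone except a finance role
    CREATE OR REPLACE MASKING POLICY analytics.mask_account_number
      AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('FINANCE_ANALYST') THEN val
        ELSE '*****'
      END;

    -- Attach the policy to a sensitive column
    ALTER TABLE analytics.fct_orders
      MODIFY COLUMN account_number
      SET MASKING POLICY analytics.mask_account_number;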

About You

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field. 
  • 5+ years of experience in data engineering, data warehousing, or a similar role. 
  • 2+ years of hands-on experience specifically with Snowflake as a primary data platform. 
  • Strong practical experience with data extraction from Oracle Fusion Cloud Applications (e.g., ERP, SCM, GL) using BICC, PVOs, or OTBI. 
  • Strong proficiency in SQL, with the ability to write complex, optimized queries. 
  • Proven experience designing and building scalable ETL/ELT pipelines. 
  • Experience with at least one scripting/programming language (e.g., Python, Java, Scala) for data processing and automation. 
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services. 
  • Solid understanding of data warehousing concepts, dimensional modeling, and OLAP. 
  • Excellent problem-solving, analytical, and communication skills. 

 

Preferred Skills (Nice to have): 

  • Snowflake certification (e.g., SnowPro Core, SnowPro Advanced). 
  • Experience with Oracle Autonomous Data Warehouse (ADW) and its integration with Snowflake. 
  • Oracle Fusion data extraction: developing and maintaining robust strategies for extracting data from Oracle Fusion Cloud Applications, using methods such as BI Cloud Connector (BICC), Public View Objects (PVOs), OTBI Subject Areas, and potentially Fusion REST APIs.
  • Familiarity with Oracle Fusion Analytics Warehouse (FAW) and its underlying data models. 
  • Experience with dbt (data build tool) for data transformation and modeling in Snowflake (see the sketch after this list).
  • Experience with Informatica Intelligent Cloud Services (IICS) as an ETL tool.
  • Familiarity with data governance tools and practices. 
  • Experience with BI tools (e.g., Tableau, Power BI, Looker) for data visualization and reporting. 
  • Experience working in an Agile/Scrum development environment. 
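
Since dbt appears above, here is a minimal sketch of what an incremental dbt model targeting Snowflake might look like; the model and source names (fct_orders, stg_orders) are hypothetical:

    -- models/marts/fct_orders.sql
    {{ config(materialized='incremental', unique_key='order_id') }}

    SELECT
        order_id,
        customer_id,
        amount,
        updated_at
    FROM {{ ref('stg_orders') }}
    {% if is_incremental() %}
      -- On incremental runs, only pick up rows newer than what is loaded
      WHERE updated_at > (SELECT MAX(updated_at) FROM {{ this }})
    {% endif %}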

 

Celebrating difference, together stronger

At Imagine Communications, we don’t just accept difference — we celebrate it, we support it, and we thrive on it for the benefit of our customers, our employees, our products, and our communities.  We are committed to providing an environment of mutual respect.  Imagine Communications is proud to be an equal opportunity workplace and is an affirmative action employer.
