Job Description

Job Summary

Great Wolf Lodge is seeking a talented and motivated Data Engineer to join our Analytics team. As a Data Engineer, you will play a crucial role in shaping our data infrastructure and analytics capabilities, enabling data-driven decision-making across the organization. You will work closely with data scientists, analysts, and other stakeholders to design, build, and maintain data pipelines, databases, and data models to support analytics and reporting solutions, including performance management, data science, and personalization. The Data Engineer will report to the Director of Analytics and Data Science, and will collaborate with Technology, Finance, and Commercial Analytics to drive value through best-in-class data architecture and data model design.

 

Job Duties:

  • Design, develop, and maintain data pipelines and ELT/ETL processes using Matillion and dbt.
  • Implement and maintain data warehouse and database systems for efficient data storage, retrieval, and analysis using Snowflake.
  • Partner with internal stakeholders and external vendors involved in project definition, design, and planning, mapping the data journey from source through consumers (data visualization, applications, or predictive models).
  • Gather, document, and analyze business requirements to deliver data products that support business needs.
  • Design, develop, test, and deploy data models, data collection, and transformation components. Determine the best point for transformations, calculations, and joins (e.g., data lake, data warehouse, or Tableau data source).
  • Ensure data quality, consistency, and integrity by following data governance best practices and ensuring scalability and sustainability of business intelligence data architecture.
  • Troubleshoot and support existing data workflow processes; deliver fixes and optimizations where appropriate.
  • Collaborate with team members to improve technical skills and foster a cooperative working environment.
  • Stay abreast of new data management methods and technologies and support development of these new capabilities internally.


Qualifications:

  • 2+ years’ experience building ETL/ELT pipelines and data warehouses.
  • Experience designing and implementing conceptual, logical, and physical data models.
  • Experience with common data infrastructure tools and services, including AWS S3, Snowflake, Snowpipe, and Tableau Server.
  • Expert SQL scripting skills and advanced familiarity with a variety of databases, including both relational (SQL) and non-relational (NoSQL) environments.
  • Experience with Matillion, dbt, Informatica, Talend, or another data transformation tool is required.
  • Advanced experience with one or more programming/scripting languages, including but not limited to Python and/or R.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Ability to build effective relationships across the business at all levels.
  • Self-starter with an entrepreneurial, high-energy approach who can take initiative in a fast-moving environment.
  • Strong technical understanding of current and emerging business intelligence and analytics technologies.


Education:

  • Bachelor's Degree in Technology, Computer Science, or similar technical field

Application Instructions

Please click on the link below to apply for this position. A new window will open and direct you to apply at our corporate careers page. We look forward to hearing from you!
