About the Role
As a Data Engineer on the Data & Analytics (D&A) team, you will ensure the data powering business decisions is accurate, well-structured, and accessible. You’ll work with technologies such as Snowflake, DBT, Power BI, Fivetran, AWS, and Python to deliver clean, reliable, and performant data.
Your initial focus will be on building robust data models in Snowflake and semantic models in Power BI, creating the trusted foundation for dashboards and analyses across Quickbase. Over time, you will also contribute to broader data engineering responsibilities including pipeline design, data integration, compute optimization, governance enforcement, and data quality improvements.
This role requires both technical depth and collaboration skills. You’ll partner with analytics, business, and engineering teams, while contributing to a high-performing culture through code reviews, knowledge sharing, and best practices.
Key Responsibilities
• Design, build, and optimize DBT models in Snowflake to support analytic and reporting needs.
• Design and manage data integrations using Fivetran, AWS, Python, and other modern tools.
• Develop and maintain Power BI datasets and semantic models aligned with business logic.
• Collaborate with analysts and BI developers to ensure downstream dashboards are powered by trusted data.
• Implement strong GitOps, documentation, and governance practices for transparency and reuse.
• Tune SQL queries and optimize data models for both performance and cost efficiency.
• Apply data security and compliance best practices to safeguard sensitive information.
• Contribute to Agile delivery of iterative analytics solutions and platform improvements.
• Stay current with industry trends in data engineering and analytics engineering.
Qualifications
• Bachelor’s degree in Computer Science, Information Technology, or a related field.
• 4+ years of experience modeling data for analytics. Experience with Snowflake or DBT strongly preferred.
• 2+ years of experience developing and maintaining Power BI datasets/semantic models.
• Proficiency in SQL; Python programming experience is a plus.
• Strong understanding of ELT processes, data integration, and modern data tooling (e.g., Fivetran).
• Knowledge of data warehousing concepts and best practices.
• Excellent problem-solving, communication, and teamwork skills.
• Experience working in Agile environments with Git-based workflows.
• Strong sense of ownership and accountability; able to drive work to completion independently.
• Comfortable asking questions and proactively surfacing potential risks or ambiguities.
• Curious mindset with a desire to continuously learn and improve.