Summary
- Senior data analyst and analytics engineer working with billions of rows of financial transaction data
- Build and maintain reliable dbt, SQL, and Python pipelines in BigQuery
- Design reusable feature layers for analysts and data scientists
- Translate technical work into clear stories for non-technical stakeholders
Skills & Tools
Python · SQL · dbt (data build tool) · Airflow · Git · Docker · BigQuery · Vertica · PostgreSQL · Tableau · ETL / ELT · Agile/Scrum · Feature Engineering · Data Storytelling
Experience
Senior Data Analyst
MX Technologies · Seattle, WA (Remote)
- Pioneered a dbt-based incremental transaction modeling layer for core BigQuery pipelines, cutting bytes billed by 87% and reducing end-to-end runtime by 30%; integrated the models with Airflow for standardized orchestration and monitoring
- Built Python tooling in secure Vertex AI Workbench environments to automate complex enrichment workflows, cutting time spent on enrichment by ~80% across a team of 10+ analysts
- Designed a dbt-driven behavioral profiling layer that transforms transaction data into 50+ reusable user attributes, providing a maintainable foundation for profiling, ML, and analytics products
- Reworked highly technical presentations on AI-driven merchant identification into clear narratives for engineering and business leadership, tying pipeline and feature work to company goals and to a ~40x increase in identifiable merchants
Python · SQL · dbt (data build tool) · BigQuery · Airflow · Feature Engineering · Data Storytelling
Data Analyst
MX Technologies · Lehi, UT
- Analyzed large-scale transactional datasets (10B+ rows) in Vertica using SQL to develop and validate 1,000+ data classification rules used by 100+ financial institutions
- Built Python automation reducing manual workflows from 3-4 hours to ~15 minutes, enabling faster iteration and more reliable data delivery
- Designed logic for multiple data-driven financial insights deployed to tens of thousands of end users, contributing directly to product value
- Consolidated data from 15+ projects into a shared PostgreSQL environment by extracting, cleaning, and standardizing datasets for internal research and analytics
Python · SQL · Vertica · PostgreSQL · ETL / ELT
Student IT Data Analyst
Brigham Young University - Office of IT · Provo, UT
- Published 50+ company-wide Tableau reports
- Shaped a data-driven presentation for IT leadership attended by ~85% of the organization
Tableau · SQL · Data Storytelling
Education
Bachelor of Science in Statistics
Brigham Young University · Provo, UT
Projects
Harbor Yard • Ansible-Driven Home Infrastructure
Uses Ansible as the central control plane for Docker containers, host monitoring, and server tooling, with the full configuration versioned in and deployed from a GitHub repository.
Ansible · Home-Lab · Docker
WayfinDex • LLM-powered Travel Knowledge Base
An AI agent that researches travel destinations and generates guides using web search and structured outputs.
Python · AI · Pydantic-AI