
Analyze virtual warehouse performance metrics and adjust sizing to eliminate over-provisioning while maintaining optimal query performance.

Larger warehouses provide more compute power, but each step up in size roughly doubles the credit consumption rate, so cost scales with compute. The key is finding the smallest size that still meets performance requirements.
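To see how those sizes translate into spend, per-warehouse credit burn can be pulled from the account usage metering view. A sketch, assuming access to the SNOWFLAKE.ACCOUNT_USAGE share (which can lag real time by a few hours):

```sql
-- Credits consumed per warehouse over the last 7 days.
SELECT
    WAREHOUSE_NAME,
    SUM(CREDITS_USED) AS CREDITS_7D
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE START_TIME >= DATEADD(day, -7, CURRENT_TIMESTAMP())
GROUP BY 1
ORDER BY CREDITS_7D DESC;
```

Warehouses near the top of this list are the highest-leverage candidates for the sizing analysis below.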
-- Query volume and latency percentiles per warehouse over the last 7 days.
SELECT
    WAREHOUSE_NAME,
    WAREHOUSE_SIZE,
    COUNT(*) AS QUERY_COUNT,
    AVG(EXECUTION_TIME) / 1000 AS AVG_SECONDS,
    PERCENTILE_CONT(0.95) WITHIN GROUP (ORDER BY EXECUTION_TIME) / 1000 AS P95_SECONDS
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE START_TIME >= DATEADD(day, -7, CURRENT_TIMESTAMP())
  AND WAREHOUSE_NAME IS NOT NULL
GROUP BY 1, 2
ORDER BY 1;
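When the results show a warehouse whose p95 runtime is comfortably below its performance target, a practical next step is to drop it one size and re-measure. A minimal example (the warehouse name is illustrative):

```sql
-- Step an over-provisioned warehouse down one size.
-- Running queries finish on the old size; new queries use the new one.
ALTER WAREHOUSE BI_WH SET WAREHOUSE_SIZE = 'SMALL';
```

Resizing takes effect immediately for new queries, so the change can be validated by re-running the analysis query after another few days of traffic.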
A company used Large warehouses for all workloads. Analysis showed that 70% of queries completed in under 15 seconds and did not need a Large warehouse. Splitting the workload across dedicated warehouses fixed this: BI queries (70% of the workload) moved to a Small warehouse, complex analytics (25%) to a Medium, and massive aggregations (5%) stayed on a Large. Overall cost fell by 55%.
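The split described above maps to plain DDL. A sketch with illustrative warehouse names, with auto-suspend enabled so idle warehouses stop consuming credits:

```sql
-- Dedicated warehouses per workload tier (names are illustrative).
-- AUTO_SUSPEND is in seconds; AUTO_RESUME restarts on the next query.
CREATE WAREHOUSE IF NOT EXISTS BI_WH
    WAREHOUSE_SIZE = 'SMALL'  AUTO_SUSPEND = 60  AUTO_RESUME = TRUE;
CREATE WAREHOUSE IF NOT EXISTS ANALYTICS_WH
    WAREHOUSE_SIZE = 'MEDIUM' AUTO_SUSPEND = 120 AUTO_RESUME = TRUE;
CREATE WAREHOUSE IF NOT EXISTS BATCH_WH
    WAREHOUSE_SIZE = 'LARGE'  AUTO_SUSPEND = 60  AUTO_RESUME = TRUE;
```

Each user group or tool is then pointed at its own warehouse, so short BI queries no longer pay Large-warehouse rates.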
Uncover hidden inefficiencies and start reducing Snowflake spend in minutes, with no disruption and no risk.