Dashboard refresh strategies: live queries, scheduled refreshes, and cached extracts
Max Musing
Founder and CEO of Basedash · May 16, 2026

Most BI dashboards do not need to refresh in real time. They need to refresh at the right time. The three refresh strategies every BI tool implements are live queries (run the SQL on every page load), scheduled refreshes (run the SQL on a cron and cache the result), and cached extracts (materialize the result set into a separate engine and query that). Choosing among them is a tradeoff between data freshness, warehouse cost, dashboard latency, and how many concurrent viewers you need to support.
This guide is for analytics engineers, founders, and operators choosing between BI tools, or tuning the ones they already have. It explains how each refresh strategy actually works, when each one fits, and how to map a refresh cadence to the type of decision the dashboard supports.
Every BI tool implements some variation of these three. The names differ but the architecture is similar.
With live queries, the BI tool issues a fresh SQL query to your warehouse or database every time a user loads or filters a dashboard. Nothing is cached on the BI side (or only a short-lived per-session cache is used).
How it shows up in different tools:
- Power BI calls this DirectQuery
- Tableau calls it a live connection
- Metabase and Basedash query the connected source directly by default
The advantage is that what you see is what the warehouse currently has. There is no staleness window. Permissions enforced by the database, like row-level security in PostgreSQL or Snowflake, are honored on every query.
The disadvantage is that every page load is a query. Ten people opening the same dashboard at 9:01 AM is ten queries. Add a filter and it is another query. On warehouses billed by compute time (Snowflake, BigQuery, Databricks), this can become the largest line item in your data bill if you are not careful.
With scheduled refreshes, the BI tool runs the SQL on a schedule — every 15 minutes, hourly, every six hours, daily — and stores the result. Dashboard viewers query the stored result, not the warehouse.
How it shows up:
- Power BI calls this a scheduled refresh on an Import dataset
- Looker persists derived tables (PDTs) rebuilt on datagroup triggers
- Metabase caches question results for a configurable duration
The advantage is that one warehouse query serves many dashboard views. If a hundred people open the same dashboard between refreshes, it is still just one query. Load times are typically much faster because the data is pre-aggregated and stored in a fast columnar store.
The disadvantage is staleness. A dashboard refreshed at 6 AM does not reflect a transaction at 6:01 AM. For most internal reporting this is fine. For operational dashboards, customer-facing dashboards, or anomaly alerting, it is often not.
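The same morning under scheduled refreshes looks very different. Again a hypothetical sketch: the dict stands in for the BI tool's result store, and a real deployment would fire the refresh from cron or an orchestrator rather than calling it inline:

```python
# A hypothetical sketch of the scheduled-refresh pattern: the refresh job
# queries the warehouse; viewers read the stored result.
warehouse_queries = 0
cache: dict[str, list[dict]] = {}

def run_warehouse_query(sql: str) -> list[dict]:
    global warehouse_queries
    warehouse_queries += 1          # each call is billable warehouse compute
    return [{"revenue": 42_000}]    # placeholder result

def scheduled_refresh(sql: str) -> None:
    # Runs on the schedule (every 15 minutes, hourly, ...), not on page load.
    cache[sql] = run_warehouse_query(sql)

def view_dashboard(sql: str) -> list[dict]:
    # Viewers hit the stored result and never touch the warehouse.
    return cache[sql]

sql = "SELECT sum(amount) AS revenue FROM orders"
scheduled_refresh(sql)       # one warehouse query...
for _ in range(100):
    view_dashboard(sql)      # ...serves a hundred dashboard views
print(warehouse_queries)     # 1
```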
Cached extracts are a subset of scheduled refreshes worth calling out separately because of how aggressively they change the performance characteristics. Instead of caching a query result, you materialize an entire transformed dataset into a purpose-built engine optimized for analytics queries.
Examples:
- Tableau extracts, materialized into the Hyper engine
- Power BI Import datasets, materialized into the VertiPaq engine
- Datasets loaded into a real-time OLAP store like ClickHouse or Druid and queried from there
These work because the engine is designed for slice-and-dice queries on a fixed schema. Filtering, grouping, and joining are very fast once the data is loaded. The tradeoff is that the dataset has to be rebuilt or incrementally updated on a schedule, and the engine is a separate piece of infrastructure to operate or pay for.
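A minimal sketch of why extracts feel fast: once the dataset is materialized locally, slicing and grouping are pure in-memory operations. The list of dicts below is a stand-in for an engine like Hyper or VertiPaq, and the data is illustrative:

```python
from collections import defaultdict

# Hypothetical extract: the transformed dataset materialized locally.
extract = [
    {"day": "2026-05-01", "region": "EU", "revenue": 120},
    {"day": "2026-05-01", "region": "US", "revenue": 300},
    {"day": "2026-05-02", "region": "EU", "revenue": 90},
    {"day": "2026-05-02", "region": "US", "revenue": 310},
]

def slice_and_dice(rows, group_by: str, where=None):
    # Filtering, grouping, and aggregating run entirely against the
    # extract; the warehouse is never touched between rebuilds.
    totals = defaultdict(int)
    for row in rows:
        if where is None or where(row):
            totals[row[group_by]] += row["revenue"]
    return dict(totals)

print(slice_and_dice(extract, "region"))
# {'EU': 210, 'US': 610}
print(slice_and_dice(extract, "day", where=lambda r: r["region"] == "EU"))
# {'2026-05-01': 120, '2026-05-02': 90}
```

The tradeoff the text describes shows up in what the sketch omits: nothing here updates `extract`, so a rebuild or incremental load has to run on a schedule.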
The most common mistake is choosing refresh rate based on how often the data changes, not how often a human acts on it.
A few examples:
- A finance dashboard reviewed in a weekly meeting gains nothing from hourly refreshes; daily is plenty
- A marketing or sales pipeline dashboard checked each morning is well served by hourly refreshes
- An operational dashboard like a support queue, acted on continuously, needs refreshes every few minutes or live queries
The rule of thumb: refresh just often enough that no one ever sees the same number twice when they should see a new one. Anything more frequent is wasted compute.
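One way to operationalize the rule, sketched as a hypothetical helper. The divide-by-two factor is an assumption of this sketch, not a rule from any particular BI tool: refreshing twice per decision cycle means no one acts on data older than half a cycle, and anything more frequent is wasted compute:

```python
# Hypothetical helper: map how often humans act on a dashboard to a
# refresh interval. The /2 heuristic is an assumption of this sketch.
def refresh_interval_minutes(decision_cadence_minutes: float) -> float:
    return decision_cadence_minutes / 2

print(refresh_interval_minutes(24 * 60))  # daily finance review -> 720.0
print(refresh_interval_minutes(60))       # hourly pipeline check -> 30.0
print(refresh_interval_minutes(5))        # support queue checked every 5 min -> 2.5
```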
| Question | Live query | Scheduled refresh | Cached extract |
|---|---|---|---|
| Data freshness needed | Seconds | Minutes to hours | Minutes to days |
| Typical concurrent viewers | 1 to 20 | 10 to 1,000 | 100 to 10,000+ |
| Warehouse cost per view | High | Low | Very low |
| Setup complexity | Low | Medium | High |
| Honors database row-level security | Yes | Sometimes | Rarely |
| Good for ad-hoc exploration | Yes | Limited | Limited |
| Good for embedded customer dashboards | Only at low scale | Yes | Yes |
Use it as a starting point. Most teams end up with a mix: live queries for analyst exploration, scheduled refreshes for the dashboards executives look at, and cached extracts for anything embedded in a customer-facing product.
Live queries
Use when:
- Viewers need up-to-the-second freshness
- Database-enforced permissions like row-level security must apply to every viewer
- Analysts are exploring ad hoc
Avoid when:
- Many concurrent viewers open the same dashboard
- The warehouse bills by compute and repeated identical queries would dominate the bill

Scheduled refreshes
Use when:
- Many viewers share the same dashboards
- Minutes-to-hours of staleness is acceptable
Avoid when:
- Decisions depend on minute-level freshness
- Per-user database permissions must apply to every view

Cached extracts
Use when:
- Dashboards are embedded in a customer-facing product with high concurrency
- Large historical datasets need fast slice-and-dice
Avoid when:
- Freshness within minutes matters more than load time
- You cannot operate or pay for a separate extract engine
Many teams ask for “real-time dashboards” when they actually mean “fresh dashboards.” True real-time analytics — the kind where each event lands on the dashboard within a second of happening — is a different problem from choosing a refresh rate.
True real-time analytics needs:
- A streaming source: Kafka, or change data capture (CDC) from the transactional database
- A fast OLAP engine built for continuous ingest: ClickHouse, Pinot, Druid, Tinybird
- A dashboard layer designed for short polling or streaming subscriptions
Most BI tools — Tableau, Power BI, Looker, Metabase, Sigma, Basedash — are not designed for sub-second freshness on millions of events per minute. They will technically “refresh every minute,” but the underlying warehouse query is not architected for that workload.
If the use case is genuinely operational (live system health, ad bidding, fraud, trading), look at purpose-built real-time analytics stacks and consider real-time dashboard tools separately from your general-purpose BI. For “fresh enough” reporting on a transactional database, hourly or 15-minute scheduled refreshes are almost always the right call.
Refresh strategy is one of the biggest hidden drivers of warehouse cost. A few patterns to watch for.
Live queries with no caching layer. Every dashboard view triggers a warehouse query. If a 200-person team opens the same Snowflake-backed dashboard each morning, that is 200 warehouse executions of the same query. Most of them could be served from a 5-minute cached result with no one noticing.
Scheduled refreshes that run every 5 minutes “just in case.” A common Power BI or Tableau pattern. If the data only updates hourly upstream, 11 out of every 12 refreshes do nothing useful and still consume warehouse compute.
Cached extracts that get rebuilt from scratch. Many tools default to full rebuilds. If you have a hundred-million-row fact table and a daily extract, an incremental refresh pattern can cut warehouse usage by 90% or more.
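The incremental pattern can be sketched with a high-water-mark cursor. Everything below is illustrative: the `fact_table` list stands in for a warehouse table, and a real implementation would push the filter into SQL as a WHERE clause on an `updated_at` or ID column:

```python
# Hypothetical incremental refresh using a high-water-mark cursor.
fact_table = [
    {"id": 1, "updated_at": 100, "amount": 10},
    {"id": 2, "updated_at": 200, "amount": 20},
    {"id": 3, "updated_at": 300, "amount": 30},
]

extract: dict[int, dict] = {}   # the materialized dataset, keyed by row id
high_water_mark = 0             # cursor: newest updated_at already loaded

def incremental_refresh() -> int:
    """Merge only rows newer than the cursor; return how many were loaded."""
    global high_water_mark
    new_rows = [r for r in fact_table if r["updated_at"] > high_water_mark]
    for row in new_rows:
        extract[row["id"]] = row
    if new_rows:
        high_water_mark = max(r["updated_at"] for r in new_rows)
    return len(new_rows)

print(incremental_refresh())  # 3: the first run loads everything
print(incremental_refresh())  # 0: nothing new, no wasted rebuild
fact_table.append({"id": 4, "updated_at": 400, "amount": 40})
print(incremental_refresh())  # 1: only the new row is merged
```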
Filter combinations that bypass the cache. Some BI tools cache by exact query hash. If users frequently apply unique filter combinations, every variant is a cache miss. Pre-aggregating common rollups (by day, by region, by product) often helps more than tuning the cache TTL.
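A sketch of the miss problem, assuming a cache keyed on a hash of the exact SQL text, as some tools do. Every name here is hypothetical:

```python
import hashlib

# Hypothetical result cache keyed on a hash of the exact SQL text.
cache: dict[str, list] = {}
misses = 0

def cached_query(sql: str) -> list:
    global misses
    key = hashlib.sha256(sql.encode()).hexdigest()
    if key not in cache:
        misses += 1        # every miss is a real warehouse query
        cache[key] = []    # placeholder result rows
    return cache[key]

# Five users apply five unique filter values: five cache misses.
for region in ["EU", "US", "APAC", "LATAM", "MEA"]:
    cached_query(f"SELECT sum(amount) FROM sales WHERE region = '{region}'")
print(misses)  # 5

# A pre-aggregated rollup by region is one warehouse query; later region
# filters can be answered from it locally instead of hitting the cache.
cached_query("SELECT region, sum(amount) FROM sales GROUP BY region")
print(misses)  # 6, and it stays there however many regions are viewed
```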
A useful exercise: for each dashboard, multiply (refresh frequency in a day) × (average query runtime in seconds) × (warehouse credits per second) and compare it to (daily viewers) × (average load time in seconds) × (credits per second). The cheaper number tells you which strategy is right for that dashboard. Most teams have a handful of expensive outliers that would benefit from a different mode.
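The exercise above, written out as two functions. Every rate and runtime below is an illustrative placeholder, not a benchmark:

```python
# Back-of-envelope cost comparison between refresh strategies.
def daily_cost_scheduled(refreshes_per_day: int, query_runtime_s: float,
                         credits_per_s: float) -> float:
    # (refresh frequency in a day) x (query runtime) x (credits per second)
    return refreshes_per_day * query_runtime_s * credits_per_s

def daily_cost_live(daily_viewers: int, avg_load_time_s: float,
                    credits_per_s: float) -> float:
    # (daily viewers) x (average load time) x (credits per second)
    return daily_viewers * avg_load_time_s * credits_per_s

credits_per_s = 0.003  # hypothetical warehouse rate

# Hourly refresh of a 30-second query vs 200 viewers each running it live:
scheduled = daily_cost_scheduled(24, 30, credits_per_s)
live = daily_cost_live(200, 30, credits_per_s)
print(round(scheduled, 2), round(live, 2))  # 2.16 18.0
print("cheaper:", "scheduled" if scheduled < live else "live")
```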
A short, honest read of where the major tools sit. This is not a recommendation; it is how their architecture pushes you toward certain choices.
- Tableau pushes toward extracts: Hyper is fast, and live connections are the exception
- Power BI pushes toward Import mode into VertiPaq, with DirectQuery for cases that need live data
- Looker pushes toward live queries against the warehouse, with caching layered on top
- Metabase and Basedash query the source live, with result caching to absorb repeat views
If you are picking a BI tool partly on refresh behavior, ask vendors: (1) what is cached and where, (2) whether row-level security applies to the cache as well as the source, (3) whether refresh failures alert someone, and (4) whether refresh time counts toward your warehouse bill or theirs.
If you are tuning refresh strategy on an existing BI deployment, the order of operations that usually works:
1. Measure per-dashboard warehouse cost and find the expensive outliers; most teams have a handful.
2. Cut refresh frequency so it matches how often the upstream data actually changes.
3. Switch full extract rebuilds to incremental refreshes where the tool supports them.
4. Add caching or pre-aggregated rollups for dashboards with many concurrent viewers.
Basedash is built around live queries against your connected database or warehouse, with per-query caching to keep page loads fast even when many people view the same dashboard. This is a good fit for startups and lean teams who want one model — query the source — without operating a separate extract engine. For dashboards that need to scale to hundreds or thousands of concurrent viewers, or that pull from very large historical datasets, a tool with a dedicated cache like Tableau or Power BI may be a better match. The honest answer for most teams is to use one tool for internal live exploration and another for embedded or high-scale customer-facing dashboards. Refresh strategy is the lens that makes that choice obvious.
How often should a BI dashboard refresh?
Match the refresh interval to how often a human acts on the dashboard. Daily for board and finance reporting, hourly for marketing and sales, every few minutes (or live) for operational dashboards like support queues. Sub-minute freshness is a streaming problem, not a BI refresh problem.
What is the difference between a live query and a scheduled refresh?
A live query runs the SQL against the warehouse every time a user loads the dashboard. A scheduled refresh runs the SQL on a fixed cadence and stores the result; users query the stored result, not the warehouse.
When should I use Power BI Import vs DirectQuery?
Use Import (a scheduled extract into VertiPaq) for most dashboards. Use DirectQuery when the data needs to be live, you have row-level security enforced in the source, or the dataset is too large to import. Composite models let you mix the two on a per-table basis.
Are Tableau extracts the same as caching?
Not quite. Extracts are full materializations into Tableau’s Hyper engine, designed for fast slice-and-dice on a fixed schema. Caching usually refers to short-lived storage of recent query results. Extracts are what you set up; caching is what happens automatically on top.
Do scheduled refreshes still cost warehouse credits?
Yes. Each scheduled refresh runs a query against your warehouse and consumes compute. The savings come from amortizing that query across many dashboard views, not from avoiding warehouse usage entirely.
How do I get real-time dashboards?
Real-time analytics needs a streaming source (Kafka, CDC), a fast OLAP engine (ClickHouse, Pinot, Druid, Tinybird), and a tool designed for short polling or streaming subscriptions. General-purpose BI tools can simulate it with frequent refreshes but are not architected for sub-second freshness at scale.
Written by
Max Musing is the founder and CEO of Basedash, an AI-native business intelligence platform designed to help teams explore analytics and build dashboards without writing SQL. His work focuses on applying large language models to structured data systems, improving query reliability, and building governed analytics workflows for production environments.
Basedash lets you build charts, dashboards, and reports in seconds using all your data.