For years, the self-storage industry has focused on one core challenge: getting access to better data.
More coverage. Better pricing visibility. More market intelligence. More historical context.
That push made sense. Most operators, investors, and developers were working with incomplete information. If you could access more reliable data than the next person, you had an edge.
But that is no longer the real bottleneck.
Today, the issue is not whether teams have data; it is whether they can actually operationalize it quickly enough to matter.
That distinction is becoming increasingly important because most self-storage workflows are still built around static moments in time. Data gets exported, pasted into Excel, analyzed, shared internally, and then slowly starts aging the second it enters the model.
The market moves. The spreadsheet does not.
And that gap between the market and the model is where a surprising amount of risk now lives.
The Industry Has Quietly Built a “Snapshot Problem”
Most people do not think about this because exporting data feels normal.
But nearly every underwriting model, comp sheet, acquisition memo, or pricing tracker in the industry is based on snapshots.
A snapshot of pricing.
A snapshot of competitors.
A snapshot of market conditions.
The challenge is that self-storage pricing has become far more dynamic than many legacy workflows were designed to handle.
Operators are adjusting rates more frequently. Promotions change constantly. Revenue management strategies differ by channel. Local competition can shift within weeks, not quarters.
Yet many teams are still making decisions based on information that may already be days or weeks behind current market conditions.
Not because they are careless. Because their workflow was never designed for continuously changing data.
The Real Competitive Gap Is Starting to Shift
Historically, competitive advantage came from:
- Having access to data others could not access
- Building larger comp sets
- Gathering information faster than competitors
But increasingly, the gap is shifting toward something else:
Which teams can keep their decision-making environment closest to real-time conditions?
That sounds subtle, but it changes almost everything.
Because the firms operating closest to live market conditions:
- Refresh assumptions faster
- Adjust pricing strategy sooner
- Identify market shifts earlier
- Re-run scenarios more often
- Make decisions with less lag between insight and execution
In other words, the advantage is no longer just information.
It is responsiveness.
Why Excel Became the Bottleneck Nobody Talks About
Excel remains the operational center of the industry for a reason.
It is flexible, customizable, and deeply embedded into how deals are evaluated. Most teams are not looking to replace it.
But Excel was originally built around static inputs. You import data. Build formulas. Lock assumptions. Save the file. The problem is that the market is now behaving more like a live environment than a static one.
That creates friction.
Every time pricing changes, teams are forced to:
- Re-export datasets
- Rebuild portions of models
- Verify assumptions manually
- Reconcile inconsistencies between files
Over time, this creates an invisible operational tax. Not just in labor hours, but in hesitation. Because when updating a model becomes time-consuming, teams naturally do it less frequently. That means fewer iterations. Fewer stress tests. Fewer refreshed assumptions.
And ultimately, slower decisions.
Most Models Are More Fragile Than People Realize
One of the more overlooked issues in self-storage underwriting is how sensitive many models are to small pricing assumptions.
A seemingly minor difference in rent growth assumptions, occupancy stabilization timing, or competitive positioning can materially change:
- Exit projections
- Development feasibility
- Debt coverage expectations
- Acquisition pricing
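To make the sensitivity concrete, here is a minimal sketch of how a one-point difference in the rent growth assumption flows into an exit valuation. All figures (starting NOI, hold period, exit cap rate) are hypothetical illustration values, not data from any actual deal:

```python
# Illustrative only: hypothetical facility numbers chosen for the example.
def exit_value(noi_year1, rent_growth, hold_years, exit_cap):
    """Grow year-1 NOI at a constant rate, then capitalize at exit."""
    exit_noi = noi_year1 * (1 + rent_growth) ** hold_years
    return exit_noi / exit_cap

# Same deal, two growth assumptions one percentage point apart
base  = exit_value(noi_year1=500_000, rent_growth=0.03, hold_years=5, exit_cap=0.055)
stale = exit_value(noi_year1=500_000, rent_growth=0.02, hold_years=5, exit_cap=0.055)

print(f"3% growth: ${base:,.0f}")
print(f"2% growth: ${stale:,.0f}")
print(f"Swing in exit value: {base / stale - 1:.1%}")  # roughly a 5% gap
```

A single percentage point of annual rent growth, compounded over a five-year hold, moves the projected exit value by about five percent before any leverage is applied. That is why a quietly stale assumption can matter as much as an outright wrong one.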
The issue is not that assumptions are wrong.
It is that many assumptions are unintentionally stale.
This becomes especially dangerous during periods where:
- Demand softens unevenly
- Supply pipelines contract
- Operators respond differently to occupancy pressure
- Market conditions diverge locally instead of nationally
A comp set that felt reliable 60 days ago may no longer reflect the competitive landscape today.
And because most workflows are not continuously connected to live data, many teams do not realize how far their assumptions have drifted until later in the process.
Why Historical Context Matters More Than Point-in-Time Pricing
Another industry blind spot is over-reliance on current pricing without enough historical framing.
Point-in-time pricing can be misleading on its own.
A facility may appear aggressive today because:
- It recently adjusted occupancy strategy
- It is responding to new competition
- It is using temporary promotions
- It is recovering from a previous pricing reset
Without historical context, it is difficult to understand whether current pricing represents:
- A trend
- A temporary adjustment
- A long-term repositioning
- Or an outlier moment
This is why historical averages matter so much.
Using T-12 and T-25 historical averages creates a more stable understanding of market behavior by reducing sensitivity to short-term noise.
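The smoothing effect is easy to see in a small sketch. The monthly rates below are made-up illustration data for a single unit type, with a short promotional dip in the middle; the trailing window absorbs the dip instead of reacting to it:

```python
# Sketch: smoothing monthly asking rates with a trailing average
# (T-12 shown; the same function works for any window length).
def trailing_average(monthly_rates, window):
    """Average of the most recent `window` observations."""
    if len(monthly_rates) < window:
        raise ValueError("not enough history for this window")
    return sum(monthly_rates[-window:]) / window

# 14 months of hypothetical $/month rates, including a two-month promo dip
rates = [120, 121, 122, 118, 95, 96, 119, 121, 123, 124, 125, 124, 126, 127]

spot = rates[-1]                    # point-in-time price
t12 = trailing_average(rates, 12)   # trailing 12-month average
print(f"Spot rate: ${spot}, T-12 average: ${t12:.2f}")
```

The spot rate says the facility charges $127; the T-12 average sits around $118 because it still carries the promotional months. Neither number is wrong, but only together do they tell you whether today's price is the norm or the exception.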
More importantly, access to all-time facility-level pricing history changes how assets can be evaluated entirely.
Instead of asking:
“What is this facility charging today?”
Teams can ask:
“How has this facility behaved across different market environments?”
That is a fundamentally more powerful question.
The Industry Still Underestimates the Importance of Web vs. In-Store Pricing
Another area that remains surprisingly underexplored is the relationship between web pricing and in-store pricing.
Many models still treat pricing as a single number.
In practice, operators increasingly manage different pricing strategies across channels.
This means:
- Web pricing may reflect a promotional acquisition strategy
- In-store pricing may reflect operational yield management
- Discounting behavior may vary significantly by market
When teams only evaluate one side of that equation, they are often missing how competitors are truly positioning themselves.
Being able to evaluate web and in-store rates simultaneously introduces a more realistic understanding of:
- Competitive aggressiveness
- Revenue strategy
- Occupancy management behavior
- Consumer acquisition tactics
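A simple way to surface channel strategy is to compare the two rates side by side and flag large spreads. The comp set below is entirely hypothetical, and the 15% threshold is an arbitrary cutoff chosen for illustration:

```python
# Sketch: flagging competitors whose web rate is discounted heavily
# versus their in-store rate. Facilities and rates are hypothetical.
comps = [
    {"facility": "Comp A", "web": 99.0,  "in_store": 119.0},
    {"facility": "Comp B", "web": 115.0, "in_store": 115.0},
    {"facility": "Comp C", "web": 89.0,  "in_store": 125.0},
]

def web_discount(web, in_store):
    """Fractional discount of the web rate relative to the in-store rate."""
    return 1 - web / in_store

# Competitors discounting the web channel by 15% or more (illustrative cutoff)
flagged = [c["facility"] for c in comps
           if web_discount(c["web"], c["in_store"]) >= 0.15]
print(flagged)  # Comp A and Comp C
```

A model that only saw Comp A's $99 web rate would read the market as soft; one that only saw the $119 in-store rate would miss an aggressive acquisition play. The spread itself is the signal.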
This becomes particularly important in highly competitive trade areas where subtle pricing shifts can signal broader operational pressure.
The Future Is Not More Dashboards
A common assumption in the software industry is that the solution to better decision-making is another dashboard.
But many experienced operators and investors do not actually want more dashboards.
They want:
- Faster workflows
- Cleaner assumptions
- Better inputs inside existing systems
This is an important distinction.
The next phase of competitive advantage in self-storage likely will not come from forcing teams into entirely new platforms.
It will come from embedding intelligence directly into the environments where decisions already happen.
That is a very different philosophy.
What Radius+ Relay Is Actually Solving
This is ultimately the thinking behind Radius+ Relay.
Relay is not trying to replace Excel.
It is trying to eliminate the friction between market intelligence and the spreadsheet models where decisions are already being made.
By allowing teams to pull:
- Real-time pricing
- Monthly pricing trends
- T-12 and T-25 historical averages
- All-time facility-level pricing history
- Web and in-store rates simultaneously
directly into Excel, Relay changes the relationship between the market and the model.
The spreadsheet stops being a static file and becomes a living representation of current market conditions.
That shift matters more than most people realize.
Because when updating assumptions becomes effortless, teams naturally:
- Iterate more often
- Stress test more aggressively
- Refresh decisions faster
- Operate closer to real-world conditions
And over time, that compounds into a meaningful competitive advantage.
The Firms Closest to Reality Will Win
For years, the industry focused on acquiring more data.
The next phase will be about reducing the distance between:
- The market
- The model
- And the decision itself
The firms that operate closest to live conditions will simply make better decisions over time.
Not because they are smarter.
Because their workflows allow them to respond faster, test more frequently, and adapt with less friction.
That is where the industry is heading.
And increasingly, the advantage will belong not to the teams with the most data, but to the teams with the shortest gap between insight and action.
