Steven J. Crowley, P.E.
“To generalize, it is often true that studies will be promoted that tend to support the policy inclinations of the Chairman, under whose direction, after all, every draft decision is made.”
“[S]tatistics can lie. But cast as ‘studies’ by commentors, they take on the weight that a decision maker chooses to make of them.”
As a follow-on to its National Broadband Plan, the FCC last year released a Technical Paper intended to validate the Plan’s prediction of a 300 MHz mobile-broadband spectrum deficit by 2014. The Paper describes a spectrum requirements model that totals current spectrum assigned to mobile broadband and applies a multiplier based on expected demand, taking into account expected increases in tower density and improvements in air-interface spectrum efficiency. The model’s result is a predicted deficit of 275 MHz in 2014, which rounds to 300 MHz. On the way toward that result, however, the analysis uses just a few of the available data forecasts, ignores offloading of macrocell data to Wi-Fi and femtocells, and assumes the continuation of flat-rate plans for consumers. Some of these oddities I noted in a post at the time. I had hoped the FCC would make the Paper a subject of public comment. That hasn’t happened. So, I’ve looked at the Paper in more detail. I find that when these factors are treated more realistically, the predicted spectrum requirement drops significantly.
INVALID ASSUMPTION #1: THREE ARBITRARILY PICKED FORECASTS ARE REPRESENTATIVE
To estimate spectrum requirements in 2014, the FCC’s model uses a multiplier based on an average of three forecasts of mobile-broadband data demand. These are by Cisco, Coda Research, and Yankee Group. From 2009 to 2014, they predict mobile broadband data growth of 4722%, 3464%, and 2332%, respectively, for an average of 3506% (or, if you prefer, 35x).
The FCC characterizes these as “industry analyst mobile data demand forecasts” when in fact only two are from industry analysts. Cisco is an equipment vendor. The preparation of its forecast is managed by a member of Cisco’s marketing team. The Cisco forecast is used to promote the sale of Cisco’s core-network hardware that can be used to help address the increased data demand the forecast predicts. The Technical Paper gives no indication as to why these three forecasts were chosen and others rejected. It would be as if the FDA looked at clinical trial data, ignored the statistics, and made decisions based on its favorite data points.
The decision to use Cisco’s forecast, and not those of other equipment vendors, is odd. Forecasts are also available from Alcatel-Lucent, Ericsson, and Nokia Siemens Networks (NSN), among others. Unlike Cisco, these companies have core competency in the 3G/4G radio air interface that is the most challenging bottleneck when it comes to mobile capacity, so it would seem useful to include their findings. For the same time frame looked at by the FCC, 2009-2014, Alcatel-Lucent, Ericsson, and NSN predict data growth of 3893%, 2541%, and 811%, respectively. The 811% looks very low, but is consistent with recent forecasts predicting data increases in the 8x-10x range over the next several years; these lower estimates may be an indication of the large-cell model of cellular hitting an inflection point on a technology-maturation S-curve.
When the forecasts are considered with those of Cisco, Coda Research, and Yankee Group, the six-forecast average is 2961%, or 15.5% less than the FCC’s estimate of 3506%. What does this do to spectrum requirements? From the FCC’s own sensitivity analysis in the Paper (p. 22), this reduces the 2014 shortfall from 275 MHz to approximately 165 MHz.
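The arithmetic behind these averages can be checked directly. Here is a quick sketch using the six forecast figures quoted above:

```python
# 2009-2014 mobile-broadband data growth forecasts quoted above, in percent
fcc_three = {"Cisco": 4722, "Coda Research": 3464, "Yankee Group": 2332}
vendors = {"Alcatel-Lucent": 3893, "Ericsson": 2541, "NSN": 811}

fcc_avg = sum(fcc_three.values()) / len(fcc_three)      # 3506%
six_avg = sum({**fcc_three, **vendors}.values()) / 6    # 2960.5%, ~2961%
reduction = (fcc_avg - six_avg) / fcc_avg * 100         # ~15.5%

print(f"FCC three-forecast average: {fcc_avg:.1f}%")
print(f"Six-forecast average:       {six_avg:.1f}%")
print(f"Reduction vs. FCC average:  {reduction:.1f}%")
```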
INVALID ASSUMPTION #2: OFFLOADING IS AN ABSTRACT CONCEPT
Today’s typical macrocell (large cell) wireless systems have always expended disproportional resources trying to overcome building attenuation and reach user devices indoors; it’s been an outside-in approach. Adding to the challenge, we’re inside 70% of the time, and will be inside even more as time goes on, according to Informa estimates. Building attenuation is not the only indoor problem; signals indoors weaken as the distance to base stations increases. Furthermore, capacity available to a user goes down as more users join the cell.
At the same time, our indoor spaces increasingly have fixed broadband service. This can be used in conjunction with small cells, such as Wi-Fi access points or femtocells, to offload data from the macrocell. When the user is close to small cells, a lot of good things happen, things beyond the ability of additional spectrum to provide. Building attenuation goes down because we’re not punching through as many walls. Signal strength increases because of the shorter distance. Throughput to the user goes up because capacity is no longer shared with several dozen others. (Throughput to those still on macrocells goes up, too, because they’re no longer competing with the small cell users.) As an added benefit, since the user is close to the cell, not as much power is needed on the uplink; handset transmit power goes down, increasing battery life. Taking all these factors into account, data rates available to a user can go up 80x or more using small cells depending on the deployment scenario. In contrast, doubling available spectrum increases throughput only 2x. Allocating the entire 300-3000 MHz band to mobile broadband would increase throughput only 7x, were that a practical option.
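The asymmetry between adding spectrum and adding cells follows from first-order link arithmetic: Shannon capacity grows linearly with bandwidth, while every additional small cell reuses the same spectrum. The sketch below illustrates the point; the 20 MHz bandwidth, SNR of 10, and ten-cell count are illustrative assumptions, not figures from the Technical Paper:

```python
from math import log2

def link_capacity(bandwidth_mhz: float, snr: float) -> float:
    """First-order Shannon capacity, C = B * log2(1 + SNR), in Mbit/s."""
    return bandwidth_mhz * log2(1 + snr)

base = link_capacity(20, 10)         # one cell, 20 MHz, SNR of 10

# Doubling the spectrum doubles capacity -- no more.
doubling_gain = link_capacity(40, 10) / base        # 2.0

# Ten small cells reuse the SAME 20 MHz, so aggregate capacity
# scales with the cell count (interference ignored in this sketch).
small_cell_gain = 10 * link_capacity(20, 10) / base  # ~10

print(doubling_gain, small_cell_gain)
```

Real deployments capture less than the ideal N-fold gain from N cells because of inter-cell interference, but the linear-in-bandwidth ceiling on what more spectrum can deliver holds regardless.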
The outside-in approach of macrocells is turning inside-out, bringing the user closer to the base station. This offloading concept is consistent with the ITU’s 2003 vision of heterogeneous networks; each wireless access technology excels in certain circumstances, and shouldn’t be force-fit into others. Additional spectrum alone cannot approach the capacity increases achievable through small cells. Qualcomm, a longtime advocate of more spectrum for mobile broadband, recently said that “the next performance and capacity leap will come from network topology evolution by using a mix of macro cells and small cells – also referred to as a Heterogeneous Network (HetNet) – effectively bringing the network closer to the user.” The same improvements in electronics technology that enable smartphones, and their increased data requirements, likewise enable new small-cell technology that can address the demand. Wireless innovation is not only on the user-device side.
Despite this progress in small cells, the Technical Paper inexplicably dismisses offloading:
“Since this paper is focused on mobile data traffic, strategies to offload traffic to femto-cells and WiFi is [sic] not directly considered. In addition, the rollout of such network architecture strategies has been slow to date, and its effects are unclear.”
First, mobile broadband data traffic and offloading go hand in hand and thus must be considered; the more data offloaded, the less carried by the mobile broadband network, and the less spectrum required for that network. Second, offloading strategies are ramping up quickly. The “effects” are clear. Less data is carried on the macrocell, reducing the need for new spectrum.
If the Technical Paper does not directly consider the effects of offloading, perhaps the input forecasts from Yankee Group, Coda, and Cisco do. It’s not clear, from information provided in the Paper, to what extent the three forecasts take offloading into account. Looking at the Cisco forecast separately, we can see it estimates that in 2014, 23% of wireless data in the U.S. will be offloaded to Wi-Fi and femtocells. More recent observations and forecasts, however, are substantially higher than Cisco’s. Commenting on the issue in its latest wireless competition report, the FCC said “AT&T has experienced significant growth in hot spot usage in the first half of 2010, with an estimated 40 percent of iPhone traffic in the United States being transmitted over a Wi-Fi connection.” Independent analysts ABI Research and Juniper Research predict worldwide offloading rates of 48% and 63%, respectively, in 2015. ComScore estimates that in August 2011, 37.2% of mobile phone data was sent using a WiFi connection, a percentage that grew almost 3 points in just the preceding three months. For 2014, Juniper Research predicts North American offloading will reach 76.9%. Cisco’s underestimation of offloading contributes to its forecast usually being the highest of the bunch, making it the go-to forecast for spectrum crisis adherents.
The FCC didn’t directly consider offloading, but we can. For the purposes of this post, let’s average the low and high offload estimates, from Cisco and Juniper Research. That gives us 50% as an offload factor. Adjusting Cisco’s forecast using the 50% offload factor instead of 23%, one gets a Cisco 2014 data growth relative to 2009 of 3066%. This lowers the six-forecast average to 2685%, which is 23.4% less than the FCC’s estimate. Returning to the sensitivity analysis, the revised spectrum shortfall is approximately 115 MHz instead of 275 MHz.
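The adjustment works out as follows. This sketch assumes, as the back-out step requires, that Cisco’s 4722% figure is carried (post-offload) traffic net of its assumed 23% offload:

```python
cisco_growth = 4722     # Cisco 2009-2014 carried-traffic growth, percent
cisco_offload = 0.23    # offload share Cisco assumes for 2014 (U.S.)
offload = 0.50          # midpoint of Cisco's 23% and Juniper's 76.9%

# Back out total demand, then re-apply the 50% offload factor.
total_demand = cisco_growth / (1 - cisco_offload)
cisco_adjusted = total_demand * (1 - offload)     # ~3066%

others = [3464, 2332, 3893, 2541, 811]  # remaining five forecasts, percent
six_avg = (cisco_adjusted + sum(others)) / 6      # ~2685%
reduction = (3506 - six_avg) / 3506 * 100         # ~23.4% below the FCC figure

print(f"{cisco_adjusted:.0f}% {six_avg:.0f}% {reduction:.1f}%")
```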
INVALID ASSUMPTION #3: FLAT RATES RULE
Most U.S. operators have gone from flat-rate “all you can eat” rate plans to usage-based plans where the consumer is charged based on the amount of data used. AT&T made the change in June 2010, T-Mobile in April 2011, and Verizon Wireless in July 2011. (Sprint is the only major operator with an unlimited plan today, on its 3G network for the iPhone.) These new rate plans will further encourage users to offload data. The Technical Paper, again inexplicably, does not take this into account:
“The projections of mobile data demand used in this analysis are based in part on historic market dynamics, such as “all you can eat” pricing for data. The effect of new pricing strategies on consumer data demand is not yet known, but has the potential to impact data traffic projections if widely adopted in the market.”
I expect the effect of these new pricing strategies will be that consumers moderate their less essential use of mobile broadband. This will be especially so for the data hogs, a few of whom consume the bulk of mobile broadband data in each cell. There is not much public information yet on consumer response to these new pricing plans, but it’s something to watch out for, and then we can make a further correction to the spectrum deficit estimate.
The effects of usage-based plans on data consumption will be far reaching. With the cost of bits more aligned with what it costs the operator to supply them, the consumer has more price signals on which to act. These signals, in turn, will be felt by the rest of the wireless ecosystem. Under flat-rate plans there was little incentive for consumers to care how efficiently coded the phone’s operating system or applications were. Under newer plans that charge by use, consumers will have a heightened awareness of how much data they’re using, and for what purpose. Software will increasingly compete on efficiency. On the application side, operators know where every bit goes, and which applications consume the most data. There is no such awareness on the consumer side. Perhaps the industry will provide greater granularity in data metering, down to the application level, so users can better prioritize usage of their data plans. Once this happens, developer sensitization to the issue will increase.
Some of this software-based improvement is nearly “free,” requiring only improved programming practices. Last year, the World Wide Web Consortium adopted a recommendation for best practices in mobile web application development. As one example from the recommendation, mobile web applications often use several static images to represent buttons. Each image sent requires a separate HTTP request. The requests can be reduced to one by combining the buttons into a single static image, sending that image, and cropping the individual buttons from it. That saves data. Under flat-rate plans, why bother?
On the operating system side, different phones can vary greatly in the amount of data needed to perform the same function. One analysis, sponsored in part by Research in Motion, finds that across multiple applications and for the particular smartphones studied, the BlackBerry used much less data than the iPhone or Android phones. For web browsing, the BlackBerry was 2.1 times more efficient (i.e., used less than half the data) than iPhone iOS or Android. For e-mail, the BlackBerry was 4.5 times more efficient than Android and 11.4 times more efficient than iPhone iOS. As users become more aware of what data they’re getting for their money, competitive pressures will lead the less efficient operating systems to become more so.
FURTHER ADJUSTMENTS TO THE ESTIMATED SPECTRUM DEFICIT
We’ve looked at the estimated spectrum deficit from the basis of demand and the FCC’s model. Before we finish, let’s take a quick look at where we are today on the supply side. The following efforts show promise for making more spectrum available for mobile broadband within the 2014 timeframe:
- LightSquared is seeking approval to use 40 MHz of spectrum near 1500 MHz for terrestrial mobile broadband. Such use is pending resolution of GPS interference issues.
- The FCC has an open proceeding looking at maximizing the mobile broadband potential of a total 75 MHz of spectrum around 2 GHz. Some 40 MHz of that belongs to Dish Network, which recently asked the FCC for permission to deploy a hybrid satellite and terrestrial mobile and fixed broadband network. Qualcomm, Dish, and others have various smaller pieces of the 700 MHz band, which might be practical for use with LTE in a TDD mode using unpaired spectrum.
- NTIA has issued a plan and timetable identifying over 2200 MHz of Federal and non-Federal spectrum that might provide opportunities for wireless broadband use. For mobile use, the most promising band in the near term is 1755-1850 MHz. NTIA is finishing a detailed review of the band to determine to what extent it can be made available for commercial broadband use. The review was supposed to be completed by September 30, 2011 but is running late; perhaps we’ll see the results by the end of the year. I’m hoping at least 40-50 MHz becomes available from this band.
An accurate and current inventory of frequency assignments and usage would help identify new mobile spectrum, but such an inventory doesn’t exist. The preliminary steps the FCC has taken so far are so laden with disclaimers they can’t be relied upon. On the Federal side, the Government Accountability Office (GAO) recently reported that “NTIA’s data management system lacks transparency and data validation processes, making it uncertain if spectrum management decisions are based on accurate and complete data.” The U.S. should conduct a thorough spectrum inventory, informed by measurements.
We’ve seen that reasonable updating of the FCC’s spectrum deficit model significantly reduces the short-term forecast deficit for 2014. As adjusted, the Technical Paper doesn’t support the National Broadband Plan’s mobile broadband spectrum recommendations, as was its intent.
We might step back and ask why the FCC is forecasting spectrum demand at all. If the U.S. moves more toward a property-rights and marketplace regime for spectrum use, as it may with the incentive auction approach, the Technical Paper estimates and National Broadband Plan recommendations become less important as the market will tend to allocate spectrum resources efficiently. To the extent it does not, and fails to meet a public policy goal, policymakers can change the market outcome.
If a forecast is to be maintained, the FCC should reconcile the National Broadband Plan with an updated forecast spectrum deficit, openly prepared with broad input and using the latest data. Part of that examination should be the validity of the other methodology used in the Technical Paper. The work should be thorough enough that the FCC Chairman no longer feels the need to cite Cisco’s forecast in speeches, and can instead cite the work of the FCC’s own staff — with authority.