06.17.14

A hundred Superbowls per sq km?

Posted in General, Regulatory, Spectrum at 4:02 am by timfarrar

Back in February, I wrote an article for GigaOm, questioning the unrealistic projections of future data traffic produced by the ITU Speculator model. Since then the conclusions of one of the studies I mentioned, conducted by Real Wireless for Ofcom in June 2013, have been amended to reduce the modeled traffic per sq km by a factor of 1000 (from 10 PB per sq km per month to 10 TB per sq km per month in suburban areas), by the simple expedient of changing the label on the chart axis in Figure 44. The new version of the report fails to give any explanation of why this thousandfold “error” occurred, or indeed how the new results are consistent with the ITU model (which of course does project traffic demand of petabytes per sq km per month by 2020).

Ofcom claimed by way of explanation, in a statement to PolicyTracker, that “since the report has served its purpose we do not plan to carry out any further work to update it,” but one therefore has to wonder exactly what that purpose was, if not to exaggerate future demand for mobile spectrum and/or shore up a model which even Ofcom now apparently considers to be in error by a factor of 1000.

Just to give another illustration of quite how badly wrong the Speculator model is, I thought it might be helpful to compare the predicted levels of traffic demand with the traffic experienced during the Superbowl in 2014, which is documented in a Computerworld article from earlier this year. That article highlights that AT&T carried around 119 GB of traffic in the busiest hour of the game, while Verizon carried roughly 3 times as much as AT&T. Broadly, we can therefore estimate that the total amount of data traffic across all mobile networks in the busiest hour of what is widely viewed as the most extreme situation for mobile demand in the entire US (if not the whole world) was around 500 GB in the square kilometer in and around the stadium (depicted in red below).

For comparison, the Speculator model projects that by 2020, the typical level of everyday demand that needs to be accommodated by mobile networks (excluding WiFi) in a dense urban public area will be 51 TB per hour per sq km, roughly one hundred times the traffic level experienced in the busiest hour at the Superbowl in 2014.
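
To make the arithmetic explicit, here is a back-of-the-envelope check (a minimal Python sketch; the 119 GB and 3x figures are the ones quoted above from Computerworld, while treating AT&T plus Verizon as essentially the whole market at the stadium is my own simplifying assumption):

    # Back-of-the-envelope check of the Superbowl comparison.
    # 119 GB (AT&T) and the ~3x Verizon multiple are from the Computerworld
    # article; rounding the combined total up to ~500 GB to allow for the
    # remaining carriers is an assumption, not a reported figure.
    att_gb = 119                         # AT&T, busiest hour of the game
    verizon_gb = 3 * att_gb              # ~357 GB
    total_gb = att_gb + verizon_gb       # ~476 GB, round to ~500 GB per sq km

    speculator_gb_per_hour = 51 * 1000   # 51 TB/hour/sq km, dense urban, 2020
    print(total_gb)                      # 476
    print(speculator_gb_per_hour / 500)  # 102.0, i.e. roughly 100x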

When AT&T reports that data usage in the busiest hour of the game has increased by only a factor of four in the last 3 years, is it really credible to expect traffic at the Superbowl to increase by 100 times in the next 6 years? And even if traffic at the Superbowl itself grows by leaps and bounds, why should global spectrum allocations be set based on traffic in the busiest hour at the busiest location in the whole of the US? Clearly, a more rational conclusion is that the Speculator model is simply wrong, and cannot possibly be representative of typical scenarios for mobile data usage in 2020.
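
Extrapolating those growth rates shows the size of the gap (a quick sketch, assuming traffic simply compounds at a constant annual rate in both cases):

    # Observed growth trend vs. what the Speculator model implies.
    # Constant compounding in both cases is my own simplification.
    observed_annual = 4.0 ** (1 / 3)     # 4x over 3 years -> ~1.59x per year
    implied_6yr = observed_annual ** 6   # = 4^2 = 16x over 6 years
    required_annual = 100.0 ** (1 / 6)   # 100x over 6 years -> ~2.15x per year

    print(round(implied_6yr))            # 16
    print(round(required_annual, 2))     # 2.15

On the observed trend, busiest-hour traffic would grow roughly 16x by 2020, nowhere near the 100x jump the model requires.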

3 Comments

  1. Scottminehane said,

    June 22, 2014 at 3:30 am

    Agree generally with the article, but it is far from certain that the Superbowl is the most extreme situation for mobile demand in the entire world. At the Australian Football League (AFL) Grand Final in September 2013 at the Melbourne Cricket Ground (MCG), the demand may have been higher; see http://www.bandt.com.au/media/AFL-Half-a-million-texts-sent-from-the-G. Telstra has maybe 50 percent of the Oz market.

    It is for this reason that Telstra is moving quickly on LTE-B; see http://www.whistleout.com.au/MobilePhones/News/Telstra-LTE-B-trial-successful, and note the supportive comments made by Telstra at CommunicAsia in Singapore last week. Cheers

  2. 4gpro said,

    August 29, 2014 at 7:27 am

    I agree with “why should global spectrum allocations be set based on traffic in the busiest hour at the busiest location in the whole of the US?”

    The industry is prone to take the easiest/least costly way out, pushing for more spectrum and co-use of Wi-Fi spectrum for LTE rather than expending the effort to innovate. These special use cases are often difficult to justify on a cost basis, because the operator revenue model (subscriber and premium service plans) does not provide the means to fund the specific capex or to pursue R&D specific to the situation. However, the highest-density use scenarios, which demand advances in interference mitigation, higher-order MIMO, Co-MIMO and MU-MIMO/CoMP, and advanced tiered use of spectrum, can serve as excellent test beds to push the evolution forward and help spur the problem-solving innovations that will be needed to densify deployments more broadly.

    An advantage of stadium/intermittent-use locations is that they are vacant much of the time, providing a deployment and test environment that can be used to try out new technologies and the interplay of methods. Event-goers are, in one way of looking at it, the guinea pigs in the experimental development of advanced uses of technology, from which operators can pull best-of-breed solutions into the broader market. Rather than seeing this as abusing the public as guinea pigs, they will be getting the privilege, with fallback in place, of free use of future mainstream technologies and methods.

  3. TMF Associates MSS blog » Report on ITU Speculator model published said,

    September 17, 2014 at 8:03 am

    [...] published and is available here. This report critiques the ITU Speculator model, concluding that (as I noted earlier), the Speculator model traffic assumptions vastly exceed any reasonable traffic density that can be [...]
