MH370: analysis of where to look…

Posted in General, Inmarsat at 2:23 pm by timfarrar

Last week’s Wall St Journal article and my blog post highlighted that the MH370 search area was poised to move to the southwest, and yesterday this shift was confirmed by Inmarsat.

Although the location of this new search area has not yet been released, the independent team that has been analyzing the publicly available data felt it was appropriate to provide a statement, given below, with our best estimate of the highest probability (but not the only possible) location for a potential search. In this way, we hope to provide information which can clearly be seen to be completely independent of any locations that might be published by the search team in the near future.

The statement is as follows:

Shortly after the disappearance of MH370 on March 8th, an informal group of people with diverse technical backgrounds came together on-line to discuss the event and analyze the specific technical information that had been released, with the individuals sharing reference material and their experience with aircraft and satellite systems. While there remain a number of uncertainties and some disagreements as to the interpretation of aspects of the data, our best estimates of a location of the aircraft at 00:11UT (the last ping ring) cluster in the Indian Ocean near 36.02S, 88.57E. This location is consistent with an average ground speed of approximately 470 kts and the wind conditions at the time. The exact location is dependent on specific assumptions as to the flight path before 18:38UT. The range of locations, based on reasonable variations in the earlier flight path, results in the cluster of results shown. We recommend that the search for MH370 be focused in this area.

We welcome any additional information that can be released to us by the accident investigation team that would allow us to refine our models and our predictions. We offer to work directly with the investigation team, to share our work, to collaborate on further work, or to contribute in any way that can aid the investigation. Additional information relating to our analysis will be posted on http://duncansteel.com and http://blog.tmfassociates.com. A report of the assumptions and approaches used to calculate the estimated location is being prepared and will be published to these web sites in the near future.

The following individuals have agreed to be publicly identified with this statement, to represent the larger collective that has contributed to this work, and to make themselves available to assist with the investigation in any constructive way. Other members prefer to remain anonymous, but their contributions are gratefully acknowledged. We prefer that contact be made through the organizations who have published this statement.

Brian Anderson, BE: Havelock North, New Zealand;
Sid Bennett, MEE: Chicago, Illinois, USA;
Curon Davies, MA: Swansea, UK;
Michael Exner, MEE: Colorado, USA;
Tim Farrar, PhD: Menlo Park, California, USA;
Richard Godfrey, BSc: Frankfurt, Germany;
Bill Holland, BSEE: Cary, North Carolina, USA;
Geoff Hyman, MSc: London, UK;
Victor Iannello, ScD: Roanoke, Virginia, USA;
Duncan Steel, PhD: Wellington, New Zealand.
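For readers unfamiliar with the "ping rings" referenced in the statement, the underlying geometry can be sketched in a few lines: each measured elevation angle to the (near-geostationary) Inmarsat satellite places the aircraft somewhere on a circle of known radius centred on the sub-satellite point. The sketch below treats the satellite as exactly geostationary, and the elevation angle used is purely illustrative, not the value measured for the final MH370 ping.

```python
import math

EARTH_RADIUS_KM = 6371.0
GEO_ALTITUDE_KM = 35786.0  # treating the satellite as exactly geostationary

def ping_ring_radius_km(elevation_deg):
    """Great-circle radius of the ring of points that see the satellite
    at a given elevation angle. In the triangle (Earth centre, observer,
    satellite) the angle at the observer is 90deg + elevation, so the
    sine rule gives the angle at the satellite, and the Earth-central
    angle is whatever remains of 180 degrees."""
    e = math.radians(elevation_deg)
    ratio = EARTH_RADIUS_KM / (EARTH_RADIUS_KM + GEO_ALTITUDE_KM)
    angle_at_satellite = math.asin(ratio * math.cos(e))
    central_angle = math.pi / 2 - e - angle_at_satellite
    return EARTH_RADIUS_KM * central_angle

# Illustrative only: 40 degrees is a placeholder, not the measured value
print(round(ping_ring_radius_km(40.0)))  # a ring roughly 4800km from the sub-satellite point
```

Intersecting a ring of this kind with assumptions about speed and earlier track is what produces the cluster of candidate locations described above.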

A hundred Superbowls per sq km?

Posted in General, Regulatory, Spectrum at 4:02 am by timfarrar

Back in February, I wrote an article for GigaOm, questioning the unrealistic projections of future data traffic produced by the ITU Speculator model. Since then the conclusions of one of the studies I mentioned, conducted by Real Wireless for Ofcom in June 2013, have been amended to reduce the modeled traffic per sq km by a factor of 1000 (from 10 PB per sq km per month to 10 TB per sq km per month in suburban areas), by the simple expedient of changing the label on the chart axis in Figure 44. The new version of the report fails to give any explanation of why this thousandfold “error” occurred, or indeed how the new results are consistent with the ITU model (which of course does project traffic demand of petabytes per sq km per month by 2020).

Ofcom claimed by way of explanation, in a statement to PolicyTracker, that “since the report has served its purpose we do not plan to carry out any further work to update it,” but one therefore has to wonder exactly what that purpose was, if not to exaggerate future demand for mobile spectrum and/or shore up a model which even Ofcom now apparently considers to be in error by a factor of 1000.

Just to give another illustration of quite how badly wrong the Speculator model is, I thought it might be helpful to compare the predicted levels of traffic demand with that experienced during the Superbowl in 2014, which is documented in a Computerworld article from earlier this year. That article highlights that AT&T carried around 119 GB of traffic in the busiest hour of the game, while Verizon carried roughly 3 times as much as AT&T. Broadly, we can therefore estimate that the total amount of data traffic across all mobile networks in the busiest hour of what is widely viewed as the most extreme situation for mobile demand in the entire US (if not the whole world) is around 500GB in the square kilometer in and around the stadium (depicted in red below).

For comparison, the Speculator model projects that by 2020, the typical level of everyday demand that needs to be accommodated by mobile networks (excluding WiFi) in a dense urban public area will be 51 TB per hour per sq km, one hundred times more than the traffic level experienced in the busiest hour at the Superbowl in 2014.
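The hundred-fold gap can be checked with simple arithmetic, using only the figures quoted above (the ~500 GB total rounds up the AT&T-plus-Verizon estimate to allow for other carriers):

```python
# Rough check of the busiest-hour comparison, using the figures quoted above
att_gb = 119                      # AT&T traffic in the busiest hour of the game
total_gb = att_gb * (1 + 3)       # Verizon carried roughly 3x AT&T: ~476 GB,
                                  # rounded up to ~500 GB once other carriers are included
superbowl_tb_per_hr = 0.5         # ~500 GB in roughly one sq km
speculator_tb_per_hr = 51         # ITU Speculator, dense urban public area, 2020
print(speculator_tb_per_hr / superbowl_tb_per_hr)  # 102.0, i.e. roughly 100x
```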

When AT&T reports that data usage in the busiest hour of the game has increased by only a factor of four in the last 3 years, is it really credible to expect traffic at the Superbowl to increase by 100 times in the next 6 years? And even if traffic at the Superbowl itself grows by leaps and bounds, why should global spectrum allocations be set based on traffic in the busiest hour at the busiest location in the whole of the US? Clearly, a more rational conclusion is that the Speculator model is simply wrong, and cannot possibly be representative of typical scenarios for mobile data usage in 2020.


MH370: On the wrong track?

Posted in General, Inmarsat at 8:19 am by timfarrar

Since the Inmarsat ping data was released almost two weeks ago, I, like many others, have spent a good deal of time trying to discern what the data tells us. Particular thanks are due to Duncan Steel, Victor Iannello, Mike Exner, Don Thompson, Bill Holland and Brian Anderson, who’ve spent weeks performing numerous complex calculations and analyses of satellite and other data, much of which I’ve relied on here.

Although the data analysis remains a work in progress, and further information is needed to validate the BFO model in particular, I’ve now written up my initial conclusions, which indicate that the search area may need to be widened significantly beyond the areas identified in the most recent search effort. As the WSJ is reporting, this appears to be the approach now being taken by the investigative team.


Google’s space odyssey…

Posted in Broadband, General, Services, Spectrum at 4:04 pm by timfarrar

Over the last two weeks rumors have swept the satellite industry about Google’s plans to build a huge new broadband satellite constellation (dubbed “son of Teledesic” in a February article). I’ve done a fair amount of digging and since it looks like we will see this story in the mainstream press pretty soon, I thought it would be useful to summarize the analysis I produced for research clients last weekend.

As The Information reported on Tuesday, last month Google hired Brian Holz (former CTO of O3b) and Dave Bettinger (former CTO of iDirect) to work on the design of a massive new broadband satellite system, as part of Google’s Access division.

What has so far gone unreported are the technical details of the planned system, which is expected to involve 360 LEO Ku-band satellites using a filing made by WorldVu in Jersey. The constellation will have 18 planes of 20 satellites, with half at an altitude of 950km and the remainder at 800km. I would expect the constellation to be launched in two phases, with the higher altitude satellites providing complete global coverage, and the lower satellites being added later, in between the initial 9 planes, to provide additional capacity. It also seems likely that the system could include inter-satellite crosslinks (within each of the two halves of the constellation) given the near polar orbit that is planned. WorldVu is apparently owned/controlled by Greg Wyler, the founder of O3b, who is rumored to have a handshake agreement with Larry Page to move ahead with the project.

The satellite system is budgeted to cost $3B, which is a very aggressive price target (recall Teledesic was supposed to cost $10B back in 1999), based on a plan to use very small (100kg) satellites. If this ultimately proves infeasible then the cost would certainly rise: for example the O3b and Iridium NEXT systems (700kg and 800kg respectively) cost at least $40M per satellite to build and launch.
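A quick sanity check on that budget, treating the ~$40M per satellite figure as a benchmark rather than a prediction:

```python
# Back-of-envelope on the $3B cost target, using the figures above
budget = 3.0e9                           # $3B target for the full system
n_sats = 360
implied_per_sat = budget / n_sats / 1e6  # $M per satellite, all-in
benchmark_total = 40e6 * n_sats / 1e9    # $B if costs matched O3b/Iridium NEXT
print(round(implied_per_sat, 1), round(benchmark_total, 1))  # 8.3 14.4
```

In other words, the plan only works if each satellite comes in at roughly $8M all-in; at O3b/Iridium-class costs the same constellation would run to over $14B.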

UPDATE (6/1): The WSJ now has more details of the plan, confirming my supposition that it would start with 180 satellites and add the rest later. I was quoted in that article as stating that “180 small satellites could be launched for as little as about $600 million” but that should not be interpreted as a total cost for building and launching the satellites. If the target of 100kg could be achieved, the all-in cost for the first 180 satellites would certainly approach $2B, and if the satellites end up being more like 200-300kg, which a satellite designer suggested to me might be easier to achieve, then that all-in cost could reach $3B. The full 360 satellite system would likely cost $3B for the 100kg satellites and $4B-$5B for the 200-300kg satellites.

Notably the satellites would use the Ku-band, not the Ka-band which has been popular for broadband in recent years. This takes advantage of the FCC and international rulings secured by Skybridge in the late 1990s, which made over 3GHz of spectrum available for NGSO Ku-band systems, so long as they avoid interfering with satellites along the geostationary arc. In practice this means turning off the satellite when it is within about 10 degrees of the equator and handing over to another satellite that is outside this exclusion zone. WorldVu apparently has priority ITU filing status with respect to this huge amount of spectrum on a global basis.
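The cost of that exclusion zone can be estimated under the simplified 10-degree rule described above (a geometric sketch only, not WorldVu's actual operations plan):

```python
# A polar circular orbit sweeps latitude at a constant rate and crosses
# the equator twice per orbit, so a +/-10 degree dead band covers
# 4 x 10 degrees out of 360.
band_deg = 10.0
fraction_off = 4 * band_deg / 360
print(round(fraction_off * 100, 1))  # 11.1 -> each satellite is off ~11% of the time
```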

The total system capacity is unclear, but it could certainly be 1-2 Tbps or more for the full constellation, although not all of this will be usable (for example in polar and oceanic regions). Importantly, any LEO system would be critically dependent on the successful development of Kymeta’s new flat panel meta-materials antennas (which are being developed initially for Ka-band, but could also be extended to operate in Ku-band), because otherwise the need for tracking dish antennas makes it impossible to build terminals cost-effectively. After all, this terminal problem ultimately proved terminal for Teledesic in the late 1990s, and O3b is already telling potential enterprise customers that they should look to Kymeta to provide a viable low end terminal in a couple of years’ time.

Construction and launch of the first half of the constellation could probably be achieved within 5 years, if the satellites were small enough for dozens of them to be launched at once, and sufficient launch slots could be secured. However, it seems Google has not yet engaged actively with satellite manufacturers to seek their input on design feasibility (let alone bids) and so it might be premature to expect any formal announcement (and for the clock to start running on construction) at this stage.

Nevertheless this prospect is causing considerable excitement amongst satellite manufacturers, who had been bracing for a potential decline in business after record orders in recent years, and corresponding trepidation amongst satellite operators, who were already wary of a potential price war (and accelerated depreciation in the value of some older satellite assets) brought on by new high throughput Ku and Ka-band GEO satellites. Those investing in new broadband satellite systems of their own (like Intelsat, Inmarsat, ViaSat and Hughes) will certainly have to take this wildcard into account, but like the movie, only time will tell if Google’s space odyssey is going to be regarded as more than just dazzling special effects.


Still going down?

Posted in General, Spectrum at 6:03 pm by timfarrar

Today Cisco helpfully tweeted out one of the key statistics from their upcoming VNI report, which is scheduled for release on Feb 5, indicating that the “annual run rate” for mobile data traffic in 2013 was “less than 18 exabytes.” That’s even lower than last year’s report, which forecast total traffic of 1.58EB/mo at the end of 2013. So I thought it would be interesting to examine how Cisco’s projection of global mobile data traffic for 2013 has evolved over the last six years of VNI reports.

The new figure also suggests that unless Cisco retrospectively reduced its estimate of global traffic in 2012 (which happened last year), then global traffic growth was only ~68% in 2013, rather than the 78% growth that Cisco forecasted in February 2013. Looking out to 2018, where an annual run rate of 190EB (i.e. monthly traffic of 15.8EB) is indicated, that would compare to a February 2013 projection for monthly traffic of 11.2EB at the end of 2017, or 42% growth in 2018 if the 2017 figure remained unchanged (in fact it may also come down slightly).

Sadly, we don’t have any CTIA benchmarks for traffic growth in the US in the first half of 2013, as that survey has been converted from a six-monthly analysis to an annual effort, but it’s interesting to contrast these numbers with Chetan Sharma’s recent report suggesting that usage per consumer grew from 690MB to 1.2GB per month in the US in 2013 (74% growth) and from 140MB to 240MB per month globally (71% growth). Sharma’s numbers seem to be a little on the high side, because the number of smartphone users grew significantly during the year and there is tablet traffic to add in as well, so total traffic growth should comfortably exceed per-user growth. One possibility is that Cisco is assuming there was little or no growth in laptop data traffic, which accounted for 46% of mobile data traffic in 2012 according to its February 2013 report.

We’ll obviously find out more next week, but it seems that despite evidence consumers are using more data on their smartphones when they upgrade to LTE, mobile data traffic growth worldwide is still slowing rather more rapidly than Cisco previously expected.

UPDATE (2/5): The released Cisco figures confirmed that traffic in 2012 is now estimated at 820PB/month, increasing by 81% to 1488PB/month in 2013. This represents a retrospective reduction of 7.3% in the 2012 estimate and 5.7% in the 2013 estimate. The trend in the 2012 and 2013 estimates, and in the growth between them, is shown below.
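The percentages in this update can be reproduced in a few lines. Note that the February 2013 baselines used below (~885 and ~1578 PB/month) are inferred from the 1.58EB/mo figure quoted earlier and the stated reductions, so treat them as approximate:

```python
# Approximate Feb 2013 estimates, inferred from the stated reductions
old_2012_pb, old_2013_pb = 885, 1578   # PB/month (assumed baselines)
new_2012_pb, new_2013_pb = 820, 1488   # PB/month, per the released figures

cut_2012 = (old_2012_pb - new_2012_pb) / old_2012_pb * 100
cut_2013 = (old_2013_pb - new_2013_pb) / old_2013_pb * 100
growth_2013 = (new_2013_pb / new_2012_pb - 1) * 100

print(round(cut_2012, 1), round(cut_2013, 1), round(growth_2013))  # 7.3 5.7 81
```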


Gogo dancing…

Posted in Aeronautical, Financials, General, Services at 6:42 am by timfarrar

Gogo has finally decided to strut its stuff, by filing an S-1 with the SEC in preparation for a potential IPO early next year. However, it needs to put on a very good performance over the next couple of years if this IPO is going to be successful.

The filing reveals some interesting statistics about the company, and highlights just how wrong most analyst forecasts for the passenger communications market have been. Bizarrely, the S-1 quotes Forrester projections that “in-flight internet usage is expected to increase rapidly over the next five years, from approximately 15.6 million North American sessions in 2011 to 96.9 million by 2015” when the company knows that the 2011 number is simply wrong. Gogo (which had over 90% of usage in 2011) had “provided more than 15 million Gogo sessions” since its inception by the end of September, and previously stated at the beginning of this year that it had reached 10 million sessions, after the success of the free promotion with Google late last year (which itself generated 3 million sessions in 6 weeks). As a result, the total number of sessions in 2011 (when there wasn’t the same level of promotional activity) across all providers in North America is certainly well below 10 million, and more likely close to half the level estimated by Forrester. In fact the total number of sessions (free and paid) might be no higher in 2011 than in 2010, because of the distorting effect of last year’s free promotion.

Perhaps one reason for quoting Forrester is that the other widely cited analyst projection (by InStat in October 2011) is even further off the map (one person deeply involved in the industry said to me that he “didn’t know what they had been smoking”), with InStat noting that “take rates have increased significantly, moving from an average of 4% in 2010 up to 7% in 2011” (contradicted by Gogo’s actual numbers showing no more than ~4% take rate in 2011) and that “in-flight Wi-Fi revenue is expected to grow from about $225 million in 2011 to over $1.5 billion in 2015” (again miles away from Gogo’s passenger revenues of $58M in the first 9 months of 2011).

However, now we have the public S-1 filing, perhaps some of these erroneous forecasts will come back down to Earth. The key data in the S-1 shows that current take rates are only about 4%, with users spending an average of $10 to $10.50 per flight, implying a severe (but unsurprising) skew towards laptop use on long flights (charged at $12.95). A survey (strangely citing data from 2009) showed that the average Gogo user had taken 14.2 domestic business flights in the last 12 months, indicating exactly as Connexion-by-Boeing found in 2006, that it is frequent business travelers who pay for in-flight WiFi and rarely anyone else. Of course, only about 10%-15% of airline passengers fly this frequently on business, so it is hardly surprising that current usage levels are so low (especially once very short flights are taken into account). It is also unsurprising that the service is dominated by repeat users (15 million sessions from 4.4 million unique users – and some of these unique users may have only used the service in late 2010, when free access attracted 2 million users in 6 weeks).

What is critical for the future growth of Gogo is therefore its ability to expand usage dramatically amongst leisure travelers and more occasional business travelers. The key statistic to watch is the average revenue per passenger, which has climbed from $0.32 in 2010 to $0.41 in the first 9 months of 2011. It would be useful to see in the roadshow how much of this growth is due to new leisure users (as opposed to growing awareness after major fleetwide rollouts were completed in 2010), e.g. by looking at the trajectory on Virgin America (which had fleetwide availability very early) and by examining more recent data on the average number of annual flights taken by a Gogo user.

If the average revenue per passenger (ARPP) only grows to say $0.50, because paid usage remains limited to frequent business travelers, then it will be hard for Gogo to generate more than about $100M in EBITDA by 2015, but if ARPP grows to say $1.00, then EBITDA could grow to ~$200M, due to the nearly fixed cost base of the ground-based system (airlines receive about a 15% revenue share at present, though this payment might be on a tiered system so could increase in the future if usage is higher). Clearly, to have a successful IPO at a value not less than the $500M invested to date, Gogo therefore needs to demonstrate convincingly how its ARPP can more than double within 2-3 years, which can only happen by achieving much greater take-up amongst leisure and occasional business travelers.
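The ~4% take rate and the repeat-user skew discussed above both fall straight out of the S-1 figures quoted earlier:

```python
# Reproducing the implied take rate and repeat-usage figures from the S-1
arpp = 0.41                  # revenue per passenger, first 9 months of 2011
avg_spend = 10.25            # midpoint of the $10-$10.50 average spend per flight
take_rate = arpp / avg_spend
print(round(take_rate * 100, 1))   # 4.0 -> ~4% of passengers pay

sessions_per_user = 15e6 / 4.4e6   # sessions since inception / unique users
print(round(sessions_per_user, 1)) # 3.4 -> heavy skew toward repeat users
```

Doubling ARPP while average spend per session stays fixed therefore requires roughly doubling the take rate, which is the crux of the leisure-traveler challenge described above.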


The next broadband battle…

Posted in General, Regulatory, Spectrum at 9:15 am by timfarrar

As my article yesterday for GigaOm highlighted, the potential ripple effects of an AT&T/DISH deal are almost too numerous to mention. However, in addition to the political consequences, it’s also worth considering the implications of this potential industry realignment for the US spectrum market. As I’ve noted before, spectrum didn’t look like a good investment a year ago, and while the cable companies have come out OK (based mainly on their smart bidding strategy in the 2006 AWS auction), companies like Clearwire and NextWave, which have bet on more speculative spectrum bands, have suffered badly from a lack of buyers for the spectrum they’ve tried to sell. Even DISH faced little or no opposition from major wireless operators in its acquisition of DBSD and TerreStar’s spectrum assets.

Now, if deals between AT&T/DISH and Verizon/SpectrumCo go through, network sharing will create significant bandwidth efficiencies and with only two national LTE networks there will be even less competition in future spectrum auctions. That could well mean that incentive auctions will come to naught, because it will not be possible to generate high enough bids to persuade broadcasters to give up their spectrum (although that probably won’t prevent Congress eventually passing a bill so it can count imaginary future revenues against the deficit and/or D-block buildout).

In the near term, it also means that it will be difficult, if not impossible, for Clearwire to find eager bidders for the portion of its spectrum holdings it would like to sell (at anything from $0.25 to $0.75 per MHzPOP according to its recent roadshow). Indeed, I’ve been told that the only offer to buy spectrum from Clearwire (during its efforts to sell spectrum earlier this year) came from Sprint, and it’s far from obvious that enough has changed to justify the recent speculation about new near term Clearwire partners/spectrum buyers ranging from MetroPCS to DirecTV.

As an aside I also find it hard to see how DirecTV’s involvement in a 2.6GHz TD-LTE venture in Brazil, which is focused on fixed wireless broadband in residential suburbs, just like Clearwire’s original fixed WiMAX business plan, has much relevance to Clearwire’s current small cell mobile data roaming plan in core urban hotspots. In theory DirecTV could buy Clearwire spectrum to deploy its own separate fixed wireless broadband network in the US, with a completely different cell spacing than a mobile network would require, but that hardly seems a productive use of capital when the US has vastly better fixed broadband infrastructure than Brazil and we’ve just seen the ignominious collapse of Open Range, which was trying to execute such a plan in rural areas, with subsidized loans from the USDA. As I’ve said before, fixed broadband is by far the best way to go for almost all in-home data delivery, and so I think that ultimately DirecTV will have to reach some agreement to use AT&T’s wireline infrastructure, completing the alignment of AT&T with the satellite TV companies against Verizon and the cable companies.


Uncutting the cord

Posted in General, Regulatory, Spectrum at 10:31 am by timfarrar

If the paradigm shift for the telecom industry in the 1990s was the Death of Distance and in the 2000s was Cutting the Cord, then what might the new paradigm be for the next decade? I’d venture to suggest that one of the most important trends will be data offloading from mobile to fixed networks. As a result, you will need that fixed wire (or cable or fiber) into your home more in a decade’s time than you do today.

The simple reason for this is that the cost of data delivery on a fixed network (at around 2-5 cents per Gbyte) is nearly two orders of magnitude lower than on wireless networks. Similarly, monthly usage per subscriber, despite dramatic increases in wireless usage over the last few years, is also about 100 times greater on wireline networks (15Gbytes per month compared to a few hundred Mbytes on wireless). This ratio is unlikely to change significantly over the next decade, given expected improvements in both wireline and wireless technologies.

What that means is if you want to watch streaming video on your tablet or smartphone, you will be very strongly incentivized (by price) to offload that traffic to a WiFi hotspot and onto a wireline network. For example, at ~2Gbyte per hour for streaming HD video (at 4-5Mbps) to a tablet, you would have to pay $20 per hour at current wireless prices of $10/Gbyte. Even in 5 years time the price will undoubtedly be dollars per hour, not pennies per hour. Unsurprisingly, consumers are already taking these pricing signals onboard and turning to WiFi. However, the impact of this switch is dramatically underestimated in Cisco’s forecasts, which project that in the US only 30% of smartphone and tablet traffic will be offloaded in 2015. Even today, this ludicrously underestimates the amount of tablet offloading, given that the majority of iPads are WiFi-only.
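The pricing signal can be put in numbers using the figures from the paragraphs above:

```python
# Streaming-video cost per hour, cellular vs offloaded to a fixed network
gb_per_hour = 2.0              # streaming HD video at 4-5Mbps
cellular_per_gb = 10.0         # $/GB at current wireless prices
fixed_per_gb = 0.03            # $/GB, within the 2-5 cent fixed-network range
print(gb_per_hour * cellular_per_gb)  # 20.0 -> $20/hour over cellular
print(gb_per_hour * fixed_per_gb)     # 0.06 -> six cents/hour once offloaded
```

A gap of more than 300x per hour of video is why consumers hunt for WiFi, regardless of what the offload forecasts assume.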

Ironically, the spectrum that might therefore be in greatest demand in the future is unlicensed spectrum, for short range wireless access. That’s why there is considerable pressure from Microsoft and Google to ensure that white spaces are protected in any future broadcast spectrum auction, and why a study for Ofcom on future UK spectrum requirements predicted that there would be more near term demand for incremental unlicensed spectrum than licensed spectrum.


O3b Networks – not a reprise of Teledesic (except perhaps in its ultimate fate?)

Posted in Financials, General at 8:23 pm by timfarrar

This isn’t really an MSS topic, but having worked for several years in the late 1990s on the Teledesic project and with fond memories of evaluating the market for broadband in developing countries, I thought the emergence of O3b Networks with reportedly $60M of investment from Liberty Global, Google and others, merited a comment.

Some of the commentaries appear to conflate O3b’s satellite backhaul business plan with the satellite access services offered by Wildblue and HughesNet in the US. O3b can’t offer services to end users (except the largest corporates), since customers would require expensive terminals that track its medium Earth orbit satellites. As Teledesic found out in the late 1990s (and as is still true today), you either need a couple of moving dishes (to ensure seamless handover) or an electronically steered antenna – both are well beyond realistic consumer prices (not to mention their installation difficulties compared to fixed geostationary terminals).

Instead O3b will offer backhaul for ISPs operating in countries without fiber links, so they can obtain connectivity to the Internet backbone. That’s a fairly well established and highly competitive market (valued at several hundred million dollars a year), with Intelsat and other FSS players competing intensively on price. Characteristically, the market grows as Internet take-up and usage expands within a country, then collapses almost to zero within 12-18 months of fiber’s arrival. There’s no reason to expect O3b to change this equation – satellite is at least two, and in some cases closer to three, orders of magnitude more expensive than large fiber connections for this backhaul service, and fiber continues to decline rapidly in price.

A second market is for cellular backhaul within a country, when fiber isn’t deployed outside the major cities. Again this market is somewhat transitory: terrestrial microwave links become a good option when the cellular coverage is sufficiently contiguous for daisy chaining links from one tower to another, but the satellite opportunity is longer lasting than ISP backhaul in major cities. The opportunity here for O3b will also be affected by how much more expensive its terminals are than standard VSATs, since somewhat more limited amounts of capacity are required. However, O3b’s low latency may be helpful for this voice-oriented traffic.
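That latency advantage is easy to illustrate. The sketch below is a best case (zenith geometry, bent-pipe relay, no ground-network delay), comparing O3b's planned medium Earth orbit of roughly 8,062km with a geostationary satellite at 35,786km:

```python
C_KM_PER_S = 299792.458  # speed of light

def bent_pipe_rtt_ms(altitude_km):
    """Minimum round-trip time via a satellite directly overhead:
    four traversals of the ground-to-satellite path."""
    return 4 * altitude_km / C_KM_PER_S * 1000

print(round(bent_pipe_rtt_ms(8062)))   # ~108 ms for O3b's planned MEO altitude
print(round(bent_pipe_rtt_ms(35786)))  # ~477 ms for GEO
```

Cutting the propagation delay by a factor of four or so matters much more for interactive voice than for bulk ISP backhaul.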

Thus O3b is simply a bet that quite a few countries, particularly in Africa, won’t get fiber any time soon. Even though it’s been (very) slow to get going, I’d rather put my money on EASSY, an African fiber project with many of the telcos in the region as signatories (as well as credible international players), which has just entered the construction phase and is hoping to be operational in 2010, well before O3b could expect to be up and running.


Welcome to our new MSS blog

Posted in General at 9:04 am by timfarrar

This blog is intended to offer brief comments on current issues in Mobile Satellite Services, highlighting market developments and new data on the industry. It will also provide a forum for feedback on longer articles on our website www.tmfassociates.com/articles and for suggesting topics for coverage in our MSS information service.
