Often I wonder whether some companies understand how the FCC works and what they really shouldn’t say in an FCC filing. Gogo has just provided a classic example in its August 26 ex parte filing that tries to counter SpaceX’s recent intervention in the 14GHz ATG proceeding, where Gogo has been trying to get 500MHz of spectrum auctioned for next generation ATG networks.
Unfortunately for Gogo, it has been left as virtually the sole active proponent of this auction, after Qualcomm laid off the team that developed the original proposal and stopped participating in the proceeding. While I’m sure Panasonic and Inmarsat would take part if an auction was held, undoubtedly they are relishing the prospect of Gogo struggling to improve its “infuriatingly expensive, slow internet” service with 2Ku capacity that Gogo itself admits is roughly the same cost per Mbyte as its existing ATG-4 network (at least until it can renegotiate its current bandwidth contracts).
So when Gogo makes submissions that directly contradict those it previously put into the record, it shouldn’t be surprised if the FCC regards these rather skeptically. In particular, in July 2014 Gogo told the FCC that it “supports the proposed §21.1120 requirement that interference from all air-ground mobile broadband aircraft and base stations not exceed a 1% rise over thermal” whereas now “Gogo concurs with Qualcomm in that a 6% RoT has a negligible impact on the cost and performance of an NGSO system while creating an additional and disproportionate level of complexity or loss of performance for the AG system” and “Gogo supports the 6% RoT aggregate interference levels initially proposed by Qualcomm”. So suddenly Gogo thinks it’s perfectly acceptable to have six times more interference than a year ago.
Even more of a hostage to fortune was Gogo’s September 2013 comment about the unacceptable problems that an ATG network (referred to as Air to Ground Mobile Broadband Service or AGMBS) would cause for NGSO systems like that proposed by SpaceX:
“In its initial comments, Gogo expressed its concern that Qualcomm’s assumptions regarding the operating parameters of the hypothetical NGSO satellite systems were not representative of typical or worst case system configurations, and that the interference between a future system and AGMBS systems could be far greater than indicated by Qualcomm’s estimates. Gogo is not alone in this view, as the Satellite Industry Association (“SIA”), ViaSat, EchoStar and Hughes all raised similar concerns in their comments. SIA included an analysis within the Technical Appendix attached to its comments which illustrates the potential for much greater interference than had previously been calculated by Qualcomm. In Gogo’s view, some aspects of the analysis are subject to challenge because it overstates the level of interference that may be expected. Nevertheless, the overall conclusion remains valid – an AGMBS system operating consistent with the proposed rules would cause unacceptable levels of interference to many, if not most, possible future Ku-band NGSO system configurations. The analysis of EchoStar and Hughes, provided in Annex B of their comments, provides additional support for this conclusion. Similarly, ViaSat’s comments indicated that the NGSO analysis presented by Qualcomm is not representative of the range of potential Ku-band NGSO systems which have been previously proposed.”
Yet now Gogo, having previously claimed that Qualcomm’s calculations were flawed, suddenly decides that after “incorporating [SpaceX's] stated parameters into the Qualcomm interference calculation methodology” everything is fine and “the resultant RoT from an AG system into the SpaceX NGSO system is far less than [its newly relaxed] 6%” interference criteria.
I can only conclude that Gogo must be truly desperate to get the 14GHz ATG proceeding completed, because it needs the capacity ASAP. However, making contradictory filings is certainly not going to help the company to get a favorable ruling from the FCC anytime soon (especially when politics is lurking in the background, in the form of the Association of Flight Attendants expressing concern about the FCC taking action on this matter).
It feels like an age since Ergen’s plan for fixed wireless broadband and hosted small cell deployment on rooftop satellite TV antennas was at the core of his bids for Sprint and Clearwire in 2013. And as I pointed out last year, the AT&T acquisition of DirecTV seemed to pre-empt DISH’s plan and threaten more competition if DISH did proceed with a rollout.
Now the prospects of DISH reaching agreement with T-Mobile seem as distant as ever, and Verizon and AT&T appear eager to dismiss any prospect of buying DISH’s spectrum. In addition, DISH’s stock has fallen after the FCC ruled against it last week over the Designated Entity discounts in the AWS-3 auction, and Ergen has hinted that as a result he might now seek to dispose of his spectrum rather than entering the wireless market.
However, in recent weeks, Sprint has been playing up its small cell plan, but has not yet named its partners, except to hint that it will look towards off-balance sheet financing for the buildout. So I wonder if Charlie’s next angle to put his spectrum to use could be through a partnership with Sprint to make use of DISH’s rooftop sites in the small cell buildout, and perhaps host some of DISH’s spectrum at the same time. After all, the time when Ergen claims he is definitely leaning one way is usually the point at which he moves decisively in the opposite direction.
Such a deal could include an exchange of equity, with Softbank investing in DISH and DISH investing in Sprint. That would be a logical explanation for Softbank’s otherwise incomprehensible recent moves to buy additional Sprint equity in the public markets, rather than injecting much needed incremental cash into Sprint.
DISH could even participate in the network equipment leasing company (perhaps reframed as a JV) if it can use the cellsites for its own fixed wireless broadband (and perhaps mobile broadband) offerings. And none of this would prevent DISH from entering into a spinoff of its spectrum holdings, perhaps even with Sprint agreeing to act as an anchor tenant, leasing spectrum such as the PCS H-block and the adjacent AWS-4 uplink, which could be repurposed as a supplementary downlink and might provide Sprint with an alternative to bidding in the incentive auction next year.
A spectrum spinoff (or other transaction) by DISH still seems a likely outcome, and the FCC appears to have helped DISH on its way, by stating it will accept “an irrevocable, standby letter of credit” instead of immediate payment, which will only be drawn if DISH has failed to make the $3.3B repayment of the DE discount by 120 days after the release of the Order (i.e. mid December), instead of the 30 days available to make a cash payment. That concession (which doesn’t have any obvious precedents that I’m aware of) will save DISH 90 days’ interest (over $40M at a 5% interest rate) and gives Ergen much more time to sort out a deal to reorganize his spectrum interests.
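The interest saving is easy to verify with a back-of-envelope calculation (assuming simple interest at the 5% rate mentioned above):

```python
# Interest saved by deferring the $3.3B DE discount repayment by ~90 days,
# assuming simple interest at a 5% annual rate.
repayment = 3.3e9      # dollars
annual_rate = 0.05
extra_days = 90
savings = repayment * annual_rate * extra_days / 365
print(f"~${savings / 1e6:.0f}M saved")  # a little over $40M
```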
It feels like DISH will now finally have to pull the trigger on something, though I’m surprised no analysts appear to have even contemplated the scenario I’ve described above. The current uncertainty in the financial markets may not be helpful to the prospects of a deal being reached, especially if it proves difficult to get financing for a spectrum spinoff. Nevertheless, that need not prevent a small cell hosting deal, and with Charlie you simply have to expect him to have an angle most people haven’t thought of.
As I pointed out in a tweet a couple of months ago, Iridium’s SBD service is being used for command and control of Google’s Project Loon. So it was interesting to see just how much Google has been spending on Iridium airtime, when Iridium’s CFO mentioned in their July 30 results call that:
“…our network provides the connectivity to remotely command and control the assets of the large and unique project by a major company who doesn’t let us reference their involvement in the program. We saw significant airtime usage in last year’s third quarter during the testing phase for this project. We now understand from our customer that this high level of activity will decline in the second half of 2015 as the service moves into another, more mature development phase, which will culminate in commercialization in 2016. We expect a full-year decline of $500,000 in M2M service revenue from this customer as a result of this evolution, with much of that coming in the third quarter.”
It’s been reported that the Loon balloons have flown for “more than three million kilometers” at speeds of up to 300km/hour, though an average speed of say 40-50km/hour seems more plausible (which would mean it takes 50-60 minutes for the 40km diameter coverage area to traverse a given location if directly overhead, or somewhat less if the balloon path is more distant).
So that would suggest Project Loon has achieved something like 60,000-80,000 flight hours in total over the three years of the project, with a significant fraction of that during the 2014 testing phase. Much of the spending on command and control was likely incurred in 2014, because Google reportedly moved to sending new orders to the balloons “as frequently as every 15 minutes” (and presumably receiving data from them even more often).
But if Google spent something over $500K on wholesale Iridium airtime (and even more with retail markups included) in 2014, then that would suggest the cost of airtime command and control is something like $8-$10 per hour (before retail markup). As a benchmark, the spending level of about $140K per month in Q3 of last year suggested by Iridium would then equate to an average of 20-25 balloons operating continuously during the quarter (which is consistent with Google’s suggestion that it would step up to “more than 100” balloons in the next phase of testing).
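The implied per-hour figure can be sanity-checked with a quick calculation, assuming the fleet flies continuously and a 30-day month:

```python
# Implied wholesale cost per flight-hour, assuming the ~$140K/month of
# Iridium airtime is spread over a fleet of balloons flying continuously.
monthly_spend = 140_000  # dollars, wholesale
hours_per_month = 24 * 30
implied_cost = {fleet: monthly_spend / (fleet * hours_per_month)
                for fleet in (20, 25)}
for fleet, cost in implied_cost.items():
    print(f"{fleet} balloons -> ${cost:.2f} per flight-hour")
```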
Google has indicated that the operating costs of each balloon are “just hundreds of dollars per day” but it is still surprising to consider that the company would be spending $200+ per balloon per day just on satellite connectivity. Moreover, it seems that Google’s “hundreds of dollars per day” quoted cost could potentially exclude all the other costs involved in manufacturing and deploying the balloons and backhauling the traffic carried by them. That seems pretty expensive compared to the costs of a new fixed cellsite and highlights the perhaps questionable economics of the Loon architecture.
Now that Google has announced an MOU to potentially bring internet to remote areas of Sri Lanka next year, it is also interesting to contemplate just what that might mean in terms of Iridium airtime if the deal comes to fruition. Google has said it needs “more than 100 Loon balloons circling the globe” just to provide “‘quasi-continuous’ service along a thin ribbon around the Southern Hemisphere”. So it seems implausible to think that all of the rural areas of Sri Lanka would be served with less than say 300 balloons operating continuously. Assuming Google could get a somewhat better deal for high volume usage of say $5 per flight hour (of wholesale revenue to Iridium), then that would equate to annual wholesale airtime revenues of perhaps $13M for Iridium. And revenues could be even higher if more balloons are used to ensure continuous reliable coverage.
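The Sri Lanka revenue estimate follows directly from those assumptions (300 balloons flying continuously, and an assumed bulk wholesale rate of $5 per flight-hour):

```python
# Sri Lanka scenario: continuous operation at an assumed high-volume
# wholesale rate of $5 per flight-hour.
balloons = 300
rate = 5  # dollars per flight-hour (assumed bulk discount)
annual_wholesale = balloons * 24 * 365 * rate
print(f"~${annual_wholesale / 1e6:.1f}M/year of wholesale revenue")
```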
Perhaps Google can afford to spend a few tens of millions of dollars a year for a demonstration project in Sri Lanka (although the funding sources for this project remain uncertain). However, the scalability of Loon to a global deployment must be in much greater question. For continuous global coverage there would need to be as many as 100,000+ balloons in operation simultaneously. Even ignoring capital costs, if the operating costs of the network (for all aspects, not just satellite connectivity) are of order $300 per balloon per day, then that would amount to $11B per year in operating costs (for comparison US wireless carriers are projected to spend $56B in opex between them in 2017 to serve well over 300M customers). It’s therefore unsurprising that Google intends to rely on wireless operators (and perhaps governments) to support these costs, rather than taking on the burden of commercial deployment itself.
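The global scaling figure is the same arithmetic taken to its logical conclusion, using the assumed $300 per balloon per day all-in operating cost:

```python
# Global scenario: 100,000 balloons at an assumed all-in operating cost
# of $300 per balloon per day (all costs, not just satellite connectivity).
balloons = 100_000
opex_per_day = 300  # dollars per balloon, assumed
annual_opex = balloons * opex_per_day * 365
print(f"~${annual_opex / 1e9:.2f}B/year")
```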
Not content with disinterring the FCC’s infamous October 2010 working paper that most thought had been completely discredited five years ago, last month CTIA went on to commission Brattle Group to produce a new “updated” version of the FCC’s forecasts.
Ironically enough this new report confirms that the FCC was totally wrong in 2010, because the total amount of spectrum in use at the end of 2014 was only 348MHz, not the 822MHz that the FCC projected. Despite this clear demonstration of how ludicrous the original projections were, Brattle reuses the same flawed methodology, which ignores, among other factors, that new cellsites are deployed for capacity rather than coverage, so the ability to support traffic growth is in no way proportional to the total number of cellsites in the US.
Now Verizon’s Q2 results, announced today, highlight another fundamental flaw in the methodology used by Brattle, in terms of the projected gains in spectral efficiency. Brattle assume that the gain in spectral efficiency between 2014 and 2019 is based on the total amount of traffic being carried on 3G, 4G LTE and LTE+ technologies, so with 72% of US traffic in 2014 already carried on LTE, there is relatively little scope for further gains.
This is completely the wrong way to account for the data carrying capacity of a certain number of MHz of spectrum, since it is the share of spectrum used in each technology that is the critical factor, not the share of traffic. Verizon highlighted that only 40% of its spectrum is used for LTE at present, while 60% is still deployed for 2G and 3G, despite the fact that 87% of traffic is now carried on LTE. Of course once that 60% of 2G and 3G spectrum is repurposed to LTE, Verizon’s network capacity will increase dramatically without any additional spectrum being needed.
Brattle’s methodology would suggest that moving the rest of Verizon’s traffic to LTE would only represent a gain of 5% in capacity (assuming an improvement from 0.72bps/Hz to 1.12bps/Hz) but in fact moving all of Verizon’s spectrum to LTE would produce a gain of 27% in network capacity (and an even bigger improvement once LTE Advanced is considered). Adjusting for this error in the methodology reduces the need for more spectrum very sharply, and once it is considered that the incremental cellsites will be deployed to add capacity, not coverage, the need for additional spectrum above the current 645.5MHz is completely eliminated.
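The difference between the two weightings is easy to reproduce using the efficiency figures assumed above (0.72bps/Hz for pre-LTE, 1.12bps/Hz for LTE) and Verizon’s reported split of 87% of traffic but only 40% of spectrum on LTE:

```python
# Capacity gain from completing the LTE transition, computed two ways.
eff_3g, eff_lte = 0.72, 1.12  # bps/Hz, figures assumed above

# Brattle's approach: weight efficiency by share of *traffic* on each technology
traffic_weighted = 0.87 * eff_lte + 0.13 * eff_3g
gain_traffic = eff_lte / traffic_weighted - 1    # ~5%

# Spectrum-share approach: weight by share of *spectrum* on each technology
spectrum_weighted = 0.40 * eff_lte + 0.60 * eff_3g
gain_spectrum = eff_lte / spectrum_weighted - 1  # ~27%

print(f"traffic-weighted gain: {gain_traffic:.0%}, "
      f"spectrum-weighted gain: {gain_spectrum:.0%}")
```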
In recent months Globalstar has vented its frustration with the slow progress of the TLPS NPRM, telling the Commission in April that “it is time for the Commission to move forward with an order in this proceeding and realize the substantial public interest benefits of TLPS.” Nevertheless Globalstar has previously been unwilling to compromise, indicating that it would only accept approval of the rules proposed in the November 2013 TLPS NPRM and that it would not relinquish spectrum to Iridium.
However, in the face of overwhelming pressure from Microsoft, Google, Sprint and others, it seems Globalstar has now decided it will have to accept a compromise as an interim measure to avoid being stuck in limbo for many more months. In a meeting with the FCC International Bureau last Friday, Globalstar struck a much different tone, urging the FCC “to grant Globalstar the proposed ATC authority,” a term which Globalstar has always declined to use, preferring instead to refer to the Commission’s “regulatory framework for low power wireless broadband.”
Moreover, Globalstar “expressed support for the Commission’s 2013 proposal” apparently hinting at the existence of a new 2015 proposal. Looking at the elements that Globalstar “urged” the Commission to adopt (apparently Globalstar’s bottom line) compared to those that it “encouraged” or “asked” the Commission to consider (those elements that are not essential), it is clear that Globalstar now wants a grant of “ATC authority” under “proposed rules” which no longer necessarily comport with the 2013 NPRM. Globalstar also “asked” (but didn’t “urge”) the Commission to “reject the unsubstantiated technical and policy requests by [its] opponents,” suggesting that any decision on TLPS OOBE limits can be deferred.
In contrast, back in May, Globalstar “urged the Commission to adopt its proposed rules expeditiously to add 22 megahertz to the nation’s wireless broadband spectrum inventory and ease the congestion that is diminishing the quality of Wi-Fi service at high-traffic 802.11 hotspots and other locations,” i.e. to approve TLPS specifically.
This move now points the way to a near term order written by the International Bureau on the narrower matter of ATC authority for Globalstar within its existing 11.5MHz of licensed S-band spectrum from 2483.5-2495MHz, in exchange for granting Iridium’s request to share more of the L-band. That would be a close parallel to the FCC’s ruling in November 2007, when it issued an NPRM on extension of Globalstar’s ATC authority in conjunction with the last reallocation of L-band Big LEO spectrum.
I would expect the FCC to defer any potential approval of the wider 22MHz TLPS channel to a further proceeding, with more testing and analysis of interference concerns to be undertaken. The main uncertainty relates to whether the approval of ATC authority would be for full power use, along the lines of the Open Range approval (but adapted to LTE), in conjunction with protection measures for BAS, or whether the approval will be limited to the much lower power levels contemplated in the TLPS NPRM.
I would assume that high power ATC usage is likely to be approved (as it is hard to see a limited low power channel being acceptable to Globalstar), with Globalstar welcoming this ruling as offering it more flexibility to either lease a single 10MHz LTE channel to a wireless operator in the near term or to later gain approval for TLPS at the end of the further rulemaking process.
Of course the debate would then move to appropriate valuation benchmarks, which are much easier to assess for standard licensed spectrum, albeit with upwards adjustments for lack of a buildout requirement and downwards adjustments for maintaining an MSS network and creating an ecosystem for a non-standard band. In addition the potential timeline and cost must be considered for the rebanding needed to avoid interference with grandfathered BAS users.
I’m sure that some will emphasize AWS-3 benchmarks of $2+/MHzPOP as a baseline, while others will highlight the MoffettNathanson assessment that spectrum around 2.5GHz, like that owned by Sprint, is only worth around $0.40/MHzPOP; given this enormous discrepancy, the debate about what Globalstar’s spectrum is actually worth will certainly continue. Nevertheless, approval of a high power licensed spectrum block, even if limited to only a single 10MHz LTE channel, will make it harder to argue that Globalstar’s spectrum is completely worthless.
Back in October 2012, despite their misleading press release, CTIA’s own data indicated that there had been a significant slowdown in data traffic growth and confirmed that the emperor/FCC Chairman had no clothes when talking about the non-existent spectrum crisis. Now it seems CTIA is at it again, releasing an error-strewn paper today on how the FCC’s October 2010 forecasts of mobile data traffic have supposedly proven to be “remarkably accurate.”
This groveling attempt to “renew the effort to bring more licensed spectrum to market” is clearly designed to distract from CTIA’s own release of its year end 2014 wireless industry survey results last week, which showed that US mobile data traffic only grew by 26% last year (from 3230PB in 2013 to 4061PB in 2014) compared to growth of 120% in 2013, a dramatic slowdown which CTIA conveniently ignores.
Instead CTIA is praising the “solid analytical foundation” of the FCC’s October 2010 paper, which was recognized at the time, by myself and others, to be fundamentally flawed. So perhaps it’s not so ironic that CTIA’s new paper mischaracterizes the data that the FCC used, stating that the forecasts “were remarkably accurate: In 2010, the FCC’s growth rate projections predicted mobile data traffic of 562 petabytes (PBs) each month by 2014; the actual amount was 563 PBs per month.”
Firstly, the FCC did not actually state an explicit projection of mobile data traffic, instead giving an assessment of growth from 2009 to 2014, as the (simple arithmetic) average of growth projections by Cisco, Yankee and Coda (use of an arithmetic average in itself is erroneous in this context, a geometric average of multipliers should be used instead).
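To see why an arithmetic average of growth multipliers is the wrong tool, consider three hypothetical five-year growth multipliers (illustrative numbers only, not the actual Cisco/Yankee/Coda figures):

```python
# Arithmetic vs geometric averaging of growth multipliers. The arithmetic
# mean systematically overweights the most aggressive forecast; the
# geometric mean is the multiplier whose compounded growth rate is the
# average of the forecasts' growth rates. Multipliers are illustrative.
multipliers = [47.0, 35.0, 20.0]
arithmetic = sum(multipliers) / len(multipliers)
geometric = (multipliers[0] * multipliers[1] * multipliers[2]) ** (1 / 3)
print(f"arithmetic: {arithmetic:.1f}x, geometric: {geometric:.1f}x")
```

By the AM-GM inequality the geometric mean is always the lower of the two whenever the forecasts differ, so the FCC’s choice of an arithmetic average biased its projection upwards.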
Secondly, the FCC was projecting US mobile data traffic, not North American data traffic, which is the source of the quoted 563PB per month (which is taken from Cisco’s February 2015 mobile VNI report). We can see the difference, because the February 2010 Cisco report (available here) projects growth for North America from 16.022PB/mo in 2009 to 773.361PB/mo in 2014, a multiplier of 4827%, whereas the FCC paper quotes Cisco growth projections of 4722% from 2009 to 2014. (The reason for the difference is that growth in Canada was expected to be faster than in the US, because Canada was expected to partially catch up with the US in mobile data traffic per user over the period).
If CTIA had bothered to look at Cisco’s mobile VNI tool, which gives data for major countries, it could have easily found out that Cisco estimates US mobile data traffic grew by 32 times between 2009 and 2014, not 35 times as the FCC forecast, let alone the 47 times that Cisco forecast back in February 2010.
Moreover, CTIA completely fails to mention that Cisco’s figure for 2014 (which according to the VNI tool is 531.7PB/mo for the US, rather than the 562.5PB/mo for North America that CTIA quotes), is completely different to (and far higher than) CTIA’s own data, which is based on “aggregated data from companies serving 97.8 percent of all estimated wireless subscriber connections” and so should obviously be far more accurate than Cisco’s estimates.
However, CTIA is instead running away from its own data, stating in a footnote to the new paper that:
“Note that participation in CTIA’s annual survey is voluntary and thus does not yield a 100 percent response rate from all service providers. No company can be compelled to participate, and otherwise participating companies can choose not to respond to specific questions. While the survey captures data from carriers serving a significant percentage of wireless subscribers, the results reflect a sample of the total wireless industry, and does not purport to capture nor reflect all wireless providers’ traffic metrics. CTIA does not adjust the reported traffic figures to account for non-responses.”
Compare that disclaimer to the report itself, which notes that “the survey has an excellent response rate” (of 97.8%) and that it is adjusted for non-responses (at least so far as subscribers are concerned):
“Because not all systems do respond, CTIA develops an estimate of total wireless connections. The estimate is developed by determining the identity and character of non-respondents and their markets (e.g., RSA/MSA or equivalent-market designation, age of system, market population), and using surrogate penetration and growth rates applicable to similar, known systems to derive probable subscribership. These numbers are then summed with the reported subscriber connection numbers to reach the total estimated figures.”
CTIA’s wireless industry survey states that total US mobile data traffic was 4061PB in 2014, equating to an average of 338.4PB/mo over the year. Even allowing for the fact that Cisco estimate end of year traffic, not year averages, it is hard to see how the CTIA number for Dec 2014 could be more than 400PB/mo, some 25% less than Cisco.
If we instead compare growth estimated by CTIA’s own surveys (which only provide data traffic statistics back to 2010), then the four year growth from 388PB in 2010 to 4061PB in 2014 is a multiplier of 10.47 times, whereas the FCC model is a multiplier of 13.86 times (3506%/253%) and Cisco’s projection is a multiplier of 19.51 times (4722%/242%).
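Those multipliers follow directly from the figures quoted above:

```python
# 2010 -> 2014 US mobile data traffic growth, three ways.
ctia = 4061 / 388     # CTIA survey data, PB per year
fcc = 3506 / 253      # FCC model: cumulative growth percentages vs 2009
cisco = 4722 / 242    # Cisco Feb 2010 forecast, same basis
print(f"CTIA: {ctia:.2f}x, FCC: {fcc:.2f}x, Cisco: {cisco:.2f}x")
```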
Thus by any rational and undistorted analysis, the FCC’s mobile data traffic growth projections have proven to be overstated. Likely reasons for this include the increasing utilization of WiFi (which was dismissed by the FCC paper, stating that “the rollout of such network architecture strategies has been slow to date, and its effects are unclear”) and the effect of dilution, as late adopters of smartphones use far fewer apps and less data than early adopters.
Nevertheless, what the data on traffic growth does confirm is that the FCC’s estimate of a 275MHz spectrum deficit by 2014 was utter nonsense. Network performance has far outpaced expectations, despite cellsite growth being far slower than predicted (3.9% compared to the 7% assumed in the FCC model) and large amounts of spectrum remaining unused: if we simply look at the Brattle paper prepared for CTIA last month, it’s easy to calculate that of the 645.5MHz of licensed spectrum identified by Brattle, at least 260MHz remains undeployed (12MHz of unpaired 700MHz, 10MHz of H-block, 65MHz of AWS-3, 40MHz of AWS-4, 20MHz of WCS, and all but around 40MHz of the 156.5MHz of BRS/EBS).
Thus in 2014, the US didn’t require 822MHz of licensed spectrum as the FCC forecast (which would have increased to 861MHz if the FCC model was corrected to the supposed traffic growth of 32x, as estimated by Cisco, and the actual number of 298,055 cellsites, as reported by CTIA), but instead, as CTIA proclaims, US mobile operators enabled “Americans [to] enjoy the best wireless experience in the world” with less than 400MHz of actual deployed spectrum.
I’m told that after a fair amount of difficulty and a month or two of delay, Greg Wyler has now successfully secured commitments of about $500M to start building the OneWeb system, and he will announce the contract signing with Airbus at the Paris Air Show next week. The next step will be to seek as much as $1.7B in export credit financing from COFACE to support the project with an objective of closing that deal by the end of 2015.
This comes despite Elon Musk’s best efforts to derail the project, culminating in an FCC filing on May 29. That filing proposes the launch of 2 Ku-band test satellites in late 2016, which would presumably be aimed at ensuring OneWeb is forced to share the spectrum with SpaceX, as I predicted back in March.
Clearly Musk is not happy about the situation, since I’m told he fired Barry Matsumori, SpaceX’s head of sales and business development, a couple of weeks ago, after a disagreement over whether the SpaceX LEO project was attracting a sufficiently high public profile.
Most observers appear to think that Musk’s actions are primarily motivated by animus towards Wyler and question whether SpaceX is truly committed to building a satellite network (which is amplified by the half-baked explanation of the project that Musk gave in his Seattle speech in January, and the fact that I’m told SpaceX’s Seattle office is still little more than a sign in front of an empty building).
Google also demonstrated what appears to be a lack of enthusiasm for satellite, despite having invested $900M in SpaceX earlier this year, when its lawyers at Harris, Wiltshire & Grannis asked the FCC on May 20 to include a proposal for WRC-15 that consideration should be given to sharing all of the spectrum from 6GHz to 30GHz (including the Ku and Ka-bands) with balloons and drones (see pp66-81 of this document). Needless to say, this last minute proposal has met with furious opposition from the satellite industry.
However, one unreported but intriguing aspect of SpaceX’s application is the use of a large (5m) high power S-band antenna operating in the 2180-2200MHz spectrum band for communication with the satellites. Of course that spectrum is controlled by DISH, after its purchase of DBSD and TerreStar, and so it’s interesting to wonder if SpaceX has sought permission from DISH to use that band, and if so, what interest Charlie Ergen might have in the SpaceX project.
Nevertheless, it looks like Wyler is going to win the initial skirmish, though there are still many rounds to play out in this fight. In particular, if Musk truly believes that the LEO project, and building satellites in general, are really going to be a source of profits to support his visions of traveling to Mars (as described in Ashlee Vance’s fascinating biography, which I highly recommend) then he may well invest considerable resources in pursuing this effort in the future.
If that’s the case, then the first to get satellites into space will have a strong position to argue to the FCC that they should select which part of the Ku-band spectrum they will use, and so Wyler will also have to develop one or more test satellites in the very near future. Fortunately for him, Airbus’s SSTL subsidiary is very well placed to develop such a satellite, and I’d expect a race to deploy in the latter part of 2016, with SpaceX’s main challenge being to get their satellite working, and OneWeb’s challenge being to secure a co-passenger launch slot in a very constrained launch environment.
Two people have now told me that with 99% certainty, the leak about the DISH/T-Mobile talks came from T-Mobile itself, not from DISH, based on the authorship of the WSJ report. Although it might be tempting to conclude that T-Mobile is trying to prompt a cable operator to consider an alternative bid, Charter has indicated that it will focus on TWC’s MVNO agreement with Verizon to provide wireless services if its TWC bid is successful and Comcast could presumably do likewise if desired.
Moreover, it seems this was not some sort of “official” leak, but instead simply reflects general conversations which got blown out of proportion, because Bloomberg has reported that the talks, which have been going on since last summer, have not advanced significantly in recent weeks.
That still leaves the perplexing analyst event that DISH held on Tuesday, and there’s been no convincing explanation of why that event was scheduled at short notice. Nevertheless, there’s now a frenzy of speculation leaving some convinced about the “inevitability” of a merger. What none of the reports deal with at all is how T-Mobile would actually make use of DISH’s spectrum without AWS-3/4 interoperability, and even then half of DISH’s spectrum in PCS H-block and 2000-2020MHz would still have no ecosystem available.
Instead analysts simply assume that interoperability doesn’t even need to be considered, and that the FCC “buildout requirements of its spectrum are so far in the future it’s not even worth starting the discussion about the weak enforceability of those deadlines.”
Of course a merger makes all the sense in the world if you assume DISH’s spectrum is just as usable as any other spectrum and that the FCC won’t enforce its buildout deadlines (in March 2020) so DISH has all the time in the world to strike a deal at a full price. Unfortunately that simply isn’t the case, and both Verizon and AT&T know that only too well.
That seems to be the question Charlie Ergen is asking Verizon, with the leak of merger talks between DISH and T-Mobile to the Wall St Journal. Yesterday DISH held an analyst meeting at which nothing much of consequence was said, raising the question of precisely why DISH held that analyst meeting in the first place.
The logical conclusion is that DISH hoped it would be able to announce some sort of deal yesterday, but that wasn’t achieved, and so now there has been a decision to leak more specific details about the progress of the DISH/T-Mobile talks (which have been rumored for months). The details disclosed make it unlikely that the intent is to bring T-Mobile back to the table, given the statement that talks on valuation remain at a “formative stage”. If the leak came from the T-Mobile side then it’s plausible to imagine that the aim is to pressure a cable company to make a bid for T-Mobile, or simply that the WSJ made a mountain out of a molehill, given others are saying there has been no change in the situation in recent weeks.
However, until now I considered it more likely that DISH is sending a message to Verizon, after the breakdown of talks on a spectrum sale or leasing deal, that Ergen has other alternatives he can pursue. It’s previously been reported that Verizon rejected DISH’s asking price of $1.50 per MHzPOP for the AWS-4 spectrum last summer, and even after the AWS-3 auction, I very much doubt Verizon has shifted its position on valuation significantly. For spectrum without an ecosystem, like AWS-4, I would still not expect Verizon to be willing to pay much more than $1 per MHzPOP.
Nevertheless, if Verizon had been willing to commit to a partial lease of DISH’s AWS-4 spectrum and support interoperability into the bargain (perhaps with some AWS-3 licenses included to raise the average reported price), then that would have helped DISH to undertake a spectrum spinoff. By doing a deal now, I would expect DISH to also have been able to seek a compromise with the FCC by agreeing to repay the $3.3B DE discount it received in the AWS-3 auction, and thereby mitigate the bad feeling which would otherwise be likely to hamstring DISH’s ability to get help from the FCC in ensuring AWS-3/4 interoperability in the future.
So if Verizon has truly walked away for good, and cannot be forced back to the table by this leak, then I think this is unalloyed bad news for DISH. Without interoperability it is hard to see the value of DISH’s AWS-3 spectrum for T-Mobile, as I noted last week. And it is equally hard to see how agreement can be reached with Deutsche Telekom on the respective valuations of DISH and T-Mobile, especially when DT can hold out for a potential merger with a cable company in the future. So I think Verizon can still proclaim that when it comes to DISH’s spectrum, it’s heads we win, tails you lose.
Despite the delays in the launch of GX, it seems Inmarsat may be looking to stitch up an even larger share of the maritime market in the near term. Rumors are flying that Inmarsat may soon make a formal bid to acquire KVH, the largest maritime VSAT player in terms of vessels (though not in revenues), adding about 3500 more terminals to Inmarsat’s existing 2200 VSAT-equipped ships.
KVH generated nearly $80M from its miniVSAT business in 2014 with an average service ARPU of $1500 per month, compared to Inmarsat’s $90M and ARPU of $4000 including equipment leases (this equates to $2500 per month after stripping out hardware, according to Inmarsat’s most recent results call, which is the more appropriate point of comparison with the KVH ARPU).
The difference in ARPUs between Inmarsat’s current VSAT business and KVH is striking: in fact KVH’s smaller V3 terminal (which has about 900 active units) is generating around $500 in monthly ARPU, below even Inmarsat’s FleetBB ARPU of $700. (Note that the standard FleetBB package sold by KVH now provides only 20 Mbytes of data per month for $749, whereas KVH offers airtime at rates as low as $0.99 per Mbyte.)
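To put those two price points on a common footing, a quick back-of-envelope calculation shows the implied per-megabyte gap between the standard FleetBB package and KVH’s lowest quoted airtime rate (figures as quoted above; this is an illustration, not a published rate card):

```python
# Implied per-Mbyte cost of the standard FleetBB package sold by KVH,
# versus KVH's lowest quoted miniVSAT airtime rate.
fleetbb_price = 749.0   # USD per month for the standard FleetBB package
fleetbb_data_mb = 20.0  # Mbytes of data included per month
kvh_rate = 0.99         # USD per Mbyte, KVH's lowest quoted airtime rate

fleetbb_rate = fleetbb_price / fleetbb_data_mb  # implied USD per Mbyte
multiple = fleetbb_rate / kvh_rate              # how many times more expensive

print(f"FleetBB: ${fleetbb_rate:.2f}/MB vs KVH: ${kvh_rate:.2f}/MB "
      f"(~{multiple:.0f}x)")
# → FleetBB: $37.45/MB vs KVH: $0.99/MB (~38x)
```

In other words, on a per-megabyte basis the FleetBB package costs nearly 38 times KVH’s cheapest airtime, which underlines why miniVSAT has been able to undercut FleetBB at the low end of the market.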
If Inmarsat does move ahead with a KVH bid, it would likely be seen as a counter to Airbus’s disposal of its Vizada business unit, because Inmarsat would then have by far the largest number of VSAT-equipped ships. Indeed it would not be surprising to see attempts by competitors to block the deal on antitrust grounds, not to mention the concerns that current KVH customers will have about potential future price increases.
However, it would also be something of an acknowledgement that GX is optimally positioned as a lower-end, off-the-shelf maritime VSAT service (like KVH’s miniVSAT), as a step up from FleetBB, rather than as a high-end solution for cruise ships and oil rigs. KVH’s growth has slowed in the last year, with terminal shipments holding close to 1000 per year in 2012, 2013 and 2014, while net adds and ARPUs declined. Pressure from Inmarsat will only intensify once the low-cost 60cm GX antenna is available with global coverage, so this looks like it would be a good time for KVH to sell out.
Inmarsat investors will presumably also welcome a deal, with a much clearer path established to a GX maritime business of $200M+ in annual service revenues over the next few years (though it’s important to note this represents a retail service business, not the wholesale spend on satellite capacity). However, the obvious question that customers will ask is whether low-end price packages will still be offered for miniVSAT users, or whether Inmarsat will move them up to much higher price points, as it has done with FleetBB over the last few years.
And what will be the alternative for these users: will it be other VSAT solutions, or will it be the new broadband services (comparable in capability to FleetBB) offered by Iridium’s NEXT constellation? It will take some time for either of these options to emerge, with low-cost, small Ku-band VSAT antennas needed for the former, and completion of the NEXT constellation needed for the latter. That provides a further motivation for Inmarsat to move sooner rather than later, while its freedom of action in the low end of the maritime market remains relatively unconstrained by competitive alternatives.