Steven J. Crowley, P.E.
Archive for the ‘4G’ Category
A study commissioned by UK telecom regulator Ofcom examines tradeoffs among many mobile indoor-coverage technologies, and suggests the agency help consumers learn more about them.
LightSquared has asked the FCC for special temporary authority to conduct four months of tests in support of a potential frequency move. The application, and accompanying exhibit, were received by the FCC on March 5.
As background, to help resolve GPS interference concerns, LightSquared has proposed to conduct a portion of its terrestrial operations in 1670-1680 MHz instead of 1545-1555 MHz. It currently has authority to use half that, 1670-1675 MHz. The 1675-1680 MHz portion, however, is currently allocated for use by meteorological aids such as radiosondes and satellites. The company wants to conduct tests to see if its base stations would be compatible with other services in the 1675-1680 MHz band. A big concern is the radiosondes. Another part of the testing is determining if the radiosondes would be compatible with other services in the 400.15-406 MHz band, if they need to move there to accommodate LightSquared’s needs in 1675-1680 MHz. If a move is needed, the tests would help determine the costs of such a move, and “inform an eventual determination of an appropriate vehicle for meeting these costs” (i.e., who pays).
LightSquared asks to conduct tests across the continental United States. All transmitters would be coordinated with the FCC and NTIA, as needed.
This summarizes a selection of applications for the Experimental Radio Service received by the FCC during July and August 2012. These are related to medium-frequency communications, meteor radar, space-to-space communications, UAV communications, synthetic aperture radar, TV white space, 600 MHz LTE, disaster communications, cellular content caching, GSM, passive intermodulation distortion, ultra-wideband, TDD, ground-mapping radar, Doppler radar, and ground surveillance radar. The descriptions are sorted by the lowest frequency in the application.
In the first of a series of webinars, Qualcomm today began reporting on the results of its “1000x Data Challenge,” an initiative to meet what it sees as the need, someday, to increase mobile capacity 1,000-fold. The webinar, conducted by Rasmus Hellberg, Qualcomm’s Senior Director of Technical Marketing, was an overview. He discussed spectrum, small cells, and other techniques for increasing capacity. More detailed webinars on each of these are forthcoming: spectrum initiatives on September 18, small cells and heterogeneous networks on October 18, and more efficient networks, applications, and devices on November 14. Today’s webinar should be posted tomorrow, and a white paper should appear in about a week.
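As an aside, 1000x-style goals are often framed as a product of roughly independent gains, since spectrum, cell density, and efficiency improvements multiply rather than add. The sketch below is purely illustrative; the factor names and values are hypothetical placeholders, not figures from the webinar:

```python
# Illustrative capacity-scaling arithmetic (hypothetical numbers, not
# Qualcomm's): total capacity gain is the product of the individual gains.
more_spectrum = 10      # e.g., new licensed and shared bands
more_small_cells = 20   # denser deployments reusing the same spectrum
better_efficiency = 5   # improved air interface and smarter devices

total_gain = more_spectrum * more_small_cells * better_efficiency
print(total_gain)  # 1000
```

The multiplicative framing is why small cells carry so much weight in these discussions: spectrum and air-interface efficiency each have practical ceilings, while spatial reuse can in principle keep scaling.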
4G Americas, a wireless industry trade association representing the 3GPP family of technologies, has released a report looking at broadband devices and applications, and their impact on HSPA and LTE networks. There’s quite a bit of interesting information; here I highlight the discussion on mobile broadband offload and mobile data growth.
That’s what a D.C. think tank says. I take a look at this contrary view in a piece I did for GigaOM.
“To generalize, it is often true that studies will be promoted that tend to support the policy inclinations of the Chairman, under whose direction, after all, every draft decision is made.”
“[S]tatistics can lie. But cast as ‘studies’ by commentors, they take on the weight that a decision maker chooses to make of them.”
As a follow-on to its National Broadband Plan, the FCC last year released a Technical Paper intended to validate the Plan’s prediction of a 300 MHz mobile-broadband spectrum deficit by 2014. The Paper describes a spectrum requirements model that totals current spectrum assigned to mobile broadband and applies a multiplier based on expected demand, taking into account expected increased tower density and improvements in air-interface spectrum efficiency. The model’s result is a predicted deficit of 275 MHz in 2014, which rounds to 300 MHz. On the way toward that result, however, the analysis uses just a few of the available data forecasts, ignores offloading of macrocell data to Wi-Fi and femtocells, and assumes the continuation of flat-rate plans for consumers. Some of these oddities I noted in a post at the time. I had hoped the FCC would make the Paper a subject of public comment. That hasn’t happened. So, I’ve looked at the Paper in more detail. I find that when looking at the above factors in a more realistic manner, predicted spectrum requirements go down significantly.
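The Paper's arithmetic can be sketched in a few lines. Everything below is a simplified illustration of that style of model, not the Paper's actual formula or inputs: the function, the offload term, and all the numbers are hypothetical, chosen only to show how demand growth, densification, efficiency gains, and offload interact.

```python
# Hypothetical sketch of a spectrum-requirements model of the kind the
# FCC Technical Paper describes. All inputs are illustrative placeholders.

def spectrum_deficit(current_mhz, demand_multiplier,
                     cell_density_gain, spectral_eff_gain,
                     offload_fraction=0.0):
    """Spectrum needed = current spectrum scaled by demand growth,
    discounted by network densification, air-interface efficiency
    gains, and (optionally) Wi-Fi/femtocell offload. Returns the
    predicted deficit (required minus current), in MHz."""
    effective_demand = demand_multiplier * (1.0 - offload_fraction)
    required = current_mhz * effective_demand / (cell_density_gain * spectral_eff_gain)
    return required - current_mhz

# Hypothetical inputs: 500 MHz in use, 35x demand growth,
# 3x more cells, 2.5x better spectral efficiency.
print(round(spectrum_deficit(500, 35, 3, 2.5)))        # no offload -> 1833
print(round(spectrum_deficit(500, 35, 3, 2.5, 0.4)))   # 40% offloaded -> 900
```

Even with these made-up numbers, the sensitivity is the point: holding demand fixed, adding an offload fraction or raising the densification and efficiency assumptions shrinks the predicted deficit substantially, which is why the Paper's treatment of those factors matters.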
This summarizes a selection from 173 applications for the Experimental Radio Service received by the FCC during August and September 2011. These are related to long-range low-frequency radar, amateur radio, shortwave data, wireless microphones, single-sideband, mine detection, millimeter-wave communications, signal intelligence, automotive radar, satellite feeder links, meteor-burst communications, aircraft telemetry, white space systems, border security radar, 3G and 4G applications, RFID, wind turbine testing, unmanned aerial vehicles, spacecraft telemetry and control, aircraft passenger broadband, and autonomous aircraft landing systems. The descriptions are sorted by the lowest frequency found in the application.
In a recent blog post, CTIA compares some measures of the U.S. wireless industry to those in nine other countries. The purpose is twofold: to show the U.S. leads in number of subscribers, lowest cost per voice minute, and spectrum efficiency, and to argue the need to get more mobile broadband spectrum into the “pipeline.” These goals are somewhat at odds, and I don’t follow the spectrum-efficiency argument, as I’ll explain, but within the constraints of a blog post I think CTIA makes the case that the U.S. is a clear leader in some areas, and that the prospects for more mobile spectrum in the U.S. are fuzzier than they should be today.