Automakers, Others Respond to 5.9 GHz Band Obama Letter

Fifty-four automaker interests, state highway agencies, satellite companies, public safety groups, and others have sent a letter to President Obama responding to one that two dozen tech and telecom entities, consumer groups, and school and library organizations sent him last week urging him to push for shared use of the 5.9 gigahertz band between connected-vehicle and Wi-Fi applications (TRDaily, April 28).

“Last week, you received a letter from the cable industry and additional stakeholders suggesting that the transportation sector refuses to share the 5.9 GHz band used for connected vehicle technology with Wi-Fi. Nothing could be further from the truth,” said the most recent letter. “The transportation sector has been actively engaged with the Wi-Fi industry to determine the best method for robustly sharing the band while maintaining the integrity and reliability of previously permitted Dedicated Short Range Communication (DSRC) systems and ensuring that the vast amount of resources already invested are not wasted. These efforts include the testing of at least two potential sharing solutions that the Federal Communications Commission, the Department of Transportation (DOT), and the National Telecommunications and Information Administration plan to assess this summer.”

Experts Agree on Encryption, But Clash on Surveillance

May 4, 2016–Experts from different ends of the ideological spectrum today agreed that government mandates for tech-sector companies to create backdoors into encrypted devices and services would undermine the goal of better information security, but they clashed over the value of intelligence and law enforcement agency surveillance programs. Speaking at an event organized by the Hudson Institute, Nadine Strossen, a professor at New York Law School and a former president of the American Civil Liberties Union, said she agreed with assertions that if the federal government were allowed to require companies to create security and encryption backdoors for the government’s benefit – as the Federal Bureau of Investigation attempted to do earlier this year in litigation involving Apple, Inc. – the consequences would be dire if those backdoors were accessed by other parties with ill intent.

“Exactly how serious the risks [of creating backdoors] are depend on exactly what the manufacturers would be expected to do,” Ms. Strossen said. “The worst case would be if Apple designed a code that would be capable of being used on Apple devices with the same operating system,” she said, adding, “If that leaks, the public danger would be catastrophic.”

She also criticized the federal government for “overstating” its need in recent court actions to compel Apple to create workarounds to security and encryption technologies. “The FBI insisted over and over again that it could not access the data” that it was seeking on Apple devices, yet “those claims turned out to be unfounded,” she said.

FCC to Consider Modifications to Outage Reporting

May 4, 2016–In the item concerning the FCC’s communications network outage reporting requirements, the Commission plans to consider a report and order and further notice of proposed rulemaking in PS dockets 15-80 and 11-82 and ET docket 04-35 updating its part 4 rules. “For more than a decade, communications providers have kept the FCC apprised of major disruptions in their networks through our network outage reporting requirements,” Mr. Wheeler said in his blog posting. “The data have allowed staff to detect adverse outage trends, support providers’ service restoration efforts, and communicate with public safety officials during times of crisis. These reports also provide the FCC with a unique industry-wide view into communications outages that enables us to help make networks more reliable.

“This becomes even more important as critical infrastructure services rely increasingly on interconnected communications networks,” Mr. Wheeler added. “However, communications providers currently report 911 outages that occur on legacy networks, but not for next-generation 911 over IP networks. That’s why I am circulating an item that would refine our network outage reporting requirements and propose common-sense updates to keep pace with technological change.

“This proposal would initiate a dialogue and seek comment on ways to keep our reporting requirements current, whether for outages to emergency or non-emergency communications, so that we can continue to collectively safeguard the networks that American consumers and businesses rely upon,” the Chairman said.

Last year, the FCC adopted a notice of proposed rulemaking in the proceeding that proposed requiring providers to report any outages that “significantly” degrade or prevent the completion of 911 calls to public safety answering points (PSAPs), not just total outages (TRDaily, March 30, 2015). It also proposed allowing states to access outage information covering their states.

Andy Seybold’s Public Safety Advocate, May 6, 2016

Unlike CNN, I won’t be posting a clock with a countdown to the day and time responses to the FirstNet partnership RFP are due at FirstNet headquarters. Suffice it to say that this is the sixth of May, RFP responses are due by the end of this month, and I do not believe there will be any more time extensions.

Recently, the Congressional Research Service (CRS) published its latest update. The CRS is part of the Library of Congress and has been providing Congressional staffers with reports on the need (or lack of need) for public safety broadband spectrum and FirstNet since well before FirstNet was created by Congress. On April 28, 2016, the CRS sent Congress another report in this series, entitled The First Responder Network (FirstNet) and Next-Generation Communications for Public Safety: Issues for Congress, written by the same person who has authored all of the previous reports. I was sent a copy, as were others, and it makes for interesting reading; you can find it here: https://www.fas.org/sgp/crs/homesec/index.html. Because the CRS answers to the members of Congress, most of its comments are focused on the states even when reviewing FirstNet and the mission it was given by Congress. So it should be no surprise that the report takes FirstNet to task for not looking at options that might be more state-centric.

FCC Officials Plan to Seek Feedback on Solution to ‘Twilight Towers’

May 4, 2016–The FCC plans to seek input on an upcoming proposal to address “twilight towers” – communications structures built between 2001 and 2005 that did not go through the section 106 review process required by the National Historic Preservation Act (NHPA), an FCC official said today.

Chad Breckinridge, associate chief of the Wireless Telecommunications Bureau, discussed twilight towers during a workshop on environmental compliance and historic preservation review procedures required for wireless communications facilities. The towers were built between the adoption of two nationwide programmatic agreements (NPAs) concerning the construction of towers. “This is still very much a fluid process,” said Mr. Breckinridge, noting that the FCC has met with state historic preservation officers (SHPOs), industry, tribes, and the Advisory Council on Historic Preservation on the twilight tower issue.

“We are actively working on it and we are committed to bringing it to closure as quickly as we can,” he said, noting that industry is frustrated because collocations are not allowed on twilight towers. The goals are to account for historic preservation review of the towers and to make them available as quickly as possible, he said.

“There are some thorny issues and some strong feelings about how best to resolve this,” said Mr. Breckinridge, noting that the bureau welcomes any additional input from stakeholders. “At this point, the ball is squarely in our court. We are developing a single proposed solution that we will want to present for broader feedback.”

FCC Tentatively to Consider CAF Phase II, Outage Reporting Items at May Meeting

FCC Chairman Tom Wheeler plans to ask his fellow Commissioners to vote on three items at the agency’s May 25 meeting:  a report and order setting out the framework of a reverse auction to allocate support for providers willing to deploy service in certain high-cost price cap areas; a report and order and further notice of proposed rulemaking dealing with network outage reporting requirements; and a notice of proposed rulemaking to ease requirements for broadcasters’ and cable TV operators’ public inspection files.

The draft high-cost support item in Wireline Competition dockets 10-90, 14-58, and 14-259 will include “rules to implement a competitive bidding process for high-cost universal service support from Phase II of the Connect America Fund,” or CAF Phase II, according to the tentative agenda for the May 25 meeting released by the FCC this afternoon.  As is typical of the FCC’s approach to auctions, the agency expects to issue a separate public notice seeking comment to address specific auction procedures at a later date, according to an agency official who spoke with TRDaily on background.

“Building on the Commission’s experience with its Rural Broadband Experiments program, I’m proposing new rules that would allocate over $2 billion over the next decade in Connect America Fund support for rural broadband through competitive bidding,” Chairman Wheeler said today in a blog posting describing his plans for the May meeting.

DHS S&T to Conduct Airflow Study in NYC Subway

Experiment to Collect Data on Airborne Contaminant Behavior Scheduled for May 2016, Poses No Public Risk

NEW YORK – The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) will conduct a week-long airflow study in portions of the New York City (NYC) subway system to gather data on the behavior of airborne particles in the event contaminants were released. This study poses no risk to the general public and will run from May 9 to May 13.

The study will use only harmless, non-toxic tracer materials (inert gases and particle tracers), and results from the tests will be used to validate airflow and transport models for the subway environment.

“This study is part of the Department’s ongoing commitment to preparedness and the shared responsibility of protecting the nation’s critical infrastructure,” said DHS S&T Program Manager Dr. Donald Bansleben. “The results of this study will provide us with a greater understanding of airflow characteristics, informing the research and development of next generation systems that continue to ensure the safety and security of the general public.”

There will be a single 20-minute release period of tracers in multiple stations, and daily release locations over the course of the week will include any two of the following stations: Grand Central, Times Square, and Penn Station. Air sampling will occur in approximately 55 subway stations around Manhattan over a four-hour window following these tracer releases. During the test period, commuters and travelers may see equipment or experimenters in regular clothes wearing safety vests and badges.

“The Metropolitan Transportation Authority continues to be an active partner in safeguarding the New York City subway system and this study will generate valuable information on protecting against airborne contaminants,” MTA Chairman and CEO Thomas F. Prendergast said.  “These inert gases are safe for our customers and employees, and the entire test will be performed with no impact on them and no interruption to service.  We are fully committed to keeping our nearly six million daily subway customers safe and secure, and this test will bolster the MTA and our partners’ ability to protect them and the city at large.”

Prior to the airflow study, S&T conducted and posted for public comment a draft Environmental Assessment (EA) and proposed Finding of No Significant Impact (FONSI) for the use of the various tracer materials. The final EA and FONSI can be downloaded from the DHS website: http://www.dhs.gov/national-environmental-policy-act. Questions about the NYC test may be directed to the email box: mtanycttest@hq.dhs.gov.

Andy Seybold’s Real World Intelligence, May 4, 2016

The New Hot Technologies

“What goes around comes around” certainly rings true in the world of wireless, where many early pioneers who introduced new ideas ended up going out of business while pursuing the same types of technologies or concepts that are, today, the next big things in wireless. Perhaps they were too early to market, perhaps they were trying to make small wireless data pipes do the work of today’s broadband data pipes, or perhaps network operators did not see the value in what the pioneers were trying to provide. Whatever happened, the point is that most of today’s “hot” wireless ideas are modern-day versions of what came before them.

Three examples of this are data encryption, the Internet of Things (IoT), and even 5G, meaning the use of small cells to increase network capacity for customers. Each of these was developed way back in the 1990s, centuries ago in wireless time.

Data Encryption

The very first BlackBerry featured end-to-end encryption. It was the only product to do so in the late 1990s, and no one considered it all that necessary for many years. Phil Zimmermann released an even earlier encryption product, Pretty Good Privacy (PGP), in 1991 and later built a company around it. It was only mildly successful, mostly because hackers and the Internet had not yet created a demand for encryption. BlackBerry baked encryption into its service, but PGP and a few other early encryption companies had to create demand for their products as add-ons to corporate and even government data systems. This was not easy to do in those days.

It took many years for Research In Motion (RIM, now BlackBerry) to start bumping heads with governments over its email and device encryption. Over time, BlackBerry ended up having to give its encryption keys to more and more government entities. However, for the most part, BlackBerry still provides the best encrypted email available today. Fast-forward to 2016 and we see that the U.S. Congress is trying to pass an ill-advised encryption law, the FBI went to court to compel Apple to help it break into a specific iPhone, and other government agencies are claiming they need unfettered access to secure devices.

We have come full circle with encryption and the need for it: from companies such as PGP that had trouble drumming up interest to today’s world, where anything sent unencrypted over the Internet is easily intercepted. Encrypted communications and files are now the subject of a battle between those who want privacy and those within the government who claim that privacy the government cannot access will foster hostile actions by individuals, groups, or countries.

Finally, there are those who use the Internet for social media, freely posting personal information and their location, handing those who wish to do harm all the information they need. Some willingly put their life stories on the Internet, governments want to make sure they can access anything and everything they think might be interesting, others want to protect their corporate intellectual property, and some organizations need to use the Internet for communications but need those communications to be secure for any number of reasons. Few of these uses would be harmful to any government or its citizens. Yet governments want uncrackable encryption for their own use while believing their citizens do not have that same right. There is no way to tell how this will turn out, but for now it appears a much larger battle over encryption is brewing.

The Internet of Things (IoT)

This is the “next big thing,” or at least one of the next big things, again coming about because of work that went on back in the 1990s. During that time we provided consulting services to Aeris, a company whose name is still carried forward today by those who followed the people who built it. They were almost able to sell Aeris to Verizon before the network operator got cold feet. Aeris was an interesting start-up because it purchased its own SS7 telecommunications switch and designed its Machine-to-Machine services so that when a session was started, the network operator’s system saw the wireless number as a roaming call and handed it off directly to the Aeris switch. In those days the data payload was very small and the call did not take long to complete. In most cases the networks did not even know it was occurring.

Aeris worked with a number of customers, and even though the data carried over the M-2-M modems was small, the impact for the customers was huge. They could measure the water level in a remote storage tank and turn a pump on or off, and monitor the chemicals in a hotel pool and add more if needed, all without having to send out a person. There were systems that reported when they needed to be fixed, and there were Coke machines that could let the truck driver know how many bottles of which products were in the machine and how much of what to bring for restocking, while the company knew exactly how much money was in the machine.

Aeris and a few others were doing all of this, and the network operators had no interest. The income from M-2-M was noise to them compared to their voice, text, and data traffic. Yet they missed two points: the income derived from these devices went almost entirely to their bottom line as profit, and there are many more machines in the world than people. I remember one conversation we had with a network operator’s financial folks. They explained to us that at that time it cost them about $23 per subscriber (total cost of the network, overhead, marketing, etc.) and that each subscriber was bringing in an average of $54, so margins were great. In the case of M-2-M, they could not see any reason to pay $23 per user when M-2-M devices returned only a few dollars a month. To them, each device on their network was a loss. They failed to realize that the cost of putting an M-2-M device on the network was essentially zero.
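A rough back-of-the-envelope sketch (in Python) makes the distinction clear. The $23 cost and $54 revenue figures come from the conversation described above; the few-dollars M-2-M revenue and the near-zero marginal cost are illustrative assumptions, not the operator’s actual numbers.

# Average-cost vs. marginal-cost view of M-2-M revenue (illustrative figures).
AVG_COST_PER_SUB = 23.0      # fully loaded monthly cost per subscriber (from the conversation)
ARPU_HANDSET = 54.0          # average monthly revenue per phone subscriber (from the conversation)
ARPU_M2M = 3.0               # assumed monthly revenue per M-2-M device
MARGINAL_COST_M2M = 0.0      # assumed near-zero incremental network cost per device

handset_margin = ARPU_HANDSET - AVG_COST_PER_SUB
m2m_average_cost_view = ARPU_M2M - AVG_COST_PER_SUB        # how the operator scored it
m2m_marginal_cost_view = ARPU_M2M - MARGINAL_COST_M2M      # what actually hit the bottom line

print(f"Handset margin:             ${handset_margin:.2f}/month")
print(f"M-2-M, average-cost view:   ${m2m_average_cost_view:.2f}/month (looks like a loss)")
print(f"M-2-M, marginal-cost view:  ${m2m_marginal_cost_view:.2f}/month (nearly pure profit)")

Seen this way, every M-2-M device the operators turned away was a few dollars of essentially free monthly margin left on the table.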

Now IoT is a big deal to more than existing wireless broadband networks and the Internet. Some chip-level and other vendors have gone so far as to develop a lighter version of LTE specifically to handle IoT traffic (LTE-M). Companies such as ABI Research forecast that by 2021, 28 percent of wireless connectivity will be IoT devices. Companies such as Fitbit are already thriving because of the explosion of IoT, but it is not all peaches and cream. In some areas, homeowners are not happy about “smart meters” they feel might be spewing out harmful radio frequencies or permitting utilities to spy on them. Many of these meters are mounted on an outside wall, often on the other side of the master bedroom wall. Others are concerned about wearables, yet most of these folks, when asked, will admit they have one or more Wi-Fi access points in their home and think nothing of using them every day to bring content to their many wireless devices.

IoT has the attention of network operators and others who are trying to identify new and unique ways to make use of the technology. It is clear that many law enforcement officers will have IoT devices on them to start their body cameras recording the moment they reach for their Taser or firearm, and firefighters will have body-worn sensors to record their heart rate, breathing, and body temperature when in a building. And, of course, the medical community is already providing devices and applications to remotely monitor patients’ blood sugar levels, heart rates, and much more.

IoT will be used for things we cannot even imagine at the moment. Think back to before the iPhone and how we thought of a wireless cell phone and its limited uses, then fast-forward to today and see how many of the things we do on our wireless devices we had not envisioned. IoT will have the same impact on many additional aspects of our lives. The pioneers are long gone or retired and watching what is happening. I am sure they are amazed that what they first conceived of and tried to make into mainstream products and services has taken this long to mature and grow. So many things have come together to enable IoT to be perceived as one of the most important new wireless technologies of this century: the uptake of smart devices, the willingness of people to make use of these devices in many different ways, and the way in which smart people and companies have already married IoT and wireless devices. In some cases an IoT device will communicate with the network on its own, but in many more the smart wireless device we carry will be the “access point” that collects and forwards information gathered by IoT devices on us or in close proximity to us.

M-2-M has morphed into IoT: much of the early work, and many of the early wins and losses, supplied the ideas and concepts to those who decided IoT would become one of the next big things and who have developed it into today’s growing IoT market, which network operators are now embracing wholeheartedly.

The Early Days of 5G

Wi-Fi was also coming to fruition in the 1990s. For example, Wayport, which is now a part of AT&T, was founded in 1996. Another pioneer in the Wi-Fi space was Proxim (founded in 1984), which was providing Wi-Fi-like access points before Wi-Fi became a standard. These access points were deployed by MobileStar, also founded in 1996. In 2001, MobileStar faced bankruptcy, and in 2002 T-Mobile purchased its assets. MobileStar, you might remember, pioneered the placement of Wi-Fi access points in Starbucks and many other venues.

During this period, AT&T, Verizon, and Sprint were not Wi-Fi fans, mainly because the spectrum was unlicensed and could not be coordinated or owned by the wireless network operators. However, T-Mobile moved forward and not only purchased the Wi-Fi access points that made up MobileStar, it integrated them into its cellular back-end system. In 2007, T-Mobile was the only network operator to offer combined cellular and Wi-Fi connectivity. I had a T-Mobile-modified access point in my home and one in my office about 200 feet away. I was able to answer my T-Mobile phone using the access point in my home, walk outside and be switched to the T-Mobile network, and once in my office be switched to my office access point. During such calls, I didn’t miss a single syllable of my conversation in either direction.

Today the promise of 5G is almost exactly what T-Mobile implemented in the early 2000s. It is about more and smaller cells, placed in areas of heavy demand, off-loading the primary wireless network to provide more capacity and data throughput for everyone. 5G will use a mix of different portions of the radio spectrum, including Wi-Fi on 2.4 GHz and 5 GHz, 3500 MHz shared spectrum, some AWS spectrum, and spectrum up into and beyond what is currently used for microwave and satellite communications.

There were also some spectacular Wi-Fi failures during this time. Some companies decided that, like Metricom before them, they could build Muni-Wi-Fi networks providing cheaper, faster data to customers and cut out the network operators. One of the biggest promoters of Muni-Wi-Fi was the Internet service provider EarthLink, which built and planned some very ambitious Metro-Wi-Fi systems. In 2005, after building several systems in other parts of the United States, EarthLink made a deal with Anaheim to build out the city. The promise was to cover Anaheim and provide in-building coverage to the first wall. On the day of the system launch we were invited to attend the event, tour the network operations center, and experience the system firsthand. I had written a number of articles about Metro- or Muni-Wi-Fi and the fact that there was no business model for this service. Because of this, after the event we were invited to a meeting with EarthLink’s CEO and others. Out of the meeting came a contract for us to run tests in the City of Anaheim three times during a six-month period, measuring data rates as well as coverage.

During this six-month period, as we ran our tests we watched EarthLink struggle with its system. It went from 70 access points a square mile to 90 and then 105. The Tropos Wi-Fi equipment used a mesh-network type of architecture in which only one in three access points was connected to the backbone and on to the Internet; the other access points received customers’ data and hopped it over to one of the connected access points. Our tests showed a number of issues. First, the system’s Wi-Fi channels could not be changed automatically. Next, the cable provider was shipping cable modems to the citizens of Anaheim with Wi-Fi hard-wired to channel 6, and the constant beacons from multiple in-home access points really messed up the network. There were several times when we could see the EarthLink access point but could not register or talk to it. Basically, over the six months we conducted the tests, we watched the system implode on itself.
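A minimal sketch (in Python) shows why densifying a 2.4 GHz mesh like this can backfire. The access-point densities below are the ones observed in Anaheim, but the raw throughput, contention area, and hop-count figures are illustrative assumptions, not measurements from the EarthLink/Tropos system.

# Why adding mesh access points can reduce per-AP throughput (illustrative model).
RAW_RATE_MBPS = 22.0            # assumed usable per-channel Wi-Fi throughput
NON_OVERLAPPING_CHANNELS = 3    # 2.4 GHz band: channels 1, 6, and 11
CONTENTION_AREA_SQ_MI = 0.1     # assumed area over which co-channel APs hear each other

def per_ap_throughput(aps_per_sq_mile, relay_hops=1.0):
    """Estimate usable throughput at one mesh AP.

    Co-channel APs within the contention area share the channel's airtime,
    and each wireless relay hop back to a wired gateway roughly halves
    whatever capacity is left.
    """
    contending = max(1.0, aps_per_sq_mile * CONTENTION_AREA_SQ_MI / NON_OVERLAPPING_CHANNELS)
    return RAW_RATE_MBPS / contending / (2 ** relay_hops)

for density in (70, 90, 105):   # the Anaheim densities cited above
    print(f"{density} APs/sq mi -> ~{per_ap_throughput(density):.1f} Mb/s per AP")

Under these assumptions, each added access point shrinks every co-channel neighbor’s share of airtime, so adding radios yields less and less usable capacity per node, which is consistent with what we watched happen as EarthLink densified the network.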

Today network operators have embraced Wi-Fi access points, and both AT&T and Verizon have finally enabled voice over Wi-Fi, something T-Mobile did more than a decade ago. Wi-Fi is now an accepted way of off-loading the operator’s primary network in-building. That brings us to 5G, which will basically be an extension of what is already occurring with Wi-Fi but using different technologies and additional portions of the radio spectrum.

There is still the issue of Wi-Fi and LTE-U (LTE unlicensed), which Qualcomm and others claim can co-exist on Wi-Fi channels (since those channels are unlicensed) with licensed operators also using them. FCC rules state that you have to accept any and all interference on your unlicensed system, but you cannot cause any interference to a licensed user. To my knowledge this has not been a real problem, except that Wi-Fi users notice over time that their access points don’t cover the inside of their homes as well as they used to and perhaps their data rates are slower. This is typical when you have so many radios in close proximity to each other spewing out data on the same basic portion of the spectrum.

Wi-Fi will be an important part of the 5G rollout and much of it is already in place, so network operators are looking at other spectrum they own or license and can coordinate for their 5G systems. Companies are flocking to what they consider to be 5G (I have yet to see a formal definition of 5G from any of the standards bodies). AT&T is testing 5G in its lab in New Jersey and will deploy it in a few test areas, while Verizon is rolling out 5G at its New Jersey headquarters and will reportedly use Boston as its test bed. Further, Comcast will take part in the 600-MHz spectrum auction and is ideally positioned with assets in the field to deploy 5G small cells in its coverage areas due to the large amount of fiber it already has on the streets. (Note: This is yet another do-over; the track record for cable companies in the wide-area wireless world is dismal at best.)

Since each cell will have to be connected back to the network somehow, having fiber in place is a good start. Google is another company that has been offering fiber services in several cities, with more planned, and it is currently conducting 5G studies and testing in one of the cities where it already offers fiber. Verizon and AT&T have fiber, as do others, including most cable companies, and, of course, 5G is all about off-loading the main network with lots and lots of small cells, all of which need connectivity. But wait, there’s more!

The same companies that have fiber on the streets, either underground or on poles, are looking at the same radio technologies to connect the fiber to the home. Some are using light beams to move gigabytes of data from the street to homes, and some are using the same radio spectrum that will be used for 5G to provide a radio link between the street and the home. Perhaps that is one reason AT&T bought DirecTV, which has a multitude of small dishes mounted on homes, apartments, condos, and businesses.

Conclusions

What goes around comes around in the wireless industry. The new hot technologies are encryption, the Internet of Things, and 5G. What came before each of these prepared those who are re-pioneering the same basic technologies, only now more secure (encryption), able to send and receive more data for less money (IoT), and offering better wireless capacity and data speeds (5G).

In the “old days” when people were experimenting with and rolling out these and other technologies, many mistakes were made along the way. Often the products were ahead of their time, or there was no perceived need or economic incentive for what was being developed. The question of the day is: will some of these same mistakes be made this time around, or will we look to the past to avoid them?

FirstNet Elaborates on Purpose of Additional State Data

The First Responder Network Authority has elaborated on the purpose of new or updated data that states and territories may submit to FirstNet by Sept. 30. In a blog posting, Brian Hobson, FirstNet’s state plans technical lead, said, “The initial data collection process that culminated last fall was a key source of stakeholder input into the RFP.  … As we discussed at the SPOC meeting [last] month, this round of data collection is an optional opportunity for any states that have collected additional data or have updates to their submissions since last fall’s data collection. We are still focusing on the four key topics about which we originally requested data, including:  1) coverage; 2) users and operational areas; 3) capacity; and 4) current services and procurement.

“A key difference this time, however, is that the data will not be incorporated into FirstNet’s acquisition process nor will it be considered for proposal evaluations,” Mr. Hobson added. “The initial data submitted by states and territories in 2015, on behalf of local, state and tribal public safety agencies, was used to inform FirstNet’s RFP.  Those inputs, however, have been frozen and will not be changed.

“We are requesting the additional data ahead of the contract award slated for later this year so that we can synthesize and understand the most recent inputs from states and territories,” the blog posting stressed. “Any updated or additional data that we receive this year will only be shared with FirstNet’s partner(s) after award of the network contract and will not be used to materially change the negotiated solution under the award.”

In his weekly e-mailed commentary last week, Andy Seybold, a wireless industry consultant and public safety advocate, said that FirstNet’s solicitation of updated data from states had raised questions among potential bidders to build out the FirstNet system. – Paul Kirby, paul.kirby@wolterskluwer.com

Courtesy TRDaily

 

City of New York Expresses Concern about LTE-U

The city of New York today expressed “grave concern” that the deployment of LTE-U technology will cause harmful interference to Wi-Fi in the city. In a letter to 3GPP leaders that was copied to FCC officials and others, Maya Wiley, counsel to New York City Mayor Bill de Blasio (D.), said “the City has embarked on an aggressive effort to achieve universal broadband for all New Yorkers, including affordable high-speed residential service and free service in public areas. Wi-Fi is a central part of this effort and any technological interference with our ability to deliver free and affordable wireless access to our residents is of grave concern.”

The letter outlined efforts to deploy Wi-Fi in the city through the LinkNYC, Harlem Free Wi-Fi, Queensbridge Wi-Fi, and Red Hook Wi-Fi initiatives, as well as through Wi-Fi deployed in schools, libraries, parks, and subway stations.

“Clearly, any threat to Wi-Fi is a threat to the very fabric of the city,” it said. “Even a modest loss of coverage area for a Wi-Fi hotspot, when multiplied and magnified over the scale of New York City, could impact millions of users daily and decrease the value of hundreds of millions of dollars of public and private investment. Likewise, any increase in latency could undermine the utility of the City’s investments for innovative voice and video applications.”