Broadband.gov
Federal Communications Commission



Workshop Summary: Technology/Wireless

August 19th, 2009 by Julius Knapp - Chief, Office of Engineering and Technology.

FCC NATIONAL BROADBAND PLAN WORKSHOP THURSDAY, AUGUST 13, 2009

On Thursday we had a very interesting workshop on the role of wireless technology in offering fixed and mobile broadband access.  The workshop was divided into two panels: the first discussed the status of mobile wireless, and the second addressed the opportunities and challenges of serving rural users.  Everyone agreed that demand for data services continues to grow, and providers are working on a number of solutions to meet expected demands.  Several wireless carriers are focusing on building out current 3G networks with HSPA technologies, with plans to evolve to LTE; others are moving rapidly to deploy WiMAX for fixed and mobile applications.  To maximize frequency reuse and increase capacity, providers are also deploying smaller cell sites (e.g., micro-, pico-, and femtocells are becoming more common).  However, smaller cells also require higher-capacity backhaul connections, and all the panelists agreed that the backhaul ("middle mile") problem needs to be addressed.  The availability of fiber connections to such sites limits how quickly the networks will evolve, and service providers in rural areas face additional difficulty obtaining cost-effective backhaul connections.  Innovative technological and regulatory solutions will be needed to address this critical issue.
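The capacity argument behind smaller cells can be illustrated with a rough back-of-the-envelope calculation (a hypothetical sketch, not from the workshop, under idealized assumptions): if every cell delivers roughly the same capacity and frequencies are reused from cell to cell, total capacity over a coverage area scales with the number of cells, so shrinking the cell radius multiplies capacity while also multiplying the number of backhaul links needed.

```python
import math

# Illustrative sketch only: assumes each cell offers the same capacity
# (cell_capacity_mbps) regardless of size, and that a frequency-reuse
# plan lets the same channels be reused in every cell.

def area_capacity(cell_radius_km: float, coverage_km2: float,
                  cell_capacity_mbps: float) -> float:
    """Total capacity over a coverage area, assuming full reuse per cell."""
    cell_area = math.pi * cell_radius_km ** 2
    n_cells = coverage_km2 / cell_area      # cells needed to cover the area
    return n_cells * cell_capacity_mbps

macro = area_capacity(5.0, 100.0, 100.0)    # 5 km macrocells over 100 km^2
pico = area_capacity(0.5, 100.0, 100.0)     # 0.5 km picocells, same area
print(pico / macro)                          # capacity scales as (r1/r2)^2
```

Under these assumptions, cutting the cell radius by 10x raises area capacity by roughly 100x, which is exactly why the panelists paired cell splitting with the warning about backhaul: each of those additional sites needs its own middle-mile connection.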

One other fundamental issue is obtaining more usable spectrum to address anticipated bandwidth demands.  Panelists varied in their requests and proposals, however, and no one identified specific spectrum requirements.  Carriers operating in licensed spectrum recommend international harmonization of spectrum to help drive down equipment costs.  Several rural providers that have developed innovative solutions using limited spectrum seek Commission involvement in gaining access to more spectrum via secondary markets.  Operators offering services in unlicensed spectrum recommend special recognition of their needs in the rules, by creating better "light" licensing regimes and expanding the license-light concept to additional frequency bands.  For the longer term, some panelists presented ideas for opportunistic use of spectrum through dynamic spectrum access techniques, or by learning from open-access technology projects.  This also raised the possibility of networks based on streamlined and flexible base station designs (using software-defined radios), device-to-device communications protocols, and open access allowing devices to roam across multiple networks.  The panelists also challenged the Commission to investigate developing flexible technical rules that allow continuing technology innovation.

Fixed Broadband Workshop

The recently concluded Fixed Broadband Workshop brought together researchers, technology developers, and business planners to discuss the current status of non-mobile, or fixed, broadband and its future potential.  The workshop was organized in two panel sessions: one discussed the Broadband Vision, and the other Fixed Broadband Technologies.  Perhaps the most challenging consensus from the Broadband Vision panelists was that our need for broadband will continue to grow far beyond today's performance capabilities.  Panelists from the Broadband Technologies session responded that, for cable, fiber, and DSL technologies, evolution strategies are already in play to meet this challenge, supporting bit rates an order of magnitude or more higher than at present.

Another challenging view presented was that utilization of broadband infrastructure should be maximized to ensure participation by the greatest number of innovators and that, to the fullest extent possible, the network should be transparent to the applications it supports.  As one researcher noted, such transparency would further the development of applications and lower barriers to use by individuals.  He also noted that as the nation becomes more dependent upon broadband infrastructure, other factors such as reliability and security will need to be included in an evolving definition of broadband.  To this point, it was noted that research is advancing on new network constructs such as cloud computing and network virtualization.  Such constructs may permit new network services and features to be implemented far more easily than in the underlying broadband infrastructure, avoiding the upgrade issues associated with legacy infrastructure.

One panelist cited so-called "middle mile" costs as a significant impediment to broadband in rural areas and suggested that the Commission seek to address this issue.  He also noted that nearly half of the estimated seven million unserved homes in the US are already accessible by cable systems and that stimulus funds directed toward these small cable operators would be a very cost-effective solution.  In all, panelists presented an environment in which technology can meet evolving broadband goals, but policy issues affecting the openness of the network, its capabilities, and its ability to serve all the people of the nation will need to be addressed to realize broadband's full potential.

8 Responses to “Workshop Summary: Technology/Wireless”

  1. Jefferson's Ghost says:

    Watching the afternoon workshop now... why did the FCC moderator just cut Craig Settles off from talking about why the incumbents didn't apply for BTOP/BIP grants?

    Why is the FCC censoring the discussion? The $7.2B in stimulus is arguably an important component of the general "adoption/utilization" topic. Why is this topic off limits? Is the FCC protecting the incumbents from public criticism? Come on, FCC!

  2. arclight says:

    I am interested in how the FCC will approach dealing with incumbent spectrum users. It's already clear that the Commission's rules for white-space devices will not protect incumbent TV receivers from receiver-generated interference caused by locally strong signals (e.g., those caused by white-space terminals in close proximity). It could be that the deployment of large numbers of white-space devices will wind up delivering a death blow to rural TV reception. Does anyone really care?

    Where is the rigorous, peer-reviewed math analysis that supports these deployments and shows how they will affect incumbent users? Doesn't the FCC's OET have an obligation to publish such an analysis BEFORE they recommend policy? Does anyone around here still know how to do real science? Or is it all "political" science, where the laws of physics are subject to Congressional oversight, Executive orders, and Judicial review? It's not just the last 8 years; it seems to be the last 15-20 years, or longer.

  3. Paula Bernier says:

    Why does the NOFA require higher broadband rates from wireline providers than from wireless ones? It doesn't seem like that creates the level playing field the NOFA also talks about.

  4. Richard Shockey says:

    There is no evidence at this point that the use of whitespace spectrum would cause interference with existing applications if the cognitive radios are "smart" enough.

  5. knock knock Guest who? says:

    To address the Fixed Wireless portion, let us not lose sight of the fact that it is not just the 'bit rate' that is important but also the 'type of bits coded'. Given restrictions on bit rates in the past and the limited download speeds available, one can conclude that coding had to be very efficient. As bit rates increase, coding efficiencies decrease, creating a burden on the network (via auto-update features, etc.). My point is not to steer toward the net-neutrality debate but rather a simple observation: you end up chasing yourself if the objective is to make a faster network without some guidance on the data-creation side. Network transparency does nothing to accomplish this task, nor is it clear whether the concept is directed at a standards-based ideology (keeping in mind all of these networks are proprietary and are fundamentally designed differently). In addition, network transparency strikes a chord with protected innovation, as innovation occurs to either lessen cost or increase profit (one and the same). A patented process and its protection were at issue in all of the litigation around VoIP in the Vonage suits. These legal actions are proof positive that transparency cannot be instituted in a clear fashion and in a manner consistent with reconciling free-market innovation with standards-based ideology.

  6. Richard Shockey says:

    One additional point that did not come out in any of the Broadband hearings was a proper definition of what constitutes proper network congestion management. The definition of broadband is difficult enough, but network operators need some reasonable idea of what steps they can take to control congestion that do not constitute a deliberate attempt to impair competition. It may be impossible, and I certainly don't want Congress to attempt to define proper network management practices. It may be that a case-by-case determination is the best solution, but I certainly hope OET can take the lead in seeing that the proper experts are consulted.

    For instance...

    http://www.ietf.org/id/draft-irtf-iccrg-welzl-congestion-control-open-research-04.txt

  7. Wireless Broadband says:

    Don't be fooled here. The power output rates for any wireless card make them nearly radioactive if they were to operate on 4G or WiMAX at a rate over 1 Mbps - check any device manufacturer's specs. That is not to say, of course, that there cannot be meaningful deployment of WiMAX or LTE services in devices, automobiles, etc. However, in a mobile model, reconciling the data frequency output with battery life is the single largest innovation problem. You can allocate all of the spectrum in the world, but in a mobile setting the simple fact is that mobile LTE and WiMAX are not practical. Device power loss, combined with high-frequency output exposure to the user, makes using any LTE or WiMAX card unlikely at a broadband rate of 1 Mbps and downright dangerous as you exceed that output rate. Traditional data push through 3G is tolerable exposure; however, given the potential for data packet transfer, I do not see why there is discussion of this type of technology when the goal is to figure out a broadband strategy. Put wireless on the back burner and focus on the larger infrastructure problems.

  8. knock knock Guest who? says:

    Good point, Richard. In my experience, overflow management is a fool's errand. Because ISPs are inherently closed networks, in some cases they enter into favorable dealings (i.e., Comcast) with third parties for transfer of data directly to their subscribers. I suppose that is a different conversation, but keep in mind: data packets at the root level (i.e., those parsed by algorithms) can and will be used in a very intelligent manner going forward. For example, ID3 tags can be inserted into the packaging of the data, which then, depending on the sophistication of the algorithm and network, could be gated in a preferential fashion. I do not want to see legislation governing network management. However, as more and more convergence in the data management space occurs, we are going to have to be ever mindful of the power that comes with such technology. Google and Comcast are early stakeholders already, and on the plus side, action regarding the 'gating' that Comcast has done in the past was relatively successful in changing the company's approach to data management. I know some networks have management regulation tools deployed to assess whether an individual is operating on a consumer or commercial basis - such analysis is calculated from average use over the network - a formula I find compelling and non-discriminatory.
