Once something becomes part of our everyday lives, we tend to assume it will always be roughly as we see it now. In the case of the Internet, we may hope that even greater data rates will reach even more homes and businesses, while assuming that little else will change. Last week's panel discussion on "the Internet of the Future" was a powerful reminder that the Internet, and broadband technology more generally, will continue to evolve. Moreover, innovation is critical to an infrastructure that meets our long-term needs, and this has implications for broadband policy. For this discussion, it was my privilege to bring together six true thought leaders: engineers Dave Clark of MIT, Van Jacobson of PARC, Scott Shenker of UC Berkeley, Taieb (Ty) Znati of NSF, Dick Green of CableLabs, and economist Rob Atkinson of ITIF.
Moore's Law suggests that electronic devices will continue to improve exponentially, and if the Internet does not improve at a comparable pace, it may become what Dave Clark called a "sea anchor." Moreover, progress does not simply mean the same actors doing the same things more quickly. Although the Internet has been around for four decades, elements of the current infrastructure, applications, and industry structure have emerged fairly recently, quickly, and sometimes unexpectedly - a phenomenon that could continue in the future. Even a visionary researcher like Van Jacobson considered the world wide web a surprise out of "left field" when it emerged in the early 1990s. And as Dave Clark pointed out, today's ISP, which many now see as the only possible form of service provider, emerged only 15 years ago, and at that time ISPs generally leased infrastructure rather than building their own. Are comparable changes ahead? The panel discussed possible technical advances that could change the nature of the Internet and the businesses built on it, creating new challenges and opportunities for policymakers in the process. Will virtualization shift some control from the owners of communications facilities to a new kind of service provider that does not yet exist? Will changes in switch design shift some control from equipment makers back to facilities-based providers of Internet services? Will a proliferation of virtual networks operating over the same physical infrastructure "blur the boundary of what it means to be connected to the Internet," as Dave Clark conjectured? It is too soon to tell.
While much of the recent broadband policy discussion has focused on data rates, we know this is just one limitation of current technology that could motivate innovation. The problem cited most often in last week's panel was security. Other issues include mobility, manageability, and support for new device types such as sensors, or perhaps meter readers in a new smart grid. Rob Atkinson pointed out that some Americans are not using the Internet, not because it is unavailable or unaffordable, but because they find it difficult to use; this too could drive innovation. Even if capacity does remain a limitation, there may be ways to address it other than greater data rates. Van Jacobson has suggested new ways of using increased storage as a substitute for increased communications capacity, although success may depend on finding more intelligent ways to share stored information across users, and more effective security mechanisms.
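The storage-for-bandwidth trade Van Jacobson described can be illustrated with a toy sketch. This is purely an illustration of the general caching idea, not his actual design; the class, names, and behavior below are hypothetical.

```python
# Illustrative sketch only: a toy content cache showing how local storage
# can substitute for repeated network transfers. All names here are
# hypothetical, not drawn from any real system.

class ContentCache:
    """Cache data by name so repeat requests skip the network."""

    def __init__(self):
        self.store = {}           # name -> data held in local storage
        self.network_fetches = 0  # how often the shared link was used

    def fetch_from_network(self, name):
        # Stand-in for an expensive transfer over the communications link.
        self.network_fetches += 1
        return f"data:{name}"

    def get(self, name):
        # Serve from storage when possible; use the network only once.
        if name not in self.store:
            self.store[name] = self.fetch_from_network(name)
        return self.store[name]

cache = ContentCache()
for _ in range(100):          # 100 users request the same popular item
    cache.get("popular-video")
print(cache.network_fetches)  # the link carried the data once, not 100 times
```

The point of the sketch is the final count: a hundred requests for popular content consume the capacity of one transfer, which is why cheap storage can relieve pressure on expensive links. The harder problems Jacobson noted, sharing cached copies across users and securing them, are not modeled here.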
Some of the emerging technical approaches under consideration could facilitate subsequent innovation, creating a virtuous circle of rapid change. Scott Shenker suggested that today's complex switches may someday be replaced by commodity hardware combined with software that can easily be customized to meet its owner's needs, perhaps giving the owners of networks (and server farms) a new ability to innovate. Taieb Znati described how virtualization of some communications functions could enable what he called "evolvability," allowing multiple Internets to exist in parallel. This could support multiple versions designed for the same purpose, thereby facilitating improvements, or multiple architectures designed for very different purposes.
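To make the customizable-switch idea concrete, here is a minimal sketch, not any panelist's design: a switch whose forwarding behavior is an owner-editable match-action table rather than logic fixed in vendor hardware. Every name and rule below is hypothetical.

```python
# Hypothetical sketch: forwarding as an owner-editable match-action table.
# The switch itself is generic; its behavior comes entirely from the rules
# its owner installs, which is the customization Shenker's remark points to.

class ProgrammableSwitch:
    def __init__(self):
        self.flow_table = []  # (match_fn, action) pairs, in priority order

    def install_rule(self, match_fn, action):
        """The network owner customizes forwarding by installing rules."""
        self.flow_table.append((match_fn, action))

    def forward(self, packet):
        # Apply the first rule that matches; drop if none do.
        for match_fn, action in self.flow_table:
            if match_fn(packet):
                return action
        return "drop"

sw = ProgrammableSwitch()
sw.install_rule(lambda p: p["dst"] == "10.0.0.2", "port-2")      # exact host
sw.install_rule(lambda p: p["dst"].startswith("10.0."), "port-1")  # subnet

print(sw.forward({"dst": "10.0.0.2"}))     # port-2
print(sw.forward({"dst": "10.0.9.9"}))     # port-1
print(sw.forward({"dst": "192.168.1.1"}))  # drop
```

Because the table is just data, changing the network's behavior means editing rules in software rather than waiting for an equipment maker's next product cycle, which is the shift in control the panel discussed.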
So is the U.S. prepared for the innovation to come? Here, the views were more mixed, and the discussion more sobering. On the one hand, it was suggested that the U.S. has a strong ability to commercialize new ideas quickly. However, great concern was expressed about innovation over the longer term. Dick Green discussed the need for experimentation. Van Jacobson said fewer of the traditional research leaders are thinking long-term, citing DARPA's desire to produce "commercially relevant" technology and the rise of the professor-entrepreneur as among the reasons. Scott Shenker said that even in a research university it is difficult to do long-term work when one has to write many short-term funding proposals every year. Dave Clark said many of his best students were choosing short-term work in industry over long-term research because they found the research climate in the U.S. too "hostile." Dave also called the total funding level for research "simply miserable." Overall, according to a report by Rob Atkinson and his ITIF colleagues, the U.S. ranks last, 40th among the 40 nations ITIF considered, in "progress toward the new knowledge-based innovation economy." According to Dave Clark, many of our competitors overseas have an "articulated national policy" designed "to exploit our failure, in order to leapfrog us and make sure we are left in the past." They believe they can do this because the U.S. is no longer "serious" about maintaining its technological leadership. Let us hope they are wrong.