What Google Fiber May Mean for North Carolina, NC State and R&D
On Jan. 27, Google announced that it would be expanding its fiber-optic network, Google Fiber, into four new metropolitan areas: Raleigh-Durham, Charlotte, Atlanta and Nashville. So what?
Google Fiber is basically a form of infrastructure for connecting to the Internet. It’s a big deal because it promises online connection speeds of up to 1,000 megabits per second (Mbps). You don’t really need to know what Mbps means to appreciate that 1,000 Mbps is much faster than 45-50 Mbps – which are the highest possible download speeds reported by two major Internet service providers. (And that’s for the most expensive packages – most customers are probably using download speeds of between 6 and 15 Mbps).
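To put those speeds in perspective, here is a quick back-of-the-envelope calculation (the file size and the speed labels are illustrative, not figures from the providers mentioned above):

```python
# Back-of-the-envelope: time to download a 5 GB file at various speeds.
# Speeds in megabits per second; all numbers are illustrative.
FILE_GB = 5
FILE_MEGABITS = FILE_GB * 8 * 1000  # 1 GB = 8,000 megabits (decimal units)

for label, mbps in [("Typical home plan", 10),
                    ("Top-tier cable", 50),
                    ("Google Fiber", 1000)]:
    seconds = FILE_MEGABITS / mbps
    print(f"{label:>16}: {seconds / 60:6.1f} minutes")
```

At 10 Mbps the download takes over an hour; at 1,000 Mbps it takes well under a minute.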
But what does a faster Internet connection mean in practical terms? How could it affect the Raleigh-Durham area, for example? What could it mean for NC State?
To find out, I asked two people who know a lot about the subject: Marc Hoit, NC State’s vice chancellor for information technology and chief information officer, and Rudra Dutta, a professor of computer science at NC State whose research focuses on computer networking.
The Abstract: The Research Triangle (Raleigh-Durham-Chapel Hill) is known for its, well, research. What will having Google Fiber mean for research and development (R&D) in the Triangle?
Marc Hoit: It will allow research and development to spread more into the community through the connection to a fiber infrastructure. Universities already enjoy this level of connectivity, which facilitates interactions and collaborations across physical locations. Entrepreneurs will be able to work and develop new innovations from highly connected homes and small businesses. This transformation is already underway in Kansas City [Note: in 2011, Google selected Kansas City to be the first metro area to get Google Fiber]. Think of the number of ideas generated by people from their homes and small businesses using technology and robust broadband connections. Also, this will allow underserved areas to get connected, become part of the fully connected workforce, create new products and compete with better-funded groups.
TA: What will having Google Fiber mean for economic development in the Triangle?
Hoit: The competition between Google and AT&T Fiber will bring access to the entire community at affordable prices. A key issue in choosing a business location is the availability of fiber for both the business and its employees. Businesses will come to the region because they can now allow more telecommuting and bring IT services to a distributed workforce – such as employing people from home for support services like call centers or tech support, allowing for flexible schedules, part time work, etc.
TA: What will having Google Fiber mean for NC State?
Hoit: Faculty, staff and students will now have connections at home that allow full interaction as if they were on campus. Currently, leaving campus means a severe loss of connectivity and smooth access. It will be a boon for online education, and enable us to explore more flipped classrooms and use of innovative immersion technologies.
Most important, many new innovations come from students working at home – and now they will be able to create new apps, new business models, and new ideas both at home and on campus.
TA: How could Google Fiber affect research at NC State?
Rudra Dutta: Google Fiber aims to bring fiber-class bandwidth, 1 Gbps [gigabit per second, or 1,000 Mbps] or close, to individual consumers. In terms of providing bandwidth as a general service, this is unlikely to produce much impact on university research in the short-term, since (a) the university already has access to high bandwidth, and (b) many of the research units that require the greatest bandwidth face their bottleneck inside the university network, not outside.
But in the longer term, this will have an effect, since for the last few years we have already been in the process of improving the fiber and bandwidth in the network inside the university to allow us to get Gbps or more to buildings that need it. And, as broadband becomes more and more common – because of Google Fiber, but also because established telecom companies will feel the pressure to create comparable offerings (this has already happened with AT&T in Austin, and may be happening here) – our research partners (wherever they are located) will also have access to high bandwidth, and will be able to exchange more extensive data faster with us.
I believe there would also be another type of effect on research in the university, though I’m not sure whether it would take time to build. The difference in broadband experience that Google Fiber brings will be more significant for individual consumers. Over time, it will change the perception of how much bandwidth is useful or sufficient, and of what can be meaningfully and feasibly done over the network.
A few years ago, watching entire movies over the Internet was not really feasible. Now it is the norm. In the future, more dependable, higher-bandwidth networking will mean that more disciplines draw on distributed computing than ever before. This will spur studies and research in the corresponding fields.
For example, a sociologist might study the effect of network latency on the effectiveness of collaboration or instruction of performing artists such as dancers, because such a thing becomes feasible (and therefore meaningful to study), where it wasn’t before.
TA: How could Google Fiber affect your research in particular?
Dutta: Network researchers such as myself will hopefully benefit from the general rise in the perception of networking capabilities, and increased interest, but I believe there will be specific effects in terms of shifts in research importance in the short term due to the local increase of bandwidth. (This will definitely be true of my own research, by the way.)
The key point is that, at least initially, the increase in bandwidth due to the addition of fiber will be at a certain scale – specifically, the metro scale. This means that the increase in speed in accessing “the Internet,” as perceived by the local consumers, will be non-uniform. The speed perceived by an end-user when accessing some service, such as a website, a cloud, or a streaming server, depends on many things – the user’s speed in being able to locally access the Internet, the corresponding speed for the final service provider, and the speed in all the segments of the network in between.
Google Fiber will definitely improve the local consumer’s access speed. But whether this improves the user’s experience in accessing a website depends on whether that website’s server is able to supply data at the higher speed. Realistically, things will improve more for some websites or clouds than for others. This is analogous to the current situation inside your home – your home wireless network has a much higher speed (maybe 50 Mbps) than your Internet connection (maybe 10 Mbps or even lower). So a faster home network does not help you access things on the Internet any faster – just things on your home network itself (like if you have a music server that you access from a TV elsewhere in your home).
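The bottleneck behavior Dutta describes – end-to-end speed limited by the slowest segment on the path – can be sketched in a few lines (all link speeds here are hypothetical):

```python
# End-to-end throughput is limited by the slowest segment on the path.
# Link speeds in Mbps; all values are hypothetical examples.
path_segments = {
    "home Wi-Fi": 50,
    "ISP access link": 10,   # the typical bottleneck today
    "Internet backbone": 10000,
    "server uplink": 100,
}

effective_speed = min(path_segments.values())
bottleneck = min(path_segments, key=path_segments.get)
print(f"Effective speed: {effective_speed} Mbps (limited by {bottleneck})")

# Upgrading the access link to fiber shifts the bottleneck elsewhere:
path_segments["ISP access link"] = 1000
print(f"After fiber: {min(path_segments.values())} Mbps "
      f"(limited by {min(path_segments, key=path_segments.get)})")
```

Note that upgrading one link does not make the whole path fast – it just moves the bottleneck to the next-slowest segment, which is exactly the non-uniform improvement Dutta predicts.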
As the higher bandwidth represented by Google Fiber becomes generally available in our local metropolitan area, it will spur the growth of pervasive computing applications locally – the scenarios that have become known as “the Internet of Things” (IoT). Municipal networking applications could leverage a more “aware” environment.
For example, one could design an application that will supply on-demand roadway traffic video to individual users, or to traffic controllers engaged in real-time traffic engineering. So the relevance of IoT research will increase, with more opportunities to try things out locally. Incidentally, the US Ignite organization has been fostering research in “what applications could you develop if you did not have to worry about any bandwidth limitations” for the last few years.
A similar research area that becomes better motivated is “software defined networking,” because it is generally considered that its flexibility comes at the cost of the extra bandwidth required for the controller-datapath interaction – which would now be less of a problem.
A different research direction whose importance would also increase has to do with reducing the effect of the mismatch between local and global bandwidth mentioned above. There are known approaches to counteracting this – for example, predicting what content is likely to be requested from outside the high-bandwidth “island” and staging it inside the island, so that even content originating outside the fiber island is available at high speed.
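The prestaging idea can be sketched as a toy cache model (the item names and the speeds are invented purely for illustration):

```python
# Toy sketch of content prestaging: content predicted to be popular is
# copied into a cache inside the high-bandwidth "island" before it is
# requested. All names and speeds are hypothetical.

ISLAND_MBPS = 1000    # speed inside the fiber island
OUTSIDE_MBPS = 20     # speed to servers outside the island

island_cache = set()

def prestage(predicted_popular):
    """Stage predicted-popular items inside the island ahead of demand."""
    island_cache.update(predicted_popular)

def access_speed(item):
    """Content already staged is served at island speed; the rest is not."""
    return ISLAND_MBPS if item in island_cache else OUTSIDE_MBPS

prestage({"lecture-video", "traffic-feed"})
print(access_speed("lecture-video"))   # served from inside the island
print(access_speed("obscure-page"))    # fetched over the slower wide area
```

The research problems Dutta mentions – demand prediction and joint path/storage control – amount to deciding, at scale, what goes into `prestage` and when.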
This may be particularly interesting in the present context, because Google already stores or mirrors a very large proportion of data in the world. Other techniques include multipath access to data over slower networks when required. Network design research problems that integrate demand predictions and joint path/storage control would have lots of relevance in this context.
Of course, Google will have done a lot of that research already before entering this area, but there are sure to be problems still open.