The Case for a Municipal 4G Network

We live in a connected world, one in which the lower the connection costs and latency, and the greater the bandwidth and mobility, the better.  As rising data consumption drives users to demand improvement on all four of these fronts, existing solutions to the last mile problem (the final leg of transmitting data from the communications provider to the consumer) must be reexamined.  Currently, telephone lines, cable, and fiber-to-the-home make up nearly the entire range of last mile infrastructure options available within the United States.  Yet each poses a number of limitations that make it difficult or impossible to support the kind of data consumption that users will require going forward.  Telephone lines, while reaching nearly all United States residences, are a noisy medium that cannot carry the hundreds of megabits per second that other solutions are capable of transmitting[1], and they suffer from relatively high latencies.  Cable and fiber-to-the-home offer very high bandwidth and low latency but a high fixed cost per user, as each individual residence must be wired up, requiring substantial labor and resources for any single connection[2].  All three of these solutions offer very limited mobility, as even attaching a wireless router to a home modem can provide only a few hundred feet of freedom at most.

Thankfully, a new technology that goes by the trade names of WiMAX and LTE but is more broadly known as fourth-generation wireless (or 4G) is reaching maturity and promises to solve all of the major problems inherent to current systems; 4G mates the low cost of digital subscriber lines with the high bandwidth and low latency of cable and fiber-to-the-home solutions.  Moreover, when deployed across a wide area, 4G can provide the kind of mobility that consumers are used to experiencing with their cellular phones.
In short, if every United States citizen is to have always-on access to cheap, low-latency, high-bandwidth broadband, then it is 4G, and not DSL, cable, or fiber-to-the-home, that can get the job done.

What makes 4G such a revolutionary way to solve the last mile problem are the technology’s vast capabilities at low cost.  Rather than depending on the installation of millions of individual residential connections as current prominent last mile solutions do, 4G requires the installation of communications towers that distribute a data connection to all subscribers within range of the transmitter, much like 3G service for cellular phones.  Each subscriber must have a modem that either acts as a central standalone unit much like a DSL, cable, or fiber-to-the-home modem, or plugs into an individual computer.  The modems can provide internet access alone or deliver so-called “triple play” service in which internet, television, and telephone are all bundled through the same connection.  Because users have the option of a standalone modem or one that plugs into a computer, those with laptops or 4G-enabled cellular phones (such as the soon-to-be-released HTC EVO 4G) can receive data services from any covered location rather than just their homes, as in previous last mile solutions.

Current and Proposed 4G Networks

Various slices of spectrum have been licensed for 4G use, although in the United States 2.5GHz spectrum is currently the most widely used and has already been widely deployed by Clear, a joint venture of Clearwire, Sprint Nextel, Comcast, Time Warner, Intel, Google, and Bright House.  Clear is a commercialization of WiMAX, operating on licensed 2.5GHz spectrum with about 190MHz of depth, and is available in a growing number of cities, including Philadelphia.  Currently Clear offers triple play at rates that are competitive with DSL, cable, and fiber-to-the-home, although presently its bandwidth and latency cannot match cable and fiber-to-the-home offerings.  Within the next year, Verizon’s 4G service, a commercialization of LTE, will begin transmitting on 700MHz spectrum with about 22MHz of depth and should offer somewhat similar performance to Clear’s service[3][4].

WiMAX and LTE have few meaningful differences in their capabilities.  However, the spectrum that they will operate on within the United States has a number of important consequences for how the services can be used.  Because current WiMAX implementations run at higher frequencies than proposed LTE implementations, Clear’s individual WiMAX towers will be able to offer higher bandwidth than Verizon’s eventual LTE towers at the cost of a smaller range for each tower.  This situation is similar to a comparison of AM and FM radio stations; AM stations, transmitting at a lower frequency, can travel much farther but do not have the kind of sound quality of their FM counterparts, while the FM stations, transmitting at a higher frequency, cannot transmit nearly as far as their AM counterparts.
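
The AM/FM intuition can be made concrete with the standard free-space path loss formula, which quantifies how much more a signal attenuates at higher frequencies.  The frequencies below are the ones discussed above; the 5 km distance is purely an illustrative assumption.

```python
from math import log10

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

# Loss over 5 km at Verizon's 700MHz versus Clear's 2.5GHz spectrum.
loss_700 = fspl_db(5, 700)     # ~103.3 dB
loss_2500 = fspl_db(5, 2500)   # ~114.4 dB

# The higher frequency loses ~11 dB more -- roughly 13x less received power --
# which is why each 2.5GHz tower covers a smaller area, all else being equal.
extra_loss = loss_2500 - loss_700
```

This free-space figure ignores obstructions entirely; real coverage at 2.5GHz suffers further because higher frequencies also penetrate buildings and foliage less well.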

The differences between Clear’s WiMAX and Verizon’s LTE are greater than just the frequency of spectrum each will be using; depth matters, too.  Because Clear has access to more than eight times as much spectrum as Verizon (190MHz versus 22MHz), it can offer each user significantly more bandwidth or offer the same amount of bandwidth to more users.  Current 4G implementations offer theoretical spectral efficiencies of approximately 3.7 bits per second per Hertz (usually closer to about 2 bits per second per Hertz in the real world[5]), meaning that without resorting to the smart antenna technologies briefly described below, Clear can theoretically offer about 700 megabits per second per tower, while Verizon can only achieve about 80 megabits per second per tower.  Because WiMAX and LTE plan to allocate between 5MHz and 20MHz of spectrum per channel, raw theoretical connection speeds per channel without smart antenna technologies should vary from about 18 megabits per second to 75 megabits per second.
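
The arithmetic behind these figures is simply spectrum depth multiplied by spectral efficiency; a quick sketch using the numbers above:

```python
# Back-of-the-envelope throughput from spectrum depth x spectral efficiency.
SPECTRAL_EFF = 3.7  # theoretical bits per second per Hertz (~2 in the real world)

def throughput_mbps(spectrum_mhz: float, eff: float = SPECTRAL_EFF) -> float:
    return spectrum_mhz * eff  # MHz * bit/s/Hz = Mbit/s

clear_per_tower = throughput_mbps(190)   # ~703 Mbit/s across Clear's 190MHz
verizon_per_tower = throughput_mbps(22)  # ~81 Mbit/s across Verizon's 22MHz
channel_min = throughput_mbps(5)         # ~18.5 Mbit/s for a 5MHz channel
channel_max = throughput_mbps(20)        # ~74 Mbit/s for a 20MHz channel
```

Swapping in the real-world figure of about 2 bits per second per Hertz roughly halves each of these numbers.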

At first glance this makes it seem like an LTE tower might only be able to service a small handful of users at a time, with a WiMAX tower only connecting a few dozen.  Fortunately, both WiMAX and LTE allow users to dynamically share channels, so while LTE still cannot offer as much bandwidth as WiMAX at any one time thanks to its lower frequency and decreased depth of spectrum, each 4G standard can carry more than 200 users per channel[6].  Of course, sharing channels means that each one of these users will be sharing bandwidth unless beamforming and MIMO technologies can isolate different users sharing the same channel.

The consequences of sharing channels at a practical level should be clear to anybody who has ever used a WiFi network in a heavily populated area.  When the user first takes out his laptop or other WiFi device to sniff out the newly created wireless network, he might find that there are dozens of other routers already transmitting within his area.  Upon closer inspection, each of these routers transmits on one particular channel that was chosen when the router was set up.  It may be that many of these routers are all transmitting on the same channel (the usual suspects seem to be channels six and eleven when it comes to WiFi), yet even so, all of these networks can happily coexist.  However, because each WiFi channel can only carry 54 megabits per second for an 802.11g connection, users with routers on the same channel will all be eating slices of the same 54 megabit per second pie.  Fortunately, if there is an open channel among the eleven possible WiFi channels, then any one router set to that channel can have its own 54 megabit per second pie without worrying about sharing bandwidth.  This same sort of logic holds true for 4G systems, except that in the case of 4G, hundreds of users are likely to share one channel instead of a handful as in a WiFi network.  Thankfully, both 4G profiles dynamically assign channels, so the clogging effect that happens on WiFi networks in which everybody utilizes the same channel rather than choosing open channels does not take place.  In this way, the total amount of available spectrum is evenly distributed among all users across all available 4G channels, ensuring that users have a relatively consistent experience regardless of channel.
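
The pie-slicing works out starkly in numbers.  Assuming (as a simplification) that a channel's bandwidth is split evenly among its active users:

```python
# Per-user share of a channel's bandwidth under an even split (a simplification;
# real schedulers allocate dynamically based on demand).
def per_user_mbps(channel_mbps: float, users: int) -> float:
    return channel_mbps / users

wifi_share = per_user_mbps(54, 5)     # 5 routers on one 802.11g channel: 10.8 Mbit/s each
fourg_share = per_user_mbps(75, 200)  # 200 users on one 20MHz 4G channel: 0.375 Mbit/s each
```

This is exactly why the beamforming and MIMO techniques described later matter so much for 4G: without them, a fully loaded channel leaves each user with well under a megabit per second.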

The Capabilities and Challenges of 4G

When it comes to 4G, just one transmitter can blanket a substantial area, anywhere from half a dozen to more than a thousand square miles.  The characteristics of coverage depend on a number of factors such as background noise, terrain, number of users, users’ bandwidth needs, amount and frequency of allocated spectrum, and backhaul capabilities.  In order to create a consistent user experience across a broad swath of the United States, each one of these factors must be accounted and corrected for.  And in order to gain an understanding of the technical challenges that 4G implementations must deal with, it is important that each of these factors be explained.

The Shannon-Hartley Theorem dictates that when a large amount of background noise is present in an environment, less data can be clearly transmitted than in a quieter environment.  To transmit the same amount of data in a high-noise environment as in a low-noise environment, each bit must be transmitted at a higher power level in order for it to overcome the background noise.  The limitations described by the Shannon-Hartley Theorem show up in all data transmission situations, and they are particularly salient when there is large variance in the level of background noise present.  In the case of 4G wireless implementations, less bandwidth is going to be available in environments subject to substantial radio-frequency noise in the spectrum allocated to 4G wireless, all other factors being equal, so in those places, 4G towers must be placed closer together or transmit at higher power to maintain a user experience uniform with that of low-noise environments.  By placing 4G towers closer together, each user can be allocated a larger amount of spectrum because each tower will serve fewer total users, thus compensating for bandwidth loss due to noise.  By having 4G towers transmit at higher power levels, the background noise can be overcome because the signal will drown it out; however, for a consistent experience to exist in this scenario, both the provider and subscriber must be transmitting at a higher power, and in mobile devices the constraints of transmitting at higher power may be unreasonable (upload speeds may have to be lowered).  What is clear through all scenarios, however, is that all things being equal, environments with more background noise will offer less bandwidth per user than low-noise environments, and so the 4G implementation must be designed in such a way as to mitigate the problems that noise may pose for the user experience.
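
The theorem itself is compact: channel capacity C = B·log2(1 + S/N), where B is bandwidth and S/N the signal-to-noise ratio.  The 20MHz channel and the two SNR values below are illustrative assumptions, chosen to show how noise eats into capacity:

```python
from math import log2

def shannon_capacity_mbps(bandwidth_mhz: float, snr_linear: float) -> float:
    """Shannon-Hartley: C = B * log2(1 + S/N), in Mbit/s for B in MHz."""
    return bandwidth_mhz * log2(1 + snr_linear)

# A 20MHz channel at 20 dB SNR (S/N = 100) versus a noisier 10 dB SNR (S/N = 10).
quiet = shannon_capacity_mbps(20, 100)  # ~133 Mbit/s theoretical ceiling
noisy = shannon_capacity_mbps(20, 10)   # ~69 Mbit/s theoretical ceiling
```

A tenfold increase in noise roughly halves the ceiling here, which is exactly why noisy environments demand denser tower placement or higher transmission power.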

Widely varying terrain is another problem that a large-scale 4G implementation must cope with, and while the best ways to mitigate the problem share some similarities with solving the background noise problem, solutions such as beamforming using MIMO technology can also assist in maintaining a consistent user experience.  In flat areas with relatively few obstructions and low noise, so long as the 4G tower can be placed well above the ground, its signal can travel for many miles.  Attenuation will eventually cause the signal to become indistinguishable from background noise, but nonetheless, ranges greater than 30 miles are possible[7].  In areas with widely varying elevation or many buildings and other obstructions, it is much more difficult for a signal to reach a large area due to reflection, refraction, absorption, and attenuation of the signal.  Because these interference effects become more prominent as the signal travels farther from the source (more obstacles get in the way and the signal simply dissipates), placing 4G towers closer together can help solve these problems.  Transmitting at higher power can similarly allow the signal to be received in areas that had been unreachable due to various obstructions limiting reception, and, if the option is available, transmitting at a lower frequency can improve the ability of the signal to penetrate obstructions at the cost of bandwidth (much like the example of Verizon’s LTE versus Clear’s WiMAX).  However, beamforming and MIMO technology have emerged as flagship methods for improving wireless transmissions without having to increase the number of towers or transmission power or lower the frequency; both have been included in 4G implementations.  MIMO allows multiple antennae on the transmitting and receiving devices to send and receive data along multiple paths, increasing both the efficiency and redundancy of the system.
In this case, even though data is sent along the same channel, because it is sent from multiple antennae in a way that prevents interference between signals, the receiving antennae are able to differentiate the signals and process many times more data depending on the number of transmitting and receiving antennae.  This technology was commonly used in wireless routers that utilized the 802.11g specification (meaning each channel can carry a theoretical limit of 54 megabits per second) but which advertised speeds as high as 108 megabits per second; those routers and receivers were sending data on the same channel in two separate paths in order to double the total bandwidth.

Beamforming allows the actual transmission path of the signal to be shaped in a way that optimizes transmission robustness and efficiency; instead of sending the signal in every direction, the transmitting antenna can detect the location of the receiver and only send the signal in that direction.  Imagine one WiMAX tower operating on one 20MHz channel, which has a theoretical limit of about 75 megabits per second.  Without beamforming, sending data from this tower is somewhat like sending Morse code by blocking and unblocking the light from a lamp in the middle of a room; there is no way for anybody on one side of the room to receive data that is unique from the data received by anybody in another part of the room.  This is like everybody connected to the 4G tower receiving an equal portion of the total 75 megabits per second.  Now going back to the light in the room analogy, imagine that instead of a lamp, there are individual spotlights shining on each person in the room.  Each one of those spotlights can be blocked and unblocked individually, and now the total amount of data that can be sent is the amount of data that the original lamp could send multiplied by the number of spotlights.  In much the same way, if the WiMAX tower can direct all 20MHz of the channel toward each user without any interference from transmissions to other users (probably by implementing some sort of MIMO strategy), then rather than forcing all users to share that 75 megabits per second, each user can get his own 75 megabits per second even though he is using the same channel as other users.  Combining MIMO and beamforming allows for greatly increased bandwidth and range with a given transmission power, making 4G practical even in urban environments or other locations plagued with numerous obstructions or sources of interference.
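
The lamp-versus-spotlights analogy can be put in numbers.  The ten users and the perfectly isolated beams below are illustrative assumptions, a best case that real beamforming only approximates:

```python
# One 20MHz channel at ~3.7 bit/s/Hz works out to about 75 Mbit/s.
CHANNEL_MBPS = 75
users = 10

lamp_per_user = CHANNEL_MBPS / users    # one omnidirectional signal: 7.5 Mbit/s each
spotlight_per_user = CHANNEL_MBPS       # one ideal beam per user: the full 75 Mbit/s each
spotlight_total = CHANNEL_MBPS * users  # 750 Mbit/s aggregate from the same channel

# MIMO multiplies throughput the same way: the 802.11g routers mentioned above
# sent two spatial streams over one 54 Mbit/s channel to advertise 108 Mbit/s.
mimo_router_mbps = 54 * 2
```

In practice, beams overlap and interfere, so the gain falls somewhere between the lamp and the ideal spotlights, but the direction of the improvement is the same.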

Different areas have varying densities of users; a rural area may only have a few dozen users per square mile, while an urban environment may have thousands.  Accounting for this variable is crucial to ensuring that users in all environments can enjoy consistent experiences.  Towers have a maximum number of channels into which spectrum is subdivided among users.  This means that each individual tower can only provide a satisfying 4G experience to a limited number of users.  In a rural area, even when a 4G tower is transmitting at its highest power level to maximize its range, it may still not reach enough users to max out the capacities of its available channels.  That tower has excess bandwidth capacity, but there are no users to utilize that capacity unless individual channels are made wider, thus providing more bandwidth per channel and consequently more data to each user.  In this scenario, transmitting at a lower frequency would be more effective because it would increase the signal range, thus capturing more users; the loss of bandwidth would not be problematic.  Conversely, in an urban setting in which thousands or millions of users require high levels of bandwidth, individual towers must be placed relatively close together and run at lower power levels to limit their range.  This allows the maximal number of users to enjoy high bandwidth connections because the same channels can be reused in areas covered by separate 4G towers; essentially, these towers run at low enough power levels that they do not interfere with one another, with each tower’s coverage area tuned (using a combination of diminished power levels as well as MIMO and beamforming technology) to include just as many users as it has capacity for.  Transmitting at a high frequency would be effective here because it would increase the amount of available bandwidth, especially when maximizing range is not a priority.

In describing both the rural and urban situations, an analogy comes to mind: one 4G tower is to hundreds or thousands of users as one DSL, cable, or fiber-to-the-home line is to a single residence.  Even though 4G can provide comparable levels of data consumption to all of these older last mile solutions, the increased efficiency at which 4G can accomplish this is astonishing.  So long as the coverage area of a 4G tower is large enough that the number of users saturates the tower’s available channels (this means roughly anywhere from about 250 to several thousand users depending on transmission frequency and channel depth), it does not cost much more to connect people in rural areas than it does in urban areas.  The only real differences relate to the amount of electrical power needed to run a high or low-power transmitter, and the cost of creating the backhaul to connect the 4G tower to the internet’s backbone.  However, 4G towers can connect to one another through mesh networking, so as long as every 4G tower somehow has access to an Ethernet connection to the internet’s backbone (either via a direct line or through the direct line of another 4G tower), network operators do not have to worry about wiring each individual 4G tower.

Depending on what networked activities users are partaking in, their bandwidth needs can differ drastically.  Simple web browsing requires relatively little bandwidth, especially when 4G towers are capable of transmitting hundreds of megabits per second.  On the other hand, if many users are streaming high-definition videos or making other large file transfers, individual users will require a far more significant level of bandwidth, something that might only be possible by widening the spectrum per channel or implementing MIMO and beamforming techniques to increase the spectral efficiency of the system.  Depending on the direction that the internet takes – and now it is certainly looking as though very bandwidth-intensive applications are in the cards for the future – 4G is going to have to work very hard to make sure that it can handle the needs of users.

It is situations in which many users are all drawing lots of bandwidth at once that put the most strain on networks, wireless ones especially.  For fiber-to-the-home, extremely heavy usage is not much of a problem; the physical fiber connection carries a tremendous amount of data, and there need not be much concern about interference between users, as each has a dedicated line that essentially runs straight to the provider’s backhaul.  For cable and DSL, heavy bandwidth usage by numerous users poses a large problem, as neither of these networks has the bandwidth capabilities of fiber, and in addition, users must share the last mile of the connection.  Unlike a fiber-to-the-home connection, in which a continuous connection at the rated speed can nearly be assured, connection speed is far more variable when more users share the same means of connection.  In 4G networks, this is a larger issue than in cable or DSL: in certain cases, upwards of 200 users may share a single channel, and if that channel only offers tens of megabits of bandwidth, then each user is going to get a sluggish, sub-optimal experience unless some fancy footwork takes place on the side of the 4G provider.  Thankfully, MIMO and beamforming come to the rescue once again, and going back to the lamp versus spotlights example, it is clear how one tower can provide numerous users with an extremely high-bandwidth experience that exceeds that of cable or DSL by reusing channels.  Because cable and DSL transmissions from multiple users must travel through the same line before reaching the network provider’s backhaul, the many signals can interfere with one another and at the very least prevent all users from receiving the full download speeds that the “neighborhood’s” pipe can offer.  4G can skirt around this problem through MIMO and beamforming techniques, putting it in line with the performance of fiber-to-the-home in these cases once it reaches full maturity.
And as with fiber-to-the-home, backhaul is not a problem for 4G towers, which are either connected to the internet’s backbone through Ethernet or through line-of-sight 4G data connections that occur at frequencies much higher than consumer implementations.

As has already been discussed to some extent, frequency and spectrum depth have a large effect on available bandwidth and signal range for any 4G implementation.  Moreover, frequency and spectrum depth create a complex trade space in which, depending on network needs, range can be increased at the cost of bandwidth, or bandwidth increased at the cost of range and signal penetration.  Essentially, there are two different types of 4G networks that can be created: lower frequency networks such as those being implemented by Clear and Verizon are very much like WiFi hotspots on steroids; they offer higher bandwidth for more users across a larger area.  Even better than WiFi hotspots, they allow users to seamlessly hop from one 4G tower to another as they move around, just as they already do with their cellular phones.  Thinking about the area covered by one of these lower frequency 4G towers as a giant WiFi hotspot can be useful in understanding the implications of the technology: just as some cities, such as Philadelphia, have been covered by municipal WiFi networks, one can imagine the entire United States being blanketed by a similar network of 4G towers.  Among these lower frequency 4G implementations are the kinds of WiMAX and LTE networks that have been discussed so far, and even within that lower frequency spectrum, differences exist.

For a 4G network in which range is the primary concern and many terrain obstacles exist, relatively low frequencies such as Verizon’s 700MHz spectrum should be considered due to their lower attenuation and ability to penetrate buildings and foliage.  When bandwidth and carrying capacity are a larger concern than sheer range but the network provider still wants to offer service somewhat akin to a WiFi hotspot, higher frequency networks like Clear’s 2.5GHz spectrum (or even higher) can help ameliorate some of the problems of low-frequency networks.  However, when bandwidth is the primary concern and the very flexible WiFi hotspot-like access of lower frequency 4G networks (those below 11GHz) is not necessary, ultra-high frequency 4G networks can be implemented.  These transmit at or above 60GHz, and because of the high frequencies employed can easily carry amounts of data commensurate with fiber-to-the-home connections.  Unfortunately, for these ultra-high frequency 4G networks to work, the transmitting and receiving devices must actually be able to see one another, as the signal will be absorbed by obstructions.  However, this does not make connections infeasible.  In fact, it can provide a strong alternative to a much lower frequency 4G network in a flat, unobstructed area in which line-of-sight can continue for miles on end.  If an ultra-high frequency tower is erected hundreds of feet above the ground, then users many miles away could still receive the signal and get fiber-like levels of service[8].  Additionally, by utilizing these line-of-sight connections, 4G towers in rural areas no longer need to be connected to Ethernet backhaul; instead they can all connect to one another in a mesh network in which only a few towers actually have an Ethernet backhaul connection and others stand alone, essentially as repeaters.  This would greatly decrease the cost of rural 4G rollouts by eliminating the need to dig trenches to install backhaul.

Future Implementations of 4G

The use of ultra-high frequency 4G in rural areas and as a backhaul opportunity is far less exciting than the possibility of placing 4G antennae on blimps or low-flying autonomous aircraft above cities, where people below would have access to the extremely high-speed network connections that 60GHz spectrum can offer.  Just as millions of people at a sporting event can see the Goodyear blimps circling above them, one can imagine such a blimp providing every one of those people with a data connection similar to what fiber-to-the-home customers enjoy today.  4G could be used for the backhaul of this blimp, beaming data down to a receiving station on the ground which routes it to the internet’s backbone.  If 5GHz of depth could be licensed at the 60GHz spectrum, then before applying beamforming and MIMO technologies and assuming near-optimal spectral efficiencies of 7 bits per second per Hertz, about 35 gigabits per second of bandwidth would become available.  It does not seem unreasonable that bandwidth levels far higher than this could be offered thanks to the optimization of smart antenna technology.  But while these sorts of solutions might seem like a pipe-dream now, the technology needed to implement them is hardly far off.
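
The blimp figure follows from the same depth-times-efficiency arithmetic used earlier:

```python
# Sanity check on the blimp scenario: 5GHz of licensed depth at an assumed
# near-optimal spectral efficiency of 7 bit/s/Hz, before MIMO/beamforming gains.
depth_hz = 5e9     # 5GHz of spectrum at 60GHz
bits_per_hz = 7    # near-Shannon efficiency (an optimistic assumption)
capacity_gbps = depth_hz * bits_per_hz / 1e9  # 35 Gbit/s shared by everyone below
```

Note that this 35 Gbit/s is the aggregate pipe; as with the tower examples earlier, beamforming would be needed to keep thousands of simultaneous users from dividing it into thin slices.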

4G:  From Technology to Policy

The possible capabilities and flexibility of all of these 4G systems far exceed those of current last mile solutions.  As spectral efficiencies improve thanks to improving standards, more bandwidth becomes licensed for 4G use through spectrum auctions, and marketing campaigns increase the number of users interested in 4G connections, 4G will be able to take a firm hold among United States broadband customers.  Right now 4G solutions are in their infancy.  Clearly, for users wanting to maximize bandwidth and minimize latency today, fiber-to-the-home is the best solution that any network provider has to offer.  Yet 4G has a number of advantages that set it apart, especially compared to DSL and cable.  When it comes to those two last mile solutions, 4G can already provide similar levels of bandwidth, although real-world latency can hover above 100ms[9], which is a little too high for certain real-time applications like gaming or video.  Yet for most users, this slight lag is a non-issue, easily trumped by the sense in which 4G can make an entire city or country feel like one giant speed-boosted WiFi hotspot.  As computing becomes increasingly mobile, with netbooks making up nearly one-fifth of laptop sales[10] and many people trading away their desktop computers, broadband internet mobility will become an ever-increasing priority.  Moreover, it is important to consider the merits of 4G based not just on where the technology is now, but also on the direction in which it is clearly headed.  As 4G begins to match and exceed the levels of service offered by DSL, cable, and fiber-to-the-home, competing not just on its own territory as a mobile technology but also on the turf where those technologies currently have it beat, it will become obvious that 4G is going to be most people’s go-to networking solution in the future.

Twenty years ago, few would have thought that many Americans would be ditching landline phones in 2010, and in the same vein it is likely that many Americans will soon be getting rid of static home internet connections in favor of 4G solutions.  But in order for that to happen, a number of policy concerns must be carefully considered.  Specifically, with a last mile solution that touts low cost as one of its primary features, should the United States consider a model in which internet access becomes a municipal good, much like roads and highways are now?  And in a world in which connectivity can so dramatically improve lives, is it even fair to allow some American citizens to grow up without access to broadband internet while others enjoy all-you-can-eat bandwidth?  The National Broadband Plan, recently undertaken by the FCC, has shown that policymakers are beginning to ask themselves some of these questions.  For now, these policymakers have set their sights on improving broadband penetration and the quality of data connections, among other initiatives.  Yet many of these policymakers are still considering the economics of DSL, cable, and fiber-to-the-home rather than the extreme cost-effectiveness of 4G, a technology that, if implemented, could revolutionize the way Washington sees the internet.

The Public Policy Case for 4G

The statistics say it all: only 60% of American homes have access to broadband internet.  This leaves America 20th in broadband penetration, behind countries such as South Korea (95% penetration), Israel (77% penetration), Canada (76% penetration), and even Estonia (62% penetration)[11].  For the country where the internet was invented, it is almost shameful to lag behind so many others that came to the game much later.  Many pundits speak about this “digital divide” as something that requires immediate remedying, and they could not be more correct.  As economies become increasingly dependent on broadband networks in order to operate efficiently, leaving a significant fraction of the population out in the cold is not much of an option.  With conventional last mile solutions, wiring rural and suburban homes is less cost-effective than wiring urban homes, forcing the digital lives of those suburban and rural customers to lag behind those of urbanites.  But times have changed; Washington needs to take a look at the way 4G could make connecting the digitally underserved feasible.

Considering that the cost of installing fiber-to-the-home averages about $500-600 per user[12], it is no wonder that rollouts are not going as quickly as some would like, especially in an economy recovering from recession.  Moreover, this installation cost does not include the additional expenditures needed to bring the fiber backend to a neighborhood or other smaller area.  Essentially, fiber-to-the-home does not involve directly connecting each residence to a single base station that covers an entire city; there exists a network of routers that become increasingly spread out and serve fewer and fewer users each, and these must be paid for in addition to the cost of just installing fiber for one more user.  Compare the cost of installing fiber-to-the-home to that of building a 4G tower, and it is immediately clear that the way to connect a lot of users cost-effectively is by building more 4G towers, not by wiring individual residences.  One 4G consulting company, Senza Fili, estimates that only about $30,000 is required to create a 4G base station that can be placed on top of a building or other elevated area (this base station could be added to an existing cellular tower, and because the range of 4G exceeds that of cellular communications, it is possible and probable that 4G providers will save tremendous sums of money by piggy-backing on existing wireless infrastructure).  Factoring in Senza Fili’s estimates for rent and power costs, that tower’s operating cost should run approximately $15,000 per year[13].  This means that the first year’s cost is about $45,000, and thus if that one tower can provide high quality service to just 100 customers, then its cost per user is below that of fiber-to-the-home.  Taking into account that each tower can serve not just 100 but thousands of users, it is clear that 4G is significantly cheaper than installing fiber to individual residences.
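
Using Senza Fili's figures as quoted above, the per-user arithmetic looks like this:

```python
# Senza Fili's per-tower estimates: $30,000 to build a base station and
# roughly $15,000 per year to rent and power it.
BUILD_COST = 30_000
ANNUAL_OPEX = 15_000

def first_year_cost_per_user(users: int) -> float:
    """First-year cost (build plus one year of operation) split across subscribers."""
    return (BUILD_COST + ANNUAL_OPEX) / users

modest_tower = first_year_cost_per_user(100)  # $450/user -- already under fiber's $500-600
busy_tower = first_year_cost_per_user(1_000)  # $45/user
```

Even the modest 100-subscriber tower undercuts fiber's installation cost, and unlike fiber, the tower's cost per user keeps falling as more subscribers join.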
Just doing some quick math, if all 130 million American households were wired with fiber-to-the-home, then even without any backhaul costs the total bill would be no less than 65 billion dollars.  If 4G towers were instead deployed to cover all 300 million Americans, with each tower serving approximately 1,000 users, the total cost without backhaul (which is cheaper than fiber backhaul because fewer separate entities must be connected) would be about 15 billion dollars.  Using Senza Fili’s estimate for yearly tower operating expenditures, running that 4G network would then cost only about five billion dollars per year once the towers were installed.  What is notable here is not only that 4G is more than four times cheaper than fiber-to-the-home (and while this is just a back-of-the-envelope estimate, its numbers are skewed such that 4G’s true cost is probably an even smaller fraction of fiber-to-the-home’s than this figure indicates), but that the 300 million Americans covered by this network would not be tied to a residence as fiber-to-the-home customers are.  4G thus has a much lower marginal cost per user while offering far more flexibility; it is a solution for massive broadband penetration that should be considered above all others if Washington is truly serious about ending the digital divide.
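The national-scale version of the same quick math can be sketched as follows. Again, every constant is one of the article's assumptions (household count, users per tower, Senza Fili's tower costs) rather than an independent estimate:

```python
# National-scale back-of-envelope comparison; all constants are the
# article's stated assumptions, not independent data.
HOUSEHOLDS = 130_000_000      # American households to wire with fiber
POPULATION = 300_000_000      # Americans to cover with 4G
FIBER_PER_HOME = 500          # low end of the $500-600 install estimate [12]
USERS_PER_TOWER = 1_000       # assumed customers served per 4G tower
TOWER_BUILD = 30_000          # base station cost, USD [13]
TOWER_OPEX = 15_000           # yearly rent and power per tower, USD [13]

fiber_total = HOUSEHOLDS * FIBER_PER_HOME                 # $65 billion
towers = POPULATION // USERS_PER_TOWER                    # 300,000 towers
four_g_first_year = towers * (TOWER_BUILD + TOWER_OPEX)   # ~$13.5 billion
four_g_yearly_opex = towers * TOWER_OPEX                  # ~$4.5 billion

print(f"Fiber build-out:  ${fiber_total / 1e9:.1f}B")
print(f"4G first year:    ${four_g_first_year / 1e9:.1f}B")
print(f"4G yearly opex:   ${four_g_yearly_opex / 1e9:.1f}B")
print(f"Fiber / 4G ratio: {fiber_total / four_g_first_year:.1f}x")
```

The first-year 4G total comes out near $13.5 billion, consistent with the article's "about 15 billion" once the ratio of fiber's $65 billion to 4G's cost lands comfortably above four to one.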

Given that 4G is not overwhelmingly expensive to implement, with its installation costing only a small fraction of the 425 billion inflation-adjusted dollars that the government spent on creating the national highway system more than a half-century ago[14], it seems reasonable to suggest that the government, and not private companies, might be a prime contender to create and operate a 4G network, passing the costs back onto Americans via their tax dollars.  Comparisons to the national highway system would be inevitable and likely quite valuable; there is no question that this vast network revolutionized transportation, making commuting and shipping, along with scenic drives, an efficient and convenient possibility.  A municipal 4G network would offer analogous benefits; Americans could access important commercial data easily and ubiquitously while also enjoying casual web-browsing and entertainment.  And perhaps premium services such as higher levels of bandwidth or subscription television channels could be offered on top of a basic free service (and possibly through a private provider), thus giving everybody access to the most valuable resource of the 4G network – the internet – while leaving options open for more commercial ventures. Yet it is also worth stating why the legacy of the national highway system is so important to the possibility of a municipal 4G network:  the former shows that the government is capable of and willing to connect the population at great cost in order to increase social welfare.  Moreover, it has shown itself willing to do so over the protests of the private sector; just as the United States put many turnpikes out of business with its interstate highways, so too might the government be willing to shutter the operations of communications providers in order to provide what most view as a necessary good.

Even if the government were not to step in to create a sweeping 4G network and instead left the task to the standard bevy of communications providers, the cost of connecting to the network ought to plummet as the cost of supplying that connection and bandwidth takes a commensurate drop.  Furthermore, the capabilities of 4G are directly in line with the tenets of the National Broadband Plan, which, while not stipulating who operates networks, does aim to help the United States reach a number of broadband goals in the coming years.  These include offering 100 million United States homes access to download speeds of 100 megabits per second and upload speeds of 50 megabits per second, as well as access to “the fastest most extensive wireless networks of any nation” and a wireless public safety network[15].  Not only can 4G achieve these goals, but it can do so alone, without requiring both cable and fiber-to-the-home networks alongside a separate wireless solution.  4G really is the complete package, and the features it promises can put the United States at the head of the class when it comes to broadband.

An unfortunate component of the National Broadband Plan is that it assumes private networks will be responsible for achieving the various goals it sets forth (“At least 100 million U.S. homes should have affordable access to actual download speeds of at least 100 megabits per second…”[15]).  If there are to be any out-of-pocket costs for Americans to use the internet, then inevitably some citizens will be priced out of a connection to the grid.  For a country touting the values of “equality of opportunity,” it seems a violation of founding principles to deny anybody, especially children of low-income families, the ability to access the internet at the speeds their higher-income peers enjoy.  Because the internet offers such boundless opportunity, holding some Americans back by denying them broadband connections (especially when those connections can be provided at such low cost) amounts to the United States shooting itself in the foot, squandering the innovation that budding American minds could produce.  Even if the benefits of connecting every American via 4G do not present themselves immediately, the gross domestic product that might be realized if every last American had a fast internet connection is too large an opportunity to miss, especially considering the affordability of a nationwide 4G deployment.  A municipal 4G solution would be a remarkable step not only toward bridging the digital divide but toward living up to the American promise of equality of opportunity.  It would present hardships for the private sector and likely invoke cries of socialism from more conservative policymakers, but such objections miss the sense in which complete broadband penetration produces an emergent phenomenon:  a digital environment that is greater than the sum of its users and that benefits as the number of users increases.  
Thus the recommendation that the United States swiftly adopt country-wide 4G networks to replace current last mile solutions rests not just on 4G’s raw technological capability, but on the economics and social value of such a network.  A mature, ubiquitous, and free 4G network can do more than satisfy all Americans’ growing need for a high-bandwidth, low-latency, and highly mobile network; it can level the digital playing field, finally giving everybody access to a resource that for too long has been treated more like a luxury good than a necessary component of daily life.


[1] Lechleider, J.W. 1991. High bit rate digital subscriber lines: a review of HDSL progress. IEEE Journal on Selected Areas in Communications 9, no. 6 (August): 769-784. doi:10.1109/49.93088.

[2] Yang, Catherine. 2004. Cable vs. Fiber. BusinessWeek: Online Magazine, November 1. http://www.businessweek.com/magazine/content/04_44/b3906044_mz011.htm.

[3] Syputa, Robert. Sizing up the Competitive Opportunities for Verizon (LTE) and Clearwire (WiMAX). http://www.maravedis-bwa.com/article-109.html.

[4] Spectrum Dashboard.  http://reboot.fcc.gov.

[5] Gray, Doug. Comparing Mobile WiMAX with HSPA+, LTE, and Meeting the Goals of IMT-Advanced. http://www.wimaxforum.org/files/wimax_lte/wimax_and_lte_feb2009.pdf.

[6] Baburajan, Rajani. 4G Wireless Evolution – Cost a Major Concern for Migration to LTE: Aricent. http://4g-wirelessevolution.tmcnet.com/topics/4g-wirelessevolution/articles/67698-cost-major-concern-migration-lte-aricent.htm.

[7] Anon. Wimax. http://hubpages.com/hub/Wimax-hub.

[8] Anon. What Is WiMax? — WIMAX. http://www.wimax.com/education.

[9] Based on field research testing Clear’s WiMAX service at 540 S. South Street, Philadelphia, PA 19147

[10] Anon. Mini-Note (Netbook) Shipments Grow 103% Y/Y in 2009; Revenues Up 72% – DisplaySearch. http://www.displaysearch.com/cps/rde/xchg/displaysearch/hs.xsl/091222_mini_note_netbook_shipments_grow_103_y_y_in_2009_revenues_up.asp.

[11] Anderson, Nate. US 20th in broadband penetration, trails S. Korea, Estonia. http://arstechnica.com/tech-policy/news/2009/06/us-20th-in-broadband-penetration-trails-s-korea-estonia.ars.

[12] Fernandez, Bob. Verizon to double early-termination fees. Philadelphia Inquirer, January 6, 2010. http://www.philly.com/philly/business/homepage/20100106_Verizon_to_double_early-termination_fees.html.

[13] Paolini, Monica.  Compact Base Stations: a new step in the evolution of base station design. http://img.en25.com/Web/WiMaxBroadbandSolutions/SenzaFili_CompactBTS.pdf.

[14] Neuharth, Al. Traveling interstates is our sixth freedom. USA Today, Opinion. http://blogs.usatoday.com/oped/2006/06/traveling_inter.html.

[15] National Broadband Plan.  http://www.broadband.gov.

By Aaron

I'm a junior at the University of Pennsylvania studying cognitive science, and I'm the proud founder of Arteculate.com. In addition to my tech addiction, I enjoy biking, photography, vacationing in tropical locales, and spending time with friends.
