Around 1980, a Motorola executive predicted during a technology symposium at the Fermi National Accelerator Laboratory (Fermilab) that many people would someday carry a pocket-sized phone with access to a cellular radio system, usable over a wide area without a physical connection to a network. That prediction proved prophetic. In fact, it is estimated that today 270 million people in the United States alone carry a hand-held computer (a smartphone), and there are 6.4 billion connected devices worldwide. This explosion in personal computing has both driven and benefited from the expansion of the data rates supported by cellular communication technology. Today it is possible to bridge between Ethernet networks using cellular devices at Mbit/s speeds. This lesson provides an overview of the advances that have occurred in cellular networks and the equipment needed when interfacing to an Ethernet network.
Wireless communication can be grouped into several categories, listed here in order of increasing range.
Personal Area networks are intended for short range communication between devices typically controlled by one person, such as wireless headphones or wireless heart rate and blood glucose sensors communicating with a wearable monitor. This technology includes Bluetooth.
Wireless Sensor Networks are groups of low-power, low-cost devices that interconnect wirelessly to collect, exchange, and act on data gathered from their physical environment. They are mostly line-of-sight networks. Some of these technologies include standards such as EnOcean and ZigBee. Gateways are being developed to connect these networks to wider area networks; that will be the subject of a future lesson.
For wider area communication, Wireless Local Area Networks are used. These networks are often known by their commercial product name Wi-Fi. Lesson 503 discussed connecting Wi-Fi to an Ethernet/IP backbone using an access point.
Narrowband IoT (NB-IoT, designated LTE Cat NB1) is a Low Power Wide Area Network (LPWAN) radio technology standard developed to enable a wide range of devices and services to be connected using cellular telecommunications bands. NB-IoT is a narrowband radio technology designed for the Internet of Things (IoT) and is one of a range of Mobile IoT (MIoT) technologies (along with the related LTE-M, designated LTE Cat M1) standardized by the 3rd Generation Partnership Project (3GPP), the standards organization that develops protocols for mobile telephony.
Consumer Cellular Networks are designed for citywide/national/global coverage areas and seamless mobility from one access point to another. These networks and their ability to be utilized in conjunction with Ethernet are the subject of this lesson.
The evolution of cellular communication networks is commonly known as 1G, 2G, 3G, 4G, and 5G. The "G" refers to the cellular generation. As you might expect, each generation is faster and includes new features. We are currently in the fourth generation (4G), with 5G being rolled out.
Since the introduction of the first cellular telecommunication networks, many different standards or systems have been used, all with their own abbreviations or icons. Different cellular providers and various nations have championed different specifications, such as frequencies, channel spacing, and access technology. Thus, the phone you carry may not work with all provider networks or in all locations around the globe. The acronyms abound, but understanding the difference between the encoding schemes called TDMA, CDMA, and OFDM is fundamental to your understanding.
First generation (1G) mobile phones used Frequency Division Multiple Access (FDMA) to allocate calls: each conversation was given its own signal frequency or "channel". In North America that system was called Advanced Mobile Phone Service (AMPS). It was inefficient in that it required a lot of bandwidth to support multiple users. The Global System for Mobile communication (GSM), Time Division Multiple Access (TDMA), and Code Division Multiple Access (CDMA) technologies were introduced when 2G was rolled out. They are basic technologies in mobile phones, but they represent a gap that many phones cannot cross; the CDMA/GSM/TDMA difference is a legitimate technical barrier to moving phones between carriers. These technologies allow more phone calls or internet connections to share one radio channel. TDMA is based on a "time division" system: calls take turns, so digital data (your voice is digitized) is given a channel and a time slot. Three calls thus look like "123123123". On the other end, the receiver listens only to its assigned time slot and pieces the data back together again. GSM was originally a European technology that utilized TDMA and has expanded over time into a system used on 3G networks and beyond.
CDMA is a "code division" system: every call's data is encoded with a unique key, and all the calls are transmitted at once. With calls 1, 2, and 3 sharing a channel, the combined signal would just look like "666666666". Each receiver has a unique key to "divide" the combined signal back into its individual call. Code division turned out to be a more powerful and flexible technology, but GSM had evolved faster. Enhancements to these technologies under various names have sped up 3G networks to a theoretical speed of 30 Mbit/s.
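The "123123123" time-slot picture and the code-division idea can both be sketched in a few lines of Python. This is a toy illustration only; the function names are hypothetical, and real cellular encoding is far more involved.

```python
# Toy sketch of TDMA time-slot interleaving and CDMA code division.

def tdma_interleave(calls):
    """Interleave samples from several calls into repeating time slots."""
    return [call[i] for i in range(len(calls[0])) for call in calls]

def tdma_extract(stream, slot, n_calls):
    """A receiver listens only to its assigned time slot."""
    return stream[slot::n_calls]

# TDMA: three calls share one channel as "123123123".
calls = [["1a", "1b", "1c"], ["2a", "2b", "2c"], ["3a", "3b", "3c"]]
channel = tdma_interleave(calls)   # ['1a','2a','3a','1b','2b','3b',...]
assert tdma_extract(channel, 1, 3) == calls[1]

# CDMA: orthogonal spreading codes let calls transmit simultaneously.
# Walsh codes of length 2 (orthogonal: their dot product is zero).
codes = [[1, 1], [1, -1]]

def cdma_spread(bits, code):
    """Spread each data bit (+1/-1) by the call's unique code."""
    return [b * c for b in bits for c in code]

def cdma_despread(signal, code):
    """Correlate the combined signal against one code to recover one call."""
    n = len(code)
    chunks = [signal[i:i + n] for i in range(0, len(signal), n)]
    return [1 if sum(x * c for x, c in zip(chunk, code)) > 0 else -1
            for chunk in chunks]

bits_a, bits_b = [1, -1, 1], [-1, -1, 1]
combined = [x + y for x, y in zip(cdma_spread(bits_a, codes[0]),
                                  cdma_spread(bits_b, codes[1]))]
assert cdma_despread(combined, codes[0]) == bits_a
assert cdma_despread(combined, codes[1]) == bits_b
```

Note how the two calls in the CDMA example occupy the channel at the same time, yet each receiver recovers only its own bits by correlating against its assigned code.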
LTE (Long Term Evolution) was introduced with 4G networks and uses another multiplexing scheme called OFDM (Orthogonal Frequency Division Multiplexing) to meet the targets of higher data rates, reduced latency, and improved spectrum efficiency. OFDM is a complex modulation method, on which entire books have been written. Basically, the encoding schemes mentioned above are called single-carrier modulation (SCM). With OFDM the data stream is split into a number of parallel data streams each of which is modulated onto a separate carrier. This scheme helps address the situation where the same signal reaches a receiver (your cellphone) via multiple paths, often as a result of reflections from buildings, hills, etc. The effect on SCM is signal fading, but by using multiple carriers OFDM helps overcome this problem.
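The core OFDM idea, splitting one data stream across many parallel subcarriers, can be sketched with an inverse DFT at the transmitter and a forward DFT at the receiver. This is a minimal sketch under simplifying assumptions: real OFDM adds cyclic prefixes, pilot tones, and channel equalization, none of which appear here.

```python
# Minimal OFDM round trip: symbols -> subcarriers (IDFT) -> time-domain
# signal -> receiver (DFT) -> recovered symbols.
import cmath

def idft(symbols):
    """Inverse DFT: map one symbol per subcarrier onto a time-domain signal."""
    n = len(symbols)
    return [sum(s * cmath.exp(2j * cmath.pi * k * t / n)
                for k, s in enumerate(symbols)) / n
            for t in range(n)]

def dft(signal):
    """Forward DFT: recover the per-subcarrier symbols at the receiver."""
    n = len(signal)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(signal))
            for k in range(n)]

# Four QPSK-style symbols, one per subcarrier.
tx_symbols = [1 + 0j, 0 + 1j, -1 + 0j, 0 - 1j]
time_signal = idft(tx_symbols)   # the transmitted OFDM symbol
rx_symbols = dft(time_signal)    # received and demodulated

assert all(abs(a - b) < 1e-9 for a, b in zip(tx_symbols, rx_symbols))
```

Because each subcarrier carries data at a much lower rate than a single wide carrier would, a reflected copy of the signal arriving slightly late corrupts far less of each symbol, which is how OFDM mitigates the multipath fading described above.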
The data speed of the various generations is a key parameter. Table 1 shows an approximation of the various download speeds. The numbers are not exact, as they vary widely in real-life situations depending on your location, whether you are indoors or outdoors, the distance to nearby towers, and the amount of network congestion. All those factors impact the actual speed you can attain. What is shown is a progression in speed as the technology has advanced. Some of the generations have intermediate versions that are represented by fractional numbers, such as 2.5G.
| Generation | Icon | Technology | Max Download Speed | Typical Download Speed |
|---|---|---|---|---|
| 1G | 1G | AMPS (Analog) | 2.4 Kbit/s | Not usable |
| 2G | Basic | TDMA, CDMA | 10 Kbit/s | <10 Kbit/s |
| 2G | 2.5G | TDMA, CDMA | 200 Kbit/s | 20 Kbit/s |
| 2G | 2.75G | GSM, CDMA | 473 Kbit/s | 100 Kbit/s |
| 3G | 3G | 3G (Basic) | 384 Kbit/s | 100 Kbit/s |
| 3G | 3.5G | HSDPA/HSUPA | 14 Mbit/s | 5 Mbit/s |
| 3G | 3.75G | 1xEV-DO | 30 Mbit/s | 10 Mbit/s |
| 4G | 4G | HSPA+ | 21 to 84 Mbit/s | 5 to 22 Mbit/s |
| 4G | 4G | WiMAX | 37 to 365 Mbit/s | 17 to 100 Mbit/s |
| 4G+ | 4G+ | LTE | 100 to 300 Mbit/s | 50 to 100 Mbit/s |
| 5G | 5G | 5G-NR (New Radio) | 1 to 10 Gbit/s | 150 to 200 Mbit/s |

Table 1: Download Speeds
Note that the table provides two different download speeds. The first is a theoretical "maximum download speed" based on the limits of the technology, assuming perfect coverage and no congestion on the cell tower. The second is a typical download speed, more representative of what users can expect on a day-to-day basis. Fortunately, numerous tools are available that allow you to measure the download speed of your connection. Also, while not generally mentioned, upload speeds are typically slower, often one-half or less of the corresponding download speed. Cellphone networks are designed for streaming, where large files (such as a movie) can be downloaded quickly, whereas the photo or file you wish to upload is treated as less important.
All modern smartphones support 4G technology, but they often differ in the maximum download speed supported or the maximum "category" of LTE they support. As of this writing, the latest iPhone and Android devices support up to Category 16 LTE-Advanced, which can approach speeds of 1 Gbit/s, but not all phones or all countries support this data rate yet.
5G NR (New Radio) is a new radio access technology developed by 3GPP for the 5G (fifth generation) mobile network. It was designed to be the global standard for the air interface of 5G networks.
As the Motorola executive in the introduction understood, cellphones share attributes with radios. Cellular performance is impacted by both radio-frequency communication and network performance parameters. While we do not need to become cellular engineers, we need to understand a little of both topics in order to understand cellular performance.
The radio spectrum includes frequencies between 3 kilohertz (kHz) and 300 gigahertz (GHz). Cellular networks utilize discrete frequencies in the range of 380 MHz (1G) to 4.7 GHz (5G). The quality of a radio-frequency communication link is a function of six parameters.
Network performance refers to the measure of service quality of a network as seen by the customer. It applies to any network medium, such as copper wire, fiber optic, or cellular. There are many different ways to measure the performance of a network, but measures such as data rate and latency, discussed below, are often considered the most important for cellular networks.
It has taken years for 4G networks to spread around the world and there are still rural areas relying on 3G networks, but in developed regions 4G has become the norm. We can expect 5G networks to take a while to reach everyone, but it is worthwhile to compare the two.
Using our multi-lane highway analogy again, think of a cellular network as a highway with fleets of tractor trailers in each lane moving data in all directions. To move this data faster, the tractor trailers need to speed up or we have to add more lanes to our "highway". Our speed is already limited (we can't exceed the speed of light), so more lanes or "available bandwidth" is needed. By increasing signal frequency into the gigahertz (GHz) range, 5G will create "superhighways" with thousands of wider lanes (up to 100 MHz) where more devices can join the network, throughput can be increased, and lag time will be reduced.
As we have seen, the topic is complicated by the variety of different technologies used in each generation, by geographical differences in coverage, and by the fact that the technology continues to evolve. As shown in Table 1, the latest iteration of 4G offers speeds approaching the lower end of those promised by 5G. For data communication, it is necessary to put that speed into context. 1 Gbit/s is 1000 Mbit/s, but data communication speeds are often listed in bytes, where 1 Byte = 8 bits. So, 1 Gbit/s translates to 125 MByte/s. The distinction is commonly indicated by the case of the letter "b": lower-case "b" refers to bits and upper-case "B" refers to bytes. What you can realistically expect from 4G is a data rate between 10 Mbit/s and 50 Mbit/s (1.25 MB/s and 6.25 MB/s). The goal of 5G is to hit 50 Mbit/s as the average minimum.
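The bit/byte arithmetic above reduces to a single division by 8, shown here as a small helper to keep the conversion explicit:

```python
# Convert data rates: lower-case "b" is bits, upper-case "B" is bytes (8 bits).

def mbit_to_mbyte(mbit_per_s):
    """Convert a data rate from Mbit/s to MByte/s."""
    return mbit_per_s / 8

assert mbit_to_mbyte(1000) == 125.0   # 1 Gbit/s = 125 MByte/s
assert mbit_to_mbyte(10) == 1.25      # low end of realistic 4G
assert mbit_to_mbyte(50) == 6.25      # high end of realistic 4G
```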
With 4G networks, the average latency is between 20 and 30 milliseconds with a worst case around 50 milliseconds. For context, it takes at least 10 milliseconds for an image seen by the human eye to be processed by the brain. Low latency is vital for real-time applications such as machine control or self-driving cars. The average latency expected from 5G is in the range of 1 to 10 milliseconds and may be one of the greatest benefits of this technology.
4G LTE technology operates in frequency bands from 600 MHz to 2500 MHz. 5G will utilize what is sometimes called the millimeter wavelength spectrum; the frequency range can be as high as 40 GHz (40000 MHz). 5G will be rolled out in three frequency bands called Sub-1 GHz, 1 to 6 GHz, and above 6 GHz. Remember our discussion above concerning radio signal dissipation: free-space path loss grows with the square of the frequency in use, so as frequency rises, the distance between antennas must shrink to keep the same signal quality. Highest-band 5G is going to need an antenna roughly every 800 feet to provide coverage similar to 4G.
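The trade-off between frequency and range can be quantified with the standard free-space path loss formula, FSPL = (4πdf/c)². Doubling the frequency adds about 6 dB of loss at a given distance, so holding the loss constant, the usable distance scales as 1/f. The distances below are illustrative only; real deployments also contend with antenna gain, rain fade, and building penetration.

```python
# Free-space path loss and its effect on usable link distance.
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Doubling the frequency adds ~6 dB of loss at the same distance.
delta = fspl_db(1000, 4e9) - fspl_db(1000, 2e9)
assert abs(delta - 20 * math.log10(2)) < 1e-9

# To keep the same loss, a 28 GHz link must be 14x shorter than a 2 GHz link.
d_2ghz = 1000.0
d_28ghz = d_2ghz * 2e9 / 28e9
assert abs(fspl_db(d_28ghz, 28e9) - fspl_db(d_2ghz, 2e9)) < 1e-9
```

This is why high-band 5G needs such a dense grid of small cells compared with the sparser tower spacing that suffices for sub-GHz 4G bands.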
To gain cellular access to Ethernet, a cellular-to-Ethernet modem or router is required. Although any comparable product could serve, we will use as our physical device example the Contemporary Controls EIGR-C Gigabit LTE Cellular Router. A router is a networking device that forwards data packets between computer networks; routers perform the traffic-directing functions on the Internet. Data sent through the Internet travels in packets, and a packet is typically forwarded from router to router through the networks that constitute an internetwork until it reaches its destination node. The EIGR-C Cellular Router performs this function when no wired Internet connection is available. It links a 4G cellular network to a 10/100/1000 Mbit/s Internet Protocol (IPv4) Ethernet network. The Ethernet side is the local-area-network (LAN); the cellular side is the wide-area-network (WAN). The router features a 4-port Ethernet LAN switch and a built-in cellular modem with a real-time clock and VPN functionality. All of the security and functionality of the Internet is available over the cellular connection.
Users can obtain a SIM card from Contemporary Controls or they can establish service and obtain a card from their cellular provider. Antenna connections on the unit allow for the use of a remote antenna when the router is located in an area where cellular coverage is spotty (such as inside a control cabinet).
The advances in cellular networks with regard to data rates and coverage make it feasible to connect isolated Ethernet networks to the wider world of the Internet and the cloud using cellular communication. While not yet ready for real-time applications, the promise of the future is trending in that direction.