The True Speed of Cellular IoT

Today there are over 20 different LTE categories, ranging from the initial LTE specifications released in 2008 to newer categories classified as LTE Advanced, which supports data rates of over 300 Mbps, LTE Advanced Pro, with speeds of over 1 Gbps, and LTE for Machines, designed for battery-powered IoT devices.

When talking about cellular IoT, people often refer to LTE Cat-M1 (LTE-M), LTE Cat-NB1 (NB-IoT) and LTE Cat 1. These technologies support long-range, low-power and low-cost data transfer, ideal for machine-type communication.

Cellular IoT – in combination with Monogoto’s global connectivity platform – is used to build applications that often comprise tens of thousands of devices. Examples include warehouse management, smart street lighting, asset tracking and micromobility.

 

In this article we share the real-world performance metrics of LTE Cat-M1 and LTE Cat 1, allowing you to make a better decision on what technology to use in your application.

Let’s put cellular IoT to the test.

Comparing 2 LTE Categories

LTE Cat-M1 is an LTE standard released in 2016, designed as a low-power, low-cost wireless technology ideal for battery-powered IoT devices. The tradeoff for this power efficiency is limited speed and latency. Most modules support up to 375 kbps uplink (UL) and 300 kbps downlink (DL); some vendors have pushed their modules to 1 Mbps UL and 500 kbps DL within the category's 1.4 MHz bandwidth.

The low-power capabilities of this category make LTE Cat-M1 ideal for IoT devices which need to run on batteries for multiple years.

LTE Cat 1 is one of the oldest LTE categories, released in 2008 and supporting 5 Mbps uplink (UL) and 10 Mbps downlink (DL) using 5 MHz of bandwidth. This technology offers better speed and latency, though it requires more power than LTE Cat-M1. Compared to LTE Cat 4, the modules are significantly cheaper and consume less power, making LTE Cat 1 more suitable for IoT applications.

As this LTE standard has been around for 14 years, it is one of the most widely available LTE categories worldwide, making it ideal for IoT devices that require global connectivity.

3GPP Release 14, published in 2017, introduced a variation on LTE Cat 1 called LTE Cat 1 Bis. Instead of two integrated antennas, only one antenna is used, which reduces the bill of materials (BOM) of the module without compromising on data speeds. It does, however, prevent the device from sending and receiving data at the same time.

|  | LTE Cat-M1 | LTE Cat 1 | LTE Cat 1 Bis |
| --- | --- | --- | --- |
| 3GPP release | Release 13 (2016) | Release 8 (2008) | Release 14 (2017) |
| Data rate | 375 kbps UL / 300 kbps DL (up to 1 Mbps UL / 500 kbps DL) | 5 Mbps UL / 10 Mbps DL | 5 Mbps UL / 10 Mbps DL |
| Popular SIMCom modems | SIM7000E, SIM7070G | A7672E | A7676E |

Real-world Performance

Method for running speed tests

To determine the true speed of the different cellular categories, the SIMCom A7672E (LTE Cat 1) and SIM7000E (LTE Cat-M1) modules were connected to a Raspberry Pi over USB. All connectivity interfaces other than the cellular one were disabled, and several speed tests were conducted using speedtest.net. The results are shown below.
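The exact test script is not included in this article; the sketch below shows one way such a series of measurements could be automated from the Raspberry Pi, assuming the speedtest-cli Python package is installed (an assumption for illustration – the tests may equally have been run through the speedtest.net web interface).

```python
# Minimal sketch: repeated speed tests over the cellular interface.
# Assumes the speedtest-cli package (pip install speedtest-cli) and that the
# modem's network interface is the only active one, as described above.
import speedtest

RUNS = 5
results = []

for i in range(1, RUNS + 1):
    st = speedtest.Speedtest()
    st.get_best_server()              # pick the closest speedtest.net server
    down = st.download() / 1e6        # bits/s -> Mbps
    up = st.upload() / 1e6
    ping = st.results.ping            # round-trip time in ms
    results.append((down, up, ping))
    print(f"Test {i}: {down:.2f} Mbps down, {up:.2f} Mbps up, {ping:.0f} ms")

avg_down, avg_up, avg_ping = (sum(col) / RUNS for col in zip(*results))
print(f"Average: {avg_down:.2f} Mbps down, {avg_up:.2f} Mbps up, {avg_ping:.0f} ms")
```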

LTE Cat-M1 (SIM7000E)

Results:

| Test | Downlink | Uplink | Latency |
| --- | --- | --- | --- |
| 1 | 0.19 Mbps | 0.37 Mbps | 98 ms |
| 2 | 0.25 Mbps | 0.33 Mbps | 100 ms |
| 3 | 0.26 Mbps | 0.32 Mbps | 147 ms |
| 4 | 0.26 Mbps | 0.33 Mbps | 103 ms |
| 5 | 0.25 Mbps | 0.34 Mbps | 137 ms |
| Average | 0.24 Mbps | 0.34 Mbps | 117 ms |

 

LTE Cat 1 (A7672E)

Results:

| Test | Downlink | Uplink | Latency |
| --- | --- | --- | --- |
| 1 | 8.92 Mbps | 4.57 Mbps | 73 ms |
| 2 | 8.71 Mbps | 4.59 Mbps | 67 ms |
| 3 | 9.02 Mbps | 4.73 Mbps | 68 ms |
| 4 | 8.02 Mbps | 4.67 Mbps | 66 ms |
| 5 | 8.90 Mbps | 4.64 Mbps | 72 ms |
| Average | 8.71 Mbps | 4.64 Mbps | 69 ms |

 

Explaining the differences between theoretical & practical speeds

The speed test results match the data rates defined in the data sheets quite well: the uplink speeds are almost identical to the specifications, while the downlink speeds are somewhat slower.
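As a rough sanity check, the measured averages can be set against the rated speeds from the comparison table above. The short sketch below uses only figures already quoted in this article and computes what fraction of the data-sheet rate each module achieved.

```python
# Measured averages vs. the rated speeds from the comparison table.
# Values: (avg DL Mbps, avg UL Mbps, data-sheet DL Mbps, data-sheet UL Mbps)
measurements = {
    "LTE Cat-M1 (SIM7000E)": (0.24, 0.34, 0.300, 0.375),
    "LTE Cat 1 (A7672E)":    (8.71, 4.64, 10.0, 5.0),
}

for name, (dl, ul, spec_dl, spec_ul) in measurements.items():
    print(f"{name}: downlink {dl / spec_dl:.0%} of spec, "
          f"uplink {ul / spec_ul:.0%} of spec")
```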

The tests also show a clear difference in latency between the two technologies: the round-trip time of a data packet is almost 50 ms lower with LTE Cat 1 than with LTE Cat-M1.

These good results indicate good network quality. With poor network quality, the results would look very different due to a mechanism referred to as adaptive modulation.

Adaptive Modulation

Poor network reception can result from several factors. As the distance between the sender and receiver increases, the radio signal becomes weaker, making it harder for the receiver to process. Devices close to the cell tower can still suffer from poor network quality due to obstruction by buildings or other objects, or due to interference, also referred to as noise: when too many radio signals are sent, the radio waves start interfering with one another, making it harder for the receiver to filter out the right message.

When sending data wirelessly, bits are translated into radio waves. Each radio wave (referred to as a symbol) can encode a specific amount of information. With good network quality, more bits are encoded in each symbol compared to scenarios with poor network quality. This is because poor network quality makes it difficult for the receiver to decode radio signals.

In practice, this means that cellular devices automatically adjust their data speed. LTE Cat-M1 can send either 2 or 4 bits of data per symbol. LTE Cat 1 can encode 2, 4 or 6 bits per symbol. With 2 bits you can represent 4 distinct values (00, 01, 10, 11), with 4 bits 16 values, and with 6 bits 64 values.
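The number of distinct values a symbol can represent is simply 2 raised to the number of bits it carries, as the short sketch below illustrates. The scheme names QPSK, 16QAM and 64QAM are the standard LTE terms for 2, 4 and 6 bits per symbol and are added here for context; the mapping to categories follows the description above.

```python
# Each modulation scheme encodes a fixed number of bits per symbol; the number
# of distinguishable symbol values is 2 ** bits. Higher-order schemes carry
# more data but need a cleaner signal to be decoded reliably.
modulations = {
    "QPSK":  2,  # used by both LTE Cat-M1 and LTE Cat 1
    "16QAM": 4,  # used by both
    "64QAM": 6,  # LTE Cat 1 only
}

for name, bits in modulations.items():
    print(f"{name}: {bits} bits/symbol -> {2 ** bits} distinct symbol values")
```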

This mechanism allows cellular devices to automatically get the best possible performance out of the available network quality.

Getting started with cellular IoT

To learn more about cellular IoT, get started with Monogoto by requesting a free sample kit containing a SIM that supports LTE Cat 1 and LTE Cat-M1, as well as many other cellular standards.

Request free sample kit
