Bit error rate

Another important characteristic of data transmission is the bit error rate (BER). The BER is the number of bit errors divided by the total number of bits transferred over a communication channel. It is a unitless measurement expressed as a ratio or percentage. For example:

If an original transmission sequence is: 1 0 1 0 1 1 0 1 0 0

And the received sequence is: 0 0 1 0 1 0 1 0 1 0 (five of the bits differ)

Then the BER is 5 errors / 10 bits transferred = 50%
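The same calculation is easy to express in code. The following minimal Python sketch uses the two sequences from the example above, counts the mismatched positions, and divides by the total number of bits:

```python
# Compare a transmitted and a received bit sequence and compute the BER.
transmitted = [1, 0, 1, 0, 1, 1, 0, 1, 0, 0]
received    = [0, 0, 1, 0, 1, 0, 1, 0, 1, 0]

# Count the positions where the received bit differs from the transmitted bit.
errors = sum(tx != rx for tx, rx in zip(transmitted, received))

ber = errors / len(transmitted)
print(f"{errors} errors / {len(transmitted)} bits = {ber:.0%} BER")  # 5 errors / 10 bits = 50% BER
```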

BER is affected by channel noise, interference, multipath fading, and attenuation. Techniques to improve BER include increasing transmission power, improving receiver sensitivity, using less dense (lower-order) modulation, or adding redundant data. The last technique is typically referred to as forward error correction (FEC). FEC simply adds extra information to a transmission. In the most basic case, one could transmit every bit three times and let the receiver take a majority vote; however, this cuts the effective throughput to one-third. Modern FEC techniques include Hamming codes and Reed-Solomon error-correcting codes. The BER can be expressed as a function of Eb/N0, the energy per bit relative to the noise spectral density (a normalized form of SNR).
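As an illustration of the most basic FEC scheme described above, the following Python sketch repeats every bit three times on the encode side and takes a majority vote on the decode side; a single corrupted copy of any bit is corrected, at the cost of transmitting three times as many bits:

```python
def encode_triple(bits):
    """Triple-redundancy FEC: repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_majority(coded):
    """Decode by majority vote over each group of three received bits."""
    decoded = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        decoded.append(1 if sum(group) >= 2 else 0)
    return decoded

data = [1, 0, 1, 1]
coded = encode_triple(data)            # 12 bits on the wire for 4 bits of data
coded[4] ^= 1                          # flip one bit to simulate a channel error
print(decode_majority(coded) == data)  # True: the single error is corrected
```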

The following diagram shows a variety of modulation techniques and their respective BERs for various SNRs: 

Bit error rate (Pb) versus power efficiency (Eb/N0) for various modulation schemes. As the SNR increases towards the right, the BER decreases.

What should be understood at this point is the following:

  • We can now calculate the minimum SNR needed to achieve a certain data rate for a system (a worked example follows this list)
  • The only ways to add more capacity or bandwidth to a wireless service are to:
    • Add more spectrum and channel capacity, which improves the bandwidth linearly
    • Add more antennas (MIMO), which improves the bandwidth linearly
    • Improve the SNR with advanced antennas and receivers, which only improves the equation logarithmically
  • The Shannon limit is the ultimate bound on error-free digital transmission; a channel can be pushed beyond the limit, but data integrity will be lost
  • Several factors contribute to noise, including interference, multipath fading, and attenuation
  • One cannot simply increase modulation levels without incurring costs to error rates and complexity
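To make the first bullet concrete, the Shannon-Hartley capacity formula, C = B * log2(1 + SNR), can be inverted to find the minimum SNR required for a target data rate over a given channel bandwidth. The short Python sketch below does this for an assumed 20 MHz channel and an assumed 100 Mbps target; both numbers are illustrative only:

```python
import math

def min_snr_db(data_rate_bps, bandwidth_hz):
    """Invert the Shannon-Hartley formula C = B * log2(1 + SNR)
    to find the minimum SNR (in dB) needed for a target data rate."""
    snr_linear = 2 ** (data_rate_bps / bandwidth_hz) - 1
    return 10 * math.log10(snr_linear)

# Illustrative values: a 20 MHz channel and a 100 Mbps target rate.
print(f"{min_snr_db(100e6, 20e6):.1f} dB")   # roughly 14.9 dB
```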

4G-LTE cellular signals, which are covered later in this book, operate in the 700 MHz to 5 GHz spectrum with dozens of segregated bands within that range. A cell phone (or battery-based IoT device) has substantially less power than a cell tower, but it is often the case that an IoT device will be transmitting sensor data to the cloud; the uplink from the IoT device is what we examine here. The uplink power is limited to a maximum of 200 mW, which is 23 dBm. This limits the overall range of the transmission; that limit, however, is dynamic and will vary based on the bandwidth of the channel and the data rate. 4G systems, like several WPAN and WLAN devices, use orthogonal frequency-division multiplexing (OFDM). Each channel has many sub-carriers to address multipath fading problems. Summing the data transmitted over all the sub-carriers yields high overall data rates.
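The 200 mW / 23 dBm equivalence mentioned above comes from the standard decibel-milliwatt conversion, dBm = 10 * log10(P / 1 mW). A quick Python check:

```python
import math

def mw_to_dbm(power_mw):
    """Convert a power level in milliwatts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(power_mw)

def dbm_to_mw(power_dbm):
    """Convert dBm back to milliwatts."""
    return 10 ** (power_dbm / 10)

print(f"{mw_to_dbm(200):.1f} dBm")   # 23.0 dBm
print(f"{dbm_to_mw(23):.0f} mW")     # ~200 mW
```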

4G-LTE generally uses 20 MHz channels and LTE-A can use 100 MHz channels. These wide channels are limited by the overall available spectrum and are in contention between multiple carriers (AT&T, Verizon, and so on) and other technologies sharing the spectrum. An additional complexity of cellular communication is that a carrier's portions of the spectrum may be divided up and disjoint from each other.

Cat-3 LTE can use 5, 10, or 20 MHz channels. The smallest channel granularity is 1.4 MHz. LTE-A is allowed to aggregate up to five 20 MHz channels for an aggregate bandwidth of 100 MHz.

A method of measuring the range over which a wireless device remains functional is the maximum coupling loss (MCL). The MCL is the maximum total channel loss between the transmitting and receiving antennas at which data service can still be delivered. MCL is a very common way to measure the coverage of a system. It includes antenna gains, path loss, shadowing, and other radio effects. Generally, a 4G-LTE system will have an MCL of about 142 dB. We will revisit MCL when examining cellular IoT technologies such as Cat-M1.
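In its simplest form, MCL can be thought of as a link budget: the maximum transmit power minus the weakest signal the receiver can still decode, with antenna gains folded in. The following Python sketch computes it for the 23 dBm uplink limit mentioned earlier and a hypothetical receiver sensitivity of -119 dBm (the sensitivity value is an assumption chosen only to reproduce the roughly 142 dB figure cited above):

```python
def max_coupling_loss(tx_power_dbm, rx_sensitivity_dbm, antenna_gains_db=0.0):
    """Simplified link budget: the largest total channel loss (in dB) the link
    can tolerate while the received signal stays above the receiver's
    sensitivity threshold."""
    return tx_power_dbm + antenna_gains_db - rx_sensitivity_dbm

# 23 dBm uplink limit and a hypothetical -119 dBm receiver sensitivity.
print(f"MCL = {max_coupling_loss(23, -119):.0f} dB")   # MCL = 142 dB
```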

What we should grasp at this point is that if we increase the listening time per bit, the noise level relative to the signal goes down. If we halve the bit rate, the bit duration doubles (Bit_Duration = 1 / Bit_Rate). The energy collected per bit then increases by 2x, while the accumulated noise energy increases only by sqrt(2). For example, if we reduce the bit rate from 1 Mbps to 100 kbps, the bit duration increases by 10x and the range improves by sqrt(10) = 3.162x.
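The following Python sketch captures the relationship stated above: the range gain is taken to be the square root of the bit-rate reduction factor (an assumption that matches the example here for free-space-like propagation; real channels will differ):

```python
import math

def range_improvement(old_bit_rate, new_bit_rate):
    """Estimate the range gain from lowering the bit rate, assuming the gain
    scales with the square root of the bit-rate reduction factor."""
    return math.sqrt(old_bit_rate / new_bit_rate)

# Dropping from 1 Mbps to 100 kbps lengthens each bit 10x and, under this
# assumption, extends the range by sqrt(10).
print(f"{range_improvement(1e6, 100e3):.3f}x")   # 3.162x
```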