Abstract:
In this research, the equalization concepts in Long Term Evolution (LTE) are discussed in both theoretical and practical terms, as an analysis and implementation study of channel equalization in the LTE downlink. Since channel equalization is performed at the receiver after OFDM demodulation, we propose a complete scenario in which random bits are mapped to symbols and QPSK modulated to generate an OFDM frame consisting of 10 subframes. The frame is then passed through a fading channel with additive noise, and the receiver reverses the OFDM operations and applies equalization in order to recover the original transmitted frame. Because different receiver implementations use different types of equalizers, we examine the effects of two common linear equalizers, the Zero Forcing (ZF) and the Minimum Mean Square Error (MMSE) equalizers, and calculate the RMS percentage of the Error Vector Magnitude (EVM) for the pre-equalized and post-equalized signal of each equalizer. The simulation environment is MATLAB R2014a, whose Communications and LTE toolboxes offer very helpful built-in functions designed according to the latest releases of the 3GPP LTE technical specifications and communication engineering standards.
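The following is a minimal MATLAB sketch of the equalization and EVM comparison described above; it uses assumed parameters (FFT size, cyclic-prefix length, channel taps, SNR) and a single OFDM symbol with perfect channel knowledge, rather than the full LTE-toolbox frame processing used in the paper.

% Minimal sketch (assumed parameters): QPSK/OFDM over a known multipath
% channel, equalized with ZF and MMSE, with RMS EVM (%) before and after.
Nfft  = 64;             % assumed FFT size (illustrative, not an LTE numerology)
Ncp   = 16;             % assumed cyclic-prefix length
SNRdB = 20;             % assumed operating SNR

% Random bits -> Gray-mapped QPSK symbols (one OFDM symbol of data)
bits = randi([0 1], 2*Nfft, 1);
sym  = ((1 - 2*bits(1:2:end)) + 1j*(1 - 2*bits(2:2:end))) / sqrt(2);

% OFDM modulation: IFFT plus cyclic prefix
txTime = ifft(sym, Nfft) * sqrt(Nfft);
txCP   = [txTime(end-Ncp+1:end); txTime];

% Static multipath channel (assumed taps) plus AWGN
h        = [0.8; 0.5; 0.3];
rxCP     = conv(txCP, h);
rxCP     = awgn(rxCP, SNRdB, 'measured');       % Communications Toolbox
noiseVar = mean(abs(txCP).^2) * 10^(-SNRdB/10);

% OFDM demodulation: strip cyclic prefix, FFT back to subcarriers
rxTime = rxCP(Ncp+1:Ncp+Nfft);
rxFreq = fft(rxTime, Nfft) / sqrt(Nfft);
H      = fft(h, Nfft);                          % perfect channel knowledge assumed

% Per-subcarrier linear equalizers
eqZF   = rxFreq ./ H;                                    % Zero Forcing
eqMMSE = (conj(H) .* rxFreq) ./ (abs(H).^2 + noiseVar);  % MMSE

% RMS EVM in percent against the transmitted constellation
evm = @(rx) 100 * sqrt(mean(abs(rx - sym).^2) / mean(abs(sym).^2));
fprintf('Pre-equalization EVM: %.1f%%\n', evm(rxFreq));
fprintf('ZF EVM:               %.1f%%\n', evm(eqZF));
fprintf('MMSE EVM:             %.1f%%\n', evm(eqMMSE));

At high SNR the ZF and MMSE results converge, while at low SNR the MMSE equalizer avoids the noise enhancement that ZF suffers on deeply faded subcarriers, which is the trade-off the EVM comparison in this study is meant to expose.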