Post by soumendra on Jul 15, 2012 14:06:00 GMT -5
Hi All,
I have a few doubts about OFDM concepts and would really appreciate your help in clarifying them.
First, in an OFDM system like LTE, why is the sampling rate not equal to or greater than twice the system bandwidth, as the Nyquist criterion requires? For example, for a 20 MHz system the sampling rate is 30.72 MHz. Or does it actually satisfy the Nyquist criterion because the signal is complex, i.e., every complex sample carries two real samples (one real, one imaginary), effectively giving 61.44 Msamples/s? Or are there other reasons?
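For what it's worth, the 30.72 MHz figure falls out of the FFT size times the subcarrier spacing, and complex (I/Q) sampling at rate fs represents fs of bandwidth, so there is no Nyquist violation. A small sketch of the arithmetic, using the standard LTE 20 MHz numerology:

```python
# LTE 20 MHz numerology (FFT size and spacing from the LTE spec;
# 100 resource blocks x 12 subcarriers actually occupied)
subcarrier_spacing = 15e3      # Hz
fft_size = 2048                # samples per OFDM symbol (without CP)
occupied_subcarriers = 1200

fs = subcarrier_spacing * fft_size                 # complex sampling rate
occupied_bw = subcarrier_spacing * occupied_subcarriers

print(fs)           # 30720000.0 -> 30.72 MHz of complex (I/Q) samples
print(occupied_bw)  # 18000000.0 -> only 18 MHz actually carries subcarriers
```

So a complex sampling rate of 30.72 MHz supports up to 30.72 MHz of (two-sided) bandwidth, comfortably above the 18 MHz that is actually occupied; the remaining subcarriers are left empty as guard band within the 20 MHz channel.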
Second, do we effectively use the full subcarrier width in OFDM for transmitting data? From diagrams we see that the symbols are modulated onto subcarriers whose spectra are sinc pulses. To demodulate, we need to be synchronized to the centers of those sinc pulses. From this perspective the symbols sit only at the subcarrier centers, and the space between them is "unused". Even if OFDM is the most efficient method of packing subcarriers, it looks as though the spectrum is used at discrete intervals rather than continuously. One could argue that as Δf → 0, an OFDM system approaches single-carrier-like continuous usage, albeit at the cost of a higher sampling rate. Please correct me if my interpretation is wrong.
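One way to see that the space between the peaks is not wasted: each subcarrier's sinc-shaped spectrum spreads into its neighbours' bands, and what allows that overlap is orthogonality over one symbol period, not spectral separation. A minimal numerical sketch (my own toy example, not LTE-specific):

```python
import cmath

N = 64                # samples per OFDM symbol
k1, k2 = 3, 4         # two adjacent subcarrier indices

# Inner product of two subcarriers over one symbol period.
# It is (numerically) zero whenever k1 != k2, even though their
# sinc-shaped spectra overlap heavily in frequency.
inner = sum(cmath.exp(2j * cmath.pi * k1 * n / N) *
            cmath.exp(-2j * cmath.pi * k2 * n / N)
            for n in range(N)) / N

print(abs(inner) < 1e-9)   # True
```

So the spectrum between subcarrier centers is fully occupied by the overlapping sinc skirts; the receiver separates them because sampling at the subcarrier spacing lands on the nulls of every other subcarrier's sinc.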
So what makes OFDM superior to single-carrier systems, where the spectrum is completely utilized? Are simpler equalization and lower ISI at high data rates the only reasons for using OFDM?
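Regarding the equalization point: a toy sketch of why it is so much simpler in OFDM (my own minimal example, nothing LTE-specific). A cyclic prefix longer than the channel turns the linear multipath convolution into a circular one over the symbol, so the receiver needs only one complex division per subcarrier, versus a multi-tap time-domain equalizer in a single-carrier receiver:

```python
import cmath

def dft(x, inverse=False):
    # Naive O(N^2) DFT, enough for a toy example
    N = len(x)
    sign = 2j if inverse else -2j
    out = [sum(x[n] * cmath.exp(sign * cmath.pi * k * n / N)
               for n in range(N)) for k in range(N)]
    return [v / N for v in out] if inverse else out

N = 8
data = [1, -1, 1, 1, -1, 1, -1, -1]   # one BPSK symbol per subcarrier
h = [0.8, 0.5, 0.2]                   # 3-tap multipath channel

# Transmitter: IFFT, then prepend a cyclic prefix longer than the channel tail
tx = dft(data, inverse=True)
cp = len(h) - 1
tx_cp = tx[-cp:] + tx

# Channel: ordinary linear convolution with the multipath taps
rx = [sum(h[m] * tx_cp[n - m] for m in range(len(h)) if 0 <= n - m < len(tx_cp))
      for n in range(len(tx_cp) + len(h) - 1)]

# Receiver: drop the CP, FFT, then ONE complex division per subcarrier
R = dft(rx[cp:cp + N])
H = dft(h + [0] * (N - len(h)))       # channel frequency response
eq = [Rk / Hk for Rk, Hk in zip(R, H)]

print([round(v.real) for v in eq])    # [1, -1, 1, 1, -1, 1, -1, -1]
```

So yes, the one-tap frequency-domain equalizer (plus the resulting robustness to ISI) is the classic argument; the price is the CP overhead, higher peak-to-average power ratio, and sensitivity to frequency offset.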
Thanks in advance for your help !
Post by beverly on Aug 13, 2012 4:04:29 GMT -5
Post by sweety on Aug 24, 2012 6:15:34 GMT -5
Hi all
I am working on an OFDM system with turbo codes as the FEC method.
I want to plot the BER for the turbo-coded OFDM system, but I am finding it very difficult to do in MATLAB.
Can anyone help me with this?
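Not a full turbo-coded chain, but here is the skeleton of the Monte Carlo BER sweep you need (shown in Python; the loop structure translates line-for-line to MATLAB). The turbo encoder/decoder and the OFDM modulation are deliberately omitted: you would wrap them around the channel step. All names here are my own placeholders:

```python
import random, math

def ber_awgn_bpsk(ebn0_db, n_bits=20000):
    # Monte Carlo BER for uncoded BPSK over AWGN.
    # Insert your turbo encoder before, and decoder after, the channel
    # to get the coded curve for the same loop.
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))   # noise std dev with Eb = 1
    errors = 0
    for _ in range(n_bits):
        bit = random.randint(0, 1)
        tx = 1.0 if bit else -1.0       # BPSK mapping
        rx = tx + random.gauss(0, sigma)
        errors += (rx > 0) != bit       # hard decision
    return errors / n_bits

# One (Eb/N0, BER) point per loop iteration: these are the points
# you would plot on a log-scale BER axis (semilogy in MATLAB).
for ebn0_db in range(0, 9, 2):
    print(ebn0_db, ber_awgn_bpsk(ebn0_db))
```

At 0 dB this should come out near the theoretical Q(sqrt(2)) ≈ 0.079 for uncoded BPSK; if your coded curve does not fall below the uncoded one at moderate Eb/N0, the coding chain is the place to look for bugs.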