Wavelength Division Multiplexing (WDM) is the key technology in the ultra-high-capacity links that form the backbone of the internet. Hundreds of data channels, each at a different wavelength, travel through a single fiber, resulting in aggregate data rates of many terabits per second. The fundamental limit to the data transmission rate is the optical crosstalk between channels induced by the inevitable nonlinearity of the fiber. Traditional methods for compensating for the increase in bit error rate caused by this crosstalk include numerical backpropagation and nonlinear Volterra filters, both implemented in the digital domain at the receiver. Backpropagation through the canonical nonlinear Schrödinger equation is computationally expensive and beyond the capability of today's DSP at the data rates at which optical networks operate. Volterra filters scale superlinearly with the number of taps, which in turn scales with the amount of dispersion in the fiber; they are therefore not an ideal solution for high data rates. In this talk, we report on the application of machine learning, and neural networks in particular, to the compensation of optical crosstalk in WDM communication. We compare the performance of different machine learning models, such as support vector machines (SVM), decision trees, and convolutional neural networks (CNN), in terms of the achievable bit error rate on both binary and multilevel modulated data. We further evaluate the sensitivity of the error rate to the resolution of the analog-to-digital converter (ADC) and to the signal-to-noise ratio, as well as the latency of our algorithms.
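For reference, the channel model underlying digital backpropagation is the nonlinear Schrödinger equation, shown here in a standard single-polarization form (the specific parameterization used in the talk is not given in this abstract):

\[
\frac{\partial A}{\partial z} = -\frac{\alpha}{2} A - \frac{i\beta_2}{2}\frac{\partial^2 A}{\partial t^2} + i\gamma \lvert A\rvert^2 A ,
\]

where \(A(z,t)\) is the complex field envelope, \(\alpha\) the fiber loss, \(\beta_2\) the group-velocity dispersion, and \(\gamma\) the Kerr nonlinearity coefficient. Digital backpropagation numerically integrates this equation backward through the link (with sign-inverted coefficients) using many split-step segments per fiber span, which is what makes it computationally demanding at high symbol rates.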