Keywords:
Artificial intelligence
Computer science
Abstract:
Deep learning techniques have revolutionized many areas of science thanks to properties such as adaptability, scalability, and the ability to rapidly adjust to new and unknown challenges. This thesis aims to revolutionize the methodology behind millimeter-wave (mmWave) beamforming in the same way. Its main objective is to reduce the time complexity of finding the best beam pair in an mmWave link by moving from the conventional brute-force approach of testing all possible beam pairs, with polynomial time complexity of O(N²), to a deep learning solution with constant time complexity of O(1). The collected SNR data of all beam pairs was interpreted as a multivariate parallel time series over lateral motion with a fixed step size. The main challenge was deciding how to fit this data to an LSTM (Long Short-Term Memory) network, since building a separate multivariate time series model for each beam pair is technically impractical. Even if an independent LSTM model were built for every beam pair, the inherent relationships among the beam pairs would very likely be missed. Hence, we adopted an approach that avoids both of these shortcomings: a multivariate parallel time series model. To ensure that the data used in our evaluation is reliable and an accurate representation of a real-world scenario, we collected it on real hardware, the X60 testbed. X60 is the first SDR-based testbed for 60 GHz WLANs, featuring fully programmable MAC/PHY/network layers, multi-Gbps rates, and a user-configurable 12-element phased antenna array. Our evaluation shows that deep learning can considerably improve prediction accuracy, and the model achieves high throughput with little performance loss and almost zero overhead.
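To make the multivariate parallel time series formulation concrete, the sketch below shows one way a single LSTM could ingest the SNR traces of all beam pairs jointly and predict the next SNR vector, from which the best beam pair is an argmax. It is a minimal illustration under stated assumptions, not the thesis implementation: the Keras API, the choice of 144 beam pairs (12 TX × 12 RX beams), the window length, layer sizes, and the randomly generated SNR trace are all hypothetical placeholders.

```python
# Minimal sketch (assumptions: TensorFlow/Keras; a hypothetical link with
# 12 x 12 = 144 beam pairs; 5 past lateral-motion steps per input window).
# Each time step is a vector of SNR values, one entry per beam pair, so the
# whole trace is a multivariate parallel time series and one LSTM predicts
# the next SNR vector for all beam pairs at once.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

n_beam_pairs = 144   # assumed: 12 TX beams x 12 RX beams
window = 5           # assumed: number of past steps fed to the model

# Toy SNR trace: 200 lateral-motion steps, one SNR value per beam pair.
snr_trace = np.random.rand(200, n_beam_pairs).astype("float32")

# Slice the trace into (window, n_beam_pairs) inputs and next-step targets.
X = np.stack([snr_trace[i:i + window] for i in range(len(snr_trace) - window)])
y = snr_trace[window:]

model = Sequential([
    LSTM(128, input_shape=(window, n_beam_pairs)),
    Dense(n_beam_pairs),   # predicted SNR for every beam pair in one pass
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

# Selecting the best beam pair reduces to an argmax over the predicted SNR
# vector, avoiding the O(N^2) exhaustive sweep of all beam pairs.
next_snr = model.predict(X[-1:], verbose=0)[0]
best_pair = int(np.argmax(next_snr))
```

The key design point captured here is that one model shares information across all beam pairs, rather than training one independent time series model per pair, which is what preserves the inherent relationships among beams mentioned above.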