• 5G Networks Security: Attack Detection Using the J48 and the Random Forest Tree Classifiers

      Steele II, Bruce; Kholidy, Hisham A., Advisor (SUNY Polytechnic Institute, 2020)
      5G is the next generation of cellular networks, succeeding and improving upon the previous generation of 4G Long Term Evolution (LTE) networks. With the introduction of 5G come significant improvements over the previous generation, including the ability to support new and emerging technologies as well as the continued growth in the number of connected devices. The purpose of this report is to give a broad overview of what 5G encompasses, including its architecture, underlying technology, advanced features, use cases/applications, and security, and to evaluate the security of these new networks using existing machine learning classification techniques, namely the J48 tree classifier and the Random Forest tree classifier. The evaluation is based on the UNSW-NB15 dataset, created at the Cyber Range Lab of the Australian Centre for Cyber Security (ACCS) at the University of New South Wales. Since no 5G-specific datasets have yet been created, there is no publicly available dataset for 5G systems. Although the UNSW-NB15 dataset was built from a standard wireless computer network, we use it to simulate the device-to-device (D2D) connections that 5G will support. On the UNSW-NB15 dataset, the J48 tree classifier fits more accurately than the Random Forest classifier: the J48 tree classifier achieved 86.422% correctly classified instances, while the Random Forest tree classifier achieved 85.8451%.
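The classifier comparison above can be reproduced in outline with standard tooling. J48 is Weka's implementation of the C4.5 decision tree; since the report does not publish its exact pipeline, the following is only a minimal sketch in Python/scikit-learn, using DecisionTreeClassifier with the entropy criterion as a rough stand-in for J48. The file name and the "label"/"attack_cat" column names are assumptions about the UNSW-NB15 CSV layout, not details taken from the report.

```python
# Minimal sketch: comparing a J48-style decision tree and a Random Forest on UNSW-NB15.
# Assumptions: CSV file name, and that "label" (0 = normal, 1 = attack) and "attack_cat"
# are the target columns; adjust to the actual dataset layout.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("UNSW_NB15_training-set.csv")                 # assumed file name
X = pd.get_dummies(df.drop(columns=["label", "attack_cat"]))   # one-hot encode categorical features
y = df["label"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

for name, clf in [
    ("J48-style decision tree", DecisionTreeClassifier(criterion="entropy", random_state=42)),
    ("Random Forest", RandomForestClassifier(n_estimators=100, random_state=42)),
]:
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: {acc:.4%} correctly classified instances")
```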
• Generic Datasets, Beamforming Vectors Prediction of 5G Cellular Networks

      Singh, Manjit; Kholidy, Hisham A., Advisor (SUNY Polytechnic Institute, 2020)
      The early stages of 5G evolution revolve around delivering higher data speeds, latency improvements, and the functional redesign of mobile networks to enable greater agility, efficiency, and openness. The millimeter-wave (mmWave) massive multiple-input multiple-output (massive MIMO) system is one of the dominant technologies consistently featured among the 5G enablers, and it opens up new frontiers of services and applications for next-generation 5G cellular networks. mmWave massive MIMO shows potential to significantly raise user throughput, enhance spectral and energy efficiency, and increase the capacity of mobile networks by jointly exploiting the huge bandwidth available in the mmWave frequency bands and the high multiplexing gains achievable with massive antenna arrays. In this report, we present the preliminary outcomes of research on mmWave massive MIMO (research on this subject is still in the exploratory phase) and study two papers related to mmWave and massive MIMO for next-generation 5G wireless systems. We focus on how a generic dataset captures accurate real-world measurements through ray-tracing data, and how machine learning/deep learning can find correlations in this ray-tracing data for better beamforming vector prediction. We also study a deep learning model built and trained using TensorFlow and Google Colaboratory.
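As a companion to the abstract above, the following is a minimal, hypothetical TensorFlow/Keras sketch of the kind of beam-prediction model described: a small network that maps a flattened channel feature vector (standing in for ray-tracing measurements, e.g. from a DeepMIMO-style dataset) to a score over a fixed beamforming codebook, framed as a classification problem. The feature length, codebook size, and synthetic data are placeholders, not values from the report; such a model can be trained directly in Google Colaboratory.

```python
# Minimal sketch: beam (codebook index) prediction from channel features with TensorFlow/Keras.
# All sizes and the synthetic data below are placeholder assumptions for illustration only.
import numpy as np
import tensorflow as tf

NUM_FEATURES = 256   # assumed length of the flattened channel feature vector
CODEBOOK_SIZE = 64   # assumed number of candidate beamforming vectors in the codebook

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(CODEBOOK_SIZE, activation="softmax"),  # probability per beam
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic placeholder data standing in for ray-tracing measurements.
X = np.random.randn(1000, NUM_FEATURES).astype("float32")
y = np.random.randint(0, CODEBOOK_SIZE, size=1000)

model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```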