Also, Python's AI/ML libraries can be used; you would need to pass the input parameters from NetSim. For example, the clustering algorithms in scikit-learn can be used for network routing in NetSim. See https://scikit-learn.org/stable/modules/clustering.html
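A minimal sketch along these lines, using scikit-learn's KMeans to group nodes into routing clusters. The node coordinates and the cluster-head rule are just illustrative assumptions; in practice you would export the positions from NetSim:

```python
# Cluster node positions with KMeans, then pick one cluster head per cluster.
# The coordinates below stand in for positions exported from NetSim.
import numpy as np
from sklearn.cluster import KMeans

node_xy = np.array([[10, 20], [12, 24], [80, 75], [82, 70], [45, 50], [48, 52]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(node_xy)

# For each cluster, choose the node closest to the centroid as cluster head
# (an illustrative rule; your routing logic may differ).
for c in range(km.n_clusters):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(node_xy[members] - km.cluster_centers_[c], axis=1)
    head = members[np.argmin(dists)]
    print(f"Cluster {c}: nodes {members.tolist()}, head = node {head}")
```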
Yes, your general idea is correct. You need an external tool to run the ML functions while NetSim handles the network simulation. In my experience the best option is MATLAB. There are plenty of examples available in the NetSim file exchange (https://tetcos.com/file-exchange.html) and the knowledge base portal (https://support.tetcos.com/support/home). I believe Python has most of the functionality that MATLAB has, so it should be possible to interface with it as well.
Anup Vernekar An example: (i) use ML to control the transmit power in WLANs. Here a higher power can lead to better data rates at the associated STAs. However, a higher transmit power also blocks other transmitters (due to CSMA) and interferes with other receivers. Hence there is a trade-off that needs to be carefully evaluated.
This should be possible. You can send the performance data from NetSim to an ML program, which in turn returns the transmit power to be used. This process can be carried out continuously (say, every 1 s), as in the sketch below.
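A minimal sketch of that 1 s control loop, using a simple epsilon-greedy bandit to pick the transmit power. The two NetSim hooks are hypothetical stubs; replace them with whatever socket or file interface you use to exchange data with the running simulation:

```python
import random

POWER_LEVELS_DBM = [10, 14, 18, 22]     # candidate transmit powers
counts = [0] * len(POWER_LEVELS_DBM)    # times each power was tried
values = [0.0] * len(POWER_LEVELS_DBM)  # running mean throughput per power
EPSILON = 0.1                           # exploration probability

def set_tx_power_dbm(p_dbm):
    pass  # stub: push the new power into NetSim via your interface

def get_wlan_throughput():
    return random.uniform(0.0, 100.0)  # stub: read throughput back from NetSim

def control_step():
    # Explore occasionally; otherwise use the best power seen so far.
    if random.random() < EPSILON:
        i = random.randrange(len(POWER_LEVELS_DBM))
    else:
        i = max(range(len(POWER_LEVELS_DBM)), key=lambda k: values[k])
    set_tx_power_dbm(POWER_LEVELS_DBM[i])
    reward = get_wlan_throughput()
    counts[i] += 1
    values[i] += (reward - values[i]) / counts[i]  # incremental mean update

# Call control_step() once per second while the simulation runs.
for _ in range(10):
    control_step()
```

The bandit is only one possible learner here; any ML program that maps observed performance to a power setting would slot into the same loop.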
This is another NetSim ML paper - https://www.researchgate.net/publication/360724660_Adaptive_Hybrid_Heterogeneous_IDS_for_6LoWPAN - which uses incremental machine learning approaches and various ‘concept-drift detection’ mechanisms.
You could also look at it as a high-dimensional, computationally expensive black-box (HEB) problem. For example, in a problem we are working on in NetSim, we are trying to do gNB power control in 5G to obtain maximum sum throughput under interference conditions. Say you have N gNBs, M UEs, and P power settings per gNB. We don't change the power within one simulation but rather run many simulations one after the other. Exhaustive search calls for P^N simulations: if N were 10 and P were 20, that is 20^10, on the order of 10^13 simulations. This is infeasible since each simulation takes a long time, tens of minutes, to output the network performance. One must therefore look at black-box learning algorithms that minimize the number of simulation runs required to reach the optimum vector of transmit powers, as sketched below.
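One possible shape for such a black-box approach, using Bayesian optimization via scikit-optimize's gp_minimize (an assumption on my part, not something NetSim ships; run_netsim() is a hypothetical stand-in for launching one simulation and parsing the sum throughput):

```python
from skopt import gp_minimize

N_GNBS = 10
P_MIN_DBM, P_MAX_DBM = 0, 40

def run_netsim(powers_dbm):
    # Stub: replace with code that writes the NetSim config with these
    # gNB powers, runs the simulation, and reads back the sum throughput.
    return sum(powers_dbm)  # placeholder objective

def objective(powers_dbm):
    # gp_minimize minimizes, so negate the throughput we want to maximize.
    return -run_netsim(powers_dbm)

result = gp_minimize(
    objective,
    dimensions=[(P_MIN_DBM, P_MAX_DBM)] * N_GNBS,  # one power per gNB
    n_calls=50,       # budget: 50 simulations instead of P^N
    random_state=0,
)
print("Best powers:", result.x, "best sum throughput:", -result.fun)
```

The point is that the optimizer spends its fixed budget of expensive simulation runs where the surrogate model predicts the most improvement, rather than sweeping the whole P^N grid.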
You can interface NetSim with MATLAB via Python. You can use the Multi Parameter Sweeper (MPS) for NetSim, which runs the simulation multiple times for a given array of input parameters. See this for details - https://www.tetcos.com/pdf/v13.1/NetSim-Multi-Parameter-Sweeper_v13.1.pdf
Now, you can call the MATLAB engine from runTest.py (a file in MPS), where MATLAB generates the simulation parameters. You can call MATLAB synchronously or asynchronously depending on your requirements.
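A rough sketch of what that could look like inside runTest.py, using the MATLAB Engine API for Python. generate_params.m is a hypothetical MATLAB function you would write yourself; it is assumed to take the latest results and return the next parameter vector:

```python
import matlab.engine

eng = matlab.engine.start_matlab()

# Synchronous call: blocks until MATLAB returns the parameters.
params = eng.generate_params(matlab.double([0.0]), nargout=1)

# Asynchronous call: continue other work and collect the result later.
# (background=True is the keyword in recent MATLAB releases; older ones
# used async=True.)
future = eng.generate_params(matlab.double([0.0]), nargout=1, background=True)
# ... e.g. post-process the previous NetSim run here ...
params = future.result()

eng.quit()
```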
Another question: how can I get various kinds of simulation data? For example, PHY-layer logs of SINR and pathloss, MAC-layer details like available PRBs and allocated PRBs, and general information like UE positions and their associated gNBs.
Yes, if you enable logging then you can record “Radio measurements” and “Radio resource allocation”. It will be similar to the screenshots shown in https://tetcos.com/5g.html
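Once those logs are written out, they can be loaded into Python for analysis. A hypothetical sketch, assuming the log is exported as CSV; the file name and column names below are placeholders, so check the headers in your NetSim log folder and adjust:

```python
import pandas as pd

# Assumed CSV export of the "Radio measurements" log (name is illustrative).
radio = pd.read_csv("Radio_Measurements_Log.csv")

# Example: average SINR per UE (column names are placeholders).
print(radio.groupby("UE_ID")["SINR_dB"].mean())
```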