Runtime data exchange between OMNeT++ simulation and Python-based machine learning model

I’m working with OMNeT++ (version 4.6) using the INET framework for network simulation. My goal is to establish real-time communication between the running simulation and a Python machine learning algorithm.

The simulation needs to continuously send network metrics like signal quality measurements and node position data to the Python program. The Python side processes this information to train a model that optimizes network performance.

Once the Python model calculates the best configuration parameters, it must send these control commands back to the OMNeT++ simulation while it’s still running.

What’s the best approach to set up this bidirectional communication between these two separate processes? I need both programs to exchange data without stopping the simulation.

Had the same issue with OMNeT++ 4.6 a couple years back while working on adaptive routing algorithms. ZeroMQ sockets worked best for me - way more reliable than other options I tried. Set up a simple message queue where your OMNeT++ modules publish metrics to topics and subscribe to control updates.

The trick is building a lightweight C++ wrapper in your simulation modules that handles socket communication without blocking the sim thread. The Python side's easy - just use pyzmq for message passing. Got sub-millisecond latency and zero simulation timing problems.

Just make sure you're using non-blocking socket operations in OMNeT++ so you don't mess with the discrete event scheduler.
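Here's a rough sketch of the pattern on the Python side, assuming pyzmq is installed (`pip install pyzmq`). The topic name, the endpoints, and the metric payload format are all illustrative, not anything standard - in the real setup the PUB half would live in your C++ wrapper inside the OMNeT++ module (over `tcp://` endpoints rather than `inproc://`), and this script only stands it in so the pub/sub + non-blocking-poll flow is visible end to end:

```python
# Illustrative ZeroMQ pub/sub bridge, all in one process via inproc://.
# In practice the publisher side is the C++ wrapper in OMNeT++ and the
# transport is tcp://; topic names and payloads here are made up.
import time
import zmq

ctx = zmq.Context.instance()

# Stand-in for the OMNeT++ side: publishes metrics on a "metrics" topic.
metrics_pub = ctx.socket(zmq.PUB)
metrics_pub.bind("inproc://metrics")

# Python ML side: subscribes to the "metrics" topic.
metrics_sub = ctx.socket(zmq.SUB)
metrics_sub.connect("inproc://metrics")
metrics_sub.setsockopt_string(zmq.SUBSCRIBE, "metrics")
time.sleep(0.1)  # slow-joiner guard: let the subscription propagate

metrics_pub.send_multipart([b"metrics", b"snr=12.5 pos=3,7"])

# Non-blocking receive via a poller instead of a blocking recv() - the
# same discipline the C++ wrapper needs so the discrete event scheduler
# is never stalled waiting on the socket.
poller = zmq.Poller()
poller.register(metrics_sub, zmq.POLLIN)
events = dict(poller.poll(timeout=1000))
if events.get(metrics_sub) == zmq.POLLIN:
    topic, payload = metrics_sub.recv_multipart()
    print(topic.decode(), payload.decode())
```

The control path back to the simulation is the mirror image: a PUB socket on the Python side, a SUB socket in the C++ wrapper, polled once per event (or on a self-scheduled timer message) rather than in a blocking loop.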