I’m working with an OMNeT++ simulation (version 4.6) that uses the INET framework. My setup requires real-time communication between the network simulator and a Python ML algorithm while both are running.
The simulator needs to continuously send network metrics like signal-to-noise ratios and node positions to the Python program. The Python side processes this information to train a learning model that optimizes network performance.
Once the Python algorithm calculates the optimal configuration parameters, it must send these control commands back to the OMNeT++ simulator immediately. This creates a feedback loop where both applications exchange data during execution.
What’s the best approach to establish this bidirectional communication channel between these two running processes? I need a solution that works reliably during simulation runtime.
TCP sockets are probably easier to deal with than pipes here. I used ZeroMQ for something similar and it handled the async messaging really well. You can have OMNeT++ push metrics through a ZMQ publisher while Python subscribes and sends the optimized parameters back over a second socket (PUB/SUB is one-directional, so the return path needs its own socket). Just handle connection drops properly, or your simulation will crash every time Python hiccups.
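Here's a rough sketch of what the Python end of that loop could look like with pyzmq. The port numbers, the line-of-JSON message format, and `compute_parameters()` are placeholders I made up for illustration; it also assumes the OMNeT++ side binds both endpoints (PUB for metrics, PULL for parameters).

```python
# Python side of an assumed ZMQ setup: SUB for metrics, PUSH for parameters.
import json
import zmq

def compute_parameters(metrics):
    # Placeholder for the ML step; return whatever config the simulator expects.
    return {"tx_power": 0.5}

def main():
    ctx = zmq.Context()

    sub = ctx.socket(zmq.SUB)                 # receives metrics published by OMNeT++
    sub.connect("tcp://localhost:5556")       # assumed port
    sub.setsockopt_string(zmq.SUBSCRIBE, "")  # no topic filtering

    push = ctx.socket(zmq.PUSH)               # sends optimized parameters back
    push.connect("tcp://localhost:5557")      # assumed port

    poller = zmq.Poller()
    poller.register(sub, zmq.POLLIN)

    while True:
        # Poll with a timeout so a stalled or restarting publisher
        # doesn't block the Python process forever.
        events = dict(poller.poll(timeout=1000))
        if sub in events:
            metrics = json.loads(sub.recv_string())
            params = compute_parameters(metrics)
            push.send_string(json.dumps(params))

if __name__ == "__main__":
    main()
```

The poll timeout is what keeps Python from hanging when OMNeT++ pauses or the connection drops; you'd add reconnect/error handling around it for anything long-running.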
I’ve done something similar with named pipes for OMNeT++/Python communication, and it worked great for my adaptive routing project. Create the pipes first, then have OMNeT++ write network state data to one while Python reads it and sends control parameters back through the other. The tricky part is synchronization: you don’t want OMNeT++ blocking on Python during critical sim events. I’d add a read timeout on the OMNeT++ side and buffer the latest ML recommendation, so the sim keeps running even when Python gets slow. The performance hit was minimal at 100 ms update intervals.
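In case it helps, here's roughly what my Python end looked like (Unix only). The pipe paths and the one-JSON-object-per-line format are just examples, and `compute_parameters()` stands in for the actual ML step.

```python
# Python side of a two-FIFO setup: read simulator state from one pipe,
# write control parameters to the other.
import json
import os

STATE_PIPE = "/tmp/omnet_state"      # assumed path: OMNeT++ writes, Python reads
CONTROL_PIPE = "/tmp/omnet_control"  # assumed path: Python writes, OMNeT++ reads

def compute_parameters(state):
    # Placeholder for the ML step.
    return {"tx_power": 0.5}

def main():
    # Create the FIFOs up front if they don't exist yet.
    for path in (STATE_PIPE, CONTROL_PIPE):
        if not os.path.exists(path):
            os.mkfifo(path)

    # open() on a FIFO blocks until the other end is opened, so both
    # processes must open the two pipes in a compatible order or they
    # will deadlock waiting on each other.
    with open(STATE_PIPE, "r") as state_pipe, open(CONTROL_PIPE, "w") as control_pipe:
        for line in state_pipe:              # one JSON object per line
            state = json.loads(line)
            params = compute_parameters(state)
            control_pipe.write(json.dumps(params) + "\n")
            control_pipe.flush()             # push each message out immediately
```

```python
if __name__ == "__main__":
    main()
```

The timeout-and-buffer logic I mentioned lives on the OMNeT++ side, so it isn't shown here; this is just the reader/writer loop that pairs with it.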