Runtime communication between OMNeT++ simulator and Python-based AI agent?

I’m working on a project that involves an OMNeT++ 4.6 simulation running with the INET framework and a Python-based AI agent. I need a way to exchange data between the two in real time.

The simulator emits data such as the average SNR of network links and the positions of mobile nodes. The AI agent should consume this information, update its training with it, and generate actions intended to maintain or improve the SNR, which are then sent back to the simulator.

I’m looking for a solution to establish this two-way communication during runtime. Any insights or methods, especially those that effectively use OMNeT++ and Python together, would be greatly appreciated.

Have you considered using Apache Kafka for your integration needs? It’s a robust distributed streaming platform that could effectively handle the communication between OMNeT++ and your Python AI agent.

You’d set up Kafka topics for different data types: one for SNR values, another for node locations, and so on. OMNeT++ would act as a producer, publishing data to these topics. Your Python agent would be a consumer, subscribing to the relevant topics and processing the incoming data in real time.

For sending actions back, you’d create separate topics where the Python agent publishes, and OMNeT++ consumes. This approach provides scalability and fault-tolerance, which could be beneficial as your project grows.
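To make the consumer/producer split concrete, here’s a rough sketch of the Python side using the kafka-python package. The topic names ("snr", "node_locations", "actions"), the broker address, the JSON message shapes, and the trivial "policy" are assumptions for illustration, not something Kafka or OMNeT++ prescribes.

```python
import json
from kafka import KafkaConsumer, KafkaProducer

# Consume simulation data from the topics the OMNeT++ side produces to
# (topic names and broker address are placeholders).
consumer = KafkaConsumer(
    "snr", "node_locations",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

# Produce actions back to a topic the OMNeT++ side consumes.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for msg in consumer:
    observation = msg.value                      # e.g. {"link": 3, "snr": 17.2}
    # Placeholder policy: a real agent would compute this from its model.
    action = {"link": observation.get("link"), "tx_power_dbm": 20}
    producer.send("actions", action)
```

On the OMNeT++ side you’d use a C/C++ Kafka client (e.g. librdkafka) for the mirror-image producer/consumer logic.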

Kafka’s persistence feature also allows for replay of data streams if needed. While there’s a learning curve, the long-term benefits in terms of reliability and performance make it worth considering for your real-time data exchange requirements.

Hey, have you considered using ZeroMQ? It’s a messaging library that works great for this kind of thing. You can set up a pub/sub system where OMNeT++ publishes data and your Python agent subscribes to it, then use another channel for sending actions back. It’s fast and reliable; I’ve used it for similar projects before.
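A minimal pyzmq sketch of the agent side, assuming OMNeT++ publishes JSON strings on a PUB socket and pulls actions from a PULL socket; the ports, message format, and the dummy "policy" are all made up for the example.

```python
import json
import zmq

ctx = zmq.Context()

# SUB socket for simulation data published by OMNeT++ (port is an assumption).
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://localhost:5555")
sub.setsockopt_string(zmq.SUBSCRIBE, "")         # subscribe to all messages

# PUSH socket for the return channel carrying actions (also assumed).
push = ctx.socket(zmq.PUSH)
push.connect("tcp://localhost:5556")

while True:
    data = json.loads(sub.recv_string())         # e.g. {"node": 2, "snr": 14.8}
    # Placeholder policy: the real agent would decide this from its model.
    action = {"node": data.get("node"), "tx_power_dbm": 20}
    push.send_string(json.dumps(action))
```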

I’ve tackled a similar challenge in my research, and found that using a RESTful API approach worked quite well. You could set up a lightweight web server within your OMNeT++ simulation using a library like Crow or cpp-httplib. This server would expose endpoints for your Python AI agent to query.

On the Python side, you’d use a library like requests to periodically poll these endpoints, fetching the latest simulation data. For sending actions back, your Python script could make POST requests to designated endpoints.
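As a rough illustration of that polling loop with the requests library: the base URL, endpoint paths, JSON shapes, and poll interval below are invented for the example and would depend on how you expose the server inside the simulation.

```python
import time
import requests

BASE = "http://localhost:8080"   # assumed address of the in-simulation web server

while True:
    # Poll the (hypothetical) endpoints exposed by the C++ server.
    snr = requests.get(f"{BASE}/snr", timeout=2).json()
    locations = requests.get(f"{BASE}/locations", timeout=2).json()

    # Placeholder policy: send one action back via POST.
    action = {"node": locations[0]["id"], "tx_power_dbm": 20}
    requests.post(f"{BASE}/actions", json=action, timeout=2)

    time.sleep(0.5)              # poll interval; tune to the simulation's pace
```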

This method is relatively straightforward to implement and doesn’t require learning complex new frameworks. It’s also quite flexible: you can easily add new data types or actions by creating additional endpoints. The main drawback is slightly higher latency than some other methods, but for most simulation scenarios this shouldn’t be a significant issue.

Remember to handle potential network issues and implement appropriate error handling on both sides to ensure robustness.

I kinda prefer plain sockets. Try setting up a simple socket server in OMNeT++ and a client in Python for real-time data exchange. It’s fast and keeps things minimal. Give it a shot.
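Something like the following could serve as the Python client, assuming the OMNeT++ module listens on a TCP port and exchanges newline-delimited JSON; the port number, message format, and the trivial "policy" are assumptions, not part of any existing protocol.

```python
import json
import socket

# Connect to the (assumed) TCP server running inside the OMNeT++ module.
with socket.create_connection(("localhost", 9999)) as sock:
    stream = sock.makefile("rwb")
    for line in stream:                          # one JSON object per line from the simulator
        data = json.loads(line)                  # e.g. {"node": 1, "snr": 12.3}
        # Placeholder policy: reply with a fixed action for the same node.
        action = {"node": data.get("node"), "tx_power_dbm": 20}
        stream.write((json.dumps(action) + "\n").encode("utf-8"))
        stream.flush()
```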

For your OMNeT++ and Python integration, I’d recommend exploring the gRPC framework. It’s designed for efficient, language-agnostic remote procedure calls and supports bidirectional streaming. You’d define a protocol buffer schema for your data structures, then implement a gRPC server in OMNeT++ (in C++) and a client in Python.

This setup allows real-time communication with minimal overhead: the OMNeT++ side can stream simulation data, while the Python agent sends back actions as needed. It’s a bit more complex to set up initially, but offers excellent performance and flexibility for your use case.
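For a sense of what the Python client might look like with a bidirectional stream: the service name (Simulation), the RPC (Exchange), the message types (Observation, Action), and the generated modules simulation_pb2 / simulation_pb2_grpc are all hypothetical and would come from your own .proto definition compiled with protoc.

```python
import queue

import grpc
import simulation_pb2        # hypothetical module generated by protoc
import simulation_pb2_grpc   # hypothetical module generated by protoc

channel = grpc.insecure_channel("localhost:50051")
stub = simulation_pb2_grpc.SimulationStub(channel)

action_queue = queue.Queue()
# Prime the stream with an initial action in case the server waits for one.
action_queue.put(simulation_pb2.Action(node_id=0, tx_power_dbm=20))

def actions():
    """Request iterator for the bidirectional stream: yields queued actions."""
    while True:
        yield action_queue.get()

# 'Exchange' is an assumed bidirectional-streaming RPC: the client streams
# Action messages while receiving Observation messages from the simulator.
for obs in stub.Exchange(actions()):
    # Placeholder policy: react to each observation with a fixed power setting.
    action_queue.put(simulation_pb2.Action(node_id=obs.node_id, tx_power_dbm=20))
```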