Creating a concurrent Telegram bot in Ruby for multiple users

I’m working on a Telegram bot using the telegram-bot-ruby gem and I’m running into issues handling multiple users simultaneously. The bot needs to support complex conversation flows where each user can interact independently, without their sessions interfering with each other.

Currently my implementation breaks when multiple users try to use the bot at the same time. I’m not sure if I need asynchronous processing or if there’s a different approach to handle concurrent user sessions.

Here’s my current code that demonstrates the problem:

require 'telegram/bot'

Telegram::Bot::Client.run(bot_token) do |client|
  client.listen do |msg|
    case msg.text
    when '/begin'
      client.api.send_message(chat_id: msg.chat.id, text: "Welcome! Please tell me your username:")
      
      # Nested listen: blocks the outer loop and grabs the next incoming message (from any user)
      client.listen do |response1|
        @username = response1.text
        break
      end
      
      client.api.send_message(chat_id: msg.chat.id, text: "Thanks #{@username}! What's your location?")
      
      client.listen do |response2|
        @location = response2.text
        break
      end
      
    when '/end'
      client.api.send_message(chat_id: msg.chat.id, text: "Goodbye!")
    else
      client.api.send_message(chat_id: msg.chat.id, text: "Type /begin to start")
    end
  end
end

I’ve searched through various GitHub examples and documentation but haven’t found a clear solution for handling multiple concurrent users. I’m building this as a standalone Ruby script without Rails. Any guidance on the right architecture would be really helpful.

Yeah, you’re blocking on those nested listeners - had the same issue. Just store convo state in Redis or a simple hash keyed by chat_id, and handle everything in one main loop, no nesting. Works great for hundreds of concurrent users.

Those nested client.listen blocks are what’s causing your concurrency problems. Each nested listen creates a blocking loop, so other users can’t get processed until the current conversation finishes. I’ve run into this exact issue with my own bot.

The fix is simple: use a single event loop and store user state externally. I use a hash with chat_id as the key:

user_sessions = {}

Telegram::Bot::Client.run(bot_token) do |client|
  client.listen do |msg|
    chat_id = msg.chat.id
    user_sessions[chat_id] ||= { step: :idle }
    session = user_sessions[chat_id]

    case session[:step]
    when :idle
      if msg.text == '/begin'
        client.api.send_message(chat_id: chat_id, text: "Welcome! Please tell me your username:")
        session[:step] = :waiting_username
      end
    when :waiting_username
      session[:username] = msg.text
      session[:step] = :waiting_location
      client.api.send_message(chat_id: chat_id, text: "Thanks #{session[:username]}! What's your location?")
    when :waiting_location
      session[:location] = msg.text
      session[:step] = :idle
      # Process complete data
    end
  end
end

This way every message gets processed immediately without blocking other users.

That nested listener approach you’re using blocks other users - you’re basically creating a synchronous conversation flow. I ran into this exact problem in production, and here’s what I learned: treat each message as its own event, not as part of a sequential conversation. Your current setup waits for responses one by one, which kills concurrency.

What fixed it for me was a state machine pattern. I persist each user’s conversation state between messages, so when a message comes in, I just check what step that user is on and handle it accordingly. For persistence, I started with a simple Ruby hash while developing, then moved to Redis for production. Redis handles crashes way better and keeps state across restarts.

The key insight: Telegram bots are async by nature. Messages come in any order from different users, so your code needs to roll with that instead of fighting it.
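To make that concrete, here’s a minimal sketch of a Redis-backed version of the same state machine - not my exact production code; the key naming, TTL, step names, and env-var token lookup are illustrative assumptions:

require 'telegram/bot'
require 'redis'
require 'json'

# Assumptions: redis gem, Redis on localhost, sessions serialized as JSON,
# bot token read from an environment variable.
bot_token = ENV.fetch('TELEGRAM_BOT_TOKEN')
redis = Redis.new

# Load a user's session by chat_id, defaulting to the idle step
def load_session(redis, chat_id)
  raw = redis.get("bot:session:#{chat_id}")
  raw ? JSON.parse(raw, symbolize_names: true) : { step: 'idle' }
end

# Persist the session with a 24-hour expiry (illustrative TTL)
def save_session(redis, chat_id, session)
  redis.set("bot:session:#{chat_id}", session.to_json, ex: 24 * 60 * 60)
end

Telegram::Bot::Client.run(bot_token) do |client|
  client.listen do |msg|
    chat_id = msg.chat.id
    session = load_session(redis, chat_id)

    case session[:step]
    when 'idle'
      if msg.text == '/begin'
        client.api.send_message(chat_id: chat_id, text: "Welcome! Please tell me your username:")
        session[:step] = 'waiting_username'
      end
    when 'waiting_username'
      session[:username] = msg.text
      session[:step] = 'waiting_location'
      client.api.send_message(chat_id: chat_id, text: "Thanks #{session[:username]}! What's your location?")
    when 'waiting_location'
      session[:location] = msg.text
      session[:step] = 'idle'
      # Completed data is now in session[:username] and session[:location]
    end

    save_session(redis, chat_id, session)
  end
end

Each message loads the session, advances it one step, and writes it back, so state survives restarts and the only thing lost in a crash is the message being handled at that moment. (Steps are stored as strings rather than symbols because they round-trip through JSON.)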