How to stream AI chat responses through tRPC to a frontend React component

I’m trying to build a streaming chat feature in my Next.js app using tRPC. The AI responses take too long without streaming, so I need to send data chunks as they arrive.

Here’s my current backend setup:

import { PassThrough } from 'stream';
import { z } from 'zod';
import { createAIClient } from 'utils/ai-client';
import { router, protectedProcedure } from '../trpc'; // wherever your tRPC instance lives

export const chatRouter = router({
  sendMessage: protectedProcedure
    .input(z.object({
      prompt: z.string()
    }))
    .mutation(async ({ ctx, input }) => {
      const aiStream = new PassThrough({ objectMode: true });
      const client = createAIClient();

      const aiResponse = await client.createCompletion({
        messages: [{
          role: 'user',
          content: input.prompt
        }],
        model: 'gpt-3.5-turbo',
        temperature: 0.7,
        stream: true
      }, { responseType: 'stream' });

      let currentRole = '';

      aiResponse.data.on('data', (chunk: Buffer) => {
        // Each chunk may contain several "data: {...}" SSE lines
        const textLines = chunk.toString().split('\n').filter(l => l.trim() !== '');

        textLines.forEach(line => {
          const content = line.replace(/^data: /, '');
          if (content === '[DONE]') return;

          const parsedData = JSON.parse(content);
          if (parsedData.choices[0].finish_reason === 'stop') return;

          currentRole = parsedData.choices[0].delta.role || currentRole;

          if (parsedData.choices[0].delta.content) {
            aiStream.push({
              role: currentRole,
              text: parsedData.choices[0].delta.content
            });
          }
        });
      });

      return aiStream;
    })
});

And my React component:

useEffect(() => {
  const fetchStream = async () => {
    const responseStream = await trpc.chat.sendMessage.mutate({ 
      prompt: 'Hello world' 
    });
    
    // Throws here: "responseStream.on is not a function"
    responseStream.on('data', (chunk) => {
      console.log('Received:', chunk);
    });
  };
  
  fetchStream();
}, []);

The problem is that responseStream.on throws an error saying it’s not a function, so the stream chunks never reach my React component. How can I make tRPC handle streaming data correctly? Is there a different approach I should use for real-time AI responses?

Same issue here. tRPC mutations return serialized data, not actual Node.js streams - that’s the core problem. Here’s what worked for me: create a custom /api/chat/stream endpoint that handles AI streaming directly, then use fetch with a ReadableStream reader in your component. Combine this with React Query for state management. You can still use tRPC for saving chat history and other stuff, just handle the streaming outside tRPC’s request/response cycle. This way you get proper backpressure handling without fighting tRPC’s design.
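
In case it helps, here’s a minimal sketch of that client side, assuming a hypothetical /api/chat/stream route that writes plain text chunks (adjust the URL and chunk format to whatever your endpoint actually sends):

// Read the streaming response with fetch + a ReadableStream reader
const streamChat = async (prompt: string, onChunk: (text: string) => void) => {
  const response = await fetch('/api/chat/stream', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt }),
  });
  if (!response.body) throw new Error('No response body');

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Hand each decoded chunk to the caller as it arrives
    onChunk(decoder.decode(value, { stream: true }));
  }
};

Call streamChat from a submit handler rather than useEffect, and have onChunk append into component state so tokens render as they arrive.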

tRPC mutations don’t support streaming like that. You should check out subscriptions for real-time data instead. Consider using tRPC with WebSockets or server-sent events to handle streaming responses effectively.
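
If you go the subscription route, a rough sketch with tRPC v10’s observable helper could look like the following; the { text } payload shape is my own choice, and you’d also need a WebSocket-capable link (e.g. wsLink) on the client:

import { observable } from '@trpc/server/observable';
import { z } from 'zod';
import { createAIClient } from 'utils/ai-client';
import { router, protectedProcedure } from '../trpc';

export const chatRouter = router({
  onMessage: protectedProcedure
    .input(z.object({ prompt: z.string() }))
    .subscription(({ input }) =>
      observable<{ text: string }>((emit) => {
        createAIClient()
          .createCompletion(
            {
              messages: [{ role: 'user', content: input.prompt }],
              model: 'gpt-3.5-turbo',
              stream: true,
            },
            { responseType: 'stream' }
          )
          .then((aiResponse: any) => {
            // Same SSE parsing as the question, but each delta is
            // emitted to the subscriber instead of pushed into a stream
            aiResponse.data.on('data', (chunk: Buffer) => {
              for (const line of chunk.toString().split('\n')) {
                const content = line.replace(/^data: /, '').trim();
                if (!content || content === '[DONE]') continue;
                const delta = JSON.parse(content).choices[0].delta;
                if (delta.content) emit.next({ text: delta.content });
              }
            });
            aiResponse.data.on('end', () => emit.complete());
          })
          .catch((err: Error) => emit.error(err));

        // Teardown runs when the client unsubscribes
        return () => {};
      })
    ),
});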

You’re mixing up server-side streams with client-side handling. That PassThrough stream won’t serialize through tRPC’s JSON protocol. Had this exact problem last month building something similar. Skip trying to force streaming through tRPC mutations - use Server-Sent Events instead. Set up a separate API route in your Next.js app for the streaming response, then use EventSource in React to consume it. Keep tRPC for regular operations and use SSE for AI responses. Way cleaner separation and actually works across browsers.
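
For example, the consuming component might look like this; the /api/chat/stream route name and the { text } payload are assumptions, and since EventSource only issues GET requests the prompt has to travel in the query string:

import { useEffect, useState } from 'react';

export function ChatStream({ prompt }: { prompt: string }) {
  const [answer, setAnswer] = useState('');

  useEffect(() => {
    const source = new EventSource(
      `/api/chat/stream?prompt=${encodeURIComponent(prompt)}`
    );

    source.onmessage = (event) => {
      if (event.data === '[DONE]') {
        source.close();
        return;
      }
      // Append each streamed token to the rendered answer
      setAnswer((prev) => prev + JSON.parse(event.data).text);
    };
    source.onerror = () => source.close();

    return () => source.close(); // drop the connection on unmount
  }, [prompt]);

  return <p>{answer}</p>;
}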

Yeah, tRPC subscriptions could work but I’d just use WebSockets directly. Tried something similar a few months back and ditched tRPC for streaming entirely. Set up a Socket.io server and emit chunks as they come from the OpenAI API. Way simpler than dealing with tRPC’s subscription complexity.
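
Something along these lines; the chat:prompt / chat:chunk event names are made up, and the delta parsing is lifted straight from the question’s mutation:

import { Server } from 'socket.io';
import { createAIClient } from 'utils/ai-client';

const io = new Server(3001, { cors: { origin: '*' } });

io.on('connection', (socket) => {
  socket.on('chat:prompt', async (prompt: string) => {
    const aiResponse = await createAIClient().createCompletion(
      {
        messages: [{ role: 'user', content: prompt }],
        model: 'gpt-3.5-turbo',
        stream: true,
      },
      { responseType: 'stream' }
    );

    aiResponse.data.on('data', (chunk: Buffer) => {
      for (const line of chunk.toString().split('\n')) {
        const content = line.replace(/^data: /, '').trim();
        if (!content || content === '[DONE]') continue;
        const delta = JSON.parse(content).choices[0].delta;
        // Emit each delta straight to this socket as it arrives
        if (delta.content) socket.emit('chat:chunk', delta.content);
      }
    });
    aiResponse.data.on('end', () => socket.emit('chat:done'));
  });
});

On the client, connect with io('http://localhost:3001') and append each chat:chunk payload into state until chat:done fires.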