How to fetch YouTube comments for multiple video IDs using YouTube Data API v3 through RapidAPI

I’m building a comment analysis system and need to extract comments from YouTube videos. My code works fine for a single video, but I have hundreds of video IDs to process, and the current approach requires me to change the video ID manually each time, which is really slow.

import requests

# RapidAPI proxy for the YouTube Data API v3 commentThreads endpoint
api_endpoint = "https://youtube-v31.p.rapidapi.com/commentThreads"

# The endpoint accepts exactly one videoId per request
parameters = {
    "maxResults": "50",
    "videoId": "single_video_id_here",
    "part": "snippet"
}

request_headers = {
    "x-rapidapi-key": "your_api_key_here",
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}

api_response = requests.get(api_endpoint, headers=request_headers, params=parameters)
data = api_response.json()
print(data)

I have a list with around 200 video IDs and want to collect all comments from these videos automatically. Is there a way to pass multiple video IDs in one request or do I need to loop through them? What’s the best approach to handle this without hitting rate limits?

Yup, you have to loop through each video ID since the API doesn’t do batch requests. Just add a time.sleep(0.1) between calls to help with rate limits. Threading might boost speed, but watch your quota.
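
Something like this is what I mean — a minimal sketch that reuses the endpoint and headers from your post, assuming a placeholder video_ids list and your real key in place of your_api_key_here:

import time
import requests

API_ENDPOINT = "https://youtube-v31.p.rapidapi.com/commentThreads"
HEADERS = {
    "x-rapidapi-key": "your_api_key_here",  # your real RapidAPI key goes here
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}

def fetch_comments(video_id):
    # One request per video ID - the endpoint only takes a single videoId
    params = {"part": "snippet", "videoId": video_id, "maxResults": "50"}
    response = requests.get(API_ENDPOINT, headers=HEADERS, params=params)
    response.raise_for_status()
    return response.json()

video_ids = ["video_id_1", "video_id_2"]  # placeholder - your ~200 IDs go here
all_comments = {}
for video_id in video_ids:
    all_comments[video_id] = fetch_comments(video_id)
    time.sleep(0.1)  # 100 ms pause between requests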

Indeed, the YouTube API requires an individual request for each video ID. I ran into the same problem when extracting comments from over 300 videos for a sentiment analysis project. To manage it, I implemented a basic queue with retry logic: put all the video IDs in a list, process them sequentially with solid error handling, and save your progress to a file or database so you can pick up where you left off after any interruption.

A 200ms delay between requests strikes a good balance between speed and staying under the rate limits. Also verify that the videos are still public, so you don’t exhaust your quota on dead links. And don’t overlook the nextPageToken — you need it to page through videos with large numbers of comments. My run took about six hours and completed smoothly overnight.
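
Here’s a minimal sketch of that loop, assuming the endpoint and headers from the original post, a hypothetical comments_progress.json checkpoint file, a placeholder video_ids list, and that the RapidAPI wrapper passes the standard pageToken parameter through to the API:

import json
import os
import time
import requests

API_ENDPOINT = "https://youtube-v31.p.rapidapi.com/commentThreads"
HEADERS = {
    "x-rapidapi-key": "your_api_key_here",
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}
PROGRESS_FILE = "comments_progress.json"  # hypothetical checkpoint file

def fetch_all_comment_pages(video_id, delay=0.2):
    # Follow nextPageToken until the video's comment threads are exhausted
    items, page_token = [], None
    while True:
        params = {"part": "snippet", "videoId": video_id, "maxResults": "50"}
        if page_token:
            params["pageToken"] = page_token
        response = requests.get(API_ENDPOINT, headers=HEADERS, params=params)
        response.raise_for_status()
        data = response.json()
        items.extend(data.get("items", []))
        page_token = data.get("nextPageToken")
        if not page_token:
            return items
        time.sleep(delay)  # 200 ms between pages

# Resume from the checkpoint so an interruption doesn't lose finished videos
results = {}
if os.path.exists(PROGRESS_FILE):
    with open(PROGRESS_FILE) as f:
        results = json.load(f)

video_ids = ["video_id_1", "video_id_2"]  # placeholder list
for video_id in video_ids:
    if video_id in results:
        continue  # already collected on a previous run
    try:
        results[video_id] = fetch_all_comment_pages(video_id)
    except requests.HTTPError as exc:
        print(f"Skipping {video_id}: {exc}")  # e.g. private or deleted video
        results[video_id] = None
    with open(PROGRESS_FILE, "w") as f:
        json.dump(results, f)
    time.sleep(0.2)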

YouTube Data API v3 doesn’t support batch requests for comments, so you’ll have to loop through each video ID separately. I dealt with this when analyzing comments from ~150 videos last year. You need solid rate limiting and error handling or you’ll blow through your daily quota fast. Use exponential backoff when you hit limits and save results as you go - trust me, you don’t want to lose hours of progress. Start with 100ms delays between requests, but bump it up if you get 403 errors. Also split your video list across multiple days since the 10,000 unit daily quota disappears quickly with comment requests. Each commentThreads call is 1 unit, but you’ll need multiple calls per video if there are tons of comments.
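
For the backoff part, here’s one way to do it — a sketch, not tested against this exact wrapper, that retries on 403/429 with doubling waits (note that a persistent 403 can also mean comments are disabled for that video, so don’t retry forever):

import time
import requests

def get_with_backoff(url, headers, params, max_retries=5):
    # Retry with exponentially growing waits when the API signals rate limiting
    delay = 1.0
    for _ in range(max_retries):
        response = requests.get(url, headers=headers, params=params)
        if response.status_code not in (403, 429):
            response.raise_for_status()  # surface other errors immediately
            return response.json()
        time.sleep(delay)  # back off: 1s, 2s, 4s, 8s, 16s
        delay *= 2
    raise RuntimeError(f"gave up after {max_retries} rate-limited attempts")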
