How to fetch YouTube comments for multiple video IDs using YouTube Data API v3 through RapidAPI

I’m building a script to gather YouTube comments for research purposes. Right now my code works fine for getting comments from a single video, but I need to process many videos at once. The current setup requires me to manually change the video ID each time, which is really slow.

import requests

api_endpoint = "https://youtube-v31.p.rapidapi.com/commentThreads"

# Query parameters: one videoId per request; maxResults caps comments per page
parameters = {
    "maxResults": "50",
    "videoId": "VIDEO_ID_HERE",
    "part": "snippet"
}

# RapidAPI authentication headers
request_headers = {
    "x-rapidapi-key": "your_api_key_here",
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}

api_response = requests.get(api_endpoint, headers=request_headers, params=parameters)
comment_data = api_response.json()
print(comment_data)

I have a list with hundreds of video IDs that I want to process. Instead of running this manually for each video, I want to loop through all of them automatically. What’s the best way to modify this code so it can handle multiple video IDs from a list or CSV file? Should I use a for loop or is there a way to send multiple IDs in one request?

Yes, you will need to send a separate request for each video ID: the commentThreads endpoint only accepts a single videoId per call. The straightforward approach is to store your video IDs in a list and loop over it, making one API call per ID. It’s crucial, however, to handle errors along the way, both for the API’s rate limits and for videos with restrictions (comments disabled, private, deleted); retry logic with exponential backoff works well for the rate-limit case. I also recommend saving each response to its own JSON file named after the video ID. That way, if the process gets interrupted, you can skip the files that already exist and resume without losing progress.
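Here’s a minimal sketch of that pattern. The video IDs, the fetch_with_backoff helper name, and the retry settings are all placeholders (it assumes the same endpoint and headers as your snippet, and that a 429 status means you were rate limited), so adapt it to your own key and list:

import json
import os
import time

import requests

API_ENDPOINT = "https://youtube-v31.p.rapidapi.com/commentThreads"
HEADERS = {
    "x-rapidapi-key": "your_api_key_here",
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}

def fetch_with_backoff(video_id, max_retries=5):
    # One commentThreads request for a single video, retried with exponential backoff
    params = {"part": "snippet", "videoId": video_id, "maxResults": "50"}
    delay = 1
    for _ in range(max_retries):
        response = requests.get(API_ENDPOINT, headers=HEADERS, params=params)
        if response.status_code == 200:
            return response.json()
        if response.status_code == 429:
            # Rate limited: wait, then double the delay and try again
            time.sleep(delay)
            delay *= 2
            continue
        # Other errors (e.g. 403 for disabled comments) are not worth retrying
        response.raise_for_status()
    raise RuntimeError(f"Gave up on {video_id} after {max_retries} attempts")

video_ids = ["VIDEO_ID_1", "VIDEO_ID_2"]  # replace with your real list

for video_id in video_ids:
    out_path = f"{video_id}.json"
    if os.path.exists(out_path):
        continue  # already fetched on a previous run, so skip it
    comment_data = fetch_with_backoff(video_id)
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(comment_data, f, ensure_ascii=False)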

YouTube’s API doesn’t do batch requests for comments - you’ve got to hit each video ID separately. I hit this same wall when pulling sentiment data across channels. Here’s what worked: wrap your current code in a function that takes a video ID, then loop through your list. Make sure you handle errors properly since some videos have disabled comments or are private. I throw a try-except around each API call and dump failed requests to a log file for retries later. Rate limiting saved my butt too - I add 1-2 seconds between requests to avoid quota hell. If you’re doing hundreds of videos, split them into chunks and spread the work across multiple days.
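Something like this is what I mean (the function names, the 1.5-second pause, and the log filename are just illustrative; tune the pause to whatever your RapidAPI plan allows):

import time

import requests

API_ENDPOINT = "https://youtube-v31.p.rapidapi.com/commentThreads"
HEADERS = {
    "x-rapidapi-key": "your_api_key_here",
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}

def get_comments(video_id):
    # One API call per video; raises for HTTP errors (e.g. comments disabled, private video)
    params = {"part": "snippet", "videoId": video_id, "maxResults": "50"}
    response = requests.get(API_ENDPOINT, headers=HEADERS, params=params)
    response.raise_for_status()
    return response.json()

def process_batch(video_ids, pause_seconds=1.5, failure_log="failed_ids.log"):
    results = {}
    for video_id in video_ids:
        try:
            results[video_id] = get_comments(video_id)
        except requests.RequestException as exc:
            # Dump failed IDs to a log file so they can be retried later
            with open(failure_log, "a", encoding="utf-8") as log:
                log.write(f"{video_id}\t{exc}\n")
        time.sleep(pause_seconds)  # breathing room between requests
    return results

comments = process_batch(["VIDEO_ID_1", "VIDEO_ID_2"])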

You have to loop through each video ID, since the API doesn’t accept multiple at once. Just set parameters['videoId'] = video_id for each one and make the request. Also, don’t forget a time.sleep() between calls to avoid hitting rate limits.
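Quick sketch, covering the CSV part of your question too (the video_ids.csv filename and its video_id column are made up, swap in your own):

import csv
import time

import requests

api_endpoint = "https://youtube-v31.p.rapidapi.com/commentThreads"
request_headers = {
    "x-rapidapi-key": "your_api_key_here",
    "x-rapidapi-host": "youtube-v31.p.rapidapi.com"
}
parameters = {"maxResults": "50", "part": "snippet"}

# read one video ID per row from a "video_id" column
with open("video_ids.csv", newline="", encoding="utf-8") as f:
    video_ids = [row["video_id"] for row in csv.DictReader(f)]

for video_id in video_ids:
    parameters["videoId"] = video_id
    api_response = requests.get(api_endpoint, headers=request_headers, params=parameters)
    print(api_response.json())
    time.sleep(1)  # pause between calls so you don't trip the rate limit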