How to properly append multiple image URLs to an Airtable field when handling simultaneous requests with backend processing delays

I’m building a messaging bot that processes multiple photos sent by users at the same time. The problem is that when users upload several images together, my server gets all the requests instantly but it takes some time for the database to actually save each URL.

Because of this timing, every request reads the same old field value before the previous write has landed, so my code keeps replacing the cell content instead of adding the new URLs to what’s already there.

Here’s my current setup:

from flask import Flask, request

app = Flask(__name__)
# db is my own Airtable wrapper (defined elsewhere)

@app.route('/process_media', methods=['POST'])
def handle_request():
    payload = request.json
    user_id = payload['userId']
    username = payload['userName']
    
    if payload['mediaType'] == 'photo':
        fresh_photo_url = payload.get('mediaData')
        print("New URL: ", fresh_photo_url)
        
        # Read whatever is currently stored for this user
        record_id, current_photos = db.fetch_data(user_id, "photos_collected")
        print(record_id, current_photos)
        
        if current_photos is None:
            current_photos = ""
        
        # Append the new URL and write the whole field back
        photo_list = current_photos.split("\n")
        photo_list.append(fresh_photo_url)
        
        combined_photos = "\n".join(photo_list)
        print(combined_photos)
        
        db.save_photo_urls(record_id, combined_photos)
    
    return payload

The terminal shows all URLs being processed but only the final one gets stored in the database. How can I fix this so that all image URLs get properly added to the existing field content even when multiple requests come in at once and there’s a delay in the backend updates?

Classic race condition. Multiple requests read the same old value before any of them finish writing back.

I hit this exact problem building a file upload system for batch uploads. Here’s what actually works:

Use database-level locking or atomic operations. Most databases support this:

# Use a transaction with SELECT FOR UPDATE
# (fetch_with_lock / update / transaction are assumed methods on your db wrapper)
def save_photo_urls_atomic(record_id, new_url):
    with db.transaction():
        # The row stays locked until the transaction ends, so concurrent
        # requests wait here instead of reading a stale value
        current_photos = db.fetch_with_lock(record_id, "photos_collected")
        if current_photos is None:
            current_photos = ""
        
        photo_list = current_photos.split("\n") if current_photos else []
        photo_list.append(new_url)
        combined_photos = "\n".join(photo_list)
        
        db.update(record_id, combined_photos)
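
For a concrete picture, here’s roughly what that looks like with psycopg2. Airtable’s own API doesn’t expose transactions or row locks, so this assumes the URLs are staged in something like Postgres behind your wrapper; the photo_records table and the connection string are made up for illustration:

import psycopg2

def append_photo_url_postgres(conn, record_id, new_url):
    # The FOR UPDATE row lock makes concurrent requests queue up here
    # instead of all reading the same stale value.
    with conn:                       # commits on success, rolls back on error
        with conn.cursor() as cur:
            cur.execute(
                "SELECT photos_collected FROM photo_records WHERE id = %s FOR UPDATE",
                (record_id,),
            )
            row = cur.fetchone()
            current = row[0] if row and row[0] else ""

            photo_list = current.split("\n") if current else []
            photo_list.append(new_url)

            cur.execute(
                "UPDATE photo_records SET photos_collected = %s WHERE id = %s",
                ("\n".join(photo_list), record_id),
            )

# Usage (hypothetical connection string):
# conn = psycopg2.connect("dbname=photos")
# append_photo_url_postgres(conn, record_id, fresh_photo_url)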

Or if your database supports it, append directly:

# Some databases let you do this atomically
db.append_to_field(record_id, "photos_collected", "\n" + fresh_photo_url)
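
If the wrapper can run raw SQL, the whole append can even be a single UPDATE, so there’s no read-modify-write window at all. A Postgres-flavoured sketch, using the same made-up photo_records table as above:

# chr(10) is a newline; NULLIF/COALESCE handle an empty or NULL field so the
# first URL doesn't get a leading blank line. The database does the
# concatenation itself, so two concurrent appends can't overwrite each other.
cur.execute(
    "UPDATE photo_records "
    "SET photos_collected = COALESCE(NULLIF(photos_collected, '') || chr(10), '') || %s "
    "WHERE id = %s",
    (fresh_photo_url, record_id),
)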

You could also queue the URLs and process them one by one instead of handling concurrent writes. I’ve used Redis for this - works great.
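
A rough sketch of the queue approach with redis-py - the handler only enqueues, and a single worker is the only thing that ever writes, so appends to a record can’t race. The queue name and the reuse of the db helpers from the question are assumptions:

import json
import redis

r = redis.Redis()

# In the Flask handler: enqueue and return immediately
def enqueue_photo(user_id, photo_url):
    r.rpush("photo_queue", json.dumps({"user_id": user_id, "url": photo_url}))

# In a separate worker process/thread: drain the queue one item at a time
def run_worker():
    while True:
        _, raw = r.blpop("photo_queue")   # blocks until an item arrives
        job = json.loads(raw)
        # db = the asker's Airtable wrapper from the question
        record_id, current = db.fetch_data(job["user_id"], "photos_collected")
        photo_list = current.split("\n") if current else []
        photo_list.append(job["url"])
        db.save_photo_urls(record_id, "\n".join(photo_list))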

Bottom line: only one request can modify that field at a time. Without proper locking, you’ll keep losing data regardless of how you structure the code.

Had the same nightmare with batch file processing last year. The issue isn’t just race conditions - Airtable’s API also struggles with rapid successive updates to the same record, and bursts of requests can hit its rate limits.

A simple retry mechanism with exponential backoff saved me. When multiple requests hit at once, some fail initially but work after the database settles.

import time
import random

def save_with_retry(user_id, new_url, max_attempts=3):
    # Takes user_id so it can reuse db.fetch_data the same way the route does
    for attempt in range(max_attempts):
        try:
            record_id, current_photos = db.fetch_data(user_id, "photos_collected")
            if current_photos is None:
                current_photos = ""
            
            photo_list = current_photos.split("\n") if current_photos else []
            if new_url not in photo_list:  # prevent duplicates
                photo_list.append(new_url)
            
            combined_photos = "\n".join(photo_list)
            db.save_photo_urls(record_id, combined_photos)
            return True
            
        except Exception:
            if attempt < max_attempts - 1:
                # exponential backoff plus a little jitter before retrying
                time.sleep(0.1 * (2 ** attempt) + random.uniform(0, 0.1))
            else:
                raise

This fixes both the timing issue and Airtable’s occasional hiccups. The random jitter stops multiple retries from colliding again.
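
Wiring it into the route from the question would look something like this:

if payload['mediaType'] == 'photo':
    fresh_photo_url = payload.get('mediaData')
    save_with_retry(user_id, fresh_photo_url)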
