How to handle multiple file upload and download in Spring Boot REST endpoints

I’m working on a Spring Boot application where I need to create REST endpoints that can handle multiple files at once. For the upload endpoint, I want users to be able to send several files in one request. For the download endpoint, I need to return multiple files back to the client.

Here’s what I’m trying to achieve:

@PostMapping("/files/batch-upload")
public ResponseEntity<?> handleMultipleFiles(@RequestParam("documents") MultipartFile[] documents) {
    // Process multiple uploaded files
    return ResponseEntity.ok().build();
}

@GetMapping("/files/batch-download")
public ResponseEntity<Resource[]> getMultipleFiles(@RequestParam String requestId) {
    // Return multiple files to client
    return ResponseEntity.ok().build();
}

For each file I’m handling, I need access to the filename, content type, and the actual byte data. Creating a zip archive is not an option for my use case since the client application needs to process individual files separately.

I’m primarily using Spring Boot but I’m open to integrating other libraries if needed. Has anyone implemented something similar? What’s the recommended approach for this kind of multi-file handling in REST APIs?

Multipart config matters way more than people realize. I learned this the hard way building a media processing API for video thumbnails.

Your upload code looks good, but Spring Boot’s default multipart limits will bite you. Files cap at 1MB, total requests at 10MB. Fix it in application.properties:

spring.servlet.multipart.max-file-size=50MB
spring.servlet.multipart.max-request-size=200MB

Downloads are trickier since HTTP doesn’t do multiple files natively. I went with batch download sessions:

@PostMapping("/files/prepare-batch")
public ResponseEntity<BatchDownloadResponse> prepareBatch(@RequestBody List<String> fileIds) {
    String sessionId = UUID.randomUUID().toString();
    downloadSessionService.createSession(sessionId, fileIds);
    return ResponseEntity.ok(new BatchDownloadResponse(sessionId, fileIds.size()));
}

@GetMapping("/files/batch/{sessionId}/next")
public ResponseEntity<Resource> getNextFile(@PathVariable String sessionId) {
    return downloadSessionService.getNextFile(sessionId);
}

Client hits prepare first, then polls next until it gets everything. Sessions expire after 30 minutes.
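
If you store sessions in a plain map, the 30-minute expiry is easy to get wrong. Here's a minimal sketch of the store I mean, assuming the Caffeine library (com.github.ben-manes.caffeine:caffeine); all the names are mine, not a fixed API:

import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.time.Duration;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class DownloadSessionStore {
    // Entries evict themselves 30 minutes after creation; no manual cleanup thread needed
    private final Cache<String, Queue<String>> sessions = Caffeine.newBuilder()
            .expireAfterWrite(Duration.ofMinutes(30))
            .build();

    public void createSession(String sessionId, List<String> fileIds) {
        sessions.put(sessionId, new ConcurrentLinkedQueue<>(fileIds));
    }

    public String nextFileId(String sessionId) {
        Queue<String> queue = sessions.getIfPresent(sessionId);
        return queue == null ? null : queue.poll(); // null: expired, unknown, or exhausted
    }
}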

Worked great for mobile clients - they could pause and resume batches without hassle.

The Problem: You’re building a Spring Boot REST API that needs to handle multiple file uploads and downloads without zipping the files. The client application requires processing individual files separately. You’ve attempted to use @RequestParam("documents") MultipartFile[] documents for uploads and @GetMapping for downloads, but you’re unsure of the best approach for efficient and scalable multi-file handling in a RESTful context.

:thinking: Understanding the “Why” (The Root Cause): An HTTP response carries a single body. While you can upload multiple files in one request using MultipartFile[], there is no reliable way to return multiple files in a single response; browsers and most HTTP clients won’t handle it, so simply returning a Resource[] will not work. This necessitates a different approach for downloads. Furthermore, loading files fully into memory (file.getBytes()) is memory-intensive and can cause performance problems, especially with multiple large files.

:gear: Step-by-Step Guide: The recommended approach involves asynchronous processing and a change in how you manage downloads. We will implement a system using a “prepare” endpoint for initiating the download process and a polling endpoint to retrieve files one by one.

Step 1: Stream Uploads: Modify your upload endpoint to process files individually using streams instead of loading them fully into memory:

@PostMapping("/files/batch-upload")
public ResponseEntity<?> handleMultipleFiles(@RequestParam("documents") MultipartFile[] documents) {
    for (MultipartFile file : documents) {
        try (InputStream stream = file.getInputStream()) {
            // Process each file using the stream
            processFileStream(stream, file.getOriginalFilename(), file.getContentType());
        } catch (IOException e) {
            // Handle exceptions appropriately, perhaps log and return an error response
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).body("Error processing file: " + e.getMessage());
        }
    }
    return ResponseEntity.ok().build();
}
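
processFileStream is whatever your processing logic needs to be; as a hedged example, here is a version that stages each upload to disk without buffering it in memory (the directory and method name are assumptions):

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

private void processFileStream(InputStream stream, String filename, String contentType) throws IOException {
    // getFileName() strips any directory components, guarding against path traversal
    Path target = Path.of("/tmp/uploads").resolve(Path.of(filename).getFileName());
    Files.createDirectories(target.getParent());
    // Copies directly from the request stream to disk; the file is never held in memory
    Files.copy(stream, target, StandardCopyOption.REPLACE_EXISTING);
}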

Step 2: Implement Asynchronous Download Session Management: Create a service (DownloadSessionService) to manage download sessions. This will handle tracking which files have been sent and maintaining the session state. A unique session ID (UUID) will be used to identify each download session.
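
A minimal in-memory sketch of such a service; FileStore and all internals here are illustrative assumptions rather than a prescribed implementation (session expiry is omitted for brevity):

import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;
import org.springframework.core.io.Resource;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;

@Service
public class DownloadSessionService {

    // Hypothetical abstraction that loads stored files as Resources
    interface FileStore { Resource load(String fileId); }

    private final Map<String, Queue<String>> sessions = new ConcurrentHashMap<>();
    private final FileStore fileStore;

    public DownloadSessionService(FileStore fileStore) {
        this.fileStore = fileStore;
    }

    public void createSession(String sessionId, List<String> fileIds) {
        sessions.put(sessionId, new ConcurrentLinkedQueue<>(fileIds));
    }

    public ResponseEntity<Resource> getNextFile(String sessionId) {
        Queue<String> queue = sessions.get(sessionId);
        if (queue == null) {
            return ResponseEntity.status(HttpStatus.GONE).build(); // unknown or expired session
        }
        String fileId = queue.poll();
        if (fileId == null) {
            sessions.remove(sessionId);                // session exhausted
            return ResponseEntity.noContent().build(); // 204 tells the client to stop polling
        }
        Resource resource = fileStore.load(fileId);
        return ResponseEntity.ok()
                .header(HttpHeaders.CONTENT_DISPOSITION,
                        "attachment; filename=\"" + resource.getFilename() + "\"")
                .body(resource);
    }
}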

Step 3: Create Prepare and Next File Endpoints: Create two endpoints: one to prepare a download session and another to retrieve the next file in the session:

@PostMapping("/files/prepare-batch")
public ResponseEntity<BatchDownloadResponse> prepareBatch(@RequestBody List<String> fileIds) {
    String sessionId = UUID.randomUUID().toString();
    downloadSessionService.createSession(sessionId, fileIds);
    return ResponseEntity.ok(new BatchDownloadResponse(sessionId, fileIds.size()));
}

@GetMapping("/files/batch/{sessionId}/next")
public ResponseEntity<Resource> getNextFile(@PathVariable String sessionId) {
    return downloadSessionService.getNextFile(sessionId);
}

BatchDownloadResponse is a custom class carrying the sessionId and the total number of files (a minimal sketch follows). The getNextFile method in DownloadSessionService should return the next file in the session, or an empty response (e.g., 204 No Content) once the session is complete or invalid. Implement appropriate session expiration logic (e.g., after 30 minutes).
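
For completeness, the DTO can be as small as a Java record (the field name beyond sessionId is an assumption):

public record BatchDownloadResponse(String sessionId, int fileCount) { }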

Step 4: Client-Side Polling: The client application will now need to (a minimal polling loop in plain Java follows this list):

  • Call the /files/prepare-batch endpoint with the list of desired file IDs to initiate a download session.
  • Poll the /files/batch/{sessionId}/next endpoint repeatedly until all files have been retrieved.
  • Handle potential errors (e.g., session expiration, file not found).
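
A rough client-side sketch using java.net.http.HttpClient; the base URL, file IDs, and the naive sessionId extraction are placeholders rather than part of the API above:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BatchDownloadClient {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String base = "http://localhost:8080";

        // 1. Prepare the session
        HttpRequest prepare = HttpRequest.newBuilder(URI.create(base + "/files/prepare-batch"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("[\"file-1\",\"file-2\"]"))
                .build();
        String json = client.send(prepare, HttpResponse.BodyHandlers.ofString()).body();
        // Naive extraction for the sketch; use a real JSON library in practice
        String sessionId = json.replaceAll(".*\"sessionId\"\\s*:\\s*\"([^\"]+)\".*", "$1");

        // 2. Poll until the server signals completion
        while (true) {
            HttpRequest next = HttpRequest.newBuilder(
                    URI.create(base + "/files/batch/" + sessionId + "/next")).GET().build();
            HttpResponse<byte[]> resp = client.send(next, HttpResponse.BodyHandlers.ofByteArray());
            if (resp.statusCode() == 204) break; // session exhausted
            if (resp.statusCode() != 200) throw new IllegalStateException("Batch failed: " + resp.statusCode());
            // Filename arrives in Content-Disposition; resp.body() holds the file bytes
        }
    }
}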

Step 5: Configure Multipart File Limits (Spring Boot): Increase the maximum file size and request size limits in your application.properties file to avoid hitting default limits:

spring.servlet.multipart.max-file-size=50MB
spring.servlet.multipart.max-request-size=200MB

:mag: Common Pitfalls & What to Check Next:

  • Error Handling: Implement robust error handling for both upload and download processes. Handle IOExceptions during file processing, invalid session IDs, and file not found errors. Return informative error messages to the client.
  • Session Expiration: Configure a reasonable session expiration time to prevent resource exhaustion. Consider adding session cleanup mechanisms.
  • File Storage: Ensure your chosen file storage mechanism (e.g., filesystem, cloud storage) is appropriately configured and can handle the expected volume of files.
  • Concurrency: If you expect a high volume of concurrent requests, consider using a queuing system (e.g., RabbitMQ, Kafka) to manage file processing and downloads asynchronously.
  • Security: Protect your download endpoints with appropriate authentication and authorization mechanisms. Use short-lived, signed tokens for download URLs if you need finer-grained control (a minimal HMAC sketch follows this list).
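
A minimal sketch of such a signed token, assuming an HMAC-SHA256 scheme; every name here is illustrative:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public final class DownloadTokens {
    private static final String ALGO = "HmacSHA256";

    // Token = base64url(fileId:expiry) + "." + base64url(HMAC(payload));
    // verification recomputes the MAC and checks the expiry timestamp
    public static String sign(String fileId, long expiresAtEpochSec, byte[] secret) throws Exception {
        String payload = fileId + ":" + expiresAtEpochSec;
        Mac mac = Mac.getInstance(ALGO);
        mac.init(new SecretKeySpec(secret, ALGO));
        Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();
        return enc.encodeToString(payload.getBytes(StandardCharsets.UTF_8))
                + "." + enc.encodeToString(mac.doFinal(payload.getBytes(StandardCharsets.UTF_8)));
    }
}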

:speech_balloon: Still running into issues? Share your (sanitized) config files, the exact command you ran, and any other relevant details. The community is here to help!

Been through this nightmare before with a file processing service. Upload’s easy enough, but large files will destroy your memory if you’re not careful. I switched to streaming instead of dumping everything into byte arrays - saved my sanity. For downloads, since you can’t zip them, build a download queue. Make an endpoint that takes the request and spits back a job ID. Then add a status endpoint where clients can check if it’s done. When it’s ready, return file URLs with short-lived signed tokens. This handles timeouts way better and you actually control your resources. I also added retry logic because network issues happen constantly with multiple file transfers. Yeah, polling feels clunky compared to straight REST calls, but it’s rock solid in production where file sizes and connections are all over the place.
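
To make that concrete, here's a rough sketch of the job-ID pattern I mean; jobService, BatchJob, and the endpoint paths are stand-ins for whatever you actually use:

@PostMapping("/files/batch-jobs")
public ResponseEntity<Map<String, String>> createJob(@RequestBody List<String> fileIds) {
    String jobId = UUID.randomUUID().toString();
    jobService.startAsync(jobId, fileIds); // hypothetical: kicks off background preparation
    return ResponseEntity.accepted().body(Map.of("jobId", jobId)); // 202: work in progress
}

@GetMapping("/files/batch-jobs/{jobId}")
public ResponseEntity<Map<String, Object>> jobStatus(@PathVariable String jobId) {
    BatchJob job = jobService.find(jobId); // BatchJob is a hypothetical record: (String status, List<String> signedUrls)
    if (job == null) return ResponseEntity.notFound().build();
    // Once status is DONE, signedUrls() carries one short-lived URL per file
    return ResponseEntity.ok(Map.of("status", job.status(), "files", job.signedUrls()));
}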

Async file handling was a game changer for me. Skip the complex session management - just use CompletableFuture for uploads and server-sent events for download progress. The client subscribes to events, then the server pushes file URLs when they’re ready. Much cleaner than polling and handles failures way better than batch processing.
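
For what it's worth, a minimal sketch of that idea using Spring MVC's SseEmitter; processingService and its callback hooks are assumptions:

@GetMapping("/files/batch/{jobId}/events")
public SseEmitter streamFileEvents(@PathVariable String jobId) {
    SseEmitter emitter = new SseEmitter(10 * 60 * 1000L); // 10-minute timeout
    processingService.onFileReady(jobId, url -> {         // hypothetical callback hook
        try {
            emitter.send(SseEmitter.event().name("file-ready").data(url)); // push each URL as it's ready
        } catch (IOException e) {
            emitter.completeWithError(e);
        }
    });
    processingService.onBatchComplete(jobId, emitter::complete); // hypothetical completion hook
    return emitter;
}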

I’ve dealt with this exact scenario in a document management system. Your upload approach works well, but add file size validation to prevent memory issues with large batches. For downloads, HTTP limitations are the real problem. Since you can’t zip files, I used a hybrid approach: return a JSON response with file metadata and temporary download tokens. Each token maps to a specific file that gets downloaded via separate GET requests.

@GetMapping("/files/batch-download")
public ResponseEntity<Map<String, Object>> getMultipleFiles(@RequestParam String requestId) {
    List<?> files = fileService.getFilesByRequest(requestId);
    Map<String, String> downloadTokens = generateDownloadTokens(files);
    return ResponseEntity.ok(Map.of("files", files, "tokens", downloadTokens));
}

File upload endpoints get messy quick, especially with validation, processing, and storage.

Your upload approach looks good. Add proper validation and consider @RequestPart over @RequestParam if you need metadata with files:

@PostMapping("/files/batch-upload")
public ResponseEntity<?> handleMultipleFiles(@RequestPart("documents") MultipartFile[] documents) {
    for (MultipartFile file : documents) {
        String filename = file.getOriginalFilename();
        String contentType = file.getContentType();
        byte[] data = file.getBytes();
        // Process each file
    }
    return ResponseEntity.ok().build();
}
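
If you do need metadata alongside the files, here's a sketch of how a JSON part pairs with the file parts (UploadMetadata is a hypothetical DTO that Spring deserializes from the JSON part via its message converters):

@PostMapping(value = "/files/batch-upload", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
public ResponseEntity<?> uploadWithMetadata(
        @RequestPart("metadata") UploadMetadata metadata,      // JSON part
        @RequestPart("documents") MultipartFile[] documents) { // file parts
    // metadata (tags, owner, etc.) applies to the whole batch
    return ResponseEntity.ok().build();
}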

Downloads are trickier. Returning multiple files without zipping doesn’t work well with REST. You could return JSON with file metadata and separate download URLs.

This whole workflow screams automation though. Skip the complex Spring controllers and set up automated pipelines that handle uploads, validation, storage, and download URL generation.

I’ve built similar systems where files get processed automatically, validated, and made available through generated endpoints. Everything runs without manual work and scales better than traditional REST.

Latenode makes this file automation super straightforward. Build workflows that handle multiple file operations, integrate with cloud storage, and generate download links automatically.

Check it out: https://latenode.com

Multipart uploads work fine, but downloads are tricky. You’ll need to stream files one by one or use chunked responses. I tried returning Resource arrays before; it doesn’t work, because browsers can’t handle multiple files in a single HTTP response. Better to return JSON with file IDs, then let the client make separate requests for each file.
