I’m dealing with a caching problem after implementing JS compression in our build process.
Our web application contains hundreds of JavaScript files that we recently started minifying during the build process to improve performance. The minification works great, but I’ve run into an unexpected issue.
When we deploy the minified files to our production server, their timestamps get updated to the current deployment time. This causes browsers to treat all JS files as “new” content, even if the actual source code hasn’t changed. As a result, clients end up downloading our entire JavaScript bundle on every release, which actually hurts performance instead of helping it.
// Example of our build process
const { writeFile } = require('fs/promises');
const path = require('path');

const compressFiles = async (sourceDir, outputDir) => {
  const jsFiles = await getJavaScriptFiles(sourceDir); // our own helper
  for (const file of jsFiles) {
    const minifiedContent = await minifyJS(file.content);
    await writeFile(path.join(outputDir, file.name), minifiedContent);
    // Problem: this creates new timestamps!
  }
};
Has anyone else faced this challenge? I’m curious about different approaches people use. Do you maintain separate development and production versions of your JS files? Or maybe there’s a way to preserve original file modification dates during the minification process?
I’m considering tracking actual file changes through our version control system, but wanted to see what solutions others have implemented first.
Had the same issue when we switched to automated builds. Content-based versioning fixed it for us - way better than relying on timestamps. We hash each minified file’s content and stick it on the filename during deployment. Only files that actually changed get new names, everything else stays cached. You can preserve timestamps with Node’s fs.utimes() if you want - just grab the original timestamp before minifying and restore it after. But honestly, content hashing works better since timestamps get screwed up all the time in deployment pipelines or when moving files between systems.
We ditched file timestamps and switched to ETags - way better solution. Our build process creates strong ETags from file content hashes, so the server actually knows if cached resources changed, no matter when you deployed them. Browser sends the ETag with each request and gets a 304 if nothing’s different. Works much better than trying to preserve timestamps since deployment systems usually screw with file dates anyway. You can set this up server-side with Apache or Nginx - both handle it out of the box. Just make sure you’re generating ETags from content, not inode data.
we fixed this by adding a checksum comparison before file updates. check whether the minified output actually differs from what's already deployed, and only replace the files that changed. this keeps timestamps intact for unchanged files so browsers don't redownload everything. simple fix that works with any build tool.