I need to implement a data archiving system in my PHP application that allows users to browse historical records from past months.
My current approach is to create a corresponding archive table for each existing table (users_archived, orders_archived, etc.) and use INSERT INTO ... SELECT statements to move old data into them. However, I’m working with a database containing roughly 300 tables, so this would mean adding another 300 archive tables.
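To make that concrete, this is roughly the per-table step I have now (a simplified sketch; the orders / orders_archived names, the created_at column, and the cutoff date are placeholders for illustration):

```sql
-- Move rows older than the cutoff from the live table to its archive twin.
START TRANSACTION;

INSERT INTO orders_archived
SELECT * FROM orders
WHERE created_at < '2025-01-01';

DELETE FROM orders
WHERE created_at < '2025-01-01';

COMMIT;
```

I would have to repeat this pair of statements for every one of the ~300 tables.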
My main concerns are:
- Will doubling the number of tables (from 300 to 600) impact my application’s performance?
- How can I prevent users from modifying archived data? I need to restrict INSERT, UPDATE, and DELETE operations on these archive tables (the only idea I have so far is sketched after this list).
- Are there more efficient alternatives to this table duplication approach?
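For the write-restriction concern, the only idea I've come up with so far is routing all archive reads through a separate MySQL account that is granted SELECT only on the archive tables. This is just a sketch of what I mean, not something I've deployed; the account, database, and table names are made up:

```sql
-- Hypothetical read-only account used by the archive-browsing part of the app.
CREATE USER 'archive_reader'@'%' IDENTIFIED BY 'change_me';

-- SELECT only, so any INSERT/UPDATE/DELETE attempted through this account fails.
GRANT SELECT ON app_db.users_archived TO 'archive_reader'@'%';
GRANT SELECT ON app_db.orders_archived TO 'archive_reader'@'%';
-- ...and so on for each of the ~300 archive tables, which already feels unwieldy.
```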
I’m looking for advice on database design patterns for archiving or any PHP-specific solutions that might work better for this scenario.