PHP script for importing Gmail to database hits limit - any workarounds?

I’m stuck with a problem and need some advice. I wrote a script that parses emails from a cPanel mailbox and stores them in a MySQL database. Now my client wants to import old emails from about 50 Gmail accounts. Some have over 20,000 messages.

When I try to import, the script stops after about 7,000 emails. No errors show up. It just quits. I think I’m hitting Gmail’s IMAP bandwidth limit of 750MB per hour.

How can I confirm if this is the issue? And what’s the best way to get around it?

I thought about importing in smaller chunks, but that would take too long. Another idea is to move all the emails to a CPanel account first, then use my existing script to process them.

Has anyone dealt with this before? What would you suggest? Thanks for any help!

I’ve actually tackled this problem before when migrating a large email archive for a client. Here’s what worked for me:

Instead of using IMAP, I used the Gmail API. It’s more robust and has higher quotas for this kind of bulk read. You’ll need to set up OAuth2 authentication, but it’s worth it for large imports.
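For reference, here’s a minimal sketch of that approach, assuming the google/apiclient Composer package and an already-authorized OAuth2 setup (credentials.json and token.json are placeholder filenames, and token refresh is omitted for brevity):

```php
<?php
require 'vendor/autoload.php';

use Google\Client;
use Google\Service\Gmail;

// Authenticate with a pre-authorized OAuth2 token (placeholder file names).
$client = new Client();
$client->setAuthConfig('credentials.json');
$client->addScope(Gmail::GMAIL_READONLY);
$client->setAccessToken(json_decode(file_get_contents('token.json'), true));

$gmail = new Gmail($client);

$pageToken = null;
do {
    // List message IDs in pages of up to 500.
    $list = $gmail->users_messages->listUsersMessages('me', [
        'maxResults' => 500,
        'pageToken'  => $pageToken,
    ]);

    foreach ((array) $list->getMessages() as $ref) {
        // Fetch the full RFC 822 source of each message.
        $msg = $gmail->users_messages->get('me', $ref->getId(), ['format' => 'raw']);
        $raw = base64_decode(strtr($msg->getRaw(), '-_', '+/')); // base64url -> bytes
        // ...parse $raw and insert into MySQL here...
    }

    $pageToken = $list->getNextPageToken();
} while ($pageToken);
```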

I broke the import into batches of 1000 emails, with a 5-minute pause between batches. This approach kept me well under Google’s rate limits.
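The batching itself can be as simple as a loop like this (fetchNextBatch() and importBatch() are hypothetical stand-ins for your own fetch and insert code):

```php
<?php
$batchSize = 1000;
$pauseSeconds = 300; // 5-minute pause between batches

while (true) {
    // fetchNextBatch() is a placeholder for whatever pulls the next
    // $batchSize messages (Gmail API, IMAP, MBOX reader, ...).
    $batch = fetchNextBatch($batchSize);
    if (empty($batch)) {
        break; // nothing left to import
    }

    importBatch($batch); // placeholder for your DB insert logic

    echo date('c') . ' imported ' . count($batch) . " messages\n";
    sleep($pauseSeconds); // stay well under the rate limits
}
```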

Also, I optimized my database insertions by using bulk inserts rather than individual queries. This dramatically sped up the process on the database side.
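To illustrate the bulk inserts, here’s a sketch using PDO with a multi-row INSERT (the emails table and its columns are assumptions, adjust them to your schema):

```php
<?php
// $pdo is an existing PDO connection; $emails is an array of rows like
// ['account' => ..., 'subject' => ..., 'sent_at' => ..., 'body' => ...].
function bulkInsertEmails(PDO $pdo, array $emails): void
{
    if (empty($emails)) {
        return;
    }

    // One placeholder group per row: (?,?,?,?),(?,?,?,?),...
    $rowSql = '(' . implode(',', array_fill(0, 4, '?')) . ')';
    $sql = 'INSERT INTO emails (account, subject, sent_at, body) VALUES '
         . implode(',', array_fill(0, count($emails), $rowSql));

    $params = [];
    foreach ($emails as $e) {
        $params[] = $e['account'];
        $params[] = $e['subject'];
        $params[] = $e['sent_at'];
        $params[] = $e['body'];
    }

    $pdo->prepare($sql)->execute($params);
}
```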

One last tip - implement proper error handling and logging. It’ll save you hours of troubleshooting if something goes wrong mid-import.
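For the logging, even something this basic goes a long way (importSingleEmail() and import_errors.log are just placeholders for your own code and log location):

```php
<?php
foreach ($batch as $i => $email) {
    try {
        importSingleEmail($email); // placeholder for your parse + insert code
    } catch (Throwable $e) {
        // Log the failure and keep going instead of dying mid-import.
        file_put_contents(
            'import_errors.log',
            date('c') . " message #$i failed: " . $e->getMessage() . "\n",
            FILE_APPEND
        );
    }
}
```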

Hope this helps! Let me know if you need any clarification on the API setup.

Try the Google Takeout method. I had a similar issue. Export the emails as mbox, then run your own script to import them. That may avoid the IMAP limit. Test on a small batch first, ok? Cheers.

I’ve encountered similar issues when working with large-scale email imports. Your suspicion about hitting Gmail’s IMAP bandwidth limit is likely correct. To confirm, you could implement logging in your script to track the number of emails processed and the time taken.

As for workarounds, I’ve found success with a hybrid approach. First, use Google Takeout to export the emails in MBOX format. This bypasses the IMAP limitations. Then, write a script to process these MBOX files and import them into your database. This method is much faster and more reliable for large volumes.
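If it helps, PHP’s imap extension can open a local MBOX file directly, so the processing script can look roughly like this (the file path and what you do with each message are assumptions):

```php
<?php
// Open the exported Takeout file with the imap extension
// (empty user/password means "local mailbox file").
$mbox = imap_open('/path/to/export.mbox', '', '');
if ($mbox === false) {
    die('Could not open mbox: ' . imap_last_error());
}

$count = imap_num_msg($mbox);
for ($i = 1; $i <= $count; $i++) {
    $header = imap_headerinfo($mbox, $i);
    $body   = imap_body($mbox, $i);

    // ...reuse your existing parsing/insert logic here, e.g.:
    // saveEmail($header->subject ?? '', $header->date ?? '', $body);
}

imap_close($mbox);
```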

Additionally, consider implementing a queuing system with rate limiting to manage the import process more effectively. This way, you can control the flow and avoid overloading your database or hitting API limits.

Remember to test thoroughly with a smaller subset before running the full import.