I’m working on a React project that displays private Notion pages. I need to authenticate with specific credentials, but I’m running into issues keeping them up to date.
Currently I’m setting up authentication like this:
const notionClient = new NotionService({
  userId: process.env.USER_ID,             // copied from the user_id cookie
  sessionToken: process.env.SESSION_TOKEN  // copied from the session_token cookie
});
Right now I have to get these values manually by opening the browser dev tools, going to the Application tab, finding the Cookies section, and copying the session_token and user_id values.
The main issue is that these authentication tokens expire or change periodically. When this happens, my application suddenly stops working and I have to manually update the environment variables and restart everything.
Is there any way to fetch these authentication values automatically instead of copying them from the browser each time they change? I’m looking for a programmatic solution that avoids the manual process.
Had the exact same problem with a client dashboard that died every few days. Manually refreshing tokens gets old fast, especially when you’re not babysitting it 24/7.

Here’s what fixed it for me: I added a token validation check before every API call. When the token fails, Playwright automatically spins up a browser session, re-authenticates silently, and grabs fresh credentials. Store those in a secure config file your app can read - don’t hardcode them in env vars.

The trick is expecting tokens to expire instead of treating expiry like some weird error. I built the auth layer to fail gracefully and fix itself, and wrapped the whole Notion client in a retry system that auto-refreshes tokens when auth fails. Still unofficial methods, sure, but zero manual work. Your app keeps running and handles token rotation behind the scenes - way better than checking browser cookies every few days.
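To make that concrete, here’s a rough sketch of the idea - not production code. It assumes you keep a Playwright browser profile on disk that is already logged in to Notion (so there’s no scripted login step), that the cookie names match the ones you copy from dev tools, and that NotionService is the wrapper from your question; the file paths and helper names are placeholders.

const { chromium } = require('playwright');
const fs = require('fs');

const PROFILE_DIR = './notion-profile';   // persistent browser profile, already logged in
const CONFIG_PATH = './notion-auth.json'; // config file the app reads instead of env vars

async function refreshNotionTokens() {
  // Reopen the logged-in profile headlessly; visiting the site renews the session.
  const context = await chromium.launchPersistentContext(PROFILE_DIR, { headless: true });
  const page = await context.newPage();
  await page.goto('https://www.notion.so');
  const cookies = await context.cookies('https://www.notion.so');
  await context.close();

  // Same cookies you were copying out of dev tools by hand.
  const creds = {
    userId: cookies.find(c => c.name === 'user_id')?.value,
    sessionToken: cookies.find(c => c.name === 'session_token')?.value,
  };
  fs.writeFileSync(CONFIG_PATH, JSON.stringify(creds, null, 2));
  return creds;
}

// Treat expiry as the normal case: try the call, refresh once on an auth failure, retry.
async function withNotion(call) {
  let creds = JSON.parse(fs.readFileSync(CONFIG_PATH, 'utf8'));
  try {
    return await call(new NotionService(creds));
  } catch (err) {
    if (err.status !== 401) throw err; // assumption: your wrapper surfaces auth failures as 401s
    creds = await refreshNotionTokens();
    return await call(new NotionService(creds));
  }
}

Call sites then go through withNotion(client => ...), using whatever methods your NotionService actually exposes.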
Hit this exact wall last month. Session tokens expire by design - that’s the whole security point. You’re fighting Notion’s system here. I switched to the official API like everyone else mentioned, but if you absolutely need those private pages that aren’t shared through integrations, try a cron job with Selenium to grab fresh tokens every few hours. It’s hacky but beats copying manually every time.
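If you do go that route, here’s a minimal sketch of the cron-plus-Selenium idea in Node, using node-cron and selenium-webdriver. The profile path, schedule, output file, and cookie names are all assumptions - swap in whatever your setup uses, and keep the Chrome profile logged in to Notion so no scripted login is needed.

const cron = require('node-cron');
const fs = require('fs');
const { Builder } = require('selenium-webdriver');
const chrome = require('selenium-webdriver/chrome');

async function grabTokens() {
  // Headless Chrome reusing a profile that is already signed in to Notion.
  const options = new chrome.Options()
    .addArguments('--headless=new', '--user-data-dir=/path/to/logged-in-profile');
  const driver = await new Builder().forBrowser('chrome').setChromeOptions(options).build();
  try {
    await driver.get('https://www.notion.so');
    const cookies = await driver.manage().getCookies();
    const pick = name => cookies.find(c => c.name === name)?.value;
    // Write the fresh values where the app (or a deploy step) can pick them up.
    fs.writeFileSync('./notion-auth.json', JSON.stringify({
      userId: pick('user_id'),
      sessionToken: pick('session_token'),
    }, null, 2));
  } finally {
    await driver.quit();
  }
}

// "Every few hours": here, at minute 0 of every 6th hour.
cron.schedule('0 */6 * * *', grabTokens);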
You’re scraping session cookies meant for browsers, not API access. These tokens rotate constantly because they’re tied to active user sessions.

Use Notion’s Integration API instead. Go to your workspace settings, create an integration, and grab the secret token. Integration tokens don’t expire the way session cookies do.

If you’re stuck with sessions for some reason, automate the token extraction with Puppeteer. Set up a headless browser to authenticate and pull fresh tokens periodically. Not pretty, but it works.

You could also use Notion’s public API with database sharing - make your pages accessible through database shares instead of requiring full session auth. Depends on what security you need.

Fighting session token expiration will give you constant headaches. The integration route is way more stable for anything in production.
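For reference, here’s a minimal sketch of the integration route with Notion’s official JavaScript SDK (@notionhq/client). The NOTION_TOKEN variable name and the page ID are placeholders, and the page (or its parent database) has to be shared with the integration before the API will return it.

const { Client } = require('@notionhq/client');

// Internal integration secret - unlike session cookies, it doesn’t rotate on its own.
const notion = new Client({ auth: process.env.NOTION_TOKEN });

async function loadPage(pageId) {
  const page = await notion.pages.retrieve({ page_id: pageId });          // page metadata
  const blocks = await notion.blocks.children.list({ block_id: pageId }); // page content
  return { page, blocks };
}

// Hypothetical page ID, purely illustrative.
loadPage('00000000-0000-0000-0000-000000000000')
  .then(({ blocks }) => console.log(blocks.results.length, 'top-level blocks'));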
Been there - manual token copying is a nightmare, especially with production apps that need to stay up.
You’re stuck because you’re using unofficial auth methods meant for browsers, not APIs. Those session tokens weren’t built for what you’re doing.
Switch to Notion’s official Integration API with OAuth. Problem is, building OAuth flows, handling token refresh, managing webhooks - it’s a ton of work.
I fixed this with Latenode. It handles the OAuth setup, auto-refreshes expired tokens, and keeps your Notion connection running without any manual work.
Set up scenarios to monitor and refresh tokens in the background. It connects straight to your React app via webhooks or API calls, so your frontend never hits auth failures.
Best part? No OAuth boilerplate or token management code to write. Configure once, done.