I’m working on a project where I need to fetch blog content from Notion using their API and display it in multiple languages. My app already has a working translation system for static content using translation keys.
For my local translation strings, I can easily use something like:
{translate('articles')}
But when it comes to dynamic content pulled from Notion, I’m not sure how to handle the translation. The data comes in like this:
{article.heading}
I have English to Chinese translations set up locally, but I need help figuring out how to translate the dynamic content that comes from the Notion database. What’s the best approach to handle this kind of internationalization with external API content?
Been there multiple times - automation’s the only clean fix. Manual translation gets ugly real quick at scale.
Build a pipeline that monitors your Notion database, auto-translates new stuff, and syncs to your app. Your content team keeps working in Notion like normal, your app gets fresh translations without the manual headache.
Best part? Only translate when content actually changes, not on every page load. Add some smart caching and fallback logic so users never hit untranslated content.
I built this exact setup with Latenode. Hooks into Notion API, catches content changes, runs translation services, updates your database automatically. Way more solid than doing it by hand and cheaper than re-translating the same stuff over and over.
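Whatever tool you use, the "only translate when content actually changes" part boils down to comparing Notion's `last_edited_time` against what you last translated. A minimal sketch of that check (the `needsTranslation` / `markTranslated` names and the in-memory store are my own, not from any library):

```typescript
// Track which version of each Notion page you've already translated.
// A real setup would persist this map in your database instead of memory.
type PageMeta = { id: string; lastEditedTime: string };

const lastTranslated = new Map<string, string>(); // pageId -> last_edited_time

function needsTranslation(page: PageMeta): boolean {
  const seen = lastTranslated.get(page.id);
  return seen === undefined || seen !== page.lastEditedTime;
}

function markTranslated(page: PageMeta): void {
  lastTranslated.set(page.id, page.lastEditedTime);
}

// First sighting: translate. Same timestamp: skip. New edit: translate again.
const page = { id: "abc123", lastEditedTime: "2024-05-01T10:00:00.000Z" };
console.log(needsTranslation(page)); // true
markTranslated(page);
console.log(needsTranslation(page)); // false
console.log(needsTranslation({ ...page, lastEditedTime: "2024-05-02T09:00:00.000Z" })); // true
```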
Same issue here! I went with a hybrid setup - store translation keys in Notion with the content instead of raw text. Use something like translation.key.identifier in your Notion properties, then your translate function works like normal. For stuff that has to be dynamic text, I built a background job that pre-translates everything when Notion content gets updated. Don’t do it live - too slow. The key is using Notion’s webhooks to trigger translations when editors make changes. Keeps everything fresh without crushing your translation APIs during user requests. Plays nice with existing i18n since you’re using the same pipeline.
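To make the translation-key part concrete: the Notion property stores a key like `blog.post.title` instead of raw text, and you resolve it through the same messages table your static strings already use. A rough sketch (the messages object and fallback behavior are assumptions about your i18n setup):

```typescript
// Resolve a translation key stored in a Notion property through your
// existing messages table, falling back to the key itself so missing
// translations are visible instead of blank.
type Messages = Record<string, string>;

const zhMessages: Messages = {
  "articles": "文章",
  "blog.post.title": "文章标题",
};

function translate(key: string, messages: Messages): string {
  return messages[key] ?? key;
}

// A Notion row whose "heading" property holds a key, not raw text.
const article = { heading: "blog.post.title" };
console.log(translate(article.heading, zhMessages)); // "文章标题"
console.log(translate("blog.post.missing", zhMessages)); // falls back to the key
```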
just store translations directly in notion - make separate properties for each lang (title_en, title_zh, etc.). then grab the right property based on the user’s locale. more manual work, but everything stays in one place and you skip translation api costs.
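picking the right property is a one-liner once you settle on a naming scheme - something like this, assuming `title_en` / `title_zh` style property names and English as the fallback:

```typescript
// Pick the per-locale property from a Notion row, falling back to English
// when the requested locale has no value.
type NotionRow = Record<string, string | undefined>;

function pickLocalized(row: NotionRow, field: string, locale: string): string {
  return row[`${field}_${locale}`] ?? row[`${field}_en`] ?? "";
}

const row = { title_en: "Getting started", title_zh: "入门指南" };
console.log(pickLocalized(row, "title", "zh")); // "入门指南"
console.log(pickLocalized(row, "title", "fr")); // falls back to "Getting started"
```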
Had this exact problem last year. Here’s what worked for me: build a translation mapping service between your Notion data and display layer. When you fetch content from Notion, run it through Google Translate or Azure Translator, then cache the results locally using the original text as the key. You’ll only translate each piece once and serve everything else from cache. I made a simple middleware that checks the local cache first and only calls the translation API for new content. Performance is solid since most stuff gets translated once and cached, plus it plays nice with your existing translation workflow.
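The cache-first flow looks roughly like this - the translator here is a stub standing in for Google Translate or Azure so the sketch is runnable, and `cachedTranslate` is my own name, not a real API:

```typescript
// Cache translations keyed by the original text (plus target locale), so
// each unique string only hits the paid translation API once.
const cache = new Map<string, string>();
let apiCalls = 0;

async function callTranslationApi(text: string, target: string): Promise<string> {
  apiCalls++; // stand-in for the real (paid) API request
  return `[${target}] ${text}`;
}

async function cachedTranslate(text: string, target: string): Promise<string> {
  const key = `${target}:${text}`; // original text as the cache key
  const hit = cache.get(key);
  if (hit !== undefined) return hit;
  const translated = await callTranslationApi(text, target);
  cache.set(key, translated);
  return translated;
}

(async () => {
  await cachedTranslate("Hello from Notion", "zh");
  await cachedTranslate("Hello from Notion", "zh"); // served from cache
  console.log(apiCalls); // 1
})();
```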
I’ve had good luck with a content hash approach for this. When you pull data from Notion, hash the content and use that as your cache key. Store translations in your database with the hash - that way identical content blocks across different pages get reused without hitting the translation API again. The key is intercepting your translate function before it makes the API call, so it checks for cached content first. Just add a translateDynamic() method to your current setup that handles both cached and new content. Works with your existing workflow and cuts down on duplicate translation costs.
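Sketch of what that hash-keyed lookup could look like - the translator is stubbed out, and `translateDynamic` is the hypothetical method name from above, not an existing API:

```typescript
// Hash the content block and use the hash as the cache key, so identical
// blocks on different pages share one translation.
import { createHash } from "node:crypto";

const translations = new Map<string, string>(); // hash -> translated text
let apiHits = 0;

function stubTranslate(text: string): string {
  apiHits++; // stand-in for the real translation API
  return `zh:${text}`;
}

function translateDynamic(text: string): string {
  const hash = createHash("sha256").update(text).digest("hex");
  const cached = translations.get(hash);
  if (cached !== undefined) return cached;
  const translated = stubTranslate(text);
  translations.set(hash, translated);
  return translated;
}

// Same paragraph appearing on two pages: one API call, second is a cache hit.
translateDynamic("Shared intro paragraph");
translateDynamic("Shared intro paragraph");
console.log(apiHits); // 1
```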
you could also use notion’s database relations - set up separate databases for each lang and link them. just pull the right database based on the user’s locale instead of translating live. takes more setup upfront, but you skip translation apis completely and keep everything organized in notion.
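the routing side of that is just a locale-to-database lookup before you query - something like this, where the database ids are made-up placeholders:

```typescript
// Map each locale to its own Notion database id and fall back to English
// when a locale has no dedicated database.
const databaseByLocale: Record<string, string> = {
  en: "db-en-placeholder",
  zh: "db-zh-placeholder",
};

function databaseFor(locale: string): string {
  return databaseByLocale[locale] ?? databaseByLocale["en"];
}

console.log(databaseFor("zh")); // "db-zh-placeholder"
console.log(databaseFor("fr")); // falls back to "db-en-placeholder"
```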