I want to prevent database design issues in my app where users need to add, edit and remove custom fields dynamically. I’m thinking about using JSON storage in my database table. Here’s my table structure:
I’m stuck on how to search within the metadata field. Like how do I find all items where owner_id = 5 AND custom_category = special? I might need to filter by one or multiple JSON properties for reports and filtering. Is this a good way to handle dynamic fields or should I use a different approach?
json in mysql is overrated for this. sure, it works, but debugging sucks when you have typos in field names or mixed data types. i’ve seen countless bugs where someone stores “12.30” as a string instead of a number and filters break silently. go with a simple key-value table instead - item_id, field_name, field_value. way easier to query, better for reports, and you can rebuild the object in your app if you need to.
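A minimal sketch of that key-value layout (table and column names here are illustrative, and it assumes Items has an id primary key):

```sql
-- One row per custom field per item
CREATE TABLE item_fields (
    item_id     INT          NOT NULL,
    field_name  VARCHAR(64)  NOT NULL,
    field_value VARCHAR(255) NOT NULL,
    PRIMARY KEY (item_id, field_name),
    KEY idx_field_lookup (field_name, field_value)
);

-- "owner_id = 5 AND custom_category = 'special'" becomes a join:
SELECT i.*
FROM Items i
JOIN item_fields f
  ON f.item_id = i.id
 AND f.field_name = 'custom_category'
 AND f.field_value = 'special'
WHERE i.owner_id = 5;
```

Filtering on N custom fields means N joins (or a GROUP BY/HAVING count trick), but each one hits a plain composite index.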
MySQL’s JSON functions handle this pretty well, but there are a few gotchas. Besides the basic JSON_EXTRACT stuff others mentioned, check out JSON_CONTAINS for flexible matching and JSON_OVERLAPS for arrays.

Data types tripped me up recently - MySQL treats JSON strings differently than regular strings, so you might need to cast values depending on how your app stores them. I’d add a JSON schema validation trigger to catch bad data before it gets in.

For performance, generated columns are surprisingly good if you know which JSON fields get queried most. Create virtual columns for those properties and index them. Not as clean as proper normalization, but you get decent speed without restructuring everything.

The biggest pain I’ve hit is migrating JSON data when business requirements change. Write helper functions early for common transformations - you’ll need them later, trust me.
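The generated-column idea can look like this (a sketch assuming a table named Items with a JSON column named metadata; requires MySQL 5.7+):

```sql
-- Virtual column extracted from the JSON, then indexed.
-- VIRTUAL costs nothing at write time; the index makes it searchable.
ALTER TABLE Items
  ADD COLUMN custom_category VARCHAR(100)
    GENERATED ALWAYS AS (JSON_UNQUOTE(JSON_EXTRACT(metadata, '$.custom_category'))) VIRTUAL,
  ADD INDEX idx_custom_category (custom_category);

-- The optimizer can now use idx_custom_category for this filter:
SELECT * FROM Items
WHERE owner_id = 5 AND custom_category = 'special';
```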
I’ve hit this exact problem multiple times. MySQL’s JSON support works well for dynamic fields, but querying gets messy quick.
For your search, you’d do:
SELECT * FROM Items
WHERE owner_id = 5
AND JSON_EXTRACT(metadata, '$.custom_category') = 'special';
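On MySQL 5.7+ there is also the ->> operator, which is shorthand for JSON_UNQUOTE(JSON_EXTRACT(...)) and keeps the query a bit more readable:

```sql
SELECT * FROM Items
WHERE owner_id = 5
  AND metadata->>'$.custom_category' = 'special';
```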
Here’s where it gets ugly - complex filtering UI and reports mean writing tons of dynamic SQL. You’ll deal with inconsistent JSON data types and indexing headaches, and nested objects are a nightmare.
I used to build custom query builders until I realized I was recreating the same patterns endlessly. Now I handle this through Latenode workflows.
Create endpoints that take filter parameters, let Latenode build the JSON queries, handle data transformation, and cache frequent searches. You can set up different flows for simple filters vs complex reports without touching your app code.
Best part? When users want new field types or filtering logic, just update the workflow instead of deploying code changes. I’ve saved weeks of dev time this way.
Your table structure’s fine - just automate the complexity: https://latenode.com
JSON can work, but think about how you’ll actually query your data first. Most apps need both structured queries and flexible storage.

Here’s what I do: keep frequently filtered fields as regular columns, use JSON for dynamic stuff that gets displayed but rarely searched. If custom_category becomes important for filtering later, just promote it to a real column. JSON_EXTRACT gets messy fast when building dynamic WHERE clauses.

Pro tip: store searchable JSON keys in a separate search table updated via triggers. You get JSON flexibility with proper indexing.

MySQL 8.0 handles JSON way better than older versions, but complex nested searches will still hit performance walls. Test with realistic data volumes early - queries that work fine with 10K records die at 100K without the right indexing.
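A rough sketch of the trigger-maintained search table (table, column, and key names are assumptions; you’d also need a matching AFTER UPDATE trigger to keep rows in sync):

```sql
CREATE TABLE item_search (
    item_id    INT          NOT NULL,
    json_key   VARCHAR(64)  NOT NULL,
    json_value VARCHAR(255) NOT NULL,
    PRIMARY KEY (item_id, json_key),
    KEY idx_kv (json_key, json_value)
);

DELIMITER //
CREATE TRIGGER items_after_insert
AFTER INSERT ON Items
FOR EACH ROW
BEGIN
  -- Only mirror the keys you actually filter on
  IF JSON_EXTRACT(NEW.metadata, '$.custom_category') IS NOT NULL THEN
    INSERT INTO item_search (item_id, json_key, json_value)
    VALUES (NEW.id, 'custom_category',
            JSON_UNQUOTE(JSON_EXTRACT(NEW.metadata, '$.custom_category')));
  END IF;
END //
DELIMITER ;
```

Searches then join Items against item_search on the indexed (json_key, json_value) pair instead of scanning JSON.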
JSON in MySQL works fine, but storage and queries aren’t your real problem - it’s all the dynamic filtering logic you’ll need afterward.
Your SQL looks like this:
SELECT * FROM Items
WHERE owner_id = 5
AND JSON_UNQUOTE(JSON_EXTRACT(metadata, '$.custom_category')) = 'special';
Here’s what nobody mentions: you’ll spend months building interfaces for dynamic filters. Want to filter multiple JSON fields? Sort by custom properties? Export filtered data? That’s more conditional query building code every time.
I’ve built this system three times at different companies. Always ended up with thousands of lines of unmaintainable query building logic.
Now I set up basic JSON storage like you planned, then handle filtering complexity through automation workflows. Users get simple filter interfaces, the system generates MySQL JSON queries automatically, handles type casting issues, and caches popular combinations.
New filter types or export formats? I update the workflow instead of touching app code. Saves weeks and users get features same day.
Your table structure works - just don’t build that filtering logic by hand: https://latenode.com
JSON storage in MySQL works but has nasty performance issues most devs don’t see coming. Found this out the hard way when our app tanked after 6 months.

Main issue: JSON queries can’t use regular indexes well. Even with MySQL’s generated columns and JSON indexes, complex filters crawl compared to normalized tables. We hit 2-3 second query times on 100k+ record tables.

Try a hybrid approach instead. Keep searchable stuff like cost and owner_id as normal columns, dump the truly dynamic metadata in JSON. Filter on indexed columns first, then hit the JSON on that smaller set. Or check out Entity-Attribute-Value with a separate custom_fields table. More work upfront but you get proper indexing and way better performance.

Really depends how much you’re searching those custom fields vs just showing them.
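The hybrid layout described above, as a sketch (the exact columns are assumptions based on the fields mentioned in this thread):

```sql
-- cost and owner_id live as real, indexable columns;
-- only the truly dynamic fields go into metadata.
CREATE TABLE Items (
    id       INT AUTO_INCREMENT PRIMARY KEY,
    owner_id INT NOT NULL,
    cost     DECIMAL(10,2),
    metadata JSON,
    KEY idx_owner (owner_id)
);

-- The index narrows the row set first; the JSON predicate
-- only has to run against the survivors.
SELECT *
FROM Items
WHERE owner_id = 5
  AND JSON_UNQUOTE(JSON_EXTRACT(metadata, '$.custom_category')) = 'special';
```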
Been there, ended up with a mixed approach that worked great. Started with pure JSON storage like you’re doing, but quickly needed better data consistency. What saved me was adding JSON schema validation at the app level before inserting anything. Catches type mismatches and missing required fields before they break your queries.

For searches, JSON_EXTRACT works but wrap it right:

SELECT * FROM Items
WHERE owner_id = 5
AND JSON_UNQUOTE(JSON_EXTRACT(metadata, '$.custom_category')) = 'special'
AND JSON_EXTRACT(metadata, '$.discount') > 0.2;

The real game-changer: keep a separate metadata_fields table tracking which custom fields exist. Gives you validation rules and helps build dynamic forms without scanning every JSON blob. I cache common JSON paths as computed columns when I see query patterns. Makes migration easier since you’ve got a registry of all custom fields.

Performance stayed solid up to 500K records in my case, but definitely watch your execution plans as you grow.
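The field registry described above can be as simple as this (names, types, and columns are illustrative, not a prescription):

```sql
-- One row per known custom field; apps consult this before
-- writing into metadata or building dynamic filter forms.
CREATE TABLE metadata_fields (
    field_name  VARCHAR(64) PRIMARY KEY,
    data_type   ENUM('string','number','boolean','date') NOT NULL,
    is_required BOOLEAN NOT NULL DEFAULT FALSE,
    validation  JSON NULL  -- e.g. min/max bounds or an allowed-values list
);
```

Having the registry also tells you exactly which JSON paths exist when you later promote hot fields to generated columns or write migrations.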