I’m using n8n v1.89.2 with the xAI Grok Chat Model node set to grok-3-mini-beta. It returns an error: ‘presencePenalty’ not supported. How can I remove this parameter?
I’ve dealt with this ‘presencePenalty’ error in the xAI Grok Chat Model node before. It’s a known limitation with Grok models. Here’s what worked for me:
First, check if there’s a node update available that addresses this issue. If not, you’ll need to manually remove the ‘presencePenalty’ parameter.
Look in the node’s configuration for any advanced settings where you can edit parameters. If you find a ‘presencePenalty’ entry there, delete it or set it to null.
If that’s not possible, you might need to use a Function node before the Grok node to filter out this parameter from the input data. This ensures it’s not passed to the model at all.
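For what it’s worth, here’s a minimal sketch of that filter as an n8n Code node (the current replacement for the Function node), assuming ‘presencePenalty’ shows up as a top-level field on the items feeding the Grok node; if the parameter is only set inside the chat model node’s own options, this won’t apply and you’d have to remove it there instead:

```javascript
// n8n Code node (JavaScript, "Run Once for All Items"), placed right before the Grok node.
// Strips fields the downstream model rejects. The field name is an assumption;
// adjust it to wherever 'presencePenalty' actually appears in your data.
const unsupported = ['presencePenalty'];

return $input.all().map((item) => {
  const json = { ...item.json };
  for (const key of unsupported) {
    delete json[key]; // drop the unsupported parameter entirely
  }
  return { json };
});
```

If you’re still on the legacy Function node, the same idea works with the global `items` array instead of `$input.all()`.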
As a last resort, consider using a different AI model node that’s compatible with your workflow and doesn’t have this parameter conflict.
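If you do go that route, one concrete option is an HTTP Request node pointed at xAI’s OpenAI-compatible chat completions endpoint, since then you decide exactly which parameters get sent. Here’s a rough sketch of the equivalent call (shown as plain JavaScript for clarity; the endpoint and payload shape are based on xAI’s OpenAI-compatible API, and XAI_API_KEY is a placeholder, so double-check the details against their docs):

```javascript
// Same chat call without presencePenalty, e.g. to mirror in an HTTP Request node
// or to test outside n8n. Requires Node 18+ for the global fetch.
async function askGrok(prompt) {
  const response = await fetch('https://api.x.ai/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.XAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'grok-3-mini-beta',
      messages: [{ role: 'user', content: prompt }],
      // note: no presence_penalty / frequency_penalty fields at all
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

askGrok('Hello from n8n').then(console.log);
```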
I encountered a similar issue with the xAI Grok Chat Model node. In my experience, the ‘presencePenalty’ parameter is not supported by Grok models, which is why you’re getting that error. To resolve it, you’ll need to adjust the node configuration.
In the node settings, look for any advanced options or parameter fields. You should be able to remove ‘presencePenalty’ or set it to null or 0. If you can’t find such an option, you might need to edit the node’s code directly.
Alternatively, you could add a Function node before the Grok node to filter the ‘presencePenalty’ parameter out of the input, so it is never passed to the Grok model.
If none of these solutions work, consider reaching out to n8n’s support team or checking their documentation for Grok-specific configurations. They might have a workaround or a planned update to address this compatibility issue.
Hey Alex, I had the same issue. Try updating your n8n to the latest version. If that doesn’t work, check the node settings; there might be an option to disable or remove ‘presencePenalty’. If not, you could try a different AI model node that doesn’t have this parameter. Good luck!