I’m dealing with an external API that returns data in a format Zapier’s webhooks can’t parse. The API sends back a series of individual JSON objects, one per line, rather than a single JSON array.
The data from the API looks like this:
{"city":"Warren Township","country":"United States"}
{"city":"New York","country":"United States"}
{"city":"Stamford","country":"United States"}
However, Zapier wants it structured like this:
[{"city":"Warren Township","country":"United States"},
{"city":"New York","country":"United States"},
{"city":"Stamford","country":"United States"}]
To resolve the issue, I wrote a JavaScript code step in Zapier. Here’s my solution:
let url = 'https://api.iterable.com/api/export/data.json?dataTypeName=emailOpen&range=Today&onlyFields=email&onlyFields=createdAt&onlyFields=templateId&onlyFields=messageId&onlyFields=campaignId&onlyFields=contentId'
let iterableData
let response = await fetch(url, {
  method: "GET",
  headers: {
    'Api-Key': '----------redacted----------------',
  },
});
iterableData = await response.text()
iterableData = iterableData.replace(/\}/g, "},")  // add a comma after every closing brace
iterableData = "[" + iterableData.slice(0, -2) + "]"  // drop the trailing ",\n" and wrap in brackets
let iterableJSON = JSON.parse(iterableData)
output = {iterableJSON}
This converts the newline-delimited objects into a valid JSON array that Zapier can handle. The key steps: fetch the response as text, add commas to separate the objects, trim the trailing comma, and wrap everything in brackets.
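The transformation can be sketched standalone (outside Zapier) on the sample data from above, to see exactly what each step does:

```javascript
// Sample NDJSON matching the shape shown above, ending with a trailing newline
const raw = '{"city":"Warren Township","country":"United States"}\n' +
            '{"city":"New York","country":"United States"}\n' +
            '{"city":"Stamford","country":"United States"}\n';

// Step 1: add a comma after every closing brace
let data = raw.replace(/\}/g, "},");
// Step 2: remove the trailing ",\n" and wrap the whole thing in brackets
data = "[" + data.slice(0, -2) + "]";
// Step 3: now it's a valid JSON array
const parsed = JSON.parse(data);
// parsed is an array of three objects
```

Note that `slice(0, -2)` assumes the response ends with a newline after the final object; if the API ever omits it, the trim would eat the last brace instead.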
I ran into this exact problem about six months ago when connecting a third-party analytics API to Zapier. The API was returning what’s called NDJSON (newline-delimited JSON) which is pretty common for streaming data but definitely trips up Zapier’s webhook parser.
One thing I learned the hard way is to always add error handling around the JSON.parse step. Sometimes these APIs will include a final summary line or error message that isn’t valid JSON, which will break your entire transformation.
What worked reliably for me was wrapping each line’s JSON.parse in a try-catch and logging any failures. It’s also worth checking that iterableData actually contains content before processing, since some of these APIs return empty responses during certain time periods; that check saves you from mysterious Zapier failures later.
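A minimal sketch of that per-line try-catch, assuming `iterableData` holds the raw response text (the summary line here is made up, just to show a parse failure being skipped):

```javascript
// Hypothetical response: two valid JSON lines plus a non-JSON summary line
const iterableData = '{"city":"Stamford","country":"United States"}\n' +
                     '{"city":"New York","country":"United States"}\n' +
                     'exported 2 rows\n';

const parsed = [];
for (const line of iterableData.split('\n')) {
  if (!line.trim()) continue;  // skip empty lines and the trailing newline
  try {
    parsed.push(JSON.parse(line));
  } catch (err) {
    // log and skip anything that isn't valid JSON (summary lines, error messages)
    console.log('Skipping unparseable line:', line);
  }
}
// parsed now contains only the lines that parsed cleanly
```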
The text-based approach you used is actually more reliable than trying to parse streaming JSON responses directly, since the response isn’t technically valid JSON until you transform it.
nice approach! tho you could also use split('\n') to break the response into lines then filter out empty ones and parse each json object separately. something like data.split('\n').filter(line => line.trim()).map(JSON.parse)
might be cleaner than string manipulation but your regex solution works too.
I had a similar issue when working with streaming APIs that return NDJSON format. Your string replacement method works well but can be fragile if the JSON objects contain nested braces.
Another approach is to handle this at the response level by checking the content-type header first. Some APIs that return this format actually set the content-type to 'application/x-ndjson' or 'application/jsonlines'.
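A sketch of branching on that header; `parseBody` is a hypothetical helper name, and the header substrings match the content types mentioned above:

```javascript
// Hypothetical helper: pick a parsing strategy based on the content-type header
function parseBody(contentType, body) {
  if (contentType.includes('x-ndjson') || contentType.includes('jsonlines')) {
    // Newline-delimited JSON: one object per line, skip blank lines
    return body.split('\n').filter(line => line.trim()).map(JSON.parse);
  }
  // Otherwise assume an ordinary JSON body
  return JSON.parse(body);
}

// With fetch, the header comes from response.headers.get('content-type')
const ndjson = parseBody('application/x-ndjson', '{"a":1}\n{"a":2}\n');
const plain = parseBody('application/json', '[{"a":1}]');
```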
For a more robust solution, you could try splitting by newlines and then validating each line before parsing:
const lines = iterableData.split('\n')
const validObjects = lines
.filter(line => line.trim() && line.startsWith('{'))
.map(line => JSON.parse(line))
This prevents issues if there are empty lines or malformed entries in the response. I’ve found this particularly useful when APIs occasionally return error messages mixed with valid JSON objects.