Syncing Robot Framework test results with Jira Xray test cases

I’m working on connecting Robot Framework automated tests with existing Jira Xray test cases. The problem is that when I upload the output.xml file from Robot Framework to Xray, it creates brand new test cases instead of updating my existing ones.

I already have test cases created in Xray and I’ve written corresponding automated tests in Robot Framework. My goal is to have the automation results update the existing Xray test cases, not generate duplicates.

I tried using matching test names between both platforms and adding specific tags in my Robot Framework tests, but the synchronization still doesn’t work properly. Has anyone successfully linked these two tools together? What’s the correct approach to map Robot Framework test results to existing Xray test cases?

This happens because Xray treats each Robot Framework import separately by default. Here’s what worked for me: use Xray’s precondition feature with proper test execution linking. First, create a test execution in Xray that includes all your existing test cases. Then modify your Robot Framework output by adding the test execution key to the XML metadata before uploading. A simple Python script can parse output.xml and inject the execution reference. This forces Xray to update results within your predefined test execution instead of creating new test cases. The key is telling Xray upfront which execution context to use - no more guesswork on Xray’s side.
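A rough sketch of the kind of injection script described above, using only the standard library. The element name (`meta`, as written by Robot Framework 4+) and the metadata key `"Test Execution"` are assumptions here; check what your Xray import actually reads before relying on this.

```python
# Hypothetical sketch: inject an Xray test execution key into Robot
# Framework's output.xml before uploading. The <meta> element name and
# the "Test Execution" key are assumptions -- verify against what your
# Xray import configuration expects.
import xml.etree.ElementTree as ET

def inject_execution_key(xml_text: str, execution_key: str) -> str:
    root = ET.fromstring(xml_text)
    suite = root.find("suite")
    # Add the execution key as suite-level metadata.
    meta = ET.SubElement(suite, "meta", {"name": "Test Execution"})
    meta.text = execution_key
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    with open("output.xml", encoding="utf-8") as f:
        patched = inject_execution_key(f.read(), "PROJ-42")
    with open("output_patched.xml", "w", encoding="utf-8") as f:
        f.write(patched)
```

Run it between the Robot Framework run and the Xray upload, and upload the patched file instead of the original.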

Had this same problem six months back. You need to use Robot Framework’s listener interface with a custom listener that tweaks the test metadata before Xray gets it. Here’s what worked for me: embed the Xray test execution ID directly in your Robot Framework test metadata using Set Test Variable or suite setup. Skip the documentation tags - they’re unreliable. Instead, add the Xray test key as a custom field in the metadata section. When you upload to Xray, use the import endpoint for test execution updates, not test creation. Set your Xray import config to map by custom fields instead of test names. This stopped the duplicate creation issue and consistently updated existing test cases with fresh execution data.
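A minimal listener sketch along those lines, using Robot Framework's listener API v3. The metadata key `"Test Execution"` and the example key `PROJ-100` are placeholders, not anything Xray mandates:

```python
# Sketch of a Robot Framework v3 listener that records an Xray test
# execution key as suite metadata so it lands in output.xml.
# The metadata key name is an assumption -- adjust to your Xray setup.
class XrayMetadataListener:
    ROBOT_LISTENER_API_VERSION = 3

    def __init__(self, execution_key="PROJ-100"):
        self.execution_key = execution_key

    def start_suite(self, data, result):
        # result.metadata is written into output.xml's suite metadata.
        result.metadata["Test Execution"] = self.execution_key
```

You would attach it with something like `robot --listener XrayMetadataListener.py:PROJ-100 tests/` (the part after the colon is passed to the listener's constructor).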

Your problem is that Xray can’t automatically link Robot Framework tests to existing test cases - name matching and tags don’t cut it. You need to explicitly map them using Xray’s test execution keys. Here’s what works: Add the Xray test case key directly in your Robot Framework test documentation like ‘[Documentation] XRAY:PROJECT-123’ where PROJECT-123 is your existing test case key. When you upload output.xml, Xray will see this mapping and update the existing test instead of creating duplicates. I’ve used this method plenty of times and it’s solid. Just make sure your Xray import settings are configured to update existing tests rather than always creating new ones. Way more reliable than hoping name matching will work - that’s hit or miss.
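For reference, a minimal `.robot` sketch of that documentation-based mapping. `PROJECT-123` stands in for a real existing Xray test key, and whether Xray honors the `XRAY:` prefix depends on your import configuration:

```robotframework
*** Test Cases ***
Login With Valid Credentials
    [Documentation]    XRAY:PROJECT-123
    Log    Credentials accepted
```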

Been down this road way too many times. Manual mapping works but it’s hell to maintain at scale.

You need proper automation that handles the sync for you. I built a workflow that watches Robot Framework test runs and automatically pushes results to the right Xray test cases through API calls.

Here’s the trick: create a mapping table linking your Robot Framework test IDs to Xray test case keys. Then run an automated process that reads output.xml, grabs the results, finds the right Xray mappings, and updates test cases via REST API.

I built this whole thing with automation tools and ditched all the manual tagging nonsense. Runs after every test execution and keeps everything synced without creating duplicates.

Bonus: you can add logic to create new Xray test cases when it finds new Robot Framework tests, or send alerts when mappings are missing.

Latenode makes this integration stupid easy with API connectors and visual workflow automation. Set up the entire sync process without writing custom code.

This happens because Xray sees each import as a new test run instead of updating your existing test cases. I fixed this by ditching the standard Robot Framework XML upload and using Xray’s REST API instead. Create a test execution in Xray first, then import your Robot Framework results straight into that execution using the execution key. This skips the automatic test case creation completely. You need the test execution ID in your import request - without it, Xray just creates new test cases every time. Also double-check that your Robot Framework test names match exactly with the summary field in your existing Xray test cases, including spacing and special characters.
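A sketch of that REST upload, assuming the server-style endpoint `/rest/raven/2.0/import/execution/robot` and bearer-token auth; the base URL, auth scheme, and parameter names should be checked against the API docs for your Xray version (Xray Cloud uses a different base path):

```python
# Sketch: import Robot Framework results into an EXISTING Xray test
# execution via REST. Endpoint path and auth are assumptions -- verify
# against your Xray (server/DC vs. cloud) API documentation.
from urllib.parse import urlencode
from urllib.request import Request, urlopen

def build_import_request(base_url, exec_key, project_key, xml_bytes, token):
    # testExecKey is what stops Xray from creating a fresh execution.
    query = urlencode({"testExecKey": exec_key, "projectKey": project_key})
    url = f"{base_url}/rest/raven/2.0/import/execution/robot?{query}"
    return Request(
        url,
        data=xml_bytes,
        headers={
            "Content-Type": "application/xml",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

if __name__ == "__main__":
    with open("output.xml", "rb") as f:
        req = build_import_request(
            "https://jira.example.com", "PROJ-500", "PROJ", f.read(), "TOKEN"
        )
    print(urlopen(req).status)  # performs the POST; needs a real server
```

Leaving out `testExecKey` is exactly the failure mode described above: Xray falls back to creating new test cases and a new execution.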

Double-check you’re uploading to the right Xray project with the correct import format. Often it’s just the wrong API endpoint - try /import/execution/robot instead of the general import. Also make sure your Xray token has permissions to update existing tests, not just create new ones.