I am running a Robot Framework script in a Jenkins Linux environment. I am using the Rebot option to remove unwanted entries from the log XML, but because the XML file is large (i.e. more than 300 MB) it is not working.
Can you please suggest a solution for large XML files?
You didn’t tell us how the Rebot usage fails, but most likely it runs out of memory. An easy solution is adding more memory to the machine (a 300 MB output.xml file is big but not huge), but that only fixes the symptom and doesn’t address the root cause.
Most likely the issue is caused by deeply nested keyword structures and/or loop constructs. They can often be simplified, or the logic moved to Python, where a keyword call or loop iteration doesn’t cause a new entry to be written to output.xml.
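To illustrate the idea, here is a minimal sketch of a Python library keyword (the keyword and variable names are made up for this example). Instead of a Robot Framework `FOR` loop calling a keyword once per item, which writes one entry to output.xml per iteration, the whole loop runs inside a single Python keyword and produces only one log entry:

```python
# Hypothetical library keyword: iterating in Python means Robot Framework
# records one keyword call in output.xml instead of one entry per iteration.

def validate_items(items):
    """Check every item in a single keyword call; fail if any item is empty."""
    failures = [item for item in items if not str(item).strip()]
    if failures:
        raise AssertionError(f"{len(failures)} empty items found")
    return len(items)
```

A suite would then call `Validate Items    @{rows}` once rather than looping over `@{rows}` in Robot Framework syntax.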
If restructuring the data isn’t possible, you can try to “flatten” keyword structures and loops with the
--flattenkeywords option when using Rebot. This option is applied already when output.xml is parsed, and it can save a substantial amount of memory. The just released RF 6.1 also supports the
robot:flatten keyword tag that handles flattening already during execution.
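As a sketch of the post-processing approach (file names and the name pattern below are illustrative, not from your setup), the Rebot invocation could look roughly like this:

```shell
# Flatten all FOR/WHILE loop iterations, plus keywords whose full name
# matches the given pattern, while regenerating log and report.
rebot --flattenkeywords for \
      --flattenkeywords name:MyResource.* \
      --log log.html --report report.html \
      output.xml
```

With RF 6.1 you could alternatively add `[Tags]    robot:flatten` to the heaviest keywords so the flattening happens already during execution and output.xml never grows as large in the first place.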
Thanks for the response.
Let me explain the issue in more detail.
I have tested this on my laptop instead of on Jenkins.
I ran more robot scripts to make the XML file large; after execution it was around 1.2 GB.
I then used the Rebot option to flatten or remove keywords, but the result was a memory error. Please refer to the screenshot.
The screenshot makes it clear that you run out of memory. My earlier reply lists all the solutions I have; I cannot help you further.