Correct place from which to send report.html to an Amazon S3 bucket in Robot Framework
Currently I am trying to send the generated log.html and report.html files to an Amazon S3 bucket from the test teardown, using an API available in my project. I am able to send a file if it is already present in the directory.
But when I run the entire suite, log.html and report.html are not there yet in the Test or Suite Teardown, and I get a file-not-found error.
On the second run the files become available, because that run overwrites the previous results. But what is the right place to send those files?
In the current case the log keeps being written to the same file until the Suite Teardown, so the file is not available until the Suite Teardown has completed.
FAIL FileNotFoundError: [Errno 2] No such file or directory: 'C:\automation-framework\RScripts\report.html'
Because the file is not available until the Suite Teardown has completed, how should I send this file to S3?
At the point of teardown the test hasn't yet finished (the teardown is still running; it's part of the test), so it's not surprising that those files haven't been created.
How are you running robot?
I would expect the script or process that's running robot to wait until robot ends and then copy/transfer the files.
I haven't actually looked at the Robot Framework code, so one of the developers might need to correct me, but based on how programs typically work the teardown result is included in the overall result. It therefore stands to reason that the HTML files are not generated until after the last test result is added to output.xml, which I would expect to happen as the last step of the suite teardown (assuming the suite teardown comes after the test teardown).
This is why I suggested that you’d need to do the file copy after robot has exited, and this is why I asked how you are running robot.
Of the two ways you mentioned, I'll start with the second:
This is possibly the easier one to address: in your Python script that calls robot.run(), wait until robot.run() has finished and then call a Python function to copy the files where you need them. I'm not sure how you transfer the files to an Amazon S3 bucket; if it's a simple file copy, then shutil.copyfile might be all you need, and if it's an HTTP POST to a web server then you probably want to use requests. A sketch of this approach is below.
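For example, here is a minimal sketch of that idea, assuming boto3 is used to talk to S3; the bucket name, key prefix and output directory are placeholders:

```python
import os
import boto3              # assumed S3 client library (pip install boto3)
from robot import run

OUTPUT_DIR = "results"        # hypothetical output directory
BUCKET = "my-test-reports"    # hypothetical bucket name

# Run the whole suite first; log.html and report.html only exist on disk
# once this call has returned.
rc = run("tests", outputdir=OUTPUT_DIR)

# Now the result files exist, so they can be uploaded.
s3 = boto3.client("s3")
for name in ("log.html", "report.html", "output.xml"):
    path = os.path.join(OUTPUT_DIR, name)
    if os.path.exists(path):
        s3.upload_file(path, BUCKET, f"robot-results/{name}")
```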
For the case where you run robot from the command line, I'd suggest you create a shell script (batch file on Windows), called something like robotbk, that you put on your path. It would take all the command line options and just pass them on to robot; when robot exits, it would then upload the files.
The reason I addressed this second is that the upload part could be the same Python code as I mentioned above; just call the function as an argument to python in the second part of the shell script. Then, instead of calling robot test_script.robot, you would simply call robotbk test_script.robot. A sketch of such a wrapper is below.
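If you'd rather keep the whole wrapper in Python instead of a shell script, a minimal sketch could use robot's run_cli to forward the arguments and then upload once the run has finished; the bucket name is again a placeholder, and it assumes the default output directory:

```python
#!/usr/bin/env python
# robotbk.py -- hypothetical wrapper: pass all arguments through to robot,
# then upload the generated result files once robot has finished.
import os
import sys
import boto3
from robot import run_cli

BUCKET = "my-test-reports"    # hypothetical bucket name

# exit=False makes run_cli return the return code instead of exiting,
# so we still get a chance to upload after the run.
rc = run_cli(sys.argv[1:], exit=False)

s3 = boto3.client("s3")
for name in ("log.html", "report.html"):
    if os.path.exists(name):  # assumes results were written to the current directory
        s3.upload_file(name, BUCKET, f"robot-results/{name}")

sys.exit(rc)
```

You would then call python robotbk.py test_script.robot instead of robot test_script.robot.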