I have a Test Suite Collection with 2 Test Suites inside of it. I run them periodically throughout the day locally on my Windows computer. I loved the idea of Test Suite Collections as a way to organize tests better, so I was very happy to start using them once the feature became available in version 6.0.0.
Whenever I run these tests, at some seemingly random point during one of the Test Suites execution just stops, but Katalon acts like it’s still running. It reports no errors or failures and still shows the Stop button at the very top. Sometimes it happens in the first Test Suite, sometimes in the second; it truly does seem random.
This seems to have started only with the 6.1.0 release, but I’m still not quite sure. Has anyone else experienced this problem? The Test Suites run fine on their own; it only seems to happen in the collection.
Me too. I can generally run two in parallel, but the moment I go to three, the third goes batshit (run sequentially they’re mostly fine).
I get a weird problem sporadically where one or more suites decides it can’t run tests due to “Browser not open”, which is just nuts. Once in a while, I know Firefox (in my case) can decide it’s going to update something, and that kills a test or two. But when I see 15-20 tests do that… it has to be something else.
Funny that.
Yep. Me too. No such issues on my Win7 box where I develop my tests.
I thought that might have been my problem, but I see that you have quite a number of test cases as well. I do have a lot of tests with 50+ steps, so could that be a factor?
Yeah, mine seem to go all crazy when I run in parallel, and honestly they don’t run any faster; the whole execution time is the same as sequential, so I don’t see a need for it at the moment.
Are you using the Test Suite to open your browser? That’s what I do with mine, and I haven’t encountered that issue. I’ve had 10+ tests fail for no apparent reason at all, and I’ve noticed that if you close Katalon and open it back up, you can run them flawlessly after that. Very strange behavior.
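In case it helps, here’s roughly what that looks like in the Test Suite’s Script view (just a sketch assuming the standard WebUI keywords; the method names are placeholders):

import com.kms.katalon.core.annotation.SetUp
import com.kms.katalon.core.annotation.TearDown
import com.kms.katalon.core.webui.keyword.WebUiBuiltInKeywords as WebUI

@SetUp(skipped = false)
def openBrowserForSuite() {
    // One browser for the whole Test Suite instead of one per test case
    WebUI.openBrowser('')
}

@TearDown(skipped = false)
def closeBrowserAfterSuite() {
    // Runs once the last test case in the suite has finished
    WebUI.closeBrowser()
}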
I appreciate the link!
I do have a screenshot that can explain my dilemma a little further.
The test was started at 7:59 AM as shown in the picture, but my screenshot was taken an hour later at 8:59 AM. If you add up the durations for the setup and the two test cases, it comes to a little over 2 minutes… yet it still acts like it’s running, and even Katalon Analytics shows the test cases haven’t finished. Hopefully this helps a little bit more.
No. Each test (well, 99% of them) opens a browser, logs in, performs the test, closes browser.
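Roughly the shape of them, if it matters (a sketch only; the URL and test objects are placeholders, not my real ones):

import static com.kms.katalon.core.testobject.ObjectRepository.findTestObject
import com.kms.katalon.core.webui.keyword.WebUiBuiltInKeywords as WebUI

WebUI.openBrowser('')
WebUI.navigateToUrl('https://example.com/login')
WebUI.setText(findTestObject('Login/input_username'), 'someUser')
WebUI.setText(findTestObject('Login/input_password'), 'somePassword')
WebUI.click(findTestObject('Login/btn_submit'))
WebUI.verifyElementPresent(findTestObject('Home/lbl_welcome'), 10)
// and each test closes its own browser at the end
WebUI.closeBrowser()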
Only reason I close Kat is when it’s out of memory and crashed - but that’s dev work, not suite executions. Start here if you want the gory details…
Brett, I could easily return this to Bug Reports, but I know what will happen: there’s nothing in this that can point a developer to the cause.
When you see this happening, bring up task manager and see how much memory geckodriver is using.
I have mine set up so the drivers are killed when suites are running but stay open when I’m executing individual tests. Sometimes I forget to kill them. I use a batch file to do that.
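The gist of it is something like this (a minimal sketch assuming geckodriver.exe is the driver process; swap in chromedriver.exe or whatever you use):

REM Show any leftover geckodriver processes (memory use included)
tasklist /FI "IMAGENAME eq geckodriver.exe"
REM Force-kill them, along with any child processes
taskkill /F /IM geckodriver.exe /T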
It did happen again today. I brought up task manager and geckodriver is using 4 MB of memory, which seems about average. Hmmm, interesting, but these stop mid Test Suite.
I think I’ve found the solution! After much testing, I’m almost 99% certain that this problem is caused by Katalon Analytics.
I’ve run each test collection at least 5 times. When I run it with Katalon Analytics enabled, the tests stop executing in 2-3 out of 5 runs.
When I run it without Katalon Analytics, the test collection finishes 5 out of 5 times.
@Russ_Thomas, is there any way this could be looked into? I wish I had some other way to show this is a consistent problem with the Katalon Analytics integration.
I couldn’t find anything specific in there, as it refers me to the XML log file, which doesn’t seem to have anything, but the console log did have some information that might be useful:
1561045898886 Marionette INFO Stopped listening on port 53337
[Parent 24616, Gecko_IOThread] WARNING: pipe error: 109: file z:/task_1559936101/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 341
[Child 14688, Chrome_ChildThread] WARNING: pipe error: 109: file z:/task_1559936101/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 341
[Child 15056, Chrome_ChildThread] WARNING: pipe error: 109: file z:/task_1559936101/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 341
[Parent 24616, Gecko_IOThread] WARNING: pipe error: 109: file z:/task_1559936101/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 341
[Child 22616, Chrome_ChildThread] WARNING: pipe error: 109: file z:/task_1559936101/build/src/ipc/chromium/src/chrome/common/ipc_channel_win.cc, line 341
[Child 22616, Chrome_ChildThread] WARNING: pipe error: 109: file z:/task_1559936101/build/src/ipc/chromium/
###!!! [Child][RunMessage] Error: Channel closing: too late to send/recv, messages will be lost
[GPU 19284, Chrome_ChildThread] WARNING: pipe error: 109:
###!!! [Child][MessageChannel::SendAndWait] Error: Channel error: cannot send/recv