Failing unit tests #162
Replies: 1 comment
The unit tests (and the unit-test runner script) are not designed to be run under MPI; it's a known issue that attempting to do so produces a large number of failures. Don't worry, this is simply a consequence of how the unit tests themselves are written and of how they are launched and initialized. Unit-test failures here do not correlate with problems in actually running MPI protocols. Honestly, the unit tests are intended for people who are modifying Rosetta, to make sure that their changes haven't negatively affected some other part of Rosetta. They aren't intended as a post-compilation check for released versions. If your compilation of 3.14 finished successfully, and if you're not getting error messages or noticeably odd results when you run your protocol, then you can be (relatively) confident that things compiled successfully, without the need to run the unit tests.
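As a quick sanity check of an MPI build, you can run a small job under mpirun and confirm it exits cleanly. This is a sketch only: the executable suffix and the input file below are assumptions that depend on your build platform and data, not something from this thread.

```shell
# Hypothetical sanity check for an MPI build of Rosetta.
# The binary suffix (.mpi.linuxclangrelease) and input.pdb are examples;
# substitute the executable your build actually produced and a real PDB.
mpirun -np 4 ./bin/score_jd2.mpi.linuxclangrelease \
    -in:file:s input.pdb \
    -out:file:scorefile score.sc
```

If this completes without errors and produces a sensible scorefile, the MPI build itself is working, regardless of what the unit-test summary says.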
Hi all,
I have been trying to deploy Rosetta 3.14 on my local machine. The compilation seems to be OK, since I did not observe any errors and scons reported a normal exit. However, I got some very strange results when running the unit tests; here is a snippet:
-------- Unit test summary --------
Total number of tests: 4613
number tests passed: 1662
number tests failed: 2951
-------------Omitted---------------
Success rate: 36.028614784305226%
---------- End of Unit test summary
Done!
I checked the test output, and it seems that a 'Signal code: Address not mapped (1)' error occurred very frequently and has caused this issue.
For compilation, I set the paths to mpicxx and mpicc in the site.settings file.
Compiler: AMD AOCC 4.2.0 (mpicxx and mpicc)
MPI version: OpenMPI 4.1.1, compiled with AOCC 4.2.0
System: openSUSE Leap 15.5
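For reference, the site.settings override I used looks roughly like the following. This is a sketch: the "user" section name and the "cxx"/"cc" override keys are assumptions based on the stock site.settings template, and the exact structure may differ between Rosetta versions.

```python
# Sketch of a Rosetta site.settings fragment pointing SCons at MPI wrappers.
# The section name "user" and the "cxx"/"cc" keys are assumptions based on
# the stock template shipped with the source; check your own checkout.
settings = {
    "user": {
        "prepends": {},
        "appends": {},
        "overrides": {
            "cxx": "mpicxx",  # C++ compiler wrapper from OpenMPI
            "cc": "mpicc",    # C compiler wrapper from OpenMPI
        },
        "removes": {},
    },
}
```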
I have also tried other compilers (GNU GCC 11.3.0/14.2.0) and OpenMPI 4.1.6/5.0.5. GCC 14.2.0 wouldn't pass compilation, and the other combinations didn't work either; the success rate fluctuated around 35%. However, when compiling without MPI, the success rate was around 99%.
I was wondering whether this could be related to my compiler or to OpenMPI. Do I need to link the program against a specific version of OpenMPI? I have attached the compilation logs and the test output for your reference.
Many thanks
compile_debug_test.log
compile_debug.log
compile_release.log
test.log