Steve,

Thanks a lot for the reply.

I could run cpi from the examples directory.

But I see some error messages when trying to run IMB-MPI1. I am using 219297_IMB_2.3. Which version are you using?

David

Steve Wise <swise@opengridcomputing.com> wrote:

On Wed, 2006-12-06 at 10:03 -0800, david elsen wrote:
> Shaun / Steve,
>
> To get past the "librdmacm.so: cannot open shared object file: No such
> file or directory" error message, LD_RUN_PATH also needs to be set.
>
> Anyway, now that I am able to run mvapich2 0.9.8-Release, I am trying
> to figure out how to run the various benchmark tests using this MPI
> tool.
>
> Has anyone run the Pallas tool with the OSC MPI or OpenMPI? I also
> want to run the OSC benchmark tests. Are any guidelines available for
> these?
>
> Thanks,
> David

I've run the IMB benchmarks (aka Pallas) on mvapich2 0.9.8 over iWARP. The
mvapich2 user guide explains how to start up the mpd daemons and use
mpiexec. It's fairly straightforward. You need ssh or rsh access, and
you need to set up a few files.

Then pull down IMB and build it.

To run a 2-node IMB-MPI1 test, you do something like this:

$ mpdboot -n 2
$ mpiexec -n 2 <PATH-TO-IMB>/IMB-MPI1

This will run the entire MPI1 suite.

Steve.
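For reference, the steps discussed above chained together look roughly like the sketch below. The library path, hostnames, and secretword are placeholders rather than values from this thread, <PATH-TO-IMB> is wherever IMB-MPI1 was built, and ~/.mpd.conf plus mpd.hosts are presumably the "few files" mentioned above (see the mvapich2/MPICH2 user guide for the exact contents):

# Point the runtime linker at librdmacm.so; the path is only an example,
# use wherever your librdmacm/OFED libraries actually live.
$ export LD_LIBRARY_PATH=/usr/local/ofed/lib64:$LD_LIBRARY_PATH
# LD_RUN_PATH matters at link time if you rebuild mvapich2 or the benchmarks.
$ export LD_RUN_PATH=/usr/local/ofed/lib64

# mpd wants a secretword file (mode 600) and a host list, one node per line.
$ echo "secretword=changeme" > ~/.mpd.conf
$ chmod 600 ~/.mpd.conf
$ cat > mpd.hosts <<EOF
node1
node2
EOF

# Start an mpd ring on the two nodes, check it, run the suite, shut down.
$ mpdboot -n 2 -f mpd.hosts
$ mpdtrace
$ mpiexec -n 2 <PATH-TO-IMB>/IMB-MPI1
$ mpdallexit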