<p class="MsoNormal">I have a four node cluster of QEMU/KVM virtual machines. I installed MPICH-3.2 and ran the mpi-hello-world program with no problem.<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">I installed libfabric-1.5.3 and ran fabtests-1.5.3:<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New"">$ $PWD/runfabtests.sh -p /nfs/fabtests/bin sockets 192.168.100.201 192.168.100.203<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""><o:p> </o:p></span></p>
<p class="MsoNormal">And all tests pass:<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""># --------------------------------------------------------------<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""># Total Pass 73<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""># Total Notrun 0<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""># Total Fail 0<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""># Percentage of Pass 100<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:10.0pt;font-family:"Courier New""># --------------------------------------------------------------<o:p></o:p></span></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">I rebuilt MPICH after configuring it to use libfabric. I recompiled the mpi-hello-world program. When I run mpi-hello-world with libfabric, it prints the “hello” message from all four nodes but hangs in MPI_Finalize.<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">I rebuilt libfabric and MPICH with debugging enabled and generated a log file when running mpi-hello-world on just two nodes (i.e. using “-n 2” instead of “-n 4”). The log file indicates that it is stuck “Waiting for 1 close operations”,
repeating “MPID_nem_ofi_poll” over and over until I stop the program with control-C:<o:p></o:p></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New"">...
<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> <"MPID_nem_ofi_poll"(3e-06) src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_progress.c[124]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> >"MPID_nem_ofi_poll" src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_progress.c[45]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> >"MPID_nem_ofi_cts_send_callback" src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_cm.c[188]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> >"MPID_nem_ofi_handle_packet" src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_cm.c[167]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> <"MPID_nem_ofi_handle_packet"(3e-06) src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_cm.c[175]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> <"MPID_nem_ofi_cts_send_callback"(9e-06) src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_cm.c[191]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> >"MPID_nem_ofi_data_callback" src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_msg.c[124]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> <"MPID_nem_ofi_data_callback"(3e-06) src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_msg.c[173]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> <"MPID_nem_ofi_poll"(0.00404) src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_progress.c[124]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""><MPIDI_CH3I_PROGRESS(0.00796) src/mpid/ch3/channels/nemesis/src/ch3_progress.c[659]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New"">Waiting for 1 close operations src/mpid/ch3/src/ch3u_handle_connection.c[382]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New"">>MPIDI_CH3I_PROGRESS src/mpid/ch3/channels/nemesis/src/ch3_progress.c[424]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> >"MPID_nem_ofi_poll" src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_progress.c[45]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New""> <"MPID_nem_ofi_poll"(3e-06) src/mpid/ch3/channels/nemesis/netmod/ofi/ofi_progress.c[124]<o:p></o:p></span></p>
<p class="MsoNormal"><span style="font-size:8.0pt;font-family:"Courier New"">...
<o:p></o:p></span></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">I get the same behavior with OpenVPN; mpi-hello-world prints the “hello” message from all four nodes and hangs. Without libfabric, it runs normally.<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Is there a known issue with libfabric on a QEMU/KVM virtual cluster? It seems like this should work.<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">-- <o:p></o:p></p>
<p class="MsoNormal">John Wilkes | <b><span style="color:#00B050">AMD Research</span></b> |
<a href="mailto:john.wilkes@amd.com"><span style="color:blue">john.wilkes@amd.com</span></a> | office: +1 425.586.6412 (x26412)<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
</div>
</body>
</html>