Date: Fri, 3 May 2013 02:52:39 +0200
From: magnum <>
Subject: Re: MPI vs. --fork

On 3 May, 2013, at 0:20 , magnum <> wrote:
> Currently it's not finished and MPI session save/resume is busted. A warning is printed about that if applicable. I am not aware of any problems with non-MPI builds of bleeding though: All half-baked code is #ifdef HAVE_MPI.

For the record, I believe the currently committed version works fine now. The only thing missing is hijacking the --fork line in the session file (using it for MPI as well).

By the way, I can see an additional reason to do that hijacking: if we don't, there's nothing telling you that a .rec file was from an MPI run. So if you just resume it without mpirun, you will only resume node 0.
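To make the failure mode concrete, here is a minimal sketch of the scenario described above. The session name, node count, wordlist, and hash file are hypothetical; --session and --restore are standard john options, but this does not reproduce the patch under discussion:

```shell
# Start a 4-process MPI session (session name "mpijob" is an assumption):
mpirun -np 4 ./john --session=mpijob --wordlist=words.lst hashes.txt

# Nothing in mpijob.rec records that this was an MPI run, so resuming
# without mpirun silently restores only node 0's state:
./john --restore=mpijob

# The correct resume repeats the mpirun invocation:
mpirun -np 4 ./john --restore=mpijob
```

If the --fork line in the .rec file were reused for MPI, a plain `--restore` could detect the mismatch and warn instead of silently resuming a single node.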

