Message-ID: <1394387372.61661.YahooMailBasic@web140302.mail.bf1.yahoo.com>
Date: Sun, 9 Mar 2014 10:49:32 -0700 (PDT)
From: Anthony Tanoury <tanoury@...oo.com>
To: john-users@...ts.openwall.com
Subject: Re: Re: Help - mpi ocl restore session


> On 2014-03-09 16:18, Anthony Tanoury wrote:
> >> So are you sure that node has the same view of directories and NFS as the others have?
> >> What if you ssh into that missing node, can you see its session file and log somewhere there?
> >> Or if you start a non-MPI opencl session on that node, using the same format, input files,
> >> mode and so on - does it work fine? You might want to add --verb=5 to get more output than usual.
> >
> > Yes, I can ssh into the same directories and see all files.
>
> What do you mean "all files"? What about the missing ones?

I can start an opencl session (one master, two remotes). I can ssh into the remote computers, change into the john/run directory, and see the session log and .rec files.

Keep in mind that the only trouble I'm having is restoring a session! My master node is only producing one .rec file.
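For reference, here is roughly what I'd expect the session files to do. (The session name "wpa" and the mpirun invocation below are just illustration, not what I actually ran.)

mpirun -np 3 -host ub0,ub1,ub2 ./john --session=wpa --form:wpapsk-opencl --wordlist=rockyou.txt --rules jtr.hccap

Each MPI rank should write its own restore file into the shared run directory (something like wpa.rec, wpa.2.rec, wpa.3.rec), and the session should then be resumable with the same node count:

mpirun -np 3 -host ub0,ub1,ub2 ./john --restore=wpa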

>
> > But starting an opencl session fails on both remote computers. Here is the error message:
> >
> > Last login: Sat Mar  8 20:05:33 2014 from ub0
> > tony@ub2:~$ sh ~/PSKs/john/SSHGPU.sh
> > [ub2:06117] *** Process received signal ***
> > [ub2:06117] Signal: Segmentation fault (11)
> > [ub2:06117] Signal code: Address not mapped (1)
> > (...)
>
> Ignore MPI until this is solved. If you can't run john on the nodes, you can't expect to have them working under MPI.
>
> What is the content of that shell script?

cd /home/tony/bleeding-jumbocl/run/

./john --form:wpapsk-opencl -dev=cpu --wordlist=rockyou.txt --rules jtr.hccap

If I ssh into the remote computers and enter the above command lines, I get that segmentation fault.

If I enter this command line instead (without the -opencl), everything runs just fine:

./john --form:wpapsk -dev=cpu --wordlist=rockyou.txt --rules jtr.hccap

Does that indicate an opencl problem??
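If it helps, I suppose I can run these diagnostics on each remote node (assuming the usual bleeding-jumbo options work there):

./john --list=opencl-devices
./john --test --form:wpapsk-opencl --dev=cpu

and try to get a backtrace of the crash with gdb:

gdb --args ./john --form:wpapsk-opencl -dev=cpu --wordlist=rockyou.txt --rules jtr.hccap
(gdb) run
(gdb) bt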


>
> > I can ssh into the directory of version 1.8.0.2-bleeding-jumbo_mpi [linux-x86-64-native] and run an mpi session just fine.
> >
> > Can you point me in the right direction to fix this?
>
> I'm not sure I understand your problem, and some of your wording is not helping. For example, you say twice here that you "ssh into a directory". Such statements leave me guessing what you really did.
>
> Once you can ssh to each host and successfully run john, you should be able to start an MPI job using all of them - as long as all nodes and the master have the same view of the working directories. There's nothing more to it.
>
> magnum

An opencl problem??

Thanks so much magnum!!


