Date: Sat, 1 Sep 2018 14:39:14 -0400
From: Matt Weir <>
Subject: Re: How to use multiple lists at the same time?

I sometimes find myself doing this when using lists of previously
cracked passwords. I like to keep the sources separate, which means a
lot of files. Admittedly there is a lot of overlap between them, so
this is usually not the optimal way to do things, but it works for me.
I've mostly been using john's '--pipe' option for this, since I can
cat the files out, pipe them into JtR, and still have mangling rules
applied to them. Just FYI, the main difference between '--pipe' and
'--stdin' is that '--pipe' allows you to apply mangling rules to
incoming guesses, while '--stdin' does not.
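
A minimal sketch of that workflow (the file names and hash file here
are hypothetical):

```shell
# Concatenate several split wordlists and feed them to JtR,
# applying mangling rules to every incoming candidate.
cat cracked-*.txt | ./john --pipe --rules hashes.txt
```

The same pipeline works with '--stdin' in place of '--pipe', but then
the candidates are tried as-is, without rule mangling.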

Another approach I used to use before '--pipe' was the 'xargs' command
on Linux-based systems. I'd need to dig into some of my old scripts
for examples with JtR, but here is a link to a page describing xargs


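One way that approach can look (file names hypothetical) is running a
separate john wordlist pass for each split file:

```shell
# Run one JtR invocation per split wordlist file.
ls cracked-*.txt | xargs -I{} ./john --wordlist={} --rules hashes.txt
```

Note that each invocation is its own run, so unlike '--pipe' this does
not give you a single session covering all the files.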
On Sat, Sep 1, 2018 at 1:16 PM, Eric Oyen <> wrote:
> Well, the problem I am finding is there are always more passwords to add, and incremental mode would include stuff that hardly anyone uses. Mostly, I would just add to the end of the last file until it hits max size and then create another.
> Also, I have been finding (through some reading) that folks are also starting to use URLs as passwords, and we know how long those can get. BTW, this is both part of my continued effort to crack one of my own hashes and the start of a project (that hopefully turns into a paying job soon).
> I know that some of my methodology seems a bit archaic, but as they say, if it works, use it.
> BTW, this is also part and parcel of my own education. I have IT skills that are a little rusty, and at my age (53), I would still like to be able to get back into the workforce and not feel like I am playing catch-up with the younger set. :)
> -Eric
>> On Sep 1, 2018, at 9:36 AM, Solar Designer <> wrote:
>> Hi Eric,
>> On Sat, Sep 01, 2018 at 09:19:07AM -0700, Eric Oyen wrote:
>>> Ok, my situation is this: I use a 30+ GB file, which can be a real PITA to edit. I have used the split command (here on OS X) to split it down into more manageable sizes (about 10 MB each). Sure, this will create a lot of files. What I want JtR to do is load each list in turn during the same session to process against the hashes I have.
>> JtR doesn't support use of multiple wordlist files in one session.
>> We might or might not add this feature later.
>> Meanwhile, you could work around this with commands like:
>> cat *.txt | ./john --stdin ...
>> or using "--pipe" in place of "--stdin".
>> However, I don't recommend this.  What I recommend is that you simply
>> avoid generating wordlists this large.  Especially not if you also
>> intend to edit them (why would you?!)  Use a smaller wordlist, and use
>> JtR's wordlist rules, etc. to modify and extend it on the fly as needed.
>> I understand that with pre-applied rules you can filter out duplicates
>> (such as with our "./unique" program), which is about the only valid use
>> case for such large wordlists.  If that's what you do, you should have
>> no need to edit the resulting wordlist.
>> Having a large generated wordlist _and_ wanting to edit it indicates
>> that you need to take a step back, and reconsider what you're doing.
>> IIRC, at some point in the past you mentioned you were using Crunch,
>> but perhaps you don't anymore?  There's no valid use case for Crunch
>> along with JtR as we have equivalent or better functionality built in.
>> I hope this helps.
>> Alexander
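
The dedup-then-rules approach Alexander describes might be sketched
roughly like this (file names hypothetical; 'unique' ships in JtR's
run directory and reads standard input):

```shell
# Merge the split lists, remove duplicates with JtR's unique tool
# (it keeps the first occurrence of each line), then let JtR apply
# mangling rules on the fly instead of pre-applying them.
cat cracked-*.txt | ./unique deduped.lst
./john --wordlist=deduped.lst --rules hashes.txt
```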
