custom_app_context.verify performance decreased largely after upgrading to 1.6.5

Issue #60 open
Pan Luo
created an issue

Here is the code I used for testing:

from passlib.apps import custom_app_context as pwd_context

hash = pwd_context.encrypt("somepass")
for _ in range(100):
    ok = pwd_context.verify("somepass", hash)  
  • Version 1.6.2: 31 seconds
  • Version 1.6.5: 182 seconds

The 100-iteration encrypt loop also slowed down, from 28 seconds to 182 seconds.
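For anyone reproducing the numbers above, here is a minimal stdlib-only timing harness; the PBKDF2 stand-in is just a placeholder so the snippet runs without passlib installed, and should be swapped for the actual verify call:

```python
import hashlib
import time

def time_calls(fn, n=100):
    """Return total seconds spent calling fn() n times."""
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return time.perf_counter() - start

# Stand-in check so this runs without passlib installed; with passlib,
# time lambda: pwd_context.verify("somepass", hash) instead.
def check():
    return hashlib.pbkdf2_hmac("sha256", b"somepass", b"salt", 1000)

print("100 calls took %.2f seconds" % time_calls(check))
```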

Comments (8)

  1. Eli Collins repo owner

    This is because 1.6.4 made an across-the-board update to the default rounds settings in passlib, to match increasing processor speeds. A number of them, particularly the rounds choices in custom_app_context, hadn't been updated in a few years.

    That said, I wasn't intending that much of a slowdown. I tried to scale things to account for performance on the system I tested on, but I may not have scaled things down far enough. I'll have to run some benchmarks on a few more systems.

    In the meantime, in the passlib source directory there is a "" script. If you could run it with "python sha256_crypt" and post the output, that would give me a good timing comparison.

  2. Pan Luo reporter

    Here you go. For 1.6.2:

    $ python sha256_crypt
    hash............: sha256_crypt (using builtin backend)
    speed...........: 416177 iterations/second
    target time.....: 350 ms
    target rounds...: 145662

    For 1.6.5:

    $ python sha256_crypt
    hash............: sha256_crypt (using builtin backend)
    speed...........: 436932 iterations/second
    target time.....: 350 ms
    target rounds...: 152926
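    As a sanity check on those numbers (assuming the script simply multiplies the measured speed by the target time), both runs line up exactly:

```python
# "target rounds" = measured iterations/second * target time in seconds
def target_rounds(iters_per_sec, target_ms=350):
    return round(iters_per_sec * target_ms / 1000)

print(target_rounds(416177))  # 1.6.2 run -> 145662
print(target_rounds(436932))  # 1.6.5 run -> 152926
```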
  3. Eli Collins repo owner

    I meant to get back to you much sooner on this.

    My main CPU target for the default settings in custom_app_context is cloud hosting providers such as Linode / Digital Ocean / etc. The current 1.6.5 tunings are a little high for them currently, but only by 15% or so (I really need to turn my retuning into a reproducible process, since it only gets done every 1.5 years on average); so I don't think I want to change the current default settings.

    Your problem highlights a deficiency in the custom_app_context model... it's trying to offer a preset configuration which by definition won't be equally suitable to all CPUs; and your CPU seems disproportionately slower at sha256 than the ones I have access to. At the same time, turning the defaults down for lower CPUs isn't good, as an attacker will generally have high-end CPUs.

    For now, for your specific case, I think the best thing is probably to stop using custom_app_context, and use a CryptContext instance tuned to your CPU. I put the following together from custom_app_context.to_string(), combined with the target rounds you listed above (rounded to two significant figures)...

    from passlib.context import CryptContext
    pwd_context = CryptContext.from_string("""
    # generated 2015-8-18, 350ms target time
    schemes = sha256_crypt
    default = sha256_crypt
    all__vary_rounds = 0.1
    sha256_crypt__min_rounds = 150000
    admin__sha256_crypt__min_rounds = 300000
    """)

    If your application is configured from an .ini file, I'd recommend placing that configuration there, and using CryptContext.from_path() instead of hard-coding the string into your code. In either case, it should be retuned whenever you change CPUs.
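    A sketch of the .ini approach, using only the stdlib so it runs as-is (the CryptContext.from_path call and its "passlib" section name are assumptions based on passlib's documented defaults):

```python
import configparser
import os
import tempfile

CONFIG = """\
[passlib]
schemes = sha256_crypt
default = sha256_crypt
all__vary_rounds = 0.1
sha256_crypt__min_rounds = 150000
admin__sha256_crypt__min_rounds = 300000
"""

path = os.path.join(tempfile.mkdtemp(), "passlib.ini")
with open(path, "w") as f:
    f.write(CONFIG)

# With passlib installed, load it like so:
#   from passlib.context import CryptContext
#   pwd_context = CryptContext.from_path(path)

# Sanity-check that the file is valid INI:
cp = configparser.ConfigParser()
cp.read(path)
print(cp["passlib"]["sha256_crypt__min_rounds"])
```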

    Going forward, the main solution I've come up with is to replace custom_app_context with a script developers can run to generate their own configuration file, along with an expiration date to alert when a configuration needs updating again (I've started a separate issue #65 to track that feature).

    Let me know if this solves things (for now) for you, and I'll close this issue.

  4. Eli Collins repo owner

    Bumping this to 1.8. What would solve this long-term is to 1) formalize and document the process by which the defaults are chosen, and 2) add a command-line tool for choosing rounds per-system (issue #80).
