Nevertheless, more rounds isn't much extra strength... What you need, as ever, is a long, strong password with good entropy.
Thank you! This is not surprising coming from Paul, but it's still encouraging, as I've spent years trying to convince people of the folly of long key derivation in the face of exponential password strength.
An industry-minimum strength for key derivation is good, and memory-hard algorithms are even better, but adding multiple seconds' worth of rounds on low-power devices is just a waste of time and electricity that needlessly inconveniences the user. If you want to inconvenience your user, then do it the *right* way: include an algorithm in your password creation routine that rejects short and simplistic passwords. Heck, even run it through HIBP and reject any that are found there as well if you've got an Internet connection. It's the *password* that matters (ITPS).
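Here's a rough sketch of the kind of creation-time check I mean, in Python, using the public Pwned Passwords range API (k-anonymity, so only a 5-character hash prefix ever leaves the device). The length cutoff and the "simplistic" test below are arbitrary placeholders for illustration, not recommendations:

import hashlib
import urllib.request

MIN_LENGTH = 12  # arbitrary cutoff, for illustration only

def is_pwned(password: str) -> bool:
    """Check a candidate against the Pwned Passwords range API.
    Only the first 5 hex chars of the SHA-1 hash are ever sent."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    return any(line.split(":")[0] == suffix for line in body.splitlines())

def acceptable(password: str) -> bool:
    """Reject short or simplistic candidates, then check the breach corpus."""
    if len(password) < MIN_LENGTH:
        return False
    if password.isalpha() or password.lower() == password:  # crude "simplistic" test
        return False
    return not is_pwned(password)

# acceptable("Password123!")   -> almost certainly False (it's in the breach corpus)
# acceptable("k7#Qw9!mZ2r@Lp") -> True, provided it isn't in the corpus

Reject bad passwords once, at creation time, and the user never pays a per-login delay for it.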
Adding 1 random ASCII character to your password requires 95X more work to brute force, and that work applies only to the attacker, not to you; it's likely <8% more work for you to type the extra character. Extended key derivation is for suckers, and I can't help but feel that Steve beguiled himself into doing this with SQRL via EnScrypt. The 5-second default derivation time in SQRL is not the end of the world, but it is pointless: a substantial delay that must be paid every single time you use the password. It makes users *feel* more secure, and that makes the hair stand up on the back of my neck. I'm *VERY* glad that Steve included a setting to adjust the derivation time in his SQRL Windows client; I use 1 second, which is the minimum.
To put this another way, well over 95% of passwords fall cleanly into one of two categories: easily guessed or not easily guessed. An easily guessed password cannot be protected even with a 60-second PBKDF2 delay, while a difficult-to-guess password will be secure with only 500ms of PBKDF2 delay. Forcing users to wait 5 seconds for key derivation adds relevant protection for maybe 1-5% of your users, yet 100% of users lose 5 seconds of their lifetimes every single time they log in or decrypt the resource.
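Back-of-the-envelope, with numbers I'm making up purely for illustration (a ~1-million-entry common-password list for the "easily guessed" case, 10 random printable-ASCII characters for the other, and an attacker rig assumed to be ~1,000X faster than the user's device):

# Rough crack-time estimate: is it the KDF delay or the password doing the work?
# All numbers below are illustrative assumptions, not benchmarks.

ATTACKER_SPEEDUP = 1_000  # assumed attacker hardware advantage over the user's device

def crack_time_years(guess_space: float, kdf_seconds_on_user_device: float) -> float:
    seconds_per_guess = kdf_seconds_on_user_device / ATTACKER_SPEEDUP
    return guess_space * seconds_per_guess / (60 * 60 * 24 * 365)

# Easily guessed: somewhere in a ~1 million entry common-password list,
# "protected" by a painful 60-second KDF.
weak = crack_time_years(1e6, 60.0)

# Hard to guess: 10 random printable-ASCII characters (95**10 combinations),
# protected by a barely noticeable 0.5-second KDF.
strong = crack_time_years(95**10 / 2, 0.5)  # /2 for the average case

print(f"weak password + 60 s KDF:    ~{weak:.4f} years to exhaust the list")
print(f"strong password + 0.5 s KDF: ~{strong:.2e} years on average")

Under those assumptions the 60-second KDF buys the weak password a matter of hours; the strong password never needed the delay in the first place.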
It doesn't really matter if Authy is using 1,000 rounds. 5,000 would *only* make brute force 5X harder. You'd have to bump it to 95,000 rounds just to simulate 1 extra random password character. Yes, I understand that most users aren't using random characters for their "master" passwords, but that's why you add the inconvenience to password creation rather than to the KDF; minimum requirements, though imperfect and arbitrary, are far better than extended key stretching.
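The arithmetic, taking their claimed 1,000 rounds as the baseline:

BASE_ROUNDS = 1_000  # the figure Authy's docs claim

# Each extra random printable-ASCII character multiplies the search space by 95,
# so matching it with iterations alone means multiplying the round count by 95.
for extra_chars in (1, 2, 3):
    print(f"+{extra_chars} random char(s) == {BASE_ROUNDS * 95 ** extra_chars:,} rounds")

# +1 random char(s) == 95,000 rounds
# +2 random char(s) == 9,025,000 rounds
# +3 random char(s) == 857,375,000 rounds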
However...
The docs don't say whether Authy is using PBKDF2 with SHA1 or with SHA256/512, but hopefully the latter, so that's what I'll assume. If so, an Intel i7 from 2016 can do ~500,000 PBKDF2-SHA256 hashes in 1 second in *Python* (much more in C). I've found no present-day ARM metric, but I did find a test where a Nexus One phone could do ~3,000 PBKDF2-SHA1 hashes per second. SHA256 is ~250% faster, so let's just assume that a modern phone can do 10,000 PBKDF2-SHA256 hashes per second. Anything less than ~1.5 seconds on the slowest devices is fine, so it seems very likely that Authy could use at least 10X more iterations than their docs claim. They should definitely bump this to 10,000 as an easy update and then implement Argon2id as a real upgrade.
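If anyone wants to sanity-check those numbers on their own hardware, this is the kind of quick measurement I mean. Note that CPython's hashlib.pbkdf2_hmac is typically backed by OpenSSL, so the result will be closer to the "much more in C" figure than to a pure-Python loop:

import hashlib
import os
import time

def pbkdf2_iterations_per_second(hash_name: str = "sha256",
                                 iterations: int = 100_000) -> float:
    """Time a single PBKDF2 derivation and report iterations per second."""
    password = b"correct horse battery staple"
    salt = os.urandom(16)
    start = time.perf_counter()
    hashlib.pbkdf2_hmac(hash_name, password, salt, iterations)
    elapsed = time.perf_counter() - start
    return iterations / elapsed

rate = pbkdf2_iterations_per_second()
print(f"~{rate:,.0f} PBKDF2-SHA256 iterations/second on this machine")

# Size the iteration count to a delay budget on the *slowest* device you support:
target_seconds = 1.0
print(f"iteration count for a ~{target_seconds:g} s derivation: {int(rate * target_seconds):,}")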
Hardened key derivation was a great idea that is certainly necessary for password systems, but it should only be done to a degree that does not impose upon usability at all. If the user has to patiently wait for key derivation, then you're doing it wrong.
If readers want to see what happens when users and developers get suckered into key derivation strength and "secret" extra factors, take a look at VeraCrypt's ridiculous PIM system. That's a rant for another thread, though.