Login to Origam is too slow

What version of Origam are you using: Origam 2025.10

I was trying to

log in to my Origam account

I was expecting

Swift login

Instead I’ve got

30 seconds of loading the page

Additional Info

While debugging, we traced the slow loading to the following snippet, which spends the time hashing the password:

// Derives the PBKDF2 subkey used for password verification; this overload of
// Rfc2898DeriveBytes defaults to HMAC-SHA1, and the iteration count makes the
// call deliberately expensive.
using (var deriveBytes = new Rfc2898DeriveBytes(password, salt, iterationCount))
{
    generatedSubkey = deriveBytes.GetBytes(PBKDF2_SUBKEY_LENGTH);
}

We were able to get from the original 30 seconds down to 15 by using

Microsoft.AspNetCore.Cryptography.KeyDerivation.KeyDerivation.Pbkdf2

Instead of

 System.Security.Cryptography.Rfc2898DeriveBytes
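Both APIs compute the same PBKDF2 function from RFC 2898, so swapping in the faster call changes only wall-clock cost, never the derived subkey. As an illustrative sketch (in Python rather than C#, standard library only), a naive hand-rolled PBKDF2 produces byte-identical output to the optimized `hashlib.pbkdf2_hmac`:

```python
import hashlib, hmac

def pbkdf2_naive(hash_name, password, salt, iterations, dklen):
    """Straightforward PBKDF2 per RFC 2898: slow, but any correct
    implementation (fast or slow) must return exactly these bytes."""
    hlen = hashlib.new(hash_name).digest_size
    blocks = []
    for i in range(1, -(-dklen // hlen) + 1):          # ceil(dklen / hlen) blocks
        u = hmac.new(password, salt + i.to_bytes(4, "big"), hash_name).digest()
        t = bytearray(u)
        for _ in range(iterations - 1):                # U_2 .. U_c, XOR-accumulated
            u = hmac.new(password, u, hash_name).digest()
            for j in range(hlen):
                t[j] ^= u[j]
        blocks.append(bytes(t))
    return b"".join(blocks)[:dklen]

password, salt = b"hunter2", b"0123456789abcdef"       # illustrative values only
fast = hashlib.pbkdf2_hmac("sha256", password, salt, 1000, dklen=32)
slow = pbkdf2_naive("sha256", password, salt, 1000, 32)
assert fast == slow  # identical subkeys: implementation speed is not a security parameter
```

This is the core of the argument below: the .NET 4.8 slowness is an implementation artifact, not added security.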

We followed the guidelines from PBKDF2 Hash - Password Storage - Online Encoder.

And SHA-256 was used with a 600,000 iteration count, as per the OWASP recommendation (PBKDF2 - Wikipedia).

In the end we were able to get down to ~900 ms load time.

These changes were made in a new class, CorePasswordHasherService, which introduces a new format of hashed keys:
[prefix].[iteration_count in hexadecimal].[salt in base64].[subkey in base64]

inspired by the Python library Passlib: $pbkdf2-algo$rounds$salt_b64$hash_b64.
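The format is easy to emit and parse because standard Base64 uses only A-Z, a-z, 0-9, +, / and =, never a dot, so "." is an unambiguous separator. A small sketch of the round trip (Python for brevity; the prefix string is our guess, not necessarily the one CorePasswordHasherService actually writes):

```python
import base64

PREFIX = "pbkdf2-sha256"  # hypothetical prefix; the real value may differ

def encode(iterations: int, salt: bytes, subkey: bytes) -> str:
    # [prefix].[iteration_count in hexadecimal].[salt in base64].[subkey in base64]
    return ".".join([
        PREFIX,
        format(iterations, "x"),                 # e.g. 600000 -> "927c0"
        base64.b64encode(salt).decode(),
        base64.b64encode(subkey).decode(),
    ])

def decode(stored: str) -> tuple[str, int, bytes, bytes]:
    prefix, iter_hex, salt_b64, subkey_b64 = stored.split(".")
    return prefix, int(iter_hex, 16), base64.b64decode(salt_b64), base64.b64decode(subkey_b64)

stored = encode(600_000, b"0123456789abcdef", b"\x01" * 32)
assert decode(stored) == (PREFIX, 600_000, b"0123456789abcdef", b"\x01" * 32)
```

Storing the iteration count inside each hash lets the count be raised later without invalidating existing passwords: old hashes still verify with their recorded count.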

Currently, our class replaces CorePasswordHasher, which is still called for legacy password hashes.
It would be nice to be able to switch the password hasher implementation via configuration.

  1. Class backend/Origam.Security.Common/BrockAllen.IdentityReboot/System.Web.Helpers.Crypto.cs was optimized by calling the faster implementation Microsoft.AspNetCore.Cryptography.KeyDerivation.KeyDerivation.Pbkdf2.
  2. Class backend/Origam.Server/CorePasswordHasherService.cs is a new implementation of the IPasswordHasher<IOrigamUser> interface that uses the SHA-256 hashing algorithm.
  3. The Password field in the model’s Security package was enlarged to allow longer hashes, now up to 1,000 characters.

What should be done next:

  • In Startup.cs, the registration of the new password hasher CorePasswordHasherService is currently hard-coded. We believe it would be better to make it a configurable item, probably in the appsettings.json file.
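One common pattern is to read a hasher name from configuration and resolve it through a small registry at startup. A minimal sketch of the idea (in Python for brevity; the "Security:PasswordHasher" key is our suggestion, not an existing Origam setting):

```python
import json

# Hypothetical appsettings.json fragment; the "PasswordHasher" key is an assumption.
appsettings = json.loads('{"Security": {"PasswordHasher": "Core"}}')

# Registry mapping a config value to a hasher implementation (class names from this PR).
HASHERS = {
    "Legacy": "CorePasswordHasher",        # former SHA-1 / Rfc2898DeriveBytes hasher
    "Core": "CorePasswordHasherService",   # new SHA-256 / KeyDerivation.Pbkdf2 hasher
}

def resolve_hasher(settings: dict) -> str:
    # Default to the legacy hasher so existing deployments keep working unchanged.
    name = settings.get("Security", {}).get("PasswordHasher", "Legacy")
    return HASHERS[name]

assert resolve_hasher(appsettings) == "CorePasswordHasherService"
assert resolve_hasher({}) == "CorePasswordHasher"
```

In ASP.NET Core the same idea maps to reading IConfiguration in Startup.cs and registering the selected IPasswordHasher<IOrigamUser> with the DI container.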

Could you give us some recommendations on how to make it configurable? There are plenty of configuration options, and it would be better to get an overview of what should be changed before we dive in deeper.

Thank you,

David

The slowdown you’re observing is intentional and comes from increasing the computational cost of password verification.

This has two security benefits:

  • It slows down online brute-force attacks by making each login attempt more expensive.
  • It also protects against offline attacks in case the password database is compromised, by increasing the cost of each hash comparison.

Because of that, reducing this cost (as proposed in the pull request) would directly weaken both protections at the same time. It would make automated login attempts faster and significantly reduce resistance against offline password cracking.

At the moment, the only way to improve login responsiveness without weakening these protections is to run the application on a more powerful server.

For this reason, we consider the current behavior to be correct and aligned with security best practices, and we are not going to accept the pull request.

Hi Petr,

yes, of course we understand the reasons for the hashing protections and both points you have stated.

There are a few improvements in our implementation:

  1. The Rfc2898DeriveBytes implementation in .NET 4.8 is very slow, as discussed in stackoverflow/questions/78441140. Inside backend/Origam.Security.Common/BrockAllen.IdentityReboot/System.Web.Helpers.Crypto.cs we can easily use the faster implementation of PBKDF2 (Microsoft.AspNetCore.Cryptography.KeyDerivation.KeyDerivation.Pbkdf2). This alone doubled performance, and it does not decrease security at all: the slowness came not from a stronger hashing algorithm but only from a slow implementation. So at least your second point does not apply to this situation.
  2. Before implementing, we checked all the recommendations we could find.

The OWASP Cheat Sheet states that SHA-256 is recommended as the hashing function with the PBKDF2 method:

Since PBKDF2 is recommended by NIST and has FIPS-140 validated implementations, it should be the preferred algorithm when these are required.

The PBKDF2 algorithm requires that you select an internal hashing algorithm such as an HMAC or a variety of other hashing algorithms. HMAC-SHA-256 is widely supported and is recommended by NIST.

The work factor for PBKDF2 is implemented through an iteration count, which should be set differently based on the internal hashing algorithm used:

  • PBKDF2-HMAC-SHA1: 1,400,000 iterations
  • PBKDF2-HMAC-SHA256: 600,000 iterations
  • PBKDF2-HMAC-SHA512: 220,000 iterations

Parallel PBKDF2

  • PPBKDF2-SHA512: cost 2
  • PPBKDF2-SHA256: cost 5
  • PPBKDF2-SHA1: cost 10

These configuration settings are equivalent in the defense they provide. (Numbers as of December 2022, based on testing with RTX 4000 GPUs.)
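These recommendations translate directly into the iteration parameter of the key-derivation call. An illustrative lookup (Python's stdlib `hashlib.pbkdf2_hmac`; counts copied from the OWASP list above):

```python
import hashlib

# OWASP-recommended PBKDF2 iteration counts per inner hash (numbers as of Dec 2022)
OWASP_ITERATIONS = {"sha1": 1_400_000, "sha256": 600_000, "sha512": 220_000}

def derive(password: bytes, salt: bytes, hash_name: str) -> bytes:
    # dklen defaults to the inner hash's digest size (32 bytes for SHA-256)
    return hashlib.pbkdf2_hmac(hash_name, password, salt, OWASP_ITERATIONS[hash_name])

subkey = derive(b"example-password", b"16-byte-salt-..!", "sha256")
assert len(subkey) == 32
```

With an OpenSSL-backed implementation, 600,000 SHA-256 iterations complete in a fraction of a second on typical server CPUs, which is consistent with the ~900 ms total login time reported above.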

In Why You Should Use 310,000+ Iterations with PBKDF2 in 2025, the author states that SHA-1 is no longer recommended, and that hashing should take 50–150 ms on your production hardware, which is very different from tens of seconds. Such delays should not be necessary, as we follow all the recommendations and still reach about 1 s to log in.

  1. Use SHA-256 or SHA-512

Avoid older digests like SHA-1.
SHA-256 is widely supported and fast enough for most applications.

  2. Measure Performance

Benchmark your setup and ensure the hashing takes 50–150ms on your production hardware.
That’s a good sweet spot between security and UX.
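That benchmarking advice takes only a few lines to follow. A sketch (Python's `hashlib`, which wraps the same OpenSSL PBKDF2 core as fast native implementations; the candidate counts are ours) that times iteration counts against the 50–150 ms target:

```python
import hashlib, time

def pbkdf2_ms(iterations: int) -> float:
    """Wall-clock milliseconds for one PBKDF2-HMAC-SHA256 derivation."""
    start = time.perf_counter()
    hashlib.pbkdf2_hmac("sha256", b"benchmark-password", b"16-byte-salt-..!", iterations)
    return (time.perf_counter() - start) * 1000

# Start at the OWASP floor of 600k and raise the count as long as the target
# hardware stays inside the 50-150 ms sweet spot quoted above.
for candidate in (600_000, 1_200_000):
    print(f"{candidate:>9} iterations: {pbkdf2_ms(candidate):.1f} ms")
```

The measurement should be run on the production hardware, since the whole point of the work factor is to be calibrated against the machines that actually verify logins.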

Based on the above evidence, I don’t agree that our new implementation decreases any security measures of password hashing. We use the SHA-256 hashing function instead of the no-longer-recommended SHA-1. The implementation is also ready for newer recommended hashing methods like Argon2id to be added, but for now we stayed with the already used PBKDF2 method, only with the much faster .NET 8 implementation.

What we can discuss is:

  • How many iterations should be used.
  • Whether another hashing method should also be supported, to get good performance and fulfil all security requirements.
  • Allowing ORIGAM to configure which implementation of IPasswordHasher to use in your instance.

In our new implementation we use:

  • The already used PBKDF2 method, only with a faster implementation
  • The SHA-256 hashing function
  • A 64-byte salt
  • Currently 600,000 iterations (we can easily increase this, or reuse the dynamic count from the former method; how many iterations should be considered secure enough is open for discussion)

Former method:

  • PBKDF2, the slow .NET 4.8 implementation
  • The SHA-1 hashing function (no longer recommended)
  • A 16-byte salt
  • A dynamic iteration count based on the year; currently it uses 8M+ iterations
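A rough way to compare the two configurations on server CPU is to measure per-iteration cost at a small sample count and extrapolate. This sketch is illustrative only: it covers CPU verification cost, not GPU cracking economics, and treats 8M as an approximation of the former scheme's current count:

```python
import hashlib, time

def per_iteration_s(hash_name: str, sample: int = 200_000) -> float:
    """Seconds per PBKDF2 iteration, measured over a small sample run."""
    start = time.perf_counter()
    hashlib.pbkdf2_hmac(hash_name, b"pw", b"16-byte-salt-..!", sample)
    return (time.perf_counter() - start) / sample

old_ms = per_iteration_s("sha1") * 8_000_000 * 1000    # former: SHA-1, ~8M iterations
new_ms = per_iteration_s("sha256") * 600_000 * 1000    # new: SHA-256, 600k iterations
print(f"former ~ {old_ms:.0f} ms/login, new ~ {new_ms:.0f} ms/login")
```

This makes the trade-off concrete: the former scheme spends roughly an order of magnitude more iterations per verification than the OWASP-recommended SHA-256 setting.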
To your original two points:

  • It also protects against offline attacks in case the password database is compromised, by increasing the cost of each hash comparison.

This is definitely achieved by our implementation as well.

  • It slows down online brute-force attacks by making each login attempt more expensive.

Is this argument good enough for staying with the slower implementation? Could it also have an impact on DoS attacks, given that the hashing computation is very resource-consuming?