Hashing Performance

Understanding Liferay's hashing algorithms and how their performance affects logins.


There have been a bunch of questions recently in Slack, in Ask, and even in Liferay Support tickets complaining about the time it takes to log into Liferay and what can be done to improve performance, specifically targeting the hash algorithms...

TL;DR - Liferay has increased the rounds on the PBKDF2 hash, which is detrimental to your login performance. Set passwords.encryption.algorithm=PBKDF2WithHmacSHA1/160/128000 in your portal-ext.properties, force users to change passwords (notes far below), and consider changing to a BouncyCastle-based implementation.

Now, if you grab a copy of the Liferay DXP Performance Benchmark Study, you can read really impressive statements like:

At 45,750 virtual users, we exceed the established performance budget of this test (i.e., sub 1 second login times). Thus, the performance inflection point for login is roughly between 45,500 and 45,750 virtual users while stable performance and throughput is around 47,000 virtual users.

These numbers sound fantastic, right? Too good to be true maybe?

Well, they are true, I can tell you, but there's a catch - their choice of encryption (hashing) algorithm.

Liferay Password Encryption Algorithms

Although they're referred to as encryption algorithms, they're actually 1-way hashing algorithms.

Liferay does not encrypt passwords. Encrypting a value means using a password or key to transform the data such that the same password or key can later decrypt it back to the original value. That would be a bad practice for passwords, because Liferay would have to store the password or key used for the encryption somewhere, leaving it in a format a hacker could steal and use to decrypt all of your passwords. So Liferay does not encrypt your passwords...

Instead, when you create your account and set your password, Password123, the algorithm processes it as a 1-way hash, meaning it goes from Password123 -> Fa94R8pB..., but it can never go from Fa94R8pB... -> Password123. The Fa94R8pB... value is then stored in the database as the "encrypted password" for your account.

So each time you log in, Liferay calculates the hash for the password you've provided and then that hash is compared against the value that is stored in the database.

Password123 becomes Fa94R8pB... and if that matches what is stored in the database, then login is successful. If it doesn't match, the login fails and you get to try again.
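The hash-and-compare flow can be sketched in a few lines of Java. This is purely illustrative - it uses plain SHA-256 via java.security.MessageDigest rather than Liferay's actual PasswordEncryptorUtil, and the class and method names are my own:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

public class HashCompareSketch {

    // One-way hash: we can go from password -> digest, but never back.
    static String hash(String password) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            byte[] raw = digest.digest(password.getBytes(StandardCharsets.UTF_8));
            return Base64.getEncoder().encodeToString(raw);
        }
        catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    // Login check: hash the supplied password and compare against the stored hash.
    static boolean login(String suppliedPassword, String storedHash) {
        return hash(suppliedPassword).equals(storedHash);
    }

    public static void main(String[] args) {
        String stored = hash("Password123"); // what would live in the User_ table
        System.out.println(login("Password123", stored)); // true
        System.out.println(login("Password124", stored)); // false
    }
}
```

The stored value is only ever compared hash-to-hash; the plaintext is never needed again after the hash is computed.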

This is important in case a hacker gets your database or, specifically, your User_ table. Since the hashes are all 1-way, they can't decrypt Fa94R8pB... to get your original password; however, they can try to use brute force to find a password that hashes to Fa94R8pB.... Depending upon the hashing algorithm being used in Liferay and whether your users are, in fact, using simple passwords like Password123, hardware systems these days can quickly identify some passwords and the hacker can then worm their way in.

Additionally, some hashing algorithms are weak in that two different inputs can hash to the same value. This is referred to as a "collision". MD5, for example, is considered broken precisely because researchers have demonstrated practical collisions using specially crafted inputs. As an illustration (the second input here is made up for readability; real MD5 collision pairs are long, crafted binary blobs):

Input 1: abcdefghijklmnopqrstuvwxyz
MD5 Output 1: c3fcd3d76192e4007dfb496cca67e13b

Input 2: (a different, specially crafted input)
MD5 Output 2: c3fcd3d76192e4007dfb496cca67e13b

So if my password was input 1, the MD5 hash in the database would be the c3fcd... value. Now, maybe my password is really complex so I feel safe, but a hacker who can brute force the algorithm can perhaps find another string (by happenstance or with underground tools) that collides with (generates the same hash as) the c3fcd... hash. Since Liferay is just comparing the calculated hash values, the hacker can enter input 2 and Liferay will treat that as a successful login even though the password is not the same.

So there has been a long-running battle with hackers and hash algorithm complexity. From a security perspective, you want an algorithm that generates a (hopefully) unique value that does not have many collisions and also one that takes time to protect against brute force attacks. As hardware improvements make it easier to find passwords or collisions using brute force, the hashing algorithms have had to add complexity to make them harder to attack.

The only problem for us is that, as the hashing algorithms get more complex, they typically get slower...

Liferay currently supports ten different hashing algorithms:

  • BCRYPT
  • MD2
  • MD5
  • NONE
  • PBKDF2WithHmacSHA1
  • SHA
  • SHA-256
  • SHA-384
  • SSHA
  • UFC-CRYPT

Some of these algorithms are configurable, but they each have different performance characteristics and complexity.

Oh, and the hashing algorithm used in the performance study I wrote of above? Those numbers are achieved using the hashing algorithm NONE - yeah, that's right, there's no hashing at all in order to hit those numbers. In their database, your Password123 is stored right there in the User_ table as Password123. So NONE clearly offers great performance, but if a hacker gets your User_ table, they have every account's password in the clear.

So, in my personal opinion, the login metrics from the performance study are worthless since they do not represent a real-world scenario. Even for an intranet-only implementation where the only users are direct employees, all it takes is one disgruntled employee who can read the User_ table and your passwords are exposed.

Liferay Default Encryption Algorithm Changes

So for those who don't know, Liferay's default encryption algorithm in 7.x is PBKDF2WithHmacSHA1, but even that has changed slightly over time.

You'll typically find it defined in the properties file as something like PBKDF2WithHmacSHA1/160/720000, where the first number after the slash is the key size and the second number is the number of rounds.
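Concretely, the value is just three slash-separated tokens. A trivial sketch of pulling them apart (my own helper, not Liferay's actual parsing code):

```java
public class AlgorithmSpecSketch {

    // Split "ALGORITHM/keySize/rounds" into its three parts.
    static String[] parse(String spec) {
        return spec.split("/");
    }

    public static void main(String[] args) {
        String[] parts = parse("PBKDF2WithHmacSHA1/160/720000");
        System.out.println(parts[0]); // algorithm: PBKDF2WithHmacSHA1
        System.out.println(parts[1]); // key size in bits: 160
        System.out.println(parts[2]); // rounds: 720000
    }
}
```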

In Liferay, the number of rounds has been increasing over time; it used to be 128,000, and at some point it got bumped to 720,000. Even now, Liferay is behind OWASP's current recommendation of 1,300,000 rounds (which will impact performance even further), and Liferay is preparing to update the default to match.

These changes are appropriate from a security standpoint, but they also significantly impact performance.
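To make the key size and rounds concrete, here's a minimal sketch of computing a PBKDF2WithHmacSHA1 hash through the standard JCE API - the same underlying algorithm Liferay's default encryptor wraps. The helper and its names are mine, and note that Liferay additionally encodes the key size, rounds, and salt into the stored value:

```java
import java.security.GeneralSecurityException;
import java.security.SecureRandom;

import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class Pbkdf2Sketch {

    // Derive a hash from a password with the given key size (bits) and rounds,
    // i.e. the two numbers in PBKDF2WithHmacSHA1/160/720000.
    static byte[] pbkdf2(String password, byte[] salt, int keySizeBits, int rounds) {
        try {
            PBEKeySpec spec = new PBEKeySpec(
                password.toCharArray(), salt, rounds, keySizeBits);
            SecretKeyFactory factory =
                SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
            return factory.generateSecret(spec).getEncoded();
        }
        catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        byte[] salt = new byte[8];
        new SecureRandom().nextBytes(salt);

        // A 160-bit key at 128,000 rounds yields a 20-byte derived key;
        // increasing the rounds increases the cost, not the output size.
        byte[] hash = pbkdf2("Password123", salt, 160, 128_000);
        System.out.println(hash.length); // 20
    }
}
```

Bumping the rounds parameter is the knob that makes each individual hash more expensive, which is exactly where the login-time cost comes from.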

Raw Performance Info

Before you think about changing the hashing algorithm, one of the two things you need to know is the performance impact of the algorithm. We don't really publish any stats on the supported algorithms, plus there's the issue that performance of the CPU-bound algorithms will depend upon the system you run on...

So to help get some performance info that you can use to test on your hardware, I've created a simple set of Gogo commands. You'll find them in the GitHub repository.

The basic syntax is to use hash:algorithm password hashes threads where algorithm comes from the table below, password is the password to hash, hashes is the number of hashes to complete and threads is the number of threads invoking the hashing algorithm.

Why is threads important? Well, consider if you have 100 people trying to log in at the same time, that's going to be 100 hashes that need to occur, but they'll all be on separate threads. So the implementation takes a number of hashes to complete and the number of threads to use to complete the hashes.
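The hashes-across-threads idea can be sketched with a plain ExecutorService. This is my own simplified harness using SHA-256, not the actual code in the repo:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class HashBenchSketch {

    // Complete `hashes` hash computations spread across `threads` worker
    // threads and return the wall-clock time in milliseconds.
    static long timeHashes(String password, int hashes, int threads) {
        AtomicInteger completed = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        long start = System.nanoTime();

        for (int i = 0; i < hashes; i++) {
            pool.submit(() -> {
                try {
                    MessageDigest md = MessageDigest.getInstance("SHA-256");
                    md.digest(password.getBytes(StandardCharsets.UTF_8));
                    completed.incrementAndGet();
                }
                catch (Exception e) {
                    // ignore failures in this sketch
                }
            });
        }

        pool.shutdown();
        try {
            pool.awaitTermination(5, TimeUnit.MINUTES);
        }
        catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }

        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("Time for " + completed.get() + " on " + threads
            + " threads: " + elapsedMs + " ms");
        return elapsedMs;
    }

    public static void main(String[] args) {
        timeHashes("Password123", 1000, 20);
    }
}
```

Swap the SHA-256 digest for whatever encryptor you care about and the same harness shows how the per-hash cost scales with concurrency.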

Take care when you run this because it will churn your system quite a bit. The first time, I tried 10,000 hashes and 200 threads, and my system was tied up for a long, long time.

Here's the complete table of commands:

  • NONE - hash:none password hashes threads - Does no hashing at all; it shows how the performance study paper gets away with its numbers.
  • BCRYPT - hash:bcrypt password rounds hashes threads - Rounds here is the number of internal rounds the hash algorithm will use; passing 10 for the rounds is akin to using BCRYPT/10.
  • MD2 - hash:md2 password hashes threads - Uses the MD2 hash.
  • MD5 - hash:md5 password hashes threads - Uses the MD5 hash.
  • PBKDF2 - hash:pbkdf2 password keySize rounds hashes threads - Uses PBKDF2WithHmacSHA1, the default for Liferay. Specify the keySize to use (Liferay uses 160) and the number of rounds (Liferay currently recommends 720,000 but soon will increase to 1,300,000). A key size of 160 and 128,000 rounds is the equivalent of PBKDF2WithHmacSHA1/160/128000.
  • PBKDF2 - hash:simplepbkdf2 password hashes threads - A manufactured method; it does three runs using PBKDF2WithHmacSHA1/160/128000, PBKDF2WithHmacSHA1/160/720000, and PBKDF2WithHmacSHA1/160/1300000 so you can compare the changes.
  • SHA - hash:sha password hashes threads - Uses the SHA hash.
  • SHA-256 - hash:sha256 password hashes threads - Uses the SHA-256 hash.
  • SHA-384 - hash:sha384 password hashes threads - Uses the SHA-384 hash.
  • SSHA - hash:ssha password hashes threads - Uses the SSHA hash.
  • UFC-CRYPT - hash:ufccrypt password hashes threads - Uses the UFC-Crypt hash.
  • All - hash:all password hashes threads - Another manufactured method; it runs all of the hash methods using the given password, hashes and threads so you can compare them all.

Now, the importance of the numbers you get here will really depend upon the hardware you're hosting Liferay on.

On my Intel-based iMac, 3.8 GHz 8-Core Intel Core i7, using a password of Password123 and 1,000 hashes and 20 threads:

g! hash:all Password123 1000 20
  Hashed Password: {NONE}Password123
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {BCRYPT}$2a$10$6nBoaq5HOcUv.zMK0x4C0ueDXYMEPK283.LJ3OomZ9twD4I
  Time for 1000 on 20 threads: 4742 ms (4 seconds)
  Average: 4 ms per hash
  Hashed Password: {MD2}J1+DYe35St5RH5jOIm/ExA==
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {MD5}QvdJref54ZW/R183pEyvyw==
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {PBKDF2WithHmacSHA1}AAAAoAAB9ADxgcXJ8tEfLprl6wTPvGvV5tpm+HNgNA
  Time for 1000 on 20 threads: 13149 ms (13 seconds)
  Average: 13 ms per hash
  Hashed Password: {PBKDF2WithHmacSHA1}AAAAoAAK/IDsOH/sUCqN+1EIVvNCQo/4Kp2OwIUXHs
  Time for 1000 on 20 threads: 73623 ms (1 minute 13 seconds)
  Average: 73 ms per hash
  Hashed Password: {PBKDF2WithHmacSHA1}AAAAoAAT1iAdLV/18DqjH0SwjlnbGduXOZKRcYKiWL
  Time for 1000 on 20 threads: 137559 ms (2 minutes 17 seconds)
  Average: 137 ms per hash
  Hashed Password: {SHA}sumK1vbrhQjdahTPpwS61/Bfb7E=
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {SHA-256}AIxwOS46v70PpHu8LtlqqZvUnhWXJ/y6Dy5qvrOp1gE=
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {SHA-384}abrlqxaeAO0w0d2YOoy1zt+bVa9HeVMGLDMcEgIN4m4XKRoD3zokw
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {SSHA}w+MLnmrWdwu/IQ/1kf3ZNLhUKUtusKtwrdp20Q==
  Time for 1000 on 20 threads: 2 ms
  Throughput: 500 per ms
  Hashed Password: {UFC-CRYPT}umEUxHoYAQmHI
  Time for 1000 on 20 threads: 29 ms
  Throughput: 34 per ms

Now when you look at this listing, it becomes easy to see that algorithm choice absolutely impacts your performance characteristics. Many of the insecure hashes handle the 1,000 hashes in a few milliseconds, while the more secure ones have a significant impact.

Let's focus just on the Liferay default, the PBKDF2WithHmacSHA1 results:

  Hashed Password: {PBKDF2WithHmacSHA1}AAAAoAAB9ADxgcXJ8tEfLprl6wTPvGvV5tpm+HNgNA
  Time for 1000 on 20 threads: 13149 ms (13 seconds)
  Average: 13 ms per hash
  Hashed Password: {PBKDF2WithHmacSHA1}AAAAoAAK/IDsOH/sUCqN+1EIVvNCQo/4Kp2OwIUXHs
  Time for 1000 on 20 threads: 73623 ms (1 minute 13 seconds)
  Average: 73 ms per hash
  Hashed Password: {PBKDF2WithHmacSHA1}AAAAoAAT1iAdLV/18DqjH0SwjlnbGduXOZKRcYKiWL
  Time for 1000 on 20 threads: 137559 ms (2 minutes 17 seconds)
  Average: 137 ms per hash

Here we can see the impact of adding additional rounds during the hash calculation. 128,000 rounds averages 13 ms per hash; 720,000 rounds is roughly 5.6x slower, clocking in at 73 ms per hash; and 1,300,000 rounds is roughly 10.5x slower than 128,000 (and nearly twice as slow as 720,000), averaging 137 ms per hash.

So if there are 1,000 users waiting to log in and you only have 20 threads available, they are all logged in in 13 seconds at 128k rounds, 1 minute and 13 seconds at 720k, and 2 minutes 17 seconds at 1.3M rounds.

So you might be asking yourself why the heck would you want to increase the rounds to 1,300,000 since every login will take significantly longer to complete?

The Other Shoe

Remember when I shared one of the two things you needed to know before changing your hashing algorithm? The first one was the performance characteristics of the algorithms.

The second thing you need to know is the risk. All of those really low time, low intensity algorithms shown above? They are all basically no different than choosing NONE for your passwords.encryption.algorithm (at least from a security perspective).

Think about it: if a hacker has a list of the most frequently used passwords, can calculate a hash in less than a millisecond, and has a hash value to compare against, it is only a matter of time before they figure out a password (or collision) and are able to get into your system.

Even with a moderately complex password, a hacker could build a program that just starts generating sequences like a, b, c, ... A, B, C, ... 0, 1, 2, ... aa, ab, ac, ... Aa, Ab, Ac, ... and, with millisecond hash calculation, even fairly complex passwords will fall on current hardware. And this assumes the hacker only has brute force available and not some underground info about how to attack the hash more cleverly.

Just from the example times above, you can see how I could hash 1,000 passwords in about 2 ms in most cases; this translates into 500,000 hashes each second, 30 million hashes every minute, 1.8 billion hashes every hour, and over 300 billion passwords in a week... At that rate, I think you can see just how quickly it would be possible to derive the hashes necessary to get access to at least one account in your environment, if not more... And these numbers were based off of my system running with only 20 threads; an AWS compute system would likely blow these numbers away (although you'll pay for that privilege)...
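A quick sanity check of that arithmetic, assuming the roughly 500 hashes per millisecond measured in the timings above:

```java
public class BruteForceRateSketch {

    // 1,000 hashes in ~2 ms, taken from the hash:all timings above
    static final long PER_MS = 1000 / 2;            // ~500 per millisecond
    static final long PER_SECOND = PER_MS * 1000;   // 500,000 per second
    static final long PER_MINUTE = PER_SECOND * 60; // 30 million per minute
    static final long PER_HOUR = PER_MINUTE * 60;   // 1.8 billion per hour
    static final long PER_WEEK = PER_HOUR * 24 * 7; // ~302 billion per week

    public static void main(String[] args) {
        System.out.println(PER_WEEK); // 302400000000
    }
}
```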

So ultimately the risk comes down to this:

1. If the hash value is somehow leaked, i.e. a developer puts it in a comment in the code, or someone forgets to protect your cloud database, or a hacker gets in and dumps your User_ table, how hard is it going to be for the hacker to figure out the original password?

2. If the hacker can figure out the password, what is the potential risk? I ask this because, if I'm just hosting a site of recipes, a discovered password is certainly going to give me a lot of headaches, but that is quite different from running an online crypto wallet site where users hold their crypto and a discovered password could result in the loss of significant money.

Most organizations are going to fall somewhere between hosting recipes and online crypto wallets. Where your organization lands on this scale will determine what risk you have and, if your risk is on the high end, regardless of the performance characteristics, you're going to want to consider a more complex hash algorithm.

One word of caution: don't underestimate your risk... Remember that most people inherently reuse passwords. So even if you are just a recipe hosting site, a hacker knows that a certain percentage of your registered users will have the same credentials on other, more critical sites. For that reason, your site is the perfect one to attack if security is low; it's the perfect stepping stone to other, bigger fish.

Complexity != Slow

Actually Olaf and I got into an argument about this...

He was arguing that as you add rounds (complexity) to a hashing algorithm, there will of course be an additional penalty that you have no choice but to pay. There's no exception, you have to pay the bill as it comes due.

I, however, was looking at it differently: adding complexity (rounds) does not necessarily have to translate into being slow. I knew there were alternative implementations out there, including Bouncy Castle, and that's really what I wanted to know - was there an implementation that offered the same protection (the same rounds) but was better optimized for the hash calculation?

So yeah, I built one (also included in the GitHub repo). It's called BCPBKDF2WithHmacSHA1 (unique, yeah? I just prefixed with BC to indicate it was BouncyCastle...); it is an implementation of the PasswordEncryptor interface that leverages Bouncy Castle's implementation of the PBKDF2WithHmacSHA1 algorithm.

So how did it perform? Check it for yourself:

g! hash:simplebcpbkdf2 Password123 1000 20
  Hashed Password: {BCPBKDF2WithHmacSHA1}AAAAoAAB9AAgLw0eOfAPvMhA8ZEpy8qEQSr++/kaB
  Time for 1000 on 20 threads: 8245 ms (8 seconds)
  Average: 8 ms per hash
  Hashed Password: {BCPBKDF2WithHmacSHA1}AAAAoAAK/IDmPGuMG/dbK9c72gb1KRChlw+MBc5RJl
  Time for 1000 on 20 threads: 46757 ms (46 seconds)
  Average: 46 ms per hash
  Hashed Password: {BCPBKDF2WithHmacSHA1}AAAAoAAT1iDyKFUKOnsrWt/nxusNnOmd0hSczw6l
  Time for 1000 on 20 threads: 85824 ms (1 minute 25 seconds)
  Average: 85 ms per hash

Using Bouncy Castle's implementation, I saw about a 37-38% reduction at all three round counts: 128,000, 720,000, and 1,300,000.

Take note of the new hash command, hash:simplebcpbkdf2 password hashes threads; there's also hash:bcpbkdf2 password keySize rounds hashes threads for individual testing.

Using This Was Another Story...

Boy, you can't believe how excited I was to try out this new implementation...

Since it was based upon Liferay's PBKDF2PasswordEncryptor, it already supported the arguments for key size and rounds, and I registered my implementation using the type BCPBKDF2 (Liferay uses the type PBKDF2, so again I was copying them).

With my component built and deployed, it was time to update my hash gogo command.

The key part of leveraging all of these different Liferay algorithms was simply to call PasswordEncryptorUtil.encrypt(algorithm, password, (String) null); I could do this, for example like PasswordEncryptorUtil.encrypt("PBKDF2WithHmacSHA1/160/720000", "Password123", (String) null); and it would magically work.

So I basically changed it to use PasswordEncryptorUtil.encrypt("BCPBKDF2WithHmacSHA1/160/720000", "Password123", (String) null); and darn it, I just got NPEs. Why? Because Liferay couldn't find my password encryptor.

Tracing through the stack trace, I found that Liferay's CompositePasswordEncryptor class has a _select(String algorithm) method to select the password encryptor based on the algorithm passed in. And wouldn't you know it, there is special code in that method that says if the algorithm STARTS WITH "PBKDF2", regardless of what follows, it is going to use the password encryptor with the type "PBKDF2". The other algorithms had to be an exact match on the type the password encryptor component was assigned.

Cool, so I changed my line to PasswordEncryptorUtil.encrypt("BCPBKDF2/160/720000", "Password123", (String) null); since my type is "BCPBKDF2", and expected it to work, but no, it failed. Remember the "exact match" thing above? That meant I could not use parameterized password encryptors, so I couldn't allow for different key sizes or rounds...

So I opened a feature request ticket asking that CompositePasswordEncryptor ignore the arguments when looking for the PasswordEncryptor to use. I have the code changed and a PR ready to submit to Liferay, but in the meantime I still wanted to get it working for the repo, so, just as I've done before, I created an override module so I could use a custom CompositePasswordEncryptor that ignored the parameters. This module is not included in the GitHub repo because I ended up creating the Alternative Usage in the next section, which was a better implementation.

And so, once this was all in place, my hash gogo command started working and I could gather the performance numbers that I shared above.

I didn't know, when I was going through all of this, whether it was going to be worth it - whether Bouncy Castle would prove the better implementation or not. Thankfully, I found almost a 40% improvement using Bouncy Castle, so it ended up being well worth the effort.

Alternative Usage

So, when I got through all of this, I had it all working, but boy, was it a kludge using the marketplace override approach.

I suspected, but was not 100% sure, that the JCE implementation of PBKDF2WithHmacSHA1 was computing the exact same value as the Bouncy Castle implementation, and I needed to prove it.

So I added a new command, hash:verifyPbkdf2 password that would compute the hash using both algorithms and verify they are equal. To do this, I basically have a bunch of lines like:

// hash using Liferay's JCE-based PBKDF2 encryptor
String pbkdf2Hash = PasswordEncryptorUtil.encrypt(
  PasswordEncryptor.TYPE_PBKDF2 + "WithHmacSHA1/160/128000", password, (String) null);

// hash using the Bouncy Castle encryptor; passing the first hash as the
// current encrypted password makes it reuse the same key size, rounds and salt
String bcHash = PasswordEncryptorUtil.encrypt(
  "BCPBKDF2WithHmacSHA1/160/128000", password, pbkdf2Hash);

boolean same = pbkdf2Hash.equals(bcHash);

Since I pass the encrypted hash from the first call as the last argument on the second call, I'm providing the Bouncy Castle implementation with the same key size, rounds and salt, so it basically has no choice but to generate the same hash result.
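The reason this works is that PBKDF2 is fully deterministic once the password, salt, key size, and rounds are fixed. A quick standalone illustration using just the JCE provider (both hashes here come from the same implementation; the point is only that identical inputs force identical outputs - the helper names are mine):

```java
import java.security.GeneralSecurityException;
import java.util.Arrays;

import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class SameSaltSketch {

    // Derive a 160-bit PBKDF2WithHmacSHA1 key with a fixed salt and rounds.
    static byte[] derive(String password, byte[] salt, int rounds) {
        try {
            SecretKeyFactory factory =
                SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
            return factory.generateSecret(
                new PBEKeySpec(password.toCharArray(), salt, rounds, 160))
                .getEncoded();
        }
        catch (GeneralSecurityException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        byte[] salt = {1, 2, 3, 4, 5, 6, 7, 8};

        // Same password + same salt + same rounds -> identical derived key,
        // no matter which conforming PBKDF2 implementation computes it.
        byte[] first = derive("Password123", salt, 128_000);
        byte[] second = derive("Password123", salt, 128_000);
        System.out.println(Arrays.equals(first, second)); // true
    }
}
```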

The new hash:verifyPbkdf2 command checks all three common round counts: 128,000, 720,000 and 1,300,000.

After proving that Bouncy Castle generates the same hashes with better performance, I could actually register a BC implementation using Liferay's PBKDF2 type with a higher service ranking. I called this component class BouncyCastleReplacementPBKDF2PasswordEncryptor. I suspect, though, that even with the higher service ranking, Liferay's implementation could still be picked up. If you do want to use this component to replace Liferay's implementation, I strongly suggest that you blocklist Liferay's component. That should prevent Liferay's from starting, leaving only the Bouncy Castle implementation for the system to use.

Choosing the Right Algorithm

I was having a little fun with ChatGPT and asked it to author a section explaining how to choose the hashing algorithm with the right balance of performance and security. This is what it came up with...

When choosing a hashing algorithm for your implementation, it is important to consider both performance and security. On the one hand, you want the algorithm to be fast enough to handle large volumes of logins and meet the performance requirements of your site. On the other hand, you also want the algorithm to be secure enough to protect user passwords from unauthorized access and tampering.

One way to evaluate the performance of a hashing algorithm is to consider its hash length, computational complexity, and the speed of the hash computation. Algorithms with a shorter hash length, such as MD5, are generally faster but also less secure, as they have a higher risk of hash collisions and are more susceptible to attacks such as birthday attacks. In contrast, algorithms with a longer hash length, such as SHA-256, SHA-384 and PBKDF2, offer better security but may be slower in terms of performance.

Another factor to consider when choosing a hashing algorithm is its resistance to attacks. For example, algorithms that are considered "broken" or "deprecated" should generally be avoided, as they may be vulnerable to attacks such as collisions, preimage attacks, or differential attacks. On the other hand, newer and more secure algorithms, such as SHA-384 and PBKDF2, have been designed to be more resistant to these types of attacks and may offer better security for your implementation.

Ultimately, the right balance between performance and security will depend on the specific requirements of your implementation and the importance of the passwords. To evaluate these criteria, you should consider factors such as the volume of expected logins (passwords) to be hashed, the risk associated with the exposure of account password hashes, the security requirements of your site, and any legal or regulatory compliance requirements. Based on these factors, you can choose an algorithm that offers the right balance of performance vs security for your implementation.

How to Change Algorithms

Okay, so let's say you've decided that you want to change the algorithm. Liferay's latest bundle has switched to PBKDF2WithHmacSHA1/160/1300000, and you've decided that your site is going to be fine at PBKDF2WithHmacSHA1/160/128000. But since users have already been logging in, their passwords have already been hashed and, since the hash is 1-way only, you can't redo the hashes yourself. So what are your options?

Well, unfortunately, I have bad news for you - changing the algorithm will have no impact on current users, even if they change their password on their own. Their new password will stick with the same algorithm, key size and rounds as their previous password rather than switching to the new default.

Now, you can force the issue if you want. You can set the password reset flags on the users and clear the encrypted passwords (you must clear the passwords out, otherwise Liferay will still use the key size and rounds from the current password even though you have changed the default). This way, users will be forced to enter new passwords, but this is likely going to be problematic for you; it is not a great UX and sometimes not even an option, depending upon your business. But it is truly the only way to know that you're not leaving a password around with the old hash algorithm.

One option I'd encourage you to take is to go add your vote to "As a System Administrator, I want to trigger a password hashing migration process" (LPS-115867). This ticket has unfortunately been delayed many times, but a show of support (via votes) may be enough to get it moving again.

And of course, there's always a customization that can get the job done. I haven't tried one yet, but if I do, I'll add it to the GitHub repo.


Wow, what a wild ride, wasn't it?

Hopefully you've learned a lot about how Liferay handles passwords, how the choice of hashing algorithm can definitely impact your performance, how you can change the default, and mostly the things you need to know in order to pick the right algorithm for your implementation.

You're probably going to want to check out the GitHub repo, especially if you're interested in using the faster Bouncy Castle PBKDF2 hashing to minimize the impact of the hashing rounds changes.

Be aware, though, that predictions for quantum computers have them arriving anywhere between 10 and 20 years from now and, when that happens, it won't matter how complex your current algorithm is or how many rounds you use. The prediction is that quantum computers will be able to brute force these calculations such that none of our current or near-future hashing algorithms will stand against them. Not that this means we shouldn't be trying to secure our data; I'm just suggesting that in the long run things will keep changing and the war with hackers will continue well into the future.

Anyhow, let me know how the code in the repo works out for you!


So I've been sharing my results with the Liferay security team. They've asked me to submit a PR for switching to Bouncy Castle on ticket LPS-175308... So hopefully, some time in the future, we'll get the Bouncy Castle version into core. Even when that happens, it won't invalidate anything in the repo; that code will still work alongside Liferay's class.


"The only problem for us, as the hashing algorithms get more complex, they typically always get slower..."

That's actually the idea. A hashing algorithm SHOULD be expensive in terms of computing time and resources. It must be costly for an attacker to do a brute force attack. So, while this has the downside of increasing the login time slightly, this is a feature, not a bug.

Of course, if Bouncy Castle is indeed faster, there is no reason not to use it. That's a nice find there! I mean, an attacker will use the fastest implementation of a given algorithm anyway.

OpenSSL apparently has a super-fast implementation of PBKDF2WithHmacSHA1, but to get there you'd need some JNI skills and know how to statically link in the necessary libraries. I've started trying to implement it, but it's been years since I've done JNI, so it's been a slow process...

However, that's the crux of my argument; just because it is getting more complex, that doesn't mean we shouldn't be looking for better/optimized implementations which can complete the same amount of work in a shorter time period.