I was looking over the 1.7 changelog and noticed harden_verify (and the deprecated min_verify_time), and I would like to implore you to reconsider these options altogether. They do not appear to meaningfully increase the security of an application.
I get that the idea: if some of your users have a weak hash (say sha1) and some have a strong hash (say argon2), this lets you hide which users are still on the weak hash. However, I don't think hiding that is actually all that useful. A slow password hash exists to protect against offline attacks by people who have obtained a copy of the database, and those people can already tell who has a strong or a weak hash simply by looking at the data they possess.
So then, this only protects against people who can submit password attempts but can't otherwise access the underlying data, presumably through something like an online login form. My guess is the goal here is to stop an attacker from reasoning "I'll focus on the users with a sha1 hash because I can submit more guesses against those, faster". However, I think a better way to defend against this is to recommend that people put rate limiting on their authentication (by IP, by user, or whatever makes sense for them). That provides much stronger protection, not just for users with weak hashes, but also for users with weak passwords in general.
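To make the rate-limiting suggestion concrete, here is a minimal sketch of a per-key (IP or username) sliding-window limiter. All names here are illustrative, not part of any library's API:

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Allow at most max_attempts per key within a rolling window of seconds.

    A hypothetical sketch: real deployments would likely use shared storage
    (e.g. a cache server) rather than in-process memory.
    """

    def __init__(self, max_attempts=5, window=60.0):
        self.max_attempts = max_attempts
        self.window = window
        self.attempts = defaultdict(deque)  # key -> timestamps of recent attempts

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        q = self.attempts[key]
        # Discard attempts that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_attempts:
            return False  # too many recent attempts for this key
        q.append(now)
        return True
```

A login handler would call `allow(ip)` (and/or `allow(username)`) before attempting verification, which throttles online guessing regardless of which hash scheme the target account uses.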
All in all, in the best case all this hides is whether person A has a strong hash or not, and that information does not appear to be overly useful. The only other data leakage I can think of is that it possibly lets you figure out when a user last logged in, but even that doesn't seem very useful to me. dummy_verify is maybe useful, but I feel it would be better implemented by simply hashing a random (or empty-string) password with the default hasher and throwing the result away before returning.
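The dummy-hash approach I'm suggesting could look something like the following sketch. It uses the stdlib's PBKDF2 as a stand-in for whatever the context's default hasher is; the function names and record layout are my own, not passlib's:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor

def _hash(password: str, salt: bytes) -> bytes:
    # Stand-in for the "default hasher"; a real setup would delegate
    # to the configured scheme (e.g. argon2).
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)

def make_record(password: str):
    salt = os.urandom(16)
    return salt, _hash(password, salt)

def verify_user(record, candidate: str) -> bool:
    if record is None:
        # Unknown user: hash a throwaway password at full cost and discard
        # the result, so the timing resembles a real verification.
        _hash("", os.urandom(16))
        return False
    salt, digest = record
    return hmac.compare_digest(_hash(candidate, salt), digest)
```

The point is that the dummy path burns roughly the same CPU time as a real check without needing a separate min_verify_time-style knob.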