"Crawling" is inaccurate here, since the link blocked may be a direct download link. In that context, the intent is to say, "I found a URL that would satisfy your request, but I will not be downloading it." -- the purpose being to inform the user that, in the event the requirement isn't met, that it may be because their --allow-host blocked relevant links.
Downgrading the warning to debug level defeats this purpose, but perhaps we could arrange to also notify the user, no more than once per package, that "Some links for this package are blocked by your --allow-hosts setting; you may not be able to install all available versions." Or, for that matter, a more thorough refactoring could actually add the links to the index but skip them when selecting one to download, so that it says "Skipping http://whatever/foo.tar.gz due to --allow-hosts" as it skips over them.
In that approach, it would make the most sense to separate crawling links from download candidates. Under this more sophisticated scheme, no warning would be issued for a URL unless it is something that would have been installed had it not been for --allow-hosts, and the warnings for URLs crawled only to find additional links could be reduced to at most one per package, or downgraded to info level.
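To illustrate, a minimal sketch of that scheme might look like the following. This is not actual setuptools code; the class and method names (`AllowHostsFilter`, `filter_candidates`) are hypothetical, and it assumes candidates arrive as plain URL strings grouped by package:

```python
# Hypothetical sketch: skip blocked download candidates with a per-URL
# message, and warn at most once per package. Not a real setuptools API.
import logging
from urllib.parse import urlparse

log = logging.getLogger("package_index")

class AllowHostsFilter:
    def __init__(self, allow_hosts):
        self.allow_hosts = set(allow_hosts)
        self._warned_packages = set()  # packages already warned about

    def is_allowed(self, url):
        return urlparse(url).hostname in self.allow_hosts

    def filter_candidates(self, package, urls):
        """Yield installable candidates, logging each skipped URL and
        emitting the once-per-package summary warning."""
        for url in urls:
            if self.is_allowed(url):
                yield url
                continue
            log.warning("Skipping %s due to --allow-hosts", url)
            if package not in self._warned_packages:
                self._warned_packages.add(package)
                log.warning(
                    "Some links for %s are blocked by your --allow-hosts "
                    "setting; you may not be able to install all "
                    "available versions", package)
```

The key point of the design is that the filter operates only on download candidates, so crawling-only URLs never trigger a warning at all.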
I believe we want to keep the message in some form, so merging this as a temporary fix isn't really appropriate, as pje mentioned. Rather than burden setuptools with temporary fixes, let's come up with a patch for a proper fix.
Also, you may or may not be aware that PyPI has made some changes, most of which won't take effect for a few weeks, but which, if I understand correctly, may greatly reduce the number of links inadvertently included in the download search. If so, the annoyance for users should be reduced as well.