I created an automated service to find crawler IPs and ban them. I did this for fun (parsing a stream of requests, detecting malicious behavior, and blocking the offenders through a firewall API was a challenging task).
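For anyone curious, the basic idea looks roughly like this. It's a minimal sketch, not my actual service: it tails an access log, counts requests per IP over a sliding window, and bans IPs that exceed a threshold by calling out to iptables. The log path, window size, and threshold are placeholder assumptions; a real setup would hit the firewall's API and persist the ban list.

```python
import subprocess
import time
from collections import defaultdict, deque

LOG_PATH = "/var/log/nginx/access.log"  # assumed log location
WINDOW_SECONDS = 60                      # sliding window length (assumption)
MAX_REQUESTS = 300                       # requests per window before a ban (assumption)

hits = defaultdict(deque)  # ip -> timestamps of recent requests
banned = set()

def ban(ip):
    # Drop further traffic from this IP. A real service would call the
    # firewall's API instead of shelling out, and persist the ban list.
    subprocess.run(["iptables", "-I", "INPUT", "-s", ip, "-j", "DROP"], check=True)
    banned.add(ip)

def follow(path):
    # tail -f the access log, yielding new lines as they arrive
    with open(path) as f:
        f.seek(0, 2)  # start at end of file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.2)
                continue
            yield line

for line in follow(LOG_PATH):
    ip = line.split(" ", 1)[0]  # common log format: client IP is the first field
    if ip in banned:
        continue
    now = time.time()
    window = hits[ip]
    window.append(now)
    # evict timestamps that have fallen out of the sliding window
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) > MAX_REQUESTS:
        ban(ip)
```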
Not only did this service not stop her, she is trying harder: her request rate has tripled today because she is using more IPs (1.5k of her IPs were banned today).
What do you think I should do: let her crawl, or keep chasing this rabbit hole?
Thanks
Personally, I don't see a reason to block non-abusive crawlers from a site. Sites are there to be found and read. Indexers, archivers, etc. are normal things that may provide a non-obvious benefit.
This seems to be a judgement call on whether you believe the actions are nefarious or not.