This week I ran into an issue with a robots.txt file. To be honest, I can't remember the last time I edited one. On most of the sites I work on, we want all of the content crawlable, so we don't have anything advanced set up. In this case, though, I needed to see which version of the file Google had cached, and whether one of my URLs was crawlable. I don't think this tool is linked from Search Console anymore, but it can still help you debug and run tests, especially if you are editing a robots.txt file.
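If you want a quick local sanity check before (or alongside) a tool like this, Python's standard library can parse robots.txt rules and tell you whether a given user agent may fetch a URL. Here's a minimal sketch; the robots.txt content and the example.com URLs are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Googlebot may fetch specific paths
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

You can also point the parser at a live file with `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`. Keep in mind this only checks the rules as written; it won't show you what Google has cached, which is why the tester is still handy.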