A Python-based robots.txt audit tool
Parsero is a free script written in Python that reads a web server's robots.txt file and examines its Disallow entries. Disallow entries tell search engines which directories or files hosted on the server must not be indexed. For example, "Disallow: /portal/login" means that the content at www.example.com/portal/login may not be indexed by crawlers such as Google, Bing, or Yahoo. This is how an administrator keeps sensitive or private information out of the search engines.
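To make the mechanics concrete, the sketch below fetches a robots.txt file, extracts its Disallow paths, and requests each one to see what HTTP status comes back. This is a minimal illustration in the spirit of Parsero, not its actual code; the `BASE_URL` value and helper names are hypothetical.

```python
# Minimal robots.txt audit sketch -- not Parsero's actual implementation.
import urllib.error
import urllib.parse
import urllib.request

BASE_URL = "http://www.example.com"  # hypothetical target

def disallowed_paths(base_url):
    """Fetch robots.txt and yield the path of every Disallow entry."""
    robots_url = urllib.parse.urljoin(base_url, "/robots.txt")
    with urllib.request.urlopen(robots_url) as resp:
        for raw in resp.read().decode("utf-8", errors="replace").splitlines():
            line = raw.split("#", 1)[0].strip()        # drop comments
            if line.lower().startswith("disallow:"):
                path = line.split(":", 1)[1].strip()
                if path:                               # an empty Disallow allows everything
                    yield path

def audit(base_url):
    """Request each disallowed path and print the HTTP status it returns."""
    for path in disallowed_paths(base_url):
        url = urllib.parse.urljoin(base_url, path)
        try:
            with urllib.request.urlopen(url) as resp:
                status = resp.status
        except urllib.error.HTTPError as err:          # 4xx/5xx responses still carry a code
            status = err.code
        print(f"{status}  {url}")

if __name__ == "__main__":
    audit(BASE_URL)
```

Entries that come back 200 OK are the interesting ones: the administrator asked crawlers not to index them, yet they are still publicly reachable.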
| Release | Stable | Testing |
|---|---|---|
| Fedora Rawhide | 0.81-31.fc41 | - |
| Fedora 41 | 0.81-31.fc41 | - |
| Fedora 40 | 0.81-29.fc40 | - |
| Fedora 39 | 0.81-27.fc39 | - |
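On the releases above, the packaged tool should be installable with Fedora's standard package manager, e.g. `sudo dnf install parsero` (assuming the package name matches the one in the maintainer address below).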
You can contact the maintainers of this package via email at parsero-maintainers@fedoraproject.org.