0x51-dev/robots
Robots

This is a simple Go library for parsing and handling robots.txt files. The library allows developers to check whether a given user-agent is allowed or disallowed from accessing specific URLs on a site.
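As a reminder of the file format the library parses, here is an illustrative robots.txt (hypothetical content, not taken from this repository) showing groups of rules per user-agent, comments, and a Sitemap directive:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/help

# A specific crawler, blocked entirely
User-agent: examplebot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```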

Features

  • Full compliance with RFC 9309, the current Robots Exclusion Protocol specification.
  • Parsing robots.txt files to extract rules for individual user-agents.
  • Determining whether a URL is allowed or disallowed based on the rules in robots.txt.
  • Handling of the User-agent, Allow, Disallow, and Sitemap directives, among others.
  • Support for comments, case-insensitivity, and empty lines, as per the specification.
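To illustrate the allow/disallow decision the library makes, here is a minimal, self-contained Go sketch of the RFC 9309 longest-match rule. This is not this library's API (its exported names are not documented here); it only demonstrates the matching logic the specification requires, simplified to plain path prefixes without the `*` and `$` wildcards:

```go
package main

import (
	"fmt"
	"strings"
)

// rule is one Allow or Disallow line from a robots.txt group.
type rule struct {
	allow bool
	path  string
}

// allowed applies the RFC 9309 longest-match rule: among all rules whose
// path is a prefix of the requested path, the longest one wins; on a tie,
// Allow wins. A path matching no rule is allowed by default. (Real
// implementations also handle the '*' and '$' wildcards, omitted here.)
func allowed(rules []rule, path string) bool {
	best, bestLen := true, -1
	for _, r := range rules {
		if !strings.HasPrefix(path, r.path) {
			continue
		}
		if n := len(r.path); n > bestLen || (n == bestLen && r.allow) {
			best, bestLen = r.allow, n
		}
	}
	return best
}

func main() {
	rules := []rule{
		{allow: false, path: "/private/"},
		{allow: true, path: "/private/public/"},
	}
	fmt.Println(allowed(rules, "/private/secret.html"))   // false
	fmt.Println(allowed(rules, "/private/public/a.html")) // true: longer Allow wins
	fmt.Println(allowed(rules, "/index.html"))            // true: no rule matches
}
```

Note the tie-breaking choice: when an Allow and a Disallow rule match with equal length, RFC 9309 says the crawler should use the Allow rule, which is why the comparison treats an equal-length Allow as a better match.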

Installation

To add the library to your Go module, run:

go get github.com/0x51-dev/robots

(`go get` adds a library dependency; `go install` is intended for installing executables.)

References

  • RFC 9309: Robots Exclusion Protocol
