
Roboto: Parse and use robots.txt files


Roboto provides a type-safe way to parse and use robots.txt files. It is based on the Robots Exclusion Protocol, which websites use to advise web crawlers and other robots about which paths they may visit.

Installation

Add this to your Cargo.toml:

[dependencies]
roboto = "0.1"

Usage

use roboto::Robots;

// Parse a robots.txt document into a typed Robots value.
let robots = r#"
User-agent: *
Disallow: /private
Disallow: /tmp
"#.parse::<Robots>().unwrap();

let user_agent = "googlebot".parse().unwrap();

// "/public" matches no Disallow rule, so access is allowed.
assert_eq!(robots.is_allowed(&user_agent, "/public"), true);
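Continuing from the snippet above, here is a sketch of the negative case, assuming the crate follows the Robots Exclusion Protocol's prefix-matching semantics (so any path under a disallowed prefix is blocked):

// Paths under a Disallow prefix should be rejected for matching user agents.
assert_eq!(robots.is_allowed(&user_agent, "/private/data"), false);
assert_eq!(robots.is_allowed(&user_agent, "/tmp"), false);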
