Function tokenize

pub fn tokenize(
    input: &str,
    frontmatter_allowed: FrontmatterAllowed,
) -> impl Iterator<Item = Token>

Creates an iterator that produces tokens from the input string.

When parsing a full Rust document, first call strip_shebang to skip any leading shebang line, then allow frontmatter with FrontmatterAllowed::Yes.

When tokenizing a slice taken from the middle of a document, be sure to disallow frontmatter with FrontmatterAllowed::No, since frontmatter is only valid at the very start of a file.
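The recommended flow for a full document might look like the sketch below. It assumes the rustc_lexer crate is available as a dependency (it is an internal compiler crate, so the exact API surface can shift between versions); the helper name lex_document is hypothetical and not part of the library.

```rust
// Sketch, assuming the rustc_lexer crate; lex_document is a made-up helper.
use rustc_lexer::{strip_shebang, tokenize, FrontmatterAllowed};

fn lex_document(input: &str) {
    // strip_shebang returns the length of a leading `#!...` line, if any;
    // skip past it before lexing the rest of the document.
    let start = strip_shebang(input).unwrap_or(0);

    // Lexing from the top of a file, so frontmatter is allowed.
    for token in tokenize(&input[start..], FrontmatterAllowed::Yes) {
        println!("{:?} ({} bytes)", token.kind, token.len);
    }
}
```

For a slice from the middle of a document, the same loop would pass FrontmatterAllowed::No instead, so that a stray `---` is not misread as frontmatter.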