A simple string tokenizer which takes a string and splits it on
whitespace. It can also take an optional string of characters to use as
delimiters, which are returned with the token set. This allows the
string to be split in many different ways.
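
A rough sketch of typical use, based on the module's documented
synopsis (the constructor arguments and the getTokens() method name
should be double-checked against the module's POD):

    use String::Tokenizer;

    # split on whitespace, and also treat '(', ')', '+' and '*' as
    # delimiters that come back as tokens in their own right
    my $tokenizer = String::Tokenizer->new("(10 + 5) * 2", '()+*');

    # prints one token per line: ( 10 + 5 ) * 2
    print "$_\n" for $tokenizer->getTokens();
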
This is a very basic tokenizer, so more complex needs should be
addressed either with a custom-written tokenizer or by post-processing
the output generated by this module. Basically, it will not fill
everyone's needs, but it spans the gap between a simple
split / /, $string and the other options that involve much larger and
more complex modules, as the hypothetical comparison below shows.
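
    my $expr = "(10 + 5) * 2";

    # splitting on whitespace alone leaves the parentheses glued to
    # the numbers:
    my @words = split / /, $expr;    # '(10', '+', '5)', '*', '2'

    # String::Tokenizer with '()' given as delimiters would instead
    # return: '(', '10', '+', '5', ')', '*', '2'
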
Also note that this is not a lexical analyzer. Many people confuse
tokenization with lexical analysis. A tokenizer merely splits its input
into specific chunks; a lexical analyzer classifies those chunks.
Sometimes these two steps are combined, but not here.
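
As a hypothetical illustration of the difference, classifying each
chunk is the extra step a lexical analyzer performs on top of
tokenization (the token categories here are made up):

    # tokenization: the input is merely split into chunks
    my @tokens = ('(', '10', '+', '5', ')');

    # lexical analysis: each chunk is classified
    my @lexemes = map {
          /^\d+$/      ? [ NUMBER   => $_ ]
        : /^[-+*\/]$/  ? [ OPERATOR => $_ ]
        :                [ PAREN    => $_ ]
    } @tokens;
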
WWW: http://search.cpan.org/dist/String-Tokenizer/
Author: stevan little <stevan@iinteractive.com>