As a noun, tokenizer is (computing) a system that parses an input stream into tokens.
As a verb, tokenized is the past tense and past participle of tokenize.
tokenizer
English
Noun
(computing) A system that parses an input stream into tokens.
tokenized
English
Verb
Simple past tense and past participle of tokenize.
tokenize
English
Verb
(computing) To reduce to a set of tokens by parsing.
To treat as a token minority.
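The computing sense can be illustrated with a minimal sketch. The token classes chosen here (runs of digits, runs of word characters, and single punctuation symbols) are illustrative assumptions; a real tokenizer defines token classes suited to its grammar.

```python
import re

def tokenize(stream: str) -> list[str]:
    """Parse an input string into a list of tokens.

    Illustrative token classes (an assumption for this sketch):
    a run of digits, a run of word characters, or a single
    punctuation character. Whitespace separates tokens and is
    discarded.
    """
    return re.findall(r"\d+|\w+|[^\w\s]", stream)

print(tokenize("x = 42 + y;"))  # → ['x', '=', '42', '+', 'y', ';']
```

Each alternative in the pattern is tried in order, so digit runs are matched before general word runs, and anything that is neither a word character nor whitespace becomes a one-character punctuation token.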