tokenization
breaking a stream of text into smaller chunks, called tokens, for analysis or further processing
--Agreed Upon Solutions
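A minimal sketch of the idea in Python, assuming a simple regex-based strategy (one of many possible approaches; real tokenizers range from whitespace splitting to subword schemes and language-specific rules):

```python
import re

def tokenize(text: str) -> list[str]:
    # One simple strategy: runs of word characters become tokens,
    # and each punctuation mark becomes its own token.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Hello, world!"))  # ['Hello', ',', 'world', '!']
```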
