picosearch - v1.0.0
Type alias Tokenizer

Tokenizer: ((text: string) => string[])
Type declaration

(text: string): string[]
The tokenizer is a function that splits a text into its tokens.
Parameters

text: string

Returns string[]
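Example

A minimal sketch of a custom tokenizer that satisfies this type: it splits on whitespace, trims surrounding punctuation, and drops empty tokens. The import path 'picosearch' is an assumption based on the module name shown on this page; the actual entry point may differ.

import type { Tokenizer } from 'picosearch';

// Split on whitespace, strip leading/trailing non-letter/non-digit characters,
// and drop any tokens that end up empty.
const whitespaceTokenizer: Tokenizer = (text: string): string[] =>
  text
    .split(/\s+/)
    .map((token) => token.replace(/^[^\p{L}\p{N}]+|[^\p{L}\p{N}]+$/gu, ''))
    .filter((token) => token.length > 0);

// whitespaceTokenizer('Hello, world!') yields ['Hello', 'world'].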