SpaCy Wrapper

You can read all about spaCy at https://spacy.io.

class wrappers.spacy_wrapper.SpacyWrapper(spacy_module: str)[source]

Wrapper object that loads a spaCy module and simplifies its use.

sentence_tokenizer(text: str) → List[str][source]

Tokenize (split) text into sentences.

For example:

sentence_tokenizer("Hello, world. Here are two sentences.")

will output:

['Hello, world.', 'Here are two sentences.']

Parameters

text – raw text to split into sentences

Returns

list of strings, where each string is a sentence.
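The internals of SpacyWrapper are not shown here, but a minimal sketch of how such a wrapper could be built is below. It is an assumption, not the library's actual implementation: the real class accepts a `spacy_module` name (presumably passed to `spacy.load`), while this sketch uses a blank English pipeline with spaCy's rule-based `sentencizer` component so it runs without downloading a statistical model.

```python
from typing import List

from spacy.lang.en import English


class SpacyWrapper:
    """Hypothetical sketch: wraps a spaCy pipeline for sentence splitting."""

    def __init__(self) -> None:
        # Blank English pipeline plus the rule-based sentence boundary
        # detector; a real wrapper would likely call spacy.load(spacy_module).
        self._nlp = English()
        self._nlp.add_pipe("sentencizer")

    def sentence_tokenizer(self, text: str) -> List[str]:
        """Split raw text into a list of sentence strings."""
        doc = self._nlp(text)
        return [sent.text.strip() for sent in doc.sents]


wrapper = SpacyWrapper()
sentences = wrapper.sentence_tokenizer("Hello, world. Here are two sentences.")
```

The `doc.sents` iterator yields `Span` objects, so the sketch converts each span to its text and strips surrounding whitespace to match the list-of-strings return type documented above.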
