You can easily install small-text using pip or conda:
pip install small-text
conda install small-text
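After either command, a quick way to confirm the package is importable is a short Python check. This is a minimal sketch, not part of the library's API; the helper name is our own, and it falls back to a "not installed" message so it is safe to run before installation:

```python
def describe_install(package="small_text"):
    """Return a one-line install status for a package (version or 'not installed')."""
    try:
        module = __import__(package)
        return f"{package} {getattr(module, '__version__', 'unknown version')}"
    except ImportError:
        return f"{package} is not installed"

print(describe_install())
```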
This installs a minimal setup without any integrations. Installing the integrations enables additional, GPU-based functionality. Beyond that, some optional dependencies, i.e. dependencies which are needed only for a few strategies and are not installed by default (to avoid dependency bloat), might be required.
The small-text library is designed to be usable in combination with as many classifiers/classification libraries as possible. Whenever possible, we keep dependencies optional to avoid dependency bloat. Depending on the classifier of your choice, you might need additional Python libraries.
The PyTorch and Transformers Integrations are best used with a CUDA-capable GPU. You need CUDA version 10.1 or newer, and your GPU must support that specific version.
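Whether PyTorch can actually see a CUDA device is easy to check at runtime. The helper below is a hedged sketch (the function name is ours, not part of small-text); it uses the standard `torch.cuda.is_available()` call and degrades gracefully when PyTorch is not yet installed:

```python
def cuda_status():
    """Report whether PyTorch can see a CUDA-capable GPU."""
    try:
        import torch  # only present once the pytorch extra is installed
    except ImportError:
        return "PyTorch is not installed"
    if torch.cuda.is_available():
        return f"CUDA available (torch {torch.__version__}, CUDA {torch.version.cuda})"
    return "PyTorch installed, but no usable CUDA device detected"

print(cuda_status())
```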
To enable the PyTorch Integration, install the library with the pytorch extra:
pip install small-text[pytorch]
conda install small-text "torch>=1.6.0" "torchtext>=0.7.0"
After installation, make sure the installed torchtext and PyTorch versions are compatible.
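To compare the two versions side by side, a small sketch like the following can help (the helper name is ours; it simply prints each package's `__version__`, which you can then check against torchtext's published compatibility table):

```python
import importlib

def report_versions(packages):
    """Collect a version string for each package, noting missing ones."""
    lines = []
    for name in packages:
        try:
            module = importlib.import_module(name)
            lines.append(f"{name} {getattr(module, '__version__', 'unknown')}")
        except ImportError:
            lines.append(f"{name} is not installed")
    return lines

for line in report_versions(["torch", "torchtext"]):
    print(line)
```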
To enable the Transformers Integration, install the library with the transformers extra:
pip install small-text[transformers]
conda install small-text "torch>=1.6.0" "torchtext>=0.7.0" "transformers>=4.0.0"
The Transformers Integration also requires PyTorch, so installing it entails installing the PyTorch Integration as well.
We keep certain Python dependencies optional when they are either required only for very specific (query or stopping) strategies or provide pure convenience functionality.
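If you want to test for such an optional dependency before selecting a strategy that needs it, the standard library can do this without actually importing the package. This is a generic sketch (the helper name and the example package name are purely illustrative, not small-text API):

```python
from importlib.util import find_spec

def has_optional_dependency(name):
    """True if the optional dependency could be imported, without importing it."""
    return find_spec(name) is not None

# Guard a strategy behind its optional dependency; "some_optional_pkg"
# is a placeholder name used purely for illustration.
print(has_optional_dependency("some_optional_pkg"))
```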
An overview of these dependencies is given in the table below: