For now, this integration works only on macOS; support for Linux and Windows will come later. It enables the following tasks:
- zero-shot text classification
- sentiment analysis
- question answering
- summarization
- translation
Setup
- A macOS machine with an M1 chip or newer.
- A working Ollama installation. For installation instructions, refer to the Ollama webpage; this step is straightforward.
- For 7B models, at least 8GB RAM is recommended.
- For 13B models, at least 16GB RAM is recommended.
- For 70B models, at least 64GB RAM is recommended.
AI Engine
Before creating a model, you must first create an AI engine based on the provided handler. The name you give this engine should then be used as the value for the engine parameter in the USING clause of the CREATE MODEL statement.
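The original code sample was lost; as a sketch, assuming the handler is registered under the name ollama and using ollama_engine as an illustrative engine name:

```sql
-- Create an AI engine from the Ollama handler.
-- 'ollama_engine' is an illustrative identifier; choose any name.
CREATE ML_ENGINE ollama_engine
FROM ollama;
```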
AI Model
The CREATE MODEL statement is used to create, train, and deploy models within MindsDB. It accepts the following parameters in its USING clause:
| Name | Description |
|---|---|
| engine | The Ollama engine created with CREATE ML_ENGINE. |
| model_name | The name of the Ollama model to be used. |
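Putting the parameters above together, a minimal sketch of creating a model; ollama_model, the target column completion, and the llama2 model name are illustrative values, not fixed names:

```sql
-- 'ollama_model' and 'completion' are illustrative; 'llama2' stands in
-- for any model you have pulled locally with Ollama.
CREATE MODEL ollama_model
PREDICT completion
USING
    engine = 'ollama_engine',  -- the engine created earlier
    model_name = 'llama2';     -- a locally available Ollama model
```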
You can inspect the created model, its metadata, and its features with the DESCRIBE statement:

```sql
DESCRIBE ollama_model;
DESCRIBE ollama_model.model;
DESCRIBE ollama_model.features;
```
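Once the model is deployed, it can be queried like a table. A hedged sketch, reusing the illustrative ollama_model and completion names from above and assuming the model takes a text input column:

```sql
-- Query the deployed model; 'text' is an assumed input column name.
SELECT text, completion
FROM ollama_model
WHERE text = 'Hello, how are you?';
```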