Hugging Face Inference API
This documentation describes the integration of MindsDB with the Hugging Face Inference API. The integration allows for the deployment of Hugging Face models through the Inference API within MindsDB, giving the models access to data from various data sources.
Prerequisites
Before proceeding, ensure the following prerequisites are met:
- Install MindsDB locally via Docker or use MindsDB Cloud.
- To use the Hugging Face Inference API within MindsDB, install the required dependencies following these instructions.
- Obtain the API key for the Hugging Face Inference API, required to deploy and use Hugging Face models through the Inference API within MindsDB. Generate tokens in the Settings -> Access Tokens tab of your Hugging Face account.
Setup
Create an AI engine from the Hugging Face Inference API handler.
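A minimal sketch of the engine creation statement is shown below. The handler name `huggingface_api` follows the integration's name, and the credential parameter `huggingface_api_api_key` is an assumption based on MindsDB's usual naming convention for engine keys; check the handler's parameter list for the exact name.

```sql
-- Create an AI engine backed by the Hugging Face Inference API handler
CREATE ML_ENGINE huggingface_api_engine
FROM huggingface_api
USING
    huggingface_api_api_key = 'hf_xxx';  -- your Hugging Face access token
```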
Create a model using `huggingface_api_engine` as an engine, as in the template below.
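The following is a general template assuming the handler follows the standard MindsDB `CREATE MODEL` syntax; the `task`, `model_name`, and `input_column` names and values are placeholders mirroring MindsDB's Hugging Face examples rather than a definitive parameter list.

```sql
CREATE MODEL my_hf_model
PREDICT target_column
USING
    engine = 'huggingface_api_engine',   -- the engine created above
    task = 'text-classification',        -- or another supported task
    model_name = 'author/model-id',      -- Hugging Face Hub model identifier
    input_column = 'text_column';        -- column that holds the input text
```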
Usage
The following usage examples utilize `huggingface_api_engine` to create a model with the `CREATE MODEL` statement.
Create a model to classify input text as spam or ham.
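Here is a sketch of such a model. The SMS-spam model from the Hugging Face Hub is an illustrative choice, and the parameter names (`task`, `model_name`, `input_column`, `labels`) are assumptions based on MindsDB's Hugging Face examples.

```sql
-- Text classification model that labels input text as spam or ham
CREATE MODEL spam_classifier
PREDICT PRED
USING
    engine = 'huggingface_api_engine',
    task = 'text-classification',
    model_name = 'mrm8488/bert-tiny-finetuned-sms-spam-detection',
    input_column = 'text_spammy',
    labels = ['ham', 'spam'];
```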
Query the model to get predictions.
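For example, a single-row prediction can be made by passing the input text in the `WHERE` clause; the sample message below is purely illustrative.

```sql
-- Ask the model to classify one piece of text
SELECT text_spammy, PRED
FROM spam_classifier
WHERE text_spammy = 'Congratulations! You won a free ticket. Reply WIN to claim.';
```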
The model returns its prediction in the target column alongside the input text.
Next Steps
Follow this link to see more use case examples.