Install Docker

Install Docker on your machine by following the official installation instructions for your operating system.

To make sure Docker is successfully installed on your machine, run a test container as follows:

docker run hello-world

You should see the Hello from Docker! message. If you don't, check Docker's Get Started documentation for troubleshooting.

Docker for Mac users - RAM allocation issues

By default, Docker for Mac allocates 2 GB of RAM, which is insufficient for deploying MindsDB with Docker. We recommend increasing the default RAM limit to 8 GB. Please refer to the Docker Desktop for Mac users manual for more information on how to increase the allocated memory.

Install MindsDB

Please note that this method of MindsDB installation requires a minimum of 8 GB RAM and 20 GB free storage.

Run the command below to start MindsDB in Docker and follow the logs:

docker run -p 47334:47334 -p 47335:47335 mindsdb/mindsdb

Alternatively, if you run this command with the -d (detach) flag, it returns the container ID instead of following the logs:

docker run -d -p 47334:47334 -p 47335:47335 mindsdb/mindsdb

If you would like to persist your models and configurations on the host machine, run the following commands:

mkdir mdb_data
docker run -p 47334:47334 -p 47335:47335 -v $(pwd)/mdb_data:/root/mdb_storage mindsdb/mindsdb

Let’s analyze this command part by part:

  • docker run is a native Docker command used to start a container
  • -p 47334:47334 publishes port 47334 to access MindsDB GUI and HTTP API
  • -p 47335:47335 publishes port 47335 to access MindsDB MySQL API
  • -v $(pwd)/mdb_data:/root/mdb_storage maps the newly created mdb_data folder on the host machine to the /root/mdb_storage directory inside the container
  • mindsdb/mindsdb is the Docker image from which the container is started
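If you prefer Docker Compose, the options above can be expressed as a compose file. The following is an illustrative sketch, not an official MindsDB compose file; the service name and the ./mdb_data path are assumptions matching the commands above:

```yaml
services:
  mindsdb:
    image: mindsdb/mindsdb
    ports:
      - "47334:47334"   # MindsDB GUI and HTTP API
      - "47335:47335"   # MindsDB MySQL API
    volumes:
      # Persist models and configurations on the host
      # (assumes the compose file sits next to the mdb_data folder)
      - ./mdb_data:/root/mdb_storage
```

Running docker compose up -d in the same directory then starts the container with the equivalent settings.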

Now you can access the MindsDB editor by visiting http://127.0.0.1:47334 in your browser.

Install dependencies

MindsDB integrates with numerous data sources and AI frameworks. To use any of the integrations, you should ensure that the required dependencies are installed in the Docker container.

Here is how to install the dependencies in the Docker container:

  1. Run the MindsDB Docker container:

    docker run -d -p 47334:47334 -p 47335:47335 mindsdb/mindsdb
    
  2. Start an interactive shell in the container:

    docker exec -it container-name sh
    
  3. Install the dependencies:

    pip install mindsdb[handler_name]
    

    For example, run this command to install dependencies for the OpenAI handler:

    pip install mindsdb[openai]
    
  4. Exit the interactive shell:

    exit
    
  5. Restart the container:

    docker restart container-name
    

Configuration Options

Default Configuration

The default configuration for MindsDB's Docker image is stored as JSON, as shown below.

{
    "config_version": "1.4",
    "storage_dir": "/root/mdb_storage",
    "log": {
        "level": {
            "console": "ERROR",
            "file": "WARNING",
            "db": "WARNING"
        }
    },
    "debug": false,
    "integrations": {},
    "auth": {
        "username": "mindsdb",
        "password": "123"
    },
    "api": {
        "http": {
            "host": "127.0.0.1",
            "port": "47334"
        },
        "mysql": {
            "host": "127.0.0.1",
            "port": "47335",
            "database": "mindsdb",
            "ssl": true
        },
        "mongodb": {
            "host": "127.0.0.1",
            "port": "47336",
            "database": "mindsdb"
        }
    }
}

Custom Configuration

To override the default configuration, you can mount a config file over /root/mindsdb_config.json, as below.

docker run -v $(pwd)/mdb_config.json:/root/mindsdb_config.json mindsdb/mindsdb
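For example, a custom config file that raises the console log level and changes the default credentials might look like the sketch below. The values are illustrative; the keys mirror the default configuration shown above, and mirroring the full default structure is the safest way to avoid dropping settings:

```json
{
    "config_version": "1.4",
    "storage_dir": "/root/mdb_storage",
    "log": {
        "level": {
            "console": "INFO",
            "file": "WARNING",
            "db": "WARNING"
        }
    },
    "auth": {
        "username": "mindsdb",
        "password": "my-stronger-password"
    }
}
```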

Known Issues

#1

If you experience any issues related to MKL, or your training process does not complete, set the MKL_SERVICE_FORCE_INTEL environment variable, as below.

docker run -e MKL_SERVICE_FORCE_INTEL=1 -p 47334:47334 -p 47335:47335 mindsdb/mindsdb

What’s Next

Now that you have installed and started MindsDB locally in a Docker container, go ahead and learn how to create and train a model using the CREATE MODEL statement.
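As a preview, a CREATE MODEL statement looks roughly like this. The data source example_db, its home_rentals table, and the rental_price column are hypothetical placeholders for this sketch:

```sql
-- Train a model on data pulled from a connected data source
CREATE MODEL mindsdb.home_rentals_model
FROM example_db (SELECT * FROM home_rentals)
PREDICT rental_price;
```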

Check out the Use Cases section to follow tutorials that cover Large Language Models, Chatbots, Time Series, Classification, and Regression models, Semantic Search, and more.