# Mia-Assistant
Mia-Platform Console includes Mia Assistant, an AI-based application that can be asked about anything included in the official Mia-Platform Documentation.
## MongoDB configuration
Mia-Assistant relies on a MongoDB vector store collection, which is automatically populated during deployment.
However, the Helm Chart is not yet capable of creating the vector store collection and the necessary indexes for it to work; therefore, you have to manually create a collection named `assistant-documents` and configure an Atlas Vector Search index on it.
Please note that Atlas Vector Search indexes are available only on MongoDB Atlas instances running version 6.0.11, 7.0.2, or higher. They are not available on earlier versions or on MongoDB Enterprise Server; if you do not meet these requirements, the service will not work.
Please refer to the official MongoDB documentation for more information.
You can create the index in two ways:
- programmatically, with a script that uses the official MongoDB drivers to connect to MongoDB and create the index for you, as explained in the MongoDB official documentation (see the sketch after the index structure below)
- manually, by connecting to your MongoDB Atlas cluster and creating the index from the Atlas web application, as explained in this guide
It is important that the index has the following structure:

```json
{
  "fields": [
    {
      "numDimensions": 1536,
      "path": "embedding",
      "similarity": "euclidean",
      "type": "vector"
    },
    {
      "path": "__STATE__",
      "type": "filter"
    }
  ]
}
```
This index structure is mandatory; otherwise, the documents cannot be retrieved from the collection.
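As a reference for the programmatic approach, the following is a minimal sketch that creates the index with the official PyMongo driver (it requires a recent PyMongo version that supports the `type="vectorSearch"` parameter of `SearchIndexModel`). The connection string, database name, and the index name `vector_index` are placeholders, not values mandated by Mia-Assistant; adapt them to your Atlas cluster.

```python
from pymongo import MongoClient
from pymongo.operations import SearchIndexModel

# Placeholders: replace with your Atlas connection string and database name.
client = MongoClient("<YOUR_ATLAS_CONNECTION_STRING>")
collection = client["<YOUR_DATABASE>"]["assistant-documents"]

# Atlas Vector Search index with the structure required by Mia-Assistant.
index_model = SearchIndexModel(
    definition={
        "fields": [
            {
                "numDimensions": 1536,
                "path": "embedding",
                "similarity": "euclidean",
                "type": "vector",
            },
            {
                "path": "__STATE__",
                "type": "filter",
            },
        ]
    },
    name="vector_index",  # Assumed index name; adjust it to your setup.
    type="vectorSearch",
)

collection.create_search_index(model=index_model)
```

Keep in mind that Atlas builds search indexes asynchronously, so the index may take a short time to become queryable after creation.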
## OpenAI Configuration
The Mia-Assistant service can be configured via Helm Chart using the `.assistant` value.
At the moment, the only supported models are the ones developed by OpenAI.
The Helm Chart requires the API key for both the embedding model and the large language model in use. For OpenAI models, these two API keys are the same and can be created from the OpenAI API keys page. After logging in with your company credentials, you can create the API key that must be included in the `assistant` object inside the Helm Chart and that will be used by the Mia-Assistant service.
The service is already configured to use the following models:
- `text-embedding-3-small` as Embedding model
- `gpt-3.5-turbo` as Large Language Model (LLM)

Note that `text-embedding-3-small` produces 1536-dimensional vectors, which matches the `numDimensions` value required by the vector index described above.
Please note that using these models has a cost, which is detailed on the Pricing page of the OpenAI documentation.
When registering with OpenAI, you also have to set up a billing plan in order to use OpenAI services with the Mia-Assistant.
## Mia-Assistant Configuration
The configuration regarding the Assistant is included inside the `assistant` object, which is composed of the following properties:

| Name | Type | Description | Default | Optional |
|---|---|---|---|---|
| `enabled` | boolean | If set to `true`, the Mia-Assistant will be enabled | `false` | ✅ |
| `keys.llm` | string | The API key for the Large Language Model | - | ❌ |
| `keys.embeddings` | string | The API key for the Embedding Model | - | ❌ |
Please note that, without `keys.llm` and `keys.embeddings` correctly populated, the service will crash at startup.
When using OpenAI, the API key for the embeddings and the one for the LLM are the same, so the same value should be included in both properties.
### Example
```yaml
mia-console:
  configurations:
    ...
  assistant:
    enabled: true
    keys:
      llm: "<YOUR_OPENAI_API_KEY>"
      embeddings: "<YOUR_OPENAI_API_KEY>"
```