
NOTICE: All information contained herein is, and remains, the property of TechnoCore. The intellectual and technical concepts contained herein are proprietary to TechnoCore, and dissemination of this information or reproduction of this material is strictly forbidden unless prior written permission is obtained from TechnoCore.
The ObjAiMcpHuggingface class provides an interface to the Hugging Face Hub, allowing you to use a vast range of open-source language models via their Inference API. It is part of the Multi-Cloud Provider (MCP) framework and inherits from the ObjAiMcpBase class.
The class is initialized with the following parameters:
def __init__(self, db=0, api_key: str = "", model: str = "mistralai/Mistral-7B-Instruct-v0.2"):
db: The database connection object.
api_key: Your Hugging Face API token.
model: The ID of the Hugging Face model to use (e.g., "mistralai/Mistral-7B-Instruct-v0.2").

prompt
Sends a prompt to the specified Hugging Face model and returns the response.
def prompt(self, role: str = "", prompt: str = "", image_base64: str = "") -> str:
role: The system role for the AI (e.g., "You are a helpful assistant.").
prompt: The user's prompt or question.
image_base64: Not currently supported by this implementation.

To use the Hugging Face provider, you must first configure your API token in the config.yaml file under the ai_mcp_huggingface section:
ai_mcp_huggingface:
api_key: YOUR_HUGGINGFACE_API_KEY
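The source does not show how prompt talks to the Hub, but calls to the Hugging Face Inference API for an instruct model typically POST a JSON payload to the model's endpoint. The sketch below is an assumption about what such a call could look like: the endpoint URL pattern, the [INST] prompt wrapping, and the build_hf_request helper are illustrative, not taken from the actual ObjAiMcpHuggingface implementation.

```python
# Hypothetical helper: build the URL, headers, and JSON payload for a
# Hugging Face Inference API call. None of these names come from the
# actual ObjAiMcpHuggingface implementation; this is only a sketch.
def build_hf_request(api_key: str, model: str, role: str, prompt: str):
    url = f"https://api-inference.huggingface.co/models/{model}"
    headers = {"Authorization": f"Bearer {api_key}"}
    # Mistral-style instruct models expect the prompt wrapped in [INST]
    # tags; the system role is commonly prepended to the user prompt.
    payload = {"inputs": f"[INST] {role}\n{prompt} [/INST]"}
    return url, headers, payload

url, headers, payload = build_hf_request(
    "hf_xxx", "mistralai/Mistral-7B-Instruct-v0.2",
    "You are a helpful assistant.", "What is the capital of Germany?")
# The actual HTTP call could then be made with, e.g.,
# requests.post(url, headers=headers, json=payload); it is omitted here
# so the sketch stays self-contained and offline.
```

Keeping request construction separate from the network call like this also makes the payload easy to unit-test without an API key.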
You can then instantiate and use the ObjAI class with the appropriate model string:
ai_obj = ObjAI(db=0, model="mcp:huggingface:mistralai/Mistral-7B-Instruct-v0.2")
response = ai_obj.prompt("You are a helpful assistant.", "What is the capital of Germany?")
print(response)
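The model string passed to ObjAI follows the form mcp:&lt;provider&gt;:&lt;model_id&gt;. How ObjAI dispatches on that string internally is not shown in the source; the parse_mcp_model helper below is a hypothetical illustration of how such a string could be split into its parts. Note that splitting must stop after the second colon, because a model ID such as "mistralai/Mistral-7B-Instruct-v0.2" is an opaque suffix that could itself contain separators.

```python
# Hypothetical helper (not part of the MCP framework): split an
# "mcp:<provider>:<model_id>" string into provider and model ID.
# maxsplit=2 keeps any further colons inside the model ID intact.
def parse_mcp_model(model: str):
    scheme, provider, model_id = model.split(":", 2)
    if scheme != "mcp":
        raise ValueError(f"not an MCP model string: {model!r}")
    return provider, model_id

provider, model_id = parse_mcp_model(
    "mcp:huggingface:mistralai/Mistral-7B-Instruct-v0.2")
# provider  -> "huggingface"
# model_id  -> "mistralai/Mistral-7B-Instruct-v0.2"
```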