Language Models
Intric offers a wide range of language models, updated frequently so that you always have access to the latest technology.
You can choose to run these models in two ways:
- Via sub-processors: using external LLM providers (e.g., OpenAI, Anthropic, or Mistral), which act as sub-processors.
- Self-hosted (on-prem): running open-source models locally on your own infrastructure with your own GPUs. This option gives you maximum control over data flow.
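As an illustration of the self-hosted option, many open-source serving stacks expose an OpenAI-compatible chat-completions endpoint on your own hardware. The sketch below is hypothetical: the URL, port, and model name are assumptions for illustration, not part of Intric's actual API.

```python
import json
import urllib.request

# Assumed local endpoint of a self-hosted, OpenAI-compatible server
# (e.g. as exposed by common open-source serving stacks). Purely
# illustrative -- not Intric's API.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "mistral-7b-instruct") -> urllib.request.Request:
    """Prepare (but do not send) a chat-completion request to the local server."""
    body = json.dumps({
        "model": model,  # hypothetical model name
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        BASE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Summarise our data-processing policy.")
print(req.full_url)  # request stays on your own infrastructure
```

Because the endpoint lives on your own network, the prompt and response never leave your infrastructure, which is the data-flow control the on-prem option refers to.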
Administrators can see a complete and up-to-date list of available models, including where each model is operated, whether it is open source, and whether it supports Tools, at: https://portal.intric.ai/admin/models
For questions about specific models or their configuration, please contact Intric.