Commit cbf2e79

chore(model gallery): Add Ministral 3 family of models (aside from base versions)
Signed-off-by: rampa3 <68955305+rampa3@users.noreply.github.com>
1 parent: da8207b

File tree

1 file changed: +305 −0 lines


gallery/index.yaml

Lines changed: 305 additions & 0 deletions
@@ -12398,6 +12398,311 @@
     - filename: llama-cpp/mmproj/mmproj-mistral-community_pixtral-12b-f16.gguf
       sha256: a0b21e5a3b0f9b0b604385c45bb841142e7a5ac7660fa6a397dbc87c66b2083e
       uri: huggingface://bartowski/mistral-community_pixtral-12b-GGUF/mmproj-mistral-community_pixtral-12b-f16.gguf
+- !!merge <<: *mistral03
+  name: "mistralai_ministral-3-14b-instruct-2512-multimodal"
+  urls:
+    - https://huggingface.co/mistralai/Ministral-3-14B-Instruct-2512
+    - https://huggingface.co/unsloth/Ministral-3-14B-Instruct-2512-GGUF
+  description: |
+    The largest model in the Ministral 3 family, Ministral 3 14B offers frontier capabilities and performance comparable to its larger Mistral Small 3.2 24B counterpart. A powerful and efficient language model with vision capabilities.
+
+    The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 14B can even be deployed locally, capable of fitting in 24GB of VRAM in FP8, and less if further quantized.
+
+    Key Features:
+    Ministral 3 14B consists of two main architectural components:
+
+    - 13.5B Language Model
+    - 0.4B Vision Encoder
+
+    The Ministral 3 14B Instruct model offers the following capabilities:
+
+    - Vision: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+    - Multilingual: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, Arabic.
+    - System Prompt: Maintains strong adherence and support for system prompts.
+    - Agentic: Offers best-in-class agentic capabilities with native function calling and JSON outputting.
+    - Edge-Optimized: Delivers best-in-class performance at a small scale, deployable anywhere.
+    - Apache 2.0 License: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+    - Large Context Window: Supports a 256k context window.
+
+    This gallery entry includes mmproj for multimodality and uses Unsloth recommended defaults.
+  tags:
+    - llm
+    - gguf
+    - gpu
+    - mistral
+    - cpu
+    - function-calling
+    - multimodal
+  overrides:
+    context_size: 16384
+    parameters:
+      model: llama-cpp/models/mistralai_Ministral-3-14B-Instruct-2512-Q4_K_M.gguf
+      temperature: 0.15
+    mmproj: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-14B-Instruct-2512-f32.gguf
+  files:
+    - filename: llama-cpp/models/mistralai_Ministral-3-14B-Instruct-2512-Q4_K_M.gguf
+      sha256: 76ce697c065f2e40f1e8e958118b02cab38e2c10a6015f7d7908036a292dc8c8
+      uri: huggingface://unsloth/Ministral-3-14B-Instruct-2512-GGUF/Ministral-3-14B-Instruct-2512-Q4_K_M.gguf
+    - filename: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-14B-Instruct-2512-f32.gguf
+      sha256: 2740ba9e9b30b09be4282a9a9f617ec43dc47b89aed416cb09b5f698f90783b5
+      uri: huggingface://unsloth/Ministral-3-14B-Instruct-2512-GGUF/mmproj-F32.gguf
+- !!merge <<: *mistral03
+  name: "mistralai_ministral-3-14b-reasoning-2512-multimodal"
+  urls:
+    - https://huggingface.co/mistralai/Ministral-3-14B-Reasoning-2512
+    - https://huggingface.co/unsloth/Ministral-3-14B-Reasoning-2512-GGUF
+  description: |
+    The largest model in the Ministral 3 family, Ministral 3 14B offers frontier capabilities and performance comparable to its larger Mistral Small 3.2 24B counterpart. A powerful and efficient language model with vision capabilities.
+
+    This model is the reasoning post-trained version, trained for reasoning tasks, making it ideal for math, coding, and STEM-related use cases.
+
+    The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 14B can even be deployed locally, capable of fitting in 32GB of VRAM in BF16, and less than 24GB of RAM/VRAM when quantized.
+
+    Key Features:
+    Ministral 3 14B consists of two main architectural components:
+
+    - 13.5B Language Model
+    - 0.4B Vision Encoder
+
+    The Ministral 3 14B Reasoning model offers the following capabilities:
+
+    - Vision: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+    - Multilingual: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, Arabic.
+    - System Prompt: Maintains strong adherence and support for system prompts.
+    - Agentic: Offers best-in-class agentic capabilities with native function calling and JSON outputting.
+    - Reasoning: Excels at complex, multi-step reasoning and dynamic problem-solving.
+    - Edge-Optimized: Delivers best-in-class performance at a small scale, deployable anywhere.
+    - Apache 2.0 License: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+    - Large Context Window: Supports a 256k context window.
+
+    This gallery entry includes mmproj for multimodality and uses Unsloth recommended defaults.
+  tags:
+    - llm
+    - gguf
+    - gpu
+    - mistral
+    - cpu
+    - function-calling
+    - multimodal
+  overrides:
+    context_size: 32768
+    parameters:
+      model: llama-cpp/models/mistralai_Ministral-3-14B-Reasoning-2512-Q4_K_M.gguf
+      temperature: 0.7
+      top_p: 0.95
+    mmproj: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-14B-Reasoning-2512-f32.gguf
+  files:
+    - filename: llama-cpp/models/mistralai_Ministral-3-14B-Reasoning-2512-Q4_K_M.gguf
+      sha256: f577390559b89ebdbfe52cc234ea334649c24e6003ffa4b6a2474c5e2a47aa17
+      uri: huggingface://unsloth/Ministral-3-14B-Reasoning-2512-GGUF/Ministral-3-14B-Reasoning-2512-Q4_K_M.gguf
+    - filename: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-14B-Reasoning-2512-f32.gguf
+      sha256: 891bf262a032968f6e5b3d4e9ffc84cf6381890033c2f5204fbdf4817af4ab9b
+      uri: huggingface://unsloth/Ministral-3-14B-Reasoning-2512-GGUF/mmproj-F32.gguf
+- !!merge <<: *mistral03
+  name: "mistralai_ministral-3-8b-instruct-2512-multimodal"
+  urls:
+    - https://huggingface.co/mistralai/Ministral-3-8B-Instruct-2512
+    - https://huggingface.co/unsloth/Ministral-3-8B-Instruct-2512-GGUF
+  description: |
+    A balanced model in the Ministral 3 family, Ministral 3 8B is a powerful, efficient tiny language model with vision capabilities.
+
+    The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 8B can even be deployed locally, capable of fitting in 12GB of VRAM in FP8, and less if further quantized.
+
+    Key Features:
+    Ministral 3 8B consists of two main architectural components:
+
+    - 8.4B Language Model
+    - 0.4B Vision Encoder
+
+    The Ministral 3 8B Instruct model offers the following capabilities:
+
+    - Vision: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+    - Multilingual: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, Arabic.
+    - System Prompt: Maintains strong adherence and support for system prompts.
+    - Agentic: Offers best-in-class agentic capabilities with native function calling and JSON outputting.
+    - Edge-Optimized: Delivers best-in-class performance at a small scale, deployable anywhere.
+    - Apache 2.0 License: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+    - Large Context Window: Supports a 256k context window.
+
+    This gallery entry includes mmproj for multimodality and uses Unsloth recommended defaults.
+  tags:
+    - llm
+    - gguf
+    - gpu
+    - mistral
+    - cpu
+    - function-calling
+    - multimodal
+  overrides:
+    context_size: 16384
+    parameters:
+      model: llama-cpp/models/mistralai_Ministral-3-8B-Instruct-2512-Q4_K_M.gguf
+      temperature: 0.15
+    mmproj: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-8B-Instruct-2512-f32.gguf
+  files:
+    - filename: llama-cpp/models/mistralai_Ministral-3-8B-Instruct-2512-Q4_K_M.gguf
+      sha256: 5dbc3647eb563b9f8d3c70ec3d906cce84b86bb35c5e0b8a36e7df3937ab7174
+      uri: huggingface://unsloth/Ministral-3-8B-Instruct-2512-GGUF/Ministral-3-8B-Instruct-2512-Q4_K_M.gguf
+    - filename: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-8B-Instruct-2512-f32.gguf
+      sha256: 242d11ff65ef844b0aac4e28d4b1318813370608845f17b3ef5826fd7e7fd015
+      uri: huggingface://unsloth/Ministral-3-8B-Instruct-2512-GGUF/mmproj-F32.gguf
+- !!merge <<: *mistral03
+  name: "mistralai_ministral-3-8b-reasoning-2512-multimodal"
+  urls:
+    - https://huggingface.co/mistralai/Ministral-3-8B-Reasoning-2512
+    - https://huggingface.co/unsloth/Ministral-3-8B-Reasoning-2512-GGUF
+  description: |
+    A balanced model in the Ministral 3 family, Ministral 3 8B is a powerful, efficient tiny language model with vision capabilities.
+
+    This model is the reasoning post-trained version, trained for reasoning tasks, making it ideal for math, coding, and STEM-related use cases.
+
+    The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 8B can even be deployed locally, capable of fitting in 24GB of VRAM in BF16, and less than 12GB of RAM/VRAM when quantized.
+
+    Key Features:
+    Ministral 3 8B consists of two main architectural components:
+
+    - 8.4B Language Model
+    - 0.4B Vision Encoder
+
+    The Ministral 3 8B Reasoning model offers the following capabilities:
+
+    - Vision: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+    - Multilingual: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, Arabic.
+    - System Prompt: Maintains strong adherence and support for system prompts.
+    - Agentic: Offers best-in-class agentic capabilities with native function calling and JSON outputting.
+    - Reasoning: Excels at complex, multi-step reasoning and dynamic problem-solving.
+    - Edge-Optimized: Delivers best-in-class performance at a small scale, deployable anywhere.
+    - Apache 2.0 License: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+    - Large Context Window: Supports a 256k context window.
+
+    This gallery entry includes mmproj for multimodality and uses Unsloth recommended defaults.
+  tags:
+    - llm
+    - gguf
+    - gpu
+    - mistral
+    - cpu
+    - function-calling
+    - multimodal
+  overrides:
+    context_size: 32768
+    parameters:
+      model: llama-cpp/models/mistralai_Ministral-3-8B-Reasoning-2512-Q4_K_M.gguf
+      temperature: 0.7
+      top_p: 0.95
+    mmproj: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-8B-Reasoning-2512-f32.gguf
+  files:
+    - filename: llama-cpp/models/mistralai_Ministral-3-8B-Reasoning-2512-Q4_K_M.gguf
+      sha256: c3d1c5ab7406a0fc9d50ad2f0d15d34d5693db00bf953e8a9cd9a243b81cb1b2
+      uri: huggingface://unsloth/Ministral-3-8B-Reasoning-2512-GGUF/Ministral-3-8B-Reasoning-2512-Q4_K_M.gguf
+    - filename: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-8B-Reasoning-2512-f32.gguf
+      sha256: 92252621cb957949379ff81ee14b15887d37eade3845a6e937e571b98c2c84c2
+      uri: huggingface://unsloth/Ministral-3-8B-Reasoning-2512-GGUF/mmproj-F32.gguf
+- !!merge <<: *mistral03
+  name: "mistralai_ministral-3-3b-instruct-2512-multimodal"
+  urls:
+    - https://huggingface.co/mistralai/Ministral-3-3B-Instruct-2512
+    - https://huggingface.co/unsloth/Ministral-3-3B-Instruct-2512-GGUF
+  description: |
+    The smallest model in the Ministral 3 family, Ministral 3 3B is a powerful, efficient tiny language model with vision capabilities.
+
+    The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 3B can even be deployed locally, capable of fitting in 8GB of VRAM in FP8, and less if further quantized.
+
+    Key Features:
+    Ministral 3 3B consists of two main architectural components:
+
+    - 3.4B Language Model
+    - 0.4B Vision Encoder
+
+    The Ministral 3 3B Instruct model offers the following capabilities:
+
+    - Vision: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+    - Multilingual: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, Arabic.
+    - System Prompt: Maintains strong adherence and support for system prompts.
+    - Agentic: Offers best-in-class agentic capabilities with native function calling and JSON outputting.
+    - Edge-Optimized: Delivers best-in-class performance at a small scale, deployable anywhere.
+    - Apache 2.0 License: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+    - Large Context Window: Supports a 256k context window.
+
+    This gallery entry includes mmproj for multimodality and uses Unsloth recommended defaults.
+  tags:
+    - llm
+    - gguf
+    - gpu
+    - mistral
+    - cpu
+    - function-calling
+    - multimodal
+  overrides:
+    context_size: 16384
+    parameters:
+      model: llama-cpp/models/mistralai_Ministral-3-3B-Instruct-2512-Q4_K_M.gguf
+      temperature: 0.15
+    mmproj: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-3B-Instruct-2512-f32.gguf
+  files:
+    - filename: llama-cpp/models/mistralai_Ministral-3-3B-Instruct-2512-Q4_K_M.gguf
+      sha256: fd46fc371ff0509bfa8657ac956b7de8534d7d9baaa4947975c0648c3aa397f4
+      uri: huggingface://unsloth/Ministral-3-3B-Instruct-2512-GGUF/Ministral-3-3B-Instruct-2512-Q4_K_M.gguf
+    - filename: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-3B-Instruct-2512-f32.gguf
+      sha256: 57bb4e6f01166985ca2fc16061be4023fcb95cb8e60f445b8d0bf1ee30268636
+      uri: huggingface://unsloth/Ministral-3-3B-Instruct-2512-GGUF/mmproj-F32.gguf
+- !!merge <<: *mistral03
+  name: "mistralai_ministral-3-3b-reasoning-2512-multimodal"
+  urls:
+    - https://huggingface.co/mistralai/Ministral-3-3B-Reasoning-2512
+    - https://huggingface.co/unsloth/Ministral-3-3B-Reasoning-2512-GGUF
+  description: |
+    The smallest model in the Ministral 3 family, Ministral 3 3B is a powerful, efficient tiny language model with vision capabilities.
+
+    This model is the reasoning post-trained version, trained for reasoning tasks, making it ideal for math, coding, and STEM-related use cases.
+
+    The Ministral 3 family is designed for edge deployment, capable of running on a wide range of hardware. Ministral 3 3B can even be deployed locally, fitting in 16GB of VRAM in BF16, and less than 8GB of RAM/VRAM when quantized.
+
+    Key Features:
+    Ministral 3 3B consists of two main architectural components:
+
+    - 3.4B Language Model
+    - 0.4B Vision Encoder
+
+    The Ministral 3 3B Reasoning model offers the following capabilities:
+
+    - Vision: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+    - Multilingual: Supports dozens of languages, including English, French, Spanish, German, Italian, Portuguese, Dutch, Chinese, Japanese, Korean, Arabic.
+    - System Prompt: Maintains strong adherence and support for system prompts.
+    - Agentic: Offers best-in-class agentic capabilities with native function calling and JSON outputting.
+    - Reasoning: Excels at complex, multi-step reasoning and dynamic problem-solving.
+    - Edge-Optimized: Delivers best-in-class performance at a small scale, deployable anywhere.
+    - Apache 2.0 License: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+    - Large Context Window: Supports a 256k context window.
+
+    This gallery entry includes mmproj for multimodality and uses Unsloth recommended defaults.
+  tags:
+    - llm
+    - gguf
+    - gpu
+    - mistral
+    - cpu
+    - function-calling
+    - multimodal
+  overrides:
+    context_size: 32768
+    parameters:
+      model: llama-cpp/models/mistralai_Ministral-3-3B-Reasoning-2512-Q4_K_M.gguf
+      temperature: 0.7
+      top_p: 0.95
+    mmproj: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-3B-Reasoning-2512-f32.gguf
+  files:
+    - filename: llama-cpp/models/mistralai_Ministral-3-3B-Reasoning-2512-Q4_K_M.gguf
+      sha256: a2648395d533b6d1408667d00e0b778f3823f3f3179ba371f89355f2e957e42e
+      uri: huggingface://unsloth/Ministral-3-3B-Reasoning-2512-GGUF/Ministral-3-3B-Reasoning-2512-Q4_K_M.gguf
+    - filename: llama-cpp/mmproj/mmproj-mistralai_Ministral-3-3B-Reasoning-2512-f32.gguf
+      sha256: 8035a6a10dfc6250f50c62764fae3ac2ef6d693fc9252307c7093198aabba812
+      uri: huggingface://unsloth/Ministral-3-3B-Reasoning-2512-GGUF/mmproj-F32.gguf
 - &mudler
   url: "github:mudler/LocalAI/gallery/mudler.yaml@master" ### START mudler's LocalAI specific-models
   name: "LocalAI-llama3-8b-function-call-v0.2"
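Each added entry begins with `!!merge <<: *mistral03`, YAML 1.1's merge key: the entry inherits every key from the anchored `*mistral03` base mapping defined earlier in index.yaml, and any key the entry sets itself takes precedence. A minimal sketch of that resolution in plain Python dicts, with hypothetical base values standing in for the real anchor:

```python
def merge_entry(base: dict, entry: dict) -> dict:
    """Resolve `<<: *base` the way a YAML 1.1 loader does:
    start from the anchored mapping, then let the entry's
    own keys override the inherited ones."""
    resolved = dict(base)   # inherited defaults from the anchor
    resolved.update(entry)  # entry-local keys win on conflict
    return resolved

# Illustrative placeholder values, not the real *mistral03 contents.
base_mistral03 = {"license": "apache-2.0", "context_size": 8192}
entry = {
    "name": "mistralai_ministral-3-14b-instruct-2512-multimodal",
    "context_size": 16384,
}

resolved = merge_entry(base_mistral03, entry)
print(resolved["context_size"])  # entry override wins: 16384
print(resolved["license"])       # inherited from the anchor: apache-2.0
```

This is why each entry above only needs to restate `context_size`, `parameters`, and file lists: everything else rides along from the shared Mistral base definition.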

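Every file in these entries carries a sha256, so a downloaded GGUF can be verified against the gallery before use. A minimal sketch, assuming the weights have already been fetched to a local path (the filename in the commented example is illustrative):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so multi-GB GGUF weights
    never need to fit in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example against the 14B Instruct checksum from the entry above:
expected = "76ce697c065f2e40f1e8e958118b02cab38e2c10a6015f7d7908036a292dc8c8"
# ok = sha256_of("Ministral-3-14B-Instruct-2512-Q4_K_M.gguf") == expected
```

A mismatch usually means a truncated download; re-fetching the file is the first thing to try before suspecting the listed checksum.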