Ollama + HuggingFace
Create Custom Models From Hugging Face with Ollama

Image by Author
Ollama helps you get up and running with large language models locally, in a few easy and simple steps. Compared with Ollama's library, Hugging Face hosts more than half a million models. Wouldn't it be cool if we could create custom models from Hugging Face with Ollama? If your answer is yes, you have landed on the right post.
If you are new to Ollama, I have created a playlist; take your time to go through it, no pressure!
Here are the steps to create custom models.
- Make sure you have Ollama installed and running (no walking, just running!)
- Go to the Hugging Face website and download the model (I downloaded a GGUF model)
- Create a Modelfile and fill in the necessary instructions (the base model path, stop parameters, and prompt template)
- Create a model from this Modelfile and run it locally in the terminal
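The download step above can also be scripted instead of clicking through the website. This is a sketch assuming the `huggingface-cli` tool from the `huggingface_hub` pip package is installed; the walkthrough in this post simply downloads the file from the model page:

```shell
# Scripted version of the download step (assumes `pip install huggingface_hub`).
# Repo and filename match the example model used later in this post.
REPO="TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF"
FILE="capybarahermes-2.5-mistral-7b.Q4_K_M.gguf"
huggingface-cli download "$REPO" "$FILE" --local-dir . \
  || echo "huggingface-cli not found -- install it with: pip install huggingface_hub"
```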
Now, you might be wondering what the necessary instructions for the Modelfile are. I know what you're thinking, so here it is. In this example, I am using the TheBloke/CapybaraHermes-2.5-Mistral-7B-GGUF model.
# Modelfile
FROM "./capybarahermes-2.5-mistral-7b.Q4_K_M.gguf"
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
TEMPLATE """
<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
You might also be wondering how to create and run the custom model. For that too, here it is:
ollama create my-own-model -f Modelfile
ollama run my-own-model
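After `ollama create` finishes, a couple of optional sanity checks are worth running. This sketch assumes the Ollama server is running on its default port 11434, and uses the `my-own-model` name from the create command above:

```shell
MODEL="my-own-model"
# The new model should appear in the list, and `show` prints the
# template and parameters that were baked in from the Modelfile:
ollama show "$MODEL" || echo "run 'ollama create' first, and make sure the Ollama server is up"
# The model is also reachable over Ollama's local REST API:
curl -s http://localhost:11434/api/generate \
  -d "{\"model\": \"$MODEL\", \"prompt\": \"Hi there\", \"stream\": false}" \
  || echo "Ollama API not reachable on the default port 11434"
```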
Now you know how to create a custom model from a model hosted on Hugging Face with Ollama. Give it a try, and good luck! If you prefer a video walkthrough, here is the link.
Thank you for taking the time to read this post!
Make sure to leave your feedback and comments. See you in the next blog, stay tuned!
Machine Learning Engineer | Videos about AI, Data Science and LLMs: https://www.youtube.com/@datasciencebasics