PolarSPARC

Articles | Notes | Tips | Tutorials

Category: Ollama

Quick Primer on Running GGUF models on Ollama

In this primer, we will demonstrate how one can download, deploy, and use LLMs in the GGUF format on the Ollama platform.

Here is the link to the article on running GGUF models on Ollama:

Quick Primer on Running GGUF models on Ollama
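
To give a flavor of what the primer covers, here is a minimal sketch using the ollama Python client (installed with pip install ollama). It assumes the Ollama server is running locally and that a GGUF model file has already been registered with Ollama (typically by pointing a Modelfile at the downloaded .gguf file and running ollama create); the model name used below is hypothetical.

    # Minimal sketch: chat with a locally registered GGUF-based model via the
    # ollama Python client. Assumes the Ollama server is running on this host
    # and that the model name below (hypothetical) was created from a GGUF file.
    import ollama

    response = ollama.chat(
        model='llama-3.2-gguf',  # hypothetical name of the locally created model
        messages=[
            {'role': 'user', 'content': 'Summarize the GGUF model format in one sentence.'},
        ],
    )

    # Print the assistant's reply text
    print(response['message']['content'])

The same interaction can also be driven interactively from the command line with the ollama run command.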

Enjoy 🙂 !!!

Author: bhaskar_s | Posted on: January 4, 2025 (updated May 4, 2025) | Categories: Docker, LLM, Ollama | Tags: docker, llm, ollama
