<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Ollama on 查拉图的数字花园</title><link>https://www.chalatu.xyz/tags/ollama/</link><description>Recent content in Ollama on 查拉图的数字花园</description><generator>Hugo</generator><language>zh-CN</language><lastBuildDate>Fri, 17 Apr 2026 12:00:00 +0800</lastBuildDate><atom:link href="https://www.chalatu.xyz/tags/ollama/index.xml" rel="self" type="application/rss+xml"/><item><title>A Complete Guide to Deploying LLMs Locally (Updated April 2026)</title><link>https://www.chalatu.xyz/posts/solo-company/2026-04-17-local-llm-deployment-guide/</link><pubDate>Fri, 17 Apr 2026 12:00:00 +0800</pubDate><guid>https://www.chalatu.xyz/posts/solo-company/2026-04-17-local-llm-deployment-guide/</guid><description>A systematic walkthrough of the full local LLM deployment workflow in 2026: tool selection (Ollama/LM Studio/llama.cpp), a side-by-side comparison of mainstream models (Qwen3.5/GLM-4.7-Flash/Gemma 4/DeepSeek-R1), hardware VRAM requirements, the key differences from cloud APIs, and support for agent workflows such as Cline/OpenCode.</description></item></channel></rss>