{"id":118185,"date":"2024-10-29T14:12:23","date_gmt":"2024-10-29T14:12:23","guid":{"rendered":"\/tutorials\/?p=118185"},"modified":"2026-03-10T09:34:46","modified_gmt":"2026-03-10T09:34:46","slug":"ollama-cli-tutorial","status":"publish","type":"post","link":"\/ph\/tutorials\/ollama-cli-tutorial","title":{"rendered":"Ollama CLI tutorial: Running Ollama via the terminal"},"content":{"rendered":"<p>As a powerful tool for running large language models (LLMs) locally, Ollama gives developers, data scientists, and technical users greater control and flexibility in customizing models.<\/p><p>While you can use Ollama with third-party graphical interfaces like Open WebUI for simpler interactions, running it through the command-line interface (CLI) lets you log responses to files and automate workflows using scripts.<\/p><p>This guide will walk you through using Ollama via the CLI, from learning basic commands and interacting with models to automating tasks and deploying your own models. By the end, you&rsquo;ll be able to tailor Ollama for your AI-based projects.<\/p><p>\n\n\n\n<\/p><h2 class=\"wp-block-heading\" id=\"h-setting-up-ollama-in-the-cli\">Setting up Ollama in the CLI<\/h2><p>Before using Ollama in the CLI, make sure you&rsquo;ve <a href=\"\/ph\/tutorials\/how-to-install-ollama\">installed it on your system<\/a> successfully.
To verify, open your terminal and run the following command:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama --version<\/pre><p>You should see an output similar to:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b383f6\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"177\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-ollama-version-1024x177.png\" alt=\"Terminal output displaying the installed Ollama version\" class=\"wp-image-118187\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-version-1024x177.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-version-300x52.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-version-150x26.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-version-768x133.png 768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-version.png 1052w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" 
data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>Next, familiarize yourself with these essential Ollama commands:<\/p><figure tabindex=\"0\" class=\"wp-block-table\"><table class=\"has-fixed-layout\"><tbody><tr><td><strong>Command<\/strong><\/td><td><strong>Description<\/strong><\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama serve<\/code><\/td><td>Starts Ollama on your local system.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama create &lt;new_model&gt;<\/code><\/td><td>Creates a new model from an existing one for customization or training.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama show &lt;model&gt;<\/code><\/td><td>Displays details about a specific model, such as its configuration and release date.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama run &lt;model&gt;<\/code><\/td><td>Runs the specified model, making it ready for interaction.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama pull &lt;model&gt;<\/code><\/td><td>Downloads the specified model to your system.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama list<\/code><\/td><td>Lists all the downloaded models.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama ps<\/code><\/td><td>Shows the currently running
models.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama stop &lt;model&gt;<\/code><\/td><td>Stops the specified running model.<\/td><\/tr><tr><td><code data-enlighter-language=\"generic\" class=\"EnlighterJSRAW\">ollama rm &lt;model&gt;<\/code><\/td><td>Removes the specified model from your system.<\/td><\/tr><\/tbody><\/table><\/figure><h2 class=\"wp-block-heading\" id=\"h-essential-usage-of-ollama-in-the-cli\">Essential usage of Ollama in the CLI<\/h2><p>This section will cover the primary usage of the Ollama CLI, from interacting with models to saving model outputs to files.<\/p><h3 class=\"wp-block-heading\" id=\"h-running-models\">Running models<\/h3><p>To start using models in Ollama, you first need to download the desired model using the <strong>pull<\/strong> command. For example, to pull Llama <strong>3.2<\/strong>, execute the following:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama pull llama3.2<\/pre><p>Wait for the download to complete; the time may vary depending on the model&rsquo;s file size.<\/p><p><div class=\"protip\">\n                    <h4 class=\"title\">Pro tip<\/h4>\n                    <p> If you're unsure which model to download, check out the <a href=\"https:\/\/ollama.com\/library\" target=\"_blank\" rel=\"noopener\">Ollama official model library<\/a>. 
It provides important details for each model, including customization options, language support, and recommended use cases.<\/p>\n                <\/div>\n\n\n\n<\/p><p>After pulling the model, you can run it with a predefined prompt like this:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Explain the basics of machine learning.\"<\/pre><p>Here&rsquo;s the expected output:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b38d03\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"585\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-llama-3-2-output-machine-learning-1024x585.png\" alt=\"Terminal displaying an Ollama model's response about machine learning\" class=\"wp-image-118191\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-machine-learning-1024x585.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-machine-learning-300x172.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-machine-learning-150x86.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-machine-learning-768x439.png 768w, 
https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-machine-learning-1536x878.png 1536w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-machine-learning-2048x1171.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>Alternatively, run the model without a prompt to start an interactive session:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2<\/pre><p>In this mode, you can enter your queries or instructions, and the model will generate responses. 
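<\/p><p>Both forms are ordinary terminal programs that write to standard output, which means they compose with shell scripting. As a hedged sketch of the automation mentioned earlier, the script below loops a batch of one-shot prompts through the same model; the model name and prompts are illustrative, and the stub branch exists only so the loop still runs on machines where Ollama is unavailable.<\/p>

```shell
#!/bin/sh
# Send several one-shot prompts to the same model in a loop.
# Assumes the llama3.2 model has already been pulled with "ollama pull".
# If ollama is missing or the run fails, a stub reply is printed instead
# so the loop itself keeps working anywhere.
ask_model() {
  if command -v ollama >/dev/null 2>&1; then
    ollama run llama3.2 "$1" 2>/dev/null && return
  fi
  echo "(stub reply for: $1)"
}

for prompt in "Define supervised learning." "Define unsupervised learning."; do
  echo "=== $prompt ==="
  ask_model "$prompt"
done
```

<p>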
You can also ask follow-up questions to gain deeper insights or clarify a previous response, such as:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">Can you elaborate on how machine learning is used in the healthcare sector?<\/pre><p>When you&rsquo;re done interacting with the model, type:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">\/bye<\/pre><p>This will exit the session and return you to the regular terminal interface.<\/p><p><div class=\"protip\">\n                    <h4 class=\"title\">Suggested reading<\/h4>\n                    <p> Learn how to <a href=\"\/ph\/tutorials\/ai-prompt-engineering\">create effective AI prompts<\/a> to improve your results and interactions with Ollama models.<\/p>\n                <\/div>\n\n\n\n<\/p><h3 class=\"wp-block-heading\" id=\"h-training-models\">Training models<\/h3><p>While pre-trained open-source models like Llama <strong>3.2<\/strong> perform well for general tasks like content generation, they may not always meet the needs of specific use cases. To improve a model&rsquo;s accuracy on a particular topic, you need to train it using relevant data.<\/p><p>However, note that these models have <strong>short-term memory limitations<\/strong>, meaning the training data is only retained during the active conversation. When you quit the session and start a new one, the model won&rsquo;t remember the information you previously trained it with.<\/p><p>To train the model, start an interactive session. 
Then, initiate training by typing a prompt like:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">Hey, I want you to learn about [topic]. Can I train you on this?<\/pre><p>The model will respond with something like:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b39584\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large is-resized wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"78\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-llama-3-2-output-training-1024x78.png\" alt=\"Terminal displaying an Ollama model's response to a training prompt\" class=\"wp-image-118192\" style=\"width:840px;height:auto\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-training-1024x78.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-training-300x23.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-training-150x11.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-training-768x59.png 768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-training-1536x117.png 1536w, 
https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-output-training-2048x156.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>You can then provide basic information about the topic to help the model understand:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b39d29\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"554\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-llama-3-2-prompt-training-1024x554.png\" alt=\"Terminal displaying a prompt for training purposes\" class=\"wp-image-118193\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-prompt-training-1024x554.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-prompt-training-300x162.png 
300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-prompt-training-150x81.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-prompt-training-768x415.png 768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-prompt-training-1536x831.png 1536w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-prompt-training-2048x1108.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>To continue the training and provide more information, ask the model to prompt you with questions about the topic. 
For example:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">Can you ask me a few questions about [topic] to help you understand it better?<\/pre><p>Once the model has enough context on the subject, you can end the training and test if the model retains this knowledge.<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b3a3f3\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"551\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-1024x551.png\" alt=\"Terminal displaying an Ollama model's response after training\" class=\"wp-image-118194\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-1024x551.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-300x161.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-150x81.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-768x413.png 768w, 
https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-1536x827.png 1536w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-fast-beauty-phenomenon-2048x1102.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><h3 class=\"wp-block-heading\" id=\"h-prompting-and-logging-responses-to-files\">Prompting and logging responses to files<\/h3><p>In Ollama, you can ask the model to perform tasks using the contents of a file, such as summarizing text or analyzing information. 
This is especially useful for long documents, as it eliminates the need to copy and paste text when instructing the model.<\/p><p>For example, if you have a file named <strong>input.txt<\/strong> containing the information you want to summarize, you can run the following:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Summarize the content of this file in 50 words.\" &lt; input.txt<\/pre><p>The model will read the file&rsquo;s contents and generate a summary:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b3b374\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"151\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-llama-3-2-response-summary-1024x151.png\" alt=\"Terminal displaying an Ollama model's response to summarizing a TXT file\" class=\"wp-image-118195\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-summary-1024x151.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-summary-300x44.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-summary-150x22.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-summary-768x114.png 
768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-summary-1536x227.png 1536w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-llama-3-2-response-summary-2048x303.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>Ollama also lets you log model responses to a file, making it easier to review or refine them later. 
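<\/p><p>Because the response arrives on standard output, the usual shell redirection operators apply: <strong>&gt;<\/strong> overwrites a file, <strong>&gt;&gt;<\/strong> appends to it, and <strong>tee<\/strong> shows the response on screen while saving it. The sketch below illustrates the append-and-display pattern; the log file name is hypothetical, and the fallback line only exists so the pipeline works where Ollama is unavailable.<\/p>

```shell
#!/bin/sh
# Print a model's answer to the terminal while appending it to a log
# file with tee -a, so repeated runs accumulate in one place.
# The model name and log file are illustrative; a stub line is emitted
# when ollama is unavailable so the pipeline still runs.
run_logged() {
  prompt=$1
  log=$2
  { ollama run llama3.2 "$prompt" 2>/dev/null \
      || echo "(no reply: ollama unavailable)"; } | tee -a "$log"
}

run_logged "Tell me about renewable energy." "responses.log"
```

<p>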
Here&rsquo;s an example of asking the model a question and saving the output to a file:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Tell me about renewable energy.\" &gt; output.txt<\/pre><p>This will save the model&rsquo;s response in <strong>output.txt<\/strong>:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b3bcf2\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"341\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-cat-output-1024x341.png\" alt=\"Terminal displaying the content of output.txt using the Linux cat command\" class=\"wp-image-118196\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-cat-output-1024x341.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-cat-output-300x100.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-cat-output-150x50.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-cat-output-768x255.png 768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-cat-output-1536x511.png 1536w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-cat-output-2048x681.png 2048w\"
sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><h2 class=\"wp-block-heading\" id=\"h-advanced-usage-of-ollama-in-the-cli\">Advanced usage of Ollama in the CLI<\/h2><p>Now that you understand the essentials, let&rsquo;s explore more advanced uses of Ollama through the CLI.<\/p><h3 class=\"wp-block-heading\" id=\"h-creating-custom-models\">Creating custom models<\/h3><p>Running Ollama via the CLI, you can create a custom model based on your specific needs.<\/p><p>To do so, create a Modelfile, which is the blueprint for your custom model. The file defines key settings such as the base model, parameters to adjust, and how the model will respond to prompts.<\/p><p>Follow these steps to create a custom model in Ollama:<\/p><p><strong>1. Create a new Modelfile<\/strong><\/p><p>Use a text editor like <a href=\"\/ph\/tutorials\/how-to-install-and-use-nano-text-editor\">nano<\/a> to create a new Modelfile. 
In this example, we&rsquo;ll name the file <strong>custom-modelfile<\/strong>:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">nano custom-modelfile<\/pre><p>Next, copy and paste this basic Modelfile template, which you&rsquo;ll customize in the next step:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\"># Use Llama 3.2 as the base model\n\nFROM llama3.2\n\n# Adjust model parameters\n\nPARAMETER temperature 0.7\n\nPARAMETER num_ctx 3072\n\nPARAMETER stop \"assistant:\"\n\n# Define model behavior\n\nSYSTEM \"You are an expert in cyber security.\"\n\n# Customize the conversation template\n\nTEMPLATE \"\"\"{{ if .System }}Advisor: {{ .System }}{{ end }}\n\nClient: {{ .Prompt }}\n\nAdvisor: {{ .Response }}\"\"\"<\/pre><p><strong>2. Customize the Modelfile<\/strong><\/p><p>Here are the key elements you can customize in the Modelfile:<\/p><ul class=\"wp-block-list\">\n<li><strong>Base model (FROM)<\/strong>. Sets the base model for your custom instance. You can choose from available models like Llama <strong>3.2<\/strong>:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">FROM llama3.2<\/pre><ul class=\"wp-block-list\">\n<li><strong>Parameters (PARAMETER)<\/strong>. Control the model&rsquo;s behavior, such as:\n<ul class=\"wp-block-list\">\n<li><strong>Temperature<\/strong>. Adjusts the model&rsquo;s creativity. 
Higher values like <strong>1.0<\/strong> make it more creative, while lower ones like <strong>0.5<\/strong> make it more focused.<\/li>\n<\/ul>\n<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">PARAMETER temperature 0.9<\/pre><ul class=\"wp-block-list\">\n<li><strong>Context window (num_ctx)<\/strong>. Defines how much previous text the model uses as context.<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">PARAMETER num_ctx 4096<\/pre><ul class=\"wp-block-list\">\n<li><strong>System message (SYSTEM)<\/strong>. Defines how the model should behave. For example, you can instruct it to act as a specific character or avoid answering irrelevant questions:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">SYSTEM \"You are an expert in cyber security. Only answer questions related to cyber security. If asked anything unrelated, respond with: 'I only answer questions related to cyber security.'\"<\/pre><ul class=\"wp-block-list\">\n<li><strong>Template (TEMPLATE)<\/strong>. Customize how to structure the interaction between the user and the model.
For instance:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">TEMPLATE \"\"\"{{ if .System }}&lt;|start|&gt;system\n\n{{ .System }}&lt;|end|&gt;{{ end }}\n\n&lt;|start|&gt;user\n\n{{ .Prompt }}&lt;|end|&gt;\n\n&lt;|start|&gt;assistant\n\n\"\"\"<\/pre><p>After making the necessary adjustments, save the file and exit <strong>nano<\/strong> by pressing <strong>Ctrl + X &rarr; Y &rarr; Enter<\/strong>.<\/p><p><strong>3. Create and run the custom model<\/strong><\/p><p>Once your Modelfile is ready, use the command below to create a model based on the file:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama create custom-model-name -f .\/custom-modelfile<\/pre><p>You should see an output indicating the model was created successfully:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b3c707\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"235\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-ollama-create-custom-model-name-success-1024x235.png\" alt=\"Terminal output displaying the successful creation of a custom model\" class=\"wp-image-118197\" 
srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-create-custom-model-name-success-1024x235.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-create-custom-model-name-success-300x69.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-create-custom-model-name-success-150x34.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-create-custom-model-name-success-768x176.png 768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-create-custom-model-name-success-1536x353.png 1536w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-ollama-create-custom-model-name-success-2048x470.png 2048w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>After that, run it just like any other model:<\/p><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run 
custom-model-name<\/pre><p>This will start the model with the customizations you applied, and you can interact with it:<\/p><div class=\"wp-block-image\"><figure data-wp-context='{\"imageId\":\"69e1288b3cf21\"}' data-wp-interactive=\"core\/image\" class=\"aligncenter size-large wp-lightbox-container\"><img decoding=\"async\" width=\"1024\" height=\"135\" data-wp-class--hide=\"state.isContentHidden\" data-wp-class--show=\"state.isContentVisible\" data-wp-init=\"callbacks.setButtonStyles\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-on-async--load=\"callbacks.setButtonStyles\" data-wp-on-async-window--resize=\"callbacks.setButtonStyles\" src=\"\/tutorials\/wp-content\/uploads\/sites\/2\/2024\/10\/terminal-custom-model-name-response-1024x135.png\" alt=\"Terminal displaying a custom model's response to an unrelated topic\" class=\"wp-image-118198\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-custom-model-name-response-1024x135.png 1024w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-custom-model-name-response-300x40.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-custom-model-name-response-150x20.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-custom-model-name-response-768x101.png 768w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2024\/10\/terminal-custom-model-name-response.png 1530w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><button class=\"lightbox-trigger\" type=\"button\" aria-haspopup=\"dialog\" aria-label=\"Enlarge\" data-wp-init=\"callbacks.initTriggerButton\" data-wp-on-async--click=\"actions.showLightbox\" data-wp-style--right=\"state.imageButtonRight\" data-wp-style--top=\"state.imageButtonTop\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"12\" height=\"12\" fill=\"none\" 
viewbox=\"0 0 12 12\">\n\t\t\t\t<path fill=\"#fff\" d=\"M2 0a2 2 0 0 0-2 2v2h1.5V2a.5.5 0 0 1 .5-.5h2V0H2Zm2 10.5H2a.5.5 0 0 1-.5-.5V8H0v2a2 2 0 0 0 2 2h2v-1.5ZM8 12v-1.5h2a.5.5 0 0 0 .5-.5V8H12v2a2 2 0 0 1-2 2H8Zm2-12a2 2 0 0 1 2 2v2h-1.5V2a.5.5 0 0 0-.5-.5H8V0h2Z\"><\/path>\n\t\t\t<\/svg>\n\t\t<\/button><\/figure><\/div><p>You can continually tweak and refine the Modelfile by adjusting parameters, editing system messages, adding more advanced templates, or even including your own datasets. After saving your changes, run <strong>ollama create<\/strong> again to rebuild the model, then run it to see the effects.<\/p><h3 class=\"wp-block-heading\" id=\"h-automating-tasks-with-scripts\">Automating tasks with scripts<\/h3><p>Automating repetitive tasks in Ollama saves time and keeps your workflows consistent: bash scripts let you execute Ollama commands automatically, while cron jobs let you schedule those scripts to run at specific times. Here&rsquo;s how to get started:<\/p><p><strong>Create and run bash scripts<\/strong><\/p><p>You can create a bash script that executes Ollama commands. Here&rsquo;s how:<\/p><ol class=\"wp-block-list\">\n<li>Open a text editor and create a new file named <strong>ollama-script.sh<\/strong>:<\/li>\n<\/ol><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">nano ollama-script.sh<\/pre><ol start=\"2\" class=\"wp-block-list\">\n<li>Add the necessary Ollama commands inside the script. 
For instance, to run a model and save the output to a file:<\/li>\n<\/ol><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">#!\/bin\/bash\n\n# Run the model and save the output to a file\n\nollama run llama3.2 \"What are the latest trends in AI?\" &gt; ai-output.txt<\/pre><ol start=\"3\" class=\"wp-block-list\">\n<li>Make the script executable by <a href=\"\/ph\/tutorials\/vps\/change-linux-permissions-and-owners\">giving it the correct permissions<\/a>:<\/li>\n<\/ol><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">chmod +x ollama-script.sh<\/pre><ol start=\"4\" class=\"wp-block-list\">\n<li>Run the script directly from the terminal:<\/li>\n<\/ol><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">.\/ollama-script.sh<\/pre><p><strong>Set up cron jobs to automate tasks<\/strong><\/p><p>You can combine your script with a <a href=\"\/ph\/tutorials\/cron-job\">cron job<\/a> to automate tasks like running models regularly. 
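<\/p><p>If you schedule the script with cron, it helps to write each run&rsquo;s response to a timestamped file so that repeated runs don&rsquo;t overwrite earlier results. Here is a minimal sketch, reusing the <strong>llama3.2<\/strong> model from the example above (the <strong>ai-output-<\/strong> filename prefix is just an illustration):<\/p>

```shell
#!/bin/bash

# Build a timestamped output filename, e.g. ai-output-20241029-141223.txt,
# so scheduled runs do not overwrite one another.
OUTFILE="ai-output-$(date +%Y%m%d-%H%M%S).txt"

# Run the model only if the ollama binary is available on this machine.
if command -v ollama >/dev/null 2>&1; then
  ollama run llama3.2 "What are the latest trends in AI?" > "$OUTFILE"
fi
```

<p>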
Here&rsquo;s how to set up a cron job to run Ollama scripts automatically:<\/p><ol class=\"wp-block-list\">\n<li>Open the crontab editor by typing:<\/li>\n<\/ol><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">crontab -e<\/pre><ol start=\"2\" class=\"wp-block-list\">\n<li>Add a line specifying the schedule and the script you want to run. For example, to run the script every Sunday at midnight:<\/li>\n<\/ol><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">0 0 * * 0 \/path\/to\/ollama-script.sh<\/pre><ol start=\"3\" class=\"wp-block-list\">\n<li>Save and exit the editor after adding the cron job.<\/li>\n<\/ol><h2 class=\"wp-block-heading\" id=\"h-common-use-cases-for-the-cli\">Common use cases for the CLI<\/h2><p>Here are some real-world examples of using Ollama&rsquo;s CLI.<\/p><p><strong>Text generation<\/strong><\/p><p>You can use pre-trained models to create summaries, generate content, or answer specific questions.<\/p><ul class=\"wp-block-list\">\n<li>Summarizing a large text file:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Summarize the following text:\" &lt; long-document.txt<\/pre><ul class=\"wp-block-list\">\n<li>Generating content such as blog posts or product descriptions:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" 
data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Write a short article on the benefits of using AI in healthcare.\" &gt; article.txt<\/pre><ul class=\"wp-block-list\">\n<li>Answering specific questions to help with research:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"What are the latest trends in AI, and how will they affect healthcare?\"<\/pre><p><strong>Data processing, analysis, and prediction<\/strong><\/p><p>Ollama also lets you handle data processing tasks such as text classification, sentiment analysis, and prediction.<\/p><ul class=\"wp-block-list\">\n<li>Classifying text into positive, negative, or neutral sentiment:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Analyze the sentiment of this customer review: 'The product is fantastic, but delivery was slow.'\"<\/pre><ul class=\"wp-block-list\">\n<li>Categorizing text into predefined categories:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Classify this text into the following categories: News, Opinion, or Review.\" &lt; textfile.txt<\/pre><ul class=\"wp-block-list\">\n<li>Predicting an outcome based on the provided data:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" 
data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">ollama run llama3.2 \"Predict the stock price trend for the next month based on the following data:\" &lt; stock-data.txt<\/pre><p><strong>Integration with external tools<\/strong><\/p><p>Another common use of the Ollama CLI is combining it with external tools to automate data processing and expand the capabilities of other programs.<\/p><ul class=\"wp-block-list\">\n<li>Integrating Ollama with a third-party API to retrieve data, process it, and generate results:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">curl -X GET \"https:\/\/api.example.com\/data\" | ollama run llama3.2 \"Analyze the following API data and summarize key insights.\"<\/pre><ul class=\"wp-block-list\">\n<li>Using Python code to run a subprocess with Ollama:<\/li>\n<\/ul><pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">import subprocess\n\nresult = subprocess.run(['ollama', 'run', 'llama3.2', 'Give me the latest stock market trends'], capture_output=True)\n\nprint(result.stdout.decode())<\/pre><figure class=\"wp-block-image size-large\"><a href=\"\/ph\/vps-hosting\" target=\"_blank\" rel=\"noreferrer noopener\"><img decoding=\"async\" width=\"1024\" height=\"300\" src=\"https:\/\/www.hostinger.com\/tutorials\/wp-content\/uploads\/sites\/2\/2023\/02\/VPS-hosting-banner-1024x300.png\" alt=\"\" class=\"wp-image-77934\" srcset=\"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2023\/02\/VPS-hosting-banner.png 1024w, 
https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2023\/02\/VPS-hosting-banner-300x88.png 300w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2023\/02\/VPS-hosting-banner-150x44.png 150w, https:\/\/www.hostinger.com\/ph\/tutorials\/wp-content\/uploads\/sites\/44\/2023\/02\/VPS-hosting-banner-768x225.png 768w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/a><\/figure><h2 class=\"wp-block-heading\" id=\"h-conclusion\">Conclusion<\/h2><p>In this article, you&rsquo;ve learned the essentials of using Ollama via CLI, including running commands, interacting with models, and logging model responses to files.<\/p><p>Using the command-line interface, you can also perform more advanced tasks, such as creating new models based on existing ones, automating complex workflows with scripts and cron jobs, and integrating Ollama with external tools.<\/p><p>We encourage you to explore Ollama&rsquo;s customization features to unlock its full potential and enhance your AI projects. 
If you have any questions or would like to share your experience using Ollama in the CLI, feel free to use the comment box below.<\/p><h2 class=\"wp-block-heading\" id=\"h-ollama-cli-tutorial-faq\">Ollama CLI tutorial FAQ<\/h2><div class=\"schema-faq wp-block-yoast-faq-block\"><div class=\"schema-faq-section\" id=\"faq-question-1730210427465\"><h3 class=\"schema-faq-question\">What can I do with the CLI version of Ollama?<\/h3> <p class=\"schema-faq-answer\">With the CLI version of Ollama, you can run models, generate text, perform data processing tasks like sentiment analysis, automate workflows with scripts, create custom models, and integrate Ollama with external tools or APIs for advanced applications.<\/p> <\/div> <div class=\"schema-faq-section\" id=\"faq-question-1730210434605\"><h3 class=\"schema-faq-question\">How do I install models for Ollama in the CLI?<\/h3> <p class=\"schema-faq-answer\">To install models via the CLI, first make sure you have installed Ollama on your system. Then, use the <strong>ollama pull<\/strong> command followed by the model name. For example, to install <strong>Llama 3.2<\/strong>, execute <strong>ollama pull llama3.2<\/strong>.<\/p> <\/div> <div class=\"schema-faq-section\" id=\"faq-question-1730210439931\"><h3 class=\"schema-faq-question\">Can I use multimodal models in the CLI version?<\/h3> <p class=\"schema-faq-answer\">While it&rsquo;s technically possible to use multimodal models like LLaVA in Ollama&rsquo;s CLI, it&rsquo;s not convenient because the CLI is optimized for text-based tasks. We suggest <a href=\"\/ph\/tutorials\/ollama-gui-tutorial\">using Ollama with GUI tools<\/a> to handle visual-related work.<\/p> <\/div> <\/div>\n","protected":false},"excerpt":{"rendered":"<p>As a powerful tool for running large language models (LLMs) locally, Ollama gives developers, data scientists, and technical users greater control and flexibility in customizing models. 
While you can use Ollama with third-party graphical interfaces like Open WebUI for simpler interactions, running it through the command-line interface (CLI) lets you log responses to files and [&#8230;]<\/p>\n<p><a class=\"btn btn-secondary understrap-read-more-link\" href=\"\/ph\/tutorials\/ollama-cli-tutorial\">Read More&#8230;<\/a><\/p>\n","protected":false},"author":411,"featured_media":126192,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"rank_math_title":"Ollama CLI tutorial: Learn to use Ollama in the terminal","rank_math_description":"Learn how to use Ollama in the command-line interface for technical users. Set up models, customize parameters, and automate tasks.","rank_math_focus_keyword":"ollama cli tutorial","footnotes":""},"categories":[22639],"tags":[],"class_list":["post-118185","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-vps"],"hreflangs":[{"locale":"en-US","link":"https:\/\/www.hostinger.com\/tutorials\/ollama-cli-tutorial","default":0},{"locale":"fr-FR","link":"https:\/\/www.hostinger.com\/fr\/tutoriels\/tutoriel-ollama-cli","default":0},{"locale":"es-ES","link":"https:\/\/www.hostinger.com\/es\/tutoriales\/que-es-nslookup-3","default":0},{"locale":"id-ID","link":"https:\/\/www.hostinger.com\/id\/tutorial\/panduan-ollama-cli","default":0},{"locale":"en-UK","link":"https:\/\/www.hostinger.com\/uk\/tutorials\/ollama-cli-tutorial","default":0},{"locale":"en-MY","link":"https:\/\/www.hostinger.com\/my\/tutorials\/candle-business-name-ideas-9","default":0},{"locale":"en-PH","link":"https:\/\/www.hostinger.com\/ph\/tutorials\/ollama-cli-tutorial","default":0},{"locale":"en-IN","link":"https:\/\/www.hostinger.com\/in\/tutorials\/ollama-cli-tutorial","default":0},{"locale":"en-CA","link":"https:\/\/www.hostinger.com\/ca\/tutorials\/ollama-cli-tutorial","default":0},{"locale":"es-AR","link":"https:\/\/www.hostinger.com\/ar\/tutoriales\/que-es-nslookup
-3","default":0},{"locale":"es-MX","link":"https:\/\/www.hostinger.com\/mx\/tutoriales\/que-es-nslookup-3","default":0},{"locale":"es-CO","link":"https:\/\/www.hostinger.com\/co\/tutoriales\/que-es-nslookup-3","default":0},{"locale":"en-AU","link":"https:\/\/www.hostinger.com\/au\/tutorials\/ollama-cli-tutorial","default":0},{"locale":"en-NG","link":"https:\/\/www.hostinger.com\/ng\/tutorials\/ollama-cli-tutorial","default":0}],"_links":{"self":[{"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/posts\/118185","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/users\/411"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/comments?post=118185"}],"version-history":[{"count":12,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/posts\/118185\/revisions"}],"predecessor-version":[{"id":126187,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/posts\/118185\/revisions\/126187"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/media\/126192"}],"wp:attachment":[{"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/media?parent=118185"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/categories?post=118185"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.hostinger.com\/ph\/tutorials\/wp-json\/wp\/v2\/tags?post=118185"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}