{"id":514,"date":"2024-05-10T12:04:58","date_gmt":"2024-05-10T12:04:58","guid":{"rendered":"https:\/\/blog.hostinger.io\/support\/2024\/05\/10\/9310983-how-to-use-the-ollama-vps-template-at-hostinger\/"},"modified":"2025-08-27T06:45:52","modified_gmt":"2025-08-27T06:45:52","slug":"9310983-how-to-use-the-ollama-vps-template-at-hostinger","status":"publish","type":"post","link":"https:\/\/www.hostinger.com\/support\/9310983-how-to-use-the-ollama-vps-template-at-hostinger\/","title":{"rendered":"How to Use the Ollama VPS Template at Hostinger"},"content":{"rendered":"<p class=\"no-margin\"><b>Ollama<\/b> is a versatile platform designed for running and fine-tuning machine learning models, including advanced language models like <b>Llama3<\/b>. The <b><a href=\"\/support\/4965922-how-to-change-the-operating-system-of-vps\" target=\"_blank\" class=\"intercom-content-link\">Ubuntu 24.04 with Ollama<\/a><\/b> VPS template on Hostinger comes <b>pre-installed with Ollama, the Llama3 model, and Open WebUI<\/b>, providing an efficient way to manage and run these models. 
This guide will walk you through accessing and setting up Ollama.<\/p><p class=\"no-margin\">\n<\/p><div class=\"intercom-interblocks-callout\" style=\"background-color: #e3e7fa80;border-color: #334bfa33\">\n<p class=\"no-margin\">If you don&rsquo;t have a VPS yet, check the available options here: <b><a href=\"https:\/\/www.hostinger.com\/vps\/llm-hosting\" target=\"_blank\" class=\"intercom-content-link\">LLM VPS hosting<\/a> &#128640;<\/b><\/p>\n<\/div><p class=\"no-margin\">\n<\/p><h2 id=\"h_9eca25e7b0\">Accessing Ollama<\/h2><p class=\"no-margin\">Open your web browser and navigate to:<\/p><pre><code>https:\/\/[your-vps-ip]:8080<\/code><\/pre><p class=\"no-margin\">Replace <b>[your-vps-ip]<\/b> with the <b><a href=\"\/support\/5139756-how-to-find-your-vps-ip-address\" target=\"_blank\" class=\"intercom-content-link\">IP address of your VPS<\/a><\/b>.<\/p><p class=\"no-margin\">\n<\/p><p class=\"no-margin\">The first time you access it, you&rsquo;ll be prompted to <b>create an account<\/b>:<\/p><p class=\"no-margin\">\n<\/p><div class=\"intercom-container intercom-align-center\"><img decoding=\"async\" src=\"\/support\/wp-content\/uploads\/sites\/55\/2024\/05\/d579af74-3cf9-46a1-85de-2f9da9d3badb.jpg\" width=\"400\"><\/div><p class=\"no-margin\">\n<\/p><p class=\"no-margin\">These will be your login credentials for subsequent access.<\/p><p class=\"no-margin\">\n<\/p><h2 id=\"h_1a59cb6314\">Getting to Know Open WebUI<\/h2><p class=\"no-margin\">Once logged in, explore the <b>Open WebUI dashboard<\/b>, where you can:<\/p><ul>\n<li>\n<p class=\"no-margin\">Monitor active models<\/p>\n<\/li>\n<li>\n<p class=\"no-margin\">Upload new datasets and models<\/p>\n<\/li>\n<li>\n<p class=\"no-margin\">Track model training and inference tasks<\/p>\n<\/li>\n<\/ul><div class=\"intercom-container\"><img decoding=\"async\" src=\"\/support\/wp-content\/uploads\/sites\/55\/2024\/05\/9316ef80-b6b8-40c6-b9db-1490ace8efce.jpg\"><\/div><p class=\"no-margin\">\n<\/p><p 
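class=\"no-margin\">Besides the dashboard, the Ollama service on the VPS also exposes a REST API. As a quick sketch (assuming Ollama&rsquo;s default API port 11434 and only Python&rsquo;s standard library; <b>model_names<\/b> and <b>list_ollama_models<\/b> are hypothetical helper names introduced here for illustration), you can list the installed models programmatically:<\/p>

```python
import json
import urllib.request

def model_names(tags_response):
    # Pull just the model names out of an /api/tags response dict.
    return [m["name"] for m in tags_response.get("models", [])]

def list_ollama_models(host="http://localhost:11434"):
    # Query Ollama's REST API for installed models. 11434 is Ollama's
    # default API port; run this on the VPS itself or swap in its IP.
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return model_names(json.loads(resp.read()))
```

<p class=\"no-margin\">On this template, the returned list should include the pre-installed Llama3 model.<\/p><p 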
class=\"no-margin\">The pre-installed Llama3 model can be fine-tuned and managed through the interface. You can also add more models in the <b>settings<\/b> by clicking on the gear icon and selecting <b>Models<\/b>:<\/p><p class=\"no-margin\">\n<\/p><div class=\"intercom-container intercom-align-center\"><img decoding=\"async\" src=\"\/support\/wp-content\/uploads\/sites\/55\/2024\/05\/3ba86fce-63cc-4b5e-b84f-3f4760e00d72.jpg\"><\/div><p class=\"no-margin\">\n<\/p><h2 id=\"h_38105eb61e\">Running Inference with Llama3<\/h2><p class=\"no-margin\">You can use the pre-trained Llama3 model for inference directly through the Open WebUI interface. Input custom prompts or datasets to see how the model responds.<\/p><p class=\"no-margin\">\n<\/p><p class=\"no-margin\">Adjust the <b>hyperparameters<\/b> and experiment with <b>fine-tuning on custom data<\/b> to improve performance:<\/p><p class=\"no-margin\">\n<\/p><div class=\"intercom-container intercom-align-center\"><img decoding=\"async\" src=\"\/support\/wp-content\/uploads\/sites\/55\/2024\/05\/3a8ab498-00f3-4c7f-ac67-8584c98adc8a.jpg\" width=\"500\"><\/div><p class=\"no-margin\">\n<\/p><p class=\"no-margin\">It&rsquo;s important to note that the <b>response speed<\/b> of models depends heavily on the specific language model used and the number of CPU cores available. <b>Larger models<\/b> with more parameters often require more computational resources. 
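<\/p><p class=\"no-margin\">Outside the web interface, the same model can be queried through Ollama&rsquo;s <b>\/api\/generate<\/b> endpoint. Below is a minimal sketch, assuming the default API port 11434 and only Python&rsquo;s standard library; <b>build_generate_payload<\/b> and <b>ollama_generate<\/b> are illustrative names, not part of the template:<\/p>

```python
import json
import urllib.request

def build_generate_payload(prompt, model="llama3"):
    # stream=False asks Ollama for one complete JSON reply instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}

def ollama_generate(prompt, model="llama3", host="http://localhost:11434"):
    # POST the prompt to Ollama's /api/generate endpoint and return the text.
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_generate_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

<p class=\"no-margin\">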
If you&rsquo;re interested in expanding your machine-learning project, consider <b><a href=\"\/support\/1583229-how-to-upgrade-a-vps-server\" target=\"_blank\" class=\"intercom-content-link\">upgrading your VPS<\/a><\/b> to get more CPU cores for optimal performance.<\/p><p class=\"no-margin\">\n<\/p><p class=\"no-margin\"><b>Additional Resources<\/b><\/p><ul>\n<li>\n<p class=\"no-margin\">For more detailed information and advanced configurations, refer to the official <b><a href=\"https:\/\/github.com\/ollama\/ollama\/tree\/main\/docs\" target=\"_blank\" class=\"intercom-content-link\" rel=\"noopener\">Ollama documentation<\/a><\/b><\/p>\n<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Getting started with the Ollama VPS template at Hostinger<\/p>\n","protected":false},"author":581,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"include_on_kodee":true,"footnotes":""},"categories":[206],"tags":[],"class_list":["post-514","post","type-post","status-publish","format-standard","hentry","category-vps-os-and-templates"],"hreflangs":[{"locale":"en-US","link":"https:\/\/www.hostinger.com\/support\/9310983-how-to-use-the-ollama-vps-template-at-hostinger\/","default":1}],"include_on_kodee":true,"_links":{"self":[{"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/posts\/514","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/users\/581"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/comments?post=514"}],"version-history":[{"count":1,"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/posts\/514\/revisions"}],"predecessor-version":[{"id":2458,"href":"https:\/
\/www.hostinger.com\/support\/wp-json\/wp\/v2\/posts\/514\/revisions\/2458"}],"wp:attachment":[{"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/media?parent=514"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/categories?post=514"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.hostinger.com\/support\/wp-json\/wp\/v2\/tags?post=514"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}