I installed Ollama. Now what?
from Mubelotix@jlai.lu to selfhosted@lemmy.world on 19 Jan 21:28
https://jlai.lu/post/14464674

I installed Ollama but I don’t have any ideas of what to do with it.

Do you have any fun/original use cases for it? I’m a programmer so it doesn’t have to exist already.

#selfhosted


TootSweet@lemmy.world on 19 Jan 21:31

Uninstall it and make the world a slightly better place?

massive_bereavement@fedia.io on 20 Jan 00:03

Thank you Ollama.

ThanksObama@sh.itjust.works on 20 Jan 00:26

You’re welcome!

/s

Harvey656@lemmy.world on 19 Jan 21:32

Make a bot that viciously rips into people based off their username lol

But for real, nice. I get by with kobold for my uses. How far do you think you’ll take this?

iii@mander.xyz on 19 Jan 21:38

Have it pretend to be Gandalf working in a coffee shop

just_another_person@lemmy.world on 19 Jan 21:40

Exactly

slazer2au@lemmy.world on 19 Jan 21:43

Ask it to do stupid things. Like a to-do list in WebAssembly, or ask why a triangle has 4 sides and keep saying it is wrong till it believes you.

scrubbles@poptalk.scrubbles.tech on 19 Jan 21:46

Great job trying to learn! Ignore the naysayers here; as a fellow programmer, like it or not, you’re going to need to learn how to interact with it. That’s the real way we’d lose our jobs: if you don’t keep up with this stuff, you’re doomed to fall behind.

I recommend first building a simple CLI against the API that you can work with and ask questions, similar to ChatGPT. This will give you a good foundation on the APIs and how they work. Then you can move up to function calling for things like Home Assistant, and maybe even later to training LoRAs. Start small, just getting basic stuff working first, and build from there.
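
Something like this is enough to get started: a rough sketch against Ollama’s /api/chat endpoint, assuming the default localhost:11434, the requests library, and a model you’ve already pulled (llama3 here is just a placeholder):

```python
#!/usr/bin/env python3
"""Tiny chat REPL against a local Ollama server."""
import requests

MODEL = "llama3"  # placeholder: use whatever model you've pulled
URL = "http://localhost:11434/api/chat"  # Ollama's default port

history = []  # keep the whole conversation so the model has context

while True:
    prompt = input("you> ")
    if prompt.strip() in ("exit", "quit"):
        break
    history.append({"role": "user", "content": prompt})
    resp = requests.post(
        URL,
        json={"model": MODEL, "messages": history, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    reply = resp.json()["message"]  # {"role": "assistant", "content": "..."}
    history.append(reply)
    print("bot>", reply["content"])
```

Once that loop works, adding function calling or swapping in a different model is a small step.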

TheHobbyist@lemmy.zip on 19 Jan 21:53

Ollama is very useful but also rather barebones. I recommend installing Open-Webui to manage models and conversations. It will also be useful if you want to tweak more advanced settings like system prompts, seed, temperature and others.

You can install open-webui using docker or just pip, which is enough if you only care about serving yourself.

Edit: open-webui also renders markdown, which makes formatting and reading much more appealing and useful.

Edit2: you can also plug ollama into continue.dev, an extension for VS Code that brings LLM capabilities to your IDE.
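
Edit3: those settings (system prompt, seed, temperature) also map straight onto Ollama’s own API if you ever want to script them instead of clicking through a UI. A rough sketch, assuming the default localhost:11434, the requests library, and a placeholder model name:

```python
"""One-shot request to Ollama's /api/generate with a system prompt
and sampling options set explicitly."""
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default port
    json={
        "model": "llama3",  # placeholder: use whatever model you've pulled
        "system": "You are a terse assistant that answers in one sentence.",
        "prompt": "Why is the sky blue?",
        # low temperature + fixed seed makes the output (mostly) repeatable
        "options": {"temperature": 0.2, "seed": 42},
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```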

alphakenny1@lemmy.world on 19 Jan 23:18

Link it to Open WebUI, which makes things easier. Then you can give it knowledge of, say, all the manuals in your house, or your home insurance policy or something.

Link it to “speaches” and then you can make a voice chat.

Link it to continue.dev for coding

I think a lot of the use cases come from developing system prompts. You can then make a “custom” model for specific tasks, i.e. this model knows about my home insurance policy but writes back like it’s a pirate with a stutter (see the sketch below).

Nothing useful but gets your toes wet.
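
For the pirate-persona idea, a toy sketch that sets the personality through a system message on Ollama’s /api/chat endpoint (assuming the default localhost:11434, the requests library, and a placeholder model name; actually feeding it your policy text would be a separate step):

```python
"""Same base model, different personality: a "custom" model via system prompt."""
import requests

PERSONA = (
    "You answer questions about the user's home insurance policy, "
    "but you talk like a pirate with a stutter."
)

resp = requests.post(
    "http://localhost:11434/api/chat",  # Ollama's default port
    json={
        "model": "llama3",  # placeholder: use whatever model you've pulled
        "messages": [
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": "Am I covered for water damage?"},
        ],
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

The same system prompt can also be baked into a named model with an Ollama Modelfile (`ollama create`), so it shows up as its own entry in `ollama list`.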

hendrik@palaver.p3x.de on 19 Jan 23:45

Instruct it to be your dungeon master and do some roleplay.

DarkDarkHouse@lemmy.sdf.org on 20 Jan 11:41

I put on my robe and wizard hat.

hendrik@palaver.p3x.de on 20 Jan 11:56

Sure. I think that's a valid use case. Maybe use one of the community fine-tunes for that... 😆

Suoko@feddit.it on 20 Jan 05:17

github.com/ggozad/oterm

possiblylinux127@lemmy.zip on 20 Jan 17:28

Install Alpaca if you are on GNOME on Linux.