RimWorld

Not enough ratings
Local AI Social Interactions
   
Mod, 1.5
File size: 3.015 MB
Posted: 23 Jan @ 18:36
Updated: 23 Jan @ 18:39
2 change notes


Description
A RimWorld mod that displays what the colonists are saying to each other, using a locally installed LLM. It takes their bios and relationships into account. The results can be pretty funny.

Updated instructions:

You will need to download Ollama from ollama.com. It is free and open source. Once it is running, open your terminal and run:

ollama run llama3.2:3b

It will download and start the LLM (a roughly 2 GB download), and you will need 2 GB of RAM available. The mod will eventually have settings, but it currently only works with the default Ollama settings. You may have to find instructions on starting LLMs with Ollama on Windows; I only have a Mac, but it should work.
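The mod itself is written in C#, but the basic flow it relies on can be sketched in Python: build a prompt from two pawns' bios and their relationship, then send it to Ollama's local HTTP API (POST /api/generate on port 11434, Ollama's default). The pawn fields, names, and prompt wording below are illustrative assumptions, not the mod's actual code.

```python
# Hypothetical sketch of querying a local Ollama server the way a mod
# like this might. The endpoint and payload shape follow Ollama's HTTP
# API; the pawn data and prompt text are made-up examples.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port


def build_prompt(speaker: dict, listener: dict, relationship: str) -> str:
    """Fold both pawns' bios and their relationship into one prompt."""
    return (
        f"{speaker['name']} ({speaker['bio']}) is talking to "
        f"{listener['name']} ({listener['bio']}). "
        f"They are {relationship}. "
        "Write one short line of dialogue for the speaker."
    )


def generate_line(prompt: str, model: str = "llama3.2:3b") -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Only builds and prints the prompt; calling generate_line(prompt)
    # additionally requires Ollama to be running locally.
    prompt = build_prompt(
        {"name": "Sera", "bio": "a cynical ex-soldier"},
        {"name": "Bolt", "bio": "a cheerful chemist"},
        "rivals",
    )
    print(prompt)
```

With Ollama running and the model pulled as described above, `generate_line(prompt)` would return a single generated line of dialogue.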
14 comments
тетеря, блин 29 Jul @ 11:28
please, update 👍
tw_boring 15 Feb @ 3:43
Thanks so much. :steamhappy:
DealingLace 6 Feb @ 11:30
This looks sick! Keen to check out the repo, thanks man!
gavin.blair [author] 3 Feb @ 9:27
@Deankiller Turnupseed haha, that might be overkill. It is on GitHub for anyone that wants to take it to the next level.
Deankiller Turnupseed 2 Feb @ 23:03
deepseek when
Szczygiel 26 Jan @ 10:17
Well, it's fun. A pity.
gavin.blair [author] 26 Jan @ 9:45
@Szczygiel thanks for the feedback. It is on GitHub for anyone that wants to do something more with it: https://github.com/gavinblair/SocialInteractions

it was just a fun weekend project i thought people might enjoy. I likely won’t develop it further beyond maybe adding some settings for Ollama.
Szczygiel 25 Jan @ 3:19
I'm not only here to whine; I really like the idea. But you can do better, and I am sure you will; I'm looking forward to that. I have to say I am even excited, and hyped; I'm pretty sure the mod will be popular or spawn adaptations.
The first thing that comes to mind is some kind of restriction on how many LLM conversations the mod turns into these notifications. It heavily spams the console, and I'm not sure whether that will be a problem for logfile size.
As a RimWorld player, you should know that RimWorld alone doesn't exactly play nicely with a large number of mods, performance-wise. This mod calls out to an LLM outside the game, so it needs some performance optimisation, since the LLM alone takes a share of it. As far as I can imagine, that could be done through whatever mod options you can think of.
Szczygiel 25 Jan @ 3:18
I will make it one message to avoid clutter.
1. Something to ask: does it take into account only vanilla backgrounds and traits (or ones defined and pointed to by the mod author)? Or does it take whatever is in the pawn's bio? One AI mod that I tried sadly ignored things added by other mods completely.
2. The model version given in the instructions in the description: is that just an example, or is that the required model?
3. Will it work with any other model from Ollama, or any other Llama?
4. If not, do you plan to add support for newer versions or other variants, like the uncensored one? The newest is 3.3, I believe.
5. How hard would it be for you to add support, or for somebody to patch it in?
6. At first glance, it is really performance-heavy. It could be fine in a scenario where you have 3 colonists, but if you have 33, good luck with that.
7. Messages are barely readable and disappear too fast.
8. It not only lags the game quite a lot, but also lags itself. You will get notifications from past conversations while the game is paused.
gavin.blair [author] 24 Jan @ 10:55
@GRUFOLA I tried that, but it became difficult to read, so I opted to put it at the top. Perhaps it can be an option in the future.