RimWorld

Local AI Social Interactions
   
Mod, 1.5
File Size: 3.015 MB
Posted: Jan 23 @ 6:36pm
Updated: Jan 23 @ 6:39pm
2 Change Notes


Description
A RimWorld mod that displays what the colonists are saying to each other, generated by a locally installed LLM. It takes their bios and relationships into account. The results can be pretty funny.
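Roughly, the idea is: build a short prompt from the two pawns' bios and relationship, send it to the local Ollama server, and show the reply in-game. The snippet below is only a Python sketch of that flow, not the mod's actual code (the mod itself is written in C#, and the pawn data here is made up); the endpoint and request format shown are Ollama's defaults (http://localhost:11434, /api/generate):

# Python sketch of the general idea, not the mod's C# code.
# Only the Ollama endpoint and request shape are real; the pawn data is invented.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def chat_line(speaker, listener, relationship):
    # Assemble a prompt from the two pawns' bios and their relationship.
    prompt = (
        f"In the colony sim RimWorld, {speaker['name']} ({speaker['bio']}) is talking to "
        f"{listener['name']} ({listener['bio']}). They are {relationship}. "
        f"Write one short line of dialogue that {speaker['name']} says."
    )
    body = json.dumps({"model": "llama3.2:3b", "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=body, headers={"Content-Type": "application/json"})
    # Requires Ollama to be running locally; raises if the server is not reachable.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()

print(chat_line(
    {"name": "Dusty", "bio": "vatgrown soldier, pyromaniac"},
    {"name": "Mara", "bio": "medieval lordling, abrasive"},
    "rivals",
))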

Updated instructions:

You will need to download Ollama from ollama.com. It is free and open source. Once it is running, open your terminal and run:

ollama run llama3.2:3b

This will download the model (roughly a 2 GB file) and start it up. You will need about 2 GB of RAM available. The mod will eventually have settings, but it currently only works with the default Ollama settings. You may have to find instructions for running LLMs with Ollama on Windows (I only have a Mac), but it should work.
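If you want to confirm Ollama is actually up with its defaults before launching the game, here is a small optional Python check (not part of the mod). It assumes only Ollama's default local server on localhost:11434, whose /api/tags endpoint lists the models you have pulled:

# Optional check, not part of the mod: is Ollama running on its default port,
# and is llama3.2:3b already pulled? /api/tags lists locally available models.
import json
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.loads(resp.read()).get("models", [])]
    print("Ollama is running. Models:", models)
    if not any(name.startswith("llama3.2") for name in models):
        print("llama3.2:3b not found - run: ollama run llama3.2:3b")
except OSError:
    print("Could not reach Ollama on localhost:11434 - is it running?")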
13 Comments
tw_boring Feb 15 @ 3:43am 
Thanks so much. :steamhappy:
DealingLace Feb 6 @ 11:30am 
This looks sick! Keen to check out the repo, thanks man!
gavin.blair  [author] Feb 3 @ 9:27am 
@Deankiller Turnupseed haha, that might be overkill. It is on GitHub for anyone that wants to take it to the next level.
Deankiller Turnupseed Feb 2 @ 11:03pm 
deepseek when
Szczygiel Jan 26 @ 10:17am 
Well, it's fun; a pity.
gavin.blair  [author] Jan 26 @ 9:45am 
@Szczygiel Thanks for the feedback. It is on GitHub for anyone that wants to do something more with it: https://github.com/gavinblair/SocialInteractions

It was just a fun weekend project I thought people might enjoy. I likely won't develop it further beyond maybe adding some settings for Ollama.
Szczygiel Jan 25 @ 3:19am 
I'm not only here to whine; I really like the idea. But you can do better, and I am sure you will; I'm looking forward to that. I have to say I am even excited and hyped, and pretty sure the mod will be popular or have adaptations.
The first thing that comes to my mind is some kind of restriction on how many conversations the mod turns into these notifications via the LLM. It heavily spams the console, and I'm not sure whether that will be a problem for log file size.
As a RimWorld player you should know that RimWorld alone already struggles with performance as the number of mods grows. This mod calls an LLM outside the game, so it needs some performance optimisation, since the LLM alone takes its share of resources. As far as I can imagine, that could be done through whatever mod options you can think of.
Szczygiel Jan 25 @ 3:18am 
I will put everything in one message to avoid clutter.
1. Something to ask: does it take into account only vanilla backgrounds and traits (or ones defined and pointed to by the mod author)? Or does it take whatever is in the pawn's bio? One AI mod that I tried sadly ignored things added by other mods completely.
2. The instruction in the description about the model version: is that just an example, or is that the model?
3. Will it work with any other model from Ollama, or any other Llama?
4. If not, do you plan to add support for newer versions or other variants, like the uncensored one? The newest is 3.3, I believe.
5. How hard would it be for you to add support, or for somebody to patch it?
6. From a very first look, it is really performance heavy. It could be fine in a scenario where you have 3 colonists, but if you have 33, good luck with that.
7. Messages are barely readable and disappear too fast.
8. Not only does it lag the game quite a bit, it also lags behind itself: you will get notifications from past conversations while the game is paused.
gavin.blair  [author] Jan 24 @ 10:55am 
@GRUFOLA I tried that, but it became difficult to read, so I opted to put it at the top. Perhaps it can be an option in the future.
Giorgio Dibitonto Jan 24 @ 10:38am 
This seems amazing!

Could you consider making it look a little more like the "speak up" mod? It uses the "bubble interaction" mod to make the sentences appear near the pawns as bubbles. It would look a lot better.