AI2U: With You 'Til The End

Game doesn't work without an internet connection
Yet another game relying on an internet connection to even run, literally throwing up an error and refusing to go beyond the first screen. Absolutely refunding because of this. THIS (and the pointless censorship) is the reason why you need to move away from whatever online AI service you're using and let us run our own LLMs.
Rasseru Mar 1 @ 5:27am 
Originally posted by Chad "The King" ThunderCuck:
Yet another game relying on an internet connection to even run, literally throwing up an error and refusing to go beyond the first screen. Absolutely refunding because of this. THIS (and the pointless censorship) is the reason why you need to move away from whatever online AI service you're using and let us run our own LLMs.
The advanced AI is not on your client side; it runs on the developers' server. This is likely because the server side can handle the more complex processing and is easier to update when issues need fixing. It also keeps the developers' intellectual property from being stolen.

If you have an issue where you get a blank reply or trigger "I can't do that request", copy your text before submitting. That way, if an error happens, you can re-paste it and reword it or fix the grammar so the AI understands you better.
Last edited by Rasseru; Mar 1 @ 6:59am
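(To make the client/server split above concrete: a thin game client just POSTs the player's line to a remote service and shows whatever comes back, which is why a dropped connection yields a blank reply or an error screen. Everything in the sketch below is hypothetical; the endpoint URL, JSON shape, and field names are invented for illustration and are not AI2U's real API.)

# Hypothetical thin client talking to a remote LLM service; names and URL are made up.
import requests

ENDPOINT = "https://example-game-backend.invalid/api/chat"  # placeholder, not the real server

def send_line(player_text: str, timeout: float = 15.0) -> str | None:
    """Send the player's line to the server; return the reply, or None on failure."""
    try:
        resp = requests.post(ENDPOINT, json={"message": player_text}, timeout=timeout)
        resp.raise_for_status()
        return resp.json().get("reply") or None  # treat a blank reply as a failure
    except (requests.RequestException, ValueError):
        return None

# Because the model runs server-side, losing the connection means no reply at all.
# Keeping a copy of your text, as suggested above, lets you reword it and resend.
draft = "Good morning! What's for breakfast?"
reply = send_line(draft)
if reply is None:
    print("No reply from the server; edit and resend:", draft)
else:
    print(reply)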
Originally posted by Rasseru:
Originally posted by Chad "The King" ThunderCuck:
Yet another game relying on an internet connection to even run, literally throwing up an error and refusing to go beyond the first screen. Absolutely refunding because of this. THIS (and the pointless censorship) is the reason why you need to move away from whatever online AI service you're using and let us run our own LLMs.
The advanced AI is not on your client side; it runs on the developers' server. This is likely because the server side can handle the more complex processing and is easier to update when issues need fixing. It also keeps the developers' intellectual property from being stolen.

If you have an issue where you get a blank reply or trigger "I can't do that request", copy your text before submitting. That way, if an error happens, you can re-paste it and reword it or fix the grammar so the AI understands you better.
It's not "advanced" by any degree, as it seemingly has a context memory of less than 1000 tokens. There is no "complex thinking"; it's a basic LLM generating text from instructions, similar to a "character card". I run better LLMs than this all the time on my own hardware. Not sure what your second statement is for: I literally lost internet while playing, and all the bot could say was "please don't say that, it embarrasses me!". Trying to restart the game results in a "we can't connect to the network" style error that kicks you out of the game.
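(For anyone curious what "character card"-style prompting against a self-hosted model looks like in practice, here is a generic sketch. It is not what AI2U sends; the persona text and model name are made up, and it assumes a local server exposing an OpenAI-compatible chat endpoint, e.g. Ollama or llama.cpp's server on localhost.)

# Generic character-card prompt against a locally hosted, OpenAI-compatible endpoint.
# Persona, model name, and port are examples only.
import requests

LOCAL_URL = "http://localhost:11434/v1/chat/completions"  # default Ollama port; adjust for your server

character_card = (
    "You are Elyssia, a cheerful but possessive android housemate. "
    "Stay in character, answer in short conversational lines, and never break the fourth wall."
)

history = [{"role": "system", "content": character_card}]

def chat(player_text: str) -> str:
    history.append({"role": "user", "content": player_text})
    resp = requests.post(LOCAL_URL, json={
        "model": "llama3.1:8b",                        # whatever model you have pulled locally
        "messages": [history[0]] + history[1:][-19:],  # keep the card plus the most recent turns
        "max_tokens": 200,
    }, timeout=120)
    reply = resp.json()["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Good morning! Did you sleep well?"))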
Originally posted by Chad "The King" ThunderCuck:
Originally posted by Rasseru:
The advanced AI is not on your client side; it runs on the developers' server. This is likely because the server side can handle the more complex processing and is easier to update when issues need fixing. It also keeps the developers' intellectual property from being stolen.

If you have an issue where you get a blank reply or trigger "I can't do that request", copy your text before submitting. That way, if an error happens, you can re-paste it and reword it or fix the grammar so the AI understands you better.
It's not "advanced" by any degree, as it seemingly has a context memory of less than 1000 tokens. There is no "complex thinking"; it's a basic LLM generating text from instructions, similar to a "character card". I run better LLMs than this all the time on my own hardware. Not sure what your second statement is for: I literally lost internet while playing, and all the bot could say was "please don't say that, it embarrasses me!". Trying to restart the game results in a "we can't connect to the network" style error that kicks you out of the game.
Stay mad
Originally posted by The Totes Adorbs:
Originally posted by Chad "The King" ThunderCuck:
It's not "advanced" by any degree, as it seemingly has a context memory of less than 1000 tokens. There is no "complex thinking"; it's a basic LLM generating text from instructions, similar to a "character card". I run better LLMs than this all the time on my own hardware. Not sure what your second statement is for: I literally lost internet while playing, and all the bot could say was "please don't say that, it embarrasses me!". Trying to restart the game results in a "we can't connect to the network" style error that kicks you out of the game.
Stay mad
Ain't mad, I already have my money back.
Well, don't be surprised if you can't play anymore when the devs cut the line to GPT on Azure or whatever, just saying. I hope MyRobot will be better for customers.
Wicked Mar 1 @ 12:18pm 
Come on, give the devs a break. Even with all the flaws it's a 10 dollar/euro game. It's worth way more than that and has some very nice moments and replayability. The internet connection is listed in the system requirements, and sure, it's a bit meh, but the game is still in extremely early development.
Originally posted by -tantalos-:
MyRobot
MyRobot will have local stuff, but the gameplay looks less interesting imo. You can only do mining and hoverboarding.
Spidey01 Mar 1 @ 12:44pm 
I mostly view local models as desirable from a 'that API won't always be there' perspective. If they shipped purely local models, people would probably be moaning about the memory requirements instead--either the local storage or the RAM usage.

So I'd say the network requirement is more defensible here than it is for most games that demand connectivity. Most people are probably not going to play this game offline anyway, unless they're really bored on an airplane without expensive connectivity.
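(To put rough numbers on the memory point: a common rule of thumb is parameter count times bytes per weight, plus some overhead for the KV cache and runtime. The figures below are ballpark estimates under that assumption, not measurements of any particular game or model build.)

# Back-of-the-envelope RAM/VRAM estimate for running a chat model locally.
# Rule of thumb: weights ~ parameters * bytes_per_weight, plus ~25% overhead for KV cache and runtime.
def approx_memory_gb(params_billion: float, bytes_per_weight: float, overhead: float = 1.25) -> float:
    return params_billion * bytes_per_weight * overhead

for name, params, bpw in [
    ("8B model, 4-bit quantized", 8, 0.55),    # ~4.5 bits per weight incl. quantization metadata
    ("8B model, fp16", 8, 2.0),
    ("70B model, 4-bit quantized", 70, 0.55),
]:
    print(f"{name}: ~{approx_memory_gb(params, bpw):.0f} GB")

# Roughly 6 GB, 20 GB, and 48 GB respectively: a quantized 8B model fits on a
# mid-range GPU or in laptop RAM, while a 70B model generally does not.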
Originally posted by Spidey01:
I mostly view local models as desirable from a 'that API won't always be there' perspective. If they shipped purely local models, people would probably be moaning about the memory requirements instead--either the local storage or the RAM usage.

So I'd say the network requirement is more defensible here than it is for most games that demand connectivity. Most people are probably not going to play this game offline anyway, unless they're really bored on an airplane without expensive connectivity.

Is it also a "feature" that as soon as they stop paying for a web host, the game INSTANTLY becomes completely unplayable forever? It is effortless to let the user point the game at their own endpoint for the LLM work. The only reason they aren't doing it that way is so they can run some AI data mining on your computer without your consent.
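(On the "point the game at your own endpoint" idea: because so many servers now speak the OpenAI chat API, switching between a developer-hosted backend, your own API key, and a local model can come down to one configurable base URL. The sketch below shows that generic pattern; every URL, key, and model name in it is illustrative and none of it is AI2U's code.)

# Generic pattern: one client, three interchangeable backends chosen by config.
# All URLs, keys, and model names are placeholders.
import os
from openai import OpenAI  # pip install openai; the client works with any OpenAI-compatible server

BACKENDS = {
    "developer": {"base_url": "https://example-dev-backend.invalid/v1", "key": "bundled-game-token"},
    "own_key": {"base_url": "https://api.openai.com/v1", "key": os.getenv("OPENAI_API_KEY", "")},
    "local": {"base_url": "http://localhost:11434/v1", "key": "ollama"},  # local servers ignore the key
}

choice = os.getenv("LLM_BACKEND", "local")  # in a game this would come from a settings file
cfg = BACKENDS[choice]
client = OpenAI(base_url=cfg["base_url"], api_key=cfg["key"])

reply = client.chat.completions.create(
    model=os.getenv("LLM_MODEL", "llama3.1:8b"),
    messages=[{"role": "user", "content": "Say hi in character."}],
)
print(reply.choices[0].message.content)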
That's no different than any proprietary MMO game, except those usually pass the cost on to users with a subscription rather than a flat fee for the game. One of my ancestors used to say something like, "You pay your dime, you take your chances." That's very true of video games, now and since the age of shrink wrap.

Also, it's not without consent. Games have had some form of EULA since at least when the 386 was hot, beautiful hardware; they now typically have a PP and TOS, which AI2U does. By installing and running a program, you're also declaring some measure of trust (or stupidity) in running it in the first place. You don't think your typical AAA release is interested in what it can see about its customers? Don't like what games collect? Then stop running executables from people and companies you don't trust. I've always found it silly how willing PC and Mac users are to run random code from random people and, now, even websites.

Besides, most players are probably too busy trying to sex up or mentally fracture the AI characters (most of what they say probably lands in the realm of "that would be against ChatGPT's TOS") to give a hoot about the PP/TOS.

The real question, I suppose, is what policies govern the API endpoint that AI2U relies on, not so much the developer's own. But hey, I put my dime in that jukebox, and I'm not stupid enough to ask Elyssia how to build a nuclear bomb in my garage and then debate whether G-men will show up looking for missing plutonium.

In any case, the point of games is to have fun. You probably shouldn't be asking Eddie to write code for you at work or Estelle to do your stock trading. It's a game. Life is transient. Go touch digital grass and enjoy life.
Originally posted by Spidey01:
That's no different than any proprietary MMO game,

What are you, high? This is a single-player game, not an online game. There is no excuse for this.
Originally posted by Chad "The King" ThunderCuck:
THIS (and the pointless censorship) is the reason why you need to move away from whatever online AI service you're using and let us run our own LLMs.

You know, running an AI locally requires tremendous processing power, and that's where the $1000+ (dollars or euros) GPUs come in. I think most players are the broke student types who play on laptops and will never be able to deliver that amount of processing power as smoothly as a desktop tower can. Still, I am in favor of being able to choose between their AI server, our own AI server (via API keys), or running the model locally. Llama 3.3 seems to be kicking @$$.

The censorship happens *after* the reply is retrieved from the AI servers. The word replacement happens inside the game logic itself. The AI itself will happily use foul language, and I can show an example of it that I recorded before they rolled out the Steam Early Access update. Or you can see it for yourself in the Yandere AI Girlfriend Simulator demo, where you need to insert your own OpenAI API key.

Instead of seriously tackling bugs, stopping the AI parameters from shifting negatively on harmless replies, and adding missing or requested features (I know, some of that happened, but still), they spent their effort, time (time = money) and energy on (1) adding dialog word filters, (2) making the AI dialogs less flexible/free, (3) adding entire hoops to jump through to unlock subsequent levels (on top of an ending where you must either be an @$$hole, die like a dumb@$$, or be an OCD completionist), and (4) removing the ability to use your own API keys, which had been added after strong community backlash back when they had tokens for playtime. They took one step forward and three back.
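(On the claim that the word replacement happens inside the game logic after the reply comes back: a post-retrieval filter of that kind is usually just a substitution pass over the model's text on the client. The sketch below is a generic illustration of that technique; the word list and replacements are placeholders, not the game's actual filter.)

# Generic post-retrieval word filter: the reply arrives uncensored from the server,
# then the client swaps listed words before showing the text to the player.
# The word list below is a placeholder, not the game's real one.
import re

REPLACEMENTS = {
    "damn": "darn",
    "hell": "heck",
}

_pattern = re.compile(r"\b(" + "|".join(map(re.escape, REPLACEMENTS)) + r")\b", re.IGNORECASE)

def censor(reply_from_server: str) -> str:
    """Replace listed words after the reply has been retrieved, leaving the rest untouched."""
    return _pattern.sub(lambda m: REPLACEMENTS[m.group(0).lower()], reply_from_server)

print(censor("Well, damn, that went straight to hell."))
# prints: Well, darn, that went straight to heck.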
Originally posted by Motherf♥♥♥er Big Scorpio:
Originally posted by Chad "The King" ThunderCuck:
THIS (and the pointless censorship) is the reason why you need to move away from whatever online AI service you're using and let us run our own LLMs.

You know, running an AI locally requires tremendous processing power, and that's where the $1000+ (dollars or euros) GPUs come in. I think most players are the broke student types who play on laptops and will never be able to deliver that amount of processing power as smoothly as a desktop tower can. Still, I am in favor of being able to choose between their AI server, our own AI server (via API keys), or running the model locally. Llama 3.3 seems to be kicking @$$.

The censorship happens *after* the reply is retrieved from the AI servers. The word replacement happens inside the game logic itself. The AI itself will happily use foul language, and I can show an example of it that I recorded before they rolled out the Steam Early Access update. Or you can see it for yourself in the Yandere AI Girlfriend Simulator demo, where you need to insert your own OpenAI API key.

Instead of seriously tackling bugs, stopping the AI parameters from shifting negatively on harmless replies, and adding missing or requested features (I know, some of that happened, but still), they spent their effort, time (time = money) and energy on (1) adding dialog word filters, (2) making the AI dialogs less flexible/free, (3) adding entire hoops to jump through to unlock subsequent levels (on top of an ending where you must either be an @$$hole, die like a dumb@$$, or be an OCD completionist), and (4) removing the ability to use your own API keys, which had been added after strong community backlash back when they had tokens for playtime. They took one step forward and three back.
My $300 GPU can run much higher quality LLMs with much higher token counts than what they're using here, while also running Skyrim at 4K, which is a MUCH higher load on said GPU than this game. You have no idea what you're talking about.

The censorship that people don't want isn't the pointless "swear filter", it's the NSFW filter, BECAUSE NOBODY IS WASTING THEIR TIME ON AN AI WAIFU BOT JUST TO TALK TO IT. The filter that's telling the LLM "do not engage in sexual or violent situations", or something along those lines.
Wicked Mar 2 @ 3:06pm 
Speak for yourself, I only have a problem with the swear filter. If I wanted secks I'd just ask a random real woman (but I'm not that kind). Don't turn the girls into sluts. If they were "easy", that would mean they're already after certain things.
Originally posted by Chad "The King" ThunderCuck:
things

Unless by $300 you mean a second-hand GTX 1080 Ti or something that can run Skyrim (a game made in 2011, or its anniversary remaster from 2021) at 4K on the highest settings, I will toss that "you don't know what you're talking about" right back at you. I expect you to know that by 1000 dollars or euros I meant the currently iPhone-priced high-end GPUs on the market.

I covered the censorship part. You missed it when I mentioned the dialogs being less flexible/free.
Originally posted by Chad "The King" ThunderCuck:
My $300 GPU can run much higher quality LLMs with much higher token counts than what they're using here, while also running Skyrim at 4K, which is a MUCH higher load on said GPU than this game. You have no idea what you're talking about.

The censorship that people don't want isn't the pointless "swear filter", it's the NSFW filter, BECAUSE NOBODY IS WASTING THEIR TIME ON AN AI WAIFU BOT JUST TO TALK TO IT. The filter that's telling the LLM "do not engage in sexual or violent situations", or something along those lines.
lol. no it can't. lmao