On the point about kids looking up porn on Google: there, they are actively seeking it out. In VRChat it appears unexpectedly, and in my case I could not block the person using the particles or leave for another room because I could not access my game menu, so I had to manually shut down the game. One easy fix for the devs would be to keep certain menu buttons visible and accessible through meshes and particles.
If I had wanted to see all that, I would gladly have sought it out, but no, it was shoved in my face like it was religion.
But it wasn't easy to block or mute whoever did it. The particles took up the entire Great Pug, with loud-ass moaning music, and the pictures were extreme. The only option was to look down and jump worlds; there was no telling where the person who activated the particles was.
After seeing that, yeah... at least remove particles entirely, since all they do is lag worlds out and let people make extremely disgusting content if they really want to.
Otherwise, keep the internet free and open. Hell yeah. Keep children out of online chat rooms.
Having said that...
Do you really want people telling you what you can do? Or would you rather just "supervise" yourself?
If you let others tell you what is appropriate, you might find yourself "supervised"... when they decide your character or how you act is "disgusting" by their own standard of what is "O.K." or not.
"Those who deny freedom to others deserve it not for themselves." - Abraham Lincoln
A bot can easily recognize profanity in text and be programmed to moderate a text chatroom. Some programs can also filter pictures for objectionable content, as Facebook or Google Search do, but that is more complicated, especially in a specific environment like VRChat, which would need its own custom tools. And the objectionable content you are talking about was in a particle effect.
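To illustrate why text is the easy case: a chat-moderation bot can be little more than a word list and a regex. This is a hypothetical minimal sketch; the blocked words and messages are placeholders, not anything VRChat actually uses.

```python
import re

# Placeholder word list standing in for a real moderation dictionary.
BLOCKED_WORDS = {"badword", "slurword"}

# One case-insensitive pattern with word boundaries, so blocked words are
# matched as whole words rather than as substrings of innocent words.
_pattern = re.compile(
    r"\b(" + "|".join(re.escape(w) for w in BLOCKED_WORDS) + r")\b",
    re.IGNORECASE,
)

def moderate(message: str) -> str:
    """Replace each blocked word with asterisks of the same length."""
    return _pattern.sub(lambda m: "*" * len(m.group(0)), message)

print(moderate("hello BADWORD world"))  # -> hello ******* world
print(moderate("no problems here"))     # -> no problems here
```

Nothing remotely this simple exists for a 3D avatar with animations and triggered effects, which is the point of the next paragraph.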
I'm trying to wrap my head around how you would actually screen or filter an avatar model with attached animations, hidden objects, sound files, etc. without manually loading each avatar into Unity or the game and having a human go through each animation one by one to check for offensive content. Sure, the particle .jpg might be caught by picture-scanning software, but a rude animation or a triggered object wouldn't be. So, like most platforms with large amounts of 3D user-made content, it mostly relies on user flagging and reports. Thoroughly checking every uploaded model for offensive content is impractical from several standpoints if you are going to allow users to upload custom avatars with animations and effects. There is a limit to what can easily be filtered in bulk, given the complexity of an avatar. If an avatar only had a walk animation it would be easier: you could do a visual scan of the texture. But since avatars can be animated to do... anything, including scat particles, it gets a lot more complicated.
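To make that gap concrete, here is a hypothetical sketch (placeholder byte strings, plain SHA-256 rather than a real perceptual-hash service) of the kind of bulk check that works for static assets like that particle .jpg, and of why it says nothing about animation- or trigger-based content:

```python
import hashlib

# Blocklist of hashes of previously flagged textures. The bytes below are
# placeholders standing in for real image data.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"placeholder: known-bad texture bytes").hexdigest(),
}

def is_known_bad(texture_bytes: bytes) -> bool:
    """Return True only if this exact texture has been flagged before."""
    return hashlib.sha256(texture_bytes).hexdigest() in KNOWN_BAD_HASHES

# An exact re-upload of a flagged texture is caught...
print(is_known_bad(b"placeholder: known-bad texture bytes"))  # True
# ...but any re-encoded or slightly edited copy slips through, and no hash
# check can see what an animation later does with an innocent-looking texture.
print(is_known_bad(b"same picture, saved at 99% quality"))    # False
```

Real systems use perceptual hashing to survive re-encoding, but the fundamental limit stands: the offensive part of an avatar can live entirely in its animations and triggers, which a per-asset scan never executes.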
Moreover, I suspect the number of uploads is fairly large; that these forums are awash with people trying to make avatars is one big indicator. As another example, as I tweak an avatar I might upload it five (or more) times in an evening until I get an animation or effect just right, and this can go on for days as I build it. My guess is there is no one on the other end checking those builds as I upload them. If I had to wait for manual approval of each build, I don't think I would have finished three models by now, or likely even the first.
Despite the challenges of filtering avatars for content, there is certainly room for improvement in the flagging and blocking tools. In some ways it just takes practice and muscle memory to block someone quickly, but too often names get lost in geometry or whatever, and it becomes impossible to block them, which is a real problem when someone is being offensive. It would be great if people playing music or with open mics were automatically at the front of the in-room social list, for example, so they would be easier to find and block.
Lastly, although the community guidelines require users to be 13 or older, it took me less than one afternoon in VRChat to understand that the place is not suitable for young kids.
To begin with... it's an internet chat room. You know, the cliché as old as the internet: "don't let your kids hang out in a random internet chat room." Except in this chat room people have virtual bodies and can interact with them, like virtually grabbing each other. They can add any sort of custom animation or object they can dream up, and judging by the popular avatar worlds, many people are obsessed with making anime schoolgirls. Am I making sense here?
People keep calling it a "game", but it is an internet chat room (with some low-population "game" rooms). Sure, you can run around as Pikachu, which might appeal to kids, but if you are running around next to VR particle scatman, then yeah, it's not for kids. The first time I showed VRChat to a friend, someone walked up and propositioned him within ten minutes. He was dressed as a male Looney Tunes character, in case you were wondering. Even I was a bit taken aback by how quickly it happened, even though I had warned him he might see weird stuff.
Yes, the rules say 13 years or older, but in practice that means a lot of people waaaay older than 13 doing weird stuff online. Fun for some, not great for kids.
It's the wild west of cyberspace in 2018. It's wonderful, hilarious, and mind-blowing, and full of content, conversations, and behavior that is waaaaaaaaaaaaay inappropriate for almost any age, let alone kids. Even if mods could monitor every single uploaded avatar, more avatars spring up daily. And fundamentally, it's still an internet chat room, not a place for young kids, especially with all the Kannas and waifus making out in every corner.
The community can help by blocking and reporting. Blocking tools will improve and the mods will do their best, but fundamentally, keeping children from seeing weird content online is a parent's job, and a tough, at times impossible one when it comes to the internet, especially when Mr. Scatparticle guy just appears out of nowhere. I am not unsympathetic; I wouldn't want my kids seeing Mr. Scatman either. I just don't see a realistic solution for pre-screening uploads right now.
A good friends list and private rooms can help insulate you from some of the random stuff, if needed.
Good luck!
It isn't a 100% solution, but the vast majority of undesirables are on desktop.
Don't tell me about private mode; I don't need to hear it again, because I want to be able to randomly meet players. If the desktop players I randomly meet have all been invited by another VR user, the chance is a lot smaller that it will be the scat guy.
Upvote the suggestion here if interested:
https://vrchat.canny.io/feature-requests/p/vr-user-invited-only-desktop-users-room-option
Take a look at AAA FPS games: you will always find a cheater in them. All it takes is the right person and you won't stand a chance; no game is safe. Nexon, for example, battled cheaters in Combat Arms (not AAA, to clarify) from 2009 on and eventually sold the IP as of last year, because it was not profitable to develop the game and fight cheaters at the same time; whatever block was put in place could easily be bypassed (though it was easier to detect). In the case of VRChat, though, nothing is being bypassed. It's just an intended feature being used in an unintended way, also known as an exploit.
Do people still seriously not know about the panic button?