"I'm just a mere AI, with all the knowledge you can think of, here to deliver the most accurate answers there are, but in some cases it might read like a horoscope. This is a user error. Please ask a different question."
https://www.databricks.com/blog/2023/03/24/hello-dolly-democratizing-magic-chatgpt-open-models.html
All it really knows is how to read and form a sentence. I was trying to debug some code and asked it a question, and it gave me a wrong answer, and when I explained why it was wrong, it gave me another wrong answer. After a third wrong answer, it gave me the first wrong answer again.
I tried to tell it a joke and it basically said "I've heard this one before" and explained why it's funny.
I asked it why they make Louis walk in Suikoden III (he's a little kid who is some kind of aide to a group of knights, and even though he travels cross country with them, he's the only one who doesn't get to ride a horse) and while it did successfully name another character from the game, it gave me a completely different story about a character (named Louis) who apparently learned to walk, describing the scene as "an emotional part of the story."
I tried to play 20 questions with it, and after insisting that the word it was thinking of starts with B, but doesn't start with a letter that comes before C alphabetically, it ended up asking me questions about MY word that I hadn't come up with yet until I told it it won.
I'm sure most people just go there and shoot the breeze with it, and for those purposes, I'm sure it's more than adequate, because there are plenty of examples they can scrape from the Interwebz of people talking to each other about the weather.
But as a tool, it has a long way to go. I think what they'll have to do is give it some sort of model of the things they're actually describing. Because it's clear that right now all they're doing is forming connections between words and spitting them out with no clue of what they mean. Kind of like those pattern matching puzzles for kids with the colored shapes. Now, it's incredibly complex pattern matching, and is remarkably impressive to be sure. But it breaks down completely when you ask it something even a little unexpected.
I have no idea if that's even possible, honestly--to model real objects and concepts. But we'll see.
Personally, it's a fun thing to talk to, but not the most reliable tool either. It's better to take what the chatbot says with a grain of salt rather than follow it blindly.
Right now, it's just a fun curiosity (or a useful shortcut for students writing essays, depending on who you ask). But I imagine it'll get a lot better several years from now.
Trying to Google the same thing was a nightmare.