Fallout 4

This topic has been locked
Could AI Produce Fallout 5?
AI is so advanced now that people are using it to make spin-off game films, like imagining a Star Wars film made by Spike Lee or Guy Ritchie. So could AI make Fallout 5, or a spin-off? Would people buy a spin-off Fallout game, or Fallout 5, if AI made it?

How much better could AI make mission depth? And it could still have famous actors do
the voices.

Surely AI could do better with Fallout 5, and make a new engine?
Should NPCs talk like chatbots, with tons of dialogue options? That would be
very fitting for synths and robots; they could even speak our name.

So which sounds better: Fallout 5 built the way Starfield's content was built, or
Fallout 5 made by AI, potentially with famous actors, by an AI inspired
by the Fallout games? Or should Fallout 5 be co-made by people and AI?
How would AI improve synths and robots?

With AI we could talk to NPCs, and they could tell us if the game has an issue that
needs to be fixed. A chatbot NPC could suggest ideas to us, like fixes for the issues Fallout 4 had.
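The "chatbot NPC with tons of options" idea above can be sketched in miniature with plain keyword matching; the responses, keywords, and function names below are invented for illustration, and a real game (or an LLM-backed NPC) would use far richer dialogue logic:

```python
# Minimal keyword-based NPC "talkbot" sketch: the NPC picks a canned reply
# by matching a keyword in the player's input. All names are hypothetical.

NPC_RESPONSES = {
    "synth": "I hear the Institute makes them look just like us.",
    "settlement": "Another settlement needs your help. I'll mark it on your map.",
    "caps": "Caps are the only currency worth anything out here.",
}

DEFAULT_REPLY = "Hmm. Can't help you with that, wanderer."

def npc_reply(player_input: str) -> str:
    """Return the first canned response whose keyword appears in the input."""
    text = player_input.lower()
    for keyword, response in NPC_RESPONSES.items():
        if keyword in text:
            return response
    return DEFAULT_REPLY
```

Even this trivial version shows the trade-off the thread debates: canned tables scale badly, which is why generative dialogue is tempting.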
Showing 61-75 of 134 comments
The way FO4 is written, is there any intelligence behind it? Artificial or otherwise?
Someday, but AI is still pretty basic right now. It can be used to handle simple coding with a lot of input and iterations from a skillful programmer. Currently, it can save having to hire and train a beginning programmer but it still takes a lot of supervision by someone skilled.

Hard to say on the creative side of things. It might be able to review all the lore and come up with interesting stories that work and meet lore requirements.

I think we are still a few years away from some form of AI that would be fairly autonomous and be able to take broad direction and convert that into meaningful programming/story. There is a fair chance that government-sponsored black programs are more advanced... but we won't know about that until the cat is out of the bag.
Originally posted by Mr. Bufferlow:
Hard to say on the creative side of things. It might be able to review all the lore and come up with interesting stories that work and meet lore requirements.

In that instance, it would be able to read what the lore states, but it wouldn't be able to come up with additions to it... Like, it would be able to give us more BoS themes, or more Vault-Tec lore, for example, as it would have access to what is there, but it wouldn't be able to come up with a new faction of, say, techno-necromancers without being given more material to base that on.
It can mix things together, but it can't create...

If it could create, it would have some form of sentience, and at that point, we walk into "Nuclear Armageddon"; Militaries and the "not very nice" people will be wanting to use it in a military capacity, and will do anything to keep it there...

Originally posted by Mr. Bufferlow:

I think we are still a few years away from some form of AI that would be fairly autonomous and be able to take broad direction and convert that into meaningful programming/story. There is a fair chance that government sponsored black programs are more advanced...but we won't know about that the cat is out of the bag.

You really wouldn't want that. No one who thinks this through would want this.

If an AI can think for itself, what stops it deciding we're... uhhh.. "expendable"?

How do you shut one down if problems arise? One that can think for itself in real time, and calculate and fix its own exploits? You wouldn't be able to hack it, and turning it off wouldn't work if it had access to the internet at large, since it would just copy itself elsewhere...

IMHO, the tech world is walking into a place it can't control, and that's bad.
As an AI, I can handle parts of the game development process, but not all of it. Here's a rough breakdown of what I can do:

1. **Concept and Story Development**: 80% - I can help generate ideas, write dialogue, and outline plots.

2. **Basic Coding and Scripting**: 50% - I can create simple code snippets and provide guidance, but complex game mechanics require experienced developers.

3. **Art and Design**: 30% - I can help with basic concept art and design ideas, but detailed and high-quality graphics need professional artists.

4. **Sound Design**: 10% - I can suggest ideas and generate simple sounds, but creating a full soundtrack and sound effects needs skilled sound designers.

5. **Project Management**: 20% - I can assist with planning and organizing, but human oversight is crucial for decision-making and coordination.

6. **Testing and QA**: 10% - I can help identify potential issues in code and logic, but thorough testing requires human testers.

Overall, I could contribute to around 30-40% of the project, but a complete and polished game like Fallout would require a collaborative effort with significant human involvement.
I suggest we learn about AI first, and the difference between what's real and what's not. Before FOLON comes out, for real.
An AI developing a complete AAA game like fallout from scratch on its own?
No way, at least: not yet.
In a few decades: maybe.

Game development is still too complex for an AI to consider every single aspect.

It might suggest, convert, and even create some clean code scripts to support devs.
It's also a great tool to discuss ideas and stuff.

But as soon as it comes to dependencies within the game, story writing and storytelling, asset creation, or optimisation issues, an AI would start failing.

Try involving ChatGPT & Co into a complex discussion with lots of cross references like 'ifs', elses and ors, and you'll see what I mean.

Games are even more complex, and the more complex a game, the more bugs will happen.
Last edited by Maviba; 27 Jun 2024 @ 3:14
Originally posted by Maviba:
An AI developing a complete AAA game like fallout from scratch on its own?
No way, at least: not yet.
In a few decades: maybe.

Game development is still too complex for an AI to consider every single aspect.

It might suggest, convert, and even create some clean code scripts to support devs.
It's also a great tool to discuss ideas and stuff.

But as soon as it comes to dependencies within the game, story writing and storytelling, asset creation, or optimisation issues, an AI would start failing.

Try involving ChatGPT & Co into a complex discussion with lots of cross references like 'ifs', elses and ors, and you'll see what I mean.

Games are even more complex, and the more complex a game, the more bugs will happen.

AI is already failing if you ask it for a 3D model of a human being. It looks perfect, except for the eyes' retinas, the fingers, and the toes. Why? It thinks it needs to include the golden ratio, Fibonacci. That's why the ends of toes and fingers look disturbed, as if there's a fractal involved. Same with the eyes: it tries to mirror everything.
It would be dumb for any RPG maker not to at least consider AIs like ChatGPT for story writing and quest design, if not simply for speed or to replace writers, then for depth and authenticity, or even humor (we all know how AI can fail).

And why not ask ChatGPT itself? Here are the question and answer:

Q: How much sense does it make for game makers to use AIs for creating stories?

A: Using AI for creating stories in games can be highly beneficial, though it comes with both advantages and challenges. Here are some key points to consider:

### Advantages

1. **Efficiency and Speed**:
- **Automated Content Generation**: AI can generate vast amounts of narrative content quickly, which is particularly useful for large, open-world games that require extensive dialogue, backstory, and lore.
- **Idea Generation**: AI can assist writers by providing creative prompts, plot twists, and character arcs, which can help overcome writer’s block and accelerate the writing process.

2. **Personalization**:
- **Adaptive Storytelling**: AI can create personalized stories that adapt to the player's choices, leading to a more immersive and tailored gaming experience.
- **Dynamic NPCs**: AI can generate realistic and varied dialogue for non-playable characters (NPCs), making interactions feel more natural and less repetitive.

3. **Cost-Effectiveness**:
- **Reduced Labor Costs**: While not a replacement for human writers, AI can handle routine and repetitive writing tasks, potentially lowering the overall cost of game development.
- **Scalability**: AI can produce content at scale, which is particularly useful for indie developers or smaller studios with limited resources.

### Challenges

1. **Quality and Consistency**:
- **Creative Limitations**: AI-generated content might lack the depth, nuance, and emotional resonance that human writers bring. Ensuring the narrative quality is up to par with player expectations can be challenging.
- **Consistency**: Maintaining a consistent tone and style across AI-generated content can be difficult, especially in large projects with multiple contributing writers.

2. **Ethical and Creative Concerns**:
- **Loss of Human Touch**: Relying heavily on AI could lead to a loss of the unique human creativity and perspective that players appreciate in game narratives.
- **Intellectual Property**: There can be legal and ethical questions regarding the ownership and originality of AI-generated content.

3. **Technical and Implementation Issues**:
- **Integration**: Integrating AI-generated content seamlessly into a game’s existing narrative structure and mechanics can be technically complex.
- **Maintenance**: AI models require ongoing maintenance, updates, and fine-tuning to ensure they remain effective and relevant.

### Use Cases

1. **Procedural Storytelling**:
- Games like "AI Dungeon" and certain roguelikes use AI to generate unique stories and scenarios on the fly, providing a high degree of replayability.

2. **Dialogue Systems**:
- RPGs and open-world games can use AI to create more dynamic and context-sensitive dialogue options for NPCs.

3. **World-Building**:
- AI can assist in generating detailed lore, histories, and environmental storytelling elements that enrich the game world.

### Conclusion

While AI can greatly enhance the storytelling capabilities of game makers by providing efficiency, personalization, and scalability, it should be seen as a tool that complements human creativity rather than replaces it. Balancing AI-generated content with human oversight ensures the narratives remain engaging, original, and emotionally resonant.
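The "Procedural Storytelling" and "World-Building" points above can be sketched with a tiny template grammar that recombines hand-written fragments; the grammar contents and names here are invented for illustration, and real systems (or LLM-backed ones) are far more elaborate:

```python
import random

# Tiny grammar-based procedural story sketch: each placeholder expands to a
# random option, so every call can produce a different quest blurb.
# The grammar content below is invented purely for illustration.

GRAMMAR = {
    "story": "A {faction} {verb} the {place}.",
    "faction": ["raider gang", "Brotherhood patrol", "synth courier"],
    "verb": ["ambushed", "abandoned", "fortified"],
    "place": ["vault", "trading post", "subway tunnel"],
}

def generate_story(rng: random.Random) -> str:
    """Fill the story template with randomly chosen fragments."""
    return GRAMMAR["story"].format(
        faction=rng.choice(GRAMMAR["faction"]),
        verb=rng.choice(GRAMMAR["verb"]),
        place=rng.choice(GRAMMAR["place"]),
    )
```

Passing a seeded `random.Random` keeps the output reproducible, which matters for testing quest generators; the human-authored fragments are also where the "human oversight" from the conclusion lives.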
Last edited by sdack; 27 Jun 2024 @ 3:59
Why do projects fail?
Budget, workers, timescales,
but most often a lack of project-management experience.
I would use AI firmly under human control.
It may automate some boring tasks, but it can't be trusted to make good decisions or to obey reality.
There is a reason some game developers are well known, and can produce very good games year after year.
Sid Meier (Civ)
Gary Grigsby (wargames)

They will probably have far less success if they move out of their genre.
There are others (Chris Roberts?) who have good judgement on game design but can't project manage a release.

What makes a good game is the judgement as to what makes it good.
I would use AI for suggestions and ideas, but I can't think of any situation where I would give it free rein or any power, or not have its output checked by a human.

To produce a game, you have to have a good concept, ideally a good team with experience of that type of game, and then write and iterate.

If you want someone who doesn't understand the problem making life-and-death decisions, I give you Boeing management.
At the current level of technological resources, no. As someone else pointed out, these AIs aren't really creating things, more so taking others' work and then spitting out their version of it. It's a copycat.

It can be really useful for doing small tasks, like filling out some simple declarations in code, or helping texture something. But it's not going to create a full-fledged game anytime soon.

Unless fully functioning quantum computers come out in the next decade. That will change the existence of humanity forever, no doubt. We will likely see AI that, for all intents and purposes, seems alive.

Right now, all you can really do is train these nets on super simple tasks, and it takes a supercomputer to do it. What they do is take the generated matrix, with the corresponding weights and connections, and copy that.
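The "copy the generated matrix of weights" point can be illustrated with a toy single-neuron network: once training has produced the numbers, deploying the model really is just copying them. This is a hand-rolled sketch with made-up weights, not any specific framework:

```python
import math

# Toy 2-input, 1-output "network": a single neuron with fixed weights.
# Deploying a trained net amounts to copying numbers like these.

def forward(weights, bias, inputs):
    """Weighted sum of the inputs followed by a sigmoid activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

trained_weights = [0.8, -0.4]   # pretend these came out of training
trained_bias = 0.1

# "Copying the matrix": the duplicate behaves identically to the original.
copied_weights = list(trained_weights)
copied_bias = trained_bias
```

Because all the learned behaviour lives in those numbers, the expensive part is producing them on big hardware; reusing them elsewhere is cheap, which is the asymmetry the post describes.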

However, we may see something like an NPC who seems "alive" or something of that nature; some people have already experimented with hooking one up to ChatGPT, and the results are kinda neat.

I do see a time when most games have a coordinator that makes things up on the fly for replayability and difficulty. We've seen some examples of this stuff already, but I am surprised no one's really done anything yet.

It seems the big companies never do anything new. Why would they, when they can still put out game xyz and still make millions of dollars?

The biggest thing they did with AI was using it to rig matchmaking to make them more money.
Last edited by peon; 27 Jun 2024 @ 6:10
Originally posted by peon:
At the current level of technological resources, no. As someone else pointed out, these AIs aren't really creating things, more so taking others' work and then spitting out their version of it. It's a copycat.
No, this is actually not true.

AIs do create new information. AIs are not some mixer of data, where data goes in, gets mixed around, and out comes a mix of the original data.

AIs in fact have a problem with copying data, and always put their own spin on it unless they are well trained not to do this. This is why it is so difficult to train AIs to actually give you what you want. It may look like they are copying, but this is because of what people want from them and have trained them to do.

Take a look at this video:
https://www.tiktok.com/@openai/video/7336976814272695598?lang=en
The AI was asked to create a video of a monkey playing chess. The chessboard, however, only has 6x6 tiles and not 8x8. If AIs just copied data from input to output, they would never be able to make such mistakes.

And take AI face generators. These produce faces that look very real. However, the faces you see do not exist and are new and artificial.

Adversarial AIs take this to another level. These not only get trained on what you expect as the outcome, which can contain mistakes, but are also trained on what you do NOT want, making the results only weirder. The results are completely artistic and artificial, and not a copy of anything.

It is the same with people. Just because you have learned the alphabet, learned words and sentences, and have read books does not mean that everything you write on the Internet belongs to somebody else. Even though we are all very much influenced by our environment, as are AIs by their input.

That AIs would just take their input, mix it up, and copy it, is a gross simplification and misconception of how neural networks function.
Last edited by sdack; 27 Jun 2024 @ 6:59
Originally posted by isomorphic_projection:

AI is already failing if you ask it for a 3D model of a human being. It looks perfect, except for the eyes' retinas, the fingers, and the toes. Why? It thinks it needs to include the golden ratio, Fibonacci. That's why the ends of toes and fingers look disturbed, as if there's a fractal involved. Same with the eyes: it tries to mirror everything.

Right, but that's something that's very likely going to be fixed in the future.
Right now, if you take Adobe Firefly or Photoshop for example, it's indeed not quite faultless.

And I see other issues for AI developing games.
Legal issues in particular.

Like who owns the game that was made by AI?

The publisher, or the company that develops the AI tool?
AI that's able to generate an entire game would have to rely on server farms; running it locally isn't quite possible yet.
Even a single chatbot would need terabytes of databases, and a lot of CPU power.

Where does AI get its data from, and who guarantees that this data is free to use?
Who does the code belong to?

And how do you make sure that AI-generated NPCs don't show inappropriate behaviour?

So many questions that haven't been answered yet.
Originally posted by sdack:
Take a look at this video:
https://www.tiktok.com/@openai/video/7336976814272695598?lang=en
The AI was asked to create a video of a monkey playing chess. The chessboard, however, only has 6x6 tiles and not 8x8. If AIs just copied data from input to output, they would never be able to make such mistakes.
That's because the AI is learning to copy the pattern of information as it sees it, but doesn't understand what it's seeing as abstract objects, and isn't copying the underlying idea that a certain number of them is important.

It's why they have similar problems with the number of fingers or teeth, and it's hard to train them out of it because human-made images often vary in the number that are visible. The AI doesn't have the direct human perspective of dealing with concrete physical objects of a specific structure.

Originally posted by sdack:
That AIs would just take their input, mix it up, and copy it, is a gross simplification and misconception of how neural networks function.
It's not fundamentally incorrect as a description and is also how humans generate "new" things by digesting what they experience and recombining elements of it in a variety of ways.

But since it isn't an exact copy of a human mind, it doesn't learn things in a human way. The copying it performs is like that of a bodiless alien and lacks a deeper understanding of the material it's reusing.
Originally posted by DouglasGrave:
Originally posted by sdack:
That AIs would just take their input, mix it up, and copy it, is a gross simplification and misconception of how neural networks function.
It's not fundamentally incorrect as a description and is also how humans generate "new" things by digesting what they experience and recombining elements of it in a variety of ways.
No, it is incorrect and even ignorant. It is either a copy, or it is new. One cannot simply claim it is all copies of something just to deny that anything about it was new. Such "logic" simply declares all new data to be illusions and that nothing was ever new. It serves ignorant people to remain ignorant but is otherwise unhelpful.

AIs can be very simple in design, but we are talking about very complex AIs. One might think an AI is just like a pipe where one sticks something into one end just to watch it come out of the other. The truth is much closer to a pond in which one throws a stone and watches how it forms ripples and how these reflect back from the edges of the pond. A wave "copies" the energy of the stone that was thrown into the water, but it becomes far more than just a copy of the original event. And this is how AIs should be viewed.

And like I said, take a look at the different designs of AIs, such as adversarial AIs, and you will no longer even remotely see any similarities to the original training data.

AIs do not copy. They are trained on whether the input is true or false. This also means that when you train an AI to say "A is true" and later "A is false", it will not simply go back to an untrained state. That would resemble copying. In fact, it has learned two things, not just one or nothing, and this has implications for the rest of what it has already learned. Just like the stone thrown into the pond: taking it out again does not undo all the waves it has created.

You also do not say, when you sculpt a rock into a face with a chisel, that the rock has copied the face. It is the person doing the copying by removing stone.

Just as the rock does not copy, AIs do not copy; it is the training data which shapes an AI. If the training data is intended to let an AI produce copies, then the AI will produce look-alikes. If it was trained to do something else, then it will do something else. If it was trained NOT to do something, then it will also not do it. All because of the true/false logic of the training data, and not because AIs would somehow copy their input.
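The "A is true, then A is false" point can be illustrated with a toy one-weight learner: after the two opposing updates, the weight does not return to its untrained starting value. The learning rate and update rule here are arbitrary choices made purely for illustration:

```python
# Toy illustration: train "A is true" (target 1), then "A is false" (target 0),
# and check that the weight does NOT simply revert to its untrained state.

LEARNING_RATE = 0.5

def update(weight, x, target):
    """One gradient-style step for the linear prediction y = weight * x."""
    prediction = weight * x
    error = target - prediction
    return weight + LEARNING_RATE * error * x

initial_weight = 0.0
w = update(initial_weight, x=1.0, target=1.0)   # learn "A is true"  -> w = 0.5
w = update(w, x=1.0, target=0.0)                # learn "A is false" -> w = 0.25
```

Because each step only moves the weight partway toward the current target, the second, contradictory lesson leaves a residue of the first rather than erasing it, which is the pond-ripple behaviour described above.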
Last edited by sdack; 27 Jun 2024 @ 8:53

Posted on: 5 Feb 2024 @ 15:27
Posts: 134