How do you think AI will influence gaming?

Maybe we'll start seeing some real consequences to decisions you make in games.

For example, in Mass Effect, you would just follow branching paths and arrive at a predetermined ending.

Would be cool if the AI could generate thousands of responses to your actions in real time (not just at set storyline points) and thus mold the story around the choices you make.

I'm sure that's a long way off, but this type of dynamic world building would be incredible.

If you look at the Skyrim video posted a few posts above, it's already well underway in the modding community, at least exposition-wise. Wait until animations can also be integrated with their replies.

I can't wait until that's properly utilized in full games, with voice recognition. It's one thing to type back and forth with ChatGPT; just imagine conversing with a companion using your own voice. I may hasten our extinction from AI alone.

They're not even going to need voice actors soon, which will really help indie titles. I'd prefer a real human doing the voice acting, so long as they allow AI to generate more lines; otherwise I'll take the bot. Imagine all the branching pathways that could result.

I suspect this use case for AI (game concept art, cover art, marketing collateral et cetera) will be rampant once AI remedies its current-day tells. What would take weeks now takes minutes.

On top of concept art, I imagine AI-generated levels/worlds/characters/NPCs will be ubiquitous in the near term, and not just in games. It will just require the guiding hand of a creator/director speaking/typing commands with near-instant results. Changing the layout of a level will be as easy and instantaneous as changing the colour of a logo in Photoshop/Illustrator.
 
On top of concept art, I imagine AI-generated levels/worlds/characters/NPCs will be ubiquitous in the near term, and not just in games. It will just require the guiding hand of a creator/director speaking/typing commands with near-instant results. Changing the layout of a level will be as easy and instantaneous as changing the colour of a logo in Photoshop/Illustrator.
I've long thought of the way games like Warframe use tilesets to procedurally generate levels as a production shortcut. All this AI biz is a whole new level of exactly that. Good, bad… I don't know, but I do fear a lesser authenticity in gaming the more hands-off game production becomes – less of a human touch, so to speak, even with human AI directors or creative managers.
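The tileset approach described above can be sketched roughly like this: a generator stitches pre-authored room tiles into a chain. The tile names and connection rules here are invented for illustration, not Warframe's actual system.

```python
import random

# Illustrative sketch of tileset-style procedural level assembly.
# Each tile declares how many exits it has; tiles with 2+ exits can
# sit mid-level, while a 1-exit tile caps the run.
TILES = {
    "corridor": {"exits": 2},
    "junction": {"exits": 3},
    "chamber":  {"exits": 1},
}

def generate_level(length, seed=None):
    """Chain random connectable tiles, ending on a dead-end tile."""
    rng = random.Random(seed)  # seeding makes layouts reproducible
    level = []
    for _ in range(length - 1):
        # interior tiles need at least 2 exits to continue the chain
        choices = [t for t, spec in TILES.items() if spec["exits"] >= 2]
        level.append(rng.choice(choices))
    level.append("chamber")  # cap the run with a dead end
    return level

print(generate_level(5, seed=42))
```

Real tileset systems add orientation, doorway matching, and theme constraints on top of this, but the core idea is the same: authored pieces, stochastic arrangement.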
 
I suspect this use case for AI (game concept art, cover art, marketing collateral et cetera) will be rampant once AI remedies its current-day tells. What would take weeks now takes minutes.
Sort of. But it's not like art deadlines are usually last-minute. You know you'll need a cover, promo shots, etc.

If it wasn't clear, the company here hired an artist who uses AI. That's why it's so bizarre, since they were hiring someone anyway.

I'm also not sure how much better the product gets since at the moment it's being heavily subsidized for end users.
 
Duke Nukem 1 and 2 Remaster cover art/lead image criticised and recommissioned after being outed by viewers as AI-generated.

Niche retro gaming brand Blaze Entertainment, creator of the handheld Evercade, today revealed a remaster of Duke Nukem 1 and 2 for its platform, but some fans were more concerned with the quality of the remaster's cover art. As Twitter users quickly pointed out, the Duke Nukem 1+2 Remastered art posted by the Evercade account was obviously AI-generated.

Blaze has since deleted the tweets, but other Twitter users have re-shared the image in question. It displays all the hallmarks of AI-generated art, from weird-looking hands to a heaped pile of stuff that doesn't quite resemble anything.

Image in question:

Full read here. (External)

The thing is, this could look like anything done in Photoshop over the last few decades. People are just getting more suspicious.
 
I've long thought of the way games like Warframe use tilesets to procedurally generate levels as a production shortcut. All this AI biz is a whole new level of exactly that. Good, bad… I don't know, but I do fear a lesser authenticity in gaming the more hands-off game production becomes – less of a human touch, so to speak, even with human AI directors or creative managers.

This is a good point. In the future, you could make an argument that games will be marketed as those with little to no AI vs. full AI (within certain parts, of course). Personally, I think AI will help developers with many tasks, freeing up their time to work on the most important aspects.

What excites me more than anything, and I hope it becomes much more advanced, is enemy AI. Nothing can take you out of immersion like badly tuned enemies. I hope one day we'll have AI strong enough to counter your decisions as you hunt each other on the battlefield.
 
Innovation? Over saturation?

We've had AI-generated levels/random dungeons for eons and I always found them dull. Will we soon have fully AI-generated games? I'm browsing through fan-made Mario games and they look great, but there are so fucking many of them it leaves me wondering how anyone will compete in the future. If fans are capable of making AAA games, why would the market rely on established brands? Now the market is going to blow wide open.

Is it now the era of the lone innovator who never fit in on the team?
The mind-blowing potential here, IMO, if well executed, will be whoever figures out the most sophisticated prompt tree that instructs an A.I. to write choose-your-own-adventure narratives. Imagine that. A.I. is already smart enough to write college-level essays that get A's, and it's impossible for professors to identify plagiarism from the text alone, even if they demand it be submitted digitally (where in the past they could sweep the internet for passages suspected of being plagiarized). Now it could create entire new storylines that make a game's potential narrative branches infinite.

Imagine marrying this to A.I. trained to create completely unique NPCs, monsters, textures, maps, and so on using toolboxes like Unreal Engine the same way developers use those tools now. A lot of the work has already been done. It's LEGOs. So... teach the A.I. how to build with LEGOs. The challenge will be teaching the A.I. to create new stuff that doesn't cause clipping issues, for example, and meshes well with a theme: helping it understand what flora or fauna suit the environment of a new map it created, how to generate brand-new textures or landscape ideas that conform to that theme, what kind of clothes a villain would wear, what type of spells he would cast and what they would look like. What the villain himself would look like. How he would talk. What sort of vernacular he would use. What sort of inflection, tone, sneers, and so on.

We've always built machines to do jobs we already did. The challenge became building better, more skilled, more capable machines. We're doing the same thing, but the machine has just become vastly more sophisticated because, well, now the machine is a mind.
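The prompt-tree idea above could be sketched as a story tree where each player choice either follows an authored branch or asks a model to write a new one on demand. The LLM call is stubbed out here, and all function names are invented for the example.

```python
from dataclasses import dataclass, field

# Minimal sketch of a choose-your-own-adventure tree with on-demand
# branch generation. The "A.I." is a stub; a real system would prompt
# a language model with the story so far plus the player's choice.
@dataclass
class StoryNode:
    text: str
    choices: dict[str, "StoryNode"] = field(default_factory=dict)

def generate_passage(context, choice):
    # Stand-in for an LLM call (hypothetical).
    return f"(generated) After you {choice}, the story continues..."

def advance(node, choice, context=""):
    """Follow an authored branch if one exists, else generate and cache
    a new node so the branch stays consistent on revisits."""
    if choice not in node.choices:
        node.choices[choice] = StoryNode(generate_passage(context, choice))
    return node.choices[choice]

root = StoryNode("You stand at a fork in the dungeon.")
left = advance(root, "take the left passage")
print(left.text)
```

Caching generated nodes is the key design choice: without it, replaying the same choice would rewrite the branch and the narrative would lose continuity.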
 
In most cases, people’s attention spans and inability to home in on substance past skin-deep clickbait will pave the way for AI to be “good enough.”

Considering video games are one of the few “cures” for ADD where the user does become immersed, I would imagine AI standards will be substandard.

At least to start — and where it counts most.
 
The mind-blowing potential here, IMO, if well executed, will be whoever figures out the most sophisticated prompt tree that instructs an A.I. to write choose-your-own-adventure narratives. Imagine that. A.I. is already smart enough to write college-level essays that get A's, and it's impossible for professors to identify plagiarism from the text alone, even if they demand it be submitted digitally (where in the past they could sweep the internet for passages suspected of being plagiarized). Now it could create entire new storylines that make a game's potential narrative branches infinite.

Imagine marrying this to A.I. trained to create completely unique NPCs, monsters, textures, maps, and so on using toolboxes like Unreal Engine the same way developers use those tools now. A lot of the work has already been done. It's LEGOs. So... teach the A.I. how to build with LEGOs. The challenge will be teaching the A.I. to create new stuff that doesn't cause clipping issues, for example, and meshes well with a theme: helping it understand what flora or fauna suit the environment of a new map it created, how to generate brand-new textures or landscape ideas that conform to that theme, what kind of clothes a villain would wear, what type of spells he would cast and what they would look like. What the villain himself would look like. How he would talk. What sort of vernacular he would use. What sort of inflection, tone, sneers, and so on.

We've always built machines to do jobs we already did. The challenge became building better, more skilled, more capable machines. We're doing the same thing, but the machine has just become vastly more sophisticated because, well, now the machine is a mind.
I love that. Do you ever play any of those gamebooks? They're basically choose-your-own-adventure games with some battle mechanics. When I was on one of my Dubai trips, we were stuck indoors because of a sandstorm. The internet was out and the only things I had were a couple of gamebooks on my phone, namely Judge Dredd. I think it was made by Tin Man Games. I spent every morning playing it before work, as I was up at 5am because of the time difference. I don't even see them on the Android store any longer.

Another thing I was thinking about is D&D: it wouldn't be too difficult to make a good DM, fully voiced, with an elaborate campaign, sound effects, music, etc.
 
I think in the distant future you'll probably be able to create your own games with it, with nothing more than suggestions. Likely within a VR space.
 
The mind-blowing potential here, IMO, if well executed, will be whoever figures out the most sophisticated prompt tree that instructs an A.I. to write choose-your-own-adventure narratives. Imagine that. A.I. is already smart enough to write college-level essays that get A's, and it's impossible for professors to identify plagiarism from the text alone, even if they demand it be submitted digitally (where in the past they could sweep the internet for passages suspected of being plagiarized). Now it could create entire new storylines that make a game's potential narrative branches infinite.

Imagine marrying this to A.I. trained to create completely unique NPCs, monsters, textures, maps, and so on using toolboxes like Unreal Engine the same way developers use those tools now. A lot of the work has already been done. It's LEGOs. So... teach the A.I. how to build with LEGOs. The challenge will be teaching the A.I. to create new stuff that doesn't cause clipping issues, for example, and meshes well with a theme: helping it understand what flora or fauna suit the environment of a new map it created, how to generate brand-new textures or landscape ideas that conform to that theme, what kind of clothes a villain would wear, what type of spells he would cast and what they would look like. What the villain himself would look like. How he would talk. What sort of vernacular he would use. What sort of inflection, tone, sneers, and so on.

We've always built machines to do jobs we already did. The challenge became building better, more skilled, more capable machines. We're doing the same thing, but the machine has just become vastly more sophisticated because, well, now the machine is a mind.

Assignments are already uploaded into a proctoring system and checked against one another. If you write the same question into ChatGPT, you will get the same or a similar answer most of the time. It's a great research tool, but it needs to be used in conjunction with your own knowledge; otherwise, simply using what the system says will get flagged as plagiarism. As it is right now, AI responses are easy to spot.

TBH, I view it as a valuable assistant for people who are trapped in a narcissistic control cycle.
 
If you write the same question into ChatGPT, you will get the same or a similar answer most of the time.
Huh? It's easy to direct ChatGPT to write an essay of a certain length using a unique prompt, only several paragraphs long, about a given subject. The odds of that output differing significantly from anyone else's are overwhelming.
 
Huh? It's easy to direct ChatGPT to write an essay of a certain length using a unique prompt, only several paragraphs long, about a given subject. The odds of that output differing significantly from anyone else's are overwhelming.

But I mean directly... say an assignment has 3 required criteria for answering. If a student just copies and pastes the questions and then uses the answers, all the tutor has to do is recognize that AI tone, then copy and paste his own questions and see if the tone and response match what ChatGPT says. That's assuming the student doesn't have the wit to change the wording around, which I know sounds odd, but I've been studying law for a while now, and students simply writing the formal legal text into the assignment rather than paraphrasing and applying it in their own words is a pretty common occurrence. It wouldn't surprise me to see these same people use ChatGPT in the same way.

e.g. the criteria for my made-up example assignment are:

1. Explain the drawbacks of boxing against a wrestling TDD
2. What advantages does a boxer have when grappled to the ground?
3. How can a boxer use these advantages to recover himself?

You can copy and paste those questions into ChatGPT and we will probably get the same results on both ends, but our views and self-evaluation on the matter will end up changing the argument. You can prompt it to change things around as you like, but like the law students writing legal clauses straight from the text into the assignment, many, many students won't do that. Likewise, simply pasting the assignment criteria into ChatGPT will result in a plethora of students writing the same answers in the same context. These will be easily spotted. What I'm saying is there's essentially a wheat-from-the-chaff scenario playing out here... to put it politely.
 
But I mean directly... say an assignment has 3 required criteria for answering. If a student just copies and pastes the questions and then uses the answers, all the tutor has to do is recognize that AI tone, then copy and paste his own questions and see if the tone and response match what ChatGPT says.
You might catch the dumbest students imaginable with this (and admittedly, plagiarists are more likely to be). But it's unbelievably easy to circumvent this pitfall by paraphrasing the questions and burying them in unique prompts directing the AI to compose an essay that won't read like anyone else's.
 