As a crusty old veteran at the dignified age of 26, I’ve been around for a lot of advancements in gaming technology. It’s easy to forget in hindsight where the biggest innovations actually come from. The games that create a concept are often overshadowed almost immediately by a bigger game that makes it popular, and many gamers only encounter these concepts when they appear in a title belonging to a major franchise. Credit, however, should be given where due, so here’s some credit for the recent innovators in gaming technology.
The QTE (Quick Time Event)
You should not have dissed Interview with the Vampire, bro.
The games most lauded for popularizing this one are God of War and Resident Evil 4, but the idea actually predates them by almost a decade. It first appeared in Japanese arcade fighting games like Fist of the North Star and Dragon Ball Z in the mid-’90s, whose cabinets featured large pads the player would punch in reaction to onscreen prompts. The first console game to adapt this was Sega’s Shenmue in 2000. Still keeping very close to the arcade design, Shenmue used button prompts in its fighting sequences to simulate the character’s reactions. Resident Evil 4 used QTEs mostly in its knife-fighting sequence, and finally God of War introduced the current iteration, where the prompts are exaggerated versions of each button’s normal function in a choreographed sequence. Most gamers are pretty over the QTE at this point; it’s generally only tolerated in small, tasteful bits, like the box puzzle. QTEs and the similar music/dance games have worn themselves out over the last generation with overuse. Fortunately the concept has evolved into systems like Arkham City’s combat, where the prompts are integrated into the HUD and the controls are intuitive.
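Mechanically, a QTE is tiny: flash a prompt, then succeed only if the matching button arrives before a short window closes. Here’s a minimal sketch of that loop in Python (all the names are mine, not from any of the games above):

```python
import time

def run_qte(prompt_button, window_s, get_input):
    """Succeed only if the prompted button is pressed before the
    timing window closes (hypothetical sketch, not any real engine)."""
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        pressed = get_input()            # polled once per "frame"
        if pressed == prompt_button:     # right button in time: success
            return True
        if pressed is not None:          # wrong button: instant fail
            return False
    return False                         # window expired: Kratos eats it
```

Everything from the punch-pad cabinets to God of War’s choreographed finishers is some dressing-up of this one check; what changed over the years was the presentation, not the logic.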
The Conversation Wheel
This face was not captured from a real person.
For a lot of people, Knights of the Old Republic was their first introduction to a ‘wheel’ conversation system back in 2003, which makes Mass Effect 2’s moral rating of those conversation choices look like a huge achievement in game design. News flash for you, kiddo: for some of us, that’s iteration #347 of this concept, and Zork used the same simple math to evaluate your responses waaaaay back in them ’80s. Hell, Zork was so smart it knew to respond to “Kick dragon in balls” with “The Dragon is not harmed or amused, he incinerates you.” The big advancement made by BioWare was animating the NPC faces in reaction to your choices. In Zork, you have to imagine the smirk on the wizard’s face. In KOTOR, you could empathize with a human face, and that is when the conversation wheel caught on. These days, games like L.A. Noire and Uncharted capture the voice actors’ faces as they record the dialogue to reach the next level of empathy. A similar but different technology called Mova (designed by OnLive creator Steve Perlman) seeks to emulate human reactions with an AI on a powerful server cluster. Assuming they put it in a terminatrix that looks like Summer Glau, I’m buying one.
“Thing, eh? Walking carpet eh? Who’s a dumb brute now you fascists!?”
To be clear, bump mapping is the shader trick that makes flat textures look bumpy. While many believe it to be a driver-based technology originally created by id Software for Doom 3, this is not the case. Significantly more impressive was the version created by LucasArts for the ForceWare middleware package used in Star Wars Galaxies and Rebel Strike over a year prior to Doom 3. LucasArts’ tech is entirely software based and could potentially run on any hardware that can do the math. In Star Wars Galaxies, this made the high-end shaders very CPU-taxing due to the MMO nature of the game. For Rebel Strike, it meant a game that looked just fine next to Chronicles of Riddick. Like many things in the house that George built, proprietary instincts got in the way of success. LucasArts did not freely share the tech, instead hoping to hoard it for future Star Wars games that never came, as both Factor 5 and Sony Online went their separate ways from the publisher. Instead, the OpenGL- or DirectX-based driver-side version of the effect (which requires hardware running then-current versions of those APIs) became standard. Meanwhile, a lot of PS2 and GameCube games could have looked a lot more comparable to Xbox games if LucasArts had just shared that software.
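For anyone who has never seen the math: bump mapping just perturbs a surface’s normal per texel before lighting, so a flat polygon shades as if it had relief. A toy software version under simple Lambertian lighting (this is my illustration of the general technique, not LucasArts’ actual code):

```python
import math

def shade(base_normal, bump_delta, light_dir):
    """Perturb the flat surface normal by the bump map's per-texel
    offset, renormalize, then take the Lambertian dot product
    with the light direction."""
    n = [a + b for a, b in zip(base_normal, bump_delta)]
    length = math.sqrt(sum(c * c for c in n))
    n = [c / length for c in n]
    # Lambert: brightness = max(0, N . L)
    return max(0.0, sum(a * b for a, b in zip(n, light_dir)))
```

Run that for every texel, every frame, and you can see why Galaxies’ high-end shaders hammered the CPU: it’s the same dot-product math a GPU shader does, just done in software.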
Facial Capture technology
It’s not Botox, Tony, we’re on the PS2.
Speaking of the facecap I mentioned above, that technology also started a lot earlier than it became popular. The first game to capture the actors’ likenesses while they recorded their lines was 24: The Game for the PlayStation 2. It didn’t look nearly as hot as L.A. Noire, in case you’re wondering, but the same idea was in play. The first games that made this look even remotely similar to live performances were Naughty Dog’s Uncharted series, three to four years later. Now some games like Enslaved have gone one step further and dressed Andy Serkis up in mocap gear (probably what he wears every day) and basically recorded him living out the entire game.
That’s all for now. I may add more as I think of them.