Why does Assassin’s Creed III run like ass on PCs?

Posted: April 3, 2013 by ryanlecocq in Off-topic, Technology

I could have named this article a dozen things.  I could have substituted L.A. Noire or any number of other bad console-to-PC ports.  I could have called it “Why multiplatform development is essential from the start.”  It’s the same explanation in all cases.  This article will be a tiny bit technical, but as always I’ll try to put it in the clearest layman’s terms possible.

What it basically comes down to is that every program has to be specifically written to use the instruction sets (the commands that tell the CPU to do stuff) of every type of CPU it runs on.  In normal PC development, Intel sets the standard for CPU instruction sets (though it was the IBM PC that made their design the standard, but that’s another long story).  You’ve probably seen “SSE 1, 2, 3, etc.” on descriptions of CPUs before.  All consumer PC CPUs use these instruction sets.  So although an AMD CPU may be forced to run in less optimized code paths, since Intel obviously favors its own parts, AMD saves a boatload of money and research by using them.  IBM’s CPUs, however, like the PowerPC chips in all three current-generation consoles, went down a completely different path long ago.  So basically, nowhere in a program built for normal PC instruction sets are there instructions an IBM chip understands, and vice versa.  If you, say, drop a program designed for an IBM Power CPU into an Intel or AMD system without having designed it to use their instruction sets, it’s not happy.
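To make that concrete, here’s a minimal sketch of how a PC program picks an instruction-set path at runtime.  This is my own illustration, not anything out of an actual game: it assumes GCC or Clang on an x86-64 CPU, and the function names are made up.  The point is that the SSE path simply doesn’t exist in a binary built only for IBM’s PowerPC chips.

// Minimal sketch (assumes GCC or Clang on an x86-64 CPU): choose a code path
// at runtime based on which SIMD instruction sets the CPU reports. The
// function names are illustrative, not from any real game.
#include <cstdio>
#include <xmmintrin.h>  // SSE intrinsics; part of the x86-64 baseline

// Plain fallback: runs on anything, one float at a time.
void add_floats_scalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i) out[i] = a[i] + b[i];
}

// SSE path: adds 4 floats per instruction.
void add_floats_sse(const float* a, const float* b, float* out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4)
        _mm_storeu_ps(out + i, _mm_add_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    for (; i < n; ++i) out[i] = a[i] + b[i];  // leftover elements
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];

    // Ask the CPU what it actually supports before choosing a path.
    if (__builtin_cpu_supports("sse")) {
        add_floats_sse(a, b, out, 8);
        std::puts("used the SSE path");
    } else {
        add_floats_scalar(a, b, out, 8);
        std::puts("used the scalar fallback");
    }
    return 0;
}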

The long technical version of “not happy” is that the program defaults to something the chips have in common, which is going to be something ancient and 32-bit.  So it will only run on one of your cores unless something tells it to do otherwise, which is a huge bummer since most CPUs these days are not meant to stand on one core alone for gaming.  If the developers at least take the step of adding DirectX 10 support, as in the case of L.A. Noire, you can force the CPU to spread the load evenly across all the cores.  That’s still not true multi-threading, but it’s what any DirectX 10 game, such as Skyrim, does anyway.  In the case of Assassin’s Creed III, the game is DirectX 9 native, like many raw ports.  So it uses one CPU core to 100% and occasionally touches another for PhysX if you have an AMD GPU, or if you have a weak nVidia card.  Even that is a huge drag, because PhysX on your CPU sucks, and even more so when that core has just been sitting there doing nothing between intermittent bursts of PhysX.  What this results in is the giant clusterfuck that is any crowded area in AC3, even on top-of-the-line hardware much more powerful than a console.  Normally for gaming, an i7 or a high-end quad-core i5 is a fantastic performer.  When it’s only running one core, even in turbo, it’s really not much more than a Pentium 4 Extreme Edition.
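Here’s a similarly hand-wavy sketch of the single-core problem itself: the same fake “crowd update” run once on the main thread, then split across however many hardware threads the machine reports.  Again, this is my own illustration under those assumptions, not anything out of Ubisoft’s code.

// Minimal sketch: one core pegged at 100% versus work split across all cores.
// The "crowd" math is a stand-in for per-NPC work in a busy city scene;
// none of these names come from any real engine.
#include <cstdio>
#include <functional>   // std::ref
#include <thread>
#include <vector>

static void simulate_crowd(std::size_t first, std::size_t last,
                           std::vector<double>& state) {
    for (std::size_t i = first; i < last; ++i)
        state[i] = state[i] * 1.0001 + 0.5;   // dummy math, just to burn CPU
}

int main() {
    const std::size_t npcs = 4000000;
    std::vector<double> state(npcs, 1.0);

    // "Raw port" style: everything on the main thread, one core maxed out.
    simulate_crowd(0, npcs, state);

    // Threaded style: carve the crowd into one chunk per hardware thread.
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 1;            // the call may report 0 if unknown
    std::vector<std::thread> pool;
    const std::size_t chunk = npcs / workers;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t first = w * chunk;
        std::size_t last = (w + 1 == workers) ? npcs : first + chunk;
        pool.emplace_back(simulate_crowd, first, last, std::ref(state));
    }
    for (auto& t : pool) t.join();            // wait for every worker to finish

    std::printf("crowd update split across %u hardware threads\n", workers);
    return 0;
}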

So before you nerd rage your monitor in half, hear this.  The developers are not so inept and backward that they don’t realize this.  They probably tried to explain it to the publishers at the start.  I’m sure it went something like this:

Devs:  If we don’t do this from the beginning, we will catch mad hell on the PC version.  Think Dark Souls.

Pubs:  So what you’re saying is that you want to do something that will take extra time and money, even though the game will actually run anyway.

Devs:  Yeah but it will run terrible!  The GPU manufacturers will be pissed too!  Also we can’t easily fix this later, if at all.

Pubs:  We don’t really care.  Most of our revenue is in boxed copies on console anyway.  Steam copies don’t cost us shit, so we don’t really care how many we sell.  Most people will just pirate it anyway.

The End.

You may say to yourself, “Where’s the craftsmanship and pride?”  Well, I hate to say it, but everything the publishers said is true.  If you want a lesson on the realities of life, go on YouTube and look up Alec Baldwin’s scene from Glengarry Glen Ross.  They play it at West Point and they play it at Harvard.  That’s usually a pretty good indicator that it’s good advice if you want to conquer your enemies, physical or financial, which is of course the drive of capitalism and the military.  Ubisoft is a competitive company holding a very tenuous #3 spot as an independent publisher.  In a market where numbers one and two are EA and Activision, there’s not much margin for error.

They’re right.  The large majority of their sales will be on consoles, especially with a game like Assassin’s Creed that is very console-friendly and heavily leans toward being played on a controller.  We’re not talking about an FPS or a BioWare game, something people demand to be cross-platform.  We’re talking about a game that has had far worse problems, yet even less uproar, than Dark Souls.  Considering most PC players laugh at Dark Souls and few people play it on PC, that doesn’t really make for a PR nightmare by comparison.  The fact is most people have already just headbutted their way through the game, suffering the bad framerates and jumps in the crowded cities.  Only a few people who are crusading on principle are even bugging Ubi for a patch anymore.  All in all, a pretty justifiable strategic loss.

Now get this: this is the part that hurts and is totally the fault of the developers.  The solution already exists.  All over the place there are game engines and middleware that make cross-platform development almost effortless.  Fortunately the tools used for Dark Souls were flexible enough that, despite the developers, a community user had already created a simple fix before the game’s launch and just waited to confirm final code before releasing it… 12 minutes after the game launched.  I don’t mean that as a jab at From Software.  Although their lack of knowledge of their own tools is almost comical, they at least had the foresight to pick an engine that will run on damn near anything.  Other developers, however, like to stick with what they are comfortable with.  They cling either to popular tools of the recent past or to their own proprietary engines which they have invested time and money in.  Folly either way.  I don’t care how much you’ve spent on something; if it is inferior to available products, dump it.

The studios that develop on “friendly” tools reap the benefits for years to come.  Not only is it easy to port the game to other systems in the future, it’s far more likely that the software can be run or emulated on future hardware.  You can easily see this with older PC games being playable on new hardware.  I don’t know if you’ve ever tried running an old Bethesda game on a high-end modern PC, but it’s a pain in the ass.  It’s honestly easier to run Outcast (look it up if you love game design: 1999, Infogrames) than many popular older games built on unique engines.

I hate to be like some gaming Karl Marx or Ayn Rand, saying, “Why aren’t you better than you are!?”  That is sadly the case, though.  It’s just plain bad business to do things for the short term.  Unfortunately, international business runs on the fiscal quarter, which, let me just say, is not a lot of time when developing a game.  You’re lucky to get, like, a level finished in that time.  Games just plain can’t be developed like movies or albums.  You can’t bank everything on shipping in the shortest time possible.  It’s not just some short-term fan backlash you need to worry about; it’s the long-term value of what has been programmed.  If the developers basically need to throw out everything they’ve done, as they surely will for AC4 if it’s going to run on the new AMD-based consoles, a lot of time and money is wasted.  First-party titles are so successful because Microsoft or Sony can have their devs make one engine, keep improving it on the same hardware and keep cranking out games.  The result is more games of higher quality at a fraction of the cost.  That’s why Naughty Dog can crank out three great Uncharted games in a row while Tomb Raider needs to take mulligans.

To start this paragraph off on as fun a note as the last one, I don’t want to turn this into an “I have a dream…” speech.  I’ve just seen message boards filled with people raging at various games over the years.  In the end they just want to understand why nobody can help them, and if they took the time to read the bazillion words I just typed in the last few minutes, now they know.  And knowing is half the battle.  The other half is fighting it, I presume, so feel free to go out armed with this knowledge and give those developers some constructive criticism.

Comments
  1. Cryio's Qax says:

    Reading this is interesting. In some places you know your trade; in other places you have zero clue what you are talking about. Reading about the history of IBM CPUs and how programming for x86 works is cool. Didn’t know it maxes out only one thread. HOWEVER:

    1) Assassin’s Creed 3 is a DirectX 10 game. There is no DX9 render path.
    2) L.A. Noire is a DX9 game, and it got a DX11 path patched in for better CPU utilization.
    3) DX11 did the whole CPU optimization thingy, not DX10.
    4) Skyrim is not a true multicore game. It uses only 2 cores and it uses DX9 only.
    5) Assassin’s Creed 3 is not a raw port, just judging by the whole bump in visual quality compared to its console counterpart.
    6) Assassin’s Creed 3 doesn’t have PhysX. There is zero PhysX processing.
    7) This whole sentence: “So it uses one CPU core to 100% and occasionally touches another for PhysX if you have an AMD GPU, or if you have a weak nVidia card.” I think you were just naive back then.
    8) You just said an i7 is not much more than a Pentium 4 EE in single-core performance. Seriously? The average P4 was 2.8-3.3 GHz, with the fastest P4 going to 3.8 GHz. The latest i7 now goes to 4.4 GHz on a single core, not to mention that core-for-core performance has improved like 3 or 4 times since then. Also, since Assassin’s Creed 3 is aware of 6 cores (maybe more?), an i7 with 8 threads would be a massive improvement over a P4 EE with 2 threads.

    You have all the good intentions, but you should at least hit the nail on the head.

    Some time ago I had an Intel Core 2 Quad Q9550 at 2.8 GHz. With this and my 560 Ti, I got 25 fps in Boston and 35-40 everywhere else, at 720p with FXAA. Now I have a 6300 at 4.5 GHz and I get 36 in Boston at 1080p with max in-game AA. With the same GPU, that’s a massive improvement, and it’s only because the game is aware of more than 4 cores. It also needs all 6 cores of my CPU to max out the GPU. So there’s that.

  2. ancalimon says:

    AC3 drops down to 27 fps in certain locations, looking in certain directions, on an i7-3820 (4.3 GHz) with an overclocked GTX 1070.
