Since the release of the Xbox One and PS4, discussion of framerate has been rampant. It was expected that these machines would deliver a consistently high-level gaming experience, i.e. 1080p at 60fps. In reality, many AAA launches have failed to reach 60fps, and in some cases even 1080p. So when Bethesda announced that Fallout 4 would also fall short of the 60 frames per second mark, I decided to ask Reddit. As usual, my trusted community came up with some plausible answers to my questions, as I am a novice in these matters.
Taking some key pieces of wisdom from the “more informed”, I have attempted to break down the main reasons behind the underwhelming framerates of games on Xbox One and PS4.
The Trade Off
Both the Xbox One and PS4 can hit 1080p at 60fps, and have. Don’t believe me? Here is the IGN breakdown. The problem is that there is a trade-off. These consoles are more powerful than the last generation, but they are not powerful enough to render certain games at both 1080p AND 60fps. So developers face a choice between graphical fidelity and framerate – aka looking pretty or running smoothly.
A case in practice: Witcher 3
Initially, there were cheers from the PS4 camp as the developers announced the game would run at 1080p on PS4 and 900p on Xbox One. But in practice, the PS4 hasn’t [to date] provided a better experience overall. And when your game is on patch 1.07 – which made things worse – you have to wonder whether you bit off more than you could chew. This shows that when you push consoles to the limit, you risk uncovering new problems that will surely impact the overall gaming experience.
Then there is the marketing perspective – the market responds better to graphics than to framerate. As a developer, you want to provide an optimal experience but are undecided on where to make the sacrifice. Before you even reach that point, though, you have to consider how you are going to sell the game. PR would be great at promoting the capabilities of your 60fps game, but compare the number of consumers with the attention span to read articles about your awesome framerate against those who are inspired by screenshots and footage of your 1080p game. So when it comes to it, you do what sells more – business is business.
Wisdom of the Redditors
Here are some quotes – taken from my Reddit thread – that may explain the issues developers face in creating current-gen games:
“For 30 fps, you have to render your scenes in 33ms or less. For 60, you have to do it in 16ms.”
“Technically devs could get every game running at 60 fps if they really wanted to, but they would have to cut graphical detail (simple polygon count) and fancier effects. For example, high quality shadows are notorious for being fps drains.”
“Some people argue that the PS4 is underpowered because it doesn’t do 60fps more often. But I’d bet that even if they gave it more power, many devs would use that extra power for even greater visuals and still run at 30fps.”
“Given X number of resources, you must divide those resources into: Fidelity, Resolution, and Framerate. Given more resources, should we use it to up resolution to 4K? Up framerate to 60? what about 120? Improve Anti-aliasing? Increase draw distance?”
As you can see, it’s all about balance. If you increase one area, you impede another.
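The numbers in the first quote fall straight out of simple arithmetic: a frame budget is just 1000ms divided by the target framerate. A minimal sketch (the function name is my own, not from the quote):

```python
def frame_budget_ms(target_fps: int) -> float:
    """Milliseconds available to render a single frame at the target framerate."""
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    # 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

Halving the frame time from 33ms to 16ms is what forces the cuts to polygon counts, shadows and effects described above.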
Will games eventually reach the 1080p 60fps mark consistently?
On certain games, yes. Fighting games, sports titles and first-person shooters all require fluid motion, and if you look at the IGN breakdown – linked above – you’ll realise this is already the case. 60fps gameplay allows for more responsive controls, more fluid reactions and extra clarity for the details that matter. These benefits show, for example, when taking a sniper shot in an online session of Battlefield: every detail counts, from pinpointing the precise location of your enemy to timing your shot to a fraction of a second. Another case would be an intense game of FIFA, where every little movement could be the difference that wins the match.
Open-World Games Will Probably Never Run at 60fps
This is different for open-world games, however. Other factors matter more, such as the narrative and the graphical detail that provide a more immersive experience. And even if you tried, you would face new problems of scale. These two quotes from my fellow redditors explain the predicament:
In short, the Witcher’s world would be less detailed if it ran at 60fps:
“Take Witcher 3. To have it running at 60fps, you’d likely have to remove a lot of the foliage, which would reduce how alive the world feels. When you’re in the woods, you’re surrounded by trees. Halve the number of trees and the effect would diminish quite a bit.”
With The Last of Us Remastered, they were able to run at optimum levels because:
“TLOU was a game that had only a few characters on screen at a time, that were pretty much carbon copies of one another, and they occupied small areas and it was able to run 60 fps flawlessly.”
Large-scale games at 1080p will continue to run at 30fps – even Bethesda is running Fallout 4 at 30fps. It’s just not worth the sacrifices it would take to reach a higher framerate.
But really, does it matter?
Based on my personal experience, it does. Having played Wolfenstein: The New Order, I could instantly tell the difference in the gaming experience at 1080p 60fps. Everything felt polished and smooth. So when I read headlines claiming vast improvements – as with Uncharted – I see why.
In the end, we want the best gaming experience possible (without buying a PC). For my part, I am happy with 1080p 30fps games as long as the framerate is stable. Certain games can still look and feel just as amazing at 30fps, like Batman: Arkham Knight and (in my opinion) Shadow of Mordor. Witcher 3 is an amazing game, and its framerate issues didn’t ruin my PS4 experience, but I would have preferred a more stable one without the glitches, even if it meant running at 900p.