All the talk before release has been that, at least on paper, the Xbox Series X is more powerful than the PS5. The former boasts more raw processing power, while the latter has less but relies on custom design and optimisation to make what it has run far more efficiently. Essentially it's less hardware, but laid out and managed more carefully to get more bang for its buck.
The results of the first cross-gen comparison, of "Devil May Cry 5: Special Edition" last week, showed the two consoles to be broadly comparable: effectively the same frame rates, load times and visual appearance, with the advantage often switching between the two on a scene-by-scene basis, if only by a negligible amount.
But that was an older game, not one made with the new consoles in mind. Now two much more recent games have been compared: Ubisoft's just-released fall juggernaut "Assassin's Creed Valhalla" and Activision's "Call of Duty: Black Ops Cold War". Both are third-party AAA multi-platform titles, with 'Valhalla' coming from a franchise infamous for poor optimisation and therefore requiring much more raw power to run.
These are the kinds of titles where the Xbox Series X is supposed to shine and clearly show its advantage over the PS5. Tests, however, show that with both titles, though the results are close, the PS5 is proving the better performer.
'Valhalla' is a game that's proven a challenge even for high-end PCs. The most powerful graphics card on the market (the RTX 3090) can't hold 4K/60fps with the title, often sitting between 40-56fps, and the game doesn't even support native 4K resolution, according to DSO Gaming and The Verge.
Both consoles, meanwhile, use dynamic resolution ranging from 1440p to 4K while targeting a consistent 60fps. The folks at Digital Foundry have now analysed the two machines and found the Xbox Series X version has a substantial frame-tearing problem, one that's less of an issue if you have a VRR-capable display, which only a few TV models offer, namely LG OLEDs and Samsung QLEDs from the past 2-3 years. If your TV has HDMI 2.1, you'll likely have it.
Even then, both the Xbox Series X and S have been spotted with unusual stuttering, including during cutscenes. Several players have reported that as long as you have a VRR-capable display with VRR switched on, the problem essentially disappears and performance becomes much more akin to the PS5 version (which has an occasional stutter of its own).
That's not all, though. The frame rate is proving more stable on the PS5, with the game holding at 60fps pretty consistently in scenes where the Series X drops numerous times into the very low 50s, even though both maintained the same resolution throughout.
Regarding "Call of Duty," a similar scenario is playing out. YouTuber VG Tech ran the title both at 120fps and at 60fps with ray tracing enabled. In the 120fps mode, both consoles held 120fps fairly consistently in most cases, but the Xbox had more drops and fell below 110fps at times, whereas the PS5 barely left 120fps. In cutscenes, the PS5 fell to 105-110fps while the Series X dropped below 100fps at points.
In the 60fps mode with ray tracing on, both were reportedly much more consistent, though here the PS5 hit an issue where it could randomly drop below 60fps in scenes it had previously run at a solid 60fps. In both cases, optimisation of both the console SDKs and the games themselves should iron out many of the bugs, but the results certainly seem to shatter the idea that the Series X will play third-party games better.
A clear picture emerging from this and other recent tech announcements is that more conventional power on paper does not automatically equal better performance. AMD, Sony and Apple have all pushed into in-house design, production and optimisation of their own lower-powered components rather than using higher-powered modular 'off the shelf' parts from third-party manufacturers, and are yielding far better results in the process, at better value for the consumer.