I'm telling you that the Xbone and the PS4 don't work the same way.
Same here, and tbh I don't miss it. HD is a bit surplus IMO; I never really notice the eye candy after ten minutes of playing a game, so I turn it off for a little performance boost.
"We'll see soon enough."

"A 5870 is better performance than a 6870."
The devkits have been out for ages. There's no "soon enough".
That's a real oddity though. In general it is true that a 77xx is faster than a 67xx. But the real performance difference usually lies in the second number rather than the first one. There needs to be a several generation difference before x8xx doesn't outperform x7xx.
As for Tomb Raider, a 6870 (still a pretty decent card!) can't even pull 30fps average at Ultra settings and a 1680x1050 screen res. It is actually a very demanding game at Ultra settings.
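The numbering rule of thumb above can be sketched in a few lines of Python. This is just an illustration of the heuristic described in the post (second digit = performance tier, first digit = generation, tier wins unless the generation gap is several generations); the function names and the cutoff of roughly three generations are my own assumptions, not benchmark data.

```python
# Sketch of the Radeon model-number rule of thumb from the post above.
# Heuristic only -- real performance depends on the actual silicon.

def split_model(model: int) -> tuple[int, int]:
    """Split a 4-digit Radeon model into (generation, performance tier)."""
    return model // 1000, (model // 100) % 10

def likely_faster(a: int, b: int) -> int:
    """Guess the faster card: the tier (second digit) dominates unless
    the generation gap is large (assumed: ~3 or more generations)."""
    gen_a, tier_a = split_model(a)
    gen_b, tier_b = split_model(b)
    if tier_a != tier_b and abs(gen_a - gen_b) < 3:
        # Within a couple of generations, the higher tier wins
        return a if tier_a > tier_b else b
    # Same tier (or a big generation gap): the newer generation wins
    return a if gen_a > gen_b else b

print(likely_faster(5870, 6870))  # same tier: newer 6870 wins
print(likely_faster(5870, 6770))  # x8xx still beats the newer x7xx
```

Per the heuristic, a 5870 loses to a 6870 (same tier, newer generation) but still beats a 6770 (higher tier, only one generation older), which matches the "oddity" being discussed.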
The devkits have been out for ages. There's no "soon enough".
The devkits aren't the final silicon. Indeed, for a release date of the end of this year, I would expect the final silicon to have only just been finalised. You do know that the demo systems at E3 may well have been PCs running Windows 7, don't you? And, allegedly, Nvidia's Geforce cards to boot! True, I don't expect the first round of games to be optimised, but once developers get their hands on the actual kit, then we'll see significant performance improvements.
Mine is looking a bit older than that; it's an E8400 @ 3.6 GHz with a 5850. Some things still run sweet, but some stuff is starting to look juddery. Got an SSD already. It's a little frustrating, as everything else I use it for is buttery smooth.
If it's a 5850 then you should be able to overclock the tits off it no problem; it's a good card, that.