
Cheap video card to play Tomb Raider?

I'm telling you that the Xbone and the PS4 don't work the same way.

Though he didn't really prove his point with Rage, did he?
 
Oh, and I forgot to mention that my OH plays the latest Tomb Raider on max settings on his PC with my old HD 5850 graphics card (I swapped it out for a 2GB GTX 670 in my own PC) - it's not a game that is massively taxing or requires the latest hardware to play.

If buying an ATI/AMD graphics card, bear in mind that a bigger number doesn't always mean a better card - the way they number their cards is slightly odd, at least to the buyer. A 5870 performs better than a 6870, so check reviews and performance charts before buying.

Here's a very useful page about graphics card performance comparisons:

http://www.videocardbenchmark.net/
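
If it helps, here's a rough Python sketch of the idea - compare cards by a benchmark score rather than by model number. The scores and the better_card helper below are made-up placeholders purely for illustration; look the real figures up on the site above before buying.

# Rough sketch: pick a card by benchmark score, not model number.
# The scores below are made-up placeholders - check a real chart
# (e.g. videocardbenchmark.net) for actual figures.
SCORES = {
    "Radeon HD 5850": 3000,  # placeholder
    "Radeon HD 5870": 3400,  # placeholder
    "Radeon HD 6870": 3200,  # placeholder
}

def better_card(a, b):
    """Return whichever card has the higher (placeholder) score."""
    return a if SCORES[a] >= SCORES[b] else b

# A bigger model number doesn't guarantee more performance:
print(better_card("Radeon HD 5870", "Radeon HD 6870"))  # Radeon HD 5870 in this example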
 
We'll see soon enough.
The devkits have been out for ages. There's no "soon enough".

A 5870 performs better than a 6870
That's a real oddity though. In general it is true that a 77xx is faster than a 67xx. But the real performance difference usually lies in the second digit rather than the first one. There needs to be a gap of several generations before an x8xx doesn't outperform an x7xx.


As for Tomb Raider, a 6870 (still a pretty decent card!) can't even pull 30fps average at ultra settings and a 1680x1050 screen res. It is actually a very demanding game at Ultra settings.
 
The devkits have been out for ages. There's no "soon enough".


That's a real oddity though. In general it is true that a 77xx is faster than a 67xx. But the real performance difference usually lies in the second digit rather than the first one. There needs to be a gap of several generations before an x8xx doesn't outperform an x7xx.


As for Tomb Raider, a 6870 (still a pretty decent card!) can't even pull 30fps average at ultra settings and a 1680x1050 screen res. It is actually a very demanding game at Ultra settings.

No, it's not true. The example you give of a 6870 is actually the series replacement for the 5770, and is lower performance than any 5800 series card. ATI/AMD cards come in two series - a high-performance series (model numbers starting 58xx or 69xx and so on) and a budget series (model numbers starting 57xx or 68xx and so on). The high-performance series is priced higher and outperforms the budget series. You've misunderstood if you think the first two digits of the model number are unimportant in terms of performance. My 5850 outperforms a 6870.

ATI's GPU numbering system has been this way for a while now - we all know it's hard for the consumer to wrap their head around, but it's the way they've decided to do it.
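
To put that naming split into code, here's a rough sketch - the prefix lists only cover the HD 5000/6000 examples mentioned in this thread, and the tier labels are just for illustration:

# Rough sketch of the series split described above: the second digit marks
# the tier within a generation, so the first two digits together tell you
# which series a card belongs to. Only covers the examples in this thread.
HIGH_PERFORMANCE = {"58", "69"}  # e.g. 5850, 5870, 6950, 6970
BUDGET = {"57", "68"}            # e.g. 5770, 6850, 6870

def series_tier(model):
    """Classify a Radeon HD model number (e.g. '6870') by its first two digits."""
    prefix = model[:2]
    if prefix in HIGH_PERFORMANCE:
        return "high performance"
    if prefix in BUDGET:
        return "budget"
    return "unknown - check a benchmark chart"

print(series_tier("5850"))  # high performance
print(series_tier("6870"))  # budget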
 
The devkits have been out for ages. There's no "soon enough".

The devkits aren't the final silicon. Indeed, for a release date of the end of this year, I would expect the final silicon to have only just been finalised. You do know that the demo systems at E3 may well have been PCs running Windows 7, don't you? And, allegedly, Nvidia's GeForce cards to boot! True, I don't expect the first round of games to be optimised, but once developers get their hands on the actual kit, then we'll see significant performance improvements.
 
The devkits aren't the final silicon. Indeed, for a release date of the end of this year, I would expect the final silicon to have only just been finalised. You do know that the demo systems at E3 may well have been PCs running Windows 7, don't you? And, allegedly, Nvidia's GeForce cards to boot! True, I don't expect the first round of games to be optimised, but once developers get their hands on the actual kit, then we'll see significant performance improvements.

Yeah, I've heard from numerous other sources that the Xbox One demos at E3 were actually run on PCs with Win 7 or 8 and Nvidia graphics cards (given that they don't actually have an Xbox One yet!), so it's no surprise at all to hear it on Urban too!
 
Sony was most definitely using PS4 devkits, though. The hardware is quite similar to the Xbox One's - Sony's is even a bit more powerful. It's just that Sony had their order in before MS, and GF only has so much capacity. For god's sake, it's running a CPU you can buy today with a GPU that's just like one you can buy today (but with a slightly different number of pipeline clusters).
 
Mine is looking a bit older than that - it's an E8400 @ 3.6GHz with a 5850. Some things still run sweet, but some stuff is starting to look juddery. Got an SSD already. It's a little frustrating as everything else I use it for is buttery smooth.

If it's a 5850 then you should be able to overclock the tits off it no problem, it's a good card that.
 
If it's a 5850 then you should be able to overclock the tits off it no problem, it's a good card that.

The 5850 is good, but I do have a slight heat issue with mine in my second PC (and also had a heat issue when it was in this PC). It's slightly overclocked (not as OC'd as my GTX 670, mind you), but you'd better have good fans on it if you want to push it. Use a utility to monitor GPU temperature during use.
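
If you're on Linux you can keep an eye on it from a script too. Here's a rough sketch that polls the kernel's hwmon sensors - it assumes the graphics driver exposes a temp1_input file, and it lists every hwmon chip on the system, so the GPU's reading is just one of those shown. On Windows, a utility like GPU-Z or MSI Afterburner does the same job graphically.

# Rough sketch: poll temperature sensors exposed through Linux's hwmon
# interface. Paths vary by driver and system; raw values are in
# millidegrees C. This prints every temp1_input sensor - the GPU's will
# be among them if its driver exposes one. Ctrl+C to stop.
import glob
import time

def read_temps():
    temps = {}
    for path in glob.glob("/sys/class/hwmon/hwmon*/temp1_input"):
        try:
            with open(path) as f:
                temps[path] = int(f.read().strip()) / 1000.0  # millidegrees -> degrees C
        except OSError:
            pass  # sensor not readable, skip it
    return temps

while True:
    for sensor, celsius in read_temps().items():
        print(f"{sensor}: {celsius:.1f} C")
    time.sleep(5)  # poll every few seconds while the game is running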
 