Export thread

Need a Temporary Gaming Video Card


#2

AshburnerX

What's it need to run, Left 4 Dead 2? I get by on my GeForce 8500... you might be able to get by with less.


#3

Shegokigo

Well right now, I'll be honest.

I can run L4D2 on a 7800gt with all the settings at minimum, but it's still a bit "skippy" and I'm pretty useless with a rifle.

I want something that will run it at least smoothly on minimum or medium settings, and those are my only 3 options nearby that I can return within 2 weeks.


#4

AshburnerX

I think either of those last two should be able to run it perfectly fine. I run it at low because I like optimum performance, but I could probably run it at medium on mine. However, do keep in mind that Left 4 Dead 2 seems to be having some problems with GeForce cards according to the forums. I've had weird graphic glitches with mine (which can be fixed by changing the resolution to reload the graphics), but it doesn't happen too often anymore.


#5

SpecialKO

If that's your requirement, I would ask how long you're going to need it.

Because an 8400 may support more advanced effects than a 7800gt, but it's a much weaker card and will probably have worse aggregate performance.

Tom's Hardware has a pretty good hierarchy chart for basic equivalency of performance.
http://www.tomshardware.com/reviews/best-graphics-card,2464-8.html

The performance equivalent of a 7800gt is an 8600gts, a 9500 GT, an X1800 XL, or an HD 4650 with DDR2 memory.


#6

AshburnerX

This is true... I have an 8500gt. Your performance with a normal 8400 may not be close to what I have.


#7

Shegokigo

Looking at the hierarchy chart, I'm screwed....


#8

Shegokigo

If I push my cash limit at the moment, I could grab a

http://www.bestbuy.com/site/EVGA+-+...Card/9638496.p?id=1218135450650&skuId=9638496

It shows higher than the 7800gt on the hierarchy, but I wonder if it's by enough.


#9



Soliloquy

Now, see, this is why I mostly play old games now.


#10

figmentPez

Why Best Buy? Because you want to buy from brick and mortar?

You could get a much better price at Newegg or elsewhere online (or go with ATI), if you can wait a few days for shipping that is.

Note, those aren't the pick of the litter, just what I found quickly that had free shipping. There may be much better options for nVidia, and certainly are if you go ATI.


#11



Soliloquy

I'm guessing it's because

A) it's temporary
B) she wants a place to return it to quickly and easily


#12

Shegokigo

Reason being, I have a Best Buy nearby I can buy from and return to.

My main system is going to be gone two weeks. My current "2nd system" is running my 3rd computer's video card, a 7800gt, and I get choppy frame rates at low settings in L4D2. I just want something to tide me over for two weeks.


#13



Cuyval Dar

Newegg is really good about returns, and those cards are all ripoffs. Might as well get a cheap Radeon 4670 off of the 'egg and call it a night. Besides, you never know when you might need that backup rig again.


Also, there is no way in hell you will run L4D2 at acceptable frame rates on an 8400GS.


#14

Shegokigo

Yeah, but by the time the newegg card gets here, I'll have gotten my system back. :paranoid:

http://www.videocardbenchmark.net/video_lookup.php?cpu=GeForce+GT+220M

Depressing the hell out of me.

Someone want to tell me why the GTX285 is ranked higher than the 295?


#15

Necronic

Sweet, my card is at the top of the list, a year after I bought it. Fuckin A the 4870X2 is an amazing card.

Only thing I can suggest is checking whether a Microcenter or even a Fry's might have a better deal on a card than Best Buy. Fuck, Walmart may have a better deal...



#16



Cuyval Dar

Yeah, but by the time the newegg card gets here, I'll have gotten my system back. :paranoid:

http://www.videocardbenchmark.net/video_lookup.php?cpu=GeForce+GT+220M

Depressing the hell out of me.
That benchmark is for a Mobile or "M" card, e.g. a laptop.


Like I said, you may have need for that backup PC in the future, so you might as well just invest in a cheap, decent card anyway.

http://www.videocardbenchmark.net/video_lookup.php?cpu=GeForce+GT+220
Try this.


#17

Shegokigo

At the moment, this 7800gt is outperforming anything at Best Buy for under $150 so I'm pretty much SoL.

Yes, I need to invest in a backup card, mostly for my secondary computer to even be running when my main system gets back; I'm just financially tied up at the moment.


#18



Cuyval Dar

Unless you go the Newegg route, yes, that 7800 GT beats out all of your choices.


#19

Shegokigo

I'm still curious how a GTX285 is outperforming the 295.


#20



Cuyval Dar

Higher memory and core clocks, because there aren't the thermal constraints of an Nvidia-style dual GPU card.


Also, SLI isn't remotely close to 100% efficiency.


#21

Shegokigo

Reading up on the ATI 5870, I'm starting to drool.

My Nvidia days may come to an end before long.


#22



Cuyval Dar

Oh, sure, NOW you choose to look at ATI. :rolleyes:


#23

Shegokigo

Well, previously the Nvidia and ATI cards would be neck and neck and PhysX pushed Nvidia over the edge for me.

The numbers I'm reading for the 5870 and, later, the 5970 are just nuts. Granted, Nvidia hasn't released anything new yet but just wow. :eek:

Hm, though it seems the GTX295 is outperforming the 5870?

http://hothardware.com/Articles/ATI-Radeon-HD-5970-DualGPU-Powerhouse/?page=4

http://hothardware.com/Articles/AMD-ATI-Radeon-HD-5870-Unquestionably-Number-One/?page=6

I mean I know it's a single GPU card, but it's not convincing me to take it over the GTX295.

Though what TWO 5870s can do in Crossfire.... :drool:


WTF is this?

http://magicboxlive.blogspot.com/2009/09/ati-hd-5870-vs-gtx-285-benchmarks.html


I wonder what would be smarter, dual 5870s or just a 5970....


#24



Cuyval Dar

Well, previously the Nvidia and ATI cards would be neck and neck and PhysX pushed Nvidia over the edge for me.

The numbers I'm reading for the 5870 and, later, the 5970 are just nuts. Granted, Nvidia hasn't released anything new yet but just wow. :eek:

Hm, though it seems the GTX295 is outperforming the 5870?

http://hothardware.com/Articles/ATI-Radeon-HD-5970-DualGPU-Powerhouse/?page=4

http://hothardware.com/Articles/AMD-ATI-Radeon-HD-5870-Unquestionably-Number-One/?page=6

I mean I know it's a single GPU card, but it's not convincing me to take it over the GTX295.

Though what TWO 5870s can do in Crossfire.... :drool:

WTF is this?

http://magicboxlive.blogspot.com/2009/09/ati-hd-5870-vs-gtx-285-benchmarks.html
Name one game that you play on a daily basis that uses PhysX. Plus, PhysX also runs very well on the CPU.

Plus, NV doesn't even have their answer to the 5xxx-series, and when they finally do, AMD will just hammer them again.

Both the single-GPU 5870 and dual-GPU 5970 are vastly more power efficient and faster, they actually support DX11, and there are a myriad of other hardware improvements in areas like HD video rendering.
Not only that, but in the review you linked, both the 5870 and 5970 were unleashing a major can of whoopass on the 295 and 285.


#25

Shegokigo

Yeah, my bigger debate at this point is dual 5870s or a 5970 by itself.

Thankfully Jan/Feb is going to be good to me financially and I may try unloading my GTX295 for $300.


#26

Shannow

HAHAHAHAHAHAHAHHAHAHA


#27



Cuyval Dar

Yeah, my bigger debate at this point is dual 5870s or a 5970 by itself.

Thankfully Jan/Feb is going to be good to me financially and I may try unloading my GTX295 for $300.
Personally, I would go with the fastest single card.
Remember, when Hydra comes out, that will pay off big time, being able to SLI/CF different cards from different vendors.


#28

Shegokigo

HAHAHAHAHAHAHAHHAHAHA
I honestly, and completely wish you'd stop doing that.

If for no other reason than that I actually enjoy hearing your opinion on subjects, instead of a complete waste of a post that really makes you come off as a completely inane, mentally weak individual, when I know you could really participate and bring an interesting perspective to a number of conversations.


Only thing is, I'd buy the two 5870s together as it is, but I see your point about being able to remove one and pop in another type later.


#29

AshburnerX

Name one game that you play on a daily basis that uses Physx. Plus, Physx also runs very well on the CPU.
City of Heroes. That's about it.


#30

SpecialKO

Irritating as it may be, I think you should just wait until you can replace the video card in your main computer.

If you were trying to do big image/video-editing work, that would totally be a different case, but if all you're going to be unable to do for 2-3 weeks is play L4D2, I would just hold off until you get your main machine back.


#31

Shegokigo

I've actually been out of the house a LOT in the past 2 weeks, but when I'm home (and GF isn't home) I'm jumping between my PSP and L4D (at this framerate I can't even compete well in TF2). I guess I'll focus more on my consoles as it is; instead of buying the video card I'm probably going to be getting Metroid Prime Trilogy.


#32

PatrThom

Higher memory and core clocks, because there aren't the thermal constraints of an Nvidia-style dual GPU card. Also, SLI isn't remotely close to 100% efficiency.
The GTX295 is actually 2xGTX280 chips on one card (2x240 shader processors) but each only running at the speed of the GTX260 (576MHz). It's like having two "GTX260 Ultra" cards in SLI but on only one card.
The GTX285 is a die shrink of the GTX280, meaning the chip is smaller/uses less power/doesn't get as hot and therefore the clock speed can be increased. It still has the same 240 cores, but they run/shade/talk to memory about 12.5% faster than the 295's. Additionally, where each GPU in the GTX295 is cut down to a 448-bit memory bus and 896MB of usable RAM (like a GTX260), the GTX285 keeps the full 512-bit bus and 1GB.

In one L4D test, the GTX295 is only a measly 17% faster than a single GTX285 at low resolutions, but it climbs all the way up to 33% better at really high resolution with bells and whistles turned on. Mind you, according to their tests, the GTX295 also requires over 50% more power than the GTX285 to give you that 17-33% performance boost.

I have to stand by my original assertion. As I said previously in an earlier thread...
The short of it is that, for right now, ATI holds the triple crown for image quality, performance-per-watt, and framerate. So if you need these things, and you need them right now (and you don't want to get there by merely lowering your graphics settings), the 5870 is the way to go. If you can afford to wait (or can't afford to switch), then by all means wait to see what NVIDIA releases (3xx in Q1 2010), or even whether Larrabee is all that Intel says it will be (likely Q2 of 2010).
But to your original question, of the three cards you specifically mention, I can't recommend any of them. I don't think I'd personally go for anything less than a 9500GT 1GB/HD4670 card (which are about $20 more than your three choices).

--Patrick


#33

Shegokigo

Strange, the video card hierarchy shows the 9500gt lower than the 7800 I have right now. :blue:


#34

SpecialKO

Strange, the video card hierarchy shows the 9500gt lower than the 7800 I have right now. :blue:
It's because there are different versions of both. The 7800 GT is better than the 9500 GT with DDR2 RAM, but about the same as the 9500 GT with DDR3.

And the 7800 GTX is just better.

Just something to keep in mind about the hierarchy chart (as with all benchmarkers really). Different cards will perform differently with different tests.

I like Tom's because they use both 3DMark, which just pushes the card's capabilities from a math perspective, and current games at varying levels of video settings.


#35

Hailey Knight

I'm just gonna make this an all video card topic...

I'm running with Nvidia 8200M G for my laptop. Now, the chart posted earlier says three tiers up, which would be 6800 (128-bit) or in the range of ATI X1450, X1600, X1700, 2400 XT, X2500, 3450...

Firstly, why the lower numbers, from 8200 down to 6800... secondly, am I limited to that range, or... I don't know how high I can go, outside of price. I'd like to know where I'm "pushing it" with video cards.

Okay, know what? More importantly:

WHERE THE FUCK DO I BUY THEM?

All these sites, Nvidia's goes nowhere, I cannot find where to buy laptop video cards, unless somehow they're labeled the same as desktop video cards (or it doesn't matter). Example, I cannot find where to buy the 9800M GTX... Where do I buy laptop video cards without buying a laptop?


#36



Cuyval Dar

You don't buy a laptop graphics card; you buy a new laptop. That 8200M isn't upgradeable anyway.

Strange, the video card hierarchy shows the 9500gt lower than the 7800 I have right now. :blue:
It's because there are different versions of both. The 7800 GT is better than the 9500 GT with DDR2 RAM, but about the same as the 9500 GT with DDR3.

And the 7800 GTX is just better.

Just something to keep in mind about the hierarchy chart (as with all benchmarkers really). Different cards will perform differently with different tests.

I like Tom's because they use both 3DMark, which just pushes the card's capabilities from a math perspective, and current games at varying levels of video settings.

3dMark is a synthetic benchmark POS that has no bearing on real-world performance. The only reason that Futuremark is around as a company is because Intel bribes them.


#37

PatrThom

Strange, the video card hierarchy shows the 9500gt lower than the 7800 I have right now. :blue:
Kinda my point. ;)

Take my advice for what you will, my main computers still get by on an ATI FireGL X3 (X800XT equiv) and a Quadro FX 4000 (6800 Ultra equiv). The wife's machine has a 7800GS AGP installed. Everything else I know (ie, newer cards), I know from specs and reviews.

--Patrick


#38

Necronic

Well, previously the Nvidia and ATI cards would be neck and neck and PhsyX pushed Nvidia over the edge for me.
Nvidia has been severely lagging behind since the 4870X2 came out, that beast blew the doors off of anything NVidia had, and the only way they could come close to matching it at the time was duct taping two inferior cards together, and that still ended up way below the 4870.

That said, Nvidia manufacturers generally have massively better customer support than ATI manufacturers. EVGA and BFG's lifetime warranties blow Sapphire's shitty little 2-year warranty away.

PatrThom said:
a Quadro FX 4000 (6800 Ultra equiv)
Damn, that is a slightly expensive card. I always find it funny when I read newegg reviews on those and people will say "Gaming performance was terrible, waste of moneys!!!" Makes me giggle. So you do CAD or something? Friend of mine put one of those into a laptop for some engineering work. Brutally expensive piece of equipment.


#39

Shegokigo

So I'm getting curious

Two 5870s or One 5970?

From what I've been seeing in benchmarks, the way to go is the dual 5870s?


#40

Hailey Knight

You don't buy a laptop graphics card; you buy a new laptop. That 8200M isn't upgradeable anyway.
Well then, I hate this thing once again...


#41

PatrThom

Gelato GPU rendering with it, but I just never set aside the time to pursue my 3D modeling dreams.
Two 5870s or One 5970?
I always prefer to recommend multiple single cards unless you really need to have 3 or 4 GPUs in the machine. Single cards are usually clocked faster/overclock better, have more RAM, and are a lot easier to swap out/troubleshoot if one goes bad. Also, for those games where dual GPU doesn't give you any boost (or even gives a penalty!), you can just turn CF off (or remove a card to save power/heat). I've always been a big fan of modularity, and will happily sacrifice 1 or 2fps for the versatility and/or ability to enormously simplify troubleshooting.

--Patrick


#42



Cuyval Dar

When Hydra motherboards hit the market, that recommendation may change.


#43

Shegokigo

Why's that Cuy? I thought Hydra was all about the mixing of different single GPU cards?


#44

PatrThom

In case word hasn't gotten around, it seems Intel's Larrabee has been canceled. This means that 2010 will still be a battle between AMD and NVIDIA for graphics supremacy.

--Patrick


#45

Shegokigo

Buh?


#46

Shegokigo

Good news, I'm getting a

http://www.bestbuy.com/site/XFX+-+A...6025.p?id=1218125775469&skuId=9566025&st=4890

till my main system comes back!


#47

Shegokigo

It was a tossup between the GTX275 and the Radeon 4890, but I decided I'll give ATI a shot again before jumping straight to Nvidia.


#48

AshburnerX

ATI cards are generally better for anything not specifically optimized for a GeForce or Physx Card. The only reason I use a GeForce is because of City of Heroes.


#49

Shegokigo

Installing now.....


#50

figmentPez

Early Christmas present? Eating Ramen for the next month? Advance payment on a contract killing?


#51

Shegokigo

Little of column C, little of column D.


#52

PatrThom

I would be interested in hearing how well it works, Shego. I would expect to get about 2/3 the performance/fps of your GTX295 with it (on average). I know that card has issues (the entire high end of the 4xxx/RV770 series, actually), but they seem to be the sort of issues that don't come up under 'normal usage.' If you DO run into these problems, they can be solved by slightly underclocking your card.

--Patrick


#53



Cuyval Dar

Yeah, stress test programs that I've never heard of seem like a really accurate benchmark for stability and reliability. Now, show me issues that crop up in real-world scenarios, and then we're talking.

Oh, and if that wasn't clear enough: I call bullshit.

Originally Posted by Tetedeiench
UPDATE : the 4870 on the test is a 4870 PCS+ from PowerColor whose VRM are a 4-phase numerical VRM instead of 3, and that's why it is not crashing.

Sorry, i thought it was an Asus design. It is Not. My mistake.

Eastcoasthandle:
So you basically designed a new test which is designed for 4-phase numerical VRM instead of 3?
Just what I thought.


#54

Shegokigo

Honestly, I pumped up the graphics in L4D2 pretty high (not 16xAA or anything, but on par with my GTX295 settings), and I have to say, it runs smooth as butter. Granted, my load times are a little longer, but that's because this older system has an older motherboard and less RAM.

On the graphics side, I can safely say this will easily tide me over till my main system comes back, and I'm really looking forward to Crossfiring two 5870s in a few months.


#55



Cuyval Dar

Solution?

Don't run OCCT GPU test or Furmark.
Problem solved.
Truly, I lol'd.


#56

Shegokigo

One thing to note, I play on 1680x1050 setting, so the high resolution benchmarks don't mean much to me.


#57



Cuyval Dar

Wait, what size monitor do you use?



#59



Cuyval Dar

Link really doesn't care for me.


#60

figmentPez

Little of column C, little of column D.
Ah, I see. You sold nude pictures of yourself to some poor sap, and now you're going to kill him and any other man who saw them. Devious. :twisted:


Link really doesn't care for me.
23" LG W2361VG-PF

At least, that's what comes up at the link for me. Its native resolution is 1920x1080, so I'm not sure why Shego would be playing @ 1680x1050 on it.


#61

Shegokigo

Cause shit gets too tiny at 1920


#62



Cuyval Dar

The aspect ratio is better at 1920x1080. I say that from experience: I prefer the 16:9 aspect ratio. It just seems better for gaming.


#63

Shegokigo

Yeah but frame rates are hurt by larger resolutions.


#64

SpecialKO

Yeah but frame rates are hurt by larger resolutions.
True, but playing the game at a different aspect ratio than the native one for your monitor can affect performance.

In this case, probably not by much, but the fps difference between 1680x1050 and 1920x1080 probably isn't that much either (at least in L4D2).


#65

Shegokigo

I'll give it a try then, I've just always played all my games at 1680.


#66

SpecialKO

I'll give it a try then, I've just always played all my games at 1680.
Testing on an actual game is really the best way to check. :p

When I installed Dawn of War on my old PC desktop with a spanky new 9600 GT, it recommended extremely low graphics settings as "optimal". I tried it out, then turned all the settings to max and tried it again. It barely affected performance at all, and looked much better.


#67

Shegokigo

I'll test when I get my main system back, this one is having a hard enough time as it is.


#68

figmentPez

I'll test when I get my main system back, this one is having a hard enough time as it is.
It could be CPU bound. My 2.5GHz Core 2 Duo with a Radeon 4830 is CPU bound in Left 4 Dead (I haven't tested L4D2). I get pretty much the same frame rates at 1280x720 as I do at 1920x1080 (both 16:9 resolutions). I made a timedemo and ran it a few times with varying AA levels, and they were all within a few fps in the results.


#69

SpecialKO

I'll test when I get my main system back, this one is having a hard enough time as it is.
It could be CPU bound. My 2.5GHz Core 2 Duo with a Radeon 4830 is CPU bound in Left 4 Dead (I haven't tested L4D2). I get pretty much the same frame rates at 1280x720 as I do at 1920x1080 (both 16:9 resolutions). I made a timedemo and ran it a few times with varying AA levels, and they were all within a few fps in the results.

That sounds right since they're both 16:9. As best as I understand it, as long as the native resolution of your monitor is well within the capabilities of your graphics card (you're not running at 2560x1600 or something like that), performance at different resolutions of the same aspect ratio won't be substantially different, assuming all other settings (AA/AF/textures/lighting/shadows/etc) remain the same.
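To put a rough number on the gap between those two resolutions (simple pixel-count arithmetic; the actual fps impact depends on the game, card, and settings):

```python
# Pixels per frame the GPU has to fill at each resolution.
# Pure arithmetic; the real fps cost varies by game, card, and settings.

def pixels(width: int, height: int) -> int:
    return width * height

res_1680 = pixels(1680, 1050)  # 1,764,000 px (16:10)
res_1920 = pixels(1920, 1080)  # 2,073,600 px (16:9)

increase = res_1920 / res_1680 - 1
print(f"{increase:.0%}")  # about 18% more pixels per frame at 1920x1080
```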


#70

Necronic

I swear, you young rapscallions with your highest graphical settings. I've always thought it was a bit odd. Back in the day, playing Quake 1-3, we would build these really powerful systems, then crank the graphics back to the absolute minimum to get the best frame rates possible.

Low frame rates will completely fuck your ability to play FPSs at a competitive level. High screen res can be nice, though, as it often gives you a larger viewable area (especially in the old games with solid status bars). But it still blows my mind that people play these games with anything less than the 75ish Hz limit of LCD screens/the human eye.


#71

Shegokigo

Except a powerful system can run at max FPS and highest settings.


#72

figmentPez

Meh, I don't play FPS at a competitive level, and I have no intention to. I play to have fun, and as long as my minimum sustained FPS stays above ~20-25, the game feels smooth enough to me. Heck, on my old system I finished HL2 when almost any use of the gravity gun would drop my FPS to single digits, and I still enjoyed myself.

If I'm getting pretty much the same framerate on my system at 1920 as I can at 1280, then I'm going with the higher resolution. I don't like jaggies, and at higher resolutions I can spot things farther away in-game. It's easier for me to distinguish between a smoker and a common infected when they're made up of twice as many pixels to give detail.

That said, most LCD monitors are limited to 60Hz, even if they can accept 75Hz input (the exceptions being new 120Hz LCDs meant for use with 3D goggles). And the human eye is decidedly not limited to 75Hz; I don't even consider myself to have fast eyes, and on a CRT 75Hz looks flickery and gives me a headache.

