11/14/09 [Filler] - Let's talk Graphics Cards

Started by ChaosMageX, November 14, 2009, 10:45:26 AM


ChaosMageX

So, what kind of graphics card do you think Amber is getting?
Maybe it's one of those new Nvidia GTX 260 models.  I think you can run Crysis at full settings when using two in tandem.
But for an artist, I think one would be more than enough.

And Amber got two new monitors, wow.  How can she even fit two tubes on her desk, unless...

she has a second desk behind the first!  That would really jut out from any wall it was put up against.

Icon by Sunblink

Dekari

Quote from: ChaosMageX on November 14, 2009, 10:45:26 AM
Maybe it's one of those new Nvidia GTX 260 models.  I think you can run Crysis at full settings when using two in tandem.

Actually, one could handle that.  If you have the money to get two, just get a GTX 295.

Though I will say, if she did get any of the 200-series models, a word of warning: install the tuning software that comes with it and change the fan speed from auto (turn auto off) to about 60-70%.  The fan is supposed to scale with the workload automatically (it idles at 40% and goes up to whatever is needed), but it doesn't seem to do that effectively, which can cause lock-ups and graphics glitches.  That will only happen when using graphics-heavy programs like games and whatnot.
I somehow get the feeling that you didn't think your cunning plan all the way through.

Thanks go to Kipiru and Rhyfe for the art work used in avatars.

http://drakedekari.deviantart.com/

Tapewolf

But will they work with Thief?

J.P. Morris, Chief Engineer DMFA Radio Project * IT-HE * D-T-E


Dekari

Quote from: Tapewolf on November 14, 2009, 11:07:29 AM
But will they work with Thief?

Are you talking about the original one from '98?

I don't know, I got a copy of it though.  Could check it out for you.
I somehow get the feeling that you didn't think your cunning plan all the way through.

Thanks go to Kipiru and Rhyfe for the art work used in avatars.

http://drakedekari.deviantart.com/

Tapewolf

Quote from: Dekari on November 14, 2009, 11:13:51 AM
Are you talking about the original one from '98?

I don't know, I got a copy of it though.  Could check it out for you.

Thief, Thief 2, System Shock II.  Any or all of these.  If you could try one of them, that would be great.  It has been said that some of the more recent cards have broken 16bpp support, and render the world without any proper shading.

J.P. Morris, Chief Engineer DMFA Radio Project * IT-HE * D-T-E


ChaosMageX

#5
Is it worth it to buy a laptop with the new Nvidia GTX 260M processor in it?

I've heard that their specs are equivalent to the Nvidia 9800 GT desktop card.

I just need something in a laptop that can play The Sims 3 at max settings.  It's my brother who likes to play Crysis, Battlefield 2142, and Fallout 3, so that'd make a good gift for him.

But also, I don't think his motherboard has the PCI ports necessary to run the newer cards.  I think it had something to do with 2.0, but I don't pay attention to that stuff since I'm a laptop kind of guy.

Icon by Sunblink

Chakat Blackspots

#6
Quote from: Dekari on November 14, 2009, 11:06:38 AM
Quote from: ChaosMageX on November 14, 2009, 10:45:26 AM
Maybe it's one of those new Nvidia GTX 260 models.  I think you can run Crysis at full settings when using two in tandem.

Actually, one could handle that.  If you have the money to get two, just get a GTX 295.

Though I will say, if she did get any of the 200-series models, a word of warning: install the tuning software that comes with it and change the fan speed from auto (turn auto off) to about 60-70%.  The fan is supposed to scale with the workload automatically (it idles at 40% and goes up to whatever is needed), but it doesn't seem to do that effectively, which can cause lock-ups and graphics glitches.  That will only happen when using graphics-heavy programs like games and whatnot.

That might depend on the brand of card, though.  I have the BFG GTX260 MaxCore 55 896MB video card.  The fan does speed up when the card is under load, and my front case fan also speeds up as the case warms up (my motherboard controls its speed), so that helps with cooling.

--EDIT--
Forgot to mention this: perhaps Amber can try the oven trick on her video card.  That sometimes fixes things.

Dekari

#7
Quote from: Tapewolf on November 14, 2009, 11:15:32 AM
Quote from: Dekari on November 14, 2009, 11:13:51 AM
Are you talking about the original one from '98?

I don't know, I got a copy of it though.  Could check it out for you.

Thief, Thief 2, System Shock II.  Any or all of these.  If you could try one of them, that would be great.  It has been said that some of the more recent cards have broken 16bpp support, and render the world without any proper shading.

Well, Thief is a no-go.  Actually, it runs and the graphics don't seem to have any glitches.  However, it locks up after about 10 seconds to 5 minutes of gameplay.  You can still Ctrl+Alt+Del out of it, so it's not a complete lock-up.

It reminds me of the problem I had with Red Faction: Guerrilla.  Running full screen, it would lock up and/or get choppy until I managed to swing the camera to something with less detail.  Switched to windowed mode, and it would run flawlessly.  So if there is a way to force those games into windowed mode, it might work.
I somehow get the feeling that you didn't think your cunning plan all the way through.

Thanks go to Kipiru and Rhyfe for the art work used in avatars.

http://drakedekari.deviantart.com/

Tapewolf

Quote from: Dekari on November 14, 2009, 11:52:31 AM
Well, Thief is a no-go.  Actually, it runs and the graphics don't seem to have any glitches.  However, it locks up after about 10 seconds to 5 minutes of gameplay.  You can still Ctrl+Alt+Del out of it, so it's not a complete lock-up.

Thief and its brethren don't like multicore processors.  If it's run unpatched on a multicore CPU, it will do exactly what you've just described.  There are workarounds - you can start it and then set its CPU affinity via Task Manager, or if you know what you're doing you can patch the EXE header so it does this by default.

This may help:  http://www.thief-thecircle.com/guides/hyperthreading/

...but it sounds like it does support 16bpp properly  :P
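
If you want to script that Task Manager step instead of clicking through it every time, here's a rough Win32 sketch of the same idea.  The install path is made up - point it at wherever your copy of THIEF.EXE actually lives - and this is just one way of forcing single-core affinity, not what the header patch above does internally.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    STARTUPINFOA si;
    PROCESS_INFORMATION pi;
    /* Hypothetical install path - change it to wherever the game lives. */
    char cmdline[] = "C:\\Games\\Thief\\THIEF.EXE";

    ZeroMemory(&si, sizeof(si));
    si.cb = sizeof(si);

    /* Start the game suspended so its affinity can be set before it runs. */
    if (!CreateProcessA(NULL, cmdline, NULL, NULL, FALSE,
                        CREATE_SUSPENDED, NULL, NULL, &si, &pi)) {
        printf("CreateProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* Restrict the process to CPU 0 only (affinity mask 0x1), then resume it. */
    SetProcessAffinityMask(pi.hProcess, 0x1);
    ResumeThread(pi.hThread);

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}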

J.P. Morris, Chief Engineer DMFA Radio Project * IT-HE * D-T-E


Amber Williams

According to my friend and my notepad document, I've ordered a:
EVGA E-GEFORCE 9800GT 55NM 600MHZ 512MB 1.8GHZ 256BIT GDDR3 Dual DVI-I HDCP HDTV Out Video Card


NO CLUE WHAT THIS MEANS. But I trust my friend's judgement.

Dekari

Quote from: Amber Williams on November 14, 2009, 12:13:00 PM
According to my friend and my notepad document, I've ordered a:
EVGA E-GEFORCE 9800GT 55NM 600MHZ 512MB 1.8GHZ 256BIT GDDR3 Dual DVI-I HDCP HDTV Out Video Card


NO CLUE WHAT THIS MEANS. But I trust my friend's judgement.

It means you probably won't be playing Left 4 Dead 2 at full graphics when it comes out......but I somehow have a feeling that won't bother you too much  :P
I somehow get the feeling that you didn't think your cunning plan all the way through.

Thanks go to Kipiru and Rhyfe for the art work used in avatars.

http://drakedekari.deviantart.com/

Tapewolf

Quote from: Amber Williams on November 14, 2009, 12:13:00 PM
EVGA E-GEFORCE 9800GT 55NM 600MHZ 512MB 1.8GHZ 256BIT GDDR3 Dual DVI-I HDCP HDTV Out Video Card
NO CLUE WHAT THIS MEANS.

I don't know what you got to replace your old monitor, but it may not have a DVI connector on it - if it's a CRT monitor it certainly won't.  Fortunately these cards usually ship with a DVI->VGA adaptor.

J.P. Morris, Chief Engineer DMFA Radio Project * IT-HE * D-T-E


Chakat Blackspots

#12
Quote from: Amber Williams on November 14, 2009, 12:13:00 PM
According to my friend and my notepad document, I've ordered a:
EVGA E-GEFORCE 9800GT 55NM 600MHZ 512MB 1.8GHZ 256BIT GDDR3 Dual DVI-I HDCP HDTV Out Video Card


NO CLUE WHAT THIS MEANS. But I trust my friend's judgement.

Ok, I can pick this apart:
9800GT <-- GPU
55nm <-- manufacturing process (roughly, how small the transistors and wiring inside the GPU are)
600MHz <-- Core speed
512MB <-- amount of RAM
1.8GHz <-- memory speed (effective data rate)
256bit <-- memory bus width (with the memory speed, this determines peak memory bandwidth - see the rough figure worked out below)
GDDR3 <-- memory type (GDDR3 or above is best)
Dual DVI-I  <-- used by most LCD monitors  http://en.wikipedia.org/wiki/Digital_Visual_Interface
All CRTs (the tube type) used the old VGA connector http://en.wikipedia.org/wiki/VGA_connector with some exceptions having BNC connectors on the rear in addition to the VGA connector  http://en.wikipedia.org/wiki/BNC_connector  But even if you have an old CRT monitor, you'll still be able to use it with the DVI-to-VGA adapter like Tapewolf said.
HDCP = High-bandwidth Digital Content Protection -- needed under Vista and Windows 7 to play protected HD video (Blu-ray and the like) at full resolution on a computer monitor.  Your monitor must support it, otherwise the video is down-converted to something like 720p.
HDTV Video Out -- usually this means either the card has an HDMI connector, or it ships with a DVI to HDMI adapter plus a two-wire internal cable that runs from an S/PDIF header on the motherboard to the card, so audio can be carried out over the HDMI adapter as well.
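
Just to show how the memory speed and bus width fit together (this is only back-of-the-envelope arithmetic using the two numbers above, assuming the 1.8GHz figure is the effective GDDR3 data rate), peak memory bandwidth is the data rate times the bus width in bytes:

#include <stdio.h>

int main(void)
{
    /* Figures taken from the card listing above; the 1.8GHz value is
       assumed to be the effective GDDR3 data rate. */
    double effective_rate  = 1.8e9;       /* transfers per second */
    double bus_width_bytes = 256.0 / 8.0; /* 256-bit bus = 32 bytes per transfer */

    double bandwidth_gb_s = effective_rate * bus_width_bytes / 1e9;
    printf("Peak memory bandwidth: ~%.1f GB/s\n", bandwidth_gb_s); /* ~57.6 GB/s */
    return 0;
}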

llearch n'n'daCorna

Personally, I would have plumped for more memory on the video card, on the grounds that more is better. But that'll do just fine.
Thanks for all the images | Unofficial DMFA IRC server
"We found Scientology!" -- The Bad Idea Bears

Amber Williams

Quote from: Tapewolf on November 14, 2009, 12:31:52 PM
I don't know what you got to replace your old monitor, but it may not have a DVI connector on it - if it's a CRT monitor it certainly won't.  Fortunately these cards usually ship with a DVI->VGA adaptor.

There are adaptors coming with this card.  My friend double-checked to make sure they were included.

Quote from: llearch n'n'daCorna on November 14, 2009, 01:08:48 PM
Personally, I would have plumped for more memory on the video card, on the grounds that more is better. But that'll do just fine.

Had I the resources, I would have gone bigger and better. But I had to budget and this was the best I could get for my buck.

Gamma

#15
That card will be fine...

It should even be capable of helping Photoshop CS4 render faster (see Adobe's compatibility list).
It's not specifically mentioned, but it's in the same line of cards listed there.
Though I doubt Mrs. Williams has CS4 yet.

EDIT: Though I did notice something, the card you ordered has a PCI-X power connector, does your power-supply have those connectors? Almost any decently branded one of 300 watts or greater has them, so the likelihood of them not being there is very slim.
011010000111010001110100011100000011101000101111001011110111011101110111011101110010111001111001011011110111010101110100011101010110001001100101001011100110001101101111011011010010111101110111011000010111010001100011011010000011111101110110001111010110111101001000011001110011010101010011010010100101100101010010010010000100000100110000

Chakat Blackspots

#16
Quote from: Gamma on November 14, 2009, 02:40:48 PM
That card will be fine...

It should even be capable of helping Photoshop CS4 render faster (see Adobe's compatibility list).
It's not specifically mentioned, but it's in the same line of cards listed there.
Though I doubt Mrs. Williams has CS4 yet.

EDIT: Though I did notice something, the card you ordered has a PCI-X power connector, does your power-supply have those connectors? Almost any decently branded one of 300 watts or greater has them, so the likelihood of them not being there is very slim.

Even if her power supply doesn't have the 6-pin power connector for the card, the card will come with an adapter that joins two 4-pin Molex connectors into a 6-pin connector.  If Amber has a store-bought computer (Dell, HP, Acer, whatever), it might not even have any extra connectors in it.

Also, the 9800GT requires at least a 450W power supply with 30A on the +12V rail.  Amber did get a new PSU, though, so she might be good.
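
For what it's worth, the amperage figure matters because it caps how much power the +12V rail can actually deliver.  A quick sanity check, using the 30A number quoted above (not a measured value):

#include <stdio.h>

int main(void)
{
    /* Power available on the +12V rail is simply volts times amps. */
    double volts = 12.0;
    double amps  = 30.0;   /* the rail rating quoted above */

    printf("+12V rail capacity: %.0f W\n", volts * amps); /* 360 W */
    return 0;
}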

llearch n'n'daCorna

Quote from: Amber Williams on November 14, 2009, 01:26:43 PM
Quote from: llearch n'n'daCorna on November 14, 2009, 01:08:48 PM
Personally, I would have plumped for more memory on the video card, on the grounds that more is better. But that'll do just fine.
Had I the resources, I would have gone bigger and better. But I had to budget and this was the best I could get for my buck.

Indeed. I thought that might be the case.

Quote from: Gamma on November 14, 2009, 02:40:48 PM
EDIT: Though I did notice something, the card you ordered has a PCI-X power connector, does your power-supply have those connectors? Almost any decently branded one of 300 watts or greater has them, so the likelihood of them not being there is very slim.

I'm pretty sure Amber's machine can do that. The person in question poked around inside the machine, and anyone competent will manage to sort that out real easy. It's not like it's tricky... ;-]
Thanks for all the images | Unofficial DMFA IRC server
"We found Scientology!" -- The Bad Idea Bears

Fibre

#18
I wonder if I'm the only person left who is completely happy with 2D acceleration only, and no 3D...

Quote from: ChaosMageX on November 14, 2009, 11:37:21 AM
But also, I don't think his motherboard has the PCI ports necessary to run the newer cards.  I think it had something to do with 2.0, but I don't pay attention to that stuff since I'm a laptop kind of guy.

If you're referring to what I think you are, you should be able to put a PCI Express 2.0 card into a PCI Express 1.x slot with no problems (and the other way around too), as long as the slot is large enough for the card (which is of course the same requirement for matching versions as well).

Quote from: Gamma on November 14, 2009, 02:40:48 PM
EDIT: Though I did notice something, the card you ordered has a PCI-X power connector, does your power-supply have those connectors? Almost any decently branded one of 300 watts or greater has them, so the likelihood of them not being there is very slim.

Please do note that PCI-X is pretty much obsolete, and is very different from, and completely incompatible with, PCI Express/PCIe/PCI-E.

Frigid

#19
Quote from: Dekari on November 14, 2009, 12:24:24 PM
It means you probably won't be playing Left 4 Dead 2 at full graphics when it comes out......but I somehow have a feeling that won't bother you too much  :P

You underestimate the power of video cards... I have an 8600 GTS and it runs Left 4 Dead 2 on max graphics at 1280x1024 at a minimum 30 FPS.

'Course, I DO have both my CPU and GPU overclocked, but the 9800 is much better than a stock 8600, anyway.