Graphics Cards

Graphics cards have been around since the 1990s.  The idea then, as it is today, was to offload the drawing of the monitor's display from the central processing unit to speed up the computer.  That has not changed in all these years, and yes, graphics cards are very much needed today.  With the continued improvement in computer gaming software, the continued increase in both the resolution and size of monitor displays, and the movement toward high-definition video, the graphics card has become a critical part of the computer system.

From the start, graphics cards have plugged into slots on the motherboard, with the card's monitor connectors coming out the back of the computer.  In the beginning, graphics cards plugged into the shared PCI bus.  Technology soon began to push the limits of the shared PCI interface, and a new interface came along called AGP (Accelerated Graphics Port), which provided a dedicated path to the CPU.  There were several versions of AGP with different voltage requirements, and you had to be careful which version your motherboard could use.  Because of the confusion, and the need for even higher transfer rates, a new interface was introduced starting in 2004 that all modern graphics cards use: PCI Express, also written PCIe or PCI-E depending on the spec sheet.  They're all the same thing.  To give you an idea of the technology improvements: the common 32-bit PCI bus topped out around 133 MB/s shared among all the cards on it, AGP started at 266 MB/s and reached about 2 GB/s at AGP 8x, and a PCI Express x16 slot gives the graphics card about 4 GB/s in each direction, all to itself.
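If you're curious where that PCI Express figure comes from, the math is simple: each first-generation PCIe lane carries 2.5 gigatransfers per second, and the 8b/10b encoding on the wire means only 8 of every 10 bits are actual data.  Here's a quick back-of-the-envelope sketch in Python (first-generation numbers; later PCIe generations are faster):

```
# Back-of-the-envelope PCIe 1.0 bandwidth: 2.5 GT/s per lane,
# with 8b/10b encoding (8 data bits for every 10 bits on the wire).
TRANSFERS_PER_LANE = 2.5e9    # raw transfers per second, per lane
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b line coding overhead

def pcie1_bandwidth_mb_per_s(lanes):
    """Usable bandwidth per direction, in MB/s, for a PCIe 1.0 link."""
    data_bits_per_second = TRANSFERS_PER_LANE * ENCODING_EFFICIENCY * lanes
    return data_bits_per_second / 8 / 1e6  # bits -> bytes -> megabytes

for lanes in (1, 4, 16):
    print(f"x{lanes}: {pcie1_bandwidth_mb_per_s(lanes):.0f} MB/s")
# prints: x1: 250 MB/s, x4: 1000 MB/s, x16: 4000 MB/s
```

A graphics card slot is x16, which is where the 4 GB/s figure comes from.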

Graphics cards work by building an image of what you will see on the screen in the graphics card's own memory, then transferring that image to your monitor.  As games got faster, more graphics card memory was needed so images could be built in the background: while one image is being displayed on the monitor, the next image is already being drawn in the card's memory.  High-definition video and realistic 3D computer games have progressed to the point that some setups use two physical graphics cards, each drawing alternate frames, to keep your display as smooth as your television.
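That draw-one-while-showing-one trick is known as double buffering.  Here's a minimal sketch of the idea in Python; the Framebuffer class and the draw function are made-up stand-ins for what the card and its driver actually do in hardware:

```
class Framebuffer:
    """A made-up stand-in for a block of video memory holding one screen image."""
    def __init__(self, width, height):
        self.pixels = bytearray(width * height * 4)  # 4 bytes per pixel (RGBA)

def render_loop(draw_frame, frames=60):
    front = Framebuffer(1920, 1080)  # the image currently sent to the monitor
    back = Framebuffer(1920, 1080)   # the next image, drawn off-screen
    for _ in range(frames):
        draw_frame(back)              # build the next image while the current one shows
        front, back = back, front     # "flip": the finished image becomes visible

# Example usage: a trivial draw function that just clears the buffer.
def clear(fb):
    fb.pixels[:] = bytes(len(fb.pixels))

render_loop(clear)
```

Dual-card setups extend the same idea across hardware: one card draws the odd frames while the other draws the even ones.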

How do you pick a graphics card?  Well, you have three choices.  Intel has been pushing its graphics chips to be included with the motherboard, so that manufacturers like Dell and HP would not need to provide a graphics card with their computers.  This resulted in great savings for the manufacturers, and the technology is still pushed today.  Buyer beware: I view this as a step back, not forward.  Why?  These chips do not have their own graphics memory; they use the computer's main memory to draw their images.  Your main memory is already being used for all your other computer operations, so sharing it with graphics can slow down your overall system performance, and it forces you to load up on memory if you want decent performance.  My advice: you're building your own computer, so don't get a motherboard with a built-in graphics chip that runs off main memory and hampers your overall performance.  Buy a separate graphics card.
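To put rough numbers on the cost: a single full-screen image is small, but the chipset typically carves out a much larger chunk of your RAM for buffers and textures.  The 256 MB figure below is just an illustration; the actual amount reserved varies by chipset and BIOS settings.

```
# Rough arithmetic on what shared graphics memory costs you. The 256 MB
# reservation is an illustrative assumption, not any particular chipset's spec.
width, height = 1920, 1080
bytes_per_pixel = 4                     # 32-bit color

framebuffer_mb = width * height * bytes_per_pixel / 2**20
print(f"One full-screen buffer: {framebuffer_mb:.1f} MB")   # ~7.9 MB

total_ram_mb = 2048                     # a 2 GB system
reserved_for_graphics_mb = 256          # hypothetical carve-out for the chip
print(f"RAM left for Windows and your programs: "
      f"{total_ram_mb - reserved_for_graphics_mb} MB")
```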

That was choice one; back to the other two.  Two companies design and make the graphics chips: ATI and NVIDIA.  Their main business is not selling graphics cards but manufacturing the chips; other vendors sell the cards built around them.  Most people I know are either ATI or NVIDIA users and don't switch back and forth.  I personally have used both, but currently favor NVIDIA.  Both companies used to be independent, but in 2006 AMD, Intel's competitor, purchased ATI.  What this means is that AMD can now compete with Intel's built-in graphics chipsets.  Microsoft just announced it will use AMD/ATI graphics for its next-generation Xbox, so the merger is paying off.

You will choose one or the other, either ATI or NVIDIA, and this will affect which motherboard you purchase, as we'll see later.  One other wrinkle is the option of using two graphics cards instead of one.  I do not recommend this unless you are a state-of-the-art computer gamer; one card is usually enough.  ATI calls its dual graphics card setup "CrossFire," and NVIDIA calls its "SLI."  You'll see motherboards and power supplies that claim to be either "CrossFire-compliant" or "SLI-ready."  There are pluses and minuses to each, so if you decide to go the two-card route, I recommend you research the differences first.


Since the graphics card plugs into the motherboard, you will have to pick a motherboard that fits the graphics card, or vice versa, pick a graphics card that fits the motherboard.  For example, if you want two graphics cards, your motherboard must have two PCIe x16 slots spaced the proper distance apart.  More on this when I talk about motherboards.

My recommendation is to get one of the latest graphics cards that fits your budget, produced by a graphics card manufacturer like XFX, PNY, or EVGA.  You can get two-card compatibility, i.e. an SLI-ready board, and buy just one card now with the idea of picking up the second later, but do some reading first if you want to go that route.  You also want a good chunk of memory on the card.  Usually, the higher the model number of the card, the better the performance, and the higher the price.  Just like computer chips.

There is an exception to this.  NVIDIA recently reached their 9800 GX2 and, rather than continue into five-digit model numbers for their next products, decided to start their numbering over with the 200 series.  Their latest is the GeForce GTX 295.
