Workstation Smackdown (Nvidia Quadro DCC review) Added on: Mon Feb 04 2002
Nvidia carpet bombs the competition with the Quadro DCC
Workstations have been known to be rather expensive at times. A few years ago you could pick up a whopping 128 megs of RAM in a supremely powerful workstation running at a ridiculous two hundred megahertz, in a DUAL configuration (Wow! Dual two-hundred-megahertz processors! What will I ever do with that much power?), for a paltry four to six thousand dollars. Up the price to seven or eight thousand if you wanted a professional-level OpenGL accelerator.
In some cases the accelerator actually had almost twenty-four megs of combined video memory! With all this horsepower available, would we ever realistically need to upgrade? We didn't think so. For a few months we paraded our super computers around the neighborhood, flaunting our insanely expensive metal boxes. Over time the systems evolved, taking on slower software with increasingly faster and more expensive hardware.
Then suddenly it all changed - the competition had arrived.
Nvidia bakes "Special" Cookies
Nvidia broke into the professional market with speed and precision. In the age of huge video cards with multiple chips and processors, Nvidia broke the pack in half by providing a powerful single-chip solution: their GPU (Graphics Processing Unit, also known as "hell of a fast chip").
Nvidia's GPU moved geometry, transforms, and lighting off the main system processor and into the video hardware itself. Though certainly not a new feat in the video card market (other companies had onboard transform and lighting years before), this was entirely new to the gaming community.
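To make the idea concrete, here is a minimal sketch, not taken from the review, of the kind of fixed-function OpenGL 1.x state a 3D application sets up (GLUT is assumed only for windowing). On a hardware T&L part like the GeForce or Quadro, the matrix transforms and per-vertex lighting configured below are evaluated by the graphics chip rather than the host CPU.

/* fixed-function transform & lighting sketch (C, OpenGL 1.x + GLUT) */
#include <GL/glut.h>

static void draw(void)
{
    /* directional light (w = 0) */
    static const GLfloat light_pos[] = { 1.0f, 1.0f, 2.0f, 0.0f };

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* Lighting: with hardware T&L the per-vertex lighting equation
       is evaluated on the graphics chip. */
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_pos);

    /* Transforms: the modelview matrix below is applied to every
       vertex by the card's geometry engine, not the host CPU. */
    glTranslatef(0.0f, 0.0f, -5.0f);
    glRotatef(30.0f, 0.0f, 1.0f, 0.0f);

    glutSolidTeapot(1.0);    /* submit geometry; T&L happens on-card */
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("fixed-function T&L sketch");

    glEnable(GL_DEPTH_TEST);

    /* Perspective projection; also applied per vertex in the card's
       transform stage. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(45.0, 640.0 / 480.0, 1.0, 50.0);

    glutDisplayFunc(draw);
    glutMainLoop();
    return 0;
}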
I mention the gaming community because that is the market the original Nvidia line was designed to compete in. The first GPU-enabled card, the Geforce SDR, was designed to defeat Nvidia's rival, 3dfx, in a duel to the death. As you well know, 3dfx was defeated and Nvidia moved on.
It wasn't long before a poor (aren't we all?) 3D user loaded up his favorite program, trying to put together a group of polygons to represent his favorite Playboy bunny (or favorite Enterprise Vulcan... no, NOT Mr. Spock).
The community was shocked upon hearing that a "gaming card" was actually competing against some top-tier video cards. Users threw up their arms in disgust, stating, "There is no way this Geforce card beats my Ultrafrog 3123ZX." Other users were quick to point out that the cards were buggy and had many driver issues and problems. (Viewport redraw problems, anyone?) But the facts were still there: the card was fast and worth noticing.
The competition, however, was quick to discredit the card, all but ignoring its abilities. Nvidia noticed the power they had at their disposal and quickly began reworking their drivers, unifying them, and developing a professional line of their flagship accelerator.
The first Quadro struck the market like a sledgehammer. (No pun intended toward AMD, of course.) With a unified back buffer, overlay support, and rewritten drivers, Nvidia was able to produce an inexpensive and very speedy card. Because of Nvidia's six-month product cycle, their new cards entered the market just as the competition was answering the initial onslaught. The Quadro 2 Pro replaced the initial Quadro, followed by the Quadro DCC, the latest model, based on the Geforce 3 chipset.
But what is this new accelerator from Nvidia? What new features does it support? How does it perform? And how does it compare to the other cards in Nvidia's line? These are the questions we will be addressing in this article, starting with the core behind the heatsink: the Nvidia GPU.