The Creation of a Benchmark
Added on: Wed Nov 21 2001
The creation of a world begins with an idea. This idea, tempered by creativity, takes form in a sketch. Sketches become drawings, which eventually cross over into the digital realm as splines and polygons. These basic building blocks form the models, creatures, and worlds born of our original ideas. Far from simple curves and triangles, these structures grow in complexity beyond the capabilities of most machines, requiring impressive amounts of horsepower just to be manipulated in the most basic of ways. But how do we characterize these tools?
We cannot look at them superficially, as one might an easel or a pottery wheel. Nor can we compare them by their entertainment value. One must look beyond the obvious and test the components as one would test a brush: by taking hold and painting upon the canvas.
Discreet 3ds max 4 as a Benchmarking Tool
This canvas happens to be Discreet's 3D Studio Max R4. Standing strong in the computer industry, Max has found its place among thousands of artists in film, TV, and game production. The latest release adds a variety of features, including subdivision surfaces, ActiveShade, multitexturing, and Direct3D and hardware shader support. With the addition of subdivision surfaces, viewport power becomes even more important. Having already addressed rendering performance, today we embark on a new set of benchmarks meant to give the end user realistic performance expectations within the program itself.
Recording Viewport Performance
There are a variety of ways to record viewport performance. The first, and most obvious, is to record a generic performance number by running a series of tests through a recording utility such as MaxScript. This approach is unfortunately flawed by the very nature of MaxScript: under certain circumstances, over a variety of tests, MaxScript will begin to cache or load Max scenes in a different manner, producing unreliable results that cannot be reproduced statistically. The second method, born of the popular gaming community, is frames per second.
This method isn't a reliable source of initial data either, as the recording mechanism in Max (the ShowFPS=1 setting) is erratic at best. Generating an accurate performance curve requires more information than frames per second alone.
The final method, and possibly the most labor-intensive, is recording time-to-completion data. This is the method I will use to categorize a variety of accelerators, from low-end budget models to top-tier professional powerhouses.
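To make the idea concrete, here is a minimal MaxScript sketch of a time-to-completion measurement. It is an illustration, not the exact harness used in these tests, and it assumes a test scene is already loaded; timeStamp(), sliderTime, and redrawViews() are standard MaxScript facilities.

-- Sketch: time a full animation pass through the viewport.
-- timeStamp() returns elapsed system time in milliseconds.
startTime = timeStamp()
for t = animationRange.start to animationRange.end do
(
    sliderTime = t   -- advance the time slider one frame
    redrawViews()    -- force the active viewports to redraw
)
format "Time to completion: % seconds\n" ((timeStamp() - startTime) / 1000.0)

Because the loop finishes only after every frame has actually been drawn, the elapsed time reflects the entire workload rather than an instantaneous frame rate, which is what makes the results reproducible across runs.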