ATI Crossfire - X850 Crossfire Edition and Platform Basics

 
 
Silverstrand
Senior Member
Join Date: Jun 2005
Location: Wisconsin
Posts: 1,953
09-27-2005
It's here, finally, but was it worth waiting for?

Quote: Crossfire was always set to launch before the next generation of ATI graphics processors, and with those launching in a week or so, it was time for Crossfire systems to make their way to folks like me for full evaluation. Were ATI's claims for the technology preview correct? How would integrating a Crossfire system work for the end-user? What's the Crossfire platform like as a whole?
Link: http://www.hexus.net/content/reviews...lld19JRD0xNjUx
 
unholy
Senior Member
Join Date: Jun 2005
Location: Australia
Posts: 3,928
09-27-2005
YAH, CROSSFIRE! I can't wait to see if Mike is wrong.
 
The Modfather
Senior Member
Join Date: Sep 2005
Posts: 2,454
09-27-2005
I... Am never... Wrong...
From Maximum PC...
Head to head, the 7800 is the best there is, hands down.
Quote:
It’s no contest: If you want the absolute fastest videocard, and money is no object, you want a board—or two—powered by nVidia’s GeForce 7800 GTX. That’s precisely why we put a pair of reference-design boards in this year’s Dream Machine. After examining one of the first retail boards powered by this GPU, we’re even more certain of our choice.
XFX managed to push the 7800 beyond nVidia’s reference design by clocking this board’s graphics core at 450MHz and its memory at 625MHz (compared with the reference design’s 430- and 600MHz clocks, respectively). Despite the high clocks, the board remains outfitted with a single-slot cooling solution, thanks to the 7800’s smaller, cooler process size (110 nanometers, compared with the 6800’s 130nm) and the new chip’s improved power management—unused portions of the chip are automatically turned off.
If you just dropped $800 on a 512MB GeForce 6800 Ultra board (or $450 for a 512MB Radeon X800 XL, for that matter) because you thought more graphics memory was going to be the next big thing, we feel your pain. Both nVidia's reference design and this $600 XFX board are outfitted with only 256MB of 256-bit DDR3 memory. It's going to be some time before game developers even begin thinking about tapping into 512MB of graphics memory. The cost of 512MB boards is just too prohibitive for most consumers.
The XFX board has dual DVI ports and a VIVO (video-in/video-out) port on the mounting bracket. The company includes two DVI-to-VGA adapters in the box, plus a breakout cable for analog video (composite, S-Video, and component) for those interested in editing analog video or using a television as a display. XFX even throws a free T-shirt and an XGear force-feedback gamepad into the package.
But let’s face it: The 7800 GTX is the real star of this show, and nVidia can claim legitimate bragging rights for designing the most powerful mainstream graphics chip to date. Equipped with 302 million transistors and 24 pixel pipelines (compared with 16 pipes on the 6800 and X850), this card chewed through our benchmarks like a beaver through birch.
With resolutions cranked up to 1600x1200 and with 4x antialiasing enabled, a single 7800 GTX delivered performance increases ranging from 30 to 63 percent over a single GeForce 6800 Ultra. Doom 3’s benchmark performance increased the least, moving from 43.5fps on the 6800 Ultra to 56.7fps on the 7800 GTX. The biggest increase occurred with 3DMark03’s Game 4, which jumped from 33.6fps to 54.9fps. Refer to the benchmark chart for the full scoop.
Running two 7800 GTXs in SLI yielded even more impressive results, including a 122 percent frame-rate increase in Far Cry when compared with a 6800 Ultra (137.7fps, compared with 62fps). On some games, however, running two of these cards in SLI moves the performance bottleneck from the GPU to the CPU. When we throttled our test platform's CPU back from 2.6GHz to 1.8GHz, Doom 3 performance on a single 7800 GTX decreased only one frame per second. When we performed this same test in SLI, frame rates dropped from 86.3fps to 71.4fps, indicating that the dual GPUs were left tapping their feet as they waited for the CPU to catch up.
Our test bed is outfitted with an Athlon FX-55 processor. If you’re running a slower processor, it might make more sense to upgrade your CPU before you buy more than one 7800 GTX. The flipside of the equation, of course, is that a single 7800 GTX will deliver a considerable performance boost even if you are running a slower CPU. The downside of the equation is the price tag: A single XFX GeForce 7800 GTX costs $600; double that for an SLI configuration. Is it worth it? Well, let’s put it this way: We didn’t let the high price stop us from rating it Kick Ass.
—Michael Brown
+ HEADSHOT: Enables you to crank up the resolution in the most demanding games.
- GUTSHOT: Pricey; CPU becomes a bottleneck in SLI mode.
Month reviewed: September 2005
Verdict: 10 Kick Ass
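As a quick sanity check on the quoted gains (using only the fps figures from the excerpt above; the helper function and labels are just for illustration), here is a minimal Python sketch that recomputes the percentage increases Maximum PC cites:

[code]
# fps figures taken straight from the Maximum PC excerpt above
def pct_increase(old_fps: float, new_fps: float) -> float:
    """Return how much faster new_fps is than old_fps, in percent."""
    return (new_fps - old_fps) / old_fps * 100.0

benchmarks = {
    "Doom 3, 6800 Ultra -> 7800 GTX": (43.5, 56.7),           # quoted low end of the range
    "3DMark03 Game 4, 6800 Ultra -> 7800 GTX": (33.6, 54.9),  # quoted high end of the range
    "Far Cry, 6800 Ultra -> 7800 GTX SLI": (62.0, 137.7),     # quoted SLI gain
}

for name, (old, new) in benchmarks.items():
    print(f"{name}: {pct_increase(old, new):.1f}% faster")
[/code]

This prints roughly 30%, 63%, and 122%, which lines up with the 30-to-63 percent single-card range and the 122 percent SLI figure quoted in the review.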
 
unholy
Senior Member
Join Date: Jun 2005
Location: Australia
Posts: 3,928
09-27-2005
That proves nothing, except that 7800 GTXs are sweet, like they always will be. But who says Crossfire won't be a good contender?
I always knew that more video RAM didn't mean much; it was like the old 64MB versus 128MB cards.
 
The Modfather
Senior Member
Join Date: Sep 2005
Posts: 2,454
09-28-2005
It proves nothing? It proves that nVidia's top board is miles beyond ATI's top board, and in SLI (or Crossfire, if you prefer) nVidia's two cards will still be miles beyond ATI's two boards. It's the most basic of math here, 1+1, that's all.
And more video RAM doesn't mean quite so much at the moment, but go ahead and try to run BF2 on a 64MB card and see how far you get. The future isn't going to wait just because you don't believe in it, ya know?
 
unholy
Senior Member
Join Date: Jun 2005
Location: Australia
Posts: 3,928
09-28-2005
I ran BF2 on a 9200 SE... it crashed.
We haven't seen ATi's performance yet. It's like Intel vs AMD: Intel was always the best in the beginning, but look how far AMD have come; they have dominated the market. I'm not saying ATi are going to overturn nVidia, but they are the better value for performance. I have only ever purchased ATi cards, and the only problem I have is keeping a card new enough... as in, I had a 9200 SE for like two years, and when I upgraded I was like "OMFG THIS IS FAST!!!! I can render faster than 30 fps!!!"
But nVidia seem to have the best choice for performance, though it's hardly worth running them in SLI. I'm talking about the 7800s; ATi is yet to release their new chip... we will have to see then.
I'm not taking sides, I play fair in the game, but when it's Intel vs AMD, I always say AMD.
If you want a good graphics card, it's whatever you get at the price. Sometimes nVidia, sometimes ATi.
 