My mistake, it was only a 3-way SLI setup making its debut from Nvidia. The game was running at 1920 x 1080 with all settings maxed. While the game did run well, there were moments where there was too much going on and the framerate dropped into the teens. Those might not be the final drivers, but I'm quite sure that setup could easily outperform a similar system with just one card on the same settings.
http://images.tomshardware.com/2007/11/ ... sgraph.jpg
An interesting graph showing the quad-core workload while playing Crysis. Note that only a single core is maxed out at any given moment. I honestly don't know if it's more efficient performance-wise for the cores to trade off tasks, or whether you'd want to dedicate a core to physics, another to AI, and stuff like that (a rough sketch of the dedicated-core idea is below)... although technically a graphics card would be better suited for physics calculations.
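For anyone curious, here's a minimal sketch of what the dedicated-core approach could look like on Windows, pinning a physics worker to its own core with SetThreadAffinityMask. This is purely an illustration under my own assumptions, not anything from CryEngine; the PhysicsLoop name and the core assignments are made up for the example.

```cpp
#include <windows.h>
#include <process.h>
#include <cstdio>

// Hypothetical physics worker: a real engine would step the
// simulation each frame; here it just spins a fixed number of ticks.
unsigned __stdcall PhysicsLoop(void*) {
    for (int tick = 0; tick < 1000; ++tick) {
        // ... integrate rigid bodies, resolve collisions ...
    }
    return 0;
}

int main() {
    // Start the worker suspended so we can set its affinity
    // before it ever runs.
    HANDLE physicsThread = (HANDLE)_beginthreadex(
        NULL, 0, PhysicsLoop, NULL, CREATE_SUSPENDED, NULL);

    // Affinity masks are bit fields of logical cores:
    // 0x1 = core 0, 0x2 = core 1, and so on.
    SetThreadAffinityMask(GetCurrentThread(), 0x1); // main/render on core 0
    SetThreadAffinityMask(physicsThread, 0x2);      // physics on core 1
    ResumeThread(physicsThread);

    WaitForSingleObject(physicsThread, INFINITE);
    CloseHandle(physicsThread);
    printf("physics worker finished\n");
    return 0;
}
```

The catch with hard pinning is that a dedicated core just sits idle whenever its one job is light, which might be why that graph looks like the work trades off between cores instead.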
In response to SMer: it's gonna be a long time, but eventually it might get to that quality in real time. I'd think the true advancements are gonna be in displays and the way you view digital media in the future. Sure, we could develop a way to render that in real time, but what's the fun of that if it's going to be cramped onto an LCD?
Photo/video realism can only take you so far; it will eventually get old, and game developers will be sick of putting such an insane amount of detail into characters, not to mention the time it takes to make stuff like that. That's why TF2 is so amazingly awesome: because it's not trying to be photoreal.