
Kez writes "I'm sure many Slashdot readers fondly remember the era of 3dfx. SLI'd Voodoo 2's were a force to be reckoned with. Sadly, that era ended a long time ago (although somebody has managed to get Doom III to play on a pair of Voodoo 2's). However, Nvidia have revived SLI with their GeForce 6 cards. SLI works differently this time around, but the basic concept of using two cards to get the rendering work done is the same. [The review] has taken a look at how the new SLI works, how to set it up (and how not to), along with benchmarks using both of the rendering modes available in the new SLI." And reader Oh'Boy writes "VIA on its latest press tour stopped by and visited the UK, and TrustedReviews have some new information on VIA's latest chipsets for AMD Athlon 64, the K8T890 and the K8T890 Pro, which supports DualGFX. But what has emerged is that DualGFX after all doesn't support SLI, at least not for the time being, since it seems nVidia has somehow managed to lock out other manufacturers' chipsets from working properly with SLI. VIA did, on the other hand, have two ATI cards up and running, although not in SLI mode."
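The summary doesn't name the two rendering modes, but GeForce 6-era SLI shipped with split-frame rendering (each card draws a slice of every frame) and alternate-frame rendering (the cards take turns on whole frames). Below is a minimal toy sketch of the two work-splitting strategies; the function names and the fixed 50/50 split are illustrative assumptions, not NVIDIA's actual driver logic (real SFR rebalances the split point dynamically).

```python
# Toy illustration of the two SLI work-splitting strategies.
# Hypothetical names; not NVIDIA driver code. Real SFR load-balances the
# split point dynamically rather than using a fixed 50/50 cut.

def split_frame(scanlines, gpus=2):
    """SFR-style split: each GPU renders a horizontal slice of one frame."""
    slice_height = scanlines // gpus
    return {gpu: (gpu * slice_height,
                  scanlines if gpu == gpus - 1 else (gpu + 1) * slice_height)
            for gpu in range(gpus)}

def alternate_frame(frame_id, gpus=2):
    """AFR-style split: whole frames are handed out round-robin."""
    return frame_id % gpus

if __name__ == "__main__":
    print(split_frame(1200))                       # {0: (0, 600), 1: (600, 1200)}
    print([alternate_frame(f) for f in range(4)])  # [0, 1, 0, 1]
```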
Dual video cards, and soon dual-core CPUs: is it a sign that we're slowly approaching the Moore's Law limit? The 'dual' strategy allows for further performance gains.

Many people assume Moore's Law states that the speed of processors will double every 18 months, and that the fact that it is becoming difficult to increase clock speed now means that Moore's Law is finished. "Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue." However, increasing speed is a consequence of higher clock speeds and higher transistor counts. Dual cores mean you can keep increasing the number of transistors per IC and actually use them to do real work, rather than simply adding a huge cache (as was done with the latest Itanic). End result: more speed, a higher transistor count, and Moore's Law still fine. In fact, dual cores could mean that the transistor count increases at greater than Moore's Law rates in the near term. Of course, some might question whether a siamesed pair of processors actually constitutes a single IC.
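To make the transistor-count reading of Moore's Law concrete, here is a small sketch of the compounding arithmetic; the baseline count and the roughly two-year doubling period are illustrative assumptions, not figures from the comments above.

```python
# Transistors per IC under an assumed doubling period. Moore's observation
# is about transistor count, not clock speed; the baseline and period here
# are illustrative assumptions.

def transistors(start_count, years, doubling_period_years=2.0):
    """Exponential growth: the count doubles once per doubling period."""
    return start_count * 2 ** (years / doubling_period_years)

if __name__ == "__main__":
    start = 50e6  # hypothetical ~50 million transistor single-core CPU
    for years in (2, 4, 6):
        print(f"after {years} years: {transistors(start, years):,.0f} transistors")
    # A dual-core built from two such cores lands near 100 million transistors
    # right away, which is the "greater than Moore's Law in the near term" point.
    print(f"dual-core today: {2 * start:,.0f} transistors")
```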
You obviously don't have much imagination if you can't think of a use for more than two video cards/monitors. As a lover of flight sims, I'll be first in line to buy a motherboard that can support 10 video cards; along with an array of cheap monitors, I will finally have a wrap-around view of the sim world.

First-person shooters could finally have peripheral vision (one center screen and two on the sides) along with an inventory screen and a map screen, which brings the grand total to five. Driving games could finally have a true perspective instead of the stupid third-person or 1/3-screen in-car view. This can apply easily to any game: RTS resource monitors, sat view, and ground maps. Same for massively multiplayer online games; that could become quite the array depending on how much you wanted covered. I could see a use, without trying hard, that would require at least six monitors. There are always those twisted monkeys who come up with graphics that won't run on any one GPU these days, for example those lovely to-the-horizon maps that show up in various games and add about 100 meters of high detail every year; you could double, triple, or even quadruple up on the number of cards behind any one monitor that needs higher-end graphics. I can also see people boosting their system's performance by picking up cheaper versions of cards they already own, keeping their graphics improving without breaking the bank.

Who could afford all this, you ask? Well, just about anyone these days. (We can all remember when GeForce 2 cards cost $400 each; that'll buy you 50 of them now.) With the advent of sub-$100 video cards and CRT monitors, and the fact that not every output would have to be super high-res, peripheral views, 2D maps, and inventory lists would be just fine on something equivalent to a GeForce 4 MX ($32 new). I've got a stack of 17-inch CRT monitors in the garage that I picked up for $5 apiece, just begging to be used.
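As a rough tally of the setups described above, using only the figures given in the comment ($5 CRTs, a $32 GeForce 4 MX for the low-detail outputs, five screens for the FPS case); which outputs count as low-detail in each scenario is my own guess.

```python
# Back-of-the-envelope tally for the multi-monitor setups described above.
# Prices come from the comment; the high/low-detail split per scenario is
# an illustrative assumption.

CRT_PRICE = 5         # 17-inch garage CRT
CHEAP_GPU_PRICE = 32  # GeForce 4 MX for maps, inventory, peripheral views

SCENARIOS = {
    # name: (high-detail outputs, low-detail outputs)
    "FPS: center + 2 peripheral + map + inventory": (3, 2),
    "flight sim wrap-around": (5, 1),
    "RTS: ground map + sat view + resources": (1, 2),
}

for name, (hi, lo) in SCENARIOS.items():
    low_cost = lo * (CRT_PRICE + CHEAP_GPU_PRICE)
    print(f"{name}: {hi + lo} monitors, "
          f"about ${low_cost} for the low-detail outputs "
          f"(high-detail cards and screens priced separately)")
```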
