AMD Bids Farewell To CrossFire After 12 Years, Retiring Brand In Favor Of mGPU

You don't need quad-GPU support for mining. All that's needed is Windows detecting all the hardware; the mining program takes it from there.


As for the rest, it's more that there aren't enough users with setups like this for developers to pay much attention to them, which I suppose is only natural given the cost, power draw and cooling such setups involve, even when not using the absolute fastest cards.


Though I still say multi-GPU will make a comeback once we can't shrink processors any further, and we're getting there: only the 10 nm and 7 nm nodes remain before quantum physics ruins it all with current leakage. There'll be no choice but to go multi-GPU once that point is reached, within the next five years.


When I said "quad-GPU" I was being a bit ironic. Crypto mining doesn't use CrossFire or quad-fire at all; it's something different.
AMD and Nvidia can't scale their GPUs well enough for the performance to be worth the money. Quad-GPU scaling is volatile and inconsistent, with plenty of technical anomalies to solve. I suppose it's a hugely time-consuming task for a driver team to handle quad-fire properly, and the results are disappointing given the sheer cost of the configuration and the engineering time invested in optimizing and fixing things.
IMO, don't count on a comeback in the same form as quad-fire or quad-SLI. It may come back in a very different coat than you imagine. A return of mGPU within five years would just be a desperate intermediary step.
The future is actually going quantum...
 


In DX12, multi-GPU has nothing to do with driver optimizations on AMD's or Nvidia's end to begin with; it's exclusively up to game developers to leverage the hardware. And they can use the one strong point of a GPU that beats any CPU by a mile: its compute ability, for super-realistic physics and artificial intelligence. GPUs don't have to be leveraged for graphics exclusively. To put their math power into perspective, there's the latest review of the 18-core Intel Core i9-7980XE, which according to AnandTech is the first CPU to reach 1 teraflop across its 18 cores.


The precision that suffices for either workload (16-bit half precision) lets a Vega hit 25 teraflops, basically 25x the throughput of that 18-core CPU. That would make for super-realistically behaving physics, and NPCs smarter than the user playing the game, with just one of the GPUs being leveraged.
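The 25-teraflop figure follows from the usual peak-throughput estimate: shader count times clock times two FLOPs per cycle (one fused multiply-add), doubled again for packed FP16. A rough sketch, assuming Vega 64's published specs (4096 stream processors, ~1.5 GHz boost) rather than measured numbers:

```python
# Back-of-envelope peak throughput for Vega 64 (specs assumed from
# public data, not measured): 4096 stream processors at ~1.5 GHz,
# one FMA (2 FLOPs) per cycle, and double-rate packed FP16 math.
shaders = 4096
clock_ghz = 1.5
fp32_tflops = shaders * clock_ghz * 2 / 1000   # FMA = 2 FLOPs/cycle
fp16_tflops = fp32_tflops * 2                  # packed half precision

print(round(fp32_tflops, 1))  # 12.3 TFLOPS FP32
print(round(fp16_tflops, 1))  # 24.6 TFLOPS FP16, vs ~1 TFLOP for the 18-core CPU
```

These are theoretical peaks; real workloads land well below them, but the gap to the CPU is still over an order of magnitude.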
 
https://www.techpowerup.com/237315/amd-phasing-out-crossfire-brand-with-dx-12-adoption-favors-mgpu

There is a graph there. Except for four games, the scaling is not really encouraging.


How many of those are using DX12, or explicit multi-GPU rendering? None. Or leveraging the GPUs for the physics and AI uses I mentioned? None.


Even my 3+ year old GPUs can still crank out 6 teraflops of single-precision math each, 6x higher than an 18-core chip costing $2,000 that just launched.
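The same peak-FLOPS formula bears that out for a Hawaii card. Assuming 290X/390X-class public specs (2816 stream processors at roughly 1.05 GHz; the poster's exact cards aren't stated):

```python
# Same estimate for a Hawaii GPU (290X/390X-class specs assumed
# from public data: 2816 stream processors, ~1.05 GHz).
shaders = 2816
clock_ghz = 1.05
fp32_tflops = shaders * clock_ghz * 2 / 1000  # FMA = 2 FLOPs/cycle

print(round(fp32_tflops, 1))  # 5.9 TFLOPS FP32 per card, ~6x a 1-TFLOP CPU
```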


All those charts show is just how far behind the times most game software really is, nothing more.
 
See, pretty much all those games start off as console games, and then a PC version is made that adds some extra eye candy like higher-quality textures and, in the case of multiplayer-oriented games, the means to handle more players online. But they are what they are: console ports.


In no way do they really use the much higher levels of graphics power, math power or memory that a PC can have, or create a game aimed squarely at PCs that no console has the grunt to run.
 
A Vega 64 and a Vega 56 in CF

mixed cfx :nuts:

They just got CF working with Vega; maybe it's a little early to be trying mixed setups.


It is at the very least astounding when I think about it (which I try my best not to do): those two spare Hawaii GPUs I own have roughly the same FP32 single-precision math power in their shader arrays as a single Vega GPU, and slightly more than the GP102 in a 1080 Ti, at a combined 12+ teraflops. And they're just sitting there with their proverbial thumbs up their a** since they're not being used.


And they're old, outdated, run out of steam, shown their age... take your pick.


A pair of Vegas or Pascals would be extremely powerful without a doubt, but nothing shakes the feeling that I retired those Hawaiis early, having only seen their full power in synthetic benchmarks like 3DMark, never in a game with all four in use.
 