AMD RX Vega 64/56 Review Thread

WINDOWS will only see one GPU

The whole point of HBM is MCM-GPUs, as MCM-GPUs need on-die memory.

Although, depending on how it is exposed and the associated market share, LordHawkwind has reason to be sceptical about this.

To quote myself from another thread:
Right, so when it comes to multiple GPUs on a single card, it depends very much on how they are arranged.

1) Two GPUs each with their own set of memory.
2) Two GPUs sharing memory.

Case 1 is the same as SLI/CFX now, just with a potentially faster bus between the two GPUs; data is still "local" to one GPU or the other. This would be like NUMA nodes for CPUs: sure, you might be able to access the other GPU's data 'directly', but you have to go via the other chip's memory controller and then bounce across the bus before hitting your own caches. In this setup, for the new APIs, you still want to see two physical devices so that you can coordinate the copying between the memory "banks" associated with each GPU to get the data "local" for the real work (much like now, this would be handled by a dedicated copy/DMA queue, so no performance loss on ALU ops for normal GPU stuff). Removing this visibility returns you to the world of driver profiles to make things fast, which isn't easy when you aren't allowed to spin up driver threads and use them to inject extra commands behind the scenes; there might be some scope with Vulkan's Pass/Subpass system for detecting when a resource was last used and injecting a copy, but that strays back into the land of unpredictable driver magic.
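
(To make the "dedicated copy/DMA queue" point concrete, here's a minimal Vulkan sketch of what the two-visible-devices model looks like from the application side: enumerate both GPUs and find a transfer-only queue family on each - those map to the DMA engines on current hardware, so copies recorded there (vkCmdCopyBuffer etc.) never steal ALU time from the graphics queues. Instance setup omitted; nothing here is Vega-specific.)

// Minimal sketch (Vulkan): locate the transfer-only queue family on
// each visible GPU - the dedicated DMA engines - so that cross-GPU
// copies to make data "local" don't occupy the ALUs.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

// Returns a queue family index supporting transfer but not
// graphics/compute, or -1 if the device has no dedicated copy queue.
int FindDedicatedTransferQueue(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, nullptr);
    std::vector<VkQueueFamilyProperties> families(count);
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, families.data());
    for (uint32_t i = 0; i < count; ++i) {
        VkQueueFlags f = families[i].queueFlags;
        if ((f & VK_QUEUE_TRANSFER_BIT) &&
            !(f & (VK_QUEUE_GRAPHICS_BIT | VK_QUEUE_COMPUTE_BIT)))
            return static_cast<int>(i);
    }
    return -1;
}

void ListCopyQueues(VkInstance instance) {
    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());
    for (VkPhysicalDevice gpu : gpus) {  // two entries in Case 1
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s: transfer-only queue family %d\n",
                    props.deviceName, FindDedicatedTransferQueue(gpu));
    }
}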

Granted, this does bring other benefits with it: as the GPUs are now physically closer and wired together, it would be much easier for control logic/signals to get sent around, along with things like cache snooping - the PS4 and Xbox do this between CPU and GPU.

Fundamentally, however, you have two GPUs with two separate memory pools which, for optimal performance, want their data "local", just like today.

Case 2, on the other hand, removes the need to copy things around, as data could be shared via the interconnecting bus, assuming the layout is something like:
[GPU][GPU]
[interface layer]
[memory]

So everyone shares the same memory pool, but that brings other issues with it.

Firstly, bus contention: both GPUs are going to be hitting the same memory bus at the same time, at which point you'll start to lose bandwidth due to the extra data and signalling happening on the line - and the fall-off isn't linear. I've a vague memory of the PS4 docs saying something like 1 gig/frame of memory access by the CPU reduces the total bandwidth available to the GPU by 1.5 gig/frame, or words to that effect (at 60 fps that would be 60 GB/s of CPU traffic costing the GPU roughly 90 GB/s of its budget). More HBM stacks might reduce this problem, at the cost of more footprint.

If the two GPUs are exposed as a single physical device then the question of work balance comes up. Right now we AFR with two (or more) GPUs because of the data locality problem (combined with the 'where the **** even is the thing on the screen?!?' problem which vertex shaders bring ;) ), but if everyone is on the same memory that problem goes away, so there is no real need to AFR between the two GPUs. A more intelligent work balancing system might make sense; however, with the game only able to see a single GPU there is no way for it to balance that work itself. Instead we are back in the land of driver optimisations which, again, the new APIs make difficult - although potentially easier than with the copying issue above, as the driver could route each command buffer as required at the point it is submitted to a queue.

Of the two layouts the second is probably preferable, assuming the extra data/signalling doesn't degrade memory bandwidth too much and/or the wiring doesn't complicate things too much (assuming more stacks are added to address contention), but even then, always presenting as a single device brings drawbacks with it.

Personally I feel that both options might be a good way to go: extend the APIs slightly to let the app say it won't be doing the MGPU stuff itself, so bind them as one device and let the driver route things (defaulting to AFR); or let the app say "I know better than you, give me both devices" and let it handle things itself. (The former might even include something to hint at workload splits to help the driver out, but I'm just spit-balling here.) Until this physical setup is common most games would probably take route one anyway, but it leaves the door open for those who want to do otherwise, which is worthwhile.
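
(For what it's worth, Vulkan's device-group feature - an extension at the time of writing, later folded into core - is roughly this shape already. A sketch, assuming the driver reports the two dies as one group; group 0 and queue family 0 are picked for brevity, and real code would check the counts:)

// Sketch of the "both options" idea via Vulkan device groups: span the
// whole group with one logical device and let the driver/device masks
// route the work, or take the physical devices individually and do the
// MGPU management yourself.
#include <vulkan/vulkan.h>
#include <vector>

VkDevice CreateDevice(VkInstance instance, bool appHandlesMgpu) {
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto& g : groups)
        g = {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES};
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());
    const VkPhysicalDeviceGroupProperties& group = groups[0];

    float priority = 1.0f;
    VkDeviceQueueCreateInfo queue = {VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO};
    queue.queueFamilyIndex = 0;
    queue.queueCount = 1;
    queue.pQueuePriorities = &priority;

    // "Don't care" mode: one logical device across every die in the
    // group; the driver (or per-submit device masks) routes the work.
    VkDeviceGroupDeviceCreateInfo groupInfo =
        {VK_STRUCTURE_TYPE_DEVICE_GROUP_DEVICE_CREATE_INFO};
    groupInfo.physicalDeviceCount = group.physicalDeviceCount;
    groupInfo.pPhysicalDevices = group.physicalDevices;

    VkDeviceCreateInfo info = {VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO};
    info.pNext = appHandlesMgpu ? nullptr : &groupInfo;
    info.queueCreateInfoCount = 1;
    info.pQueueCreateInfos = &queue;

    // "I know better" mode would instead create one VkDevice per
    // group.physicalDevices[i] and schedule/copy between them itself.
    VkDevice device = VK_NULL_HANDLE;
    vkCreateDevice(group.physicalDevices[0], &info, nullptr, &device);
    return device;
}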

Some people might be wondering why you'd want two GPUs sharing the same memory anyway; why not just glue them together so that you have more ALU/cache/ROP/etc. on the same die and be done with it?

Well, there are good reasons to have separate dies/packages even in a shared memory situation and that comes down to work retirement.

When some work is submitted, the GPU allocates some internal structures (from an internal memory pool), decides which CUs are going to do the work, and then chucks the work at the CUs - this is handled by the front-end command processor. However, the command processor can only allocate so much work before it runs out of internal memory to use - this might be, say, 10 units of work - at which point it has to wait for something to finish before it can move on to the next piece of work.

The problem comes when work completes out of order: if Work Unit 2 completes before WU0, then the GPU waits for WU0 and WU1 to complete before it can reuse WU2's memory space. (This is likely done to simplify the microcode/circuits associated with this operation; dealing with out-of-order retirement would require more bookkeeping structures, for example.)
(For AMD this applies to both the Graphics Command Processor and the Asynchronous Compute Engines, the difference being the GCP has more "work in flight" space than the ACEs - NV has a single command processor.)

So if you doubled the ALU count then you'd naturally want to push more work in to keep things busy, at which point you run the risk of hitting these stalls more often as work completes in a different order. Assuming 10 WUs in flight was enough to keep a 'normal' GPU fed and happy, you might be tempted to increase that to 20, at which point potential stalls become a much bigger issue - keeping the hardware separate isolates a stall to only one location, allowing the other GPU to continue making progress (see the toy model below).
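
(A toy model of that stall, with made-up numbers: slots only retire from the head of the in-flight queue, so one slow work unit blocks reuse of every slot behind it, even the finished ones.)

// Toy model of in-order work retirement: a fixed ring of in-flight
// slots that the front end can only reclaim from the head, so a slow
// WU0 holds every later, already-finished slot hostage.
#include <cstdio>

int main() {
    const int kSlots = 10;               // in-flight budget
    bool done[kSlots] = {};
    done[2] = done[3] = done[4] = true;  // WU2..WU4 finished early

    int head = 0;                        // oldest un-retired work unit
    int reclaimed = 0;
    while (head < kSlots && done[head]) {  // strictly in-order retire
        ++head;
        ++reclaimed;
    }
    // WU0 is still running, so nothing retires: zero slots come back
    // even though three work units are complete - the front end stalls.
    std::printf("%d slots reclaimed; WU2..WU4 done but stuck\n", reclaimed);
    return 0;
}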

Ultimately it feels like the right direction to go, if you have to go MGPU, is two GPUs sat on a single block of memory - it just needs to be exposed sanely for all concerned.

(It should also be noted that CPUs don't hide their cores and locations away: an app can query core count and NUMA nodes to make the best use of the hardware. It makes sense for GPUs to allow similar querying and work distribution to get the best out of the hardware for the workload - and to include a "don't care" mode where you just want the work to get done :) )
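
(The CPU-side query is trivial today; below is the portable part, and NUMA topology takes OS-specific calls on top - GetNumaHighestNodeNumber on Windows, libnuma on Linux. The GPU-side equivalent would just be the physical-device enumeration from the earlier sketch.)

// Portable CPU topology query - the "query core count" half of the
// comparison; NUMA node layout needs OS-specific APIs on top.
#include <cstdio>
#include <thread>

int main() {
    unsigned hw = std::thread::hardware_concurrency();
    std::printf("%u hardware threads available\n", hw);
    return 0;
}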
 
In Sweden the Vega 56 (4090 SEK = 513.82 USD = 431.82 EUR) is priced the same as the very cheapest GTX 1070s.

Vega 64 seems about 500 SEK cheaper than the GTX 1080, at 5090 SEK = 639.45 USD.

Something like an Asus Vega 56 BIOS-modded to 64-ish performance could be quite the bang for the buck.
 
My Vega 56 arrived today... First impressions:
- no driver CD in the box
- no monitor-to-card cable (no HDMI or DisplayPort cable), just a power cable adapter that I'd never seen until now and don't use
- the ports and PCI Express connector were protected

I played Obduction for a bit and the cooler is loud and changes speed dynamically. It's a jet. Without headphones I hear the card's fan rather too much.
 
Uh, video cards have NEVER come with cables... those come with the monitor.

No reason nowadays to put driver discs in with the cards, since they are easily downloaded and are updated so frequently now (drivers on CD have historically been months behind the latest).
 
Well, it's my first card without a CD... :D
The package felt very sparse. I am pretty sure old reference AMD cards had more in the package... a CFX bridge is not needed anymore, DVI-to-VGA is not needed, and maybe some multimedia cables were in there.
Anyway, the biggest dilemma is how I silence the fan during gaming. This is a jet engine...
 
Video cards do come with cables; not sure why you're saying they never have. My 980 Ti came with an HDMI and a DP cable.
 
Were you planning on buying some third party cooler?
 
Uh, video cards have NEVER come with cables... those come with the monitor.

My Sapphire 5870 came with an HDMI cable.
 
Video cards do come with cables; not sure why you're saying they never have. My 980 Ti came with an HDMI and a DP cable.

I have never experienced receiving cables with a video card, even when I was in business and/or worked in retail computer hardware: not my GTX 8800s, not the GTX 970, not the GTX 1060, and none of the AMD/ATI cards. Maybe it was a vendor-specific limited program or a cable vendor promotion (kind of like when a game was included with the package).

The only time I ever received cables was with the monitor. All I have ever seen included with video cards is adapters. Cables included with a video card are very, very rare and not common practice. So I guess saying "never" was not the best way to put it.
 
Mine came with a DVI-to-HDMI adapter only, if I remember right. That's it. I got a DP cable with my monitor.
 
Were you planning on buying some third party cooler?

Yep... it's kind of needed. I had some reference AMD cards with Arctic coolers; I am accustomed to low-noise gaming. This reference fan noise makes things unplayable at default settings. Even with my closed-back Audio-Technica ATH-M50s I can hear the fan.
I've begun experimenting with WattMan to lower the fan speed during gaming. Undervolting and lowering the power limit got me from 2400 RPM down to 2000 RPM. I want to find a way to go even lower, to 1500 RPM.

As for BUGS... I have this one :D

https://community.amd.com/thread/219563
 
As for BUGS... I have this one :D

https://community.amd.com/thread/219563

No issues here: 17.8.2 drivers, Windows 10 64-bit, hardware acceleration enabled, Vega 64 AC, using Chrome.
 
Anyway, the biggest dilemma is how I silence the fan during gaming. This is a jet engine...

You must have the ears of a bat. It's not a jet engine :lol: I can hear the airflow on my 64 AC when it first ramps up, but that's about it. My case is a be quiet! with noise dampening on the panels, so that probably helps. Plus I'm a heavy metal fan, and having seen Iron Maiden, Motorhead, AC/DC and Van Halen, to name a few, numerous times, maybe my hearing is not so good!

With regards to a CD for drivers, I've ditched my optical drives because they're pretty much legacy these days, TBH. I also concur with NWR, as I can't remember the last time I got a cable with a gfx card. Dongles yes, but cables came with the monitors. So you bought a gfx card and only got a gfx card, and you thought the package was sparse? I suppose we're all different after all :lol:
 
At idle the card is impressively quiet... the quietest reference card I have owned. On the other hand, in games it is a jet engine. My HD 6950 is quieter under load than the V56.
I was an Iron Maiden fan too... Anyway, I used to listen to HammerFall, Battle Beast and Gamma Ray as heavy metal bands...
 