AMD RX Vega 64/56 Review Thread

Of course it is. APUs get them into laptops, and laptops are where the OEM unit sales are. They need to make money, and laptops and servers are how you make money.
 
Well, after watching the Mindblank review on YouTube: he stated that gaming is the last priority for Vega. This card was meant for compute. The gaming drivers are unfinished and may be a lower priority. The card seems to have higher potential. Drivers for Pascal, on the other hand, are already mature, with not much more to expect.


https://www.youtube.com/watch?v=kUomUbYsEUY
 

OK, let's all take a pause here. Maybe we are all looking at Vega in completely the wrong way. Remember what Raja said on his Twitter account about looking at Vega in the wider context: it's one card competing with the GP100/102/104. It is at least as good as a 1080 in gaming and better than a 1070, and it beats them both in compute. In compute it also beats the Titan, which is why Nvidia suddenly released that mysterious compute driver for the Titan that seemed to have slipped their memory until Vega came along.

Overall, what does that say about Nvidia's 1070/1080 gaming cards that aren't as good as an AMD compute card you can game on? Both Nvidia cards are pure gaming cards that have had their compute capabilities stripped away. Their performance looks totally different in this scenario, don't you think? A bit like two thoroughbreds going up against a workhorse and being made to look rather lame (pun intended).

Also, I don't really buy this crap that all the miners are buying Vega. The economics for them just don't stack up at all. However, if you're a small studio or a one- or two-person band of content creators, an RX Vega at $599 is a big chunk of change cheaper than a Vega FE or the Titan, sans the 3GB of GDDR5X or 8GB of HBM2. A pretty good cost saving if you ask me.

I'm not ignoring the power draw, heat or noise, but in reality they come down to an individual choice in each area and we're all different in those respects. Overall I'm beginning to think Vega isn't that bad if you take into account what it's trying to achieve. For me as a gamer it's a good upgrade from a Fury Pro, and I'd argue it's a good upgrade from a 290/390 too. For a small studio or content creators it's close to $500 cheaper than a Vega FE or Titan, and as a bonus it's not a bad gamer. What is there to not like about those scenarios? If you want to use its compute capabilities for whatever reason, it's even a good upgrade for 1070/1080 owners, as you don't lose the gaming performance. Obviously this is my own personal supposition only.
 


:lol:

so it beats a low midrange 1070 that is a year and a half old and barely matches a midrange 1080 that is just as old
or just as good at 3x and 2x the power draw respectively :lol:

and as a gamer who the **** cares about compute
I don't want a compute card that you "can" game on or a jack of all trades master of none heat pump card
I want a gaming card.

if it was not for the compute BS it most likely would not be as good for coincrap mining
and it would most likely draw a lot less power and be in stock at MSRP

......
miners

https://hothardware.com/news/amd-radeon-rx-vega-56-gaming-ethereum-beast-gpu-launches

http://www.tomshardware.com/reviews/radeon-rx-vega-56,5202-20.html


........
you sound like eisberg defending star citizen


the only context is that they screwed the pooch with Vega and it came out worse than the ATI 2900 XT
and now they have to sell a dog's breakfast for the price of filet mignon
 

Undervolting resulting in increased performance is the saving grace...
 

If you are worried about an extra $100 a year in power costs, then you have no business owning any high-end cards; you can't afford them. :lol: :lol: :D :D


Also, where are you getting "no 4K"? It does fine in the majority of titles, and in the games where it doesn't, even the 1080 Ti has a tough time holding decent average frame rates, which means it is the title more so than the card.

You also keep saying that it barely keeps up with the 1080, yet in most of the reviews, it is shown that it beats the 1080 in many of the benchmarked games, with some of them showing it between the 1080 and the 1080ti, game dependent of course.

Also, as was already stated, I'm not so sure about some of the numbers in the reviews, because even on my ancient overclocked X58 Xeon platform, which sports PCIe 2.0, I am getting higher averages than reported with equal or better settings (when they are not testing on max settings) at 1440p. And I'm not benching with an overclocked 7700K at 4.8GHz or higher.
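For what it's worth, the "$100 a year" figure can be sanity-checked with some back-of-envelope arithmetic. The wattage delta, daily hours, and electricity rate below are all assumptions for illustration, not figures from any review:

```python
# Rough annual power-cost estimate for an assumed extra draw under
# gaming load, e.g. a Vega 64 versus a GTX 1080.
extra_watts = 100        # assumed extra draw while gaming
hours_per_day = 3        # assumed daily gaming time
rate_per_kwh = 0.12      # assumed electricity rate in $/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * rate_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
```

At these assumed numbers the delta is well under $100 a year; it only approaches that figure with very heavy daily use or expensive electricity.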
 


no 4K in that review about undervolting, only 1440p
4K sometimes puts a heavier load, so higher power draw in some things

...


but the Vega 64 LC, like the 1080 8GB, needs two cards at 4K60, being 25% to 35% slower depending on the game
and sure it does better in DX12, but the number of DX12 games coming out is still very low

and my 1080 Ti Strix does fine at 4K in most all games, but sometimes with little or no AA
but at 4K on a 40-inch I can live without AA
but even the 1080 Ti is borderline at 4K60 and needs to turn things off sometimes, so no, a single Vega 64 LC at $699 will not do 4K.

I didn't buy a 1080 8GB for the same reason: it needs two cards at 4K

.....
and two Vega 64s would be an "extra $200 a year in power costs"

and no i don't care about power cost

but

https://www.pcper.com/reviews/Graph...64-Vega-64-Liquid-Vega-56-Tested/Clocks-Power-


I do care about heat, living in Phoenix, AZ, and that's 325 watts to 430 watts times two for crossfire (if they ever get it working on RX Vega).
I don't need a 650 to 860 watt crossfire space heater when it is 110F to 120F outside, when a single 1080 Ti is 250 to 300 watts max.

My office is just a spare bedroom at 15' x 14'; that much heat in that room would need a dedicated window or wall air conditioner unit.
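The heat complaint can be put in numbers: a watt of power draw ends up as a watt of heat in the room, and 1 W is about 3.412 BTU/hr. A quick sketch using the draw figures cited above (which are from the post, not measured here):

```python
# Convert GPU power draw to BTU/hr of heat dumped into the room,
# using the standard conversion 1 W ~= 3.412 BTU/hr.
WATT_TO_BTU_HR = 3.412

setups = [
    ("single 1080 Ti", 300),          # cited max draw above
    ("Vega 64 LC crossfire", 2 * 430) # cited worst-case, times two
]
for label, watts in setups:
    print(f"{label}: {watts} W ~= {watts * WATT_TO_BTU_HR:.0f} BTU/hr")
```

For scale, the smallest window air-conditioner units are rated around 5,000 BTU/hr, so a crossfire setup near 860 W would eat a big chunk of one unit's capacity on its own.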
 
not enough, and it still draws close to 100 watts more than a 1080 8GB

https://translate.google.de/transla...d-vega-64-im-undervolting-test.html&edit-text=

and no 4k


I was referring to Vega 56. Vega 56 undervolted is around 321 watts, approaching the GTX 1070. The difference in performance between Vega 64 undervolted and Vega 56 undervolted is minimal. This is where the value of the Vega line is: $400 for GTX 1080-like performance approaching GTX 1070 power consumption is the sweet spot.
Of course, if you add the need for 4K, neither the GTX 1080 nor the Vega 64/56 is enough as a single GPU, so the conversation is irrelevant anyway. This price segment can't sustain 4K.
 
You can do the same thing to the 1070; in fact, I have to reduce the heat output in my pincab from 150W+ to 110W+, barely noticing it, running 4K. I'm guessing this applies to the 1080 too?
 

Gosh bill, you're bitching more than I did about the Vega drivers :lol: I bought my Vega 64 AC at MSRP and, as I've said numerous times, I couldn't be happier with the performance upgrade from my Fury Pro. I'll go through the math again, as I did for the Nvidiots who now seem to have stopped posting (wonder why?). I paid £450 for my Vega 64, and at the time the cheapest 1080 Ti I could source was £685. That made the 1080 Ti 52% more expensive for at best 35% more performance. That performance advantage dropped off considerably in DX12, from at best 20% to nothing. A bit of a no-brainer really, and that's why MY Vega purchase made sense to me.
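That price/performance math can be sketched out. The £450/£685 prices are from the post; the ~35% average 1080 Ti advantage is the best-case figure quoted there, not a measurement:

```python
# Price premium and performance-per-pound comparison, Vega 64 vs 1080 Ti.
vega_price, ti_price = 450, 685   # GBP, as quoted in the post
ti_perf_advantage = 0.35          # assumed best-case average advantage

price_premium = ti_price / vega_price - 1
perf_per_pound_ratio = (1 + ti_perf_advantage) / (ti_price / vega_price)

print(f"1080 Ti costs {price_premium:.0%} more")
print(f"1080 Ti delivers {perf_per_pound_ratio:.2f}x the performance per pound")
```

So on these numbers the 1080 Ti delivers less performance per pound spent, which is the argument the post is making; swap in the 20%-to-0% DX12 figures and the gap widens further.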

As with the last few launches from both camps prices have been gouged until supply managed to meet demand and that's just economics. In the UK I can still buy a standalone Vega 64 for £469 so if that's not the case in the good ol USA then someone's trying to screw you. Just checked and the 1080ti prices are still the same.

At that price it's a compelling argument to upgrade for 290/390 and Fury Pro owners (Fury X is LC so different altogether). I do agree the Vega 64 LC is way overpriced for the difference it brings and I think most people know my thoughts by now on water cooling so I'm not even going to comment on that card as there is no way in a month of Sundays I would buy it, or any WC card for that matter.

I don't want to appear to be an apologist for Vega as I acknowledge it has a lot of faults. The biggest problem it faces is the over-hype of what people thought it would achieve to what it actually did achieve. TBH part of that issue was most definitely AMD's fault but let's not forget that Shaiderharan got it right months ago by predicting it would land between 1070/1080 but closer to 1080ti in DX12. He was derided for his comments but guess who was right.

Bill just stop bitching about Vega and go and buy another 1080ti FFS and leave us Vega owners who are happy with their purchases alone :rolleyes:
 
No .


I will keep "bitching about Vega" till they fix it with a Vega 2, or Navi comes out, in the hope that Navi is not just as bad

because Navi is their last chance to stay in gaming GPUs.
Their gaming GPU market share is going down fast:
http://store.steampowered.com/hwsurvey/


......
one 1080 Ti is enough
and the Fury X was a better card; I'll keep the two I have in my AMD system
and Vega may have been over-hyped, but the Fury X was within 2% of the 980 Ti at 4K. It is just nuts that Vega draws about 100 watts more than a Fury X or 1080 Ti and sits between a 1070 and a 1080 in performance
 

It is what it is. Simples. For others, like you, it's not a great upgrade, I get it, OK. The Steam survey? Come on, half of them are using iGPUs :lol: That's not representative of high-end/enthusiast cards :lol: IMHO there will be no Vega 2; AMD will just move on to Navi, and if that's a multi-GPU linked by IF, count me out big time. Not interested at all. For those of us who game at just 1440p it's a great card, but if you really want 4K, spend another $800 on a 1080 Ti. When it doesn't support SLI any more, though, don't come bitching to us.
 

Well, the Steam hardware survey is all gaming, and from March 2016 to now AMD dropped from 25.5% to just 18.63%, and yes, that is a big deal, the lowest I've seen it in years.
And it was not Intel with iGPUs that picked it up, but Nvidia; Intel has lost iGPU market share on Steam also.

if that's a multi-GPU linked by IF count me out big time

I don't know why, if it looks like one GPU to the OS, just like Threadripper.
And since both NV and AMD will be doing it soon, I think,
are you going to stop gaming?

if you really want 4K spend another $800 on a 1080ti but when it doesn't support SLI any more don't come bitching to us
#1: a single 1080 Ti does 4K fine at this time, unlike any Vega

and RX Vega doesn't do crossfire at all yet :hmm:

nv cards do still do two way sli
 

So to carry on gaming, you're saying I need a $700 CPU + $300 mobo and $200 of quad-channel RAM, plus a new graphics card :nuts: Wake up and smell the coffee, bill. I'm gaming happily as I am, thank you.

Multi-GPU is meh, TBH, and both sides are moving away from it because the devs won't do it. Give me one good reason why a developer would support multi-GPU if it costs them time and money and the segment is <1% of the gaming market. Not exactly the hardest math to work out really. Maybe you should go and live with shadow001 in his tri-crossfire world :lol:
 
:confused: :nuts:

who said that ?

video cards are going multi-chip, like Threadripper, not that you need to buy a Threadripper

or MCM-GPU

https://www.extremetech.com/gaming/252022-nvidia-considers-multi-chip-gpus-future-designs

Windows will only see one GPU

the whole point of HBM is the MCM-GPU, as MCM-GPUs need on-package memory
 
So it'll work on my Asus Z270K mobo, will it? Don't think Intel will be too happy about that :lol:

I don't know about that; it may, or it may need PCIe 4.0 by then.
PCIe 4.0 should start next year if all goes well

but it is just a video card like any other; Intel has nothing to say about that, as they don't make any

or 5.0

https://www.extremetech.com/computi...ns-launch-pcie-5-0-2019-4x-bandwidth-pcie-3-0


but in 1 to 4 years I think we will see MCM-GPUs, depending on how well 7nm goes,
and that's why both are starting to let go of SLI/CFX now
 

Hey bill I'm more than happy with one Vega 64 AC ATM thank you.
 