Vega refresh?

Status
Not open for further replies.

jesdals

New member
Was looking forward to the Asus Strix RX Vega 64, but did not like the reviews. Any solid news about a Vega refresh on 7nm? :sherlock:
 
Surprised the AnandTech interview didn't see the 12nm question asked.

https://www.anandtech.com/show/1231...n-exclusive-interview-with-dr-lisa-su-amd-ceo


But this one bit may hint at a 7nm Vega for consumer graphics as well as compute:

Q18: With GlobalFoundries 14nm, it was a licensed Samsung process, and 12nm is an advancement of that. 7nm is more of a pure GF design. Is there any change in the relationship as a result?

LS: So in 7nm, we will use both TSMC and GlobalFoundries. We are working closely with both foundry partners, and will have different product lines for each. I am very confident that the process technology will be stable and capable for what we’re trying to do.
 
Unfortunately, as an owner, I think Vega is dead and AMD will concentrate all resources on Navi going forward (maybe an H2 2019 release). No refresh and no fine wine for Vega, just like the Fury lineup. RX Vega, RIP, I'm afraid to say. Well, you win some and you lose some; that's life.
 




You have an attraction to the macabre ... you bought Fury and now Vega :D:lol:
I like the phrasing from Tom's review more ...

We were lucky to get our hands on a sample of Gigabyte's Radeon RX Vega 64 Gaming OC 8G. Buying a card of our own was the only way to get one into the German lab. Here in the U.S., this is a mythical creature. Nobody has it for sale, and the only references online come from Gigabyte's site or product announcement news stories.


I don't think Vega is dead. Miners bought it...
 
Well, Tom's just reviewed the GB Vega, and it's in fact very much fine wine.

Vega now beats the 1080 in most games:

http://www.tomshardware.com/reviews/gigabyte-rx-vega-64-gaming-oc-review,5441-3.html

It only beats a stock 1080 in most games, and only because this GB model is overclocked, at the expense of running 15 degrees hotter than a regular Vega 64. And it's not even available in the US, nor will it be produced in any significant numbers.

A GTX 1080 OC is faster than this, uses less power and runs cooler, and is available everywhere (barring mining prices).
 

It has very little to do with the 50 MHz overclock, unless you seriously think 1 to 2 fps is the determining factor. I would suggest you go back and look at the game benchmarks again. The reference Vega 64 (air cooled) is also beating the 1080 in the majority of them, in some cases by as much as 10%. By the time the Vega 64 has been out for a year (another 7 months from now), it will have pulled ahead even more as drivers mature.
 


I don't think Vega drivers will mature much more. GCN is the base architecture which has matured.

Its up to the game developers to code for Vega's new features which many won't because.. well.. Marketshare = S**T for GCN courtesy of miners.
 

I would suggest you go back and look at the game benchmarks again yourself: the reference Vega 64 (air cooled) beats the 1080 by only 1 or 2 fps (as you said) in the majority of the games it "beats" the 1080 in, with the exception of two. All while drawing more power, creating more noise, etc.

And if you look at another recent review, this one that covers a far larger sample of games compared to Tom's, the reference Vega 64 is back to just trading blows with a 1080:

http://www.guru3d.com/articles_pages/galax_geforce_gtx_1070_ti_hof_review,15.html

But hey, keep thinking those magical drivers will continue to improve performance, you've been saying that since launch.

You also conveniently ignored the other point I made: when OC'ing a 1080, it soundly beats the more expensive, more power-hungry, noisier RX Vega 64 OC. Just look at any Asus Strix 1080 OC benchmarks, for example.
But hey, congratulations on a Vega 64 finally pulling its weight against a 2.5 year old stock 1080 (basically Founders Edition) that nobody buys anymore. :up:
 

First, NO, I did not say the reference Vega only beats the GTX 1080 by 1 or 2 fps. I was comparing the OC Vega to the reference Vega, meaning the 50 MHz OC has nothing to do with it beating the GTX 1080. However, I see how you could have come to that conclusion, because I didn't spell it out for you in my response. But Tom's benchmarks should have made that obvious, since both the reference 64 and the OC version beat the GTX 1080 in most games by much more than 1 or 2 fps, while showing the OC Vega and the reference Vega only 1 or 2 fps apart... hmm.. sorry... I guess.

As for your "other recent results", LOL, get back to me when you cherry-pick from a source that is NOT reusing its results from August 14, 2017 in a January 18, 2018 article (not recent at all). They have not re-run their tests since release, broken drivers and all. Great source to try to disprove us with.

The only game they ran new benchmarks on is Battlefront II using DX12 (and DX12 is a broken mess in that game, causing lag, stutter, etc. for many). At 1440p, which is the only resolution I care about (1080p is more CPU bound than GPU bound), the 1080 is 7 fps behind the reference Vega 64. ALL the other benchmarks are just their same old release-day numbers... so I am inclined NOT to trust a source that won't even properly re-run all the benchmarks 6 months later.

To be fair, they are still using the same numbers for the GTX 1080, 1.5 years later, in all their comparisons except Battlefront II (unless you want to sit and tell me that even the GTX 1080 gets exactly the same frame rate in GTA V and Tomb Raider, for example, as it did 1.5 years ago on its release date). Considering the changes and optimizations a game and its engine alone go through in 1.5 years, much less 6 months, Guru3D can't even be considered a legit source, since it appears they bench a card once and reuse those same numbers through the lifespan of comparing that card with others, which ignores all driver optimizations. Not to mention the changes in the test system (CPU speed, memory).

GTX 1080 release date article 05/17/2016 (more like 1.5 years ago): http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,20.html

August 14, 2017 (5.5 months ago):
http://www.guru3d.com/articles_pages/amd_radeon_rx_vega_64_8gb_review,26.html

and your January 18, 2018 article:
http://www.guru3d.com/articles_pages/galax_geforce_gtx_1070_ti_hof_review,27.html




As for your "GTX 1080 OC beats the OC Vega" point: you are talking about a 152 MHz OC over a stock GTX 1080 (about 9%) vs a 50 MHz OC over a stock Vega (about 3%). Big difference. Most Vega owners have their cards undervolted as well, getting higher clocks while using substantially less power. So yes, I did ignore your comment on the OC GTX, because it is not a fair comparison at all and is just a "but, but, but" response.
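For what it's worth, the two percentages above do check out. A quick sketch (the stock boost clocks are my assumptions, roughly 1733 MHz for a reference GTX 1080 and 1546 MHz for a reference RX Vega 64; the offsets are the ones cited in the post):

```python
# Sanity check of the overclock percentages cited above.
# Stock boost clocks are assumptions: GTX 1080 ~1733 MHz, RX Vega 64 ~1546 MHz.
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Overclock offset expressed as a percentage of the stock clock."""
    return 100.0 * oc_mhz / stock_mhz

print(round(oc_percent(1733, 152)))  # GTX 1080 +152 MHz -> 9 (about 9%)
print(round(oc_percent(1546, 50)))   # RX Vega 64 +50 MHz -> 3 (about 3%)
```

So the 1080's factory OC headroom in that comparison is roughly three times the Vega's, which is the asymmetry the post is pointing at.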
 
With that wattage it should be beating the 1080 Ti..


Do keep in mind that the more advanced features in Vega aren't even being used yet, since no game ships with them and DX12 needs an update to support tiling within the geometry hardware itself. The only exception is Far Cry 5 using the half-precision floating-point feature, which is something big Pascal doesn't even support but Volta does, and as usual you'll have to buy a new card to get it.
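For anyone unfamiliar with the half-precision (FP16) feature mentioned above: it trades accuracy and range for throughput, since hardware that supports it can pack two FP16 operations where one FP32 operation would go. A minimal illustration of that trade-off using Python's standard `struct` module (which supports the IEEE 754 half-precision format; this is just a demonstration, not GPU code):

```python
import struct

def to_fp16(x: float) -> float:
    """Round a Python float to the nearest IEEE 754 half-precision (FP16) value."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 has only 10 mantissa bits, so ~3 significant decimal digits survive:
print(to_fp16(3.14159265))  # -> 3.140625

# It also has a much smaller range: the largest finite FP16 value is 65504.
try:
    struct.pack('<e', 70000.0)
except OverflowError:
    print("70000 does not fit in half precision")
```

That reduced precision is fine for many shading workloads, which is why games can use it for extra throughput where it is supported.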


I think that is what it comes down to in the end: Nvidia releases hardware that performs extremely well right out of the gate, at the expense of long-term longevity through more advanced features. AMD puts a bigger emphasis on longer-term features that eventually get used and extend the GPU's lifespan, but that doesn't do the hardware any favors in current games where those features aren't used, and there's only so much that can fit on the die, transistor-wise.


Two different approaches. For those that change cards every 6 months or a year, Nvidia's way of doing it is better. For those hanging on to their cards longer, AMD's way might be better, at least when it comes to the money left in the wallet....:lol: :p
 

Ah I see, so the only one cherry-picking here is you, as you are blatantly dismissing benchmark results and have gone to great lengths to dismiss Guru3D. How about you look at any recent Titan V, 1080 Ti, 1070 Ti, or RX 64 review from any other site? They all paint the same picture: a Vega 64 at BEST comparable to a GTX 1080, a card that has been out since May 2016 and that uses less power, generates less heat, and costs less. And the RX 64 is actually priced as a stopgap toward the 1080 Ti.

But let us not ignore where your bias is coming from; after all, you did state Nvidia is an evil corporation just like Intel. Therefore we know where your arguments lean and why.



You ignored it because the fact that a 1080 OC uses less power than an RX 64 OC, generates less heat, costs less (barring mining prices), and is a much more efficient design while being FASTER than the 64 OC (liquid or air cooled) goes against your ideal of Nvidia being the "evil" corporation like Intel, amirite?
 
This thread is about Vega refresh rumors, not about Vega vs the 1080. Please take these posts to a new topic.
 

Sure.. okay buddy.

You dismissed Tom's current results and posted Guru3D in rebuttal, and I just proved to you that your rebuttal (Guru3D) is flawed... and I am the biased one and the cherry picker???? Question: how can one (me) using results posted by others in this thread (Tom's/Guru3D, with Guru3D posted by you) be called a cherry picker? Please explain this to me. I don't understand your connection.

Better design? As Shadow already pointed out, AMD is about future capabilities, vs Nvidia, which is not future-proof. So yes, if you are stuck in the here and now and not worried about future tech, then Nvidia is the best design, provided you are okay with having to purchase a new card to take advantage of new technology.

But if you would rather have a card that runs today's games great, maybe not being the top dog but still great, and that is designed to take advantage of the new technology and implementations coming out in the future, then AMD is the better design.

However, the Titan V ($3000), which is not designed for gamers but is a business version of Volta, although it seriously outperforms the Vega 64, also uses 17 more watts (reference design) than the Vega 64, 135 watts more than the GTX 1080, and 46 watts more than the GTX 1080 Ti, and it is the ONLY Nvidia GPU that can take advantage of the same technology Vega can. https://hothardware.com/reviews/nvidia-titan-v-volta-gv100-gpu-review?page=6
So... is Vega a bad design? It kind of appears that being able to run future stuff takes a bit more power. Hmm, go figure.


Can the Vega be undervolted to clock substantially higher and use less power? YES!

We are talking about the same ballpark as the GTX 1080 and GTX 1080 OC in power draw, yet beating them both in the end. But I will give you credit: that is not stock off the shelf, as the GTX 1080 is (future-proofing the reason why? Could be). However, as new games come out using the new technology, Vega will run them perfectly; the GTX 1080/1080 Ti won't, because they can't use the new technology, which means that as time goes on their performance will drop from not being able to take full advantage of all the games' features, where Vega's won't.



Now, I took you off my ignore list because I hoped you had changed your ways, but you have quickly proven that not to be the case, as you are once again trying to rehash the Nvidia/Intel evil/can't-be-trusted stuff and trying to make the discussion personal by calling it bias, when in fact it wasn't. Proving your source outdated has NOTHING to do with bias.

You simply can't accept the fact that your rebuttal used flawed information (Guru3D), and it didn't take great lengths to prove it. I actually looked because Guru3D's numbers didn't look right; I get better numbers on my 7-year-old CPU/motherboard with PCIe 2.0 than what they were showing. And now we ALL know why: they are using release-day numbers for both the Vega and the GTX 1080. If the truth hurts, I am truly sorry; talk to Guru3D about it, but don't blame me.

IF AMD does a refresh using a smaller die, then the clock speeds and power draw, before undervolting, will only improve. This whole thread is about a Vega refresh, IF it happens, which many believe it won't, as AMD's roadmap shows nothing about a Vega refresh.
 