Vega 7nm early benches

So the 32GB version is definitely and exclusively a workstation card given its price, while a 16GB gaming version would be expensive but still possible if one is willing to pay the $$$. Sales would be low though, since not that many can be built at 7nm while production capacity stays limited until sometime in 2019... :bleh:

I think they mean a Titan V price point for any 16GB Vega 20 version for gaming

looks like Vega is still a flop for gamers
Navi better be a home run for gamers or they might as well throw in the towel on gaming and just put out workstation cards and mining cards only :bleh:

and we've got a year to wait for Navi :(
 
Yeah, but up to 96 CUs? That's 6144 shaders (96 × 64). That thing would be a shading monster.

if a video card is over 1100 bucks I don't care if it licks my balls,
I'm not buying it, AMD or NV.

let alone 3k like a Titan V, I'll buy an Xbox One X first :bleh:


I might pay $1500 (maybe) for one that can do 8K 144Hz in most all games
 
I think they mean a Titan V price point for any 16GB Vega 20 version for gaming

looks like Vega is still a flop for gamers
Navi better be a home run for gamers or they might as well throw in the towel on gaming and just put out workstation cards and mining cards only :bleh:

and we've got a year to wait for Navi :(



Well, keep in mind that if said (theoretical) card is released, a 70%+ performance increase actually puts it slightly faster than the Volta-based Titan V, so calling that a flop is being pretty harsh given the potential performance involved... It seems Vega 7nm clocked at just 1 GHz is as fast as Vega 14nm clocked at 1.75 GHz, which works out to roughly 75% more throughput per clock, so that isn't just a die shrink and call it a day but also added hardware.



Having said that, AMD being a business will sell it at a premium and not the usual $700~800 that a high-end card goes for, at least until there's a lot more 7nm fabrication capacity available, because it's just starting out now for the most part... I guess what I'm saying is that we're so used to Intel and Nvidia charging outrageous prices for some of their products that, now that AMD is getting back on its feet between Threadripper going up to 32 cores and being first with working 7nm GPUs that can take the top spot, seeing them also charge a premium is something we're not used to. But such is business... :bleh:
 
for $1500+ it better be a 170%+ performance increase over a 1080 Ti

it's been since 2013 and the 290X that we got a good new AMD gaming card

the 390X was a refresh
Fury X was only fair
Vega 64 a flop for all but mining

and now Vega 20 has no real gaming card, and some places say Navi is a midrange card like a 1080 non-Ti for 325 bucks :nuts:
 
for $1500+ it better be a 170%+ performance increase over a 1080 Ti

it's been since 2013 and the 290X that we got a good new AMD gaming card

the 390X was a refresh
Fury X was only fair
Vega 64 a flop for all but mining

and now Vega 20 has no real gaming card, and some places say Navi is a midrange card like a 1080 non-Ti for 325 bucks :nuts:

I paid $235 NIB for my 290X in 2016. Not buying a new card until I can get 2x its FPS for $500 or less.
 
I paid $235 NIB for my 290X in 2016. Not buying a new card until I can get 2x its FPS for $500 or less.
:lol:
I paid $589 each for two ASUS Radeon R9 290X-DC2OC's on 5/30/14
and about the same for two Fury X's

skipped Vega but I'm 59 and I'm not waiting till I'm 70 for a new card :bleh:
 
:lol:
I paid $589 each for two ASUS Radeon R9 290X-DC2OC's on 5/30/14
and about the same for two Fury X's

skipped Vega but I'm 59 and I'm not waiting till I'm 70 for a new card :bleh:

2x GPU performance improvement in 5 years is not unreasonable. But since AMD seems to be content sitting in the corner eating paste we are stuck with the $700 1080 Ti and $1200 Titan Xp.
 
2x GPU performance improvement in 5 years is not unreasonable. But since AMD seems to be content sitting in the corner eating paste we are stuck with the $700 1080 Ti and $1200 Titan Xp.
:yep:


soon we will be stuck with Nvidia doing Intel-like upgrades of 10% on a new card for $750 a pop
 
if a video card is over 1100 bucks I don't care if it licks my balls,
I'm not buying it, AMD or NV.

let alone 3k like a Titan V, I'll buy an Xbox One X first :bleh:


I might pay $1500 (maybe) for one that can do 8K 144Hz in most all games

Go on Bill, buy an XboneX; I did and you won't regret it. I've only played BF1, but it looks amazing on my 49" HDR TV compared to my 27" FreeSync monitor. Still struggling with the gamepad controls, but that will come.

All rumours point to Navi being a mid-range upgrade and 'next gen' being the high-end card, which isn't slated to arrive until some time in 2020. AMD aren't in the enthusiast market ATM and won't be for a couple of years. We just have to accept it and move on, I'm afraid.

BTW Vega 64 is an awesome 1440p card, so I'd argue it is not a flop for gamers at all, but it's not for 4K gaming with all the eye candy, which TBF the 1080 Ti isn't much better at.
 
Go on Bill, buy an XboneX; I did and you won't regret it. I've only played BF1, but it looks amazing on my 49" HDR TV compared to my 27" FreeSync monitor. Still struggling with the gamepad controls, but that will come.

All rumours point to Navi being a mid-range upgrade and 'next gen' being the high-end card, which isn't slated to arrive until some time in 2020. AMD aren't in the enthusiast market ATM and won't be for a couple of years. We just have to accept it and move on, I'm afraid.

BTW Vega 64 is an awesome 1440p card, so I'd argue it is not a flop for gamers at all, but it's not for 4K gaming with all the eye candy, which TBF the 1080 Ti isn't much better at.

can't stand the controllers

as for Vega 64, it is unplayable at 4K,
while my 1080 Ti Strix is playable in most all games
 
can't stand the controllers

as for Vega 64, it is unplayable at 4K,
while my 1080 Ti Strix is playable in most all games

Have two Vega FE's, no issue at 4K in FC5 :drool:. One of the better games with CFX. The 1080 Ti I've found limited at 4K, and SLI seems more miss than hit compared to CFX. At 1440p the Vegas kick ass; I don't consider Vega a total flop, they game well and mine even better. Just that Nvidia should have something much better in the near future.
 
Have two Vega FE's, no issue at 4K in FC5 :drool:. One of the better games with CFX. The 1080 Ti I've found limited at 4K, and SLI seems more miss than hit compared to CFX. At 1440p the Vegas kick ass; I don't consider Vega a total flop, they game well and mine even better. Just that Nvidia should have something much better in the near future.
only talking single card

gave up on SLI and CFX after a lot of years; too few new games work with it now


if FC5 works great with CFX that's fine, but I don't play it
 
only talking single card

gave up on SLI and CFX after a lot of years; too few new games work with it now


if FC5 works great with CFX that's fine, but I don't play it

Bill, when DX12/Vulkan makes it into most games and the devs drop DX11 like they should have done 2-3 years ago for AAA titles, DX12 will enable pretty seamless multi-GPU abilities.

Which BTW... does Nvidia still disable the ability to use a secondary Nvidia GPU in the system for PhysX/Hairworks stuff when an AMD GPU is present anywhere in the system?
 
Bob vodka has explained many times that the whole “DX12 will be easy mGPU” is false.
 
Bill, when DX12/Vulkan makes it into most games and the devs drop DX11 like they should have done 2-3 years ago for AAA titles, DX12 will enable pretty seamless multi-GPU abilities.

Which BTW... does Nvidia still disable the ability to use a secondary Nvidia GPU in the system for PhysX/Hairworks stuff when an AMD GPU is present anywhere in the system?

To support mGPU in Vulkan and DX12 the developers NEED to implement it themselves; it's not a switch you toggle and suddenly you're in mGPU heaven.

AFAIK Nvidia still doesn't support GPU PhysX if you use an AMD GPU as the main render GPU. I would be very surprised if that has ever changed; that decision was never about real technical difficulties, it was just Nvidia refusing to be relegated to PhysX acceleration.
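
To put that in concrete terms, here's my own rough sketch (not from any shipping engine): in Vulkan 1.1 even just finding the GPUs that can be linked together is explicit application code, and everything after that (splitting the frame, setting device masks, copying results between GPUs) is on the developer too.

```cpp
// Hypothetical sketch: enumerate Vulkan device groups (requires a Vulkan 1.1 loader/driver).
// This only *finds* linkable GPUs; actually using more than one is all engine-side work.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;            // device groups are core in 1.1

    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan 1.1 instance available\n");
        return 1;
    }

    // Ask the loader which physical devices can be grouped into one logical device.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount);
    for (auto& g : groups) g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        std::printf("Device group %u: %u GPU(s)\n", i, groups[i].physicalDeviceCount);
        // From here the engine must create one logical device over the group,
        // set per-GPU device masks on command buffers, split frames or screen
        // regions, and copy results between GPUs -- none of it comes for free.
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```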
 
Bob vodka has explained many times that the whole “DX12 will be easy mGPU” is false.


Still, at least the main drawback of multi-GPU goes away with DX12 and Vulkan: the need for specialized drivers carrying a profile for each specific game, like was the case with DX11, which meant waiting weeks to get those from the GPU vendors. Now we can blame the developers, and only the developers, for not having it implemented in their games... GPU drivers are off the hook and the API supports it (see the rough D3D12 sketch at the end of this post).



GPU vendors may no longer be interested in encouraging developers to use multi-GPU, primarily to encourage new GPU sales: with gamers on a single card, as games become more demanding the odds are greater that a single card will run out of poke much sooner than a dual-GPU setup would. In my case in particular it's been 3 to 3 1/2 year cycles running 2 cards, never having to back off settings in games released through that period while maintaining the max FPS the display handles... It's also enough time for cards to become twice as fast, so when the upgrade does happen the performance increase doesn't need an FPS counter to notice... :lol:
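
For the "the API supports it" part, here's a rough D3D12 illustration (again just my own sketch, assuming a linked-adapter setup, nothing from an actual game): the app itself has to ask how many GPU nodes exist and address each one with a node mask; no driver profile does any of that for it anymore.

```cpp
// Hypothetical sketch: query linked GPU nodes in D3D12 and create a queue per node.
// The driver exposes the nodes; scheduling work across them is the developer's job.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // nullptr = default adapter; feature level 11_0 keeps the sketch broad.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::fprintf(stderr, "No D3D12 device\n");
        return 1;
    }

    UINT nodes = device->GetNodeCount();    // >1 only on linked-adapter (CFX/SLI-style) setups
    std::printf("Linked GPU nodes: %u\n", nodes);

    // Every queue, command list and resource carries a NodeMask, so splitting a
    // frame across GPUs (AFR, split-frame, whatever) is entirely engine-side code.
    for (UINT node = 0; node < nodes; ++node) {
        D3D12_COMMAND_QUEUE_DESC desc{};
        desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node;          // bit i selects physical GPU i
        ComPtr<ID3D12CommandQueue> queue;
        if (SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
            std::printf("Created a direct queue on node %u\n", node);
    }
    return 0;
}
```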
 
I’d argue the specialized drivers and NV/AMD-led support were the reason why mGPU worked; because there was incentive for those companies to push mGPU, it made them money. Developers don’t give a damn because they don’t make money off it.

We were better off during DX9-11 days. MGPU is dead now and will remain that way until there is incentive for developers to push it.
 
I’d argue the specialized drivers and NV/AMD-led support were the reason why mGPU worked; because there was incentive for those companies to push mGPU, it made them money. Developers don’t give a damn because they don’t make money off it.

We were better off during DX9-11 days. MGPU is dead now and will remain that way until there is incentive for developers to push it.

this

I haven't seen many developers rush to support mGPU in Vulkan and DX12, if there have been any at all


SLI/CFX needed AMD and NV pushing it with monthly drivers for it to work, as developers couldn't care less about putting in the work for the low number of mGPU users
you want them to do all this extra work for SLI and CFX when most of them don't even fix their ****ing games to work right, or do one or two patches and abandon them?


and I was running both SLI and CFX with top cards from each, but it got to be nuts after NV all but dropped it
 
There’s a reason SLI did so well for so long. NV was pumping huge amounts of cash and developer/coding assistance to make sure it was scaling well.
 