My summary:
70% of buyers will look at it as a gaming graphics card and 30% for content creation. More than half of the R7's manufacturing cost goes to the inclusion of HBM 2.0. Opting for GDDR6 would have positioned the R7 much better, perhaps even allowing the inclusion of RT & Tensor cores.
I am going for the best gaming experience, and Nvidia nails it in every area. Ironically, it is a monopoly asking high prices given the lack of serious AMD competition. #Innovation wins; let us give Nvidia this credit.
Well, in games the Radeon VII is almost universally 1-4 fps ahead of the 1080 Ti at 1440p-2160p, which really isn't impressive at all given the price. They could have made this exact card with 8GB of GDDR6, sold it for $400-$500, and it would have been a huge hit.
7nm means the distance between transistors is reduced, which means more transistors in the same area and hence more computing power. But a processor with a larger die area and the same number of transistors will have the same computing power, and moreover a longer lifespan because of less electromagnetic interference between transistors (it falls off with the square of the distance). So AMD is only fooling around when trying to sell us on 7nm lithography, because it's nothing really great.
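The scaling claim above can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch, keeping in mind that node names like "7nm" are marketing labels rather than literal transistor dimensions, so the ideal numbers below are upper bounds:

```python
# Ideal transistor-density gain from a process shrink: if every linear
# dimension scales by old/new, the area per transistor scales by its square.
def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal factor of extra transistors per unit area after a shrink."""
    return (old_nm / new_nm) ** 2

print(density_gain(14, 7))   # 14nm -> 7nm: ideally 4x the density
print(density_gain(12, 7))   # 12nm -> 7nm: ideally ~2.9x
```

Real processes never hit these ideal factors, which is part of why a "7nm" label alone guarantees nothing about performance.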
AMD should instead try to improve ray tracing, because this is what will benefit the community.
Nvidia's RTX GPUs, on the other hand, are a step in the right direction. Let's not forget that they are the first of their kind and were never meant to be perfect; even so, the driver update on February 4 improved the performance of the RTX cards by almost 50%. Not to forget that Nvidia has introduced DLSS, which is also new technology. So my verdict is that AMD is trying to fool low-budget gamers.
I was brainwashed by AMD's 7nm hype; seeing it lose to Nvidia's 12nm and 16nm cards, with a lack of features too, is really disappointing.
Linus Tech Tips:
https://youtu.be/alhEgNvzv50
Last edited by Tech Guru (February 8 2019)
Latest AMD Drivers - Radeon VII Tested on 33 Games
7% slower vs the RTX 2080
5% slower vs the GTX 1080 Ti
in traditional rasterization.
AMD slower than the 1080 Ti, a two-year-old 16nm graphics card. :)
The gap will increase vs the RTX 2080, for sure.
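Those percentage gaps are easier to judge as frame rates. A quick sketch, where the 100 fps baseline is purely hypothetical and chosen only to show the scale of a 5-7% deficit:

```python
# Translate an "X% slower" benchmark summary into fps against a baseline.
def slower_fps(baseline_fps: float, percent_slower: float) -> float:
    return baseline_fps * (1 - percent_slower / 100)

baseline = 100.0                 # hypothetical RTX 2080 frame rate
print(slower_fps(baseline, 7))   # Radeon VII, 7% slower
print(slower_fps(baseline, 5))   # the GTX 1080 Ti figure, 5% slower
```

A handful of frames at 1440p, in other words; whether that justifies either card's price is the real argument in this thread.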
It is the same existing Vega design, just on 7nm. Also, the drivers are still very immature and the launch was very rushed; give that GPU a couple of months of driver updates and it will reveal its real power.
AMD fans slogan shifted from "Wait for Vega!" to "Wait for miracle drivers!" and now there's even "Wait for Navi!"
"Wait for future driver update, AMD card will eventually beat Nvidia card"
Zero competition till now, given the bigger value proposition Nvidia is offering, which makes it a monopoly setting its own pricing schemes.
When AMD triggers the fear factor with very competitive offers, new technological advancements, or raw rasterization gaming performance, Nvidia will be forced into repricing strategies. If the R7 were strong enough, Nvidia would automatically drop the price of the RTX 2080. The R7 is simply "underwhelming", as Linus said.
Previously, the Vega 64 had comparable memory bandwidth and more TFLOPS than a 1080 Ti (not just a 1080), and the 1080 Ti beat it regardless of drivers. It is a core architecture issue AMD is stuck in: innovation & features.
Last edited by Tech Guru (February 11 2019)
First, I am not an AMD fan; I have owned CPUs and VGAs from all the manufacturers. But I have to admit I am a fan of logic and reason, not a single-brand worshiper like you. So to sum up the current situation:
AMD is the better option in the low and medium budget ranges: at ~$150, the RX 570, RX 580, and 590 destroy anything Nvidia offers at ~$280 or below. Vega 56 and 64 were not up to the challenge; the Radeon VII came to narrow the gap with what Nvidia offers at the $700 mark, and its 16GB of HBM2 VRAM makes it a very suitable workstation GPU. But as I said in my previous post, it needs a couple of months of driver updates to reveal its real potential.
I expect you will throw the brag card as soon as someone favors AMD in some way. And speaking about WAITING, we are STILL waiting for that NEXT GEN RTX and DLSS god-like tech to be used in more than 2 games, and to not shave off half of your FPS so you can play at 60fps 1080p on a 1300$ video card.
Last edited by anayman_k7 (February 12 2019)
Wrong.
1440p Ultra TAA with RT Ultra is ~75 fps on the RTX 2080 Ti, without DLSS. The DLSS patch is coming and will be dropping soon, in one week. You can grab an EVGA RTX 2080 Ti Black Edition now at 999 USD; shipped here from the USA to your door it's ~1200 USD.
That is wishful thinking and expectation; you can always say that Turing's successor, and the successor of the successor, and Navi will be better. I enjoy what #current tech offers me, and yes, my gaming experience improved when I swapped the 1080 Ti for a 2080 Ti. Well, that's just how technology works: it gets cheaper as it gets older. Welcome to the way of life. You could say a 1TB hard drive is super cheap compared to an 8MB drive that cost $4k when it first came out.
AMD has a self-identity problem: they bundle the R7 with 3 games and then call it a workstation card. They are trying hard, which is good, but they need to stop lying by squeezing the Vega architecture into 7nm and claiming before launch that it's a 7nm graphics card.
Last edited by Tech Guru (February 12 2019)
That last post made no sense at all. And not just because the English reads like it was taken out of a Chinese product manual...
Last edited by vlatkozelka (February 12 2019)
Let me break it down for you:
RTX 2080 = 800$
Radeon VII = RTX 2080 - DLSS - RTX = 700$
But wait
DLSS + RTX = 0 value, because DLSS is just an AA algorithm that no one cares about, and everyone will turn RTX off just like everyone turns Nvidia HairWorks off (and any other Nvidia gimmick they kept adding to games throughout the years).
So Radeon VII = RTX 2080 at 100$ less, plus extra RAM and better performance at what you would describe as "content creation" ... lol. I don't expect you to know what a workstation is, because you probably don't work.
Is it a great deal? Probably not; it is OK at best. But you have to understand that a lot of research and hard work goes into creating those GPUs (from both Nvidia and AMD). Even if they fell short of Nvidia's flagships, you should still show some respect for creating those things. I don't think you bake GPUs in your kitchen, do you?
My opinion is that AMD rushed out a graphics card just to take a share of the market, just like Nvidia released the 1070 Ti, a card that never needed to exist except as a response to Vega 64. I really hope AMD flips the tables one day. Until then, enjoy your overpriced 2080 Ti... 1300$ haha
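The value argument above reduces to perf-per-dollar arithmetic. A toy comparison using the rough numbers thrown around in this thread ($800 vs $700, the R7 about 7% slower), so purely illustrative:

```python
# Toy performance-per-dollar metric (higher is better value).
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    return relative_perf / price_usd

rtx_2080 = perf_per_dollar(100.0, 800.0)   # reference card at 100%
radeon_vii = perf_per_dollar(93.0, 700.0)  # ~7% slower, $100 cheaper

# Ignoring RTX/DLSS entirely, the R7 comes out slightly ahead on raw value.
print(rtx_2080, radeon_vii, radeon_vii > rtx_2080)
```

How much RTX and DLSS are worth on top of that is exactly the disagreement here.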
I would like to add to vlatkozelka's point that while RTX and DLSS are Nvidia gimmicks, DirectX 12 and Vulkan are not.
DirectX 12 and Vulkan are the low-level APIs that most newer games are built on.
This means that if you optimize your hardware for them, you'll directly see performance improvements that translate to all those games.
If you look closely at the benchmarks over the past 3-4 years, you will see how AMD managed to design their hardware to be more in line with the specifications set forth by those APIs, giving them an edge over Nvidia in features such as "async compute" and others. This directly translates to better performance in current titles that use those APIs, and noticeable long-term improvement over time (as more games use those APIs and as the drivers/libraries get even more optimized).
FYI, I've exclusively owned Nvidia cards for the past 10 years (it's my personal preference and I'm not enticed to upgrade), but that doesn't mean I would bash a piece of technology because I'm having buyer's remorse.
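The "async compute" edge mentioned above comes from overlapping compute work with graphics work instead of running the two back to back. A toy timing model with made-up millisecond costs (not measurements of any real GPU):

```python
# Toy frame-time model: serial vs ideally-overlapped graphics + compute.
graphics_ms = 10.0   # hypothetical cost of the raster/shadow passes
compute_ms = 4.0     # hypothetical cost of post-processing compute

serial_ms = graphics_ms + compute_ms       # queues run one after the other
overlap_ms = max(graphics_ms, compute_ms)  # ideal async: compute fully hidden

print(f"serial:  {serial_ms} ms -> {1000 / serial_ms:.1f} fps")
print(f"overlap: {overlap_ms} ms -> {1000 / overlap_ms:.1f} fps")
```

Real overlap is only partial, so actual gains land somewhere between these two extremes; hardware that schedules the two queues well captures more of the gap.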
In rasterization the 2080 Ti always wins.
R7 vs 2080 in 33 titles: the 2080 is 7% faster.
"Give it time for mature drivers"
Vega 64 vs 1080 Ti:
even with mature drivers, the Vega 64 falls short, despite having more TFLOPS and comparable memory bandwidth to the 1080 Ti.
Reason: an obsolete, weak architecture.
R7 = Vega architecture on a 7nm die shrink = fake; the 2080 + DLSS + RT will always beat the R7.
"This year is our year" - AMD and Liverpool FC
All reviews from the tech community show how the R7 is "underwhelming" in a competitive scope. However, people can be truly ignorant, especially when they hate on something because they can't have it themselves due to RTX 2080 / RTX 2080 Ti prices.
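For context on the TFLOPS comparison, peak FP32 throughput follows directly from shader count and boost clock (each shader does 2 FLOPs per cycle via fused multiply-add). A sketch using the advertised boost clocks, which real workloads rarely sustain:

```python
# Peak FP32 TFLOPS = shaders * boost clock (GHz) * 2 FLOPs (FMA) / 1000.
def peak_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000

print(peak_tflops(4096, 1.546))  # Vega 64:     ~12.7 TFLOPS
print(peak_tflops(3584, 1.582))  # GTX 1080 Ti: ~11.3 TFLOPS
```

More paper TFLOPS, yet the 1080 Ti wins in games, which is exactly the architecture point being made above: peak throughput says nothing about how well an architecture keeps its shaders fed.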
Nvidia made a market niche with real-time ray tracing, away from traditional rasterization with its canned/baked shadows and reflections.
It is not an Nvidia GameWorks feature like HBAO+, Grass Turf, HairWorks, PCSS, HFTS, etc. Ray tracing, as used in Hollywood movies, is a rendering technique that generates an image by tracing the path of light through pixels in an image plane and simulating the effects of its encounters with virtual objects. That needs a lot of computational power, and now it is available to the gamer end.
Bottom line - JayzTwoCents: "A lot of people out there are super negative rather than seeing the advancement of technology we are getting here", "Unfortunately, now that I have seen real-time ray tracing in person, screen space reflections are ... disgusting". Is it playable? "For me the answer is personally yes." 23-11-2018, card used: Asus RTX 2080 Ti Strix.
Digital Foundry:
"Everything is better represented - full stop", even on RT Low vs no RT enabled. 21-11-2018
Real-time ray tracing has existed for years and is heavily used in Hollywood movies. It needs a lot of computational power, unlike the baked cube maps used for shadows and reflections in regular rasterization rendering. Now RT is available to the gamer end, which is a technological leap in graphical realism & fidelity.
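To make "tracing the path of light" concrete, the core primitive is a ray-object intersection test, which RT hardware accelerates at enormous scale. A textbook ray-sphere test (a teaching sketch, not how RTX cores actually work internally):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.
    `direction` must be a unit vector; solves |o + t*d - c|^2 = r^2 for t."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c            # quadratic 'a' is 1 for a unit direction
    if disc < 0:
        return None                 # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two intersections
    return t if t > 0 else None

# A camera ray pointing down +z toward a sphere 5 units away, radius 1:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

A renderer fires one or more such rays per pixel and shades using whatever they hit; the computational cost mentioned above comes from doing this millions of times per frame against scenes with millions of triangles.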
Waiting for Metro Exodus with wide RT implementation in:
Reflections
Global Illumination
Shadows
Nvidia is gaining momentum with the technological advancements in its Turing line of graphics cards.
Nvidia DLSS boosts Port Royal benchmark performance by up to 50%.
What is DLSS? - Stedio Science
Metro Exodus will have support for RTX and DLSS at launch on February 15th. - Tech4Gamers
You can quote whoever you want. It would still not make sense to turn RTX on in a game and lose all that framerate. That's 500$ worth of shadows and reflections haha (considering you will be getting 970-level performance on a 1300$ card).
I honestly think he is exaggerating, especially in a game like BF5 where you should be busy shooting people, not looking at their reflections.
AMD and Liverpool FC
What is this? What are you talking about!!!
they hate on something because they can't have it for themselves due to RTX 2080 / RTX 2080ti prices
I suggest you visit a doctor because you desperately need one rich boiiiiiiiii
RTX on Low vs baked reflections/global illumination: I will take RT any day and play at a constant 2160p@60fps Ultra TAA with RT Low, vs no RT.
On the other side, I do not know where you are getting your exaggerated assumptions from to make that "mockery" or "irony".
A 970 at 1440p Ultra gets ~48 fps, vs ~75 fps with an RTX 2080 Ti at 1440p Ultra with RT Ultra, without DLSS.
Last edited by Tech Guru (February 13 2019)
No personal insult intended, mate; nothing to take personally. I am giving my opinion, and I did not name you or anyone else specifically. Drop this perception; it is a tech forum. You can spot more heat in the comments below articles at AnandTech, Tom's Hardware, famous YouTube tech channels, etc. It is healthy in the tech community.
Do not worry: when AMD drops something good, I will give it full credit.
Last edited by Tech Guru (February 13 2019)
Well, the Radeon VII ain't that impressive, since AMD again falls behind in this race against the green team. I have been an Nvidia buyer for more than 10 years as well and it has been a pleasant experience overall; I still don't see any benefit in owning an AMD card (at least in the high-end category). However, I am patiently waiting to see what Intel is cooking; I read there is a discrete graphics card in the making right now.
On the other hand, you might not agree with Tech Guru, but it is childish to take this discussion to a personal level with insults.
Tbh it's more than disagreement, more like a bit of whining on the forum, especially when I hit "Active" in the topics section, but that's off topic for now. Tech Guru's constant Facebook-like opinion reporting on lebgeeks is not very useful in my opinion - no offense. Maybe a new forum category for this?
It feels like pushing something down your throat. Maybe keep every topic like this in one thread?
Last edited by Elitism Guru (February 13 2019)
I usually state my input as a hardware enthusiast, based on actual facts / numbers / benchmarks - not a subjective way of presenting things. I have been participating in many topics on this forum, including Hardware, Networks, and General.
Tech Guru wrote: I usually state my inputs as a hardware enthusiast based on actual facts / numbers / benchmarks - not a subjective way of presenting things.
Do you realize that this makes no sense?
Stating your opinion explicitly means the content is subjective.
And for the 1000th time, drop the "hardware enthusiast" act. Your posts are the usual "look at the shiny new thing that I bought, which makes all of your things look bad".
If you want to talk about hardware in a "hardware enthusiast" fashion, start talking about the hardware without bashing everything that you didn't choose to buy, and stop using lines like "I am going for the best gaming experience, and Nvidia nails it in every area. Ironically, it is a monopoly asking high prices given the lack of serious AMD competition. #Innovation wins; let us give Nvidia this credit."
Whenever you say "I", it stops being objective, and a lot of people here are starting to catch wind of your Facebook-like posts.
If you want to actually be that helpful hardware enthusiast who cares about technology, maybe start making your own benchmarks, write something (IN YOUR OWN WORDS) to explain to us what those fancy numbers and terminologies you always use mean, stop bragging about owning a product from company X and bashing anything from company Y, etc... I'm not forcing rules on you; I am in no place to do so. I'm just pointing out why you receive all the comments that you do.
You want a serious reply?
YES, AMD is behind Nvidia in gaming performance. I personally have been on the green team's side for quite some time now. But unlike you, I am not happy about it, especially when Nvidia relies on gimmicks to get the edge in some situations (I am talking value for price here, not best vs best) and inflates prices from 400$ to 1300$.
We will most likely continue to buy Nvidia cards for now (at least when looking at high-end cards), but only because we have no other choice. We, actual hardware enthusiasts, don't brag about one foreign company winning over another foreign company when we have no input into the development process.
We watch, we wait, we read... but we most certainly don't brag, nor bash anything that isn't Nvidia.
Yes, we waited for Vega 64 and for Vega 2 for a long time, and both were "disappointing" (the 64 more than the VII). But on the other hand, we actually want AMD to improve so that things go back to normal.
Did you guys forget that AMD is getting big bucks from their collaborations with Microsoft and Sony on their respective consoles? Check the mobile versions of AMD's GPUs and you will notice that they have basically been lagging for the past 2 years. As for the Radeon VII, it might simply have been released so that they wouldn't miss out on the yearly GPU releases (which are mostly focused on high-end gaming; the midrange comes later, for both Nvidia and AMD), as Nvidia is spearheading the releases with better "overall" performance improvements at launch.
This supports what some here are saying, that the Radeon VII might be rushed, but I can't fully agree with that, given the price hike due to the inclusion of HBM2. I do wonder why they did this... probably testing the waters for the upcoming consoles, which are rumored to incorporate GDDR6 or HBM. Or they could simply be pushing the HBM standard, as they are one of the companies that developed the technology.
I agree with you.
HBM 2.0 is good for content creators, not gaming - at least going by the trend:
Vega 64 - HBM 2.0
Total memory bandwidth: 483.8 GB/s
GTX 1080 Ti - GDDR5X
Total memory bandwidth: 484 GB/s
No competitive advantage, and the 1080 Ti beats the Vega 64.
A similar trend:
Radeon 7 - HBM2
Total memory bandwidth: 1 TB/s
RTX 2080 - GDDR6
Total memory bandwidth: 448 GB/s
Yet at both 1440p and 2160p the R7 gets beaten by the RTX 2080 and by the two-year-old 16nm 1080 Ti.
AMD needs to drop the HBM trend; it seems a total failure, and the inflated memory bandwidth numbers are on paper only. Add to that its high manufacturing costs.
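For what it's worth, those bandwidth figures follow directly from bus width and per-pin data rate. A quick check (per-pin rates quoted from memory of the published specs, so worth verifying):

```python
# Theoretical bandwidth (GB/s) = bus width in bits / 8 * per-pin rate (Gbps).
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(2048, 1.89))  # Vega 64, HBM2:       ~483.8 GB/s
print(bandwidth_gbs(4096, 2.0))   # Radeon VII, HBM2:    1024 GB/s (~1 TB/s)
print(bandwidth_gbs(352, 11.0))   # GTX 1080 Ti, GDDR5X: 484 GB/s
print(bandwidth_gbs(256, 14.0))   # RTX 2080, GDDR6:     448 GB/s
```

Note the near-identical Vega 64 / 1080 Ti numbers: HBM2's wide-bus advantage only shows once the R7 doubles the stack count to a 4096-bit bus, and even then the games don't reward it.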
Last edited by Tech Guru (February 14 2019)
I agree with you and I add:
Economically, Nvidia has created a monopoly in the gaming graphics card line, with the repercussion that Nvidia sets its own prices, simply due to the following reasons:
Continuous underwhelming performance from AMD
Lack of new features / innovations (regardless of whether they add value for the gamer end or not - that is a matter of preference - but they are there, and they are new). Sure, AMD will in the near future ride the train of real-time ray tracing and a DLSS equivalent using WindowsML or DirectML, which is a "software solution"; whether it will be effective is yet to be seen.
Compared to the previous generation, Pascal:
Raw Rasterization Boost
New Technological Features
New Cores (RT + Tensor Cores)
Hybrid Rendering
Dedicated HDR Processing HW pipeline to reduce HDR latency
GDDR6
DP 1.4a with DSC
Mesh Shading, Adaptive Shading, and Texture Space Shading
More Efficient CUDA Cores
New execution unit (INT32). This unit enables Turing GPUs to execute floating point and integer operations in parallel.
Budget-oriented gamers can grab either an RTX 2060 or an RTX 2070 and have all these new features. I highly recommend the RTX 2070, with rasterization performance midway between the 1080 and 1080 Ti.
Let us give Nvidia #credit for these. On the other hand, sadly, it is a "monopoly" for now. I was hoping that AMD, with their "7nm" marketing, would create a crack in the wall. However, this hasn't happened so far. "Maybe" future drivers will turn the table on the RTX 2080, but I highly doubt it.
Last edited by Tech Guru (February 14 2019)
@vlatkozelka well said :)
The argument that 16GB of HBM 2.0 excels at 4K vs the RTX 2080's 8GB of GDDR6 is again busted, even after AMD's driver updates.
Still, again, across 38 AAA games retested:
8% slower at 1440p Ultra
9% slower at 2160p Ultra
What a shame, 7nm vs 12nm. The weakness lies within the core AMD architecture.