If I had the budget I would have bought this 2080 Ti, but I have other things to pay for.
I understand Tech guru.
Life is short, live your passions when you can
It will be interesting to see how this RTX plays out; it has potential for sure.

Tech Guru wrote: Indeed, they added roughly a ~$300 price difference on the non-OC Ti variant between the 1080 Ti and 2080 Ti launch prices. There is no competition in performance/innovation, especially from AMD's graphics card lineup.

nas93 wrote: Their pricing for the 2080 Ti is insane; I bought my 980 Ti for $700 on release. But I guess it's worth it if the card outperforms the 1080 Ti by almost twice the power. Glad Nvidia is spending so much money on R&D, because it's the only company that makes quality GPUs.
Here is an interesting article that popped up recently:
NVIDIA GeForce RTX 2080 3DMark TimeSpy Score Leaked – Clocked At 2GHz And Beats A GTX 1080 Ti Without AI Cores
https://wccftech.com/nvidia-geforce-rtx-2080-3dmark-timespy-score-leaked-clocked-at-2ghz-and-beats-a-gtx-1080-ti-without-ai-cores/
I agree. Even HairWorks doesn't run very well to this day.

anayman_k7 wrote: Day after day the leaks keep appearing, and so far they all point to around a 30% average performance difference between the 2080 Ti and the 1080 Ti, while a 1080 Ti today costs around $650 and the 2080 Ti is at $1200. No wonder Nvidia didn't release performance graphs during the announcement and then had to put out some shady DLSS/4K HDR numbers for a few games to fool users into thinking the new series has double the performance.

In addition, RTX will be the new HairWorks that everyone turns off to regain half of the lost FPS.
Personally, I'm going to upgrade to a used 1080 Ti. A used card can go for as little as ~$450-500, which gives me incredible bang for my buck. The more people upgrade, the more used 1080 Tis there are on the market.

Also, no one has mentioned SLI. A pair of 1080 Tis would cost less than a 2080 Ti. SLI scaling isn't the best, but if you get at least 40-50% extra performance from the second card, you're still ahead of a 2080 Ti at a lower cost (assuming you already have a setup that can support an SLI configuration); see the rough comparison sketched below.
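For illustration, here is a rough back-of-the-envelope comparison in Python, using the prices mentioned in this thread, the ~30% uplift from the leaks, and an assumed 40-50% SLI scaling; all figures are estimates, not benchmarks.

```python
# Rough perf-per-dollar sketch using figures quoted in this thread.
# All numbers are estimates/leaks, not measured benchmarks.

used_1080ti_price = 475      # midpoint of the ~$450-500 used-market estimate
new_2080ti_price = 1200      # quoted 2080 Ti price
leaked_uplift = 1.30         # ~30% average 2080 Ti gain over a 1080 Ti (leaks)
sli_scaling = 0.45           # assume 40-50% extra from the second card

sli_perf = 1.0 + sli_scaling     # relative to a single 1080 Ti
rtx_perf = leaked_uplift         # relative to a single 1080 Ti
sli_cost = 2 * used_1080ti_price

print(f"SLI 1080 Ti : {sli_perf:.2f}x for ${sli_cost} "
      f"({sli_perf / sli_cost * 1000:.2f}x per $1000)")
print(f"RTX 2080 Ti : {rtx_perf:.2f}x for ${new_2080ti_price} "
      f"({rtx_perf / new_2080ti_price * 1000:.2f}x per $1000)")
```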
1- You cannot generalize yet by saying "the 20xx series seems to be the worst video card to invest in" without the actual launch and benchmarks of the new generation and an apples-to-apples comparison with the last generation.

anayman_k7 wrote: Personally I don't think the RTX will justify that price tag ($1200). I saw Jackfrags' Battlefield 5 gameplay with RTX and the game's colors were too washed out; I think it looks much better without RTX. Yes, the implementation of RTX is still early, with developers learning how to use it properly, which is the main point: why would a person pay $1200 when they most likely won't experience good RTX performance any time soon? Waiting for the next series from Nvidia is wiser here; by then we will have more RTX games (if RTX is really adopted, matures, and succeeds). And looking at games without RTX: with only one or two 4K 120Hz (144Hz) monitor models, which will cost thousands of dollars and are not yet on the market, and with the 1080 Ti ($650) already capable of 4K 60Hz, the 20xx series seems to be the worst video card to invest in.
What does AVX offset have to do with this thread? I thought we were talking about RTX cards...

Tech Guru wrote: (Negative AVX offset set to 0)
An AVX negative offset set to zero means the locked 5GHz overclock applies to AVX instructions too. With a negative offset, AVX code downclocks the processor to help keep core temperatures below the throttling point, so that is not a true locked 5GHz on the CPU.

vlatkozelka wrote: What does AVX offset have to do with this thread? I thought we were talking about RTX cards...

Tech Guru wrote: (Negative AVX offset set to 0)
AVX instructions are used by heavier number-crunching code and are usually more demanding than the more common SSE instructions. They are often left out of an overclock by applying a negative offset to them, so that the CPU overclock can stay stable.
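To make that concrete, here is a minimal sketch (not from this thread) that puts the CPU under an AVX-heavy load while reading the reported clock. It assumes numpy is linked against an AVX-capable BLAS and that psutil can report the current frequency; whether a downclock actually appears depends entirely on the AVX offset configured in the BIOS.

```python
# Minimal sketch: watch the reported CPU clock while running an AVX-heavy
# workload (large matrix multiplies via numpy/BLAS). With a negative AVX
# offset set in the BIOS, the clock under this load should sit below the
# all-core overclock; with the offset at 0 it should not drop.
import time
import numpy as np
import psutil

def current_mhz():
    return psutil.cpu_freq().current  # reported core frequency in MHz

print(f"Idle clock: {current_mhz():.0f} MHz")

a = np.random.rand(4096, 4096)
b = np.random.rand(4096, 4096)

end = time.time() + 10          # sustain the load for ~10 seconds
while time.time() < end:
    _ = a @ b                   # heavy vectorized (AVX) work
    print(f"Under load: {current_mhz():.0f} MHz")
```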
What are you trying to tell us here?
In the 2nd Closed Alpha.

PowerPC wrote: How did they test Battlefield 5? Even the beta isn't available yet...
https://www.reddit.com/r/explainlikeimfive/comments/459nz3/eli5_why_isnt_realtime_ray_tracing_used_in_video/czw9686/

Tech Guru wrote: RT is a true generational leap for the gaming industry (AMD will surely follow with their upcoming Navi cards). Some think it's mere reflections here and there adding to the graphical fidelity of the image, but it takes an enormous amount of rendering computation, the kind Hollywood movies usually hand off to render farms. Now it is available to the end gamer, which is good.
Say you have a scene in your game with 1000 objects in it: no problem at all for a modern GPU. It takes each object, one after the other, and first finds out what area of the screen the object is visible in. For each of those pixels it executes a shader, which calculates the color of that pixel based on the object's material, textures, and the lights in the scene. Relatively speaking, that's a very small amount of data needed for the calculations.
GPUs do this super quickly because the color of the pixel doesn't depend on anything except that one object's data and the lights in the scene, so you can calculate thousands of pixels at the same time using thousands of tiny processors. Modern engines then do all kinds of post processing steps, where they take the finished image and combine it with other data to do lots of neat effects like SSAO or bloom.
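As a toy illustration of that per-object, per-pixel idea (not real GPU code), a rasterizer's shading step might look like the sketch below; project_to_screen is a hypothetical helper standing in for the projection/rasterization step.

```python
# Toy sketch of the rasterization idea: each covered pixel is shaded using only
# that one object's material plus the scene lights, so pixels are independent
# and can be computed in parallel. project_to_screen is a hypothetical helper.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_pixel(material, normal, lights):
    """Very rough diffuse shading: needs nothing beyond this object and the lights."""
    color = [0.0, 0.0, 0.0]
    for light in lights:
        intensity = max(0.0, dot(normal, light["direction"])) * light["strength"]
        for i in range(3):
            color[i] += material["albedo"][i] * intensity
    return color

def rasterize(objects, lights, framebuffer):
    for obj in objects:                                  # one object at a time
        for pixel, normal in project_to_screen(obj):     # pixels the object covers
            framebuffer[pixel] = shade_pixel(obj["material"], normal, lights)
```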
Ray tracing works completely differently. In ray tracing you're shooting a ray into a scene, and you have no idea beforehand what it will hit. So every ray needs access to all the objects in the scene, their materials, and so on, at the same time. Even if you knew which object a ray would hit, what happens when that object has reflective properties? Or say you put a bright red object next to a white one: some of the red color will reflect onto the white one. So each time you hit something you need to shoot even more rays from there, hitting other objects, and then you need to combine the results of all those to get the color of the pixel.
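Contrast that with a minimal recursive ray-tracing sketch: every ray has to be tested against all the objects, and a reflective hit spawns further rays whose results feed back into the pixel's color. closest_hit, shade_direct, reflect, and mix are hypothetical helpers here, only meant to show the structure.

```python
# Minimal sketch of the recursion described above. closest_hit, shade_direct,
# reflect, and mix are hypothetical helpers; the point is the data dependency:
# each ray may touch any object in the scene, and reflective hits spawn more rays.

MAX_DEPTH = 3

def trace(origin, direction, scene, depth=0):
    hit = closest_hit(origin, direction, scene.objects)   # tests every object
    if hit is None or depth >= MAX_DEPTH:
        return scene.background_color

    # Direct lighting at the hit point, using that object's own material.
    color = shade_direct(hit, scene.lights)

    # A reflective surface needs the color of whatever the bounced ray hits,
    # so shoot a secondary ray and blend its result in.
    if hit.material.reflectivity > 0.0:
        bounced = trace(hit.point, reflect(direction, hit.normal), scene, depth + 1)
        color = mix(color, bounced, hit.material.reflectivity)
    return color
```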