|
MikeC posted:That's a weird flex when the correct play was not to buy Turing though. All of the criticism was true and is still true to this day. This is literally always true though. Don't buy Ampere because something better is coming out, just wait 2 years like you did with Turing. Turing ended up "pretty ok" because a lot of its features ended up being really cool, it just took a while to get there. And there were lots of people who thought the Tensor cores were totally worthless and now DLSS is the most exciting new feature that's come along in a long time. (For the record, I think the non-Super lineup had questionable value but the Super refresh was pretty good).
|
# ? Aug 13, 2020 15:41 |
|
|
|
repiv posted:why would you do that when nvidia already has the best encoder block in the business otoh, my ryzen can encode 1080p60 x264 on slow preset at ~20% cpu usage, including whatever it has to throw at the game, lmao. i'm gonna try very slow preset tomorrow to see if there's any more quality to be gained, though i somewhat doubt it. if you're just recording, the gpu encoding is probably the best, but for streaming stuff you kinda want x264 to capture other poo poo and ryzen just absolutely crunches that poo poo now at settings that usually produce better quality video. it's loving amazing y'all
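For context, the CPU-side encode being described looks roughly like this as an ffmpeg invocation — a sketch only, with placeholder filenames, since the post doesn't give its exact rate-control settings:

```shell
# Software x264 encode at the "slow" preset, 1080p60.
# input.mkv / output.mp4 are placeholder names; CRF 20 is an
# assumption, since the poster's rate control isn't stated.
ffmpeg -i input.mkv \
  -c:v libx264 -preset slow -crf 20 -r 60 \
  -c:a copy \
  output.mp4

# The NVENC path repiv is referring to would swap the encoder,
# e.g. -c:v h264_nvenc, trading some quality-per-bitrate for
# near-zero CPU load.
```

For actual streaming you'd select the x264 encoder with the same preset inside OBS rather than invoking ffmpeg directly; the preset/CRF tradeoff is the same either way.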
|
# ? Aug 13, 2020 15:59 |
|
Lockback posted:
I'm incredibly happy with my 2070S, it was in exactly the sweet spot for that graphics card lineup. That said, it's probably going to be my shortest-lived graphics card since Ampere is dropping like 8 months after I got it.
|
# ? Aug 13, 2020 16:01 |
|
Carecat posted:What I'm hearing from this is Nvidia saying "check out this $2000 3090" If the 3090 is the new Titan naming scheme, $2k would be a bargain. The Turing one is $2500+. Lockback posted:
But it took a substantially longer "a while to get there" for this generation. In most past ones, you got most of the goodness upfront, and then maybe some new features actually got picked up later on that made some mild improvement. Turing suffered from not being all that much faster than Pascal, while upping prices, and promising really cool stuff that, frankly, still hasn't panned out except for in a few games. In those games where it did finally show up, it is indeed pretty cool, but now we're at the point that we're actively discouraging people from buying Turing because Ampere is so close. Don't get me wrong, I think NVidia made the right strategic play by using Turing to prime the pump that'll be DLSS 2.0 + RTX adoption with Ampere, but that doesn't change the fact that for the vast majority of its life it was a pretty lackluster generation. Had they launched with Super-performance cards no one would be arguing, but they didn't.
|
# ? Aug 13, 2020 16:02 |
|
DLSS becoming relevant right at the point where Turing is about to be replaced isn't really the strongest argument in your favour. NVIDIA have absolutely got a long term business strategy that we can see looking like it's about to start paying off heavily, but that doesn't change the fact that the 2000 series were overpriced and underwhelming.
|
# ? Aug 13, 2020 16:27 |
|
Alchenar posted:DLSS becoming relevant right at the point where Turing is about to be replaced isn't really the strongest argument in your favour. Especially when it's focused on the 2080ti. You could have double the # of games with support for ray tracing + DLSS 2.0 and it would still be questionable as a 'sensible' purchase for all but the most hardcore enthusiasts. It's a $1200 card, hell it's always been ~$2000 Cndn (at best!) here, that is simply unobtainable for the vast majority of the gaming public, even with relatively high-end PCs. If that's where RTX gets 'good' then if anything it cements 'wait for Ampere'.
|
# ? Aug 13, 2020 16:58 |
|
Which one out of the new cards should I be waiting for if I'm interested in 1440p 144hz gaming? Would be paired with ryzen 3700X. 3070? I have no idea what modern cards do, my old setup was like 8 years old. DeadlyHalibut fucked around with this message at 17:08 on Aug 13, 2020 |
# ? Aug 13, 2020 17:02 |
|
DeadlyHalibut posted:Which one out of the new cards should I be waiting for if I'm interested in 1440p 144hz gaming? Would be paired with ryzen 3700X. Probably, yeah.
|
# ? Aug 13, 2020 17:11 |
|
I agree that the 20 series were overpriced and underwhelming. It's the ugly, awkward teen stage of the RTX, DLSS, etc. newness that looks like it'll find its legs and its maturity with the 30 series. I guess it was a necessary step that could have gone better for us price-wise but what's done is done and I think when we're all playing cyberpunk with all that raytracing poo poo it'll be worth it. Suffice to say I think the days of paying just $500-700 for the top-end Ti are over, I do think the new cards will be expensive once again. My 1080ti will have a viking funeral when it finally bites the dust.
|
# ? Aug 13, 2020 17:11 |
|
https://www.anandtech.com/show/15974/intels-xehpg-gpu-unveiled-built-for-enthusiast-gamers-built-at-a-thirdparty-fab Intel's new "HPG" gaming-targeted Xe version confirmed for 2021. Sure looks like the "Xe is cancelled!" stuff was dumb made-up dogshit, wow! DXR hardware ray tracing and GDDR6, not on EMIB it appears. Cygni fucked around with this message at 17:14 on Aug 13, 2020 |
# ? Aug 13, 2020 17:12 |
|
Wichard Leadbeater seems pretty upbeat about it too https://www.eurogamer.net/articles/digitalfoundry-2020-intel-architecture-day-tiger-lake-xe-gaming-focus
|
# ? Aug 13, 2020 17:13 |
|
Zedsdeadbaby posted:I agree that the 20 series were overpriced and underwhelming. It's the ugly, awkward teen stage of the RTX, DLSS, etc newness that looks like it'll find its legs and its maturity with the 30 series. I guess it was a necessary step that could have gone better for us price-wise but what's done is done and I think when we're all playing cyberpunk with all that raytracing poo poo it'll be worth it. I will say that one thing in Turing's favor is that DLSS2 might make its overall viable lifespan quite long. It's kind of aging like fine wine, assuming you believe in a DLSS-centric future. And it really did gain on benchmarks over the years it was in the market with current games as well. Not the best architecture ever, but it's at least intellectually interesting how it got much better over time. Also I mentioned this before (and it doesn't change your point) but the die size on Ampere is significantly smaller, which will save them a decent amount on fab. If Nvidia takes pity on us poor souls, they could ratchet down the pricing, or maybe just keep prices consistent while increasing the VRAM up to 20GB on the top end. It's a possibility at least...
|
# ? Aug 13, 2020 19:10 |
|
...or reap more profit. Hmmm, what would a company do, what would a company do.....
|
# ? Aug 13, 2020 19:22 |
|
Lockback posted:...or reap more profit. Hmmm, what would a company do, what would a company do..... Reinvest in Shroud Tech 2.0 to better dissipate all the heat from the denser cores!
|
# ? Aug 13, 2020 19:25 |
|
Zedsdeadbaby posted:Wichard Leadbeater seems pretty upbeat about it too https://www.eurogamer.net/articles/digitalfoundry-2020-intel-architecture-day-tiger-lake-xe-gaming-focus Is he ever not? Lol
|
# ? Aug 13, 2020 19:43 |
|
Taima posted:I will say that one thing in Turing's favor is that DLSS2 might make its overall viable lifespan quite long. It's kind of aging like fine wine, assuming you believe in a DLSS-centric future. And it really did gain on benchmarks over the years it was in the market with current games as well. Not the best architecture ever, but it's at least intellectually interesting how it got much better over time. DLSS2 absolutely means that the window for stretching out a card's lifespan by turning down the settings on newer games presumably got a hell of a lot longer.
|
# ? Aug 13, 2020 19:48 |
|
I bought a 2070 Super because I had built a new desktop and bought a new 1440p monitor and my old 1060 6GB just couldn't keep up with it. Until we see the performance, pricing, and availability of Ampere (as well as the demands of games I actually want to play) I'm waiting to see how I feel about the need to upgrade. At this time I'm kind of thinking I'll be able to comfortably stay with the 2070 Super for a while - my 1060 6GB lasted me 4 years and would likely still be my GPU if I hadn't upgraded my monitor.
|
# ? Aug 13, 2020 21:26 |
|
Does anyone actually believe in Ampere AIB boards launching any time soon? There's a lack of leaks in regards to those, considering it's supposedly happening so soon. That close before the release of Turing, pictures of AIB boards were being passed around.
|
# ? Aug 13, 2020 21:31 |
|
Combat Pretzel posted:Does anyone actually believe in Ampere AIB boards launching any time soon? There's a lack of leaks in regards to those, considering it's supposedly happening so soon. That close before the release of Turing, pictures of AIB boards were being passed around. leaks say they’ll be launching immediately, which in practical terms probably means within a month or so of FE
|
# ? Aug 13, 2020 21:38 |
|
"Launched" and "Available" are also two separate, distinct states.
|
# ? Aug 13, 2020 21:40 |
|
Lockback posted:"Launched" and "Available" are also two separate, distinct states. Yeah, this. I wouldn't be terribly surprised if they launched alongside the FEs shortly after NVidia's announcement. But I also wouldn't be terribly surprised if that initial batch was very small, and it took a month or two for them to really have enough inventory that you don't have to snipe one from somewhere. Or maybe NVidia learned from the last two launches and have stocked up.
|
# ? Aug 13, 2020 22:03 |
|
Okay my million dollar idea that I'm providing here for free because I'm fed up: Someone use what they did for RTX Voice to develop something that mutes the buzz from the drone audio feed for all these live sports events. It's only getting more and more popular to use drones for sport coverage and it's something super marketable to the companies doing the streaming.
|
# ? Aug 13, 2020 22:12 |
|
Carecat posted:What I'm hearing from this is Nvidia saying "check out this $2000 3090" I think my hard line in the sand is $1300-1500 for a 3080 Ti/3090. Def not paying for a rebranded Titan
|
# ? Aug 13, 2020 22:21 |
|
VelociBacon posted:Okay my million dollar idea that I'm providing here for free because I'm fed up: This is only slightly relevant, but a different issue on a lot of live sports is the commentary - a really useful tip I learned some years ago is that on some broadcasts with 5.1 audio, disabling the central speaker is all you need to do to get rid of commentary altogether, since that's all it's used for. You still get all the rest of the audio on the other speakers. Perhaps the drone audio is exclusive to one of the other speakers, it's worth taking a look next time you are watching live sports.
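For what it's worth, the same trick can be sketched on a recording with ffmpeg's pan filter: downmix the 5.1 track to stereo and simply leave the front-center channel (where the commentary sits on those broadcasts) out of the mix. Filenames and mix coefficients below are placeholder guesses:

```shell
# Downmix 5.1 -> stereo while dropping the front-center channel (FC),
# which on these broadcasts carries only the commentary.
# Coefficients are rough; the video stream is passed through untouched.
ffmpeg -i match.mkv \
  -af "pan=stereo|FL=FL+0.7*BL+0.5*LFE|FR=FR+0.7*BR+0.5*LFE" \
  -c:v copy \
  match_no_commentary.mkv
```

This only works if the commentary really is isolated on FC, same caveat as the speaker trick.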
|
# ? Aug 13, 2020 22:36 |
|
I like the intel shrouds and also the lists of GPUs the only two GPUs that ill probably really care about in any sentimental sense are the Riva TNT 16mb i had that screamed at the time & the GeForce4 MX440, which, ill just say i look back on fondly & smile
|
# ? Aug 13, 2020 23:34 |
|
I remember the upgrade from Hercules to VGA.
|
# ? Aug 13, 2020 23:37 |
|
DrDork posted:Or maybe NVidia learned from the last two launches and have stocked up. In the Year of the Furlough, they'd be fools if they haven't done this. Absolute fools.
|
# ? Aug 13, 2020 23:40 |
|
People are so jACKED uP about this launch that they will sell through them all regardless of the price or stock, i think.
|
# ? Aug 13, 2020 23:42 |
|
sean10mm posted:I remember the upgrade from Hercules to VGA. That was some crazy poo poo! Cga and hercules having 4? colours, then some guy had an EGA which was 16?, then suddenly there was vga with 256 colours, which was 100% 'photorealistic' and absolutely blew our minds.
|
# ? Aug 13, 2020 23:44 |
|
I'm kind of wondering what effect the Postal Service collapse is going to have on people jockeying for acquiring these parts ASAP I mean, not even just GPUs, but in general
|
# ? Aug 13, 2020 23:46 |
|
oh man. going back and reminiscing about all the times i said "haha man this is it, its basically real" i think i gave up doing that in like 1997 because man it was tiring being wrong so often dark forces first level: oh my god its here im actually in star wars
|
# ? Aug 13, 2020 23:47 |
|
Yeah I've been itching to upgrade for about a year now and I've been waiting and holding off on playing most of the modern releases I want to play because of the GPU landscape being what it is right now, and I'll be fuming if I can't in time for Cyberpunk just because Nvidia decided to launch with only a couple of thousand units available worldwide. They better not be trying to pull a nintendo switch on us, it isn't big and it isn't clever.
|
# ? Aug 13, 2020 23:47 |
|
redreader posted:That was some crazy poo poo! Cga and hercules having 4? colours, then some guy had an EGA which was 16?, then suddenly there was vga with 256 colours, which was 100% 'photorealistic' and absolutely blew our minds.
SELECT GRAPHICS MODE
F1) CGA
F2) TANDY
F3) EGA
F4) MCGA
F5) HERCULES
F6) VGA
F7) SVGA
F8) XGA
F9) MONOCHROME
F10) MDA
F11) MDMA
F12) HERCULES: ZERO TO HERO (Disney - 1999 - VHS)
|
# ? Aug 13, 2020 23:56 |
|
sean10mm posted:I remember the upgrade from Hercules to VGA. Wasn’t Hercules sth like 740*350 an beat even EGA but was only black and white? IIRC Hercules cards were used for monochrome, black-green and black-orange CRT Monitors. I remember my dad and my buddies gaming dads were from CGA to EGA and VGA around 1988, 1989. I also remember playing Comanche in a 256 color VGA Glory on the NovaLogic VoxelSpace engine that was groundbreaking back in the day and finally buried the Amiga and Atari ST as 3D contenders.
|
# ? Aug 14, 2020 00:42 |
|
Mr.PayDay posted:Wasn’t Hercules sth like 740*350 an beat even EGA but was only black and white? Yup, monochrome text with graphics. Jumping straight to VGA was wild.
|
# ? Aug 14, 2020 01:27 |
|
Furiously hits F11
|
# ? Aug 14, 2020 02:17 |
|
3090 pcb leak? https://wccftech.com/nvidia-geforce-rtx-3090-enthusiast-ampere-gaming-graphics-card-pcb-pictured-triple-8-pin-connectors-next-gen-g6-memory/

quote:There's also a secondary chip that seems to be featured right underneath the GPU itself. The leaker placed an Intel CPU on top of the chip so that it doesn't get exposed but it looks like NVIDIA may offer a secondary chip that is not a part of the GPU die itself that may handle a set of specific workloads which are yet to be detailed. Other features we can expect from the NVIDIA GeForce RTX 30 series graphics cards is a fully PCIe Gen 4 compliant design and enhanced power delivery to several components on the PCB.

... rtx accelerator? ... In any case if there are completed 3rd party cards that could actually mean an at-launch positioning... potentially, maybe. Also ~22gb memory on pcb apparently.

CaptainSarcastic posted:I bought a 2070 Super because I had built a new desktop and bought a new 1440p monitor and my old 1060 6GB just couldn't keep up with it. Just out of curiosity, in retrospect how do you feel about the 2070S purchase? Didn't you recently buy that? Was it worth having it in the meantime? I can see how it might have been considering the quarantine. Taima fucked around with this message at 09:36 on Aug 14, 2020 |
# ? Aug 14, 2020 09:10 |
|
Taima posted:Just out of curiosity, in retrospect how do you feel about the 2070S purchase? Didn’t you recently buy that? Was it worth having it in the meantime? I can see how it might have been considering the quarantine. I feel good about it. I got my new monitor in May, and it was clear my 1060 6GB could not drive it at a comfortable resolution. I got my 2070 Super last week of May/first week of June, and have been able to run whatever I want pretty much maxed out. Trying to wait until Ampere released just didn't seem reasonable, and I think it is highly likely that I won't need to upgrade anyway, or at least not for a while. As it is I feel no time pressure to upgrade to Ampere, so supply issues are not a concern to me. Similarly, I have less invested in how it turns out to be in terms of price/performance ratio. Since I don't run at 4K and don't care about super high framerates I feel like I am pretty well-positioned with a 2070 Super. If Ampere does look good/provides much higher performance/has compelling bells and whistles then I can always upgrade down the road.
|
# ? Aug 14, 2020 09:24 |
|
Three 8-pin connectors
|
# ? Aug 14, 2020 09:36 |
|
|
|
Is it possible to dedicate separate GPUs for separate monitors? I have two 980s still which work great for Witcher 3 which I'm playing right now, but a lot of my time I am playing Hearthstone and MTG:Arena at the same time on adjacent monitors. With both plugged into my primary card my secondary game tends to lag and stutter which is very distracting, as both games combined max out the one 980 they use. I've got one monitor plugged into each gpu now and "enable all monitors" turned on in the control panel but it seems like anything on the second monitor is still run on the first card and then fed through the second. When I move a game from 1 to 2 the load on GPU1 stays the same but GPU2 loads up partially and there seems to be latency which is why I think this is what is happening. Any way to separate them?
|
# ? Aug 14, 2020 10:14 |