Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

MikeC posted:

That's a weird flex when the correct play was not to buy Turing though. All of the criticism was true and is still true to this day.

Anyone who didn't buy Turing will get DLSS and Ray Tracing at a better price on Ampere when the features are starting to come online. The only reason a lot of people care now is that CP 2077 is coming out with both major pieces of tech supported.

This is literally always true though. Don't buy Ampere because something better is coming out, just wait 2 years like you did with Turing. Turing ended up "pretty ok" because a lot of its features ended up being really cool, it just took a while to get there.

And there were lots of people who thought the Tensor cores were totally worthless, and now DLSS is the most exciting new feature that's come along in a long time.

(For the record, I think the non-Super lineup had questionable value but the Super refresh was pretty good).


Truga
May 4, 2014
Lipstick Apathy

repiv posted:

why would you do that when nvidia already has the best encoder block in the business

otoh, my ryzen can encode 1080p60 x264 on the slow preset at ~20% cpu usage, including whatever it has to throw at the game, lmao. i'm gonna try the veryslow preset tomorrow to see if there's any more quality to be gained, though i somewhat doubt it

if you're just recording, the gpu encoding is probably the best, but for streaming stuff you kinda want x264 to capture other poo poo, and ryzen just absolutely crunches that poo poo now at settings that usually produce better quality video. it's loving amazing y'all
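
For reference, the experiment above boils down to a single x264 invocation; a minimal sketch via ffmpeg called from Python, with hypothetical file names:

code:

import subprocess

# CPU-side x264 encode of a 1080p60 capture, matching the settings above
subprocess.run([
    "ffmpeg",
    "-i", "gameplay_capture.mkv",  # hypothetical lossless capture file
    "-c:v", "libx264",
    "-preset", "slow",             # swap in "veryslow" to chase extra quality
    "-crf", "18",                  # rate target: lower = better and bigger
    "-c:a", "copy",                # leave the game audio untouched
    "out_1080p60.mp4",
], check=True)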

Ugly In The Morning
Jul 1, 2010
Pillbug

Lockback posted:


(For the record, I think the non-Super lineup had questionable value but the Super refresh was pretty good).

I'm incredibly happy with my 2070S, it was in exactly the sweet spot for that graphics card lineup. That said, it's probably going to be my shortest-lived graphics card since Ampere is dropping like 8 months after I got it.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Carecat posted:

What I'm hearing from this is Nvidia saying "check out this $2000 3090"

If the 3090 is the new Titan naming scheme, $2k would be a bargain. The Turing one is $2500+.

Lockback posted:


This is literally always true though. Don't buy Ampere because something better is coming out, just wait 2 years like you did with Turing. Turing ended up "pretty ok" because a lot of its features ended up being really cool, it just took a while to get there.

But it took a substantially longer "a while to get there" for this generation. In most past generations, you got most of the goodness upfront, and then maybe some new features got picked up later on that made some mild improvement. Turing suffered from not being all that much faster than Pascal, while upping prices, and promising really cool stuff that, frankly, still hasn't panned out except in a few games. In those games where it did finally show up, it is indeed pretty cool, but now we're at the point that we're actively discouraging people from buying Turing because Ampere is so close.

Don't get me wrong, I think NVidia made the right strategic play by using Turing to prime the pump for DLSS 2.0 + RTX adoption with Ampere, but that doesn't change the fact that for the vast majority of its life it was a pretty lackluster generation. Had they launched with Super-performance cards no one would be arguing, but they didn't.

Alchenar
Apr 9, 2008

DLSS becoming relevant right at the point where Turing is about to be replaced isn't really the strongest argument in your favour.

NVIDIA have absolutely got a long-term business strategy that looks like it's about to start paying off heavily, but that doesn't change the fact that the 2000 series were overpriced and underwhelming.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Alchenar posted:

DLSS becoming relevant right at the point where Turing is about to be replaced isn't really the strongest argument in your favour.

NVIDIA have absolutely got a long-term business strategy that looks like it's about to start paying off heavily, but that doesn't change the fact that the 2000 series were overpriced and underwhelming.

Especially when it's focused on the 2080 Ti. You could have double the number of games with support for ray tracing + DLSS 2.0 and it would still be questionable as a 'sensible' purchase for all but the most hardcore enthusiasts. It's a $1200 card, hell it's always been ~$2000 Cndn (at best!) here, which is simply unobtainable for the vast majority of the gaming public, even those with relatively high-end PCs. If that's where RTX gets 'good' then if anything it cements 'wait for Ampere'.

DeadlyHalibut
May 31, 2008
Which of the new cards should I be waiting for if I'm interested in 1440p 144Hz gaming? It would be paired with a Ryzen 3700X.

3070?

I have no idea what modern cards do, my old setup was like 8 years old.

DeadlyHalibut fucked around with this message at 17:08 on Aug 13, 2020

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

DeadlyHalibut posted:

Which of the new cards should I be waiting for if I'm interested in 1440p 144Hz gaming? It would be paired with a Ryzen 3700X.

3070?

I have no idea what modern cards do, my old setup was like 8 years old.

Probably, yeah.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I agree that the 20 series were overpriced and underwhelming. It's the ugly, awkward teen stage of the RTX, DLSS, etc. newness that looks like it'll find its legs and its maturity with the 30 series. I guess it was a necessary step that could have gone better for us price-wise, but what's done is done, and I think when we're all playing Cyberpunk with all that raytracing poo poo it'll be worth it.

Suffice to say, I think the days of paying just $500-700 for the top-end Ti are over; I do think the new cards will be expensive once again.
My 1080ti will have a viking funeral when it finally bites the dust.

Cygni
Nov 12, 2005

raring to post

https://www.anandtech.com/show/15974/intels-xehpg-gpu-unveiled-built-for-enthusiast-gamers-built-at-a-thirdparty-fab

Intel's new "HPG" gaming-targeted Xe variant confirmed for 2021. Sure looks like the "Xe is cancelled!" talk was dumb made-up dogshit, wow!

DXR hardware ray tracing and GDDR6, not on EMIB it appears.

Cygni fucked around with this message at 17:14 on Aug 13, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Wichard Leadbeater seems pretty upbeat about it too https://www.eurogamer.net/articles/digitalfoundry-2020-intel-architecture-day-tiger-lake-xe-gaming-focus

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Zedsdeadbaby posted:

I agree that the 20 series were overpriced and underwhelming. It's the ugly, awkward teen stage of the RTX, DLSS, etc. newness that looks like it'll find its legs and its maturity with the 30 series. I guess it was a necessary step that could have gone better for us price-wise, but what's done is done, and I think when we're all playing Cyberpunk with all that raytracing poo poo it'll be worth it.

Suffice to say, I think the days of paying just $500-700 for the top-end Ti are over; I do think the new cards will be expensive once again.
My 1080ti will have a viking funeral when it finally bites the dust.

I will say that one thing in Turing's favor is that DLSS2 might make its overall viable lifespan quite long. It's kind of aging like fine wine, assuming you believe in a DLSS-centric future. And it really did gain on benchmarks with current games over the years it was on the market as well. Not the best architecture ever, but it's at least intellectually interesting how it got much better over time.

Also I mentioned this before (and it doesn't change your point) but the die size on Ampere is significantly smaller, which will save them a decent amount on fab. If Nvidia takes pity on us poor souls, they could ratchet down the pricing, or maybe just keep prices consistent while increasing the VRAM up to 20GB on the top end. It's a possibility at least...

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
...or reap more profit. Hmmm, what would a company do, what would a company do.....

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Lockback posted:

...or reap more profit. Hmmm, what would a company do, what would a company do.....

Reinvest in Shroud Tech 2.0 to better dissipate all the heat from the denser cores!

an actual dog
Nov 18, 2014


Is he ever not? Lol

Alchenar
Apr 9, 2008

Taima posted:

I will say that one thing in Turing's favor is that DLSS2 might make its overall viable lifespan quite long. It's kind of aging like fine wine, assuming you believe in a DLSS-centric future. And it really did gain on benchmarks with current games over the years it was on the market as well. Not the best architecture ever, but it's at least intellectually interesting how it got much better over time.

Also I mentioned this before (and it doesn't change your point) but the die size on Ampere is significantly smaller, which will save them a decent amount on fab. If Nvidia takes pity on us poor souls, they could ratchet down the pricing, or maybe just keep prices consistent while increasing the VRAM up to 20GB on the top end. It's a possibility at least...

DLSS2 absolutely means that the lifespan you can stretch out of a card by turning down settings on newer games presumably just got a hell of a lot longer.

CaptainSarcastic
Jul 6, 2013



I bought a 2070 Super because I had built a new desktop and bought a new 1440p monitor and my old 1060 6GB just couldn't keep up with it.

Until we see the performance, pricing, and availability of Ampere (as well as the demands of games I actually want to play) I'm waiting to see how I feel about the need to upgrade. At this time I'm kind of thinking I'll be able to comfortably stay with the 2070 Super for a while - my 1060 6GB lasted me 4 years and would likely still be my GPU if I hadn't upgraded my monitor.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Does anyone actually believe Ampere AIB boards are launching any time soon? There's a lack of leaks in regard to those, considering it's supposedly happening so soon. This close before the release of Turing, pictures of AIB boards were already being passed around.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Combat Pretzel posted:

Does anyone actually believe Ampere AIB boards are launching any time soon? There's a lack of leaks in regard to those, considering it's supposedly happening so soon. This close before the release of Turing, pictures of AIB boards were already being passed around.

leaks say they’ll be launching immediately, which in practical terms probably means within a month or so of FE

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
"Launched" and "Available" are also two separate, distinct states.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Lockback posted:

"Launched" and "Available" are also two separate, distinct states.

Yeah, this. I wouldn't be terribly surprised if they launched alongside the FEs shortly after NVidia's announcement. But I also wouldn't be terribly surprised if that initial batch was very small, and it took a month or two for them to really have enough inventory that you don't have to snipe one from somewhere.

Or maybe NVidia learned from the last two launches and have stocked up. :iiam:

VelociBacon
Dec 8, 2009

Okay my million dollar idea that I'm providing here for free because I'm fed up:

Someone use what they did for RTX Voice to develop something that mutes the buzz from the drone audio feed for all these live sports events. It's only getting more and more popular to use drones for sports coverage, and it's something super marketable to the companies doing the streaming.
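
RTX Voice itself is a trained network, but the classical baseline for this idea is spectral gating: profile the drone's hum from a noise-only clip, then attenuate the time-frequency bins that sit near that noise floor. A toy sketch with scipy, assuming mono audio and hypothetical file names:

code:

import numpy as np
from scipy.io import wavfile
from scipy.signal import stft, istft

# hypothetical inputs: the broadcast audio and a drone-only noise clip
rate, noisy = wavfile.read("broadcast_with_drone.wav")
_, noise = wavfile.read("drone_only.wav")
noisy = noisy.astype(np.float32)
noise = noise.astype(np.float32)

# per-frequency noise floor estimated from the drone-only profile
_, _, N = stft(noise, fs=rate, nperseg=1024)
floor = np.abs(N).mean(axis=1, keepdims=True)

# gate: keep time-frequency bins well above the floor, silence the rest
_, _, Z = stft(noisy, fs=rate, nperseg=1024)
mask = (np.abs(Z) > 3.0 * floor).astype(np.float32)
_, cleaned = istft(Z * mask, fs=rate, nperseg=1024)

wavfile.write("broadcast_cleaned.wav", rate, cleaned.astype(np.int16))

A hard binary mask like this leaves "musical noise" artifacts behind, which is exactly the gap the learned approaches close.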

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

Carecat posted:

What I'm hearing from this is Nvidia saying "check out this $2000 3090"

I think my hard line in the sand is $1300-1500 for a 3080 Ti/3090. Def not paying for a rebranded Titan

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

VelociBacon posted:

Okay my million dollar idea that I'm providing here for free because I'm fed up:

Someone use what they did for RTX Voice to develop something that mutes the buzz from the drone audio feed for all these live sports events. It's only getting more and more popular to use drones for sports coverage, and it's something super marketable to the companies doing the streaming.

This is only slightly relevant, but a different issue on a lot of live sports is the commentary - a really useful tip I learned some years ago is that on some broadcasts with 5.1 audio, disabling the central speaker is all you need to do to get rid of commentary altogether, since that's all it's used for. You still get the rest of the audio on the other speakers.

Perhaps the drone audio is exclusive to one of the other speakers; it's worth taking a look next time you're watching live sports.
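
For what it's worth, the same trick works offline: ffmpeg's pan filter can rebuild the 5.1 mix with every channel except the centre, since pan leaves any unmapped output channel silent. A sketch for a recorded broadcast, with hypothetical file names:

code:

import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "match_recording.mkv",   # hypothetical 5.1 recording
    # map every channel except FC (front centre), which stays silent
    "-af", "pan=5.1|FL=FL|FR=FR|LFE=LFE|BL=BL|BR=BR",
    "-c:v", "copy",                # don't re-encode the video
    "commentary_free.mkv",
], check=True)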

Worf
Sep 12, 2017

If only Seth would love me like I love him!

I like the intel shrouds and also the lists of GPUs

the only two GPUs that I'll probably really care about in any sentimental sense are the Riva TNT 16MB I had that screamed at the time & the GeForce4 MX440, which, I'll just say I look back on fondly & smile

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World
I remember the upgrade from Hercules to VGA.

:corsair:

Cactus
Jun 24, 2006

DrDork posted:

Or maybe NVidia learned from the last two launches and have stocked up. :iiam:

In the Year of the Furlough, they'd be fools if they haven't done this. Absolute fools.

Cygni
Nov 12, 2005

raring to post

People are so jACKED uP about this launch that they'll sell through them all regardless of the price or stock, i think.

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum

sean10mm posted:

I remember the upgrade from Hercules to VGA.

:corsair:

That was some crazy poo poo! CGA and Hercules having 4? colours, then some guy had an EGA which was 16?, then suddenly there was VGA with 256 colours, which was 100% 'photorealistic' and absolutely blew our minds.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I'm kind of wondering what effect the Postal Service collapse is going to have on people jockeying to acquire these parts ASAP

I mean, not even just GPUs, but in general

Worf
Sep 12, 2017

If only Seth would love me like I love him!

oh man. going back and reminiscing about all the times i said "haha man this is it, its basically real"

i think i gave up doing that in like 1997 because man it was tiring being wrong so often

dark forces first level: oh my god its here im actually in star wars

Cactus
Jun 24, 2006

Yeah, I've been itching to upgrade for about a year now, and I've been holding off on playing most of the modern releases I want to play because of the GPU landscape being what it is right now. I'll be fuming if I can't upgrade in time for Cyberpunk just because Nvidia decided to launch with only a couple of thousand units available worldwide.

They'd better not be trying to pull a Nintendo Switch on us; it isn't big and it isn't clever.

Cygni
Nov 12, 2005

raring to post

redreader posted:

That was some crazy poo poo! CGA and Hercules having 4? colours, then some guy had an EGA which was 16?, then suddenly there was VGA with 256 colours, which was 100% 'photorealistic' and absolutely blew our minds.

SELECT GRAPHICS MODE

F1) CGA
F2) TANDY
F3) EGA
F4) MCGA
F5) HERCULES
F6) VGA
F7) SVGA
F8) XGA
F9) MONOCHROME
F10) MDA
F11) MDMA
F12) HERCULES: ZERO TO HERO (Disney - 1999 - VHS)

Mr.PayDay
Jan 2, 2004
life is short - play hard

sean10mm posted:

I remember the upgrade from Hercules to VGA.

:corsair:

Wasn't Hercules something like 720×348 and beat even EGA but was only black and white?
IIRC Hercules cards were used for monochrome CRT monitors, black-green and black-orange ones.

I remember my dad and my buddies' gaming dads going from CGA to EGA and VGA around 1988-1989.

I also remember playing Comanche in 256-color VGA glory on NovaLogic's Voxel Space engine, which was groundbreaking back in the day and finally buried the Amiga and Atari ST as 3D contenders.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Mr.PayDay posted:

Wasn't Hercules something like 720×348 and beat even EGA but was only black and white?
IIRC Hercules cards were used for monochrome CRT monitors, black-green and black-orange ones.

I remember my dad and my buddies' gaming dads going from CGA to EGA and VGA around 1988-1989.

I also remember playing Comanche in 256-color VGA glory on NovaLogic's Voxel Space engine, which was groundbreaking back in the day and finally buried the Amiga and Atari ST as 3D contenders.

Yup, monochrome text with graphics.

Jumping straight to VGA was wild.

Parallelwoody
Apr 10, 2008


Furiously hits F11

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
3090 pcb leak? https://wccftech.com/nvidia-geforce-rtx-3090-enthusiast-ampere-gaming-graphics-card-pcb-pictured-triple-8-pin-connectors-next-gen-g6-memory/

quote:

There's also a secondary chip that seems to be featured right underneath the GPU itself. The leaker placed an Intel CPU on top of the chip so that it doesn't get exposed, but it looks like NVIDIA may offer a secondary chip that is not part of the GPU die itself and that may handle a set of specific workloads which are yet to be detailed. Other features we can expect from the NVIDIA GeForce RTX 30 series graphics cards are a fully PCIe Gen 4 compliant design and enhanced power delivery to several components on the PCB.

:raise: ... rtx accelerator?... :shrug:

In any case, if there are completed 3rd party cards, that could actually mean at-launch availability.... potentially, maybe.

Also ~22GB of memory on the PCB, apparently.

CaptainSarcastic posted:

I bought a 2070 Super because I had built a new desktop and bought a new 1440p monitor and my old 1060 6GB just couldn't keep up with it.

Until we see the performance, pricing, and availability of Ampere (as well as the demands of games I actually want to play) I'm waiting to see how I feel about the need to upgrade. At this time I'm kind of thinking I'll be able to comfortably stay with the 2070 Super for a while - my 1060 6GB lasted me 4 years and would likely still be my GPU if I hadn't upgraded my monitor.

Just out of curiosity, in retrospect how do you feel about the 2070S purchase? Didn’t you recently buy that? Was it worth having it in the meantime? I can see how it might have been considering the quarantine.

Taima fucked around with this message at 09:36 on Aug 14, 2020

CaptainSarcastic
Jul 6, 2013



Taima posted:

Just out of curiosity, in retrospect how do you feel about the 2070S purchase? Didn’t you recently buy that? Was it worth having it in the meantime? I can see how it might have been considering the quarantine.

I feel good about it. I got my new monitor in May, and it was clear my 1060 6GB could not drive it at a comfortable resolution. I got my 2070 Super last week of May/first week of June, and have been able to run whatever I want pretty much maxed out. Trying to wait until Ampere released just didn't seem reasonable, and I think it is highly likely that I won't need to upgrade anyway, or at least not for a while. As it is I feel no time pressure to upgrade to Ampere, so supply issues are not a concern to me. Similarly, I have less invested in how it turns out to be in terms of price/performance ratio. Since I don't run at 4K and don't care about super high framerates I feel like I am pretty well-positioned with a 2070 Super. If Ampere does look good/provides much higher performance/has compelling bells and whistles then I can always upgrade down the road.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Three 8-pin connectors :stonk:


BurritoJustice
Oct 9, 2012

Is it possible to dedicate separate GPUs to separate monitors? I have two 980s which still work great for Witcher 3, which I'm playing right now, but a lot of the time I'm playing Hearthstone and MTG:Arena at the same time on adjacent monitors.

With both plugged into my primary card my secondary game tends to lag and stutter which is very distracting, as both games combined max out the one 980 they use.

I've got one monitor plugged into each GPU now and "enable all monitors" turned on in the control panel, but it seems like anything on the second monitor is still rendered on the first card and then fed through the second. When I move a game from monitor 1 to monitor 2, the load on GPU1 stays the same but GPU2 loads up partially, and there seems to be added latency, which is why I think this is what's happening.

Any way to separate them?
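
One way to confirm that copy-through diagnosis is to watch per-GPU utilization while dragging a game between monitors: if GPU1's load never drops, it's still doing the rendering. A minimal sketch using the real pynvml bindings (pip install nvidia-ml-py); watching nvidia-smi in a terminal works just as well:

code:

import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

# print each card's utilization once a second; Ctrl+C to stop
try:
    while True:
        readings = []
        for i, h in enumerate(handles):
            util = pynvml.nvmlDeviceGetUtilizationRates(h)
            readings.append(f"GPU{i}: {util.gpu:3d}%")
        print("  ".join(readings))
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()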
