Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Stickman posted:

Let them eat Titans!

E: In a near-monopoly, product pricing and product-stack choices are not going to be closely tethered to some reasonably fixed margin.

Yep. Nvidia has finally realized it has a near-monopoly. Even if RDNA2 turns out cool and good, Nvidia still has a lock on the data center (hence the same GPU that went into a $700, then $1200, consumer card selling for $6,000).

As an NVDA shareholder, thank you Jensen!!!


Truga
May 4, 2014
Lipstick Apathy

Mr.PayDay posted:

"Nvidia got greedy" is the consensus of the self-styled tech experts, a.k.a. gamers.

it's not just nvidia, it's all corporations, op

Mr.PayDay
Jan 2, 2004
life is short - play hard

B-Mac posted:

I don’t think I’ve seen as much hyperbole packed into a post in a long time.

Might be the PCGH, Computerbase, Hardwareluxx, and Gamestar experience. German gamers feel entitled to next-gen fps for 500 euros max.

Truga
May 4, 2014
Lipstick Apathy
computing has been in a really lovely state for a long time now: every advance in performance is immediately followed by a similar advance in the bloat our computers run. as long as you disable a couple of really stupid features games ship with, like hairworks or weirdly expensive ambient occlusion algorithms that do nothing, games run just fine on a $200-500 gpu depending on your resolution. so german gamers are both correct and dumb as poo poo.

raytracing is a bit of a different beast, but i'm also 100% sure it's being done in the dumbest, most expensive way possible right now and won't be actually good for another few years so there's probably no point in worrying about it for another generation or two either.

e: like seriously i can play a bunch of slightly older games at 4k on my 980Ti, and just dropping hbao+ to ssao does nothing to image quality and gets me from 45 to vsync 60fps.

Truga fucked around with this message at 19:58 on Aug 20, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Truga posted:

e: like seriously i can play a bunch of slightly older games at 4k on my 980Ti, and just dropping hbao+ to ssao does nothing to image quality and gets me from 45 to vsync 60fps.

While often true, if you're happy at 60fps you're not the demographic any of the xx60-and-up Ampere cards are targeting in the first place.

And "slightly older" games able to run comfortably at 4k@45 on a 980Ti are probably from...2016?

Truga
May 4, 2014
Lipstick Apathy
2017/18 too but yeah

also i'm not really happy with 60fps, but my ancient monitor that did 2560x1600@72 died and i'm holding off on buying a replacement until the new nvidia/radeon cards land and i can see what the gently caress is up. if RDNA2 can run my games at 4k/120 on high-ish settings i'm probably going with that, otherwise i'm gonna get 1440p/144hz

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Truga posted:

if RDNA2 can run my games at 4k/120 on high-ish settings i'm probably going with that

That would be hugely optimistic, let's put it that way. The new cards are competing with Turing, as much as loyal AMD users hate to face it. I don't expect anything much better than 2080ti-tier speeds on their fastest cards.

AMD simply aren't the ones to go to for the extreme high-end, they haven't been for years. They're all about the segments underneath it (and they do well there)

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
1440p/~144Hz is probably gonna be the sweet spot for price:quality for a bit, yeah. Personally I'm waiting for HDMI 2.1 to finally start showing up in monitors so I can move up off my X34, but that's because I spend way too much time in front of a screen, so dropping $2k or whatever on a real nice monitor isn't an outlandish expense for me.

So what I'm saying is that if the stars align and manufacturers get on fuckin' board and start dropping some quality monitors in the next quarter or two, I'm gonna be out a big chunk of cash between Zen3, a 3080(???), and whatever 21:9 high-hz monitor I can get.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zedsdeadbaby posted:

That would be hugely optimistic, let's put it that way. The new cards are competing with Turing, as much as loyal AMD users hate to face it. I don't expect anything much better than 2080ti-tier speeds on their fastest cards.

AMD simply aren't the ones to go to for the extreme high-end, they haven't been for years. They're all about the segments underneath it (and they do well there)

Maybe not as much as you think. Remember he's playing games from 2018 or earlier, and he's fine with turning stuff down to medium or whatever. Given that, a 2080/2080Ti-tier card might get pretty close to 4k@120ish. Maybe not a solid 120, but possibly close enough, especially with FreeSync thrown in.

Though at that point I might just look for a firesale/used 2080Ti, since at least you know the performance characteristics already and you know that the drivers are stable. I suppose it depends what AMD does with their pricing.

repiv
Aug 13, 2009

Intel just released driver updates for 3rd-6th gen iGPUs, even though they're generally not supported anymore, so uh update because they probably fixed some nasty as-yet undisclosed exploit

Cygni
Nov 12, 2005

raring to post

repiv posted:

Intel just released driver updates for 3rd-6th gen iGPUs, even though they're generally not supported anymore, so uh update because they probably fixed some nasty as-yet undisclosed exploit

wonder if it had anything to do with the Microsoft bug in the Windows 10 2004 rollout and Intel graphics? I can't believe that was ever a thing, but here we are.

Ugly In The Morning
Jul 1, 2010
Pillbug

Zedsdeadbaby posted:


AMD simply aren't the ones to go to for the extreme high-end, they haven't been for years. They're all about the segments underneath it (and they do well there)

I wish they would, for two reasons:

A) I used to love Radeon cards and they were some of my best GPU buys ever, and
B) it would at least make Nvidia cool it on the price increases some.

MikeC
Jul 19, 2004
BITCH ASS NARC

Zedsdeadbaby posted:

That would be hugely optimistic, let's put it that way. The new cards are competing with Turing, as much as loyal AMD users hate to face it. I don't expect anything much better than 2080ti-tier speeds on their fastest cards.

AMD simply aren't the ones to go to for the extreme high-end, they haven't been for years. They're all about the segments underneath it (and they do well there)

Do you know something we don't? The new Xbox should be just about as powerful as a 2080 Ti. Are you saying that AMD is not interested in releasing a high-end RDNA2 card and will simply stay in the mid-range as they did with RDNA1, or are you saying that the Xbox SoC is literally the best performance they can wring out of RDNA 2?

Not being snarky, just curious.

Cygni
Nov 12, 2005

raring to post

MikeC posted:

The new Xbox should be just about as powerful as a 2080Ti.

in what, tflops? if tflops were all that mattered, the Vega 64 would be faster than a 2080. it is not.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MikeC posted:

Do you know something we don't?

The Xbox Series X claims 12 TFLOPs vs the 2080 Ti's 14+ TFLOPs, so even on paper it's trailing by ~15% from the start. The released clock rate of ~1.8GHz is about where the base clocks of current Navi cards sit, with boost clocks around 2GHz. So if they just ported it over directly like that with ~2GHz clocks, yeah, it could probably land somewhere close to or a bit below 2080 Ti performance under optimal conditions (never forget that TFLOPs don't tell the whole story thanks to drivers and game optimizations, and RDNA1 tended not to scale nearly as well with resolution as Turing did in some games).

That they'll have something at least a bit faster than that seems natural; I'd expect they didn't wring every last bit of performance out of something they had to slap into an SoC sharing a power budget with a CPU. But how much faster they bother going is a total mystery right now, and prices are also an open question. If I were AMD and knew I had a competitive product, though, I'd start messaging that real goddamned soon, or Nvidia is gonna release Ampere into an environment where people are salivating for new cards.

On the other hand, AMD has traditionally been quite OK with tapping out around the xx60/70 level, on the argument that that's about where volume sales start to drop off. Sure, the xx80 Ti and Titan cards carry fat margins, but they don't sell anywhere near as many of them as the lower-tier cards, where the "real" money is made on volume. I honestly could see them hanging around in the mid tier for one more generation, then using the hot cash injection they've gotten from Zen to try to turn it up to 11 for whatever their next gen will be.
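
e: napkin math for the paper-spec comparison, since theoretical FP32 throughput is just shader count x 2 FMA ops per clock x clock speed. The shader counts and clocks below are the published reference figures; the whole point is how little the totals tell you across architectures:

code:
# Theoretical FP32 throughput: shaders * 2 ops/cycle (FMA) * clock.
# Published reference figures; real-game performance diverges by architecture.

def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """shaders * 2 ops/clock * clock (GHz) gives GFLOPs; /1000 for TFLOPs."""
    return shaders * 2 * clock_ghz / 1000.0

print(f"XSX:     {fp32_tflops(52 * 64, 1.825):.1f} TFLOPs")  # 52 CUs x 64 shaders, ~12.1
print(f"2080 Ti: {fp32_tflops(4352, 1.545):.1f} TFLOPs")     # reference boost, ~13.4
print(f"Vega 64: {fp32_tflops(4096, 1.546):.1f} TFLOPs")     # ~12.7 on paper
print(f"2080:    {fp32_tflops(2944, 1.710):.1f} TFLOPs")     # ~10.1, yet faster in games
Vega 64 beats a 2080 on paper and loses to it in practice, which is exactly why cross-architecture TFLOP comparisons are shaky.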

DrDork fucked around with this message at 00:47 on Aug 21, 2020

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

repiv posted:

Intel just released driver updates for 3rd-6th gen iGPUs, even though they're generally not supported anymore, so uh update because they probably fixed some nasty as-yet undisclosed exploit

gonna put some money on the shader engine getting tickled by WebGL, just for old time’s sake

shrike82
Jun 11, 2005

lol people on the various boards I follow seem to be latching onto the hope/rumor that the 3090s will be $1400. Don't see that happening if it's loaded with 24GB. Maybe if there's a 12GB variant.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

MikeC posted:

Do you know something we don't? The new Xbox should be just about as powerful as a 2080 Ti. Are you saying that AMD is not interested in releasing a high-end RDNA2 card and will simply stay in the mid-range as they did with RDNA1, or are you saying that the Xbox SoC is literally the best performance they can wring out of RDNA 2?

Not being snarky, just curious.


The XSX should be around 2080 range, not 2080 Ti range; that's a 30% or so gap in performance. The rumored Big Navi is probably faster than a 2080 Ti, but Nvidia will be releasing two cards faster than the 2080 Ti in the 3080 and 3090. Realistically I think we're looking at Big Navi around RTX 3080 range, while the 3090, as King poo poo of Frames Mountain, stands tall and demands exorbitant prices.

I really feel that if AMD were going to compete with the 3090, they would have already started some sort of marketing campaign. If they let NV get to September 1st and launch Ampere while staying silent themselves, presume they won't be competitive, because it would be super dumb to let NV take all that fat whale money unopposed for two months before Big Navi launches.

NewFatMike
Jun 11, 2015

I'm honestly not surprised that Radeon marketing has taken a chill pill. Wasn't all the insane Soviet-style and "poor Volta" poo poo on Raja's watch?

Not to say they're sandbagging or anything about the product, but if I were Lisa, I'd keep the marketing department in a cavern until they produced something that's less of a dumpster fire.

MikeC
Jul 19, 2004
BITCH ASS NARC

Cygni posted:

in what, tflops? if tflops were all that mattered, the Vega 64 would be faster than a 2080. it is not.

Not sure what you're getting at. No one has an Xbox to test yet, so yes, tflops are what we have to go by. Or do you, too, have information we don't that says it's poo poo?


DrDork posted:

The Xbox Series X claims 12 TFLOPs vs the 2080 Ti's 14+ TFLOPs, so even on paper it's trailing by ~15% from the start. The released clock rate of ~1.8GHz is about where the base clocks of current Navi cards sit, with boost clocks around 2GHz. So if they just ported it over directly like that with ~2GHz clocks, yeah, it could probably land somewhere close to or a bit below 2080 Ti performance under optimal conditions (never forget that TFLOPs don't tell the whole story thanks to drivers and game optimizations, and RDNA1 tended not to scale nearly as well with resolution as Turing did in some games).

That they'll have something at least a bit faster than that seems natural; I'd expect they didn't wring every last bit of performance out of something they had to slap into an SoC sharing a power budget with a CPU. But how much faster they bother going is a total mystery right now, and prices are also an open question. If I were AMD and knew I had a competitive product, though, I'd start messaging that real goddamned soon, or Nvidia is gonna release Ampere into an environment where people are salivating for new cards.

On the other hand, AMD has traditionally been quite OK with tapping out around the xx60/70 level, on the argument that that's about where volume sales start to drop off. Sure, the xx80 Ti and Titan cards carry fat margins, but they don't sell anywhere near as many of them as the lower-tier cards, where the "real" money is made on volume. I honestly could see them hanging around in the mid tier for one more generation, then using the hot cash injection they've gotten from Zen to try to turn it up to 11 for whatever their next gen will be.

Yes, I agree that tflops are close to meaningless across architectures, but they're all we have. I was asking the poster who basically flat-out stated they *know* AMD isn't going to try to compete in the high end this year and that their expected best performer is around a 2080 Ti. To me, this makes no sense. The Xbox is going to be close to the 2080 Ti in raw tflops on a 56 CU die, and rumors of an 80 CU RDNA 2 card being in the works have been around literally forever.

So unless the poster was just spitballing, one of two things must be true: either he knows the 80 CU card is cancelled, just like what happened with RDNA 1, or he knows that RDNA 2 in the Xbox isn't going to come close to the 2080 Ti and AMD needs the 80 CU discrete GPU to match it. And he didn't sound like he was spitballing, given the certainty of his tone. If I misinterpreted and he's just goofing around like the rest of us, fair play; that's why I asked for clarification.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

MikeC posted:

Not sure what you're getting at. No one has an Xbox to test yet, so yes, tflops are what we have to go by. Or do you, too, have information we don't that says it's poo poo?
He's saying tflops can't be compared between architectures, but that cuts both ways: the Series X GPU is RDNA2, which is far more efficient per tflop than Vega. Everything I've read indicates that yes, it is in the "range" of a 2080 Ti.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Calling it now: devs will target the new consoles at max settings, so the headroom on Nvidia's new cards will be unusable until panel manufacturers push past 4K120.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

MikeC posted:

Not sure what you're getting at. No one has an Xbox to test yet, so yes, tflops are what we have to go by. Or do you, too, have information we don't that says it's poo poo?


Yes, I agree that tflops are close to meaningless across architectures, but they're all we have. I was asking the poster who basically flat-out stated they *know* AMD isn't going to try to compete in the high end this year and that their expected best performer is around a 2080 Ti. To me, this makes no sense. The Xbox is going to be close to the 2080 Ti in raw tflops on a 56 CU die, and rumors of an 80 CU RDNA 2 card being in the works have been around literally forever.

So unless the poster was just spitballing, one of two things must be true: either he knows the 80 CU card is cancelled, just like what happened with RDNA 1, or he knows that RDNA 2 in the Xbox isn't going to come close to the 2080 Ti and AMD needs the 80 CU discrete GPU to match it. And he didn't sound like he was spitballing, given the certainty of his tone. If I misinterpreted and he's just goofing around like the rest of us, fair play; that's why I asked for clarification.

It's spitballing more than anything else. Common sense alone tells us there's no way the XSX's graphical output will come anywhere near a $1,100 video card. The XSX's overall power draw tops out at 300W according to MS, while the 2080 Ti alone can hit 277W. The XSX is tiny and has to think about thermals, something AMD is notoriously poor at with high-end GPUs (Vega 56 and 64 say hello). Teraflops mean gently caress all and really should only be used as a point of comparison between similar architectures. And of course, the most crucial thing: MS has to sell consoles at a low price point. It's generally believed it will retail for 500-600 dollars.

If it truly were 2080 Ti-level power, their flagship game Halo Infinite wouldn't look like such a bag of poo poo, now would it? I think that's the most telling thing.
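
e: to put numbers on the wattage point (the GPU/CPU split is my guess, MS only publishes the wall figure): if the GPU gets roughly two-thirds of the 300W budget, it has to deliver its claimed 12 TFLOPs at about 60 GFLOPs per watt, versus roughly 43 for a Vega 64 and about 49 for a 2080 Ti at the max-draw figure above. That's the size of the efficiency jump being asked of a console box.

code:
# Paper GFLOPs per watt; the 200W GPU share of the XSX budget is an assumption.
xsx_gpu_w = 300 - 100   # guessed non-GPU share of the 300W system budget
print(f"XSX GPU: {12.15 / xsx_gpu_w * 1000:.0f} GFLOPs/W")  # ~61, required
print(f"Vega 64: {12.66 / 295 * 1000:.0f} GFLOPs/W")        # ~43 (295W board power)
print(f"2080 Ti: {13.45 / 277 * 1000:.0f} GFLOPs/W")        # ~49 at the 277W figure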

Zedsdeadbaby fucked around with this message at 04:42 on Aug 21, 2020

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Zedsdeadbaby posted:

If it truly was 2080ti level power, their flagship game Halo Infinite wouldn't look like such a bag of poo poo now would it? I think that's the most telling thing.

Are you saying that Halo Infinite would look good if someone just stuck it on a 2080Ti somehow? It looks approximately 5.7x shittier than Ghost of PlayStation 4 Platform Mastery, but I don’t think that means the PS4 is somehow more powerful than the XSX.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Subjunctive posted:

Are you saying that Halo Infinite would look good if someone just stuck it on a 2080Ti somehow? It looks approximately 5.7x shittier than Ghost of PlayStation 4 Platform Mastery, but I don’t think that means the PS4 is somehow more powerful than the XSX.

This is what it looked like in 2018
This is what it looks like as of a month ago

There's an obvious reason projects get downgraded in the visual department

BurritoJustice
Oct 9, 2012

I'm unironically pumped for a 24GB 3090 so I can run 600-mod Skyrim load orders without stutter.

shrike82
Jun 11, 2005



lots of people willing to pay >$1500 for the 3090. the stock situation at launch is going to be nuts

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I'm sorely tempted at $1399 if it has 24GB of buffer. I don't NEED it, but I plan on sticking with UW 1440p for a while and it'd be nice to not have to really worry about buffer overruns for a good three-plus years. Maybe.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

shrike82 posted:



lots of people willing to pay >$1500 for the 3090. the stock situation at launch is going to be nuts

At the end of the day, the 2080 Ti, despite being an awful price/performance value, is still a relatively popular GPU. Based on its share of users in the Steam hardware survey, NV probably sold a million-plus units of that card. The 2080 Ti whales are likely looking at the 3090, and the 1080 Ti owners who could have bought a 2080 Ti but passed because it wasn't a great upgrade are also likely looking at the 3090. There's enough hype around RTX tech like DLSS 2.0 to get people more interested now than they were at the 2080 Ti launch. The pandemic may even help the cause: those who kept their jobs have likely been saving a lot of money and want something to splurge on for entertainment's sake. I know personally I won't be going to any bars, movies, or indoor restaurants here in NYC until I'm vaccinated, and all that money I'm not spending sitting at home shitposting about GPUs online is going right into the 3090.
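
Rough math behind the million-plus claim (Steam doesn't publish an exact active-user count, so the ~95M monthly actives cited for 2019 is an assumption, and the 2080 Ti's roughly 1% survey share is an approximation):

code:
# Back-of-the-envelope: units sold ~ survey share * active user base.
steam_mau = 95_000_000    # assumed monthly active Steam users (2019 figure)
share_2080ti = 0.01       # ~1% in the Steam hardware survey
print(f"~{steam_mau * share_2080ti:,.0f} 2080 Tis in the wild")  # ~950,000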

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

BIG HEADLINE posted:

I'm sorely tempted at $1399 if it has 24GB of buffer. I don't NEED it, but I plan on sticking with UW 1440p for a while and it'd be nice to not have to really worry about buffer overruns for a good three-plus years. Maybe.

Same exact situation

Warmachine
Jan 30, 2012



Beautiful Ninja posted:

At the end of the day, the 2080 Ti, despite being an awful price/performance value, is still a relatively popular GPU. Based on its share of users in the Steam hardware survey, NV probably sold a million-plus units of that card. The 2080 Ti whales are likely looking at the 3090, and the 1080 Ti owners who could have bought a 2080 Ti but passed because it wasn't a great upgrade are also likely looking at the 3090. There's enough hype around RTX tech like DLSS 2.0 to get people more interested now than they were at the 2080 Ti launch. The pandemic may even help the cause: those who kept their jobs have likely been saving a lot of money and want something to splurge on for entertainment's sake. I know personally I won't be going to any bars, movies, or indoor restaurants here in NYC until I'm vaccinated, and all that money I'm not spending sitting at home shitposting about GPUs online is going right into the 3090.

Also if any more coronabux come down the pipe, that'll be a big driver. I know my stimulus check financed half of the PC I built this year.

CaptainSarcastic
Jul 6, 2013



Warmachine posted:

Also if any more coronabux come down the pipe, that'll be a big driver. I know my stimulus check financed half of the PC I built this year.

It financed pretty much all of my new desktop. I did pay for a new monitor and GPU after that, though, when I decided my old 24" just wasn't cutting it anymore.

The Big Bad Worf
Jan 26, 2004
Quad-greatness

shrike82 posted:



lots of people willing to pay >$1500 for the 3090. the stock situation at launch is going to be nuts

yeah, if these rumored prices are anywhere near accurate i'm just going to buy whatever AMD card is around $400-600 later this year and call it a day. i moved away from a g-sync module monitor to one that's G-Sync Compatible, so i won't even lose VRR support

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

Zedsdeadbaby posted:

It's spitballing more than anything else. Common sense alone tells us there's no way the XSX's graphical output will come anywhere near a $1,100 video card. The XSX's overall power draw tops out at 300W according to MS, while the 2080 Ti alone can hit 277W. The XSX is tiny and has to think about thermals, something AMD is notoriously poor at with high-end GPUs (Vega 56 and 64 say hello). Teraflops mean gently caress all and really should only be used as a point of comparison between similar architectures. And of course, the most crucial thing: MS has to sell consoles at a low price point. It's generally believed it will retail for 500-600 dollars.

If it truly were 2080 Ti-level power, their flagship game Halo Infinite wouldn't look like such a bag of poo poo, now would it? I think that's the most telling thing.

I would say the reason Halo Infinite's graphics are underwhelming at this stage isn't so much a lack of power in the XSX as the fact that it also needs to run on the base Xbox One.

While Microsoft's cross-gen game policy sounds like a good idea, I think it will leave them at a disadvantage against Sony. You can only scale down so much, which leaves two problems: either the base Xbox One versions of games run like garbage and are unsatisfying to play, or the XSX can't be used to its full extent and games won't look or run as well as they do on the PS5.

fakeedit: It occurred to me while typing this that games like The Witcher 3 and Doom have been scaled down to run on the Switch, but I'd say those get a pass because expectations for resolution and frame rate are lower on a handheld.

Stickman
Feb 1, 2004

Isn't backporting big launch titles a time-honored tradition at this point?

Craptacular!
Jul 9, 2001

Fuck the DH
It’s very weird to see Paul drink the Turing fine wine. Cards that take nine months to come into their own usually aren’t the kind of thing he praises.

As far as “prime” lifespan goes, the 1070 was a year out when I bought mine. I don't see a problem with that, especially at 1080p. The chances you'd need a GPU more powerful than a 1070 over the past three years at 1080p seemed slim, and you got the 8GB card back when the 3GB/6GB 1060s were already feeling aged.

One thing that exists now that didn't exist even five years ago is real diversity in displays. In the old days everybody basically stepped up resolution as graphics cards allowed, and in the HD era we technically had two popular display resolutions, but even if you only cared about consoles the benefits of 1080p were obvious.

These days you can buy a card intended for a display more advanced than what you own and ride it out longer.

Arzachel
May 12, 2012

Stickman posted:

Isn't backporting big launch titles a time-honored tradition at this point?

More like forward porting. Games will be developed around the PS4/XBone, with bells and whistles added for the new consoles, until the new machines have a large enough install base two years down the line, just like last gen. Except this time it's not just third parties but also Sony and Microsoft.

CactusWeasle
Aug 1, 2006
It's not a party until the bomb squad says it is
I'm still on a GTX 1070. The only reason I didn't buy a 2070 is that 80% of my problem was running out of VRAM, so I sure wasn't going to spend ~$500 on a new card with the exact same amount, and sure as hell wasn't spending ~$1300 on a Ti. So I'm one of the idiots praying for a 16GB 3070 :pray:

Truga
May 4, 2014
Lipstick Apathy

DrDork posted:

Though at that point I might just look for a firesale/used 2080Ti, since at least you know the performance characteristics already and you know that the drivers are stable. I suppose it depends what AMD does with their pricing.

yeah unless RDNA2 is pretty good, i'm going to go with a used 2080ti and ditch the linux-only idea for a couple more years


Craptacular!
Jul 9, 2001

Fuck the DH

Lockback posted:

Keep in mind a 3060 probably won't be available at launch, so there will still be a lot of demand and nothing new for that range.
Lots of people say this, but I'd like to express skepticism this time, because while it's the way a lot of generations of cards have launched, there hasn't been a generation in a long time that launched alongside something as big as Cyberpunk.

When you've got a game on the shelves driving as many people to upgrade as this one is, pricing everything for the true believers means a lot of them will simply pass. I can't think of a game that has pushed people to upgrade like this since Half-Life 2.

Truga posted:

yeah unless RDNA2 is pretty good, i'm going to go with a used 2080ti and ditch the linux-only idea for a couple more years

The reasons not to go Linux-only aren't related to Nvidia; they're related to anti-cheat kmods. Speaking of, someone figured out how to get GeForce Now to run and made a Lutris install script.
