BrainDance
May 8, 2007

Disco all night long!

I just wanna max everything out in VR and not have to be all "hmm, well, Beat Saber's fine, but better turn down the resolution for Skyrim, better spend more time looking at framerates than actually playing," and I'm willing to pay a lot for it.

I don't care, 4080, just be that.

cheesetriangles
Jan 5, 2011

If you care about power, the 4000 series makes more sense. You can get the same performance at less power thanks to the smaller process.

KYOON GRIFFEY JR
Apr 12, 2010

Runner-up, TRP Sack Race 2021/22

cheesetriangles posted:

If you care about power, the 4000 series makes more sense. You can get the same performance at less power thanks to the smaller process.

I hope this is true, I'm generally trying to reduce power draw across my system. I guess undervolted 3000 series cards would probably get me an OK result on power as well.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
The ASRock version of the Arc A380 apparently costs $192 in China. That is not a compelling price for that hardware.

It has a single-slot cooler and four display outputs sprouting directly from the PCB. For some reason it has a two-slot expansion slot cover.

I run Linux on the desktop. I rarely play computer games. I use a discrete card to get multiple display outputs. Even with Intel's strong history of open-source graphics drivers compared to AMD and Nvidia, $200 is way too much money for the weak performance on offer. Make it $100 and I'll bite.

Is there any reason to believe the two higher cards (the "i5" and "i7" ones) will be any less reliant on ReBAR?

PBCrunch fucked around with this message at 19:01 on Aug 4, 2022

kliras
Mar 27, 2021
it also doesn't have HDMI 2.1 :/

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

PBCrunch posted:

The ASRock version of the Arc A380 apparently costs $192 in China. That is not a compelling price for that hardware.

It has a single-slot cooler and four display outputs sprouting directly from the PCB. For some reason it has a two-slot expansion slot cover.

I run Linux on the desktop. I rarely play computer games. I use a discrete card to get multiple display outputs. Even with Intel's strong history of open-source graphics drivers compared to AMD and Nvidia, $200 is way too much money for the weak performance on offer. Make it $100 and I'll bite.

Is there any reason to believe the two higher cards (the "i5" and "i7" ones) will be any less reliant on ReBAR?

Yeah, that kind of money gets you a GTX 1660 even in this moment's hosed up graphics card economy.

I bet we don't get a US launch at all.

Dr. Video Games 0031
Jul 17, 2004

I would not compare prices in one country to prices in another. They will be different once they hit store shelves in the US. That said...

https://www.techpowerup.com/297490/intel-arc-board-partners-are-reportedly-stopping-production-encountering-quality-issues

I'll just quote the whole article since it's short:

quote:

According to sources close to Igor Wallossek from Igor's lab, Intel's upcoming Arc Alchemist discrete graphics card lineup is in trouble. As the anonymous sources state, certain add-in board (AIB) partners are having difficulty adopting the third GPU manufacturer into their offerings. As we learn, AIBs are sitting on a pile of NVIDIA and AMD GPUs. This pile is decreasing in price daily and losing value, so it needs to be moved quickly. Secondly, Intel is reportedly suggesting AIBs ship cards to OEMs and system integrators to start the market spread of the new Arc dGPUs. This business model is inherently lower margin compared to selling GPUs directly to consumers.

Last but not least, it is reported that at least one major AIB is stopping the production of custom Arc GPUs due to quality concerns. What this means is yet to be uncovered, and we have to wait and see which AIB (or AIBs) is stepping out of the game. All of this suggests that the new GPU lineup is on the verge of extinction, even before it has launched. However, we are sure that the market will adapt and make a case for the third GPU maker. Of course, these predictions should be taken with a grain of salt, and we await more information to confirm those issues.

Oof. Not all AIBs will be in this same position (e.g. previously AMD-exclusive AIBs like ASRock), but I have been wondering how Intel would manage to step into the game while most potential board partners are oversupplied with competing products.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

https://www.youtube.com/watch?v=G2SP9u5ke1k

Are people just getting Arc cards now?

teagone
Jun 10, 2003

That was pretty intense, huh?


I thought Intel was giving them to everyone, but they could only publish A380 numbers.

Dr. Video Games 0031
Jul 17, 2004

I don't remember who it was, but I've seen people accuse some of the accounts posting these split-screen performance comparisons of faking their results. Considering this would be the first time we've ever heard of anyone having the A770 or A750, I'm very suspicious of that video. As far as I know, nobody's even managed to test them in China yet.

Dr. Video Games 0031
Jul 17, 2004

teagone posted:

I thought Intel was giving them to everyone, but they could only publish A380 numbers.

I don't think this is the case. They had the A750 running on some test benches at LTT and GamersNexus (maybe elsewhere?), but they haven't been supplying anyone with review samples. Those were just demo units that they went on a media tour with, basically.

teagone
Jun 10, 2003

That was pretty intense, huh?

Dr. Video Games 0031 posted:

I don't think this is the case. They had the A750 running on some test benches at LTT and GamersNexus (maybe elsewhere?), but they haven't been supplying anyone with review samples. Those were just demo units that they went on a media tour with, basically.

Ahh, ok. I only watch GN, and had just assumed Intel was giving them to everyone after I saw LTT had their hands on an A770 or A750, lol.

repiv
Aug 13, 2009

yeah that's absolutely a fake benchmarker

they're barely even making an effort; if you look at the other videos on their channel, they use exactly the same gameplay footage for every video, just with a different set of AVG and 1% low numbers overlaid on top

the more competent fake benchmark channels at least go to the effort of downclocking their card and recording actual gameplay to plausibly approximate footage recorded on a weaker card

repiv fucked around with this message at 00:30 on Aug 5, 2022

MarcusSA
Sep 23, 2007

I mean, they have a joker as the splash screen at the start of the video. That should tell you everything.

repiv
Aug 13, 2009

we live in a society

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Is that joker society or costanza society?

Dr. Video Games 0031
Jul 17, 2004

The MSI Mech 6600 XT is down to $300 after rebate: https://www.newegg.com/msi-radeon-rx-6600-xt-rx6600xt-mech2x-8goc/p/N82E16814137682?Item=N82E16814137682
The MSI Mech 6600 non-XT is down to $250 after rebate, but get the Sapphire Pulse for $250 without any mail-in rebate fuckery instead: https://www.newegg.com/sapphire-radeon-rx-6600-11310-04-20g/p/N82E16814202415?quicklink=true

It's nice to see actual midrange pricing for midrange cards.

edit: The pulse linked above is a version with just one DP port and one HDMI port, so that's something to be aware of. There's a version with two more DP ports that costs $20 more.

Dr. Video Games 0031 fucked around with this message at 04:14 on Aug 5, 2022

Shipon
Nov 7, 2005
adding another 150W to my 3090 will maybe make me spend, like, $5-10 more a month in electric bills. big whoop

a lot of people are making a big deal out of this and i don't understand what they're trying to say. are you trying to say they just shouldn't bother adding more performance? because things have changed, and you're not getting the performance gains from node shrinks like you used to. want more performance? better expect more power draw, for the most part
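
A quick back-of-the-envelope for that estimate (the daily hours and the kWh rate here are assumptions, so plug in your own):

code:
# Rough cost of an extra 150 W of GPU draw. All inputs are assumptions.
extra_watts = 150        # added draw over the current card
hours_per_day = 4        # assumed gaming time
price_per_kwh = 0.15     # ballpark US rate; parts of Europe are $0.30+

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(f"{kwh_per_month:.1f} kWh/month -> ${kwh_per_month * price_per_kwh:.2f}/month")
# prints: 18.0 kWh/month -> $2.70/month

Heavier use or pricier power is how you land in the $5-10 range.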

Dr. Video Games 0031
Jul 17, 2004

Shipon posted:

adding another 150W to my 3090 will maybe make me spend, like, $5-10 more a month in electric bills. big whoop

a lot of people are making a big deal out of this and i don't understand what they're trying to say. are you trying to say they just shouldn't bother adding more performance? because things have changed, and you're not getting the performance gains from node shrinks like you used to. want more performance? better expect more power draw, for the most part

What they're trying to say is very simple to understand, because they've already said it quite clearly: they want to either reduce their power consumption or avoid increasing it. That's all. It's not an unreasonable desire, either, especially if you live in Europe right now.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Shipon posted:

adding another 150W to my 3090 will maybe make me spend, like, $5-10 more a month in electric bills. big whoop

a lot of people are making a big deal out of this and i don't understand what they're trying to say. are you trying to say they just shouldn't bother adding more performance? because things have changed, and you're not getting the performance gains from node shrinks like you used to. want more performance? better expect more power draw, for the most part

A noticeably hot gaming PC is a pain in the rear end during the hot months of the year, regardless of the power bill. It feels like doing a roast in the oven!

Also, you don't have to have the silicon pushed way up the power curve. Apple is shipping gigantic dies that match high-end Intel parts at a third of the wattage, plus big GPUs that should land somewhere in the ballpark of a 3060, but nobody distributes games for them, so they only get used for compute and video editing. They manage it by having stonking huge dies, 8-channel LPDDR5, and a process targeted at low power.

I guess what you could say about modern gaming GPUs is that you can just undervolt them yourself if you want to use 25% less power in exchange for giving up 5% performance.
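
A minimal sketch of doing that in software via NVML's Python bindings; note this is power limiting rather than a true undervolt (NVML doesn't expose the voltage/frequency curve), and it assumes an NVIDIA card with the nvidia-ml-py package installed, plus root/admin rights to set the limit:

code:
# Sketch: read board power and apply a lower power limit via NVML.
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py`.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

draw = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000              # NVML reports milliwatts
limit = pynvml.nvmlDeviceGetPowerManagementLimit(gpu) / 1000
print(f"drawing {draw:.0f} W, limit {limit:.0f} W")

# Cap the card at 75% of its current limit; the driver rejects
# anything below the card's hardware minimum.
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, int(limit * 0.75 * 1000))
pynvml.nvmlShutdown()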

Dr. Video Games 0031
Jul 17, 2004

Twerk from Home posted:

A noticeably hot gaming PC is a pain in the rear end during the hot months of the year, regardless of the power bill. It feels like doing a roast in the oven!

Also, you don't have to have the silicon pushed way up the power curve. Apple is shipping gigantic dies that match high-end Intel parts at a third of the wattage, plus big GPUs that should land somewhere in the ballpark of a 3060, but nobody distributes games for them, so they only get used for compute and video editing. They manage it by having stonking huge dies, 8-channel LPDDR5, and a process targeted at low power.

I guess what you could say about modern gaming GPUs is that you can just undervolt them yourself if you want to use 25% less power in exchange for giving up 5% performance.

There's a limit to how far undervolting and power limiting go on modern GPUs, though. As an experiment, I tried to see how much performance I could wring out of my 3080 Ti at 75W, only to find that it's basically impossible to run any modern 3D game on it and have it draw less than 150W, no matter what you do to the voltage curve and power limits. At a certain point, it just started ignoring my power limits and drew 150W anyway, and performance was all over the place. Buying a more expensive GPU just to bring it down one or two performance tiers feels bad anyway, so current undervolting practice focuses on maintaining or slightly improving stock performance while reducing power draw. And something I've discovered recently is that 4K makes the card draw far more power than 1440p, so undervolting doesn't even help much with power draw at that resolution (though it does improve performance).
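
If you want to reproduce that kind of measurement, a simple NVML polling loop (same nvidia-ml-py assumption as above) is enough to compare, say, 1440p against 4K in the same scene:

code:
# Sketch: log GPU board power once a second; Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000   # reported in mW
        print(f"{time.strftime('%H:%M:%S')}  {watts:6.1f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()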

Dr. Video Games 0031 fucked around with this message at 11:33 on Aug 5, 2022

Listerine
Jan 5, 2005

Exquisite Corpse
I seem to remember the 4070 and the 4070 Ti were so close in specs that it wasn't worth buying one of the models, given the price difference, but I can't remember which model was the one to buy. Can anyone confirm?

Also, are all Nvidia-brand cards Founders Edition? There's one on the Best Buy app that is Nvidia, but the box picture doesn't say Founders.

edit: this is for rendering, I don't care about game performance

Listerine fucked around with this message at 06:40 on Aug 5, 2022

Dr. Video Games 0031
Jul 17, 2004

Listerine posted:

I seem to remember the 4070 and the 4070 Ti were so close in specs that it wasn't worth buying one of the models, given the price difference, but I can't remember which model was the one to buy. Can anyone confirm?

Also, are all Nvidia-brand cards Founders Edition? There's one on the Best Buy app that is Nvidia, but the box picture doesn't say Founders.

edit: this is for rendering, I don't care about game performance

Assuming you mean the 3070/3070 Ti, the one to buy is the 3070. If they're almost identical in performance, then it stands to reason that you should just get the cheaper one, after all.

If the card itself is manufactured by Nvidia, then it is a Founders Edition. The "Founders Edition" branding isn't displayed prominently on the box or the Best Buy store listings, it seems, but this is an FE, for instance. They all have that same general design, with fans embedded into a large heatsink and no tacky plastic shroud.

Listerine
Jan 5, 2005

Exquisite Corpse

Dr. Video Games 0031 posted:

Assuming you mean the 3070/3070 Ti, the one to buy is the 3070. If they're almost identical in performance, then it stands to reason that you should just get the cheaper one, after all.

If the card itself is manufactured by Nvidia, then it is a Founders Edition. The "Founders Edition" branding isn't displayed prominently on the box or the Best Buy store listings, it seems, but this is an FE, for instance. They all have that same general design, with fans embedded into a large heatsink and no tacky plastic shroud.

Doh, that's exactly what I meant (but I'd certainly take a 4070 tomorrow if I could!).

Thanks!

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Dr. Video Games 0031 posted:

I would not compare prices in one country to prices in another. They will be different once they hit store shelves in the US. That said...

https://www.techpowerup.com/297490/intel-arc-board-partners-are-reportedly-stopping-production-encountering-quality-issues

I'll just quote the whole article since it's short:

Oof. Not all AIBs will be in this same position (e.g. previously AMD-exclusive AIBs like ASRock), but I have been wondering how Intel would manage to step into the game while most potential board partners are oversupplied with competing products.

Oh my god. I hope Intel figures things out in the end, because it's looking like a disaster atm.

ijyt
Apr 10, 2012

Shipon posted:

adding another 150W to my 3090 will maybe make me spend, like, $5-10 more a month in electric bills. big whoop

a lot of people are making a big deal out of this and i don't understand what they're trying to say. are you trying to say they just shouldn't bother adding more performance? because things have changed, and you're not getting the performance gains from node shrinks like you used to. want more performance? better expect more power draw, for the most part

like i'm going to care about the opinion of someone who bought a 3090 lol

bloodysabbath
May 1, 2004

OH NO!
We’re almost 2 years into next-gen systems, but nobody can buy the things and we’re still seeing big titles go multi-gen. I’m not trying to do the Bill Gates “640K RAM is enough for everyone” thing, but unless you need 4K RT at 120Hz with no DLSS, or some crazy-high VR or rendering use case, when are we going to see games that actually need this much power?

Arzachel
May 12, 2012

bloodysabbath posted:

We’re almost 2 years into next-gen systems, but nobody can buy the things and we’re still seeing big titles go multi-gen. I’m not trying to do the Bill Gates “640K RAM is enough for everyone” thing, but unless you need 4K RT at 120Hz with no DLSS, or some crazy-high VR or rendering use case, when are we going to see games that actually need this much power?

I need a 4090 because clicking the "This thing halves your fps for no perceivable gain" checkbox makes my fps go bad

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
if you had a game that was running at a capped FPS [and could hit that cap], then all else being equal, you would expect that moving from one generation of card to the next would consume less power, right?

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

gradenko_2000 posted:

if you had a game that was running at a capped FPS [and could hit that cap], then all else being equal, you would expect that moving from one generation of card to the next would consume less power, right?

In general, yeah; as mentioned earlier, smaller nodes consume less power.
IIRC the GTX 780 and GTX 1080 had similar peak power consumption, for example, but the 1080 obviously performs better because it's built on a newer, more efficient process. If you capped a game at the same framerate on both GPUs, the 1080 would draw significantly less power than the 780, because it's not working as hard to hit the cap.
There are a few exceptions, I'm sure, but as a rule the next generation of cards needs less power to reach the same framerate as the previous generation.
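
The arithmetic behind this, with made-up numbers rather than benchmarks: at a fixed frame rate, board power is the cap divided by efficiency in frames per joule, so doubling efficiency halves the draw.

code:
# Toy illustration of the capped-framerate argument (hypothetical numbers).
cap_fps = 60
old_frames_per_joule = 0.3   # hypothetical last-gen card
new_frames_per_joule = 0.6   # hypothetical next-gen card, 2x the efficiency

# watts = (frames/second) / (frames/joule) = joules/second
print(cap_fps / old_frames_per_joule)   # 200.0 W to hold the cap
print(cap_fps / new_frames_per_joule)   # 100.0 W for the same cap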

Unsinkabear
Jun 8, 2013

Ensign, raise the beariscope.

ijyt posted:

like i'm going to care about the opinion of someone who bought a 3090 lol

:same:

KYOON GRIFFEY JR
Apr 12, 2010

Runner-up, TRP Sack Race 2021/22

bloodysabbath posted:

We’re almost 2 years into next-gen systems, but nobody can buy the things and we’re still seeing big titles go multi-gen. I’m not trying to do the Bill Gates “640K RAM is enough for everyone” thing, but unless you need 4K RT at 120Hz with no DLSS, or some crazy-high VR or rendering use case, when are we going to see games that actually need this much power?

Some people like to play MSFS I guess

KYOON GRIFFEY JR
Apr 12, 2010

Runner-up, TRP Sack Race 2021/22

Twerk from Home posted:

A noticeably hot gaming PC is a pain in the rear end during the hot months of the year, regardless of the power bill. It feels like doing a roast in the oven!

it's definitely not rational from a cost perspective for me in the US, but it seems super wasteful to blast out more wattage from my little space heater of a PC in the summer, only to spend even more power cooling the apartment back down
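
There's simple arithmetic behind that: every watt the PC dumps into the room has to be pumped back out by the air conditioner, so (assuming a typical AC coefficient of performance of about 3) the waste heat costs roughly an extra third on top:

code:
# Rough summer penalty for GPU waste heat. The COP of 3 is an assumption:
# it means the AC moves ~3 W of heat per 1 W of electricity it consumes.
gpu_extra_watts = 150
ac_cop = 3.0

ac_extra_watts = gpu_extra_watts / ac_cop    # ~50 W extra at the AC
print(gpu_extra_watts + ac_extra_watts)      # 200.0 W effective total draw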

CoolCab
Apr 17, 2005

glem
that heat only really sucks for me maybe two or three months a year; most of the year it's literally the only heating i pay for

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

CoolCab posted:

that heat only really sucks for me maybe two or three months a year; most of the year it's literally the only heating i pay for

For real?

CoolCab
Apr 17, 2005

glem

Rinkles posted:

For real?

yeah. my name isn't ironic, i used to live in the arctic. where i live now still isn't that cold to me, even twenty years later. i usually just use enough heat so the pipes don't freeze.

MarcusSA
Sep 23, 2007

This is why you don’t buy GPUs that were used for mining:

[photo of a burned-out mining rig]

You could just end up with some ash!

CoolCab
Apr 17, 2005

glem
i was chewing that over in the other thread. those are gamerocks, right? i think they're at least 3080s, but i could be wrong. aren't those fans blowing air directly into the exhaust vent?

CoolCab
Apr 17, 2005

glem
like, what is the story with those 120s mounted horizontally? fuckin environmental storytelling, but in miner pic form

e: rgb 120s! those cost a fortune WHY

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

MarcusSA posted:

This is why you don’t buy GPUs that were used for mining:

[photo of a burned-out mining rig]

You could just end up with some ash!

Maybe it’s arson fraud after the crypto downturn
