mA
Jul 10, 2001
I am the ugly lover.
Does Gigabyte have the worst AIB GPU software? Aorus Master looks like it was made in the 90s, and their RGB software is a complete joke that doesn't recognize my Gigabyte GPU 90% of the time. I literally went from X1, which allows you to control everything on an EVGA card with little to no issues, to this horseshit.

mA fucked around with this message at 00:39 on Apr 11, 2021


Geemer
Nov 4, 2010



OhFunny posted:

The Europeans meanwhile have a less state-run economic system than China, and yet I don't believe they have any semiconductor companies capable of making chips. This is because they have no desire to build their own, either from a state view, since there's no national need, or from an economic view, since just buying from Samsung and others is more profitable than pouring money into R&D for ten-plus years.

We just build the machines that actually do the work. https://en.m.wikipedia.org/wiki/ASML_Holding

CaptainSarcastic
Jul 6, 2013



mA posted:

Does Gigabyte have the worst AIB GPU software? Aorus Master looks like it was made in the 90s, and their RGB software is a complete joke that doesn't recognize my Gigabyte GPU 90% of the time. I literally went from X1, which allows you to control everything on an EVGA card, to this horseshit.

I have a Gigabyte mobo and GPU and my advice is to never install Gigabyte software. I think that is the general consensus around them - the hardware is fine but their software is redundant at best, buggy and insecure at worst.

mA
Jul 10, 2001
I am the ugly lover.

CaptainSarcastic posted:

I have a Gigabyte mobo and GPU and my advice is to never install Gigabyte software. I think that is the general consensus around them - the hardware is fine but their software is redundant at best, buggy and insecure at worst.

Yeah, I have a Gigabyte mobo and GPU - both are really good, but drat their software is unbelievably terrible. I updated the VBIOS via Aorus Master on the 3090 Vision I just got and I was making GBS threads myself the whole time. All I want is to be able to control the RGB on the card, but they can't even get that right.

mA fucked around with this message at 00:44 on Apr 11, 2021

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

OhFunny posted:

SMIC is profitable. It's been building 14nm at volume for two years. Like other manufacturers, its older processes are in such high demand that it's building new 28nm fabs to meet it. Its 7nm process has been taped out and should enter volume production next year. Now, SMIC getting added to the US Entity List last December will certainly have slowed its progress, but necessity is the mother of invention, after all. Competition between China and the US will push such developments forward.

I think that's my biggest issue with your statements. You're too closely tying economic models to technological invention and progress. Korea and Taiwan have more regulated economic models than the United States, and yet their semiconductor companies' development is well ahead of Intel, which has been stuck on 14nm for a decade. The Europeans meanwhile have a less state-run economic system than China, and yet I don't believe they have any semiconductor companies capable of making chips. This is because they have no desire to build their own, either from a state view, since there's no national need, or from an economic view, since just buying from Samsung and others is more profitable than pouring money into R&D for ten-plus years.

You made very good specific points that refute my broad characterizations; I will try to be more specific in the future.
The one thing that needs to be said is that anything a Chinese company says about itself is suspect. Having dealt with Chinese companies in varying industries, they are mostly lying about things like profit, production and safety. Through no fault of their own; everyone else is lying too, and if you don't keep up appearances you will suffer.

If an independent western agency had access to their books and says they are profitable, then I'll take it at face value. I just hope they also accounted for the billions the Chinese government pumped into those industries to get to the point of barely competing at the low end.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

mA posted:

Does Gigabyte have the worst AIB GPU software? Aorus master looks like it was made in the 90s, and their RGB software is a complete joke that doesn't recognize my Gigabyte GPU 90% of the time. I literally went from X1, which allows you to control everything on an EVGA card with little to no issues to this horseshit.

I couldn't get RGB Fusion to work at all (it would straight up hardlock my OS) until I deleted the phizon dll in the software's main directory. Signs point to yes.
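For anyone wanting to reproduce that workaround a bit more safely: here's a minimal sketch (the install directory and the `phizon` filename pattern are assumptions, check your own RGB Fusion folder) that renames any matching DLL instead of deleting it, so the change is easy to undo:

```python
import pathlib

def quarantine_dlls(app_dir, pattern="*phizon*.dll"):
    """Rename matching DLLs to *.bak so the app can no longer load them.

    Renaming rather than deleting means you can restore the file if the
    software breaks in some new way without it.
    """
    renamed = []
    for dll in pathlib.Path(app_dir).rglob(pattern):
        backup = dll.with_name(dll.name + ".bak")
        dll.rename(backup)
        renamed.append(backup)
    return renamed

# Hypothetical usage -- point it at your actual install directory:
# quarantine_dlls(r"C:\Program Files (x86)\GIGABYTE\RGBFusion")
```

If it turns out you needed the DLL after all, stripping the `.bak` suffix puts everything back.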

Internet Explorer
Jun 1, 2005





No one wants to hear your idiot hot takes, Fauxtool. Shut the gently caress up.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
I had a different post written up, but man I wish I could buy a GPU at MSRP anytime I wanted. That would be great.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Bloodplay it again posted:

I couldn't get RGB Fusion to work at all (it would straight up hardlock my OS) until I deleted the phizon dll in the software's main directory. Signs point to yes.

It hasn't meaningfully updated since Z170. My Z170 RGB Fusion looks the same and is about as feature-starved as the Z490 I just built. It's really the worst. If I was starting from scratch I would just turn off all the mobo lights and sync everything with a third-party controller like Razer's or Corsair's.

Fauxtool fucked around with this message at 01:14 on Apr 11, 2021

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

OhFunny posted:

I had a different post written up, but man I wish I could buy a GPU at MSRP anytime I wanted. That would be great.

can I interest you in some corn?

https://www.newegg.com/corn-electronics-nvidia-geforce-gtx-750/p/1FT-0040-00004?quicklink=true

They have whole fields at MSRP
https://www.newegg.com/p/pl?N=100007709%2050120625

CoolCab
Apr 17, 2005

glem
to change the subject definitively: at what point does the shortage have an impact on game development? if it's going to be impossible for another year to buy a decent RTX card at anything but ruinous rates, does that disincentivize adoption of technologies like ray tracing or DLSS? i feel like with the 20 series the technology was so immature it was basically a joke, and now the 30 series, allegedly capable of actually pushing some of those features, is loving impossible to source.

i think if you're thinking about launching a game that doesn't run very well on a 1060 or worse this year has to be scary.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
Things like the Steam hardware survey and sales figures indicate that supply is actually quite high overall. Demand is bananas to a degree that could not have been predicted.

Shipon
Nov 7, 2005

CoolCab posted:

to change the subject definitively: at what point does the shortage have an impact on game development? if it's going to be impossible for another year to buy a decent RTX card at anything but ruinous rates, does that disincentivize adoption of technologies like ray tracing or DLSS? i feel like with the 20 series the technology was so immature it was basically a joke, and now the 30 series, allegedly capable of actually pushing some of those features, is loving impossible to source.

i think if you're thinking about launching a game that doesn't run very well on a 1060 or worse this year has to be scary.

Honestly with the prices people are actually paying on ebay for cards, I think they may be more worried about when COVID restrictions end and people go back to going out for entertainment instead of gaming.

GruntyThrst
Oct 9, 2007

*clang*

Shipon posted:

Honestly with the prices people are actually paying on ebay for cards, I think they may be more worried about when COVID restrictions end and people go back to going out for entertainment instead of gaming.

Gaming as an industry was doing amazing even before the neoplague; it'll be fine when people who aren't disgusting game gremlins start going back outside.

FilthyImp
Sep 30, 2002

Anime Deviant

CoolCab posted:

to change the subject definitively: at what point does the shortage have an impact on game development?

No one really cared when Doom 3 got like 20fps or Crysis needed a rendering bay to run. People still chase those bleeding edges.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

FilthyImp posted:

No one really cared when Doom 3 got like 20fps or Crysis needed a rendering bay to run. People still chase those bleeding edges.

Pretty good point. Crysis was held up as a standard for so long, but it's only recently that people have understood it has Ubisoft levels of poor optimization.

People are playing CP2077 on 970s and still saying it looks great. I don't think a hardware-to-game mismatch is a bad thing.

CaptainSarcastic
Jul 6, 2013



CoolCab posted:

to change the subject definitively: at what point does the shortage have an impact on game development? if it's going to be impossible for another year to buy a decent RTX card at anything but ruinous rates, does that disincentivize adoption of technologies like ray tracing or DLSS? i feel like with the 20 series the technology was so immature it was basically a joke, and now the 30 series, allegedly capable of actually pushing some of those features, is loving impossible to source.

i think if you're thinking about launching a game that doesn't run very well on a 1060 or worse this year has to be scary.

People say this but my experience with a 2070 Super is that RT and DLSS work fine. :shrug:

I've been playing CP2077 for the last week and it looks good and my FPS seems fine at ultra detail and balanced DLSS, just with crowd density turned down a notch or two (I have a hard time picturing EVEN MORE people being around than what I already have).

I'm at 1440p on a G-sync-compatible monitor, and stopped obsessing over my FPS a while back and have been happier for it. I'm not sure what FPS I'm getting, but it's smooth and everything seems responsive so I assume I'm staying steadily over 60 FPS.

Hemish
Jan 25, 2005

My take is the impact on game development will be pretty minimal, unless you're working from home and need an upgrade, or your video card dies and you can't work.

I think the pandemic will be worse for game development. The games we got during 2020 and early 2021 were stuff that was already pretty far along in the dev cycle, but stuff that was expected later in 2021 or 2022 is sliding farther down the road, which then also impacts the next project that was planned, and so on. I was listening to a Giant Bomb podcast the other day and they were actually talking about this with a guest who'd been on in the last few weeks. That guy was saying 2-3 years to get back to normal on that front, all because of the pandemic and not the GPU shortage, since most games are on consoles and those are nowhere near as bad as the GPU hunt.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Cyberpunk may have been the worst, but a lot of big titles from the last few months came out in a pretty rough shape.

FlamingLiberal
Jan 18, 2009

Would you like to play a game?



Hemish posted:

My take is the impact on game development will be pretty minimal, unless you're working from home and need an upgrade, or your video card dies and you can't work.

I think the pandemic will be worse for game development. The games we got during 2020 and early 2021 were stuff that was already pretty far along in the dev cycle, but stuff that was expected later in 2021 or 2022 is sliding farther down the road, which then also impacts the next project that was planned, and so on. I was listening to a Giant Bomb podcast the other day and they were actually talking about this with a guest who'd been on in the last few weeks. That guy was saying 2-3 years to get back to normal on that front, all because of the pandemic and not the GPU shortage, since most games are on consoles and those are nowhere near as bad as the GPU hunt.
Yes, the release calendar for this year is looking pretty dire right now. A lot of games scheduled to come out this year have already been bumped to 2022, and I expect more to follow.

FuturePastNow
May 19, 2014


mA posted:

Does Gigabyte have the worst AIB GPU software? Aorus Master looks like it was made in the 90s, and their RGB software is a complete joke that doesn't recognize my Gigabyte GPU 90% of the time. I literally went from X1, which allows you to control everything on an EVGA card with little to no issues, to this horseshit.

The first motherboard I bought when I started building my own PCs 16 years ago was a Gigabyte and even then their software's visual design was like cartoon vomit. When I got another Gigabyte board in 2010 I didn't even install the stuff it came with.

I'm not sure they're exceptionally bad, though; I installed and then uninstalled the Asus software suite for my X570 in about five minutes.

Raymond T. Racing
Jun 11, 2019

Hemish posted:

My take is the impact on game development will be pretty minimal, unless you're working from home and need an upgrade, or your video card dies and you can't work.

I think the pandemic will be worse for game development. The games we got during 2020 and early 2021 were stuff that was already pretty far along in the dev cycle, but stuff that was expected later in 2021 or 2022 is sliding farther down the road, which then also impacts the next project that was planned, and so on. I was listening to a Giant Bomb podcast the other day and they were actually talking about this with a guest who'd been on in the last few weeks. That guy was saying 2-3 years to get back to normal on that front, all because of the pandemic and not the GPU shortage, since most games are on consoles and those are nowhere near as bad as the GPU hunt.

I think it was more of a "Does it make sense to develop a game with raytracing and DLSS if no one can buy a card with those features anytime soon" impact on development.

Truthfully, I suspect they're not going to bother changing development lifecycles, they're just going to assume that by the time they finish the game, availability will be fine.

Shipon
Nov 7, 2005

Buff Hardback posted:

I think it was more of a "Does it make sense to develop a game with raytracing and DLSS if no one can buy a card with those features anytime soon" impact on development.

Truthfully, I suspect they're not going to bother changing development lifecycles, they're just going to assume that by the time they finish the game, availability will be fine.

People are buying them, though? The Steam charts are showing plenty of uptake compared to previous generations. It's just that demand is so through the roof that it's still difficult as hell to find one.

Collateral
Feb 17, 2010
Consoles don't do RT very well though, and they certainly don't do DLSS. I very much doubt big devs are concerned at all.

Alchenar
Apr 9, 2008

If you are on Unreal and can just effectively flick a switch, why wouldn't you?

The Grumbles
Jun 5, 2006

CoolCab posted:

to change the subject definitively: at what point does the shortage have an impact on game development? if it's going to be impossible for another year to buy a decent RTX card at anything but ruinous rates, does that disincentivize adoption of technologies like ray tracing or DLSS? i feel like with the 20 series the technology was so immature it was basically a joke, and now the 30 series, allegedly capable of actually pushing some of those features, is loving impossible to source.

i think if you're thinking about launching a game that doesn't run very well on a 1060 or worse this year has to be scary.

I think the pandemic itself has had such a huge impact on slowing down game development - the results of which we'll only really see over the next couple of years, I think - that it probably dwarfs any impact the GPU/console shortage is having. That said, as other posters have indicated, it's likely that you won't see a raft of Crysis-type games this generation, because what's the point of optimizing for the high end when it doesn't really exist in the market?

Fauxtool posted:

Yet the pace of advancement has only accelerated since Reagan and unrestrained capitalism and free-market ideals. Almost enough to make you not think.

I'm not typing this on a 1960s PC, btw. I hear you can still buy a Soviet-era Lada car though. I would probably take a terrible Korean plastic econobox for the same price.

How come China, with all its control and state infrastructure, can't seem to make a chip worth poo poo despite trying for decades? Maybe rigid control can only output steady, slow growth. Maybe some sort of personal motivation to become wildly successful might help spur innovation?

Just for my own peace of mind and sanity, I need to clear up that the pace of advancement hasn't exactly accelerated since Reagan; it's just become both more profitable and more risk-averse. Most technologies are still on the path set by the big key innovations of the 50s and 60s and are fairly homogeneous, which is good for reliable profits but probably not great in the long term. And those big key innovations that we're still iterating on were dreamt up by people who didn't have to worry so much about student debt or the cost of housing or whatever else. Obviously many of the objects we use have got better since the 60s (although many haven't, or have got worse, due to the demands of the market!). And a significant portion of the PC I'm typing on right now has parts manufactured by Samsung, which, as you may know, is a Korean company.

Also, I'm not talking about Soviet Russia or Communist China or whatever extreme scenario you're suggesting. I'm talking about countries like the USA and the UK pretty much up until Reagan, which were hugely creative and innovative because they were mostly free markets with robust controls and socialist state infrastructures. The whole reason pop culture itself - and the creative industries that followed, including and especially video games - exists is that western countries in the 60s had big social safety nets and managed market economies that gave people who didn't already have huge amounts of capital the space to be risky, innovative and creative. Yes, that includes the USA.

The only other thing I want to add is that it's very naive to think that western companies are somehow more honest because they're in a mostly unregulated market. Companies lie all the time, at all levels, about all kinds of things. Most statements any company makes are carefully managed by a team of people, not some candid off-the-cuff remark.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

I just got my 3070 FE in the mail and holy poo poo that is a beautiful piece of industrial engineering

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Yeah, the post-1970s and 1980s era can almost be referred to as refinement of known and discovered technologies, and not so much brand-new development.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

The Grumbles posted:

I think the pandemic itself has had such a huge impact on slowing down game development - the results of which we'll only really see over the next couple of years, I think - that it probably dwarfs any impact the GPU/console shortage is having. That said, as other posters have indicated, it's likely that you won't see a raft of Crysis-type games this generation, because what's the point of optimizing for the high end when it doesn't really exist in the market?


Just for my own peace of mind and sanity, I need to clear up that the pace of advancement hasn't exactly accelerated since Reagan; it's just become both more profitable and more risk-averse. Most technologies are still on the path set by the big key innovations of the 50s and 60s and are fairly homogeneous, which is good for reliable profits but probably not great in the long term. And those big key innovations that we're still iterating on were dreamt up by people who didn't have to worry so much about student debt or the cost of housing or whatever else. Obviously many of the objects we use have got better since the 60s (although many haven't, or have got worse, due to the demands of the market!). And a significant portion of the PC I'm typing on right now has parts manufactured by Samsung, which, as you may know, is a Korean company.

Also, I'm not talking about Soviet Russia or Communist China or whatever extreme scenario you're suggesting. I'm talking about countries like the USA and the UK pretty much up until Reagan, which were hugely creative and innovative because they were mostly free markets with robust controls and socialist state infrastructures. The whole reason pop culture itself - and the creative industries that followed, including and especially video games - exists is that western countries in the 60s had big social safety nets and managed market economies that gave people who didn't already have huge amounts of capital the space to be risky, innovative and creative. Yes, that includes the USA.

The only other thing I want to add is that it's very naive to think that western companies are somehow more honest because they're in a mostly unregulated market. Companies lie all the time, at all levels, about all kinds of things. Most statements any company makes are carefully managed by a team of people, not some candid off-the-cuff remark.

It's been proven I can't respond to this without feelings getting hurt and posts getting reported. You aren't having a discussion if you are just agreeing with each other. Don't come out of this thinking you won or are right, because opposing thoughts on this topic get punished.

The Big Bad Worf
Jan 26, 2004
Quad-greatness
It's a good thing that we invented capitalism first so there would be financial incentive to discover fire

Craptacular!
Jul 9, 2001

Fuck the DH
Read the room.

This is SA Forums. We’re all socialists. Some of us are goddamn Communists. Anyone expressing libertarian or free-market ideals was dogpiled off the forums years ago. It doesn’t help that most of them didn’t have any real thing to debate besides, “well the system worked out great for me.”

Craptacular! fucked around with this message at 19:52 on Apr 11, 2021

CoolCab
Apr 17, 2005

glem
politely, instead of using this space to have a discussion and/or debate that could definitely find a better home than the gpu thread: i know that the 30 series is starting to appear on steam hardware lists and stuff, but i'm surprised that they're supposedly running up the charts faster than previous generations? that really shocks me to be honest. they're starting to turn up, finally, like six months after launch; i assumed fewer must be appearing. does strongly suggest what i've been saying for a while: crypto is a contributor to the problem but not the problem itself.

mA
Jul 10, 2001
I am the ugly lover.

Craptacular! posted:

Read the room.

This is SA Forums. We’re all socialists. Some of us are goddamn Communists. Anyone expressing libertarian or free-market ideals was dogpiled off the forums years ago. It doesn’t help that most of them didn’t have any real thing to debate besides, “well the system worked out great for me.”

LOL, I still remember when D&D used to be run by Bush-supporting neocons and libertarians. The good ol' days. It does make me happy how much further left the forums have become since then. Alright, that's enough of my politics contribution.

Shipon
Nov 7, 2005

CoolCab posted:

politely, instead of using this space to have a discussion and/or debate that could definitely find a better home than the gpu thread: i know that the 30 series is starting to appear on steam hardware lists and stuff, but i'm surprised that they're supposedly running up the charts faster than previous generations? that really shocks me to be honest. they're starting to turn up, finally, like six months after launch; i assumed fewer must be appearing. does strongly suggest what i've been saying for a while: crypto is a contributor to the problem but not the problem itself.

Nvidia says that shipments haven't been reduced compared to previous generations and that there are no yield issues, and the Steam charts seem to back that up. It's just that demand is sky high compared to the past, due to a mix of more people wanting to game, a lot of people who skipped the 2000 series, and crypto.

Sure, you can say that's still a supply problem since it hasn't matched demand, but how the hell was Nvidia supposed to predict the collapse of global supply chains last year due to a pandemic that will take years to overcome?

CoolCab
Apr 17, 2005

glem

Shipon posted:

Nvidia says that shipments haven't been reduced compared to previous generations and that there are no yield issues, and the Steam charts seem to back that up. It's just that demand is sky high compared to the past, due to a mix of more people wanting to game, a lot of people who skipped the 2000 series, and crypto.

ridiculously good value compared to the 20 series too. like, i was very happy paying about 590 quid for a 3070 because it replaced a part that launched for, one second... loving lord jesus christ, £1099 UK launch 2080 Ti RRP.

they course-corrected on the Super cards a bit, but they really put the price as low as they could get away with this launch.

Craptacular!
Jul 9, 2001

Fuck the DH
Things are showing up in Steam hardware charts because for every four crypto enthusiasts who would pay $2400 for a card with intentions to make it all back, there's one person who will spend $2400 on a $900 card just to play games, with no intention of making it back. Not all gamers have to eat ramen to keep their landlord happy.

These people are in some sense the real problem, because they're telling Nvidia that these prices can be somewhat normalized. However, there are also many other factors, like AMD still being a whole generation behind.

Shipon
Nov 7, 2005
Yeah, the 3080 and 3070 at their original MSRPs were some of the best deals we've seen in the market for a long time. Even with a 20-30% markup beyond that, I would say it's still not a horrible buy, if you can find one at that price, that is. They were clearly spooked by AMD having a potentially competitive card.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
Great news!
Third-party marketplace GPU prices are accelerating. The 3060 that was $800 last week is now $1100. The 3070 that was $1500 is now $1750.
The 3090 is sitting stable at $3300-3900 depending on the SKU.

The dream of a 3080 for $3080 is still alive.

Didn't check the other models, but they're probably similarly bad.

Fauxtool fucked around with this message at 20:24 on Apr 11, 2021
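For scale, the week-over-week markup in those quoted prices works out as follows (a quick sketch using only the numbers from the post above):

```python
def pct_increase(old, new):
    """Percentage change from old price to new price."""
    return (new - old) / old * 100

# Third-party marketplace prices quoted above (USD): (last week, now)
moves = {"3060": (800, 1100), "3070": (1500, 1750)}

for model, (last_week, now) in moves.items():
    print(f"{model}: ${last_week} -> ${now} ({pct_increase(last_week, now):+.1f}%)")
# prints:
# 3060: $800 -> $1100 (+37.5%)
# 3070: $1500 -> $1750 (+16.7%)
```

So in this one-week snapshot, at least, the cheaper tier is inflating faster than the more expensive one.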

mA
Jul 10, 2001
I am the ugly lover.

Fauxtool posted:

Great news!
Third-party marketplace GPU prices are accelerating. The 3060 that was $800 last week is now $1100. The 3070 that was $1500 is now $1750.
The 3090 is sitting stable at $3300-3900 depending on the SKU.

The dream of a 3080 for $3080 is still alive.

Didn't check the other models, but they're probably similarly bad.

I've been holding off selling my 3080, but if resale prices start to push $3000 it's definitely going on eBay.


Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

mA posted:

I've been holding off selling my 3080, but if resale prices start to push $3000 it's definitely going on eBay.

Take the money, then come back to me for a 3080 Ti. Win-win-win.
