repiv
Aug 13, 2009

Paul MaudDib posted:

For legitimate games, you can set up a certificate authority system. NVIDIA has the root cert, you issue a sub-CA to each studio, when they release a binary they sign it and the driver verifies the chain of signatures before allowing HDCP to be disabled. Which would prevent most of the problems with compatibility/expense - the only real problem is that if you are an aspiring game-dev you will need to buy a HDCP-compatible display until you're legit enough to get a cert.

Are they going to retroactively issue certificates for every application and game that ever used occlusion queries (or any other form of readback), going all the way back to DX9 when they were introduced?
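For what it's worth, the chain-of-trust scheme Paul sketches above boils down to a few lines of signature checking. A minimal sketch, assuming Ed25519 keys via the Python cryptography package; the key names and the chain_is_valid helper are invented for illustration, not anything NVIDIA actually ships:

code:

from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat
from cryptography.exceptions import InvalidSignature

# Hypothetical three-level chain: vendor root key -> studio sub-key -> game binary.
root_key = Ed25519PrivateKey.generate()       # held by the GPU vendor
studio_key = Ed25519PrivateKey.generate()     # issued to a game studio

# The vendor endorses the studio's public key; the studio signs the shipped binary.
studio_pub = studio_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
studio_endorsement = root_key.sign(studio_pub)

game_binary = b"...contents of game.exe..."
binary_sig = studio_key.sign(game_binary)

def chain_is_valid(root_pub, studio_pub, studio_endorsement, binary, binary_sig):
    """Driver-side check (illustrative): root vouches for studio, studio vouches for binary."""
    try:
        root_pub.verify(studio_endorsement, studio_pub)
        Ed25519PublicKey.from_public_bytes(studio_pub).verify(binary_sig, binary)
        return True
    except InvalidSignature:
        return False

# Only if this passes would the hypothetical driver let the app opt out of HDCP.
print(chain_is_valid(root_key.public_key(), studio_pub, studio_endorsement,
                     game_binary, binary_sig))   # True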

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

Are they going to retroactively issue certificates for every application and game that ever used occlusion queries (or any other form of readback), going all the way back to DX9 when they were introduced?

They are already maintaining driver support for them, so they presumably have access to (at least) binaries. Why not? They would need to validate all that stuff for a new uarch anyway, running a batch job that signed the binaries seems like the least of their problems.

Again, not saying they will, this would be painful and I suspect NVIDIA doesn't actually have much of a problem with miner sales (see: mining exception for datacenter use), but as a thought exercise it's certainly not impossible. It's just another form of DRM, ensuring that nothing can get at your precious pixels except the display, when operating in secure mode. Theoretically-secure DRM can exist, it's just usually weakened due to practical concerns.

Mining is generally a bigger problem for AMD since they have such a tiny marketshare in the first place. Lower production levels make it easier for crypto booms to buy up all the stock, and when it pops there's a much smaller number of people interested in buying them. Their small marketshare amplifies the boom and the bust for them.

Paul MaudDib fucked around with this message at 00:15 on Jan 19, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

1gnoirents posted:

It'd be major news if they raised chip prices to AIBs. They're happy about the volume, I'm sure, but that's all it is to Nvidia and AMD specifically.

Distributors win first, AIBs second. Retail typically not at all

There have actually already been rumbles about price increases coming down the pipe for a while now, mostly in the context of RAM but also some bullshit about "passing on wafer costs" that is probably just code for general price increases.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

They are already maintaining driver support for them, so they presumably have access to (at least) binaries. Why not? They would need to validate all that stuff for a new uarch anyway, running a batch job that signed the binaries seems like the least of their problems.

Again, not saying they will, this would be painful and I suspect NVIDIA doesn't actually have much of a problem with miner sales (see: mining exception for datacenter use), but as a thought exercise it's certainly not impossible.

This would also gently caress over hobbyist CUDA developers, small-time research labs, etc., and would likely be a public relations nightmare. On the other hand, if AMD decided to keep their poo poo open, it might actually get some people to switch over to AMD Compute....

I'd also give it roughly a week for someone to figure out how to spoof a miner app to match the signature of a legitimate game to get around the signing. It'd be a huge cat-and-mouse game for no real practical gain on NVidia's end.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

This would also gently caress over hobbyist CUDA developers, small-time research labs, etc., and would likely be a public relations nightmare. On the other hand, if AMD decided to keep their poo poo open, it might actually get some people to switch over to AMD Compute....

* unless you pay for a Quadro/Tesla card, or potentially for a key that unlocks CUDA/OpenCL on your gaming card. After all, Intel does DLC, why not NVIDIA?

Hell, if you wanted to mitigate this you could even give a free key per generation per developer account or something, and give them away free to schools (remember, they give Tesla cards away free to schools for their clusters), etc.

Yeah, of course it's a PR nightmare, but as a thought exercise it's technically possible.

And good luck buying an AMD product at the moment. Largely because they cater to miners, and because they're too chickenshit to actually commit to it and increase production.

edit:

DrDork posted:

I'd also give it roughly a week for someone to figure out how to spoof a miner app to match the signature of a legitimate game to get around the signing. It'd be a huge cat-and-mouse game for no real practical gain on NVidia's end.

That assumes a general break in SHA1/SHA2/whatever, which is not really realistic.

Paul MaudDib fucked around with this message at 00:17 on Jan 19, 2018
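To make the point concrete: a signature covers a hash of the binary, and any change to the binary scrambles that hash completely. A quick hashlib sketch, illustrative only and not tied to whatever scheme a driver would actually use:

code:

import hashlib

legit_binary = b"totally normal game binary" * 1000
spoofed = bytearray(legit_binary)
spoofed[0] ^= 0x01   # a one-bit change; a real smuggled miner changes far more

print(hashlib.sha256(legit_binary).hexdigest())
print(hashlib.sha256(bytes(spoofed)).hexdigest())
# The two digests share no structure, so a signature over the first binary says
# nothing about the second; "matching" it would require breaking the hash itself.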

Yaoi Gagarin
Feb 20, 2014

Locking down cuda would be an awful idea. A big part of how it got popular was that developers could start learning it on regular old GPUs.

repiv
Aug 13, 2009

Paul MaudDib posted:

They are already maintaining driver support for them, so they presumably have access to (at least) binaries. Why not? They would need to validate all that stuff for a new uarch anyway, running a batch job that signed the binaries seems like the least of their problems.

Again, not saying they will, this would be painful and I suspect NVIDIA doesn't actually have much of a problem with miner sales (see: mining exception for datacenter use), but as a thought exercise it's certainly not impossible. It's just another form of DRM, ensuring that nothing can get at your precious pixels except the display, when operating in secure mode. Theoretically secure DRM can exist, it's just usually weakened due to practical concerns.

Mining is generally a bigger problem for AMD since they have such a tiny marketshare in the first place. Lower production levels make it easier for crypto booms to buy up all the stock, and when it pops there's a much smaller number of people interested in buying them. Their small marketshare amplifies the boom and the bust for them.

I doubt Nvidia has every DX9+ app ever created on file - sure they'll have the AAA games but not everything. Not to mention that your theory would require crippling any setup without a HDCP 2.2 monitor attached...

Practically I think there's no way to stop mining on GeForce cards without causing massive disruption to non-miners.

edit: now that I think of it, even under the super heavy-handed system you're proposing miners could still exfiltrate data from the GPU by just rendering a QR code to the screen and pointing a camera at it :cheeky:

VostokProgram posted:

Locking down cuda would be an awful idea. A big part of how it got popular was that developers could start learning it on regular old GPUs.

Locking down CUDA would also achieve nothing when DX/Vulkan compute shaders have nearly identical capabilities and are required for modern games to function.

repiv fucked around with this message at 00:25 on Jan 19, 2018

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Number19 posted:

That was my original point: make mining specific cards for the idiots so you can service your traditional market along with this new, unstable one. Then you can insulate the risk better, and extract the extra profits yourself instead of letting the AIB or retail chain reap all the benefits.

...

It might not even be possible for them to make mining only gear in which case this is all moot.

AMD tried this by making versions that were slightly optimized for mining and had no video-outs to really drive home the market segmentation. To the best of my knowledge, they did not sell very well (though that could be a lot more about lol AMD than about being a bad idea).

One of the things likely fueling the fire here is miners' assumption that there's a measure of safety in buying gaming cards: if the market tanks, at least they can always sell off the hardware for an ok amount. So while they might be looking at an ROI of 120 days, maybe they only need 60 days to hit the break-even point were they to sell the card after a bubble pop, and 60 days looks a lot safer than 120. Drop that safety net by making them incapable of gaming, and I doubt we'd see prices as insane as they are now.
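Roughly the arithmetic behind that resale-safety-net argument, with hypothetical numbers plugged in (the figures below are invented for illustration; real prices and mining income swing constantly):

code:

# Hypothetical figures only.
card_price = 700.0      # what a miner pays for a gaming card today
daily_profit = 6.0      # net mining income per day, after power
resale_value = 350.0    # what the card might still fetch after a crash

days_to_full_roi = card_price / daily_profit                        # ~117 days
days_to_break_even = (card_price - resale_value) / daily_profit     # ~58 days

print(round(days_to_full_roi), round(days_to_break_even))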

Honestly, they'd be assuming more risk by developing an entire separate product lineup: if the bottom falls out of the buttcoin market, they're left holding the bag of warehouses worth of cards they can't sell. Right now, they are selling everything instantly, and even if the bottom dropped out of the market, they'd almost certainly still be in a better position to continue selling cards than they normally are at the tail end of a product lifecycle--normally right now we'd be seeing tons of MIRs and other attempts to get us to buy up the tail end of the stock over the next few months before Ampere drops. I suspect this is the view NVidia is taking, because they absolutely could do mining-specific cards, and are intentionally not.

If they really want to help out the gamer community, the most sensible option is simply to demand that retailers enforce a limited purchasing option--1 or 2 per customer, for example. Yes, truly dedicated buttminers will still find ways to acquire a lot of them, but it should at least give normal people a fighting chance once the inevitable launch-week (or month) shortage clears up.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

I doubt Nvidia has every DX9+ app ever created on file - sure they'll have the AAA games but not everything. Not to mention that your theory would require crippling any setup without a HDCP 2.2 monitor attached...

Practically I think there's no way to stop mining on GeForce cards without causing massive disruption to non-miners.

If you have HDCP then you could run whatever, regardless of signature, as long as HDCP mode is enabled.

If you don't have HDCP, then you could only run signed binaries.

If you have a Tesla/Quadro, or a DRM key/sub-CA for your card, then you could run whatever, regardless of HDCP. Make it relatively easy to get a key for a single card with a developer account, and give them away to educational institutions/studios/etc.

So really the only problem would be people who don't have a HDCP monitor, and need old stuff that NVIDIA doesn't have binaries for, and need to run lots of cards, and aren't an educational institution or studio or anyone else. That's pretty narrow.
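That three-tier policy is really just a small decision table. A sketch with made-up names, not any real driver interface:

code:

def compute_allowed(hdcp_active: bool, has_pro_key: bool, binary_signed: bool) -> bool:
    """Hypothetical driver policy from the post above."""
    if has_pro_key:        # Tesla/Quadro, or a purchased/granted per-card key
        return True
    if hdcp_active:        # display chain is locked down, so readback is moot
        return True
    return binary_signed   # otherwise only vendor-signed binaries get to run

# The case Paul calls "pretty narrow": no HDCP monitor, no key, unsigned old binary.
print(compute_allowed(hdcp_active=False, has_pro_key=False, binary_signed=False))  # False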

quote:

edit: now that I think of it, even under the super heavy-handed system you're proposing miners could still exfiltrate data from the GPU by just rendering a QR code to the screen and pointing a camera at it :cheeky:

You proposed doing this in pixel shaders, yeah? It's a little bit too late to drastically redraw the screen by that point; a pixel shader renders a pixel. But yeah, you could probably redraw it the next frame if you allow occlusion queries. So I guess you would break that on older stuff.

On the other hand you've turned what used to be "launch Nicehash, receev internet funbux" into a titanic struggle involving a monitor and webcam for every single GPU, so it would undoubtedly still reduce miner usage.

Paul MaudDib fucked around with this message at 00:33 on Jan 19, 2018

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
HDCP would wreck "professional" streaming setups, wouldn't it?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

Yeah, of course it's a PR nightmare, but as a thought exercise it's technically possible.

As a thought exercise, communism looks great, too! I mean, I agree that there are measures that they could try if they really cared, but they've explicitly not done so and I can't believe that their lack of action on that front is an accident or an oversight. It just doesn't make a whole lot of business sense to try and crush it.

Paul MaudDib posted:

And good luck buying an AMD product at the moment. Largely because they cater to miners, and because they're too chickenshit to actually commit to it and increase production.

You can't buy Vega because they produced only a tiny handful of the cards in the first place. That the price:performance for anyone other than miners was hilariously bad is likely why they more or less wrote off this generation. When they came out the entire review world panned them as terrible cards that basically no one should buy--can't really blame that bit on miners, especially when you look at the progression (or lack thereof) over their last few GPU generations.

I don't think they went out intending to court miners, other than perhaps retroactively when they realized that they had made some very bad bets (whoops HBM) and now their entire architecture blows and needs to be redone because it doesn't actually game very well anymore.

repiv posted:

edit: now that I think of it, even under the super heavy-handed system you're proposing miners could still exfiltrate data from the GPU by just rendering a QR code to the screen and pointing a camera at it :cheeky:

You could automate this via print screen, too, unless you wanted to be so heavy-handed that you kill off even that.

repiv
Aug 13, 2009

Paul MaudDib posted:

If you have HDCP then you could run whatever, regardless of signature, as long as HDCP mode is enabled.

If they can run whatever then what's to stop them just reading back GPU memory like normal?

Paul MaudDib posted:

So really the only problem would be people who don't have a HDCP monitor, and need old stuff that NVIDIA doesn't have binaries for, and need to run lots of cards, and aren't an educational institution or studio or anyone else. That's pretty narrow.

It becomes much less narrow once you specify HDCP 2.2, the only version that hasn't been broken, which only started appearing in monitors in 2015. AFAIK it's not universally supported even on the newest monitors.

Paul MaudDib posted:

You proposed doing this in pixel shaders, yeah? It's a little bit too late to drastically redraw the screen by that point; a pixel shader renders a pixel.

A pixel shader could render a pixel to a texture, which could then be read by another shader that renders a QR code or some other graphical data encoding.

This theorycrafting is silly; there are too many holes for it to ever work.

repiv fucked around with this message at 00:37 on Jan 19, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

As a thought exercise, communism looks great, too! I mean, I agree that there are measures that they could try if they really cared, but they've explicitly not done so and I can't believe that their lack of action on that front is an accident or an oversight. It just doesn't make a whole lot of business sense to try and crush it.

I have never said any of this is a good idea (in fact, explicitly the opposite, repeatedly). It was a narrow technical question about "how you could stop miners from using gaming GPUs".

The answer is DRM, and I'm sure you can come up with some form of DRM that is appropriately permissive when the signal chain is secure or when the appropriate signatures/disabling keys are present. Sure, there will be edge cases that will break ("obscure DX9 titles which NVIDIA doesn't support/have binaries for and which use occlusion queries"), but it wouldn't be the first time old, unsupported games have broken in the history of computing.

Paul MaudDib fucked around with this message at 00:42 on Jan 19, 2018

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Just randomly insert single-bit errors at some stage. Wouldn’t be noticeable visually, but would gently caress up a hash.
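What Subjunctive describes, in miniature: a least-significant-bit nudge to one colour channel is visually invisible but invalidates any hash computed over the buffer. Illustrative Python only, not a claim about how a driver would actually inject the errors:

code:

import hashlib, random

framebuffer = bytearray(random.randbytes(1920 * 1080 * 4))   # stand-in RGBA frame

tampered = bytearray(framebuffer)
i = random.randrange(len(tampered))
tampered[i] ^= 0x01                    # flip the least-significant bit of one channel

print(abs(tampered[i] - framebuffer[i]))                      # 1 out of 255: imperceptible
print(hashlib.sha256(bytes(framebuffer)).hexdigest()[:16])    # but the hashes...
print(hashlib.sha256(bytes(tampered)).hexdigest()[:16])       # ...no longer match at all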

lDDQD
Apr 16, 2006

VostokProgram posted:

Locking down cuda would be an awful idea. A big part of how it got popular was that developers could start learning it on regular old GPUs.

There's no need to lock it down - much like there's no need to make GeForce cards literally unable to run your CAD program. All it takes to gently encourage consumers to buy cards in their specific market segment is to just make it perform poorly.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

If you have HDCP then you could run whatever, regardless of signature, as long as HDCP mode is enabled.

If you don't have HDCP, then you could only run signed binaries.

If you have a Tesla/Quadro, or a DRM key/sub-CA for your card, then you could run whatever, regardless of HDCP. Make it relatively easy to get a key for a single card with a developer account, and give them away to educational institutions/studios/etc.

So really the only problem would be people who don't have a HDCP monitor, and need old stuff that NVIDIA doesn't have binaries for, and need to run lots of cards, and aren't an educational institution or studio or anyone else. That's pretty narrow.

You're still cutting out all the individual hobbyist CUDA people, who are not an insignificant group. Especially when NVidia wants hobbyists to dick around with stuff, since it means higher CUDA adoption rates when they eventually join/start companies. Doubly so if AMD doesn't follow suit in terms of locking stuff down. You mention them giving Teslas away to schools--this sort of trailing adoption is exactly why they did that.

To deal with that, you'd have to allow individual "personal development use" licenses or whatever, and now you're back to where you started, except perhaps that you've increased the crazy-miner's initial capital outlay because he has to set up 4 complete systems instead of 1 system with 4 cards. It would be interesting to see how much of the market right now is actually composed of people running tons of cards per system, but I'd be willing to bet that the numbers are fairly small, and thus wouldn't have any great impact on retail card availability.

Paul MaudDib posted:

The answer is DRM, and I'm sure you can come up with some form of DRM that is appropriately permissive when the signal chain is secure or when the appropriate disabling keys are present. Sure, there will be edge cases that will break, but it wouldn't be the first time old, unsupported games have broken in the history of computing.

The point is that any sort of DRM that would be strong enough to be effective would be so strong as to make using them as gaming cards and development cards highly problematic. Yeah, unsupported games in the past have broken, but this would be the first time a company was taking the risk of breaking tens of thousands of games all at once in order to spite someone trying to buy their product.

DrDork fucked around with this message at 00:43 on Jan 19, 2018

repiv
Aug 13, 2009

lDDQD posted:

There's no need to lock it down - much like there's no need to make GeForce cards literally unable to run your CAD program. All it takes to gently encourage consumers to buy cards in their specific market segment is to just make it perform poorly.

But compute is an integral part of modern game engines; anything that hurts miners' performance would also hurt game performance to some extent.

Modern engines are even using integer hash algorithms inside shaders to generate random numbers - good luck discriminating between "bad integer hash for mining" and "good integer hash for rendering".

repiv fucked around with this message at 00:52 on Jan 19, 2018
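For reference, this is the sort of integer hash repiv means - roughly the PCG-style hash that turns up in shader RNG write-ups, transcribed to Python with 32-bit masking. A mining kernel's inner loop doesn't look any more suspicious than this:

code:

MASK = 0xFFFFFFFF  # emulate 32-bit unsigned wraparound

def pcg_hash(x: int) -> int:
    """Integer hash of the kind shaders use for cheap per-pixel randomness."""
    state = (x * 747796405 + 2891336453) & MASK
    word = (((state >> ((state >> 28) + 4)) ^ state) * 277803737) & MASK
    return ((word >> 22) ^ word) & MASK

# e.g. pseudo-random values per pixel index, for dithering/jitter/noise:
print([pcg_hash(i) / MASK for i in range(4)])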

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

To deal with that, you'd have to allow individual "personal development use" licenses or whatever, and now you're back to where you started, except perhaps that you've increased the crazy-miner's initial capital outlay because he has to set up 4 complete systems instead of 1 system with 4 cards.

The easiest way to prevent people from signing up tons of accounts is... charge per account. Which is equivalent to letting people pay for a key for each specific card they want to run CUDA on, which has been my point all along. But again, you can also give away one key per generation per .edu email address, or find other ways to apportion them out fairly readily without making it just a straight signup.
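The per-card key idea could be as simple as the vendor signing each card's unique ID and the driver checking that signature locally. Another purely illustrative Ed25519 sketch; the UUID and helper names are invented:

code:

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()    # lives on the vendor's key server
vendor_pub = vendor_key.public_key()         # baked into the driver

gpu_uuid = b"GPU-9f3a2c71-made-up-card-identifier"

# "Buying a key" for this card == receiving a signature over its UUID.
unlock_key = vendor_key.sign(gpu_uuid)

def cuda_unlocked(uuid: bytes, key: bytes) -> bool:
    """Driver-side check: is this key valid for this specific card? (illustrative)"""
    try:
        vendor_pub.verify(key, uuid)
        return True
    except InvalidSignature:
        return False

print(cuda_unlocked(gpu_uuid, unlock_key))                # True
print(cuda_unlocked(b"GPU-some-other-card", unlock_key))  # False: the key doesn't transfer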

quote:

It would be interesting to see how much of the market right now is actually composed of people running tons of cards per system, but I'd be willing to bet that the numbers are fairly small, and thus wouldn't have any great impact on retail card availability.

You're looking at this backwards: miners make up a small percentage of the people with cards, but they account for a disproportionate share of the actual cards. One small-time miner might have 30 midrange cards, while even a high-end gamer might have 2 cards from very specific models. As a result, cutting out miners would increase availability quite a lot - it's really hard to argue otherwise when a 1070 has gone from $300 to $700 over the course of the last year, when you can find them in stock at all.

DrDork posted:

The point is that any sort of DRM that would be strong enough to be effective would be so strong as to make using them as gaming cards and development cards highly problematic. Yeah, unsupported games in the past have broken, but this would be the first time a company was taking the risk of breaking tens of thousands of games all at once in order to spite someone trying to buy their product.

I really doubt there are tens of thousands of DX9 games which use occlusion queries and which NVIDIA doesn't have binaries to sign, and/or could not reasonably acquire binaries once you filed a bug report. Merely by assuming lack of support you are implying a massive degree of obscurity, and it only gets narrower from there. And if it's legit, it's just a matter of them getting ahold of binaries. eBay exists.

Obviously any DRM scheme is intrusive to at least some degree (particularly until it reaches critical mass in terms of adoption/hardware support), and people aren't going to like it. I never said it was a good idea (again, I've repeatedly said the opposite), I said it was technically possible, which was the original question.

repiv posted:

But compute is an integral part of modern game engines; anything that hurts miners' performance would also hurt game performance to some extent.

Modern engines are even using integer hash algorithms inside shaders to generate random numbers - good luck discriminating between "bad integer hash for mining" and "good integer hash for rendering".

And, with a modern game engine you could easily get a binary to sign that allows that kind of stuff. Game studios can have their own sub-CAs that allow them to develop and release, even, and you just validate the chain of signatures.

And like Subjunctive said, you can always throw some random errors into the integer math of unsigned applications running in graphics mode. On average, small errors in the least-significant bits aren't going to hurt anything graphical, but it'll gently caress up hashing 100% of the time.

Paul MaudDib fucked around with this message at 01:09 on Jan 19, 2018

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
They'd just sell the unlock to miners, and make a tidy profit from it since miners don't mind paying well above MSRP already. There'd still be a shortage of cards.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
What's a good example we can look to in history where something suddenly becomes more valuable (if you buy into the hype of course) for doing something other than it was originally intended, and the people that just wanted it for the original purpose are left high and dry?

I assume somebody is gonna reply right away with some obvious answer that I'm not thinking of.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

Subjunctive posted:

Just randomly insert single-bit errors at some stage. Wouldn’t be noticeable visually, but would gently caress up a hash.

lol, that's evil

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

fletcher posted:

What's a good example we can look to in history where something suddenly becomes more valuable (if you buy into the hype of course) for doing something other than it was originally intended, and the people that just wanted it for the original purpose are left high and dry?

I assume somebody is gonna reply right away with some obvious answer that I'm not thinking of.

Sudafed and related cold/cough medications maybe? I dunno, it's a good question.

ufarn
May 30, 2009
With the state of GPU prices, what's the state of the VFX industry? Do they use something like Tesla/Quadro instead, or do they rely on cloud computing?

They're having a hard time as it is, hopefully this isn't another straw.

shrike82
Jun 11, 2005

LOL at suggesting that Nvidia restrict use of CUDA on their cards

No way this wouldn't piss the poo poo out of the entire ML sector

Truga
May 4, 2014
Lipstick Apathy

Paul MaudDib posted:

The answer is DRM

DRM is never, ever "the answer"

shrike82 posted:

LOL at suggesting that Nvidia restrict use of CUDA on their cards

No way this wouldn't piss the poo poo out of the entire ML sector

it'd kill nvidia overnight pretty much

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fletcher posted:

What's a good example we can look to in history where something suddenly becomes more valuable (if you buy into the hype of course) for doing something other than it was originally intended, and the people that just wanted it for the original purpose are left high and dry?

I assume somebody is gonna reply right away with some obvious answer that I'm not thinking of.

I'd say any kind of a land-rush situation, like "they struck oil and now my landlord is kicking me out so they can drill/property taxes are too high for me to remain". Or gentrification in general.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Paul MaudDib posted:

The easiest way to prevent people from signing up tons of accounts is... charge per account. Which is equivalent to letting people pay for a key for each specific card they want to run CUDA on, which has been my point all along. But again, you can also give away one key per generation per .edu email address, or find other ways to apportion them out fairly readily without making it just a straight signup.
This backs you into a dilemma: either you make the licenses quite expensive--and drive off a lot of hobby devs (many of whom don't have .edu emails)--or you make them cheap--and then the miners don't care, because what's another $100 when you're already paying $800/card? Trying to figure out a model where people only ever get one license, without either loving over the people you want to have them or making extra licenses trivial to obtain, is basically impossible in this market.

Paul MaudDib posted:

You're looking at this backwards: miners make up a small percentage of the people with cards, but they account for a disproportionate share of the actual cards. One small-time miner might have 30 midrange cards, while even a high-end gamer might have 2 cards from very specific models. As a result, cutting out miners would increase availability quite a lot - it's really hard to argue otherwise when a 1070 has gone from $300 to $700 over the course of the last year, when you can find them in stock at all.
Sure, but the question remains "how many moderate- or large-scale miners are there?" A mid-scale mining setup might have 30 cards, but if there are only 10,000 such setups, that 300k is a drop in the bucket compared to the ~35,000,000 cards NVidia sells annually. A big part of the lack of current availability is the lack of incoming stock--NVidia and the AIB manufacturers simply aren't producing many cards right now, presumably in anticipation of Ampere in the next quarter or so. Another big factor is that even single-card mining is profitable right now, so a ton of people who otherwise would have sat content with their 7- or 9-series cards are looking to upgrade to Pascal and mine while they're not playing. I would actually imagine that those "casual miners" probably collectively account for far more cards than the serious business buttcoiners.

Paul MaudDib posted:

And, with a modern game engine you could easily get a binary to sign that allows that kind of stuff. Game studios can have their own sub-CAs that allow them to develop and release, even, and you just validate the chain of signatures.
An interesting question arises when you look at mods: I wonder if it would be possible to take a legitimate game with a signed binary which is capable of loading mods and/or has built-in support for scripting, and leverage that to do mining work from inside the game itself.

Paul MaudDib posted:

And like Subjunctive said, you can always throw some random errors into the integer math of unsigned applications running in graphics mode. On average, small errors in the least-significant bits aren't going to hurt anything graphical, but it'll gently caress up hashing 100% of the time.
While no doubt this would mess up current coins, you're kidding yourself if you think enterprising buttcoiners wouldn't respond by developing new algos and new coins that simply disregarded the least-significant bit of the hash or otherwise were intentionally built around avoiding the issue. It's not like such a move hasn't been done before--in fact, that's basically how we got here in the first place when BitCoin got overrun by ASICs; people decided to make new coins that favored GPUs over ASICs (it's also why the 1070/GDDR5 is the best price:performance card--that wasn't an accident).

DrDork fucked around with this message at 03:01 on Jan 19, 2018

Craptacular!
Jul 9, 2001

Fuck the DH
How about, instead of this complicated licensing scheme where you pay to be a developer (which discourages development), you just allow development on high end cards and not low end cards? If you want to program with CUDA, get a 1080. Don't get a 1060, that's a consumption card. Done.

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~

Dadbod Apocalypse posted:

“Gentlemen, we need to do something to our upcoming product line in order to reduce sales,” said no one ever.

There was the NES Classic, though that wasn't upcoming. I'm still confused why they stopped making them.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Craptacular! posted:

How about, instead of this complicated licensing scheme where you pay to be a developer (which discourages development), you just allow development on high end cards and not low end cards? If you want to program with CUDA, get a 1080. Don't get a 1060, that's a consumption card. Done.

To be fair, this is already a problem: the hardest hit cards are the high-end ones. The 1080 got a bit of a pass for a while thanks to GDDR5X, but once people realized there were algos that worked just fine with it and were about as profitable as the others, it, too, shot up, and now it's impossible to find 1070/1080/1080Ti's for sane prices.

If you want a 1050 or below, you probably can get one without too much trouble.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Alpha Mayo posted:

There was the NES Classic, though that wasn't upcoming. I'm still confused why they stopped making them.

Maybe they were worried that too many people would like it and stop buying the fairly maligned Wii U?

Craptacular!
Jul 9, 2001

Fuck the DH

DrDork posted:

To be fair, this is already a problem: the hardest hit cards are the high-end ones. The 1080 got a bit of a pass for a while thanks to GDDR5X, but once people realized there were algos that worked just fine with it and were about as profitable as the others, it, too, shot up, and now it's impossible to find 1070/1080/1080Ti's for sane prices.

If you want a 1050 or below, you probably can get one without too much trouble.

The highest end cards subsidize a lot of the R&D costs by being overpriced out of the box; they are models where the very worst result is a bunch of boxes sitting on shelves. Selling out of them regularly is a very nice problem to have, and doesn't have a chilling effect on the game developer industry quite so much as the $300-and-under cards do.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Craptacular! posted:

The highest end cards subsidize a lot of the R&D costs by being overpriced out of the box; they are models where the very worst result is a bunch of boxes sitting on shelves. Selling out of them regularly is a very nice problem to have, and doesn't have a chilling effect on the game developer industry quite so much as the $300-and-under cards do.

Yeah, I absolutely agree. Hence why--despite the bitching and moaning of the PCMR crowd--NVidia has been perfectly happy to leave things as-is for the moment. And I can't say I blame them much.

Craptacular!
Jul 9, 2001

Fuck the DH

DrDork posted:

Yeah, I absolutely agree. Hence why--despite the bitching and moaning of the PCMR crowd--NVidia has been perfectly happy to leave things as-is for the moment. And I can't say I blame them much.

You don't really agree because you seem to be cool with 1050s at nearly $300 and everything else higher. My point is that the company is not long-term going to benefit from there being such a high barrier to entry for any and all GPUs. It's one thing when the 1080 is $550. It's another when a 1060 is.

People expect Maseratis to be very expensive, what's another $20,000 on the price. But if a Toyota Camry costs as much as a Porsche, people will notice. :iiaca:

Craptacular! fucked around with this message at 03:34 on Jan 19, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Alpha Mayo posted:

There was the NES Classic, though that wasn't upcoming. I'm still confused why they stopped making them.

They designed it with a lot of end-of-life parts (possibly as a way to clear stock of near-obsolete parts with a limited-production product) and blew through the entire available stockpile in no time flat.

Paul MaudDib fucked around with this message at 03:47 on Jan 19, 2018

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Craptacular! posted:

You don't really agree because you seem to be cool with 1050s at nearly $300 and everything else higher. My point is that the company is not long-term going to benefit from there being such a high barrier to entry for any and all GPUs. It's one thing when the 1080 is $550. It's another when a 1060 is.

People expect Maseratis to be very expensive, what's another $20,000 on the price. But if a Toyota Camry costs as much as a Porsche, people will notice. :iiaca:

I don't know where you're seeing 1050's for $300, but wherever that is, you're being ripped off. As I sit here, NewEgg has multiple 1050's for ~$150, which is--admittedly--a few bucks above what MSRP should be, but it's not enormously off. 1050 Ti's start at $160, which is also within $20 of MSRP.

NVidia also isn't likely to stay in this situation long-term. Either the mining craze will continue and they will ramp up production to match, or it'll fizzle and life will go back to normal. Right now we're at a confluence of highly profitable mining and NVidia tapering production off to switch over to Ampere. It's the combo that's driving prices nuts; it won't last.

lDDQD
Apr 16, 2006

repiv posted:

But compute is an integral part of modern game engines; anything that hurts miners' performance would also hurt game performance to some extent.
Modern engines are even using integer hash algorithms inside shaders to generate random numbers - good luck discriminating between "bad integer hash for mining" and "good integer hash for rendering".
Rendering triangles is an even integraler part of modern game engines (one would think), and they've found a way to severely gimp the triangle-rendering performance - as long as the application requesting said triangles to be rendered is a CAD or 3d modelling program, instead of a game.
This isn't even particularly difficult - they actually can do this mostly using the same driver subsystem that matches shader code against a library of known shaders, and replaces them with nVidia's (or AMD's) more efficient version. Just in this case, the goal isn't to improve performance - it's the opposite.

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

Niwrad posted:

Does HDMI vs DisplayPort matter with my GPU if I'm just running dual monitors (two Dell U2410) that can only do 60hz? Is there any advantage to using DisplayPort?

I'm running an RX 480 right now and noticed the screen occasionally goes out for a couple of seconds and then comes back on. It's maybe twice a day but annoying. I'm wondering if this is due to the HDMI cable and if switching to DisplayPort would potentially fix that?

I guess I'm just wondering if the GPU has a preference in HDMI or DisplayPort. Whether it's worth dropping $10 to buy some cables and ditching the HDMI.

The only difference I can think of is that DisplayPort lets you daisy-chain monitors. There may be something wrong with the HDMI cable; try using a different one if you have another, to see if the screen blanking still happens. If it doesn't, go ahead and buy a DisplayPort cable if you want. If it still happens, the problem is probably with the card or drivers.

There's no technical preference from the GPU when it comes to cables, just what each cable supports, and both HDMI and DisplayPort support your current monitors' resolution and refresh rate.

repiv
Aug 13, 2009

lDDQD posted:

Rendering triangles is an even integraler part of modern game engines (one would think), and they've found a way to severely gimp the triangle-rendering performance - as long as the application requesting said triangles to be rendered is a CAD or 3d modelling program, instead of a game.
This isn't even particularly difficult - they actually can do this mostly using the same driver subsystem that matches shader code against a library of known shaders, and replaces them with nVidia's (or AMD's) more efficient version. Just in this case, the goal isn't to improve performance - it's the opposite.

Nvidia is easily able to discriminate between CAD apps and games because CAD is the only thing that uses certain legacy OpenGL features. For example both of these achieve the same result:

(a) render a mesh with the legacy two-sided shading mode enabled
(b) disable backface culling and render the mesh with a shader that applies to both front and back facing polygons

but on a GeForce card (b) will be an order of magnitude faster because (a) is the deliberately crippled CAD path. Nvidia only seem willing to use generic heuristics like that, rather than a specific whitelist of programs or shaders, since pro apps like Maya and Blender that avoid using legacy GL functions have great performance on GeForce cards - the driver just assumes they're games and lets them run at full speed.

Mining is different because there are no obvious "tells" like there are with CAD, a mining kernel doesn't do anything that would look out of place in a shader.

repiv fucked around with this message at 13:14 on Jan 19, 2018

literally a hog
Jan 5, 2006

Mandarrrrrk! Bring me the head of Dexter and Dee Dee shall forever be yours!
Oh poo poo Newegg weekend email deals out!

580s in stock!??! 24 hour only sale?!? Let's check that price.

"SALE" lmao
