  • Locked thread
wargames
Mar 16, 2008

official yospos cat censor

latinotwink1997 posted:

Are ASICs just going to basically kill crypto altogether?

till the next craze happens, then no more gpus.


Klyith
Aug 3, 2007

GBS Pledge Week

latinotwink1997 posted:

Are ASICs just going to basically kill crypto altogether?

you need holy water and a steak to kill crypto

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

wargames posted:

till the next craze happens, then no more gpus.

I don’t see how any GPUs can compete with any ASIC

wargames
Mar 16, 2008

official yospos cat censor

Comfy Fleece Sweater posted:

I don’t see how any GPUs can compete with any ASIC

i can go to best buy and buy 16 gpus and no asics.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Klyith posted:

you need holy water and a steak to kill crypto

sounds like a nice meal

divabot
Jun 17, 2015

A polite little mouse!
there was a scheme to buy a hydro plant in upstate New York, for bitcoin mining!

it is of course insane

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

latinotwink1997 posted:

Are ASICs just going to basically kill crypto altogether?

Pretty much. Anyone who claims their crypto is ASIC-proof or ASIC-resistant is lying.

QuarkJets
Sep 8, 2008

wargames posted:

i can go to best buy and buy 16 gpus and no asics.

that just means that it's harder for a cryptocurrency to remain decentralized once it becomes valuable enough to be mined by ASIC hardware

like you're technically correct that you can buy GPUs right off the shelf, but once a cryptocurrency becomes valuable enough to make ASIC mining worthwhile, the economics of GPU mining no longer make sense
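
to put some toy numbers on it (everything below is made up, purely illustrative -- the payout rate and efficiencies are assumptions, not real figures):

code:
# toy GPU-vs-ASIC comparison for a SHA-256 coin -- all numbers hypothetical
ELECTRICITY = 0.12  # USD per kWh, assumed

def daily_profit(hashrate_ths, joules_per_th, usd_per_ths_day):
    """One day of mining: payout minus electricity cost."""
    revenue = hashrate_ths * usd_per_ths_day
    watts = hashrate_ths * joules_per_th  # J/TH times TH/s gives watts
    power_cost = watts / 1000 * 24 * ELECTRICITY
    return revenue - power_cost

# identical hypothetical payout rate for both miners
print(daily_profit(0.001, 150_000, 10.0))  # GPU:  ~ -$0.42/day, loses money
print(daily_profit(14.0, 100, 10.0))       # ASIC: ~ +$136/day

once hardware like that exists, a GPU farm can't even cover its own power bill at the difficulty the ASICs set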

latinotwink1997
Jan 2, 2008

Taste my Ball of Hope, foul dragon!


QuarkJets posted:

that just means that it's harder for a cryptocurrency to remain decentralized once it becomes valuable enough to be mined by ASIC hardware

This. It just seems like you’ll end up with a couple Chinese ASIC farms controlling a majority of the currency, making it no longer decentralized. They manipulate it however they choose, meaning the benefits of crypto are gone and suddenly it’s just dead. No one uses it, the farms die out, and we’re back where we started.

And honestly, I long to see that day come.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

latinotwink1997 posted:

This. It just seems like you’ll end up with a couple Chinese ASIC farms controlling a majority of the currency, making it no longer decentralized. They manipulate it however they choose, meaning the benefits of crypto are gone and suddenly it’s just dead. No one uses it, the farms die out, and we’re back where we started.

And honestly, I long to see that day come.

This is good for bitcoin.

Stickman
Feb 1, 2004

Palladium posted:

This is good for bitcoin.

The dawning realization that it is neither decentralized nor secure enough to warrant its enormous cost, then being put out of its misery? Sure!

divabot
Jun 17, 2015

A polite little mouse!

Stickman posted:

The dawning realization that it is neither decentralized nor secure enough to warrant its enormous cost, then being put out of its misery? Sure!

this has been the case since 2014, but bitcoin is here

it turns out bitcoin completely refutes the Efficient Market Hypothesis

Dr. Fishopolis
Aug 31, 2004

ROBOT

divabot posted:

this has been the case since 2014, but bitcoin is here

it turns out bitcoin completely refutes the Efficient Market Hypothesis

it's from March, but I just came across this wonderful tale: a crypto conference hires a cannabis company to cater, and the caterer doesn't adequately announce that the entire menu is spiked with THC.

predictable, wonderful finger-pointing ensues. i love everything about it.

Stickman
Feb 1, 2004

divabot posted:

this has been the case since 2014, but bitcoin is here

it turns out bitcoin completely refutes the Efficient Market Hypothesis

I keep holding out hope that if enough people get screwed by exchanges and enough real money starts shifting out of the US, we'll actually see some regulation. But America :smith:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

latinotwink1997 posted:

This. It just seems like you’ll end up with a couple Chinese ASIC farms controlling a majority of the currency, making it no longer decentralized. They manipulate it however they choose, meaning the benefits of crypto are gone and suddenly it’s just dead. No one uses it, the farms die out, and we’re back where we started.

Where we started is inventing random shitcoins and seeing which one takes off. So, yeah, you're right: someone will develop a new coin that is moderately resistant to existing ASICs, and the cycle will begin anew.

Setset
Apr 14, 2012
Grimey Drawer
Interesting. Crypto influencing a major chip maker? Or am I reading this wrong?

Granted, 7nm is probably too expensive for them

quote:

GlobalFoundries Reshapes Technology Portfolio to Intensify Focus on Growing Demand for Differentiated Offerings

Semiconductor manufacturer realigns leading-edge roadmap to meet client need and establishes wholly-owned subsidiary to design custom ASICs.

https://www.anandtech.com/show/13277/globalfoundries-stops-all-7nm-development/3

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

We're hitting a hard wall with per-core process improvements and in the absence of that you run custom silicon to accelerate the bottlenecks in your workload. This is an inevitable consequence of the world hitting the limits of silicon die-shrinks and GF is following the money.

divabot
Jun 17, 2015

A polite little mouse!

Lube banjo posted:

Interesting. Crypto influencing a major chip maker? Or am I reading this wrong?

Granted, 7nm is probably too expensive for them

you're reading it wrong. ASIC just means application-specific integrated circuit, and could be anything. Most chips in a phone are ASICs.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BangersInMyKnickers posted:

We're hitting a hard wall with per-core process improvements and in the absence of that you run custom silicon to accelerate the bottlenecks in your workload. This is an inevitable consequence of the world hitting the limits of silicon die-shrinks and GF is following the money.

It's a little more complicated than that. GF could have chased 7nm/10nm like Intel, Samsung, and TSMC, but the huge capital costs to develop the process need to be amortized over a large number of wafers, and GF didn't want to pay to expand their facilities to the extent that would be required. Custom silicon is all well and good, but there were process improvements that GF could have made that they chose not to.

The article says that their investors are getting antsy with GF continuing to lose money and wanted to go for profitability today rather than another big capital outlay trying to stay competitive with the big dogs. Of course, there are many tiny fabs out there that did the same thing; you just don't know them because nobody cares about them anymore, and if GF goes down that road then eventually they'll fade out too.

Charlie Demerjian was all-in on the idea that AMD was going to multi-source Zen2: GF made changes to their anticipated 7nm process that put their design rules more in line with TSMC 7SOC, and then all of a sudden AMD announced they were single-sourcing from TSMC and GF announced they weren't doing 7nm. It's not clear which came first; GF could definitely have gotten cold feet if their lead customer pulled out, but AMD could also have pulled out because GF wasn't serious about building out enough capacity to get costs down sufficiently.

wargames
Mar 16, 2008

official yospos cat censor
GF really needed to get into the 7nm game, to be honest. If they could get there and stay there for 8+ years, that would have done them wonders, because going past 7nm is going to be basically impossible without EUV.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

wargames posted:

GF really needed to get into the 7nm game, to be honest. If they could get there and stay there for 8+ years, that would have done them wonders, because going past 7nm is going to be basically impossible without EUV.

They may just roll it out after TSMC/Intel/Samsung get it up and running, via the tried and true process of 'hire people from those places, put them in charge of unfucking your product'. Also known as Industrial Cross-Pollination.

wargames
Mar 16, 2008

official yospos cat censor

Methylethylaldehyde posted:

They may just roll it out after TSMC/Intel/Samsung get it up and running, via the tried and true process of 'hire people from those places, put them in charge of unfucking your product'. Also known as Industrial Cross-Pollination.

I mean, the GloFo 14nm does come from Samsung.

Red Rox
Aug 24, 2004

Motel Midnight off the hook
Hey goons, I'm writing an article on other potential uses for all the leftover processing power that's been built up over the past few years. Like rendering or AI computing, for example.

Are there any ex-miners here who'd be keen to answer some questions? Ideally I'm looking for someone who knows their poo poo and invested in a decent setup.

QuarkJets
Sep 8, 2008

Disco De Soto posted:

Hey goons, I'm writing an article on other potential uses for all the leftover processing power that's been built up over the past few years. Like rendering or AI computing, for example.

Are there any ex-miners here who'd be keen to answer some questions? Ideally I'm looking for someone who knows their poo poo and invested in a decent setup.

I'm a professional HPC person (computational physics and artificial intelligence) who has also paid a lot of attention to cryptocurrency mining for like... years and years. Since at least the days when CPU bitcoin mining was the norm. I remember the response from Satoshi when the first CUDA bitcoin miner was released (he asked that people not use it because that would violate the spirit of the bitcoin experiment). Most miners won't be able to adequately answer your questions, because they have no experience in those fields.

The GPUs used for cryptocurrency mining are reusable for many tasks, because they're GPUs; lots of HPC is done on GPUs these days, and gamers have always maintained a large second-hand market for used video cards. Plenty of academics and hobbyists use GTX 1080 Tis for machine learning, for instance, because they're cheap and pretty powerful

There are some technical reasons why someone might prefer, say, a Quadro or a Tesla card over a gaming card (example: the GTX cards do not have ECC VRAM, meaning they are less reliable if you need your results to be very accurate; GTX cards also have poor performance for double-precision computation, if that's something that you need). And the window for transitioning hardware to the HPC realm is also slowly closing; the RTX 2080 is shipping to consumers this week, and this time next year it'll be very hard to find an HPC person willing to buy the previous generation of hardware (plenty of gamers will still be willing to buy them, though).

There was some AI graduate student researcher who was trying to build a network for people to sell their GPU power to researchers instead of to cryptocurrency miners (think NiceHash, but for neural networks), but IIRC he was doing it in his free time and I don't think it ever got off the ground, and he was never able to solve the honesty problem (i.e., how do you prove that the "computational miner" actually performed the work that you gave them when your outputs are not easily verifiable?)

The huge number of ASIC processors (for BTC, LTC, ETH, etc.) are completely useless for other tasks. Those are going to wind up in a landfill. They are custom-made to mine bitcoins/litecoins/any other cryptocurrency using the same hashing algorithm; they can accomplish no other tasks.

Do you have any other questions?

QuarkJets fucked around with this message at 10:28 on Sep 19, 2018

Red Rox
Aug 24, 2004

Motel Midnight off the hook

QuarkJets posted:

*awesome poo poo*

Do you have any other questions?

Thanks, this was very insightful.

I do have some other questions - I'll PM you so I don't hijack the thread.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

Disco De Soto posted:

Thanks, this was very insightful.

I do have some other questions - I'll PM you so I don't hijack the thread.

I wouldn’t mind reading your questions and what Quark has to say.

I love effortposts.

It’s a problem, sorry.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

tehinternet posted:

I wouldn’t mind reading your questions and what Quark has to say.

Yeah, you might as well.

I mean, honestly, what else is gonna go on in this thread?

"Hey, is GPU mining still dead?"
"Yup"
"Oh, well then should I spend $10k on ASICs?"
"Nope, market's fuckin' hosed, son."
"Oh, :("

Shrimp or Shrimps
Feb 14, 2012


DrDork posted:

"Oh, well then should I spend $10k on ASICs?"

Is there a secondhand market for ASICs? Asking for a friend.

E: I guess I should say third-hand because they're already second-hand when they get to the original buyers lmao

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Shrimp or Shrimps posted:

Is there a secondhand market for ASICs? Asking for a friend.

Someone is always looking for landfill or to melt stuff down, I suppose.

QuarkJets
Sep 8, 2008

Shrimp or Shrimps posted:

Is there a secondhand market for ASICs? Asking for a friend.

E: I guess I should say third-hand because they're already second-hand when they get to the original buyers lmao

Sure, there will always be people who think that cryptocurrency mining is bound to make a resurgence or that cryptocurrency X is about to shoot to the moon (any day now!)

QuarkJets
Sep 8, 2008

Some more effortly details

- All of the people with GPU mining rigs would probably be really happy to have another way to use their hardware to make money, like what that grad student I mentioned was trying to do. All that matters to them is that they be paid for running their hardware. Most of the computational hardware is in the hands of non-experts, so you just need to make the user interface as simple as possible. NiceHash did a great job of this; "click a button and start slowly making money from your idle hardware" is an easy-to-understand paradigm and would bring a lot of people on board. In fact, more generalized GPU computing would probably bring in more people than cryptocurrency mining ever did, because it would lack the natural skeeviness attached to cryptocurrency, and people contribute idle resources for free to all kinds of good causes (SETI@Home, BOINC, etc.)

- Cryptocurrency miners are not actively seeking out these opportunities but would surely embrace them if they advertised themselves.

- Amazon is practically printing money because AWS offers a lot of computational power to anyone who wants to pay for it. AWS is highly generalized (e.g. not just for heavy computation), and there's certainly room for a lower-cost alternative focused on heavy computation

- I can't emphasize the need for an intuitive user interface enough.

- The subset of tasks I mentioned that aren't well-suited to a GTX 1080 Ti are definitely in the minority. Machine learning is the vast majority of computational effort right now; it's a big hot topic in HPC, and it doesn't need any of the bells and whistles offered by the premium cards. Caveat: the more premium cards have tensor cores that are extremely well-optimized for machine learning tasks, and they're way better at the half-precision computations that machine learning algorithms frequently perform. A GTX 1080 Ti will never be superior to a V100 for machine learning, but if compute time with ten GTX 1080 Tis is cheaper than compute time on one V100 then most professionals will go with the 1080 Tis.

- Corporations are risk-averse and are going to want assurances that you're not sending their data to their competitors.

- Solving the honesty problem means being able to easily verify computational outputs. This is easy by design in cryptocurrency: figuring out the correct nonce for the next block is hard, but verifying that the nonce is correct is easy. This is a lot harder for general computational problems. Say that I need to convolve one billion matrices with one billion other matrices; the only way to verify those outputs is to do the difficult computation yourself and check the answer. You can't solve this issue, but you can mitigate it; other providers on the network can perform the verification, and you could wrap that into the cost offered to people looking for computational power (e.g. a user could ask that N% of the outputs be independently verified X times by Y independent providers; N could be 100 for renders, and maybe this degree of redundant computation is still cost-effective because you don't have to purchase and maintain your own hardware)
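
To make the asymmetry concrete, here's a minimal hashcash-style sketch (not real bitcoin, just the shape of the problem): finding a nonce takes a pile of hashes, while checking a claimed one takes a single hash.

code:
import hashlib
import itertools

def verify(header: bytes, nonce: int, difficulty_bits: int) -> bool:
    """Cheap: one hash checks a claimed solution."""
    digest = hashlib.sha256(header + nonce.to_bytes(8, "big")).digest()
    # valid if the top difficulty_bits bits of the hash are all zero
    return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0

def mine(header: bytes, difficulty_bits: int) -> int:
    """Expensive: brute-force nonces until one verifies (~2**bits tries)."""
    for nonce in itertools.count():
        if verify(header, nonce, difficulty_bits):
            return nonce

nonce = mine(b"block header", 16)          # tens of thousands of hashes
assert verify(b"block header", nonce, 16)  # exactly one hash

There's no analogous one-hash check for a billion matrix convolutions, which is the whole problem.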

QuarkJets fucked around with this message at 22:21 on Sep 19, 2018

Stickman
Feb 1, 2004

QuarkJets posted:

- Solving the honesty problem means being able to easily verify computational outputs. This is easy by design in cryptocurrency: figuring out the correct nonce for the next block is hard, but verifying that the nonce is correct is easy. This is a lot harder for general computational problems. Say that I need to convolve one billion matrices with one billion other matrices; the only way to verify those outputs is to do the difficult computation yourself and check the answer. You can't solve this issue, but you can mitigate it; other providers on the network can perform the verification, and you could wrap that into the cost offered to people looking for computational power (e.g. a user could ask that N% of the outputs be independently verified X times by Y independent providers; N could be 100 for renders, and maybe this degree of redundant computation is still cost-effective because you don't have to purchase and maintain your own hardware)

It seems like you could also use redundant computes to weed out bad actors - X% failures and your reimbursement rate takes a hit, Y% and you lose your account (perhaps even banning the GPU device ID, though you'd want some sort of amnesty for second-hand purchases).
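
Something like this, maybe (toy sketch; the thresholds are invented):

code:
from typing import Optional

# toy reputation rule for a compute marketplace -- thresholds are invented
PENALTY_RATE = 0.02  # failure fraction where payouts start shrinking
BAN_RATE = 0.10      # failure fraction where the account gets dropped

def payout_multiplier(checked: int, failed: int) -> Optional[float]:
    """None means banned; otherwise a scale factor on reimbursement."""
    if checked == 0:
        return 1.0
    rate = failed / checked
    if rate >= BAN_RATE:
        return None
    if rate >= PENALTY_RATE:
        # slide linearly from full pay toward zero as rate nears the ban line
        return 1.0 - (rate - PENALTY_RATE) / (BAN_RATE - PENALTY_RATE)
    return 1.0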

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

QuarkJets posted:

*the effortpost above*

What QuarkJets is trying to say here is that you need a Blockchain to solve these problems

QuarkJets
Sep 8, 2008

Stickman posted:

It seems like you could also use redundant computes to weed out bad actors - X% failures and your reimbursement rate takes a hit, Y% and you lose your account (perhaps even banning the GPU device ID, though you'd want some sort of amnesty for second-hand purchases).

Yeah, and you need a way of ensuring that your redundancy is independent.

And there's also the matter of padding your hours. Say that provider X takes 4 hours to process some data and provider Y takes 4.5 hours. Did provider Y pad their hours for greater payout? Hard to say. Or if you pay based on availability instead of wall time, how do you detect fake "outages"?

These are tough problems but I doubt they're insurmountable.
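
One cheap mitigation for hour-padding (just a sketch, the tolerance is invented): hand the same benchmark chunk to a few providers and flag anyone whose reported runtime sits way above the peer median.

code:
from statistics import median

def flag_padding(reported_hours: float, peer_hours: list, tolerance: float = 1.5) -> bool:
    """True if a provider's reported runtime is suspiciously above the peer median."""
    return reported_hours > tolerance * median(peer_hours)

print(flag_padding(4.5, [3.9, 4.0, 4.1]))  # False: within tolerance, maybe slower hardware
print(flag_padding(9.0, [3.9, 4.0, 4.1]))  # True: probably padded

It doesn't prove anything by itself (hardware legitimately varies), but it tells you where to point the redundant verification.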

Stickman
Feb 1, 2004

QuarkJets posted:

Yeah, and you need a way of ensuring that your redundancy is independent.

And there's also the matter of padding your hours. Say that provider X takes 4 hours to process some data and provider Y takes 4.5 hours. Did provider Y pad their hours for greater payout? Hard to say. Or if you pay based on availability instead of wall time, how do you detect fake "outages"?

These are tough problems but I doubt they're insurmountable.

I assume you'd probably pay by the computation rather than by the hour in order to promote efficient computation (like *coin mining, but with a more useful outcome). If there are deadlines, you could add a bit extra for priority processing (i.e., the data cruncher would see the deadline for their computation component and receive a bonus if they dedicate enough resources to meet the deadline), or, if the problem supports it, simply distribute small enough chunks that you can cut off processing at a certain time. I see your point, though - some problems might not be quite so granular, so there might need to be some additional incentives to meet availability targets.
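
As a sketch of the pricing side (pure hypothetical -- the rate and bonus are numbers I made up):

code:
# toy pay-per-computation pricing with a deadline bonus -- all numbers invented
BASE_RATE = 0.000002  # USD per (hypothetical) work unit

def chunk_payment(work_units: float, deadline_hours: float,
                  finished_hours: float, priority_bonus: float = 0.25) -> float:
    """Pay for completed work; add the bonus only if the deadline was met."""
    payment = work_units * BASE_RATE
    if finished_hours <= deadline_hours:
        payment *= 1.0 + priority_bonus
    return payment

# a provider finishes a 10M-unit chunk in 3h against a 4h deadline
print(chunk_payment(10_000_000, deadline_hours=4, finished_hours=3))  # 25.0

Paying per unit of work means a slow rig just earns slower instead of billing more, which sidesteps the hour-padding problem entirely.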

Red Rox
Aug 24, 2004

Motel Midnight off the hook
I was reading about bitcoin energy consumption and how proof-of-stake algorithms use way less energy than the proof-of-work algorithm that bitcoin currently uses. Do you think it's likely that bitcoin could switch?

I was visiting the Auckland Bioengineering Institute today and talking to a PhD student who's working on modeling the pulmonary valve that connects the heart and lungs. They have an HPC setup he sometimes uses to process his work. Seems a much better use for all that processing power than mining bitcoin.

Red Rox fucked around with this message at 03:53 on Sep 20, 2018

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Disco De Soto posted:

I was reading about bitcoin energy consumption and how proof-of-stake algorithms use way less energy than the proof-of-work algorithm that bitcoin currently uses. Do you think it's likely that bitcoin could switch?

Proof of stake is just getting rid of any pretense that this isn't a straight Ponzi scheme.

Comfy Fleece Sweater
Apr 2, 2013

You see, but you do not observe.

Proof of steak heh heh heh 🥩

QuarkJets
Sep 8, 2008

Stickman posted:

I assume you'd probably pay by the computation rather than by the hour in order to promote efficient computation (like *coin mining, but with a more useful outcome). If there are deadlines, you could add a bit extra for priority processing (i.e., the data cruncher would see the deadline for their computation component and receive a bonus if they dedicate enough resources to meet the deadline), or, if the problem supports it, simply distribute small enough chunks that you can cut off processing at a certain time. I see your point, though - some problems might not be quite so granular, so there might need to be some additional incentives to meet availability targets.

Normally supercomputers charge by computational time, which captures what you described. But it's very easy to spoof that to be whatever you want, if you exclusively own the hardware.


QuarkJets
Sep 8, 2008

Disco De Soto posted:

I was reading about bitcoin energy consumption and how proof-of-stake algorithms use way less energy than the proof-of-work algorithm that bitcoin currently uses. Do you think it's likely that bitcoin could switch?

I was visiting the Auckland Bioengineering Institute today and talking to a PhD student who's working on modeling the pulmonary valve that connects the heart and lungs. They have an HPC setup he sometimes uses to process his work. Seems a much better use for all that processing power than mining bitcoin.

no, bitcoin will never ever switch to proof of stake. Other cryptocurrencies may, some day, if someone can ever figure out a design that doesn't just turn into proof of work with extra steps

  • Locked thread