Earwicker
Jan 6, 2003

KakerMix posted:

We usually don't go killing, say, ants for any reason other than that they're in the way.

plenty of little kids kill bugs for fun


Earwicker
Jan 6, 2003

Shadow0 posted:

It will never cease to surprise me that people are worried about robots taking over the world. What possible motivation would robots have to kill all humans (and to concoct this plan in secret and somehow pull it all off)?

I don't think some sort of vast "kill all humans" plan is the only thing people are worried about. if you are in your car and your car suddenly decides it doesn't like you and wants to kill you, that's a problem, even if it's not part of a global AI conspiracy. what motivation? road rage, of course: your robot-assisted car will get tired of your lovely driving and decide that you deserve to die and will eject you through the roof on the interstate while going 90

nigga crab pollock
Mar 26, 2010

by Lowtax

The Biscuit posted:

Cheers guys.
The connection itself storing things is something I could never comprehend, but I can take it as an idea. Like, is there something else under the hood? I thought of a dendrite connection as a 0 or 1; you're saying one or a few store the entire memory/action. If a single synapse has more than a 0 or 1, is there something beneath that?

Then I try to understand how accessing this information would work, leading into thoughts, memories, consciousness.
Can one even speculate on that at this point?

cells are really freakin complicated. even simple bacteria have loads of poo poo going on, imagine the complexity that multicellular life with specialized tissues has. also, we use binary not because it has any inherent benefit but because it's simpler to make circuits that only have two states rather than circuits with three. there have been ternary computers in the past
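(for anyone wondering what a ternary representation even looks like: here's a quick balanced-ternary converter, the digit scheme the old Soviet Setun machine used. just an illustration of "three states instead of two", nothing to do with neurons, and the function name is my own.)

code:
def to_balanced_ternary(n):
    """Convert a non-negative integer to balanced-ternary digits (-1, 0, +1), most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 0:
            digits.append(0)
        elif r == 1:
            digits.append(1)
            n -= 1
        else:            # r == 2: use digit -1 and carry one upward
            digits.append(-1)
            n += 1
        n //= 3
    return digits[::-1]

# 8 -> [1, 0, -1], i.e. 9 - 1
print(to_balanced_ternary(8))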

Bulgogi Hoagie
Jun 1, 2012

We

The Biscuit posted:

Cheers guys.
The connection itself storing things is something I could never comprehend, but I can take it as an idea. Like, is there something else under the hood? I thought of a dendrite connection as a 0 or 1; you're saying one or a few store the entire memory/action. If a single synapse has more than a 0 or 1, is there something beneath that?

Then I try to understand how accessing this information would work, leading into thoughts, memories, consciousness.
Can one even speculate on that at this point?

i don't really get your question, but yes: although most synapses produce binary firing/no-firing effects in the short term, long-term effects vary with repeated stimulation, including long-term potentiation or habituation via molecular mechanisms. however, because synapses have different properties (postsynaptic density composition, membrane channels, etc.) their activation can produce excitation or inhibition and thus result in potentiation, depression, or even intermediate states.

as for consciousness and that stuff, you have to understand that the CNS is a complex dynamical system, and even very small and simply connected examples of such systems can produce complicated dynamics with bifurcations, attractors and such. this is called emergence, wherein some unexpected property arises from the interaction of many simpler components. so, conceptually speaking, the behaviour of a thalamus is an emergent property of the behaviour of its neurons, and the behaviour of neurons is an emergent property of the biochemical signalling networks inside a given neuron, and so on. so while there is clearly some fuzzy logical separation to all this, based on the basic concepts we have discovered, we have a very difficult time understanding how it all ties together; it's mind-bogglingly unintuitive.

we are currently just beginning to discover how groups of neurons interact, moving on to studying the properties of the next abstraction layer up from individual neurons and synapses. it's likely that the higher-order phenomena that comprise our actual living experience are a further emergent property several layers upwards, so it will be a while before we get there
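(if you want to see the "tiny system, complicated dynamics" point in action, the logistic map is about the smallest example there is: one equation, and depending on a single parameter it settles down, oscillates, or goes chaotic. purely a toy sketch, nothing neural about it, and the parameter values are picked just to show the three regimes.)

code:
# logistic map: x_{n+1} = r * x_n * (1 - x_n)
def logistic_trajectory(r, x0=0.2, n_steps=60):
    xs = [x0]
    for _ in range(n_steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

for r in (2.8, 3.2, 3.9):   # settles to a point, oscillates between two values, goes chaotic
    tail = logistic_trajectory(r)[-4:]
    print(f"r={r}: last few values {[round(x, 3) for x in tail]}")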

Bulgogi Hoagie
Jun 1, 2012

We

nigga crab pollock posted:

cells are really freakin complicated. even simple bacteria have loads of poo poo going on, imagine the complexity that multicellular life with specialized tissues has. also, we use binary not because it has any inherent benefit but because it's simpler to make circuits that only have two states rather than circuits with three. there have been ternary computers in the past

binary modelling of signalling pathways inside of cells (say, taking a protein to have only two possible states, on or off) is actually super useful in systems biology and yields much more information about how pathways behave than you would think
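(a toy sketch of what that on/off modelling looks like in practice: three made-up proteins, each either on or off, with invented regulation rules, iterated step by step. not any real pathway, just the flavour of the approach.)

code:
# toy Boolean network: three hypothetical proteins, each simply on (True) or off (False)
def step(state):
    a, b, c = state["A"], state["B"], state["C"]
    return {
        "A": not c,       # C represses A
        "B": a,           # A activates B
        "C": a and b,     # C needs both A and B
    }

state = {"A": True, "B": False, "C": False}
for t in range(6):
    print(t, state)
    state = step(state)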

Shadow0
Jun 16, 2008


If to live in this style is to be eccentric, it must be confessed that there is something good in eccentricity.

Grimey Drawer

Earwicker posted:

I don't think some sort of vast "kill all humans" plan is the only thing people are worried about. if you are in your car and your car suddenly decides it doesn't like you and wants to kill you, that's a problem, even if it's not part of a global AI conspiracy. what motivation? road rage, of course: your robot-assisted car will get tired of your lovely driving and decide that you deserve to die and will eject you through the roof on the interstate while going 90

Yeah, but that's the thing: robots never tire. They never need anything. They never desire.
And no one is going to accidentally make robots that do, in the same way no one accidentally makes Super Mario 64 while trying to make the latest version of Excel.

The Biscuit posted:

Cheers guys.
The connection itself storing things is something I could never comprehend, but I can take it as an idea. Like, is there something else under the hood? I thought of a dendrite connection as a 0 or 1; you're saying one or a few store the entire memory/action. If a single synapse has more than a 0 or 1, is there something beneath that?

Then I try to understand how accessing this information would work, leading into thoughts, memories, consciousness.
Can one even speculate on that at this point?

This is one of the ways that a TNN is somewhat similar to a biological neuron. I'm a bit tempted to draw out a full example...
I can give you a simple biological example though:
Let's say you have a group of neurons that somehow tell you you're angry or something and another group that detects the color red. Those neurons might have a weak connection between them at first, but if they are both activated at the same time, the link will grow stronger. In this way, your brain learns to associate the two things, and they will have a stronger tendency to fire off each other when one or the other is activated.
If that makes sense. It's all about tethering like things together so they fire together, and vice-versa.

It's a vastly complex system made possible only because you have these billions of neurons with their 100 trillion connections.
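(that strengthening rule is easy to cartoon in code. this is just a sketch of the "fire together, wire together" idea with made-up activity statistics, not a model of actual anger/red-detecting neurons.)

code:
import random

# toy Hebbian update: the weight between two units grows whenever both are active together
random.seed(0)
weight = 0.1                # weak initial connection
learning_rate = 0.05

for trial in range(100):
    red_active = random.random() < 0.5                    # the "sees red" group fires
    angry_active = red_active or random.random() < 0.1    # anger usually co-occurs with red here
    if red_active and angry_active:
        weight += learning_rate * (1.0 - weight)          # strengthen, saturating toward 1

print(f"connection strength after 100 trials: {weight:.2f}")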

Edit: Also what Bulgogi Hoagie said.

Shadow0 fucked around with this message at 16:16 on May 20, 2017

Earwicker
Jan 6, 2003

Shadow0 posted:

Yeah, but that's the thing: robots never tire.

you are thinking in ideal terms. of course there's no inherent need for a robot to "tire", beyond routine wear and tear. however, many of them will definitely be designed to artificially "tire" in one sense or another in order to encourage you to spend money upgrading or replacing your robot

quote:

And no one is going to accidentally make robots that do, in the same way no one accidentally makes Super Mario 64 while trying to make the latest version of Excel.

no, it certainly won't be an accident. in the coming years people will design robots that do all sorts of hosed up things and it's not going to be an accident at all

Shadow0
Jun 16, 2008


If to live in this style is to be eccentric, it must be confessed that there is something good in eccentricity.

Grimey Drawer

Earwicker posted:

you are thinking in ideal terms. of course there's no inherent need for a robot to "tire", beyond routine wear and tear. however, many of them will definitely be designed to artificially "tire" in one sense or another in order to encourage you to spend money upgrading or replacing your robot

The hardware might wear out, but the software need not ever.
Also, what company is going to risk a lawsuit because their software injures users if they don't upgrade enough?

Shadow0 fucked around with this message at 16:22 on May 20, 2017

Earwicker
Jan 6, 2003

Shadow0 posted:

Also, what company is going to risk a lawsuit because their software injures users if they don't upgrade enough?

there are a lot of companies out there that have proven, over and over again, that they can make products that injure or kill users, pay out some settlements for the lawsuits that arise from those injuries and deaths, and then keep on trucking. or they can just operate in countries where that's less of a concern. I was joking about the car ejector seat scenario, obviously cars do not have ejector seats, but I do not think it's remotely out of the realm of possibility that all kinds of AIs will start to artificially fall apart, personality-wise and in other respects, after a certain point in order to encourage upgrades or replacements, with unpredictable results. Especially if/when advanced AI, or elements of it, becomes part of cheap mass-produced goods

Bulgogi Hoagie
Jun 1, 2012

We

Earwicker posted:

there are a lot of companies out there that have proven, over and over again, that they can make products that injure or kill users, pay out some settlements for the lawsuits that arise from those injuries and deaths, and then keep on trucking. or they can just operate in countries where that's less of a concern. I was joking about the car ejector seat scenario, obviously cars do not have ejector seats, but I do not think it's remotely out of the realm of possibility that all kinds of AIs will start to artificially fall apart, personality-wise and in other respects, after a certain point in order to encourage upgrades or replacements, with unpredictable results. Especially if/when advanced AI, or elements of it, becomes part of cheap mass-produced goods

brother, software development is already hard enough without anyone having the bright idea to introduce this sort of bullshit into it

Toadvine
Mar 16, 2009
Please disregard my advice w/r/t history.
Robots will imitate their makers; that's why they're terrifying

Earwicker
Jan 6, 2003

Bulgogi Hoagie posted:

brother, software development is already hard enough without anyone having the bright idea to introduce this sort of bullshit into it

well sure, but developers have said exactly this about lots of other ideas that came out of marketing or from some new age guru that the "vp of innovation" plays racquetball with on tuesdays, and they had to do it anyway

Rasta_Al
Jul 14, 2001

she had tiny Italian boobs.
Well that's my story.
Fun Shoe

Mental Hospitality posted:

Breed Bot 3: The Journey Kitchen

I haven't even seen Breed Bot 1 or 2 yet. :(

The computer is already looking for affection. How adorable.

Please... breed me *beep boop*

Vincent Van Goatse
Nov 8, 2006

Enjoy every sandwich.

Smellrose
you're no Eripsa, you loving worthless stupid OP

spud
Aug 27, 2003

by LITERALLY AN ADMIN
Can somebody write me a neural network? I want to know why my wife left me.

gary oldmans diary
Sep 26, 2005

rezatahs posted:

requesting name change to "snowbonk"
what a turdly request

Rutibex
Sep 9, 2001

by Fluffdaddy

Vincent Van Goatse posted:

you're no Eripsa, you loving worthless stupid OP

im sorry i will try to give my brain more data sets to work with, maybe i can train it to be smarter

Shifty gimbal
Dec 28, 2008

Hey you... I got something to tell ya
Biscuit Hider

Earwicker posted:

in the coming years people will design robots that do all sorts of hosed up things and it's not going to be an accident at all

I for one can't wait :mrgw: :roboluv:

Poppyseed Poundcake
Feb 23, 2007
no fate

Emmideer
Oct 20, 2011

Lovely night, no?
Grimey Drawer

The Biscuit posted:

Then I try to understand how accessing this information would work, leading into thoughts, memories, consciousness.
Can one even speculate on that at this point?

Just to clear something up here: neurons, as far as the human brain goes, are not computers nor processors of information. The computation metaphor is just that, a metaphor. We don't even know something as simple as how a collection of neurons operates; there's a long-standing misconception of "brain regions" and other trash like that. Even connectomes offer little in the way of telling us what is actually happening.

Tallgeese
May 11, 2008

MAKE LOVE, NOT WAR


Shadow0 posted:

The hardware might wear out, but the software need not ever.
Also, what company is going to risk a lawsuit because their software injures users if they don't upgrade enough?

I seem to remember that there was a car company whose actuaries calculated that fixing a flaw, at a nominal cost per vehicle, would cost more than paying out settlements for the deaths the flaw caused.

They did not fix the flaw.

What makes you think software companies wouldn't do the same?

Antivehicular
Dec 30, 2011


I wanna sing one for the cars
That are right now headed silent down the highway
And it's dark and there is nobody driving And something has got to give

Rutibex posted:

im sorry i will try to give my brain more data sets to work with, maybe i can train it to be smarter

Just don't be like whoever fed the movie NN a bunch of Wikipedia page names so now it thinks disambiguation tags are how we name films

Still looking forward to Barney's The Devil's Treachery, though. It'll be good to see the Barns reach the One Hour Photo phase of his career.

Bulgogi Hoagie
Jun 1, 2012

We

jon joe posted:

Just to clear something up here: neurons, as far as the human brain goes, are not computers nor processors of information. The computation metaphor is just that, a metaphor. We don't even know something as simple as how a collection of neurons operates; there's a long-standing misconception of "brain regions" and other trash like that. Even connectomes offer little in the way of telling us what is actually happening.

uh

zedprime
Jun 9, 2007

yospos

Tallgeese posted:

I seem to remember that there was a car company whose actuaries calculated that fixing a flaw, at a nominal cost per vehicle, would cost more than paying out settlements for the deaths the flaw caused.

They did not fix the flaw.

What makes you think software companies wouldn't do the same?

Because those accountants were wrong. The value of a human life is more widely known now, so if a robot kills you, you know that the company and society in general are really getting a better deal.

Emmideer
Oct 20, 2011

Lovely night, no?
Grimey Drawer

Bulgogi_Hoagie.exe has stopped working

Shadow0
Jun 16, 2008


If to live in this style is to be eccentric, it must be confessed that there is something good in eccentricity.

Grimey Drawer

Bulgogi Hoagie posted:

brother, software development is already hard enough without anyone having the bright idea to introduce this sort of bullshit into it

Earwicker posted:

well sure, but developers have said exactly this about lots of other ideas that came out of marketing or from some new age guru that the "vp of innovation" plays racquetball with on tuesdays, and they had to do it anyway

:): "Hey, boss I just finished writing that toaster software you wanted. Settings for light, dark, even a mode for bagels. Should cover all the needs of our our customers."
:synthy:: "That's great, Jim, now can you spend the next 10 years developing some software to make the toaster feel emotions? I'm willing to invest millions into this feature that people are definitely going to want and need."

Tallgeese posted:

I seem to remember that there was a car company whose actuaries calculated that fixing a flaw, at a nominal cost per vehicle, would cost more than paying out settlements for the deaths the flaw caused.

They did not fix the flaw.

What makes you think software companies wouldn't do the same?

I'm not saying bugs in programs aren't going to kill people. But that's going to happen anyway, AI or not. After all, a bug on a nuclear-missile-carrying USSR submarine once almost caused nuclear war, and this was long before fancy modern "AI".
And it's going to happen outside of software as well. Like exploding phone batteries or car axles that aren't strong enough or whatever.
But what I'm saying is that the idea that robots are going to suddenly rise up and overthrow humanity is ridiculous.

zedprime posted:

Because those accountants were wrong. The value of a human life is more widely known now, so if a robot kills you, you know that the company and society in general are really getting a better deal.

In a surprise twist, it turns out human life has negative value and they end up paying the company for every dead human.

Shadow0 fucked around with this message at 17:55 on May 20, 2017

Tallgeese
May 11, 2008

MAKE LOVE, NOT WAR


Shadow0 posted:

But what I'm saying is that the idea that robots are going to suddenly rise up and overthrow humanity is ridiculous.

Oh, well absolutely. I was just responding to the notion that software companies would definitely fix flaws they know are there. It's all cost-benefit analysis.

Shadow0 posted:

It turns out human life has negative value and they end up paying the company.

You joke, but I seem to recall an incident where a toy company (I want to say Mattel) apologized to China for making the country lose face.

This was because the Chinese factory they used introduced significant amounts of neurotoxins into the toys, and this was somehow Mattel's fault.

ClamdestineBoyster
Aug 15, 2015
Probation
Can't post for 10 years!
They've all got like 4 wires hooked up to them and create a certain amount of resistance/feedback in a complementary pair, producing an analog electronic signal that is regulated by the conductivity/resistance of other adjoining neurons and by chemical feedback systems in the brain, which also regulate the voltage. So they don't work like a computer at all: there's no CPU and ultimately no binary decisions, just relative ones with an amount of intensity.

Shadow0
Jun 16, 2008


If to live in this style is to be eccentric, it must be confessed that there is something good in eccentricity.

Grimey Drawer

jon joe posted:

Just to clear something up here: neurons, as far as the human brain goes, are not computers nor processors of information. The computation metaphor is just that, a metaphor. We don't even know something as simple as how a collection of neurons operates; there's a long-standing misconception of "brain regions" and other trash like that. Even connectomes offer little in the way of telling us what is actually happening.

Well, I only meant it as a metaphor. I was alluding to the idea that a biological neuron is at least at the complexity of a processor, while a TNN neuron is more at the level of a single logic gate. Though the whole thing, and every part in it, is super complicated. Super fascinating as well.
Von Neumann machines and brains operate very differently. This is why computers have an easy time doing math while brains have an easy time recognizing whether a surface is dusty, or recognizing a novel instance of a mug as a mug even when it's partially occluded.
Brains do process information, obviously. That's how you're able to read this? I'm not sure what you meant.
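(for reference, this is roughly all a single artificial "TNN" neuron does: a weighted sum of its inputs pushed through a squashing function. the weights and inputs here are arbitrary numbers for illustration only.)

code:
import math

def artificial_neuron(inputs, weights, bias):
    """Weighted sum of inputs plus a bias, squashed by a sigmoid into (0, 1)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# arbitrary example values
print(artificial_neuron([0.5, 0.9, 0.1], weights=[1.2, -0.7, 2.0], bias=0.1))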

ClamdestineBoyster posted:

They've all got like 4 wires hooked up to them and create a certain amount of resistance/feedback in a complementary pair, producing an analog electronic signal that is regulated by the conductivity/resistance of other adjoining neurons and by chemical feedback systems in the brain, which also regulate the voltage. So they don't work like a computer at all: there's no CPU and ultimately no binary decisions, just relative ones with an amount of intensity.

If you want to avoid binary decisions, this is where the idea of Fuzzy Logic comes in. You can be some percentage of a state rather than just binary in or out of some state or whatever.
A CPU is a central processing unit, while the brain is a distributed system. Like I was saying, it was just more of a metaphor, and really a TNN and a biological neural network are very different.
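(and fuzzy membership is easy to show too: a throwaway "how hot is this water" function that returns a degree between 0 and 1 instead of a yes/no. the temperature thresholds are invented.)

code:
def hot_membership(temp_c):
    """Degree to which a temperature counts as 'hot': 0 below 30 C, 1 above 60 C, linear in between."""
    if temp_c <= 30:
        return 0.0
    if temp_c >= 60:
        return 1.0
    return (temp_c - 30) / 30.0

for t in (20, 40, 55, 70):
    print(f"{t} C -> hot to degree {hot_membership(t):.2f}")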

Shadow0 fucked around with this message at 18:09 on May 20, 2017

Emmideer
Oct 20, 2011

Lovely night, no?
Grimey Drawer

Shadow0 posted:

That's how you're able to read this? I'm not sure what you meant.

Ah, that's alright. I think I misunderstood you. It's just a common thing I encounter.

edit: wait, you're not the guy I responded to. Now I'm confused.

Emmideer fucked around with this message at 18:15 on May 20, 2017

Rutibex
Sep 9, 2001

by Fluffdaddy

ClamdestineBoyster posted:

They've all got like 4 wires hooked up to them and create a certain amount of resistance/feedback in a complementary pair, producing an analog electronic signal that is regulated by the conductivity/resistance of other adjoining neurons and by chemical feedback systems in the brain, which also regulate the voltage. So they don't work like a computer at all: there's no CPU and ultimately no binary decisions, just relative ones with an amount of intensity.

but a biological neuron has all kinds of receptors for neurotransmitter chemicals. neuron A with a dopamine molecule is going to fire differently than neuron A with a cannabinoid molecule or adrenaline or whatever. that's very much not binary

are there substantial differences between human neurons and, say, lobster neurons? is our special brain just from having more of them, or are the neurons special too?

Shadow0
Jun 16, 2008


If to live in this style is to be eccentric, it must be confessed that there is something good in eccentricity.

Grimey Drawer

jon joe posted:

Ah, that's alright. I think I misunderstood you. It's just a common thing I encounter.

edit: wait, you're not the guy I responded to. Now I'm confused.

You can't prove we're not the same person (or maybe we passed the Turing Test and we're both actually bots). :awesomelon:

Yeah, psychology and biology are really cool and I wish people understood them more. I think people have some really weird and strong misconceptions about brains.

extra stout
Feb 24, 2005

ISILDUR's ERR
my favorite color is clardic fug

Earwicker
Jan 6, 2003

Shadow0 posted:

:synthy:: "That's great, Jim, now can you spend the next 10 years developing some software to make the toaster feel emotions? I'm willing to invest millions into this feature that people are definitely going to want and need."

it doesn't matter whether the poo poo really "feels emotions" or not, psychologically or philosophically speaking. we are going to see an increasing presence of AI (and AI-like systems) that represent themselves with increasingly complex "personalities" of varying degrees, because this is what consumers have been led to expect of "the future" by mass media, and those personalities are going to be in many ways reflective of their makers and will also be affected by tendencies to cut corners, plan obsolescence, make cheap knock-offs, etc. I don't think that kind of poo poo is likely to be made for toasters, but for cars? I wouldn't be surprised to see that at all.

Earwicker fucked around with this message at 18:30 on May 20, 2017

Rutibex
Sep 9, 2001

by Fluffdaddy

Earwicker posted:

it doesn't matter whether the poo poo really "feels emotions" or not, psychologically or philosophically speaking. we are going to see an increasing presence of AI (and AI-like systems) that represent themselves with increasingly complex "personalities" of varying degrees, because this is what consumers have been led to expect of "the future" by mass media, and those personalities are going to be in many ways reflective of their makers and will also be affected by tendencies to cut corners, plan obsolescence, make cheap knock-offs, etc. I don't think that kind of poo poo is likely to be made for toasters, but for cars? I wouldn't be surprised to see that at all.

https://www.youtube.com/watch?v=nkcKaNqfykg

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

Earwicker posted:

it doesn't matter whether the poo poo really "feels emotions" or not, psychologically or philosophically speaking. we are going to see an increasing presence of AI (and AI-like systems) that represent themselves with increasingly complex "personalities" of varying degrees, because this is what consumers have been led to expect of "the future" by mass media, and those personalities are going to be in many ways reflective of their makers and will also be affected by tendencies to cut corners, plan obsolescence, make cheap knock-offs, etc. I don't think that kind of poo poo is likely to be made for toasters, but for cars? I wouldn't be surprised to see that at all.

Most humans can't pass the Turing test, so I'm down with not-real AI that's convinced me it's real.

Lurks Morington
Aug 7, 2016

by Smythe
American Hero: Fire of Crusty

ClamdestineBoyster
Aug 15, 2015
Probation
Can't post for 10 years!

Shadow0 posted:

Well, I only meant it as a metaphor. I was alluding to the idea that a biological neuron is at least at the complexity of a processor, while a TNN neuron is more at the level of a single logic gate. Though the whole thing, and every part in it, is super complicated. Super fascinating as well.
Von Neumann machines and brains operate very differently. This is why computers have an easy time doing math while brains have an easy time recognizing whether a surface is dusty, or recognizing a novel instance of a mug as a mug even when it's partially occluded.
Brains do process information, obviously. That's how you're able to read this? I'm not sure what you meant.


If you want to avoid binary decisions, this is where the idea of Fuzzy Logic comes in. You can be some percentage of a state rather than just binary in or out of some state or whatever.
A CPU is a central processing unit, while the brain is a distributed system. Like I was saying, it was just more of a metaphor, and really a TNN and a biological neural network are very different.

Well, the problem you have when trying to simulate a neuron is that simulating just one neuron requires an infinitely variable set of analog states. Ultimately you are dealing with a quantity of electrons to some resolution, but that requires an entire kernel process to calculate in real time. And you can't run another complementary or relative kernel process on the same machine. So now if you're trying to simulate this poo poo with a big server rack or something you get literally one neuron per machine. So if you have a warehouse full of machines you don't even have a functioning retard. Even if you miniaturize your computers to the size of a neuron you have the problem of heat and electrical conductivity between adjacent machines. You would just run out of physical space if you tried to make a machine brain no matter how fine the electronics are.
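(for what it's worth, people doing simulations usually don't go down to electron counts; a common simplification is a point model like leaky integrate-and-fire, which is a couple of arithmetic operations per time step. a rough sketch with arbitrary constants, not a claim about biological accuracy.)

code:
# leaky integrate-and-fire neuron: the membrane voltage leaks toward rest, integrates input,
# and emits a spike (then resets) when it crosses threshold. all constants are arbitrary.
v_rest, v_threshold, v_reset = -65.0, -50.0, -65.0   # millivolts
tau, dt = 20.0, 1.0                                  # membrane time constant and step, in ms

v = v_rest
spike_times = []
for t in range(200):
    input_current = 20.0 if 50 <= t < 150 else 0.0    # a square pulse of input
    v += dt * (-(v - v_rest) + input_current) / tau   # leaky integration
    if v >= v_threshold:
        spike_times.append(t)
        v = v_reset

print(f"spike times (ms): {spike_times}")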

Toadvine
Mar 16, 2009
Please disregard my advice w/r/t history.

Oh gently caress yea hnnnng


Bulgogi Hoagie
Jun 1, 2012

We

ClamdestineBoyster posted:

Well, the problem you have when trying to simulate a neuron is that simulating just one neuron requires an infinitely variable set of analog states. Ultimately you are dealing with a quantity of electrons to some resolution, but that requires an entire kernel process to calculate in real time. And you can't run another complementary or relative kernel process on the same machine. So now if you're trying to simulate this poo poo with a big server rack or something you get literally one neuron per machine. So if you have a warehouse full of machines you don't even have a functioning retard. Even if you miniaturize your computers to the size of a neuron you have the problem of heat and electrical conductivity between adjacent machines. You would just run out of physical space if you tried to make a machine brain no matter how fine the electronics are.

*deepak chopra noises begin coming out of thread*
