|
Cojawfee posted: "Sure grandpa, and we're back on the moon too. Just eat your oatmeal. Stop buying lattes every day and maybe you could afford to buy your own house, you snot nosed brat."
|
# ? Oct 3, 2019 17:30 |
|
|
Cojawfee posted: "grandpa"
Semi-related, but it's been bugging me. What's the story behind the thread's current subject line?
|
# ? Oct 3, 2019 17:34 |
|
Somehow a confluence of SR-71 and Grandpa chat. (The myth that the SR-71 leaks fuel until it gets up to speed, heats up, and expands)
|
# ? Oct 3, 2019 17:35 |
|
Queen Combat posted: "Somehow a confluence of SR-71 and Grandpa chat."
This and the screw-in butt plugs that are used on dead people.
|
# ? Oct 3, 2019 17:37 |
|
buttcrackmenace posted:
We were discussing processing dead bodies in a funeral home and then cremating them. Processing a body requires a screw-in butt plug to keep fluids from leaking out the anus. Another goon wondered if there could be a market for making screw-in titanium butt plugs that would survive the cremation process for later resale. I then conflated that with the SR-71 question (since it leaks fuel before heating up and sealing around Mach 2). Yeah, strange.
|
# ? Oct 3, 2019 17:40 |
|
ili posted: "Fucken ay mate. It's not just a tesla thing either, there's a score of bastards who can't drive for poo poo but all the car smarts paper over their incompetence until poo poo is really fuckered."
Airplanes work the same way.
|
# ? Oct 3, 2019 17:44 |
|
Seems to start right about here: https://forums.somethingawful.com/showthread.php?threadid=3222431&userid=0&perpage=40&pagenumber=962#post491714246
Queen Combat posted: "Stationary, Grandpa actually leaks a bit. That's by design. You see, when he gets above mach 2, the titanium plug expands by centimeter per meter of length, and he becomes water tight."
Aha! There's the actual quote that kicked it off.
|
# ? Oct 3, 2019 17:51 |
|
Cojawfee posted: "This and the screw in butt plugs that are used on dead people."
this thread continues to not disappoint
|
# ? Oct 3, 2019 17:58 |
|
xzzy posted: "How is that relevant in any way?"
Because it’s decades away, which was my point? You won’t see a commercially available fully self-driving car this side of 2040.
|
# ? Oct 3, 2019 17:59 |
|
big crush on Chad OMG posted: "Because it’s decades away which was my point? You won’t see a commercially available fully self driving car this side of 2040."
And I'm disagreeing with your point and think you're completely wrong!
|
# ? Oct 3, 2019 18:02 |
|
buttcrackmenace posted: "this thread continues to not disappoint"
I see what you did there
|
# ? Oct 3, 2019 18:03 |
|
xzzy posted: "And I'm disagreeing with your point and think you're completely wrong!"
Is this based on any sort of factual evidence? Because nobody believes Level 5 stuff is coming anytime soon.
|
# ? Oct 3, 2019 18:08 |
|
Who needs facts for anything, this is a forum, not Wikipedia. (Current technology limits mean it's not going to happen "soon", but the amount of money being dumped into the research means it is not "decades" away.)
|
# ? Oct 3, 2019 18:11 |
|
People dump money and their entire lives into lots of things and never make progress. So yeah, self driving cars will happen eventually, but I don't trust its creation to some douchy tech bro who got his money from his parents' slave mines.
|
# ? Oct 3, 2019 18:17 |
|
Weird, because all the industry experts say otherwise. Toxx me: you won’t see a full Level 5 car on sale to Joe Public before 2040. These forums will be dead and buried, but I’ll find you so you can confirm you were wrong.
|
# ? Oct 3, 2019 18:18 |
|
I'd definitely bet money against fully self driving cars before the end of the 2020 calendar year. But I wouldn't be terribly surprised if they happened by 2030, and I'd probably wager they're almost a certainty by 2040.

edit: Point being, it's not going to be "a year", or "a couple of years", but "a decade" isn't impossible, and "decades" (especially given 2100 is also "decades") is virtually certain. Think about how much has changed since 1998. There are things that few or no people imagined were right around the corner (in 1998) which are utterly commonplace today, and it's not crazy to think that the same path is happening with cars. Yes, there's a shitload that needs to be done to figure this out, but as xzzy mentioned, there's a ton of effort and money being expended to that end. The path is not even littered with utter impossibilities like many other things we've done as a society have been. Sure, we need to figure out how the car decides whether to run over the one child, the two middle aged people, or the 10 grandparents. We need to improve image recognition and program for really absurdly stupid road configurations (or just change them). It's horrifying and complex, and potentially really expensive, but it's not impossible.

Edit to add: Elon Musk is a dangerous fool, and probably not the one who's going to be the sole driver of this.

Krakkles fucked around with this message at 18:28 on Oct 3, 2019
# ? Oct 3, 2019 18:24 |
|
DARPA has been running autonomous vehicle competitions since 2004, where teams full of the smartest people compete to build the best one, but yeah, Elon is totally going to have this thing wrapped up in a few months. *tesla veers into light pole*
|
# ? Oct 3, 2019 18:34 |
|
Some hot takes in here. Objectively, engineers need real data. You can send a learning algorithm through recorded roads as much as you want, but it's always going to be Tesla engineers driving in a biased way to achieve a specific result. An early release of full automation gives you access to a tremendous amount of real data. Real drivers, real roads, real traffic conditions. The ethical implications of this are a whole other discussion.

Imagine one of those machine learning programs meant to play a game like flappy bird or whatever the current version is. The first generation never gets very far. But as you go along and breed the most successful examples of the software, you get better and better at doing a job as procedural as driving. Driving is a perfect task for a computer to do.
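The breed-the-winners loop described above is basically a genetic algorithm. A toy sketch in Python, nothing to do with any real driving stack; the fitness function here is a made-up stand-in for "how far the player got":

```python
import random

def fitness(genome):
    # Made-up scoring function: closer to all-0.5 is "better".
    # In a real setup this score would come from actually running the game.
    return -sum((g - 0.5) ** 2 for g in genome)

def mutate(genome, rate=0.1):
    # Copy the parent with small random tweaks.
    return [g + random.gauss(0, rate) for g in genome]

def evolve(pop_size=50, genome_len=8, generations=100):
    population = [[random.random() for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the best half, refill by mutating the survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
```

The first generation scores badly, and repeated select-and-mutate rounds climb toward better scores, which is the "breed the most successful examples" idea in the post.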
|
# ? Oct 3, 2019 18:53 |
|
um excuse me posted: "Some hot takes in here. Objectively engineers need real data. You can send a learning algorithm through recorded roads as much as you want, but it's always going to be Tesla engineers driving in a biased way to achieve a specific result. An early release of full automation gives you access to a tremendous amount of real data. Real drivers, real roads, real traffic conditions. The ethical implications of this are a whole other discussion."
I mean, yeah, it's got some issues, but it seems like the logical way to get the data needed without opening up the ethical implications/risk you mention. Of course, saying that out loud, I'm either missing something really obvious, or Tesla has already been doing this, or Elon isn't as smart as Joe Rogan probably thinks he is.
|
# ? Oct 3, 2019 19:00 |
|
Krakkles posted: "Of course, saying that out loud, I'm either missing something really obvious, or Tesla has already been doing this, or Elon isn't as smart as Joe Rogan probably thinks he is."
That's exactly what they do. Tesla’s Deep Learning at Scale: Using Billions of Miles to Train Neural Networks
|
# ? Oct 3, 2019 19:14 |
|
Elon says the cars already do that, but people who have gone under the hood and looked at the network traffic say they’ve never seen it.
|
# ? Oct 3, 2019 19:16 |
|
Saukkis posted: "That's exactly what they do."
Translation: "We use real-world people to beta test key safety systems, while pitching it as a finished product." This is horrible mechanical and electrical failure.
|
# ? Oct 3, 2019 19:19 |
|
Krakkles posted: "Couldn't this be similarly accomplished (or maybe it is being similarly accomplished) by having the car (Tesla, let's be honest) monitor and record the decisions it would have made while still"
Well it's definitely a good idea and I'm glad you're considering the well being of the general public, but from a cold software engineering standpoint, that still generates too much bias. If a person swerves around a deer and gets into an accident with another car, will the program recognize the deer as the cause of the accident or the car the driver struck? Will it understand that an accident with a deer is less severe than an accident with another car? Does it consider that the deer is the right choice even if striking the other car causes a less severe accident? Driving correctly is far from black and white. Nuances like this come from the result of collecting real data.
|
# ? Oct 3, 2019 19:19 |
|
I thought the point was to have self driving cars that are better than human drivers, not just ones that learn the bad habits of all human drivers and combine em into the worst of all worlds.
|
# ? Oct 3, 2019 19:22 |
|
um excuse me posted: "Well it's definitely a good idea and I'm glad you're considering the well being of the general public, but from a cold software engineering standpoint, that still generates too much bias. If a person swerves around a deer and gets into an accident with another car, will the program recognize the deer as the cause of the accident or the car the driver struck? Will it understand that an accident with a deer is less severe than an accident with another car? Does it consider that the deer is the right choice even if striking the other car causes a less severe accident? Driving correctly is far from black and white. Nuances like this come from the result of collecting real data."
Which is fun to consider, because human drivers habitually swerve away from deer and cause horrible wrecks. Fortunately they're usually single-car wrecks so the parties involved are naturally limited, but society in general has accepted this as an acceptable risk for driving. But if a computer were to do that? Everyone loses their loving minds.
|
# ? Oct 3, 2019 19:26 |
|
It can be safer in boring holes.
|
# ? Oct 3, 2019 19:36 |
|
xzzy posted: "Which is fun to consider because human drivers habitually swerve away from deer and cause horrible wrecks. Fortunately they're usually single-car wrecks so the parties involved are naturally limited, but society in general has accepted this as an acceptable risk for driving."
So should an FSD car plow into a deer? Kill a squirrel, dog, cat, etc., running across the road?
|
# ? Oct 3, 2019 19:38 |
|
um excuse me posted: "Well it's definitely a good idea and I'm glad you're considering the well being of the general public, but from a cold software engineering standpoint, that still generates too much bias. If a person swerves around a deer and gets into an accident with another car, will the program recognize the deer as the cause of the accident or the car the driver struck? Will it understand that an accident with a deer is less severe than an accident with another car? Does it consider that the deer is the right choice even if striking the other car causes a less severe accident? Driving correctly is far from black and white. Nuances like this come from the result of collecting real data."
It has the upside of allowing them to record what decision the software would have made and reconcile it against what a real driver did, which would allow analysis to drive both better software decisions and mayyyyybe the hard decisions that come up ("hitting a deer is better than swerving because swerving is 67% likely to cause multiple vehicle involvement").

I'd imagine this only works up to a point - at some point, it would have to start driving, and presumably, iteration would still be required - but it certainly seems like a better starting point than, uh, what they seem to be doing. (As above, they may already be doing this, or they may not, I don't know.)

distance is in leagues; speed is in furlongs per fortnight

Krakkles fucked around with this message at 19:44 on Oct 3, 2019
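The record-and-reconcile idea above is usually called shadow mode: the software rides along, logs what it would have done, and you mine the disagreements. A minimal sketch, with entirely made-up log records and field names (no real system's telemetry looks like this):

```python
# Hypothetical shadow-mode log: each record pairs what the human driver did
# with what the (inactive) software would have done at the same moment.
log = [
    {"event": "deer_ahead", "driver": "swerve", "model": "brake", "crash": True},
    {"event": "deer_ahead", "driver": "brake",  "model": "brake", "crash": False},
    {"event": "cut_off",    "driver": "brake",  "model": "brake", "crash": False},
]

def disagreements(records):
    # The interesting training signal is where the model and the driver
    # chose differently; the recorded outcome tells you who was right.
    return [r for r in records if r["driver"] != r["model"]]

for r in disagreements(log):
    outcome = "crashed" if r["crash"] else "ok"
    print(f'{r["event"]}: driver={r["driver"]}, model={r["model"]}, {outcome}')
```

Here the one disagreement is exactly the swerve-around-the-deer case from the post: the human swerved and crashed while the model would have braked, which is the kind of evidence you'd feed back into the hard-decision analysis.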
# ? Oct 3, 2019 19:42 |
|
um excuse me posted: "Well it's definitely a good idea and I'm glad you're considering the well being of the general public, but from a cold software engineering standpoint, that still generates too much bias. If a person swerves around a deer and gets into an accident with another car, will the program recognize the deer as the cause of the accident or the car the driver struck? Will it understand that an accident with a deer is less severe than an accident with another car? Does it consider that the deer is the right choice even if striking the other car causes a less severe accident? Driving correctly is far from black and white. Nuances like this come from the result of collecting real data."

While I don't know the best solution to the trolley problem, a (working) autopilot could see the deer, see the other car, and then decide to swerve a different direction or hit the brakes, or do all kinds of things faster than a human can do it. Having the AI decide what it would do in that situation and have it ignore the part where the human crashed would still be useful. The point of the AI is not to understand why accidents are bad, it's to have it work to avoid any kind of accident. This all assumes that the people making this put more thought into their product than a company that thinks having your turn signals make fart sounds is a cool feature.

In any case, a proper self-driving car would require all self-driving cars to be speaking with each other and planning movements together. If the car knows where all other cars are, and a child or an animal runs into the street, all the cars can work together to figure out the best plan of action to keep everyone from being damaged. So one car can swerve out of the way, even though that would cause it to hit this other car, but it's ok because that car also moved out of the way to make room, because it was safe for it to do so.

Cojawfee fucked around with this message at 19:47 on Oct 3, 2019
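The everyone-talks-to-everyone idea can be sketched as a toy lane-shuffling planner. Assuming every car broadcasts which lane it's in (a big assumption, and every name here is invented for illustration), a car whose lane is suddenly blocked asks the chain of cars between it and the nearest free lane to each shift over one, opening a path:

```python
def plan_moves(lanes, blocked_lane):
    """lanes: lane index -> car id (or None if the lane is empty).
    A hazard blocks one lane; return {car_id: new_lane} so that cars
    shuffle toward the nearest gap and the blocked car can escape."""
    moves = {}
    free = [i for i, car in lanes.items() if car is None]
    if lanes.get(blocked_lane) is None or not free:
        return moves  # nothing in the blocked lane, or nowhere to go
    target = min(free, key=lambda i: abs(i - blocked_lane))
    # Every lane strictly between blocked and target is occupied
    # (target is the *nearest* gap), so shift each of those cars one
    # step toward the gap; the blocked car takes the last opened slot.
    step = 1 if target > blocked_lane else -1
    lane = target
    while lane != blocked_lane:
        moves[lanes[lane - step]] = lane
        lane -= step
    return moves
```

For example, with car A in lane 0, car B in lane 1, and lane 2 empty, a hazard in lane 0 yields B moving to 2 and A moving to 1: exactly the "that car also moved out of the way to make room" chain from the post, minus all the hard real-time and trust problems.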
# ? Oct 3, 2019 19:43 |
|
I doubt we'll have real full self driving before 2030, maybe by 2040, but it's not coming from the company that can't avoid plowing into big rigs at full speed, or giant red firetrucks.
|
# ? Oct 3, 2019 19:45 |
|
Platystemon posted: "Elon says the cars already do that, but people who have gone under the hood and looked at the network traffic say they’ve never seen it."
Lol elon.txt
|
# ? Oct 3, 2019 19:45 |
|
Also worth noting that other manufacturers are working on self-driving cars. They just have the good sense to keep press under wraps until they have something worthwhile to show off. Not every company thinks the Early Access approach to automated vehicles is sane.
|
# ? Oct 3, 2019 19:50 |
|
Colostomy Bag posted: "So should a FSD car plow into a deer? Kill a squirrel, dog, cat, etc, running across the road?"
That's what's taught as best practice for humans. Drop anchor and don't swerve, even if it means you hit them.
|
# ? Oct 3, 2019 20:04 |
|
xzzy posted: "That's what's taught as best practice for humans. Drop anchor and don't swerve even if it means you hit them."
What about some person jaywalking?
|
# ? Oct 3, 2019 20:18 |
|
Colostomy Bag posted: "What about some person jaywalking?"
Self-driving cars would reduce the incidence of this through two mechanisms: seeing the pedestrian earlier, and having a more predictable reaction to them. People run out in front of cars, in part, because they think the driver will stop. If they know that self-driving cars will not ever stop for them (or that they will, I'm not implying one or the other is correct), they will modify their behavior. But man, day one will be interesting.
|
# ? Oct 3, 2019 20:22 |
|
Everyone knows that once the trolley problem is introduced, conversation must come to a halt.
|
# ? Oct 3, 2019 20:38 |
|
xzzy posted: "Everyone knows that once the trolley problem is introduced, conversation must come to a halt."
is there an option for the trolley to run over the one person, then back up and proceed on the path that has ten people tied to the tracks?
|
# ? Oct 3, 2019 20:40 |
|
Krakkles posted: "See? We can't solve this, I don't even know why you're talking about it, xzzy."
Day one will basically be called the Purge.
|
# ? Oct 3, 2019 20:44 |
|
Step one: put a Tesla body on literally any other vehicle frame.
Step two: get away with murder.
|
# ? Oct 3, 2019 20:56 |
|
|
MomJeans420 posted: "I doubt we'll have real full self driving before 2030, maybe by 2040, but it's not coming from the company that can't avoid plowing into big rigs at full speed, or giant red firetrucks."
xergm posted: "Also worth noting that other manufacturers are working on self-driving cars."
GM, for instance, seems to be doing well with Super Cruise. Aside from starting with an easier problem and focusing at least a little on safety, they had the good sense not to market adaptive cruise and lane keeping assist as loving autopilot.
|
# ? Oct 3, 2019 21:42 |