enojy
Sep 11, 2001

bass rattle
stars out
the sky

FCKGW posted:

Sony movies always had Sony products in them, including Sony PCs.

Do they still make PCs?

The Last of Us Part 2 (a PlayStation exclusive) has a character playing a real game on a PS Vita. The game takes place sometime around 2040, but revolves around a mass extinction event that happened in late 2013, years before the handheld was deemed a failure and discontinued.

KOTEX GOD OF BLOOD
Jul 7, 2012

Mister Facetious posted:

They sold/spun off their Vaio brand a few years ago.

quote:

Sony sold its PC business to the investment firm Japan Industrial Partners in February 2014 as part of a restructuring effort to focus on mobile devices. Sony maintains a minority stake in the new, independent company, which currently sells computers in the United States, Japan and Brazil under an exclusive marketing agreement. Sony still holds the intellectual property rights for the VAIO brand and logo. Currently in the US, VAIO business products are sold by Trans Cosmos America, Inc.

rufius
Feb 27, 2011

Clear alcohols are for rich women on diets.

Godzilla07 posted:

One of the best Thunderbolt 3 docks, the CalDigit TS3 Plus, is on sale for $225 until July 10 through CalDigit's site. Free shipping and no sales tax for non-CA residents.

Do you know if it plays nice with the 2019 MBP 16”?

I’ve got a 3-year-old Plugable TBT3-UDV that crashes my 16” MBP.

Supposedly the crashes are a software thing, so I’m guessing it won’t matter which dock I use, but it’s a little frustrating not to have my dock anymore.

KOTEX GOD OF BLOOD
Jul 7, 2012

So I finally got around to replacing the battery in this 2013 13" rMBP. Unfortunately I am an idiot and tried using the wrong screws to hold the battery in. Now that I have the proper screws, they go in, but they don't tighten down: they start to bind about midway through their travel and then just keep spinning.

It seems like the battery is held in by them, but I don't want to take any chances. Can I just Gorilla tape it, or is there a better option?

Binary Badger
Oct 11, 2005

Trolling Link for a decade


Godzilla07 posted:

One of the best Thunderbolt 3 docks, the CalDigit TS3 Plus, is on sale for $225 until July 10 through CalDigit's site. Free shipping and no sales tax for non-CA residents.

Pretty good considering Amazon sells it for $239 and Apple charges $249...

Anybody who picks one up should grab the latest (1/2020) firmware update, which does the following:

quote:

Firmware 44.1 Release Notes:

The new TS3 Plus firmware 44.1 provides support for charging a laptop up to 87W

It also improves compatibility and interoperability with some USB-C monitors when connecting to the TS3 Plus’s downstream Thunderbolt 3 port

jokes
Dec 20, 2012

Uh... Kupo?

Someone convince me not to get a second HomePod for my bedroom projector setup. They’re only like $150 nowadays.

nitsuga
Jan 1, 2007

KOTEX GOD OF BLOOD posted:

So I finally got around to replacing the battery in this 2013 13" rMBP. Unfortunately I am an idiot and tried using the wrong screws to hold the battery in. Now that I have the proper screws, they go in, but they don't tighten down: they start to bind about midway through their travel and then just keep spinning.

It seems like the battery is held in by them, but I don't want to take any chances. Can I just Gorilla tape it, or is there a better option?

You mean, sticking a piece of Gorilla Tape over the screw head or trying to thread the screws into screw holes lined with Gorilla Tape? Either might work, but you might have an easier time taping the battery (maybe with automotive trim tape) to the chassis and using some blue Loctite on the screws, threading them as far as they'll go. Alternatively, you could use the wrong screws with this repair, if everything else seems OK.

nitsuga fucked around with this message at 17:23 on Jul 5, 2020

trilobite terror
Oct 20, 2007
BUT MY LIVELIHOOD DEPENDS ON THE FORUMS!

nitsuga posted:

You mean, sticking a piece of Gorilla Tape over the screw head or trying to thread the screws into screw holes lined with Gorilla Tape? Either might work, but you might have an easier time taping the battery (maybe with automotive trim tape) to the chassis and using some blue Loctite on the screws, threading them as far as they'll go. Alternatively, you could use the wrong screws with this repair.

https://youtu.be/l2hqzEKbaIQ

KOTEX GOD OF BLOOD
Jul 7, 2012

nitsuga posted:

You mean, sticking a piece of Gorilla Tape over the screw head or trying to thread the screws into screw holes lined with Gorilla Tape? Either might work, but you might have an easier time taping the battery (maybe with automotive trim tape) to the chassis and using some blue Loctite on the screws, threading them as far as they'll go. Alternatively, you could use the wrong screws with this repair, if everything else seems OK.

Thanks for this advice! I decided to just go with some blue Loctite. I'm hoping that between that and the speaker covers over the battery, it should hold in OK.

Binary Badger
Oct 11, 2005

Trolling Link for a decade


Welp, this was brought up in YOSPOS:

https://twitter.com/never_released/status/1280207485278789633

So to summarize: yes, ARM Macs will get the Apple GPU, and ONLY the Apple GPU, while AMD and Intel GPUs bat cleanup for the remaining Intel Macs.

Also, by Nvidia GPUs they must mean past Macs; LOL if they ever give a new machine an Nvidia chip.

Binary Badger fucked around with this message at 06:46 on Jul 7, 2020

MarcusSA
Sep 23, 2007

Eh, I feel like the Apple GPU gives them both a run for their money.

Shouldn't an eGPU still work, though?

FlapYoJacks
Feb 12, 2009

MarcusSA posted:

Eh, I feel like the Apple GPU gives them both a run for their money.

Shouldn't an eGPU still work, though?

It depends on whether they keep Thunderbolt around and whether they decide not to drop support for eGPUs.

japtor
Oct 28, 2005
Well there’s also this:

https://twitter.com/never_released/status/1279823667199967238?s=20

(Probably for some tasks where the latency benefit of shared memory outweighs the cost of moving data to a discrete GPU)

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD
I find it very hard to believe that this hypothetical Apple iGPU is more powerful than say a middling dGPU like the AMD 580 or 5500.

MarcusSA
Sep 23, 2007

~Coxy posted:

I find it very hard to believe that this hypothetical Apple iGPU is more powerful than say a middling dGPU like the AMD 580 or 5500.

I dunno about that, because the graphics performance per watt I see on my iPad Pro is pretty amazing.

Like, the ARM version of Dolphin on iOS blows everything else in the mobile market away.

I think it’s going to be interesting to see how things shake out.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Binary Badger posted:

Also, they must mean past Macs had Nvidia GPUs, LOL if they ever give a new machine an Nvidia chip.

Apple's done some "when hell freezes over" stunts before. Boot Camp launched on April 1 for a reason.

Besides, it's not like the actual corporations hate each other. There's bad blood between people. People move to other jobs, or retire, or get fired. Eventually there are new people who are more willing to let the past die. If Apple ships third-party GPUs in a decade, it's not too crazy to think that they'd be Nvidia silicon.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

~Coxy posted:

I find it very hard to believe that this hypothetical Apple iGPU is more powerful than say a middling dGPU like the AMD 580 or 5500.

Why can't it be, though? iGPUs suck in the PC world mostly because Intel is perennially unwilling to budget enough silicon to make them not suck. GPU performance is mostly down to (a) how efficient your design is at converting raw compute to rasterization and (b) how much raw compute you're willing to throw at the problem. Memory bandwidth is also an important factor, but Apple's tile based GPUs scale bandwidth by adding on-die cache rather than requiring tons of bandwidth to the external DRAM, so that converts it back into a "how much silicon do you spend" problem.
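
As a rough back-of-envelope (my own loose numbers, not anything official; the Apple ALU count and clock in particular are estimates), "raw compute" really is just ALU count x clock x 2 ops per FMA, so the gap to a midrange dGPU mostly comes down to how many ALUs Apple is willing to lay down:

code:

def tflops(alus, clock_ghz):
    # Peak FP32: each ALU retires one fused multiply-add (2 ops) per cycle.
    return alus * 2 * clock_ghz / 1000

parts = {
    "A12Z GPU (8 cores, rough estimate)": tflops(512, 1.27),   # ~1.3 TFLOPS
    "Radeon RX 5500M":                    tflops(1408, 1.645), # ~4.6 TFLOPS
    "Radeon RX 580":                      tflops(2304, 1.34),  # ~6.2 TFLOPS
}
for name, tf in parts.items():
    print(f"{name:38s} ~{tf:.1f} TFLOPS peak FP32")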

And on that note, Apple has shown many times that they're willing to build a much bigger die than their competitors in iOS devices, so I wouldn't be surprised if that pattern holds true on Mac silicon too. The GPU architecture they're using is well known for its efficiency, so there's potential.

(I also wouldn't be surprised if the chip for low end / ultralight Mac laptops ends up just being a slightly upclocked iPad chip, but that's not a bad thing. As MarcusSA mentioned, the iPad Pro is already in a pretty nice place, both CPU and GPU.)

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD
https://browser.geekbench.com/metal-benchmarks

I'm sure there are problems with this list; it is Geekbench, after all.
I can believe Apple silicon will be better than an Intel iGPU, but comparing it to a separate part with its own TDP and fast RAM is silly.

American McGay
Feb 28, 2010

by sebmojo
I don't think it's silly to compare it to the part it's replacing.

Pivo
Aug 20, 2004


BobHoward posted:

Why can't it be, though? iGPUs suck in the PC world mostly because Intel is perennially unwilling to budget enough silicon to make them not suck. GPU performance is mostly down to (a) how efficient your design is at converting raw compute to rasterization and (b) how much raw compute you're willing to throw at the problem. Memory bandwidth is also an important factor, but Apple's tile based GPUs scale bandwidth by adding on-die cache rather than requiring tons of bandwidth to the external DRAM, so that converts it back into a "how much silicon do you spend" problem.

And on that note, Apple has shown many times that they're willing to build a much bigger die than their competitors in iOS devices, so I wouldn't be surprised if that pattern holds true on Mac silicon too. The GPU architecture they're using is well known for its efficiency, so there's potential.

(I also wouldn't be surprised if the chip for low end / ultralight Mac laptops ends up just being a slightly upclocked iPad chip, but that's not a bad thing. As MarcusSA mentioned, the iPad Pro is already in a pretty nice place, both CPU and GPU.)

Oversimplified to the point where it's almost tautological. Quite literally efficiency is performance per watt, so all you have said is, "the answer to making a design with high performance is making a design with high performance per watt and increasing the power".

Okay. How? I'm sure AMD would love to know. NVIDIA has the architectural advantage in raw performance, and outside of rendering, the 'political' advantage with wide adoption of CUDA. Competitors like AMD can't just throw more silicon and power at the problem, which is why they have been heavily focused on iterating their architecture rather than just trying to scale GCN. It's a big unknown whether mobile focused GPUs will scale to that extent. Apple GPU also does things differently enough to require platform-specific optimizations. Not everything is running a tile-based deferred renderer. They will require both the architectural advantage and the political advantage.

It is something Apple has pulled off in the mobile SoC space. They may yet pull it off in the desktop CPU space, but that is still unknown. To say it is a foregone conclusion - or highly likely - that Apple will match or exceed current desktop dGPUs on their current trajectory is dishonest.

Pivo
Aug 20, 2004


Let's not forget that the war between Intel's Xe and AMD's iGPUs is just heating up, too. Intel's made huge gains in Ice Lake and it looks like they will do it again with Tiger Lake. Once their manufacturing is out of the hole they've dug - remember that Intel, unlike Apple & AMD, is also a hardcore manufacturing firm, which has historically been one of their key advantages - you should see some heavyweight competition from players already well-worn in the space.

It is cool that Apple will compete, but it is far far far too early to make any kind of broad statement other than that we hope they'll do well, for the benefit of the consumer.

Corb3t
Jun 7, 2003

Pivo posted:

Oversimplified to the point where it's almost tautological. Quite literally efficiency is performance per watt, so all you have said is, "the answer to making a design with high performance is making a design with high performance per watt and increasing the power".

Okay. How? I'm sure AMD would love to know. NVIDIA has the architectural advantage in raw performance, and outside of rendering, the 'political' advantage with wide adoption of CUDA. Competitors like AMD can't just throw more silicon and power at the problem, which is why they have been heavily focused on iterating their architecture rather than just trying to scale GCN. It's a big unknown whether mobile focused GPUs will scale to that extent. Apple GPU also does things differently enough to require platform-specific optimizations. Not everything is running a tile-based deferred renderer. They will require both the architectural advantage and the political advantage.

It is something Apple has pulled off in the mobile SoC space. They may yet pull it off in the desktop CPU space, but that is still unknown. To say it is a foregone conclusion - or highly likely - that Apple will match or exceed current desktop dGPUs on their current trajectory is dishonest.

But we already know the big players are buying in to Apple Silicon (Adobe, Maya), and cross platform games run a helluva lot better on any iOS device made in the last few years than they do with a MacBook and an Intel iGPU.

Pivo
Aug 20, 2004


Gay Retard posted:

But we already know the big players are buying in to Apple Silicon (Adobe, Maya), and things like Fortnite run a helluva lot better on any iPhone made in the last few years than it does with a MacBook and an intel iGPU.

Intel didn't take iGPUs seriously until recently. But this discussion is about Apple replacing dGPUs entirely!

Ampere is rumoured to deliver 30%+ raster perf over Turing; even 20% would be huge. NVIDIA is so far ahead at the moment that pointing to Fortnite on the iPhone as a tell for how desktop-class machines will fare in gaming & compute is just misleading.

I have no doubt Apple GPU will beat Intel 14nm iGPU.

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

~Coxy posted:

https://browser.geekbench.com/metal-benchmarks

I'm sure there are problems with this list; it is Geekbench, after all.
I can believe Apple silicon will be better than an Intel iGPU, but comparing it to a separate part with its own TDP and fast RAM is silly.

I don't know what the Geekbench GPU benchmark measures, but an A12X pulling a number comparable to a GTX 950 isn't unimpressive. A Mac mini with a GPU on par with a GTX 1050 Ti and eight CPU cores could be a very nice thing, but unless Apple springs for a quad-channel memory controller, the performance will probably hit a bandwidth wall. That'd still be adequate for a family PC.
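
Napkin math on the bandwidth wall, for what it's worth (the LPDDR4X per-channel rate is the published peak; the bus widths are my own assumptions about a hypothetical Mac SoC, reading "quad-channel" as a 256-bit interface):

code:

def lpddr4x_gbs(channels_16bit, mt_per_s=4266):
    # Peak bandwidth in GB/s: (16-bit channel width in bytes) * transfer rate.
    return channels_16bit * (16 / 8) * mt_per_s / 1000

print(f"128-bit LPDDR4X-4266 (iPad Pro-style):  ~{lpddr4x_gbs(8):.0f} GB/s")
print(f"256-bit ('quad-channel') LPDDR4X-4266:  ~{lpddr4x_gbs(16):.0f} GB/s")
print("GTX 1050 Ti, 128-bit GDDR5 @ 7 Gbps:     ~112 GB/s")

So a doubled-up memory interface would get past 1050 Ti territory on paper, at least, before any tile cache enters the picture.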

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Pivo posted:

Oversimplified to the point where it's almost tautological. Quite literally efficiency is performance per watt, so all you have said is, "the answer to making a design with high performance is making a design with high performance per watt and increasing the power".

That would be the theme of all high performance silicon over the past decade+, yes. Including CPUs. You may have noticed that Apple has decided that the efficient CPU cores it designed primarily for iPhones and secondarily for iPads are now good enough to take on motherfucking Intel. So: why not GPUs too?

Of course there's a whole world of details on the way to scaling up, but these aren't insurmountable problems. There is no magic in this industry. Mostly it's just high R&D cost barriers. When I've been skeptical of ARMing the Mac before, it's largely been based on that.

But now we know that Apple decided to go ahead and spend, so the question becomes, where did they set their performance targets and how well did they execute? (I put this in the past tense since, if the first Apple Silicon Macs ship by EOY, they should have had first silicon in the labs months ago.)

For GPUs specifically, scaling up is much easier than CPUs. GPUs are all about exploiting embarrassingly parallel problems by throwing more parallel hardware at them. So once you've got a very efficient baseline tile renderer, you render more tiles in parallel. Since TBDR leans super heavily on tile cache for memory performance, it is particularly well suited to the "throw some more silicon at it" option: you just add more tile cache to go with each tile rasterizer. Yes, eventually you need to scale the main memory interface too, but it's just not as critical with TBDR.
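
To make the "render more tiles in parallel" point concrete, here's a toy sketch (Python, purely illustrative, nothing to do with Apple's actual hardware): the frame gets carved into tiles, every triangle is binned to the tiles it touches, each tile is shaded entirely out of its own on-chip working set, and scaling the GPU up just means handing out more tiles at once.

code:

from concurrent.futures import ThreadPoolExecutor

TILE = 32  # pixels per tile edge; one tile's working set fits in on-chip cache

def overlaps(bbox, tile_xy):
    # Cheap binning test: does the triangle's bounding box touch this tile?
    (x0, y0, x1, y1), (tx, ty) = bbox, tile_xy
    return x0 < tx + TILE and x1 >= tx and y0 < ty + TILE and y1 >= ty

def shade_tile(job):
    # All per-pixel work for this tile stays "on chip"; the finished tile is
    # written out to main memory exactly once.
    tile_xy, binned = job
    pixels = [[len(binned)] * TILE for _ in range(TILE)]  # stand-in for real shading
    return tile_xy, pixels

def render(width, height, tri_bboxes, gpu_cores):
    tiles = [(x, y) for y in range(0, height, TILE) for x in range(0, width, TILE)]
    bins = {t: [b for b in tri_bboxes if overlaps(b, t)] for t in tiles}
    # "Throw more silicon at it" = more workers chewing through the same tile list.
    with ThreadPoolExecutor(max_workers=gpu_cores) as pool:
        return dict(pool.map(shade_tile, bins.items()))

frame = render(1920, 1080, [(100, 100, 400, 300), (500, 200, 900, 800)], gpu_cores=8)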

quote:

Okay. How? I'm sure AMD would love to know. NVIDIA has the architectural advantage in raw performance, and outside of rendering, the 'political' advantage with wide adoption of CUDA.

CUDA is irrelevant to GPU-as-GPU. And there is no dramatic architectural advantage behind "raw" GPU performance, that's literally always been about how much area you spend on ALUs. Architectural advantages come in when you start looking at how efficient the GPU design is at converting that raw computational performance into useful work, and guess what? TBDR is well known for being really efficient on that front. There's a reason why it has dominated the mobile space.

quote:

Competitors like AMD can't just throw more silicon and power at the problem, which is why they have been heavily focused on iterating their architecture rather than just trying to scale GCN. It's a big unknown whether mobile focused GPUs will scale to that extent. Apple GPU also does things differently enough to require platform-specific optimizations. Not everything is running a tile-based deferred renderer. They will require both the architectural advantage and the political advantage.

I will give you a little hint: in recent years both NVIDIA and AMD have changed their architectures to adopt TBDR-adjacent techniques (NVIDIA went first).

quote:

It is something Apple has pulled off in the mobile SoC space. They may yet pull it off in the desktop CPU space, but that is still unknown. To say it is a foregone conclusion - or highly likely - that Apple will match or exceed currently desktop dGPUs on their current trajectory is dishonest.

gently caress off, pivo, I never said it was a foregone conclusion you massive shitter

Pivo
Aug 20, 2004


"More ALUs = better than" is again oversimplifying. How do you keep them fed, what does your pipeline look like, how wide is it, what's your precision mix (especially if Apple is pushing render+compute), how good is your ILP and dispatch ... Come on. Scaling GPUs is absolutely not a solved problem. Saying that rendering is embarrassingly parallel is again just going back to axiomatic truth and presenting it as a solution rather than the starting point. We know perf/watt is a silicon design target, we know GPU workloads are parallel.

BobHoward posted:

There is no magic in this industry. Mostly it's just high R&D cost barriers.

It's not magic, it's engineering. Dumping money into R&D does not guarantee good engineering outcomes, see: Intel 10nm. You don't need to spend as much as the others to get fantastic outcomes, see: AMD Zen 2.

We'll see.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Pivo posted:

"More ALUs = better than" is again oversimplifying. How do you keep them fed, what does your pipeline look like, how wide is it, what's your precision mix (especially if Apple is pushing render+compute), how good is your ILP and dispatch ... Come on. Scaling GPUs is absolutely not a solved problem. Saying that rendering is embarrassingly parallel is again just going back to axiomatic truth and presenting it as a solution rather than the starting point. We know perf/watt is a silicon design target, we know GPU workloads are parallel.

It's not magic, it's engineering. Dumping money into R&D does not guarantee good engineering outcomes, see: Intel 10nm. You don't need to spend as much as the others to get fantastic outcomes, see: AMD Zen 2.

We'll see.

I see you still haven't given up on pivo'ing, pivo

engage with the words i wrote, not the imaginary things you think I said because you couldn't be bothered, at any point, to do anything more than skim for reasons to be antagonistic

Pivo
Aug 20, 2004


BobHoward posted:

skim for reasons to be antagonistic



I could micro-quote everything, but I don't think you pay attention to the things you write yourself.

Crunchy Black
Oct 24, 2017

by Athanatos
I think the elephant y'all are ignoring here is, "good enough."

We're a bunch of loving nerds arguing about imaginary CPU/GPU architectures which we're just speculating about at performance envelopes that make 10 years ago look like an eon ago.

The real question for Apple is, "is it good enough to move units?" Because they don't give a poo poo about anyone who is posting in this thread and haven't for 15+ years lol. And I daresay a turnt' up iPad Pro chip will do just fine in a new iMac.

well why not
Feb 10, 2009




It'll be more than good enough for the vast majority of people who buy Apple computers. It will run productivity tools (Word, Excel, Photoshop, etc.) and handle light video editing nicely. Everything on Apple Arcade will probably work great.

The important part isn't how fast it is, it's how much more profitable it is.

Pivo
Aug 20, 2004


It's true, Apple has no problem leveraging GPU compute and fixed-function hardware to accelerate things like video editing. They demoed it at WWDC, editing multiple 4K streams in real time on the iPad chip. I know, I know, tech-tubers, but MKBHD's take that the iPhone 12's rumoured 4K 240fps recording is just Apple "showing off their image processing pipeline" and "dunking on the competition" rings true. That is some power in a small package.

It's possible no one will truly care about raster performance anyway. Macs aren't for gaming. Maybe the whole argument is moot.

well why not
Feb 10, 2009




It'll be at least as powerful as the iPad pro, which is enough to replace a family computer for most/many people. Removing the battery restriction and improving the cooling is gonna be super interesting, but it's really hard to predict how much faster it'll be. Has anyone watercooled an iPad Pro yet?

EL BROMANCE
Jun 10, 2006

COWABUNGA DUDES!
🥷🐢😬



Internet Recovery brings up a nice screen pointing to apple.com/support with an error code of 5010F (15).

Thank you, Apple, for a URL that doesn't explain the error I have. Neither does Google. The part in parentheses seems to be important, and virtually everyone online talks about (3).

So, I guess I’m refreshing the Refurb Mini page again this week, because I’m already sick to the loving teeth of dealing with this (straw that broke the camel’s back, etc.).

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

well why not posted:

It'll be at least as powerful as the iPad pro, which is enough to replace a family computer for most/many people. Removing the battery restriction and improving the cooling is gonna be super interesting, but it's really hard to predict how much faster it'll be. Has anyone watercooled an iPad Pro yet?

I have a brother in BC, I'll tell him to ask Linus. :haw:

cowofwar
Jul 30, 2002

by Athanatos
The question is what the battery life on an ARM MacBook Pro will be. If it’s like double what it is now, that’s going to be a major dunk.

MarcusSA
Sep 23, 2007

cowofwar posted:

The question is what the battery life on an ARM MacBook Pro will be. If it’s like double what it is now, that’s going to be a major dunk.

I dunno about double, but I really feel like it'll be at least 25% more.
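
The napkin math (all assumed numbers on my part: roughly a 58 Wh battery like the current 13", and guessed average draws) lands well above 25%, since runtime is just watt-hours over average draw:

code:

def runtime_hours(battery_wh, avg_draw_w):
    return battery_wh / avg_draw_w

battery = 58.0  # Wh, roughly what the 13" MacBook Pro ships with
print(f"Intel-ish average draw (~8 W): {runtime_hours(battery, 8.0):.1f} h")
print(f"ARM-ish average draw   (~5 W): {runtime_hours(battery, 5.0):.1f} h")  # ~60% more, not double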

American McGay
Feb 28, 2010

by sebmojo
You know they're just gonna use that as an excuse to shrink the battery by 25% and make the shell .03mm thinner.

japtor
Oct 28, 2005
They'll shave off what they can and leave just enough for "all day" battery life like the iPads (aka ~10 hours).

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:
Give me the Surface Book 3 design philosophy, but instead of a dGPU in the keyboard section, it's just more battery.

Binary Badger
Oct 11, 2005

Trolling Link for a decade


It should be interesting to see how much they ramp up the PowerVR-based GPU architecture for the laptop / desktop.

Should also be interesting to see how well certain programs run in emulation AND on the new GPU, including the A-list games that haven't been ported to Apple Silicon.
