Powerful Two-Hander
Mar 10, 2004

Mods please change my name to "Tooter Skeleton" TIA.


akadajet posted:

i used to think u needed to build a new compy every 2 years or so, but now that's not really the case

still rocking a core i5 2500k from like 2010 over here


eukara
Aug 5, 2014

Do you believe in life after bomb?
until last year I used a Core2Quad from 2008...
I only upgraded because I needed a new mobo and RAM.
That way I only ever had to pay big bucks for a good GPU and the rest is history

Notorious b.s.d.
Jan 25, 2003

by Reene

poty posted:

the amd releasing a good cpu for once thing is irrelevant but everything else is pretty problematic op

i hope whoever we get after the Intel tyranny is toppled can figure out how to drive one 5k screen per thunderbolt port

the problem with thunderbolt is that it's stupid

thunderbolt 3.1 is good for up to 40 gbps of bandwidth. multiplexed inside that bandwidth, hypothetically, is usb 3.1 (10 gbps), pci-e 3.0 (64 gbps), and displays (48 gbps for hdmi 2.1)

i think you can see how 10 + 64 + 48 doesn't add up to 40 gbps

thunderbolt will never work as a bus, with multiple devices hanging off. it barely even has the bandwidth required to support an individual display.

The Management
Jan 2, 2010

sup, bitch?

Notorious b.s.d. posted:

the problem with thunderbolt is that it's stupid

thunderbolt 3.1 is good for up to 40 gbps of bandwidth. multiplexed inside that bandwidth, hypothetically, is usb 3.1 (10 gbps), pci-e 3.0 (64 gbps), and displays (48 gbps for hdmi 2.1)

i think you can see how 10 + 64 + 48 doesn't add up to 40 gbps

thunderbolt will never work as a bus, with multiple devices hanging off. it barely even has the bandwidth required to support an individual display.

lol.

first,
4K/60 = 12gbps.
pcie3 = 8gbps per lane

so you can drive 2 4K displays and have enough bandwidth for all of your usb3 needs plus gigabit Ethernet plus a SATA controller and a sound card. and if you want a 4x pcie device, you can plug in the second thunderbolt port.

it’s working great for me.
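for reference, a rough back-of-the-envelope of those numbers (a sketch only: the 3840x2160 resolution, 60 Hz refresh, 24-bit RGB and the 40 Gbps Thunderbolt 3 link rate are assumptions, and blanking/protocol overhead is ignored):

code:

# approximate Thunderbolt 3 budget check (raw pixel payload, no blanking/overhead)
def display_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel payload in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

four_k_60 = display_gbps(3840, 2160, 60, 24)     # ~11.9 Gbps, the "12gbps" figure above

link = 40.0                                       # Thunderbolt 3 link, Gbps
left_over = link - 2 * four_k_60                  # after two 4K60 streams
pcie3_lane = 8.0                                  # PCIe 3.0, ~8 Gbps per lane before encoding

print(f"one 4K60 stream   : {four_k_60:.1f} Gbps")
print(f"left after 2x 4K60: {left_over:.1f} Gbps")                        # ~16 Gbps
print(f"that is roughly {left_over / pcie3_lane:.0f} PCIe 3.0 lanes of other traffic")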

EIDE Van Hagar
Dec 8, 2000

Beep Boop

The Management posted:

lol.

first,
4K/60 = 12gbps.
pcie3 = 8gbps per lane

so you can drive 2 4K displays and have enough bandwidth for all of your usb3 needs plus gigabit Ethernet plus a SATA controller and a sound card. and if you want a 4x pcie device, you can plug in the second thunderbolt port.

it’s working great for me.

https://www.extron.com/product/videotools.aspx

basically maxed out 4k60 with 10 bit color is 22 Gbps.

and in the beforetime when i designed mipi interfaces for mobile devices, we did lossless compression to save power. you can really reduce that data rate, but it has been so long since i looked at the hdmi standards that idk if there are plans to support display stream compression in hdmi like what mipi has.
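a quick sketch of where a ~22 Gbps ballpark for maxed-out 4K60 comes from (assumptions: 3840x2160, 60 Hz, 10-bit RGB, blanking approximated as a flat ~20% overhead, and a 3:1 DSC-style compression ratio for the compressed case; the exact figure depends on the timing standard and link encoding):

code:

# rough 4K60 10-bit bandwidth estimate (overhead figures are assumptions)
ACTIVE_W, ACTIVE_H, HZ = 3840, 2160, 60
BITS_PER_PIXEL = 30          # 10-bit RGB
BLANKING_OVERHEAD = 1.20     # assumed flat ~20% for blanking intervals
DSC_RATIO = 3.0              # typical display-stream-compression target

payload    = ACTIVE_W * ACTIVE_H * HZ * BITS_PER_PIXEL / 1e9   # ~14.9 Gbps
on_wire    = payload * BLANKING_OVERHEAD                       # ~17.9 Gbps; more with TMDS-style encoding
compressed = on_wire / DSC_RATIO                               # what a MIPI/DSC-style codec buys you

print(f"raw payload   : {payload:.1f} Gbps")
print(f"with blanking : {on_wire:.1f} Gbps")
print(f"with ~3:1 DSC : {compressed:.1f} Gbps")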

echinopsis
Apr 13, 2004

by Fluffdaddy
is it better than ethernet over hdmi

The Management
Jan 2, 2010

sup, bitch?

C.H.O.M.E. posted:

i designed mipi interfaces for mobile devices

LP in the VBI or gtfo.

Notorious b.s.d.
Jan 25, 2003

by Reene

The Management posted:

first,
4K/60 = 12gbps.
pcie3 = 8gbps per lane

these are both old as gently caress numbers, since now the ball has moved to bigger displays. also, i was using the figure for a 4x pci-e because lol what the gently caress runs on 1x

the problem is that thunderbolt is always a day late and a dollar short when a new thing comes along, and it's never not going to be

if it were just a pci-e extension that had usb, that would probably be fine

at every new iteration of thunderbolt it tries to also be a display setup and every drat time it's behind

The Management
Jan 2, 2010

sup, bitch?

Notorious b.s.d. posted:

these are both old as gently caress numbers, since now the ball has moved to bigger displays. also, i was using the figure for a 4x pci-e because lol what the gently caress runs on 1x

the problem is that thunderbolt is always a day late and a dollar short when a new thing comes along, and it's never not going to be

if it were just a pci-e extension that had usb, that would probably be fine

at every new iteration of thunderbolt it tries to also be a display setup and every drat time it's behind

8k displays exist. they take 2x display port, so it’s not like there’s anything available yet that can drive them. so thunderbolt is still the highest bandwidth display transport available. not sure how that is a day late.

it can be just a pcie extension that has usb. you don’t have to plug it into a display.

every iteration of thunderbolt is a superset of displayport, the current standard display technology. you can pretend it’s display port if that’s all you want.

MrMoo
Sep 14, 2000

16k displays exist too, but DisplayPort for whatever reason has been slow to update for "newer and better things"(tm). When DisplayPort gets the bump no doubt Thunderbolt will follow to match.

Many TV networks and cable distributors are still only 720p, so it's all a bit of a niche market.

MrMoo fucked around with this message at 19:33 on Jun 23, 2018

Notorious b.s.d.
Jan 25, 2003

by Reene

The Management posted:

8k displays exist. they take 2x display port, so it’s not like there’s anything available yet that can drive them. so thunderbolt is still the highest bandwidth display transport available. not sure how that is a day late.

well no, right off the bat, hdmi 2.1 is higher bandwidth... and hdmi 2.1 has fewer multiplexing obligations.

and hdmi 2.1 notably has native support for 8k @ 60 Hz, already. no double cable higgery pokery required
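a rough sense of why 8K60 is tight even at HDMI 2.1's 48 Gbps signalling rate (a sketch: 7680x4320 at 60 Hz assumed, with blanking and FRL encoding overhead ignored, so the real headroom is smaller than shown):

code:

# raw 8K60 pixel payload at different pixel formats vs the 48 Gbps HDMI 2.1 link
def payload_gbps(bits_per_pixel):
    return 7680 * 4320 * 60 * bits_per_pixel / 1e9

for label, bpp in [("8-bit 4:2:0", 12), ("8-bit RGB", 24), ("10-bit RGB", 30)]:
    print(f"{label:12s}: {payload_gbps(bpp):5.1f} Gbps of a 48 Gbps link")

# 10-bit RGB already exceeds the raw link rate; 8-bit RGB is borderline once
# blanking and encoding overhead are added, which is where DSC comes in.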

echinopsis
Apr 13, 2004

by Fluffdaddy
32k displays are in the pipeline and when you buy one you'll discover 64k displays are about 2 months from coming out


im not interested until they get to 256k. that is when I'll finally be content

Silver Alicorn
Mar 30, 2008

𝓪 𝓻𝓮𝓭 𝓹𝓪𝓷𝓭𝓪 𝓲𝓼 𝓪 𝓬𝓾𝓻𝓲𝓸𝓾𝓼 𝓼𝓸𝓻𝓽 𝓸𝓯 𝓬𝓻𝓮𝓪𝓽𝓾𝓻𝓮
420k my dude

skimothy milkerson
Nov 19, 2006

Captain Foo posted:

what if gpu but phone

what if ur posts but good

echinopsis
Apr 13, 2004

by Fluffdaddy

:worship:

Silver Alicorn
Mar 30, 2008

𝓪 𝓻𝓮𝓭 𝓹𝓪𝓷𝓭𝓪 𝓲𝓼 𝓪 𝓬𝓾𝓻𝓲𝓸𝓾𝓼 𝓼𝓸𝓻𝓽 𝓸𝓯 𝓬𝓻𝓮𝓪𝓽𝓾𝓻𝓮

still keep thinking of this :allears:

Captain Foo
May 11, 2004

we vibin'
we slidin'
we breathin'
we dyin'

Skim Milk posted:

what if ur posts but good

idgi

EIDE Van Hagar
Dec 8, 2000

Beep Boop

420k by 69k

kzin602
May 14, 2007




Grimey Drawer

Notorious b.s.d. posted:

no that story is completely laughable

there are details being covered up, and being permitted to resign with his reputation intact was the deal made

Brian Krzanich (or BK as he presents himself to the Intel employees) is known to be a womanizing dickhead company-wide. There is 100% another story behind this. He will say some pretty vile poo poo while on conference calls or walk-through tours.

EIDE Van Hagar
Dec 8, 2000

Beep Boop

kzin602 posted:

Brian Krzanich (or BK as he presents himself to the Intel employees) is known to be a womanizing dickhead company-wide. There is 100% another story behind this. He will say some pretty vile poo poo while on conference calls or walk-through tours.

i don't know about his reputation, but anyone who can't keep their dick in their pants for a 21 million dollar salary surely has poor impulse control.

Notorious b.s.d.
Jan 25, 2003

by Reene

C.H.O.M.E. posted:

i don't know about his reputation, but anyone who can't keep their dick in their pants for a 21 million dollar salary surely has poor impulse control.

how do you think you make it to ceo in the first place

funeral home DJ
Apr 21, 2003


Pillbug
I thought Apple announced the death of intel processors in their computers so they could crash intel stock and buy the company with the copious amounts of cash Tim Cook uses to wipe his rear end on a daily basis

EIDE Van Hagar
Dec 8, 2000

Beep Boop

Notorious b.s.d. posted:

how do you think you make it to ceo in the first place

struth

The Management
Jan 2, 2010

sup, bitch?

Ripoff posted:

I thought Apple announced the death of intel processors in their computers so they could crash intel stock and buy the company with the copious amounts of cash Tim Cook uses to wipe his rear end on a daily basis

why would he buy a company whose chips they no longer use?

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
could they retool the fabs to make chips they do want to use?

Powerful Two-Hander
Mar 10, 2004

Mods please change my name to "Tooter Skeleton" TIA.


i didn't even know thunderbolt was still a thing

edit: was thunderbolt the one they codenamed "light peak" that everyone thought sounded cool, but by the time it finally came out we had usb3 and displayport and nobody used it?

funeral home DJ
Apr 21, 2003


Pillbug

The Management posted:

why would he buy a company whose chips they no longer use?

Farmer Crack-rear end posted:

could they retool the fabs to make chips they do want to use?

it’s this - Apple likes vertical integration whenever possible so I’m kinda surprised they haven’t decided to buy intel and make a-chips in-house instead of relying on one of their literal competitors for components

who makes the a-series now? Samsung and some Chinese manufacturer, right?

echinopsis
Apr 13, 2004

by Fluffdaddy

C.H.O.M.E. posted:

i don't know about his reputation, but anyone who can't keep their dick in their pants for a 21 million dollar salary surely has poor impulse control.

i think you underestimate the scent of a woman

Agile Vector
May 21, 2007

scrum bored



pretty sure placing your dick on the table in a full suit gets you to the c level but you gotta jelq to ceo

Notorious b.s.d.
Jan 25, 2003

by Reene

Ripoff posted:

it’s this - Apple likes vertical integration whenever possible so I’m kinda surprised they haven’t decided to buy intel and make a-chips in-house instead of relying on one of their literal competitors for components

they already design the chip in-house, that is why they bought pa semi

Ripoff posted:

who makes the a-series now? Samsung and some Chinese manufacturer, right?

samsung and tsmc

Notorious b.s.d.
Jan 25, 2003

by Reene
if apple wanted to manufacture their own semiconductors, they could choose to merge with either intel or tsmc. (samsung is off the table because it is not really a public company despite the elaborate sham).

but

apple total market cap is 900B; intel and tsmc are each about 250B; total acquisition costs would probably be in the 300-400B range

does anyone really, truly believe that vertically integrating the production stage would be worth diluting apple stock by 40-50%? plainly apple's management doesn't

right now, without that final stage of integration, they have two fabs competing against each other (samsung and tsmc) with two more equally capable giants waiting in the wings (gf and intel).

a fab acquisition would make them less, rather than more, valuable. (edit: in a world where apple owned tsmc or intel, they would have a harder time convincing samsung or gf to be a second source)

Notorious b.s.d. fucked around with this message at 05:39 on Jun 25, 2018
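the dilution arithmetic in that post, worked through (a sketch using the post's round numbers: a ~$900B Apple market cap and a $300-400B all-stock deal):

code:

# all-stock acquisition dilution, using the round numbers from the post above
apple_cap = 900e9
for deal in (300e9, 400e9):
    new_share_ratio    = deal / apple_cap                 # new shares vs existing float
    existing_ownership = apple_cap / (apple_cap + deal)   # what current holders keep
    print(f"${deal/1e9:.0f}B deal: issue ~{new_share_ratio:.0%} more shares, "
          f"existing holders keep ~{existing_ownership:.0%}")

# 300B -> ~33% new shares, 400B -> ~44%, i.e. the "40-50%" dilution cited above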

Hugh G. Rectum
Mar 1, 2011

Powerful Two-Hander posted:

still rocking a core i5 2500k from like 2010 over here

upgraded from one of these in 2017 to a ryzen 7 1800X cause it was on sale for $100 off. even at 4.5Ghz the 2500k just couldn't keep up. haven't owned an amd cpu since my dual athlon 2500+ MP machine. it's good to be back.

go amd go

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER


C.H.O.M.E. posted:

420k by 69k



i was hoping for 219 by nice but here we are

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

Notorious b.s.d. posted:

4x pci-e because lol what the gently caress runs on 1x

lots of things actually

i mean you probably don’t want to dangle a rack of tesla cards off a single thunderbolt cable or use thunderbolt to build a supercomputer but 1) ~prosumer~ poo poo and 2) legacy poo poo will benefit from having babby level cable pcie on a device with no exposed or free regular pcie slots

the most pcie bandwidth that’ll actually get used on thunderbolt is an external GPU for someone’s mobile 1337 :pcgaming: battlestation and apparently gpus mostly run ok even when somewhat bandwidth starved. everything else will be controllers for a bazillion peripheral ports which will mostly not be in use at once/be attached to slow poo poo and will do ok sharing pcie x2 or something
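rough numbers behind that sharing argument (a sketch; PCIe 3.0 taken as 8 GT/s per lane, ~7.9 Gbps after 128b/130b encoding, and the peripheral figures are nominal interface maxima, not sustained loads):

code:

# how much fits behind a shared PCIe 3.0 x2 allocation (approximate)
LANE_GBPS = 8 * 128 / 130        # PCIe 3.0: 8 GT/s, 128b/130b encoding -> ~7.88 Gbps/lane

peripherals_gbps = {
    "USB 3.1 gen2 port": 10,
    "gigabit ethernet":   1,
    "SATA III port":      6,
}

x2 = 2 * LANE_GBPS
print(f"x2 share available       : {x2:.1f} Gbps")
print(f"peripheral nominal maxima: {sum(peripherals_gbps.values())} Gbps")
# oversubscribed on paper, fine in practice since the ports rarely peak at once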

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
i didn't quite realize their next process was so dead, or that the new amd chips were anything but :coresjester: . everyone's getting in on the digs at them lately tho

quote:

Last week Brian Krzanich resigned as the CEO of Intel after violating the company’s non-fraternization policy. The details of Krzanich’s departure, though, ultimately don’t matter: his tenure was an abject failure, the extent of which is only now coming into view.

Intel’s Obsolete Opportunity
When Krzanich was appointed CEO in 2013 it was already clear that arguably the most important company in Silicon Valley’s history was in trouble: PCs, long Intel’s chief money-maker, were in decline, leaving the company ever more reliant on the sale of high-end chips to data centers; Intel had effectively zero presence in mobile, the industry’s other major growth area.

Still, I framed the situation that faced Krzanich as an opportunity, and drew a comparison to the challenges that faced the legendary Andy Grove three decades ago:

By the 1980s, though, it was the microprocessor business, fueled by the IBM PC, that was driving growth, while the DRAM business was fully commoditized and dominated by Japanese manufacturers. Yet Intel still fashioned itself a memory company. That was their identity, come hell or high water.

By 1986, said high water was rapidly threatening to drag Intel under. In fact, 1986 remains the only year in Intel’s history that they made a loss. Global overcapacity had caused DRAM prices to plummet, and Intel, rapidly becoming one of the smallest players in DRAM, felt the pain severely. It was in this climate of doom and gloom that Grove took over as CEO. And, in a highly emotional yet patently obvious decision, he once and for all got Intel out of the memory manufacturing business.

Intel was already the best microprocessor design company in the world. They just needed to accept and embrace their destiny.

Fast forward to the challenge that faced Krzanich:

It is into a climate of doom and gloom that Krzanich is taking over as CEO. And, in what will be a highly emotional yet increasingly obvious decision, he ought to commit Intel to the chip manufacturing business, i.e. manufacturing chips according to other companies’ designs.

Intel is already the best microprocessor manufacturing company in the world. They need to accept and embrace their destiny.

That article is now out of date: in a remarkable turn of events, Intel has lost its manufacturing lead. Ben Bajarin wrote last week in Intel’s Moment of Truth:

Not only has the competition caught Intel, they have surpassed them. TSMC is now sampling on 7nm and AMD will ship their architecture on 7nm technology in both servers and client PCs ahead of Intel. For those who know their history, this is the first time AMD has ever beat Intel to a process node. Not only that, but AMD will likely have at least an 18 month lead on Intel with 7nm, and I view that as conservative.

As Bajarin notes, 7nm for TSMC (or Samsung or Global Foundries) isn’t necessarily better than Intel’s 10nm; chip-labeling isn’t what it used to be. The problem is that Intel’s 10nm process isn’t close to shipping at volume, and the competition’s 7nm processes are. Intel is behind, and its insistence on integration bears a large part of the blame.

Intel’s Integrated Model
Intel, like Microsoft, had its fortunes made by IBM: eager to get the PC that an increasingly vocal section of its customer base demanded out the door, the mainframe maker outsourced much of the technology to third party vendors, the most important being an operating system from Microsoft and a processor from Intel. The impact of the former decision was the formation of an entire ecosystem centered around MS-DOS, and eventually Windows, cementing Microsoft’s dominance.

Intel was a slightly different story; while an operating system was simply bits on a disk, and thus easily duplicated for all of the PCs IBM would go on to sell, a processor was a physical device that needed to be manufactured. To that end IBM insisted on having a “second source”, that is, a second non-Intel manufacturer for Intel’s chips. Intel chose AMD, and licensed first the 8086 and 8088 designs that were in the original IBM PC, and later, again under pressure from IBM, the 80286 design; the latter was particularly important because it was designed to be upward compatible with everything that followed.

This laid the groundwork for Intel’s strategy — and immense profitability — for the next 35 years. First off, the dominance of Intel’s x86 design was assured thanks to its integration with DOS/Windows: specifically, DOS/Windows created a two-sided market of developers and PC users, and DOS/Windows ran on x86.

[Figure: Microsoft and Intel were integrated in the PC value chain]

However, thanks to its licensing deal with AMD, Intel wasn’t automatically entitled to all of the profits that would result from that integration; thus Intel doubled-down on an integration of its own: the design and manufacture of x86 chips. That is, Intel would invest huge sums of money into creating new and faster designs (the 386, the 486, the Pentium, etc.), and also invest huge sums of money into ever smaller and more efficient manufacturing processes that would push the limits of Moore’s Law. This one-two punch would ensure that, despite AMD’s license, Intel’s chips would be the only realistic choice for PC makers, allowing the company to capture the vast majority of the profits created by the x86’s integration with DOS/Windows.

Intel was largely successful. AMD did take the performance crown around the turn of the century with the Athlon 64, but the company was unable to keep up with Intel financially when it came to fabs, and Intel illegally leveraged its dominant position with OEMs to keep them buying mostly Intel parts; then, a few years later, Intel not only took back the performance lead with its Core architecture, but settled into the “tick-tock” strategy where it alternated new designs and new manufacturing processes on a regular schedule. The integration advantage was real.

TSMC’s Modular Approach
In the meantime there was a revolution brewing in Taiwan. In 1987, Morris Chang founded Taiwan Semiconductor Manufacturing Company (TSMC) promising “Integrity, commitment, innovation, and customer trust”. Integrity and customer trust referred to Chang’s commitment that TSMC would never compete with its customers with its own designs: the company would focus on nothing but manufacturing.

This was a completely novel idea: at that time all chip manufacturing was integrated a la Intel; the few firms that were only focused on chip design had to scrap for excess capacity at Integrated Device Manufacturers (IDMs) who were liable to steal designs and cut off production in favor of their own chips if demand rose. Now TSMC offered a much more attractive alternative, even if their manufacturing capabilities were behind.

In time, though, TSMC got better, in large part because it had no choice: soon its manufacturing capabilities were only one step behind industry standards, and within a decade had caught up (although Intel remained ahead of everyone). Meanwhile, the fact that TSMC existed created the conditions for an explosion in “fabless” chip companies that focused on nothing but design. For example, in the late 1990s there was an explosion in companies focused on dedicated graphics chips: nearly all of them were manufactured by TSMC. And, all along, the increased business let TSMC invest even more in its manufacturing capabilities.

[Figure: Integrated Intel was competing with a competitive modular ecosystem]

This represented a three-pronged assault on Intel’s dominance:

1. Many of those new fabless design companies were creating products that were direct alternatives to Intel chips for general purpose computing. The vast majority of these were based on the ARM architecture, but AMD also spun off its fab operations in 2008 (christened GlobalFoundries) and became a fabless designer of x86 chips.
2. Specialized chips, designed by fabless design companies, were increasingly used for operations that had previously been the domain of general purpose processors. Graphics chips in particular were well-suited to machine learning, cryptocurrency mining, and other highly “embarrassingly parallel” operations; many of those applications have spawned specialized chips of their own. There are dedicated bitcoin chips, for example, or Google’s Tensor Processing Units: all are manufactured by TSMC.
3. Meanwhile TSMC, joined by competitors like GlobalFoundries and Samsung, was investing ever more in new manufacturing processes, fueled by the revenue from the previous two factors in a virtuous cycle.
Intel’s Straitjacket
Intel, meanwhile, was hemmed in by its integrated approach. The first major miss was mobile: instead of simply manufacturing ARM chips for the iPhone the company presumed it could win by leveraging its manufacturing to create a more-efficient x86 chip; it was a decision that evinced too much knowledge of Intel’s margins and not nearly enough reflection on the importance of the integration between DOS/Windows and x86.

Intel took the same mistaken approach to non general-purpose processors, particularly graphics: the company’s Larrabee architecture was a graphics chip based on — you guessed it — x86; it was predicated on leveraging Intel’s integration, instead of actually meeting a market need. Once the project predictably failed Intel limped along with graphics that were barely passable for general purpose displays, and worthless for all of the new use cases that were emerging.

The latest crisis, though, is in design: AMD is genuinely innovating with its Ryzen processors (manufactured by both GlobalFoundries and TSMC), while Intel is still selling variations on Skylake, a three-year-old design. Ashraf Eassa, with assistance from a since-deleted tweet from a former Intel engineer, explains what happened:

According to a tweet from ex-Intel engineer Francois Piednoel, the company had the opportunity to bring all-new processor technology designs to its currently shipping 14nm technology, but management decided against it.

my post was actually pointing out that market stalling is more troublesome than Ryzen, It is not a good news. 2 years ago, I said that ICL should be taken to 14nm++, everybody looked at me like I was the craziest guy on the block, it was just in case … well … now, they know

— François Piednoël (@FPiednoel) April 26, 2018

The problem in recent years is that Intel has been unable to bring its major new manufacturing technology, known as 10nm, into mass production. At the same time, the issues with 10nm seemed to catch Intel off-guard. So, by the time it became clear that 10nm wouldn’t go into production as planned, it was too late for Intel to do the work to bring one of the new processor designs that was originally developed to be built on the 10nm technology to its older 14nm technology…

What Piednoel is saying in the tweet I quoted above is that when management had the opportunity to start doing the work to bring their latest processor design, known as Ice Lake (abbreviated “ICL” in the tweet), [to the 14nm process] they decided against doing so. That was likely because management truly believed two years ago that Intel’s 10nm manufacturing technology would be ready for production today. Management bet incorrectly, and Intel’s product portfolio is set to suffer as a result.

To put it another way, Intel’s management did not break out of the integration mindset: design and manufacturing were assumed to be in lockstep forever.

Integration and Disruption
It is perhaps simpler to say that Intel, like Microsoft, has been disrupted. The company’s integrated model resulted in incredible margins for years, and every time there was the possibility of a change in approach Intel’s executives chose to keep those margins. In fact, Intel has followed the script of the disrupted even more than Microsoft: while the decline of the PC finally led to The End of Windows, Intel has spent the last several years propping up its earnings by focusing more and more on the high-end, selling Xeon processors to cloud providers. That approach was certainly good for quarterly earnings, but it meant the company was only deepening the hole it was in with regards to basically everything else. And now, most distressingly of all, the company looks to be on the verge of losing its performance advantage even in high-end applications.

This is all certainly on Krzanich, and his predecessor Paul Otellini. Then again, perhaps neither had a choice: what makes disruption so devastating is the fact that, absent a crisis, it is almost impossible to avoid. Managers are paid to leverage their advantages, not destroy them; to increase margins, not obliterate them. Culture more broadly is an organization’s greatest asset right up until it becomes a curse. To demand that Intel apologize for its integrated model is satisfying in 2018, but all too dismissive of the 35 years of success and profits that preceded it.

So it goes.

Agile Vector
May 21, 2007

scrum bored



Hugh G. Rectum posted:

upgraded from one of these in 2017 to a ryzen 7 1800X cause it was on sale for $100 off. even at 4.5Ghz the 2500k just couldn't keep up. haven't owned an amd cpu since my dual athlon 2500+ MP machine. it's good to be back.

go amd go

i feared this was the case. I can tell mine is limiting things with the new card, but it's okay for now

if that article is right, in a year or two maybe it'll be amd again after I, too, haven't touched them since the barton 2500

Hugh G. Rectum
Mar 1, 2011

Agile Vector posted:

i feared this was the case. I can tell mine is limiting things with the new card, but it's okay for now

if that article is right, in a year or two maybe it'll be amd again after I, too, haven't touched them since the barton 2500

in destiny 2 with a 980 I went from 80-100fps and 100% CPU usage to solid 144 at 25-30% cpu on the ryzen. it's a big upgrade.

The Management
Jan 2, 2010

sup, bitch?

Ripoff posted:

it’s this - Apple likes vertical integration whenever possible so I’m kinda surprised they haven’t decided to buy intel and make a-chips in-house instead of relying on one of their literal competitors for components

Apple is an engineering and operations company; it doesn’t do manufacturing. that’s a low margin, high capital expenditure commodity business. that includes chip fabrication.

apple multi-sources everything they can from suppliers to guarantee supply and quality. Apple pre-buys commitments to lock competitors out of supply, get the best price, and sometimes corner the market on components. Apple squeezes them on margin when they compete and become commoditized.

Apple does not do manufacturing, life is much better for them being the big customer.

Agile Vector
May 21, 2007

scrum bored



Hugh G. Rectum posted:

in destiny 2 with a 980 I went from 80-100fps and 100% CPU usage to solid 144 at 25-30% cpu on the ryzen. it's a big upgrade.

dang. after you mentioned it earlier, i monitored mine and yeah, im definitely reaching the cpu limit on some games and it's holding it back

my display is only 60hz, so for a bunch of games it's fine but for some larger mmos the bottleneck is pretty clear and dropping the gpu settings does nothing to improve it until its limiting models

i had to replace some bad ram, so I'm going to milk it for all it's worth but I guess it's time to consider this only a fine cpu

intel: just fine


Hugh G. Rectum
Mar 1, 2011

Agile Vector posted:

dang. after you mentioned it earlier, i monitored mine and yeah, im definitely reaching the cpu limit on some games and it's holding it back

my display is only 60hz, so for a bunch of games it's fine but for some larger mmos the bottleneck is pretty clear and dropping the gpu settings does nothing to improve it until its limiting models

i had to replace some bad ram, so I'm going to milk it for all it's worth but I guess it's time to consider this only a fine cpu

intel: just fine

yeah it depends on the game but its just freakin old as dirt at this point. even at 4.5Ghz it was a huge bottleneck.
