Xae
Jan 19, 2005


That is a ton more power for 500MHz. They must be pushing these as hard as they can.

Xae
Jan 19, 2005

Lolcano Eruption posted:

It's perfectly fair to use stock RAM speeds for Ryzen.

Not when you're running PC3200 on the comparison chip, which is what the benchmark did.

Xae
Jan 19, 2005

Craptacular! posted:

Explain this to a person who doesn’t really know how operating systems work

Hardware can send signals to the main processor called Interrupt Requests, or "IRQs". These signals do a poo poo load of the work coordinating the GPU and CPU.

It looks like the signals were getting ignored because some stuff wasn't properly implemented.
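Roughly speaking, interrupt handling is a dispatch table: the hardware raises an IRQ number and the OS looks up and runs whatever handler a driver registered for it. A toy Python sketch (illustrative only, not real kernel code; the IRQ numbers and names are made up) of the "signals getting ignored" failure mode:

```python
# Toy model of IRQ dispatch. Real handlers live in the kernel; this just
# shows what happens when a line has no registered handler.
handlers = {}

def register_irq(irq, handler):
    """Driver registers a callback for a given IRQ line."""
    handlers[irq] = handler

def raise_irq(irq):
    """Hardware asserts an IRQ; the OS dispatches to the handler, if any."""
    handler = handlers.get(irq)
    if handler is None:
        return "ignored"   # unhandled IRQ: the signal is effectively dropped
    return handler()

register_irq(16, lambda: "GPU serviced")
print(raise_irq(16))  # GPU serviced
print(raise_irq(17))  # ignored -- nothing was registered for this line
```

If the registration step "wasn't properly implemented", every signal on that line falls into the ignored case.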

Xae
Jan 19, 2005

Don Lapre posted:

Edit: I'm wrong. Both parties' agreement is terminated if one is sold.

It also means Intel can't make a 64-bit chip, since AMD holds the license on the 64-bit extensions.

And with multiple antitrust rulings against it in North America and Europe, Intel would have to be dumb as gently caress to play hardball with the x86 license.

Xae
Jan 19, 2005

Bulgakov posted:

intel has gotta be getting explicit or implicit pressure from a big OEM like apple or otherwise if they're so yet-again on boldly chasing the idea of keeping everything comfy on one homemade chunk of silicon

phi is...phi

but them turning a bunch a money towards a future-bound fright of losing the competitive process wars is big time indicative assuming intel management isn't totally ignorant of scraped inside knowledge of future industry-wide plans

I would be really surprised if Intel tried that poo poo again.

They almost got chopped up and sold for parts after the antitrust lawsuits the last time they tried, in the early 2000s.

Then again, they were incredibly brazen about it last time and it kinda worked so who knows.

Xae
Jan 19, 2005

CapnBry posted:

Amazon USA right now has :swoon:
1800X $319
1700X $279
1700 $269

1950X $799
1920X $649
1900X $449

I really do wish I had a Microcenter within 400 miles of me. Seems like they have the best deals on this stuff because a 1700 with the ASRock Pro4 (B350) motherboard is $285 after rebate which is mind boggling.

I might have to break the Bank and build myself a 1950X.

Have they committed to socket TR4 for Ryzen+/2/whatever?

Xae
Jan 19, 2005

That rumor strikes me as wrong about everything.

The core counts are unlikely to change, the claimed clock gains are way too high, and why would they use a 2XXX naming scheme instead of 1X50?


Everything about it seems dumb and bad.

So it is probably right?

Xae
Jan 19, 2005

SwissArmyDruid posted:

I can't see this doing anything to Intel in any actuality. I think of the FDIV bug recall, and they still made out like bandits that year.

Also, how badly do you want to kick Intel when you're partnering with them for the 8809G?

It isn't a company-ending gently caress up for Intel, but it hurts them badly.

Losing performance and having a huge security problem sure isn't going to help them.

If EPYC can actually launch "For REALZ", AMD could easily pick up some volume from companies looking to hedge their bets.

Xae
Jan 19, 2005


I'm a little skeptical because the GPU market is about 15M units/year.

The discrete desktop GPU market anyway.

Xae
Jan 19, 2005

GRINDCORE MEGGIDO posted:

What, realistically, is expected of the respin, is the new process likely to really allow clocks of around 4.3ish, something like that?

Wonder if the memory controller is happier with faster RAM speeds or not.

Around there.

The real gain, at least for gaming, will be if they can get the single core boost up higher. One of the reasons why Coffee Lake does so well on gaming is that it will boost one core up +1GHz.

If the 2700/2800 can get a single core to push 5GHz, they'll be neck and neck with Intel on single-threaded performance.

Xae
Jan 19, 2005

Risky Bisquick posted:

Feb 12th is APU day. 2200G looks like a nice HTPC / mining apu



If ECC support gets straightened out I might do some upgrading to my little home server box.

Xae
Jan 19, 2005

Kazinsal posted:

The gently caress is with some of those Linux distributions just being horrendous at some things compared to the others? Is library choice and performance across different distros so wildly different that you actually in some cases have to pick which distro you want to use specifically based on how poo poo your preferred one will perform in it?

God drat. No wonder Linux people still try to dunk on Windows like the past 20 years haven't happened. Modern Linux is a complete shitshow of platform fragmentation and modern Windows is a stable operating system that works out of the box :stare:

e: Also gently caress Adored, dude is seriously like an Alex Jones wannabe who wishes he could be Linus Sebastian.

If you're talking about 3D performance then yeah, it is a complete shitshow of who is using what drivers and whether they're closed source or open source.

Xae
Jan 19, 2005

spasticColon posted:

Why would buttcoin miners buy these APUs? Pentiums and Celerons are still cheaper if they need cheap CPUs for their mining rigs.

Because an APU has a decent built in GPU.

Xae
Jan 19, 2005

GRINDCORE MEGGIDO posted:

Linus seems curious how the worst solutions will actually end up working. Which I find commendable.

Learning how to deal with the issues that arise from implementing "the worst design ever" is like two thirds of IT.

Xae
Jan 19, 2005

AVeryLargeRadish posted:

That doesn't work, at least for me, I've done that and I just still see the same recommendations, it's really annoying.

Once THE ALGORITHM has determined that you will like a video/channel there isn't a drat thing you can do to get it to gently caress off.

Xae
Jan 19, 2005

There was a video a while back where someone tried to get an Epyc to work in a TR4 socket.

It is physically compatible, but there are differences in the memory controller that prevent it from booting.

Since the socket can physically take a four-die Epyc package, it is very possible for AMD to roll out a 32-core TR.

Xae
Jan 19, 2005

GRINDCORE MEGGIDO posted:

Also I guess a lot of game engines were built during the long term period of 100% AMD suckage.

It is more that even if an engine uses 8 threads it can still bottleneck on a single thread. That single thread is usually the core engine or the thread managing the render calls.

Xae
Jan 19, 2005

bobfather posted:

It’s a hacked EBay account.

Or some processors fell off the back of a truck.

Xae
Jan 19, 2005

SourKraut posted:

Wonder if he’d still have been retained if they were doing better on the CPU side.

It could be a face-saving move; an affair is much less embarrassing than losing a 5-year lead in 5 years.

Xae
Jan 19, 2005

How big is the new Intel socket going to have to be to handle Raja's shroud?

Xae
Jan 19, 2005

TheCoach posted:

How many times has intel totally entered the GPU game by now?

It's been on a 5-year cycle since the late 90s.

Xae
Jan 19, 2005

Paul MaudDib posted:

How in the world did BK keep his job for so long?

Because Intel was "6 months away" from 10nm for 5 years.

Xae
Jan 19, 2005

GRINDCORE MEGGIDO posted:

How much throughput do modern SPARC processors put out per dollar these days?

About 2-3% goes back to the Oracle sales guy who conned you into buying it.

IBM does the same poo poo with POWERPC.

Ask me about replatforming an Oracle database from Linux/x86 to AIX/POWERPC in TYOL 2016.

The same company also stood up a new DB instance in 2015.

They did not always make good decisions.

Xae
Jan 19, 2005

Broose posted:

Thread count so high you could use it as a thrifty blanket.

How do video games play with THREADRIPPER's? Is a normal processor of similar GHz better due to whatever reason involving the multichip set-up? I wish I had a reason to actually entertain the thought of getting a THREADRIPPER. But all I do is play video games, I don't even stream or do anything creative.

It does nothing for games. Most games bottleneck hard on one thread. That is why the hex-core Intels get higher gaming benchmarks than the octa-core AMDs in most games.

For games the thread load is asymmetrical: there are N minor threads doing small things, and then usually one render or core-engine thread that bottlenecks.

So even though you have games that "use" 8 or 10 or even 20 threads, you get more out of single-threaded performance than out of having a ton of cores.
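To put a number on that: this is just Amdahl's law. A toy Python sketch — the 40% serial share is purely illustrative, not a measured figure for any real engine:

```python
def amdahl_speedup(serial_fraction, cores):
    """Max speedup when `serial_fraction` of the work can't be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# If 40% of each frame is stuck on one render/engine thread (made-up number):
for cores in (4, 8, 16):
    print(cores, round(amdahl_speedup(0.4, cores), 2))
# The speedup flattens out below 1/0.4 = 2.5x no matter how many cores you add,
# which is why single-threaded clocks matter more than core count here.
```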

Xae
Jan 19, 2005

Combat Pretzel posted:

Yeah, I don't get the drama home users create about these exploits.

Home users have to be very worried about proof-of-concept exploits.

*Downloads and installs dozens of Skyrim mods*

Xae
Jan 19, 2005

Munkeymon posted:

I bought a UPS in college after the dorm wiring killed my third PSU and it was a champ for 10+ years until new batteries wouldn't stop it from alarming anymore.

A good UPS and a good power supply are the best things you can buy to extend the life of your components.

A good UPS will last through multiple builds, and if it prevents even one component from burning out it has paid for itself.

Xae fucked around with this message at 19:20 on Mar 28, 2019

Xae
Jan 19, 2005

What even uses per-socket licensing anymore?

Everything I can think of switched to per-core a while ago.
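A back-of-envelope comparison of the two models (all prices hypothetical) shows why vendors made the switch: per-socket pricing leaves money on the table as core counts climb.

```python
def per_socket_cost(sockets, price_per_socket):
    """Old model: flat fee per physical socket, regardless of core count."""
    return sockets * price_per_socket

def per_core_cost(sockets, cores_per_socket, price_per_core):
    """Newer model: the bill scales with total cores."""
    return sockets * cores_per_socket * price_per_core

# Hypothetical numbers: a 2-socket box with 32-core chips.
print(per_socket_cost(2, 10_000))    # 20000 -- same bill for 4-core or 32-core chips
print(per_core_cost(2, 32, 1_000))   # 64000 -- the vendor captures the core growth
```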

Xae
Jan 19, 2005

It almost sounds like an Optane cache, but an AMD processor with Optane support would be unusual, to say the least.

Xae
Jan 19, 2005

Broken Machine posted:

You jest, but the Air Force literally did build a computing cluster out of PS3s, because it was the most cost-effective choice, due to the speed of calculating matrices.

https://phys.org/news/2010-12-air-playstation-3s-supercomputer.html

Also because the hardware was sold below cost.

The_Franz posted:

Did they even care at that point? The Cell was interesting for a short period, then Nvidia debuted CUDA and the scientific market lost interest almost immediately. IBM declared the Cell a dead end and bowed out long before Sony patched out linux support.

Running Linux on the PS3 was always a pretty miserable experience since IO was extremely slow and it had a pathetic amount of memory available. I did get :10bux: from the class action lawsuit though.

They cared because they had sunk a few million into something that suddenly broke. Even if you can replace it with something better, it's going to cost money.

Super computers usually run until they physically break down.
