Laslow
Jul 18, 2007
One warning about LTSB, now called LTSC, is that it doesn't get new hardware support until the next version.

So if your CPU comes out after the latest LTSC release, you have to wait for the next one.

I think you may be able to download drivers and install them manually if they'll run on builds 1607/1809. Just be aware that out of the box, it'll likely load up generic ACPI PC drivers for your chipset/CPU until then.
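If anyone ends up doing that, here's roughly how I'd batch-load a downloaded driver bundle using Windows' built-in pnputil. Just a sketch, run it elevated, and the C:\Drivers\chipset path is made up, point it at wherever you stuck the vendor download:

```python
# Hypothetical helper: stage every .inf under a downloaded driver bundle
# using Windows' built-in pnputil. Run from an elevated prompt; the
# C:\Drivers\chipset path is only an example.
import subprocess
from pathlib import Path

def stage_drivers(bundle_dir: str) -> None:
    for inf in Path(bundle_dir).rglob("*.inf"):
        # /add-driver stages the package in the driver store; /install also
        # binds it to any matching hardware right away.
        subprocess.run(
            ["pnputil", "/add-driver", str(inf), "/install"],
            check=False,  # some INFs in a bundle won't apply to your hardware
        )

if __name__ == "__main__":
    stage_drivers(r"C:\Drivers\chipset")
```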

Laslow
Jul 18, 2007
I think I might target a Zen 2 Threadripper and multiple video cards for IOMMU for my next build, then I can have like 4 8c16t computers in one, all at native speed. For what purpose? Just to have it, I suppose.

I've been suspecting Zen 2 is a good spot to get on the AMD bus for a long while now, ever since the Zen 1 release, which was basically Broadwell IPC but way more cores for your money. That's just enough time for the arch to mature and for my Devil's Canyon workstation to age enough to justify it, especially if its IPC gains put it ahead of Skylake.

Laslow
Jul 18, 2007

NewFatMike posted:

:same: except R7 1700. I'll donate the CPU, mobo, and RAM to my startup to be the dedicated Render Bender.

Anyone else praying for 8C/8T being Zen2's entry level/mobile starting point?
That would be nice, but even 6c/12t would be cool, seeing as the 5820K was HEDT not all that long ago. It's so cool to finally see some progress on CPUs again, with raw CPU power and core count getting cheaper year over year now, as opposed to whatever the gap was between Sandy Bridge and........
...
...
Coffee Lake? Jesus Christ!

Laslow
Jul 18, 2007
A quick googling suggests the difference between PCI-E 3.0 x8 and x16 on a GTX 1080 was less than 1%.

PCI-E 4.0 will benefit superfast storage more than anything.

I recall an article back in the day where high-end-at-the-time PCI-E 2.0 video cards weren't significantly affected until they were dropped down from x4 to x1. I guess the reason they were all x16 anyway comes down to better power delivery and some meaningless spec-sheet number boosts.
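Back-of-the-envelope numbers, if anyone wants to sanity-check the x8 vs x16 thing. The encoding overheads are per the PCI-E specs, the rest is just multiplication:

```python
# Rough theoretical PCI-E bandwidth per direction: transfer rate (GT/s)
# times lane count, minus line-encoding overhead. Real-world is lower.
GENERATIONS = {
    # gen: (GT/s per lane, encoding efficiency)
    2.0: (5.0, 8 / 10),     # 8b/10b
    3.0: (8.0, 128 / 130),  # 128b/130b
    4.0: (16.0, 128 / 130),
}

def bandwidth_gb_per_s(gen: float, lanes: int) -> float:
    rate, eff = GENERATIONS[gen]
    return rate * eff * lanes / 8  # /8 to convert Gbit/s to GB/s

for gen, lanes in [(3.0, 8), (3.0, 16), (4.0, 16), (2.0, 1)]:
    print(f"PCI-E {gen} x{lanes}: ~{bandwidth_gb_per_s(gen, lanes):.1f} GB/s")
```

Even 3.0 x8 is roughly 7.9 GB/s each way, which is why a GTX 1080 barely notices the drop from x16.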

Laslow
Jul 18, 2007
A good leak would have had a couple 14nm Excavator parts thrown in there for flavor.

Laslow
Jul 18, 2007
I doubt both systems will use the exact same chip. Both using AMD APUs, sure, of course, but not identical ones.

Laslow
Jul 18, 2007

iospace posted:

I feel the processor market is a bit cyclical. AMD took over in the early 2000s for a year or so when they came out with x86-64, a 64 bit architecture that didn't require a reinvention of the wheel like Intel's did. Intel then took over once they worked their kinks out.

However, this feels a bit different. The side-channel vulnerabilities, coupled with the die-shrink issues, means they'll be lagging behind for more than a year in this case. Add on that AMD has really good multi-threading and you can see them making significant inroads into Intel's market share.
It is different because last time it had more to do with Intel loving up by ditching P6 for NetBurst and using the incompatible and expensive Itanium as their sole 64-bit effort, while AMD stuck with incremental changes to the K7.

K8 wasn't really faster because it was 64-bit, but it went a long way toward the marketability of those CPUs and made AMD look like a legitimate market leader for the first time.

This time Intel doesn’t have any old designs to fall back on to bail them out.

Laslow
Jul 18, 2007
It’s not like Intel doesn’t have the cash to license process tech for use in their own fabs.

Can anyone more knowledgeable in the workings of the semiconductor industry explain the business or technical reasons why they can’t or won’t?

I know this is the AMD thread and all, and I don’t mean to derail, I’m just really curious.

Laslow
Jul 18, 2007
Maybe they don't think Ryzen APUs are going to be very impressive regardless of whether it's Vega, Navi, or something actually good in there, since they're all saddled with DDR4 at the moment.

We can't really guess at the moment how much DDR5 will boost those APUs (hopefully into decent 1080p 3D territory), but maybe they've got some idea.
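For a rough idea of what the iGPU has to work with, the usual dual-channel back-of-envelope math is below. The DDR5 speed bin is my guess, not tied to any announced part:

```python
# Peak theoretical memory bandwidth = MT/s * bus width in bytes * channels.
# The speed bins below are illustrative, not specific to any APU.
def dual_channel_gb_per_s(mt_per_s: int, bus_bytes: int = 8, channels: int = 2) -> float:
    return mt_per_s * bus_bytes * channels / 1000  # GB/s

print(f"DDR4-3200 dual channel: ~{dual_channel_gb_per_s(3200):.1f} GB/s")
print(f"DDR5-6000 dual channel: ~{dual_channel_gb_per_s(6000):.1f} GB/s")
```

Going from ~51 GB/s to ~96 GB/s is the kind of jump that actually moves the needle for an iGPU, which is presumably what they're banking on.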

Laslow
Jul 18, 2007
Kind of sucks. I want that 3950X to replace my aging Haswell E3, but 4K60 will still be a $1,200 pipe dream GPU-wise, even waiting until September.

Laslow
Jul 18, 2007

Stanley Pain posted:

420, Smoke AMD erryday. :catdrugs:
I had a Thoroughbred Athlon XP scorch through the stock thermal pad.

I also had a Sledgehammer Opteron that got pretty close. They both ran fine, but I'm glad I thought to repaste them after a couple of years.

Laslow
Jul 18, 2007

SwissArmyDruid posted:

Someday we will have a combined x86 thread, but until then, these are the kinds of jokes we'd never be able to get away with without some troglodyte getting butthurt.
The Intel thread is pretty self-aware at this point. I'm even a repeat Intel customer and their stuff has been super reliable for me, but they're a tough sell and hard to defend at the moment, even for me.

Laslow
Jul 18, 2007

CaptainSarcastic posted:

Part of me hopes for a comedy option board that can run either DDR4 or DDR5 but not both at once, like in the transition to DDR2 days.
I built a system on one of those boards (open box, lol) so I could use spare DDR I had lying around. It was for a friend who was broke and wanted to play WoW with me and some friends. Between the open-box Duron, refurb Radeon X600, and literally the cheapest case with an included power supply at CompUSA, he had a ridiculously capable machine for like 75 bucks.

So much so that a friend with a top-of-the-line Dell with a P4HT got jealous, because WoW ran like dog poo poo on its included FX5200 or MX440 or whatever POS was in that thing. He had me order another dirt cheap X600 for him immediately, and he was so pissed at the difference a $40 GPU made, and at the fact that they'd even sell him such trash for the express purpose of gaming when that GPU was so cheap relative to the total price tag of his system.

The problem is, nowadays motherboard manufacturers would put so much of a premium on such a niche goofball comedy mobo that it'd hardly be worth it. Gone are the days where you could get the absolute top-of-the-line board for $65 if you were willing to play the refurb game. :(

Laslow
Jul 18, 2007

Farmer Crack-rear end posted:

There were also SDR/DDR boards too - the ECS K7S5A comes to mind.
That board was legendary/infamous. The fact that I can recall it off the top of my head nearly 20 years later from just the terribly generic model number is a testament to that.

Laslow
Jul 18, 2007

PC LOAD LETTER posted:

There used to be mobos with support for multiple consumer sockets and stuff too, to act as 'transitional upgrade' boards. They were usually poo poo and made by guys like PCChips, but the idea itself is cool as heck. Here is one that supports socket 370 and slot 1: PC Chips M741LMRT

https://cdn8.bigcommerce.com/s-a1x7...c=2?imbypass=on
Let's not forget about slockets either. Slot 1 to socket 370 adapters were probably the most popular. I think the closest thing to that in recent memory was a sticker that'd let you use a socket 771 chip on a 775 motherboard by covering specific LGA contacts, which is not only ingenious but also mind-blowing in that it actually works.

Laslow
Jul 18, 2007

movax posted:

Forget performance numbers, 5xxx series it is!

Why is naming chips so loving hard?

4xxx already exists though? Just because they’re not interesting for leet gaming rigs doesn’t mean they don’t exist. Same thing happened with the i7 5xxx series, Broadwell.

Laslow
Jul 18, 2007
And don't get me started on "Core i7". The name actually made sense: it was supposed to mean it was the successor to the P6/i686 line, basically shorthand for i786. Until marketing hosed that up, of course.

Laslow
Jul 18, 2007

Theris posted:

Edit: I think the problem you run into in trying to figure this out is how big a change to the microarchitecture justifies an x86 generation increase anyway? IIRC there was a bigger difference between Core 2 and the "686" Core than there is between Core 2 and original i7 Nehalem.
Exactly.

Although it's still called i686 for 32-bit builds of most open source software. The 64-bit builds are just "x86-64" or even "AMD64", or "x64" in Microsoft's case, to differentiate it from "64-Bit Edition", which was the Itanium builds of Windows.
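You can watch the naming mess in action from Python, for what it's worth. On most Linux boxes the same 64-bit architecture reports as x86_64, while Windows reports AMD64:

```python
# Quick illustration of one architecture answering to different names
# depending on who you ask.
import platform

print(platform.machine())           # 'x86_64' on most Linux boxes, 'AMD64' on Windows
print(platform.architecture()[0])   # '64bit' either way
```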

Got it? Good.

Laslow
Jul 18, 2007
I want ASRock Rack to make GPU cards with the blandest aesthetics possible. Anything that's not covered by a functional-looking cooling solution is just plain green PCB, and it'd be called "ASRock Rack 3D Accelerator - NVidia GA102 chipset - 24GB". And since there are supply constraints anyway, other than having their own sales guys get CDW, Techdata, Ingram, etcetera's sales guys to make IT and retail purchasing aware of their existence, they can spend zero dollars promoting them.

Laslow
Jul 18, 2007

Cygni posted:

u know i had that ASUS K8N nForce3 250 board with a Sempron 3100+ OCed and hard modded to be an A64 baby, nForce boys rise up
SK8N nForce3 Pro/Opteron. I didn’t even wait for them to release regular desktop CPUs/chipsets before jumping ship, I was that disloyal/impatient.

Laslow
Jul 18, 2007

MeruFM posted:

16 core/32 thread 128gb 3090 custom loop posting machine
5400RPM HDD

Laslow
Jul 18, 2007
Finally AMD can get into a real growth business, boutique bespoke FPGA SNES systems. Get Keller and kevtris on board and they’ll be unstoppable!

Laslow
Jul 18, 2007
Doesn't PBO essentially make the distinction between non-X, X, and XT parts pretty much just the number of letters printed on the box? Apart from the included HSF, I guess.

Laslow
Jul 18, 2007
It's also so much worse for Intel this time around because they're stuck with what they've got. It's not like they're here because of hubris, holding out on NetBurst too long while simultaneously wishing a decent compiler for Itanium into existence. They don't have a plan C, like the good P6 designs, the Banias/Dothan cores, to fall back on and iterate into Conroe, Destroyer of Worlds. Some OEMs even made socket 479 desktop boards with something like 1.6GHz Pentium Ms that were on par with much more expensive 2GHz Opterons in performance, and with better power consumption.

If only they had it so easy now.

Laslow
Jul 18, 2007

Indiana_Krom posted:

The software has also been known to expose massive security holes. But this is hardly unique to Gigabyte, honestly nobody should run motherboard vendor utilities.
Has anything good ever happened after downloading an SFX ZIP EXE from ftp.chainstarrock.com.tw at 74kbps?

What’s worse is you’ve got no choice if you want BIOS updates. “Hey, you know this piece of software that could brick someone’s system if it’s corrupted? Let’s host it on a server on Typhoon Island, Taiwan and if they’re worried about sharks chewing on the transpacific underwater cable at just the wrong time, we’ll put their mind at ease by putting it on a webpage that resembles an Angelfire warez portal!”
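At minimum I'd hash the download before flashing anything. A minimal sketch, assuming the vendor actually publishes a checksum somewhere; the filename and hash here are placeholders:

```python
# Verify a downloaded BIOS image against a vendor-published SHA-256
# before you even think about flashing it. Filename/hash are made up.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "0123abcd..."  # whatever the vendor lists on the download page
actual = sha256_of("X570_bios_v3.40.zip")
print("OK to flash" if actual == expected else "Corrupted download, try again")
```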

Laslow
Jul 18, 2007
Also try leaving the CMOS battery out for longer. I thought a motherboard got toasted during a move; I tried removing the battery for maybe a minute and got nothing. Pulled it out again and left it out while I went outside and had a smoke in frustration. Maybe 15 minutes later I popped it back in, gave it one more try, and it came back to life. I guess it just wasn't getting discharged 100%. That said, be sure the power cable is unplugged too, just in case.

Laslow
Jul 18, 2007

Seamonster posted:

I paid full price for a 2500k one time, at launch. The next cpu I bought was a 3800x so by that measure Intel barely got a peek.
i’m seeing a lot of people upgrading from sandy bridge to zen and then again to a newer zen. and i’m figuring that they’ll do it again for ddr5. if they don’t get their poo poo together, intel’s gonna lose all the ones that landed on skylake that they haven’t already.

Laslow
Jul 18, 2007
As an early adopter of DDR3, I had first-generation sticks that wanted 1.8V(!), rendering them completely incompatible with later sticks, lol.

Laslow
Jul 18, 2007

Twerk from Home posted:

At least you got that triple channel RAM! Actually this is the AMD thread, so maybe I should be saying “condolences on your Phenom”.
It was an Intel X48 board.
X48 was dual channel; X58 with triple channel came at least a year later. I don't know, maybe Phenoms got DDR3 a little sooner though.

Laslow
Jul 18, 2007

CommieGIR posted:

I've seen a couple, and yeah they largely lock you out of even modest performance for these older cpus.

thankfully mine live in a HP and Dell Bladecenter.
My Z97 xtreme gamer board failed and replacements were over $100. I got a Dell Precision workstation motherboard for like $30; after the adapters for putting it in a mATX case/PSU, probably about $55 total. I did have a locked Xeon though, so losing OC options didn't hurt.

Laslow
Jul 18, 2007

Mofabio posted:

Man, that Van Gogh APU's what's gonna lighten my wallet. An APU that fixes the iGPU memory bandwidth issue with DDR5 and a 256-bit bus, in a potential 40w TDP NUC. That thing's gonna play PC games from the 80s till the mid-2010s and cost mid-range $$. Mmm mmm.
Yeah, I like the idea of not having to scrounge for a cheap slot-powered 1050 Ti/1650 on builds for friends and family. I always want desktops I build to have decently capable 3D; otherwise they get a MacBook Air. Even if they're not gamers it usually works out, like my wife wanting to play Fallout 4 or my brother wanting to play Madden some time after the machine's built, and then they're already set.

Laslow
Jul 18, 2007
The current state of desktop APUs is kinda lame. APUs are an ideal solution for people like me who end up building systems for friends and family and want to eliminate the need for a token bus-powered VGA card just to get moderately decent 3D, because they always end up wanting to run Minecraft or Civ 4 or some such nonsense. Even more so now if you've seen the prices of even a GT740!

I feel like once DDR5 and that new sillycache stuff finally hits they’ll be over a big hump and be in a really good spot though. Whether or not they choose to actually make some for people to buy is another thing.

Laslow
Jul 18, 2007
I hope it's not some insurmountable problem that the engineers foisted on the future to deal with, without any idea of even the direction in which to begin looking for solutions. Or did they use all their foresight to determine it wouldn't be a noticeable enough problem before their stock options are fully vested?

We all saw how effective wishing super hard and dumping the problem on compiler developers was for Intel and IA64/Itanium, for example. There's no doubt there are executives out there who believe there's no problem that can't be overcome given sufficient market pressure, completely ignoring little things like thermodynamics, or physics at all.

Laslow
Jul 18, 2007

gradenko_2000 posted:

has AMD ever outright said/marketed that their APUs are there so you can game on them without a video card, as opposed to Intel's iGPUs being there to drive Windows and Excel and to share such functionality with their laptop chips (at least prior to Xe)?
Well, I had a Vaio laptop with an A6 in it that had a full RADEON GRAPHICS badge on it, even though it was something laughable like an HD 4230D. It was enough for gathering crafting materials in WoW while at work though.

I don't know if that counts, but I don't see why they'd advertise the graphics at all if some kind of gaming wasn't implied; it's not going to improve my Excel experience in any meaningful way. No idea about their desktop APUs though.
