Klyith
Aug 3, 2007

GBS Pledge Week

Kazinsal posted:

How do you all keep bending pins on poo poo? Do you take those lovely stock photos that involve using wrenches to remove CPUs as gospel?

:kiddo: ...




Though one time I got a CPU that had a mildly bent pin out of the box, just enough to prevent it from socketing, and it would only have gotten worse if I'd tried to force it. I think a lot of bent pins start out that way, but the person installing the chip is too new or impatient to be gentle. A bent CPU pin isn't *that* hard to fix unless it's totally flattened. You just have to stop and think.

(I've never once hosed up when building / upgrading a PC for a friend, simply because having another person there while you work cuts down on the dumb mistakes caused by impatience. Even when the friend doesn't know pc hardware and isn't actually doing anything, I'm talking through each step and therefore going slower.)

Klyith
Aug 3, 2007

GBS Pledge Week

WhyteRyce posted:

I thought Intel's deal let them charge the same price throughout the life of the console. So instead of the cost going down over the life of the console, Intel kept getting to charge the same price for a laughably out-dated CPU

I think that was the nvidia chipset & GPU they got hosed by.

Intel was willing to undercut to freeze out AMD, but nvidia didn't need to do the same to compete -- this being before the ATI-AMD merger and at a time when nvidia was massively dominant over ATI & PowerVR. Also microsoft didn't really know what they were doing at the time of the first Xbox. There was a neat storytime on the Giant Bombcast where Gerstmann was talking about how MS was pretty crazy and acting like their hardware was gonna set the world on fire, despite having no games at first.

Klyith
Aug 3, 2007

GBS Pledge Week
So what's up with the 6 core ryzens? They don't clock as high as either the 4s or 8s. Are they actually 8 core units with disabled (damaged) cores? Or are they just keeping those a bit hobbled to make the 1700x a better top dog?


Anyways the 1700x looks like a real steal, but I'm also pretty excited about the "mainstream enthusiast" zone. The 1400x and 1200x could be really great alternatives to the standard 7500 or 7600K that pretty much everybody slaps in their gaming pc. About the same performance, save $50, what's not to like? The 7700K is going to stay at the top of all gaming benchmarks though, so people who like to spend $350 for benchmark performance will stick with intel.

Klyith
Aug 3, 2007

GBS Pledge Week

Dante80 posted:

By the time that a game comes out that pushes current CPUs to the point of effectively throttling your gaming experience (for example, sub-60 FPS minimums at your playing resolution), the games existing would already be in need of more cores than what your standard i5 or/and i7 can handle.
Yup, though that's never stopped anyone from buying more CPU than they need or will ever see a real benefit from!

More cores for streaming is a good point though. These days it seems like everyone is streaming their games, whether there's anyone watching or not.

quote:

In other words. If you already have a decent OCing i5/i7 for gaming right now, there is no need to replace it for a ryzen system.
I'm still on an ivy bridge @ 3.2ghz (non-K), which is still adequate but getting a bit long in the tooth. So I was planning on a new build sometime in 2017 anyways. It would warm my cold black heart to be back on AMD again -- I started putting together pcs with athlons back in the day.

But I'm definitely waiting for a full set of reviews before getting out my credit card. One thing I really want to know is idle power efficiency. I like a quiet pc, so if ryzen continues AMD's pattern of drawing 15 watts more than intel at idle, that will be unfortunate. That could be enough that you can't have the fans spun down to inaudible levels.

Klyith
Aug 3, 2007

GBS Pledge Week
On the one hand, zero-discount bundles are absolutely meaningless, and have nothing to do with amd or corsair and everything to do with amazon trying to sell more stuff. Unless you think this bundle says something about the thermals of that CPU, or an intel/thermaltake deal? Water coolers are a thing that nerds who drop $400 on a CPU might be induced to buy.

On the other hand, AMD has to be performance competitive, and in recent years they've frequently gotten there by yelling for more powah!!! AMD's TDP numbers are not the same as other TDP numbers -- when they say a part uses 95 watts, there's no max or maybe about it. That chip is eating those 95 watts and then licking the plate for leftover electrons. Let's not forget that the 480 drew out-of-spec power because it couldn't keep under 150w in real-world use.


EdEddnEddy posted:

Maybe dumb question, but what is the reason to not go all in and put DRAM on the CPU Chiplets themselves as well?
Intel has eDRAM on some Iris Pro chips, and one possible thing they might do if Ryzen really does start eating their lunch is put that on all CPUs in the next cycle. If you're not using it for the on-die graphics it becomes a sort of superior universal cache that sits directly on the memory controller. IIRC people tested mostly-identical mobile chips but with & without the eDRAM, and it was a modest performance gain for many tasks.

edit: ^^^ the above is strictly about general purpose CPUs, I know nothing about HPC

Klyith
Aug 3, 2007

GBS Pledge Week
of course there's also the possibility that Intel has been doing something other than this with their time for the last 5 years. I mean, if we're wildly speculating here, their slowdown in performance improvements could be them keeping stuff in their back pocket that they just haven't bothered with.

But as long as ryzen is at least good enough to compete and force intel to respond in some way I'll be happy -- price drops, big performance improvements, or going super saiyan, I don't care.

Klyith
Aug 3, 2007

GBS Pledge Week

Alereon posted:

Somebody should post a new AMD thread for Ryzen and I will close this one. Dunno if this should be now or when we have more details, but it should happen!

AMD CPU and Platform Discussion: Ryzen from your grave!

Klyith
Aug 3, 2007

GBS Pledge Week

SwissArmyDruid posted:

This opening of preorders ahead of NDA stinks more and more the longer I think about it.

They did it for the 480 as well. Kind of a mixed bag: preorders were useful in that they were the best way to get the thing at launch price; after that the price went up by like $30 for a couple months. Downside was that you got a launch 480.

Nvidia had preorders for founders edition cards as well, come to think of it.

I dunno, preorders are dumb anywhere, but at least they make some sense for a physical product that will probably not have enough stock to meet demand. Better than for digital games anyways.


Seamonster posted:

Its not just the chips themselves that determine overclockability - you need good motherboards too and those will also get better with time.
They're gonna be substantially the same mobos as for recent FX chips though, right? Anything that can keep an FX happy should be able to meet the more modest needs of Ryzen.

Klyith
Aug 3, 2007

GBS Pledge Week

priznat posted:

So are AMD chips like intel with some pcie lanes coming direct from the cpu and then some more coming from the pch?

I see in the specs for the x370 24 pcie lanes, is that chipset only and then the cpu has 16 gen3 as well? Or just 24 total?

Yes, the chipset has additional pcie lanes. The x370 has 8 pcie lanes from the chipset, making 24 total with the CPU's 16.

Pretty tight when you want to have an M.2 x4 slot plus two slots capable of doing SLI video cards, even if they're choked to x8 each.
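Quick back-of-the-envelope on how fast those 24 lanes get spoken for -- just a sketch using the numbers above, and the slot split is an assumed example rather than a real X370 board's lane map:

code:
# Rough lane-budget arithmetic (Python). The allocation below is an
# assumption for illustration, not an actual board layout.
cpu_lanes = 16       # from the CPU, split x8/x8 when both GPU slots are used
chipset_lanes = 8    # from the X370 chipset, as discussed above

allocations = {
    "GPU slot 1 (x8)": 8,
    "GPU slot 2 (x8)": 8,
    "M.2 NVMe (x4)": 4,
}

total = cpu_lanes + chipset_lanes
spent = sum(allocations.values())
print(f"{spent} of {total} lanes spoken for, {total - spent} left over")
# -> 20 of 24 lanes spoken for, 4 left over for NICs, USB/SATA controllers,
#    and x1 slots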

Klyith
Aug 3, 2007

GBS Pledge Week

priznat posted:

drat that seems really stingy compared to intel pchs allowing up to 24 gen3 additional now with kaby lake. Just add the cost of a pcie switch on there I suppose.

I'm not sure, but it could be a little bit apples and oranges; the block diagrams of intel mobos always seem to consume pcie lanes for "built in" stuff.

But yeah. No matter how many pcie multiplexers they throw at the problem, they can't fit the triple SLI support, dual nics, and not one but two m.2 slots that the xxxtreme gamer x270 boards have. The high end x370 boards are gonna be a bit pitiful by comparison. Like this thing that ASUS has the audacity to charge $250 for, mostly for the ROG label.

Personally I don't care much, 24 is plenty for anything I need. Heck, I want 1 old PCI slot just so I don't have to replace my sound card.

Klyith
Aug 3, 2007

GBS Pledge Week

when you think about it, case windows have enormous negative value

Klyith
Aug 3, 2007

GBS Pledge Week
I could maybe imagine that a compute product built from Iris Pro parts could be good, but a consumer graphics (ie gaming) product? Nah man. Just nah.

Beyond the fact that Iris Pro kinda sucks (a chip with dedicated memory outperforms a thing that relies on system memory? you don't say!), there is just a near-impossible cliff of game-specific tuning and optimization that AMD & Nvidia have worked on for decades. They do huge amounts of work to wring the extra performance out of (often shoddy) game code, or "cheat" in ways that are perceptually unnoticeable but give a tiny edge in FPS. Meanwhile intel sits back, implements hardware & drivers to the specs, and gets by on minimum effort.


I think intel could be competitive in that market, but as long as they have their dominant position they've never needed to be. And the years of up-front cost to catch up aren't worth it. Maybe if things had gone a different way around the start of Core 2, with AMD staying neck-and-neck on CPU performance but also acquiring ATI and becoming better in integrated graphics... But that's a lot of alternate history to handwave.

Klyith
Aug 3, 2007

GBS Pledge Week

Kerbtree posted:

Eh, rowhammer is a thing.

Measly Twerp posted:

And consumers everywhere gave not one single poo poo.

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

Consumers probably should care about Rowhammer given that there's the potential for some javascript to bypass the entire CPU "ring" security model - potentially even to the level of the management engine or secure enclave, from which it is 100% impossible to extract an attacker (by design).

ECC alone isn't a good enough protection against rowhammer, so no.

e: Since the memory corruption can possibly flip more than one bit, and ECC can only correct 1-bit errors. If your system isn't detecting the rowhammer attack, multiple attempts eventually work. But current and near-future hardware has protections against this type of attack without the need for ECC.
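To put a toy number on the "multiple attempts eventually work" part -- this is a sketch, not a real rowhammer model, and the per-bit flip probability is made up purely for illustration:

code:
# Toy model (Python): SECDED ECC corrects one flipped bit per word and only
# *detects* two, so any attempt that flips 2+ bits in a word gets past
# correction if nothing reacts to the detection. P_FLIP is an assumption.
import random

P_FLIP = 1e-3        # assumed chance each bit flips per hammer attempt
WORD_BITS = 64
ATTEMPTS = 100_000

random.seed(0)
multi_bit = 0
for _ in range(ATTEMPTS):
    flips = sum(random.random() < P_FLIP for _ in range(WORD_BITS))
    if flips >= 2:   # beyond what SECDED can correct
        multi_bit += 1

print(f"{multi_bit} of {ATTEMPTS} attempts flipped 2+ bits in one word")
# Even a tiny per-attempt chance compounds into near-certainty when the
# attacker can just keep hammering, which is why ECC alone isn't a fix.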

Klyith fucked around with this message at 09:57 on Feb 24, 2017

Klyith
Aug 3, 2007

GBS Pledge Week
Goons: consumers should care about ECC memory on their home fileservers to protect against single-bit errors in their next-gen journaling filesystems corrupting backups!

Consumers: :confused:

Prosumers: Yes, backups! You should make backups! I got this nifty hard drive box from seagate that does backups when I push the button.

Klyith
Aug 3, 2007

GBS Pledge Week

Combat Pretzel posted:

--edit: ^^^ An urban legend? Really? I thought it was proved that it does happen?

It can and does happen. The question is, how often?

https://en.wikipedia.org/wiki/Soft_error posted:

One experiment measured the soft error rate at the sea level to be 5,950 failures in time (FIT) per DRAM chip. When the same test setup was moved to an underground vault, shielded by over 50 feet (15 m) of rock that effectively eliminated all cosmic rays, zero soft errors were recorded.[9] In this test, all other causes of soft errors are too small to be measured, compared to the error rate caused by cosmic rays.
1 FIT = 1 failure per billion hours

If you keep your fileserver running for the next 20 years, you can statistically expect to see about 1 bit-flip error per DRAM chip from cosmic rays. If you keep the computer in the basement rather than the attic, I bet you'd halve that. If you live in Boulder CO your error rate would be roughly 3 times higher.
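Checking the math on that, just plugging in the numbers from the quote (a sketch; the 8-chips-per-stick figure is an assumption about a typical DIMM):

code:
# 5,950 FIT per DRAM chip from the quoted experiment; 1 FIT = 1 failure
# per 10^9 device-hours (Python).
fit_per_chip = 5950
hours = 24 * 365 * 20               # 20 years of uptime

errors_per_chip = fit_per_chip * hours / 1e9
print(f"expected soft errors per chip over 20 years: {errors_per_chip:.2f}")
# ~1.04, i.e. about one bit flip per chip in 20 years at sea level. A DIMM
# with 8 chips would expect ~8x that, still well under one per year.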


Surplus medical lead blankets are only $100-150; you could get one of those to drape over your fileserver. I mean, you can get ECC ram, but consider that the cache memory on your HD is the most likely place for a bit-flip to actually get written to permanent storage. Can't ECC that, better invest in rad shielding your pc!

Or consider that over the course of 20 years you're about a million times more likely to gently caress up your data by accidentally doing something that's your own drat fault. A 1 bit error corrupts the picture of your kid, and for some reason you don't have a thousand more kid pics? Or you accidentally rm -rf every pic you ever took because you were typing in the wrong terminal?

Klyith
Aug 3, 2007

GBS Pledge Week
https://www.amazon.com/Sheet-Lead-12-Rotometals/dp/B00IS5EEZ6/
1/8" sheet lead, 12"x12"

Get a few pieces of that, do a radshield casemod for your PC!

Klyith
Aug 3, 2007

GBS Pledge Week

Dante80 posted:

Two things collaborate to a strong Ryzen showing for games.

1. Games tend to use more cores/threads nowadays, and this will change even more as we move forward. A couple of interesting articles.

https://www.computerbase.de/2017-02/cpu-skalierung-kerne-spiele-test/
http://www.eurogamer.net/articles/digitalfoundry-2017-how-amds-ryzen-will-disrupt-the-cpu-market

Eh, I'm not sure what Ryzen will do that the last 3 years of games targeting the 8 weedy jaguar cores of the consoles didn't accomplish. And it still seems to be the case that if the CPU is what limits performance, it's mostly about the single core grunt. The i3 7350K smokes a lot of mid-range quad-cores if you run it at 4.8 ghz.

I'm expecting the stock 1700X to lose to the stock 7700K in practical gaming benchmarks, and an overclocked comparison to push things even farther in intel's favor. But that's still complete speculation; we'll see how much extra the per-core dynamic clocking of Ryzen can extract.

Klyith
Aug 3, 2007

GBS Pledge Week

That pic is ridiculous. What the hell was the reasoning that a 2mm change needed to be made?



Anyways, depending on your existing cooler's mounting hardware, it may be possible to do a minor modification or just bodge it to make it work. I did that for a while once -- I can't remember which socket transition it was, but it must have been years and years ago, back when heatpipe tower coolers were still moderately expensive, so before the hyper 212 came out. I got slightly longer bolts from the hardware store and kinda angled them; it was fine for the 6 or whatever months until I replaced the cooler.

Don't do this if you move your pc around at all of course.

Klyith
Aug 3, 2007

GBS Pledge Week

Lowen SoDium posted:

Probably, and that is probably why the holes were moved: To keep people from using a cooler that wasn't made to be used with the new height.

Ah that makes a lot more sense then. Though now I'm wondering how that asus board with both holes works if the height is different. I guess those black things are spacers to fix the height issue?

And obviously everything about my suggestion of fitting a cooler that's not compatible has a giant disclaimer to use common sense, test fit carefully, don't overstress board or cpu, may not work at all depending on mounting hardware, etc etc.

Klyith
Aug 3, 2007

GBS Pledge Week

Voyager I posted:

Yes, but it's in poor taste to get caught.

cinebunch.exe

Klyith
Aug 3, 2007

GBS Pledge Week

spasticColon posted:

So it looks like Intel chips are still going to be better for gaming for now. Oh well, an i7-7700K it is then for my next build.

lol the reviews still aren't in yet. goons are both pre-ordering ryzen on hype, and declaring it trash and buying intel.

https://www.youtube.com/watch?v=ryHWLPiejYw

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

for the record y'all should post ... techreport ... as soon as they are live

so sometime in june then? :v:


(it's a burn from the heart, I was on TR for years and years before I came over to SA)

Klyith
Aug 3, 2007

GBS Pledge Week

Rastor posted:

We apparently have different definitions of "mainstream" desktop computer use.

twitch has 2 million unique streamers per month now. god knows how many people are uploading gaming vids to youtube every week. god knows why, I don't get it, but lots of people do it regardless of whether they have an audience or not. recording video of games is totally mainstream.

not that broadcasting video of games to zero people is a really great reason to buy a $500 CPU, intel or AMD.

Klyith
Aug 3, 2007

GBS Pledge Week

Palladium posted:

Well, even AMD themselves are making their own SKUs above $320 look plenty bad already. Pay more for factory OC and XFR = lol

top of the range parts are traditionally way out in the land of diminishing returns; the ryzen 1800x is no different.

when you look at a situation where that isn't the case -- like the core i5s having hyperthreading disabled -- what you're seeing is a monopoly distortion, not "good value".

Klyith
Aug 3, 2007

GBS Pledge Week

Dante80 posted:

Which also speaks a lot about how loving clown-ish and rushed the launch was...AMD stock has lost more than $1.5bn in two days due to their PR department lol.

Well, it gained quite a bit of that 1.5 billion due to their PR department as well, by handing out pre-release samples selectively to hypesters and LN overclockers. :v:

Klyith
Aug 3, 2007

GBS Pledge Week
Here's a good article about the limits of standard algorithms for judging quality of lossy video. (cached version, I can't get to the original right now.)

a) it's somewhat more valid in the case of images than audio. Our ears suck; our eyes are slightly less easy to fool. Unlike with psycho-acoustic audio compression, you don't get totally inverted relationships between perceived quality and PSNR or whatever. The perceptual models of our hearing are more mature than those of our vision, and it was an easier problem in the first place. The next generation of video encoding will maybe start to incorporate models of eye-tracking, face recognition, and stuff like that.

b) for cartoons and other artificial images, PSNR and other "dumb" algorithms track perceived quality much better than they do for video of real things. The CH animation results in the second set of charts show that. Where a video game falls on that spectrum I don't know -- I suspect something like DOTA is more animation-like, and a 1st/3rd person action game more "real-world".

Paul MaudDib posted:

Mean squared error against a lossless source file would tell you whether the MP3 sounds good or not though.

No, in fact it does not. Optimizing MP3s for PSNR against the original is actively detrimental to sound quality. It's impossible to overstate how much our ears "listen" to something different from the raw audio waveform.

And using the psychoacoustic model of a lossy algorithm to judge the results is dumb -- it's just going to grade itself as perfect. What you're trying to determine is the quality of the psychoacoustic model itself, which means scoring it by humans (in a proper experiment of course).
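If anyone wants to see why waveform MSE/PSNR is such a bad proxy for audio quality, here's a minimal sketch with made-up signals (numpy) -- not anything from the article, just an illustration: a ~1 ms shift is imperceptible on a steady tone, yet it tanks the metric, while clearly audible added hiss scores great.

code:
# Minimal demo (Python/numpy) of why per-sample error doesn't track what
# ears hear. Both "encodes" are synthetic and purely illustrative.
import numpy as np

rate = 44100
t = np.arange(rate) / rate                      # 1 second
ref = 0.5 * np.sin(2 * np.pi * 440 * t)         # reference: a 440 Hz tone

noisy = ref + np.random.normal(0, 0.01, ref.shape)  # audible hiss added
delayed = np.roll(ref, 44)                           # ~1 ms shift, inaudible on a steady tone

def psnr(a, b, peak=1.0):
    mse = np.mean((a - b) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

print(f"noisy copy:   {psnr(ref, noisy):5.1f} dB PSNR")    # ~40 dB, "good"
print(f"delayed copy: {psnr(ref, delayed):5.1f} dB PSNR")  # ~3 dB, "terrible"
# The delayed copy is perceptually identical to the reference and the noisy
# one isn't, but the metric ranks them the other way around.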

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

I wasn't implying that MSE was the gold standard by any means. Does it tell you more than just looking at an encoded waveform with no reference waveform to compare against? Most definitely yes.


That's why you have multiple psychoacoustic models. Run the MP3 through the OGG Vorbis psychoacoustic model and see what it says. If all the (high-quality) models say it's good, it's more likely to be good than something they say is lovely.

Secondly, it's not like you encode something against a psychoacoustic/visual model and that means it has zero error. Let's say I encode a 16 kbps MP3, do you think that would sound perfect?

Finally - what we have here are two totally separate decoders. We can most definitely look at the psychovisual error between the two of them and say which one the model says is better.

I firmly reject the idea that psychoacoustic/visual models have absolutely nothing to tell us. As does the entire field of audiovisual compression, in fact. That's why we have the psychoacoustic/visual models.

Do the psychoacoustic/visual models have to be validated by actual people? Yes, of course, you win,

I am not sure what your original point was in that case, other than "I like to argue". The sentence I quoted was definitively wrong. If the way you want to make it correct is to say that MSE/PSNR is better than no data at all, then sure.

"Reversing" a perceptual lossy product through other codecs is not something that anyone would do -- at least for audio, the models that each codec uses are extremely similar and the places that they are different are tuned for the particular idiosyncrasies of that codec. You would still be going down the wrong path if your method for finding the best 16kbps MP3 is by MSE/PSNR similarity to 128kbps Ogg Vorbis.

Instead, you'd do what Netflix has done in that article I linked: build a second, unrelated algorithm to judge quality based on a data set evaluated by humans (VMAF in this case). This is a perceptual model in itself, but it doesn't do the same thing as the ones in the codecs. They're not interchangeable. For a while VMAF will give good results, but if the codecs start to incorporate VMAF's methods or tune themselves to it, the divergence between VMAF's results and real humans will increase.

AFAIK nobody built a "judgement" program for audio because the problem was easier and doing tests with humans is quick for audio compared to video.

:ironicat:

edit:

Paul MaudDib posted:

Again, the problem with this is clearly evident if you've ever been around audiophiles.

Blind testing is a lot more effort than most people will go to and/or can go to (depending on the field).

If you know which sample you're looking at, your opinion is meaningless as far as empirical results are concerned. This has been long-since established.

Also, you're explaining this to a guy who was blind-testing his family on DIVX and XVID codec settings back in ~2002, age 13 :lol:
Soooooo... because some people out there are wrong, the posters you are replying to must also be wrong even though they're not saying the wrong thing, because you're Paul MaudDib and you were blind testing codecs at age 13? WTF?

Klyith fucked around with this message at 06:49 on Mar 5, 2017

Klyith
Aug 3, 2007

GBS Pledge Week
I mean if you want to argue with golden-ears audiophiles, go to headfi or wherever they are. They're not hard to find! Don't argue with us by constructing imaginary positions we never took.


edit: either that or you just have a really confrontational way of agreeing with people

Klyith fucked around with this message at 06:57 on Mar 5, 2017

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

OK now that you've gotten the point, please don't suggest that going to a forum thread with labelled samples is a good way to evaluate any sort of lossy compression codec.

I suggest maybe going back a page or two and re-reading, maybe keeping notes on who said what, because that's important.

edit: I'd even suggest reading the OBS thread in question, because the guy has a chart of MSE results and then talks about how the nvidia encoder is worse by visual examination, and says that MSE is not good enough. He then encourages people to download his samples to look at them.

Paul MaudDib posted:

You seem to be extra confrontational about that concept for some reason though.

See you understood perfectly well what I was referring to, that's why you brought up what Netflix did.

Dude, I made a post about netflix and their evaluations of different ways of judging perceptual quality, and you fishmeched me. Of the people in this thread, you're the one interpreting anyone who quotes you as looking for a fight. People can say things about a topic without it being the opening salvo of a flamewar.

Klyith fucked around with this message at 07:54 on Mar 5, 2017

Klyith
Aug 3, 2007

GBS Pledge Week

FaustianQ posted:

Isn't there a disadvantage to doing this though?

Every idiotic piece of software with per-CPU licensing or restrictions, including win10 home.

Klyith
Aug 3, 2007

GBS Pledge Week

Measly Twerp posted:

Our favourite Scot talks about the Ryzen launch:

https://www.youtube.com/watch?v=ylvdSnEbL50

That's an interesting take; the stuff about the FX is really eye-opening. Demolishes some long-held assumptions about gaming benchmarking.


I don't know that I agree with his statement that games are heavily optimized for intel CPUs. If anything, any cross-platform game (or game built on a cross-platform engine) would have the most optimization work done for Jaguar CPUs, since that's the platform that really needs it. But mostly I think that the bulk of optimization is graphics related and CPUs are left to fend for themselves.

Klyith
Aug 3, 2007

GBS Pledge Week

Dante80 posted:

Those are salvaged chips

Source?

Klyith
Aug 3, 2007

GBS Pledge Week
there is no way the 4 core CPUs are 8 core chips with half the thing disabled


the 6 maybe, but you can't sell a $150 part that has 50% wasted die area without losing your shirt
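To put rough numbers on the die-area argument -- everything below (wafer cost, die sizes, yields) is assumed purely for illustration, not actual AMD figures:

code:
# Back-of-the-envelope silicon cost (Python) with made-up inputs.
import math

WAFER_COST = 6000.0                      # assumed processed-wafer cost, USD
WAFER_AREA = math.pi * (300 / 2) ** 2    # 300 mm wafer, in mm^2

def silicon_cost(die_mm2, yield_frac):
    dies = WAFER_AREA / die_mm2 * 0.9    # ~10% edge loss, assumed
    return WAFER_COST / (dies * yield_frac)

salvage_quad = silicon_cost(195, 0.70)   # full 8-core die, half disabled (assumed ~195 mm^2)
native_quad = silicon_cost(110, 0.80)    # hypothetical purpose-built 4-core die

print(f"salvage quad silicon: ~${salvage_quad:.0f}")
print(f"native quad silicon:  ~${native_quad:.0f}")
# Roughly double the silicon cost per part before packaging, test, and margin,
# which is the kind of waste that's hard to absorb at a $150 price point.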

Klyith
Aug 3, 2007

GBS Pledge Week

Lowen SoDium posted:

I might be wrong, but I thought Zen was made out of 4 core modules and the ones currently for sale were 2 of those modules.

I figured that the 6 core chips would be some combo of 2 modules with defective or disabled cores, the 4 cores would just be a single module, and the 2 cores would be defective or disable single modules.

yes, that is generally possible (though the modules are on a single piece of silicon, so it's not like you can mix and match.)



intel dual core i3s are not defective i5s. as the price goes down and the number you sell goes up, you can't waste silicon and keep margin

Klyith
Aug 3, 2007

GBS Pledge Week

Dante80 posted:

You can explain said quote by him simply being on team red. Even back in the day where the Athlon 64 ran circles around P4, the weapon of choice was the superlative NV 6600GT (or higher). I remember using a gainward golden sample card, and OCing through the roof with it.

Though the year before that would have been the one time an AMD+ATI combo was the right call: an Athlon 64 plus an ATI 9000-series card (because that was the year of the Dustbuster lol).

But the idea that all-AMD is better because the CPU and GPU are somehow "optimized" to work with each other is dumb. Being a fan of any company -- AMD, intel, or nvidia -- is dumb. They're corporations, they're not your friends. Anyone that needs validation through fanboyism that they made the "right choice" with their gear probably should be buying less crap.

PC LOAD LETTER posted:

Well yeah sure, and I've said before in thread its probably not worth getting Ryzen if you already have Haswell or newer unless you need lots of threads. edit: If you're in the market though for a new CPU and coming from a much older one (ie. Sandy Bridge) it still seems to hold up real well vs the 7700K or 6900 to me.

Yeah, like I'm on Ivy and waiting for the next set of Ryzens to come out before I decide what to do for my next desktop. Let the launch performance weirdness settle out, and see what the $200-250 competition looks like.

Klyith
Aug 3, 2007

GBS Pledge Week
16:10 is great but the only 2560x1600 screens that exist are 30"

if you're getting a 30" screen just go for 4k

Klyith
Aug 3, 2007

GBS Pledge Week

SwissArmyDruid posted:

Last I checked, it's 2017.

Yes, I am typing out this post on this dead gay forum on this monitor in 2017. Kill me.

oh man, is that like your work monitor and your boss won't buy a new one until it stops working? you do realize you are going to have to resort to sabotage right?

the factory that built those things was on top of an old indian graveyard or something. they will not die. no natural cause can kill them.

jpl9330 posted:

I used a 1280*1024 as a secondary up until a couple years ago.
I still do. Starting to think about replacing it, but I kinda like the non-widescreen for secondary. I wish there were modern 1600x1200 LCDs other than stuff being sold for medical equipment prices.

Klyith fucked around with this message at 20:49 on Mar 8, 2017

Klyith
Aug 3, 2007

GBS Pledge Week

NewFatMike posted:

Anyway, the 6C/4C stuff is prooooooobably more gaming oriented, the idea is that you'll be able to get the clocks needed to be near parity because the thermals are easier to manage across fewer cores.

With that thermal load / ghz graph posted earlier, I'm not expecting the 6 & 4 core parts to have a ton of headroom over the 1800x. Whatever it is about the design or gloflo process just taps out at 4 ghz. But the R5s could easily be good enough at games to make a case for themselves, especially at the $250 mark where you're competing against a 4c/4t 7600k that's not exactly a stunning overclocker itself. Mobos are generally a bit cheaper on the AMD side as well.

The thing I really liked about that graph was the performance when limited to 35-45 watts. I routinely underclock my PC in July & August if it's super-hot.

Klyith
Aug 3, 2007

GBS Pledge Week

Twerk from Home posted:

I could have sworn that the Celerons / Pentiums / i3s were the dual-core dies that they use for mobile -U CPUs, but really leaky. i3s are actually a 2 core disabled i5?

I don't know about leaky -U dies or whatever else their production chain might be made from, but at least the physical evidence speaks to this:

http://www.xtremesystems.org/fugger/a7.jpg
some OCer's delidded 7700 & i3 7350; the i3 is a smaller die, so obviously not the same chip as a 4 core.
