Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
The GPU Megathread is here; this thread is about CPUs, APUs, and AMD platforms in general.



After six goddamn years of being completely uncompetitive and terrible in almost anything but the cheapest parts of the CPU market, AMD has finally managed to produce a completely new design that isn't trying to salvage anything from the construction core dumpster fire. The Zen microarchitecture is a modern design created in part by Jim Keller, the heroic lead designer of the Athlon (K7) and Athlon 64 (K8) architectures, and it manages to actually be quite competitive with Intel's best offerings.

Models:
Currently, the R5 and R7 lines of desktop processors have been released.

code:
                       R5 1400     R5 1500X     R5 1600     R5 1600X  
                     ----------- ------------ ----------- ----------- 
Cores/Threads:         4/8         4/8          6/12        6/12
Base Clock (GHz):      3.2         3.5          3.2         3.6       
Boost Clock (GHz):     3.4         3.7          3.6         4.0       
L2 Cache:              2 MB        2 MB         3 MB        3 MB      
L3 Cache:              8 MB        16 MB        16 MB       16 MB     
TDP:                   65W         65W          65W         95W       
MSRP:                  $169        $189         $219        $249

                       R7 1700     R7 1700X     R7 1800X  
                     ----------- ------------ ----------- 
Cores/Threads:         8/16        8/16         8/16      
Base Clock (GHz):      3.0         3.4          3.6       
Boost Clock (GHz):     3.7         3.8          4.0       
L2 Cache:              4 MB        4 MB         4 MB      
L3 Cache:              8 MB        16 MB        16 MB     
TDP:                   65W         95W          95W       
MSRP:                  $329        $399         $499      
The workstation and small-server platform (Snowy Owl) and the mobile/desktop APU platform (Raven Ridge) should be coming by the end of the year, but that's about all we know at the moment.

Benchmark Summary


In terms of price for performance, the superiority of Ryzen over Intel's Broadwell-E CPUs is indisputable; the only Intel options that are strictly superior to the Ryzen chips are the i7-6900K and i7-6950X, which cost $1,049 and $1,649 respectively.
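
To put rough numbers on that, here's a minimal price-per-thread sketch in Python, using only the MSRPs from the table above and the two Intel list prices quoted here (list-price arithmetic only, not a benchmark):

code:
# Price-per-thread from the MSRPs quoted in this post (not a benchmark).
chips = {
    "R5 1600X": (249, 12),
    "R7 1700":  (329, 16),
    "R7 1800X": (499, 16),
    "i7-6900K": (1049, 16),
    "i7-6950X": (1649, 20),
}

for name, (price, threads) in chips.items():
    print(f"{name:9s}  ${price:4d}  {threads:2d} threads  ${price / threads:6.2f}/thread")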

The R5 1600X is essentially equivalent in terms of performance to the lower end Broadwell-E hexacore, the i7 6800K, although the gap in single-threaded performance does allow the i7 6850K to be slightly ahead overall.

More controversial is the comparison with Kaby Lake. In productivity tasks and multithread-optimized games, the hexacores (to say nothing of the octocore R7s) are much faster than even the flagship i7-7700K, because they have so many more threads of execution available, even if each thread is not quite as fast as one on the higher-clocked Intel quad core.
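
As a sketch of why that happens, here's a toy Amdahl's-law model in Python. The per-thread speed advantage and the parallel fractions are made-up illustrative numbers (and treating SMT threads as full cores flatters both chips), so take the shape of the result rather than the exact figures:

code:
# Toy Amdahl's-law comparison: 6C/12T R5 1600X vs 4C/8T i7-7700K.
# single_thread_ratio and the parallel fractions are illustrative guesses.
def speedup(threads, parallel_fraction):
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / threads)

single_thread_ratio = 1.10  # assume each Kaby Lake thread is ~10% faster

for p in (0.50, 0.90, 0.98):
    ryzen = speedup(12, p)
    kaby = speedup(8, p) * single_thread_ratio
    print(f"parallel fraction {p:.2f}:  1600X {ryzen:5.2f}x   7700K {kaby:5.2f}x")

At low parallel fractions the faster quad core wins, and at high parallel fractions the extra threads pull ahead, which is exactly the productivity-vs-lightly-threaded-games split above.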

However, for someone who plays exclusively or mostly single-threaded games and doesn't otherwise load up their machine, the i7-7700K is faster in CPU-bottlenecked games. Some reviews focus on this quite heavily, but the fundamental truth of the matter is that the performance difference will be small in any reasonable setup.

As long as you're not pairing a Titan Xp with a $249 CPU, or more broadly spending huge amounts on a gaming PC merely to run games at 1080p or lower resolutions, the bottleneck will almost always be the GPU rather than the CPU. In addition, most benchmarks are done with NVIDIA cards (understandable, considering AMD hasn't put out an actual high-end GPU in ages), which seem to have performance problems on Ryzen compared to AMD's own graphics cards.

In addition, with DX12 and Vulkan, games will be getting more and more multithreaded, thanks in part to the use of old, crappy low-power AMD cores in the consoles this generation (including the PS4 Pro and Scorpio). Overall, looking forward, Ryzen is a much better value proposition than Intel's offerings, at least at the R5 1600 level and above.

The Ryzen quad cores, however, are only really competing against the bottom of the barrel i5s and the i3-7350K, which they are indeed superior to.

Motherboards:
Ryzen uses the new AM4 socket, and motherboards are available with the following chipsets:



However, at the moment, the motherboard situation isn't ideal. It's been a long time since AMD launched a new CPU, and there have been some significant teething issues with BIOSes and especially with memory support. Ryzen is quite picky about the RAM timings used with it; the long story short is to actually pay attention to the QVL (qualified vendor list) for whatever motherboard you are getting.

In addition, there are some fairly important motherboard features beyond the standard concerns (like VRM quality), with special relevance to running high-speed memory, which matters far more for Ryzen's performance than it does for Intel CPUs because the memory clock sets the clock rate of the internal interconnect between the core complexes that make up a Ryzen CPU. Beyond the quality of the board's electrical design, some boards offer external clock generators that may allow higher memory speeds and (depending on the settings) either better or worse stability than the default 100 MHz BCLK.

The following motherboards have BCLK generators:
  • Asus Crosshair VI Hero
  • ASRock Taichi
  • ASRock Fatal1ty X370 Professional Gaming
  • GIGABYTE GA-AX370-Gaming K7

However, there are some reports that BCLK overclocking may cause problems with peripheral clocks (PCI-E, USB 3). Still, if you are planning on overclocking, it is worth getting a board that has one, since it is hardly more expensive than any of the other top-quality X370 boards.
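
To make the memory-speed and BCLK points concrete, here's a small clock-arithmetic sketch (assuming the usual Zen 1 behaviour where the internal fabric clock runs at the memory clock, i.e. half the DDR4 transfer rate, and that BCLK scales everything derived from it):

code:
# DDR4 transfer rate -> memory clock -> fabric clock, with optional BCLK scaling.
# Assumption: on Zen 1 the fabric clock equals the memory clock (half the DDR4 rate).
def clocks(ddr4_rate, bclk=100.0):
    scale = bclk / 100.0
    mem_clock = ddr4_rate / 2 * scale   # actual DRAM clock in MHz
    fabric_clock = mem_clock            # CCX-to-CCX interconnect clock
    return mem_clock, fabric_clock

for rate, bclk in [(2133, 100.0), (2933, 100.0), (3200, 100.0), (2933, 105.0)]:
    mem, fab = clocks(rate, bclk)
    print(f"DDR4-{rate} @ BCLK {bclk:5.1f} MHz -> memclk {mem:6.1f} MHz, fabric {fab:6.1f} MHz")

Note that raising BCLK also drags the PCI-E and USB clocks along with it unless the board can isolate them, which is the peripheral-clock caveat above.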

Sinestro fucked around with this message at 21:37 on Apr 14, 2017


MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
:firstpost:

I'm excited for the rumored 16-core parts coming.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Dual processor boards with 16c/32t processors. More cores. All of them!

repiv
Aug 13, 2009

Sinestro posted:

code:
                       R7 1700     R7 1700X     R7 1800X  
                     ----------- ------------ ----------- 
Cores/Threads:         8/12        8/12         8/12      

Those should be 16 threads not 12 :cheeky:

SwissArmyDruid
Feb 14, 2014

by sebmojo
Your thread counts are wrong for the R7 parts, Sinestro.

e:fb

Dante80
Mar 23, 2015

Assuming that AMD goes for the HEDT market, and assuming that it will use single-socket half-Naples chips for it (16C/32T), we should get something like quad-channel DDR4 and 64 PCIe lanes out of the platform. Call them R9, and release them in 12- and 16-core flavors.

That would be really sweet. But...that is assuming too much, at least at this point in time.

Next stop, Naples.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

repiv posted:

Those should be 16 threads not 12 :cheeky:

SwissArmyDruid posted:

Your thread counts are wrong for the R7 parts, Sinestro.

e:fb

Fixed.

Misc
Sep 19, 2008

Is there really much to the argument in the OP that gaming performance is expected to improve because the consoles are operating on 8 threads? That's a point I literally bought into with bulldozer, which was a poor gamble vs just buying whatever the Intel equivalent was and overclocking the hell out of it. There doesn't seem to be an incentive for developers to optimize for any more than a maximum of 8 threads when that's all the consoles have had for a hot minute and will continue to have until at least the next console gen.

orcane
Jun 13, 2012

Fun Shoe
There is some evidence that recent games are scaling better with more threads, according to a few tests I've read on German websites. It's possible they simply get better at running multithreaded in general, no idea how much is related to consoles.

No (good) mini-ITX Ryzen boards for another few months is a huge bummer, tho :(

Bareback Werewolf
Oct 5, 2013
~*blessed by the algorithm*~
Took the plunge and got the 1600X. This will be my triumphant return to PC gaming. I tried to do the console thing for a while, but I can not play any FPS with a controller for the life of me.

Klyith
Aug 3, 2007

GBS Pledge Week

Misc posted:

Is there really much to the argument in the OP that gaming performance is expected to improve because the consoles are operating on 8 threads? That's a point I literally bought into with bulldozer, which was a poor gamble vs just buying whatever the Intel equivalent was and overclocking the hell out of it. There doesn't seem to be an incentive for developers to optimize for any more than a maximum of 8 threads when that's all the consoles have had for a hot minute and will continue to have until at least the next console gen.

Well, Bulldozer did improve over time. Just not enough to make up for its awful handicap at the start. At least compared to Bulldozer, you can look at Ryzen and say there's not a ton of performance difference in average real-world gameplay.

Definitely when looking at the R5s and comparing to a 4c/4t 7600K at about the same price, it seems like there should be something to be gained. It's not a slam dunk though, so if all you care about is games you should weigh the current performance vs potential gain vs intangibles like mobo quality and watts consumed (intel advantage) or total system cost (amd advantage).

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Misc posted:

Is there really much to the argument in the OP that gaming performance is expected to improve because the consoles are operating on 8 threads? That's a point I literally bought into with bulldozer, which was a poor gamble vs just buying whatever the Intel equivalent was and overclocking the hell out of it. There doesn't seem to be an incentive for developers to optimize for any more than a maximum of 8 threads when that's all the consoles have had for a hot minute and will continue to have until at least the next console gen.

Things have been moving in the direction of more threads being better for gaming for a long time now; these days it's hard to say that an i5 is worth it over an i7, and I expect that five years down the line we will start seeing 6c/12t CPUs pull ahead of 4c/8t CPUs even when the 4c/8t one is faster in single-threaded performance. But really, trying to predict things this far out with any accuracy is a fool's errand; trends tell us that more threads should be better down the line, and that is about all that can be said on the matter.

Misc
Sep 19, 2008

Klyith posted:

Well, Bulldozer did improve over time. Just not enough to make up for its awful handicap at the start. At least compared to Bulldozer, you can look at Ryzen and say there's not a ton of performance difference in average real-world gameplay.

Definitely when looking at the R5s and comparing to a 4c/4t 7600K at about the same price, it seems like there should be something to be gained. It's not a slam dunk though, so if all you care about is games you should weigh the current performance vs potential gain vs intangibles like mobo quality and watts consumed (intel advantage) or total system cost (amd advantage).

I have seen that bulldozer aged well, even if it didn't start out good enough to make it worth waiting around. I am within the ideal use case for Ryzen as I have multithreaded workloads and play games socially, but am weighing my options towards building a dedicated, small form factor gaming machine instead since I still carry my stuff around for lan parties. There are no mITX boards available, and my local Micro Center aren't offering many Ryzens with mATX boards, so I'm going to continue to observe where this all goes for a while longer, especially if the Ryzen-based APUs allow for good enough gaming performance that I can keep in an enclosure small enough to fit in a bag.

SlayVus
Jul 10, 2009
Grimey Drawer

Misc posted:

I have seen that bulldozer aged well, even if it didn't start out good enough to make it worth waiting around. I am within the ideal use case for Ryzen as I have multithreaded workloads and play games socially, but am weighing my options towards building a dedicated, small form factor gaming machine instead since I still carry my stuff around for lan parties. There are no mITX boards available, and my local Micro Center aren't offering many Ryzens with mATX boards, so I'm going to continue to observe where this all goes for a while longer, especially if the Ryzen-based APUs allow for good enough gaming performance that I can keep in an enclosure small enough to fit in a bag.

This is kind of my problem now: I want to upgrade to an ITX build for portability. However, there are no boards at all and it'll be several months before any come out. Having, say, a SilverStone Raven RVZ02 with a Titan XP and a Ryzen 1800X would be an amazing VR rig.

Rabid Snake
Aug 6, 2004



Trip report. I was able to overclock my Corsair Vengeance LPX to 3200MHz by upping the voltage to 1.35 and loosening the timings to C16. The gigabyte AB350M Gaming 3 has been a pleasure to work with so far.

It's handled my 1700 at 3.7 on stock voltage just fine. I only wish this MicroATX board came with built in WiFi.

This thing runs surprisingly cool, even with the stock cooler. Never seeing anything above 70 degrees, even with prime 95
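
For reference on the "loosened timings" trade-off, first-word CAS latency in nanoseconds is just CL cycles divided by the memory clock (half the DDR4 transfer rate); a quick arithmetic sketch:

code:
# First-word latency in ns: CL cycles / memory clock (MHz), memory clock = rate / 2.
def cas_ns(ddr4_rate, cl):
    return cl / (ddr4_rate / 2) * 1000

for rate, cl in [(2666, 14), (2666, 16), (3200, 14), (3200, 16)]:
    print(f"DDR4-{rate} CL{cl}: {cas_ns(rate, cl):.2f} ns")

So 3200 CL16 still works out to slightly lower absolute latency than 2666 CL14, on top of the extra bandwidth and fabric clock.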

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN
My brother upgraded to a FX 8350 from a 5 year old Intel system because....cheap? I think he fell into the core trap with it. That said he likes it and has a R9 390 and can run BF1 well enough that he doesn't care. I'm still on a Phenom II x4 system and really feeling it in terms of gaming. Elite runs, but going to planets can lead to single digit FPS and while I can "run" BF1 I wouldn't exactly call it playable. Really think the R5 is currently the way to go. While memory seems to be a bit of a pain, on a B350 board the most you're going to hit is 3200 anyways so meh. In addition, with the AM4 platform you have a pretty obvious upgrade path even with a B350...which for better or worse I didn't get an AM3+ socket 6 years ago when I built this thing otherwise yeah I'd probably Bulldozer it up.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Rabid Snake posted:

Trip report. I was able to overclock my Corsair Vengeance LPX to 3200MHz by upping the voltage to 1.35 and loosening the timings to C16. The gigabyte AB350M Gaming 3 has been a pleasure to work with so far.

It's handled my 1700 at 3.7 on stock voltage just fine. I only wish this MicroATX board came with built in WiFi.

This thing runs surprisingly cool, even with the stock cooler. Never seeing anything above 70 degrees, even with prime 95

Did you run benchmarks at both the lower and higher memory speeds? Sometimes the Ryzen memory controller sets certain subtimings, which are inaccessible to the end user, to extremely slack values when raising memory clock speeds, and this can result in lower performance in spite of the higher clock rate.
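
One crude way to check that (short of a proper suite like AIDA64) is to run the same memory-bound workload at each BIOS setting and compare; a rough NumPy sketch, purely a relative sanity check:

code:
import time
import numpy as np

# Random gathers over an array much larger than the CPU caches are mostly
# limited by DRAM latency/bandwidth. Run this unchanged at each memory
# setting and compare the best times; lower is better.
N = 32 * 1024 * 1024                 # 32M float64 elements, ~256 MB
data = np.random.rand(N)
idx = np.random.randint(0, N, size=N // 4)

def one_pass():
    t0 = time.perf_counter()
    _ = data[idx].sum()              # gather + reduce, memory-bound
    return time.perf_counter() - t0

best = min(one_pass() for _ in range(5))
print(f"best of 5 random-gather passes: {best * 1000:.1f} ms")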

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Misc posted:

I have seen that bulldozer aged well, even if it didn't start out good enough to make it worth waiting around. I am within the ideal use case for Ryzen as I have multithreaded workloads and play games socially, but am weighing my options towards building a dedicated, small form factor gaming machine instead since I still carry my stuff around for lan parties. There are no mITX boards available, and my local Micro Center aren't offering many Ryzens with mATX boards, so I'm going to continue to observe where this all goes for a while longer, especially if the Ryzen-based APUs allow for good enough gaming performance that I can keep in an enclosure small enough to fit in a bag.

Unfortunately all the really tiny cases are mITX, not mATX. I am in the same boat.

SlayVus posted:

This is kind of my problem now: I want to upgrade to an ITX build for portability. However, there are no boards at all and it'll be several months before any come out. Having, say, a SilverStone Raven RVZ02 with a Titan XP and a Ryzen 1800X would be an amazing VR rig.

That's actually why I built my mITX box - Raven RVZ01 with my old 4690K, a Corsair H75, and my secondhand GPU at the time (780 Ti reference). Pulled it when I flipped those cards but eventually it'll probably get my current 1080. We have cats that get underfoot, and my fiance doesn't keep the living room clean enough for VR anyway, so I'm probably going to get a second pair of lighthouses and do cockpit sims on my desktop in my room instead anyway. Vive is supposed to release a Mark 2 Lighthouse soon that will be cheaper so I'm holding out for a bit.

Highly recommend the Raven boxes though. If you have the money you can even do a custom loop, which is absolutely absurd for a mITX box that size. Apart from the Corsair One I think they're probably the best mITX boxes ever made. Dan SFX-A4 is the only real competitor there.


I have mixed feelings about Ryzen and VR though. I'd actually love if someone dug up some reviews of how it does there. My gut instinct is that the single-threaded punch of the 7700K would win the day but there are really a lot of variables in play here.

First off - the background positioning tasks are something that can be parallelized and run on separate cores. This is IMO the classic example of "stuff is getting more parallel" - we now have some fairly compute-intensive tasks that run in the background.

Second - the better minimum framerates in a lot of games on Ryzen. In theory that's a plus.


The counterarguments though:

Positioning processing has totally diverged in approaches. The Oculus Rift really needs lots of processing power since it's basically handling a pair of USB 3.0 cameras streaming in realtime. In contrast the Vive relies on precision timing - detecting the beam sweeps from the lighthouse (probably with the precision timing happening in the headset). So the Ryzen might well be better for the Oculus Rift with its processing needs, and the 7700K for the Vive with its need for precision timing. Nowadays the fanbase has mostly gravitated towards the Vive though (by about 2-to-1).

Ryzen's generally acknowledged as being modestly worse for gaming. Yeah, the minimum frametimes matter, but a VR build is also the quintessential subject where you can't just handwave and go "but my cinebench performance!". It's an all-out gaming build and historically max single-threaded performance has won that niche.

Furthermore, even though minimum frametimes do matter, the VR community has expended a lot of effort to lessen that impact with stuff like "reprojection"/"asynchronous warp"/etc.


Sorry, that's clear as mud, but I still have a bunch of questions here re: Ryzen. With the boards that are available today, a 7700K is certainly the answer for most people. For anything a 7700K can't handle, you are probably better off with a very high-clocked Haswell-E on the (only) mITX board, but that's an expensive setup and will be difficult to cool (you would certainly want to go liquid cooling there, and bear in mind you will be dissipating 140W at 4.1 GHz and 200W+ at 4.5-4.7 GHz through a 120mm radiator mount).

I feel a lot of the same arguments on the Haswell-E will apply to Ryzen as well though. A Ryzen system, at stock clocks, with no GPU load consumes about 200-230W at full CPU load (measured at the wall). That's a lot of power to dissipate in a mITX case no matter how you slice it. The RVZ02 is certainly inappropriate for this given its even tighter dimensions and lack of 120mm radiator mount.

I don't want to say the Ryzen TDPs are outright lies, but there's absolutely no way they cover the same level of boost clocks that the Intel ratings do. Frankly my 5820K boosts to 4.13 GHz all-core at stock voltages and still undershoots its 145W rated TDP fairly significantly. In practice Ryzen is perhaps 10% more efficient than Haswell-E at best, and that's being a little gracious. The 1800X is more like a 125W processor even at stock clocks. And like Haswell-E, once you hit the point of diminishing returns the power starts going nuts. The difference with Ryzen is you hit that wall about 10% sooner than on Intel - Ryzen hits it at ~3.9 GHz and Haswell-E hits it at 4.3 GHz, and Ryzen maxes out around 4.1 GHz while Haswell-E maxes out around 4.5 GHz.

That's normally not a huge deal anyway - I am the first to point out that 10 and 20 watts here and there doesn't matter when you're talking about pushing 200+ watts through your processor - but an ITX case is the exception to that rule because you have very few options to dump that heat.
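
That knee in the power curve is just what you'd expect from dynamic power scaling roughly with frequency times voltage squared, plus the extra voltage the last few hundred MHz demand. A rough sketch with illustrative voltages (not measured values for any particular chip):

code:
# Dynamic power scales roughly with f * V^2. Voltages below are illustrative
# guesses chosen to show the shape of the curve, not measurements.
def relative_power(freq_ghz, volts, base=(3.5, 1.20)):
    f0, v0 = base
    return (freq_ghz / f0) * (volts / v0) ** 2

for f, v in [(3.5, 1.20), (3.9, 1.30), (4.0, 1.38), (4.1, 1.45)]:
    print(f"{f:.1f} GHz @ {v:.2f} V -> {relative_power(f, v):.2f}x base power")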

Paul MaudDib fucked around with this message at 03:14 on Apr 15, 2017

SwissArmyDruid
Feb 14, 2014

by sebmojo
Let's christen the new thread's first page with some ultimately pointless news:

Remember Der8auer? The German delid guy? He took a 1600X to 5.9 GHz on LN2, DDR4-3000 CL12, all cores enabled.

https://www.youtube.com/watch?v=CZ0SxpGzbw0

All hexacore Intel records beaten: https://ocaholic.ch/modules/news/article.php?storyid=16401

SwissArmyDruid fucked around with this message at 04:26 on Apr 15, 2017

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN

Paul MaudDib posted:

I have mixed feelings about Ryzen and VR though. I'd actually love if someone dug up some reviews of how it does there. My gut instinct is that the single-threaded punch of the 7700K would win the day but there are really a lot of variables in play here.

Gamers Nexus have one: https://www.youtube.com/watch?v=4DJdkDms7Y0

TL;DR: the i7-7700K outperforms the 1700 both stock and OC'd; however, because both headsets are limited to 90Hz, the "experience", as it were, is imperceptibly different.
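
To put the 90Hz cap in numbers, both headsets give a fixed frame budget, and any CPU time under that budget is invisible; trivially:

code:
# Fixed frame budget at the headsets' 90 Hz refresh.
refresh_hz = 90
budget_ms = 1000 / refresh_hz
print(f"{budget_ms:.1f} ms per frame")  # ~11.1 ms; anything under budget looks the same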

They did a follow-up thing with Scott Wasson (some AMD guy as far as I know, but I've only just started paying attention to tech stuff again): https://www.youtube.com/watch?v=FiZeU9oCXrE. I haven't watched it, so I don't know if there are any decent takeaways; ymmv.

rex rabidorum vires fucked around with this message at 04:59 on Apr 15, 2017

Haquer
Nov 15, 2009

That windswept look...

rex rabidorum vires posted:

My brother upgraded to a FX 8350 from a 5 year old Intel system because....cheap? I think he fell into the core trap with it. That said he likes it and has a R9 390 and can run BF1 well enough that he doesn't care. I'm still on a Phenom II x4 system and really feeling it in terms of gaming. Elite runs, but going to planets can lead to single digit FPS and while I can "run" BF1 I wouldn't exactly call it playable. Really think the R5 is currently the way to go. While memory seems to be a bit of a pain, on a B350 board the most you're going to hit is 3200 anyways so meh. In addition, with the AM4 platform you have a pretty obvious upgrade path even with a B350...which for better or worse I didn't get an AM3+ socket 6 years ago when I built this thing otherwise yeah I'd probably Bulldozer it up.

BF1 on my Phenom II X6 sits between 85-90% CPU and some physics poo poo bumps it up to 100% here or there making me go from 45-60fps (rx480 gpu) down to like 15-20 :smith:

Still waiting for the teething issues to iron out before looking at a new cpu, been eyeballing the 1700x :getin:

SwissArmyDruid
Feb 14, 2014

by sebmojo

Former founder and editor-in-chief of The Tech Report. He's the one that helped get AMD and Nvidia to start thinking about their products in terms of frame time as opposed to raw FPS numbers in the first place.

kirtar
Sep 11, 2011

Strum in a harmonizing quartet
I want to cause a revolution

What can I do? My savage
nature is beyond wild

Haquer posted:

BF1 on my Phenom II X6 sits between 85-90% CPU and some physics poo poo bumps it up to 100% here or there making me go from 45-60fps (rx480 gpu) down to like 15-20 :smith:

Still waiting for the teething issues to iron out before looking at a new cpu, been eyeballing the 1700x :getin:

I'm also just waiting for some of the teething issues to be sorted and will most likely build sometime mid-to-late May. At that point, the memory-compatibility AGESA update should hopefully have been pushed out in stable BIOSes.

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN

Haquer posted:

BF1 on my Phenom II X6 sits between 85-90% CPU and some physics poo poo bumps it up to 100% here or there making me go from 45-60fps (rx480 gpu) down to like 15-20 :smith:

Still waiting for the teething issues to iron out before looking at a new cpu, been eyeballing the 1700x :getin:

If I could get the X6 1100T for like $50 I'd probably try it. Squeeze maybe a 3.9 or 4.0 OC out of it and it could almost be playable. Personally I'm probably looking at a 1600. There's a Microcenter nearby and the 1600 plus an ASRock B350 is a touch over $250. The ASRock is a 9+2 power phase board, so moving to a higher-TDP chip down the line shouldn't pose too many issues or stress the board too much (if I understand how that works correctly). Although I'm still hemming and hawing over whether the X370 would be better for that kind of move. Luckily September is when I'm looking at doing my rebuild, so I have plenty of time to see how things shake out.

SwissArmyDruid posted:

Former founder and editor-in-chief of The Tech Report. He's the one that helped get AMD and Nvidia to start thinking about their products in terms of frame time as opposed to raw FPS numbers in the first place.

Very cool. I can see why he would be a good person to talk about VR stuff.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

rex rabidorum vires posted:

My brother upgraded to a FX 8350 from a 5 year old Intel system because....cheap? I think he fell into the core trap with it. That said he likes it and has a R9 390 and can run BF1 well enough that he doesn't care. I'm still on a Phenom II x4 system and really feeling it in terms of gaming. Elite runs, but going to planets can lead to single digit FPS and while I can "run" BF1 I wouldn't exactly call it playable. Really think the R5 is currently the way to go. While memory seems to be a bit of a pain, on a B350 board the most you're going to hit is 3200 anyways so meh. In addition, with the AM4 platform you have a pretty obvious upgrade path even with a B350...which for better or worse I didn't get an AM3+ socket 6 years ago when I built this thing otherwise yeah I'd probably Bulldozer it up.

5 year old Intel could mean Sandy or possibly Ivy. Unless we're talking dual cores, they would be better than Bulldozer.

rex rabidorum vires
Mar 26, 2007

KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN KASPERI KAPANEN

HalloKitty posted:

5 year old Intel could mean Sandy or possibly Ivy. Unless we're talking dual cores, they would be better than Bulldozer.

:shrug: It was probably older than that. He never OC'ed it and basically impulse bought the FX because cheap and MORE CORES, so whatever. In my quest to try and help someone in the PC Building thread I came across this: http://rymem.vraith.com/ which seems to be a repository of user-validated RAM speeds by type and board. Not sure if it'll be something super useful, but "what RAM should I use" and "which RAM works best/fastest" is probably going to be a recurring question with Ryzen for a while.

GRINDCORE MEGGIDO
Feb 28, 1985


I just want to congratulate the op for spelling platfrom correctly :v:

Prescription Combs
Apr 20, 2005
   6
So far I'm really digging my 1700. Running it stable at 4GHz just under 1.5V on water and it barely touches 60C under load. Could probably tune the voltage down a bit, but I just cranked the offset to +0.3 and let 'er rip.

GRINDCORE MEGGIDO
Feb 28, 1985


What speed memory? That's great temps.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
From last thread on DDR4 prices:

quote:

Demand is outstripping supply right now with the move to DDR4 and it can take years for new production lines to be built or old DDR3 production lines to be converted to DDR4 production.

Not only that, but DRAM/NAND production is overwhelmingly catered to mobile devices, and socketed RAM is now an afterthought. DDR3/4 was priced low in the past because it was overproduced in anticipation of a large PC demand spike from Win 8/10 that never materialized.

I'd go so far as to say the days of cheap desktop RAM are over now that the economics are altered permanently; the age-old "losing margins and making up in volume" business model is increasingly proving to be a trash concept, and manufacturers will never make the same PC demand mistake again.

Palladium fucked around with this message at 04:00 on Apr 16, 2017

Prescription Combs
Apr 20, 2005
   6

GRINDCORE MEGGIDO posted:

What speed memory? That's great temps.

A 3200 2x8GB kit running at 2666. I've been meaning to see if I can get the memory to run higher with some timing tweaking.

GRINDCORE MEGGIDO
Feb 28, 1985


Might get better with some BIOS updates on the way.
I'm going to get a 1700 when Asus get their head out of their rear end and make an itx board.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Isn't AMD doing an entirely different chipset for ITX? I'd imagine that would introduce some complications, even if it isn't that much of a difference and just "less RAM" and "only one PCIe slot".

Drakhoran
Oct 21, 2012

Watermelon Daiquiri posted:

Isn't AMD doing an entirely different chipset for ITX?

Kind of. AMD announced five different chipsets for AM4, of which two, the X300 and the A300, were for SFF PCs. So far I believe we've only seen boards based on X370 and B350.

Drakhoran fucked around with this message at 18:48 on Apr 16, 2017

Anime Schoolgirl
Nov 28, 2002

Drakhoran posted:

Kind of. AMD announced five different chipsets for AM4, of which two, the X300 and the A300, were for SFF PCs. So far I believe we've only seen boards based on X370 and B350.
They're not-actually-chipset chipsets, the implementation of which I figure AMD is apparently having a problem with :ohdear:

FuturePastNow
May 19, 2014


AMD will always have lovely chipsets and building them into the processor won't change that.

orcane
Jun 13, 2012

Fun Shoe
This X300 doesn't seem very interesting (PCI-E lanes, USB support, SATA ports - I know Ryzen brings a few of its own but still) and having it delay ITX boards makes it worse :ohdear:

I remember having great fun with awful VIA chipsets because those were the only chipsets I could get for some early AthlonXP builds :suicide:

orcane fucked around with this message at 18:46 on Apr 16, 2017

Anime Schoolgirl
Nov 28, 2002

FuturePastNow posted:

AMD will always have lovely chipsets and building them into the processor won't change that.
FM2 chipsets were actually really, really good. The processors themselves, though :ohdear:

GRINDCORE MEGGIDO
Feb 28, 1985


orcane posted:

I remember having great fun with awful VIA chipsets because those were the only chipsets I could get for some early AthlonXP builds :suicide:

:gibs: horrible.


Platystemon
Feb 13, 2012

as a person who never leaves my house i've done pretty well for myself.
lol if you didn’t use a VIA CPU with your VIA chipset
