|
Paul MaudDib posted:these are totally separate tasks and my best advice is to sever, but the 5820K is going to be your best mix of gaming and also more than 4 cores. Glad to see my 5775c way up there... Glad I didn't get on the ryzen hype train!
|
# ? Mar 2, 2017 19:41 |
|
|
|
Paul MaudDib posted:Right now that's probably the most solid option. Or hoping for Haswell-E or 7700K price drops. My computer is 6-7 years old and no longer keeping up effectively with modern titles, and I have enough money in the bank now that I can build a new one without being one dead car away from desperation, so this is when I'm doing it. To emphasize the timeframe I'm looking at, I had the entire build in the newegg shopping cart but decided to wait and see if the 1080ti / Ryzen releases would have any knock-on effects for the kind of parts I'm looking to buy. Waiting another week for the 1080ti price effects to trickle, down, I'm fine with. Even assuming AMD works out all the kinks and the 4 core Ryzen are competitive options for my needs, I would prefer to have the system built sooner. quote:The 5820K is a Zen that you can buy right now and also get good gaming performance out of. The 7700K/7600K are the clear winners for maxxx single-thread performance. No, you're thinking of someone else. I'm building a new system from scratch, and definitely not with any $800 parts in it.
|
# ? Mar 2, 2017 19:48 |
|
This graph by the Anandtech poster linked earlier is really interesting. Zen seems most efficient up to 3.3 GHz and hits a wall around 3.9 GHz. That's realistically going to be the absolute maximum on air cooling.
|
# ? Mar 2, 2017 19:52 |
|
Voyager I posted:No, you're thinking of someone else. I'm building a new system from scratch, and definitely not with any $800 parts in it. sorry, I thought you were talking about spending at least $320 on a Ryzen processor if not $500, and then another $200-300 on a motherboard, and then another $200 on DDR4 if you aren't ready to spend $800 I think you have the wrong thread
|
# ? Mar 2, 2017 19:53 |
|
Paul MaudDib posted:sorry, I thought you were talking about spending at least $320 on a Ryzen processor if not $500, and then another $200-300 on a motherboard, and then another $200 on DDR4 I misunderstood you then. I am spending somewhere north of $1,000 in total, but on an entirely new machine and those specific examples are definitely not from me. I waited on building to see if Ryzen would have any immediate effects on my market, and it looks like the answer is 'no' since they products they offered are 8 core processors that aren't optimized for gaming. Voyager I fucked around with this message at 19:58 on Mar 2, 2017 |
# ? Mar 2, 2017 19:56 |
|
Paul MaudDib posted:these are totally separate tasks and my best advice is to sever, but the 5820K is going to be your best mix of gaming and also more than 4 cores. Unfortunately, after looking into it, a 5820K is 24% more expensive than a 1700, and the cheapest X99 motherboard I can buy is 33% more expensive than the not-cheapest X370 motherboard I would buy. The difference is about $165 USD, which I'd rather put toward not having the massive GPU bottleneck of a GTX 950.
|
# ? Mar 2, 2017 20:03 |
|
eames posted:some actually useful responses over in the AMA, i.e. one related to lacking SMT performance: Lisa Su apparently takes gamers for complete idiots. If you up the resolution, that workload falls on the GPU, not the CPU. What the gently caress? Am I stupid for thinking they have marketed their CPU the wrong way by targeting it at gamers when this CPU is clearly better somewhere else? Also, who would buy a $350-500 CPU for 1080p gaming when most people rarely spend more than 700 bux on a desktop anyway? I love cats fucked around with this message at 20:09 on Mar 2, 2017 |
# ? Mar 2, 2017 20:04 |
|
https://www.youtube.com/watch?v=BXVIPo_qbc4 Interesting. It is that outlier guy.
|
# ? Mar 2, 2017 20:12 |
|
It at least makes AMD not a brain-dead choice for system builders, but yeah - many of those game benchmarks are a little concerning. It's not so much that the 7700K is beating it so soundly in some games, it's that the $65 G4560 is so close in many games as well - I'm far more interested in the 6 and 4-core variants coming later, and this doesn't really bode well for their value proposition in at least gaming terms atm. I'm sure their longevity prospects are far better than a 2c/4t CPU's (probably?), but when we're talking CPUs in the $100 range I don't think most expect ~4 years of use regardless. We'll see with further optimizations. I'd like to see Watch Dogs 2 performance really as that's supposedly one of the most well-threaded games out there, haven't seen it being tested yet.
|
# ? Mar 2, 2017 20:15 |
|
Happy_Misanthrope posted:I'd like to see Watch Dogs 2 performance really as that's supposedly one of the most well-threaded games out there, haven't seen it being tested yet. GamersNexus did Watch Dogs 2: http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7
|
# ? Mar 2, 2017 20:18 |
|
repiv posted:GamersNexus did Watch Dogs 2: http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7 Youch.
|
# ? Mar 2, 2017 20:23 |
|
Happy_Misanthrope posted:It at least makes AMD not a brain-dead choice for system builders, but yeah - many of those game benchmarks are a little concerning. It's not so much that the 7700K is beating it so soundly in some games, it's that the $65 G4560 is so close in many games as well - I'm far more interested in the 6 and 4-core variants coming later, and this doesn't really bode well for their value proposition in at least gaming terms atm. I'm sure their longevity prospects are far better than a 2c/4t CPU's (probably?), but when we're talking CPUs in the $100 range I don't think most expect ~4 years of use regardless. Is there a Cities: Skylines benchmark with Zen? I remember Cities being a CPU-heavy game more than a GPU-heavy game in some benchmarks.
|
# ? Mar 2, 2017 20:25 |
|
I love cats posted:Lisa su apparently takes gamers for complete idiots. If you up the resolution, that workload falls on the GPU, not CPU. What the gently caress? They're desperate to portray it as useful for games, because that's somewhere they've been really hosed for years and years. Unfortunately it doesn't really work to fix the problems they had there, although it is still way better than bulldozer garbage.
|
# ? Mar 2, 2017 20:29 |
|
So at resolutions of 1440p+ and in VR, would Ryzen fare better than in 1080p gaming, which, while most common, is becoming less of the target as games progress? Any VR benchmarks yet?
|
# ? Mar 2, 2017 20:33 |
|
So far the only one I remember seeing was on [H]
|
# ? Mar 2, 2017 20:36 |
|
We need an :amd: like we have a Anyways I'm gonna go price out an i7-7700K system since my GTX 1070 bottlenecks my loving i7-3820
|
# ? Mar 2, 2017 20:41 |
|
wargames posted:Is there a Cities: Skylines benchmark with Zen? I remember Cities being a CPU-heavy game more than a GPU-heavy game in some benchmarks. Would also be really interested to see a Cities Skylines benchmark. The game is very CPU intensive but there's conflicting info out there about how well-threaded it is. I'm planning on building a new computer soon and it looks like I'll be going with a 7700K, but I'm interested to see how Ryzen stacks up on Cities.
|
# ? Mar 2, 2017 20:41 |
|
EdEddnEddy posted:So at resolutions of 1440p+ and in VR, would Ryzen fare better than in 1080p gaming, which, while most common, is becoming less of the target as games progress? Any VR benchmarks yet? FCAT just got updated to support VR so you'll probably see a big wave of VR benchmarks soon. I'm sure some sites will try running it with different CPUs.
|
# ? Mar 2, 2017 20:43 |
|
repiv posted:GamersNexus did Watch Dogs 2: http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7 Going under a 60 FPS minimum in Watch Dogs for the 1800X really gives me the uncomfortes. Like, I don't plan on buying a monitor that goes over 60Hz, but the least you could do is hit the 60 frame minimum on these types of games.
|
# ? Mar 2, 2017 20:43 |
|
wrong post
|
# ? Mar 2, 2017 20:47 |
|
Edit- Ha, fixed before I posted..
|
# ? Mar 2, 2017 20:47 |
|
repiv posted:GamersNexus did Watch Dogs 2: http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7 Thanks. Still losing to the 7700k but at least blowing away the i3's in that game.
|
# ? Mar 2, 2017 20:52 |
|
EdEddnEddy posted:So at resolutions of 1440p+ and in VR, would Ryzen fare better than in 1080p gaming, which, while most common, is becoming less of the target as games progress? Any VR benchmarks yet? Increasing the resolution increases the load on the GPU, not the CPU. Match a $250-350 CPU with a powerful GPU, i.e. a GTX 1070 or 1080, and you should be fine. Don't buy into the marketing nonsense that tells you that you absolutely positively need the most expensive CPU to run games at higher resolutions.
|
# ? Mar 2, 2017 20:54 |
|
Ryzen doesn't seem to handle draw calls very well compared to Intel, which could explain the discrepancy between its synthetic CPU and gaming performance, and AMD nudging people toward higher resolutions. Higher resolution = lower FPS = fewer draw calls. A stock Ryzen 1800X at 3.6 GHz with an R9 Nano scored 12.8 FPS in a draw-call-heavy benchmark; a 3.9 GHz Haswell + R9 Nano scored 17.8. That's almost a 30% difference after normalizing for clock speed, assuming perfect frequency scaling. I believe VR applications are also very draw call heavy.
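Quick sanity check on that ~30% figure, using the FPS numbers quoted above (the assumption that performance scales linearly with clock speed is mine, and is optimistic):

```python
# Clock-normalized comparison of the quoted draw-call benchmark scores.
# Assumption: FPS scales linearly with CPU frequency (perfect scaling).
ryzen_fps, ryzen_ghz = 12.8, 3.6      # stock Ryzen 1800X + R9 Nano
haswell_fps, haswell_ghz = 17.8, 3.9  # Haswell @ 3.9 GHz + R9 Nano

# Project the Ryzen score up to Haswell's 3.9 GHz clock
ryzen_scaled = ryzen_fps * (haswell_ghz / ryzen_ghz)

# Relative gap at equal clocks
gap = (haswell_fps - ryzen_scaled) / ryzen_scaled

print(f"Ryzen scaled to 3.9 GHz: {ryzen_scaled:.1f} FPS, gap: {gap:.0%}")
# prints: Ryzen scaled to 3.9 GHz: 13.9 FPS, gap: 28%
```

So "almost 30%" checks out: even crediting Ryzen with perfect frequency scaling to 3.9 GHz, Haswell is about 28% ahead in that benchmark.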
|
# ? Mar 2, 2017 21:11 |
|
There is _one_ use case where more cores improve VR, and that's in terms of tracking. I believe that Oculus dedicates an entire thread to each camera for positioning? In which case, it will gobble that poo poo up. I don't know how the Vive handles their tracking, since the lighthouses are just strobing beacons.
|
# ? Mar 2, 2017 21:11 |
|
Pawn 17 posted:Wow, I completely missed the fact that the tech demos showing equal or better performance for gaming with Ryzen vs. Intel were 4K and GPU bottlenecked. So tricky! Ehh, the 1800X is a great all-rounder and blows away HEDT on price-to-performance. Nobody smart thought it was going to beat a 7700K for gaming perf.
|
# ? Mar 2, 2017 21:21 |
|
HalloKitty posted:I think AMD have made a mistake by releasing only an 8-core launch line-up, because 8-core is still basically kind of niche. A solid 4 and 6 core offering with clocks a bit over 4GHz would have made more sense for the mainstream gaming market. But maybe they can't push it any more, it certainly seems that way. GloFlo strikes again?
Bareback Werewolf posted:Can somebody explain to me why they're doing a staggered release of the 8 core, 6 core, and 4 core parts? How many people do they think need a high end 8 core / 16 thread processor?
I think there are several reasons that lead to this:
1. At least the 6 core parts, possibly one or more 4 core SKUs(?), are physically 8 core parts where one or more cores failed verification. Releasing them later gives you more time to build up supply as a by-product of 8 core production.
2. As you imply, 8 cores is still overkill for mainstream desktop/gaming use. Launching these first at top dollar lets you fleece the fanboys who want the highest-end part and want it first.
3. You also make a splash by launching the biggest version with the highest numbers.
4. It's clear that there are still bugs to shake out. There are still memory timing/access issues. Games need patches to optimize for the different architecture, and Windows needs some tweaks as well, for which AMD needs to wait for at least one, probably 2 or 3 Patch Tuesdays to pass. By launching the too-many-cores-for-gaming version first you get the rest of the ecosystem to start tuning and shaking out bugs before that launch.
5. For gaming you want to hit higher speeds; waiting to release the more gaming-suitable chips gives your fab partner more time to work out kinks and bin faster chips.
6. AMD's GPU division isn't ready to release Vega yet; they may be planning a simultaneous release?
|
# ? Mar 2, 2017 21:23 |
|
repiv posted:I was looking for an excuse to get rid of this 2500k but I'm still not feeling it I got bored and purchased a second hand 3770K, delidded it, and settled on an overclock of 4.7GHz with decent temps and reasonable voltage. My 2500K was doing the business just fine, but I had an itch. It seems silly, but now I don't feel like it was a waste of time, since it's still very competitive in gaming performance... Prescription Combs posted:Glad to see my 5775c way up there... Glad I didn't get on the ryzen hype train! What an absolute beast. I'll say it again: Intel should be giving us Kaby Lake with that drat cache... HalloKitty fucked around with this message at 22:07 on Mar 2, 2017 |
# ? Mar 2, 2017 22:02 |
|
Dante80 posted:https://www.youtube.com/watch?v=BXVIPo_qbc4 This idiot thinks turning off v-sync means his GPU can't be the bottleneck. While staring at 99% pegged GPU utilization and less than 50% CPU utilization on the Ryzen. Yeah, I think we see why there's an outlier.
|
# ? Mar 2, 2017 22:21 |
|
Rastor posted:2. As you imply, 8 cores is still overkill for mainstream desktop/gaming use. Launching these first at top dollar lets you fleece the fanboys who want the highest-end part and want it first. I don't believe this is true at this point. Benchmarks on a sterile test system show no real gain from 8c, but real world scenarios where you have something streaming on your second monitor, are streaming yourself, playing one game while sitting in queue for another, etc are all going to favor more cores than just a game needs. That and as we continue into the main era of this console generation, releases are going to be more and more optimized for the 6-7 cores those machines make available.
|
# ? Mar 2, 2017 22:28 |
|
K8.0 posted:I don't believe this is true at this point. Benchmarks on a sterile test system show no real gain from 8c, but real world scenarios where you have something streaming on your second monitor, are streaming yourself, playing one game while sitting in queue for another, etc are all going to favor more cores than just a game needs. That and as we continue into the main era of this console generation, releases are going to be more and more optimized for the 6-7 cores those machines make available. Arstechnica did a test on Dota2 + OBS, which did show a smaller FPS drop with OBS than the 7700K, but also lower average and 99th percentile FPS in both cases.
|
# ? Mar 2, 2017 22:35 |
|
What are those Turkish overclocker reviewer guys saying about the lacklustre overclocking? This is pretty contrary to their comments (also that it would leave a 7700K "as history" in gaming). What we've seen so far indicates they are full of poo poo. /drops the mic like Salt bae GRINDCORE MEGGIDO fucked around with this message at 22:42 on Mar 2, 2017 |
# ? Mar 2, 2017 22:35 |
|
K8.0 posted:I don't believe this is true at this point. Benchmarks on a sterile test system show no real gain from 8c, but real world scenarios where you have something streaming on your second monitor, are streaming yourself, playing one game while sitting in queue for another, etc are all going to favor more cores than just a game needs. That and as we continue into the main era of this console generation, releases are going to be more and more optimized for the 6-7 cores those machines make available. We apparently have different definitions of "mainstream" desktop computer use.
|
# ? Mar 2, 2017 22:38 |
|
Deuce posted:This idiot thinks turning off v-sync means his GPU can't be the bottleneck. He's chatting live with the Gamers Nexus guy right now. Maybe he can learn something
|
# ? Mar 2, 2017 22:44 |
|
Horrific launch, holy poo poo. It's Ivy Bridge-E IPC at best, and really it should never have been pushed beyond 3.3-3.5 GHz according to that guy on Anandtech, because it goes bonkers power-hungry from there (basically the 1700X and 1800X are pre-OC'ed, lol). I was at least hoping the TDP numbers were close to honest, because that would mean good things for the server space, but lol nope. I'm starting to think there is nowhere to go from here except a new process. I have absolutely no faith GloFo can produce poo poo with the higher clocks and lower voltages required; gently caress, the last good CPU AMD had showed similar clock speed behavior, so why would this be any different? Also this means no price drops on older Intel processors. BTW Paul, not everyone has the ability to walk into a Microcenter to get cheap deals. I hate when people bring this up as if it's some definitive answer to Intel pricing.
|
# ? Mar 2, 2017 22:49 |
|
It's honestly the launch I, and I bet others here, expected from AMD. It's literally the 480 launch all over again, but really, it's nice to see AMD compared to modern processors again and not just the i3s/non-K i5s, finally. I realllllllly do want to see what AMD can launch once they start using Samsung instead of GloFo though.
|
# ? Mar 2, 2017 22:53 |
|
FaustianQ posted:Horrific launch, holy poo poo. It's Ivy Bridge-E IPC at best, no.
|
# ? Mar 2, 2017 22:54 |
|
SwissArmyDruid posted:There is _one_ use case where more cores improve VR, and that's in terms of tracking. I believe that Oculus dedicates an entire thread to each camera for positioning? In which case, it will gobble that poo poo up. How is VR benchmarking going to work? As someone who got a vive over the winter I'm interested in some VR benchmarks
|
# ? Mar 2, 2017 22:54 |
|
Dante80 posted:no. How not? I'm seeing Ivy processors keeping up with it, and it can barely keep up with Haswell according to Stilt on Anandtech.
|
# ? Mar 2, 2017 22:55 |
|
|
|
Rabid Snake posted:How is VR benchmarking going to work? As someone who got a vive over the winter I'm interested in some VR benchmarks Negative testing for pukes per session FaustianQ posted:How not? I'm seeing Ivy processors keeping up with it, and it can barely keep up with Haswell according to Stilt on Anandtech. https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/3/ https://www.computerbase.de/2017-03/amd-ryzen-1800x-1700x-1700-test/4/ It's not great but it's not bad
|
# ? Mar 2, 2017 22:57 |