|
gwrtheyrn posted:If my main use case is 11-boxing eve online, actively running another game, and watching some sort of streaming video (twitch, amazon, youtube), would ryzen be a better purchase over an intel hex-core with the information out so far? You are the use case that 8+ core CPUs are for, congratulations.
|
# ? Mar 3, 2017 23:37 |
|
gwrtheyrn posted:If my main use case is 11-boxing eve online, actively running another game, and watching some sort of streaming video (twitch, amazon, youtube), would ryzen be a better purchase over an intel hex-core with the information out so far? 8-core Ryzen is definitely the CPU for you
|
# ? Mar 3, 2017 23:38 |
|
gwrtheyrn posted:If my main use case is 11-boxing eve online, actively running another game, and watching some sort of streaming video (twitch, amazon, youtube), would ryzen be a better purchase over an intel hex-core with the information out so far? ...You have 11 EVE accounts? God drat. You can definitely afford to go whole hog with Ryzen then. Post a trip report.
|
# ? Mar 3, 2017 23:56 |
|
Kazinsal posted:...You have 11 EVE accounts? Active ones. I haven't paid real money for eve in years.
|
# ? Mar 3, 2017 23:59 |
|
All this talk of streaming games makes me feel like an old man who just doesn't understand these youths
|
# ? Mar 4, 2017 00:05 |
|
Palladium posted:Well, even AMD themselves are making their own SKUs above $320 look plenty bad already. Pay more for factory OC and XFR = lol Top-of-range parts are traditionally way out in the land of diminishing returns; the Ryzen 1800X is no different. When you look at a situation where that isn't the case -- like the Core i5s having hyperthreading disabled -- what you're seeing is a monopoly distortion, not "good value".
|
# ? Mar 4, 2017 00:23 |
|
WhyteRyce posted:All this talk of streaming games makes me feel like an old man who just doesn't understand these youths Why do you watch the Kings play basketball? Because it's entertaining and they're better at basketball than you. Apply that logic to a streamer playing a game you might also enjoy and is probably better at it than you.
|
# ? Mar 4, 2017 00:50 |
|
teagone posted:Kings teagone posted:better at basketball than you Well, on a good day at least.
|
# ? Mar 4, 2017 00:52 |
|
AMD posted:As a general guideline: a CPU voltage of up to 1.35 V is acceptable for driving everyday overclocks of the AMD Ryzen processor. Core voltages up to 1.45 V are also sustainable, but our models suggest that processor longevity may be affected. Regardless of your voltage, make sure you’re using capable cooling to keep temperatures as low as possible. Most R7 CPUs seem to go up to 3.8 GHz at that voltage, which matches the "Critical 2" point in the voltage vs. frequency chart linked earlier. This also explains why an 1800X with XFR only boosts to 3.7 GHz with load on all 8 cores; it only goes up to 4.1 GHz nominal turbo with load on up to two cores. Somebody linked an overclocking guide, open at your own risk, could have a virus embedded for all I know. http://www.mediafire.com/file/3knlj278nr6jdx9/C6H+XOC+Guide+v04.pdf
|
# ? Mar 4, 2017 00:59 |
|
Apparently X370 boards are basically nonexistent at this point. I'm probably looking to do something like:

CPU: 1700X
GPU: GTX 1080 (already have)
Motherboard: ASRock Taichi or Killer or something; everything is preorder, OoS, or backordered
RAM: 4x16GB G.Skill 3000 CAS 15
Cooler: NH-U12S
Case: Fractal Design R5
PSU: EVGA SuperNOVA 750W
SSD: 960 EVO 500GB

Not sure about going for a full 64GB of RAM, but I do probably want more than the 32 I have right now. Total cost is ~$1500 excluding the GPU, which I already have.
|
# ? Mar 4, 2017 01:04 |
|
teagone posted:Why do you watch the Kings play basketball? Because it's entertaining and they're better at basketball than you. Apply that logic to a streamer playing a game you might also enjoy and is probably better at it than you. If we're going with the Kings basketball analogy, it would be like watching someone play CoD badly and manage to staple their balls to their leg while doing it
|
# ? Mar 4, 2017 01:12 |
|
Sinestro posted:Well, on a good day at least. WhyteRyce posted:If we are going with the Kings basketball analogy it would be like watching someone poorly play CoD and manage to staple their balls to the their leg while doing it I laughed, well played. Haha.
|
# ? Mar 4, 2017 01:18 |
|
eames posted:Nice launch.
|
# ? Mar 4, 2017 02:26 |
|
eames posted:Most R7 CPUs seem to go up to 3.8 GHz at that voltage, which matches the "Critical 2" point in the voltage vs. frequency chart linked earlier. "If ratio is set above default (on 1800X = 36.25x), the CPU will enter “OC Mode” and disable CPU/XFR and any power saving or limitations." "- CPU temperature will read ~60 °C in BIOS due to no power savings enabled." "CPU Core Voltage reading from the SIO (BIOS/CPU-Z) fluctuates, use DMM for accurate readings." My god, rough around the edges is an understatement for AM4, or at least for this Asus Crosshair VI mobo. How is this excusable when Intel has had this right for almost 10 years? Palladium fucked around with this message at 02:29 on Mar 4, 2017 |
# ? Mar 4, 2017 02:27 |
|
Alereon posted:So to clarify, you're trying to use software to compress the video for the game you're playing in real-time? Yeah that'll be hard on a quad-core CPU but it seems like there's better solutions, like hardware encoding. Also, QuickSync duplicates your display when you encode with it, which is fine if you're playing at 1080p60, not so much when you're playing anything with a higher refresh rate or resolution
|
# ? Mar 4, 2017 02:37 |
|
Palladium posted:"If ratio is set above default (on 1800X = 36.25x), the CPU will enter “OC Mode” and disable CPU/XFR and any power saving or limitations." For weeks before the release, people in the thread were talking about the mobo manufacturers dragging their feet. The launch BIOSes, in video game terms, are alpha builds at best. Presumably because they didn't know Ryzen was a serious (solid?) performing chip and weren't planning on releasing so soon. Also maybe because AMD didn't whip them into shape fast enough; I don't know how this works. Who knows, maybe nine months ago the R5/R3's release date was THE release date.
|
# ? Mar 4, 2017 03:08 |
|
Palladium posted:How is this excusable? AMD has been broke as poo poo for a while and didn't have the resources. But if you look at the 480 at launch versus where it is today, you'll see improvements, because AMD doesn't just release a product and move on. The 480's DirectX 11 performance has improved a bunch since release and now equals a 1060. These are first-generation teething issues; AMD hasn't really released a new chip in 5 years, so they've forgotten how to do it. Look at the first-gen i7s, the 860 and 920, versus the 2500K: the first-gen i7s were good, but the refreshes were better, and you can see Zen doing the same. Also, SMT seems to be just stupid broken, so benchmarks seem off?
|
# ? Mar 4, 2017 03:22 |
|
It's a hilariously rushed launch; errata, BIOS, and software problems abound. But if I understand correctly, it has forced board manufacturers to start pushing product, which is what AMD wanted. If the 1 million CPUs already sold figure is correct, an accompanying number of motherboards usually follows. AMD could be willing to eat this if it means they get volume up, and a staggered release (my bet: R5 CPUs alongside the RX 500 series, and R3 CPUs in tandem with Raven Ridge) might mean the much more affordable, better-value-for-gamers R5 and R3 get rosier reviews.
|
# ? Mar 4, 2017 03:31 |
|
FaustianQ posted:I mean if the 1 million CPUs already sold are correct. Quarterly report should be interesting if true.
|
# ? Mar 4, 2017 03:48 |
|
This is no more chaotic than any previous AMD processor launch. If you're looking for perfection, you probably have some rose-tinted memories of Athlon 64. At least these problems are nothing compared to the hot garbage of Phenom I and Bulldozer.
FuturePastNow fucked around with this message at 03:52 on Mar 4, 2017 |
# ? Mar 4, 2017 03:50 |
|
VostokProgram posted:Video encoding seems like it would be exactly the kind of workload that would saturate a core's execution units. Not a lot of dependencies and branches, just lots and lots of math. I'm sure it's pretty cache efficient, yeah, but unless the AVX instructions block the whole core (would not be too surprised if they did) I don't see why it'd be a problem. Boiled Water posted:Very likely, but it looks like a decent amount of space is wasted compared to the good ol' LGA mounts. If you're counting the heat sink mount as wasted space, I'm going to count the label as the real wasted space
|
# ? Mar 4, 2017 07:54 |
|
MaxxBot posted:[H] did some VR stuff in their review, it looks stronger in VR than in the general gaming benchmarks. Truga posted:GPU encoding produces poo poo quality unless you're willing to stream at higher bitrates and then everything goes to poo poo anyway, because youtube will recompress your stream to poo poo so people can actually watch it.
|
# ? Mar 4, 2017 11:34 |
|
Some more news...

AMD SMT cores are mapped differently than Intel:
- Some websites claim that Intel's logical core mapping is: thread 1 of every physical core on logical CPUs 1,2,3...8, and thread 2 of every physical core on 9,10,11...16.
- AMD Ryzen logical cores are apparently mapped sequentially (one core at a time): CPU1 = 1,2; CPU2 = 3,4; ... CPU8 = 15,16.
- This causes problems in game engines that core-lock their 6-8 worker threads (assuming a console port). A game engine like this would only use 3 or 4 cores (both SMT threads on each) on an 8-core Ryzen, which would explain the huge gains some reviewers saw when disabling SMT. (See the sketch below.)
----------------
G.SKILL Announces Flare X Series and FORTIS Series DDR4 Memory for AMD Ryzen https://www.gskill.com/en/press/view/g-skill-announces-flare-x-series-and-fortis-series-ddr4-memory-for-amd-ryzen
---------------
https://twitter.com/Dresdenboy/status/837219996166205442 (Context: apparently the reason AIDA64 is not reporting L2/L3 stats correctly is that they were not given a Ryzen sample before launch. Changes coming now.)
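A toy illustration of that last bullet, nothing measured and both numbering schemes taken exactly as described above: if an engine naively pins 8 worker threads to logical CPUs 0..7, count how many physical cores actually get work under each mapping.
code:
/* Toy illustration (numbers invented): 16 logical CPUs over 8 physical
 * cores, 8 worker threads pinned to logical CPUs 0..7. How many physical
 * cores end up busy under each of the two numbering schemes above? */
#include <stdio.h>

int main(void)
{
    const int workers = 8;
    int used_a[8] = {0};    /* scheme A (claimed Intel): logical i -> core i % 8 */
    int used_b[8] = {0};    /* scheme B (sequential):    logical i -> core i / 2 */

    for (int i = 0; i < workers; i++) {
        used_a[i % 8] = 1;
        used_b[i / 2] = 1;
    }

    int a = 0, b = 0;
    for (int c = 0; c < 8; c++) {
        a += used_a[c];
        b += used_b[c];
    }
    printf("scheme A: %d physical cores busy\n", a);   /* prints 8 */
    printf("scheme B: %d physical cores busy\n", b);   /* prints 4 */
    return 0;
}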
|
# ? Mar 4, 2017 11:56 |
|
Dante80 posted:AMD SMT cores are mapped differently than Intel: Why would they do this?
|
# ? Mar 4, 2017 11:59 |
|
Twerk from Home posted:Where is AVX2 expected to matter and do any of these reviews look at AVX2 256-bit wide commands? HPC stuff. Doesn't really matter for games or desktop stuff. Maybe in several years that will change, but not a whole lot looks to benefit from it on the desktop. Maaaybe physics engines in games? That could just as easily get tossed at the GPU though.
|
# ? Mar 4, 2017 12:06 |
|
Platystemon posted:Why would they do this? How do you mean? The scheduling is the same as what the consoles have, so it was pretty easy for AMD to do the same (it also makes logical sense given the way the uarch is arranged on both dies). Microsoft and game engines on PC have different scheduling, and one of the jobs needed to port a game to PC from the newest consoles is to change said scheduling from what Jaguar does to what Intel does (AMD never had SMT before, so there was no need to do otherwise in the past). Thus, without proper patches to Windows and the games themselves, we see Ryzen performance getting worse when SMT is enabled. Which also says a lot about how loving clown-ish and rushed the launch was... AMD stock has lost more than $1.5bn in two days due to their PR department lol. Dante80 fucked around with this message at 12:33 on Mar 4, 2017 |
# ? Mar 4, 2017 12:30 |
|
I mean, why would they not go with Intel’s convention? This is a foreseeable outcome of the failure to do so. I don’t care if AMD engineers think that this ordering scheme is more logical. When in Rome, do as the Romans do.
|
# ? Mar 4, 2017 12:41 |
|
Intel cores have always been arranged as 1/H1/2/H2/3/H3... and so on in Windows, as Ryzen's are. I'm pretty sure that's just baseless theorycrafting, when the real problem is that SMT just doesn't work properly in the first place. On Unix-based systems, Intel cores are arranged as 1/2/3/4/H1/H2/H3/H4; I can see how problems would arise if Ryzen doesn't follow this convention. Anime Schoolgirl fucked around with this message at 12:56 on Mar 4, 2017 |
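For what it's worth, on Linux you can check which convention a given box actually uses by reading the SMT sibling lists out of sysfs. A minimal sketch (standard sysfs topology files, nothing Ryzen-specific assumed; a pair like "0-1" means adjacent siblings, "0,8" means the split ordering):
code:
/* Sketch: print each logical CPU's SMT sibling list on Linux, to see
 * which numbering convention the system uses. Stops at the first
 * missing cpuN directory. */
#include <stdio.h>

int main(void)
{
    char path[128], buf[64];

    for (int cpu = 0; ; cpu++) {
        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu%d/topology/thread_siblings_list",
                 cpu);
        FILE *f = fopen(path, "r");
        if (!f)
            break;                              /* no more CPUs */
        if (fgets(buf, sizeof buf, f))
            printf("cpu%d siblings: %s", cpu, buf);  /* e.g. "0-1" or "0,8" */
        fclose(f);
    }
    return 0;
}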
# ? Mar 4, 2017 12:47 |
|
Dante80 posted:Thus, without proper patches to Windows and the games themselves, we see Ryzen performance getting worse when SMT is enabled. It looks like Windows is already working correctly: The Stilt ran Coreinfo against his Ryzen chip, and the logical cores are being detected as sequential pairs, as you described: code:
repiv fucked around with this message at 13:43 on Mar 4, 2017 |
# ? Mar 4, 2017 12:57 |
|
So the performance disparity with SMT on could be attributed to the OS or the engine not seeing that the chip is two 4-core complexes, so instead of utilizing the L3 slaved to each complex, it puts the interconnect (the fabric, or whatever it's called) into overdrive? That would add a lot of latency, right? In other news, it seems that memory is pretty important to the platform. Memory Speed Has a Large Impact on Ryzen Performance Dante80 fucked around with this message at 13:22 on Mar 4, 2017 |
# ? Mar 4, 2017 13:14 |
|
Dante80 posted:Some more news... On my 3770K, core 1 is the second thread of core 0, core 2 is physical, core 3 is the second thread of core 2, etc., alternating. This launch has kind of sucked really badly for AMD. I think they should have delayed it until the Vega launch, so they could have worked out all the bugs in the firmware and so on (so nobody ends up with pre-release firmware on their board giving lovely performance), and also so they could generate Vega sales with new Ryzen/Vega systems as opposed to Ryzen/NVIDIA systems.
|
# ? Mar 4, 2017 13:28 |
|
Dante80 posted:Which also speaks a lot about how loving clown-ish and rushed the launch was...AMD stock has lost more than $1.5bn in two days due to their PR department lol. Well, it gained quite a bit of that 1.5 billion due to their PR department as well, by handing out pre-release samples selectively to hypesters and LN overclockers.
|
# ? Mar 4, 2017 13:33 |
|
Says the difference from DDR4-2133 (CL10) to DDR4-3466 (CL14) is about a 10% increase in performance, which is noticeable. Weird to see performance improve that much from RAM speed. I wonder if the firmware updates they're going to be doing will make a difference there. DDR4-3466 (CL14) isn't easy to achieve on Ryzen right now. Some AMD-specific DRAM is coming for Ryzen from G.Skill, I think, so maybe that will help. HalloKitty posted:This launch has kind of sucked really badly for AMD. Realistically they seem to be selling well, if some of the rumors are anything to go by, so people are at least giving them some benefit of the doubt. But yeah, delaying another month or 2 to let things get fixed would've been a drat good idea. I have no clue why they rushed things. \/\/\/\/ On the value stuff, yeah, it really could be that simple; the higher performance stuff might be a different story \/\/\/\/ PC LOAD LETTER fucked around with this message at 13:47 on Mar 4, 2017 |
# ? Mar 4, 2017 13:36 |
|
AMD-specific meaning single-rank DIMMs
|
# ? Mar 4, 2017 13:38 |
|
PC LOAD LETTER posted:But yeah delaying another month or 2 to let things get fixed would've been a drat good idea. I have no clue why they rushed things. AMD is still a loss-making company; cash flow problems could have dictated that they release Ryzen as soon as they could. There's a good chance that performance will improve with firmware updates and as games or whatever are optimised for the new architecture. Meanwhile, they'll have more cash on hand for the Vega launch.
|
# ? Mar 4, 2017 14:34 |
|
Cardboard Box A posted:This is not borne out by the OBS tests https://obsproject.com/forum/threads/comparison-of-x264-nvenc-quicksync-vce.57358/ Are we reading the same thread? "mse" is mean squared error, i.e. "how much of this video differs from the lossless one". x264 is noticeably better in general, and copes far better with high-motion games like 1st/3rd person games, which I stream. Again, from that thread: quote:From this computer-generated rating with mainly the mse as criteria, you may come to the conclusion that NVENC is on par with x264 preset=veryfast (the default in OBS), or even a bit better, but unfortunately it isn't. At least for high motion scenes. I was super hyped about GPU encoding some years ago, but it just can't be that good when you actually sit down and think about it. It's annoying and I wish it weren't the case, but there's not much that can be done about this, especially for high-motion games. GPUs just aren't suited very well to encoding video: video encoding is highly deterministic and thus isn't something that profits much from parallel threads beyond a certain point. You need the previous frame to encode the next one correctly, and the next frame might be completely different. Highly simplified: by the time a frame gets through a GPU's longass pipeline, a CPU has already gone over it 4-5 times, choosing progressively better settings each time. GPUs, on the other hand, just throw n threads at the problem with a rainbow of settings based loosely on the previous frames, and hope it produces a good frame. You can see how this might not be ideal in many scenarios. There's a reason a decent encoding card costs $1000 or more. :P Also, I've been getting the EVE itch again, but that guy with 11 clients reminded me why I stopped playing. Thanks, goons!
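A toy model of that frame-to-frame dependency (invented numbers, not any real codec; the point is just that step i needs the reconstruction produced by step i-1, so the chain is inherently serial):
code:
/* Toy model of inter-frame encoding's serial dependency: each frame is
 * coded against the *reconstruction* of the previous frame, so the loop
 * carries a dependency and can't simply be split across threads.
 * Integers stand in for frames; the "codec" is a crude quantizer. */
#include <stdio.h>

int main(void)
{
    enum { N = 8 };
    int frames[N] = {10, 12, 15, 15, 40, 41, 39, 80};  /* fake frame values */
    int recon = frames[0];                              /* intra keyframe */

    for (int i = 1; i < N; i++) {
        int residual = frames[i] - recon;   /* predict from previous recon */
        int coded = (residual / 4) * 4;     /* lossy "quantization" */
        recon += coded;                     /* decoder-side reconstruction;
                                               frame i+1 depends on this */
        printf("frame %d: residual %3d, coded %3d, recon %3d\n",
               i, residual, coded, recon);
    }
    return 0;
}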
|
# ? Mar 4, 2017 14:48 |
|
Beating x264 on “veryfast” is better than I expected for hardware encoding. quote:From this computer-generated rating with mainly the mse as criteria, you may come to the conclusion that NVENC is on par with x264 preset=veryfast (the default in OBS), or even a bit better, but unfortunately it isn't. At least for high motion scenes. Well then.
|
# ? Mar 4, 2017 14:58 |
|
Yeah, hardware encoding is getting pretty good; in about 5 more years I expect it to be on par
|
# ? Mar 4, 2017 15:02 |
|
Platystemon posted:Why would they do this? Game developers are generally under crunch pressure and don't make the best decisions. https://msdn.microsoft.com/en-us/library/windows/desktop/ms683194(v=vs.85).aspx is the right API to use, and I guess they didn't use it!
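For reference, that MSDN page appears to document GetLogicalProcessorInformation (my assumption about where the link resolves). A minimal sketch of using it to enumerate physical cores and their logical-CPU masks, so a thread pool can pin one worker per physical core no matter how the OS numbers SMT siblings:
code:
/* Sketch: enumerate physical cores via GetLogicalProcessorInformation
 * (assuming that's the API the link above documents). Error handling
 * kept to a minimum. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    DWORD len = 0;

    /* First call deliberately fails and tells us the buffer size needed. */
    GetLogicalProcessorInformation(NULL, &len);
    SYSTEM_LOGICAL_PROCESSOR_INFORMATION *info = malloc(len);
    if (!info || !GetLogicalProcessorInformation(info, &len))
        return 1;

    for (DWORD i = 0; i < len / sizeof(*info); i++) {
        if (info[i].Relationship == RelationProcessorCore) {
            /* Each set bit is one logical CPU on this physical core;
             * two bits set means a pair of SMT siblings. */
            printf("physical core: logical mask = 0x%llx\n",
                   (unsigned long long)info[i].ProcessorMask);
        }
    }
    free(info);
    return 0;
}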
|
# ? Mar 4, 2017 15:07 |
|
This is a tangentially related question, but I don’t know of a thread it fits in better and we’re on the subject anyway: What hardware do commercial video streaming services use for encoding?
|
# ? Mar 4, 2017 15:14 |