|
Are Jaguar cores based on Bulldozer? Everyone thinks that consoles have "8" cores, but if they're actually terrible CMT cores then a quad core Ryzen would be an upgrade. But I'm sure most people would be confused about why the core count "downgraded"
|
# ¿ May 21, 2017 18:26 |
|
Perhaps in his place a new engineer will ryze
|
# ¿ May 27, 2017 01:10 |
|
SwissArmyDruid posted:...how is that loving _entry level_? Isn't entry level supposed to be some 10-core part with SMT, clocked up to 3.4 with boost to 3.8? I think it means the cheapest 16-core, not the cheapest out of any threadripper.
|
# ¿ Jun 6, 2017 22:56 |
|
Risky Bisquick posted:Conspiracy theory: Mac Pros are being held back to plop in Threadrippers. They could increase their margins and increase the price to the consumer. I agree
|
# ¿ Jun 9, 2017 19:04 |
|
Paul MaudDib posted:Subtracting performance from Intel's benchmark results is bullshit which completely invalidates all of those results. And just to be clear, they are cutting the performance of the Intel chips almost in half. That's an immense level of bullshit right there. And if they really want an apples-to-apples comparison they could run all the benchmarks with gcc and use those numbers. Arbitrarily cutting numbers is so hilariously bad that I can't believe even the worst marketing idiot would expect it to fly.
|
# ¿ Jun 21, 2017 02:43 |
|
I much prefer Ryzen, Threadripper, and EPYC over "7th Generation Intel yawnzzzzz..."
|
# ¿ Jun 30, 2017 02:28 |
|
JnnyThndrs posted:
I was looking at some computer parts with my dad about a year ago and when he saw the Pentium name he was really surprised that Intel would still be selling such an old chip
|
# ¿ Jun 30, 2017 03:45 |
|
Cygni posted:i mean all the good motherboards let you turn the LEDs off and they cost pennies, thats why they are on everything. i dunno if its too worth getting worked up over, especially when unironically buying a cpu named _-~+=ThReAdRiPpEr=+~-_ old people get angry at diodes
|
# ¿ Jul 27, 2017 07:34 |
|
a funny thought occurred to me: assuming AMD has a policy of only using their own CPUs in their workstations, their employees were probably extremely happy when ryzen hit production
|
# ¿ Aug 2, 2017 05:10 |
|
you can read this for some information from back then: http://techreport.com/review/21813/amd-fx-8150-bulldozer-processor here's a paragraph from the conclusion: Scott Wasson posted:Faced with such results, AMD likes to talk about how Bulldozer is a "forward-looking" architecture aimed at "tomorrow's workloads" that will benefit greatly from future compiler and OS optimizations. That is, I am sure, to some degree true. Yet when I hear those words, I can't help but get flashbacks to the days of the Pentium 4, when Intel said almost the exact same things—right up until it gave up on the architecture and went back to the P6. I'm not suggesting AMD will need to make such a massive course correction, but I am not easily sold on CPUs that don't run today's existing code well, especially the nicely multithreaded and optimized code in a test suite like ours. The reality on most user desktops is likely to be much harsher. Scott was right
|
# ¿ Aug 2, 2017 06:30 |
|
Risky Bisquick posted:The current AM4 CPU and B350 board will work, this does not mean new cpus will necessarily work without a different board/chipset. See Intel. AMD is historically pretty good about this though, and I believe they've said AM4 will be supported through 2020. Bigger question is whether there will be any CPUs that actually warrant an upgrade in that time.
|
# ¿ Aug 2, 2017 19:26 |
|
FaustianQ posted:Zen2 is most assuredly going to be AM4, and will likely be very worth it, ignoring any potential gains AMD might get from Zen steppings in 2018. Oh I have no doubts that it will be good, but will it be good enough that if you spent $200 or $300 on a Ryzen today, you would then get your money's worth out of another $200 or $300 Zen 2 chip?
|
# ¿ Aug 2, 2017 19:54 |
|
EoRaptor posted:Compilers have gotten better (In fact, a lot better, LLVM was/is a major advance in compiler design), however it turns out what itanium needed from a compiler was perfect knowledge of all possible operations an application could perform and all the CPU states that would result, which even with very simple code seems to be an NP hard problem. Also, compilers are still written by humans and aren't capable of the level of 'perfection' needed to even approach what itanium demanded. I agree with what you're saying about itanium but I'm curious why you say LLVM is a major advance in compiler design. To my knowledge its backend isn't as good as GCC or (some) commercial compilers
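The itanium point can be made concrete with a toy example. Here's a minimal greedy list scheduler in the spirit of what a VLIW-ish compiler has to do statically (all op names and latencies are made up for illustration; real schedulers are vastly more complex — the point is the schedule is only good if the assumed latencies hold, which one unexpected cache miss breaks):

```python
# Toy greedy list scheduler: pack independent ops into fixed-width
# bundles, assuming the compiler knows every instruction's latency
# up front. Op names and latencies below are invented for the demo.

def schedule(ops, deps, latency, width=3):
    """ops: op names in program order; deps: {op: set of input ops};
    latency: {op: cycles}. Returns one bundle (list of ops) per cycle."""
    ready_at = {}          # op -> cycle its result becomes available
    done = set()
    bundles = []
    cycle = 0
    while len(done) < len(ops):
        bundle = []
        for op in ops:
            if op in done or len(bundle) >= width:
                continue
            # issue only if every input has been issued AND its result
            # is available by this cycle (the static latency assumption)
            if all(d in done and ready_at[d] <= cycle for d in deps.get(op, ())):
                bundle.append(op)
        for op in bundle:
            done.add(op)
            ready_at[op] = cycle + latency[op]
        bundles.append(bundle)
        cycle += 1
    return bundles

ops = ["load_a", "load_b", "add", "store"]
deps = {"add": {"load_a", "load_b"}, "store": {"add"}}
lat = {"load_a": 2, "load_b": 2, "add": 1, "store": 1}
print(schedule(ops, deps, lat))
# the empty bundle is a cycle the compiler must statically leave idle;
# if load_a actually misses cache, the whole schedule is wrong anyway
```

If you bump `load_a`'s latency from 2 to 50 to model a cache miss, the static schedule balloons with idle bundles — which is roughly the wall itanium's compilers hit.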
|
# ¿ Aug 26, 2017 08:20 |
|
Was DEC Alpha supposed to be super badass or something back in the day? I find mention of it in a lot of places but no explanation of why it was so interesting e: besides it having bizarre super weak memory ordering
|
# ¿ Aug 29, 2017 05:57 |
|
Cygni posted:hasnt X299 had NVME raid since launch, and people were dogging AMD for not having it on their premium platform? Intel wants you to pay 300 dollars for a RAID key to use it. AMD gives it away for free
|
# ¿ Aug 31, 2017 17:52 |
|
SwissArmyDruid posted:Was only kidding, because that's almost word-for-word a tweet from him. Aren't qemu and kvm related? I thought qemu by itself only does emulation, and you need kvm to use hardware-assisted virtualization
|
# ¿ Aug 31, 2017 21:08 |
|
Ryzen's memory controller ages like fine wine
|
# ¿ Sep 2, 2017 17:38 |
|
Paul MaudDib posted:Yes, streamers are typically better off using GPU encoding. CPU encoding has better quality but it's difficult to achieve this in real-time, and if you are going to attempt to do so then the gold standard is to do the encoding on a dedicated machine since anything that exceeds the quality of GPU encoding is also going to poo poo up your framerate something fierce, even on an 8-core processor. Seems hard, you'd need a dedicated machine that can really rip through several threads at once
|
# ¿ Sep 5, 2017 22:22 |
|
bobfather posted:I apply TIM to a CPU not in the socket. Why don't you try the heatsink spreading method on a chip, let it do a few hot/cold cycles, and then take the heatsink off and look at the paste
|
# ¿ Sep 19, 2017 17:50 |
|
Generic Monk posted:ps4 pro gpu is roughly equivalent in perf to a gtx970 which is still a solid 1080p60 card I'm gonna need you to source a benchmark on this just because I have a 970 and don't want to believe you
|
# ¿ Oct 29, 2017 21:20 |
|
repiv posted:The closest thing to the PS4 Pro GPU on PC is the RX470 - they're the same architecture, the RX470 is a little faster but the PS4 Pro has bolted-on 2xFP16 support so they're probably pretty close in practice. Dang, that's kind of alarming. I'm unused to console hardware being this close to PC performance.
|
# ¿ Oct 30, 2017 00:34 |
|
FaustianQ posted:Ryzen was originally delayed because it was so fast it managed to get an overflow error on the core frequency. Ryzen 2 will be noticeably slower in absolute terms but faster practically. Jim Keller was a big Dragon Age Origins fan
|
# ¿ Dec 10, 2017 23:20 |
|
Paul MaudDib posted:If it didn't need ridiculous RAM timings it wouldn't be a problem. 3000 or 3200 isn't particularly expensive unless you need the B-die. Come on, it can't be that hard. Just throw some FIFOs here and there and sacrifice a few more goats to Jim Keller and I'm sure it'll all work out
|
# ¿ Dec 21, 2017 00:33 |
|
Paul MaudDib posted:It's not really about price, it's about wasting slots and x16 lanes and screwing up your airflow for something that can be integrated onboard via the PCH. Tons of slots+lanes is one of the big selling points of X399 and you're screwing that up right off the bat. Minor differences in cost don't matter all that much on a $1300 build - even if you could buy a 10 GbE NIC for $50 today, I'd rather drop the extra $50 on the motherboard and have it integrated. The fact that the Zenith Extreme is more expensive than the Fatal1ty is the final nail in the coffin IMO. But then I'd have a motherboard named "Fatal1ty"
|
# ¿ Feb 5, 2018 22:04 |
|
PerrineClostermann posted:....Modern overclocking is a risk? I thought it was just clever marketing for higher clocked chips. "Oh, you're totally pushing it to the edge! You're the one bringing the chip to its maximum potential! Isn't that amazing and worth 100 dollars?" The real trick is that it allows chip manufacturers to benefit from performance numbers that they don't actually have to support. Pretty much all coffee lake CPUs should make it to an all-core 5 GHz... But if they don't, it's not Intel's problem
|
# ¿ Feb 7, 2018 20:00 |
|
Potato Salad posted:Wouldn't have happened if it was ARMed
|
# ¿ Feb 18, 2018 21:44 |
|
SlayVus posted:x64 is AMD technology. To run x64 you need AMD services. I don't believe you re: the second sentence. I have seen embedded operating systems running on x64 that most definitely do not have any "AMD services" running
|
# ¿ Feb 19, 2018 19:51 |
|
Threadviscerator
|
# ¿ Feb 19, 2018 20:19 |
|
SwissArmyDruid posted:"We can't shrink the node anymore" is bad. But that's not the problem that we're facing. I mean, using something that's not silicon requires reinventing everything we know about manufacturing, doesn't it? That's a tall order
|
# ¿ Mar 8, 2018 07:00 |
|
ufarn posted:I'm planning on getting Zen+ - still can't figure out which of 27/2800(+) are the way to go - because the whole Spectre/Meltdown thing soured me on Intel. Curious, what about that soured you on Intel?
|
# ¿ Apr 2, 2018 19:15 |
|
I wonder if threadripper 1 CPUs will get cheaper when TR2 is out. Might be a good opportunity to build a modern-ish many core server on the cheap
|
# ¿ Apr 8, 2018 19:08 |
|
If AMD isn't able to get clocks much higher in the next couple years because of process problems, I wonder if they should go all in on the slower but wider idea. Add more execution units to each core and put in 4-way SMT instead of 2-way. I know IBM made some Power chips that do that, and I think there's an ARM server chip that also has 4-way SMT.... it could potentially be an easy way to get a lot better in some server workloads.
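To put a toy number on the "slower but wider" intuition — this is my own back-of-envelope model, not anything from AMD or IBM: if a thread is only ready to issue some fraction p of cycles (stalled on memory the rest of the time), a core that can pick from n SMT threads does useful work whenever at least one thread is ready.

```python
# Toy SMT utilization model: each thread is independently ready to
# issue in a given cycle with probability p; the core is utilized if
# any of its n threads is ready, so utilization = 1 - (1 - p)^n.
# Purely illustrative -- real SMT threads also contend for execution
# units, caches, and TLBs, which this ignores.

def utilization(p, n):
    return 1 - (1 - p) ** n

for n in (1, 2, 4):
    print(f"{n}-way: {utilization(0.3, n):.2f}")
```

With p = 0.3 you get roughly 0.30 / 0.51 / 0.76 for 1/2/4-way, which is why stall-heavy server workloads are the best case for wide SMT — and why it does nothing for a single latency-sensitive thread.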
|
# ¿ May 17, 2018 05:54 |
|
PC LOAD LETTER posted:A GF fab guy said that they're expecting 5 GHz range chips with their 7nm process although he didn't specify which chip exactly he was talking about it's generally accepted he meant Zen2 FWIW. Oh gently caress I stand corrected. GloFo making a good process is a surprise, but I'm hyped
|
# ¿ May 17, 2018 17:34 |
|
SwissArmyDruid posted:It's Semiaccurate, and therefore sits firmly in the same category of "salt now, so you're not salty later" as WCCFT, but: according to this, that thing that I was worried about, where people just upgrade to the next thing that Intel comes out with out of inertia may not be happening. Oh my God
|
# ¿ May 25, 2018 19:15 |
|
ET was my jam in middle school / high school. But I played it on whatever lovely CPU/iGPU was in the Sony vaio I used at the time so I don't think I got much more than 30fps, if that
|
# ¿ Nov 12, 2018 02:40 |
|
ufarn posted:Can someone quantify "thousands of 300mm wafers" for me? Probably somewhere between 50 and 100 times that many CPUs. I think a wafer yields about that many?
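For a rough sanity check — the die area of ~213 mm² is my assumption (roughly a Zen "Zeppelin" die), and this is the standard gross dies-per-wafer approximation, ignoring defect yield:

```python
import math

# Standard approximation for usable die sites on a round wafer:
# wafer area / die area, minus a correction for partial dies lost
# around the edge. Die area (~213 mm^2, roughly a Zen "Zeppelin"
# die) is an assumption; defect yield is ignored.

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

print(dies_per_wafer(300, 213))
```

That works out to a few hundred gross die candidates per 300mm wafer; good CPUs per wafer after yield and binning would be lower, so a 50-100 guess amounts to assuming something like 20-35% net yield.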
|
# ¿ Jan 29, 2019 00:46 |
|
It should help with battery life if nothing else. Batteries aren't getting much better and the bigger, brighter, higher resolution screens suck more and more energy. I'm sure somebody somewhere wants to put HDR on a phone too
|
# ¿ Feb 11, 2019 08:12 |
|
No that's exactly my point. The display sucks more and more power but the battery can only be so big, so it's the CPU and other electronics that need to get more efficient. E: in other words, making your silicon more power efficient isn't something you do to make battery life go up, it's something you do to make it stay the same or go down less
|
# ¿ Feb 12, 2019 08:12 |
|
Lambert posted:It's pretty clear the whole "Saddam buying PS2s to launch missiles" story was complete fabrication by World Net Daily. The PS2 wasn't even good at that type of calculation. Also, by 2000 encryption export restrictions were all but gone, but there was an embargo on Iraq. Are encryption export restrictions actually gone? At my last job we got an annual email that went something like "if you sell/give any products to anyone check with legal first because ITAR yo"
|
# ¿ Apr 17, 2019 20:25 |
|
fishmech posted:Yeah that's what they'd been upset about. Sony gave very little warning to the public that they were going to stop allowing PS3s to use the Linux feature. Sony even ended up in several class action lawsuits over it for the following years, one of which resulted in payouts to most people who had bought PS3s before the cancellation of "other os" functionality and could prove they had done so. Did they say why?
|
# ¿ Apr 18, 2019 01:22 |