|
This is kinda disappointing. I've had an i7 3770k (ivy bridge) since it came out in 2012 and I was looking to maybe upgrade in early 2016. It looks like I am going to be sitting on this system (i7 3770k / 16gb of ram / gtx 670) for an eternity. Judging by the road map, probably until 2017 at the earliest and 2018 at the latest. That's nuts. I think the only other system I had for a duration similar to this was an E8400. That thing can still run lots of games coming out today at 60+ fps.
Khorne fucked around with this message at 21:58 on Aug 5, 2015 |
# ¿ Aug 5, 2015 21:53 |
|
Josh Lyman posted:Upgrading your GPU to a 970 will be a significant performance increase.

But the GPU and my Samsung 830 256GB from the original build are the two things that might get upgraded before I actually build a new PC. Mostly because 256GB isn't enough for a primary drive, and juggling applications onto other drives is obnoxious. If I hadn't installed an aftermarket cooler on the 670, I'd want to get rid of it just because of how obscenely loud it was.

Tab8715 posted:Where did you read this and what's their reasoning?

I don't know what other people are doing, but PCIe SSDs seem pointless at the moment. I'm not going to have more than 2 SSDs, so why would I want more than 2 SATA 3.0 ports? And USB 3 being native or not is irrelevant because it performs the same. On top of all that, if I actually needed any of these things, I could build a current-gen system when I need them instead of now, when I don't. And let's not forget that even though modern SSDs can saturate SATA 3, PCIe SSDs perform pretty much identically in most real-world applications. I feel like a luddite making this post. If I were building a desktop today I'd go with Skylake, but because I have a decent Ivy Bridge system there's still no compelling reason to upgrade. Laptops have come a long way since 2012, at least. Khorne fucked around with this message at 09:08 on Aug 6, 2015 |
# ¿ Aug 6, 2015 09:03 |
|
I posted in the wrong thread.
|
# ¿ Aug 7, 2015 22:10 |
|
WhyteRyce posted:I don't know much of anything in this space, but not too long ago I remember hearing complaints that CUDA sucks to write for, won't give you the performance boost they claim unless you write it correctly, and requires you to hire expensive software guys in order to write and maintain good CUDA code. I'm assuming those complaints were overblown?

I don't know what performance boost they claim. For financial stuff, I'm sure CUDA developers cost a fortune. For scientific stuff, you're looking at fairly average pay if you can find someone for the job.

Grundulum posted:Also, have fun if you don't use C as your scientific language of choice. CUDA Fortran exists, but only for one proprietary compiler and is just as much of a pain in the dick to code for. The next HPC purchase I make will probably be a Phi.

If someone wanted me to use CUDA Fortran I'd probably quit my job. Having to deal with Fortran code frequently is unfortunate enough, but taking something that's well suited to one language and bringing it to another language it has no business being associated with would really irk me. There's a reason Fortran isn't used for systems programming and C is. That said, I'd work on a Fortran project that used CUDA, but it would have to be called through a binding/DLL of some kind and not be done in CUDA Fortran. Khorne fucked around with this message at 19:37 on Aug 19, 2015 |
# ¿ Aug 19, 2015 19:21 |
|
Richard M Nixon posted:I can't imagine what market they were targeting where you'd need huge parallelism but your dataset was insignificant in size. Khorne fucked around with this message at 19:19 on Oct 20, 2015 |
# ¿ Oct 20, 2015 18:59 |
|
Kazinsal posted:Dumbass friend of mine needs a quiet 1366 cooler on a budget. Like, a $40 budget. Khorne fucked around with this message at 21:05 on Nov 30, 2015 |
# ¿ Nov 30, 2015 21:02 |
|
Smol posted:Is Prime95 just brutal on Skylakes? It caused like 20C higher temperatures than any other stress test.

It's the best tool for making sure your overclock is stable and that your average temps under stress aren't too high. I'd also recommend doing other things while Prime95 is running once you think it's stable. While OCing my i7 3770k, I ran into a few situations where it could run Prime95 for 24+ hours, but if I tried to watch a YouTube video or browse websites while it was going, Prime95 would crash or I'd bluescreen. Getting the voltage settings just right was a pain. Khorne fucked around with this message at 20:55 on Jun 7, 2016 |
# ¿ Jun 7, 2016 20:48 |
|
mediaphage posted:Honestly if it's that multithreaded (I assume because the mods are all separate?) and you can afford it, maybe it's time to move up to something with more than four cores.
|
# ¿ Jun 14, 2016 18:53 |
|
necrobobsledder posted:Sandy Bridge will be the first CPU to be unusable because the supply of LGA1150 motherboards was exhausted long before the CPUs failed. My Xeon E3-1230 is going to be my Kubernetes machine into the next decade, it seems. I might end up upgrading just to donate my current machine to a nephew or something.

I feel like the biggest drat old person/nerd hybrid alive for not wanting to pay current DDR3 prices because of what they once were. Khorne fucked around with this message at 00:19 on Jul 1, 2017 |
# ¿ Jul 1, 2017 00:16 |
|
Combat Pretzel posted:So what exactly is the difference between a server and desktop product these days anyway? Especially when the purported desktop product supports all relevant server things like ECC, high bus bandwidth and virtualization?

The difference between a server and a desktop product is why AMD doesn't exist in the server market anymore. I'm not even sure when AMD last had a competitive server product, and the fact that they escaped the Bulldozer architecture without a legitimate lawsuit from vendors or customers in that market is impressive. I'm biased because I'm on the HPC side of things. For home use, the current-gen AMD CPUs are good for the first time in a long, long time. Khorne fucked around with this message at 11:47 on Jul 14, 2017 |
# ¿ Jul 14, 2017 11:44 |
|
movax posted:I'm convinced buying a 2600K at launch was the best CPU purchase I've ever made. I think my laptop is also Sandy Bridge, and my old MBP is a Nehalem; unfortunately, that one limps a little bit with websites that are full of lovely JavaScript. Or that's just an old OS X install and Safari sucking...

I keep looking to upgrade and the motivation just isn't there yet. I care too much about >=144 fps in games to play at over 1080p, and everything else it does just fine. Khorne fucked around with this message at 18:35 on Jul 23, 2017 |
# ¿ Jul 23, 2017 18:28 |
|
bobfather posted:Your 3770k will overclock to at least 4.3 on air, easily, so do that if you haven't yet.

It's great to know I can still upgrade my Windows install. Actually, I've always had a question about overclocking it. I have it set to a 45x multiplier and 1.330V, but it usually draws 1.28V, maybe 1.31V, running Prime95. Is this because of my motherboard, just how it is, or what's the deal? I have an ASRock Z77 Extreme6 if it makes any difference. I also noticed that putting the voltage higher than 1.325V-1.330V, to say 1.35V, made little to no difference in stability at higher clock speeds. Khorne fucked around with this message at 19:10 on Jul 23, 2017 |
# ¿ Jul 23, 2017 18:48 |
|
Why are people so hyped about Ryzen/Threadripper? It seems competitive on paper, but if you look at benchmarks it has 7%-10% worse single-core performance than a 6-year-old Intel architecture, which is mostly what matters on the desktop. It's nowhere near competitive with recent Intel stuff unless your life is encoding videos, and at that point Intel has a better solution too: buy used 10-core Xeons for $70 each and put them in a dual-socket motherboard. I have some mild hype because Intel might release some 6+ core chips at a competitive price to compete for the people easily duped into buying an inferior AMD product, but that's about it. I'm an AMD fanboy at heart, as everyone should be, but man, who spends money on them anymore? Khorne fucked around with this message at 04:33 on Jul 28, 2017 |
# ¿ Jul 28, 2017 04:28 |
|
Malloc Voidstar posted:source your press releases Anime Schoolgirl posted:Games are not the only thing that matters to people, hard as it may seem to believe. Anime Schoolgirl posted:Sure, people can afford 500 dollar used motherboards. PerrineClostermann posted:Uh, what? Khorne fucked around with this message at 04:53 on Jul 28, 2017 |
# ¿ Jul 28, 2017 04:50 |
|
PerrineClostermann posted:Like...?

Most people aren't gaming. Even the gamers don't spend the majority of their time in game. Most people aren't running a ton of VMs with active CPUs or HPC applications.

Anime Schoolgirl posted:I think we found the first person in the thread who actually bought a Skylake-X chip
|
# ¿ Jul 28, 2017 04:56 |
|
PerrineClostermann posted:TIL multitasking is single-threaded

Combat Pretzel posted:Yeah, right.

I want more cores too, but not at the cost of significant single-core performance. Khorne fucked around with this message at 05:02 on Jul 28, 2017 |
# ¿ Jul 28, 2017 04:58 |
|
PerrineClostermann posted:A Web Browser. Combat Pretzel posted:I'm not even sure who he's preaching towards. It's almost a given that most people posting in this thread are more of the power user sort and can make use of a higher number of cores. Sometimes I dabble in video, rendering, I work with VMs and like to multitask like an idiot. The more cores, the merrier. Video and rendering both make great use of more cores. VM stuff can also.
|
# ¿ Jul 28, 2017 05:08 |
|
Anime Schoolgirl posted:That's not true anymore unless you're still using Firefox 24. All browsers these days will eat all and every core they want. I am a huge drat nerd who has hundreds of tabs open at any time like everyone else in the thread. NewFatMike posted:Not everyone wants to buy used/warranties are cool and good also. Khorne fucked around with this message at 05:36 on Jul 28, 2017 |
# ¿ Jul 28, 2017 05:11 |
|
Anime Schoolgirl posted:Except they do in Chrome. My lovely braswell student computer is significantly faster than a core m3 on loving Facebook. PerrineClostermann posted:Chrome regularly takes up 20-40% of my CPU while I'm browsing and not doing intensive tasks. I recently slimmed down to 95 tabs. 2600k. Khorne fucked around with this message at 05:24 on Jul 28, 2017 |
# ¿ Jul 28, 2017 05:21 |
|
SourKraut posted:Source? I'm curious.

If you mean on benchmarks, check user-submitted benchmarks or any aggregate site. The numbers are roughly: the 3770k flat out beats the 1700 stock or OC; it's even with the 1700x stock and has a ~6% edge OC; it gets beat by the 1800x stock and has a ~2% edge OC; and obviously, at margins that slim, it gets smashed on anything that leverages the extra cores. Right, and at quad-core loads the 1700x and 1800x both outperform the 3770k slightly, if I remember right. Kinda moot given that no one should buy a 3770k in 2017, because even used they demand too high a price, but it's not like performance got worse on later-gen Intel CPUs.

PerrineClostermann posted:...And it's definitely not the only thing I do at once.

I don't think a web browser justifies more cores in any way, and even sites that benchmark new CPUs with web browsers agree, as far as I've seen. 8+ cores is definitely a big reason I'm looking forward to upgrading in a year or two. At the rate things are going, it looks like I'm waiting for my computer to die a horrible death or for 7nm to hit, though, which will probably be longer than a year or two. Khorne fucked around with this message at 05:52 on Jul 28, 2017 |
# ¿ Jul 28, 2017 05:41 |
|
NihilismNow posted:Where do i get these used 10 core xeons for $70?

Ryzen is preferable to an old Xeon for desktop use. The TDP difference alone should pay for itself if you plan on using it heavily, it likely has better gaming/single-threaded performance, and not having to hack the BIOS for NVMe support is a big plus. If it's not going to be under constant heavy load, if you don't want NVMe, if single-thread performance isn't a sticking point, and if you value ECC, the water gets much murkier. Technically Ryzen/Threadripper support ECC, but do/will affordable motherboards exist with ECC support?

Twerk from Home posted:So we've gotten to the bottom of this: You're a sucker if you buy CPUs less than 3 years old. Why get a Xeon v4 or Gold/Platinum when you can get a v1 or v2? Why buy an X299 chip when you can buy 2014's finest 5820K? Why buy memory in 2017 when you could time travel to when it cost half as much 18 months ago?

I do lots of CPU-intensive stuff on the HPC clusters I have access to, but if a task needs 4 or fewer cores it runs notably faster on my desktop, even versus fairly decent Broadwell Xeons. I still don't usually run those on my desktop, because sending a job to something that does nothing else and checking back tomorrow is real convenient, and in all honesty, depending on what I was doing at home, I'd build a system with a large number of cores that's not my desktop and do the same drat thing. I realize this isn't an option for certain serious hobby/work stuff. Khorne fucked around with this message at 15:38 on Jul 28, 2017 |
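To put a rough number on the TDP point, here's a back-of-envelope sketch in Python. Every figure is an assumption for illustration (TDP as a crude proxy for wall power, 8 hours/day of heavy use, $0.12/kWh), not a measurement:

```python
# Rough operating-cost comparison: one Ryzen vs. a dual used-Xeon box.
# TDPs are spec-sheet numbers, not measured wall draw; rate is assumed.
RYZEN_TDP_W = 65            # e.g. Ryzen 7 1700
DUAL_XEON_TDP_W = 2 * 115   # two used 10-core Xeons, 115W TDP each
HOURS_PER_DAY = 8           # "using it heavily"
PRICE_PER_KWH = 0.12        # USD, assumed

def annual_cost(watts):
    """Electricity cost per year for a constant load of `watts`."""
    kwh = watts * HOURS_PER_DAY * 365 / 1000
    return kwh * PRICE_PER_KWH

delta = annual_cost(DUAL_XEON_TDP_W) - annual_cost(RYZEN_TDP_W)
print(f"extra electricity per year: ${delta:.2f}")
```

Under these assumptions the dual-Xeon box costs about $58 more per year in electricity, so a few years of heavy use closes a lot of the purchase-price gap on its own.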
# ¿ Jul 28, 2017 15:29 |
|
Cygni posted:For real 7nm, everyone basically needs EUV to work, which looked shaky for like a decade there. The machinery exists now, finally. Now it's going to be a race to see who can actually make a product with it that works and has realistic margins. I wouldn't be surprised at all to see everyone stall out again here.

I like how those guys have "5nm" on their 3-year roadmap. It'll be interesting to see when quantum tunneling comes into play and how it's dealt with. Khorne fucked around with this message at 12:13 on Aug 4, 2017 |
# ¿ Aug 4, 2017 12:07 |
|
ufarn posted:I literally have no idea what laptop I'd get if I had to move from MacBooks. ThinkPads were awesome, but after Lenovo bought them, they don't hold the same allure they used to.

The only real con is that it's tedious to get inside, and I worried I would break something. There's no way I'm getting an SSD or more RAM from Lenovo. They mark up the price a lot and don't really describe what their SSD is. I bought and installed those myself. Khorne fucked around with this message at 16:41 on Aug 18, 2017 |
# ¿ Aug 18, 2017 16:37 |
|
priznat posted:I've never personally owned an Asrock motherboard but we buy a lot at work and it'll probably be my go-to on the next build. Asus has gone too wacky with the floater lights and a lot of extra poo poo no one needs.

ColHannibal posted:Western digital for platter drives.

If you're in the US, Best Buy's easystore (or whatever it's called) 8TB externals go on sale for $159.99 regularly; they have WD Red drives with a 256MB cache in them, are easy to shuck, and you can RMA them. Not sure I'd even bother with any other platter drives these days at the consumer level. Khorne fucked around with this message at 16:36 on Sep 2, 2017 |
# ¿ Sep 2, 2017 14:01 |
|
Craptacular! posted:For now I'm just going to keep my Ivy and see what I can reach at 1.2v or something. I saw some guy achieved 4.5ghz at 1.4 and everyone was telling him that was too much voltage and he was killing his chip, so I figure at either stock voltage or just a little over I'll see if I can get what I want and that's that. Delidding is absolutely not an option as I can't afford to replace a CPU right now, and to be honest I could totally just get by without OC since it's not like I ever complain about this PC. I just decided to do it now that the generation is five years old and no more documentation about how to OC it is going to appear than what's already out there.

My advice would be to go higher: just pick a safe-enough voltage and see where you get. My chip doesn't run stable at 4.6 no matter how much voltage I throw at it; it's just as unstable as it is at around 1.35 or so. Has anyone actually had a CPU die in the past 5 years? I can only think of one anecdotally, and it was an AMD chip crushed by a too-heavy copper heatsink that required multiple high-risk modifications to even get into that state. I probably shouldn't post this here, but I have a P3 that ran 24/7 for over 12 years with a nickel jammed between the heatsink and CPU to force contact on the other half of the heatsink. I retired it because I had no use for it anymore and had a better replacement. Why there were coins between my heatsink and CPU is another story entirely. Khorne fucked around with this message at 00:18 on Sep 28, 2017 |
# ¿ Sep 27, 2017 23:59 |
|
MaxxBot posted:From the linus vid

It says value - productivity, but then it gives "MSRP" and "Total System". Presumably it's some coefficient involving price and productivity, but, uh, who the hell knows how it's computed. The chart doesn't even appear to be sorted by any known metric, because as far as I know the Threadripper 1950X is the most expensive setup on it. Khorne fucked around with this message at 04:38 on Mar 15, 2018 |
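For what it's worth, charts like that are usually some flavor of productivity-per-dollar. A toy sketch of one plausible metric; every score and price below is a made-up placeholder, not a number from the video:

```python
# Hypothetical "value" metric: productivity score divided by system cost.
# All scores and prices are invented placeholders for illustration.
systems = {
    # name: (productivity_score, total_system_usd)
    "1950X": (180, 2200),
    "7900X": (150, 1900),
    "1700":  (100, 900),
}

value = {name: score / cost for name, (score, cost) in systems.items()}
ranked = sorted(value, key=value.get, reverse=True)
print(ranked)
```

Sorted this way, a cheap chip with a decent score can top the chart, which is exactly the kind of ordering that looks "unsorted" if you only glance at price or raw performance.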
# ¿ Oct 11, 2017 21:18 |
|
Michael Jackson posted:more like MS in Paint
|
# ¿ Oct 13, 2017 00:32 |
|
Debating delidding my i7 3770k. Temps are creeping up, and it's been almost 6 years, so I'm going to have to reseat it anyway. Does anyone have decent resources on doing it? I have access to a lot of tools, so I don't think I need to buy a kit, but I'm not entirely sure what I need to do to safely remove the IHS. Or perhaps more importantly, what I should avoid doing. I'll probably record video of it happening, and if I break it I'll post it for the amusement of the thread. Khorne fucked around with this message at 17:26 on Nov 15, 2017 |
# ¿ Nov 15, 2017 17:22 |
|
Palladium posted:The real surprise is how a 4.8GHz Sandy Bridge is now unable to keep up with a Haswell at only 3.6GHz.
|
# ¿ Dec 22, 2017 14:17 |
|
Measly Twerp posted:I was just responding to the idea that it's sensationalised. Khorne fucked around with this message at 01:19 on Jan 25, 2018 |
# ¿ Jan 24, 2018 23:31 |
|
Shaocaholica posted:Are modern games even taking advantage of that or is it more to cover the overhead of capturing/streaming?

If a game is the only thing you have open, it matters a whole lot less.
|
# ¿ Mar 20, 2018 19:13 |
|
bobfather posted:The better z68 and z77 boards still sell for $$ on Ebay. Also 2600K and 3770K chips still fetch ~$120-130, which is crazy.

With Z77 you can BIOS-hack M.2 support as well, but there's no real point, because SATA SSDs are pretty much identical in real-world performance and cost way less. Khorne fucked around with this message at 18:31 on Apr 6, 2018 |
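The "identical in the real world" point mostly comes down to fixed CPU-side work (decompression, parsing, init) dominating once storage is reasonably fast. A toy model in Python; the bandwidth figures are typical spec-sheet numbers and the CPU time is an assumed placeholder, not a measurement:

```python
# Toy model of app/game load time: I/O time plus fixed CPU-side work.
# Bandwidths are spec-sheet sequential reads; CPU time is an assumption.
def load_time(data_mb, bandwidth_mb_s, cpu_seconds):
    """Seconds to load, assuming I/O and CPU work happen back to back."""
    return data_mb / bandwidth_mb_s + cpu_seconds

GAME_MB = 2000   # assets read at launch
CPU_S = 8.0      # assumed decompress/parse/init time

sata = load_time(GAME_MB, 550, CPU_S)    # SATA 3 interface ceiling
nvme = load_time(GAME_MB, 3500, CPU_S)   # typical NVMe sequential read
print(f"SATA: {sata:.1f}s, NVMe: {nvme:.1f}s")
```

Under these assumptions a 6x bandwidth gap shrinks to about a 3 second difference in wall-clock load time, and with smaller reads or heavier init work it shrinks further.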
# ¿ Apr 6, 2018 18:28 |
|
spasticColon posted:My 2500K won't go past 4.2Ghz no matter what I do which is one of the reasons why I'm not going to bother with overclocking next time around.

Palladium posted:Yeah, because I'm sure everyone will honestly report their actual OCs in a subject matter mostly fueled by e-peen dick waving.

Heck, for the last few gens we have stats from Silicon Lottery on what percentage of processors can hit a given clock at a given voltage. Khorne fucked around with this message at 05:21 on Apr 13, 2018 |
# ¿ Apr 13, 2018 05:04 |
|
JazzmasterCurious posted:Fake. It should have a base clock speed of 4.7 GHz with the boost maxed at 5 Khorne fucked around with this message at 15:54 on Apr 16, 2018 |
# ¿ Apr 16, 2018 12:59 |
|
BobHoward posted:(you missed the joke)
|
# ¿ Apr 17, 2018 05:01 |
|
When is Intel announcing 8 cores?
Khorne fucked around with this message at 22:19 on Apr 18, 2018 |
# ¿ Apr 18, 2018 21:21 |
|
BIG HEADLINE posted:To be fair, both AMD and Intel are out of ideas - their only play at the moment is shoehorning as many cores as they can onto substrate. I mean, are we going to see LGA16528 someday?
|
# ¿ Apr 26, 2018 13:40 |
|
GRINDCORE MEGGIDO posted:Be ironic if Intel's Keller designed architecture sucks and the RajaGPU is outstanding.
|
# ¿ Apr 26, 2018 19:00 |
|
Space Racist posted:Has anything been confirmed about exactly *why* Intel's 10nm process is so boned compared to the rest of the industry? GlobalFoundries and TSMC seem rather confident of hitting 7nm, while Intel is 3+ years past their target date and still sweating the ramp for 10nm.

I should have said "roughly equivalent," because they're different processes, but the density and performance should be similar. The biggest difference is probably that the 7nm coalition is borderline "everyone who isn't Intel": it's a combination of IBM, Samsung, and GlobalFoundries. TSMC should have 7nm at production capacity around when Intel has 10nm up, if not sooner. Khorne fucked around with this message at 15:03 on Apr 28, 2018 |
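On the "roughly equivalent" point: node names stopped tracking any physical dimension, so the honest comparison is logic density. A quick sketch using approximate publicly reported density estimates; treat the figures as ballpark numbers, not vendor specs:

```python
# Approximate reported logic densities, millions of transistors per mm^2.
# Public ballpark estimates, not official vendor guarantees.
density_mtr_mm2 = {
    "Intel 14nm": 37.5,
    "Intel 10nm": 100.8,
    "TSMC 7nm": 91.2,
}

ratio = density_mtr_mm2["Intel 10nm"] / density_mtr_mm2["TSMC 7nm"]
print(f"Intel 10nm vs TSMC 7nm density: {ratio:.2f}x")
```

Despite the smaller-sounding name, TSMC's 7nm lands within roughly 10% of Intel's 10nm density, which is why the two are usually treated as the same generation.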
# ¿ Apr 28, 2018 14:52 |
|
Anime Schoolgirl posted:it's like they could use an excellent thermally conductive bonding material for this

By not soldering, Intel also gets to cut a lot more corners than just the TIM itself; you really see it in some of their 14nm processors. TIM is also more reliable than solder in theory, but in practice we see almost no reports of damage to soldered IHSes outside the LN2 community, which is super niche. Khorne fucked around with this message at 21:43 on May 3, 2018 |
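The size of the paste-vs-solder penalty falls out of simple 1-D heat conduction. A back-of-envelope sketch in Python; the thicknesses, conductivities, die area, and power are all assumed ballpark values, not Intel's actual numbers:

```python
# Temperature rise across the die-to-IHS interface for paste vs. solder.
# 1-D conduction: dT = P * t / (k * A). All inputs are assumed ballparks.
def delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature difference (C) across a flat layer at steady state."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

DIE_AREA = 150e-6   # 150 mm^2 die, in m^2
POWER = 100         # watts dissipated through the interface

paste = delta_t(POWER, 50e-6, 5.0, DIE_AREA)     # ~50 um of decent paste
solder = delta_t(POWER, 100e-6, 80.0, DIE_AREA)  # ~100 um indium solder
print(f"paste: {paste:.1f} C, solder: {solder:.1f} C")
```

Under these assumptions the paste adds roughly 6C of extra gradient at 100W, before accounting for the thicker, less consistent bond lines that cheap TIM jobs tend to have.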
# ¿ May 3, 2018 21:37 |