MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

No Gravitas posted:

Given the cash, just buy a few of the natex deal I linked above, it will at least run everything you throw at it.

I am kind of tempted, how is the motherboard availability for those?

No Gravitas
Jun 12, 2013

by FactsAreUseless

MaxxBot posted:

I am kind of tempted, how is the motherboard availability for those?

Mixed bag, but you can try for a decently priced bundle.

http://www.natex.us/category-s/1865.htm
http://www.natex.us/category-s/1868.htm

SuperDucky
May 13, 2007

by exmarx
Gravitas, how many K-corners were you running and what hardware setup? I'm personally curious because I work for an integrator and I want to see hard numbers against, say, K80s, given a modicum of optimization. For reference, I have CUDA benchmark numbers for 4 K80s with dual e5-26xx's on our hardware configuration.

I know the whole point of K-landing is you shouldn't need to optimize, I just want to know where the sweet spot is for that optimization, especially versus CUDA.

No Gravitas
Jun 12, 2013

by FactsAreUseless

SuperDucky posted:

Gravitas, how many K-corners were you running and what hardware setup? I'm personally curious because I work for an integrator and I want to see hard numbers against, say, K80s, given a modicum of optimization. For reference, I have CUDA benchmark numbers for 4 K80s with dual e5-26xx's on our hardware configuration.

I know the whole point of K-landing is you shouldn't need to optimize, I just want to know where the sweet spot is for that optimization, especially versus CUDA.

I was running just one Phi, quite briefly, hosted in a basic Haswell 4-core, 4-thread Xeon, single-CPU computer. It was a fun project, not very serious. When it turned out to be garbage for my load, I gave up on it. I think I posted Dhrystone results in here already. I never got the Intel compiler sorted out, so I never got to use the vector units, and the machine is pointless without them. Not that my workload was a very SIMD-friendly one either. I don't have the setup ready to run right now; I was basically homebrewing the ducting to cool the monster, and I disassembled it a long while back. Fun ride, but a bad match.

From what I have seen, your code either works great on a GPU or you need a CPU. Stuff like branches? CPU. The Phi tries to sit in the middle of the CPU/GPU choice, and if only it weren't garbage...

Somehow all the workloads I run are branchy as all hell, with unpredictable stride lengths and loop iteration counts. No GPU for me, but I can at least cluster across many computers.
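A toy sketch of that CPU-vs-GPU distinction (illustrative only, not from anyone in the thread; the function names are made up, and Python just stands in for the shape of the loops): the first loop is branch-free with a fixed stride, so a compiler can map it onto SIMD lanes or GPU threads; in the second, both the branch taken and the stride depend on the value just read, which is exactly what makes vector units and GPU warps fall over.

```python
def regular_sum(xs):
    # Fixed stride, no data-dependent branches: every element gets the same
    # treatment, which is the shape SIMD units and GPUs want.
    return sum(x * 2 for x in xs)

def branchy_sum(xs):
    # Data-dependent branch AND data-dependent stride: the work done at each
    # step depends on the value just read, so vector lanes / GPU warps would
    # diverge and prefetchers can't predict the access pattern.
    total, i = 0, 0
    while i < len(xs):
        if xs[i] % 2 == 0:
            total += xs[i]
            i += 1          # even value: advance one element
        else:
            total -= xs[i]
            i += 2          # odd value: skip ahead, unpredictable stride
    return total
```

The second loop isn't slower per iteration on a CPU; the problem is that there is no way to run many of its iterations in lockstep, which is why workloads like this stay CPU-bound no matter how wide the vector units are.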

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
It's more likely that Sandy Bridge coincided with the proliferation of solid state drives.

BONESTORM
Jan 27, 2009

Buy me Bonestorm or go to Hell!

sincx posted:

Speaking of not upgrading from Sandy Bridge, I think (overclocked) Sandy Bridge is really the first time CPUs got "fast enough"--i.e. the vast majority of actions felt like they were completed instantly--even for enthusiasts. If things already feel like they're happening as fast as possible, why change?

Totally agree. I have been running an i5-2500K overclocked to 4.2 GHz in my rig since mid-2011, and just last week a visiting family member remarked on how instantaneous everything felt while they were using it (I imagine the SSD containing the OS is part of that, to be fair). There is very little reason for me to upgrade, even for videogaming purposes. I can't see myself bothering until more than 4 cores are needed to avoid bottlenecking whatever flavour of GPU I have in the future.

EdEddnEddy
Apr 5, 2012



Hell, my C2Q Q9550 OC'ed to 3.84 GHz (2.83 stock) ran about the same with an SSD. Everything was instantaneous as far as basic usage was concerned.

The only downsides were the hacky way to get SLI to work on an X48 chipset, and the fact that even OC'ed, it wasn't powerful enough to really feed GPUs faster than a 580 or so at the time.

Gave it to my sis to do Photoshop stuff and it works like a charm for that to this day. I need to replace the SSD in it, as it's an old one without TRIM firmware yet, lol, but to upgrade it you have to wipe the drive. Might just put in the old Plextor M3Ps I just took out of my system, as I needed more space than the RAID-0 240GB they provided these past years.

Durinia
Sep 26, 2014

The Mad Computer Scientist

SuperDucky posted:


I know the whole point of K-landing is you shouldn't need to optimize, I just want to know where the sweet spot is for that optimization, especially versus CUDA.

*snerk*

You can run x86/AVX binaries on it natively, but even on KNL they will likely run like garbage unless you spend some time on them. In terms of total work, KN* and CUDA both require you to expose a lot more parallelism in your code; it's just the language you do it in that differs.

In terms of performance, K80 is significantly better than KNC.

Gwaihir
Dec 8, 2009
Hair Elf

go3 posted:

It's more likely that Sandy Bridge coincided with the proliferation of solid state drives.

That's the biggest thing by far for performance leaps in modern PCs.

CPUs still matter to an extent of course, even broadwell/skylake U series ULV chips can feel a bit laggy in use when you have poo poo like 40 tabs open in chrome or FF.

Ludicrous Gibs!
Jan 21, 2002

I'm not lost, but I don't know where I am.
Ramrod XTreme
I've got an I5-2500 non-k that's coming up on 5 years old now. Since OC'ing isn't an option, I take it an upgrade to Skylake is probably a good idea when I build my VR rig in a month or so? Should I go for an OC-able chip this time?

EdEddnEddy
Apr 5, 2012



Ludicrous Gibs! posted:

I've got an I5-2500 non-k that's coming up on 5 years old now. Since OC'ing isn't an option, I take it an upgrade to Skylake is probably a good idea when I build my VR rig in a month or so? Should I go for an OC-able chip this time?

It is usually worth it. If you buy a good motherboard and CPU cooler, OC'ing lets you pull a good bit of extra performance out of your chip. If you had had a K version of your chip, you would potentially still have been good for another year or two with a solid OC.

And with the quality of some of the self-contained water coolers out there now, which are really like two-piece air coolers, it is rather easy to keep things reasonably cool too.

EoRaptor
Sep 13, 2003

by Fluffdaddy

Ludicrous Gibs! posted:

I've got an I5-2500 non-k that's coming up on 5 years old now. Since OC'ing isn't an option, I take it an upgrade to Skylake is probably a good idea when I build my VR rig in a month or so? Should I go for an OC-able chip this time?

The biggest boost here is that two USB 3 ports are required for Oculus VR (and probably others), and getting a new motherboard that includes a bunch will fulfill that nicely.

If you can hold off making decisions around GPUs until at least April 7th, we will have some more news about next-gen NVIDIA GPUs, and probably AMD as well, which should help with any planning you are doing.

LiquidRain
May 21, 2007

Watch the madness!

sincx posted:

Speaking of not upgrading from Sandy Bridge, I think (overclocked) Sandy Bridge is really the first time CPUs got "fast enough"--i.e. the vast majority of actions felt like they were completed instantly--even for enthusiasts. If things already feel like they're happening as fast as possible, why change?
This just means you don't use any sluggish apps out of a lack of alternative. I'd kill for anything to run Lightroom faster than my i5-4690k @4.4GHz, but nothing helps no matter how much CPU, RAM, and I/O I throw at it. It's pure single-thread CPU.

edit: vvv yeah, for editing there's no helping it, it's purely single-threaded. It does use all available cores on a single export job; you can see N files show up at a time for N cores as it goes by. Export time I'm less concerned about; I want faster editing. :|

LiquidRain fucked around with this message at 09:49 on Mar 31, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

LiquidRain posted:

This just means you don't use any sluggish apps out of a lack of alternative. I'd kill for anything to run Lightroom faster than my i5-4690k @4.4GHz, but nothing helps no matter how much CPU, RAM, and I/O I throw at it. It's pure single-thread CPU.

Can't help you on editing, but if you launch multiple export jobs they will run in parallel. So export your photos in 4 batches of N/4 photos each.
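A sketch of what that batching trick looks like, with a stand-in for the actual export step (Lightroom has no scriptable CLI for this, so `export_batch` here is purely hypothetical; the split-into-batches-and-run-in-parallel shape is the only point):

```python
from concurrent.futures import ProcessPoolExecutor

def split_batches(files, n_jobs=4):
    # Deal the N files round-robin into n_jobs roughly equal batches.
    return [files[i::n_jobs] for i in range(n_jobs)]

def export_batch(batch):
    # Hypothetical stand-in for one export job; a real version would drive
    # whatever export mechanism you actually have.
    return [name + ".jpg" for name in batch]

def parallel_export(files, n_jobs=4):
    # One worker process per batch, mirroring "launch 4 export jobs at once".
    with ProcessPoolExecutor(max_workers=n_jobs) as pool:
        done = pool.map(export_batch, split_batches(files, n_jobs))
    return [name for batch in done for name in batch]
```

Each batch runs in its own process, so a 4-core machine keeps all cores busy even though any single export job is single-threaded.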

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
What version of Lightroom are you using? Most stages of the editing pipeline are GPU accelerated these days.

LiquidRain
May 21, 2007

Watch the madness!

Latest, and no: the previewing of retouching effects, such as brushes, is GPU accelerated. Those layer on top of the photo. However, anything to do with white balance, color, or exposure (highlights/shadows/etc.) relies on the CPU, as does flipping between photos in a library. It doesn't help that Adobe's Fuji RAW converter is slow as hell in all this. (bla bla Adobe Fuji RAW bla bla demosaicing bla bla Capture One :words: I use LR and I like its workflow, I just want it to be as fast as working with JPEGs.)

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

EoRaptor posted:

The biggest boost here is that two USB 3 ports are required for Occulus VR (and probably others), and getting anew motherboard that includes a bunch will fulfill that nicely.

Vive doesn't require USB3, so there's that.

Anime Schoolgirl
Nov 28, 2002

Vive requires a DP 1.2 port, which should be on pretty much any videocard out these days.

EdEddnEddy
Apr 5, 2012



Anime Schoolgirl posted:

Vive requires a DP 1.2 port, which should be on pretty much any videocard out these days.

HDMI or DP. It can use either.

Ludicrous Gibs!
Jan 21, 2002

I'm not lost, but I don't know where I am.
Ramrod XTreme
Well, I'm gettin' a Rift, so the extra USB 3.0 ports will come in handy; besides, I'd like to actually be able to use the front-panel ports on my case. I'm pretty much building a new system from scratch anyhow, as I don't have an SSD and the 560 Ti I bought at the same time as the CPU isn't exactly going to cut the mustard for VR stuff. I'll probably spring for the OC-able CPU this time, as being able to ride out the rest of the decade+ without needing an upgrade would be nice. Any aftermarket coolers that are particularly good? I've never OC'd a thing in my life, despite having built a half dozen PCs over the years.

And I can wait a week no problem, so hopefully the new GPUs are the step forward they've been advertised to be, or at least they'll cause a drop in 970/980Ti prices.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




If anyone was considering a 980Ti, this is the deal to get: $545 for the MSI Golden, which has the all-copper heatsink and is probably binned for massive overclocks. Also it's eBay, so chances are you won't pay taxes.

Coupon code: C15LIMITEDTIME

http://m.ebay.com/deals?&_trksid=p1468660.m2212&deal=5002796903

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

sincx posted:

One of my college buddies just upgraded last December from his 2.8 Ghz Conroe (built in 2008) to Devil's Canyon and a GTX 970. He only did it because he's one of the lead project managers for Oculus and he figured he should have something that can run the Rift at home.

The Q9550 is a far cry from Conroe. The only reason I upgraded from a Q9550 to a 2500K was because I was having weird mystery problems (unrelated to the actual CPU, it turns out).

EdEddnEddy
Apr 5, 2012



sincx posted:

One of my college buddies just upgraded last December from his 2.8 Ghz Conroe (built in 2008) to Devil's Canyon and a GTX 970. He only did it because he's one of the lead project managers for Oculus and he figured he should have something that can run the Rift at home.

Lead PM for Oculus? May I ask how he got into that gig?

I've been trying to break into the VR market myself for ages, ever since the KS, but timing and job position just don't seem to align for me. :(

That, and I seem to only be able to apply through the Facebook Careers page now, which I haven't had much luck with.

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

mobby_6kl
Aug 9, 2009

by Fluffdaddy
New Broadwell Xeons. The E5-2699 is looking good :getin:

EdEddnEddy
Apr 5, 2012



sincx posted:

He got an internship with Google in college, went to Google full-time right after graduation as a project manager, and then moved on from there.

This is why I regret working in IT through college and not doing an internship somewhere for something not geared toward IT admin. drat, it makes changing paths a pain in the rear end. Good for him, though.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

mobby_6kl posted:

New Broadwell Xeons. The E5-2699 is looking good :getin:



Wow, kind of surprised. I had figured they would go straight into Skylake/Purley E5s after the Broadwell delays.

Basically I want Purley E5s with PCIe SRIS support already.

SuperDucky
May 13, 2007

by exmarx
2695 is the good stuff this cycle. It crushes the 2680 in Multimedia and bests it by about 10% in Arithmetic.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Oh good, I'm glad the 5.1 GHz Xeon was just a clickbait rumor. That was/would have been bad for my health.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Sidesaddle Cavalry posted:

Oh good, I'm glad the 5.1 GHz Xeon was just a clickbait rumor. That was/would have been bad for my health.

As I recall, the rumor about that was that it was a ~speshul~ chip just for the NSA. But I think the original rumor was outed by WCCFTech, which should tell you all you need to know.

The E5-2620 v4 gives me hope for a 'cheap' 8-core/16-thread K-SKU enthusiast version of the same chip.

BIG HEADLINE fucked around with this message at 02:37 on Apr 1, 2016

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
drat, minimum $400+ for the privilege of running DDR4 2133 on a Xeon? Guess I'll just have to double up on some of the current-gen E5s and I'll hold out on my Sandy Bridge Xeon E3.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
What kind of idle power consumption would I be looking at for a W3565?

I'd like to replace my mITX AM1 system with something that would be a heavier-duty server for ZFS and stuff. It's probably not going to run flat-out all the time, but I don't want to waste power idling if I don't need to. I have an old HP Z400 workstation with a W3565 and 16GB of ECC RAM right now - I could either build it on that, or sell/part it out and get something newer if there's a compelling tradeup there. Half wondering about looking for an AM1 mobo that supports ECC memory, but the W3565 is much faster and I probably won't notice an extra $10 in idle power per year or whatever.

Paul MaudDib fucked around with this message at 04:04 on Apr 1, 2016

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

BIG HEADLINE posted:

As I recall, the rumor about that was that it was a ~speshul~ chip just for the NSA. But I think the original rumor was outed by WCCFTech, which should tell you all you need to know.

The E5-2620 v4 gives me hope for a 'cheap' 8/16 -K SKUed enthusiast version of the same chip.


Broadwell-E is supposed to have a 10-core but it will probably be $999. I'm hoping for a ~$500 8-core for my next system.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

MaxxBot posted:

Broadwell-E is supposed to have a 10-core but it will probably be $999. I'm hoping for a ~$500 8-core for my next system.

Pretty much the same. Four physical cores without threads have been fine since early 2012; eight physical cores, with or without threads, would be a good upgrade.

Anime Schoolgirl
Nov 28, 2002

Paul MaudDib posted:

Half wondering about looking for an AM1 mobo that supports ECC memory
There are none of these.

syzygy86
Feb 1, 2008

Anime Schoolgirl posted:

There are none of these.

There's at least one, the Asus AM1M-A:
https://www.asus.com/us/Motherboards/AM1MA/specifications/
http://www.overclock.net/t/1495837/ecc-works-on-am1

My understanding is that the chips themselves support ECC, but most motherboards don't enable that support.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
I have an inexplicable itch to upgrade the Sandy Bridge dual-core i7 in my laptop to a proper quad i7. Does anyone know the fastest one that can be crammed into a Lenovo T520? It's socket rPGA988B, using the QM67 chipset.

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?
There's a bloke claiming to have put an i7-2630QM quad core into a T520 here: http://forum.notebookreview.com/threads/upgrading-a-thinkpad-t520i-with-an-i7-2630qm.771352/

The Sandy Bridge quads have a 10W higher TDP than the dual cores that model shipped with, so things might get warm. You're also giving up quite a bit of clock speed going to the quad core.

Mr Chips fucked around with this message at 01:41 on Apr 4, 2016
