champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Tab8715 posted:

Thunderbolt has been around for 2-4 years; I don't see why we couldn't use that interface.

It has a Thunderbolt tax of around $100 that you must pay to Intel.

EoRaptor
Sep 13, 2003

by Fluffdaddy

fishmech posted:

Because nothing ships with Thunderbolt besides a few random Sony laptops and Apple computers. The Sony laptops did use it for external GPU stuff, but IIRC they weren't that good.

Also Intel refused to certify any device which broke out the Thunderbolt port into a PCIe slot in an external enclosure, so that kiboshed graphics cards along with everything else.

They changed that with Thunderbolt 3 and adopted the USB-C connector, so now there's a much lower barrier to entry and much more flexibility to implement devices.

Perplx
Jun 26, 2004


Best viewed on Orgasma Plasma
Lipstick Apathy

Tab8715 posted:

I'm shocked it's taken so long for someone to market a commercial external GPU. Most gaming laptops are disgustingly gaudy and it'd be nice to be able to have a ThinkPad or MacBook play real games.

MSI demoed one years ago, but until TB3 Intel wouldn't certify GPUs over Thunderbolt for some reason.

japtor
Oct 28, 2005
Yeah, Intel blocked anything meant for GPU use, but there was nothing blocking it from a technical standpoint. Bandwidth/performance was lower, but people were still able to hack up setups (using the PCIe chassis that were released*) that were faster than internal graphics.

*Meant for peripheral cards and "pro" stuff; most are half length and too low power to handle decent GPUs.

japtor fucked around with this message at 21:22 on Jan 8, 2016

EdEddnEddy
Apr 5, 2012



The one area that is going to grow and that does depend on PCIe speeds, due to latency, is VR. If there is latency in getting data from the CPU to the GPU to render changes in a scene while using VR, that is going to wreck the experience because of the time added by the slower interface. (There are plenty of tests showing things like VR SLI having a huge difference between PCIe 2.0 x16 and 3.0 x16.)

Now, not many people will be going out to buy ultrabooks to hook up VR and play through an external GPU housing (and while GPUs currently are huge, the next gen with HBM, similar to the AMD Fury, should shrink a lot once that becomes mainstream).

But you know some will, and they will piss and moan and wonder why their super $$$ laptop doesn't work with VR and blame everything except the technical limitations.


Either way, neat tech, but not something I feel is more than a niche of a niche market, outside of maybe the professional rendering market or the extremely space-limited gamer who wants a single system to work on and then come home and play on, and that's it.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

EdEddnEddy posted:

The one area that is going to grow and that does depend on PCIe speeds, due to latency, is VR. If there is latency in getting data from the CPU to the GPU to render changes in a scene while using VR, that is going to wreck the experience because of the time added by the slower interface. (There are plenty of tests showing things like VR SLI having a huge difference between PCIe 2.0 x16 and 3.0 x16.)

I would be interested to know if there is a latency difference between 3.0 x8 and 3.0 x16.
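
Rough back-of-envelope on the bandwidth side (per-direction figures after encoding overhead; the 50 MB per-frame payload is a made-up number purely for illustration):

code:

# Effective per-direction PCIe throughput after encoding overhead:
#   PCIe 2.0: 5 GT/s, 8b/10b    -> ~0.500 GB/s per lane
#   PCIe 3.0: 8 GT/s, 128b/130b -> ~0.985 GB/s per lane
GB = 1e9

links = {
    "PCIe 2.0 x16": 16 * 0.500 * GB,
    "PCIe 3.0 x8":   8 * 0.985 * GB,
    "PCIe 3.0 x16": 16 * 0.985 * GB,
}

payload = 50e6  # hypothetical 50 MB of scene data sent to the GPU per frame

for name, bw in links.items():
    ms = payload / bw * 1e3
    print(f"{name:13} {bw / GB:5.1f} GB/s -> {ms:.2f} ms to move 50 MB")

3.0 x8 works out to almost exactly the same throughput as 2.0 x16 (~7.9 GB/s vs ~8.0 GB/s), so any gap you'd see against 3.0 x16 should come from transfer time rather than link latency; at 90 Hz you only get ~11 ms per frame, so halving the per-frame copy time is the term that matters.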

Kazinsal
Dec 13, 2011



I can't remember if this was posted at any point, so in case it hasn't, here it is: Prime95 and similar heavy complex workloads freeze Skylake CPUs up on certain exponents. Intel acknowledges the problem at the top of the second page of the thread. Microcode update incoming to fix it.
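
For the curious, the heart of what GIMPS hammers CPUs with is the Lucas-Lehmer test. A toy Python version follows; Prime95 runs the same recurrence with FFT-based squaring on multi-million-digit numbers, which is what makes it such a brutal stress test:

code:

def is_prime(n: int) -> bool:
    # Trial division; only used here to pick candidate exponents.
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n**0.5) + 1))

def lucas_lehmer(p: int) -> bool:
    # Lucas-Lehmer test for the Mersenne number M_p = 2**p - 1,
    # valid for odd prime p: M_p is prime iff s_(p-2) == 0 (mod M_p).
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

print([p for p in range(3, 128) if is_prime(p) and lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19, 31, 61, 89, 107, 127]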

computer parts
Nov 18, 2010

PLEASE CLAP

Tab8715 posted:

I'm shocked it's taken so long for someone to market a commercial external GPU. Most gaming laptops are disgustingly gaudy and it'd be nice to be able to have a ThinkPad or MacBook play real games.

I imagine there'd be software issues so it couldn't just be plug & play.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

computer parts posted:

I imagine there'd be software issues so it couldn't just be plug & play.

Computers can already handle USB video cards being hotplugged just fine - many of those small USB monitors you see actually have an onboard video card to make the most efficient use of the USB 2.0 data limitations.
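
The rough arithmetic on why those adapters have to be smart (the 35 MB/s figure is a ballpark for real-world USB 2.0 bulk throughput, not a spec number):

code:

# Naive uncompressed 1080p60 framebuffer vs. what USB 2.0 can deliver.
usb2_effective = 35e6              # ~35 MB/s realistic USB 2.0 throughput
w, h, bytes_px, fps = 1920, 1080, 4, 60
uncompressed = w * h * bytes_px * fps

print(f"Uncompressed 1080p60: {uncompressed / 1e6:.0f} MB/s")
print(f"USB 2.0 budget:       {usb2_effective / 1e6:.0f} MB/s")
print(f"Compression needed:   ~{uncompressed / usb2_effective:.0f}x")

Hence the onboard controller: it compresses and sends only the screen regions that changed, instead of streaming raw frames.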

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Kazinsal posted:

I can't remember if this was posted at any point, so in case it hasn't, here it is: Prime95 and similar heavy complex workloads freeze Skylake CPUs up on certain exponents. Intel acknowledges the problem at the top of the second page of the thread. Microcode update incoming to fix it.

I can see it now: do you want to keep overclocking your Skylake non-K CPU, or do you want this bug fixed? 'Cause we're integrating them together.

japtor
Oct 28, 2005
More Skull Canyon NUC stuff:
http://www.tomshardware.com/news/intel-skull-canyon-gaming-pc-confirmed,30928.html

But particularly interesting is the linked story in there: Intel apparently has a new, smaller form factor in the works, mini-STX:
http://www.tomshardware.com/news/silverstone-mini-stx-case-form-factor,30924.html
http://www.tomshardware.com/news/asrocck-mini-stx-pc-motherboard,30912.html

Daviclond
May 20, 2006

Bad post sighted! Firing.

Don Lapre posted:

I can see it now: do you want to keep overclocking your Skylake non-K CPU, or do you want this bug fixed? 'Cause we're integrating them together.

According to that thread it only affects CPUs with Hyper-Threading, and therefore won't affect the mid-range models people are interested in base-clock OCing.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Daviclond posted:

According to that thread it only affects CPUs with Hyper-Threading, and therefore won't affect the mid-range models people are interested in base-clock OCing.

You forget that every i3 is hyperthreaded, and those are absolutely chips people are interested in BCLK OCing.

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

Boiled Water posted:

It has a Thunderbolt tax of around $100 that you must pay to Intel.

Unless Intel's goal was "make a port for Macs and maybe high-end workstations that will die in ten years", how the gently caress did they think $100 for a license fee for a loving port was a good idea when OEMs can just spend $1 on a USB controller instead :psyduck:

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
My understanding is that originally Intel wanted to wait until the fiber-optic version of Thunderbolt was ready, and they slapped that ridiculous fee on there to discourage "early" copper cable versions like we have now.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

fishmech posted:

My understanding is that originally Intel wanted to wait until the fiber-optic version of Thunderbolt was ready, and they slapped that ridiculous fee on there to discourage "early" copper cable versions like we have now.

You maybe shouldn't believe everything you read; that has all the earmarks of an apocryphal internet story somebody edited into Wikipedia or whatever. Thunderbolt 1.0 ports work fine with optical cables like this one. You could buy fiber optic cables almost from day 1, iirc; I don't think there was any significant delay getting them on the market.

The grain of truth is that fiber was planned to be the primary or only connection method, but that depended on research projects at Intel and Corning to dramatically reduce the cost/size/power/etc. of 10G optical transceivers, connectors, and fiber to the point where they could become mass-market products. This apparently didn't work out as well as they planned, and thus we got the Thunderbolt that went to market: a pair of 10G electrical links at the connector, good for only a few inches of distance, with either a fiber optic transceiver or a copper media line-conditioning IC integrated into the cable head. It's a bit like ancient AUI Ethernet gear, except the transceiver is part of the cable rather than a separately sold item. (This also contributed to high costs, of course: even copper Thunderbolt cables have two line conditioner ICs, one at each end. One of the nice things about Thunderbolt 3, which is supposed to help, is a new spec for passive 20 Gbps (2x10G links) copper cables.)
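
Putting rough numbers on the link budget described above (per-direction signalling rates; encoding overhead ignored for the sketch):

code:

GBPS = 1e9
gens = {
    "Thunderbolt 1": 2 * 10 * GBPS,  # two independent 10 Gbps channels
    "Thunderbolt 2": 20 * GBPS,      # same lanes, bonded into one channel
    "Thunderbolt 3": 40 * GBPS,      # 20 Gbps over the new passive copper
}
for name, bits in gens.items():
    print(f"{name}: {bits / GBPS:.0f} Gbps aggregate (~{bits / 8 / GBPS:.1f} GB/s)")

Note that on Thunderbolt 1 a single device stream tops out at 10 Gbps, since the two channels aren't bonded.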

Also, I've never heard of an actual $100 fee or whatever. There's a perfectly viable explanation for why today's Thunderbolt stuff costs a lot: it's that Intel doesn't permit third-party silicon other than the cable transceiver ICs, combined with low volume. The latter is probably the most important part, and is something a lot of people don't really appreciate. Like, I once designed a 1" x 3" controller board for a little industrial device, and even though it was seriously low-tech even at the time we built it (8-bit CPU, 8 MHz), the cost to make each one was over $100 because we were only making about 100 of them. Compared to that, Thunderbolt is high-volume manufacturing, but it's nowhere near the USB tier of high volume, and that really does matter. If Intel had been more willing to open the standard up and if Microsoft had done a better job of supporting it, Thunderbolt 1/2 might have gone mainstream on the Windows side and things would be a lot cheaper.
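
That volume point as a toy model; every number here is invented for illustration:

code:

def unit_cost(nre: float, bom: float, volume: int) -> float:
    # Fixed engineering/tooling cost (NRE) amortized over the production
    # run, plus per-unit bill of materials.
    return nre / volume + bom

nre, bom = 50_000.0, 8.0  # hypothetical figures
for volume in (100, 10_000, 1_000_000):
    print(f"{volume:>9,} units -> ${unit_cost(nre, bom, volume):,.2f} each")

At 100 units the amortized fixed cost dwarfs the parts; at a million it vanishes, which is the gap between a $100 industrial board and a $1 USB controller.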

japtor
Oct 28, 2005
Back in 2013, according to DigiTimes, the controllers were $35, but apparently Intel rebutted that and said they were under $10.
http://appleinsider.com/articles/13/05/28/intels-grip-on-thunderbolt-keeps-accessories-off-the-market
http://appleinsider.com/articles/13/05/31/intel-on-thunderbolt-our-goal-is-quality-more-compatible-accessories-on-the-way

'Course, that'd still be $10 plus however much to implement on board, plus whatever certification and licensing crap Intel was enforcing (if they chose to work with you at all). TB3 is supposedly price-competitive with USB 3.1 Gen 2 controllers, plus I think it's one of the first such controllers that's even available in the first place.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

blowfish posted:

Unless Intel's goal was "make a port for Macs and maybe high-end workstations that will die in ten years", how the gently caress did they think $100 for a license fee for a loving port was a good idea when OEMs can just spend $1 on a USB controller instead :psyduck:

It's why USB 3.0 and the new USB-C standards are sticking.

Because Intel didn't learn the lesson from Sony's Betamax and FireWire.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Given the new rumors of Broadwell-E prices, seeing how they're pretty much the same for minimal performance improvements, I am considering going with a Haswell-E. Regarding power consumption, are the 14nm processors that much greener than the 22nm ones, or is it chicken poo poo amounts?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Combat Pretzel posted:

Given the new rumors of Broadwell-E prices, seeing how they're pretty much the same for minimal performance improvements, I am considering going with a Haswell-E. Regarding power consumption, are the 14nm processors that much greener than the 22nm ones, or is it chicken poo poo amounts?

I expect that Broadwell-E will use a good bit less power; the Skylake stuff does as long as it's not using the integrated GPU, and Broadwell-E obviously won't have that.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

CommieGIR posted:

Because Intel didn't learn the lesson from Sony's Betamax and FireWire.

No matter what anyone else tells you, the only true dealbreaker disadvantage Betamax had was the recording length issue. And that's really not applicable to Thunderbolt.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

fishmech posted:

No matter what anyone else tells you, the only true dealbreaker disadvantage Betamax had was the recording length issue. And that's really not applicable to Thunderbolt.

Also the whole market penetration thing.

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

BobHoward posted:

You maybe shouldn't believe everything you read; that has all the earmarks of an apocryphal internet story somebody edited into Wikipedia or whatever. Thunderbolt 1.0 ports work fine with optical cables like this one. You could buy fiber optic cables almost from day 1, iirc; I don't think there was any significant delay getting them on the market.

The grain of truth is that fiber was planned to be the primary or only connection method, but that depended on research projects at Intel and Corning to dramatically reduce the cost/size/power/etc. of 10G optical transceivers, connectors, and fiber to the point where they could become mass-market products. This apparently didn't work out as well as they planned, and thus we got the Thunderbolt that went to market: a pair of 10G electrical links at the connector, good for only a few inches of distance, with either a fiber optic transceiver or a copper media line-conditioning IC integrated into the cable head. It's a bit like ancient AUI Ethernet gear, except the transceiver is part of the cable rather than a separately sold item. (This also contributed to high costs, of course: even copper Thunderbolt cables have two line conditioner ICs, one at each end. One of the nice things about Thunderbolt 3, which is supposed to help, is a new spec for passive 20 Gbps (2x10G links) copper cables.)

Also, I've never heard of an actual $100 fee or whatever. There's a perfectly viable explanation for why today's Thunderbolt stuff costs a lot: it's that Intel doesn't permit third-party silicon other than the cable transceiver ICs, combined with low volume. The latter is probably the most important part, and is something a lot of people don't really appreciate. Like, I once designed a 1" x 3" controller board for a little industrial device, and even though it was seriously low-tech even at the time we built it (8-bit CPU, 8 MHz), the cost to make each one was over $100 because we were only making about 100 of them. Compared to that, Thunderbolt is high-volume manufacturing, but it's nowhere near the USB tier of high volume, and that really does matter. If Intel had been more willing to open the standard up and if Microsoft had done a better job of supporting it, Thunderbolt 1/2 might have gone mainstream on the Windows side and things would be a lot cheaper.

Even without licensing bullshit, Intel could have just subsidised the poo poo out of Thunderbolt if they actually wanted it to succeed and become useful to the point where you get two ports in every $350 craptop. It's not like they don't sink billions into mobile chips each year just to have a presence.

Also, regarding low-tech controller boards, couldn't you just essentially use $5 Raspberry Pi Zeros for everything?

suck my woke dick fucked around with this message at 12:11 on Jan 10, 2016

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

AVeryLargeRadish posted:

I expect that Broadwell-E will use a good bit less power; the Skylake stuff does as long as it's not using the integrated GPU, and Broadwell-E obviously won't have that.

Might possibly be worthwhile after all, considering what my Sandy Bridge box draws at idle.

What's the difference between the 1S Xeons that are EP and the E ones? Apart from supporting ECC and probably not being binned for overclocking (I'm assuming)?

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

fishmech posted:

No matter what anyone else tells you, the only true dealbreaker disadvantage Betamax had was the recording length issue. And that's really not applicable to Thunderbolt.

Licensing fees. Betamax was an excellent product held back by Sony jacking up licensing costs.

The same could be said of Lightning.
Hell, that is what happened to FireWire: excellent protocol, fast, but poor market penetration outside of high-end or specialized systems.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

CommieGIR posted:

Licensing fees. Betamax was an excellent product held back by Sony jacking up licensing costs.

Sony could have taped a $10 bill to every Betamax cassette and it wouldn't have mattered, because it still wouldn't have held a feature film.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Subjunctive posted:

Sony could have taped a $10 bill to every Betamax cassette and it wouldn't have mattered, because it still wouldn't have held a feature film.

And by the time they did, there were already so many people with VHS that they couldn't do anything about it. Like, yeah, Sony having high licensing fees and their own players being very expensive didn't help. But even if they'd been as cheap as VHS decks, there was the fact that the same money could get you a machine that recorded 2 hours in normal mode and 4 hours in extended mode vs. a machine that did 1 hour in normal mode and 2 hours in extended mode.

Incidentally, the "Betamax has better picture quality" thing was only because Sony's very expensive Betamax players had top-notch parts to go with the high prices, while you could get significantly cheaper VHS decks that cut all kinds of corners. The on-tape storage for both was equivalent - Betamax has very slightly better luminance resolution, while VHS has very slightly better color resolution.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug

Subjunctive posted:

Sony could have taped a $10 bill to every Betamax cassette and it wouldn't have mattered, because it still wouldn't have held a feature film.

True, didn't think about that.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

blowfish posted:

Also, regarding low-tech controller boards, couldn't you just essentially use $5 Raspberry Pi Zeros for everything?

On the one hand, Raspberry Pis didn't exist ten-ish years ago. On the other, even if they did, we'd still have needed a second custom board to provide stimulus to the physical process and measure the response, because that required some analog circuitry which isn't on the rpi 0. And on the gripping hand, the microcontroller I used was under $5 in qty 1 even then, included the ADCs I needed to measure analog signals (the interface board for a rpi would need an extra part), included enough flash memory to hold the control loop, and consumes probably 1/10th the power. The rpi 0 has several orders of magnitude more performance, RAM, and permanent storage if you add a microSD, but if you don't need those things it might not be a great fit.

Psmith
May 7, 2007
The p is silent, as in phthisis, psychic, and ptarmigan.
So I'm finally considering upgrading my CPU/MOBO and I have a few questions.

I have an i5 2500k that I bought about 4 years ago. I have recently upgraded my GPU to a 970 and I've been keeping my eye on Oculus (and VR tech in general), so I kind of want to get my CPU up to date. The main selling point for me here is that, because the 2500k is still a good chip that runs mostly everything at higher quality, I can use the current board/chip to create a secondary gaming PC for the living room.

So with that being said, I'm thinking about upgrading to the i7 4790K, as it is $300 at Microcenter. I would get a new board as well: when I built this machine originally I got a small form factor board to fit my smaller case at the time, and that has been something of a hassle as I've continued to upgrade, so I feel like it's time for a full size motherboard.

So I have a few questions:

1) Is the upgrade even worth it? I've done a little research on Google and the responses seem a little mixed. It seems that I could OC the 2500k fairly easily, but I'm currently using stock cooling, so I'd have to try and fit an aftermarket cooler on my small form factor mobo. The recommended CPU for Oculus is "i5-4590 equivalent or greater", so that's my target. Oculus isn't my only concern, but I believe it would be the most intensive one.

2) This is always my concern when buying a new expensive computer component, but is there some wonderful brand new CPU tech on the horizon that is worth waiting for?

Thanks!

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Psmith posted:

So I'm finally considering upgrading my CPU/MOBO and I have a few questions.

I have an i5 2500k that I bought about 4 years ago. I have recently upgraded my GPU to a 970 and I've been keeping my eye on Oculus (and VR tech in general), so I kind of want to get my CPU up to date. The main selling point for me here is that, because the 2500k is still a good chip that runs mostly everything at higher quality, I can use the current board/chip to create a secondary gaming PC for the living room.

So with that being said, I'm thinking about upgrading to the i7 4790K, as it is $300 at Microcenter. I would get a new board as well: when I built this machine originally I got a small form factor board to fit my smaller case at the time, and that has been something of a hassle as I've continued to upgrade, so I feel like it's time for a full size motherboard.

So I have a few questions:

1) Is the upgrade even worth it? I've done a little research on Google and the responses seem a little mixed. It seems that I could OC the 2500k fairly easily, but I'm currently using stock cooling, so I'd have to try and fit an aftermarket cooler on my small form factor mobo. The recommended CPU for Oculus is "i5-4590 equivalent or greater", so that's my target. Oculus isn't my only concern, but I believe it would be the most intensive one.

2) This is always my concern when buying a new expensive computer component, but is there some wonderful brand new CPU tech on the horizon that is worth waiting for?

Thanks!

I would wait until the Rift is actually out and games for it have been benchmarked. By then the prices on the new Skylake chips will hopefully have come down, so you'd be able to get the newest chip/mobo/RAM for a more reasonable price, or you could just put a good cooler on your 2500k, overclock it, and save a lot of money.

CommieGIR
Aug 22, 2006

The blue glow is a feature, not a bug


Pillbug
Update your BIOS if you have Skylake CPUs:

http://arstechnica.co.uk/gadgets/2016/01/intel-skylake-bug-causes-pcs-to-freeze-during-complex-workloads/

quote:

Intel has confirmed that its Skylake processors suffer from a bug that can cause a system to freeze when performing complex workloads. Discovered by mathematicians at the Great Internet Mersenne Prime Search (GIMPS), the bug occurs when using the GIMPS Prime95 application to find Mersenne primes.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨


Are the updates available yet?

EdEddnEddy
Apr 5, 2012



Psmith posted:

So I'm finally considering upgrading my CPU/MOBO and I have a few questions.

I have an i5 2500k that I bought about 4 years ago. I have recently upgraded my GPU to a 970 and I've been keeping my eye on Oculus (and VR tech in general), so I kind of want to get my CPU up to date. The main selling point for me here is that, because the 2500k is still a good chip that runs mostly everything at higher quality, I can use the current board/chip to create a secondary gaming PC for the living room.

So with that being said, I'm thinking about upgrading to the i7 4790K, as it is $300 at Microcenter. I would get a new board as well: when I built this machine originally I got a small form factor board to fit my smaller case at the time, and that has been something of a hassle as I've continued to upgrade, so I feel like it's time for a full size motherboard.

So I have a few questions:

1) Is the upgrade even worth it? I've done a little research on Google and the responses seem a little mixed. It seems that I could OC the 2500k fairly easily, but I'm currently using stock cooling, so I'd have to try and fit an aftermarket cooler on my small form factor mobo. The recommended CPU for Oculus is "i5-4590 equivalent or greater", so that's my target. Oculus isn't my only concern, but I believe it would be the most intensive one.

2) This is always my concern when buying a new expensive computer component, but is there some wonderful brand new CPU tech on the horizon that is worth waiting for?

Thanks!

As mentioned above, save the money, get a really good CPU cooler, and overclock that thing.

You should be good to go for entry-level Rift support as long as you have a few good USB 3.0 ports to use.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Psmith posted:

So I'm finally considering upgrading my CPU/MOBO and I have a few questions.

I have an i5 2500k that I bought about 4 years ago. I have recently upgraded my GPU to a 970 and I've been keeping my eye on Oculus (and VR tech in general), so I kind of want to get my CPU up to date. The main selling point for me here is that, because the 2500k is still a good chip that runs mostly everything at higher quality, I can use the current board/chip to create a secondary gaming PC for the living room.

So with that being said, I'm thinking about upgrading to the i7 4790K, as it is $300 at Microcenter. I would get a new board as well: when I built this machine originally I got a small form factor board to fit my smaller case at the time, and that has been something of a hassle as I've continued to upgrade, so I feel like it's time for a full size motherboard.

So I have a few questions:

1) Is the upgrade even worth it? I've done a little research on Google and the responses seem a little mixed. It seems that I could OC the 2500k fairly easily, but I'm currently using stock cooling, so I'd have to try and fit an aftermarket cooler on my small form factor mobo. The recommended CPU for Oculus is "i5-4590 equivalent or greater", so that's my target. Oculus isn't my only concern, but I believe it would be the most intensive one.

2) This is always my concern when buying a new expensive computer component, but is there some wonderful brand new CPU tech on the horizon that is worth waiting for?

Thanks!

I have an i5-2500k with a cheap air cooler (Hyper 212) and can run it at 4.4 GHz on stock voltage, changing no settings but the multiplier. It might be able to go higher; I've never tried. I don't know if you could do this effectively on the stock cooler, but you could probably find an upgrade that would work in your case, especially considering your power consumption won't increase that much if you don't change voltage.
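
That tracks with the usual first-order CMOS power model, where dynamic power scales with frequency and the square of voltage (a sketch; the 1.20 V / 1.35 V figures are assumed, and real chips add leakage and turbo behavior on top):

code:

def dynamic_power(f_ghz: float, volts: float, k: float = 1.0) -> float:
    # First-order CMOS dynamic power: P ~ k * f * V^2 (arbitrary units).
    return k * f_ghz * volts * volts

stock    = dynamic_power(3.3, 1.20)  # i5-2500k base clock, assumed stock voltage
oc_stock = dynamic_power(4.4, 1.20)  # 4.4 GHz, voltage untouched
oc_bump  = dynamic_power(4.4, 1.35)  # 4.4 GHz with a typical voltage bump

print(f"4.4 GHz @ 1.20 V: +{(oc_stock / stock - 1) * 100:.0f}% dynamic power")
print(f"4.4 GHz @ 1.35 V: +{(oc_bump / stock - 1) * 100:.0f}% dynamic power")

Multiplier-only overclocking at stock voltage costs ~33% more dynamic power in this model, versus ~69% once you raise the voltage, which is why a cheap cooler holds up.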

If you do have a 2500k with a good overclock, it's not going to be holding you back any appreciable amount. See these benchmarks with a Skylake vs. a much older Nehalem i7 965. That system has the additional handicap that Nehalem only supports PCIe 2.0, and it still mostly manages not to be bottlenecked much more than an overclocked Skylake, using a Titan X.

There isn't much on the horizon that's nearby and confirmed as far as CPUs go.

Eletriarnation fucked around with this message at 07:38 on Jan 13, 2016

Wistful of Dollars
Aug 25, 2009


I don't have much to add, but it's not worth it atm unless your current one dies.

At a minimum I'd wait to see what comes out this year, first.

lDDQD
Apr 16, 2006
So, because of this stupid thread I'm now a proud owner of a 5775C. Now, can anyone sell me their Z97 board, please?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
EK recalling all their AIO units lol

https://www.ekwb.com/news/656/19/Important-notice-from-EKWB-EK-XLC-Predator-240-and-360-R1-0-Product-recall/

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Yup, that's me. On my way to return it tonight.

lDDQD posted:

So, because of this stupid thread I'm now a proud owner of a 5775C. Now, can anyone sell me their Z97 board, please?

I'm teetering on the brink of getting one myself, because CPU/mobo/RAM are the last things I haven't nailed down for my new build yet. For the rest of the thread: they're $369.99 at B&H right now, if that store applies to you.

JacksAngryBiome
Oct 23, 2014

lDDQD posted:

So, because of this stupid thread I'm now a proud owner of a 5775C. Now, can anyone sell me their Z97 board, please?

I have seen about 4 roll through the used-parts auction site here, and they always go for at or just under the equivalent of $300 US. Each time I am tempted.

Not that I would ever do anything with it to justify that much chip.
