|
Anyone know if the i7 2600s is going to be available the same day as the rest of the lineup? I'm re-doing my mATX build, and looking to use the 2600s.
|
# ¿ Jan 5, 2011 02:03 |
|
Out of curiosity, does anyone happen to have any knowledge/info about the i7-2600S? I usually build my systems around mATX cases, so I was looking to use it in my next rebuild, since the 30W TDP difference seems nice for an mATX system (I do a decent bit of gaming, but I don't overclock). Intel lists the i7-2600S as having been released on the 9th, but so far I've only seen the 2600 and 2600K available at retail stores, and the only "S" variant I've seen is the i5-2400S (there should also be a 2500S, which I haven't seen). Intel's own "boxed processor" info page only lists the 2400S, which makes me think that the 2600S either hasn't started shipping yet or won't be available at retail. In that case, I'm debating between the 2500 and 2600.
|
# ¿ Jan 14, 2011 03:29 |
|
Roughly a week and a half after the official release, the only real signs of the i7-2600S are from a dozen or so HP desktop variants that use it. I guess that's probably where the majority of the i7-2600S supply went (large OEMs like HP and Dell). Anyway, guess it's time to order an i5-2500K or an i7-2600.
|
# ¿ Jan 18, 2011 23:59 |
|
Star War Sex Parrot posted:It was, but it was Q1. Now it's looking more like Q2.
|
# ¿ Sep 3, 2011 03:01 |
|
Shaocaholica posted:Oddball question about old rear end Intel CPU but... Articles relating to Intel's announcement of the lower power consumption appear to support it: http://www.xbitlabs.com/news/cpu/display/20060615113237.html and this Tom's Hardware forum discussion also indicated the relevant specs: http://www.tomshardware.com/forum/188276-28-quot-intel-lower-pentium-power-consumption-quot Additionally, while that link to the Intel Ark site doesn't specifically give the wattages, it does indirectly indicate that the D0 stepping uses less energy, since it has a lower Tcase temperature, via the Package Specifications section: Intel Ark posted:TCASE C1=68.6°C, D0=63.4°C
|
# ¿ Apr 19, 2014 21:23 |
|
pmchem posted:Same date as the WWDC keynote... unlikely to be a coincidence. It's also a day before Computex 2014 kicks off. Not everything revolves around Apple.
|
# ¿ May 24, 2014 23:19 |
|
Agreed posted:Well, what the gently caress am I going to do with a couple of suddenly last-gen parts? Never buy and then don't build a computer, it's the dumbest feeling. gently caress you too, bad back. You could put the parts up on SAmart at a discount, I'm sure there would be interest, and recoup some of the money to put towards Devil's Canyon and Z97? I'd imagine you'd be able to cover at least 60-70% of the cost. You'd still be out a little extra, but hey, super-awesome overclocking! (Possibly)
|
# ¿ Jun 3, 2014 18:46 |
|
Has there been any info on whether there will be a Xeon equivalent of the 4790K?
|
# ¿ Jun 19, 2014 05:51 |
|
The Lord Bude posted:If you manage to overclock the bejeesus out of it; then it can nearly match a core i3. Of course this also relies on buying a more expensive z97 mobo
|
# ¿ Jun 27, 2014 18:21 |
|
The Lord Bude posted:ooh, I haven't heard about these. Ignoarints posted:I think the best part is that there will be suitable $100 overclocking boards more than anything I think the intent is to have them be even cheaper than that. I'll have to find the site, but I thought Asrock had indicated at Computex that they hope to have some of the budget Z97 boards down in the $75-80 range. Granted, it won't come with the best features, but for a budget gaming system it shouldn't be bad at all.
|
# ¿ Jun 28, 2014 05:15 |
|
Don Lapre posted:People who got a DC from Tigerdirect. They dropped the price $10 on the 4790k. I called in and they are sending me a $10 gift card. Make sure you stay on them about it. I bought a monitor from them in 2012, and the price also dropped by $10 within a day or two. I contacted them and they said they'd issue a GC for $10, but that it'd take 2-3 weeks to receive. I waited a month, contacted them, and they said they'd issue it. Waited another month, nothing, contacted them again, and the person on the phone said they had no record of the prior attempts to issue the GC. I finally gave up after that.
|
# ¿ Jul 3, 2014 18:01 |
|
Cardboard Box A posted:I thought DDR4 was mostly the same price as DDR3 since DDR3 went up in price? Did DDR4 go up as well?
|
# ¿ Jul 4, 2014 20:06 |
|
atomicthumbs posted:That reminds me... will single-GPU setup suffer from performance degradation if I put it in a PCIe 3.0 x8 slot instead of an x16 slot? I'm driving a 1080p monitor with it. Someone can confirm this, but I believe PCIe 3.0 x8 bandwidth is equal to the bandwidth of PCIe 2.0 x16 (though ultimately it depends on how the slot was electrically wired). And I don't think PCIe 2.0 x16 was very limiting in single-card situations. There was some concern that R9 290X cards in Crossfire might be limited by PCIe 3.0 when the slots are operating as x8, but I'm not sure if that bore out (and obviously it doesn't matter in this instance). Canned Sunshine fucked around with this message at 20:45 on Jul 26, 2014 |
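As a quick sanity check on that equivalence, here's the back-of-the-envelope arithmetic (per-lane raw rates and encoding overheads come from the PCIe 2.0/3.0 specs; the function name is just for illustration):

```python
# Rough PCIe bandwidth comparison: 2.0 uses 8b/10b encoding at 5 GT/s
# per lane, 3.0 uses 128b/130b at 8 GT/s per lane.
def pcie_bandwidth_gbps(gen, lanes):
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # (GT/s per lane, encoding efficiency)
    raw, eff = specs[gen]
    return raw * eff * lanes / 8  # divide by 8 bits/byte -> GB/s

print(round(pcie_bandwidth_gbps(2, 16), 2))  # ~8.0 GB/s
print(round(pcie_bandwidth_gbps(3, 8), 2))   # ~7.88 GB/s
```

So 3.0 x8 lands within about 2% of 2.0 x16, which matches the claim above.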
# ¿ Jul 26, 2014 20:41 |
|
BobHoward posted:I'm not sure what you mean by 'how it was wired'
|
# ¿ Jul 27, 2014 05:50 |
|
BurritoJustice posted:To be fair, you can't actually use SLI alongside a M.2 3.0 4x SSD (like the Samsung is) on LGA1150 without a PLX chip. Honestly if you really had to do it, the cheapest option would be a 4690k/4790k alongside an ASRock Extreme9 (only motherboard I know of with both PCIE 3.0 4x M.2 and a PLX chip, also a good 200 dollars cheaper than other PLX boards). But you shouldn't. And doesn't the PLX chip introduce quite a bit of latency also? So it might not actually be an option for some either.
|
# ¿ Aug 31, 2014 10:01 |
|
The human eye doesn't observe in frames per second so the whole idea is ridiculous anyway. Military testing on human response time has shown people reacting to images shown at several hundred frames per second, but there's a big difference between what you can notice/react to and the time it takes the brain to actually process the information being observed, and it's definitely not a matter of "the brain can process all the information it receives as fast as it receives it".
|
# ¿ Feb 1, 2016 17:43 |
|
Palladium posted:they would have saved ~$150 in electricity bills @ 4 hours full load per day alone by now, not to mention the resale value of their parts.
|
# ¿ Jun 19, 2016 05:36 |
|
Palladium posted:The smart crowd would have easily made a profit by upgrading to a 2500K/cheap P67 combo at 2011 resale prices while getting higher performance all around and power savings gravy, while the not-so-smart ones are still mulling their heads over the sunk costs they paid for their X58 setup. But part of the whole argument is couched in 20/20 hindsight. No one (outside of Intel) at the time knew that performance would stagnate as it has, and that Sandy Bridge would still be as viable as it is today. So the majority of people would have argued to hold on to Nehalem/X58 for another generation or two and make the upgrade really worth it. Instead, Intel went in a different direction to get where we are today, so the above argument looks better in hindsight.
|
# ¿ Jun 19, 2016 06:59 |
|
Atomizer posted:If you still use a ball mouse then there are no words to express the pity we have for you. I've got one of the Apple puck mice framed in a shadow box above my desk to remind me of both the hilarity and occasional stupidity of Apple designs (and I say this as an Apple fan), as well as of the craziness of ball mice.
|
# ¿ Jul 20, 2016 04:17 |
|
Rastor posted:That can never happen, there's too much legacy code out there. Unless you think they are so arrogant they would force the switch and subject their users to some kind of slow translated mode for their current software.
|
# ¿ Dec 30, 2016 03:15 |
|
Platystemon posted:Exchanging heat with a lake or river is one thing, but making it an ornamental water feature? I would think that is unusual. If you want to, say, keep a water feature going in a cold climate, a fountain is a great way to do so, since the heat exchanger will help keep the fountain water from freezing and causing any valves or pumps to seize, while the extremely cold water is beneficial for maximizing heat transfer across the exchanger. I get to help design closed-loop cooling systems for ozone generators, and we see a decent increase in exchanger efficiency during the colder months of the year, when service water from distribution systems is at its coldest. Fountains work similarly.
|
# ¿ Jan 2, 2017 06:56 |
|
Sashimi posted:I was about to joke about not being hardcore enough if you didn't use a pump with an underground connection to the water table for cooling, then I scrolled down... A part of me wants to call him out on some of his hydraulics/soil comments but whatever, it's an interesting project.
|
# ¿ Jan 8, 2017 02:32 |
|
I got excited and then saw that it's $1300 and got sad.
|
# ¿ Jan 16, 2017 08:12 |
|
Watermelon Daiquiri posted:i still find it funny people are legitimately upset and disappointed that AMD 'only' managed to get within like 15% of Intel's latest offering of a decade long refinement process (thats in turn based on an already established design skeleton!) after only a few years of alpha and beta development on a complete brand spanking new design (unless zen is a refinement of the old k6s or whatever) Yeah, I think some people on this forum built up an idea that Ryzen would instantly surpass all of Intel's offerings, and when it didn't, apparently the release became a greater disappointment than the Titanic.
|
# ¿ May 7, 2017 20:15 |
|
SlayVus posted:If I were to move to X299 with an i9, this would probably be the board I would go with. It reminds me in a strange way of those tacky fiber optic "plants"/table pieces that people had in the late 80s/early 90s.
|
# ¿ Jun 1, 2017 05:44 |
|
Combat Pretzel posted:If that rumors shows up a few more times in near future, I consider it true and postpone my upgrade. For realsies, because that's a FTFY
|
# ¿ Jun 18, 2017 16:57 |
|
Is it wrong that seeing these results now makes me want to buy a 1700 non-X and simply OC it to 3.9/4.0?
|
# ¿ Jun 19, 2017 22:52 |
|
ufarn posted:This seems bad? Jesus Intel, it's a mature platform. How do they gently caress these things up?
|
# ¿ Jun 25, 2017 17:32 |
|
MaxxBot posted:Intel claimed that the Skylake-X was ideal for "12k" gaming by which they really meant 3x 4k screens or something, really dumb either way since no GPU setup can reasonably push that. You know if that'd been AMD claiming it, Paul would be all over it!
|
# ¿ Jul 2, 2017 19:16 |
|
Intel is one step away from calling all the review sites "fake news".
|
# ¿ Jul 13, 2017 19:13 |
|
DrDork posted:But I thought they were re-architected for l33t performance, unlike AMD's glued together cores? Well you see, that's just a biased test that isn't part of the SW Ecosystem.
|
# ¿ Jul 15, 2017 20:11 |
|
Anime Schoolgirl posted:the frozencpu guy literally did this Is there a complete write-up on this somewhere? I was always interested in finding out the full story and what became of the owner. priznat posted:Most of Intels fabs are actually in the USA, the assembly sites where the dies are packaged are in places like Malaysia.. Yeah, I drive by their north Chandler/south Tempe fab every week. Canned Sunshine fucked around with this message at 19:20 on Jul 16, 2017 |
# ¿ Jul 16, 2017 19:14 |
|
So if there's a CL-S, will there be a CL-T variant? Just wondering if it's real or just a myth...
|
# ¿ Jul 18, 2017 15:56 |
|
Paul MaudDib posted:I'm pricing out the Plans B to Threadripper for a luxury home NAS with a big-gun Xeon build. Just use Threadripper.
|
# ¿ Jul 19, 2017 06:09 |
|
Khorne posted:Nah, I have an i7 3770k and it has around 10% higher single and quad core performance than an overclocked ryzen. I paid around $330 for it 5 years ago.
|
# ¿ Jul 28, 2017 05:35 |
|
Kazinsal posted:The 8700K looks really good but holy poo poo I do NOT want to give Intel my money. Then don't and wait for the 1900X Threadripper.
|
# ¿ Aug 8, 2017 22:36 |
|
Doesn't WD own HGST?
|
# ¿ Sep 2, 2017 15:52 |
|
HalloKitty posted:Correct, but that doesn't mean the drives are the same. In fact, Toshiba was selling HGST drives for a while even when WD owned HGST because of some monopoly ruling or something. I think that changed in 2015/early 2016 though and everything HGST branded is manufactured by WD now.
|
# ¿ Sep 2, 2017 18:21 |
|
WhyteRyce posted:at the thought of having irreplaceable family photos stored only in Google or Amazon's cloud services. Yeah, I'm so paranoid about losing my wife's and my various photos and videos (wedding, baby, etc.), plus other files, that I have them backed up to two different cloud services (one of which is free), an external WD Red I keep at work, an Airport Time Capsule at home, and my gaming PC, my Mac mini, and our rMBP. Every 1-2 weeks I simply create a "backup folder" on a portable external HD and go around replacing the old folder with the current/new one.
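That replace-the-old-folder routine could be sketched in a few lines of Python; this is just an illustration of the idea, and `SOURCE_DIRS`, `DEST`, and `refresh_backup` are all made-up names, not anything from the post:

```python
# Minimal sketch: wipe the previous backup snapshot on the portable
# drive, then copy fresh copies of each source folder into it.
import shutil
from pathlib import Path

SOURCE_DIRS = [Path("~/Photos").expanduser()]  # assumption: folders to back up
DEST = Path("/Volumes/PortableHD/backup")      # assumption: external drive mount point

def refresh_backup(sources, dest):
    if dest.exists():
        shutil.rmtree(dest)       # drop the old snapshot
    dest.mkdir(parents=True)
    for src in sources:
        shutil.copytree(src, dest / src.name)  # fresh copy of each folder
```

Note this mirrors the "replace the old folder" habit exactly, so like the original it only ever holds one snapshot; versioned tools (Time Machine, rsync with `--link-dest`) keep history as well.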
|
# ¿ Sep 3, 2017 16:36 |
|
Dr. Fishopolis posted:You're really going to wait a whole year for a new chipset instead of buying a $50 PCI card? Thunderbolt 3 PCIe cards are available for $50? I hope these end up true since I'd just go ahead and get a R7 1700 and X370 and call it good. I've wanted to see how Coffee Lake's performance would compare and how it might affect prices on Ryzen. Canned Sunshine fucked around with this message at 15:24 on Sep 11, 2017 |
# ¿ Sep 11, 2017 15:21 |