Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Combat Pretzel posted:

Weird limbo. Plus some bullshit between Intel and Apple.

Maybe try eBay? Plenty of grey-market electronics get sold there. Don't expect a warranty, or for it to work terribly well, though.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

kwinkles posted:

Tablets and phones are the perfect use case if the price is right.

If they can produce it economically, everything is a good use case for this. Why not blow away the entire SSD & HDD market and own it all? Intel could effectively become the only game in town for storage, if it's as good as they say and isn't cost prohibitive.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Durinia posted:

There's "economically viable" and "economically dominant".

They've stated that it's clearly built to be "viable", which means it probably lands on a $/bit scale somewhere between flash and DRAM. Some applications don't need the performance that XPoint could bring, and they will always choose the sufficient and cheaper (per bit) solution.

There's a reason that tape drives are still being used today. (Yes, really)
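To put rough numbers on that tiering (every figure below is a made-up placeholder, not real pricing), the survival logic is just $/bit:

```python
# Hypothetical $/GB figures, for illustration only -- not real market prices.
cost_per_gb = {
    "DRAM":   7.00,   # fastest, priciest per bit
    "XPoint": 3.00,   # assumed to land somewhere between DRAM and flash
    "NAND":   0.30,
    "HDD":    0.03,
    "Tape":   0.01,   # slowest, cheapest: why tape is still around
}

for tier, price in cost_per_gb.items():
    print(f"{tier:>6}: {1.0 / price:7.1f} GB per dollar")
# Each tier survives as long as it's cheaper per bit than the tier above it.
```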

If Intel has effectively turned storage into a problem they can solve with their CPU fabs, and have a 1000x performance improvement with a technology only they hold patents on and only they can manufacture, they could push Samsung and all other SSD makers out of the market by aggressively driving the cost of this new tech down. None of us has any idea how much this costs, but if I were Intel, I would be looking to own as much of the storage market as possible, from phones & tablets to consumer drives to high-end server drives & specialty devices.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
While I'm sure chip manufacturers don't care about this, it is significantly easier to recycle and reuse LGA CPUs vs PGA. Now you might say that just makes it harder to recycle/reuse the motherboard, but that was already crazy difficult. If a CPU is bad, it likely won't POST, so if that was the point of failure, it's easy to test. Motherboards can have dozens of points of failure and are a headache no matter what socket is used, so most recyclers don't even try to reuse them. They just shred them and reclaim the precious metals, so it doesn't matter how much harder you make it to reuse motherboards.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

PerrineClostermann posted:

Not arguing that, my t100 is great. But most enthusiasts do their heavy lifting on their towers, not on a more expensive, less able laptop.

So again, that's not the metric enthusiasts particularly care for.

Physics is a bitch. Until we can figure out an entirely new mode of computing, we'll have to focus on parallelizing our work instead of just making one really fast core. Cheap space travel would be nice too, but it turns out, it is actually REALLY HARD.
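To see why "parallelize instead" is the only move left, Amdahl's law caps the speedup from N cores by the serial fraction of the work. A minimal sketch (the 90% parallel figure is an assumed example, not a benchmark):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = parallelizable fraction of the workload, n = number of cores.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Assume 90% of the workload parallelizes (illustrative number only).
for n in (1, 2, 4, 8, 16, 1024):
    print(f"{n:>5} cores -> {amdahl_speedup(0.9, n):.2f}x")
# Even with 1024 cores the speedup tops out near 10x, which is why
# single-core speed still matters even in a parallel world.
```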

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

canyoneer posted:

The silliest marketing naming thing they still do is "nth Generation Core" processor. I get that Core is the brand, but within tech journalism/enthusiast circles everyone refers to the product generation by the internal codename.

And it's sort of bizarre to use a generic industry term as a key word in your branding. Core already means something in semiconductor parlance, so why make that the backbone of your branding instead of a made-up word (like Pentium!)?
https://en.wikipedia.org/wiki/Generic_trademark

It's better than it was in the early Core days. Core, Core Duo, Core2, Core2 Duo, Core2 Quad... WTF...

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

~Coxy posted:

Makes more sense than Pentium/i3/i5/i7.

Just think of a 920 as a Core 3 Quad, 2600K as Core 4 Quad, etc.

I don't think so... i3/i5/i7 maps well to budget, mainstream, performance, and it doesn't conflate a confusingly common word (Core) with the name. Sure, the model numbers within (4790K wtf) don't help, but the numbers for the Core series weren't much better (6600 vs 8300).

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
ACTUAL Game Designer

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Palladium posted:

Heh, because getting non-K chips at higher stock clocks, higher IPC, less power and better chipsets at the same price isn't progress. "But I can't overclock those so I have to spend an equivalent of a 850 Evo 500GB more to get my *free OC performance*"

This has been my biggest problem with overclocking... it's just not worth the effort or cost anymore. You get maybe a 5% improvement, and that isn't worth even an hour of my time. I'd rather just spend more on the CPU and use it hassle-free, or invest in other areas (SSDs, RAM, etc).

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Krailor posted:

Hey, a person can dream can't they...

While in reality I'll just wait for the day that AMD goes bankrupt and on that day buy whatever the top-of-the-line i7 and Geforce card is and ride those off into the sunset for the next 20 years.

While AMD is not doing great these days, ARM-based CPUs have been nipping at Intel's heels for a while, so hopefully if AMD goes under, someone will be able to keep Intel from raising their prices too high.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Gwaihir posted:

Also, next year is the year of Linux on the desktop!

Hey, I said hopefully someone steps up with an ARM design to keep Intel honest, not that ARM is currently competition for Intel in the desktop market.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Nintendo Kid posted:

An ARM design that could seriously compete would need to be able to run x86-64 code at an acceptable speed. That's the barrier they'd have to hit to keep Intel "honest".

It doesn't even need to compete to keep Intel honest, just threaten competition. It's not like AMD is seriously competing with Intel right now, but things would be a lot worse without them around.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

kwinkles posted:

Except that every arm server thing to date has been vaporware.

There isn't demand for them yet. Intel has the high end, AMD and others the rest. But if AMD were to go bye-bye and Intel started jacking up prices, there would be much more incentive to develop an alternative, and ARM is closest (assuming someone doesn't pick up AMD's designs from their corpse and start reusing them).

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

JawnV6 posted:

That's not how this works. That's not how any of this works.

What's this supposed to mean? Samsung could buy AMD and decide to go into the CPU market. Who knows what will happen to AMD's IP if/when they go under.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Edward IV posted:

Wait, so water does actually flow through the laptop using those hydraulic-style quick disconnect ports on the back? :lol:

I suppose it is more efficient to do it that way instead of, say, a conductive contact pad that connects the laptop to the water cooling system. Still, I wonder how it deals with the water left in the laptop after you disconnect it from the cooling unit, because having any water left in the laptop sounds like a bad idea. The whole thing sounds like a bad idea.

I believe it uses water cooling internally as well, so it's supposed to have water left inside. But when connected to the dock, it gets a larger reservoir and radiator.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

As your second graph shows, no. Moore's law specifically relates to the density of transistors, not their clock rate.

Taken in a larger perspective, the answer is still probably no. Ray Kurzweil's law of accelerating returns would have it that, while clock rate may have hit a wall, other aspects of computing have continued to improve significantly, like number of cores, power consumption, cost/performance, etc.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

dpbjinc posted:

Moore's Law is doomed to fail eventually. You need at least 24 atoms split among the three semiconductor regions, plus the metal that connects them to each other. If you need to go smaller, you won't be using traditional transistors.

Once the observable universe is converted completely into computing substrate, yes, we will have to make do with what is available.
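More seriously, today's features are still a long way from that floor. Silicon's diamond-cubic lattice packs 8 atoms into a unit cell about 0.543 nm on a side, so a rough back-of-the-envelope (treating the "node" name as an actual cube edge, which it isn't) looks like:

```python
# Back-of-the-envelope: silicon atoms in a cube-shaped feature.
LATTICE_NM = 0.543    # silicon unit cell edge, ~0.543 nm
ATOMS_PER_CELL = 8    # diamond-cubic silicon: 8 atoms per unit cell

def atoms_in_cube(edge_nm: float) -> float:
    return ((edge_nm / LATTICE_NM) ** 3) * ATOMS_PER_CELL

for edge in (14, 7, 5):  # nominal "node" sizes in nm, taken loosely
    print(f"{edge} nm cube: ~{atoms_in_cube(edge):,.0f} silicon atoms")
# A 5 nm cube still holds ~6,000 atoms, well above a ~24-atom minimum --
# though real gate features are smaller than the marketing node name implies.
```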

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
They also don't need to soundly defeat Intel in any specific performance metric, as they can seriously compete on cost per core.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Combat Pretzel posted:

Why wouldn't one want XPoint, if it's even faster solid state memory? --edit: I mean with NVMe interface.

Assuming it costs the same, of course I'd want XPoint. I just doubt it will be cost-effective for desktop users in 2016/2017.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
XPoint makes a lot of sense in tablets too, if the cost can be brought down. No more need for separate buses for memory and storage; just partition 8GB as memory and the other 120GB as storage.
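Software already has an analogue of that blurring: memory-mapping a file makes storage byte-addressable, which is roughly how persistent-memory programming models would expose a device like this. A minimal sketch using an ordinary file as a stand-in for an XPoint region (the file name and size are made up; this is not a real XPoint/pmem API):

```python
import mmap
import os

# Ordinary file standing in for a byte-addressable persistent region.
PATH, SIZE = "fake_xpoint.bin", 1 << 20  # 1 MiB, made-up example values

with open(PATH, "wb") as f:
    f.truncate(SIZE)                 # reserve the 'storage' region

with open(PATH, "r+b") as f:
    mem = mmap.mmap(f.fileno(), SIZE)
    mem[0:5] = b"hello"              # write through the mapping like RAM
    mem.flush()                      # persist it: the 'storage' half
    print(bytes(mem[0:5]))           # b'hello'
    mem.close()

os.remove(PATH)
```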

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

A Bad King posted:

Are we stuck at 2013 performance levels then? Doesn't Intel have a huge amount of cash for research?

There are some physical limitations we are running up against with current processor technology. It's going to take something major to break through that.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Subjunctive posted:

What would happen over the course of those couple of days? Is there some metallurgical process that occurs or is undone slowly at room temperature? This stuff fascinates me but my materials-science knowledge doesn't include anything like that, sorry if it's obvious to everyone else in the thread.

I think he was just suggesting not playing any games until he gets his new parts. Not to let it cool off, but to not push the CPU any harder. The goal is for nothing bad to happen to the CPU over those days.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Ika posted:

Never heard of a water pump header. I know 3-pin case fan headers are fairly common, but they usually don't provide enough current for a pump and carry a warning not to draw too much.

I would imagine they're basically the same, except the pump header CAN supply more current; a fan plugged into it should still work fine.
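A rough power check shows why the distinction matters. The ratings below are typical assumed values, not from any particular board manual:

```python
# Assumed typical header ratings -- check your actual board manual.
VOLTS = 12.0
headers = {
    "fan header":  1.0,   # amps; a common case-fan header limit (~12 W)
    "pump header": 2.0,   # amps; a beefier pump header (~24 W)
}

for name, amps in headers.items():
    print(f"{name}: up to {VOLTS * amps:.0f} W")

# A 120 mm fan draws only a few watts, fine on either header.
# A pump pulling ~18 W would overload the 12 W fan header but
# sits comfortably on the pump header.
```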

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Malloc Voidstar posted:

My current laptop is running with some hosed up combination that added up to 24GB (too lazy to disassemble to remove the old) and I never noticed a speed decrease, FWIW.

You can get to 24GB with matching sticks: 2x8 + 2x4. As long as they are paired correctly, it shouldn't significantly decrease performance.
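A tiny sanity check of the pairing logic, assuming a typical board with two sticks per channel:

```python
from collections import Counter

def paired_ok(sticks_gb):
    """True if every stick capacity can be matched into identical pairs."""
    return all(n % 2 == 0 for n in Counter(sticks_gb).values())

print(paired_ok([8, 8, 4, 4]), sum([8, 8, 4, 4]))  # True 24
print(paired_ok([8, 8, 4]))                        # False: lone 4GB stick
```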

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
How fast the eye can see things is a fairly complex question. 24fps is basically what it took to make films stop flickering, but it's nowhere near the maximum. It also depends on what kinds of frames. For example, if you have 100 bright white frames and a single black frame, you probably won't notice it. If you have 100 black frames and 1 bright white frame, you will definitely notice it.


http://www.100fps.com/how_many_frames_can_humans_see.htm
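Frame times make the asymmetry easier to reason about; this is just the arithmetic, not a perception model:

```python
# Duration of a single frame at common frame rates.
for fps in (24, 60, 100, 144):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")

# At 100 fps a lone frame lasts 10 ms. A 10 ms bright flash on a dark
# screen is easy to spot; a 10 ms dark gap in a bright scene mostly
# isn't, because the eye responds to a burst of light far more than to
# a brief absence of one. So "how many fps can you see" has no single answer.
```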

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Lovable Luciferian posted:

Do they require the actual cores or is four threads sufficient in these cases?

Most do not REQUIRE 4 cores, but will suffer without them.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Pryor on Fire posted:

I'm confused about the new CPU requirements for Windows 10. For those of us who are never installing Windows 10 no matter what, what will be the best/final Intel CPU available?

It's so bizarre to talk about the "final" hardware we'd ever be able to run, never thought I'd see this day in the tech world. Goddamn.

Why don't you want to install Windows 10? It's been one of the smoothest new OS installs I've ever done. It sounds like you're making life hard for yourself, and then complaining like it's not your fault.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

slidebite posted:

I don't know - it's not like the Windows 10 thread is just made up of people talking about breakfast, it's a lot of people having issues with it albeit with varying degrees of severity.

I'll probably take the Win X plunge on my new PC come July, but I'm certainly not in any rush.. although the whole MS trying to force updates to people that obviously don't want it (hiding nag updates, etc) is becoming pretty annoying and actually turning me off of it.

Not being in a rush is very different from "NEVER EVER EVER EVER INSTALLING WINDOWS 10! 3.11 FOR LYFE! PS WHY ARE THERE NO DRIVERS FOR MY GTX980 NO FAIR M$ YOU SUCK". The main reason I installed 10 was that I also got an Intel 750 and decided a fresh install was warranted anyway, so I figured I might as well try 10. It has been working pretty smoothly so far.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

PBCrunch posted:

I gave my thirteen-year-old nephew an old desktop computer I cobbled together out of spares and craigslist parts. It is an old Dell Inspiron 530 with:

C2D E7500 (~3GHz)
stock Dell motherboard and microATX case
8GB DDR3 (all four DIMM slots populated)
Geforce GTX 460
EVGA 500W PSU
250GB Sandisk SSD (the highlight of the whole thing)
Windows 10
22" 1680x1050 Viewsonic LCD (pretty old, probably TN, but not a fast refresh example)
PCIe Wifi card

His mom does not have a ton of money, so they "borrow" internet access from the neighbor's wifi (with permission). My nephew has Minecraft, Portal 1 & 2, TF2, and Dota2. He really seems to like Minecraft and TF2. Gaming performance is the main priority for this machine. The internet access seems pretty crappy to me, but he still likes playing TF2 on it.

My mom (his grandma) wants to spend ~$100 on his birthday, and my nephew is very excited about the prospect of upgrades. I saw a sale on a G3258 and a Gigabyte unlock-friendly B85 motherboard packaged together for $110. He has expressed interest in Subnautica and Dying Light. Subnautica is just on the edge of playable on his hardware, and from what I can tell Dying Light is pretty hardware-intensive.

I'm not sure if the G3258 direction is the right idea or not. It would be fun to explore overclocking with him and maybe fool around with water cooling eventually. Something like that would have been very appealing to me at his age. On the other hand, the two core/no hyperthreading limitations of the G3258 kind of suck. The positives here are that he would have a modern platform and could upgrade to a Haswell Core i5 quad core later (although looking at eBay it seems that Intel quad-core CPUs newer than Core 2 Quads hold their value pretty well). Are there games on the horizon that will completely lock him out for having only two cores and no HT?

Another direction, as far as I can tell, is getting a Core 2 Quad Q9650 or an LGA 771 Xeon for his existing motherboard and saving the remaining budget towards a GTX 750 or 950. Single-core CPU performance would be reduced compared to the G3258, and overclocking would be off the table, but he would have four cores.

It seems like the AMD alternatives are pretty weak all the way around. A comparison I saw showed the G3258 beating the Athlon 860K in many games, including the Witcher 3 and GTA V, which seem like titles where four cores would help. Does AMD offer anything compelling at this budget level?

Whatever we do, it seems that the PSU, RAM, SSD, and graphics card should be reusable. I have an ATX case and a pretty decent aftermarket CPU cooler he can have if that becomes necessary. He has a budget Android tablet of some variety (Kindle Fire maybe?), a Wii (not Wii U), a PS2, and a 3DS. He does not own any newer consoles than that, so this PC is his primary gaming machine at this point.

Or is all this CPU/motherboard chat worthless, and he really should get the best graphics card available for ~$100?

A $100 graphics card probably wouldn't be worth it, since he already has a GTX 460.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Subjunctive posted:

Why do they get hotter than 2.5" or slot?

I believe it's the controller chips that end up producing most of the heat when doing constant read/write activity. The actual storage chips barely heat up at all.

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Subjunctive posted:

And the NVMe controller chips run hotter?

E: hey there new page, sup?

They are capable of producing more heat, and M.2 drives usually don't have any heatsink, unlike 2.5" drives, whose metal enclosures help spread it.
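If you want to watch this yourself on Linux, recent kernels expose NVMe drive temperatures through the hwmon sysfs interface; a small sketch (assumes the standard hwmon layout with an entry named "nvme"):

```python
import glob
import pathlib

# Recent Linux kernels register NVMe temps as an hwmon device named "nvme".
for name_file in glob.glob("/sys/class/hwmon/hwmon*/name"):
    hwmon = pathlib.Path(name_file).parent
    if hwmon.joinpath("name").read_text().strip() != "nvme":
        continue
    for temp_file in sorted(hwmon.glob("temp*_input")):
        millideg = int(temp_file.read_text())  # reported in millidegrees C
        print(f"{hwmon.name}/{temp_file.name}: {millideg / 1000:.1f} C")
```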
