Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

go3 posted:

You're not going to convince IT guy since he is a cast-iron idiot. If you want change, convince the people above/around him.

"My lovely laptop I got for $300 at walmart runs our software twice as fast as the machines you just finished building for us, why is that? Were they even cheaper than $300?"

KillHour
Oct 28, 2007


Aquila posted:

Probably never. Intel is very bad about actually bringing enterprise SSDs to market, and demand for the P3600 and P3700 is phenomenal.

KHAAAAAAN! :argh:

I really, really want those drat drives when I refresh our server offerings, unless anyone else can think of a good PCIe SSD for write-heavy(ish) loads under $1.50/GB.

KillHour fucked around with this message at 07:48 on Sep 18, 2014

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I seem to remember Intel teasing a feature where they could basically sleep a computer, and then have it instantly wake up to (say) respond to an attempted connection. This was different from a typical Wake-on-LAN.

Anyone remember what I'm talking about? Is that feature in any of the NUC PCs?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Paul MaudDib posted:

I seem to remember Intel teasing a feature where they could basically sleep a computer, and then have it instantly wake up to (say) respond to an attempted connection. This was different from a typical Wake-on-LAN.

Anyone remember what I'm talking about? Is that feature in any of the NUC PCs?

Connected Standby?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Not quite, but this let me dig up what I was talking about.

quote:

Ready Mode essentially allows a 4th Gen Core processor (and presumably newer chips, which are slated to arrive later in the year) to enter a low C7 power state while the OS and other system components remain connected and ready for action. Intel demoed the technology and, along with compatible third-party applications and utilities, showed how Ready Mode can allow a mobile device to automatically sync to a PC to download and store photos. The PC could also remain in a low power state and stream media, serve up files remotely, or receive VOIP calls. Ready Mode will be supported by a number of Intel’s system partners in a variety of all-in-one type systems a little later in the year.
http://hothardware.com/News/Intel-Outs-HaswellE-and-Devils-Canyon-CPUs-Ready-Mode-Technology-and-More-at-GDC/

More or less the same concept, but aimed at the desktop market instead of Atom processors and Metro apps.

I want this on a NUC or something similar as a low-power server. Doesn't look like it's available yet, though.
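
For contrast, classic Wake-on-LAN is completely dumb: the NIC just pattern-matches a "magic packet" (6 bytes of 0xFF followed by the machine's MAC sixteen times) and powers the box up, with the OS dead asleep until then. A minimal Python sketch, with the MAC and broadcast address made up:

code:
import socket

def send_magic_packet(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send a standard WoL magic packet: 6x 0xFF, then the MAC 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", ""))
    packet = b"\xff" * 6 + mac_bytes * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))

send_magic_packet("00:11:22:33:44:55")  # placeholder MAC
Ready Mode is the opposite idea: the OS stays up and reachable the whole time while the package idles in C7.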

Paul MaudDib fucked around with this message at 21:59 on Sep 19, 2014

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
C7 is disabled by default because most PSUs, except the most recent ones, don't support it.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Does HDMI 2.0 support just require a board redesign, or does it require CPU support as well? Is it possible we could get some Haswell boards with HDMI 2.0?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It requires a board redesign because of the huge clock speed increase, from 340 MHz to 600 MHz. Haswell and Broadwell support HDMI 1.4, not 2.0, and that's that.
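
Back-of-the-envelope, assuming the usual three TMDS data lanes at 10 bits per character with 8b/10b coding (ballpark math, not spec text):

code:
def tmds_gbps(char_rate_mhz: float) -> float:
    """Usable bandwidth: 3 lanes x 10 bits per character, minus 8b/10b overhead."""
    raw = char_rate_mhz * 1e6 * 3 * 10
    return raw * 8 / 10 / 1e9

print(tmds_gbps(340))  # HDMI 1.4: ~8.2 Gbps of pixel data
print(tmds_gbps(600))  # HDMI 2.0: ~14.4 Gbps, enough for 4K60 4:4:4
That clock jump is why this isn't something you can firmware-flash your way into.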

uaciaut
Mar 20, 2008
:splurp:
Wait wait wait, I was actually browsing around, saw the thread, skimmed through the OP, and saw that the integrated graphics of the i5 4690K is better than the GPU in my old PC (a Radeon 4850)???

Did I read that wrong? Because I wanted to use the two-week money-back warranty on my video card, so I sent it back and plugged everything back into my old PC.
Would there be any realistic downside to this (like extra strain on my CPU or something)? I don't plan to actually do any gaming till I get a new video card, but I do want to watch YouTube, videos, films, etc. and do poo poo.

P.S. Do I have to change the currently used GPU in the BIOS to use the integrated graphics?

edit: apparently I'm ancient; the GPU now sits on the actual CPU die, so it'll stress the cooler a bit more, but unless I abuse it (which I won't) I should be ok, right?

uaciaut fucked around with this message at 19:17 on Sep 24, 2014

1gnoirents
Jun 28, 2014

hello :)

uaciaut posted:

Wait wait wait, I was actually browsing around, saw the thread, skimmed through the OP, and saw that the integrated graphics of the i5 4690K is better than the GPU in my old PC (a Radeon 4850)???

Did I read that wrong? Because I wanted to use the two-week money-back warranty on my video card, so I sent it back and plugged everything back into my old PC.
Would there be any realistic downside to this (like extra strain on my CPU or something)? I don't plan to actually do any gaming till I get a new video card, but I do want to watch YouTube, videos, films, etc. and do poo poo.

P.S. Do I have to change the currently used GPU in the BIOS to use the integrated graphics?

No strain. Integrated graphics gets used a lot, though not so typically on a 4690K, if I had to guess. It does no harm or anything.

I'd just plug it in and boot it. There are options in the BIOS for it, but I think that only really matters if you've specifically disabled it.

Yaoi Gagarin
Feb 20, 2014

Silly nomenclature question: is there any reason at all why all the <architecture>-E chips are one number too high in the "thousands place"? Sandy Bridge CPUs were 2xxx and Sandy Bridge-E was 3xxx; Ivy Bridge was 3xxx and Ivy Bridge-E was 4xxx; now Haswell CPUs are 4xxx and Haswell-E is 5xxx, so Broadwell will probably be 5xxx and a hypothetical Broadwell-E would be 6xxx, and so on. That seems like a pointless and confusing decision. Did Intel ever explain this?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It's marketing. Pure "bigger numbers = better." For a tenuous justification: the -E variants are released later, closer to the release of the die shrink than to the release of the original uarch. Add in that they're better chips (on paper, if not necessarily in benchmarks), and therefore they're "next-gen."

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.
What's the latest word on desktop Broadwell? Core M chips should launch within a month or two; we *may* get LGA1150 CPUs in Q1 2015, and it seems they'll be a ~5% clock-for-clock upgrade over Haswell.

I'm pondering whether I should grab a 4690K / 4790K now or wait for the Broadwell desktop lineup. I could also grab an Anniversary Edition Pentium for *cheap*, overclock the hell out of it (watercooled rig), and sell it once Broadwell launches, since Z97 motherboards support both Haswell and Broadwell.

Upgrading from Nehalem i7-975.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Welmu posted:

What's the latest word on desktop Broadwell? Core M chips should launch within a month or two; we *may* get LGA1150 CPUs in Q1 2015, and it seems they'll be a ~5% clock-for-clock upgrade over Haswell.

I'm pondering whether I should grab a 4690K / 4790K now or wait for the Broadwell desktop lineup. I could also grab an Anniversary Edition Pentium for *cheap*, overclock the hell out of it (watercooled rig), and sell it once Broadwell launches, since Z97 motherboards support both Haswell and Broadwell.

Upgrading from Nehalem i7-975.

Wait for Skylake; Intel has said it will not be delaying it.

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf
Are there going to be any more CPUs made for the Z97 chipset? Is the 4790K the last and greatest for this mobo?

EoRaptor
Sep 13, 2003

by Fluffdaddy

r0ck0 posted:

Are there going to be any more CPUs made for the Z97 chipset? Is the 4790K the last and greatest for this mobo?

Broadwell has, so far, very different power needs than Haswell. Even if they preserve LGA1150, you would likely need a new motherboard to accommodate the new power requirements.

Skylake will absolutely need a new socket, as the switch to DDR4 is a big move.

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf
Pretty much what I figured. I think the last time I upgraded my CPU without replacing my mobo was the Celeron 300A.

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.

EoRaptor posted:

Broadwell has, so far, very different power needs than Haswell. Even if they preserve LGA1150, you would likely need a new motherboard to accommodate the new power requirements.
I thought all 9-series motherboards have native support for Broadwell CPUs?

Rime
Nov 2, 2011

by Games Forum

Malcolm XML posted:

Wait for Skylake; Intel has said it will not be delaying it.

That's what they said about Broadwell. Let's be honest: for a product launching in twelve months or less, there is practically nothing known about Skylake, and that does not bode well.

Edit: Intel is apparently considering forgoing EUV for the 7nm node.

Rime fucked around with this message at 17:37 on Sep 26, 2014

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

EoRaptor posted:

Broadwell has, so far, very different power needs than Haswell. Even if they preserve LGA1150, you would likely need a new motherboard to accommodate the new power requirements.

Most motherboard makers are saying that their Z97 boards will support the 5th-gen Core chips when they come out. But time will tell.

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!
So what's the over/under on when they finally say gently caress it and start switching over to graphene processes?

Rastor
Jun 2, 2001

Not until a number of years from now, by which time silicon progress will have nearly ground to a halt, sputtering along with enormous gaps between process changes.

KillHour
Oct 28, 2007


Correct me if I'm wrong, but I don't think anyone's ever managed to make even a one-off graphene processor in the lab with a non-trivial number of transistors (anything more than a half-adder, say).
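
For scale, a half-adder is about the smallest circuit you'd call "logic": sum is XOR, carry is AND, two gates total. Just to illustrate where that bar sits:

code:
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Smallest interesting logic block: sum = XOR, carry = AND."""
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a}+{b} -> sum={s}, carry={c}")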

Rime
Nov 2, 2011

by Games Forum

KillHour posted:

Correct me if I'm wrong, but I don't think anyone's ever managed to make even a one-off graphene processor in the lab with a non-trivial number of transistors (anything more than a half-adder, say).

Google search says...

Derp, that's not a processor. :suicide:

KillHour
Oct 28, 2007


Rime posted:

Google search says...

Derp, that's not a processor. :suicide:

Still impressive; they're further along than I thought. Unless they can get a band gap, though, there's no hope of ever making a digital logic circuit out of it.

Rastor
Jun 2, 2001

KillHour posted:

Still impressive; they're further along than I thought. Unless they can get a band gap, though, there's no hope of ever making a digital logic circuit out of it.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Yeah, graphene is fundamentally incompatible with traditional semiconductor manufacturing and logic without some very clever workarounds that haven't really been successful.

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf
Can they simply combine graphene with quantum computers and be done with it?

1gnoirents
Jun 28, 2014

hello :)
this is all moot; synaptic CPUs will allow us to bend time and space and run Crysis at 5K at 300 fps before any of that

Gucci Loafers
May 20, 2006
Probation
Can't post for 2 hours!
Won't a non-silicon processor also need a completely new instruction set?

JawnV6
Jul 4, 2004

So hot ...

Tab8715 posted:

Won't a non-silicon processor also need a completely new instruction set?

No.

One Eye Open
Sep 19, 2006
Am I awake?

KillHour posted:

Still impressive; they're further along than I thought. Unless they can get a band gap, though, there's no hope of ever making a digital logic circuit out of it.

What about virtual bandgaps in graphene?

Lord Windy
Mar 26, 2010
With the vacuum tube, if it has a higher GHz limit, would it be worth skipping graphene and going straight for that? I don't know anything about electrical engineering, but to me it sounds like the magic next step, not graphene.

Is quantum computing some new super-fast computer, or is it a kind of new magic way of looking at electrons? I remember someone telling me it's about using the fuzzy circumference of electrons as bits instead of using electrons as bits (i.e., 1 electron or whatever is now 2+ bits instead of just 1), but that doesn't sound like any performance gain.

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?

Tab8715 posted:

Won't a non-silicon processor also need a completely new instruction set?

Basic digital logic operations (AND, OR, NAND, NOR, XOR and so on) don't require silicon to be executed, nor do x86 instructions (although building an 8086 entirely out of non-silicon components would be a fun project).
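
The logic itself doesn't care about the substrate: NAND alone is functionally complete, so anything that can implement a NAND (relays, vacuum tubes, graphene, whatever) can implement all the rest. Toy sketch with Python standing in for the physical gate:

code:
def nand(a: int, b: int) -> int:
    return 1 - (a & b)

# Everything else falls out of NAND, whatever the NAND is made of:
def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor_(a: int, b: int) -> int:
    # the classic four-NAND construction
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))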

KillHour
Oct 28, 2007


Lord Windy posted:

With the vacuum tube, if it has a higher GHz limit, would it be worth skipping graphene and going straight for that? I don't know anything about electrical engineering, but to me it sounds like the magic next step, not graphene.

Is quantum computing some new super-fast computer, or is it a kind of new magic way of looking at electrons? I remember someone telling me it's about using the fuzzy circumference of electrons as bits instead of using electrons as bits (i.e., 1 electron or whatever is now 2+ bits instead of just 1), but that doesn't sound like any performance gain.

A normal computer stores data in memory as a state (in a binary system, a bit is either 0 or 1). A quantum computer takes advantage of the fact that you can have a particle be in a superposition of multiple states (a qubit, short for quantum bit, can be both 0 and 1 at the same time). For most math we do on a computer, this doesn't mean jack-squat. However, there are certain problems that are hard on classical computers but aren't hard on a quantum computer ("hard" has a specific meaning in computer science). It's these problems that quantum computers help solve; they're not better than classical computers in any generalized sense.
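
If it helps, the "both at once" part is just two amplitudes per qubit, and measurement picks a side with probability |amplitude|^2. Toy bookkeeping sketch (nothing quantum is being simulated here beyond a single qubit):

code:
import random
from math import sqrt

state = [1 / sqrt(2), 1 / sqrt(2)]  # equal superposition of |0> and |1>

def measure(amps: list[float]) -> int:
    """Collapse: returns 0 with probability |amps[0]|^2, else 1."""
    return 0 if random.random() < abs(amps[0]) ** 2 else 1

ones = sum(measure(state) for _ in range(10_000))
print(ones)  # ~5000, i.e. half the shots come up 1
The interesting speedups come from interfering those amplitudes before you measure, which is exactly the part a toy like this can't show.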

Edit: Watch this. https://www.youtube.com/watch?v=g_IaVepNDT4

KillHour fucked around with this message at 04:35 on Sep 27, 2014

Rastor
Jun 2, 2001

Lord Windy posted:

Is quantum computing some new super-fast computer, or is it a kind of new magic way of looking at electrons? I remember someone telling me it's about using the fuzzy circumference of electrons as bits instead of using electrons as bits (i.e., 1 electron or whatever is now 2+ bits instead of just 1), but that doesn't sound like any performance gain.
Quantum computers are not necessarily faster (though they have the potential to quickly solve problems that classical computers could not solve in a reasonable time, or at all), nor do they work by switching flowing electrons. They are fundamentally different things; even at the concept level they are based on qubits instead of bits. They are (or would be) used for solving different problems than classical computers, and though no doubt someone will invent some kind of game for them, you won't be running Crysis on one.

Rime
Nov 2, 2011

by Games Forum

Lord Windy posted:

With the vacuum tube, if it has a higher GHz limit, would it be worth skipping graphene and going straight for that? I don't know anything about electrical engineering, but to me it sounds like the magic next step, not graphene.

Is quantum computing some new super-fast computer, or is it a kind of new magic way of looking at electrons? I remember someone telling me it's about using the fuzzy circumference of electrons as bits instead of using electrons as bits (i.e., 1 electron or whatever is now 2+ bits instead of just 1), but that doesn't sound like any performance gain.

Quantum computing, at least the way it's currently being done by outfits like D-Wave (which I've toured, weird place), is not in any way usable for consumer purposes. It's not general-purpose hardware; it's closer to Bitcoin ASICs, and it's unlikely you could ever run a traditional OS or game on a quantum processor, just due to the weirdness of it being a non-binary system.

Certainly, backwards compatibility would not be possible.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Rastor posted:

Not until a number of years from now, by which time silicon progress will have nearly ground to a halt, sputtering along with enormous gaps between process changes.

the chipmakers might have to focus their innovation in areas other than shrinking :aaaaa:

Mr Chips posted:

Basic digital logic instructions (AND, OR, NAND, NOR, XOR and so on) don't necessarily require silicon to be executed, nor do x86 instructions (although building an 8086 entirely out of non-silicon components would be a fun project)

you could probably build one out of germanium fairly easily for a laid-back, chilled-out processing experience.

Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2

Rastor posted:

Quantum computers are not necessarily faster (though they have the potential to quickly solve problems that classical computers could not solve in a reasonable time, or at all), nor do they work by switching flowing electrons. They are fundamentally different things; even at the concept level they are based on qubits instead of bits. They are (or would be) used for solving different problems than classical computers, and though no doubt someone will invent some kind of game for them, you won't be running Crysis on one.
But if you ported Crysis to a quantum computer, would you finally get decent performance?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

atomicthumbs posted:

the chipmakers might have to focus their innovation in areas other than shrinking :aaaaa:

But we have ample evidence that this simply can't happen; look at how badly nV screwed up when they got stuck at 28nm for too long, the fools
