Anime Schoolgirl
Nov 28, 2002

http://www.bloomberg.com/news/articles/2016-06-21/intel-fights-record-1-2-billion-antitrust-fine-at-top-eu-court

:thumbsup:

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

Jeeeeesus, I had no idea this was still going on. So gross. I really hope Zen is something incredible (despite having just ordered an Intel CPU today...).

Atomizer
Jun 24, 2007




Intel should get a goddamn award for "squeezing out AMD." :colbert:

Nam Taf
Jun 25, 2005

I am Fat Man, hear me roar!

dud root posted:

My ASRock Z170 Extreme4 refused to clock my G.Skill 3200 RAM at anything above about 2400 until the recent v3.20 BIOS. Now it does 3200 without issue. At a guess, compatibility has been improved a bunch across other boards with bleeding-edge BIOSes too (it should've worked in the first place six months ago, though).

I literally had the opposite with my Gaming 4. BIOS 1.90 accepts my G.Skill 3200 XMP profile fine, but upgrading the BIOS means it won't even POST unless the RAM is run at 2133.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

AVeryLargeRadish posted:

Yeah, it's probably the memory controller. Hell, I was reading a review of the ASRock Z170M OC Formula, which is supposed to be able to run RAM at over 4500MHz, and they only put two RAM slots on it because they found that putting on four was pointless: it's impossible to run more than two DIMMs at those sorts of speeds.

The DDR4 spec allows progressively fewer DIMM sockets per channel the faster you run it. It's hard to make digital signals wiggle that fast if there are too many loads and stubs on the bus. The same is true of DDR3, actually; it's just worse with DDR4.
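To put a back-of-envelope number on it: at DDR4-3200 a data line toggles 3.2 billion times per second, so each bit gets roughly 1/3.2GHz ≈ 312 picoseconds on the wire, versus 625 ps at DDR3-1600. Every extra DIMM hangs another stub and another capacitive load on the same trace, and reflections that fit comfortably inside the DDR3 bit window start eating a real fraction of DDR4's.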

Paul MaudDib posted:

You know what burns me up? SATA Express. Who gives a poo poo that your Z170 motherboard has three SATA Express ports? poo poo's been dead since 2014; nobody cares. There do not even exist (to my knowledge) any devices you can plug into it, and there probably never will be.

Apple's been shipping SATA Express SSDs in much of their Mac lineup since 2013. It's probably the only place the standard gained any traction, mostly because Apple did not want to wait for NVMe to break the 500 MB/s barrier. They started moving to NVMe in 2015, but some models are still shipping with SATA Express SSDs today.

They didn't use the SATA Express connector, though. Not M.2 either; that standard wasn't ready (or didn't exist) in 2013. Instead it's a proprietary M.2-like gumstick design. So if you have that poo poo on your motherboard, I am afraid you cannot rip an Apple SSD out of a 2013-2016 MacBook Pro and plug it in! I am sure you are so disappointed.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Paul MaudDib posted:

Still oscillating between a 5820K and a 6700K.

You know what burns me up? SATA Express. Who gives a poo poo that your Z170 motherboard has three SATA Express ports? poo poo's been dead since 2014; nobody cares. There do not even exist (to my knowledge) any devices you can plug into it, and there probably never will be.

And when it came to USB 3.0, which people actually cared about, they were two years late to the party, presumably because an open industry standard wouldn't roll in as much dough as the in-house proprietary Thunderbolt they relentlessly pushed. They only caved when the entire market gave them a big middle finger.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

BobHoward posted:

The DDR4 spec allows progressively fewer DIMM sockets per channel the faster you run it. It's hard to make digital signals wiggle that fast if there are too many loads and stubs on the bus. The same is true of DDR3, actually; it's just worse with DDR4.


Apple's been shipping SATA Express SSDs in much of their Mac lineup since 2013. It's probably the only place the standard gained any traction, mostly because Apple did not want to wait for NVMe to break the 500 MB/s barrier. They started moving to NVMe in 2015, but some models are still shipping with SATA Express SSDs today.

They didn't use the SATA Express connector, though. Not M.2 either; that standard wasn't ready (or didn't exist) in 2013. Instead it's a proprietary M.2-like gumstick design. So if you have that poo poo on your motherboard, I am afraid you cannot rip an Apple SSD out of a 2013-2016 MacBook Pro and plug it in! I am sure you are so disappointed.

NVMe and SATA Express work at different layers; you can have them operate together (https://www.sata-io.org/sites/default/files/documents/NVMe%20and%20AHCI%20as%20SATA%20Express%20Interface%20Options%20-%20Whitepaper_.pdf).


It's not like it matters, since Apple uses whatever is best for Apple Inc. at any given time. They will likely move to PoP or BGA SSDs.


Also, the performance difference between AHCI and NVMe on consumer hardware is minimal; maybe NVMe uses less power?

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
I think I just did something stupid: I bought an open-box Gigabyte GA-X99-UD3P motherboard for $135. I don't have a CPU or memory to go with it. This would be for my file/Plex server in my basement. I do a lot of transcoding, and this should be way faster than the (Lynnfield) Xeon X3450/P55 setup I have now. I might save some money on electricity, too. The P55 board goes for ~$75 on eBay.

I'm thinking an eBay E5-2670 v3. Do I have to buy ECC/registered memory when using a Xeon CPU on X99? Can I? Should I? The board should work with two DIMMs, right? It just won't have quad-channel memory bandwidth?

My Cooler Master Hyper T4 should work, as long as I can dig up the box of adapter pieces, right?

EdEddnEddy
Apr 5, 2012



You should be able to use normal DDR4 on X99 with a Xeon, though it sounds a little overkill for a NAS/Plex server.

I run a little passively cooled E-450 rig for my NAS, and as long as the video is encoded correctly at the start, streaming it out seems to be rather straightforward. Hell, Plex can run on a Shield TV now, with like four encoding threads at the same time if need be, which is pretty sweet.

Now, if you plan to run some VMs or whatnot on that thing, then the X99 makes perfect sense.

NihilismNow
Aug 31, 2003

EdEddnEddy posted:

You should be able to use normal DDR4 on X99 with a Xeon, though it sounds a little overkill for a NAS/Plex server.

Actually, it's highly recommended if you run ZFS (ECC, that is).

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
I meant an eBay 12-core chip priced in the ballpark of an E5-2670. I'm definitely not spending $1,500 on a CPU.

I generally start with HD source material, keep it around in HD for one viewing, then transcode it into something like 400p for long-term file storage.

I'm sure it is ridiculous overkill, but the price is low. I really wanted an X79/E5-2670 (the 8-core model), but the motherboards are absurdly priced and lack features.

Can I expect something like this to work? The E5-2683 is not on the motherboard's CPU HCL, but the 2685 and 2680 are.

Does anyone have experience with gaming in VMs? What is the feasibility of running two Windows VMs on this system, with each VM having access to its own GPU (probably something like an RX 480)? Kind of like the Linus Tech Tips 7-gamers system, knocked down to two gamers and without the unlimited budget.

PBCrunch fucked around with this message at 18:47 on Jun 22, 2016

EdEddnEddy
Apr 5, 2012



NihilismNow posted:

Actually, it's highly recommended if you run ZFS (ECC, that is).

Interesting. I may have to look into this when my storage grows again in the future.

Potato Salad
Oct 23, 2014

nobody cares


Windows Server 2016 is the only Hyper-V version with PCI passthrough, and it's still immature.

Windows VDI workstation resources are really meant to be used with RemoteFX, which is a graphics adapter for Hyper-V VMs that forwards guest calls to the host's own graphics adapter. This differs from GRID on Xen or VMware, where the host exposes fractions of the available hardware directly to each guest. VMware's host-mediated solution is poo poo because their guest drivers are poo poo, hence the drive for Nvidia to develop the directly-exposed hardware option.

Problem for you: RemoteFX is Enterprise-only with Software Assurance active, so that's volume-licensing territory. Win Server 2016 will be expensive as gently caress.

Go the VMware route if you want multiple local consoles for gaming. If you need remote gaming consoles, you'll need VMware + expensive Nvidia Quadro cards or a poo poo AMD card. If you want to use non-AMD-poo poo, non-gouging-priced Nvidia cards for remote VDI, you will need to use KVM, as it is capable (with a lot of work) of preventing a GTX card from discovering that it is on a virtualized PCI bus.
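For reference, the heart of the KVM trick is masking the hypervisor so the GeForce driver doesn't throw its Code 43 tantrum. A minimal untested sketch; the PCI address and disk image are placeholders, and the card has to be bound to vfio-pci first:

    # Minimal sketch, not a tested recipe. "kvm=off" hides the KVM
    # signature from the guest so the GeForce driver can't tell it is
    # on a virtualized PCI bus; vfio-pci hands the card to the VM.
    qemu-system-x86_64 -enable-kvm -machine q35 -m 8192 -smp 4 \
        -cpu host,kvm=off \
        -device vfio-pci,host=01:00.0 \
        -drive file=win10.qcow2,format=qcow2

If you drive it through libvirt instead, the equivalent knob is <kvm><hidden state='on'/></kvm> under <features> in the domain XML.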

SuperDucky
May 13, 2007

by exmarx

PBCrunch posted:


Can I expect something like this to work?

We had this conversation a few pages ago, but that is an "ES", i.e. an engineering sample, and board compatibility can be wonky.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

PBCrunch posted:

Can I expect something like this to work?

As mentioned, engineering samples can have motherboard compatibility issues and different clocks, turbo clocks, or power draw than final CPUs, and they are also distributed with an agreement that they remain the property of Intel and are not to be resold. They're kind of like buying a book without its cover: someone who shouldn't have sold it did so at some point in the past, and now you're getting a good price because you're buying stolen goods.

With how cheap and widely available normal E5-2670s are on eBay, I'd stay away from engineering samples unless you really want a weird chip.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
You don't even need to mess with eBay if you don't want to for some reason. This website sells them straight up at comparable prices: http://www.natex.us/product-p/intel-e5-2670-sr0kx.htm

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
It was my understanding that those older E5-2670 chips needed OG LGA2011 boards (X79) to run. Used X79 motherboards seem to be selling for $300+, which erases the good pricing on those chips.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
Oh yeah, I forgot that the conversation was about X99, since the post I was looking at didn't use version numbers on the proc. For the v3 chips it looks like there's around a $1000 difference between the normal chip and an ES, so it's a lot more understandable to want to use one.

redeyes
Sep 14, 2002

by Fluffdaddy

Potato Salad posted:

Windows Server 2016 is the only Hyper-V version with PCI passthrough, and it's still immature.

Windows VDI workstation resources are really meant to be used with RemoteFX, which is a graphics adapter for Hyper-V VMs that forwards guest calls to the host's own graphics adapter. This differs from GRID on Xen or VMware, where the host exposes fractions of the available hardware directly to each guest. VMware's host-mediated solution is poo poo because their guest drivers are poo poo, hence the drive for Nvidia to develop the directly-exposed hardware option.

Problem for you: RemoteFX is Enterprise-only with Software Assurance active, so that's volume-licensing territory. Win Server 2016 will be expensive as gently caress.

Go the VMware route if you want multiple local consoles for gaming. If you need remote gaming consoles, you'll need VMware + expensive Nvidia Quadro cards or a poo poo AMD card. If you want to use non-AMD-poo poo, non-gouging-priced Nvidia cards for remote VDI, you will need to use KVM, as it is capable (with a lot of work) of preventing a GTX card from discovering that it is on a virtualized PCI bus.


I'm not sure of all the variables involved here, but RemoteFX works on Windows 10 Pro Hyper-V. I have a few VMs with full RemoteFX acceleration. I don't know if it's worse than or the same as Server 2016, but it is there and it works.
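If anyone wants to try it, it's only a couple of Hyper-V cmdlets from an elevated PowerShell prompt. Rough sketch from memory; the GPU name and VM name are placeholders, and the VM has to be powered off when you add the adapter:

    # Sketch, not a tested recipe: list RemoteFX-capable GPUs, enable one
    # (use the exact name the first command prints), then give the
    # powered-off VM a RemoteFX 3D adapter.
    Get-VMRemoteFXPhysicalVideoAdapter
    Enable-VMRemoteFXPhysicalVideoAdapter -GPUName "NVIDIA GeForce GTX 970"
    Add-VMRemoteFx3dVideoAdapter -VMName "MyVM"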

EdEddnEddy
Apr 5, 2012



On Windows 10 Pro wouldn't it just be limited to local VMs, vs. Windows Server where you would be able to remote in and have accelerated graphics?

NihilismNow
Aug 31, 2003

PBCrunch posted:

Does anyone have experience with gaming in VMs? What is the feasibility of running two Windows VMs on this system, with each VM having access to its own GPU (probably something like an RX 480)? Kind of like the Linus Tech Tips 7-gamers system, knocked down to two gamers and without the unlimited budget.

I did this for a few months with ESXi 6.
It works, and it works well with most games, but not all of them. Most games were a near-native experience. Overwatch was 30 fps on low; then I installed Win10 directly on the same box and it got 70 fps on high. Same story with War Thunder. Other games (KSP, BioShock Infinite) had tons of crashes that were solved by installing Win10 bare-metal on the same hardware.
I couldn't really find a reason for the performance difference, so I recently decided I will get another dedicated desktop to go with my server.

This is all with a local console; as Potato Salad mentions, there are really no affordable 3D-capable remoting protocols available to end users. Short of installing Citrix or a PCoIP card it doesn't really work (other than Steam streaming, of course).

Maybe try QEMU; people seem to have better experiences with that.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
Thanks. I was really only interested in local VMs for gaming (with some Steam streaming mixed in).

mobby_6kl
Aug 9, 2009

by Fluffdaddy
I just installed Intel's Extreme Tuning thingie after seeing it referenced here a couple of times, but it won't let me change anything: all the parameters on the Tuning page are grayed out. Are there any specific requirements for it to work? Sandy Bridge is the newest CPU I have, but Intel's page on the tool doesn't mention any.

craig588
Nov 19, 2005

by Nyc_Tattoo
Motherboard makers have the option of locking down GPU overclocking even if it's supposed to be supported by the chipset and processor. If you have an OEM motherboard, I'd bet they don't allow GPU overclocking. I ran into a similar problem when I wanted to try overclocking an otherwise great laptop that was held back by its thermally limited GPU, so if anyone happens to know a workaround I'd love to hear it; from my research, mobo makers can deny you access to the GPU clock controls.

craig588 fucked around with this message at 00:31 on Jun 26, 2016

Zotix
Aug 14, 2011



Looking to do a new build between now and the end of the year. What are the chances that Kaby Lake processors drop this year? I have a 3570K, and while it's still a decent CPU, I want to do a new build from scratch. I don't really want to go with Skylake since it's now almost a year old, so I'm wondering how close we are at this point to Kaby Lake dropping.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Zotix posted:

Looking to do a new build between now and the end of the year. What are the chances that Kaby Lake processors drop this year? I have a 3570K, and while it's still a decent CPU, I want to do a new build from scratch. I don't really want to go with Skylake since it's now almost a year old, so I'm wondering how close we are at this point to Kaby Lake dropping.

http://www.extremetech.com/extreme/229725-report-claims-intel-amd-will-both-delay-next-gen-processors-to-early-2017

EdEddnEddy
Apr 5, 2012



Kaby Lake hopefully will deliver a few good chips, as the outlook for Intel through 2017 looks quite barren... though I do look forward to the end of that drought.

Verizian
Dec 18, 2004
The spiky one.
Didn't Intel say that after Skylake they would likely decrease single-threaded performance in favour of better multi-threading, lower temps, and lower power usage? Desktop PCs aren't where the big money is, and most gamers are perfectly fine with performance equivalent to an overclocked CPU from five generations back.

HMS Boromir
Jul 16, 2011

by Lowtax
Pretty much, though I don't think they've said anything about better multi-threading.

NihilismNow
Aug 31, 2003
If Kaby Lake is like Skylake, there will be supply and price issues for the first few months, and it is really unlikely to bring anything exciting to the table.
At least with Haswell-R/Broadwell vs Skylake you got the platform upgrade; are we really that excited about native USB 3.1? If I had bought a 4790K just before the 6700K was released I wouldn't really care, since performance is pretty much the same.

(Remind me of this post when Kaby Lake blows away everything that came before it and delivers 5GHz unlocked hexa-cores at $200.)

EdEddnEddy
Apr 5, 2012



It's a good bet that Kaby Lake will be more or less focused on the mobile sector, with a few desktop parts thrown in (and whatever Xeon-focused chips come with it as well). Then maybe we will also get a Skylake-E and hopefully a new platform to run it on, as that will be the high-end push around mid-2017, I feel.

It will be interesting to see what AMD does with the window they may have over the next year while Intel gets their crap back together after this reorg.

Guni
Mar 11, 2010
Hey goons, I have a slight dilemma of first-world proportions and am after some advice (since I can't find any benchmarks). I currently have an i5 6500 with an ASRock B150M and have just picked up a 1080 after being out of the PC game for about 6-12 months (previously having had an i5 4690K + 970, which I sold for ~reasons~). Basically, I want to know: is it worth upgrading to an i5 6600K or i7 6700K for gaming at 3440x1440? The games I'll be playing are GTA V, Battlefield 1 (when it comes out), and various other non-demanding titles. I guess it's worth noting that money isn't a massive issue, and I have a couple of other reasons why I want to get a K chip (mainly so I can stick an AIO in my case + I could give the i5 6500 to my GIRLFRIEND, so if there's a perceivable difference, it could be worth it).

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Guni posted:

Hey goons, I have a slight dilemma of first-world proportions and am after some advice (since I can't find any benchmarks). I currently have an i5 6500 with an ASRock B150M and have just picked up a 1080 after being out of the PC game for about 6-12 months (previously having had an i5 4690K + 970, which I sold for ~reasons~). Basically, I want to know: is it worth upgrading to an i5 6600K or i7 6700K for gaming at 3440x1440? The games I'll be playing are GTA V, Battlefield 1 (when it comes out), and various other non-demanding titles. I guess it's worth noting that money isn't a massive issue, and I have a couple of other reasons why I want to get a K chip (mainly so I can stick an AIO in my case + I could give the i5 6500 to my GIRLFRIEND, so if there's a perceivable difference, it could be worth it).

Do you already have a Z170? If not, you need to factor that into the marginal cost here (versus the cost of a cheaper non-OC-capable motherboard if you're giving the current one away anyway, I guess).

It's definitely worth it to get a 6600K or 6700K. If you are flush with money and don't care, then just go for the 6700K; it's worth it for a high-end gaming system and you'll appreciate it in terms of longevity. The 6600K isn't that much better than the 4690K, but the 6700K buys you an extra four threads.

If you have a MicroCenter store near you, they have the absolute best prices on Intel CPUs. They also have the big Haswell (HEDT) 6-core/12-thread CPU, the 5820K, for almost the same price as the 6700K, just to add some fuel to this money fire :unsmigghh: It's significantly better at tasks that take advantage of lots of threads, and only very slightly inferior to Skylake in single-threaded performance once you throw in a nice overclock (which will need a big AIO cooler).

Paul MaudDib fucked around with this message at 04:20 on Jun 28, 2016

Guni
Mar 11, 2010

Paul MaudDib posted:

Do you already have a Z170? If not, you need to factor that into the marginal cost here (versus the cost of a cheaper non-OC-capable motherboard if you're giving the current one away anyway, I guess).

It's definitely worth it to get a 6600K or 6700K. If you are flush with money and don't care, then just go for the 6700K; it's worth it for a high-end gaming system and you'll appreciate it in terms of longevity. The 6600K isn't that much better than the 4690K, but the 6700K buys you an extra four threads.

If you have a MicroCenter store near you, they have the absolute best prices on Intel CPUs. They also have the big Haswell (HEDT) 6-core/12-thread CPU, the 5820K, for almost the same price as the 6700K, just to add some fuel to this money fire :unsmigghh: It's significantly better at tasks that take advantage of lots of threads, and only very slightly inferior to Skylake in single-threaded performance once you throw in a nice overclock (which will need a big AIO cooler).

Sadly, I (a) don't have a Z170 and (b) live in Australia. I guess money isn't exactly an issue, but I still like to consider price:performance (1080 excepted, of course) where possible, and I don't want to pay 50% extra for 2% more performance. Looks like I might have to upgrade though, so thanks!

GRINDCORE MEGGIDO
Feb 28, 1985


For a gaming system though, isn't it usually the case that HT doesn't help average or minimum frame rates (and in some circumstances actually drops them)?

Maybe that's not the case in new DX12/Vulkan games?

If it isn't, I wonder if 5820Ks and the like will seem a ton more attractive for gaming, even over Skylake's strong single-core performance.

GRINDCORE MEGGIDO fucked around with this message at 06:08 on Jun 28, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

wipeout posted:

For a gaming system though, isn't it usually the case that HT doesn't help average or minimum frame rates (and in some circumstances actually drops them)?

Maybe that's not the case in new DX12/Vulkan games?

If it isn't, I wonder if 5820Ks and the like will seem a ton more attractive for gaming, even over Skylake's strong single-core performance.

Many games don't make use of more than 4 threads. Most games don't make use of more than 8 threads right now. Real cores are always better than hyperthreaded cores clock-for-clock, but it's often not possible to overclock them as far.

DX12 has a lot of multithreading capabilities. Scaling to more threads/cores is at least one of the promises of DX12.

GRINDCORE MEGGIDO
Feb 28, 1985


Understood; I'd expect games to take advantage of more cores going forward. But is it conclusive that they will take advantage of hyperthreading?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

wipeout posted:

Understood; I'd expect games to take advantage of more cores going forward. But is it conclusive that they will take advantage of hyperthreading?

Those are effectively the same thing: software just sees the number of threads; it does not care about the actual number of physical cores. Though I expect that in a best-case scenario, 4C/8T via hyperthreading will only gain you around 20%-30% of the performance you would get from having eight actual cores.
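You can see this from the OS side too. On Linux, for instance (a trivial check, nothing more):

    # On a 4C/8T chip like a 6700K this reports 8 logical CPUs,
    # 2 threads per core -- the scheduler just sees 8 slots to fill.
    lscpu | grep -E 'CPU\(s\)|Thread|Core'

A game that spawns one worker per reported CPU will happily spawn eight on a 6700K, whether or not those are all real cores.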

Guni
Mar 11, 2010
So goons, is it basically worth it to get an i7 with the emergence of DX12 games utilising HT? A 5820K is actually not that much more expensive than an i7 6700K (i.e. ~$100 more), and the mobos will be priced about the same. What say ye goons, yea or nay?

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Guni posted:

So goons, is it basically worth it to get an i7 with the emergence of DX12 games utilising HT? A 5820K is actually not that much more expensive than an i7 6700K (i.e. ~$100 more), and the mobos will be priced about the same. What say ye goons, yea or nay?

Both of those are i7 processors, but the 5820K has six physical cores and the 6700K has four. If DX12 games use more threads, a 5820K will be 'better' at some point due to having two more cores. But the cost of the X99 platform is higher than Z170's, so that is something to consider too.
