Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
I believe Buffy was 16:9 in Europe and that's where those come from.

WhyteRyce
Dec 30, 2001

Buffy wasn't framed for 16:9, so in your quest for better quality you're watching a bunch of boom operators and mics and people that shouldn't be there.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Saukkis posted:

Earlier this year I decided to finally finish watching Buffy. I watched a few eps on Netflix and then I decided to compare the picture quality to the DVD collection I also owned. And the DVDs had superior quality and a 16:9 widescreen format compared to the dim and dark 4:3 format on Netflix, probably close in quality to the original TV broadcasts. Supposedly Joss Whedon considers that the correct and true way to watch Buffy, but I'll have none of that.

That's a tough one, because the widescreen Buffy and Angel releases were never intended to be seen in 16:9. There's studio crap visible at the sides in some shots, and in other shots the editors have chosen to do stuff like cut off actors' heads to make the 16:9 frame work.

http://www.avclub.com/article/fox-making-buffy-widescreen-and-joss-whedon-isnt-h-213031

Edit: Well I got beaten repeatedly.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Twerk from Home posted:

That's a tough one, because the widescreen Buffy and Angel releases were never intended to be seen in 16:9. There's studio crap visible at the sides in some shots, and in other shots the editors have chosen to do stuff like cut off actors' heads to make the 16:9 frame work.

http://www.avclub.com/article/fox-making-buffy-widescreen-and-joss-whedon-isnt-h-213031

Edit: Well I got beaten repeatedly.

Wow, there are a hell of a lot of shots that are simply butchered. Pretty lovely.

Twinty Zuleps
May 10, 2008

by R. Guyovich
Lipstick Apathy
The only purchase of new hardware I can justify is some kind of Xeon-ready motherboard (with associated memory, SSD, graphics hardware, and OS) that I can put an 8-core and 16 GB in now, while holding onto the dream of a 20-core and 128 GB two years from now. What should I be keeping an eye on as far as workstation processors, workstation sockets and such? I don't know if the near future is in 1151 or 2011-v3, or if I should expect an entirely new socket to come around in 6 months that will "obsolete" anything available right now.


To put in a really specific question: In Photoshop, using brushes with what Adobe presents as bristle simulators, if I go too high on the pixel density I get really bad, 1+ second input lag between the stroke on the tablet and the stroke on the screen. Is that a CPU bottleneck? GPU? I was in here asking stupid questions about the hardware bottlenecks for ray-tracing renderers a few months ago and now I guess I'm on to the bottlenecks for 2D paintbrush simulators.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
There is a new socket coming for Skylake-E, LGA-3647. It's loving enormous, but it might be 9 months out.

It will have a hexa-channel memory interface. Start saving your pennies.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Wulfolme posted:

The only purchase of new hardware I can justify is some kind of Xeon-ready motherboard (with associated memory, SSD, graphics hardware, and OS) that I can put an 8-core and 16 GB in now, while holding onto the dream of a 20-core and 128 GB two years from now. What should I be keeping an eye on as far as workstation processors, workstation sockets and such? I don't know if the near future is in 1151 or 2011-v3, or if I should expect an entirely new socket to come around in 6 months that will "obsolete" anything available right now.


To put in a really specific question: In Photoshop, using brushes with what Adobe presents as bristle simulators, if I go too high on the pixel density I get really bad, 1+ second input lag between the stroke on the tablet and the stroke on the screen. Is that a CPU bottleneck? GPU? I was in here asking stupid questions about the hardware bottlenecks for ray-tracing renderers a few months ago and now I guess I'm on to the bottlenecks for 2D paintbrush simulators.

You should really ask the plugin maker or Adobe. Lots of stuff in Photoshop is not well multithreaded, but that can all go out the window depending on plugins and such; it'd really suck to spend a bunch and get no returns out of it.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

BIG HEADLINE posted:

There is a new socket coming for Skylake-E, LGA-3647. It's loving enormous, but it might be 9 months out.

It will have a hexa-channel memory interface. Start saving your pennies.

We have one of the Skylake/Purley Intel SDKs at work and got sent the upgrade to beta CPUs, and when I swapped them out I was like what the fuuuuuck because the chips were so huge under the heatsinks. Also, the heatsinks snap right onto the CPU and you have to carefully pry them off, which is ridiculous.

I thought I had snapped a pic of them but apparently not. They're almost the size of a playing card. ServeTheHome has a good writeup with pics: https://www.servethehome.com/big-sockets-look-intel-lga-3647/

Eyes Only
May 20, 2008

Do not attempt to adjust your set.

AVeryLargeRadish posted:

You should really ask the plugin maker or Adobe. Lots of stuff in Photoshop is not well multithreaded, but that can all go out the window depending on plugins and such; it'd really suck to spend a bunch and get no returns out of it.

Getting ahold of someone who is equipped to answer this would be really time-consuming.

Just open up a process monitor and GPU-Z, crank the pixel density up to 11, and see how many cores it uses / whether the GPU is used.
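
If you want something more solid than eyeballing Task Manager, here's a rough sketch (assuming Python with the psutil package installed; the sample count is arbitrary) that logs per-core load while you paint. One core pinned near 100% while the rest sit idle points at a single-threaded CPU bottleneck; all cores low while the lag persists points at the GPU side, which is what GPU-Z is for.

code:
# Rough per-core CPU logger: start it, then paint for ~10 seconds.
# Requires: pip install psutil
import psutil

SAMPLES = 10  # one reading per second

for i in range(SAMPLES):
    # cpu_percent(interval=1) blocks for one second and returns
    # a utilization percentage for each logical core.
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(f"t={i+1:2d}s  " + "  ".join(f"{p:5.1f}%" for p in per_core))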

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

priznat posted:

We have one of the Skylake/Purley Intel SDKs at work and got sent the upgrade to beta CPUs, and when I swapped them out I was like what the fuuuuuck because the chips were so huge under the heatsinks. Also, the heatsinks snap right onto the CPU and you have to carefully pry them off, which is ridiculous.

I thought I had snapped a pic of them but apparently not. They're almost the size of a playing card. ServeTheHome has a good writeup with pics: https://www.servethehome.com/big-sockets-look-intel-lga-3647/

I do kind of *not* look forward to a $150 HSF or $250 CLC for them. :smith:

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

BIG HEADLINE posted:

I do kind of *not* look forward to a $150 HSF or $250 CLC for them. :smith:

Yeah, no kidding. The plastic clips on the stock heatsinks are very strong, but I was really worried I was going to break them, especially when removing the CPU. They're definitely not very user friendly. Perhaps a third party will come up with a clever retention clip.

Also, the screws on the heatsinks have gigantic Torx-style heads, which was also annoying; I didn't have those lying around and had to hunt one down.

As they are, it wouldn't be a very good CPU for the "prosumer" crowd; they'd get a lot of RMA'd CPUs and boards from tragic mishaps.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

priznat posted:

Yeah, no kidding. The plastic clips on the stock heatsinks are very strong, but I was really worried I was going to break them, especially when removing the CPU. They're definitely not very user friendly. Perhaps a third party will come up with a clever retention clip.

Also, the screws on the heatsinks have gigantic Torx-style heads, which was also annoying; I didn't have those lying around and had to hunt one down.

As they are, it wouldn't be a very good CPU for the "prosumer" crowd; they'd get a lot of RMA'd CPUs and boards from tragic mishaps.

I have to admit that I'm a fan of losing the latches - the most harrowing moment of my last build was my first time putting pressure down on the LGA retention clip, wondering to myself, "I really shouldn't have to be putting this much force onto something so loving fragile, should I... seriously, am I bending the loving pins?!?!?"

SCheeseman
Apr 23, 2003

I don't think anything is as bad as attaching HSFs to Athlons back in the day. The number of times I slammed a flathead screwdriver into a motherboard was too many.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SwissCM posted:

I don't think anything is as bad as attaching HSFs to Athlons back in the day. The number of times I slammed a flathead screwdriver into a motherboard was too many.

Not that I did it this way back then (flathead screwdriver like everyone else), but there is an easy way: use a small hex-head driver, the same one you'd use to tighten motherboard stand-offs. No slippage, and no sharp point to wreck the board if you do manage to slip. Would have saved a lot of boards. Oh well.

Kerbtree
Sep 8, 2008

BAD FALCON!
LAZY!
Non-serrated hemostats might be what you need.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Wulfolme posted:

The only purchase of new hardware I can justify is some kind of Xeon-ready motherboard, with associated memory, SSD, graphics hardware, and OS that I can put an 8 core and 16 GB in now while holding onto the dream of a 20-core and 128 GB 2 years from now. What should I be keeping an eye on as far as workstation processors, workstation sockets and such? I don't know if the near future is in the 1151 or the 2011-v3, or if I should expect an entirely new socket to come around in 6 months that will "obsolete" anything available right now.


To put in a really specific question: In Photoshop, using brushes with what Adobe presents as bristle simulators, if I go too high on the pixel density I get really bad, 1+ second input lag between the stroke on the tablet and the stroke on the screen. Is that a CPU bottleneck? GPU? I was in here asking stupid questions about the hardware bottlenecks for ray-tracing renderers a few months ago and now I guess I'm on to the bottlenecks for 2D paintbrush simulators.

Neither 1151 nor 2011-v3 has much of a future, which has been true of Intel sockets in general. They don't stay on any socket long enough for CPU-only upgrades to make sense. Do you want a Xeon to have a bunch of cores, or to have a bunch of memory? You can get to 64 GB of RAM now on cheap consumer platforms. As mentioned, if you really need both lots of cores and lots of memory, then Intel has a brand new Xeon ecosystem coming out in about 9 months on socket 3647.

Also, which 8-core CPUs are you looking at? If you spend all this money and then go with an E5-2620v4 for ~$420, you're going to have 8 Broadwell cores at 2.1 GHz, which isn't appreciably more computing power than 4 Skylake cores at 4.0 GHz in a cheaper consumer i7-6700K. Fast Xeon 8-cores like the E5-2667v4 cost ~$2000.
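
For a back-of-the-envelope sense of that (a naive cores-times-clock tally; it ignores IPC, turbo, and how parallel the workload actually is, and Skylake's higher per-clock performance means this actually flatters the Xeon):

code:
# Naive aggregate throughput in "core-GHz": cores x base clock.
# Ignores IPC differences, turbo, AVX clocks, and Amdahl's law.
xeon_e5_2620v4 = 8 * 2.1   # 16.8 core-GHz, ~$420
i7_6700k = 4 * 4.0         # 16.0 core-GHz, consumer part

print(f"E5-2620v4: {xeon_e5_2620v4:.1f} core-GHz")
print(f"i7-6700K:  {i7_6700k:.1f} core-GHz")

Roughly a wash on paper, and the i7 wins outright on anything that doesn't scale past a few threads.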

Twinty Zuleps
May 10, 2008

by R. Guyovich
Lipstick Apathy

Twerk from Home posted:

Also, which 8-core CPUs are you looking at? If you spend all this money and then go with an E5-2620v4 for ~$420, you're going to have 8 Broadwell cores at 2.1 GHz, which isn't appreciably more computing power than 4 Skylake cores at 4.0 GHz in a cheaper consumer i7-6700K. Fast Xeon 8-cores like the E5-2667v4 cost ~$2000.

I was just throwing out an example with the number 8. I'll probably need an intermediate step between my old Sandy Bridge and anything you could call a workstation, especially if there's no chance of the processor socket at the center of everything lasting long enough to upgrade in phases. That 6700K will probably be the way I go. I really already knew that.


Thanks for bringing up the 3647, though. I had been trying to find news on the new socket after someone here mentioned seeing one, but I couldn't find the post and had no luck searching for the socket number.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Not sure how the 6800K fares in regards to overclocking, but a 5820K, if you can still get one, would very easily overclock to 4 GHz with just a negligible increase in power usage over stock (at least for my specimen). That would get you six cores at 4 GHz for an OK price.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
If anything, I'd imagine the 3647s are going to require a new method other than the classic "pea-sized" dollop of TIM. Might have to resurrect the spreading method. But I do worry that a "prosumer" Sky-E setup from scratch will top $3k conservatively, which is problematic for my next build this summer, since I see myself incorporating a high-refresh 1440p display and a 1070 (or waiting for Volta and making do with my 970).

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
What the hell is Skylake-E going to have that necessitates such a huge socket?

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

What sort of work can justify an -E processor? I mean, compared to just farming our work out to an AWS instance.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib
Has there been any talk about pairing CPUs with HBM/HMC?

PerrineClostermann posted:

What the hell is Skylake-E going to have that necessitates such a huge socket?

6-channel memory has got to be a big part of the reason.

Kazinsal
Dec 13, 2011



Boiled Water posted:

What sort of work can justify an -E processor? I mean, compared to just farming our work out to an AWS instance.

A lot of video work is a lot faster on processors with more than four cores and loads of fast memory. And you really don't want to be sending a terabyte of raw 4K footage to an AWS instance to do your editing and rendering on.
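
To put numbers on that (rough math; the uplink speeds are just illustrative assumptions):

code:
# Time to push 1 TB of raw footage up to the cloud at various uplinks.
TB_BITS = 1e12 * 8  # one decimal terabyte, in bits

for mbps in (20, 100, 1000):
    seconds = TB_BITS / (mbps * 1e6)
    print(f"{mbps:4d} Mbit/s -> {seconds / 3600:6.1f} hours "
          f"({seconds / 86400:.1f} days)")

Even on a gigabit pipe you're waiting a couple of hours before you can start working; on a typical office uplink it's days.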

Pryor on Fire
May 14, 2013

they don't know all alien abduction experiences can be explained by people thinking saving private ryan was a documentary

I think 90% of "video work" these days is just done by a phone or a YouTube server; I don't get the sense there are many jobs left where you actually do production. I mean, there are a few, but it's been like a decade since I can recall anyone complaining about a codec or a bitrate or a star wipe or whatever stupid poo poo you used to need a professional for.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Kazinsal posted:

A lot of video work is a lot faster on processors with more than four cores and loads of fast memory. And you really don't want to be sending a terabyte of raw 4K footage to an AWS instance to do your editing and rendering on.

Ahh. Even faster than with a video card?

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Pryor on Fire posted:

I think 90% of "video work" these days is just done by a phone or a YouTube server; I don't get the sense there are many jobs left where you actually do production. I mean, there are a few, but it's been like a decade since I can recall anyone complaining about a codec or a bitrate or a star wipe or whatever stupid poo poo you used to need a professional for.

In the business world, at least, you would be wrong. To be sure, the whole "video work" landscape is now about 95% simpler than it was years back when you were continuously fighting with codecs and incompatible encoders and yadda yadda. But there are still a ton of jobs out there which require you to do video editing here and there; a friend of mine does PR for a non-profit and spends good chunks of her days stitching together little press-release videos, 30-second ads to dump on Facebook, etc. No, it's not sitting there compressing a 4K feature-length movie or anything, but you'd be surprised how long exporting a 5-minute 720p Adobe Premiere video can take.

....when you do it on a 2010 iMac. :gonk:

SuperDucky
May 13, 2007

by exmarx

priznat posted:

We have one of the Skylake/Purley Intel SDKs at work and got sent the upgrade to Beta CPUs and when I swapped them out I was like what the fuuuuuck because the chips were so huge under the heatsinks. Also the heatsinks snap right on to the CPU and you have to carefully pry them off which is ridiculous.

I thought I had snapped a pic of them but apparently not. They're almost the size of a playing card. Serve the home has a good writeup with pics: https://www.servethehome.com/big-sockets-look-intel-lga-3647/

You're breaking NDA, friend, just FYI.

Twinty Zuleps
May 10, 2008

by R. Guyovich
Lipstick Apathy

Boiled Water posted:

Ahh. Even faster than with a video card?

It turns out that once you step outside real-time rendering, there's a whole lot of video data processing that has to be done by the CPU and is completely unaffected by GPU hardware acceleration.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

ConanTheLibrarian posted:

Has there been any talk about pairing CPUs with HBM/HMC?

KNL (Knights Landing) has 8/16 GB on-package.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

SuperDucky posted:

You're breaking NDA, friend, just FYI.

He expressed an opinion and pasted a link to a site he doesn't own to explain his viewpoint. You're both platinum members, so obviously the best way to advise him that he *might* be breaking an NDA is to quote his supposedly NDA-breaking material instead of, oh, I dunno, PMing him.

eames
May 9, 2009

Looks like somebody in Hong Kong managed to get his hands on a retail version of Kaby Lake (i5-7600K) for the desktop.

http://translate.google.com/translate?u=http%3A//www.facebookhk.com/&hl=en&langpair=auto%7Cen&tbb=1&ie=UTF-8

The chip seems to be stable at 5.1 GHz with 1.55 V on air cooling, but I have no idea how safe that voltage is/will be.
Will these be widely available before Christmas?

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Black Friday is a thing that stores can use to free up space for new inventory, so maybe???

Going to wait and see how KBL pans out over the next year and hope for it on 3647 as rumored

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Again, I have a strange feeling that the 'enthusiast' LGA3647 CPUs will be "Saudi Prince's Son" builds.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

eames posted:

Looks like somebody in Hong Kong managed to get his hands on a retail version of Kaby Lake (i5-7600K) for the desktop.

http://translate.google.com/translate?u=http%3A//www.facebookhk.com/&hl=en&langpair=auto%7Cen&tbb=1&ie=UTF-8

The chip seems to be stable at 5.1 GHz with 1.55 V on air cooling, but I have no idea how safe that voltage is/will be.
Will these be widely available before Christmas?

"But the temperature is very good, idle less than 30 degrees Celsius!"

Glad he's giving us temps for when the CPU is running at 900 MHz and doing nothing.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Don Lapre posted:

"But the temperature is very good, idle less than 30 degrees Celsius!"

*confirms SpeedStep and idle states still work and you don't need to disable them*

*just like all overclocking for the past 5+ years*

EdEddnEddy
Apr 5, 2012



eames posted:

Looks like somebody in Hong Kong managed to get his hands on a retail version of Kaby Lake (i5-7600K) for the Desktop.

http://translate.google.com/translate?u=http%3A//www.facebookhk.com/&hl=en&langpair=auto%7Cen&tbb=1&ie=UTF-8

The chip seems to be stable at 5.1 Ghz with 1.55V with air cooling but I have no idea how safe that voltage is/will be.
Will these be widely available before christmas?

You can put almost anything at ~5 GHz with 1.55 V, jeez. And "air cooling" at idle should be fine, but any actual load on that chip should throw it up into the 80C+ realm real quick unless it's one of those huge Noctuas or something.

1.5V+ would fry a chip rather quick back in the Sandy Bridge era; I couldn't imagine one on the latest nm processes lasting long at that voltage.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

EdEddnEddy posted:

You can put almost anything at ~5 GHz with 1.55 V, jeez. And "air cooling" at idle should be fine, but any actual load on that chip should throw it up into the 80C+ realm real quick unless it's one of those huge Noctuas or something.

1.5V+ would fry a chip rather quick back in the Sandy Bridge era; I couldn't imagine one on the latest nm processes lasting long at that voltage.

Even custom loops would be bouncing off 100°C at that voltage. There just isn't enough chip surface area. You would need sub-ambient cooling like LN2.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Is there any reason not to expect the pattern to continue, where the gap between the max turbo clock on a stock chip and the reasonable overclocks achievable keeps shrinking? If the i7-7700K is shipping with a 4.5 GHz turbo, then 7600K overclocks should be in the 4.6-4.7 GHz range. That's still better than most 6600Ks get, as far as I've heard!

EdEddnEddy
Apr 5, 2012



They have been shrinking each generation as the base clock and turbo have gone up. However, what is the max turbo rated for? One core's max, versus all cores like you get when you manually OC the chip? That's the actual performance difference that makes the stock "Max Turbo" pretty much pointless.

I guess that is the bonus we get with the Skylake-E chips. That massive thing should be big enough to mount one hell of a water block onto to vent that heat. Also, won't consumer chips possibly grow a little in the next gen that actually starts to deliver 6+ cores?

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

EdEddnEddy posted:

Also, won't consumer chips possibly grow a little in the next gen that actually starts to deliver 6+ cores?

The actual driver for chip size is pin count, not cores. If you look at a de-lidded chip, the die itself is actually a fairly small part of the overall package. As we get to smaller and smaller process nodes, the overall die size should continue to shrink.

Look at the 2011 chips: they have no problem fitting 24 cores on there, and the package is not much bigger than the 1151 chips.
