Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

japtor posted:

I think their reading comprehension is fine, they're probably just taking issue with one particular part of how you opened the topic:

That part is fine (...although I vaguely recall reading that it's the other way around with games, i.e. there's a handful of games that push desktop quads, not "a lot", but whatever, we get the point about lower power CPUs being used nowadays)

That part is wrong and/or worded really poorly though; there's a reason it seems like the root of all the replies is that particular statement, while you've had to clarify (multiple times!) that you agree with them.

But anyway, Razer Core: yeah, there's probably gonna be inflated/unrealistic expectations with it, but at the same time I wouldn't really expect it to hit the same user base as most light notebook users to begin with. Price for one thing obviously, but also just the idea of having a desktop setup at all. If someone's serious enough to get a GPU dock I'm thinking they might get a decent enough laptop in the first place, whether because they want particular specs and/or they just throw a bunch of money at the highest-spec stuff.

Yeah, what I meant there was "a given laptop probably has a slower/more efficient processor than it did a couple years ago" rather than that processor-for-processor things have gotten slower. Or at least, they haven't gotten faster. Can't really source that without access to some Intel or OEM sales data though. I know all of our office laptops are U-series processors; a couple years ago they were probably M-series processors. The ultrabook design methodology has kinda infected everything because thinness/light weight/battery life sells and you can get away with cheaper processors and cheaper cooling systems. It's probably the right choice for your average momputer or office worker, but it's not going to be worth plugging a Razer Core into, because those design choices go in direct opposition to sustained performance. You want a fast processor and heavy-duty cooling, otherwise the CPU is going to massively bottleneck a desktop GPU.

If you want something portable, get a mITX. If you want something you can carry around the house and game on, set up an NVIDIA Shield and stream from your desktop. And yeah, there's totally desktop-replacement and workstation systems that haven't succumbed to the ultrabook trend. But you can't magically make an ultrabook into a gaming machine by plugging in a Razer Core, it's just going to throttle once it hits its thermal limit.

How much processor you need heavily depends on your definition of "gaming". If you are just doing DOTA or whatever then a laptop CPU with an iGPU will be fine. Once you start stepping into midrange and higher 3D games then you really need at least a quadcore. Some games will even refuse to start up on less than 4 cores nowadays - most notoriously, GTA:V. The G3258 used to be a popular recommendation but it fell by the wayside for this reason. Many (but not all) games will make use of 8 threads if they are available. Using more than 8 threads isn't too common right now, but may become more common as DX12/Vulkan increase the level of multithreading that is possible.
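
As a rough illustration of the kind of launch gate being described (and nothing more), here's a minimal Python sketch that checks thread and core counts; the 4-thread floor is a hypothetical stand-in rather than any specific game's actual check, and psutil is an optional third-party package:

code:

# Toy version of a "needs at least N hardware threads" launch check.
# The threshold is illustrative only, not GTA V's (or anyone's) real logic.
import os

try:
    import psutil  # optional; gives physical core count if installed
    physical = psutil.cpu_count(logical=False)
except ImportError:
    physical = None

logical = os.cpu_count()
MIN_LOGICAL = 4  # hypothetical floor

if logical is not None and logical < MIN_LOGICAL:
    raise SystemExit(f"Need at least {MIN_LOGICAL} hardware threads, found {logical}")

print(f"Logical CPUs: {logical}, physical cores: {physical if physical else 'unknown'}")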

So basically on desktop 2C/4T is the low-end recommendation nowadays, 4C/4T is guaranteed to run anything decently, and 4C/8T is the practical high end. For laptops, I would have trouble saying that anything less than an M-class 2C/4T or 4C/8T would perform well enough to justify an onboard dGPU, let alone a Razer Core.

Paul MaudDib fucked around with this message at 22:10 on May 18, 2016


AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

SuperDucky posted:

Essentially a BIOS issue with two faces. If we use the Broadwell BIOS and put a Haswell in, CPU1 doesn't show up if it's a "-L." Sorta similarly, non-engineering-sample 2680v4's (our primary go-to) don't show up in the BIOS at all, whereas all other v4s work fine. Mind you, the engineering sample 2680v4's work fine in the same serialized board. :frogsiren:

Wow, that is extremely strange, I would expect engineering samples to be nigh-identical to production models. No wonder Intel is scratching their heads. The only thing I can think of is that engineering samples and production models use slightly different IDs and that someone hosed up the BIOS for that one model so that it is expecting an engineering sample ID from the production CPU.

Gorau
Apr 28, 2008
I just have one question though: will a 6700HQ be able to run games well for the next couple of years? I have a gaming laptop and a desktop, but both are starting to show their age and I want to consolidate if I'm not giving up too much performance.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gorau posted:

I just have one question though: will a 6700HQ be able to run games well for the next couple of years? I have a gaming laptop and a desktop, but both are starting to show their age and I want to consolidate if I'm not giving up too much performance.

Will you be using a discrete GPU or the integrated GPU? That's about as good as it gets on CPU power but the HD 530 is only about 40% as fast as a 750 Ti.

Paul MaudDib fucked around with this message at 22:36 on May 18, 2016

Gorau
Apr 28, 2008

Paul MaudDib posted:

Will you be using a discrete GPU or the integrated GPU? That's about as good as it gets on CPU power but the HD 530 is only about 40% as fast as a 750 Ti.

Discrete. I was figuring on the 14" Razer Blade and the Core. I'm essentially the ideal candidate for this system: I travel for two weeks a month but still want to play games on a big rear end monitor when I'm home.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gorau posted:

Discrete. I was figuring on the 14" Razer Blade and the Core. I'm essentially the ideal candidate for this system: I travel for two weeks a month but still want to play games on a big rear end monitor when I'm home.

BTW, definitely check out GSync/Freesync too. It hugely reduces the need for stable framerates - dropping into the 40s or 50s is not a big deal with Adaptive Sync. Laptops are never going to be able to push as many frames as a gaming desktop but with Adaptive Sync you don't need to. Much, much smoother gameplay across the board. Laptop gaming is a major, major use-case for it.

There are also mobile versions of both, which work with your laptop's internal panel. I highly recommend it if you can get it.

For anyone who's super set on iGPUs, it's probably worth holding off for a little while because Intel is rumored to be introducing FreeSync (VESA Adaptive Sync) support in Kabylake. If not Kabylake, Cannonlake for sure.

Paul MaudDib fucked around with this message at 22:42 on May 18, 2016

Gwaihir
Dec 8, 2009
Hair Elf

BobHoward posted:

Passmark is a dumb benchmark, but what the hell; hopefully with a few more numbers for roughly the best 2-core laptop CPU from each generation you will see the light.

Model   TDP   Passmark
6567U   28W   5479
5557U   28W   4935
4610M   37W   5110
3540M   35W   4626
2640M   35W   3918

From Haswell to Broadwell they lost a tiny bit of CPU performance, but they also dropped 9 watts of power and upgraded the iGPU to Iris, which was a giant leap in graphics performance. So what you're seeing there is efficiency and overall performance gains in every generation, some of them pretty big. And if you compare Skylake to Sandy, hey guess what - you can't quite find support for some of the hyperbole there, but the Sandy gets smoked. You'd have to go to a desktop power bin dual core Sandy to beat the mobile Skylake.

E: it's the i3-2130 and it scores 4062 with a 65W TDP, so whoops, the best Sandy desktop dual doesn't quite compete.
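
For what it's worth, here's the same data reduced to Passmark points per TDP watt, using only the figures quoted above and in the edit (a crude efficiency proxy, nothing more):

code:

# Passmark score per TDP watt for the chips quoted above.
# Scores and TDPs are the ones given in the post, not fresh measurements.
chips = {
    "i7-6567U (Skylake)":      (5479, 28),
    "i7-5557U (Broadwell)":    (4935, 28),
    "i7-4610M (Haswell)":      (5110, 37),
    "i7-3540M (Ivy Bridge)":   (4626, 35),
    "i7-2640M (Sandy Bridge)": (3918, 35),
    "i3-2130 (Sandy desktop)": (4062, 65),
}

for name, (score, tdp) in chips.items():
    print(f"{name:26s} {score:5d} pts / {tdp:2d} W = {score / tdp:6.1f} pts/W")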

Too bad you can't get the 28w TDP versions in anything without an apple on the lid grumble grumble

Gorau posted:

Discrete. I was figuring on the 14" Razer Blade and the Core. I'm essentially the ideal candidate for this system: I travel for two weeks a month but still want to play games on a big rear end monitor when I'm home.

Razer Blades are loving killer for the form factor and the hardware contained in them. So long as you have power outlets available while traveling, it'll serve you well for that use case, I think.

Gwaihir fucked around with this message at 22:54 on May 18, 2016

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Paul MaudDib posted:

For anyone who's super set on iGPUs, it's probably worth holding off for a little while because Intel is rumored to be introducing FreeSync (VESA Adaptive Sync) support in Kabylake. If not Kabylake, Cannonlake for sure.

AMD APUs already support it for anyone interested.

Anime Schoolgirl
Nov 28, 2002

SuperDucky posted:

Essentially a BIOS issue with two faces. If we use the Broadwell BIOS and put a Haswell in, CPU1 doesn't show up if it's a "-L." Sorta similarly, non-engineering-sample 2680v4's (our primary go-to) don't show up in the BIOS at all, whereas all other v4s work fine. Mind you, the engineering sample 2680v4's work fine in the same serialized board. :frogsiren:
Seems like they pushed a testing BIOS to production, because there can't possibly be any other reason why Haswell isn't recognized.

SuperDucky
May 13, 2007

by exmarx

Anime Schoolgirl posted:

Seems like they pushed a testing BIOS to production, because there can't possibly be any other reason why Haswell isn't recognized.

Right, but why would there only be issues with the 2680? All non-L Haswells and non-2680 Broadwells work fine.

Anime Schoolgirl
Nov 28, 2002

I somehow :downs: and read that as all Haswells, in which case that's a real mystifying fuckup. I assume, since Intel is confused, there isn't a BIOS update?

EdEddnEddy
Apr 5, 2012



I was just sorta pissed internally about the i3/5/7 being used with the U and yet everything being a dual-core part. I wish Intel had just stuck with letting the i7 remain a true quad core and left the i3/i5 as the U/DC parts. The naming scheme has really irked me since the Sandy Bridge era and they keep messing with it. (Now the 3960X/4960X/5960X being replaced by a 6950X? WTF!? And the 3930K/4930K/5930K by the 6900K?!) The -E series was the last bit of naming that made sense, going back to Nehalem. :(

GRINDCORE MEGGIDO
Feb 28, 1985


It seems pretty intentionally designed to mislead people tbh. Such is life.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Cardboard Box A posted:

Yeah, the whole graphics extender box concept makes sense for laptop users but no sense coupled with a NUC.

I like the idea of being able to unplug something and put it away when it's not needed.

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

KingEup posted:

I like the idea of being able to unplug something and put it away when it's not needed.

Kind of a pain to get out a box and adapter and plug the monitor and/or other peripherals into the new box every time you want to game.

I think there's a reasonable market to pair the two on size alone. I'm hoping that when Nvidia releases HBM-based cards they'll also be small; then someone can release a half-sized external dock. Adorable computer setups, ahoy!

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

mediaphage posted:

Kind of a pain to get out a box and adapter and plug the monitor and/or other peripherals into the new box every time you want to game.

You don't have to plug a monitor or peripherals in.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




KingEup posted:

You don't have to plug a monitor or peripherals in.

I don't know how many people are going to be stowing away their Core on a regular basis while the monitor still sits on the desk...

That said, desktop gamers still use wired devices: wired mice for the most part, even more commonly wired keyboards, and most still prefer Ethernet connections too. Then you have your headset and/or speakers. The NUC/Core combination really blows my mind, as I find it hard to imagine a usage model that a laptop plus Core doesn't do better.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

EdEddnEddy posted:

I was just sorta pissed internally about the i3/5/7 being used with the U and yet everything being a dual-core part. I wish Intel had just stuck with letting the i7 remain a true quad core and left the i3/i5 as the U/DC parts.

Man people hold the longest grudges about the oddest things. Far as I can tell from ARK, they've been using i7 for high end dual core mobile parts since Q1 2010, which happens to be the same time they started using i3/i5/i7 branding on mobile parts at all.

Far as I've ever been able to tell, i3/i5/i7 really mean good/better/best within a market segment, with other brands like Pentium used for products they want to position below "good".

HMS Boromir
Jul 16, 2011

by Lowtax
As someone who's never bought a laptop, I associate i3/i5/i7 with 2C/4T, 4C/4T, 4C/8T, and it's really confusing whenever I want to help a friend pick out a laptop because I have to figure out what they actually mean there. The one thing I think I've learned is that high-end quad cores are i7-####HQ. Is that consistent or are there exceptions?

NewFatMike
Jun 11, 2015

Re: Core Chat

The Core and future alternatives strike my fancy because I really do only want one device, and a docking station will perfectly serve my needs. I'm not a huge settings hound, but the Core having USB ports built in means that I could just use one plug and have my laptop on a cooling pad with an external SSD and a controller dongle along with the graphics card so I could plop down on the couch and play some games. I'm certain that I would even be able to get Steam Big Picture mode to launch on plugging my laptop in.
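
For the curious, something like that could plausibly be scripted on Windows; a minimal sketch, assuming the docked card shows up as a display adapter whose name contains "GeForce" (the match string and polling interval are made up for illustration, and this is not anything Razer ships):

code:

# Rough Windows-only sketch: wait for an external GPU to appear as a display
# adapter, then launch Steam Big Picture via Steam's URI handler.
import os
import subprocess
import time

EGPU_NAME_FRAGMENT = "GeForce"  # hypothetical: whatever the docked card reports as

def egpu_present() -> bool:
    out = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        text=True,
    )
    return EGPU_NAME_FRAGMENT.lower() in out.lower()

while not egpu_present():
    time.sleep(5)  # poll every few seconds until the dock is connected

# Steam registers the steam:// protocol; this opens Big Picture mode.
os.startfile("steam://open/bigpicture")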

I was really hoping Razer would launch a middle-of-the-road model: a 14" with an HQ processor and more battery where the GTX 970M would be, but I guess I'm not that lucky yet. Someone probably will.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

As someone who's never bought a laptop, I associate i3/i5/i7 with 2C/4T, 4C/4T, 4C/8T, and it's really confusing whenever I want to help a friend pick out a laptop because I have to figure out what they actually mean there. The one thing I think I've learned is that high-end quad cores are i7-####HQ. Is that consistent or are there exceptions?

My Sandy Bridge 4 core/8 thread mobile i7 is i7-2630QM. Not sure when they changed between QM and HQ.

EdEddnEddy
Apr 5, 2012



BobHoward posted:

Man people hold the longest grudges about the oddest things. Far as I can tell from ARK, they've been using i7 for high end dual core mobile parts since Q1 2010, which happens to be the same time they started using i3/i5/i7 branding on mobile parts at all.

Far as I've ever been able to tell, i3/i5/i7 really mean good/better/best within a market segment, with other brands like Pentium used for products they want to position below "good".

And back in 2010 the i7 mobile chips were still i7 QMs, which were quad mobile. Then they came out with the U-series i7, which shoved the i# chips into dual-core territory and really blurred the lines: unless you actually look the specs up, all you know is "Good/Better/Best," which gives you much less idea of what the hell that actually means.

In the conventional 2C/4T, 4C/4T, and 4C/8T style of the i3/i5/i7, it all made sense, and I still feel they should have instead added i#'s for more cores. Or hell, just made the i# match the core count of the drat chips instead: i2/i4/i6/i8?

But alas, I guess I would have to sit in on the meetings of whoever is responsible for these naming schemes to get an idea of what the logic is behind it besides brand confusion. If an IT enthusiast gets somewhat confused/baffled by the names, the average Joe has to be just going by bigger number + higher price and judging from there, which in this day and age can still burn you unless you know what you need it for and the sales guy isn't a complete toolbag.

All I know is that outside of mobile, I am sticking with -E CPUs for all my builds. And even in mobile I want a Q chip before I get a U chip in anything besides an XPS 13.




Now, in other news, a co-worker apparently has gotten his hands on a few test 10-core chips... I shall do everything in my power to get my hands on one of those. And another friend has Kaby Lake test rigs already, so I might see if I can take a gander at those too.

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

EdEddnEddy posted:

And back in 2010 the i7 mobile chips were still i7 QMs, which were quad mobile. Then they came out with the U-series i7, which shoved the i# chips into dual-core territory and really blurred the lines: unless you actually look the specs up, all you know is "Good/Better/Best," which gives you much less idea of what the hell that actually means.

In the conventional 2C/4T, 4C/4T, and 4C/8T style of the i3/i5/i7, it all made sense, and I still feel they should have instead added i#'s for more cores. Or hell, just made the i# match the core count of the drat chips instead: i2/i4/i6/i8?

But alas, I guess I would have to sit in on the meetings of whoever is responsible for these naming schemes to get an idea of what the logic is behind it besides brand confusion. If an IT enthusiast gets somewhat confused/baffled by the names, the average Joe has to be just going by bigger number + higher price and judging from there, which in this day and age can still burn you unless you know what you need it for and the sales guy isn't a complete toolbag.

All I know is that outside of mobile, I am sticking with -E CPUs for all my builds. And even in mobile I want a Q chip before I get a U chip in anything besides an XPS 13.

90% of laptops are glorified facebook machines that occasionally have to run office. That's what all manufacturers target.

Instant Sunrise
Apr 12, 2007


The manger babies don't have feelings. You said it yourself.

blowfish posted:

90% of laptops are glorified facebook machines that occasionally have to run office. That's what all manufacturers target.

Which means you're gonna need a decently powerful CPU to run broken resource-hogging Javascript.

EdEddnEddy
Apr 5, 2012



Instant Sunrise posted:

Which means you're gonna need a decently powerful CPU to run broken resource-hogging Javascript.

This.

And Chrome/Opera tab handling, while improved, still eats memory like a friggin pig. (Opera has gotten vastly better in its betas, however: ad blocking/VPN/battery mode/delayed tab loading :woop: )

12GB in my work laptop and a 3437U, and it can still chug along quite a bit at times.



You are paying for "performance vs size/design" in a lot of recent ultrabooks for sure, though: if you want a quad, you get a 15" laptop around the $1200+ mark, and if you want a 13" you're stuck with a dual-core but sexy small laptop for the same price. (Going off the XPS 15 vs 13 here.)

I like how Intel has the new Core M chips that make sense for what they are designed to do, and hell, with active cooling they are within spitting distance of the U-series chips at times, if not passing them. They made great improvements over the Broadwell versions, but until they are put into more actively cooled laptops they are stuck being zippy-in-short-bursts chips, and then you are back to the U series.

If Intel kept the i series true to their names/cores, made the U series a freaking series all its own (U3/U5/U7, U for Ultrabook perhaps?), and used the M series for exactly what it is, Mobile, then all would be well IMO.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

HMS Boromir posted:

As someone who's never bought a laptop, I associate i3/i5/i7 with 2C/4T, 4C/4T, 4C/8T, and it's really confusing whenever I want to help a friend pick out a laptop because I have to figure out what they actually mean there. The one thing I think I've learned is that high-end quad cores are i7-####HQ. Is that consistent or are there exceptions?

Pentium? Core i5? Core i7? Making sense of Intel’s convoluted CPU lineup

In a laptop the i*-xxxxU (or M, for older generations) parts are 15W (35W for M) dual-cores with HT and turbo and i5/i7 just means it goes faster. There are i5-6xxxHQ parts now, which are 35-45W quads like the i7s but just slower. Generally with laptops you want to look at the letters instead of the number first to get an idea of what kind of part it is, then look at the number to get an idea of where it's ranked among parts of that type. Weird non-round numbers like something in the ones digit or i*-xx10 usually mean either upgraded Iris graphics or some other kind of niche feature.
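
To turn that "letters first, then the number" advice into a toy sketch (the suffix meanings below are only the rough categories from this thread, not Intel's full matrix, and the example model numbers are just illustrative):

code:

# Toy decoder for mobile Core model strings: look at the suffix letters first,
# then treat the number as a ranking within that type.
import re

SUFFIXES = {
    "U":  "15W dual-core with HT (ultrabook class)",
    "M":  "35W dual-core with HT (older generations)",
    "QM": "quad-core mobile (pre-Haswell naming)",
    "HQ": "35-45W quad-core",
}

def decode(model: str) -> str:
    m = re.fullmatch(r"i([357])-(\d{4})([A-Z]+)", model)
    if not m:
        return f"{model}: not an ix-xxxx style mobile model number"
    tier, number, suffix = m.groups()
    kind = SUFFIXES.get(suffix, "suffix not covered here")
    return f"i{tier}-{number}{suffix}: {kind}; higher number = faster within that type"

for example in ("i7-6700HQ", "i5-6200U", "i7-2630QM", "i7-6567U"):
    print(decode(example))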

Iris branding is its own little puzzle since some models just have more EUs in the IGP, some have a higher TDP allowed as well (see: 13" MacBook Pro processors), and some (Iris Pro or Crystal Well) have an L4 cache that can be used for graphics as needed.

Eletriarnation fucked around with this message at 17:24 on May 20, 2016

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

Instant Sunrise posted:

Which means you're gonna need a decently powerful CPU to run broken resource-hogging Javascript.

The typical consumer has like 5 tabs open at most.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
yes, please don't confuse autistic browsing habits with actual hardware limitations

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

blowfish posted:

The typical consumer has like 5 tabs open at most.

And? That doesn't make horrible messes of javascript any easier to execute.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

EdEddnEddy posted:

And back in 2010 the i7 mobile chips were still i7 QMs, which were quad mobile. Then they came out with the U-series i7, which shoved the i# chips into dual-core territory and really blurred the lines: unless you actually look the specs up, all you know is "Good/Better/Best," which gives you much less idea of what the hell that actually means.

http://ark.intel.com/products/family/59143/Previous-Generation-Intel-Core-i7-Processor#@Mobile

Look at all those dual-core i7 mobile parts launched in Q1 to Q3 2010. Not all "U" series either; lots of "E", "M", "LE" and "LM", which appear to distinguish between 25W and 35W and some other feature I can't be hosed to figure out. It looks like quad i7 mobile was first, but only by months (Q3 09). There ended up being a lot more non-quad i7 laptop chips in 2010 than there were quads. I think you're mistakenly assuming that what got all the enthusiast press attention was all that was going on; in reality, 2010 was the year of the i7 mobile dual.

quote:

In the conventional 2C/4T, 4C/4T, and 4C/8T style of the i3/i5/i7, it all made sense, and I still feel they should have instead added i#'s for more cores. Or hell, just made the i# match the core count of the drat chips instead: i2/i4/i6/i8?

Pretty sure the loose inspiration is BMW M3 / M5 / M7.

The real problem with Intel's model branding is that they can't resist selling one die design as a billion barely-different SKUs with different combos of features fused on or off (multiplied by the number of speed bins of each variant they want to sell). Theoretically this helps them to optimize their price per chip in different markets, and there's even a fairly sound economic argument that doing so helps not only the manufacturer but customers, but in practice Intel takes it a bit far.

In that context, the way they use i3/i5/i7 is actually a good thing! When you have that many variables there's no way to sensibly encode everything into a part number that's also decently marketable. A signifier of good/better/best that's not tied to technical specs helps non-technical customers have at least a vague sense of what's better. You need to remember that the vast majority of people have no loving clue what the benefits of 4C/8T over 4C/4T might be. It's also adaptive: it lets Intel change up what differentiates "better" and "best" in future product generations.

EdEddnEddy
Apr 5, 2012



Considering I can watch Facebook make the system run slower and slower left open in a single tab as it eats memory for whatever reason...

Joe/Jane Average Consumer will be buying the appropriate quad Atom/Pentium/i3/i5 system for $200-500 for 5-tab browsing. The less average user shooting for an i7-powered system, who you expect to pay $1000-2400 for it, is probably looking to do a lot more than just have 5 tabs open. In the Surface Pro world they are probably doing some Office and Adobe stuff too, which is where the 8+GB of RAM comes in as a requirement, and having a quad core could be a huge saving grace if they had that option.

The size/performance range is getting better, with a lot of thin 15"ers coming with quads and dedicated 960Ms (hopefully 1070Ms next?), but you still have a dual-core i7 that is only a bit faster than an i5 or even an i3 sometimes (usually with a lot more power usage instead, unless you get the Iris Pros on the higher-end i7s).

All I am saying is that I wish they made the names make a little more sense based on what they are putting them into.

i3/i5/i7 (Remain Dual/Quad/Quad+HT)

U3/U5/U7 (Ultrabook chips currently. All 2C + HT but different IGP+Cache/speeds/Turbo)

M3/M5/M7 (Fanless Mobiles currently, but potentially can replace the U series from the looks of it.)

Z3/Z5/Z7 (Atom Mobile chips, but similar to the U series, not a huge difference in core speed between versions, more on the Cache/GPU side it appears)

movax
Aug 30, 2008

So Skull Canyon is basically perfect for HTPC usage with HDMI 2.0 at this point, right?

Need to check into 10-bit HEVC, but I assume that an i7-6770HQ won't blink at decoding that even without HW assistance.
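
If anyone wants to sanity-check that before buying, here's a quick-and-dirty sketch that times a pure software decode with ffmpeg (assumes ffmpeg is on your PATH; the clip name is a made-up placeholder). If the wall-clock time comes in well under the clip's duration, the CPU keeps up without hardware assistance:

code:

# Decode a 10-bit HEVC clip in software and throw the frames away, timing it.
import subprocess
import time

clip = "sample_hevc_10bit.mkv"  # hypothetical local test file

start = time.time()
subprocess.run(
    ["ffmpeg", "-v", "error", "-i", clip, "-f", "null", "-"],
    check=True,
)
print(f"Decoded {clip} in {time.time() - start:.1f}s (compare to the clip's duration)")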

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




movax posted:

So Skull Canyon is basically perfect for HTPC usage with HDMI 2.0 at this point, right?

Need to check into 10-bit HEVC, but I assume that an i7-6770HQ won't blink at decoding that even without HW assistance.

You don't need that high end of a NUC for that.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

blowfish posted:

90% of laptops are glorified facebook machines that occasionally have to run office. That's what all manufacturers target.

blowfish posted:

The typical consumer has like 5 tabs open at most.


I don't know what proportion of laptops are used as corporate machines, but I see a ton of people around the office with hilarious amounts of stuff open at once, including non-tech-savvy people.

movax
Aug 30, 2008

VulgarandStupid posted:

You don't need that high end of a NUC for that.

True, but HDMI 2.0 is sort of future-ready at least (though one could argue a better NUC will be out by the time I get an HDMI 2.0 TV). I assume it far exceeds the requirements needed for Steam game streaming, which is something I'm interested in trying out.

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

movax posted:

True, but HDMI 2.0 is sort of future-ready at least (though one could argue a better NUC will be out by the time I get an HDMI 2.0 TV). I assume it far exceeds the requirements needed for Steam game streaming, which is something I'm interested in trying out.

Yes, but you'll want an Ethernet drop.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

mediaphage posted:

Yes, but you'll want an Ethernet drop.

It's nice but shouldn't be necessary if your wireless is good enough. I've done streaming with AC wireless to an AP in the same room and it works great.

mediaphage
Mar 22, 2007

Excuse me, pardon me, sheer perfection coming through

Eletriarnation posted:

It's nice but shouldn't be necessary if your wireless is good enough. I've done streaming with AC wireless to an AP in the same room and it works great.

Sure, though that is of course the absolute ideal scenario, you understand.

Ceros_X
Aug 6, 2006

U.S. Marine
Late to the small PC/NUC chat, but if you're willing to ditch standard-sized PSUs you can get pretty small, like laptop-messenger-bag small:

http://nfc-systems.com/shop/s4-mini-chassis


https://smallformfactor.net/forum/threads/nfc-systems-s4-mini.96/


Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

mediaphage posted:

Sure, though that is of course the absolute ideal scenario, you understand.

That's true. It also worked fairly well with a 2.4GHz N access point about 20 feet away through drywall, but if you're trying to go through external walls or to a different floor, in my experience you probably wouldn't want to count on it working. To be specific, this is with MacBook Pros and Ubiquiti APs.

Eletriarnation fucked around with this message at 01:20 on May 21, 2016
