fishmech
Jul 16, 2006

by VideoGames
Salad Prong
You realize that people just using Intel's onboard graphics has been the norm for nearly a decade now, right? The Sandy Bridge stuff seems to be more about snagging people who do some gaming as well.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

CommieGIR posted:

...and this is why I'm going to stick with AMD, they haven't tried to pull poo poo like this yet

Yeah, AMD tells you you should pay full price for a complete replacement CPU from the same batch that was binned differently.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Gembolah posted:

Not sure if you're serious, but it's a pretty stupid idea overall. This kind of consumer surplus squeezing is fairly annoying, and deprives most people of value they shouldn't have to pay extra for.

Here is the current situation:

Joe Average Computer Owner walks into Wal-Mart and buys a crappy Gateway with a slow Intel CPU. If he wants it to go faster he has to go buy another CPU or take the time and effort to learn how to properly manipulate voltages and multipliers and poo poo.

Here is the situation with this card:

Joe Average Computer Owner walks into Wal-Mart and buys a crappy Gateway with a slow Intel CPU. If he wants it to go faster he buys this card, types in a code and his computer is faster.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Gembolah posted:

Except that Intel would be selling their crappy CPU for less money except for their near monopoly on parts for manufacturers. Now they get to squeeze consumers because of their obscene market power.

Basically, one of two things can happen.

1) Joe Average Computer Owner 1 who *doesn't* want the upgrade shouldn't have to subsidize Joe Average Computer Owner 2 who *does* want the better performance.

2) Joe Average computer Owner 2 ends up paying more than he should for his performance upgrade because Intel has to recoup the cost of selling the more expensive part to everyone, even those who didn't want it.

Except they already are selling the more expensive part to everyone, or did you really think binning only ever happened to defective chips? Did you ever wonder WHY it's often possible to overclock the really cheap CPUs so much more than the high-end ones?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Zhentar posted:

Binning happens to all chips (although for that matter, all chips are defective to some extent). What you're complaining about is selling chips at a lower rating than the bin they qualified for.

I'm not complaining about it. I think this is great for your average computer user.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
Apple has been pretty big on tons of things that get unceremoniously shitcanned when they can do something else...

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
AMD can't make decent mobile chips to save their lives, that's why Intel dominates.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Medikit posted:

The new fusion mobile chip is a monster. Read the most recent articles regarding Zacate or Brazos.

That's the trouble with AMD mobile chips, they're always monsters. Nice if you like desktop replacements.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Disgustipated posted:

Atoms aren't about price/performance, though. They're performance/watt. If a 1.6 GHz Atom really is as fast as a 2.2 GHz P4 that's pretty fantastic given how little power they use, especially compared to how ridiculous Netburst was. This is really a point in Atom's favor in my opinion. :shobon:

Yeah, that Pentium 4 took as much power as a bright incandescent bulb and could cook dinner; that Atom takes as much power as a couple of remote control IR bulbs and might be mildly uncomfortable if you pressed it against your skin.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

WhyteRyce posted:

That may be the point of what AMD did, but there are still people who talk about playing games in the comments of any netbook related review or article. It seems like every week I've had to talk my dad down from buying a near-$500 ION netbook, even after he bought an ipad.

It's probably my complaint with netbooks in general. I liked the concept of a cheap, near-disposable computer which handles most of my day-to-day poo poo (poor video playback is a sore spot), and then people wanted to go and beef them up to the price of a low-end laptop or near a ULV laptop.

Originally netbooks were something that ran Linux, were slow with a tiny low-res screen, and had basically no storage. They were only usable for browsing and word processing.

Now they are nearly full-fledged computers that can often outperform larger laptops at similar price points in most areas but raw CPU power, and they definitely have better battery life.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
IIRC any of Intel's CPUs with Turbo Boost can increase clock speed on multiple cores so long as at least one core is turned off. If there are only two physical cores, you'll only boost one core, but if there are four you could boost three or two cores instead of just one.
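
If it helps, here's a rough sketch of the idea - the per-core bin numbers are made up for illustration, not real Intel specs:

```python
# Rough sketch of per-core turbo limits: the fewer cores awake, the higher
# the allowed clock. Bin values here are hypothetical, not real Intel specs.
BASE_CLOCK_GHZ = 3.3
TURBO_BINS_BY_ACTIVE_CORES = {1: 4, 2: 3, 3: 1, 4: 1}  # hypothetical +N x 100 MHz

def max_clock(active_cores: int) -> float:
    """Max clock (GHz) when this many cores are active."""
    bins = TURBO_BINS_BY_ACTIVE_CORES.get(active_cores, 0)
    return BASE_CLOCK_GHZ + bins * 0.1

for n in range(1, 5):
    print(f"{n} active core(s): up to {max_clock(n):.1f} GHz")
```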

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
The last time anyone should have turned off the pagefile was back when virtual memory was half-broken in Mac System 7.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

zachol posted:

Edit: Although I don't know why I'm surprised. I'm probably one out of only a dozen people in the market for a Mini ITX with no onboard graphics.

As soon as they stop selling stuff older than the Core iX processors, though, it'll be practically impossible not to get some form of integrated graphics. Even the cheap Core i3s (and I am talking pre-Sandy Bridge) have integrated graphics.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Zhentar posted:

It's in the IGP output, not the decoding. It doesn't matter whether or not you're using software decoding, it's just plain incapable of sending the right signal to the display. But apparently they have a software patch that will let you do 23.97hz, which comes out to dropping a single frame about every 2 1/2 minutes, so I'd still call it a pretty loving :spergin: complaint. Plus the low-end HTPC targeted nvidia and ATI cards aren't exactly costly if it really bothers you that much.

Where do people even get 23.976 media?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

mayodreams posted:

They drop frames to get to the standard. Also, 23.98 is NOT 24. Dropping a frame may not seem like a big deal, but the drift adds up in a hurry. At 1 frame per 2.5 min you watch 10 minutes and you've got a 17% drift, which is noticeable by almost anyone.

Most Blu-Ray content is 1080p24.

But does anyone actually have monitors that play back at 23.97 or whatever? I mean, no matter what you do, you're going to be doing some flavor of pulldown on a 60 Hz monitor or a 59.94 Hz SDTV to display 23.97 fps or 24 fps content.
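
For what it's worth, Zhentar's "one dropped frame every couple minutes" figure roughly checks out - quick back-of-the-envelope, assuming the patched output really is a flat 23.97 Hz:

```python
# Back-of-the-envelope check on dropping frames when ~23.976 fps content
# plays on a 23.97 Hz output (assumed flat rates, no audio tricks).
content_fps = 24000 / 1001          # ~23.976 fps, standard film-on-video rate
display_hz = 23.97                  # the patched IGP output rate from the quote
drift_per_second = content_fps - display_hz      # extra frames piling up per second
seconds_per_dropped_frame = 1 / drift_per_second
print(f"one dropped frame every {seconds_per_dropped_frame:.0f} s "
      f"(~{seconds_per_dropped_frame / 60:.1f} minutes)")
```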

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Ryokurin posted:

Yes. My 3 year old Samsung A650 has a 24P mode, even though it's a undocumented feature that will only show up if the device tries it.

The thing is, 23.97 or 59.94 is the correct standard. when NTSC was started for the first couple of years it was set to 60hz, but then lowered to 59.94 due to issues of getting hardware that could lock on to that exact speed. it was a tolerance concession. Anyhow, people have just shortened it to 24fps or 60fps for clarity, so any monitor, NTSC, ATSC or computer monitor should take either with ease.

Can you show me where it says all the monitors out there marked at 60 Hz actually run at 59.94 Hz? Because you'd think they would at least note that in the back of the manual.

A decade-old CRT SDTV runs at 59.94 Hz, yes, but the only LCD device I've ever seen that ran at that was one of the first generation of cheap SDTV LCDs.
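
And just to spell out where those odd numbers come from - the NTSC-era rates are the round numbers scaled by 1000/1001:

```python
# NTSC-derived rates: the nominal rate times 1000/1001.
for nominal in (24, 30, 60):
    print(f"{nominal} nominal -> {nominal * 1000 / 1001:.3f} actual")
# 24 -> 23.976, 30 -> 29.970, 60 -> 59.940
```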

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Combat Pretzel posted:

Seeing how Intel CPUs are x86 on the outside only these days, while having an apparently stable microcode format, they should just call it quits. Add a separate lighter RISC instruction set, to which the operating system can switch to on a per thread/process basis, easing the migration, and at some point just remove the x86 decoder. Apparently it is that piece which eats a lot of power.

That was only true, like, 13 years ago.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SpelledBackwards posted:

Dumb question, perhaps because I don't keep up with where and when 4k is useful, but what content is in 4k60 today or in the near future? That just sounds like the height of decadence.

You'd want 4k60 if you get a 4k monitor, just so that things won't look weird (since the typical LCD monitor these days refreshes at 60 Hz). I remember when cheap LCDs were first coming out and some could only handle 30 Hz updates, and it was just really annoying to use a computer with them. To say nothing of the dodgy laptop screens where a 100-millisecond response time was considered good.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
It's like having a decent sports car that only functions in Gary, Indiana.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Tab8715 posted:

Thunderbolt has been around for 2-4 years, I don't see why we couldn't use that interface.

Because nothing ships with Thunderbolt besides a few random Sony laptops and Apple computers. The Sony laptops did use it for external GPU stuff, but IIRC they weren't that good.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

computer parts posted:

I imagine there'd be software issues so it couldn't just be plug & play.

Computers can already handle USB video cards being hotplugged just fine - many of those small USB monitors you see actually have an onboard video card to make the most efficient use of the USB 2.0 data limitations.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
My understanding is that originally Intel wanted to wait until the fiber-optic version of Thunderbolt was ready, and they slapped that ridiculous fee on there to discourage "early" copper cable versions like we have now.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

CommieGIR posted:

Because Intel didn't learn the lesson from Sony Betamax and Firewire.

No matter what anyone else tells you, the only true dealbreaker disadvantage Betamax had was the recording length issue. And that's really not applicable to Thunderbolt.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Subjunctive posted:

Sony could have taped a $10 bill to every Betamax cassette and it wouldn't have mattered, because it still wouldn't have held a feature film.

And by the time they fixed that, there were already so many people with VHS that they couldn't do anything about it. Like yeah, Sony's high licensing fees and very expensive players didn't help. But even if Betamax decks had been as cheap as VHS decks, the same money still got you a machine that recorded 2 hours in normal mode and 4 hours in extended mode versus a machine that did 1 hour in normal mode and 2 hours in extended mode.

Incidentally, the "Betamax has better picture quality" thing was only because Sony's very expensive Betamax players had top-notch parts to go with the high prices, while you could get significantly cheaper VHS decks that cut all kinds of corners. The on-tape storage for both was equivalent - Betamax had very slightly better luminance resolution while VHS had very slightly better color resolution.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

The Lord Bude posted:

They will stop providing all but the most essential security updates for PCs with skylake or newer CPUs that aren't running Windows 10, and they won't do any work to ensure older versions of Windows run correctly on newer CPUs.

Basically, in the future, if you want to buy a new CPU, you will be expected at a minimum to use whatever version of Windows was the newest when that CPU came out. So essentially no more building a new PC and putting Windows 7 on it for example.

Microsoft has never done work to ensure older versions of Windows run "correctly" on newer CPUs; if Windows couldn't run correctly on a CPU, the install would simply fail, and they don't provide modified installers for specific CPUs. Additionally, Windows 7 and 8 are already approaching the point where they transition from regular security updates to "essential" updates only.

Windows Vista ended mainstream support a while ago and loses all support in March 2017. Windows 7 ended "mainstream support" on January 13, 2015, and plain Windows 8 ended "mainstream support" sometime this month. Windows 8.1 doesn't end it until January 9, 2018, but Microsoft moves it to the fewer-updates phase pretty soon anyway.

So really, they're doing absolutely nothing different, just restating what they already did.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Mr Chips posted:

Does releasing a new kernel for Windows 7 / 2008R2 to better support AMD processors that came out after Win 7 RTM: https://support.microsoft.com/en-us/kb/2645594 not count?

There's also things like this: https://support.microsoft.com/en-us/kb/3064209

That was a really rare thing for them to do.

The latter is an Intel microcode update, which is not created by Microsoft and not subject to their policy. Microcode updates are also delivered through BIOS/EFI updates and through packages from OEM sites. It's literally patching the CPU's firmware directly.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
As a product segmentation sort of thing, even though "non-server" 32-bit versions of Windows have long been able to use PAE, they've not been allowed to use PAE to get beyond a 4 GB limit. If you use PAE on some of the matching 32-bit Server versions, you can usually use more RAM - for instance:
Windows XP 32 bit: 4 GB
Windows XP 64 bit: 128 GB
Windows Server 2003 Standard 32 bit: 4 GB
Windows Server 2003 Enterprise or Datacenter 32 bit: 64 GB
Windows Server 2003 SP1 Enterprise or Datacenter 64 bit: 1 TB (there is no 64 bit Server 2003 before SP1)

Let's look at the Windows Vista generation (because that's the last time Windows Server had a 32-bit version):
Windows Vista (except Starter) 32 bit: 4 GB
Windows Vista Home Basic/Home Premium 64 bit: 8/16 GB
Windows Vista Ultimate/Enterprise/Business 64 bit: 128 GB
Windows Server 2008 Standard 32 bit: 4 GB
Windows Server 2008 Datacenter or Enterprise 32 bit: 64 GB
Windows Server 2008 Standard 64 bit: 32 GB
Windows Server 2008 Datacenter or Enterprise 64 bit: 1 TB


Here's the Microsoft page that lists all the official limits: https://msdn.microsoft.com/en-us/li..._server_2003_r2
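
If you want the raw arithmetic behind the 4 GB and 64 GB figures (the per-SKU caps on top of that are pure licensing/policy, as listed above):

```python
# Physical address space arithmetic behind the common 32-bit limits.
plain_32bit = 2 ** 32   # 32-bit physical addresses
pae_36bit   = 2 ** 36   # PAE widens physical addresses to 36 bits
GiB = 2 ** 30
print(f"plain 32-bit: {plain_32bit // GiB} GiB addressable")   # 4 GiB
print(f"with PAE:     {pae_36bit // GiB} GiB addressable")     # 64 GiB
```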

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
I'm planning to get a new laptop sometime this year, because my current laptop is a 5 year old Sandy Bridge i7 and it's starting to show its age. And obviously I can't squeeze more years out of it by overclocking.

Would it be worth it to wait for another generation of chips to come out, for a replacement laptop that I also expect to keep for 5 years, or is the generation out now fine?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

blowfish posted:

Is it even possible to see a difference between say 60 Hz and 144 Hz?

Back when we all had CRTs, being able to run your CRT at an 85 Hz refresh (and your game at 85 fps to match) was very noticeable - and the same when you had a really high-end CRT that did 120 Hz and a beefy computer to keep up. At 144 Hz you might not see all the difference, but you'll definitely see at least as much of a difference as 85 or 120 vs 60.

The idea that 60 is as far as most people go in being able to really see a difference is a misinterpretation of the fact that 60 Hz/fps is a pretty good place to target for your money. Going above it often costs a lot more in processing power and hardware, so it's a pragmatic target to pick.

Same reason that movies have been at 24 FPS for so long - anything much higher meant needing a whole lot more physical film to shoot, edit, and project, and 24 FPS is good enough for the purpose. And now we've got an industry where everyone is trained around the limits of 24 FPS and the precise ways to light scenes that work well at 24 FPS, so there's gonna be a lot of retraining before they can handle higher framerates properly.
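
Putting numbers on it, here's the per-frame time budget at the rates being thrown around (just simple arithmetic):

```python
# Per-frame time budget at common refresh/frame rates.
for hz in (24, 30, 60, 85, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:5.1f} ms per frame")
# 60 Hz leaves 16.7 ms per frame; 144 Hz leaves only 6.9 ms.
```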

fishmech fucked around with this message at 16:36 on Feb 1, 2016

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

We've also got a lot of eyeballs trained to watch 24 FPS movies and holy hell does anything higher look wrong. It wouldn't be a big loss for me since I barely watch movies but if anyone manages to hook a new generation onto >24 FPS and it becomes the new standard I might have to stop entirely.

Honestly the most likely end result is movies moving to 30 fps and 60 fps, just like HDTV. And there's no reason they can't do it right at a higher framerate.

I mean, 90% of the reason the 48 FPS The Hobbit looked so bad was that the movie was poorly shot in general, ya know?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

I mean, my point of comparison is the HDTV my parents got about a year ago, which is the first place I've seen high framerate live action stuff. And boy do I not want to see it ever again.

Er, the only broadcast standards we're using for HDTV are 30 and 60 FPS, with much of it being 30. These are hardly high framerate, considering those have been standard rates for TV and monitors for ages.

Are you sure you're not thinking of a TV set that upconverts lower frame-rates to 120 FPS by literally generating new frames based on the average of the frames it's actually receiving? That looks pretty fake, but it's because the actual content is being stretched out with frames that don't actually exist in the source. That sort of stuff is always going to look wonky.
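
Here's a toy sketch of what that kind of motion smoothing does - real TVs use motion-vector estimation rather than plain averaging, but the "invent frames that were never in the source" part is the same idea:

```python
# Toy motion smoothing: fabricate an in-between frame by blending neighbors.
# Frames here are just lists of pixel brightness values (0-255).
def interpolate(frame_a, frame_b, weight=0.5):
    return [round(a * (1 - weight) + b * weight) for a, b in zip(frame_a, frame_b)]

real_frames = [[0, 0, 0, 0], [40, 40, 40, 40]]
smoothed = [real_frames[0], interpolate(*real_frames), real_frames[1]]
print(smoothed)   # the middle frame never existed in the source
```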

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

Hm. Might be that. I've never owned a TV so my only data points are my parents' old one from like 1992 and the new one which is some kind of Samsung 1080p deal. Would I notice a significant difference that feels weird between what would have gotten displayed on the old TV over all these years and an HDTV without the generated inbetween frames?

Well, a regular old SDTV was 30 frames per second, consisting of 60 interlaced fields per second. And your computer monitor since the '80s has rarely been under 60 frames per second (barring odd cases like old laptop LCDs that often only did 30 FPS to save battery/power).

Neither 30 nor 60 fps should look weird to you, and 120 fps also shouldn't look weird so long as there are 120 actual frames and it's not interpolated frames being used to beef up 24/30/60 per second to 120. But pretty much the only true 120 fps content out there is video games and some specialty recordings of, like, nature documentaries and sports, so it's not exactly common yet.

Toast Museum posted:

If I understand correctly, a lot of the "soap opera effect" comes from trying to display 24fps content at 60fps. 24 doesn't divide into 60 evenly, so the TV has to create interpolated frames to pad things out. It should be less of a problem with 120Hz displays because 120/24 = 5, so for 24fps content it can just show each frame for five refreshes without doing any interpolation.

You've got it kinda backwards. The soap opera effect is that for a long time high-budget TV shows were shot on film and ran through 3:2 pulldown to convert that back to 60-field-per-second/30-frame-per-second analog TV. But since soap operas were on really cheap budgets, they'd just be broadcast live at first, and later shot to cheap-rear end videotape, which means they never got carefully done lighting, editing, etc. And as part of making the best of a bad situation they'd stick to always over-lighting a scene, because people will put up with a too-bright scene on TV more than a too-dark one, etc.
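
If you want the mechanics of 3:2 pulldown, here's a rough sketch - frame labels are just illustrative:

```python
# 3:2 pulldown: alternate copying each film frame into 3 fields, then 2,
# so 4 film frames become 10 video fields (24 fps -> 60 fields/s).
def three_two_pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```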

Many people got so used to high-class shows only being 24 fps (reconverted to standard TV rates, but it's still noticeable) that as more shows in the HD era have moved to shooting on high quality cameras at full 30 or 60 fps, it "looks like a soap opera" to them. But really it's something you only notice because you got used to the old way.

Although soap operas still look kinda bad today, because even though they're now recorded the same way as everything else, they still have cut-price sets, lighting, and all the rest.

fishmech fucked around with this message at 18:45 on Feb 1, 2016

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Pryor on Fire posted:

I'm confused about the new CPU requirements for windows 10.

There are no new CPU requirements.

Pryor on Fire posted:

It's so bizarre to talk about the "final" hardware we'd ever be able to run

Because that's not a real thing?

td4guy posted:

Newer CPUs aren't just gonna stop working with Windows 7/8.1. The older OSs just simply won't be able to take advantage of any new features.

Of note is that older Windows OSes rarely did take advantage of new CPU features except through major service packs, or through CPU manufacturer software.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

NihilismNow posted:

The driver model between windows 8 and windows 10 isn't that different is it? It may not be supported but you can probably use windows 10 drivers on windows 8 like you could use Vista drivers on windows 7.
I wouldn't be worried about the drivers and hardware compatibility as much as security holes that just won't get patched.

If you are really stubborn there are options. You can compile your own browser from open source projects. You can use ancient add-on cards to get audio/networking.

Yes but there's also no reason to be using Windows 8 if you're the kind of person who hates Windows 10. 8 has all of the downsides and none of the upsides.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

canyoneer posted:

I remember the cool thing was to use Windows 2000, even past the point where XP was pretty good.

I remember people buying fancy 64 bit CPUs and 16 GB of memory, but still insisting on using them with (32 bit only of course) Windows 2000, and complaining it wouldn't support all their stuff.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

redstormpopcorn posted:

I've only held off on the Win10 upgrade because I haven't found a conclusive yes/no "it works" for drivers on my HD4870s. :corsair:

They work. Of course they won't support DirectX 12 stuff, since they're 8 year old cards now, but they'll do all the same stuff they did in 7 or 8.

My source is that they're what my brother was using in his Windows 10 computer, until he finally got a better system last month.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

feedmegin posted:

I certainly can't imagine more than 4GB being usual before Vista, because XP being 32-bit couldn't use more than that (there was briefly I think a 64 bit Windows XP but basically noone actually used it).

64 bit Windows XP was basically a hastily rebadged Windows Server 2003 SKU, which many programs couldn't run on either because they still relied on 32 bit only drivers or because they detected a server OS and thus demanded a different license.

There was also the original 64 bit XP, which even fewer people used - because it was Itanium only.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

necrobobsledder posted:

Last I saw that would push OEMs into Microsoft Tax territory since the lower RAM machines were qualified for bulk Windows licensing at stupidly-cheap rates rather than the 4GB tablet style models. Unless MS has relented I don't see how they could keep costs down to make this work for the market. Laptop / PC sales are just plain down for years and years now and even tablets are hardly moving and have hit market saturation point it appears.

Basically what happened is that laptops and tablets started being durable enough and fast enough to last a good long while.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Tab8715 posted:

I'm not following, weren't netbooks an abysmal failure?

I don't know; they certainly didn't last, but they were very popular for a couple of years running and probably made tens of billions. More than anything else, regular laptops dropped close enough in price to obsolete them.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

VulgarandStupid posted:

It's probably not a big deal in the States, but I'd have to imagine ASRock wouldn't have made the feature if there was no demand for it. Then I remembered that VHS was never big in China, because the humidity would destroy the tapes. As a result, China mostly adopted VCD (Video CD) as a precursor to DVD... So... yeah. Maybe its a real issue in humid areas of the world, but man, oh man, was that video bad.

Er, VHS was quite popular in Hong Kong, Macau, and Taiwan for a while, but through the '70s and even into the '80s there weren't a lot of TVs in use in mainland China to pair VCRs with, even if people could afford them. And besides, hundreds of millions of mainland Chinese live in areas that aren't very humid in usual conditions. With Video CDs, on the other hand, the players cost about as much as a VCR did, but the discs were so much cheaper to manufacture and buy than tapes were. The same cheapness and equivalent quality is why the VCD format picked up in those other Chinas once it was around.

Correspondingly, there was plenty of usage of VHS/Beta in humid parts of Japan, Europe, and the Americas, regardless of whether people had working A/C or dehumidifiers to keep humidity down. Really, VHS tapes only get ruined faster than normal play wears them out if you insist on leaving them in sauna conditions or storing them in your bathroom. Typical household humidity in a humid area is OK for them, even though they'd be best in a perfectly air-conditioned storage facility.
