shrughes
Oct 11, 2008

(call/cc call/cc)
That's possible. I don't think the player I was using (around '06-07) was very good. Or maybe what I was trying to play was encoded anally to minimize filesize at the cost of decompression time.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Shaocaholica posted:

Hmm. Back in school I was able to play back 720p offline h.264 on it although they were circa 2006 720p encodes.

If you're watching anime, the 10-bit ~future~ will demolish that CPU. Normal 8-bit H.264 should work if it's low bitrate.

gary oldmans diary
Sep 26, 2005
The go-to software for playing files otherwise too demanding was always a Core product. TCPMP ruled. Then CoreAVC ruled.

Shaocaholica
Oct 29, 2002

Fig. 5E

PerrineClostermann posted:

If you're watching anime, the 10-bit ~future~ will demolish that CPU. Normal 8-bit H.264 should work if it's low bitrate.

Not to derail this thread, but where is all this 10-bit anime coming from? Are studios releasing 10-bit Blu-rays already? I always thought that 8-bit was deemed enough for the consumer and any extra bits would probably go towards compression/4K and not bit depth. Is there even a consumer 10-bit signalling standard to go from Blu-ray > HDMI > TV?

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Shaocaholica posted:

Not to derail this thread, but where is all this 10-bit anime coming from? Are studios releasing 10-bit Blu-rays already? I always thought that 8-bit was deemed enough for the consumer and any extra bits would probably go towards compression/4K and not bit depth. Is there even a consumer 10-bit signalling standard to go from Blu-ray > HDMI > TV?

It's purely due to the benefits to compression and image quality, specifically reduced banding. There are basically no 10-bit sources or displays. The anime encoder Diaz did a whole write-up on why, if you Google it; he's a lot more versed in this than I am.
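
A rough back-of-envelope sketch of the banding point (Python, made-up numbers, not from any encoder): a subtle gradient that only spans a small slice of the luminance range gets very few distinct code values at 8 bits and four times as many at 10 bits, which is the whole argument for 10-bit encodes even when the source and display are 8-bit.

code:
# Illustration only: how many distinct code values a smooth gradient gets
# when it covers a given fraction of the full luminance range.
def distinct_levels(range_fraction, bit_depth):
    total_codes = 2 ** bit_depth
    return max(1, round(range_fraction * total_codes))

# A dark, slowly changing sky covering ~5% of the range:
print(distinct_levels(0.05, 8))   # ~13 steps -> visible bands
print(distinct_levels(0.05, 10))  # ~51 steps -> far smoother before dithering back down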

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

shrughes posted:

I don't know, and I'm very doubtful. I had a 745 at 1.8 GHz and it wasn't close to handling H.264, but the software could have improved.

And.... I have a 1.7 GHz 735 right now. The thing's at 86% CPU playing a 360p YouTube video in Chrome on Linux, but it starts going real slow if you fullscreen it. Let's try the real Flash player.

The real Flash player can play a 360p video and fullscreen it, which makes sense because it could do that before. It can do 480p too, smoothly. It starts struggling at 720p. There's no hope for 1080p YouTube.

Boy I sure do miss 1400x1050 14" screens.

But running video in a browser, in a Flash wrapper, is always CPU heavy.

Running the same video offline in a decent player will yield good results.

To throw my hat into this arena and era, I have a Latitude D800 that shipped with a 1.7GHz Dothan, I upgraded it to 2GHz. It also shipped with an FX 5200, which I replaced with a 9600 Pro Turbo. 1920x1200 15.4". So baller. Of course, I rarely use it now, and it's mainly used for car diagnostics. I could try running some videos on it, but I imagine it'll be fairly competent, especially if that GPU can help out.

HalloKitty fucked around with this message at 10:41 on Feb 10, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Flash can actually be surprisingly efficient if hardware acceleration is working; the GPU is used for both bitstream decoding and rendering. This does depend on having a relatively modern GPU, current drivers, and a decent browser. Unfortunately, on most systems I use or see, hardware acceleration is blocked due to old drivers and the bugs they contain.

Shaocaholica
Oct 29, 2002

Fig. 5E
What's still Flash? I thought most video streaming sites had switched to HTML5?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
YouTube is Flash-only for a lot of content (anything that would normally require an ad); cross-platform support for H.264 in HTML5 video is a relatively new thing, since Cisco only released their codec for free in October 2013. WebM is so drat awful that H.264 in Flash looks better and has lower overhead than WebM in HTML5.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Shaocaholica posted:

What's still Flash? I thought most video streaming sites had switched to HTML5?

Pretty much anything that requires DRM of any sort is still either Flash or Silverlight, and likely will be for a long, long time.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Anandtech is reporting that all Core-based Intel CPUs have a memory controller defect that prevents using unbuffered DIMMs larger than 8GB, according to a company called "I'M Intelligent Memory". They're talking about this because they've developed a technology that works around the issue and allows DIMMs with their special sauce to function on Intel CPUs. The actual bug seems to be that the memory controller can't talk to 8Gbit dies; I'M's tech pairs two 4Gbit dies in a way that does work (so a 16GB DIMM would have 32 4Gbit dies).
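
The die counts in that description are easy to sanity-check; a quick sketch (Python, using only the capacities quoted above):

code:
# One gigabit is an eighth of a gigabyte, so die count = capacity / die size.
def dies_needed(dimm_gb, die_gbit):
    return dimm_gb * 8 // die_gbit

print(dies_needed(16, 8))  # 16 x 8Gbit dies -- the layout the controller reportedly can't address
print(dies_needed(16, 4))  # 32 x 4Gbit dies -- I'M's paired-die workaround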

The MUMPSorceress
Jan 6, 2012


^SHTPSTS

Gary’s Answer

Alereon posted:

Anandtech is reporting that all Core-based Intel CPUs have a memory controller defect that prevents using unbuffered DIMMs larger than 8GB, according to a company called "I'M Intelligent Memory". They're talking about this because they've developed a technology that works around the issue and allows DIMMs with their special sauce to function on Intel CPUs. The actual bug seems to be that the memory controller can't talk to 8Gbit dies; I'M's tech pairs two 4Gbit dies in a way that does work (so a 16GB DIMM would have 32 4Gbit dies).

How many systems that need 8GB+ DIMMs are running unbuffered RAM? Seems like the only systems running that much memory would be servers.

shrughes
Oct 11, 2008

(call/cc call/cc)

LeftistMuslimObama posted:

How many systems that need 8GB+ DIMMs are running unbuffered RAM? Seems like the only systems running that much memory would be servers.

Any laptop with more than 8GB probably has an 8GB+ DIMM.

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!

The article posted:

Mass production is set to begin in March and April, with initial pricing per 16GB module in the $320-$350 range for both DIMM and SO-DIMM, ECC being on the higher end of that range. To put that into perspective, most DRAM modules on sale today for end-users are in the $8-$14/GB range, making the modules have a small premium which is understandable to get the higher density.

Because I really need to cram 64 GiB of memory into my workstation for over twice the cost of normal memory. That "small premium" is totally worth it. :homebrew:

Hell, even Lenovo's SO-DIMMs aren't that outrageously priced. Who is seriously going to buy this before DDR4 comes out and makes it obsolete?
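
For what it's worth, here's how that "small premium" works out, using only the figures from the quoted article (quick sketch, Python):

code:
module_gb = 16
im_module_price = (320, 350)   # quoted range for the I'M 16GB modules
normal_per_gb = (8, 14)        # quoted range for ordinary DRAM

normal_module = tuple(p * module_gb for p in normal_per_gb)
print(normal_module)                                    # (128, 224) -- an ordinary 16GB module
print(round(im_module_price[0] / normal_module[1], 2))  # 1.43 -- vs the priciest normal RAM
print(round(im_module_price[1] / normal_module[0], 2))  # 2.73 -- vs the cheapest normal RAM
print(tuple(4 * p for p in im_module_price))            # (1280, 1400) -- filling a 64GB workstation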

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

dpbjinc posted:

Because I really need to cram 64 GiB of memory into my workstation for over twice the cost of normal memory. That "small premium" is totally worth it. :homebrew:

Hell, even Lenovo's SO-DIMMs aren't that outrageously priced. Who is seriously going to buy this before DDR4 comes out and makes it obsolete?

You don't, but the people buying into high-density servers are.

RAM is the limiting factor for a lot of deployments.

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD
SIMM doublers are back (in pog form!)

pienipple
Mar 20, 2009

That's wrong!
Finally made the jump to Intel with an i5-4570 :toot:

Ryokurin
Jul 14, 2001

Wanna Die?

Shaocaholica posted:

Anyone care to guess what this ~10 year old top tier mobile CPU can handle?

http://ark.intel.com/products/27596/Intel-Pentium-M-Processor-765-2M-Cache-2_10-GHz-400-MHz-FSB

Supposedly it was $600+ when it came out in 2004.

1080p youtube? 1080p H.264 playback?

Paired with an Intel 855GM GPU/chipset. I'd test it myself, but I won't have the notebook for a few weeks. It was my grad school notebook :allears:

1080p is highly doubtful. Maybe 720p. Back in the day, most people had to use codecs like CoreAVC to play video on similar systems, and even then it wasn't 100%. It would be better if the chipset could accelerate H.264, but the 855 did none of that.

craig588
Nov 19, 2005

by Nyc_Tattoo
I used to use a laptop with an Intel integrated GPU and a similar CPU, though a 2.3GHz 133MHz (533MT/s) FSB version, and it handled 720p H.264 at 3500kbps easily using Media Player Classic. I don't think 1080p would be entirely unreasonable, depending on the bitrate. YouTube at even 720p might be a stretch, though.

craig588 fucked around with this message at 19:10 on Feb 15, 2014

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Ivy Bridge EX processors were officially announced today, if anyone is in the market for $7K/socket CPUs.

mobby_6kl
Aug 9, 2009

Well, 15 2.8GHz cores would be pretty drat sweet.

In other news, FFS, apparently Broadwell's getting delayed until late '14 or Q1'15, and this is making me look like an rear end in a top hat with a Q6600 who didn't jump on Haswell, or Ivy Bridge, or Sandy Bridge...
http://techreport.com/news/26053/leaked-slides-echo-rumored-broadwell-delay

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

mobby_6kl posted:

Well, 15 2.8GHz cores would be pretty drat sweet.

In other news, FFS, apparently Broadwell's getting delayed until late '14 or Q1'15, and this is making me look like an rear end in a top hat with a Q6600 who didn't jump on Haswell, or Ivy Bridge, or Sandy Bridge...
http://techreport.com/news/26053/leaked-slides-echo-rumored-broadwell-delay
Don't worry, you still get Haswell Refresh and 9-series chipsets this summer. We weren't expecting Broadwell to be socketed anyway until the recent plan changes, so the process-related delays aren't earth-shattering. I have a Core 2 Quad also (though an overclocked 9550), and I think that's my plan.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
This makes me wonder just how much of a difference there is between my two most recent computers. C2D e6750 @ ....3.5GHz? I can't even remember. Compared to my 2600k @ 4.5GHz.

Hell, how does my 2600k compare to whatever the latest processors are?

jink
May 8, 2002

Drop it like it's Hot.
Taco Defender

PerrineClostermann posted:

This makes me wonder just how much of a difference there is between my two most recent computers. C2D e6750 @ ....3.5GHz? I can't even remember. Compared to my 2600k @ 4.5GHz.

Hell, how does my 2600k compare to whatever the latest processors are?

Your 2600K should be around 10-20% slower than the latest chips, depending on the task (the gap shows up mostly in synthetic benchmarks). Nothing to worry about for games.

http://www.anandtech.com/show/7003/the-haswell-review-intel-core-i74770k-i54560k-tested/6

WhyteRyce
Dec 30, 2001

The CEO of Intel did an IAmA on reddit but refused to answer any questions about will.i.am
http://www.reddit.com/r/IAmA/comments/1ycs5l/hi_reddit_im_brian_krzanich_ceo_of_intel_ask_me/

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

PerrineClostermann posted:

This makes me wonder just how much of a difference there is between my two most recent computers. C2D e6750 @ ....3.5GHz? I can't even remember. Compared to my 2600k @ 4.5GHz.

Hell, how does my 2600k compare to whatever the latest processors are?

2600K vs 3770K vs 4770K - Stock & @ 4.5GHz.

HalloKitty fucked around with this message at 09:51 on Feb 20, 2014

Ignoarints
Nov 26, 2010
I always had the impression delidding was a kind of process I'd just never want to do, as in "well that's as far as you'll get unless you want to delid it *lol*". But I just stumbled across an article on how to do it and it seems not terribly hard. I realized I don't know why I thought it was some undoable thing in the first place. Then I saw a YouTube video of putting the chip in a vice and a shirtless man hammering a block of wood into it and the chip sliding off the lid :lol:

Has anybody done this here? The results seem pretty phenomenal. 15-25* drop under full load is exactly what I'd love to see. I have an i5-4670k 4.4 GHz at 1.28v. I'm hitting 79-80* in prime95 with this. I had it at 4.5 GHz but it would spike to uncomfortable temperatures in the mid 80's with just a .10 increase in voltage, but it is not stable at 1.28.

I understand my warranty on my basically brand new chip will be toast, but that's not a huge deal for me. If I break it, I break it, but I'd love to do it if the risk wasn't necessarily too high.

Ignoarints fucked around with this message at 16:08 on Feb 20, 2014

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Ignoarints posted:

I always had the impression delidding was a kind of process I'd just never want to do, as in "well that's as far as you'll get unless you want to delid it *lol*". But I just stumbled across an article on how to do it and it seems not terribly hard. I realized I don't know why I thought it was some undoable thing in the first place. Then I saw a YouTube video of putting the chip in a vice and a shirtless man hammering a block of wood into it and the chip sliding off the lid :lol:

Has anybody done this here? The results seem pretty phenomenal. 15-25* drop under full load is exactly what I'd love to see. I have an i5-4670k 4.4 GHz at 1.28v. I'm hitting 79-80* in prime95 with this. I had it at 4.5 GHz but it would spike to uncomfortable temperatures in the mid 80's with just a .10 increase in voltage, but it is not stable at 1.28.

I understand my warranty on my basically brand new chip will be toast, but that's not a huge deal for me. If I break it, I break it, but I'd love to do it if the risk wasn't necessarily too high.

You've learnt all anyone can tell you, really. Read as much as you can about doing it.

You may gently caress up the chip. But if you don't, you'll get good gains.

Don't pretend the risk is small, because even if it is, you could still fall into the inevitable failure rate. If you can't afford to replace the chip, don't consider it.

ShaneB
Oct 22, 2002


PerrineClostermann posted:

This makes me wonder just how much of a difference there is between my two most recent computers. C2D e6750 @ ....3.5GHz? I can't even remember. Compared to my 2600k @ 4.5GHz.

Hell, how does my 2600k compare to whatever the latest processors are?

I was actually just doing some cursory research on this and was amazed that 3 years hasn't really pushed the envelope TOO far for CPUs. I mean yeah sure Haswell is superior to Sandy Bridge, but you could still have a perfectly awesome computing experience on high-end Sandy Bridge hardware you bought back then. The only semi-daily use thing that really pushes my CPU to the limit and would cost me a few seconds is RAW photo processing work in Lightroom, but that's not my daily job or anything.

HalloKitty posted:

You've learnt all anyone can tell you, really. Read as much as you can about doing it.

You may gently caress up the chip. But if you don't, you'll get good gains.

Don't pretend the risk is small, because even if it is, you could still fall into the inevitable failure rate. If you can't afford to replace the chip, don't consider it.

Delidding is so absurdly easy if you know someone with a vise. I took my brand new i5 to my garage, chucked it in the vise with some duct tape on the edges for cushioning, whacked the IHS off, and left my garage in about 45 seconds total.

ShaneB fucked around with this message at 16:29 on Feb 20, 2014

Ignoarints
Nov 26, 2010

HalloKitty posted:

You've learnt all anyone can tell you, really. Read as much as you can about doing it.

You may gently caress up the chip. But if you don't, you'll get good gains.

Don't pretend the risk is small, because even if it is, you could still fall into the inevitable failure rate. If you can't afford to replace the chip, don't consider it.

I'll keep reading, thanks. I can afford it; I just would greatly prefer not to buy a new one, of course. Seems like the vice method is a great success; maybe I'll buy this and remove the rubber pads.

ShaneB posted:



Delidding is so absurdly easy if you know someone with a vise. I took my brand new i5 to my garage, chucked it in the vise with some duct tape on the edges for cushioning, whacked the IHS off, and left my garage in about 45 seconds total.


I have a vice but the surface is just way too rough (I would think) to get a good wide grip on a small chip. It's a beast (I've pressed control arm bushings with that thing) but I'm trying to find a cheap vice with a flat surface. Glad to hear it's so easy; the videos I've seen seem to suggest the same thing. I keep imagining slipping and crushing all the pins with the wood every time I see a video, though.

Ignoarints fucked around with this message at 16:38 on Feb 20, 2014

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

ShaneB posted:

Delidding is so absurdly easy if you know someone with a vise. I took my brand new i5 to my garage, chucked it in the vise with some duct tape on the edges for cushioning, whacked the IHS off, and left my garage in about 45 seconds total.

I didn't say it was necessarily hard or high-risk, but it's not exactly zero-risk: you could accidentally chip the die.

Ignoarints posted:

I'll keep reading, thanks. I can afford it, I just would greatly prefer not to buy a new one of course. Seems like the vice method is a great success, maybe I'll buy this and remove the rubber pads.

Good luck!

ShaneB
Oct 22, 2002


Ignoarints posted:

I have a vice but the surface is just way too rough (I would think) to get a good wide grip on a small chip. It's a beast (I've pressed control arm bushings with that thing) but I'm trying to find a cheap vice with a flat surface. Glad to hear it's so easy; the videos I've seen seem to suggest the same thing. I keep imagining slipping and crushing all the pins with the wood every time I see a video, though.

There aren't pins on Haswell; the pins are in the socket now, so that's one less thing to worry about. I suppose you could shear off SOMETHING down there, but it's pretty easy if you just set the wood block against it snugly.

Ignoarints
Nov 26, 2010

ShaneB posted:

There aren't pins on Haswell; the pins are in the socket now, so that's one less thing to worry about. I suppose you could shear off SOMETHING down there, but it's pretty easy if you just set the wood block against it snugly.

I can't believe I didn't notice something like that when I put it in. Good news though. drat, I might just do this today.

edit: nm, couldn't source that CLU stuff locally. Not even easy to find online.

Ignoarints fucked around with this message at 22:38 on Feb 20, 2014

Straker
Nov 10, 2005

PerrineClostermann posted:

This makes me wonder just how much of a difference there is between my two most recent computers. C2D e6750 @ ....3.5GHz? I can't even remember. Compared to my 2600k @ 4.5GHz.

Hell, how does my 2600k compare to whatever the latest processors are?
C2Ds are pretty lovely. I had something overclocked to 3.2GHz that could barely transcode video in realtime; now with a 2500K I can transcode video and play BF4 or whatever at the same time, no problem. Nothing out now is much better than a 2500K though, kinda sad.

computer parts
Nov 18, 2010

PLEASE CLAP

Straker posted:

C2Ds are pretty lovely. I had something overclocked to 3.2GHz that could barely transcode video in realtime; now with a 2500K I can transcode video and play BF4 or whatever at the same time, no problem. Nothing out now is much better than a 2500K though, kinda sad.

Yeah, I switched out my brother's computer from a C2D to a 2500K and the difference was incredibly noticeable, although that may have also been because he was still using DDR2 RAM at the time and I switched that up to DDR3.

ShaneB
Oct 22, 2002


Ignoarints posted:

I can't believe I didn't notice something like that when I put it in. Good news though. drat, I might just do this today.

edit: nm, couldn't source that CLU stuff locally. Not even easy to find online.

I got mine off amazon.

Ignoarints
Nov 26, 2010

ShaneB posted:

I got mine off amazon.

Yeah, I did too; a reseller had some (then they sold out within a day and now it's even more expensive).

coffeetable
Feb 5, 2006

TELL ME AGAIN HOW GREAT BRITAIN WOULD BE IF IT WAS RULED BY THE MERCILESS JACKBOOT OF PRINCE CHARLES

YES I DO TALK TO PLANTS ACTUALLY

Ignoarints posted:

I always had the impression delidding was a kind of process I'd just never want to do, as in "well that's as far as you'll get unless you want to delid it *lol*". But I just stumbled across an article on how to do it and it seems not terribly hard. I realized I don't know why I thought it was some undoable thing in the first place. Then I saw a YouTube video of putting the chip in a vice and a shirtless man hammering a block of wood into it and the chip sliding off the lid :lol:

Has anybody done this here? The results seem pretty phenomenal. 15-25* drop under full load is exactly what I'd love to see. I have an i5-4670k 4.4 GHz at 1.28v. I'm hitting 79-80* in prime95 with this. I had it at 4.5 GHz but it would spike to uncomfortable temperatures in the mid 80's with just a .10 increase in voltage, but it is not stable at 1.28.

I understand my warranty on my basically brand new chip will be toast, but that's not a huge deal for me. If I break it, I break it, but I'd love to do it if the risk wasn't necessarily too high.

Assuming you're not one of the BIGGER NUMBER BETTER THAN nutcases, it's important to keep in mind just what you're trying to achieve by delidding it. Going from 4.4GHz to 4.5GHz is a 2% performance increase, and that's only when you're already bottlenecked by the CPU.
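
A quick sketch of the arithmetic behind that point (assuming performance scales roughly linearly with clock, and only while you're actually CPU-bound):

code:
def clock_gain_pct(old_ghz, new_ghz):
    return (new_ghz / old_ghz - 1) * 100

print(round(clock_gain_pct(4.4, 4.5), 1))  # 2.3 -- one extra bin
print(round(clock_gain_pct(4.4, 4.7), 1))  # 6.8 -- if a delid bought you a few more bins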

Ignoarints
Nov 26, 2010

coffeetable posted:

Assuming you're not one of the BIGGER NUMBER BETTER THAN nutcases, it's important to keep in mind just what you're trying to achieve by delidding it. Going from 4.4GHz to 4.5GHz is a 2% performance increase, and that's only when you're already bottlenecked by the CPU.

I'd hope for more, 4.6-4.7. Seems reasonable based on what other people have gotten. Some have gotten 4.8+, but with a better cooler than I have.

So yes

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

computer parts posted:

Yeah, I switched out my brother's computer from a C2D to a 2500K and the difference was incredibly noticeable, although that may have also been because he was still using DDR2 RAM at the time and I switched that up to DDR3.
The 45nm C2Ds were a lot better; the originals were often paired with P4 chipsets and slow DDR2, and that was not a good combination. A 4-series chipset with AHCI support and DDR2-800+ is in a much better position today.
