Eletriarnation
Apr 6, 2005

Bob Morales posted:

I've seen 5-20MB/sec with the card I have. Some expensive cards will do 50+, but that's still about what an old 80GB HD will do. Your 10GB is probably not quite that fast, though.

Those are just raw transfers, I'm not sure what the read/write patterns and seek times are though.

From my recollection, a fair number of drives that old didn't support anything higher than Ultra ATA/33, so they of course never got any better than 33MB/s. According to Wikipedia, the 66MB/s spec didn't hit until 2000, and I know they were producing 10GB drives as far back as '98 at least.
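Just to put those transfer rates in perspective, here's a rough back-of-the-envelope sketch (Python, using the speeds mentioned above; real drives and cards will vary a lot):

```python
# Rough transfer-time math for the rates discussed above.
# Figures are illustrative only; sustained speeds vary in practice.

def transfer_minutes(size_gb: float, rate_mb_per_s: float) -> float:
    """Minutes to move size_gb of data at a sustained rate_mb_per_s."""
    return (size_gb * 1000) / rate_mb_per_s / 60

for label, rate in [("slow card", 5), ("fast card", 20),
                    ("ATA/33 ceiling", 33), ("ATA/66 ceiling", 66)]:
    print(f"10 GB at {rate} MB/s ({label}): ~{transfer_minutes(10, rate):.0f} min")
```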

Eletriarnation
Apr 6, 2005

mAlfunkti0n posted:

If I remember correctly, those had to have the frequency ramped to the moon before they became useful, due to something or other that my concussed brain cannot recall. You could use it for a NAS, but I wouldn't use it for an HTPC, as it is going to have difficulty if any video processing is demanded of it.

Technical reason: the pipeline (the path instructions take while going through the processor) is much longer than in more modern (and older, like the P3!) processors, at 31 stages for Prescott (the 90nm A- and E-series P4s). This allows much higher clock speeds, because each stage is simpler and a signal doesn't have to propagate through as much logic in a single cycle, but instructions end up passing through more stages overall, so you get less bang for each clock.

Practically, a 3.0GHz Prescott is not that much worse than a 2.0GHz C2D for loads that aren't well multithreaded (it has Hyperthreading, so it can handle a bit of thread-level parallelism, but not like a real dual core), except that, as everyone mentioned, it will use much more power: perhaps 50% more under full load, and a much greater margin when idle or nearly idle, thanks to the power-saving techniques in newer processors. A Prescott will wipe the floor with any single-core Atom, but will also use something like 30x the power considering the CPUs alone.
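To put a number on the "less bang for each clock" idea, here's a toy throughput comparison (a Python sketch; the IPC figures are invented purely to illustrate the tradeoff, not measured values):

```python
# Toy model: effective throughput ~= clock speed x instructions per clock (IPC).
# A long pipeline buys clock speed at the cost of IPC. The IPC numbers below
# are made up for illustration; they are not benchmark results.

cpus = {
    "Prescott P4 @ 3.0 GHz (31-stage pipeline)": (3.0e9, 0.7),
    "Core 2 Duo @ 2.0 GHz (one core, shorter pipeline)": (2.0e9, 1.1),
}

for name, (clock_hz, ipc) in cpus.items():
    print(f"{name}: ~{clock_hz * ipc / 1e9:.1f} billion instructions/sec")
```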

It's not bad for the uses other people have suggested, like office software and web browsing, as long as you're prepared for it to use more power.

Eletriarnation fucked around with this message at 01:30 on Feb 23, 2011

Eletriarnation
Apr 6, 2005

Nierbo posted:

I know my computer is antiquated, but as a student, I don't have money to splash around. All I really want to play is TF2, and it runs sort of OK, but not as well as I'd like. Should I get an X2 CPU, or should I get that Radeon 5770 card that is recommended in the parts thread?

I know the long-term answer is to save up and get a whole new system etc., but that's just not on the cards right now.

What is your current card?

Eletriarnation
Apr 6, 2005

The new card will make more of a difference.

Eletriarnation
Apr 6, 2005

Kaluza-Klein posted:

I am doing it now and it is charging. If the apartment burns down, at least I'll have some random person on the internet to blame instead of myself. Thank you!

I could be wrong, I'm no electrician, but I'm pretty sure that since the input is just going straight into a DC rectifier anyway, the ground really doesn't matter. This theory is reinforced by the fact that the ThinkPad W510 I'm typing this on, with an adapter that at 135W is as powerful as any I've ever seen for a laptop, takes a two-pin AC input and not a three-pin one. I'm pretty sure that if it mattered to any real extent, Lenovo would spring for a three-pin brick/cord on a $1500+ laptop.

Eletriarnation
Apr 6, 2005

spe posted:

Haven't got a thumb drive at hand, but the wireless mouse + adapter work a lot better on the hub. The hub is connected to a short bit of its own cable before connecting to a 5m USB extension that plugs into my PC. Does anything like this apply to Ethernet cables? I was gonna install a 20m cable.

The spec for 10/100/1000BASE-T is 100 meters across the board, I believe, so a 20m run is well within limits.

Eletriarnation
Apr 6, 2005

Lovie Unsmith posted:

Pretty sure that every AMD CPU from Socket 754 on has been 64-bit compliant.

There were some s754 Semprons that were 32-bit only, I believe. But as mentioned before, the only substantial exceptions to "everything from Intel or AMD in the last 5 years is 64-bit" are the Atom N270/N280, which were ubiquitous in netbooks two or three years ago but have been replaced by the 64-bit-capable N4xx/5xx series.
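If you ever want to check whether a given chip can do 64-bit, on a Linux machine you can look for the "lm" (long mode) CPU flag. Here's a quick Python sketch (Linux-only, since it just reads /proc/cpuinfo):

```python
# Look for the 'lm' (long mode) flag in /proc/cpuinfo, which indicates
# x86-64 support. Linux-only; other OSes expose this differently.

def cpu_is_64bit() -> bool:
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return "lm" in line.split(":", 1)[1].split()
    return False

print("64-bit capable" if cpu_is_64bit() else "32-bit only")
```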

Eletriarnation
Apr 6, 2005

StickFigs posted:

My question remains unanswered in the networking thread...


Any help would be appreciated!

If the connector matches, I'd imagine that's all it takes for the antenna to be compatible. Alternatively, this adapter picks up an amazing signal (I get 5 bars from a cheap Linksys router 100 feet away in another building) and theoretically does up to 150Mbps, if you don't want the uncertainty and have a couple spare USB ports.

EDIT: On your question about selecting G vs. N - as far as I am aware, there is no way to pick a specific access point if they're all one SSID, although you may have a setting that would lock you out of N operation.

Eletriarnation fucked around with this message at 14:42 on Sep 11, 2011

Eletriarnation
Apr 6, 2005

real_scud posted:

Does anyone know of a sound card that can take optical audio in and isn't Creative? I have an Xbox connected to one of my secondary monitors, and currently the sound goes into my onboard sound via the stereo-to-3.5mm headphone jack, and it works OK.

However, the sound is really poor on my 5.1 speakers and regularly gets blown out in certain games, so I'd really like to have the Xbox's optical audio go into an optical input on the sound card and then get output as 5.1.

I found an HT|Omega Striker on Newegg that I thought could do it, but the reviews say it won't pass an optical input through if you've only got analog going out of the sound card, which is my situation because my speaker setup just has 5.1 analog inputs and no optical connector.

Anyone know of an alternative? It kind of makes me sad that my Game Theater XP doesn't really work with Win 7.

http://www.amazon.com/Sewell-USB-SoundBox-Sound-Card/dp/B004Y0ERRO/ref=sr_1_1?ie=UTF8&qid=1321499949&sr=8-1

I have one of these and it works great for analog I/O, although I haven't tried the digital. You may want to give it a try - I'm going to get an optical cable in later this week to hook up the old PS2, so I guess I'll learn if it works then.

Eletriarnation
Apr 6, 2005

Cardboard Fox posted:

This is a rather simple question, but after reading dozens of conflicting opinions around the internet I've decided to ask you goons.

Is it better to leave your computer on or off at all times?

I generally leave my computer on without putting it to sleep. I did this with my last computer over a 5-year span and it's still working just fine. Since I have an SSD now, should I maybe consider turning this new computer off every night, or possibly putting it to sleep instead of just turning off the LCD?

I don't think it really matters substantially for system durability, just for power consumption - I sleep my computer whenever I'm not using it because it uses 80W idling and 2W asleep. I'm not sure how having an SSD would affect this except that it uses less power than a hard drive.

I think if you don't care about the power consumption (say, if you want to use your computer as a space heater), or you have a machine that uses very little when idling, you might as well leave it on.
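For a rough sense of what sleeping instead of idling actually saves, here's a quick back-of-the-envelope calculation (a Python sketch using my 80W/2W figures; the electricity rate is an assumption, so plug in your own):

```python
# Energy saved by sleeping (2 W) instead of idling (80 W) for 8 hours a night.
# The $/kWh rate below is an assumed figure, not a quoted price.

idle_w, sleep_w = 80, 2
hours_per_night = 8
rate_per_kwh = 0.12  # assumed electricity price in $/kWh

kwh_saved = (idle_w - sleep_w) / 1000 * hours_per_night * 30  # per month
print(f"~{kwh_saved:.1f} kWh/month saved, "
      f"roughly ${kwh_saved * rate_per_kwh:.2f} at ${rate_per_kwh}/kWh")
```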

Eletriarnation
Apr 6, 2005

GobiasIndustries posted:

Yeah, as was pointed out, the size is the sticking point; otherwise I would have tossed the PSU out and put in an Antec right off the bat. I was really hoping that for such a low-power build I'd be able to get away with the stock unit, but no dice, I suppose. I ended up having to go with a Sparkle brand unit, as it was the only one shallow enough not to block the DVD connectors... their overall reviews for power supplies seemed to be pretty positive, so fingers crossed that it turns out OK.

Sparkle is just rebranded Fortron/FSP, which has been a reliable brand for a long time. I built my roommate's system with an FSP 350W and it has been running close to max regularly for five years now. You should be fine.

Eletriarnation
Apr 6, 2005

Drone_Fragger posted:

Hey, I'm about to upgrade my system but I have a small question. Allegedly the HD7xxx series is gonna launch soonish, and even if I don't get one it's probably gonna drop the price of the HD6xxx series a bit. Is it worth waiting and getting a low-end HD7xxx series card or a (possibly reduced) HD6xxx series card later, or should I just get an HD6850 or something now?

Comedy option: don't use either and just upgrade my CPU, mobo and rams.

What CPU and graphics card are you running now? For reference, the 7970 is around 20% faster than the 6970, and uses a bit more power at load but a lot less (proportionally) when idling, so that can give you an idea of how the rest of the series might pan out.

Eletriarnation
Apr 6, 2005

Drone_Fragger posted:

eh, I decided I'd just wait till the HD7 series launches. If nothing else it'll probably drop the cost of the HD6 series a bit, and I can wait a few weeks using my old graphics card anyway. Thanks for the help Eletriarnation!

I have a 4850, running with an original i7 920. The 920 is still plenty fast (especially if I ever bother to overclock it) so I've been planning on getting a 7000-series too.

Eletriarnation
Apr 6, 2005

Lblitzer posted:

Does anyone have any opinions on bias lighting? I've lowered the brightness on each of my monitors, and I'm not much of a fan of F.lux anymore. Does it help enough with eye strain to be worth throwing $12 at?

I don't know if I can confidently say it's better for my eyes or anything like that, but I got a couple of the Antec LED strips for my monitors and I like the effect. They don't stick very well to a sloped surface (like the back of a monitor), so a strip of packing tape on top will be helpful if that's the kind of thing you're thinking of.

It also kind of turns your monitor into a nightlight if you leave your computer/USB ports on while sleeping.

Eletriarnation
Apr 6, 2005

Alereon posted:

Nope. Basically nothing you can do to hard drives short of knocking them around, feeding them dirty power, or letting them get stupid hot (>50C) will affect their lifespans. For what little it's worth, I've got three WD hard drives in my system and I shut it down every night; they're going strong in their fourth year.

This is maybe a bit pedantic because it doesn't come up much, but running hard drives below 25C will begin to decrease their life too, according to the big Google study. Although "normal" ambient temperature is around 22C, in my experience a running drive in a well-ventilated case will usually reach equilibrium around 30C, so that's safe.

Doing what I used to do and feeding your intakes with outside air in the winter is probably not a good idea, though.
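If you want to sanity-check a reported drive temperature against that rough range, something like this would do it (a trivial Python sketch; the 25-50C window is just the rule of thumb from above, and the reading would come from whatever monitoring tool you use):

```python
# Classify a drive temperature against the rough 25-50 C comfort zone
# discussed above. The thresholds are a rule of thumb, not a spec.

def drive_temp_verdict(temp_c: float) -> str:
    if temp_c < 25:
        return "colder than ideal (may shorten lifespan per the Google study)"
    if temp_c > 50:
        return "too hot (bad for lifespan)"
    return "fine"

for temp in (18, 30, 55):  # example readings
    print(f"{temp} C: {drive_temp_verdict(temp)}")
```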

Eletriarnation
Apr 6, 2005

BusinessWallet posted:

Last week, before all the iPad hardware specs were released, it was speculated that the new iPad would have 512 MB of RAM, the same as the iPad 2. Some folks around my office were discussing why Apple would release the iPhone 4S and not increase the RAM from the previous model, and why they would release the new iPad without updating the memory as well. They came to the conclusion that adding memory would decrease battery life in the devices, because RAM uses battery power. That makes no sense to me, so I argued that added RAM would increase battery life because it takes load off the CPU and other resources, causing the device to run more efficiently, but they would not relent.

I want to know if anyone has insight about this, because I am genuinely curious. I know that in a standard laptop, not a tablet, if you add RAM it is generally accepted that this will increase battery life because it takes load off the CPU and hard drive, so I don't understand how it would be different for mobile devices like iPhone or iPad.

If adding memory caused the device to swap to virtual memory less, maybe. I don't think iOS pages memory out to storage, personally, so the power delta going from 512MB to 1GB would depend purely on the number of chips involved and how much power each uses. That is, they may just be using twice as many chips, or they may instead be using newer ones with higher density and/or lower voltage. I don't feel like it's going to be a consequential difference in any case.
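A rough sketch of the kind of math involved (Python; the per-chip power numbers are invented purely for illustration, since real DRAM figures depend entirely on the parts used):

```python
# Toy comparison: 512 MB vs 1 GB of RAM, built either from twice as many of
# the same chips or from fewer, denser ones. All power figures are made up.

def module_power_w(chips: int, mw_per_chip: float) -> float:
    """Total power in watts for a given number of DRAM chips."""
    return chips * mw_per_chip / 1000

print("512 MB, 2 chips @ 150 mW each:", module_power_w(2, 150), "W")
print("1 GB,   4 chips @ 150 mW each:", module_power_w(4, 150), "W")   # double the chips
print("1 GB,   2 denser chips @ 180 mW:", module_power_w(2, 180), "W")  # newer parts
```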

Eletriarnation fucked around with this message at 16:33 on Mar 13, 2012
