Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
Actually, I've never hot-swapped anything on my eSATA port, but I have an internal hot-swap bay in a 5.25" slot which is totally awesome. The only problem I have is that I typically have to rescan disks in Disk Management after putting something in.
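
If the manual rescan gets old, it can be scripted: diskpart's rescan command does the same thing as Action -> Rescan Disks. A minimal sketch of my own, assuming Windows with diskpart on the PATH and an elevated prompt:

```python
# Sketch: trigger the same disk rescan Disk Management does, via diskpart.
# Assumes Windows, diskpart available, and an elevated (administrator) prompt.
import os
import subprocess
import tempfile

# diskpart /s runs commands from a script file; "rescan" re-enumerates disks.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as script:
    script.write("rescan\n")
    script_path = script.name

try:
    subprocess.run(["diskpart", "/s", script_path], check=True)
finally:
    os.unlink(script_path)
```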

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

AvatarSteve posted:

Any news on Saints Row the Third performance on the 7000 series? Since it's my most played and loved game, I grabbed a GTX 680 and have been playing it flawlessly at 60fps (it can drop near the zombie area with all the smoke, though, no biggie). I've always liked the design and aesthetics of my 7970 so I'd love to chuck it back in, but if my favorite game still runs like hell it's a no go.

Big thanks if anyone has any input on this situation.

I recently moved from a 5850 to a 7950, and my SR3 performance went from working decently at 1920x1200 on Medium to 2560x1440 on Ultra or whatever. This was only a brief check since I've been focused on other games and hadn't looked at SR3 performance in a while, so I don't know how much is the card and how much is drivers. Either way, it felt really smooth with everything cranked up.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
I'm pretty happy that the 660 Ti looks like a great deal, but at the same time it's nothing big enough to make me regret buying a 7950 a while back.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

KillHour posted:

I don't know why they don't just program games to be 64-bit nowadays. Is anyone really still using a 32-bit OS for gaming?

I don't know, how many vocal gamers are still bitterly clinging to XP because they're certain Windows jumped the shark in reliability and performance after it (a certainty drawn from forum discussions/jokes rather than from actually using anything newer)?

It also doesn't help that a lot of computers have still been sold with 32-bit Windows for no good reason over the last several years, and I'm sure a fair number are owned by people who buy games. Even past that, marketing never wants to publish minimum requirements they feel will leave anyone out, even if it means listing processor/video requirements that leave customers upset with the slideshow they end up viewing.
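
For what it's worth, the practical ceiling is easy to put numbers on. A rough sketch of the address-space math, using my own figures for the standard Windows limits rather than anything from the posts above:

```python
# Why 32-bit builds pinch games: pointer width caps the address space,
# and Windows reserves part of it for the kernel.
GIB = 2**30

total_32bit = 2**32          # everything a 32-bit pointer can address: 4 GiB
default_user = 2 * GIB       # default user-mode share for a 32-bit Windows process
laa_on_64bit = 4 * GIB       # a /LARGEADDRESSAWARE 32-bit process on 64-bit Windows

for label, size in [("32-bit address space", total_32bit),
                    ("default 32-bit process limit", default_user),
                    ("32-bit LAA process on a 64-bit OS", laa_on_64bit)]:
    print(f"{label}: {size / GIB:.0f} GiB")
```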

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

PC LOAD LETTER posted:

The WiiU tablet controller is probably what is causing that. Apparently if you buy one game Nintendo more than makes up the loss, though, so it'll probably still be profitable from day one for them. Given the rumors we've been hearing about what the PS4 (a glorified Trinity APU of sorts) and X720 (an AMD 67xx-class GPU plus maybe an AMD CPU or a multi-core ARM variant, or a bit of both, plus 4-8GB of RAM) will be, I don't think any console manufacturer is going to push the limits financially as much as they did back when the PS3/X360 launched.

I hadn't thought of the tablet controller, but that would explain some of it. I agree on the rest: no one's going to be trying to challenge high end gaming PCs this generation, and there's nothing like the HD-DVD/Blu-Ray standards war driving hardware specs from outside the gaming industry either. So no one's going to be releasing fantastically expensive consoles still sold at a steep loss. The WiiU is going to be the weakest of its generation, but it should be closer to the GameCube vs. the PS2/Xbox than the Wii vs. PS3/360. Its RAM concerns me more than anything. At 2GB it's far larger than the Wii in any case, but only 1GB available to games means it's not that huge a leap over the PS3/360, and it will be well behind their successors. Memory seems to be a huge and often overlooked factor in game design and porting, especially as consoles age, so I expect it might be one of the big concerns with the system a few years from now.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Jan posted:

When they weren't forced to downscale the totality of their engine to work on archaic consoles with 350MB of RAM shared between CPU and GPU?

You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around. :psyduck:

Yeah, that's pretty much it. High-end game development is so expensive now that the only people with a reason to sink that kind of cash into single-platform development, rather than maximizing their target audience, are console manufacturers looking to grow market share through exclusives. And multiplatform development means a result defined by the weakest link. The upshot is a market where technological progress has been hampered by a console generation lasting this long and by developers having to trim things back to fit - sometimes unsuccessfully, as seen with the Skyrim DLC.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

SocketSeven posted:

Being able to make realistic hair might also let you make things like realistic grass, assuming it can scale from head-sized to map-sized.

Interesting times ahead for sure, whatever they pull out of their hat.

Better a head full of grass than a map full of hair, I guess.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Endymion FRS MK1 posted:

Here's an article showing the top 10 most important graphics cards.

Some... odd choices. 8800GTX instead of the GT? 5970 instead of 4870?

And of course the final one, the GTX Titan

I don't know, if I remember right wasn't the 8800GT just a revision and die shrink that gave an incremental improvement over the initial GTS/GTX (and landed between them in performance)? The GTX was the one that made a huge leap in features and performance and became what a serious card needed to match. I mean, the GT was out longer and sold more, but I can't argue with this any more than I can with the GeForce 256 being on there instead of the longer-lived and better-selling GeForce 2 that just built on it a little.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

roadhead posted:

http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/

This is pretty big, and I assume it's basically the PS4's architecture. The cache coherency is a Big Deal.

And if that's correct, doesn't it mean that developers making games using that feature in AMD APUs for the PS4 and Xbox will have a low barrier to also support it in PC releases? Pretty nice strategy if so.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

real_scud posted:

Is my video card dying? It's a 7970 that I've had for a while; the temps are pretty low since I put an aftermarket cooler on it, and I'm running the 13.5 beta drivers.

What's even weirder is it's only Firefox that exhibits this tendency so far, and I haven't had any glitches or weird things happen while playing games.



I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

w00tazn posted:

Good thing the designers of the Xbone say they weren't targeting high-end graphics. You should also remember that for the current gen, 576p/30 was the target framebuffer/framerate and that most games target Medium/Low settings compared to their PC counterparts.

For next gen, I can imagine the target framebuffer resolution going up, but not much more than that. I doubt we'll be seeing any huge leaps in fidelity anytime soon, as we haven't really seen anything that pushes the boundaries on PCs today, and doing so would just mean increased costs for developers who already operate on razor-thin margins.

That's the same thing I've taken away from the new console reveals. They're not really powerful, but they have a large, unified memory space, so the PC gamers most likely to feel pain with new titles are those on 1GB cards or so, the kind that handled most of the last generation's multiplatform releases just fine.
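
Just to put rough numbers on the framebuffer side of that, here's my own back-of-the-envelope math, assuming a plain 32-bit color buffer and no MSAA or extra render targets:

```python
# Rough framebuffer sizes at common console target resolutions,
# assuming 4 bytes per pixel (32-bit color) and a single buffer.
BYTES_PER_PIXEL = 4
MIB = 2**20

resolutions = {
    "576p (1024x576)":   (1024, 576),
    "720p (1280x720)":   (1280, 720),
    "1080p (1920x1080)": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    size = w * h * BYTES_PER_PIXEL
    print(f"{name}: {w * h:,} pixels, ~{size / MIB:.2f} MiB per buffer")
```

The color buffer itself is tiny; the real squeeze on 1GB cards is textures and the stack of extra render targets modern engines keep around, which also grow with resolution.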

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Zero VGS posted:

I'm sure there's a simple answer for this, but why don't GPUs just have slots for you to put more RAM into?

They had some like that in the old days! Like, before they were called "GPUs" and when you were lucky to get a meaningful 3D feature set at all. I imagine the higher speeds, higher bandwidth, varying clocks, physical constraints imposed by cooler design, and generally exacting requirements compared to desktop RAM mean it would be a headache for manufacturers, of limited value even for enthusiasts, and overall a poor cost/benefit proposition.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

b0nes posted:

Technical question: a lot of the new Ultrabooks have crazy high-resolution screens. I'm not a big gamer but I do play from time to time. I see that when games are played at the native resolution they usually get crap frame rates. If you played a game at a lower resolution, would the frame rates improve? Or not, since the same number of pixels need to be lit up no matter what (if you played at full screen)?

Lower resolutions improve frame rates, though running an LCD below its native resolution hurts picture quality since it has to interpolate rather than display each pixel exactly as the card produces it. So you'll get faster, but blurrier. How bad this is depends on the screen and the resolution, but it's at least better than playing a slideshow.

One exception is if you can use an exact multiple of the resolution. For example, if you get one of the crazy 2560x1440 ultrabooks and then game fullscreen at 1280x720 (exactly half that in each dimension) you'll have a perfectly sharp image, just effectively with bigger pixels.
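
The arithmetic behind that, spelled out (my own sketch, nothing more than pixel counting):

```python
# Pixel counting behind the "exact multiple" trick: 1280x720 is exactly half
# of 2560x1440 per axis, so each rendered pixel maps to a clean 2x2 block
# and the panel never has to interpolate.
native = (2560, 1440)
render = (1280, 720)

scale_x = native[0] / render[0]
scale_y = native[1] / render[1]
pixel_ratio = (native[0] * native[1]) / (render[0] * render[1])

print(f"scale per axis: {scale_x:g} x {scale_y:g}")                  # 2 x 2
print(f"pixels to shade: 1/{pixel_ratio:g} of native")               # 1/4
print(f"integer scale, no blur: {scale_x.is_integer() and scale_y.is_integer()}")
```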

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Athropos posted:

Which is a lie, because two cards with 2GB each do not add up to 4GB at all. Or I was told wrong.

True or not, it's a lie that's been industry standard since the Voodoo 5 at least.
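
For anyone wondering why the numbers don't add: with alternate-frame rendering each card draws its own frames and needs its own full copy of the textures and buffers, so the usable pool is that of one card, not the sum. A toy illustration of my own, obviously not how any driver actually reports it:

```python
# Why 2GB + 2GB SLI/CrossFire still behaves like 2GB: under alternate-frame
# rendering, every GPU holds a full duplicate of the working set.
def effective_vram_gb(cards_gb):
    """Usable VRAM for game assets under AFR: limited by the smallest card."""
    return min(cards_gb)

def marketing_vram_gb(cards_gb):
    """What the box likes to print: the straight sum."""
    return sum(cards_gb)

setup = [2, 2]  # two 2GB cards
print(f"marketing: {marketing_vram_gb(setup)}GB, usable: {effective_vram_gb(setup)}GB")
```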

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
Also, it sounds like even if it's a huge risk to buy GPUs specifically for long-term mining, using one you already have and riding out a boom leaves you with a lot less to lose.
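
Rough break-even math, with completely made-up numbers just to show the shape of the argument (the hash revenue, card price, and power cost below are placeholders, not real figures):

```python
# Toy break-even comparison: buying a card for mining vs. using one you own.
# All numbers are hypothetical placeholders.
card_cost = 400.0          # upfront cost if you buy a card just to mine ($)
daily_revenue = 3.0        # coins mined per day, converted to $
power_cost_per_day = 1.0   # electricity ($/day)

daily_profit = daily_revenue - power_cost_per_day

days_to_break_even = card_cost / daily_profit
print(f"bought card: ~{days_to_break_even:.0f} days just to recover the hardware")
print(f"card you already own: ahead from day 1 at ${daily_profit:.2f}/day")
# The asymmetry is the point: if the boom dies early, the buyer eats the card
# cost, while the existing owner is only out some electricity.
```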

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
In my experience, the cost of PC gaming has for years more or less been "if you already have a reason to own a midrange desktop, spend the cost of a console on a video card instead", and the GPU shortage was an anomaly there. You can extend your PC gaming budget upward a lot from that level, but you don't need to unless you have the money to spend.

Console's more convenient, but it's mostly only cheaper if you wouldn't otherwise have a desktop. Which more and more people don't.
