|
Actually I've never hotswapped my actual eSATA port, but I have an internal hot-swap bay in a 5.25" slot which is totally awesome. The only problem I have is that I typically have to rescan disks in Disk Management after putting something in.
|
# ¿ May 13, 2012 07:05 |
|
Steve posted:Any news on Saints Row the Third performance on the 7000 series? Due to it being my most played and loved game I grabbed a GTX 680 and have been playing it flawlessly at 60fps (it can drop near the zombie area with all the smoke, though, no biggie). I've always liked the design and aesthetics of my 7970 so I'd love to chuck it back in, but if my favorite game still runs like hell it's a no go.

I recently moved from a 5850 to a 7950, and my SR3 performance went from running decently at 1920x1200 on Medium to 2560x1440 on Ultra or whatever. This was only a brief check as I've been focused on other games, and I hadn't really checked performance in a while, so I don't know how much is the card and how much is drivers. Either way, it felt really smooth in some brief play with everything cranked up.
|
# ¿ Jul 17, 2012 11:28 |
|
I'm pretty happy that the 660Ti looks like a great deal, but at the same time nothing big enough to make me regret buying a 7950 a while back.
|
# ¿ Aug 16, 2012 23:21 |
|
KillHour posted:I don't know why they don't just program games to be 64 bit nowadays. Is anyone really still using a 32 bit OS for gaming?

I don't know, how many vocal gamers are left bitterly clinging to XP because they're certain Windows jumped the shark in reliability and performance after it (from reading forum discussions/jokes rather than from using anything newer)? It also doesn't help that a lot of computers have still been sold with 32-bit Windows for no good reason over the last several years, and I'm sure a fair number are owned by people who buy games. Even past that, marketing never wants to put up minimum requirements they feel will leave anyone out, even if it means giving processor/video requirements that lead to customers upset with the slideshow they're viewing.
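For what it's worth, the ceiling behind the 32-bit complaint is easy to show in a couple of lines. The 2 GiB default and ~3 GiB large-address-aware figures below are the commonly cited Windows numbers, not anything from the thread:

```python
# Why 32-bit binaries pinch big games: the whole address space is 4 GiB,
# and a default 32-bit Windows process only gets half of it.
GIB = 2 ** 30

total_32bit = 2 ** 32                  # 4 GiB addressable, period
default_user_space = total_32bit // 2  # 2 GiB per process by default on Windows
large_address_aware = 3 * GIB          # ~3 GiB with /LARGEADDRESSAWARE on 32-bit Windows

print(total_32bit // GIB, default_user_space // GIB, large_address_aware // GIB)  # 4 2 3
```

A 64-bit build sidesteps all of this, which is the poster's point about wishing games would just move on.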
|
# ¿ Oct 17, 2012 21:16 |
|
PC LOAD LETTER posted:The WiiU tablet controller is probably what is causing that. Apparently if you buy 1 game Nintendo more than makes up the loss, though, so it'll probably still be profitable from day 1 for them. Given the rumors we've been hearing about what the PS4 (glorified Trinity APU of sorts) and X720 (AMD 67xx class GPU + maybe an AMD CPU or multi-core ARM variant, or a bit of both + 4-8GB of RAM) will be, I don't think any console manufacturer is going to push the limits financially as much as they did back when the PS3/X360 launched.

I hadn't thought of the tablet controller, but that would explain some of it. I agree on the rest: no one's going to be trying to challenge high end gaming PCs this generation, and there's nothing like the HD-DVD/Blu-Ray standards war driving hardware specs from outside the gaming industry either. So no one's going to be releasing fantastically expensive consoles still sold at a steep loss. The WiiU is going to be the weakest of its generation, but it should be closer to the GameCube vs. the PS2/Xbox than the Wii vs. the PS3/360. Its RAM concerns me more than anything. At 2GB it's far larger than the Wii's in any case, but only 1GB available to games means it's not that huge a leap over the PS3/360, and it will be well behind their successors. Memory seems to be a huge and often overlooked factor in game design and porting, especially as consoles age, so I expect it might be one of the big concerns with the system a few years from now.
|
# ¿ Nov 25, 2012 21:22 |
|
Jan posted:When they weren't forced to downscale the totality of their engine to work on archaic consoles with 350MB of RAM to work with between CPU and GPU?

Yeah, that's pretty much it. High end game development is so expensive now that the only people who have reason to sink that kind of cash into single-platform development rather than maximize their target audience are console manufacturers looking to grow market share through exclusives. And multiplatform development means an end result defined by the weakest link. What we're left with is a market where technological advance has been hampered by a console generation lasting so long, and developers having to trim things back to fit, sometimes unsuccessfully, as seen in the Skyrim DLC.
|
# ¿ Jan 23, 2013 19:38 |
|
SocketSeven posted:Being able to make realistic hair might also be able to let you make things like realistic grass. Assuming it can scale from head sized to map sized.

Better a head full of grass than a map full of hair, I guess.
|
# ¿ Feb 26, 2013 09:24 |
|
Endymion FRS MK1 posted:Here's an article showing the top 10 most important graphics cards.

I don't know, if I remember this right, wasn't the 8800GT just a revision and die shrink that gave an incremental improvement over the initial GTS/GTX (and landed between them in performance)? The GTX was the one that made a huge leap in features and performance and became what a serious card needed to match. I mean, the GT was out longer and sold more, but I can't argue with this any more than I can with the GeForce 256 being on there instead of the longer-lived and better-selling GeForce 2 that just built on it a little.
|
# ¿ Apr 18, 2013 04:40 |
|
roadhead posted:http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/

And if that's correct, doesn't it mean that developers making games using that feature on AMD APUs for the PS4 and Xbox will have a low barrier to also supporting it in PC releases? Pretty nice strategy if so.
|
# ¿ Apr 30, 2013 22:02 |
|
real_scud posted:Is my videocard dying? It's a 7970 that I've had for a while, the temps are pretty low since I put the aftermarket cooler on it and I'm running the 13.5 beta drivers.

I've had this happen too, and have assumed it to be a driver thing since it's seemed to come and go with driver installs.
|
# ¿ May 12, 2013 23:32 |
|
w00tazn posted:Good thing the designers of the Xbone say that they weren't targeting high end graphics. You should also remember that for current gen, 576p/30 was the target framebuffer/fps and that most games target Medium/Low graphics compared to their PC counterparts.

That's the same thing I've taken away from the new console reveals. They're not really powerful, but they have a large, unified memory space, so the PC gamers who are mostly going to feel pain with new titles coming out are those with 1GB cards or the like that handled most of the last generation's multiplatforms.
|
# ¿ May 30, 2013 18:51 |
|
Zero VGS posted:I'm sure there's a simple answer for this but why don't GPUs just have slots for you to put more ram into?

They had some like that in the old days! Like, before they called them "GPUs" and when you were lucky to get meaningful 3D feature sets anyway. I imagine higher speeds, higher bandwidth, varying clock speeds, physical demands imposed by cooler design, and generally exacting requirements compared to desktop RAM mean that it would be a headache for manufacturers, of limited value even for enthusiasts, and generally have a high cost/benefit ratio.
|
# ¿ Nov 2, 2013 03:23 |
|
b0nes posted:Technical question: A lot of the new Ultrabooks have crazy high resolution screens. I am not a big gamer but I do play from time to time. I see when games are played at the native resolution usually they get crap frame rates. If you played a game at a lower resolution would the frame rates improve? Or not since the same amount of pixels need to be lit up no matter what (If you played at full screen).

Lower resolutions improve frame rates, though running an LCD at less than its native resolution hurts picture quality since it has to interpolate rather than output each pixel directly as the card produces it. So you'll get faster, but blurrier. How bad this is depends on the screen and the resolution, but it's at least better than playing a slideshow. One exception is if you can use an exact multiple of the resolution. For example, if you get one of the crazy 2560x1440 ultrabooks and then game fullscreen at 1280x720 (exactly half that in each dimension), you'll have a perfectly sharp image, just effectively with bigger pixels.
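That "exact multiple" case is easy to check mechanically. A minimal sketch (the function name and framing are mine, not from the post):

```python
def integer_scale(native, target):
    """Return the whole-number scale factor if the target resolution divides
    the native one evenly in both dimensions, else None (meaning the panel
    must interpolate, which is what causes the blur)."""
    nw, nh = native
    tw, th = target
    if nw % tw == 0 and nh % th == 0 and nw // tw == nh // th:
        return nw // tw
    return None

print(integer_scale((2560, 1440), (1280, 720)))   # 2 -> pixel-perfect, just bigger pixels
print(integer_scale((2560, 1440), (1920, 1080)))  # None -> interpolated, blurrier
```

The same check says 1920x1080 is a clean half of a 3840x2160 panel, which is why those pairings also scale sharply.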
|
# ¿ Nov 7, 2013 18:11 |
|
Athropos posted:Which is a lie because two cards at 2GB does not equal 4GB at all. Or I was told wrong.

True or not, it's a lie that's been industry standard since the Voodoo 5 at least.
|
# ¿ Nov 19, 2013 23:35 |
|
Also, it sounds like even when buying GPUs specifically for long-term mining is a huge risk, using one you already have and seeing how a boom rides out leaves you with a lot less to lose.
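As a sketch of that trade-off with entirely made-up numbers (the daily coin revenue, wattage, and power price below are placeholders; real figures swing wildly with difficulty and prices):

```python
def monthly_mining_profit(revenue_per_day, card_watts, price_per_kwh):
    """Rough 30-day mining margin: coin revenue minus electricity cost."""
    power_cost = card_watts / 1000 * 24 * 30 * price_per_kwh
    return revenue_per_day * 30 - power_cost

margin = monthly_mining_profit(2.00, 250, 0.12)
print(round(margin, 2))        # card you already own: any positive margin is upside
print(round(400 / margin, 1))  # card bought at $400 just to mine: months to break even
```

With a card you already own, the only downside is the power bill and wear; a card bought outright has to survive many months of boom before it pays for itself.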
|
# ¿ Dec 5, 2013 22:01 |
|
In my experience, the cost of PC gaming for years has more or less been "if you have reason to own a midrange desktop already, spend the cost of a console on a video card instead", and the GPU shortage was an anomaly there. You can extend your PC gaming budget upward a lot from that level, but you don't need to unless you have the money to spend. A console is more convenient, but it's mostly only cheaper if you wouldn't otherwise have a desktop. Which more and more people don't.
|
# ¿ Jul 16, 2022 06:10 |