|
Krailor posted:There's certainly no technical reason why this couldn't be done; it's called a PCIe slot. Sure, but: a) The N64 Expansion Pak sold well enough, and indeed there were games that were improved by the pak without outright requiring it, though the execution wasn't consistent [edit: in some cases it looks like a hilarious shitshow]: http://nintendo.wikia.com/wiki/Nintendo_64_Expansion_Pak As long as Sony enforced the backwards compatibility and performance standards, they wouldn't have needed to worry about it. b) Stuff like the Xbox 360 had user-upgradable hard drives and plenty of people found the time to dick with those. It's a weird argument to say that plugging a single thing into a slot is more of a hassle than buying another console at the store. All I'm saying is that while it's better than doing a PS5, I seriously hope this doesn't become a thing. I only game on PC so it doesn't affect me, I just pity the consumers. It sucks having to re-buy 90% of the system all over again to have what looks like a single component upgraded.
|
# ? Apr 20, 2016 23:25 |
|
|
# ? Jun 10, 2024 16:42 |
|
Just saying that the PS4K is primarily meant to target 4K playback and VR. I'm 100% sure 2304 cores is AMD's new VR target and it's a $200-220 product. That's why this is more a revision than a new product, and it's likely pretty close to a drop-in design, where GCN4 cores take up half the space the original GCN cores do. It'll likely have better minimum frame rates, and for games where all cores get used it could probably do a very steady 1080p60. Also, this is a huge win for AMD, as it gets developers to work with what is roughly its midrange 2304-core dGPU and optimize around it.
|
# ? Apr 21, 2016 00:02 |
|
I bought this GTX 950 because it's got 3 DisplayPorts, except none of them actually function from a fresh boot. To get the DP ports working I have to do this: 1) Boot with a monitor connected over HDMI 2) Once the system is booted, switch my monitor to DP 3) Reboot and enjoy working DP If I shut the computer down again, the DisplayPorts stop working again. What the hell?
|
# ? Apr 21, 2016 00:21 |
|
I really hope AMD releases a consumer product similar to this APU with actually useful levels of integrated graphics performance instead of just sticking to a low-end budget niche with their desktop APU line. There are lots of people who want to build tiny steambox-like gaming systems and the current options for doing so aren't very good.
|
# ? Apr 21, 2016 00:24 |
Measly Twerp posted:I bought this GTX 950 because it's got 3 DisplayPorts, except none of them actually function from a fresh boot. To get the DP ports working I have to do this: It might be the monitor instead of the video card; I know some monitors have issues recognizing that they're connected after sleep or restart when connected via DP.
|
|
# ? Apr 21, 2016 00:36 |
|
Measly Twerp posted:I bought this GTX 950 because it's got 3 DisplayPorts, except none of them actually function from a fresh boot. To get the DP ports working I have to do this: Might be the cable; there are only a few VESA-certified DP cables out there, and chances are the one you're using isn't one of them.
|
# ? Apr 21, 2016 02:02 |
|
SlayVus posted:Which might mean that when Polaris does release, the might be higher prices because of short supply. You probably won't see AMD/nV get nearly all their products off the 28nm process, with those 28nm products sold as mediocre rebrands in the meantime, for another year at best while the foundries' 14/16nm processes ramp up volume. 2yr is probably the safe bet.
|
# ? Apr 21, 2016 03:25 |
|
THE DOG HOUSE posted:I always thought the greatest appeal to consoles was "everything is the same" and I was thoroughly confused at the idea of a PS4.5. GPU architecture won't change much, if at all, from a backwards-compatibility standpoint. Polaris is an evolution of the same GPU currently in the PS4. If anything they're adding features, not taking them away. It's when you take away features that games depend on that you really run into major compatibility problems. For RAM they're not changing capacity, just adding more bandwidth. Performance-wise the GPU is getting the biggest upgrade and should perform around an R9 290 when it's not bandwidth limited. For 1080p games it should be pretty awesome; dunno for sure about 4K. I think it'll still really depend on how much the developers want to optimize the game to pull that off.
|
# ? Apr 21, 2016 03:38 |
|
SlayVus posted:I can imagine how many people will be purchasing a PS4.5(Can also be called PS Neo). The user base is already 40,000,000+. I can't imagine there would be any more market for the 4.5 to grab hold of. You might see an influx of people second hand selling to GameStop, eBay, and Craigslist. You can look at the 3DS and the New 3DS sales to get a rough idea of how the PS4.5 might do. I'm sure there's a decent number of existing early PS4 buyers who might trade theirs in for an upgraded version just like people who swap for new cell phones every year or two.
|
# ? Apr 21, 2016 04:33 |
|
PC LOAD LETTER posted:GPU architecture won't much, if at all, from a backwards compatibility stand point. Polaris is a evolution of the same GPU currently in PS4. If anything they're adding features not taking them away. Its when you take away features that games depend on that you really run into major compatibility problems. The problem is screwing over customers who have regular PS4s. Sony is enforcing that games have to run on the old hardware, but doesn't say anything about how well they have to run. There are 3DS games that run at a much lower frame rate, for a worse experience, on a regular 3DS. At the same time this also means developers can't really take advantage of the new hardware as much as they otherwise would, because development effort spent on PS4K-exclusive features won't be experienced by all users. I imagine most games will just detect if it's a PS4K, turn up some graphics settings, and gain a more consistent frame rate. I bought my PS4 nine months ago and I've not even played it once. I was going to play Bloodborne but never found the time. I probably won't upgrade; the few exclusives I do plan to play probably won't gain too much from the new hardware.
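To be concrete about what I mean by "detect and turn up some settings", a minimal sketch of that kind of per-mode preset switch might look like this. Every name and value here is made up for illustration; it's not Sony's actual API, just the shape of the logic:

```python
# Hypothetical sketch: pick a graphics preset based on which hardware
# mode the console reports. All names and values are invented examples.

BASE_PRESET = {"resolution": (1600, 900), "shadow_quality": "medium",
               "draw_distance": 0.7, "target_fps": 30}

NEO_PRESET = {"resolution": (1920, 1080), "shadow_quality": "high",
              "draw_distance": 1.0, "target_fps": 30}  # steadier, not higher

def pick_preset(is_neo_mode: bool) -> dict:
    """Same game, same code path; only the preset changes per hardware mode."""
    return NEO_PRESET if is_neo_mode else BASE_PRESET

print(pick_preset(True)["resolution"])
print(pick_preset(False)["resolution"])
```

The point being that this is cheap for a developer to do, which is exactly why I expect it to be the common case rather than real PS4K-exclusive features.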
|
# ? Apr 21, 2016 04:54 |
|
Will 60fps @ 4k be possible this upcoming generation with a single card?
|
# ? Apr 21, 2016 06:29 |
|
Desuwa posted:The problem is screwing over customers who have regular PS4s. Sony is enforcing that games have to run on the old hardware, but doesn't say anything about how well they have to run. If they're indeed letting older PS4s run new PS4K games like poo poo, then yes, that is bad. If they just reduce the resolution/IQ until it runs OK on PS4s, while still looking decent and with decent fps, then I don't see that as a problem. objects in mirror posted:Will 60fps @ 4k be possible this upcoming generation with a single card?
|
# ? Apr 21, 2016 06:42 |
|
Desuwa posted:The problem is screwing over customers who have regular PS4s. Sony is enforcing that games have to run on the old hardware, but doesn't say anything about how well they have to run. The only thing I'm finding Sony taking a stance on is that developers must make the game playable in both modes, Basic and Neo. Also, even games currently available on the market will get NO benefit from the new PS4 UNLESS the developers patch it into the game. So Just Cause 3, Fallout 4, and The Witcher 3 will not run better unless they get patched for it. PC LOAD LETTER posted:If they're indeed letting older PS4's run new PS4K games like poo poo than yes that is bad. The game must run in Basic mode, to not alienate any players. The game must not have Neo-mode-only features (ex: co-op only in Neo mode!). If the developer does not add a Neo mode, the game can't benefit from the PS Neo at all. Neo mode games will run at 1080p and then be upscaled to 4K however the developer sees fit; Basic can be upscaled to 1080p from any lower resolution. Also, the PS4B will NOT get 4K upscaling. Only the PS4N will do 4K upscaling. SlayVus fucked around with this message at 07:00 on Apr 21, 2016 |
# ? Apr 21, 2016 06:53 |
|
SlayVus posted:The only thing I'm finding Sony taking a stance on is that developers must make the game playable in both modes. Basic and Neo mode. Also, even games current available on the market will get NO benefit from the new PS4 UNLESS the developers patch it into the game. So Just Cause 3, FallOut 4, and The Witcher 3 will not run better unless they get patched for it. Yeah, that's what I said. The games have to support Basic mode, but there is nothing that says they have to perform as well or can't take cheap graphical shortcuts. It's unlikely it will happen too much, because good publishers would be wary of backlash if their game only averages 20fps or has tons of stutter, but significant dips below 30 or 60fps already happen in current PS4 games. I imagine that will only get worse with games that target the PS4N and then are hastily cut down to work on the PS4B without too much time spent. I could see a developer targeting the PS4N and making the Basic version by just reducing one or two settings by a lot, like draw distance or texture quality, because taking a more balanced approach would be too much work. Hell, one of the games that just performs really poorly on the old-model 3DS is the new Hyrule Warriors game, which isn't an unknown game by a tiny studio or anything. If it can happen to a spinoff Zelda game it can happen to any game.
|
# ? Apr 21, 2016 07:57 |
|
Desuwa posted:I could see a developer targeting the PS4N and making the basic version by just reducing one or two settings by a lot, like draw distance or texture quality, because taking a more balanced approach would be too much work. On a very large (70"+) high-end 4K TV is where games on a PS4 will probably look crappy vs a PS4K, even at typical couch viewing distances. 1080p starts to get a bit blockish at such sizes even if you sit far away from the screen, and 4K will stay pretty smooth until you get to silly-sized screens that only projectors can pull off.
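To put a rough number on "blockish at such sizes": you can estimate pixels per degree of viewing angle from the screen size and distance. The geometry below is standard; the only assumption is the common rule of thumb that roughly 60 pixels per degree is about the limit of 20/20 vision:

```python
import math

def pixels_per_degree(diag_inches, horiz_pixels, distance_inches):
    """Horizontal pixels divided by the horizontal field of view in degrees,
    assuming a 16:9 panel."""
    width = diag_inches * 16 / math.hypot(16, 9)   # panel width in inches
    fov = 2 * math.degrees(math.atan(width / 2 / distance_inches))
    return horiz_pixels / fov

# 70" TV viewed from 9 feet (108"):
ppd_1080 = pixels_per_degree(70, 1920, 108)
ppd_4k = pixels_per_degree(70, 3840, 108)
print(round(ppd_1080), round(ppd_4k))  # 1080p lands right around the ~60ppd acuity mark
```

So at 9 feet a 70" 1080p screen is right at the edge of the rule of thumb, and any closer (or bigger) pushes it under, which is exactly where 4K still has headroom.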
|
# ? Apr 21, 2016 08:21 |
|
PC LOAD LETTER posted:CPU architecture doesn't change at all. They're not even offering more cores, just more clockspeed. Which is good for single-thread performance, and that is where Jaguar suffers the most. For a 2016 CPU it's still wimpy from a single-thread perspective. Given how much developers have complained about it publicly, I'd have thought they'd really try to get the clockspeed higher at least, to 2.5GHz or so. That would amount to a near 50% increase in clockspeed, which I'd think would be an awfully nice bump in performance. I'm not sure if they're making a 14nm version of the Jaguar core, though, and if they're switching the GPU to Polaris the CPU portion also has to follow the same die shrink if they want to keep cooling/interconnects simple.
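For reference, the arithmetic on that clock bump, assuming the standard PS4's 1.6GHz Jaguar base clock (that base clock is the one assumption here):

```python
base_ghz = 1.6    # stock PS4 Jaguar clock (assumed)
target_ghz = 2.5  # the hoped-for bump

increase = (target_ghz / base_ghz - 1) * 100
print(f"{increase:.2f}% clockspeed increase")  # prints "56.25% clockspeed increase"
```

So "near 50%" is actually a touch conservative if they could really hit 2.5GHz.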
|
# ? Apr 21, 2016 13:58 |
|
Anime Schoolgirl posted:Jaguar cannot reliably clock above 2.3ghz Anime Schoolgirl posted:I'm not sure if they're making a 14nm version of the Jaguar core
|
# ? Apr 21, 2016 14:15 |
|
PC LOAD LETTER posted:I'd thought they were just heat/power limited on 28nm to 25W which was causing the modest clockspeeds but OK.
|
# ? Apr 21, 2016 14:18 |
|
I think a low clock ceiling is part of the sacrifice inherent to high density libraries, and for the power ranges targeted most of the time it's not a big deal.
|
# ? Apr 21, 2016 14:18 |
|
Wow, that is kinda crappy. Sorry for the sarcasm/wrongness then. Seems even Puma and Puma+ didn't clock much higher either.
|
# ? Apr 21, 2016 14:42 |
|
Anime Schoolgirl posted:you can feed AM1 (Jaguar) as much power as it wants and you can use a 95w cooling solution, there's just nothing but the silicon lottery that affects whether your chip can actually break that barrier. Especially considering that as a single monolithic part, there's no way to take the world's tiniest chainsaw and cut it up to bin a console processor as a lower-spec product.
|
# ? Apr 21, 2016 15:29 |
|
This is why I'm taking the Jaguar cores with a grain of salt; Excavator+ shows that it can easily match Jaguar in power consumption and heat, yet also be vastly faster. Excavator+ could likely do 8 cores running @ 2.5GHz for 40W, if Stoney is anything to go by, while having a large IPC increase touching stock Sandy Bridge.
|
# ? Apr 21, 2016 15:49 |
|
FaustianQ posted:This is why I'm taking the jaguar cores with a grain of salt, Excavator+ shows that it can easily match Jaguar in power consumption and heat, yet also be vastly faster. Excavator+ could likely do 8 cores running @ 2.5ghz for 40W, if Stoney is anything to go by, while having a large IPC increase touching stock Sandy. This is why sticking with Jaguar makes no sense. There's Puma, there's Puma+ (Carrizo-L), but much more interestingly there's Carrizo. 8 Carrizo cores at somewhere a bit over 2GHz would be an excellent upgrade for the PS4.
|
# ? Apr 21, 2016 16:46 |
|
Just saw that Newegg has a deal on MSI 980 Tis for $529 after rebate. Rather good price for what is still a fast card, but man, talk about a fire sale. Kinda wish I'd returned my STRIX, but at the same time I still feel the wait for the next Ti might be quite a while...
|
# ? Apr 21, 2016 16:51 |
|
EdEddnEddy posted:Just saw that Newegg has a deal on MSI 980Ti's for $529 after rebate. With how retailers are selling these cards, it really makes you wonder what's coming. 980 Ti for the price of a 980, what kind of world are we living in.
|
# ? Apr 21, 2016 17:07 |
|
SlayVus posted:With how retailers are selling these cards, it really makes you wonder what's coming. 980 Ti for the price of a 980, what kind of world are we living in. I'm certain the retailers don't know what's coming, and are just worried that they'll get stuck with old stock if the new cards are super desirable.
|
# ? Apr 21, 2016 17:11 |
|
Gotta give Acer some credit I guess; their next gaming laptops look to run 980's (not "m"), so it's nice to see them take a chance on gaming/VR more than in the past. More competition in that arena is always better. http://www.engadget.com/2016/04/21/acer-predator-gaming-pcs/?sr_source=Twitter SlayVus posted:With how retailers are selling these cards, it really makes you wonder what's coming. 980 Ti for the price of a 980, what kind of world are we living in. Last I remember, the 780 Ti came and went at a premium since it was only a short while before the 900 series arrived to cancel it out. The 980 Ti seems to have had a much longer lifespan before the new stuff arrived, and it's interesting seeing it phased out with nothing new announced or available yet. Yay if a 1070 is as fast as a 980 Ti is now, but that leaves no incentive for me to upgrade for VR until I see what the next Ti can do. I want to be able to run Project CARS at max with a single GPU, and all the rumors seem to point to that possibility, but I doubt anything short of the top high end will manage it, even one generation later... EdEddnEddy fucked around with this message at 17:27 on Apr 21, 2016 |
# ? Apr 21, 2016 17:24 |
|
Don't worry, SLI for VR is really easy (you just render each eye on one GPU) and it will drop any moment now! With how much hype there was for this, I'm glad I didn't actually get a 2nd 980 Ti yet, because AFAIK nothing actually works with SLI VR yet.
|
# ? Apr 21, 2016 17:38 |
|
Is AMD XCF any better?
|
# ? Apr 21, 2016 17:43 |
|
AFAIK Crossfire VR is just as nonexistent.
|
# ? Apr 21, 2016 17:44 |
|
Truga posted:Don't worry, SLI for VR is really easy you just render each eye on one gpu and will drop any moment now! It worked great when we had Extended mode, but now with the actual direct-to-HMD VR mode they have yet to do anything with it, and I feel they're waiting for the DX12 VR SLI support to roll out. Really though, the other issue is the latency between the 2 cards. While normal SLI doesn't have a problem with it, with VR the latency between the cards can increase a ton based on the PCI-E lanes and spec you are running. Technically you want both cards on PCI-E 3.0 x16 to keep the inter-card transfer latency negligible, whereas 2.0 x16 or 3.0 x8 adds enough latency to be a real problem for VR. That limits your market not only to the few with SLI, but to the few with X79/X99 systems with 40 3.0 lanes...
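For what it's worth, the "one GPU per eye" scheduling itself really is simple; the hard parts are the driver support and the inter-card copy. A toy sketch of the idea, with a stand-in render function instead of any real GPU API (everything here is illustrative, not VRWorks/LiquidVR code):

```python
from concurrent.futures import ThreadPoolExecutor

def render_eye(gpu_id: int, eye: str, frame: int) -> str:
    """Stand-in for submitting one eye's command buffer to one GPU.
    In real VR SLI each GPU renders the same scene from one eye's viewpoint."""
    return f"frame {frame}: {eye} eye rendered on GPU{gpu_id}"

def render_frame(frame: int) -> tuple:
    # Left eye on GPU0, right eye on GPU1, submitted in parallel.
    # The catch from the post above: both results still have to end up on
    # one GPU for scanout to the HMD, which is where the PCI-E transfer
    # latency between the cards bites.
    with ThreadPoolExecutor(max_workers=2) as pool:
        left = pool.submit(render_eye, 0, "left", frame)
        right = pool.submit(render_eye, 1, "right", frame)
        return left.result(), right.result()

print(render_frame(0))
```

So the parallelism is the easy half; shipping the second eye's framebuffer back across the bus fast enough every frame is the half that depends on your lane config.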
|
# ? Apr 21, 2016 17:45 |
|
What about those cards that are 2 GPUs on 1 board? Does that solve that problem? I realize that's a tiny addressable market, I'm just curious if that particular problem goes away.
|
# ? Apr 21, 2016 19:03 |
|
greasyhands posted:What about those cards that are 2 gpus on 1 board? does that solve that problem? I realize thats a tiny addressable market, I'm just curious if that particular problem goes away. Not really, they're the same as two cards from a software perspective.
|
# ? Apr 21, 2016 19:08 |
|
Would a 650Ti Boost work when powered from only one Molex connector? The PSU only has one (the rest are SATA; no 6-pin) and I cut off one of the Molex connectors from my adapter for a different project anyway. I'd rather make this work than swap a PSU out from another PC that works fine now.
|
# ? Apr 21, 2016 20:20 |
|
No
|
# ? Apr 21, 2016 20:26 |
|
Has anyone had Afterburner somehow set absurdly high core and memory clocks by itself? I just had to figure out why I suddenly started getting artifacts under a minute after booting. Turns out Afterburner had set both clocks to +1000 MHz. I haven't even been running overclocks for months.
|
# ? Apr 21, 2016 21:19 |
|
Pollyzoid posted:Has anyone had Afterburner somehow set absurdly high core and memory clocks by itself? I just had to figure out why I suddenly started getting artifacts under a minute after booting. Turns out Afterburner had set both clocks to +1000 MHz. I haven't even been running overclocks for months. Do you have a roommate or a cat
|
# ? Apr 21, 2016 21:43 |
|
Naffer posted:I'm certain the retailers don't know what's coming, and are just worried that they'll get stuck with old stock if the new cards are super desirable. Those who make these decisions often have a pretty good idea. Why offer price cuts on a desirable item if it's public knowledge that production has stopped?
|
# ? Apr 21, 2016 22:26 |
|
Pollyzoid posted:Has anyone had Afterburner somehow set absurdly high core and memory clocks by itself? I just had to figure out why I suddenly started getting artifacts under a minute after booting. Turns out Afterburner had set both clocks to +1000 MHz. I haven't even been running overclocks for months. Why does anyone (everyone) seem to still be using MSI Afterburner? Afterburner hasn't been good since the 500-series era or earlier. Switch to EVGA Precision and watch nearly all the clock/fan problems disappear, since it actually works.
|
# ? Apr 21, 2016 23:18 |
|
EdEddnEddy posted:Why does Anyone (Everyone) seem to still be using MSI Afterburner. I've only ever had a problem with EVGA Precision, lol. But if I (or anybody) had an issue with Afterburner, I'm sure I'd switch, since they do the same thing.
|
# ? Apr 21, 2016 23:31 |