Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Krailor posted:

There's certainly no technical reason why this couldn't be done; it's called a PCIe slot.

There are some business reasons why this might not be a good idea.

1. It splits your user base so either games will need to specify 'add-on card required' or you force everyone to develop to the base system and the add-on card is for 'Pro Gamers' and gives you better resolution/higher framerate.

2. One of the big benefits of console gaming is it 'Just Works'. The more PC-like you make your console (base processing unit, add-on accelerator card, etc) the higher risk you run of people making that connection and losing that customer to the PC gaming side forever.

Sure, but:

a) The N64 expansion pak sold well enough, and indeed there were games that were improved by the pak without outright requiring it, though the execution wasn't consistent [edit:in some cases it looks like a hilarious shitshow]: http://nintendo.wikia.com/wiki/Nintendo_64_Expansion_Pak As long as Sony enforced the backwards compatibility and performance standards, they wouldn't have needed to worry about it.

b) Stuff like the Xbox 360 had user-upgradable hard drives, and plenty of people found the time to dick with those. It's a weird argument to say that plugging a single thing into a slot is more of a hassle than buying another console at the store.

All I'm saying is that while it's better than doing a PS5, I seriously hope this doesn't become a thing. I only game on PC so it doesn't affect me, I just pity the consumers. It sucks having to re-buy 90% of the system all over again to have what looks like a single component upgraded.


EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Just saying that the PS4K is primarily meant to target 4K playback and VR. I'm 100% sure 2304 cores is AMD's new VR target, and it's a $200-220 product. That's why this is more a revision than a new product, and it's likely pretty close to a drop-in design, where GCN4 cores take up half the space the GCN cores do.

It'll likely have better minimum frames, and games where all cores get used could probably do a very steady 1080p60. Also, this is a huge win for AMD, as it gets developers to work with what is roughly its midrange 2304-core dGPU and optimize around it.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
I bought this GTX 950 because it's got 3 DisplayPorts, except none of them actually function from a fresh boot. To get the DP ports working I have to do this:

1) Boot with a monitor connected over HDMI
2) Once the system is booted, switch my monitor to DP
3) Reboot and enjoy working DP

If I shut the computer down again, the DisplayPorts stop working again.

What the hell?

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I really hope AMD releases a consumer product similar to this APU with actually useful levels of integrated graphics performance instead of just sticking to a low-end budget niche with their desktop APU line. There are lots of people who want to build tiny steambox-like gaming systems and the current options for doing so aren't very good.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Measly Twerp posted:

I bought this GTX 950 because it's got 3 DisplayPorts, except none of them actually function from a fresh boot. To get the DP ports working I have to do this:

1) Boot with a monitor connected over HDMI
2) Once the system is booted, switch my monitor to DP
3) Reboot and enjoy working DP

If I shut the computer down again, the DisplayPorts stop working again.

What the hell?

It might be the monitor instead of the video card; I know some monitors have issues with recognizing that they're connected after a sleep or restart when connected via DP.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Measly Twerp posted:

I bought this GTX 950 because it's got 3 DisplayPorts, except none of them actually function from a fresh boot. To get the DP ports working I have to do this:

1) Boot with a monitor connected over HDMI
2) Once the system is booted, switch my monitor to DP
3) Reboot and enjoy working DP

If I shut the computer down again, the DisplayPorts stop working again.

What the hell?

Might be the cable; there are only a few VESA-certified ones, and chances are the one you're using isn't one of them.

PC LOAD LETTER
May 23, 2005
WTF?!

SlayVus posted:

Which might mean that when Polaris does release, there might be higher prices because of short supply.
I think given the rebranding they're doing on the M400 line it's safe to say they don't have the 14/16nm capacity available to do everything they want. But the 14/16nm processes will still be very new by then, and everyone making a high-performance MPU of some sort is looking to produce their stuff on them, so that's normal.

You probably won't see AMD/nV get nearly all their products off the 28nm process, with those products sold as mediocre rebrands, for another year at best while the foundries' 14/16nm processes ramp up volume. 2yr is probably the safe bet.

PC LOAD LETTER
May 23, 2005
WTF?!

THE DOG HOUSE posted:

I always thought the greatest appeal to consoles was "everything is the same" and I was thoroughly confused at the idea of a PS4.5.
CPU architecture doesn't change at all. They're not even offering more cores, just more clockspeed, which is good for single-thread performance, and that is where Jaguar suffers the most. For a 2016 CPU it's still wimpy from a single-thread perspective. Given how much developers have complained about it publicly, I'd have thought they'd really try to get the clockspeed higher, to 2.5GHz or so. That would amount to a near 50% increase in clockspeed, which I'd think would be an awfully nice bump in performance.
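
A quick back-of-the-envelope check of that figure (a sketch; the 1.6GHz base clock is the widely reported stock PS4 Jaguar clock, not something stated in this thread):

```python
# Hypothetical clockspeed bump from the assumed stock PS4 Jaguar clock
# (1.6 GHz, the widely reported figure) to the 2.5 GHz suggested above.
base_ghz = 1.6
target_ghz = 2.5
bump = (target_ghz - base_ghz) / base_ghz
print(f"{bump:.0%} clockspeed increase")  # ~56%, i.e. "near 50%"
```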

GPU architecture won't change much, if at all, from a backwards-compatibility standpoint. Polaris is an evolution of the same GPU currently in the PS4. If anything they're adding features, not taking them away. It's when you take away features that games depend on that you really run into major compatibility problems.

For RAM they're not changing capacity, just adding more bandwidth.

Performance-wise the GPU is getting the biggest upgrade and should perform around an R9 290 when it's not bandwidth-limited. For 1080p games it should be pretty awesome; dunno for sure about 4K. I think it'll still really depend on how much the developers want to optimize the game to pull that off.

Evil Fluffy
Jul 13, 2009

Scholars are some of the most pompous and pedantic people I've ever had the joy of meeting.

SlayVus posted:

I can imagine how many people will be purchasing a PS4.5(Can also be called PS Neo). The user base is already 40,000,000+. I can't imagine there would be any more market for the 4.5 to grab hold of. You might see an influx of people second hand selling to GameStop, eBay, and Craigslist.

Also, the PS Neo isn't coming out until the holiday season, so AMD probably doesn't have the manufacturing capacity for Polaris right now. Which might mean that when Polaris does release, there might be higher prices because of short supply.

You can look at the 3DS and the New 3DS sales to get a rough idea of how the PS4.5 might do. I'm sure there's a decent number of existing early PS4 buyers who might trade theirs in for an upgraded version just like people who swap for new cell phones every year or two. :shrug:

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

PC LOAD LETTER posted:

GPU architecture won't change much, if at all, from a backwards-compatibility standpoint. Polaris is an evolution of the same GPU currently in the PS4. If anything they're adding features, not taking them away. It's when you take away features that games depend on that you really run into major compatibility problems.

The problem is screwing over customers who have regular PS4s. Sony is enforcing that games have to run on the old hardware, but doesn't say anything about how well they have to run. There are 3DS games that run at a much lower frame rate, for a worse experience, on a regular 3DS. At the same time this also means developers can't really take advantage of the new hardware as much as they otherwise would because development effort spent on PS4K exclusive features won't be experienced by all users. I imagine most games will just detect if it's a PS4K and turn up some graphics settings and gain a more consistent frame rate.

I bought my PS4 nine months ago and I've not even played it once. I was going to play Bloodborne but never found the time. I probably won't upgrade; the few exclusives I do plan to play probably won't gain too much from the new hardware.

objects in mirror
Apr 9, 2016

by Shine
Will 60fps @ 4k be possible this upcoming generation with a single card?

PC LOAD LETTER
May 23, 2005
WTF?!

Desuwa posted:

The problem is screwing over customers who have regular PS4s. Sony is enforcing that games have to run on the old hardware, but doesn't say anything about how well they have to run.
Can you link to where they say that? Not doubting, and maybe my google skills suck, but I can't find anything official where they say that.

If they're indeed letting older PS4s run new PS4K games like poo poo, then yes, that is bad. If they just reduce the resolution/IQ until it runs OK on PS4s, while still looking decent and with decent fps, then I don't see that as a problem.

objects in mirror posted:

Will 60fps @ 4k be possible this upcoming generation with a single card?
Probably, but likely only on the high end if you want 60fps at 4K in most every game coming out for the next few years. You'll probably have to wait until near the end of the year to see exactly what this generation's high-end cards will do and cost.

SlayVus
Jul 10, 2009
Grimey Drawer

Desuwa posted:

The problem is screwing over customers who have regular PS4s. Sony is enforcing that games have to run on the old hardware, but doesn't say anything about how well they have to run.

The only thing I'm finding Sony taking a stance on is that developers must make the game playable in both modes, Basic and Neo mode. Also, even games currently available on the market will get NO benefit from the new PS4 UNLESS the developers patch it into the game. So Just Cause 3, Fallout 4, and The Witcher 3 will not run better unless they get patched for it.

PC LOAD LETTER posted:

If they're indeed letting older PS4s run new PS4K games like poo poo, then yes, that is bad.
The way I have been interpreting the information so far is this:

Game must run in Basic mode, so as not to alienate any players.
Game must not have any Neo-mode-only features (e.g. co-op only in Neo mode!).
If the developer does not make a Neo mode, the game can't benefit from the PS Neo at all.
Neo mode games will run at 1080p and then be upscaled to 4K however the developer sees fit; Basic mode can be upscaled to 1080p from any lower resolution.

Also, the PS4B will NOT get 4K upscaling. Only the PS4N will do 4K upscaling.

SlayVus fucked around with this message at 07:00 on Apr 21, 2016

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

SlayVus posted:

The only thing I'm finding Sony taking a stance on is that developers must make the game playable in both modes, Basic and Neo mode. Also, even games currently available on the market will get NO benefit from the new PS4 UNLESS the developers patch it into the game. So Just Cause 3, Fallout 4, and The Witcher 3 will not run better unless they get patched for it.

The way I have been interpreting the information so far is this:

Game must run in Basic mode, so as not to alienate any players.
Game must not have any Neo-mode-only features (e.g. co-op only in Neo mode!).
If the developer does not make a Neo mode, the game can't benefit from the PS Neo at all.
Neo mode games will run at 1080p and then be upscaled to 4K however the developer sees fit; Basic mode can be upscaled to 1080p from any lower resolution.

Also, the PS4B will NOT get 4K upscaling. Only the PS4N will do 4K upscaling.

Yeah, that's what I said. The games have to support Basic mode, but there is nothing that says they have to perform as well or can't take cheap graphical shortcuts. It's unlikely to happen too often, because good publishers would be wary of backlash if their game only averages 20fps or has tons of stutter, but significant dips below 30 or 60fps already happen in current PS4 games. I imagine that will only get worse with games that target the PS4N and are then hastily cut down to work on the PS4B without too much time spent.

I could see a developer targeting the PS4N and making the basic version by just reducing one or two settings by a lot, like draw distance or texture quality, because taking a more balanced approach would be too much work.

Hell, one of the games that just performs really poorly on the old-model 3DS is the new Hyrule Warriors game, which isn't an unknown title by a tiny studio or anything. If it can happen to a spinoff Zelda game it can happen to any game.

PC LOAD LETTER
May 23, 2005
WTF?!

Desuwa posted:

I could see a developer targeting the PS4N and making the basic version by just reducing one or two settings by a lot, like draw distance or texture quality, because taking a more balanced approach would be too much work.
This is what I'd expect to happen, though maybe I'd also add a resolution reduction (i.e. 720p or some other wonky resolution between 720p and 1080p) and cutting some other minor IQ effects to maintain a decent level of performance. A PS4 running side by side vs a PS4K would probably be an easily noticeable difference, but that doesn't necessarily mean the PS4 games would look or run like poo poo, especially if they're both run on a common-sized (60" or less) 1080p TV at typical couch viewing distances.

A very large (70"+) high-end 4K TV is where games on the PS4 will probably look crappy vs a PS4K, even at typical couch viewing distances. 1080p starts to get a bit blockish at such sizes even if you sit far away from the screen, while 4K will still stay pretty smooth until you get to silly-sized screens that only projectors can pull off.
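
To put rough numbers on the blockiness claim, here's a quick pixel-density sketch (just diagonal pixels over diagonal inches; the 70" panel size comes from the post above):

```python
import math

def ppi(diagonal_in, w_px, h_px):
    # pixels per inch along the screen diagonal
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(70, 1920, 1080)))  # 1080p on a 70" panel: ~31 PPI
print(round(ppi(70, 3840, 2160)))  # 4K on the same panel: ~63 PPI
```

4K doubles the pixel density at the same panel size, which lines up with the "blockish at such sizes" effect described above.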

Anime Schoolgirl
Nov 28, 2002

PC LOAD LETTER posted:

CPU architecture doesn't change at all. They're not even offering more cores, just more clockspeed, which is good for single-thread performance, and that is where Jaguar suffers the most. For a 2016 CPU it's still wimpy from a single-thread perspective. Given how much developers have complained about it publicly, I'd have thought they'd really try to get the clockspeed higher, to 2.5GHz or so. That would amount to a near 50% increase in clockspeed, which I'd think would be an awfully nice bump in performance.
They'd have to move to full construction cores for that; Jaguar cannot reliably clock above 2.3GHz. (For reference, a Jaguar core is the same core used in AM1.)

I'm not sure if they're making a 14nm version of the Jaguar core, though, and if they're switching the GPU to Polaris, the CPU portion also has to follow the same die shrink if they want to keep cooling/interconnects simple.

PC LOAD LETTER
May 23, 2005
WTF?!

Anime Schoolgirl posted:

Jaguar cannot reliably clock above 2.3GHz
I'd thought they were just heat/power-limited to 25W on 28nm, which was causing the modest clockspeeds, but OK.

Anime Schoolgirl posted:

I'm not sure if they're making a 14nm version of the Jaguar core
It's an APU, so I don't think they have much choice if they want to maintain compatibility and keep costs in line. Doing an MCM or an interface through an interposer would be expensive and would probably require changes to the CPU-to-GPU bus performance or latency.

Anime Schoolgirl
Nov 28, 2002

PC LOAD LETTER posted:

I'd thought they were just heat/power-limited to 25W on 28nm, which was causing the modest clockspeeds, but OK.
You can feed AM1 (Jaguar) as much power as it wants and use a 95W cooling solution; there's just nothing but the silicon lottery that determines whether your chip can actually break that barrier.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

I think a low clock ceiling is part of the sacrifice inherent to high density libraries, and for the power ranges targeted most of the time it's not a big deal.

PC LOAD LETTER
May 23, 2005
WTF?!
Wow, that is kinda crappy. Sorry for the sarcasm/wrongness then. Seems even Puma and Puma+ didn't clock much higher either.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Anime Schoolgirl posted:

You can feed AM1 (Jaguar) as much power as it wants and use a 95W cooling solution; there's just nothing but the silicon lottery that determines whether your chip can actually break that barrier.

Especially considering that as a single monolithic part, there's no way to take the world's tiniest chainsaw and cut it up to bin a console processor as a lower-spec product.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
This is why I'm taking the Jaguar cores with a grain of salt: Excavator+ shows it can easily match Jaguar in power consumption and heat, yet also be vastly faster. Excavator+ could likely do 8 cores running @ 2.5GHz for 40W, if Stoney is anything to go by, while having a large IPC increase touching stock Sandy.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FaustianQ posted:

This is why I'm taking the Jaguar cores with a grain of salt: Excavator+ shows it can easily match Jaguar in power consumption and heat, yet also be vastly faster. Excavator+ could likely do 8 cores running @ 2.5GHz for 40W, if Stoney is anything to go by, while having a large IPC increase touching stock Sandy.

This is why sticking with Jaguar makes no sense. There's Puma, there's Puma+ (Carrizo-L), but much more interestingly there's Carrizo. 8 Carrizo cores at somewhere a bit over 2GHz would be an excellent upgrade for the PS4.

EdEddnEddy
Apr 5, 2012



Just saw that Newegg has a deal on MSI 980Ti's for $529 after rebate.

Rather good price for what is still a fast card, but man, talk about a fire sale. Kinda wish I'd returned my STRIX, but at the same time I still feel the wait for the next Ti might be quite a while...

SlayVus
Jul 10, 2009
Grimey Drawer

EdEddnEddy posted:

Just saw that Newegg has a deal on MSI 980Ti's for $529 after rebate.

Rather good price for what is still a fast card, but man, talk about a fire sale. Kinda wish I'd returned my STRIX, but at the same time I still feel the wait for the next Ti might be quite a while...

With how retailers are selling these cards, it really makes you wonder what's coming. A 980 Ti for the price of a 980; what kind of world are we living in?

Naffer
Oct 26, 2004

Not a good chemist

SlayVus posted:

With how retailers are selling these cards, it really makes you wonder what's coming. A 980 Ti for the price of a 980; what kind of world are we living in?

I'm certain the retailers don't know what's coming, and are just worried that they'll get stuck with old stock if the new cards are super desirable.

EdEddnEddy
Apr 5, 2012



Gotta give Acer some credit, I guess; their next gaming laptops look to run 980s (not "M") so it's nice to see them take a chance on gaming/VR more than in the past. More competition in that arena is always better.

http://www.engadget.com/2016/04/21/acer-predator-gaming-pcs/?sr_source=Twitter

SlayVus posted:

With how retailers are selling these cards, it really makes you wonder what's coming. A 980 Ti for the price of a 980; what kind of world are we living in?

Last I remember, the 780 Ti came and went at a premium since it was only a short while before the 900 series arrived to cancel it out. The 980 Ti seems to have had a much longer lifespan before the new stuff arrived, and it's interesting to see it being phased out with nothing new announced/available yet.

Yay if a 1070 is as fast as a 980 Ti is now, but that leaves no incentive for me to upgrade for VR until I see what the next Ti can do. I want to be able to run Project CARS at max with a single GPU, and all the rumors seem to point to that possibility, but I doubt it will be anything but the top of the high end, of course, only one generation later.....

EdEddnEddy fucked around with this message at 17:27 on Apr 21, 2016

Truga
May 4, 2014
Lipstick Apathy
Don't worry, SLI for VR is really easy, you just render each eye on one GPU, and it will drop any moment now!


With how much hype there was for this, I'm glad I didn't actually get a 2nd 980Ti yet, because AFAIK nothing actually works with SLI VR yet. :v:

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Is AMD XCF any better?

Truga
May 4, 2014
Lipstick Apathy
AFAIK CrossFire VR is just as nonexistent.

EdEddnEddy
Apr 5, 2012



Truga posted:

Don't worry, SLI for VR is really easy, you just render each eye on one GPU, and it will drop any moment now!


With how much hype there was for this, I'm glad I didn't actually get a 2nd 980Ti yet, because AFAIK nothing actually works with SLI VR yet. :v:

It worked great when we had Extended mode, but now with the actual VR direct-to-HMD mode they have yet to do anything with it, and I feel they're waiting for DX12 VR SLI support to roll out.


Really though, the other issue is the latency between the 2 cards. While normal SLI doesn't have a problem with it, with VR the latency between the cards can increase a ton based on the PCI-E lanes and spec you are running. Technically you need PCI-E 3.0 x16 to get near .00X latency, where 2.0 x16 or 3.0 x8 = 1-4~.XXX latency numbers, which is bad.

That limits your market not only to the few with SLI, but the few with X79/X99 systems with 40 3.0 lanes...
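
For a rough sense of scale on the bus side (a sketch, not measured numbers: the eye-buffer size assumes the Rift CV1's 1080x1200-per-eye panels, and the link rates are approximate effective PCIe throughputs, not the latency figures quoted above):

```python
# Time to copy one eye's RGBA8 framebuffer between GPUs over PCIe,
# at approximate effective link throughputs (bytes per second).
eye_bytes = 1080 * 1200 * 4  # one 1080x1200 eye buffer, 4 bytes/pixel
links = {"PCIe 2.0 x16 / 3.0 x8": 8e9, "PCIe 3.0 x16": 15.75e9}
for name, rate in links.items():
    ms = eye_bytes / rate * 1e3
    print(f"{name}: {ms:.2f} ms per eye copy (90Hz frame budget ~11.1 ms)")
```

Fractions of a millisecond either way, but at 90Hz every fraction of the 11.1ms frame budget counts, which is why halving the link rate hurts VR SLI more than normal SLI.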

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered
What about those cards that are 2 GPUs on 1 board? Does that solve that problem? I realize that's a tiny addressable market, I'm just curious if that particular problem goes away.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

greasyhands posted:

What about those cards that are 2 GPUs on 1 board? Does that solve that problem? I realize that's a tiny addressable market, I'm just curious if that particular problem goes away.

Not really, they're the same as two cards from a software perspective.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Would a 650 Ti Boost work when powered from only one Molex connector? The PSU only has one (the rest are SATA; no 6-pin), and I cut off one of the Molex connectors from my adapter for a different project anyway. I'd rather make this work than swap in a PSU from another PC that works fine now.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
No

Pollyzoid
Nov 2, 2010

GRUUAGH you say?
Has anyone had Afterburner somehow set absurdly high core and memory clocks by itself? I just had to figure out why I suddenly started getting artifacts under a minute after booting. Turns out Afterburner had set both clocks to +1000 MHz. I haven't even been running overclocks for months.

penus penus penus
Nov 9, 2014

by piss__donald

Pollyzoid posted:

Has anyone had Afterburner somehow set absurdly high core and memory clocks by itself? I just had to figure out why I suddenly started getting artifacts under a minute after booting. Turns out Afterburner had set both clocks to +1000 MHz. I haven't even been running overclocks for months.

Do you have a roommate or a cat

penus penus penus
Nov 9, 2014

by piss__donald

Naffer posted:

I'm certain the retailers don't know what's coming, and are just worried that they'll get stuck with old stock if the new cards are super desirable.

Those who make these decisions often have a pretty good idea. Why offer price cuts on a desirable item if it's public knowledge that production has stopped?

EdEddnEddy
Apr 5, 2012



Pollyzoid posted:

Has anyone had Afterburner somehow set absurdly high core and memory clocks by itself? I just had to figure out why I suddenly started getting artifacts under a minute after booting. Turns out Afterburner had set both clocks to +1000 MHz. I haven't even been running overclocks for months.

Why does anyone (everyone) seem to still be using MSI Afterburner?

Afterburner hasn't been good since the 500-series era or earlier.

Switch to EVGA Precision and watch nearly all the clock/fan problems disappear, since it actually works.


penus penus penus
Nov 9, 2014

by piss__donald

EdEddnEddy posted:

Why does anyone (everyone) seem to still be using MSI Afterburner?

Afterburner hasn't been good since the 500-series era or earlier.

Switch to EVGA Precision and watch nearly all the clock/fan problems disappear, since it actually works.

I've only ever had a problem with EVGA Precision, lol. But if I (or anybody) had an issue with Afterburner, I'm sure I'd switch, since they do the same thing.
