uhhhhahhhhohahhh
Oct 9, 2012

Agreed posted:

If you're gaming at a higher resolution, you might be in for SLI sooner than you'd like, as even at 1080p the very powerful GTX 780 won't lock frames in at 60FPS minimum in all modern games. 30, sure, 45 usually, but 60 is a tough nut to crack with the kind of graphics we have today. The option is always there to turn down settings, but who wants to buy the best card made just to turn down stuff? :holy:

This is demoralizing. I was planning to get a 27" 1440p screen soonish (my old Samsung 226BW is constantly emitting some high-pitched whining sound and I think it's giving me a Constant Headache) and was hoping I'd never have to use SLI considering all the problems it has. I only have a GTX 580 right now, and I just bought a 550W PSU for £100 in November 2012 because it was much quieter than the 900W one I had for ~5 years (and sold). Was hoping a single GTX 770 would be enough, but looking over benchmarks even SLI 770s are barely enough for 60fps @ 1440p right now. Who knows how much lower that will go when newer games come out at the end of this year and in 2014. I probably wouldn't even be able to get 30fps at 1440p with my current GPU.

Not that a constant 60fps minimum matters to me in anything other than Dota 2 and probably BF4 multiplayer, since they're the only games I'll end up playing for more than a month most likely. I can tolerate 30-60fps in some random single-player game I would play for a week, but having to get a new ~£150-200 850W PSU, plus 2x GPUs for £800 that can barely push 60fps minimum, on top of the £550 monitor, wouldn't be the greatest investment... especially considering I'm Unemployed As gently caress right now.

If anyone itt has a 27" 1440p screen, some free time and a decentish camera, could you do some comparison photos of different games at 1080p and 1440p fullscreen as an experiment to see how blurry it is?

uhhhhahhhhohahhh fucked around with this message at 16:06 on Jun 24, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo
You won't have problems maintaining 60FPS in Dota 2 at 2560x1440 with a single 770. No one can say anything about BF4 yet since it's not out.

craig588 fucked around with this message at 16:05 on Jun 24, 2013

synthetik
Feb 28, 2007

I forgive you, Will. Will you forgive me?
So I found out I have no need for a 780. FEZ runs fine without the upgrade. Selling it for $535 shipped in SA-Mart if anyone is interested.

http://forums.somethingawful.com/showthread.php?threadid=3556286

Animal
Apr 8, 2003

synthetik posted:

So I found out I have no need for a 780. FEZ runs fine without the upgrade. Selling it for $535 shipped in SA-Mart if anyone is interested.

http://forums.somethingawful.com/showthread.php?threadid=3556286

Replied in SMart thread!

Anyone want an EVGA 670 for $250 priced for our people only? :)

(Sorry to turn the GPU thread into a Marrakesh market)

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
If'n'I hadn't just bought Agreed's 680 :argh:

Nah, I think this shuffle is kinda cool, personally. I don't want to see it get out of hand, but it's neat to see high-end cards shuffled down the enthusiast chain.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Animal posted:

Replied in SMart thread!

Anyone want an EVGA 670 for $250 priced for our people only? :)

(Sorry to turn the GPU thread into a Marrakesh market)

That is one seriously awesome mate's rates discount; send him a thank-you card (Hallmark or better, drat it) when you make payment or you're not allowed in the 780 club :colbert:

Also, three cheers for sensibility, may it never fall on my fractured brain.

Animal
Apr 8, 2003

I am a little tight economically but this 780 will be a nice birthday gift for myself from the GPU thread :) I'll make sure I pay it forward with a good deal on the 670.

I actually prefer the blower models, since my CPU's radiator fans are the only two fans blowing air out of the case; I don't want them pulling in a lot of hot overclocked-780 air and warming up the CPU.

I can't wait to try that fancy temperature-based throttling!

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

uhhhhahhhhohahhh posted:

If anyone itt has a 27" 1440p screen, some free time and a decentish camera, could you do some comparison photos of different games at 1080p and 1440p fullscreen as an experiment to see how blurry it is?
I was literally just talking about this in the monitor thread; most games downscale really, really well to 1080p on a 27".

uhhhhahhhhohahhh
Oct 9, 2012
I know, I was there. I just wanted to see a photo. I probably should've asked there instead. I couldn't find any examples of it on Google or YouTube. It's not that I didn't believe you, it's just that I tried dropping mine from 1680x1050 to 1440x900 in a game and it looked like poo poo... 2560x1440 to 1920x1080 is a much bigger jump.

uhhhhahhhhohahhh fucked around with this message at 19:26 on Jun 24, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
A 1920x1200 screen lowered to 1440x900 has the same ratio of "rendered pixels per physical pixel," so that's probably easier to get pictures of.
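The arithmetic behind that, as a quick sketch (the resolutions are the ones being discussed in the thread; two drops with the same linear scale factor should blur about the same):

```python
# Linear scale factor when rendering below native resolution:
# rendered pixels per physical pixel along each axis.
def scale_ratio(rendered, native):
    rw, rh = rendered
    nw, nh = native
    return (rw / nw, rh / nh)

print(scale_ratio((1440, 900), (1920, 1200)))   # (0.75, 0.75)
print(scale_ratio((1920, 1080), (2560, 1440)))  # (0.75, 0.75) -- same scale factor
print(scale_ratio((1440, 900), (1680, 1050)))   # ~(0.857, 0.857) -- the drop tried earlier
```

Both target drops work out to 0.75 rendered pixels per physical pixel on each axis, which is why the 1920x1200-to-1440x900 comparison is a fair stand-in.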

Incredulous Dylan
Oct 22, 2004

Fun Shoe

Agreed posted:

Where's the marketing BS at that you're trying to avoid?

When I look at the stuff promised for gaming with PhysX or real-time Lara Croft ponytail modelling, it just strikes me as absurd. This stuff is always used in only two or three of the big games and, with the exception of Arkham Asylum/City and Borderlands 2, barely even in those. 3D Vision was poorly supported by developers and only a handful of games used it correctly. God forbid you played a game on Unreal Engine 3 unless you enjoy your brain being split in half by lighting issues.

For the 280, I remember it really bumming me out when I couldn't run Just Cause 2 in 3D on the highest settings without bad performance issues. It seems like all of the games I was excited to experience in 3D that came out just a few months later hosed the playability/fidelity ratio. 3D performance was pretty unreliable for new games right up until the 680, which started reliably putting out the numbers for a great playable experience. The difference between a 480 and 680 for me in Battlefield 3 was night and day for sure!

edit: vv Man, talk about setting it up and knocking it down!

Incredulous Dylan fucked around with this message at 20:57 on Jun 24, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
You mention 3D a lot, so this might interest you: With DirectX 11.1 (GCN Radeon or Kepler GeForce on Windows 8), Stereoscopic 3D is a 100% standardized feature that does not need particular input from either AMD or Nvidia.

Caveat: besides needing a new enough card and Windows 8, there's currently only one DX11.1 game out, Star Trek Online.

Thoom
Jan 12, 2004

LUIGI SMASH!
I remember hearing that having multiple monitors plugged into a GPU would cause it to never properly idle and waste a lot of power/heat due to vblank syncing issues. Is this still a thing with modern cards (specifically the 780)?

I'm trying to figure out whether I should plug my second monitor and TV (neither of which is likely to be used for gaming) into the integrated graphics or into the GPU on my new PC.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Thoom posted:

I remember hearing that having multiple monitors plugged into a GPU would cause it to never properly idle and waste a lot of power/heat due to vblank syncing issues. Is this still a thing with modern cards (specifically the 780)?

I'm trying to figure out whether I should plug my second monitor and TV (neither of which is likely to be used for gaming) into the integrated graphics or into the GPU on my new PC.

The same fundamental problem exists, but it's greatly mitigated by a few factors:

  • Kepler and GCN Radeons have lower power consumption in general and many more intermediate power states to choose from, so they get less hot.
  • Certain multi-monitor configurations don't trigger the problem.

The GPU will use idle clocks if it is driving either one "screen" or two identical "screens." A Surround or Eyefinity array counts as one screen, so a 3x or 5x setup etc. will run at idle clocks. If you add another monitor that's not part of the array, the card will clock up. Two identical screens (same refresh rate, same bit depth, same resolution) will also run at idle clocks, but if they differ in the slightest, the GPU will clock up.

So if your TV and monitor are both 1080p @ 60 Hz, the card may well run at idle clocks. Alternatively, the clock-up problem with a new GPU may be such a small use of extra power that it's a wash between running them both on the dGPU and splitting one off to the IGP.
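If you want to check what your own card is actually doing, here's a minimal sketch using the NVML Python bindings (pynvml); it assumes the package and a reasonably recent driver are installed, and it only reads state, it doesn't change anything:

```python
# Report the GPU's performance state and clocks so you can see whether
# a multi-monitor setup is holding the card out of its idle state.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetPerformanceState, nvmlDeviceGetClockInfo,
                    NVML_CLOCK_GRAPHICS, NVML_CLOCK_MEM)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)          # first GPU in the system
    pstate = nvmlDeviceGetPerformanceState(handle)  # P0 = full 3D ... P8/P12 = idle
    core = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_GRAPHICS)  # MHz
    mem = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_MEM)        # MHz
    print("P{}: core {} MHz, memory {} MHz".format(pstate, core, mem))
finally:
    nvmlShutdown()
```

If it reports P8 (or a lower-power state) with both displays attached, you're getting idle clocks; if it sits at a higher state with nothing running, you're hitting the clock-up behavior described above.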

Thoom
Jan 12, 2004

LUIGI SMASH!
The two monitors are 1440p and the TV is 1080p. I guess I'll try it both ways and see how each one performs.

David Mountford
Feb 16, 2012

Thoom posted:

I remember hearing that having multiple monitors plugged into a GPU would cause it to never properly idle and waste a lot of power/heat due to vblank syncing issues. Is this still a thing with modern cards (specifically the 780)?

I'm trying to figure out whether I should plug my second monitor and TV (neither of which is likely to be used for gaming) into the integrated graphics or into the GPU on my new PC.

Thankfully, there is an easy solution to this problem: download NVIDIA Inspector, fire it up, right-click on the Show Overclocking button and select Multi Display Power Saver. It'll let you select the GPU you want it to manage, and then allow you to set a threshold of GPU utilization that triggers 3D clocks. Even with 3 identical displays attached to my Titan, the card still doesn't clock all the way down to the P8 power state without Multi Display Power Saver running. I did a Windows 8 reinstall yesterday and only got around to reinstalling this today, after noticing that the card was idling in the 50s at an 875 MHz core clock.
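For anyone who wants to watch what the tool is reacting to, here's a rough sketch of the same idea using the NVML Python bindings (pynvml). It only observes utilization and power state — it can't force clocks the way Inspector does — and the threshold value is just an example:

```python
# Poll GPU utilization and performance state once a second, and flag
# whether utilization is above a 3D-clock threshold (example value).
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetUtilizationRates, nvmlDeviceGetPerformanceState)

THRESHOLD = 30  # % GPU utilization -- example value, tune to taste

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)
    for _ in range(60):  # watch for about a minute
        util = nvmlDeviceGetUtilizationRates(handle).gpu
        pstate = nvmlDeviceGetPerformanceState(handle)
        verdict = "wants 3D clocks" if util >= THRESHOLD else "idle is fine"
        print("util {:3d}%  P{}  -> {}".format(util, pstate, verdict))
        time.sleep(1)
finally:
    nvmlShutdown()
```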

Peteyfoot
Nov 24, 2007
What's the easiest way to know when new Nvidia drivers have been released without running Nvidia Experience?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Check regularly, watch news sites that might happen to comment on it, etc.

Star War Sex Parrot
Oct 2, 2003

terre packet posted:

What's the easiest way to know when new Nvidia drivers have been released without running Nvidia Experience?
Windows Update :getin:

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."
I have a Maxcore 55 that I was considering off-loading on SA Mart. Would it be worth the effort?

Squibbles
Aug 24, 2000

Mwaha ha HA ha!
Betanews has (or used to have) a feature where you register with them and pick which products you want release notifications for (Nvidia drivers are one of them), and they email you when there's a new release. They notify for betas too, though, and I'm not sure if there's a way to filter for just the release versions.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Ghostpilot posted:

I have a Maxcore 55 that I was considering off-loading on SA Mart. Would it be worth the effort?

Eh. You might be surprised. Someone could be looking for a cheap card for an SLI upgrade because they don't pay for their own electricity or something.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Ghostpilot posted:

I have a Maxcore 55 that I was considering off-loading on SA Mart. Would it be worth the effort?
Probably. 260s (and anything from the 48xx era) are still decent enough at lower resolutions, so someone might want to pick one up for a family member's PC. I recycled an old 4870 by giving it to my brother since it was just taking up space, and it works fine on his old 1440x900 monitor. You could also list it on [H] or eBay and likely snag a buyer.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

terre packet posted:

What's the easiest way to know when new Nvidia drivers have been released without running Nvidia Experience?

I just refresh Guru3D once a day or so; I keep up with Afterburner that way too.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Incredulous Dylan posted:

When I look at the stuff promised for gaming with PhysX or real-time Lara Croft ponytail modelling, it just strikes me as absurd. This stuff is always used in only two or three of the big games and, with the exception of Arkham Asylum/City and Borderlands 2, barely even in those. 3D Vision was poorly supported by developers and only a handful of games used it correctly. God forbid you played a game on Unreal Engine 3 unless you enjoy your brain being split in half by lighting issues.

For the 280, I remember it really bumming me out when I couldn't run Just Cause 2 in 3D on the highest settings without bad performance issues. It seems like all of the games I was excited to experience in 3D that came out just a few months later hosed the playability/fidelity ratio. 3D performance was pretty unreliable for new games right up until the 680, which started reliably putting out the numbers for a great playable experience. The difference between a 480 and 680 for me in Battlefield 3 was night and day for sure!

edit: vv Man, talk about setting it up and knocking it down!

Well, UE4 supports PhysX, and it's going to be powering a ton of games, so I'm hoping for broader support with the next generation there on PC. If it doesn't materialize, the 650 Ti will be the last PhysX card I'll ever buy.

I always thought 3D was gimmicky crap; I think my definition of "gimmicky crap" is probably identical to your definition of "marketing garbage." Some people get a kick out of it, but for me it never clicked at all. No thanks. In a couple of years maybe the Rift will be awesome, but the current state of 3D is "I like it at IMAX but nowhere else."

Fair points made about being disappointed by that stuff, though. One thing that isn't a gimmick is dramatic generational increases in processing power, put to good use by games to run like crazy compared to older hardware. That's mainly what I look for as a single-GPU gamer, and so far I have not had trouble finding it. This isn't even a new generation, just a bad mother of a card with the big chip that we were speculating might never make it into consumer products at all.

BeanBandit
Mar 15, 2001

Beanbandit?
Son of a bitch!
What do we think the odds are on seeing a 6GB 780? I was kind of holding out for one, but I've been reading that Nvidia aren't allowing their partners to manufacture them because a 6GB 780 is basically a Titan.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

uhhhhahhhhohahhh posted:

This is demoralizing. I was planning to get a 27" 1440p screen soonish (my old Samsung 226BW is constantly emitting some high-pitched whining sound and I think it's giving me a Constant Headache) and was hoping I'd never have to use SLI considering all the problems it has. I only have a GTX 580 right now, and I just bought a 550W PSU for £100 in November 2012 because it was much quieter than the 900W one I had for ~5 years (and sold). Was hoping a single GTX 770 would be enough, but looking over benchmarks even SLI 770s are barely enough for 60fps @ 1440p right now. Who knows how much lower that will go when newer games come out at the end of this year and in 2014. I probably wouldn't even be able to get 30fps at 1440p with my current GPU.

Not that a constant 60fps minimum matters to me in anything other than Dota 2 and probably BF4 multiplayer, since they're the only games I'll end up playing for more than a month most likely. I can tolerate 30-60fps in some random single-player game I would play for a week, but having to get a new ~£150-200 850W PSU, plus 2x GPUs for £800 that can barely push 60fps minimum, on top of the £550 monitor, wouldn't be the greatest investment... especially considering I'm Unemployed As gently caress right now.

If anyone itt has a 27" 1440p screen, some free time and a decentish camera, could you do some comparison photos of different games at 1080p and 1440p fullscreen as an experiment to see how blurry it is?

For what it's worth, I have a GTX 680, and I have no trouble playing games on max settings at 1440p (except maybe turning AA down, or off, but you don't need lots of AA at that resolution anyhow). I also have a crappy old Phenom II X6 that is probably holding my 680 back from performing at its absolute best. If you bought a 780, which from memory has around a 30% performance improvement over the 680, you should be absolutely fine.

The whole "games must run at 60fps at all times" thing is basically sperglord bullshit in any case, unless you are playing an online FPS at the professional level or something. As long as your minimum fps doesn't drop below 30, games are going to play perfectly well.

The Lord Bude fucked around with this message at 06:24 on Jun 25, 2013

uhhhhahhhhohahhh
Oct 9, 2012
Thanks for the reassurance. I don't play professionally anymore but I still have some Care and like to not be bad. Not having a smooth 60fps in a game like Battlefield 4, if I end up playing it, will annoy me. The problem I have with spending ~£1500 on those upgrades is that I might get tired of BF4 in a month and be back to only playing Dota 2 and some old game like Heroes of Might and Magic 3 in an 800x600 window, plus whatever big new single-player game comes out, for its 6 hours of combined gameplay.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

uhhhhahhhhohahhh posted:

Thanks for the reassurance. I don't play professionally anymore but I still have some Care and like to not be bad. Not having a smooth 60fps in a game like Battlefield 4, if I end up playing it, will annoy me. The problem I have with spending ~£1500 on those upgrades is that I might get tired of BF4 in a month and be back to only playing Dota 2 and some old game like Heroes of Might and Magic 3 in an 800x600 window, plus whatever big new single-player game comes out, for its 6 hours of combined gameplay.

You know there's a modern widescreen-resolution mod for HoMM3, right? No need for 800x600 windows any more.

uhhhhahhhhohahhh
Oct 9, 2012
I've been using it, but it still looks like poo poo fullscreen. I like to play it in a window so I can look at other stuff while playing anyway.


AMD made a dumb video: https://www.youtube.com/watch?v=eH6XayaLTw8

uhhhhahhhhohahhh fucked around with this message at 12:52 on Jun 25, 2013

VorpalFish
Mar 22, 2007
reasonably awesometm

For what it's worth, I'm very happy gaming at 1440p with "just" a 670.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If you can't beat 'em, lie your rear end off.

My favorite part is where the nVidia PCB took the first hit from the sledgehammer without cracking; it just dented and bounced the sledgehammer back. If you've ever handled a 10 lb sledge before, that speaks to the durability of PCBs, drat. :stonk:

Edit: I disheartened somebody because a GPU won't put out 60FPS minimum in Crysis 3 with everything turned up? ... Ahem. Well, the deal there is that 60FPS minimum in Crysis 3 is a laughable pipe-dream. (Have you seen those graphics?) But you wouldn't ever notice playing it; with a 780 the game runs like CRAZY cranked. As previously well put, unless you're a competitive gamer who needs 60FPS minimum and throws a fit if it ever falls between 30 and 60, you're going to god damned love the GTX 780. It is a MONSTER for performance. Just don't expect unrealistic things like the most demanding games in the world obligingly turning into Super Mario Bros. just because you've got a nice card stuffed in your machine, dig?

Agreed fucked around with this message at 13:32 on Jun 25, 2013

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

uhhhhahhhhohahhh posted:

I've been using it, but it still looks like poo poo fullscreen. I like to play it in a window so I can look at other stuff while playing anyway.


AMD made a dumb video: https://www.youtube.com/watch?v=eH6XayaLTw8

Dear AMD, instead of wasting money on lovely ads that will alienate your customers, why not invest that money into drivers that work?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Honestly, unless they used an ad agency for that, it could have just been a side project among employees with video hobbies, touched up by the marketing department for release.

E: Actually, I'd say that the video is a smashing (:v:) success, because we're talking about AMD instead of the GTX 760 reviews. $249.

Factory Factory fucked around with this message at 14:39 on Jun 25, 2013

Wistful of Dollars
Aug 25, 2009

Factory Factory posted:

Eh. You might be surprised. Someone could be looking for a cheap card for an SLI upgrade because they don't pay for their own electricity or something.

Or their electricity is stupidly cheap. :smug:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

Honestly, unless they used an ad agency for that, that could have just been a side project among employees with hobbies that was touched up by the marketing department for release.

E: Actually, I'd say that the video is a smashing (:v:) success, because we're talking about AMD instead of the GTX 760 reviews. $249.

Glad to see the $249 rumor vindicated; that was the appropriate price point for launch, imo.

Also, talking about AMD doesn't do poo poo if we're all buying nVidia. Just sayin'. ;) We talked about Bulldozer a lot, too, but mainly to :stonk: at how awful it was. I don't think this will disrupt nVidia's marketing in the least, and it's not good enough to truly go viral. Cheese factor: 10; it's just going to start slapfights between nVidia and AMD die-hards in YouTube comment sections.

Agreed fucked around with this message at 14:54 on Jun 25, 2013

synthetik
Feb 28, 2007

I forgive you, Will. Will you forgive me?

uhhhhahhhhohahhh posted:

I've been using it, but it still looks like poo poo fullscreen. I like to play it in a window so I can look at other stuff while playing anyway.


AMD made a dumb video: https://www.youtube.com/watch?v=eH6XayaLTw8

This is OUYA levels of marketing. I think the 'gamer' is trying hard not to laugh during the destruction montage.

uhhhhahhhhohahhh
Oct 9, 2012

I wasn't referring to Crysis 3 in particular; I've already played it, finished it, and probably won't play it again for 3 years. It was more that, if other modern games are barely hitting 60fps at 1440p right now, how bad will it be in 3-6 months when BF4 and the others start coming out?

uhhhhahhhhohahhh fucked around with this message at 14:56 on Jun 25, 2013

GigaFuzz
Aug 10, 2009

terre packet posted:

What's the easiest way to know when new Nvidia drivers have been released without running Nvidia Experience?

You could have them email you when new ones come out:
http://www.geforce.co.uk/drivers
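If you'd rather script something, here's a small sketch using the NVML Python bindings (pynvml) that reads the driver version you currently have installed, so you can compare it against whatever the sites above list; the bookkeeping file is purely illustrative:

```python
# Remember the installed NVIDIA driver version between runs and
# report when it changes. The version-file name is just an example.
import os
from pynvml import nvmlInit, nvmlShutdown, nvmlSystemGetDriverVersion

VERSION_FILE = "last_driver_version.txt"  # hypothetical bookkeeping file

nvmlInit()
try:
    installed = nvmlSystemGetDriverVersion()
    if isinstance(installed, bytes):  # older pynvml releases return bytes
        installed = installed.decode()
finally:
    nvmlShutdown()

previous = None
if os.path.exists(VERSION_FILE):
    with open(VERSION_FILE) as f:
        previous = f.read().strip()

if previous and previous != installed:
    print("Driver changed: {} -> {}".format(previous, installed))
else:
    print("Installed driver: {}".format(installed))

with open(VERSION_FILE, "w") as f:
    f.write(installed)
```

It only tells you what's installed, not what's newly released — for that, the email list is still the way to go.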

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

New beta drivers are out for nVidia cards, by the way. Party time, excellent.

uhhhhahhhhohahhh posted:

I wasn't referring to Crysis 3 in particular; I've already played it, finished it and probably won't play it again for 3 years. It was more that if other modern games are barely getting 60fps right now at 1440p, how bad will it be in 3-6 months when BF4 and others start coming out.

I really wouldn't worry about it. The user experience with current high-end graphics is as good as it has ever been. Game engines are forward-looking in ways that they have only occasionally been in the past. It's much more common now to have options that you need either a multi-card setup or future, more powerful hardware to crank. And yet the 780 cranks Far Cry 3 (Dunia 2 engine), Crysis 3 (well, you know that one), Tomb Raider, BioShock Infinite, Metro 2033 (without its screwy ADoF, which looks like garbage anyway, it can average over 80FPS), etc., etc., etc. just fine.

1440p is an in-betweener and isn't as demanding as 1600p, but even if it were, you wouldn't see FPS below 30, and that's where you really start to feel it. As far as competitive ultra-high-end gaming goes, you're either in or you're out. Find a way to hit your framerate target, be it by turning down options intended for the future or for multi-card setups, or by going with additional hardware. That segment is maybe the top 5% of the enthusiast gaming market; it's not unusual that you'd need extraordinary stuff to keep up with other people building systems around SB-E or IVB-E (soon) and 32GB of RAM. If money is no object and performance is, go that route. Otherwise, be delighted with higher FPS than any other option can give you today, and gameplay experiences that are unmatched in both smoothness and image quality.

Agreed fucked around with this message at 15:13 on Jun 25, 2013
