Wistful of Dollars
Aug 25, 2009

It only dawned on me on reflection that none of the articles mentioned it was limited to TN panels; they only used them as the demo machines. I was caught up on the thought of having to go TN to enjoy the technology, but I guess if it works as advertised IPS will be able to deliver the smoothness benefits of 120/144hz with IPS quality. :911:

Gamer2210
Nov 15, 2011

Agreed posted:

This one's easy I hope I beat FactoryFactory to it fast typing skills GO!

1. Upgrade when you can afford to get the thing that you want. The future will hold much better and more powerful components, that is always the case, and the best you can do there is pay attention to product release schedules and hope that you don't invest at a bad time. I didn't do a very good job of assessing the value proposition of replacing my high-performing GTX 680 with a GTX 780, and as a result I paid much more for it than I would have had to shortly after, and in addition didn't get any AMAZINGLY GOOD free game bundles (I did get 3Dmark so I can show off how stupid I am, though, rock and roll :argh:). My advice on this particular moment re: timing can be a little more specific, though, because we are an extremely short period of time away from AMD dropping the NDA on their high end cards and nVidia will respond accordingly. Wait to make this decision until it can be an informed one, because, for once, you actually have that luxury.

2. If your resolution is higher than 1440p or so, you're probably going to need a multiple-card setup, especially if you're using the better price:performance options available as opposed to going with the big ol' heavy hitters, cost-is-no-object style. Right now, nVidia's multi-monitor experience for gaming is better than AMD's because AMD got caught flat-footed and are still scrambling to get their frame pacing issues solved for even the most reasonable (let alone all) use cases. But don't jump on it right now; wait until we learn a bit more. If you do get a multiple-card setup, the extra VRAM is worth it on the higher-VRAM models. Note that people with two 680s are running away from them because, while the chip can handle all the throughput to make a perfectly nice very-high-res experience, relatively speaking, 2GB of VRAM isn't enough for such high resolutions and swapping has performance penalties that many find inexcusable given the cost of the setup.

3. As a bit of a qualifier to the above, do you know for a fact that you'll be adding multiple monitors? I mean, I'm an rear end in a top hat who is probably an NDA dropping and price adjustment away from selling a few guitar pedals to fund another goddamned GTX 780 and I'm still on 1080p (so I'll end up buying a bigger monitor, which has, to be fair, been on the to-do list for a while now, but still, watch the tail wag the dog). Be realistic and don't overspend, you'll turn out like me and I'm just, god, horrible.

I just noticed a minute ago that I'm not actually supposed to ask for advice about PC parts in this thread, so I apologize.
But thanks a lot for the help all the same, I appreciate it a lot.

I think I'll hold off on buying a GPU for the time being, until I learn more about these recent GPU releases: AMD's cards and Nvidia's 780 Ti.
As for the multi-display setup: well, I really want to do it. But since Nvidia just announced the G-Sync chip that'll be featured in future monitors, I'm afraid I'll buy a monitor too soon and regret it.

I've been saving money since early 2012, so waiting isn't an issue. My plan for now is to get a new GPU after waiting for news, and buy 3 monitors with a G-sync chip once it's released.
I was also advised not to wait for things to happen, but then I bought a GTX 580 just a little while before the 6xx series was released, so you can see why I'm hesitant to upgrade my PC when it looks like new products are about to be released.

Once again, thanks a lot for the reply :)

Gamer2210 fucked around with this message at 18:24 on Oct 19, 2013

fookolt
Mar 13, 2012

Where there is power
There is resistance

Gonkish posted:

I'd like to see it on more devices, but right now they're trying to push Shield so that will pretty much never happen.

Easy solution: sell it as a Steam program, print money. Hire me, Nvidia :crossarms:

Lolcano Eruption
Oct 29, 2007
Volcano of LOL.

El Scotch posted:

It only dawned on me on reflection that none of the articles mentioned it was limited to TN panels; they only used them as the demo machines. I was caught up on the thought of having to go TN to enjoy the technology, but I guess if it works as advertised IPS will be able to deliver the smoothness benefits of 120/144hz with IPS quality. :911:

Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times.

Mayne
Mar 22, 2008

To crooked eyes truth may wear a wry face.

Lolcano Eruption posted:

Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times.

There are plenty of those Korean 27-inch 2560x1440 IPS monitors with a 120hz refresh rate.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Are you guys thinking one 290 non-x should be able to do Battlefield 4 @ 2560x1600?

What about with another 24" and 20" attached?

Magic Underwear
May 14, 2003


Young Orc

Tab8715 posted:

Are you guys thinking one 290 non-x should be able to do Battlefield 4 @ 2560x1600?

What about with another 24" and 20" attached?

Well, we don't know for sure; no benchmarks have been released.

But, assuming the 290 competes with the 780, it should be able to handle it pretty well. Here are the benchmarks for BF3 maxed out at 1440p: http://www.anandtech.com/bench/GPU13/581. The 780 gets 65 fps, which is pretty drat good.

Extra monitors shouldn't change anything.

Wistful of Dollars
Aug 25, 2009

Lolcano Eruption posted:

Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times.

You mistake my meaning.

I meant that this technology appears to be able to give 60hz displays the same smoothness as (or perhaps superior smoothness to) 120/144 TN displays. I don't know if it has the ability to make IPS run faster than 60hz, but it doesn't need to, because it makes 60 (or even lower) hz/fps appear super smooth, and that smoothness is normally the only reason people buy 120/144 TN displays.

Assuming my assumptions about it are right, anyway.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

LooKMaN posted:

There are plenty of those Korean 27-inch 2560x1440 IPS monitors with a 120hz refresh rate.

There are a few that are potentially capable of refresh rates up to 120Hz, at unknown operational risk over time and with no warranty. That isn't the same thing.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Lolcano Eruption posted:

Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times.

My interest in this isn't to push 120 or 144 FPS, but so that I could have synchronized frames at 40-60 FPS, where I tend to run most stuff as someone who coasts on a mid-high GPU for 4 years at a time. If your machine ever drops below 60FPS, Vsync can be a pretty terrible experience; it'd be great to not have to deal with tearing just because I'm at 50fps.
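To put rough numbers on that, here's a quick back-of-the-envelope sketch (assuming plain double-buffered V-sync with no triple buffering, so every finished frame waits for the next refresh tick; the numbers are illustrative, not measurements):

code:

import math

# Rough model: with double-buffered V-sync, a finished frame can only be shown
# on a refresh boundary, so its display interval is the render time rounded up
# to the next multiple of the refresh interval.
def vsync_effective_fps(render_fps, refresh_hz=60):
    refresh_interval = 1000.0 / refresh_hz   # ms per refresh tick
    render_time = 1000.0 / render_fps        # ms to render one frame
    displayed_every = math.ceil(render_time / refresh_interval) * refresh_interval
    return 1000.0 / displayed_every

for fps in (120, 60, 59, 50, 40, 30):
    print(f"render {fps:3d} fps -> displayed at {vsync_effective_fps(fps):4.0f} fps on a 60 Hz panel")

Under that model, dropping from 60 to 59 rendered fps cuts the displayed rate straight to 30, which is exactly the cliff a variable-refresh scheme avoids.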

microwave casserole
Jul 5, 2005

my god, what are you doing
Can't you get 90% of what G-Sync is offering with a 120hz monitor? Being able to V-Sync at 40 and 60 frames per second gives you a lot of performance breathing room with very little impact on visuals. I guess until more IPSes can do 120hz this is a decent compromise.

If you're going for more than 60fps this will probably help, but that seems like a pretty niche market.
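For what it's worth, the breathing-room point is easy to see if you list which steady rates plain V-sync can actually hold at a given refresh rate (only rates that divide the refresh rate evenly; a rough sketch that ignores triple buffering and adaptive V-sync):

code:

# Steady frame rates plain V-sync can hold at a given refresh rate: only the
# rates that divide the refresh rate evenly. Anything in between falls to the
# next step down.
def vsync_steps(refresh_hz, floor=20):
    return [refresh_hz // n for n in range(1, refresh_hz // floor + 1)
            if refresh_hz % n == 0]

print("60 Hz :", vsync_steps(60))    # [60, 30, 20]
print("120 Hz:", vsync_steps(120))   # [120, 60, 40, 30, 24, 20]
print("144 Hz:", vsync_steps(144))   # [144, 72, 48, 36, 24]

So a 120hz panel gives you the 40 and 60 fps targets mentioned above, where a 60hz panel only has the ugly drop from 60 straight to 30.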

microwave casserole fucked around with this message at 23:17 on Oct 19, 2013

Purgatory Glory
Feb 20, 2005
John Carmack, Tim Sweeney, & Johan Andersson discuss AMD's Mantle at Nvidia's conference:
http://www.youtube.com/watch?v=3RF0zgYFFNk

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Purgatory Glory posted:

John Carmack, Tim Sweeney, & Johan Andersson discuss AMD's Mantle at Nvidia's conference:
http://www.youtube.com/watch?v=3RF0zgYFFNk

I liked how the video pretty much turned into a friendly discussion between Andersson and Carmack for a bit. Two game engine gurus just talking it up may be my new fetish. :fap:

As for what I gathered from this video, it seems that all parties aren't particularly excited about the existence of Mantle itself, but rather about the possibility of Mantle ushering in changes to APIs to allow more low-level access in general. They regard Mantle as the stepping stone needed to progress, but not as the end-all solution. An ideal solution would be one API allowing low-level access to the majority of architectures, regardless of brand. There are rumors that Mantle may actually be this (with it apparently being open to all), but they are rather baseless and don't seem to make much sense at this current time. That would be like NVIDIA announcing that G-Sync is able to be used on AMD GPUs right from the start. I think the ideal solution for both is to test the waters for a bit, iron things out, and maybe then allow competitors the opportunity to utilize your work.

Also, one of the NVIDIA guys apparently said something about how G-Sync might be able to be licensed to Intel or AMD in the future, so I have high hopes for that.

lllllllllllllllllll
Feb 28, 2010

Now the scene's lighting is perfect!
Hi thread, I'm a bit confused about various models like SC, GTI and TI. Which one would I want if I wanted a reasonably good card (Nvidia 660)? I'm more concerned about stability and noise than performance. Thanks.

Wow, this is helpful. Thanks! Probably mixed up "GTI" in my mind somehow. Sorry about that!

lllllllllllllllllll fucked around with this message at 20:20 on Oct 21, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

lllllllllllllllllll posted:

Hi thread, I'm a bit confused about various models like SC, GTI and TI. Which one would I want if I wanted a reasonably good card (Nvidia 660)? I'm more concerned about stability and noise than performance. Thanks.

SC = EVGA's shorthand for "superclocked," i.e. a factory-overclocked card. The hardware is no different from any other card of the same type, just a different clockrate and therefore slightly higher performance.

Ti = Nvidia branding for "Better than non-Ti, not as good as the next number up." Like, if you had a 660, 660 Ti, and 670, you could call the 660 Ti a "665" and it would mean the same thing. Nvidia did this in the 400 series (GeForce GTX 460, 465, and 470) but reintroduced the "Ti" branding for the 500 series (GeForce GTX 560, 560 Ti, 570). It's "Ti" like the chemical symbol for titanium.

I have no clue what "GTI" is. Googling suggests it's just a malapropism combining "Ti" with "GTX," the latter being Nvidia's common branding suffix for a high-performance GeForce card with SLI support (as opposed to GT/GTS for mid-low/no SLI, and GS or no suffix for crapola).

Factory Factory fucked around with this message at 10:49 on Oct 20, 2013

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

lllllllllllllllllll posted:

Hi thread, I'm a bit confused about various models like SC, GTI and TI. Which one would I want if I wanted a reasonably good card (Nvidia 660)? I'm more concerned about stability and noise than performance. Thanks.

You also probably don't want to buy a GTX 660 right now; the 760 or AMD 7950 are significantly better options at the moment.

Drunken Warlord
Jul 8, 2013

It's a dogs life.
I'm a bit confused as to what G-Sync entails. Is it an option like V-Sync that usually needs to be manually enabled in the application, or will it just be a passive feature of the card to remove tearing or whatever?

Animal
Apr 8, 2003

Drunken Warlord posted:

I'm a bit confused as to what G-Sync entails. Is it an option like V-Sync that usually needs to be manually enabled in the application, or will it just be a passive feature of the card to remove tearing or whatever?

It will be the card working in conjunction with a piece of silicon in a monitor. They will make sure the monitor only refreshes when the video card sends a frame, instead of refreshing at a set value (60hz/120hz) regardless of what the GPU pushes out (a bad thing.)

As to whether you have to enable it in a supported application or at the driver level, we don't know yet. Hopefully at driver level and working seamlessly with every application.
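If it helps, here's a toy timeline of the difference being described, assuming a hypothetical GPU that takes 22 ms per frame (~45 fps); the numbers are made up purely to show the scan-out timing, not taken from any G-Sync material:

code:

import math

RENDER_MS = 22.0           # hypothetical: GPU finishes a frame every 22 ms (~45 fps)
REFRESH_MS = 1000.0 / 60   # fixed 60hz refresh interval

frame_done = [RENDER_MS * (i + 1) for i in range(6)]   # when each frame is ready

# Fixed refresh with vsync: each frame waits for the next refresh tick.
fixed = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in frame_done]
# Variable refresh: the panel scans out as soon as the frame arrives.
variable = frame_done

for i, (f, v) in enumerate(zip(fixed, variable)):
    print(f"frame {i}: fixed refresh shows it at {f:6.1f} ms, variable refresh at {v:6.1f} ms")

Note how the fixed-refresh column lands at uneven gaps (16.7 ms, then 33.3 ms) even though the GPU is producing frames at a perfectly even 22 ms pace; that uneven cadence is the "bad thing", while the variable-refresh column just tracks the GPU.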

Animal fucked around with this message at 19:37 on Oct 20, 2013

Midee
Jun 22, 2000

I can't wait to see it in action. Just the idea that every frame will be rendered properly no matter what (so long as it's within a given threshold) still amazes me and kinda hurts my brain a bit...

Here's another in depth article with a bit of a history lesson: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate
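On the "within a given threshold" part: a variable-refresh panel still has a minimum and maximum refresh rate, so frame intervals outside that window get clamped. A tiny sketch of that caveat (the 30-144hz window here is an assumption for illustration, not a quoted spec):

code:

# A variable-refresh panel can only stretch its refresh interval so far, so
# frame times outside its supported window get clamped: very fast frames wait,
# very slow frames force the panel to refresh again with the old image.
MIN_HZ, MAX_HZ = 30, 144   # assumed panel window, purely illustrative
MIN_INTERVAL_MS = 1000 / MAX_HZ
MAX_INTERVAL_MS = 1000 / MIN_HZ

def scanout_interval(frame_interval_ms):
    return min(max(frame_interval_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for fps in (200, 144, 90, 60, 40, 25):
    interval = 1000 / fps
    print(f"{fps:3d} fps -> frame every {interval:5.1f} ms, panel refreshes every {scanout_interval(interval):5.1f} ms")

Inside the window the two columns match, i.e. every frame is shown exactly when it's ready; outside it you're back to the old behavior.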

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Rahu X posted:

As for what I gathered from this video, it seems that all parties aren't particularly thrilled at the existence of Mantle, but rather the possibility of Mantle ushering in changes to APIs to allow more low level access in general.
WDDM is designed for a really specific problem--3D accelerated desktop UIs--and all of its tradeoffs are built around that. This is why you can do things like "games generally don't crash when I alt-tab" and "I can see window previews when I alt-tab" and "badly behaved drivers don't cause BSODs." It's also why you get other things like "command buffer submission to the GPU takes forever" and "compute APIs are always going to be second-class citizens." WDDM was designed for NV40/G70 class hardware ten years ago, and it shows. If you remember back in the proverbial day, there was a proposal for WDDM 2.0 that was spectacularly unrealistic, like "all hardware must support instruction-level preemption" unrealistic (to my knowledge, no GPU supports instruction level preemption). MS finally added support for any sort of preemption in WDDM 1.2 (Win8), but they haven't done anything to address things like buffer queue overhead (not since they fixed something completely horrible in Vista with something less horrible in Win7), GPU page faulting, or shared memory machines.

The thing I'm most curious about with Mantle is how it will work alongside WDDM, because upon reflection and discussion with some similarly knowledgeable folks, none of us can figure out how you could get WDDM interoperability except in one of two ways:

1. a large static memory carve-out at boot and a second Mantle-specific device node, rendering into a D3D surface
2. only run on a platform that has a GPU with reasonable preemption (at least per-wavefront) and an AMD IOMMU

Of course, they could ship Mantle in a separate driver that blatantly circumvents WDDM and that they never attempt to get WHQL'd, but that seems unrealistic.

If you look at the HSA slides from Hot Chips, the driver they propose is definitely a response to the stagnancy of WDDM, but it's also mired in some unrealistic stuff (the idea that you can return from a GPU operation to a suspended user-mode process without entering the kernel is nonsense) and some pointless stuff; a standardized pushbuffer format was tried by MS briefly in the DX5/6 timeframe, I think, and it was a travesty that vendors all rebelled against.

(i know a lot about driver models, i should really write my own sometime)
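To make the "command buffer submission takes forever" point concrete, here's a toy cost model of why per-submission overhead is what thin APIs like Mantle go after. The costs are invented round numbers, not measurements of WDDM, D3D, or Mantle:

code:

# Toy model: each submission to the kernel/GPU has a fixed overhead on top of
# the per-draw recording cost, so tiny batches drown you in overhead while big
# batches amortize it. Numbers are invented for illustration.
SUBMIT_OVERHEAD_US = 50.0   # assumed fixed cost per submission
COST_PER_DRAW_US = 2.0      # assumed cost to record one draw call

def frame_cpu_time_ms(total_draws, draws_per_submission):
    submissions = -(-total_draws // draws_per_submission)   # ceiling division
    return (submissions * SUBMIT_OVERHEAD_US + total_draws * COST_PER_DRAW_US) / 1000.0

for batch in (1, 10, 100, 1000):
    print(f"{batch:4d} draws per submission -> {frame_cpu_time_ms(5000, batch):6.2f} ms of CPU time for 5000 draws")

Cutting the fixed cost per submission (and letting multiple threads build command buffers) is the sort of thing the panelists mean by wanting lower-level access in general, whatever API it ends up arriving in.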

Chuu
Sep 11, 2004

Grimey Drawer
What video cards support GSync? Somehow I completely missed this reading through all the tech articles.

Gonkish
May 19, 2004

Chuu posted:

What video cards support GSync? Somehow I completely missed this reading through all the tech articles.

I thought it was any Nvidia card?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Gonkish posted:

I thought it was any Nvidia card?

I believe it is only Kepler-based cards.

GrizzlyCow
May 30, 2011

Gonkish posted:

I thought it was any Nvidia card?

GeForce GTX 650 Ti Boost or higher.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend
I'm pretty sure it was only Kepler cards.

EFB

Purgatory Glory
Feb 20, 2005
This one contains some one-on-one time with John Carmack and Tim Sweeney:
http://www.youtube.com/watch?v=gbW9IwVGpX8

Purgatory Glory fucked around with this message at 01:18 on Oct 21, 2013

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
So I upgraded Windows to 8.1 and got the 3d Stereoscopic crap Factory Factory mentioned but now my SLI flat out refuses to turn on. Any ideas?

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

deimos posted:

So I upgraded Windows to 8.1 and got the 3d Stereoscopic crap Factory Factory mentioned but now my SLI flat out refuses to turn on. Any ideas?

I did a quick Google search and it looks like this isn't an uncommon issue - people with desktops and laptops have had graphics issues with SLI since upgrading to 8.1. Nvidia released a 326.01 driver for Win8.1, maybe remove the existing drivers and do a clean install with the new ones?

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!
Whiny-voiced nerds won't shut up about G-Sync:
https://www.youtube.com/watch?v=gbW9IwVGpX8

ethanol
Jul 13, 2007



I got my RMA'd Sapphire 7950 back today. Haven't plugged it in but it appears to be identical. I was really hoping for a Vapor-X since this card is discontinued. I guess they're still sitting in warehouses.


Edit: Never had to do this before, but I had to bend the metal tab a bit to clear the motherboard before the card would fit. At least it works.

ethanol fucked around with this message at 19:09 on Oct 21, 2013

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



I'm cross-posting this from the parts picking thread since I realized it may be more appropriate for this thread given the context, so hopefully this doesn't get me in trouble:

Out of curiosity, how undesirable is it to do either SLI or Crossfire (just two GPUs) on a board with a PLX chip? I'm considering someday going with 2x ASUS R9-280X Matrix's, which are the triple-slot coolers. To allow a slot for adequate air intake for the first card, I was looking at boards that place the second GPU lower, such as the ASRock Z77 WS or (ideally, to keep my Hackintosh hobby alive) the Gigabyte GA-Z77X-UP7. I know there is some overhead for the PLX chip but wasn't sure whether it should be avoided at all costs.

For reference, though, my motherboard and cards will be horizontal (I'm keeping a Corsair Air 540 on its side), so hot air shouldn't be rising from one card into the other the way it would in a vertical arrangement. Otherwise, are there any standard x8/x8 boards that could accommodate 2x triple-slot coolers?

Dr Cheeto
Mar 2, 2013
Wretched Harp
Is purchasing a video card with more memory than the reference card desirable? I've been considering the different models of GTX 760 out there, and several offerings (notably those from EVGA) possess 4GB as opposed to the reference 2GB amount.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Asked earlier, I'd have told you of course not, because video cards are designed with an amount of VRAM suitable for what they can actually render. Now? It depends on how much it costs, and I still wouldn't bother doing it with anything smaller, or choose extra VRAM over a more powerful GPU.

Even six months ago, only things like modded Skyrim (and apparently Bioshock Infinite on ridiculous detail settings) needed that kind of RAM, but that was before it sank in that eighth-generation consoles are going to have like 4 GB of VRAM to play with. We're about to see VRAM usage get a good solid kick in the rear end, so it might be worthwhile.
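For a rough feel of where the extra gigabytes go, here's some very crude arithmetic on single render-target sizes; real usage is dominated by textures and engine-specific buffers, so treat these as a floor, not a prediction:

code:

# One 32-bit (4 bytes/pixel) surface at various resolutions. A modern engine
# keeps many such surfaces around (G-buffer layers, shadow maps, post-process
# targets) on top of all its textures, so the per-surface cost multiplies fast.
def surface_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for w, h in ((1920, 1080), (2560, 1440), (2560, 1600), (5760, 1080)):
    print(f"{w}x{h}: one 32-bit surface = {surface_mb(w, h):5.1f} MB")

That scaling with resolution (plus bigger console-era textures) is why 2GB starts feeling cramped at 1440p and up while 3-4GB still has headroom.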

veedubfreak
Apr 2, 2005

by Smythe

Sir Unimaginative posted:

Asked earlier, I'd have told you of course not, because video cards are designed with an amount of VRAM suitable for what they can actually render. Now? It depends on how much it costs, and I still wouldn't bother doing it with anything smaller, or choose extra VRAM over a more powerful GPU.

Even six months ago, only things like modded Skyrim (and apparently Bioshock Infinite on ridiculous detail settings) needed that kind of RAM, but that was before it sank in that eighth-generation consoles are going to have like 4 GB of VRAM to play with. We're about to see VRAM usage get a good solid kick in the rear end, so it might be worthwhile.

Are you going back as far as the Atari? Did I miss a console somewhere?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004).

This is going to be some cool tech.

beejay
Apr 7, 2002

Check it:

http://en.wikipedia.org/wiki/History_of_video_game_consoles_(first_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(second_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(third_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(fourth_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(fifth_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(sixth_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(seventh_generation)
http://en.wikipedia.org/wiki/History_of_video_game_consoles_(eighth_generation)

That's a lot of video games!

beejay fucked around with this message at 22:48 on Oct 21, 2013

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Agreed posted:

I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004).

This is going to be some cool tech.

I basically did a 180-degree opinion turn in regards to what my next planned card would be down the road when I read about this the other day. Back when I had learned about Adaptive Vsync, it dawned on me that frame spewing was an issue that really could only be solved with workarounds. Not so, apparently. This attacks the problem directly and also helps me decide on what sort of monitor I'd buy with the card to finish my setup, on which I can uselessly while away the rest of my life.

Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:

I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004).

This is going to be some cool tech.

Yea, it's legit making me regret JUUUUST buying my U3014. I can't realistically turn on vsync, because I can't maintain 60 FPS solid with only one OCed GTX680, so I get lots of tearing. But vsync leads to all those weird feeling lags and stutters in motion. Hopefully Asus at the very least puts it in one of their IPS models, because gently caress if I want to go back to a TN 144hz screen from a full sized 30" just to get this cool new tech.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Sidesaddle Cavalry posted:

I basically did a 180-degree opinion turn in regards to what my next planned card would be down the road when I read about this the other day.

I was set on picking up a dirt cheap 7950 to replace my 5850 that's getting long in the tooth, but I'm going to keep coasting a while and wait and see. Between this and ShadowPlay, nVidia is putting together quite the package of GPU fringe benefits.


Wistful of Dollars
Aug 25, 2009

I'm curious how long it will take for AMD to compete. There was a good article on it on Techreport today, and I think they may be right about where AMD should be aiming.

quote:

AMD will need to counter with its own version of this tech, of course. The obvious path would be to work with partners who make display ASICs and perhaps to drive the creation of an open VESA standard to compete with G-Sync. That would be a typical AMD move—and a good one. There's something to be said for AMD entering the display ASIC business itself, though, given where things may be headed. I'm curious to see what path they take.
