|
It only dawned on me on reflection that none of the articles mentioned it was limited to TN panels, they only used them as the demo machines. I was caught up on the thought of having to go TN to enjoy the technology, but I guess if it works as advertised IPS will be able to deliver the smoothness benefits of 120/144hz with IPS quality.
|
# ? Oct 19, 2013 18:00 |
|
|
|
Agreed posted:This one's easy I hope I beat FactoryFactory to it fast typing skills GO! I just noticed a minute ago that I'm not actually supposed to ask for advice about PC parts in this thread, so I apologize. But thanks a lot for the help all the same, I appreciate it a lot. I think I'll wait a little for the time being on buying a GPU until I learn more about these recent GPU releases, AMD's cards and Nvidia's 780 TI. As for the multi display setup. Well, I really want to. But since Nvidia just announced the G-sync chip that'll be featured in future monitors, I fear I'll buy a monitor too soon and regret it. I've been saving money since early 2012, so waiting isn't an issue. My plan for now is to get a new GPU after waiting for news, and buy 3 monitors with a G-sync chip once it's released. I was also advised not to wait for things to happen, but then I bought a GTX 580 just a little while before the 6xx series was released, so you can see why I'm hesitant to upgrade my PC when it looks like new products are about to be released. Once again, thanks a lot for the reply Gamer2210 fucked around with this message at 18:24 on Oct 19, 2013 |
# ? Oct 19, 2013 18:17 |
|
Gonkish posted:I'd like to see it on more devices, but right now they're trying to push Shield so that will pretty much never happen. Easy solution: sell it as a Steam program, print money. Hire me, Nvidia
|
# ? Oct 19, 2013 18:19 |
|
El Scotch posted:It only dawned on me on reflection that none of the articles mentioned it was limited to TN panels, they only used them as the demo machines. I was caught up on the thought of having to go TN to enjoy the technology, but I guess if it works as advertised IPS will be able to deliver the smoothness benefits of 120/144hz with IPS quality. Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times.
|
# ? Oct 19, 2013 18:38 |
|
Lolcano Eruption posted:Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times. There are plenty of those Korean 27-inch 2560x1440 IPS monitors with a 120Hz refresh rate.
|
# ? Oct 19, 2013 18:57 |
|
Are you guys thinking one 290 non-x should be able to do Battlefield 4 @ 2560x1600? What about with another 24" and 20" attached?
|
# ? Oct 19, 2013 19:12 |
|
Tab8715 posted:Are you guys thinking one 290 non-x should be able to do Battlefield 4 @ 2560x1600? Well, we don't know for sure, no benchmarks have been released. But, assuming the 290 competes with the 780, it should be able to handle it pretty well. Here are the benchmarks for BF3 maxed out at 1440p: http://www.anandtech.com/bench/GPU13/581. 780 gets 65 fps which is pretty drat good. Extra monitors shouldn't change anything.
|
# ? Oct 19, 2013 19:41 |
|
Lolcano Eruption posted:Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times. You mistake my meaning. I meant that this technology appears to be able to give 60Hz displays the same smoothness (or perhaps superior smoothness) as 120/144Hz TN displays. I don't know if it has the ability to make IPS run faster than 60Hz, but it doesn't need to, because it makes 60 (or even lower) Hz/fps appear super smooth, and that smoothness is normally the only reason people buy 120/144Hz TN displays. Assuming my assumptions about it are right, anyway.
|
# ? Oct 19, 2013 20:42 |
|
LooKMaN posted:There are plenty of those korean 27inch 2560x1440 IPS monitors with 120hz refresh rate.
|
# ? Oct 19, 2013 21:14 |
|
Lolcano Eruption posted:Yes but currently there are no IPS panels that can do 120/144Hz. Unless this tech can somehow boost their inherently slower response times. My interest isn't in this to push 120 or 144 FPS, but so that I could have synchronized frames at 40-60 FPS where I tend to run most stuff as someone who coasts on a mid-high GPU for 4 years at a time. If your machine ever drops below 60FPS Vsync can be a pretty terrible experience, it'd be great to not have to deal with tearing just because I'm at 50fps.
|
# ? Oct 19, 2013 21:15 |
|
Can't you get 90% of what G-Sync is offering with a 120hz monitor? Being able to V-Sync at 40 and 60 frames per second gives you a lot of performance breathing room with very little impact on visuals. I guess until more IPSes can do 120hz this is a decent compromise. If you're going for more than 60fps this will probably help, but that seems like a pretty niche market. microwave casserole fucked around with this message at 23:17 on Oct 19, 2013 |
# ? Oct 19, 2013 23:12 |
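The divisor math behind the post above can be laid out explicitly. A quick sketch of our own (the helper name is hypothetical, not from the thread): under strict vsync on a fixed-refresh display, the effective frame rate can only be the refresh rate divided by a whole number, so a 120Hz panel offers far more usable steps than a 60Hz one.

```python
# With traditional vsync, a new frame can only appear on a refresh tick, so
# the sustained frame rate snaps down to refresh_hz / n for integer n.
# Listing those steps shows why a 120 Hz panel gives "breathing room".

def vsync_steps(refresh_hz, min_fps=20):
    """Frame rates reachable under strict vsync on a fixed-refresh display."""
    steps = []
    n = 1
    while refresh_hz / n >= min_fps:
        steps.append(refresh_hz / n)
        n += 1
    return steps

print(vsync_steps(60))   # [60.0, 30.0, 20.0] — big jumps between steps
print(vsync_steps(120))  # [120.0, 60.0, 40.0, 30.0, 24.0, 20.0] — finer steps
```

The 40 and 60 fps figures in the post are exactly the 120/3 and 120/2 divisors; a 60Hz panel has no step at all between 30 and 60.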
|
John Carmack, Tim Sweeney, & Johan Andersson discuss AMD's Mantle at Nvidia's conference: http://www.youtube.com/watch?v=3RF0zgYFFNk
|
# ? Oct 20, 2013 02:39 |
|
Purgatory Glory posted:John Carmack, Tim Sweeney, & Johan Andersson discuss AMD's Mantle at Nvidia's conference: I liked how the video pretty much turned into a friendly discussion between Andersson and Carmack for a bit. Two game engine gurus just talking it up may be my new fetish. As for what I gathered from this video, it seems that all parties aren't particularly thrilled at the existence of Mantle itself, but rather at the possibility of Mantle ushering in changes to APIs to allow more low-level access in general. They regard Mantle as the stepping stone needed to progress, but not as the end-all solution. An ideal solution would be one API allowing low-level access to the majority of architectures, regardless of brand. There are rumors that Mantle may actually be this (with it apparently being open to all), but they are rather baseless and don't seem to make much sense at this current time. That would be like NVIDIA announcing that G-Sync can be used on AMD GPUs right from the start. I think the ideal solution for both is to test the waters for a bit, iron things out, and maybe then allow competitors the opportunity to utilize your work. Also, one of the NVIDIA guys apparently said something about how G-Sync might be able to be licensed to Intel or AMD in the future, so I have high hopes for that.
|
# ? Oct 20, 2013 10:01 |
|
Hi thread, I'm a bit confused about various models like SC, GTI and TI. Which one would I want if I wanted a reasonably good card (Nvidia 660)? I'm more concerned about stability and noise than performance. Thanks. Wow, this is helpful. Thanks! Probably mixed up "GTI" in my mind somehow. Sorry about that! lllllllllllllllllll fucked around with this message at 20:20 on Oct 21, 2013 |
# ? Oct 20, 2013 10:29 |
|
lllllllllllllllllll posted:Hi thread, I'm a bit confused about various models like SC, GTI and TI. Which one would I want if I wanted a reasonably good card (Nvidia 660)? I'm more concerned about stability and noise than performance. Thanks. SC = EVGA's shorthand for "superclocked," i.e. a factory-overclocked card. The hardware is no different from any other card of the same type, just a different clockrate and therefore slightly higher performance. Ti = Nvidia branding for "Better than non-Ti, not as good as the next number up." Like, if you had a 660, 660 Ti, and 670, you could call the 660 Ti a "665" and it would mean the same thing. Nvidia did this in the 400 series (GeForce GTX 460, 465, and 470) but reintroduced the "Ti" branding for the 500 series (GeForce GTX 560, 560 Ti, 570). It's "Ti" like the chemical symbol for titanium. I have no clue what "GTI" is. Googling suggests it's just a malapropism combining "Ti" with "GTX," the latter being Nvidia's common branding suffix for a high-performance GeForce card with SLI support (as opposed to GT/GTS for mid-low/no SLI, and GS or no suffix for crapola). Factory Factory fucked around with this message at 10:49 on Oct 20, 2013 |
# ? Oct 20, 2013 10:45 |
|
lllllllllllllllllll posted:Hi thread, I'm a bit confused about various models like SC, GTI and TI. Which one would I want if I wanted a reasonably good card (Nvidia 660)? I'm more concerned about stability and noise than performance. Thanks. You also probably don't want to buy a GTX 660 right now, the 760 or AMD 7950 are significantly better options at the moment.
|
# ? Oct 20, 2013 18:37 |
|
I'm a bit confused as to what G-Sync entails. Is it an option like V-Sync that usually needs to be manually enabled in the application, or will it just be a passive feature of the card to remove tearing or whatever?
|
# ? Oct 20, 2013 19:22 |
|
Drunken Warlord posted:I'm a bit confused as to what G-Sync entails. Is it an option like V-Sync that usually needs to be manually enabled in the application, or will it just be a passive feature of the card to remove tearing or whatever? It will be the card working in conjunction with a piece of silicon in the monitor. They will make sure the monitor only refreshes when the video card sends a frame, instead of refreshing at a set rate (60Hz/120Hz) regardless of what the GPU pushes out (a bad thing). As to whether you have to enable it in a supported application or at the driver level, we don't know yet. Hopefully it's at the driver level, working seamlessly with every application. Animal fucked around with this message at 19:37 on Oct 20, 2013 |
# ? Oct 20, 2013 19:32 |
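The mechanism described in the post above can be sketched as a toy model: with a fixed 60Hz refresh, a finished frame has to wait for the next refresh tick, so a GPU rendering at roughly 50 fps displays frames at an uneven cadence (judder), whereas a variable-refresh display would scan each frame out as soon as it is ready. This is purely our own illustration; the function names and numbers are not from NVIDIA's materials.

```python
# Compare frame display times on a fixed-refresh display vs. a
# variable-refresh (G-Sync-style) display, for a GPU finishing one
# frame every 20 ms (50 fps). Illustrative model only.
import math

def display_times_fixed(done_times_ms, refresh_hz=60):
    """Each frame appears at the first refresh tick at or after it finishes."""
    tick = 1000.0 / refresh_hz
    return [math.ceil(t / tick) * tick for t in done_times_ms]

def display_times_variable(done_times_ms):
    """Variable refresh: a frame is scanned out as soon as it is ready."""
    return list(done_times_ms)

def gaps(times):
    """Intervals between successive displayed frames, in ms."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

done = [20.0 * i for i in range(1, 8)]  # GPU output at a steady 50 fps
print(gaps(display_times_fixed(done)))     # mix of ~16.7 and ~33.3 ms — judder
print(gaps(display_times_variable(done)))  # steady 20.0 ms gaps
```

The fixed-refresh case mostly keeps up, then periodically hiccups with a double-length interval — which is exactly the stutter posters here describe when vsync is on and the GPU can't hold 60 fps.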
|
I can't wait to see it in action. Just the idea that every frame will be rendered properly no matter what (so long as it's within a given threshold) still amazes me and kinda hurts my brain a bit... Here's another in depth article with a bit of a history lesson: http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate
|
# ? Oct 20, 2013 19:52 |
|
Rahu X posted:As for what I gathered from this video, it seems that all parties aren't particularly thrilled at the existence of Mantle, but rather the possibility of Mantle ushering in changes to APIs to allow more low level access in general. The thing I'm most curious about with Mantle is how it will work alongside WDDM, because upon reflection and discussion with some similarly knowledgeable folks, none of us can figure out how you could get WDDM interoperability except in one of two ways: 1. a large static memory carve-out at boot and a second Mantle-specific device node, rendering into a D3D surface 2. only run on a platform that has a GPU with reasonable preemption (at least per-wavefront) and an AMD IOMMU Of course, they could ship Mantle in a separate driver that blatantly circumvents WDDM and that they never attempt to get WHQL'd, but that seems unrealistic. If you look at the HSA slides from Hot Chips, the driver they propose is definitely a response to the stagnancy of WDDM, but it's also mired in some unrealistic stuff (the idea that you can return from a GPU operation to a suspended user-mode process without entering the kernel is nonsense) and some pointless stuff; a standardized pushbuffer format was tried by MS briefly in the DX5/6 timeframe, I think, and it was a travesty that vendors all rebelled against. (i know a lot about driver models, i should really write my own sometime)
|
# ? Oct 20, 2013 19:57 |
|
What video cards support GSync? Somehow I completely missed this reading through all the tech articles.
|
# ? Oct 21, 2013 00:14 |
|
Chuu posted:What video cards support GSync? Somehow I completely missed this reading through all the tech articles. I thought it was any Nvidia card?
|
# ? Oct 21, 2013 00:36 |
|
Gonkish posted:I thought it was any Nvidia card? I believe it is only Kepler-based cards.
|
# ? Oct 21, 2013 00:49 |
|
Gonkish posted:I thought it was any Nvidia card? GeForce GTX 650 Ti Boost or Higher.
|
# ? Oct 21, 2013 00:54 |
|
I'm pretty sure it was only Kepler cards. EFB
|
# ? Oct 21, 2013 01:00 |
|
This video contains some one-on-one time with John Carmack and Tim Sweeney: http://www.youtube.com/watch?v=gbW9IwVGpX8 Purgatory Glory fucked around with this message at 01:18 on Oct 21, 2013 |
# ? Oct 21, 2013 01:15 |
|
So I upgraded Windows to 8.1 and got the 3d Stereoscopic crap Factory Factory mentioned but now my SLI flat out refuses to turn on. Any ideas?
|
# ? Oct 21, 2013 01:49 |
|
deimos posted:So I upgraded Windows to 8.1 and got the 3d Stereoscopic crap Factory Factory mentioned but now my SLI flat out refuses to turn on. Any ideas? I did a quick Google search and it looks like this isn't an uncommon issue - people with desktops and laptops have had graphics issues with SLI since upgrading to 8.1. Nvidia released a 326.01 driver for Win8.1, maybe remove the existing drivers and do a clean install with the new ones?
|
# ? Oct 21, 2013 02:01 |
|
Whiny-voiced nerds won't shut up about G-Sync: https://www.youtube.com/watch?v=gbW9IwVGpX8
|
# ? Oct 21, 2013 13:38 |
|
I got my rma sapphire 7950 back today. Haven't plugged it in but it appears to be identical. I was really hoping for a vapor since this card is discontinued. I guess they're still sitting in warehouses. Edit: Never had to do this before, but I had to bend the metal tab a bit to clear the motherboard before the card would fit. At least it works. ethanol fucked around with this message at 19:09 on Oct 21, 2013 |
# ? Oct 21, 2013 16:48 |
|
I'm cross-posting this from the parts picking thread since I realized it may be more appropriate for this thread given the context, so hopefully this doesn't get me in trouble: Out of curiosity, how undesirable is it to do either SLI or Crossfire (just two GPUs) on a board with a PLX chip? I'm considering someday going with 2x ASUS R9 280X Matrix cards, which have triple-slot coolers. To allow a slot for adequate air intake for the first card I was looking at boards that place the second GPU lower, such as the ASRock Z77 WS or (ideally, to keep my Hackintosh hobby alive) the Gigabyte GA-Z77X-UP7. I know there is some overhead from the PLX chip but wasn't sure whether it should be avoided at all costs. For reference, though, my motherboard and cards will be horizontal (keeping a Corsair Air 540 on its side), so hot air should rise from one card into the upper card just as in a vertical arrangement. Otherwise, are there any standard x8/x8 boards that could accommodate 2x triple-slot coolers?
|
# ? Oct 21, 2013 18:13 |
|
Is purchasing a video card with more memory than the reference card desirable? I've been considering the different models of GTX 760 out there, and several offerings (notably those from EVGA) possess 4GB as opposed to the reference 2GB amount.
|
# ? Oct 21, 2013 20:37 |
|
If you'd asked earlier, I'd have said of course not, because video cards are designed with an amount of VRAM suitable for what they can actually render. Now? It depends on how much it costs, and I still wouldn't bother doing it with anything smaller, or choosing VRAM over a more powerful GPU. Even six months ago, only things like modded Skyrim (and apparently Bioshock Infinite on ridiculous detail settings) needed that kind of RAM, but that was before it sank in that eighth-generation consoles are going to have like 4 GB of VRAM to play with. We're about to see VRAM usage get a good solid kick in the rear end, so it might be worthwhile.
|
# ? Oct 21, 2013 20:50 |
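A back-of-envelope check of the point above — it's assets like modded-Skyrim textures that fill VRAM, not the render targets themselves. This is our own rough sketch (the helper name and buffer count are assumptions, not from the thread):

```python
# Rough VRAM math: the framebuffers at common gaming resolutions are tiny
# compared to a 2-4 GB card; texture assets are what actually consume it.
# Assumes 32-bit color (4 bytes/pixel) and three buffers (front/back/depth).

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate framebuffer memory, in MB, at a given resolution."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(round(framebuffer_mb(1920, 1080), 1))  # 23.7 MB at 1080p
print(round(framebuffer_mb(2560, 1600), 1))  # 46.9 MB at 2560x1600
```

Even at 2560x1600 the buffers are well under 50 MB, so the 2 GB vs. 4 GB question really comes down to texture and mod budgets, not resolution.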
|
Sir Unimaginative posted:Asked earlier, I'd tell you that of course not, because video cards are designed for an amount of VRAM suitable for what they can actually render. Now? Depends on how much it costs, and I still wouldn't bother doing it with anything smaller, or choosing VRAM over a more powerful GPU. Are you going back as far as the atari? Did I miss a console somewhere?
|
# ? Oct 21, 2013 22:35 |
|
I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004). This is going to be some cool tech.
|
# ? Oct 21, 2013 22:44 |
|
Agreed posted:I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004). I basically did a 180-degree opinion turn in regards to what my next planned card would be down the road when I read about this the other day. Back when I had learned about Adaptive Vsync it dawned on me that frame spewing was an issue that really could only be solved with workarounds. Not so, apparently. This attacks the problem directly and also helps me decide on what sort of monitor I'd buy with the card to finish my setup on which I can uselessly while away the rest of my life on.
|
# ? Oct 21, 2013 22:52 |
|
Agreed posted:I don't know about y'all but this G-Sync stuff is some stuff I would totally buy. Like, the more I think about it, the more sense it makes and I'm hyped as hell (as well as wondering why someone didn't already do it, given that the need to sync refresh to power or refresh to phosphor fade rate went away in like 2004). Yea, it's legit making me regret JUUUUST buying my U3014. I can't realistically turn on vsync, because I can't maintain 60 FPS solid with only one OCed GTX680, so I get lots of tearing. But vsync leads to all those weird feeling lags and stutters in motion. Hopefully Asus at the very least puts it in one of their IPS models, because gently caress if I want to go back to a TN 144hz screen from a full sized 30" just to get this cool new tech.
|
# ? Oct 21, 2013 22:53 |
|
Sidesaddle Cavalry posted:I basically did a 180-degree opinion turn in regards to what my next planned card would be down the road when I read about this the other day. I was set on picking up a dirt cheap 7950 to replace my 5850 that's getting long in the tooth, but I'm going to keep coasting a while and wait and see. Between this and shadowplay nVidia is putting together quite the package of GPU fringe benefits.
|
# ? Oct 21, 2013 23:02 |
|
|
|
I'm curious how long it will take for AMD to compete. There was a good article on it on Techreport today, and I think they may be right as to where AMD should be aiming.

quote:AMD will need to counter with its own version of this tech, of course. The obvious path would be to work with partners who make display ASICs and perhaps to drive the creation of an open VESA standard to compete with G-Sync. That would be a typical AMD move—and a good one. There's something to be said for AMD entering the display ASIC business itself, though, given where things may be headed.

I'm curious to see what path they take.
|
# ? Oct 21, 2013 23:10 |