KillHour
Oct 28, 2007


I'm not really seeing any recent games that use PhysX. Have there been any high-profile ones that use it for more than incidental effects in the last year or two?


KillHour
Oct 28, 2007


td4guy posted:



If anyone's waiting on the dual-fan overclocked EVGA SC Signature 2 GTX 680, it's apparently still coming, and more information will be released on it at the end of the month. (source)

No idea why you'd want it over the ASUS version aside from brand loyalty though.

Can you even GET the ASUS ones anywhere? I've yet to see them for sale.

KillHour
Oct 28, 2007


KKKLIP ART posted:

My Asus 670 will be here tomorrow and I am super stoked. I am using a GTX260 now, so this is going to be such a massive improvement that it isn't funny.

Hey, GTX 260 bro. Unfortunately, I'm going to have to wait a bit for an upgrade, due to money issues. Tell me how much more awesome it is. :unsmith::respek::unsmith:

KillHour
Oct 28, 2007


I'm pretty sure it has to be 3 monitors, 2 won't work. Why would you want to play a game across 2 monitors, anyways?

KillHour
Oct 28, 2007


That Wikipedia page is pretty out of date...

KillHour
Oct 28, 2007


What model is the PSU, and is it still under warranty?

KillHour
Oct 28, 2007


Agreed posted:

It doesn't demand a GF110 card's full resources, but it does clock up to full (stock) speed, and you can overclock the memory to get some added bandwidth.

But don't do it unless you're messed in the head; with the addition of Borderlands 2, there are 27 games of dramatically varying quality that have GPU-accelerated PhysX implementations.

If you're gonna play a LOT of Borderlands 2, I guess hang onto it if your power supply can handle a 670 and the (hungry hungry) GF110-based card simultaneously; my 750W Corsair HX is actually a 900W-safe CWT and it can pretty effortlessly manage a highly overclocked 2600K, a highly overclocked GTX 680, and the GTX 580 (which will, annoyingly, sometimes spin up even when there is no GPU PhysX going on at all - blame the drivers), as well as one optical drive, two 2TB HDDs, and 3 SSDs. And 4x4GB DDR3 1600 DIMMs at 1.525V. I ought to get a measurement tool to see what my total system draw is, but the added energy bill component is about the same as having a 100W lightbulb on in a room constantly. To carry the analogy, you also only go into that room about once a month. Not a major expenditure there, but it is part of the final analysis.

Another drawback - if you aren't using an Ivy Bridge processor, the lack of PCI-e 3.0 support, plus the fact that most Sandy Bridge motherboards capable of running two 16-lane devices drop them to 8x/8x mode, will mean that a modern, high-end graphics card will be somewhat bottlenecked (game dependent, but expect as much as 5% at PCI-e 2.0 8x). So doing this at anything lower than PCI-e 3.0 8x/8x is going to cost you a portion of your possible performance.

The benefit, in that minority of games that support it, is that at all resolutions you will get better framerates using a dedicated PhysX processor than you would with, say, two 680s in SLI. No kidding, the rendering hit and the inefficiency of juggling the card's shader resources as CUDA cores (no, they're shaders; no, they're CUDA; no, they're shaders) is big enough that you end up eating poo poo for FPS in demanding PhysX calculations if you have ANY rendering card, SLI or otherwise, trying to pull double duty.

So, does that mean it would be worth it to use one of my GTX 260s as a PhysX card when I upgrade? (Doubt it)

KillHour
Oct 28, 2007


FISHMANPET posted:

I'm pretty sure the cameras are all IP cameras, so it's just a matter of opening however many browser windows are necessary.

I'm not sure what you mean by needing a video engineer. What kind of inputs do you think I'm talking about?

You're viewing a ton of IP cameras by opening them in browser windows? :wtc:

How many cameras are you trying to do? There's no way you can decode enough video streams to fill 16 monitors at anything resembling a decent frame rate with a single PC.

You want this with 4 PCs each driving 4 monitors:

http://www.milestonesys.com/productsandsolutions/xprotectaddonproducts/xprotectsmartwall/

https://www.youtube.com/watch?v=07J7mtTDuYQ


Edit:

Out of curiosity, I tried to see if my computer (heavily overclocked i5 3570K @ >4.5GHz -w- 16GB of RAM) could decode 16 HD H.264 streams. I have a 720p version of Hunger Games, so I put it on my SSD and opened up as many copies as I could. After 6 or so, I started dropping frames and the GUI was being sluggish. After 12, the computer was barely usable and as you can see, the stats were blinking in and out. The video was tearing pretty badly, as well.



TL;DR: Don't try to run a 16 screen video wall with 1 computer. Bad things happen.
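
If anyone wants to repeat this test without juggling a pile of player windows, here's a rough sketch of how I'd script it. This assumes you have ffmpeg on your PATH and a sample clip at test.mp4 (both placeholders), and it measures pure CPU decode throughput rather than actual playback:

import subprocess, sys, time

N = int(sys.argv[1]) if len(sys.argv) > 1 else 16   # number of parallel decodes to attempt
SRC = "test.mp4"                                    # placeholder path to a 720p/1080p sample clip

# ffmpeg decodes the file as fast as it can and discards the frames (-f null -),
# so this stresses the CPU decoder without any display or disk-write overhead.
cmd = ["ffmpeg", "-v", "error", "-i", SRC, "-f", "null", "-"]

start = time.time()
procs = [subprocess.Popen(cmd) for _ in range(N)]
for p in procs:
    p.wait()
print("%d parallel decodes finished in %.1f seconds" % (N, time.time() - start))

If 16 copies take wildly longer than one copy does on its own, you're decode-bound before you ever touch an actual camera stream.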

KillHour fucked around with this message at 07:35 on Sep 4, 2012

KillHour
Oct 28, 2007


Boten Anna posted:

For shits and giggles I tried with a copy of Black Swan using VLC and while Final Fantasy XIV was still running (a hog of a game even though I'm standing in my small inn room). I have a 3770K and a GTX 670.

Everything started artifacting heavily at 16, though my computer was still usable; 12 was choppy, but I've seen people think worse is acceptable.

I closed FFXIV and tried again, and it's almost watchable, but there's still artifacting, which I think is a disk I/O issue (256GB Crucial M4 SSD notwithstanding), since it doesn't start until I pick random seek points. It seems I can do about 13 videos; still kind of choppy, but not as bad as 12 with XIV open.

I think you'd need two of my computers to run a 16-screen video wall well, but it'd probably be cheaper to use four lesser-specced ones.

Do note that HD movies are generally 720p, as well. The monitors for video walls tend to be at least 1920x1080, which is more than twice as many pixels.

Edit: Also, it takes 160Mbps of network bandwidth to stream 16 1080p cameras. Hope you have a beefy network.
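
Rough math behind those two numbers, in case anyone wants to sanity check me (the ~10Mbps per camera is my assumption for a 1080p/30 H.264 stream; your cameras may be configured higher or lower):

# video-wall monitor vs. a 720p movie: pixel ratio
print((1920 * 1080) / (1280 * 720))   # 2.25 - a bit more than twice the pixels

# aggregate bandwidth for 16 cameras, assuming ~10 Mbps per 1080p H.264 stream
print(16 * 10, "Mbps")                # 160 Mbps before any network overhead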

KillHour fucked around with this message at 13:25 on Sep 4, 2012

KillHour
Oct 28, 2007


Scalding Coffee posted:

The noisy card might have been getting used to being on all the time. It very occasionally made that buzzing sound, but it seems to be as quiet as my 560ti. Just getting used to that new card smell.

Buzzing usually comes from the chokes. I'm pretty sure it never just "goes away".

KillHour
Oct 28, 2007


Glen Goobersmooches posted:

Gigabyte's 670 is literally the thinnest model there is. The problem would have to be your motherboard's crowded PCI-E array.

Also, 144Hz? What monitor is this? If you've got a 120Hz you should basically never need vsync ever, unless you're playing ancient games without some kind of frame limiter on the high end.

A 670 can easily run 150+FPS on many games maxed. Not Metro, but still.

KillHour
Oct 28, 2007


Goon Matchmaker posted:

I honestly have to wonder why MSI did that. From a warranty perspective they had to have realized that they'd be burning out cards left and right. The end cost to them is going to be far higher replacing all the burnt out cards than they'll make in profit.

It's just utterly stupid.

These kinds of things are usually done by a single person or small group with something to gain in the short term, drat the long term consequences.

KillHour
Oct 28, 2007


I don't know why they don't just program games to be 64-bit nowadays. Is anyone really still using a 32-bit OS for gaming?

KillHour
Oct 28, 2007


Endymion FRS MK1 posted:

Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games.

Until I clicked that link, I was wondering why you were buying a graphics card from 2006.

http://www.nvidia.com/page/geforce_7950.html

KillHour
Oct 28, 2007


I've always loved tech demos. Even the most graphically intense production games generally use graphics techniques that were bleeding edge 5-10 years ago and were shown off in tech demos back then.

Here are some examples of older real-time tech demos that blew people's socks off when they came out, but would be pretty uninspired today:

https://www.youtube.com/watch?v=hJ0ycLo3PFM
https://www.youtube.com/watch?v=cv8cYrGG220

Even things as recent as a few years ago aren't all that impressive any more (but still really cool):

https://www.youtube.com/watch?v=fQMbFQVLhMc
https://www.youtube.com/watch?v=Br3rMKApGNI

So, what's new? Well, here are a few of the best looking demos I've seen recently:

https://www.youtube.com/watch?v=UVX0OUO9ptU
https://www.youtube.com/watch?v=bI1_quVr_3w
https://www.youtube.com/watch?v=RSXyztq_0uM
https://www.youtube.com/watch?v=HUcutZTObfM

I'd love to see some other new stuff that I may have missed, if anyone knows of anything particularly impressive.

KillHour
Oct 28, 2007


Well, if we're comparing demoscene stuff...

https://www.youtube.com/watch?v=IFXIGHOElrE

KillHour
Oct 28, 2007


Factory Factory posted:

So I was just reading this article on high framerate (HFR) filmmaking and The Hobbit; there's hubbub that 48 FPS showings of the movie are triggering Uncanny Valley effects in a good number of people by being effectively faster than the brain's Conscious Moment Per Second rate - about 40 at rest state, 80 to 100 during intense activity. While the eye can distinguish motion at a faster rate, again varying by person but on average 66 Hz, apparently that 40 Moments per Second rate is killing the suspension of disbelief, because 48 FPS film no longer looks sufficiently different from reality for a lot of folks.

It makes me wonder: Why doesn't this happen with video games? Or will it, in the future? For many people, the difference between 30 FPS and 60 FPS in gaming is not only incredibly noticeable, but it's incredibly desirable, even if we have no problem with 30/25/24 FPS video. What distinguishes these situations? Is it the stylized aesthetics or other cues for non-reality, so that the uncanny valley isn't approached for other reasons? Or does the interactivity make a game sufficiently "real" in terms of our brain function that the imagery itself has different standards for uncanniness? Is it simply a remaining enormous gulf between real-time generated imagery and photoreality? Or does the higher framerate reduce latency between action and reaction in a way that reduces unreality more than the increased framerate increases it by itself? Or are all video game players hyperspergs who don't find Japanese robotics incredibly weird?

A lot of TVs with temporal interpolation - like, upscaling the framerate to 60 or 120 FPS from 30p or 60i content - look great to some people and awful to others. Yet I'm not sure those categories map or correlate with video gaming - I know I love a solid 60 FPS in games, but motion interpolated video has a kind of hyperreal queasiness to me, like the difference between a 24p film and a 60i football game taken five steps too far.

Brains man. What gives with brains?

Maybe this will finally stop the "But the human eye can't see faster than 24 FPS anyways!" :downs: crap that I hear on a daily basis. Probably not, but one can hope.

KillHour
Oct 28, 2007


I miss BFG. :smith:

I'm still using a pair of BFG GTX260 Maxcore 55 OC2's.

Your ridiculous model names and absurd warranties will be fondly remembered. :911:

KillHour
Oct 28, 2007



Jumped on this like a fat kid on the last Twinkie.

KillHour
Oct 28, 2007


I'm going to bench before and after. Anyone want to guess how much of an FPS bump I'll get?

Current specs:

Intel i5 3570K (OC: 4.5GHz)
16GB DDR3 1600
BFG GTX 260 OC2 MAXCORE 55 (x2 in SLI)
Sandisk Extreme 120GB SATA III SSD

KillHour
Oct 28, 2007


Damnit, Epic, why don't you ever release your tech demos to the public?

KillHour
Oct 28, 2007


Klyith posted:

Your issues sound very similar to a thing I had happen to me about a year ago, except mine was with my nvidia card. At some point I installed new drivers over the top of old ones without fully uninstalling. Windows XP used to be ok with that, but 7 gets hosed up.

Basically every time after that when I tried to update drivers, Windows would dig out this old version from the depths of WinSXS and try to install its files instead. Driver cleaning programs would remove the active files, but not the backups from whatever buttcrack they were lodged in. Finally I had to remove any new drivers and let Windows go back to the old ones so I could find which exact version they were. Then I got that exact version from nvidia's driver archive, installed it, then used the uninstaller properly. That finally got rid of everything, and I could update again with no problems.

WinSXS is the devil's rear end in a top hat.

KillHour
Oct 28, 2007


booshi posted:

I've tried the receiver, which doesn't pass through the TV (the way my system is set up, all devices connect to the TV via HDMI, then audio goes out via digital audio from the TV to the receiver), and tried with only the receiver connected to the computer. If I could find a way to allow both of my digital audio cables to be plugged in and working, that would be fine. The way it is now, I have a splitter that I run in reverse, with one input from the receiver and the other from the computer. I can mute my computer and everything else (TV, Xbox, etc.) works fine, but I can't mute anything else to stop its signal when the PC is also outputting, so no signal comes through, since you can't just mix digital optical audio signals. So, my current solution has been to just switch what is plugged into the splitter.

I'll play around with it some more, as I had a feeling that what nVidia said was the case: it's using EDID info. If anyone else has any tips, feel free to share, and if I get it working I'll update with my fix in case anyone else runs into the problem.

What receiver do you have? Upgrading the receiver may be the best option.

KillHour
Oct 28, 2007


booshi posted:

It's part of a Sony HTIB, a year old. 2 HDMI in, 1 out, a few of the other standard inputs, plus a digital optical audio in; it has 5.1 surround and a Blu-ray player, one of those types. I may actually be in luck because my Dad has been complaining about issues with his Sony receiver (a real one, not like my current one) and his new Samsung TV. I think he has bought/is about to buy a new one and is going to give this one to me, since I have a Sony TV, and their receivers and TVs work really well together with the Bravia Sync stuff. Only issue is I only visit my family by plane, so either they ship it to me or I take a bigass receiver back on the plane somehow.

You're doing it wrong. You need to hook up your receiver like this:



As for the Bravia Sync stuff, don't listen to Sony's marketing. Any receiver worth jack (not an HTIB) will support controlling devices over HDMI. I can control my PS3 with the remote for my Onkyo, for instance (and the remote for the TV... and vice versa - the PS3 turns off the receiver and TV when I shut it down).

I've tested it with both LG and Toshiba TVs, and everything works together just fine.

KillHour
Oct 28, 2007


If you still don't have enough ports:

http://www.monoprice.com/products/product.asp?c_id=101&cp_id=10110&cs_id=1011002&p_id=4088&seq=1&format=2

Edit: Goddamnit, beaten.

KillHour
Oct 28, 2007


Man, that 760 looks great. Maybe that means I can snag a second 670 on the cheap for SLI soon. :getin:

Real quick question: The 670 supports DX 11.1, right?

KillHour
Oct 28, 2007


Alereon posted:

No, only nVidia's GK200 GPUs support DX11.1. That said, essentially all DX11.1 gaming capabilities will be usable on nVidia's DX11.0 cards, only non-gaming features are missing and preventing compliance.

Do those missing features include the standardized 3D?

Edit: Also, considering the 670 and 760 are basically the same hardware, what are the chances of being able to flash one to be the other?

KillHour
Oct 28, 2007


I spent some time looking at new benchmarks/demos, and stumbled across this.

http://www.pouet.net/prod.php?which=61211

:drat:

Does anyone have a system that can run this at 1080@30? My 670 gets as low as 4-5 FPS in some parts.

KillHour
Oct 28, 2007


Agreed posted:

This is awesome. Ran smoothly for me except for one part when a thing got gelled and blew up, probably around 10-15fps.

Everyone should definitely watch this, though, hot drat what a demo. Raytracing For Real, Maya-quality in realtime. Woah.

Turns out my GTX 670 overclocks like a beast.

Doesn't hold a candle to your setup, but I can pull down some pretty respectable performance if I'm willing to let my GPU go into "leafblower mode".



This is the absolute lowest performing spot for me. If you had shown me that screenshot a few years ago, I would have called you a liar.

KillHour
Oct 28, 2007


PC LOAD LETTER posted:

Fixed cuz' you were right the 1st time.:colbert:

Agreed. Chromatic aberration, barrel distortion, bokeh, reflections, and refractions, all in real time at 1080p. :psyduck:

If it were anti-aliased, I would have guessed a 2-3 hour render per frame.

KillHour fucked around with this message at 05:54 on Nov 22, 2013

KillHour
Oct 28, 2007


Found a "making of," if anyone's curious as to how they got the raytracing demo to run so smoothly.

http://directtovideo.wordpress.com/2013/05/07/real-time-ray-tracing/
http://directtovideo.wordpress.com/2013/05/08/real-time-ray-tracing-part-2/

The answer: Lots of application-specific optimizations, and a rasterizer/raytracer hybrid engine.

KillHour
Oct 28, 2007



Where did you get the Shodan wallpaper?

Edit: Never mind, it's included in the GOG edition of System Shock 2... which I have. :downs:

KillHour fucked around with this message at 01:42 on Nov 26, 2013

KillHour
Oct 28, 2007


I just bought into triple 4K monitors. Obviously, there's nothing on the market that can push that at full tilt in the latest games, but I'd like to get as close as possible for under $1k. They support Freesync, but am I right in thinking that AMD can't touch nVidia for $:performance with the 1080?
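
Back-of-the-envelope on why nothing pushes it at full tilt (just my arithmetic, not a benchmark): triple 4K is about 25 million pixels per frame, or twelve 1080p screens' worth:

uhd = 3840 * 2160          # 8,294,400 pixels per 4K panel
triple = 3 * uhd           # 24,883,200 pixels across all three
full_hd = 1920 * 1080      # 2,073,600 pixels on a single 1080p screen
print(triple / full_hd)    # 12.0 - twelve times the pixels to shade every frame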

KillHour
Oct 28, 2007


THE DOG HOUSE posted:

Yeah it's the 1080 ... but

Go on...


Oh, if anybody is interested in picking up the same monitors, Massdrop has the same deal I bought them at.

https://www.massdrop.com/buy/lg-27-...y%20Promotional

KillHour
Oct 28, 2007


SwissArmyDruid posted:

Eh, I question the value of that monitor. It doesn't hit that range and is only 40-60 Hz.

To be clear: I think 4K is dumb right now and if you're going to game at that resolution, you want LFC even more so than usual.

I had a 120Hz monitor as my last one, and I had a hard time telling the difference between 60 and 120 in games anyway. Maybe I'm just broken. So the 1080 is the way to go, even though I'll miss out on FreeSync?

KillHour
Oct 28, 2007


With that in mind, is there anywhere with aftermarket cards actually in stock? And is there any advantage to getting one with 3 DisplayPort outputs, or is HDMI 2.0 on par now?

KillHour
Oct 28, 2007


6 THOUSAND dollars? I hope that's not USD.

KillHour
Oct 28, 2007


FaustianQ posted:

We're entering bizarro timeline where AMD drivers and control panel are consistently better than Nvidia's. I am okay with this.

I'm not. nVidia is still a better buy for the money. I shouldn't have to choose between good drivers and good performance. :(

KillHour
Oct 28, 2007


spasticColon posted:

Okay, who on here is going to buy a $1200 Titan X?

For $1200 you could build a decent gaming PC from scratch.

I'm considering it in a vain attempt to drive my 3x 4k monitors. :suicide:


KillHour
Oct 28, 2007


Yeah... I know. I'm probably going to get a 1080 and run them all at 1080p with GPU scaling to 4K. Hopefully, by the time a GPU comes out that can drive them properly, either nVidia will support FreeSync (hah!) or AMD will have something worth buying on the high end.
