|
I'm not really seeing any recent games that use PhysX. Have there been any high profile ones that use it for more than incidental effects in the last year or 2?
|
# ¿ May 15, 2012 05:15 |
|
td4guy posted:
Can you even GET the ASUS ones anywhere? I've yet to see them for sale.
|
# ¿ May 16, 2012 19:54 |
|
KKKLIP ART posted:
My Asus 670 will be here tomorrow and I am super stoked. I am using a GTX260 now, so this is going to be such a massive improvement that it isn't funny.

Hey, GTX 260 bro. Unfortunately, I'm going to have to wait a bit for an upgrade, due to money issues. Tell me how much more awesome it is.
|
# ¿ Jun 4, 2012 21:37 |
|
I'm pretty sure it has to be 3 monitors, 2 won't work. Why would you want to play a game across 2 monitors, anyways?
|
# ¿ Jun 9, 2012 00:54 |
|
That Wikipedia page is pretty out of date...
|
# ¿ Jun 12, 2012 22:17 |
|
What model is the PSU, and is it still under warranty?
|
# ¿ Jul 2, 2012 14:20 |
|
Agreed posted:
It doesn't demand a GF110 card's full resources, but it does clock up to full (stock) speed, and you can overclock the memory to get some added bandwidth.

So, does that mean it would be worth it to use one of my GTX260's as a PhysX card when I upgrade? (Doubt it)
|
# ¿ Aug 18, 2012 06:18 |
|
FISHMANPET posted:
I'm pretty sure the cameras are all IP cameras, so it's just a matter of opening however many browser windows is necessary.

You're viewing a ton of IP cameras by opening them in browser windows? How many cameras are you trying to do? There's no way you can decode enough video streams to fill 16 monitors at anything resembling a decent frame rate with a single PC. You want this with 4 PCs each driving 4 monitors:

http://www.milestonesys.com/productsandsolutions/xprotectaddonproducts/xprotectsmartwall/

https://www.youtube.com/watch?v=07J7mtTDuYQ

Edit: Out of curiosity, I tried to see if my computer (heavily overclocked i5 3570K @ >4.5GHz with 16GB of RAM) could decode 16 HD H.264 streams. I have a 720p version of Hunger Games, so I put it on my SSD and opened up as many copies as I could. After 6 or so, I started dropping frames and the GUI was getting sluggish. After 12, the computer was barely usable and, as you can see, the stats were blinking in and out. The video was tearing pretty badly, as well.

TL;DR: Don't try to run a 16 screen video wall with 1 computer. Bad things happen.

KillHour fucked around with this message at 07:35 on Sep 4, 2012
# ¿ Sep 4, 2012 06:49 |
|
Boten Anna posted:
For shits and giggles I tried with a copy of Black Swan using VLC and while Final Fantasy XIV was still running (a hog of a game even though I'm standing in my small inn room). I have a 3770K and a GTX 670.

Do note that HD movies are generally 720p, as well. The monitors for video walls tend to be at least 1920x1080, which has more than double the number of pixels.

Edit: Also, it takes 160Mbps of network bandwidth to stream 16 1080p cameras. Hope you have a beefy network.

KillHour fucked around with this message at 13:25 on Sep 4, 2012
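To show my math on that bandwidth figure (assuming ~10 Mbps per 1080p H.264 camera stream, which is a typical high-quality bitrate; actual cameras vary):

```python
# Back-of-the-envelope math for a 16-camera video wall.
STREAMS = 16
MBPS_PER_STREAM = 10          # assumed per-camera bitrate, not a spec

total_mbps = STREAMS * MBPS_PER_STREAM
print(total_mbps)             # 160 Mbps of sustained network traffic

# And the 720p-vs-1080p pixel gap:
pixels_720p = 1280 * 720      # 921,600
pixels_1080p = 1920 * 1080    # 2,073,600
ratio = pixels_1080p / pixels_720p
print(ratio)                  # 2.25 -- "more than double"
```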
# ¿ Sep 4, 2012 13:22 |
|
Scalding Coffee posted:
The noisy card might have been getting used to being on all the time. It very occasionally made that buzzing sound, but it seems to be as quiet as my 560ti. Just getting used to that new card smell.

Buzzing usually comes from the chokes. I'm pretty sure it never just "goes away".
|
# ¿ Sep 21, 2012 14:49 |
|
Glen Goobersmooches posted:
Gigabyte's 670 is literally the thinnest model there is. The problem would have to be your motherboard's crowded PCI-E array.

A 670 can easily run 150+FPS on many games maxed. Not Metro, but still.
|
# ¿ Sep 21, 2012 16:19 |
|
Goon Matchmaker posted:
I honestly have to wonder why MSI did that. From a warranty perspective they had to have realized that they'd be burning out cards left and right. The end cost to them is going to be far higher replacing all the burnt out cards than they'll make in profit.

These kinds of things are usually done by a single person or small group with something to gain in the short term, drat the long term consequences.
|
# ¿ Oct 2, 2012 17:32 |
|
I don't know why they don't just program games to be 64 bit nowadays. Is anyone really still using a 32 bit OS for gaming?
|
# ¿ Oct 17, 2012 21:07 |
|
Endymion FRS MK1 posted:
Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games.

Until I clicked that link, I was wondering why you were buying a graphics card from 2006.

http://www.nvidia.com/page/geforce_7950.html
|
# ¿ Nov 4, 2012 06:16 |
|
I've always loved tech demos. Even the most graphically intense production games generally use graphics techniques that were bleeding edge 5-10 years ago, and shown off in tech demos back then. Here are some examples of older real-time tech demos that blew people's socks off when they came out, but would be pretty uninspired today:

https://www.youtube.com/watch?v=hJ0ycLo3PFM
https://www.youtube.com/watch?v=cv8cYrGG220

Even things as recent as a few years ago aren't all that impressive any more (but still really cool):

https://www.youtube.com/watch?v=fQMbFQVLhMc
https://www.youtube.com/watch?v=Br3rMKApGNI

So, what's new? Well, here are a few of the best looking demos I've seen recently:

https://www.youtube.com/watch?v=UVX0OUO9ptU
https://www.youtube.com/watch?v=bI1_quVr_3w
https://www.youtube.com/watch?v=RSXyztq_0uM
https://www.youtube.com/watch?v=HUcutZTObfM

I'd love to see some other new stuff I may have missed, if anyone knows of anything particularly impressive.
|
# ¿ Dec 7, 2012 06:02 |
|
Well, if we're comparing demoscene stuff... https://www.youtube.com/watch?v=IFXIGHOElrE
|
# ¿ Dec 7, 2012 22:37 |
|
Factory Factory posted:
So I was just reading this article on high framerate (HFR) filmmaking and The Hobbit; there's hubbub that 48 FPS showings of the movie are triggering Uncanny Valley effects in a good number of people by being effectively faster than the brain's Conscious Moment Per Second rate - about 40 at rest state, 80 to 100 during intense activity. While the eye can distinguish motion at a faster rate, again varying by person but on average 66 Hz, apparently that 40 Moments per Second rate is killing the suspension of disbelief, because 48 FPS film no longer looks sufficiently different from reality for a lot of folks.

Maybe this will finally stop the "But the human eye can't see faster than 24 FPS anyways!" crap that I hear on a daily basis. Probably not, but one can hope.
|
# ¿ Dec 16, 2012 08:43 |
|
I miss BFG. I'm still using a pair of BFG GTX260 Maxcore 55 OC2's. Your ridiculous model names and absurd warranties will be fondly remembered.
|
# ¿ Jan 2, 2013 20:38 |
|
Don Lapre posted:
Amazon has EVGA 670 FTW 2gb cards for $315 open box.

Jumped on this like a fat kid on the last Twinkie.
|
# ¿ Feb 25, 2013 15:34 |
|
I'm going to bench before and after. Anyone want to guess how much of an FPS bump I'll get?

Current specs:
Intel i5 3570K (OC: 4.5GHz)
16GB DDR3 1600
BFG GTX 260 OC2 MAXCORE 55 (x2 in SLI)
Sandisk Extreme 120GB SATA III SSD
|
# ¿ Feb 25, 2013 17:11 |
|
Dammit, Epic, why don't you ever release your tech demos to the public?
|
# ¿ Mar 30, 2013 05:34 |
|
Klyith posted:
Your issues sound very similar to a thing I had happen to me about a year ago, except mine was with my nvidia card. At some point I installed new drivers over the top of old ones without fully uninstalling. Windows XP used to be ok with that, but 7 gets hosed up.

WinSXS is the devil's rear end in a top hat.
|
# ¿ May 5, 2013 06:14 |
|
booshi posted:
I've tried the receiver, which doesn't pass through the TV (the way my system is set up is that all devices connect to the TV via HDMI, then audio out via digital audio from the TV to the receiver), and tried with only the receiver connected to the computer. If I could find a way to allow both of my digital audio cables to be plugged in and work that would be fine. The way it is now, I have a splitter that I run in reverse, with one input from the receiver and the other from the computer. I can mute my computer and everything else (TV, Xbox, etc) work fine, but I can't mute anything else to stop a signal from going when the PC is also going, so no signals come through since you can't just mix digital optical audio signals. So, my current solution has been to just switch what is plugged in to the splitter.

What receiver do you have? Upgrading the receiver may be the best option.
|
# ¿ May 12, 2013 20:12 |
|
booshi posted:
It's part of a Sony HTITB, a year old. 2 HDMI in, 1 out, a few of the other standard inputs, also a digital optical audio in, and has 5.1 surround, blu-ray player, one of those types. I may actually be in luck because my Dad has been complaining about issues with his Sony receiver (a real one, not like my current one) with his new Samsung TV. I think he has/is about to buy a new one and is going to give it to me, since I have a Sony TV, and their receivers and TVs work really well together with the Bravia sync stuff. Only issue is I only visit my family flying, so it's either they ship it to me or I take a bigass receiver back on the plane somehow.

You're doing it wrong. You need to hook up your receiver like this:

As for the Bravia Link stuff, don't listen to Sony's marketing. Any receiver worth jack (not a HTIAB) will support controlling devices over HDMI. I can control my PS3 with the remote for my Onkyo, for instance (and the remote for the TV... and vice versa - the PS3 turns off the receiver and TV when I shut it down). I've tested it with both LG and Toshiba TVs, and everything works together just fine.
|
# ¿ May 14, 2013 08:10 |
|
If you still don't have enough ports: http://www.monoprice.com/products/product.asp?c_id=101&cp_id=10110&cs_id=1011002&p_id=4088&seq=1&format=2 Edit: Goddamnit, beaten.
|
# ¿ May 14, 2013 14:01 |
|
Man, that 760 looks great. Maybe that means I can snag a second 670 on the cheap for SLI soon. Real quick question: The 670 supports DX 11.1, right?
|
# ¿ Jun 26, 2013 19:23 |
|
Alereon posted:
No, only nVidia's GK200 GPUs support DX11.1. That said, essentially all DX11.1 gaming capabilities will be usable on nVidia's DX11.0 cards, only non-gaming features are missing and preventing compliance.

Do those missing features include the standardized 3D?

Edit: Also, considering the 670 and 760 are basically the same hardware, what are the chances of being able to flash one to be the other?
|
# ¿ Jun 26, 2013 19:45 |
|
I spent some time looking at new benchmarks/demos, and stumbled across this:

http://www.pouet.net/prod.php?which=61211

Does anyone have a system that can run this at 1080p@30? My 670 gets as low as 4-5 FPS in some parts.
|
# ¿ Nov 22, 2013 04:34 |
|
Agreed posted:
This is awesome. Ran smoothly for me except for one part when a thing got gelled and blew up, probably around 10-15fps.

Turns out my GTX 670 overclocks like a beast. Doesn't hold a candle to your setup, but I can pull down some pretty respectable performance if I'm willing to let my GPU go into "leafblower mode". This is the absolute lowest performing spot for me. If you had shown me that screenshot a few years ago, I would have called you a liar.
|
# ¿ Nov 22, 2013 05:32 |
|
PC LOAD LETTER posted:
Fixed cuz' you were right the 1st time.

Agreed. Chromatic aberration, barrel distortion, bokeh, reflections, and refractions all in real time at 1080p. If it was anti-aliased, I would have guessed a 2-3 hour render per frame.

KillHour fucked around with this message at 05:54 on Nov 22, 2013
# ¿ Nov 22, 2013 05:51 |
|
Found a "making of," if anyone's curious as to how they got the raytracing demo to run so smoothly.

http://directtovideo.wordpress.com/2013/05/07/real-time-ray-tracing/
http://directtovideo.wordpress.com/2013/05/08/real-time-ray-tracing-part-2/

The answer: lots of application-specific optimizations, and a rasterizer/raytracer hybrid engine.
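For anyone wondering what the core primitive of a raytracer actually is, here's a toy sketch (nothing like their optimized GPU hybrid; just the basic ray-sphere intersection every raytracer is built on):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None.

    Toy illustration only -- real engines like the one in that demo run
    millions of these per frame on the GPU, heavily restructured."""
    # Solve |o + t*d - c|^2 = r^2, a quadratic in t (a == 1 for unit d)
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None  # hits behind the origin don't count

# A ray fired straight down -z at a unit sphere 5 units away hits at t = 4:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```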
|
# ¿ Nov 23, 2013 04:42 |
|
Where did you get the Shodan wallpaper?

Edit: Never mind, it's included in the GOG edition of System Shock 2... which I have.

KillHour fucked around with this message at 01:42 on Nov 26, 2013
# ¿ Nov 26, 2013 01:39 |
|
I just bought into triple 4K monitors. Obviously, there's nothing on the market that can push that at full tilt in the latest games, but I'd like to get as close as possible for under $1k. They support FreeSync, but am I right in thinking that AMD can't touch nVidia's 1080 on price:performance?
|
# ¿ Jul 16, 2016 08:28 |
|
THE DOG HOUSE posted:
Yeah its the 1080 ... but

Go on...

Oh, if anybody is interested in picking up the same monitors, Massdrop has the same deal I bought them at.

https://www.massdrop.com/buy/lg-27-...y%20Promotional
|
# ¿ Jul 16, 2016 08:47 |
|
SwissArmyDruid posted:
Eh, I question the value of that monitor. It doesn't hit that range and is only 40Hz - 60 Hz.

I had a 120Hz monitor as my last one, and I had a hard time telling the difference between 60 and 120 in games anyways. Maybe I'm just broken. So the 1080 is the way to go, even though I'll miss out on FreeSync?
|
# ¿ Jul 16, 2016 16:51 |
|
With that in mind, is there anywhere with aftermarket cards actually in stock? And is there any advantage to getting one with 3 DisplayPort outputs, or is HDMI 2.0 on par now?
|
# ¿ Jul 16, 2016 17:48 |
|
6 THOUSAND dollars? I hope that's not USD.
|
# ¿ Jul 17, 2016 18:57 |
|
FaustianQ posted:
We're entering bizarro timeline where AMD drivers and control panel are consistently better than Nvidia's. I am okay with this.

I'm not. nVidia is still a better buy for the money. I shouldn't have to choose between good drivers and good performance.
|
# ¿ Jul 18, 2016 03:13 |
|
spasticColon posted:
Okay, who on here is going to buy a $1200 Titan X?

I'm considering it in a vain attempt to drive my 3x 4K monitors.
|
# ¿ Jul 24, 2016 16:14 |
|
Yeah.. I know. I'm probably going to get a 1080 and run them all at 1080p with GPU scaling to 4K. Hopefully, by the time a GPU comes out that can drive them properly, either nVidia will support freesync (hah!) or AMD will have something worth buying on the high end.
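For the curious, here's the raw pixel math behind punting to 1080p with GPU scaling (assuming standard 3840x2160 UHD panels):

```python
# Why triple 4K is so brutal to drive, in one back-of-envelope calculation.
uhd = 3840 * 2160        # 8,294,400 pixels per monitor
triple_uhd = 3 * uhd     # 24,883,200 pixels across the whole wall
p1080 = 1920 * 1080      # 2,073,600

print(triple_uhd / p1080)  # 12.0 -- like filling a single 1080p screen
                           #        twelve times over, every frame
print(uhd / p1080)         # 4.0  -- rendering at 1080p per panel cuts the
                           #        shaded pixels 4x before the GPU scales up
```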
|
# ¿ Jul 24, 2016 18:41 |