|
Tab8715 posted:Is there such a quiet video card? I don't know if it's just me but I find them obscenely loud. Sure. Many vendors have a mid-to-low-range card (that's the catch) that's cooled passively. incoherent fucked around with this message at 03:35 on May 12, 2012
# ¿ May 12, 2012 03:32 |
|
|
Factory Factory posted:Intel's folly Intel is shipping poo poo (human fecal matter) and getting people to buy it. Think about this: it ONLY supports DX10.1, and ships DX9.0 natively. We're two years and a service pack into Windows 7 and they're shipping DX9 hardware. incoherent fucked around with this message at 09:04 on May 29, 2012
# ¿ May 29, 2012 09:01 |
|
Factory Factory posted:Update to this: Fairly sure someone higher up the food chain said "fix it". While it doesn't affect the greater AMD, they don't want anything out there saying "AMD (the organization) is insecure", even if it's just a part of the video card drivers. Much ado, though, when people are actively defeating cryptographic hashes in Windows Update.
|
# ¿ Jun 11, 2012 07:28 |
|
NihilCredo posted:AMD has just released a new beta driver, 12.7, which upon installation turned out to offer a massive performance increase in the one game I play that still made my 5770 struggle (Skyrim with tons of graphic mods). Problem is, those same 12.7 drivers are also the ones that drop support for older video cards, including my 4200. As soon as I install them, the second screen goes black and the 4200 is reported as not working correctly in Device Manager. You could modify the driver package with the hardware ID string for your 4200. Try diffing the INF file of the old driver against the new one and forcing Device Manager to install both. I don't think AMD immediately removed everything to do with those cards, and I've seen INF fuckery work before (for example, Silicon Image drivers that work in Windows 7 x64 but not Server 2008 R2).
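To sketch the idea (the section name, service name, and device ID below are illustrative guesses, not copied from a real AMD INF — pull the actual line from the old driver's INF), you'd copy the 4200's device entry into the matching models section of the new driver package:

```ini
; Hypothetical INF fragment. The [Models] section name and the
; ati2mtag_RS880 install section are placeholders; find the real
; line for your card in the last driver that supported it.
[ATI.Mfg.NTamd64.6.1]
"AMD Radeon HD 4200" = ati2mtag_RS880, PCI\VEN_1002&DEV_9710
```

One catch: once the INF no longer matches the package's signed catalog, Windows will warn about an unsigned driver, so expect to click through the "install anyway" prompt.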
|
# ¿ Jun 30, 2012 08:21 |
|
I need this. For my HTPC. e: I don't even have an HTPC.
|
# ¿ Jun 30, 2012 08:23 |
|
I believe I've found a terrible use for my GTX 670. I'm watching the rover land on Mars with this app: http://eyes.nasa.gov/ I've locked the framerate at 30 fps and I'm running 16xQ CSAA. Note that Java doesn't hit the graphics card that hard, but it is going to look sweet. I love the adaptive frame rate capabilities (tried it on my older 260; it didn't look nearly as smooth). incoherent fucked around with this message at 02:59 on Aug 6, 2012
# ¿ Aug 6, 2012 02:56 |
|
Factory Factory posted:For Premiere's Mercury Engine, the one that does rendering, it's still mostly Nvidia-only with only a few official Radeons in Macbook Pros. Hacking in support for Nvidia cards is easy, but hacking in support for Radeons is not. This should be a legit Adobe gripe blog post.
|
# ¿ Aug 18, 2012 04:47 |
|
Amusingly, Sleeping Dogs is an AMD-branded joint. The game is a surprise sleeper hit (and legitimately fun). If Nvidia is going that far to make it work, then they must have dropped the ball in marketing and are attempting to save face. It really is the kind of game you want to show off a video card with. fake edit: it fixed some noticeable stuttering issues when locking in half-rate vsync.
|
# ¿ Aug 28, 2012 07:10 |
|
Agreed posted:They support very high quality scaling, and you can choose whether to scale while maintaining aspect ratio, or stretch-to-fit, with other options available as well. It is dramatically superior in quality to the mediocre nearest-neighbor quickie scaling done by typical monitors when operating at non-native resolutions, but I forget exactly which type of scaling method is employed. Both companies have had on-board high quality scaling that's basically free (in terms of performance and latency) for a long, long time now, the 600-series with latest drivers are no exception. People with high resolution screens have remarked on their successful preference for setting demanding games' internal resolution lower and then allowing the graphics card to scale that up so that they can play with visual goodies enabled that would otherwise kill the steady performance they enjoy. This (or a variation of it) should go in the OP. Also, should my digital color format be anything other than RGB? My monitor can do the YCbCr settings, but I lose my mouse pointer (a bug, perhaps). The colors look very vibrant... slightly oversaturated, but nothing an adjustment couldn't fix. incoherent fucked around with this message at 19:49 on Nov 25, 2012
# ¿ Nov 25, 2012 19:41 |
|
Dodoman posted:Anyone else having issues with the GeForce Experience tool from Nvidia? It's been stuck on scanning for games since last night and I have no idea if it's doing anything at all (though presumably it's not working). I couldn't find out how to sign up for the beta so I was surprised it ran at all - is this why it's not working? Mine is working fine. I just pointed it manually to the steamapps folder, the EA Games/Origin folder, and whatever other games you want optimized. Although I'd like to say that in Borderlands 2, fullscreen windowed is the superior option.
|
# ¿ Dec 9, 2012 21:57 |
|
-RX designations carry a less-than-one-year warranty depending on the card. Check that part number! E: Who's running the GeForce Experience beta software? It's basically cloud-sourced "optimized" settings, but in practice they seem contradictory. For example, Nvidia's advertising touts TXAA support for ACIII, but the recommended setting is "high", which is the setting below TXAA. incoherent fucked around with this message at 09:10 on Dec 12, 2012
# ¿ Dec 12, 2012 09:01 |
|
Tab8715 posted:The current video cards make my room way too goddamn warm. My two 260s used to keep my room warm in the winter (unbearable in the summer). Sometimes I wonder if my 670 is even on.
|
# ¿ Jan 29, 2013 09:50 |
|
At least we now know what the baseline look of a UT-powered game will be.
|
# ¿ Mar 30, 2013 03:53 |
|
You're going to be hard-pressed to find anything that crushes the current video card crop until 2015. Outside of BF4 and Watch_Dogs, I think we've seen the best (BioShock) we're going to get graphically this year. e: unless the next CoD game for this year is going to include high-def, wrinkly old dudes. incoherent fucked around with this message at 06:48 on Apr 4, 2013
# ¿ Apr 4, 2013 06:46 |
|
No, I think it's laughable that they'd compete with Eyefinity, but it does bode well for multi-monitor high-DPI/high-resolution support.
|
# ¿ May 2, 2013 06:38 |
|
lovely Treat posted:I just got myself an ASUS GeForce GTX 660 DirectCU II OC which is my first Nvidia card since the days of the Geforce 6800 ultra so I am a little out of touch with Nvidia stuff. Could be a failed overclock on the card. Typically, low factory-OC cards are basically "pray it will run at this new clock speed" jobs that rely on overhead built into the hardware. Sure they do testing, but it amounts to "will it run 3DMark and not crash?". Real, hard-core factory-OC cards are typically volt-modded (e.g. EVGA FTW models) or have better PWM circuitry (e.g. Classifieds) to support the 20% or so increase in clock speed, but they're still at the mercy of the silicon quality. incoherent fucked around with this message at 16:56 on May 21, 2013
# ¿ May 21, 2013 16:50 |
|
I'm genuinely surprised that the 620 isn't giving you the option for 4K. Are you sure the monitor's EDID is sending the right information? Did you try hacking the registry to force 4K?
|
# ¿ May 30, 2013 06:01 |
|
This is impressive. incoherent fucked around with this message at 19:34 on May 31, 2013
# ¿ May 31, 2013 19:31 |
|
Kramjacks posted:It looks like a wall-mounted AC unit. The busted kind that vents heat.
|
# ¿ May 31, 2013 21:19 |
|
Agreed posted:But don't worry, Self, you'll learn more in November. Promise. No really. Alright get on outta here NDA time adios. If developers have low-level access to the GPU hardware on both PS4 and PC, it might just lend itself to making the PS4 or PC the targeted platform of choice.
|
# ¿ Sep 26, 2013 06:16 |
|
VorpalFish posted:The 290 in general makes a lot more sense than the 290x - the performance difference between the two in no way justifies the $100 price difference. Well, the 670 and 680 had a similar performance gap. Consumers at this end will drop the extra $100 no matter how hard we bang the drum that it's a waste of money.
|
# ¿ Nov 29, 2013 23:50 |
|
veedubfreak posted:Slot 2 (the black slot) is actually what they suggest using as your primary slot for 1 or 2 cards as it is fully 16x. This board has the plx chip so it should be running 16/8/8 or 8/8/8/8 if you run 4. I just can't understand why adding a third card while making no other changes would have made gpu2 become primary. Is there a monitor on the second GPU?
|
# ¿ Jan 27, 2014 00:31 |
|
Looking at that 970 to replace my 670. What's the market rate for a 670? I've never encountered a game that taxed it (except BF4 on supa high), so it still has legs. I also know it's not a 770 or 770 Ti, so I'm being realistic.
|
# ¿ Sep 20, 2014 18:58 |
|
Reseat the video card or try a different PCIe slot? Double-check the 8-pin connectors and try alternating them.
|
# ¿ Dec 14, 2014 23:07 |
|
Sidesaddle Cavalry posted:Can we interrupt 970 chat again? This article got through to my dumb little reptilian brain and I need this API in every time-wasting MMO I have played ever. I'd like to know the various factors weighed and considered by development teams of existing games to adapt new APIs. I'm also interested in the timeframes (on average) required for such projects from "OK let's do this" to "argh my hands OK it's mostly done and tested come get it", but that's a selfish desire. The reason they chose Mantle: AMD gave them lots of money, and Frostbite is the de facto engine that will power all of EA's games. AMD just needs to dump the money bags once and it will be in every EA game for 2-3 years. Sure, EA doesn't have a Call of Duty (or rear end Creed... or a good Battlefield game lately), but it has a strong portfolio of games that immediately gets Mantle support. Far easier to get this going than TressFX.
|
# ¿ Jan 28, 2015 08:58 |
|
Someone should do an artisanal line of motherboards with bolted-on modern features for the Z68/Z77 chipset. For a time when CPUs were authentic and there was performance for the dollar.
|
# ¿ May 29, 2015 06:52 |
|
Trust me, someone will order 500 and export them.
|
# ¿ May 29, 2015 20:18 |
|
In GTA 5, the quality slider replaces blades of grass with cars if you go all the way to the right.
|
# ¿ May 31, 2015 06:54 |
|
KakerMix posted:Hell yeah 649, what a deal. That owns With the two new games, that effectively makes it 529.
|
# ¿ May 31, 2015 21:04 |
|
SSSSSSSSSSsssssssssssssssssssooooooooooo when do these cards get into the hands of the reviewers?
|
# ¿ Jun 17, 2015 02:15 |
|
salty tweets incoming https://twitter.com/bburke_nvidia/status/611009437864587264
|
# ¿ Jun 17, 2015 04:43 |
|
It's AMD's thing/dime. Nvidia is more than welcome to come to E3 and put on a "hearts and minds" event of its own. We just saw two console makers duke it out over two-plus hours, and console games came out on top. We need that as well.
|
# ¿ Jun 17, 2015 04:55 |
|
Lemme tell you about my GTX 670, which is playing The Division very, very well.
|
# ¿ Mar 17, 2016 03:11 |
|
snuff posted:Anything over 16gb is for the mentally ill with thousands of browser tabs open. Hey now: RAM disks
|
# ¿ Oct 20, 2016 07:07 |
|
Nixonator posted:I'm looking to step up to a 1060 6GB, out of MSI/Gigabyte/EVGA/ASUS, are there any known awful coolers? Like, am I screwing myself if I buy an ACX 2.0 cooler instead of an ACX 3.0? I'm not looking for anything over a factory overclock, I just don't want something that sounds like a vacuum cleaner all the time. I just got the EVGA GAMING version of the 1060, which is a single-fan, half-size card; it spins down to silence on the desktop and I cannot hear it at all under gaming load. While you're not looking for a factory OC, I'm easily boosting mine past the XXXFACTORYOCXXX levels of the stock card (1870 MHz), but boost is limited by thermals, so good airflow is a must. incoherent fucked around with this message at 22:26 on Nov 26, 2016
# ¿ Nov 26, 2016 22:20 |
|
If you don't have the latest driver, there is a bug where connecting more than one DP output will throttle the card up. For reference, I have three displays (2 DP, 1 HDMI) and it sits at ~900 MHz. However, your browser can trigger the GPU to clock up as well.
|
# ¿ Dec 2, 2016 06:48 |
|
Well, that doesn't fit the profile of the bug. What happens when you clock the 144 Hz monitor down to 60?
|
# ¿ Dec 2, 2016 07:03 |
|
LooKMaN posted:You can. I have 2x 24inch LCD connected to my GTX1070 and 1x HDTV connected to the iGPU of my 6700k CPU. The GTX 1070 is at 139 MHz core/202Mhz memory at idle. What I suspect is that the first two outputs are naturally freebies on the card, and the third hits the GPU. Using the onboard graphics is a solid idea to limit the needless heat.
|
# ¿ Dec 2, 2016 19:57 |
|
Wait, what? I have a 2600+ and it has the GPU on it*. Did you get a rare mobo with no outs? (*mine has garbage DVI+VGA)
|
# ¿ Dec 2, 2016 20:52 |
|
|
Nah, don't bother.
|
# ¿ Dec 7, 2016 04:12 |