Octopode
Sep 2, 2009

No. I work here. I manage operations for this and integration for this, while making sure that their stuff keeps working in here.

Wengy posted:

I only have the one monitor cable, but I just connected the card to my other PCI-E power cable and it suddenly worked again. It's probably just a coincidence, though. I think it's a driver/software issue; there's a big thread on the GeForce forums about people getting black screens and "no signal" messages on boot.

When I installed my 980 I was having this issue. I managed to solve it by going into Control Panel and uninstalling all my nVidia drivers using the actual uninstall program, followed by a reboot and installing the latest drivers with the 'clean install' option.
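If you'd rather script the reinstall half, something like this should do the same job from an admin command prompt. Treat it as a sketch only: the extracted-package path is just an example, and the -s / -clean / -noreboot switches are assumptions from memory, so double-check both against your driver package before relying on it.

code:

rem Hypothetical sketch: silent "clean install" of an already-extracted driver package.
rem The path below is only an example of where the installer unpacks itself;
rem the -s (silent), -clean and -noreboot switches are assumptions, so verify first.
cd /d "C:\NVIDIA\DisplayDriver\344.75\Win8_WinVista_Win7_64\International"
setup.exe -s -clean -noreboot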

Ragingsheep
Nov 7, 2009
Is 8.8k in Firestrike what you'd expect from an MSI GTX 970 and an i5-4590?

LRADIKAL
Jun 10, 2001

Fun Shoe
Go look at the comparison with other systems that got your score. If they match your system, then it's probably a normal score. The whole POINT of the benchmark is to show you how your system compares!

DarthBlingBling
Apr 19, 2004

These were also dark times for gamers, as we were shunned by others for being geeky or nerdy and computer games were seen as child's playthings. During these dark ages, whispers began circulating about a 3D space combat game called Elite

- CMDR Bald Man In A Box

Ragingsheep posted:

Is 8.8k in Firestrike what you'd expect from an MSI GTX 970 and an i5-4590?

This is what I got before I upgraded my PC recently (just over 10k)

http://www.3dmark.com/fs/3011369

We have the same GPU, and the CPU was a 3570K OCed to 4.2GHz. Yours should be a bit faster than mine, I believe.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Got my Mini-ITX 970 and fiddled around with it in SLI.

Dropping the card in took a little bit of finagling with space. I had to make sure the good ol' HD Audio case speaker cable was down flat, because with the mini-970 in the second PCIe slot, it was the only thing between the power supply and the card. (E: the PCIe power cable went straight from my PSU to the 8-pin socket on the baby card, no routing under the motherboard needed.) Overall, on my micro-ATX board the card covers one fan and two-thirds of a second fan of the Gigabyte WindForce 3X card above it. It also manages to avoid my side case fan. With a bottom case fan blowing on that remaining opening, both cards are managing quite well on temps, much like the earlier trip report with this setup in this thread.

Overclocking was done with GPU Boost 2.0 via MSI Afterburner. Once I found where to switch the clock controls between GPUs, I tipped some sliders over until the two cards' core clocks and memory clocks matched.

DSR and MFAA are disabled, as expected. I'd love to see DSR start working with SLI on this ASUS PG278Q in the future for ludicrous downsampling at luxurious frame rates.

Shinies all turned up on FFXIV at 1440p still puts me way into the zone of diminishing returns for G-SYNC (and also being solidly CPU-limited until its DX11 client comes out). I've got my colors re-calibrated now for ULMB and am going to try out the Vsync "On (smooth)" setting for a while (unless someone on this thread has a better suggestion or something else they want to see).

Please don't spend as much money on a personal computer as me.

Sidesaddle Cavalry fucked around with this message at 06:45 on Dec 1, 2014

Seamonster
Apr 30, 2007

IMMER SIEGREICH
I'm trying to use the Nvidia Inspector utility to prevent my 750M from throttling at ~82C (by way of the "prioritize temperature" setting), but when I try to save the settings they always revert to default. I'm running it as admin, etc. Is there some fuckery going on in the BIOS or something?

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Oh oh before I forget, a new mini-ITX 970 was announced by ASUS, in case you hate Gigabyte or Zotac with a burning passion for some reason.

http://www.techpowerup.com/207609/asus-readies-geforce-gtx-970-directcu-mini.html#269f09b496d946fca47426a1e73eb52d

Personally, I'm a little leery of the same cooler being reused from both the 670 and 760 on this, after seeing the whole (not wholly deserved) fiasco with EVGA's ACX. Still, I'm sure ASUS is aware of the same thing, so time and reviews will tell.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Sidesaddle Cavalry posted:

Oh oh before I forget, a new mini-ITX 970 was announced by ASUS, in case you hate Gigabyte or Zotac with a burning passion for some reason.

http://www.techpowerup.com/207609/asus-readies-geforce-gtx-970-directcu-mini.html#269f09b496d946fca47426a1e73eb52d

Personally, I'm a little leery of the same cooler being reused from both the 670 and 760 on this, after seeing the whole (not wholly deserved) fiasco with EVGA's ACX. Still, I'm sure ASUS is aware of the same thing, so time and reviews will tell.

The DirectCU cooler was the worst of its generation of coolers, unlike the Strix cooler that replaced it on current generation cards, so I wouldn't expect this to cool particularly well or keep noise at a reasonable level.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Are any of you stalking some Cyber Monday deals? I have a GTX 760 that's not quite a year old, but I could always snag an upgrade there if the price is right.

Schiavona
Oct 8, 2008

exquisite tea posted:

Are any of you stalking some Cyber Monday deals? I have a GTX 760 that's not quite a year old, but I could always snag an upgrade there if the price is right.

I doubt there's a significant upgrade for you that will have a sale, at least on team green.

1gnoirents
Jun 28, 2014

hello :)
Are there any significant downsides to mini 970s besides lower cooling potential? The board seems so small to me. I see, from at least this one picture, that it has a single 8-pin connector as well. But I was never fully convinced the 970s really *needed* 6+8 in the first place.

If I had to guess, it'd just be less overclocking headroom, though I have no idea how much less. But again, the sticking point for me is the board itself. Any time I've seen the PCB, I'd certainly not describe it as barren with room to shrink. Is anything given up there?

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I doubt you'd be giving up much in the way of overclocking potential, it seems like the 970s are held back by their really conservative 110% TDP limit rather than any thermal issues.
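If you're curious where your card's limit actually sits, nvidia-smi ships with the driver and can report the power limits it exposes; fair warning that some GeForce cards just report N/A for these. A minimal sketch, assuming the standard install path:

code:

rem Query the power limits the driver exposes (default, enforced, min/max).
rem Standard Windows install path assumed; adjust if yours differs.
"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d POWER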

veedubfreak
Apr 2, 2005

by Smythe

Lowen SoDium posted:

Display Fusion is a good alternative to UltraMon. It's available on Steam and goes on sale during most sale events. I use it to automate different display profiles when I switch output from my monitors to my HDTV.

I tried out Display Fusion for a little while on my triple screen, but then somehow managed to hit the right combination of buttons that let me Windows-P between Eyefinity and extended desktop mode. Any tips on using Display Fusion? My biggest issue with both Surround and Eyefinity is when I want to switch between games that use all 3 monitors and games that only use a single screen. Fullscreen D3 and Shadowrun are both super janky when spread across 3 screens.

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

veedubfreak posted:

I tried out Display Fusion for a little while on my triple screen, but then somehow managed to hit the right combination of buttons that let me Windows-P between Eyefinity and extended desktop mode. Any tips on using Display Fusion? My biggest issue with both Surround and Eyefinity is when I want to switch between games that use all 3 monitors and games that only use a single screen. Fullscreen D3 and Shadowrun are both super janky when spread across 3 screens.

I can't say anything specific to Eyefinity, but what I am doing is this:

In Display Fusion, I have made a monitor profile named "Desktop" that has my two DVI-connected monitors enabled and my HDMI-connected TV disabled. And I have another profile named "TV" that has the two monitors disabled and the TV enabled.

I am actually calling the profiles using batch files that are launched in a ridiculously complex way from a separate computer that runs XBMC, but that gets beyond the scope of this.

Anyways, you can change a profile in a batch file like this:

code:

cd "\Program Files (x86)\DisplayFusion"
start DisplayFusionCommand.exe -monitorloadprofile TV
where TV is the name of the profile you want to load.

So you could make a batch file that changes your monitor profile to whatever you want and then starts your game.
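For example, something like this (the game path is a placeholder; the profile names are the ones from my setup above):

code:

rem Hypothetical example; swap in your own profile names and game path.
cd /d "%ProgramFiles(x86)%\DisplayFusion"
DisplayFusionCommand.exe -monitorloadprofile TV
start /wait "" "C:\Games\MyGame\game.exe"
rem flip back to the desktop profile once the game exits
DisplayFusionCommand.exe -monitorloadprofile Desktop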

Or, you can define hotkeys for each profile in settings -> functions -> scroll down to Monitor Configuration and manually change between them before you launch a game.

Bleh Maestro
Aug 30, 2003
So, I have a Korean QNIX monitor that has NO on-screen menu controls...

I need to adjust the screen position like you would if there WERE on-screen controls, but I can't figure it out. I used the nvidia panel, which was my best guess, but I can't find any option to simply shift the screen up-down or side-to-side. Only scaling options.

Any ideas here?

life is killing me
Oct 28, 2007

So I just recently built a rig and opted for the Radeon R9 290, which has been a great card so far. However, I'm having issues with the display not being as sharp as I feel it should be. I'm using my 32" LED/LCD TV as a monitor, which, while not the best choice, seems to work fine for my laptop in terms of sharpness. So why can my nearly-three-year-old laptop display sharply at 1080p via HDMI, while my brand-new rig with a comparably monstrous GPU puts out a fuzzy picture? It's not terrible by any means--it's playable, but I'd like it to be sharper.

I've already tried designating HDMI2 as a PC input, with no change. Is it something to do with under/overscan?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Bleh Maestro posted:

So, I have a Korean QNIX monitor that has NO on-screen menu controls...

I need to adjust the screen position like you would if there WERE on-screen controls, but I can't figure it out. I used the nvidia panel, which was my best guess, but I can't find any option to simply shift the screen up-down or side-to-side. Only scaling options.

Any ideas here?

They only have brightness controls.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

life is killing me posted:

So I just recently built a rig and opted for the Radeon R9 290, which has been a great card so far. However, I'm having issues with the display not being as sharp as I feel it should be. I'm using my 32" LED/LCD TV as a monitor, which, while not the best choice, seems to work fine for my laptop in terms of sharpness. So why can my nearly-three-year-old laptop display sharply at 1080p via HDMI, while my brand-new rig with a comparably monstrous GPU puts out a fuzzy picture? It's not terrible by any means--it's playable, but I'd like it to be sharper.

I've already tried designating HDMI2 as a PC input, with no change. Is it something to do with under/overscan?

You might be imagining it. HDMI is a digital signal: either it works and there's no signal degradation, or the cable is hosed and you get no signal at all; there's no in-between fuzziness like with the old VGA cables.

Hit the "Auto" button on the monitor to see if it helps adjust the display to pixel-perfect. There are things like DPI and overscan that come into play with Windows and the AMD Catalyst control panel respectively, but you have to go out of your way to alter those defaults.

Have you tried with different games? Call of Duty Advanced Warfare has a weird thing where if you crank up a certain lighting precache feature, it knocks all textures down to low without the settings reflecting that.

SlayVus
Jul 10, 2009
Grimey Drawer

Bleh Maestro posted:

So, I have a Korean QNIX monitor that has NO on-screen menu controls...

I need to adjust the screen position like you would if there WERE on-screen controls, but I can't figure it out. I used the nvidia panel, which was my best guess, but I can't find any option to simply shift the screen up-down or side-to-side. Only scaling options.

Any ideas here?

Is there an overscan or underscan option? Like, is the screen just offset to one side, or is it too big and hanging off the edge?

Bleh Maestro
Aug 30, 2003

life is killing me posted:

So I just recently built a rig and opted for the Radeon R9 290, which has been a great card so far. However, I'm having issues with the display not being as sharp as I feel it should be. I'm using my 32" LED/LCD TV as a monitor, which, while not the best choice, seems to work fine for my laptop in terms of sharpness. So why can my nearly-three-year-old laptop display sharply at 1080p via HDMI, while my brand-new rig with a comparably monstrous GPU puts out a fuzzy picture? It's not terrible by any means--it's playable, but I'd like it to be sharper.

I've already tried designating HDMI2 as a PC input, with no change. Is it something to do with under/overscan?

Are you sure the resolution is set correctly? Double-check; that would definitely do it, I think.

SlayVus posted:

Is there an overscan or underscan option? Like, is the screen just offset to one side, or is it too big and hanging off the edge?

In borderless window mode I can see the line of my minimized taskbar, so I wanted to just adjust the screen down a notch. I tried overscanning in the nvidia panel, but it says it's maxed.

Bleh Maestro fucked around with this message at 04:41 on Dec 2, 2014

Ragingsheep
Nov 7, 2009
Any reason why my MSI 970 4G is showing a 1304MHz clock speed in Afterburner in Far Cry 4 and Firestrike even though I haven't touched any of the settings? It doesn't seem to match any of the advertised speeds.

BurritoJustice
Oct 9, 2012

Ragingsheep posted:

Any reason why my MSI 970 4G is showing a 1304MHz clock speed in Afterburner in Far Cry 4 and Firestrike even though I haven't touched any of the settings? It doesn't seem to match any of the advertised speeds.

Nvidia cards automatically step up boost bins (13MHz intervals) until they hit their stock power draw limit or their temperature limit. More efficient VRMs and better cooling let them clock above even the stock maximum boost clock. For example, a card with a rated 1278MHz boost clock that steps up two bins lands right at your 1304MHz.

Ragingsheep
Nov 7, 2009

BurritoJustice posted:

Nvidia cards automatically step up boost bins (13MHz intervals) until they hit their stock power draw limit or their temperature limit. More efficient VRMs and better cooling let them clock above even the stock maximum boost clock. For example, a card with a rated 1278MHz boost clock that steps up two bins lands right at your 1304MHz.

Thanks. My most recent card was a 6850 so I'm still trying to catch up with it all.

life is killing me
Oct 28, 2007

Bleh Maestro posted:

Are you sure the resolution is set correctly? Double-check; that would definitely do it, I think.


Zero VGS posted:

You might be imagining it. HDMI is a digital signal: either it works and there's no signal degradation, or the cable is hosed and you get no signal at all; there's no in-between fuzziness like with the old VGA cables.

Hit the "Auto" button on the monitor to see if it helps adjust the display to pixel-perfect. There are things like DPI and overscan that come into play with Windows and the AMD Catalyst control panel respectively, but you have to go out of your way to alter those defaults.

Have you tried with different games? Call of Duty Advanced Warfare has a weird thing where if you crank up a certain lighting precache feature, it knocks all textures down to low without the settings reflecting that.

I don't think I'm imagining it. I'm not really talking about games, just the desktop in general. I can definitely see a difference. In games it's not so noticeable because it's hard to tell, but on the Windows desktop (Win 8.1) the text and icons are fuzzier than they should be. I looked up the issue for a possible solution, and some people have had the same problem, but with Samsung TVs. All the solutions said to force the TV to recognize the input as a PC so that it would cancel out the postprocessing TVs usually do that degrades the signal: if it thinks it's a TV signal, it does the postprocessing and makes things fuzzier. So I renamed the input / designated it as a PC input: nothing. Restarted: nothing. I have an Insignia TV, i.e. Best Buy's store brand.

The resolution is set correctly. This is a 1080p TV and I made sure the Catalyst Control Center knew it.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

life is killing me posted:

The resolution is set correctly. This is a 1080p TV and I made sure the Catalyst Control Center knew it.

This is why. Text rendering on TVs is terrible because they usually use chroma subsampling: http://en.wikipedia.org/wiki/Chroma_subsampling#4:2:0

You can't beat a monitor for a PC display. PC monitors and TVs have some similarities, but they are not really the same thing.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

life is killing me posted:

So I just recently built a rig and opted for the Radeon R9 290, which has been a great card so far. However, I'm having issues with the display not being as sharp as I feel it should be. I'm using my 32" LED/LCD TV as a monitor, which, while not the best choice, seems to work fine for my laptop in terms of sharpness. So why can my nearly-three-year-old laptop display sharply at 1080p via HDMI, while my brand-new rig with a comparably monstrous GPU puts out a fuzzy picture? It's not terrible by any means--it's playable, but I'd like it to be sharper.

I've already tried designating HDMI2 as a PC input, with no change. Is it something to do with under/overscan?

It depends on the TV, but 1080p doesn't really mean all that much when it comes to resolution settings for TVs. For one thing, some have built-in over/underscan that you can't disable, which will throw off the pixel mapping and give you blurry text/games. If you can find a setting that changes the pixel mapping or disables any overscan, then it should fix the sharpness problems. If it's a cheap TV though and it doesn't let you do that, you're poo poo outta luck.

I actually had the same problem, and my only solution was to switch to a VGA cable and make sure the TV was getting a proper native 1080p signal using a program called Custom Resolution Utility. VGA cables don't carry the same clean digital signal HDMI does, but if the resolution is set correctly it can be a massive improvement over hosed-up-looking fuzzy HDMI.

Anyway, I have my own problem. Well, it's not a problem per se, but most games these days seem to use DX11 renderers, which is cool and all, but D3DOverrider doesn't seem to support DX11 games. That means I can't get nice triple-buffered v-sync in these games, since almost all of them just use double-buffered v-sync. I hate screen tearing and love how nice 60fps v-synced games look, but most games these days can't seem to maintain 60fps constantly on my R9 290, so it's a lot of running at 60fps, dropping down to 30 for a split second, then back up to 60 again. Constantly. It sucks. I know I can just limit the framerate to 30 to avoid this, but if I could just enable triple buffering it'd solve all this nonsense. Googling around shows no useful results, and from some of the links I read, game developers seem to be saying that DX11 supports it but developers aren't implementing it for some reason.

Edit: OK, I have no idea why I didn't know this, but if you enable desktop composition, then set your game to borderless window mode, it apparently gives you triple buffering for free? I tested it on Lords of the Fallen and it works, but that just highlights some really funky performance problems with that game. It plummets to about 40fps whenever you cast a spell and stays at that framerate for a while even when the spell effect isn't on screen. Just wait a while and it'll shoot back up to 60 for no reason. Odd.

cat doter fucked around with this message at 07:13 on Dec 2, 2014

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe
At least with my Asus VG278H, when I made the mistake of hooking it up to my Radeon R9 270X with an HDMI cable instead of DVI-D, the Windows drivers as well as OS X defaulted to feeding it YCbCr 4:2:2 instead of RGB. OS X needed an EDID override to remove the TV attribute; Windows just needed Catalyst Control Center to be reconfigured to output RGB.

The_Franz
Aug 8, 2003

cat doter posted:

It depends on the TV, but 1080p doesn't really mean all that much when it comes to resolution settings for TVs. For one thing, some have built-in over/underscan that you can't disable, which will throw off the pixel mapping and give you blurry text/games. If you can find a setting that changes the pixel mapping or disables any overscan, then it should fix the sharpness problems. If it's a cheap TV though and it doesn't let you do that, you're poo poo outta luck.

I actually had the same problem, and my only solution was to switch to a VGA cable and make sure the TV was getting a proper native 1080p signal using a program called Custom Resolution Utility. VGA cables don't carry the same clean digital signal HDMI does, but if the resolution is set correctly it can be a massive improvement over hosed-up-looking fuzzy HDMI.

How old is your TV? I don't think I've seen a 1080p TV made in the last 5 or 6 years that doesn't default to 1:1 pixel mapping when feeding it the native resolution via HDMI. Even the older 1366x768 displays I've seen do 1:1 mapping when feeding them their native resolution.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers

The_Franz posted:

How old is your TV? I don't think I've seen a 1080p TV made in the last 5 or 6 years that doesn't default to 1:1 pixel mapping when feeding it the native resolution via HDMI. Even the older 1366x768 displays I've seen do 1:1 mapping when feeding them their native resolution.

It's a couple of years old; it's just a cheap Chinese piece of poo poo. The HDMI ports seem to have overscan built into them, and there's no setting to turn it off. The EDID information is wrong when plugged in over HDMI too; it defaults to 1280x720 as the native resolution, and adding a custom 1080p resolution still doesn't fix the overscan. Once I switched to VGA and added a custom 1080p resolution, the pixel mapping was perfect and the image was very clean and sharp, comparable to a proper HDMI signal. It doesn't bother me anymore really.

Kleen_TheRacistDog
Feb 17, 2014

Can't bust the Krust fuckman
www.skullmund.com
I want to upgrade my setup to two 24" 1920x1200 monitors. I want to use both simultaneously for all applications other than gaming. For gaming, I just want to use one.

What's the best way to set this up via software configuration? I was reading about the "nVidia Surround" config procedure, but I'm not sure it applies to me. What are my options for achieving my desired result while minimizing the "fiddling" I'll have to do day-to-day when switching from 2-monitor normal mode to 1-monitor gaming mode? Thanks

Edit:

Second, unrelated question: assuming money is no object and gaming performance is my only concern, would it make any sense to upgrade from a GTX 760 to a GTX 970, given that I have an Intel i5-4670K @ 3.4GHz processor? Or would the processor bottleneck gaming performance, thereby not letting me use the added horsepower of the 970? Thanks!

Kleen_TheRacistDog fucked around with this message at 13:33 on Dec 2, 2014

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Kleen_TheRowdyDog posted:

I want to upgrade my setup to two 24" 1920x1200 monitors. I want to use both simultaneously for all applications other than gaming. For gaming, I just want to use one.

What's the best way to set this up via software configuration? I was reading about the "nVidia Surround" config procedure, but I'm not sure it applies to me. What are my options for achieving my desired result while minimizing the "fiddling" I'll have to do day-to-day when switching from 2-monitor normal mode to 1-monitor gaming mode? Thanks

Edit:

Second, unrelated question: assuming money is no object and gaming performance is my only concern, would it make any sense to upgrade from a GTX 760 to a GTX 970, given that I have an Intel i5-4670K @ 3.4GHz processor? Or would the processor bottleneck gaming performance, thereby not letting me use the added horsepower of the 970? Thanks!

A current generation CPU is not going to bottleneck your gaming. A 970 is easily twice as powerful as a 760, but whether it makes sense to upgrade depends on whether you're getting enough performance now. The 760 is new enough that you could probably get 140-150 for it.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Kleen_TheRowdyDog posted:

I want to upgrade my setup to two 24" 1920x1200 monitors. I want to use both simultaneously for all applications other than gaming. For gaming, I just want to use one.

Plug in both monitors.

Play games on main monitor.

The end.

Generic Monk
Oct 31, 2011

What would be the best aftermarket cooler for a 290X? I'd like to make it sound less like a jet engine, since it kind of defeats the point of my having a 'silent' case. I was thinking of the Arctic Accelero Xtreme IV, but it's ugly, takes up like 5 slots, and apparently isn't even that good at cooling the card.

Apparently the best solution would be some kind of closed-loop liquid cooling - any suggestions that would fit a Fractal Define R4 well? I've never done any kind of liquid setup before - what should I be looking at that won't spray water over my case and electrocute me? And what kind of flexibility can I get, since if it works well I might want to replace my Hyper 212 CPU cooler too? Getting a bit tired of the 290X having the heat and noise output of an Apollo launch.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Generic Monk posted:

What would be the best aftermarket cooler for a 290X? I'd like to make it sound less like a jet engine, since it kind of defeats the point of my having a 'silent' case. I was thinking of the Arctic Accelero Xtreme IV, but it's ugly, takes up like 5 slots, and apparently isn't even that good at cooling the card.

Apparently the best solution would be some kind of closed-loop liquid cooling - any suggestions that would fit a Fractal Define R4 well? I've never done any kind of liquid setup before - what should I be looking at that won't spray water over my case and electrocute me? And what kind of flexibility can I get, since if it works well I might want to replace my Hyper 212 CPU cooler too? Getting a bit tired of the 290X having the heat and noise output of an Apollo launch.

You'd be better off selling the 290X and buying a GTX 970. More GPU power, substantially less noise and heat, half the power draw or less, and it probably wouldn't cost much more than a good aftermarket cooler for the 290X once you factor in the sale of the 290X. Less risk too, especially when you start talking about liquid cooling.

If you haven't already, replace all the stock fans in your Define R4 - the fans Fractal Design supplies are super quiet but absolute poo poo at actually cooling anything. Nanoxia makes fans that cool better and are still designed to be quiet, and Phanteks makes even better cooling fans that are also really quiet. Noctua fans are great but tend to be really expensive. Doing that will make life easier for your GPU and CPU coolers, allowing them to operate more quietly.

As for your Hyper 212: good air coolers make far less noise than liquid and are nearly as good at cooling - you'd be better off buying a cooler from Phanteks, Noctua, or Cryorig.

The_Franz
Aug 8, 2003

kode54 posted:

At least with my Asus VG278H, when I made the mistake of hooking it up to my Radeon R9 270X with an HDMI cable instead of DVI-D, the Windows drivers as well as OS X defaulted to feeding it YCbCr 4:2:2 instead of RGB. OS X needed an EDID override to remove the TV attribute; Windows just needed Catalyst Control Center to be reconfigured to output RGB.

On top of that, it's really annoying how AMD's drivers still default to 10% underscan when using the HDMI connection, even when using the display's native resolution. I could understand doing this 10 years ago, when most people had CRTs and projection TVs with massive overscan, but with the exception of crappy bottom-shelf Chinese displays, TVs default to 1:1 pixel mapping these days.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

The_Franz posted:

On top of that, it's really annoying how AMD's drivers still default to 10% underscan when using the HDMI connection

The best part is how in some cases, it loves to forget that you changed that setting in the CCC and resets to underscan. Reboot? Reset. Change resolution? Reset. UAC prompt? Freaking reset.

:psypop:

Had that issue on one of my studio's workstations and ended up finding a registry fix after scouring Google for an hour.

Generic Monk
Oct 31, 2011

Y'know, I was just about to post that selling off the 290X and getting a 980 or something would be more economical than trying to band-aid the flaws endemic to the card. Plus I've always much preferred Nvidia's drivers and feature set to AMD's iffy implementations. If I were to do that, I'm leaning toward the 980, as I have a 1440p monitor and SLI still hasn't swayed me as being worth the money.

The_Franz
Aug 8, 2003

Jan posted:

The best part is how in some cases, it loves to forget that you changed that setting in the CCC and resets to underscan. Reboot? Reset. Change resolution? Reset. UAC prompt? Freaking reset.

:psypop:

Had that issue on one of my studio's workstations and ended up finding a registry fix after scouring Google for an hour.

The sad thing is that fixing this is probably just a matter of changing one or two lines that set the default value, but for whatever reason they just won't do it. I can't imagine that people really want this behavior since neither Nvidia nor Intel do it by default.

The_Franz fucked around with this message at 15:15 on Dec 2, 2014

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE

Never seen this one before - is it just a cosmetic change? It looks a lot nicer than that gaudy-as-gently caress dragon MSI keeps slapping on everything.


1gnoirents
Jun 28, 2014

hello :)

Instant Grat posted:

Never seen this one before - is it just a cosmetic change? It looks a lot nicer than that gaudy-as-gently caress dragon MSI keeps slapping on everything.

No, it's a smaller heatsink and different fans. MSI has had this line for at least the last two gens. If the difference is $10, I always recommend the "Gaming" bump, if for nothing else but the fans. I really hate those fans, though I haven't heard them on a 970.

If you think the dragon is lame, wait until you see "DRAGON ARMY" stamped into the actual metal now :v:

edit: I totally feel you on the gaudy stuff, but at the end of the day if it were pink with little disco balls hanging off it I'd still get it if it were better for the $

1gnoirents fucked around with this message at 16:51 on Dec 2, 2014
