Byolante
Mar 23, 2008

by Cyrano4747
Comparing TwinFrozr vs DirectCU2 vs ACX for a GTX 780, is there any meaningful difference which would make buying one of them more attractive if they were all the same price?

craig588
Nov 19, 2005

by Nyc_Tattoo
MSI and Asus both have spotty reputations for GPU boards.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Byolante posted:

Comparing TwinFrozr vs DirectCU2 vs ACX for a GTX 780, is there any meaningful difference which would make buying one of them more attractive if they were all the same price?

They all have to be -at least as good as the reference model- at power delivery, noise levels, construction quality, and a lot of other things that are a little less directly intuitive - that's thanks to nVidia's Greenlight program. So what you'll want to scour reviews for are the things each card improves on over the reference design. Better power delivery might mean superior overclocking stability during long boosting. Top tier memory modules might mean that you can ramp the memory from 6GHz effective up to 7GHz effective instead of the more common 6350-6500MHz range. A more aggressive BIOS might mean that you get more voltage to play with and quicker stepping through each boost bin to reach and stay at maximum clock in games that can actually put that much power to use. And different cooling setups can not only be quieter overall, their sound frequency peaks can sit in ranges that are more or less noticeable compared to the reference cooler.
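If you want to put numbers on that memory headroom, the arithmetic is simple: peak bandwidth is the effective data rate per pin times the bus width. A quick Python sketch - the 384-bit bus and 6GHz effective stock speed are the GTX 780's reference spec; the 6.5GHz and 7GHz figures are just the typical and lucky-bin examples from above:

[code]
# Rough sketch: what a GDDR5 memory overclock buys you in peak bandwidth.
# 384-bit bus and 6.0 GHz effective are the GTX 780 reference spec; the
# 6.5 and 7.0 GHz targets are the hypothetical overclocks from the post.

def gddr5_bandwidth_gbs(effective_clock_ghz, bus_width_bits):
    """Peak bandwidth in GB/s: effective data rate per pin times bus width in bytes."""
    return effective_clock_ghz * bus_width_bits / 8

stock = gddr5_bandwidth_gbs(6.0, 384)
for target in (6.5, 7.0):
    oc = gddr5_bandwidth_gbs(target, 384)
    print(f"{target} GHz effective: {oc:.0f} GB/s "
          f"(+{100 * (oc / stock - 1):.1f}% over {stock:.0f} GB/s stock)")
[/code]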

Another thing to consider: WARRANTY! If that fancy card breaks, who will fix it? How? What can you get away with doing without voiding the warranty? This should be explicitly noted; read the fine print. AND THE ALL CAPS PRINT, AS THEY LIKE TO PUT PRETTY IMPORTANT STUFF IN GIGANTIC RUN-ON SENTENCES THERE THAT ARE DIFFICULT TO PARSE, BUT READ THEM ANYWAY BECAUSE THEY AFFECT YOU.

And finally there's binning - with higher end cards, you're hopefully paying for the manufacturer having taken the time to do some careful validation of the chips, so that they can offer models capable of high overclocks with low transistor leakage. Even on really good air cooling that matters, and it can affect overclocking stability considerably: wasted TDP is still TDP as far as the rules-lawyer power management hardware and software is concerned, and waste heat is still heat you have to effectively exhaust. This does not necessarily correspond to a GPU-Z Validation % score, by the way - validation ought to be done in situ on a test board to ensure, at the BARE MINIMUM, that the GPU will perform as advertised. I strongly doubt that any of the makers of these competitive aftermarket cooling and power delivery cards are throwing any old chip in them and letting users who pay a considerable percentage more than stock just take their chances. That would be a waste of the fancy power delivery and cooling.
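To make the "wasted TDP is still TDP" point concrete, here's a toy Python model - not nVidia's actual boost algorithm, just an illustration - where leakage eats into a fixed board power budget and costs you boost bins. The 863MHz base clock and 250W limit are the 780's reference figures and 13MHz is the commonly reported Kepler boost bin size; the dynamic power numbers are made up for the example:

[code]
# Toy illustration: the boost logic only sees total board power, so watts lost
# to leakage are watts you can't spend on clocks. NOT the real algorithm.
# 863 MHz base and 250 W are GTX 780 reference figures; 13 MHz is the commonly
# reported Kepler boost bin size; the dynamic power figures are invented.

def max_boost_clock(power_limit_w, leakage_w,
                    base_clock_mhz=863, bin_mhz=13,
                    dynamic_at_base_w=180.0, dynamic_per_bin_w=2.0):
    """Highest clock reachable before estimated board power hits the limit."""
    clock = base_clock_mhz
    power = leakage_w + dynamic_at_base_w
    while power + dynamic_per_bin_w <= power_limit_w:
        clock += bin_mhz
        power += dynamic_per_bin_w
    return clock

print(max_boost_clock(power_limit_w=250, leakage_w=20))  # leaky chip: tops out lower
print(max_boost_clock(power_limit_w=250, leakage_w=5))   # well-binned chip: more bins before the cap
[/code]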


Some ancillary considerations that may apply to you or not:

1. How many slots does the cooler require? If you're going to be using SLI, this can become especially important to consider. Even if not, you need to consider the airflow implications of a card that exhausts outside the case (reference blower) vs. one that exhausts fully inside the case (most after-market multi-fan coolers) vs. the neat hybrid setup that Asus came up with for the DCUII. It's dreamy... But EVGA is my first love, mainly because their cooler is the second best (woohoo! silver medal! god drat you Asus for taking first place with that sexy, sexy DCUII), it's a true 2-slot design, and their warranty is second to none.

2. Value-add. Some companies bundle games or software with their cards. Do you want games or software? See what they're offering!

3. Power consumption. While some coolers operate using relatively simple setups (simple, but reliable! EVGA's ACX cooler for the GTX 780 and above is an example of this), others can get pretty sophisticated and quite power hungry on their own. Removing a lot of heat can be a challenge, and you may need to pay attention to the power draw of the system when the card is under full load to get an idea of how much the cooler "costs" your power supply. Again, this will mean looking through various reviews; there's no one-size-fits-all answer here.

4. Non-standard display support, like additional Displayport outs. If you're using several monitors with DP, this could matter a lot! Or not at all, if you're not.

5. Aesthetics. This is a dumb reason to not get a great card, and a terrible reason to get a mediocre card. But some people really, really want to match their color scheme.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

craig588 posted:

MSI and Asus both have spotty reputations for GPU boards.

:raise: You sure about that? That would disagree with the return and malfunction rates we got from that one gigantic French distributor. No significant variance in nVidia cards across all manufacturers, though AMD vendors were all over the damned place. It would seem that the Greenlight program is doing its job.

But if you've got a source on this I'd be interested to see it, certainly.

craig588
Nov 19, 2005

by Nyc_Tattoo
The more recent models from both of them haven't had any issues come up, but MSI was the one who made a mistake on one of their 600 series boards and fed a regulator that was supposed to see 5V with 9V (or something way out of spec along those lines). Asus used to have quality all over the place, but their recent offerings starting around the 400 series have been much better. They used to be the ones who strayed furthest from the reference board design, trying to leverage their non-reference motherboard design experience, but it didn't work out too well most of the time.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Ghostpilot posted:

Just as a heads up, Anandtech's updated their GPU Bench for 2014. 780 TI's, r9 290's and everything in between!

How come there's only two companies that compete in this space? Will we ever see a third option that is viable? Does it matter?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

fletcher posted:

How come there's only two companies that compete in this space? Will we ever see a third option that is viable? Does it matter?

Read everything this poster says and everything linked and become enlightened or just take it as "Intel's coming up, but basically it's going to be those three for the foreseeable future yeah."

Professor Science
Mar 8, 2006
diplodocus + mortarboard = party

Agreed posted:

Read everything this poster says and everything linked and become enlightened or just take it as "Intel's coming up, but basically it's going to be those three for the foreseeable future yeah."
lmao, thanks Agreed. I thought I posted in here more over the past year. apparently you're trying to make me write long posts, oh well.

so okay, a brief overview of the GPU market. what does it take to build a GPU?

- the right hardware architects. these people are employed overwhelmingly at one of four companies: NVIDIA, AMD, Intel, Qualcomm (acquired ATI's old mobile GPU division back in the day, and they've picked up most of the fleeing hardware people from AMD in the past year). there are other players too, like Imagination (PowerVR) and ARM (Mali). these people can be and are poached between companies, and that's really the only way to acquire them besides training your own. successful GPUs are a ludicrous collection of special-purpose units, and knowing the right ways to build those units can only be learned from what's come before. the bottom line is that there just aren't that many of these people.

- the right software people. all GPUs are obviously very dependent on drivers, so for (let's say) a desktop GPU you need essentially three classes of people: the extremely low-level people that can deal with the chip directly, the slightly higher-level people that can implement the kernel-mode interface with WDDM, and everyone else that is required to implement the APIs that sit on top of that. the third class in particular is also very hard to find people for, because OpenGL and Direct3D are really complicated standards with eighty billion corner cases. this also means both driver people and compiler people; keep in mind that developing a good optimizing compiler for a GPU is one of the hardest compiler tasks imaginable because GPU ISAs are so complex. (both AMD and NV GPUs are effectively variable register count, where using more registers results in lower potential parallelism for potentially higher per-warp/wavefront performance. have fun making that tradeoff automatically! there's a rough sketch of that tradeoff right after this list.)

- emulators of some sort. these chips are big, you need some way to run things before tapeout. all those software people won't be able to do too much without them, either.

- access to a fab, specifically TSMC or Intel on the latest process (GloFo is way behind and probably unusable).
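to put rough numbers on that register/parallelism tradeoff from the software bullet, here's a quick python sketch using Kepler-class SM limits (65,536 32-bit registers and 2,048 resident threads per SM, 32-thread warps; AMD wavefronts work analogously with different limits) - spending more registers per thread leaves fewer warps resident to hide memory latency:

[code]
# Back-of-envelope occupancy math for the register/parallelism tradeoff.
# Limits below are Kepler-class SM figures; other GPUs differ.
REGISTERS_PER_SM = 65536
MAX_THREADS_PER_SM = 2048
WARP_SIZE = 32

def resident_warps(regs_per_thread):
    """Warps that fit on one SM if register pressure is the only limiter."""
    threads = min(REGISTERS_PER_SM // regs_per_thread, MAX_THREADS_PER_SM)
    return threads // WARP_SIZE

for regs in (32, 64, 128, 255):
    print(f"{regs:>3} registers/thread -> {resident_warps(regs):>2} resident warps")
# 32 regs/thread keeps 64 warps around to hide latency; 255 regs/thread leaves
# only 8, even though each of those warps may individually run faster.
[/code]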

you need 40-50 hardware people and 100+ software people, probably three years for both (for your first chip if you're starting from scratch). let's assume each person gets $150k (realistically they cost a lot more than that but whatever). you're at ~70M before you even have a chance at making money. add another $5M for emulators of some form (those things ain't cheap).

okay, you've spent $75M and are now ready to tape out. get another $30-50M (tapeout costs are already insane for 28nm at TSMC and only getting more expensive!), and now you have a product that, assuming it actually works, you can sell, assuming you are able to sell into OEMs, convince game devs to actually test with it so the experience isn't trash out of the gate, solve various compatibility woes, and never run out of money at any point during this time. oh, and you better have a follow-up in 18 months that is significantly better, or you're out of business and this was all pointless.
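if you want to sanity-check that math, here's the same back-of-envelope in python - every number is just the guess from above (deliberately lowballed salaries, $5M for emulators, $30-50M for a 28nm tapeout at TSMC), not real data:

[code]
# Back-of-envelope cost of a from-scratch GPU company, using only the
# post's own guesses. None of these are real figures.
hardware_people = 45        # "40-50 hardware people"
software_people = 105       # "100+ software people"
cost_per_person = 150_000   # deliberately lowballed, per year
years = 3

payroll = (hardware_people + software_people) * cost_per_person * years
emulators = 5_000_000
pre_tapeout = payroll + emulators            # roughly the ~$75M figure above
tapeout_low, tapeout_high = 30_000_000, 50_000_000

print(f"payroll:     ${payroll / 1e6:.1f}M")
print(f"pre-tapeout: ${pre_tapeout / 1e6:.1f}M")
print(f"total:       ${(pre_tapeout + tapeout_low) / 1e6:.0f}M to "
      f"${(pre_tapeout + tapeout_high) / 1e6:.0f}M before a single card is sold")
[/code]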

just to make things worse, you have much bigger companies doing exactly the same things, and your only hope is that you somehow offer better perf/$ and perf/W. there exists no differentiating feature in this market anymore, so those are your metrics. meanwhile, the market continues to get smaller (IGPs are getting better, you're not going to compete with Intel on that), so you have to target only the very high-end, a small and extremely discriminating market.

I've vastly underestimated the time and resource requirements here, probably by at the very least a factor of two, but you get the point. despite all we talk about them, for the vast majority of the market, desktop GPUs are a commodity. it's a race to the bottom, there's very little money to be made. HPC (the compute market) is a high-margin low-volume market that leverages a lot of the same tech, so that helps, but it's not that much overall.

now, there's traditionally a wildcard in this equation: mobile. that's another post altogether, but it doesn't change the desktop GPU equation too much. (fun fact: I can't think of any GPU that made it to market and was started from scratch since 2000, which is when what is now Mali originated)

edit: fun fact number two: I think it's literally impossible to build a GPU without violating patents from at least NVIDIA, AMD, and probably Intel. they don't go after each other because mutually assured destruction, but they will smash/acquire any new player if it looks to be anything at all interesting.

Professor Science fucked around with this message at 10:07 on Dec 31, 2013

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

craig588 posted:

MSI and Asus both have spotty reputations for GPU boards.

Huh? First I've heard of that.

Anecdotally I've had an unlocked MSI 6950 running in my machine for a couple of years, and although it's still young, an MSI 280X (both cards use TwinFrozr coolers) I bought has been running under heavy load for about a month without any stability issues.

HalloKitty fucked around with this message at 10:54 on Dec 31, 2013

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen
Trying to get a clear consensus on this here and can't find a clear winner. Comparing the GTX 780 to the R9 290, I'm gaming at 1440p. Both can be had for approximately the same price.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

BusinessWallet posted:

Trying to get a clear consensus on this here and can't find a clear winner. Comparing the GTX 780 to the R9 290, I'm gaming at 1440p. Both can be had for approximately the same price.

http://anandtech.com/bench/product/1068?vs=1036 - The R9 290 tends to be a little faster in more games than a GTX 780 and seems to deal with bumps in resolution better. Some games just run better on ATi or nVidia hardware so you won't get a clear winner in this price bracket.

I don't know how the custom cooled nVidia GTX 780's stand up but a custom cooled R9 290 benches out pretty drat close to a R9 290X after factory overclocks and is probably a pretty good value when looking at the high end of things.

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen

Beautiful Ninja posted:

http://anandtech.com/bench/product/1068?vs=1036 - The R9 290 tends to be a little faster in more games than a GTX 780 and seems to deal with bumps in resolution better. Some games just run better on ATi or nVidia hardware so you won't get a clear winner in this price bracket.

I don't know how the custom cooled nVidia GTX 780's stand up but a custom cooled R9 290 benches out pretty drat close to a R9 290X after factory overclocks and is probably a pretty good value when looking at the high end of things.

Thanks for this. I have a fairly small case so heat might be an issue, and I honestly don't ever plan on altering the cooling on either card. Does that make a difference? I hear the 780 overclocks really well with stock cooling, but not so much the R9 290.

Schpyder
Jun 13, 2002

Attackle Grackle

BusinessWallet posted:

Thanks for this. I have a fairly small case so heat might be an issue

If you can't get a custom-cooled 290, then definitely get the 780. Reference-cooled 290s have terrible, terrible thermal performance. At the same price, AMD's price/perf advantage falls away, and NV's far superior reference cooler makes the biggest difference, IMO.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Schpyder posted:

If you can't get a custom-cooled 290, then definitely get the 780. Reference-cooled 290s have terrible, terrible thermal performance. At the same price, AMD's price/perf advantage falls away, and NV's far superior reference cooler makes the biggest difference, IMO.

I'd agree. The reference R9 290 is likely maxing out the thermal capabilities of its cooler at stock, so with heat being an issue a GTX 780 would be better, especially if you can get a decent OC without any fancy coolers.

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen
Awesome. Since the custom cooled 780s are available like the 290s aren't, would you recommend one of the custom cooled 780s over a reference card? For instance this or this instead of this?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

BusinessWallet posted:

Awesome. Since the custom cooled 780s are available like the 290s aren't, would you recommend one of the custom cooled 780s over a reference card? For instance this or this instead of this?

In Australia at least, the very first custom cooled 290 and 290X came into stock today (both from XFX). I'm sure America can't be too far behind, we are usually the other way around.

As far as the GTX780 goes, its reference cooler is really good, but a custom cooled model will still be cooler and quieter in ideal conditions. Consider the reference model if you plan on doing SLI, or you have a super tiny and cramped case with limited airflow.

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen

The Lord Bude posted:

In Australia at least, the very first custom cooled 290 and 290X came into stock today (both from XFX). I'm sure America can't be too far behind, we are usually the other way around.

As far as the GTX780 goes, its reference cooler is really good, but a custom cooled model will still be cooler and quieter in ideal conditions. Consider the reference model if you plan on doing SLI, or you have a super tiny and cramped case with limited airflow.

The US retailers have the custom cooled 290s available, but they've been much more expensive and never in stock due to the coin mining craze. I am returning another card to buy this one and only have a few days to do the exchange, so time is an issue. I might just pick up that EVGA reference card.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Finally got my 780 DCII from christmas hooked up... why in the world would a bad displayport cable keep a machine from booting :smith:

Anyway, huge performance improvement from the 580. I can play Tomb Raider with everything maxed at 1200p and average 55fps (no vsync bringing it down; SSAA and TressFX are spendy). To be fair, I guess that's the lowest setting of SSAA - I think there's a higher one, but I can't imagine the performance hit for that.

On the overclocking side I haven't fully messed with the memory on it, but according to afterburner my core clock when gaming is about 1228 and the temp hangs around 77C. It's substantially quieter than the 580 lightning. I've heard the 780 lightning is even nicer but it seemed impossible to find and I didn't want the gift buyer in question to have to bend over backwards, so whatever. I'm happy.

Dogen fucked around with this message at 18:36 on Dec 31, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

On the overclocking side I haven't fully messed with the memory on it, but according to afterburner my core clock when gaming is about 1228 and the temp hangs around 77C. It's substantially quieter than the 580 lightning. I've heard the 780 lightning is even nicer but it seemed impossible to find and I didn't want the gift buyer in question to have to bend over backwards, so whatever. I'm happy.

That's a good sample, no need to go digging deeper unless you just want to. You've got very good silicon to hit that frequency with a 780 to begin with.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
That's with temp target at 95 and max voltage (1.21 I think) so there really isn't anything else I can do to it. Not nearly as much voltage/temp balancing as the 580.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

That's with temp target at 95 and max voltage (1.21 I think) so there really isn't anything else I can do to it. Not nearly as much voltage/temp balancing as the 580.

Ooh, 1.21V, you get an extra .01V to play with compared to a GTX 780 SC ACX - and 1.2V was the maximum voltage for the launch 780s across the board, too. Some of them were several mV under, BIOS locked, and as a result had some stability issues with higher clocks, but the EVGA SC ACX models shipped with 1.2V from the factory. It was a nice selling point if you knew about it.

I used to consider dicking around with my 780's allowable voltage but decided against it on the basis of not really needing the extra 52MHz I might be able to nab if I upped it, but Asus seems to feel that such reasoning is for chumps, haha.

That unit has some additional power delivery circuitry, doesn't it? I don't know how necessary or beneficial that is at the standard OCs we're typically reaching for, honestly; the stock VRM setup is usually plenty to clock to whatever the chip and memory modules can give you, but I guess it can't hurt either.

Agreed fucked around with this message at 21:16 on Dec 31, 2013

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Dogen posted:

Finally got my 780 DCII from christmas hooked up... why in the world would a bad displayport cable keep a machine from booting :smith:

Oh god I am glad I was not the only one that had this happen to them. Well not glad, but still.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

deimos posted:

Oh god I am glad I was not the only one that had this happen to them. Well not glad, but still.

poo poo I knew I should've put out a bat signal here or in the parts picking thread, probably could've saved me a lot of hassle. I tried swapping the mobo and CPU since it was popping as a boot device error... sigh.

Star War Sex Parrot
Oct 2, 2003

BusinessWallet posted:

Trying to get a clear consensus on this here and can't find a clear winner. Comparing the GTX 780 to the R9 290, I'm gaming at 1440p. Both can be had for approximately the same price.
This isn't the thread for parts picking.

Zettace
Nov 30, 2009
Anyone know if the Geforce 460 can finally use the latest drivers? I tried upgrading my drivers a few months ago and no matter what version, the drivers would crash my desktop constantly. In the end I just ended up returning to 314.22 and haven't upgraded since.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Zettace posted:

Anyone know if the Geforce 460 can finally use the latest drivers? I tried upgrading my drivers a few months ago and no matter what version, the drivers would crash my desktop constantly. In the end I just ended up returning to 314.22 and haven't upgraded since.

Same exact thing happened to me. If you use Firefox, try disabling Hardware Acceleration in the Firefox preferences. That of all things was causing my constant crashes with anything after 314.22.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


I managed 9 days of uptime* on 331.93 on a 560 Ti, and no TDRs since I installed it a few weeks ago.

*The reboot was intentional and unrelated.

^ ^ ^ I used Firefox with hardware acceleration on as my primary browser the whole time too, because I like living dangerously.

insidius
Jul 21, 2009

What a guy!
I'm currently sitting with a 670 and looking to upgrade. My primary issue is I have been using a 30 inch Dell for probably the last 24 months at 2560x1600, so the 670 just is not cutting it for me in a lot of cases.

I have no immediate need to upgrade, it would just be nice. I've been going through benchmarks for single cards, Crossfire, SLI, etc. Is it worth buying now for the performance increases I will get? I have been reading on the nVidia side that we should see the introduction of a new architecture sometime this year, and I'm just wondering if that is worth holding out for. I only really want to buy once this year if I can help it and have something that stays good for a while.

It's hard for me to follow these days now that gaming is simply something I do in my spare time rather than my entire life.

beejay
Apr 7, 2002

This is where you want to be.

Star War Sex Parrot posted:

This isn't the thread for parts picking.

insidius
Jul 21, 2009

What a guy!
Oh, my most sincere apologies. I read the OP and somehow still managed to miss that.

regulargonzalez
Aug 18, 2006
UNGH LET ME LICK THOSE BOOTS DADDY HULU ;-* ;-* ;-* YES YES GIVE ME ALL THE CORPORATE CUMMIES :shepspends: :shepspends: :shepspends: ADBLOCK USERS DESERVE THE DEATH PENALTY, DON'T THEY DADDY?
WHEN THE RICH GET RICHER I GET HORNIER :a2m::a2m::a2m::a2m:

I don't suppose anyone has a spare 7850 they're looking to sell? Preferably one with a 3rd party cooler (TwinFrozr or w/e)

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

fletcher posted:

Same exact thing happened to me. If you use Firefox, try disabling Hardware Acceleration in the Firefox preferences. That of all things was causing my constant crashes with anything after 314.22.

That and Nvidia's drivers have been pretty buggy since what, the 319 or 320 revisions? I used 314.22 as soon as I installed my 670 because I've heard nothing but horror stories about crashes, BSODs and performance problems with anything after that version.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Ozz81 posted:

That and Nvidia's drivers have been pretty buggy since what, the 319 or 320 revisions? I used 314.22 as soon as I installed my 670 because I've heard nothing but horror stories about crashes, BSODs and performance problems with anything after that version.

The R331 drivers have been extremely stable for me, can't speak to others' experiences. I had a spate of issues with the introductory drivers for the GTX 780 and for some time after, but it's been several drivers since I had any actual complaints, thankfully. Lately just feature improvements, like Shadowplay filling out and adding virtually-free streaming to its bag of tricks, and "oh neat that game goes faster now wheeee" stuff.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ozz81 posted:

That and Nvidia's drivers have been pretty buggy since what, the 319 or 320 revisions? I used 314.22 as soon as I installed my 670 because I've heard nothing but horror stories about crashes, BSODs and performance problems with anything after that version.
Those complaints were specific to pre-Kepler cards. Modern cards should be using the latest drivers. Disabling hardware acceleration also murders Firefox performance, so it shouldn't be anyone's go-to fix; it SHOULD be working, so it's smarter to just fix whatever is wrong and enjoy the videocard you paid for.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Like I said, I have 331.93 working now, under exactly the circumstances that are supposed to bring Fermi cards to their knees. Then again, I waited some two weeks for reports to filter in before committing to it, and I cut out everything that wasn't relevant to a 560 Ti from the installer folder before installing (so just the top-level, Display.Driver, HDAudio, NVI2 and PhysX remained). I assume you'll forgive me for not trusting an app that all but shames users of older cards to give those users the actual best auto-settings for them.
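(If anyone wants to replicate that pruning step: extract the driver package - 7-Zip will open the installer .exe - then drop the component folders you don't need before running setup. A rough Python sketch; the extract path is hypothetical and the keep list is just the folders named above:)

[code]
# Prune an extracted NVIDIA driver package down to the components named above.
# The path is a placeholder; adjust the keep list for your own card and needs.
import shutil
from pathlib import Path

EXTRACTED = Path(r"C:\NVIDIA\331.93")   # wherever you extracted the installer
KEEP = {"Display.Driver", "HDAudio", "NVI2", "PhysX"}

for entry in EXTRACTED.iterdir():
    # Leave top-level files (setup.exe etc.) alone; only remove component
    # subfolders that aren't on the keep list.
    if entry.is_dir() and entry.name not in KEEP:
        print(f"removing {entry.name}")
        shutil.rmtree(entry)
[/code]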

After eight months of this I really don't want to take chances anymore, and I'll probably ride this card into the ground - unless, or hopefully until, in-the-wild Intel integrated video can put up a decent showing.

And for whoever said it upthread: turning off hardware acceleration in a browser and not crashing anymore is dumb luck - Windows as a whole is hardware accelerated now. I'll take your word that browsers lean on it less (the next well-coded, standards-compliant Web browser will probably be the first), but it's still a bad workaround for the reasons Alereon said, and it should probably not be recommended unless that's the only way to keep even semi-reliably using the Web on that computer.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Sir Unimaginative posted:

And for whoever said it upthread: turning off hardware acceleration in a browser and not crashing anymore is dumb luck - Windows as a whole is hardware accelerated now.

How can you even make a claim like this?

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Ghostpilot posted:

I haven't had any issues in games or anywhere else, frankly, but when I browse SA this happens:



It'll go away when I scroll it off the screen or do anything that causes a refresh, but I can't think of why it happens. Any ideas?

Here's a weird one for ya. Turns out that the above only happens on SA and only if I'm at the bottom of a page. Turning off hardware acceleration fixes it, but as long as I don't hang around at the bottom of a thread it doesn't appear in the first place. :psyduck:

Ghostpilot fucked around with this message at 04:49 on Jan 3, 2014

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Ghostpilot posted:

Here's a weird one for ya. Turns out that the above only happens on SA and only if I'm at the bottom of a page. Turning off hardware acceleration fixes it, but as long as I don't hang around at the bottom of a thread it doesn't appear in the first place. :psyduck:

Blame the unobtrusive ads.

Mike the TV
Jan 14, 2008

Ninety-nine ninety-nine ninety-nine

Pillbug
Quick question-

I just upgraded to an R9 270X and, it being an AMD card, had to change the overscan settings in Catalyst Control Center so the screen would not have a stupid black border (why is this not automatic??). That's all fine, but now in games I still get a similar black border around the screen. I can't seem to find an option in CCC, and it makes some text a little blurry and harder to read in-game.

Any ideas?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Use DVI instead of HDMI. Really. HDMI makes the card assume it's plugged into a TV, and most TVs still overscan even though they shouldn't any more, so overscan compensation is turned on by default. Meanwhile, the TV is making its own assumptions about overscan and whatnot. With DVI, you don't have to worry about this at all. Or trudge through menus turning off overscan on both the screen and the PC.
