DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Combat Pretzel posted:

Don't really know; I was just wondering about this with regard to those cards from bitcoin rigs being dumped onto eBay, as mentioned just above.
All current Radeons will happily throttle themselves in response to excessive thermals, so it's not like you can even get them up to 105C; they'll just downclock themselves to drop the temps for a bit, then ramp back up. The voltage oscillation from letting that happen for an extended period probably isn't doing the board any long-term favors, but as Ignoarints notes, no one actually lets that happen, because you get better overall performance from finding a way to keep them stable at <85C so they'll run at a constant speed, rather than letting them bounce between <800MHz and whatever the boost clock is.
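
(For anyone who wants to watch that throttle-and-recover dance for themselves: on an NVIDIA card, the rough Python sketch below just polls temperature and core clock once a second. It assumes nvidia-smi is on your PATH and that your driver exposes the temperature.gpu and clocks.sm query fields, so double-check the field names against nvidia-smi --help-query-gpu before trusting it.)

[code]
# Rough sketch: poll GPU temperature and core clock once a second so you can
# watch thermal throttling (clocks dropping, then ramping back up) live.
# Assumes an NVIDIA card with nvidia-smi on the PATH; the query field names
# are from memory, so verify them with `nvidia-smi --help-query-gpu`.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=temperature.gpu,clocks.sm",
         "--format=csv,noheader,nounits"]

while True:
    out = subprocess.check_output(QUERY, text=True)
    first_gpu = out.strip().splitlines()[0]
    temp_c, clock_mhz = [field.strip() for field in first_gpu.split(",")]
    print(f"{temp_c} C  {clock_mhz} MHz")
    time.sleep(1)
[/code]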

Ignoarints
Nov 26, 2010
Is there anything that is a good indication of what's stable for a GPU overclock? I've used Heaven to good effect before, but I'm getting crashes in-game way below what's stable in Heaven. I can do about 1228 MHz there, but I'm working my way down to 1163 now for actual games. Temperature is very good at 69 degrees with stock paste.

And my MSI 780ti doesn't seem to have a VRM temperature sensor, which is disappointing.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
Agreed and some others have written way too many words about just that topic, but the general rule of thumb for stability is that your stable clock is the highest one at which every game you care about runs without incident. It's very normal for certain outlier games to crash and reject clockspeeds that other (more demanding) ones could run under load 24/7, for mysterious optimization reasons you'll never discover.
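
(Put another way, that rule of thumb is just a step-down search: start from your benchmark-stable clock and back off one bin at a time until your crashiest game survives a long session. A rough sketch of the idea in Python is below; apply_offset and survives_long_session are hypothetical placeholders for whatever you actually use, e.g. setting the offset in Afterburner/Precision and then playing the game that crashes first.)

[code]
# Rough sketch of the "highest clock where every game you care about runs
# without incident" rule of thumb, as a simple step-down search.
# apply_offset() and survives_long_session() are hypothetical placeholders
# for your real tools (set the offset in Afterburner, then go play the game
# that crashes first).
STEP_MHZ = 13  # Kepler boost moves in roughly 13 MHz bins


def find_stable_offset(start_offset_mhz, games, apply_offset, survives_long_session):
    offset = start_offset_mhz
    while offset > 0:
        apply_offset(offset)
        if all(survives_long_session(game) for game in games):
            return offset       # first offset every game tolerates
        offset -= STEP_MHZ      # back off one bin and try again
    return 0                    # nothing held: stay at stock


# Usage: start from the best Heaven-stable offset and let the games veto it, e.g.
# stable = find_stable_offset(100, ["BF4", "Crysis 3"], apply_offset, survives_long_session)
[/code]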

Ignoarints
Nov 26, 2010

TheRationalRedditor posted:

Agreed and some others have written way too many words about just that topic, but the general rule of thumb for stability is that your stable clock is the highest one at which every game you care about runs without incident. It's very normal for certain outlier games to crash and reject clockspeeds that other (more demanding) ones could run under load 24/7, for mysterious optimization reasons you'll never discover.

Yeah, I've had a little wiggle room before, but it's always been about one step below what I can do in Heaven. Coming from 770s, even in SLI, it was a very good tool. In this case, for the same game with the same settings, I'm crashing to desktop much farther below what Heaven can do, is all.

Oh well, time to SLI :getin:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

TheRationalRedditor posted:

Agreed and some others have written way too many words about just that topic, but the general rule of thumb for stability is that your stable clock is the highest one at which every game you care about runs without incident. It's very normal for certain outlier games to crash and reject clockspeeds that other (more demanding) ones could run under load 24/7, for mysterious optimization reasons you'll never discover.

Too many is subjective, I think I have written the right number, which is too many. Words, I mean.

Driver hacks, weird logic pathways, it's all a mess but hey happy days seem to be on the way with games finally getting DX11+...ized. New games in 2013/2014 still DX9, the gently caress mane

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Ignoarints posted:

Yeah, I've had a little wiggle room before, but it's always been about one step below what I can do in Heaven. Coming from 770s, even in SLI, it was a very good tool. In this case, for the same game with the same settings, I'm crashing to desktop much farther below what Heaven can do, is all.

Oh well, time to SLI :getin:

At least veedub has a use case for his crazy expenses, you're just loving crazy.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Agreed posted:

Too many is subjective, I think I have written the right number, which is too many. Words, I mean.

Driver hacks, weird logic pathways, it's all a mess but hey happy days seem to be on the way with games finally getting DX11+...ized. New games in 2013/2014 still DX9, the gently caress mane
Well you write too many words about every computing topic you participate in, that doesn't mean they're not unwelcome!

Ignoarints
Nov 26, 2010

deimos posted:

At least veedub has a use case for his crazy expenses, you're just loving crazy.

To be fair I was looking into cases this week. Amazingly difficult decision. But then the case outlasts many builds for me. It's kind of sad because my current $50 case has such good airflow and a lot of choices seem to move in the opposite direction. I'm almost convinced to go dremel warrior and make the radiator spots I need, and perhaps a new intake on the bottom, and bank the case money towards a new card. Or a new processor and mobo... or the watercooling stuff I'll need... or a badly needed large SSD... :( I thought I was close to done too

edit: In case it interests someone, I reflashed my BIOS last night to see if I could get some stability out of higher clocks. Honestly, my temperatures kick rear end on air and stock stuff, so I figured why not in the meantime. I had really great results on a 660ti.

It seemed to go smoothly, way more so than the 660ti (I had issues with write protection there). All the settings I'd expect appeared: core voltage, power target bar now up to 200%... but I couldn't get anything more out of it. In fact, GPU-Z was reporting 1.037 volts instead of the 1.21 I was expecting, or even the stock 1.15-1.18. This was very weird, so I flashed it back and it's back to normal.

Don't know what I did wrong or why it didn't work.

Ignoarints fucked around with this message at 14:45 on Jun 4, 2014

veedubfreak
Apr 2, 2005

by Smythe

deimos posted:

At least veedub has a use case for his crazy expenses, you're just loving crazy.

Funny story, my reservoir keeps getting lower and lower, which obviously means I have a leak somewhere, but it's below the motherboard and other equipment, so I haven't bothered to look for it. Does this make me a bad person?

Wistful of Dollars
Aug 25, 2009

Eh, as long as it's not dripping on anything that could explode and/or catch fire you're all good.

Ignoarints
Nov 26, 2010
haha I totally misread, I thought he said veedub has a decent computer case (whereas mine is getting seriously inadequate soon).

780ti is pretty good man, but I can break it with just a single 1440p monitor.

craig588
Nov 19, 2005

by Nyc_Tattoo

Ignoarints posted:

All the settings I'd expect appeared: core voltage, power target bar now up to 200%... but I couldn't get anything more out of it. In fact, GPU-Z was reporting 1.037 volts instead of the 1.21 I was expecting, or even the stock 1.15-1.18. This was very weird, so I flashed it back and it's back to normal.

Don't know what I did wrong or why it didn't work.

Upload your edited bios and I'll take a look at it when I get a chance.

Ignoarints
Nov 26, 2010

craig588 posted:

Upload your edited bios and I'll take a look at it when I get a chance.

I don't know how to view this .rom. I remember I could before with the 660ti bios but I don't remember how, or why.

It is this one:

http://www.overclock.net/attachments/22651

veedubfreak
Apr 2, 2005

by Smythe

Ignoarints posted:

haha I totally misread, I thought he said veedub has a decent computer case (whereas mine is getting seriously inadequate soon).

780ti is pretty good man, but I can break it with just a single 1440p monitor.

I need more radiators. :coal:

craig588
Nov 19, 2005

by Nyc_Tattoo

Ignoarints posted:

I don't know how to view this .rom. I remember I could before with the 660ti bios but I don't remember how, or why.

It is this one:

http://www.overclock.net/attachments/22651

You just downloaded a random edited bios from the internet? I'd be too scared to do that. I don't have permission to download it, so I can't tell you what might be wrong with it.

There's a DOS tool called NVflash that will handle dumping and flashing BIOSes.

My guess is that they messed up the absolute values they put in for the power targets. The percentages exposed by the driver are only relative to whatever values are stored in the bios and don't really mean anything between two different cards and bioses. They might have exposed 200% power relative to their board, but compared to yours, even their 200% might only be equivalent to 90%. If I could download the bios I could confirm that, but I'm not comfortable just giving you offsets to change, since I don't have a GK110 board to flash and can't verify them, and there's also the problem of getting a correct checksum, which I don't remember how to do.

Edit: I think on GK110s, if you set it to temperature throttling, it ignores the power targets. I don't think you need to mess with anything in the bios.

craig588 fucked around with this message at 19:13 on Jun 4, 2014
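
(To put made-up numbers on craig588's point about the percentages being relative: the wattages in the sketch below are invented for illustration, not read out of that ROM, but they show how a maxed 200% slider on one bios can still mean fewer watts than a barely-moved slider on another.)

[code]
# Illustration only: the wattages below are invented, not read from any ROM.
# The driver's power-target slider is a percentage of whatever absolute value
# the bios stores, so percentages don't compare across bioses.
def effective_limit_watts(bios_base_watts, slider_percent):
    return bios_base_watts * slider_percent / 100.0

# A bios that stores a low base value, with its slider maxed at 200%...
modded = effective_limit_watts(bios_base_watts=115, slider_percent=200)  # 230 W
# ...versus a bios with a higher base value and the slider left at 100%.
stock = effective_limit_watts(bios_base_watts=250, slider_percent=100)   # 250 W

print(f"modded at 200%: {modded:.0f} W vs stock at 100%: {stock:.0f} W "
      f"({modded / stock:.0%} of the stock limit)")
[/code]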

Shaocaholica
Oct 29, 2002

Fig. 5E
This is relevant from the monitor thread:

El Scotch posted:

In more Computex news, AMD showed off an adaptive vsync prototype.

Interestingly, they suggest that there are monitors out there already that can do it with just a firmware update; exactly which monitors wasn't specified. They're also optimistic that DP 1.2a monitors will be released by the end of the year.

If this picks up speed, it would be lovely if Nvidia deliberately didn't support it to sell more gsync parts.
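
(For anyone wondering what adaptive sync actually buys you over plain vsync, here's a simplified sketch of the timing arithmetic; it ignores real-world details like a panel's minimum refresh rate, so treat it as illustration only.)

[code]
# Simplified sketch: fixed-refresh vsync vs adaptive sync on a 60 Hz panel.
# Ignores real-world limits such as the panel's minimum refresh rate.
import math

TICK_MS = 1000.0 / 60          # 16.7 ms between fixed 60 Hz refreshes


def vsync_display_ms(render_ms):
    # With vsync, a finished frame waits for the next fixed tick, so a 20 ms
    # frame ends up on screen for 33.3 ms (an effective 30 FPS).
    return math.ceil(render_ms / TICK_MS) * TICK_MS


def adaptive_display_ms(render_ms):
    # With adaptive sync, the panel simply refreshes when the frame is ready.
    return max(render_ms, TICK_MS)


for render_ms in (12.0, 20.0, 25.0):
    print(f"{render_ms:4.1f} ms frame -> vsync {vsync_display_ms(render_ms):4.1f} ms, "
          f"adaptive {adaptive_display_ms(render_ms):4.1f} ms")
[/code]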

EoRaptor
Sep 13, 2003

by Fluffdaddy

Shaocaholica posted:

This is relevant from the monitor thread:


If this picks up speed, it would be lovely if Nvidia deliberately didn't support it to sell more gsync parts.

nVidia is hemming and hawing about it, but it's part of the DisplayPort 1.2a standard, so they won't be able to use the DisplayPort branding or pass certification if they don't implement it. AMD seems to already have the hardware to support DP 1.2a on some of the 2x0 generation cards, so they likely only need driver updates. nVidia might have the same thing, but whether they choose to expose the feature is another story; they might make it new cards only.

Ignoarints
Nov 26, 2010

craig588 posted:

You just downloaded a random edited bios from the internet? I'd be too scared to do that. I don't have permission to download it, so I can't tell you what might be wrong with it.

There's a DOS tool called NVflash that will handle dumping and flashing BIOSes.

My guess is that they messed up the absolute values they put in for the power targets. The percentages exposed by the driver are only relative to whatever values are stored in the bios and don't really mean anything between two different cards and bioses. They might have exposed 200% power relative to their board, but compared to yours, even their 200% might only be equivalent to 90%. If I could download the bios I could confirm that, but I'm not comfortable just giving you offsets to change, since I don't have a GK110 board to flash and can't verify them, and there's also the problem of getting a correct checksum, which I don't remember how to do.

Edit: I think on GK110s, if you set it to temperature throttling, it ignores the power targets. I don't think you need to mess with anything in the bios.

I did, but it's from a very well-received bios author. The thread for it is 1110 pages long or something lol.

http://www.overclock.net/t/1438886/official-nvidia-gtx-780-ti-owners-club

I used nvflash and as far as I could tell it was successful, smoother than the last time I had to use it.

Shaocaholica
Oct 29, 2002

Fig. 5E

EoRaptor posted:

nVidia is hemming and hawing about it, but it's part of the DisplayPort 1.2a standard, so they won't be able to use the DisplayPort branding or pass certification if they don't implement it. AMD seems to already have the hardware to support DP 1.2a on some of the 2x0 generation cards, so they likely only need driver updates. nVidia might have the same thing, but whether they choose to expose the feature is another story; they might make it new cards only.

So if they limit it to new cards, that pretty much shrinks their g-sync market down to just 7xx owners? I mean, why would anyone with an 8xx or newer card want to buy into gsync?

veedubfreak
Apr 2, 2005

by Smythe

EoRaptor posted:

nVidia is hemming and hawing about it, but it's part of the DisplayPort 1.2a standard, so they won't be able to use the DisplayPort branding or pass certification if they don't implement it. AMD seems to already have the hardware to support DP 1.2a on some of the 2x0 generation cards, so they likely only need driver updates. nVidia might have the same thing, but whether they choose to expose the feature is another story; they might make it new cards only.

Will this require me to buy a DisplayPort splitter thingy? My monitors have the ability to chain DisplayPort from one monitor to the next; I wonder if it'll work that way or not.

Shaocaholica
Oct 29, 2002

Fig. 5E

veedubfreak posted:

Will this require me to buy a DisplayPort splitter thingy? My monitors have the ability to chain DisplayPort from one monitor to the next; I wonder if it'll work that way or not.

Oh cool I didn't know you could do this. Looks like DP KVMs just got a whole lot more fun.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

veedubfreak posted:

Will this require me to buy a DisplayPort splitter thingy? My monitors have the ability to chain DisplayPort from one monitor to the next; I wonder if it'll work that way or not.

It shouldn't, as long as you have enough DP bandwidth to drive your resolutions.

A single DP 1.2 link can drive up to a single 4K monitor.
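
(Back-of-the-envelope numbers behind that claim, as a sketch that ignores blanking overhead, so the real requirements run a bit higher: a DP 1.2 HBR2 link carries roughly 17.3 Gbit/s of usable data after 8b/10b encoding, which covers 4K60 at 24-bit color and leaves room to daisy-chain a few 1440p panels.)

[code]
# Back-of-the-envelope DP 1.2 bandwidth check. Ignores blanking overhead and
# assumes 24-bit color, so it's a rough sanity check, not a spec quote.
LANES = 4
HBR2_GBPS_PER_LANE = 5.4        # raw line rate per lane
PAYLOAD_RATIO = 8 / 10          # 8b/10b encoding overhead

link_gbps = LANES * HBR2_GBPS_PER_LANE * PAYLOAD_RATIO   # ~17.28 Gbit/s usable


def pixel_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9


for name, (w, h, hz) in {"1440p60": (2560, 1440, 60),
                         "4K60": (3840, 2160, 60)}.items():
    print(f"{name}: needs ~{pixel_gbps(w, h, hz):.1f} of ~{link_gbps:.1f} Gbit/s available")
[/code]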

craig588
Nov 19, 2005

by Nyc_Tattoo

quote:

Disabled boost
That's really stupid. You're better off not using that bios; you'll waste 100-200 watts without boost, plus the extra heat forces the fans to run faster and make more noise. You get no benefit whatsoever in real-world situations. If you only need 900MHz and 0.98v to hit 60FPS, the driver is smart enough not to ramp the card up any higher. Without boost you're just at full power all the time for no good reason.

Just max out the temperature limit and you're good on GK110 cards.
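
(The 100-200 watt figure passes a rough sanity check if you lean on the usual dynamic-power rule that power scales roughly with frequency times voltage squared. The clocks and voltages below are just the example figures from this discussion, not measurements, and leakage is ignored, so it's ballpark only.)

[code]
# Rough dynamic-power sketch: P scales roughly with f * V^2 (leakage ignored).
# The clocks/voltages are example figures from the thread, not measurements.
def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Card pinned at its max clock and voltage vs. boost relaxing to 900 MHz at
# 0.98 V when that's all a 60 FPS cap actually needs:
ratio = relative_power(1160, 1.21, 900, 0.98)
print(f"pinned at max draws roughly {ratio:.1f}x the dynamic power of the relaxed state")
# ~2x, which on a 250 W-class card is in the same ballpark as the 100-200 W above.
[/code]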

Ignoarints
Nov 26, 2010

craig588 posted:

That's really stupid. You're better off not using that bios; you'll waste 100-200 watts without boost, plus the extra heat forces the fans to run faster and make more noise. You get no benefit whatsoever in real-world situations. If you only need 900MHz and 0.98v to hit 60FPS, the driver is smart enough not to ramp the card up any higher. Without boost you're just at full power all the time for no good reason.

Just max out the temperature limit and you're good on GK110 cards.

I've had it disabled before, but it still ramps down correctly. Like, it will still run at 350 MHz when it needs to. It just disables the boost function so you can lock in the max clock without worrying about it stepping up beyond a stable zone, which I've had trouble with in SLI before. Also, the voltage is still variable. Well, assuming it works correctly (which it hasn't for me on this card).

I've had boost disabled on a 660ti, although it was MUCH harder for some reason. I had to kind of trick it with Nvidia Inspector.

My stock temperature is currently at 70°C at 105% TDP, and it just becomes unstable in-game above ~1160 MHz. It just makes me think it needs more than the Nvidia "hard" 1.18v limit.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

craig588 posted:

That's really stupid. You're better off not using that bios; you'll waste 100-200 watts without boost, plus the extra heat forces the fans to run faster and make more noise. You get no benefit whatsoever in real-world situations. If you only need 900MHz and 0.98v to hit 60FPS, the driver is smart enough not to ramp the card up any higher. Without boost you're just at full power all the time for no good reason.

Just max out the temperature limit and you're good on GK110 cards.

Disabling boost doesn't stop the GPU from throttling. It just stops it from boosting over the speed you set.

craig588
Nov 19, 2005

by Nyc_Tattoo
That sounds like you're talking about low-power and 2D clocks. In full 3D mode it'll automatically scale in 13MHz increments up to the limit. Disabling boost makes it run at that limit all the time; you still have low-power modes available, but the granularity is much coarser than the 13MHz increments boost offers. Max out the power target first, then adjust the clock offset.

My boost enabled 680 never goes over 1293MHz.
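
(To make that granularity point concrete, the sketch below just lists the 13MHz bins between a base clock and a top bin. The base clock is an example number; the 1293MHz ceiling matches the 680 mentioned above.)

[code]
# Illustration of boost bins: the clock moves in ~13 MHz steps between the
# base clock and the top bin. The base clock here is an example number; the
# 1293 MHz ceiling matches the 680 mentioned above.
BIN_MHZ = 13


def boost_bins(base_mhz, max_boost_mhz):
    return list(range(base_mhz, max_boost_mhz + 1, BIN_MHZ))


bins = boost_bins(base_mhz=1020, max_boost_mhz=1293)
print(f"{len(bins)} bins: {bins[:3]} ... {bins[-3:]}")
# Disabling boost throws this fine-grained ladder away: the card keeps its
# coarser low-power states but otherwise sits at the single maximum clock.
[/code]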

Ignoarints
Nov 26, 2010

craig588 posted:

That sounds like you're talking about low-power and 2D clocks. In full 3D mode it'll automatically scale in 13MHz increments up to the limit. Disabling boost makes it run at that limit all the time; you still have low-power modes available, but the granularity is much coarser than the 13MHz increments boost offers. Max out the power target first, then adjust the clock offset.

My boost enabled 680 never goes over 1293MHz.

I've always had issues where it flips between 13 MHz increments constantly. Usually not a problem unless I'm overclocked to the edge. With boost disabled, from what I've seen, it simply maxes out at the clock speed it's rated at when that's called for; otherwise it throttles down. I'm already at the max factory power target (a whole 105%, which it reaches easily).

In any case, I'd just like 1.21 volts, which as I understand it is a hardware limit. I'll look into it more.

For a little backstory, the 660ti I had would be stable at a max of 1228 MHz, sometimes 1215 depending. With a fairly simple bios flash I was able to get a stable 1306 MHz, although I had it at 1280 for no real reason. The temperature hardly changed either (although honestly we're not talking about a ton of voltage here).

With boost enabled I'd actually have to set my offset to negative values because it would boost an insane amount on its own, and it was inconsistent. With that disabled I was able to lock it in where I wanted. It would not run there at all times, though; otherwise it acted like a stock card.

Ignoarints fucked around with this message at 22:10 on Jun 4, 2014

EoRaptor
Sep 13, 2003

by Fluffdaddy

Shaocaholica posted:

So if they limit it to new cards, that pretty much shrinks their g-sync market down to just 7xx owners? I mean, why would anyone with an 8xx or newer card want to buy into gsync?

I doubt even nVidia knows. They sunk a lot of money into g-sync, only for AMD to pull the rug out with freesync, and for VESA to accept that version because it was stupidly easy to change from optional to required in the specification. This is why some monitors can be upgraded: the display controller makers were targeting mobile applications with their ASICs, so the hardware just needs to be turned on. G-sync is dead before product launch and nVidia knows it, but backing down is a huge loss of 'face', so company politics is probably going to drive them off the rails for a bit.

I guess R != R, economists, back to the drawing board.

Shaocaholica
Oct 29, 2002

Fig. 5E

EoRaptor posted:

I doubt even nVidia knows. They sunk a lot of money into g-sync, only for AMD to pull the rug out with freesync, and for VESA to accept that version because it was stupidly easy to change from optional to required in the specification. This is why some monitors can be upgraded: the display controller makers were targeting mobile applications with their ASICs, so the hardware just needs to be turned on. G-sync is dead before product launch and nVidia knows it, but backing down is a huge loss of 'face', so company politics is probably going to drive them off the rails for a bit.

I guess R != R, economists, back to the drawing board.

Guh, it's lovely both ways. If the tech existed and it was so 'easy' to roll that spec into DP, why didn't anyone raise their hand sooner to get the ball rolling? Why did it take nvidia and their proprietary, now-stillborn tech to raise the issue? This really should have been part of DP 1.0 or even early HDMI/DVI.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Eh, you win some, you lose some. They had a hole card with their knowledge that DirectX was incorrectly reporting frame timing and how that all worked out with WDDM, and that was a huge gently caress-you to AMD (enlightened self-interest is economics for "gently caress you," right?). That was a win for nVidia, but it also called public attention to something that had previously only been confusedly pondered, and it exposed lower-level problems in ALL rendering via DirectX. It put AMD in a scramble, but it was part one of a two-pronged awesome strategy to be all "man, gently caress DirectX and WDDM, this is some bullshit" - the other prong being AMD releasing Mantle, a tech of arguably questionable usefulness, with a much less aggressive "oh, gently caress" reason behind it than V-Sync. At least with Mantle, from the end user's perspective it either doesn't matter or it's cool; only studios get screwed if they invest in it and shouldn't have. But between nVidia showing that you shouldn't trust DirectX's frame timings in the first place and Mantle proving that it is goddamned time we got a better way of rendering things, now we're getting some real work on DirectX and optimizations to keep it relevant (since nVidia won't ever implement a Mantle-like thing, what with the very different basic approaches that SMXes and the GCN architectures take, and Microsoft and Sony probably don't want to give up their long-standing technological preferences - Microsoft can't without looking like the stupidest assholes ever, and Sony very likely won't).

More thoughts on this but I gotta go :-/

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

EoRaptor posted:

nVidia is hemming and hawing about it, but it's part of the DisplayPort 1.2a standard, so they won't be able to use the DisplayPort branding or pass certification if they don't implement it.

Is there a confirmation on this? Has DPAS actually gone from "optional" to "required" as a standard for 1.2a and on?

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

EoRaptor posted:

I doubt even nVidia knows. They sunk a lot of money into g-sync, only for AMD to pull the rug out with freesync, and for VESA to accept that version because it was stupidly easy to change from optional to required in the specification. This is why some monitors can be upgraded: the display controller makers were targeting mobile applications with their ASICs, so the hardware just needs to be turned on. G-sync is dead before product launch and nVidia knows it, but backing down is a huge loss of 'face', so company politics is probably going to drive them off the rails for a bit.

I guess R != R, economists, back to the drawing board.

Something similar to freesync has been part of DP since 2009, but only on eDP, not external displays.

Shaocaholica
Oct 29, 2002

Fig. 5E
What's wrong with having the hardware support it and just disabling it in the driver? Doesn't that let nvidia have their cake and eat it too? Dick move, but...

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
What's wrong is that strategy is really transparent and every GPU enthusiast will know they're being cockblocked by Nvidia out of spite. I'd think the best they can do is stall to try to make what little they can back on their investment. Anything else would be really bad PR.

veedubfreak
Apr 2, 2005

by Smythe

Shaocaholica posted:

Guh, it's lovely both ways. If the tech existed and it was so 'easy' to roll that spec into DP, why didn't anyone raise their hand sooner to get the ball rolling? Why did it take nvidia and their proprietary, now-stillborn tech to raise the issue? This really should have been part of DP 1.0 or even early HDMI/DVI.

My guess is that this will allow people to buy lower end cards. They don't want people buying the cheap cards.

SCheeseman
Apr 23, 2003

Zero VGS posted:

What's wrong is that strategy is really transparent and every GPU enthusiast will know they're being cockblocked by Nvidia out of spite. I'd think the best they can do is stall to try to make what little they can back on their investment. Anything else would be really bad PR.

Nvidia don't seem to care about it. See PhysX.

Josh Lyman
May 24, 2009


SwissCM posted:

Nvidia don't seem to care about it. See PhysX.
Seriously. Remember when PhysX was a separate company and people would buy the PCI cards? Okay, maybe they didn't sell much, but Nvidia buying PhysX was supposed to usher in a new era in gaming.

SCheeseman
Apr 23, 2003

Josh Lyman posted:

Seriously. Remember when PhysX was a separate company and people would buy the PCI cards? Okay, maybe they didn't sell much, but Nvidia buying PhysX was supposed to usher in a new era in gaming.

The thing I mainly take issue with is them disallowing the use of an Nvidia card for PhysX alongside an AMD card as the primary GPU. There is literally no technical reason why it should be disallowed. The drivers work perfectly fine side by side, and you can even install a hacked driver to get some games to work. Newer games have harder-to-crack lockouts, however (again, for no reason other than to be blatantly anti-competitive). I struggle to think of any other company that has done something like this in the x86 marketplace.

I remember hearing them say that PhysX is "open" and that other parties are free to license it, but with the kind of behavior they're displaying, I can't help but think that's total bullshit.

beejay
Apr 7, 2002

Zero VGS posted:

Is there a confirmation on this? Has DPAS actually gone from "optional" to "required" as a standard for 1.2a and on?

http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Ignoarints
Nov 26, 2010
I am all for openness and cross-platform compatibility. I really, really, really am. But no matter what angle AMD is pushing now (and they are pushing one), they'd do the exact same things nvidia is doing if it meant more money for them. Right now it's working in their favor to do the opposite of what nvidia is doing, if nothing else just for the PR (and I know it's more than just that). This is just the result of true competition, and it will never change.

I kind of wish gsync weren't doomed, because I have an expensive nvidia card. If I had AMD, I'd hope that freesync takes off, but not because it's open to everyone. That's just an illusion that I'm well aware could have been very different under different circumstances. If nvidia has backed themselves into a corner with gsync and freesync is putting the pressure on, the fact that it's open (I mean, come on with the name "freesync") is simply a characteristic of a concept whose sole goal is to compete with the concept nvidia is trying to pull off.

This is just my opinion, and frankly these things only matter more when they're made into issues and big deals. But it's just so much easier to relax.
