Otakufag
Aug 23, 2004
With asynchronous compute being the future in DX12 and Vulkan, and FreeSync monitors being much more affordable than G-Sync ones, I sorta feel it was a bad decision to buy a 1070...

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

BurritoJustice posted:

I was not contesting your post about VRAM. I was contesting your statement that it is a good buy for $50 less than a 1070, when it plainly isn't. All else being equal (which is obviously not the case), 8GB of VRAM is objectively better than 4GB of VRAM. The additional VRAM is an added benefit of the 1070 over the Fury, in addition to the substantially better performance and overall featureset.

There are games with settings right now that will choke on cards with 4GB VRAM, and on an equivalent strength card with 6GB/8GB VRAM the settings can be used with a minimal performance hit. For example, ROTTR highest textures setting introduces stutter and framedrops in certain areas when using a 4GB card that isn't present on higher VRAM cards. I personally own a 980 so I am well aware of certain settings blowing out 4GB.

If the Fury was faster in overall performance than the 1070, but had less VRAM, it would be an interesting comparison. But it isn't, it's a whole chunk slower, so there is no reason to even think about the different VRAM sizes with regards to recommending a purchase.
So from your data point it seems there is a real but rare scenario where VRAM is a bottleneck, but one rather limited in scope. I'm not arguing that these don't exist, only that they will be far less prevalent than times when the card simply isn't fast enough to justify using the settings that trigger said issue.

You also seem hung up on my comment that a firesale-priced Fury X could ever even be remotely considered for purchase: could you elaborate on what price it would need to hit for you to find it competitive? I'm also happy to pick literally any other card to continue this discussion if that helps. Personally, I'd have loved to see how a 4GB 1070 performed if it could slot in below $375.

a retard
Jan 7, 2013

by Lowtax
So I'm talking to a friend on IRC about upgrading his GPU, and he wanted to upgrade his 560 Ti to a 470, mainly due to "NVIDIA gimping". I looked it up, and supposedly NVIDIA driver updates over the years steadily lower performance in games, apparently in a transparent attempt to get you to upgrade to their latest cards. Meanwhile, AMD will slowly improve performance in the same games through driver updates.

Is this bullshit (I'm pretty sure it is), and if so, why?

Haquer
Nov 15, 2009

That windswept look...

Otakufag posted:

With asynchronous compute being the future in DX12 and Vulkan, and FreeSync monitors being much more affordable than G-Sync ones, I sorta feel it was a bad decision to buy a 1070...

Except the 1070 still outperforms the AMD cards in DX12 as well?

Truga
May 4, 2014
Lipstick Apathy

a retard posted:

Is this bullshit (I'm pretty sure it is), and if so, why?

Both improve over time due to driver updates in some/most games, but old AMD cards will generally achieve higher frame rates in newer games compared to old Nvidia cards. I.e. an R9 290X fares a bit better against a 780 Ti these days than it did when they were both new.

It's likely a combination of lovely drivers having proportionally more room to improve, plus higher memory bandwidth and more raw compute power helping out in newer games. The differences are very minor though (except in the case of the 780 Ti, which has only 3GB of VRAM and chokes on every game that goes above that), so it probably isn't something to buy a card over. The 4GB 470 is a good card for the price though.

HMS Boromir
Jul 16, 2011

by Lowtax

Haquer posted:

Except the 1070 still outperforms the AMD cards in DX12 as well?

The 1070 and Fury X apparently trade blows in Vulkan Doom and DX12 Hitman, so if there were an RX 490 offering performance equal to or greater than the Fury X's for cheaper than a 1070, the 1070 would start looking worse as more Vulkan/DX12 games come out. But the 490 doesn't exist and probably won't until next year, so that's reason enough to buy a 1070.

Although you can buy a $400 Fury X so I guess that looks pretty good if you ignore every other game.

HMS Boromir fucked around with this message at 07:44 on Aug 31, 2016

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Also, be sure to tell your friend that there's a reason nVidia calls them GameReady drivers and not GameRequired ones. I've been using a year-old iCafe driver and everything's peachy.

BurritoJustice
Oct 9, 2012

ZobarStyl posted:

So from your data point it seems there is a real but rare scenario where VRAM is a bottleneck, but one rather limited in scope. I'm not arguing that these don't exist, only that they will be far less prevalent than times when the card simply isn't fast enough to justify using the settings that trigger said issue.

You also seem hung up on my comment that a firesale-priced Fury X could ever even be remotely considered for purchase: could you elaborate on what price it would need to hit for you to find it competitive? I'm also happy to pick literally any other card to continue this discussion if that helps. Personally, I'd have loved to see how a 4GB 1070 performed if it could slot in below $375.

If you have a preference for the AMD ecosystem, and taking a 1070 to be $400 (I'm not American so I can't comment too specifically on this, but as per previous posters you can find them for ~$350-370 in the same deals as the FuryX), I could see a FuryX being a contender at ~$320. This would put it on the same point in the price/performance curve as the 1070 (taking TechPowerUp's latest 1440p figure of 83% relative performance; see the sketch at the end of this post). Eliminating the performance advantage for the 1070, we are left with:

FuryX:
+FreeSync
+Doom Vulkan/Hitman DX12 performance on par with 1070, closes gap in other games (though it is to be noted that Hitman is comically biased towards AMD cards)

1070:
+Power usage
+Greater gain from overclocking (10% versus basically 0%)
+DP1.4/HDMI2.0b/DL-DVI (relevant for people with Korean monitors)
+Newer VR tech
+4GB more VRAM (8GB vs 4GB)

This leaves basically the same situation as the 1060 vs 480, where the 1060 is the better buy unless you're immediately intending to buy a FreeSync panel. The DX12/Vulkan advantage for the 480/FuryX is worth noting, but only enough to bring them up to par with the 10x0 cards, not a strict advantage. The FreeSync advantage is more dubious in the 1070/FuryX tier, as the high end of monitors is dominated by G-Sync models with no true FreeSync equivalent (X34/XB271HU), and those are more likely to be the models looked at by 1070/FuryX buyers than by 1060/480 buyers.
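To put numbers on the parity point above, here's a minimal back-of-the-envelope sketch; the $400 anchor price and the 83% relative-performance figure are the ones from the post, and everything else is plain division:

code:
# Back-of-the-envelope price parity, using the figures from the post above:
# a GTX 1070 at $400, with the Fury X at ~83% of its performance
# (TechPowerUp 1440p relative performance). Everything else is division.
GTX_1070_PRICE = 400.00      # assumed street price, USD
FURY_X_RELATIVE_PERF = 0.83  # Fury X performance as a fraction of the 1070's

# Price at which the Fury X sits at the same point on the price/perf curve.
parity_price = GTX_1070_PRICE * FURY_X_RELATIVE_PERF
print(f"Fury X parity price: ${parity_price:.0f}")  # -> $332, i.e. roughly ~$320

# Sanity check: dollars per unit of performance are equal at parity.
print(f"1070:  ${GTX_1070_PRICE / 1.00:.0f} per perf unit")
print(f"FuryX: ${parity_price / FURY_X_RELATIVE_PERF:.0f} per perf unit")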

Linnear
Nov 3, 2010
So a 1060 is generally considered a safer bet than an RX 480 if you don't plan on getting a FreeSync monitor anytime soon? I'm tempted to buy an AIB RX 480 just for the leg up that DX12 and Vulkan adoption could potentially give it, but realistically I have no idea how Vulkan's future will play out, and I don't know if the 1060 will actually fall behind all that much in DX12 games. It's a zippy little card and priced cheaper to boot.

Is the RX480 just a bad idea all around?

betamax hipster
Aug 13, 2016
Regarding the Fury X vs. 1070, what does the difference in power consumption add up to in terms of increased electrical bills over the lifetime of the card?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

betamax hipster posted:

Regarding the Fury X vs. 1070, what does the difference in power consumption add up to in terms of increased electrical bills over the lifetime of the card?

It's not really the power consumption that matters, it's all the heat the card will dump into your room from the increased power consumption.

ZobarStyl
Oct 24, 2005

This isn't a war, it's a moider.

AVeryLargeRadish posted:

It's not really the power consumption that matters, it's all the heat the card will dump into your room from the increased power consumption.
Yeah, power (at least in America) is trivial in cost; even a hundred hours on a card that's 100W hungrier might run you a little more than a dollar. If you're also dumping cash into A/C already, then it adds up. If, however, like me you're parked next to a 1500W space heater from late Oct to early March, it can actually help. I consider my card/rig to be a higher-cost item in the summer and turn it off more aggressively, whereas in winter it's only a boon.
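As a quick sanity check on that dollar figure, here's a minimal sketch; the $0.12/kWh rate is an assumed ballpark US residential price, not a number from the thread:

code:
# Extra electricity cost of a card that draws 100W more than the alternative.
# $0.12/kWh is an assumed ballpark US residential rate.
EXTRA_WATTS = 100
USD_PER_KWH = 0.12

def extra_cost_usd(hours: float) -> float:
    """Added electricity cost for running the hungrier card `hours` hours."""
    kwh = EXTRA_WATTS * hours / 1000.0  # watt-hours -> kilowatt-hours
    return kwh * USD_PER_KWH

print(f"100 hours:      ${extra_cost_usd(100):.2f}")      # $1.20
print(f"4h/day, 1 year: ${extra_cost_usd(4 * 365):.2f}")  # ~$17.52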

Semi-related, but it'll be interesting to see what the zero marginal cost energy future of renewables does for this concern. Quad SLI for all!

BurritoJustice posted:

If you have a preference for the AMD ecosystem, and taking a 1070 to be $400 (I'm not American so I can't comment too specifically on this, but as per previous posters you can find them for ~$350-370 in the same deals as the FuryX), I could see a FuryX being a contender at ~$320.
See, that seems a tad lowball to me, but then the only feature differentiator that matters to me is Eyefinity, so I'm stuck until I tack a 4K onto my array. I think we can agree that were 10x0 stocks not so late in reaching MSRP parity, the issue would be moot. The botched launches from both manufacturers have made this summer a very awkward time to buy.

ZobarStyl fucked around with this message at 13:11 on Aug 31, 2016

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me

a retard posted:

So I'm talking to a friend on IRC about upgrading his GPU, and he wanted to upgrade his 560 Ti to a 470, mainly due to "NVIDIA gimping". I looked it up, and supposedly NVIDIA driver updates over the years steadily lower performance in games, apparently in a transparent attempt to get you to upgrade to their latest cards. Meanwhile, AMD will slowly improve performance in the same games through driver updates.

Is this bullshit (I'm pretty sure it is), and if so, why?

AMD cards gain performance until AMD leaves the card out in the cold. The GeForce GTX 460 and Radeon HD 6850 were released right around the same time and were pretty much even in terms of performance at release. If you look at the raw specifications (memory bandwidth, pixel/texture fill rate, etc.), the 6850 is quite a bit stronger.

AMD stopped driver support for HD 5000/6000 cards over a year ago, and performance from the 6850 in newer titles is absolute garbage. Radeon 6000 cards were offered beginning in very late 2010 (2011 for practical purposes), and driver support ended in 2015. Radeon 4000 cards went on sale in mid-2008, and all support ended in 2015. When an AMD card is five years old, you might as well throw it away.

Yes, the 6850 is an old chip, but the GTX 460 absolutely blows it away in pretty much any game published in the last two years. Nvidia still offers regular driver updates going back to the GTX 400 series, and offers full Windows 10 support going back to the 8000 series (from ten years ago). AMD only supports cards going back to HD 7000, and even that support is incomplete. Some of the R5 2xx series cards no longer receive driver updates (the non-GCN-based R5 22x/23x cards). You can still buy these cards brand new from Newegg, Amazon, or Best Buy, and their support life is already over.

Linux is often a good way to keep old hardware working, owing to its reduced overhead compared to Windows. AMD's Linux drivers have traditionally been pretty lovely. Yes, their Linux driver support appears to be improving, but for over a decade Nvidia has been the obvious choice if you're willing to accept a proprietary driver (Intel has really been the only option if you insist on an open-source driver). I don't really trust AMD to keep up their recent improvements when it comes to Linux drivers. Canonical (Ubuntu) got fed up with AMD's proprietary drivers to the point where they no longer offer automated proprietary AMD driver installation.

As far as I can tell, AMD GCN cards have gotten faster over time for two reasons:

1. GCN drivers have a high level of overhead. It takes high IPC and high clock speeds on the CPU side to keep the driver fed. CPU horsepower has increased a good amount since GCN debuted at the beginning of 2012; maybe not by as much as we would like, but there has been an increase.

2. The initial driver releases were so poorly optimized that there was nowhere to go but up. Even after all this time, GCN OGL performance is still in the toilet, and only Vulkan has offered any reason for hope when it comes to performance in cross-platform games/applications (this includes Windows 7/8/8.1, where DX12 is not an option).

This probably sounds like a rant against AMD. I really don't like Nvidia's marketing practices, but their drivers are far ahead of AMD's in terms of consistency of performance, support lifetime, and Linux support. While AMD was laying off employees (they didn't have much of a choice) and chasing pipe dreams like desktop HBM and Mantle, Nvidia was optimizing 28nm design/manufacturing and refining their OGL/DX11 drivers.

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

PBCrunch posted:

AMD cards gain performance until AMD leaves the card out in the cold. The GeForce GTX 460 and Radeon HD 6850 were released right around the same time and were pretty much even in terms of performance at release. If you look at the raw specifications (memory bandwidth, pixel/texture fill rate, etc.), the 6850 is quite a bit stronger.

AMD stopped driver support for HD 5000/6000 cards over a year ago, and performance from the 6850 in newer titles is absolute garbage. Radeon 6000 cards were offered beginning in very late 2010 (2011 for practical purposes), and driver support ended in 2015. Radeon 4000 cards went on sale in mid-2008, and all support ended in 2015. When an AMD card is five years old, you might as well throw it away.

Yes, the 6850 is an old chip, but the GTX 460 absolutely blows it away in pretty much any game published in the last two years. Nvidia still offers regular driver updates going back to the GTX 400 series, and offers full Windows 10 support going back to the 8000 series (from ten years ago). AMD only supports cards going back to HD 7000, and even that support is incomplete. Some of the R5 2xx series cards no longer receive driver updates (the non-GCN-based R5 22x/23x cards). You can still buy these cards brand new from Newegg, Amazon, or Best Buy, and their support life is already over.

Linux is often a good way to keep old hardware working, owing to its reduced overhead compared to Windows. AMD's Linux drivers have traditionally been pretty lovely. Yes, their Linux driver support appears to be improving, but for over a decade Nvidia has been the obvious choice if you're willing to accept a proprietary driver (Intel has really been the only option if you insist on an open-source driver). I don't really trust AMD to keep up their recent improvements when it comes to Linux drivers. Canonical (Ubuntu) got fed up with AMD's proprietary drivers to the point where they no longer offer automated proprietary AMD driver installation.

As far as I can tell, AMD GCN cards have gotten faster over time for two reasons:

1. GCN drivers have a high level of overhead. It takes high IPC and high clock speeds on the CPU side to keep the driver fed. CPU horsepower has increased a good amount since GCN debuted at the beginning of 2012; maybe not by as much as we would like, but there has been an increase.

2. The initial driver releases were so poorly optimized that there was nowhere to go but up. Even after all this time, GCN OGL performance is still in the toilet, and only Vulkan has offered any reason for hope when it comes to performance in cross-platform games/applications (this includes Windows 7/8/8.1, where DX12 is not an option).

This probably sounds like a rant against AMD. I really don't like Nvidia's marketing practices, but their drivers are far ahead of AMD's in terms of consistency of performance, support lifetime, and Linux support. While AMD was laying off employees (they didn't have much of a choice) and chasing pipe dreams like desktop HBM and Mantle, Nvidia was optimizing 28nm design/manufacturing and refining their OGL/DX11 drivers.

I mean, Nvidia has seen the same issue too. Look at the Kepler series of GPUs and see how they compare to the 7970/280X and 290 series. The 770 gets trounced in newer games and the 290 is beating the 780 Ti in places.

EdEddnEddy
Apr 5, 2012



Ok, I want this Beast.

I shouldn't. But :drat:

NihilismNow
Aug 31, 2003

BurritoJustice posted:


FuryX:
+FreeSync
+Doom Vulkan/Hitman DX12 performance on par with 1070, closes gap in other games (though it is to be noted that Hitman is comically biased towards AMD cards)

1070:
+Power usage
+Greater gain from overclocking (10% versus basically 0%)
+DP1.4/HDMI2.0b/DL-DVI (relevant for people with Korean monitors)
+Newer VR tech
+4GB more VRAM (8GB vs 4GB)

If you look at benchmarks you will also see that the 1070 has higher minimum framerates even when the average framerates are not that far apart.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

EdEddnEddy posted:

Ok, I want this Beast.

I shouldn't. But :drat:

The prize for literally the ugliest laptop ever made... goes to Acer! I'm not surprised it would be Acer, frankly.

I don't care what hardware it contains, and the only way a curved screen could be a "benefit" at laptop scale is if it has the lowest-quality TN panel in the universe, so the curve is actually enabling you to see the correct colours while your head is in one position. That said, colour shift is usually worst top to bottom, so they're missing a trick by not curving it in that dimension too. Basically the opposite of a CRT.

HalloKitty fucked around with this message at 17:33 on Aug 31, 2016

Anime Schoolgirl
Nov 28, 2002

B-Mac posted:

I mean, Nvidia has seen the same issue too. Look at the Kepler series of GPUs and see how they compare to the 7970/280X and 290 series. The 770 gets trounced in newer games and the 290 is beating the 780 Ti in places.
nvidia is really good at getting the most performance out of their CUDA revision-N architectures right out of the gate, vs GCN, which needs multiple submission threads that just didn't exist in APIs for a while. also note that CUDA started with fermi, which is why those cards haven't exactly been abandoned, since CUDA is basically nvidia's catch-all architecture like GCN

there are some weird spots, like chopping off preemption in maxwell to save some watts, but that's the price of optimizing for the currently most popular APIs

penus penus penus
Nov 9, 2014

by piss__donald
the 290 was good hardware, good enough to last through two whole generations of nvidia (one of which was record-breakingly good at that) with nothing more than AMD fixing their drivers



now we have the 480

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Seamonster posted:

gently caress this balkanization of GPU performance where every game performs (possibly wildly) differently depending on combinations of detail settings, render API, and resolution.

That's just how it's always been, unfortunately. You can generally predict which features are going to have a relatively large overhead, but you need to do some tweaking to match settings to your particular card.

ZobarStyl posted:

So from your data point it seems there is a real but rare scenario where VRAM is a bottleneck, but one rather limited in scope. I'm not arguing that these don't exist, only that they will be far less prevalent than times when the card simply isn't fast enough to justify using the settings that trigger said issue.

You also seem hung up on my comment that a firesale-priced Fury X could ever even be remotely considered for purchase: could you elaborate on what price it would need to hit for you to find it competitive? I'm also happy to pick literally any other card to continue this discussion if that helps. Personally, I'd have loved to see how a 4GB 1070 performed if it could slot in below $375.

4 GB is not enough memory for the performance of the Fury X. The Fury X severely underperforms at 1080p; it really only does OK at 1440p and only hits its stride at 4K (and the Fury is not fast enough for single-card 4K). But 4 GB of VRAM is not enough for ultra 4K, and it's pushing it for ultra 1440p. Using medium textures and non-supersampling antialiasing will help keep VRAM utilization under control, but lowering settings kind of defeats the purpose of owning a card in that performance class.

It's also got some serious problems with microstutter, and the limited VRAM makes this worse even with optimized drivers: the game will stutter when it needs to page stuff into and out of VRAM, and even in the best case it's got more microstutter than a 980 Ti or a 1070.

The long-term prospects for the card are frankly quite poor, since DX12 shifts the burden of optimization onto the game devs, and lol if you think they're going to bother optimizing VRAM utilization.
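For a rough sense of why 4 GB gets tight at high resolutions, here's a minimal back-of-the-envelope sketch; the buffer count and the texture-pool size are illustrative assumptions for a generic renderer, not measurements from any particular game:

code:
# Rough VRAM budget at 4K. The buffer count and texture-pool size are
# illustrative assumptions, not measured figures from any specific game.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # RGBA8 color (or a 32-bit depth/stencil format)

def buffers_mib(count: int) -> float:
    """MiB used by `count` full-resolution 32-bit-per-pixel buffers."""
    return count * WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20

render_targets = buffers_mib(6)  # back buffers + G-buffer + depth (assumed)
print(f"Render targets: ~{render_targets:.0f} MiB")  # ~190 MiB

# The framebuffers are the small part; texture assets dominate. An 'ultra'
# texture pool of ~3.5 GiB (assumed) leaves almost no headroom on a 4GB
# card, so anything extra gets paged in and out -- hence the stutter.
texture_pool = 3.5 * 1024  # MiB, assumed
print(f"Total: ~{(render_targets + texture_pool) / 1024:.1f} GiB of 4 GiB")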

BurritoJustice posted:

If you have a preference for the AMD ecosystem, and taking a 1070 to be $400 (I'm not American so I can't comment too specifically on this, but as per previous posters you can find them for ~$350-370 in the same deals as the FuryX), I could see a FuryX being a contender at ~$320. This would put it on the same point in the price/performance curve as the 1070 (taking TechPowerUp's latest 1440p figure of 83% relative performance). Eliminating the performance advantage for the 1070, we are left with:

I've seen Fury Xs for $375 and Furys for $275, with the 1070 at ~$375. I think the Fury is an OK buy at $275 if you really want the AMD ecosystem, but there's no question it's an odd duck with some real drawbacks that you need to be aware of going in. It's certainly not going to "age well" like the 290 did, which is what a lot of people are wishfully hoping for.

Twerk from Home posted:

There have been GPUs without enough memory though. The 320MB 8800GTS, 256MB 8800GT and the 768MB GTX 460 (maybe less so) come to mind. That's all that I can think of right now, so I guess that insufficient VRAM is very rarely an issue.

The 700 series also really had too little memory - particularly the 780 Ti - but they wanted to keep the bigger VRAM configurations locked up for the Titan series.

The 1060 3GB is also pretty clearly too little memory for its performance at this point in time; you should avoid it for anything except a media-center PC.

a retard posted:

So I'm talking to a friend on IRC about upgrading his GPU, and he wanted to upgrade his 560 Ti to a 470, mainly due to "NVIDIA gimping". I looked it up, and supposedly NVIDIA driver updates over the years steadily lower performance in games, apparently in a transparent attempt to get you to upgrade to their latest cards. Meanwhile, AMD will slowly improve performance in the same games through driver updates.

Is this bullshit (I'm pretty sure it is), and if so, why?

No, drivers are not making games slower. Newer games use newer features that old cards aren't particularly fast at, and that means that over time cards that were once fairly equal tend to fall behind their newer counterparts. The classic example of this is tessellation: Maxwell was the first generation with oodles of tessellation power, and everyone went nuts with it. If you turn off the features that overuse tessellation, like Godrays and HairWorks, then Kepler and GCN keep up just fine.

This comes down to gamers who are too stupid to tweak settings properly for their hardware. If you are running a 3GB or 4GB card, you don't run ultra textures at 1440p/4K, and if you aren't running Pascal or Maxwell, you shouldn't be going crazy on tessellation. Like I said above, this is unfortunately just how PC gaming has always been, and if your friend can't be bothered to spend 10 minutes playing with settings then he's in the wrong place.

Sorry about being testy; it's not you, it's your friend and the thousands of other idiots like him pushing conspiracy theories when there is a very simple explanation for Kepler's performance. None of these people has ever shown an example of a game that actually got slower over time, and when called on it they inevitably retreat to "well, they're obviously just not really trying hard enough to optimize!". You can't wring blood from a stone; the hardware doesn't support fast tessellation.

Paul MaudDib fucked around with this message at 18:13 on Aug 31, 2016

Gonkish
May 19, 2004

Newegg outdid itself and delivered the 950 to my dad's door in 24 hours. He is now completely lost in 4k pictures/video and can't comprehend that it is real.

Thanks goons!

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The two good things I will say about the Fury are that:

when the 4GB 480 is priced at $240 and the Fury at $275, the Fury is a good buy in terms of performance if you can tolerate the power draw
  • between its good 4K performance and Crossfire's good scaling, Fury CF is pretty decent for medium-settings 4K up until you hit the VRAM limit

So if those are your use-cases and you really want the AMD ecosystem it's not terrible, but it's really not a good general-purpose card.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

Gonkish posted:

Newegg outdid itself and delivered the 950 to my dad's door in 24 hours. He is now completely lost in 4k pictures/video and can't comprehend that it is real.

Thanks goons!

I need to jump on an HTPC card as well, but I'll wait for a card with VP10 encoding for my low-power AMD 5350 facebook/youtube machine :iamafag:

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Gonkish posted:

Newegg outdid itself and delivered the 950 to my dad's door in 24 hours. He is now completely lost in 4k pictures/video and can't comprehend that it is real.

Thanks goons!

This warms my goony heart

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

THE DOG HOUSE posted:

the 290 was good hardware, good enough to last through two whole generations of nvidia (one of which was record-breakingly good at that) with nothing more than AMD fixing their drivers



now we have the 480

Man, there's some weird poo poo going on in 480-land actually, led by the Ethereum miners. They're getting almost 40% mining hashrate increases with custom BIOSes that go way outside normal voltage/clock limits: https://bitcointalk.org/index.php?topic=1584617.0

I have very little faith that JimBob's hacked-up custom BIOS is safe to use long-term without damaging the GPU in some way, but I'm not sure exactly what the failure mode would be. I'm assuming that dramatically lowering voltage and overclocking would lead to way more amperage going through any given component. If these custom BIOSes can get +20% performance in gaming, though, then maybe AMD can get their poo poo together so that the 470/480 gain more performance through driver updates or future AIB firmware.

penus penus penus
Nov 9, 2014

by piss__donald

Gonkish posted:

Newegg outdid itself and delivered the 950 to my dad's door in 24 hours. He is now completely lost in 4k pictures/video and can't comprehend that it is real.

Thanks goons!

Lol, I just got my dad off a really really really garbage TV-based 1080p monitor onto a relatively cheap basic 4K monitor (strictly text, pictures, coding) after a visit last weekend, and it was mind-bending. Some common Samsung that costs ~$350, I forget the model.


Twerk from Home posted:

Man, there's some weird poo poo going on in 480-land actually, led by the Ethereum miners. They're getting almost 40% mining hashrate increases with custom BIOSes that go way outside normal voltage/clock limits: https://bitcointalk.org/index.php?topic=1584617.0

I have very little faith that JimBob's hacked-up custom BIOS is safe to use long-term without damaging the GPU in some way, but I'm not sure exactly what the failure mode would be. I'm assuming that dramatically lowering voltage and overclocking would lead to way more amperage going through any given component. If these custom BIOSes can get +20% performance in gaming, though, then maybe AMD can get their poo poo together so that the 470/480 gain more performance through driver updates or future AIB firmware.

I'm surprised that through all those pages nobody tested it. Although I hardly understand what I just read, there were some considerably confusing numbers in that thread. But my interest is piqued pretty hard...

Hey, who has a 480 and wants to go through what looks like the most obnoxious BIOS flash ever seen to test for a 20% gaming improvement ;)

Kazinsal
Dec 13, 2011
Still poking at the prospect of going from an R9 290 with no OCing potential to a GTX 1070. Is LightBoost still a thing? My 120/144 Hz monitor supports it, apparently...

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

THE DOG HOUSE posted:

I'm surprised that through all those pages nobody tested it. Although I hardly understand what I just read, there were some considerably confusing numbers in that thread. But my interest is piqued pretty hard...

Hey, who has a 480 and wants to go through what looks like the most obnoxious BIOS flash ever seen to test for a 20% gaming improvement ;)

Cryptocurrency miners tend to never play games, and do poo poo like buy 10 GPUs and return the 5 worst overclockers of the batch. If you can't find an RX 470/480 in stock at a reasonable price, they're part of the reason why. Also, if something is open box, who knows what BIOS is flashed onto it these days.

penus penus penus
Nov 9, 2014

by piss__donald

Kazinsal posted:

Still poking at the prospect of going from an R9 290 with no OCing potential to a GTX 1070. Is LightBoost still a thing? My 120/144 Hz monitor supports it, apparently...

I didn't even know that was brand- or GPU-dependent, but there are videos of people doing it with 10-series cards.

ufarn
May 30, 2009

EdEddnEddy posted:

Ok, I want this Beast.

I shouldn't. But :drat:
Don't take this with you through airport security.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

ufarn posted:

Don't take this with you through airport security.

Plugging this thing into an airplane outlet might actually cause the plane to crash... but really, all that'll happen is you'll annoy the dogshit out of anyone sitting with you.

Green Gloves
Mar 3, 2008
A regular-looking laptop with a single 1080 and G-Sync is a lot more appealing. That Acer monstrosity is hideous and costs almost as much as I make in two months. Bleh.

Glad to hear *sync is getting into laptops tho.

SlayVus
Jul 10, 2009
Grimey Drawer
Well, the main issue is that the screen is curved. All the structural support for the screen comes from the outer shell of the laptop; there's nothing supporting the screen on the inside except at the edges.

Double Punctuation
Dec 30, 2009

Ships were made for sinking;
Whiskey made for drinking;
If we were made of cellophane
We'd all get stinking drunk much faster!

PBCrunch posted:

Driver stuff

Nvidia's drivers are definitely shittier now than they were. For one example on Linux, their "support" for the new display protocol Wayland is through an API that absolutely no one but them has implemented. This is because it took a tiny bit less effort than porting to the simple API that everyone is using, and they think they can force everyone onto their API by holding back development indefinitely and requiring nonexistent firmware blobs for open source drivers to use new hardware.

DropsySufferer
Nov 9, 2008

Impractical practicality
I've finally reached that time when I need to upgrade my computer after five years. I have an i7 processor, but my graphics card is a joke these days.

Right now I see the EVGA GeForce GTX 960 4GB for $250 vs the GeForce GTX 1060 6GB, which would cost me $300. Is the extra 2GB of VRAM worth it? I'm being cheap here and need to be pushed to the correct choice.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

DropsySufferer posted:

I've finally reached that time when I need to upgrade my computer after five years. I have an i7 processor, but my graphics card is a joke these days.

Right now I see the EVGA GeForce GTX 960 4GB for $250 vs the GeForce GTX 1060 6GB, which would cost me $300. Is the extra 2GB of VRAM worth it? I'm being cheap here and need to be pushed to the correct choice.

You aren't paying $50 for the extra VRAM. You are paying $50 extra because the 1060 is WAY WAY faster than a 960.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

DropsySufferer posted:

I've finally reached that time when I need to upgrade my computer after five years. I have an i7 processor, but my graphics card is a joke these days.

Right now I see the EVGA GeForce GTX 960 4GB for $250 vs the GeForce GTX 1060 6GB, which would cost me $300. Is the extra 2GB of VRAM worth it? I'm being cheap here and need to be pushed to the correct choice.

The 1060 is like twice as fast, so get that. Also, here's a cheaper 1060.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Don Lapre posted:

You aren't paying $50 for the extra VRAM. You are paying $50 extra because the 1060 is WAY WAY faster than a 960.

And by 'WAY WAY faster' he means anywhere from about 60-100% faster, depending on what you're using it for.
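As a rough illustration of why the extra $50 isn't really buying VRAM, here's a minimal dollars-per-performance sketch; the prices are the ones quoted above, and the 1.8x speedup is an assumed midpoint of that 60-100% range:

code:
# Dollars per unit of performance, 960 4GB vs 1060 6GB. Prices are from
# the posts above; 1.8x is an assumed midpoint of the 60-100% range.
cards = {
    "GTX 960 4GB":  {"price": 250, "perf": 1.0},
    "GTX 1060 6GB": {"price": 300, "perf": 1.8},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['perf']:.0f} per unit of performance")
# GTX 960 4GB:  $250/unit
# GTX 1060 6GB: $167/unit -- cheaper per frame despite the higher sticker price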

sauer kraut
Oct 2, 2004
Selling a 960 for more than $120 should be a felony. Whatever shop tried to pull that, avoid them in the future.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
The 960 wasn't a good deal even when it came out, and the 1060 is a pretty rad 1080p card, so it's a no-brainer. You should be able to get a 1060 shipped in the States for $270 or so.
