Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PC LOAD LETTER posted:

edit: /\/\/\/\ Yeah, looks like Intel's patent licenses are safe no matter what if AMD goes under, if I'm reading that right.

Nah, they'd just buy up whatever patents they wanted. Not many can outbid Intel, or are even interested in buying those patents. Even then, the details of the license agreement between AMD and Intel, and the method of bankruptcy, may mean Intel wouldn't necessarily lose access even if AMD goes under and someone else gets the patents. For all we know, AMD deeded the patents to Intel in the event of bankruptcy as part of a patent-sharing contract.

There's a reason any bankruptcy would probably be a lengthy affair, folks. The lawyers could be hashing things out for years.


edit: \/\/\/\/\/\/ Well sure, only a select few investors/insiders/good ol' boys benefit from any sort of monopoly, but that group tends to be pretty powerful and gets what it wants, particularly in today's political/regulatory climate. I'm not advocating their position, I'm just pointing out that's the way it is right now. ARM might indeed be able to push Intel/x86 out of the market in a decade or so, but that's a whole other subject and isn't a sealed deal yet either. Speaking about these things as if they're foregone conclusions is premature, to say the least.

Alereon posted:

If there's a change in ownership of either company, the cross-licensing agreement is terminated for both Intel and AMD. Intel can't decline a new agreement with AMD's new owner because they need the IP to make 64-bit CPUs, and I think their unique relationship with MS would keep them from pushing too hard on licensing terms.

Comedy option: Microsoft buys AMD, Apple buys Intel. Everybody else in Europe, Africa, and most of Asia buys ARM CPUs in Android devices, and China keeps going MIPS on Linux.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Cardboard Box A posted:

So uh, basically going from 120Hz to 144Hz on a high refresh rate monitor is enough to kick NVIDIA's GPUs from their low power state into their full power state just sitting on the Windows desktop, no games...
It's better to run at 120Hz than 144Hz anyway: 120 is an exact multiple of 24, 30, and 60, all common video framerates, which eliminates judder and video/audio desync. There's no perceptible difference from the slightly lower latency, so all 144Hz does is drive your monitor and videocard harder for a worse experience.
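For the curious, here's a quick way to sanity-check the multiples claim; a minimal Python sketch (the helper name is just made up for illustration):

    # Which common video framerates divide evenly into a given refresh rate?
    # A non-integer ratio means frames get shown for uneven durations (judder).
    COMMON_FPS = (24, 30, 60)

    def judder_free_sources(refresh_hz):
        """Return the source framerates that map cleanly onto this refresh rate."""
        return [fps for fps in COMMON_FPS if refresh_hz % fps == 0]

    print(judder_free_sources(120))  # [24, 30, 60] -> all common sources fit evenly
    print(judder_free_sources(144))  # [24] -> 30fps and 60fps content judders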

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
:siren: This is not the Windows 10 thread; there is another thread for Windows chat :siren:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

monster on a stick posted:

I'm in a similar position - I'd rather not upgrade anything else right now unless I need to, so what's a good card to look for that will play TW3 but take up about the same amount of power? Would an R9 380 work?
Consider a GeForce GTX 960 4GB; the EVGA B-stock site has refurbished overclocked models for $170-180. A Radeon 6870 draws 151W and an R9 380 draws 190W, while a GeForce GTX 960 4GB draws only 120W, costs about the same as a new R9 380 4GB, and is a decent bit faster overall.

For the record these kinds of posts should really go in the parts picking megathread.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
In the future, please don't post links to Sourceforge; it's a malware distribution site. If a project isn't available on a legit site you can still post the URL to the SF project, but please don't link it (uncheck "Automatically parse URLs"), otherwise the forums end up on various blocklists for containing links to malware sites. I edited your post to remove the link but leave the URL. I'm mentioning this publicly because it's not widely known that Sourceforge turned to poo poo, so please don't take this as a public shaming or anything!

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

GrizzlyCow posted:

Is there a reason all the good IPS(-like) *Sync displays are having so many problems? The TN panels aren't suffering these issues, right?
That's a set of what, three monitors, two of which are from Acer and probably based on a similar platform?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I really have to give AMD props for the rate of improvement in their drivers over the past year or so. Of course, they had the farthest to go, so the changes are easier to see, but it was like they were doing gently caress-all for a long time.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

slidebite posted:

Sorry if this is a stupid question, but is Geforce Experience a thing to have? I always just install the basic drivers and really nothing else. Am I missing anything?
nVidia announced that they'll distribute future driver updates exclusively through GeForce Experience rather than their website, so yeah, it's required.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

causticBeet posted:

Ended up getting an Insignia 520W PSU at Best Buy because I overlooked that (dope).
Well, there's your problem: you can't put a generic 520W power supply in your computer and expect it to work. Try the OEM power supply, and if that doesn't work, buy a decent power supply actually capable of powering the computer.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
To be honest, those no-name pieces of crap usually work fine for a few years in a system with a lower-end CPU, no graphics card, and a single hard drive. Not that I'd recommend it, but when not heavily loaded they're usually functional.

E: Typo.

Alereon fucked around with this message at 06:05 on Feb 3, 2016

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Low-end videocards with DDR3 are substantially slower than onboard graphics, especially at tasks like video playback that depend on decent memory bandwidth. For AMD I wouldn't go below a 7730/R5 240, for both performance and driver-longevity reasons, and for nVidia I wouldn't go below a GT 730 GDDR5. Buying a lower-end card that's been rebadged for a few generations invites losing driver support a year after purchase.

Police Automaton posted:

I use an HD6450 as a second graphics card and it does 1080p60 YouTube fine in Firefox, and that's in Linux. It doesn't even use the graphics card's video hardware acceleration; I'm not even sure that's possible in my particular configuration.
Another way to phrase this might be "software rendering is faster than a Radeon 6450."

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

ufarn posted:

(Speaking of, why do the videos annihilate my CPU? Shouldn't my R9 270X GPU bear some of that brunt?)
Yes, make sure hardware acceleration isn't disabled. For some reason disabling it has been a popular tip for a few years.

Fauxtool posted:

What are my options when it comes to making two monitors look similar? Can a calibrator still help even if one is IPS and the other TN?
Yes, calibration will help a huge amount, though you'll still get color shift on the TN monitor at different viewing angles.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

sauer kraut posted:

I don't think YouTube VP9 videos are hardware decoded unless you play them on a newer Intel integrated GPU.
1080p/60fps can be pretty taxing.
Doesn't YouTube only use VP9 as a fallback on systems without H.264? That video is H.264 @ 1080p60 on my machine, using HTML5 playback in Firefox.

ufarn: Are you getting unexpectedly high CPU usage, GPU usage, or both?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

HalloKitty posted:

XFX is now good and Sapphire is poo poo? It seems like the answer to that question depends on who you ask that day.

I'd put Sapphire above XFX, for what it's worth.
For quite a few years Sapphire had much higher failure rates than other brands, but they seemed to close the gap last year, and as of the latest stats from November 2015 they're on par with other brands (French link).

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Naffer posted:

You're making me feel old. My first 3D card was a 3dfx Voodoo Banshee. It could get 60fps in Quake 2 at 640x480! (http://www.tomshardware.com/reviews/3d-chips,83-7.html)
The MX cards were a strange beast. The Ti4200 was the best deal of that generation (and incidentally, my third video card).
I got an original Radeon LE DDR for the same price as a Geforce 2 MX, but it had pixel shaders and other pretty graphical features!

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Subjunctive posted:

No, it only terminates the rights of the purchased party.
It definitely terminates rights for both Intel and AMD as a result of acquisition.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Fatal posted:

K well, not at that price point, but the idea still stands. I'd really be interested to know how something like this would actually sell. How many people upgrade from a 970 to a 980 Ti? Or an even better question: how many people who upgrade stay with the same vendor?

I guess this is kinda like the EVGA Step-Up program without the inconvenience of actually shipping hardware. It seems like the real risk would be to the manufacturer if somebody figured out how to unlock the card without paying for it.
Intel sold software-upgradeable CPUs that became slightly faster models via an unlock code you could purchase from Intel. I think it only lasted a couple of generations and applied only to specific upgradeable models. Here's an AnandTech post with details.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

SwissArmyDruid posted:

Speaking of Nvidia, what's this I'm hearing about drivers bricking cards? I thought Nvidia was supposed to be the company with the *good* driver team?
Probably just idiots making too much of coincidences; there's not really a way for drivers to damage the hardware. The closest it has ever come is a driver screwing up fan control and causing overheating, but the card would throttle or shut down before damage was done.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Truga posted:

I tried doing this and it wouldn't work, because ASUS in all their wisdom decided not to wire the onboard GPU to anything at all, probably thinking "this is an XFire/SLI mobo, surely nobody will use the Haswell GPU for anything??"
That doesn't seem right; what board?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

GokieKS posted:

What? No, the cards listed as using "DDR3" are actually using plain old DDR3, not GDDR3. If you look at the VRAM chips on any of the (terrible) DDR3 nVidia cards they keep rebadging and search the part numbers, you'll see they're actual DDR3 modules.
Indeed. GDDR5 is based on DDR3, while GDDR3 was based on DDR2, so lower-end versions of GDDR5 cards use DDR3. This absolutely murders performance, since memory bandwidth ends up slower than onboard video's.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

FaustianQ posted:

So wait, would DDR4 in fact give a decent performance boost over DDR3? Say a 2GB DDR4 @ 4000MHz card with a 128-bit bus; I'm likely miscalculating, but that should be ~64GB/s? There should be at least some PCB area and power savings, right?
Yes, DDR4 would be better than DDR3 for graphics, but since it requires new memory controllers don't expect to see it in discrete videocards.
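For what it's worth, the ~64GB/s in the quote checks out. A minimal sketch of the peak-bandwidth arithmetic (effective transfer rate x bus width / 8); the 7GT/s GDDR5 figure is just a ballpark comparison point, not a specific card:

    # Peak memory bandwidth in GB/s from effective rate (MT/s) and bus width (bits).
    def peak_bandwidth_gb_s(mt_per_s, bus_bits):
        return mt_per_s * bus_bits / 8 / 1000

    print(peak_bandwidth_gb_s(4000, 128))  # 64.0 GB/s -> matches the quote's estimate
    print(peak_bandwidth_gb_s(7000, 128))  # 112.0 GB/s -> ballpark 7GT/s GDDR5 card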

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

FaustianQ posted:

Well, the original thought experiment was that since there's a limit to the financial return on heavily cut-down P11 dies, AMD would fill out the lowest end of the 400-series lineup by lifting APU iGPUs and putting them on a PCB. For minimal fuss they'd use DDR4, since a different memory standard would require a redesign. As Paul pointed out, the DDR4 memory controller being attached to the CPU could complicate the task, and you'd end up back at square one, producing two separate but functionally identical GPUs.
AMD APUs already support GDDR5; it was supposed to be offered on DIMMs, but that never came about. I wonder how prices for current-gen DDR4 compare to GDDR5?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Paul MaudDib posted:

XDMA CrossFire is absolutely bulletproof and significantly outperforms SLI in every way - frame pacing, scaling, and physical durability.
Do you have a link to a recent test showing this? All the tests I'm finding, on old drivers, show it beating old CrossFire by a huge margin but only achieving near-parity with SLI, though it makes sense that this would have improved in the Crimson drivers.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Subjunctive posted:

Is there a way to programmatically tell that the driver is in that mode? I've been entertaining writing a warning overlay/widget.
There's a "PerfCap Reason" flag that tells you why the card isn't boosting higher; GPU-Z's Sensors tab will display it.
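If you'd rather query it programmatically than watch GPU-Z, NVML exposes a clocks-throttle-reasons bitmask that reports the same kind of information. A minimal sketch using the pynvml bindings; note this isn't literally GPU-Z's "PerfCap Reason" field, just the closest public equivalent I know of:

    # Ask NVML why the GPU isn't boosting (pip install pynvml).
    from pynvml import (
        nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
        nvmlDeviceGetCurrentClocksThrottleReasons,
        nvmlClocksThrottleReasonGpuIdle,
        nvmlClocksThrottleReasonSwPowerCap,
        nvmlClocksThrottleReasonHwSlowdown,
    )

    REASONS = {
        nvmlClocksThrottleReasonGpuIdle: "idle (low power state)",
        nvmlClocksThrottleReasonSwPowerCap: "software power cap",
        nvmlClocksThrottleReasonHwSlowdown: "hardware slowdown (thermal/power)",
    }

    nvmlInit()
    try:
        handle = nvmlDeviceGetHandleByIndex(0)  # first GPU
        mask = nvmlDeviceGetCurrentClocksThrottleReasons(handle)
        active = [name for bit, name in REASONS.items() if mask & bit]
        print(active or "not throttled - free to boost")
    finally:
        nvmlShutdown()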

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Don't worry about it; as long as you're not regularly using the same oven for reflowing solder and cooking food, you'll be fine. There are very strict regulations preventing the use of toxic substances in electronics; lead-free solder, for example, has been required for a decade.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

EdEddnEddy posted:

And this is pretty much exactly what caused all the lovely electronics issues of 2005-2009-ish. Removal of lead from the solder made the replacement crap brittle after a bunch of heat cycles, which led to the RROD/YLOD and all the GPU and even auto ECU deaths over those years.
I think that had a lot less to do with RoHS requirements and a lot more to do with manufacturers cutting corners on thermal management at the same time as power dissipation was climbing beyond historical levels. Electronics were running hotter, which shortened lifespans. The fix wasn't different solder; it was improved cooling that put devices back into the operating ranges typically considered safe. See also the nVidia bump underfill drama, where an entire generation of GeForces was simply missing the layer of material under the chip that provides strain relief for the solder bumps, causing them to snap and eventually fail. That basically killed ALL laptops from certain year ranges.

Overall, never underestimate the ability of manufacturers to cut corners and then blame anyone but themselves for the resulting problems.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

SwissArmyDruid posted:

The wrongness of this post has caused me to write and delete a reply three times just trying to produce something informative and enlightening about the lead-free solder switchover and my time working in semiconductor manufacturing during that era. It kept getting angrier and angrier the longer it got, so we'll keep this short:

The lead-free solder switchover was a real thing. Solders that purported to be drop-in replacements weren't quite, and needed to be cooked a little hotter to really get melty. In cases where you saw widespread solder failures, it was because they didn't get cooked hot enough.
Oh sure, the switchover was real and the transition required engineering expertise, but hardware didn't get less reliable because manufacturers had to stop using leaded solder. It got less reliable because they stopped investing in engineering and quality control, and the lead-free requirement was an extremely convenient scapegoat.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

SwissArmyDruid posted:

In short: more pixels mean fewer jaggies, because each individual pixel is a much smaller part of an overall line or edge, so the "stair-step effect" is reduced. This means you can turn AA down or off entirely and reclaim the power you'd normally spend on it. That power can then be put towards texture quality or effects or whatever.
That only works for still images. In moving images like video or games, jaggies remain distracting even at high resolutions because you see "temporal aliasing," which manifests as shimmer or sparkling. The only fix is to filter out detail smaller than the display's pixels: one way is to supersample and scale down, the other is to subsample and scale up. The latter is much faster and often works almost as well.
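A minimal NumPy sketch of the supersample-and-scale-down approach, using a plain 2x2 box filter (real resolve filters are fancier):

    # Render at 2x per axis, then average each 2x2 block into one output pixel.
    # The averaging removes sub-pixel detail that would otherwise shimmer
    # from frame to frame (temporal aliasing).
    import numpy as np

    def box_downsample_2x(img):
        """Average each 2x2 block of an (H, W) grayscale image into one pixel."""
        h, w = img.shape
        return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    supersampled = np.random.rand(2160, 3840)     # stand-in for a 2x render of 1080p
    print(box_downsample_2x(supersampled).shape)  # (1080, 1920)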

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

mobby_6kl posted:

Seems like it would work exactly like supersampling though, except that instead of taking an average and discarding the original pixels, they're all displayed. You just can't see them individually because they're too small, assuming a high enough DPI. Which is a problem if your 4K screen is >40"; then you're hosed :v:
The key is that the human eye picks up artifacts in moving images extremely clearly. 4K is enough to fix aliasing for still images, but not for games or video. You're basically saying "let's make pixels small enough that the optics of the human eye do the filtering for us," which is a nice idea but ridiculously wasteful and impractical. That's not a dig at you or anything; this is an extremely common idea, and the reasons it doesn't work aren't at all obvious, because moving images behave fundamentally differently from still images.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
My crystal ball says I'm gonna sell my (GTX) Titan X to someone who doesn't know any better for $1000 :D

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I'd say buy 16GB of dual-channel DDR3-2133 or 2400 (they're around the same price) and a G-Sync monitor. I'd say just the G-Sync monitor, but I think your RAM configuration will cause unnecessary stuttering. You'd get new RAM with a Skylake upgrade, but honestly just buying the RAM is cheap and your CPU is fast enough. Head to the overclocking thread if you're on the latest BIOS.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Grog posted:

The only comparison I know of offhand is the Digital Foundry one with a 2500K and 3770K compared to a 6500: http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k. It seems like increased memory speed has a bigger effect on Sandy Bridge, with relatively minor gains on Ivy for certain games. The minimums increase a decent amount in some cases, but it seems like memory speed doesn't make a huge improvement on their Ivy setup.

There are still pretty big gains in minimum frame rates going from the overclocked 2500K and DDR3-2133 to the other CPUs, though. I'm still on a 2500K at 4.5 GHz, DDR3-1600, and a GTX 770. All of this is starting to make me really feel the itch to upgrade...
Yeah, if you already have enough dual-channel DDR3-1600 you probably won't really notice the gains. But if you still have DDR3-1333, aren't running fully dual-channel, or don't have 16GB of RAM, upgrading to 16GB of faster RAM is a no-brainer if you plan to keep the system for a while. Those performance gaps are only going to widen as new apps demand more memory bandwidth. And if you're buying RAM anyway, fast RAM costs the same as slow RAM and has for years: DDR3-2133 is literally the exact same price as 1600, and 2400 carries only a $5 premium.
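Back-of-the-envelope peak numbers for those configs (channels x 64-bit bus x transfer rate / 8; theoretical peaks, ignoring real-world efficiency):

    # Rough peak bandwidth for the dual-channel DDR3 speeds discussed above.
    def ddr3_peak_gb_s(mt_per_s, channels=2):
        return channels * 64 * mt_per_s / 8 / 1000

    for speed in (1333, 1600, 2133, 2400):
        print(f"dual-channel DDR3-{speed}: {ddr3_peak_gb_s(speed):.1f} GB/s")
    # -> 21.3, 25.6, 34.1, 38.4 GB/s respectively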

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Watermelon Daiquiri posted:

Wouldn't 110 be, you know, worse than 108? It'd be like 1120 poo poo tier. And I think the 14nm-for-next-gen stuff is mixing rumors up...
It gets confusing, but 110 is considered the top of a new range, with 114-118 below it (if they exist; not every position gets used in every generation).

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Geemer posted:

Apparently the new Geforce Experience 3.0 requires you to log in with an Nvidia, Facebook, or Google account. Which loving rear end in a top hat thought that was a good idea?

Guess I'm not going to experience the new Geforce Experience. :shrug:
Thankfully they walked back the plan to require GFE for driver updates and will still offer downloads from their website without a login.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
:siren:No more Trump posts or political arguments below this line:siren:

Go to D&D or C-SPAM or wherever tolerates that poo poo.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
:siren:This thread is for discussion of GPUs, not trade policy:siren:

I mean there's some overlap and it's obviously a highly relevant topic right now, but we're at the point where the discussion really needs to be in a more appropriate thread in a more appropriate forum.
