teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...

Lipstick Apathy

Here's a sheet I found online that I used to help shop for a used card: https://docs.google.com/spreadsheet...3bkd5R1E#gid=0. I believe Asus and Gigabyte AMD card warranties are based on the serial number.

(Eventually settled on an Asus 7970 DCII for $200 shipped that wasn't used for mining.)

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


teh_Broseph posted:

Here's a sheet I found online that I used to help shop for a used card: https://docs.google.com/spreadsheet...3bkd5R1E#gid=0. I believe Asus and Gigabyte AMD card warranties are based on the serial number.

(Eventually settled on an Asus 7970 DCII for $200 shipped that wasn't used for mining.)

EVGA no longer has lifetime warranties.

Ignoarints
Nov 26, 2010


I was wondering about that. It's hard to imagine offering a lifetime warranty on something that's basically guaranteed to fail, though likely only after several generations of improvements. PNY's "lifetime warranty" is a joke: it's the lifetime... of how long they sell it. Sometimes they say 3 years, sometimes they say lifetime warranty, for the exact same card.

Shaocaholica
Oct 29, 2002

Fig. 5E


Beautiful Ninja posted:

Memory bandwidth from the benchmarks I've seen matters more than raw VRAM size

I get that part, but isn't asset complexity and detail pretty much proportional to VRAM? It's a catch-22 for game devs and even GPGPU applications: if you don't have the VRAM, you can't/won't code for that kind of use.

Plus, isn't 4GB of VRAM pretty much on par with current-gen consoles, so game devs wouldn't have to tier their assets as much for PC vs. console ports?

Again, not for resolution's sake but asset complexity's sake.

Ignoarints
Nov 26, 2010


Shaocaholica posted:

I get that part, but isn't asset complexity and detail pretty much proportional to VRAM? It's a catch-22 for game devs and even GPGPU applications: if you don't have the VRAM, you can't/won't code for that kind of use.

Plus, isn't 4GB of VRAM pretty much on par with current-gen consoles, so game devs wouldn't have to tier their assets as much for PC vs. console ports?

Again, not for resolution's sake but asset complexity's sake.

I don't know, but just from a benchmarks perspective you definitely need one with the other or it won't matter. For example, 4GB 760's and 770's are an almost worthless expense over the 2GB versions, whereas something like the 500-series cards had the opposite problem: they would choke from a lack of VRAM.

I kind of doubt it's going to affect game development very much, though. I'm pretty sure they're trying to push the limits as much as possible without much regard for something like mid-grade nvidia card memory limitations, since there are plenty of cards without that particular bottleneck.

But like was said, it's very likely to change in exactly this way; since it requires a change in architecture, though, it won't be in this lineup (or if it is, it'd be a new part number, like a 770ti or something).

Shaocaholica
Oct 29, 2002

Fig. 5E


Ignoarints posted:

I don't know, but just from a benchmarks perspective you definitely need one with the other or it won't matter. For example, 4GB 760's and 770's are an almost worthless expense over the 2GB versions, whereas something like the 500-series cards had the opposite problem: they would choke from a lack of VRAM.

I kind of doubt it's going to affect game development very much, though. I'm pretty sure they're trying to push the limits as much as possible without much regard for something like mid-grade nvidia card memory limitations, since there are plenty of cards without that particular bottleneck.


I don't think benchmarks are really necessary to gauge VRAM size when everything else is equal. You either have enough, or you don't and need to swap; that's a game/app design issue. The catch-22 is that game devs aren't going to use 4GB of VRAM if the majority of people don't have it. Unless I'm mistaken and there -ARE- games that use >2GB of VRAM for game data. Games that are popular???

Ignoarints posted:

But like was said, it's very likely to change in exactly this way; since it requires a change in architecture, though, it won't be in this lineup (or if it is, it'd be a new part number, like a 770ti or something).

Oh for sure, I don't think it's this generation of GPUs. Just pondering if it's the next or the one after.

Ignoarints
Nov 26, 2010


Shaocaholica posted:

I don't think benchmarks are really necessary to gauge VRAM size when everything else is equal. You either have enough, or you don't and need to swap; that's a game/app design issue. The catch-22 is that game devs aren't going to use 4GB of VRAM if the majority of people don't have it. Unless I'm mistaken and there -ARE- games that use >2GB of VRAM for game data. Games that are popular???


Oh for sure, I don't think it's this generation of GPUs. Just pondering if it's the next or the one after.

From what I understand, VRAM is used to store textures to be rendered as well as completed frames. However, both of those things are directly affected by resolution and game settings, and thus so is the size needed, which can be seen indirectly in benchmarks (both FPS and frame variance). Once you get past 1080p, games definitely do use more than 2GB of VRAM (Crysis 3 can use 3GB at 1600p, and even BF4 can consistently use over 2GB). However, simply increasing VRAM does very little to alleviate VRAM issues on mid-range nvidia cards, because even though some of that data now has somewhere to live, the card chokes on memory bandwidth or is otherwise unable to process what it has fast enough. Not to say it doesn't help, but by the time it does, the game is already unplayable.
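
To put rough numbers on the resolution part (my own back-of-the-envelope math, not from any benchmark), the framebuffers themselves are actually tiny; it's the textures and AA buffers that eat the gigabytes:

code:

# Back-of-the-envelope framebuffer math (illustrative only). Texture
# data, which usually dominates VRAM use, comes on top of this.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Triple-buffered RGBA8 framebuffer size in megabytes."""
    return width * height * bytes_per_pixel * buffers / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1600p": (2560, 1600),
                     "3x1080p surround": (5760, 1080)}.items():
    print(f"{name:>16}: {framebuffer_mb(w, h):5.1f} MB")  # ~24 / ~47 / ~71 MB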

I really don't think developers limit anything based on this specific issue, as opposed to the limitations of graphics cards overall. Since lowering settings directly reduces how much VRAM is used, they can code for much higher levels (within reason).

As far as generations though, I'd be really shocked if this wasn't addressed in the 800 series

edit: The one situation I've always been interested in is whether there is any improvement with 4GB cards in SLI over 2GB cards that on their own show very little difference. This information is hard to come by.

Ignoarints fucked around with this message at 18:03 on May 14, 2014

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Ignoarints posted:

edit: The one situation I've always been interested in is whether there is any improvement with 4GB cards in SLI over 2GB cards that on their own show very little difference. This information is hard to come by.

I'd be interested in hearing more about this as well. Maybe we can bug techreport.com to do an article on it?

veedubfreak
Apr 2, 2005

by Smythe


^^^^ Higher VRAM starts to become useful with multi-card setups and giant resolutions. Currently, not even a Titan has enough power to make use of all its memory, but when you add more monitors and more cards, you can get into situations where the 3GB of memory on a 780 isn't enough.


Ignoarints posted:

Yes, just seriously consider from me. After reading one long thread about how they treated their cards on some mining forum it'd take some cheap price like that.

Do AMD cards generally lose their warranty if they are opened? I've noticed it both ways for nvidia so far, even between different brands with the same chipset. That might be something to consider as well.

XFX and Powercolor both allow you to remove the cooler as long as you don't damage anything.

Ignoarints
Nov 26, 2010


fletcher posted:

I'd be interested in hearing more about this as well. Maybe we can bug techreport.com to do an article on it?

I was hoping someone would once the 6GB 780's came out. Which they did - today. Talk about timing.

It would seem logical to benchmark those against the 3GB versions at a variety of resolutions. However, since these have higher memory bandwidth they might not relate to 770's or 760's very well (as far as this specific subject goes), but since the particular focus of this release is pure VRAM size, hopefully they will test the other cards as well.

Especially if there is a nice improvement, which would be logical since 780's in SLI can probably handle a larger VRAM workload. If there is an improvement, hopefully someone will compare lower cards in the same situation to see if the same improvement is present or not.

edit: true to form, the first review is a typical comprehensive multi-game and 3DMark examination that doesn't include comparisons to the 3GB version, despite specifically mentioning that was the purpose of the card. Unless I missed something (I don't have time to read all of it at work), the only verdict is at the end, where it says "the extra memory doesn't matter much". If that ends up being true, though, I suppose we can guess that lower cards will experience the same thing.

Ignoarints fucked around with this message at 18:44 on May 14, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E


Anyone know why all of these miniDP to HDMI cables have 'fat' HDMI ends?



Is there circuitry necessary to convert DP to HDMI? I thought there was an electrically compatible mode that would have required much less hardware, if any?

Ignoarints
Nov 26, 2010


Displayport is different: it transmits data on individual pairs of wires (and can use multiple pairs), where HDMI always uses a fixed number. HDMI and DVI are the same in this way and can be converted with no circuitry; a displayport signal, however, must be converted to the HDMI-style signal. That requires circuitry which is powered by the displayport or hdmi source (edit: wrong, that's active adapters only. The housing contains a small PCB with an electrical level shifter, which accounts for the fat cables).

Ignoarints fucked around with this message at 21:20 on May 14, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E


Ignoarints posted:

Displayport is different: it transmits data on individual pairs of wires (and can use multiple pairs), where HDMI always uses a fixed number. HDMI and DVI are the same in this way and can be converted with no circuitry; a displayport signal, however, must be converted to the HDMI-style signal. That requires circuitry which is powered by the displayport or hdmi source

Ah, gotcha. Not HDMI but close enough.




I wonder if there's any noticeable lag? Not that it matters for my use.

Ignoarints
Nov 26, 2010


I don't know; unfortunately, converting signals within cables opens up a whole new way for a cable to suck rear end.

For the record though, I've used a ton for work (mostly on apples) with no discernible issues. Even cheap rear end monoprice ones

Ignoarints fucked around with this message at 19:26 on May 14, 2014

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!


And I quote:

quote:

DisplayPort-to-HDMI adaptors and DisplayPort-to-DVI adaptors are very simple and only operate one way. For instance, when a DP-to-HDMI adaptor is connected to a PC that supports DP++ (Dual-Mode) capability, the PC senses the presence of the adaptor and sends HDMI signals over the DisplayPort connector rather than DisplayPort signals. No signal conversion is performed by the HDMI adaptor. HDMI signals are merely passed through. The unique DisplayPort adaptor capability enables the PC to connect to a variety of displays via the DisplayPort connector including HDMI, DVI, and VGA. VGA adaptors are more complex and perform active signal conversion from DisplayPort to VGA. These adaptors also operate only one way. Unfortunately, HDMI does not support conversion to other display formats as does DisplayPort.

Ignoarints
Nov 26, 2010


deimos posted:

And I quote:

I never actually looked into it, and googling it is a huge can of worms. I was always told that they had at least basic circuitry for the reason above, regardless of passive vs. active.

quote:

To use a “DP++” video connector with HDMI displays, an external signal-level (and connector) adapter dongle must be inserted.
This is essentially a “passive” device that is powered by the DisplayPort connector on the source device (laptop/PC).

Which is what I was (or used to be) told. A little vague, though. But here's a common adapter in A/V:

http://www.accellcables.com/B086B-001B.html

It states that it is a passive cable (they carry active ones as well) with circuitry. That would make more sense to me, since every DP adapter I've ever seen is fat as hell, from cheap crap to expensive stuff.

In that same quote, what is it saying about HDMI? HDMI adapts to DVI directly, and to VGA just as easily (edit: easily, as in, as readily as displayport does).

Ignoarints fucked around with this message at 20:37 on May 14, 2014

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!


Ignoarints posted:

In that same quote, what is it saying about HDMI? HDMI adapts to DVI directly, and to VGA just as easily (edit: easily, as in, as readily as displayport does).

Err no, HDMI is signal compatible with DVI but VGA requires active conversion.

My source: http://www.displayport.org/faq/ which, I think, might be considered canonical, but maybe your source is a better one.

veedubfreak
Apr 2, 2005

by Smythe


Displayport is a godsend and has taken entirely too long to become mainstream.

Ignoarints
Nov 26, 2010


deimos posted:

Err no, HDMI is signal compatible with DVI but VGA requires active conversion.

My source: http://www.displayport.org/faq/ which, I think, might be considered canonical, but maybe your source is a better one.

HDMI converts directly to DVI, and to VGA in much the same way as Displayport does. I read that paragraph too, and it does literally say that it doesn't. That doesn't make any sense coming from what I'm assuming is the official website for the displayport standard.

DVI and HDMI are functionally the same when we're talking about adapters. Displayport-to-DVI adapters always have some kind of chip in them. While I was completely wrong about the typical cable converting the signal (as an active cable does), the fat housings should still be there to house a small PCB.

Here is "Mike" explaining passive displayport adapters.

http://mikesmods.com/mm-wp/?p=494

An NXP passive design is referenced in there, which you can see on page 27 here:

http://dl.cubieforums.com/LapdockMaterials/NXP-DP.pdf

I dunno what to think. Adapters give me a headache. I'm inclined to think that part of the FAQ is just plain (and inconsequential) misinformation, since it specifically says no signal conversion is performed when in fact it looks like some signal conversion is performed even in the cheapest $3 cables.

Ignoarints fucked around with this message at 21:03 on May 14, 2014

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!


Ignoarints posted:

I dunno what to think. Adapters give me a headache. I'm inclined to think that part of the FAQ is just plain (and inconsequential) misinformation, since it specifically says no signal conversion is performed when in fact it looks like some signal conversion is performed even in the cheapest $3 cables.

No there isn't, read the article you posted:

quote:

Clearly the two standards are very different, and normally quite incompatible. In a world with millions of DVI devices, and with competitor HDMI offering backward-compatibility with these devices, DisplayPort would be dead in the water without a simple, inexpensive way to support legacy displays. To address this, VESA (DisplayPort’s standards organization) added “Dual-Mode” operation to the specification, which allow the video device to output DVI-compatible raw pixel data over three of its DP lanes, and a clock signal on the other. Add a simple level-shifter to offset the data and drive it appropriately, and you have a very convincing DVI output. This is the Passive adapter.

An active adapter, on the other hand, does not need the video card to output a special signal. It takes in native DisplayPort and converts the data itself, outputting a completely different signal on the other end. Internally, it receives and decodes the DisplayPort packets, reconstructing the video signal in a format usable by the sink device.

Essentially the "passive adapter" simply says HEY DP HOST THIS IS A DVI ADAPTER FEEEED MEEEEEE. And the DP Host says "OK, here's some data on 4 channels instead of my normal 1 (or two or four)", there is absolutely no conversion on the signal, there is a handshake and then barfing of data.

Ignoarints
Nov 26, 2010


deimos posted:

No there isn't, read the article you posted:


Essentially the "passive adapter" simply says HEY DP HOST THIS IS A DVI ADAPTER FEEEED MEEEEEE. And the DP Host says "OK, here's some data on 4 channels instead of my normal 1 (or two or four)", there is absolutely no conversion on the signal, there is a handshake and then barfing of data.

Sorry, this is what I meant

quote:

Add a simple level-shifter

which is what accounts for the fat cables (not an active signal converter, but an electrical level shifter in every one)

Well, I learned something. I have no idea what I'll do with that information.

Shaocaholica
Oct 29, 2002

Fig. 5E



So that kind of cable is passive? It sure is cheap, and while I do own one I'm not in a position to bust it open to see what's inside the fat end.


edit: probably 'passive'

quote:

It should be clear that an AC-coupled 400mVPP signal cannot be converted to a DC-coupled 400-600mVPP signal riding on a 2.9V offset without something intervening in the middle. Thus, even for a “passive” adapter, active circuitry is required to handle the level-shift.


http://mikesmods.com/mm-wp/?p=494
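
Putting the quoted numbers side by side (my own arithmetic, same figures as Mike's post):

code:

# The two signal formats from the quote above (illustrative arithmetic).
# DP++ source output: AC-coupled, ~400 mVpp centered on 0 V.
dp_lo, dp_hi = -0.200, +0.200

# TMDS (HDMI/DVI) input: DC-coupled, 400-600 mVpp riding on a 2.9 V
# offset (using 500 mVpp, the middle of that range, for the example).
tmds_lo, tmds_hi = 2.9 - 0.250, 2.9 + 0.250

print(f"DP pins swing:  {dp_lo:+.2f} V to {dp_hi:+.2f} V")
print(f"TMDS pins want: {tmds_lo:+.2f} V to {tmds_hi:+.2f} V")
# Bridging that ~2.9 V gap is the level-shifter's whole job, and it's
# why even a "passive" adapter has a chip in the fat end.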

Shaocaholica fucked around with this message at 21:37 on May 14, 2014

Ignoarints
Nov 26, 2010


Yes, 'passive' in the sense that it's not doing a displayport-to-hdmi signal conversion; that's just a level converter for some electrical stuff. I scrolled through a bunch of reference designs and they all have it, which would make sense, as every displayport adapter I've ever seen has some kind of fat end on it.

Zero VGS
Aug 16, 2002
"It has gunfights and shit!"


Lipstick Apathy

Don't know if anyone would know this... you know all those YouTubes where people use the Oculus Rift and the display is mirrored onto an external monitor? Is that a "free" output, or does it cost substantial rendering power on a card? I have two 1080p monitors lying around, so since I ordered two Rifts I'd like to throw a party for people to play/spectate.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Zero VGS posted:

Don't know if anyone would know this... you know all those YouTubes where people use the Oculus Rift and the display is mirrored onto an external monitor? Is that a "free" output, or does it cost substantial rendering power on a card? I have two 1080p monitors lying around, so since I ordered two Rifts I'd like to throw a party for people to play/spectate.

Any modern (as in, from the last 5+ years) video card will have no issue mirroring a display. It'll cost you basically nothing, performance-wise.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


God drat, 290's are cheap as gently caress.

http://www.ebay.com/itm/Sapphire-AM...=item35d6814f4c

Someone sold 63 three-month-old 290's for $199 each.

Ignoarints
Nov 26, 2010


Did someone just buy everyone's 290's from mining? For even less than that? Unbelievable price, lol

Ignoarints
Nov 26, 2010


Ignoarints posted:

Did someone just buy everyone's 290's from mining? For even less than that? Unbelievable price, lol

oh... never mind then about the source

http://www.ebay.com/itm/Litecoin-SC...=item35d4d8e492

Wistful of Dollars
Aug 25, 2009



Don Lapre posted:

God drat, 290's are cheap as gently caress.

http://www.ebay.com/itm/Sapphire-AM...=item35d6814f4c

Someone sold 63 three-month-old 290's for $199 each.

Three 290s is bonkers gpu power for $600.

Strategy
Jul 1, 2002


Son of a bitch I just bought another 660ti to SLI. I wish I had seen those cheap 290s before.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast


Strategy posted:

Son of a bitch I just bought another 660ti to SLI. I wish I had seen those cheap 290s before.

Everyone does

Josh Lyman
May 24, 2009





Don Lapre posted:

God drat, 290's are cheap as gently caress.

http://www.ebay.com/itm/Sapphire-AM...=item35d6814f4c

Someone sold 63 three-month-old 290's for $199 each.

It looks like it was just that one seller.

*tells myself my 760 for $199 is good enough for D3 and SC2 @ 1440p*

Josh Lyman fucked around with this message at 13:43 on May 15, 2014

Ignoarints
Nov 26, 2010


Strategy posted:

Son of a bitch I just bought another 660ti to SLI. I wish I had seen those cheap 290s before.

Hey, 660ti SLI rules man. Don't fret.

but yeah

Shaocaholica
Oct 29, 2002

Fig. 5E


Guy at work just dumped his 3x290 mining rig as well. Seems like something happened where the potential profit wasn't worth the further depreciation of that spec hardware.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


Shaocaholica posted:

Guy at work just dumped his 3x290 mining rig as well. Seems like something happened where the potential profit wasn't worth the further depreciation of that spec hardware.

The difficulty on doge went up a month or two ago and some idiots were still building/holding onto rigs instead of buying ASICs. Now they are finally dumping them.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Is there even a market for these flavor-of-the-month coins? I mean, there are at least a few online stores that'll let you pay in buttcoins...but who the hell takes dodgecoins for anything?

I'm glad to see the mining craze has largely stopped and the prices are coming back down, but the $300-$350 that most of the used 290X's are going for is still too much, if you ask me. I mean, they're almost all the original reference designs, which means they're only ~9% faster than a 290, and they throttle like hell due to their terrible heatsinks. You can get brand new 290's for ~$350 these days, with excellent 3rd party coolers and decent overclocking potential. So really, by the time you replace the reference 290X heatsink so you can actually out-perform the current 290's, you're already at or above the same price, and you're dealing with a used card vs a new one.
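
Quick sketch of that math (the ~9% figure and prices are from above; the throttling penalty and cooler price are my own guesses, just to show the shape of the argument):

code:

# Rough price/perf sketch of the argument above. The ~9% stock advantage
# and prices are from the post; the 8% throttling penalty and the $70
# aftermarket cooler are assumed numbers for illustration.
options = [
    ("new 290 (good cooler)",   1.00,        350),
    ("used 290X, ref cooler",   1.09 * 0.92, 325),  # throttles itself back
    ("used 290X + new cooler",  1.09,        325 + 70),
]
for name, perf, cost in options:
    print(f"{name:<24} perf {perf:.2f}  ${cost}")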

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.


DrDork posted:

Is there even a market for these flavor-of-the-month coins? I mean, there are at least a few online stores that'll let you pay in buttcoins...but who the hell takes dodgecoins for anything?

I'm glad to see the mining craze has largely stopped and the prices are coming back down, but the $300-$350 that most of the used 290X's are going for is still too much, if you ask me. I mean, they're almost all the original reference designs, which means they're only ~9% faster than a 290, and they throttle like hell due to their terrible heatsinks. You can get brand new 290's for ~$350 these days, with excellent 3rd party coolers and decent overclocking potential. So really, by the time you replace the reference 290X heatsink so you can actually out-perform the current 290's, you're already at or above the same price, and you're dealing with a used card vs a new one.

It's all speculators. Lots apparently convert their doge immediately into bitcoin, too.

ShaneB
Oct 21, 2002




My buddy has an old 45nm Intel i7, a decent PSU, and 8GB of RAM. He's also quitting drinking, and I want to help him distract himself with a new GPU so newish games can be played decently. If I want to keep the cost around $100 or less, what's the best buy here? I don't even care if it's new or not...

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

ShaneB posted:

My buddy has an old 45nm Intel i7, a decent PSU, and 8GB of RAM. He's also quitting drinking, and I want to help him distract himself with a new GPU so newish games can be played decently. If I want to keep the cost around $100 or less, what's the best buy here? I don't even care if it's new or not...

First line of the OP says to use the parts-picking thread for parts-picking advice. Does he currently have a video card?

ShaneB
Oct 21, 2002




fletcher posted:

First line of the OP says to use the parts-picking thread for parts-picking advice.

Yeah but that ain't old sub-$100 stuff at all. We basically start recommendations at the 750ti.
