|
Here's a sheet I found online that I used to help shop for a used card: https://docs.google.com/spreadsheet/ccc?key=0ArCtGVODPUTtdEdEUjRiSFdyckZ1Q1dGNUI3bkd5R1E#gid=0. I believe Asus and Gigabyte AMD card warranties are based on the serial number. (I eventually settled on an Asus 7970 DCII for $200 shipped that wasn't used for mining.)
|
# ? May 14, 2014 14:54 |
|
|
# ? Apr 23, 2024 20:56 |
|
teh_Broseph posted:Here's a sheet I found online I used to help shop for a used card: https://docs.google.com/spreadsheet/ccc?key=0ArCtGVODPUTtdEdEUjRiSFdyckZ1Q1dGNUI3bkd5R1E#gid=0. I believe Asus and Gigabyte AMD card warranties are based on serial.

EVGA no longer has lifetime warranties.
|
# ? May 14, 2014 15:18 |
I was wondering about that. It's hard to imagine offering a lifetime warranty on something that is basically guaranteed to fail, though likely only after several generations of improvements. PNY's "lifetime warranty" is a joke. It's the lifetime... of however long they sell the card. Sometimes they say 3 years, sometimes they say lifetime warranty, for the exact same card.
|
|
# ? May 14, 2014 15:27 |
|
Beautiful Ninja posted:Memory bandwidth from the benchmarks I've seen matters more than raw VRAM size

I get that part, but isn't asset complexity and detail pretty much proportional to VRAM? It's a catch-22 for game devs and even GPGPU applications. If you don't have the VRAM, you can't/won't code for that kind of use. Plus, isn't 4 GB of VRAM pretty much on par with current-gen consoles, so game devs wouldn't have to tier their assets as much for PC vs. console ports? Again, not for resolution's sake but for asset complexity's sake.
|
# ? May 14, 2014 18:15 |
Shaocaholica posted:I get that part but isn't asset complexity and detail pretty much proportional to VRAM? Its a catch-22 for game devs and even GPGPU applications. If you don't have the VRAM, you can't/won't code for that kind of use.

I don't know, but just from a benchmarks perspective you definitely need one with the other or it won't matter. For example, 4 GB 760s and 770s are an almost worthless expense over a 2 GB version. Whereas with something like the 500-series cards it was the opposite problem: they would choke from the lack of VRAM. I kind of doubt it's going to affect game development very much though. I'm pretty sure they're trying to push the limits as much as possible without much regard to something like mid-grade Nvidia card memory limitations, since there are plenty of cards without that particular bottleneck. But like was said, it's very likely to change in exactly this way; since it requires a change in architecture it won't be in this lineup (or if it is, it'd be under a new part number, like a 770 Ti or something)
|
|
# ? May 14, 2014 18:27 |
|
Ignoarints posted:I don't know, but just from a benchmarks perspective you definitely need one with the other or it won't matter. For example, 4 GB 760's and 770's are an almost worthless expense over a 2 GB. Where something like the 500's series cards it was the opposite problem - it would choke from the lack of vram.

I don't think benchmarks are really necessary to gauge VRAM size when everything else is equal. You either have enough or you don't and need to swap. That's a game/app design issue. The catch-22 is that game devs aren't going to use 4 GB of VRAM if the majority of people don't have it. Unless I'm mistaken and there -ARE- games that use >2 GB of VRAM for game data. Games that are popular???

Ignoarints posted:But like was said, it's very likely to change in exactly this way, but since it requires a change in architecture it won't be in this lineup (or if it is, it'd be a new part number, like 770ti or something)

Oh, for sure, I don't think it's this generation of GPUs. Just pondering whether it's the next or the one after.
|
# ? May 14, 2014 18:41 |
Shaocaholica posted:I don't think benchmarks are really necessary to gauge VRAM size when everything else is equal. You either have enough or you don't and need to swap. That's a game/app design issue. The catch-22 is that game devs aren't going to use 4GB of vram if the majority of people don't have it. Unless I'm mistaken here and there -ARE- games that use >2GB of VRAM for game data. Games that are popular???

From what I understand, VRAM is used to store textures to be calculated and also completed frames. Both of those things are directly affected by resolution and game settings, and thus so is the size, which can be seen indirectly in benchmarks (both FPS and frame-time variance). Once you get past 1080p, games definitely do use more than 2 GB of VRAM (Crysis 3 can use 3 GB at 1600p, and even BF4 can consistently use over 2 GB). However, simply increasing VRAM does very little to alleviate VRAM issues on mid-range Nvidia cards, because even though the card is putting some of that data elsewhere, it chokes on memory bandwidth, or otherwise can't calculate what it does have fast enough overall. Not to say it doesn't help, but by the time it does, the game is unplayable. I really don't think developers limit anything based on this specific issue, rather than on the limitations of graphics cards overall. Since lowering settings directly reduces how much VRAM is used, they can code for much higher levels (within reason). As far as generations go though, I'd be really shocked if this wasn't addressed in the 800 series.

edit: The one situation I've always been interested in is whether there is any improvement with 4 GB cards in SLI over 2 GB cards that on their own show very little difference. This information is hard to come by.

Ignoarints fucked around with this message at 19:03 on May 14, 2014 |
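The resolution-to-VRAM relationship in the post above is easy to sanity-check with back-of-envelope math (a rough sketch only; the buffer count and bytes-per-pixel are typical assumptions, and real games add textures, geometry, and driver overhead on top):

```python
# Rough framebuffer math: each buffered frame costs width * height *
# bytes_per_pixel, and a GPU keeps several (front/back color buffers,
# depth buffer, etc.). The counts here are illustrative assumptions.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate VRAM used by frame/depth buffers, in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1600p", (2560, 1600)),
                     ("4K",    (3840, 2160))]:
    print(f"{name}: ~{framebuffer_mb(w, h):.1f} MiB of buffers")
```

At 32 bits per pixel the raw frame and depth buffers are a surprisingly small slice of a 2 GB card; it's the texture and asset data scaling with quality settings that eats the rest.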
|
# ? May 14, 2014 19:00 |
Ignoarints posted:edit: The one situation that I've always been interested in is if there is any improvement with 4gb cards in SLI over 2gb cards that on their own show very little difference. This information is hard to come by I'd be interested in hearing more about this as well. Maybe we can bug techreport.com to do an article on it?
|
|
# ? May 14, 2014 19:12 |
|
^^^^ Higher VRAM starts to become useful with multi-card setups and giant resolutions. Currently not even a Titan has enough power to use up all of its memory. But when you add more monitors and more cards, you can get into situations where the 3 GB of memory on a 780 isn't enough.

Ignoarints posted:Yes, just seriously consider from me. After reading one long thread on some mining forum about how they treated their cards, it'd take some cheap price like that.

XFX and PowerColor both allow you to remove the cooler as long as you don't damage anything.
|
# ? May 14, 2014 19:13 |
fletcher posted:I'd be interested in hearing more about this as well. Maybe we can bug techreport.com to do an article on it?

I was hoping someone would once the 6 GB 780s came out. Which they did, today. Talk about timing. It would seem logical to benchmark those against the 3 GB versions at a variety of resolutions. However, since these have higher bandwidth it might not relate to 770s or 760s very well (as far as this specific subject goes), but hopefully, since the particular focus for this release is pure VRAM size, they will test the other cards as well. Especially if there is a nice improvement, which would be logical since 780s in SLI can probably handle a larger VRAM workload. So if there is an improvement, hopefully someone will compare lower cards in the same situation to see if the same improvement is present or not.

edit: True to form, the first review is a typical comprehensive multi-game and 3DMark examination that doesn't include comparisons to the 3 GB version, despite specifically mentioning that that was the purpose of the card. Unless I missed something (I don't have time to read all of it at work), except at the end where it says "the extra memory doesn't matter much". If that ends up being true, I suppose we can guess that lower cards will experience the same thing.

Ignoarints fucked around with this message at 19:44 on May 14, 2014 |
|
# ? May 14, 2014 19:23 |
|
Anyone know why all of these miniDP to HDMI cables have 'fat' HDMI ends? Is circuitry necessary to convert DP to HDMI? I thought there was an electrically compatible mode that would have required much less hardware, if any?
|
# ? May 14, 2014 19:47 |
Displayport is different, and transmits data on single pairs of wires (but can use multiple pairs), where HDMI always uses a fixed number. HDMI and DVI are the same in this way and can be converted with no circuitry; however, a DisplayPort signal must be converted to the HDMI-style signal. That requires circuitry, which is powered by the DisplayPort or HDMI source.

Ignoarints fucked around with this message at 22:20 on May 14, 2014 |
|
# ? May 14, 2014 20:03 |
|
Ignoarints posted:Displayport is different, and transmits data on single pairs of wires (but can use multiple pairs) where HDMI uses a fixed number always. HDMI and DVI are the same in this way and can be converted with no circuitry, however a displayport signal must be converted to the hdmi style signal. That requires circuitry which is powered by the displayport or hdmi source

Ah, gotcha. Not HDMI, but close enough. I wonder if there's any noticeable lag? Not that it matters for my use.
|
# ? May 14, 2014 20:13 |
I don't know; unfortunately, converting signals within cables opens up a whole new way for a cable to suck rear end. For the record though, I've used a ton for work (mostly on Apples) with no discernible issues. Even cheap rear end Monoprice ones.

Ignoarints fucked around with this message at 20:26 on May 14, 2014 |
|
# ? May 14, 2014 20:17 |
|
And I quote:quote:DisplayPort-to-HDMI adaptors and DisplayPort-to-DVI adaptors are very simple and only operate one way. For instance, when a DP-to-HDMI adaptor is connected to a PC that supports DP++ (Dual-Mode) capability, the PC senses the presence of the adaptor and sends HDMI signals over the DisplayPort connector rather than DisplayPort signals. No signal conversion is performed by the HDMI adaptor. HDMI signals are merely passed through. The unique DisplayPort adaptor capability enables the PC to connect to a variety of displays via the DisplayPort connector including HDMI, DVI, and VGA. VGA adaptors are more complex and perform active signal conversion from DisplayPort to VGA. These adaptors also operate only one way. Unfortunately, HDMI does not support conversion to other display formats as does DisplayPort.
|
# ? May 14, 2014 21:08 |
deimos posted:And I quote:

I never actually looked into it, and googling it is a huge can of worms. I was always told that they had at least basic circuitry for the reason above, passive vs. active regardless.

quote:To use a “DP++” video connector with HDMI displays, an external signal-level (and connector) adapter dongle must be inserted.

Which is what I was (or used to be) told. A little vague, though. A common adapter in A/V, http://www.accellcables.com/B086B-001B.html, states that it is a passive cable (they carry active ones as well) with circuitry. That would make more sense to me, since every DP adapter I've ever seen is fat as hell, from cheap crap to expensive stuff. In that same quote, what is it saying about HDMI? HDMI adapts to DVI directly, and to VGA just as easily (edit: easily, as in, as readily as DisplayPort does).

Ignoarints fucked around with this message at 21:37 on May 14, 2014 |
|
# ? May 14, 2014 21:34 |
|
Ignoarints posted:In that same quote, what is talking about HDMI? HDMI adapts to DVI directly, and VGA just as easily (edit: easily, as in, as readily as display port)

Err, no: HDMI is signal-compatible with DVI, but VGA requires active conversion. My source: http://www.displayport.org/faq/ which, I think, might be considered canonical, but maybe your source is a better one.
|
# ? May 14, 2014 21:47 |
|
Displayport is a godsend and has taken entirely too long to become mainstream.
|
# ? May 14, 2014 21:51 |
deimos posted:Err no, HDMI is signal compatible with DVI but VGA requires active conversion.

HDMI converts directly to DVI, and to VGA in the same sort of way as DisplayPort does. I read the paragraph in there too; it does literally say that it doesn't, which doesn't make any sense coming from what I assume is the official website for the DisplayPort standard. DVI and HDMI are functionally the same when we're talking about adapters. DisplayPort-to-DVI adapters always have some kind of chip in them. While I was completely wrong about the typical cable converting the signal (as an active cable does), the fat housings should still be there to house a small PCB. Here is "Mike" explaining passive DisplayPort adapters: http://mikesmods.com/mm-wp/?p=494. An NXP passive design is referenced in there, which you can see on page 27 here: http://dl.cubieforums.com/LapdockMaterials/NXP-DP.pdf

I dunno what to think. Adapters give me a headache. I'm inclined to think that part of the FAQ is just plain (and inconsequential) misinformation, since it specifically says no signal conversion is performed, when in fact it looks like signal conversion is performed even in the cheapest $3 cables.

Ignoarints fucked around with this message at 22:03 on May 14, 2014 |
|
# ? May 14, 2014 21:59 |
|
Ignoarints posted:I dunno what to think. Adapters give me a headache. I'm inclined to think that part of the FAQ is just plain (and inconsequential) misinformation since it specifically says no signal conversion is performed when in fact it looks like signal conversion is performed even in the cheapest $3 cables

No there isn't; read the article you posted:

quote:Clearly the two standards are very different, and normally quite incompatible. In a world with millions of DVI devices, and with competitor HDMI offering backward-compatibility with these devices, DisplayPort would be dead in the water without a simple, inexpensive way to support legacy displays. To address this, VESA (DisplayPort’s standards organization) added “Dual-Mode” operation to the specification, which allows the video device to output DVI-compatible raw pixel data over three of its DP lanes, and a clock signal on the other. Add a simple level-shifter to offset the data and drive it appropriately, and you have a very convincing DVI output. This is the Passive adapter.

Essentially the "passive adapter" simply says HEY DP HOST THIS IS A DVI ADAPTER FEEEED MEEEEEE. And the DP host says "OK, here's some data on 4 channels instead of my normal 1 (or 2 or 4)". There is absolutely no conversion of the signal; there is a handshake and then barfing of data.
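The dual-mode handshake described here can be sketched as a toy model (all names and strings below are made up for illustration; real DP++ detection happens electrically through the adapter's configuration pins, not through any API):

```python
# Toy model of DP++ (Dual-Mode DisplayPort): the source detects what kind
# of adapter is attached and changes what it transmits. A passive HDMI/DVI
# adapter never converts the signal itself; it only level-shifts what the
# source already emits. Names and strings are illustrative, not a real API.
def source_output_for(sink):
    if sink == "displayport":
        return "native DP packets on 1, 2, or 4 lanes"
    if sink in ("hdmi", "dvi"):
        # Dual mode: the GPU itself emits TMDS-style (HDMI/DVI) signalling,
        # pixel data on three lanes and the clock on the fourth.
        return "TMDS pixel data on 3 lanes + clock on the 4th"
    if sink == "vga":
        # No dual mode for analog: an active adapter must convert.
        return "DP packets for an active adapter to convert to analog"
    raise ValueError(f"unknown sink: {sink}")
```

The point of the model: the "conversion" lives in the source, not the cable, which is why a passive adapter can be a $3 part with nothing but a level shifter in the fat end.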
|
# ? May 14, 2014 22:10 |
deimos posted:No there isn't, read the article you posted:

Sorry, this is what I meant:

quote:Add a simple level-shifter

which is what accounts for the fat cables (not an active signal converter, but an electrical level shifter in every one). Well, I learned something. I have no idea what I'll do with that information.
|
|
# ? May 14, 2014 22:14 |
|
So that kind of cable is passive? It sure is cheap, and while I do own one I'm not in a position to bust it open to see what's inside the fat end.

edit: probably 'passive'

quote:It should be clear that an AC-coupled 400mVPP signal cannot be converted to a DC-coupled 400-600mVPP signal riding on a 2.9V offset without something intervening in the middle. Thus, even for a “passive” adapter, active circuitry is required to handle the level-shift.

http://mikesmods.com/mm-wp/?p=494

Shaocaholica fucked around with this message at 22:37 on May 14, 2014 |
# ? May 14, 2014 22:34 |
Yes, 'passive' in the sense that it's not doing a DisplayPort-to-HDMI signal conversion; that's just a level converter for the electrical stuff. I scrolled through a bunch of reference designs and they all have it, which would make sense, as every DisplayPort adapter I've ever seen has some kind of fat end on it.
|
|
# ? May 14, 2014 22:45 |
|
Don't know if anyone would know this... you know all those YouTubes where people use the Oculus Rift and the display is being mirrored onto an external monitor? Is that a "free" output, or does it cost substantial rendering power on the card? I have two 1080p monitors lying around, so since I ordered two Rifts, I'd like to throw a party for people to play/spectate.
|
# ? May 15, 2014 03:22 |
|
Zero VGS posted:Don't know if anyone would know this... you know all those Youtubes where people use the Oculus Rift and the display is being mirrored onto an external monitor? Is that a "free" output or does it cost substantial rendering power on a card? I have two 1080p monitors lying around so since I ordered two rifts, I'd like to throw a party for people to play/spectate.
|
# ? May 15, 2014 03:47 |
|
God drat, 290's are cheap as gently caress. http://www.ebay.com/itm/Sapphire-AM...=item35d6814f4c Someone sold 63 three-month-old 290's for $199 each.
|
# ? May 15, 2014 04:56 |
Did someone just buy everyone's 290's from mining? For even less than that? Unbelievable price lol
|
|
# ? May 15, 2014 04:59 |
Ignoarints posted:Did someone just buy everyone's 290's from mining ? For even less than that? Unbelievable price lol oh... never mind then about the source http://www.ebay.com/itm/Litecoin-SC...=item35d4d8e492
|
|
# ? May 15, 2014 05:00 |
|
Don Lapre posted:God drat, 290's are cheap as gently caress. Three 290s is bonkers gpu power for $600.
|
# ? May 15, 2014 05:06 |
|
Son of a bitch I just bought another 660ti to SLI. I wish I had seen those cheap 290s before.
|
# ? May 15, 2014 07:45 |
|
Strategy posted:Son of a bitch I just bought another 660ti to SLI. I wish I had seen those cheap 290s before. Everyone does
|
# ? May 15, 2014 10:11 |
|
Don Lapre posted:God drat, 290's are cheap as gently caress. *tells myself my 760 for $199 is good enough for D3 and SC2 @ 1440p* Josh Lyman fucked around with this message at 14:43 on May 15, 2014 |
# ? May 15, 2014 14:40 |
Strategy posted:Son of a bitch I just bought another 660ti to SLI. I wish I had seen those cheap 290s before. Hey, 660ti SLI rules man. Don't fret. but yeah
|
|
# ? May 15, 2014 15:44 |
|
Guy at work just dumped his 3x290 mining rig as well. Seems like something happened where the potential profit wasn't worth the further depreciation of that spec hardware.
|
# ? May 15, 2014 16:21 |
|
Shaocaholica posted:Guy at work just dumped his 3x290 mining rig as well. Seems like something happened where the potential profit wasn't worth the further depreciation of that spec hardware.

The difficulty on Doge went up a month or two ago and some idiots were still building/holding onto rigs instead of buying ASICs. Now they're finally dumping them.
|
# ? May 15, 2014 16:26 |
|
Is there even a market for these flavor-of-the-month coins? I mean, there are at least a few online stores that'll let you pay in buttcoins... but who the hell takes dodgecoins for anything? I'm glad to see the mining craze has largely stopped and prices are coming back down, but the $300-$350 that most of the used 290Xs are going for is still too much, if you ask me. I mean, they're almost all the original reference designs, which means they're only ~9% faster than a 290, and they throttle like hell due to their terrible heatsinks. You can get brand new 290s for ~$350 these days, with excellent third-party coolers and decent overclocking potential. So really, by the time you replace the reference 290X heatsink so it can actually outperform the current 290s, you're already at or above the same price, and you're dealing with a used card vs. a new one.
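Putting rough numbers on that comparison (the card prices and the ~9% figure come from the post above; the aftermarket cooler price is an assumed figure, since kits vary):

```python
# Back-of-envelope: used reference 290X + aftermarket cooler vs. a new 290.
# Card prices and the ~9% performance delta are taken from the post above;
# the cooler cost is an assumption for the sake of the comparison.
used_290x = 325   # midpoint of the $300-$350 range
cooler    = 70    # assumed aftermarket cooler kit
new_290   = 350

total_290x = used_290x + cooler
print(f"used 290X + cooler: ${total_290x} for ~1.09x the performance")
print(f"new 290 (warranty): ${new_290} for 1.00x")
```

Under those assumptions the used 290X route already costs more than a new 290, before accounting for warranty or mining wear.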
|
# ? May 15, 2014 23:09 |
|
DrDork posted:Is there even a market for these flavor-of-the-month coins? I mean, there are at least a few online stores that'll let you pay in buttcoins...but who the hell takes dodgecoins for anything?

It's all speculators. Apparently lots of them convert their Doge immediately into Bitcoin, too.
|
# ? May 15, 2014 23:21 |
|
My buddy has an old 45nm Intel i7, a decent PSU, and 8GB of RAM. He's also quitting drinking and I want to help him distract himself with a new GPU so newish games can be played decently. If I want to keep the cost around $100 or less, what's the best buy here? I don't even care if it's new or not...
|
# ? May 15, 2014 23:40 |
ShaneB posted:My buddy has an old 45nm intel i7, a decent PSU, and 8GB RAM. He's also quitting drinking and I want to help him distract himself with a new GPU so newish games can be played decently. If I want to keep the cost around $100 or less what's the best buy here? I don't even care if its new or not...

First line of the OP says to use your thread for parts-picking advice. Does he currently have a video card?
|
|
# ? May 15, 2014 23:50 |
|
|
|
fletcher posted:First line of the OP says to use your thread for parts picking advice

Yeah, but that ain't old sub-$100 stuff at all. We basically start recommendations at the 750 Ti.
|
# ? May 15, 2014 23:51 |