|
mango sentinel posted:I'm considering going two hundred dollars over budget just because I can actually walk into a store and buy a 1070. Enos Cabell posted:It's never stupid to buy an x70 series card. Doooooo iiiiiiiit But if your monitor is only 1080p, surely saving $200 with the 1060 is the more reasonable option... right? I personally think so.
|
# ? Jul 22, 2016 16:00 |
|
teagone posted:But if your monitor is only 1080p, surely saving $200 with the 1060 is the more reasonable option... right? I personally think so. I'm playing them at 2560x1080 and enjoying the rock-solid never-below-60 FPS
|
# ? Jul 22, 2016 16:01 |
|
I'm predicting a Titan X Black 6 months later with HBM2.
|
# ? Jul 22, 2016 16:03 |
|
mango sentinel posted:I'm considering going two hundred dollars over budget just because I can actually walk into a store and buy a 1070. Doesn't sound stupid to me, but maybe that's because I started with an RX480 budget and ended up strongly considering a 1070 myself when I saw how bad availability is for the 480 and the 1060. Ended up getting an FE 1060 since I figured I can just flip it without too much loss if I dislike it, but I'm still not certain I made the best decision.
|
# ? Jul 22, 2016 16:07 |
|
teagone posted:But if your monitor is only 1080p, surely saving $200 with the 1060 is the more reasonable option... right? I personally think so. Reasonable options aren't as fun. Like when you go out for dinner, are you going to order the grilled chicken breast with asparagus, or are you going to get the lobster mac and cheese with a filet mignon?
|
# ? Jul 22, 2016 16:08 |
|
teagone posted:But if your monitor is only 1080p, surely saving $200 with the 1060 is the more reasonable option... right? I personally think so. Sure, it's definitely more reasonable. I always like to err on the side of maxing out the bells and whistles though, because I can.
|
# ? Jul 22, 2016 16:09 |
|
afkmacro posted:Reasonable options aren't as fun. Like when you go out for dinner, are you going to order the grilled chicken breast with asparagus, or are you going to get the lobster mac and cheese with a filet mignon? The correct option is to do neither, pocket the savings, and buy more hardware.
|
# ? Jul 22, 2016 16:13 |
|
Khagan posted:I'm predicting a Titan X Black 6 months later with HBM2. You can't just switch between HBM2 and GDDR5X on the same chip. It's a different memory controller. If there is an HBM2-based Titan down the road, it will be a new chip.
|
# ? Jul 22, 2016 16:26 |
|
wicka posted:i don't find my gigabyte 1070 to be very noisy, at least not for me, but it does seem like it's making some grinding noises (almost as if a wire is hitting the fan) when the RPM is dropping after i exit a game Yes, it's actually one of the fan bearings on mine (my far left fan, closest to the I/O). I read that this was an issue on 1080s as well and they patched the bios to fix it. I would assume the 1070 will be getting the same bios. It's something to do with the fan being pulsed with power inappropriately when coming to a stop - at least that's one unfounded theory I've read. To my ear, it sounds like a hosed up fan bearing like every other one I've ever heard. However, unlike bad bearings, this one is definitely intermittent and based on some condition or another, so there is hope the bios could fix it. I put my finger on the fans and it was definitely just one of them. I usually don't even point out bad fans (they affect every card and brand, broadly speaking) but this one does seem to be unique and widespread. I was pretty pissed off when it started happening but I don't even notice it now tbh... it's from the nvidia complacency fume generator built into every Pascal, I'm pretty sure edit: Speak of the devil http://www.gigabyte.com/products/product-page.aspx?pid=5916#bios for those who got the G1's, here's a new bios for ya penus penus penus fucked around with this message at 16:42 on Jul 22, 2016 |
# ? Jul 22, 2016 16:30 |
|
teagone posted:But if your monitor is only 1080p, surely saving $200 with the 1060 is the more reasonable option... right? I personally think so. First, I've never owned anything but a lower-end budget card, so it's exciting on that level. Second, I currently have a 1080p monitor, but a 1060 locks me into 1080p for future monitor purchases whereas a 1070 frees me up. OTOH I currently feel 1080p's are enough for anyone. My current plan: I'm sitting on an EVGA 1060 order from Amazon, but if it doesn't ship out by the first of August I'm grabbing a 1070.
|
# ? Jul 22, 2016 16:35 |
|
teagone posted:But if your monitor is only 1080p, surely saving $200 with the 1060 is the more reasonable option... right? I personally think so. But the futureproofing!
|
# ? Jul 22, 2016 16:38 |
|
Potentially 24% faster than the GTX 1080; 60% faster than the old Titan X. Only $600 more for (maybe) 24% faster
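A quick back-of-envelope check on those numbers (the prices are assumptions pulled from the post: ~$600 for a GTX 1080 and "$600 more" for the new Titan X):

```python
# Hypothetical prices per the post: GTX 1080 ~$600, new Titan X ~$600 more.
titan_price = 1200.0
gtx1080_price = 600.0
speedup = 1.24  # "potentially 24% faster"

price_ratio = titan_price / gtx1080_price   # 2.0x the money
perf_per_dollar = speedup / price_ratio     # relative value vs the 1080

print(f"{price_ratio:.1f}x the price for {speedup:.2f}x the performance")
print(f"perf per dollar vs GTX 1080: {perf_per_dollar:.0%}")
```

By that rough measure the Titan delivers well under two-thirds the 1080's performance per dollar, which is the usual halo-card tax.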
|
# ? Jul 22, 2016 16:52 |
|
I just saw that the new Titan is actually releasing August 2nd? Wtf? I thought that thing was coming around Vega times. Maybe that gives them time to make sales before competing with the 1080ti when Vega comes around, but this release schedule is far more aggressive than I would have guessed. Have there been any changes or updates from AMD about a 1070 competitor? Is that expected to be a 490?
|
# ? Jul 22, 2016 16:53 |
|
Am I right in thinking that even a GTX 1080 isn't going to push 144fps on a 1440p monitor in new games at ultra settings? That's what most of the benchmarks seem to suggest, and it's tempering my desire to buy a new monitor. On the other hand, I suppose that 80-100fps is still 33-66% more frames than a 60hz monitor, with the benefit of g-sync to boot, which seems like a big improvement. Is that a reasonable assessment?
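For what it's worth, the 33-66% figure checks out as simple frame-rate arithmetic (a quick sketch using the refresh rates from the post):

```python
base_hz = 60  # the old 60 Hz monitor's cap

# Frames gained relative to a 60 Hz cap, across the fps range quoted above.
for fps in (80, 100, 144):
    gain = fps / base_hz - 1.0
    print(f"{fps} fps is {gain * 100:.0f}% more frames than {base_hz} Hz")
```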
ItBurns fucked around with this message at 17:07 on Jul 22, 2016 |
# ? Jul 22, 2016 17:02 |
|
THE DOG HOUSE posted:I just saw that the new Titan is actually releasing August 2nd ? Wtf? I thought that thing was coming around Vega times. Maybe that gives them time to make sales before competing with the 1080ti when Vega comes around but this release schedule is far more aggressive than I would have guessed. There's lots of talk about AMD sticking 2 480s onto one board, putting 250W of power delivery on it, and seeing what happens. It's probably going to be a worse buy than the GTX 1070 in every way, but it'll slot up against it.
|
# ? Jul 22, 2016 17:08 |
|
THE DOG HOUSE posted:Yes, its actually one of the fan bearings on mine (my fart left fan closest to the I/O). I read that this was an issue on 1080's as well and they patched the bios to fix it. I would assume the 1070 will be getting the same bios. Are there any sort of release notes to go with this bios?
|
# ? Jul 22, 2016 17:08 |
|
ItBurns posted:Am I right in thinking that even a GTX 1080 isn't going to push 144fps on a 1440p monitor in new games in ultra settings? That's what most of the benchmarks seems to suggest and it's tempering my desire to buy a new monitor. On the other hand, I suppose that 80-100fps is still 33-66% more frames than a 60hz monitor with the benefit of g-sync to boot, which seems like a big improvement. Is that a reasonable assessment? I made the jump to a 1080 and a 1440p 144Hz gsync at the same time. Witcher 3 maxed (no hairworks) averaged around 100fps, Doom maxed was 120s with lots of stuff going on. It was definitely a very noticeable difference switching between the gsync monitor and my 1440p 60Hz monitor.
|
# ? Jul 22, 2016 17:14 |
|
Barry posted:Are there any sort of release notes to go with this bios? I didn't see any in the file package itself and nothing in a quick search online - but I've never seen such a thing with any video card bios update iirc. I'm sure they exist, but I'm not particularly shocked there isn't a comprehensive one here outside of "modified fan duty" and so on. quote:Release for SAMSUNG Memory Vague, but I'm going to get it when I get home Twerk from Home posted:There's lots of talk about AMD sticking 2 480s onto one board, putting 250W of power delivery on it, and seeing what happens. It's probably going to be a worse buy than the GTX 1070 in every way, but it'll slot up against it. barf
|
# ? Jul 22, 2016 17:17 |
|
I mean, you say that, but this is basically how it looks like AMD is going to build chips from here on out. (Both CPU and GPU.) Not by punching them out from dies on a per-GPU basis locking off the defective bits and binning, but punching out much smaller core clusters to increase overall yields and parts-per-die, binning them, and then using interposers to add as many core clusters (slightly defective or otherwise) as you need to get your x70, x80, x90, and Fury tiers while presenting a single logical interface to the rest of the system. It's ingenious as hell. I JUST HAVE ZERO FAITH THAT AMD CAN PULL IT OFF.
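The yield argument above can be illustrated with the standard Poisson defect model (all numbers here are illustrative assumptions, not real AMD die sizes or fab data):

```python
import math

def die_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson defect model: probability a die comes out with zero defects."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.2  # assumed defect density in defects/cm^2 (illustrative)

big = die_yield(400, D0)    # one hypothetical monolithic 400 mm^2 GPU die
small = die_yield(100, D0)  # one hypothetical 100 mm^2 core-cluster chiplet

# The chance that 4 independently punched chiplets are ALL perfect equals the
# monolithic yield -- the win is that a bad chiplet only scraps 100 mm^2 of
# wafer, and partially defective chiplets can still be binned into lower tiers.
print(f"monolithic 400 mm^2 yield: {big:.1%}")
print(f"single 100 mm^2 chiplet yield: {small:.1%}")
print(f"all 4 chiplets perfect: {small ** 4:.1%}")
```

The design choice the post describes is exactly this trade: smaller dies don't raise the all-perfect probability, but they shrink the cost of each defect and widen the binning options.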
|
# ? Jul 22, 2016 17:25 |
|
I'm not having any real problems by any means and flashing BIOSes isn't high on my list of things to do just for the hell of it but if there was some actual worthwhile reason it would be nice to know.
|
# ? Jul 22, 2016 17:26 |
|
SwissArmyDruid posted:I mean, you say that, but this is basically how it looks like AMD is going to build chips from here on out. (Both CPU and GPU.) Not by punching them out from dies on a per-GPU basis locking off the defective bits and binning, but punching out much smaller core clusters to increase overall yields and parts-per-die, binning them, and then using interposers to add as many core clusters (slightly defective or otherwise) as you need to get your x70, x80, x90, and Fury tiers while presenting a single logical interface to the rest of the system. Oh ok, that would be cool. I automatically thought it'd be crossfire. I hope they can pull it off though, I'd like to see that in general. Barry posted:I'm not having any real problems by any means and flashing BIOSes isn't high on my list of things to do just for the hell of it but if there was some actual worthwhile reason it would be nice to know. If you don't have weird fan grinding I wouldn't care much either
|
# ? Jul 22, 2016 17:32 |
|
THE DOG HOUSE posted:Oh ok, that would be cool. I automatically thought it'd be crossfire. I hope they can though, id like to see that in general. Oh, the dual-480s is probably going to be a crossfire card. We probably won't see what I described until Vega at the earliest, probably Navi. Expect to see a lot more dual-GPU cards until then, though!
|
# ? Jul 22, 2016 17:33 |
|
Phlegmish posted:But the futureproofing! The best futureproofing is money left over.
|
# ? Jul 22, 2016 17:34 |
|
Meanwhile, it looks like Sapphire is finally putting out ACTUAL INFORMATION on their card. Finally. http://www.pcworld.com/article/3098825/components-graphics/sapphire-nitro-rx-480-review-polaris-rethought-and-refined.html http://www.eteknix.com/sapphire-nitro-rx-480-oc-8gb-graphics-card-review/ Nothing earth-shattering, of course, but solid. Also, it apparently releases "next week" because, you know, this is AMD marketing and specifics aren't important I guess? I dunno. It probably won't ACTUALLY be easily available for weeks or months. I hope I'm wrong, because I want one, but god drat the scalping is real.
|
# ? Jul 22, 2016 17:51 |
|
xthetenth posted:The best futureproofing is money left over. Yeah, I hear you. I paid like $260 for a 760 three years ago and it's worked pretty well for me. I just can't bring myself to blow over $300 on a video card. I've been vacillating between waiting for a 1060 to come into stock, and trying to haggle a 980 from some dumbass on Craigslist.
|
# ? Jul 22, 2016 17:52 |
|
I also bought a 760 three years ago and now I have to upgrade.
|
# ? Jul 22, 2016 17:53 |
|
What's the hottest you've ever had a card run at? I tortured a 4870 with gaming temps of just over 100°C for months before getting a 6870. The blower was rammed full of dust, and I couldn't be arsed cleaning it out. The fan on it sounded like a jet engine.
|
# ? Jul 22, 2016 17:53 |
|
My 6870s got pretty hot. I actually had one of the blower fans jam when a fly got stuck in it. Also had the AIB one lose a fan too. My 6870s paid for my computer though, so that's something.
|
# ? Jul 22, 2016 18:03 |
|
ItBurns posted:Am I right in thinking that even a GTX 1080 isn't going to push 144fps on a 1440p monitor in new games in ultra settings? That's what most of the benchmarks seems to suggest and it's tempering my desire to buy a new monitor. On the other hand, I suppose that 80-100fps is still 33-66% more frames than a 60hz monitor with the benefit of g-sync to boot, which seems like a big improvement. Is that a reasonable assessment? As someone with an Acer XB270hu, you really, really don't need your GPU to always be cranking 144 FPS to experience a hugely dramatic increase in perceived smoothness. I still have a 980 @1530mhz, so I don't even stick at 60 all the time in stuff like Witcher 3, although it's certainly plenty for something more simple like WoW, Overwatch, world of warships, etc. Whether you're getting sub 60 or over 140 it's still going to look and feel fantastic. I feel like in this thread especially the need to keep 120/144 FPS with a high refresh monitor is drastically overstated, doubly so if you're using *Sync.
|
# ? Jul 22, 2016 18:48 |
|
Gwaihir posted:As someone with an Acer XB270hu, you really, really don't need your GPU to always be cranking 144 FPS to experience a hugely dramatic increase in perceived smoothness. I still have a 980 @1530mhz, so I don't even stick at 60 all the time in stuff like Witcher 3, although it's certainly plenty for something more simple like WoW, Overwatch, world of warships, etc. Doesn't 144hz help with *-sync because it gives that low framerate compatibility option more room to operate in? I got a new 144hz Freesync monitor last night, and in ~5 minutes of testing I noticed Witcher 3 looks and feels a lot more smooth, even though I'm still only pulling around 40 fps. It's that AOC one; the only thing I'm worried about is the color looks a little weird next to my old ASUS monitor, but Amazon reviews said the colors start looking more normal after about 10 hours on.
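On the headroom point: AMD's low framerate compensation reportedly only engages when the panel's max refresh is a large enough multiple of its min, which is why a 144 Hz ceiling gives the driver more room to work with (a sketch; the ~2.5x threshold is an assumption worth checking against your monitor's actual FreeSync range):

```python
LFC_RATIO = 2.5  # assumed minimum max:min refresh ratio for LFC to engage

def lfc_capable(min_hz: float, max_hz: float) -> bool:
    """Rough check of whether a FreeSync range leaves room for frame doubling."""
    return max_hz / min_hz >= LFC_RATIO

print(lfc_capable(48, 75))    # a common 48-75 Hz budget panel: range too narrow
print(lfc_capable(48, 144))   # a 48-144 Hz panel: enough headroom
```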
|
# ? Jul 22, 2016 18:55 |
|
axeil posted:Doesn't 144hz help with *-sync because it gives that low framerate compatibility option more room to operate in? Lol yeah, cause you get used to it. But yes, sync helps a lot.
|
# ? Jul 22, 2016 18:59 |
|
That Titan X (why the hell not just do Titan Y? They already used Z) does look mean, but the lack of HBM and only 24% faster makes me question where the 1080Ti will fit and whether I'll want it.... Though that Titan right now would be a downright brilliant piece of kit for games like Project Cars, Assetto Corsa, and Elite in VR. Maxing everything out at 90FPS in VR would be glorious and hell, might even leave some headroom to scale up to 2X, which would be pure eye candy goodness in Elite. Hmm. But man, that is a price. Watch the 1080Ti come out with 10G ram or something, and fewer cores (if they don't end up pulling a 780Ti and releasing it with a full core instead). Probably because this still doesn't do what the OG Titan did for DP computing, they will probably slot the Ti just below the Titan X like last gen, but with an OC it should be able to hit above it, OC vs stock. In other news, why has nobody said a thing about the 1070M leak? 2048 cores in a mobile chip that's just downclocked a hair. 980Ti performance in a laptop sounds delightful.
|
# ? Jul 22, 2016 19:01 |
|
They figured out a way to sell those unsold Maxwell Titan Xs
|
# ? Jul 22, 2016 19:03 |
|
axeil posted:Doesn't 144hz help with *-sync because it gives that low framerate compatibility option more room to operate in? Yea, that's what I mean, really. You get a lot of benefit from 144hz even if your GPU is still only cranking 40. I don't think the high refresh really helps so much with the very low framerate cases though, unless you're pairing really wildly mis-matched GPUs with a gsync screen. Like a 950 or something similarly anemic.
|
# ? Jul 22, 2016 19:04 |
|
I never understood why they don't just use the architecture prefix for the Titan series. So the new one would be Titan P and we should have had Titan M and Titan K.
|
# ? Jul 22, 2016 19:09 |
|
Krailor posted:I never understood why they don't just use the architecture prefix for the Titan series. So the new one would be Titan P and we should have had Titan M and Titan K. That would make too much sense. Also, while it would make sense to us, it would be hard to tell which card was the new (better) one after the OG Titan.
|
# ? Jul 22, 2016 19:14 |
|
EdEddnEddy posted:Also while it would make sens to us, it would be hard to tell what card was the new(Better) card out after the OG Titan. It's not like having two different cards called Titan X makes that any easier. Besides, the better one would obviously just be whichever one costs more...
|
# ? Jul 22, 2016 19:20 |
|
Krailor posted:It's not like having two different cards called Titan X makes that any easier. You really think retailers aren't going to keep the last-gen Titan X high priced until they sell out? I mean, they both even have 12G ram and a black cooler. So some average Joe with more $ than shopping sense is easy prey for such a purchase. It is a short-term problem, but I fully believe they did that because X sounds cool and they don't care as long as they sell.
|
# ? Jul 22, 2016 19:25 |
|
Next gen they'll put out the Titan One
|
# ? Jul 22, 2016 19:27 |
|
The whole Titan shtick is to be a dumb marketing gimmick; it doesn't and shouldn't make much sense for Nvidia's sake. Also, not a whole lot of room for a 1080ti now: if it's only 24% faster than a 1080, cutting any cores quickly loses perf/price to the 1080, which could potentially make up the difference with an OC. I sense a Kepler-like transition in the future where the Titan XP gets dragged along to be the new 1180.
|
# ? Jul 22, 2016 19:38 |