|
beejay posted:Looks like 290X uses a 512-bit bus even for the 4GB version. Can someone weigh in on how 256-bit bus would handle 8GB? From what I gathered before, memory bus width was a pretty good indicator of what a card could handle this generation, but it's more of an indirect association; there's a lot more to it than that, and it's hard to really compare one generation to another. I wish I knew more. I tried, but there was quite a bit of misinformation and contradicting ideas about this.
|
# ? Jul 18, 2014 17:26 |
|
|
MaxxBot posted:I think there would need to be better communication bandwidth/latency for this to work properly, it's a fundamental hardware problem rather than it just being an issue of poorly done software. I believe that's why they switched from doing SFR (which is what you described) to AFR because the communication overhead for SFR is too high. Oh, that's cool, I just imagined the SFR thing in my head, I was expecting someone to tell me that was idiotic. Nice to know it's a thing and there's a term.
|
# ? Jul 18, 2014 17:32 |
|
beejay posted:Looks like 290X uses a 512-bit bus even for the 4GB version. Can someone weigh in on how 256-bit bus would handle 8GB? I was actually going to post that myself. I've seen a lot of doom and gloom posting about the alleged 256-bit bus, but I don't understand such details well enough to know if it's hyperbole or an actual performance knee-capper.
|
# ? Jul 18, 2014 17:32 |
|
I'd be pretty surprised if it is actually going to use a 256-bit bus because you'd have to increase the frequency really, really high to still yield an increase in memory bandwidth over the 780ti. I'm not super familiar with how memory bandwidth affects performance but I know that there's no way they would release the 880 with less memory bandwidth than their previous generation card.
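For what it's worth, the arithmetic is easy to sanity-check yourself. This is just a back-of-the-envelope sketch using the commonly cited 780 Ti figures (384-bit bus, 7 Gbps effective GDDR5 data rate), not anything confirmed about the 880:

```python
# Peak memory bandwidth in GB/s: bus width (bits) / 8 bits-per-byte,
# times the effective per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

gtx_780ti = bandwidth_gbs(384, 7.0)
print(gtx_780ti)             # 336.0 GB/s peak

# Effective data rate a 256-bit card would need just to MATCH that:
print(gtx_780ti / (256 / 8)) # 10.5 Gbps, well past shipping GDDR5
```

Which is the point above: a 256-bit card would need absurdly fast memory clocks just to tread water against the 780 Ti.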
|
# ? Jul 18, 2014 17:34 |
|
They already have, with the 5 series: the 570, for example, has a 320-bit bus. It's just not the direct correlation it seems to be with this generation (the 6 and 7 series, I mean). Bus width has been a practically accurate shorthand for memory performance, just not a literal one. Literally, all a comparison of bus width between two generations may tell us is how many memory chips there will be.
|
# ? Jul 18, 2014 18:16 |
|
Dobermaniac posted:I'm looking at picking up a Lenovo Y50 with a Maxwell 860M video card. They have two options 2GB and 4GB. The only video games I play are WoW and a little bit of CS:Go. I want to be able to max out the settings @ 1920x1080. Would I notice much if any difference between the 2 and 4 gig version? It's going to be paired with a i7 4710, 16 gigs of ram, and Evo 840 512gb ssd. I plan on keeping this laptop for 4+ years. It'll be replacing a laptop with a Radeon 5440 1gb. This is a fun one. Notebookcheck only mentions the 4GB version of the Maxwell 860M, so there's a chance that the 2GB part is actually a Kepler GTX 760M underclocked to 800MHz. Good job Nvidia! Digging through some forums, it seems that the 4GB version should definitely be Maxwell, so go with that if the price difference isn't very high.
|
# ? Jul 18, 2014 18:18 |
|
Over/under for the new Nvidia card. $800.
|
# ? Jul 18, 2014 18:26 |
|
beejay posted:Looks like 290X uses a 512-bit bus even for the 4GB version. Can someone weigh in on how 256-bit bus would handle 8GB? Bandwidth (bus width * memory frequency) and amount of RAM are two entirely separate issues. Things like SSAA demand both more memory and more bandwidth to maintain acceptable performance. Simply storing more textures in VRAM (like the rumored Watch Dogs habit of keeping literally two or three copies of the same texture in VRAM) is just a matter of amount. So long as you have enough VRAM, all is well, regardless of bandwidth. The only other consideration about amount of VRAM vs. bus width is simply whether or not RAM companies make chips of high enough density to get 8 gigs on a card with a 256-bit bus, since you only get a set number of chips per width of the bus, IIRC.
|
# ? Jul 18, 2014 18:49 |
|
Gwaihir posted:Bandwidth (Bus width * memory frequency) and amount of RAM are two entirely separate issues. Things like SSAA demand both more memory and more bandwidth to maintain acceptable performance. Simply storing more textures in vram (Like the rumored watch dogs keeping literally two or three copies of the same texture in VRAM) is just a matter of amount. So long as you have enough vram, all is well, regardless of bandwidth. That's 8 chips, and as far as I know 1GB GDDR5 chips are (were) really expensive. But a lot of time has passed now, so we shall see.
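That "8 chips" falls straight out of the bus width, since each GDDR5 chip presents a 32-bit interface (clamshell mode doubles the chip count on the same bus). A quick sketch; the per-chip capacities below are illustrative, not a real parts list:

```python
# Each GDDR5 chip connects over a 32-bit interface, so bus width
# fixes the chip count (ignoring clamshell, which doubles it).
CHIP_IO_BITS = 32

def chips_for_bus(bus_width_bits):
    return bus_width_bits // CHIP_IO_BITS

def capacity_gb(bus_width_bits, gb_per_chip):
    return chips_for_bus(bus_width_bits) * gb_per_chip

print(chips_for_bus(256))     # 8 chips on a 256-bit bus
print(capacity_gb(256, 0.5))  # 4.0 GB with 4 Gb (0.5 GB) chips
print(capacity_gb(256, 1.0))  # 8.0 GB needs 8 Gb (1 GB) chips
```

So 8GB on a 256-bit bus means either 1GB chips or a 16-chip clamshell layout, which is why chip price/density is the sticking point.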
|
# ? Jul 18, 2014 21:23 |
|
veedubfreak posted:Over/under for the new Nvidia card. $800. They'd be out of their minds, but not as much as whoever pays $800 for it.
|
# ? Jul 18, 2014 22:00 |
|
Rime posted:I'm just hoping the high-end cards dropping in November have 6-8GB of vram on them. 4GB is really not going to cut it moving forward. Me too. Not because I have a legitimate need for that much in gaming, but you can burn through 6GB pretty fast if you're using the cards for compute stuff. I'd love to be able to get 16GB on a future-gen K20 and 32GB on a K40.
|
# ? Jul 18, 2014 22:12 |
|
GDDR6 is where it will be at
|
# ? Jul 19, 2014 08:30 |
|
Might as well go to GDDR7 in keeping with the odd number theme.
|
# ? Jul 19, 2014 11:32 |
|
Khagan posted:Might as well go to GDDR7 in keeping with the odd number theme. We had GDDR4, it just didn't stick around because GDDR5 came out pretty quickly afterwards. AMD used GDDR4 on at least two dozen of their cards. The latest one was the HD 4850.
|
# ? Jul 19, 2014 13:43 |
|
Pimpmust posted:GDDR6 is where it will be at Ram is weird. https://www.youtube.com/watch?v=pbgvzVgfoSc
|
# ? Jul 19, 2014 17:14 |
|
Arzachel posted:This is a fun one. Notebookcheck only mentions the 4GB version of the Maxwell 860m, so there's a chance that the 2GB part is actually a Kepler GTX760 underclocked to 800mhz. Good job Nvidia! I bought a Gigabyte P34G v2 recently (a powerful portable laptop that has a 4GB GTX 860M Maxwell): VRAM amount does not indicate whether the chip is a Kepler or Maxwell part; that's false information circulating on some forums. With the Y50, it seems both the 2GB and 4GB versions are Maxwell parts.
|
# ? Jul 19, 2014 17:17 |
|
wargames posted:Ram is weird. I assume GDDR6 will be coming sometime around when DDR4 gets going next year(ish), or at least that's what the rumours about Maxwell say.
|
# ? Jul 19, 2014 19:21 |
|
I recently had to RMA a video card (a 660 Ti) to EVGA, which they replaced with a 760. Overall I am very pleased with EVGA's customer service. Are they atypical in the graphics card industry?
|
# ? Jul 19, 2014 21:37 |
|
Massasoit posted:I recently had to RMA a video card to EVGA (660ti) that they replaced with a 760. Overall I am very pleased with EVGA's customer service - are they atypical in the graphics card industry? Yes. Take companies like PNY with a "lifetime warranty": it's only the lifetime they stock the product in their warehouse.
|
# ? Jul 19, 2014 21:41 |
|
1gnoirents posted:That's 8 chips, as far as I know 1gb GDDR5 chips are (were) really expensive. But a lot of time has passed now so we shall see. I wish I knew where to find more up-to-date info on this stuff. It's kind of interesting.
|
# ? Jul 19, 2014 23:20 |
|
Massasoit posted:I recently had to RMA a video card to EVGA (660ti) that they replaced with a 760. Overall I am very pleased with EVGA's customer service - are they atypical in the graphics card industry? Unfortunately so. There used to be more companies with similar customer service policies, and they're not all gone (any company that accepts warranty work on a card that's been disassembled is a-okay in my book; that is ballsy and cool for the customer, and EVGA is not the only one who does it). EVGA does stand out in that regard in the industry, though. Especially within a generation like this, where you've got a number of performance analogs after the refresh, they can generally be counted on to make a warranty replacement slightly in your favor when the card that needs replacing is no longer available new.
|
# ? Jul 20, 2014 04:50 |
|
Agreed posted:Unfortunately so. There used to be more companies with similar customer service policies, and they're not all gone (any company that accepts warranty work on a card that's been disassembled is a-okay in my book, that is ballsy and cool for the customer and EVGA is not the only one who does it). EVGA does stand out in that regard in the industry, though. Especially within a generation like this where you've got a number of performance analogs after the refresh, they can generally be counted on to make a warranty replacement slightly in your favor when the card that needs replaced is no longer available new. This, and isn't EVGA one of the few that will still honor warranty even if you damage your card via overclocking? Obviously as long as the card hasn't been physically modified/damaged by misuse or whatever (e.g. soldering or stuff like that).
|
# ? Jul 20, 2014 07:29 |
|
Off the top of my head, the only other companies that are totally cool with it are MSI (if you overvolt a Lightning card to death they'll still replace it) and maybe Galaxy with their HOF stuff.
|
# ? Jul 20, 2014 07:55 |
|
I'm pretty astonished that none of these companies use something like an e-fuse to detect when you alter the BIOS. You can overvolt any card you want, and if you set it back and keep your trap shut, no one would be the wiser.
|
# ? Jul 20, 2014 16:52 |
|
Zero VGS posted:I'm pretty astonished that none of these companies use something like an e-fuse to detect when you alter the BIOS, you can overvolt any card you want and if you set it back and keep your trap shut no one would be the wiser. Hiring and training employees to do this would likely cost more than just sending out new cards, especially if they can send out refurbs as replacements.
|
# ? Jul 20, 2014 16:58 |
|
I don't mess around with overclocking anything: stock runs fine for my needs, and knowing myself I'd probably burn down the house. I'm really glad EVGA replaced the card, though, because the way my funds currently are I would not have been able to replace it myself. Pretty sure EVGA just made a customer for life.
|
# ? Jul 21, 2014 06:24 |
|
Ever since I installed my new GPU, an R9 290X, Skype crashes when I make a call. I have updated my video drivers and uninstalled/reinstalled/cleared the cache multiple times with Skype. Any ideas? It only crashes as soon as I click call; sometimes it will connect for 1 sec. I can still type to them in the program before it crashes.
|
# ? Jul 21, 2014 06:29 |
|
goodness posted:Ever since I installed my new gpu, r9 290x, skype crashes when I make a call. I have updated my video drivers and uninstalled/reinstalled/cleared cache multiple times with skype. https://community.skype.com/t5/Windows-desktop-client/Ati-Catalyst-Control-Center-11-Conflict-with-Skype-5-5-freezing/m-p/463693#M41015 Apparently Skype dies when AA is forced on the call screen.
|
# ? Jul 21, 2014 07:03 |
|
G10/H55 combo ordered. Hope I don't gently caress things up.
|
# ? Jul 23, 2014 02:47 |
|
Don't overtighten like I did. It was really easy to do, but it wasn't a disaster either; it just bent the weakest side of the mounting plate.
|
# ? Jul 23, 2014 02:51 |
|
Don Lapre posted:G10/h55 combo ordered. Hope I dont gently caress things up. Just installed a G10/X41 on my 290X. No longer louder than a jet. 1gnoirents posted:Don't over tighten like I did. It was really easy to do, but it wasn't a disaster either it just bent the weakest side of the mounting plate
|
# ? Jul 23, 2014 05:17 |
|
The change in resistance between tight and bending was very slight. If I had been looking at it from the side while screwing it down, I'd have noticed before I did any damage. It was different from the other 3 corners, though, because those had support; it was just the one free corner hanging on its own.
|
# ? Jul 23, 2014 16:57 |
|
I'm just gonna snug mine up. It doesn't need massive pressure. Edit: Wow, I can't believe the temp drop with an H55. Went from ~82C to 57C in BF4 with a higher overclock. Don Lapre fucked around with this message at 03:51 on Jul 25, 2014 |
# ? Jul 23, 2014 17:03 |
|
Has anyone here tried flashing a GTX 680 to a GTX 770, primarily to get Boost 2.0? I have a Gigabyte GTX 680 2GB OC, which actually has stock clocks above the GTX 770 (except on VRAM). Is it as simple as taking the BIOS for the Gigabyte GTX 770 2GB, editing the VRAM speed down to 6GHz in Kepler BIOS Tweaker to match my card, and flashing it? Overall it seems like the GTX 770 BIOS has lower minimum voltages and higher maximum voltages, though it also seems to boost less aggressively. I was thinking of just bumping up the minimums and lowering the maximums to match my card along with the power targets. Do you think I need to try to make the boost or performance tiers match up? Should I even bother with tweaking vs. just flashing the 770 BIOS with the VRAM underclocked?
|
# ? Jul 26, 2014 02:56 |
|
I have not tried it, so I'm mostly regurgitating from memory and Google here, but most of what I'm finding is just cargo-culting the frequency/voltage tables into a stock 680 BIOS (basically, just using the 770 as a template for an overvolt/OC job), not actually using a 770 BIOS. This has been called out for what it is. That said, I have seen people seem to say they used a real 770 BIOS and enabled Boost 2.0 successfully, but that they saw, at most, a 1-2% difference over their previous overclocked results. That doesn't strike me as a good ROI for a risky procedure. Factory Factory fucked around with this message at 11:35 on Jul 26, 2014 |
# ? Jul 26, 2014 11:33 |
|
The latest Steam In-Home Streaming beta now supports Nvidia Kepler hardware encoding with the latest beta driver. This is nice, as I can once again postpone a CPU upgrade, and hardware-intensive games such as Metro LL can now be played on my netbook without massive lag. There is no ETA on when Radeons get the same ability. One could have used Limelight for this, but in my experience it was slow, laggy, and had poor image quality.
|
# ? Jul 26, 2014 15:33 |
|
Is it a good idea to upgrade now, or wait?
|
# ? Jul 26, 2014 18:53 |
|
Bob NewSCART posted:Is it a good idea to upgrade now, or wait? Probably wait, if you don't buy GPUs that often. It would depend on what you want, though: how often you upgrade, how comfortable you are with resale, etc. But the next gen is supposed to be out in months versus a total unknown now, so there is that.
|
# ? Jul 26, 2014 19:58 |
|
Factory Factory posted:That doesn't strike me as a good ROI for a risky procedure.
|
# ? Jul 26, 2014 20:03 |
|
|
I'm building a PC for a friend. For 1080p gaming, is it worth it to get a 770 if he has the spare change? Or should he rather get a 760 and save for the next gen?
|
# ? Jul 29, 2014 14:16 |