|
yes Intel has given us nothing
|
# ? Mar 23, 2015 21:25 |
|
|
|
go3 posted:yes Intel has given us nothing Not nothing but it has allowed them to go much longer between releasing meaningful upgrades for desktop users. I have an i7-2600k and there will be no real reason to upgrade until Skylake which is going to be like Q1 2016, five years after the 2600k was released.
|
# ? Mar 23, 2015 21:46 |
|
Luckily for us, software requirements continue to bloat (business-wise), so that's where all the innovation comes from, competition or not. I don't know what the % is, but I'm guessing desktop-grade CPUs are dominated by the business market and we're along for the ride. Now discrete GPUs, on the other hand, are a far more delicate thing imo. While graphics technology is definitely cutting edge and on the rise (mobile), the kind we're interested in seems a bit more fragile, and competition-driven development is more of a reality there.
|
# ? Mar 23, 2015 21:57 |
MaxxBot posted:Not nothing but it has allowed them to go much longer between releasing meaningful upgrades for desktop users. I have an i7-2600k and there will be no real reason to upgrade until Skylake which is going to be like Q1 2016, five years after the 2600k was released. Ehhh, everyone is having problems with getting big chips on smaller processes. Intel has been making advances where it really matters for most of the market: heat and power efficiency.
|
|
# ? Mar 23, 2015 22:17 |
|
How much are people expecting out of Skylake in the desktop space anyway?
|
# ? Mar 23, 2015 22:20 |
|
Hace posted:How much are people expecting out of Skylake in the desktop space anyway? Meh. And meh until they hit the 8nm node. Because that's when things get interesting, when we see what material Intel transitions to, post-silicon.
|
# ? Mar 23, 2015 22:31 |
|
If desktop Skylake is Q1 2016 that's a really awkward time to release considering 2016 is when AMD is dropping K12 and Zen, both of which will have HBM2 while Skylake, AFAIK, does not, and I'm not sure how soon Intel would be willing to just drop another iteration of... Err, this post really should be in the AMD thread
|
# ? Mar 23, 2015 22:36 |
|
If you are primarily playing games, Skylake probably won't be worth upgrading to from Sandy Bridge. At this rate I'll probably keep my 2500k until electromigration kills it or the caps on my mobo explode
|
# ? Mar 23, 2015 22:54 |
|
SwissArmyDruid posted:Meh. And meh until they hit the 8nm node. Because that's when things get interesting, when we see what material Intel transitions to, post-silicon. Probably SiGe or InGaAs
|
# ? Mar 23, 2015 23:00 |
|
FaustianQ posted:If desktop Skylake is Q1 2016 that's a really awkward time to release considering 2016 is when AMD is dropping K12 and Zen, both of which will have HBM2 while Skylake, AFAIK does not and I'm not sure how soon Intel would be willing to just drop another iteration of I've been saying for a while that I think that AMD has been playing the long game towards getting into the notebook/SOC market. A competitive microarchitecture, graphics that blow Intel out of the water but have historically been bandwidth-starved by DDR3, HBM forming a combined L3 cache + graphics + possibly system memory, + HSA? It's almost like AMD planned this all along, and the only things left to do are to prove out Zen, and then bring all this tech together in a single chip. An APU is still going to be a sub-optimal product for a desktop gamer, but a notebook gamer? Those will actually have some serious grunt under the hood, if it all pans out. And any system that sells with AMD graphics is one that shuts out an NVidia sale.
|
# ? Mar 23, 2015 23:02 |
|
AMD certainly are playing the 'long game' when it comes to releasing a driver that isn't from 2014.
|
# ? Mar 23, 2015 23:17 |
|
If I'm understanding correctly eventually AMD will have a chip that's just as power efficient but with better graphics than Intel?
|
# ? Mar 23, 2015 23:22 |
|
Wait, which thread am I in?
|
# ? Mar 23, 2015 23:27 |
|
calusari posted:If you are primarily playing games, Skylake probably won't be worth upgrading to from Sandy Bridge. At this rate I'll probably keep my 2500k until electromigration kills it or the caps on my mobo explode The only thing I'm really waiting for is Pascal and NVLink. Other than that, I'm also happy with my 2500K @ 4.4GHz for ~3 years now.
|
# ? Mar 23, 2015 23:44 |
|
sauer kraut posted:AMD certainly are playing the 'long game' when it comes to releasing a driver that isn't from 2014. They just released drivers this past week? http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
|
# ? Mar 23, 2015 23:45 |
|
Anyone got some GTX 960 experiences? Most of the benchmarks test at retarded settings. Can it get 40-60 fps out of Bioshock/TombRaider/Hitman at 1080p while looking good enough? How does MFAA come in, if it even works at all? Can you enable PhysX and that Nvidia hair fluff without the fabric of space-time collapsing around you? Thanks
|
# ? Mar 23, 2015 23:51 |
|
Tab8715 posted:If I'm understanding correctly eventually AMD will have a chip that's just as power efficient but with better graphics than Intel? With regards to power efficiency: In theory, maybe. In practice, probably not. AMD is always going to be a few steps behind Intel, because Intel reserves their bleeding-edge process tech for themselves, so they get to reap the benefits of being on Cannonlake's 10nm process when Zen hits on 16nm. With regards to graphics: AMD's graphics have always been equal to or better than the equivalent Intel offering. The only part where things are a little murky is where Intel embedded up to 128 megabytes of eDRAM onto chips with Iris Pro graphics. As I mentioned before, APU graphics cores are bandwidth-starved as hell because DDR3 can't feed those Radeon cores fast enough. But putting eDRAM onto chips to improve graphics performance is very cost-inefficient compared to GDDR5, compared to DDR4, compared to DDR3, and probably compared to HBM, too, because at a minimum, we're looking at 1 or 2 gigabytes of HBM, per stack, on the APU. This should allow those hungry GCN cores to perform more like their desktop versions... hopefully without the heat. Bear in mind that this is entirely my own speculation. There is zero news pointing in this direction. It just feels like that moment in a caper movie just before everything comes together perfectly timed. The only solid proof is an AMD patent that is sufficiently vague as to allow AMD to use it either for CPUs or GPUs, which I feel points in this direction and have flogged in this thread as well as the AMD thread on many occasions now. SwissArmyDruid fucked around with this message at 01:01 on Mar 24, 2015 |
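To put rough numbers on the bandwidth-starvation argument above, here's the back-of-the-envelope peak-bandwidth arithmetic. The figures are illustrative round numbers (dual-channel DDR3-2133, a first-gen 1024-bit HBM stack at 1 Gb/s per pin), not measurements, and the helper function is just my own sketch:

```python
# Rough peak-bandwidth arithmetic behind the "bandwidth-starved APU" claim.
# All figures are illustrative round numbers, not measurements.

def peak_bw_gbs(transfers_per_sec, bus_width_bits, channels=1):
    """Peak theoretical bandwidth in GB/s: transfers/s x bus width x channels."""
    return transfers_per_sec * (bus_width_bits / 8) * channels / 1e9

ddr3 = peak_bw_gbs(2_133e6, 64, channels=2)   # dual-channel DDR3-2133
hbm1 = peak_bw_gbs(1e9, 1024)                 # one first-gen HBM stack, 1 Gb/s/pin

print(f"DDR3-2133 dual channel: {ddr3:.1f} GB/s")  # ~34.1 GB/s
print(f"Single HBM stack:       {hbm1:.1f} GB/s")  # 128.0 GB/s
print(f"Ratio: {hbm1 / ddr3:.1f}x")
```

Even one stack is nearly 4x what the memory controller can feed the GCN cores today, which is the whole point of the post.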
# ? Mar 23, 2015 23:55 |
sauer kraut posted:Anyone got some GTX 960 experiences? You shouldn't see much of a hit from PhysX on a 960, but I'd really bump up to the 970 if at all possible, the extra memory, bandwidth and power are well worth it at 1080p.
|
|
# ? Mar 23, 2015 23:56 |
|
sauer kraut posted:Anyone got some GTX 960 experiences? I'm not saying this from experience but frankly I think you'd be way, way happier with a used 290 for the same price.
|
# ? Mar 24, 2015 00:03 |
|
That's kind of disappointing that we won't be seeing any big CPU jumps until 2017, but holy crap Sandy Bridge was a great chip.
|
# ? Mar 24, 2015 00:13 |
|
BIG HEADLINE posted:The only thing I'm really waiting for is Pascal and NVLink. Other than that, I'm also happy with my 2500K @ 4.4Ghz for ~3 years now. NVLink being a thing is er, bad? That's either the death of AMD or a bunch of dumb proprietary interfaces flying around because like hell AMD will use NVLink. Nvidia is hoping NVLink is good because that's them forcing AMD out of the market, like Gsync. calusari posted:They just released drivers this past week? I'll never get why AMD dropped support for 4000 or older cards. "We've optimized the best we can" is okay and all, but how about updating them so they work in new Windows environments? It really shouldn't be a pain in the rear end to get an X850XT or X1900XTX to work in a Win 8 environment when old Geforce bullshit works, what the hell AMD. Likely dumb question, but what are the baseline requirements for Vulkan to work, any idea? It doesn't seem GCN or DX11/12 bound so...would it work with a ye olde GTX280 or HD4870? A Geforce FX5950?
|
# ? Mar 24, 2015 00:30 |
|
FaustianQ posted:Likely dumb question, but what are the baseline requirements for Vulkan to work, any idea? It doesn't seem GCN or DX11/12 bound so...would it work with a ye olde GTX280 or HD4870? A Geforce FX5950? I don't think AMD or NV have said anything concrete about Vulkan support, but it's pretty safe to assume that cards being updated to DX12 will also get the Vulkan treatment. They seem to be targeting a similar baseline although it's hard to be sure since neither spec is public yet. That means GTX400+, HD7000+/GCN APU and Haswell Iris and Iris Pro chips are a sure thing. HD5000/6000 has the same feature level as HD7000+, but AMD aren't supporting DX12 on VLIW chips so Vulkan probably isn't happening either. repiv fucked around with this message at 01:25 on Mar 24, 2015 |
# ? Mar 24, 2015 00:46 |
|
Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync Edit: Nobody try and kill me for the title, it's copied verbatim from the article.
|
# ? Mar 24, 2015 02:59 |
|
BurritoJustice posted:Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync To summarize the article: nothing that hasn't been discussed in here already. The ghosting issues, and what happens to the framerate/refresh below the minimum FPS. As suspected, the G-Sync module self-panel-refreshes and handles frame pacing in some situations. nVidia called this part of the "secret sauce." The G-Sync module is tuned to each panel's voltage/electrical properties, whereas AMD would have to do this in software for each monitor once they can get frame pacing under control. AMD could solve this. It's, um, up to their driver team though. He also dropped a hint that nVidia may copy AMD's option to tear or vsync at max panel refresh rate. LiquidRain fucked around with this message at 03:26 on Mar 24, 2015 |
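A rough sketch of the below-minimum-refresh behavior the summary describes: when the game's frame rate falls under the panel's minimum variable refresh, each frame gets redrawn an integer number of times so the effective refresh rate lands back inside the supported range. The 40-144 Hz window and the function itself are my own illustration, not anything from the article:

```python
# Sketch of low-framerate frame repetition: redraw each frame enough times
# that the effective refresh rate stays within the panel's variable range.
# Panel limits (40-144 Hz) are hypothetical example values.

def effective_refresh(fps, panel_min=40, panel_max=144):
    """Return (effective_refresh_hz, repeats_per_frame)."""
    if fps >= panel_min:
        return min(fps, panel_max), 1
    repeats = -(-panel_min // fps)          # ceil(panel_min / fps)
    return fps * repeats, repeats

print(effective_refresh(25))   # (50, 2): each frame shown twice at 50 Hz
print(effective_refresh(100))  # (100, 1): in range, no repetition needed
```

Whether the multiplier lives in the G-Sync module or the driver is exactly the hardware-vs-software distinction the article is arguing about.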
# ? Mar 24, 2015 03:22 |
|
FaustianQ posted:NVLink being a thing is er, bad? That's either the death of AMD or a bunch of dumb proprietary interfaces flying around because like hell AMD will use NVLink. Nvidia is hoping NVLink is good because that's them forcing AMD out of the market, like Gsync. Vulkan is compatible with any GPU that supports OpenGL ES 3.1 or higher. I think OpenGL 4.3 is a requirement too, but I'm not too sure. NVLink will have a hard time taking off if NVIDIA doesn't sell Intel on the idea first. Unless they're literally planning to go over Intel's head with this interface. Also, is there any reason other than licensing that AMD couldn't jump onboard the NVLink train? BurritoJustice posted:Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync So we have another person who has confirmed the ghosting thing. Why is no one talking about this? Is it just not noticeable to most people? Also, just skip to the second page.
|
# ? Mar 24, 2015 03:34 |
|
I think the 'fear' being bandied about regarding NVLink is unfounded. There's no way it's not going to be a glorified 'bolt-on' technology that will probably add $50-75 more to the cost of a motherboard that only has PCIe 3.0/4.0 on it, which means nVidia's going to be forced to make the first-gen Pascal GPUs for both NVLink *and* PCIe, so it'll be a non-issue at first.
BIG HEADLINE fucked around with this message at 06:02 on Mar 24, 2015 |
# ? Mar 24, 2015 04:18 |
|
GrizzlyCow posted:Vulkan is compatible with any GPU that supports OpenGL ES 3.1 or higher. I think OpenGL 4.3 is a requirement too, but I'm not too sure. You recognize it, I recognize it, this thread recognizes it, tech journalism sites like PCPer and Tech Report recognize it. What more do you want?
|
# ? Mar 24, 2015 04:47 |
|
SwissArmyDruid posted:You recognize it, I recognize it, this thread recognizes it, tech journalism sites like PCPer and Tech Report recognize it. What more do you want? As far as tech journalist sites go, I thought only PCPer and, now, the Forbes article talked about it. I forgot that AMD did a short little paragraph admitting it existed (don't read the comments). Pretty much everyone else I read glossed over it, so I was wondering how noticeable it really is. I'm just expecting the people who didn't mention it to donate a couple thousand dollars to me.
|
# ? Mar 24, 2015 07:22 |
|
GrizzlyCow posted:Vulkan is compatible with any GPU that supports OpenGL ES 3.1 or higher. I think OpenGL 4.3 is a requirement too, but I'm not too sure. I must be dumb, I'm looking at AMD card specs and I'm not finding the OpenGL ES support, and sometimes even OpenGL. The only thing I've found hints that the 5000 series would be the bottom tier for AMD. And Nvidia wouldn't stop and take a moment to realize they could essentially kill a competitor with a new standard? Refuse the license, Nvidia has a commanding lead in market share, make the license costly so that AMD can't be price competitive, etc. BIG HEADLINE posted:I think the 'fear' being bandied about regarding NVLink is unfounded. There's no way it's not going to be a glorified 'bolt-on' technology that will probably add $50-75 more to the cost of a motherboard that only has PCIe 3.0/4.0 on it, which means nVidia's going to be forced to make the first-gen Pascal GPUs for both NVLink *and* PCIe, so it'll be a non-issue at first. So Nvidia then forever holds the GPU crown due to proprietary interface and AMD weeps for their silicon? NVLink promises to make PCIe another AGP.
|
# ? Mar 24, 2015 08:07 |
|
FaustianQ posted:I must be dumb, I'm looking at AMD card specs and I'm not finding the OpenGL ES support, and sometimes even OpenGL. The only thing I've found hints that the 5000 series would be the bottom tier for AMD. Hmm. Well, according to PC World, OpenGL 4.3 corresponds with OpenGL ES 3.1, so at least theoretically, everything from AMD's HD 5000 series to now should support Vulkan. Here's OpenGL ES's page if you're curious about which products are compliant. I know AMD's GCN GPUs support D3D12, so I'd expect they support Vulkan, too.
|
# ? Mar 24, 2015 08:41 |
|
FaustianQ posted:So Nvidia then forever holds the GPU crown due to proprietary interface and AMD weeps for their silicon? NVLink promises to make PCIe another AGP. ... No? PCIe will be around for a LONG time. NVLink is pretty clearly being aimed at upper-level enthusiasts and GPGPU applications. Boards *with* NVLink will undoubtedly also include PCIe slots as well, like early PCIe motherboards still had PCI slots. AMD will lose a bit of marketshare at the very apex of the pyramid, where they get the vast *minority* of their profits. OEMs like Alienware will not like having to carry NVLink cards as well as PCIe cards, because it will be very difficult to gauge how many people will pony up the extra cash for an NVLink motherboard. That $50-75 'upcharge' I theorized an NVLink-capable motherboard might cost over a PCIe-only board was what you might expect to pay for a motherboard for a *personal* build. The slightest increase in cost to OEMs greatly increases the cost to the consumer, because not only does the OEM reserve the right to charge a premium for a premium product, but they then have to maintain a *parts chain* to support those people willing to pay extra. This also would make AMD the undisputed king of the middle-grade enthusiast market, as even if NVLink expands down into that space, every build using NVLink cards is not only going to cost more for the motherboard, but likely for the actual GPU modules themselves, since I'd imagine cards *with* the interconnect will be produced in smaller numbers than those *without* it. We also don't know if AIBs are even going to get to *make* the NVLink modules yet. I haven't even seen a finalized socket picture for it, just something that kind of resembles a far more complex and fragile-looking OBD-II connector. NVLink doesn't 'promise' to do anything - at this point it's just an upcoming proprietary graphics card connection that they have to pray won't cost a ton to add to motherboards. 
AMD's still going to be an attractive option when people ask themselves "do I spend $50-75 more on a board with NVLink and lock myself into nVidia's cards for the foreseeable future, betting on the unsure future of a proprietary connector, or do I spend $50-75 *less*, get a board with PCIe 4.0, thereby retaining my ability to use GPUs from *both* vendors for the foreseeable future, and either pocket the savings for food/miscellany or use it to invest in a better CPU/more RAM/higher-tier GPU?" tl;dr: The sky isn't falling. Chill the gently caress out. BIG HEADLINE fucked around with this message at 09:32 on Mar 24, 2015 |
# ? Mar 24, 2015 09:07 |
|
Maybe I am being hyperbolic, but when Nvidia controls almost 80% of GPU market share and it's only looking to get better for them, at what point does the decision to choose become rhetorical? You're making AMD an attractive midtier enthusiast option by Nvidia essentially loving themselves with NVLink somehow, when really most people are going to choose Nvidia to begin with so if you're already buying a high end Nvidia card, why not buy a board with NVLink and "futureproof". Maybe the sky isn't falling, or won't fall, but I'm not seeing how Nvidia won't try to use NVLink to seriously hurt AMD until Intel sets a new board standard. They've got AMD trapped in a corner, time to bring the hammer down and finish the job. I've also toxxed on AMD being a gigantic gently caress up for 2016 so maybe I have ulterior motives.
|
# ? Mar 24, 2015 09:58 |
|
NVLink is also almost exclusively aimed at those who are interested in SLI and insanely-scaled SLI, which is an even smaller subset of enthusiasts.
|
# ? Mar 24, 2015 10:02 |
|
FaustianQ posted:Maybe I am being hyperbolic, but when Nvidia controls almost 80% of GPU market share and it's only looking to get better for them, at what point does the decision to choose become rhetorical? You're making AMD an attractive midtier enthusiast option by Nvidia essentially loving themselves with NVLink somehow, when really most people are going to choose Nvidia to begin with so if you're already buying a high end Nvidia card, why not buy a board with NVLink and "futureproof". They don't have 80% of the GPU market share according to the Steam hardware survey from February, at least: http://store.steampowered.com/hwsurvey/ Their 52% is pretty high compared to AMD's 29% and Intel's 19% but it's not as crazy as you're making it out to be. Ignoring Intel entirely, Nvidia has 52/81 = 64% of the combined total of AMD and Nvidia GPUs. Whether or not the steam hardware survey is a good synopsis of what's selling right now is open for debate but with the majority of gamers buying GPUs every few years I'm not sure that it's that important.
|
# ? Mar 24, 2015 10:13 |
|
better industry figures Steam Survey isn't really the best indicator, and 76% is not far off the quoted 80%.
|
# ? Mar 24, 2015 10:22 |
|
BurritoJustice posted:better industry figures Yea, steam survey doesn't capture things like our fleet of 120,000+ office shitbox machines that nonetheless have essentially 2d only Quadro NVS300s in them. Why bother with those instead of intel CPU graphics?
|
# ? Mar 24, 2015 14:06 |
|
Gwaihir posted:Yea, steam survey doesn't capture things like our fleet of 120,000+ office shitbox machines that nonetheless have essentially 2d only Quadro NVS300s in them. Why bother with those instead of intel CPU graphics? Before the core-i series intel's IGP solutions were not great
|
# ? Mar 24, 2015 14:08 |
|
e: im a dumbass who cant read charts
|
# ? Mar 24, 2015 14:10 |
|
Hace posted:Before the core-i series intel's IGP solutions were not great Doesn't matter since Outlook is the most stressful thing these run, and we replace all desktops every 3 years - they're all Ivy Bridge machines at this point. It's still
|
# ? Mar 24, 2015 14:47 |
|
|
|
I still have something like an HD2400 sitting in a box from when it took us a while to convince the powers that be to just buy a displayport cable instead of the ~$130 we were spending on these cards whenever anyone wanted dual screens. This was at the time I got a refurb 5870 for not much more. Boggles the mind.
|
# ? Mar 24, 2015 14:58 |