necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
A workstation IGP would target the same market as the Quadro and FirePro cards, which is primarily folks doing CAD. It's just that there's no such thing as a "workstation"-class IGP at this point.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Nondescript Van posted:

That is the exact opposite of a phantom.
That refers to your bank account, silly.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Another consideration is that more powerful PhysX hardware doesn't necessarily provide better visuals, since most games that support it don't expose anywhere near the crazy options we get for primary graphics. So the optimal coprocessor would have the lowest possible idle power while still being fast enough to handle PhysX for the life of that system. Otherwise, unless power is really expensive, I'd suggest just keeping old cards around instead of selling them and buying something cheaper every upgrade cycle, which just sounds silly.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
It's a good offensive and defensive strategy: you can expand into potential markets without doing lame channel sales agreements or signing more fab contracts, and at the same time, if they're having trouble getting good profits off of existing IP, just licensing it can shore up margins. It'll probably be a year or two before we see anything materialize out of this, but I'd love to be wrong and find out there were already things in the works that simply made licensing the official policy for everyone.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Speaking of quiet, I'm on the last-gen train with a "new" GTX 680, and compared to my previous EVGA GTX 560, which I fried stupidly, this GPU is so much quieter at load it's not even funny. Noise was by far my biggest complaint and a substantial factor in why I hadn't considered high-end cards, but it makes sense: with a big enough performance gap, a card can run games without breaking a sweat, so the fans barely need to work at all. I can't really imagine how a GTX 780 could be even quieter, given that the CPU fan is probably the louder component now, but the limited airflow (and therefore cooling) in a supercompact case like a mini ITX build would probably be hit harder than my micro ATX Lian Li PC-A04 is.

The best part is that the power consumption is as good as or better than the GTX 560 I had, at far superior settings, so the only downside is the up-front cost. Another unexpected pro is the idle power consumption - it's within 3-4 W of an ATI 5450, which idled the whole system around 51 W on a 520 W PSU. A Kill-A-Watt reading of 54 W sitting at the desktop is not what I expected, given that most reviews had their systems' idle power consumption at least in the triple digits. Unfortunately, I don't think I can really get on the cooler = faster train for the current-gen cards, because I plan on building something vaguely close to a new Mac Pro, in a vertical air-pull configuration with a smaller case that has limited airflow and a cooling design that won't work well with ACX coolers especially.

I'm really curious whether the PC industry will take a cue from Apple and start thinking about integrating component coolers together, but I don't think that'll happen, for both cost and technical reasons. I'd love to be able to swap parts out and know I'm getting decent cooling in a tiny yet quiet case that takes up basically no room. I mean, that's the goddamn holy grail of desktop computing, isn't it? The smallest package that lets you upgrade stuff as needed while staying cool and quiet.

Does anyone have any theoretical numbers on what kind of heat dissipation a cooling design like the new Mac Pro's could achieve? If it can solidly cool a dual GTX 780 configuration, it'll give enthusiasts something new to blow their money on, so there would be some financial incentive to go this route, at least as a boutique industry.
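
If anyone wants to rough it out themselves, here's the kind of back-of-the-envelope math I'd use (my own sketch of the standard heat-vs-airflow relation, not a real thermal model of the Mac Pro's design):

```python
# Back-of-the-envelope airflow needed to carry a given heat load out of a case
# at a given air temperature rise. Assumes sea-level air density ~1.2 kg/m^3,
# air specific heat ~1005 J/(kg*K), and perfect mixing; real cases do worse.

AIR_DENSITY = 1.2         # kg/m^3
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K)
M3_PER_S_PER_CFM = 0.000471947

def required_airflow_cfm(heat_watts, delta_t_kelvin):
    """Volumetric airflow needed to remove heat_watts at a delta_t exhaust rise."""
    m3_per_s = heat_watts / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_kelvin)
    return m3_per_s / M3_PER_S_PER_CFM

# Dual GTX 780s (~250 W each) plus ~100 W of CPU/board, 20 K exhaust rise:
print(round(required_airflow_cfm(2 * 250 + 100, 20)), "CFM")  # roughly 53 CFM
```

So a single big, slow fan moving 50-60 CFM is at least plausible on paper for that kind of load, which is basically the Mac Pro pitch; whether the shared heatsink core actually keeps individual components in spec is the open question.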

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I think we can just fall back on buying optimally and upgrading often to avoid getting bitten hard one way or the other: "(Computers) Strange game. The only winning move is not to play. How about a nice game of chess?"

I'm convinced a lot of the "enthusiast" high-end products are basically for every other nerd to blow money on once, realize they got nothing meaningful out of it, and move on afterward. I guess I sorta did the same thing, except I bought a Powerbook G4... 4 months before Apple introduced the x86 MacBooks that spanked it on performance :(

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
AMD has been known for shoddier developer tooling and APIs than nVidia while being somewhat OK about IP sharing / licensing; the opposite seems true for nVidia (see: Linus Torvalds' rants on nVidia).

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Athropos posted:

I'm beginning to think that maybe my SLI 2GB 680s are not enough for the resolution I'm using (2560x1440). I need to run some tests and monitor VRAM usage before being certain but god damnit.

I kind of don't want to go through the hassle of selling and replacing them. I imagine they've lost quite a bit of value since I bought them a while ago.

Anyone else gaming on 1440p can chime in?
I have a single 2GB 680 at 2560x1440 and I don't think I've ever had the memory cap out, nor have I had anything drop below 30 fps even when something's buggy.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
For everyone else who had a problem after running the updated nVidia drivers: I got out of being forced into safe mode by completely uninstalling all drivers and all nVidia and ATI software (including ATI - I briefly had both ATI and nVidia drivers present) and cold booting between each configuration change until the drivers finally installed. Granted, this may not be the proper solution, but I tried reinstalling drivers like everyone else and that didn't work whatsoever. I suspect something is wrong with a number of users' configurations that nVidia didn't quite test for. It kind of pisses me off because I'm not exactly a crazy power user when it comes to my GPUs and I've stuck with mainline driver releases, so I shouldn't have had any problem at all. Maybe it has something to do with using a GTX 680 that I'm not aware of, but seeing that people on even 460s are getting similar problems, I'm just going to leave the judgment at "nVidia QA hosed up"

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I got a GTX 680 2GB card for like $250 from a goon when the GTX 7xx series came out and I saw how little value I'd get for an extra $100+. I had burned out my GTX 560 when I was about to sell it and was without any gaming-capable card for months. Totally glad I waited. :colbert:

I've gotten smarter about buying hardware after I got burned buying a Powerbook G4 about 4 months before the first x86 Macbook showed up.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Has anyone tried the Accelero Hybrid II? I'd like to quiet down my GTX 680 at both idle and load and hopefully be able to use it later for a GTX 970 or even beyond.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
This debacle doesn't seem that different from the TLB, TSX, FDIV, and various other historical CPU gently caress-ups, and there will likely be drivers that patch around it or something. We've always had to do crazy gymnastics to access memory at various points in computing history. Anyone remember emm386.exe? How about the bank-switching modes on the Atari 2600? Any game developer that feels like releasing a patch to avoid this region is free to do so, but it may also be fixable on nVidia's side with drivers that exclude the slow region from allocation unless developers specifically enable access to it on a GTX 970.
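
If anyone wants to see where the slow segment kicks in on their own card, something like this rough probe would do it (my own sketch, assumes CuPy is installed and the 970 is device 0; not a rigorous benchmark):

```python
# Rough probe of on-device memory bandwidth as the working set grows, to spot
# the GTX 970's slower 0.5 GB segment. Not rigorous: the driver decides where
# allocations land, so any cliff shows up as an average slowdown at best.
import time
import cupy as cp

def touch_bandwidth_gbps(n_bytes):
    """Allocate n_bytes on the GPU, read+write every element once, return GB/s."""
    buf = cp.zeros(n_bytes // 4, dtype=cp.float32)
    cp.cuda.Stream.null.synchronize()
    start = time.perf_counter()
    buf += 1.0  # one read and one write per element
    cp.cuda.Stream.null.synchronize()
    elapsed = time.perf_counter() - start
    return 2 * n_bytes / elapsed / 1e9

# The last size may fail to allocate if the desktop is already using VRAM.
for gib in (1.0, 2.0, 3.0, 3.5, 3.7):
    print(f"{gib:.1f} GiB working set: {touch_bandwidth_gbps(int(gib * 2**30)):.0f} GB/s")
```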

People threatening to sue over this kind of asinine crap only lets the lawyers win. Never give in to team law.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I wasn't planning on upgrading to a GTX 970 anytime soon, but I'm so tempted to exploit all the rich kids around here that have more money than sense. Sure, they'll still be richer than me, but I'll have a great card for maybe $40 more than what my GTX 680 goes for on eBay, despite the 680 being two generations old. It just means I'll hold off on the next upgrade for another 3 years instead of 1.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Megasabin posted:

I actually don't know. I got the card recently, and I haven't been playing anything taxing on it. It was actually a christmas gift in anticipation for the Witcher III. If I have stuttering while trying to play the Witcher because of this issue I'm not going to be happy, so I'm trying to sort it out now.
Given past history, The Witcher 3 will likely have problems at release that make it barely playable even for people with GTX 980 SLI setups at 1080p... and that'll be resolved within maybe a month. It'll most likely be fantastic for whatever parts you can play, but you really shouldn't buy hardware so far ahead of time, especially for a single game. I'm still on a GTX 680 with a 3440x1440 monitor and I don't really see anything terrible happen in my games. But for Witcher 3 I might upgrade around then. Who knows what'll happen?

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Darkpriest667 posted:

Let's be honest to the folks here, the "telemetry" that they're going to be collecting isn't anything super personal or special. It's nothing secret that they wouldn't be able to get anyway. It's just more convenient.
I really suspect the primary point is to provide massive amounts of proof of the sheer volume of piracy in Asia and developing / emerging markets, to the point that the data cannot be ignored and government officials are pushed into a corner and forced to act. I believe this is a fundamental control point that Microsoft wants, rather than trying to herd the billions of cats out there that will likely break their copy protection.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
My GTX 680 has either magically developed coil whine after a year of whatever, or its fan has started to need lubrication, because even at idle speeds I hear a squeak or chirp every couple of seconds and it's pretty drat irritating. I tried blowing the dust out of the card and that hasn't even changed the sound. Has anyone had to deal with something like this, and would it be worth taking the fan assembly off the card to clean and re-lubricate the fan? I think it's a sleeve-bearing fan, so I'm not even sure how to lube those up, or whether I can.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Truga posted:

Yeah, I was gonna get 2 970s to run my 30", and now I don't think I will over this. 3.5gb might be just right for 1440p, but people with 2560x1600 say it can bog down in some games. I think I'll just wait for R300 and see what happens to 980 prices or if a r300 is a better deal.
I'm on a 34" Ultrawide at 3440x1440 (almost 1M more pixels than a 30") and I don't have a huge difference in what I played on 2560x1440. Maybe lost like 15% of my FPS from like 100+ to like maybe 90+ for a lot of titles. I'm on a GTX 680 for reference, which is significantly slower than a GTX 970 even and may be somewhat close to these new GTX 960 cards. I don't have an SLI option really for my setup because I'm on a mini ITX case and motherboard, but I'm not very serious about my games so I'm just sticking with this setup for the foreseeable future.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Zig-Zag posted:

Newegg is giving away the witcher 3 when you buy a MSI gtx 970 100ME. I'm glad I waited because that's the game that I wanted a new gpu for.
Goddammit, now there really isn't much point in me holding out past this generation, given I was going to buy the game near release day anyway. Goodbye GTX 680, goodbye $200, hello GTX 970, hello Witcher 3 at 3440x1440, hopefully somewhere near 60 FPS.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

veedubfreak posted:

The odds of actually hurting equipment in this day and age from static discharge is so small as to be almost negligible.

I have worked on hundreds of computers over the years and never worn a static strap and have never damaged any equipment. It's not like I'm wearing flannel footie pjs while working on my stuff.
I burned out two motherboards in a row a couple of years ago while installing them for my mini ITX file server build. It was really static-y in winter and I was on a vinyl chair mat where I could hear static discharging as I rolled my chair across it. I'm also quite sure I burned out at least three sticks of RAM in the previous year and perhaps hastened the departure of one more, though there's a possibility the guy I bought it from caused the failure when dropping it into the bag. The total damage to me so far is easily about $550. I absolutely know it's ESD because both motherboards worked briefly before I reached over, felt a zap from my hand, heard the sound, and the drat thing turned off. You don't typically get two DOA motherboards in a row; we're far past the days of ECS and Biostar motherboard quality now. I wasn't exactly wearing flannel PJ bottoms or anything, but when your build environment is a static paradise with pretty much everything that could go wrong, even an ESD strap might not help much. I was holding onto grounded metal as much as possible during the build and testing, knowing my environment was high-risk.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I moved my GTX 680 from an old machine to a new one and now I have coil whine where I never had any before. It's definitely possible to gain coil whine (or potentially remove it!) by changing PSUs, motherboards, etc., since that can change the harmonics in the electrical system that cause the resonance. Some people use VSync to avoid it, others might need to disable it (I run VSync to keep the fans from ramping up). I think I'll just live with it until Pascal GPUs are out, though; I'm too busy to bother playing anything coming out in 2015 anymore, so there's no point upgrading even to a 970.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
There are some larger ITX cases, like the new Core X1 from Thermaltake, that can take large graphics cards. The Raven RVZ01 had to compromise on space by opting for an SFX power supply, which limits your PSU pool pretty significantly. The Core V1 I'm using, however, won't take the Gigabyte GTX 970 I was looking at (the one with 3x DisplayPort outputs), though it does have plenty of CPU heatsink clearance, which I wouldn't get with the RVZ01 in comparison. Because of that, I'm going to be a bump on a log and not even upgrade to a GTX 970 while I wait for 4K displays to drop significantly in cost (provided Intel has its way), at about the time Pascal is released.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
This is what I meant about an SFX PSU limiting your selection - you actually have to do a bit of research and think things through, whereas with plain old ATX PSUs you have an incredible number of possibilities. I'm all for making powerful computers smaller, but if you're custom-building stuff yourself, having a limited selection of parts really hurts when something breaks.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I don't buy video cards often and was wondering: exactly how much better would the cooling performance be between, say, an ASUS GTX 970 and an eVGA one with the ACX 2.0 cooler? I'm trying to keep a single-card setup that can drive two 4K monitors and a 34" ultrawide for non-gaming use, which requires a combination of three DisplayPort or HDMI outputs that none of the ASUS or MSI GTX 970s carry. To further restrict options, I have a mini ITX case with a maximum card length of about 11.5", so the Gigabyte cards are out. Otherwise, I'm looking at a stupid GTX 980 just for the better coolers and the output ports. I'm not sure when I'll buy the other monitors, but it would be within a year. It sounds like future-proofing, but gaming is not my priority: screens and low noise and heat at idle are my priorities, along with being able to play something at 3440x1440 without questioning my video card.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I'm just crossing my fingers that Witcher 3 at 3440x1440 performs just fine on a GTX 970. RAM and disk shouldn't be a problem at least, and with an i7-4790k I'm going to be slightly pissed if CPU is a factor.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
You would almost think that AMD is trying to run themselves right into the ground and through the ocean floor with the decisions they're making. It sucks to be a bit of an underperforming underdog but they're not even doing what half-head management consultants would suggest anymore. I suspect they really have conceded that the only people left buying their stuff are just Intel haters or completely uninformed "enthusiasts" that are being paid by their marketing folks to give passable reviews.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Panty Saluter posted:

They broke me again. :v: This is on sale for 299 with Witcher and Batman, so I upgraded my wife's computer. She's currently on a 650 Ti which has been a fantastic little card but is getting a little long in the tooth.

drat you Nvidia, I'm trying to save money!! :doom:

gently caress, I literally bought that same card two weeks ago for $340-ish off of Amazon and I really haven't had the time to actually use it, so I might as well have waited. I got the impression those codes would go away once Witcher 3 released, though, and that's what drove me to buy when I did.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Parker Lewis posted:

I mostly game with my current 970 on a 1080p TV but occasionally play stuff on a 3440x1440 monitor (where the 970 has some trouble with Witcher 3 and GTA V) and I'm trying to figure out if I'm better off spending ~ $350 on a second 970 or ~ $400 to sell my 970 and buy a 980 Ti.
I'm in this situation too, except I don't actually have a second-card option because I'm on a mini ITX motherboard, so I'd have to go to a 980 Ti / Titan * or wait until next year for Pascal to (hopefully) rock at high resolution. Given that I'll probably be super busy for the rest of the year, I'm going to opt for the last option.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
The GTX 970 is the sweet spot for maybe up to 2560x1440, but I'm somewhat regretting it for my 3440x1440 monitor because I can hardly even get 40 fps in Witcher 3 at medium settings. Still, having to go up to a 980 Ti for a few AAA titles I hardly play is really a waste of money for me. I run at 2560x1080 and that gets me 50+ fps, but it's a little disappointing for someone who has basically never paid more than $200 for a video card.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Back in the 90s when we were still on the VGA standard, I remember the DACs and ADCs ATI used being better than those on nVidia's cards by maybe a few decibels, which at least meant less chance of aliasing artifacts and demonstrably better color range for certain video pros (not relevant for gamers or anything). I can't remember the review that rated video cards the way you'd measure signal response from speakers, but it was interesting at the time. Funnily enough, we do have a digital VGA standard now, but nobody implements it except maybe three vendors.

And hard drive seek noise is audible on many laptops for me still :(

I'll also contend that DACs weren't that great or efficient until the modern 1-bit DAC was perfected around the mid 80s (the early 80s had some issues for a while). Hardly 50 years. Christ, early Motown Records didn't have anything that would remotely compete with a fuckin' iPhone 3G's DAC or any of its ADCs.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Manifest posted:

I need a new card and am on a roughly $200 budget.
My GTX 760 is dying.

Is this the best buck for my bang at that price?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487133
Anyone have negative experiences with heat due to the size?
I could sell you my old GTX 680 4GB, which is actually faster than the GTX 760, for maybe $140 + shipping. It was previously goon-owned and I got it at a great price, so I figure I should pay it forward somehow.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Zero VGS posted:

Why do they block it... do they actually sell Nvidia Grid to anyone?
GPU virtualization is almost certainly being sold to Amazon, Google, and Microsoft for their respective cloud GPGPU services. I'm looking at using them for some personal projects myself instead of buying a small Tesla farm or expensive GPUs. I'm even crazy enough to think of running VMs in AWS or Google Compute Engine to play games instead of buying GPUs myself.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Cingulate posted:

I'm dabbling with CUDA a bit right now and considering asking for a few 1000€ to spend on a system with a GPU. I've noticed there seems to be this three-fold split in NVIDIA's offerings: GeForce is for gamers and the Titan costs 1000 or so, then you have Quadro which starts at like 2000 for something that seems, numbers wise, much worse than a Titan, and then you have Tesla, which is the price of a car. I'm wondering, how much am I short-selling our total efforts by considering a 12GB Titan over a Tesla?
The goal is to run CUDA stuff for science, neural networks and FFT mostly.

This also influences on if we want a rack-mounted monstrosity with hundreds of GB of RAM and Xeons and poo poo hidden far away from living human beings, or a tower with an i7 sitting in a remote corner of someone's office.
It depends a lot on the kind of neural network and whether you badly need double precision or not. I'm doing convolutional nets rather than recurrent ones, and they tend to be a bit piggy with memory for the sort of use cases I want (video processing with high-dimensional features and an embarrassing number of transforms and layers that are likely doing jack all), but many people do perfectly fine on just a GTX 970 and live with single precision. I recently hobbled through a quick POC that got choked by the GTX 970's 3.5 GB + 0.5 GB memory split, which isn't a problem for gaming but is a problem if you're using a lot of RAM in the pathological case of deep convolutional nets with a lot of transforms like mine. So my use case is one of the unfortunate ones that warrants a Titan X or above, and that's where I'll probably just use AWS GPU instances until Pascal is released next year. Teslas are almost an order of magnitude better for a bit more than 12x the price, basically, and shouldn't be considered unless you're thinking about compute density in a data center or using them in a serious professional scenario like a research lab for scientists, in my opinion. If your organization is just dabbling and doesn't have some long-term bet on GPU-based computing, Teslas are a Bad Idea IMO.

There's more info here on why a Titan X is probably the best overall and why a GTX 970 will likely be fine if you're just dipping your toes in: http://timdettmers.com/2014/08/14/which-gpu-for-deep-learning/
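
And if it helps anyone make the 970-vs-Titan call, this is roughly the napkin math I do before committing to a card (the layer shapes below are made up for illustration, not my actual model); activations dominate for conv nets on big inputs:

```python
# Rough activation-memory estimate for a convolutional net, to sanity-check
# whether a model fits in a given VRAM budget. Layer shapes below are made up
# for illustration; frameworks add cuDNN workspace overhead on top of this.

BYTES_PER_FLOAT32 = 4

def activation_bytes(batch, channels, height, width):
    """Memory for one layer's output feature maps (forward pass only)."""
    return batch * channels * height * width * BYTES_PER_FLOAT32

# Hypothetical video-processing net: 720p-ish input, a few wide conv layers.
layers = [
    (8,  64, 720, 1280),   # (batch, channels, height, width)
    (8, 128, 360,  640),
    (8, 256, 180,  320),
    (8, 512,  90,  160),
]

forward = sum(activation_bytes(*shape) for shape in layers)
training = 2 * forward  # backprop keeps gradients of the activations around too
print(f"forward only  : {forward / 2**30:.2f} GiB")
print(f"with gradients: {training / 2**30:.2f} GiB")
print("fits in the 970's fast 3.5 GiB segment?", training < 3.5 * 2**30)
```

Totals blowing past 3.5 GiB like that are basically how my POC choked.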

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

SwissArmyDruid posted:

Not that hard to trick the Nvidia cards into thinking that they're not actually in a VM. I hear it only takes one or two things.
Not sure which trick(s) are being used to do it. I'd think the driver is what's in control of the VM detection logic, and otherwise they could just patch it out easily. The only references I see that get around anything at all involve using nVidia GRID vGPU under ESXi, and that's not what people are normally looking for, I'd imagine. Things like changing the MAC address to another vendor's range seem like they wouldn't work, since not every VM will even be connected to a virtual network, I'd think. Otherwise, not letting VMware Tools run or shuffling PCI addresses around for certain devices could very well cause not-so-nice bugs in other things.

And I'm one of those dozen people interested in this approach because I'd like to run some CUDA stuff in one VM running Linux and, when it's not in use, have the GPU switched over to an HTPC VM that runs Windows (DRM BS for TV recording and such is easier under Windows). Maybe I should just get separate GPUs or eat the cost of a Tesla, but buying multiple $1k+ cards sucks for what's primarily home use.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Oh... KVM, basically. I've definitely read about getting nVidia card passthrough to work via KVM, but not under ESXi. According to this it's just two flags: http://www.se7ensins.com/forums/threads/how-to-setup-a-gaming-virtual-machine-with-gpu-passthrough-qemu-kvm-libvirt-and-vfio.1371980/ though it may not apply with Linux as the guest OS. It makes no sense to me why Hyper-V would matter when KVM is the hypervisor, but evidently the VM detection logic definitely triggers with Windows as the guest OS.
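
For what it's worth, the reason Hyper-V even comes up is that QEMU/KVM exposes Hyper-V enlightenments to Windows guests, and the GeForce driver apparently keys off that signature. The two libvirt knobs people usually point to look like this (my understanding of the commonly cited workaround, not something I've verified against that guide; libvirt domain XML, Windows guest):

```xml
<!-- Commonly cited libvirt domain XML tweaks for GeForce passthrough under
     KVM: spoof the Hyper-V vendor ID and hide the KVM signature so the
     guest driver doesn't bail out when it detects a VM. -->
<features>
  <hyperv>
    <vendor_id state='on' value='whatever'/>
  </hyperv>
  <kvm>
    <hidden state='on'/>
  </kvm>
</features>
```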

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I'm a semi-serious GPGPU deep learning / machine learning enthusiast and I have zero freakin' idea wtf nVidia means by "deep learning operations" in the PX 2 announcement. What, matrix multiplications, a hardware differentiator, convolution acceleration? All the press releases about the PX 2 so far basically explain wtf a neural net is, at best, while going into almost zero actual technical detail. Anyone got any more information on what's going on in the architecture / solution? I'm trying to speed up training for models that take forever to train (I am not a smart man and am using gargantuan-sized layers for the moment) and don't particularly care about the real-time video classification parts I suspect the platform is aimed at.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Paul MaudDib posted:

Half-precision floats mostly.
I don't understand how big the deep learning hype train can be if this kind of non-event / re-release is considered acceptable to present as new to a mostly rigorous, highly technical community like AI researchers and companies. This is as asinine as Intel re-releasing something that's already in use and very widely known, like SSE2, and calling it "new Big Data processing capabilities."

xthetenth posted:

That people manage to make spec non-compliant cables boggles the mind.

That google engineer reviewing USB C to A cables on Amazon has scarred me.
Non-spec-compliant DisplayPort cables are very common, as seen several pages back I believe (most cheaper cables, including even Monoprice's, drop a pin that the spec says should be there but that supposedly few devices use). I had to order a cable from a random online electronics dealer years ago to get my U2711 to stop messing up when coming out of sleep mode.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Ah, I thought CUDA already had half precision, since 16-bit precision shaders have been around forever, and that whatever I was using in Theano and Keras wound up as 16-bit floats.
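
It's easy enough to check what precision you're actually getting; a quick sketch, assuming a Theano-backed Keras install:

```python
# Quick check of the float precision Theano/Keras actually compute in.
# By default floatX is float32; float16 storage exists but is experimental,
# and most ops still upcast to float32 internally, so it isn't "free" FP16.
import theano
from keras import backend as K

print("theano floatX:", theano.config.floatX)  # typically 'float32'
print("keras floatx :", K.floatx())            # typically 'float32'

# To experiment with half-precision storage: run with THEANO_FLAGS=floatX=float16
```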

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I work on high resolution video processing using neural networks as a hobby. My entire architecture would change (for the better) if I could treat the GPU memory like it's uh... actually memory rather than a manually managed L2 cache in practice.
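
CUDA's managed (unified) memory already gets partway there, for what it's worth; here's a sketch of how I'd poke at it from Python with CuPy (my own example, not anything from my actual pipeline). The big caveat is that oversubscribing VRAM this way needs Pascal-or-newer page faulting; on Kepler/Maxwell, managed allocations are still capped at physical VRAM, which is exactly why Pascal is the interesting part:

```python
# Sketch: route allocations through CUDA managed (unified) memory so buffers
# can exceed physical VRAM and the driver migrates pages on demand.
# Requires Pascal or newer for oversubscription; older GPUs will simply fail
# to allocate more than their VRAM.
import cupy as cp

# Make CuPy allocate with cudaMallocManaged instead of plain device memory.
cp.cuda.set_allocator(cp.cuda.MemoryPool(cp.cuda.malloc_managed).malloc)

# ~5.9 GiB of 1080p float32 frames, more than a 4 GB card physically holds.
frames = cp.zeros((256, 3, 1080, 1920), dtype=cp.float32)
frames += 1.0  # touching every element forces pages to migrate to the GPU
print(f"{frames.nbytes / 2**30:.1f} GiB allocated as managed memory")
```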

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
That fuckup goes right up there with the guy several years ago that "delidded" his Intel Xtreme CPU or whatever but actually ripped the CPU clean in half along 2 layers or so.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
30 Hz, on the other hand, is loving murder on your eyes even when staring at a screen full of text because the instant you scroll your eyes will mutiny against your eye sockets.
