|
Paul MaudDib posted:why are you booing me? I’m right. You may or may not be. The reality is that just because Nvidia has the ability to accelerate certain ops at a certain precision doesn't mean that it's necessary. Doing things a particular way simply because their hardware can would not be a first for Nvidia. That said, simply from a software perspective AMD / MS / whoever else anyone may fantasize about introducing a DLSS equivalent is years behind. It takes time to get things out and integrated, and Nvidia is by far the world leader in realtime graphics in terms of actually getting solutions for things into real action in the market.
|
# ? Oct 17, 2020 14:48 |
|
Rolo posted:So this shitshow was the first time I tried to get popular tech on release. Is it going to be about the same trying to get a zen3? I’m on a Ryzen 7 3700x so I’m not in a rush but some friends are going to be trying. I would expect zen3 to be easier at least in part because pricing is way less attractive than the outgoing generation.
|
# ? Oct 17, 2020 14:52 |
|
Kazinsal posted:Yeah, at some point Microsoft is going to just go, "gently caress it, here's the secret sauce" and everyone will have access to it. 2.0 uses a generalized model, it doesn't require them to train every game individually. The dev adds support for it by adding in some unknown bits + movement hints for objects in each frame to help the thing make choices.
|
# ? Oct 17, 2020 15:22 |
|
I accidentally (?) bought a monitor that supports AMD FreeSync even when I wasn't really looking for it (Samsung S27R350FHE for anyone curious). Is that a valuable enough feature for me to go out of my way to get an AMD card (I'm on an RX 580 right now) when I was planning to get an RTX card (whether Turing or Ampere, availability here in the PH is weird) in December?
|
# ? Oct 17, 2020 15:35 |
|
shrike82 posted:lol dude why do you need to pretend to know ML to win an argument I'm still not sure what Marxism-Leninism has to do with video cards.
|
# ? Oct 17, 2020 15:38 |
|
people should realize that Nvidia marketing something as a tensor core exclusive feature doesn't mean that the hardware is actually a prereq. and for paul's benefit, GEMM (the relevant tensor matrix multiply operation) is hardware accelerated on all video cards (including on AMD cards) - what the tensor cores do is speed up lower precision (fp16, bfloat16, and i believe tf32 with Ampere) GEMM. there's nothing specific to tensor cores that makes DLSS inferencing exclusive to it.

funnily, nvidia themselves have a (single frame) AI upscaling solution running 4K60 on their Shield TVs (with a slow Tegra X1+ that doesn't have tensor cores).

it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere. and most importantly, it's something Microsoft is keen to implement with their new consoles

shrike82 fucked around with this message at 15:49 on Oct 17, 2020 |
# ? Oct 17, 2020 15:40 |
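To put the GEMM point above in concrete terms, here's a minimal numpy sketch (my own illustration, not anything out of Nvidia's stack): the same matrix multiply run at fp32 and again at fp16. Tensor cores accelerate the low-precision version in hardware, but the operation itself runs on anything that can do a matmul; the trade-off is a small accuracy loss.

```python
import numpy as np

rng = np.random.default_rng(0)
m = k = n = 256

# GEMM is just C = A @ B (plus an optional accumulate); fp32 reference first
a32 = rng.standard_normal((m, k)).astype(np.float32)
b32 = rng.standard_normal((k, n)).astype(np.float32)
c32 = a32 @ b32

# the same GEMM at half precision - the case tensor cores speed up in hardware
c16 = (a32.astype(np.float16) @ b32.astype(np.float16)).astype(np.float32)

# lower precision trades a little accuracy for throughput
rel_err = np.abs(c16 - c32).max() / np.abs(c32).max()
print(f"worst-case relative error of the fp16 GEMM: {rel_err:.5f}")
```

Inference for a network like DLSS's bottoms out in exactly these multiplies, which is why the argument is about speed rather than capability.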
|
gradenko_2000 posted:I accidentally (?) bought a monitor that supports AMD FreeSync even when I wasn't really looking for it (Samsung S27R350FHE for anyone curious). Is that a valuable enough feature for me to go out of my way to get an AMD card (I'm on an RX 580 right now) when I was planning to get an RTX card (whether Turing or Ampere, availability here in the PH is weird) in December? Hard to say, it's been known that some FreeSync monitors are fine to use with G-Sync even if not certified, but since there's only a handful of Samsung displays actually certified compatible in Nvidia's own compatibility list, I wouldn't count on it. But any mid to high tier card will certainly go faster than 75hz on your monitor so you're probably stuck with VSync. Edit: Reread the specs, it's a FreeSync 1 monitor so it actually won't work at all with G-Sync and it's a 1080p screen. Any new card you get now will keep it at the max refresh rate and beyond honestly. 8-bit Miniboss fucked around with this message at 15:47 on Oct 17, 2020 |
# ? Oct 17, 2020 15:44 |
|
8-bit Miniboss posted:But any mid to high tier card will certainly go faster than 75hz on your monitor so you're probably stuck with VSync. okay cool thanks.
|
# ? Oct 17, 2020 15:51 |
|
Xachariah posted:For any UK guys here, don't sleep on Curry's. Just got an email confirming my order was dispatched for a Ventus which was listed on there for £680.00. Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner. I was also running the stupid JavaScript bot in the background, and it was checking Curry's, but it somehow missed it at first as well, it alerted me some 3 minutes after I saw the Discord message. So in total I was maybe 8 minutes late. Had no chance. I'm wondering if I should just buy a 2070 or something now, rather than wait until the 3070 launch - the way this is going, it wouldn't surprise me if after the 3070 launch all 20x0 cards will get sold out
|
# ? Oct 17, 2020 16:22 |
|
jaete posted:Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner. I’m hoping that AMD has something good and that plus the 3070 takes a lot of pressure off the 3080. I just don’t think that many people are really looking to spend close to a grand on their graphics card. Great price/performance $500 cards are going to be popular. But what I don’t know is how hard Samsung is loving this up. Nvidia putting out news articles that say they are going to switch to a different fab next year points to a large amount of loving it up. spunkshui fucked around with this message at 18:57 on Oct 17, 2020 |
# ? Oct 17, 2020 16:37 |
|
Pivo posted:No, that's not a very good point. Resolution is quantifiable, 'higher quality pixels' are not. Go 'not a good point' at me again with more microsoft pr buzzspeak, paul mauddib needs the competition for hot takes
|
# ? Oct 17, 2020 16:56 |
|
i think it is obvious that by "higher quality pixels" pivo meant "other characteristics of the image beyond spatial resolution" - for instance the pixels' refresh rate, their color gamut and accuracy, their brightness, their black points. same as how you don't need more than about 24MP in your camera for the vast majority of uses, so instead of just cramming in more photosites, you can make the ones you have larger, with less dead space between them, so they capture more light, and design them to have a broader range of exposure before clipping, etc., and your images will be subjectively better.
|
# ? Oct 17, 2020 17:03 |
|
Sagebrush posted:Two people who both managed to download and compile TensorFlow slapping the poo poo out of each other over which one is the true genius AI expert Hey now, compiling TF is not easy, because lol Google.
|
# ? Oct 17, 2020 17:16 |
|
joke's on you i use pytorch
|
# ? Oct 17, 2020 17:16 |
|
AirRaid posted:"Tensor" is literally a trademark name that nvidia came up with for a thing, it is not a type of thing in itself. If this was true I might’ve gotten better than a C in engineering electromagnetics.
|
# ? Oct 17, 2020 17:21 |
|
jaete posted:Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner. I've seen that Currys has both the gigabyte Aurus Master and Vision dropping on the 5th of next month, maybe wait till then for nothing to be available. MonkeyLibFront fucked around with this message at 18:46 on Oct 17, 2020 |
# ? Oct 17, 2020 18:43 |
|
shrike82 posted:people should realize that Nvidia marketing something as a tensor core exclusive feature doesn't mean that the hardware is actually a prereq. i meant to post this chart earlier and got off doing something else, so thanks for posting.

i asked my so about the relative approaches this morning. not a gfx person, but an ml prof. he said this idea isn’t surprising as people refine techniques and realize they can accelerate more fundamental maths, and they can do that by just reusing pre-existing shader cores. assuming microsoft worked with them to implement this in hardware, i wonder if the ps5 has it, or if their desktops will have it / the appropriate variant.

my guess is that whatever version of hardware dlss amd comes up with, it’ll probably be behind a bit at launch and the internet will wail and gnash teeth and buy nvidia, and then a few driver releases in or whatever it starts to improve. i guess maybe one question is whether amd wants to put in the same kind of model training time on it per title or come at it from a more universal, if less individually tweaked, set of rules.

mediaphage fucked around with this message at 17:07 on Oct 18, 2020 |
# ? Oct 17, 2020 18:45 |
|
just checking back in and wow guys this thread is looking a little tensor than when i left
|
# ? Oct 17, 2020 18:53 |
|
shrike82 posted:it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere. You're probably right here, but there are very real limits to what you can do with frame only. You have limited information for upscaling which means you need to make very strong assumptions if you're trying to get close to native quality. DLSS 1.0 tried to use ML to optimize those assumptions on a per-game basis and it performed comparatively poorly for the huge amount of work it required. There might be some other information that can be brought into the upscaling algorithm, but I suspect that any good solution in the future is going to incorporate both the frame plus some amount of extra scene information. Stickman fucked around with this message at 19:27 on Oct 17, 2020 |
# ? Oct 17, 2020 19:22 |
|
jaete posted:Yeah I noticed Curry's as well, a day or two ago. There was a message on one of the Discords saying Curry's have some 3080s up but also sorry that was actually like 5 min ago we don't have a bot for it yet so didn't notify you all sooner. poo poo man, sounds like you actually saw the second wave there. The Ventus and Trio X went up initially, went out of stock, then they were back into stock 5 mins later briefly. I've heard of a similar kind of outcome on Amazon UK. There's the initial stock and then a staccato restock shortly after. I think these places scan in stock one by one and some people have been successful just continuing to squat on the page and F5 after they "miss" the initial drop.
|
# ? Oct 17, 2020 19:43 |
|
One month since paying for a card as yet unreceived. This feels like a bad consumer experience. Like, it's not the end of the world, but it's real bad. loving demand issues
|
# ? Oct 17, 2020 20:40 |
|
repiv posted:the usefulness of an AI upscaler is predicated on how fast it is, and nvidia has an edge there with their specialized hardware units Dedicated inference hardware is a solved problem though, isn't it? Just spend some transistors on matrix multiply-add, which is all a tensor core is. Apple, Qualcomm, Intel, NVIDIA's figured it out ...
|
# ? Oct 17, 2020 20:46 |
|
Paul MaudDib posted:why are you booing me? I’m right. Right as in correct, or right as in “microsoft flight simulator 2020 is just a glorified tech demo”?
|
# ? Oct 17, 2020 20:47 |
|
shrike82 posted:it's not clear the frame + movement vectors and previous frame CNN autoencoder approach they've gone with is the only/best way to do game upscaling - just look at how fast Nvidia is iterating on their implementations. given how fast the broader ML field moves, it would not surprise me to see Microsoft or a game developer roll out an implementation of a better approach that some random academic group (or even Microsoft Research) publishes out of nowhere. I've got some experience with neural networks and machine learning algorithms. Enough to implement a digit classifier with MNIST from scratch in C++ anyway. I'm also familiar with reconstruction algorithms for computer graphics as writing ray tracers has been a dorky hobby of mine for the last 20 years.

The work NVIDIA's been doing on neural denoising is extremely impressive and could be a total game changer for stuff like the architectural visualisation field. DLSS is also extremely impressive but I don't think it's self evident that they've hit the best spot on the quality/computation curve or that their algorithm is universally applicable.

Nobody outside of NVIDIA knows exactly what they're doing, but part of their algorithm we do know about is an autoencoder. Autoencoders are, put very simply, networks that accept a bunch of inputs, contain a "choke point" in the middle that forces the inputs to be mapped to a much smaller representation, and an expander that attempts to reconstruct the original inputs from the compressed middle layers. These are great at denoising, inpainting missing details, or, if you've got a larger number of outputs than inputs, scaling. Running them is just a bunch of matrix multiplications, which is what the Tensor Cores are designed to do quickly. They usually produce crap results when the inputs are too far outside their training set though, and you can't train on everything.

To engage in a bit of semi-informed baseless speculation, I think part of the reason DLSS still isn't widely available and has shown up in so few shipped titles is that it requires a degree of per-title or even per-scene tuning in its current state to produce decent results. Not retraining the underlying network, but massaging and filtering its inputs. It's in UE4 now, but NVIDIA does approval on a per-developer basis before they let you turn it on. Throwing the doors open and letting everybody play with it will mean that its limits and weak points are quickly found. NVIDIA has determined, correctly in my opinion, that a few titles that implement DLSS really well combined with the promise of more to come is going to sell more cards than those same few games plus a bunch of awful indie crap that uses DLSS badly.

I guess what I'm getting at is that DLSS doesn't seem, to me, to be a general solution to the problem of game upscaling and there are almost certainly cheaper, non-neural-network-based algorithms that will do well enough. Now that they've opened that door, I think we're going to see a lot of rapid progress in the field, and I can't wait to see where things are in a couple years.
|
# ? Oct 17, 2020 20:57 |
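The "choke point" description above maps almost line-for-line onto code. Here's a toy numpy sketch with untrained random weights, purely illustrative (a real upscaler would be a deep convolutional network, and the layer sizes here are made up): inputs get squeezed through a narrow bottleneck, then expanded back out.

```python
import numpy as np

rng = np.random.default_rng(42)

def layer(n_in, n_out):
    # random weights stand in for trained ones in this illustration
    return rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out)

def relu(x):
    return np.maximum(x, 0.0)

# toy autoencoder: 64 inputs -> 8-wide bottleneck -> 64 outputs
w_enc, b_enc = layer(64, 8)
w_dec, b_dec = layer(8, 64)

def autoencode(x):
    # encoder: force the input through the much smaller representation
    code = relu(x @ w_enc + b_enc)
    # expander: attempt to reconstruct the original from the code
    return code, code @ w_dec + b_dec

x = rng.standard_normal((1, 64))
code, recon = autoencode(x)
print(code.shape, recon.shape)
```

Make the output layer wider than the input and the same structure does scaling instead of reconstruction; either way the work is all matrix multiplies, which is the hook for tensor cores.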
|
Zedsdeadbaby posted:Resolution is quantifiable, 'higher quality pixels' are not. Go 'not a good point' at me again with more microsoft pr buzzspeak, paul mauddib needs the competition for hot takes

You're a moron. Higher quality pixels literally does have meaning. It means quality > quantity in terms of the content of the raster vs. the size of the raster. It has meaning in displays, it has meaning in sensors, it has meaning in rendering. I can't believe this has to be explained. It can be quantified in various ways - shading precision, mesh complexity, raytracing bounce depth, whatever.

mediaphage posted:i guess maybe one question is whether amd wants to put in the same kind of model training time on it per title or come at it from a more universal, if less individually tweaked, set of rules.

DLSS 2.0 is a general solution and doesn't require per-title training anymore, actually!

Pivo fucked around with this message at 21:16 on Oct 17, 2020 |
# ? Oct 17, 2020 21:08 |
|
Nvidia release the cards and the thread is barely holding together.
|
# ? Oct 17, 2020 21:16 |
|
Is this the right place to ask for help... if not (sorry) let me know where best to move it to. I need some advice with GPU & power requirements, I'm clueless on power so go ahead and call me an idiot if need be. (I'm in UK, if it matters)

My teenage son, who has a not-particularly-impressive PC, has gone and blown his savings on a new GPU without properly checking if he can use it. This is the card: Gigabyte GeForce RTX 2060 OC https://www.amazon.co.uk/gp/product/B07MJGCPW5 It has an 8-pin power connection; his current GPU only uses power from the PCI slot.

His 500W PSU doesn't have this 8-pin power connector; it has two SATA and two molex, but the HDD is using one molex. Do we have any option here but to swap out to a new PSU which has this 8-pin connector? I've heard of adapters, e.g. which connect two molex to this 8-pin, but then what does his HDD use? Can I use those two SATA power connectors for anything here?

As I say, I'm clueless on voltages / wattage / power draw etc. Also I've heard you should just avoid adapters, unless you like occasional fires. Any advice much appreciated!

Other specs if they matter:
MSI MS-7788 H61M-P31 (G3) LGA 1155 M-ATX Motherboard
Intel Core i7-2600 3.4 GHz Quad-Core Processor
16GB DDR3 RAM
1TB HDD
|
# ? Oct 17, 2020 21:20 |
|
Fair Hallion posted:Is this the right place to ask for help... You can convert a single SATA plug to an 8-pin. I think 500W might work. The i7 2600 is a CPU from 2011; depending on the game it's going to hold the GPU back. It will absolutely run games much faster than it currently does, at least. If that machine doesn't have an SSD for its OS, it's a good investment to make. It can turn into a game storage drive in another computer down the road. spunkshui fucked around with this message at 22:00 on Oct 17, 2020 |
# ? Oct 17, 2020 21:36 |
|
Fair Hallion posted:Unfortunate situation Is the PSU a model that we can look up? Does it have any modular cables or are all the cables just coming out of one big loom? You could adapt one of the SATA power cables into molex for the HDD, which would then let you adapt the 2 x molex from the PSU into a PCI-E 8-pin. Personally, I would never do this because I don't trust those adapters and would budget for a suitable PSU instead, but it would probably work.
|
# ? Oct 17, 2020 21:43 |
|
Fair Hallion posted:Is this the right place to ask for help... A 500W PSU should be fine to run a 2060, but I'd be very suspicious of the quality of a 500W unit that doesn't have a PCIe connector. I might consider getting a new one, especially if it's as old as that CPU - PSUs don't last forever.
|
# ? Oct 17, 2020 21:44 |
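As a rough sanity check on the 500W question in the posts above, a back-of-the-envelope budget. The wattages are approximate published TDPs, not measured peak draw (and the motherboard/drive figures are my guesses), so treat the headroom as a loose estimate:

```python
# rough PSU budget for the build in question; figures are approximate TDPs,
# not peak draw - transient spikes can run meaningfully higher
parts = {
    "RTX 2060 (board TDP)": 160,
    "i7-2600 (CPU TDP)": 95,
    "motherboard + 16GB DDR3 (guess)": 50,
    "HDD, fans, peripherals (guess)": 30,
}

psu_watts = 500
total = sum(parts.values())
print(f"estimated draw: {total}W; headroom on a {psu_watts}W unit: {psu_watts - total}W")
```

On paper there's room to spare, which matches the advice that the real worry is the quality of a bargain-bin 500W unit rather than the number on the label.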
|
Kraftwerk posted:Well everyone knew there’s no way I’m waiting till Sunday for this. I’m very happy about the ending to the Kraftwerk 3080 Saga.
|
# ? Oct 17, 2020 21:52 |
|
steckles posted:To engage in a bit of semi-informed baseless speculation, I think part of the reason DLSS still isn't widely available and has shown up in so few shipped titles is that it requires a degree of per-title or even per-scene tuning in its current state to produce decent results. Not retraining the underlying network, but massaging and filtering its inputs. It's in UE4 now, but NVIDIA does approval on a per-developer basis before they let you turn it on. Throwing the doors open and letting everybody play with it will mean that its limits and weak points are quickly found. NVIDIA has determined, correctly in my opinion, that a few titles that implement DLSS really well combined with the promise of more to come is going to sell more cards than those same few games and a bunch of awful indie crap that uses DLSS badly.

So having the extra data available from a good TAA implementation may not be sufficient to have a good DLSS implementation in the same game? Interesting if so, and it may explain the gap between promise and reality right now. Thanks for the post, good read along with the one you responded to.

When it comes to why DLSS adoption is slow, there's also the fact that current games have perfectly good presentation without DLSS because they were designed to have good presentation without DLSS, RTX aside. Studios may simply not be willing to put in extra work on a game that doesn't require the DLSS perf boost when they've already put in a bunch of work to have reasonable performance at current resolutions. Games that are designed from the ground up to use DLSS-like reconstructive/scale-up boosts, so they can go further on the quality per pixel as so well put earlier, will undoubtedly show up once the tech shakes out some more - which will certainly happen when the consoles drop and their approaches are known, regardless of what happens on PCs.
|
# ? Oct 17, 2020 21:54 |
|
AirRaid posted:"Tensor" is literally a trademark name that nvidia came up with for a thing, it is not a type of thing in itself. Whoever told you this is not your friend.
|
# ? Oct 17, 2020 21:59 |
|
Riflen posted:Is the PSU a model that we can look up? Does it have any modular cables or are all the cables just coming out of one big loom?

this seems to be it - pretty basic I admit. All wires come out of the same hole in the PSU's casing https://www.ebay.co.uk/itm/ATX-500-B-500W-PC-Power-Supply-Black-With-24-Pin-3-x-SATA-12CM-Silent-Fan/192162044644

Riflen posted:You could adapt one of the SATA power cables into molex for the HDD, which would then let you adapt the 2 x molex from the PSU into a PCI-E 8-pin.

would the SATA provide the same power to the HDD as the molex, is it just the connector that's different? (SATA wires look thinner than the molex ones) thanks
|
# ? Oct 17, 2020 22:11 |
|
VorpalFish posted:A 500w psu should be fine to run a 2060, but I'd very suspect of the quality of a 500w unit that doesn't have a pcie connector. Thanks, the PC isn't that old though, built from new parts 18 months ago or so, but just not latest or top of the range parts. So it's not an ancient PSU, just on the basic side.
|
# ? Oct 17, 2020 22:27 |
|
Animal posted:Right as in correct, or right as in “microsoft flight simulator 2020 is just a glorified tech demo”? "Not left."
|
# ? Oct 17, 2020 22:30 |
|
Fair Hallion posted:this seems to be it - pretty basic I admit. All wires come out of the same hole in the PSU's casing Several of the reviews indicate that people are having problems trying to power discrete cards with that supply. That's the very, very bottom of the barrel, and I'd be hesitant to trust it with anything outside a cheap office computer. Unfortunately, it's a pretty terrible time to be buying PSUs. Corsair CX/CXM are usually about the cheapest PSUs I'd recommend for a gaming PC, and even those are pretty expensive right now. It looks like you could get a better quality Corsair TXM for cheaper than CX right now, though. It's still spendy, but it's a good investment to protect your components and it comes with a 7-year warranty so he could easily carry it forward for quite a while.
|
# ? Oct 17, 2020 22:36 |
|
Pivo posted:Dedicated inference hardware is a solved problem though, isn't it? Just spend some transistors on matrix multiply-add, which is all a tensor core is. Apple, Qualcomm, Intel, NVIDIA's figured it out ... sure they can get there eventually if they need to, we're mostly talking about RDNA2 for the next gen consoles and desktop cards though if it turns out they can't make a viable DLSS clone without that kind of acceleration in their consumer cards it will set them back 2 or 4 years
|
# ? Oct 17, 2020 22:54 |
|
Fair Hallion posted:this seems to be it - pretty basic I admit. All wires come out of the same hole in the PSU's casing SATA is designed for Hard Disks and Optical drives. It'll be fine. I'm kind of surprised the HDD doesn't have a SATA power connector. Does it have a SATA data connector and a Molex power connector? I haven't seen one of those for 15 years.
|
# ? Oct 17, 2020 22:56 |
|
|
Stickman posted:Several of the reviews indicate that people are having problems trying to power discrete cards with that supply. That's the very, very bottom of the barrel, and I'd be hesitant to trust it with anything outside a cheap office computer Fair enough. I think a new PSU is the route we'll have to take. Thanks for the help!
|
# ? Oct 17, 2020 22:56 |