Jeff Fatwood
Jun 17, 2013

Taima posted:

Wait some people only play D3? That's like Cookie Clicker: RPG-Lite Edition, like you're one layer of abstraction removed from a rat hitting a button for food for 8 years.

hell yeah dude

It's great for listening to podcasts.

Jeff Fatwood
Jun 17, 2013
All AIBs seem to have taken cues from Nvidia's flow-through cooler design and... then blocked off the short or perforated PCB with the backplate :shepface:

It's like case manufacturers deliberately making the airflow poo poo all over again.

Jeff Fatwood
Jun 17, 2013

Zedsdeadbaby posted:

Isn't it great when there's a generational leap so massive it immediately wipes out the value of everything else on the market

It rules, yes.

Jeff Fatwood
Jun 17, 2013

Samadhi posted:

Due to the general behemothness of the new NVidia cards, is there any drawback to putting a 3080 in the second PCIe x16 slot on an X570 motherboard if you don't have any plans for that slot otherwise? They are both PCIe 4.0 x16 slots, so I assume the bandwidth and performance is the same?

Have you checked in the manual that it actually runs at x16? If so, then yeah, it's the same. It'll probably change the airflow in the case and maybe the temps one way or another (or not).

My X570 Gaming Edge has the second slot running in x4 (lol).
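
If you want to double-check on a live system rather than trusting the manual, GPU-Z will show the negotiated link, and on an Nvidia card something like nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current --format=csv should print it too (assuming reasonably recent drivers).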

Jeff Fatwood
Jun 17, 2013
AMD knows how to disappoint. :shepface:

Jeff Fatwood
Jun 17, 2013

repiv posted:

up thread someone posted https://www.lian-li.com/gb-001/ as a cheap solution

Of course this won't work on my MSI Gaming Edge because the upper screw hole is blocked by the stupid m.2 heatsink :shepicide:

Jeff Fatwood
Jun 17, 2013
lol I was looking at that card previously and thought to myself that there was no way it wasn't going to be stupid.

Jeff Fatwood
Jun 17, 2013
https://twitter.com/Radeon/status/1305612438675562505

https://twitter.com/NateOrb/status/1305616619310395392

https://videocardz.com/newz/amd-showcases-radeon-rx-6000-graphics-card-design

Jeff Fatwood fucked around with this message at 22:35 on Sep 14, 2020

Jeff Fatwood
Jun 17, 2013
Hey, I'll take it. I'd love to have just a single cable for a 38WN95C if I end up buying that monitor. Probably not gonna be that simple.

Jeff Fatwood
Jun 17, 2013

repiv posted:

i doubt big navi will have enough juice on the usb-c port to power such a huge monitor

turings usb-c port was only good for 27 watts

lol, yes. I meant a single cable from the computer: DP + USB hub

Jeff Fatwood
Jun 17, 2013

repiv posted:

looks like the reference board has a footprint for an optional second stacked HDMI port, so 2xHDMI cards like the ASUS TUF may still be reference



I tried peeping at it on my phone, and based on the gallery on Asus' site, that doesn't seem likely.

Jeff Fatwood
Jun 17, 2013
a hearty lol at the fact that Nvidia could have just delayed this a couple of weeks to build up at least some stock, but instead went into panic mode just to gently caress everyone over even more

Jeff Fatwood
Jun 17, 2013

Der Shovel posted:

Someone in Sweden called up Proshop.se, the Swedish branch of a large pan-European vendor. I, like many others, ordered from them yesterday. Their website was giving a green light for almost an hour after the GeForce launch, saying their stock arriving on 17.9. was still good and they were getting enough cards from the manufacturers to be able to deliver by next week. Then slightly after 17:00 the website started saying they were out of stock, so all good, right? Nope!

Turns out they had 20 TUF OC cards in total, apparently spread across all seven of their regional stores, and their system took over 700 orders before it finally flipped over to out of stock. Every one of those 700+ people was charged the full price of the card, and only last night were their orders updated to say they weren't getting a card. Apparently this was "a bug" with the website. The rep estimated over the phone that they will be getting restock in November. Maybe. Hopefully. They apparently have no way of estimating where a given order is in the queue, so those 700 people might be getting a card in November, or they might be getting one some time next year.

I can't really imagine anyone botching a GPU launch in a bigger way.

:pusheen:

Jeff Fatwood
Jun 17, 2013

Romes128 posted:

oooooooooommmmmmmmmgggggggg he actually did the "you're the actual racist if you call someone racist"

https://twitter.com/JayzTwoCents/status/1232510423598952448?s=20

what a loving utter idiot.

if I'm such a racist then how come I'm using all my non-white employees as shields against criticism of racism?? :thunk:

Jeff Fatwood
Jun 17, 2013

Fauxtool posted:

gently caress you, my time is worthless

Jeff Fatwood
Jun 17, 2013

shrike82 posted:

i think people are kinda stretching themselves to make the ability to immediately purchase a discretionary luxury good some kind of moral right

I mean when all you have is a crowded altar of sacrifice, it's probably kind of hard not to form a cult around it

Jeff Fatwood
Jun 17, 2013
a big expensive turd, hell yeah

What a hilariously dumb launch.

https://www.youtube.com/watch?v=Xgs-VbqsuKo

Jeff Fatwood
Jun 17, 2013

Durzel posted:

Just lol at class action lawsuits because a product won’t run beyond its rated specification.

also lol at proven NOT RACIST Jay going "this is almost like 3.5GB but not really" when it's nothing like the 3.5GB fiasco.

Jeff Fatwood
Jun 17, 2013

lol pretty much

Jeff Fatwood
Jun 17, 2013

exquisite tea posted:

Does anybody expect Big Navi to be competitive this time around?

Me, but I'm a hopeless romantic.

edit. I mean, just looking at the clusterfuck Nvidia has rushed into headfirst just to claim an unquestionable price/performance crown over its own products, it feels like they're expecting something from AMD. Or maybe they really are just that afraid of the consoles claiming the PC market, idk

Jeff Fatwood fucked around with this message at 15:12 on Sep 26, 2020

Jeff Fatwood
Jun 17, 2013
I'm still not convinced that DLSS is going to be the next big thing since it really seems to have two big caveats: developers have to be interested, and Nvidia needs to actually put the work in on a per-game basis (lol)

Also, in my short stint with DLSS 2.0 on Minecraft RTX, I ran into some major ghosting artefacts that seem to pop up every now and then. Can't really deny that the results are great, but I'd be happy with a more mediocre solution that worked with most games, since that would have a bigger chance of getting implemented everywhere.

Jeff Fatwood
Jun 17, 2013

Some Goon posted:

DLSS 2.0 uses a generalized network so Nvidia doesn't have to do anything per-game. The devs still need to hook it in directly though.

DrDork posted:

DLSS 2.0 takes (from what we're being told, anyhow) minimal developer work, assuming that the game already supports TAA (which is a lot of them), and requires no per-game training. That it looks like it's being rolled into Unreal Engine is gonna make it almost a given for future games based on it.

Yeah, I also forgot that DLSS 2.0 is like 6 months old so lol, maybe I should give it a bit more time.

I get that it's generalized as opposed to 1.0, but I'm wary of it being that simple in terms of implementation, or of it not requiring any hand-holding by Nvidia (I'm also dumb and probably wrong)

Jeff Fatwood
Jun 17, 2013

sauer kraut posted:

Is the downclocking/bluescreen issue really fixed? (for people who don't daisy-chain the 3 power connectors)
A 5700 would be nice to get at around 200€ used, but then there's the issue of crazy coin miners stuffing the thing with custom BIOSes and wearing the fans out.
I saw a nice Red Dragon on ebay, but of course when I checked the seller's history it was all mining rig stuff :smith:
New 5700s are impossible to get in Germany, those things just vanished off the face of the earth.

Anecdote:

I fixed my 5700 XT black screens by upping my 3200C16 memory voltage from 1.35V to 1.4V.

Jeff Fatwood
Jun 17, 2013

jisforjosh posted:

Wait your DDR voltage affected the card being stable 😳

Yup. I read that as a possible fix on Reddit since I had pretty much ruled out all the other "usual suspects":
- two power cables, good newish PSU (Corsair RM750X v2 bought in 2019)
- good ventilation
- a good AIB card (Sapphire Nitro+)
- no OC, no undervolt
- supposedly fixed new drivers (June 2020)

So yeah, lol

Jeff Fatwood
Jun 17, 2013

Taima posted:

Sorry I know people probably wonder this a lot and its pretty basic, but if RDNA2 was going to be competitive, why wouldn't AMD release anything about it in the last month? What possible competitive advantage does that confer.

I mean, the RTX 3000 launch hasn't exactly been a slam dunk, so maybe they anticipated that Nvidia would rush its release, have just had their suspicions confirmed, and are taking their time (or sticking to schedule). Or maybe they have nothing. You can really theorycraft any way you wish.

Jeff Fatwood
Jun 17, 2013
glorious what the hell were they thinking

Jeff Fatwood
Jun 17, 2013

Vir posted:

Der Bauer only replaced two banks. Got to replace them all, and stack those MLCC capacitors 5 layers deep! We'll get 2.2 GHz for sure!

Yes, I know those are resistors.

mlcc .. piled high

Jeff Fatwood
Jun 17, 2013

8-bit Miniboss posted:

Hardware Unboxed calling out Jayztwocents in so many words. Also Jay is on Nvidia's shitlist (or at least, got a very stern talking to) because of that same video as mentioned in his video earlier today.

https://www.youtube.com/watch?v=lhyCdraz54s

lol eat poo poo Jay. Any source on that shitlist quote?

Jeff Fatwood
Jun 17, 2013

yeah rip

https://www.youtube.com/watch?v=7h3Zx4bV0RE

Jeff Fatwood
Jun 17, 2013

abraham linksys posted:

so if I'm sitting on a 1060 and was hoping to do an upgrade by the end of the year so that my roommate can inherit it (for context, their pc can barely run destiny 2 at 30, a game that runs at like 90fps for me, so it'd be a big upgrade for them), but i'm still rocking a 1080p60 monitor with no plans to upgrade so I have zero reason to get a 3080, i'm now wondering what to do. hope to get a 3070 in november or december? put my prayers in big navi? look for a used 1660 super (since its not like the 2000 series is going down in price)? buy a ps5 and share it with my roommate?

I'd say go used at this point. If you're not upgrading your monitor then an RTX 3080 is definitely not worth it. All this year's cards will most likely be extreme overkill for 1080p (especially at 60fps). Whatever you can score for cheap, under 150 USD maybe?

Jeff Fatwood
Jun 17, 2013

LRADIKAL posted:

What you are describing is, basically, a marketing lie.

I mean, that's just the technical correctness loving demo Nvidia is going for. Don't take this away from Paul.

Jeff Fatwood
Jun 17, 2013

"lets make it look like a race car!"

*makes a GPU do blackface"

Jeff Fatwood
Jun 17, 2013

Theophany posted:

Take the entire shaker of salt:

https://www.youtube.com/watch?v=zmbvUK93npo

But if this bears true, it's absolutely an artificially created supply issue. I refuse to believe nVidia didn't anticipate massive demand given the global situation and their very attractive pricing for these cards.

I was just about to post this too, and even if it isn't true, Nvidia might want to reconsider their approach to releases if they don't want to start fueling (possibly bullshit) narratives like this. Of course they're laughing all the way to the bank anyway, so whatever. I do like the possibility that this clusterfuck of a launch is just a ruse to get everyone thinking about RTX before AMD comes in and starts actually competing again. AMD doesn't even have to nuke Nvidia, just actually be competitive.

Jeff Fatwood
Jun 17, 2013

Taima posted:

This "incentivize devs not to use DLSS" thing is pathetic and AMD should feel bad.

Or maybe, and hear me out here... just maybe, Nvidia should open up some of their "revolutionary" tech once in a while if they want it to actually start getting implemented.

Jeff Fatwood
Jun 17, 2013

Sagebrush posted:

without DLSS it doesn't really matter how good they are

It's always fascinating to me that even after PhysX, Hairworks, G-Sync and whatever other Nvidia-exclusive bullshit, people are still going all in on "if it doesn't have X proprietary Nvidia stuff, it's not worth using".

Jeff Fatwood
Jun 17, 2013

Sagebrush posted:

PhysX and hairworks were obviously gimmicks.

VRR is a huge huge deal and if Freesync didn't exist I would be saying that G-Sync support is a dealbreaker in 2020. Getting a monitor without either one is definitely dumb.

DLSS is another technology like G-Sync that can increase the quality and performance of any game (for which it's implemented) at essentially no cost. It's a really huge change, far bigger than dumb poo poo like hairworks.

Of course. Freesync exists because VRR is a huge deal, not because G-Sync is a huge deal. If G-Sync were the only tech in town, VRR would still be niche, because a ~200 USD surcharge to enter a proprietary walled garden would keep it that way. My point here was that a closed, proprietary approach never works for mass adoption.

DLSS will never be anything more than another G-Sync and will be overtaken by a free standard if one ever comes out. Same with raytracing. That's why I find it fascinating that people are so ready to declare everything else useless when Sony and Microsoft don't seem to think like that. Hell, Sony and Microsoft themselves have tried proprietary walled garden poo poo multiple times in different markets and they've either crashed and burned or carved out a cool little niche that they make a buck off of.

edit. I should have clarified/used better examples, but my point was also that Hairworks and PhysX probably wouldn't have been gimmicks if they were something more open, instead of Nvidia loving around and trying to make a quick buck off of a cool idea. I mean, physics in games and swaying hair are both staples of game tech now, and there's no reason they couldn't be big Nvidia trademarks if Nvidia had an actual interest in innovation and not just in ultra-capitalizing on everything they do. They're not Apple.

double edit. also it's going to be very fitting if Cyberpunk is the last big RTX title and after that the AMD approach takes over, because it's supported by everything and not just thousand-dollar graphics cards that also need yet another piece of proprietary tech on top to run acceptably (re: Witcher 3 / CDPR / Hairworks)

Jeff Fatwood fucked around with this message at 09:52 on Oct 23, 2020

Jeff Fatwood
Jun 17, 2013

Riflen posted:

Technology can be successful for a company without every potential customer having access to it. We have plenty of examples of this. Not everything developed is intended to be ubiquitous and nor should it be, because designing for ubiquity will often mean steep compromises. Saying Nvidia doesn't have an actual interest in innovation is patently false.

That's true; Freesync and HDR10, for example, are very flawed, and I'm not really arguing that Nvidia should do what I'm saying, they're doing fine. But I was replying to a comment that said everything that doesn't support DLSS is irrelevant, so I'm sorry for replying with a hyperbolic statement of my own. :shrug: I'd gladly see DLSS and RTX implemented in everything, but Nvidia doesn't have a great track record with their own proprietary stuff, so why shouldn't I be skeptical? I've also put in my order for an RTX 3080, so I'm not even fanboying here, just keeping a healthy amount of skepticism about my own purchasing decisions. Nvidia does have an interest in innovation, but not a great track record in getting that innovation into the hands of everyone, and that's also fine. I just don't understand why I can't criticize that, or why I should care about innovation that most people will never see in a market that's literally global and has millions of users. Especially when they have competition that has no problem doing the same things while also standardizing them.

AirRaid posted:

But right now there literally is nothing else like it, because it's something that Nvidia have pioneered for the home market. So whining that Nvidia are marketing their own thing that they spent a lot of time and effort developing (and which demonstrably works and is awesome) is weird.

Well, it's not really my intention to sound whiny, but isn't anyone at all concerned about it being proprietary?

AirRaid posted:

Yes, an open solution which works on all hardware is preferable, and will likely happen, but right now it is not there at all, so calling it niche and gimmicky just because it's in its infancy is a bit absurd.

Yeah, it'll happen if it happens, and that's why I find it weird that people are so ready, in the meantime, to throw themselves at something that might not even have a future and pay a large sum of money for it. I mean, I get that we're in the 1000 USD future-electronic-waste thread here, but I don't think I'm being completely unreasonable.

AirRaid posted:

Were Pixel Shaders "niche" when the GeForce 3 launched, and like 4 games used them?

Were pixel shaders 100% proprietary Nvidia? Asking for real, because I don't know.

Jeff Fatwood
Jun 17, 2013

AirRaid posted:

Probably, my point still stands though. They literally invented it, and it's only been around for a couple of years. It's good enough tech that it'll work out into some kind of standard, given time.

Well sure, and that's great. But I think my point also stands: I'm well within my rights to call something that's 100% Nvidia-proprietary and limited to insanely expensive niche hardware "niche", and that's not really whining.

Jeff Fatwood
Jun 17, 2013
https://videocardz.com/newz/amd-radeon-rx-6800xt-navi-21-engineering-board-photo-leaked

Comments talk about a possible Sapphire Nitro being spotted here. I hope the rumors that only the AMD reference card will be available at launch are false and that I can get into that Nitro+ queue as early as possible.

Jeff Fatwood
Jun 17, 2013

AirRaid posted:

I didn't think DXR was properly established as "the way we're all going to do it" just yet.

Yeah, I had gotten the impression that RTX was somehow the whole thing. Will it be trivial to get ray tracing working on RDNA2 GPUs once they come out, in games that already support ray tracing on Nvidia?

edit.

repiv posted:

anything in directx is by definition the way everyone is going to do it, nothing gets added to directx until NV/AMD/Intel reach a consensus

we also know that AMD is supporting the DXR API as-is with RDNA2

https://www.youtube.com/watch?v=sW0g0lqsqKA

Ahh, ok. Sorry about my spiel bundling ray tracing and DLSS together, my impression of RTX was clearly wrong. I'm still skeptical about DLSS and don't see myself benefiting from it for a long while.
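
So to make sure I've got it straight now: the vendor-agnostic bit is the game just asking DirectX whether ray tracing is available at all, something like this rough sketch (untested; SupportsDXR is just a name I made up for illustration, the rest is the standard D3D12 feature-check API as far as I know):

#include <d3d12.h>

// Rough sketch: ask D3D12 whether DXR is available. The same check should
// work on any vendor that implements the DXR API (Nvidia, AMD RDNA2, etc.)
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    // Tier 1.0 or better means the DXR API is usable on this GPU
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}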

Jeff Fatwood fucked around with this message at 13:43 on Oct 23, 2020
