Worf
Sep 12, 2017

If only Seth would love me like I love him!

:shrug: it ships tomorrow

sold it for $400 as parts on ebay :shrug:

ty for the advice.


wonder what kind of deals i can find on a 2080ti..

Worf fucked around with this message at 01:17 on Jun 5, 2019

Craptacular!
Jul 9, 2001

Fuck the DH
Here's the genius part of Nvidia certifying FreeSync monitors as Gsync Compatible:

They lose all AMD branding.

Stickman
Feb 1, 2004

Yeah, I've yet to see anything labeled both "g-sync compatible" and "freesync". It's sketchy as hell, and has to be part of the partner agreement for "g-sync compatible" certification.

Craptacular!
Jul 9, 2001

Fuck the DH
This is partly because everybody's just putting logos on VESA Adaptive Sync, which doesn't have a street team. FreeSync 2, like traditional GSync, has standards.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

There have been so many shitty implementations of FreeSync that I can understand why some might want to flaunt the better brand. I think the set of people who have AMD cards, care about *sync, and don't do detailed enough research to find the compatibility info is small enough not to bother complicating the copy for.

It could of course be an AMD requirement about exclusive use that is backfiring on them.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

there's a lot to be said for, mostly, being able to say "ok gsync, i know pretty much what i'm getting in that regard" vs. "ok freesync, let me peruse the variety of common implementations."

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Aw shit, NVIDIA alluded to 7nm Ampere in 2020 already. Oh, the patience.

ufarn
May 30, 2009
Ditching TSMC for Samsung, too

https://twitter.com/wccftechdotcom/status/1136206296451932161

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Combat Pretzel posted:

Aw shit, NVIDIA alluded to 7nm Ampere in 2020 already. Oh, the patience.

I think we already knew that was coming, but confirmation is always nice. 7nm should be pretty huge for NV, gaining what, like 30% performance from the process alone, even if all they did was a straight die shrink of Turing? But you figure they also have other improvements in the pipeline.

Can't wait for a GPU to make my 2080 Ti look like shit.

alex314
Nov 22, 2007

1440p at high refresh rate and high details from a 120W card would be nice. Basically a 2070 with 1060 power usage.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

What's that going to do for laptops? Huge benefit? The 7nm process, I mean.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Beautiful Ninja posted:

I think we already knew that was coming, but confirmation is always nice.
I figured it'd come a year later than this.

I'd love to have a new GPU for the upcoming bunch of AAA games this fall/winter, and to increase supersampling in VR, but 2020 seems kinda close (if it falls in a release window similar to Turing's, i.e. Q1/20), especially with the rumours regarding the performance bump.

I sure hope CP2077 and WD3 release end of Q1/20. :v:

Enos Cabell
Nov 3, 2004


Going to try holding out for the Ampere Ti card, but it will be hard to resist upgrading from this 1080 Ti to the x80 part.

SurgicalOntologist
Jun 17, 2004

Related to the post I made in the part-picking thread... I'm building a system to record video from two 12MP cameras (4000x3000). I'm wondering whether an RTX card will allow us to encode both streams at once. Despite being a computer vision shop, none of us knows video encoding. I'm trying to parse this chart, and it seems the cards have only one NVENC unit but support a max of two sessions. I'm not sure what to make of that, e.g. whether using two sessions reduces the max resolution or framerate that can be encoded. Does anyone know this stuff and can point me in the right direction?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Correct, an RTX card should do that. I'd think you'd be well within the media core's capacity, since it's designed to run up to 32 streams in parallel; as far as I know there's no other capacity limit besides the 2-session limit. Or at least, jumping up to a Quadro wouldn't gain you anything if you were running out of media-core throughput.
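(For concreteness, a minimal sketch of what driving both encodes at once could look like, assuming an ffmpeg build with NVENC support. The raw-video filenames here are hypothetical stand-ins for whatever the capture pipeline actually produces, and HEVC is used because 4000x3000 is beyond what common H.264 levels allow.)

```python
# Hypothetical sketch, not a tested pipeline: two concurrent NVENC encodes.
# Assumes ffmpeg was built with NVENC (check with: ffmpeg -encoders | grep nvenc);
# cam0.yuv / cam1.yuv are stand-ins for the real capture sources.
import subprocess

def start_encode(src: str, dst: str) -> subprocess.Popen:
    """Launch one NVENC session; GeForce drivers allow two at a time."""
    return subprocess.Popen([
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "yuv420p",          # raw frames in
        "-video_size", "4000x3000", "-framerate", "30",
        "-i", src,
        "-c:v", "hevc_nvenc",   # 4000x3000 exceeds common H.264 level limits
        "-b:v", "40M",
        dst,
    ])

# Both sessions share the single NVENC block, which time-slices between them.
jobs = [start_encode("cam0.yuv", "cam0.mp4"),
        start_encode("cam1.yuv", "cam1.mp4")]
for job in jobs:
    job.wait()
```

If both processes keep up in real time, you're within the encoder's throughput; on GeForce parts the 2-session cap is the only hard limit you'd hit first.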

Gyrotica
Nov 26, 2012

Grafted to machines your builders did not understand.

Paul MaudDib posted:

Correct, an RTX card should do that. I'd think you'd be well within the media core's capacity, since it's designed to run up to 32 streams in parallel; as far as I know there's no other capacity limit besides the 2-session limit. Or at least, jumping up to a Quadro wouldn't gain you anything if you were running out of media-core throughput.

It's also trivial to find out how to flip the switch in the drivers that limits the number of streams on GTX/RTX cards. It's an artificial driver limitation meant for market segmentation.
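(If you want to see where your card's ceiling actually sits, before or after flipping that switch, a crude check is to open sessions until the driver refuses one. A rough sketch, assuming ffmpeg with NVENC; the built-in lavfi test source stands in for real input.)

```python
# Rough probe, assumptions as above: count how many NVENC sessions the
# driver will grant by launching encodes until one gets refused.
import subprocess
import time

def holds(n: int) -> bool:
    """Try n simultaneous NVENC encodes; True if they all stay alive."""
    procs = [
        subprocess.Popen(
            ["ffmpeg", "-y", "-f", "lavfi",
             "-i", "testsrc2=size=1920x1080:rate=30",
             "-t", "10", "-c:v", "h264_nvenc", f"probe_{i}.mp4"],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        )
        for i in range(n)
    ]
    time.sleep(3)  # a session-limit refusal makes ffmpeg exit almost instantly
    alive = all(p.poll() is None for p in procs)
    for p in procs:
        p.terminate()
    return alive

for n in range(1, 6):
    print(f"{n} concurrent sessions:", "ok" if holds(n) else "refused")
```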

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gyrotica posted:

It's also trivial to find out how to flip the switch in the drivers that limits the number of streams on GTX/RTX cards. It's an artificial driver limitation meant for market segmentation.

Oh, sweet. I didn't know anyone had managed to patch that out.

Bummer there's no patch for the FreeBSD driver, but I guess I could screw around with passthrough to a virtualized Linux environment (it works, you just have to do some further trickery to keep the driver from seeing it's virtualized), or just rebuild and settle for ZFS on Linux.

Also, my media server only takes single-slot cards, so I'd have to wait until Colorful gets that single-slot 1660 Ti out. Until then the Quadros are the only GPUs that would physically fit anyway.

(or, I'm finally moving to a place where I'd have a basement, I could just move everything to a server rack anyway and build it big, instead of SFF...)

edit: actually I forgot my PSU on that rig doesn't have a PCIe power connector, so I'd have to rig something up with a splitter...

Paul MaudDib fucked around with this message at 19:34 on Jun 5, 2019

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

Craptacular! posted:

Here's the genius part of Nvidia certifying FreeSync monitors as Gsync Compatible:

They lose all AMD branding.

It will probably bother AMD when they have to brand their cards as G-Sync Compatible Compatible.

Cygni
Nov 12, 2005

raring to post

Stickman posted:

Yeah, I've yet to see anything labeled both "g-sync compatible" and "freesync". It's sketchy as hell, and has to be part of the partner agreement for "g-sync compatible" certification.

It probably also has to do with the iron grip Nvidia has on the gaming GPU market. It's like 80/20 in new card sales, and essentially 100% market share at the high end that would be paired with a fancy monitor.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Now here's a GPU cooler I can get behind.

https://www.anandtech.com/show/14480/spotted-at-computex-the-ultimate-gpu-air-cooling-solution-

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~

might as well go all-out at that point
https://www.youtube.com/watch?v=Q32yxCKlOY8

Cygni
Nov 12, 2005

raring to post

Bring back Kryotech:

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alpha Mayo posted:

might as well go all-out at that point
https://www.youtube.com/watch?v=Q32yxCKlOY8

Not gonna lie, when I saw the... fuck, what's it called. The new dual-GPU board thing that Apple is shoving into their new thing. I briefly wondered if I could make something like that for my own GPUs.

(answer: the shroud would be easy, but I don't have a source for making heatsink fin stacks.)

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Cygni posted:

Bring back Kryotech:

http://www.ldcooling.com/shop/ld-pc-v10-115v-usa/87-ld-pc-v10-115v-usa-phase-change.html

(PCIe slot covers not included!)

craig588
Nov 19, 2005

by Nyc_Tattoo
I still technically have my R507 cooler, but CPUs these days respond so much less to extreme cooling, and with all of the problems with condensation there's no point in using it. I had a 4 GHz P4 back when the fastest CPU Intel sold was 3 GHz.

Cygni
Nov 12, 2005

raring to post


oh my god i was actually kinda tempted until i saw that $1,500 price tag

craig588
Nov 19, 2005

by Nyc_Tattoo
You can build them yourself for much cheaper. I spent about $400 building mine, and if you want to go really cheap you can get a broken air conditioner to build from for even less; all it needs is a working compressor.

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
I remember this type of cooling system being used in the first commercially available 1 GHz system. Shortly after, a regular CPU achieving that clock was released.

VelociBacon
Dec 8, 2009

craig588 posted:

You can build them yourself for much cheaper. I spent about $400 building mine, and if you want to go really cheap you can get a broken air conditioner to build from for even less; all it needs is a working compressor.

Please post pics. I'm not calling you out, I'm just super interested in what that would end up looking like.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Lambert posted:

I remember this type of cooling system being used in the first commercially available 1 GHz system. Shortly after, a regular CPU achieving that clock was released.

Yup, pretty sure Prometeia made them for a long time, and so did Asetek with their VapoChill units; they were on some of the P3/P4 and Athlon XP systems. They were still ridiculously expensive even back then, so it doesn't shock me that companies still sell them for $1000+.

EoRaptor
Sep 13, 2003

by Fluffdaddy

I wonder if this is the public part of a behind-the-scenes patent license between Samsung and Nvidia. We know Samsung did a patent deal with AMD so their in-house mobile GPUs would be covered, and signing a deal with Nvidia for super-discounted fab capacity would fit into that, as Nvidia almost never discloses its licensing deals.

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Those fins are going to clog up with so much shit so fast.

Deathreaper
Mar 27, 2010

craig588 posted:

You can build them yourself for much cheaper. I spent about $400 building mine, and if you want to go really cheap you can get a broken air conditioner to build from for even less; all it needs is a working compressor.

Out of curiosity, is this on a regular-use system (i.e. not just a benchmark rig)? Also, is condensation an issue?

craig588
Nov 19, 2005

by Nyc_Tattoo
I'm disabled now, so taking pictures is hard, but condensation is a real issue. What I used to do was pack the socket with dielectric grease to keep out any moisture. I used that cooler for about 2 years. I built mine into a horizontal case so the motherboard could lie flat. I wouldn't use it again, because the gains from extreme cooling are smaller now, and there are all of the side issues that come with going sub-ambient. 5.5 GHz, but there's a startup sequence, it could get killed at any moment, and it uses an extra 400 watts? It's just not worth it to me anymore.

SurgicalOntologist
Jun 17, 2004

Paul MaudDib posted:

Correct, an RTX card should do that. I'd think you'd be well within the media core's capacity, since it's designed to run up to 32 streams in parallel; as far as I know there's no other capacity limit besides the 2-session limit. Or at least, jumping up to a Quadro wouldn't gain you anything if you were running out of media-core throughput.

Excellent, thanks.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Silly question: is Ampere the next-gen successor to Turing?

eames
May 9, 2009

Zedsdeadbaby posted:

Silly question: is Ampere the next-gen successor to Turing?

It should be, according to roadmaps, but it could also be a compute-focused product; Volta was rumored to be the Pascal successor, and then Turing came out of nowhere. Nvidia hasn't confirmed anything yet.

Wiggly Wayne DDS
Sep 11, 2010



craig588 posted:

I'm disabled now, so taking pictures is hard, but condensation is a real issue. What I used to do was pack the socket with dielectric grease to keep out any moisture. I used that cooler for about 2 years. I built mine into a horizontal case so the motherboard could lie flat. I wouldn't use it again, because the gains from extreme cooling are smaller now, and there are all of the side issues that come with going sub-ambient. 5.5 GHz, but there's a startup sequence, it could get killed at any moment, and it uses an extra 400 watts? It's just not worth it to me anymore.
could your roommates not take a picture? i'm surprised you never mentioned the cooling system or getting one of your machines to 5.5 GHz before. how stable was it, and what was the rest of the setup?

repiv
Aug 13, 2009

Quake 2 RTX is available now:

https://www.nvidia.com/en-gb/geforce/campaigns/quake-II-rtx/
https://store.steampowered.com/app/1089130

It comes with the demo levels, but if you install the full version of regular Quake 2 beforehand, Q2RTX will import the full game content during installation.

repiv
Aug 13, 2009

Well, I'm getting 60 fps on my 1070 with global illumination set to Low and the resolution set to 720x480 :suicide:
