|
It ships tomorrow. Sold it for $400 as parts on eBay, ty for the advice. Wonder what kind of deals I can find on a 2080 Ti... Worf fucked around with this message at 01:17 on Jun 5, 2019 |
# ? Jun 5, 2019 01:09 |
|
Here's the genius part of Nvidia certifying FreeSync monitors as Gsync Compatible: They lose all AMD branding.
|
# ? Jun 5, 2019 05:38 |
|
Yeah, I've yet to see anything labeled both "g-sync compatible" and "freesync". It's sketchy as hell, and has to be part of the partner agreement for "g-sync compatible" certification.
|
# ? Jun 5, 2019 07:19 |
|
This is partly because everybody's just putting logos on VESA Adaptive Sync, which doesn't have a street team. FreeSync 2, like traditional GSync, has standards.
|
# ? Jun 5, 2019 07:43 |
|
There have been so many lovely implementations of FreeSync that I can understand why some might want to flaunt the better brand. I think the set of people who have AMD cards, care about *sync, and don’t do detailed enough research to find the compatibility info is small enough to not bother complicating the copy for. It could of course be an AMD requirement about exclusive use that is backfiring on them.
|
# ? Jun 5, 2019 12:47 |
|
there's a lot to be said for, mostly, being able to say "ok gsync, i know pretty much what i'm getting in that regard" vs "ok freesync, let me peruse the variety of common implementations..."
|
# ? Jun 5, 2019 13:13 |
|
Aw poo poo, NVIDIA alluded to 7nm Ampere in 2020 already. Oh the patience.
|
# ? Jun 5, 2019 14:33 |
|
Ditching TSMC for Samsung, too https://twitter.com/wccftechdotcom/status/1136206296451932161
|
# ? Jun 5, 2019 15:03 |
|
Combat Pretzel posted:Aw poo poo, NVIDIA alluded to 7nm Ampere in 2020 already. I think we already knew that was coming, but confirmation is always nice. 7nm should be pretty huge for NV; gain what, like 30% performance from the process alone, even if all they did was a straight die shrink of Turing? But you figure they also have other improvements in the pipeline. Can't wait for a GPU to make my 2080 Ti look like poo poo.
|
# ? Jun 5, 2019 15:07 |
|
1440p at high refresh rate and high details from a 120W card would be nice. Basically a 2070 with 1060 power usage.
|
# ? Jun 5, 2019 15:15 |
|
What's that going to do for laptops? Huge benefit? The 7nm process I mean
|
# ? Jun 5, 2019 15:30 |
|
Beautiful Ninja posted:I think we already knew that was coming, but confirmation is always nice. I'd love to have a new GPU for the upcoming bunch of AAA games this fall/winter, and to increase supersampling in VR, but 2020 seems kinda close (if it falls in a similar release window as Turing, i.e. Q1/20), especially with the rumours regarding the performance bump. I sure hope CP2077 and WD3 release end of Q1/20.
|
# ? Jun 5, 2019 16:12 |
|
Going to try holding out for the Ampere ti card, but it will be hard to resist upgrading from this 1080ti to the x80 part.
|
# ? Jun 5, 2019 16:14 |
|
Related to the post I made in the part-picking thread... I am building a system to record video from two 12MP cameras (4000x3000). I am wondering if an RTX card will allow us to encode both streams at once. Despite being a computer vision shop, none of us knows much about video encoding. I'm trying to parse this chart, and it seems the cards only have 1 NVENC but support a max of 2 sessions. Not sure what to make of that, e.g. whether using 2 sessions reduces the max resolution or framerate that can be encoded. Anyone know this stuff and can point me in the right direction?
|
# ? Jun 5, 2019 17:56 |
|
Correct, an RTX card should do that. I would think you'd be well within the media core's capacity, as it's designed to run up to 32 streams in parallel; as far as I know there's no other capacity limit except the 2-session one. Or at least, jumping up to a Quadro wouldn't gain you anything if you were running out of media core throughput.
|
# ? Jun 5, 2019 18:06 |
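For reference, a minimal sketch of what the two-stream setup above could look like with ffmpeg's NVENC encoder. The capture devices, framerate, and bitrates are assumptions, not from the thread; the real point is that each ffmpeg process takes one NVENC session, so two of these stay within the 2-session consumer driver cap, and 4000x3000 fits under h264_nvenc's 4096x4096 ceiling.

```shell
# Hypothetical camera devices and bitrates -- adjust for the actual rig.
# Each process consumes one NVENC session; both share the single NVENC block.
ffmpeg -f v4l2 -framerate 30 -video_size 4000x3000 -i /dev/video0 \
       -c:v h264_nvenc -preset slow -b:v 40M cam0.mkv &
ffmpeg -f v4l2 -framerate 30 -video_size 4000x3000 -i /dev/video1 \
       -c:v h264_nvenc -preset slow -b:v 40M cam1.mkv &
wait
```

Requires an ffmpeg build with `--enable-nvenc` and the NVIDIA driver installed, so this is a command sketch rather than something testable in isolation.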
|
Paul MaudDib posted:Correct, a RTX card should do that. It's also trivial to find out how to flip the switch in the drivers that limits stream number on GTX/RTX cards. It's an artificial driver limitation meant for market segmentation.
|
# ? Jun 5, 2019 18:21 |
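The patch being alluded to is presumably the community nvidia-patch project for the Linux driver; the exact steps below are a sketch, so check the repo's README for your driver version before running anything.

```shell
# Community patch that lifts the NVENC concurrent-session cap on the
# Linux consumer driver. Re-run after every driver update.
git clone https://github.com/keylase/nvidia-patch
cd nvidia-patch
sudo bash ./patch.sh
```

It works by binary-patching the installed driver library, which is why it has to match the driver version and why no equivalent existed for the FreeBSD driver.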
|
Gyrotica posted:It's also trivial to find out how to flip the switch in the drivers that limits stream number on GTX/RTX cards. Oh, sweet. I didn't know anyone had managed to patch that out. Bummer there's no patch for the FreeBSD driver, but I guess I could screw around with passthrough to a virtualized Linux environment (it works, you just have to do some further trickery to keep the driver from seeing it's virtualized), or just rebuild and settle for ZFS on Linux. Also, my media server only takes single-slot cards, so I'd have to wait until Colorful gets that single-slot 1660 Ti out. Until then the Quadros are the only GPUs that would physically fit anyway. (Or, since I'm finally moving to a place where I'd have a basement, I could just move everything to a server rack and build it big, instead of SFF...) edit: actually I forgot my PSU on that rig doesn't have a PCIe power connector, so I'd have to rig something up with a splitter... Paul MaudDib fucked around with this message at 19:34 on Jun 5, 2019 |
# ? Jun 5, 2019 18:37 |
|
Craptacular! posted:Here's the genius part of Nvidia certifying FreeSync monitors as Gsync Compatible: It will probably bother AMD when they have to brand their cards as G-Sync Compatible Compatible.
|
# ? Jun 5, 2019 18:40 |
|
Stickman posted:Yeah, I've yet to see anything labeled both "g-sync compatible" and "freesync". It's sketchy as hell, and has to be part of the partner agreement for "g-sync compatible" certification. It probably also has to do with the iron grip Nvidia has on the gaming GPU market. It's like 80/20 in new card sales, and essentially 100% market share at the high end that would be paired with a fancy monitor.
|
# ? Jun 5, 2019 21:26 |
|
Now here's a GPU cooler I can get behind. https://www.anandtech.com/show/14480/spotted-at-computex-the-ultimate-gpu-air-cooling-solution-
|
# ? Jun 5, 2019 21:33 |
|
SwissArmyDruid posted:Now here's a GPU cooler I can get behind. might as well go all-out at that point https://www.youtube.com/watch?v=Q32yxCKlOY8
|
# ? Jun 5, 2019 22:00 |
|
Bring back Kryotech:
|
# ? Jun 5, 2019 22:31 |
|
Alpha Mayo posted:might as well go all-out at that point Not gonna lie, when I saw the.... gently caress, what's it called. The new dual-GPU board thing that Apple is shoving into their new thing. I briefly wondered if I could make something like that for my own GPUs. (answer: Shroud, easy, but I don't have a source for making heatsink fin stacks.)
|
# ? Jun 5, 2019 22:52 |
|
Cygni posted:Bring back Kryotech: http://www.ldcooling.com/shop/ld-pc-v10-115v-usa/87-ld-pc-v10-115v-usa-phase-change.html (PCIe slot covers not included!)
|
# ? Jun 5, 2019 22:57 |
|
I still technically have my R507 cooler, but CPUs these days respond so much less to extreme cooling, and with all of the problems with condensation there's no point in using it. I had a 4 GHz P4 back when the fastest CPU Intel sold was 3 GHz.
|
# ? Jun 5, 2019 23:03 |
|
Paul MaudDib posted:
oh my god i was actually kinda tempted until i saw that $1,500 price tag
|
# ? Jun 5, 2019 23:08 |
|
You can build them yourself for much cheaper. I spent about $400 building mine, and if you want to go really cheap you can build from a broken air conditioner for even less; all it needs is a working compressor.
|
# ? Jun 5, 2019 23:12 |
|
I remember this type of cooling system being used in the first commercially available 1 GHz system. Shortly after, a regular CPU achieving that clock was released.
|
# ? Jun 5, 2019 23:30 |
|
craig588 posted:You can build them yourself for much cheaper. Please post pics. I'm not calling you out, I'm just super interested in what that would end up looking like.
|
# ? Jun 6, 2019 00:27 |
|
Lambert posted:I remember this type of cooling system being used in the first commercially available 1 GHz system. Yup, pretty sure Prometeia made them for a long time, and so did Asetek with their VapoChill units; they were on some of the P3/P4 and Athlon XP systems. They were still ridiculously expensive even back then, so it doesn't shock me that companies still sell them for $1000+.
|
# ? Jun 6, 2019 00:48 |
|
ufarn posted:Ditching TSMC for Samsung, too I wonder if this is the public part of a behind-the-scenes patent license between Samsung and Nvidia. We know Samsung did a patent deal with AMD so their in-house mobile GPUs would be covered, and signing a deal with Nvidia for super discounted fab capacity would fit into that, as Nvidia almost never discloses its licensing deals.
|
# ? Jun 6, 2019 02:38 |
|
SwissArmyDruid posted:Now here's a GPU cooler I can get behind. Those fins are going to clog up with so much poo poo so fast.
|
# ? Jun 6, 2019 03:40 |
|
craig588 posted:You can build them yourself for much cheaper. Out of curiosity, is this on a regular-use system (aka not just a benchmark rig)? Also, is condensation an issue?
|
# ? Jun 6, 2019 04:52 |
|
I'm disabled now so taking pictures is hard, but condensation is a real issue. What I used to do was pack the socket with dielectric grease to keep out any moisture. I used that cooler for about 2 years; I built mine into a horizontal case so the motherboard could lay flat. I wouldn't use it again, because now the gains from super cooling are smaller, plus all of the side issues of going sub-ambient. 5.5 GHz, but there's a startup sequence, it could get killed at any moment, and it uses an extra 400 watts? It's just not worth it to me anymore.
|
# ? Jun 6, 2019 06:17 |
|
Paul MaudDib posted:Correct, a RTX card should do that. Excellent, thanks.
|
# ? Jun 6, 2019 08:30 |
|
Silly question - is Ampere the next-gen successor to Turing?
|
# ? Jun 6, 2019 09:06 |
|
Zedsdeadbaby posted:Silly question - is Ampere the next-gen successor to Turing? It should be according to roadmaps, but it could also be a compute-focused product; like how Volta was rumored to be the Pascal successor, and then Turing came out of nowhere. Nvidia hasn't confirmed anything yet.
|
# ? Jun 6, 2019 09:30 |
|
craig588 posted:I'm disabled now so taking pictures is hard, but condensation is a real issue.
|
# ? Jun 6, 2019 14:34 |
|
Quake 2 RTX is available now: https://www.nvidia.com/en-gb/geforce/campaigns/quake-II-rtx/ https://store.steampowered.com/app/1089130 It comes with the demo levels but if you install the full version of normal Quake 2 beforehand then Q2RTX will import the full game content during installation.
|
# ? Jun 6, 2019 15:41 |
|
Well I'm getting 60fps on my 1070 with global illumination set to Low and the resolution set to 720x480
|
# ? Jun 6, 2019 15:54 |