Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fishmech posted:

Sure, if you can find some way to cram dozens of gigabytes of high speed video RAM and a massive amount of cooling onto the CPU. It's a whole lot simpler to make that sort of thing work on a separate card.

Well, that's literally the design AMD proposed for their Exascale Heterogeneous Processor concept, which IIRC was in response to some early-stage/2020-delivery RFP from one of the National Labs or something. And HBM2 lets you go up to 8-high per stack so conceptually you could easily do at least 32 GB of VRAM.

You're right about the challenge posed by heat though, that's a datacenter chip, not something you would run in your PC. You would likely want to run that chip under liquid cooling (iirc NVIDIA popped the cherry on this a while back and their Tesla rack servers use liquid cooling)

Not like an AIO is all that expensive these days, though. $100 gets you a 240mm AIO that can easily do 500W+ given a sufficiently big coldplate/heatspreader (see: R9 295X2 with its 120mm AIO).
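
Back-of-the-envelope on the capacity, as a sketch: assuming 8 Gb dies and four stacks on the package (the stack count and die density are my assumptions, not published EHP specs), the math works out like this:

code:

# Rough HBM2 capacity math.
# Assumptions: 8 Gb dies, 8-high stacks, 4 stacks on package (not published EHP specs).
GBIT_PER_DIE = 8
DIES_PER_STACK = 8
STACKS = 4

gb_per_stack = GBIT_PER_DIE * DIES_PER_STACK / 8  # gigabits -> gigabytes
total_gb = gb_per_stack * STACKS
print(f"{gb_per_stack:.0f} GB per stack, {total_gb:.0f} GB total")  # 8 GB per stack, 32 GB total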

Paul MaudDib fucked around with this message at 19:35 on Jun 15, 2017

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

eames posted:

heise.de reports that Skylake-SP (and Skylake-X) uses a mesh (like the Xeon Phi) instead of a ringbus

machine translated link:
http://translate.google.com/transla...n&langpair=auto

e: i just realised that this is the wrong topic but I'll just leave it here because of infinity fabric :colbert:
Eh, it talks about how the number of hops depends on horizontal and vertical distance. Doesn't the same apply to the ringbus? Wouldn't the mesh result in shorter paths?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Combat Pretzel posted:

Eh, it talks about how the number of hops depends on horizontal and vertical distance. Doesn't the same apply to the ringbus? Wouldn't the mesh result in shorter paths?

Ideally. The big chips with their two rings already had some pretty nasty latency going from one ring to the other. I'm curious whether OSes are NUMA-aware about where cores sit on a given chip.
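
To put rough numbers on the hop counts, here's a quick sketch comparing average hop distance on a bidirectional ring vs. a 2D mesh with idealized shortest-path routing (the 24-stop ring and 6x4 mesh are just example sizes, not Intel's actual floorplan):

code:

# Average hop count between distinct nodes: bidirectional ring vs. 2D mesh (XY routing).
# The 24-stop ring and 6x4 mesh are example sizes, not Intel's actual Skylake-SP layout.
from itertools import product

def ring_avg_hops(n):
    dists = [min(abs(a - b), n - abs(a - b))
             for a, b in product(range(n), repeat=2) if a != b]
    return sum(dists) / len(dists)

def mesh_avg_hops(rows, cols):
    nodes = list(product(range(rows), range(cols)))
    dists = [abs(r1 - r2) + abs(c1 - c2)
             for (r1, c1), (r2, c2) in product(nodes, repeat=2) if (r1, c1) != (r2, c2)]
    return sum(dists) / len(dists)

print(f"ring, 24 stops:  {ring_avg_hops(24):.2f} avg hops")
print(f"mesh, 6x4 stops: {mesh_avg_hops(6, 4):.2f} avg hops")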

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I suppose the physical layout of the LCC chips is 4x3, given that the top end part has 12 cores? --edit: Nope, 6x2. --edit: I wonder if the LCC part actually has a mesh or still uses the ringbus.

Combat Pretzel fucked around with this message at 19:43 on Jun 15, 2017

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

thechosenone posted:

so like you could use stacked memory to get a lot of data, and stitch parts together or something, like some sort of super-chip inspired by Dr. Frankenstein.

But like, I figure if 128 MB can fit, then if you can stack it, you could at least get a gigabyte or two. This would mean you'd only need a dGPU if you needed buttloads of memory beyond what would fit.

Wow, I guess I know how people manage to click on the quote button instead of the editing button now. I always thought it would be hard to miss that.

1 gigabyte of video RAM is already insufficient for replacing dedicated GPUs, let alone for future uses when such a processor would be practical. Even like 3 or 4 GB would be stretching things for something meant to replace most dedicated GPUs in use.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Lmao, iGPUs can never kill dGPUs, especially if panel manufacturers continue to push pixel counts. Like fishmech said, you run into issues with thermals, power delivery, memory management, and die space, so iGPUs are very much constrained compared to a dGPU on a given node.

It doesn't mean an iGPU can't have acceptable performance for most use cases, though, and something like Raven Ridge promises exactly that.

EmpyreanFlux fucked around with this message at 19:50 on Jun 15, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fishmech posted:

1 gigabyte of video RAM is insufficient for removing dedicated GPUs already, let alone for future uses when such a processor would be practical. Even like 3 or 4 GB would be stretching things for something meant to replace most dedicated GPUs in use.

Some games are already starting to eat more than 4 GB. 6 GB is now pretty much the minimum I'd recommend on something with 980 Ti/1070/Fury X-level performance, and 8 GB is a safer bet all around for the long term.

Ultra-quality texture packs are an easy win for image quality without much performance degradation, and we probably haven't seen the last of Doom-style megatexturing either.

NewFatMike
Jun 11, 2015

For laptop gaming, I'm sure complaints would be limited if an APU could run 900p upscaled to 1080p like consoles do. It's not outside the realm of possibility given the efficiency gains and driver tricks we've been getting lately - especially if the Maxwell-style improvements Vega is getting also apply to APUs.

thechosenone
Mar 21, 2009
Now that I'm thinking about it, isn't there some distance ratio needed for really high resolution screens to be useful?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NewFatMike posted:

For laptop gaming, I'm sure complaints would be limited if an APU could run 900p upscaled to 1080p like consoles do. It's not outside the realm of possibility given the efficiency gains and driver tricks we've been getting lately - especially if the Maxwell-style improvements Vega is getting also apply to APUs.

It would also help to do like NVIDIA is doing with Max-Q and run bigger chips that are binned/underclocked/undervolted for maximum efficiency.

Speaking of which, I won't lie, I feel like the 10-series has stolen AMD's thunder in the laptop market too. APUs made sense when the alternative was Intel integrated graphics or a lovely mobile SKU with 64 cores, but do they make sense anymore in a world where NVIDIA is putting high-end desktop chips in laptops? If a Max-Q 1070 (an underclocked 1080) gets you down to 120W for 1070-level performance, why do I need an APU?

I guess having it all in a single package is a win for design complexity vs. separate chips, which matters at the low end... but high-end graphics aren't going to magically be cheap just because they're in an APU; you will pay for the privilege.

Are the merits of having everything in a single package so great that the APU is self-evidently better than discrete chips? I don't think so personally, at least for most applications.

Paul MaudDib fucked around with this message at 20:56 on Jun 15, 2017

repiv
Aug 13, 2009

e: wrong thread

repiv fucked around with this message at 21:18 on Jun 15, 2017

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

thechosenone posted:

Now that I'm thinking about it, isn't there some distance ratio needed for really high resolution screens to be useful?

Not really.

The argument goes that you have to be sitting X close to a screen that is Y big with Z resolution before you can see the pixels, and that therefore any more resolution than that is a waste. But the human eye isn't a camera that sees pixels and frames, it's an analog system, and a certain amount of overkill will be perceived as a smoother/higher-quality image even if you can't actually see the pixels. You just need to make sure that your applications are high-DPI aware or that your OS supports resolution scaling, otherwise you will need a microscope to read text.

This is the central idea of Retina panels, and everyone agrees they look pretty good.

Also, IMO you could even go a bit farther with the overkill to enable good non-integer-scaling of resolutions (eg running 1440p on a 4K monitor). My Dell P2715Q does really well at 1440p, it doesn't have any of the artifacts that used to be common when running panels at non-native resolutions. Part of it is that Dell obviously does the scaling well, but I also think the high-PPI may be hiding any subtle artifacts that might be showing.

The same arguments get made about refresh rates, and again, the human eye is not a camera, it's an analog system. We start perceiving smooth motion at 24 fps or so, but there is a visible difference from stepping up to 60 Hz, 100 Hz, and 144 Hz (to a smaller extent). Your peripheral vision can work much faster (a biological adaption for defense), and fighter pilots have demonstrated the ability to identify silhouettes that are flashed at the equivalent of around 1/250th of a second iirc.
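
If you do want to put rough numbers on the distance thing anyway, the usual back-of-the-envelope is pixels per degree at your viewing distance, with ~60 PPD as the common 20/20-vision rule of thumb. A quick sketch (the panel sizes, distances, and the 60 PPD threshold below are just the standard example values, not hard limits):

code:

# Pixels per degree (PPD) for a panel at a given viewing distance.
# ~60 PPD is the usual rule of thumb for when 20/20 vision stops resolving pixels.
import math

def pixels_per_degree(diag_in, res_h, res_v, dist_in):
    width_in = diag_in * res_h / math.hypot(res_h, res_v)  # physical panel width
    ppi = res_h / width_in
    return ppi * 2 * dist_in * math.tan(math.radians(0.5))  # pixels per 1 degree of view

for label, diag, rh, rv in [('27" 4K', 27, 3840, 2160), ('27" 1440p', 27, 2560, 1440)]:
    for dist in (24, 36):  # example viewing distances in inches
        print(f"{label} at {dist} in: {pixels_per_degree(diag, rh, rv, dist):.0f} PPD")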

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

fishmech posted:

Sure, if you can find some way to cram dozens of gigabytes of high speed video RAM and a massive amount of cooling onto the CPU. It's a whole lot simpler to make that sort of thing work on a separate card.

Paul MaudDib posted:

Well, that's literally the design AMD proposed for their Exascale Heterogeneous Processor concept, which IIRC was in response to some early-stage/2020-delivery RFP from one of the National Labs or something. And HBM2 lets you go up to 8-high per stack so conceptually you could easily do at least 32 GB of VRAM.

You're right about the challenge posed by heat though, that's a datacenter chip, not something you would run in your PC. You would likely want to run that chip under liquid cooling (iirc NVIDIA popped the cherry on this a while back and their Tesla rack servers use liquid cooling)

Not like an AIO is all that expensive these days, though. $100 gets you a 240mm AIO that can easily do 500W+ given a sufficiently big coldplate/heatspreader (see: R9 295X2 with its 120mm AIO).

Oh yeah, on this note, the DOE just awarded a $258 million contract to AMD, Cray, HPE, IBM, Intel and Nvidia to build the first exascale computer with a target date of 2021 and a second computer online in 2023. Dunno if they're going with AMD's design or not, that's quite a mix of companies.

Paul MaudDib fucked around with this message at 21:47 on Jun 15, 2017

Subjunctive
Sep 12, 2006

✨sparkle and shine✨


Presumably AMD is doing the iGP.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Trump won't pay them for their work, and AMD will be dead for reals.

GRINDCORE MEGGIDO
Feb 28, 1985


sauer kraut posted:

Kaby Lake (i7-7700K) just clocks several hundred MHz higher.

Usually about a gigahertz

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

SwissArmyDruid posted:

Trump won't pay them for their work, and AMD will be dead for reals.

AMD's design also builds the wall, they win by default.

wargames
Mar 16, 2008

official yospos cat censor

Paul MaudDib posted:

Well, that's literally the design AMD proposed for their Exascale Heterogeneous Processor concept, which IIRC was in response to some early-stage/2020-delivery RFP from one of the National Labs or something. And HBM2 lets you go up to 8-high per stack so conceptually you could easily do at least 32 GB of VRAM.

You're right about the challenge posed by heat though, that's a datacenter chip, not something you would run in your PC. You would likely want to run that chip under liquid cooling (iirc NVIDIA popped the cherry on this a while back and their Tesla rack servers use liquid cooling)

Not like an AIO is all that expensive these days, though. $100 gets you a 240mm AIO that can easily do 500W+ given a sufficiently big coldplate/heatspreader (see: R9 295X2 with its 120mm AIO).

Motherboard on chip?

Wirth1000
May 12, 2010

#essereFerrari
Ok, I don't know what to do with this build now. Somebody suggest benchmarks or something. 1600X + B350 + 16GB RAM

wargames
Mar 16, 2008

official yospos cat censor

Wirth1000 posted:

Ok, I don't know what to do with this build now. Somebody suggest benchmarks or something. 1600X + B350 + 16GB RAM

Only thing that comes to mind to test the CPU is Cities: Skylines, but I don't know how multithreaded that is.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Wirth1000 posted:

Ok, I don't know what to do with this build now. Somebody suggest benchmarks or something. 1600X + B350 + 16GB RAM

I've been meaning to ask someone with a Ryzen system to try testing an exponent in Prime95 for me.

Download Prime95, then go to Advanced -> Test and enter "20996011" as the exponent to test. No need to finish the entire thing, just let it run for a bit and note the ms/iter and estimated completion time.

For reference that exponent took 16hr on a 2600k and 9hr 19min on a 6600k.
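
For anyone curious what that test actually does: it's a Lucas-Lehmer primality check on 2^20996011 - 1. Here's a toy sketch of the iteration; Prime95 uses FFT multiplication, so this naive version is only practical for tiny exponents, but the iteration structure (and the ms/iter idea) is the same:

code:

# Toy Lucas-Lehmer test for Mersenne numbers 2^p - 1 (what Prime95's "Test" mode runs).
# Naive bignum squaring, so only usable for small p; Prime95 does FFT multiplication.
import time

def lucas_lehmer(p):
    m = (1 << p) - 1
    s = 4
    start = time.perf_counter()
    for _ in range(p - 2):        # p - 2 squaring iterations, the count Prime95 reports against
        s = (s * s - 2) % m
    ms_per_iter = 1000 * (time.perf_counter() - start) / max(p - 2, 1)
    return s == 0, ms_per_iter

for p in (521, 2203, 4423):       # small known Mersenne prime exponents, just for demo
    prime, ms = lucas_lehmer(p)
    print(f"p={p}: {'prime' if prime else 'composite'}, {ms:.3f} ms/iter")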

MaxxBot fucked around with this message at 01:29 on Jun 16, 2017

Wirth1000
May 12, 2010

#essereFerrari

MaxxBot posted:

I've been meaning to ask someone with a Ryzen system to try testing an exponent in Prime95 for me.

Download Prime95, then go to Advanced -> Test and enter "20996011" as the exponent to test. No need to finish the entire thing, just let it run for a bit and note the ms/iter and estimated completion time.

For reference that exponent took 16hr on a 2600k and 9hr 19min on a 6600k.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Not bad for having gimped AVX, that stock or overclocked?

Wirth1000
May 12, 2010

#essereFerrari
Everything's completely stock.

Wirth1000
May 12, 2010

#essereFerrari
Alright, after a couple days of stable running I just played Overwatch for about 40 minutes, quit, and after a couple minutes the screen went red. Then a BSOD, and now it gets to the login screen, but every time I log in it starts loading Windows and blammo, this happens:

GRINDCORE MEGGIDO
Feb 28, 1985


That'll teach you to be a healslut.

Wirth1000
May 12, 2010

#essereFerrari
Hey! I was playing Lucio then Reinhardt. :mad:

Also shutting it off completely for a few seconds then starting back up seems to have resolved it.... for now!

Gonna update the BIOS and fingers crossed.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

Wirth1000 posted:

Alright, after a couple days of stable running I just played Overwatch for about 40 minutes, quit, and after a couple minutes the screen went red. Then a BSOD, and now it gets to the login screen, but every time I log in it starts loading Windows and blammo, this happens:



According to the last response here, it was a memory timing issue.

First, I'd try what he suggested and enter your memory timings manually. If it's still bad, you should update your BIOS; you can do that from within the BIOS itself.

Wirth1000
May 12, 2010

#essereFerrari

Measly Twerp posted:

According to the last response here, it was a memory timing issue.

First, I'd try what he suggested and enter your memory timings manually. If it's still bad, you should update your BIOS; you can do that from within the BIOS itself.

Well, I just finished updating the BIOS from the out-of-the-box 2.40 to the latest 2.60 from ASRock.

RAM is still showing up as 2133 instead of 2400, though.

Thanks for the link, I'll read through it.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

Wirth1000 posted:

Well, I just finished updating the BIOS from the out-of-the-box 2.40 to the latest 2.60 from ASRock.

RAM is still showing up as 2133 instead of 2400, though.

Thanks for the link, I'll read through it.

Have you enabled the XMP profile under the OC Tweaker tab? It should auto detect the speed then.

Wirth1000
May 12, 2010

#essereFerrari

Measly Twerp posted:

Have you enabled the XMP profile under the OC Tweaker tab? It should auto detect the speed then.

Yep, it's set to auto. Still showing up as 2133. RAM's in the right slots and everything, too. I mean, I guess this is one of the most common issues with Zen.

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop
Where's the current discussion of Zen IOMMU groupings? There was a beta BIOS that dropped last week about it but I haven't heard any updates on improvements to isolation.

NewFatMike
Jun 11, 2015

Paul MaudDib posted:

It would also help to do like NVIDIA is doing with Max-Q and run bigger chips that are binned/underclocked/undervolted for maximum efficiency.

Speaking of which, I won't lie, I feel like the 10-series has stolen AMD's thunder in the laptop market too. APUs made sense when the alternative was Intel integrated graphics or a lovely mobile SKU with 64 cores, but do they make sense anymore in a world where NVIDIA is putting high-end desktop chips in laptops? If a Max-Q 1070 (an underclocked 1080) gets you down to 120W for 1070-level performance, why do I need an APU?

I guess having it all in a single package is a win for design complexity vs. separate chips, which matters at the low end... but high-end graphics aren't going to magically be cheap just because they're in an APU; you will pay for the privilege.

Are the merits of having everything in a single package so great that the APU is self-evidently better than discrete chips? I don't think so personally, at least for most applications.

I'll take the wins at the lower end for a laptop. My road warrior really only has to play games somewhat competently at 1080p/medium for me to be happy as a clam and not choke to death on 3D modeling tasks. Throw in Freesync 2 in an XPS body and I'm ecstatic. I'll do my weird experiments to drive 4k60 over network at home.

This is all coming from my use case, but I'm tired of my MSI laptop. $1600 two years ago got me a good i7-5700HQ and a GTX 970M, but the body is all plastic with aluminum sheeting over it. If I can sidegrade, since the performance is exactly what I need, to a 1080p60 APU/Freesync/metal-bodied/good-battery-life thing, sign me up. Raven Ridge will essentially be the same CPU I've got now, I'm just hoping the GPU is, too.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NewFatMike posted:

I'll take the wins at the lower end for a laptop. My road warrior really only has to play games somewhat competently at 1080p/medium for me to be happy as a clam and not choke to death on 3D modeling tasks. Throw in Freesync 2 in an XPS body and I'm ecstatic. I'll do my weird experiments to drive 4k60 over network at home.

This is all coming from my use case, but I'm tired of my MSI laptop. $1600 two years ago got me a good i7-5700HQ and a GTX 970M, but the body is all plastic with aluminum sheeting over it. If I can sidegrade, since the performance is exactly what I need, to a 1080p60 APU/Freesync/metal-bodied/good-battery-life thing, sign me up. Raven Ridge will essentially be the same CPU I've got now, I'm just hoping the GPU is, too.

The other wildcard is that Intel just opened up Thunderbolt. The flood of eGPU enclosures has already begun.

The ability to go home and put your laptop on a docking station, and then have a good FreeSync/GSync monitor and a good GPU that isn't thermally limited, and not having to constantly carry around a huge desktop replacement built for sustained gaming, etc etc pretty much takes the thunder out of it as well.

By all means, buy a well-built laptop. But you can still build a lot thinner if you aren't trying to dissipate 120W+ of heat even from a binned underclocked 1080. Magnesium chassis and a nice lightweight laptop aren't mutually exclusive.

NewFatMike
Jun 11, 2015

All of this is true. I just re-summoned my more or less complete thoughts on eGPUs in the laptop thread, I might bring them over here. TL;DR: if you don't have a NAS, just set up a Windows home server that does file backups, and use the eGPU enclosure, GPU, and NAS budget to make yourself a nice home server that'll serve games over Ethernet via Steam In-Home Streaming. Then you're no longer protocol/lane bound; you get the performance you paid for when you bought your GPU, not 75-80% of it.

Thunderbolt is still good even if you go that route because then you get a one-cable docking solution for monitors, keyboards, DAC, whatever.

I also don't know how Pascal battery life is on notebooks. Hopefully it's better than Maxwell in my MSI. Battery life could be a lot less of an issue now and I wouldn't know it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
By the way I completely agree about on-the-go 1080p medium being perfectly fine for a road-warrior PC. For a laptop I really care more about durability first, CPU and battery life next. If I can plug in when I'm seriously gaming then 1080p medium on adapter would be great as far as I'm concerned, especially if I could have a nicer GPU in a dock at home.

For me personally, having a CUDA device available is nice for development, but I really don't care about it actually being a killer GPU for games; I'm talking like a mobile 1060 is overkill.

Does GSync/FreeSync work over Steam In-Home Streaming? That would be the problem there (along with needing a server fast enough to game on). All in all you might actually be better off just gaming directly on the server, in person, and forgetting about the streaming altogether.

NewFatMike
Jun 11, 2015

*sync over SiHS is something that I need to investigate. Straight up getting measurements is hard enough on a standard system. There are a few things I need to figure out, but I'm not sure how to do it (output stream from server, output stream from Link, server client latency, etc).

I'm super *duper* curious to see if there's a way to get a crazy cheap Freesync APU powered device hooked up to my TV as the client box for that. My TV has Freesync, so maybe it'll send Freesync all the way through. Probably not, though.

The SiHS thing is also more for putting games on my TV. I think there's software that'll let you just move your mouse between desktop and laptop as a kind of extended screen, so maybe that'll be fun to play with.

Maybe a sufficiently low latency desktop instance could do it? Can you pass adaptive sync through virtualized video output? That would be crazy but also nice.

Or just crazy over buy and peg your frame rate limit where you want like I'm doing for older games?

E: I'm glad you understand the laptop thing. Sturdiness is so good.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NewFatMike posted:

*sync over SiHS is something that I need to investigate. Straight up getting measurements is hard enough on a standard system. There are a few things I need to figure out, but I'm not sure how to do it (output stream from server, output stream from Link, server client latency, etc).

My personal experience is that GSync freaks SIHS the gently caress out, at least in certain titles. I've tried streaming MGSV from my desktop to my laptop; it would crash the streaming exe (not the game itself) once every couple minutes until I turned off GSync.

I don't fully understand why that would be (SIHS must be handling a variable framerate anyway, either by variable-rate video or by pulldown), but clearly something in GSync breaks the hooks that SIHS is using. I've never really tried it that intensely; I've been too busy to disassemble and clean my laptop heatsink lately, so it's overheating and I don't want to bake it. :smith:

What I'm waiting for is consoles that can output FreeSync directly. Microsoft's new XBox Scorpio has 2560 Polaris cores, more than the RX 580, which is actually not half bad for a console. If they can drive 45 fps at 1080p (or even upscaled 720p) it will be a smash hit. We need FreeSync TVs (or HDMI 2.1) badly. That is actually going to be a game-changer once it hits.

If that happens, GSync lock-in is loving over. NVIDIA will have no choice there. They will be a holdout in a premium niche of a premium market. How many console owners know what a DisplayPort is? I'm betting like less than 5%. Any bets on how many own a display device that accepts DisplayPort?

quote:

E: I'm glad you understand the laptop thing. Sturdiness is so good.

Yup. Old Thinkpad club represent. I have a CUDA-capable device (GT216M - 48 Tesla cores, DX10.1 support and everything!) and a quad-core high-power CPU, with a good keyboard and a 1600x900 15.6" display. Paid $400 four years ago and they're $200 now. Unfortunately Lenovo's lineup is super far behind the times, no Pascal GPUs to be seen :smith:

I actually do have an ExpressCard port, which in theory is basically a PCIe 1.1 x1 lane - and enclosures are available for that too, like $120 for a kit. Yeah, that sucks poo poo in terms of throughput, but can a pre-Sandy Bridge entry-level mobile quad-core really render frames fast enough to keep up with even what a totally gimped RX 480/1060 or whatever can do?
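
For scale on how gimped that link is, here's a rough one-direction bandwidth comparison using the nominal line rates and encoding overheads (approximate figures only; real-world eGPU throughput is lower still):

code:

# Approximate one-direction PCIe bandwidth: lanes * line rate * encoding efficiency / 8.
# Nominal figures only; real-world eGPU throughput is lower still.
LINKS = {
    "ExpressCard (PCIe 1.1 x1)":   (1,  2.5e9, 8 / 10),
    "Thunderbolt 3 (PCIe 3.0 x4)": (4,  8.0e9, 128 / 130),
    "Desktop slot (PCIe 3.0 x16)": (16, 8.0e9, 128 / 130),
}

for name, (lanes, line_rate, eff) in LINKS.items():
    gb_per_s = lanes * line_rate * eff / 8 / 1e9
    print(f"{name}: ~{gb_per_s:.2f} GB/s")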

If external GPUs with GSync/Freesync support are a thing now I'm pretty much just buying a new Thinkpad with a 3K or 4K screen and a Thunderbolt enclosure and settling in for 10 years of awesome expandable durable service. The upgrade increment has gotten to the point it's worth it now.

edit: correction, I forgot AMD's warp size is twice NVIDIA's; it's 2560 cores, and that's a hell of a console GPU. I bet they hit RX 470 performance at their ideal efficiency clocks.
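
Rough throughput math on that, as a sketch; the clock speeds below are example/rumored numbers I'm assuming, not confirmed specs:

code:

# Rough FP32 throughput for GCN-style GPUs: shaders * 2 ops/clock (FMA) * clock.
# Clock speeds are example/rumored figures, not confirmed specs.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

for name, shaders, mhz in [("Scorpio (rumored)", 2560, 1172),
                           ("RX 580 (boost)",    2304, 1340),
                           ("RX 470 (boost)",    2048, 1206)]:
    print(f"{name}: ~{tflops(shaders, mhz):.1f} TFLOPS FP32")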

Paul MaudDib fucked around with this message at 06:28 on Jun 16, 2017

Arzachel
May 12, 2012

NewFatMike posted:

This is all coming from my use case, but I'm tired of my MSI laptop. $1600 two years ago got me a good i7-5700HQ and a GTX 970M, but the body is all plastic with aluminum sheeting over it. If I can sidegrade, since the performance is exactly what I need, to a 1080p60 APU/Freesync/metal-bodied/good-battery-life thing, sign me up. Raven Ridge will essentially be the same CPU I've got now, I'm just hoping the GPU is, too.

:siren:Raven Ridge uses a cut-down RX 460:siren:

It's not going to compete with semi-recent 100W dGPUs.

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.

NewFatMike posted:

*sync over SiHS is something that I need to investigate. Straight up getting measurements is hard enough on a standard system. There are a few things I need to figure out, but I'm not sure how to do it (output stream from server, output stream from Link, server client latency, etc).

I'm super *duper* curious to see if there's a way to get a crazy cheap Freesync APU powered device hooked up to my TV as the client box for that. My TV has Freesync, so maybe it'll send Freesync all the way through. Probably not, though.

The SiHS thing is also more for putting games on my TV. I think there's software that'll let you just move your mouse between desktop and laptop as a kind of extended screen, so maybe that'll be fun to play with.

Maybe a sufficiently low latency desktop instance could do it? Can you pass adaptive sync through virtualized video output? That would be crazy but also nice.

Or just crazy over buy and peg your frame rate limit where you want like I'm doing for older games?

E: I'm glad you understand the laptop thing. Sturdiness is so good.

GSync on the host or the client machine? I've streamed from my PC, which has GSync running, and it did fine.
