mystes
May 31, 2006

I doubt they're actually proxying all the video data when you use their site to access your videos remotely, either, based on the very detailed instructions they have about port forwarding on their website.

other people
Jun 27, 2004
Associate Christ
i think his point is the plex software is likely tracking everything you do with it and possibly sharing that scary metadata with the company.

but i still use it lol

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

Volguus posted:

I do believe it is, since they require you to make an account there, the data that you stream gets sent to their servers (so that they can re-send it to your phone), and in general they (the corporation, their servers, not just the application itself) have a creepy insight into everything that you do with your installation. While I know I'm not a special snowflake in any way, their application/service does make me uncomfortable.

2005 was a good year, what do you have against it? If not wanting to send my movies to them makes me an old fart ... fine. :colbert:

Plex doesn't stream your video through their servers. It goes straight from your home server or PC to your device, and even if it did, it wouldn't be weird or strange.

Far more people use Plex than attach a PC to their TV.

FWIW, Plex can be set to not send identifying metadata.

Thermopyle fucked around with this message at 19:15 on Oct 8, 2017

Volguus
Mar 3, 2009
At the end of the day, isn't it simply easier to just open your media player and play the file that you want? Sure, maybe they're not spying or anything (unlikely), but from whatever I've read on their website it just seems like a lot of hassle for no benefit whatsoever (the ability to watch a movie on a 5" screen doesn't count as a benefit). And you still need video player software anyway, which on devices like consoles or smart TVs may or may not support the formats you downloaded your stuff in, which would mean re-encoding everything, which ... sigh, just why? Open up the player, play it from the NAS or wherever it is, and don't worry about it.

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

Volguus posted:

At the end of the day, isn't it simply easier to just open your media player and play the file that you want? Sure, maybe they're not spying or anything (unlikely), but from whatever I've read on their website it just seems like a lot of hassle for no benefit whatsoever (the ability to watch a movie on a 5" screen doesn't count as a benefit). And you still need video player software anyway, which on devices like consoles or smart TVs may or may not support the formats you downloaded your stuff in, which would mean re-encoding everything, which ... sigh, just why? Open up the player, play it from the NAS or wherever it is, and don't worry about it.

One of the main benefits of Plex is its simplicity. For one thing, it transcodes video/audio on the fly to whatever format your playing device supports.

Your suggestion will just not fly for most normal people.

(also, I'm not sure why you get to be the gatekeeper of whether playing on a mobile device is a benefit or not... I mean, if you don't want to use Plex, that's fine. I don't use it either. But there's nothing wrong, weird, or strange about it.)

Thermopyle fucked around with this message at 19:16 on Oct 8, 2017

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Volguus posted:

At the end of the day, isn't it simply easier to just open your media player and play the file that you want? Sure, maybe they're not spying or anything (unlikely), but from whatever I've read on their website it just seems like a lot of hassle for no benefit whatsoever (the ability to watch a movie on a 5" screen doesn't count as a benefit). And you still need video player software anyway, which on devices like consoles or smart TVs may or may not support the formats you downloaded your stuff in, which would mean re-encoding everything, which ... sigh, just why? Open up the player, play it from the NAS or wherever it is, and don't worry about it.
If I want to play my media from my phone or from work or from someone else's computer, I can go to https://plex.tv (or use an app) and log in and play my videos, for free. I can also do this on my local network by browsing to the local network address of the computer hosting the Plex server, or if I want to play content straight to my television or media center or console, I can either use a Plex-branded app or make a DLNA connection. Like Thermopyle said, on each of these connections the content is transcoded automatically. It's a huge leap in convenience compared to pointing a media player at an NFS/SMB network share, and for a free service it's incredibly good (there is a paid tier, afaik the only feature it gives you is the ability to download your content from the remote server to a local device instead of streaming it).
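If you want to sanity-check the local-network path without any apps, something like this works (sketch; assumes the default Plex port of 32400 and a made-up LAN address):
code:
# Plex web UI served straight off the local server
http://192.168.1.50:32400/web

# quick "is the server up" check; /identity should answer without logging in
curl http://192.168.1.50:32400/identity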

evol262
Nov 30, 2010
#!/usr/bin/perl
Not to mention that you don't need to set up IR on a PC, use some hacky rdp/vnc thing from your phone, or pretend a remote control with a tiny keyboard and blackberry-style roller ball is a usable substitute for a Roku remote or whatever.

And streaming your library to 5 TVs means spending $200 total on firetv sticks instead of $400 per TV on NUCs

Volguus
Mar 3, 2009
Well, I'm not particularly sure about the convenience part. Why? It does require one to make an account. Now, you have two options (like with any account):
1) Use a password manager, in which case you don't know the password and you have to have access to both the database and the application when you're on the go (since you're not using one of those web-based password managers, right?). At which point, you may as well have that drat movie on your usb/phone/tablet anyway.
2) Do like most people do and either use 1234 as the password or just reuse one, like you have done so many times in the past. At which point you just gave access to your media files to anyone who wants it.

And, to be fair, even with option 1) it is very likely that whoever wants in already has access to their database, accounts and everything. Hell, it's not like security is a thing that matters to companies (any/all of them), as the last little while has taught us. To pretend otherwise is just naive. So no, having access to your media from outside your own network is obviously not an option for me, which makes them requiring an account just to play files on your own network a blocking requirement.
With that being said, hell ... if you (people in general) want to use said service, knock yourselves out. But getting all up in arms when a stranger on the internet says that it is wrong, weird and strange ... well, it is weird, wrong and strange.

Suspicious Dish
Sep 24, 2011

2020 is the year of linux on the desktop, bro
Fun Shoe
You don't have to make an account. It nags me on startup but I still hit "skip" all the time

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

Volguus posted:

Well, I'm not particularly sure about the convenience part. Why? It does require one to make an account. Now, you have two options (like with any account):
1) Use a password manager, in which case you don't know the password and you have to have access to both the database and the application when you're on the go (since you're not using one of those web-based password managers, right?). At which point, you may as well have that drat movie on your usb/phone/tablet anyway.
2) Do like most people do and either use 1234 as the password or just reuse one, like you have done so many times in the past. At which point you just gave access to your media files to anyone who wants it.

And, to be fair, even with option 1) it is very likely that whoever wants in already has access to their database, accounts and everything. Hell, it's not like security is a thing that matters to companies (any/all of them), as the last little while has taught us. To pretend otherwise is just naive. So no, having access to your media from outside your own network is obviously not an option for me, which makes them requiring an account just to play files on your own network a blocking requirement.
With that being said, hell ... if you (people in general) want to use said service, knock yourselves out. But getting all up in arms when a stranger on the internet says that it is wrong, weird and strange ... well, it is weird, wrong and strange.
You do not need to make an account, because, like people keep telling you, it can operate inside of your local network. Do you also make arguments that people don't need televisions because all that content is on the internet, or that people don't need smartphones because there's always a computer nearby? These things exist because of convenience; you argued that it was "simply easier to just open your media player and play the file that you want", and that is demonstrably untrue.

astral
Apr 26, 2004

Volguus posted:

At the end of the day, isn't it simply easier to just open your media player and play the file that you want? Sure, maybe they're not spying or anything (unlikely), but from whatever I've read on their website it just seems like a lot of hassle for no benefit whatsoever (the ability to watch a movie on a 5" screen doesn't count as a benefit). And you still need video player software anyway, which on devices like consoles or smart TVs may or may not support the formats you downloaded your stuff in, which would mean re-encoding everything, which ... sigh, just why? Open up the player, play it from the NAS or wherever it is, and don't worry about it.

As has been posted several times, you really don't understand how Plex works at all. You don't need an account (and it doesn't nag me, but I've been waiting for the privacy policy thing to shake out with the newer version before updating, so that might be something they've recently added). You don't need separate video player software. You don't need to manually encode things (Plex handles transcoding if the device doesn't support whatever codec your file is in). Being able to watch your media on any of your TVs/devices and pick up where you left off is, quite frankly, amazing.

And it's all contained within your network - unless you want remote access, which you can also have!

astral fucked around with this message at 20:34 on Oct 8, 2017

mystes
May 31, 2006

astral posted:

As has been posted several times, you really don't understand how Plex works at all. You don't need an account (and it doesn't nag me, but I've been waiting for the privacy policy thing to shake out with the newer version before updating, so that might be something they've recently added). You don't need separate video player software. You don't need to manually encode things (Plex handles transcoding if the device doesn't support whatever codec your file is in). Being able to watch your media on any of your TVs/devices and pick up where you left off is, quite frankly, amazing.

And it's all contained within your network - unless you want remote access, which you can also have!
Also, if you're the sort of person who could set up another, equivalent solution, you probably already have the means to remotely access your network, in which case you can watch videos just by directing your browser to your plex server without setting up a plex.tv account (unless something has changed in the last few years).
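As a sketch of that (assuming you already have SSH into the box and Plex is on its default port of 32400; the names here are placeholders):
code:
# forward a local port to the Plex server at home...
ssh -L 32400:localhost:32400 you@home.example.com

# ...then point a browser on this machine at the forwarded port:
# http://localhost:32400/web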

I've also used Kodi, but it's a lot more annoying in various ways.

An Enormous Boner
Jul 12, 2009

.

An Enormous Boner
Jul 12, 2009

Why would I use ZFS instead of LVM and RAID or whatever? Where does this excitement about ZFS come from? What should I read to learn about both of these things?

mystes
May 31, 2006

An Enormous Boner posted:

Why would I use ZFS instead of LVM and RAID or whatever? Where does this excitement about ZFS come from? What should I read to learn about both of these things?
I don't care about most of the ZFS-specific features, personally, but it seems like ZFS snapshots are a lot better than LVM snapshots (it sounds like you have to be pretty careful about how you use snapshots in LVM, but with ZFS you can just make them whenever you want). Whenever I read about LVM it seems too complicated to bother with for my personal desktop use, but then I'm missing out on snapshots. I'm not quite ready to trust ZFS on Linux though.
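For comparison, the day-to-day snapshot commands look roughly like this (sketch, made-up pool/VG names):
code:
# ZFS: snapshots are instant, need no pre-reserved space, and roll back in one command
zfs snapshot tank/home@before-upgrade
zfs rollback tank/home@before-upgrade

# LVM: a snapshot is its own LV with reserved space that can fill up,
# and "rolling back" means merging it into the origin
lvcreate --size 5G --snapshot --name home-snap /dev/vg0/home
lvconvert --merge /dev/vg0/home-snap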

In terms of things like RAID, if you're already using LVM I can imagine that the duplication of features at different levels could be annoying. I think that's mostly a side effect of how LVM was designed separately from the filesystem layer in linux. While this probably simplifies the file system design, it seems from what everyone is saying that integrating these features into the filesystem is probably the future. However, because of the licensing issues, ZFS will never become the standard filesystem on linux, and BTRFS's future seems unclear, so at this point for production use I think LVM is still the only option. So, if you only want to learn one thing, you're probably still better off learning LVM I think.

mystes fucked around with this message at 22:12 on Oct 8, 2017

Thermopyle
Jul 1, 2003

...the stupid are cocksure while the intelligent are full of doubt. —Bertrand Russell

An Enormous Boner posted:

Why would I use ZFS instead of LVM and RAID or whatever? Where does this excitement about ZFS come from? What should I read to learn about both of these things?

You probably want to talk about that in the Packrats thread.

RFC2324
Jun 7, 2012

http 418

mystes posted:

So, if you only want to learn one thing, you're probably still better off learning LVM I think.

There are people who only want to learn one thing? O.o

hifi
Jul 25, 2012

An Enormous Boner posted:

Why would I use ZFS instead of LVM and RAID or whatever? Where does this excitement about ZFS come from? What should I read to learn about both of these things?

ZFS has a lot of cool stuff like send/receive and snapshots, and in my limited disaster recovery experience I thought it was a lot easier to swap in a hard drive and rebuild a dataset for the first time than with mdadm. There's a ton of secondhand crap about checksumming and self-healing and ECC RAM, but I'd say most of the features are just nice to have (e.g., how many files have you had corrupted on your regular old filesystem?). If you're starting from scratch or migrating data wholesale I'd look into it, but if you have something that works then don't bother switching for nebulous benefits or the internet hype machine unless you have something specific in mind.
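To give a flavor of the send/receive and drive-swap bits (sketch, made-up pool/disk/host names):
code:
# swap a dying disk and let ZFS resilver onto the new one
zpool replace tank /dev/sdc /dev/sdd
zpool status tank

# incremental replication of a dataset to another box
# (assumes the earlier snapshot already exists on the target)
zfs snapshot tank/media@2017-10-08
zfs send -i tank/media@2017-10-01 tank/media@2017-10-08 | ssh backuphost zfs recv backup/media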

There's an entire FreeBSD handbook chapter on ZFS that's well-written and mostly applies to any OS: https://www.freebsd.org/doc/handbook/zfs.html, and if you're looking for Linux-specific instructions then zfsonlinux probably has packages for whatever you're using, or you can compile it without much trouble or arcane errors: https://github.com/zfsonlinux/zfs/wiki/Getting-Started.

This has applied to basically anything I've ever done with linux but if you sort of know what you want to do then you can google it, which is how I have navigated through my LVM and mdadm problems. The man pages are verbose but they're there.

SamDabbers
May 26, 2003



hifi posted:

There's a ton of secondhand crap about checksumming and self-healing and ECC RAM, but I'd say most of the features are just nice to have (e.g., how many files have you had corrupted on your regular old filesystem?).

How would you know that your files aren't corrupt without the checksums and ECC ram? THEY COULD ALREADY BE CORRUPT :tinfoil:

NZAmoeba
Feb 14, 2005

It turns out it's MAN!
Hair Elf
Ok, why is this curl not working?

code:
curl --header 'Host: name.hostname.org' https://10.0.0.101
curl: (51) SSL: certificate subject name 'name.hostname.org' does not match target host name '10.0.0.101'
Same deal with -H instead of --header. CentOS 6.9 with nginx, if it makes a difference. It's been driving me insane; why is it not using the header I'm trying to give it?

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Try
code:
curl --resolve name.hostname.org:443:10.0.0.101 https://name.hostname.org/
Edit: For the reason WHY it's not working, it's probably due to SNI. The order of events is:
1) Establish TCP connection
2) Establish SSL/TLS session, using SNI (which needs the hostname)
3) Send HTTP request.

By putting "Host: name.hostname.org" into the HTTP header, you're giving the correct info for the HTTP stage 3 but it needs it earlier at stage 2. By using --resolve, you're bypassing the DNS server and telling curl directly that 10.0.0.101 resolves to name.hostname.org so curl will use that in the SSL/TLS stage.

minato fucked around with this message at 16:46 on Oct 11, 2017

NZAmoeba
Feb 14, 2005

It turns out it's MAN!
Hair Elf

minato posted:

Try
code:
curl --resolve name.hostname.org:443:10.0.0.101 https://name.hostname.org/
Edit: For the reason WHY it's not working, it's probably due to SNI. The order of events is:
1) Establish TCP connection
2) Establish SSL/TLS session, using SNI (which needs the hostname)
3) Send HTTP request.

By putting "Host: name.hostname.org" into the HTTP header, you're giving the correct info for the HTTP session but it needs it earlier at stage 2. By using --resolve, you're bypassing the DNS server and telling curl directly that 10.0.0.101 resolves to name.hostname.org so curl will use that in the SSL/TLS stage.

--resolve isn't a recognised option.
--version gives curl 7.19.7 for what it's worth

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
If Google serves me correctly, then your version of curl is from November 2009 so you have far bigger problems.

NZAmoeba
Feb 14, 2005

It turns out it's MAN!
Hair Elf

minato posted:

If Google serves me correctly, then your version of curl is from November 2009 so you have far bigger problems.

Hunh. That said, I was also trying from another host that has curl from 2013 (woo, so modern), which comes up with a similar but less detailed error message:

code:
curl --header 'Host: hostname.domain.org' https://10.0.0.101
curl: (51) Unable to communicate securely with peer: requested domain name does not match the server's certificate.

minato
Jun 7, 2004

cutty cain't hang, say 7-up.
Taco Defender
Well, yeah, for the reasons I stated before. So either:
- use "-k" to ignore the SSL issues,
- add "10.0.0.101 hostname.domain.org" to your /etc/hosts file, or
- poke around your curl man page and see if there isn't some other way to explicitly set the SNI hostname or DNS resolution (maybe look closely at --connect-to)
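Roughly what the last two look like (sketch; note --connect-to only showed up in much newer curl, 7.49 or so, so it won't be in a 2009-era man page):
code:
# /etc/hosts route: point the name at the box, then use the name in the URL
echo '10.0.0.101 hostname.domain.org' >> /etc/hosts
curl https://hostname.domain.org/

# newer curl only: keep the URL's hostname (and SNI) but connect to a specific address
curl --connect-to hostname.domain.org:443:10.0.0.101:443 https://hostname.domain.org/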

Docjowles
Apr 9, 2009

I literally had this same issue today, heh. --resolve ended up being what I needed to get curl playing nice with SNI. There's no built-in option to specify the hostname for SNI, which kind of sucks.

Super-NintendoUser
Jan 16, 2004

COWABUNGERDER COMPADRES
Soiled Meat
On the topic of SSL problems, I have a RHEL6 server out there that was scanned by my client's security team, and they found it uses OpenSSL 1.0.1e-fips, which they requested I upgrade to 1.0.2, but from GOOGLES I see that RHEL6 doesn't support 1.0.2. I downloaded the source from OpenSSL and I'm compiling it, but I'm not sure if that is the right solution. Anyone have any ideas on this?

Edit: looks like compiling from source is a bad idea.

Super-NintendoUser fucked around with this message at 15:18 on Oct 12, 2017

evol262
Nov 30, 2010
#!/usr/bin/perl
Tell your security team that Red Hat backports CVE fixes, and ask which issue they'd like addressed. Then find the right erratum and apply it.

There used to be a repo which provided modern OpenSSL for old EL distros, but I can't recall it.
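The quickest way to show a scanner the backports is the package changelog; the version string never changes, but the CVE fixes are all listed (sketch):
code:
# still reports 1.0.1e, because Red Hat doesn't rebase
rpm -q openssl

# but the changelog lists every CVE they've backported into that build
rpm -q --changelog openssl | grep CVE-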

Super-NintendoUser
Jan 16, 2004

COWABUNGERDER COMPADRES
Soiled Meat

evol262 posted:

Tell your security team that Red Hat backports CVE fixes, and ask which issue they'd like addressed. Then find the right erratum and apply it.

There used to be a repo which provided modern OpenSSL for old EL distros, but I can't recall it.

That's what I asked them; their email basically said "we scanned an open port and found it using an older, unsupported version of OpenSSL, install 1.0.2 to fix". But I responded by asking what specifically their scan reported, since you can't just telnet to an open port and ask it what OpenSSL version it's running; typically the server responds with the ciphers it supports and then you know what to fix.
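If you want to see what their scanner most likely keyed on, you can probe the port yourself (sketch, placeholder host):
code:
# list the protocols and ciphers the service actually offers
nmap --script ssl-enum-ciphers -p 443 server.example.com

# or do one handshake by hand and look at what gets negotiated
openssl s_client -connect server.example.com:443 </dev/null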

Their team doesn't do any real data gathering; I'm always getting reports from them that are sort of nonsense. Once they scanned a server, found that multiple versions of the same package were installed, and so they removed all the newer versions, leaving the oldest. This broke their production servers hardcore. I asked the reasoning for purging the latest versions, and they told me, basically, "well, we figured you weren't using the newest version, because why would you have left the older ones on there?" :facepalm:

xzzy
Mar 5, 2009

I get that from my security guys too.

"nessus told us this thing is vulnerable so fix it"

Generally I can get a pdf report out of them if I ask a couple times, but that still leaves me to play sleuth all on my own. They have zero capacity to help with resolution.

We don't run RHEL6; all our stuff is CentOS/Scientific Linux, but the packages they provide come straight from Red Hat and our systems pass all security scans... so it's possible all you need is a yum update.
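Something like this will tell you whether there's anything security-related left to pull (sketch; on EL6 this needs yum-plugin-security installed):
code:
# list outstanding security errata, then apply only those
yum --security check-update
yum --security update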

Docjowles
Apr 9, 2009

That is just security_scanning.txt. I think everyone has had that experience where some dipshit just clicks Scan in Nessus and sends you a 50000 page PDF of "problems" with zero context or helpful information. And somehow gets paid thousands of dollars for it.

Then you get to spend the next week responding to every high severity finding by showing them the Red Hat errata page where they backported a fix for the exploit.

Super-NintendoUser
Jan 16, 2004

COWABUNGERDER COMPADRES
Soiled Meat
This situation is sort of complex: they reported the server has OpenSSL 1.0.1, but the application behind the port they're scanning actually has its own lib folder containing 0.9.8, so even if I could get the server upgraded, the application would still bundle the older version. I need to talk to the application developers to see what they can do.
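One way to pin down exactly what that port is linking against before going to the vendor (sketch, made-up paths):
code:
# look for bundled copies of the libraries under the app's install dir
find /opt/someapp -name 'libssl*' -o -name 'libcrypto*'

# check which libssl the running process actually has mapped (12345 = the app's PID)
lsof -p 12345 | grep -E 'libssl|libcrypto'

# the version string is baked into the library itself
strings /opt/someapp/lib/libssl.so.0.9.8 | grep '^OpenSSL '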

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum
About how long does a fix for a broken package stay in the Red Hat Bugzilla ON_QA queue before it gets rolled out to the public?

evol262
Nov 30, 2010
#!/usr/bin/perl
0.9.8e is mostly immune to the bad CVEs anyway

Red Hat's bugzilla/errata process basically goes like:

  • filed - NEW
  • triaged/working on it - ASSIGNED (often skipped)
  • patch posted - POST
  • patch merged or package including patch built, depending on team - MODIFIED
  • handed over to QA - ON_QA
  • qa confirmation - VERIFIED
  • errata being pushed - RELEASE_PENDING
  • Live - CLOSED

There's also ON_DEV, and some teams use that for "working"

As for timeline, that's hard to say. The bug can be moved to ON_QA manually, or because the errata it's attached to was moved. I don't think that information is in the public bugzilla comments.

If it's an upstream bug, it was moved manually. And upstream bugs may not have a QA team to verify, so it's up to the developer (especially Fedora). In this case, it may suddenly move to CLOSED - CURRENTRELEASE

Your best bet is to look at the target milestone and then the release cadence for that product (6mo, 9mo, etc). If it's a z-stream, those mostly come out once a month, but not every bug is targeted to z-stream, especially RHEL bugs.

We'd really have to see the specific bug to guess

anthonypants
May 6, 2007

by Nyc_Tattoo
Dinosaur Gum

evol262 posted:

0.9.8e is mostly immune to the bad CVEs anyway

Red Hat's bugzilla/errata process basically goes like:

  • filed - NEW
  • triaged/working on it - ASSIGNED (often skipped)
  • patch posted - POST
  • patch merged or package including patch built, depending on team - MODIFIED
  • handed over to QA - ON_QA
  • qa confirmation - VERIFIED
  • errata being pushed - RELEASE_PENDING
  • Live - CLOSED

There's also ON_DEV, and some teams use that for "working"

As for timeline, that's hard to say. The bug can be moved to ON_QA manually, or because the errata it's attached to was moved. I don't think that information is in the public bugzilla comments.

If it's an upstream bug, it was moved manually. And upstream bugs may not have a QA team to verify, so it's up to the developer (especially Fedora). In this case, it may suddenly move to CLOSED - CURRENTRELEASE

Your best bet is to look at the target milestone and then the release cadence for that product (6mo, 9mo, etc). If it's a z-stream, those mostly come out once a month, but not every bug is targeted to z-stream, especially RHEL bugs.

We'd really have to see the specific bug to guess
It's this one, where perl(newgetopt.pl) was removed from perl-Perl4-CoreLibs and perl-Getopt-Long doesn't say it provides it either. It's a one-line change to perl-Getopt-Long's spec file, and I'd apply it myself if I didn't think it would break anything, but it's been in ON_QA for a little under a month and I just figured that, since it's such a minor change, it'd have been released by now.
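If you did want to carry it locally in the meantime, the rebuild itself is quick (sketch; the added Provides line is just my reading of the one-liner, not the official patch):
code:
# pull the source package and unpack it into ~/rpmbuild
yumdownloader --source perl-Getopt-Long
rpm -ivh perl-Getopt-Long-*.src.rpm

# add a line such as:  Provides: perl(newgetopt.pl)
# to ~/rpmbuild/SPECS/perl-Getopt-Long.spec, then rebuild
rpmbuild -ba ~/rpmbuild/SPECS/perl-Getopt-Long.spec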

evol262
Nov 30, 2010
#!/usr/bin/perl
Had to create a new account to look at this.

Unfortunately, none of the things I was hoping to suggest are public. But you can infer a lot from the lack of "Z-stream" anywhere in the bug or keywords

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.
Red Hat can be annoyingly slow with their fixes and updates. We upgraded our servers to 7.4 and ever since we've been waiting in frustration for the 'iptables -w' fix. And every now and then we run "for i in $RHEL7 ; do ssh $i 'systemctl is-enabled iptables && systemctl is-failed iptables && systemctl restart iptables' ; done". Hope you weren't running Docker on the server.

Even more annoying are the security updates. A new critical kernel bug is published, Ubuntu releases the update in a couple of hours, and Red Hat is right behind in about a week. Is there a channel to get the untested RPMs? I'd rather take my chances with an un-QA'd kernel update than shut down a general-use shell server for several days.

xzzy
Mar 5, 2009

If you need overnight fixes, RHEL isn't for you; try one of the more agile distributions. RHEL is geared more towards stability, and they're a little more cautious about pushing out updates.

That said, critical security fixes are typically available much faster. Not a couple hours fast, but they seem to show up within a couple days of the vulnerability going public.

If it's a quality-of-life update? You're gonna be waiting a long time, friend.

jre
Sep 2, 2011

To the cloud ?



Saukkis posted:

Ubuntu releases the update in a couple of hours, and Red Hat is right behind in about a week.
This is a reason never to use Ubuntu, not a reason to avoid Red Hat.

evol262
Nov 30, 2010
#!/usr/bin/perl
The speed of an urgent CVE fix depends on whether there was a coordinated release and good disclosure practice.

For zero-days, Canonical releases whatever without testing at all. We don't. For "known" vulnerabilities with responsible disclosure and a set date to go public, the fix ships at the same time as everyone else's.
