|
At work, we have about fifty WP sites, and I manage them all with InfiniteWP. Its pricing is one-time, not a subscription, and you run it on your own server. (We needed that because some of our WP sites are internal, so a hosted service like ManageWP wouldn't be able to communicate with them.) It's alright: it does everything I need, but it's missing polish in a few places (being able to sort/alphabetize my list of sites would be swell, say). Wouldn't hurt to play with it. The only real downside, and it's pretty minor, is that installing InfiniteWP is slightly more involved than installing a WordPress site. It's not difficult by any means (if you know how to create a database manually, you're golden), but it's no famous five-minute install.
|
# ¿ Jul 25, 2013 00:28 |
|
What's the current state-of-the-art for managing lots of WordPress sites? At work, I'm responsible for about ninety sites, between test and production. We're presently using InfiniteWP, which is great for updating everything all at once, but not so great for anything else; its reporting, in particular, leaves a lot to be desired.
Since they do provide a plugin structure of sorts, I'm about to start trying to read their code and see how hard it would be to hack in the features I need. But my employer isn't overly averse to spending money if it solves real problems, so if there's another management solution that has non-poo poo reporting I'd love to see it.
|
# ¿ Feb 11, 2014 19:48 |
|
Conceptually, it's an easy enough fix: everywhere they call file_get_contents() with a URL, replace it with a few calls to the cURL functions and put the downloaded content into a variable, which is basically what file_get_contents() does anyway. Chances are the above made no sense to you if you're not a programmer. Also, it's likely that every time the vendor updates their theme, you'll have to re-do whatever fixes you make. If the theme author has any kind of support, you might want to see if they can make the change; it should be pretty easy for them to do, and it'll work not only for you but for every other customer of theirs.
|
# ¿ Mar 16, 2014 15:56 |
|
The function file_get_contents() does just what its name implies: You feed it a file, it puts the contents into a local variable. In this case, "file" can also mean "remote Web site," but your Web host turned that feature off for whatever reason. Letting developers put Web pages into file_get_contents() is a convenience. I'd argue that disabling that feature was silly on the part of your Web host. It just means that you have to put something like this (untested, don't actually use this code) in your theme's functions.php file: code:
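A sketch of what that wrapper could look like, assuming your host left the cURL extension enabled (the function name my_url_get_contents is made up, and this carries the same "untested" caveat):

```php
<?php
// Hypothetical helper for functions.php: fetch a URL with cURL, the way
// file_get_contents() would if the host hadn't disabled remote fetching.
function my_url_get_contents( $url ) {
    $ch = curl_init( $url );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true ); // return the body instead of printing it
    curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true ); // follow redirects, as file_get_contents() does
    curl_setopt( $ch, CURLOPT_TIMEOUT, 10 );          // don't hang the page render forever
    $body = curl_exec( $ch );
    curl_close( $ch );
    return $body; // false on failure, same as file_get_contents()
}
```

Then the theme's file_get_contents('http://...') calls become my_url_get_contents('http://...').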
Note that this is a very bad idea. When the folks that make the theme release a new version, their changes will overwrite yours, and you'll have to dig through the whole theme again to fix it, plus any new places they might use the feature. Your choices are, in no particular order: re-apply your patch after every theme update, get the theme author to fix it upstream, or talk your host into turning the feature back on.
|
# ¿ Mar 16, 2014 19:53 |
|
No chance of getting the server admins to fix their stuff so that the built-in WordPress updater will work? This usually means that the hosting company has a weird permissions setup, so that the Web server doesn't have rights to change stuff in the Web space. This used to be fairly common, but it's getting less so.
|
# ¿ Mar 20, 2014 02:17 |
|
Is there a way to persuade WordPress' auto-updater to update to an older version than the most-current one? Folks at my work are a bit on the conservative side, so we updated all our test sites to 3.8.3 a few days ago. Meanwhile, 3.9 came out, and I'd rather not have to do 30+ updates "manually" if I can avoid it. (I need to update the production sites to 3.8.3, not 3.9.) Assume that getting them on an update cycle that's more often than once a month is a non-starter. Took them a week to let me patch Heartbleed.
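If WP-CLI is an option on those boxes, its updater can target a specific release rather than the latest; these are real wp-cli flags, but test on one site first:

```shell
# Update core to an exact version instead of whatever is newest
wp core update --version=3.8.3
# If a site somehow got ahead (say, auto-updated to 3.9), force the downgrade
wp core update --version=3.8.3 --force
```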
|
# ¿ Apr 22, 2014 03:13 |
|
I started using that tool a few weeks ago, and that's just another awesome feature I didn't know about. I'll have to test it a bit, to see if our SELinux setup makes it choke, but that just might save me a bunch of manual work. Thanks!
|
# ¿ Apr 22, 2014 03:45 |
|
Do you mean that your site is hosted by GoDaddy? They (GoDaddy) often include email hosting in their plans. It's possible they think they are responsible for your domain's email, and since you don't have an address in their email system it's just disappearing into a black hole. Have you had problems receiving mail from other GoDaddy users?
|
# ¿ Apr 29, 2014 01:43 |
|
You're only seeing 32 servers most of the time because of the "MaxSpareServers 32" line. "ServerLimit 48" means that under heavy load there can be as many as 48 processes, but when the load drops, Apache kills them off until it gets down to 32 (or fewer). Are you using any WordPress caching (something like WP Super Cache)? If Apache is only serving up static content rather than dynamically generating it, it should be able to handle a LOT more traffic than what you're describing. I was load-testing a similar setup a couple weeks ago (4 cores, 8GB RAM) and the limiting factor ended up being the network; we couldn't generate more than about 600Mbps of test traffic, and even then the CPU was under 25% usage.
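For reference, those knobs live in the prefork MPM section of the Apache config; a sketch with the numbers described (Apache 2.2 directive names, so MaxClients rather than 2.4's MaxRequestWorkers; this is not your actual config):

```apache
<IfModule prefork.c>
    StartServers       8
    MinSpareServers    8
    MaxSpareServers   32   # idle children above this get killed off
    ServerLimit       48   # hard ceiling on child processes under load
    MaxClients        48   # must be <= ServerLimit
</IfModule>
```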
|
# ¿ May 21, 2014 22:50 |
|
I know that Jetpack has a Markdown feature, but the stored text is still just plain text (and if you accidentally use the WYSIWYG editor on a Markdown page, it ends up looking like crap, with a jumbled mix of Markdown and HTML). Beyond that, I'm not aware of an addon that does what you're looking for; it seems like it would take a pretty major overhaul of the editor.
|
# ¿ Jun 17, 2014 22:45 |
|
For doing stuff on the server itself, I prefer WP Super Cache to W3 Total Cache, but they both seem to do a good job. Maybe it's just that W3TC looks more complicated than what I need, dunno. (I'm assuming you're thinking about things within WordPress itself, not outside software like Varnish Cache.)
|
# ¿ Jul 15, 2014 20:44 |
|
At $WORK, for one very visible WordPress site, there was an incident and the server effectively died (crushed under the load, took ten minutes to log into the console). Now, we actually have two dedicated Varnish servers just for that one site, separate from the Apache server where WordPress runs. Last time we load-tested this setup, we simulated 1000 visitors per second, and things were going great right up until the F5 load-balancer broke somewhere around 700Mbps of traffic. So yeah, Varnish can help, but unless you're expecting VERY high traffic volume I'm not sure it would be worth the added complexity.
|
# ¿ Jul 15, 2014 22:23 |
|
laxbro posted:
|
> How do I increase the file upload limit in Word Press? It's capped at 2mb and I can't figure out how to do it after 30 min of googling.
|
It's not a WordPress setting, but a PHP setting on the Web server. If you're on a Linux box, you probably want to edit /etc/php.ini.
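Two php.ini directives control this; something like the following (the values are examples), followed by an Apache or PHP-FPM restart:

```ini
; Both limits matter: post_max_size caps the entire request body,
; so it needs to be at least as big as upload_max_filesize.
upload_max_filesize = 64M
post_max_size = 64M
```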
|
# ¿ Aug 30, 2014 02:37 |
|
I've been using WP-CLI for things like that. A command like "wp search-replace localhost your.domain.here" will not only update the Site URL, but also a bunch of other places where the site name lurks in the database. (I use it more for moving sites between test/stage/production environments, but it's the same principle.)
|
Edit: You may also want to do "wp search-replace 127.0.0.1 your.domain.here" just to be safe. One of my devs, and I use the term somewhat loosely, somehow manages to have both of those in her databases, in different spots.
|
Weird Uncle Dave fucked around with this message at 23:00 on Sep 12, 2014 |
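One habit worth picking up: wp search-replace has a real --dry-run flag, so you can preview the damage before writing anything:

```shell
# Show what would be replaced, without touching the database
wp search-replace 'localhost' 'your.domain.here' --dry-run
```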
# ¿ Sep 12, 2014 22:57 |
|
Someone just wanted to put a 130MB video file on one of their WordPress sites. This far exceeds the PHP file upload limit on my server, and I'm not inclined to change that, because your big video files really should go on YouTube, but politics. Anyway, I had them upload a placeholder file (I think it was a renamed PDF) with the same file name, then I uploaded the real file via FTP and had it replace the placeholder. This worked, except that the WordPress media library now thinks the file is 70K instead of 130M. It's a silly little cosmetic thing, but I'd like to fix it if I can. Is there an easy way to do so?
|
# ¿ Sep 25, 2014 16:35 |
|
Blinkz0rz posted:
|
> Anyone have suggestions for scaling Wordpress? I'm not talking about throwing more power at a single server; I'm talking about using load balancers and multizone hosting to scale on demand.
|
I'm working on something similar at my work, and I used Amazon for my proof-of-concept. I made a Web server template that mounts /etc/httpd/conf.d and /var/www/html over the network from a group of Gluster file servers. I had three Gluster instances in three different availability zones, with quorum required for writes (but not reads), and everything worked pretty well. After a couple minutes of warm-up for the Web servers, I couldn't tell any difference between that and a local file system in terms of Web server performance.
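The mount side of that template amounts to a couple of fstab entries; a sketch with made-up server and volume names (backupvolfile-server is the mount.glusterfs option that lets a client fall back to another Gluster node):

```
# Hypothetical /etc/fstab lines on each Web server instance
gluster1:/webroot   /var/www/html      glusterfs  defaults,_netdev,backupvolfile-server=gluster2  0 0
gluster1:/httpconf  /etc/httpd/conf.d  glusterfs  defaults,_netdev,backupvolfile-server=gluster2  0 0
```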
|
# ¿ Oct 5, 2014 00:35 |
|
It's not that hard to set up a few separate WordPress installations, and going that route is (IMO) more flexible. You don't have to worry about a plugin you installed for one site getting activated by someone on another site and screwing that site up. And managing multiple sites really isn't that hard if you use a tool like InfiniteWP (free, with some paid add-ons), ManageWP (monthly subscription), or similar.
|
# ¿ Oct 21, 2014 17:02 |
|
A WordPress backup is basically just "the site's content directory" plus "a dump of the database". You're already doing regular backups of your server, right? RIGHT? I have a script that dumps the relevant databases every 6 hours, then syncs the Web server and MySQL backup directories to The Cloud (TM). Given the small rate of change for my personal sites, once a day would probably be sufficient, but cloud storage and bandwidth are cheap. The service I use (Tarsnap) does versioned backups, so it's handy in case I really need to roll back a day or two for some reason.
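The script itself is nothing fancy; a trimmed-down sketch (the database name, paths, and archive naming are all placeholders, not my actual setup):

```shell
#!/bin/sh
# Cron this every 6 hours: dump the database, then ship everything offsite.
set -e
STAMP=$(date +%Y%m%d-%H%M)
mkdir -p /backup/mysql
# One mysqldump per site; --single-transaction avoids locking InnoDB tables
mysqldump --single-transaction wp_mysite > "/backup/mysql/wp_mysite-$STAMP.sql"
# Tarsnap stores each run as a separate archive, which is what makes rollback easy
tarsnap -c -f "wp-backup-$STAMP" /var/www/html /backup/mysql
```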
|
# ¿ Oct 29, 2014 19:03 |
|
Just replace all the WordPress core files with a fresh download from wordpress.org. (Basically, overwrite everything except the wp-content directory; nothing should ever touch the core files anyway.) Then, for your plugins and themes in wp-content, try to replace them with fresh versions from wherever they came from. This should limit the scope of the files you have to investigate (if you're very lucky, just a custom theme and the stuff in the uploads directory). I'm not sure how worried you need to be about the contents of the database. In theory, nothing in there should be interpreted and executed as code, but I'm far from an expert in WordPress security; hopefully someone wiser than I can chime in there.
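If WP-CLI is available on the box, it can do the core-file replacement and a checksum audit for you; these are real commands, but take a backup first:

```shell
# Re-download core files over the existing install; wp-content is left alone
wp core download --force
# Compare every core file against the official checksums from wordpress.org
wp core verify-checksums
```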
|
# ¿ Dec 15, 2014 15:49 |
|
I've got a bit of a challenge with SSL offloading that I'm not sure of the best way to resolve. (Posting in the WordPress thread because there's probably a plugin for this, even though ultimately the "right" solution is likely outside WordPress.) I'm building a new hosting environment: a couple of F5 load balancers in front of Varnish servers, which are in front of a pretty conventional LAMP server. The F5 is doing SSL offloading - it listens on both ports 80 and 443, and passes requests to the Varnish servers on port 80 (only), which in turn pass them to the LAMP servers as needed. Right now, this is implemented by giving the LAMP servers two separate VirtualHost directives, including one specifying the scheme and port, thus:
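Roughly, the VirtualHost pair looks like this (hostnames, paths, and the second listener's port are placeholders; the scheme-and-port form of ServerName is the Apache feature that makes self-referential URLs come out as https:// behind an offloader):

```apache
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/html/example
</VirtualHost>

<VirtualHost *:8080>
    # Requests that hit the F5 on 443 land here; ServerName carries the original scheme/port
    ServerName https://www.example.com:443
    UseCanonicalName On
    DocumentRoot /var/www/html/example
</VirtualHost>
```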
I've tried specifying both ServerName directives in one VirtualHost clause, and it works, but then the SSL version seems to take priority (even if you visit the insecure version of the site, all links point to the secure one). I'm fine with this, but local politics means it probably won't fly -- some of the content owners think that SSL should be reserved for "important" things, and things like brochure sites should explicitly not be served over SSL. But I can't get rid of SSL completely because it's needed for back-end authentication (via Shibboleth). I'm not sure what the best workaround for this is. I'm guessing there's a WordPress plugin that can do the needed content rewriting, but I can't find it. And I'm posting because I really hope someone has a better idea.
|
# ¿ Jan 2, 2015 21:06 |
|
That plugin looks like it does the opposite of what I need - looks like it tries to make the whole site SSL all the time. I can do that, but I need it not to display in HTTPS, unless you've explicitly requested it or are going to wp-admin.
|
# ¿ Jan 2, 2015 22:02 |
|
Anyone using HyperDB? I'm having issues with it not failing over as I'd expect, but I might just have unrealistic expectations. I've got two database servers (masterdb and slavedb) in a boring, standard master/slave configuration. HyperDB's db-config.php adds the master as read/write and the slave as read-only. As long as both servers are up, everything is fine: writes go to the master, and reads are split between the master and the slave. (The read-splitting isn't really necessary; I'm going for resilience, not speed.) But as soon as I power down masterdb, the site just stops responding. What I'd expect is for the site to still be readable, just not allowing new comments/pages/posts/et cetera. Does HyperDB not work that way, and/or does WordPress not like it when there are no writable databases?
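For reference, the relevant db-config.php entries look something like this (hosts and credentials are placeholders; add_database() is the real HyperDB call):

```php
<?php
// Master: accepts both reads and writes
$wpdb->add_database( array(
    'host'     => 'masterdb',
    'user'     => DB_USER,
    'password' => DB_PASSWORD,
    'name'     => DB_NAME,
    'write'    => 1,
    'read'     => 1,
) );
// Slave: read-only replica
$wpdb->add_database( array(
    'host'     => 'slavedb',
    'user'     => DB_USER,
    'password' => DB_PASSWORD,
    'name'     => DB_NAME,
    'write'    => 0,
    'read'     => 1,
) );
```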
|
# ¿ Feb 3, 2015 16:49 |
|
The network security folks at work just recently discovered Suhosin, and are asking if I could install it on my LAMP stack. Is WordPress going to suddenly stop working for no reason if I do this? (I'd start by running Suhosin in "simulation" mode to log errors but not act on them, of course, but I'd like to know what to expect.)
|
# ¿ Jun 8, 2015 04:06 |
|
huhu posted:
|
> Really basic question but trying to read WordPress's Wiki articles for an answer is confusing... if I install WordPress in a directory, say /website, if I download the entire directory, have I completely backed up my website?
|
No, because that directory won't have the database. You'll have to back that up separately. The combination of the two should be sufficient, though.
|
# ¿ Nov 26, 2015 05:09 |
|
I'm kinda jealous. My site provisioning process is a shell script, and my migration process is a couple more shell scripts with a manual FTP step between.
|
# ¿ Nov 29, 2015 22:27 |
|
Red Hat will continue to support PHP 5.3, backporting bug and security fixes for RHEL subscribers through November 2020. (I'd love to have all my WP stuff moved onto new RHEL7 servers by, oh, 2018 or so...) As someone in a Big Organization with Slow Stupid Policies and Change Management and Committees, I appreciate the backwards compatibility. It keeps biting my developers in the rear end, because they roll their own dev environments on their desktops with tools that aren't five years old, then get justifiably grumpy when things don't work in the proper, supported test environment, but welp.
|
# ¿ Mar 25, 2016 15:09 |
|
|
Some features of Jetpack can be used independently of a connection to dotcom, by adding a constant to your config file: https://jetpack.com/support/development-mode/ It'll put big ugly messages in the Jetpack admin pages about how you're using development mode, so if you're building a site for someone else it might look weird. But it works fine for about half of Jetpack's features.
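Per that support page, the constant goes in wp-config.php:

```php
// Jetpack's documented development-mode switch
define( 'JETPACK_DEV_DEBUG', true );
```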
|
# ¿ May 20, 2018 03:48 |