New WordPress Plugin: Multisite Administration Tools

I would like to announce the immediate availability of my new plugin, Multisite Administration Tools.  The idea behind this plugin is to give multisite administrators an easier way to keep tabs on which plugins each site has enabled, or, from the other angle, which sites have a certain plugin or theme enabled.  The plugin runs from the standard plugins folder and requires network activation.  Enjoy!

Download at:

Green Bay Packers Gameday Meals

This season, after a few games, we decided to make a “theme” meal based on the opposing team’s city of origin.  Below is the list we came up with – some cities were easier than others.  So if you’re wondering what to make for the next big game, take a look at the list below!

@ Atlanta Falcons
October 9th, 2011 at 7:20 PM
Country Fried Steak w/Biscuits and Gravy

St. Louis Rams
October 16th, 2011 at 12:00 PM
St Louis Toasted Ravioli

@ Minnesota Vikings
October 23rd, 2011 at 3:15 PM
Juicy Lucy

@ San Diego Chargers
November 6th, 2011 at 3:15 PM
California Burrito

Minnesota Vikings
November 14th, 2011 at 7:30 PM (Monday Night)
Turkey Chili

Tampa Bay Buccaneers
November 20th, 2011 at 12:00 PM
Cuban Sandwich

@ Detroit Lions
November 24th, 2011 at 11:30 AM (Thanksgiving)
Turkey – what did you expect?

@ New York Giants
December 4th, 2011 at 3:15 PM
New York Strip Steak

Oakland Raiders
December 11th, 2011 at 3:15 PM
Chicken Sandwich w/dirty rice

@ Kansas City Chiefs
December 18th, 2011 at 12:00 PM
BBQ Anything (Haven’t decided yet)

Chicago Bears
December 25th, 2011 at 7:20 PM (Christmas)
Italian Beef Sandwich

Detroit Lions
January 1st, 2012 at 12:00 PM (New Years Day)

Of course, we will be putting our own spin on any of the above linked recipes.  Hopefully we’ll also develop our own and post them on the GrillinGeeks recipe site!

Performance Tuning Lighty – Feed2JS

How do you run a site that gets roughly 3 million hits a day without killing server performance?  Well, the first step is to get rid of Apache – it is far too resource-intensive to handle the load.  Lighttpd is a lightweight web server designed to do just this!  I recently spent some time performance tuning the site and wanted to share the config tweaks here in case they help someone else.

The server we are running is a Debian Lenny 64-bit box, hosted by SoftLayer.  Before I even got to tuning Lighty, I was getting synflood errors.  I dug into it and discovered the hosting provider had put some checks in place to help prevent DoS attacks.  In “/etc/sysctl.conf” I changed the following:

net.ipv4.tcp_synack_retries = 5
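Editing sysctl.conf only takes effect at boot; the sysctl tool can read and apply the value at runtime.  A quick sketch, assuming a Linux /proc filesystem (the write commands need root, so they are shown as comments):

```shell
# Read the live value the kernel is currently using
cat /proc/sys/net/ipv4/tcp_synack_retries

# Apply the new value immediately (requires root):
#   sysctl -w net.ipv4.tcp_synack_retries=5
# Re-read everything in /etc/sysctl.conf (requires root):
#   sysctl -p
```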

Now that that problem was solved, the next thing I went to do was make sure there would not be a restriction on the number of file handles the web process could use.  To set this up, I added the following in “/etc/security/limits.conf“:

www-data        soft    nofile          4096
www-data        hard    nofile          10240

This gives the lighttpd user (www-data) a soft limit of 4096 open file handles, with a hard limit of 10240.
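To sanity-check that the limits actually apply, you can inspect the file-descriptor limits from a shell; a rough sketch:

```shell
# Show the soft and hard open-file limits for the current shell session.
# Note: the www-data entries in limits.conf only affect sessions started
# as that user, e.g. via: su - www-data -s /bin/sh -c 'ulimit -Sn'
echo "soft: $(ulimit -Sn)"
echo "hard: $(ulimit -Hn)"
```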

In order for this file to take effect, you have to enable it in PAM.  In “/etc/pam.d/su“, find and uncomment the following line:

session    required     pam_limits.so

There are two main config files that we need to work with – the first is the main config, “/etc/lighttpd/lighttpd.conf”, which does not need a lot of changes.  The following were added to tune performance:

server.max-fds = 40000
server.max-keep-alive-requests = 100
server.max-keep-alive-idle = 2
server.max-connections = 10000

I also enabled the “compress” and “expire” modules.  The compress module is using the default settings:

#### compress module
compress.cache-dir          = "/var/cache/lighttpd/compress/"
compress.filetype           = ("text/plain", "text/html", "application/x-javascript", "text/css")

Now, in order to take advantage of this, you need to enable compression in PHP.  Since this setup uses the CGI build of PHP, the file to change is “/etc/php5/cgi/php.ini“.  There is only one option to change here:

zlib.output_compression = on
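To get a feel for what output compression buys on typical HTML, here is a rough, self-contained illustration using the gzip command-line tool (the sample text and sizes are illustrative only, not measurements from the Feed2JS site):

```shell
# Build some repetitive HTML-like text, then compare raw vs gzipped size.
i=0
: > /tmp/sample.html
while [ "$i" -lt 500 ]; do
    echo '<li>list item with some repeated markup</li>' >> /tmp/sample.html
    i=$((i+1))
done
raw=$(wc -c < /tmp/sample.html)
gz=$(gzip -c /tmp/sample.html | wc -c)
echo "raw=${raw} bytes, gzipped=${gz} bytes"
```

Markup compresses extremely well, which is why enabling mod_compress plus zlib.output_compression pays off on a high-traffic site.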

In PHP, I also enabled APC for opcode caching.  I currently have this running with the following settings:


The expire module is currently configured as follows:

expire.url                  = ( "/" => "access 6 hours")

The second lighttpd config file that needs to be changed is the fastcgi config “/etc/lighttpd/conf-enabled/10-fastcgi.conf“:

fastcgi.server    = ( ".php" =>
  ((
    "bin-path" => "/usr/bin/php-cgi",
    "socket" => "/tmp/php.socket",
    "max-procs" => 4,
    "idle-timeout" => 20,
    # the bin-environment and bin-copy-environment values below are the
    # Debian defaults
    "bin-environment" => (
      "PHP_FCGI_CHILDREN" => "4",
      "PHP_FCGI_MAX_REQUESTS" => "10000"
    ),
    "bin-copy-environment" => ( "PATH", "SHELL", "USER" ),
    "broken-scriptfilename" => "enable"
  ))
)
Well, that about does it.  Hopefully the above notes and insight help others out!  I could not have done this without the help of the Lighttpd community – they are great and always willing to help!

WordPress Post Expirator 1.4 Released

The latest version of the Post Expirator plugin is now available!  I have fixed the compatibility issues with WordPress (it still works on WordPress MU), added the ability to expire posts/pages to the minute, and fixed the timezone issue so it pulls from the timezone configured in WordPress.  I have also done some initial testing on the 3.0 alpha trunk – and everything works as advertised!

The plugin can be downloaded at the WordPress Plugins Page.  Enjoy!

Battling the “Slashdot Effect”

I’ve always wondered what exactly one can do to combat high-volume traffic from Slashdot, Digg, Reddit and other popular news sites – known commonly as the Slashdot effect.  The other day, I got a first-hand opportunity to find out.  One of our clients’ sites was posted to multiple news sites – so what better way to see what works in a real-life slashdotting?  Below is the solution I ended up with; it seemed to remove a significant amount of strain from the server without costing any additional money.

As I was doing some research, I came across some creative uses of the Coral CDN – and I have previously used lighttpd for high-volume sites – so I put the two together.  For those of you that don’t know about Coral CDN, it’s a peer-to-peer content distribution network.  Taking advantage of the CDN is easy – simply append “.nyud.net” to the end of any DNS name and that’s it!  Since the site I was dealing with was fully dynamic, I did not have a lot of options that could be quickly implemented.
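The Coralized URL is just a string transformation on the hostname; a minimal sketch, using a placeholder site name:

```shell
# Insert ".nyud.net" after the hostname portion of a URL
# (www.example.com is a placeholder, not the actual client site).
url="http://www.example.com/page.html"
coral=$(printf '%s' "$url" | sed -E 's#^(http://[^/]+)#\1.nyud.net#')
echo "$coral"   # → http://www.example.com.nyud.net/page.html
```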

Once you get lighttpd set up, here are the example config bits to perform the redirect:

# make sure this isn't CoralCache requesting content
$HTTP["useragent"] !~ "^CoralWebPrx" {
    # make sure that this wasn't sent back to us from CoralCache
    $HTTP["querystring"] !~ "(^|&)coral-no-serve$" {
        # replace www.example.com with your site's hostname
        url.redirect = ( "^/.*" => "http://www.example.com.nyud.net$0" )
    }
}
The above will redirect all requests that do not come from the Coral CDN user agent.  It’s important to add this check or else you’ll get a redirect loop.  Also – make sure you enable the "rewrite" and "redirect" server modules.

For those of you that are using Apache, you can add rules similar to:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !^CoralWebPrx
RewriteRule ^(.*)$ http://%{HTTP_HOST}.nyud.net/$1 [R,L]

In the end, caching, static pages, and load balancing are your best bet.  But if you’re in a bind, lighttpd and Coral CDN will do the trick!

Woot Random Crap!

Today, USPS brought me my 13.5lbs box of random crap. 

One never knows what to expect; the contents of the box were:

  • American Tourister Photo Bag – qty 3
  • NFL Pro Form Football Talking Electronic Handheld Game – qty 2
  • Spock Star Trek Quog Action Figure – qty 2
  • Captain Kirk Star Trek Quog Action Figure – qty 2
  • Victory Wax 16oz – qty 4
  • Fact or Crap Quiz Book – qty 2
  • Woot Brown Paper Bag – qty 16
  • Flying Aces Series Model – Mitsubishi A6M2 Zero – qty 1
  • Flying Aces Series Model – Spitfire Mk. Vb – qty 1
  • Flying Aces Series Model – Messerschmitt Me 109 F – qty 1
  • Flying Aces Series Model – P-47D Thunderbolt – qty 1
  • Flying Aces Series Model – Focke-Wulf Fw 190 A-5 – qty 1
  • Flying Aces Series Model – P-51D Mustang – qty 1

More pictures available here

BTW – smartpost still sucks.

WordPress Post Expirator Bugs/Feature Requests

I have created a SourceForge project to assist in tracking the feature and bug reports that have come in via blog comments.  I did my best to populate things based on the current comments – however, please add and update as necessary.  My goal is to find time in the next few weeks to start working on some of the current issues and additional feature requests.

SourceForge Project:

Please post all new features and bugs on the SF project page instead of in blog comments!

Thanks everyone for all the feedback!
