It is usually considered a given that "private" downloads, going through Drupal, are slower than "public" downloads, which can be served directly by Apache, or whatever web server the site is running on. This is indeed true in the general case; however, for low-cost hosting, this apparent axiom needs to be revisited.
I recently had to install Drupal 6.x for a French government agency on a low-cost hosting plan. Although the site performed reasonably well considering the limitations of the chosen hosting plan, I soon noticed it was missing mod_deflate and mod_expires, which caused pages to be served uncompressed and every static file to be served without an expiration date.
And, of course, the site had quite a few images: photos on most pages, and several logos at the bottom of each page.
Now, when mod_deflate is missing, using the "Page compression" option on http://example.com/admin/settings/performance is a good workaround for the size of downloaded pages, but what about the static files?
Checking a few cheap hosting plans, I found these limitations are actually quite common. And without mod_expires, there is no way to tell Apache to serve static content with specific headers. Luckily for us, with Drupal we have a trick up our sleeve: the so-called "private" file downloads.
Gaming the system
In order to turn private downloads to our advantage, we need to understand what they bring.
Essentially, they allow files to be accessed using URLs like system/somefile.png instead of being served directly from the files directory by the web server. The system component of the URL routes the file request via system.module, which internally checks the currently selected files directory and outputs the file from PHP without making its actual location visible, after checking that no module wants to intervene by implementing hook_file_download().
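For reference, the core code path can be sketched roughly like this (paraphrased and simplified from Drupal 6's file handling, from memory; not the verbatim core source):

```php
<?php
// Rough sketch of what Drupal 6 does for a system/files/* request.
function file_download() {
  // Rebuild the relative file path from the URL components.
  $args = func_get_args();
  $filepath = implode('/', $args);

  if (file_exists(file_create_path($filepath))) {
    // Every module gets a chance to add headers, or deny access
    // by returning -1 from its hook_file_download() implementation.
    $headers = module_invoke_all('file_download', $filepath);
    if (in_array(-1, $headers)) {
      return drupal_access_denied();
    }
    if (count($headers)) {
      // Stream the file from PHP with the collected headers.
      file_transfer($filepath, $headers);
    }
  }
  return drupal_not_found();
}
```

The point to notice is `module_invoke_all('file_download', ...)`: whatever headers the hooks return are sent along with the file, which is exactly the opening we are about to exploit.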
A lot of additional work for nothing, it seems... Well, as it happens, going through this code will indeed slow the actual delivery of the files. So what's to be gained? Remember I just mentioned hook_file_download? By adding one more small module to that already heavy code path, we can attach some more data to our static files, namely additional HTTP headers: the ones mod_expires would normally be adding, and, to be really specific, an Expires: header pointing a few days in the future. See the trick?
At this point, on the first visit, the browser will receive these Expires headers as metadata for the files, and on each subsequent page view it will not request them again, unlike what happened previously, drastically reducing the total load time, just as would have been the case with mod_expires.
In this specific example, combining page compression, ETags, and this module, the Drupal 6 site gets a YSlow score of 81, up from 56 with the default Drupal 6 configuration, and pages served to non-logged-in users typically load in between 190 ms and 400 ms... except for the first one, of course.
The performance gain is immediately visible, and YSlow sums it nicely:
- empty cache (initial page): 28 HTTP requests, 145 kB total size served.
- primed cache (subsequent views): 1 HTTP request, 2.7 kB total size served.
Gimme the code!
Since this is just a small trick, a minimal amount of code will do: an unsubtle, unconfigurable, one-function module is sufficient:

/**
 * Implementation of hook_file_download().
 *
 * Add Expires headers to Apache configurations missing them.
 *
 * (Function name assumes a module named "expires"; pick your own.)
 *
 * @author Frederic G. MARAND
 * @license GPL2 and later
 */
function expires_file_download($filepath) {
  $type = file_get_mimetype($filepath);
  $ret = in_array($type, array('image/jpeg', 'image/png', 'image/gif'))
    ? array('Expires: ' . date(DATE_RFC1123, time() + 3 * 24 * 60 * 60))
    : NULL;
  return $ret;
}

// Note: example uses the OSInet coding style for readability.
// In your own code, you'll want to use the Drupal style instead.
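To turn that single function into an installable Drupal 6 module, all that is missing is a matching .info file. A minimal sketch, assuming the module is named "expires" (my hypothetical choice, not mandated above):

```ini
; expires.info -- minimal Drupal 6 module descriptor.
; Place it next to expires.module in sites/all/modules/expires/.
name = Expires
description = Adds Expires headers to private file downloads.
core = 6.x
```

Enable the module on admin/build/modules as usual, make sure the download method on admin/settings/file-system is set to "Private", and the headers start flowing.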
Do not take this for more than what it is: a kludge. The best solution is still to use a properly configured web server. Remember to check phpinfo() any time you must use a server you won't have control over. Sometimes, all it takes to obtain these extensions is some gentle nudging of the hosting company: after all, in most cases adding them will actually reduce both the load on their servers and the bandwidth used by the site. This is the happy ending which happened in this specific case some time after the site launched, so you can strive for it too.
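For comparison, once mod_expires is available, the behaviour this module emulates takes only a few lines of Apache configuration (a sketch; adjust the MIME types and lifetimes to taste):

```apache
# .htaccess -- native equivalent of the Drupal workaround above.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 3 days"
  ExpiresByType image/png  "access plus 3 days"
  ExpiresByType image/gif  "access plus 3 days"
</IfModule>
```

The `<IfModule>` guard means the file stays harmless on hosts where the module is still missing, so it can be deployed ahead of the happy ending.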
Going still further
What? Still more... tsk, tsk...
Anyway, at some point, you may be able to upgrade the hosting and obtain a better Apache configuration, maybe just as I described above. Need you revert to public downloads? Well, not really: after all, mod_rewrite will allow you to intercept these file URLs and rewrite them to that dedicated instance serving static content which you'll have set up just for that purpose. Or whatever: at any rate, it means you won't have to rummage through all the content on the site looking for broken file links just to serve them with the public method again.
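As a sketch, such an interception could look like this (the static hostname is hypothetical, and the URL prefix assumes Drupal 6's default system/files path for private downloads):

```apache
# .htaccess -- hand the "private" file URLs to a dedicated static
# host instead of letting Drupal serve them through PHP.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^system/files/(.*)$ http://static.example.com/files/$1 [R=301,L]
</IfModule>
```

Existing links in content keep working unchanged; only the server doing the work changes.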
Looks like private downloads aren't so bad after all, are they?
"Now, when mod_deflate is missing, using the "Page compression" option on http://example.com/admin/settings/performance is a good workaround for the download page size, but what about the static files ?"
I prefer to think of it the other way around: I think the page compression in D6 is superior to mod_deflate or mod_gzip, since a page is compressed just once before being cached, whereas with an Apache module, processing is required to compress it on every page load.
Nice trick with Private Files.
You make an interesting point, Dave. I guess this would need some serious benchmarking to be determined: did you check both choices and come to this conclusion, or is it just an educated guess?