Caching best practices & max-age gotchas

Posted 27 April - this post was brought to you by: procrastination!

Getting caching right yields huge performance benefits, saves bandwidth, and reduces server costs, but many sites half-arse their caching, creating race conditions that leave interdependent resources out of sync.

Pattern 1: Immutable content + long max-age

In this pattern, the content at a given URL never changes; when the content needs to change, the URL changes instead. The response carries a long max-age (e.g. a year), so the browser can keep using its cached copy without ever asking the server about it again.
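A minimal sketch of what serving such an asset could look like in PHP; the file name, path, and exact directives are illustrative assumptions, not taken from the original article:

    <?php
    // Illustrative: a versioned asset such as /assets/script-v2.3.4.js whose
    // contents never change for that URL. A one-year max-age ("immutable" is an
    // optional hint some browsers honour) lets the cache be reused without revalidation.
    header('Cache-Control: public, max-age=31536000, immutable');
    header('Content-Type: application/javascript');
    readfile(__DIR__ . '/assets/script-v2.3.4.js'); // hypothetical path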



Pattern 2: Mutable content, always server-revalidated

Here the content at a URL can change, so the browser must check with the server before using a cached copy. The server sends Cache-Control: no-cache along with a validator for the content: an ETag (a version identifier of your choosing) and/or a Last-Modified date.


Note: no-cache doesn't mean "don't cache"; it means the cached copy must be revalidated with the server before it's used. Also, must-revalidate doesn't mean "must revalidate"; it means the local resource can be used if it's younger than the provided max-age, and only otherwise must it be revalidated.

Next time the client fetches the resource, it echoes those values back for the content it already has, via If-None-Match and If-Modified-Since respectively, allowing the server to say "just use what you've already got, it's up to date", or as it spells it, "HTTP 304".
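A hedged sketch of that round trip on the server side, again in PHP; the ETag derivation and file name are assumptions for illustration:

    <?php
    // Illustrative: always revalidate, answer 304 when the client's copy is current.
    $file = __DIR__ . '/article.html';      // hypothetical mutable resource
    $etag = '"' . md5_file($file) . '"';    // any stable version identifier works

    header('Cache-Control: no-cache');
    header('ETag: ' . $etag);

    // If the client already has this exact version, send 304 and no body.
    if (($_SERVER['HTTP_IF_NONE_MATCH'] ?? '') === $etag) {
        http_response_code(304);
        exit;
    }

    header('Content-Type: text/html');
    readfile($file);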

This pattern always involves a network fetch, so it isn't as good as pattern 1 which can bypass the network entirely.

It's not uncommon to be put off by the infrastructure pattern 1 needs, yet be similarly put off by the network request pattern 2 requires, and instead go for something in the middle: a smallish max-age on mutable content, telling the browser the copy it has is also fine to use for, say, the next 10 minutes without asking the server. This is a baaaad compromise.
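Concretely, the compromise looks something like the sketch below; the value is illustrative, the point is that mutable HTML/CSS/JS gets a non-zero max-age:

    <?php
    // The tempting-but-bad middle ground: mutable content with a 10-minute max-age.
    // For up to 10 minutes the browser may reuse this copy without checking whether
    // the CSS/JS it depends on has changed underneath it.
    header('Cache-Control: must-revalidate, max-age=600');
    header('Content-Type: text/html');
    readfile(__DIR__ . '/article.html'); // hypothetical path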

This pattern can appear to work in testing but break things in the real world, and it's really difficult to track down, because what breaks things is a version mismatch between cached resources. Often, when we make significant changes to the HTML, we're likely to also change the CSS to reflect the new structure, and update the JS to cater for changes to the style and content.

These resources are interdependent, but the caching headers can't express that. If you have some pages that don't include the JS, or include different CSS, the expiry dates can get out of sync. Multiply all this together and it becomes not-unlikely that a user ends up with mismatched versions of these resources.

The result ranges from subtle glitches to entirely unusable content.

If the page is loaded as part of a refresh, browsers will always revalidate with the server, ignoring max-age. So if the user is experiencing something broken because of max-age, hitting refresh should fix it. Of course, forcing the user to do this reduces trust, as it gives the perception that your site is temperamental.

If a service worker is involved, it can extend the life of these bugs. You can hack around them in the service worker by cache-busting requests with a random number, but you could go one step further and use a build step to add a hash of the content, similar to what sw-precache does.

You can hack around poor caching in your service worker, but you're way better off fixing the root of the problem. Here I'd cache the root page using pattern 2 (server revalidation) and the rest of the resources using pattern 1 (immutable content). Each service worker update will trigger a request for the root page, but the rest of the resources will only be downloaded if their URL has changed.
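A hedged sketch of that split on the serving side, in PHP; the URL scheme, file names, and paths are illustrative assumptions:

    <?php
    // Root page: pattern 2 (mutable, always revalidated).
    // Versioned assets: pattern 1 (immutable, cached for a year).
    $uri = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

    if ($uri === '/' || $uri === '/index.html') {
        header('Cache-Control: no-cache');
        header('ETag: "' . md5_file(__DIR__ . '/index.html') . '"');
        header('Content-Type: text/html');
        readfile(__DIR__ . '/index.html');
    } elseif (preg_match('#^/assets/[a-z0-9.-]+$#i', $uri)) {
        // The file name embeds a content hash, so the response for this URL never changes.
        // (Content-Type handling omitted for brevity.)
        header('Cache-Control: public, max-age=31536000, immutable');
        readfile(__DIR__ . $uri);
    }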

readfile()

Outputs a file.

Parameters

filename: The filename being read.

use_include_path: You can use the optional second parameter and set it to true if you want to search for the file in the include_path, too.

context: A context stream resource.

Return Values

Returns the number of bytes read from the file.

Tip: A URL can be used as a filename with this function if the fopen wrappers have been enabled. See fopen() for more details on how to specify the filename, and see Supported Protocols and Wrappers for links to information about what abilities the various wrappers have, notes on their usage, and information on any predefined variables they may provide. For a description of contexts, refer to Streams. Context support was added with PHP 5.

User notes

To avoid the risk of visitors choosing for themselves which files to download by messing with the request and doing things like inserting "..", remember: it's your script, and you have full control over how it maps file requests to file names, and which requests retrieve which files. If possible, permit only the characters a-z and A-Z in the requested name, and only allow downloads from a single download folder. Basic first-day-at-school security principle, that.

For file names containing spaces, quote the filename in the Content-Disposition header.

To anyone that's had problems with readfile() reading large files into memory: the problem is not readfile() itself, it's that you have output buffering on. Just turn off output buffering immediately before the call to readfile().

A note on the smartReadFile function from gaosipov: you could swap the fread() loop for fpassthru(), but fpassthru() sends everything from the current position to the end of the file, which isn't fruitful when the request is for a small byte range of a very large file.

Another reader eventually figured out that their download problem was LeechGet intercepting the download, which in turn prevented the download from taking place.
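A hedged sketch of those points together; the folder, parameter name, and character whitelist are illustrative assumptions. Map the requested name onto one download folder, reject anything outside the whitelist, and disable output buffering before streaming:

    <?php
    // Hypothetical download endpoint: ?file=report.pdf
    $name = $_GET['file'] ?? '';

    // Whitelist the characters and pin the location to a single folder.
    if (!preg_match('/^[a-zA-Z0-9._-]+$/', $name) || strpos($name, '..') !== false) {
        http_response_code(400);
        exit;
    }
    $path = '/var/www/downloads/' . $name;   // the single "download-folder"
    if (!is_file($path)) {
        http_response_code(404);
        exit;
    }

    // Turn off output buffering so readfile() doesn't pull the file into memory.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . $name . '"');
    readfile($path);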

In response to flowbee's note: if a chunked read still exhausts memory, it's because the writers have left out the all-important flush after each read. So below is a proper chunked readfile, which isn't really readfile() at all, and should probably be cross-posted to passthru(), fopen(), and popen() just so people browsing those pages can find this information. Be sure to include the flush!
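A sketch of such a chunked readfile with the flush included; the function name and chunk size are illustrative, not the exact code from the original note:

    <?php
    // Stream a file in fixed-size chunks, flushing after each one so nothing
    // accumulates in PHP's or the web server's output buffers.
    function readfile_chunked($filename, $chunkSize = 1048576)
    {
        $handle = fopen($filename, 'rb');
        if ($handle === false) {
            return false;
        }
        $sent = 0;
        while (!feof($handle)) {
            $buffer = fread($handle, $chunkSize);
            if ($buffer === false) {
                break;
            }
            echo $buffer;
            $sent += strlen($buffer);
            // The important part: push the chunk out to the client now.
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }
        fclose($handle);
        return $sent;
    }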

Most if not all browsers will simply download files served as application/octet-stream. If you use proper MIME types and an inline Content-Disposition, browsers will have better default actions for some of them. The easiest way to deliver the file with the proper MIME type is to detect it from the file itself, as sketched below. If you are looking for a way to force-download a big file, one approach is to combine these headers with the chunked read above.
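A hedged sketch of the MIME-type part using PHP's fileinfo extension; the path is illustrative and assumed to have been validated already:

    <?php
    $path = '/var/www/downloads/picture.jpg';   // hypothetical, pre-validated path

    // Detect the real MIME type instead of hard-coding application/octet-stream.
    $finfo = finfo_open(FILEINFO_MIME_TYPE);
    $mime  = finfo_file($finfo, $path) ?: 'application/octet-stream';
    finfo_close($finfo);

    header('Content-Type: ' . $mime);
    header('Content-Length: ' . filesize($path));
    // "inline" lets the browser render it if it can; "attachment" forces a download.
    header('Content-Disposition: inline; filename="' . basename($path) . '"');
    readfile($path);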

X-Sendfile (Apache's mod_xsendfile) was the only way I found to both protect and transfer very large files (gigabytes) with PHP, and it's also proved to be much faster for basically any file. The available directives have changed since the other note on this: XSendFileAllowAbove was replaced with XSendFilePath to allow more control over access to files outside the webroot. Turn it on with "XSendFile on" and whitelist a target directory with XSendFilePath.
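A minimal sketch of that setup, assuming mod_xsendfile is installed and enabled; the directory and file name are illustrative:

    <?php
    // Apache config (e.g. in the vhost), illustrative:
    //   XSendFile on
    //   XSendFilePath /var/protected-files
    //
    // PHP only emits headers; Apache streams the file body itself.
    $file = '/var/protected-files/video.wmv';   // hypothetical, access-checked path

    header('X-Sendfile: ' . $file);
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($file) . '"');
    // Note: no readfile() here - mod_xsendfile replaces the (empty) body with the file.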

A MIME-type-independent forced download can also be produced by sending Content-Type: application/octet-stream together with an attachment Content-Disposition. Using pieces of the forced-download script, adding in MySQL database functions, and hiding the file location for security was what we needed for downloading wmv files of our members' creations without prompting Media Player, as well as securing the file itself and driving everything from database queries.
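A hedged sketch of that database-driven approach; the table, columns, and connection details are invented for illustration and would need to match your own schema:

    <?php
    // Hypothetical schema: media(id INT, path VARCHAR, mime VARCHAR).
    // The client only ever sends an id, so the real file location stays hidden.
    $pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'secret');

    $stmt = $pdo->prepare('SELECT path, mime FROM media WHERE id = ?');
    $stmt->execute([(int) ($_GET['id'] ?? 0)]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row === false || !is_file($row['path'])) {
        http_response_code(404);
        exit;
    }

    header('Content-Type: ' . $row['mime']);
    header('Content-Length: ' . filesize($row['path']));
    header('Content-Disposition: attachment; filename="' . basename($row['path']) . '"');
    readfile($row['path']);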

Something to the effect of the sketch above works, and it's very customizable for private access, remote files, and keeping your online media in order; of course you need to set up the database, table, and columns yourself.

A separate note: I have a link on my site to a script that outputs an XML file to the browser. If the relevant browser setting is checked and browser windows are being re-used, the XML opens on top of the page where the link was clicked to access the script.

But if the setting is unchecked, the output XML opens in a new window, and another blank window with the script's address is also left open, in addition to the original window.

