dobesv Sun 4 Mar 2012
Just noticed a recent change to FileWeblet to support compression, and that it compresses the file dynamically.
For static files, the best practice (IMO) is to have the user provide a pre-compressed version of the file on disk. Just check if there's a file with the suffix .gz added and send that. No need to check MIME types either, since the developer can decide for themselves which files to gzip (or not).
Dynamic compression should only be used for dynamic content; otherwise it's kind of a waste of CPU.
Anyway, my suggestion...
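Something like the following sketch, in servlet-style Java rather than Fantom (the class and method names are hypothetical, not FileWeblet's actual API):

    import java.io.File;
    import java.io.IOException;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Sketch: serve a pre-compressed sibling file ("foo.js.gz") when one
    // exists and the client advertises gzip support. The Content-Type would
    // still be derived from the original file name; only Content-Encoding
    // changes.
    public class PrecompressedFiles
    {
      public void serve(File file, HttpServletRequest req, HttpServletResponse res)
        throws IOException
      {
        File gz = new File(file.getPath() + ".gz");
        String acceptEnc = req.getHeader("Accept-Encoding");

        // tell caches that the representation varies by Accept-Encoding
        res.setHeader("Vary", "Accept-Encoding");

        // naive substring check; a full implementation would parse q-values
        if (gz.exists() && acceptEnc != null && acceptEnc.contains("gzip"))
        {
          res.setHeader("Content-Encoding", "gzip");
          res.setContentLength((int)gz.length());
          java.nio.file.Files.copy(gz.toPath(), res.getOutputStream());
        }
        else
        {
          res.setContentLength((int)file.length());
          java.nio.file.Files.copy(file.toPath(), res.getOutputStream());
        }
      }
    }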
SlimerDude Sun 4 Mar 2012
gzip content compression is part of the HTTP 1.1 spec and therefore should be implemented as such. Essentially you check the Accept-Encoding HTTP header in the request to see what forms of compression the client can handle.
Many (if not most) scalable web application servers give you gzip functionality out of the box - I've never heard of CPU overhead being a concern. For example, T5 (Tapestry 5) lets you configure two properties:
tapestry.gzip-compression-enabled
tapestry.min-gzip-size
That way there's not a temptation to gzip everything!
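In T5 those are just contributions in your application module. A sketch, assuming the usual AppModule convention (the class name and values here are examples; the two property names are the real ones listed above):

    import org.apache.tapestry5.ioc.MappedConfiguration;

    public class AppModule
    {
      public static void contributeApplicationDefaults(MappedConfiguration<String, String> configuration)
      {
        // turn on gzip compression of responses
        configuration.add("tapestry.gzip-compression-enabled", "true");
        // skip responses smaller than this many bytes (example value)
        configuration.add("tapestry.min-gzip-size", "100");
      }
    }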
dobesv Sun 4 Mar 2012
Hi Slimer,
I was looking only at the dynamic compression of static files. I guess I'm always concerned about unnecessary CPU usage when it can so easily be avoided, as in this case. If the original content is dynamic then obviously you have to compress it dynamically. For static content, though, dynamic compression simply isn't necessary; it can be compressed ahead of time, just like the original content.
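Pre-compressing can be a one-off build step. A sketch in plain Java (the "static" directory and the suffix list are hypothetical examples):

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.util.zip.GZIPOutputStream;

    // Build step: write a .gz sibling next to each static text file,
    // e.g. foo.js -> foo.js.gz, so the server never compresses at runtime.
    public class Precompress
    {
      public static void main(String[] args) throws Exception
      {
        File[] files = new File("static").listFiles();
        if (files == null) return;
        for (File f : files)
        {
          String name = f.getName();
          if (!name.endsWith(".js") && !name.endsWith(".css")) continue;
          try (FileInputStream in = new FileInputStream(f);
               GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream(f.getPath() + ".gz")))
          {
            in.transferTo(out); // Java 9+; copy via a byte[] buffer on older JVMs
          }
        }
      }
    }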
brian Sun 4 Mar 2012
I looked at pre-compressing the JavaScript generated by the Fantom compiler, since those files are fairly big and always served up to a browser. I couldn't measure any noticeable performance difference, so I stuck with the simplest design.
But that doesn't mean you couldn't pre-compress your static files. That change just provides what I consider the best default behavior for text files.
SlimerDude Sun 4 Mar 2012
I'd go 3rd party, leave the hard work to others... http://developer.yahoo.com/yui/compressor/