A Structured & Efficient Way To Improve Your Page Speed

turbin3

I'm kind of a page speed fiend, so I figured I might as well put up a post with a few golden nuggets I've come across while educating myself on the subject and through plain trial and error. I'm sure a lot of you guys already know most of this, but if it can at least help some newbies, it was worth posting. Sorry in advance: it's practically a novel, and maybe a bit disorganized since I'm a bit delirious after work today.

While I'm not going to go into specific details as to why fast page load times are important, we all know they are a ranking factor and can directly impact your site's engagement metrics (bounce rate, exit rate, time on page, etc.). As mobile traffic becomes more and more common, page load times matter even more: a mobile device goes through several additional steps in the process of loading a web page, so a page that loads acceptably on desktop can take significantly longer on mobile. Many people go by the age-old rule of "3 seconds or less" for desktop, but keep in mind that the same page may load significantly slower on mobile, so that simply might not be good enough.

Anyways, getting to the meat and potatoes. Most of this will be from the standpoint of WordPress, since I'm guessing most of us are building and running WordPress sites pretty frequently. To start off, if I have a page I'm trying to optimize, I'll first run it through http://gtmetrix.com/dashboard.html and benchmark its existing performance. You'll get a Google PageSpeed grade as well as a YSlow grade, each with slightly different optimization recommendations. Things to take note of:
  • Page load time
  • Total page size
  • Total number of requests (HTTP requests)
In general, to improve page load times, you are looking to reduce page size, reduce the number of HTTP requests, consolidate and streamline on-page script elements, and defer parsing of the on-page resources (scripts) that impede rendering.

Reducing Page Size:

To reduce page size, there are a few main areas to focus on. Usually the quickest and easiest win in GTMetrix is the "Optimize Images" recommendation. If it's less than 100%, click on it to expand and show the details. The awesome thing with this site is that if there are images that can be further compressed, it will automatically do so with lossless compression and actually PROVIDE the optimized images for you! Super cool. So right off the bat, you can download those images, rename them identically to your existing on-page images, upload them to your server, and you've already made an impact. In all the times I've used these images, I have yet to see any decrease in image quality or resolution.

The general idea with the GTMetrix recommendations is this: for any section scoring below 100%, implement as many of the recommended changes as you can and push that section as close to 100% as possible. Depending on the section, you'll be making incremental gains in reducing page size, reducing the number of HTTP requests, improving the efficiency and parsing of on-page scripting, etc., all leading to a faster and better-rendering page.

Other areas for reducing page size are:
  • Minify CSS
  • Minify HTML
  • Minify Javascript
  • Enable gzip Compression
The quickest, easiest win here for a WordPress site is the W3 Total Cache plugin. There are some other decent ones, and some may ultimately be better, but I've found on a number of sites that simply installing and activating this plugin will usually make a substantial improvement in page load times. Keep in mind, though, that if you use a managed WordPress host, they probably already have some server-side caching in place, so W3TC may not be compatible with your host or may not perform as well. If you can use it, enable Minify under the "Minify" section of the General Settings. Also, enabling everything won't necessarily improve page load times, and in some cases can actually increase them, so you may have to play around with the settings a bit. Start off with just Minify and the other default settings to be safe.

Quick word of advice: create a GTMetrix account so you can save your benchmark runs. When you run a test, click "Page Settings" near the top left of the page and click "Save Page". Each time you feel you've made significant progress implementing changes, such as the minify changes above, click "Re-Test Page" and see what effect your changes had.

Reducing HTTP Requests:

This is probably one of the most overlooked areas. Some of my clients' sites are utterly ridiculous in how bloated their on-site scripting is, to the point of generating a ridiculous number of HTTP requests. The thing to keep in mind is that every additional request takes time you cannot afford. Also, with a lot of CMSs (don't even get me started on freakin' Drupal!) and a lot of aftermarket themes, scripting is excessive and redundant and adds a lot of overhead. These are factors you should consider when purchasing a theme: it may look great, but that probably won't matter if it's a convoluted mess "under the hood". Things that add to HTTP requests:
  • Scripting (Javascript, Jquery, etc.)
  • External styling elements (CSS)
  • Number of images
First off, scripting. Every single script your page uses adds an additional HTTP request. Where possible, try to use themes that have their scripts efficiently consolidated. A lot of themes out there are not even remotely close to "optimized for page speed" like they claim. For most uses, unless there is some fairly complex stuff going on on-page, having 15-20+ separate JS files referenced on a single page is ridiculous. If you code yourself and you're building the theme from the ground up, keep this in mind. Consolidating everything into fewer files can sometimes make the coding more complex, but it usually works out better in the long run.
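To make that concrete, here's a minimal sketch (the file names are hypothetical, not from any particular theme). Instead of the header calling a pile of separate scripts:

Code:
<script type="text/javascript" src="/js/jquery.slider.js"></script>
<script type="text/javascript" src="/js/jquery.lightbox.js"></script>
<script type="text/javascript" src="/js/menu.js"></script>
<script type="text/javascript" src="/js/analytics-helpers.js"></script>

...you concatenate and minify them into one file and reference only that, turning four requests into one:

Code:
<script type="text/javascript" src="/js/site.min.js"></script>

Just make sure the scripts end up in the same order they depended on before you combined them.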

CSS. Same thing. Consolidate your CSS files if possible. I realize this is probably more trouble than it's worth with a lot of out of the box themes, but if it's not, do it.

Number of images. This is usually one of the quicker and easier areas for reducing the number of HTTP requests. Every single, separate image adds an additional request. The quickest way to make a big impact across the whole site is to think about the common images that are present on most pages:
  • Header
  • Navigation
  • Footer
  • Social icons
  • Sidebar
Consider using a CSS sprite for things like social profile icons, as you'll cut those multiple separate files down to a single file. Also, instead of older themes that use actual images for navigational elements and menus, a lot of newer themes simply use HTML5 markup and CSS styling to accomplish the same thing, often in a better-looking manner and without adding requests for separate images. One other thing to consider: don't use the background textures that come with many themes. That's an additional image or set of images, and therefore additional requests. HTML5 and CSS can do most of what you want in a cleaner manner. The whole Swiss "flat design" style is kind of the way to go these days anyways, and background textures look so 2005. :wink:
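As a quick sketch of the "CSS instead of images" idea (class names and colors here are made up), a nav button that might otherwise be a sliced image graphic can just be plain markup plus a couple of style rules, costing zero extra requests:

Code:
<nav class="main-nav">
  <a href="/services/">Services</a>
  <a href="/about/">About</a>
</nav>

<style type="text/css">
/* Button look with no image files involved */
.main-nav a {
    display: inline-block;
    padding: 8px 16px;
    color: #fff;
    text-decoration: none;
    background: #2a6db0;
    border-radius: 4px;
}
.main-nav a:hover {
    background: #4a90d9;
}
</style>

Same visual effect as the old image-based menus, and one less set of files for the browser to fetch.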

As far as what a reasonable number of requests is, I have my own opinions, so I'll give you both ends of the spectrum; there's no real "rule" here. Some of my sites have been as low as 6-8 requests, and I usually aim for a maximum of 10-20 if possible. Sometimes, if the other areas of the site are very efficient and fast (the host, overall page size, usage of a CDN, etc.), you can get away with a lot more. Some sites have quite a bit more on-page scripting and end up with 30-50 HTTP requests while still being fast. Again, there's not really a hard and fast rule. Some of my current clients' sites are on the pretty ridiculous end of the spectrum. One previous client had something like 200-210 requests. Bounce rate was exorbitant, engagement metrics were almost non-existent, rankings sucked, and they went nowhere fast. Ultimately, they paid me to provide recommendations on the obvious, then ignored that advice and continued on into oblivion. *Pro tip*: Don't hire a professional, argue with them, and then ignore their advice, all while expecting success. Many of my current clients are in the 80-140 request range, which is extremely bloated IMO. Ideally, I'd like to see those numbers cut substantially, to 50 requests or less. It's been my experience that much beyond 40-50 requests, the total number of requests becomes a significant factor holding your page load times back.

Anyways, GTMetrix has great documentation on all of their other recommendations, so I won't go into detail on a lot of the other options (unless someone has specific questions about any of them). Some other things to keep an eye on that can help you pinpoint issues or prioritize them:

The Timeline tab shows you a timeline of how the page loads, including all of its on-page elements. Many will load simultaneously, but you will also see some that stand out and take significantly longer to load. Looking at this chart, I'll usually start with any of the major spikes to get a sense of what those elements are and how that might affect the way I approach optimizing the page. If most of the spikes are images, I might focus more heavily on image optimization: finding different images that are smaller in file size, consolidating images with a CSS sprite, etc. If it's scripting, I might look to combine scripts into fewer files, provided it's not too difficult and isn't going to break some proprietary theme design.

I'll also look at overall page size and the sizes of the individual files being loaded. If any stand out as significantly larger than the others, I might look to simply eliminate certain files from the page. For example, some CMS themes have background or unseen scripting elements present that aren't really in use. Sometimes it's a carryover from other pages of the site, and some of the scripted elements (social sharing widgets, for example) aren't actually being used at a "global level" on every page. If I come across things like that, I'll look to eliminate the scripts being called on that one page that aren't in use.

Mobile:

Lastly, try running your site through GTMetrix using a few of the different server options. Under the Page Settings section in the top left, you can choose the server location, browser, and connection type. Try setting it to a 3G connection, then re-test the page and see what changes. Any significant increase in page load time? If so, where did the increase come from? Look at the timeline and how it differs from the previous run (this is why you want to save runs). Did the overall page size change? How about the number of HTTP requests? Between desktop and mobile you may find different elements that load on both or that become an impediment. Go through the same areas for improvement with the mobile run and see if there are significant gains to be made there as well.
 
[Image: first prize, a 1992 Cadillac Eldorado]
 
[Image: second prize]


You get some steak knives... AND you get to keep your job!

The part about GTMetrix already optimizing the troublesome images for you is gold. So easy!
 

90's Caddie. Baller Status: Achieved! :wink:

Another random nugget: "Minimize Redirects" is one of the recommendations on GTM. It's something people may not think about all the time but, where possible, try to use the ultimate destination URL for any internal or external links or resources used on a page. In many cases, as the page loads and renders, redirects add to the number of HTTP requests required to load the page, increasing load times.

This applies to redirect chains as well. Due to a lack of due diligence and consistency, a lot of sites have little technical issues that cause things like this to snowball. For example, sometimes using a trailing slash on URLs and sometimes not.

The same goes for case: if your server is set up to redirect requests to resolve case differences, but you use inconsistent variations in your page code (sometimes upper or mixed case, sometimes all lower case), you're generating redirects there too.

Also HTTP vs. HTTPS. Think about anything that could result in a redirect being performed. Do you have your non-www version of your site resolved (redirected) to your www version, or vice versa? If so, you should ALWAYS be internal linking using that URL format. If your site is resolved to
Code:
http://www.example.com/
format, don't go inconsistently using a
Code:
http://example.com/subdirectory/
type format on some links, as it will result in a redirect.

Also, decide once and decide NOW: trailing slash or no trailing slash? Whatever you decide, for the love of God, use the same format every single time. I can't say enough about attention to detail here. It's the little things that kill page load times.
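If you want the server to backstop you on this, here's a minimal .htaccess sketch (assuming Apache with mod_rewrite, and assuming www plus trailing slash is the format you've chosen) that 301s stragglers to the canonical version. Treat it as a safety net, not an excuse for sloppy internal links, since every one of those redirects still costs the visitor a round trip:

Code:
<IfModule mod_rewrite.c>
RewriteEngine On

# Force the www version of the host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Add a trailing slash to anything that isn't an actual file
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ http://www.example.com/$1/ [R=301,L]
</IfModule>

Adjust for HTTPS, or for the non-www format, if that's what you've standardized on.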
 
@turbin3 This is a great post for everyone looking to really speed up their websites. I am a speed nut as well, and can add some things to the OP from my experiences and tests.

I'll try to chime in on the subjects in the same order you laid them out, to keep things consistent.

Before I go into images, CSS, JS, etc., I'll mention this: HOSTING
I recently moved many websites away from shared hosting to SSD hosting on a VPS. This helped some of my websites tremendously, on top of the optimizations described above. I try to buy the best hosting I can afford; that way I know my uptimes are going to be stellar and it's one less thing to worry about.

Images -
As mentioned above, images should be optimized as much as possible, whether through GTmetrix or Photoshop (the "Save for Web & Devices" option). Beyond that, I usually cache the heck out of images, CSS, and JavaScript. On my VPS, I set up xCache. Here's a quick rundown of how I did it on my VPS and have helped others set it up as well. I've seen 50%-70% speed increases using this; your mileage will obviously vary depending on theme, CMS, hosting, etc. But here is the rundown:
*Before you do this, run the website through GTMetrix and note the speeds, etc.*

Install xCache using EasyApache, and install Deflate from the Exhaustive Options list as well when you install xCache. After you've rebuilt your Apache install, go to Service Configuration >> Apache Configuration >> Include Editor >> Post VirtualHost Include, select All Versions from the dropdown, and then paste the mod_deflate.c and mod_headers.c code (listed below), one on top of the other, into the input field.

<IfModule mod_deflate.c>
#The following line is enough for .js and .css
AddOutputFilter DEFLATE js css

#The following line also enables compression by file content type, for the following list of Content-Type:s
AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml x-font/otf x-font/ttf x-font/eot

#The following lines are to avoid bugs with some browsers
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
</IfModule>
<IfModule mod_headers.c>
<FilesMatch "\.(js|css|xml|gz)$">
Header append Vary Accept-Encoding
</FilesMatch>
</IfModule>

Then, in the .htaccess for each site you're hosting on that VPS, I paste this in:

<IfModule mod_expires.c>
# Enable expirations
ExpiresActive On
# Default directive
ExpiresDefault "access plus 1 month"
# My favicon
ExpiresByType image/x-icon "access plus 1 year"
# Images
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpg "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
# CSS
ExpiresByType text/css "access plus 1 month"
# Javascript
ExpiresByType application/javascript "access plus 1 year"
</IfModule>

These directives compress the listed file types and speed up delivering the data to the browser, and the expires headers mean that when a visitor hits the site again, all images, CSS, fonts, JS, and icons will be pulled from the browser cache rather than re-served from your server. I know he mentioned using the caching plugin, but I've actually seen page speed losses using it; it really depends on your theme. If you can manually go through and combine all CSS and JS files, that reduces processing on the backend as well.

CSS & JS minification: If you're like me and want the least amount of plugins possible, I usually minify my CSS and JS files manually. I use these two sites to copy/paste code back and forth into my files; they remove all the dead space and bring the size down: http://javascript-minifier.com/ and http://cssminifier.com/
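Just to show what the minifiers actually do (a made-up rule, not from any real theme), something like this:

Code:
/* Call-to-action button */
.cta-button {
    color: #ffffff;
    background-color: #2a6db0;
    padding: 10px 20px;
    border-radius: 4px;
}

...comes back as a single line with the comments, whitespace, and line breaks stripped out:

Code:
.cta-button{color:#ffffff;background-color:#2a6db0;padding:10px 20px;border-radius:4px}

Multiply that across a few thousand lines of theme CSS and the savings add up.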

CSS sprites are a must nowadays too, as mentioned above. There are online services where you drag and drop your images, and they will actually give you the CSS code to copy and paste into your CSS files.

Page requests - the lower, the better, obviously. Browsers can only download a certain number of requests/connections simultaneously from a single host, and every browser is different; see this site for max connections per host: http://www.browserscope.org/?category=network.
Pro-Tip: One thing I've seen is that if you're using forms or certain plugins, they will load their own CSS and JavaScript files. I have been able to copy all the code from those plugin CSS and JS files, paste it into "styles.css", and then remove the reference lines from the plugins altogether, which reduces requests even further.


As I've noted in another page speed post, I've used other CMSs that are built for speed: no DB, nothing. If you are bold enough to go into unknown territory, you can significantly speed up your website, lower requests, and really dial in Time To First Byte. Here is the screenshot I posted in a different thread regarding this:
[Screenshot: page speed test results]

and more importantly, TTFB:
[Screenshot: TTFB result]


I use http://www.webpagetest.org/ to further check my websites for speed, TTFB, etc. It provides some really good info to see where you stand. From what I remember reading on Moz about TTFB, you want to try to get it below 500ms. That's what I shot for with one of my lead gen sites, and the load times are ridiculous. Try out some flat-file CMSs (Pico or Phile CMS) to really see what speed is all about.


- Themes:
I now search for themes that are lightweight (Bootstrap-based and the like). These themes tend to have a little optimization built in already and can be dialed in further.

That's all for now. I'm sure I've missed some things; I'll update and add as I remember.
 
Before I go into images, CSS, JS, etc., I'll mention this: HOSTING
I recently moved many websites away from shared hosting to SSD hosting on a VPS. This helped some of my websites tremendously, on top of the optimizations described above. I try to buy the best hosting I can afford; that way I know my uptimes are going to be stellar and it's one less thing to worry about.

That reminds me, that's one of the nice things about the Timeline feature in GTM. It can work really well for evaluating server response, the difference between hosts pre- and post-migration, etc.


These directives compress the listed file types and speed up delivering the data to the browser, and the expires headers mean that when a visitor hits the site again, all images, CSS, fonts, JS, and icons will be pulled from the browser cache rather than re-served from your server. I know he mentioned using the caching plugin, but I've actually seen page speed losses using it; it really depends on your theme. If you can manually go through and combine all CSS and JS files, that reduces processing on the backend as well.

I have definitely found that to be true. Caching plugins are always going to be a compromise, and manual implementation will usually allow you to extract maximum benefit. With W3TC, I've definitely found that enabling the wrong features for a particular site will increase page load time, sometimes substantially.

BTW, excellent advice on the xCache implementation!

CSS & JS minification: If you're like me and want the least amount of plugins possible, I usually minify my CSS and JS files manually. I use these two sites to copy/paste code back and forth into my files; they remove all the dead space and bring the size down: http://javascript-minifier.com/ and http://cssminifier.com/

Also, one thing I forgot to mention is that GTM already provides minified CSS, JS, and HTML files, just like the optimized images. It might not be perfect, though, and I've tended to do mine manually as well.


- Themes:
I now search for themes that are lightweight (Bootstrap-based and the like). These themes tend to have a little optimization built in already and can be dialed in further.

I'm probably way past the point of diminishing returns in this area. I've spent countless man-hours (I'm obsessive sometimes) browsing thousands of themes on Themeforest and elsewhere. I guess it's the hunt for that mythical "perfectly optimized" out-of-the-box theme. I'm sure there is none, but when you can find some that work well, are efficiently coded, and save you custom coding hassles, it's always nice. In the time I've wasted searching for those "perfect" themes, there is probably so much more I could have accomplished just building it myself. I definitely need to stop procrastinating and start trying some flat-file, non-DB CMSs like Pico.
 
I'm probably way past the point of diminishing returns in this area. I've spent countless man-hours (I'm obsessive sometimes) browsing thousands of themes on Themeforest and elsewhere. I guess it's the hunt for that mythical "perfectly optimized" out-of-the-box theme. I'm sure there is none, but when you can find some that work well, are efficiently coded, and save you custom coding hassles, it's always nice. In the time I've wasted searching for those "perfect" themes, there is probably so much more I could have accomplished just building it myself. I definitely need to stop procrastinating and start trying some flat-file, non-DB CMSs like Pico.

I did the same thing: scouring through loads of themes, testing, running speed tests, etc. You will get lucky once in a while, but I find that you can really dial in themes rather quickly after you've done a few. PhileCMS is the newer, updated version of Pico, and it hauls. I love it.
 
I figured I'd show an example. Ever since I found GTM a few years ago, I've basically treated page speed optimization as sort of a game, continuing to chip away at each of the recommendations and pushing each of those bars toward 100%. :wink: Anyways, here's one of the better results I've been able to achieve on one of my sites:

[Screenshot: GTMetrix results for the example site]
 
Very nice! I take it that's a WordPress site, from what it says in the top right corner?
 
Yep. Funny thing is, I went with a theme that was a bit more complicated than it needed to be, and I honestly wasted an utterly ridiculous amount of time tweaking the PHP and CSS to squeeze as much page speed out of it as I could, more for the hell of it than anything else. Even though I originally bought that theme for its admin dashboard functionality, I ended up removing just about every unnecessary bit of code from the site, thereby eliminating most of the functionality of the dashboard. LOL I think there's a lesson in there somewhere. :wink:

Just curious, does anyone have experience using flat-file CMSs in conjunction with another CMS, basically to run different aspects of a site? I still have no experience with flat-file stuff, but I'm wondering if it might be of benefit on some sites: say, a flat-file CMS for the main site (if it's a small site with mostly static pages), with WordPress in a subdirectory for easy blog functionality. I'm not sure if that would just be making things more complicated than they need to be, but it's something I've been thinking about.
 
I'm sure you can install WordPress in a subdirectory (/blog) and have it be used for a blog. I haven't attempted it, but I'm sure it's totally doable. These flat-file CMSs are very fast, and it takes very little effort to really make them blaze. I'm not sure how they would do at a large scale, but like you said, on a static site they should do just fine.
 
So I've been on an image optimization kick the past day or so. I know there are ways to do it manually, or very in-depth ways that are probably 100% perfect, but I don't have time for that. As far as relatively automated tools that do the job very well, here's what I've found.

PNGGauntlet works very well for PNGs. I honestly wouldn't use it for any other files, though, if you don't want to change file formats. It will optimize things like GIFs, but it converts them to PNGs to do so. That's fine for some uses, but I was working on a project to optimize all of the images on a server for an old site, so I had to maintain the exact same file names and extensions. On default settings, I found the compression decent enough. Out of 284MB of images on the server, optimizing only the PNGs with this program cut the overall size to 261MB. A few minutes of work, and already a nice dent in overall page size across the entire site.

As far as JPGs, GIFs, and TIFFs go, I just found and started using FileOptimizer, and it has been working fairly well. It's mostly "lossless", though on some images I do notice a very minor decrease in quality. I haven't seen a substantial decrease so far, so I think for many site images it'll work just fine, and it's fairly quick to do its thing. One word of caution: make a backup of your images. FileOptimizer replaces the existing file in its location and dumps the old version in the recycle bin. There's also a setting to skip the recycle bin, which I haven't tried, but I assume it just permanently deletes the old version. That can be a bit of a hassle, but overall I've found this one to work better than some of the other lossless compression programs out there for JPGs. It also handles large batches well. With this program I cut another 19MB in image file size, so between both programs and a few minutes' worth of work, the entire site's images were reduced by 42MB (15%).

Another useful program I came across during this process was Rename Master. One of the issues I ran into while trying out various sites and software for compression (everything from Smush.it to some manual command line tools) is that a lot of them change the optimized file's filename, appending something like "-optimized.jpg" to the end. Because of my particular situation, I couldn't afford to have a single difference in any of the filenames, so I was looking for a way to batch-edit the filenames of large groups of files.

Rename Master, although it's an old and slightly glitchy program, did the trick quickly and easily. In the main window on the right, it's a bit discreet and deceptive, but there's a navigation bar at the top. You have to click the black arrows in the bar and select your image location, or you can click to the side of the current location shown in the bar and copy/paste the file location. It's a bit odd and doesn't function quite like the Windows file manager, so that might throw you off at first.

Anyways, navigate to where your images are, check the appropriate box from the menu on the left, and click on the bar or words for that menu item to expand its options. In this case, I had to remove "-compressed." from the end of each filename, so I checked box #1, set it to "Remove" "the Phrase", and entered -compressed. Make sure you have all of the files you want to edit selected in the main window on the right, then click the "Rename" button and it will take care of things pretty quickly. It's a glitchy GUI, but at least for this specific usage the program worked perfectly. I can definitely recommend it for batch edits of files where File Manager just isn't sufficient.
 
@turbin3 A quick thought on optimizing images. You mentioned you backed up all the images; I take it you just downloaded the files locally to have as a backup. This is a must for anyone working on images. The last thing you want is to "optimize" an image in place and not be happy with the quality.

My thought on this: what if you ran your images through the "Image Processor" feature of Photoshop (File > Scripts > Image Processor)? That way you can set your desired quality level and a subfolder to save them in, and run the images through the processor several times to get the best size/quality. Then simply re-upload all the images through FTP. This is how I have done all my image optimizing for some sites I have. I have used ImageMagick command line tools, online compression tools, and Smush.it as you mentioned above, and I noticed the best quality images with the lowest file sizes came through the Photoshop Image Processor. Just my .02! :smile:
I'm glad I'm not the only one who's OCD about speed, file sizes, and optimization. :wink:
 
Great ideas @red_devil010! I definitely need to play around with Photoshop more when it comes to this subject. It's probably been at least a year since I last used it to optimize any images.

Yes, what I was doing is downloading the files locally, then copying them to a backup folder just to be safe. I definitely recommend everyone create a backup of anything they're trying to optimize. This goes for PHP, CSS, JS, etc also. Copy pasta your way to success! :wink:

The really awesome thing about PNGGauntlet and FileOptimizer, which is definitely the reason I used them this time around, is that I had all of the images in their sub-folders to keep track of where everything was supposed to go on the server; I just copied the entire subdirectory of the site theme, image subfolders, etc. All I had to do was search in file manager for something like *.jpg or *.png, highlight everything, and drag/drop it onto the appropriate program. In PNGGauntlet I checked the box that says "Overwrite Original Files". With that selected, both programs will optimize and replace the images in their existing locations, maintaining the file structure and making it easy to determine where everything is supposed to be. I'm not sure if something similar is possible with the Photoshop Image Processor method?
 
Page speed shows you care about the quality of your website, which in turn shows the search engine that you respect the user, so you can get a bit of a boost for spots in the 10-50 range.

I love that movie, it was a nice intro, and that prize is hilarious. I also like your profile picture. :smile:
 
Creating CSS Sprites

Figured I'd add this, as I've found it to be a great tool for productivity. http://spritepad.wearekiss.com/

The subject of CSS sprites is one that is sometimes maddening. You'll hear people go on and on about it: "Oh, just convert it to a CSS sprite." Often, there's not much more of an explanation than that. Yes, there are tutorials out there, but they are often written by web developers for web developers, and the UX of those tutorials is often crap. Yes, a lot of us are web developers, but one of the endless struggles in digital marketing is continuously finding ways to increase productivity and scale your efforts. It's 2015; as far as I'm concerned, there should be easier ways than spending hours in Photoshop meticulously measuring sprite dimensions.

With SpritePad, creating a CSS sprite is about as quick and easy as it could possibly be. You simply drag and drop images onto the grid. After pressing a few buttons and a short amount of work, your sprite image is ready, along with the necessary CSS code! At that point you upload the image to your server, add the CSS to one of your existing CSS files (remember, we want to cut down on HTTP requests, so consolidate as much as possible!), and then replace the img src code on your pages with the appropriate CSS class for each image.
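For anyone who hasn't worked with sprites before, here's roughly what the end result looks like (the file name, class names, and pixel offsets below are placeholders, not actual SpritePad output):

Code:
/* All of the social icons live in one image; each class just shifts which part is visible */
.icon {
    display: inline-block;
    width: 32px;
    height: 32px;
    background: url('/images/social-sprite.png') no-repeat;
}
.icon-twitter  { background-position: 0 0; }
.icon-facebook { background-position: -32px 0; }
.icon-youtube  { background-position: -64px 0; }

Then in the HTML, instead of three separate <img> tags (three requests), you reference the classes:

Code:
<a href="https://twitter.com/yourprofile" class="icon icon-twitter"></a>
<a href="https://www.facebook.com/yourprofile" class="icon icon-facebook"></a>
<a href="https://www.youtube.com/yourprofile" class="icon icon-youtube"></a>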

[Image: CSS sprite created in SpritePad]

^
I created that in about 15 seconds. Yes, I counted. LOL
  • Auto Alignment (Premium only unfortunately)
    • This will align and condense the images you dropped on to the page
    • It will get you 95-99% of the way there. I typically find that it's not perfect, and I can play around with moving a few of the images to shave some more length or width from the image. In a pinch, you're most of the way there, though, just using the button.
    • In the above example, as the icons were all of identical dimensions, the Auto Align feature worked 100%, with no changes necessary. Win!
  • Fit Document
    • This resizes the grid based on how you've condensed the image
    • I'll typically hit Auto Align first, make a few minor tweaks to shave some more LxW, and then I'll hit Fit Document to condense the grid to the new size. Once you've done a few, this process is stupid quick and easy. No more hours spent in Photoshop...
  • Download
    • Saves a zip file with the sprite and CSS. You're almost done!

If you don't mind spending a few extra minutes, or if you only have a very few sprites to make, you can get by with the free version just fine. It's a great time-saver.

*Pro tip: BEFORE dropping images on the page, make sure you have descriptive file names. SpritePad automatically uses the file names as the CSS class names. :cool:
 
THIS IS MADNESS!: Defer Parsing & Caching Insanity

[Image: "THIS IS MADNESS!" meme]


Thought I would post this here so there are as many recommendations as possible in one place. I had the idea after someone went full retard on CCarter. It's somewhat of an unwritten thing that most of us already know, but I think people simply don't think about it sometimes.

You cannot, or should not, defer or cache everything. Some types of on-page elements or scripts need to be fired reliably (tracking scripts), refreshed frequently (constantly changing or updating content), or run in real time (real-time user interaction with on-page elements). If you cache and defer everything, there's a good chance you will break things on your site, or they will at least break not long afterwards. Exercise some discretion here.

With long page load times (which you hopefully have improved, right?), sometimes a deferred tracking script will not fire before the user bounces off the page, effectively screwing up your traffic analytics data.

If you defer all CSS and scripts, you could actually be causing a poor user experience (UX). For example, the overall page size and structure could be largely styled in CSS and could also be using JS, jQuery, etc. for initial rendering of some of the basic page structure. By deferring all of these resources indiscriminately, you could be causing the page to render in a very glitchy manner, with the page reflowing/resizing multiple times while loading all of the on-page resources. This is often compounded by accessing the page on a mobile device, or other device with a smaller browser window.

Fake it 'til You Make It

*Life Lesson: Some discrimination is actually a good thing! The alternative is indiscriminateness of thought and action. In our modern societies, it is growing increasingly more common (unfortunately) to see any and all forms of discrimination as "hate crimes". What you need to understand is, there are choices that lead to success, and there are choices that lead to failure. Some choices are better than others, so choose wisely. A Builder learns how to make better choices, which lead to success, so get to it hater! :wink:

You need to consider which CSS and script files are absolutely vital for rendering the page properly, in the quickest and smoothest manner. Serve those as early as makes sense, and defer less vital resources. Think about the basic structure of the page; things such as boxes, tables, buttons, galleries, etc. By being discriminatory and restructuring the order your page resources load, you can effectively "fake" the appearance of a quick load time by causing the most immediately visual and vital parts of your page to render immediately, while unseen resources are loaded last. Balance this initial resource loading with loading the body content as well, and you will have a really smooth and subtle page render, providing a seamless user experience.
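Here's a bare-bones sketch of what that prioritization can look like in the markup (the file names are made up, and which rules count as "critical" is entirely site-specific): the structural styles go first, and the nice-to-have scripts get the defer attribute and sit at the bottom of the page so they don't block rendering.

Code:
<head>
  <!-- Critical styles for the above-the-fold structure, inlined so they apply immediately -->
  <style type="text/css">
    .header, .nav, .hero { /* just the rules needed for the initial layout */ }
  </style>
  <link rel="stylesheet" href="/css/site.min.css">
</head>
<body>
  <!-- page content -->

  <!-- Non-essential scripts, deferred so they execute after the document is parsed -->
  <script src="/js/gallery.js" defer></script>
  <script src="/js/social-widgets.js" defer></script>
</body>

Retest after each change; as noted below, the goal is a smooth render, not just the smallest start render number.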

A great way to check this is to test your page load times with http://www.webpagetest.org/ and pay attention to the "Start Render" time. For fun, play around with deferring certain scripts, prioritizing others, and retesting your page multiple times. You may not notice any appreciable decrease in overall page load time; however, if you're able to shave several tenths of a second off your start render time, that can often make all the difference for the user. HOWEVER, be discriminating here as well. The quickest possible render time might also be glitchy if certain scripts are deferred, so don't focus purely on that start render time. Balance it against the actual appearance of the page load.

*On WebPageTest.org, click the yellow "Advanced Settings" selection in the main test window. Check the box for "Capture Video", and it will create a video of how the page actually renders. You can do the same thing on GTMetrix as well. Play around with this too, and see just how optimal you can tweak that UX.
 
Freebasing With Page Speed Optimization

[Image: base64 image encoding]


Feel free to laugh at that. It just came to mind when I thought about writing this. :wink: I'm gonna introduce you to a revolutionary new on-page optimization game changer....that's only ~17 years old. In all honesty, this is nothing new, and people have done it for years. If you've seen blended search results in Google, such as news results that have thumbnails, those are base64-encoded images. There might be a reason Google chooses to use those sometimes...

What is base64 encoding? In short, it is a method for encoding binary data and other media into text strings. Base64 is a general term, however, and can be used for many other things. Specifically, what I'm going to talk about is converting images on your site to "data URIs", which in most cases will mean base64, though data URIs can use other encodings as well.

What's the benefit?

As with all things, there are trade-offs, so this is not a magic bullet that can be used every time. More on that in a bit. Here are some of the potential benefits:
  • Reduction in HTTP requests
  • Manipulating order of page resource rendering
  • Improved parallelization of page resource loading
First off, if you use a data URI in place of an image URL, it will eliminate that HTTP request! Cool. This can give you some more options to get creative and shave those last few HTTP requests off to make that site light and fast.

In terms of how page resources render, and the order in which they render, there are several ways you can manipulate this. Generally speaking, the goal is to render the page progressively, starting with the most important and visible components of the page (top) first, and the least important (bottom, below the fold) last. This can improve how the page renders, improve user experience, and nudge users into thinking the page loads a bit faster than it does. There are several methods you can use to achieve this without base64. For example:
  • Spread images and page resources across multiple domains/subdomains
    • Due to the nature of HTTP, you can parallelize rendering of some resources (images, JS, CSS, etc.) by simply serving some of them from another domain or subdomain. Obviously, there are a LOT of different variables here. Most often, what you'll see people do is utilize a subdomain for serving some resources, such as images.example.com, or 1.example.com and 2.example.com (there's a quick sketch of what this looks like after this list).
    • In short, people have done a lot of studies on the point of diminishing returns here, and it's recommended that you stick to probably not more than 2 additional domains or subdomains. Much beyond that and there isn't much of a benefit, and in some cases it can actually add to page load times. ALSO, if you are using multiple separate domains, you need to be concerned with the servers/hosting for each domain, and the performance differences between the two.
    • If you have cookies set at a domain level, you should probably consider utilizing a second domain to serve some of the images/resources. When cookies are set at a domain level, this means every single subdomain will have cookies as well, which means that those resources may or may not be cached depending on the user's DNS/proxy/browser. Not an issue for the first pageview, but if you have lots of return visitors, it can be a big issue.
    • Another example is using a CDN, which achieves some of the same things.
  • CSS Sprites
    • There's plenty on that earlier in this thread. In short, you've reduced HTTP requests, and all of the sprite's images arrive in a single request and render together, rather than each needing its own separate request.
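To make the subdomain idea from the first bullet concrete (static.example.com here is a hypothetical cookie-free subdomain pointed at the same content, or at a CDN), the change is simply in where the resource URLs point:

Code:
<!-- Before: everything competes for the main domain's connection limit -->
<img src="http://www.example.com/images/header.jpg">
<script src="http://www.example.com/js/site.min.js"></script>

<!-- After: static assets come from a second hostname, so the browser opens extra parallel connections -->
<img src="http://static.example.com/images/header.jpg">
<script src="http://static.example.com/js/site.min.js"></script>

Remember the caveats above: keep it to one or two extra hostnames, and make sure whatever serves them is at least as fast as your main host.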
Where do we implement it?

In short, primarily for SMALL images (a good guideline is 3kb or less in many cases), you have two primary methods of implementing it:
  • Directly in the HTML, in place of the <img src> URL
    • Pros:
      • Quick to implement
    • Cons:
      • Those images will NOT be cached
      • Adds to HTML size
  • In CSS
    • Pros:
      • Images will be cached (more accurately, the CSS gets cached, and the URIs are in it)
      • Better control over image serving
      • Can add fallback image versions, such as URL for source image, in case a browser can't render the data URI
    • Cons:
      • More development time
      • Increases size of CSS file
Ultimately, implementing them in CSS is usually the ideal route, as it offers the greatest number of options for cross-platform compatibility, responsiveness, and caching. Directly in the HTML is the quick and dirty method, which can work just fine in many cases, especially when used sparingly.
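On the fallback point mentioned in the pros above: one simple, old-school way to cover browsers that can't decode data URIs (mainly older IE) is to ship the data URI in your normal CSS and override it with a plain image URL in a stylesheet that only those browsers load. A rough sketch, with hypothetical paths and class names:

Code:
<style type="text/css">
/* Modern browsers decode the embedded image; no extra request */
.icon-smile {
    width: 20px;
    height: 20px;
    background-image: url('data:image/png;base64,iVBORw0KGgo...');
}
</style>

<!--[if lte IE 7]>
<style type="text/css">
/* Old IE can't handle data URIs, so it gets the regular image instead */
.icon-smile { background-image: url('/images/icon_smile.png'); }
</style>
<![endif]-->

The conditional comment is ignored by everything except old IE, and since it comes later in the source, its rule wins there.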

There is something to keep in mind with base64 data URIs: on average, they increase file size by approximately 30% (base64 encodes every 3 bytes of image data as 4 characters of text, so a 3kb icon becomes roughly 4kb of markup). This highlights one of the reasons it may not be a great idea to use them for larger images of, say, several hundred kb. Even so, there are always exceptions, and I have seen some sites use it for large images in edge cases where it was an acceptable compromise to achieve a specific goal. This is something you'll have to experiment with, and it will benefit you to look at the numbers and figure out what's an acceptable compromise for you.

Compromise

For example, since these base64 data URIs are going to increase the size of either your HTML/PHP or your CSS, there is a point of diminishing returns here. Too many files, or too many large files, and you could be hurting your page speed. Let's say your base HTML is only 5kb. Maybe you don't have development time to spend on CSS, and you just need to fire up a base64 data URI generator, paste the output in place of your img src URLs in the HTML, and be done with it in two minutes. If you do that for a dozen small icons and images (most at 3kb and smaller), and it increases your HTML to something like 50kb, in the grand scheme of things that's still a small HTML file, and you may have gained enough speed elsewhere that it's "good enough". Here are a couple of examples of pages where I implemented this quick and dirty method:

Example 1:
  • Went from 45 HTTP requests to 24 requests
  • HTML went from 5kb to 77kb
  • +3% PageSpeed, +8% Yslow
  • Render time improved ~200ms

[Screenshot: Example 1 GTMetrix results]

Example 2:
  • Went from 43 HTTP requests to 24 requests
  • HTML went from 4.5kb to 64kb
  • +3% PageSpeed, +7% Yslow
  • Overall load time improved from 800ms to ~580ms
[Screenshot: Example 2 GTMetrix results]


What does it look like?

Standard HTML Image link:

Code:
<img src="https://www.panic.com/blog/wp-content/themes/panic/images/icon_smile2.png">

HTML Base64 Link (easy to see how this can increase page size):

Code:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAUCAYAAACNiR0NAAAACXBIWXMAAAsTAAALEwEAmpwYAAAEJklEQVQ4Ea1UW2xUVRRd9zXTefQ5bZHSYKUOg3WIDZE6LbFCqEowFDVggqKS6I/G+OGXUUz0w8iXIgpNhDQG5ceg0JZRDBqI2jJtU61UW8uElNLSQplOh+m00+l9HPe+01GJ+OdO1r377sc6+5y9zwX+Z5H+iy9QhOCWO/HEAytRX1mqVnLceMwY776C86dHcWI4gd9ul/svwpI8lL+5Hvv2NJc+m1+7SVVWboDkqbBzxdwEzCudmO0/a3zaHvvs3V68Hl/A1O2I2SbdVYA13S/IUf3rJiFip4SY6xciGRFi5ocsWGcb+TiGYzmHc3OkOUUqdqL8zHNypPaZLVXK2peBzCxgzQNC5GKzb4lSZDfgzIc5cAj9x05ffvioFZrJ2JUKZSlaea9RamnetapRDe4C0hOATjuxiNQk6EnA4AVSWZs+TQtOQ/YFUO6KFrmuJZZ/exltxGUTyv4i1La86Drguq9OgqxRMJFpaWCRiBSqMkNvk8hU0nO2dJxsC5BdMmocY8Eve4x2Os/rMrEq26qlp92VXhmSAcyPUGIMg92/4vndn+BcRyd9X7fBOtvYxzF2rGKAc5mDuVR6qPXVakgpcNDq1+iTKoSFk8d/R4XHQrjtAjbeT9slCbeN2LaTx/tQs+ZeYIHr0cG5zIE+XWWLVuFTyqExISUuxiDicRSv8OLtD4IIrCtFKmHZYP0dsrGPYzjWbp7DAZuDuLjLJZ0vOSKh7WV+2Uk9cpiAlzrroOIti0C2DNlY2C+TznXM0/GkKF1XYJE/0n4juqFlMcRbFlfjZgx6yg8XsPetOfh8BgJ3A8vuAIpKgDyy0ykgTX1KUGGTk8DgH8Rlatj7hgeYFSCOG8zFhHpkxLrweEavl70GdHch5IeOIu5OYyw2CvPSFCxjPju5mgeaexncNVVAqQGzZw/gmYc5poE4BpiLAM1fjObkQacpBrziu/0QOxvrxKGD74sfe38Rk0lT6EKIDGE8nhFnu7rFgf37xJMNa0X3YRr7fq9IfuQ0mYO5uEIjOoO+1g49/Ooqbdvm7V6MDvVg5HAPJr4AvqLjsrh1JHx81HiIBLCzCah7hJpDdbWG9TBzUIiRu3oO+ik0fP+K+nntDtcK0L/lm2M6UkMGCvJMWjZ7/XRdRjIjo7hWRdNTNF40sv0n0lc3f2zspqHuIsLF3NWz0gZunhm0og+6rNDyMqXQv0mDu0rFTUVBmsZVL9TgqtYQ3OrAugba2LCFnzsWxnYcMV6bSOE8kdE1ulW42jKfC49+uFU+NdvqNESXR4hLBUKMLYH1nzxi9ojT4BiO5RxCbqd/K0vc7MgnVAZ8WP/YamljaLV8T2W5VMr+8SkRi1y0hsIXxbnhafSyicDXKHsmpPzFTPo/hdtQRCgm0KCBrpEti/ScI8wQqDU8nbfKny8J3bgvSgspAAAAAElFTkSuQmCC">

CSS Version: (Just one example, by no means complete)

Code:
<style type="text/css">
div.image {
    width:            100px;
    height:           100px;
    background-image: url('data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAUCAYAAACNiR0NAAAACXBIWXMAAAsTAAALEwEAmpwYAAAEJklEQVQ4Ea1UW2xUVRRd9zXTefQ5bZHSYKUOg3WIDZE6LbFCqEowFDVggqKS6I/G+OGXUUz0w8iXIgpNhDQG5ceg0JZRDBqI2jJtU61UW8uElNLSQplOh+m00+l9HPe+01GJ+OdO1r377sc6+5y9zwX+Z5H+iy9QhOCWO/HEAytRX1mqVnLceMwY776C86dHcWI4gd9ul/svwpI8lL+5Hvv2NJc+m1+7SVVWboDkqbBzxdwEzCudmO0/a3zaHvvs3V68Hl/A1O2I2SbdVYA13S/IUf3rJiFip4SY6xciGRFi5ocsWGcb+TiGYzmHc3OkOUUqdqL8zHNypPaZLVXK2peBzCxgzQNC5GKzb4lSZDfgzIc5cAj9x05ffvioFZrJ2JUKZSlaea9RamnetapRDe4C0hOATjuxiNQk6EnA4AVSWZs+TQtOQ/YFUO6KFrmuJZZ/exltxGUTyv4i1La86Drguq9OgqxRMJFpaWCRiBSqMkNvk8hU0nO2dJxsC5BdMmocY8Eve4x2Os/rMrEq26qlp92VXhmSAcyPUGIMg92/4vndn+BcRyd9X7fBOtvYxzF2rGKAc5mDuVR6qPXVakgpcNDq1+iTKoSFk8d/R4XHQrjtAjbeT9slCbeN2LaTx/tQs+ZeYIHr0cG5zIE+XWWLVuFTyqExISUuxiDicRSv8OLtD4IIrCtFKmHZYP0dsrGPYzjWbp7DAZuDuLjLJZ0vOSKh7WV+2Uk9cpiAlzrroOIti0C2DNlY2C+TznXM0/GkKF1XYJE/0n4juqFlMcRbFlfjZgx6yg8XsPetOfh8BgJ3A8vuAIpKgDyy0ykgTX1KUGGTk8DgH8Rlatj7hgeYFSCOG8zFhHpkxLrweEavl70GdHch5IeOIu5OYyw2CvPSFCxjPju5mgeaexncNVVAqQGzZw/gmYc5poE4BpiLAM1fjObkQacpBrziu/0QOxvrxKGD74sfe38Rk0lT6EKIDGE8nhFnu7rFgf37xJMNa0X3YRr7fq9IfuQ0mYO5uEIjOoO+1g49/Ooqbdvm7V6MDvVg5HAPJr4AvqLjsrh1JHx81HiIBLCzCah7hJpDdbWG9TBzUIiRu3oO+ik0fP+K+nntDtcK0L/lm2M6UkMGCvJMWjZ7/XRdRjIjo7hWRdNTNF40sv0n0lc3f2zspqHuIsLF3NWz0gZunhm0og+6rNDyMqXQv0mDu0rFTUVBmsZVL9TgqtYQ3OrAugba2LCFnzsWxnYcMV6bSOE8kdE1ulW42jKfC49+uFU+NdvqNESXR4hLBUKMLYH1nzxi9ojT4BiO5RxCbqd/K0vc7MgnVAZ8WP/YamljaLV8T2W5VMr+8SkRi1y0hsIXxbnhafSyicDXKHsmpPzFTPo/hdtQRCgm0KCBrpEti/ScI8wQqDU8nbfKny8J3bgvSgspAAAAAElFTkSuQmCC');
}
</style>


How Do We Create Them?

Here's a few links to some generators:

http://www.base64-image.de/
http://www.motobit.com/util/base64-decoder-encoder.asp
http://www.freeformatter.com/base64-encoder.html

There are some better ones out there, though it seems some of the ones I normally use are down right now. There are also ways to configure your server to generate these at a server level, but I know less about that, so I'm not going to talk about it. Just be aware that the data URI will have some different parameters depending on the source image/file type. Also, the padding on the end of a data URI matters: typically this will be one or two equals signs (= or ==), which base64 uses to pad the encoded string out to a multiple of four characters, so don't strip them off.

Other Concerns

I apologize, but I've run out of time tonight, so I'm just going to mention some of the other major concerns to keep in mind if you are going to play around with data URIs:
  • Not all browsers support them! For cases like this, there are simple methods to specify fallback image sources (the URL to the original image on your server or elsewhere), and you can even specify multiples. This is worth considering if you have a site that demands high cross-platform compatibility. Here's a handy chart to see what versions of what browsers will play nice with base64 data URIs: http://caniuse.com/#feat=datauri
  • Data URI images won't show up in Google Images. There are some ways around this, but I don't have time to find them at the moment. I seem to remember it being something to do with a structured data snippet pointing to the source image URL. Keep this in mind. Most of the time you'll probably be using this for icons and other small images (especially sitewide) that are unimportant anyways.
  • DO NOT BASE64 ALL THE THINGS! Seriously, don't be utterly indiscriminate here. If you are, you'll just increase the size of your CSS or HTML ridiculously, will probably end up with slower load times in the end, plus reduced browser compatibility. Try out small images and sitewide images (social icons are perfect for this) first, experiment, and stick with what demonstrates a measurable improvement. Use GTMetrix and/or WebPageTest.org to verify.

  • USE MULTIPLE RESOURCE-SERVING METHODS. Again, in keeping with not being utterly indiscriminate, I separated this because I consider it very important. If you have a lot of images and page resources you're trying to optimize, you are going to want to use multiple methods to achieve optimal results. Use the appropriate methods for the individual resource. Here's some examples:
    • CDN: Do you need better load/geo-balancing for certain resources, like JS, CSS, and/or certain images? Consider hosting them on a CDN.
    • Cookie-free Domain/Subdomain: For some resources, a CDN might not be necessary. If you're trying to improve parallelization, do some reading on that concept and HTTP. From there, do some simple math and estimates to figure out how to divvy up resources between domains/subdomains. If, for example, I had 20 images and 10 resource files, I might choose to divide them evenly across the main domain (15) and one subdomain (15). Many modern browsers can parallelize roughly 6-8 requests per hostname, and in some cases a few more. You could also experiment with increasing this to 2 subdomains, so it's only 10 files per. This can quickly become working hard vs. smart, so keep it simple to start off with. Use a CDN and/or maybe 1 subdomain and see if that's good enough. For a small site, especially a locally-oriented site, much of this simply might not matter.
    • Base64 small images, icons, and some sitewide images. You might have larger images, but in many cases you probably don't want to use a data URI for those. Instead, simply compress all of your images as much as possible, so at least the larger images are reduced as much as they can be. More info on compression in some of the earlier posts in this thread.
    • Consider replacing some images with icons. *Golden nugget* you can base64 data URI icons as well! For things like social icons or other very small and simplistic/monotone images, you can likely cut a small amount of file size by switching to icons instead.
    • CSS sprites. These are still valid, even today. ALSO, you can base64 these as well! Just keep in mind, high resolution (especially height) or large file sizes are still an issue. If you've already previously created a CSS sprite for small and simple things like social icons (such as some of the previous posts above), you're in luck. Now you can take that one remaining HTTP request for the sprite, and eliminate it as well!
That's all I can think of for now.
 