
December 04 2013

14:29

Speed Up Your Mobile Website With Varnish


  

Imagine that you have just written a post on your blog, tweeted about it and watched it get retweeted by some popular Twitter users, sending hundreds of people to your blog at once. Your excitement at seeing so many visitors talk about your post turns to dismay as they start to tweet that your website is down — a database connection error is shown.

Or perhaps you have been working hard to generate interest in your startup. One day, out of the blue, a celebrity tweets about how much they love your product. The person’s followers all seem to click at once, and many of them find that the domain isn’t responding, or when they try to sign up for the trial, the page times out. Despite your apologies on Twitter, many of the visitors move on with their day, and you lose much of the momentum of that initial tweet.

These scenarios are fairly common, and I have noticed in my own work that when content becomes popular via social networks, the proportion of mobile devices that access that content is higher than usual, because many people use their mobile devices, rather than desktop applications, to access Twitter and other social networks. Many of these mobile users access the Web via slow data connections and crowded public Wi-Fi. So, anything you can do to ensure that your website loads quickly will benefit those users.

In this article, I’ll show you the Varnish Web application accelerator, a free and simple tool that makes a world of difference when a lot of people land on your website at once.

Introducing The Magic

For the majority of websites, even those whose content is updated daily, a large number of visitors are served exactly the same content. Images, CSS and JavaScript, which we expect not to change very much — but also content stored in a database using a blogging platform or content management system (CMS) — are often served to visitors in exactly the same way every time.

Visitors coming to a blog from Twitter would likely all be served exactly the same content — including not only images, JavaScript and CSS, but also content that is created with PHP and with queries to the database before being served as a page to the browser. Each request for that blog post would require not only the Web server that serves the file (for example, Apache), but also PHP scripts, a connection to the database, and queries run against database tables.

The number of database connections that can be made and the number of Apache processes that can run are always limited. The greater the number of visitors, the less memory available and the slower each request becomes. Ultimately, users will start to see database connection errors, or the website will just seem to hang, with pages not loading as the server struggles to keep up with demand.

This is where an HTTP cache like Varnish comes in. Instead of requests from browsers directly hitting your Web server, making the server create and serve the pages requested, requests would first hit the cache. If the requested page is in the cache, then it is served directly from memory, never touching Apache or the database. If the page is not in the cache, then the request is handed over to Apache as usual, whereupon Apache will create and serve the page, which is then stored in the cache, ready for the next request.

Serving a page from memory is a lot faster than serving it from disk via Apache. In addition, the page never needs to touch PHP or the database, leaving those processes free to handle traffic that does require a database connection or some processing. For example, in our second scenario of a startup being mentioned by a celebrity, the majority of people clicking through would check out only a few pages of the website — all of those pages could be in the cache and served from memory. The few who go on to sign up would find that the registration form works well, because the server-side code and database connection are not bogged down by people pouring in from Twitter.

How Does It Work?

The diagram below shows how a blog post might be served when all requests go to the Apache Web server. This example shows five browsers all requesting the same page, which uses PHP and MySQL.

When all requests go to the Apache Web server.

Every HTTP request is served by Apache — images, CSS, JavaScript and HTML files. If a file is PHP, then it is parsed by PHP. And if content is required from the database, then a database connection is made, SQL queries are run, and the page is assembled from the returned data before being served to the browser via Apache.

If we place Varnish in front of Apache, we would instead see the following:

If we place Varnish in front of Apache.

If the page and assets requested are already cached, then Varnish serves them from memory — Apache, PHP and MySQL would never be touched. If a browser requests something that is not cached, then Varnish hands it over to Apache so that it can do the job detailed above. The key point is that Apache needs to do that job only once, because the result is then stored in memory, and when a second request is made, Varnish can serve it.

The tool has other benefits. In Varnish terminology, when you configure Apache as your Web server, you are configuring a “back end.” Varnish allows you to configure multiple back ends. So, you might want to run two Web servers — for example, using Apache for PHP pages while serving static assets (such as CSS files) from nginx. You can set this up in Varnish, which will pass the request through to the correct server. In this tutorial, we will look at the simplest use case.
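As a rough sketch of what a two-back-end configuration can look like in VCL, the following routes requests for static assets to a second server; the port 8081 and the list of file extensions are assumptions for illustration, not values from this tutorial:

backend apache {
  .host = "127.0.0.1";
  .port = "8080";
}

backend assets {
  # assumed second server for static files, e.g. nginx
  .host = "127.0.0.1";
  .port = "8081";
}

sub vcl_recv {
  if (req.url ~ "\.(css|js|png|jpg|gif)$") {
    set req.backend = assets;
  } else {
    set req.backend = apache;
  }
}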

I’m Sold! How Do I Get Started?

Varnish is really easy to install and configure. You will need root, or sudo, access to your server to install things on it. Therefore, your website needs to be hosted on a virtual private server (VPS) or the like. You can get a VPS very inexpensively these days, and Varnish is a big reason to choose a VPS over shared hosting.

Some CMSes have plugins that work with Varnish or that integrate it in the control panel — usually to make clearing the cache easier. But you can use Varnish with any CMS or any static website, without any particular integration with other systems.

I’ll walk you through installing Varnish, assuming that you already run Apache as a Web server on your system. I run Debian Linux, but packages for other distributions are available. (The paths to files on the system will vary with the Linux distribution.)

Before starting, check that Apache is serving your website as expected. If the server is brand new or you are trying out Varnish on a local virtual machine, make sure to configure a virtual host and that you can view a test page on the server using a browser.

Install Varnish

Installation instructions for various platforms are in Varnish’s documentation. I am using Debian Wheezy; so, as root, I followed the instructions for Debian. Once Varnish is installed, you will see the following line in the terminal, telling you that it has started successfully.


[ ok ] Starting HTTP accelerator: varnishd.
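For reference, on Debian the installation itself can be as simple as the two commands below, although the project’s documentation recommends adding Varnish’s own package repository first to get a more recent version:

sudo apt-get update
sudo apt-get install varnish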

By default, Apache listens for requests on port 80, which is where incoming HTTP requests arrive. Because we want Varnish to sit in front of Apache, we need to configure Varnish to listen on port 80 and move Apache to a different port — usually 8080. We then tell Varnish where Apache is.

Reconfigure Apache

To change the port that Apache listens on, open the file /etc/apache2/ports.conf as root, and find the following lines:


NameVirtualHost *:80
Listen 80

Change these lines to this:


NameVirtualHost *:8080
Listen 8080

If you see the following lines, just change 80 to 8080 in the same way.


NameVirtualHost 127.0.0.1:80
Listen 80

Save this file and open your default virtual host file, which should be in /etc/apache2/sites-available. In this file, find the following line:


<VirtualHost *:80>

Change it to this:


<VirtualHost *:8080>

You will also need to make this change to any other virtual hosts you have set up.

Configure Varnish

Open the file /etc/default/varnish, and scroll down to the uncommented section that starts with DAEMON_OPTS. Edit this so that it looks like the following block, which will make Varnish listen on port 80.


DAEMON_OPTS="-a :80 \
-T localhost:1234 \
-f /etc/varnish/default.vcl \
-S /etc/varnish/secret \
-s malloc,256m"

Open the file /etc/varnish/default.vcl, and check that the default back end is set to port 8080, because this is where Apache will be now.


backend default {
.host = "127.0.0.1";
.port = "8080";
}

Restart Apache and Varnish as root with the following commands:


service apache2 restart
service varnish restart

Check that your test website is still available. If it is, then you’ll probably be wondering how to test that it is being served from Varnish. There are a few ways to do this. The simplest is to use cURL. In the command line, type the following:


curl http://yoursite.com --head

The response should be something like Via: 1.1 varnish.
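The full set of headers will look something like the following; the exact values will vary with your setup, but the Age and X-Varnish headers are also typical signs that Varnish is handling the response (the values below are made up for illustration):

HTTP/1.1 200 OK
Server: Apache
Content-Type: text/html; charset=UTF-8
Age: 12
X-Varnish: 1647340870 1647340869
Via: 1.1 varnish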

You can also look at the statistics generated by Varnish. In the command line, type varnishstat, and watch the hit rate increase as you refresh your page in the browser. Varnish refers to something it can serve as a “hit” and something it passes to Apache or another back end as a “miss.”

Another useful tool is varnishtop. Type varnishtop -i txurl in the command line, and refresh your page in the browser. This tool shows you which requests Varnish is passing through to the back end; anything listed here is not being served from the cache.

Purging The Cache

Now that pages are being cached, if you change an HTML or CSS file, you won’t see the changes immediately. This trips me up all of the time. I know that a cache is in front of Apache, yet every so often I still have that baffled moment of “Where are my changes?!” Type varnishadm "ban.url ." in the command line to clear the entire cache.
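ban.url takes a regular expression, so you can also clear a single page rather than the whole cache; the path below is just an example:

varnishadm "ban.url ^/blog/my-post/$"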

You can also control Varnish over HTTP. Plugins are available, such as Varnish HTTP Purge for WordPress, that you can configure to purge the cache directly from the administration area.

Some Simple Customizations

You’ll probably want to know a few things about how Varnish works by default in order to tweak it. Configuring it as described above should cause most basic assets and pages to be served from the cache, once those assets have been cached in memory.

Varnish will only cache things that are safe to cache, and it might not cache some common things that you would expect it to. A good example is cookies.

In its default configuration, Varnish will not cache content if a cookie is set. So, if your website serves different content to logged-in users, such as personalized content, you wouldn’t want to serve everyone content that is meant for one user. However, you’d probably want to ignore some cookies, such as for analytics. If the website does not serve any personalized content, then the only cookies you would probably care about are those set for your admin area — it would be inconvenient if Varnish cached the admin area and you couldn’t see changes.

Let’s edit /etc/varnish/default.vcl. Assuming your admin area is at /admin, you would add the following:


sub vcl_recv {
   if ( !( req.url ~ ^/admin/) ) {
     unset req.http.Cookie;
   }
 }

Some cookies might be important — for example, logged-in users should get uncached content. So, you don’t want to eliminate all cookies. A trip to the land of regular expressions is required to identify the cookies we’ll need. Many recipes for doing this can be found with a quick search online. For analytics cookies, you could add the following.


sub vcl_recv {
  // Remove has_js and Google Analytics __* cookies.
  set req.http.Cookie = regsuball(req.http.Cookie, "(^|;\s*)(_[_a-z]+|has_js)=[^;]*", "");
  // Remove a ";" prefix, if present.
  set req.http.Cookie = regsub(req.http.Cookie, "^;\s*", "");
}

Varnish has a section in its documentation on “Cookies.”

In most cases, configuring Varnish as described above and removing analytics cookies will dramatically speed up your website. Once Varnish is up and running and you are familiar with the logs, you can start to tweak the configuration and get more performance from the cache.

Next Steps

To learn more, go through Varnish’s documentation. You should understand enough of Varnish’s basics by now to try some of the examples. The section on “Achieving a High Hit Rate” is well worth a read for the simple tips on tweaking your configuration.

Keep calm and try Varnish to optimize mobile websites. (Image source)

(al, ea, il)


© Rachel Andrew for Smashing Magazine, 2013.

May 18 2011

15:06

Optimizing Error Pages: Creating Opportunities Out Of Mistakes


In this article I’ll be reviewing a few techniques that will help Web designers and UI professionals improve their error pages in order to engage visitors and improve the overall website experience. As C. S. Lewis once said, “Failures are finger posts on the road to achievement”. Web designers should take this to heart. I’ll be focusing on error and maintenance pages, from both tracking and usability perspectives. I’ll also be providing a good number of examples of how to use analytics and defensive design to optimize the user experience of such pages.

Image designed by Daniel Bronfen

First, let’s go over error pages and the questions involved in optimizing them efficiently:

  • Is your 404 page succeeding in engaging visitors after the frustration of not finding what they were looking for?
  • How does one decrease the number of people landing on a 404 page?
  • How do you monitor your 404 page traffic efficiently?

Then I will discuss techniques that can be used to improve conversion rates even when the website is under maintenance. Here are some of the questions you’ve probably been asking yourself:

  • How do I choose maintenance time wisely?
  • How do I increase visitor engagement while using a maintenance page?

Optimizing 404 Pages

The subject of improving error messages was thoroughly described in Defensive Design for the Web, a book written by the 37signals team. They cover 40 guidelines to “prevent errors and rescue customers if a breakdown does occur.” Guideline #16 tells us to offer customized “Page Not Found” error pages, and they provide an interesting insight into how to create error pages (page 93):

“Instead of merely saying a page is not found, your site needs to explain why a page can’t be located and offer suggestions for getting to the right screen. Your site should lend a hand, not kick people when they are down.

Smart things to include on your 404 page:

  1. Your company’s name and logo,
  2. An explanation of why the visitor is seeing this page,
  3. A list of common mistakes that may explain the problem,
  4. Links back to the homepage and/or other pages that might be relevant,
  5. A search engine that customers can use to find the right information and
  6. An email link so that visitors can report problems, missing pages, and so on.”

A while ago I came across the great examples shared here on Smashing Magazine (part 1 and part 2) and was inspired to create my own 404 page. But since I believe it is so important to understand and analyze online behavior, I asked myself, “Is it really good enough? How can I make it better?” In the sections below I go over a few techniques that can be used to both monitor and optimize 404 pages.

If you do not have a customized 404 page, please refer to this simple explanation on ‘How to Setup a 404 Page’.
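On Apache, for example, pointing the server at a custom error page is a single directive; the file name here is an example:

# in .htaccess or in the virtual host configuration
ErrorDocument 404 /404.html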

Monitoring 404 Page Traffic

How often do you check the traffic to your 404 page? Most of the companies I have worked with never did, not once. The importance of constantly monitoring 404 pages should not be underestimated. For example, if a website is linked from a prominent blog and the link is broken, this will create a very negative experience for users (who will not find what they expected) and for search engines (which will not crawl the right content). Below, I show a few tips on how to track those pages in a seamless way using Google Analytics.
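For these reports to have data to work with, the 404 page itself needs to register in Google Analytics. A common pattern at the time of writing is to record a virtual pageview from the error template that carries both the broken URL and the referrer; the sketch below uses the asynchronous ga.js syntax, and the /404.html virtual path is an assumption:

<script>
  var _gaq = _gaq || []; // normally already defined by your tracking snippet
  // Record a virtual pageview that encodes the missing URL and the referrer.
  _gaq.push(['_trackPageview',
    '/404.html?page=' + document.location.pathname +
    document.location.search + '&from=' + document.referrer]);
</script>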

Note: The screenshots were taken using the new Google Analytics version, which is still in Beta, so your mileage may vary.

Create an Alert on Google Analytics

As you can see in the screenshot below, it is possible to set alerts on Google Analytics that will let you know each time your 404 traffic reaches a given number of visits per day. This enables you to do the work once and be alerted whenever there is a problem.

This image shows how to create a custom alert to track 404 pageviews.

Track Your 404 Page as a Goal

Setting the 404 page as a goal on Google Analytics will produce important information that can be obtained only through goals, e.g. the last three steps before getting to this page. In addition, it will make the task of finding traffic sources with broken links much easier. Below is a screenshot of how to do it:

This image shows how to create a goal to track 404 pageviews on your reports.

Add Your 404 Content Report to Your Dashboard

Every report on Google Analytics can be added to the dashboard. By adding the 404 page to your dashboard, you will be able to constantly monitor the trend of visits to your 404 page:

This image shows how to add the 404 goal to your dashboard.

Check Your Navigation Summary Report

This will help you understand what visitors do after landing on a 404 page, which is very important in order to optimize it:

This image shows the navigation summary for a 404 page on Google Analytics.

Track Internal Searches

If you do not have a search box on your 404 page, you should seriously consider adding one. Through searches performed on this page, you will be able to understand what people expected to find there, and you will get insights into which links you should add to the page. Below are the metrics you will be able to analyze if you use this feature:

  • Total Unique Searches
    The number of times people started a search from the 404 page. Duplicate searches within a single visit are excluded.
  • Results Pageviews/Search
    The average number of times visitors viewed a search results page after performing a search.
  • % Search Exits
    The percentage of searches that resulted in an immediate exit from your site.
  • % Search Refinements
    The percentage of searches that resulted in another search (i.e. a new search using a different term).
  • Time After Search
    The average amount of time visitors spend on your site after performing a search.
  • Search Depth
    The average number of pages visitors viewed after performing a search.

Decrease Your Errors (Fixing Broken Links)

Monitoring your 404 pages is very important, but it is useless if you don’t act on what you find. Taking action means doing all you can to decrease the number of people reaching 404 pages and improving the page’s user experience for your visitors (next section). Below, I provide a few tips on how to find and fix both internal and external broken links.

Check Your Navigation Summary Report

This will help you understand where your visitors are coming from within your site, i.e. it will tell you which pages contain internal broken links. You will be able to see what percentage of visitors arrived at this page from internal and external sources; the internal sources will be listed in this report (see the navigation summary screenshot above).

Check the Sources of Traffic That Land on the 404 Page

This will clearly show which websites have broken links leading to your site. Once you have the list, you should either contact these websites or create 301 redirects to the right pages.
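On Apache, for example, a 301 redirect can be a one-line job; both paths below are examples:

# in .htaccess or in the virtual host configuration
Redirect 301 /old-broken-url /new-correct-url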

This image shows sources of traffic that are leading visitors to the 404 page.

Usability Tips to Improve 404 Engagement

Basically, the usability tips for error pages should not be too different from common website usability practices. Below are a few of the elements that can definitely help to increase the conversion rates of 404 pages; in this context, conversion should be read as click-through rate (CTR), as our main objective is for visitors to find whatever they were looking for:

  1. Simplicity and Focus
    It is very important to have appealing images and an original design. However, it is critical to have a clear focus on the page; users are already puzzled that they got somewhere they were not expecting, so we must make their lives easy and provide a clear action for them to take.
  2. Know Your Visitors
    Many 404 pages use humour and technical jokes. It is important to keep in mind that we are not our visitors, and jokes can be misunderstood, so use them responsibly.
  3. Let Your Visitors Decide
    As I wrote in the Web Analytics Process: “Customers should tell us what to do, not consultants, friends or feelings; data and online surveys are the place to look for customers’ needs.” I believe the best way to understand what works for your visitors is to provide them with a few page versions and let the best one win (check the advanced A/B testing techniques written by Paras Chopra).

Optimizing Maintenance Pages

Not long ago, I worked with a website that had weekly maintenance down times, about 1-2 hours a week. They chose the day with the least traffic for the maintenance, but I believe they did not completely understand how this affected the website and, more importantly, how they could improve the design of the page in order to optimize user experience and take advantage of this failure. In a previous post at Smashing Magazine, Cameron Chapman provides a good checklist to be used when designing effective maintenance pages:

  1. Keep your maintenance pages simple and useful.
  2. Realize it’s an inconvenience to your visitors.
  3. Don’t be afraid to use humor.
  4. Give your maintenance page the same look and feel as your regular site.
  5. Let visitors know when your site will be back.
  6. Provide recommended content.
  7. Invite your visitors to come back when the site is online again.
  8. Inform your visitors about the progress of the maintenance.

In addition to the tips provided above, I believe two further rules can be especially important for pleasing and engaging your visitors.

Choose Maintenance Time Wisely

A very common practice for choosing maintenance time is to look at the traffic of the website and pick the time of day, or day of the week, that has the lowest visitor traffic. However, this misses an important point: websites want to optimize for performance, not for traffic. By choosing the maintenance time using visitor counts, we might be optimizing for traffic and not for dollars. A better way is to run an hourly report and check at which time of day (or day of the week) conversions are lowest.

Increasing Visitor Engagement Using Maintenance Pages

Improving visitor engagement while the website is in maintenance mode? Yes, you read that right. While your website is in maintenance mode, you have a great opportunity to promote your additional marketing channels: offline stores, Facebook fan pages, YouTube channels, Twitter accounts, etc. Below you can find an example from the Online Behavior maintenance page:

This image shows a maintenance page example.

Closing Thoughts

As we mentioned above, errors happen, and we must be prepared for them. We must give a hand to our visitors when they are most frustrated and help them feel comfortable again. The level of online patience and understanding is decreasing and users have a world of choices just one click away, so website owners cannot let one small error get in their way.

What are your thoughts on this subject? Feel free to share them with us in the comment section below!


© Daniel Waisberg for Smashing Magazine, 2011.
Post tags: 404, errors, maintenance, measurement, navigation, optimization

April 14 2011

18:00

Automated Optimization with HTML5 Boilerplate Build


HTML5 Boilerplate is widely recognized as a rock-solid foundation for building new web-based sites and applications. That said, few are aware that the tool offers more than simply setting up your development environment. It also helps you “wrap up” your work by providing an awesome cross-platform build process.


The Build Script, with Paul Irish


Overview

So why might you need this build tool? Because it’s baked into HTML5 Boilerplate, and can help you automate web performance optimization. We chose to go with Apache Ant to handle the workload. How come?

All other tools have limitations that Ant’s original author couldn’t live with when developing software across multiple platforms.

Many developers are unfamiliar with the build process. But don’t worry; a build tool isn’t a scary monster. Everything can be configured through a relatively simple XML file. This article will help you understand how to set up the build tool, customize the build process and finally change variables and run the build.


The Directory Structure

The build script makes some assumptions about how your files are sorted and structured. Here is the folder structure of HTML5 Boilerplate:

  • /js/libs/ – contains common script libraries: Modernizr, jQuery and a pngfix for IE6
  • /js/mylibs/ – contains site-specific custom library scripts
  • /js/plugins.js – all jQuery plugins
  • /js/script.js – site/page-specific JavaScript

The Build Folder Structure

The build/ folder contains the following elements:

build.xml

Apache Ant’s build files are written in XML. This file contains our project (Boilerplate Build) and targets. Targets contain task elements. Each task element of the buildfile can have an id attribute and can later be referred to by the value supplied to it, which must be unique.

default.properties

default.properties contains the default build options, project structure and hardcore build options, which we’ll review shortly.

build.properties

This file defines overrides for default.properties. It should be created by a user who needs to override particular values and, consequently, should not be placed under version control.
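As a sketch of what such an override file might contain (the values below are illustrative; the property names come from default.properties, which is covered later in this article):

# build.properties – personal overrides; keep out of version control
dir.publish  = deploy
file.exclude = *.psd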

tools

The tools directory contains a set of bundled tools, which include OptiPNG, JPEGTran, YUI Compressor and HTML Compressor.


Set up the Build Tool

Because the goal of the build tool is to be platform agnostic, we’ll review the necessary steps to set it up, dependent upon your OS of choice.

  • Windows – Grab and install WinAnt.
  • Mac OS X – Using Homebrew, install the following packages: brew install libjpeg optipng. With MacPorts, use the following install command: port install jpeg optipng
  • Ubuntu (Linux) – Using apt, install the following packages: apt-get install libjpeg-progs optipng

Walkthrough of the buildfile

The build tool is nothing more than an XML file that is based on Apache Ant. Below is a walkthrough of the pre-defined build process. These elements can be configured by editing the build.xml file.

Concatenating / Minifying JavaScript

<!-- Optimize javascript files -->
<target name="js.all" depends="js.remove.console, js.all.min, js.main.concat, js.libs.concat, js.concat.scripts,
js.minifyonly.min, js.delete"></target>
<!--  JS: Concat primary scripts -->

...

<!-- JS, Delete concatenated libs file (only if concat.scripts and delete.unoptimized are defined) -->
<target name="js.if.concat.scripts" if="build.delete.unoptimized, build.concat.scripts">
	<delete file="./${dir.publish}/${dir.js}/libs-${build.number}.min.js"/>
	<delete file="./${dir.publish}/${dir.js}/scripts-${build.number}.min.js"/>
</target>
  • The /js/libs/ files are minified, but not concatenated. Modernizr should be alone in the head of the document. jQuery might be pulled from a CDN, and the pngfix will be included for IE6 only.
  • /js/mylibs/ contains your other various JavaScript libraries and plugins. All files stored here will be minified (unless they end with .min.js), and then concatenated together.
  • plugins.js and script.js, in the /js/ folder, are all yours. These will also be minified and concatenated with the mylibs/ files.

Minifying CSS

<target name="css" depends="copy">
    <echo message="Minifying css..."/>
    <concat destfile="./${dir.publish}/${dir.css}/style-${build.number}.css">
     <fileset file="./${dir.css}/style.css"/>
    </concat>
    ...
</target>

All CSS files are minified using YUI compressor. The above Ant script will run style.css through YUI compressor for minification.

Image Optimization

<target name="imagespng" depends="copy">
	<echo message="Optimizing images"/>
    <apply executable="optipng" osfamily="unix">
     <arg value="-o7"/>
     <fileset dir="./${dir.publish}/">
       <include name="**/*.png"/>
     </fileset>
    </apply>
    ...
</target>

In HTML5 Boilerplate, we chose OptiPNG and jpegtran for optimizing PNG and JPG images, respectively. That said, there are plenty of image optimization tools out there, and you’re free to swap in your own favorites.

For instance, Smush.it uses ImageMagick to identify the image type and convert GIF files to PNG files. It then uses gifsicle to optimize GIF animations by stripping repeating pixels in different frames.

Removing Development-Only Coding

<exclude name="**/${dir.js}/profiling/**"/>
<exclude name="**/${dir.test}/**"/>
...
<target name="js.remove.console" description="Comment out console.log lines">
	<echo>Commenting out console.log lines</echo>

	<replaceregexp match="(console.log\(.*\))" replace="/\*\1\*/" flags="g" >
		<fileset dir="./${dir.publish}/${dir.js}/">
			<include name="**/*.js"/>
			<exclude name="**/*.min.js"/>
		</fileset>
	</replaceregexp>  

</target>

Things like console.log calls, profiling scripts and unit-testing files are not needed in the released version of the site.

Minifying HTML

<target name="htmlbuildkit" depends="html" >

<apply executable="java" parallel="false" force="true" dest="./${dir.publish}/" >
     <fileset dir="./${dir.publish}/" includes="*.html"/>
     <arg value="-jar"/>
     <arg path="./${dir.build}/tools/htmlcompressor-0.9.3.jar"/>

</apply>
</target>

Listed below are some various options for minifying your HTML files:

  • htmlbuildkit – Preserves comments and multiple spaces, and compresses inline JavaScript and CSS.
  • htmlclean – Preserves multiple spaces, removes unneeded quotes and compresses inline JavaScript and CSS.
  • htmlcompress – Removes unneeded quotes and compresses inline JavaScript and CSS.

Automated Baseline Numbering / Cache Busting


HTML5 Boilerplate by default uses query string for JavaScript/CSS versioning and cache busting. The drawback with this approach is that some intermediate proxies – and potentially other clients – may not cache assets that contain query strings. This is due to basic heuristics that flag such requests as dynamic data.

The build tool will first remove the query string versioning and use automated baseline numbering for release control and cache busting.
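As an illustration, the references in your HTML change roughly like this (both file names here are hypothetical):

<!-- before the build: query-string versioning -->
<script src="js/script.js?v=2"></script>

<!-- after the build: the version number is baked into the file name -->
<script src="js/scripts-002.min.js"></script>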

Configuring Excludes

<exclude name=".gitignore"/>
<exclude name=".project"/>
<exclude name=".settings"/>
<exclude name="README.markdown"/>
<exclude name="**/.git/**"/>
<exclude name="**/.svn/**"/>
<exclude name=".gitignore"/>
<exclude name="*.conf*"/>
<exclude name="mime.types"/>
<exclude name="**/${dir.build}/**"/>
<exclude name="**/${dir.test}/**"/>
<exclude name="**/${dir.demo}/**"/>
<exclude name="**/${dir.js}/profiling/**"/>

Not all files need to be published. A perfect example would be the files generated by a version control system such as Subversion or Git.

By default, there is a list of file types and directories that will be excluded. To add to this list, search for <!-- configurable excludes --> and append your custom excludes to it.
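For example, to keep Photoshop files and a private notes folder out of the published site (both names are hypothetical), you would append:

<!-- configurable excludes -->
<exclude name="**/*.psd"/>
<exclude name="notes/**"/>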


Walkthrough of default.properties

Variables inside the build file are defined in default.properties and build.properties.

Build options

  • build.concat.scripts = true – If set, multiple script files will be smushed together to a single, cohesive file.
  • build.delete.unoptimized = true – If set, unoptimized files will be deleted.
  • file.exclude = nonexistentfile – Excludes file filter for publishing (can’t be empty).

Project Structure

dir.publish	= publish
dir.build	= build
dir.tools	= ${dir.build}/tools
dir.test	= test
dir.demo	= demo
dir.js		= js
...

The project structure contains directory names, like the ones shown above, as well as the core JS folder, JS utility libraries, and folders which should only be minified but not concatenated.

Other Build Options

  • build.info = buildinfo.properties – defines the file in which build versioning information is stored.
  • tool.yuicompressor = yuicompressor-2.4.2.jar – defines the YUI Compressor version to use.

Okay – But How Do I Use This?

Finally, we’ll learn exactly how you can use the build tool in your projects! Refer to the following steps to run the build tool.

  • Open a command line interface, and navigate to your project folder.
  • Navigate into the build folder: cd build/
  • There are four different ways to build your site; the default is ant build.
  • When the build script changes your HTML to reference the new minified script (usually named something like scripts-002.min.js), it looks for HTML comments that mark the beginning and end of the script block. Currently, it looks for <!-- scripts concatenated --> and <!-- end concatenated and minified scripts -->.

Build Options

Here’s a list of various build options that you can choose from to suit your particular need:

  • ant build – minor html optimizations (extra quotes removed). inline script/style minified (default)
  • ant buildkit – all html whitespace retained. inline script/style minified
  • ant minify – above optimizations plus full html minification
  • ant text – same as build but without image (png/jpg) optimizing

Conclusion

Performance optimization doesn’t have to be expensive or time consuming. With some reusable rules, one can gradually set up a build process that automates the repetitive aspects of optimization work. Apache Ant provides a powerful yet easy-to-use framework, and HTML5 Boilerplate leverages it to make web optimization as easy as possible for front-end web developers. Thank you so much for reading!

February 15 2011

20:57

Implement Twitter Scrolling without jQuery


jQuery is an awesome tool, but is there ever a time when you shouldn’t use it? In this tutorial, we’re going to look at how to build some interesting scrolling behavior with jQuery, and then review how our project could potentially be improved by removing as much jQuery, or abstraction, as possible.


Intro: The Plan

Sometimes, your code can be even more “magical” by subtracting some of that jQuery goodness.

You might expect this to be your normal do-something-awesome-with-jQuery tutorial. Actually, it’s not. While you might end up building a rather cool—but, frankly, perhaps equally useless—effect, that’s not the main point I want you to take away from this tutorial.

As you’ll hopefully see, I want you to learn to look at the jQuery you’re writing as just regular JavaScript, and realize that there’s nothing magical about it. Sometimes, your code can be even more “magical” by subtracting some of that jQuery goodness. Hopefully, by the end of this, you’ll be a little bit better at developing with JavaScript than when you started.

If that sounds too abstract, consider this a lesson in performance and code refactoring … and also stepping outside your comfort zone as a developer.


Step 1: The Project

Here’s what we’re going to build. I got this inspiration from the relatively new Twitter for Mac app. If you have the app (it’s free), go view someone’s account page. As you scroll down, you’ll see that they don’t have an avatar to the left of each tweet; the avatar for the first tweet “follows” you as you scroll down. If you meet a retweet, you’ll see that the retweeted person’s avatar is appropriately placed beside his or her tweet. Then, when the retweeter’s tweets begin again, their avatar takes over.

Twitter

This is the functionality that I wanted to build. With jQuery, this wouldn’t be too hard to put together, I thought. And so I began.


Step 2: The HTML & CSS

Of course, before we can get to the star of the show, we need some markup to work with. I won’t spend much time here, because it’s not the main point of this tutorial:

<!DOCTYPE HTML>
<html lang="en">
    <head>
        <meta charset="UTF-8">
        <title>Twitter Avatar Scrolling</title>
        <link rel="stylesheet" href="style.css" />
    </head>
    <body>
        <section>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/two.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say. </p>
            </article>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/two.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/two.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
				 <!-- more assorted tweets -->

        </section>

        <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.0/jquery.min.js"></script>
        <script src="twitter-scrolling.js"></script>

        <script>
            $("section").twitter_scroll(".avatar");
        </script>
    </body>
</html>

Yes, it’s large.

How about some CSS? Try this:

body {
    font:13px/1.5 "helvetica neue", helvetica, arial, sans-serif;
}
article {
    display:block;
    background: #ececec;
    width:380px;
    padding:10px;
    margin:10px;
    overflow:hidden;
}
article img {
    float:left;
}
article p {
    margin:0;
    padding-left:60px;
}

Pretty minimal, but it will give us what we need.

Now, on to the JavaScript!


Step 3: The JavaScript, Round 1

I think it’s fair to say that this isn’t your average JavaScript widget-work; it’s a bit more complicated. Here are just a couple of the things you need to account for:

  • You need to hide every image that’s the same as the image in the previous “tweet.”
  • When the page is scrolled, you have to determine which “tweet” is closest to the top of the page.
  • If the “tweet” is the first in a series of “tweets” by the same person, we have to fix the avatar in place, so it won’t scroll with the rest of the page.
  • When the top “tweet” is the last in a run of tweets by one user, we have to stop the avatar at the appropriate spot.
  • This all has to work for scrolling both down and up the page.
  • Since all this is being executed each time a scroll event fires, it has to be incredibly fast.

When you begin writing something, worry about just getting it working; optimizing can come later. Version one ignored several important jQuery best practices. What we start with here is version two: the optimized jQuery code.

I decided to write this as a jQuery plugin, so the first step is to decide how it will be called; I went with this:

$(wrapper_element).twitter_scroll(entries_element, unscrollable_element);

The jQuery object we call the plugin on wraps the “tweets” (or whatever you’re scrolling through). The first parameter the plugin takes is a selector for the elements that will be scrolling: the “tweets.” The second selector is for the elements that stay in place when necessary (the plugin expects these to be images, but it shouldn’t take much adjusting to make it work for other elements). So, for the HTML we had above, we call the plugin like so:

$("section").twitter_scroll("article", ".avatar");

// OR

$("section").twitter_scroll(".avatar");

As you’ll see when we get to the code, the first parameter will be optional; if there’s only one parameter, we’ll assume that it’s the unscrollable selector, and the entries are the direct parents of the unscrollables (I know, unscrollables is a bad name, but it’s the most generic thing I could come up with).

So, here’s our plugin shell:

(function ($) {

    jQuery.fn.twitter_scroll = function (entries, unscrollable) { 

    };
}(jQuery));

From now on, all the JavaScript we’ll look at goes in here.

Plugin Set-Up

Let’s start with the set-up code; there’s some work to do before we set up the scroll handler.

if (!unscrollable) {
    unscrollable = $(entries);
    entries = unscrollable.parent();
} else {
    unscrollable = $(unscrollable);
    entries = $(entries);
}

First, the parameters: If unscrollable is a false-y value, we’ll set it to the “jQuerified” entries selector, and set entries to the parents of unscrollable. Otherwise, we’ll “jQuerify” both parameters. It’s important to notice that now (if the user has done their markup correctly, which we will have to assume they have), we have two jQuery objects in which the matching entries have the same index: so unscrollable[i] is the child of entries[i]. This will be useful later. (Note: if we didn’t want to assume that the user marked up their document correctly, or that they used selectors that would capture elements outside of what we want, we could use this as the context parameter, or use the find method of this.)

Next, let’s set some variables; normally, I’d do this right at the top, but several depend on unscrollable and entries, so we dealt with that first:

var parent_from_top  = this.offset().top,
    entries_from_top = entries.offset().top,
    img_offset       = unscrollable.offset(),
    prev_item        = null,
    starter          = false,
    margin           = 2,
    anti_repeater    = null,
    win              = $(window),
    scroll_top       = win.scrollTop();

Let’s run through these. As you might imagine, the offset of the elements we’re working with here are important; jQuery’s offset method returns an object with top and left offset properties. For the parent element (this inside the plugin) and the entries, we’ll only need the top offset, so we’ll get that. For the unscrollables, we’ll need both top and left, so we won’t get anything specific here.

The variables prev_item, starter, and anti_repeater will be used later, either outside the scroll handler or inside it, where we’ll need values that persist across calls to the handler. Finally, win will be used in a few places, and scroll_top is the distance of the scrollbar from the top of the window; we’ll use this later to determine which direction we’re scrolling in.

Next, we’re going to determine which elements are first and last in streaks of “tweets.” There are probably a couple of ways to do this; we’re going to do this by applying an HTML5 data attribute to the appropriate elements.

entries.each(function (i) {
    var img = unscrollable[i];
    if ($.contains(this, img)) {
        if (img.src === prev_item) {
            img.style.visibility = "hidden";
            if (starter) {
                entries[i-1].setAttribute("data-8088-starter", "true");
                starter = false;
            }
            if (!entries[i+1]) {
                entries[i].setAttribute("data-8088-ender", "true");
            }
        } else {
            prev_item = img.src;
            starter = true;
            if (entries[i-1] && unscrollable[i-1].style.visibility === "hidden") {
                entries[i-1].setAttribute("data-8088-ender", "true");
            }
        }
    }
});

prev_item = null;

We’re using jQuery’s each method on the entries object. Remember that inside the function we pass in, this refers to the current element from entries. We also have the index parameter, which we’ll use. We’ll start by getting the corresponding element in the unscrollable object and storing it in img. Then, if our entry contains that element (it should, but we’re just checking), we’ll check to see if the source of the image is the same as prev_item; if it is, we know that this image is that same as the one in the previous entry. Therefore, we’ll hide the image; we can’t use the display property, because that would remove it from the flow of the document; we don’t want other elements moving on us.

Then, if starter is true, we’ll give the entry before this one the attribute data-8088-starter; remember, all HTML5 data attributes must start with “data-”, and it’s a good idea to add your own prefix so that you won’t conflict with other developers’ code (my prefix is 8088; you’ll have to find your own :) ). HTML5 data attributes have to be strings, but in our case, it’s not the data we’re concerned about; we just need to mark that element. Then we set starter to false. Finally, if this is the last entry, we’ll mark it as an end.

If the source of the image is not the same as the source of the previous image, we’ll reset prev_item with the source of your current image; then, we set starter to true. That way, if we find that the next image has the same source as this one, we can mark this one as the starter. Lastly, if there is an entry before this one, and its associated image is hidden, we know that entry is the end of a streak (because this one has a different image). Therefore, we’ll give it a data attribute marking it as an end.

Once we’ve finished that, we’ll set prev_item to null; we’ll be reusing it soon.

Now, if you look at the entries in Firebug, you should see something like this:

starters and enders

The Scroll Handler

Now we’re ready to work on that scroll handler. There are two parts to this problem. First, we find the entry that is closest to the top of the page. Second, we do whatever is appropriate for the image related to that entry.

$(document).bind("scroll", function (e) {
    var temp_scroll = win.scrollTop(),
        down = ((scroll_top - temp_scroll) < 0) ? true : false,
        top_item = null, child = null;

    scroll_top = temp_scroll;

    // more coming

});

This is our scroll handler shell; we’ve got a couple of variables we’re creating to start with; right now, take note of down. This will be true if we are scrolling down, false if we are scrolling up. Then, we’re resetting scroll_top to be the distance down from the top that we’re scrolled to.

Now, let’s set top_item to be the entry closest to the top of the page:

top_item = entries.filter(function (i) {
    var distance = $(this).offset().top - scroll_top;
    return ( distance < (parent_from_top + margin) && distance > (parent_from_top - margin) );
});

This isn’t hard at all; we just use the filter method to decide which entry we want to assign to top_item. First, we get the distance by subtracting the amount we’ve scrolled from the top offset of the entry. Then, we return true if distance is between parent_from_top + margin and parent_from_top - margin; otherwise, false. If this confuses you, think about it this way: we want to return true when an element is right at the top of the window; in this case, distance would equal 0. However, we need to account for the top offset of the container that’s wrapping our “tweets,” so we really want distance to equal parent_from_top.

But it’s possible that our scroll handler won’t fire when we’re exactly on that pixel, but instead when we’re close to it. I discovered this when I logged the distance and found it to be values that were a half-pixel off; also, if your scroll handling function isn’t too efficient (which this one won’t be; not yet), it’s possible that it won’t fire on every scroll event. To make sure you get one in the right area, we add or subtract a margin to give us a small range to work within.

Now that we have the top item, let’s do something with it.

if (top_item) {
    if (top_item.attr("data-8088-starter")) {
        if (!anti_repeater) {
            child = top_item.children(unscrollable.selector);
            anti_repeater = child.clone().appendTo(document.body);
            child.css("visibility", "hidden");
            anti_repeater.css({
                'position' : 'fixed',
                'top'      : img_offset.top + 'px',
                'left'     : img_offset.left + "px"
            });
        }
    } else if (top_item.attr("data-8088-ender")) {
        top_item.children(unscrollable.selector).css("visibility", "visible");
        if (anti_repeater) { anti_repeater.remove(); }
        anti_repeater = null;
    } 

    if (!down) {
        if (top_item.attr("data-8088-starter")) {
            top_item.children(unscrollable.selector).css("visibility", "visible");
            if (anti_repeater) { anti_repeater.remove(); }
            anti_repeater = null;
        } else if (top_item.attr("data-8088-ender")) {
            child = top_item.children(unscrollable.selector);
            anti_repeater = child.clone().appendTo(document.body);
            child.css("visibility", "hidden");
            anti_repeater.css({
                'position' : 'fixed',
                'top'      : img_offset.top + 'px',
                'left'     : img_offset.left + "px"
            });
        }
    }
}

You may notice that the above code is almost the same thing twice; remember that this is the unoptimized version. As I slowly worked through the problem, I started by figuring out how to get scrolling down working; once I had solved that, I worked on scrolling up. It wasn’t immediately obvious, until all the functionality was in place, that there would be this much similarity. Remember, we’ll optimize soon.

So, let’s dissect this. If the top item has a data-8088-starter attribute, let’s check to see whether anti_repeater has been set; this is the variable that will point to the image element that stays fixed while the page scrolls. If anti_repeater hasn’t been set, then we’ll get the child of the top_item entry that matches the same selector as unscrollable (no, this isn’t a smart way to do this; we’ll improve it later). Then, we clone that child and append it to the body. We’ll hide the original, and then position the clone exactly where it should go.

If the element doesn’t have a data-8088-starter attribute, we’ll check for a data-8088-ender attribute. If that’s there, we’ll find the right child and make it visible, and then remove the anti_repeater and set that variable to null.

Happily, if we’re going up instead of down, it’s exactly the reverse for our two attributes. And if top_item has neither attribute, we’re somewhere in the middle of a streak and don’t have to change anything.

Performance Review

Well, this code does what we want; however, if you give it a try, you’ll notice that you have to scroll very slowly for it to work properly. Try adding the lines console.profile("scroll") and console.profileEnd() as the first and last lines of the scroll handling function. For me, the handler takes between 2.5ms – 4ms, with 166 – 170 function calls taking place.
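For reference, a minimal sketch of where that instrumentation goes (the handler body itself is elided here):

$(document).bind("scroll", function (e) {
    console.profile("scroll");  // start collecting a profile named "scroll"

    // ... the scroll handling code from above ...

    console.profileEnd();       // stop collecting and report the results
});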

Profiling

That’s way too long for a scroll handler to run; and, as you might imagine, I’m running this on a reasonably well-endowed computer. Notice that some functions are being called 30-31 times; we have 30 entries that we’re working with, so this is probably part of the looping over them all to find the top one. This means that the more entries we have, the slower this will run; so inefficient! So, we have to see how we can improve this.


Step 4: The JavaScript, Round 2

If you’re suspecting that jQuery is the main culprit here, you’re right. While frameworks like jQuery are awesomely helpful and make working with the DOM a breeze, they come with a trade-off: performance. Yes, they are always getting better; and yes, so are the browsers. However, our situation calls for the fastest possible code, and in our case, we’re going to have to give up a bit of jQuery for some direct DOM work that isn’t too much more difficult.

The Scroll Handler

Let’s start with the obvious part: what we do with top_item once we’ve found it. Currently, top_item is a jQuery object; however, everything we’re doing with top_item via jQuery is only trivially harder without jQuery, so we’ll do it “raw.” When we review the getting of top_item, we’ll make sure it’s a raw DOM element.

So, here’s what we can change to make it quicker:

  • We can refactor our if-statements to avoid the huge amount of duplication (this is more of a code cleanliness point, not a performance point).
  • We can use the native getAttribute method, instead of jQuery’s attr.
  • We can get the element from unscrollable that corresponds to the top_item entry, instead of using unscrollable.selector.
  • We can use the native cloneNode and appendChild methods, instead of the jQuery versions.
  • We can use the style property instead of jQuery’s css method.
  • We can use the native removeChild method, instead of jQuery’s remove.

By applying these ideas, we end up with this:

if (top_item) {
    if ( (down && top_item.getAttribute("data-8088-starter")) || ( !down && top_item.getAttribute("data-8088-ender") ) ) {
        if (!anti_repeater) {
            child = unscrollable[ entries.index(top_item) ];
            anti_repeater = child.cloneNode(false);
            document.body.appendChild(anti_repeater);
            child.style.visibility = "hidden";
            style = anti_repeater.style;
            style.position = 'fixed';
            style.top = img_offset.top + 'px';
            style.left= img_offset.left + 'px';
        }
    }
    if ( (down && top_item.getAttribute("data-8088-ender")) || (!down && top_item.getAttribute("data-8088-starter")) ) {
        unscrollable[ entries.index(top_item) ].style.visibility = "visible";
        if (anti_repeater) { anti_repeater.parentNode.removeChild(anti_repeater); }
        anti_repeater = null;
    }
}

This is much better code: not only does it get rid of that duplication, but it also uses absolutely no jQuery. (As I’m writing this, I’m thinking it might even be a bit faster to apply a CSS class to do the styling; you can experiment with that!)

You might think that’s about as good as we can get; however, there’s some serious optimization that can take place in the getting of top_item. Currently, we’re using jQuery’s filter method. If you think about it, this is incredibly poor. We know we’re only going to get one item back from this filtering; but the filter function doesn’t know that, so it continues to run elements through our function after we have found the one we want. We have 30 elements in entries, so that’s a pretty huge waste of time. What we want to do is this:

for (i = 0; entries[i]; i++) {
    distance = $(entries[i]).offset().top - scroll_top;
    if ( distance < (parent_from_top + margin) && distance > (parent_from_top - margin) ) {
        top_item = entries[i];
        break;
    }
}

(Alternately, we could use a while loop, with the condition !top_item; either way, it doesn’t matter much.)

This way, once we find the top_item, we can stop searching. However, we can do better; because scrolling is a linear thing, we can predict which item will be closest to the top next.

If that sounds too abstract, consider this a lesson in performance and code refactoring … and also stepping outside your comfort zone as a developer.


Step 1: The Project

Here’s what we’re going to build. I got the inspiration for this from the relatively new Twitter for Mac app. If you have the app (it’s free), go view someone’s account page. As you scroll down, you’ll see that there isn’t an avatar beside every tweet; the avatar from the first tweet in a series “follows” you as you scroll through those tweets. When you come to a retweet, you’ll see that the retweeted person’s avatar is appropriately placed beside his or her tweet. Then, when the retweeter’s tweets begin again, their avatar takes over.


This is the functionality that I wanted to build. With jQuery, this wouldn’t be too hard to put together, I thought. And so I began.


Step 2: The HTML & CSS

Of course, before we can get to the star of the show, we need some markup to work with. I won’t spend much time here, because it’s not the main point of this tutorial:

<!DOCTYPE HTML>
<html lang="en">
    <head>
        <meta charset="UTF-8">
        <title>Twitter Avatar Scrolling</title>
        <link rel="stylesheet" href="style.css" />
    </head>
    <body>
        <section>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/two.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say. </p>
            </article>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/two.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/two.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <article>
            <img class="avatar" src="images/one.jpg" />
            <p> This is something that the twitter person had to say.</p>
            </article>
            <!-- more assorted tweets -->

        </section>

        <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.0/jquery.min.js"></script>
        <script src="twitter-scrolling.js"></script>

        <script>
            $("section").twitter_scroll(".avatar");
        </script>
    </body>
</html>

Yes, it’s large.

How about some CSS? Try this:

body {
    font:13px/1.5 "helvetica neue", helvetica, arial, sans-serif;
}
article {
    display:block;
    background: #ececec;
    width:380px;
    padding:10px;
    margin:10px;
    overflow:hidden;
}
article img {
    float:left;
}
article p {
    margin:0;
    padding-left:60px;
}

Pretty minimal, but it will give us what we need.

Now, on to the JavaScript!


Step 3: The JavaScript, Round 1

I think it’s fair to say that this isn’t your average JavaScript widget work; it’s a bit more complicated. Here are just a few of the things you need to account for:

  • You need to hide every image that’s the same as the image in the previous “tweet.”
  • When the page is scrolled, you have to determine which “tweet” is closest to the top of the page.
  • If the “tweet” is the first in a series of “tweets” by the same person, we have to fix the avatar in place, so it won’t scroll with the rest of the page.
  • When the top “tweet” is the last in a run of tweets by one user, we have to stop the avatar at the appropriate spot.
  • This all has to work for scrolling both down and up the page.
  • Since all this is being executed each time a scroll event fires, it has to be incredibly fast.

When you begin writing something, worry about just getting it working; optimizing can come later. My first version ignored several important jQuery best practices, so what we start with here is version two: the optimized jQuery code.

I decided to write this as a jQuery plugin, so the first step is to decide how it will be called; I went with this:

$(wrapper_element).twitter_scroll(entries_element, unscrollable_element);

The jQuery object we call the plugin on wraps the “tweets” (or whatever you’re scrolling through). The first parameter the plugin takes is a selector for the elements that will be scrolling: the “tweets.” The second selector is for the elements that stay in place when necessary (the plugin expects these to be images, but it shouldn’t take much adjusting to make it work for other elements). So, for the HTML above, we call the plugin like so:

$("section").twitter_scroll("article", ".avatar");

// OR

$("section").twitter_scroll(".avatar");

As you’ll see when we get to the code, the first parameter will be optional; if there’s only one parameter, we’ll assume that it’s the unscrollable selector, and the entries are the direct parents of the unscrollables (I know, unscrollables is a bad name, but it’s the most generic thing I could come up with).

So, here’s our plugin shell:

(function ($) {

    jQuery.fn.twitter_scroll = function (entries, unscrollable) { 

    };
}(jQuery));

From now on, all the JavaScript we’ll look at goes in here.

Plugin Set-Up

Let’s start with the set-up code; there’s some work to do before we set up the scroll handler.

if (!unscrollable) {
    unscrollable = $(entries);
    entries = unscrollable.parent();
} else {
    unscrollable = $(unscrollable);
    entries = $(entries);
}

First, the parameters: if unscrollable is a falsy value, we’ll set it to the “jQuerified” entries selector and set entries to the parents of unscrollable. Otherwise, we’ll “jQuerify” both parameters. It’s important to notice that now (if the user has marked up their document correctly, which we have to assume they have), we have two jQuery objects in which the matching entries have the same index: unscrollable[i] is the child of entries[i]. This will be useful later. (Note: if we didn’t want to assume that the user marked up their document correctly, or that they used selectors that capture elements outside of what we want, we could use this as the context parameter, or use the find method of this.)
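A quick sketch of that note, under the same parameter set-up:

entries = $(entries, this);   // use the plugin's element as the context...
entries = this.find(entries); // ...or, equivalently, search only within it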

Next, let’s set some variables; normally, I’d do this right at the top, but several depend on unscrollable and entries, so we dealt with that first:

var parent_from_top  = this.offset().top,
    entries_from_top = entries.offset().top,
    img_offset       = unscrollable.offset(),
    prev_item        = null,
    starter          = false,
    margin           = 2,
    anti_repeater    = null,
    win              = $(window),
    scroll_top       = win.scrollTop();

Let’s run through these. As you might imagine, the offsets of the elements we’re working with are important; jQuery’s offset method returns an object with top and left offset properties. For the parent element (this inside the plugin) and the entries, we’ll only need the top offset, so that’s all we’ll get. For the unscrollables, we’ll need both top and left, so we’ll keep the whole offset object.

The variables prev_item, starter and anti_repeater will be used later, either outside the scroll handler or inside it, where we’ll need values that persist across calls of the handler. Finally, win will be used in a few places, and scroll_top is the distance of the scrollbar from the top of the window; we’ll use this later to determine which direction we’re scrolling in.
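As a sketch of how that direction check might look inside the scroll handler (this is where a flag like the down we used earlier would come from; treat it as an illustration, not the article’s exact code):

var new_scroll_top = win.scrollTop();
down = new_scroll_top > scroll_top; // the offset grew, so we're scrolling down
scroll_top = new_scroll_top;        // remember it for the next scroll event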

Next, we’re going to determine which elements are first and last in streaks of “tweets.” There are probably a couple of ways to do this; we’re going to do this by applying an HTML5 data attribute to the appropriate elements.

entries.each(function (i) {
    var img = unscrollable[i];
    if ($.contains(this, img)) {
        if (img.src === prev_item) {
            img.style.visibility = "hidden";
            if (starter) {
                entries[i-1].setAttribute("data-8088-starter", "true");
                starter = false;
            }
            if (!entries[i+1]) {
                entries[i].setAttribute("data-8088-ender", "true");
            }
        } else {
            prev_item = img.src;
            starter = true;
            if (entries[i-1] && unscrollable[i-1].style.visibility === "hidden") {
                entries[i-1].setAttribute("data-8088-ender", "true");
            }
        }
    }
});

prev_item = null;

We’re using jQuery’s each method on the entries object. Remember that inside the function we pass in, this refers to the current element from entries. We also have the index parameter, which we’ll use. We start by getting the corresponding element in the unscrollable object and storing it in img. Then, if our entry contains that element (it should; we’re just checking), we check whether the source of the image is the same as prev_item; if it is, we know that this image is the same as the one in the previous entry. Therefore, we hide the image. We can’t use the display property, because that would remove it from the flow of the document, and we don’t want other elements moving on us.

Then, if starter is true, we’ll give the entry before this one the attribute data-8088-starter; remember, all HTML5 data attributes must start with data- (the 8088 is just a namespace, to avoid collisions with other scripts).

September 01 2010

10:00

Performance Optimization: How to Load Your JavaScript Faster!

JavaScript is now extremely important. Some sites use JavaScript for tiny enhancements; many of today’s web apps depend on it, and some are even written entirely in JavaScript. In this article I’ll point out some important rules on how to use your JavaScript, which tools to use and what benefits you’ll gain from them.

Keep your code to a minimum


Don’t rely on JavaScript. Don’t duplicate your scripts. Treat it as a candy tool, something that makes things prettier. Don’t bloat your site with loads of JavaScript. Use it only when necessary, only when it really improves the user experience.

Minimize DOM access

Accessing DOM elements with JavaScript is easy and makes code more readable, but it’s slow. Here are some tips: limit layout fixes done with JavaScript, and cache references to accessed elements. Sometimes, when your site depends heavily on DOM modifications, you should consider limiting your markup. This is a good reason to switch to HTML5 and leave old XHTML and HTML4 behind. You can check the number of DOM elements on your page by typing document.getElementsByTagName('*').length into Firebug’s console.
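Here is a minimal sketch of the reference-caching tip (the "list" element ID is hypothetical):

// slow: queries the DOM and rewrites the list on every iteration
for (var i = 0; i < 100; i++) {
    document.getElementById("list").innerHTML += "<li>" + i + "</li>";
}

// faster: one lookup, one DOM update
var list = document.getElementById("list");
var items = [];
for (var j = 0; j < 100; j++) {
    items.push("<li>" + j + "</li>");
}
list.innerHTML += items.join("");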

Compress your code

The most efficient way to serve compressed JavaScript is to first run your code through a JavaScript compressor that shrinks variable and argument names, and then serve the resulting code using gzip compression.

Well, I don’t compress my main.js, but check whether you have any jQuery plugins that aren’t compressed and, if so, compress them (remember to keep the authors’ notices). Below I’ve listed some options for compression.

Gzip compression: the idea behind this is to reduce the time spent transferring data between browser and server. The browser announces support via the Accept-Encoding: gzip,deflate request header, and the server sends back a compressed file. It has some disadvantages, though: it costs CPU on both the server side (to compress) and the client side (to decompress), as well as disk space.

Avoid eval(): while it may occasionally bring some time efficiency, it’s definitely bad practice. It makes your code look dirtier, and it trips up most compressors.
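A common place eval() sneaks in is parsing JSON responses; here is a small sketch of the safer alternative (JSON.parse is native in newer browsers, and json2.js provides it for older ones):

var responseText = '{"name": "value"}'; // e.g. the body of an XHR response

// fragile and unsafe: executes whatever the string contains
var data = eval('(' + responseText + ')');

// safer, and friendlier to compressors
var data2 = JSON.parse(responseText);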

A tool to speed up JavaScript loading: LAB.js

There are many awesome tools that can speed up your JavaScript loading time. One worth mentioning is LAB.js.

With LAB.js (Loading And Blocking JavaScript) you can load your JavaScript files in parallel, speeding up the total loading process. What’s more, you can set up a specific order in which scripts should execute, so no dependencies are broken. Also, the developer reports a 2x speed improvement on his own site.
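A minimal sketch of the LAB.js chaining API (the file names and the init() function are hypothetical):

$LAB
    .script("jquery.min.js").wait() // download, then block until jQuery has executed
    .script("plugin.jquery.js")     // these two download in parallel...
    .script("app.js")
    .wait(function () {
        init();                     // ...and this runs once both have executed
    });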

Using a proper CDN

Many web pages now use a CDN (content delivery network). It improves caching, because every site that references the same file shares the cached copy, and it can also save you some bandwidth. You can easily ping those servers, or watch them in Firebug, to check which one serves you data faster. Choose a CDN that matches your readers’ locations, and use public repositories whenever possible.

Some CDN options for jQuery:

  • http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js – Google Ajax, information about more libraries
  • http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.min.js – Microsoft’s CDN
  • http://code.jquery.com/jquery-1.4.2.min.js – Edgecast (mt)
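If you’re worried about a CDN outage, a common pattern is to fall back to a local copy when the CDN file didn’t arrive (the local path here is hypothetical):

// run this after the CDN <script> tag has had its chance to load jQuery
if (!window.jQuery) {
    var s = document.createElement("script");
    s.src = "/js/jquery-1.4.2.min.js"; // local fallback copy
    document.getElementsByTagName("head")[0].appendChild(s);
}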

Load your JavaScript at the end of the page


This is very good practice if you care about users not leaving your page because of a slow connection. Usability and the user come first, JavaScript last. This may be painful, but you should also be prepared for users with JavaScript disabled. You may put some JavaScript in the head section, but only if it loads asynchronously.

Load tracking asynchronously

This one is very important. Most of us use Google Analytics for statistics. That’s fine. Now look at where you put your tracking code. Is it in the head section? Does it use document.write? Then you should blame yourself for not using the asynchronous tracking code for Google Analytics.

This is what the asynchronous tracking code for Google Analytics looks like. Note that it uses the DOM instead of document.write, which may be better for you. It detects some events before the page has loaded, which is very important: think of all the users who close your page before it has even loaded. The cure for missing page views has been found.


	var _gaq = _gaq || [];
	_gaq.push(['_setAccount', 'UA-XXXXXXX-XX']); // your tracking ID
	_gaq.push(['_trackPageview']);

	(function() {
		// create a script element that loads asynchronously
		var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
		// match the protocol of the current page (https or http)
		ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
		// insert it before the first script tag on the page
		var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
	})();

Not using GA? That’s not a problem; most of today’s analytics providers allow you to use asynchronous tracking.

Ajax Optimization


Ajax requests have a great impact on your site’s performance. Below are some tips on Ajax optimization.

Cache your ajax

Look at your code. Are your Ajax requests cacheable? Well, it depends on the data, but most of your Ajax requests should be cacheable. In jQuery, requests are cached by default, except for the script and jsonp data types.
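For example (the endpoint here is hypothetical):

$.ajax({
    url: "/latest-posts.json", // hypothetical endpoint
    dataType: "json",
    cache: true,               // let the browser reuse a cached response (the default for JSON)
    success: function (data) {
        // render the data...
    }
});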

Use GET for Ajax Requests

POST requests take two TCP packets to send (headers first, then data). A GET request takes only one packet (which may depend on how many cookies you have). So, as long as your URL is shorter than 2 KB and you only want to fetch data, use GET.

Use ySlow


It’s both simple and extremely powerful when it comes to performance. It grades your website and shows you what needs to be corrected and what should be taken care of.

Bonus: Pack your JavaScript into a PNG file

jQuery and Prototype Packed into one image

Imagine adding your JS and CSS to the end of an image and cropping it in CSS to have all the info you need in an app in a single HTTP request.

I found this recently. What it basically does is pack your JavaScript/CSS data into a PNG file. After that, you can unpack it using the canvas API’s getImageData(). What’s more, it’s very efficient: you can get about 35% compression without minifying your data. Lossless compression! I must point out that for larger scripts you may feel “some” load time while the image is painted to the canvas and the pixels are read.
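To make the unpacking side concrete, here is a rough sketch; packed.png is hypothetical, and the exact packing scheme varies by tool (this one assumes one character code per pixel, stored in the red channel, with the image served from the same origin):

var img = new Image();
img.onload = function () {
    var canvas = document.createElement("canvas");
    canvas.width = img.width;
    canvas.height = img.height;
    var ctx = canvas.getContext("2d");
    ctx.drawImage(img, 0, 0);
    // read the raw pixels; every fourth value is a red channel holding one character code
    var data = ctx.getImageData(0, 0, img.width, img.height).data;
    var source = "";
    for (var i = 0; i < data.length; i += 4) {
        if (data[i]) { // skip padding pixels
            source += String.fromCharCode(data[i]);
        }
    }
    eval(source); // ironically, this trick cannot avoid eval()
};
img.src = "packed.png";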

For more information about this technique, check out this article from 2008.

Final Thoughts

Hope you guys liked this article. If so, remember to share it and to say hello to me on Twitter. Stay tuned for further posts about serious performance optimization.

April 17 2010

10:00

20 Most Valuable SEO Plugins For WordPress

WordPress is one of the most popular weblog publishing platforms available on the Web. There are millions of blogs based on WordPress.

Search engines are the best way to promote, publicize and help people discover your content. Thus, it is very important to optimize your blog or web site so that search engines can discover your content and understand how relevant it is to specific search queries.

WordPress, in addition to providing a state-of-the-art publishing platform, also provides a plethora of plugins for search engine optimization (SEO). Some plugins are really effective, whereas others are not. To help you find the right plugin for your website, we have compiled some of the best SEO plugins.

1. All in one SEO Pack

One of the most popular and widely discussed plugins for WordPress. This plugin is easy to use and is compatible with most WordPress plugins. It works as an overall SEO plugin, automatically generating META tags, optimizing your titles for search engines and helping you avoid duplicate content. It also enables you to manually set META tags (title, description and keywords) for each page and post on your website.

2. HeadSpace2

A powerful all-in-one plugin to manage meta-data and handle a wide range of SEO tasks. It allows you to tag your posts, create custom titles and descriptions, thereby improves your page ranking and relevance on search engines. You can also change the theme or load plugins on specific pages and much more. This plugin is available in multiple languages.

3. Platinum SEO plugin

An all-in-one SEO plugin with a host of features, such as automatic 301 redirects for permalink changes, auto-generation of META tags, avoidance of duplicate content, SEO-optimized post and page titles, and a whole lot more.

4. TGFI.net SEO WordPress Plugin

This plugin is a fairly modified version of the All in One SEO Pack. Its unique feature is that it’s directed at people who use WordPress as a CMS. It can auto-generate titles, descriptions and keywords when overrides are not present, and it also avoids duplicate content.

5. Google XML Sitemaps

Generates an XML sitemap supported by Google, Bing, Yahoo and Ask. Sitemaps make it much easier for crawlers to see the complete structure of your website and retrieve it more efficiently. The plugin also notifies all major search engines every time you create a new post. You can choose to write either a normal XML file or a zipped one, and in case of errors you can rebuild the sitemap manually. As an aside, Ping-O-Matic can be used to ping multiple search engines and other specialized services about your blog.

6. Sitemap Generator

Creates a highly customizable sitemap for your WordPress-powered website. It lets you choose what to show and what not to, including the order in which to list items. It supports multi-level categories, pages and permalinks.

7. SEO Slugs

Slugs are the URL-friendly names assigned to your posts, e.g. http://yourblog.com/what-you-can-do-immediately-for-higher-rankings. This plugin removes common words like ‘a’, ‘the’, ‘in’, ‘what’ and ‘you’ from the automatically assigned post slug to make it more search engine friendly.

8. SEO Post Links

This plugin works similarly to SEO Slugs. It shortens the post slug and retains only the necessary keywords, making it search engine friendly. It also lets you set the maximum number of characters in your post slug and remove unnecessary words.

9. Automatic SEO links

Just choose a word or a phrase for automatic linking, and this plugin will replace all matches in your weblog’s posts with links. It allows you to set the title, target and rel attribute for each link. You can also set the anchor text and choose whether the link should be nofollow. If a word is repeated, only the first match in the post will be replaced.


10. SEO Smart links

Automatically links keywords and phrases on your blog to corresponding posts, pages, categories or tags on your blog. It allows you to set up your own keywords with a list of matching URLs and to set the nofollow attribute. You can customize it to your needs through the administration settings panel.


11. WP Backlinks

This plugin makes the task of link exchange very simple. Once installed, it puts a small form in your blog’s sidebar that allows webmasters and other bloggers to quickly submit a link for exchange. The plugin then spiders the webmaster’s site for a reciprocal link, and if everything checks out, you will have made a successful link exchange. It also has the option of displaying different links on different pages.

12. SEO Title Tag

SEO Title Tag makes it easy to optimize title tags across your WordPress-powered blog. It allows you to override a page’s title tag with a custom one, to mass-edit title tags, to set title tags for 404 error pages and much more.

13. 404 SEO plugin

Gives you a smart, customized “Page Not Found (404)” error message and automatically displays links to relevant pages on your site, based on the words in the URL that was not found.

14. Redirection

This plugin helps you manage 301 redirections and keep track of 404 errors so that you can correct them. It also lets you monitor your redirects with full logs of all redirected URLs, plus an RSS feed for 404 errors, and it automatically adds a redirection when a post’s URL changes.

15. Simple Submit SEO/Social Bookmarking Plugin

This plugin adds submission links for Digg, Delicious, Buzz and StumbleUpon to pages and posts. It lets you choose whether to display them on the home page, on post pages, on all pages and so on.

16. AntiSocial

Adding this plugin to your blog allows readers to submit your posts to Digg, Reddit, Del.icio.us, StumbleUpon and other social bookmarking sites. It adds a row of buttons with links to the sites and also adds nofollow to the links. It is actually a hacked version of the famous Sociable plugin.

17. AddToAny

Helps your readers share, save, email and bookmark your posts and pages. It supports over a hundred social bookmarking and sharing sites. It comes with a smart menu that places the services a visitor uses most at the top, based on their browsing and usage history.


18. SEO Friendly Images

This plugin makes your images SEO friendly. It automatically gives all images proper ALT and TITLE attributes: ALT acts as a description of the image, and TITLE is the tooltip text displayed when the mouse hovers over the image. These attributes are an important part of SEO.

19. Robots Meta

A very easy way to add robots meta tags to your WordPress pages. It allows you to add meta tags to individual posts and pages, and to prevent indexing of your comments, login and admin pages.

20. Nofollow Case by Case

Allows you to selectively apply or remove nofollow attributes on comment links, comment author links, pingbacks and trackbacks, and to open comment links in a new window. If not configured, it automatically strips nofollow attributes from all your comment links and comment author links.

January 06 2010

13:48

Website Performance: What To Know and What You Can Do


Website performance is a hugely important topic, so much so that the big companies of the Web are obsessed with it. For the Googles, Yahoos, Amazons and eBays, slow websites mean fewer users and less happy users and thus lost revenue and reputation.

In your case, annoying a few users wouldn’t be much of a problem, but if millions of people are using your product, you’d better be snappy in delivering it. For years, Hollywood movies showed us how fast the Internet was: time to make that a reality.

Even if you don’t have millions of users (yet), consider one very important thing: people are consuming the Web nowadays less with fat connections and massive computers and more with mobile phones over slow wireless and 3G connections, but they still expect the same performance. Waiting for a slow website to load on a mobile phone is doubly annoying because the user is usually already in a hurry and is paying by the byte or second. It’s 1997 all over again.


Performance is an expert’s game… to an extent. You can do innumerable things to make a website perform well, and much of it requires in-depth knowledge and boring testing and research. I am sure a potential market exists for website performance optimization, much like there is one now for search engine optimization. Interestingly, Google recently announced that it will factor performance into its search rankings, so this is already happening. That said, you can do a lot of things without having to pay someone to point out the obvious.

Know Your Performance Blockers

Performance can be measured in various ways. One way is technical: seeing how fast a page loads and how many bytes are transferred. Another is perceived performance, which ties into usability testing. This can only be measured by testing with users and seeing how satisfied they are with the speed of your interface (e.g. do they start clicking on your JavaScript carousel before it is ready?).

The good news (and hard truth) about performance is that 80 to 90% of poor performance happens in the front end. Once the browser gets the HTML, the server is done and the back-end developer can do nothing more. The browser then starts doing things to our HTML, and we are at its mercy. This means that to achieve peak performance, we have to optimize our JavaScript, images, CSS and HTML, as well as the back end.

So here are the things that slow down your page the most.

External Resources (Images, Scripts, Style Sheets)

Every time you load something from another server, the following happens:

  1. The browser opens up the Internet’s address book and looks up the number associated with the name of the server that’s holding the things you want (i.e. its DNS entry).
  2. It then negotiates a delivery.
  3. It receives the delivery (waiting for all the bytes to come in).
  4. It tries to understand what was sent through and displays it.

Every request is costly and slows down the loading of the page, all the more so because browsers load resources in small batches (usually four at a time) rather than all at once. This is akin to ordering a product from a website, choosing the cheapest delivery option and not being at home between 9:00 am and 5:00 pm. If you include several JavaScript libraries because you like a certain widget in each, then you double, triple or even quadruple the time that your page takes to load and display.

Scripts

JavaScript makes our websites awesome and fun to use, but it can also make for an annoying experience.

The first thing to know about scripts that you include in a document is that they are not HTML or CSS; the browser has to call in an expert to do something with them. Here is what happens:

  1. Whenever the browser encounters a <script> block in the document, it calls up the JavaScript engine, sits back and has a coffee.
  2. The script engine then looks at the content in the script block (which may have been delivered earlier), sighs, complains about the poor code, scratches its head and then does what the script tells it to do.
  3. Once the script engine is done, it reports back to the browser, which puts down its coffee, says good-bye to the script engine and looks at the rest of the document (which might have been changed, because the script may have altered the HTML).

The moral of the story is to use as few script blocks as possible and to put them as far down the document as possible. You could also use clever and lazy JavaScript, but more on that later.

Images

Here is where things get interesting. Optimizing images has always been the bane of every visual designer. We build our beautiful images in Illustrator, Photoshop or Fireworks and then have to save them as JPG, GIF or PNG, which changes the colors and deteriorates the quality; and if we use PNG, then IE6 arrives as the party-pooper, not letting us take advantage of PNG’s cool features.

Optimizing your images is absolutely necessary, because most of the time they are the biggest files on the page. I’ve seen people jump through hoops to cut their JavaScript down from 50 KB to 12 KB and then happily use a 300 KB logo or “hero shot” in the same document. Performance needs you!

Finding the right balance between visual loss and file size can be daunting, but be grateful for the Web preview tool, because we didn’t always have it. I recall using Photoshop 4 and then Photoshop with the Ulead SmartSaver, for example.

The interesting thing about images, though, is that after you have optimized them, you can still save many more bytes by stripping unnecessary data from the files and running them through tools that compress the images further but losslessly. The bad news is that many such tools are out there, and you’ll need different ones for different image formats. The good news is that tools exist that do all of that work for you, and we will come back to them later. For more advanced optimization techniques, take a closer look at Smashing Magazine’s articles Clever JPEG Optimization Techniques, PNG Optimization Guide and Clever PNG Optimization Techniques.

Simple Tools You Can Use Now To Improve Performance

All of those companies that obsess about page performance offer tools that allow you to check your own website automatically and make it easy to work around problems.

Test Your Performance

The first thing to do is find out how your website can be optimized. Here are three great tools (among others that crop up all the time) to use and combine.

Yahoo’s YSlow

YSlow is a Firebug add-on from Yahoo that allows you to automatically check your website for performance issues. The results are ranked like American school grades, with A being the best and F being the worst. The grades are cross-linked to best-practice documentation on the Yahoo performance pages. You can test against several rule sets: “classic YSlow,” which is targeted at Yahoo-sized websites, “YSlow 2” and “small site or blog.” Results are listed clearly and let you click through to learn more.


In the components view, YSlow lists all of the issues it has found on your website and how serious they are:


The statistics view in YSlow gives you all information in pie charts:


The tools section in YSlow offers a lot of goodies:

  • JSLint
    Checks the quality and security of your JavaScript by running it through JSLint.
  • All JS
    Shows all JavaScript code in a document.
  • All JS Beautified
    Shows all JavaScript code in a document in an easy-to-read format.
  • All JS Minified
    Shows all JavaScript code in a document in a minified format (i.e. no comments or white space).
  • All CSS
    Shows all CSS code in a document.
  • All Smush.it
    Automatically compresses all of your images (more on this later).
  • Printable View
    Creates a printable document of all of YSlow’s results (great for showing to a client after you’ve optimized the page!).


Google’s Page Speed

Like YSlow, Page Speed by Google is also an add-on for Firebug. Its main difference is that it does a lot of the optimization for you and provides the modified code and images immediately.


Page Speed’s other extra is that it monitors the overall activity of your page, allowing you to see when a document loads other resources after it has been loaded and to see what happens when a user rolls over elements or opens tabs and menus that load content via AJAX.


Be careful with this feature, though: it hammers your browser quite hard.

AOL’s WebPageTest

Rather late to the game, AOL’s WebPageTest is an application with some very neat features. (It is also available as a desktop application, in case you want to check Intranets or websites that require authentication.)

WebPageTest allows you to run tests using either IE8 or IE7 from a server in the US or the UK, and it allows you to set all kinds of parameters, such as speed and what to check for:


Once you have defined your parameters and the testing is completed, you will get in-depth advice on what you can do to optimize. You’ll get:

  • A summary,
  • Detailed results,
  • A performance review,
  • An optimization report,
  • The content breakdown,
  • The domain breakdown,
  • A screenshot.


One very cool feature of WebPageTest is the visual recording you get of how long it takes for page elements to show up on screen for users. The following screenshot compares the results of this blog, Ajaxian and Nettuts+:


You can even create a video of the rendering, which is another very cool thing to show clients.

Once you get the test results, it is time to fix any problems.

Use Image Sprites

Image Sprites were first discussed in an article published by Dave Shea and based on the work of Petr Stanicek. They have been covered extensively here before, but understanding their full benefit is important before you start using them:

  • All of your images will be available as soon as the main image has loaded (no flickering on roll-overs or other annoyances).
  • One HTTP request is made, instead of dozens (or hundreds, in some cases).
  • Images have a much higher chance of staying cached on the user’s machine because they are contained in a single file.

Shea’s article points out a lot of cool resources for creating CSS Sprites but misses one that was released not long ago. Sprite Me was produced by Google (under the supervision of Steve Souders) and allows you to create Sprites automatically from any website, even via a bookmarklet. It analyzes the images on a page and offers you various options before generating the Sprite and CSS for you.

Here’s a video of Steve showing Sprite Me in action:


Optimize Your Images

You know now that Page Speed can automatically optimize your images. Another way to do this is with Yahoo’s Smush.it, a set of image optimization tools that analyzes your images, creates the smallest possible versions and sends you a ZIP file of them all.

You can use Smush.it directly in the browser or automatically from YSlow. The website tells you how many bytes you can save by optimizing your images. This is yet another cool thing to show potential clients when pitching for a job.



Collate Scripts and Load Scripts on Demand

As noted, try not to spread your <script> nodes all over the document, because the browser stops whenever it encounters one. Instead, insert them as far down in the document as possible.

You could even collate your scripts automatically in one single include using back-end scripts. Edward Eliot wrote one of these in PHP a while ago. It lets you create a single JavaScript include for all of your scripts and one for your CSS files, and it even versions them for you.

JavaScript can be added dynamically to the page after the page has loaded. This technique is called “lazy loading,” and several tools are available to do it. Jan Jarfalk has one to lazy load jQuery plug-ins.

Some JavaScript libraries let you import only what you really need, instead of bringing in the whole singing-and-dancing library. YUI, for example, has a configurator that allows you to pick and choose what you need from the library and either gives you a single URL where you can get the different scripts or creates a JavaScript that loads them on demand:


Notice that a tab tells you what the overall size of the library will be.

The main trick in lazy loading is to dynamically create script nodes with JavaScript after the page has loaded and only when they are needed. I wrote about that two years ago on 24ways, and it has been a best practice for displaying badges and widgets for a long time now.
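Here is a minimal sketch of that trick (the URL and the callback are hypothetical):

function loadScript(src, callback) {
    var s = document.createElement("script");
    s.src = src;
    s.onload = callback; // older IE needs onreadystatechange instead
    document.getElementsByTagName("head")[0].appendChild(s);
}

// only fetch the badge code once the page itself has finished loading
window.onload = function () {
    loadScript("http://widgets.example.com/badge.js", function () {
        // render the badge here
    });
};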

Use Network Distributed Hosting

If you use a library, or CSS provided by a library, make sure to use the hosted versions of the files. In the case of YUI, this is done for you if you use the configurator, and you can pick from Yahoo’s or Google’s network.

For other libraries, there is a Google code repository of AJAX libraries. This is useful for a few reasons:

  • Visitors to your website will get the JavaScript and CSS from the server that is as geographically close to them as possible and won’t have to wait for your server to send the information from around the globe.
  • There is a high probability that these servers are faster than yours.
  • Visitors who have visited other websites that use the same includes will already have them on their computers and won’t need to load them again.
  • You save on bandwidth and can easily upgrade the library by changing the version number of the include.

While you probably wouldn’t be able to afford distributed hosting for your own files, Coral makes an interesting offer to distribute your data onto a network of servers for an affordable $50 a month.

Watch Some Videos

If you want to see how some of this work, check out the following videos, especially Nicole Sullivan’s, which shows some very cool CSS tricks:

Follow The Leaders

To learn more about website performance, here are some resources and people to follow. (Be warned: some of the content is technically tough.)

(al)


© Christian Heilmann for Smashing Magazine, 2010.
Post tags: optimization, performance
