Search Engine Optimization in ASP.NET

by Dmitry Kirsanov 30. November 2011 09:19

The ASP.NET engine is a wonderful thing. It does so much for you that, in the past, would take ages and earn you the title of demigod of Web Development. In some legacy languages, like PHP, it is still the case. For example, things like localization and cross-browser support are natural and almost automatic in ASP.NET; they require very little time or skill. You can add support for more languages and different output formats for dates and numbers, add Ajax-powered controls, caching, and data access, and pull off various other acrobatics without thinking twice. I remember the time when preserving data between post-backs was a real problem; ASP.NET solved it with the VIEWSTATE feature, which saves the state of page controls in one hidden field between submits. That feature alone is worth a thousand words.

But ASP.NET comes to the rescue not only in obvious and direct ways: some features were introduced for one purpose yet turn out to be a perfect fit in other areas. One such area is Search Engine Optimization, or SEO.

For a search engine, be it Google, Bing, Yahoo, Baidu or any of dozens of others, one very important thing to keep your eye on is the size of the page header: everything inside the <HEAD> tag. Among other things, the search engine takes the page title and keywords from there. Search engines prefer to be fast, so extra-long page headers are simply discarded. And what does "discarded" mean for you? Right: no indexing.

Size Matters

I mentioned earlier a pretty neat feature called View State, which saves the state of your web controls (text boxes, check boxes and everything else), so you get them back in the same state after you submit data to the server.
The drawback of this feature is that View State is rendered as a hidden field at the very beginning of your page output, and in some situations it may become huge: tens or even hundreds of kilobytes. View State keeps every property of a page element that was changed from its default (i.e. from what you set at design time), and some controls have a whole army of properties.

When a search engine encounters an extra-long header, you might expect it to read everything within the allowed limit, say 2 kilobytes, and discard the rest. But that's not the case. The HTML page is parsed much the same way as XML, so a broken HTML tag structure will either be discarded entirely or parsed as a plain text document instead of an interactive web page. For an XML parser it is paramount to have the enclosing tag (in this case <HEAD>), and if you cut off the closing tag, there is no structure left to read. So it's all or nothing.

So what if we have to keep that View State for tons of controls and still need the page to be indexed? We either move it somewhere else or compress it.

Moving View State somewhere else means that you override the saving of View State at the page level, store it in a database under a unique ID, and send that ID to the client instead of the real View State. If the ID is a GUID (globally unique identifier, SQL data type "uniqueidentifier"), it takes only 32 bytes of your page. The price is having to contact your SQL server on every post-back; with Ajax, that could be a pain.
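Here is a minimal sketch of that approach; the PageStateStore table, its columns and the "Main" connection string are hypothetical, so adapt them to your own schema:

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.IO;
using System.Web.UI;

public partial class ProductPage : Page
{
    protected override void SavePageStateToPersistenceMedium(object state)
    {
        Guid id = Guid.NewGuid();

        // Serialize the state the same way ASP.NET does by default.
        StringWriter writer = new StringWriter();
        new LosFormatter().Serialize(writer, state);

        using (SqlConnection conn = new SqlConnection(
            ConfigurationManager.ConnectionStrings["Main"].ConnectionString))
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO PageStateStore (Id, State) VALUES (@id, @state)", conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            cmd.Parameters.AddWithValue("@state", writer.ToString());
            conn.Open();
            cmd.ExecuteNonQuery();
        }

        // Only the 32-character GUID travels to the client.
        ClientScript.RegisterHiddenField("__VSTATE_KEY", id.ToString("N"));
    }

    protected override object LoadPageStateFromPersistenceMedium()
    {
        Guid id = new Guid(Request.Form["__VSTATE_KEY"]);

        using (SqlConnection conn = new SqlConnection(
            ConfigurationManager.ConnectionStrings["Main"].ConnectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT State FROM PageStateStore WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", id);
            conn.Open();
            return new LosFormatter().Deserialize((string)cmd.ExecuteScalar());
        }
    }
}

You would also want to purge old rows periodically, since every page view inserts a new record.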
Compressing View State, though, is another story. As before, you override the saving and loading of View State at the page level, then compress its contents before saving and decompress them before parsing. A very simple thing if done right. I will include a C# class to compress your View State in this post, so you can implement it in your web application.
Compression can cut the View State size in half or better: the more you have to keep in View State, the better the compression ratio.

So that is one of the things you can do to help both search engines and your visitors fetch and parse your pages faster. Or, in some cases, to parse them at all.

Keeping Pigs Together

As previously discussed, ASP.NET 4.5 introduces a new feature called bundling, which lets you shorten your page header greatly by merging all style sheets into one stream and all JavaScript files into another, replacing the individual links to CSS and JS files. Depending on the project, this can cut a few hundred bytes from the page header. Keeping the file names short and simple helps as well.
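A minimal sketch of registering bundles with System.Web.Optimization follows; the bundle names and file paths are placeholders, not part of any real project:

using System.Web.Optimization;

public static class BundleConfig
{
    // Call this from Application_Start in Global.asax:
    //   BundleConfig.RegisterBundles(BundleTable.Bundles);
    public static void RegisterBundles(BundleCollection bundles)
    {
        // All scripts merge into one virtual URL and one HTTP request...
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-1.7.1.js",
            "~/Scripts/site.js"));

        // ...and all style sheets into another.
        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/layout.css",
            "~/Content/site.css"));
    }
}

In your markup, a single Scripts.Render("~/bundles/site") or Styles.Render("~/Content/css") call then emits one reference instead of a dozen.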

Remember that search engines of the new generation (at least Google these days, which means others will follow) can fetch and run your JavaScript in order to render the page the right way. So although your scripts will not be indexed, they should be easy to fetch. If you have large JavaScript or Cascading Style Sheets files, consider switching on static file compression on your server. This makes IIS pre-compress your static files before sending them to the client, and JavaScript / CSS files often compress 10 times or better, so it can give you up to a 90% advantage in transfer speed.

I will return to the Search Engine Optimization topic later, but then it won't be so tightly tied to ASP.NET.

Attached is the C# class you can use in your ASP.NET application to compress the View State. To use it, override two methods in the page whose View State you need to compress:

protected override object LoadPageStateFromPersistenceMedium()
{
    return viewstate.DecompressPageState(Request.Form["__VSTATE"]);
}

protected override void SavePageStateToPersistenceMedium(object viewState)
{
    viewstate.CompressPageState(viewState, this.Page);
}

viewstate.zip (737.00 bytes)
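
If you'd rather see the idea than download the archive, here is a minimal sketch of what such a helper can look like, pairing LosFormatter with GZipStream; the actual attached implementation may differ in its details:

using System;
using System.IO;
using System.IO.Compression;
using System.Web.UI;

// Hypothetical stand-in for the attached class, matching the calls above:
// it GZip-compresses the serialized state and stores the Base64 result
// in a custom __VSTATE hidden field.
public static class viewstate
{
    public static void CompressPageState(object state, Page page)
    {
        // LosFormatter produces the same Base64 string that ASP.NET
        // would normally put into __VIEWSTATE.
        StringWriter writer = new StringWriter();
        new LosFormatter().Serialize(writer, state);
        byte[] raw = Convert.FromBase64String(writer.ToString());

        using (MemoryStream output = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(raw, 0, raw.Length);

            page.ClientScript.RegisterHiddenField(
                "__VSTATE", Convert.ToBase64String(output.ToArray()));
        }
    }

    public static object DecompressPageState(string compressed)
    {
        byte[] packed = Convert.FromBase64String(compressed);

        using (GZipStream gzip = new GZipStream(
            new MemoryStream(packed), CompressionMode.Decompress))
        using (MemoryStream output = new MemoryStream())
        {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);

            return new LosFormatter().Deserialize(
                Convert.ToBase64String(output.ToArray()));
        }
    }
}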

 

Speed Through Size

Minimizing the output greatly improves the overall responsiveness of your website. At a larger scale it also lets you run fewer servers in your web farm, if throughput is the bottleneck of your solution. What is usually neglected is that high speed also improves your standing with search engines. So if you own the web server (i.e. have administrator access to the IIS management console), you may benefit from switching on traffic compression.

All modern browsers (and search engine bots as well) support HTTP compression, which can dramatically decrease the amount of traffic you send to your clients. However, it comes at a price. Enabling compression for dynamic content, like your .aspx pages, increases the load on your CPU, since the content is unique each time and therefore has to be compressed for every response. If your CPU is powerful enough, try it out and see whether your website becomes more responsive.
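Switching compression on can be as small as this web.config fragment for IIS 7 and later (a sketch; on some editions the compression modules must first be installed as a role feature):

<configuration>
  <system.webServer>
    <!-- Static files are compressed once and cached; dynamic responses
         (.aspx output) are compressed on every request, at CPU cost. -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>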

Static content compression, on the other hand, compresses each file only once and stores the compressed copy in a special directory you define in the settings of your web server. So if you have large JavaScript files or style sheets of more than 100 KB in size, you can enable static file compression in IIS and set the minimal file size to 100 KB. Being ordinary text files, JavaScript and CSS compress at a great ratio, so compression can increase the effective throughput for these files by a factor of ten.
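The cache directory and the size threshold live in the httpCompression section, which IIS normally locks at the server level (applicationHost.config); here is a sketch matching the 100 KB threshold above, with the path and MIME types as examples only:

<!-- applicationHost.config; minFileSizeForComp is in bytes (100 KB = 102400) -->
<httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files"
                 minFileSizeForComp="102400">
  <staticTypes>
    <add mimeType="text/css" enabled="true" />
    <add mimeType="application/x-javascript" enabled="true" />
  </staticTypes>
</httpCompression>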
