Category Archives: ASP.NET

Speeding Up Your Web Site with YSlow for Firebug

I’m always looking for an edge over our competitors to make using our e-commerce sites better from a usability standpoint. I think one of the easiest things to make the experience better is to make sure your site is responsive when people visit it, no matter what kind of connection they have or what they have for a computer. I decided to do some research on how to improve our sites download times and came across YSlow for Firebug.

YSlow is a Firefox extension that plugs into the Firebug extension. Any developer that doesn’t use Firebug is really missing out. So if you don’t have it, get it. Anyway, you can install YSlow right into Firefox and get access it through Firebug.

Upon analyzing our site the first time, we received a score of 42 from YSlow, which was an F. Ouch. That didn’t make me feel all that great about our site. You can see screenshots of our initial scores here and here. We scored really low on all but four of the thirteen performance criteria. I decided to attack the easiest tasks first: Minify JS, Add an Expires header, and Gzip components.

I minified our JavaScript files using a utility called JSMin. It basically removes all whitespace and line returns from your file. It doesn’t compress the code all the way, but I wanted it to remain a little readable in case I needed to look at the code on the live site.

Next, I wanted to handle adding an Expires header. Since we use ASP.NET and C# for our web application, I was able to write an HttpHandler to do this for me. What was even better was that I was able to handle the Expires header and another issue, ETags configuration, in the same snippet of code. For each request, our HttpHandler adds an empty ETag and an Expires header set three days in the future. Both are used to determine when a cached copy of a page needs to be refreshed: the ETag lets the browser check whether its cached version still matches the server’s, and the Expires header sets how long the page can be served from cache without asking the server at all.
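The actual handler code isn’t shown in this post, but the idea can be sketched roughly like this (names are hypothetical, and I’m writing it as an HttpModule here since that’s the more common place to touch response headers; our real code may differ):

```csharp
// Hypothetical sketch only -- not the code described above.
// An ASP.NET module that sets an empty ETag and an Expires date
// three days in the future on every response.
using System;
using System.Web;

public class CachingHeadersModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PreSendRequestHeaders += delegate(object sender, EventArgs e)
        {
            HttpResponse response = ((HttpApplication)sender).Response;
            response.Cache.SetCacheability(HttpCacheability.Public);
            response.Cache.SetExpires(DateTime.Now.AddDays(3)); // Expires header
            response.Cache.SetETag("");                         // empty ETag
        };
    }

    public void Dispose() { }
}
```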

Lastly, I wanted to GZip all of our components. This just required configuration of our IIS Server. You can also do this directly within your .NET application, but I didn’t see the value in this as IIS could do it for us.
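We did this through the IIS admin tools; for reference, on IIS 7 and later the equivalent can be switched on in web.config (on IIS 6 it lives in the metabase instead). A sketch, not our exact config:

```xml
<!-- IIS 7+ only: enable compression for static and dynamic content -->
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>
```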

After implementing these changes and a few other mundane ones, I ran YSlow again. Lo and behold, we’d gone from a score of 42 to a score of 76. Not bad! We’re now scoring a “High C” according to YSlow. From a usability standpoint, I could definitely tell that the site responded much faster than it did when we were scoring a 42. For those of you who would like to see screenshots of the stats, you can see them here and here. Looking at the stats, you can see that we cut the data downloaded from 413.1k to 234k, which is a huge improvement.

I strongly recommend that anyone developing web applications take a look at YSlow. You might not be able to address every point it flags, but even two or three changes should net you some great improvements in the performance of your site.

ASP.Net HyperLink Control And Html Encoded Ampersands

I just ran into some odd behavior with the HyperLink control in ASP.Net. Per the W3C, you’re supposed to HTML-encode ampersands, using &amp; instead of a bare ‘&’ when building URLs in your HTML code. The reason is that the ‘&’ is assumed to start an entity reference. What’s nice is most web browsers can recover from this type of error, but if you want your site to pass validation, you need to use &amp; instead.

So I hooked up all of our URLs to use this method, especially when we wrote out URLs in our C# classes. What I found odd was that if I did this with a HyperLink control instead of an HtmlAnchor control, .NET would encode the &amp; again, writing &amp;amp; out in the URL. Naturally this broke our site, as query string references weren’t parsed properly. The fix was to use an HtmlAnchor instead.
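You can see the mechanism with a quick console test: HTML-encoding a URL that already contains &amp; double-encodes the ampersand, which is effectively what HyperLink does to its NavigateUrl. (This sketch uses WebUtility.HtmlEncode to stand in for the encoding the control applies.)

```csharp
using System;
using System.Net;

class AmpersandDemo
{
    static void Main()
    {
        // A query string already HTML-encoded, per the W3C recommendation:
        string url = "page.aspx?cat=5&amp;id=10";

        // Encoding it again double-encodes the ampersand -- the browser now
        // sees a literal "&amp;" in the query string instead of "&":
        Console.WriteLine(WebUtility.HtmlEncode(url));
        // -> page.aspx?cat=5&amp;amp;id=10
    }
}
```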

I’m not really sure why .NET does this or if there’s another workaround for it, but this solution worked for me. I’d be curious to know the reason behind the behavior though.

Import Costumes Goes Live!

Our new e-commerce site, ImportCostumes.com, has gone live!  It’s been months since development began, and after lots of blood, sweat, and tears (mostly over PayPal account setup and integration; more on that to come later), we’re ready to take orders.  We have lots of cool men’s costumes and women’s costumes.  We also have a large selection of boy’s and girl’s costumes, not to mention awesome accessories!  Come check it out and buy some stuff!

Weird IE6/ASP.NET SSL Bug

We found a weird IE6 bug in Fright Catalog and YumDrop today.  Turns out, during the checkout process, IE6 would display a message to the user that they were navigating from a secure page to an insecure page.  Firefox, IE7, and Camino wouldn’t display any notification at all.  The IE6 warning was totally bogus, since the resulting page was definitely secure (little gold lock, no messages, etc.).  For a while, I thought it was a bug in IE6, but the more I thought about it, the less sense that made.

That’s when I discovered that the notification only came up on a PostBack on a checkout page, not when the page first loaded.  So, knowing that we manage our secure pages using what we call a Redirect Manager, I took a look at that.  After some digging, I realized I was kind of barking up the wrong tree, and the real issue was the lack of a trailing slash on the URL.  This is something that we can either fix in our redirect or in the Secure Redirect module we use.  What I found really interesting though, is that this wasn’t an issue in Firefox, Camino, or IE7.
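For anyone hitting something similar, the fix amounts to normalizing the URL before the secure redirect. A hypothetical sketch (our actual Redirect Manager code isn’t shown here):

```csharp
// Hypothetical sketch: append the missing trailing slash before
// redirecting to the HTTPS version, so IE6 doesn't raise its bogus
// secure-to-insecure warning on PostBack.
string path = HttpContext.Current.Request.Path;
if (!path.EndsWith("/"))
{
    path += "/";
}
HttpContext.Current.Response.Redirect(
    "https://" + HttpContext.Current.Request.Url.Host + path);
```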

Frightcatalog.com Re-Launch First Impressions

We’ve had the new version of FrightCatalog.com up for a week now. We’ve had lots of great comments and feedback, but success is never really gauged by that. To that end, it’s good to see that a lot of hard work has paid off, and I think we’re starting to see that in the numbers in Google Analytics.

The first half of the month (July 1 – July 18) we saw 83,796 uniques, 9.27 pages/visit, and a 0.33% conversion rate.
The last week (July 19 – July 25), with the re-launch, we’ve seen 41,673 uniques, 12.00 pages/visit, and a 0.75% conversion rate.

Almost half as many uniques in one week as we had in the first 18 days of the month. Pages per visit are up by almost three, and our conversion rate has more than doubled. Good initial numbers after week one.

This is starting to prove that our de facto framework for our e-commerce sites is pretty solid (all software can be improved, and we’ll continue to do so) and has a lot of promise. It’ll be nice to see how we do through the entire Halloween season with some new marketing initiatives in place. I hope this is a sign of good things to come.

New FrightCatalog.com – 2007

It’s been quite a long week. Late last night saw the coming of the latest edition of FrightCatalog.com. I think you’ll notice the more product-centric slant it’s taken on this year. A little less glitz, a little more practicality. It should be MUCH faster too. Same framework as YumDrop.com and the soon-to-come ImportCostumes.com. Thanks to Kyle for his help, even though he’s moved on to a new venture.

The Debugger IS Your Friend!

I’ve been trying to contribute (when I have time) to the ASP.NET Forums lately. I’ve noticed that a lot of people who are new to ASP.NET don’t seem to know how to use the Visual Studio debugger. When you see funky things going on in your code or behavior that you can’t explain, jump into the debugger. It’s saved me major headaches before. It’s also good to walk through your code line by line for testing purposes; it makes you analyze each line of code to make sure it’s doing what you expect it to do. So, moral of the story: don’t be afraid of your debugger, it’s your friend!

Norton AntiVirus & ASP.NET Request.UrlReferrer

I like it when my code works. Most developers do. What I don’t like is when code I KNOW works all of a sudden STOPS working. I spent a good three hours ripping my hair out trying to figure out why UrlReferrer was suddenly always null in my ASP.NET application. A week ago, my code worked great. Today it doesn’t. I did a lot of digging and asked a couple of questions on the ASP.NET forums, but no luck. Then today, I stumbled upon this thread. Could this be? A third-party application, one I use to protect my development machine, hosing my app? Turns out it was. Calling the same code from my MacBook worked as expected.

So what happened? I think a NAV virus definition update changed how the headers appear in HTTP requests from my machine, as stated in the article. I realize we depend on these third party virus protection software applications to protect us from malicious software and the like, but this is crazy. At least tell me you’re doing it and then give me an EASY way to turn it off.
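The practical takeaway for the code itself: never assume the Referer header made it through. A minimal guard (hypothetical page code):

```csharp
// UrlReferrer is built from the Referer header, which proxies and
// security software (like NAV here) can strip -- so it may be null.
Uri referrer = Request.UrlReferrer;
string cameFrom = (referrer != null) ? referrer.AbsoluteUri : "(unknown)";
```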

Another interesting implication of this, briefly touched on in the above-mentioned thread, is: would this affect statistics-capturing software? Would the stats in Google Analytics be skewed? My guess is yes, to some degree, if header information is held back. What are other people’s thoughts on this?

ASP.NET Caching

There are a few types of caching in ASP.NET, specifically Page Output Caching and Object Data Caching. There are whole hosts of articles out there on using the cache, but for the purposes of this post, I’m going to talk about just using Object Data Caching. To many, this will be obvious, but some cache code I wrote recently made me remember that this can sometimes be confusing in the context of an ASP.NET page.

Typically, you can access the cache by calling Cache.Add or HttpContext.Current.Cache.Add. This gives you access to the application-wide Cache object. Any object you cache here will be available to any class within your application for the time period you specified for the cached object.
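For instance, a typical application-wide insert with an absolute expiration might look like this (the key and helper method here are hypothetical):

```csharp
// Cache the category list for ALL requests for 10 minutes.
HttpContext.Current.Cache.Insert(
    "AllCategories",                 // cache key
    LoadCategoriesFromDb(),          // object to cache (hypothetical DB helper)
    null,                            // no cache dependency
    DateTime.Now.AddMinutes(10),     // absolute expiration
    System.Web.Caching.Cache.NoSlidingExpiration);
```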

But what if you just want to cache an object for a single request? Say you have an item object that you reference in four or five controls on a page. You don’t want to go to the DB to get the data each time you reference it. Instead, use the HttpContext.Current.Items collection and put your object there. It will be good for the duration of the request.
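A per-request version of the same idea (the key, type, and helper are hypothetical):

```csharp
// Load the item once per request; later controls on the page reuse it.
Item item = HttpContext.Current.Items["CurrentItem"] as Item;
if (item == null)
{
    item = LoadItemFromDb();                          // hypothetical DB call
    HttpContext.Current.Items["CurrentItem"] = item;  // gone when the request ends
}
```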

The two are very different. The first will make your cached data available application wide, to all requests. This can be dangerous if the data might be different for one user than it is for another. The second is available only to the current request. Use this if you load up an object say from the database in one control, but need it in subsequent controls that make up the same request.

Be careful with your cached objects and keep an eye on where you’re putting them. Not doing so can have some potentially disastrous effects.