Category Archives: Web Development

Cranked stuff about Web Development

Helicon Tech ISAPI Rewrite 3 Problem with ASP.NET 4.0, IIS 6 & Extensionless URLs

For years we have used Helicon Tech’s ISAPI Rewrite plugin for IIS to generate pretty URLs for our ASP.NET sites.  A few months back I was in the process of migrating our ASP.NET web applications to the 4.0 Framework and I ran into an issue with ISAPI Rewrite and our site’s URLs.  Basically, it turned out that ISAPI Rewrite wasn’t even getting the chance to process our URL rewrites as it should.

The bottom line here is that with the 4.0 Framework and IIS 6.0, Extensionless URLs are turned on by default.  Since our rewrites were dependent on the ASPX extension to map our pretty URLs to actual pages, I had to turn this feature off.  To fix this, I had to go into the registry and find this key value:


Then I had to add/edit this DWORD value:

EnableExtensionlessUrls = 0

I restarted IIS and ISAPI Rewrite worked like a charm.  I didn’t have this issue on my development box which runs 64-bit Windows 7 Professional and IIS 7.5.  I only had this issue on our testing and production environments which run Windows 2003 and IIS 6.0.
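For my own reference, the registry change above boils down to a .reg fragment like this; the key path is the one Microsoft documents for the ASP.NET 4.0 extensionless URL feature, but double-check it against your own system before applying:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ASP.NET\v4.0.30319]
"EnableExtensionlessUrls"=dword:00000000
```

Import the .reg file (or make the edit by hand in regedit), then restart IIS for the change to take effect.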

Breaking ASP.NET 4.0 Framework on Windows 7 and IIS 7.5 with ASP.NET 1.1 Framework

The other day, I wanted to migrate some database changes from our development environment to our staging servers. The tool we typically use to do this is kind of old and requires the ASP.NET 1.1 Framework. So, during the installation of the tool on my new Windows 7 box, I saw a message requiring the 1.1 framework and decided to install it. During its installation, a warning popped up about known compatibility issues on my system, but I decided to proceed anyway. I mean, how bad could it be?

Well, it turned out kind of bad. None of my ASP.NET 4.0 web applications on my development box would run. I kept getting an error in the framework itself:

Calling LoadLibraryEx on ISAPI filter “C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_filter.dll” failed

Apparently you can’t run the 1.1 Framework alongside the 4.0 Framework, at least not on Windows 7.  It might work fine on XP or Windows 2003 Server, but I’m not 100% sure and I wasn’t going to waste too much time figuring out if you could or not.

To make a long story short, I had to uninstall all of the ASP.NET Frameworks from my development box, re-install the 2.0 and 4.0 Frameworks, and then make sure they were registered by running aspnet_regiis -r in the installation directory for each Framework version.  It took me about half a day of beating my head against my desk to figure out this faux pas.  Better to listen to the warning messages next time…
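The re-registration step looked roughly like this; a sketch assuming the default 32-bit Framework install paths on my box:

```
cd %WINDIR%\Microsoft.NET\Framework\v2.0.50727
aspnet_regiis -r

cd %WINDIR%\Microsoft.NET\Framework\v4.0.30319
aspnet_regiis -r
```

The -r switch re-registers that Framework version with IIS and updates the existing script maps to point at it.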

SqlCacheDependency and Query Notifications

There’s a lot of scattered information out there on how to configure ASP.NET applications to leverage Microsoft SQL Server’s Query Notification and Service Broker services for caching in ASP.NET applications. The two best step by step tutorials I’ve found online are:

Both of those articles should get you started for sure. However, in a few of my sites that leverage Query Notifications for caching, I ran into issues with the application crashing after a period of time. The biggest issue I found was the following exception in our logs:

When using SqlDependency without providing an options value, SqlDependency.Start() 
must be called prior to execution of a command added to the SqlDependency instance.

I never did quite get a handle on what was going on here. I did figure out, though, that I could always find this in my Application log around the time that exception was thrown:

The query notification dialog on conversation handle '{A1FB449B-DEB3-E011-B6D2-002590198D55}.' closed due to the following error: '-8470Remote service has been dropped.'.

So, does this mean that I called SqlDependency.Stop() and now queued notifications aren’t going to be delivered? Are these critical errors that keep the application from coming back? I’ve read that a lot of the Query Notification messages you see in the log aren’t critical errors and can be ignored. I can’t ignore the timing of this error relative to the exception thrown above, though.

Anyway, I finally decided to pull this stuff out of our application until I get a better handle on what’s going on. The last straw was that I was trying to sync some database changes during a maintenance period and I couldn’t get them to sync because of a bunch of these SQL Query Notification issues. As I write this, I can’t even get my database back online as I’m waiting for ALTER DATABASE SET SINGLE_USER to complete (approaching 3 hours!!!). As I keep waiting, my Application log keeps filling up with the following Query Notification messages:

Query notification delivery could not send message on dialog ‘{FE161F6A-D6B3-E011-B6D2-002590198D55}.’. Delivery failed for notification ‘85addbaa-ce66-431d-870f-d91580a7480a;d527d584-9fd4-4b13-85bc-87cb6c2e166f‘ because of the following error in service broker: ‘The conversation handle “FE161F6A-D6B3-E011-B6D2-002590198D55” is not found.’.

In a response to a post I made on the ASP.NET Forums, it was suggested that with all of the cached items in the system, SQL Server simply could not catch up. This is a problem because not only does it slow the entire system down, but when you have to cycle the SQL Server service itself, it takes forever for the system to come back up because all of the notifications get requeued or something.
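If anyone else gets wedged like this, one recovery I’ve seen suggested (a sketch I haven’t fully verified; the database name is a placeholder) is to hand the database a brand new Service Broker identity, which drops all of the backed-up conversations:

```sql
-- Assigning a new broker GUID ends every existing conversation,
-- including the queued query notification dialogs.
ALTER DATABASE MyDatabase SET NEW_BROKER WITH ROLLBACK IMMEDIATE;
```

The obvious caveat: any application relying on those conversations has to restart and call SqlDependency.Start() again afterward.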

What Twitter Means for Your Google SEO

The “intertubes” was abuzz recently with news that Google was going to add social media to its algorithm, meaning that tweets could be of more importance in the future. But exactly how important? I’m not sure anyone really knows, but a few things I would assume out of the gate:

  1. Massive tweeting on your part probably won’t have much effect on the traffic Google sends your way. I honestly don’t think Google will take the text of a tweet at face value. I believe they’ll use it in conjunction with other metrics when placing a value on the importance of a tweet.
  2. Your followers will probably play an important role in the effect of tweets. Just like how similar web sites linking to your site help with your ranking (based on keywords, linking, etc.), the same will probably be said for your Twitter followers. For instance, if you’re into Ford Mustangs and you promote your Ford Mustang site on Twitter, other Ford Mustang related Twitter accounts will be more valuable to you than a Twitter follower who’s all about Britney Spears. Makes sense.
  3. The depth of your tweets will mean the most. What I mean is, how many times does your tweet get re-tweeted? Having a tweet re-tweeted a ton of times basically means whatever you had to say really caught on and people thought it was important. More value would be placed on a tweet Google could tell the social network found important.
  4. A combination of all of the above. I’m not sure anyone has a solid idea of how Google is going to use Twitter data. My guess is they’ll use a combination of my assumptions above when placing a value on anything they glean from Twitter.

What’s almost certain is that Google appears to be applying more metrics to its algorithm. Whereas domain names, inbound links, domain age, etc. were of utmost importance several years ago, Google is going to look at more metrics when calculating your search rankings. In my opinion, this is a good thing. At the end of the day, it puts more relevant topics first based on how people are using the information across the web. Only time will tell how important these changes will be, though. What does everyone else think?

Google Page Speed Plugin vs. Page Speed Online

Today, I took a look at Google Labs’ Page Speed Online app to check the score of one of my sites. I was shocked to find it scoring really low at 59/100. Pathetic, in my opinion, since I consider site speed a huge priority (and so does Google, in fact). I had just done a site update earlier in the week, so I was thinking I had broken something. I checked the Page Speed plugin for Firefox (part of Firebug), and just like I remembered, we were scoring really high at 94/100. I then took a look at Page Speed for Chrome to see where that plugin would score us. It wasn’t as high as Firefox, but not nearly as low as the Online version, scoring 81/100.

So my question to Google is this: Why the difference? Aren’t they running the same rules? Which score means more to Google? Between the browsers I would assume the rules being run in Firebug instead of straight through Chrome could cause a slight difference. Also perhaps the rendering engines for the browsers could account for some difference too. If anyone knows the answer for sure and which score I should really believe, I’d love to know!

Close jQuery ColorBox on an Action

jQuery is awesome. If you use JavaScript on your website, you should use jQuery. If you don’t, you don’t know what you’re missing.

Recently, on a new site I’m about to launch, I was looking for some better ways to use jQuery and ColorBox when estimating shipping charges for customers. Previously, I called out to an internal web service to do some calculations and then do a redirect with the values to display to the user. I was thinking, meh, a redirect? You really need to do that?

So I ripped it all out and started over. I basically decided I could use jQuery and element IDs to do the same thing: hide some controls, set the html or text values of others where I wanted calculated values to show up. The kicker was, while I could easily do all of that from my ColorBox modal window, I wanted the window to close after hitting the submit button. Turns out this is stupid simple. From the ColorBox documentation, you can manually close the ColorBox window:

$.colorbox.close();

The key to making it work is to find the element that actually opened the ColorBox window. I managed to get this to work only by first finding the form that owned the element that opened the window, then getting the element in question, i.e.

var myForm = $('#myForm');
var myElement = myForm.find('#myElement');
if (myElement.length > 0) {
    $.colorbox.close();
}

For some reason, just doing this didn’t work:


That would have been simpler, but I got it to work and that’s all that I really cared about. Anyway, hopefully this will be useful to someone else!

Update: It turns out that I could simply do this on a button, too:

$('#lnkButton').click(function () {
    $.colorbox.close();
});

Simple and clean! Love it!

Website Speed & Performance Tuning with GTmetrix

I stumbled upon a little gem today while searching for a few more techniques to improve the performance of my ASP.NET web applications. I use YSlow and Google Page Speed almost daily, and it was great to find a website called GTmetrix that combines both of them. GTmetrix combines Google Page Speed and YSlow into an easy-to-read, tabbed table of recommendations. Each recommendation, once expanded, offers you a list of tasks that you can complete to improve the performance of the tested page. What’s more, it ranks each group of recommendations from Low to High priority so that you know what to go after first. If you’re serious about your web site’s performance, definitely check this one out!

Uploading Content to Amazon S3 with CloudBerry Labs’ S3 Explorer

I recently made the move to Amazon S3 and CloudFront to store and serve static content, in particular images, for some of my e-commerce web sites. We have thousands of images to serve to our visitors, in all different sizes. To get started, I went to Google to do some searching for quality tools. I stumbled upon CloudBerry Labs‘ application S3 Explorer and downloaded it to give it a try. Installation was a snap, and fairly quickly I was configuring my Amazon S3 account in S3 Explorer. What’s very cool is that you can store as many S3 accounts as you might have for use later on. To configure an S3 connection, you will need your Amazon Access Key and your Amazon Secret Key. Now it was time to upload!

Like I mentioned earlier, we have thousands of images. In fact, we have over 27,000 images. And that’s just in one image dimension size! We have 6 sizes, so that’s well over 160,000 images. That would be a bear to do through Amazon’s S3 web interface. Especially if I needed to set headers and permissions. CloudBerry S3 Explorer came in handy for this. I selected one set of images and before I started the upload, it allowed me to set any HTTP Headers I needed on my images. After that, up they went. I’d say with my connection, it took an hour or so to get all of them up to S3, depending on the file sizes. After uploading, I needed to set permissions, which I was able to do by just selecting all of the S3 objects and setting the proper permissions. This was kind of slow because CloudBerry S3 Explorer needed to get information on all of the objects I had selected, which was over 27,000.

All in all, I think it took me a couple of days to sporadically upload and set up all of our images. The beauty is that now we’re serving them from CloudFront, which makes our sites quite a bit faster. A total win-win for us.

A few notes about this wonderful application:

  • It’s incredibly easy to set permissions on objects. There’s a check box if you want to open the objects up for the world to download, which was nice for us. It would have been nice to be able to do this before upload, like HTTP Headers, but I didn’t see how.
  • Very easy to set HTTP Headers and any meta data you need on your objects. And you can do it before the upload starts!
  • One thing that confused me a little was on Windows 7, when I minimized S3 Explorer, it went into my task bar and not with other minimized applications. It took me a little while to figure out where it was hiding. At first I just thought the application had crashed on me.
  • Overwriting object preserved HTTP Headers and permissions, something I was a little concerned about.
  • Moving data between S3 folders and buckets was really easy. Again, preserves HTTP Headers and permissions.

So, all in all, my impressions of this application are really good, and I was only using the Freeware version. The Pro version, for only $39.99, offers unlimited S3 accounts and multi-threading, which speeds up your uploads. Other features available in the Pro version are:

  • Compression
  • Encryption
  • Search
  • Chunking
  • FTP Support
  • Sync

For more information on CloudBerry Labs’ S3 Explorer, check out their product page for S3 Explorer. Hopefully you’ll find this nifty little application as useful as I did!

E-Commerce Checkout Usability Issues

I develop e-commerce applications for a living and I love what I do. The best part about my job is that it gives me the chance to continue to develop software while also being involved in actual business decisions that yield real results. The busy holiday season leading up to Christmas means that e-commerce web sites are some of the busiest sites on the Internet, and e-businesses rely on the month between Thanksgiving and Christmas to really add to their yearly bottom line. Since it’s the time of year when thousands of people hit the Internet to do their Christmas shopping, and I’m no different, I thought I’d bring up an issue I ran into while doing some of my own online Christmas shopping. I’m going to keep my examples really vague until after Christmas in case my wife or other family members read my blog.

The handful of websites (high-profile websites, actually) I visited were all fairly well designed and responded rather quickly. That’s key during a busy e-tail season. Still, I ran into issues on most of them during checkout. One site asked me to register during checkout, which I typically do, but took me back to my cart after registration instead of letting me continue my purchase. I was confused because the cart details at the top of the page were really small, so I didn’t really know what to do next. If people expect to make their purchase when they click checkout, let them continue to do so. Don’t hinder the process.

Another website I visited, like most out there, allowed me to enter an offer code during checkout. Much to my delight, they let you stack coupons. So I scoured the Internet for some coupons and entered them into their offer code page during checkout. Much to my chagrin, none of them went through. However, I didn’t get a notification that the coupon was invalid. It just removed it from the offer code section and didn’t tell me anything about what might be wrong. In the end, I couldn’t use ANY of the offers I had and actually make it through checkout. All it would have taken was to tell me the offer code was invalid. What would have been better would have been to process them all at once and tell me which codes were valid and which weren’t.

I find it’s really important to keep your e-commerce website as usable as possible. The checkout process is the most important part of the site, and you want people in and out as quickly as possible so you can capture that conversion. If you make things difficult or don’t provide usable feedback, you may lose a customer. That can have a real impact on your bottom line.

ASP.NET SQLCacheDependency with SQL Query Notifications

I’m going to make this quick and dirty. I’ve set up my ASP.NET web applications to leverage SQLCacheDependency with SQL Server 2005 Query Notifications. The setup process takes some time and debugging can be tricky, but the payoff can be enormous. The bottom line is that the performance increase on page load times is well worth the effort, especially for data that doesn’t change all that often. I found it really useful for caching product data on my e-commerce sites, especially as the number of products in the system grew to over 5,000. However, I always seem to miss a step when configuring my SQL Server 2005 databases, so this post is for my own reference; if it helps someone else out there, even better.

The original article I used as a basis for configuring my applications is at the URL below:

Follow the steps there and you’re good to go. Especially use the SQL Profiler debug steps at the bottom of the article if you get tripped up. One thing that I always had to do with my databases to get everything to work properly was execute the following query:

use <dbname>
EXEC dbo.sp_changedbowner @loginame = N'<dbuser>', @map = false
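One other step worth double-checking, since SQLCacheDependency’s query notifications ride on Service Broker (the database name below is a placeholder):

```sql
-- Verify Service Broker is enabled; query notifications won't work without it.
SELECT name, is_broker_enabled FROM sys.databases WHERE name = 'MyDatabase';

-- Enable it if it isn't (a common gotcha after restoring a database).
ALTER DATABASE MyDatabase SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;
```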

Make sure you use this caching technique responsibly, though. Query Notifications can use up processing power, so only cache data that you know will give your application a performance bump. Also beware of memory usage as you cache more and more data: you could end up caching so much data that your application needs to restart often, and that could cause slow page load times.