Category Archives: ASP.NET

Helicon Tech ISAPI Rewrite 3 Problem with ASP.NET 4.0, IIS 6 & Extensionless URLs

For years we have used Helicon Tech’s ISAPI Rewrite plugin for IIS to generate pretty URLs for our ASP.NET sites.  A few months back I was in the process of migrating our ASP.NET web applications to the 4.0 Framework and I ran into an issue with ISAPI Rewrite and our site’s URLs.  Basically, it turned out that ISAPI Rewrite wasn’t even getting the chance to process our URL rewrites as it should.

The bottom line here is that with the 4.0 Framework and IIS 6.0, extensionless URLs are turned on by default.  Since our rewrites depended on the .aspx extension to map our pretty URLs to actual pages, I had to turn this feature off.  That meant going into the registry and finding this key:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ASP.NET\4.0.30319.0\

Then I had to add/edit this DWORD value:

EnableExtensionlessUrls = 0
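
If you'd rather not edit the value by hand, the same change should work from an elevated command prompt (a sketch using the stock reg.exe utility):

reg add "HKLM\SOFTWARE\Microsoft\ASP.NET\4.0.30319.0" /v EnableExtensionlessUrls /t REG_DWORD /d 0 /f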

I restarted IIS and ISAPI Rewrite worked like a charm.  I didn’t have this issue on my development box which runs 64-bit Windows 7 Professional and IIS 7.5.  I only had this issue on our testing and production environments which run Windows 2003 and IIS 6.0.

Breaking ASP.NET 4.0 Framework on Windows 7 and IIS 7.5 with ASP.NET 1.1 Framework

The other day, I wanted to migrate some database changes from our development environment to our staging servers. The tool we typically use to do this is kind of old and requires the ASP.NET 1.1 Framework. So, during the installation of the tool on my new Windows 7 box, I saw a message requiring the 1.1 framework and decided to install it. During its installation, a warning popped up about known compatibility issues on my system, but I decided to proceed anyway. I mean, how bad could it be?

Well, it turned out kind of bad. None of my ASP.NET 4.0 web applications on my development box would run. I kept getting an error in the framework itself:

Calling LoadLibraryEx on ISAPI filter “C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_filter.dll” failed

Apparently you can’t run the 1.1 Framework alongside the 4.0 Framework, at least not on Windows 7.  It might work fine on XP or Windows 2003 Server, but I’m not 100% sure, and I wasn’t going to waste too much time finding out.

To make a long story short, I had to uninstall all of the ASP.NET Frameworks from my development box, re-install the 2.0 and 4.0 Frameworks, and then make sure they were registered by running aspnet_regiis -r from the installation directory for each Framework version.  It took me about half a day of beating my head against my desk to figure out this faux pas.  Better to listen to the warning messages next time…
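
For reference, the re-registration step looks roughly like this, run from each Framework's installation directory (the folder names below are the standard ones for 2.0 and 4.0):

rem Run from an elevated prompt; -r registers that version and updates the IIS scriptmaps
cd %windir%\Microsoft.NET\Framework\v2.0.50727
aspnet_regiis -r
cd %windir%\Microsoft.NET\Framework\v4.0.30319
aspnet_regiis -r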

SqlCacheDependency and Query Notifications

There’s a lot of scattered information out there on how to configure ASP.NET applications to leverage Microsoft SQL Server’s Query Notifications and Service Broker for caching. The two best step-by-step tutorials I’ve found online are:

http://www.simple-talk.com/sql/t-sql-programming/using-and-monitoring-sql-2005-query-notification/

http://dimarzionist.wordpress.com/2009/04/01/how-to-make-sql-server-notifications-work/

Both of those articles should get you started for sure. However, while leveraging Query Notifications for caching on a few of my sites, I ran into trouble keeping our application from crashing after a period of time. The biggest issue I found was that I would see the following exception in our logs:

When using SqlDependency without providing an options value, SqlDependency.Start() 
must be called prior to execution of a command added to the SqlDependency instance.

I never did quite get a handle on what was going on here. I did figure out, though, that I could always find this in my Application log around the time that exception was thrown:

The query notification dialog on conversation handle '{A1FB449B-DEB3-E011-B6D2-002590198D55}.' closed due to the following error: '-8470Remote service has been dropped.'.

So, does this mean that I called SqlDependency.Stop() and now queued notifications aren’t going to be delivered? Are these critical errors that keep the application from coming back? I’ve read that a lot of the Query Notification messages you see in the log aren’t critical errors and can be ignored, but I can’t ignore how closely the timing of this error lines up with the exception thrown above.
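
For anyone hitting the same exception: the documented pattern is to call SqlDependency.Start() once when the application starts, before any command is hooked to a dependency, and SqlDependency.Stop() on shutdown. A minimal Global.asax sketch (the "MyDb" connection string name is hypothetical):

using System;
using System.Configuration;
using System.Data.SqlClient;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Must run before any SqlCommand is associated with a SqlDependency.
        // "MyDb" is a hypothetical connection string name from Web.config.
        SqlDependency.Start(
            ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString);
    }

    protected void Application_End(object sender, EventArgs e)
    {
        // Stops the notification listener for this connection.
        SqlDependency.Stop(
            ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString);
    }
}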

Anyway, I finally decided to pull this stuff out of our application until I get a better handle on what’s going on. The last straw was that I was trying to sync some database changes during a maintenance period and I couldn’t get them to sync because of a bunch of these SQL Query Notification issues. As I write this, I can’t even get my database back online as I’m waiting for ALTER DATABASE SET SINGLE_USER to complete (approaching 3 hours!!!). As I keep waiting, my Application log keeps filling up with the following Query Notification messages:

Query notification delivery could not send message on dialog ‘{FE161F6A-D6B3-E011-B6D2-002590198D55}.’. Delivery failed for notification ‘85addbaa-ce66-431d-870f-d91580a7480a;d527d584-9fd4-4b13-85bc-87cb6c2e166f‘ because of the following error in service broker: ‘The conversation handle “FE161F6A-D6B3-E011-B6D2-002590198D55” is not found.’.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

A response to a post I made on the ASP.NET Forums suggested that, with all of the cached items in the system, SQL Server simply could not catch up. This is a problem because not only does it slow the entire system down, but when you have to cycle the SQL Server service itself, it takes forever for the system to come back up because all of the notifications get requeued (or something to that effect).

Website Speed & Performance Tuning with GTmetrix

I stumbled upon a little gem today while searching for a few more techniques to improve the performance of my ASP.NET web applications. I use YSlow and Google Page Speed almost daily, and it was great to find a website called GTmetrix that combines the two. GTmetrix merges both Google Page Speed and YSlow into an easy-to-read, tabbed table of recommendations. Each recommendation, once expanded, offers a list of tasks you can complete to improve that part of your site’s performance. What’s more, it ranks each group of recommendations from Low to High priority so that you know what to go after first. If you’re serious about your web site’s performance, definitely check this one out!

ASP.NET SQLCacheDependency with SQL Query Notifications

I’m going to make this quick and dirty. I’ve set up my ASP.NET web applications to leverage SQLCacheDependency with SQL Server 2005 Query Notifications. The setup process takes some time and debugging can be tricky, but the payoff can be enormous. The bottom line is that the performance increase on page load times is well worth the effort, especially for data that doesn’t change all that often. I found it really useful for caching product data on my eCommerce sites, especially as the number of products in the system grew to over 5,000. However, I always seem to miss a step when configuring my SQL Server 2005 databases, so this post is for my own reference; but if it helps someone else out there, even better.

The original article I used as a basis for configuring my applications is at the URL below:

http://www.simple-talk.com/sql/t-sql-programming/using-and-monitoring-sql-2005-query-notification/

Follow the steps there and you’re good to go. In particular, use the SQL Profiler debugging steps at the bottom of the article if you get tripped up. One thing I always had to do with my databases to get everything working properly was execute the following query:

use <dbname>
EXEC dbo.sp_changedbowner @loginame = N'<dbuser>', @map = false
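
Once the database side is configured, the application side boils down to creating a SqlCacheDependency from the command whose results you’re caching. Here’s a minimal sketch, assuming SqlDependency.Start() has already been called at application startup; the table, columns, and "MyDb" connection string name are all hypothetical:

using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    public static List<object[]> GetProducts(HttpContext context)
    {
        List<object[]> products = context.Cache["products"] as List<object[]>;
        if (products == null)
        {
            string connStr = ConfigurationManager
                .ConnectionStrings["MyDb"].ConnectionString;

            using (SqlConnection conn = new SqlConnection(connStr))
            // Hypothetical table; Query Notifications require two-part
            // table names and an explicit column list (no SELECT *).
            using (SqlCommand cmd = new SqlCommand(
                "SELECT ProductId, Name, Price FROM dbo.Products", conn))
            {
                // The dependency must be attached to the command before it executes.
                SqlCacheDependency dependency = new SqlCacheDependency(cmd);

                conn.Open();
                products = new List<object[]>();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        products.Add(new object[] { reader[0], reader[1], reader[2] });
                }

                // The cached item is evicted when SQL Server signals a change.
                context.Cache.Insert("products", products, dependency);
            }
        }
        return products;
    }
}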

Make sure you use this caching technique responsibly, though. Query Notifications use up processing power, so only cache data that you know will give your application a performance bump. Also beware of memory usage as you cache more and more data: cache too much and your application may need to restart frequently, which can cause slow page load times.

Easily Cause StackOverflow Exception in ASP.NET Web Application

This is a short one, but since I can’t believe I’ve never managed to do this before, I thought I’d post a little tidbit on it. I threw a StackOverflowException in one of my ASP.NET C# web applications today. After watching the request spin for about 5 minutes, I checked the Event Log on the server. There it was: a StackOverflowException. Say what?

How did I manage it? By recursively adding a control to its own control collection. The .NET Framework just freaks out, and the application pool gets restarted after each exception kills the app. Glad it didn’t take long to figure out!
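
For the curious, the mistake boils down to something like this contrived sketch (the panel is hypothetical; in my case it was buried in a lot more code):

using System;
using System.Web.UI.WebControls;

public partial class OopsPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Panel panel = new Panel();
        Controls.Add(panel);

        // The panel now contains itself, so rendering recurses
        // until the stack blows. Don't do this.
        panel.Controls.Add(panel);
    }
}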

Remove ViewState From ASP.NET Page to Improve Performance

In a previous post, I alluded to the fact that I remove the ViewState from our ASP.NET pages to improve performance. I can’t take credit for coming up with the idea, though. I originally got the idea and the solution I wanted to implement from this article at EggHead Cafe. The solution I chose was to store the ViewState on the web server’s file system and reference the file in the page sent back to the client. I’ll outline how I did that below.

The first thing you’ll want to do is create a new Page base class that inherits from System.Web.UI.Page so you can override some of its methods, namely SavePageStateToPersistenceMedium and LoadPageStateFromPersistenceMedium. This will allow you to save the page’s ViewState to the file system and then re-load it on a post back. Let’s start with saving the ViewState to the file system.

protected override void SavePageStateToPersistenceMedium(object viewState)
{
    // Serialize the view state into a base-64 encoded string
    LosFormatter los = new LosFormatter();
    StringWriter writer = new StringWriter();
    los.Serialize(writer, viewState);

    // Create a unique name for this page's view state file
    string guid = NewGuid();
    string vsp = ViewStateFilePath(guid);

    // Save the string to disk
    StreamWriter sw = File.CreateText(vsp);
    sw.Write(writer.ToString());
    sw.Close();

    // Save the guid in a form field
    ClientScript.RegisterHiddenField("__VIEWSTATE_GUID", guid);
}

So, let’s step through what we’re doing here. In the first few lines of code, we serialize the view state. Next, where we call

string guid = NewGuid();

We’re creating a new GUID that we’ll use to build the file name of the file that holds the current view state’s value on the server. NewGuid() just returns a new guid value from the System.Guid class.

Next, we need to create the path where we’re going to store the file, as well as its file name. You can do this any way you want, as long as all of your files end up unique; you can’t go overwriting one visitor’s view state with someone else’s. I based mine on the guid appended with the current request’s page name, minus the extension and with any slashes replaced by dashes. So, my filename looks like:

string fileName = guid + "-" + Path.GetFileNameWithoutExtension(Request.Path).Replace(
    "/", "-") + ".vs";

Here, the guid was passed in to the ViewStateFilePath() function, which builds the full path to the view state file.
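
For completeness, here’s a sketch of what those two helpers might look like (the ~/ViewState folder is my assumption; any location the worker process can write to will do):

private string NewGuid()
{
    // Just a thin wrapper around System.Guid
    return Guid.NewGuid().ToString();
}

private string ViewStateFilePath(string guid)
{
    string fileName = guid + "-" + Path.GetFileNameWithoutExtension(
        Request.Path).Replace("/", "-") + ".vs";

    // Hypothetical folder; point this anywhere the worker process can write
    return Path.Combine(Server.MapPath("~/ViewState"), fileName);
}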

So now that we have a path where we can write the file, we can go ahead and do so using the StreamWriter we created. The last thing to do is tell the client where to find its view state, which is done by registering a hidden field:

ClientScript.RegisterHiddenField("__VIEWSTATE_GUID", guid);

That GUID allows you to recall the proper file for the client when you need to access the ViewState.

So now you’ve persisted the view state to the file system. All that’s left is to load it back up when needed, which is done with LoadPageStateFromPersistenceMedium(), shown below.

protected override object LoadPageStateFromPersistenceMedium()
{
    // Recover the guid from the hidden form field
    string guid = Request.Form["__VIEWSTATE_GUID"];
    if (String.IsNullOrEmpty(guid))
        return null;

    // Determine the file to access
    if (!File.Exists(ViewStateFilePath(guid)))
    {
        return null;
    }
    else
    {
        // Open the file
        StreamReader sr = File.OpenText(ViewStateFilePath(guid));
        string viewStateString = sr.ReadToEnd();
        sr.Close();

        // Deserialize the string
        LosFormatter los = new LosFormatter();
        return los.Deserialize(viewStateString);
    }
}

This approach gives you the ability to minimize the amount of data you have to send over the wire to the client’s browser. There are other solutions, like storing the ViewState in memory, but I felt that was a waste of resources when a ViewState might only be used once and then abandoned. The one thing you’ll want to do is manage the saved ViewState files on your disk. These files can get out of control, especially when you get a lot of visitors. So what I did was set up a Scheduled Task in Windows to clean up stored ViewState files every three hours. This works out pretty well since a ViewState really won’t be needed beyond that timeframe.
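
The cleanup job itself can be trivial. Here’s a sketch of a small console program a Scheduled Task could run (the folder path is hypothetical, and the three-hour window matches the schedule above):

using System;
using System.IO;

class ViewStateCleanup
{
    static void Main()
    {
        // Hypothetical path; point this at the same folder ViewStateFilePath() uses
        string folder = @"C:\inetpub\wwwroot\MySite\ViewState";

        foreach (string file in Directory.GetFiles(folder, "*.vs"))
        {
            // Delete view state files that haven't been touched in three hours
            if (File.GetLastWriteTime(file) < DateTime.Now.AddHours(-3))
                File.Delete(file);
        }
    }
}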

Anyway, I hope this solution helps some other C# ASP.NET developers out there. It sure worked out great for me!

ASP.NET Site whitehouse.gov Reviewed

So I was over on Slashdot last night looking for something interesting to read and ran across this tidbit about the new whitehouse.gov site that runs on ASP.NET. Honestly I think the only reason this got mentioned on Slashdot is that yesterday was Inauguration Day. Any other day and it probably would have fallen through the cracks. Anyway, I decided to take a look at the pointers that the author brought up to see if there was something I could learn. Most of the improvements I already knew about, but there were a couple that were new to me. Two that I’d like to consider implementing in my sites are:

  • Remove the X-AspNet-Version header (saves 30 bytes per request; see the snippet after this list)
  • Use the compressed jQuery from Google’s servers (lower latency and improved performance)
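
For the first item, ASP.NET can be told not to emit that header with a one-line Web.config change; this is the standard httpRuntime setting:

<system.web>
    <httpRuntime enableVersionHeader="false" />
</system.web>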

The one issue with using jQuery from Google is that you’re tied to whatever versions they host. If a newer version comes out and you want to use it but Google doesn’t host it yet, you’re out of luck. Luckily I use jQuery 1.2.6, so this isn’t an issue for me.

As for the author’s review, I thought it was pretty good. A real-world example is always a great illustration of what to do and what not to do. I’d have liked to see him suggest a fix for one of the issues he found, the ViewState issue; that would have been useful for other developers making the same mistake.

ViewState is still necessary for an ASP.NET application; however, you don’t have to pass the entire ViewState back to the client. It’s a waste of bandwidth (and can cause some nasty Base64 exceptions). One solution I use is to store the ViewState on the file system and only pass the file name back to the client for reference later on. It takes up a lot less bandwidth than a potentially huge ViewState. Other solutions are storing it in memory, in the database, or by some other means. We clean out our old ViewState files every 3-4 hours, as many of them aren’t needed after that point (I’m thinking this might make a good article in the future).

Another example that kind of irked me was his take on the site’s use of meta keywords. Uhm, yeah, they might not be as relevant anymore for Search Engine Optimization, but it’s still not a bad thing to have them in there. Just keep it to 200-300 characters. Nothing too crazy.

One last thing he pointed out that I just didn’t agree with was that the server was running IIS 6.0. Now, correct me if I’m wrong, but IIS 7.0 is only supported on Windows Server 2008, right? Well, IT departments have budgets and all, so maybe a Windows Server 2008 license or the hardware to install it on wasn’t available. I know in my case the budget doesn’t always allow for the latest and greatest, even if I want to use it. So knocking the development team for using IIS 6.0 seems a little over the top if you ask me.

This entire site could definitely use some improvements, which the article nicely points out. To go one step further, I suggest any web developer install YSlow for Firebug in the Firefox browser. It came in handy when I was trying to optimize my own sites. The whitehouse.gov site’s YSlow score is a 51 (an F), which is horrible. When I started using YSlow, I noticed some of my sites had similar scores, to which I was appalled, and I went to fixing them pronto. By implementing the changes I could, as suggested by YSlow, I got us up to a 78, which is a high C. Some changes you can’t make (like moving JavaScript to the bottom of pages or using a Content Delivery Network) due to how your application works, and changing them just for a minor improvement is more trouble than it’s worth. However, there isn’t any excuse for an ASP.NET site that scores so low. Those folks over at whitehouse.gov definitely need to clean things up!

URL Rewriting in ASP.NET – ISAPI Rewrite vs. UrlRewritingNet

Seeing your web site rank well in the major search engines (Google, Yahoo, MSN) is something every web developer strives for, and making sure your site’s pages are search engine friendly is a huge part of that effort. With ASP.NET, pages typically end in the .aspx extension. While this isn’t bad for SEO, most ASP.NET pages are also dynamic, and a dynamic page rendered from a bunch of query string variables doesn’t get you anywhere with SEO, especially if you use a lot of them. So, for a while now, we’ve used ISAPI Rewrite to rewrite our dynamic URLs into something a lot more user friendly. This ISAPI extension isn’t free, though; it costs $99 to purchase a license. Not that bad, right?

Well, recently I discovered an open source solution called UrlRewritingNet. It was originally developed in 2006 for the 2.0 Framework; however, the developers claim it will work all the way up to the 3.5 Framework. I’m not sure how much further development is being done, though, as the last release was in August of 2006. It is open source, so theoretically you can download the source and make modifications yourself if you need to.

UrlRewritingNet integrates directly into your ASP.NET web application as an HttpModule. With each incoming request, the module is called to see if the URL requested is a rewritten URL. This isn’t too different from ISAPI Rewrite, except that ISAPI Rewrite handles the rewrite higher up the stack, in IIS itself, rather than in the web application. Using one solution over the other shouldn’t make much difference from a performance standpoint, but tests would need to be run to prove it. That’s a task for another time, however.
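
To illustrate the HttpModule approach, here’s a bare-bones sketch of how a rewriting module works. This is not UrlRewritingNet’s actual code; the URL pattern and target page are hypothetical:

using System;
using System.Text.RegularExpressions;
using System.Web;

public class SimpleRewriteModule : IHttpModule
{
    // Hypothetical pretty-URL pattern: /products/123.aspx -> product.aspx?id=123
    private static readonly Regex ProductUrl = new Regex(
        @"^/products/(\d+)\.aspx$",
        RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;
        Match match = ProductUrl.Match(context.Request.Path);
        if (match.Success)
        {
            // Rewrite the friendly URL to the real page before ASP.NET maps a handler
            context.RewritePath("~/product.aspx?id=" + match.Groups[1].Value);
        }
    }

    public void Dispose() { }
}

You’d then register the module under the <httpModules> section of Web.config so it runs for every request.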

The one drawback, for me anyway, is that UrlRewritingNet requires all of the rewrites to be defined in the Web.config file. I’m not a huge fan of that: Web.config gets cluttered enough as it is, so the more rewrites you need, the bigger the file is going to get. With ISAPI Rewrite, all of the rewrites are stored in a separate file called httpd.ini, much like rewrites in Apache. On the plus side, it looks like you can extend UrlRewritingNet by developing your own rewrite rule providers, so if you need some functionality that isn’t provided, you can hook up what you need yourself.

UrlRewritingNet looks pretty promising and I hope I have some time to check it out and see if it is a good substitute for ISAPI Rewrite.

Minimizing Downtime When Deploying ASP.NET Web Applications

One of the annoying things I find with managing our ASP.NET web applications is that I need to “shut down” our sites when deploying a new version of our assemblies. It’s not that I’m worried about visitors seeing our maintenance page for 20-30 minutes, but when search engines spider our site during this downtime, they can’t find pages they knew about before. This is especially annoying when analyzing my sites in Google’s Webmaster Tools and seeing a bunch of HTTP errors because Google couldn’t reach a page while the site was unavailable. Since Google puts a high regard on site availability and quality when determining rankings, I’d like to avoid this.

Deploying the site is actually very simple: we use the XCOPY method of pushing up web forms, controls, and assemblies. But if you just start overwriting files on the live site, users get errors or inconsistent pages. And if any database changes need to be made, the code can’t function properly until the site is updated to match. Any of these problems would feed the Google issue I mentioned above as well. Not only that, but any developer worth his/her paycheck tests their changes in the live environment anyway, so just tossing up the new version is no good.
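
For context, the push itself is nothing fancy; it amounts to something along these lines (all paths are made up for the example):

rem Copy the new build over the live site, including subdirectories, without prompting
xcopy C:\builds\MySite\*.* \\webserver\wwwroot\MySite\ /E /Y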

I’ve been considering a few different solutions to this, but I haven’t come up with something that solves the issue completely. At some point, I still have to take down the site.

One solution I thought of was to set up a staging server to deploy my changes to and then run my tests there. Once I’m satisfied with the results, I could push to the live site, minimizing the amount of downtime. I figure the maximum downtime using this approach would be 5-10 minutes, depending on whether I had to make any database changes. Not a perfect solution, but better than the 20-30 minutes I’m experiencing now.

Another solution I thought of was to set up a web farm. I could deploy the new version to one web server, make sure everything is good to go, then deploy it to the second web server. Users could still visit the site because at least one server in the farm could handle the incoming request. But this wouldn’t work great if database changes needed to be made. The site itself would still have to come down.

So right now, solution #1 appears to be the best approach and the easiest to implement. Maybe I’m making a bigger deal of this than I need to, but I think everyone wants to minimize their site’s downtime. The one reason I’m holding back on changing my approach is that I don’t know how much weight Google or other search engines give to a site being unreachable for a short period of time. Regardless, I’m curious what other solutions developers and web teams use to deploy their ASP.NET applications while minimizing downtime. I’m positive there is a solution to my problem; I just haven’t thought of it yet. If anyone has something that works for them, please chime in here!