Getting your site ready for IE7

One of the realities of modern web development is that you’ll often have code which behaves differently depending on the web browser you’re using. In the past, this conditional code was usually intended to work around incompatibilities in the CSS standards support for the various browsers, but this may become less of a driver as compatibility improves across the board. Ironically, the renewed focus on AJAX applications means that browser detection will now become more important for detecting non-standardized capabilities of a browser in order to leverage them in an application.

As an example, consider how DasBlog enables rich HTML editing. It uses a freeware component called FreeTextBox, which provides rich HTML editing in both IE and Firefox. The problem is that FreeTextBox thinks IE7 is Firefox, because FTB uses an algorithm like:

this.ie5=(navVersion.indexOf("msie 5.0")!=-1)?true:false;
this.ie55=(navVersion.indexOf("msie 5.5")!=-1)?true:false;
this.ie6=(navVersion.indexOf("msie 6.0")!=-1)?true:false;

Note the problem here — all of the IE versions are very precisely defined, and Netscape is defined to be everything else. This is a very common pattern in browser detection, which can lead to buggy behavior. There are a number of reasons I can imagine that people would use a browser-detection pattern like this:

  • They are developing on IE primarily, and consider all other browsers to be lower priority in testing. Google hits for “Works best in Internet Explorer” often fall in this category.
  • They are developing to standards first, probably using Opera, Netscape, or Firefox as the litmus test. They find that code which works in their primary development browser needs to be hacked to work in IE. Note that the standards-first approach is what I prefer and recommend; however, it is probably not the most common approach.
  • They want to take advantage of specific built-in proprietary functionality (such as rich edit or XMLHttpRequest) in IE to provide an “uplevel” experience. That’s clearly the case in this specific example. Firefox has been catching up on some of these capabilities, which have been in IE for several years, so the situation here will change.

Now, regardless of the reason that the above browser-detection code is used, it’s probably not the right thing to do. In this case, the detection code is going to decide that IE7 is “anything other than IE”.
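
To make that concrete, here is a sketch of how those checks would behave on IE7. I’m assuming the version string keeps an “msie 7.0” token, which could still change before release; this is an illustration, not FreeTextBox’s actual code path.

// Hypothetical illustration: how exact-match version tests behave on IE7.
var navVersion = navigator.appVersion.toLowerCase();
// On IE7, navVersion contains "msie 7.0", so every exact-match test is false:
var ie5  = navVersion.indexOf("msie 5.0") != -1;  // false
var ie55 = navVersion.indexOf("msie 5.5") != -1;  // false
var ie6  = navVersion.indexOf("msie 6.0") != -1;  // false
// With all the IE flags false, the component falls through to its
// "everything else is Netscape/Firefox" path and serves IE7 the non-IE code.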

So you have to consider what happens if your site mistakenly thinks that IE7 is “non-IE” and starts feeding it the non-IE code. If the “non-IE” setting was intended to mean “more standard CSS” (as it often is), then you might not even notice, since IE7 is a bit better with CSS. But if your app is using any hot “new” AJAX stuff, and the “non-IE” setting really means “downlevel experience” (as it sometimes does), you really want your app to know that IE7 is a version of IE.

Of course, the best thing will be to test your site in IE7 as soon as it becomes available. But in the meantime, you can check your browser-detection code and contact any component vendors your site uses to make sure they are thinking about this. See the IE Blog and Dave Massy’s blog for some tips on browser detection. Both the user agent string and navigator.appVersion (two different strings) are potential sources of trouble. If you are using the user agent string with tools that rely on browscap.ini (such as some PHP and IIS apps), you can simply update that file.
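
For what it’s worth, here is a minimal sketch of a more version-tolerant check. The variable names and regular expression are mine, not FreeTextBox’s, and you would still want to verify the behavior against the actual IE7 user agent string once it ships.

// Sketch of version-tolerant IE detection (illustrative names, not FTB code).
// Parse the version number out of the user agent instead of matching exact
// version strings, so that IE7 and later are still recognized as IE.
var ua = navigator.userAgent.toLowerCase();
var ieMatch = /msie (\d+(\.\d+)?)/.exec(ua);
var isIE = (ieMatch != null);
var ieVersion = isIE ? parseFloat(ieMatch[1]) : 0;

// Capability decisions can then key off "IE at least version X" rather than
// an exact version, so a new IE release doesn't fall into the non-IE bucket.
var supportsRichEdit = isIE && ieVersion >= 5.5;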

Google vs. Microsoft in China

The usual suspects in the media are having a field day with the story about Google poaching an MSFT executive to run their Beijing office. Unfortunately, all of the reporting I’ve seen is rather lazy and sensationalist. My first instinct was, “why was there no such reporting when Li Gong left Sun to head up Beijing operations for Microsoft?” That was much more relevant to our position in China, IMO. I suppose the tech press has room for only one “clash of titans” story at a time, and now that McNealy and Ballmer are golfing together, it just doesn’t play.

Overall, I think the reporting misses the context when it portrays the news as a blow to our China R&D efforts.

First, China has no shortage of engineers, and both companies have limits on how fast they can hire and integrate new employees. Even if we and Google each double the size of our China R&D operations every year for the next five years, we would still not be competing with one another for the top 1% of entry-level engineers. The bottleneck is not talent; the bottleneck is hiring.

Next, graduates of the very best schools in China (think IIT in India) have a high opinion of Microsoft. Even if we were in direct competition with Google (rather than IBM) for the top hires, we would do well.

Finally, when you start hiring more senior people and actually driving the business, local relationships matter more than they do in the U.S. Once the China operations are more established with the brightest devs, testers, and PMs, you will want to grow the middle and senior management ranks so that the subsidiary can operate more independently. Li Gong is from Beijing; Dr. Lee is from Taiwan. It shouldn’t matter, but it does. Microsoft has a well-established China R&D operation; Google doesn’t.

In short, I don’t see any reason to expect that Google’s China R&D will be in the same league as MSFT’s anytime soon. This isn’t a story about Microsoft vs. Google in China. It’s just a boring story about a non-compete agreement and an executive with valuable insider information.

Bill O’Brien on IE7

Bill is skeptical of IE7: “Historically, developers do no real work on MS products until they’re gold because MS has a habit of changing things at the last minute.” I used to work at an ISV, so I know the feeling. But I think developers will see IE7 as an improvement, in large part because of the sentiments he expresses. For starters, developers have asked for better CSS support for a very long time, and I don’t anticipate many web developers hesitating to adopt IE7 because “we need to see if this improved CSS support is fully baked”. And the syndication feed parser is, IMO, a no-brainer. Why write code to parse five different feed formats when you can get it for free? It’s not as if everyone in the world needs to parse RSS, of course, but if you do, you want it to be consistent and reliable. Hardly anyone writes their own XML parser today, and RSS has reached that same level of maturity. It’s basic infrastructure now.


Well, there is one caveat here regarding the feed parser. RSS 1.0, 0.9x, and 2.0 are “baked” formats; supporting them is a no-brainer. But there is a danger in baking in support for a format that is still changing. We learned this the hard way when we were first out of the gate with support for XSL in the browser. We shipped IE with support for the most current W3C draft, but unfortunately the language changed drastically after we went gold. The critics assumed there was a nefarious plot by MSFT to fragment XSL, but it was simply a matter of us going gold before the standard did. I think we recovered admirably, and still have the best in-browser XSLT (final standard) support, but it burned us pretty badly. I think of this when I see the comments about Atom lurching toward signoff; we can only hope the spec is gold before IE7 is. So, to Bill’s point, there is always a possibility that things will change between beta and gold; but you also have to deal with the possibility that the standards change. In that case, I’d rather take a dependency on a single API and manage that risk than deal with five feed formats and the potential incompatibilities, changes, and risks there.

Defending Alaska

Omar has been slamming Alaska Airlines over on his blog, joined by Dvorak. I have been guilty of slamming particular airlines on this blog, and will spend lots of money to avoid flying United or Northwest, so I think he has every right to do this. But I want to counter his criticism of Alaska and try to defend them a little bit.

First, he is talking only about flights out of SFO (the same mistake Dvorak makes), which is a hub controlled by a competitor airline, United. United had the worst on-time record in the world for some time, and I have twice had UA pay for Alaska tickets for me when they screwed up so badly that they could not get me home on the scheduled day. Those were UA flights out of SFO; Alaska has never failed me that badly anywhere. I don’t want to bother looking up stats for other airlines out of SFO, but I don’t think Alaska is unique in this regard. UA completely dominates the SFO-to-Seattle route, so it wouldn’t be a surprise to find out that they are finally learning how to be on time on this route. It also wouldn’t surprise me to find that Alaska does worse on this route (in terms of profit as well as timeliness) than on other routes.

On the other hand, when flying out of Seattle to places other than SFO, I have found Alaska to be much better than UA or American (although they are operated by NWA on some routes, and I have had typical bad NWA experiences on these flights).

So I think it’s really unfair to slam an airline about its performance on one route, especially at a hub controlled by a competitor. I wrote off UA and NWA only after repeated gross problems, including at their own hubs. If Omar and Dvorak want to say, “don’t fly Alaska to SFO”, that’s fair (and I might agree, if it weren’t for air miles, since I no longer need the Red Carpet Club). But not everyone in the world is concerned about the SEA-SFO route, and for people who fly elsewhere (especially out of Seattle), it would be rotten advice to direct them to UA, NWA, or even AA. Alaska is simply better. Take the complaints in context.

Movie Review: I Heart Huckabees

Three years ago, I wrote “if a plot-segment has to jump out and grab the audience by the throat and scream “Look at me! I’m a moral dilemma!”, it is probably not much of one.”

“I Heart Huckabees” begins by introducing a man gripped by existential crisis. He seeks the services of an existential detective and soon gets wrapped up with several other people on intertwining journeys of self-discovery. There were some memorable characters, and parts where I laughed out loud, but overall I found the movie quite disappointing.

The movie pokes fun at western philosophy and modern therapy, but in a rather obvious and gratuitous way. The sequence of scenes touches on each of the big debates of existentialism as if the screenwriter had just finished reading his undergraduate textbook and were writing a scene per chapter. “Everything Matters”, “Nothing Matters”, presented in exactly this juxtaposition, as if the viewer would be too stupid to figure it out if it were any more subtle. Every single irony is paraded nakedly and hammered in until you can be sure that nobody in the audience will miss the profundity.

The best stories hide their themes inside a plot that is equally enjoyable to people with or without sensitivity to the theme. Stoppard’s “The Real Inspector Hound” is a perfect example of this. Stoppard’s play also pokes fun at existentialist crises in the context of a detective story, and would probably be enjoyable even to people who hated “I Heart Huckabees”.

Complex Navigation in Cities

I’ve blogged a few times about the London cabbies, and the potential to help people grow hippocampal neurons through use of simulated landmark navigation problems. The natural question for someone seeking to develop a therapeutic video game would be, “just what sort of map layout is best for neuronal activation?” This summary of recent research suggests some answers.

First, it is interesting to note that real cities are more difficult than randomly generated maps, apparently due to clustering or lack of homogeneity. Second, it is not surprising that grid layouts are less challenging. In places like Manhattan, or Kenosha, WI, it is easy to find places on novel routes without relying on landmarks or much of a sense of direction. The act of navigation is pure sequencing; caudate nucleus. On the other hand, older cities have circle-and-spoke layouts and other tricky features, and hilly areas like Seattle tend to “ease” off of ninety degrees and end or merge parallel streets in unexpected places. Such deviations from the four cardinal directions wreak havoc on mental sequencing and force you to switch back to landmarks and sense of direction; hippocampus.

Interestingly, one study used the number of network nodes as evidence that older cities are more complex. However, I believe that the number of nodes is not the key factor, especially when comparing grid layouts to old-city designs. In places with very dense grids and a number of missing nodes (dead ends, non-intersections, etc.), it is not that difficult to navigate. Such navigation can be done as a sequence of steps filtered against a mental blacklist of known missing intersections, and if you miss a turn, there are plenty of alternative turns you can make up on the fly to get the same result. A map with fewer intersections but more curves and merges would be significantly harder for people to navigate, and would activate the hippocampus much better.

Apple’s RSS Parser

Dare weighs in on the hubbub about Apple’s buggy RSS parser. I have a strong feeling that the people who developed the parser were convinced they were doing the right thing (i.e. “be liberal in what you accept and conservative in what you send”). But this is pretty bad. Behavior like this poisons the well and makes interop harder for everyone. And the case-insensitive namespaces thing is just amazing: considering that there are plenty of standards-compliant XML parsers available, it seems it would have taken extra effort to get the non-standard behavior. We’ve been through this debate enough times in the past five years that people should know better.
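
To see why that is so odd, here is a small sketch using Mozilla’s DOMParser (purely illustrative; the namespace URI is made up, and this is not Apple’s code). A conforming XML parser compares element names and namespace URIs case-sensitively, so matching them case-insensitively means going out of your way to do something non-standard.

// Illustrative only: a conforming XML parser is case-sensitive about names
// and namespace URIs. The namespace URI here is hypothetical.
var xml =
  '<feed xmlns="http://example.org/ns">' +
  '<Title>case matters</Title>' +
  '</feed>';
var doc = new DOMParser().parseFromString(xml, "text/xml");

// Lower-casing the element name finds nothing; only "Title" exists.
var byName = doc.getElementsByTagNameNS("http://example.org/ns", "title");
// Upper-casing the namespace URI also finds nothing; URIs are compared exactly.
var byNs = doc.getElementsByTagNameNS("HTTP://EXAMPLE.ORG/NS", "Title");
alert(byName.length + ", " + byNs.length);  // "0, 0"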