It seems like every web browser vendor these days is pouring an enormous amount of time and development effort into JavaScript performance. Whether it's the new TraceMonkey engine in Firefox 3.5, the V8 engine in Google Chrome, or the upcoming SquirrelFish engine in WebKit browsers, everyone claims (to some degree) superiority in this arms race. All of this raises two questions in my mind.
1. How important is JavaScript performance? Are JavaScript applications really that slow? I'll admit that the new Firefox 3.5 browser feels snappier on sites like GMail and Netflix, but those sites never felt particularly slow before. Why are developers spending so much time optimizing something that not every site exercises heavily? Admittedly, JavaScript usage is going up (especially with the Web 2.0 craze), but how much of today's perceived latency does JavaScript execution really account for? I'm much more concerned about data transfer; that's the bottleneck I see (a rough way to eyeball the split for yourself is sketched after this list). Broadband speeds here in the United States are ridiculously slow compared to other parts of the world. Shouldn't we all focus on ways to improve that? Yes, I know software developers have little control over that kind of infrastructure, but perhaps there are better protocols out there that could deliver data to the end user more efficiently.
2. Won't improved JavaScript performance lead to poorer JavaScript programming? As computers have gotten faster over the past two decades and memory sizes have increased, applications have become more bloated and (arguably) slower than before. I'm convinced that if programmers had retained the "every byte matters" mentality of the 1970s, 80s, and early 90s, applications would be leaner and meaner than they are today (especially in the realm of operating systems). Can't the same be said for JavaScript programming? As JavaScript engines get faster, serious performance consideration during an application's design phase may become less and less common. I'm of the opinion that high-performance hardware can lead to sloppy programming (the second sketch below shows the kind of sloppiness I mean). "Well, the application is good enough" is what the pointy-haired bosses of the world would say. Shouldn't the application be the best it can be? Can't one argue that "good enough" isn't necessarily good enough?
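For what it's worth, here's a minimal sketch of how you might eyeball that network-versus-script split yourself. Everything specific in it is made up for illustration: the endpoint URL doesn't exist, and the million-iteration loop is just an arbitrary stand-in for "real" script work.

```javascript
// Rough, illustrative timing of network wait versus script execution.
var requestStart = new Date().getTime();

var xhr = new XMLHttpRequest();
xhr.open("GET", "/some/data.json", true); // hypothetical same-origin endpoint
xhr.onreadystatechange = function () {
  if (xhr.readyState !== 4) return;
  var networkMs = new Date().getTime() - requestStart;

  // Now burn some cycles in pure JavaScript for comparison.
  var computeStart = new Date().getTime();
  var sum = 0;
  for (var i = 0; i < 1000000; i++) {
    sum += i % 7; // arbitrary busywork
  }
  var computeMs = new Date().getTime() - computeStart;

  // On a typical broadband connection, the network number tends to
  // dwarf the script number for a small response like this.
  console.log("network: " + networkMs + "ms, script: " + computeMs + "ms");
};
xhr.send();
```

It's crude (the first timing includes more than just wire time, and console.log assumes a tool like Firebug), but even a rough measurement like this usually shows the wire, not the interpreter, dominating.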
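And to make the sloppiness worry in point 2 concrete, here's a small, hypothetical example (the "results" element id is invented) showing two ways of rendering the same list:

```javascript
// Sloppy: touches the DOM on every iteration, forcing the browser to
// re-parse and re-render the whole list each time through the loop.
function renderSloppy(items) {
  var list = document.getElementById("results"); // assumes a <ul id="results">
  for (var i = 0; i < items.length; i++) {
    list.innerHTML += "<li>" + items[i] + "</li>";
  }
}

// Careful: builds the markup in memory and touches the DOM exactly once.
function renderCarefully(items) {
  var html = [];
  for (var i = 0, n = items.length; i < n; i++) {
    html.push("<li>", items[i], "</li>");
  }
  document.getElementById("results").innerHTML = html.join("");
}
```

A faster engine can make the first version feel perfectly acceptable on a short list, which is exactly how careless patterns take root: the design flaw is still there, it's just been subsidized by the runtime.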
I'll be interested to see where this arms race takes us. What do you think?