If your Web site is "slow," the problem could be the Internet or something you're doing yourself.
By Jordan Gold
One of the complaints our webmaster hears most often is "Your Web site is slow." That makes me nervous. The Internet is slow enough, with its preponderance of graphics and its low modem speeds; if our Web site is slow too, we're making the problem even worse. Customers won't tolerate a slow site; they'll move on to the next one. So when we get a complaint about slow performance, we set out to fix the problem.
First, we determine whether there actually is one. We trace the route the complainer's traffic takes to reach us, using the Unix traceroute command. We can often spot bad hops, or segments of the Net that are down, between us and the person trying to access us. We then send that person a message, explain the problem and ask him or her to try us again later.
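For instance, here is what such a trace might look like (the host names and addresses below are made up for illustration). A hop that answers with asterisks has timed out, and round-trip times that suddenly jump into the hundreds or thousands of milliseconds mark the trouble spot:

    % traceroute customer-isp.example.net
     1  gw.mcp.com (192.0.2.1)                2 ms    2 ms    3 ms
     2  provider.example.net (192.0.2.14)     9 ms   10 ms    9 ms
     3  * * *
     4  exchange.example.net (192.0.2.77)  1240 ms 1385 ms 1322 ms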
This is not an uncommon problem. As more Web sites go online and more people try to access them, the Internet is becoming saturated. As a result, portions of the Net go down daily, limiting performance. Neither the users nor the Web sites themselves are to blame. When I complained to the webmaster of a major search engine about the performance of his site, he asked whether I was accessing it from New York City. I told him I was, and he said the site had a particular problem with New York and was in the process of fixing it. Can you imagine trying to "fix a problem" in every major city in the world?
Here's another painful example. A few weeks ago, a major Internet service provider sent faulty routing tables out over the Internet. As the bad tables took hold, they slowly cut off a large part of the Midwest, including us. The effect was like turning off an old black-and-white TV set and watching the picture fade to black; fixing the tables was like turning the set back on and waiting for the picture to come up again. As a result of this little mishap, we were inaccessible to much of the Net for 24 hours.
People around the world have also had trouble reaching our site at one time or another. Those who try again in a day or so do get through, but the vast majority give up and never come back.
When I'm out of town and find our site slow, I often try another Web site that sits right next to ours at our service provider. The two sites share a high-speed (100 megabits per second) FDDI loop to the Internet and run on identical servers. If that site is also slow, I know the problem is the Net. If it isn't, the problem is our site, and I call our technical people to fix it (if they can).
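You can automate the same side-by-side check. Here is a minimal sketch in C that times a full fetch of each site's home page; the neighbor's host name is hypothetical, and a real test should average several fetches rather than trust one:

    /*
     * fetchtime.c: a sketch of the side-by-side test described above.
     * It times a complete HTTP fetch of "/" from two hosts; the second
     * host name is hypothetical. Compile with: cc -o fetchtime fetchtime.c
     * (on Solaris, add -lsocket -lnsl).
     */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/time.h>
    #include <sys/types.h>
    #include <sys/socket.h>

    /* Fetch "/" from host, port 80; return elapsed seconds, -1 on error. */
    static double time_fetch(const char *host)
    {
        struct addrinfo hints, *res;
        struct timeval start, end;
        char req[256], buf[4096];
        int fd;

        memset(&hints, 0, sizeof hints);
        hints.ai_family = AF_INET;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo(host, "80", &hints, &res) != 0)
            return -1.0;

        gettimeofday(&start, NULL);
        fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) {
            if (fd >= 0)
                close(fd);
            freeaddrinfo(res);
            return -1.0;
        }
        snprintf(req, sizeof req, "GET / HTTP/1.0\r\nHost: %s\r\n\r\n", host);
        write(fd, req, strlen(req));
        while (read(fd, buf, sizeof buf) > 0)
            ;                         /* drain the entire response */
        gettimeofday(&end, NULL);

        close(fd);
        freeaddrinfo(res);
        return (end.tv_sec - start.tv_sec) +
               (end.tv_usec - start.tv_usec) / 1e6;
    }

    int main(void)
    {
        /* Our site and a (hypothetical) neighbor on the same FDDI loop. */
        const char *sites[] = { "www.superlibrary.com",
                                "www.neighbor.example.com" };
        int i;

        for (i = 0; i < 2; i++) {
            double t = time_fetch(sites[i]);
            if (t < 0.0)
                printf("%-26s fetch failed\n", sites[i]);
            else
                printf("%-26s %.2f seconds\n", sites[i], t);
        }
        return 0;
    }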
If the problem is ours, we track down the cause and fix it. Beyond making sure your connection to the Net has enough bandwidth and your server hardware is fast and powerful enough, there are several ways to optimize the performance of your Web server.
Avoid Perl. Perl is a language that makes it much easier to write code for the Web, but it is also slow. A program takes about one-tenth as long to write in Perl as in C, but it will run half as fast, and a site running a lot of Perl programs will slow down considerably. We've heard that there's a program that translates Perl code into C; we're trying to track it down. If you know of such a program, I'd appreciate hearing the details.
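To give a feel for the alternative, here is a minimal sketch of a CGI program written directly in C. It isn't anyone's production code, just an illustration: the program is compiled once, so each request skips the interpreter startup that a Perl script pays for:

    /*
     * hello.c: a minimal CGI program in C, sketched as an illustration.
     * Compile once (cc -O -o hello.cgi hello.c), install it in your
     * server's cgi-bin, and no interpreter starts up on each request.
     */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* The server passes request data in environment variables
           defined by the CGI specification. */
        const char *query = getenv("QUERY_STRING");

        /* A CGI response is a header, a blank line, then the body. */
        printf("Content-type: text/html\r\n\r\n");
        printf("<html><body>\n");
        printf("<h1>Hello from a compiled CGI</h1>\n");
        printf("<p>Query string: %s</p>\n", query ? query : "(none)");
        printf("</body></html>\n");
        return 0;
    }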
Upgrade your OS. Make sure you're running the latest version of your server's operating system. We recently switched to the latest version of Solaris (2.5 at this writing) on our server and have found that it runs much faster than previous versions. It's also more stable.
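Checking what you're actually running takes one command. On Solaris, the kernel reports its SunOS release number; SunOS 5.5 corresponds to Solaris 2.5:

    % uname -sr
    SunOS 5.5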
Seriously evaluate server software. There are big performance differences among Web servers, whether commercial packages such as Netscape's and Microsoft's or public-domain ones such as NCSA, Apache and CERN. We've had a lot of problems with NCSA "hanging" our server and are currently moving to Netscape to solve them. Netscape is obviously more expensive than NCSA, but because it's commercial software, its makers have a vested interest in keeping servers working at peak performance.
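Whichever server you choose, look at its performance settings. For example, Apache exposes several in its configuration file; the directives below are real Apache options, but the values are illustrative rather than recommendations:

    # Keep connections open so a browser can fetch a page and its
    # inline images over one TCP connection instead of several.
    KeepAlive On
    MaxKeepAliveRequests 100

    # Pre-fork enough server processes to absorb bursts of requests.
    StartServers 10
    MaxClients 150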
Tune, tune, tune. Our systems people constantly evaluate the programs that run on our site and make code fixes to push them to run faster. As your site ages, you'll accumulate new code on top of old code. That code should be recompiled, optimized and, if necessary, rewritten to keep your site running at optimum speed.
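In practice, that means profiling before rewriting. A typical cycle with the standard Unix tools looks like this (the program name here is hypothetical, and optimization flags vary by compiler):

    % cc -pg -o search search.c        # build with profiling support
    % ./search < typical-input         # exercise it under a typical load
    % gprof search gmon.out | more     # see which routines eat the time
    % cc -O2 -o search search.c        # rebuild with optimization enabled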
As the Internet grows, Web sites will have even less control over performance problems caused by the Net itself. But if you keep your own house in order, you can be sure that visitors will get the fastest site their connections can handle.
Jordan Gold is vice president and publisher of Macmillan Online USA, a division of Simon and Schuster/Viacom in Indianapolis. He can be reached at jgold@mcp.com. The Web site is http://www.superlibrary.com.