What are robots and spiders?
These small programs are sent out by search engines like Yahoo and Google to track and catalog the growth of the Web. They look for new pages, or new information on existing pages, and send back reports to their respective search engines. Robots and spiders help your favorite search engine provide up-to-date, accurate results. Their visits to your site count as traffic, so it’s good to use a traffic analysis tool that can distinguish between them and actual human visitors.
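To see how a traffic tool might tell robots apart from human visitors, here is a minimal sketch in Python. It assumes the common approach of inspecting the User-Agent header that every visitor (human browser or robot) sends with a request; the list of signature substrings is an assumption for illustration, though “Googlebot” and “Slurp” are the real names of Google’s and Yahoo’s crawlers.

```python
# Hypothetical sketch: separating robot visits from human visits by
# checking the User-Agent string each request carries. The signature
# list below is illustrative, not exhaustive.
KNOWN_BOT_SIGNATURES = ("googlebot", "slurp", "bot", "spider", "crawler")

def is_robot(user_agent: str) -> bool:
    """Return True if the User-Agent looks like a search-engine robot."""
    ua = user_agent.lower()
    return any(sig in ua for sig in KNOWN_BOT_SIGNATURES)

# Example hits: one from a regular browser, one from Google's crawler.
hits = [
    "Mozilla/5.0 (Windows NT 10.0) Gecko/20100101 Firefox/115.0",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
]
human_hits = [ua for ua in hits if not is_robot(ua)]
```

Real analytics products use far larger, regularly updated signature lists, but the principle is the same: filter robot visits out before reporting traffic numbers.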
Thankfully, technology has come a long way with regard to online traffic reporting. It’s no longer necessary to run log files through special software to analyze your traffic. Hosted services like Omniture, Hitbox and Google Analytics now make it easy to track your web site’s performance through a Web-based dashboard. They work by using small bits of code you place on your Web pages, which feed traffic information to a tracking service on another server.
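To make the “bits of code” idea concrete, here is a small Python sketch of what such a page tag typically does: gather a few facts about the visit and send them to the tracking server as query parameters on a tiny request (often an invisible image, sometimes called a “beacon”). The host name `stats.example.com` and the parameter names are made up for illustration; real services each use their own.

```python
from urllib.parse import urlencode

# Hypothetical sketch of a page-tag "beacon" request. The tag collects
# the page viewed, the referring page, and a visitor identifier, then
# encodes them into a URL the tracking server can log and report on.
def build_beacon_url(page: str, referrer: str, visitor_id: str) -> str:
    params = urlencode({"p": page, "r": referrer, "v": visitor_id})
    return "http://stats.example.com/beacon.gif?" + params

url = build_beacon_url("/index.html", "http://www.google.com/", "abc123")
```

In practice the tag runs as JavaScript in the visitor’s browser, which is why it can report details a plain server log cannot, such as screen size or time spent on the page.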
Sources of Traffic Tools
Traffic tools, often referred to as “web analytics,” come from three possible sources:
- Commercial systems that charge monthly fees but allow real-time traffic reporting
- Free programs that provide detailed traffic reporting but only update once every 24 hours (meaning you have to wait until tomorrow to see today’s traffic)
- Built-in tools offered by the content management software or hosting service you’re using.
We’ll discuss the second and third options in the next section. A separate tool may not be necessary at all, because many web hosting companies now include a tracking program in their hosting packages. Check with your host to see what tools it offers.