I can't tell you how many times I've answered this question in forums, so since so many people are asking, I figured it would make for a great article. First, let's define what we're talking about: a "bot" (short for robot, also called a spider or crawler) is a piece of software a search engine uses to visit every page of your site, categorize its content, and store it in the engine's index.
The Freshbot crawls the most popular pages on your website, whether that is one page or thousands. Sites like Amazon.com and CNN.com have pages that are crawled every ten minutes, because Google has learned that those pages change that often. A typical site can expect a Freshbot visit every 1 to 14 days, depending on how popular its pages are. During a Freshbot visit, Google also gathers the deeper links within your site and stores them in a database, so that when the DeepCrawl occurs it has a list of URLs to work from.
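If you're curious how often the bots actually visit your site, your server's access log is the place to look. Here's a rough sketch in Python that counts Googlebot hits per day from an Apache-style combined log - the sample lines, IP addresses, and paths are invented for illustration, so swap in your own log file:

```python
import re
from collections import Counter

# Hypothetical Apache combined-log lines; the IPs, dates, and paths
# are made up purely for illustration.
SAMPLE_LOG = """\
66.249.66.1 - - [10/May/2006:04:12:01 +0000] "GET / HTTP/1.1" 200 5120 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
66.249.66.1 - - [10/May/2006:04:12:05 +0000] "GET /articles/seo.html HTTP/1.1" 200 8192 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
192.168.0.7 - - [10/May/2006:09:30:22 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
66.249.66.1 - - [12/May/2006:04:10:44 +0000] "GET / HTTP/1.1" 200 5120 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
"""

# Pull the date (dd/Mon/yyyy) out of the bracketed timestamp.
DATE_RE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(log_text):
    """Count log lines mentioning Googlebot, grouped by date."""
    hits = Counter()
    for line in log_text.splitlines():
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                hits[match.group(1)] += 1
    return dict(hits)

print(googlebot_hits_per_day(SAMPLE_LOG))
# → {'10/May/2006': 2, '12/May/2006': 1}
```

Watching these counts over a few weeks gives you a feel for your own site's crawl frequency.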
Once a month, the DeepCrawl bot visits your site and follows all the links found by the Freshbot. This is why it can take up to a month for your entire site to be indexed in Google - even with a Google Sitemap. So be patient, keep adding content to your site, and work on earning valuable inbound links - Google will reward you for it.
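Since I mentioned Google Sitemaps: a sitemap is just an XML file listing the URLs you want crawled. As a rough sketch (the example.com URLs are placeholders, and a real sitemap can also carry optional tags like last-modified dates), here is one way you could generate a minimal one in Python:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a minimal Sitemap-protocol XML string for the given URLs."""
    # Namespace defined by the Sitemap protocol (sitemaps.org).
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        loc = ET.SubElement(entry, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration only.
print(build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/articles/",
]))
```

Save the output as sitemap.xml at your site root and submit it to Google - just remember, as noted above, the sitemap helps the bots find your pages but doesn't speed up the monthly DeepCrawl cycle.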