Anyway, getting back on topic: the importance of "availability" for SEO translates into a few concrete practices.
Make sure search engine spiders can easily crawl your entire web site without hindrance. That means avoiding problematic robots.txt files, and anything else that might conceivably interfere with what Google co-founder Larry Page describes as linear surfing: someone navigating the web by following one link to another. While this may previously have been a minor issue in SEO, now that librarians are reaffirming it, you can be sure Google will give it more weight in the future, if it isn't already.
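To illustrate how easily a robots.txt file can become a hindrance, compare these two (hypothetical) configurations; a single misplaced Disallow rule is enough to shut spiders out of an entire site:

```txt
# A common mistake: this blocks ALL spiders from the ENTIRE site
User-agent: *
Disallow: /
```

```txt
# What most sites actually want: block only the private areas
# (the paths here are illustrative)
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/
```

Note that `Disallow: /` and `Disallow:` (empty) mean opposite things, which is exactly why this file is worth double-checking.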
Sitemaps are another good idea. (I'm currently working on one for City SEO/M, if you're wondering where mine is.)
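For reference, a minimal sitemap in the XML format Google's sitemap program accepts looks something like this (the URL and date are placeholders, not mine):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-05-01</lastmod>
  </url>
</urlset>
```

One `<url>` entry per page you want crawled; `<lastmod>` is optional but ties in nicely with the currency point below.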
The easy rule of thumb here: make content freely available, easily accessible (i.e. not buried in some forgotten sub-sub-sub-sub-subsection), and equally legible to humans and spiders. More on this later.
2) Credibility - "Does the web site contribute current, accurate information? Is the site author(s) qualified to present the content provided?"
In Schneider's view, two things establish credibility: quality of content and author credentials.
The applications for SEO are as follows.
Review your content to make sure it is accurate and on the mark. Statistics should be cited properly, and context must be given. If you don’t feel you have the time for this, have someone proofread it for you.
Make sure you have an About page. Include professional certifications and whatever else qualifies you to write about your topic. Furthermore, if you are affiliated with any organizations in your industry, display a badge or link that shows the affiliation, and make sure those organizations link back to you.
Indicate how current the information is. Note that Google's Blogger includes timestamps that indicate when a post was published (and also serve as permalinks).
You need to show that you are, if not an expert, at least someone knowledgeable about the field: someone who can be trusted both for their expertise and because their information is accurate and up-to-date. Currently, Google assesses authoritativeness via PageRank and backlinks from other authoritative websites.
Larry Page and Sergey Brin, in "The Anatomy of a Large-Scale Hypertextual Web Search Engine," noted that human-maintained directories were generally reliable. (Anatomy is another recommended read for my fellow SEOs.) It therefore makes sense for Google to devise and implement algorithms that integrate human-generated assessments of authority: doing so is consistent with what its founders have written in the past, and necessary to keep up with the competition (i.e. Yahoo's Del.icio.us). Based on past experience with PageRank, we can expect Google to take authority into account, likely via a credentials-based approach.
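For the curious, the rank calculation described in the Anatomy paper can be sketched in a few lines of Python. The three-page link graph and the iteration count below are illustrative assumptions, not anything from Google; only the formula itself comes from the paper:

```python
# Sketch of the PageRank iteration from Brin and Page's paper:
#   PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over pages T linking to A
# where C(T) is the number of outbound links on page T and d is the
# damping factor (the paper suggests d = 0.85).

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # start every page with rank 1
    for _ in range(iterations):
        new = {}
        for page in pages:
            # rank flowing in from every page that links to this one
            inbound = sum(pr[t] / len(links[t])
                          for t in pages if page in links[t])
            new[page] = (1 - d) + d * inbound
        pr = new
    return pr

# Hypothetical three-page web: A links to B and C, B links to C,
# and C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C is linked from both A and B, so it ends up with the highest rank
```

The intuition matches the credibility discussion above: a page's authority is inherited from the pages that vouch for it by linking to it.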