"The relevancy of the results is determined by metrics such as click rates, time spent on pages, number of pages viewed and repeat visits by millions of search users."
Generally speaking, this is a great idea. I have a few reservations, though, which I sent to them via their blog contact form:
I understand that your RelevancyRank creation ranks sites based in part on how long someone spends there and on how many pages they view. If I follow you correctly, the assumption is that longer (i.e. more time spent) is better. Most of the time, that makes sense. But what about sites that serve customers faster? Suddenly, having an easy one-step checkout means you aren't as relevant. Why invest in faster servers when it lowers your Relevancy? In short, the time-spent assumption doesn't hold across all the various markets and niches on the Web.
On the flip side, I once suggested to Google that they use click behaviour in their algorithms, the way you do. They never got back to me, but I'm happy to see someone's picked up the ball and is running with it! Good luck!
Best, Bookworm SEO
Another thing I forgot to mention is that click fraud is obviously going to be a major concern if RelevancyRank gains any traction. For example, a bot could be designed to click around its owner's website en masse and stay for hours, unfairly inflating the site's RelevancyRank.
Of course, humans could easily do this too, and it's quite conceivable that companies in difficult niches could hire minimum-wage "clickers" (the search-engine industry's answer to casino shills) across the country to boost their RelevancyRank. So Claria is going to have to work hard to fight click fraud, which, as Google and Yahoo can attest, is no small task.
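To underline how low the bar is here, consider how little code such a bot would need. This is only a sketch: the function, page names, and dwell times are all my own hypothetical inventions, and it just builds the *schedule* of fake visits a bot might replay, without touching any network.

```python
# Hypothetical sketch: the kind of fake-session schedule a click-fraud bot
# might generate to inflate page views and time-on-site metrics.
# All names and numbers here are illustrative assumptions, not a real tool.
import random


def plan_fake_session(pages, views=50, min_dwell=30, max_dwell=300, seed=None):
    """Return a list of (page, dwell_seconds) pairs simulating an
    'engaged' visitor: many page views, each with a long lingering delay."""
    rng = random.Random(seed)
    schedule = []
    for _ in range(views):
        page = rng.choice(pages)                  # wander the owner's site
        dwell = rng.randint(min_dwell, max_dwell) # linger to look engaged
        schedule.append((page, dwell))
    return schedule


if __name__ == "__main__":
    plan = plan_fake_session(["/", "/products", "/checkout"], views=5, seed=1)
    total = sum(dwell for _, dwell in plan)
    print(f"{len(plan)} fake page views, {total} seconds of fake engagement")
```

A few dozen lines more (an HTTP fetch per page, rotating IP addresses) and this becomes exactly the threat described above, which is why behavioural ranking signals need serious fraud detection behind them.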