I just had a good idea. Why not put a few modules of code in Google (or your favorite engine) that parse the spidered pages and check them for DTD and CSS validity? Then people could compile search results restricted to pages that truly validate as strict XHTML 1. If you could combine this with normal search strings, and then make a point of only visiting or referring your own visitors to sites with valid markup, it could be a very powerful way of forcing the web to clean up its act.
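To make the idea concrete, here is a minimal sketch of such a filter. It is only an assumption of how the engine-side module might work: it uses XML well-formedness as a cheap stand-in for full DTD validation (strict XHTML must at least parse as XML), and the URLs and `filter_valid` helper are hypothetical.

```python
import xml.etree.ElementTree as ET

def is_well_formed_xhtml(markup: str) -> bool:
    """Rough proxy for validity: strict XHTML must at least be
    well-formed XML. A real module would also validate against
    the XHTML 1.0 Strict DTD and check the linked CSS."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

def filter_valid(results):
    """Keep only search results whose spidered markup parses cleanly."""
    return [(url, markup) for url, markup in results
            if is_well_formed_xhtml(markup)]

# Hypothetical spidered results: one clean page, one with a broken tag.
results = [
    ("https://example.com/a", "<html><body><p>ok</p></body></html>"),
    ("https://example.com/b", "<html><body><p>broken</body></html>"),
]
print([url for url, _ in filter_valid(results)])  # → ['https://example.com/a']
```

A real deployment would run the check at crawl time and store a validity bit in the index, so the filter costs nothing at query time.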
The idea goes further. Suppose you could take something like A-Prompt, Bobby, or Crunchy's Page Screamer and hook it into a search engine. The user could then enter search strings but only get back pages that meet certain minimum accessibility guidelines. This is better than badges, because you get proof of the page author's claims instead of taking them on faith.
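The accessibility filter could plug in the same way. As a hedged illustration, the sketch below uses a single toy check (every `<img>` must carry an `alt` attribute, one real WCAG checkpoint) standing in for the dozens of tests tools like Bobby or A-Prompt actually run; the function names and sample URLs are invented for the example.

```python
import re

def images_have_alt(markup: str) -> bool:
    """Toy accessibility check: every <img> tag must carry an alt
    attribute. Real checkers test many more guidelines than this."""
    imgs = re.findall(r"<img\b[^>]*>", markup, re.IGNORECASE)
    return all(re.search(r"\balt\s*=", img, re.IGNORECASE) for img in imgs)

def accessible_results(query_hits):
    """Return only the URLs whose markup passes the check,
    so the user never sees inaccessible pages."""
    return [url for url, markup in query_hits if images_have_alt(markup)]

hits = [
    ("https://example.com/good", '<p><img src="x.png" alt="Logo"></p>'),
    ("https://example.com/bad", '<p><img src="x.png"></p>'),
]
print(accessible_results(hits))  # → ['https://example.com/good']
```

The point is the architecture, not the specific test: any checker that can be reduced to a pass/fail predicate over page markup could be swapped in and evaluated at index time.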