I've tweaked this site to use an improved approach to search engine indexing. WordPress ships with a less-than-perfect SEO setup: because the same post appears on archive pages, category pages, and so on, many incoming search queries were landing on pages that no longer contained the requested terms. This duplicate-content problem was easily solved with the following snippet in my theme's header.php file:
if ( is_home() || is_single() || is_page() ) {
    echo "\t<meta name=\"robots\" content=\"index,follow\" />\n";
} else {
    echo "\t<meta name=\"robots\" content=\"noindex,follow\" />\n";
}
I now ask search engines to index only the home page, single posts, and static pages (my photo album also gets indexed, but that's handled by the photo album software itself). Nothing else gets indexed, but all links are still followed, so the target pages can be indexed as needed. This should lead to better search results, taking people directly to the content they were looking for. A win for the user and for me.
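For anyone who'd rather not edit their theme's header.php directly, the same logic can be attached to WordPress's `wp_head` action from functions.php instead. This is only a sketch of that alternative; the function name `my_robots_meta` is my own invention, and it assumes the standard WordPress conditional tags (`is_home()`, `is_single()`, `is_page()`) and hook API:

```php
<?php
// Hypothetical functions.php variant: emit the robots meta tag via the
// wp_head hook instead of hard-coding it in header.php.
function my_robots_meta() {
    if ( is_home() || is_single() || is_page() ) {
        // Content pages: allow indexing and link-following.
        echo "\t<meta name=\"robots\" content=\"index,follow\" />\n";
    } else {
        // Archives, categories, etc.: follow links but don't index.
        echo "\t<meta name=\"robots\" content=\"noindex,follow\" />\n";
    }
}
add_action( 'wp_head', 'my_robots_meta' );
```

The upside of the hook approach is that the tweak survives a theme change, since it lives in functions.php rather than in the theme's template files.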