Filters the content of the robots.txt output. WordPress does not serve a physical robots.txt file; instead it answers requests for /robots.txt itself and this filter offers the chance to add or remove rules from the generated response.
<?php add_filter( 'robots_txt', 'filter_function_name', 10, 2 ); ?>
Instruct search engines not to crawl the post at http://example.com/hello-world:
add_filter( 'robots_txt', 'example_block_hello_world', 10, 2 );

/**
 * Append a Disallow rule for the hello-world post.
 *
 * @param string $output The robots.txt content built so far.
 * @param bool   $public Value of the blog_public option.
 * @return string The filtered robots.txt content.
 */
function example_block_hello_world( $output, $public ) {
	$output .= "\nDisallow: /hello-world";
	return $output;
}
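Because the second parameter carries the value of the blog_public option ("Search engine visibility" setting), the same hook can also replace the generated rules rather than append to them. The sketch below is illustrative (the function name example_replace_robots and the specific rule set are not from the source); it returns the default output unchanged when the site is set to discourage indexing, and otherwise substitutes a custom rule set.

add_filter( 'robots_txt', 'example_replace_robots', 10, 2 );

/**
 * Replace the generated robots.txt rules entirely (illustrative sketch).
 *
 * @param string $output Default robots.txt content built by do_robots().
 * @param bool   $public Value of the blog_public option (truthy when the site is public).
 * @return string The replacement robots.txt content.
 */
function example_replace_robots( $output, $public ) {
	// Keep the restrictive default when the site discourages search engines.
	if ( ! $public ) {
		return $output;
	}

	// Otherwise return a custom rule set instead of appending to the default.
	$output  = "User-agent: *\n";
	$output .= "Disallow: /wp-admin/\n";
	$output .= "Allow: /wp-admin/admin-ajax.php\n";
	return $output;
}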
The robots_txt filter is applied in do_robots(), located in wp-includes/functions.php.