Displaying the Virtual robots.txt file
The virtual robots.txt file is generated via the do_robots() function, along with any directives added via the robots_txt filter.
But I can't figure out a way to display the generated robots.txt directives. Using this will result in an entire page of content (including headers):
$robots_text = do_robots();
Is there a way to display the generated robots.txt directives?
Here's the code for do_robots() (from https://developer.wordpress.org/reference/functions/do_robots/):
function do_robots() {
	header( 'Content-Type: text/plain; charset=utf-8' );

	/**
	 * Fires when displaying the robots.txt file.
	 *
	 * @since 2.1.0
	 */
	do_action( 'do_robotstxt' );

	$output = "User-agent: *\n";
	$public = get_option( 'blog_public' );

	$site_url = parse_url( site_url() );
	$path     = ( ! empty( $site_url['path'] ) ) ? $site_url['path'] : '';
	$output  .= "Disallow: $path/wp-admin/\n";
	$output  .= "Allow: $path/wp-admin/admin-ajax.php\n";

	/**
	 * Filters the robots.txt output.
	 *
	 * @since 3.0.0
	 *
	 * @param string $output The robots.txt output.
	 * @param bool   $public Whether the site is considered "public".
	 */
	echo apply_filters( 'robots_txt', $output, $public );
}
Thanks.
asked Aug 11, 2023 at 1:37 by Rick Hellewell

1 Answer
The current/computed directives that WP generates for the virtual robots.txt file are not available via any function call or filter. The directives are only available through a page request similar to https://www.example.com?robots=1. That page request causes WP to generate and return the contents of the virtual robots.txt file, along with a Content-Type: text/plain header.
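For illustration, one way to make that kind of request from code is with the WordPress HTTP API. This is only a sketch; it assumes the site answers the ?robots=1 request over HTTP, and the variable names are illustrative:

// Fetch the generated robots.txt text over HTTP.
$response = wp_remote_get( home_url( '/?robots=1' ) );

if ( ! is_wp_error( $response ) ) {
	// The response body is the full set of generated directives.
	$robots_text = wp_remote_retrieve_body( $response );
}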
The do_robots() function is called when the ?robots=1 parameter is included in a page request. Here is the code of do_robots() (from https://developer.wordpress.org/reference/functions/do_robots/):
function do_robots() {
	header( 'Content-Type: text/plain; charset=utf-8' );

	/**
	 * Fires when displaying the robots.txt file.
	 *
	 * @since 2.1.0
	 */
	do_action( 'do_robotstxt' );

	$output = "User-agent: *\n";
	$public = get_option( 'blog_public' );

	$site_url = parse_url( site_url() );
	$path     = ( ! empty( $site_url['path'] ) ) ? $site_url['path'] : '';
	$output  .= "Disallow: $path/wp-admin/\n";
	$output  .= "Allow: $path/wp-admin/admin-ajax.php\n";

	/**
	 * Filters the robots.txt output.
	 *
	 * @since 3.0.0
	 *
	 * @param string $output The robots.txt output.
	 * @param bool   $public Whether the site is considered "public".
	 */
	echo apply_filters( 'robots_txt', $output, $public );
}
You can see that the header( 'Content-Type: text/plain; charset=utf-8' ); call sets the response header and that the directives are echoed rather than returned, so calling the function displays the page but does not let me get the directives into a variable. The only way to see the contents of the generated virtual file is to call do_robots(), except that this displays the page; there is no other way to get those directives into a variable that I can find.
The do_robots() function applies the robots_txt filter, which is how you add additional robots.txt directives to the virtual file sent by WP.
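For reference, a minimal sketch of adding an extra directive through that filter (the /private/ path is just an example):

// Append an extra directive to the virtual robots.txt via the robots_txt filter.
add_filter(
	'robots_txt',
	function ( $output, $public ) {
		$output .= "Disallow: /private/\n"; // Example path only.
		return $output;
	},
	10,
	2
);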
The only way to 'grab' the contents of the computed virtual robots.txt file is to call do_robots() and capture its echoed output with the ob_* output-buffering functions. The buffer then holds the entire contents of the virtual robots.txt file (the header() call only sets the HTTP Content-Type header, so it never appears in the buffer), and you clean the buffer afterward so nothing is sent to the browser.
That was what I needed: a way to put the actual generated virtual robots.txt directives into a variable.
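Here is a minimal sketch of that approach, assuming nothing has been sent to the browser yet; the wrapper name get_virtual_robots_txt() is just an illustration:

// Capture the directives that do_robots() echoes, without sending them to the browser.
function get_virtual_robots_txt() {
	ob_start();              // Start buffering echoed output.
	do_robots();             // Echoes the generated directives (and sets a Content-Type header).
	return ob_get_clean();   // Return the buffer contents and discard the buffer.
}

$robots_text = get_virtual_robots_txt();

Note that do_robots() still calls header(), so depending on where this runs the current request may end up with a text/plain Content-Type.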
Note that the other answer detailed how to modify the virtual robots.txt file. That does not let you put all of the directives (including the default entries generated before the robots_txt filter runs) into a variable for use elsewhere in your code.