Blog Privacy: Don't add custom rules to wpcom robots.txt if blog_public=0 #39468
In #35803, these rules were added to robots.txt regardless of the environment. This cluttered up robots.txt and was confusing on Simple sites that are already discouraging search engines.
This change makes the behavior conditional on the environment.
I think we can also revert the wpcom test change: D161949-code
Proposed changes:
Other information:
Jetpack product discussion
p1726689440830569/1726685246.860939-slack-C02AVAR9B
Does this pull request change what data or activity we track or use?
No
Testing instructions:
Full instructions in Field Guide here for both Atomic and WPCOM testing PCYsg-Osp-p2#simple-testing
In WPCOM simple:
Toggle these settings and test individually:
Discourage search engines from indexing this site: robots.txt should just have

User-agent: *
Disallow: /
Prevent third-party sharing for dsmartsandbox.wordpress.com: robots.txt should show the list of AI user agents.
Test also on a WoA site using the Jetpack Beta plugin.
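The expected outcomes above can be sketched as a small decision function: when `blog_public=0`, robots.txt stays minimal; the AI user-agent rules are appended only for indexable sites with third-party sharing prevented. This is a hedged illustration of the logic, not the actual Jetpack implementation — the function name, the default rule, and the agent list (`GPTBot`, `CCBot`) are assumptions for demonstration only.

```python
def build_robots_txt(blog_public: int, prevent_third_party_sharing: bool) -> str:
    """Sketch of the robots.txt behavior described in the testing
    instructions (illustrative only, not Jetpack's real code)."""
    if blog_public == 0:
        # "Discourage search engines" is on: keep robots.txt minimal,
        # regardless of the third-party sharing setting.
        return "User-agent: *\nDisallow: /\n"

    # Illustrative default rule for an indexable site.
    blocks = ["User-agent: *\nDisallow: /wp-admin/\n"]

    if prevent_third_party_sharing:
        # Append AI crawler rules only when the site is indexable.
        for agent in ("GPTBot", "CCBot"):  # example agents only
            blocks.append(f"User-agent: {agent}\nDisallow: /\n")

    return "\n".join(blocks)
```

Keeping the `blog_public=0` branch first mirrors the intent of this PR: a discouraged site never accumulates the extra AI rules.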