+2 votes
2.7k views
in Q2A Core by
Q2A version: 1.5.2

3 Answers

+1 vote
by
edited by
Unfortunately you pretty much can't, since Google needs to know about the file. To be honest it's Google being pretty stupid thinking that showing that file in SERPs is ever useful to end users.

One possibility is to use a gzipped file instead of XML, i.e. sitemap.xml.gz. You'd need to customize the sitemap plugin for that, though.
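To illustrate the gzipped-sitemap idea, here is a minimal sketch of producing sitemap.xml.gz from an existing sitemap.xml using Python's standard library. The function name and paths are placeholders; in practice the Q2A sitemap plugin would need to be customized to emit the gzipped file itself.

```python
import gzip

# Hypothetical helper: compress an existing sitemap.xml into sitemap.xml.gz.
# Paths are illustrative placeholders, not part of the Q2A plugin API.
def gzip_sitemap(xml_path="sitemap.xml", gz_path="sitemap.xml.gz"):
    with open(xml_path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        dst.write(src.read())
```

Crawlers that support sitemaps transparently decompress the .gz file, so the content Google sees is unchanged; only the URL ending differs.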
+1 vote
by
edited by

You can try the robots.txt file. If you want Google to ignore your sitemap.xml, add these lines to your robots.txt:

User-agent: *
Disallow: /sitemap.xml

If you want Google to ignore your entire site, use this in your robots.txt:

User-agent: *
Disallow: /
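You can check how a compliant crawler would interpret these rules with Python's standard-library robots.txt parser (the example.com URLs here are placeholders):

```python
from urllib import robotparser

# Parse the same rules suggested above, as a list of robots.txt lines.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /sitemap.xml"])

# Any path beginning with /sitemap.xml is now off-limits to all crawlers;
# other paths remain fetchable.
blocked = not rp.can_fetch("*", "http://example.com/sitemap.xml")
allowed = rp.can_fetch("*", "http://example.com/index.html")
```

Note that Disallow blocks crawling outright, which is exactly the concern raised in the comment below.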
by
robots.txt, not bobots.txt. I'll edit. Thanks!
by
If you do that, aren't you blocking Google from seeing the sitemap? That totally defeats the purpose of having the sitemap.
0 votes
by
edited by

You can use the X-Robots-Tag HTTP header for non-HTML files such as XML, PDF, .txt, or image files, where robots meta tags cannot be used. Here's an example of adding a noindex X-Robots-Tag directive for XML files across an entire site:

<Files ~ "\.xml$">
  Header set X-Robots-Tag "noindex"
</Files>

Add this code to the .htaccess or httpd.conf file in the public_html directory of an Apache-based web server. This prevents all .xml files from appearing in SERPs while still letting Google crawl the sitemap, unlike the robots.txt approach.
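The `~ "\.xml$"` part of the Files directive is a regular expression matched against the filename. A quick sketch of which names it covers (filenames are illustrative):

```python
import re

# The same pattern used in the <Files ~ "\.xml$"> directive above:
# it matches any filename that ends in ".xml".
pattern = re.compile(r"\.xml$")

def gets_noindex_header(filename):
    return bool(pattern.search(filename))
```

One subtlety: a gzipped sitemap named sitemap.xml.gz does not end in ".xml", so it would not receive the header from this rule; the pattern would need extending to cover it.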

...