+5 votes
1.6k views
in Q2A Core
Is a robots.txt file required for a Q2A site? And if yes, please also tell me what the robots.txt should contain.

3 Answers

+2 votes
Yes, it is required. It helps keep search engines from indexing certain pages, and it also lets you keep away bots you don't want on your site.

For an example, see this file:

https://meta.question2answer.info/robots.txt
How about this?
User-agent: *
Disallow: /login
Disallow: /index.php?qa-rewrite=
Disallow: /ask
Disallow: /forgot
Disallow: /register
Disallow: /questions?sort
Disallow: /chat
Disallow: /admin
Disallow: /activity/*
Disallow: /search?q=
Disallow: /cdn-cgi/
Crawl-delay: 4

User-agent: MJ12bot
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Ezooms
Disallow: /

User-agent: Yandex
Crawl-delay: 30

User-agent: SindiceBot
Crawl-delay: 30

User-agent: CCBot
Crawl-delay: 30

User-agent: wget
Disallow: /

User-agent: WebReaper
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Yahoo Pipes 1.0
Disallow: /

Sitemap: https://electronics2electrical.com/sitemap.xml
+1. Check which bots are hitting your server (see the server's access_log file) and which ones you don't want indexing your site, then exclude those. The bots listed above look fine to me to exclude.

Which site content/pages to exclude is entirely up to you.
OK, thanks.
+1 vote

Yes, a robots.txt file is essential for SEO, and it is a plain text file, not an HTML file. According to Google's guidelines, a robots.txt file gives instructions to web robots about the pages the website owner doesn't wish to be crawled. A typical example (this one is for a WordPress site) looks like this:

User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-includes/js
Disallow: /trackback
Disallow: /*~*
Disallow: /*~
Disallow: /cgi-bin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /feed/
Disallow: /comments/
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Allow: /app/uploads/

Sitemap: http://domainname.com/sitemap.xml

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.

Thanks.
0 votes
Yes, it is a text file that instructs search engine robots how to crawl and index pages on a website, and which areas of the site should not be scanned.
...