
Is it necessary to have a robots.txt file for SEO?

+5 votes
346 views
asked Jan 23 in Q2A Core by Zeeshan
And if yes, please also tell me what the robots.txt file should contain.

3 Answers

+2 votes
answered Jan 23 by ProThoughts
Yes, it is required. It helps search engines avoid indexing certain pages, and you can also keep away bots you don't want.

See this file:

https://meta.question2answer.info/robots.txt
commented Jan 23 by Zeeshan
edited Jan 23 by Zeeshan
How about this?
User-agent: *
Disallow: /login
Disallow: /index.php?qa-rewrite=
Disallow: /ask
Disallow: /forgot
Disallow: /register
Disallow: /questions?sort
Disallow: /chat
Disallow: /admin
Disallow: /activity/*
Disallow: /search?q=
Disallow: /cdn-cgi/
Crawl-delay: 4

User-agent: MJ12bot
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Mozilla/5.0 (compatible; Ezooms/1.0; ezooms.bot@gmail.com)
Disallow: /

User-agent: Yandex
Crawl-delay: 30

User-agent: SindiceBot
Crawl-delay: 30

User-agent: CCBot
Crawl-delay: 30

User-agent: wget
Disallow: /

User-agent: WebReaper
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Yahoo Pipes 1.0
Disallow: /

Sitemap: https://electronics2electrical.com/sitemap.xml
commented Jan 24 by ProThoughts
Check which bots are bothering your server (see the server's access_log file) and which ones you don't want indexing your site, then exclude those. The bots above look fine to me; you can exclude them.

Which site content/pages to exclude is entirely up to you.
commented Jan 24 by Zeeshan
OK, thanks.
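A quick way to sanity-check rules like the ones pasted above is Python's standard-library urllib.robotparser. This is just a sketch: the rules string below reproduces a subset of the file from the comment, and the example.com URLs are placeholders for illustration.

```python
from urllib.robotparser import RobotFileParser

# A subset of the rules from the robots.txt above, for illustration
rules = """\
User-agent: *
Disallow: /login
Disallow: /admin
Disallow: /search?q=
Crawl-delay: 4

User-agent: MJ12bot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A generic crawler may fetch question pages but not /login
print(rp.can_fetch("*", "https://example.com/questions"))  # True
print(rp.can_fetch("*", "https://example.com/login"))      # False

# MJ12bot is blocked from the whole site
print(rp.can_fetch("MJ12bot", "https://example.com/questions"))  # False

# The crawl delay for generic agents is also exposed
print(rp.crawl_delay("*"))  # 4
```

Note that robotparser only tells you what a well-behaved crawler *should* do; badly behaved bots ignore robots.txt entirely, which is why server-side blocking is sometimes needed on top of it.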
0 votes
answered Feb 16 by priyarana
edited Mar 1 by priyarana

Yes, a robots.txt file is important for SEO, and it is a plain text file, not an HTML file. According to Google's guidelines, a robots.txt file gives instructions to web robots about the pages the website owner doesn't want crawled. A typical example (this one is for a WordPress site) contains the following:

User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-includes/js
Disallow: /trackback
Disallow: /*~*
Disallow: /*~
Disallow: /cgi-bin/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /feed/
Disallow: /comments/
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Allow: /app/uploads/

Sitemap: http://domainname.com/sitemap.xml

The "User-agent: *" means the section applies to all robots. Each "Disallow" line tells robots not to visit URLs under the listed path; a bare "Disallow: /" would tell them not to visit any page on the site.
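To see the difference in practice, here is a minimal sketch using Python's standard urllib.robotparser: "Disallow: /" blocks the whole site, while listing specific prefixes blocks only those paths. The rule strings and example.com URLs are illustrative only.

```python
from urllib.robotparser import RobotFileParser

def allowed(rules: str, agent: str, url: str) -> bool:
    """Parse a robots.txt string and check whether a URL may be fetched."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

# "Disallow: /" blocks every page on the site...
block_all = "User-agent: *\nDisallow: /\n"
print(allowed(block_all, "*", "https://example.com/feed/"))  # False

# ...while specific prefixes block only matching paths.
partial = "User-agent: *\nDisallow: /wp-admin\nDisallow: /feed/\n"
print(allowed(partial, "*", "https://example.com/feed/"))       # False
print(allowed(partial, "*", "https://example.com/post/hello"))  # True
```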

commented Feb 16 by Zeeshan
Thanks.
0 votes
answered Mar 14 by seoeasy
Yes. It is a text file that instructs search engine robots how to crawl and index pages on a website, and which areas of the site should not be scanned.
...