I need to reduce the number of pages Google crawls on my site.
I would like to restrict the crawl to the sections with the most information, to make sure Google crawls the best stuff.
I can do this in the tools section of the forum, or through robots.txt, but would this show as an error in my Google Search Console? Or would it not show at all?
Also, does anyone have an example of what robots.txt they use? Any advice on this would be massively appreciated.
I use a standard robots.txt; I've tried everything, even one 25 lines long. The load on my site is minimal, with an average of 400 visitors per day. You should have caching enabled in the forum settings, and no plugins are needed except for a captcha. Hope that helps!
User-agent: *
Disallow: /wp-admin/
Disallow: /r/
Disallow: /wp-json/
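If you want to confirm the file your server actually serves matches what you edited, here's a minimal Python sketch, assuming your robots.txt lives at the usual root path (swap in your own domain; I've used the one from the sitemap URLs in this thread):

# Fetch the live robots.txt and print it, so you can compare it
# against the rules you think you deployed.
import urllib.request

url = "https://percysgrowroom.com/robots.txt"
with urllib.request.urlopen(url) as resp:
    print(resp.read().decode("utf-8", errors="replace"))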
This is how mine looks:
User-Agent: *
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Disallow: /forum/profile/
Sitemap: https://percysgrowroom.com/forum/sitemap.xml
Sitemap: https://percysgrowroom.com/sitemap_index.xml
I added the profile bit last night, and just added the line you told me to. Is this OK, do you think?
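Looks reasonable to me. If you want to sanity-check the rules locally before Google recrawls, Python's standard-library robot parser can evaluate them. A rough sketch (the test paths are made-up examples; also note this parser applies rules in file order, while Google prefers the most specific rule, though both agree here because the Allow lines come first):

# Evaluate the rules above against a few sample paths without
# touching the network. Paths are illustrative, not real URLs.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-Agent: *
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Disallow: /forum/profile/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for path in (
    "/wp-admin/admin-ajax.php",   # should stay crawlable (Allow wins)
    "/wp-admin/options.php",      # should be blocked
    "/forum/profile/someone",     # your new rule should block this
    "/forum/some-thread/",        # normal content should stay crawlable
):
    url = "https://percysgrowroom.com" + path
    print(path, "->", "crawlable" if rp.can_fetch("*", url) else "blocked")

If the /forum/profile/ path prints "blocked" and your normal forum pages print "crawlable", the new line is doing what you want.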
Thanks for the input guys, I really appreciate the help.