I need to reduce the number of pages Google crawls on my site.
I would like to restrict the crawl to the sections with the most information, to make sure Google crawls the best stuff.
I can do this in the tools section of the forum, or through robots.txt, but would this show as an error in my Google Search Console? Or would it not show at all?
Also, does anyone have an example of the robots.txt they use? Any advice on this would be massively appreciated.
I have a standard robots.txt; I've tried everything, even a 25-line one. The load is minimal, with an average of 400 visitors per day. You should have caching enabled in the forum settings, and no plugins are needed except for the captcha one. I hope this helps!
This is how mine looks:

User-agent: *
Disallow: /wp-admin/
Disallow: /r/
Disallow: /wp-json/
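A quick way to sanity-check rules like these before deploying them is Python's standard-library robots.txt parser. This is just a sketch: the domain `example.com` and the URL paths are placeholders, not from anyone's actual site.

```python
from urllib.robotparser import RobotFileParser

# The rules under discussion, as a string (placeholder paths).
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /r/
Disallow: /wp-json/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot falls under "User-agent: *" here, so these calls show
# what Google is and isn't allowed to crawl under these rules.
print(rp.can_fetch("Googlebot", "https://example.com/forum/topic/123"))  # allowed
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))        # blocked
```

Note that Disallow rules only stop crawling, not indexing: a blocked URL can still appear in results if other sites link to it, which is also why blocked-but-indexed pages can show up as notices in Search Console rather than errors.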
I added the profile bit last night, and just added the line you told me to. Is this OK, do you think?
Thanks for the input guys, I really appreciate the help
@percysgrowroom Seems right, though I'm no expert in all that.