Search engine bots crawl websites to discover pages and understand their content.
Robots.txt tells search engines which pages should not be crawled, preventing bots from spending time on low-quality or irrelevant content.
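As an illustration, a minimal robots.txt file placed at the site root might look like the sketch below; the paths and sitemap URL are hypothetical.

```
# Hypothetical robots.txt served at https://example.com/robots.txt
# Ask all crawlers to skip internal search and cart pages,
# while leaving the rest of the site crawlable.
User-agent: *
Disallow: /search/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```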
Rendering is the process of running a page's code, such as JavaScript, so that search engines can assess the content it produces.
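For instance, a page whose main content is injected by JavaScript only becomes visible once it has been rendered; the snippet below is a purely illustrative sketch of such a page.

```html
<!-- Hypothetical page: the <div> is empty in the raw HTML, so a crawler that
     does not render the page sees nothing, while a rendering crawler sees
     the paragraph injected by the script. -->
<div id="content"></div>
<script>
  document.getElementById("content").innerHTML =
    "<p>This text only appears after the JavaScript has been executed.</p>";
</script>
```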
Meta robots tags are concise snippets embedded in a webpage's HTML that instruct search engines not to index the page or not to follow a particular link.
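A minimal sketch of both instructions, using hypothetical URLs, might look like this:

```html
<!-- Ask search engines not to index this page. -->
<meta name="robots" content="noindex">

<!-- Ask search engines not to follow this particular link. -->
<a href="https://example.com/untrusted-page" rel="nofollow">Untrusted link</a>
```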
If a page meets specific criteria, it can be categorised and included in the search engine index.
Canonical tags help manage duplicate content by signalling which version of a page you prefer search engines to index.
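For example, if the same product page is reachable at several URLs, a canonical tag in the page's head can point to the preferred one; the URL below is hypothetical.

```html
<!-- Tell search engines that this URL is the preferred version of the page,
     even if it is also reachable at other addresses (e.g. with tracking parameters). -->
<link rel="canonical" href="https://example.com/products/blue-widget">
```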