John Mueller of Google shared some facts about how your site gets indexed and recommended the following ways to improve it:
1. If you misguide Googlebot, you will have to pay for it:
Never try to cloak content from the crawlers by redirecting them to some unsupported site. Also keep in mind that Googlebot currently does not support service workers, the Fetch API, Promises, or requestAnimationFrame, which is why experts from SEO company Kansas advise their clients to use genuine links. If you want to make your content genuinely available to all users, use techniques like feature detection and progressive enhancement.
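As a rough illustration, a feature-detection and progressive-enhancement pattern might look like the following sketch (the element id, fragment URL, and fallback link are hypothetical):

    // The static HTML already contains a plain link such as
    // <a href="/related-articles.html">Related articles</a>, so crawlers and
    // older browsers can always reach the content. The script below only
    // enhances the page when the Fetch API is actually available.
    if ('fetch' in window) {
      fetch('/fragments/related-articles.html')      // hypothetical fragment URL
        .then((response) => response.text())
        .then((html) => {
          const target = document.getElementById('related-articles');
          if (target) {
            target.innerHTML = html;                  // replace the static section in place
          }
        });
    }

The baseline content works without the script, and the enhancement only runs where the feature exists.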
2. Use the rel=canonical attribute:
Use this attribute when you serve the same content to your users from different URLs, so that Google knows which version to index.
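For example, if the same article is reachable at several addresses (the URLs below are made up), each duplicate page would declare the preferred version in its <head> like this:

    <!-- On https://www.example.com/blog/indexing-tips?sessionid=123 and the print version -->
    <link rel="canonical" href="https://www.example.com/blog/indexing-tips">

This tells Google which URL to treat as the main one when it consolidates the duplicates.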
3. AJAX crawling is now deprecated:
Check any of your older sites that are built on this scheme. While migrating them, do not forget to remove the 'fragment' meta tags.
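For reference, the tag that opted a page into the old AJAX crawling scheme, and that should be deleted during the migration, looks like this:

    <!-- Remove this when moving off the deprecated AJAX crawling scheme -->
    <meta name="fragment" content="!">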
4. No more '#' in URLs:
If you want your site to be indexed by Google, avoid using '#' in your URLs. Googlebot generally ignores everything after this character when indexing your site. Use normal URLs consisting of a path, a filename, and query parameters.
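For instance, of the two made-up addresses below, only the first follows the path/filename/query-parameter pattern and can be indexed as its own page; everything after the '#!' in the second is ignored by the crawler:

    https://www.example.com/products/shoes?color=red      (indexable: path, filename, query parameters)
    https://www.example.com/#!/products/shoes?color=red   (fragment URL: not indexed as a separate page)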
5. Do some work from your side as well:
Keep checking how your web pages are indexed by using the Fetch and Render tool in Search Console. It gives you an insight into how Googlebot sees your site. Note that the tool does not support URLs that contain '#!' or '#'.
6. Check your robots.txt file:
Make sure your robots.txt file does not block the JavaScript, CSS, and image files that Googlebot needs in order to render and index your pages.
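As a rough sketch, a robots.txt that keeps rendering resources crawlable while still protecting private areas might contain rules like these (the paths are hypothetical):

    User-agent: *
    # Let Googlebot fetch the scripts and stylesheets it needs to render pages
    Allow: /assets/js/
    Allow: /assets/css/
    # Keep genuinely private sections blocked
    Disallow: /admin/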
7. Limit the number of embedded resources:
Every embedded script, stylesheet, and other resource is an extra fetch that Googlebot must make before it can render your page, so pages that pull in too many of them are slower to render and harder to index.