Answer:
☛ Submit and check a sitemap
☛ Check and set the crawl rate, and view statistics about how Googlebot accesses a particular site
☛ Generate and test a robots.txt file, and discover pages that are inadvertently blocked by robots.txt
☛ List internal and external pages that link to the site
☛ Get a list of broken links for the site
☛ See which Google search queries led to the site appearing in the SERPs, and the click-through rates of those listings
☛ View statistics about how Google indexes the site, and any errors it encountered while doing so
☛ Set a preferred domain (e.g. prefer example.com over www.example.com or vice versa), which determines how the site URL is displayed in SERPs
☛ Highlight to Google Search elements of structured data, which are used to enrich search hit entries (released in December 2012 as Data Highlighter)[1]
☛ Receive notifications from Google for manual penalties.
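To illustrate the robots.txt checking described above, the following sketch uses Python's standard-library `urllib.robotparser` to test whether a given URL is crawlable under a set of rules. The `ROBOTS_TXT` content and the `example.com` URLs are hypothetical, purely for demonstration; the Webmaster Tools robots.txt tester performs an analogous check against your live file.

```python
from urllib import robotparser

# Hypothetical robots.txt content (illustrative only)
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The home page is allowed, the /private/ area is blocked
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
print(parser.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

A page that should be indexed but returns `False` here is exactly the kind of inadvertent block the tool helps you find.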