Google Search Console is an SEO fundamental. It helps you monitor and maintain your website's presence in Google Search results. Whenever you create a website, you need to inform the search engine about it by submitting it for crawling and indexing through the Search Console. When submitting a sitemap for indexing, you can control which pages should be indexed and which should not. The Search Console also helps you analyze keyword rankings, spot any manual penalty actions, and manage many other technical aspects of SEO.
Search Console Verification:
Every website added to the Search Console needs to be verified to confirm that you own it. There are several ways to do this: uploading an HTML verification file, adding a meta tag to the website's pages, verifying through Google Analytics, or verifying through Google Tag Manager. Use any one of these methods to verify your site ownership.
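For the meta tag method, Search Console gives you a tag containing a unique verification token that you place inside the `<head>` of your homepage. A sketch of what this looks like (the token value here is a placeholder, not a real one):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Verification tag from Search Console; the content value is a
       placeholder token, yours will be unique to your property -->
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>My Website</title>
</head>
<body>
  ...
</body>
</html>
```

Once the tag is live, click "Verify" in Search Console and Google will fetch the page and check for it.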
A sitemap helps search engines crawl and index your content more effectively. It is an XML file in which your website's sections are listed. Sitemaps are especially helpful when you have a large website with a complicated structure.
Having a sitemap doesn't directly improve traffic or rankings; it exists for your benefit, and you will not be penalized for not having one. Not every website needs a sitemap. A sitemap should not contain more than 50,000 URLs, and the file size should not exceed 50 MB. Always place the sitemap in the site's root directory.
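A minimal sitemap following the sitemaps.org XML format looks like the sketch below (the domain and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2024-01-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Save this as `sitemap.xml` in the root directory and submit its URL in Search Console under Sitemaps.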
Robots.txt is the file that tells crawlers (search engine robots) which sections of the website should be crawled and which should not. It can be accessed at https://domainname.com/robots.txt and is public. Edit this file when you don't want certain scripts, unnecessary files, or images to be indexed.
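A simple robots.txt might look like this (the disallowed paths here are hypothetical examples, swap in the directories you actually want to block):

```
# Apply these rules to all crawlers
User-agent: *
# Block example directories you don't want indexed
Disallow: /scripts/
Disallow: /tmp/
# Point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved crawlers respect it, but it is not an access-control mechanism, and because the file is public it should never be used to hide sensitive URLs.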
Make sure your technical on-page SEO is strong so that your website ranks sooner, attracts more traffic, and you stay notified about any actions taken against it.