
Do you need manual submission in Google Search Console?


    Do you need manual submission in Google Search Console? - As of this writing (October 21, 2020), many bloggers are still confused because they cannot request indexing in Google Search Console while Google is updating or maintaining the Search Console feature.

    However, the question for me personally is: do we really need to do Request Indexing manually every time we publish new content? Let's talk about this together.

    What is Google Search Console?

    Let's start with a simple question: what is Google Search Console? I will quote an explanation directly from Google to answer that question.

    According to the Google Team,

    Search Console tools and reports help you measure your site's Search traffic and performance, fix issues, and make your site shine in Google Search results.

    So according to the Google Team itself, Google Search Console is a tool for getting reports about the performance of our website or blog in Google search results. In addition, the Google Team also explains that the functions of Google Search Console are to:

    1. Know your site/blog traffic from Google Search.
    2. Measure your site's performance.
    3. Fix problems on your website/blog related to search results.
    4. Make your site shine in Google Search results.

    So far, we understand that Google Search Console is not intended to make Google index our web/blog faster.

    So, if there is an SEO master or anyone else who insists that you must always request indexing every time you create new content, THAT IS A BIG MISTAKE.

    Does Request Indexing Speed Up Google's Index?

    To answer that question, I say: MAYBE.

    Why?

    Because the indexing process is more or less like this:

    1. GoogleBot visits a website/blog page.
    2. GoogleBot processes the data on the page.
    3. GoogleBot decides whether the data is appropriate to store in the Google database.
    4. If appropriate, GoogleBot stores the data in a database to later serve to Google users.

    This fourth stage is known as indexing.
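
    To make those four steps a little more concrete, here is a rough sketch of that visit, process, decide, and store flow in Python. This is only my own illustration of the idea, not Google's actual code; the function names, the placeholder URL, and the simple quality check are all invented for the example.

```python
# A conceptual sketch of the four steps above: visit, process, decide, store.
# This is an illustration only, not how GoogleBot really works. The URL and
# the quality check are placeholders invented for the example.
import urllib.request


def crawl(url: str) -> str:
    """Step 1: the bot fetches the page."""
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")


def process(html: str) -> dict:
    """Step 2: the bot extracts the data it cares about from the page."""
    return {"length": len(html), "has_title": "<title>" in html.lower()}


def worth_indexing(page_data: dict) -> bool:
    """Step 3: decide whether the data is appropriate to keep."""
    return page_data["has_title"] and page_data["length"] > 500


def store(url: str, page_data: dict, database: dict) -> None:
    """Step 4: keep the data so it can later be served to searchers (indexing)."""
    database[url] = page_data


if __name__ == "__main__":
    index = {}  # stands in for the Google database
    page_url = "https://example.com/"  # placeholder page
    data = process(crawl(page_url))
    if worth_indexing(data):
        store(page_url, data, index)
    print(index)
```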

    I say maybe because the Request Indexing feature forces GoogleBot to crawl our new post page as soon as possible, a page that GoogleBot would probably visit anyway even without the request.

    GoogleBot will actually visit our site if we tell (ping) Google or provide our blog/website's sitemap address to Google.
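
    As a small illustration of that ping, the sketch below notifies Google of a sitemap address from a script. The sitemap URL is a placeholder, and the ping endpoint shown is the one Google documented at the time this post was written; a successful response only means the ping was received, not that anything has been indexed.

```python
# A minimal sketch of pinging Google with a sitemap address.
# Replace SITEMAP_URL with your own sitemap; it is a placeholder here.
import urllib.parse
import urllib.request

SITEMAP_URL = "https://example.com/sitemap.xml"

ping_url = (
    "https://www.google.com/ping?sitemap="
    + urllib.parse.quote(SITEMAP_URL, safe="")
)

with urllib.request.urlopen(ping_url, timeout=10) as response:
    # A 200 status only means Google received the ping.
    # GoogleBot still decides on its own when to actually crawl the pages.
    print("Ping status:", response.status)
```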

    However, GoogleBot tries to give us bloggers and content creators time to find and write down other ideas by visiting only a few times a day. In my experience, it is generally about once a day.

    How do you know? Look at the following picture.

    Sitemap report in Google Search Console

    Pay attention to the number of Discovered URLs.

    By the time I took the screenshot, I had published 3 articles, but Google had only discovered 2 of them.

    That happens because the ATOM or RSS feed is updated automatically when we publish new content, but it takes some time for Google to find the new URL in the sitemap.
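
    For context, a sitemap is just an XML file that lists your URLs and when they last changed; the lastmod date is what tells GoogleBot that something is new. The sketch below writes a one-entry sitemap by hand just so you can see that field; the URL is a placeholder, and in practice your blogging platform generates this file for you.

```python
# Writes a minimal one-entry sitemap.xml so the <lastmod> field is visible.
# The URL is a placeholder; blogging platforms normally generate this file.
from datetime import date

SITEMAP_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-post/</loc>
    <lastmod>{lastmod}</lastmod>
  </url>
</urlset>
"""

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(SITEMAP_TEMPLATE.format(lastmod=date.today().isoformat()))

print("sitemap.xml written")
```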

    Why wasn't it found? Because GoogleBot doesn't just camp on your website or blog's sitemap, waiting for it to update while smoking and drinking coffee.

    No!

    GoogleBot continues to surf nearly 2 billion web pages to continuously index the latest content that emerges.

    Therefore, it takes time for GoogleBot to come back to our website or blog's sitemap. From my experience, GoogleBot usually returns to visit our web/blog via our sitemap after about 24 hours.

    In some cases, GoogleBot flags sites with frequent updates to be visited multiple times. Some examples are news sites, information portals, government websites, encyclopedias and other reference sites that share useful information that others need urgently.

    For such sites, GoogleBot usually updates faster and visits them more often. And even then only credible sites. Not all news portals are a priority!

    So, do you need an indexing request?

    Whether it is necessary or not comes back to each individual. The point is that we now both understand what Request Indexing actually is.

    If you want your content to appear in Google search results sooner, then please take the time to request indexing.

    I myself prefer to share new content on social media right after publishing it, because the content I try to create is aimed at people first, then search engines.

    After all, Neil Patel, an SEO expert, also said that if a lot of people like a piece of content, search engines will like it too. In other words, if the content we just created is seen by many people, search engines will also pay attention to it.

    That's how it works.

    What I do in Google Search Console is just submit a sitemap. And it still works, although you need to wait about 24 hours for new content to be indexed properly by Google.

    Ninura search results on Google

    Take a look at the picture above.

    Even with me publishing 1 piece of content every day, Google still regularly indexes the content I create.

    In fact, Google even automatically gave one of my posts a rich snippet. That's because we took it easy.

    Don't mess around and make things unpleasant for GoogleBot.

    GoogleBot is also happy to browse the web and our content.

    Closing

    Because I'm still learning about SEO, I listen to and read a lot of content from SEO elders and masters. One of them is Neil Patel.

    Neil Patel once said that he took advantage of the Request Indexing feature in Google Search Console when he updated old indexed content.

    I repeat: old content that has already been indexed.

    Why? So that GoogleBot also updates how that content appears in search results.

    But Neil doesn't bother doing manual submission for every new piece of content.

    If Request Indexing really determined whether indexing is fast or not, then what about news portals like Tribun or Kompas? They can publish 30-50 articles per day, while the Request Indexing quota in Google Search Console is small, only about 10-15 submissions per day.

    But for the latest news keywords, their fresh articles can still appear.

    So, the choice is yours. If you feel your content has value for people, don't worry. Google can tell which posts are made for other things (whether it's AdSense or search results only) and which are made for people.

    That's all I have to say about whether you need to submit manually or request manual indexing in Google Search Console. Sorry if the discussion digresses here and there. If something is unclear, please ask in the comments column below.


    If you like, also read my article entitled Latest Free Quality Backlink List for SEO. Hopefully useful, and see you in other content.
