Every year Google rolls out changes to its search algorithm, making it more refined, dynamic and user-friendly.
The changes are sometimes minor and sometimes major roll-outs, such as Google Penguin, which affected search results significantly. True to form, Google did not fail to surprise us again with its 2018 algorithm update. Search engine news magazines have reported that Google has published a new algorithm that creates unique, highly precise articles by summarizing multiple pieces of content available on the internet.
This gives writers of articles and blogs a reason to worry. The new algorithm can take any number of web pages and turn them into a logically reasoned, coherent article. These articles will mainly be used to satisfy a user’s query without referring the user to another website.
The General Strategies Influencing the Algorithm
The algorithm consists mainly of two sub-algorithms: one produces extractive summaries and the other produces abstractive summaries.
A brief overview of the two concepts is given below:
Extractive Summaries
The general aim of this strategy is to collect only the important details.
The algorithm first collects web content for multi-document summarization, picking up the relevant content from websites or blogs that best answers the query put forward by a searcher.
The content is then extracted and filtered by removing all the irrelevant facts.
These final extracted and filtered snippets are called “extractive summaries,” and all of this is done by Google without crediting the sources referred to in its article.
The idea is to provide users with relevant information excluding unnecessary details.
The process can be compared to the précis or summary writing that students do in school.
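Google has not published how its extractive step works, but the general idea can be illustrated with a minimal Python sketch. Everything below (the function names, the naive sentence splitting and the scoring rule) is an illustrative assumption, not Google’s actual method: sentences from several source documents are scored against the query, and only the highest-scoring ones are kept.

import re
from collections import Counter

def extractive_summary(documents, query, top_n=3):
    """Score each sentence by overlap with the query plus overall word
    frequency, then keep only the highest-scoring sentences."""
    # Split every source document into sentences (very naive splitting).
    sentences = []
    for doc in documents:
        sentences.extend(s.strip() for s in re.split(r"(?<=[.!?])\s+", doc) if s.strip())

    # Word frequencies across all documents act as a crude relevance signal.
    words = re.findall(r"\w+", " ".join(documents).lower())
    freq = Counter(words)
    query_terms = set(re.findall(r"\w+", query.lower()))

    def score(sentence):
        tokens = re.findall(r"\w+", sentence.lower())
        if not tokens:
            return 0.0
        overlap = sum(1 for t in tokens if t in query_terms)
        density = sum(freq[t] for t in tokens) / len(tokens)
        return overlap * 10 + density  # query overlap dominates

    # "Toss out" low-scoring sentences; keep the essential ones in rank order.
    ranked = sorted(sentences, key=score, reverse=True)
    return ranked[:top_n]

if __name__ == "__main__":
    docs = [
        "Penguins are flightless birds. They live mostly in the Southern Hemisphere. Penguins eat krill and fish.",
        "The emperor penguin is the tallest penguin species. It breeds during the Antarctic winter.",
    ]
    print(extractive_summary(docs, "what do penguins eat", top_n=2))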
Abstractive Summaries
Once the extractive summaries are ready, the next step is to combine all the summaries obtained from the various content sources that the query processor picked earlier.
The major drawback is that the combination may not present the information in a correct, systematic sequence. Only about a quarter of the generated content would be factual, with the rest made up of inaccurate information. This is an issue Google is still working on so that the algorithm gives relevant results for users’ queries.
Google uses both of these approaches in tandem to accomplish the algorithm’s purpose.
Extractive summarization is used first to obtain only the essential segments from the source content, and abstractive summarization is then used to paraphrase and combine the extracted summaries.
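Google’s abstractive model has not been released either. As a rough stand-in, the fusing-and-paraphrasing step can be illustrated with an off-the-shelf summarization model from the Hugging Face transformers library; the model choice and the length limits below are assumptions made purely for illustration, not Google’s system.

# A rough illustration of the abstractive step using an off-the-shelf
# seq2seq summarizer from the Hugging Face "transformers" library.
# This is a stand-in for illustration only; Google's actual model has
# not been released and almost certainly differs.
from transformers import pipeline

def abstractive_summary(extractive_snippets, max_length=60, min_length=10):
    """Fuse and paraphrase the extractive snippets into one short passage."""
    combined = " ".join(extractive_snippets)
    summarizer = pipeline("summarization")  # downloads a default pretrained model
    result = summarizer(combined, max_length=max_length, min_length=min_length)
    return result[0]["summary_text"]

if __name__ == "__main__":
    snippets = [
        "Penguins eat krill and fish.",
        "The emperor penguin is the tallest penguin species.",
        "Penguins live mostly in the Southern Hemisphere.",
    ]
    print(abstractive_summary(snippets))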
So How Will the Algorithm Perform?
As Search Engine Journal reports on the upcoming launch of Google’s new algorithm, here is a summary of how it works (a rough end-to-end sketch follows the list):
It collects the web pages or content blocks from the SERPs (Search Engine Results Pages) returned when Wikipedia topics are processed as queries.
It uses extractive summarization to produce concise, coherent segments of information while rejecting unnecessary details, a basic procedure for shortening lengthy texts by extracting only the essentials.
After concise details have been obtained from the various sources, they are combined using the abstractive summary concept: the essential segments are fused into sentences, then paragraphs, and so on, leading to a crisp, concise, informative article.
Finally, Google claims that an article generated through the above steps can pass a human evaluation.
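Tying the listed steps together, here is a toy end-to-end sketch under heavy assumptions: fetch_serp_documents is a purely hypothetical placeholder for the SERP retrieval Google performs internally, and the two summary functions are the sketches from the earlier sections, not Google’s actual pipeline.

# Putting the listed steps together: a toy pipeline under heavy assumptions.
# It reuses extractive_summary() and abstractive_summary() from the sketches
# above; fetch_serp_documents() is a hypothetical stand-in for the SERP
# retrieval that Google performs internally.
def fetch_serp_documents(topic):
    # Hypothetical helper: in reality the top-ranking pages for the
    # Wikipedia topic would be fetched here. Canned text keeps it runnable.
    return [
        "Penguins are flightless birds found mostly in the Southern Hemisphere.",
        "The emperor penguin breeds during the Antarctic winter.",
    ]

def generate_article(wikipedia_topic):
    documents = fetch_serp_documents(wikipedia_topic)                   # step 1: collect SERP content
    snippets = extractive_summary(documents, wikipedia_topic, top_n=5)  # step 2: keep the essentials
    return abstractive_summary(snippets)                                # step 3: fuse and paraphrase

print(generate_article("Penguin"))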
All this discussion of the algorithm leads to the main question on everybody’s mind.
Is Google Exploiting Website Builders?
This is the question on the minds of most web developers, content writers and bloggers: is Google using their content in its summarizing algorithm?
The answer is yes: any public web page can be included. And not just one piece of content; the algorithm summarizes more than one content source at a time, drawing on multiple documents.
The process uses Wikipedia topics as queries, and the results returned on the Search Engine Results Pages (SERPs) are used as the source content for the algorithm to work on.
Content extraction and summarization are performed on the websites, articles or documents that the search engine returns, which are then paraphrased and combined to generate a brand-new concise article.
This article can answer the searcher’s question better and, most importantly, more precisely, without the inconvenience of clicking through suggested website links, but it won’t include any information about the sources used to build it.
In conclusion, the algorithm appears to have been successful: Google can now create its own content precisely by extracting information from existing web pages without giving any credit to the source pages.
It is, therefore, a reason for concern for blog writers and website designers. On the brighter side, users will no longer have to wade through unnecessarily long texts and lengthy articles to get their answers.
Clicking through multiple suggested websites to find answers relevant to a query consumes an enormous amount of users’ time, so this is good news for them.
Google will have already done the research on the user’s behalf through its algorithm to find the most precise and relevant answer to the query. Tapping into multiple websites to find the perfect answer is now Google’s responsibility. All the user needs to do is type in a question and hit the search button.