Google's new indexing policy has certainly raised some eyebrows, and probably drawn a few smiles as well. News reports say the search engine intends to index comments on Facebook, the most popular social network, as well as on other sites whose content is reachable only through HTTP POST requests.
Indexing in Google
Along with indexing comments, Google would start returning user comments as search results. The idea is to surface everything hidden behind JavaScript-based widgets such as the Facebook and Disqus comment plugins. These simple forms make it easy for people to interact on the web with very little effort, which is exactly why they are so popular.
On the web there are two main types of request: GET and POST. A GET request reads data, while a POST request can change it. For that reason, search engine crawlers (such as Google's) have traditionally limited themselves to GET requests. Since reading data does not alter the content being read, Googlebot (the program that decides which pages to crawl and tracks documents across the Internet) could be seen as a passive observer. Now, however, it can interact with (and potentially even modify) the content it crawls, although it is unlikely that Googlebot will actually change anything.
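To make the distinction concrete, here is a minimal sketch in Python (the URL and form fields are hypothetical, purely for illustration) showing how a GET differs from a POST at the HTTP level:

    # Sketch of the GET/POST distinction using the "requests" library.
    # The URL and payload below are hypothetical examples.
    import requests

    # GET: read-only - fetching a page leaves the server's data unchanged.
    # This is the kind of request crawlers have always issued.
    page = requests.get("https://example.com/article/42/comments")
    print(page.status_code)

    # POST: may change state - this is what a comment widget sends when a
    # user submits a comment, and what crawlers have traditionally avoided.
    reply = requests.post(
        "https://example.com/article/42/comments",
        data={"author": "reader", "text": "Nice post!"},
    )
    print(reply.status_code)

The point is not that Googlebot will start submitting comments, but that it may now issue the same kind of POST request a comment widget uses in order to see the content that request returns.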
This makes it much easier for Google to index web content. With the advent of Ajax (a technique that shrinks the delay between a click and the content it loads), this capability could have been built into Google's algorithm long ago; in the end, the decision came down to deliberate choice.
The upside
Expanding the search engine's reach broadens the content it covers and makes that content more useful. Users will be able to find more relevant material and click through to the links that interest them.
This change may also be good news for SEO specialists who have underestimated the value of comment areas. Until now, blog comments have had little effect on a site's promotion; once these changes take effect, however, the text inside comment blocks will be taken into account by Google.
More importantly, it will push users to think harder about what they write in comments, and about what is better left unsaid.
The downside
Looking at the other side of the story, developers are uneasy about POST requests coming from Google's bot. Letting robots submit forms increases the chance of errors, and possible incidents should not be underestimated. However, the robots.txt file can be used to keep Googlebot away from individual forms on a site, for example by disallowing the URLs those POST requests target, as sketched below.
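A minimal sketch of such a rule, assuming a hypothetical comment-submission path of /comments/submit:

    # Hypothetical robots.txt entry blocking Googlebot from a form endpoint
    User-agent: Googlebot
    Disallow: /comments/submit

Standard robots.txt rules are expressed per URL path, not per HTTP method, so the practical approach is to disallow the specific URLs that the site's forms post to.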
Privacy-minded people are unlikely to be happy that their comments are now exposed. Facebook users rely on the platform's privacy settings when they do not want to be fully public, but once indexing begins, their names and comments will be out in the open. Many people regard this information as personal and private, and they will not enjoy the feeling that a bot is watching them. Facebook itself may yet find some way to protect its users here.
Let's wait and see
Google says it is approaching this responsibly and avoiding anything that could lead to "unintended user actions." With the growing popularity of community portals such as Facebook, traffic has become a key component of the industry. Still, many users may feel uncomfortable that what they write is now on display for everyone.
It is time to weigh the advantages and disadvantages of this innovation. Words can sometimes cut deeper than any weapon; on the other hand, it may be a good thing that people pay closer attention to what others are saying.