SEO tactics webmasters should avoid – SEO tips
Web promotion and advertising specifically optimized for search engines (Search Engine Optimization), so-called SEO work, is comprehensive and complex. Because search engine algorithms change constantly, SEO tactics are complex and must evolve with them. Google, for example, uses hundreds of factors in its page-ranking algorithm. Moreover, search engines treat the secrecy of their algorithms as a top priority for two main reasons:
They do not want competitors to know how their ranking works.
They do not want webmasters or spammers to abuse any single SEO tactic to gain an unfair ranking advantage.
There is one other reason SEO work keeps getting more complex: SEO theory and practice have changed rapidly in recent years. Tactics that webmasters and SEO experts applied a few years ago no longer work today.
Changes in how the Web looks and behaves, and the corresponding changes in search engine algorithms, make SEO an ongoing and ever more complex job. Many questions and problems in the SEO profession are still considered mysterious. In this article, VietSEO sums up ten obsolete SEO tactics you should avoid. VietSEO.net hopes these brief explanations will be useful to SEO practitioners and webmasters.
Relying on the Keywords meta tag
This is the first taboo for a simple reason: search engines have not relied on the Keywords meta tag to determine the content of web pages for more than three years. Instead, they analyze the content displayed to the user to determine what a page is about and how to rank it. Text invisible to the user, like the Keywords meta tag, lost its meaning years ago because spammers abused it excessively. A few search engines still use this tag, but with very low weight. So declare your keywords in the tag once, and then forget about it.
Meanwhile, the Meta Title, which is shown directly to the user, is one of the most important elements in SEO work. It can dramatically improve a page's ranking.
In addition, you should also write a complete and accurate Meta Description tag that matches the content of the page. The Meta Description tag does not directly improve your page's rank, but Google uses it to build the snippet shown on the search results page, and Yahoo uses this description in its results in some cases. A good snippet increases your click-through rate (CTR), and indirectly the Meta Description tag thus also contributes to the quality and ranking of your website.
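For illustration, here is a minimal sketch of how these three tags might be declared in a page's head (the values are placeholders, not recommendations from this article):

```html
<head>
  <!-- Meta Title: shown to users and heavily weighted in ranking -->
  <title>Page title describing the main content</title>

  <!-- Meta Description: used by Google to build the results snippet -->
  <meta name="description" content="An accurate one or two sentence summary of this page.">

  <!-- Meta Keywords: declare once with low expectations, then forget it -->
  <meta name="keywords" content="keyword one, keyword two">
</head>
```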
Keyword stuffing in hidden text
This occupies second place because it can get your website penalized, banned, or removed from the index entirely. Inserting keywords in an extremely small font, in a font the same color as the background, positioned outside the browser window, or hidden with HTML and CSS tricks are all taboo SEO techniques. Google's algorithm has become quite good at detecting them, and punishment is all but inevitable, especially now that anti-spam has become a top concern of the major search engines (Google, Yahoo).
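For illustration only, these are the kinds of hidden-text patterns that detection algorithms look for; they are shown here as anti-patterns to avoid, not techniques to use:

```html
<!-- Anti-patterns: each hides keyword text from users while
     leaving it readable to crawlers, and all are easily detected -->

<!-- Text the same color as the background -->
<p style="color:#ffffff; background:#ffffff;">cheap flights cheap hotels</p>

<!-- Text positioned far outside the browser window -->
<div style="position:absolute; left:-9999px;">cheap flights cheap hotels</div>

<!-- Text at an unreadable size, or hidden entirely via CSS -->
<span style="font-size:1px;">cheap flights</span>
<span style="display:none;">cheap hotels</span>
```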
Buying and selling links
This is one of the most popular methods, widely applied by webmasters and SEO practitioners. It is particularly common in Vietnam, where users pay close attention to the Alexa traffic index and assume that buying, selling, or exchanging links will bring traffic to a website. Many Vietnamese webmasters take the direct traffic from exchanged links more seriously than the indirect traffic that search engine rankings bring.
The problem is that bought and exchanged links are by nature not "natural", so they make search results less accurate for the user's query (remember that a page's rank also depends on the external URLs pointing to it). Search engines, especially Google, in their effort to make results more useful to the user, treat fighting link sales as a priority. Matt Cutts, a Google engineer, has confirmed that Google's algorithms are mature at detecting bought and sold links. Google typically uses the following three methods to identify purchased links:
The algorithm searches for suspicious patterns, such as links located near words like "advertising" or "sponsored". It can also find groups of unrelated links that have nothing to do with the topic of the linking page.
Google has thousands of search-quality editors in Asia, and some of them are certainly trained to detect and flag websites that trade links.
In addition, Google provides tools that let users report purchased links. These reports are sent to the search quality team in Asia.
So what does Google do when it discovers a purchased link? The link is flagged and has no effect on the ranking of the linked page. In addition, if the sale was clearly aimed at increasing rank, Google applies sanctions, such as lowering PageRank or even banning the website outright.
So use your time and money more wisely. Rather than spending it on buying links, look for valuable links relevant to your page's topic that provide useful information to the user. Build an information-rich website or useful tools, and you will earn "natural" links. They keep existing users and bring in new traffic. That is the reliable, lasting way.
Fear of losing PageRank
This one makes the list simply because it is something many webmasters do not understand. Especially in Vietnam, website and content administrators, often caught up in a vicious circle of copyright worries rather than thinking about SEO, are "stingy" about placing links to other web pages.
The misunderstanding among SEO practitioners is that when a page links to outside sites, its PageRank will be "split" and "leak" away to those sites. But the world has changed. PageRank is just one common index used for ranking web pages.
So set up links to pages with similar content; this strengthens the reliability of the information on your own page.
Participating in link exchange systems
This is quite an old practice that no longer has any effect at all. Search engines want links to remain "natural", created out of a genuine need to cite information and tools. Link exchanges, by contrast, leave recognizable traces and are very easily detected.
Do not waste time joining link exchange systems built on this simple trick. Link building is still very important, however, when your page sits in a link graph that is useful to the user. Build links to other pages on the same topic that help your readers, and it is of course even better when pages on your subject link to your website without necessarily requiring a link back.
Duplicate content
As covered in a number of articles on duplicate content on VietSEO that you can refer to:
Duplicate content and Google's new detection method
Interview with Matt Cutts on duplicate content
Webmaster discussion following the Matt Cutts interview on duplicate content
There are two ways duplicate content is generated:
Many webmasters purposely create doorway pages: sites with similar, or even identical, content to the original page, presented in many different ways to promote the company's products or services.
Often, within the same website, the same content appears on different pages (different URLs). For example, the same blog post can be reached via the article link, the category page, the archive, the RSS feed, and the home page.
The problem with duplicate content is that Google always wants to offer a wide choice of content, so it picks only a single page out of a set of duplicates. Duplicate content therefore wastes the search engine's time and your web server's bandwidth, and sometimes the version displayed in the results is not the one you want users to reach.
What should you do to avoid duplicate content? Refer to the articles on duplicate content above and find ways to reduce it. There are also tools that help you designate the version to be indexed while eliminating the redundant versions.
Consider using the Robots Exclusion Protocol (REP) to keep unnecessary or private duplicates out of the index:
Robots.txt disallows Web Robot, User-agent
An introduction to the Robots Exclusion Protocol (REP): the robots.txt file, its syntax, correct usage, and a list of User Agent names.
Robots, HTML Meta and Google, Yahoo, Microsoft
Common REP rules from the three giants Google, Yahoo and Microsoft: robots.txt and the HTML meta tag.
Googlebot and Robots.txt: Allow, Disallow
Making use of the robots.txt file for the Google search engine; how to write a robots.txt file specifically for the GoogleBot spider.
Robots META Tag – Metadata Elements
Applying the Robots meta tag to individual pages.
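As a minimal sketch of both mechanisms (the paths are illustrative assumptions, not taken from the articles above): a robots.txt rule keeps whole sections out of the index, while the Robots meta tag controls a single page.

```
# robots.txt at the site root: keep duplicate feed and archive
# versions of articles out of the index (illustrative paths)
User-agent: *
Disallow: /feed/
Disallow: /archive/
```

```html
<!-- Robots meta tag on a single duplicate page: do not index
     this version, but still follow its links -->
<meta name="robots" content="noindex, follow">
```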
Or use redirection for folders and pages:
Permanent Link Redirection – Redirect 301
Using a 301 redirect to permanently redirect an article.
Apache server configuration with .htaccess
Using the Apache .htaccess configuration file to redirect an article.
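A minimal .htaccess sketch of a 301 permanent redirect (the file names and domain are illustrative assumptions):

```apache
# Permanently redirect a single moved article to its new URL
Redirect 301 /old-article.html http://www.example.com/new-article.html

# Or, with mod_rewrite enabled, redirect an entire folder
RewriteEngine On
RewriteRule ^old-folder/(.*)$ http://www.example.com/new-folder/$1 [R=301,L]
```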
Session IDs in URLs
Before going into details, if you have not yet mastered the basic elements of a URL, please refer to the articles on the basic components of a URL and on static websites.
Google indexes websites continuously. How often Googlebot visits depends on the page's ranking and how frequently it is updated; reaching a high rank requires sustained persistence. In addition, Google and the other search engines have always favored static web pages. The parameters appearing at the end of a URL are treated as part of the URL.
If your URLs contain a Session ID parameter, the crawler is likely to fall into an endless loop when indexing your site: on each visit it is assigned a new Session ID and GoogleBot treats the page as new content. Session IDs thus create the duplicate content problem mentioned above. Google wastes time indexing to no avail while consuming more of your bandwidth, and Session IDs will lower your page's rankings.
Although Google's algorithm has improved significantly in handling session IDs, you should use cookies instead of parameters in the URL. Remember that only about 2% of users have cookies disabled.
You can also try to create friendly URLs using mod_rewrite in .htaccess, or by configuring Permalinks in WordPress.
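A minimal mod_rewrite sketch in .htaccess (the script name and URL pattern are illustrative assumptions): the visitor and the crawler both see a static-looking URL with no session parameter, while session state lives in a cookie on the server side.

```apache
RewriteEngine On
# /article/123 is served internally by the dynamic script
# article.php?id=123; no Session ID ever appears in the URL
RewriteRule ^article/([0-9]+)$ article.php?id=$1 [L,QSA]
```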
Website with Flash
In artistic terms, a web page presented entirely in Flash can be very eye-catching, but it will definitely struggle to rank high in search engines. As discussed in the article on SEO for Flash websites on Google, even though search engines can read and index Flash, it is hard to find a Flash website ranking high for hot, highly competitive keywords. One simple reason is that Google likes text. If your page layout is built on text, Flash can be limited to providing visual effects.
Excessive use of JavaScript
JavaScript can be very effective in website design, but the problem is that Google has difficulty understanding JavaScript source code. Although Google has put in, and will keep putting in, more effort, heavy use of JavaScript remains inefficient where search engines are concerned.
To optimize, SEO practitioners separate JavaScript into its own files and, where scripts are needed, include those files externally (or use CSS instead) rather than embedding code in the header or body of the page. Help the machines understand the main content of the page and index it easily, and everyone benefits.
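A minimal sketch of this separation (the file names are illustrative assumptions): scripts and styles live in external files, so the page body is mostly clean, indexable text.

```html
<head>
  <!-- Presentation and behavior kept out of the markup -->
  <link rel="stylesheet" href="/css/style.css">
  <script src="/js/effects.js"></script>
</head>
<body>
  <!-- The body now contains mostly text that both users
       and crawlers can read -->
  <h1>Article title</h1>
  <p>Main content visible to users and search engines alike.</p>
</body>
```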
Cloaking techniques
This is a "black hat" SEO technique that displays different content to search engines than to regular users. It was widely used by spammers in years past.
Today's search engines find this scam easy to detect, for example by sending crawlers that do not identify themselves as search bots in order to catch cloaking in the act. There are many cloaking techniques for tricking spiders, too many to list within the limits of this article, but they are all discovered sooner rather than later. This is a "black hat" SEO trick that should be avoided.
If cloaking is detected, the offending web pages will be banned. So do not use this technique; solve your problems with other methods instead.
Conclusion: SEO tips
Through the analysis above, VietSEO distills two main points that webmasters and SEO practitioners should keep in mind when applying SEO tips:
Learn how search engines operate so you can help them understand the content of your website. The issues above all have one thing in common: they make it difficult for search engines to index pages and determine their content. So build websites that interact well with search engines and provide them with unique content. Do not spend valuable time trying to fool search engines. Their algorithms are more than smart enough to detect tricks, not to mention the human anti-spam teams behind them. Even if you slip past the search engines' eyes, it will only be temporary, and the cost of being exposed is far higher. Fooling search engines is not a long-term strategy.
Invest your time, energy and money in content, useful tools, and the other promotional activities you would pursue even if search engines did not exist.