My Suggestion: Top 10 Worst SEO Tactics

Status
Not open for further replies.

ovi

These days I have been doing some research on SEO techniques, and I have put together a list of the worst SEO tactics. Maybe this list will help someone. I'm still waiting for you all to add more items to the list.

1. Doorway Pages
Doorway pages are web pages created to rank high in search engines for particular phrases, with the sole purpose of luring or tricking you into viewing another page. They are also known as bridge pages, portal pages, zebra pages, jump pages, gateway pages, entry pages, and by other names.
If you click through to a typical doorway page from the search engine result pages, in most cases you will be redirected by a fast meta refresh command to another page. Doorway pages are easy to identify in that they have been designed primarily for search engines, not for human beings. Sometimes a doorway page is copied from another high-ranking page, but this is likely to cause the search engine to detect the page as a duplicate and exclude it from its listings.
Because many search engines penalize the meta refresh command, some doorway pages simply trick you into clicking a link for "more information", or they use "code swapping", sometimes called "bait and switch": you submit an optimized page, wait for it to get listed, and afterwards swap in the "real" page on the server. The downside is that a search engine may revisit at any time, and if it indexes the "real" page, the ranking may drop.
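As a sketch of what the meta refresh version looks like (the keywords and destination URL here are invented for illustration):

```html
<!-- Doorway page: built to rank for a phrase, then instantly bounce the visitor -->
<html>
<head>
  <title>cheap widgets cheap widgets buy cheap widgets</title>
  <!-- A delay of 0 seconds redirects before the visitor sees anything -->
  <meta http-equiv="refresh" content="0; url=http://example.com/real-page.html">
</head>
<body>Keyword-stuffed text that no human is ever meant to read...</body>
</html>
```

The `content="0; ..."` part is what makes it a doorway rather than a legitimate redirect: the visitor never sees the page the engine indexed.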

2. Invisible Text
Definition: Invisible text is implemented in a variety of ways in an effort to increase the frequency of keywords in the body text of a web page. Some of the implementation methods are: making text the same color as the background of the web page, hiding text behind layers, placing text at the very bottom of over-sized pages, etc.
This tactic is hopelessly old and obvious to search engine spiders. It constantly amazes me when a web site uses these methods for placement; invariably, placement is the last thing a webmaster will get with this tactic. Invisible text had its heyday from 1995 to 1999. This is not to say that invisible text didn't work after 1999, but by then the majority of web sites had stopped using it, as the search engines began implementing automated methods of detection and penalization.
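The implementation methods listed above can be sketched like this (the keywords are made up; the markup styles are era-typical examples, not a complete catalogue):

```html
<!-- Three classic invisible-text tricks, all easily detected today -->
<body bgcolor="#ffffff">
  <!-- 1. Text the same color as the page background -->
  <font color="#ffffff">widgets widgets cheap widgets</font>

  <!-- 2. Text hidden behind a layer / pushed off-screen with CSS -->
  <div style="position:absolute; left:-9999px;">widgets widgets</div>

  <!-- 3. Tiny text dumped at the very bottom of an over-sized page -->
  <p style="font-size:1px;">widgets widgets widgets</p>
</body>
```

Note how trivial each of these is for an automated checker to flag: comparing text color to background color, or spotting off-screen coordinates, requires no intelligence at all.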

3. Content Misrepresentation
Definition: Misleading search engines into believing your webpage is about topic 'A' when it is in fact about 'B'. This tactic was used primarily for the promotion of adult, gambling, and other extremely competitive search markets.
Unfortunately this tactic is still in use; you will likely find an example or two every time you search! The fact is that this tactic is the simplest for a search engine to identify, and the result will be swift and complete: banishment from the search engine index, indefinitely. The worst offense in the realm of the search engines is trying to fool them.

4. Redirects
Definition: Redirects have some innocent uses (practical, legal, etc.) but they are also used nefariously to mislead search engines by making them believe that the page they have indexed is about 'A'. When a surfer visits the page, however, they are redirected to an entirely different site about 'B'.
In most cases search engines have advanced enough to see this technique a mile away. In fact, they usually ignore any page with a redirect (assuming, correctly, that its content is useless) while spidering the redirect destination. Redirects, unless they are blatantly Spam-related, do not directly result in ranking penalties; however, they have no positive effect either.
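A sketch of the sneaky variant (the URLs are invented). Early spiders did not execute JavaScript, which is exactly what this trick relied on:

```html
<!-- The indexed page claims to be about topic 'A'... -->
<html>
<head>
  <title>Topic A - an informative article</title>
  <!-- ...but a script sends every human visitor to an unrelated site 'B'.
       The spider, which ignores the script, only ever sees topic A. -->
  <script type="text/javascript">
    window.location = "http://example.com/topic-b/";
  </script>
</head>
<body>Plausible text about topic A that only the spider reads.</body>
</html>
```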

5. Heading Tag Duplication
Definition: Heading Tags were created to highlight page headings in order of importance; hence the heading tags available: H1, H2, H3, etc. This duplication technique involves placing more than one H1 tag on a webpage in order to boost a particular keyword or phrase.
This tactic is still very prevalent and likely still works on some search engines; however, none of the major search engines will respond well to this technique as it has been identified as a common manipulation.
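Side by side, the intended use versus the manipulation (keywords invented for illustration):

```html
<!-- Proper use: one H1, then H2/H3 in descending order of importance -->
<h1>Blue Widgets</h1>
<h2>Choosing a Blue Widget</h2>
<h3>Sizing Guide</h3>

<!-- The manipulation: keyword variants crammed into repeated H1 tags -->
<h1>blue widgets</h1>
<h1>cheap blue widgets</h1>
<h1>discount blue widgets</h1>
```

A page with three "most important" headings is a contradiction in terms, which is precisely what makes the pattern easy to flag.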

6. Alt Tag Stuffing
Definition: Alt Tag stuffing is the act of adding unnecessary or repetitive keywords to the Alt Tag (or alternative tag – the words that appear when you hover over an image with your mouse pointer).
The Alt Tag is meant to be a textual description of the image it is attached to. There is nothing wrong with tailoring the Alt tag to meet your keyword goals IF the tag is still understandable and if the change still appropriately describes the image. The offense occurs when an Alt tag has obvious keyword repetition/filler which a search engine can key in on as Spam.
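The distinction drawn above, sketched with an invented product image:

```html
<!-- Acceptable: the alt text is tailored but still describes the image -->
<img src="blue-widget.jpg" alt="Photo of our handmade blue widget">

<!-- Spam: obvious keyword repetition that no longer describes anything -->
<img src="blue-widget.jpg"
     alt="widgets blue widgets cheap widgets buy widgets widgets widgets">
```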

7. Comment Tag Stuffing
Definition: Comment Tags are used to include useful design comments in the background source code (HTML) of a webpage. They are supposed to be used only for technical instructions or reminders; however, in times past these tags were used to artificially inflate the keyword count for targeted phrases.
At one time there was some argument that this technique worked, but it has always been a "Black Hat" SEO technique which even then could result in placement penalties. Nowadays this technique will not help an SEO campaign; if anything, it will be ignored or produce a negative result.
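For clarity, the legitimate use next to the abuse (the keywords are invented):

```html
<!-- Legitimate use: a technical note for whoever edits this file next -->
<!-- TODO: replace this table layout with CSS before the redesign -->

<!-- Comment Tag stuffing: keywords hidden where only the source shows them -->
<!-- blue widgets cheap widgets discount widgets buy blue widgets online -->
```

Since comments are invisible to visitors by design, a long run of keywords inside one has no honest explanation, which is why engines simply ignore them.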

8. Over Reliance on Meta Tags
Definition: Meta Tags are descriptive tags that appear in the <Head></Head> section of most webpages and are used to give search engines an idea of the page topic. The most common are the description and keyword tags.
At one time, extinct search engines such as Infoseek relied a great deal on Meta Tags, and many took advantage of this to manipulate rankings with relative ease. In today's far more advanced climate the search engines place only cautious weight on Meta Tags, and when considering rankings Metas play only a fractional role. Some webmasters still consider Meta Tags the 'be-all and end-all' of ranking and forget to optimize the rest of their webpage for the search engines. With this line of thinking they miss that the search engines place far more importance on the body text (the visible text) of the webpage. This is a critical error that will ultimately lead to low or insignificant rankings.
Note: An extremely common example of Meta Tag over-reliance is a web site designed entirely with graphics and devoid (or nearly so) of HTML text that a search engine can read. Such a webpage has no body text to index and provides only a small amount of relevance, which ultimately leads to poor rankings.
Over-reliance on Meta Tags does not produce intentional search engine penalties; however, simply ignoring the other ranking principles often means a lower ranking.
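A sketch of exactly this mistake (site name and filenames invented): carefully written Metas sitting on top of a page with nothing for the engine to index.

```html
<head>
  <title>Blue Widgets - Widget Co.</title>
  <!-- The two most common Meta Tags: description and keywords -->
  <meta name="description" content="Widget Co. sells handmade blue widgets.">
  <meta name="keywords" content="blue widgets, handmade widgets">
</head>
<!-- The over-reliance mistake: a body with no indexable HTML text at all -->
<body>
  <img src="entire-site-as-one-graphic.jpg" alt="">
</body>
```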

9. Duplicate Page Creation
Definition: This tactic is blatant Spam, and it is very common today. Essentially the webmaster creates a web site and then creates duplicates of each page, optimizing each differently in order to obtain multiple placements. Doing this saturates the search engine databases with content that simply eats valuable bandwidth and hard drive space.
Duplicate content is a dangerous game, often played by full-time marketers accustomed to chasing placements in aggressive markets. Avoid this tactic like the plague unless you are willing to sustain serious ranking damage if you get caught – which you likely will.

10. Automatic Submission
Definition: Using automated software tools to submit a web site (and sometimes every individual page within it) to the search engines.
Search engines make the majority of their profit from surfers like you viewing their advertising. Do you think that allowing automated submission tools to submit a web site (which bypasses those advertisements) is in the search engines' best interest? No; in fact the submission companies have had to upgrade their software repeatedly to try to subvert the search engines' latest efforts to stop their programs. There are also bandwidth concerns, because automated tools have been known to repeatedly submit sites, and sometimes each individual page within a site.
All in all, this leaves the submitter in an unstable position where their submission may or may not be ignored. That is without even considering the fact that automated tools claim to submit a website once a day, week, or month! The cardinal rule of search engines: submit ONCE. It may take a while, but the site will get spidered at some point (up to 2 or 3 months later, max). If the site is not listed within a few months, then resubmit. If a search engine is submitted to too often, that is Spam, and frankly the website being submitted will not fare well. As for the major engines like Google: be patient, and definitely don't submit more than once if you can help it.

This article was compiled from several sources: Wikipedia, stepforth.com.
 

matthewmurcia

It's funny how some people latch straight onto this type of behaviour to promote their websites. I am working with a client at the minute (a web designer) who had a total of 624 keywords in his keywords meta tag (2 x 312, which is twice as bad) and we've actually been having some quite good discussions about how to improve. He's very enthusiastic.

But I went back to see him this week and his brilliant new idea to make SEO easier was to put some text on his Flash page (all his design work is done in Flash, which is bad news to start with) and make the text the same colour as the background, so that his visitors see the Flash and the SE sees the text.

I tried not to visibly sigh as I explained the problem :eek:

(He also showed me his exciting software tool that submits his new sites to 3000 directories automatically, but then admitted that he doesn't understand half of the replies because they're not in Spanish and never sees any kind of results)
 

prateeksha

Hi... How about a list of things we should DO to get good traffic...
 