Part of achieving top search engine positions is earning links from other Web pages. These links can come from people who like your site (natural links), reciprocal linking, directory submissions and a few other sources.
The goal of trading links is to exchange quality links for quality links. True quality links carry benefits far beyond a coveted position in the search engine results: they bring traffic from the Web page linking to yours. Therefore, you want to ensure you trade or barter links with quality partners.
Sometimes it's hard to determine who is a quality linking partner, even for an expert. So, how can you tell if your link sits on a Web page where it will carry little or no value?
The short list below highlights ways of diminishing or nullifying the value of a link to your site from another Web page.
Meta Tag Masking - this old trick uses CGI scripts to hide the Meta tags from browsers while still allowing the search engines to see them.
Robots Meta Instructions - using the noindex and nofollow values lets the novice link partner see the visible page with their link while telling the search engines to ignore the page and the links found on it. Nofollow can also be used while allowing the page to be indexed, which gives the impression that the search engines will eventually count the link.
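For illustration (a generic snippet, not taken from any particular partner page), a robots Meta tag like the first line below tells the search engines to skip the page and ignore every link on it, while the second, sneakier variant lets the page be indexed but still withholds credit for the links:

    <meta name="robots" content="noindex,nofollow">
    <meta name="robots" content="index,nofollow">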
Rel=nofollow Attributes - nofollow is not a link relationship defined by the original HTML standards, but rather a value the major search engines agreed to honor to identify links that should not be followed. This attribute is often used with blogs to prevent comment and link spam. The link will appear on the Web page and in the search engine's cache, but it will never be counted.
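To a visitor a nofollowed link looks like any other; the giveaway is only in the source. For example, with example.com standing in for your URL:

    <a href="http://www.example.com/" rel="nofollow">Your Site</a>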
Dynamic Listing - dynamic listing is the result of having links appear randomly across a series of pages. Each time the link is found on a new page, the search engines consider the freshness of the link. It is entirely possible that the link won't be on the same page upon the next search engine visit. So, a link from a partner displaying rotating, dynamic link listings rarely helps.
Floating List - this can be easily missed when checking link partners. Essentially, your link could be number one today, but as new link partners are added your link is moved down the list. This is harmful because links near the bottom of the list are considered to carry less value than links at the top. With a floating list, it is also possible for your link to be moved to a new page whose PR value is significantly lower or nonexistent, and the new page may not be visited and indexed for months.
Old Cache - the caching date provided by Google indicates the last time the page was cached. Pages with lower PR values tend to be visited and cached less often than pages that have medium to high PR values. If the cache is more than six months old, it can be surmised that Google has little or no desire to revisit the page.
Denver Pages - while Denver, CO is a nice place to visit, Denver Pages are not a place you want to find your link in a trade. Denver Pages typically have a large amount of links grouped into categories on the same page. Some people call this the mile high list. These types of pages do not have any true value in the search engines and are not topically matched to your site.
Muddy Water Pages - these are dangerous but easy to spot. Your link will be piled in with non-topically matched links with no sense of order. It's as if someone took all the links and threw them in the air to see where they landed. These are worse than Denver Pages.
Cloaking - cloaking is the process of serving one page to people while serving a different page to the search engines. You may see your link on the Web page, but the search engines may never see it because they are given a different copy. Checking Google's cache is the only way to catch this ploy.
Dancing Robots - this can be easily performed with server-side scripting like PHP and is rarely easy to catch. In this situation, people who attempt to view the robots.txt file receive a copy that does not include exclusion instructions for the search engines. However, when the search engines request the robots.txt file, they receive the exclusion instructions. As a result, the links page will never be indexed and you'll never know why without expert assistance.
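As a rough sketch of how this could work (the file names and bot check below are hypothetical, not taken from any real site), a PHP script answering requests for robots.txt might look something like this:

    <?php
    // Hypothetical sketch: serve a different robots.txt to search engine crawlers.
    $agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    header('Content-Type: text/plain');
    if (preg_match('/Googlebot|Slurp|msnbot/i', $agent)) {
        // Crawlers get the version that blocks the links directory.
        readfile('robots-for-bots.txt');
    } else {
        // Human visitors checking robots.txt see a harmless version.
        readfile('robots-for-people.txt');
    }
    ?>

Because the search engines are the only ones told to stay away, they simply never crawl the links page, and your link never enters the index.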
Meta Tags and Robots.txt Confusion - which instructions carry the most weight? Don't know the answer? Shame. Search engines do. If the two conflict, the Meta tags on the page are typically treated as the rule to follow.
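For example (generic snippets, not from any particular site), the robots.txt might block a links directory while a page inside it carries a Meta tag inviting the engines in:

    User-agent: *
    Disallow: /links/

    <meta name="robots" content="index,follow">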
Links in the Head - while these links do not count in the search engines and do not show up on the Web page, they do get counted by scripts or programs designed to verify that the links exist. These programs only look for the URL within the source code of the Web page.
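One simplified, hypothetical way this can look in the page source (example.com standing in for your URL) is a link element tucked inside the head section, where no visitor ever sees it and no clickable hyperlink exists:

    <head>
      <link href="http://www.example.com/">
    </head>

A link-verification script that just searches the source for your URL will happily report the link as present.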
Empty Anchors - this is a nasty trick, but it can be an honest mistake. The links exist and are counted by the search engines, but unfortunately they are neither visible nor clickable on the Web page. So, the link carries no traffic value.
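In the source an empty anchor looks like this (again with example.com as a stand-in): the href is there, but with no anchor text between the tags there is nothing for a visitor to see or click.

    <a href="http://www.example.com/"></a>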
The goal of trading links is to trade them for equal value. Understanding the ways people will attempt to avoid passing quality value from their Web page to yours can help you avoid these useless links. If your link partner pulls underhanded tricks, the links they trade you are useless.
While you may never be an expert in knowing all the latest tricks, traps and tests, you can now become an expert in the thirteen mentioned above. Ensuring your link partners are not using these tactics can help improve the quality of the links you gain from other Web pages. With quality links pointing to your Web page, you will gain additional traffic through organic search engine results and from visitors sent directly by your linking partners.