PageRank for Websites: Is There More to the Web?

 



Google’s PageRank has been around for years, and in the opinion of many e-business owners, it can make or break a site. Lately, with Google’s fingers in every pie, it seems important to remind everyone that there is more to a website than just PageRank. PageRank refers to the algorithm Google uses to rank a website in its search engine. Coined by Larry Page, one of Google’s founders, the term has come to mean so much to webmasters and SEOs that it dictates how we market a website. But let me coin a few terms of my own. (Or borrow them from others, perhaps…) And while some of these concepts are included in the PageRank algorithm itself, it’s often helpful to be reminded that there are many factors a webmaster should concentrate on, not just one overwhelming aspect. It bothers me that the PageRank indicator on Google’s toolbar “measures the IMPORTANCE” of a page; important to them, perhaps, especially in light of the release of the Google Toolbar for Firefox on July 7th. But a lack of PageRank doesn’t mean that your site isn’t important. So how do you let the search engines, Google in particular, know that? This article is a collection of the phrases that describe the behavior of search engines today.
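Since the whole article hinges on it, it may help to see the core idea behind PageRank in miniature: a page’s score is fed by the scores of the pages linking to it, as though a random surfer were following links. What follows is a deliberately simplified sketch of the published idea, written by me for illustration; it is not Google’s production system, which weighs far more factors.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links maps each page to the list of pages it links to.

    Every page in the example graph has at least one outgoing link;
    dangling pages would simply leak rank in this simplified version.
    """
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page keeps a small baseline (the "random jump")...
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        # ...and passes the rest of its rank along its outgoing links.
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Three pages: A and C both link to B, and B links back to A.
ranks = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
# B collects "votes" from two pages, so it ends up ranked highest.
```

The point of the toy version is simply that links act as votes, which is exactly the property the rest of this article pushes back on.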

FreshRank

Currently there is a patent application at the US Patent and Trademark Office from Monika Henzinger, published on June 30, 2005, describing a way of determining a document’s “freshness,” which we will call FreshRank, a term coined by Michael Martinez on Cre8asite Forums. The abstract notes that one of the problems is that the “last modified since” attribute isn’t always correct, and even if a webmaster has figured out how to change the date, it still doesn’t fool Google. What Google looks for is actual modified content. How Google determines how old a document may be is still somewhat of a secret. Monika is attempting to patent a more explicit form of freshness, since not all search engines use the “last modified since” attribute anyway, stating that search engines need a more reliable way of determining overall updated content. Well, I for one happen to agree.
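The point, as I read it, is to trust the content itself rather than the “last modified since” attribute. One simple way to detect genuinely modified content is to fingerprint the page body and compare fingerprints across crawls. This sketch is my own illustration of that idea, with names I made up; it is not code or wording from the patent application.

```python
import hashlib

def content_fingerprint(html: str) -> str:
    """Hash the page body so a tweaked date header alone changes nothing."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

class FreshnessTracker:
    """Remembers each URL's last fingerprint between crawls."""

    def __init__(self):
        self.seen = {}  # url -> fingerprint from the previous crawl

    def is_genuinely_updated(self, url: str, html: str) -> bool:
        fp = content_fingerprint(html)
        # Only a changed body counts; a first crawl is not an "update".
        changed = url in self.seen and self.seen[url] != fp
        self.seen[url] = fp
        return changed

tracker = FreshnessTracker()
tracker.is_genuinely_updated("example.com/page", "<p>hello</p>")  # first crawl
# Re-serving the same bytes with a new "last modified" date fools nobody;
# only actually changing the content registers as fresh.
```

A webmaster who bumps the date but never touches the page would come up empty under a scheme like this, which is exactly the behavior the application is after.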

But consider this: some websites don’t need to be updated. Does this mean they aren’t fresh? Not likely. Think for a moment about that government bill that was put into action 10 years ago. No new fresh content there. What about the scientific formula for penicillin? I don’t think that changes much. And the historical account of the Trojan War? Pretty much stays the same. Most businesses adopt terms and conditions or a privacy policy designed to stay the same. So it will be interesting to see how the patent, if approved, pans out with regard to this type of information. There is no mention in the patent application of “gauging whether the page continues to be a currently relevant citation.” Even Google mentions the freshness of a website’s content as a problem in a recent patent application. So we’ll see whether the new FreshRank truly helps. As my grandfather used to say, “Well, we will just see what we’ll see.” Indeed.

TrustRank

This is a term trademarked by Google. Google has this to say about TrustRank: “Web spam pages use various techniques to achieve higher-than-deserved rankings in a search engine's results. While human experts can identify spam, it is too expensive to manually evaluate a large number of pages. Instead, we propose techniques to semi-automatically separate reputable, good pages from spam. We first select a small set of seed pages to be evaluated by an expert. Once we manually identify the reputable seed pages, we use the link structure of the web to discover other pages that are likely to be good. In this paper we discuss possible ways to implement the seed selection and the discovery of good pages. We present results of experiments run on the World Wide Web indexed by AltaVista and evaluate the performance of our techniques. Our results show that we can effectively filter out spam from a significant fraction of the web, based on a good seed set of less than 200 sites.”

The paper they are referring to is a 12-page paper on TrustRank, found at Stanford University’s site. In essence, TrustRank is a way to cut down on spam and filter out content that is not relevant to the searcher, in order to bring them the results they really want. While there is currently no way to implement techniques that take advantage of Google’s TrustRank, what this means to you as a website owner is simple: don’t spam the search engines! Webster’s defines spam in terms of email, but the term has come to mean any unwanted information or propaganda received through deceptive measures on the part of the sender. To a search engine, spam is hyperlinked pages that are intent on misleading the search engine. So as long as you are genuinely trying to keep your nose clean, you should be okay, right? Generally this is true, but you should still be careful. The best policy is to develop your website for your visitors, not for the search engines; do that and you’ll do well every time.
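The abstract quoted above is the whole recipe in miniature: vet a small seed set by hand, then let trust flow outward along links, much like PageRank with its “random jump” restricted to the trusted seeds. Here is a toy sketch of that idea; the function and the tiny link graph are my own illustration, not code from Google or the Stanford paper.

```python
def trustrank(links, seeds, damping=0.85, iterations=50):
    """Toy TrustRank: trust originates only at hand-vetted seed pages
    and propagates along outgoing links. Pages unreachable from any
    seed end up with zero trust."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    seed_mass = 1.0 / len(seeds)
    trust = {p: (seed_mass if p in seeds else 0.0) for p in pages}
    for _ in range(iterations):
        # Unlike PageRank, the baseline mass goes ONLY to the seeds.
        new = {p: ((1 - damping) * seed_mass if p in seeds else 0.0)
               for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * trust[page] / len(targets)
                for target in targets:
                    new[target] += share
        trust = new
    return trust

# "good" is vetted by hand; the page it links to inherits some trust,
# while the self-contained spam cluster never receives any.
scores = trustrank(
    {"good": ["linked"], "linked": [],
     "spam1": ["spam2"], "spam2": ["spam1"]},
    seeds={"good"},
)
```

Notice that the spam pages can trade as many links between themselves as they like: with no path from a trusted seed, their score stays at zero, which is the entire appeal of the scheme.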

NicheRank

This is a term that I’m simply making up. If you contact the Small Business Administration for a packet on starting a business, you get a 26-page questionnaire about your niche product. There are many aspects to NicheRank. Ideally, offering a product that no one has ever thought of before is the best niche to be in. Not everyone can have that brand-new invention everyone just has to have, however. Offering a product in a saturated market is still possible, but you have to find your niche market. What is it about your product that is different from all the rest? Why should someone buy from you instead of from one of your competitors? Whether it’s price, freebies with purchase, unbeatable customer service, or selection, you are not just selling a product; you are selling your company. And this is what makes up NicheRank. How effective will you be at penetrating a field and then stealing some of that market share? I can’t remember what movie it was in, but the saying “You gotta have a gimmick” is still tried and true. So what’s your gimmick? How does it apply to search engines?

In an article, Scottie Claiborne talks about the infomercials you see on TV. One in particular, the “Chocolate Dream”, was nothing more than an age-old concept: the double boiler. But this established product was given a new spin by its makers, which made it seem like something you just couldn’t live without. I love watching those shows on HGTV that turn old things into new things. They are still the same basic widgets, but they get a makeover to turn them into something different. I’m not a garage saler by nature, as I just don’t have the time, but I like seeing what possibilities those old items have for becoming new ones. What’s your “Chocolate Dream”?

Search engines get excited over new things, even if you have an “old thing” that seems new. In fact, search engines penalize duplicate content, and may not rank those sites at all. So you have to make your product different from the rest of them out there, even if it IS the same thing. And that’s NicheRank.

ContentRank

You’ve heard the old saying: Content is King. Too many times someone develops a website and concentrates on how pretty or flashy it has to be. As an SEO, I have always made updating content to keep a website fresh a priority. Now more than ever, as search engine algorithms change (almost daily, it seems) and the jump-on-the-bandwagon, follow-Google’s-lead patenting of every idea that pertains to search engines continues (this month there are eight that I could count), webmasters have to concentrate on what their website has to say, and not just a bunch of pretty pictures, JavaScript, and Flash intros. But then again, I knew that already. In a way, the search engine patent craze is forcing webmasters to do what they should have been doing all along, getting them back to basics: words.

When visitors use a search engine, they use words to find what they are looking for. So in turn, a website uses words to connect with those visitors. How does the search engine know which words are relevant and which are not? By comparing them to all the other words on the site. Keyword meta tags are still used to tell search engines at a glance what a site is about, but thanks to a bunch of unscrupulous web designers and SEOs, meta tags are not as important as they used to be, because it is too easy to manipulate search results using keywords alone. What matters is how the keywords in your meta tags relate to the actual words on a page, or “keyword density”. Simply put, keyword density is how often those keywords appear in the actual content of a webpage. And it’s not just about individual pages; it’s about keyword density across an entire site. If a page has nothing to do with what the website pertains to, then most likely it won’t get listed.
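Keyword density, as used here, is just a ratio: occurrences of a keyword divided by the total words on the page. A quick sketch of that calculation follows; this is my own illustration, and real engines weigh placement, stemming, and much more than a raw count.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the page's words that match the keyword exactly."""
    # Lowercase everything and pull out word-like tokens.
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

page = ("Fresh organic coffee. Our coffee is roasted daily, "
        "and coffee lovers agree.")
density = keyword_density(page, "coffee")  # 3 of 12 words, i.e. 0.25
```

A page that scores near zero for the keywords its meta tags claim is exactly the mismatch described above, and stuffing the page until the ratio balloons is the manipulation that devalued meta tags in the first place.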

LinkRank

If you’ve done any reading at all on search engine optimization, then chances are you’ve come across link popularity and its importance in optimization. PageRank initially measured how important a website was by its back links, which a search engine perceives as votes for a site. But you can’t do any kind of search on link popularity without coming across a dozen or so ads telling you that you can simply buy your back links. It seems to be pretty popular these days to purchase them. This is not to be confused with hiring an SEO. Using an SEO consultant to market all aspects of a website is probably a far better long-term investment than buying a bunch of links. We live in a world where we’ve gotten used to getting what we want RIGHT NOW. Whatever happened to patience being a virtue? These days, patience is a nostalgic concept, almost as antiquated as dime-store sodas.

There are changes in the works to help curb the back-link controversy, including seven new search exclusions that place a hold delay on newer sites launching with seemingly huge numbers of back links before the site has even had a chance to take off. John Scott of Cre8asite Forums offered an opinion based on a source who used to work with a current Google employee:

"The probation does not apply to new sites. It applies to links. When the algorithm was deployed, certain older links were grandfathered in. After that, links will be (are being) given partial credit, and be essentially on 'probation.'"

"It applies to links, not sites. And the age of the link is not the only factor. The IP range of the links and other considerations are made, and the person who I discussed this with said that Krishna Bharat is at Google primarily to develop and implement this new algorithm. It is supposed to radically change the way links are evaluated."

Spammers have traditionally harvested links the way my neighbor harvests soybeans: more is more. But it’s the relevant links that achieve well-placed rankings. Spammers artificially inflate search engine rankings through many links, most of which don’t relate to the site’s content whatsoever. Gone are the days of link farms; I tried to search for one the other day and couldn’t find a single one. Now it is the day of the “directories”. When compiling links for a client, we get a lot of requests for link exchanges. A site that relates to search engine optimization, for example, should stick to links to other sites that have to do with the Internet and search engines. So why on earth would it want to exchange links with Bahamas Vacation Packages? It seems that the way to get around non-relevant links these days is to create a directory with many different categories relating to just about anything. I can’t stress enough that these things are going to be examined by the search engines before too long, and it is best simply not to partake. It is very difficult for a new webmaster to pass up a link exchange, especially when they are striving so hard to get those inbound links. But it just isn’t worth it.

AgeRank

I’ve known several companies that have rushed out to register a domain name just in case they need it in five or so years. This might be a good idea for now, but in five years, will it matter? No one knows. AgeRank is a fairly new idea that has occurred to webmasters in terms of its importance to search engines. There are a lot of sites out there that exist for the sole purpose of spamming. And with Google’s recent stance on spamming (just look at the infamous Traffic Power fiasco), search engines are taking the longevity of a domain into account along with all the other factors. However, all the webmasters out there with brand-new sites can take heart: it is theorized that AgeRank makes up only about 1% of the PageRank algorithm. After all, a website that has been around for years shouldn’t automatically carry more importance than a new site, especially if the new site has far more relevance to what the searcher is looking for. Google’s number one goal is to bring the visitor the most relevant search results.

There is, however, a patent application in the works that Barry Schwarz of the SEO Round Table calls the “sandbox effect,” because it’s a place where new sites can all play nicely, away from the real sites, until they’ve had a chance to prove themselves. In an explanation of why new sites rank well at first and then drop into obscurity, Schwarz said, “the only pattern I see from the threads is that these are new sites. I see a wide range of back links reported, a wide range of styles of on-page optimization. Only pattern is the site was launched after December." The patent will be just another weapon in the arsenal against spammers. Those spammers ruin it for the rest of us, don’t they? But it seems that it could work for new sites as well, contradicting the idea of a damaging sandbox effect. Google cites two current problems with the way search engines work: FreshRank and LinkRank. The sandbox effect will simply delay PageRank for a site, so as to give PageRank validity again, and not be just something any old Joe with enough cash can purchase. For the small business owner, this can mean something important. For the spammer, however, it could be devastating.

Conclusion

Google’s PageRank isn’t going anywhere, at least not yet. The PageRank blackout in late May 2005 may have had some webmasters celebrating that PageRank had finally bitten the dust, though most of them worried it wasn’t coming back, especially those who purchased theirs. But it did, of course. So while PageRank, in my opinion, wreaks far too much havoc upon the mental stresses of any webmaster or SEO, and holds far too much importance in the minds of some, it’s probably around for good. But bringing back the original purpose of PageRank can only mean good things for those who are truly deserving.

There is a rumor that there are going to be some major implementations of these changes sometime this summer. Even though this is a rumor, it is one that makes me cringe a bit. Humans don’t like change; we are creatures of habit—me especially. But change I will, because that is what the industry is all about. I believe some of the change will have to do with those concepts outlined in the many patent applications, and link changes will be the first to start. I’d bet my firstborn on it.

My goal in writing this article today is to remind you of the importance of doing what you should do best: concentrating on your website, and what it has to say, and what it means to your visitors, instead of what it may mean to search engines. This isn’t a game in which the competitor with the most money wins. When you have that down, everything else will fall into place.

Jennifer E. Sullivan is an Internet Business Consultant who specializes in search engine optimization and web marketing. She holds a Bachelor of Science in Business Administration and Marketing from Kansas University. She has written several web marketing articles, including “Hiring An SEO Consultant: 10 Reasons Why You Should”, “Let's Not Forget About the Little Guy", and “Success for the Early Entrepreneur". You can find more information on her services at http://www.firstclass-seo.com. First Class SEO is an Internet business consulting firm with an extensive 10+ year marketing and sales resume. First Class SEO specializes in search engine optimization and marketing, with a primary focus on cultivating professional relationships, especially with small to medium businesses. First Class SEO's number one goal is to help businesses achieve quality rankings in search engines, and high conversion ratios from traffic to sales, in the most natural, long-lasting way possible by maximizing the benefits of search engine optimization.


