We were cruising down a local highway at about 80 mph in our supercharged $250,000 sports car, with the top down.
Our white scarves were flapping in the breeze. The highway was four lanes on each side and clear of all traffic. It was a newly constructed super highway. I had to do it. I put the "pedal to the metal" and pegged the speedometer at 150 mph. What a rush that was. As we approached the top of a steep incline, the wheels on my sports car lifted off and settled smoothly on the downside.
It felt as though we were flying for a moment, until a large Steven Spielberg look-alike monster, lumbering down the highway, loomed in front of us. I had to slow down very quickly to 75 mph.
It was a strange-looking monster. Its head slowly swayed back and forth, and its long mechanical legs strutted like a chicken's, sometimes limping first on the left and then on the right.
Its head swayed left to right, and I could see a broad mechanical smile occasionally. The arms swayed as they would if it were on a power walk. We noticed a license plate hanging on its waddling tail. The plate read MONSTERBOT.
We were furious. This dumb-looking monster was in our way. We wanted to pass it. We wanted to be first on the highway.
I stood on the gas pedal. The surge was very powerful as the speedometer was again pegged just past 150 mph and the tachometer pointed like a magnet to 7000 rpm. We were able to pull up alongside, but we could not pass it. It just looked down at me and smiled a broad monster smile. What is this thing, I thought. Is it some mysterious new government experimental project?
My sports car was built to pass anything. It was the fastest, most powerful sports car built to date. I backed off to 75 mph and just watched the license plate sway. As long as this dumb-looking monster is in front of me, I will never enjoy the experience of being number one on this super highway again.
This silly story is fictional of course. It is my 21st century version of the old Tortoise and Hare tale.
The monster is Googlebot. As we all know, Googlebot is a machine that scans, or "spiders," the Internet. This amazing invention searches endlessly for new information, i.e., new websites, new web pages, new keywords/phrases, and new content. Since Googlebot is a machine, it can only "see" certain items, and those items are HTML code. It does not see the color white, or Macromedia Flash content.
Gurus of the 20th century have studied Googlebot for years to formulate their theories about how Googlebot's Algorithm* works.
*Algorithm, From Wikipedia, the free encyclopedia:
In mathematics, computing, linguistics, and related disciplines, an algorithm is a procedure (a finite set of well-defined instructions) for accomplishing some task which, given an initial state, will terminate in a defined end-state. The computational complexity and efficient implementation of the algorithm are important in computing, and this depends on suitable data structures.
Informally, the concept of an algorithm is often illustrated by the example of a recipe, although many algorithms are much more complex. Basically, an algorithm is a method, like a recipe, in that by following the steps of an algorithm you are guaranteed to find the solution or the answer, if there is one. Algorithms often have steps that repeat (iterate) or require decisions (such as logic or comparison). Algorithms can be composed to create more complex algorithms.
The concept of an algorithm originated as a means of recording procedures for solving mathematical problems such as finding the common divisor of two numbers or multiplying two numbers. The concept was formalized in 1936 through Alan Turing's Turing machines and Alonzo Church's lambda calculus, which in turn formed the foundation of computer science.
Most algorithms can be directly implemented by computer programs; any other algorithms can at least in theory be simulated by computer programs. In many programming languages, algorithms are implemented as functions or procedures. Algorithms can be expressed in many kinds of notation . . . For example, Boolos-Burgess-Jeffrey (2002) (p. 26) give examples of Turing machine programs written as “machine tables" (see more at Turing machine, finite state machine, state transition table), as “flow charts" (see more at state diagram), or as a form of rudimentary machine code or assembly code called “sets of quadruples" (see more at Turing machine). They give a more detailed outline of their “multiplication machine" . . . portions of which are labeled with short natural-language descriptions.
This is getting very deep. But, I like abstract, creative thought. I did get straight A's in calculus at the University of Arizona. The other courses I took were boring so I quit the University of Arizona and joined the Navy.
My SEO theory for the 21st Century is based on Alan Turing's Turing machines and Alonzo Church's lambda calculus.
Choice of machine model: ( Again from Wikipedia )
This problem has not been resolved by the mathematics community. There is no “best", or “preferred" model. The Turing machine, while considered the sine qua non* -the indispensable, the ultimate fall-back - is notoriously opaque when confronted face-to-face. And different problems seem to require different models to study them. Many researchers have observed these problems, for example:
*Latin legal term for “without which it could not be"
"The principal purpose of this paper is to offer a theory which is closely related to Turing's but is more economical in the basic operations" (Wang (1954), p. 63).
Enough groundwork. Here is my theory, based on the Turing machine model.
I maintain that Googlebot's Algorithm is a mathematical formula based on the permutation calculation, where a x b x c x d . . . x n = total possibilities.
Those possibilities are based on best practices SEO, and not in any particular order of importance. ALL these considerations are IMPORTANT:
(1) The monster's head motion (website content)
(2) The long mechanical legs (alt tags)
(3) The monster's limping sometimes left then right (quality relevant external incoming and outgoing links and internal links)
(4) Its head swaying left to right. (Site's opening paragraph flowing smoothly and laden with keywords/phrases)
(5) Its broad mechanical smile (changing content)
(6) Its arms swaying (site map, verification, and robot text File)
Here is my assumption. Using these requirements at a minimum (there may be more), excellent position in all search engines can be achieved for highly competitive, critical keywords/phrases.
Applying this theory to our site, we have successfully achieved top position for our own critical keywords, and for our clients. The issue is that TRUE SEO is an ART, not only a science, based on a mathematical formula.
Therefore, to permanently optimize a website for top position, a personalized approach is needed. I have a really important and bold statement to make: some SOFTWARE used for SEO will succeed sometimes, but not all the time, because it is chasing a machine's algorithm that changes frequently.
The frequency of that change, assuming a permutation* calculation, is 6 x 5 x 4 x 3 x 2 x 1 = 720, divided by 5 x 4 x 3 x 2 x 1 = 120, which gives 6.
*For four-letter permutations, there are 10 possibilities for the first letter, 9 for the second, 8 for the third, and 7 for the last letter. We can find the total number of different four-letter permutations by multiplying 10 x 9 x 8 x 7 = 5040. This is part of a factorial.
To arrive at 10 x 9 x 8 x 7, we need to divide 10 factorial (10 because there are ten objects) by (10-4) factorial (subtracting from the total number of objects from which we're choosing the number of objects in each permutation). You can see below that we can divide the numerator by 6 x 5 x 4 x 3 x 2 x 1:
10_P_4 = 10! / (10 - 4)! = 10! / 6!
       = (10 x 9 x 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1) / (6 x 5 x 4 x 3 x 2 x 1)
       = 10 x 9 x 8 x 7 = 5040
From this we can see that the more general formula for finding the number of permutations of size k taken from n objects is:
n_P_k = n! / (n - k)!
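The arithmetic above is easy to check in code. Here is a minimal Python sketch of the permutation formula n_P_k = n! / (n - k)!, applied to the two calculations in this article (the 10 x 9 x 8 x 7 example and the six SEO factors); the function name `permutations` is my own label, not anything Googlebot-specific:

```python
import math

def permutations(n: int, k: int) -> int:
    """Number of ordered arrangements of k items chosen from n: n! / (n - k)!."""
    return math.factorial(n) // math.factorial(n - k)

# Four-letter permutations from ten letters: 10 x 9 x 8 x 7
print(permutations(10, 4))  # 5040

# All orderings of the six SEO factors: 6!
print(math.factorial(6))    # 720

# 6! divided by 5!, the "6 changes" figure used above
print(math.factorial(6) // math.factorial(5))  # 6
```

Python 3.8 and later also ship `math.perm(n, k)`, which computes the same quantity directly.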
If items 1 through 6 above were the only considerations, Google's algorithm could change several times in a specified period. If that time period were 30 days, those changes could occur about 6 times per MONTH. You will NEVER pass the hypothetical monster permanently. You will never be number one permanently, unless you use best practices SEO consistently like we do. Otherwise, you will always be chasing the monster and never passing it.
The gurus of the twentieth century are always discussing the Google algorithm. All of them are trying to second-guess where the next emphasis will be. Is it content? Is it linking? Is it . . . ?
Don't chase the machine, cruise behind it like we do. Use best practices SEO all the time. No matter what Googlebot is smiling at you for, you will always be prepared to smile back and . . .
Enjoy the ride.
James A. “Jim" Holish
A guru wannabe for the 21st century