123 Internet Group - Blog: 06_02

Monday, February 27, 2006

How Many Links Do You Need?

We all know that link building is an important aspect of SEO. Most of the websites I look at are reasonably well optimized, at least in terms of "on page" factors, but they're usually in terrible shape when it comes to links – both the internal links within the website and their overall link popularity.
Among my students, one of the most frequently asked questions is "how many links do I need to get my site ranked better?" At SEO Research Labs, this question has been the subject of much study, of course. It's a simple question, but the answer can be complicated. Fortunately, the answer is usually "a lot less than you think."

In this article, I'll try to break the question down into bite-sized pieces, and give you the best answer we have based on our research and experience. I'll begin with three key concepts, and then give you some rules of thumb to guide you to your own answers.

The first idea that you need to understand is that there is more than one type of link. For our purposes, we can safely divide links into three main types:

URL links – where the "anchor text" is the URL of a web page. For example, "Dan Thies offers a free e-book on SEO at http://www.seoresearchlabs.com/seo-book.php". These links increase the general authority & PageRank of a web page. When the search terms are part of the URL, as in the example above, then this may contribute to rankings.

Title & Name links – where the anchor text is the business name or the title of the web page. For example, a link to SEO Research Labs or Matt Cutts' blog post confirming a penalty. These links may contribute to the page's ranking, depending on the words used.

Anchor text links – these are links pointing to a specific page, targeting specific search terms. For example, a link to my upcoming link building teleclass, specifically targeting "link building" as a search term. These links may contribute to a page's ranking, and as a result, "text links" have become a major obsession in the SEO community.
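To make the three types concrete, here is a minimal HTML sketch (the first URL comes from the example above; the other addresses and anchor text are hypothetical):

<!-- URL link: the anchor text is the web address itself -->
<a href="http://www.seoresearchlabs.com/seo-book.php">http://www.seoresearchlabs.com/seo-book.php</a>

<!-- Title/name link: the anchor text is a business name or page title -->
<a href="http://www.seoresearchlabs.com/">SEO Research Labs</a>

<!-- Anchor text link: the anchor text targets a specific search term -->
<a href="http://www.seoresearchlabs.com/teleclass.html">link building</a>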

The second idea is that the location of the links matters. Again, I'll break this down into three categories:

Navigational or "Run of Site" links - those links which are contained within a website's global navigation, and/or appear on every page of the web site. Individually, these links are likely to count less than others, because the search engines are capable of identifying them as navigation.

Contextual links – those links which appear in the actual body or content of a web page – like the links in the section above. Individually, these links are likely to count for more than the average link, because search engines are capable of identifying the content areas of a page.

Directory links – those links which appear on links pages, resource pages, and other pages whose primary purpose is to link out to other websites. These links are likely to count for more than navigational links, but their value will be proportional to the number of links on the page.
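As a sketch of the three locations (all markup and URLs hypothetical):

<!-- Navigational / run-of-site: repeated in the global navigation of every page -->
<ul id="navigation">
  <li><a href="/">Home</a></li>
  <li><a href="/widgets/">Widgets</a></li>
</ul>

<!-- Contextual: embedded in the body copy of a page -->
<p>We tested several suppliers and found the best <a href="/widgets/blue/">blue widgets</a> on the market.</p>

<!-- Directory-style: one entry among many on a links or resources page -->
<li><a href="http://www.example.com/">Example Widget Supplier</a> - a directory listing</li>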

The third key concept is that not all links are equal, and quality matters far more than quantity. Search engines have varying degrees of trust for links – in fact, some websites may not be able to pass any authority or reputation at all through links. Google's Matt Cutts and others have written and spoken quite clearly about filtering links from websites selling "text link ads," and told us that 2-way links (link exchanges) are unlikely to help much with search engine rankings.

These three concepts are important to what I'm about to tell you, because when you ask "how many links," the answer depends on what kind of links you're able to create. Linking strategies that take the search engines' position into account will be more effective, require less effort, and deliver more predictable long term results. Relying on one or two tactics is not a linking strategy.

For a website that isn't ranked well, playing catch-up can take some time and creativity, but it can be done. If you are in this position, you may want to take a fairly aggressive approach, with as many as 30-40% of the links you build containing anchor text for your most important search terms. It's important not to be a "one hit wonder" by focusing all of your efforts on text links, especially if you are targeting only a handful of search terms.


A more conservative approach might involve closer to 10% text links, and perhaps 90% of the links producing only general authority (URL and title/name links). With many of my students, I advocate a broad website promotion strategy that tends to generate a lot of general links, and a follow-up program intended to create anchor text links within that larger pool of links.

So how many links do you need? Well, if you focus on higher quality links, and keep your text links within a reasonable proportion to your "general authority" links, we've found the following rules to be pretty accurate:


For a top 10 position, your text link count should exceed the counts of at least half of the top 10 ranked pages, and also exceed the counts of two-thirds of the top 20 pages. As a purely illustrative example: if five of the current top 10 pages have fewer than 200 text links, building somewhat more than 200 puts you in contention.

For a top 3 position, on average, you will need to have 50% more text links than were required to crack the top 10, although in some markets there may be a wide gap between the top few sites and the rest of the top 10.

These rules are just guidelines, and of course, relying on outdated tactics like link exchanges or "text link ads" may prove ineffective. In our latest research, we've stopped counting these links altogether when analyzing competitors, and this approach has proven just as effective over the 5-6 months we've been doing it.

When you start to analyze the competition, you'll usually find that the number of text links you need is fairly low, in comparison to the number of general authority links you need. If you worry less about "getting anchor text," and instead look for ways that you can promote your website, you'll find it a lot easier. My students usually struggle with this idea, but in the end, we've always been able to find ways to do (profitable) promotions that also generate the links we need.

I wish you success.


About The Author
Dan Thies is a well-known writer and teacher on search engine marketing. He offers consulting, training, and coaching for webmasters, business owners, SEO/SEM consultants, and other marketing professionals through his company, SEO Research Labs. His next online class will be a link building clinic beginning March 22.

posted by Scott Jones @ 8:54 am

Friday, February 24, 2006

Google Creates Web Pages

Any minor excitement over Microsoft's Office Live beta and its free page-hosting option just evaporated, as Google now offers an easy-to-use web page creator and 100 MB of storage space.

The latest free service from Google (beta, naturally) has arrived. Called Google Page Creator, the service provides a simple, visual approach to page design.

That WYSIWYG approach to web page design has made products like Macromedia's Dreamweaver so popular with the visually oriented people who work in design. As elements of a page are created, the service auto-saves them, providing a safety net for the new user.

Presently, Google supports Internet Explorer and Firefox as the browsers of choice for Page Creator. If someone happens to be an avid user of another browser, a humorous message appears on-screen:

Our programming wizards tried their darndest to get Google Page Creator to work with as many browsers as possible. But alas, even the most expert practitioners of web sorcery must sleep now and again, lest their JavaScript magic run dry.

Sites created with the tool take the site name of http://username.googlepages.com. Within the Page Manager Settings, the site name can be changed from the username default; users who post content unsuitable for minors must check a box in Settings signaling this, per the terms of service.

Other content, like images, can be uploaded via the "Uploaded stuff" box on the right side of the page. Once those images have been uploaded, one can click on a page in the Manager to edit it directly.

The page editor offers a small number of commonly used markup tools, available by clicking their buttons. These fifteen buttons primarily focus on text formatting, along with buttons for inserting images and hyperlinks.

I inadvertently discovered by logging into Page Manager from two different browsers that Google enforces some basic version control by locking a page that is being edited. The new browser can break that lock and edit the page by clicking a link.

One issue I noticed while testing this with my monitor set to 1024x768 resolution: in the Page Manager, an option to Edit HTML appears at the bottom of the markup tools on the left side of the page, below the Normal formatting button.

I only noticed this by accident while the tool loaded, because after the Page Manager loads, the Edit HTML link gets pushed below the bottom of the screen. While the page being edited does scroll vertically, as it should, the frame containing the tools does not. This happens in Firefox 1.5 and IE 6, so anyone who isn't using a greater resolution than 1024x768 probably won't see Edit HTML.

It is in beta, of course, and this little issue will likely be fixed in short order. Page Creator does provide a straightforward way for someone to create a site that isn't a blog (yes, it's true, not every site is a blog) and work on it from any computer.

About the Author:
David Utter is a staff writer for WebProNews covering technology and business.

posted by Scott Jones @ 8:30 am

Advertising Like It's 1999

Starting a website used to be relatively easy. Register a domain name, get a virtual hosting account, set up a basic-looking website, then choose from the literally hundreds of marketing agencies that were willing to send traffic to your site for a relatively small price. A lot has changed on the Internet since 1999, and maybe nothing so much as the way we market our websites.

Some may be tempted to say that marketing has become easier on today's Internet. We know more about users' expectations and are able to better target our ads to users who are interested in our websites. Through programs such as Google Adsense and Yahoo's Contextual Marketing programs, we can be relatively certain that the clicks for which we pay are from people who are actually interested in our offerings (of course there are issues of click fraud, but that is not the focus of this article).

But because our advertising choices have been effectively slimmed down to just a few major ad networks, finding a great deal in advertising has become much harder. Every website owner is rushing to the major ad networks, which creates a scarcity of ad spots. The result is that ad prices are being driven up - and your profits are being driven down.

After a little research, however, I learned that the small, upstart, great value advertising options had not died. It gave me hope that the good things of the early Internet could still be alive in today's webbed world.

Advertising on Blogs

Blogs are big. There is no doubt about it – everyone is starting a blog. My wife even started a blog last month (http://www.thelazywife.com – please excuse the shameless promotion of her blog) with the hope of making a little side income. Blogs are relatively easy to set up and maintain, and with so many people talking about blogging successes, they have become an attractive option for those looking to bring in an additional income.

This is good for advertisers. The blogging boom has created a buyer's market for advertising. Most bloggers are trying to make money from contextual advertising and are seeing some level of success, but most would like to see more money from their blogs. The result for the rest of us is that buying ads on blogs can bring quite a bit of traffic without having to pay a great deal of money.

If you need proof of this, just head on over to BlogAds. BlogAds is an invitation-only network of blogs offering advertising on their websites. Each site is categorized, which allows advertisers to target their ads. The best feature of BlogAds, however, is the ability not only to see the site that you will be advertising on, but also to see how much estimated traffic that site will receive while your ad is live.

Some placements are more expensive, but if you choose wisely and create a decent ad, an effective cost of $0.05 to $0.10 per clickthrough is attainable. For my wife's blog we purchased several ads across a handful of targeted blogs, and we are currently on pace for an effective cost of about $0.05 per click. That is effective advertising.

There are other blog ad networks besides BlogAds, and many blog owners would be happy to accept an advertiser if you were to approach them. The traffic on blogs is real, and with the number and popularity of blogs, finding a good advertising deal is not too difficult.

Finding Upstart Ad Networks

One of the beautiful things about the late 1990s was the sheer volume of upstart ad agencies. Although none of these groups could generate the traffic that today's mega agencies can, these upstarts usually were able to provide solid traffic for a true bargain in an attempt to woo new advertisers.

Upstart ad networks, although a lot less visible today than they once were, can be found in a multitude of ways. They usually do not have a lot of press around them, and they probably have only a few quality websites in their network, but they do exist and they can be a good advertising outlet. More and more, these networks are focusing on vertical markets (such as an ad network that deals only with Internet marketing). To find a network like this, you should familiarize yourself with the major websites in your industry. Pay attention to who is serving their advertising (you can usually figure this out by viewing the source of the page, as sketched below) and check the rates of advertising. Most of the time you will find a major ad network behind the ad, but from time to time you can find an absolute steal.
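As a rough illustration of what to look for, a third-party ad call in a page's source often looks something like this (the network domain and parameters here are hypothetical):

<!-- The domain in the script's src usually identifies the ad network -->
<script type="text/javascript" src="http://ads.example-adnetwork.com/serve.js?zone=1234"></script>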

New Search Networks

With Google Adsense, Yahoo Marketing, and the upcoming MSN Ad Center (in beta), it would be reasonable to assume that search engine marketing has turned into a virtual oligopoly. Thankfully, this is not the case. Not only are there new types of search engines being formed that will undoubtedly challenge search as we know it, but there are also traditional search networks that offer legitimate advertising options.

The ISEDN (Independent Search Engine & Directory Network) is a group of smaller search engines and directories that have banded together to offer advertisers an alternative to the more expensive search engine options. Although the traffic of the current 165+ search engines that make up the ISEDN is not at the level of the major search networks, the group still boasts a fairly impressive search volume of over 150 million monthly searches.

Most people would avoid advertising on a small search engine like many of the ones found in the ISEDN because of the lack of search volume, as well as the question of whether the vendors are offering legitimate traffic. However, as a group, the ISEDN is able to leverage their traffic, remove the incentive to deliver bad traffic by offering their ads for a flat fee ($4/keyword/month – minimum 3 months), and offer an ad product that can theoretically reduce an advertiser's cost to an insignificant level. This may be one of the reasons that the network sees the majority of its advertisers renew after the first three months.

In addition to search networks like the ISEDN, alternatives to search engines are starting to gain steam. Websites such as Digg.com, Del.icio.us, and Wikipedia are changing the way we find information on the Internet. While these are not a pure replacement for search engines, they are becoming a very popular way to find new websites. Most of these new social network websites do not currently offer advertising, but these could provide a very good alternative to the major search networks in the near future.

Be Crazy - Relive 1999

The web has certainly changed, and maybe nothing has changed more than the way we advertise. The days are gone when establishing a successful website was an easy task.

Paid advertising can be a quick shortcut to launching your website. Many website owners avoid paid advertising because it is usually expensive, and seeing a real return on the investment can be tricky. But if you look around, be creative, and keep an open mind, there are plenty of bargain advertisements that can bring quality traffic to your website.


About The Author
Mark Daoust is the owner of Site Reference. If you want to reference this article, please reference it at its original published location.

posted by Scott Jones @ 8:24 am

Thursday, February 23, 2006

Pulling Google

Admittedly, I have a bit of a childish mind. I often see things as more animated and fantasized than they really are. When I think of search engine optimizers, whether professionals or the casual SEO for a personal website, they often remind me of a room full of school children all waving their hands up in the air, holding their breath, grunting, and whimpering for the chance to have the teacher call on them (they have the best answer, after all).

It's true. Most website owners would gladly spend a day outside of the Googleplex jumping up and down, hoping, praying, and whimpering for Google to take notice of their website, if they thought for a moment that it would give them a chance at getting a top ranking. We are absolutely obsessed with search – it is the ultimate ego stroke of being a website owner.

Most modern SEO theories find their genesis in trying to push a website to the front of Google's rankings. They start with the idea that your website is the one that should be called on by the teacher and give you methods on how to get the teacher's attention. They teach how to raise your hand higher, how to squirm just a bit more, how to sigh with extreme disappointment when the teacher picks the website that is obviously the teacher's pet.

This is push SEO, and it does work for many people. The problem with push SEO is that our 'classroom' is huge. We are asking Google to pick our site out of literally thousands, if not millions, of websites that all have something to offer on the subject at hand. We may believe that we have the best thing to offer, but Google does not know that.

Lately, however, a theory (or method) seems to be arising that counters the idea of push SEO. Rather than asking you to change your website to fit Google's standards of a 'good result', this theory aims to change Google's standards themselves.

Google Has a Confidence Issue

I have already admitted to having a childish mind that creates fantastic visions of how the world works, but I really think that Google has a confidence issue. They are the ultimate know-it-alls. Most of us are annoyed by that person who is quick to correct us on a small detail or who seems to have an answer to just about every question, but Google does just that.

Think about it – if you do a search for 'amazen', Google will respond with "Did you mean: amazon?". How arrogant and rude can a search engine be? How can they assume that they know what I am looking for?

All joking aside, they usually do know what we are looking for. They are so supremely confident that they know what we are looking for because they have been able to successfully respond to millions of questions daily for the past several years. But like most people with confidence issues, if they feel that they are being left out on a particular topic, they develop feelings of inadequacy. As a result, Google is constantly trying to know everything about everything. The idea behind pull SEO is to tell Google that they are wrong or that they do not know something – and that you have the website that they need to know about.

Mike Grehan on Pull SEO

I was first introduced to the idea of pull SEO by Mike Grehan, a man who, in my opinion, understands real SEO rather than just a bunch of SEO tricks. Although I do not know the man personally (I would be happy to make his acquaintance), he is the one person who most closely echoes my thoughts on SEO.

Just recently he posted on his blog an interesting article on how an in-progress event can affect search results. For example, take a tragedy such as Hurricane Katrina. When the hurricane hit, it was all that was on our minds and hearts, and as a result, it was what people searched for in Google. Consequently, the search results of the major search engines changed.

Think about it – anytime a major disaster hits, it becomes the major subject of the search engines. When Pope John Paul II died in 2005, searches for his name topped most search engine charts. After Janet Jackson's right breast upstaged the Super Bowl halftime show of 2004, search engines were quickly used as a resource to relive the questionable moment. After September 11th, the world flocked to a younger Google to find information on the World Trade Center and Osama Bin Laden.

If you think like a search engine, being able to present up to date information based on the news of the day gives you a distinct competitive advantage. If you have the results people are looking for faster than others, then you suddenly become the trusted resource everyone looks to.

Mike discusses in several other posts the idea of pull marketing and how he actually uses it in his professional SEO consultations. I am not sure if Mike is the originator of the idea of pull SEO, but he is the first person that I learned this theory from.

Marketing in a Bathroom

I read an interesting comment at Threadwatch that gives a great example of how pull SEO can actually work. The comment related a story which seems to be fairly commonplace in the website owner world. A new website owner, completely unfamiliar with search engine optimization and website marketing, was looking for help. In an effort to market the website, the owner was instructed to place post-it notes with his website address on them in several bathrooms.

The result of this marketing activity? Within a few months his website rose to the top of the search engine rankings, he started to see a good amount of traffic, and his search engine woes were quickly taken care of.

What SEO work did this person actually do? In reality, there was no SEO work at all – just regular viral marketing.

Making a Splash Big Enough To Notice – The Real Payoff

Allow me to be overdramatic for a moment, but if you want to get to the top of Google, you not only have to be the website that shows all the information possible on Hurricane Katrina, you also have to be the website that causes Hurricane Katrina. In other words, if you want to get to the top of the rankings – make enough noise that people start searching for your website independent of 'just finding' you in the search engine results pages.

If Google's users are hammering its search results to know more about BlueWidgets.com, then Google will ultimately serve BlueWidgets.com as a result to those users. If it fails to do this, it will lose trust among its users.

Mike Grehan often talks about the effect of a client launching a major television commercial campaign and how there is an immediate effect on that client's rankings in the search engines. This is not a coincidence, but a direct result of raising awareness of a website and Google responding to that new awareness.

The Reality – Small Businesses Have Trouble Making Big Splashes

Pull SEO is good in theory, and it is very good for a Fortune 500 corporation, but the small company will certainly have trouble utilizing it. Making a big publicity splash is either very expensive or requires something so unique and revolutionary that the splash practically makes itself. And for the small company that is able to grab a lot of attention independent of the search engines, a top ranking really becomes ancillary to all the news coverage it is probably receiving.

But maybe this is the way it should actually be. Is it possible that the way to get to the top of the rankings is to develop an actual plan on how we will make our websites popular - independent of the search engines? If we are able to create enough buzz about our website, then search engine rankings, although nice, suddenly become less of a focus.

Put Your Hand Down – Get Your Marketing Geared Up

Google asks us millions of questions every day. Which website should they rank first for every topic that people ask about? Naturally, we want to raise our hands hoping that Google will call on us to answer their users' needs. But in all reality, we need to put our hands down and start working.

Relying on a single entity, such as Google, is a bad strategy. Google, as I mentioned earlier, is the ultimate stroke to a webmaster's ego. It is the 'icing on the cake', the affirmation of a job well done. It is not, however, the goal in and of itself.

Your goal is to be successful independent of Google. Make your website buzzworthy and Google will eventually take notice. Google cannot ignore the demands of thousands of users.


About The Author
Mark Daoust is the owner of Site Reference. If you want to reference this article, please reference it at its original published location.

posted by Scott Jones @ 4:29 pm

Wednesday, February 22, 2006

The Importance (Or UN) Of PageRank

Another Google PageRank update appears to be underway or near completion. What that means for search engine optimization is a matter of debate inside the industry. But according to the PageRank shown on toolbars, ranking has increased for some lower ranked sites while it has decreased for previously higher ranked sites.

This indicates that there is something to PageRank, even if the industry aficionados don't always agree on what. The general consensus, however, is that PageRank is not as important for your site, as it is for other sites linking to your site. That is, if a highly ranked and trusted site links to yours, you get more credibility.

"Google seems to have expanded their differentiation between bogus links and links which are earned," said Search Engine Journal's Loren Baker.

"Earned links are links from authority blogs and web communities, well branded and trafficked sites, links earned via news or linkbaiting (the art of creating an idea or tool on your site worth linking to) and yellow page, local directory, user reviews, & local search marketing for brick mortar businesses."

Ultimately that can mean higher search result rankings for competitive keywords. PageRank, therefore, seems to be only a very small (but somewhat important) part of the overall equation. PageRank is judged primarily on a website's link volume, link age, and link value.

Most authorities in the SEO industry, including both individuals and companies, have denounced the once mythical status of PageRank. At one time, PageRank was so valuable that companies were willing to pay to increase their website's PageRank (through link buying). As recently as a couple of years ago, high PageRank meant high rankings nearly all the time.

Hundreds of posts in the more popular SEO forums and blogs, though, have shown that the PageRank update has had varying effects. As with all PageRank updates, new websites are seeing increases in PageRank. Most new websites we see are being indexed anywhere between PR1 and PR5. Many sites previously ranked PR3 or PR4 have seen a one point increase.

Like an update that took place approximately nine to twelve months ago, many higher-ranked websites are seeing a drop in PageRank of one point on several of the datacenters, particularly those in the PR5-PR8 category. A PR7 appears to have taken the place of the PR8s and PR9s of the past as the untouchable score, as even the number of PR7s has declined overall.

With the recent launch of Google's Bigdaddy datacenter and the overall public awareness of Google's constantly moving rankings across a variety of datacenters, it's no wonder that we are also seeing PageRank fluctuation now too. Most agree that this fluctuation should settle down within the next few days.

"People in the know are no longer chasing after a higher Toolbar PR.," said SEO Company's Bob Mutch. "The emphasis has shifted to getting links from relevant sites that are in DMOZ and Yahoo directories, press releases, links in news sites, and the highly prized .gov and .edu links."

Mutch went on to pose two theories as to why PR has dropped, as it has in the past four PR exports.

"The main theory is that Google has changed the relationship scale between Real PR and Toolbar PR so that the 0 to 10 toolbar PR scale shows a lower value for the same Real PR value a Webpage has. This theory holds that this is being done to discourage people from acquiring links based strictly on the Toolbar PR value," he said.

"The other theory is that as there are more pages in the Google index, the relationship scale between Real PR and Toolbar PR changes due to the increase of the pages in the index. With the increase of pages it takes higher Real PR to get the same Toolbar PR."

SEO professional Andy Hagans downplays the importance of PageRank, placing more emphasis on link building.

"With each new Google update, it is apparent that PageRank counts for less, while other factors (trust, age, authority links) count for more," said Hagans.

"Really, the importance of the green pixels is now negligible, aside from a very rough indication of a site's link popularity and assurance that it isn't banned in Google. But as long as Google continues to show PageRank in the toolbar, it will never ‘die' in the public consciousness."

At Submitawebsite Inc., we really don't put a whole lot of emphasis on PageRank for most clients, as different industries have different website popularity requirements needed to attain high search engine rankings. In our industry, the requirements to attain top rankings are very high, whereas a website looking to sell electrical semiconductors does not need to be as concerned about having thousands of backlinks, or multiple high PR sites pointing to it.

For us, PageRank is a good indicator of a website's online value as seen by Google, but our concerns for a website, with regard to assisting them with quality white-hat Natural Search consulting, lie more in the area of traditional best practice SEO recommendations, unique content creation, press release and article distribution, and other forms of white-hat link building. PageRank has simply taken a backseat in the minds of our staff and clients.

It seems that the industry consensus is that PageRank really has taken a back seat, at least within the white-hat community, and that one should not be too concerned about a new PageRank score. Using PageRank as a measurement of a website's popularity in the search engines still makes sense, and if a site has PageRank, it will likely help your rankings if it links to you, especially if that PageRank is high. Based on my read, and the read of the top players in the industry, website owners and webmasters need to stay focused on valuable content and being linked with authority sites. PageRank, at the end of the day, is really just the icing on the cake.

At the same time, even though PageRank doesn't have the same value it once did, it can still be used for strategic link development.

"Instead of chasing PageRank I like to find the low PageRank sites that rank well in the search results," says SEOBook author Aaron Wall. "What links do they have? Why are they ranking so well? If you get the types of links that those sites have, and if you have a compelling website that other sites actually want to link to the PageRank will naturally fall into place without you developing a wonky link profile trying to artificially boost your PageRank."

Veteran link building expert Eric Ward agrees that trust and link building, not chasing PageRank, are the most important parts of the search strategy.

"My link building efforts are driven by subject affinity, and trust--the trust that the user will have for the site the link appears on," said Ward.

"You do not have to have a bunch of (or for that matter ANY) high PageRank links in order to rank well for a specific search phrase (in more niche markets). You need to have a handful of perfectly placed subject specific trusted links. This is the core reason why I love but don't worry about PageRank."

About the Author:
Joe Griffin is the President of SubmitAWebsite.com. Founded in 1997, Submitawebsite, Inc., "The Industry’s Original Submission Company," is a leading provider of search engine marketing services...

posted by Scott Jones @ 1:40 pm

The Three Principles of HTML Code Optimization

Just like spring cleaning a house, the html code of your web pages should get periodic cleaning as well. Over time, as changes and updates are made to a web page, the code can become littered with unnecessary clutter, slowing down page load times and hurting the efficiency of your web page. Cluttered html can also seriously impact your search engine ranking.

This is especially true if you are using a WYSIWYG (What You See Is What You Get) web design package such as FrontPage or Dreamweaver. These programs will speed up your web site creation, but they are not that efficient at writing clean html code.

We will be focusing this discussion on the actual HTML coding, ignoring other programming languages that may be used in a page, such as JavaScript. In the code examples I will be using round brackets ( ) instead of correct HTML angle brackets < > so that the code examples will display properly in this newsletter.

Until recently, when coding a page in HTML we would use tags such as the (font) tag and (p) paragraph tags. Between these tags would be our page content: text, images and links. Each time a formatting change was made on the page, new tags were needed with complete formatting for the new section. More recently we have gained the ability to use Cascading Style Sheets, allowing us to write the formatting once and then refer to that formatting several times within a web page.

In order to speed up page load times we need to have fewer characters on the page when viewed in an html editor. Since we really do not want to remove any of our visible content we need to look to the html code. By cleaning up this code we can remove characters, thereby creating a smaller web page that will load more quickly.

Over time HTML has changed and we now have many different ways to do the same thing. An example would be the code used to show a bold type face. In HTML we have two main choices, the (strong) tag and the (b) tag. As you can see the (strong) tag uses 5 more characters than the (b) tag, and if we consider the closing tags as well we see that using the (strong)(/strong) tag pair uses 10 more characters than the cleaner (b)(/b) tag pair.
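Side by side, in the round-bracket convention used in this article, the saving looks like this (17 characters of markup versus 7):

(strong)Special Offer(/strong)
(b)Special Offer(/b)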

This is our First Principle of clean HTML code: Use the simplest coding method available.

HTML allows nesting code within other code. For instance, we could have a line with three words where the middle word is in bold. This could be accomplished by repeating the complete formatting each time the visible formatting changes. Consider this code:

(font face="times")This(/font)

(font face="times")(strong)BOLD(/strong)(/font)

(font face="times")Word(/font) This takes up 90 characters.

This is very poorly written HTML and is what you occasionally will get when using a WYSIWYG editor. Since the (font) tags are repeating the same information, we can simply nest the bold tags inside the (font) tags, and better yet use the (b) tag instead of the (strong) tag. This would give us (font face="times")This (b)BOLD(/b) Word(/font), taking up only 47 characters.

This is our Second Principle of clean HTML code: Use nested tags when possible. Be aware that WYSIWYG editors will frequently update formatting by adding layer after layer of nested code. So while you are cleaning up the code look for redundant nested code placed there by your WYSIWYG editing program.

A big problem with using HTML tags is that we need to repeat the tag coding whenever we change the formatting. The advent of CSS allows us a great advantage in clean coding by allowing us to layout the formatting once in a document, then simply refer to it over and over again.

If we had six paragraphs in a page that switch between two different types of formatting, such as headings in blue, bold, Arial, size 4 and paragraph text in black, Times, size 2, using tags we would need to list the complete formatting each time we make a change.

(font face="Ariel" color="blue" size="4")(b)Our heading(/b)(/font)

(font face="Times color="black" size="2")Our paragraph(/font)

(font face="Ariel" color="blue" size="4")(b)Our next heading(/b)(/font)

(font face="Times color="black" size="2")Our next paragraph(/font)

We would then repeat this for each heading and paragraph: lots of HTML code.

With CSS we could create CSS Styles for each formatting type, list the Styles once in the Header of the page, and then simply refer to the Style each time we make a change.

(head)
(style type="text/css")
(!--
.style1 {
  font-family: Arial, Helvetica, sans-serif;
  font-weight: bold;
  font-size: 24px;
}
.style2 {
  font-family: "Times New Roman", Times, serif;
  font-size: 12px;
}
--)
(/style)
(/head)

(body)
(p class="style1")Heading(/p)
(p class="style2")Paragraph Text(/p)
(/body)

This is our Third Principle of Clean HTML Code: Use CSS styles whenever possible. CSS has several other benefits, such as being able to place the CSS styles in an external file, thereby reducing the page size even more, and the ability to quickly update formatting site-wide by simply updating the external CSS Style file.
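For example, the two styles above could be moved into an external file (here a hypothetical styles.css) so that the entire (style) block in every page's head shrinks to a single reference, again written with round brackets:

(head)
(link rel="stylesheet" type="text/css" href="styles.css")
(/head)

Editing that one file then updates the formatting across the whole site.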

So with some simple cleaning of your HTML code you can easily reduce the file size and make a fast loading, lean and mean web page.


About The Author
George Peirson is a successful Entrepreneur and Internet Trainer. He is the author of over 30 multimedia based tutorial training titles covering such topics as Photoshop, Flash and Dreamweaver. To see his training sets visit http://www.howtogurus.com. Article copyright 2005 George Peirson

posted by Scott Jones @ 1:06 pm

Monday, February 20, 2006

Google.cn: The Internet As Beijing Sees It

In November, I wrote an article and referenced a trip that ICMediaDirect.com's VP of Business Development, Diana Lee, took to China. She participated in Shanghai's inaugural ad:tech conference. It was a great trip and our company's ties with China are stronger because of it. Like most Western companies doing business in China, we're just doing business and there are no extenuating circumstances. Google, the giant search engine, cannot say the same.

China is an economic giant warming up to the power of the Internet, but this hasn't been a perfect marriage so far. Centralized power and the decentralized nature of the Internet do not mesh well. Beijing feels compelled to exercise tight control over whatever flow of information it can in order to stifle potential dissent within Chinese society. A governmental missive from 2000 states plainly that Internet providers must restrict information that may "harm the dignity and interests of the state". And it is into these centrally run, Communist waters that Google waded last week when it introduced its localized Chinese search engine, Google.cn.

Google.com was already available to Internet users in China, but the company launched Google.cn to stay competitive in the market, as China already has some big search engines of its own, Baidu specifically. But there is a price to pay. In a stance wholly contradictory to its stated purpose, Google must censor websites that the Chinese government finds threatening. Just a few of the sites deemed not kosher include: Bacardi.com, date.com, collegehumor.com, jackdaniels.com, news.bbc.co.uk, pressfreedom.com, queernet.org, and teenpregnancy.org. So, in addition to sites critical of Beijing, websites concerning sex, alcohol, and controversial issues are forbidden on Google.cn as well.

Now consider an excerpt from Google's IPO filing that reads: "Don't be evil. We believe strongly that in the long term, we will be better served - as shareholders and in all other ways - by a company that does good things for the world even if we forgo some short term gains. This is an important aspect of our culture and is broadly shared within the company."

Google's foray into China is directly contradicting their exuberant IPO statement. Perhaps they took idealism a little too seriously, but that's forgivable. To date, none of Google's actions have really amounted to anything more than wearing some egg on their face. But Google isn't just any old company hawking its products. These are historic times for the Mountain View, CA bunch, and over the next few years their presence in China will amount to much more than a search engine that censored Playboy.com for the Communist government there.

If nothing else, the last two or three years have shown us the inherent strength of the search engine - and none more than Google. And I believe that an unintended consequence of Google's controversial stance in China has been an increased awareness of just how influential search results can be. Comparisons of "Tiananmen" searches illustrate this. Several blogs are showing split-screen stills of keyword results for "Tiananmen" on Google Images. Google.cn shows picture after picture of a lovely park, while Google.com shows a screen full of those infamous images of a lone protestor in front of menacing tanks. An example of real-time censorship is being beamed live over the Internet, brought to you by Google. It makes for unintentional and terrible publicity for Google. Oddly timed, too, considering Google's righteous defense here in the United States against government intrusion into its own affairs.

From a business perspective Google's position is sound and totally understandable. They knew they were in for a lump or two for caving to Beijing. They said that providing some information is better than providing none at all. In their own defense, Google cited that less than 2% of websites were to be censored on Google.cn - a mere pittance - yet this is the same company that derided Yahoo for having as little as 1% of their index as paid inclusion. Then it was about principle. Now it's about business. Principle, not surprisingly, can go take a hike.

I repeat, Google's position is not wrong. It's almost silly to envision a leading global company that can maintain preeminence while staying true to a lofty (and now meaningless) definition of "Do No Evil". But in a darkly ironic twist, Google may someday find themselves in situations of flat out "We Do Evil Right".

A benefit of search is privacy and Google backs user privacy to the hilt here in the United States. Think of what privacy means to users - people can seek help for alcohol and drug problems without fear of ostracism, they can test the job market without making waves, ask questions they may feel embarrassed asking someone they know - all anonymously. Maybe we take this for granted, but this is a powerful and useful asset for us.

Could anyone actually believe that Google will protect Chinese Internet users if the powers in Beijing start making demands for private search information on Chinese searches? Google has entered China on Beijing's terms, compromised. When issues of ethics arise, Google won't have much to say, because they are clearly in China for the dough. The power of search that we see in China can - and, let's be frank, will - be used against the people someday. This would make Google, of "Do No Evil" fame, somewhat complicit.

Until some big changes occur in Beijing, I foresee much awkwardness for the "Do No Evil" bunch's operation in China. Simply put, the Party Leaders in Beijing have Google over a barrel - I suppose that means selling out. If Google cared only about profits, this article might not have even been written. But this is Google. Their product is a powerful tool, and they've already yielded it to some very powerful folks in Beijing. This time it was to prevent the Chinese people from accessing certain information. As the tool of search continues to be refined and grow more powerful, it's tough to say what Google will be asked for next. Perhaps Google will be coerced into giving up the identities of its own users in China. It is anything but a farfetched scenario. Is hypocrisy in big business expected? Sure, to some degree. But this is dangerous hypocrisy.


About The Author
Joseph Pratt Media Analyst ICMediaDirect.com. email: joseph@icmediadirect.com

posted by Scott Jones @ 8:37 am

Friday, February 17, 2006

Why Your Website Needs Inbound Links

Most web-savvy people quickly learn why they need "links" from other sites pointing at theirs. Your inbound links are one of the most important ways of getting yourself known in your field, generating traffic to your website, and influencing the search engines to notice your site.

"Traffic" is what linking is all about. Without traffic your website is useless as a tool for selling your products or communicating your ideas. Getting links from other websites is not the only way to generate traffic, but it is probably the most important one.

But how do links generate traffic?

**Direct traffic from links**

First, links generate direct traffic. Links from sites that share your target audience will be an important source of traffic to your site. A visitor to the other web site sees the link to yours, clicks on it, and becomes your visitor. Some estimates put the percentage of internet traffic resulting from this kind of link as high as 21% of total traffic.

Why do people click on these links? One reason is they may view a link to an outside source as an endorsement. They assume the webmaster is saying "Here is a source you will find interesting or helpful". They are looking for the kind of service you provide, so they click on the link to check you out.

But just as important is simple curiosity. Someone sees a text link with intriguing wording like "Powerful Cheap Advertising" or "Win a Free iPod" or "See Pamela Anderson Video" and, depending on their interests, a certain number of people are likely to click on it.

This suggests at least three things about your links. First, you should get as many links as possible on pages your target audience is likely to be visiting. The more people see your links, the more traffic you are likely to get.

Second, your anchor text (the words that are linked) should be intriguing. It should be short and sweet, and suggest a benefit -- a reason for people to click on it.

Third, your links should be on pages that people actually look at. Having hundreds of links on pages that nobody ever looks at will not result in traffic -- at least not direct traffic. Putting your link on a link exchange page containing hundreds of services similar to yours is not likely to generate very many clicks. This is why exchanging links with link directories is of such questionable value. Web visitors rarely look at these directories.

Finding good pages where you can place your link is not always easy. One method is to systematically do searches for your most important keywords -- the search phrases people are likely to use when looking for your kind of product or service. Many of the results will be competitors of yours. But one or two may be secondary sources such as directories or reference pages. Getting your link on some of these secondary sources is almost guaranteed to result in traffic, so it is worth the effort -- and sometimes the cost -- of getting listed in the resources that score high for your keywords.

**Traffic from Search Engines**

The second reason for getting inbound links is to impress the search engines. Most search engines use the quantity and quality of your inbound links to evaluate the importance and relevance of your site to specific keywords. For instance, if you sell a product like "Full Color Vinyl Banners", or you are a Real Estate agent servicing "Kitchener Real Estate", one of your objectives is to rank high for searches done on your primary search phrase (and other similar ones).

This will result in traffic because when people search for your important keywords your site is more likely to show up in the search results. The more inbound links you have that relate your site to full color vinyl banners or web promotion services, or "fill in your keyword here", the higher your site is likely to rank for these terms, and the more search engine traffic you are likely to receive as a result.

**Using Articles to get traffic and impress the search engines**

Embedding your links in articles is one of the best methods of rapidly increasing your inbound links. Many times a well-written article will show up in hundreds of places on the web. And if it has your link embedded in it, that will obviously increase your inbound links. Webmasters pick up these articles because they want content to enhance the value of their sites.

Articles will also generate direct traffic because people who read them are already interested in your subject matter, and are therefore more likely to click on your link.

This suggests that the most valuable place to publish your article is in a themed or categorized article resource. For instance, if your product is "health" related, having it published on health-oriented sites will be more valuable than having it published on generic sites.

You can even take this a step further. If your article is about something more specific like "mesothelioma advice", then getting it published on sites that focus on "mesothelioma" will get more "reads", and have a greater influence on the search engines.

Finally, when embedding your link, try to use anchor text that contains one of your important keywords, not just your URL or web address. Remember that search engines are dumb. One of your objectives is to have them relate your website to specific search terms (keywords or key phrases). And the best way to do that is to use those terms as your anchor text.
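As a quick sketch (hypothetical URL and keyword), the difference between a bare URL anchor and a keyword anchor looks like this:

<!-- URL anchor: passes authority but says little about the topic -->
<a href="http://www.example.com/banners.html">http://www.example.com/banners.html</a>

<!-- Keyword anchor: helps engines relate the page to the search term -->
<a href="http://www.example.com/banners.html">full color vinyl banners</a>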


About The Author
Rick Hendershot heads Linknet Promotions ==> http://www.linknet-promotions.com | Get links in articles and blog posts ==> http://www.linknet-news.com/linknet-news.php | Link popularity through professional link building ==> http://www.thinex.de

posted by Scott Jones @ 8:07 am

Thursday, February 16, 2006

Content Layering :: Using Site Architecture To Improve SEO

Many times, a site gets very large and its ability to rank well in competitive markets decreases in part because of the size of the site. While we in the business know that content is king, more often than not it is a combination of content and effective site structure which will ultimately help your pages rank.

In this article I look at how to most effectively structure your site to take advantage of this.

I read this great article on layering on the SEOmoz Blog and while it does a good job of explaining what content layering is, I feel it could be improved just a little bit.

I'm not saying it is wrong in any way. In fact, the tactic outlined will be very effective for a small to medium sized site, however I have also found another way to organize your site which can be more effective if done properly.

The article explains how to use layers to organize your site. We're not talking about CSS layering or anything like that. It's more of a site structure issue than anything.

According to the article, one can layer their site through the use of sub-folders. By creating layers of sub-folders and then placing all related content within that sub-folder you can layer your site to help specific sections of it rank higher.

This is a great way to organize a smaller site because it allows you to place topical pages together, and promote links within the pages to help improve overall positioning of these sections.

Further, it helps reduce the dilution factor often felt by sites that attempt to cover multiple topics in a flat file structure.

For example, if you sell widgets you could organize the sections by some common element, such as color. That way your site could be: http://widgetts.com/blue/page1.html and all blue widget pages would go into this sub-folder. You'd then organize all other sub-folders in a similar style.

Like I said, I think this is a very effective strategy for a smaller or medium site. There would be a much greater chance of blue widgets ranking highly in a structure like this.

However, I feel that for larger sites there's an even more effective way to organize your content.

Through the use of sub-domains one could further organize this content. This would make it even more relevant to search queries and more likely to rank. If one sold a larger variety of widgets yet still wanted to organize them by color, then the structure of the site would be: http://blue.widgetts.com and all site content relating to blue widgets would appear within this sub-domain.

The reason I say sub-domains would be more effective is because search engines tend to treat a sub-domain as its own site. In other words, a search engine sees http://blue.widgetts.com and http://widgetts.com as essentially two different sites.

Keep in mind that such a strategy is of the most benefit to larger sites. If you don't have a large site, or don't foresee your site growing to become a large site, then I wouldn't recommend the sub-domain layering tactic.

This is because, as I've said, the search engines will treat your sub-domain as a unique site. So, if you've only got 10 or 15 or even 50 pages in your sub-domain, chances are it won't rank as competitively as it would have as a sub-folder of a larger site.

Now, to make your content even more competitive, why not combine these two strategies – use a sub-domain and sub-folders to give you even more control over site organization as well as an even greater chance of ranking.

This is because the broader sub-domain can rank competitively for the broader terms while the sub-folder content can rank competitively for the less broad, more specific terms.

What you are doing by combining the two strategies is getting more bang for your buck. This is because you are covering more area on the web, allowing your site to rank for both broad and specific terms.

Then, with some good strategic interlinking you will be able to even further promote the broad areas of your site by linking all your internal pages to the pages above it.
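As a sketch of that interlinking, using the article's hypothetical widgetts.com structure, a deep product page might link back up to the pages above it like this:

<!-- On a page such as http://blue.widgetts.com/gears/page1.html -->
<p>
  See more <a href="http://blue.widgetts.com/gears/">blue widget gears</a>,
  browse all <a href="http://blue.widgetts.com/">blue widgets</a>,
  or return to the <a href="http://widgetts.com/">Widgetts home page</a>.
</p>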

While I'm not entirely dismissing the layered content theory presented above, I am saying consider your situation. If your site is a smaller site, by all means use the layered content approach. If it's larger, then use the sub-domain approach.

Also, remember that there could be multiple ways to organize the same content.

For example, in addition to organizing your sub-domains or sub-folders by color in the widget example, also consider organizing them by features. This way, a chosen widget could be linked to from multiple related categories.

Not only that, but you've now bulked up your site with a bunch of additional pages. These new pages are required to help create the sub-domains and navigation required to drive visitors to the individual widget pages.

This type of multi-category linking is common among many large sites. One good example is eBay. It organizes its top auctions into sub-domains like antiques, art, autos and clothing. Then, within the categories, the sub-folder structure is used to further segment the site.

In conclusion, if you've been looking for a way to most effectively organize your site while helping to improve rankings, consider these options. Through the use of sub-folders, sub-domains or a combination of both you can effectively organize your site, segment your products and target searchers more effectively.


About The Author
Rob Sullivan is an SEO Consultant and Writer for Textlinkbrokers. Textlinkbrokers is a link building company.

posted by Scott Jones @ 1:07 pm

Wednesday, February 15, 2006

The Dark Side Of Google: A Reason For Concern?

The way search marketers dream up conspiracy theories, you'd think we're all paranoid with nothing better to do.

Is there a true reason for concern? I think not, but reading other people's paranoia is always entertaining. We all know search engines are "out to get webmasters". They have nothing better to do than to think of new ways to infringe on websites' rankings or play hide and seek with site PageRank.

Google is at the forefront of the theorists' attention. And it's not very hard to see why.

It's Tough Being At The Top

Google's market share is certainly growing. It handled 60 percent of Internet search queries in November 2005, up from 47 percent a year earlier, according to ComScore Networks. Google's chief officers have said they are committed to growing the company itself in a sustainable way.

Quoting CFO George Reyes: "Google would be spending more on research and development, and will invest heavily in its computing infrastructure."

Google's motto "do no evil" has been analyzed and debated many times. Forum posts and articles are always met with "Google does this" or "Google does that", but the fact of the matter is that none of us know what Google's intentions are, except Google themselves of course... Still, it's nice to play the guessing game and see exactly how close, or far off, you are when things materialize.

Enter the Conspiracies

Everyone has their opinion on the matter - which makes for entertaining reading at least.

Jagger Update

The conspiracy: Google is out to destroy all the organic listings so that everyone will move over to PPC.

The real deal: Google updates their algorithm from time to time to help make search results more relevant. Each update is usually given a name by the SEO community - somewhat like naming hurricanes. The most recent update was called "Jagger". Many scraper directory sites, and sites that bought links from them, were removed during the update.

If you had made use of any shady techniques, it is most likely that your site was caught by Jagger. It was quite a harsh update if you had not employed solid SEO techniques, so needless to say there are a lot of angry webmasters out there. A good example is the German BMW site (bmw.de), which was recently removed for using spam techniques. Just goes to show SEO is SEO no matter what the language.

Google Adsense

The conspiracy: Google Adsense sites get priority in rankings so that Google can make more money. Google is also trying to dominate by pushing webmasters to use Adsense rather than outbound links (link building).

The real deal: If this were true, then regardless of how hard Google "tried", they couldn't force more people onto Adwords by preventing them from achieving a favorable organic ranking.

Besides, when Adwords was first released, several SEOs tested this theory by purchasing paid listings over varying lengths of time. The results? There was absolutely no correlation between purchasing an Adwords account and your organic search ranking.

IP Recording / Privacy Infringement

The conspiracy: Search engines log IP addresses. The data collected can be used against you.

The real deal: There have been many theories that Google logs searchers' IP addresses to track their search behaviour, but the situation has gotten much bigger than that. With all the hype stemming from the Department of Justice requesting logs from the Big Shots of search to see what searches were conducted, the talk has shifted to the legal implications should the court find in favour of the government.

Every bit of network traffic you generate is marked with your IP address; it can be used to link all of those disparate transactions together.

Filtering Results

The conspiracy: If Google can filter the results for China, what stops them from filtering the rest of the world?

The real deal: Well, this is still very much a hot topic at the moment, and I have not really made up my mind on this one quite yet. I can only refer to Google's "Human Rights Caucus Briefing" on their blog.

Excerpt from blog: "In deciding how best to approach the Chinese - or any - market, we must balance our commitments to satisfy the interests of users, expand access to information, and respond to local conditions. Our strategy for doing business in China seeks to achieve that balance through improved disclosure, targeting of services, and local investment."

And: "In order to operate Google.cn as a website in China, Google is required to remove some sensitive information from our search results. These restrictions are imposed by Chinese laws, regulations, and policies. However, when we remove content from Google.cn, we disclose that fact to our users."

This is nothing new; in fact, Google has previously altered its search results to comply with local laws in France, Germany, and the United States. Also, is it not better to have censored information than none at all? At least this way Google has a starting point from which to fight the censorship.

Do No Evil

According to Larry Page: "Google's goal is to provide a much higher level of service to all those who seek information, whether they're at a desk in Boston, driving through Bonn, or strolling in Bangkok."

The Google philosophy:
1. Focus on the user and all else will follow
2. It's best to do one thing really, really well
3. Fast is better than slow
4. Democracy on the web works
5. You don't need to be at your desk to need an answer
6. You can make money without doing evil
7. There is always more information out there
8. The need for information crosses all borders
9. You can be serious without a suit
10. Great just isn't good enough

Excerpt from site: Full-disclosure update: When we first wrote these "10 things" four years ago, we included the phrase "Google does not do horoscopes, financial advice or chat." Over time we've expanded our view of the range of services we can offer -- web search, for instance, isn't the only way for people to access or use information - and products that then seemed unlikely are now key aspects of our portfolio. This doesn't mean we've changed our core mission; just that the farther we travel toward achieving it, the more those blurry objects on the horizon come into sharper focus (to be replaced, of course, by more blurry objects).

Some psychologists say that the closer one becomes to a person (or a thing), the harder it is to see the good stuff. Has Google become so intertwined in our daily lives that we no longer recognize the good stuff it has brought us?

Let me remind you of a few:

1. Relevant Search Results: A source to find information faster. Every update gets rid of the "clutter".

2. Gmail: As far as free web-based email goes, this must be the most user-friendly, with the largest amount of storage space to boot. You can also tie in any other email accounts you may hold and use Google's interface as the "one stop shop", so to speak.

3. Gtalk: Google's free IM and voice chat service, now also tying in with your Gmail interface. This means it's accessible from wherever you have internet access - you don't need the program installed on the machine you're working from.

4. Leader of other search engines: There is no doubt that Google is at the forefront of "great new ideas" for search engines. Google leads and the rest follow. One example is Gmail - more storage space for free. Yahoo! was soon to follow with a similarly sized email account for Yahoo! Mail users at no cost. MSN, however, charges for an increased mailbox.

5. Google Earth: Geographic information at your fingertips. Get driving directions and location information for just about anywhere on the globe, and because they use satellite imagery intertwined with maps you get a pretty good idea of what any place looks like.

6. Google Video: A selection of homemade clips, TV shows, movies and viral clips *freely available on the net. (*some TV shows and movies need to be purchased of course)

7. Google Alerts: Need to know when someone has mentioned you, your company or any topic of interest to you on their website? With Google Alerts you are notified *as it happens. (*as Google spiders that site)

These are but a few of the things Google has brought into our lives.

So ask yourself again - is there really any cause for concern over Google's progress, or are we benefiting from it at the end of the day?

Forget About It

It's a typical situation where a good company gets too big and people start getting a little uncomfortable about its dominance in society.

So I say forget about all the clutter and focus on the good stuff: 2006 will no doubt bring many new innovations, and a whole bunch of new conspiracy theories too.


About The Author
Christine Stander is a professional search engine optimization and online marketing strategist with experience in many facets of search marketing, user behaviour analysis and brand management. For more information please refer to: http://www.altersage.com.

posted by Scott Jones @ 9:49 am

Yahoo! & MSN Generate More Sales than Google

When evaluating the best search engines to spend your marketing budget and effort on, the blanket rule of "biggest is best" is usually applied. While being highly visible to a large audience is a great initial search engine marketing goal, when ROI becomes your campaign's driving force, the real objective is being seen by potential customers in a search engine that generates results. And this is where all search engines are not the same, especially when evaluating Google vs. Yahoo! vs. MSN.

BIGresearch have just released their latest Simultaneous Media Study (SMM), which, after consulting 15,000 respondents, provides great insight into which search engines are best at influencing purchase decisions in particular product categories. In lay terms: which search engine people use when deciding to buy specific products.

And here's where it gets interesting.

For all of you who, like me, thought that Google was sure to be at the top of most of the product categories - you'll be surprised!

The top 5 places overall for "influence on purchase decision" rank as follows (a lower score indicates a stronger influence):


Yahoo! (score of 13)
MSN (score of 21)
AOL (score of 23)
Google (score of 26)
Ask Jeeves (score of 32)

So already the landscape of search looks very different when you start rating search engine effectiveness based on their ability to generate the results you want - sales. Below I have provided a breakdown of top search engines by product category. When you start your next round of search engine marketing planning, it will pay to bear these ratings in mind when choosing where to spend your marketing budget. Focus your search engine marketing based on your product category.

Electronics
Google
Yahoo!
MSN

Apparel
AOL
Yahoo!
MSN

Grocery
AOL
Yahoo!
Ask Jeeves

Home Improvement
Yahoo!
MSN
Ask Jeeves

Car/Truck
MSN
Yahoo!
Google

Medicines
Yahoo!
AOL
Ask Jeeves

Telecom
Google
MSN / Yahoo!
AOL

Eating Out
Yahoo!
MSN
Google / AOL

Source: BIGresearch Dec 2005

posted by Scott Jones @ 9:38 am

Google Lays Down the Law on BMW

Google has flexed its clout by dropping BMW Germany from its search engine after realizing the iconic car manufacturer's German website (bmw.de) was artificially boosting its popularity ranking.

This sends out a strong warning to all online businesses employing aggressive optimization techniques to maximize their exposure and traffic in the search engines. Whilst most companies are using ethical procedures that are acceptable in the eyes of Google and other search engines, it's fair warning to anyone using black-hat techniques (those not in line with the search engines' guidelines).

Google has openly stated they will be clamping down on web spam this year. Many smaller, or lesser known websites have probably suffered the consequences of these anti-spam efforts already, but BMW Germany is the highest profile company to experience the true impact of Google flexing its muscles.

JavaScript re-directs were the reason that BMW's website was dropped from Google's search engine. As Matt Cutts highlighted, this is in violation of Google's Webmaster Quality Guidelines, which clearly address the issue of deceiving users or search engines by showing different content to each - also known as cloaking.

Not only has the website been dropped from the search engine, but its strong PageRank has also been zeroed, meaning it will need to start again and build up its ranking from scratch. A costly exercise, not only in re-optimization efforts but in lost revenue and exposure.

In a sign that this is not an isolated case, Ricoh.de is rumored to be the next large company website to suffer the same fate. So it appears that these aggressive techniques, which were synonymous with pornographic and gambling sites, are being employed on a much larger scale, and will not be tolerated by Google or any other search engines.

This situation highlights two very important facts. Firstly, when large companies resort to these tactics, it reinforces just how much importance is placed on getting a top ranking in the organic search results.

Secondly, it should be a clear warning that, whether you are undertaking your own optimization efforts or employing a company to conduct your search engine marketing for you, you need to confirm that the techniques being used are ethical and in line with Google's (and Yahoo!'s) guidelines. Most professional search engine marketing companies will promote this fact somewhere within their product or service pitch.
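If you want to spot-check your own pages for the kind of redirect trick that caught BMW out, here's a crude Python sketch. The patterns are illustrative assumptions only - they catch the most blatant meta-refresh and JavaScript redirects, not every variation - and the URL shown is hypothetical:

import re
import urllib.request

# Crude patterns that often indicate a doorway-style redirect.
SUSPECT_PATTERNS = [
    r'<meta[^>]+http-equiv=["\']?refresh',   # meta refresh
    r'window\.location(\.href)?\s*=',        # JavaScript redirect
    r'document\.location(\.href)?\s*=',
]

def check_page(url):
    # Fetch a page and return any suspect patterns found in its source.
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    return [p for p in SUSPECT_PATTERNS if re.search(p, html, re.IGNORECASE)]

# Hypothetical usage:
# print(check_page("http://www.example.com/some-page.html"))

A clean result here proves nothing on its own, but a hit is worth asking your SEO company about.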

You have all been warned...

posted by Scott Jones @ 9:37 am

Monday, February 13, 2006

Gbuy - What Every Website Owner Must Know

Every month, it seems, a major company stares down the barrel of Google's brand recognition gun. The company of the month right now happens to be PayPal. The Wall Street Journal published an article on eBay's Jeff Jordan's preparations for Gbuy, the so-called PayPal killer (PayPal is owned by eBay). Many news sources and blogs anticipate that Gbuy will quickly become a PayPal killer, given Google's huge brand-name recognition and reach with consumers.

But the whole PayPal-killing talk is really much ado about nothing. I have no doubt whatsoever that Google will release Gbuy, but I do have significant doubts as to whether it will actually replace PayPal as many merchants' payment processor of choice. Given Google's recent releases, and given how PayPal has positioned itself in the marketplace, I would not be surprised if Gbuy proves itself to be a significant flop considering all the attention it has been given.

PayPal's Vulnerability

Jeff Jordan of eBay has every right to be scared, however. Executives have a history of losing their dignity and composure when they feel pressured by Google. Steve Ballmer is well known for his professional wrestling-like tirade in which he sent chairs flying and cursed the name of Google, and Yahoo is not much better, having publicly given up its quest for search dominance - which in effect was Yahoo executives crying "Uncle!" in the hope that they could finally focus on something they could actually excel at.

Professional wrestling rants aside, Jeff Jordan does have reason to be worried. PayPal, unlike Yahoo and Microsoft, is much more vulnerable to a direct attack from Google. The history of PayPal is filled with trouble with CEOs, run-ins with the mafia, and pressure from an Attorney General. This article, however, is not the place for a lesson on the history of PayPal (a book was written for that), regardless of how fascinating it might be.

Article Tip
Did you know that PayPal is one of the few companies with a single-letter domain name? Check it out: X.com.

The problem with PayPal is that it has not solidified itself in any market besides being the payment option of eBay merchants. The result is that many website owners view PayPal as a 'cheap' option. Furthermore, even though PayPal does allow non-members to purchase items through their system, it is not as easy as many merchants would like. Many of the problems of PayPal were actually discussed on the forums a while ago.

These problems open up a vulnerability for PayPal. If Google releases a product that fixes the shortcomings of PayPal in much the same way it revolutionized online maps, then PayPal should be worried. Google does have a knack for making web applications that make existing applications look outdated and simplistic.

Why Gbuy Will Not Kill PayPal

Experts have been predicting the wild success of Google in many different industries for some time. When Froogle was released it was thought by many to be a major threat to Amazon.com. Although Gmail has been a success, there have been no reports of Yahoo Mail suffering significant attrition (in fact the buzz over the new Yahoo Mail interface shows just how much interest there is in Yahoo Mail). And although Google News is a highly useful service, it is hardly considered the default news service for most web users.

The fact is, the past few Google releases have been relative failures. Google admitted that they screwed up with their video service (the Apple iPod Video is much more successful and sets the bar much higher than Google is currently meeting). Google Reader was met with a collective 'ho-hum' from the webmaster community (as was Google Pack), and even though Google Sitemaps may be useful, it is still underused. Google Analytics is still unable to accommodate mass signups, and the buzz over analytics has also declined significantly since its release.

Really, if we were to look at Google's recent releases, the only relatively successful ones have been Google's Search (obviously), Adwords and Adsense (also obvious), Google Maps, and Gmail. The fact is, with the exception of Adwords and Adsense, Google has not been very successful in launching commercial products.

Google's Lack of Simplicity for 'Everyday' Users

The problem with most of Google's recent releases has been their lack of simplicity. Google's initial success in search was powered by the extreme simplicity it brought to the process. All the user saw was a search box and search results. What could be simpler? On top of that, search results were stunningly accurate compared to the other results available.

Although Adwords is confusing to many website owners, Adsense carries the trait of being extraordinarily simple. Add in the benefit of being able to make a significant income from Adsense, and it is no wonder that Google has a firm hold on the contextual ad market. For potential advertisers there is no greater reach than Google Adwords.

Yet most of Google's recent releases have either lacked the simplicity that made its early products the choice for everyday users or offered no significant advantages over existing products. If PayPal is genuinely going to be threatened by Gbuy, then Google is going to have to perfect the simplicity that PayPal has capitalized on.

Gbuy - Finding a Niche Among Website Owners

A few years back, when K-Mart filed for bankruptcy, I remember listening to an analysis of the reasons why K-Mart was having difficulty. The analyst explained that there were three major players among the mega-stores: Wal-Mart, K-Mart, and Target. Wal-Mart had successfully positioned itself as the price leader of the three, while Target, although still inexpensive, positioned itself as slightly more expensive but higher quality. K-Mart, in this environment, lacked an identity to shoppers.

Gbuy could very well fall into the same problem. PayPal has done a great job of solidifying itself as the payment solution of choice for millions of eBay merchants. In addition, thousands of other website owners have chosen to at least add PayPal as a payment option on their website due to its extreme simplicity for those who have PayPal accounts.

In a best-case scenario, Gbuy could really only hope to fit in as an alternate payment system for those who have already established how users are supposed to pay for their goods. In all reality, though, the market is crowded, and eBay users will likely continue to use the integrated, easy-to-use PayPal over any newcomer - especially if Google fails to make a relatively simple product.


About The Author
Mark Daoust is the owner of Site Reference. If you want to reference this article, please reference it at its original published location.

posted by Scott Jones @ 9:59 am

Friday, February 10, 2006

Advertising in RSS Feeds

As publishers have moved towards monetizing RSS feeds, there have been vibrant discussions as to whether advertisements in feeds are viable or whether they will drive subscribers away. While many are discussing the philosophical approaches to ads in RSS feeds, few are taking the time to examine the options available for inserting advertisements into feeds.

Ultimately, the advertisements served are going to determine the success of RSS as an advertising medium. The ads served must be related to the content contained in the feed. If the RSS feed contains quality content, the ads are relevant, and the volume of ads is in balance with the volume of content served, advertising in RSS feeds will succeed. Let's take a closer look at some of the ad serving options currently available for RSS feeds.

Review of Current Options

Google AdSense for Feeds

Google's AdSense for Feeds offers contextually targeted advertisements, with a wide selection of advertisers. Google chooses not to divulge the percentage of revenue that is shared with the publisher, so it is difficult if not impossible to predict monthly revenue. The current Google AdSense system for feeds is tied to blogs and does not appear to be overly flexible.

Pheedo

Pheedo displays categorized advertisements rather than contextual advertisements. The upside is that Pheedo's advertisements can be used in conjunction with Google AdSense or AdSense for Feeds without violating Google's contract. Pheedo works with the publisher to serve advertisements from categories similar or related to the feed's contents.

Pheedo's system allows for advanced ad filtering, giving publishers control over keyword filtering, specific ad filtering and URL filtering. Pheedo's system also allows publishers to sell ads to existing advertisers with whom they already have a relationship. The revenue split is 50%, and ads can run as flat-rate sponsorships or as pay-per-click advertisements, where the publisher is only paid if the advertisement is clicked.

Kanoodle for Feeds

Kanoodle's system for providing advertisements in feeds is similar to Google's, but they do not have the breadth of advertisers that Google boasts. Advertisements are served based on topics, not keywords. Kanoodle shares 50% of the revenue generated from the advertisements with the publisher serving the ad.

Evaluating Options

When evaluating feed ad serving solutions consider the following:

1. Ad Relevance
In order to generate revenue from RSS advertisements, or for an advertising campaign to succeed using RSS as a channel, it is absolutely critical that the advertisements served in the feed relate to its content. The more related the content, the higher the likelihood that the advertisements will interest the reader and be clicked. The closer the ads relate to the feed's theme, the higher the likelihood the reader will have a genuine interest in the product or service being advertised.

2. Ad Ratio
Publishers need to retain control over the frequency of advertisements. Advertisers may be happy because they are reaching a targeted audience, and publishers because their advertisements are being clicked and generating revenue, but readers will become frustrated with feeds that are too heavily laden with advertisements. A short sketch after this list shows one way to keep the ratio in check.

3. Clearly Denoted as Ads
The debate over editorial control and advertisements rages on. It is generally considered proper net etiquette for publishers to clearly mark advertisements to distinguish them from editorial web content. When selecting an RSS advertising partner, consider the context in which the advertisements are displayed. Does it blend with the feed or site while still being clearly marked as sponsored material? Or does the content blend so well that it appears to be a product or service endorsement from the publisher? Credibility and reputation online matter, and properly separating and denoting advertisements will go a long way to enhance credibility with readers.
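Here is the sketch promised above: a minimal Python example of points 2 and 3 combined, inserting one clearly labelled sponsored item after every Nth content item in an RSS 2.0 feed. The ad copy, link and ratio are hypothetical placeholders; in practice a network like those reviewed above would supply the ads:

import xml.etree.ElementTree as ET

def insert_ads(feed_xml, ad_title, ad_link, every_n=4):
    # Insert one clearly labelled sponsored <item> after every
    # `every_n` content items. Assumes an RSS 2.0 feed (rss > channel > item).
    root = ET.fromstring(feed_xml)
    channel = root.find("channel")
    for i, item in enumerate(channel.findall("item")):
        if (i + 1) % every_n == 0:
            ad = ET.Element("item")
            ET.SubElement(ad, "title").text = "Sponsored: " + ad_title
            ET.SubElement(ad, "link").text = ad_link
            ET.SubElement(ad, "description").text = "Advertisement"
            # Place the ad directly after this content item.
            channel.insert(list(channel).index(item) + 1, ad)
    return ET.tostring(root, encoding="unicode")

Because the ratio is a single parameter, a publisher can start conservatively (say one ad per ten items) and adjust based on reader feedback.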

Clearly, as RSS increases in popularity, publishers will be looking for ways to monetize their content. Advertising in RSS is a logical step, and striking a balance between quality, consistent content and occasional related advertisements will lead to the success of advertising in RSS feeds. If the balance is not found, publishers may be forced to move to a subscription RSS feed model.

The Wall Street Journal was one of the first content publishers to announce a subscription model. Rather than embedding advertisements in the RSS content feeds, the Wall Street Journal provides teaser copy, and if the subscriber wishes to view the expanded content they are charged a subscription fee.

Time will determine the long-term viability of advertisements in RSS feeds. If RSS advertisements perform like the contextual text-based ads currently served on websites, RSS advertisements will likely become commonplace, while content publishers who specialize in unique, consistent content might find the subscription model more effective.


About The Author
Sharon Housley manages marketing for FeedForAll software for creating, editing, publishing RSS feeds and podcasts. In addition Sharon manages marketing for FeedForDev an RSS component for developers.

posted by Scott Jones @ 8:47 am

Wednesday, February 08, 2006

Google Big Daddy SearchQuake About to Rumble Your Ranking?

Running ranking reports for clients is a standard part of an SEO's job. This week I created a position report for a client - one for which we'd made significant gains in ranking for their targeted search phrase - and proudly sent off the report to them before a scheduled conference call to discuss our progress and status.

The client sent an email upon receiving the report saying "There is something wrong with your report - we rank higher than this report claims." I went back to Google and typed in the search phrases to find rankings exactly where the report showed them the previous day.

I explained to that client that Google has (at last count) nine data centers serving up search results, and that they were getting results from a data center in the Eastern US which showed different results from those shown to us here in California.

The difference was substantial enough to move the client from page two to page one in the search results, and therefore made a dramatic difference in their satisfaction with our work. Differences that substantial are rare in the ranking reports we'd previously observed, so I dug a bit deeper into the issue and sent the note below to the client.

"Take a look at this link where Google datacenter IP addrresses are listed in detail."

http://www.webworkshop.net/seoforum/viewtopic.php?t=548

"Here is an overview of a coming update to all Google datacenters expected in February or March of 2006."

http://directmag.com/searchline/1-25-06-Google-BigDaddy/

"So you ARE ranking better from your area of the country and that particular data center which returns results to you. Things usually update to match in all data centers, but sometimes you may do better in one data center than in others. If you search from each individual IP address in that list discussed in the forum linked above, you'll see different rankings and may find datacenters where you rank at the bottom of page two of results."

You might also search from that new "Big Daddy" data center referenced in that article above, which discusses upcoming Google ranking algorithm changes due soon.

http://66.249.93.104

Where I'm seeing you ranked at #17 (bottom of page two).

It's a measure of where you might expect to be when Google moves to that new algorithm for all data centers in February or March. (Of course we continue to work to achieve better results before then.)
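For readers who want to try this comparison themselves, here's a rough Python sketch of the idea. Everything in it is an assumption for illustration: the datacenter IPs are examples from the forum list above and won't stay valid, Google's result markup changes constantly (so the regex is deliberately crude), and scraping result pages may violate Google's terms of service:

import re
import urllib.parse
import urllib.request

# Example 2006-era datacenter IPs from the forum list above.
DATACENTERS = ["66.249.93.104", "64.233.161.104"]

def rank_of(domain, query, ip, num=30):
    # Fetch one datacenter's result page and return the position of the
    # first result linking to `domain` (a very rough screen-scrape).
    url = "http://%s/search?q=%s&num=%d" % (ip, urllib.parse.quote(query), num)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "ignore")
    links = re.findall(r'<a href="(http[^"]+)"', html)
    for pos, link in enumerate(links, start=1):
        if domain in link:
            return pos
    return None

for ip in DATACENTERS:
    print(ip, rank_of("example.com", "your target phrase", ip))

Differences in the printed positions across datacenters are exactly the "different Googles" effect described above.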

This upcoming change in algorithm and the interestingly named server "Big Daddy" were publicly posted on Matt Cutts' blog for beta testing by SEOs (and other Google watchers) who read him regularly. (For those who don't know, Cutts is a software engineer at Google and shares SEO tips on his blog.)

http://www.mattcutts.com/blog/

Of course, this news was a bit much for the client to digest in one chunk, and he had little time to read the articles I referenced in my note above, but it was enough to assure him that I knew what I was talking about and to explain the differences between my report and his own keyword searches at his end of the country. It's a bit odd to try to explain to a client that "there are different Googles." Few know or understand this.

Another issue cropped up later in the day when I was doing further research for a different client and found, while we were speaking on the phone, that his results differed from my own on specific query operator searches. We were using the "site:businessdomain.com" query operator and the "allinurl:pick-your-own-URL" query operator to limit search results and got vastly different numbers of results and rankings for the same searches.

The first stunning thing in this example was that we are less than 25 miles apart in Southern California. The second shocker was that I tried simply hitting the "Search" button a second time after getting the first results page and things changed again! All of this happening in a single day makes me believe that some percolating of results is going on as Google eases into an algorithm change.

Perhaps this is not all that unusual, but in seven years of this work I've never seen volatility like that of January 2006. Are we about to have a major SearchQuake? Is Google about to split the earth and spew volcanic new results? Stand by for the BigDaddy SearchQuake sometime this month or next.


About The Author
Mike Banks Valentine blogs on Search Engine developments from http://RealitySEO.com and can be contacted for SEO work at: http://www.seoptimism.com/SEO_Contact.htm. He operates a free web content distribution site at: http://Publish101.com

posted by Scott Jones @ 8:19 am

Monday, February 06, 2006

Top Dirty Linking Tricks

Part of achieving top search engine positions is getting links from other Web pages. These links can come from people who like your site (natural links), reciprocal linking, directory submissions and a few other sources.

The goal of trading links is to get quality links for quality links. True quality links will carry benefits far beyond attaining a coveted position in the search engine results: the links will bring traffic from the Web page linking to yours. Therefore, you want to ensure you trade or barter links with quality partners.

Sometimes it's hard to determine who is a quality linking partner, even for the expert. So, how can you tell if your link is on a Web page where it will have little value?

The short list below highlights ways of diminishing or nullifying the value of a link to your site from another Web page.

Meta Tag Masking - this old trick simply used CGI codes to hide the Meta tags from browsers while allowing search engines to actually see the Meta tags.

Robots Meta Instructions - using noindex and nofollow attributes lets the novice link partner see the visible page with their link while telling the search engines to ignore the page and the links found on it. Nofollow can be used while still allowing the page to be indexed, which gives the impression that the search engines will eventually count the link.

Rel=nofollow Attributes - this is not a real attribute under the HTML standards, but rather one approved by the search engines to help identify which links should not be followed. This attribute is often used on blogs to prevent comment and link spam. The link will appear on the Web page and in the search engine's cache, but never be counted.
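Before moving on, note that you can spot-check a partner page for these first three tricks yourself. Here's a minimal Python sketch: fetch the page, read the robots Meta tag, and see whether the link back to your site carries rel="nofollow". The URLs in the usage line are hypothetical:

import urllib.request
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self, my_url):
        super().__init__()
        self.my_url = my_url
        self.robots = None        # contents of any robots Meta tag
        self.found = False        # is our link on the page at all?
        self.nofollowed = False   # does our link carry rel="nofollow"?

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots = a.get("content", "")
        if tag == "a" and self.my_url in (a.get("href") or ""):
            self.found = True
            if "nofollow" in (a.get("rel") or "").lower():
                self.nofollowed = True

def audit(partner_page, my_url):
    html = urllib.request.urlopen(partner_page).read().decode("utf-8", "ignore")
    p = LinkAudit(my_url)
    p.feed(html)
    print("link present:", p.found)
    print("robots meta:", p.robots)   # watch for noindex / nofollow
    print("rel=nofollow on our link:", p.nofollowed)

# audit("http://partner.example.com/links.html", "http://www.mysite.com")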

Dynamic Listing - dynamic listing is the result of having links appear randomly across a series of pages. Each time the link is found on a new page, the search engines treat it as a fresh link. It is entirely possible that the link won't be on the same page at the next search engine visit. So, a link from a partner displaying rotating, dynamic link listings rarely helps.

Floating List - this one is easily missed when checking link partners. Essentially, your link could be number one today, but as new link partners are added your link moves down the list. This is harmful because links near the bottom of the list are considered of lesser value than links at the top. With a floating list, it is possible for your link to be moved to a new page whose PR value is significantly lower or non-existent, and the new page may not be visited and indexed for months.

Old Cache - the caching date provided by Google indicates the last time the page was cached. Pages with lower PR values tend to be visited and cached less often than pages that have medium to high PR values. If the cache is more than six months old, it can be surmised that Google has little or no desire to revisit the page.

Denver Pages - while Denver, CO is a nice place to visit, Denver Pages are not where you want to find your link in a trade. Denver Pages typically have a large number of links grouped into categories on a single page - some people call this the mile-high list. These pages carry no true value in the search engines and are not topically matched to your site.

Muddy Water Pages - these are dangerous and easy to spot. Your link will be piled in with non-topically matched links with no sense of order. It's like someone took all the links and threw them in the air to see where they land. These are worse than the Denver Pages.

Cloaking - cloaking is the process of providing one page to people while providing a different page to search engines. You could be seeing your link on the Web page, but the search engines may never see it because they are served a different copy. Checking Google's cache is the only way to catch this ploy.

Dancing Robots - this can be easily done with server-side scripting like PHP and is rarely easy to catch. In this situation, people who attempt to view the robots.txt file receive a copy that does not include exclusion instructions for the search engines. However, when the search engines request the robots.txt file, they receive the exclusion instructions. The links pages will therefore never be crawled, and you'll never know why without expert assistance.
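One crude way to catch a dancing robots.txt is to request it twice - once as a browser, once claiming to be Googlebot - and compare the two. A minimal sketch follows; note its limits: a server that cloaks by IP address rather than User-Agent would still slip past, and the URL is hypothetical:

import urllib.request

def fetch_robots(site, user_agent):
    # Fetch /robots.txt while presenting the given User-Agent string.
    req = urllib.request.Request(site.rstrip("/") + "/robots.txt",
                                 headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req).read().decode("utf-8", "ignore")

def compare(site):
    as_browser = fetch_robots(site, "Mozilla/5.0")
    as_bot = fetch_robots(site, "Googlebot/2.1 (+http://www.google.com/bot.html)")
    if as_browser != as_bot:
        print("Warning: robots.txt differs by User-Agent - possible cloaking")
    else:
        print("robots.txt is the same for browsers and Googlebot")

# compare("http://partner.example.com")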

Meta Tags and Robots.txt Confusion - which instructions carry the most weight? Don't know the answer? Shame. Search engines do. If they conflict, the page's Meta tags are typically considered the rule to follow.

Link the Head - links placed in the page's head section do not count in the search engines and do not show up on the Web page, but they do get counted by scripts or programs designed to verify that links exist. These programs only look for the URL within the source code of the Web page.

Empty Anchors - this is a nasty trick, but it can be an honest mistake. The links exist and are counted by the search engines, but unfortunately they are neither visible nor clickable on the Web page. So, there is no traffic value from the link.

The goal of trading links is to trade them for equal value. Understanding the ways people attempt to prevent a quality link value from passing from their Web page to yours can help you avoid these useless links. If your link partner pulls underhanded tricks, the links they trade you are useless.

While you may never be an expert in all the latest tricks, traps and tests, you can now become an expert in the thirteen mentioned above. Ensuring your link partners are not using these tactics will help improve the quality of links you gain from other Web pages. By having quality links pointing to your Web page you will gain additional traffic through organic search engine results and visitors driven directly from your linking partners.


About The Author
Lee Roberts, The Web Doctor®, is President/Founder of Rose Rock Design, Inc. a website design company and Founder of the Apple Pie Shopping Cart, an ecommerce shopping cart.

2006 © Lee Roberts. All Rights Reserved.

posted by Scott Jones @ 8:38 am

Friday, February 03, 2006

Web 2.0: The Next Big Thing or the Evolution of a Technology?

Is it a movement? A revolution? Perhaps a new paradigm? Or is it just hype designed to sell a bunch of new software? Just what is Web 2.0?

Well, the term has been around since 2003. It was coined by I-Net pioneer Dale Dougherty and introduced at a conference by Tim O'Reilly of O'Reilly Media, Inc., who has subsequently made attempts at defining just what Web 2.0 means. In his seminal document entitled What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software, O'Reilly describes Web 2.0 as follows:

"Like many important concepts, Web 2.0 doesn't have a hard boundary, but rather, a gravitational core. You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core."

- Tim O'Reilly

Okay, that's a starting point of sorts - gravitational core, set of principles and practices, veritable solar system. The fact is, O'Reilly, the champion of Web 2.0, has written eloquently on the subject, but after reading his detailed explanation, you still walk away scratching your head. Additional research clearly demonstrates that there's a lack of consensus.

Tim Bray, writing at http://radar.oreilly.com, strongly contests the use of the term Web 2.0, calling it nothing more than a meme. Okay, so what's a meme? Well, we have to go back to 1976 to find the origin of the term created by Richard Dawkins in his text, The Selfish Gene. In it, Dawkins describes memes broadly:

"Examples of memes are tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation."

Okay, now we're getting somewhere. Web 2.0 is a catch phrase, and one that's getting a lot of attention within the e-commerce community. In fact, since the term made its way into the collective I-conscious, there have been more than 9 million Google searches for Web 2.0 information. Somebody's interested.

Yes, there's something there, and when you cut through the hype, delete the meme and study the underlying concepts, Web 2.0 does offer some thinking points for every site designer, host and owner. Let's look at some of the parameters of this new way of thinking about the www.

Extreme Trust
A great catch phrase in its own right. Extreme trust is a new vision for using the collective knowledge of Internet users, demonstrated by the ascendancy of Wikipedia. In the world of Web 1.0 (the model for the past decade), the Internet was a source of information. However, the information was static. You could access World Book or The Encyclopedia Britannica on-line, but all you could do was read it, print it out and use it for your child's homework.

Sites, such as Wikipedia and the Open Directory Project are changing this dynamic based on the concept of extreme trust.

Wikipedia is a growing collection of information (over 100,000 unique entries) submitted and edited by volunteers. It changes daily, hourly, providing the latest information from a variety of writers of varying degrees of expertise. Information can be edited by anyone who knows more about the topic than the original poster. In fact, if you access certain topics on Wikipedia, you'll see warnings that certain encyclopedia entries have not been reviewed, and therefore, the content can't be deemed as accurate - yet. However, as more experts, operating under the doctrine of extreme trust, review each Wikipedia entry, the reliability and veracity of the content increases.

Thus, in the Web 1.0 world, people could access information, but not participate in its evolution. In the new age of Web 2.0, the collective intelligence of the world community becomes accessible and utile.

Personal Participation
Another much-touted aspect of Web 2.0 is personal participation. Personal web sites have been around for years. You could post family pix and tell the world what you did over summer vacation. But these personal web sites never really caught on because of the expense and time required to launch and maintain them.

Enter the web log, aka blog. These personal journals encourage greater individual participation by enabling anyone with an opinion, idea or random thought to post these personal musings for all the world to see. Bloggers have changed the way information is disseminated. Many have garnered credibility as legitimate news sources. In fact, bloggers have received press credentials for newsworthy events. They're used by the mainstream media as references, and several of these bloggers have broken major news stories before their larger print and on-line competitors, e.g., Robert Novak's outing of Valerie Plame as a CIA operative.

The concept of personal participation has also spilled over into the realm of e-commerce, with many on-line businesses offering a blog and/or forum where customers, clients and other interested parties can post their thoughts. Amazon.com is a leader in this area, encouraging its customers to submit reviews of purchased products. In fact, some Amazon reviewers have made names for themselves - and customers seek out their recommendations! As the old anti-war chant once demanded, Power to the People has finally been realized.

In fact, if you tour the Amazon site, you'll discover opportunities for customer participation on virtually every page. Amazon's subsidiary, Booksurge.com has also simplified the entire publishing process. Authors no longer have to approach traditional publishers, hat in hand, begging to be published. Booksurge and Amazon have made it possible for anyone to write, publish and sell texts through Amazon, B&N, Borders and other on-line outlets. Yes, this is part of the Web 2.0 model.

Static versus Dynamic
Netscape was the browser of choice in the Web 1.0 era. It was published, then updated regularly in various versions identified as Netscape 1.0, 2.0, etc. This was a static business model in which users had to wait for improvements to be made, then download the updates.

Fast forward to the dynamic age of Web 2.0 where Google reigns supreme. Google is a true child of the Internet. It was made to fit with I-net dynamics. Improvements are made and implemented daily - seamlessly. No downloads, no patches required. The result? Google has enabled all of us to access the most obscure factoid in a nanosecond. Its index contains billions of pages of spidered text and as more new sites sprout like mushrooms, more pages are spidered and the index grows.

Google has demonstrated how to do it right. It's highly interactive, it's never static, and it has created many new avenues for the e-commerce community and for users in search of the name of the pharaoh who was in power when the rotary mill was introduced in Egypt. This has increased productivity exponentially.

The Evolution of Technology
Technology evolves. It builds on what came before. It learns from past mistakes and takes advantage of unrealized opportunities. This is as true of America's Industrial Revolution as it is for the Internet. There were lots of false starts, missteps and abject failures during the rise of technology in the early and mid-1800s. The same is true of the current technological revolution underway on your computer screen daily.

Remember the original Priceline model? You could spend two hours saving 9¢ on a can of peas. Nice try, but no cigar, despite William Shatner's campy commercials. Or how about buying pet food on-line? That went down in flames, too. In fact, all you have to do is look at the I-net bubble that burst in 2000 to see the shake-out of what was working and what wasn't. A lot of investors lost a ton of cash, but the Net didn't shrivel up and die. In fact, it's more powerful than ever.

Technology doesn't move forward in a straight line. It never has. There are offshoots, improvements and lots of really, really bad ideas along the way. (Anybody remember the Ford Edsel?) Internet technology is no different, except that the shakeouts occur much faster, the improvements take off much quicker and the really, really bad ideas are really, really expensive. Just ask Shatner. Such is the nature of technological evolution.

So, Is Web 2.0 A Revolution?
Tim O'Reilly and the other promoters of Web 2.0 have done us a service by focusing attention on new uses for the Net. RSS is a radical step forward. Podcasting, though in its infancy, is coming on strong, having caught the attention of advertisers as a new means to reach the cutting-edge public. In fact, just as anyone can set up and maintain a blog, the technology now exists to set up your own broadcast network, complete with specialized shows for niche markets like expectant parents or home-schoolers.

However, Web 2.0 also has aspects of a meme. Many on-line businesses have picked up the term and now proudly display a Web 2.0 logo on their home pages, even though the site has virtually no new features.

No, Web 2.0 isn't a new paradigm or a revolution. It's the natural evolution of a technology that's growing at truly heart-stopping speed. What was yesterday won't be tomorrow.

In the weeks and months ahead, we'll take a much closer look at this evolutionary track to sort hype from help, and to assist you in finding new, better ways to increase site traffic, improve your conversion rate and expand your repeat-customer base.

For now, Google "Web 2.0" and start doing your homework. Changes are coming. Will you be ready? If not, you won't be here tomorrow.


About The Author
Frederick Townes is the CEO of W3-EDGE Web Design. W3-EDGE specializes in business web design for clients of all sizes. They also provide quality professional web hosting through W3-HOSTING.net. Contact Frederick at ftownes@w3-edge.com.

posted by Scott Jones @ 9:51 am