Wednesday, March 29, 2006
Domain Name Insanity: Does Your Name Really Matter?
Your domain name is the .com, .net, .org or some other dot something that people use to get to your web site. Affiliateblog.com is mine.
A group of investors headed by Jake Winebaum (the guy behind Disney's go.com) paid $7.5 million for the name Business.com back in 1999, aiming to make it a showcase B2B site. According to their own press they have succeeded. Yes, it's a terrific name - short, sort of descriptive and easy to remember. There's some cachet there, but is it worth $7.5 million? That cash could have bought a lot of promotion or branding for whatever name they could have had for ten bucks, or a hundred, or two hundred grand.
Amortized over 15 years, the first $500K in profit each year goes toward paying off that domain name. That could also pay for a terrific affiliate program, a truckload of banner and PPC advertising, and a nice BMW lease for Mr. Winebaum (who probably doesn't need a BMW).
But the Business.com thing has set off a wave of domain name speculation that staggers the mind. People are snapping up domain names and ransoming them off to wide-eyed entrepreneurs with business plans and dreams of riches. Being a hardcore capitalist I am torn about domain name speculation - I am tempted to applaud the person making a buck by getting there first and grabbing up the good names, but I am annoyed at the restraint of commerce that takes place while someone negotiates with one of these guys to get the right name.
So if I look at the top 50 websites on Alexa, most of them should be easy-to-remember names, right? Wrong. I would argue that only one, match.com, is an easy-to-remember name that describes what the site is about.
I keep hearing that the reason these so-called generic or descriptive domain names are so valuable is that some people just type domain names into the address bar of their browser rather than using a search engine. That claim seems intuitively false to me. I find it hard to believe that someone looking for information on a particular business would type in www.business.com.
I wondered how many people actually type in their address bar (address bar?) instead of using a search engine anyway. I didn't find the answer, but Jupiter Media tells me that 64% of people looking for something use a search engine.
That means that 36% of people use something other than a search engine. What makes me believe that people typing stuff into their address bar doesn't happen much is this simple fact...of the people using search engines last November, 43% searched for common websites like Ebay. In other words, instead of typing in http://www.ebay.com, people Googled Ebay and clicked on one of the results. That is absolutely hysterical. And totally believable.
What do all these facts mean? They mean that as far as getting the person there the first time, everyone starts off on the same square. If your domain name can get the minority of people who just type into their address bar to your website without a search engine, it's worth more than one that can't.
Here are some of the legendary domain name sales in the past several years, according to Zetetic:
Amount ($) - Year - Domain
12,000,000 - 2006 - sex.com
7,500,000 - 1999 - business.com
5,500,000 - 2003 - casino.com
5,000,000 - 2002 - asseenontv.com
5,000,000 - 1999 - korea.com
3,500,000 - 1996 - worldwideweb.com
3,350,000 - 1999 - altavista.com
3,300,000 - 1999 - wine.com
3,000,000 - 1999 - eshow.com
3,000,000 - 1999 - loans.com
2,750,000 - 2004 - creditcards.com
All of these with the exception of eshow.com (computer networking) should get address bar traffic, because people who type will type in the descriptive names - if I'm looking for sex-related stuff, I'll type in sex.com. Where my mind gets boggled is in ROI. If you're selling something on asseenontv.com that nets you $25, you'll need to sell 200,000 of those George Foreman grills just to pay for your domain name.
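The break-even arithmetic above is easy to check. Here's a quick sketch (the function name and the per-sale figure are just illustrations of the paragraph's numbers):

```python
# How many sales does it take to recoup a domain purchase price?
def units_to_break_even(domain_price, profit_per_sale):
    return domain_price / profit_per_sale

# asseenontv.com at $5,000,000, netting $25 per George Foreman grill:
print(units_to_break_even(5_000_000, 25))       # → 200000.0 grills

# business.com at $7,500,000 amortized over 15 years:
print(units_to_break_even(7_500_000 / 15, 25))  # → 20000.0 sales per year
```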
It also dawned on me that if you pay $12,000,000 for sex.com, the free publicity generated is probably also worth millions.
So now everyone gets dollar signs in their eyes and thinks they can make a million with their domain name. Here are some examples of asking prices from Ebay:
6usiness.com - 7,000,000 (yes, that's a 6)
ajobformom.com - 3,500,000
Exbay.com - 1,000,000
What does this mean for you? Well, there's some good news and some bad news. Remember back a few paragraphs when I said that everyone starts on the same square? That's really the good news. You can choose a pretty good domain name, put together some terrific content, employ some simple Search Engine Optimization and buy some keywords or exchange some links and you have a pretty good chance of getting people to your site the first time. Since most of them are coming via a search engine they're not going to notice your domain name until they get there anyway, so your domain name means the same thing (nothing) to the majority of people using the search engine.
One last thing: if you're hoping to be close to the top in the search results (the so-called organic results), having your keywords in the name of your website gives you a huge boost. For example, if you're searching for affiliate blog, we will be in the top five search results. In this case, Google ignores the TLD unless you tell it otherwise. Affiliateblog.info will come up before us because their PageRank is higher (that's a discussion for another day). So if you think getting near the top of the organic search results is more important than having someone type your name directly into the address bar (and you very well could be right), then grab yourkeyword.cc or yourkeyword.to. I've done it, and I've suggested it to others.
Once the user comes to your site the name just needs to be memorable enough so they type it in to get there the next time. Or they may forget and Google you again. I do it every day. No matter how great your name is, if the content is lousy they won't come back anyway.
So should you buy a domain name? I don't know - I bought this one. And I made honorable mention in the Domain Name News for the price I paid ($2500). I bought the name because I liked it, I liked the number of incoming links to it, and I felt comfortable paying for it. I've never paid more than a couple hundred dollars for a domain otherwise, and I have more than 200 of them. My favorite by far is Blozzo.com, which I just bought for $25. I have a pretty terrific idea in mind for Blozzo too.
I would try to come up with my own name before I bought someone else's. Here are some tips:
1. Try to go with a .com. It's the name everyone associates with the Internet. Any other Top Level Domain (TLD) like .org or .net is just going to confuse people, unless it sounds better than the .com. For example, if you are about networking or a network, a .net is more natural. If your site is informational, you should use .info if it sounds okay. One of my favorite $10 domains is seosecrets.info. I think it sounds good. Hands down the most ingenious use of a TLD is del.icio.us, the social bookmarking site. The use of the .us TLD is absolutely brilliant.
2. Leave out the dashes and meaningless numbers. If it's a choice between this-domain.com, thisdomain123.com and thisdomain.net, take the .net. No one remembers to put the dashes or the numbers in, unless they are an integral part of the name like studio54.com or e-books.com.
3. Use the fewest letters possible to describe what you do. I own Purple Monkey Media Group. Purplemonkey.com would have been perfect. It's taken, of course. Purplemonkeymedia.com was not. I grabbed it. I could have taken purplemonkeymediagroup.com, but it would have been too long. Remember, every additional letter is a potential typing error.
4. If you have a domain name that needs to be reinforced, get a good logo and sprinkle it liberally on your web site, along with some slogan that will reinforce the name in people's minds. You would be surprised at how inexpensive this can be.
5. If you can save a few bucks with your own domain name or by buying a cheaper domain name, do it, and use the money to get yourself placed higher in the search results or Adsense placement.
6. If you can't come up with a descriptive domain name, go the other way. Depending on your site's focus, pick a memorable short name that will stick in people's minds, get a great logo and include the name prominently in your advertising and marketing. It's called branding, and it's tried and true.
7. Ask your wife, friend, boyfriend, husband, dog, lawyer, associate, Mom, Dad, cousin, uncle, Police Chief, blog writer. They're smarter than you anyway, and they're the ones who will be looking for the site, not you. Some of my best ideas have come going to or from somewhere with my wife and just brainstorming.
Here's the bad news: it may take you a while to come up with the right name. There's more good news though - in the real world most domain names sell for $1,000 or less.
Can't get started? Go to a site that sells domain names, and put in a word that describes your business. See if the name is taken (it probably will be). Open your word processor or go to thesaurus.com and put the word in. Get a few more words. Check those. If there's a .com available and it looks good, grab it. If not, add the word site or blog or online to your word, and see if that works. Don't wait. If you think it might be usable, spend the $9.00. I came up with blogduck.com. I liked it. I decided to think about it some more. Someone grabbed it that afternoon. Just chisel loose the nine bucks (or less) and buy the domain.
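If you'd rather script the brainstorming routine above, here's a rough sketch. The suffix list and the DNS-lookup heuristic are my own illustration, not a registrar check: a name can be registered without resolving, so a negative result only means "worth checking with a registrar".

```python
import socket

def variations(word):
    """Brainstorm .com variations the way the article suggests:
    the word itself, plus 'site', 'blog', and 'online' appended."""
    return [f"{word}{suffix}.com" for suffix in ("", "site", "blog", "online")]

def probably_taken(domain, timeout=3):
    """Rough heuristic: if the name resolves in DNS it is almost certainly
    registered. The reverse is NOT true, so treat False as a hint only."""
    socket.setdefaulttimeout(timeout)
    try:
        socket.gethostbyname(domain)
        return True
    except OSError:  # includes socket.gaierror for unresolvable names
        return False

candidates = variations("blogduck")
print(candidates)
# Uncomment to probe DNS (requires network access):
# for name in candidates:
#     print(name, "likely taken" if probably_taken(name) else "maybe available")
```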
If you want something a little more sophisticated there are several sites that are good for helping you come up with a name, like DomainsBot and Nameboy.
If you draw a blank, go over to Sedo or Afternic and see what's for sale. Search for a word that describes what you think people will associate the name of your site with, and see what pops up. That may give you some ideas.
These sites and more can be found in Tools section of http://affiliateblog.com.
Domain Name Journal tracks domain name sales. Going there is always fun.
About The Author
Matt DeAngelis runs AffiliateBlog.com. Matt is the former Chief Technology Officer of Modem Media, a pioneer in the Internet ad space. As a foot soldier in the Internet revolution, Matt devised the technology behind ad campaigns and online presence for a good portion of the Fortune 100.
posted by Scott Jones @ 8:41 am
links to this post
Monday, March 27, 2006
The Surprising Truth About Ugly Websites
Ugliness has never looked better. I have spent the last few days examining a surprising trend in web design that has made ugly websites look absolutely irresistible. No, it's not the bolded, 18 point Times New Roman font shouting at me as I access the page that has me excited, nor is it the harsh colors that have actually managed to make my eyes hurt and distort my vision. It's not even the logo, so pixelated from being processed, resized, saved, and edited so many times that it appears blurred to protect the identity of the company that owns the website. So what has me singing the praises of ugly websites?
That's right – ugly websites are surprisingly effective in making money. As a person who puts business before technology, I find a profitable website unbelievably attractive.
The Case of Plenty of Fish
I was struck by an example of just how effective ugly websites can be this past week as I was browsing through some web related news. I stumbled across the story of Plenty of Fish. This is a very plain looking website that offers a free online dating service much like Match.com (but without the subscription fee). There was nothing specifically impressive about the website that stood out to me, in fact the site was actually rather ugly.
What caused me (and I am sure several other people) to take a second look at the website was its reported earnings. It is reported that this website brings in over $10,000 from Adsense – in one day. Yes, you did read that correctly. For those of you counting, that is $300,000 per month and nearly one million dollars in just three months.
The example of Plenty of Fish led me to consider how an ugly website could be so successful. As I looked around, I suddenly realized that this was not the only successful ugly website. Ebay is unbelievably ugly; Craigslist has never won an award for innovative design, and IMDB has never even bothered to format their text out of the default Times New Roman. What is it about ugly websites that makes them so successful?
The Ability to Convey Trust
A while back I wrote an article on Controlling Your Visitors' Eyes. The main point of that article was that you have less than a second to convey your marketing message to your visitor, and that every aspect of your website, from your font selection to the colors, navigation, and layout, plays a part in conveying that message.
When I wrote this article, I had beautiful, CSS designed websites in mind. The idea that an ugly website could present a positive message never crossed my mind. Yet the fact is, ugly websites do have the ability to present the perfect marketing message. What is that message?
You can trust us. We are a family run business and do not employ a marketing team. Our website is simple, but functional. Most importantly, our goal is to serve our customers, not necessarily learn HTML.
As Internet professionals, we often forget that a large part of our society is actually afraid of the Internet. Although online shopping is growing, most people still have concerns about online security and the impersonal nature of the web. Most people do not know how to surf efficiently and use only the default tools that are given to them when they take their computer out of the box.
And this is one reason that ugly websites can sell. The lack of professionalism and a polished look leads one to believe that they are dealing with an individual. Websites cannot be trusted, but individuals can be trusted.
Function Over Form
Although the above theory holds true in many examples, I believe there is more to the success of ugly websites than just conveying trust. Many of the websites that I referenced above have one underlying trait that can be attributed to their success: they are extremely easy to use.
Google is probably the best example of how functionality over form can lead to success. When Google initially launched, every other major search engine was in the process of transforming itself into a portal that would offer users all the information they could possibly want, and probably more than they really would want. Google, on the other hand, made its website ridiculously simple. There is one purpose to Google – to search the web. Nothing else was there to distract you from this one goal. It certainly did not hurt that Google was able to serve up relevant results, but the simplicity of the system was key to winning over users.
Sites like Drudge Report and Craigslist can also trace much of their success back to their functionality. Drudge Report is a very simple website that is essentially a collection of links to news stories. Most of the time, the Drudge Report does not even link over to content on their own website. Users who wanted an interesting collection of links to various news stories could find them all on one simple page. Craigslist also boasts simplicity. The website is simple to browse, simple to post, and simple to use. Because of its simplicity, it grew.
The general lesson here is simplicity. A beautiful website may draw a user in initially, but a simple website will keep your users coming back. If one of your users gets lost trying to navigate your website, check out of your web store, or find simple contact information, then you are unnecessarily increasing the chances that this user will simply leave.
Ugliness By Application – Not By Rule
Although ugly websites are often easier to use and can convey a unique sense of trust, ugliness is not a rule that should apply to all websites. In fact, the vast majority of websites can be improved by adding formatting and focusing on good site design principles.
There are two general rules that you must keep in mind when building your website: 1) What type of message will resonate with my visitors, and 2) Is the site easy to use?
Knowing the answer to the first question is knowing what type of visitors you are trying to reach. Are your visitors web-savvy and thus looking for a well-designed website? Are your visitors uncomfortable with the impersonal nature of the web and just looking for a simple website that is easy for them to use? Are your visitors scared of using online payment processing, or do they prefer the convenience of paying online where they do not have to talk to a person?
The second question is a rule that should apply to every website: functionality is more important than the design of your website. This does not mean, however, that a beautiful website cannot be easy to use. What this does mean is that you should never sacrifice the usability of your website for a fancy design effect or a more visually appealing website.
In Conclusion – It's Not Necessarily Ugliness That Sells
As website owners, it is very easy to get caught up in the design of our websites. We want to present our businesses to visitors in the best way possible, and as we get familiar with web technologies and design techniques, it is easy to focus solely on the design of a website from the standpoint of what looks good rather than the message our website conveys.
What we need to keep in mind, is that websites are meant to be used – used for reading, used for networking, used for shopping, etc. Websites, like any other marketing tool, convey a message and are an invitation for visitors to trust us. Our design needs to reflect this.
Take a moment today to look over your website. Is it really easy to use? Have you been more worried about the look of your website than its functionality? Would it be more effective if it were simpler in its design?
About The Author
Mark Daoust is the owner of Site Reference.
This article may be reprinted as long as all links are active, including a link to the article's original location which can be found at
posted by Scott Jones @ 3:46 pm
links to this post
Friday, March 24, 2006
How To Tap Into Massive Sources of Traffic With Virtually No Competition!
Things haven't been this perfect for a long time. In fact, not since the early days, when one of the big engines really started to take off and those lucky enough to be seated firmly in the Top 10 for their keywords reaped the rewards, has there been such a plethora of new traffic opportunities.
The internet marketing world has become multi-dimensional in ways that are surprising to most people who are still hooked on search engine marketing as their sole website promotion strategy.
Here are the new avenues of traffic that most people are NOT taking advantage of outside of the big companies and a handful of savvy marketers:
1. Podcasting
While it was "the word of the year" for 2005, most people still have no idea what Podcasting is really about outside of being able to download music at iTunes.com.
And, frankly, it's because the people who "get it" are, for the first time ever, keeping quiet about it! Or at least keeping the information of how they are profiting wildly with Podcasting behind closed doors and in small groups.
And I am not going to get too far into it here, for the same reason they keep quiet: I don't want the competition that will be here by the end of 2006 to come any earlier because I blabbed about how it is done in public!
Sorry, but you will have to pay something to become a millionaire using this information!
But I will give you a clue. Go to iTunes.com and download the free software that allows you to, yes, download music. But ignore that part for a minute.
With the iTunes software you can grab Podcasts from major news organizations and tiny garage websites mostly focusing on short comic bits.
Pay attention to what you are NOT seeing. For one thing, there aren't a lot of video Podcasts - period. Not even comedy! And there are no how-to podcasts save a couple from the people I talked about above.
The savvy marketers are already in the game and getting traffic from a source all other marketers seem to be totally discounting right now. To the tune of 6 million or so pairs of eyeballs dying to see and hear more content, especially video content, at iTunes.com alone!
Last year you could see the word "Podcasting" uttered thousands of times by marketers trying to gauge the buzz worthiness of it among their customer lists.
Well, without showing people how marketing with Podcasts is done, and with article marketing doing a great job of bringing people traffic in ways they COULD understand, not too many marketers really got into it.
We were lucky recently to have a guest speaker from none other than the Hawaiian Tropics site come in and talk to our clients about how easy it has been to blow away the likes of Playboy and other major competitors by offering their content through Podcasts.
We got to see the inner workings of a successful marketing campaign on a very very high level. And we gained valuable insight into how marketing with Podcasts can be done on the guerilla marketing level.
You watch. By the end of this year you will wish you started caring about including Podcasting in your marketing a year ago!
2. Audio and Video Syndication
Article syndication is not dead. It actually has yet to see its true "boom" period. I say that because until the content people syndicate gets MUCH better overall, we are basically using sophisticated software and networks to distribute garbage.
But as the article syndication industry slowly comes around to the fact that demanding good content is not going to hurt business (quite the opposite), there are new networks developing that will have us creating, syndicating and streaming much more audio and video around the web to promote our sites.
All-text content is wearing thin on the patience of surfers and possible customers. As the bigger sites lead the way (they almost always do when it comes to new web technology), we are seeing that our own customers, formerly content to read 15-page sales letters, are leaving for something more exciting.
As we all get used to the internet everyone promised us would be here long ago, we see that as we buy stuff and entertain ourselves on the web, we also expect small websites to measure up to the speed, excitement, movement and sound we see on many other sites today.
Less text. Much more multi-media. And as many different ways to access and consume content as we can possibly dream up. That is what's on the menu and the source of traffic is MASSIVE!
Think of all those people who left your all-text site in the last month without buying or clicking on a thing. Where exactly do you think they were headed? That's right. To sites with motion and sound to feed their brains without ruining their eyesight trying to read 10 point font at high resolution for 15 pages!
You have probably bought a product from a choice between a few different dealers just because the site was more engaging than the others. I know I have. And usually I buy from sites that look like they are really in business.
Any monkey can get a merchant account and slap up some text to sell a drop shipped product. I want to buy from people who take the time to take their business (and my credit card information) seriously! That means people who are into displaying information in formats other than all-text.
Audio and video editing and syndication tools have come a long long way in the last year. Anyone can get into the game and dominate in areas where the traffic is theirs for the taking because no one is competing with them for it yet!
So, after you are done writing your next article, while you are syndicating it around the web, make sure you remember that you are not done until you figure out ways to convert that article into audio for your site or for a Podcast. Or into a video script to power a how-to video for the same purpose.
Articles, audio, and video should become synonymous with syndication and traffic generation when thinking about your marketing campaign. Leaving any of them out of your marketing is going to cost you big time in 2006 and beyond!
About The Author
Jack Humphrey is the managing partner at Content Propulsion Lab. To learn more about how to propel your content around the web at the speed of sound, visit Content Propulsion Lab at http://www.contentpropulsionlab.com.
posted by Scott Jones @ 8:53 am
links to this post
Wednesday, March 22, 2006
Social Bookmarking for Traffic
A while back I wrote an article commenting on Yahoo's public declaration that they were effectively conceding to Google in the search market. The point of the article was that Yahoo was not necessarily giving up as a business, but rather focusing its efforts on more modern forms of search. And what are these more modern forms of search? In a word: social networks, which include social bookmarking and its variants.
What is Social Bookmarking?
Social bookmarking is one of the flagships of Web 2.0. The basic concept behind social bookmarking is that when thousands of people get together, bookmark their favorite pages, and apply descriptive tags to each page that they bookmark, certain websites will rise to the top as being more popular. The result of this is that surfers will be able to see what websites are currently popular among users.
The idea of social bookmarking seems to have been originated by Del.icio.us back in 2003. Just by visiting the front page of Del.icio.us you can see the social bookmarking in practice. On the right hand side of the page there is a column labeled 'Popular'. These are websites that currently are receiving a lot of attention from users under specific keywords and phrases. These websites are listed under common 'tags' that users have given.
Wikipedia gives a fairly good explanation of social bookmarking. You can find that explanation at http://en.wikipedia.org/wiki/Social_bookmarking. You can also go to Del.icio.us and try out the service, which is a great way to learn about social bookmarking.
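For the programmatically inclined, the core idea behind that 'Popular' column is tiny. Here is a toy sketch (my own illustration, not Del.icio.us's actual code or data): count how often each URL is bookmarked under a given tag, and the most-bookmarked pages rise to the top.

```python
from collections import Counter
from typing import NamedTuple

class Bookmark(NamedTuple):
    user: str
    url: str
    tags: tuple

# Toy data standing in for thousands of users' bookmarks.
bookmarks = [
    Bookmark("alice", "http://example.com/ruby-guide", ("ruby", "tutorial")),
    Bookmark("bob",   "http://example.com/ruby-guide", ("ruby", "rails")),
    Bookmark("carol", "http://example.com/cat-video",  ("video", "funny")),
    Bookmark("dave",  "http://example.com/ruby-guide", ("ruby",)),
]

def popular(bookmarks, tag, top=5):
    """URLs most often bookmarked under a given tag, most popular first."""
    counts = Counter(b.url for b in bookmarks if tag in b.tags)
    return counts.most_common(top)

print(popular(bookmarks, "ruby"))
# → [('http://example.com/ruby-guide', 3)]
```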
Digging for the News
Del.icio.us is not the only Web 2.0 flagship that relies on the power of the collective people. Arguably one of the most successful Web 2.0 enterprises is Digg. Digg is a news website which presents headlines from across the Internet. Unlike practically every other news website to date, however, Digg does not rely on editors to determine which news stories are worthy of their front page and which news stories they should ignore. Rather, Digg relies on the input of their users.
The system behind Digg is simple. Registered users can navigate their way to "Digg for Stories". Here everyone can see all of the stories submitted to Digg. If a user likes one of the stories, they simply click on the "Digg It" link. If they do not like the story, they can either ignore it or report it as being lame, a duplicate story, or outright spam. If a story receives enough Diggs quickly enough, it gets promoted to the front page.
The system seems to work fairly well. Digg has been smart enough to put into place anti-cheating devices which do a fairly good job of catching manipulators of their system. And if someone does break through these barriers, Digg users (often referred to as Diggnation) are usually pretty quick to point out the offending users.
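Digg has never published its promotion rules, but the "enough Diggs, fast enough" idea described above can be sketched as a toy model. The thresholds below are invented purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    title: str
    diggs: list = field(default_factory=list)  # timestamps (seconds) of each digg

# Hypothetical thresholds -- Digg's real algorithm was never made public.
PROMOTE_COUNT = 50           # diggs needed...
PROMOTE_WINDOW = 6 * 3600    # ...within this many seconds

def promoted(story):
    """True if any PROMOTE_COUNT consecutive diggs arrived within PROMOTE_WINDOW."""
    times = sorted(story.diggs)
    for i in range(len(times) - PROMOTE_COUNT + 1):
        if times[i + PROMOTE_COUNT - 1] - times[i] <= PROMOTE_WINDOW:
            return True
    return False

slow = Story("old story", diggs=[i * 1200 for i in range(60)])  # one digg per 20 min
fast = Story("hot story", diggs=[i * 60 for i in range(60)])    # one digg per minute
print(promoted(slow), promoted(fast))  # → False True
```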
Why Should I Care About These Services?
This is all fine and interesting, but you might be wondering why you should spend your precious time reading more of this article. The answer is simple: websites like Digg and Del.icio.us represent the opportunity to get a lot of new traffic as well as quality links to your website.
Digg and Del.icio.us offer the absolute best type of web traffic: viral traffic. Business owners know that the most reliable prospects are the prospects that come from the referral of someone else, and Digg and Del.icio.us offer just that. In order to get seen on a large scale from any of these websites that rely on a community of users, your content must be good enough to meet the approval of enough people to warrant the elevation of your site to the front page. This, in effect, is like one great recommendation for your website.
So how much traffic are we talking about? Darren Rowse of ProBlogger.net noted that when a post of his reached the front page of Del.icio.us, he saw around 8,000 visitors that day from Del.icio.us alone. This does not take into account all the bloggers and website owners who discovered his site from Del.icio.us, posted a link to it on their site or in a forum, which would in turn generate more traffic to his site.
Tech-Recipes, a website that appears regularly on the front page of Digg, wrote a great post on what the Digg effect is like. The traffic numbers they post are quite astounding. From being featured on Digg, they regularly see 5,000 – 10,000 visitors per day. This is not unusual either – websites that are featured on Digg are often subject to what has been dubbed the "Digg Effect". It is quite common, unfortunately, for a dug website to receive so much traffic that it brings down the server.
Now, users of both Del.icio.us and Digg do not tend to be very active. This has been pointed out by more than one person. Typically they do not click on ads, they do not comment on blogs, and they do not register for an account with you. But the name of the website marketing game is always going to be free exposure, and social bookmarking services like these are great ways to get a lot of free exposure for your website. In addition, these sites will often have secondary and tertiary effects which you may not be able to link back directly to your initial exposure on them.
I'm Sold – Where Do I Sign Up?
So you are now sold on just how great it can be to be featured on sites like Del.icio.us and Digg. The natural question to ask here is how do you get featured on these sites. I am pretty sure the answer I am about to give is going to be one that you do not like as it is a tired phrase:
You need good, unique content.
Sound familiar? If you follow SEO at all, you undoubtedly have been told that good, unique content is the best way to get to the top of the rankings. The same thing holds true, but even more so, for social bookmarking websites.
In order to be featured on these sites, your website does not have to meet the approval of an automated bot that is scouring the web for information. Instead, your website needs to meet the approval of actual human beings who are going to look at your website, determine whether they like it or not, and then tell you the honest truth.
In the past, web pages that have been successful in being featured have tended to share the following traits:
- They are usually unique
- They often have useful content, such as a tutorial
- They may contain breaking news or an exclusive report
- They are sometimes particularly humorous
- They may offer free content for downloading (free wallpapers have done well with Digg)
- They rise to the top naturally - without manipulation
After I wrote the article on Yahoo, I received an email asking how one would optimize their site for social bookmarking services. The response is simple: optimize your site by offering some great, free content that anyone can access.
A Word to the Wise – Don't Cheat
As a quick sidebar, it is important to note that those who try to cheat these systems usually find themselves worse off than they were to begin with. It is very tempting, when dealing with a system like Del.icio.us or Digg, to try to manipulate it to artificially push your website to the top.
The problem with this is simple: if you do succeed in manipulating the system but do not have the content to really deserve a featured placement, you will undoubtedly turn off more visitors than you attract. If your content deserves a featured placement, it should rise there naturally.
Social Bookmarking – The Future of Search?
The point of the article which I referenced above was not to state that Yahoo was washed up, but rather that Yahoo was on the cusp of a new Internet and a new form of search. They recognized that Google would not be beaten in the search market; however, this does not mean that they cannot beat Google by creating a market that is more effective than search.
Social bookmarking is already becoming a very effective way for experienced web surfers to find the latest information on a particular subject. Do you want to see some of the latest videos to become popular? Just go to http://del.icio.us/tags/video and you can see what others are discovering and bookmarking as valuable. Want to find some rather obscure guide on Ruby on Rails? Look up the common tags for Ruby on Rails and search through those resources.
Social bookmarking can reach where search engines cannot: by harnessing viral marketing and popular opinion, it can discover what is important before any bot can spider a site and rank it among the thousands of sites available. Granted, social bookmarking will never replace search completely, but as it grows in popularity, web users are quickly discovering a whole new way to find pages they would never discover otherwise.
So take the time today to examine Digg and Del.icio.us. Take a little more time to find new social websites like them (they are popping up all over the place) and learn what seems to make users on these sites click. Social technologies are here to stay, and they are only going to grow in popularity. Right now is a golden opportunity for you to gain great exposure for your website if you simply learn how to use these services.
About The Author
Mark Daoust is the owner of Site Reference.
This article may be reprinted as long as all links are active, including a link to the article's original location which can be found at Site Reference.
posted by Scott Jones @ 8:34 am
Monday, March 20, 2006
Predicting Search Engine Algorithm Changes
With moderate search engine optimization knowledge, some common sense, and a resourceful and imaginative mind, one can keep his or her web site in good standing with search engines even through the most significant algorithm changes. The recent Google update of October/November 2005, dubbed "Jagger", is what inspired me to write this, as I saw some web sites that previously ranked in the top 20 results for extremely competitive keywords suddenly drop down to the 70th page. Yes, the ebb and flow of search engine rankings is nothing to write home about, but when a web site doesn't regain many ranking spots after such a drop it can tell us that the SEO done on the site may have had some long-term flaws. In this case, the SEO team had not done a good job predicting the direction a search engine would take with its algorithm.
Impossible to predict, you say? Not quite. The ideas behind Google's algorithm come from the minds of fellow humans, not supercomputers. I'm not suggesting that it's easy to "crack the code" so to speak because the actual math behind it is extremely complicated. However, it is possible to understand the general direction that a search engine algorithm will take by keeping in mind that any component of SEO which is possible to manipulate to an abnormal extent will eventually be weighted less and finally rendered obsolete.
One of the first such areas of a web site that started to get abused by webmasters trying to raise their rankings was the keywords meta tag. The tag allows a webmaster to list the web site's most important keywords so the search engine knows when to display that site as a result for a matching search. It was only a matter of time until people started stuffing the tag with irrelevant words that were searched for more frequently than relevant words in an attempt to fool the algorithm. And they did fool it, but not for long. The keywords meta tag was identified as an area that was too susceptible to misuse and was subsequently de-valued to the point where the Google algorithm today doesn't even recognize it when scanning a web page.
Another early tactic which is all but obsolete is repeating keywords at the bottom of a web page and hiding them by changing the color of the text to match the background color. Search engines noticed that this text was not relevant to the visitor and red-flagged sites that employed this method of SEO.
This information is quite basic, but the idea behind the aforementioned algorithm shifts of several years ago is still relevant today. With the Jagger update in full swing, people in the SEO world are taking notice that reciprocal links may very well be going the way of the keywords meta tag (i.e., extinct). Webmasters across the world have long been obsessed with link exchanges, and many profitable web sites exist offering services that help webmasters swap links with ease. But with a little foresight, one can see that link trading's days are numbered, as web sites have obtained thousands of incoming links from webmasters who may have never even viewed the web site they are trading with. In other words, web site popularity is being manipulated by excessive and unnatural use of an SEO method.
So with keyword meta tags, keyword stuffing within content, and now link exchanges simply a part of SEO history, what will be targeted in the future? Well, let's start with what search engines currently look at when ranking a web site and go from there:
On-page Textual Content
In the future, look for search engines to utilize ontological analysis of text. In other words, not only your main keywords will play a factor in your rankings, but also words that relate to them. For example, someone trying to sell NFL jerseys online would naturally mention the names of teams and star players. In the past, algorithms might have skipped over those names, deeming them irrelevant to a search for "NFL jerseys." But in the future, search engines will reward sites that mention those related words with higher rankings than sites that excessively repeat just "NFL jerseys." With ontological analysis, web sites that speak of not only the main keywords but other relevant words can expect higher rankings.
The Conclusion: Write your web site content for your visitors, not search engines. The more naturally written sites can expect to see better results in the future.
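The idea behind this kind of related-term analysis can be sketched in a few lines. This is a minimal, hypothetical scoring function (the related-term list is invented for illustration, not taken from any real engine) that rewards a page mentioning vocabulary related to the target keyword over one that merely repeats it:

```python
def relevance_score(text, keyword, related_terms):
    """Score one point for mentioning the keyword at all, plus one point
    for each distinct related term that also appears in the text."""
    text = text.lower()
    score = 1 if keyword.lower() in text else 0
    score += sum(1 for term in related_terms if term.lower() in text)
    return score

# Hypothetical related terms for "NFL jerseys" (team names, players, attributes)
related = ["packers", "brett favre", "stitched", "authentic"]

natural_page = "Buy authentic NFL jerseys, including stitched Packers jerseys."
stuffed_page = "NFL jerseys NFL jerseys NFL jerseys cheap NFL jerseys."

print(relevance_score(natural_page, "NFL jerseys", related))  # → 4
print(relevance_score(stuffed_page, "NFL jerseys", related))  # → 1
```

A real engine would use far more sophisticated statistical models, but the direction is the same: related vocabulary counts for more than raw repetition of the main keyword.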
Offering Large Amounts of Content
This can frequently take the form of dynamic pages. Even now, search engines can have a difficult time with dynamic content on web sites. These pages usually have lengthy URLs consisting of numbers and characters such as &, =, and ?. The common problem is that the content changes so frequently on these dynamic pages that the page becomes "old" in the search engine's database, leaving searchers seeing results that contain outdated information. Since many dynamic pages are created by web sites displaying hundreds or thousands of products they sell, and the number of people selling items on the Internet will obviously increase in the coming years, you can expect that search engines will improve their technology and do a better job of indexing dynamic content in the future.
The Conclusion: Put yourself ahead of the game if you are selling products online and invest in database and shopping cart software that is SEO-friendly.
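One common way to get ahead of the dynamic-URL problem is to map query-string URLs onto static-looking, keyword-bearing paths. A minimal sketch of the mapping (the parameter names `cat` and `id` are hypothetical, standing in for whatever your shopping cart uses):

```python
from urllib.parse import urlparse, parse_qs

def friendly_url(dynamic_url):
    """Rewrite a URL like /product.php?cat=widgets&id=42 as /widgets/42/."""
    parts = urlparse(dynamic_url)
    params = parse_qs(parts.query)
    category = params["cat"][0]
    product_id = params["id"][0]
    return f"/{category}/{product_id}/"

print(friendly_url("http://example.com/product.php?cat=widgets&id=42"))
# → /widgets/42/
```

In practice the mapping runs the other way too: the web server (via rewrite rules, for example) translates the clean path back into the query string your application expects, so both humans and spiders only ever see the friendly form.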
Incoming Links
Once thought to be very difficult to manipulate, incoming links to one's web site have been abused by crafty SEOs and webmasters the world over. It has finally reached the point where Google is revamping what constitutes a "vote from [one site to another]," as they explain it in their webmaster resources section. Link exchanges are worth significantly less now than ever, to the point where the only real value in obtaining them is to make sure a new web site gets crawled by search engine spiders.
Over the years, many web sites reached the top spot for competitive keywords by flexing their financial muscle and buying thousands of text links pointing to their site with keywords in the anchor text. Usually these links would appear as advertisements along sidebars or navigation areas of web sites. Essentially this was an indirect way of paying for high Google rankings, something which Google is no doubt trying to combat with each passing algorithm update. One school of thought is that different areas of a web page, from a visual point of view, will be weighted differently. For example, if a web site adds a link to your site within the middle of its page text, that link should count for more than one at the bottom of the site near the copyright information.
This brings up the value of content distribution. By writing articles, giving away free resources, or offering something else of value to people, you can create a significant amount of content on other web sites that will include a link back to your own.
The Conclusion: It all starts with useful content. If you are providing your web site visitors with useful information, chances are many other sites will want to do the same. SEO doesn't start with trying to cheat the algorithm; it starts with an understanding of what search engines look for in a quality web site.
About The Author
An expert at organic SEO, John Metzler has held executive positions in the search engine marketing industry since 2001. He is the President of FreshPromo, a Canadian-based SEO firm, and services American clients through SEOTampa.com.
posted by Scott Jones @ 8:34 am
Friday, March 17, 2006
Tips for Getting Your Website Listed on Yahoo
Treat Yahoo as both a directory AND a search engine. Yahoo offers a number of different search results. Part of their search results come from an actual search engine, and some of their results come from human editors, called surfers.
Every Yahoo directory submission is viewed by a person. (Search engines use spiders and indexing software). Admission into the Yahoo directory is entirely at the discretion of the Yahoo surfer viewing your site. That's why the free submission lead time can often be 8-10 weeks, without using the Express Submit service.
Unlike search engine submissions that accept multiple pages from the same site, Yahoo surfers prefer to view your home page, and they will navigate your site from there.
With directories, the site owner selects the most appropriate directory categories for his/her site and writes descriptions that concisely and accurately describe the content of the site. Factors that affect directory placement are selecting the right category and writing a good description.
Always differentiate between the Yahoo search engine and the Yahoo directory whenever you speak to a search engine marketer. The strategies for getting listed in search engines are different from the strategies for getting listed in directories.
Fill out the Forms on Yahoo Exactly as Requested
We know this seems obvious, but some sites don't get listed because the submitters do not follow directions when filling out the forms. When Yahoo asks for a telephone number, give only one telephone number including the area code. When Yahoo asks for a 25-words-or-less description, don't try any fancy word stacking or keyword tricks. Stick to a concise, accurate description of your site's content. Yahoo surfers often change your site description after it's listed in their directory.
Although some fields are optional, try to fill in all of the fields in the form, especially the fields asking for your company name, address, telephone and fax numbers. Yahoo wants to know that you have a legitimate business that will be around in years to come.
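The 25-word limit is easy to check mechanically before you submit. A trivial sketch (the sample description below is invented for illustration):

```python
def fits_yahoo_description(description, max_words=25):
    """Check whether a directory description stays within the word limit."""
    return len(description.split()) <= max_words

desc = ("Hand-made ceramic mugs and plates, fired in our Vermont studio "
        "and shipped worldwide with free gift wrapping.")
print(len(desc.split()), fits_yahoo_description(desc))  # → 17 True
```

Counting words this way (whitespace-separated tokens) is a reasonable approximation of how an editor would count them, and it keeps you from having a surfer truncate or rewrite your description for you.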
Select the Categories (only 3) That You Want to be Listed Under Very Carefully
To select your categories, type your selected keywords into a Yahoo query, and study the results. Your site belongs where you believe your target audience is searching. Your site's actual content should accurately reflect the category or categories you wish to be listed under.
You will probably be listed under the same categories your competitors are listed under. Study your competitors' directory listing. See what their descriptions are, and then modify your site's description to show the Yahoo surfer that you definitely belong in the categories you selected.
If you are a U.S. company, under the category Regional/United States/ in Yahoo, all 50 states are listed. You should include your site in one of the regional categories, if applicable. Don't be surprised that most of your traffic from Yahoo is local. Many customers need to hear your voice or see your face to feel secure about hiring you or your company for a project.
Category listings are also at the discretion of a Yahoo surfer. You need to spend the time selecting your categories and studying your competitors so that you do not get listed under an "undesirable" category.
Have Unique Content and Point This Information Out in Your Submission
A web site is of no value to the Yahoo directory if the site contains the exact same information as other sites in the same categories. So to add value to the Yahoo directory, and to call attention to the unique aspects of your online business, make sure you have unique content on your site.
You can point out any unique content to the Yahoo surfer via your 25-word description or the extra comments field in the submission form.
Get a Virtual Domain Name
Yahoo tends to recognize virtual domains (http://www.yourcompany.com) over others. Why is this? Yahoo wants to have legitimate organizations and companies in their directory. They do not want a small start-up company that won't be around next year, thus resulting in a dead link to a URL in the directory. A virtual domain shows that you are serious about your business.
At the bottom of Yahoo's home page are some of Yahoo's regional directories. If you are in one of the regional areas, you can often get your web site listed much faster in the regional directory than in Yahoo's main directory.
Commercial web sites must pay an annual fee of $299 (currently) to be listed in the Yahoo directory. Sometimes, a Yahoo surfer might discover a content-rich commercial site without you ever submitting.
"Design for Speed" (Quotation From a Yahoo Surfer)
Yahoo is looking for sites that download very quickly, preferably within 30 seconds on a dial-up modem.
Of course, exceptions do apply, such as if you are an online video game company and use Shockwave on your site. Then it is understandable that your web site might take longer to download.
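The 30-second guideline translates into a rough page-size budget. Assuming a 56k modem at its theoretical maximum (real-world throughput is lower, so treat this as an upper bound), back-of-the-envelope arithmetic gives:

```python
MODEM_BITS_PER_SEC = 56_000   # theoretical max for a 56k dial-up modem
TARGET_SECONDS = 30           # the Yahoo surfer's suggested load time

# Divide by 8 to convert bits to bytes
max_bytes = MODEM_BITS_PER_SEC * TARGET_SECONDS // 8
print(f"Page budget: about {max_bytes // 1024} KB")  # → about 205 KB
```

In other words, a page, images included, should weigh in at well under roughly 200 KB to meet the guideline on dial-up.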
Lastly, the rule "Content is King" applies to getting listed in Yahoo. If Yahoo editors don't feel your site looks or sounds like a legitïmate business, they don't have to list your site. So make sure your web site is easy to read, easy to navigate, easy to find (on search engines), shows consistency in design and layout, and is quick to download. Above all, have unique content to add value to the Yahoo directory.
About The Author
Andy Macdonald owns and runs his own web design business called Swift Media UK, which incorporates logo design and website hosting.
posted by Scott Jones @ 9:41 am
Wednesday, March 15, 2006
Are You A Closet FrontPage User?
In webmaster circles, fessing up to being a FrontPage user is akin to inviting your mother as your date to your senior prom: you just don't do it. In fact, admitting that you simply use a WYSIWYG editor can often be enough for experienced webmasters to quietly chuckle, look at you with a "someday you'll learn" look, and give you a nice pat on the back encouraging you to keep learning. 'Real' webmasters know three things: 1) hand coding is the only way to make a website look nice, 2) the more your web programming looks like the screen from "The Matrix", the better your website will be, and 3) FrontPage was actually programmed by Beelzebub himself.
The stigma that has been placed on WYSIWYG editors, especially FrontPage, is not without cause; there are legitimate reasons to avoid these web design programs. But the hatred for these programs is also largely unfair, and website owners who use them should not necessarily be ashamed to admit that they did not take the time to pore through the W3C's lengthy, and frankly quite boring, recommendations for proper HTML coding. There are, dare I say, legitimate times when using an editor like FrontPage is the best option.
Hand Coding Is Actually the Best
Now that I have ventured out on a limb and actually admitted that there are legitimate reasons a person could use FrontPage or any other WYSIWYG editor, let me add an absolutely necessary disclaimer. All this talk about creating W3C compliant code, learning proper CSS and HTML, and learning how to separate the design of your website from its HTML is valid. In fact, it should ultimately be the goal of every website owner to have their website validate against W3C standards (Why? Check out the web standards movement to see why it is so important).
Here is the real letdown: there is virtually no way you will create a W3C compliant website using FrontPage, and it is doubtful that any WYSIWYG editor will achieve this for you. Dreamweaver has made tremendous strides in the past year in creating more compliant code, but it is not perfect yet either. If you are going to reach that Shangri-la of web development, hand coding and learning HTML and CSS are the only paths that will lead you there.
The funny thing about all this is that once you become adept at designing hand-coded websites, with the design controlled by CSS and the structure handled by the HTML, you may just find that hand coding a website is actually much easier than fiddling around with FrontPage or Dreamweaver. In fact, you may just become one of those webmaster 'snobs' who looks sympathetically at all the poor FrontPage-handicapped website owners.
Your Website is More Than a Website
Very few web businesses are actually about the website. Sure, the website is an integral part of your business – possibly an absolutely necessary part of your business. Ultimately, however, your website is a tool of your business. Amazon.com, as an example, is known for their website. But when we describe what Amazon.com does, the typical response is to say that they sell books. Google is known for being a website. But when asked what Google does, the typical response is that they help us find websites that we are looking for. Site Reference is inseparable from its website, but when asked what we do, our response is that we publish articles and provide forums to help website owners succeed in the online world (OK, the last example is not in the same class as the first two...we're getting there).
The point of all this is to emphasize that ultimately we are running a business, and a business, no matter how web-centric, is going to have more needs than just those of the website. As a web business owner you are inevitably faced with many different aspects of your business that need your attention, and it is possible that creating a W3C compliant website is not as important as finding the money to pay last year's taxes, or handling a consumer issue, or developing that new product which is projected to double your online sales.
We would all love to say that every part of our business is done with meticulous detail and that even our office spaces would pass a white glove test, but that is simply unreasonable. The truth is that sometimes we just need to get things done. And with a web based business, often just getting a good looking website up is what we need before moving on to another aspect of our company that is crying for attention.
I have a very good friend and occasional business partner who has become quite successful as an Internet entrepreneur. He owns a very successful web hosting company, a quickly growing software company, and has launched several websites which have seen a healthy level of success. As much as it pains me to witness it, he has done all of this using FrontPage as his web design tool of choice. The simplicity with which it allows him to get something published in short order fits his needs perfectly, and although I still preach to him the need to learn HTML, it is hard to argue with someone who is currently more successful than I am.
Recognizing FrontPage for What It Is
If you are a FrontPage user, inevitably at some point you are going to come across another webmaster who, upon learning of your WYSIWYG addiction, will scold you for using a program that publishes what is generally considered to be 'ugly code'. When you hear this rebuke, be sure to accept it for what it is – encouragement to take your website to 'the next level'.
FrontPage, or any WYSIWYG tool, is a 'quick and dirty' way to get a website published in a relatively short amount of time for those who do not know HTML or CSS. That is its purpose, and it fulfills that purpose well. Ultimately, however, websites whose goals include wide accessibility, easy management, low bandwidth consumption, faster load times, multi-browser compatibility, higher search engine rankings, and an image of being run by a company with the resources to manage a professional website will need to go the route of being hand coded.
Using a tool like FrontPage is not something you should have to apologize for, but it also may not be the best long-term plan for managing your web based business – especially when the web industry is setting standards that FrontPage refuses to meet.
At some point, bringing your website up to date with industry standards is a goal that will (or should) cross your to-do list. When it does, you may decide that taking the time to learn HTML and CSS is not the best use of your time and that outsourcing development is the best direction for your company. Or you may be someone who likes control of the important aspects of your business and may want to learn HTML and CSS to make sure that it is done correctly. Whatever you decide, making the move towards a website that meets industry standards will certainly be a plus for your business.
About The Author
Mark Daoust is the owner of Site Reference. This article may be reprinted under the condition that all links are made active and that a link back to the original article is in place, which can be found at: http://www.site-reference.com.
posted by Scott Jones @ 8:47 am
Monday, March 13, 2006
Google's Growing Online Office
Does anyone remember how, less than a year ago, several commentators suggested Google was compiling a series of products that could emulate an online operating system? At the time, Google steadfastly denied such rumors. Yesterday, Google purchased Upstartle, the maker of a browser-based word processor called Writely.
Writely is an online word processor that enables multiple users to access and work on documents from any location. It can be used as a collaborative editing device and offers users online publishing options including the ability to convert Writely documents into "normal-looking web pages" or blog postings.
The acquisition of Upstartle, combined with other current and pending Google services poses a serious challenge to Microsoft's desktop oriented products. Google is clearly building a suite of branded, browser-based applications that contains several daily use products designed to capture users from Microsoft Office.
Earlier today, Slashdot published a story suggesting Google is running a closed beta test of Google Calendar, including a link to a series of screen shots. The project, nicknamed CL2, will be integrated with Gmail in the future.
The stakes for both firms are high, with Microsoft preparing to release its new Internet-focused operating system, Vista, before the end of 2006. Until recently, Microsoft was able to bank on the storage space offered by personal computers. Its operating systems run from the hard drive, and most digital documents composed by computer users are stored on those users' hard drives. The security of the hard-drive dependent storage system Microsoft enjoyed is about to change radically.
At its Analysts Day, held earlier this month, Google inadvertently announced the development of Gdrive, a virtually infinite, online data storage service. A series of slides offering preliminary details of Gdrive were included in notes for one of the day's PowerPoint presentations but were later removed by Google.
"The notes were deleted from the slides we posted because they were not intended for publication," Google spokeswoman Lynn Fox said in an interview with vnunet.com. While she declined further comment, those notes also included financial projections that stretched into next year, forcing Google to file a statement with the SEC on March 7.
Shortly after the presentation, Greg Linden, the CEO of Findory.com, posted the notes to his Geeking with Greg blog before Google removed them. The full text of the notes from Google Analyst Day can be found here.
In his review of the deleted notes, Greg found a few interesting sentences. At one point in Slide 19, the text notes how Google is inspired by the idea of "... a world with infinite storage, bandwidth and CPU power."
Google, like its competitors, is becoming a second generation web hosting firm. Another line from Slide 19 says Google wants to be able to "... house all user files including Emails, web history, pictures, bookmarks, etc and make it accessible from anywhere (any device, any platform, etc)."
Google's capacity to store and retrieve personal information is already being applied to the corporate world. Google's Desktop 3 includes an option that allows users who work on multiple computers, or multi-user work-groups, to search for items stored on the hard drives of multiple computers. Google keeps copies of files found on computers in the file-sharing network and transfers them from unit to unit as searches take place.
One of the more interesting lines Greg extracted from Slide 19 was the idea that files stored and shared through Gdrive would become the "Golden copy" of those documents. Gdrive, like Writely, is designed to facilitate work-group collaboration, much like a central file server in most IT offices does now. The copy kept on the hard drives of members of a working group will be a cache of the most recent version displayed on that particular computer, but not necessarily the most up-to-date document.
Google Labs is pushing the other major Internet and search firms to work harder and faster. The addition of Writely to Google's stable of membership-based products raises another series of hurdles for Microsoft and might force them to refocus their Vista strategies. Microsoft was hoping to challenge Google's search dominance by integrating search within the desktop and operating system. Google appears ready to flank them by moving applications formerly found on the desktop into its sphere of search-related products. 2006 is shaping up to be a most interesting year.
About The Author
Jim Hedger is a writer, speaker and search engine marketing expert based in Victoria BC. Jim writes and edits full-time for StepForth and is also an editor for the Internet Search Engine Database. He has worked as an SEO for over 5 years and welcomes the opportunity to share his experience through interviews, articles and speaking engagements. He can be reached at firstname.lastname@example.org.
posted by Scott Jones @ 8:42 am
Wednesday, March 08, 2006
SEO for Traffic with Content vs. Ranking with Links
How do you grow your search engine traffic without adding a single new link or making any changes to your existing webpages?
It's simple. Just add content.
Simply having keyword-optimized pages of content on your site won't rank you high for competitive search engine keywords – that's a fact of life. But keyword-optimized content can really bring in the traffic for low-competition and unique keywords. These low-competition and unique keywords are typically longer, multi-word variants of a keyword: for instance, instead of "search engine ranking," something like "ranking for search engine traffic niche keywords."
If you have lots of pages of optimized content – and you optimize well – all the search engine traffic from these low-competition keywords will really add up. Plus, you'll usually get more repeat visitors and type-in traffic, too.
Just picture this realistic example of traffic-building with content vs. ranking-building with links. Company A invests $5,000 in link-building in order to rank for a competitive keyword. Company B invests the same amount, only in content. Company A and Company B each start out on equal SEO footing: equally old websites with the same amount and quality of content, the same content management systems, the same PageRank, and the same quantity, quality, and relevance of inbound links.
Company A's research reveals that $5,000 is just the amount needed to get on the first page of Google for a target keyword that should deliver 100 unique visitors per day if the site ends up in the first position. They dutifully get inbound links optimized for that keyword, following all SEO best practices. Three months and $5,000 later, the site is stuck somewhere toward the bottom of the second page of Google search results for the target keyword. Six months later, they've actually sunk a bit lower in the SERPs. The good news is that the site is getting some traffic from the links built and from the lowly search engine position, but nowhere near the 100 visitors/day they were hoping for from search results.
Company B, meanwhile, had content written around a long list of keywords with little or no competition in the search engines, using up-to-date search engine copywriting techniques. They've been enjoying a growing stream of visitors to their site almost since the first page of content was added. Three months later, the site's search engine traffic has grown by a hundred unique visitors per day, or 3,000 per month. Moreover, Company B's repeat visitor traffic has also jumped. Type-in traffic has increased, presumably as visitors forward the URLs of useful pages to their friends. Page views are up, too, not only from more repeat visitors and type-in visitors, but also from first-time search visitors staying longer and browsing more pages. Six months later, the website's content has built a loyal following on the net, generating even more repeat visitors. The search engine traffic is as good as it ever was.
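The arithmetic behind Company B's result is worth making explicit. The per-page figures below are hypothetical assumptions chosen to match the scenario's numbers; real pages will vary widely:

```python
content_pages = 100             # hypothetical: pages of optimized content bought
visitors_per_page_per_day = 1   # hypothetical: modest long-tail traffic per page

company_b_daily = content_pages * visitors_per_page_per_day
company_b_monthly = company_b_daily * 30

print(company_b_daily)    # → 100 unique visitors per day
print(company_b_monthly)  # → 3000 per month
```

Each individual page earns almost nothing, but because low-competition rankings are close to a sure thing, the aggregate is far more predictable than a single bet on one competitive keyword.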
Pitfalls of Link-Building for Search Engine Ranking
Company A thought it had a fairly sure thing: build enough optimized links for the keyword, taking care not to trigger search engine penalties. Yet as they've discovered, there is no sure thing when it comes to search engine rankings:
Over-optimization penalty minefield. The search engines, particularly Google and Yahoo!, are very risk-averse when it comes to ranking sites well for competitive keywords. On the whole, they are perfectly willing to risk dropping several good sites from top rankings in order to try to keep one bad site out. They are constantly tweaking their algorithms to identify sites whose link structures are not indicative of a quality site. In the process, plenty of good sites with good SEO also get swept up. This risk of failure is the inherent risk of SEO. True, most of the time, a good site with good SEO does move to the top. But in a large minority of cases, quality goes unrewarded.
Competition and the moving target. As Site A was moving up the search engine results for its competitive target keyword, so were the other sites. There is no rest for the victorious when it comes to SEO. The top sites for highly competitive keywords are constantly building new optimized links. That's why any SEO effort has to aim to do at least ten percent better than the site currently in the position it's targeting.
Lack of keyword diversity. Too often, websites with modest SEO budgets (and $5,000 is modest when it comes to a competitive keyword) aim for just a few keywords. Given all the potential pitfalls of an SEO campaign, you need to be going after ten or more target competitive keywords, and at least another ten related but less competitive keywords. That way, failure for a few keywords won't scuttle the whole project. Meanwhile, search engines look for diversity in targeted keywords, so you get much more out of targeting a larger group of keywords. If you can't afford to do this, you're really better off not going after competitive keywords. Sure, you might get those rankings. But what happens if you've spent your budget and still have little to show for it?
Meanwhile, the fundamental advantage of pursuing low-competition keywords is that, by definition, it's much closer to being a sure thing.
Advantages of Web Content SEO
Greater certainty. Not only is a page of content extremely likely to bring in search engine traffic — unlike a similar investment in links — it won't suddenly disappear. The sites linking to you might stop at any time, or do something that stops their links from passing search engine value (such as adding the "nofollow" tag or switching to a search-engine-unfriendly content management system).
Cost. Traditionally, copywriting has been more expensive than link-building. But that's changed. As "nofollow" link-Scrooge-ry becomes more and more common, and as paid and reciprocal links get downgraded, the real cost of obtaining quality links increases. Meanwhile, the copywriting market has increasingly adapted to the needs of search engine marketing. To get a search engine visitor, you don't need a Pulitzer Prize-winning essay or a killer sales letter. You simply need highly focused, readable, keyword-optimized, information-packed pages of around 250 words each — and more and more copywriting and SEO firms are delivering this service cost-effectively. Blogs, meanwhile, let you and your employees add content easily. Bulletin boards (modified to be search-engine-friendly) let site visitors add content, too. In fact, "natural content" from blogs and bulletin boards is now much more viable than natural link building.
In conclusion, when you look at SEO, don't forget that your number-one goal is not to rank high for a certain keyword, but to get more search engine traffic. In some less competitive sectors, high rankings may still be a realistic and effective proposition. But increasingly, ranking high for competitive keywords is no longer the best way to get traffic.
About The Author
Joel Walsh is a professional in the fields of copywriting and SEO who has recently launched http://www.UpMarketSEO.com, an SEO firm.
posted by Scott Jones @ 9:11 am
Monday, March 06, 2006
The Rise and Rise of Article PR. What are the Implications?
Already a very popular method of achieving a high search engine ranking, article PR (aka article submission) has now entered the mainstream. As such, its popularity is increasing at a dramatic rate. While this is great for SEO copywriters like myself, there are some side-effects that need to be addressed if article PR is going to remain a viable search engine ranking technique. This article discusses some of those side-effects, along with how they might be addressed.
But First, a Little on Article PR
Article PR is the process of writing 'free reprint articles' and submitting them to the 250+ established article submission sites on the Internet. An article submission site is simply a repository of free reprint articles - a place where authors can submit their articles free of charge, and where webmasters can find articles to use on their websites free of charge. In return for free use of your article, the webmaster includes your author bio with its links to your site. Every time your article is published, you get another link to your site and a boost to your ranking. If the quality of your article is high, it can be published hundreds of times.
The Rise And Rise of Article PR
Because article PR is such an effective way of generating a high search engine ranking, it has now entered the mainstream. As an SEO copywriter, I get several requests each week for quotes to write articles. These requests come almost exclusively from business owners and marketing managers who know little (if anything) about SEO. They obviously didn't go looking for article PR; article PR found them...
As a result of its newfound mainstream popularity, the number of articles being written and submitted has increased by between 100% and 600% in the past year! Christopher Knight, owner of the biggest article submission site, EzineArticles, tells me that the number of article submissions to his site increased by a staggering 600% from 2004 to 2005. In 2004, EzineArticles was averaging only 1,416 article submissions per month. In 2005, it was averaging 8,482 article submissions per month!
Similarly, at the end of 2005, when I spoke with Mel Strocen, owner of GoArticles, he reported a doubling of article submissions in the second half of the year. "In the last 6 months article submissions have increased by 100%, going from about 1,000 submissions per week to 2,000+ per week," he said.
Jason Lynch, owner of ArticleBlast, reported similar increases; between April '05 and January '06, submissions to ArticleBlast increased by over 300%.
The web traffic to these sites tells the same story. According to Alexa statistics, at the end of 2004, EzineArticles had a reach of approx 100 users per million Internet users per day. Just over a year later, the site is reaching over ten times that many Internet users. (If we take the total number of Internet users worldwide to be 964 million, EzineArticles traffic has increased from around 96,000 per day to over 1 million visitors per day.)
Alexa stats for GoArticles report similar increases in traffic. At the end of 2004, it had a reach of approx 50 users per million Internet users per day. Just over a year later, it's reaching approx 10 times that number of users. (Again assuming 964 million Internet users worldwide, GoArticles traffic has increased from around 48,000 per day to around half a million visitors per day.)
Figures for ArticleBlast are more difficult to ascertain as the site is younger and has lower overall traffic.
Even if Alexa's figures are a little inflated (as I think they tend to be), they still provide a consistent measure for the period. As such, the percentage increases should be relatively accurate.
The Side-Effects of the Rise of Article PR
A number of writers have voiced the fear that article PR will die through 'over-use', just as keyword stuffing and link farms died. But I don't agree. Why? Because article PR isn't just useful to authors and SEO copywriters. The success of article PR is based on the premise that our articles are also useful to READERS. So long as the majority of articles remain useful (i.e. helpful, informative, and easy to read), readers will still want to read them, publishers will still want to publish them, and article PR will remain a viable link building method.
This is true no matter how many people are writing and publishing free reprint articles. Frequent use of a tool doesn't make the tool ineffective. (Just look at traditional forms of advertising - millions of businesses engage in radio, print, and TV advertising, and those methods remain very effective. The fierce competition simply encourages advertisers to improve the quality of their ads in order to stand out.)
No, in my opinion, there's no such thing as too many articles. However, there is such a thing as too many BAD articles. Readers want helpful, credible information; they don't want badly written articles or empty words ('article spam') which simply carry a link.
Just as importantly, webmasters don't want to spend hours trying to find the right article to publish. At the moment, there are literally hundreds of article submission sites out there. Most of them are generic, fully automated affairs that involve no human moderation. They don't distinguish between good writing and bad, they don't cull article spam, and they don't categorize their articles very well. As a result, publishers have to wade through a sea of poor quality to find a handful of useful articles.
These issues are the real hurdles that need to be overcome if article PR is to survive.
Overcoming the Problems
The article submission sites will overcome the problems. Here's how...
As mentioned above, readers aren't interested in bad articles or article spam. This means that, in the long run, there's no real value in publishing such articles (either for webmasters or article submission sites); readers will frequent the sites that publish useful articles and ignore those that don't. Likewise, publishers will frequent the article submission sites that post useful, easy-to-find articles and ignore those that don't.
This means we'll see an increase in the number of human-moderated article submission sites. And once this happens, the article PR landscape will change forever:
1) Human moderated article submission sites will offer a higher percentage of quality articles, and those articles will be easier to find;
2) Human moderated article submission sites will attract more publishing webmasters, and, as a result, more authors;
3) We'll see a decrease in the number of un-moderated article submission sites because they won't generate enough traffic to make AdSense profitable;
4) We'll see a decrease in the overall number of article submission sites (anyone can launch an automated article submission site, but it takes real commitment, business sense, and a dedicated budget to run a human-moderated article submission site);
5) The spoils will be greater for the surviving article submission sites, so they'll go to greater lengths to ensure the high quality of their articles; and
6) We'll witness the decline of article spam and poor quality articles simply because they won't be accepted at the good article submission sites.
All in all, it's a positive outlook for authors and publishers of quality articles.
Happy writing, publishing, and posting!
About The Author
* Glenn Murray is a website copywriter, SEO copywriter, and article submission and article PR specialist. He owns article submission service Article PR and copywriting studio Divine Write. He can be contacted on Sydney +612 4334 6222 or at email@example.com. Visit http://www.DivineWrite.com or http://www.ArticlePR.com for further details, more free articles, or to download his free SEO e-book.
posted by Scott Jones @ 9:25 am
Friday, March 03, 2006
Raise Your Website Traffic with RSS - Blogs and Yahoo!
In our first part of this article, we raised the question of whether blogging and its distribution tool, RSS feeds, are really useful for Internet and Search Engine Promotion. Are RSS feeds and blogs really the next big thing in web marketing, distribution, and content creation — or are they just hype?
There is a lot of hype around RSS, blogs, and derivative technologies like podcasting. But are they really useful to the serious Internet marketer, or are they just subjects used by marketers looking to create new products to grab our hard-earned marketing dollars?
We also covered the objections and reservations some Internet marketers have about the usefulness of RSS feeds and blogs to the bottom line of their ebusinesses. This can be contrasted with our discovery of people like Willie Crawford and companies like Weblogs, which generate 6- and 7-figure incomes from blogs, RSS, and related technologies with Google Adsense.
To illustrate whether typical Internet marketers — not just web gurus — can benefit from blogs and RSS feeds, I promised to share my experiences with my new sites, which are not yet optimized for the search engines.
With virgin websites, I could observe the traffic pulling power of blogging, pinging, and RSS. If you would like to read or familiarize yourself with Part 1 of this article, you can read it at... http://www.searchengineplan.com/articles/feb06-rss-prt1.htm.
To test the effectiveness of the ability of RSS feeds and blogs to attract and drive traffïc to my web properties, I did some quick and insightful research on the topic. Brandon Hong's Marketing Rampage with Blogs and RSS was the resource which best enabled me to understand the techno-jargon associated with blogs and RSS feeds. Believe me, I have a 10-year background in information technology, and I can't make heads or tails out of the alphabet soup served up by tech geeks on blog and RSS media.
Plus, I run a very busy SEO consultancy and virtual real estate (VRE Adsense™ and Affiliate Sites) side business, so I don't have the time to muck around in nebulous articles on these topics.
If you even remotely feel like me about the complexity of blogs or RSS, do yourself a favor and obtain Brandon Hong's multimedia ebook of screen-capture videos.
You can read a full review of the book at... http://www.searchengineplan.com/articles/hongrss.htm.
I have been blogging for almost 3 years, but RSS feeds have been harder to grasp in terms of development and marketing. The easiest way to start blogging is to set up an account with www.Blogger.com or www.Bloglines.com. Blogger will actually walk you through the process.
Experienced web designers should not have a problem setting up a www.Blogger.com account. Blogger.com is actually a good initial choice because it provides an easy setup for RSS feeds. The setup can be done by going to the Settings Tab in Blogger, clicking the site feed link, and filling out the forms.
The next issue to consider is the complex RSS compatibility issue. You can sidestep the decision about whether to go with RSS version 2.0 or Google's Atom standard by "burning your RSS feed" or making them more compatible with all popular RSS formats with a third party service like Feedburner.com.
After creating your RSS feed in Blogger, you should have it burned in the Feedburner.com service; it will guide you through the process. The optimized Feedburner.com RSS feed is then ready to be submitted to the major RSS directories.
I would suggest creating a few descriptions of your blog and then submitting both your blog and RSS URLs to the appropriate RSS and blog directories. My firm fast-tracks blog and RSS feed promotion by submitting them to about 90 directories that specialize in this type of media — including Yahoo! and MSN RSS content services.
Both the Blogger.com service and, more extensively, Feedburner.com can be configured to ping the major RSS and blog directories. This means they signal or alert these directories whenever you update posts on your blog in real time. Perhaps most importantly, you get traffïc statistics about your RSS subscribers and readers.
The results of my RSS and blog traffic research over the last 3 months are amazing! I have been totally blown away. I am excited about RSS and blog usage, and the news is both good and bad:
The bad news: According to a white paper on blogging, sponsored in part by Yahoo!, 88% of Internet users don't know what RSS technology is, and 96% of Internet users stated they do not use it!
The good news: 27% of Internet users experience RSS feed content on their My Yahoo and MSN web accounts, although they don't realize it! Moreover, 4% of Internet users actively use RSS feeds. This means 31%, or almost one-third (1/3), of Internet users in the U.S. read RSS feeds.
With almost 150 million U.S. Internet users and 600 million net users worldwide, you can do the math on the large numbers of people reading RSS feeds (even if unwittingly).
More positive stats on RSS and blog usage, according to the Pew Internet & American Life Project:
- (1) Fully 19% of online Americans ages 18-29 have created blogs
- (2) 11 million American adults say they have created blogs
- (3) 27% of Internet users reported in November that they read blogs
This translates into 32 million American adults who read blogs. These figures show that RSS and blogs are growing technologies that serious Internet business people should adopt into their marketing mix.
My personal research over three months showed that when I regularly updated my blog sites and burned RSS feeds that pinged the major directories, my traffic increased a whopping 25%!
Blogs, RSS feeds, and articles distributed regularly to major host sites and distribution services actually rivaled the traffic of my highly optimized top-ranking SEO and VRE sites. More importantly, blog-, RSS-, and article-driven traffic actually generated twice as much income in sales and Adsense™ revenue as my traditional SEO sites.
Needless to say, once a skeptic, I am now a big believer in the power of RSS feeds and blogs to boost my bottom line. I will leave you with a controversial statement from an SEO and Searchpreneur©.
Dr. Jakob Nielsen recently referred to search engines as "the leeches on the Internet." He feels "Search engines extract too much of the Web's value, leaving too little for the websites that actually create the content. Liberation from search dependency is a strategic imperative for both websites and software vendors."
With Yahoo! and MSN soon to enforce email postage, according to a recent article in the New York Times, RSS and blogging may become the latest and best weapons for small businesses looking to survive and thrive in the Internet economy.
About The Author
To stay informed on the latest blogging, RSS, and SEO developments, visit www.searchengineplan.com/blogs/seoblog.htm. Kamau Austin is publisher of www.eInfoNEWS.com and runs www.SearchEnginePlan.com. He is author of Always On Top -- How to Get the Highest Search Engine Ranking for your Website. See more about his strategies at www.AlwaysOnToptheBook.com.
posted by Scott Jones @ 9:42 am
Wednesday, March 01, 2006
Search Engine Optimization: Four Vital Steps for Optimizing Your Website
There is a bit of confusion about search engine optimization. Some people think that SEO (the abbreviated form) is nothing more than tricking search engines into giving a high ranking to a particular site. Others think that search engine optimization is so complex that they could not possibly understand it. Neither of these views is correct. Search engine optimization is best defined as the art and science of building web pages that are both search engine friendly and user friendly. Below are four basic steps that you should take when optimizing your web pages.
1. Your Web Design Should Emphasize Text and Not Graphics
"Search engine friendly" means that search engines should be able to find data on your site that they can put in their databases. While a picture may be worth a thousand words, a search engine is trying to classify pages by text, not by images. If you have an opening page with a beautiful picture of the sea and only two words of text saying "enter here," then this page will not rank high in searches for Florida Vacations. Similarly, if you have a headline with important text containing your site's keywords, it should not be displayed as a gif or jpeg image. Pages that are all Flash or all images are not search engine friendly, and often are not user friendly either.
2. Links to Your Interior Pages Should Be Easily Found by Search Engines
An important thing to remember is that you want not only your main page but all of your interior pages to be included in the search engine index. While most people will probably enter your site through the main page, many will enter after doing searches that lead them to your inner pages. The best way to make sure that search engines will find and index your inner pages is to include text links to these pages. If you have a navigation system that uses JavaScript or images, then it is best to add an additional text link navigation bar at the bottom of the site to ensure that the robot follows the links to your inner pages.
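As a sketch, such a footer navigation bar is nothing more than a row of plain text links (the page names here are hypothetical):

```html
<!-- Plain-text footer navigation: search engine robots can follow these
     links even when the main navigation uses JavaScript or images.
     File names are hypothetical examples. -->
<p>
  <a href="index.html">Home</a> |
  <a href="floridavacations.html">Florida Vacations</a> |
  <a href="accommodations.html">Accommodations</a> |
  <a href="contact.html">Contact Us</a>
</p>
```

Because these are ordinary anchor tags in plain HTML, every robot that reaches the main page can crawl through to the inner pages.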
3. Your Pages Should Be Built Around Specific Keywords or Keyword Phrases
Robotic search engines and human users have one thing in common: they are trying to figure out what your site or your particular web page is all about. It is not possible to get high rankings for thirty different search terms with only one web page. However, it is possible to build separate web pages which explain and give importance to various aspects of your organization's activity. These sub pages can be optimized so that they perform well in searches for your various keywords.
4. Once Your Material is Organized, Then Your Keywords Should Appear in Strategic Portions of Your Web Pages
If your site is about Florida Vacations, then these words should appear in the following places of your html pages:
a. In the File Name or the URL
If your site is called www.floridavacations.com then this will give you a head start in any searches for this term. Similarly, if your company is called XYZ Travels, you may have a web page with this url: www.xyztravels.com/floridavacations.html
The URL or file name is an important indicator to a search engine, so don't miss the opportunity to put your important term either in your main domain name or in your file names.
b. In the Title Tag
The text that is displayed in the title bar at the top of your browser is your title tag. The title tag is located in the HEAD section of the document. If your main phrase is "Florida Vacations" then the title tag in your html document should look something like this:
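For instance, using the hypothetical XYZ Travel site described in this article:

```html
<head>
  <!-- The title tag lives in the HEAD section and should lead with
       the page's main keyword phrase. -->
  <title>Florida Vacations: Florida Vacation Information by XYZ Travel</title>
</head>
```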
c. In the Description Tag
The description tag is not seen on the web page but search engines often display it as the text which gives the searcher an idea of what your page is about. The description tag should be compelling, and make someone want to click and see your page, while also containing the keywords that are in your url and your title tag. A description tag for this site might look as follows:
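Continuing with the hypothetical XYZ Travel example, the wording below is purely illustrative:

```html
<!-- The meta description is not shown on the page itself, but search
     engines often display it beneath the title in their results, so it
     should repeat the keywords and invite the click. -->
<meta name="description" content="Florida Vacations: plan your Florida
vacation with XYZ Travel and save on accommodations, entertainment and
transport in Florida.">
```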
d. In the Headlines
Just as someone reading a newspaper looks at headlines to find out what is important, a search engine robot looks at the headlines of a web page in order to pick up the essential features of that page. Put your main phrase in a headline and place it near the top of the page. Your headline text should be enclosed in special header tags such as h1. A headline tag for our hypothetical page could be written as follows:
Florida Vacations: Plan Your Vacation Now And Save Money on Accommodations, Entertainment and Transport in Florida
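Marked up in HTML, that headline would be enclosed in a header tag:

```html
<!-- An h1 near the top of the page signals the page's main topic
     to search engine robots as well as to human readers. -->
<h1>Florida Vacations: Plan Your Vacation Now And Save Money on
Accommodations, Entertainment and Transport in Florida</h1>
```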
If you don't like the look of the h1 tag, then use a smaller tag, h2 or h3, or adjust your site's style sheet so that the h1 tag is displayed in a small font which better matches your body text.
e. In the Body Text of Your Page
Your main keywords or key phrase should appear in the first paragraph of text and in a natural way throughout the text and also at the end of the page. In normal writing you would first introduce your subject, then explain what it is about and then summarize at the end. Follow this same procedure when you start writing your web page. Pages written in this style will automatically have correct keyword density and distribution.
f. In Anchor Text on Your Page
Anchor text is the clickable portion of links on your web page. Suppose you are describing your Florida Vacations and you want to direct your web visitors to an inside page with more information about this subject. Instead of making a link that says "click here," it is better to have a link that says "Click here for more information about Florida Vacations," or, better still, to make "Florida Vacations" alone the link text and render "click here" as normal text.
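The difference between the two approaches looks like this in HTML (the file name is a hypothetical example):

```html
<!-- Weak: the anchor text tells search engines nothing about the target page. -->
<a href="floridavacations.html">Click here</a> for more information.

<!-- Better: the keyword phrase itself is the clickable anchor text. -->
For more information about <a href="floridavacations.html">Florida Vacations</a>,
see our detailed guide.
```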
If you follow these search-engine-optimization steps when building your website you will end up with web pages that are easily understood by your visitors, and easily classified and indexed by search engines.
About The Author
Donald Nelson is a web developer, editor and social worker. He is the proprietor of A1-Optimization and provides search engine optimization, copywriting, reciprocal linking and article marketing services. He recently launched a new reprint article directory located at http://www.a1-articledirectory.com.
posted by Scott Jones @ 8:52 am