Wednesday, May 24, 2006
Is Google Entering Web 2.0?
Of the four new product announcements made at Google Press Day, Google Co-op looks to have the greatest potential impact. This service allows users to subscribe to the "bookmarks" of experts in hopes that the relevance of search results will be improved.
The product manager for Google Co-op, Shashi Seth, described Google Co-op as follows: "Anyone can contribute. We expect it to work in a three-part process. At the first stage, the contributor will ask users to subscribe with specific pieces, relying on user trust and desire to utilize their content. At the initial stage contributors will 'sell' the Co-op product on their own sites and bring their own audience. Then we will tally how often they are used and the level of interaction and whether to build a signal. As confidence increases, the contributor has a better chance of getting into the Google Co-op directory. Once they are in the directory, it will make it easier for others to subscribe. And finally, with more quality proven, the information may affect Google Search itself."
Basically, Google Co-op allows webmasters to improve search results in the topics they know best. Google Co-op creates a meta search engine, combining specialized search engines that are created by its users.
What Does this Mean for Us?
These developments have exciting implications for webmasters. The chance to influence Google rankings is always an appealing prospect.
Google is entering into the world of communities by letting users contribute their knowledge and expertise to improve search results for everyone. This was confirmed by Shashi Seth, the lead product manager for Google Co-op who recently stated that Google Co-op is the search engine's push into community-based searches.
This service certainly seems similar to the many social bookmarking sites that have exploded on the internet in the past couple of years. Furl, Digg, del.icio.us, Scuttle, Yahoo! MyWeb 2.0, and others all offer ways for users to share information. Google entering into this giant arena may very well spark some additional growth of Web 2.0.
For the web developer and search engine optimizer, these developments are very enticing. Here's how you can take part.
How to Get Started
To build a topic, you must first decide on a set of labels and their presentation in the user interface. After that, you must annotate web pages to improve the search experience for yourself and for your subscribers. Depending on your topic you may want to label hundreds or even thousands of web sites or web pages.
To begin, simply go to Google.com/coop. Sign in and create a profile and a label. If you want to create a page about dog training, you might label it "dog training." Then, you could put all sorts of information in that page. Others can contribute to that page or subscribe. The more subscribers you get, the more relevant your page becomes.
For a complete tutorial on creating a topic, go to http://www.google.com/coop/docs/guide_topics.html.
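The exact file format Google expects is documented in the guide linked above. As a rough illustration of what a batch of annotations might look like, the sketch below assumes the general shape of a Co-op annotations file (an Annotations root, one Annotation per URL pattern, each carrying a Label); the URL patterns and the "dog_training" label are placeholders, not real Co-op data.

```python
import xml.etree.ElementTree as ET

# Hypothetical sites to label; patterns and label name are illustrative only.
sites = ["http://www.example.com/*", "http://dogs.example.org/training/*"]

root = ET.Element("Annotations")
for pattern in sites:
    # One Annotation per URL pattern, tagged with the topic label.
    annotation = ET.SubElement(root, "Annotation", about=pattern)
    ET.SubElement(annotation, "Label", name="dog_training")

xml_out = ET.tostring(root, encoding="unicode")
print(xml_out)
```

Generating the file programmatically like this makes it practical to label hundreds of pages at once, as the tutorial suggests you may need to.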
Once you get started, you will be given a profile page. This is where users can learn about you and get more information about your contributions. Users can then decide whether or not they want to subscribe to your topic. Your profile page also includes:
Your recent contributions
The kinds of labels you've added
Your subscribers
Links to your own web sites or blogs
You can allow people to subscribe to your topics by sending them to your public profile, enabling them to add your work to their search results.
However, the power of Google Co-op goes even further. To get the most benefit out of these new developments, you'll want to check out subscribed links.
Subscribed Links allow you to gain free promotion for your website by adding your services directly into Google search. This makes your links much more prominent for people who are subscribed to your content.
Google provides a number of special features, including currency conversion, movie showtimes, and stock quotes. You can create your very own services as well by building subscribed links.
Other web sites have already begun offering these customized services.
OpenTable created a subscribed link that delivers real-time information about restaurant availability whenever subscribers perform a restaurant search on Google. These specialized links lead to a page where users can make reservations on OpenTable's website.
People Magazine built a subscribed link that gives users relevant celebrity info based on the queries they type into Google. Users who subscribe to the magazine's content receive priority links at the top of Google search results that lead to more detailed celebrity info on the People Magazine site.
You can create similar applications and allow your users to add your subscriptions to their default Google search page. These results appear in the Google OneBox, a special text area with a light green background. So, if someone subscribes to your links, they will instantly see your website at the top of the results whenever they search for topics related to your expertise.
For a complete guide on creating subscribed links, go to http://www.google.com/coop/docs/guide_subscribed_links.html.
For webmasters, this is a very exciting development. Expect to see some "Subscribe to our Links!" buttons appearing on web sites very soon.
Ready to Get Started?
Google Co-op is now in its infancy, which makes this the perfect opportunity to gain some ground early in the game. The Google Co-op directory is currently very small, with only a handful of topics covered. By getting started now, you can gain an advantage over those who enter later, much like those who established their rankings early on in the Google search engine.
About The Author
Kim Roach, the Hip Marketing Gal, has many more free traffic tips to unveil. To learn exactly how to attract tons of targeted, qualified web site traffic to your site, visit www.unleashthetraffic.com/traffic3.html.
posted by Scott Jones @ 8:58 am
Friday, May 19, 2006
Best Blogging Software
The blogging platform wars are getting really interesting and much of the discussion I find myself in lately revolves around what is happening with various CMS systems. The market can essentially be defined into 3 major camps: remotely hosted, self hosted, and community based systems. I have used pretty much every blogging platform available and each of them has its ups and downs. In this article I will cover the best options for each area taking into account price, usability, market share and of course SEO potential.
All of these products are either open source, completely frëe or have a functional frëe version.
Remotely Hosted Blogging Software
(Note: I cannot really recommend any of these from an SEO stand point since optimizing a domain you do not own or control is obviously not a good marketing plan.)
Blogger is completely free and currently owns the majority of the remotely hosted user base, though not by a landslide. Launched in 1999 and acquired by Google in 2003, Blogger essentially fired up the blogging trend we see today. It is by far the easiest overall solution to use and, if you are a novice user looking to throw up some recipes or poetry, this is for you. It includes some great features like comments, photo blogging, and a basic community feel with user profiles. Because it is so dumbed down, there are some features you may not find with Blogger that are only available through 3rd party add-ons. As a side note, Blogger weblogs do quite well in the search engines, and this was recently exploited when they became the first choice for spam blogs, or splogs. A splog is a weblog used for the sole purpose of gaining inbound links or generating thousands of keyword-stuffed pages with AdSense and the like. The recent Google Jagger update cleared a large portion of this up. Price: Free.
Released in 2003, Typepad is a product of Sixapart, the makers of Movable Type. It is largely based on MT, but there are some major enhancements and differences. Your blog can accommodate one or more photo albums with auto thumbnail generation. You can easily add music, books, and other media to Typelists, which grab a thumbnail from Amazon and other retailers for easy display in your sidebar. Typepad is also a great deal more technical than Blogger, so a bit of HTML know-how is recommended. On that note, editing your blog to look the way you want is quite easy, and Typepad blogs are known for being very eye-pleasing, intuitive and easy to navigate. In Sixapart's business model, Typepad is aimed at regular home and small business users while Movable Type is targeted at larger businesses or internal intranets. Price: Basic, $4.95 a month; Premium, $8.95 to $14.95 a month.
Xanga originated back in 1999 as a site for sharing book, music and movie reviews. Although it quickly morphed into a full-blown blogging tool, Xanga still maintains the ability to run a powerful review site. It pulls data from several retailers like Amazon.com, including thumbnails, pricing and cover art. The software is also very usable by novices, with a powerful WYSIWYG editor allowing for easy HTML editing and for adding smilies, links, and other symbols. By using blog rings it is also easy to interface with Xanga's other 3 million users to share interests, ideas, and of course traffic. Xanga comes in a free and a $25 flavor.
Mentions: Blogsome, Blogster, MindSay, Multiply
Self Hosted Blogging Software
WordPress originally began as a mod of an older open source package known as B2. WP is MT's biggest competition and is often the bane of endless WordPress vs. Movable Type threads around the internet. Although launched only a few years ago, WP has really taken the blogosphere by storm. And with good reason - WordPress is completely free under GNU licensing and is packed with many features you will not find anywhere else. It is also much easier for novice users to install and start blogging with, and it has a very large and helpful community. WP runs on PHP/MySQL and is quite scalable, judging from some of the very large and heavily trafficked sites I see using it. It also sports utilities to import files from Movable Type, Textpattern, Greymatter, Blogger, and B2. WordPress recently upped the ante when Yahoo included it in their hosting packages, in addition to MT. I have to admit I am finding myself digging WP more and more, and will likely convert Profitpapers to WP as I get time (it can be a biznitch). WordPress is free.
Aside from maybe Greymatter (the original open source blogging tool), Movable Type dominated the blogging market share from 2002 to 2004. Released in late 2001, the Perl-based Movable Type by Sixapart has maintained a large portion of the blogging market, due mainly to the fact that there is a free version (supporting up to 3 weblogs) and that it is incredibly powerful, intuitive and easy to customize. Template-driven Movable Type also sports one of the largest communities of developers and blogging enthusiasts around, meaning lots of support, idea sharing, and of course plugins. Movable Type can be configured to dynamically generate HTML, PHP or any other kind of pages you like, meaning it is incredibly scalable, fast, and loved by spiders. It is perhaps the most well-known blogging software for SEO purposes, and it is what currently powers Profitpapers and several of my other projects. Movable Type is either free with 3 authors, 1 weblog, and no support, or $69.95 with unlimited weblogs, authors and full support.
Textpattern is the brainchild of Dean Allen and was written to ease publishing of content for those not inclined to learn HTML. Like WP and MT, Textpattern runs on PHP and MySQL for easy administration, backups, and power. What really sets Textpattern apart from the others is the integration of Textile, a tool for easily formatting content for those who do not know HTML. WP and MT have modules for Textile as well, but it is native to the Textpattern system. Another bonus of the app is its superior handling of comment spam, due largely to its smaller market share. On the blogs I maintain running WP and MT, I often find myself clearing out spam every day, whereas on some very busy Textpattern sites I receive only manual comment spam (not bot driven). Textpattern is open source.
Mentions: Blosxom, LifeType, Serendipity
Community Based Blogging Software
Waaaaay back in 1997, Rob "CmdrTaco" Malda launched a website known as Chips & Dips, hosted via his student account at Hope College in Michigan; it soon became Slashdot. In 1999 Andover.net acquired the site, and shortly after, the underlying code was released as open source software called Slash. Like Movable Type and Greymatter, Slash runs on Perl, but it also has established hooks into MySQL and a very strong track record of scaling to enormous traffic levels. To give you an idea, the term 'slashdotted' originated from acquiring a link on this now infamous and very popular tech news website - and consequently watching your servers melt. If you have never messed around with Slash, you really should, as it is quite a powerful platform. Slash is open source.
Another well-known Perl-based community blogging software is Scoop, the software that powers Kuro5hin, DailyKos and many other busy community weblogs. Scoop took the Slashdot idea and expanded on it, making the discussion rather than the news the focus of the application. Where Slashdot entries tend to have a link with added commentary pointing readers off the site, Scoop points to stories written by members of the community, keeping readers within your own weblog. Scoop is also well known for handling large volumes of traffic and has a large, very technical community. Scoop is free.
Drupal is a well-known open source community blogging platform with a very large community of users and developers. Not only is Drupal free, it is damn powerful. Instead of Perl, which can be quite hard to decode even for a fluent coder, Drupal uses a PHP/MySQL platform. Drupal is also a very community-focused application, with a built-in forum, download area, and hundreds of home-brewed mods and hacks. If you are looking for a lot of functionality, give Drupal a look - the project has become quite mature. It is also much easier to use and customize than either Scoop or Slash.
Mentions: LiveJournal, PHP Nuke
Here is a handy blog software comparison chart courtesy of Online Journalism Review. Here is another from Weblog Industry Report which is much more thorough and nostalgic yet a tad dated.
If you are into following the development of open source CMS, portal, blog and related systems, you should check out opensourcecms.
About The Author
Miles Evans writes for ProfitPapers where he writes essays on organic SEO, SEM, development and other equally fascinating subjects.
posted by Scott Jones @ 9:57 am
Monday, May 15, 2006
SEO With Google Sitemaps
What is a Google Sitemap?
A Google Sitemap is a very simple XML document that lists all the pages in your website, but the Google Sitemaps program is actually much more important than that. In fact, the Sitemaps program provides a little peek inside Google's mind - and it can tell you a lot about what Google thinks of your website!
Why Should You Use Google Sitemaps?
Until Google Sitemaps was released in the summer of 2005, optimizing a site for Google was a guessing game at best. A website's page might be deleted from the index, and the Webmaster had no idea why. Alternatively, a site's content could be scanned, but because of the peculiarities of the algorithm, the only pages that would rank well might be the "About Us" page, or the company's press releases.
As webmasters we were at the whim of Googlebot, the seemingly arbitrary algorithmic kingmaker that could make or break a website overnight through shifts in search engine positioning. There was no way to communicate with Google about a website - either to understand what was wrong with it, or to tell Google when something had been updated.
That all changed about a year ago when Google released Sitemaps, but the program really became useful in February of 2006 when Google updated it with a couple new tools.
So, what exactly is the Google Sitemaps program, and how can you use it to improve the position of your website? Well, there are essentially two reasons to use Google Sitemaps:
1. Sitemaps provide you with a way to tell Google valuable information about your website.
2. You can use Sitemaps to learn what Google thinks about your website.
What You Can Tell Google About Your Site
Believe it or not, Google is concerned about making sure webmasters have a way of communicating information that is important about their sites. Although Googlebot does a pretty decent job of finding and cataloging web pages, it has very little ability to rate the relative importance of one page versus another. After all, many important pages on the Internet are not properly "optimized", and many of the people who couldn't care less about spending their time on linking campaigns create some of the best content.
Therefore, Google gives you the ability to tell them on a scale of 0.0 to 1.0 how important a given page is relative to all the others. Using this system, you might tell Google that your home page is a 1.0, each of your product sections is a 0.8, and each of your individual product pages is a 0.5. Pages like your company's address and contact information might only rate a 0.2.
You can also tell Google how often your pages are updated and the date that each page was last modified. For example your home page might be updated every day, while a particular product page might only be updated on an annual basis.
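Putting the last two points together, a Sitemap file is just XML listing each URL with its priority, change frequency, and last-modified date. Here is a minimal sketch that generates one; the namespace is the Sitemap 0.84 schema in use at the time of writing, and the example URLs, dates, and values are illustrative, not from any real site.

```python
import xml.etree.ElementTree as ET

# Sitemap 0.84 schema namespace (current as of this writing).
NS = "http://www.google.com/schemas/sitemap/0.84"

# (URL, priority, change frequency, last modified) - all hypothetical.
pages = [
    ("http://www.example.com/",             "1.0", "daily",  "2006-05-15"),
    ("http://www.example.com/products/",    "0.8", "weekly", "2006-05-10"),
    ("http://www.example.com/contact.html", "0.2", "yearly", "2006-01-01"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, priority, freq, modified in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = modified
    ET.SubElement(url, "changefreq").text = freq
    ET.SubElement(url, "priority").text = priority

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Save the output as sitemap.xml in your site's root and submit it through your Sitemaps account.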
What Google Can Tell You About Your Site
Having the ability to tell Google all this information is important, but you don't even need to create a sitemap file in order to enjoy some of the perks of having a Google Sitemaps account.
That's because even without a Sitemap file, you can still learn about any errors that Googlebot has found on your website. As you probably know, your site doesn't have to be "broken" for a robot to have trouble crawling its pages. Google Sitemaps will tell you about pages it was unable to crawl and links it was unable to follow. Therefore, you can see where these problems are and fix them before your pages get deleted from the index.
You can also get information on the types of searches people are using to find your website. Of course, most website analytics tools will give this information to you anyway, but if the tool you use doesn't have this feature, then it's always nice to get it for free from Google.
But the best part of the Sitemaps program is the Page analysis section that was added in February of 2006. This page gives you two lists of words. The first list contains the words that Googlebot associates with your website based on content on your site. The second list contains words that Googlebot has found linking to your site!
Unfortunately, Google limits the number of words in each list to 20. As a consequence, the inbound links column is partly wasted by words such as "http", "www", and "com" - terms that apply equally to all websites (hey Google, how about suppressing those terms from the report?). That said, this list does provide you with a way to judge the effectiveness of your offsite optimization efforts.
When you compare these two lists, you can get an understanding of what Google thinks your website is about. If the words on your Site Content column are not really what you want Googlebot to think about your site, then you know you need to tweak your website's copy to make it more focused on your core competency.
If, on the other hand your inbound links don't contain any keywords that you want to rank well for, then perhaps you should focus your efforts in that direction.
Above all else, you really want these two lists to agree. You want your inbound linked words to match up to the site content words. This means that Google has a clear understanding of the focus of your website.
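The comparison described above is easy to automate once you have the two lists in hand. The sketch below uses made-up word lists standing in for what the Page analysis section might report, strips the noise terms mentioned earlier, and scores how well the lists agree.

```python
# Hypothetical word lists of the kind the Page analysis section reports.
site_content = {"marketing", "seo", "traffic", "newsletter", "articles"}
inbound_links = {"seo", "marketing", "http", "www", "com", "tools"}

# Noise terms like "http" show up in nearly every site's inbound list.
noise = {"http", "www", "com"}
inbound = inbound_links - noise

# Words the two lists share, and a simple 0-to-1 agreement score.
overlap = site_content & inbound
agreement = len(overlap) / len(site_content | inbound)

print("Shared focus words:", sorted(overlap))
print("Agreement score: %.2f" % agreement)
```

A low score suggests your on-site copy and your inbound link text are pulling Googlebot in different directions; the shared words show which themes are already reinforced from both sides.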
Additional Benefits of the Sitemaps Program
Google has even started notifying Sitemaps-participating Webmasters if they are breaking any of Google's Webmaster Guidelines. This can be very valuable information if your site suddenly becomes de-listed on Google and you don't know why.
Only Sitemaps participants can get this information, and it is only provided at Google's discretion. In fact, Google will NOT notify you if you are creating worthless websites that offer no original content, or if you are creating thousands of doorway pages that redirect to other web sites. Google doesn't want to give spammers any clues as to how to improve their techniques.
How Do You Get Started with Google Sitemaps?
The first thing you must do is obtain a Google Account. If you already have a Gmail, AdSense, or AdWords account, then you are all set. If not, you can register an account by visiting the Google Accounts page.
Building your sitemap file is pretty easy to do if you are familiar with XML, and if you aren't you can always use a third-party tool such as the ones that are listed on Google's website. Google also has a "Sitemap Generator" that you can download and install on your server, but unless you are fairly adept at managing Python scripts, you should probably stick to the third-party tools.
At any rate, once you have your Google Account and your Sitemap file built, the rest is very easy. All you have to do is:
1. Log into your account
2. Type your website's URL into the "Add Site" box and click on "OK"
3. Click on the Manage Sites link for the website you are adding, and add your sitemap file to your account.
Google Sitemaps - An Excellent SEO Tool
Google Sitemaps help Googlebot quickly find new content on your website. They allow you to tell Google what's important, what's new, and what changes often. The tools provided to webmasters through the program can play a vital role in helping you understand how the search engines (especially Google) view your website.
Using this information you can dramatically improve the position of your website and quickly clear up any issues Google finds. You can also use the tools provided by Google to gauge the effectiveness of your off-site optimization efforts so you can better focus your time and energy on activities that bring you the most success.
About The Author
Matthew Coers is an Internet marketing expert. His website, ProfitChoice.com contains online courses designed to teach entrepreneurs how to build a website and make money online.
posted by Scott Jones @ 7:23 am
Friday, May 05, 2006
Google's Orion :: The Next Shining Star or a Burnt Ember?
By now you have probably heard that Google bought a new algorithm developed by a university student down under.
Many in the industry have speculated on what this could mean. Will this transform Google yet again? Or will it merely be another piece of technology they buy but never appear to use?
In this article, I look at the implications of Orion and what it could mean to the future of search.
In order to understand the issues at play, we must first understand just what the heck Google bought.
Orion is a new algorithm which, in a nutshell, works like Ask or Clusty: it matches results not only on keywords but also on concepts related to those keywords.
For example, if you were to search for "Canada" you may get not only the Government of Canada website, but also websites dealing with history, sites talking about the official languages of Canada (there are two official languages in case you were wondering) and more.
This is similar to how Ask allows you to drill down or up to narrow or broaden your search.
Many people feel that Orion will "revolutionize" Google. And, while it will be interesting to see what Google does with the technology, I'm more inclined to agree with Danny Sullivan's assessment. In a recent article he basically says "So what?"
Mind you, I don't think this is as ho-hum an issue as Danny makes it out to be. However, I also don't think it's as huge a deal as others have made it.
For example, Danny says, "When Google acquired the three people from Kaltix along with their search technology back in 2003, it hardly created a revolutionary change for us soon after." And I'd have to disagree with him.
While the results of the Kaltix acquisition weren't immediately obvious, they did show up, at least partly, a little later on in the "update from hell" as many webmasters still call it today. It was also known as the Florida Update.
I do agree with this assessment also made by Danny in the article: "It sounds like Allon mainly developed an algorithm useful in pulling out better summaries of web pages."
Because, the way I see it, that is all this is: A way to make the search experience a little more useful.
Will This Revolutionize Search?
I don't think so. But it does do a couple of things for Google:
For one, it makes it easier for users to find the data they want on Google, which in turn improves loyalty to the engine and ultimately increases the company's bottom line.
And another big reason for the purchase? To keep the technology out of the hands of the competition. Namely, Yahoo! and Microsoft.
So What Will Orion Do For Search?
Well, as I mentioned, it will make it easier to find information on Google. For example, if you can't find what you want in the immediate results, you can scan some related terms, pick one that better matches what you are looking for, and then view the results there.
Also, look at what such a search does to the searcher. No longer does the searcher hit a result and leave the engine. Now, they could spend longer on the engine, potentially reviewing more results and obviously being exposed to more ads.
In reality, while this is a nice bell or whistle, the only one who's really going to benefit is Google, because it increases ad exposure: more ads get seen, which means a greater chance of an ad being clicked on.
Will the Average Person Use It?
In all honesty I doubt it. I think it's a tool guys like me will use. You know the type – always into the latest and greatest (if buggy) things. Those things that have a coolness factor.
But, in reality the average person doesn't care about these types of gimmicks. They just want the search engine to show them the right result every time. If you force users to hunt for the right results, you risk them switching engines until they find what they're looking for.
Therefore, the average user will probably say "hmm that's interesting, but what I really want to see better be in the top 2 or 3 results."
Ultimately, Orion will do a couple of things for Google. It will add some new functionality that some users will (but most won't) use, and it ensures that Microsoft and Yahoo! have to build the same technology to remain competitive.
In the end, Danny was mostly right: Google gets another good employee and the technology may give them "another evolutionary change that may improve things over time, rather than instantly."
About The Author
Rob Sullivan is an SEO consultant and writer for Textlinkbrokers.com. Textlinkbrokers is the trusted leader in building long-term rankings through safe and effective link building.
posted by Scott Jones @ 9:58 am
Tuesday, May 02, 2006
The Advance Of Algorithms - New Keyword Optimization Rules
Maintaining and marketing a website can be a difficult task especially for those who are inexperienced or who have very little experience. SEO rules are constantly changing and even then, many SEO professionals disagree on the actual specifics required to optimize a website. This is in no small part due to the search engines themselves.
Major search engines like Google are constantly striving to ensure that sites at the top of their result pages offer invaluable information or service to their visitors. However, webmasters who are looking to make quick money while offering very little quality content are always finding new ways to beat the search engines at their own game. For this reason, search engines regularly change the methods they use to determine the relevancy and importance of your site.
Evolving Search Engines
The first step you should take is to ensure that your website will do an effective job of turning visitors into money. The content needs to be optimized so that both search engines and human visitors deem it a useful website. Once upon a time, effective optimization entailed cramming content with as many keywords as possible, and while this once generated good search engine results, it invariably put visitors off. It is also now frowned upon and penalized as spam by all of the major search engines.
The Evolution And Improvement Of Algorithms
Search engines use specific algorithms to determine the relevance of your website. The calculations from these algorithms determine where on the search engine result pages your website will appear. In order to keep the unscrupulous webmasters guessing and ensuring that results are always up to date, major search engines regularly update their algorithms.
Some of the most recent changes have seen the impetus move away from optimizing websites for search engines; instead, the algorithms are now geared to promote websites that give true value to visitors. The algorithms are not only changing, they are evolving to become more intelligent and accurate. While the use of keywords based around the relevant topic is still important, it is also important to ensure that visitors are your main priority.
Keyword optimization is now more heavily policed. Those who include keywords too often will have their sites labeled as spam, whereas too few instances of the appropriate keyword means you won't receive the desired results. However, the algorithms have become particularly smart, so in addition to the keywords you want to target you should include other relevant keywords. Including inflections of keywords is one excellent way to ensure that your site is deemed relevant. Inflections are slight variations of your keyword. For example, inflections of the keyword "advertising" include advertise, advertised, advertisement, etc.
Weight is also given to keywords that are included in certain sections of a page. These sections include the title tag, meta tags (only relevant to smaller search engines now), header tags, image alt tags and formatting tags (e.g. keywords in bold or italics) in your text. With image alt tags and hyperlink title tags, it is important that you don't simply stuff these with keywords, because this will be ignored at best and penalized at worst.
Natural Content Writing
One of the most effective ways to ensure that your site is properly keyword optimized is to write the content naturally first. Once you have done this, go through and ensure that any relevant keywords are included throughout the text. Only place them where they would appear naturally, and remove them from anywhere they appear awkward. Once you've written the content, you should also check the remaining factors to ensure everything is in order.
SEO Keyword Checklist
Below is a keyword checklist to ensure that you have fully optimized your web pages to the current, generally accepted search engine algorithm rules.
URL: Get your primary keyword as close to the beginning of the URL as possible.
Title Tag: The title should be between 10 and 50 characters and include one or more keywords while still being descriptive.
Description Meta Tag: The description meta tag should be insightful and useful but it should also contain one or two of your more important keywords.
Keyword Meta Tag: It makes sense that you should include all of your keywords in the keyword meta tag. Do not include any words that don't appear in the body of your text.
Keyword Density: The percentage of your content made up of keywords. A total keyword density (all keywords combined) of around 15% to 20% is the maximum you should aim for, and anything less than 5% is unlikely to yield good results. Density for a single keyword should be between 1% and 7%; 1% is on the low side and 7% a little too high, so wherever possible aim for approximately 5% with the primary keyword and 3% with secondary and subsequent keywords.
Header Tags (e.g. H1 and H2 tags): More weight is given to keywords that appear within H1 tags, then H2 tags and so on.
Text Formatting (e.g. strong, bold and underline): This may not carry much weight in the algorithms, but generally if you bold the first instance of each keyword and the last instance of your primary keyword you should see some positive results.
Beginning Of Text: The closer you can get your keywords to the beginning of your page content the better. Try to include your primary keyword within the first sentence or two and also within the last paragraph.
Key-Phrases As Whole Phrases: If you are targeting Internet Marketing as a key phrase then do not split the words up if possible. Some effect is noticed if the words are split, but much more benefit is received by including the phrase as a whole.
Alt Text: Include your keyword at least once in the Alt tag of any images. Ensure that the text is relevant to the image and gives some information.
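Keyword density, as used in the checklist above, is simply the share of a page's words accounted for by a keyword or key phrase. A quick way to audit your copy is to count whole-phrase occurrences, per the "Key-Phrases As Whole Phrases" rule; this is a minimal sketch, and the sample text and phrase are made up for illustration.

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by `phrase`,
    counting the phrase only when its words appear together in order."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the text and count whole-phrase matches.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

sample = ("Internet marketing takes time. Good internet marketing "
          "builds steady traffic to your site.")
print(round(keyword_density(sample, "internet marketing"), 1))
```

Run against your real page copy, a result well above the ~5% target for a primary keyword is a sign to thin the phrase out rather than add more.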
About The Author
Matt Jackson, founder of WebWiseWords, is a professional copywriter. Whether your business or your website needs a website content copywriter, an SEO copywriter, a press release copywriter or a copywriter for any other purpose, WebWiseWords can craft the words you want.
posted by Scott Jones @ 7:44 am