SEO Expert Bangalore - India :: Pradeep SV

Sunday, February 26, 2006

How To Ensure The Search Engines Find Your Website

One of the most fundamental aspects of search engine optimisation (SEO) is ensuring that the pages within your website are as accessible as possible to the search engines. It's not only the homepage of a website that can be indexed, but also the internal pages within a site's structure. The internal pages of a site often contain important content such as products, services or general information, and therefore can be uniquely optimised for related terms. As a result, easy access to these pages is vital.

There are many do's and don'ts involved in ensuring all of your pages can be found by search engines. However, it is important to first establish how the search engines find and index web pages.

Search engines use "robots" (also known as "bots" or "spiders") to find content on the web for inclusion in their index. A robot is a computer program that follows the hyperlinks on a web page, a process known as "crawling". When a robot finds a document it adds the content to the search engine's index, then follows the next links it can find, continuing the cycle of crawling and indexing. With this in mind, it becomes apparent that the navigational structure of a website is important in getting as many pages as possible indexed.

When planning the navigational structure of your site, think about the hierarchy of your content. Search engines judge which pages of a site they consider most important when ranking results, and a page's position in the site structure can influence this. The homepage is generally considered the most important page of a site - it is the top-level document and usually attracts the most inbound links. From here, search engine robots can normally reach pages that are within three clicks of the homepage. Therefore, your most important pages should be one click away, the next most important two clicks away, and so forth.

The next thing to consider is how to link the pages together. Search engine robots can only follow generic HTML href links, meaning Flash links, JavaScript links, dropdown menus and submit buttons will all be inaccessible to robots. Links with query strings that have two or more parameters are also typically ignored, so be aware of this if you run a dynamically generated website.
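To make the distinction concrete, here is a short, hypothetical markup sketch (the URLs and function names are invented for illustration). Of the links below, only the first is the kind a search engine robot can reliably follow:

    <!-- A plain HTML text link: robots can follow this -->
    <a href="/services/seo.html">Search engine optimisation services</a>

    <!-- A JavaScript link: invisible to robots -->
    <a href="javascript:openPage('seo')">SEO services</a>

    <!-- A form submit button: robots do not submit forms -->
    <form action="/find-services.php"><input type="submit" value="View our services"></form>

    <!-- A dynamic URL with two query string parameters: typically ignored -->
    <a href="/services.php?cat=12&amp;sess=8A3F">SEO services</a>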

The best links to use from an SEO perspective are generic HTML text links, as not only can they be followed by robots but the text contained in the anchor can also be used to describe the destination page - an optimisation plus point. Image links are also acceptable but the ability to describe the destination page is diminished, as the alt attribute is not given as much ranking weight as anchor text.
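As a quick sketch (the page and image names are invented), compare the two approaches - the text link carries its description in the anchor itself, while the image link can only describe its destination through the weaker alt attribute:

    <!-- Text link: the anchor text describes the destination page -->
    <a href="/widget-repair.html">Widget repair services</a>

    <!-- Image link: only the alt attribute describes the destination -->
    <a href="/widget-repair.html"><img src="/images/repair-button.gif" alt="Widget repair services"></a>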

The most natural way to organise content on a website is to categorise it. Break down your products, services or information into related categories and then structure this so that the most important aspects are linked to from the homepage. If you have a vast amount of information for each category then again you will want to narrow your content down further. This could involve having articles on a similar topic, different types of product for sale, or content that can be broken down geographically. Categorisation is natural optimisation - the further you break down your information the more content you can provide and the more niche key phrases there are that can be targeted.
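As a minimal sketch of this idea (the categories and paths are invented), the navigation below puts each category one click from the homepage and each sub-category two clicks deep - comfortably inside the three-click limit discussed earlier:

    <!-- Homepage navigation: top-level categories, one click deep -->
    <ul>
      <li><a href="/books/">Books</a></li>
      <li><a href="/music/">Music</a></li>
    </ul>

    <!-- On the /books/ page: sub-categories, two clicks deep -->
    <ul>
      <li><a href="/books/fiction.html">Fiction books</a></li>
      <li><a href="/books/travel-guides.html">Travel guide books</a></li>
    </ul>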

If you are still concerned that your important pages may not get indexed, then you can consider adding a sitemap to your website. A sitemap can be best described as an index page - it is a list of links to all of the pages within a site contained on one page. If you link to a sitemap from your homepage then it gives a robot easy access to all of the pages within your site. Just remember - robots typically can't follow more than 100 links from one page, so if your site is larger than this you may want to consider spreading your sitemap across several pages.
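A sitemap page needs nothing fancier than plain text links - something along these lines (the pages are invented for illustration). For a site with more than 100 pages, the same pattern can simply be repeated across one sitemap page per category:

    <!-- sitemap.html: a plain list of text links to every page -->
    <h1>Sitemap</h1>
    <ul>
      <li><a href="/">Homepage</a></li>
      <li><a href="/books/">Books</a>
        <ul>
          <li><a href="/books/fiction.html">Fiction books</a></li>
          <li><a href="/books/travel-guides.html">Travel guide books</a></li>
        </ul>
      </li>
      <li><a href="/contact.html">Contact us</a></li>
    </ul>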

There are many considerations to make when optimising your site for search engines, and making your pages accessible to search engine robots should be the first step of your optimisation process. Following the advice above will help you make your entire site accessible and aid you in gaining multiple rankings and extra traffic.

Thursday, February 09, 2006

Web 2.0: The Next Big Thing or the Evolution of a Technology?

Is it a movement? A revolution? Perhaps a new paradigm? Or, is it a bunch of hype designed to sell a bunch of new software? Just what is Web 2.0?

Well, the term has been around since 2003. It was coined by I-Net pioneer Dale Dougherty and introduced at a conference by Tim O'Reilly of O'Reilly Media, Inc., who has subsequently made attempts at defining just what Web 2.0 means. In his seminal document entitled What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software, O'Reilly describes Web 2.0 as follows:

"Like many important concepts, Web 2.0 doesn't have a hard boundary, but rather, a gravitational core. You can visualize Web 2.0 as a set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core."
- Tim O'Reilly

Okay, that's a starting point of sorts - gravitational core, set of principles and practices, veritable solar system. The fact is, O'Reilly, the champion of Web 2.0, has written eloquently on the subject, but after reading his detailed explanation, you still walk away scratching your head. Additional research clearly demonstrates that there's a lack of consensus.

Tim Bray, writing at http://radar.oreilly.com/, strongly contests the use of the term Web 2.0, calling it nothing more than a meme. Okay, so what's a meme? Well, we have to go back to 1976 to find the origin of the term created by Richard Dawkins in his text, The Selfish Gene. In it, Dawkins describes memes broadly:

"Examples of memes are tunes, ideas, catch-phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate themselves in the gene pool by leaping from body to body via sperms or eggs, so memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation."

Okay, now we're getting somewhere. Web 2.0 is a catch phrase and one that's getting a lot of attention within the e-commerce community. In fact, since the term made its way into the collective I-conscious, there have been more than 9 million Google searches for Web 2.0 information. Somebody's interested.

Yes, there's something there, and when you cut through the hype, delete the meme and study the underlying concepts, Web 2.0 does offer some thinking points for every site designer, host and owner. Let's look at some of the parameters of this new way of thinking about the www.

Extreme Trust

A great catch phrase in its own right. Extreme trust is a new vision for using the collective knowledge of Internet users, demonstrated by the ascendancy of Wikipedia. In the world of Web 1.0 (the model for the past decade), the Internet was a source of information. However, the information was static. You could access World Book or The Encyclopedia Britannica on-line, but all you could do was read it, print it out and use it for your child's homework.

Sites such as Wikipedia and the Open Directory Project are changing this dynamic, based on the concept of extreme trust.

Wikipedia is a growing collection of information (over 100,000 unique entries) submitted and edited by volunteers. It changes daily, hourly, providing the latest information from a variety of writers of varying degrees of expertise. Information can be edited by anyone who knows more about the topic than the original poster. In fact, if you access certain topics on Wikipedia, you'll see warnings that certain entries have not been reviewed and therefore can't yet be deemed accurate. However, as more experts, operating under the doctrine of extreme trust, review each Wikipedia entry, the reliability and veracity of the content increases.

Thus, in the Web 1.0 world, people could access information, but not participate in its evolution. In the new age of Web 2.0, the collective intelligence of the world community becomes accessible and utile.

Personal Participation

Another much-touted aspect of Web 2.0 is personal participation. Personal web sites have been around for years. You could post family pix and tell the world what you did over summer vacation. But these personal web sites never really caught on because of the expense and time required to launch and maintain them.

Enter the web log, aka blog. These personal journals encourage greater individual participation by enabling anyone with an opinion, idea or random thought to post these personal musings for all the world to see. Bloggers have changed the way information is disseminated. Many have garnered credibility as legitimate news sources. In fact, bloggers have received press credentials for newsworthy events. They're used by the mainstream media as reference, and several have driven major news stories ahead of their larger print and on-line competitors - the blogger-led scrutiny that discredited CBS's "Memogate" documents in 2004 being a prominent example.

The concept of personal participation has also spilled over into the realm of e-commerce, with many on-line businesses offering a blog and/or forum where customers, clients and other interested parties can post their thoughts. Amazon.com is a leader in this area, encouraging its customers to submit reviews of purchased products. In fact, some Amazon reviewers have made names for themselves - and customers seek out their recommendations! The old anti-war chant's demand - "Power to the People" - has finally been realized.

In fact, if you tour the Amazon site, you'll discover opportunities for customer participation on virtually every page. Amazon's subsidiary, Booksurge.com, has also simplified the entire publishing process. Authors no longer have to approach traditional publishers, hat in hand, begging to be published. Booksurge and Amazon have made it possible for anyone to write, publish and sell texts through Amazon, B&N, Borders and other on-line outlets. Yes, this is part of the Web 2.0 model.

Static versus Dynamic

Netscape was the browser of choice in the Web 1.0 era. It was published, then updated regularly in various versions identified as Netscape 1.0, 2.0, etc. This was a static business model in which users had to wait for improvements to be made, then download the updates.

Fast forward to the dynamic age of Web 2.0 where Google reigns supreme. Google is a true child of the Internet. It was made to fit with I-net dynamics. Improvements are made and implemented daily - seamlessly. No downloads, no patches required. The result? Google has enabled all of us to access the most obscure factoid in a nanosecond. Its index contains billions of pages of spidered text and as more new sites sprout like mushrooms, more pages are spidered and the index grows.

Google has demonstrated how to do it right. It's highly interactive, it's never static and it has created many new avenues for the e-commerce community and for users in search of the name of the pharaoh who was in power when the rotary mill was introduced in Egypt. This has increased productivity exponentially.

The Evolution of Technology

Technology evolves. It builds on what came before. It learns from past mistakes and takes advantage of unrealized opportunities. This is as true of America's Industrial Revolution as it is of the Internet. There were lots of false starts, missteps and abject failures during the rise of technology in the early and mid-1800s. The same is true of the current technological revolution underway on your computer screen daily.

Remember the original Priceline model? You could spend two hours saving 9¢ on a can of peas. Nice try, but no cigar, despite William Shatner's campy commercials. Or, how about buying pet foods on-line? That went down in flames, too. In fact, all you have to do is look at the I-net bubble that burst in 2000 to see the shake-out of what was working and what wasn't. A lot of investors lost a ton of cash, but the Net didn't shrivel up and die. In fact, it's more powerful than ever.

Technology doesn't move forward in a straight line. It never has. There are offshoots, improvements and lots of really, really bad ideas along the way. (Anybody remember the Ford Edsel?) Internet technology is no different, except that the shakeouts occur much faster, the improvements take off much quicker and the really, really bad ideas are really, really expensive. Just ask Shatner. Such is the nature of technological evolution.

So, Is Web 2.0 A Revolution?

Tim O'Reilly and the other promoters of Web 2.0 have done us a service by focusing attention on new uses for the Net. RSS is a radical step forward. Podcasting, though in its infancy, is coming on strong, having caught the attention of advertisers as a new means to reach the cutting-edge public. In fact, just as anyone can set up and maintain a blog, today the technology exists to set up your own broadcast network, complete with specialized shows for niche markets like expectant parents or home schoolers.
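For anyone who hasn't peeked under the hood, an RSS feed is just a small XML file. The sketch below (titles and URLs invented for illustration) shows a minimal RSS 2.0 feed; the <enclosure> element inside the item is the piece podcasting relies on to attach an audio file:

    <?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>The Home Schooling Show</title>
        <link>http://www.example.com/</link>
        <description>A niche broadcast for home schoolers</description>
        <item>
          <title>Episode 1: Getting started</title>
          <link>http://www.example.com/episode1.html</link>
          <enclosure url="http://www.example.com/episode1.mp3"
                     length="12345678" type="audio/mpeg"/>
        </item>
      </channel>
    </rss>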

However, Web 2.0 also has aspects of a meme. Many on-line businesses have picked up the term and now proudly display a Web 2.0 logo on their home pages, though their sites have virtually no new features.

No, Web 2.0 isn't a new paradigm or a revolution. It's the natural evolution of a technology that's growing at truly heart-stopping speed. What was yesterday won't be tomorrow.

In the weeks and months ahead, we'll take a much closer look at this evolutionary track to sort hype from help, and to assist you in finding new, better ways to increase site traffic, improve your conversion rate and expand your repeat-customer base.

For now, Google Web 2.0 and start doing your homework. Changes are coming. Will you be ready? If not, you won't be here tomorrow.


Tuesday, January 17, 2006

Google Launches Mobile Personalized Homepage

Google launched a mobile version of its personalized homepage offering last week, allowing users to display stock quotes, news headlines, Gmail, weather, and selected RSS feeds on handheld devices, writes TechWeb News. The content is optimized for small screens and low bandwidth. Those who already have a personalized homepage can use the web browser of their cell phones to access Google and select the "personalized home" link to sign in. The service works with phones that have an XHTML-capable browser.
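For context, "XHTML-capable" means the handset's browser can render pages written in XHTML Mobile Profile - lightweight markup along these lines (a minimal, hypothetical page):

    <?xml version="1.0"?>
    <!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN"
        "http://www.wapforum.org/DTD/xhtml-mobile10.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head><title>Personalized Home</title></head>
      <body>
        <p><a href="http://www.google.com/">Google search</a></p>
      </body>
    </html>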

The launch came a week after Google announced that users would be able to access its search page via Motorola cell phones, points out Red Herring.

Red Herring also notes that Google has opened access to Google Talk and Google Local from BlackBerry handhelds.

"If anyone is going to make the internet over cell phones something that is useful, it will have to be a company like Google or Yahoo," Peter Gorham, an independent industry analyst, is quoted by Red Herring as saying.

Earlier this month, Yahoo also announced its mobile effort, which will be initially available on Nokia's Series 60 smartphones.
