

Saturday, November 20, 2010

Blekko search engine with a human touch


A small Silicon Valley company with some big-name backers has released a test version of a new search engine that, the company says, has a key ingredient missing from Google: the human touch.
According to Blekko, the Web has increasingly become saturated with spam-like websites, specially designed to pop up in Google's search results but whose content is heavier on marketing pitches than substantive information.
The remedy, Blekko Chief Executive Rich Skrenta told Reuters, is to narrow searches to groups of websites that people, not computers, have pre-approved as being the best sources of information for particular topics.
The approach is decidedly old-school in an industry where computer algorithms developed by engineers at Google and Microsoft Corp have for years been considered the ideal way to find information in the sea of online data.
Blekko has raised $24 million in funding as it has developed its product over the past three years, with angel investors including Marc Andreessen, co-creator of the first widely used Web browser, and Ron Conway, who has invested in tech companies including Twitter, Foursquare and even Google.
Blekko joins a long list of search engines that have tried to improve upon Google's business, with a less-than-stellar track record. Cuil, a high-profile search engine launched by former Google employees in 2008, quietly shut down in September; Powerset, a search engine that let people ask questions in plain English, was acquired by Microsoft in 2008.
Greg Sterling, an Internet consultant and a contributing editor for the online blog Search Engine Land, said he doesn't expect that Blekko will displace Google any time soon. But he said the company has developed a creative approach to search that might become popular as a secondary search engine for certain types of queries.
In an emailed statement, Google said that it welcomes competition that helps users get useful information and gives people new choices. "Having great competitors is a huge benefit to us and everyone in the search space; it makes us all work harder, and at the end of the day everyone benefits from that," said Google.
Blekko is launching with a special directory of websites that can provide spam-free results in seven general search categories: health, recipes, song lyrics, hotels, automobiles, colleges and personal finance.
Blekko also allows users to create their own personal directories of websites for any topic, so that the search engine only looks for information from sources the user deems relevant or trustworthy.
The focus on quality websites over quantity has appeal, said Sterling. "We don't care if there are 30 million results or 40 million results, it's really 'Give me the information I need in an efficient way so I don't have to go wading through all this nonsense'," he said.
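To get a feel for the idea behind such curated directories (this is only an illustration, not Blekko's actual implementation), here is a small Python sketch that keeps only the search results whose domains a user has pre-approved for a topic. The directory contents and result URLs are made up for the example:

    from urllib.parse import urlparse

    # Hypothetical user-curated directories: topic -> approved domains.
    directories = {
        "health": {"nih.gov", "mayoclinic.org", "who.int"},
        "recipes": {"allrecipes.com", "epicurious.com"},
    }

    def filter_results(results, topic, directories):
        """Keep only results whose domain is pre-approved for the topic."""
        approved = directories.get(topic, set())
        kept = []
        for url in results:
            domain = urlparse(url).netloc.lower()
            if domain.startswith("www."):
                domain = domain[4:]          # treat www.nih.gov as nih.gov
            if domain in approved:
                kept.append(url)
        return kept

    raw_results = [
        "https://www.nih.gov/health-information",
        "https://spammy-pills.example.com/buy-now",
        "https://www.mayoclinic.org/diseases-conditions",
    ]
    print(filter_results(raw_results, "health", directories))
    # ['https://www.nih.gov/health-information',
    #  'https://www.mayoclinic.org/diseases-conditions']

Whatever Blekko does under the hood, the effect the article describes is the same: a spam site simply never makes it into the result list, because its domain is not in the curated directory.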

Thursday, November 18, 2010

Miscellaneous information on search engines


At the beginning of 2004, a new and mysterious term appeared among SEO specialists: the Google SandBox. This is the name given to a new Google spam filter that excludes new sites from search results. The effect of the SandBox filter is that new sites are absent from search results for virtually any phrase, even sites with high-quality, unique content that are promoted using legitimate techniques.

The SandBox is currently applied only to the English-language segment of the Internet; sites in other languages are not yet affected, although the filter may well expand its reach. The presumed aim of the SandBox is to exclude spam sites: no search spammer is willing to wait months to get results. However, many perfectly legitimate new sites suffer the same consequences. So far, there is no precise information about what the SandBox filter actually is. Here are some assumptions based on practical SEO experience:

- The SandBox is a filter applied to new sites: a new site is put in the sandbox and kept there for some time until the search engine starts treating it as a normal site.

- The SandBox is a filter applied to new inbound links to new sites. There is a fundamental difference between this and the previous assumption: the filter is based not on the age of the site but on the age of inbound links to it. In other words, Google treats the site normally but refuses to acknowledge any inbound links to it until they have existed for several months. Since inbound links are one of the main ranking factors, ignoring them is equivalent to the site being absent from search results (a rough sketch of this idea follows the list). It is difficult to say which of these assumptions is true; quite possibly both are.

- A site may be kept in the sandbox for anywhere from three months to a year or more. It has also been noticed that sites are released from the sandbox in batches: the time a site spends in the sandbox is not calculated individually, but for groups of sites. All sites created within a certain period are put into the same group and are eventually released at the same time, so individual sites in a group can spend different amounts of time in the sandbox depending on where they fell in the group's capture-release cycle.
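As a rough illustration of the second assumption (the link-age filter), the Python sketch below ignores inbound links younger than a chosen threshold when computing a toy link score. The 180-day threshold, the weights and the link data are assumptions made for the example, not anything Google has confirmed:

    from datetime import date

    LINK_AGE_THRESHOLD_DAYS = 180   # assumed: links younger than ~6 months are ignored

    def link_score(inbound_links, today):
        """Sum link weights, counting only links older than the threshold."""
        score = 0.0
        for link in inbound_links:
            age_days = (today - link["first_seen"]).days
            if age_days >= LINK_AGE_THRESHOLD_DAYS:
                score += link["weight"]
            # Younger links contribute nothing, so a new site whose inbound
            # links are all new ranks as if it had no links at all.
        return score

    links = [
        {"first_seen": date(2010, 2, 1), "weight": 1.0},   # old enough, counted
        {"first_seen": date(2010, 10, 1), "weight": 5.0},  # too new, ignored
    ]
    print(link_score(links, date(2010, 11, 18)))  # 1.0

Under a filter like this, a site with excellent but recent backlinks still scores as if it had none, which matches the behaviour described above.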

Typical indications that your site is in the sandbox include:

- Your site is normally indexed by Google and the search robot regularly visits it.
- Your site has a PageRank; the search engine knows about and correctly displays inbound links to your site.
- A search by site address (www.site.com) displays correct results, with the correct title, snippet (resource description), etc.
- Your site is found for rare and unique word combinations present in the text of its pages.
- Your site does not appear in the first thousand results for any other queries, even the ones it was created to target. Occasionally there are exceptions and the site appears around positions 500-600 for some queries, but this does not change the overall sandbox situation. (A short script sketching these checks follows the list.)
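These checks are easy to script once you have some programmatic way to run queries. The sketch below only encodes the checklist; search(query) is a placeholder stub that you would have to replace with a search API you are actually allowed to use, so the names and results here are purely illustrative:

    def search(query, max_results=1000):
        """Placeholder: should return a list of result URLs for the query."""
        return []   # stub so the script runs; replace with a real, licensed API

    def sandbox_symptoms(site, unique_phrase, target_queries):
        """Encode the checklist above as a dictionary of booleans."""
        indexed = len(search("site:" + site)) > 0
        found_by_phrase = any(site in url
                              for url in search('"' + unique_phrase + '"'))
        in_top_1000 = any(site in url
                          for q in target_queries
                          for url in search(q))
        return {
            "indexed": indexed,
            "found_by_unique_phrase": found_by_phrase,
            "in_top_1000_for_target_queries": in_top_1000,
        }

    report = sandbox_symptoms("www.site.com",
                              "some unique phrase from one of the pages",
                              ["personal finance tips", "college loans"])
    print(report)

A site that is indexed and found by its unique phrases, yet never appears in the top thousand for its target queries, fits the sandbox pattern described above.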

There are no practical ways to bypass the SandBox filter. There have been some suggestions about how it might be done, but they are no more than suggestions and are of little use to a regular webmaster. The best course of action is to continue SEO work on the site's content and structure and wait patiently until the site leaves the sandbox, after which you can expect a dramatic jump in rankings, sometimes by 400-500 positions.