How Google could improve Search

Google is about to roll out (or may already have rolled out) another new way to “improve” the search results that it returns. This has nothing to do with whether or not my site places well.

I have a really big wish-list item, and I think it is a win-win. I do a lot of searching for radio stations I haven’t found yet, and I would like to be able to remove domains from the result set, similar to the way I can now remove or demote news sources I don’t like or trust from Google News. That feature greatly improved my satisfaction.

There are a large number of “spammy” web sites that list radio stations (I accept that some of them may say the same of me), but if I could eliminate those from the search results it would save a lot of time. Google could also start to notice patterns in which sites individual users consider “spammy”.

When I do a search on a radio station, typically I get results like:

  • Radiolineup – just takes the FCC database and makes the data texty
  • mail-archive.com – an echo of a DX mailing list
  • facebook – the auto-generated-from-Wikipedia crap
  • wikimapia – showing me the station on a map
  • radiostationforums – content-free site selling iTunes music

Similar sites I don’t leech from:

  • iro7.com 
  • radiotime
  • streema.com
  • radio74.com
  • radio-locator.com

Wikipedia clones:

  • haitel.com
  • resurrectionparishjohnstown.com

Then there is the total gibberish or minimally relevant:

  • auto555.com
  • whatrhymeswith.com
  • city-data.com
  • hometownlocater.com
  • merchantcircle.com
  • gaebler.com
  • recnet.com
  • top40charts.com
  • radio74.net
  • findlocalweather.com

I hard-code exclusions for some of that list, but Google has a limit on how many -site: parameters it will accept.
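To make that concrete, here is a rough sketch (Python, purely illustrative) of stacking -site: exclusions onto a query by hand. The domain list is just a sample from above, and the 32-word cap is an assumption about where the limit sits; the only thing I know for certain is that some limit exists.

    # Rough sketch: stack -site: exclusions onto a query until hitting a word limit
    from urllib.parse import urlencode

    # Sample of the domains I keep excluding by hand
    BLOCKED = ["auto555.com", "whatrhymeswith.com", "city-data.com", "merchantcircle.com"]

    def build_query(terms, blocked=BLOCKED, max_words=32):
        # max_words=32 is an assumption; Google enforces *some* limit on query terms
        parts = terms.split()
        for domain in blocked:
            if len(parts) >= max_words:
                break
            parts.append("-site:" + domain)
        return "https://www.google.com/search?" + urlencode({"q": " ".join(parts)})

    print(build_query("WXYZ 101.5 FM streaming"))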

About Art Stone

I'm the guy who used to run StreamingRadioGuide.com (and FindAnISP.com).

3 Responses to How Google could improve Search

  1. Hesperus says:

    BING!

    • Art Stone says:

      Bada!

      Doesn’t do what I want. Both allow me to type -site:someplace.com for each search.

      What I want is, while logged in, to be able to build up a list of domains I want excluded from results, a list that stays in effect as long as I am logged in with my Google account. Google does somewhat customize your results based on your search history, but I want to be more proactive in steering it.
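      Just to pin the idea down, here is a purely hypothetical sketch (nothing Google actually offers) of that kind of persistent block list, applied to whatever result URLs come back; the file name and result format are made up for illustration.

          # Hypothetical sketch of a persistent exclusion list applied to results
          import json
          from pathlib import Path
          from urllib.parse import urlparse

          BLOCKLIST = Path("blocked_domains.json")  # stand-in for an account-level setting

          def block(domain):
              domains = set(json.loads(BLOCKLIST.read_text())) if BLOCKLIST.exists() else set()
              domains.add(domain.lower())
              BLOCKLIST.write_text(json.dumps(sorted(domains)))

          def filter_results(urls):
              blocked = set(json.loads(BLOCKLIST.read_text())) if BLOCKLIST.exists() else set()
              return [u for u in urls if urlparse(u).hostname not in blocked]

          block("auto555.com")
          print(filter_results(["http://auto555.com/wxyz", "http://radio-locator.com/wxyz"]))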

  2. Art Stone says:

    Amazingly, 2 weeks later, Google brought out exactly what I asked for!

    http://googleblog.blogspot.com/2011/03/hide-sites-to-find-more-of-what-you.html

    The one detail that I missed at first was that you first have to click through to the site, and then back up to the original search result page. At that point, an extra option becomes visible to ignore all future results from that site (as long as you are logged into Google).
