If you want to test your position in Google for a certain search term, you can do so on the Google website. By position I don't mean PageRank, but the actual place in the search results. You can also find the position programmatically in C#, as the method below shows.

You can use the code to build an application that lists your website's positions for all your important search terms. You could also change it to run asynchronously for better performance when checking multiple search terms.

// Requires: System, System.IO, System.Net, System.Text,
// System.Text.RegularExpressions and System.Web (for HttpUtility).

/// <summary>
/// Retrieves the position of the URL from a search
/// on www.google.com using the specified search term.
/// </summary>
public static int GetPosition(Uri url, string searchTerm)
{
   string raw = "http://www.google.com/search?num=39&q={0}&btnG=Search";
   string search = string.Format(raw, HttpUtility.UrlEncode(searchTerm));

   HttpWebRequest request = (HttpWebRequest)WebRequest.Create(search);
   using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
   {
      using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.ASCII))
      {
         string html = reader.ReadToEnd();
         return FindPosition(html, url);
      }
   }
}

/// <summary>
/// Examines the search result and retrieves the position.
/// </summary>
private static int FindPosition(string html, Uri url)
{
   string lookup = "(<h2 class=r><a href=\")(\\w+[a-zA-Z0-9.-?=/]*)";
   MatchCollection matches = Regex.Matches(html, lookup);

   for (int i = 0; i < matches.Count; i++)
   {
      string match = matches[i].Groups[2].Value;
      if (match.Contains(url.Host))
         return i + 1;
   }

   return 0;
}

Examples of use

You simply provide the method with your website URL and the search term to test, and it returns the position.

Uri url = new Uri("http://www.example.com");
int position = GetPosition(url, "search term");
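The post mentions running the check asynchronously when testing multiple search terms. A minimal sketch of that idea, wrapping the synchronous GetPosition in one task per term (assumes .NET 4.5+ for Task.Run, plus using directives for System.Linq and System.Threading.Tasks; the term list is illustrative):

```csharp
// Sketch: check several terms concurrently by wrapping the
// synchronous GetPosition from the post in tasks.
Uri url = new Uri("http://www.example.com");
string[] terms = { "first term", "second term", "third term" };

Task<int>[] tasks = terms
   .Select(term => Task.Run(() => GetPosition(url, term)))
   .ToArray();

Task.WaitAll(tasks);

for (int i = 0; i < terms.Length; i++)
   Console.WriteLine("{0}: position {1}", terms[i], tasks[i].Result);
```

The requests then overlap on the network instead of running back to back, which is where nearly all the time goes.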


Comment by Lars Buur

That’s a great example! I imagine you have already built this into Headlight :). It certainly gets the mind going about how to build a simple keyword monitor, with simple statistics, for the keywords that are interesting to our company. It would also be of great value to some of our customers who have invested in SEO consulting.

Comment by cyanbane

Comparing oranges to oranges, this is a good way to do it. The only problem is that you are not taking into account localization (a user in the UK, a user in NY, and a user in LA might all be seeing different results from different datacenters) and personalization (do you use Google Personal Search? www.google.com/.../answer.py). If you are just doing day-to-day searches from the same place and want to trend the data, then this is a great resource (or you could also just use the API).

Comment by Mads Kristensen

You can easily change the localization by replacing "google.com" with "google.wherever". Personalization is impossible to check for. Unfortunately, the Google API does not show the right positions at all, not by a long shot. I have tried that.
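Mads's domain-swapping suggestion could be made explicit by parameterizing GetPosition on the Google domain. A hypothetical overload (not part of the original post), reusing the FindPosition helper:

```csharp
/// <summary>
/// Hypothetical overload: pass the local Google domain
/// (e.g. "google.co.uk" or "google.de") instead of hard-coding it.
/// </summary>
public static int GetPosition(Uri url, string searchTerm, string googleDomain)
{
   string raw = "http://www." + googleDomain + "/search?num=39&q={0}&btnG=Search";
   string search = string.Format(raw, HttpUtility.UrlEncode(searchTerm));

   HttpWebRequest request = (HttpWebRequest)WebRequest.Create(search);
   using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
   using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.ASCII))
   {
      return FindPosition(reader.ReadToEnd(), url);
   }
}
```

Usage would look like: int ukPosition = GetPosition(url, "search term", "google.co.uk");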

Comment by cyanbane

Localization can happen within a country. A person in Los Angeles, California might hit a different data center than someone in NY (both using www.google.com), and you can hit different datacenters (with different result sets) at any time from the same location. Go to Google, query something, and hover over any of the "cache" links for a search result: the IP in place of the domain is the data center you are currently on. If you query something enough times you might get a different data center from your own machine (which could have different results). I believe the API just uses a standard datacenter for any request from any location, so although the results may not be "right", they could be what some user querying that data center would see; in that case, who of us is really seeing the "right" results? The changes between datacenters are usually fairly subtle, though, except during an update. There are already a bunch of tools to check this; here is one I sometimes use: http://www.seologs.com/data-center-check.html

Comment by Andreas Kraus

Hi Mads,

Good approach. What's missing for me is a variable display count (the num query attribute), and with it a way to crawl through the paging so you can check your keyword against 500 or more search results. For example, if your website is stuck in the Google Sandbox, it's mostly around position 4xx.
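Andreas's paging idea can be sketched by walking Google's result pages with the start query parameter until the host turns up. This is an illustration only: GetPositionDeep is a hypothetical name, the 100-per-page size is an assumption about the num parameter's maximum, and FindPosition is the helper from the post:

```csharp
/// <summary>
/// Sketch: searches page by page (using the "start" parameter)
/// until the URL's host is found or maxResults is reached.
/// </summary>
public static int GetPositionDeep(Uri url, string searchTerm, int maxResults)
{
   const int pageSize = 100; // assumed maximum for the num parameter

   for (int start = 0; start < maxResults; start += pageSize)
   {
      string raw = "http://www.google.com/search?num={0}&start={1}&q={2}";
      string search = string.Format(raw, pageSize, start, HttpUtility.UrlEncode(searchTerm));

      HttpWebRequest request = (HttpWebRequest)WebRequest.Create(search);
      using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
      using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.ASCII))
      {
         // FindPosition returns the 1-based position within this page.
         int position = FindPosition(reader.ReadToEnd(), url);
         if (position > 0)
            return start + position;
      }
   }

   return 0; // not found within maxResults
}
```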

I built a public website for SEO / Google checks in ASP.NET, including checks for up to 1000 search results: www.seo-trade.net (Keyword Ranking, Backlink Checker, Keyword Monitor, etc.)

Great blog, I really like reading it! Best regards from Germany,

Comment by Richard Jonas

I've written something similar before, but not as simply and elegantly as your solution here. Like Andreas, I ran this repeatedly over multiple pages of search results.

I've found that a search term's position can go up or down by a few places each time, and what you really need to know is whether there has been a dramatic change in your position (e.g. if you're running an online shop, you might want to order more stock if your search position has improved). So I ran this as a scheduled task that stores the results in a database and sends me an email if the position has changed dramatically.
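Richard's alerting idea boils down to comparing today's position against the last stored one and flagging large moves. A minimal sketch, with the database lookup and the email stubbed out and the threshold of 5 places chosen arbitrarily:

```csharp
// Sketch of the alerting idea from the comment above.
// previousPosition would come from a database in a real scheduled task.
int previousPosition = 12;
int currentPosition = GetPosition(new Uri("http://www.example.com"), "search term");

int change = previousPosition - currentPosition; // positive = moved up
if (Math.Abs(change) >= 5) // arbitrary "dramatic" threshold
{
   // Replace with your own notification, e.g. SmtpClient.Send(...).
   Console.WriteLine("Position changed by {0} places; send alert e-mail here.", change);
}
```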

Comment by Hagrin

I have already built something just like this; however, I do not have it in production use, because wouldn't an application like this violate the Google ToS? Obviously, on a small, manually run scale this code wouldn't, but building an SEO tool on this kind of code would definitely be a clear violation of the automated query/scraping guideline.

Great code though. :)

Comment by Andreas Kraus

Hagrin: A page utilizing such tools doesn't really have to be strong in Google, so all they can do is ban the page, which shouldn't bother you.

Comment by MikeyG

This looks great. I'm going to try to implement it in a client's admin tool as another "bell" (or "whistle").

Thanks and keep it up, I really enjoy your blog.