
I was recently made aware of a peculiar bug in BlogEngine.NET that would delete all posts, comments and pages. Now, this specific issue is not new, but it was new to me so I thought I would share it with you. Maybe it’s new to you too.

The scenario is extremely rare, which is why I’ve never come across it before. Here are the steps to reproduce the issue:

  1. Sign in to your BlogEngine.NET installation using Internet Explorer.
  2. Open Microsoft Visio and use its reverse-engineering feature to generate a sitemap of the blog.
  3. All your posts, comments and pages are now deleted.

The reason you need to use Internet Explorer is that Visio and Internet Explorer share the same cookie container behind the scenes. The cookie you got when you signed in using Internet Explorer is still present when you open Visio and therefore you are still signed in when you use Visio.

Ok, so now you are signed in through Visio, and you start its crawling feature and point it at your blog address. All the delete-links under each post, comment and page get crawled, and thereby you delete them all.

The protection

It’s very easy to protect against this kind of bug. Just change the delete-links. This is an example of an unprotected link:

[code:html]

<a href="?delete=1234" onclick="return confirm('Are you sure?')">Delete</a>

[/code]

And this is the protected version:

[code:html]

<a href="#" onclick="if (confirm('Are you sure?')) location.href='?delete=1234'">Delete</a>

[/code]

The difference is that now you can only delete if the client supports JavaScript, which of course Visio doesn’t. Remember that this is only an issue when you are signed in, so it is not something just anybody can exploit, and that is why I’ve never come across it before. In other words, it is not a dangerous bug at all; by fixing the links you are really just protecting yourself from yourself.

The point is that if you expose delete-links on your pages, make sure they are protected from Visio and other applications that share a cookie container with Internet Explorer.
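In ASP.NET, the same protection comes almost for free if the delete link is a server control. Here is a minimal sketch (the class and the DeletePost method are my own invention, not BlogEngine.NET’s code): a LinkButton renders as a JavaScript postback, so a script-less client like Visio’s crawler can never trigger the click event.

[code:c#]

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class PostAdmin : Page
{
    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);

        // A LinkButton renders href="javascript:__doPostBack(...)",
        // so clients without JavaScript can never fire the Click event.
        LinkButton delete = new LinkButton();
        delete.Text = "Delete";
        delete.OnClientClick = "return confirm('Are you sure?');";
        delete.Click += new EventHandler(DeleteClicked);
        Form.Controls.Add(delete);
    }

    private void DeleteClicked(object sender, EventArgs e)
    {
        DeletePost(1234);
    }

    private void DeletePost(int id)
    {
        // The actual delete logic goes here.
    }
}

[/code]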

FYI, this has been corrected in the upcoming 1.2.5 release of BlogEngine.NET due in about a week.


Request validation is enabled by default in ASP.NET, and it basically stops people from submitting a form with HTML in any of the input fields. It’s a little more sophisticated than that, but in essence it looks for HTML tags, and if it finds any it throws an exception and the form is prevented from being posted.

However, you often want people to be able to write HTML tags in your forms. That’s why most people turn request validation off, either globally in web.config or on the individual pages hosting a form, and then just HTML encode the values. I’ve done it reluctantly myself many times, but there is a smarter way to allow HTML input without turning request validation off.

What if we could just HTML encode all input fields right before the form is submitted? That way we could keep request validation enabled and benefit from the security it offers out of the box. It also stops spambots from posting HTML links in your form.

The easiest way of doing this is to create a custom server control that inherits from System.Web.UI.WebControls.TextBox and add a little JavaScript magic. I’ve written a SafeTextBox class that HTML encodes its value client-side and then HTML decodes the value again server-side. That way it can be treated just like a normal TextBox.

[code:c#]

public class SafeTextBox : System.Web.UI.WebControls.TextBox
{
    protected override void OnLoad(System.EventArgs e)
    {
        base.OnLoad(e);

        // Register the client-side encoder function once per page.
        if (!Page.ClientScript.IsClientScriptBlockRegistered(Page.GetType(), "TextBoxEncode"))
        {
            System.Text.StringBuilder sb = new System.Text.StringBuilder();
            sb.Append("function TextBoxEncode(id)");
            sb.Append("{");
            sb.Append("var tb = document.getElementById(id);");
            sb.Append("tb.value = tb.value.replace(new RegExp('<', 'g'), '&lt;');");
            sb.Append("tb.value = tb.value.replace(new RegExp('>', 'g'), '&gt;');");
            sb.Append("}");
            Page.ClientScript.RegisterClientScriptBlock(Page.GetType(), "TextBoxEncode", sb.ToString(), true);
        }

        // Adds the function call after the form validation is called.
        // The attribute is stored in view state, so it only has to be added on the first request.
        if (!Page.IsPostBack)
            Page.Form.Attributes["onsubmit"] += "TextBoxEncode('" + ClientID + "');";
    }

    public override string Text
    {
        get { return base.Text; }
        set
        {
            // Decode the angle brackets again so server-side code sees the raw HTML.
            if (!string.IsNullOrEmpty(value))
                base.Text = value.Replace("&lt;", "<").Replace("&gt;", ">");
            else
                base.Text = value;
        }
    }
}

[/code]

The way the SafeTextBox HTML encodes/decodes is not very sophisticated but it works. You can add your own logic to the encoding/decoding if you feel the need.

To roll this out on your own website, just dump the SafeTextBox class in the App_Code folder and hook it up using tag mapping.
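Tag mapping lives in web.config and makes ASP.NET compile every TextBox in the site as a SafeTextBox, so no markup has to change. A minimal sketch, assuming the class sits in App_Code without a namespace:

[code:xml]

<system.web>
  <pages>
    <tagMapping>
      <!-- Every <asp:TextBox> is compiled as the SafeTextBox from App_Code. -->
      <add tagType="System.Web.UI.WebControls.TextBox"
           mappedTagType="SafeTextBox" />
    </tagMapping>
  </pages>
</system.web>

[/code]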


In the last couple of months I’ve been getting more and more attacks through URL parameters. What happens is that I get a lot of requests to the pages that take URL parameters, and then the hacker or robot tries SQL injection by adding code to the parameters.

This is one of the pages where this happens:

http://blog.madskristensen.dk/?year=2006&month=5

and this is the request that is made to that page by the robot:

http://blog.madskristensen.dk/?year=2006&month=5 and user>0

In my case nothing happens, since BlogEngine isn’t vulnerable to these kinds of attacks, but it is definitely a reminder to always make sure SQL injection cannot happen through URL parameters like this. It was only when I counted the number of these attacks made against this website that I realized just how many I get on a daily basis. Be careful.
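The defence is the same as for any other user input: parse and parameterize, never concatenate. Here is a minimal sketch of a safe year/month lookup; the table and column names are invented for illustration and are not BlogEngine’s schema.

[code:c#]

using System.Data.SqlClient;
using System.Web;

public static class PostRepository
{
    public static SqlDataReader GetPostsByMonth(HttpRequest request, SqlConnection conn)
    {
        int year, month;

        // "5 and user>0" fails the parse, so the attack dies right here.
        if (!int.TryParse(request.QueryString["year"], out year) ||
            !int.TryParse(request.QueryString["month"], out month))
        {
            return null;
        }

        // The values travel as typed parameters, never as part of the SQL text.
        SqlCommand cmd = new SqlCommand(
            "SELECT Title FROM Posts " +
            "WHERE YEAR(DateCreated) = @year AND MONTH(DateCreated) = @month", conn);
        cmd.Parameters.AddWithValue("@year", year);
        cmd.Parameters.AddWithValue("@month", month);

        return cmd.ExecuteReader();
    }
}

[/code]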


Today, for no particular reason, I got attacked by spammers on an unprecedented scale. It was not old-school e-mail spam; that has been quite stable for the last few years at about 100 spam mails per day. No, it was all other kinds of spam.

Comment spam

First of all, my server log has been busy logging four times as many comment spam attacks as usual. I normally have about 20 comment spam attacks per day, but today it was almost 80 (and today isn’t over yet). All of them failed to actually post a comment because they tried to make an invalid postback; that is the error message I get from comment spam attacks.

Trackback spam

Normally, I don’t get much trackback spam, maybe a few per day, tops. Today I received 22, which is a record for one day. None of them got through, because I check various conditions about each request and they all failed that check. Strangely enough, there have been no pingback attacks today.
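One cheap condition to check, sketched below in simplified form (this is the general idea, not the blog’s exact rules): fetch the page the trackback claims to come from and verify that it actually links to the post.

[code:c#]

using System;
using System.Net;

public static class TrackbackValidator
{
    // A trackback is only plausible if the source page really links back.
    public static bool IsValid(Uri sourceUrl, Uri postUrl)
    {
        try
        {
            using (WebClient client = new WebClient())
            {
                string html = client.DownloadString(sourceUrl);
                return html.IndexOf(postUrl.ToString(),
                    StringComparison.OrdinalIgnoreCase) >= 0;
            }
        }
        catch (WebException)
        {
            // The source page doesn't exist or can't be fetched: reject it.
            return false;
        }
    }
}

[/code]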

Referrer spam

This is a very annoying way of spamming. I keep a referrer log, and every time a visitor enters my website with a referrer in the request headers, that request gets logged so I can see where my visitors come from. Spam bots know this and send requests with fake referrer headers in the hope that I will follow one of those referrer URLs to check them out. Most of the time it’s pretty obvious that xxxgallery.com and cheap-mortage.tk are referrer spam, and I do have a mechanism to filter them out. It’s very effective, but it also produces some false negatives. I can live with that.
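The filter boils down to matching each referrer against a blacklist before it is logged. A minimal sketch, with a keyword list invented for illustration:

[code:c#]

using System;

public static class ReferrerFilter
{
    // Example keywords only; a real list would be much longer.
    private static readonly string[] SpamKeywords =
        { "casino", "mortgage", "gallery", "pills" };

    public static bool IsReferrerSpam(Uri referrer)
    {
        string url = referrer.ToString().ToLowerInvariant();

        foreach (string keyword in SpamKeywords)
        {
            // A keyword anywhere in the URL marks the request as spam.
            if (url.Contains(keyword))
                return true;
        }

        return false;
    }
}

[/code]

A list this crude is exactly what produces the occasional false negative mentioned above, since new spam domains simply avoid the known keywords.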

Conclusion

It seems the annual let’s-spam-blogs-day is today, and all spammers have joined forces to fire away. Spammers are people too, and it’s nice to see that they can work together and combine various spam techniques for optimum reach. I’m just glad that all their attempts to pollute my site were futile. Not a single spam attempt got through; they only made it into my logs. It’s actually a good thing, because for the first time I got a chance to test my spam defences properly. It would have been nice if they had invited a spammer who could do pingback spam as well, but I guess you can’t win them all. Maybe they will invite him for the next annual let’s-spam-blogs-day.

I wonder if this is related to International Talk Like A Pirate Day, which is today.


I’ve hooked up a health monitoring provider in my web.config to send me all unhandled exceptions by e-mail. See here for how to do that; you just have to put some lines in the web.config, sketched below. Well, I get all sorts of different exceptions, but there’s one I get more than 20 times a day. It’s actually rare that I get anything other than this one particular unhandled exception.
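For reference, a minimal sketch of those web.config lines using the built-in SimpleMailWebEventProvider; the addresses are placeholders, and a working setup also needs SMTP settings under <system.net>.

[code:xml]

<system.web>
  <healthMonitoring enabled="true">
    <providers>
      <!-- Mails each web event as it happens; from/to are placeholders. -->
      <add name="MailProvider"
           type="System.Web.Management.SimpleMailWebEventProvider"
           from="website@example.com"
           to="me@example.com"
           buffer="false" />
    </providers>
    <rules>
      <!-- "All Errors" is one of the event mappings ASP.NET defines by default. -->
      <add name="Mail All Errors"
           eventName="All Errors"
           provider="MailProvider"
           profile="Default"
           minInterval="00:01:00" />
    </rules>
  </healthMonitoring>
</system.web>

[/code]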

The exception looks like this:

Exception type: System.ArgumentException
Exception message: Invalid postback or callback argument. Event validation is enabled using <pages enableEventValidation="true"/> in configuration or <%@ Page EnableEventValidation="true" %> in a page.  For security purposes, this feature verifies that arguments to postback or callback events originate from the server control that originally rendered them.  If the data is valid and expected, use the ClientScriptManager.RegisterForEventValidation method in order to register the postback or callback data for validation.

Am I an idiot? 

Now you might think that I’m an idiot for not doing anything about it months ago, but hold on a minute. It says that a postback is invalid because event validation is turned on. It’s turned on by default in ASP.NET, so that’s no big surprise. No, my dear reader, this is not an error I would like to remove by disabling event validation, because this error is in fact caused by spam bots trying to spam my comments.

They all fail in doing so, because event validation is enabled and throws this exception every time they try. Did I mention that event validation is turned on by default and is a native feature of ASP.NET? That means that every ASP.NET application has natural spam bot protection built right into it by default. How cool is that?

Maybe this example will convince those of you who didn’t believe me in the last post I wrote about ASP.NET security and unnecessary CAPTCHAs.

Update 30 minutes later: I've just received 25 more mails in half an hour. Maybe the bots read my post and didn't believe me either.