I was recently made aware of a peculiar bug in BlogEngine.NET that would delete all posts, comments and pages. Now, this specific issue is not new, but it was new to me so I thought I would share it with you. Maybe it’s new to you too.

The scenario is extremely rare, which is why I've never come across it before. Here are the steps to reproduce the issue:

  1. Sign in to your BlogEngine.NET installation using Internet Explorer.
  2. Open Microsoft Visio and use its reverse-engineering feature to generate a sitemap of the blog.
  3. All your posts, comments and pages are now deleted.

The reason you need to use Internet Explorer is that Visio and Internet Explorer share the same cookie container behind the scenes. The authentication cookie you received when you signed in with Internet Explorer is still present when you open Visio, so you are effectively still signed in when you use Visio.

Ok, so now you are signed in through Visio, and you start Visio's crawling feature and point it at your blog address. Every delete link under each post, comment, and page gets crawled, and with each request another item is deleted.

The protection

It’s very easy to protect against this kind of bug. Just change the delete-links. This is an example of an unprotected link:


<a href="?delete=1234" onclick="return confirm('Are you sure?')">Delete</a>


And this is the protected version:


<a href="#" onclick="if (confirm('Are you sure?')) location.href='?delete=1234'">Delete</a>


The difference is that the delete now only works if the client supports JavaScript, which of course Visio doesn't. Remember that this is only an issue if you are signed in, so it isn't something just anybody can trigger, and that is why I've never come across it before. In other words, it is not a dangerous bug at all; by fixing the links you are simply protecting yourself from yourself.
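To see why the change helps, here is a small sketch. The `crawledUrls` helper is hypothetical: it models a crawler (like Visio's) that only follows `href` attributes and never executes JavaScript.

```javascript
// Hypothetical model of a non-JavaScript crawler: it collects every
// double-quoted href attribute and never runs onclick handlers.
function crawledUrls(html) {
  const urls = [];
  const re = /href="([^"]*)"/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    urls.push(match[1]);
  }
  return urls;
}

const unprotected =
  '<a href="?delete=1234" onclick="return confirm(\'Are you sure?\')">Delete</a>';
const protectedLink =
  '<a href="#" onclick="if (confirm(\'Are you sure?\')) location.href=\'?delete=1234\'">Delete</a>';

console.log(crawledUrls(unprotected));   // the crawler sees the delete URL
console.log(crawledUrls(protectedLink)); // only "#" — nothing destructive to follow
```

The crawler still visits the protected link, but the only URL it can extract is `#`; the destructive address now lives inside the `onclick` handler, out of its reach.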

The point is that if you expose delete links on your page, make sure they are protected from Visio and other applications that share a cookie container with Internet Explorer.

FYI, this has been corrected in the upcoming 1.2.5 release of BlogEngine.NET due in about a week.


Comment by Brendan Enrick

Very interesting. That is a simple yet dangerous little bug there. Pretty interesting solution as well; it doubles as a nice little bit of bot protection. That trick for the link could be used to stop any bot from crawling sections of a site. Might it be safer for the links to have been buttons instead? Then there is no need for customizations. Pretty much nothing that crawls pages uses forms, except spamming stuff.
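A button-based version of the same delete action might look something like this (just a sketch; the action URL is illustrative):

```html
<form method="post" action="?delete=1234"
      onsubmit="return confirm('Are you sure?')">
  <input type="submit" value="Delete" />
</form>
```

A crawler following links would never submit this form, and even a JavaScript-less browser still gets the POST semantics.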

Comment by Mike Thomas

For what it's worth, according to the HTTP spec a GET request (that is, data on the querystring as opposed to data POSTed to a url) should never be destructive. Makes implementing delete links a little harder, but is definitely safer.
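That rule can also be enforced server-side. A minimal sketch of the idea (the handler shape and `deletePost` callback are hypothetical, not BlogEngine.NET's actual API):

```javascript
// Hypothetical request handler: refuse to delete unless the request is a POST.
// Per the HTTP spec, GET must be "safe", i.e. free of destructive side effects.
function handleDelete(request, deletePost) {
  if (request.method !== 'POST') {
    return { status: 405, body: 'Method Not Allowed' };
  }
  deletePost(request.query.delete);
  return { status: 200, body: 'Deleted' };
}

// A crawler issuing GET requests never triggers the delete:
console.log(handleDelete({ method: 'GET', query: { delete: '1234' } }, () => {}));
```

With a check like this, even an unprotected `href="?delete=1234"` link is harmless, because following it only ever produces a GET.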


Comment by James Skemp

Odd. This is exactly what I was looking for, for a project I'm working on.

Glad to hear 1.2.5 is coming out soon; I don't know how others feel, but I like the pace of releases ...

(Maybe the 'Notify me when new comments are added' checkbox has been added into the tab order with this release? I've run into this a couple of times, but time has been too short to post as an issue.)

Comment by Nis Wilson Nissen


You could try something like this:

<a href="/?delete=1234" onclick="if (confirm('Are you sure?')) {
    var f = document.createElement('form');
    f.style.display = 'none';
    this.parentNode.appendChild(f);
    f.method = 'POST';
    f.action = this.href;
    f.submit();
  } return false;">Delete</a>

This is how Rails converts a link from GET to POST when you ask it nicely ;-)

And I agree with Mike and HAB, and like the health surgeon general says:

"Put All Destructive Actions Behind a POST Request"


Comment by Eric

I'm the one who got my blog emptied by the bug. I found a database backup that was about 1½ months old and restored it, but it kind of sucked. I would've used a server control (like LinkButton) myself, and made a proper postback when the link was clicked.