Omar AL Zabir blog on ASP.NET Ajax and .NET 3.5

Working hard to enrich millions of peoples' lives


Prevent Denial of Service (DOS) attacks in your web application

Web services are an attractive target for hackers because even a novice can bring down a server by repeatedly calling a web service that does expensive work. Ajax start pages like Pageflakes are prime targets for such DOS attacks because if you just visit the homepage repeatedly without preserving cookies, every hit produces a brand new user, a new page setup, new widgets, and what not. The first-visit experience is the most expensive one. Nonetheless, it's the easiest one to exploit to bring down the site. You can try this yourself. Just write simple code like this:

    for( int i = 0; i < 100000; i ++ )
    {
        WebClient client = new WebClient();
        client.DownloadString("http://www.pageflakes.com/default.aspx");
    }

To your great surprise, you will notice that after a couple of calls you don't get a valid response. It's not that you have succeeded in bringing down the server. It's that your requests are being rejected. You are happy that you no longer get any service, thus achieving Denial of Service (for yourself). I am happy to Deny You of Service (DYOS).

The trick I have up my sleeve is an inexpensive way to remember how many requests are coming from a particular IP. When the number of requests exceeds the threshold, deny further requests for some duration. The idea is to remember the caller's IP in the ASP.NET Cache and maintain a count of requests per IP. When the count exceeds a predefined limit, reject further requests for a specific duration, like 10 minutes. After 10 minutes, allow requests from that IP again.
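The fixed-window counting idea above can also be sketched in a self-contained form. This is a hypothetical illustration, not the class from this post: a `ConcurrentDictionary` stands in for the ASP.NET Cache, and the window reset is done by checking a stored expiry time instead of relying on cache-item expiration. All names here (`RateLimiter`, `Window`, `callerKey`) are illustrative.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Sketch of fixed-window request counting per caller (e.g. per IP).
public class RateLimiter
{
    private class Window
    {
        public int Hits;
        public DateTime ExpiresAt;
    }

    private readonly ConcurrentDictionary<string, Window> _windows =
        new ConcurrentDictionary<string, Window>();
    private readonly int _limit;
    private readonly TimeSpan _duration;

    public RateLimiter(int limit, TimeSpan duration)
    {
        _limit = limit;
        _duration = duration;
    }

    // Returns true while the caller is still under the limit for the
    // current window; false once the limit is exceeded.
    public bool IsValid(string callerKey)
    {
        DateTime now = DateTime.UtcNow;
        Window w = _windows.GetOrAdd(callerKey,
            _ => new Window { ExpiresAt = now + _duration });

        if (now >= w.ExpiresAt)
        {
            // Window elapsed: start a fresh count. A benign race here may
            // let a few extra hits through, which is acceptable for this use.
            w = new Window { ExpiresAt = now + _duration };
            _windows[callerKey] = w;
        }

        return Interlocked.Increment(ref w.Hits) <= _limit;
    }
}
```

For example, with `new RateLimiter(3, TimeSpan.FromMinutes(10))`, the first three calls to `IsValid("1.2.3.4")` return true and the fourth returns false, while a different key still gets its own fresh count.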

I have a class named ActionValidator which maintains a count of specific actions like First Visit, Revisit, Asynchronous Postback, Add New Widget, Add New Page, etc. It checks whether the count for a specific action from a specific IP exceeds the threshold value or not.

    public static class ActionValidator
    {
        private const int DURATION = 10; // 10 min period

        public enum ActionTypeEnum
        {
            FirstVisit = 100,  // The most expensive one, choose the value wisely
            ReVisit = 1000,    // Welcome to revisit as many times as the user likes
            Postback = 5000,   // Not much of a problem for us
            AddNewWidget = 100,
            AddNewPage = 100,
        }

The enumeration contains the types of actions to check for and their threshold values for a specific duration – 10 minutes.

A static method named IsValid does the check. It returns true if the request limit has not been passed, false if the request needs to be denied. Once you get false, you can call Response.End() and prevent ASP.NET from proceeding further. You can also switch to a page which shows "Congratulations! You have succeeded in a Denial of Service attack."

 

    public static bool IsValid( ActionTypeEnum actionType )
    {
        HttpContext context = HttpContext.Current;
        if( context.Request.Browser.Crawler ) return false;

        string key = actionType.ToString() + context.Request.UserHostAddress;

        HitInfo hit = (HitInfo)(context.Cache[key] ?? new HitInfo());

        if( hit.Hits > (int)actionType ) return false;
        else hit.Hits ++;

        if( hit.Hits == 1 )
            context.Cache.Add(key, hit, null, DateTime.Now.AddMinutes(DURATION),
                System.Web.Caching.Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.Normal, null);

        return true;
    }

The cache key is built from a combination of the action type and the client IP address. First it checks whether there is already an entry for that action and client IP in the Cache. If not, start the count and remember it for that IP in the cache for the specified duration. The absolute expiration on the cache item ensures that after the duration the item is cleared and the count restarts. When there is already an entry in the cache, get the last hit count and check whether the limit is exceeded. If not, increase the counter. There is no need to store the updated value in the cache again with Cache[key] = hit; because the hit object is stored by reference, changing it changes the cached copy as well. In fact, if you did put it in the cache again, the expiration timer would restart and defeat the logic of restarting the count after the specified duration.
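The by-reference point can be demonstrated in isolation. In this sketch a plain Hashtable stands in for the ASP.NET Cache (both store object references, not copies); the HitInfo class and key string are simplified for illustration.

```csharp
using System;
using System.Collections;

// Mutating a cached reference type updates the cached entry
// without re-inserting it.
class HitInfo { public int Hits; }

class Program
{
    static void Main()
    {
        Hashtable cache = new Hashtable();
        HitInfo hit = new HitInfo { Hits = 1 };
        cache["FirstVisit127.0.0.1"] = hit;

        // Mutate the object without putting it back into the table...
        hit.Hits++;

        // ...and the stored entry reflects the change, because the table
        // holds a reference to the same object.
        Console.WriteLine(((HitInfo)cache["FirstVisit127.0.0.1"]).Hits); // prints 2
    }
}
```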

The usage is very simple:

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);

        // Check if revisit is valid or not
        if( !base.IsPostBack )
        {
            // Block cookie-less visit attempts
            if( Profile.IsFirstVisit )
            {
                if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.FirstVisit) ) Response.End();
            }
            else
            {
                if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.ReVisit) ) Response.End();
            }
        }
        else
        {
            // Limit number of postbacks
            if( !ActionValidator.IsValid(ActionValidator.ActionTypeEnum.Postback) ) Response.End();
        }
    }

Here I am checking specific scenarios like first visit, revisit, postbacks, etc.

Of course you can put in a Cisco firewall and prevent network-level DOS attacks. You will get a guarantee from your hosting provider that their entire network is immune to DOS and DDOS (Distributed DOS) attacks. What they guarantee against are network-level attacks like TCP SYN floods or malformed packet floods. There is no way they can analyze packets and find out that a particular IP is trying to load the site too many times without supporting cookies, or trying to add too many widgets. These are application-level DOS attacks, which hardware cannot prevent; protection must be implemented in your own code.

There are very few websites out there which take such precautions against application-level DOS attacks. Thus it's quite easy to make servers go mad by writing a simple loop and hitting expensive pages or web services continuously from your home broadband connection. I hope this small but effective class will help you implement DOS attack prevention in your own web applications.

 

Update

Here's the code of the full ActionValidator class:

// Copyright (c) Omar AL Zabir. All rights reserved.
// For continued development and updates, visit http://msmvps.com/omar

using System;
using System.Web;

namespace Dropthings.Web.Util
{
    /// <summary>
    /// Prevents application-level DOS attacks by counting hits
    /// per action type per client IP within a fixed period.
    /// </summary>
    public static class ActionValidator
    {
        private const int DURATION = 10; // 10 min period

        /*
         * Types of actions and their maximum count per period
         */
        public enum ActionTypeEnum
        {
            None = 0,
            FirstVisit = 100,  // The most expensive one, choose the value wisely
            Revisit = 1000,    // Welcome to revisit as many times as the user likes
            Postback = 5000,   // Not much of a problem for us
            AddNewWidget = 100,
            AddNewPage = 100,
        }

        private class HitInfo
        {
            public int Hits;
            private DateTime _ExpiresAt = DateTime.Now.AddMinutes(DURATION);
            public DateTime ExpiresAt { get { return _ExpiresAt; } set { _ExpiresAt = value; } }
        }

        public static bool IsValid( ActionTypeEnum actionType )
        {
            HttpContext context = HttpContext.Current;
            if( context.Request.Browser.Crawler ) return false;

            string key = actionType.ToString() + context.Request.UserHostAddress;

            HitInfo hit = (HitInfo)(context.Cache[key] ?? new HitInfo());

            if( hit.Hits > (int)actionType ) return false;
            else hit.Hits ++;

            if( hit.Hits == 1 )
                context.Cache.Add(key, hit, null, DateTime.Now.AddMinutes(DURATION),
                    System.Web.Caching.Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.Normal, null);

            return true;
        }
    }
}

 

Posted: Oct 16 2007, 09:40 PM by oazabir | with 20 comment(s) |
Comments

Kenneth said:

Thanks for this article. I like the simple, yet effective implementation of application DOS attack prevention.

Nish: I don't think it's a bad idea at all. You just have to define your threshold values wisely. We want to stop those hitting your webserver hundreds of times a minute.

# October 16, 2007 11:23 AM

rajbk said:

>The idea is to remember caller’s IP in Asp.net Cache and maintain a count of request per IP

This is only effective to a certain extent. DOS is typically done using spoofed IP addresses which will easily defeat this scheme.

# October 16, 2007 11:50 AM

Dave said:

Seriously - DOS attacks should be handled at your firewall, not by some hack in your application. Talk to your network guy.

# October 16, 2007 8:41 PM

oazabir said:

Hi Guys,

Looks like you are missing the "application-level" DOS attack point. Hardware can prevent network-level DOS attacks. In order to prevent application-level DOS attacks, you have to make your application itself secure enough. One common example is preventing browser F5 refreshes from repeatedly posting back the same data. Someone can easily flood you by producing repeated postbacks. No firewall can prevent this.

# October 17, 2007 4:14 AM

jurex said:

Use Captcha forget DOS

# April 15, 2008 2:52 AM

urockblue said:

No matter what other folks have said, I think this is a very slick piece of code.

Great Job Omar

# April 27, 2008 10:46 PM

docluv said:

Good article, but I do think it would cause usability issues for a corporation behind a proxy. But I think that could be addressed by examining the headers a little better. I hate CAPTCHA and it is not going to stop many issues this addresses. I think the real application would be to stop known sources of injection attacks from having access to your site as well as others in your enterprise/company.

# May 5, 2008 9:55 AM

Robert said:

Hi. I really like the approach, but I have a doubt about cache locking.

You aren't using any locking on the Cache, is it correct? If many requests come at the same time, many threads execute context.Cache[key] that returns null for the same IP.

Can you clarify?

Thanks

# June 8, 2008 1:00 PM

Mojtaabaa said:

thanks for the article

# August 26, 2008 12:42 AM

heime said:

Mr. T- I agree.

This article is not useful to us as it is.

Does anyone here know what a 'HitInfo' object or a 'Profile' object is? Throw me a bone.

While this article is useful for intellectual banter, can a single one of you say that you've been able to run this code?

# September 4, 2008 10:16 AM

Atif Hussain said:

Has anyone run this code or made an example of it? If yes, then please upload it and provide the URL here, or email me at matifhussain@yahoo.com

# September 11, 2008 10:57 PM

cmc said:

Two questions:

1) Why do you care whether it is a firstVisit or not?

2) You indicated that Profile.IsFirstVisit is a "custom" property. What is the logic that you are checking to determine this? Why is that important?

# November 21, 2008 2:13 PM

franklin said:

This solution doesn't really work in a web farm scenario.

It relies on the ASP.NET Cache object, which is stored in each web server's local memory.

Each web server in the farm has its own instance of the Cache object. So, webserver2 doesn't know how many times a particular IP address has hit webserver1.

# April 20, 2010 2:57 PM

oazabir said:

No need to. You set the threshold values based on the capacity of each server, not on the capacity of your entire web farm, which is not practical to calculate since the capacity of a web farm is dynamic.

# April 20, 2010 3:51 PM