Omar AL Zabir blog on ASP.NET Ajax and .NET 3.5

Working hard to enrich millions of peoples' lives


News

I was
Co-Founder and CTO of Pageflakes, acquired by LiveUniverse, which was founded by the founder of MySpace.

I am
Chief Architect, SaaS Platform, British Telecom

I will be
Chief Architect, Mi...


My Public Page
www.pageflakes.com/omar


Read my blog on: Omar AL Zabir (www.oazabir.com)




Open source projects

Droptiles: Metro style Live Tiles enabled Web 2.0 Dashboard

Droptiles is an open source, Windows 8 Start screen-like, Metro-style Web 2.0 dashboard. It builds the experience using Tiles. Tiles are mini apps that can fetch data from external sources, and clicking on a tile launches the full app. Apps can range from existing websites to customized sites built specifically to fit the dashboard experience. Droptiles is built almost entirely of HTML, JavaScript and CSS and is thus highly portable to any platform. The sample project is built using ASP.NET to show some server-side integration, like signup, login and getting dynamic data from the server, but with very little change you can port it to PHP, Ruby, JSP or any other platform. Droptiles is the sequel to my Dropthings, which was the first open source Web 2.0 dashboard.

See it live at Droptiles.com.

Features

  • Metro-style user interface, with a CSS framework for building Metro-style websites, inspired by metroui.org.ua.
  • Drag & drop tiles to personalize the experience.
  • Client-side object model and data binding for easy MVVM implementation.
  • Server-side, platform-neutral implementation that can be ported to PHP or JSP easily.
  • Live tiles: tiles are mini-apps that load data from a variety of sources.

How you can use Droptiles

  • Enterprise dashboard aggregating data from various systems and offering a launch pad for intranet/internet applications.
  • Web 2.0 portal offering portlets in the form of tiles, aggregating data from various services and acting as a launch pad for them.
  • Touch-enabled kiosk front-end.
  • Content aggregator for news and research purposes.

License

Droptiles is open source. It is free for personal use as long as you keep the copyright notices intact. To buy a license, go to the Droptiles production site; there's a tile on the right side to place the order through PayPal.


Screenshots

App Store

The App Store offers new applications.

Tiles

Tiles are mini apps, built using JavaScript, that can have their own HTML, JavaScript and CSS. They are loaded and executed by the Dashboard dynamically. Each tile runs on its own, delivering the in-tile experience. There's no special widget framework to follow, just plain, simple JavaScript.

Dynamic Tile

Tiles can also be dynamic pages delivered from the server; here is a tile that captures the HTML output of an ASPX page. Tiles can be interactive as well: you can embed a form inside a tile.

Apps

External applications can run inside Droptiles, offering a seamless integration experience.

You can get the code from the GitHub site: http://oazabir.github.com/Droptiles/

Caching WCF javascript proxy on browser

When you use WCF services from JavaScript, you have to generate the JavaScript proxies by hitting Service.svc/js. If you have five WCF services, that means five JavaScript files to download. As browsers download scripts synchronously, one after another, this adds latency to page load and slows down rendering. Moreover, the same WCF service proxy is downloaded on every page, because the generated JavaScript file is not cached by the browser. Here is a solution that ensures the generated JavaScript proxies are cached on the browser and that, when there is a hit on the service, it responds with HTTP 304 if the Service.svc file has not changed.

Here’s a Fiddler trace of a page that uses two WCF services.

[Fiddler trace: first visit, two sequential /js hits]

You can see there are two /js hits and they are sequential. Every visit to the same page, even within the same browser session, results in those two hits to /js. Here is the second time the same page is browsed:

[Fiddler trace: repeat visit, the /js hits are still downloaded]

You can see everything else is cached except the WCF JavaScript proxies. They are never cached because the WCF proxy generator does not emit the caching headers necessary for the browser to cache the files.

Here's an HttpModule for IIS and IIS Express that intercepts calls to the WCF service proxy. It first checks whether the service has changed since the version cached on the browser. If it has not changed, it returns HTTP 304 without going through the proxy generation process, saving some CPU on the server. If the request comes in for the first time and there's no cached copy on the browser, it delivers the proxy and also emits the proper cache headers so the response gets cached on the browser.
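The module below is not the code from the article, just a minimal sketch of the approach, assuming the proxy is requested at Service.svc/js or /jsdebug: compare the .svc file's last write time against the If-Modified-Since header, answer with 304 when nothing has changed, and otherwise emit cache headers along with the generated proxy. You would register it in web.config under <system.webServer><modules> (or <httpModules> for the classic pipeline).

using System;
using System.IO;
using System.Web;

public class WcfProxyCacheModule : IHttpModule
{
  public void Init(HttpApplication app)
  {
    app.BeginRequest += (sender, e) =>
    {
      HttpContext context = ((HttpApplication)sender).Context;

      // Only handle the generated proxy requests, e.g. /MyService.svc/js or /jsdebug
      string filePath = context.Request.FilePath;
      string pathInfo = context.Request.PathInfo;
      if (!filePath.EndsWith(".svc", StringComparison.OrdinalIgnoreCase)
        || !(pathInfo.Equals("/js", StringComparison.OrdinalIgnoreCase)
             || pathInfo.Equals("/jsdebug", StringComparison.OrdinalIgnoreCase)))
        return;

      // Last write time of the .svc file, truncated to whole seconds (HTTP date resolution)
      DateTime lastModified = File.GetLastWriteTime(context.Server.MapPath(filePath));
      lastModified = lastModified.AddTicks(-(lastModified.Ticks % TimeSpan.TicksPerSecond));

      string ifModifiedSince = context.Request.Headers["If-Modified-Since"];
      DateTime cachedSince;
      if (ifModifiedSince != null
        && DateTime.TryParse(ifModifiedSince, out cachedSince)
        && cachedSince >= lastModified)
      {
        // The browser's cached proxy is still current: short-circuit with a 304
        context.Response.StatusCode = 304;
        context.Response.SuppressContent = true;
        context.ApplicationInstance.CompleteRequest();
      }
      else
      {
        // First hit or stale copy: let WCF generate the proxy, but stamp it with
        // cache headers so the next visit can be answered with a 304
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetLastModified(lastModified);
      }
    };
  }

  public void Dispose() { }
}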

http://www.codeproject.com/Articles/360437/Caching-WCF-javascript-proxy-on-browser

Don’t forget to vote.

Memory Stream Multiplexer–write and read from many threads simultaneously

Here's an implementation of a MemoryStream-like buffer manager where one thread can write and many threads can read simultaneously. Each reading thread gets its own reader and reads from the shared stream on its own, without blocking the write operation or other parallel read operations. It supports blocking Read calls, so reader threads can call Read(…) and wait until some data is available, exactly the way you would expect a Stream to behave.

You can use this to download content from the network or read a file in one thread and have it read by one or more other threads at the same time. Readers do not block writing, so reads and writes happen concurrently. This is handy for building an HTTP proxy where you are downloading a file and multiple clients are asking for the same file at the same time: download it in one thread and let one or more client threads read from the same buffer as it fills. The same approach works for serving one file on disk to multiple clients simultaneously, or for a server-side cache where the same buffer is read by multiple clients at once.
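To illustrate the intended usage pattern, here is a minimal sketch. The type and member names (MemoryStreamMultiplexer, Write, Finish, GetReader) are assumptions made for illustration; check the source linked below for the real API.

using System;
using System.IO;
using System.Threading.Tasks;

class MultiplexerUsageSketch
{
  static void Main()
  {
    // Hypothetical multiplexer: one writer thread, many independent readers.
    var mux = new MemoryStreamMultiplexer();

    // Writer: appends data to the shared buffer as it arrives (from network, disk, etc.).
    var writer = Task.Run(() =>
    {
      byte[] chunk = new byte[8192];
      for (int i = 0; i < 100; i++)
      {
        // ... fill 'chunk' from the download in progress ...
        mux.Write(chunk, 0, chunk.Length);
      }
      mux.Finish();   // tell readers no more data is coming
    });

    // Readers: each gets its own Stream with its own position over the same buffer.
    var readers = new Task[3];
    for (int r = 0; r < readers.Length; r++)
    {
      readers[r] = Task.Run(() =>
      {
        using (Stream reader = mux.GetReader())
        {
          var buffer = new byte[4096];
          int read;
          // Read blocks until data is available, like any other Stream.
          while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
          {
            // ... forward 'read' bytes to a client ...
          }
        }
      });
    }

    Task.WaitAll(readers);
    writer.Wait();
  }
}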


See the detail implementation here:

Memory Stream Multiplexer–write and read from many threads simultaneously

Don’t forget to vote.

ReadLine on Binary Stream

When you are reading data from a binary stream, like a NetworkStream or FileStream, and you need to read both binary chunks and one text line at a time, you are on your own: neither BinaryReader nor Stream supports ReadLine. You can use StreamReader to do ReadLine, but it does not let you read chunks of bytes; there is no Read(byte[], int, int) on StreamReader.

Here's an extension of BinaryReader that adds ReadLine over a binary stream. You can read byte chunks as well as text lines from the same reader.

using System;
using System.Diagnostics;
using System.IO;
using System.Text;

public class LineReader : BinaryReader
{
  private Encoding _encoding;
  private Decoder _decoder;

  private readonly int _bufferSize;
  private readonly char[] _LineBuffer;

  public LineReader(Stream stream, int bufferSize, Encoding encoding)
    : base(stream, encoding)
  {
    this._encoding = encoding;
    this._decoder = encoding.GetDecoder();
    this._bufferSize = bufferSize > 0 ? bufferSize : 1024;
    this._LineBuffer = new char[this._bufferSize];
  }

  // Reads characters up to the next CRLF and returns them without the line
  // terminator. Returns string.Empty for a blank line and null when the end
  // of the stream is reached before any character is read.
  public string ReadLine()
  {
    int pos = 0;

    char[] buf = new char[2];

    StringBuilder stringBuffer = null;
    bool lineEndFound = false;
    int charsRead;

    // Read two characters at a time; charsRead can be 1 at the end of the stream.
    while ((charsRead = base.Read(buf, 0, 2)) > 0)
    {
      if (charsRead == 2 && buf[1] == '\r')
      {
        // grab buf[0]
        this._LineBuffer[pos++] = buf[0];
        // consume the '\n' that follows
        char ch = base.ReadChar();
        Debug.Assert(ch == '\n');

        lineEndFound = true;
      }
      else if (buf[0] == '\r')
      {
        // buf[1], if it was read, is the matching '\n' and is discarded
        Debug.Assert(charsRead == 1 || buf[1] == '\n');
        lineEndFound = true;
      }
      else
      {
        this._LineBuffer[pos++] = buf[0];
        if (pos >= this._bufferSize)
          Spill(ref stringBuffer, ref pos);

        if (charsRead == 2)
        {
          this._LineBuffer[pos++] = buf[1];
          if (pos >= this._bufferSize)
            Spill(ref stringBuffer, ref pos);
        }
      }

      if (lineEndFound)
      {
        if (stringBuffer == null)
        {
          if (pos > 0)
            return new string(this._LineBuffer, 0, pos);
          else
            return string.Empty;
        }
        else
        {
          if (pos > 0)
            stringBuffer.Append(this._LineBuffer, 0, pos);
          return stringBuffer.ToString();
        }
      }
    }

    // End of stream: return whatever was accumulated, or null if nothing was read.
    if (stringBuffer != null)
    {
      if (pos > 0)
        stringBuffer.Append(this._LineBuffer, 0, pos);
      return stringBuffer.ToString();
    }
    else
    {
      if (pos > 0)
        return new string(this._LineBuffer, 0, pos);
      else
        return null;
    }
  }

  // Flushes the full line buffer into the StringBuilder so that lines longer
  // than the buffer accumulate correctly instead of overwriting earlier chunks.
  private void Spill(ref StringBuilder stringBuffer, ref int pos)
  {
    if (stringBuffer == null)
      stringBuffer = new StringBuilder(this._bufferSize + 80);
    stringBuffer.Append(this._LineBuffer, 0, pos);
    pos = 0;
  }
}
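To show how line reads and binary reads mix on the same stream, here's a small usage sketch; the HTTP-like payload is made up purely for illustration:

using System;
using System.IO;
using System.Text;

class LineReaderDemo
{
  static void Main()
  {
    // Build a fake mixed payload: a text header, a blank line, then raw bytes.
    byte[] body = { 0x01, 0x02, 0x03, 0x04, 0x05 };
    byte[] header = Encoding.ASCII.GetBytes("Content-Length: 5\r\n\r\n");

    var ms = new MemoryStream();
    ms.Write(header, 0, header.Length);
    ms.Write(body, 0, body.Length);
    ms.Position = 0;

    using (var reader = new LineReader(ms, 1024, Encoding.ASCII))
    {
      int contentLength = 0;
      string line;

      // Text mode: read header lines until the blank line.
      while (!string.IsNullOrEmpty(line = reader.ReadLine()))
      {
        if (line.StartsWith("Content-Length:"))
          contentLength = int.Parse(line.Substring("Content-Length:".Length).Trim());
      }

      // Binary mode: read the body as raw bytes from the same reader.
      byte[] data = reader.ReadBytes(contentLength);
      Console.WriteLine("Content-Length: {0}, body bytes read: {1}", contentLength, data.Length);
    }
  }
}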

Enjoy.

Scaling ASP.NET websites from thousands to millions–LIDNUG

Here's a recent presentation I gave at LIDNUG on scaling ASP.NET websites from thousands to millions of users.

Scaling ASP.NET websites from thousands to millions of users by Omar AL Zabir

Here are the slides.

Browse internet faster and save power using a smart HOSTS file

The internet is full of Flash ads these days, which make pages load slower, render slower, and consume more CPU, and therefore more power. If you can browse without any Flash ads, or in fact any ads at all, and without any tracking scripts, you can browse much faster, scroll through pages much more smoothly, and get more hours out of your battery. Most websites now use scripts from various analytics services that track your browsing habits and use IFRAMEs to load tracking and social networking widgets. All of these add considerable delay to page loading and make the browser consume more CPU and bandwidth. Turn them all off and browsing the internet feels a lot smoother and faster, and you get more work hours while running on battery.

Moreover, you don't get distracted by the flashy ads, and you save your children and young family members from seeing foul things.

If we could get 10% of the total internet users (2bn as of Jan 2011) to save 10% CPU, power and bandwidth while browsing every day, we could save megawatts of power every day throughout the world!

Using this solution, you can block ads and tracking scripts, as well as malicious and porn websites.

How bad is it?

Let's take an example from a popular website. The red boxes are Flash ads (read: power suckers).

[Screenshot: page with Flash ads highlighted]

Once we disable all ads and tracking scripts, here’s how it looks:

[Screenshot: same page with ads and tracking scripts disabled]

Statistics:

                        Before    After
  Total Requests        111       100
  Total Download Size   1.2 MB    0.98 MB
  Page load time        4.34 s    3.64 s

Not just during page loading: while you are on the page doing nothing, just reading, the browser continuously consumes CPU.

Before:

[CPU usage: before]

After:

[CPU usage: after]

Before disabling the ads and tracking scripts, CPU usage is always around 20-25%. After disabling them it is around 8-10%. The more the CPU works, the more power it consumes. If you are running on battery, you can get at least 20% more time out of it. If you keep many tabs open all the time, you can save even more.

Here’s how to save CPU, bandwidth and power

Go to this website and download the HOSTS file:

http://winhelp2002.mvps.org/hosts.htm

Follow the instructions to put the HOSTS file in your C:\Windows\System32\Drivers\etc folder.

Now go to the Start Menu and type Notepad, but do not hit Enter. Right-click on Notepad and select Run as Administrator.

[Screenshot: running Notepad as Administrator]

Go to the File menu and click Open.

Copy and paste this into the File Name box and click Open:

c:\windows\system32\drivers\etc\HOSTS

[Screenshot: Notepad Open dialog with the HOSTS path]

Now go to the Edit menu and select Replace. Put 127.0.0.1 in the Find box and 255.255.255.0 in the Replace box. Click Replace All.

[Screenshot: Notepad Replace dialog]

Once done, you need to change the first entry, localhost, back to 127.0.0.1.

[Screenshot: restoring 127.0.0.1 for the localhost entry]

Remember, localhost cannot be 255.255.255.0.

When you have done this correctly, it will look like this.

[Screenshot: the edited HOSTS file]
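For illustration, a correctly edited file will look roughly like this; the blocked hostnames below are examples, not actual entries from the MVPS list:

# the first entry must keep the real loopback address
127.0.0.1       localhost

# blocked entries point to an unreachable address
255.255.255.0   ads.example-adnetwork.com
255.255.255.0   tracker.example-analytics.com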

Save the file and exit Notepad.

Then go to Start menu and type: services.msc

From the service list, double click on “DNS Client”.

[Screenshot: DNS Client service in services.msc]

First click “Stop” to stop the service.

Then from the Startup Type dropdown, select Disabled.

Click OK.

[Screenshot: DNS Client service properties with Startup type set to Disabled]
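If you prefer the command line, running these two commands from an elevated Command Prompt should have the same effect (Dnscache is the internal name of the DNS Client service):

sc stop Dnscache
sc config Dnscache start= disabled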

Close all your browsers and reopen them. I highly recommend restarting your PC.

You are ready to browse faster, smarter, cheaper! 

I also highly recommend that everyone use OpenDNS. You can save yourself from landing on malicious sites and being ripped off of your bank balance, property, spouse and children. Just go to www.opendns.com and follow the instructions. It is the best thing that has happened to the internet since Wikipedia!

How does the HOSTS file trick work?

Here's how it works. When you type www.something.com, Windows needs to find the IP address for that domain. First it checks a file called HOSTS. If the domain is not defined there, it asks the DNS server configured for your network for the IP so the browser can connect to the webserver. If you put a fake IP in the HOSTS file, Windows hands that fake IP to the browser, and the browser tries to connect to it. So by mapping those domains to an invalid IP, we prevent the browser, or any application running on your PC, from reaching the ad, tracker, malicious and porn websites.
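To verify that Windows is picking up an entry from the HOSTS file, a tiny check like this works; the hostname below is just an example of a blocked entry, not necessarily one in your file:

using System;
using System.Net;

class HostsCheck
{
  static void Main()
  {
    // The hostname is illustrative; use any domain you blocked in your HOSTS file.
    foreach (IPAddress ip in Dns.GetHostAddresses("ads.example-adnetwork.com"))
      Console.WriteLine(ip);   // prints 255.255.255.0 if the HOSTS override is in effect
  }
}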

Don’t forget to share this with your friends and families!

Get Dropthings license by donating to charity

Now you no longer pay me for a Dropthings license; instead you donate the money to a charity and I give you the license. In case you don't know what Dropthings is, it is a Web 2.0 personalizable dashboard framework that you can use to build Web 2.0 personalizable websites and enterprise dashboards. It is built using ASP.NET AJAX, jQuery, Silverlight, .NET 3.5, Entity Framework and SQL Server. It is in use at big companies like BT, Intel, Microsoft and Thomson Reuters, and at government organizations like state police and Canada border protection agencies.

Since it is a state-of-the-art .NET 3.5 codebase, it is sometimes used as a starting point for an application, with all the best practices already in place for building an N-tier web app using popular technologies, design patterns and testing methods. Dropthings helps you build a web app that draws on the extensive performance and scalability research I have done to scale websites to millions of users. It also helps you build a codebase that is highly testable, and it shows you how to test AJAX applications using automated test tools like WatiN. It has a business layer and a data access layer that are fully unit testable, with nearly 100% test coverage, and it uses the Inversion of Control pattern to the fullest.

You can find details about the Project here: http://code.google.com/p/dropthings/

There are two CodeProject articles that show how it was built, tested and deployed, and the production challenges I had to overcome to scale it to millions of requests per day:

Build Google IG like Portal in 7 days

Web 2.0 AJAX Portal using jQuery, ASP.NET 3.5, Silverlight, Linq to SQL, WF and Unity

Finally, there's a book on it that takes you from the initial idea to design, coding and testing, all the way to purchasing the right production hardware, deployment and production troubleshooting. It is a complete end-to-end guide for a developer or startup CTO to take an idea from design to a VC-funded, successful startup used by millions. I have captured many lessons learnt from my startup years at Pageflakes, which I co-founded and where I was the founding CTO.

Building a Web 2.0 Portal with ASP.NET 3.5, from O'Reilly.

Let’s build great web apps and save the world at the same time!

Posted: Dec 22 2011, 10:42 PM by oazabir | with no comments
MVP Open Day 2011 at Cambridge

Microsoft Research arranged MVP Open Day 2011 at Cambridge on Oct 24, 2011. It is a beautiful university; it made me feel like giving up my job and going back to study. Amazing research work is going on there, very thought provoking. The session on DNA programming was out of this world. The most surprising thing I learnt was that a 10cm-long DNA strand can hold 10TB of digitally encoded data, and that cells are a thousand times more robust as computing systems than silicon-based chips. Moreover, cells are self-powered, super energy-efficient microprocessors, a hundred years ahead of Intel processors.

Here’s my presentation slide. Nothing NDA in this, feel free to distribute.

Posted: Oct 15 2011, 09:18 PM by oazabir | with no comments
Prevent ASP.NET cookies from being sent on every css, js, image request

ASP.NET generates some large cookies if you are using the ASP.NET membership provider. Especially if you are using the anonymous provider, a typical site will send the following cookies with every request when a user is logged in, whether the request is for a dynamic page or for a static resource:

.DBANON=w3kYczsH8Wvzs6MgryS4JYEF0N-8ZR6aLRSTU9KwVaGaydD6WwUHD7X9tN8vBgjgzKf3r3SJHusTYFjU85y
YfnunyCeuExcZs895JK9Fk1HS68ksGwm3QpxnRZvpDBAfJKEUKee2OTlND0gi43qwwtIPLeY1;
ASP.NET_SessionId=bmnbp155wilotk45gjhitoqg; DBAUTH12=2A848A8C200CB0E8E05C6EBA8059A0DBA228FC5F6EDD29401C249D2
37812344C15B3C5C57D6B776037FAA8F14017880E57BDC14A7963C58B0A0B30229
AF0123A6DF56601D814E75525E7DCA9AD4A0EF200832B39A1F35A5111092F0805B
0A8CD3D2FD5E3AB6176893D86AFBEB68F7EA42BE61E89537DEAA3279F3B576D0C
44BA00B9FA1D9DD3EE985F37B0A5A134ADC0EA9C548D

That's 517 bytes of worthless data sent with every css, js and image request from the browser to your webserver!

You might think 517 bytes is peanuts. Do the math:

  • An average page makes 40 requests to the server. 40 x 517 bytes = about 20 KB per page view.
  • 1M page views = 20 GB of cookie data uploaded to your server.
  • It does not take millions of users to produce 1M page views; around 100k+ users visiting your site every day can easily produce 1M page views a day.

Here’s how to prevent this:

  • Set up a new website and map a different subdomain to it. If your main site is www.yoursite.com, then map static.yoursite.com to it.
  • Manually change all the <link>, <script>, <img> and css url(…) references and prefix each resource with http://static.yoursite.com.
  • If you don't want to do it manually, use this solution I have built before.
  • Add a Global.asax and in the EndRequest do this trick:
    protected void Application_EndRequest(object sender, EventArgs e)
    {
      HttpContext context = HttpContext.Current;
      if (context.Request.Url.ToString().StartsWith("http://static.yoursite.com"))
      {
        List<string> cookiesToClear = new List<string>();
        foreach (string cookieName in context.Request.Cookies)
        {
          HttpCookie cookie = context.Request.Cookies[cookieName];
          cookiesToClear.Add(cookie.Name);
        }

        foreach (string name in cookiesToClear)
        {
          // Send back an already-expired cookie so the browser drops it
          HttpCookie cookie = new HttpCookie(name, string.Empty);
          cookie.Expires = DateTime.Today.AddYears(-1);

          context.Response.Cookies.Set(cookie);
        }
      }
    }

    This code reads all the cookies received with the request and expires them so that the browser does not send them again. If ASP.NET cookies somehow get injected into the static.yoursite.com domain, this code takes care of removing them.

Posted: Oct 15 2011, 07:57 PM by oazabir | with no comments
Tweaking WCF to build highly scalable async REST API

At 9 AM, during peak traffic for your business, you get an emergency call that the website you built is down. It's not responding to requests. Some people can see some pages after waiting a long time, but most can't. So you think it must be a slow query, or the database might need some tuning. You do the regular checks, looking at CPU and disk on the database server, and find nothing wrong. Then you suspect the webserver is running slow, so you check CPU and disk on the webservers. No problem there either; both the web servers and the database servers show very low CPU and disk usage. Then you suspect the network, so you try a large file copy from webserver to database server and vice versa. Nope, files copy perfectly fine; the network is healthy. You quickly check RAM usage on all servers; RAM is fine too. As a last resort, you run diagnostics on the load balancer, firewall and switches, and everything is in good shape. But your website is still down. Looking at the performance counters on the webserver, you see a lot of requests getting queued, very high request execution times, and high request wait times.

[Screenshot: ASP.NET performance counters showing queued requests and high execution/wait times]

So you restart IIS. Your website comes back online for a couple of minutes and then goes down again. After restarting several times you realize it's not an infrastructure issue: you have a scalability issue in your code. All those scalability problems you have read about and dismissed as fairy tales that would never happen to you are now happening right in front of you. You realize you should have made your services async.
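For context, an asynchronous WCF REST operation uses the Begin/End (AsyncPattern) style on the service contract. The sketch below is illustrative only; the service and operation names are made up, and this is not the code from the article:

using System;
using System.ServiceModel;
using System.ServiceModel.Web;

// Minimal sketch of an async REST operation contract using the classic
// Begin/End (AsyncPattern) style that WCF supports.
// ICustomerService and GetCustomer are illustrative names only.
[ServiceContract]
public interface ICustomerService
{
  [OperationContract(AsyncPattern = true)]
  [WebGet(UriTemplate = "customers/{id}", ResponseFormat = WebMessageFormat.Json)]
  IAsyncResult BeginGetCustomer(string id, AsyncCallback callback, object state);

  // Paired End method; WCF calls it when the async work completes.
  string EndGetCustomer(IAsyncResult result);
}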

However, just converting your sync services to async does not solve the scalability problem. WCF has a bug that prevents it from serving requests as fast as you would like: the thread pool it uses to handle async calls cannot start threads as fast as requests come in; it only adds a new thread to the pool every 500ms. As a result, you get a slow ramp-up of threads:

[Graph: slow ramp-up of the WCF thread pool]

Read my article to learn how WCF handles async services and how to fix this bug so that your async services are truly async and scale under heavy load.

http://www.codeproject.com/KB/webservices/fixwcf_for_restapi.aspx

Don’t forget to vote.

Posted: Jul 31 2011, 05:56 PM by oazabir | with no comments