ASP.NET Road Show notes
Greg was kind enough to provide some notes that he took at the ASP.NET Road Show event.
ASP.NET EXPOSED 01/15/04
Part I: A Brief Intro to ASP.NET
ASP.NET Tips and Tricks
Part II: Preventing attacks, ASP.NET "Whidbey"
Uploading files, to disk and to SQL Server
Cross-site scripting attacks
SQL injection attacks
Whidbey=.NET 2.0 & VS.NET 2.0
Classic ASP: Bad
Way too much code required
Applications contain spaghetti code
Limited language support
Deployment can be difficult
Components were a disaster: to update a component you had to stop the web server, unregister the old component, register the new one, and restart IIS.
ISAPI Filters & ISAPI Extensions
Dramatically easier to build dynamic web pages: validation, data manipulation, PostBack, state
Better support for different clients: rich support for devices
Cleaner code organization: code no longer has to be mixed with HTML
Declarative server-side UI control model: great way to encapsulate functionality
Rich extensibility model: extend ASP.NET for your custom solutions
ASP.NET worker process:
ASP.NET runs inside a dedicated worker process - it's not part of IIS other than request/response.
Modules - like an ISAPI filter
Page handler - gets the page off the disk or from cache
ASPX engine - takes the files, DLL assemblies, and code-behind, builds a single page class, and puts it into cache. The page class is JIT-compiled for the processor architecture (Centrino, Athlon), compiled in memory, and then generates the response.
On the 2nd request it goes straight to the class instance in memory.
Strongly typed variables are key to ASP.NET's performance advantage over classic ASP.
It's not necessary to use VS.NET
Remember the ASP.NET Trace object, which provides the Trace.Write method. It won't write to the page unless you turn tracing on:
<%@ page trace="true" %>
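A minimal code-behind sketch of using Trace.Write (the category and message names here are just illustrative; Trace is the TraceContext exposed by System.Web.UI.Page):

```csharp
// Page_Load in a code-behind class.
private void Page_Load(object sender, EventArgs e)
{
    Trace.Write("Page_Load", "Starting data bind");
    // ... do work ...
    Trace.Write("Page_Load", "Finished data bind");
    // Nothing appears in the output unless trace="true" is set on the page
    // (or tracing is enabled in web.config).
}
```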
Cross-site scripting (XSS) attacks: these occur when someone submits input containing markup (angle brackets, with or without percent signs) that later gets rendered back into a page. It is possible to turn off the built-in XSS catcher; look at the notes later.
Development Tips & Tricks:
File Uploading: Built-in file upload support - No posting acceptor required & no third party components required.
Exception!! Because the directory hasn't been created yet!
By default ASP.NET cannot write to your file system!!
You have to grant the ASP.NET account the privilege to write to your disk.
The maximum upload size is 4 MB by default, because a huge upload can serve as a denial-of-service attack.
The limit can be raised with the maxRequestLength attribute on <httpRuntime> under <system.web> in web.config.
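A minimal web.config sketch of raising the limit (note the value is in kilobytes; 10240 here is an arbitrary example, 10 MB):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB; 10240 = 10 MB -->
    <httpRuntime maxRequestLength="10240" />
  </system.web>
</configuration>
```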
Don't make your upload directories browsable, because someone could upload executable code to that directory, some of which may be able to grab database passwords, etc.
FTP is better for huge upload files.
Otherwise some ASP.NET threads will be waiting around doing nothing. If you get a ton of uploads, threads will be tied up doing that work while other work stacks up in a queue. Not good.
Using a dedicated server for file uploads is a good solution: get it out of your main application path.
Or you could have uploads run in their own process, so that the regular threads can process regular web requests.
enctype="multipart/form-data" is important to use in the form attribute for an upload form.
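A minimal .aspx sketch of the built-in upload support (the control IDs here are hypothetical):

```html
<form runat="server" enctype="multipart/form-data">
  <input type="file" id="uploadedFile" runat="server" />
  <asp:Button id="btnUpload" Text="Upload" OnClick="btnUpload_Click" runat="server" />
</form>
```

In the code-behind, uploadedFile.PostedFile.SaveAs(path) writes the file to disk, provided the ASP.NET account has write access to that directory.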
It's possible to upload files into SQL server. You can get the content type. This way, you can tell the browser what type of data is being sent back down. So something like IE can open its version of Excel or Word, etc.
This way, you may never need to write anything out to disk; you can just write it to SQL Server.
You can get InputStream.Length and the content type and then dynamically construct a parameterized SQL statement.
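A code-behind sketch of storing an upload in SQL Server with a parameterized command (the table and column names are hypothetical, and uploadedFile is assumed to be an `<input type="file" runat="server">` control):

```csharp
HttpPostedFile posted = uploadedFile.PostedFile;
byte[] data = new byte[posted.InputStream.Length];
posted.InputStream.Read(data, 0, data.Length);

using (SqlConnection conn = new SqlConnection(connectionString))
{
    SqlCommand cmd = new SqlCommand(
        "INSERT INTO Files (ContentType, Content) VALUES (@type, @content)", conn);
    cmd.Parameters.Add("@type", SqlDbType.NVarChar).Value = posted.ContentType;
    cmd.Parameters.Add("@content", SqlDbType.Image).Value = data;
    conn.Open();
    cmd.ExecuteNonQuery();
}
```

When serving the file back, set Response.ContentType to the stored content type so the browser opens the right viewer (Excel, Word, etc.).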
Use stored procedures whenever possible; otherwise people can inject escape sequences into the SQL being run.
Queries built dynamically from text statements are less secure.
Small/Large object mapping if under
ASP.NET Exposed #2
Rich server image generation
Read/write any standard IO Stream
Dynamically generate GIFs/JPEGs from an .aspx page: use an ASP.NET page that sets the content type to image/gif.
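A sketch of an .aspx whose code-behind returns an image instead of HTML (System.Drawing types; the text and sizes are arbitrary examples):

```csharp
// Code-behind Page_Load for an image-generating page.
private void Page_Load(object sender, EventArgs e)
{
    using (Bitmap bmp = new Bitmap(200, 50))
    using (Graphics g = Graphics.FromImage(bmp))
    {
        g.Clear(Color.White);
        g.DrawString("Hello", new Font("Arial", 12), Brushes.Black, 10, 10);

        // GDI+ wants a seekable stream, so render to memory first.
        MemoryStream ms = new MemoryStream();
        bmp.Save(ms, ImageFormat.Gif);
        Response.ContentType = "image/gif";
        ms.WriteTo(Response.OutputStream);
    }
}
```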
Keep in mind output caching when you're making dynamic images.
<%@ OutputCache Duration="60" VaryByParam="none" %>
Page Output Caching
Cache contents of page to memory
Reuse the cached page on subsequent requests.
Microsoft Application Center Test ships with the Enterprise editions, or you can use the Web Application Stress Tool from MSDN.
You can set up a test script and run it; it will start throwing load against the application, and you can watch the number of requests per second. This is how you can test output caching.
The difference in performance is absolutely phenomenal.
IIS 6 also has kernel caching - which is even faster than before. In tests they're more likely to bump into NIC or bandwidth ceilings than in web server performance ceilings.
You could set the duration very low, like one second, so the page executes from code at most once per second; the performance gains are still nearly as impressive as with Duration="60".
Browser progress Page:
Expedia style "searching" page
Implementation: intermediate PageLoading.aspx page
Thread.Sleep(10000) can simulate a slow work process.
The please-wait page has no server-side code:
frame1.document.location = pleasewait.htm - a moving image that says "please wait"
(you can also get this code from the road show website in case you forget to type in something)
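A sketch of the trick (file names other than pleasewait.htm are hypothetical): the static animation page immediately requests the slow page, and the browser keeps showing the animation until the server finishes rendering the response.

```html
<!-- pleasewait.htm: static page with the "please wait" animation. -->
<html>
  <body>
    <img src="spinner.gif" alt="Please wait..." />
    <script>
      // Navigate to the slow results page; the animation stays on
      // screen while SlowResults.aspx is still running on the server.
      window.location = "SlowResults.aspx";
    </script>
  </body>
</html>
```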
Whidbey has asynchronous page support: it can start a task, free up the thread while the task is running, and then pick up again when it completes.
Two types of attacks
System level attacks
Exploit vulnerabilities in web servers
ISAPI DLL buffer overflows (Code Red/Nimda)
IIS Unicode directory traversal (SadMind)
Solution: Up-to-date security patches
Application level attacks
Exploit vulnerabilities in your code
Solution: code against them
SQL Injection Attacks
Exploits unfiltered inputs
Input from <form> tags
i.e. a login scenario:
Don't run your connection as the SA account.
Make sure only one record is matched for the login, not many.
If they know you're using dynamic SQL they can pass in OR statements that will make your statement evaluate to TRUE!!!
'"--- could halt a SQL statement
a') union select 1;-- could also work.
You may even leak an exception message with a lot of detail.
The ASP.NET website itself ran as the SA account! Hahahaha!!
Any situations in your app where you can dynamically create SQL statements, make sure it can't be attacked via the form and OR, or UNION statements.
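A before/after sketch of the login scenario (table and field names are hypothetical):

```csharp
// Vulnerable: user input concatenated into SQL.
// A password of  ' OR 1=1 --  turns the WHERE clause into a tautology.
string bad = "SELECT COUNT(*) FROM Users WHERE Name='" + user +
             "' AND Pwd='" + pwd + "'";

// Safer: a parameterized command passes input as data, never as SQL text.
SqlCommand cmd = new SqlCommand(
    "SELECT COUNT(*) FROM Users WHERE Name=@name AND Pwd=@pwd", conn);
cmd.Parameters.Add("@name", SqlDbType.NVarChar).Value = user;
cmd.Parameters.Add("@pwd", SqlDbType.NVarChar).Value = pwd;
int matches = (int)cmd.ExecuteScalar();

// Require exactly one matching record, per the advice above.
bool loggedIn = (matches == 1);
```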
Script injection can be prevented in .NET 1.1 by using request validation: ValidateRequest="true" is the page default and checks for brackets, i.e. <script> etc.
You can capture the exception message and keep track of what someone is trying to do to you.
Always do your own checks on the input. <script>, <embed>, and <object> tags are dangerous, but if you can, just code for what you'll allow, not what you intend to disallow.
Credential Storage: Salted hash passwords - look at the slides for this info - he's passing over it.
Firstly, Whidbey is backward-compatible with ASP.NET 1.1.
Overview: rich data editing against business objects, output caching....
Developer productivity: reduce code by two-thirds. Enable rich scenarios not easily possible today.
Better administration and management.
File-based and SQL data caching: ASP.NET takes data from the database and serves it from memory until there's a change in the database.
There are also new templates of apps.
Cassini is included so that you don't need IIS. It can't serve outside the local box, but you can start building and debugging.
The new building block APIs:
Membership objects, Rolemanager, Personalization objects.
Partial classes: part of a class can be defined in one file and the rest in another; the compiler merges them into a single class.
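A sketch of the partial-class idea (the class and member names here are hypothetical; both parts compile into one class):

```csharp
// File 1: the generated/designer half.
public partial class CheckoutPage
{
    protected Button submitButton;   // control declarations live here
}

// File 2: the developer's half.
public partial class CheckoutPage
{
    void submitButton_Click(object sender, EventArgs e)
    {
        // page logic lives here, free of generated plumbing
    }
}
```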
ASP.NET Exposed #3
Site navigation, database caching, management - no more having to go through the global.asax file.
The provider model design pattern connects the new objects to the data stores: SQL, Oracle, Active Directory.
There is the new *.master page
In addition to IntelliSense, there's the statement completion feature.
There's a new SqlDependency property for output caching.
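A hedged sketch of how that looked in the Whidbey demo (the database and table names here are arbitrary examples):

```html
<%@ OutputCache Duration="600" VaryByParam="none" SqlDependency="Northwind:Products" %>
```

The cached page is evicted when the named table changes, instead of only on a timer.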
Greg was obviously not sleeping during the presentation!