I use Remote Desktop Client dozens of times per day to administer remote servers. With Windows Vista, I get an ugly prompt when connecting to Windows Server 2003 and Windows 2000 Server machines saying:
Remote Desktop cannot verify the identity of the computer you want to connect to. This problem can occur if:
1) The remote computer is running a version of Windows that is earlier than Windows Vista.
2) The remote computer is configured to support only the RDP security layer.
Contact your network administrator or the owner of the remote computer for assistance.
Do you want to connect anyway?
I know that the remote server is good; it's in my saved list of servers. But it is running Windows Server 2003 or Windows 2000 Server, so although the prompt is accurate, I don't want to have to acknowledge it over and over again.
Note (added later): The obvious answer, which Blandname pointed out in a comment, is to do this per session: click the Advanced tab in the Remote Desktop Connection tool and change the authentication option to "Always connect, even if authentication fails". If you create your own RDP file, you can set it with "authentication level:i:0".
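If you save your connection as an .rdp file, the file is just plain key:type:value text, so you can add the setting with any editor. A minimal sketch (the server name is a placeholder):

```
full address:s:server.example.com
authentication level:i:0
```

Opening that file launches the connection with the authentication check suppressed for that session only.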
If you want to set this at the server level or find out more about this setting, read on.
I did some digging using Process Monitor from www.sysinternals.com (recently acquired by Microsoft) and found that the mstsc process was checking for some particular keys in the registry. Two of them seemed like possible candidates, and after testing I confirmed that AuthenticationLevelOverride is the one that applies to this situation.
The setting is a DWORD value named AuthenticationLevelOverride under the key HKCU\Software\Microsoft\Terminal Server Client
I googled on AuthenticationLevelOverride and couldn't find much, but one Microsoft KB article had a fair bit of detail: http://support.microsoft.com/kb/895433. Here are the 3 possible values, at least on Windows Server 2003:
Set the authentication level value to one of the following values:
0 - This value corresponds to "No authentication."
1 - This value corresponds to "Require authentication."
2 - This value corresponds to "Attempt authentication."
I experimented and found that 2 is the default now. I tested the 3 modes and found that:
0 -> Doesn't prompt. Yah!
1 -> Gives a similar message but doesn't allow me to continue. This is the strictest.
2 -> Gives the message but allows me to accept and continue.
In my case, I don't even want the prompt so I set AuthenticationLevelOverride to 0 and I'm able to log into my Remote Desktop sessions without that extra prompt.
Warning: this is a decrease in security, so it should only be changed if you understand what this setting does and why you are changing it.
In summary, if you want to remove the authentication check on Windows Vista that prompts you every time you connect to a pre-Vista machine, add a DWORD value called AuthenticationLevelOverride under the HKCU\Software\Microsoft\Terminal Server Client key and set it to 0.
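As a sketch, this is what the change looks like as a .reg file you could import (this assumes the per-user key described above; review it before merging):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Terminal Server Client]
"AuthenticationLevelOverride"=dword:00000000
```

Deleting the value (or setting it to 2) restores the default prompt-and-continue behavior.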
In the early days of 64-bit, drivers were hard to find or unstable, programs wouldn’t run and help was scarce. I started using Windows XP 64-bit about a year ago, so I’ve battled through many of the issues with working with this ‘newer’ technology. At ORCS Web, we support Windows Server 2003 64-bit on different types of applications, from SQL Server to IIS to Microsoft Virtual Server.
So, is there a rule of thumb with 64-bit? Should everyone use it whenever possible? Or are there reasons to stick with 32-bit?
My original thought was that [64-bit = newer therefore 64-bit = better]. I’ve come to find out that isn’t always the case, as I’ll explain below.
First, my experience on a desktop.
I got a new computer built and supported by Dell as a 64-bit server and with Windows XP 64-bit pre-installed. Drivers weren’t a problem for me since Dell took care of that already. But I still ran into a number of issues, including:
Many applications didn’t work in 64-bit, including many of my favorite utilities and tools. If I had to guess at a number, about 80% of my applications worked normally, but 20% didn’t. That 20% was enough to be inconvenient.
My memory (RAM) usage was higher. For lots of smaller applications there is a lot of waste: 64-bit pointers and data structures are twice as wide, so processes consume more memory without actually benefiting from the larger address space.
Help was hard to find. Even simple things like finding my Mail icon in the control panel took a long time to figure out: http://weblogs.asp.net/owscott/archive/2006/01/19/435921.aspx
So, I have come to conclude that in many cases 64-bit isn’t necessary for a desktop computer. My desktop has 3GB of RAM now, but I don’t see any advantage to 64-bit unless I’m a developer who needs to test 64-bit applications or I need to use very large amounts of memory. My general recommendation is to only use 64-bit for a desktop computer if you have a specific reason to do so.
Next, my experience on a server.
How about on the server end? I made the same incorrect assumption at first, determining that most new servers would eventually use 64-bit as a standard. But experience and recent load testing has convinced me otherwise.
Recently we had a high-traffic ASP.NET website that we deployed on a 64-bit server with Windows Server 2003 x64 Standard Edition. It was part of a web farm, so we were able to compare it directly against other 32-bit servers with similar hardware. Performance was noticeably lower than on the 32-bit servers, which surprised me at first. I spent a considerable amount of time reproducing the issue and isolating it to the 64-bit OS itself. In fact, I tested ASP.NET and IIS6 running in 32-bit mode on the 64-bit server as well, so that all three configurations were covered.
I did some load testing using Microsoft’s Web Application Stress Tool (WAST). I used 2 identical servers (ordered from Dell at the same time with the same specs) and built one as 32-bit and one as 64-bit. They are solid machines: dual 3.0 GHz dual-core CPUs with 4 GB of RAM and 15,000 RPM SCSI hard drives. I set up 7 load-testing servers on a Gbps network core to run the tests.
Test Results – IIS only
Hitting IIS directly for a static 2 KB page, both servers were able to serve up over 14,000 Pages/sec at only 70% CPU. The 7 load-testing servers weren’t able to max out the web servers, so IIS doesn’t have any problem handling huge amounts of traffic!
Test Results – Simple ASP.NET
I then tested a simple ASP.NET page that ran <%= Now %> and emitted 2 KB of plain text. The 32-bit server was able to serve up a maximum of 3,400 Pages/sec; the CPUs were the bottleneck. The 64-bit server was able to serve up a maximum of 2,100 Pages/sec, so the 32-bit server easily outperformed the 64-bit server. (Note: on this particular test, on the 64-bit server, ASP.NET and IIS performed the same whether they ran in 64-bit mode or 32-bit mode.)
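For reference, the test page could be as simple as the following sketch. The original page isn't shown, so treat this as an illustrative reconstruction; the filler line stands in for the 2 KB of plain text:

```aspx
<%@ Page Language="VB" %>
<%-- Render the current date/time, followed by ~2 KB of static filler text --%>
<%= Now %>
Lorem ipsum ... (about 2 KB of static text here)
```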
Test Results – ASP.NET loop
The next test was a While loop with 1,000,000 iterations. In each iteration I called Response.Write("") just to give it something to do. Both servers were able to serve up 68 Pages/sec, so in this particular test the 64-bit server was able to catch up. It would appear that a while loop performs identically in both environments.
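To make the test concrete, here is a hypothetical reconstruction of that loop page (the original source isn't shown, so the exact code is an assumption):

```aspx
<%@ Page Language="VB" %>
<%
    ' Hypothetical reconstruction: 1,000,000 empty writes to keep the CPU busy
    Dim i As Integer = 0
    While i < 1000000
        Response.Write("")
        i += 1
    End While
%>
```

A CPU-bound loop like this spends its time in the same JIT-compiled instructions on either platform, which is consistent with the identical throughput.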
Test Results – Others
I ran some other tests, some with interesting and unexpected results, but that is a whole different topic. These tests only start to dive into the differences between 32-bit and 64-bit computing, but they show that there is a substantial difference between them: for low-memory applications, a 32-bit operating system is often faster.
From the other tests I performed, I concluded that there is roughly a 10-15% performance penalty for running a 64-bit OS with low amounts of memory. This varies depending on the operation being performed.
So, for the server environment also, I concluded that 64-bit isn’t always better either.
Now, what about the stories of huge performance gains on 64-bit? Believe them! For high-memory situations and products well tuned for 64-bit, like SQL Server 2005, there are tremendous performance gains to be had. I’ve seen some pretty impressive tests showing substantial performance gains on 64-bit. The 64-bit SQL Server and 64-bit Virtual Server machines that we have set up at ORCS Web (www.orcsweb.com) are handling huge loads with ease. While I haven’t compared SQL Server performance as deliberately as I have ASP.NET, I have every reason to believe the reports that, under high memory demands, SQL Server on 64-bit greatly outperforms 32-bit.
Memory limits blown away
If you have high memory requirements, 64-bit may not only be better, but it may be the only option. The 64-bit memory space blows away many of the memory limits that used to exist. We’re talking Terabytes instead of Gigabytes!
I’ve come to conclude that 64-bit isn’t always better. Don’t use 64-bit just to be on the latest technology. For low-memory situations, 64-bit may actually perform worse, have driver compatibility issues, and be harder to support. But for high-memory situations, chances are good that those issues disappear and the benefits of the larger address space will far outweigh them.
There is a place for each, but there is still a lot of life left for 32-bit computing.