January 2004 - Posts
As a follow-up to my post about Intel's 64-bit x86 plans, News.com has posted an article about an upcoming Intel demo.
Why am I so interested in this stuff? Well, before I turned to software, I was very interested in the design and development of chips at the transistor level. I have a BS and MS in Electrical Engineering from Georgia Tech, where I specialized in micro-electronics, VLSI design, and digital signal processing.
I was talking with Codeboy this afternoon about this bit (no pun intended) of news. The thing that a lot of these companies miss when they attempt to drive a new CPU into the marketplace is that they really need high-volume acceptance. To get that, they have to go after the whitebox crowd, and they have to have applications that run on those systems (Windows, Office, and development tools). I have watched PowerPC, Alpha, MIPS, and now IA64 make that same mistake. Oh well, what's a few billion here and there...
I just saw an article that leads me to believe that Intel is going to come out with a 64-bit extension for the 32-bit Intel x86 architecture.
Intel President and Chief Operating Officer Paul Otellini on Wednesday said the world's largest chipmaker would likely give its 32-bit microprocessors an upgrade to 64 bits once supporting software becomes available.
"You can be fairly confident that when there is software from an application and operating system standpoint that we'll be there," Otellini said, responding to a question about 64-bit technology, in an interview with a Wall Street analyst that was broadcast over the Web.
Sounds like Intel will come out with something that is binary compatible with the AMD 64-bit extensions. Hmmm, IA64 looks more and more like IBM's Micro Channel every day.
I will be doing a talk on Microsoft's next version of SQL Server, codenamed “Yukon,” at the Knoxville .NET User Group on February 19, 2004. If you are in the area, make plans to come to it.
I had a chance to sit down and read Scott Mitchell's articles on MSDN regarding data structures in .NET. First off, I like the content of them. While I have been programming professionally for 14.5 years, I have a BS and MS in Electrical Engineering, not in Computer Science. As a result, I sometimes miss certain basic items, and it was good to read the info in the articles. Secondly, I like the fact that he spent some time focusing on algorithms and how long operations take. Algorithms and the amount of time spent solving a problem are an area that is very near and dear to my heart, as I see a lot of programmers implementing algorithms that are sub-optimal.
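To make the point concrete, here is a minimal sketch of my own (the class and method names are mine, not from Scott's articles) showing why data-structure choice drives running time: finding a key in an ArrayList is an O(n) scan, while a Hashtable lookup is roughly O(1) because the key is hashed straight to a bucket.

```csharp
using System;
using System.Collections;

class AlgorithmDemo
{
    // Returns true when the key is found by both lookup strategies.
    public static bool FoundInBoth(int key)
    {
        ArrayList list = new ArrayList();
        Hashtable table = new Hashtable();
        for (int i = 0; i < 100000; i++)
        {
            list.Add(i);
            table[i] = i;
        }

        // O(n): Contains walks the list from the front on every call.
        bool inList = list.Contains(key);

        // Roughly O(1): the key hashes directly to its bucket.
        bool inTable = table.ContainsKey(key);

        return inList && inTable;
    }

    static void Main()
    {
        Console.WriteLine(FoundInBoth(99999));
    }
}
```

Both calls return the same answer; the difference only shows up in how long they take as the collection grows, which is exactly the kind of thing that separates an optimal implementation from a sub-optimal one.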
Just thought I would share with everyone a Microsoft PowerPoint presentation that I did a few months ago regarding what Microsoft .NET is to me and why a company should be interested in it. While it doesn't preach the Web Services Everywhere manifesto that I hear from many people, it does seem to hit the major issues that organizations have.
- This is not a Developer Oriented Talk. This is a talk geared towards technical managers and folks making the technology direction and purchasing decisions.
- This is not a marketing talk in that I am not attempting to solicit business from you. This is merely a presentation that I did that was very well received by a group of technology managers from different companies in Oak Ridge, TN at an organization called Tech2020.
The part of the talk that got the most interest and feedback was one bullet point stating that you no longer need a separate VB/GUI team and Web/ASP team, because ASP.NET with Visual Studio produces a design and development environment similar to a VB/Delphi-style GUI environment. I had several people come up to me afterwards and say that they were pleasantly surprised that I talked more about how .NET could affect their developer organizations and business as opposed to talking about the bits-n-bytes of a “cool“ technology.
Presentations on our site.
More presentations will be posted. These are much more technical in nature.
One of the things that I always found interesting when looking at someone else's code in Classic ADO 2.x was the number of developers that misused all of the cursor and locking options on a Classic ADO RecordSet when running against SQL Server. There are some good situations where there is a need for a scrollable, updatable server-side cursor, but I would say about 50% of the time that I see one, it is not necessary. Well, with .NET Whidbey, ADO.NET will have a scrollable, updatable server-side cursor in the framework. The advantage of the .NET version is that it is not directly associated with the SqlDataReader or the SqlDataAdapter, so it will be harder to misuse. This is unlike the situation in Classic ADO 2.x, where creating a scrollable, updatable server-side cursor was just one of several options within the RecordSet object.
Warning: Ideally, you would only want to use scrollable, updatable server-side cursors when doing programming directly within the database (such as in Yukon). So, don't try this at home without a trained expert standing by...
Here is an example of some code that I wrote to use the SqlResultSet, which is the name of the object that provides the scrollable updatable server-side cursors. Note that the SqlResultSet is created by a call to the SqlCommand object. I also thought it interesting that you can't get the number of records back, merely whether or not there are records. The .HasRows property is good enough for me.
SqlConnection sqlCn = new SqlConnection(strCn);
SqlCommand sqlCm = new SqlCommand(strSql, sqlCn);
sqlCm.CommandType = CommandType.Text;
sqlCn.Open();
SqlResultSet sqlRs = sqlCm.ExecuteResultSet(ResultSetOptions.Updatable);
if ( sqlRs.HasRows )
{
    // scroll through and update rows via the server-side cursor here
}
sqlRs.Close();
sqlCn.Close();
Additional info on the SqlResultSet object.
As I was first building my Web Spider, I figured that the easiest thing to build the spider with would be the thread pool (TP). So based on my previous ramblings, I was disappointed by the fact that the WebClient also used the TP to retrieve its results, even when used in a synchronous fashion. This effectively cut my possible performance in half. Add to this the fact that the TP in .NET only supports 25 threads per CPU at any one moment, and I was doubly frustrated. The result was that I could only fire up 12.5 threads per CPU on my development system. I just knew that if I could switch to managed threads, I would be able to pull in 25 threads per CPU (based on the WebClient in System.Net). While I am also constrained by the bandwidth at my office, I knew that the additional threads would let me “smooth out” the waves when the TP version wasn't able to access the network due to other work that was going on. I just knew that I could outsmart the TP scheduling mechanism, which will only allocate a specific number of threads based on system resources.
Given the above, I worked this morning on implementing managed threads (MTs). I got through my bugs, set the system to run with 20 MTs, hit the start button, and watched with excitement as... the performance of the app went in the toilet compared to using the TP. How in the world could this happen? Well, as I scaled back the number of threads, I watched performance increase. It appears that having too many threads contending for my limited bandwidth was causing too many problems. It looks like, given the amount of bandwidth at my office, 8 MTs is the appropriate number of threads. Maybe I am not smarter than the TP manager in .NET...
Just remember folks, throwing more threads at a solution does not make that solution run better.
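The shape of what I ended up with looks roughly like this (a minimal sketch, not my actual spider code; the fake URLs, the worker count, and the simulated fetch are all made up for illustration): a small, fixed number of managed threads pulling work from a shared queue, instead of one thread per URL.

```csharp
using System;
using System.Collections;
using System.Threading;

class BoundedSpider
{
    static Queue workQueue = new Queue();
    static int processed = 0;

    // Drains urlCount fake URLs using a fixed number of managed threads;
    // returns how many items the workers handled.
    public static int Run(int urlCount, int workerCount)
    {
        for (int i = 0; i < urlCount; i++)
            workQueue.Enqueue("http://example.com/page" + i);

        Thread[] workers = new Thread[workerCount];
        for (int i = 0; i < workerCount; i++)
        {
            workers[i] = new Thread(new ThreadStart(Worker));
            workers[i].Start();
        }
        foreach (Thread t in workers)
            t.Join();
        return processed;
    }

    static void Worker()
    {
        while (true)
        {
            string url;
            lock (workQueue)
            {
                if (workQueue.Count == 0)
                    return;        // queue drained, thread exits
                url = (string) workQueue.Dequeue();
            }
            // Real code would fetch url here (e.g., with WebClient);
            // this sketch just simulates the network wait.
            Thread.Sleep(1);
            Interlocked.Increment(ref processed);
        }
    }

    static void Main()
    {
        // 8 workers turned out to be the sweet spot for my bandwidth;
        // pushing it to 20 made things slower, not faster.
        Console.WriteLine(Run(100, 8));
    }
}
```

The point of the pattern is that the worker count is a tuning knob you set from measurement, not a number you crank up hoping for free speed.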
While I was sitting here fiddling with things this evening, I decided to do a little test to see just how well full-text search works in Yukon. Man, I was blown away. Granted, I don't have millions of rows in my table to search through, but I do have a system with about 70,000 rows set up for full-text search, and I am adding rows at the rate of about 40 URLs per minute (hey, I am bandwidth-constrained at my office). I decided to do a full-text lookup for 'President Bush' on the Yukon database while simultaneously running the spider, having already set up a full-text index in Yukon. In just a few seconds, I got about 200 rows back. Doing this same test on my other system, which is running SQL Server 2000 and is taking hours to build its full-text index, resulted in a query that took several orders of magnitude longer to search a table with about 700,000 rows in it. Now, I realize that this is not a fair comparison for several reasons. Once the full-text index is built on my SQL Server 2000 system, I am going to run a comparison. No, I am not going to post the complete results.
Here is the SQL statement I ran:
select * from tblSearchResults where contains(SearchText, '"President Bush"')
Full-text search with Yukon.
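For anyone who wants to try this at home, the setup looks roughly like the following sketch. The catalog name and key index name here are assumptions of mine, not the actual names from my database; the double quotes inside the CONTAINS string are what make 'President Bush' an exact phrase rather than two independent terms.

```sql
-- Create a catalog to hold the full-text index (name is made up).
CREATE FULLTEXT CATALOG SearchCatalog;

-- Index the SearchText column; PK_tblSearchResults is assumed to be
-- the table's unique key index.
CREATE FULLTEXT INDEX ON tblSearchResults (SearchText)
    KEY INDEX PK_tblSearchResults
    ON SearchCatalog;

-- Phrase query against the populated index.
SELECT * FROM tblSearchResults
WHERE CONTAINS(SearchText, '"President Bush"');
```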
Given the fact that AMD has made a pretty good financial uptick, I can't help but think of Intel's IA64 technology as the IBM Micro Channel of the 2000s. I view Itanium as a dragster and the AMD64 family as a four-door sports car. Which is better for going to the grocery store? And this is from someone who used to toe the Intel party line when I worked in the IT department at Coca-Cola. Of course, Coca-Cola invested pretty heavily in IBM Micro Channel systems...
Got an email a couple of hours ago from Ben Miller at Microsoft saying that I was awarded the MVP status. Thanks to Rob Howard for nominating me. First an ASPInsider and now a Microsoft MVP.