Archives

Archives / 2003 / December
  • NASDAQ cut out its free XML feed...bummer

    Oh well, I guess you get what you (don't) pay for.  It was too good to be true: NASDAQ used to have a publicly-accessible XML feed for stock quotes, but I noticed it had been inaccessible for several days, and I finally found a couple of links from some PHP gentlemen who said the service has been cut off.

    So, this basically killed off the effectiveness of a custom server control I built months ago that was available from the ASP.NET Control Gallery.  Darn.

    Read more...

  • Bravo, Jeff - awesome "Power ASP.NET Programming" WebCast

    In case you missed it live earlier this month, or if you're like me and live in a place where geography mandates getting up at 3AM on Wednesday mornings to watch live presentations, Jeff Prosise's “Power ASP.NET Programming” kicks major ass.

    The Wintellect guru talks custom HttpHandlers for dynamic imaging with GDI+ and logging in the HTTP pipeline, shows lots of cool demos, and has a sweet database-dependency trick for SQL Server and ASP.NET 1.x.  Jeff also discusses what not to do when developing code and components, so as a pessimist, I enjoyed it.  It's a 200-level WebCast, covering topics outside the scope of the typical ASP.NET presentation, so it's a must-watch.

    (Plus, you'll chuckle at the constant commentary from what I assume to be Jeff's pet parrot squawking in the background.)  :)

    Check it out.

    Read more...

  • What a long, strange trip it's been: the biggest ASP.NET moments for 2003

    Borrowing a timeless theme from the late, great Jerry Garcia, here’s my list of memorable moments for being a proud ASP.NET developer over the past 12 months.  Feel free to append your own, as I’m sure I’ve left something out.

     

    • Updated version of ASP.NET Web Matrix released
    • New edition of Steve Walther’s seminal work “ASP.NET Unleashed” published, featuring ASP.NET 1.1 examples in C# and VB.NET
    • MMIT included as part of VS.NET 2003, eliminating need for separate download
    • Dave Wanta’s aspNetEmail component slays competition for sending mail
    • Someone figures out how to share session data between ASP 3.0 and ASP.NET 1.x
    • ASP.NET Forums take off...and take over
    • Data provider for Oracle released
    • C# gains leverage over VB.NET (ouch - you may throw tomatoes....NOW!!!)
    • The eternal “should I use a DataReader or DataSet?” argument, after much debate, is put to rest in numerous community forums
    • Whatever happened to ASPElite?
    • I foolishly fall victim to the soon-to-be bought out IDG Books (aka, Hungry Minds), now Wiley Publications, serving as technical reviewer and getting screwed out of a payment
    • Starter Kits made public
    • ASPAdvice inherits mailing list community from now-defunct ASPFriends
    • The .NET Show promises - and then cancels – episode on developing custom server controls 
    • Microsoft Application Blocks released to rave reviews
    • I get selected as a Whidbey alpha tester.  I have seen the future, and it is very, very, very good.
    • Freeware Cassini web server debuts (I think this happened in 2003)
    • ASPToday.com becomes part of fallout as Wrox goes under; properties later acquired by APress
    • Microsoft makes big push towards RSS in several of its public web properties
    • Community top dogs make move towards blogs, thanks to Scott Watermasysk’s .TEXT
    • Bravo to the cat who made DataGrids scrollable by setting <div style="overflow:auto;"><asp:DataGrid id="dg" runat="server" /></div>
    • MSDN-TV debuts with Rob Howard talking about AppSettings
    • I secure two subscriptions to ASP.NET Pro Magazine – one for the office to get wrecked, one for archival at home
    • Addison-Wesley rolls out outstanding “.NET Developer Series”- advanced books without fluff or marketingspeak
    • ASP.NET WebCast Week on MSDN announced – <and the crowd goes wild!>
    • 2.0 - Whidbey blows ‘em all away after premiering at PDC

    Read more...

  • Community project: creating an ASP.NET-specific extension for UML

    I'd like to initiate a community-oriented project for visual modeling, focused strictly on ASP.NET development.  It will, at its core, incorporate the main concepts Jim Conallen of Rational introduced for the Web Application Extension for UML (WAE) in his excellent book “Building Web Applications with UML”.

     

    Basically, I’m looking to organize discussions for a common set of icons and associated visual modeling glyphs to be used by ASP.NET developers, for our way of life.  This logically could begin with the simple icons already available in VS.NET and Web Matrix for ASP.NET-related file types to demonstrate files and their relationships, and could extend all the way to things like components, Use Cases, XML Web services relationships, namespace hierarchies, custom controls, HttpHandlers, caching & cache dependencies, DALs, application settings, and Global.asax-resident routines.  However, this will only extend UML, not supersede it.

     

    I’ve been meaning to do this for use within my own projects, and I’d be happy to share it with you, too.

     

    I think it would really help us understand each other’s cool code and architectural tips as we share ideas if we could develop a common set of images and conventions just for us. 

     

    In my opinion, Microsoft technologies are to date a bit weak at providing such visual help (in comparison to Rational, for instance), and UML in general is overkill for web-based applications.  The WAE Conallen spoke of is good, but not MS-specific. 

     

    Basically, I'm just making an open call for a diverse group of people within the ASP.NET community willing to share their ideas (minimal time required...just a couple of messages every now and then) and work on developing an image set to iconify ASP.NET concepts.  We’ll then aggregate this information and make it available for public download.  If this really takes off, I’m hoping to have enough cool stuff to collaboratively develop an IDE add-in for visual modeling, available as freeware.

     

    Hopefully, it’ll be effective enough to be recognized and used with somewhat broad distribution by Microsoft web developers.   

     

    I’m willing to start and archive the information generated by such discussions, and do the majority of the legwork to get this going.  Anyone interested?

    Read more...

  • Practicality of having typed view state/ViewState API for ASP.NET 2.0?

    I’m quite sure someone has brought this up at some point, either theoretically or jokingly, and I’m not sure how feasible it would be (although I think it would be pretty cool): can/should a page’s view state be typed, rather than storing all data within as type object?

     

    Perhaps the convention of ViewState["key"] could take a second argument when 2.0 rolls out, which would be the explicitly-stated data type for the data?

     

    CHEESY EXAMPLE 1:

    String myName = "Jason Salas";

    ViewState["aDudeInGuam", System.String] = myName;

     

    Or, possibly this could be set in web.config for **certain** ViewState entries, providing typing information, as the Profile object does for personalization in 2.0?

     

    CHEESY EXAMPLE 2:

    <viewstate keyName="phoneNumber" type="System.Int32"/>

     

    Or, maybe include a ViewState API (there’s a thought), similar to what is provided with the Cache API, wherein developers have a variety of overloaded methods from which to choose in setting/accessing view state values?

     

    CHEESY EXAMPLE 3:

    int homePhoneJenny = 8675309;

    ViewState.Insert("keyPhone", homePhoneJenny, System.Int32);

     

    Would this even be worth it, or make sense?  It seems to me like this would fit in nicely with Whidbey’s push towards more streamlined programming without the need for casting/recasting data.

    Certainly, many people would get something out of it when working with business logic, and if this would help mitigate the expensive performance hit caused by the Framework's internal binary serialization for types without readily-available type converters, that would be gravy.
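    In the meantime, a rough approximation is possible by funneling state access through a generic helper, so the cast lives in one place.  This is only a sketch - TypedStateBag and its members are names I made up, and I'm modeling the bag as a plain dictionary rather than the real StateBag:

    ```csharp
    using System;
    using System.Collections.Generic;

    // Hypothetical typed wrapper over an untyped key/object bag, approximating
    // what a typed ViewState accessor might feel like.  Not a real ASP.NET API.
    class TypedStateBag
    {
        private readonly Dictionary<string, object> _bag = new Dictionary<string, object>();

        // store a value under a key; the compiler infers T at the call site
        public void Set<T>(string key, T value)
        {
            _bag[key] = value;
        }

        // retrieve a value without the caller ever writing a cast
        public T Get<T>(string key, T fallback)
        {
            object raw;
            return _bag.TryGetValue(key, out raw) ? (T)raw : fallback;
        }
    }
    ```

    On a real page, the dictionary would be the page's ViewState collection; the point is just that bag.Get("aDudeInGuam", string.Empty) comes back as a string with no cast in sight.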

     

    What do you think? 

    Read more...

  • Is NASDAQ's XML feed down?

    I noticed a couple of days ago that the free XML feed from NASDAQ has been inoperative, returning an empty root XML node, and it has yet to come back up.  I first got wind of this after noticing that a custom stock ticker server control I built several months back (http://www.kuam.com/techtalk/nasdaqcompositecustomservercontrol.htm) was showing up blank.

    I checked the source, and sure enough, NASDAQ was empty. 

    Something similar happened to the free http://weather.interceptvector.com/ XML feed for weather data about a year ago.  I, and several others, used to tap that service for its great features.  The guy who ran it (a nice dude, I corresponded with him a couple times), took it offline after apparent repeated problems.

    Read more...

  • Still trying to achieve anonymous personalization with WebParts in Whidbey

    My big pet project of late has been trying to pull off anonymous personalization in ASP.NET 2.0.  I've talked to several PMs on the WebParts/Portal Framework team, and they say that such was possible previously, but it's since been taken out.

    It's become more of a hobbyist project now than anything, even though most people I've talked to defer to using content stored in a database to provide personalization.  Basically, I'm trying to create a facility that will allow a site's user to be able to shift WebPart-based content areas around without requiring membership. 

    It's a stretch, but I'm still working out the kinks.  I'm just running circles around how to get an index for each WebPart and persist it across postbacks and through sessions.
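    The crude approach I keep circling back to is serializing "partID:index" pairs into a single cookie value, sidestepping membership entirely.  Here's a sketch of just the encoding half (PartLayout is a name I invented; wiring it to HttpCookie and the actual WebParts API is the part I'm still fighting with):

    ```csharp
    using System;
    using System.Collections.Generic;

    // Hypothetical helper: round-trips WebPart positions through a flat
    // string suitable for a cookie value, e.g. "weather:0|news:1".
    static class PartLayout
    {
        public static string Encode(IDictionary<string, int> positions)
        {
            List<string> pairs = new List<string>();
            foreach (KeyValuePair<string, int> kv in positions)
                pairs.Add(kv.Key + ":" + kv.Value);
            return string.Join("|", pairs.ToArray());
        }

        public static Dictionary<string, int> Decode(string cookieValue)
        {
            Dictionary<string, int> positions = new Dictionary<string, int>();
            if (cookieValue == null || cookieValue.Length == 0) return positions;
            foreach (string pair in cookieValue.Split('|'))
            {
                string[] parts = pair.Split(':');
                positions[parts[0]] = int.Parse(parts[1]);
            }
            return positions;
        }
    }
    ```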

    In the process, I've been turning the WebParts API upside-down trying to figure it out.  It was mentioned previously that the protected SavePersonalizedData() method probably provides this type of functionality.

    What was that line about persistence being a virtue?

    Read more...

  • The top buzzwords used to market software development products

    I recall one of my viewers being absolutely livid over the fact that I mentioned the term “killer app” during the TV segment I host on web development, thinking I was making a call to violence.  In response, I did what all great journalists do - used her lack of foresightedness as the subject of my next column.  :)

    It's funny...as I'm filling out Christmas cards, I'm subconsciously being way too wordy in the corporate sense, rendering what should be sentimental, cheery, holiday greetings into de facto fluffy ads.  In business school, I was taught to be as verbose and long-winded as possible, and then as a journalist, I'm required to be extremely refined and simplistic.  And of course, as a programmer, my very existence is rooted in and around logic.  Needless to say, the many directions in which my brain is tugged daily make for some interesting internal debates about how to communicate. 

    This made me think...what are the most overused buzzwords/marketingspeak used in the development world to market IT products today?  Here are some of my faves:

    • “...gives you more granular control over...“
    • “language-agnostic...“
    • “scalability with stovepipe applications“
    • “a rich UI“

    I'm interested in seeing what new terms become part of the developer's lexicon, whether by use or by force.  What are your top buzzwords/terms?

    Read more...

  • Digital hypocrisy: crossing the use/misuse continuum on the Web with ROBOTS.TXT

    I've always found ROBOTS.TXT, the optional file you can save in a Web site, sound in purpose but extremely hypocritical and potentially lethal to a site's integrity.  As a guy who’s been in technical marketing for more than a decade, it's always been an interest of mine to see the practical use of tidbits of information towards giving a site maximum exposure.  As a budding developer years ago, this was also one of my first forays into “security“.

    As a refresher, ROBOTS.TXT is a simple text file stored in the root directory of a Website, containing metadata that instructs search engine spiders which directories/subdirectories to avoid browsing, so as not to include sensitive information in their indexes.  A simple concept, but the fact that these files can be browsed by any idiot with a browser and an Internet connection of any speed makes them dangerous.
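    For the sake of illustration, here's what a minimal one looks like (the directory names are made up):

    ```
    # robots.txt - lives at the site root, e.g. http://www.example.com/robots.txt
    User-agent: *        # applies to all spiders
    Disallow: /admin/    # "please don't index this"...
    Disallow: /reports/  # ...which also announces to the world that it exists
    ```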

    For more on ROBOTS.TXT, visit
    http://www.robotstxt.org/wc/robots.html

    It's literally like saying, "Hey, there are certain directories I have secretive content stashed in, and I don't want you to see them at all...and here they are."

    Need proof? Check these URLs out for some good examples how varying organizations in varying industries creatively use the file:

    http://www.intel.com/robots.txt
    http://msn.espn.go.com/robots.txt
    http://www.ford.com/robots.txt
    http://www.cisco.com/robots.txt 
    http://www.cnet.com/robots.txt 
    http://slashdot.org/robots.txt 

    In fact, if memory serves, I recall an engineer at Sun Microsystems several years back writing quite the scathing criticism about the use of ROBOTS.TXT on
    www.sun.com, seeing as how it gave hackers one less challenge to break their stuff (Sun apparently had a bunch of internal download sections, CGI scripts and administrative utilities located in directories they didn't want search engine spiders to find out about). By storing the directory names in ROBOTS.TXT, Sun was essentially giving people the direct URL(s) to their private information, which granted was password-protected, but still overcame arguably THE major hurdle of hacking a site - figuring out which directories contain the good stuff.

    As for me, I constantly use the META tag in pages I don't want spiders to see.  That normally does the trick.  Using ROBOTS.TXT improperly just invites users savvy enough to know it exists (as many of you now do, after reading this) to type in your site’s domain name and append “/robots.txt”.
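    For reference, the META approach goes in the HEAD of each individual page you want left alone:

    ```html
    <meta name="robots" content="noindex, nofollow">
    ```

    The spider has to request the page to see the tag, but unlike ROBOTS.TXT, it doesn't hand the whole world a sitemap of your sensitive directories.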

    To play the file’s proponent, it does do an effective job of preventing spiders from indexing your stuff.  And sure, this locks unwanted access out from I'd dare say 97% of the Web browsing community.  It would only be Web developers trying to hack Web developers, and one would hope that there would be enough honor among thieves, as it were, or at least an appreciation for parity, that savvy people would not engage in such pursuits.

    However, some organizations do use the file to their advantage, implementing it not as a security measure, but more as a way to keep redundant content - or data that would otherwise clutter the Web even more - from being indexed.

    Check out:
    http://www.asp.net/robots.txt 
    http://www.google.com/robots.txt 

    And just in case you’re wondering, don’t even bother looking for the file
    on my site - it doesn’t exist. :)

    Read more...

  • I'm hoping for more project-oriented books for ASP.NET 2.0

    I was really impressed with the former Wrox's (now APress) title, “ASP.NET Website Programming: Problem, Design, Solution”, and I'm hoping that for Whidbey, there'll be more titles like this.

    It really gave an architectural perspective on an application, taking a single theme and expanding it exhaustively throughout the course of the book.  It showed in-depth code and the concepts behind several sub-applications within the main app, which is really needed more these days.  And it really leveraged some of the aspects of building an ASP.NET application with reusable code and components.  It's still one of the better reads out there.

    Hope there's more planned for the future.

    Read more...

  • Code for custom RSS generator server control in ASP.NET 1.x

    I wrote a custom server control for an RSS generator after some people requested it in response to a previous blog post of mine, which threw out the question of whether Microsoft could develop one or more custom server controls to generate and consume RSS feeds.  It got a decent response, with people vehemently petitioning both for and against it.

    It's far from a landmark achievement of modern computer science, but it demonstrates how easy it is to do (which, mind you, was never in question).

    Read more...

  • Code download: RSS custom server control

    I wrote this custom server control class to demonstrate how easy it is to develop a portable control generating RSS feeds for a content-oriented site.  It formats data coming out of a SQL Server database to conform to the RSS 2.0 Specification.

    The only requirement is that since this control generates XML data, a page using the control can contain no HTML headers or markup other than the control itself and page-level directives, so output caching will still apply to the XML-based data.  For example, this would be the code in a client page, like an .ASPX or .ASCX file:

    <%@ Page debug="false" trace="false" language="c#" AutoEventWireup="false" %>
    <%@ Register TagPrefix="kuam" Namespace="RSSFeed" Assembly="RSSFeed" %>
    <%@ OutputCache Duration="30" VaryByParam="none" Location="server" %>

    <kuam:rssgenerator id="rssgen1" runat="server" SQLString="SELECT ID,Date,Title,Author,Description FROM ContentDBTable" Server="server=localhost;database=CompanyContent;uid=sa;pwd=;enlist=false;" DataFormatString="http://www.mysite.com/news/{0}.aspx" />

    Here's the skinny on the object model (3 public properties):

    • DataFormatString - an echo of the .NET Framework's property of the same name, for structuring the destination URL
    • Server - the database connection string for the data store to hit
    • SQLString - the query to execute against the specified DB

    There's some room for improvement, which is very easily done, notably in the following areas:

    • support for databases other than SQL Server (Access, Oracle, ODBC, etc.)
    • support for XML-based data stores
    • a few more public properties for more customization

    Let me know if you find it helpful, and write me if you extend it to fit your own uses.  I did this in between TV shows, and I'm always a sucker for improvement.  :)


    using System;
    using System.Web;
    using System.Web.UI;
    using System.Web.UI.WebControls;
    using System.ComponentModel;
    using System.Text;
    using System.Data;
    using System.Data.SqlClient;

    namespace RSSFeed
    {
        /// <summary>
        /// This custom server control grabs data from KUAM.COM to be used as an RSS feed for headlines.
        /// This control requires removing all of the HTML headers from a page (no content).  Therefore, the only
        /// things that can be on the page are the control itself and any page-level directives.
        /// </summary>
        [DefaultProperty("Text"), ToolboxData("<{0}:rssgenerator runat=server></{0}:rssgenerator>")]
        public class rssgenerator : System.Web.UI.WebControls.WebControl
        {
            // data members
            private string _sql;
            private string _serverName;
            private string _url;

            // public properties
            [Bindable(true), Category("Data"), DefaultValue("")]
            public string SQLString
            {
                get { return this._sql; }
                set { this._sql = value; }
            }

            [Bindable(true), Category("Data"), DefaultValue("")]
            public string Server
            {
                get { return this._serverName; }
                set { this._serverName = value; }
            }

            [Bindable(true), Category("Data"), DefaultValue("{0}")]
            public string DataFormatString
            {
                get { return this._url; }
                set { this._url = value; }
            }

            // get the daily news
            private string GetRSSNewsFeed(string server, string sql)
            {
                SqlConnection conn = new SqlConnection(server);
                SqlCommand comm = new SqlCommand(sql, conn);
                SqlDataReader dr;

                string newsFeedData = string.Empty;

                try
                {
                    conn.Open();
                    dr = comm.ExecuteReader(CommandBehavior.CloseConnection);

                    newsFeedData += "<!-- RSS Newsfeed custom server control -->\n";
                    newsFeedData += "<!-- Written by Jason Salas \n Web Development Manager / News Anchor, KUAM News \n jason@kuam.com \n http://weblogs.asp.net/jasonsalas/ \n December 19, 2003 -->\n";
                    newsFeedData += "<rss version=\"2.0\">\n\t<channel>";

                    while (dr.Read())
                    {
                        // TODO: make the array of DB values come from a string[] array, use boolean operator for data provider
                        newsFeedData += "\n\t\t<item>\n\t\t\t<title>" + MakeDataXMLSafe(dr.GetString(2)) + "</title>\n\t\t\t<description>";
                        newsFeedData += CreateAbstract(dr.GetString(4)) + "</description>\n\t\t\t<link>";
                        newsFeedData += MakeDataXMLSafe(string.Format(DataFormatString, dr.GetInt32(0))) + "</link>\n\t\t\t<author>";
                        newsFeedData += MakeDataXMLSafe(dr.GetString(3)) + "</author>\n\t\t\t<pubDate>";
                        newsFeedData += MakeDataXMLSafe(string.Format("{0:D}", dr.GetDateTime(1))) + "</pubDate>\n\t\t</item>";
                    }
                    dr.Close();

                    newsFeedData += "\n\t</channel>\n</rss>";
                }
                catch (SqlException ex)
                {
                    newsFeedData += ex.ToString();
                }
                catch (Exception ex)
                {
                    newsFeedData += ex.ToString();
                }
                finally
                {
                    if (conn.State == ConnectionState.Open) conn.Close();

                    comm.Dispose();
                    conn.Dispose();
                }
                return newsFeedData;
            }

            private string MakeDataXMLSafe(object data)
            {
                // escape the ampersand FIRST, so the entities inserted below don't get double-escaped
                string dataString = data.ToString();
                dataString = dataString.Replace("&", "&amp;");
                dataString = dataString.Replace("'", "&apos;");
                dataString = dataString.Replace("\"", "&quot;");
                dataString = dataString.Replace(">", "&gt;");
                dataString = dataString.Replace("<", "&lt;");

                return dataString;
            }

            private string CreateAbstract(string bodyContent)
            {
                // create a story summary by truncating the BODY field of the DB table
                int periodIndex = bodyContent.IndexOf(".");
                string finalText = string.Empty;

                // no period at all?  just use the whole body
                if (periodIndex == -1) return bodyContent + " ... ";

                if (periodIndex < 150)
                {
                    // take two sentences, falling back to one if a second period doesn't exist
                    int newPeriodIndex = bodyContent.IndexOf(".", periodIndex + 1);
                    finalText = bodyContent.Substring(0, (newPeriodIndex == -1) ? periodIndex : newPeriodIndex);
                }
                else
                {
                    finalText = bodyContent.Substring(0, periodIndex);
                }
                return finalText + " ... ";
            }

            protected override void OnInit(EventArgs e)
            {
                // set the MIME type for the page in which the control sits to XML
                this.Context.Response.ClearContent();
                this.Context.Response.ClearHeaders();
                this.Context.Response.ContentType = "text/xml";
            }

            protected override void Render(HtmlTextWriter output)
            {
                output.Write(GetRSSNewsFeed(_serverName, _sql));
            }
        }
    }

    Read more...

  • Why even bother asking for permission to link to a site?

    OK...I'm a web guy in the TV/media biz, so I'm not exactly surrounded by a huge community of people who subscribe to the software developer’s way of life.  In other words, I get a lot of requests from people from all walks of life who think “you’ve got a great Inter-web,” try to email 90MB TIFFs embedded in Word documents, and constantly ask me where on the Internet they can find information on any topic, as if I’ve got Google’s entire search index stored mentally.  You get the picture.

    Thus, my reason for writing - it's perfectly OK for the non-hardcore web community and those marketing morons not in the know to link to someone's site without asking.  For real.

    Throughout the year, but in particular around the holidays, I get bombarded by e-mails from legitimate individuals and organizations asking if they can link to my site (I've gotten 7 today so far).  Now, I did a fair amount of coursework in graduate school on intellectual property law (which is to say, 1 class), and while it may be kosher to ask to associate oneself with another organization's site, it's realistically impossible to accurately track just who out there is connecting to your domain via hypermedia.

    I'm a marketing major myself, so this practice of unnecessary politeness drives me nuts.  Such was a reasonable inquiry when the Web first took off, but we’re too far into the game at this point.  Behavior of this nature is arguably THE reason the job title of “Administrative Assistant” was created...giving those people something to do for 8 hours a day, and giving them something to lay claim to.

    If IBM started linking to my site, it's free press for my company's stuff - and there really is no such thing as bad press.  We in the blogging community link openly, and with great frequency and passion, to other people's blogs, to MSDN, to countless online resources, news articles, media files, and everything under the sun.  And we do so in praise of, in reference to, and in criticism of, the content contained therein.

    My boss at a former job once asked me to compile a list of ALL the sites pointing to our domain.  How the heck am I to list not only the media & news sites linking to us, but also every Geocities and AngelFire site out there?  Impossible!  Heck, it's the foundation of why the Web was created in the first place, so make use of it.

    A company I once dealt with had an agreement companies needed to sign if they were to link to any of its pages, containing so much legalese it would make Johnny Cochran salivate.  Not surprisingly, no one bit, and most people were turned away by the anal retentiveness.  Those that were still interested just linked anyway, I’m guessing out of spite.  And again, the company itself never really knew who was doing so.

    So fear not, e-marketers, link away.  Let me put it in terms you can understand, wherein you can believe your own hype.  Swallow a healthy spoonful of the medicine from the very Nike campaign you helped to create: just do it.

    Read more...

  • SUGGESTION: formatting/color-coding changes for VS.NET

    I spoke to a Microsoft usability engineer recently who said there were discussions about possibly changing the color coding scheme employed by Visual Studio .NET.  While I'm totally happy with the current formatting convention used by the IDE (blue for language-specific keywords, green for comments, gray for in-line documentation, bolded purple for classes, etc.), I'd like to make the following suggestions:

    • Enumerations should be formatted differently from classes and structs.  I realize enums are functionally similar to classes and structs, but formatting them as bolded purple tends to get me confused.  I get it, but I'd like it to be different.  Since they're a data type from which a predetermined range of values is selected, it would help me if they used some other coloration.
    • Method calls should be bolded to distinguish them from "normal" code.  One of my favorite text editors, the freeware Jext (www.jext.org), formats method calls/function invocations (as well as the parentheses) in bolded black text to identify a method.  I really like this, and it really speeds up my productivity when I'm hunting for something in my code.
    • I'm not sure if this is possible/feasible, but create a new class for SQL/T-SQL statements, perhaps called "System.Data.SqlClient.SqlLanguage" (or something to that effect), and provide the ability to color-code SQL syntax inline, as if it were being done in Query Analyzer.  This would make for fantastic debugging!  (I'm tentative about this one...would this hamper performance by instantiating yet another object on a page just to get code coloration?)

    At the very least, I'm hoping for consideration over the bolded method call suggestion.  That REALLY helps.

    Read more...

  • A developer's New Year's resolutions (aka, "How I plan to fatten my resume for 2004")

    I've made a little promise to myself as a web developer to try (and hopefully be somewhat proficient in) a few key areas of programming in 2004.  Thus, I'm hoping to dabble in the following for possible use in my work 12 months from now:

    • Windows Forms
    • integrating ASP.NET and Flash
    • .NET Remoting
    • Programming .NET components for Office

    How about you?  What's on your mental To Do list for '04?

    Read more...

  • Output caching and MIME types other than HTML

    I’ve got an interesting question - does output caching apply when the MIME type of a page is modified by changing its ContentType, or is it only applicable for HTML-based content? 

    I’m trying to cache an .ASPX page where the MIME type is changed from the default “text/html” to “text/xml”, using the OutputCache directive, and I’m not sure if it’s working (it doesn’t appear to be).  I can't really use page-level tracing, because the markup generated by tracing is read as XML up to a certain point, and then it throws an error.

    I wouldn't expect output caching to work for other MIME types like "image/jpeg" or "application/ms-word", and I've worked around it using the Cache API, but again, it's one of those things I'm curious about because I honestly don't know.
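    One cheap way to test it (a sketch - the timestamp is just a cache probe): if output caching is honoring the modified MIME type, the value below should stay constant across requests within the 30-second window.

    ```aspx
    <%@ Page Language="C#" %>
    <%@ OutputCache Duration="30" VaryByParam="none" %>
    <script runat="server">
        void Page_Load(object sender, EventArgs e)
        {
            // switch the MIME type, then emit a timestamp as the probe
            Response.ContentType = "text/xml";
            Response.Write("<probe><served>" + DateTime.Now.ToString("HH:mm:ss.fff") + "</served></probe>");
        }
    </script>
    ```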

    Read more...

  • Will (and should) ASP.NET 2.0 embrace RSS more?

    I've been messing with various RSS (really simple syndication) applications after having created my own implementations of syndicating content from my site.  If you can't beat them....

    Anyhoo, I've been thinking about whether it would be a good idea to suggest that ASP.NET openly embrace RSS more, perhaps to the point of including one or more server controls that could easily let a developer syndicate their site's stuff without going through the coding aspect of it.  I think a drag-and-drop control that would seamlessly allow a site's content to be readily available to consumers would be a great addition to Whidbey's feature set. 

    It's very easy to build providers and consumers already in ASP.NET 1.x, and they're no doubt in wide distribution throughout the ASP.NET community, so I think the demand is justified.  In fact, a past episode of MSDN TV cited RSS broadcasting as a major driving force behind a significant part of MSDN, Microsoft.com, and other aspects of Microsoft's domain.

    However, for mere textual content, could one reasonably argue that this might stymie, if not negate, the efforts of men better than me to further the use of XML Web services?  The counterpoint is that you get the same effect as consuming a remote content feed without the overhead of the XML Web services development model (although IMHO, consuming web services in ASP.NET 1.x is more fluid than creating an RSS client).

    Essentially, the API would consist of a dev specifying a datastore (database, XML, etc.) and connection string (DB string, XPath, etc.), and the control would do the rest, emitting a template-based XML document conforming to the RSS 2.0 specification.  (Custom controls can be a bit tricky when working with MIME types other than “text/html” in the current model, but that's why those guys get the big bucks.)  I also think providing automatic caching facilities through a public property would be a nice touch.
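    Under the covers, such a control would mostly be wrapping plumbing like the following (a rough sketch of emitting an RSS 2.0 skeleton with XmlTextWriter; the channel values and item are placeholders, and a real control would loop over rows from the configured datastore):

    ```csharp
    using System;
    using System.IO;
    using System.Xml;

    public class RssSketch
    {
        // Emit a minimal RSS 2.0 document for a single hard-coded item.
        public static string BuildFeed(string title, string link)
        {
            StringWriter buffer = new StringWriter();
            XmlTextWriter writer = new XmlTextWriter(buffer);
            writer.WriteStartDocument();
            writer.WriteStartElement("rss");
            writer.WriteAttributeString("version", "2.0");
            writer.WriteStartElement("channel");
            writer.WriteElementString("title", title);
            writer.WriteElementString("link", link);
            writer.WriteStartElement("item");
            writer.WriteElementString("title", "Sample headline");
            writer.WriteElementString("link", link + "/1.aspx");
            writer.WriteEndElement(); // item
            writer.WriteEndElement(); // channel
            writer.WriteEndElement(); // rss
            writer.WriteEndDocument();
            writer.Close();
            return buffer.ToString();
        }

        public static void Main()
        {
            Console.WriteLine(BuildFeed("My Site", "http://www.somedomain.com"));
        }
    }
    ```

    Nothing hard, but it's exactly the sort of boilerplate a drag-and-drop control could hide entirely.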

    What do you think?  Would this be overkill?  Are you happy with how ASP.NET uses RSS the way it is now?

    Read more...

  • Great examples for using C# 2.0 Generics with ASP.NET

    If you’re having a tough time wrapping your brain around Generics in C# 2.0 as applicable to ASP.NET, or want some cool practical examples of how to use Generics, check out Patrick Lorenz’s awesome book “ASP.NET 2.0 Revealed”.

    In particular, read the section describing data-driven server controls for the sweet example of using an ObjectDataSource with Generics as part of a business logic tier to bind to a GridView or DetailsView and easily add editing, deleting and inserting of records.  It's one of the few practical examples around at this point.
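    The shape of that pattern, roughly sketched (the Story class and StoryManager tier are hypothetical names of my own; the ObjectDataSource would point its TypeName at the business class and its SelectMethod at the collection-returning method):

    ```csharp
    using System;
    using System.Collections.Generic;

    // Hypothetical business object for illustration.
    public class Story
    {
        private int _id;
        private string _headline;

        public int ID { get { return _id; } set { _id = value; } }
        public string Headline { get { return _headline; } set { _headline = value; } }
    }

    // Business logic tier returning a typed collection; an ObjectDataSource
    // can bind a GridView or DetailsView straight to a method like this.
    public class StoryManager
    {
        public static List<Story> GetStories()
        {
            List<Story> stories = new List<Story>();
            Story s = new Story();
            s.ID = 1;
            s.Headline = "Campaign finance reforms upheld";
            stories.Add(s);
            return stories;
        }
    }
    ```

    The typed List<Story> is what makes the binding story so clean - no casting, no untyped DataTables in the middle tier.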

    Also included in the book is Scott Guthrie’s excellent example from his “Tips & Tricks” presentation from PDC about using Generics in ASP.NET.

    Read more...

  • The new "in" thing for web development: get rid of the query string

    After I commented yesterday about MSNBC.com's site redesign, Robert McLaws had the wisdom to point out that the site's URLs are now in the following format: http://msnbc.msn.com/id/3695726/, making each story look like its own subdirectory.

    In similar fashion, I've also taken note that Steve Smith at ASPAlliance also recently migrated his site's URL convention, lopping off the query string for user-submitted articles: http://aspalliance.com/324

    After having migrated my own site to ASP.NET, I now use a convention that MSNBC previously used, employing my database’s ID field as the disguised filename: http://www.kuam.com/news/3457.aspx.  The facilities within ASP.NET for dynamically rewriting a path (namely, the RewritePath() method) make pulling this off incredibly easy.
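    A sketch of how that might be wired up (the "story.aspx" target page and the path pattern are assumptions for illustration, not my exact setup; in Global.asax, Application_BeginRequest would call Context.RewritePath with the mapped path):

    ```csharp
    using System;
    using System.Text.RegularExpressions;

    public class UrlRewriter
    {
        // Map a "disguised" filename like /news/3457.aspx onto the real page,
        // e.g. /news/story.aspx?id=3457 ("story.aspx" is a hypothetical name).
        public static string MapPath(string path)
        {
            Match m = Regex.Match(path, @"^/news/(\d+)\.aspx$");
            if (!m.Success)
                return path; // not one of ours; leave it alone
            return "/news/story.aspx?id=" + m.Groups[1].Value;
        }

        // In Global.asax, Application_BeginRequest would then call:
        //   Context.RewritePath(UrlRewriter.MapPath(Request.Path));

        public static void Main()
        {
            Console.WriteLine(MapPath("/news/3457.aspx"));
        }
    }
    ```

    The page itself still gets its ID from the query string - the user just never sees it.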

    This brings to light a theme that seems to be popping up more and more within the web development community, in particular from ASP.NET-driven sites: ridding oneself of the query string.  It would appear that site designers are now considering the cosmetic appeal of a URL as part of the site's total usability, and an Internet address' psychological effect on the user.

    Perhaps this implies a thought pattern on the user's part: if a URL is messy and complex, the site will be, too?

    It seems that people are finally catching on to a principle that, as a marketing guy, I've held since the first day I saw it: query string-based URLs are really ugly.  A URL with one or more appended name/value pairs is incredibly hard to remember, and the values that were once extracted from it can be accessed and persisted in other places just as well (the Cache API, Session, ViewState).  I recall how people flocked to adding query string values to their URLs circa 1997 - whether they really needed them or not - just to look advanced, shying away from plain ol' “/directory/filename.html”.

    Apparently, people are starting to realize the promotional potential and KISS charm in simple page addressing schemes.  Good to see.

    Read more...

  • SUGGESTION: allow control for CatalogWebParts via remote windows in ASP.NET 2.0

    I think it would be a nice touch if the Portal Framework included functionality wherein a developer wouldn't have to drop a CatalogWebPart on the same page being worked on.  For instance, the Framework could include a method or property that allowed child/popup windows to control the changes and then, upon submission, commit the changes back to the parent page.

    One of the methods already available in the API wraps a call to window.open() to launch a static “help” page, but this, of course, would be way more complex.

    In a discussion I had with Andres Sanabria, product manager for Web Parts & the Portal Framework, he indicated that it might be possible, but it's sticky since passing server data back-and-forth between a child and parent window client-side introduces new problems.  He said though, that I wasn't the first to bring this up.

    Regardless, people are most certainly going to want to do this at some point, and undoubtedly someone will come up with a workaround if it's not standard, but it would be nice if it came as part of the base functionality for ASP.NET 2.0.

    Thanks for listening!

    Read more...

  • Code for Web service-less portable content

    I've gotten a couple of requests for the ASP.NET code for my blog about making your site's content portable to remote consumers without using a web service or RSS, so here 'tis.

    // Requires: using System.Data; using System.Data.SqlClient; using System.IO;
    // The connection string and query are placeholders - substitute your own.
    string connString = "server=(local);database=News;Integrated Security=SSPI;";
    string strSQL = "SELECT StoryID, Title, Body FROM Stories";

    SqlConnection conn = new SqlConnection(connString);
    SqlCommand comm = new SqlCommand(strSQL, conn);
    SqlDataReader dr;
    StreamWriter writer = File.CreateText("C:\\inetpub\\wwwroot\\portableheadlineswithabstracts.js");

    string swappedOutSingleQuotes;
    int periodIndex;
    int newPeriodIndex;
    string finalText;

    writer.WriteLine("<!-- ");
    conn.Open();
    dr = comm.ExecuteReader(CommandBehavior.CloseConnection);
    while (dr.Read())
    {
        // write the headline as a link back to the story
        writer.WriteLine("document.writeln('<a target=\"_blank\" href=\"http://www.yourdomain.com/" + dr.GetInt32(0) + ".aspx\"><b>" + dr.GetString(1) + "</b></a><br>');");

        // create an abstract for the story by truncating the BODY field of the DB,
        // swapping out single quotes so the JavaScript string isn't broken
        swappedOutSingleQuotes = dr.GetString(2).Replace("'", "’");
        periodIndex = swappedOutSingleQuotes.IndexOf(".");
        if (periodIndex < 150)
        {
            // first sentence is short, so run the abstract through the second period
            newPeriodIndex = swappedOutSingleQuotes.IndexOf(".", periodIndex + 1);
            finalText = swappedOutSingleQuotes.Substring(0, newPeriodIndex);
        }
        else
        {
            finalText = swappedOutSingleQuotes.Substring(0, periodIndex);
        }
        writer.WriteLine("document.writeln('" + finalText + " ... <br><br>');");
    }
    dr.Close();

    writer.WriteLine("//-->");
    writer.Close();

    Response.Write("<h1>The headlines with abstracts were written successfully!</h1>");

    First, here’s some basic instructions that you can place on your site, letting remote designers/developers know how to add your content to their pages with a single <SCRIPT> tag: http://www.kuam.com/marketingprograms/headlinegrabber.htm

    As far as the database fields used, here’s the breakdown:

    GetInt32(0) = StoryID (INT – unique ID)
    GetString(1) = Title (TEXT)
    GetString(2) = Body (TEXT)

    Here’s a sample of what a remote page would look like for the remote content (news headlines with abstracts) so you can view it in action: http://www.kuam.com/marketingprograms/newsgrab.htm

    As long as you’ve got the write permissions on your directory, you’ll be up and running in no time!  I hope it does the trick for you or gives you some ideas for your own projects.

    Read more...

  • New-look MSNBC.com

    MSNBC.com just revamped their site, and it's quite impressive, IMHO.  The URL redirects to http://msnbc.msn.com so I'm not sure if this is a step in a new direction, but it's cool nonetheless.

    It's not a night-and-day change, but the most obvious modifications are new fonts (thank you for finally going away from Times New Roman - yuk!), and I think most of their apps are running ASP.NET, from what my friends over there tell me.   Someone on the inside indicated a few initial minor glitches that were quickly resolved, and I thought their layout was off initially, but that was my machine.

    I had to modify the code I use to harvest headlines from them, and I know they've been burning the midnight oil lately getting the site migrated over.

    Congrats on a job well done, MSNBC'ers!  :)

    Read more...

  • Make your site's content portable to remote consumers without a web service or RSS

    I run a news Web site (obligatory applause), and one of the keys to us staying competitive is making our news headlines portable to other sites.  It's a simple model: people put our headlines on their page(s), which link back to us.  I figured out a way to do this.

    This is actually the pseudocode for a tutorial I did a couple years back for ASP101.com, but it resurfaced recently, and some people thought it was cool in today's XML-friendly web:

    1. Connect to your data store (database, XML, Exchange, etc.)
    2. Write the content in JavaScript syntax to a .JS file that you save on your server (usually overwriting a file of the same name)
    3. Have ALL remote clients reference the full URL to your .JS file in the SRC attribute of <SCRIPT> tags:
      • <script language="JavaScript" src="http://www.somedomain.com/somedir/anotherdir/foo.js"></script>

    It's a simple, 2-minute solution that practically all consumers will be able to use.  Documenting and supporting it is a snap, and you won't need to worry about whether distant-end developers know how to consume XML Web services or RSS feeds, update DLLs, etc.  It's also usable in third-party hosting situations where supported services are a bit tighter than most.  You write it once, then update it as often as you see fit (my own implementation runs off of a scheduled Windows task).

    Click here for the ASP 3.0/ADO source...let me know if you need it for ASP.NET.  It's not posted, but I've got it.

    Read more...

  • I'd like to see an ASP.NET-specific extension to UML (and support in Visio wouldn’t hurt, either)

    Jim Conallen of Rational wrote the seminal book “Building Web Applications with UML”, in addition to penning numerous resources, available from IBM's site.  He describes the Web Application Extension for UML (WAE), which is a great, scaled down philosophy for laying out components, interfaces and page relationships/mapping – both cosmetic and functional - within the context of a web application. 

    Jim’s book highlights an area of development that for many is thought to be either too grandiose or too closely tied to traditional desktop development to be relevant in the web environment, or is just generally misunderstood, and therefore largely underutilized.  The book “Building e-Commerce Sites with the .NET Framework” also uses a lot of database diagrams, Use Case models and site-flow diagrams, demonstrating that one doesn’t need to do a lot of classical UML, such as class diagrams, to be effective and build good documentation.

    I'd like to see more documented evidence of, and more widespread use of, modeling (UML and otherwise) within web applications.  I got a lot out of Jim's book, and have conversed with him sporadically on the use of modeling.  I started incorporating visual modeling into my projects a few years back, and while it is arguably overkill for smaller projects (it is a lot of additional work if done right), it is a critical part of development and really speeds the production process.

    I realize that it's probably not feasible to include UML and other visual modeling tools in the basic offering for Visual Studio .NET, due to the fact that for many environments they just wouldn’t be used, not to mention the increased cost of the base product, but it would be nice to have better modeling tools than the “Web site” tools in Visio, which are largely mapping diagrams showing link relationships and don’t speak to web components.  On the flip side, the software engineering tools in Visio, not surprisingly, tend to favor the desktop.  I’m not ragging on Visio, just asking for more.

    If someone out there created a UML offshoot specific to ASP.NET, allowing for custom server controls, HttpHandlers, modules, Global.asax routines, etc., I think it would be great.  This could either be a physical product, an intangible development methodology or a recommended standard.  I believe any of these would help and be greatly appreciated.

    On this note, while the first edition of his book mentioned ASP.NET briefly and largely from a conceptual standpoint, Jim has told me that the next version will get more in-depth with demonstrating web-centric modeling for ASP.NET.

    Read more...

  • Interesting point: XmlTextReader interprets end elements as elements

    This sounds weird, but I discovered something earlier this week that I never would have expected, although it makes perfect sense.  When using an XmlTextReader to navigate through a remote XML document in what I thought was the right way, I kept getting extra blank nodes.

    When reading the following XML structure:

    <Story>
      <Headline>Campaign finance reforms upheld</Headline>
      <Abstract>The Supreme Court upheld two key parts of a new campaign finance law Wednesday —  one on so-called soft money loopholes and the other on issue advertising.</Abstract>
    </Story>
    <Story>
      <Headline>U.S. copter down in Iraq</Headline>
      <Abstract>A U.S. military helicopter made an emergency landing Tuesday in Fallujah, just hours after a car bomb attack on barracks near the northern city of Mosul wounded 41 U.S. soldiers, mainly with flying debris and glass. Elsewhere, a rocket attack on a Baghdad mosque killed three Iraqi civilians.</Abstract>
    </Story>

    The XmlTextReader was treating the closing </Story> tags as nodes of their own (XmlNodeType.EndElement)!  Duh!  Thanks to Dan Wahlin (the man when it comes to anything XML) for helping me suss this out.
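    A quick way to see it, reading a boiled-down version of the fragment above from a string rather than a remote URL:

    ```csharp
    using System;
    using System.IO;
    using System.Xml;

    public class EndElementDemo
    {
        // Count how many nodes Read() reports as end elements - the "extra"
        // nodes I wasn't expecting when walking the document node-by-node.
        public static int CountEndElements(string xml)
        {
            XmlTextReader reader = new XmlTextReader(new StringReader(xml));
            int endElements = 0;
            while (reader.Read())
            {
                if (reader.NodeType == XmlNodeType.EndElement)
                    endElements++;
            }
            reader.Close();
            return endElements;
        }

        public static void Main()
        {
            string xml = "<Stories><Story><Headline>Test</Headline></Story></Stories>";
            // 3: </Headline>, </Story> and </Stories> each surface as a node
            Console.WriteLine(CountEndElements(xml));
        }
    }
    ```

    The moral: filter on NodeType rather than assuming every call to Read() lands you on a start element.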

    Read more...

  • PivotTables (aka cross-tab queries) in Longhorn - YAH!

    [TAKING A BREAK BETWEEN SHOWS...]

    I'm stoked about the fact that Longhorn will support PivotTables, also known as cross-tab queries.  If I recall correctly, PivotTable queries have been rolled into a single function.  Sweet!

    I recently did a project where these were used extensively, and I nearly forgot just how much code (relatively speaking) it takes to generate such stuff.
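    For comparison, here's roughly what the manual plumbing looks like today (a sketch that cross-tabs flat rows into per-category counts by month; the data and shape are made up for illustration):

    ```csharp
    using System;
    using System.Collections;

    public class CrossTabDemo
    {
        // Pivot (category, month) pairs into nested per-category counts -
        // the kind of plumbing a built-in PivotTable function would replace.
        public static Hashtable CrossTab(string[][] rows)
        {
            Hashtable table = new Hashtable();
            foreach (string[] row in rows)
            {
                string category = row[0];
                string month = row[1];
                if (!table.ContainsKey(category))
                    table[category] = new Hashtable();
                Hashtable months = (Hashtable)table[category];
                if (!months.ContainsKey(month))
                    months[month] = 0;
                months[month] = (int)months[month] + 1;
            }
            return table;
        }

        public static void Main()
        {
            string[][] rows = {
                new string[] { "Sports", "Nov" },
                new string[] { "Sports", "Nov" },
                new string[] { "News", "Dec" }
            };
            Hashtable result = CrossTab(rows);
            Console.WriteLine(((Hashtable)result["Sports"])["Nov"]); // 2
        }
    }
    ```

    Multiply that by every aggregate and every axis you need, and you can see why a one-call version is welcome.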

    Read more...

  • Happy anniversary/RIP ASPFriends

    Catholics will appreciate the significance of a first-anniversary rosary, the service held to commemorate the one-year anniversary of a loved one’s passing.

    It just dawned on me this morning that it’s been about a year since the moderately-ballyhooed demise of the ASPFriends mailing lists.  I did a story on it for work, interviewing two people who I consider my friends, Scott Guthrie and Charles Carroll.  And if nothing else, it made for an interesting blemish on the complexion of the history of ASP.NET development.

    We’ve migrated away – whether by closure, by preference, or merely out of following the pack - from ASPFriends, the Wrox P2P Forums, and the DevelopMentor mailing lists and forums, and have now embraced concepts like the ASP.NET Forums, blogs, WebCasts and a whole slew of community-oriented events.  Whether this is indicative of the progression of technology or just reminiscent of migratory patterns that would make salmon jealous is debatable.

    Much in the way my parents remember where they were when JFK was shot, or how I've committed to memory where I was when the announcement was made that Elvis had died, or when John Lennon was assassinated (or more recently, when the Smashing Pumpkins broke up), can you recall what you were doing when the whole battle over ASPFriends ensued?  This was a simple matter that turned ugly and got tragically aired out to the entire ASP.NET community over the span of a few harried days.  I remember it vividly.

    And if you’re sitting now, groaning about why I’m bringing up old stuff, it’s because it’s comical, if anything, to remember just how large what should have been a small issue became.

    I recall a long-winded mass e-mail from Charles, sent numerous times, stating his case and announcing, in no shortage of words and implied emotion, that he would be terminating the valued e-mail mailing lists because of a contractual dispute with Microsoft.  In doing so, he indirectly lobbied for people’s support to become campy and join his crusade against the corporate monolith.  He made public outcries against noted and distinguished community members like Steve Smith and Alex Lowe.  He denounced Microsoft’s efforts.  He gave testimony about how many hours he put into the project, how he personally moderated many of the lists, how he recommended people for Microsoft MVP honors for ASP.NET, and how he personally financed the site’s hosting charges.

    I then recall a response letter from Scott a couple of days later, stating Microsoft’s position, promising great things with the ASP.NET Forums, and informing us that, short of a court order, Charles had been asked repeatedly to stop calling and harassing Microsoft staffers about the issue.

    Obviously, civility and professionalism won.

    The response from the ASP.NET community, not surprisingly, was largely apathetic.  We witnessed occasional supporters in the days following openly crying out for ASPFriends not to go offline.  Likewise, other people took the time out of their days to say, “Good riddance”.  But for the most part, the vast majority of developers accepted it, said nothing, and moved on with their lives. 

    Sadly, ASPFriends, Charles’ great creation and without a doubt THE most popular aspect of the ASP.NET community experience, has been whittled down to a few Yahoo! Groups and some sparse mailers from ASPElite.  It's a sad and tragic death for something that gave us all so much.

    ASPAdvice, run by Smith and Lowe, has since effectively picked up the slack and is a thriving online community of its own, using the same e-mail-based peer support concept.  The Forums on www.asp.net are phenomenally popular and everyone who’s anyone in the ASP.NET community has a blog.  Life goes on. 

    And who knows?  We may very well see the closure of the applications used today for other new and interesting products in the near future.  Innovation says that’s probably going to be the case, and a lot can happen in a year.   

    So say a prayer, drop some virtual flowers, and remember the good old days of ASPFriends.  And thank whichever deity you subscribe to that we’ve been able to move on and get to where we are now.

    Read more...

  • ASMX sample response message using serialized XML doesn't account for derived members

    I've noticed something in the sample response XML generated by an .ASMX file in ASP.NET 1.x (at the very least, in 1.0).  Specifically, when using serialization for custom XML, the Web service apparently does not take into account data members inherited from a base class when reporting what a response message will look like.  Don't get me wrong, the final XML message itself is perfect, but I've found inaccuracies in the .ASMX response message.
     
    For instance, here's an example I ran into when working on a statistics service for a local football league.  I used a base class "Player", containing only properties.  The classes "Offense", "Defense" and "SpecialTeams" all contain statistical information, and inherit from Player to get the shared properties Name, Position, JerseyNumber and Team:
     
    public class Player
    {
        private string _name;
        private string _position;
        private string _jerseyNumber;
        private string _team;

        public string Name
        {
            get { return this._name; }
            set { this._name = value; }
        }

        public string Position
        {
            get { return this._position; }
            set { this._position = value; }
        }

        public string JerseyNumber
        {
            get { return this._jerseyNumber; }
            set { this._jerseyNumber = value; }
        }

        public string Team
        {
            get { return this._team; }
            set { this._team = value; }
        }
    }
          
    [XmlRoot("OffensiveStats")]
    public class Offense : Player
    {
        // implementation...removed for brevity
    }

    [XmlRoot("DefenseStats")]
    public class Defense : Player
    {
        // implementation...removed for brevity
    }
       
    [XmlRoot("SpecialTeamsStats")]
    public class SpecialTeams : Player
    {
        // backing fields
        private int _punts;
        private int _puntReturns;
        private int _puntReturnTDs;
        private int _kickoffReturns;
        private int _kickoffReturnTDs;

        public int Punts
        {
            get { return this._punts; }
            set { this._punts = value; }
        }

        public int PuntReturns
        {
            get { return this._puntReturns; }
            set { this._puntReturns = value; }
        }

        public int PuntReturnTDs
        {
            get { return this._puntReturnTDs; }
            set { this._puntReturnTDs = value; }
        }

        public int KickoffReturns
        {
            get { return this._kickoffReturns; }
            set { this._kickoffReturns = value; }
        }

        public int KickoffReturnTDs
        {
            get { return this._kickoffReturnTDs; }
            set { this._kickoffReturnTDs = value; }
        }

        // default class constructor...required here for serialization
        public SpecialTeams()
        {}

        // overloaded class constructor
        public SpecialTeams(string name, string position, string jerseyNumber, string team, int punts, int puntReturns, int puntReturnTDs, int kickoffReturns, int kickoffReturnTDs)
        {
            base.Name = name;
            base.Position = position;
            base.JerseyNumber = jerseyNumber;
            base.Team = team;
            this._punts = punts;
            this._puntReturns = puntReturns;
            this._puntReturnTDs = puntReturnTDs;
            this._kickoffReturns = kickoffReturns;
            this._kickoffReturnTDs = kickoffReturnTDs;
        }
    }
     
    [XmlRoot("Leaderboard")]
    public class Leaderboard
    {
        [XmlArray("Offense")]
        [XmlArrayItem("Player")]
        public Offense[] offensePlayers;

        [XmlArray("Defense")]
        [XmlArrayItem("Player")]
        public Defense[] defensePlayers;

        [XmlArray("SpecialTeams")]
        [XmlArrayItem("Player")]
        public SpecialTeams[] specialteamsPlayers;
    }
     
    As you can see, the Leaderboard class contains arrays of Offense, Defense and SpecialTeams objects to generate rosters of the statistical leaders in those respective categories, and it's this latter class that's returned by the Web service.  Pretty cut-and-dried stuff, and far from groundbreaking.  But here's where I noticed a gotcha: while the eventual XML generated contains all the fields and properties from the subclasses, as one would expect, the .ASMX sample response does not include the fields inherited from the base class.
     
    In my particular implementation, I'm reading values from a DB into a dataset, which are then passed as arguments to the overloaded constructors of the Offense, Defense and SpecialTeams classes.
     
    However, this is the sample response the .ASMX file generates:
     
    <?xml version="1.0" encoding="utf-8"?>
    <Leaderboard xmlns="http://stats">
      <Offense>
        <Player>
          <TotalTDs>int</TotalTDs>
          <RushingAttempts>int</RushingAttempts>
          <RushingYardage>int</RushingYardage>
          <RushingTDs>int</RushingTDs>
          <PassingAttempts>int</PassingAttempts>
          <PassingYardage>int</PassingYardage>
          <PassingTDs>int</PassingTDs>
          <Receptions>int</Receptions>
          <ReceivingYardage>int</ReceivingYardage>
          <ReceivingTDs>int</ReceivingTDs>
        </Player>
        <Player>
          <TotalTDs>int</TotalTDs>
          <RushingAttempts>int</RushingAttempts>
          <RushingYardage>int</RushingYardage>
          <RushingTDs>int</RushingTDs>
          <PassingAttempts>int</PassingAttempts>
          <PassingYardage>int</PassingYardage>
          <PassingTDs>int</PassingTDs>
          <Receptions>int</Receptions>
          <ReceivingYardage>int</ReceivingYardage>
          <ReceivingTDs>int</ReceivingTDs>
        </Player>
      </Offense>
      <Defense>
        <Player>
          <Interceptions>int</Interceptions>
          <InterceptionTDs>int</InterceptionTDs>
          <FumbleRecoveryTDs>int</FumbleRecoveryTDs>
          <Tackles>int</Tackles>
          <Sacks>int</Sacks>
          <Safeties>int</Safeties>
        </Player>
        <Player>
          <Interceptions>int</Interceptions>
          <InterceptionTDs>int</InterceptionTDs>
          <FumbleRecoveryTDs>int</FumbleRecoveryTDs>
          <Tackles>int</Tackles>
          <Sacks>int</Sacks>
          <Safeties>int</Safeties>
        </Player>
      </Defense>
      <SpecialTeams>
        <Player>
          <Punts>int</Punts>
          <PuntReturns>int</PuntReturns>
          <PuntReturnTDs>int</PuntReturnTDs>
          <KickoffReturns>int</KickoffReturns>
          <KickoffReturnTDs>int</KickoffReturnTDs>
        </Player>
        <Player>
          <Punts>int</Punts>
          <PuntReturns>int</PuntReturns>
          <PuntReturnTDs>int</PuntReturnTDs>
          <KickoffReturns>int</KickoffReturns>
          <KickoffReturnTDs>int</KickoffReturnTDs>
        </Player>
      </SpecialTeams>
    </Leaderboard>

    Note that the fields inherent to each class are represented, but the inherited fields (Name, Position, JerseyNumber, and Team) aren't there.  This is consistent through the sample response content generated for requests made through SOAP, HTTP-GET and HTTP-POST.  However, everything comes out perfectly, as expected, when executing the method and examining the XML.  Certainly, a consumer of the Web service can see the true XML returned by invoking the method, but it makes for some unexpected surprises and misdirection when the true data to be returned isn't reported.

    If this is the result of ignorance on my part or a genuine flaw in the .NET Framework, I'd just like to know which.  Either way, it didn't produce the results I expected, although fortunately, it still worked perfectly in the final wash.  I researched this for a while and tried a few different approaches, and I thought I had it right.
     
    Anyone else run into this?
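    For what it's worth, serializing an instance directly does confirm the inherited members come through fine; it appears to be only the auto-generated sample message that omits them.  A minimal repro, using boiled-down versions of the classes (public fields rather than properties, just to keep it short):

    ```csharp
    using System;
    using System.IO;
    using System.Xml.Serialization;

    // Stripped-down stand-ins for the football classes above.
    public class Player
    {
        public string Name;
    }

    public class SpecialTeams : Player
    {
        public int Punts;
    }

    public class SerializationCheck
    {
        public static string Serialize()
        {
            SpecialTeams st = new SpecialTeams();
            st.Name = "J. Cruz";
            st.Punts = 4;

            // XmlSerializer walks the inheritance chain, so the inherited <Name>
            // element shows up even though the .ASMX sample message leaves it out.
            XmlSerializer serializer = new XmlSerializer(typeof(SpecialTeams));
            StringWriter buffer = new StringWriter();
            serializer.Serialize(buffer, st);
            return buffer.ToString();
        }

        public static void Main()
        {
            Console.WriteLine(Serialize());
        }
    }
    ```

    The resulting XML contains both <Name> and <Punts>, which matches what my service actually returns on the wire.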

    Read more...

  • VS.NET *should* wrap all attributes with quotations by default

    I’ve got a comment/suggestion about the way VS.NET Whidbey handles values for HTML attributes and declarative control properties.  If the intent is to truly output XHTML 1.0, I assumed the IDE would naturally generate attributes and properties surrounded by quotes (i.e., <a href=“adoc.html”>, <asp:Label id=“myLabel”/>).

    However, I’ve often found that when IntelliSense tries to autocomplete values in the way I’ve been used to since the Visual InterDev days (hitting “TAB” to autocomplete a statement, member or method), the quotes get left off, writing out the above examples like this: (<a href=adoc.html>, <asp:Label id=myLabel/>).

    I’d like to suggest that these types of statement completions be wrapped in quotes automatically.  Is this just me, or is this a known **bug**?

    Read more...

  • Suggestion: all IEWebControls should ship as part of ASP.NET 2.0

    I've got a quick suggestion I'm hoping Microsoft would consider: can the IEWebControls (TabStrip Control, et al.) ship as part of the standard package for ASP.NET 2.0? I think the TreeView is great, and it would be perfect if the controls that could be downloaded with 1.x came out-of-the-box with Whidbey.

    However, I assume that some of the controls aren't fully compliant with all browsers (DHTML issues and whatnot), or at least they weren't a few years ago when I first read about them.

    Comments?

    Read more...

  • Open "casting call" for .NET book publishers - what ya got comin' for 2.0?

    Hi all,

    I’d like to make an open “casting call” (sorry, I’m in the TV biz) to all book publishers to post either lists of, or links to, all the books you’ve got in the works or in the planning phases for .NET 2.0.  The general consensus seems to be that there were a helluva lot more titles on the market even in the days of .NET Beta 2 than there were for ASP 3.0, which at that point had been out for a couple of years.

    Many speculate that the number of .NET 2.0 books will surpass that number, so I’d like your feedback on what you’re going to be putting out onto our shelves and into our laps.  General omnibus titles?  Specific, niche-market books?  A little bit of both?

    Thanks for your input!

    Read more...

  • Anxiously waiting for Dino's ASP.NET 2.0 book

    I've been through the first two “official” books to make it to print:

    ...and I'm anxiously waiting for Dino Esposito's title “Introducing ASP.NET 2.0“ from MSPress to come out (Feb. 2004, I think).  I've asked O'Reilly, and the rep said they don't have any beta books planned as of yet.  Dino's writing is always very example-driven, and he's a great writer.

    So in the time until Dino's book is out, I'd wholeheartedly recommend that any experienced ASP.NET developer get the first two as soon as you can.  They each have their distinct advantages, but contain enough information, presented in differing ways, that they complement each other nicely.  Amazon doesn't “recommend” buying both as a package deal, but it's a good thought.  The Addison-Wesley text is slightly more technical from an architectural standpoint, and is therefore more conceptual.  It gives insight into many of Whidbey's new APIs and methods available to you, and highlights several new features like precompilation.

    The Apress title is more hands-on and more code-rich.  The thing I like best is that the introductory chapters deal with the new language features like Generics, and then use them practically in examples; the Generics coverage is especially appreciated.

    I read them in the order they're listed above, and they're both must-haves.

    Read more...

  • Book Review: A First Look at ASP.NET v 2.0

    The scientist Louis Pasteur is famous for, amongst other things, saying, “chance favors the prepared mind.” With Whidbey on the horizon, Dave Sussman, Alex Homer and Rob Howard are getting you prepared for battle, as you combat long, drawn-out development sessions and having to write thousands of lines of code. You’ll definitely want to pick up a copy of this book to properly arm yourself.

    Whether you’re an existing alpha tester or one of the many who are privy to a PDC copy of Whidbey, this is the definitive source you’ll want in your arsenal for the next evolution of ASP.NET. There’s code galore, and the concepts are explained easily and well, while still mixing in the specifics of how the next version of .NET will help you become a better web developer.

    The book’s hearty 470+ pages strike a tone that is friendly and comforting, which is a plus when you consider the proverbial piano of information about new features and enhancements that will be dropping on you. It’s best read by an experienced ASP.NET developer familiar with the concepts and terms inherent to Microsoft web development. One will quickly welcome the perspectives given on a variety of topics, from caching to the new server controls, to the enhancements Version 2.0 of the .NET Framework delivers.

    The book does not completely marry the reader to the Whidbey version of Visual Studio .NET, instead presenting the code examples in an IDE-agnostic manner, so as to still appeal to the Notepad enthusiast in all of us. Still, the vast improvements to VS.NET itself are well documented.

    All the book’s examples are presented in Visual Basic .NET, which isn’t so bad, as one of the key points of the title is that Whidbey’s new model minimizes the authoring of code itself, so you can concentrate more on working with encapsulated server controls and optimizing your web apps through intelligent configuration and management utilities.

    A very healthy chapter on Web Parts and Whidbey’s model for the portal framework is most appreciated, and the ease with which you’ll sift through the accompanying code just goes to prove how much better developing web-based applications will be once Whidbey arrives. Equally thick chapters cover new aspects of the feature set such as master pages, membership, and personalization, along with great discussions of the improvements to the existing security, data controls, configuration and administration.

    The book also does a great job of keeping multi-platform application development in mind, constantly mentioning the capabilities of Whidbey to generate output for both the desktop-based and mobile browser.

    My personal favorite new feature of ASP.NET 2.0 is Web Parts and Personalization, and the book has a great deal of information on both. The book proves that not only has Microsoft listened to customers and thought way ahead in developing the next big thing, but the title’s authors themselves answer many questions you’d likely ask.

    If you’re wondering if this book (and Whidbey in general) is worth it – believe the hype. Get this book now. You’ll be very happy you did, and will be anxiously anticipating the release of Beta 1.

    Read more...

  • Book Review: ASP.NET 2.0 Revealed

    This book is much more demonstrative and visual than some of the other titles currently on the market, making it the ultimate complement for books like “A First Look at ASP.NET v 2.0”. Overall, the book’s tone is very educational but not intimidating, complex but not complicated, making for a very friendly atmosphere that makes it quite easy to read.

    The book tackles some of the more technical topics involved with developing and administering ASP.NET web sites, so it’s an effective tool in lightly previewing what’s coming on the horizon for us, as well as giving you the in-depth answers to allow you to start planning to solve problems and work more efficiently.

    In each of the title’s 13 hearty chapters, there are great examples that accompany most, if not all, of the concepts presented. So you don’t get an empty, lost feeling after being drawn into an idea; it’s followed by a relevant, working C# example.

    The book is definitely written for the web developer already primed with experience in ASP.NET 1.x, so familiarity with the concepts of Microsoft web development is a must. But assuming that, you’ll be very pleased.

    Specifically, there are several areas in this book I found to be outstanding. These include a fantastic introduction to generics, iterators, anonymous methods and other new features of both C# and Visual Basic .NET. Also, there’s a great description of using the ObjectDataSource control for binding business objects to data controls, and an equally nice discussion of the Site Counter API and keeping tabs on the users currently on one’s site. The book also has a good chunk of information about the improvements to working with dynamic imaging, and about the ease with which you can create and control client-side script.

    Additionally, there’s a great preview of the Whidbey version of Visual Studio .NET’s features specifically for web developers, and the book provides constant tips and comments about which features are likely to change by the Whidbey Beta.

    The sole bit of criticism I have is that the chapter on Web Parts, while interesting, largely borrows from the documentation and samples you can find on MSDN and in the .NET Framework documentation.

    But outside of that, the positives far outweigh the negatives, and if you’re on the hunt for anything and everything you can get your hands on about the

    Read more...

  • IntelliSense for web.config files in Whidbey

    I bounced the topic of IntelliSense for web.config off the ASPAdvice general mailing list for Whidbey (aspnetv2@aspadvice.com), commenting on how much I love the fact that config syntax gets colorization treatment, as do all XML files (I haven't tried XSLT files yet).  However, as sweet as it is, it's a letdown without IntelliSense support, especially since most values in config files are enumeration-based.

    Scott Guthrie shot back a message later and said that's definitely in the works.  Awesome.

    Read more...

  • Suggestion: WebPartManager should be auto-included on page

    I’ve got a couple of suggestions on how Visual Studio .NET handles Web Parts. 

    First, one of the requirements of getting Web Parts to work properly is that an instance of a WebPartManager object needs to be present on a page.  I was surprised to discover that the Visual Studio .NET alpha does not add a WebPartManager automatically when WebPartZones are added visually through drag-and-drop to a WebForm.

    To date, publicly-available samples like the PDC Hands-On Labs and Whidbey documentation, indicate a WebPartManager needs to be manually added to a page, either programmatically or declaratively.  This actually struck me as surprising, considering how the IDE automatically handles other aspects of the .NET Framework. 
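
    For reference, that manual, declarative step amounts to a single control declaration on the WebForm; here's a sketch based on the alpha-era syntax (the IDs are illustrative, and tag names could well change in later builds):

     <%-- a WebPartManager must be present before any WebPartZone renders --%>
     <asp:WebPartManager id="WebPartManager1" runat="server" />

     <asp:WebPartZone id="zone1" runat="server">
      ...
     </asp:WebPartZone>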

    Microsoft indicates that Web Parts can be “used” without an accompanying WebPartManager; however, in doing so, WebPartZones are only able to render content, sacrificing the key functionality Web Parts were meant to deliver (i.e., displaying a title bar, allowing draggable repositioning, etc.) and making them the functional equivalent of Literal controls, which would appear to be a terribly inefficient way just to display content.

    I can realistically foresee the lack of a WebPartManager control becoming a problem for developers of all levels whose Web Parts won’t seem to work as advertised, and one of the most-asked questions in forums, UseNet, mailing lists, etc. (Question: “Why doesn’t my page display Web Parts correctly?” / Answer: “Did you include a WebPartManager?”)

    Therefore, if the placement of a WebPartManager object on a page is “mandated” for Web Parts to work properly, I suggest it be included automatically by VS .NET when a WebPartZone control is dropped onto a page.  This to me seems logical and would be consistent with the other ways the IDE manages server controls.

    Perhaps if the use of a WebPartManager is optional, either allowing for developer preference or to leave flexibility for future functionality that doesn’t require its presence, a smart tag could be included that would create/configure a WebPartManager object, much in the same way datasource controls are created for data-bound controls (i.e., a SqlDataSource context menu being available when adding a GridView control).

    Second, the publicly-available examples indicate that a WebPartManager control can be included anywhere within a WebForm, and various sources make various recommendations about where best to place it.  Patrick Lorenz, in his excellent book “ASP.NET 2.0 Revealed”, suggests standardizing on placing the WebPartManager within a WebForm’s server-side <HEAD> tags.

    Personally, I think it’s rather eerie that the control is allowed to just sit anywhere on a page.  Some people may like this liberal, unregulated approach, but I tend to rely a bit more on discipline and rules in guiding my coding, so I’d prefer it if a WebForm’s WebPartZones were required to be encapsulated/wrapped physically by a WebPartManager control.

     <asp:WebPartManager id=“manager” runat=“server”>
      <webpartzone id=“zone1” runat=“server”>
       This is content for zone1
      </webpartzone>
      <webpartzone id=“zone2” runat=“server”>
       This is content for zone2
      </webpartzone>
     </asp:WebPartManager>

    Still, looking at the overall behavior of Whidbey, the datasource controls (e.g., SqlDataSource, ObjectDataSource, et al.) don’t require specific placement on a page, and likewise don’t wrap around the control(s) they bind to.  The general practice at this point seems to be placing them at the bottom of a WebForm.  If this de facto convention becomes the general practice, I won’t complain. 

    Just thought I’d throw this out.  Thanks for listening!

    What do you think?

    Read more...

  • Suggestion for ASP.NET 2.0: add an e-mail validation server control

    I sent this today to the ASPAdvice mailing list...comments?


    I’d like to make a suggestion for a new validation server control I’d like to see shipped as part of ASP.NET 2.0 – an e-mail validation server control. 

    I’d like to see a control that would internally contain a default regular expression (e.g., ^[\w-]+(?:\.[\w-]+)*@(?:[\w-]+\.)+[a-zA-Z]{2,7}$) that would validate the form of a given e-mail address, ensuring proper formatting.  Certainly this is possible by using a RegularExpressionValidator, but it would be great to see this come right out-of-the-box.  With all of the encapsulated functionality that 2.0’s rolling out, I think this would be a logical and very valuable addition to the feature set.
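
    In the meantime, the workaround looks something like this with a RegularExpressionValidator, using the expression above (the control IDs here are purely illustrative):

     <asp:TextBox id="txtEmail" runat="server" />
     <asp:RegularExpressionValidator id="valEmail" runat="server"
      ControlToValidate="txtEmail"
      ValidationExpression="^[\w-]+(?:\.[\w-]+)*@(?:[\w-]+\.)+[a-zA-Z]{2,7}$"
      ErrorMessage="Please enter a valid e-mail address." />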

    What I’m proposing is this: a control that would, by default, validate universally-used e-mail formats.  It would also contain a public property that would, if set, override the aforementioned default behavior and allow the developer to implement their own validation rule.  Additionally, the server control could, upon a submission attempt, make a network call and validate the existence of an e-mail address.

    Most developers who are savvy enough do this on their own all the time, or know where to find resources to help them build these types of controls (like http://regexlib.com), while less-experienced devs have a hard time with this and ask about it constantly.

    Thanks for listening!

    Read more...

  • Working on Web Parts, C# 2.0 Generics...

    Well, it took me long enough, but I've gotten here.  I've been working on different ideas for Web Parts...trying to develop some sort of anonymous personalization, mainly.  Basically, I'd like to give my users the ability to check out my site and reconfigure the physical layout of the different content areas, WITHOUT having to sign up for membership.

    Microsoft has said that this was possible in earlier builds, but the capability has since been taken out, most likely due to the fact that the SavePersonalizationData() method is now protected.

    Still working on it, but this is essentially what is wrapped by the page-level “EnablePersonalization” declaration:

    WEB.CONFIG
    =============
    <system.web>
     <anonymousIdentification enabled="true"/>
     <personalization>
      <profile>
       <property name="StartPoint" type="System.Int32" defaultValue="1" allowAnonymous="true"/>
      </profile>
     </personalization>
    </system.web>


    SOMEFILE.ASPX
    =============
    WebPartManager manager = WebPartManager.Current;

    for(int i=0;i<manager.Zones.Count;i++)
    {
     // get a reference to the current zone and its position on the page
     WebPartZone zone = manager.Zones[i];
     int currPosition = i;

     WebPartCollection wpc = GetWebPartsForZone(zone);
     for(int j=0;j<wpc.Count;j++)
     {
      // get a reference to the current web part
      WebPart part = wpc[j];

      // get the default position set in the Profile section of web.config
      int defaultPosition = Profile.StartPoint;

      /*
       NOTE:
       ======
       This would probably work a lot better if somehow the web.config file could be read like a dictionary...that way, each Profile property could be evaluated against each WebPart.
      */

      // if the web part's current position is not the default, then move it programmatically
      if(currPosition != defaultPosition)
      {
       // if there is a web part already placed at the beginning index of the zone, place this one lower on the control tree
       int index = 0;
       if(wpc.Count > 0)
       {
        index++;
       }
       manager.MoveWebPart(part, zone, index);
      }
     }
    }


    Read more...