Jeff and .NET

The .NET musings of Jeff Putz

March 2007 - Posts

Using an anonymous delegate in List<T>.FindAll()

If your experience is anything like mine, you've probably been using generics like crazy since .NET v2 hit the streets (or before, if you were a beta monkey). It's so much easier to manipulate objects, and especially strongly typed collections.

One thing I always found annoying, however, was that there wasn't an obvious way to find stuff in a List<T> object. For example, if you have a list of NavEntry objects that have Title and Url properties, there's no obvious way to pipe in some string and find all of your matches. The documentation tells you how to use a search predicate, which is not particularly useful in real life because you can't pass parameters to it. You can of course use a private variable, but it feels "hackish" (for lack of a better word) and is less elegant, as the sketch below shows.
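For illustration, here's roughly what that private-variable workaround looks like. It's a sketch, assuming a NavEntry class with a Title property as described above:

public class NavEntryList : List<NavEntry>
{
    // "Hackish" state shared between the method and the predicate.
    private string searchText;

    public List<NavEntry> GetItemsContaining(string text)
    {
        searchText = text;
        return this.FindAll(TitleContains);
    }

    // Matches the Predicate<NavEntry> signature that FindAll() wants,
    // but has to reach out to the field for its parameter.
    private bool TitleContains(NavEntry nav)
    {
        return nav.Title.Contains(searchText);
    }
}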

A friend of mine, much smarter than me, suggested using an anonymous delegate. So in my example, you'd create a class like this:

public class NavEntryList : List<NavEntry>
{
    public List<NavEntry> GetItemsContaining(string text)
    {
        // The anonymous delegate closes over the text parameter,
        // so there's no need for a private field.
        return this.FindAll(delegate(NavEntry nav) { return nav.Title.Contains(text); });
    }
}
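Usage is then about what you'd expect. A quick sketch, assuming a NavEntry constructor that takes the title and URL (your class may differ):

NavEntryList list = new NavEntryList();
list.Add(new NavEntry("Home", "/default.aspx"));
list.Add(new NavEntry("Forum Home", "/forums/"));

// Both entries contain "Home" in their titles, so both come back.
List<NavEntry> matches = list.GetItemsContaining("Home");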

I like the way that rolls much better. You can swap Title for whatever property your object has, and even create a method for each property if you'd like. You can probably refactor this even further. I've seen a third-party class library that does all kinds of neat stuff like this, though the name escapes me at the moment.
 

Posted: Mar 23 2007, 12:56 PM by Jeff | with 39 comment(s)
Loving Apple TV

In today's Kool-Aid drinking exercise, I have to say that I really like the Apple TV. I bought it yesterday at my local Apple Store.

Scoble likes it too. What's fun about his post is the comments that follow. There's one comment in particular that summarizes exactly how I feel about it. If you look at the box as an iPod that syncs wirelessly with iTunes, uses your stereo instead of headphones, and your TV instead of its own screen, then you understand exactly what it's for. And relative to the cost of an iPod, it's a pretty solid value. Like the iPod, it "just works."

As is the case with everything Apple has done in the digital media space, the idea is to find the simplest way to make things work. They already had a pretty good model in place for this when they launched the iPod, and the same paradigm works pretty well for video delivered over the Internet. It's relatively idiot-proof.

Geeks want it to do this and that, or be like their Xbox 360 or XP Media Center or whatever, but honestly, none of those things is as simple. I'll concede that you have to live at least to some degree in an iTunes universe, but as someone who does, that's a non-issue for me.

I dig it. 

Code in Notepad? No thanks.

I've been known to drink the 37signals Kool-Aid now and then, because I think that in the bigger picture they have a lot of good ideas about user interface design and process.

But I have to call out the crap too, and this post, which compares using a camera in manual mode to writing code in a plain text editor instead of an IDE, is about as crappy as it gets.

I don't know if it's just old-skool people or what, but writing code in a text editor doesn't get you a badge of honor or a gold star. Nobody cares. You want an analogy? Using a text editor instead of an IDE because you want to be "in control" is like rubbing two sticks together to make a fire when you have a flame thrower available to you. Why waste your time? If you know how to use the flame thrower, you can do a whole lot more, and do it faster. 

The strange Mac vs. Windows holy war

A new OS X version was rolled out yesterday, which includes a number of security fixes. This has of course prompted the usual rash of "your OS suX0rz!!11" comments on Digg and various other places.

Before I get to my observations, let me just say that I really like Macs, and OS X in particular. I switched a year ago, and I've generally enjoyed using a computer more than I used to. Everything is simpler, and the OS rarely gets in the way of things I want to do (like getting photos off my camera, finding a network printer, or connecting to mysterious Wi-Fi spots). The greatest thing about OS X is that you hardly notice it. Oh, and I do love the hardware too. It's pretty and functional.

Then I go to work and deal with the constant disk churning I can't explain, reboots every couple of days, and so on. I tend to wonder why it is that my Web server just runs and runs, but I suppose that makes sense, since all it has to do is run the same half-dozen processes all day. A personal computer has a lot more to do, opening and closing stuff.

The security update in OS X 10.4.9 covers a lot of really obscure stuff, much of it requiring you to have access to the computer. There are a few items I'd say would concern me, namely the disk image stuff and GIF previewing, but most of it wouldn't even be on the radar of things that I'd worry about.

But for a moment, let's talk about what "security" is. To me, being secure means nothing is going to happen to me. If I'm locked in my downtown apartment, I feel pretty secure. If I'm locked in my farm house in the middle of nowhere, I'm even more secure because there are a lot fewer things that can harm me. Critics will argue that part of the reason OS X has no viruses or spyware is because it's used by a small (although growing 30% year-over-year) share of the market. I don't deny that, but by my definition, that still makes me more secure.

Generally speaking, I feel that OS X is less vulnerable because you have to take deliberate action to install something. Nothing goes in without my password. And it's not annoying like Vista's "confirm or deny" feature, which is inevitably going to get shut off. That prompt is more annoying than "training" Zone Alarm was back in the day, when you first installed it.

Where I think Windows really got it wrong is that Microsoft never had the nuts to just start over, because of compatibility concerns. Having used OS 9 back in the day, I can say it had a good interface, but the mess of "extensions" and other crap made it a dog too. Starting over did wonders for the new operating system. Vista is still trying to nurse ancient software that most people will never use. I don't know if all that bulk is what makes Windows crawl at times, but I'm sure it doesn't help. I know that compatibility is a concern you can't ignore, but when I look at how relatively lightweight the .NET Framework is, and how you can write software against it, the Win32 world seems like a huge wasteland of bloat.

People are nuts for the Mac because it mostly delivers on the promises made by Steve's version of reality. It's certainly not perfect, but in the last year, the only thing I can remember encountering that frustrated me was that the wired LAN wouldn't take priority when I shut off the wireless. I can rattle off two or three things about Windows in just the last week that have annoyed me.

I don't know what makes an OS "better" than another, but for my money, it all has to do with how little I notice it.

Made the switch to Subversion, back to NUnit from VSTS

After about two years of using SourceGear's Vault, I switched to Subversion for source control. I was using Vault because it was free, Web-based, integrated with Visual Studio, and generally familiar.

Now I need to get more people into the loop on a particular project, and that makes Vault not free anymore. Since I've managed to hold down a day job for more than a year (and actually like it), I've had lots of exposure to Subversion, and I really dig it. It's "free," it's fast, and since the popular TortoiseSVN is a shell extension instead of yet another client, it's very easy to use. You just need to remember to add and delete files from Explorer, rather than simply deleting them from your Visual Studio solution explorer, so Subversion can track the changes. Branching and merging are like magic.

I have to tell you, though, getting there was not easy. Subversion's daemon was not something I felt could work right on a Windows server with IIS, so I installed Apache instead. After a lot of messing around, I did get it to coexist with IIS. The world of config files is a strange one for us Windows monkeys, and even stranger for those of us with the minimum set of skills to make a Web server work. It's a good example of why I tend to steer clear of open source software, but Subversion is that compelling. If people with time and inspiration went beyond the installer builds (and don't think those make success easy), the entire world would switch and never go back.
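For what it's worth, here's a rough sketch of the httpd.conf entries involved, assuming the mod_dav_svn bindings are installed and Apache is bound to a port other than 80 so it can coexist with IIS. All paths and names here are made up for illustration:

# Keep Apache off port 80 so IIS can keep it.
Listen 8080

# Modules that ship with the Subversion/Apache bindings.
LoadModule dav_svn_module   modules/mod_dav_svn.so
LoadModule authz_svn_module modules/mod_authz_svn.so

<Location /svn>
    DAV svn
    SVNParentPath "C:/svn/repositories"
    AuthType Basic
    AuthName "Subversion repository"
    AuthUserFile "C:/svn/htpasswd"
    Require valid-user
</Location>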

I also went back to using NUnit instead of the VSTS testing suite. My reasoning at first was just that the UI around VSTS testing sucks. It seemed backward to me from the start. When I gave Resharper a try and saw the UI that integrates with NUnit, that was the end of it. I'll be buying Resharper as soon as my trial expires, for sure. The only thing I miss from VSTS is the PrivateObject class, but I can generally get along without it.
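For the curious, an NUnit test (2.x attribute style) against the NavEntryList class from the earlier post would look something like this, with the same caveat that the NavEntry constructor is hypothetical:

using System.Collections.Generic;
using NUnit.Framework;

[TestFixture]
public class NavEntryListTests
{
    [Test]
    public void GetItemsContainingReturnsOnlyMatchingEntries()
    {
        NavEntryList list = new NavEntryList();
        list.Add(new NavEntry("Home", "/default.aspx"));
        list.Add(new NavEntry("Forums", "/forums/"));

        List<NavEntry> matches = list.GetItemsContaining("Home");

        // Only the first entry's title contains "Home".
        Assert.AreEqual(1, matches.Count);
        Assert.AreEqual("Home", matches[0].Title);
    }
}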

Anti-virus scam

Anti-virus software is a $4 billion a year industry. Can you believe that? That's a lot of cash.

About a year ago, I was annoyed since my McAfee subscription was about to run out. I remember a day when the software makers didn't charge for updates after a year. But now, you can't avoid it. I was reminded of this because I sold my Dell laptop to a new friend, and she's annoyed that it costs so much to keep it protected.

I say "reminded" because I have a Mac now. I'm one year virus free because Macs don't have virus issues.

Yet another reason I'm glad that I switched.

Caching, SQL CLR and code monkey kingdoms

Prior to the release of SQL Server 2005, there was a lot of chatter about SQL cache invalidation. Then once it was released, the chatter just kind of stopped. If any of it actually shipped, hell if I can find any documentation on it. A quick look through the stacks at my local Borders turned up nothing in the SQL or .NET books. This page says what it does, but it sure seems a little vague. It lacks context, and I wouldn't leave any performance implications to chance.

Another thing that has largely faded into obscurity is the SQL CLR implementation. This one is clearly a cultural issue, and not one lacking documentation and books. Bring up CLR code in any traditionally structured organization, and the database guys will immediately be talking about how they won't let code run on "my" server. 

All that said, there are some interesting possibilities in using the CLR for things like caching beyond the limitations of SqlCacheDependency. The existing implementation will trigger invalidation just because you incremented one number in a record that has a dozen other values. Seeing as the application knows it made that change, it's silly to throw all of that data away.
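For context, here's roughly what the table-level flavor looks like in ASP.NET 2.0. This is a sketch: the database entry name and table name are made up, and it assumes a <sqlCacheDependency> section is already configured in web.config:

using System.Web;
using System.Web.Caching;

public static class NavCache
{
    public static void CacheEntries(HttpContext context, object entries)
    {
        // Invalidates the cached item when anything in the Nav table
        // changes, even a column the cached data never touches.
        SqlCacheDependency dependency = new SqlCacheDependency("MyDatabase", "Nav");
        context.Cache.Insert("NavEntries", entries, dependency);
    }
}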

In a single Web server situation, of course you don't need to worry about this, because you can easily roll your own cache scheme since the box knows about everything that you'll ever do to the data. The Web farm situation makes it more interesting, and that's where I can see some of these great opportunities.

I thought architect types would really poo-poo this kind of thing, but in talking to people much smarter than me, I'm surprised to find they're very open to the idea. Again, the barriers seem more cultural than anything. Computer-sciencey types dig it.

Do you have any stories of adventure using the SQL CLR? 
