March 2005 - Posts
I couldn't find a review, comparison or guide for CDEx settings when using the LAME encoder, so I did some playing around today. Before today I just set CDEx to a bitrate of 192 and forgot about it. Now I'll be going back and re-ripping a bunch of discs that I actually like to hear, because I expect my MP3s will be around longer than a bunch of my CDs, and they're a whole lot handier.
Since my desktop speakers suck I tested with my daughter's Sennheiser HD437 headphones (a good deal for the sound), my old Sony MDR-AV5s (don't look, they don't exist anymore), and then my Sony in-ear earphones (MDR-EX51LP). First conclusion: the Sony in-ear earphones aren't as good as I thought; fine for the laptop when EQed, not so great when compared with decent headphones. Though it doesn't affect the results here, the old Sony set was the best sounding and the Sennheisers (surprisingly) ran a hotter signal.
The computer has a garden variety sound card, the CD drive is actually a generic CD-RW/DVD-R, and I played everything through WinAmp with a flat EQ.
I used the Breeders' old Safari EP for the test -- Do You Love Me Now? and Safari were the tracks I picked. This EP was originally recorded and engineered on tape and later encoded for the CD, and for this purpose that means there's more to hear; on a good system you can tell where channels and effects punch in and out, and background sounds from the studio aren't edited out. When you can hear Kim Deal lick her lips between notes, that's a good thing.
These will be my settings until I find or hear something new...
Thread Priority: Above Normal.
It doesn't make a difference to the sound, but does make ripping go faster.
Version: MPEG I.
Set it to anything else and the bitrate maxes out at 160.
Bitrate Min/Max: 192/320.
Set to VBR and you get a range instead of a single bitrate setting. I first tried 160/320; raising the minimum to 192 made the sound noticeably warmer and better defined the nuances. To me, that was worth whatever size it added. I later tried 256/320 and the difference wasn't as impressive. Though the actual bitrate varies, Winamp reports a single value, which I'd guess is the average: the tracks I tried reported 238 and 268 at 192/320 (about an extra 1 MB per file), and 285 and 286 at 256/320 (about 0.5 MB more). So the reported bitrates went up by 47 and 18 respectively, and while the sound was closest to the original CD, I didn't hear anything new and the difference didn't justify the extra storage. 192/320 is where I'll live.
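The storage math is easy to sanity-check. A minimal sketch of the estimate (the 4-minute track length is my own assumption, and it treats the Winamp-reported value as a constant average bitrate):

```python
def mp3_size_mb(avg_bitrate_kbps, duration_s):
    """Estimate MP3 file size: kilobits/s * seconds, divided by 8 bits per byte."""
    return avg_bitrate_kbps * 1000 * duration_s / 8 / 1_000_000

# A hypothetical 4-minute track at two of the average bitrates Winamp reported:
for kbps in (238, 285):
    print(f"{kbps} kbps -> {mp3_size_mb(kbps, 240):.2f} MB")
```

At these numbers the step from a 238 average to a 285 average costs a bit over a megabyte on a 4-minute track, which is in line with the per-file differences above.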
Mode: Stereo.
Definitely preserve the original channels. The alternatives save space in ways that seem to me like saying "sometimes mono is just as good."
Quality: Very high (q=0).
At min/max bitrates of 160/320 and a Quality setting of High (q=2), I saved about 200k per song vs. Very High, but there's a slight choppiness that comes off like stair-stepping in the wave. At 256/320 the file sizes were identical, so this seems to be a relative setting that may not come into play at high bitrates. If it does make a difference, I'll take the setting that isn't liable to sound choppy at lower bitrates.
On-the-fly MP3 Recording: Disabled.
This copies the track to the hard drive before running the final conversion. When enabled you get occasional stutters in the output, and I suspect it's worse with badly scratched CDs since you're relying that much more on the drive mechanism to stay in sync with the codec.
VBR Method: VBR-MTRH.
Variable bitrates mean that extended silence takes less space to store than a busy section. Rather than rip everything at 256 or 320, this sounds like a good idea to me. VBR-MTRH was indistinguishable to me from VBR-New, and produced the same file size. It's a hybrid of VBR-Old and VBR-New, so why not.
VBR Quality: VBR 0
Lower is better. One of the few things I didn't bother to test changes with.
With these settings and the Sony headphones I could hear as much in the recording as I'd expect from a good set of monitors -- mistakes, breaths, the natural reverb of the room on the vocals, gating on the drum mics -- that's as much information as I expect from an inherently lossy format like MP3, and better than I expected.
Something I didn't expect happened when I let the CD run from Safari (track 3) on to track 4. I've long complained that when listening through a lousy CD player -- and this one's obviously designed for data, not music -- everything seems to stutter, like the sound really is delivered in digital chunks one after another rather than as a smooth wave. So I ripped track 4 with the above "best-of-all-worlds" settings too, and the stutter disappeared. Whether the stutter was caused by a scratch or a lousy mechanism I don't much care; reading from a reliable medium solved it. If all it means is that you're better off ripping to a drive than playing CDs through a computer, then hey, I learned two new things today.
The page execution cycle can be a difficult thing to master. Most issues with an event not firing or form data being lost during postback can usually be solved by better understanding the lifecycle. I've constructed a few targeted searches to help out.
ASP.Net 1.x Page and Control Execution Lifecycle
ASP.Net 2.x Page and Control Execution Lifecycle [Diagram by Léon Andrianarivony]
A few things to look for when events don't fire.
[Updated: 2006-06-27 Added diagram, thanks Ambrose]
Feedmap shows where you're blogging from, and who else is blogging nearby. Nifty. Too bad .Text blocks the script block required for the inline map; I'll paste in a copy when the inclination strikes.
Almost as fun, this links to a map showing who else is in the neighbourhood...
To recap, lazy programming is not necessarily the easiest path in the short-term. The lazy path is the most efficient in the long-term to understand, reuse, maintain, and extend. Over time, the lazy paths waste the least time, money and energy. Being perfectly lazy often requires some hard work up front to ensure these long-term goals are met.
Law: On a long enough timeline, every possibility becomes a probability.
It is important to understand both the intended longevity and the likely longevity of every action. Though every action is permanent, "longevity" is the length of time you can be reasonably held responsible for the effects of an action.
When the longevity of a product is both intended and likely to be measured in weeks, your approach should be completely different than when the longevity is expected to be measured in years. This is because on a long enough timeline, every possible event becomes a probable event.
On a brief timeline it is reasonable to expect fewer events, and so design does not need to consider as many probabilities. The total effort spent on documentation, development, and the handling of unlikely events should be appropriate to the timeline: lower than it would be for a long-lived product.
When the law is ignored, either too much effort goes toward short-lived actions, or too little effort goes toward long-lived actions. In the first case, the waste is obvious and immediate. The second is an order of magnitude more expensive: problems stay hidden longer and are compounded by other changes, and therefore become more complex to solve. When the same problems are considered from the beginning, you can either build resistance into the product itself or plan a course of action should a problem arise, and the design of the product allows for an efficient solution.
See also: Risk Management, Murphy's Law.
View all posts on Lazy Programming.
Google has the best implementation of digital maps and directions so far. They've nailed it. To grab a link so someone else can see what you're looking at, it's not three steps, it's one step. When you get directions and want detail for the stages, you don't have to click back and forth from a big view to small views, you can click on each stage and an inset balloon pops right over the map (until they fix it, click on the words, not the stage number). In fact, you never really leave the main page, all the processing to search and update your views is done with fast xmlhttp, not page loads.
Here's the best part. Run a search like Italian Restaurants in Toronto, Ontario and use the slider on the left to zoom in. You can click and drag it, it behaves like it should. Now click and drag right on the map. Cool eh? Now zoom down to the street level, pick a street or highway and follow it by dragging. Heck, zoom out a bit and drag your way all the way across Lake Ontario. Hey I'm in the U.S. and didn't need to go through Customs. You can take whole virtual trips like this. Enjoy.
Now imagine if this was connected to a service like MSN Messenger, and you could form a tribe with your friends and leave post-it notes to each other on street corners. Imagine if I could share my location information with you through my GPS phone. Imagine if the transit company used GPS on its vehicles to post real-time maps. When's the next bus getting to the corner? There it is, better leave now. Imagine if instead of little pop-over balloons for those Italian restaurants, people could submit photos of the buildings themselves. Think virtual billboards. Think Sim City. Only this time, it's not simulated anymore. Damn, maybe this Internet thing really is just getting started.
The problem with portals is that they require tending. Whether you're building a developer hub for .Net or a launchpad to find recipes, it takes a human to moderate, tend and prune. What if a hub contained well-designed searches instead? When the design goal is to return a set of possible solutions, why not create a self-maintaining solution?
The experiment began with my blog entry yesterday. Rather than provide a specific link to content, I'll generate a Google search that happens to return what I might have linked to as its first result. Aside from giving the user more than one choice, it never gets stale, and never requires human maintenance.
The exceptions are specific references, like "my blog entry yesterday." Click on that title and you don't want all of the weblogs.asp.net posts from yesterday, you want the single thing the link refers to.
The truth is that designing effective queries remains a specialised skill. It adds value for those who don't have the time or the interest to develop the skill. On a knowledge hub or portal there is also value in writing, developing, and publishing original content, but on pages where the point is to provide a content index, the future is in linking topics to targeted searches.
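The mechanics of the idea are simple: instead of hardcoding a link, generate a search URL whose query is designed to surface the intended content. A minimal sketch, assuming Google's standard `q` parameter and `site:` operator (the query and site here are hypothetical examples, not tested "first result" queries):

```python
from urllib.parse import urlencode

def search_link(query, site=None):
    """Build a Google search URL; optionally constrain it to one site."""
    q = f"site:{site} {query}" if site else query
    return "http://www.google.com/search?" + urlencode({"q": q})

# A link that stays fresh as the target content changes, instead of a static URL:
print(search_link("page and control execution lifecycle", site="msdn.microsoft.com"))
```

The design win is exactly the one described above: the link never points at a moved or deleted page, because the result set is recomputed every time someone clicks.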
Like it? Don't like it? Got a way to improve it? Feedback.
[Note: Edited March 19, same content, more concise.]
I made the sojourn to Kate's EoT.NetUG (you can tell by the sheer length of the acronym she's a C++ guru), and had a blast. Not only did she bring in the deep expertise of Sam Gentile for tonight's meet, but had the foresight to book an LCBO-sanctioned hall, so once the main seriousness was complete it became an instant social occasion. Kudos.
Sam's presentation on .NET Generics was great. I was lucky enough to learn the concept straight from Anders (with about 30 others), but at the time a few key pieces didn't sink in on how they drill down through MSIL and the CLR. And as Sam also reminded the group today, until you get it at the CLR level, you haven't quite got it. Sam helped me finally get it.
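The core concept is the same in most languages now: one definition parameterized over the types it operates on, type-checked per instantiation. Here's the idea sketched with Python's `typing` module purely for illustration -- this is not .NET, and it lacks the CLR-level runtime specialization that Sam's talk drilled into:

```python
from typing import Generic, List, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    """A stack whose element type is a parameter, not hardcoded."""
    def __init__(self) -> None:
        self._items: List[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

ints = Stack[int]()   # one definition, instantiated for int
ints.push(42)
print(ints.pop())
```

The contrast is the interesting part: Python's annotations are checked only by static analysis tools, while the CLR carries the type parameter down into MSIL and generates specialized code at runtime, which is why "getting it at the CLR" matters.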
Sam brought something else full-circle for me. The research project that developed the concept of Generics for .NET was called Gyro and it was developed by a couple blokes from MSFT UK. Their original paper remains the blueprint, and it's next on my reading list. I've skimmed through, it's straightforward and readable, and you should check it out too. And it was made possible by something called ROTOR.
In Spring 2002 I first heard of this thing called ROTOR from Brad Merrill. Brad was excited about ROTOR (as memory serves) because it helped people understand the CLI, especially those people who wanted to write their own .NET implementations for different languages (Brad's a linguist by nature as much as by schooling). Or new languages. Or extensions to existing languages. There was no precedent for this open sourcing of significant IP at MSFT. Many saw it as a risk, battles were fought, but making the CLI an ECMA standard and releasing ROTOR happened and we now enjoy the fruits, including Generics.
What does this mean to you, intrepid reader? Well, if a research project built on ROTOR turned into an integral v2.0 feature like Generics, just where do you think the ideas for v3 are being sown? And you thought blog-watching was busy work. Enjoy. Sam, thanks. It was great to finally meet.
I'm happy to announce that the Toronto SharePoint User Group will have its first meeting on April 13.
I'd especially like to thank the folks at CDI Education (my employer) who came forward to support this community effort by providing the essentials (aka two hours of food and shelter for TSPUG members) as well as helping to get the word out.
Inaugural Meeting, April 13
Sign-in, Eat, Meet 'n' Greet
My User Group: An open discussion about the direction of the group. What would you like to see?
An Introduction to SharePoint: On this tour of SharePoint Products and Technologies, attendees will learn what SharePoint is, how it works, and how it can be customised and extended. Presenter: Eli Robillard (CDI Education)
For more details and to RSVP for a place on the guest list, head over to the site today: http://tspug.com/
See you there!
[Updated: 2005-03-15: RSVP now only available through the website to make it easier to notify everyone when the guest list is full.]
It's amazing that the popularity of RSS is still mostly restricted to techies and high school bloggers. Don't believe me? Ask your mom. What this probably means is that there is a large surge yet to come as aggregators are built into apps people use on a daily basis (like Outlook) or become a dedicated feature of MSIE. Or maybe the status quo folks are happiest reading from pre-formatted websites. When a friend asked about RSS, I put together a quick list of sites to get started.
What is RSS? In a nutshell, RSS gives you the ability to build your own daily newspaper. Using either a piece of software or a website called an Aggregator, you select the news sources you want to read, and the aggregator automatically brings it to you whenever something new is published. News agencies, magazines, webloggers (aka bloggers), and many companies now make content available through RSS newsfeeds.
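Under the hood, an aggregator is just software that periodically fetches an XML document and pulls out the new entries. A minimal sketch of the parsing half, assuming a plain RSS 2.0 feed (the feed text here is a made-up example, not a real newsfeed):

```python
import xml.etree.ElementTree as ET

def item_titles(rss_text):
    """Pull the title of each <item> out of an RSS 2.0 document."""
    root = ET.fromstring(rss_text)
    return [item.findtext("title") for item in root.iter("item")]

feed = """<rss version="2.0"><channel><title>Example News</title>
<item><title>First headline</title></item>
<item><title>Second headline</title></item>
</channel></rss>"""
print(item_titles(feed))
```

A real aggregator adds the newspaper part on top: a list of subscribed feed URLs, a fetch schedule, and a record of which items you've already seen.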
The coolest new thing to happen with RSS is Podcasting. Rather than a newspaper metaphor, podcasting is more like a subscription to an audio magazine; new audio programs show up on your computer or iPod as they are created. It's only a matter of time before the same happens for video.
RSS Feed Directories
Search RSS Feeds
Start your own weblog
Host your own weblog
[2005-03-21 Added a "What is RSS" section for Jim Martin's Mom and the bit on Podcasting.]