Duplicate cleanup...grrrr

I just spent a bunch of time cleaning out a massive round of duplicate postings in NewsGator, presumably due to the updated .Text roll-out. It annoys me to no end that I have to waste time on this. Is this really necessary? Do other RSS readers handle it more gracefully, or is it a problem common to all readers? Does Atom improve things? Surely this is not an intractable problem.
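For what it's worth, the duplication likely comes down to item identity: most aggregators key each item on the RSS `<guid>`, which is often just the permalink, so when permalinks change every old post looks brand new. A minimal sketch of that logic (hypothetical data and function names, not NewsGator's actual implementation):

```python
# Sketch of naive aggregator dedup keyed on the RSS <guid>.
# When the guid is the permalink and the permalink changes,
# every previously-seen post reappears as "new".

def merge(seen: set, entries: list) -> list:
    """Return entries whose guid hasn't been seen; record all guids."""
    new = [e for e in entries if e["guid"] not in seen]
    seen.update(e["guid"] for e in entries)
    return new

seen = set()

# First fetch: one post, keyed on its original permalink (made-up URL).
first = [{"guid": "http://example.com/posts/100.aspx", "title": "Post"}]
merge(seen, first)

# After a URL restructuring, the same post carries a new guid...
second = [{"guid": "http://example.com/archive/2004/01/05/100/EntryName.aspx",
           "title": "Post"}]
dupes = merge(seen, second)  # ...so the unchanged post shows up as new
```

A reader that tracked a stable, URL-independent `<guid isPermaLink="false">` (or Atom's mandatory `<id>`, which is defined as a permanent identifier) would survive this kind of permalink reshuffle without duplicates.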

2 Comments

  • This was more my problem. URLs, especially ones called "permalinks," should not change.



    Under normal circumstances, I would not have made the change to the URLs, but since I turned on aggregator tracking, most readers would have seen all of the posts as new anyway, so I seized the moment to clean things up a bit :)



    Posts are now archived at Archive / Year / Month / Day / PostID/EntryName



    It sucks, but hopefully this will be the last time it happens.



    -Scott

  • And I thought it was just me having problems. I'm using SharpReader and have had this problem for quite a while now; I couldn't work out what was actually changing. An example is Dino's asp.net weblog. Even though I'm subscribed to his feed independently of the rest of the asp.net blogs, I still keep getting his posts duplicated most times I refresh. I get the same problem with Mr. Box's also.
