“.NET Core is the future”, but whose future is that?

It’s likely you’ve heard about Microsoft’s release of the .NET Core source code, their announcement of ASP.NET vNext and the accompanying PR talk. I’d like to point first to two great articles which analyze these bits without being under the influence of some sort of Kool-Aid: “.NET Core: Hype vs. Reality” by Chris Nahr and “.NET Core The Details - Is It Enough?” by Mike James.

I don’t have a problem with the fact that the ASP.NET team wants to do something about the performance of ASP.NET today and the big pile of APIs they created during the past 12-13 years. However, I do have a problem with the following:

“We think of .NET Core as not being specific to either .NET Native nor ASP.NET 5 – the BCL and the runtimes are general purpose and designed to be modular. As such, it forms the foundation for all future .NET verticals.”

The quote above is from Immo Landwerth’s post I linked above. The premise is very simple, yet has far-reaching consequences: .NET Core is the future of .NET. Search for ‘future’ in the article and you’ll see more references to this remark besides the aforementioned quote. Please pay extra attention to the last sentence: “As such, it forms the foundation for all future .NET verticals”. The article is written by a PM, a person who’s paid to write articles like this, so I can only assume that what’s written there has been eyeballed by more than one person and can be taken as true.

The simple question that popped up in my mind when I read ‘.NET Core is the future’ is: if .NET Core is the future of all .NET stacks, what is going to happen to .NET full and the APIs in .NET full?

Simple question, with a set of simple answers:

  • Either .NET Core + new framework libs will gain enough body and will simply be called ‘.NET’, and what’s left is sent off to bit heaven: stuff that’s not ported to .NET Core or the new framework libs is simply ‘legacy’ and effectively dead.
  • Or .NET Core + new framework libs will form a separate stack beside .NET full and they will co-exist, like there’s a stack for Store apps, for Phone, etc.

Of course there’s also the possibility that .NET Core will follow the fate of Dynamic Data, Webforms, WCF Ria Services and WCF Data Services, to name a few of the many dead and buried frameworks and features originating from the ASP.NET team, but let’s ignore that for a second.

For 3rd party developers like myself who provide class libraries and frameworks to be used in .NET apps, it’s crucial to know which one of the above answers will become reality: if .NET Core + new framework libs is the future, sooner or later all 3rd party library developers will have to port their code over, and the rule of thumb is: the sooner you do that, the better. If .NET Core + new framework libs form a separate stack, porting is an optional choice and therefore might not be a profitable one. After all, the amount of people, time and money we can spend on porting code to ‘yet another platform/framework’ is rather limited compared to what a large corporation like Microsoft can spend.

Porting a large framework to .NET Core: how high is the price to pay?

For my company, I develop an entity modeling system and O/R mapper for .NET: LLBLGen Pro. It’s a commercial toolkit that’s been on the market for over 12 years now, and I’ve seen my fair share of frameworks and systems come out of Microsoft which were positioned as essential for the .NET developer at that moment and crucial for the future. .NET Core is the base for ASP.NET vNext and positioned to be the future of .NET, and applications on .NET Core / ASP.NET vNext will likely use data access against some sort of database. This means that my runtime (the LLBLGen Pro runtime framework, which is our ORM framework) should be present on .NET Core.

Our runtime isn’t small: it spans over 500,000 lines of code and has a lot of functionality, not all of which is considered ‘modern’. But not all of us develop new software: most developers out there actually do maintenance work on software which will likely be used in production for years to come. This means that what’s provided as functionality today will be required tomorrow as well. Add to that that a lot of our users write desktop applications, so our framework has to work on .NET full no matter what. The side effect is that what’s in our runtime will have to stay there for a long period of time, and porting it to .NET Core effectively means: create a fork of it for a new runtime and maintain the two in parallel.

I’ve done this before, for the Compact Framework, a limited .NET framework that ran on Windows Mobile and other constrained devices, so I know what costs come with a port like this:

  • research into what is not supported, which APIs act differently, what limitations there are, and which quirks / bugs to stay away from or take into account
  • features in the .NET framework aren’t there, so you have to work around them or provide your own implementation
  • APIs are different or lack overloads, so you have to create conditional compile blocks using #if
  • because not everything is possible on a limited framework, you have to cut features in your own framework, limiting usability
  • fewer or limited features in your own work mean you have to provide different documentation for these features to explain the differences
  • a different platform requires additional tests to make sure that what changed actually works
  • additional maintenance costs for support, as issues only occurring with the additional framework require specific setups for reproducing the issue
  • supporting a new platform isn’t a matter of a week but of a long period of time, as customers take a dependency on your work for years
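To make the conditional-compile point concrete, here is a minimal, hypothetical sketch of the kind of #if block such a port accumulates. The CF35 symbol and the helper are illustrations of mine, not code from the actual LLBLGen Pro runtime; the premise is that Enum.TryParse doesn’t exist on the Compact Framework and has to be emulated:

```csharp
using System;

public static class PlatformCompat
{
    // Hypothetical dual-target helper: on the full framework we can call
    // Enum.TryParse<T>; behind a conditional compile symbol for the limited
    // platform we fall back to Enum.Parse wrapped in a try/catch.
    public static bool TryParseEnum<T>(string value, out T result) where T : struct
    {
#if CF35
        // Compact Framework path: emulate TryParse manually.
        try
        {
            result = (T)Enum.Parse(typeof(T), value, true);
            return true;
        }
        catch(ArgumentException)
        {
            result = default(T);
            return false;
        }
#else
        // Full framework path: the API is simply there.
        return Enum.TryParse(value, true, out result);
#endif
    }
}
```

Every such block doubles the test surface: both branches have to behave identically, and a bug in one of them only shows up on that platform.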

For an ISV or for an OSS team these issues have a severe impact: they take time to resolve, and time has a cost: you can’t spend it on something else. In short: it’s a serious investment.

I’m not afraid to make these kinds of investments. In the past I’ve spent time on things like the following (time quoted is full-time development work):

  • Several months implementing DataSource controls for our runtime to be used in ASP.NET webforms. Dead: ASP.NET vNext doesn’t contain webforms support anymore. We still ship the DataSource controls though.
  • Several months on adding support for Dynamic Data in our runtime. Dead. We don’t ship support for it anymore. Customers who want it can get the source if they want to from the website.
  • Several months on adding support for WCF Ria Services in our runtime. Dead. We don’t ship support for it anymore. Customers who want it can get the source if they want to from the website.
  • Several months on adding support for WCF Data Services in our runtime. Dead, as the future is in WebAPI, which is now merged into ASP.NET vNext. We still ship the library.
  • Five months on adding support for Compact Framework. Dead. We don’t ship support for it anymore. Last version which did is from 2008.
  • Two months on adding support for XML serialization. Dead. JSON is now what’s used instead. We still ship full XML serialization support with multiple formats.
  • One month on adding support for binary serialization. Dead. JSON is now what’s used instead. We still ship full binary serialization support with an optimized pipeline for fast and compact binary serialization of entity graphs.
  • Several weeks on adding support for WCF services in our runtime. Dead, as the future is WebAPI, which is now merged into ASP.NET vNext. We still ship support for it.
  • Several months on adding support for complex binding in Winforms and WPF: Still alive, but future is unclear (see below). We ship full support for it, including entity view classes.
  • Almost a full year on adding support for Linq in our runtime. Still alive. This was a horrible year but in the end it was worth it.
  • One month on adding a full async/await API to our runtime. Still alive. This was actually quite fun.

That’s just the runtime, and the changes required to ‘stay current’ according to Microsoft’s roadmap for .NET and what’s required to build a ‘modern’ application for .NET. As you can see, lots of time was spent on stuff that’s considered ‘dead’ today but was very relevant at the time, or looked like it would become great soon.

One can imagine that with the experience from the past, I’m a bit reluctant nowadays when it comes to supporting new stuff; see it as a case of “fool me 10 times, shame on me”, or something like that. At the same time, things change, and if .NET Core is the future for both server and desktop, we have to abandon the current .NET framework and its features anyway at some point, so moving is inevitable. So what’s one more investment?

It’s not a simple investment

It’s not as simple as ‘one more investment, what harm can it do?’. The thing is that for a small ISV like us, it’s crucial to spend your time on the things that matter: if things fail, it might be fatal to the company. This is different from a team within Microsoft, which still gets a paycheck after a failed project: they move on to the next project, or even get a chance to rewrite everything from scratch. So from the perspective of a Microsoft employee, it might look like something that takes a month or two and then you’re all set for ‘the future’, and if everything fails, well, we’ll all have a laugh and a beer and move on, right?

No.

When you write software for Microsoft platforms you’ll pick up a thing or two after a while, and you’ll begin to see a pattern: within Microsoft there are a lot of different teams, all trying to get the OK from upper management to keep doing what they’re doing. The number of teams is so vast that they’re often not really working together but actually against each other, even without knowing it, simply because they have their own agendas and goals, known only within those teams. All these teams produce stuff, new technology, to gain both users and the attention of upper management. Some of these technologies stick around and gain traction; others fail and die off. The decision of one team might affect the future of another, but that’s part of the game: in the end it will all sort itself out. Perhaps both will stay, both will die, or upper management will step in and demand the teams talk.

We 3rd party developers look at what’s produced by all these teams and hope to bet on the technologies that stick around. Chances are (see above) that you’re betting on a crippled horse with one lung, and your investment is rendered void after a period of time. It’s therefore crucial to know as much as possible up front before taking the plunge and hoping for the best.

The investment to support .NET Core and ASP.NET vNext in our runtime is no different: I want to know up front why I am doing this, why this is the best investment for my time, time I can’t spend on new features for my customers. I don’t think that’s an unreasonable question.

“Sell me this framework”

So I want Microsoft to sell it to me. Not with PR bullshit and hype nonsense, but with arguments that actually mean something. I want them to sell me their vision of the future, and why I have to make the investment they ask from me. “Sell me this pen”, says Jordan Belfort in ‘The Wolf of Wall Street’, while holding up a basic ballpoint pen. It’s one of the many brilliant scenes in that wonderful movie which show how hard it actually is to sell something which seems trivial but isn’t. With their communication about .NET Core, Microsoft acts like the room full of sales people in the last scene of The Wolf of Wall Street, but they have to act like Brad, who picks up the pen and says “I want you to write your name on a napkin”, to which Jordan replies “But I don’t have a pen”. “Exactly. Buy one.”

It comes down to which future they mean with ‘.NET Core is the future’, and whose future that is. Will my customers who write desktop applications using Winforms or WPF be part of that future? Or will only ASP.NET users be part of it? It’s vague and unclear, and therefore uncertain. There’s contradicting information provided both through official channels and through unofficial channels (e.g. email), and it paints the picture of the Microsoft we have all known for many years: a group of teams all trying to do their best, providing value for what their team stands for, while we outsiders have to make sense of the often contradicting visions ourselves.

My wife said last night: “They don’t want us there, all they want is stuff they control themselves”. I fear she’s right (as always); I have never felt more unwelcome in the world of .NET as today.

Our future

So I decided to make my own future and see where that gets me. This means I’ll spend the time I would otherwise spend on a .NET Core port on new features for our customers, and will take a ‘wait-and-see’ stance with .NET Core. After all, our customers had and have confidence that what we provide is solid enough for their future, and that’s what matters to me, not necessarily what’s best for Microsoft’s future.

26 Comments

  • Sensible approach Frans.

    I do similar with learning MS technologies as they come out as well.

  • A deeply pessimistic post, but I can't deny anything you said, especially your (long) list of failed Microsoft initiatives. I too have been burned by many of these over the years.

    Still, though, I can't help but feel like something is different this time. It feels like Microsoft (and .NET) is going through a paradigm shift. I think you should consider the fact that the whole .NET Core is open source, and think about just what that implies will be different with this initiative. Once that cat's out of the bag, you can't put it back in again.

  • @Ron: we'll see what the future (whoever's it is) brings :) It's of course never too late to port over later on if it's required, but at the moment it's very uncertain how everything will pan out. I agree things are a bit different with .NET Core than they were with e.g. the Compact Framework or other smaller initiatives: they really want to get it done right this time. That's OK of course; the problem is that they can't simply, arrogantly, expect 3rd parties to immediately move to their latest thing: they've burned too many people.

    And perhaps in 2 years time we'll all laugh about this initiative like we tend to do now about WinRT and store apps and how well that went. ;) Cheers.

  • How I see this: as .NET is a mature framework without too many things left to be implemented, and it's no longer a critical selling point for Microsoft, no longer "the competitive advantage", Microsoft finally has enough resources for a big clean-up and refactor - so this could be called "the .NET reboot" or "the big .NET refactoring".
    Whether they will succeed in this endeavor or not, we will see.
    If it succeeds, it will be a shift as big as the one between the COM/DCOM era and the .NET era.

  • After WCF Ria Services was declared dead, shouldn't you just redesign the whole server architecture to be independent of the actual stack used? And serialization logic is just a format which one can switch on the fly depending on different factors... I still prefer binary serialization for production scenarios despite all that JSON buzz...

  • Well thought through post. We're working through this same chain with our products (Loupe and VistaDB). Our default position is that we want to be everywhere our customers want us to be - but this feels like Silverlight where we discovered the investment would be substantial, and at the expense of the backlog of things our customers were asking for.

    For me this is particularly disappointing because the story with both EF 7 and vNext seems to be a rush to market with no thought for the ecosystem that's made .NET a place to be for 15 years. The crazy thing is, I just don't see what the rush is for: the building's not on fire. I can understand if the vNext team didn't want to port some things that have clearly had their day and are done (.NET Remoting springs to mind, and WCF), and it doesn't even burn me that the Windows client libraries aren't there (even though I love me some WinForms) because I get it - there's no UI on the server core they're going for. But to not port over ADO.NET, all of the stream handling, all of the crypto handling, transaction support... Just what are people developing server apps to do if they aren't accessing databases, doing stream processing, and encrypting data?

  • @Lex: WCF Ria Services, Dynamic Data, WCF Data Services: support for those frameworks means building a layer on top of the ORM which provides model information to these services, accepts calls in the formats they provide and converts them into calls the ORM understands. The funny thing is that the model formats for these 3 frameworks are all different, yet contain about the same information. Shows how disconnected these teams are.

    Serialization is writing stuff to an output in a given format, true, but implementing e.g. XML serialization or binary serialization requires implementing an interface and doing the work internally, so the code to do that is substantial in some cases. It should be orthogonal to the ORM and live outside of it, but alas, that's sadly not the case in .NET. With JSON it starts to be that way; however, to do proper serialization of graphs of objects without getting every property serialized too requires implementing interfaces like ISerializable, so it's not simple, sadly...
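    To make concrete what "implementing an interface and doing the work internally" means, here's a minimal sketch (a hypothetical entity type, not the actual LLBLGen Pro code) of a class controlling its own serialized shape via ISerializable, so only explicitly added values end up in the output rather than every public property:

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class OrderEntity : ISerializable
{
    public int Id { get; set; }
    public string CustomerName { get; set; }

    public OrderEntity() { }

    // Deserialization constructor required by the ISerializable contract.
    protected OrderEntity(SerializationInfo info, StreamingContext context)
    {
        this.Id = info.GetInt32("Id");
        this.CustomerName = info.GetString("CustomerName");
    }

    // Only the values we explicitly add are written to the output;
    // a real ORM entity would also have to handle related entities here
    // without dragging the whole graph along unintentionally.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Id", this.Id);
        info.AddValue("CustomerName", this.CustomerName);
    }
}
```

    Multiply that boilerplate by every entity and collection type in a framework and it's clear why this code is substantial.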

    @Kendall: good points. About ADO.NET being absent on .NET Core: that's not the case; they just didn't port certain interfaces, and apparently there's no way to define a DbParameter at the moment and no DbProviderFactory system. So something is there, just not everything (so it requires work to port straightforward code that even worked on the Compact Framework ;)). It might be added before RTM though; it seems that if customers yell loud enough they'll port stuff over...
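    For readers unfamiliar with what's being discussed: on .NET full, an ORM stays provider-agnostic by going through the abstract DbProviderFactory API. A minimal sketch of that pattern (the helper class is illustrative, not from any shipped product); the factory instance would normally come from configuration, e.g. DbProviderFactories.GetFactory:

```csharp
using System.Data.Common;

public static class GenericDataAccess
{
    // Build a parameterized command without referencing any concrete
    // provider type: everything goes through the abstract factory.
    public static DbCommand CreateCommand(DbProviderFactory factory,
                                          string sql,
                                          string paramName,
                                          object value)
    {
        DbCommand cmd = factory.CreateCommand();
        cmd.CommandText = sql;

        // This is the piece the comment above says was missing on early
        // .NET Core: no factory-based way to create a DbParameter.
        DbParameter p = factory.CreateParameter();
        p.ParameterName = paramName;
        p.Value = value;
        cmd.Parameters.Add(p);
        return cmd;
    }
}
```

    Without CreateParameter and the factory system, every framework has to hard-code per-provider types, which is exactly the porting work described above.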

  • In summary:

    "Crap, another variant of .NET that will cost me time to support!"

    And it's true. Core will be very useful for end users, but for library/tools developers, it's going to increase your surface area. PCL will help a lot, but it's not going to be a silver bullet. It does help that what's kept in Core will maintain identical behavior to the vanilla Framework.

  • Some guy on HN said that in 5 years he was betting no one would be using .NET anymore and, along with other articles and posts I read, I'm starting to think he might be right.

  • Hi Frans,

    Good post. Let me share my 2 cents on the topic.

    I think .NET Core has to be viewed from the "Mobile-First, Cloud-First" angle, which is MS's strategy for the years to come, beyond Ballmer's "Devices & Services". We live in a world that is becoming more & more mobile, and mobile solutions are, most of the time, connected to the cloud. The trend when deploying services in the cloud is towards more lightweight, portable, modular software. The .NET Framework (the "full" version) as we know it is not well-suited for that. It was designed to run on desktop computers and on full-fledged servers, not on resource-constrained mobile devices or in lightweight cloud containers. If .NET is to survive in a mobile & cloud era, where it competes with other technologies that can talk HTTP/REST/OAuth/OpenID, it has to be redesigned from the ground up. MS is not setting the trends anymore; they're mostly playing catch-up, which is fine for us mere mortal developers trying to make a living with almost-bleeding-edge tech. MS is still wrapping things in a nice & approachable way though, e.g. Azure.
    So, if I were you, I'd look at the trends that are picking up outside of the MS / .NET ecosystem and that MS is gradually adopting. Those are indirectly setting the direction for the .NET ecosystem. That is: "lightweight portable server-side services", or call-them-microservices-if-you-like (e.g. Docker containers), NoSQL adoption (Azure DocumentDB will probably gradually become a first choice over Azure SQL), mobile-to-cloud data sync & analytics (Azure Mobile Services, RayGun, ...), and NoWindows/.NET (as in Not-Only-Windows/.NET: MS is embracing the fact that your C#/F# can and should run on Linux/OSX/iOS/Android/..., e.g. Mono/Xamarin).

    The point is, I don't think you really have the choice to support .Net Core or not.
    The .Net Framework IS dead. Long live .Net Core / Katana.

    Cheers,
    Steve

  • @steve: thing is: I followed a lot of 'trends' over the years (the (incomplete) list is in the blog post), and more often than not they turned out to be just a phase and things passed. I'm not denying things change; sure they are changing, but I see them more as an additional area than a replacement.

    Funny thing is: people on the JVM don't require a rewrite of their VM when they want to run their service/site in the cloud, but MS apparently does. You can also argue: if that rewrite is necessary, why not rewrite the stuff you can behind the surface of the existing APIs and runtime? That would mean perhaps you can't fix everything, but that's life: on the JVM they have accepted that and it doesn't hurt them in the slightest. The advantage is that users can keep using what they're using, instead of 1) waiting for the stuff to be ported and 2) waiting for the initial bugs in a v1.0 product to be ironed out.

  • Where did you read asp.net vnext will not support webforms? As far as I can see, it is still there: http://www.asp.net/vnext/overview/aspnet-vnext/aspnet-5-overview#webforms

  • @jogchem: webforms is present in asp.net vnext on .NET v4.6 (.NET Full). It's not present in ASP.NET vNext on .NET Core, as that's MVC only.

  • Thank you for writing this. I've built a career on surfing the Microsoft technology wave. It's been good to me. But make no mistake: having a nose for what'll last versus what'll disappear in a puff of smoke has been critical.

  • «Or .NET Core + new framework libs will form a separate stack besides .NET full and will co-exist like there’s a stack for Store apps, for Phone etc.»

    The .NET Native half of .NET Core targeting means that .NET Core is explicitly the future of Windows Store and Windows Phone. Those stacks have already converged a great deal and it seems fairly clear at this point, because of .NET Native, that the coming convergence at or around Windows 10 *is* .NET Core. The .NET Core announcements seem clear to me that we are going to be down to three stacks remaining: .NET Full, Mono .NET "Half-Full" and .NET Core (on Windows and on Mono and embedded in who knows how many verticals and gaming engines going forward, converged and shared).

  • I know it's easy to get jaded about .NET Core in light of the previous efforts that you note (and I recall similar efforts from Apple in the 80's, so there's that...) but I think one thing we can take some solace in is the fact that this solves a problem for MS. Their multi-platform problems are as serious as anyone else's, and if they want to be successful on phones, they have to have something like Core in their stable.

    And to that end, I think it's in all of our interests to keep telling MS that we're using X on Y, and hoping that they are doing the same. That's no guarantee of success, but it doesn't hurt.

  • Any real .NET dev can go see https://github.com/Microsoft/referencesource,
    then go look at your apps and ask which System.* namespaces you are using. I went to see for myself because, holy crap, it's way, way worse than anyone has said. It's basically a minimal ASP.NET setup, nothing more. And worse than that, WCF / aka the service model looks to be the new OData service that stole the WCF name, and MEF aka System.ComponentModel looks like a new RT version or something; it sure as hell isn't real MEF. The only reason I can think of for them to do something this obviously deceptive is that they didn't want people to complain about other pieces not being included. Didn't they think devs, of all people, would see through this?
    I guess if a blogger is qualified to be a VP, anything goes. I feel stupid for waiting till today to check GitHub. I hope they boo them off the stage at Build when they try to launch rev 3 of the Metro/Universal/Store/Modern 4-inch-to-80-inch-screen hot mess. I can hear the music now: "4 to 80, hear me now, 80 to 4, any size screen, woot woot". Or the new Cortana app maker: just tell Cortana what kind of app you want to make, Cortana and Azure do the rest; Azure can easily add your new app to our universal app store called "apPIIZZle", you can instantly start getting paid and also get feedback from your users via VS Team Foundation, we just can't wait to see all the amazing great things you can build in the new cloud-first mobile-first world!


    I think this is the real last straw for me!

  • you want certainty in life, you want stability
    but... well, life is not that simple; ultimately resources are scarce
    and people compete over them
    and things change every day

    you need to learn how to be more agile, how to adapt and accept
    change faster, rather than request that change become
    more predictable

    also, maybe you should start thinking about how to make the future
    instead of how hard it is to follow Microsoft into the future

    it's a bit cliché but true:
    "if you are not working on your dream, you are working to realise someone else's dream"


  • @Kendall:
    I understand the rush fully. Where are we?
    The domination of the Windows client is over.
    MS has no portability solution.
    On cloud terrain, Linux rules (even in Azure it has a big share).

    So what can they do?
    They have to quick-fix the client with Mono support.
    On the server side they must sell .NET on Linux.


  • wow, forgot to make the main point: the only connection between .NET Core and Xamarin is the connected people, companies, the cross-marketing program and the Connect event itself.

    there is no direct connection between .NET Core and Xamarin except the "Connect" event and marketing.
    and no one can build anything with .NET Core without a ton of work, especially (but not limited to) GUI. the most that could ever be done with what's there now is maybe super-simple ASP.NET on a system other than Windows; that's the totality of the whole thing, period!

  • OK, did more research. This is just a simple misunderstanding and a bit of a fumbled marketing message; it's not a threat to anyone or to your platform. Everyone can and should look for themselves.
    The current referencesource-master package is 120MB, 10,148 files in 555 folders; copying *.cs to a single directory yields 112MB in 9,746 files after eliminating duplicate boilerplate code. Sort by size: the top two files are about 2MB each and not really code. Start going down the list, and if you don't find something interesting, you are not a .NET developer. Some modules are over-commented, some very terse (the way I like it). It is so cool to see the inside of things you've only seen from the outside; for me this is better than open source, it's "opened" source. It's nice to see the source: I don't need to make my own .NET, I just need to see what's behind the curtains. If there's an issue, I will work around it, not try to fix the framework, at least not in the middle of a project.

    This may not be the greatest event in the history of .NET like many of us let ourselves believe, but it's not a disaster for anyone. Some will benefit more than others, but there is no reason for anyone to panic; all we have is either a little failure to communicate clearly from the parties involved in the Connect event, or a successful obfuscation. I tend toward the latter, but I also take into account that these are two companies doing some cross-marketing; it's not the breaking of some kind of Knights Templar open-source secret code. I like to be an informed consumer; I don't count on vendors being forthright to a fault. People have said that Soma put his career at risk and stood up to Sinofsky, so he has many years in the bank of goodwill with me.
    Before you get in a fight with your wife and head off cross-country to Nebraska to collect your million dollars from the book-club promotion, stop and think: it's a promotion, what am I missing here? I'm just glad I woke up before I hit Ohio. Open-source .NET Core is what I wanted to hear, so I didn't hear the "Core .NET" part. I'm not mad at anyone but myself; the only thing that's changed is that now I see http://blogs.msdn.com/b/dotnet/ for what it is.
    If you ended up here from Soma's last blog post: I now wonder whether the last comment shown is really the last one that got moderated, the one that pointed me here, hmmm. And why do most of the MSDN blogs show the number of comments, while others with long periods of no interest are totally functional but have comment counts hidden in summary view? Some people's names in the titles next to the framework names: just a little self-promotion by the head sales manager, no offense taken. The WPF blog was added back to the .NET Framework menu, but they can't get comments working; maybe the person who knows how to hide comment counts could help? Sorry, too busy moderating Soma's blog, LMFAO. Enlighten yourself, don't get angry, and remember it's ad copy, not a legal agreement.

    (Fixed again; sorry for the repost spam.)
    @Frans: great post, set me on my way by opening my eyes!

  • As such, it forms the foundation for all future .NET verticals.

    Note that a future .NET vertical is not the same as a future stack. .NET will not go away; although it may in the future be based on .NET Core, right now it is not.

  • @killerwhale/ali, whoever you are: you clearly know nothing about real .NET, which is what's being discussed here.
    You might try a fan site like WinBeta; I'm sure you'd fit right in. And if you're gonna use a quote, at least use more than a small piece that makes no point other than revealing poor, nearly unintelligible arguments.
    “We think of .NET Core as not being specific to either .NET Native nor ASP.NET 5 – the BCL and the runtimes are general purpose and designed to be modular. As such, it forms the foundation for all future .NET verticals.” From Immo Landwerth's "Introducing .NET Core". Take a look near the end of the messages.
    My first trip down the rabbit hole ended at EventDescriptor.cs needing InternalSR.ValueMustBeNonNegative;
    found it here:
    stangelandcl/pash-1/blob/master/External/System.ServiceModel.Internals/System.Runtime/InternalSR.Designer.cs
    An unresolvable external reference - do you know what that means? Good luck finding Immo; he's probably too busy on the pat-themselves-on-the-back-for-nothing tour.
    The verticals, as in vertical or specialized markets, being Phone (last gasp), Xbox (PC gaming is better, XBone) and ASP.NET web. The real .NET Framework isn't a vertical market; it's the rest of the world besides those items. You can call it the desktop if you want, if that helps you feel better about your Store apps' future. Everyone can view this any way they like; the way I see it, it's a way to put all the junk in one spot, one rat hole to please them all (wow, I should trademark that). Mainstream .NET Framework is going forward and being funded; it runs the real world, trillions of dollars, not millions of hits. Get some clues, kids; you're not taking over jack squat. Now run along like good hanselpuppets and get some more smoke blown up your... never mind. We will miss you in the dev threads, we will miss your endless wit and charm; now you can show the world what you can really do. Don't go away mad, just go away, bubbye!

  • I find it absolutely astonishing how many developers and non-developers do not see what the future of all this technology is. Even Dvorak who's been writing for 25+ years about computing technologies. Apparently few are versed in the two basic essentials in broadband terms necessary to have successful mass market technology.

    1. Bringing to the market what the masses want even if they do not know they want it yet.

    2. Being ahead of prospective competitors, preferably way ahead, in respect to #1.

    .NET Core, open-sourcing .NET, Microsoft providing free Express editions of its development environments: 15 years ago, news like that would have had developers fainting.

    What we all seem not to understand is the old market vs. today's market vs. the future market. They are all quite different. The old markets, in respect to technology, had many people who cared how things worked and worried about matters such as privacy, and who went shopping to find the perfect PC they wanted, and parts!

    Today's market is much more "it works". People buy based more on a brand name than on actual capabilities.

    The past had fractured markets. Many different things were happening in the early years; code hackers made things happen. A fine example is Apple's repeated mistake. The Mac was a better box for average users than DOS boxes were, but Apple deemed the Mac esoteric, priced it that way, and got its butt eaten by Microsoft. Why? Because the masses care about PRICE and care that it does what they want it to do. Android has more market share than iStuff because Apple made the same mistake AGAIN.
    If iPhone 4s were available unlocked and new for $79, more people would buy them than Androids. Everyone I know has a smartphone, and everyone who has tried my old iPhone 4 likes it far better than any Android phone they have owned.

    Regardless... What is the future?

    The near future is the unification of the so-called "smart technologies": tablets, TVs, phones, smart watches, smart glasses, laptops, desktops, home information centers (a TV with a PC built in), gaming, etc. All unified: access from any device, anywhere, anytime, and the "data casting" thereof. Cloud computing is nothing new; it has had many names in the past. The difference now is that in the future it will be folks' main data store, and they will be able to broadcast (if you will) not only to any of their own devices but to those of anyone they choose, on any device, even their smart automobile, boat, plane, what have you.

    Microsoft has not been concerned about Windows Phone being a fifteen-laps-down third-place runner, because they know that the other race teams (Apple and Google) cannot WIN the race. They cannot win because by the time the last lap comes they won't have the tools, the code, the unified processes, and the professional environments to create them. From Visual Studio one can create anything: PC applications, phone apps, web sites, games, and much more, all nice and consistent for user AND developer.

    Open-sourcing .NET and Mono are not just random events. Corporations the size of Microsoft or Amazon don't do random events; these are long-term planned strategies. Mono bridges to the Linuxophiles, giving them what they love along with C#, .NET, and more capability in one framework than everything in PHP or Java land, in one nice, professional, considerably nicer development environment that NOW produces code that is FAST. PHP coders are already jumping ship. I know many; in fact, 17 out of the 20 I know are already on .NET and learning.

    By the time the finish line of technology unification is a lap away, the Apple car will be sputtering, unable to make its bells and whistles chime and blow across every or any smart device. In fact, the .NET gurus will provide the third-party ware that makes the Apple car or Android car play nicely with said unification USING Microsoft's technology. That puts Microsoft in the driver's seat.

    Hardware technology has welded itself to the performance demands of the enterprise, and in the future it will do so even more. Amazon did not pour countless dollars into their AWS infrastructure just to serve downloads of purchased content. They did it to assure their position in that smart-device unification: they need not rely on some other entity and trust that it can support Amazon's market position.

    Sony stated that there will be no PlayStation 5, citing cost vs. return revenue. That's all feldercarb. The fact is they have no market partners that can integrate that technology into smart-device unification; worse yet for them, the entity that can happens to be their #1 competitor in video games: Microsoft. You are not hearing Microsoft state "this will be our last console"; instead you hear, "we want you to be able to play on your tablet, phone, laptop, anytime, anywhere".

    They mean it.

    Mono and .NET Core, and the subsequent shells to follow, are nuclear shells, even if few seem to see them as such. Microsoft is moving toward dominance of the future market, and these items are significant steps toward making that happen by pulling on the "open source / developers" community: removing entities such as PHP, Java, and myriad lessers by drawing their people off what they have been working on and with, toward .NET. It's HUGE.

    Toward that end, all of you who care, read and open your eyes. This is an EXCELLENT time to get .NET'n and C#'n NOW. As that unification occurs, there will be money, lots and lots and lots of it, to be had by the people smart enough to realize WHAT tomorrow's market will be. If you take your ideas, get them into working code, and put your timelines on the same track as unification's, you will see more money come your way than you ever thought possible.

    It's exactly what we are doing. However... there is more.

    The unification also spells regulation. Just as in the early days of TV and radio, anyone who could broadcast did broadcast. The unification of the technology will also spell the beginning of real regulatory measures by nations, and accountability for what people do, say, etc.

  • That's either funny as hell, or Rick is one messed-up freak.

  • First, thanks for your interesting article. Thanks also to @steve, who has made very good comments.

    My 2 cents: isn't the difference between this and previous 'failed attempts' that Microsoft is now acting out of _necessity_?
