Subject: Does Microsoft Need a New Source Code for the Future?
Posted by: Dan (D…@discussions.microsoft.com)
Date: Thu, 24 Jul 2008
I want to start a new topic on this because the Biometrics debate has gotten
too long. I will now post Chris Quirke, MVP's reply to me about my thinking
that the 9x (Windows 98 Second Edition) code base should be part of the
internal Defense Network of this source code.
Chris Quirke, MVP says:
I think we have the same ideas, but weigh things differently and
reach different conclusions - you see the 9x code base itself as
being something to be preserved at all costs, where I see the
factors that make the 9x code base safer in certain respects as
something that should inform other code base development.
An interesting point from the article I linked for you, was the
difference between deeply re-architecting an existing code base,
and starting a new code base from scratch. I'd have thought such
deep design change would be as disruptive as re-coding from scratch,
but apparently this is not the case. If that's so, then it may be
practical to re-architect the NT code base as a true stand-alone
OS, which keeps networking out of the center as a discardable
subsystem, should unexpected risks demand that response.
I put it this way: exposed code surfaces are like points of wear
in a car. You don't merge piston rings into pistons (or brake
shoes into axles); you keep them separate so that when these
parts get worn, they are easy to replace. Same thing with code
surfaces; you may have to suddenly amputate or replace them,
so don't embed them in the core of how the OS works.
For example, an OS should be able to wipe its own butt without
RPC, and/or should not expose RPC to network surfaces (especially
the Internet). It shouldn't rely on RPC for internal operations, weld
this into Internet exposure, and then rely on a firewall as a
band-aid over this clickless, remotable risk surface.