Les wrote:
Of course the churning that is going on could be related to the
advent of high-speed DVDs, Bluetooth, USB, USB 2.0, FireWire and RAID,
all of which are still undergoing the kind of development and
consolidation that IDE went through in the late '80s and '90s.
And all of which could be isolated in driver modules behind an
unchanging kernel interface, so the OS could do its Unix-like job of
presenting every device as a stream of bytes without everything else
having to change every time a new piece of hardware is invented.
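To make that concrete, here is a minimal sketch (my own, not anything from the thread; the device paths are only examples) of what the stream-of-bytes abstraction buys you: the same POSIX read() loop copies data whether the path names a regular file, a DVD drive, a USB stick or a RAID volume, and the driver differences never leak into the program.

/* Sketch: the same byte-copy loop works on any file or device node,
 * e.g. ./cat-bytes notes.txt, ./cat-bytes /dev/sr0, ./cat-bytes /dev/md0.
 * No device-specific code anywhere; the kernel interface hides it all. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    char buf[4096];
    ssize_t n;
    int fd;

    if (argc != 2) {
        fprintf(stderr, "usage: %s <file-or-device>\n", argv[0]);
        return 1;
    }
    fd = open(argv[1], O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }
    while ((n = read(fd, buf, sizeof buf)) > 0)   /* just bytes */
        write(STDOUT_FILENO, buf, n);
    close(fd);
    return 0;
}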
Someone said the only constant is the rate of change, which to those of
us old enough comes across as logarithmic. Which just emphasizes the
need to stabilize the layers so they can change independently.
I know that this doesn't add much, but think about this. My first
computer was one printed circuit board. It had 2K of memory, an 8080 (I
think), and four LED number displays with a hex keypad. I had voice
output (duty-cycle operations on a bit driving a speaker) and tape
storage on audio cassette at about 8K baud that I wrote myself. It ran
at a whopping 2 MHz, I think. (I had some bits before that, but they
were mostly just experiments with soldering chips together on protoboard.)
About the second thing you should have learned from that 8080 system was
that if you wrote something in its native assembly language and used
unique features of its OS (if it had one), you'd have to throw it all out
as soon as that hardware was obsolete (well, I suppose you could have
moved to a Z80 for another year when the designer moved from Intel to
Zilog and kept the opcodes backward compatible...).
I am writing this on a system with over 800 GB of storage, dual
processors running at 2 GHz, 1 GB of memory, over a network that runs at
multiple gigabits at least part of the way, and all of you can see it and
read it all over the world in seconds if things go well.
I hope somewhere along the line you started writing in C or a similarly
standardized language and using an OS that presented a standard
interface so your own work was no longer obsoleted along with the platforms.
That is the past 30 years. What will full immersion and higher
technology bring in 30 more years? Any guesses? How will the
processors, operating systems, networking and hardware keep up with
that? Will we be part of the next three decades of development?
Along with that, I like this quote:
"Faced with the choice between changing one's mind and proving that
there is no need to do so, almost everyone gets busy on the proof."
— John Kenneth Galbraith
I like to think I am on the changing-mind side of that quote...
The problem with change is the amount of baggage that goes along with
the actual piece that needed the change. If you standardize interfaces,
you can change any single piece without throwing the rest out.
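In code terms, that idea can look something like the sketch below (the names are made up purely for illustration): callers depend only on a small, stable interface, so the implementation behind it can be replaced without touching anything that uses it.

/* Sketch: a stable interface as a struct of function pointers.
 * Anything written against block_ops keeps working when a new
 * backend is dropped in; only the table of pointers changes. */
#include <stdio.h>
#include <string.h>
#include <sys/types.h>

struct block_ops {                      /* the stable interface */
    ssize_t (*read_block)(void *dev, long num, char *buf, size_t len);
};

/* One interchangeable backend (hypothetical, returns zeroed blocks). */
static ssize_t ram_read_block(void *dev, long num, char *buf, size_t len)
{
    (void)dev; (void)num;
    memset(buf, 0, len);
    return (ssize_t)len;
}

static const struct block_ops ram_ops = { ram_read_block };

int main(void)
{
    char buf[16];
    const struct block_ops *ops = &ram_ops;   /* swap backends here */
    ssize_t n = ops->read_block(NULL, 0, buf, sizeof buf);
    printf("read %zd bytes through the stable interface\n", n);
    return 0;
}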
--
Les Mikesell
lesmikesell@xxxxxxxxx