On Sun, 2008-04-06 at 18:51 -0500, John Thompson wrote:
> On 2008-04-04, Robert Rabinoff <rar113@xxxxxxxxxxxx> wrote:
>
> > When I first learned to program in 1964 we used an IBM 1620, fondly
> > known as CADET (Can't Add, Doesn't Even Try).
>
> Heh. My one-and-only formal computer class was learning FORTRAN, which
> we ran on an IBM 1620. The computer had more important things to do than
> run student programs, so we would write them out in spiral-bound
> notebooks in class and as homework, then come to the computer center
> after hours when the keypunches weren't being used for more important
> work, punch the cards and put them in the job queue to be run overnight
> (we weren't allowed to touch the sacred computer). The next day we'd
> come back for the job printout (on wide greenbar paper, of course),
> peruse the errors in our programs, punch new cards, drop them in the
> queue and repeat until it worked.

Most of us "old timers" have been through this. I think that this exercise
made us better programmers. We learned to write code with fewer errors so
that we didn't have to do multiple iterations. Of course, it also meant
that we didn't do much exploration with novel algorithms, and that was a
real impediment to the evolution of programming.

Without that experience, though, programmers today are somewhat less
efficient, thanks to the cut/paste/debug/repeat style of programming and
the habit of moving on as soon as something works once. That leaves some
of the more esoteric bugs, like unterminated (open-ended) strings, buffer
overflows, memory leaks, open-ended chains and so forth, to be discovered
in use, with the attendant work of finding, isolating, and repairing them
falling on other shoulders. The various Programming Proverbs books do a
pretty good job of pointing out some of the issues, but nothing works like
experience.

One of my favorite books was Algorithms + Data Structures = Programs. I
don't remember who wrote it, and I'm sure it is somewhat archaic by now,
but its lessons about how the way you structure the data affects the
speed and efficiency of a program were precious to me. When I learned to
build double-linked chains for data structures, I was able to create new
and novel programming techniques, and coupled with a good round of
lessons on sorting (bubble, double bubble, hash tables, quicksort and
insertion sort), some illustrations of sort speed vs. data table size,
and advice on using one method for one kind of data and another for a
different data entry pattern, I gained great insight.

Another algorithm that I love, but don't yet fully understand, is the
PRML algorithm. I think it has lots of applications if one can structure
the data as a net. Optimal path algorithms are also worthy of some
additional study. And if you are doing image processing, some 2-D Fourier
transforms and some work with Gaussian filters are great. If you do not
know what these are, please look them up. A few rough C sketches of some
of these ideas follow below.
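Just to make the "unterminated strings" point concrete, here is a small
contrived C example (mine, not from any particular source) of the kind of
bug that works once on the author's machine and bites somebody else later:
strncpy() does not promise a terminating NUL when the source fills the
buffer, so the printf() that follows walks off the end of the array.

#include <stdio.h>
#include <string.h>

int main(void)
{
    char buf[8];
    const char *name = "penguins";   /* 8 chars + NUL: one byte too many */

    /* strncpy() fills all 8 bytes and never writes a NUL terminator. */
    strncpy(buf, name, sizeof(buf));

    /* Undefined behaviour: printf() keeps reading past buf[7] until it
     * happens to hit a zero byte somewhere on the stack. */
    printf("hello, %s\n", buf);

    /* One way to stay honest: always terminate explicitly. */
    buf[sizeof(buf) - 1] = '\0';
    printf("hello, %s\n", buf);      /* now prints "penguin" (truncated) */

    return 0;
}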
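On double-linked chains: the payoff is that insertion and deletion
anywhere in the chain are O(1) once you hold a pointer to a node. A
minimal toy sketch (the node layout and helper names are just my own
illustration, not from the book):

#include <stdio.h>
#include <stdlib.h>

struct node {
    int value;
    struct node *prev;
    struct node *next;
};

/* Insert a new node immediately after 'pos' (or start a chain if
 * 'pos' is NULL) and return it. */
static struct node *insert_after(struct node *pos, int value)
{
    struct node *n = malloc(sizeof *n);
    if (!n)
        return NULL;
    n->value = value;
    n->prev = pos;
    n->next = pos ? pos->next : NULL;
    if (n->next)
        n->next->prev = n;
    if (pos)
        pos->next = n;
    return n;
}

/* Unlink 'n' from the chain in O(1); no searching required. */
static void unlink_node(struct node *n)
{
    if (n->prev)
        n->prev->next = n->next;
    if (n->next)
        n->next->prev = n->prev;
    free(n);
}

int main(void)
{
    struct node *head = insert_after(NULL, 1);
    struct node *second = insert_after(head, 2);
    insert_after(second, 3);

    unlink_node(second);              /* remove the middle node directly */

    for (struct node *p = head; p; p = p->next)
        printf("%d ", p->value);      /* prints: 1 3 */
    printf("\n");

    while (head) {                    /* no memory leaks on the way out */
        struct node *next = head->next;
        free(head);
        head = next;
    }
    return 0;
}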
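And on sort speed vs. table size, the rule of thumb I carry around: an
insertion sort is hard to beat on a few dozen nearly-sorted records,
while an O(n log n) sort (here just the C library's qsort()) wins as the
table grows. A rough sketch:

#include <stdio.h>
#include <stdlib.h>

/* O(n^2) in general, but close to O(n) when the data is already almost
 * in order -- fine for small or mostly-sorted tables. */
static void insertion_sort(int *a, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];
            j--;
        }
        a[j] = key;
    }
}

static int cmp_int(const void *pa, const void *pb)
{
    int a = *(const int *)pa, b = *(const int *)pb;
    return (a > b) - (a < b);
}

int main(void)
{
    int small[] = { 5, 2, 9, 1, 7 };
    insertion_sort(small, 5);

    /* For a big table, hand it to the library's O(n log n) sort. */
    enum { N = 100000 };
    int *big = malloc(N * sizeof *big);
    if (!big)
        return 1;
    for (int i = 0; i < N; i++)
        big[i] = rand();
    qsort(big, N, sizeof *big, cmp_int);

    printf("%d ... %d\n", small[0], big[N - 1]);
    free(big);
    return 0;
}

The point is not these particular lines, but that the crossover depends on
the data: how big the table is, how sorted it already is, and how
expensive the comparisons are.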
Some of the new pattern recognition stuff is pretty neat, too, but I am
not as conversant with it as with the others. Line-following algorithms,
and other algorithms that help separate images from one another, are
great as well. Some of the newer speech recognition algorithms are so
proprietary that you cannot see their underlying algorithms, but this is
an area worthy of greater study. Distributed computing is neat, and I am
interested in its operation; scheduling, flagging data, and separating
algorithms are also areas of interest.

I still have a book on CAPP architectures here somewhere. I think that
the 80386(tm) processor had some special registers to enable CAPP
operations at the hardware level, but I believe they were dropped with
the 80486(tm) and later processors.

You can read libraries of books to learn about what is out there, but
coding and getting a few dozen good algorithms working, ones that cover
80% of the job you actually want to do, will take you a long way toward
being one of the top guys. Then learning how to find and implement new
algorithms will take you the rest of the way, in my experience.

And for all the emphasis on object programming, whether OO that only
exists at compile time or realtime objects, without good underlying
algorithms the programs are just poor examples of how to confuse the
person reading or working on your code. Comments are necessary, and for
objects, documentation about the object, its data structures, and the
available algorithms is important to make them really portable and
useful. Repeat after me: self-documenting code is an oxymoron. And even
if you are a whiz-bang programmer, if your code is too obtuse (and I have
written my share of obtuse programs) and the next guy cannot understand
it, or what it actually does, it is not effective in the real world.

Regards,
Les H

--
fedora-list mailing list
fedora-list@xxxxxxxxxx
To unsubscribe: https://www.redhat.com/mailman/listinfo/fedora-list