[IAEP] P.S. Re: etoys now available in Debian's non-free repository

Alan Kay alan.nemo at yahoo.com
Sat Jun 28 13:21:47 CEST 2008


P.S. I thought of a different way to possibly resolve this.

It occurs to me that this discussion and differences of opinion could really be about how executables are made. One of the main issues cited has to do with security, and a notion that being able to see the sources will help.

The simplest application model today has one file containing executable code and one or more files of non-executable data that the executable manipulates once they are combined into an OS process. For example, a text editor in Linux will have a file of executable code and a file that is a text document which (presumably) contains no executables.

Since (Squeak) Smalltalk is set up to run under various OSs (even though it doesn't need them), this model is followed. There is an executable file which contains all the executable machine code in the system -- it usually has a name like Squeak.exe -- and there can be any number of "document" files which contain only data. The "image" file is such a document file: it contains no executable code with respect to any computer which might use it -- it is a pure data structure.
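
To make that concrete, here is a toy sketch in C of loading such a document file. This is not the real Squeak loader (the real one parses a header, adjusts byte order, and so on -- all names and details here are invented for illustration); the point is only that the image is read as plain bytes into memory, and the OS never executes any of it:

    /* Toy sketch: load a "document" file as pure data. The bytes are
       data to be interpreted by the VM, never machine code to be
       jumped to. */
    #include <stdio.h>
    #include <stdlib.h>

    unsigned char *load_image(const char *path, long *size_out)
    {
        FILE *f = fopen(path, "rb");
        if (!f) return NULL;
        fseek(f, 0, SEEK_END);
        long size = ftell(f);          /* total file size in bytes */
        rewind(f);
        unsigned char *bytes = malloc(size);
        if (bytes && fread(bytes, 1, size, f) != (size_t)size) {
            free(bytes);
            bytes = NULL;
        }
        fclose(f);
        if (size_out) *size_out = size;
        return bytes;
    }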

Squeak.exe or its equivalent is ultimately made by (a subset of) the C compiler and tools of the operating system that will invoke it, give it access to the screen, keyboard, and file system, etc. Every piece of C code that is fed to the C compiler is in one file or another and available for perusal. In practice, we don't write most of this code by hand, but generate it from a higher-level architecture and debugging process that is internal to the Squeak system. But still, the result is C code that is compiled by someone else's C compiler (in this case one of the Linux compilers) into standard Linux executables that will be run in a standard way. Any Linux person can examine all the source files using tools they are familiar with.
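
For a feel of what such code is like, here is a tiny hand-written sketch of a bytecode dispatch loop in plain C. The opcodes and names are invented for illustration (the real generated interpreter is vastly larger), but it shows why any C programmer can read and build the result with ordinary Linux tools:

    /* Toy bytecode interpreter: ordinary, portable C of the general
       kind the translator emits, buildable with any C compiler. */
    #include <stdio.h>

    enum { PUSH, ADD, PRINT, HALT };   /* invented opcodes */

    void interpret(const int *code)
    {
        int stack[64];
        int sp = 0;                    /* stack pointer */
        const int *ip = code;          /* instruction pointer */
        for (;;) {
            switch (*ip++) {
            case PUSH:  stack[sp++] = *ip++;               break;
            case ADD:   sp--; stack[sp - 1] += stack[sp];  break;
            case PRINT: printf("%d\n", stack[sp - 1]);     break;
            case HALT:  return;
            }
        }
    }

    int main(void)
    {
        int program[] = { PUSH, 3, PUSH, 4, ADD, PRINT, HALT };
        interpret(program);            /* prints 7 */
        return 0;
    }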

The executable file is like an OS kernel that can incorporate executable plugins -- for graphics, sound, the particle system, and so on (Mozilla has even been run as a plugin) -- so there can be lots of stuff in this file. But again, all of it ultimately has to be made in a form that is acceptable to the underlying OS which owns the machine resources.
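
On Linux, such plugins can also be pulled in at run time through the standard dynamic loader, so even they go through OS-sanctioned channels. A minimal sketch, assuming a hypothetical plugin file name and entry point (the real VM's plugin lookup is more involved, but it rests on the same dlopen facility):

    /* Toy plugin loader using the standard POSIX dynamic loader.
       "SoundPlugin.so" and "initialiseModule" are placeholder names.
       Build on Linux with: cc plugin_demo.c -ldl */
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *plugin = dlopen("./SoundPlugin.so", RTLD_NOW);
        if (!plugin) {
            fprintf(stderr, "dlopen: %s\n", dlerror());
            return 1;
        }
        /* Look up a hypothetical initialization entry point. */
        int (*init)(void) = (int (*)(void))dlsym(plugin, "initialiseModule");
        if (init) init();              /* let the plugin set itself up */
        dlclose(plugin);
        return 0;
    }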

Seems as though this should do it. And all these files are readily available for all of the OSs we deal with.

We would build the Squeak kernel very differently if there were no OS to contend or cooperate with.... But, since we don't regard ourselves as a religion or the one right way, we are content to go along with current conventions where the alternative is needless argumentation.

The rest of the misunderstandings seem to me to be epistemological, and ultimately go back to very different POVs taken by different research groups in the 60s.

From several different motivations, the ARPA (and then the PARC branch of ARPA) community got into thinking about highly scalable architectures, partly motivated by the goal of a world-wide network which had to scale many orders of magnitude beyond any computer artifact. This community also took Moore's Law seriously, particularly with respect to personal computing and just how many nodes the proposed network might have. This led to a "no-centers" style which manifested itself most strongly in the 70s. The most extreme successful versions of this style eliminated the OS, file systems, applications, data structures, simulated punched cards/teletype, etc., in favor of what Marvin Minsky called a "heterarchical" (as opposed to hierarchical) organization of as much as possible.

Several of the formulators of this style had considerable backgrounds in biology, whose no-center scaling then and now goes far beyond anything successfully done in computing.

It was realized that most computing of the 50s and 60s was rather like synthetic chemistry, in which structures are built atom by atom and molecule by molecule. This gets more and more difficult for larger and more complex combinations. "Life" generally uses a quite different process -- instead of building cells and organisms, it grows them. This leads to seeming paradoxes in the epistemology of making -- i.e., to make a cell we need a cell; to make a chicken we need a chicken. However, all works fine in the epistemology of growing, though the initial bootstrapping is a little tricky. Once the bootstrap is done, life can assist much more powerfully in making more life, varying life, etc. As mentioned before, the Internet is one of these, and so is Smalltalk.

In "biologically inspired" architectures one is much more interested in how the organism and ecologies of them are to be sustained, and how the dynamical systems can be understood, fixed, changed, evolved, reformulated, etc., while they are running and with the help of tools that work within them. Just as a cell, and especially e. g. a human baby, is not made by hand, we are more interested in making growth processes that can be influenced with far more ease than direct construction. So, most objects are made by other objects acting under conditions that include the dynamic state in which they will become part of the ecology.

Compared to what can really be done along these lines, what we did at PARC 30+ years ago was pretty crude. Still, the combination of ideas and architectures resulted in very small, powerful, and easily changeable systems that allowed a lot of experimentation and redesign while still being efficient enough to be used as tools and media.

Best wishes to all,

Alan



