Re: ProDOS Plus



BluPhoenyx wrote:
Michael J. Mahon wrote:


It was a doomed idea from the beginning. Even back in the day, a 128K operating system was not considered worth the problems when it was just as easy to make the applications support 128K or more RAM.


I don't think it should be a "128K" system...


I could imagine a 128K OS. Much of the kernel code could run from the aux 64K, using its own stack and page 0. However, since it's already limiting the user base, one might also want to require an accelerated CPU.

However, I understand your advocacy of a reworked and/or patched version of the existing system. For one thing, it would be easier to actually accomplish something useful; a new OS could take quite a while to develop properly. Also, your suggested P-code system could do something similar by running in the aux 64K memory range $D000-$FFFF, or perhaps in the aux RAM used by the P8 /RAM driver. Another option would be an overlay-type system where seldom-used code could be loaded as required. Possibilities abound.

The idea was to use interpretation to shrink the code so that *all* of
the OS could fit in *only* the space that ProDOS now occupies. Anything
else fails the application compatibility test.

Yeah, I recall a few "what if" dream discussions. Still, this one has its own thread now. Besides, if practicality were a concern, we probably wouldn't be using old hardware. For me, this is a hobby, or more precisely a diversion.


Not at all. Practicality is an absolute requirement for me.


Not to make light of your incredible projects, intelligence, and talent, but realistically, how practical are your recent projects outside of the II environment? Now, how practical are they in the II environment other than to yourself, and what would you estimate the user base to be? They are useful to you for the purposes you made them for, but they are constrained to the limited user base of a classic computer environment. Practicality really isn't the issue, whether you concede the point or not.

I meant "practicality" in the context of an Apple enthusiast, not in
the context of "real-world" computing.

The Apple II is a wonderful resource-constrained programming platform,
and I take no joy in fantasy that cannot be reduced to practice.


Hmmm, and I thought I had problems. :)

I wouldn't call it fantasy. A lot of great inventions started out in a similar fashion. Just because we are focusing on a resource-constrained environment for a limited user base doesn't detract from the possibilities. It simply adds more challenge for the developers.

I don't disagree as much as you think, as my next quoted paragraph makes
clear. But I'm also quite aware of the need to calibrate the difficulty
of a task within the range that constitutes an "interesting challenge",
as opposed to either "too easy" or "too hard". (And I do appreciate
that different people have different thresholds.)

Put another way, many years of experience have caused me to develop a
sense of when a project is in danger of becoming overblown, which is
a sure sign that it will never happen.

Most people have this sense implicitly, so that instead of proposing
something that others would think unlikely, they work on it privately
until they have either convinced themselves that they can't do it
either, or have done enough work on it to convince others that it *can*
be done.

One of the greatest joys is imagining something that *seems* impossible,
then finding a way to *do* it. But constraints are real, and there
are provably impossible things, given those constraints.


But that is the whole point of the exercise. It's been said it couldn't be done, but with the right developers and an intelligent, well-thought-out design, the II is capable of some incredible things.

Indeed, and I've already suggested a way of possibly satisfying the
extremely important memory constraint without too much compromise on
the necessarily limited performance:

Perhaps ProDOS and several filesystems could be coded and run
interpretively, like P-code, to get more code into the current
memory profile. If this could be done, it would probably come
at the cost of speed, but it would be interesting.

Compatibility is possible using this approach, since, unlike
DOS 3.3, there was little motivation, and even less facilitation,
for writing code dependent on internal entry points in ProDOS.


Now see, you're already coming up with ideas on workable possibilities. This sounds like a rather likely possibility for an 8-bit P8 replacement or a possible patch-based update. If the interpreter code were tight enough, there might be enough reclaimable space in the current P8 memory usage: the unused space in the aux 16K $D000 area, plus perhaps the space of the built-in /RAM driver if it were removed. Well, you get the idea.

Are you sure that space is free? How many applications have had the
same thought? The only thing we know "for sure" is that the currently
occupied space is available. (And not even all of that, since the
_de jure_ space for a clock driver is actually smaller than the _de
facto_ space--I know, because I have an outsize clock driver that
has worked well for many years!)

A reasonable way to proceed would be to create a slightly modified
ProDOS that doesn't do much extra, but writes (and verifies that
it remains unclobbered) all the RAM that you might want to use.

You then distribute this to interested parties to use as another
"version" of ProDOS, and, after enough exercising, you will know
with reasonable probability just what is used and what isn't.
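
To make the write-and-verify idea concrete, here is a rough sketch in C, purely for readability; the real thing would be a small 6502 routine patched into ProDOS, and the choice of regions (the unused aux $D000 space, the /RAM driver area, or whatever) is an open question. Seed each candidate region with an address-derived pattern at boot, then re-check it from some periodic hook and complain the moment anything has been overwritten.

    #include <stddef.h>
    #include <stdint.h>

    /* Derive a cheap pattern from the address itself, so that data copied
       in from somewhere else is very unlikely to match by accident. */
    static uint8_t pattern(const uint8_t *p)
    {
        return (uint8_t)(((uintptr_t)p * 31u) ^ 0xA5u);
    }

    /* Fill a candidate region once, at boot time. */
    void seed_region(uint8_t *start, size_t len)
    {
        for (size_t i = 0; i < len; i++)
            start[i] = pattern(start + i);
    }

    /* Re-check it later (say, on every MLI call).  Returns the first
       clobbered address, or NULL if the region is still intact. */
    uint8_t *check_region(uint8_t *start, size_t len)
    {
        for (size_t i = 0; i < len; i++)
            if (start[i] != pattern(start + i))
                return start + i;
        return NULL;
    }

Report any hit on the screen (or log it to disk), and after enough people have pounded on enough applications you have your map of what really is free.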

The problem is that it might take a long time with the community
at its current population and level of adventurousness to get
reasonable coverage of the application space.

In any case, it always seemed to me that the technique should be
nicely applicable to an OS, since 90%+ of the time spent in an OS
is in 10% or less of the code, and so much OS code is error recovery
code with a frequency of essentially zero.
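
Back-of-the-envelope: even if interpreted code ran ten times slower, moving the cold 90% of the code (which accounts for at most 10% of the running time) into interpretive form would cost less than a factor of two overall (0.9 + 10 x 0.1 = 1.9), and far less in practice, since so much of that cold code is error handling that essentially never runs at all.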

It shouldn't be hard to put the error recovery and reporting, exception
handling, initialization, and other low-frequency code into a very
compact, highly interpretive form, leaving only the frequently executed
parts as in-line executable code.

I've wanted to try this experiment for many years, but could never
convince the "powers that be", and have never seen a system that used
this approach.

Properly done, it would call for designing a "virtual machine" that
is particularly adept at doing the things that OS code does, just as
the Digitek interpreter was specialized to recursive-descent compiling.
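
To give the flavor of it, here is a toy dispatch loop in C. Everything here is invented for illustration (a real one would be hand-coded 6502, with the byte-code routines laid out for density), but it shows the shape: the rare paths shrink to a few bytes apiece, and an explicit "native" opcode keeps the hot paths out of the interpreter entirely.

    #include <stdint.h>
    #include <stdio.h>

    enum {
        OP_END = 0,   /* return from the interpreted routine */
        OP_SETERR,    /* set pending error code (1 operand byte) */
        OP_MSG,       /* report a canned message by index (1 operand byte) */
        OP_NATIVE     /* call a native routine by table index (1 operand byte) */
    };

    typedef void (*native_fn)(void);

    static uint8_t pending_error;
    static void flush_buffers(void) { /* stand-in for a real native routine */ }
    static const native_fn natives[] = { flush_buffers };
    static const char *const messages[] = { "I/O ERROR", "WRITE PROTECTED" };

    /* The whole "virtual machine": fetch an opcode, do it, repeat. */
    static void interpret(const uint8_t *pc)
    {
        for (;;) {
            switch (*pc++) {
            case OP_END:    return;
            case OP_SETERR: pending_error = *pc++;  break;
            case OP_MSG:    puts(messages[*pc++]);  break;
            case OP_NATIVE: natives[*pc++]();       break;
            }
        }
    }

    /* A seldom-used error path then costs only seven bytes of "code". */
    static const uint8_t io_error_path[] = {
        OP_SETERR, 0x27, OP_MSG, 0, OP_NATIVE, 0, OP_END
    };

    int main(void)
    {
        interpret(io_error_path);
        return pending_error ? 1 : 0;
    }

The dispatch overhead would be dreadful for hot code, which is exactly why the frequently executed 10% stays as ordinary in-line 6502.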


Now you're thinking in the right direction. If a few more 8-bit gurus chip in, there's no telling how much could be done on such a project.

Overcoming the implementation obstacle is just one input to the
"and gate"--a major remaining obstacle for any code that must handle
as many cases as an OS is ensuring that it is sufficiently reliable.

The combinatorics of an OS are such that testing is a very weak
strategy. Widespread use in many applications over a period of time
is the way that virtually all complex systems are brought to an
acceptable state of reliability. I'm concerned that the level of
acceptance and activity in the current Apple II community is marginal
for getting there.

One way to mitigate this would be to build a test scaffold to compare
intermediate states (as signatures) of the "new" ProDOS with the same
state signatures for the existing ProDOS. This would improve the
efficiency of "certification" testing of existing functionality by many
orders of magnitude.
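
For instance, one could hash whatever is regarded as the observable state after each MLI call and append it to a log, then run the same exercise script under stock ProDOS and under the new one; a plain file compare shows where the two first diverge. A sketch in C, with the captured state invented for illustration:

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* FNV-1a: tiny and good enough for a divergence detector. */
    static uint32_t fnv1a(uint32_t h, const uint8_t *p, size_t n)
    {
        while (n--) { h ^= *p++; h *= 16777619u; }
        return h;
    }

    /* Hypothetical snapshot of what an MLI call is allowed to affect. */
    struct mli_state {
        uint8_t a, x, y, carry;       /* returned registers and carry */
        uint8_t err;                  /* MLI error code */
        uint8_t io_buffer[512];       /* file/block buffer contents */
        uint8_t global_page[256];     /* ProDOS global page image */
    };

    void log_signature(FILE *log, unsigned call_no, const struct mli_state *s)
    {
        uint32_t h = fnv1a(2166136261u, (const uint8_t *)s, sizeof *s);
        fprintf(log, "%u %08lx\n", call_no, (unsigned long)h);
    }

Two logs and one diff, and every regression in existing functionality shows up as the first line where the signatures disagree.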

Any new functionality would still be subject to less efficient testing
(and this is a good reason to limit the new functionality).

Sure--so do I. But I doubt that such a multi-filesystem OS can be
produced for 8-bit Apples while maintaining backward compatibility
unless some radical approach (see above) is taken. The memory map
of the Apple II, including AUX mem, is too open to "squatting" to be
able to evict everyone at this late date.

Any practical extension will have to fit within the memory envelope
of the current ProDOS. That's one reason that multi-filesystems
implemented in conventional ways don't compute for me. And if they
require a large amount of buffer space, then I don't see any practical
way to implement them.


Agreed, but ideas must be bandied about to kick loose the bits and pieces required to decide which features are really important for the project.

It's true--but I must admit to lifelong impatience with "brainstorming"
as opposed to a focused design process. Although it can be useful
in the "green field" phase, I've *almost* always found it to have too
low a signal-to-noise ratio (of course, that depends on the group ;-).

I don't think any OS has ever maintained 100% compatibility with previous versions. In my experience this has never been the case. Of course, said experience is limited to the Apple II and to PCs and compatibles over the past 23 years.


It's not necessary to maintain full compatibility while a system is
still on the upswing--new applications are being written, new systems
are being sold, and the motivation is there to achieve critical mass
on a new OS that offers functional advantages.

But I think you will agree that we are no longer in that situation
with the Apple II. Now, compatibility is a must if there is to be
any hope of sufficient applications for (relatively) wide use.


Considering the number of II users, wide usage isn't a consideration in most instances. IMHO, one great application could make the efforts worthwhile, be it gaming, porting newer development systems, or an Internet-capable environment. I can think of a great number of things which were considered impractical when first created, yet someone still made them.

For one demanding application, writing one's own OS is undoubtedly the
best way to get the maximum functionality and performance from a system.
Contiki comes immediately to mind. AppleWorks is another example, where
a quite general memory manager was written to simplify the rest of the
application.

If that application becomes popular, and documentation of the internals
is available, then others may choose to build upon that system base.
(Note that the AppleWorks memory manager is not more widely used because
it fails the "documentation available" test.)

The problem with leaving "wide adoption" to the future is that every
new application will find new bugs. You'd like to "front load" that
discovery process if at all possible.

Given that I hope to use my II systems longer than any PC I own, plus the shorter lifetimes of most PC software, the useful lifetime I expect for II software is longer than it would be for other computers. This offers some possibilities that other II enthusiasts might consider when deciding about creating applications and extensions as well.

I agree, but note that there is not even a "steady trickle" of new
software for our platforms that is intended for general use.

Unless we (miraculously) get proactive in programming, and introduce
others to the joys of programming on the Apple II platform, I see no
reason to hope for the situation to change.

Maybe when more Boomers start to retire... ;-)

It is still an interesting discussion of ideas, and whether these things bear fruit or not, they usually spark an idea in someone's mind. Occasionally, these ideas are pretty good too.


I play both top-down and bottom-up: grand visions and incremental
extension. But given the constraints outlined above, I think that
the incremental extension route is currently the most viable one for
ProDOS.


And you play your part well. Quite often you get folks thinking about things being too difficult or even impossible, and later on give ideas on how things might actually be possible. Take the issue of adapting to modern monitors. I recall you originally said it would not be worth the difficulty involved (not an exact quote), yet more recently you have mentioned several possibilities for developing just such an item.

Yes, I have a very strong aesthetic issue with using a cannon to kill
a mouse, and am always looking for simple solutions to simple problems,
or, even better, simple solutions to complex problems!

It has always set off alarms for me when complex solutions to simple
problems are proposed.

My physics background has given me a strong appreciation of the role
of "invariants" in a system--things that remain the same "no matter
what".

Your intelligence and pragmatic insight are well used and well appreciated.

Thanks! ;-)

More concretely, small changes to compact the existing code enough
to allow extensions like "parent" in pathnames would be a very good,
100% compatible, and relatively doable extension.
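
As a toy illustration of the pathname half of that, assuming ".." were the chosen syntax for "parent" (which is only my guess at a syntax), the whole feature can sit in front of the existing lookup code as a normalization pass:

    #include <stddef.h>
    #include <string.h>

    /* Collapse ".." components in a ProDOS-style path, e.g.
       "/HD/WORK/../GAMES/ZORK" becomes "/HD/GAMES/ZORK".
       'out' is assumed to be at least as large as 'in'. */
    void normalize_path(const char *in, char *out)
    {
        const char *start[64];                     /* kept components */
        size_t      clen[64];
        int n = 0;

        for (const char *p = in; *p; ) {
            while (*p == '/') p++;                 /* skip slashes */
            if (!*p) break;
            const char *q = p;
            while (*q && *q != '/') q++;           /* end of this component */
            if (q - p == 2 && p[0] == '.' && p[1] == '.') {
                if (n > 0) n--;                    /* ".." pops one level */
            } else if (n < 64) {
                start[n] = p; clen[n] = (size_t)(q - p); n++;
            }
            p = q;
        }

        char *o = out;
        for (int i = 0; i < n; i++) {
            *o++ = '/';
            memcpy(o, start[i], clen[i]);
            o += clen[i];
        }
        if (o == out) *o++ = '/';                  /* everything popped: root */
        *o = '\0';
    }

The point is not the C, of course; it's that the extension is purely a front end to pathname parsing, touches nothing downstream, and so stays 100% compatible for any path that doesn't use it.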

If people liked it, it would provide both the learning and the
encouragement to proceed with other incremental extensions.


Yet another purpose for discussions. Knowledge and development of skills are always a worthy goal.

Absolutely! Immediately satisfying, and I always trust that new
knowledge and skills will find pragmatic application down the road.

The things I've tried that didn't work have been especially useful in
helping to "prune" later search trees. ;-)

If nothing else comes out of this discussion, I have enjoyed it immensely.

As have I. It's started me thinking about interpretation again, an
approach I've always found fascinating. Who wouldn't like a virtual
machine that they can modify instantly? ;-)

-michael

NadaNet file server for Apple II computers!
Home page: http://members.aol.com/MJMahon/

"The wastebasket is our most important design
tool--and it's seriously underused."


