Re: allowing my AI to dynamically change its own structure

On Wed, 14 Apr 2010 05:39:51 GMT, Don Geddis wrote:

"Dmitry A. Kazakov" <mailbox@xxxxxxxxxxxxxxxxx> wrote on Mon, 29 Mar 2010:
On Thu, 25 Mar 2010 06:28:01 GMT, Don Geddis wrote:
Imagine you had a program that implemented a bubble sort, and that was
the only kind of sort you (the human programmers) knew about. But then
another part of the program internally explored the search space of
sorting algorithms

"Internally explored" = "it was programmed (intentionally or not) to
explore" => it had a reachable state "space is explored" and performed a
transition into this state (set of states).
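The point that "internal exploration" is itself just a programmed transition can be sketched as follows. This is an illustrative example of my own, not either poster's code; the names (`SelfImprovingSorter`, `discovered_sort`) are invented for the sketch:

```python
# Illustrative sketch: a program whose "self-improvement" is itself
# a state transition it was written to make.

def bubble_sort(xs):
    """The only sort the original programmers knew about."""
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

def discovered_sort(xs):
    # Stands in for an algorithm "found" by the search component.
    return sorted(xs)

class SelfImprovingSorter:
    def __init__(self):
        self.impl = bubble_sort   # initial state
        self.explored = False     # the reachable state "space is explored"

    def explore(self):
        # The "exploration" is a transition the program was
        # programmed (intentionally or not) to perform.
        self.impl = discovered_sort
        self.explored = True

    def sort(self, xs):
        return self.impl(xs)

s = SelfImprovingSorter()
assert s.sort([3, 1, 2]) == [1, 2, 3]
s.explore()   # a programmed transition into the new state
assert s.sort([3, 1, 2]) == [1, 2, 3]
```

Viewed this way, swapping `impl` is just one more transition in the overall machine's state set, however surprising it looks to an observer.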

You seem to be missing that a program running on a computer can be
described at many different layers of abstraction.

I have addressed this logical fallacy. The litmus test is this: is the
behavior a property of the program, or of the given description layer?

Even considering the "behavior" of a layer, that behavior does not change.
The only thing that changes is the subject to which the word "behavior" is
applied, and that shift constitutes the fallacy.

Will it change the behavior? No, it will not. It is programmed to change
its mind.

If you have a program with a goal of sorting lists of numbers, with a
separate part that searches the space of sorting algorithms

I have addressed this too in my earlier post. A separate program can modify
*another* program. That does not change its own behavior, nor the behavior
of the program + its offspring.

It is useful to describe this as, "the program learned a better sorting
algorithm, and now has improved behavior".

It is useful to describe the sun as "rising." Yet we know that it does not.

You say that the behavior was changed because some human observing the
program saw something unpredictable.

No. The behavior changed, because originally the program could be
simply modelled as [...]

Who could model the program, if not a human?

The only way, in this example, that "neither the set of states, nor the
transitions" have changed, is if you make the foolish model that the
software is just a few gigabytes of random bits, and the transitions are
what the CPU does, given RAM with those random bits.

What is it otherwise?

"Otherwise", it's a vastly more useful model of the program as a sorting
algorithm in a high level language.

Again, model /= object. You can have all sorts of models of the program's
behavior. But you cannot substitute one for another.

Of course, "trivially", the CPU ("transitions") doesn't change, and the
RAM space ("states") doesn't change. But that's an especially useless
way of modelling the system.

On the contrary, this is a way to prove, or at least to predict, the
program's behavior.

Really? And how, exactly, do you use the full transistor diagram of a
modern x86 processor, along with a multi-gigabyte bit vector, to "prove
or predict" whether the list of numbers will be sorted in about N^2
time, or N log N time?
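The N^2 versus N log N question is answered at the algorithm level, not the transistor level, and that can be made concrete by instrumenting the two algorithms. A sketch of my own (not either poster's code), counting comparisons directly:

```python
# Illustrative sketch: the algorithm-level model predicts comparison
# counts directly; no transistor diagram or RAM bit vector is needed.
import random

def bubble_sort(xs):
    """Bubble sort; returns (sorted list, number of comparisons)."""
    xs, comps = list(xs), 0
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            comps += 1
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs, comps

def merge_sort(xs):
    """Merge sort; returns (sorted list, number of comparisons)."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, cl = merge_sort(xs[:mid])
    right, cr = merge_sort(xs[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comps

data = random.sample(range(10_000), 1000)
_, b = bubble_sort(data)   # this variant always does n(n-1)/2 comparisons
_, m = merge_sort(data)    # roughly n * log2(n) comparisons
print(b, m)
```

For n = 1000 the bubble sort performs 499,500 comparisons while the merge sort stays near 10,000, which is exactly the prediction the high-level model gives and the bit-level model obscures.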

Very simply: a properly functioning CPU is a necessary premise for all
further considerations about the programs running on that CPU. You cannot
sort numbers if the CPU does not function as required. All statements about
program correctness and its behavior are conditional on the computational
environment.
What is your practical approach for answering the real questions of
interest, from the very low level model you seem to prefer?

By no means do I prefer low-level models. The argument is independent of
the abstraction level of the given programming language. It pertains to
program correctness. How do you define correctness, if not based on
behavior? If correctness is irrelevant, why do we test and debug programs?

These much more meaningful things certainly CAN change over time.

How so? The input is either sorted or not. There are states where the
predicate Sorted(Input) is true. In *other* states it is false. That does
not influence the states themselves.
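The claim that "sorted" is a predicate over states, not a property of any particular sorting algorithm, can be sketched in a few lines (my own illustrative example):

```python
def is_sorted(xs):
    # The predicate Sorted(Input): true in some states, false in others.
    # It is defined on the state alone, independently of which algorithm
    # (bubble sort, quicksort, ...) might move the system between states.
    return all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))

assert is_sorted([1, 2, 3])
assert not is_sorted([2, 1, 3])
```

Whatever algorithm performs the transition, the set of states satisfying the predicate is the same.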

The state/transition model of a bubble sort algorithm is very different
from the state/transition model of a quicksort algorithm. At the
algorithm level, it's an entirely different model.

The property "sorted" of the input does not depend on the [model of]
algorithm of sorting, obviously.

-- Don
Don Geddis don@xxxxxxxxxx
If I have seen farther than others, it is because I have stood on the shoulders
of giants.
-- Isaac Newton
If I have not seen as far as others, it is because giants were standing on
my shoulders.
-- Hal Abelson
In computer science, we stand on each other's feet.
-- Brian K. Reed

If I have not seen as far as others, it is because I have stood in the
footprints of giants...

Dmitry A. Kazakov
