Porting MindForth AI into JavaScript Mind.html

mentifex

In our JSAI coding over the last few days, we kept noticing
that the activation-level on S-V-O verbs was going to zero
immediately after the generation of a sentence of thought.
It looked obvious to us that something in there was
arbitrarily zeroing out the verbs. Last night we looked into
both Mind.Forth and Mind.html, and we quickly saw that a
verbClear module was zeroing out the verbs in the JSAI.
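
A minimal JavaScript sketch of that kind of blanket reset appears
below; the identifiers (conceptArray, act, pos, VERB_POS) are
illustrative stand-ins, not the actual Mind.html variable names.

    // Hypothetical sketch of a verbClear-style reset; the identifiers
    // here are illustrative, not the real Mind.html variable names.
    var VERB_POS = 8;  // assumed part-of-speech code for verbs

    function verbClear(conceptArray) {
      // After a sentence of thought has been generated, zero out the
      // activation on every verb concept, which is the blanket reset
      // we were observing in the JSAI.
      for (var i = 0; i < conceptArray.length; i++) {
        if (conceptArray[i].pos === VERB_POS) {
          conceptArray[i].act = 0;  // activation drops straight to zero
        }
      }
    }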

Before we took a look at the two programs last night,
we were mistakenly assuming that the modern MindForth
was parroting back its first few sentences of
knowledge-base (KB) input, as if "cats eat fish"
would get a response of "CATS EAT FISH". Actually and
felicitously, the response was "FISH WHAT DO FISH DO".

If the Forthmind had been parroting back its inputs,
there would have been a serious problem with the question
of what activational mechanism was activating the concepts
in the same order as they had come in. Luckily, Mind.Forth
has progressed far beyond the "actset" mentality (in both
the AI Mind and in the mind of its programmer) of forcing
a Subject-Verb-Object input to generate the same output.

The http://mentifex.virtualentity.com/actrules.html webpage
on "Activation Rules" has gotten far out of date since its
last update a year ago on 21.MAY.2007, because "actset"
no longer plays any role at all in the AI Mind. Still,
there are some important insights in the "actrules" text.

As we proceed today to transmogrify the JSAI verbClear()
module into the same sort of verbClip() module that we
have in MindForth, we begin to entertain the notion
that we may be able to get away entirely not only from
verbClear but also from verbClip, if we manage not to
focus on verbs as such when clipping activation,
but rather on the no-longer-cresting concept as the item
that needs to have its activation drastically decreased.
In our break-out coding of Mind.Forth half a year ago,
it was difficult to get the AI to "detour" away from
defective thoughts if the verbs involved were maintaining
an unduly high post-thought activation. The verbClip
module in Mind.Forth was a way to knock out the
just-thought verbs and produce the "detour" response.
Soon a more mature Mind.Forth or http://AIMind-i.com
may subsume the verbClip operation into a generalized
and therefore less ad-hoc (Band-Aid™) algorithm.
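
To make the contrast concrete, here is a hedged JavaScript sketch of
the two approaches: a verbClip-style decrement confined to the
just-thought verbs, and a generalized routine that decrements
whichever concept has passed the crest of its activation. The
identifiers (conceptArray, act, pos, justThought, prevAct) and the
numeric amounts are assumptions for illustration, not the actual
MindForth or Mind.html names.

    // Illustrative sketch only; identifiers and amounts are hypothetical.
    var VERB_POS = 8;  // assumed part-of-speech code for verbs

    // verbClip: knock the activation of just-thought verbs down by a
    // fixed amount, so a defective chain of thought can be detoured around.
    function verbClip(conceptArray, clipAmount) {
      for (var i = 0; i < conceptArray.length; i++) {
        var c = conceptArray[i];
        if (c.pos === VERB_POS && c.justThought) {
          c.act = Math.max(0, c.act - clipAmount);
        }
      }
    }

    // Generalized alternative: drastically decrease whichever concept
    // has stopped cresting, regardless of its part of speech.
    function clipNoLongerCresting(conceptArray, decayAmount) {
      for (var i = 0; i < conceptArray.length; i++) {
        var c = conceptArray[i];
        if (typeof c.prevAct === 'number' && c.act < c.prevAct) {
          // Activation is already falling past its crest: push it down.
          c.act = Math.max(0, c.act - decayAmount);
        }
        c.prevAct = c.act;  // remember this cycle's level for next time
      }
    }

The point of the second routine is that no part-of-speech test is
needed at all: any concept past its peak gets damped, which is what
would let both verbClear and verbClip eventually disappear.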

Along the same line of thought (following general
principles rather than grabbing at ad-hoc bugfixes),
our Tutorial displays, perhaps in MindForth but
more definitely in JavaScript Mind.html, may start
to show not only the candidates for composing a
Subject-Verb-Object link in a chain of thought,
but also many forms of cognitive association across the
entire mindgrid of the artificial Mind.
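
As a purely speculative sketch of what such an expanded display might
look like, the Tutorial mode could list every concept whose activation
exceeds some threshold, instead of only the current subject, verb, and
object candidates; the function name, field names, and threshold below
are invented for illustration.

    // Hypothetical Tutorial-display sketch; names and threshold are invented.
    function tutorialDisplay(conceptArray, threshold) {
      var lines = [];
      for (var i = 0; i < conceptArray.length; i++) {
        var c = conceptArray[i];
        if (c.act > threshold) {
          // Show every sufficiently active concept, not just S-V-O candidates.
          lines.push(c.word + '  act=' + c.act + '  pos=' + c.pos);
        }
      }
      return lines.join('\n');
    }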

ATM
 
