
Let me copy-paste here Barry Smith's comment on your post: "What is missing in his account is the absence of an AGI will. To have AGI the machine would have to want things, have intentions, ...

See attached extract from 9781003310105.pdf

BS" (see https://groups.google.com/g/ontolog-forum/c/-nl9WyPa9qc/m/2Zm4HfrNAwAJ)


Discussions of will, purpose, and intention tend to remain scholastic unless the participants are involved in hands-on work on developing AGI or an AGI component. The need to answer "how to ..." questions puts everything in its place. When the set of possibilities and the set of needs have a non-empty intersection, the system must select from that intersection a subset of intentions that can be executed step by step in parallel. Any such choice requires either a compromise among estimates of the importance of the competing intentions or a random selection of preferred ones; both are exhibitions of will in the psychological sense, as sketched below. Nothing supernatural here: it is everyday practice in automatic control systems; AGI only complicates how the decisions are made compared to existing systems.
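A minimal Python sketch of that selection step, purely to illustrate the idea: the names (`possibilities`, `needs`, `importance`, `choose`) and the scores are my own assumptions, not anything from the original comment or from any AGI system.

```python
import random

# Hypothetical example: what the system can do vs. what it should do.
possibilities = {"recharge", "explore", "report", "idle"}
needs = {"recharge", "report", "navigate"}

# Candidate intentions: the non-empty intersection of possibilities and needs.
candidates = possibilities & needs  # here: {"recharge", "report"}

# Assumed importance estimates for each candidate intention.
importance = {"recharge": 0.9, "report": 0.4}

def choose(candidates, importance, budget=1, randomize=False):
    """Select a subset of intentions to execute: either rank by estimated
    importance or pick at random among the candidates. In the comment's
    terms, either branch is an exhibition of 'will'."""
    if randomize:
        return random.sample(sorted(candidates), k=min(budget, len(candidates)))
    ranked = sorted(candidates, key=lambda c: importance.get(c, 0.0), reverse=True)
    return ranked[:budget]

print(choose(candidates, importance))                  # e.g. ['recharge']
print(choose(candidates, importance, randomize=True))  # a random preference
```

The point of the sketch is only that the choice rule is ordinary engineering: swap in a different scoring function or selection policy and the structure stays the same.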
