Talk:Cognitive ontology

A Cognitive Ontology provides a roadmap to cognition.

Motivation for this ontology

Cognition confers a competitive advantage. Therefore, entities that cognize will increase in number until they meet resistance. See ontological warfare.

Patterns for investigating this ontology

One obvious pattern to map onto cognition is the scientific method, a well-known mechanism for producing knowledge.

  • The term 'mechanism' was chosen deliberately. Isaac Newton counted himself a 'mechanic' at a time when this was a radical notion.
The mapping is as follows:

  Information Processing Loop   Scientific Method
  Sense data                    Observation
  Percepts                      Hypothesis
  Concepts                      Prediction
  Test data                     Test
  Actions                       Review
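
To make the mapping concrete, here is a minimal Python sketch of the five stages run as a loop. The class, the method names, and the placeholder return values are illustrative assumptions, not part of the mapping itself.

    # A minimal sketch of the information-processing loop mapped onto the
    # scientific method. Stage names follow the table above; the bodies are
    # hypothetical placeholders.

    class Agent:
        def observe(self):                    # Sense data -> Observation
            return {"temperature": 21.5}

        def hypothesize(self, sense_data):    # Percepts -> Hypothesis
            return "it is warm indoors"

        def predict(self, hypothesis):        # Concepts -> Prediction
            return "the thermometer will read above 20"

        def test(self, prediction):           # Test data -> Test
            return True

        def review(self, result):             # Actions -> Review
            print("hypothesis supported" if result else "hypothesis rejected")

    def cognition_loop(agent, cycles=1):
        """Run the five stages in order, once per cycle."""
        for _ in range(cycles):
            data = agent.observe()
            hypothesis = agent.hypothesize(data)
            prediction = agent.predict(hypothesis)
            result = agent.test(prediction)
            agent.review(result)

    cognition_loop(Agent())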


Reasonable questions about an entity E

Now the tough part is structuring the ontology. We posit entities ('E's) as the primitive concept.

If we take E from the collection in the table above, our choices for E include Sense data, Percepts, Concepts, Test data, and Actions. More choices are available; it would be risky to presume we have a complete list.

Our motivation is the computer-programming agent, which we think of as a task. From a programming viewpoint, it is no problem to treat the E's as hierarchical (outline format). Naming the E's is also no problem; they are tagged with symbols that reside in some symbol table. Datatyping the E's is also no problem; each is at least a triple (Object, Attribute, Value). The type of an E is some choice from a Collection.
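
As an illustration of this representation, here is a minimal Python sketch in which each E carries (Object, Attribute, Value) triples, is named in a symbol table, and nests hierarchically. The class and helper names are assumptions made for the example only.

    # A minimal sketch, assuming a simple in-memory representation: each E is
    # stored with (Object, Attribute, Value) triples, names live in a symbol
    # table, and child links give the hierarchy (outline format).

    from dataclasses import dataclass, field

    @dataclass
    class Entity:
        name: str                                     # symbol for this E
        triples: list = field(default_factory=list)   # (Object, Attribute, Value)
        children: list = field(default_factory=list)  # hierarchical structure

    symbol_table = {}   # name -> Entity

    def define(name, parent=None):
        e = Entity(name)
        symbol_table[name] = e
        if parent is not None:
            symbol_table[parent].children.append(e)
        return e

    define("Percepts")
    define("warm-room", parent="Percepts")
    symbol_table["warm-room"].triples.append(("warm-room", "temperature", 21.5))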

Our motivation for 'collection' comes from naive set theory, and we assume the machinery of first-order predicate calculus (but nothing stronger), both to stay within the bounds of the scientific method and to remain decidable.
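
One way to picture the restriction to first-order machinery over collections: when the collection is finite, the quantifiers reduce to plain iteration, so statements about the E's can be checked directly. The collection below is illustrative, not part of the ontology.

    # Over a finite collection, "for all" and "there exists" reduce to
    # iteration, so a first-order statement such as "every E has an attribute
    # named 'type'" is directly checkable.

    collection = [
        ("Sense data", {"type": "input"}),
        ("Percepts",   {"type": "derived"}),
        ("Concepts",   {"type": "derived"}),
    ]

    def forall(pred, xs):
        return all(pred(x) for x in xs)

    def exists(pred, xs):
        return any(pred(x) for x in xs)

    print(forall(lambda e: "type" in e[1], collection))           # True
    print(exists(lambda e: e[1]["type"] == "input", collection))  # True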

  1. What is E?
  2. When did E occur?
  3. Where is E?
  4. If there were no E, what would happen?
  5. What reasons are there to expect the observation of E?
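
Under the triple representation sketched above (an assumption of this example, not a fixed design), the first three questions translate into simple lookups over an E's attributes; the attribute names used here are illustrative.

    # A sketch: answering questions 1-3 by looking up attributes of an entity
    # stored as (Object, Attribute, Value) triples.

    triples = [
        ("E42", "kind",  "percept"),      # 1. What is E?
        ("E42", "time",  "2024-05-01"),   # 2. When did E occur?
        ("E42", "place", "lab-3"),        # 3. Where is E?
    ]

    def lookup(obj, attribute):
        # Questions 4 and 5 are counterfactual/causal and need more than a
        # bare triple store, so this helper only covers questions 1-3.
        for o, a, v in triples:
            if o == obj and a == attribute:
                return v
        return None

    print(lookup("E42", "kind"), lookup("E42", "time"), lookup("E42", "place"))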

Cognition in an entity (embodied cognition)

We get specific here to help our reasoning: we assume the entities are embodied, so we can hope for answers to questions 1 through 5.

If we take the definitions from the article on Agents, the Agents (the Entities) have an Ontological commitment, and by definition the Agents have an Ontology. The Ontological commitment is the basis for the Attribute values of an Entity. Those values may be controlled by some set of sub-agents. Agents can form coalitions with each other.
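
A minimal sketch of this picture, under the assumptions of the paragraph above: an Agent holds attribute values, its ontological commitment is expressed as constraints on those values, and sub-agents (or the agent itself) may update the values only while the constraints keep holding. All names and the temperature example are illustrative.

    class Agent:
        def __init__(self, name, commitment):
            self.name = name
            self.attributes = {}
            self.commitment = commitment   # attribute -> predicate on the value
            self.sub_agents = []

        def set_value(self, attribute, value):
            check = self.commitment.get(attribute, lambda v: True)
            if not check(value):
                raise ValueError(
                    f"{self.name}: value {value!r} violates commitment on {attribute!r}")
            self.attributes[attribute] = value

    # Example: an embodied agent committed to keeping its temperature in a band.
    robot = Agent("robot-1", {"temperature": lambda t: 10 <= t <= 40})
    robot.set_value("temperature", 21.5)   # accepted
    # robot.set_value("temperature", 95)   # would raise: violates the commitment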

Clearly, E can be natural or artificial.

Natural E's have solved the problem of survival. Part of the Ontology for such an E is to preserve its attributes (properties), and for natural E's the values of those attributes are determined by a natural process.

Example of an Ontological commitment for an E

Cognition is also the name of an international journal publishing theoretical and experimental papers on the study of the mind.


1950s

1980s

Other observations of self-organizing behavior:

Other emergent behavior: