
Blog: Mens Latina Strong AI — 2019–05–05


As we code artificial intelligence in the Latin language, the central problem for the LaParser mind-module is how to use inflectional endings instead of word-order for Natural Language Understanding (NLU). Fortunately, we have key variables to hold the instantiation-time of a subject, a verb, and a direct or indirect object. In our English-speaking and Russian-speaking AI Minds, we have already implemented the retroactive setting of a few associative tags in the conceptual flag-panel of each element of a sentence. Now for Mens Latina we envision setting everything retroactively.
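
To make this concrete, here is a minimal JavaScript sketch of the bookkeeping, assuming a simplified Psy array of flag-panel objects and illustrative names such as instantiate() and setFlag() that are not the actual Mens Latina code:

    // Conceptual memory: one flag-panel object per time-point, plus
    // time-point variables that let us reach back and edit a stored panel.
    var Psy = [];         // simplified stand-in for the Psy conceptual array
    var tSubject = 0;     // instantiation-time of the subject
    var tVerb = 0;        // instantiation-time of the verb
    var tObject = 0;      // instantiation-time of the direct or indirect object

    // Store a concept at the current time-point and return that time-point.
    function instantiate(conceptNumber, partOfSpeech) {
      Psy.push({ psi: conceptNumber, pos: partOfSpeech, dba: 0, pre: 0, seq: 0 });
      return Psy.length - 1;
    }

    // Retroactively set one flag in an already-stored panel.
    function setFlag(timePoint, flag, value) {
      if (Psy[timePoint]) Psy[timePoint][flag] = value;
    }

    // Example: a noun is stored before its role is known; once later words
    // arrive, we reach back through its time-point and adjust its tags.
    tSubject = instantiate(501, 5);   // hypothetical noun concept
    tVerb    = instantiate(820, 8);   // hypothetical verb concept
    setFlag(tSubject, "dba", 1);      // retroactively mark the noun as nominative (assumed numbering)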

We are having difficulty in identifying the grammatical person of a recognized Latin verb, such as “DAT”. The OldConcept() module finds only the concept-number, and even AudRecog() finds only the concept-number. If we want a report of person and number for the LaParser() module, we will need to deal with the inflectional endings passing through AudBuffer() and OutBuffer().

In the OldConcept() module we have started testing the characters held as the inflectional ending at the end of a Latin verb, and thus we are able to detect the grammatical person of the verb. Next, for the sake of the conceptual flag-panel, we need to test for the grammatical number of the verb. By a default that can be overruled, we let a verb ending in “S” be interpreted as second-person singular; a “-US” ending overrules the default and is interpreted as first-person plural, as in “DAMUS”, while a “-TIS” ending overrules the default and is interpreted as second-person plural, as in “DATIS”. Likewise, by default we let a “T” ending be interpreted as third-person singular, with an “-NT” ending able to overrule the default and be interpreted as third-person plural, as in “DANT”.
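
The following JavaScript sketch shows these default-and-override tests on the last one, two or three characters of a verb; the function name and the numeric coding of person and number are illustrative, not the actual OldConcept() code:

    // Guess person (dba) and number of a Latin verb from its inflectional
    // ending, using the defaults and overrides described above.
    function verbPersonNumber(verb) {
      var w = verb.toUpperCase();
      var dba = 0;    // grammatical person: 1, 2 or 3
      var num = 0;    // grammatical number: 1 = singular, 2 = plural
      if (w.slice(-1) === "S") {
        dba = 2; num = 1;                                // default: 2nd-person singular, e.g. DAS
        if (w.slice(-2) === "US")  { dba = 1; num = 2; } // overrule: 1st-person plural, e.g. DAMUS
        if (w.slice(-3) === "TIS") { dba = 2; num = 2; } // overrule: 2nd-person plural, e.g. DATIS
      } else if (w.slice(-1) === "T") {
        dba = 3; num = 1;                                // default: 3rd-person singular, e.g. DAT
        if (w.slice(-2) === "NT")  { dba = 3; num = 2; } // overrule: 3rd-person plural, e.g. DANT
      }
      return { dba: dba, num: num };
    }

    // verbPersonNumber("DAT")   -> { dba: 3, num: 1 }
    // verbPersonNumber("DAMUS") -> { dba: 1, num: 2 }
    // verbPersonNumber("DANT")  -> { dba: 3, num: 2 }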

In the LaParser() Latin parsing module we load the variables for the time of the subject, the verb and the direct object, and we impose the dba value in the conceptual flag-panel for the case of a noun or the person of a verb. We assume the input of a subject-verb-object (SVO) Latin sentence, but in no particular word-order, possibly verb-subject-object or some other variation. However, because we have captured the SVO time-points for access ad libitum to the subject, the verb or the object, we may retroactively insert associative tags into any one of the SVO words.
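
Here is a rough JavaScript sketch of that word-order-independent capture of time-points; the part-of-speech and case codes (pos 5 for a noun, pos 8 for a verb, dba 1 for nominative, dba 4 for accusative) are assumptions made for the example, and the real LaParser() may organize the work quite differently:

    // Classify each incoming word by its flag-panel values rather than by its
    // position, and record its time-point so that subject, verb and object
    // can be reached later in any word-order.
    function captureTimePoints(sentence) {
      var t = { subject: 0, verb: 0, object: 0 };
      sentence.forEach(function (word) {
        if (word.pos === 8) {                           // a verb, wherever it occurs
          t.verb = word.time;
        } else if (word.pos === 5 && word.dba === 1) {  // nominative noun -> subject
          t.subject = word.time;
        } else if (word.pos === 5 && word.dba === 4) {  // accusative noun -> direct object
          t.object = word.time;
        }
      });
      return t;
    }

    // A verb-subject-object input still yields the same three time-points:
    var vso = [
      { time: 41, pos: 8, dba: 3 },   // the verb, e.g. DAT
      { time: 42, pos: 5, dba: 1 },   // a nominative subject
      { time: 43, pos: 5, dba: 4 }    // an accusative direct object
    ];
    // captureTimePoints(vso) -> { subject: 42, verb: 41, object: 43 }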

We are finding it difficult to obtain the dba value for the input of a subject-word in the nominative case. It seems that there are basically two ways to obtain the dba value. One would be to intercept the Latin word at the moment of its recognition in the auditory memory channel, where the time of recognition (“tor”?) is the same time-point as that of the concept of the word stored in the Psy conceptual memory. The other way to obtain the dba would be to test for the full spelling of the Latin pronouns in the OldConcept module by using b16 and the other OutBuffer variables.

The problem with using the time-of-recognition value from AudRecog is that each AI Mind is looking to recognize only the concept-number of an input-word, and not necessarily the specific form of the word that happens to be in the nominative case or some other case. So we should probably use the functionality of the OutBuffer module to obtain the dba value of a Latin word. And we may test for the complete spelling of any particular Latin pronoun.
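
A small JavaScript sketch of that pronoun-spelling test follows, assuming the word arrives as a complete string (in the actual code it would be assembled from b16 and the other OutBuffer variables) and assuming a case numbering of 1 for nominative, 3 for dative and 4 for accusative, which may not match the dba scheme actually used in the AI Minds:

    // Look up the full spelling of a Latin personal pronoun to obtain its
    // case as a dba value; the table and the numeric codes are illustrative.
    var pronounCase = {
      "EGO": 1, "TU": 1,      // nominative -> subject
      "MIHI": 3, "TIBI": 3,   // dative -> indirect object
      "ME": 4, "TE": 4        // accusative -> direct object
    };

    function pronounDba(word) {
      var w = word.toUpperCase();
      return pronounCase.hasOwnProperty(w) ? pronounCase[w] : 0;  // 0 = not a known pronoun
    }

    // pronounDba("ego") -> 1 (nominative)
    // pronounDba("te")  -> 4 (accusative)
    // pronounDba("dat") -> 0 (not a pronoun; fall back to the ending tests)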

Perhaps in the LaParser module we should call InStantiate early and then work various transformations retroactively upon the concepts already instantiated. So we move the call upwards, and at first we keep getting bad results in the storage of Latin words as concepts. Then we start commenting out the retroactive transformations being worked in the LaParser module, and suddenly we get such good results that we decide to save the Abra011A.html local version as too valuable to risk corrupting with further coding. The plan is to rename the somewhat working, saved version with a new version number and to improve on the AI functionality. Although the Mens Latina does not yet store all the necessary associative tags for each concept in a subject-verb-object (SVO) input, what we have now assigns the proper dba tag to the subject, the verb and the object in whatever word-order the input arrives.

Now we need to code the retroactive assignment of associative tags that link concepts together for the purpose of Natural Language Understanding (NLU). In the InStantiate module we capture each concept being instantiated as the subject, the verb and the direct object so that in the LaParser module we may retroactively insert tags for the comprehension of Latin input. We achieve the ability of the Mens Latina to understand the input of a Latin sentence regardless of word order and to generate the same idea from memory in an arbitrary word-order.
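
As a closing illustration, here is a minimal JavaScript sketch of that retroactive linking, reusing the captured time-points; the pre and seq tag names echo the style of the English-speaking AI Minds but are assumptions in this sketch:

    // After the whole sentence is in memory, reach back through the stored
    // time-points and link subject, verb and object to one another.
    // Psy rows are simplified flag-panels with psi, pre and seq fields.
    function linkSvo(Psy, t) {
      var subj = t.subject ? Psy[t.subject] : null;
      var verb = t.verb    ? Psy[t.verb]    : null;
      var obj  = t.object  ? Psy[t.object]  : null;
      if (subj && verb) {
        subj.seq = verb.psi;   // the subject leads on to the verb concept
        verb.pre = subj.psi;   // the verb looks back to the subject concept
      }
      if (verb && obj) {
        verb.seq = obj.psi;    // the verb leads on to the object concept
        obj.pre  = verb.psi;   // the object looks back to the verb concept
      }
    }

    // Because the time-points were captured regardless of word-order, the same
    // call links an SVO, VSO or OVS input into the same subject-verb-object idea.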

Source: Artificial Intelligence on Medium
