Abstract | This paper considers how the kind of formal semantic objects used in TTR (a theory of types with records, Cooper, 2023) might be related to the vector representations used in Eliasmith (2013). An advantage of doing this is that it would immediately give us a neural representation for TTR objects, since Eliasmith relates vectors to neural activity in his semantic pointer architecture (SPA). This would be a convolution-based alternative to the suggestions made by Cooper (2019a), which were based on the phasing of neural activity. The project seems promising since all complex TTR objects are constructed from labelled sets (essentially sets of ordered pairs consisting of labels and values), which might be seen as corresponding to the representation of structured objects that Eliasmith achieves using superposition and circular convolution.
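The binding mechanism alluded to above can be sketched in the style of holographic reduced representations: each label is bound to its value by circular convolution, and the bound pairs are superposed into a single vector. The sketch below is only illustrative; the labels `x` and `e` and their values are hypothetical stand-ins for the fields of a TTR record, and the vectors are random Gaussian vectors rather than learned semantic pointers.

```python
import numpy as np

def bind(a, b):
    # Circular convolution via FFT: binds a label vector to a value vector.
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(trace, a):
    # Circular correlation (approximate inverse of bind): probing the
    # trace with a label recovers a noisy copy of its value.
    return np.real(np.fft.ifft(np.fft.fft(trace) * np.conj(np.fft.fft(a))))

rng = np.random.default_rng(0)
d = 1024
# Random unit-scale vectors for two hypothetical labels (x, e) and values.
x, e, ref, v = (rng.normal(0, 1 / np.sqrt(d), d) for _ in range(4))

# A record {x = ref, e = v} encoded as a superposition of bound pairs.
record = bind(x, ref) + bind(e, v)

# Unbinding with label x yields an approximation of ref; the cosine
# similarity with the original value should be well above chance.
recovered = unbind(record, x)
sim = recovered @ ref / (np.linalg.norm(recovered) * np.linalg.norm(ref))
print(sim > 0.3)
```

Because circular convolution keeps the dimensionality fixed, records with any number of label–value pairs compress into a single vector, at the cost of noise that grows with the number of superposed pairs; a clean-up memory is normally used to map the noisy recovered vector back to the nearest stored value.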