Kernel: Python 3
# Exemplar Example

##### Exemplar vectors: Distributed representations
- Lexical identity: N units = number of distinct lexical items (see parameters nHF_words, nLF_words below)
- Place of articulation: 2 units (labial vs. velar)
- Voicing: 2 units (voiced vs. voiceless)
- VOT: 1 unit

##### Probe vectors
Same as exemplars, except with no specification for VOT

##### Constraints
Defined over a vector of exemplar activations:
- Faithfulness: +1 for each matching element in the distributed representation, weighted by the activation of the exemplar; summed over all exemplars
- Quantization harmony:
- Unit harmony:

##### Output
- Optimal blend of exemplars
- Predicted VOT: weighted average of exemplar VOTs (see the encoding and prediction sketches below)
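The source of cells In [10] and In [11] is not preserved in this export, so the block below is only a minimal sketch of how such exemplar and probe vectors might be constructed. Everything in it is an assumption chosen to mirror the matrix printed in Out[11] rather than the notebook's actual code: the names (`make_exemplar`, `make_probe`, `N_WORDS`, `E`), the choice of 9 lexical items, the word-to-place assignment, and the VOT means.

```python
import numpy as np

# Assumed layout, matching the 14-column matrix in Out[11]:
# 9 lexical-identity units + 2 place units (labial, velar)
# + 2 voicing units (voiced, voiceless) + 1 VOT value (ms).
N_WORDS = 9

def make_exemplar(word, place, voicing, vot):
    """One exemplar vector. place: 0 = labial, 1 = velar;
    voicing: 0 = voiced, 1 = voiceless."""
    v = np.zeros(N_WORDS + 5)
    v[word] = 1.0                    # lexical identity (one-hot)
    v[N_WORDS + place] = 1.0         # place of articulation
    v[N_WORDS + 2 + voicing] = 1.0   # voicing
    v[-1] = vot                      # VOT
    return v

def make_probe(word, place, voicing):
    """Probe vector: same as an exemplar, but with no VOT specified."""
    return make_exemplar(word, place, voicing, vot=0.0)[:-1]

# A toy exemplar memory in the spirit of Out[11]: 5 exemplars each for the
# high-frequency words 0-3 (0-1 labial, 2-3 velar), 1 exemplar each for the
# low-frequency words 4-8; all voiceless, with VOTs drawn around rough means.
rng = np.random.default_rng(0)
rows = []
for word in range(N_WORDS):
    place = 0 if word in (0, 1, 4, 5, 8) else 1
    n_tokens = 5 if word < 4 else 1
    mean_vot = 70.0 if place == 0 else 78.0
    for _ in range(n_tokens):
        rows.append(make_exemplar(word, place, voicing=1,
                                  vot=rng.normal(mean_vot, 5.0)))
E = np.vstack(rows)                  # 25 x 14, analogous to Out[11]
```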
In [10]:
In [11]:
Out[11]:
[[ 1.  0.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  62.77551111]
 [ 1.  0.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  72.84136104]
 [ 1.  0.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  75.21589845]
 [ 1.  0.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  69.7088384 ]
 [ 1.  0.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  71.31556986]
 [ 0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  61.30914024]
 [ 0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  68.24221288]
 [ 0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  69.6642167 ]
 [ 0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  59.23671182]
 [ 0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  68.12493114]
 [ 0.  0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  1.  79.0174813 ]
 [ 0.  0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  1.  74.28347632]
 [ 0.  0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  1.  71.85512138]
 [ 0.  0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  1.  79.75213782]
 [ 0.  0.  1.  0.  0.  0.  0.  0.  0.  0.  1.  0.  1.  82.37959887]
 [ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.  1.  0.  1.  75.86342071]
 [ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.  1.  0.  1.  71.26733935]
 [ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.  1.  0.  1.  80.0526634 ]
 [ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.  1.  0.  1.  86.87251432]
 [ 0.  0.  0.  1.  0.  0.  0.  0.  0.  0.  1.  0.  1.  84.02777846]
 [ 0.  0.  0.  0.  1.  0.  0.  0.  0.  1.  0.  0.  1.  60.45854098]
 [ 0.  0.  0.  0.  0.  1.  0.  0.  0.  1.  0.  0.  1.  56.86387184]
 [ 0.  0.  0.  0.  0.  0.  1.  0.  0.  0.  1.  0.  1.  82.37533183]
 [ 0.  0.  0.  0.  0.  0.  0.  1.  0.  0.  1.  0.  1.  78.92324344]
 [ 0.  0.  0.  0.  0.  0.  0.  0.  1.  1.  0.  0.  1.  68.42341449]]
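Each row of the matrix above is one stored exemplar: nine lexical-identity units, two place units, two voicing units, and a final VOT value. The cells that turn this matrix into the activations shown in Out[14] are not preserved, and the quantization- and unit-harmony terms are left unspecified in the overview, so the following is only a hedged sketch of the Faithfulness and output steps under the assumptions of the earlier block: resonance counts the active units a probe shares with an exemplar, some monotone function of resonance serves as the exemplar's activation, and the predicted VOT is the activation-weighted average of the exemplar VOTs.

```python
def resonance(probe, exemplars):
    """Faithfulness-style match score: number of shared active units in the
    distributed part (lexical identity + place + voicing) of each exemplar."""
    return exemplars[:, :-1] @ probe          # the probe has no VOT slot

def predict_vot(probe, exemplars, act_fn=lambda r: r):
    """Predicted VOT = activation-weighted average of exemplar VOTs.
    act_fn maps resonance to activation; the identity map used by default is
    a placeholder, not the notebook's actual harmony dynamics."""
    act = act_fn(resonance(probe, exemplars))
    return float(act @ exemplars[:, -1] / act.sum())

# Example: probe for word 0 (labial, voiceless) against the toy memory E.
print(predict_vot(make_probe(0, 0, 1), E))
```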
In [12]:
In [13]:
In [14]:
Out[14]:
Activation of exemplars:
[ 0.41338741 0.41338741 0.4133871 0.41338715 0.41338715 0.20645068
0.20645084 0.20645068 0.20645085 0.20645145 0.10291774 0.10291774
0.10291694 0.10291677 0.10291774 0.10291774 0.10291729 0.10291677
0.10291797 0.10291774 0.20645072 0.20645035 0.10291775 0.10291718
0.20645153]
Predicted VOT: 67.6920858634
Activation of exemplars scaled
[ 1. 1. 0.99999925 0.99999935 0.99999935 0.4994121
0.4994125 0.4994121 0.49941252 0.49941398 0.24896196 0.24896196
0.24896003 0.2489596 0.24896196 0.24896196 0.24896086 0.2489596
0.24896252 0.24896196 0.49941219 0.49941132 0.24896198 0.2489606
0.49941415]
Resonance (=similarity) scaled
[ 1. 1. 1. 1. 1. 0.66666667
0.66666667 0.66666667 0.66666667 0.66666667 0.33333333 0.33333333
0.33333333 0.33333333 0.33333333 0.33333333 0.33333333 0.33333333
0.33333333 0.33333333 0.66666667 0.66666667 0.33333333 0.33333333
0.66666667]
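One observation about the printed values: the scaled resonance takes only the values 1, 2/3, and 1/3, which is the fraction of matching feature fields between the probe and each exemplar (same word: lexical identity, place, and voicing all match; other word with the same place: place and voicing match; other place: only voicing matches). The scaled activations drop off faster than resonance does, so same-word exemplars dominate the blend. Continuing the illustrative sketch above (with `E`, `make_probe`, and `resonance` as defined there, not the notebook's own objects):

```python
# Probe for word 0 (labial, voiceless) against the toy memory E and rescale.
res = resonance(make_probe(0, 0, 1), E)
print(res / res.max())   # 1, 2/3, or 1/3 per exemplar, mirroring the pattern above
```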
In [15]:
Out[15]:
HF labial mean VOT: 67.4280655322
LF labial VOT: 67.1264705831
Novel labial VOT: 67.5333359583
HF velar VOT: 77.7034552416
LF velar VOT: 77.6256811169
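The category-level numbers above look like averages of the predicted VOT over probes of each word type. Under the same assumptions as the sketches above, and with the (assumed) word grouping of the toy memory `E`, they could be computed roughly as follows; how the notebook encodes a "novel" probe is also an assumption here (no lexical unit active, only place and voicing).

```python
# Hypothetical grouping: HF labial = words 0-1, HF velar = words 2-3,
# LF labial = words 4-5, LF velar = words 6-7.
hf_labial = np.mean([predict_vot(make_probe(w, 0, 1), E) for w in (0, 1)])
lf_labial = np.mean([predict_vot(make_probe(w, 0, 1), E) for w in (4, 5)])
novel_probe = np.concatenate([np.zeros(N_WORDS), [1.0, 0.0, 0.0, 1.0]])
novel_labial = predict_vot(novel_probe, E)   # labial, voiceless, no lexical unit
hf_velar = np.mean([predict_vot(make_probe(w, 1, 1), E) for w in (2, 3)])
lf_velar = np.mean([predict_vot(make_probe(w, 1, 1), E) for w in (6, 7)])
print(hf_labial, lf_labial, novel_labial, hf_velar, lf_velar)
```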
In [16]:
In [17]:
Out[17]:
HF labial mean VOT: 116.476592049
Change: 49.0485265166
LF labial VOT: 117.758528675
Change: 50.6320580915
Novel labial VOT: 116.129849704
Change: 48.5965137456
HF velar VOT: 116.14059377
Change: 38.4371385289
LF velar VOT: 117.097122993
Change: 39.471441876