Cognitive Psychology, Chapter 4
Feature Nets and Word Recognition
Question | Answer |
---|---|
Simple → complex | Bottom-up processing |
Complex → simple | Top-down processing |
What is the order of detectors, from simple to complex? | Feature detectors → letter detectors → word detectors |
Define activation level | A detector's current level of excitement; every input to a detector increases its activation level |
What happens when the response threshold is reached? | The detector fires |
T or F: Detectors are complex assemblies of neural tissue, not single neurons or small groups of neurons | T |
Define frequency | Frequent firing results in a higher starting activation level (the "exercise" effect) |
Define recency | Recent firing results in a higher starting activation level (the "warm-up" effect) |
T or F: Repetition priming boosts the respective detector | T |
T or F: Word commonality (frequency) does not boost the respective detector | F |
Low frequency = | An unfamiliar word |
High frequency = | A familiar word |
T or F: Well-formed nonwords activate the bigram layer | T |
Define bigram | A two-letter combination |
Where are bigram detectors on the simple → complex scale? | Below the word detectors, above the letter detectors |
T or F: People cannot recognize nonwords at all | F |
What is the cost of the net's efficiency? | Speed trades off against accuracy: no matter how fast you answer, the answer may be incorrect |
What is the upside of feature nets? | They help us resolve unclear or degraded inputs quickly |
What is a downside of feature nets? | They can "autocorrect" an input to the wrong word |
Define distributed knowledge | Knowledge that is not locally represented; feature nets contain distributed knowledge that comes from experience |
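The detector mechanism in the cards above (activation level, response threshold, frequency/recency effects) can be sketched in code. This is a minimal illustration assuming a simple additive activation model; the class, parameter names, and numbers are made up for the example, not taken from the textbook.

```python
# Illustrative sketch of a feature-net detector (assumed additive model).

class Detector:
    def __init__(self, name, starting_activation=0.0, threshold=1.0):
        self.name = name
        # Frequency and recency effects: a frequently or recently fired
        # detector begins with a higher starting activation level.
        self.activation = starting_activation
        self.threshold = threshold  # response threshold

    def receive_input(self, amount):
        """Every input to the detector increases its activation level."""
        self.activation += amount

    def fires(self):
        """The detector fires once activation reaches the response threshold."""
        return self.activation >= self.threshold


# A high-frequency (familiar) word starts closer to threshold than a
# low-frequency (unfamiliar) word, so the same input makes it fire sooner.
familiar = Detector("take", starting_activation=0.6)
unfamiliar = Detector("juke", starting_activation=0.0)

familiar.receive_input(0.5)    # 0.6 + 0.5 = 1.1, at or above threshold
unfamiliar.receive_input(0.5)  # 0.0 + 0.5 = 0.5, below threshold

print(familiar.fires())    # True  (recognized)
print(unfamiliar.fires())  # False (needs more input)
```

This illustrates why familiar words are recognized faster: their detectors need less input to cross the response threshold.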
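The bigram-layer cards can also be sketched: a well-formed nonword activates bigram detectors even though no word detector matches it, which is why such nonwords feel pronounceable. The bigram set below is a small illustrative subset, not a real frequency table.

```python
# Sketch of a bigram layer: detectors for common two-letter combinations.
# The set here is an illustrative subset of frequent English bigrams.
COMMON_BIGRAMS = {"TH", "HE", "IN", "ER", "AN", "RE", "CE", "HI", "IC"}

def bigrams(word):
    """Split a word into its two-letter combinations."""
    return [word[i:i + 2] for i in range(len(word) - 1)]

def bigram_support(word):
    """Count how many bigram detectors the word would activate."""
    return sum(1 for bg in bigrams(word) if bg in COMMON_BIGRAMS)

# A well-formed nonword ("HICE") activates bigram detectors;
# an ill-formed string ("HZYQ") activates none.
print(bigram_support("HICE"))  # HI, IC, CE -> 3
print(bigram_support("HZYQ"))  # -> 0
```

Because the well-formed nonword gets support at the bigram layer, it is processed partway up the hierarchy, which fits the card stating that people can partially recognize nonwords.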