Individual results
View in-depth performance of a single language model on a single test suite.
Region-by-region surprisal
Sample item for Negative Polarity Licensing (ever; with object relative clause)
The first item of the test suite is shown below for quick reference. Please visit the page for Negative Polarity Licensing (ever; with object relative clause) to see the full list of items.
| Item | Condition | Licensor | np | compl | rc_dp | rc_subj | rc_verb | has | npi | continuation |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | neg_pos | No | author | that | the | senators | liked | has | ever | been popular |
| 1 | neg_neg | No | author | that | no | senators | liked | has | ever | been popular |
| 1 | pos_pos | The | author | that | the | senators | liked | has | ever | been popular |
| 1 | pos_neg | The | author | that | no | senators | liked | has | ever | been popular |
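To make the region structure concrete, the sketch below shows one way region-by-region surprisal can be computed: a condition's sentence is the concatenation of its regions, and a region's surprisal is the sum of the surprisals of the tokens it spans. This is a simplified illustration under stated assumptions, not the code behind this page; `region_surprisals`, the whitespace tokenization, and the dummy per-token surprisal values are all hypothetical.

```python
# Minimal sketch: summing hypothetical per-token surprisals into region
# surprisals for one condition of item 1 (values are illustrative only).

from typing import Dict, List

def region_surprisals(regions: Dict[str, str],
                      token_surprisals: List[float]) -> Dict[str, float]:
    """Sum per-token surprisals within each region of one condition."""
    result = {}
    idx = 0
    for region_name, text in regions.items():
        n_tokens = len(text.split())          # assumes whitespace tokenization
        result[region_name] = sum(token_surprisals[idx:idx + n_tokens])
        idx += n_tokens
    return result

# Item 1, condition neg_pos, taken from the table above.
neg_pos = {
    "Licensor": "No", "np": "author", "compl": "that", "rc_dp": "the",
    "rc_subj": "senators", "rc_verb": "liked", "has": "has",
    "npi": "ever", "continuation": "been popular",
}
sentence = " ".join(neg_pos.values())
# Dummy surprisal values, one per whitespace token, for illustration only.
dummy = [3.1, 7.2, 2.0, 1.5, 6.8, 4.4, 2.9, 5.0, 3.3, 4.1]
print(sentence)
print(region_surprisals(neg_pos, dummy))
```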
Prediction performance for JRNN on Negative Polarity Licensing (ever; with object relative clause)
| Accuracy | Formula | Description |
|---|---|---|
| 92.11% | (591,neg_pos/8,npi) < (589,pos_pos/8,npi) | No description provided. |
| 60.53% | (590,neg_neg/8,npi) < (592,pos_neg/8,npi) | No description provided. |
| 0.00% | (591,neg_pos/8,npi) < (592,pos_neg/8,npi) | No description provided. |
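Each formula compares the surprisal of the `npi` region under two conditions, and the accuracy column reports how often the predicted inequality holds across items. The sketch below is a hedged illustration of that scoring logic under my own assumptions, not the evaluation code that produced this page; `prediction_accuracy` and the toy surprisal values are invented for the example.

```python
# Minimal sketch: scoring a prediction such as "(neg_pos, npi) < (pos_pos, npi)"
# as the percentage of items for which the surprisal inequality holds.

from typing import Dict

# Hypothetical layout: surprisals[item][condition][region] -> region surprisal.
Surprisals = Dict[int, Dict[str, Dict[str, float]]]

def prediction_accuracy(surprisals: Surprisals,
                        cond_lhs: str, cond_rhs: str,
                        region: str = "npi") -> float:
    """Percentage of items where the region surprisal in cond_lhs < cond_rhs."""
    hits = [
        item[cond_lhs][region] < item[cond_rhs][region]
        for item in surprisals.values()
    ]
    return 100.0 * sum(hits) / len(hits)

# Toy example with two items and made-up surprisal values.
toy: Surprisals = {
    1: {"neg_pos": {"npi": 4.2}, "pos_pos": {"npi": 9.7},
        "neg_neg": {"npi": 5.1}, "pos_neg": {"npi": 6.0}},
    2: {"neg_pos": {"npi": 3.8}, "pos_pos": {"npi": 8.9},
        "neg_neg": {"npi": 7.4}, "pos_neg": {"npi": 4.9}},
}
print(prediction_accuracy(toy, "neg_pos", "pos_pos"))  # 100.0
print(prediction_accuracy(toy, "neg_neg", "pos_neg"))  # 50.0
```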