ASL-LEX is a database of lexical and phonological properties compiled for nearly 1,000 signs of American Sign Language (ASL).

It contains:
  • Frequency ratings
  • Iconicity ratings
  • Lexical properties (e.g., initialized signs, lexical class)
  • Phonological coding (6 features)
  • Neighborhood density calculations
  • English translation
  • Alternative translations (for a subset of signs)
  • Reference video clip metadata

Reference video clips may be viewed using the searchable web interface. To license these video clips, contact asllexproject@gmail.com.


ASL-LEX is available as a searchable web interface and as raw data in spreadsheet form.

  • Instructions for Using the Visualization

    This website provides a searchable, interactive visualization of the ~1,000 ASL signs contained in the ASL-LEX database.

  • Signs as Nodes

    In this visualization, each sign is represented as a node. Higher frequency signs have larger nodes, while lower frequency signs have smaller ones. Signs that are phonologically related (called phonological 'neighbors') are connected to each other with edges.

    Note on Phonological Relatedness

    The ASL-LEX visualization uses four primary phonological features to determine phonological relatedness. Because these four features do not provide a complete phonological description, some signs that are not intuitively very similar may nevertheless appear as neighbors, connected to each other in the network.
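    As a rough illustration of this kind of feature-based relatedness, two signs can be compared on a small set of coded feature values. The feature names and values below are hypothetical placeholders, not ASL-LEX's actual coding scheme, and the "match on all coded features" rule is only one plausible definition of neighborhood:

    ```python
    # Illustrative sketch only: the actual ASL-LEX neighbor definition may differ.
    # Here, two signs count as neighbors when they share a value on every coded
    # feature. Feature names are placeholders, not the database's coding scheme.
    FEATURES = ("sign_type", "location", "selected_fingers", "flexion")

    def are_neighbors(sign_a: dict, sign_b: dict) -> bool:
        """True when two signs share a value for every coded feature."""
        return all(sign_a[f] == sign_b[f] for f in FEATURES)

    # Toy signs with made-up feature values:
    apple = {"sign_type": "one-handed", "location": "head",
             "selected_fingers": "index", "flexion": "bent"}
    onion = {"sign_type": "one-handed", "location": "head",
             "selected_fingers": "index", "flexion": "bent"}
    water = {"sign_type": "one-handed", "location": "chin",
             "selected_fingers": "index_middle", "flexion": "straight"}

    print(are_neighbors(apple, onion))  # True: all four features match
    print(are_neighbors(apple, water))  # False: several features differ
    ```

    A coarse rule like this also shows why counterintuitive neighbors can arise: signs that differ in ways the four features do not capture will still match on every coded feature.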
  • Search

    To search for individual signs, enter the English translation of the sign in the Search tab of the navigation pane in the top left of the screen. If more than one sign or English translation matches what you have typed, you will see a list of words to select from. Once a sign has been selected, its node will be centered and information about the sign will appear in the Sign Data tab at the top left of the screen.

    Note on English Glosses (Translations)

    The primary purposes of the glosses used in this database (called “EntryID” glosses in the database) are to uniquely identify each sign in the database and to provide a convenient way to search for signs. While these glosses were selected to evoke the meaning of the signs, they may not be accurate translations as meanings can change with context.
  • Navigation

    Users can zoom in and out of the visualization using the + and - magnifying glass icons at the right side of the screen. Users can also pan around the visualization by clicking anywhere and dragging. Clicking on a node will center the visualization on that node and will cause information about that sign to appear in the Sign Data tab. The visualization may be reset to its original form by clicking the empty magnifying glass.

  • Filters

    In addition to searching for specific signs, users can filter signs by any of the properties listed in the database. This allows users to find all signs matching a particular set of properties, for example all signs with Frequency > 5. To filter by property, click on the Filter tab and specify the desired values for the desired properties. Information about individual properties can be accessed by clicking on the About Filters link at the top of the Filter tab.

    Once filter values are set, only the nodes that match those properties will remain colored in the visualization; signs that do not match the criteria will have their nodes greyed out.

    To restore all of the signs, click the Remove Filters link at the top of the Filter tab.

  • Downloading data

    Once a node is selected, its lexical information is displayed in the Sign Data tab at the top left.

    To download data, click the download button at the top right corner of the visualization. Select the properties that you would like to be included and decide whether you would like these data for all signs or only the signs that meet the current Filter criteria.

Download Data

The ASL-LEX database is available for download in .csv format from the Open Science Framework.
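Once downloaded, the .csv file can be processed with standard tools. The sketch below uses Python's built-in csv module to apply a "Frequency > 5" filter, like the one described for the web visualization, to raw rows. The column names (EntryID, SignFrequency, Iconicity) and values here are assumptions for illustration; check the header row of the actual file:

```python
import csv
import io

# A tiny stand-in for the downloaded file; real ASL-LEX column names may
# differ. Check the header row of the actual .csv before adapting this.
sample = io.StringIO(
    "EntryID,SignFrequency,Iconicity\n"
    "apple,5.8,2.1\n"
    "water,6.4,4.9\n"
    "theory,3.2,1.5\n"
)

rows = list(csv.DictReader(sample))

# Keep only signs whose frequency rating exceeds 5.
frequent = [r["EntryID"] for r in rows if float(r["SignFrequency"]) > 5]
print(frequent)  # ['apple', 'water']
```

To run this against the real download, replace the StringIO stand-in with `open("path/to/asl-lex.csv", newline="")` (the filename here is hypothetical).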


The raw ASL-LEX data are available under a CC BY-NC license, meaning you can reuse and remix this content with attribution for non-commercial purposes. To cite it, please use the following:

Caselli, N., Sevcikova Sehyr, Z., Cohen-Goldberg, A. M., & Emmorey, K. (2017). ASL-LEX: A lexical database of American Sign Language. Behavior Research Methods, 49(2), 784-801. doi:10.3758/s13428-016-0742-0
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Publications and Presentations

Article Describing ASL-LEX

This article describes the procedures used to create the database, reports descriptive statistics for a number of sign properties, and reports analyses designed to more deeply understand how phonological, lexical, and semantic factors interact in the ASL lexicon.

  • Caselli, N., Sevcikova Sehyr, Z., Cohen-Goldberg, A. M., & Emmorey, K. (2017). ASL-LEX: A lexical database of American Sign Language. Behavior Research Methods, 49(2), 784-801. doi:10.3758/s13428-016-0742-0

Articles Citing ASL-LEX


  • Martinez, D. (2019). Immediate and long-term memory and their relation to crystallized and fluid intelligence. Intelligence, 76, 1-16. doi:10.1016/j.intell.2019.101382
  • Emmorey, K., Li, C., Petrich, J., & Gollan, T. H. (2019). Turning languages on and off: Switching into and out of code-blends reveals the nature of bilingual language control. Journal of Experimental Psychology: Learning, Memory, and Cognition. doi:10.1037/xlm0000734
  • Caselli, N. K., & Pyers, J. E. (2019). Degree and not type of iconicity affects sign language vocabulary acquisition. Journal of Experimental Psychology: Learning, Memory, and Cognition. doi:10.1037/xlm0000713
  • Farhana Thariq Ahmed, H., Ahmad, H., Phang, S. K., Vaithilingam, C. A., Harkat, H., & Narasingamurthi, K. (2019). Higher order feature extraction and selection for robust human gesture recognition using CSI of COTS Wi-Fi devices. Sensors, 19(13), 2959. doi:10.3390/s19132959
  • Lee, B., Meade, G., Midgley, K. J., Holcomb, P. J., & Emmorey, K. (2019). ERP evidence for co-activation of English words during recognition of American Sign Language signs. Brain Sciences, 9(6), 148. doi:10.3390/brainsci9060148
  • Haug, T., Ebling, S., Braem, P. B., Tissi, K., & Sidler-Miserez, S. (2019). Sign language learning and assessment in German Switzerland: Exploring the potential of vocabulary size tests for Swiss German Sign Language. Language Education & Assessment, 2(1), 20-40. doi:10.29140/lea.v2n1.85
  • Ortega, G., & Özyürek, A. (2019). Systematic mappings between semantic categories and types of iconic representations in the manual modality: A normed database of silent gesture. Behavior Research Methods, 1-17. doi:10.3758/s13428-019-01204-6


  • Jiang, D., Li, G., Sun, Y., Kong, J., & Tao, B. (2018). Gesture recognition based on skeletonization algorithm and CNN with ASL database. Multimedia Tools and Applications, 1-18. doi:10.1007/s11042-018-6748-0
  • Corina, D. P., & Lawyer, L. A. (2018). Language in Deaf Populations: Signed Language and Orthographic Processing. In S-A. Rueschemeyer & M. G. Gaskell (Eds). The Oxford Handbook of Psycholinguistics, 259-290.
  • Quandt, L. C., & Kubicek, E. (2018). Sensorimotor characteristics of sign translations modulate EEG when deaf signers read English. Brain and language, 187, 9-17. doi:10.1016/j.bandl.2018.10.001
  • Duarte, A., Camli, G., Torres, J., & Giró-i-Nieto, X. (2018). Towards speech to sign language translation. In ECCV 2018 Workshop on Shortcomings in Vision and Language.
  • MacDonald, K., LaMarr, T., Corina, D., Marchman, V. A., & Fernald, A. (2018). Real‐time lexical comprehension in young children learning American Sign Language. Developmental Science, 21, e12672. doi:10.1111/desc.12672
  • Perlman, M., Little, H., Thompson, B., & Thompson, R. L. (2018). Iconicity in Signed and Spoken Vocabulary: A comparison between American Sign Language, British Sign Language, English, and Spanish. Frontiers in Psychology, 9, 1433. doi:10.3389/fpsyg.2018.01433
  • Meade, G., Lee, B., Midgley, K. J., Holcomb, P. J., & Emmorey, K. (2018). Phonological and semantic priming in American Sign Language: N300 and N400 effects. Language, cognition and neuroscience, 33(9), 1092-1106. doi:10.1080/23273798.2018.1446543
  • Henner, J., Novogrodsky, R., Reis, J., & Hoffmeister, R. (2018). Recent Issues in the use of Signed Language Assessments for Diagnosis of Language Disorders in Signing Deaf and Hard of Hearing children. The Journal of Deaf Studies and Deaf Education, 23(4), 307–316. doi:10.1093/deafed/eny014
  • Ma, Y., Zhou, G., Wang, S., Zhao, H., & Jung, W. (2018). SignFi: Sign Language Recognition Using WiFi. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(1), 23.
  • Fonseca, S. R. D., Fontes, A. B. A. D. L., & Finger, I. (2018). Construction of a translation recognition task in Libras-Portuguese: methodological considerations. Letras de Hoje, 53(1), 89-99. doi:10.15448/1984-7726.2018.1.28964


  • Caselli, N. & Pyers, J. (2017). The road to language learning is not entirely iconic: Iconicity, neighborhood density, and frequency facilitate sign language acquisition. Psychological Science, 28(7), 979–987. doi:10.1177/0956797617700498, preprint: https://open.bu.edu/handle/2144/20655
  • Hall, K. C., Mackie, S., Fry, M., & Tkachman, O. (2017). SLPAnnotator: Tools for Implementing Sign Language Phonetic Annotation. In INTERSPEECH, p. 2083-2087.
  • Henner, J., Hoffmeister, R., & Reis, J. (2017). Developing sign language measurements for research with Deaf populations. In S. Cawthon & C. L. Garberoglio (Eds.) Research in Deaf Education: Contexts, Challenges, and Considerations, 141-160.
  • Williams, J. T., Stone, A., & Newman, S. D. (2017). Operationalization of sign language phonological similarity and its effects on lexical access. The Journal of Deaf Studies and Deaf Education, 22(3), 303-315.
  • Magid, R. W. & Pyers, J. E. (2017). "I use it when I see it": The role of development and experience in Deaf and hearing children's understanding of iconic gesture. Cognition, 162, 73-86. doi:10.1016/j.cognition.2017.01.015
  • Emmorey, K., Mehta, S., McCullough, S., & Grabowski, T. J. (2016). The neural circuits recruited for the production of signs and fingerspelled words. Brain and Language, 160, 30-41. doi:10.1016/j.bandl.2016.07.003

Please let us know if you cite or otherwise make use of ASL-LEX in your research/teaching!



Contact Us

To get in touch, please feel free to e-mail us.

To keep up to date, subscribe to our e-list.

ASL-LEX 2.0 Preview

In Fall 2019 we will release ASL-LEX 2.0, an updated version of the database. ASL-LEX 2.0 will include more signs (~2,700 total), more information about each sign, and new ways to visualize the lexicon.

Please note: this preview is a (rough!) work in progress. It does not yet include all of the planned features, and it may be slow, or may not work at all, in some browsers.

We recommend using Chrome.