Journal Article | Open Access | Published August 30, 2022 | Supplemental Material + Published

Optimizing the human learnability of abstract network representations

Abstract

Precisely how humans process relational patterns of information in knowledge, language, music, and society is not well understood. Prior work in the field of statistical learning has demonstrated that humans process such information by building internal models of the underlying network structure. However, these mental maps are often inaccurate due to limitations in human information processing. The existence of such limitations raises clear questions: Given a target network that one wishes for a human to learn, what network should one present to the human? Should one simply present the target network as-is, or should one emphasize certain parts of the network to proactively mitigate expected errors in learning? To investigate these questions, we study the optimization of network learnability in a computational model of human learning. Evaluating an array of synthetic and real-world networks, we find that learnability is enhanced by reinforcing connections within modules or clusters. In contrast, when networks contain significant core–periphery structure, we find that learnability is best optimized by reinforcing peripheral edges between low-degree nodes. Overall, our findings suggest that the accuracy of human network learning can be systematically enhanced by targeted emphasis and de-emphasis of prescribed sectors of information.
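The abstract's central intervention, reinforcing connections within modules so that a learner's internal model is more accurate, can be illustrated with a minimal sketch. This is not the authors' code: the graph, the module labels, and the `boost` factor below are hypothetical, and the sketch shows only the generic idea of up-weighting within-module edges and renormalizing each row into the transition probabilities one would present to a learner.

```python
# Illustrative sketch (not the published implementation): emphasize
# within-module edges of a weighted graph, then renormalize each row
# so that it forms a probability distribution over transitions.

def emphasize_modules(adj, module, boost=2.0):
    """Multiply the weight of every within-module edge by `boost`,
    then row-normalize into transition probabilities."""
    out = {}
    for u, nbrs in adj.items():
        weights = {v: w * (boost if module[u] == module[v] else 1.0)
                   for v, w in nbrs.items()}
        total = sum(weights.values())
        out[u] = {v: w / total for v, w in weights.items()}
    return out

# Toy network: two triangles (modules 0 and 1) joined by one bridge
# edge between nodes "a" and "d".
adj = {
    "a": {"b": 1, "c": 1, "d": 1},
    "b": {"a": 1, "c": 1},
    "c": {"a": 1, "b": 1},
    "d": {"a": 1, "e": 1, "f": 1},
    "e": {"d": 1, "f": 1},
    "f": {"d": 1, "e": 1},
}
module = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1, "f": 1}

P = emphasize_modules(adj, module, boost=2.0)
# From "a", within-module transitions now dominate the bridge:
# P["a"]["b"] == P["a"]["c"] == 0.4 and P["a"]["d"] == 0.2
```

The design choice mirrors the finding stated above: raising the relative frequency of within-cluster transitions de-emphasizes cross-cluster edges, which is the kind of targeted emphasis the abstract argues improves learnability (whereas for core-periphery networks the paper reports the opposite, reinforcing peripheral edges).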

Additional Information

© 2022 the Author(s). Published by PNAS. This open access article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).

We thank Christopher Kroninger for feedback on earlier versions of this manuscript. We also thank Pixel Xia for useful discussions on Sierpiński graph construction. This work was supported by the Army Research Office (DCIST-W911NF-17-2-0181) and the National Institute of Mental Health (1-R21-MH-124121-01). D.S.B. acknowledges additional support from the John D. and Catherine T. MacArthur Foundation, the Institute for Scientific Interchange Foundation, NSF CAREER Award PHY-1554488, and the Center for Curiosity. The content is solely the responsibility of the authors and does not necessarily represent the official views of any of the funding agencies.

Data, Materials, and Software Availability. Previously published data from GitHub (https://github.com/nhchristianson/Math-text-semantic-networks) were used for this work. All code and data used in the analyses have been made publicly available at https://github.com/wqian0/OptimizedGraphLearning (60).

Citation Diversity Statement. Recent work in several fields of science has identified a bias in citation practices, such that papers from women and other minorities are undercited relative to the number of such papers in the field (53–58). Here, we sought to proactively choose references that reflect the diversity of the field in thought, form of contribution, gender, and other factors. We obtained the predicted gender of the first and last author of each reference by using databases that store the probability of a name being carried by a woman (53, 59). By this measure (and excluding self-citations to the first and last authors of our current paper), our references contain 21.05% woman(first)/woman(last), 7.89% man/woman, 14.17% woman/man, and 56.88% man/man. This method is limited in that 1) names, pronouns, and social media profiles used to construct the databases may not, in every case, be indicative of gender identity; and 2) it cannot account for intersex, nonbinary, or transgender people. We look forward to future work that could help us to better understand how to support equitable practices in science.

The authors declare no competing interest.

Attached Files

Published - pnas.202121338.pdf

Supplemental Material - pnas.2121338119.sapp.pdf

Files (3.6 MB)

pnas.202121338.pdf: 2.7 MB (md5:0ceb3961533f884a2db7eca667290728)
pnas.2121338119.sapp.pdf: 921.9 kB (md5:df27bf8b1d58cf704681165d21e78cc7)

Additional details

Created: August 22, 2023
Modified: October 24, 2023