Product Description
The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the computational efficiency of neurons, with special reference to visual perception and the efficient coding hypothesis. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural processing; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, and a list of annotated Further Readings, this book is an ideal introduction to the principles of neural information theory.