Space-Time Computing with Temporal Neural Networks

Format:

Paperback / softback
Unavailable

Description

Understanding and implementing the brain's computational paradigm is the one true grand challenge facing computer researchers. Not only are the brain's computational capabilities far beyond those of conventional computers, but its energy efficiency is also truly remarkable. This book, written from the perspective of a computer designer and targeted at computer researchers, is intended both to give background and to lay out a course of action for studying the brain's computational paradigm. It contains a mix of concepts and ideas drawn from computational neuroscience, combined with those of the author.

As background, relevant biological features are described in terms of their computational and communication properties. The brain's neocortex is constructed of massively interconnected neurons that compute and communicate via voltage spikes, and a strong argument can be made that precise spike timing is an essential element of the paradigm.

Drawing from these biological features, a mathematics-based computational paradigm is constructed. Its key feature is spiking neurons that perform communication and processing in space-time, with an emphasis on time: time is treated as a freely available resource for both communication and computation. Neuron models are discussed in general, and one is chosen for detailed development. Using that model, single-neuron computation is explored first: neuron inputs are encoded as spike patterns, and the neuron is trained to identify similarities among input patterns.

Individual neurons are the building blocks for constructing larger ensembles, referred to as "columns". These columns are trained in an unsupervised manner and operate collectively to perform the basic cognitive function of pattern clustering: similar input patterns are mapped to a much smaller set of similar output patterns, thereby dividing the input patterns into identifiable clusters. Larger cognitive systems are formed by combining columns into a hierarchical architecture. These higher-level architectures are the subject of ongoing study, and progress to date is described in detail in later chapters. Simulation plays a major role in model development, and the simulation infrastructure developed by the author is also described.
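To make the temporal idea concrete, here is a minimal Python sketch of the general flavor of computation the description refers to. It is an illustration only, not the book's actual neuron model: the step-response neuron, the discrete time grid, the hand-picked weights, and the `neuron_response`/`column_response` helpers are all assumptions introduced for this example.

```python
# Toy temporal ("space-time") neuron: values are encoded as spike TIMES
# (smaller = earlier = stronger), and the neuron's answer is also a time.
# A hypothetical simplification for illustration; not the book's model.

def neuron_response(spike_times, weights, threshold, t_max=16):
    """Each input spike at time s contributes a step of height w from
    time s onward; the neuron fires at the first time step where the
    summed potential reaches threshold. Returns the output spike time,
    or None if the neuron never fires."""
    for t in range(t_max):
        potential = sum(w for s, w in zip(spike_times, weights) if s <= t)
        if potential >= threshold:
            return t  # earlier output spike = closer match to the weights
    return None

def column_response(spike_times, weight_matrix, threshold):
    """Toy 'column': several neurons see the same inputs, and winner-
    take-all inhibition keeps only the earliest output spike. Similar
    inputs tend to pick the same winner -- a crude form of the pattern
    clustering described above."""
    times = [neuron_response(spike_times, w, threshold) for w in weight_matrix]
    fired = [(t, i) for i, t in enumerate(times) if t is not None]
    return min(fired) if fired else None  # (spike time, winning neuron)

if __name__ == "__main__":
    # Two "learned" patterns: neuron 0 favors early spikes on lines 0-1,
    # neuron 1 favors early spikes on lines 2-3 (weights are hand-picked).
    weight_matrix = [[1.0, 1.0, 0.2, 0.2],
                     [0.2, 0.2, 1.0, 1.0]]
    print(column_response([0, 1, 7, 7], weight_matrix, threshold=1.5))  # (1, 0)
    print(column_response([7, 7, 0, 1], weight_matrix, threshold=1.5))  # (1, 1)
```

The point of the toy is the encoding: both the inputs and the output are spike times, so "how strongly a pattern matches" is expressed as "how early the neuron fires", with time itself doing the computational work.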

Author Biography:

James E. Smith is Professor Emeritus in the Department of Electrical and Computer Engineering at the University of Wisconsin-Madison. He received his Ph.D. from the University of Illinois in 1976, then joined the faculty of the University of Wisconsin-Madison, teaching and conducting research first in fault-tolerant computing and then in computer architecture. In 1979, he took a leave of absence to work for the Control Data Corporation in Arden Hills, MN, participating in the design of the CYBER 180/990.

While at Control Data, and after returning to the University of Wisconsin in 1981, he studied several aspects of high-performance pipelined processors. This work included the development of dynamic history-based branch predictors, instruction issuing methods, and techniques for providing precise interrupts that are widely used today. From 1984 to 1989, he was principal architect and a logic designer for the ACA ZS-1, a scientific computer employing a dynamically scheduled, superscalar processor architecture. In 1989, Dr. Smith joined Cray Research and headed a small research team that participated in the development and analysis of future supercomputer architectures. This work focused on advanced vector processor implementations, high-bandwidth memory systems, and interconnection networks.

In 1994, he rejoined the Department of ECE at the University of Wisconsin, where his research was directed at new paradigms for exploiting instruction-level parallelism, using the virtual machine abstraction as a technique for providing high performance through co-design and tight coupling of hardware and software. In 2007, he retired from Wisconsin and then conducted research in industry for four years, first at Google, then at Intel. He received the 1999 ACM/IEEE Eckert-Mauchly Award for contributions to computer architecture. Currently, he is studying new neuron-based computing paradigms at home along the Clark Fork near Missoula, Montana.
Release date Australia
May 30th, 2017
Author
James E. Smith
Audience
  • General (US: Trade)
Contributor
  • Series edited by Margaret Martonosi
Pages
215
Dimensions
191 x 235 x 13 mm
ISBN-13
9781627059480
Product ID
26843998
