Vision is one of the most active areas in biomedical research, and visual psychophysical techniques are a foundational methodology for this research enterprise. Visual psychophysics, which studies the relationship between the physical world and human behavior, is a classical field of study with widespread applications in modern vision science.

Bridging the gap between theory and practice, this textbook provides a comprehensive treatment of visual psychophysics, teaching not only basic techniques but also sophisticated data-analysis methodologies and theoretical approaches. It begins with practical information about setting up a vision lab and goes on to discuss the creation, manipulation, and display of visual images; timing and integration of displays with measurements of brain activity and other relevant techniques; experimental designs; estimation of behavioral functions; and examples of psychophysics in applied and clinical settings.

The book's treatment of experimental designs presents the most commonly used psychophysical paradigms, theory-driven psychophysical experiments, and the analysis of these procedures in a signal-detection-theory framework. It discusses the theoretical underpinnings of data analysis and scientific interpretation, presenting data-analysis techniques that include model fitting, model comparison, and a general framework for optimized adaptive testing methods. It includes many sample programs in MATLAB with functions from Psychtoolbox, a free toolbox for real-time experimental control. Once students and researchers have mastered the material in this book, they will have the skills to apply visual psychophysics to cutting-edge vision science.
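To give a flavor of the signal-detection-theory analyses the blurb mentions, here is a minimal sketch of the standard sensitivity index d′ computed from hit and false-alarm rates. This is the textbook-standard formula, not code from the book itself, and it is written in Python rather than the book's MATLAB/Psychtoolbox:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d' = z(H) - z(F), the standard signal-detection
    measure of discriminability, where z is the inverse standard-normal CDF.
    Rates must lie strictly between 0 and 1."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Example: an observer with 80% hits and 20% false alarms
print(round(d_prime(0.8, 0.2), 3))  # → 1.683
```

Larger d′ means better discrimination of signal from noise; d′ = 0 (e.g. equal hit and false-alarm rates) means chance performance.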
| Published | 2013 |
| Authors | Barbara Dosher, Zhong-Lin Lu |
| Publisher | University Press Group Ltd |
| Pages | 464 |
| Dimensions | 17.8 cm × 22.9 cm × 2.7 cm |
| Weight | 1,016 g |
| Supplier | Bertram Trading Ltd |
| Subjects | Neurology & clinical neurophysiology; Physiology; Physiological & neuro-psychology, biopsychology |