Pre-event
Tutorial 1: Prof. Brian A. Wandell
(Stanford University, United States)
Tutorial title:
"How Wavelength Becomes Color: Foundations of Human Color Science"
I will describe our current understanding of how the human brain derives the perception of color from the wavelength information arriving at the eye. Many properties of color can be understood from the retinal encoding of light in the cone photoreceptors and the subsequent retinal circuits. Additional aspects of color appearance depend crucially on circuits in visual cortex. The responses to colored stimuli differ across the brain, and it seems likely that only some portions of the brain are essential for color appearance. By understanding these differences between cortical circuits, we aim to distinguish the neural processes that give rise to the perception of color from those processes, such as motion perception, that use color information for other goals. A current hypothesis is that wavelength information is used in multiple cortical circuits; some of these circuits are essential for color appearance, while other circuits use wavelength information to assist with other visual functions.
Tutorial 2: Prof. Lindsay W. MacDonald
(University College London, United Kingdom)
Tutorial title:
"Using Real and Synthesized Reflectance Spectra in Color Imaging"
The reflectance of a point on the surface of a real-world object is characterised by the spectral reflectance distribution, typically measured at wavelength intervals of 5 or 10 nm. It can be represented by a vector, enabling colorimetric computation with corresponding vectors for the spectral power distribution of a light source and the sensitivity of an observer. Large sets of reflectance spectra, containing thousands of vectors, can be analysed to determine the number of degrees of freedom, for example by principal component analysis.
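The vector computation described above can be sketched as follows. This is a minimal NumPy illustration, not part of the tutorial material: the illuminant, observer sensitivities, and reflectance spectra are random placeholders standing in for measured data, and the 99% variance threshold used to count degrees of freedom is an arbitrary choice for the example.

```python
import numpy as np

wavelengths = np.arange(400, 710, 10)    # 400-700 nm at 10 nm intervals (31 samples)
n = wavelengths.size

rng = np.random.default_rng(0)
illuminant = np.ones(n)                  # placeholder equal-energy light source
cmf = rng.random((3, n))                 # placeholder observer sensitivity functions
reflectance = rng.random(n)              # one surface's reflectance spectrum

# Tristimulus values as weighted inner products of the three vectors,
# normalised so that a perfect white reflector has Y = 100
k = 100.0 / (cmf[1] @ illuminant)
XYZ = k * (cmf @ (illuminant * reflectance))

# Degrees of freedom of a large reflectance set via PCA (SVD of centred data)
spectra = rng.random((5000, n))          # stand-in for thousands of measured spectra
centred = spectra - spectra.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = np.cumsum(s**2) / np.sum(s**2)
n_components = int(np.searchsorted(explained, 0.99) + 1)
```

With real measured reflectance sets, `n_components` is typically far smaller than the number of wavelength samples, which is the point of the dimensionality analysis.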
The RGB-to-XYZ transfer function of digital cameras with broad-band spectral sensitivity in each channel can be approximated well by polynomial functions trained on large sets of reflectance spectra. For narrow-band sensors (and especially in the limiting case of monochromatic laser scanners), however, the colorimetric errors of such approximations may be very large. In this tutorial it will be shown how synthetic reflectance spectra may be generated with 'realistic' spectral waveforms. Millions of such spectra can then be used to populate a multi-dimensional lookup table to generate a transformation that adapts to different regions of color space and thus minimises the encoding errors for all possible input reflectance spectra. The method can readily be extended to multispectral imaging systems.
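A polynomial RGB-to-XYZ fit of the kind mentioned above can be sketched in a few lines. This is an assumed illustration, not the tutorial's actual method: the camera and observer sensitivities and the training reflectance spectra are random placeholders, and a simple second-order polynomial fitted by least squares stands in for whatever model and training set the tutorial uses.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 31                                   # 400-700 nm at 10 nm intervals
cam = rng.random((3, n))                 # placeholder broad-band camera RGB sensitivities
cmf = rng.random((3, n))                 # placeholder observer sensitivities
spectra = rng.random((2000, n))          # placeholder training reflectance spectra

RGB = spectra @ cam.T                    # (2000, 3) camera responses
XYZ = spectra @ cmf.T                    # (2000, 3) target tristimulus values

def poly_terms(rgb):
    """Second-order polynomial expansion: 1, R, G, B, R^2, G^2, B^2, RG, RB, GB."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    return np.column_stack([np.ones_like(r), r, g, b,
                            r*r, g*g, b*b, r*g, r*b, g*b])

# Fit the 10x3 coefficient matrix by least squares over the training set
M, *_ = np.linalg.lstsq(poly_terms(RGB), XYZ, rcond=None)
XYZ_pred = poly_terms(RGB) @ M
rmse = np.sqrt(np.mean((XYZ_pred - XYZ) ** 2))
```

The multi-dimensional lookup table extension would replace the single global matrix `M` with locally fitted transforms indexed by region of color space, trained on the millions of synthetic spectra.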
Laboratory Tour of Color, Imaging and Vision at Chiba University
Reception:
University Restaurant Colza in Nishi-Chiba Campus of Chiba University.