Information extraction from shadowed regions in images: an eye movement study
- Source :
- Vision Research, 113
- Publication Year :
- 2014
-
Abstract
- Natural scenes often contain variations in local luminance as a result of cast shadows and illumination from different directions. When making judgments about such scenes, it may be hypothesized that darker regions (which have lower relative contrast due to a lack of illumination) are avoided because they provide less detailed information than well-illuminated areas. Here we test this hypothesis, first by presenting participants with images of faces that were digitally modified to simulate the effect of a shadow over half of the image, and second by presenting photographs of faces taken under side illumination, which also produces the appearance of a shadow across half of the face. While viewing these images, participants performed different tasks, which allowed the different versions of each image (left shadow, right shadow, no shadow) to be presented and distracted the observers from the contrast and illumination manipulations. The results confirm our hypothesis and demonstrate that observers fixate the better-illuminated regions of the images.
- Subjects :
- Adult
Male
Adolescent
Eye Movements
Computer science
Fixation, Ocular
Luminance
Judgment
Young Adult
Optics
Shadow
Psychophysics
Contrast (vision)
Humans
Computer vision
Lighting
Analysis of Variance
Task effects
Eye movement
Lightness perception
Sensory Systems
Information extraction
Ophthalmology
Face (geometry)
Female
Artificial intelligence
Facial Recognition
Photic Stimulation
Details
- ISSN :
- 1878-5646
- Volume :
- 113
- Database :
- OpenAIRE
- Journal :
- Vision Research
- Accession number :
- edsair.doi.dedup.....378fcabecf003e247c9002018bac305e