Mutual transfer between visual and auditory temporal interval learning supports a central clock in temporal processing
Shu-Chen Guan; Ying-Zi Xiong; Cong Yu
2017-07
Conference: The 2nd Qufu Conference on Visual Science, 2017
Conference date: July 2, 2017
Conference venue: Qufu
Abstract

Temporal perceptual learning is reported to be specific to the trained visual or auditory modality, or to show asymmetric partial transfer from audition to vision but not vice versa. These findings have been interpreted as evidence for distributed, rather than central, temporal processing (Ivry & Schlerf, 2008). However, in visual perceptual learning, location and orientation specificity can be eliminated with double training, indicating that learning specificity is an unreliable basis for inferring the mechanisms of perceptual learning (e.g., Xiao et al., 2008). Here we investigated whether double training can also eliminate the modality specificity of temporal learning.
We first replicated the asymmetric partial-transfer results between auditory and visual learning of a temporal-interval discrimination (TID) task with standard training. The standard 100-ms interval was marked by a pair of auditory beeps or visual gratings. Subjects practiced either the auditory or the visual TID task for five sessions. Visual TID learning had no impact on auditory TID performance (p=0.65), whereas auditory TID learning improved visual TID performance (p=0.005), although not as much as direct visual TID training (p=0.028). However, complete learning transfer was evident with double training. When visual TID learning was paired with an auditory frequency discrimination task at the same 100-ms interval, auditory TID performance improved as much as with direct auditory training (p=0.051), indicating complete cross-modal learning transfer. Similarly, when auditory TID learning was paired with a visual contrast discrimination task at the same 100-ms interval, visual TID performance improved as much as with direct visual training (p=0.95), again indicating complete cross-modal learning transfer. In both cases, practicing auditory frequency discrimination or visual contrast discrimination alone had no significant impact on TID performance.
Our results suggest mutual and nearly complete transfer of TID learning between the visual and auditory modalities, consistent with a central temporal processing mechanism shared by different modalities.

Keywords: temporal interval discrimination; cross-modal; double training
Subject area: Visual Psychophysics and Modeling (II)
Language: English
Document type: Conference paper
Identifier: http://ir.psych.ac.cn/handle/311026/22062
Collection: Conference abstracts, The 2nd Qufu Conference on Visual Science (2017), hosted and co-organized by the Institute of Psychology
Affiliation: School of Psychological and Cognitive Sciences, IDG-McGovern Institute for Brain Sciences, and Peking-Tsinghua Center for Life Sciences, Peking University
Recommended citation (GB/T 7714):
Shu-Chen Guan, Ying-Zi Xiong, Cong Yu. Mutual transfer between visual and auditory temporal interval learning supports a central clock in temporal processing[C], 2017.