Hello,
My name is Benedikt Ehinger. I'm a Tenure-Track Professor for Computational Cognitive Sciences at SimTech and VIS at the University of Stuttgart.
Send me a message: science@benediktehinger.de
My research interests include cognitive psychology, visual perception, categorization, and corticocortical interactions. I work with EEG, Bayesian statistics, and eye tracking, and I am slowly diving into VR/mobile EEG.
Publications
Yan, C., Ehinger, B.V., Pérez-Bellido, A., Peelen, M.V. and de Lange, F.P. (2023) 'Humans predict the forest, not the trees: statistical learning of spatiotemporal structure in visual scenes', Cerebral Cortex, p. bhad115. Available at: https://doi.org/10.1093/cercor/bhad115.

Skukies, R. and Ehinger, B.V. (2023) 'The effect of estimation time window length on overlap correction in EEG data', in Computational Cognitive Neuroscience. Available at: https://scholar.google.de/citations?view_op=view_citation&hl=de&user=VKDX28YAAAAJ&sortby=pubdate&citation_for_view=VKDX28YAAAAJ:ns9cj8rnVeAC (Accessed: 24 June 2023).

Bonasch, H. and Ehinger, B.V. (2023) 'Decoding accuracies as well as ERP amplitudes do not show between-task correlations', in Computational Cognitive Neuroscience. Available at: https://scholar.google.de/citations?view_op=view_citation&hl=de&user=VKDX28YAAAAJ&sortby=pubdate&citation_for_view=VKDX28YAAAAJ:GnPB-g6toBAC (Accessed: 24 June 2023).

Nikolaev, A.R., Ehinger, B.V., Meghanathan, R.N. and van Leeuwen, C. (2023) 'Before the second glance: neural correlates of refixation planning in precursor fixations' [Preprint]. Available at: https://doi.org/10.1101/660308.

Frömer, R., Nassar, M.R., Ehinger, B.V. and Shenhav, A. (2023) 'Common neural choice signals emerge artifactually amidst multiple distinct value signals'. bioRxiv [Preprint]. Available at: https://doi.org/10.1101/2022.08.02.502393.

Chiossi, F. et al. (2022) 'Adapting visualizations and interfaces to the user', it - Information Technology, 64(4–5), pp. 133–143. Available at: https://doi.org/10.1515/itit-2022-0035.

Govaart, G. et al. (2022) 'EEG ERP preregistration template' [Preprint]. Available at: https://doi.org/10.31222/osf.io/4nvpt.

Gert, A.L., Ehinger, B.V., Timm, S., Kietzmann, T.C. and König, P. (2022) 'WildLab: A naturalistic free viewing experiment reveals previously unknown electroencephalography signatures of face processing', European Journal of Neuroscience, 56(11), pp. 6022–6038. Available at: https://doi.org/10.1111/ejn.15824.

Pavlov, Y.G. et al. (2021) '#EEGManyLabs: Investigating the replicability of influential EEG experiments', Cortex. Available at: https://doi.org/10.1016/j.cortex.2021.03.013.

Dimigen, O. and Ehinger, B.V. (2021) 'Regression-based analysis of combined EEG and eye-tracking data: Theory and applications', Journal of Vision, 21(1), p. 3. Available at: https://doi.org/10.1167/jov.21.1.3.

Czeszumski, A. et al. (2021) 'Coordinating With a Robot Partner Affects Neural Processing Related to Action Monitoring', Frontiers in Neurorobotics, 15. Available at: https://doi.org/10.3389/fnbot.2021.686010.

Gert, A.L., Ehinger, B.V., Kietzmann, T.C. and König, P. (2020) 'Faces strongly attract early fixations in naturally sampled real-world stimulus materials', in ACM Symposium on Eye Tracking Research and Applications (ETRA '20 Short Papers). New York, NY, USA: Association for Computing Machinery, pp. 1–5. Available at: https://doi.org/10.1145/3379156.3391377.

Bosch, E., Fritsche, M., Ehinger, B.V. and de Lange, F.P. (2020) 'Opposite effects of choice history and stimulus history resolve a paradox of sequential choice bias', Journal of Vision (accepted). Preprint available at: https://www.biorxiv.org/content/10.1101/2020.02.14.948919v1.

Heilbron, M., Ehinger, B., Hagoort, P. and de Lange, F.P. (2019) 'Tracking Naturalistic Linguistic Predictions with Deep Neural Language Models', arXiv:1909.04400 [q-bio] [Preprint]. Available at: https://doi.org/10.32470/CCN.2019.1096-0.
2%3A%22Prediction%20in%20language%20has%20traditionally%20been%20studied%20using%20simple%20designs%20in%20which%20neural%20responses%20to%20expected%20and%20unexpected%20words%20are%20compared%20in%20a%20categorical%20fashion.%20However%2C%20these%20designs%20have%20been%20contested%20as%20being%20%60prediction%20encouraging%27%2C%20potentially%20exaggerating%20the%20importance%20of%20prediction%20in%20language%20understanding.%20A%20few%20recent%20studies%20have%20begun%20to%20address%20these%20worries%20by%20using%20model-based%20approaches%20to%20probe%20the%20effects%20of%20linguistic%20predictability%20in%20naturalistic%20stimuli%20%28e.g.%20continuous%20narrative%29.%20However%2C%20these%20studies%20so%20far%20only%20looked%20at%20very%20local%20forms%20of%20prediction%2C%20using%20models%20that%20take%20no%20more%20than%20the%20prior%20two%20words%20into%20account%20when%20computing%20a%20word%27s%20predictability.%20Here%2C%20we%20extend%20this%20approach%20using%20a%20state-of-the-art%20neural%20language%20model%20that%20can%20take%20roughly%20500%20times%20longer%20linguistic%20contexts%20into%20account.%20Predictability%20estimates%20from%20the%20neural%20network%20offer%20a%20much%20better%20fit%20to%20EEG%20data%20from%20subjects%20listening%20to%20naturalistic%20narrative%20than%20simpler%20models%2C%20and%20reveal%20strong%20surprise%20responses%20akin%20to%20the%20P200%20and%20N400.%20These%20results%20show%20that%20predictability%20effects%20in%20language%20are%20not%20a%20side-effect%20of%20simple%20designs%2C%20and%20demonstrate%20the%20practical%20use%20of%20recent%20advances%20in%20AI%20for%20the%20cognitive%20neuroscience%20of%20language.%22%2C%22date%22%3A%222019-09-10%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.32470%5C%2FCCN.2019.1096-0%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Farxiv.org%5C%2Fabs%5C%2F1909.04400%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222020-11-23T12%3A40%3A39Z%22%7D%7D%2C%7B%22key%22%3A%22F3DL9NXP%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20et%20al.%22%2C%22parsedDate%22%3A%222019-07-09%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282019%29%20%26%23x2018%3BA%20new%20comprehensive%20eye-tracking%20test%20battery%20concurrently%20evaluating%20the%20Pupil%20Labs%20glasses%20and%20the%20EyeLink%201000%26%23x2019%3B%2C%20%3Ci%3EPeerJ%3C%5C%2Fi%3E%2C%207%2C%20p.%20e7086.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7717%5C%2Fpeerj.7086%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7717%5C%2Fpeerj.7086%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DF3DL9NXP%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22A%20new%20comprehensive%20eye-tracking%20test%20battery%20concurrently%20evaluating%20the%20Pupil%20Labs%20glasses%20and%20the%20EyeLink%201000%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22fi
rstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katharina%22%2C%22lastName%22%3A%22Gro%5Cu00df%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Inga%22%2C%22lastName%22%3A%22Ibs%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%5D%2C%22abstractNote%22%3A%22Eye-tracking%20experiments%20rely%20heavily%20on%20good%20data%20quality%20of%20eye-trackers.%20Unfortunately%2C%20it%20is%20often%20the%20case%20that%20only%20the%20spatial%20accuracy%20and%20precision%20values%20are%20available%20from%20the%20manufacturers.%20These%20two%20values%20alone%20are%20not%20suf%5Cufb01cient%20to%20serve%20as%20a%20benchmark%20for%20an%20eye-tracker%3A%20Eye-tracking%20quality%20deteriorates%20during%20an%20experimental%20session%20due%20to%20head%20movements%2C%20changing%20illumination%20or%20calibration%20decay.%20Additionally%2C%20different%20experimental%20paradigms%20require%20the%20analysis%20of%20different%20types%20of%20eye%20movements%3B%20for%20instance%2C%20smooth%20pursuit%20movements%2C%20blinks%20or%20microsaccades%2C%20which%20themselves%20cannot%20readily%20be%20evaluated%20by%20using%20spatial%20accuracy%20or%20precision%20alone.%20To%20obtain%20a%20more%20comprehensive%20description%20of%20properties%2C%20we%20developed%20an%20extensive%20eye-tracking%20test%20battery.%20In%2010%20different%20tasks%2C%20we%20evaluated%20eye-tracking%20related%20measures%20such%20as%3A%20the%20decay%20of%20accuracy%2C%20%5Cufb01xation%20durations%2C%20pupil%20dilation%2C%20smooth%20pursuit%20movement%2C%20microsaccade%20classi%5Cufb01cation%2C%20blink%20classi%5Cufb01cation%2C%20or%20the%20in%5Cufb02uence%20of%20head%20motion.%20For%20some%20measures%2C%20true%20theoretical%20values%20exist.%20For%20others%2C%20a%20relative%20comparison%20to%20a%20reference%20eye-tracker%20is%20needed.%20Therefore%2C%20we%20collected%20our%20gaze%20data%20simultaneously%20from%20a%20remote%20EyeLink%201000%20eye-tracker%20as%20the%20reference%20and%20compared%20it%20with%20the%20mobile%20Pupil%20Labs%20glasses.%20As%20expected%2C%20the%20average%20spatial%20accuracy%20of%200.57%20%20for%20the%20EyeLink%201000%20eye-tracker%20was%20better%20than%20the%200.82%20%20for%20the%20Pupil%20Labs%20glasses%20%28N%20%3D%2015%29.%20Furthermore%2C%20we%20classi%5Cufb01ed%20less%20%5Cufb01xations%20and%20shorter%20saccade%20durations%20for%20the%20Pupil%20Labs%20glasses.%20Similarly%2C%20we%20found%20fewer%20microsaccades%20using%20the%20Pupil%20Labs%20glasses.%20The%20accuracy%20over%20time%20decayed%20only%20slightly%20for%20the%20EyeLink%201000%2C%20but%20strongly%20for%20the%20Pupil%20Labs%20glasses.%20Finally%2C%20we%20observed%20that%20the%20measured%20pupil%20diameters%20differed%20between%20eye-trackers%20on%20the%20individual%20subject%20level%20but%20not%20on%20the%20group%20level.%20To%20conclude%2C%20our%20eye-tracking%20test%20battery%20offers%2010%20tasks%20that%20allow%20us%20to%20benchmark%20the%20many%20parameters%20of%20interest%20in%20stereotypical%20eye-tracking%20situations%20and%20addresses%20a%20common%20source%20of%20confounds%20in%20measurement%20errors%20%28e.g.%2C%20yaw%20and%20roll%20head%20movements%29.%20All%20recorded%20eye-tracking%20data%20%28including%20Pupil%20Labs%5Cu2019%20eye%20videos%29%2C%20the%20stimulus%20code%20for%20the%20test%20battery%2C%20and%20the%20modular%20analysis%20pipeline%20are%20freely%20available%20%28https%3A%5C%2F%5C%
2Fgithub.com%5C%2Fbehinger%5C%2Fetcomp%29.%22%2C%22date%22%3A%222019-07-09%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.7717%5C%2Fpeerj.7086%22%2C%22ISSN%22%3A%222167-8359%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fpeerj.com%5C%2Farticles%5C%2F7086%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222019-09-11T09%3A53%3A20Z%22%7D%7D%2C%7B%22key%22%3A%22L4JRBJDD%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Czeszumski%20et%20al.%22%2C%22parsedDate%22%3A%222019%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ECzeszumski%2C%20A.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282019%29%20%26%23x2018%3BThe%20Social%20Situation%20Affects%20How%20We%20Process%20Feedback%20About%20Our%20Actions%26%23x2019%3B%2C%20%3Ci%3EFrontiers%20in%20Psychology%3C%5C%2Fi%3E%2C%2010.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3389%5C%2Ffpsyg.2019.00361%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3389%5C%2Ffpsyg.2019.00361%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DL4JRBJDD%27%3ECite%3C%5C%2Fa%3E%20%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fbenediktehinger.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D4784278%26amp%3Bdlkey%3DQCWIFM2R%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22The%20Social%20Situation%20Affects%20How%20We%20Process%20Feedback%20About%20Our%20Actions%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Artur%22%2C%22lastName%22%3A%22Czeszumski%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Basil%22%2C%22lastName%22%3A%22Wahn%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%5D%2C%22abstractNote%22%3A%22Humans%20achieve%20their%20goals%20in%20joint%20action%20tasks%20either%20by%20cooperation%20or%20competition.%20In%20the%20present%20study%2C%20we%20investigated%20the%20neural%20processes%20underpinning%20error%20and%20monetary%20rewards%20processing%20in%20such%20cooperative%20and%20competitive%20situations.%20We%20used%20electroencephalography%20%28EEG%29%20and%20analyzed%20event-related%20potentials%20%28ERPs%29%20triggered%20by%20feedback%20in%20both%20social%20situations.%2026%20dyads%20performed%20a%20joint%20four-alternative%20forced%20choice%20%284AFC%29%20visual%20task%20either%20cooperatively%20or%20competitively.%20At%20the%20end%20of%20each%20trial%2C%20participants%20received%20performance%20feedback%20about%20their%20individual%20and%20joint%20errors%20and%20accompanying%20monetary%20rewards.%20Furthermore%2C%20the%20outcome%2C%20i.e.%20resulting%20positive%2C%20negative%20or%20neutral%20rewards%2C%20was%20dependent%20on%20the%20pay-off%20matrix%2C%20defining%20the%20
social%20situation%20either%20as%20cooperative%20or%20competitive.%20We%20used%20linear%20mixed%20effects%20models%20to%20analyze%20the%20feedback-related-negativity%20%28FRN%29%20and%20used%20the%20Threshold-free%20cluster%20enhancement%20%28TFCE%29%20method%20to%20explore%20activations%20of%20all%20electrodes%20and%20times.%20We%20found%20main%20effects%20of%20the%20outcome%20and%20social%20situation%2C%20but%20no%20interaction%20at%20mid-line%20frontal%20electrodes.%20The%20FRN%20was%20more%20negative%20for%20losses%20than%20wins%20in%20both%20social%20situations.%20However%2C%20the%20FRN%20amplitudes%20differed%20between%20social%20situations.%20Moreover%2C%20we%20compared%20monetary%20with%20neutral%20outcomes%20in%20both%20social%20situations.%20Our%20exploratory%20TFCE%20analysis%20revealed%20that%20processing%20of%20feedback%20differs%20between%20cooperative%20and%20competitive%20situations%20at%20right%20temporo-parietal%20electrodes%20where%20the%20cooperative%20situation%20elicited%20more%20positive%20amplitudes.%20Further%2C%20the%20differences%20induced%20by%20the%20social%20situations%20were%20stronger%20in%20participants%20with%20higher%20scores%20on%20a%20perspective%20taking%20test.%20In%20sum%2C%20our%20results%20replicate%20previous%20studies%20about%20the%20FRN%20and%20extend%20them%20by%20comparing%20neurophysiological%20responses%20to%20positive%20and%20negative%20outcomes%20in%20a%20task%20that%20simultaneously%20engages%20two%20participants%20in%20competitive%20and%20cooperative%20situations.%22%2C%22date%22%3A%222019%22%2C%22language%22%3A%22English%22%2C%22DOI%22%3A%2210.3389%5C%2Ffpsyg.2019.00361%22%2C%22ISSN%22%3A%221664-1078%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fwww.frontiersin.org%5C%2Farticles%5C%2F10.3389%5C%2Ffpsyg.2019.00361%5C%2Ffull%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%2C%22HWDKJTEV%22%5D%2C%22dateModified%22%3A%222023-03-17T13%3A53%3A11Z%22%7D%7D%2C%7B%22key%22%3A%22E8QAEP6Z%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%22%2C%22parsedDate%22%3A%222019%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20%282019%29%20%26%23x2018%3BUnmixed%3A%20Linear%20Mixed%20Models%20combined%20with%20Overlap%20Correction%20for%20M%5C%2FEEG%20analyses.%20An%20Extension%20to%20the%20unfold%20Toolbox%26%23x2019%3B%2C%20in%20%3Ci%3E2019%20Conference%20on%20Cognitive%20Computational%20Neuroscience%3C%5C%2Fi%3E.%20%3Ci%3E2019%20Conference%20on%20Cognitive%20Computational%20Neuroscience%3C%5C%2Fi%3E%2C%20Berlin%2C%20Germany%3A%20Cognitive%20Computational%20Neuroscience.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.32470%5C%2FCCN.2019.1102-0%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.32470%5C%2FCCN.2019.1102-0%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DE8QAEP6Z%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Unmixed%3A%20Linear%20Mixed%20Models%20combined%20with%20Overlap%20Correction%20for%20M%5C%2FEEG%20analyses.%
20An%20Extension%20to%20the%20unfold%20Toolbox%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%5D%2C%22abstractNote%22%3A%22Linear%20mixed%20models%20%28LMMs%29%20offer%20several%20benefits%20over%20traditional%20two-stage%20analysis%20methods%20common%20in%20EEG%20analysis%3A%20Higher%20power%20to%20detect%20effects%2C%20partial%20pooling%20with%20noisy%20data%20and%20the%20possibility%20to%20account%20for%20both%20subject%20and%20item%20effects.%20LMMs%20come%20at%20the%20price%20of%20increased%20computational%20cost%2C%20up%20to%20now%20making%20them%20incompatible%20to%20use%20in%20natural%20experiments%20that%20require%20time-resolved%20deconvolution%20methods%20of%20continuous%20EEG%20data.%20Here%2C%20I%20present%20unmixed%20an%20extension%20to%20the%20open%20source%20unfold-toolbox%2C%20allowing%20to%20fit%20LMMs%20and%20GAMMs%20to%20rERP%20%28regression%20ERPs%29%20using%20extended%20Wilkinson%20formulas.%20Unmixed%20supports%20mixed%20modelling%20of%20overlapping%20events%20and%20non-linear%20effects.%20It%20offers%20several%20different%20optimizers%2C%20Walds%20t-tests%20and%20likelihood%20ratio%20model%20comparison%20tests%20for%20statistical%20analysis%2C%20and%20Benjamini-Hochberg%20FDR%20for%20multiple%20comparison%20correction.%20This%20technique%20is%20promising%20for%20population%20where%20extensive%20data%20collection%20is%20not%20possible%2C%20e.g.%20infants%20or%20clinical%20populations.%22%2C%22date%22%3A%222019%22%2C%22proceedingsTitle%22%3A%222019%20Conference%20on%20Cognitive%20Computational%20Neuroscience%22%2C%22conferenceName%22%3A%222019%20Conference%20on%20Cognitive%20Computational%20Neuroscience%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.32470%5C%2FCCN.2019.1102-0%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fccneuro.org%5C%2F2019%5C%2FPapers%5C%2FViewPapers.asp%3FPaperNum%3D1102%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%2C%22AXM5LPNY%22%2C%2252ZJKMY5%22%2C%22HWDKJTEV%22%5D%2C%22dateModified%22%3A%222022-05-09T15%3A04%3A56Z%22%7D%7D%2C%7B%22key%22%3A%22XI39VJCE%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20and%20Dimigen%22%2C%22parsedDate%22%3A%222019%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20and%20Dimigen%2C%20O.%20%282019%29%20%26%23x2018%3BUnfold%3A%20An%20integrated%20toolbox%20for%20overlap%20correction%2C%20non-linear%20modeling%2C%20and%20regression-based%20EEG%20analysis%26%23x2019%3B%2C%20%3Ci%3EpeerJ%3C%5C%2Fi%3E%20%5BPreprint%5D.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2Fhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7717%5C%2Fpeerj.7838%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2Fhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7717%5C%2Fpeerj.7838%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DXI39VJCE%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Unfold%3A%20An%20integrated%20toolbox%20for%20overlap
%20correction%2C%20non-linear%20modeling%2C%20and%20regression-based%20EEG%20analysis%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Olaf%22%2C%22lastName%22%3A%22Dimigen%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222019%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7717%5C%2Fpeerj.7838%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%2C%22AXM5LPNY%22%2C%2252ZJKMY5%22%2C%22HWDKJTEV%22%5D%2C%22dateModified%22%3A%222022-05-09T15%3A04%3A53Z%22%7D%7D%2C%7B%22key%22%3A%229KSE9T3S%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22S%5Cu00fctfeld%20et%20al.%22%2C%22parsedDate%22%3A%222019%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ES%26%23xFC%3Btfeld%2C%20L.R.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282019%29%20%26%23x2018%3BHow%20does%20the%20method%20change%20what%20we%20measure%3F%20Comparing%20virtual%20reality%20and%20text-based%20surveys%20for%20the%20assessment%20of%20moral%20decisions%20in%20traffic%20dilemmas%26%23x2019%3B.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.31234%5C%2Fosf.io%5C%2Fh2z7p%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.31234%5C%2Fosf.io%5C%2Fh2z7p%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3D9KSE9T3S%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22How%20does%20the%20method%20change%20what%20we%20measure%3F%20Comparing%20virtual%20reality%20and%20text-based%20surveys%20for%20the%20assessment%20of%20moral%20decisions%20in%20traffic%20dilemmas%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Leon%20Ren%5Cu00e9%22%2C%22lastName%22%3A%22S%5Cu00fctfeld%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20Valerian%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gordon%22%2C%22lastName%22%3A%22Pipa%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222019%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.31234%5C%2Fosf.io%5C%2Fh2z7p%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222019-09-11T10%3A00%3A19Z%22%7D%7D%2C%7B%22key%22%3A%22SC4LXGSY%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20et%20al.%22%2C%22parsedDate%22%3A%222018-03-01%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%2C%20Kaufhold%2C%20L.%20and%20K%26%23xF6%3Bnig%2C%20P.%20%282018%29%20%26%23x2018%3BProbing%20the%20temporal%20dynamics%20of%20the%20exploration%26%
23x2013%3Bexploitation%20dilemma%20of%20eye%20movements%26%23x2019%3B%2C%20%3Ci%3EJournal%20of%20Vision%3C%5C%2Fi%3E%2C%2018%283%29%2C%20pp.%206%26%23x2013%3B6.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1167%5C%2F18.3.6%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1167%5C%2F18.3.6%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DSC4LXGSY%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Probing%20the%20temporal%20dynamics%20of%20the%20exploration%5Cu2013exploitation%20dilemma%20of%20eye%20movements%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lilli%22%2C%22lastName%22%3A%22Kaufhold%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222018%5C%2F03%5C%2F01%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1167%5C%2F18.3.6%22%2C%22ISSN%22%3A%221534-7362%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fjov.arvojournals.org%5C%2Farticle.aspx%3Farticleid%3D2674777%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%2C%22HWDKJTEV%22%5D%2C%22dateModified%22%3A%222019-09-11T09%3A50%3A09Z%22%7D%7D%2C%7B%22key%22%3A%22J2KSEE6Z%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Benedikt%20V.%20Ehinger%22%2C%22parsedDate%22%3A%222018%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EBenedikt%20V.%20%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%20%282018%29%20%3Ci%3EDecisions%2C%20Predictions%2C%20and%20Learning%20in%20the%20visual%20sense%3C%5C%2Fi%3E.%20Osnabr%26%23xFC%3Bck%20University.%20Available%20at%3A%20%3Ca%20class%3D%27zp-ItemURL%27%20href%3D%27https%3A%5C%2F%5C%2Frepositorium.ub.uni-osnabrueck.de%5C%2Fhandle%5C%2Furn%3Anbn%3Ade%3Agbv%3A700-20181116806%27%3Ehttps%3A%5C%2F%5C%2Frepositorium.ub.uni-osnabrueck.de%5C%2Fhandle%5C%2Furn%3Anbn%3Ade%3Agbv%3A700-20181116806%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DJ2KSEE6Z%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22thesis%22%2C%22title%22%3A%22Decisions%2C%20Predictions%2C%20and%20Learning%20in%20the%20visual%20sense%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22name%22%3A%22Benedikt%20V.%20Ehinger%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22thesisType%22%3A%22%22%2C%22university%22%3A%22Osnabr%5Cu00fcck%20University%22%2C%22date%22%3A%222018%22%2C%22language%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Frepositorium.ub.uni-osnabrueck.de%5C%2Fhandle%5C%2Furn%3Anbn%3Ade%3Agbv%3A700-20181116806%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%5D%2C%22dateModified%
22%3A%222023-03-17T13%3A49%3A10Z%22%7D%7D%2C%7B%22key%22%3A%22QT8DKHID%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%22%2C%22parsedDate%22%3A%222018%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20%282018%29%20%3Ci%3EEEGVIS%20Toolbox%3C%5C%2Fi%3E.%20Available%20at%3A%20https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.5281%5C%2Fzenodo.1312813.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DQT8DKHID%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22book%22%2C%22title%22%3A%22EEGVIS%20Toolbox%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222018%22%2C%22language%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%22BBWDIMSG%22%2C%2252ZJKMY5%22%5D%2C%22dateModified%22%3A%222022-05-09T15%3A05%3A49Z%22%7D%7D%2C%7B%22key%22%3A%22QKNQX76R%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20et%20al.%22%2C%22parsedDate%22%3A%222017-05-16%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282017%29%20%26%23x2018%3BHumans%20treat%20unreliable%20filled-in%20percepts%20as%20more%20real%20than%20veridical%20ones%26%23x2019%3B%2C%20%3Ci%3EeLife%3C%5C%2Fi%3E.%20Edited%20by%20P.%20Latham%2C%206%2C%20p.%20e21761.%20Available%20at%3A%20%3Ca%20class%3D%27zp-ItemURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7554%5C%2FeLife.21761%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7554%5C%2FeLife.21761%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DQKNQX76R%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Humans%20treat%20unreliable%20filled-in%20percepts%20as%20more%20real%20than%20veridical%20ones%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katja%22%2C%22lastName%22%3A%22H%5Cu00e4usser%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jos%5Cu00e9%20P%22%2C%22lastName%22%3A%22Ossand%5Cu00f3n%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22Latham%22%7D%5D%2C%22abstractNote%22%3A%22Humans%20often%20evaluate%20sensory%20signals%20according%20to%20their%20reliability%20
for%20optimal%20decision-making.%20However%2C%20how%20do%20we%20evaluate%20percepts%20generated%20in%20the%20absence%20of%20direct%20input%20that%20are%2C%20therefore%2C%20completely%20unreliable%3F%20Here%2C%20we%20utilize%20the%20phenomenon%20of%20filling-in%20occurring%20at%20the%20physiological%20blind-spots%20to%20compare%20partially%20inferred%20and%20veridical%20percepts.%20Subjects%20chose%20between%20stimuli%20that%20elicit%20filling-in%2C%20and%20perceptually%20equivalent%20ones%20presented%20outside%20the%20blind-spots%2C%20looking%20for%20a%20Gabor%20stimulus%20without%20a%20small%20orthogonal%20inset.%20In%20ambiguous%20conditions%2C%20when%20the%20stimuli%20were%20physically%20identical%20and%20the%20inset%20was%20absent%20in%20both%2C%20subjects%20behaved%20opposite%20to%20optimal%2C%20preferring%20the%20blind-spot%20stimulus%20as%20the%20better%20example%20of%20a%20collinear%20stimulus%2C%20even%20though%20no%20relevant%20veridical%20information%20was%20available.%20Thus%2C%20a%20percept%20that%20is%20partially%20inferred%20is%20paradoxically%20considered%20more%20reliable%20than%20a%20percept%20based%20on%20external%20input.%20In%20other%20words%3A%20Humans%20treat%20filled-in%20inferred%20percepts%20as%20more%20real%20than%20veridical%20ones.%22%2C%22date%22%3A%22May%2016%2C%202017%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.7554%5C%2FeLife.21761%22%2C%22ISSN%22%3A%222050-084X%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.7554%5C%2FeLife.21761%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222019-09-11T09%3A50%3A18Z%22%7D%7D%2C%7B%22key%22%3A%22SIYWX9UG%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Kietzmann%20et%20al.%22%2C%22parsedDate%22%3A%222016%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EKietzmann%2C%20T.C.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282016%29%20%26%23x2018%3BExtensive%20training%20leads%20to%20temporal%20and%20spatial%20shifts%20of%20cortical%20activity%20underlying%20visual%20category%20selectivity%26%23x2019%3B%2C%20%3Ci%3ENeuroImage%3C%5C%2Fi%3E%2C%20134%2C%20pp.%2022%26%23x2013%3B34.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.neuroimage.2016.03.066%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.neuroimage.2016.03.066%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DSIYWX9UG%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Extensive%20training%20leads%20to%20temporal%20and%20spatial%20shifts%20of%20cortical%20activity%20underlying%20visual%20category%20selectivity%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tim%20C.%22%2C%22lastName%22%3A%22Kietzmann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Danja%22%2C%22lastName%22%3A%22Porada%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Andreas%20K.%
22%2C%22lastName%22%3A%22Engel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%2207%5C%2F2016%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.neuroimage.2016.03.066%22%2C%22ISSN%22%3A%2210538119%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Flinkinghub.elsevier.com%5C%2Fretrieve%5C%2Fpii%5C%2FS1053811916300106%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222019-09-11T09%3A28%3A28Z%22%7D%7D%2C%7B%22key%22%3A%22EESL2BY2%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Spoida%20et%20al.%22%2C%22parsedDate%22%3A%222016%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3ESpoida%2C%20K.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282016%29%20%26%23x2018%3BMelanopsin%20Variants%20as%20Intrinsic%20Optogenetic%20On%20and%20Off%20Switches%20for%20Transient%20versus%20Sustained%20Activation%20of%20G%20Protein%20Pathways%26%23x2019%3B%2C%20%3Ci%3ECurrent%20Biology%3C%5C%2Fi%3E%2C%2026%289%29%2C%20pp.%201206%26%23x2013%3B1212.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.cub.2016.03.007%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1016%5C%2Fj.cub.2016.03.007%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DEESL2BY2%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Melanopsin%20Variants%20as%20Intrinsic%20Optogenetic%20On%20and%20Off%20Switches%20for%20Transient%20versus%20Sustained%20Activation%20of%20G%20Protein%20Pathways%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katharina%22%2C%22lastName%22%3A%22Spoida%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dennis%22%2C%22lastName%22%3A%22Eickelbeck%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Raziye%22%2C%22lastName%22%3A%22Karapinar%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tobias%22%2C%22lastName%22%3A%22Eckhardt%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Melanie%5Cu00a0D.%22%2C%22lastName%22%3A%22Mark%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dirk%22%2C%22lastName%22%3A%22Jancke%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%5Cu00a0Valerian%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Deniz%22%2C%22lastName%22%3A%22Dalkara%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stefan%22%2C%22lastName%22%3A%22Herlitze%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Olivia%5Cu00a0A.%22%2C%22lastName%22%3A%22Masseck%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%2205%5C%2F2016%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.1016%5C%2Fj.cub.2016.03.007%22%2C%22ISSN%22%3A%2209609822%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2
Flinkinghub.elsevier.com%5C%2Fretrieve%5C%2Fpii%5C%2FS096098221630183X%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222019-09-11T10%3A00%3A02Z%22%7D%7D%2C%7B%22key%22%3A%228FZL5JBC%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22K%5Cu00f6nig%20et%20al.%22%2C%22parsedDate%22%3A%222016%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3EK%26%23xF6%3Bnig%2C%20P.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282016%29%20%26%23x2018%3BEye%20movements%20as%20a%20window%20to%20cognitive%20processes%26%23x2019%3B%2C%20%3Ci%3EJournal%20of%20Eye%20Movement%20Research%3C%5C%2Fi%3E%2C%209%2C%20pp.%201%26%23x2013%3B16.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2Fhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.16910%5C%2Fjemr.9.5.3%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2Fhttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.16910%5C%2Fjemr.9.5.3%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3D8FZL5JBC%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Eye%20movements%20as%20a%20window%20to%20cognitive%20processes%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Niklas%22%2C%22lastName%22%3A%22Wilming%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Tim%20C%22%2C%22lastName%22%3A%22Kietzmann%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Jos%5Cu00e9%20P%22%2C%22lastName%22%3A%22Ossand%5Cu00f2n%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Selim%22%2C%22lastName%22%3A%22Onat%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20Valerian%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Ricardo%20R%22%2C%22lastName%22%3A%22Gameiro%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Kai%22%2C%22lastName%22%3A%22Kaspar%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222016%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.16910%5C%2Fjemr.9.5.3%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%2C%22HWDKJTEV%22%5D%2C%22dateModified%22%3A%222023-07-27T11%3A21%3A24Z%22%7D%7D%2C%7B%22key%22%3A%22YJ8WF6KK%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20et%20al.%22%2C%22parsedDate%22%3A%222016%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282016%29%20%26%23x2018%3BUnderstanding%20melanopsin%20using%20bayesian%20generative%20models-%20an%20Introduction%26%23x2019%3B%2C%20%3Ci%3EbioRxiv%3C%5C%2Fi%3E%20%5BPreprint%5D.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%2
7https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F043273%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1101%5C%2F043273%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DYJ8WF6KK%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Understanding%20melanopsin%20using%20bayesian%20generative%20models-%20an%20Introduction%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20Valerian%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Dennis%22%2C%22lastName%22%3A%22Eickelbeck%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Katharina%22%2C%22lastName%22%3A%22Spoida%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Stefan%22%2C%22lastName%22%3A%22Herlitze%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222016%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1101%5C%2F043273%22%2C%22ISSN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%5D%2C%22dateModified%22%3A%222019-09-11T09%3A27%3A15Z%22%7D%7D%2C%7B%22key%22%3A%22ZCP283PA%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20et%20al.%22%2C%22parsedDate%22%3A%222015-05-13%22%2C%22numChildren%22%3A3%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%2C%20K%26%23xF6%3Bnig%2C%20P.%20and%20Ossand%26%23xF3%3Bn%2C%20J.P.%20%282015%29%20%26%23x2018%3BPredictions%20of%20Visual%20Content%20across%20Eye%20Movements%20and%20Their%20Modulation%20by%20Inferred%20Information%26%23x2019%3B%2C%20%3Ci%3EJournal%20of%20Neuroscience%3C%5C%2Fi%3E%2C%2035%2819%29%2C%20pp.%207403%26%23x2013%3B7413.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1523%5C%2FJNEUROSCI.5114-14.2015%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1523%5C%2FJNEUROSCI.5114-14.2015%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DZCP283PA%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Predictions%20of%20Visual%20Content%20across%20Eye%20Movements%20and%20Their%20Modulation%20by%20Inferred%20Information%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22B.%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22P.%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22J.%20P.%22%2C%22lastName%22%3A%22Ossand%5Cu00f3n%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222015-05-13%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%221
0.1523%5C%2FJNEUROSCI.5114-14.2015%22%2C%22ISSN%22%3A%220270-6474%2C%201529-2401%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Fwww.jneurosci.org%5C%2Fcgi%5C%2Fdoi%5C%2F10.1523%5C%2FJNEUROSCI.5114-14.2015%22%2C%22collections%22%3A%5B%22LAIN3GVD%22%2C%22BBWDIMSG%22%2C%2252ZJKMY5%22%5D%2C%22dateModified%22%3A%222022-05-09T15%3A05%3A55Z%22%7D%7D%2C%7B%22key%22%3A%22HJ5D5NCY%22%2C%22library%22%3A%7B%22id%22%3A4784278%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Ehinger%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%201.35%3B%20%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E%2C%20B.V.%20%3Ci%3Eet%20al.%3C%5C%2Fi%3E%20%282014%29%20%26%23x2018%3BKinesthetic%20and%20vestibular%20information%20modulate%20alpha%20activity%20during%20spatial%20navigation%3A%20a%20mobile%20EEG%20study%26%23x2019%3B%2C%20%3Ci%3EFrontiers%20in%20Human%20Neuroscience%3C%5C%2Fi%3E%2C%208.%20Available%20at%3A%20%3Ca%20class%3D%27zp-DOIURL%27%20href%3D%27https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3389%5C%2Ffnhum.2014.00071%27%3Ehttps%3A%5C%2F%5C%2Fdoi.org%5C%2F10.3389%5C%2Ffnhum.2014.00071%3C%5C%2Fa%3E.%20%3Ca%20title%3D%27Cite%20in%20RIS%20Format%27%20class%3D%27zp-CiteRIS%27%20href%3D%27https%3A%5C%2F%5C%2Fbenedikt%3Cstrong%3EEhinger%3C%5C%2Fstrong%3E.de%5C%2Fblog%5C%2Fscience%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.cite.php%3Fapi_user_id%3D4784278%26amp%3Bitem_key%3DHJ5D5NCY%27%3ECite%3C%5C%2Fa%3E%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22journalArticle%22%2C%22title%22%3A%22Kinesthetic%20and%20vestibular%20information%20modulate%20alpha%20activity%20during%20spatial%20navigation%3A%20a%20mobile%20EEG%20study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Benedikt%20V.%22%2C%22lastName%22%3A%22Ehinger%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Petra%22%2C%22lastName%22%3A%22Fischer%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Anna%20L.%22%2C%22lastName%22%3A%22Gert%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Lilli%22%2C%22lastName%22%3A%22Kaufhold%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Felix%22%2C%22lastName%22%3A%22Weber%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Gordon%22%2C%22lastName%22%3A%22Pipa%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Peter%22%2C%22lastName%22%3A%22K%5Cu00f6nig%22%7D%5D%2C%22abstractNote%22%3A%22In%20everyday%20life%2C%20spatial%20navigation%20involving%20locomotion%20provides%20congruent%20visual%2C%20vestibular%2C%20and%20kinesthetic%20information%20that%20need%20to%20be%20integrated.%20Yet%2C%20previous%20studies%20on%20human%20brain%20activity%20during%20navigation%20focus%20on%20stationary%20setups%2C%20neglecting%20vestibular%20and%20kinesthetic%20feedback.%20The%20aim%20of%20our%20work%20is%20to%20uncover%20the%20in%5Cufb02uence%20of%20those%20sensory%20modalities%20on%20cortical%20processing.%20We%20developed%20a%20fully%20immersive%20virtual%20reality%20setup%20combined%20with%20high-density%20mobile%20electroencephalography%20%28EEG%29.%20Participants%20traversed%20one%20leg%20of%20a%20triangle%2C%20turned%20on%20the%20spot%2C%20continued%20along%20the%20second%20leg%2C%20and%20%5Cufb01nally%20indicated%20the%20location%20of%20their%20starting%20position.%20Vestibular%20and%20kinestheti
c%20information%20was%20provided%20either%20in%20combination%2C%20as%20isolated%20sources%20of%20information%2C%20or%20not%20at%20all%20within%20a%202%20%5Cu00d7%202%20full%20factorial%20intra-subjects%20design.%20EEG%20data%20were%20processed%20by%20clustering%20independent%20components%2C%20and%20time-frequency%20spectrograms%20were%20calculated.%20In%20parietal%2C%20occipital%2C%20and%20temporal%20clusters%2C%20we%20detected%20alpha%20suppression%20during%20the%20turning%20movement%2C%20which%20is%20associated%20with%20a%20heightened%20demand%20of%20visuo-attentional%20processing%20and%20closely%20resembles%20results%20reported%20in%20previous%20stationary%20studies.%20This%20decrease%20is%20present%20in%20all%20conditions%20and%20therefore%20seems%20to%20generalize%20to%20more%20natural%20settings.%20Yet%2C%20in%20incongruent%20conditions%2C%20when%20different%20sensory%20modalities%20did%20not%20match%2C%20the%20decrease%20is%20signi%5Cufb01cantly%20stronger.%20Additionally%2C%20in%20more%20anterior%20areas%20we%20found%20that%20providing%20only%20vestibular%20but%20no%20kinesthetic%20information%20results%20in%20alpha%20increase.%20These%20observations%20demonstrate%20that%20stationary%20experiments%20omit%20important%20aspects%20of%20sensory%20feedback.%20Therefore%2C%20it%20is%20important%20to%20develop%20more%20natural%20experimental%20settings%20in%20order%20to%20capture%20a%20more%20complete%20picture%20of%20neural%20correlates%20of%20spatial%20navigation.%22%2C%22date%22%3A%222014%22%2C%22language%22%3A%22en%22%2C%22DOI%22%3A%2210.3389%5C%2Ffnhum.2014.00071%22%2C%22ISSN%22%3A%221662-5161%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Fjournal.frontiersin.org%5C%2Farticle%5C%2F10.3389%5C%2Ffnhum.2014.00071%5C%2Fabstract%22%2C%22collections%22%3A%5B%22Y9YV56VT%22%2C%22LAIN3GVD%22%2C%22BBWDIMSG%22%2C%2252ZJKMY5%22%5D%2C%22dateModified%22%3A%222022-05-09T15%3A04%3A50Z%22%7D%7D%5D%7D
Yan, C. et al. (2023) ‘Humans predict the forest, not the trees: statistical learning of spatiotemporal structure in visual scenes’, Cerebral Cortex, p. bhad115. Available at: https://doi.org/10.1093/cercor/bhad115.
Skukies, R. and Ehinger, B.V. (2023) ‘The effect of estimation time window length on overlap correction in EEG data’, in Computational Cognitive Neuroscience. Available at: https://scholar.google.de/citations?view_op=view_citation&hl=de&user=VKDX28YAAAAJ&sortby=pubdate&citation_for_view=VKDX28YAAAAJ:ns9cj8rnVeAC (Accessed: 24 June 2023).
Bonasch, H. and Ehinger, B.V. (2023) ‘Decoding accuracies as well as ERP amplitudes do not show between-task correlations’, in Computational Cognitive Neuroscience. Available at: https://scholar.google.de/citations?view_op=view_citation&hl=de&user=VKDX28YAAAAJ&sortby=pubdate&citation_for_view=VKDX28YAAAAJ:GnPB-g6toBAC (Accessed: 24 June 2023).
Nikolaev, A.R. et al. (2023) ‘Before the second glance: neural correlates of refixation planning in precursor fixations’. Available at: https://doi.org/10.1101/660308.
Frömer, R. et al. (2023) ‘Common neural choice signals emerge artifactually amidst multiple distinct value signals’. bioRxiv. Available at: https://doi.org/10.1101/2022.08.02.502393.
Chiossi, F. et al. (2022) ‘Adapting visualizations and interfaces to the user’, it - Information Technology, 64(4–5), pp. 133–143. Available at: https://doi.org/10.1515/itit-2022-0035.
Govaart, G. et al. (2022) ‘EEG ERP preregistration template’. Available at: https://doi.org/10.31222/osf.io/4nvpt.
Gert, A.L. et al. (2022) ‘WildLab: A naturalistic free viewing experiment reveals previously unknown electroencephalography signatures of face processing’, European Journal of Neuroscience, 56(11), pp. 6022–6038. Available at: https://doi.org/10.1111/ejn.15824.
Pavlov, Y.G. et al. (2021) ‘#EEGManyLabs: Investigating the replicability of influential EEG experiments’, Cortex [Preprint]. Available at: https://doi.org/10.1016/j.cortex.2021.03.013.
Dimigen, O. and Ehinger, B.V. (2021) ‘Regression-based analysis of combined EEG and eye-tracking data: Theory and applications’, Journal of Vision, 21(1), p. 3. Available at: https://doi.org/10.1167/jov.21.1.3.
Czeszumski, A. et al. (2021) ‘Coordinating With a Robot Partner Affects Neural Processing Related to Action Monitoring’, Frontiers in Neurorobotics, 15. Available at: https://doi.org/10.3389/fnbot.2021.686010.
Gert, A.L. et al. (2020) ‘Faces strongly attract early fixations in naturally sampled real-world stimulus materials’, in ACM Symposium on Eye Tracking Research and Applications. New York, NY, USA: Association for Computing Machinery (ETRA ’20 Short Papers), pp. 1–5. Available at: https://doi.org/10.1145/3379156.3391377.
Bosch, E. et al. (2020) ‘Opposite effects of choice history and stimulus history resolve a paradox of sequential choice bias’, Journal of Vision (accepted). Available at: https://www.biorxiv.org/content/10.1101/2020.02.14.948919v1.
Heilbron, M. et al. (2019) ‘Tracking Naturalistic Linguistic Predictions with Deep Neural Language Models’, arXiv:1909.04400 [q-bio] [Preprint]. Available at: https://doi.org/10.32470/CCN.2019.1096-0.
Ehinger, B.V. et al. (2019) ‘A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000’, PeerJ, 7, p. e7086. Available at: https://doi.org/10.7717/peerj.7086.
Czeszumski, A. et al. (2019) ‘The Social Situation Affects How We Process Feedback About Our Actions’, Frontiers in Psychology, 10. Available at: https://doi.org/10.3389/fpsyg.2019.00361.
Ehinger, B.V. (2019) ‘Unmixed: Linear Mixed Models combined with Overlap Correction for M/EEG analyses. An Extension to the unfold Toolbox’, in 2019 Conference on Cognitive Computational Neuroscience, Berlin, Germany: Cognitive Computational Neuroscience. Available at: https://doi.org/10.32470/CCN.2019.1102-0.
Ehinger, B.V. and Dimigen, O. (2019) ‘Unfold: An integrated toolbox for overlap correction, non-linear modeling, and regression-based EEG analysis’, PeerJ, 7, p. e7838. Available at: https://doi.org/10.7717/peerj.7838.
Sütfeld, L.R. et al. (2019) ‘How does the method change what we measure? Comparing virtual reality and text-based surveys for the assessment of moral decisions in traffic dilemmas’. Available at: https://doi.org/10.31234/osf.io/h2z7p.
Ehinger, B.V., Kaufhold, L. and König, P. (2018) ‘Probing the temporal dynamics of the exploration–exploitation dilemma of eye movements’, Journal of Vision, 18(3), p. 6. Available at: https://doi.org/10.1167/18.3.6.
Ehinger, B.V. (2018) Decisions, Predictions, and Learning in the visual sense. Osnabrück University. Available at: https://repositorium.ub.uni-osnabrueck.de/handle/urn:nbn:de:gbv:700-20181116806.
Ehinger, B.V. (2018) EEGVIS Toolbox. Available at: https://doi.org/10.5281/zenodo.1312813.
Ehinger, B.V. et al. (2017) ‘Humans treat unreliable filled-in percepts as more real than veridical ones’, eLife. Edited by P. Latham, 6, p. e21761. Available at: https://doi.org/10.7554/eLife.21761.
Kietzmann, T.C. et al. (2016) ‘Extensive training leads to temporal and spatial shifts of cortical activity underlying visual category selectivity’, NeuroImage, 134, pp. 22–34. Available at: https://doi.org/10.1016/j.neuroimage.2016.03.066.
Spoida, K. et al. (2016) ‘Melanopsin Variants as Intrinsic Optogenetic On and Off Switches for Transient versus Sustained Activation of G Protein Pathways’, Current Biology, 26(9), pp. 1206–1212. Available at: https://doi.org/10.1016/j.cub.2016.03.007.
König, P. et al. (2016) ‘Eye movements as a window to cognitive processes’, Journal of Eye Movement Research, 9, pp. 1–16. Available at: https://doi.org/10.16910/jemr.9.5.3.
Ehinger, B.V. et al. (2016) ‘Understanding melanopsin using Bayesian generative models - an Introduction’, bioRxiv [Preprint]. Available at: https://doi.org/10.1101/043273.
Ehinger, B.V., König, P. and Ossandón, J.P. (2015) ‘Predictions of Visual Content across Eye Movements and Their Modulation by Inferred Information’, Journal of Neuroscience, 35(19), pp. 7403–7413. Available at: https://doi.org/10.1523/JNEUROSCI.5114-14.2015.
Ehinger, B.V. et al. (2014) ‘Kinesthetic and vestibular information modulate alpha activity during spatial navigation: a mobile EEG study’, Frontiers in Human Neuroscience, 8. Available at: https://doi.org/10.3389/fnhum.2014.00071.
Courses
2019 (single lecture) Introduction to ERP Analysis (EEG)
2019 (Workshop) Statistics (Linear Model) & Deconvolution
2018 Statistical Rethinking
2018 Applied Generalized Linear Mixed Models
2017 Generalized Linear Mixed Models
2017 (Workshop) Combined EEG/Eye-Tracking at the ECEM
2016 Bayesian Data Analysis
2015 Advanced Methods for M/EEG Data Analysis
2013 Basic and Advanced MATLAB Exercises in Data Analysis
2012 Basic and Advanced MATLAB Exercises in Data Analysis