CFP: IEEE Trans. Affective Computing (TAC) Special Issue on Advances in Affective Analysis in Multimedia (Yi-Hsuan Yang )


Subject: CFP: IEEE Trans. Affective Computing (TAC) Special Issue on Advances in Affective Analysis in Multimedia
From:    Yi-Hsuan Yang  <affige@xxxxxxxx>
Date:    Fri, 17 Jan 2014 14:29:39 +0800
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Apologies for cross posting.

Call for papers
** Special Issue on Advances in Affective Analysis in Multimedia
IEEE Transactions on Affective Computing (TAC) **

Recent advances in multimedia computing have brought a dramatic increase in research on multimedia retrieval and indexing based on highly subjective concepts such as emotion, preference, and aesthetics. These retrieval methods are human-centered and intuitive, and go beyond the conventional keyword- or object-based retrieval paradigm. The problem is also challenging because it requires a multidisciplinary understanding of human behavior and perception, as well as the integration of different modalities (music, image, video, text) for better performance. Affective analysis of multimedia is attracting growing attention from industry and the general public, e.g., mood-based online radio stations.

IEEE Transactions on Affective Computing is the flagship journal in affective computing, featuring the state of the art on this multidisciplinary topic.
We invite original submissions for this special issue on topics including but not limited to:

- Affective/emotional content analysis of music, images and videos
- Multimodal integration for affective content understanding
- Affective multimedia indexing
- Affect in content retrieval and recommendation
- Image and video summarization based on affect
- Emotional implicit tagging, sensing techniques and interactive systems
- Affective benchmarking development
- Cognitive/psychological perspectives on affective content analysis
- Affective multimedia applications

* Important dates
Paper submission deadline: 31 July 2014
Notification after the first round of review: 25 September 2014
Final notification: 15 November 2014
Final manuscript due: 10 January 2015
Projected to appear: Spring or Summer issue, 2015

* Review process
The review process will comply with the standard TAC review process. Each paper will receive at least three reviews from experts in the field. Papers will undergo one cycle of revision and will be accepted only if they receive acceptance with minor revisions in the first review cycle.

* Submission instructions
Prospective authors are invited to submit their manuscripts electronically after the "open for submissions" date, adhering to the IEEE Transactions on Affective Computing guidelines (http://www.computer.org/portal/web/tac/author). Please submit your papers through the online system (https://mc.manuscriptcentral.com/taffc-cs) and be sure to select the special issue. Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere. If the submission is an extension of a previously published paper, please include the original work and a cover letter describing the changes that have been made.
Guest editors:
Mohammad Soleymani, Imperial College London, UK (m.soleymani@xxxxxxxx)
Yi-Hsuan (Eric) Yang, Academia Sinica, Taiwan (affige@xxxxxxxx)
Go Irie, NTT Corp., Japan (irie.go@xxxxxxxx)
Alan Hanjalic, Delft University of Technology, the Netherlands (a.hanjalic@xxxxxxxx)

Best regards,

Yi-Hsuan Yang
on behalf of the guest editors

--
Yi-Hsuan (Eric) Yang, Ph.D.
Assistant Research Fellow
Research Center for IT Innovation, Academia Sinica
http://www.citi.sinica.edu.tw/pages/yang/


This message came from the mail archive
/var/www/postings/2014/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University