Journal Articles
Tanaka, A., Visi, F., Di Donato, B., Klang, M., & Zbyszyński, M. (2024). An End-to-End Musical Instrument System That Translates Electromyogram Biosignals to Synthesized Sound. Computer Music Journal, 1–40. https://doi.org/10.1162/comj_a_00672
Visi, F., Basso, T., Greinke, B., Wood, E., Gschwendtner, P., Hope, C., & Östersjö, S. (2024). Networking concert halls, musicians, and interactive textiles: Interwoven Sound Spaces. Digital Creativity, 0(0), 1–22. https://doi.org/10.1080/14626268.2024.2311906
Griffiths, A. G. F., Garrett, J. K., Duffy, J. P., Matthews, K., Visi, F. G., Eatock, C., Robinson, M., & Griffiths, D. J. (2021). New Water and Air Pollution Sensors Added to the Sonic Kayak Citizen Science System for Low Cost Environmental Mapping. Journal of Open Hardware, 5(1), 5. https://doi.org/10.5334/joh.35
Visi, F. G., Östersjö, S., Ek, R., & Röijezon, U. (2020). Method Development for Multimodal Data Corpus Analysis of Expressive Instrumental Music Performance. Frontiers in Psychology, 11(576751). https://doi.org/10.3389/fpsyg.2020.576751
Visi, F., Coorevits, E., Schramm, R., & Miranda, E. R. (2017). Musical Instruments, Body Movement, Space, and Motion Data: Music as an Emergent Multimodal Choreography. Human Technology, 13(1), 58–81. https://doi.org/10.17011/ht/urn.201705272518
Conference Papers
Visi, F. (2024). The Sophtar: a networkable feedback string instrument with embedded machine learning. NIME 2024 Proceedings of the International Conference on New Interfaces for Musical Expression.
Visi, F., Schramm, R., Frödin, K., Unander-Scharin, Å., & Östersjö, S. (2024). Empirical Analysis of Gestural Sonic Objects Combining Qualitative and Quantitative Methods. In A. R. Jensenius (Ed.), Sonic Design (pp. 115–137). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-57892-2_7
Ek, R., Östersjö, S., Visi, F. G., & Petersson, M. (2021). The TCP/Indeterminate Place Quartet: a Global Hyperorgan Scenario. International Conference on New Interfaces for Musical Expression (NIME).
Harlow, R., Petersson, M., Ek, R., Visi, F., & Östersjö, S. (2021). Global Hyperorgan: a platform for telematic musicking and research. NIME 2021, 1–15. https://doi.org/10.21428/92fbeb44.d4146b2d
Visi, F. G., & AQAXA. (2020). "You have a new memory." ICLI 2020 - the Fifth International Conference on Live Interfaces.
Visi, F. G., & Tanaka, A. (2020). Towards Assisted Interactive Machine Learning: Exploring Gesture-Sound Mappings Using Reinforcement Learning. ICLI 2020 - the Fifth International Conference on Live Interfaces, 10–19. https://doi.org/10.5281/zenodo.3928167
Visi, F., & Schramm, R. (2019). Introduction. In F. Visi (Ed.), Music proceedings of the international conference on new interfaces for musical expression (p. 4). UFRGS.
Visi, F. (2018). SloMo study #2. Proceedings of the 5th International Conference on Movement and Computing, 1–2. https://doi.org/10.1145/3212721.3212890
Schramm, R., Visi, F., Brasil, A., & Johann, M. (2018). A polyphonic pitch tracking embedded system for rapid instrument augmentation. In L. Dahl, D. Bowman, & T. Martin (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 120–125). Virginia Tech.
Dahl, L., & Visi, F. (2018). Modosc: A Library of Real-Time Movement Descriptors for Marker-Based Motion Capture. MOCO '18 Proceedings of the 4th International Conference on Movement Computing, 1–4. https://doi.org/10.1145/3212721.3212842
Visi, F., & Dahl, L. (2018). Real-Time Motion Capture Analysis and Music Interaction with the Modosc Descriptor Library. Proceedings of the International Conference on New Interfaces for Musical Expression, 144–147. https://doi.org/10.5281/zenodo.1302707
Visi, F., Georgiou, T., Holland, S., Pinzone, O., Donaldson, G., & Tetley, J. (2017). Assessing the Accuracy of an Algorithm for the Estimation of Spatial Gait Parameters Using Inertial Measurement Units. MOCO '17 Proceedings of the 4th International Conference on Movement Computing, 1–7. https://doi.org/10.1145/3077981.3078034
Visi, F., Caramiaux, B., Mcloughlin, M., & Miranda, E. (2017). A Knowledge-based, Data-driven Method for Action-sound Mapping. Proceedings of the International Conference on New Interfaces for Musical Expression.
rsedDate%22%3A%222016%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%20%282016%29.%20%3Ca%20class%3D%27zp-ItemURL%27%20target%3D%27_blank%27%20href%3D%27https%3A%5C%2F%5C%2Freframe.sussex.ac.uk%5C%2Freframebooks%5C%2Farchive2016%5C%2Flive-interfaces%5C%2F%27%3ETuned%20Constraint%3C%5C%2Fa%3E.%20%3Ci%3EICLI%202016%20%5Cu2013%20International%20Conference%20on%20Live%20Interfaces%3C%5C%2Fi%3E.%20%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Tuned%20Constraint%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222016%22%2C%22proceedingsTitle%22%3A%22ICLI%202016%20%5Cu2013%20International%20Conference%20on%20Live%20Interfaces%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22https%3A%5C%2F%5C%2Freframe.sussex.ac.uk%5C%2Freframebooks%5C%2Farchive2016%5C%2Flive-interfaces%5C%2F%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%2236ZTGDIR%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20and%20Miranda%22%2C%22parsedDate%22%3A%222016%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20%26amp%3B%20Miranda%2C%20E.%20R.%20%282016%29.%20Instrumental%20Movements%20to%20Physical%20Models%3A%20Mapping%20Postural%20and%20Sonic%20Topologies%20through%20Machine%20Learning.%20%3Ci%3EPorto%20International%20Conference%20on%20Musical%20Gesture%20as%20Creative%20Interface%3C%5C%2Fi%3E.%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Instrumental%20Movements%20to%20Physical%20Models%3A%20Mapping%20Postural%20and%20Sonic%20Topologies%20through%20Machine%20Learning%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%20Reck%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222016%22%2C%22proceedingsTitle%22%3A%22Porto%20International%20Conference%20on%20Musical%20Gesture%20as%20Creative%20Interface%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%225DEKWHMC%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222015%22%2C%22numChildren%22%3A0%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Coorevits%2C%20E.%2C%20%26amp%3B%20Miranda%2C%20E.%20R.%20%282015%29.%20A%2
0practice-based%20study%20on%20instrumental%20gestures%20in%20music%20composition%20and%20performance%3A%20Kineslimina.%20%3Ci%3EMuSA%202015%20-%20Sixth%20International%20Symposium%20on%20Music%5C%2FSonic%20Art%3A%20Practices%20and%20Theories%3C%5C%2Fi%3E.%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22A%20practice-based%20study%20on%20instrumental%20gestures%20in%20music%20composition%20and%20performance%3A%20Kineslimina%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Esther%22%2C%22lastName%22%3A%22Coorevits%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%20Reck%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222015%22%2C%22proceedingsTitle%22%3A%22MuSA%202015%20-%20Sixth%20International%20Symposium%20on%20Music%5C%2FSonic%20Art%3A%20Practices%20and%20Theories%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%22S7GNXGF2%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222015%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Coorevits%2C%20E.%2C%20Schramm%2C%20R.%2C%20%26amp%3B%20Miranda%2C%20E.%20%282015%29.%20Instrumental%20Movements%20of%20Neophytes%3A%20Analysis%20of%20Movement%20Periodicities%2C%20Commonalities%20and%20Individualities%20in%20Mimed%20Violin%20Performance.%20%3Ci%3EProceedings%20of%20the%2011th%20International%20Symposium%20on%20Computer%20Music%20Multidisciplinary%20Research%20%28CMMR%29%3C%5C%2Fi%3E.%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3DZ4STDIQ9%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Instrumental%20Movements%20of%20Neophytes%3A%20Analysis%20of%20Movement%20Periodicities%2C%20Commonalities%20and%20Individualities%20in%20Mimed%20Violin%20Performance%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Esther%22%2C%22lastName%22%3A%22Coorevits%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rodrigo%22%2C%22lastName%22%3A%22Schramm%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222015%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%2011th%20International%20Symposium%20on%20Computer%20Music%20Multidisciplinary%20Research%20%28CMMR%29%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%2
28MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%22VWTG9P2W%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Coorevits%2C%20E.%2C%20Miranda%2C%20E.%2C%20%26%20Leman%2C%20M.%20%282014%29.%20%3Ca%20class%3D%27zp-ItemURL%27%20target%3D%27_blank%27%20href%3D%27http%3A%5C%2F%5C%2Fspeech.di.uoa.gr%5C%2FICMC-SMC-2014%5C%2Fimages%5C%2FVOL%7B%5C%5C_%7D2%5C%2F1368.pdf%27%3EEffects%20of%20different%20bow%20stroke%20styles%20on%20body%20movements%20of%20a%20viola%20player%3A%20an%20exploratory%20study%3C%5C%2Fa%3E.%20In%20A.%20Georgaki%20%26%20G.%20Kouroupetroglou%20%28Eds.%29%2C%20%3Ci%3EProceedings%20of%20the%20joint%20ICMC%20%7C%20SMC%20%7C2014%2C%2040th%20International%20Computer%20Music%20Conference%2C%2011th%20Sound%20and%20Music%20Computing%20conference%3C%5C%2Fi%3E%20%28pp.%201368%5Cu20131374%29.%20%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3D86WAIU9L%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Effects%20of%20different%20bow%20stroke%20styles%20on%20body%20movements%20of%20a%20viola%20player%3A%20an%20exploratory%20study%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Esther%22%2C%22lastName%22%3A%22Coorevits%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Marc%22%2C%22lastName%22%3A%22Leman%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Anastasia%22%2C%22lastName%22%3A%22Georgaki%22%7D%2C%7B%22creatorType%22%3A%22editor%22%2C%22firstName%22%3A%22Georgios%22%2C%22lastName%22%3A%22Kouroupetroglou%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222014%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%20joint%20ICMC%20%7C%20SMC%20%7C2014%2C%2040th%20International%20Computer%20Music%20Conference%2C%2011th%20Sound%20and%20Music%20Computing%20conference%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22978-960-466-137-4%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Fspeech.di.uoa.gr%5C%2FICMC-SMC-2014%5C%2Fimages%5C%2FVOL%7B%5C%5C_%7D2%5C%2F1368.pdf%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%22YC7HRBYS%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Dothel%2C%20G.%2C%20Williams
%2C%20D.%2C%20%26amp%3B%20Miranda%2C%20E.%20%282014%29.%20Unfolding%20%7C%20Clusters%3A%20A%20Music%20and%20Visual%20Media%20Model%20of%20ALS%20Pathophysiology.%20%3Ci%3EProceedings%20of%20SoniHED%20Conference%3A%20Sonification%20of%20Health%20and%20Environmental%20Data%3C%5C%2Fi%3E.%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3DEXS7IVYE%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Unfolding%20%7C%20Clusters%3A%20A%20Music%20and%20Visual%20Media%20Model%20of%20ALS%20Pathophysiology%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Giovanni%22%2C%22lastName%22%3A%22Dothel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Duncan%22%2C%22lastName%22%3A%22Williams%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222014%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20SoniHED%20Conference%3A%20Sonification%20of%20Health%20and%20Environmental%20Data%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%224S4NMXAP%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A2%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Schramm%2C%20R.%2C%20%26%20Miranda%2C%20E.%20%282014%29.%20%3Ca%20class%3D%27zp-ItemURL%27%20target%3D%27_blank%27%20href%3D%27http%3A%5C%2F%5C%2Fdl.acm.org%5C%2Fcitation.cfm%3Fid%3D2618013%20http%3A%5C%2F%5C%2Fdl.acm.org%5C%2Fcitation.cfm%3Fdoid%3D2617995.2618013%27%3EGesture%20in%20performance%20with%20traditional%20musical%20instruments%20and%20electronics%3C%5C%2Fa%3E.%20%3Ci%3EProceedings%20of%20the%202014%20International%20Workshop%20on%20Movement%20and%20Computing%20-%20MOCO%20%2714%3C%5C%2Fi%3E%2C%20100%5Cu2013105.%20https%3A%5C%2F%5C%2Fdoi.org%5C%2F10.1145%5C%2F2617995.2618013%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3DG885GJCZ%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Gesture%20in%20performance%20with%20traditional%20musical%20instruments%20and%20electronics%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rodrigo%22%2C%22lastName%22%3A%22Schramm%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A
%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222014%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%202014%20International%20Workshop%20on%20Movement%20and%20Computing%20-%20MOCO%20%2714%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%2210.1145%5C%2F2617995.2618013%22%2C%22ISBN%22%3A%22978-1-4503-2814-2%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Fdl.acm.org%5C%2Fcitation.cfm%3Fid%3D2618013%20http%3A%5C%2F%5C%2Fdl.acm.org%5C%2Fcitation.cfm%3Fdoid%3D2617995.2618013%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%22M6P9M3ZQ%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Schramm%2C%20R.%2C%20%26%20Miranda%2C%20E.%20%282014%29.%20%3Ca%20class%3D%27zp-ItemURL%27%20target%3D%27_blank%27%20href%3D%27http%3A%5C%2F%5C%2Fnime2014.org%5C%2Fproceedings%5C%2Fpapers%5C%2F460%7B%5C%5C_%7Dpaper.pdf%27%3EUse%20of%20Body%20Motion%20to%20Enhance%20Traditional%20Musical%20Instruments%3A%20A%20Multimodal%20Embodied%20Approach%20to%20Gesture%20Mapping%20%2C%20Composition%20and%20Performance%3C%5C%2Fa%3E.%20%3Ci%3EProceedings%20of%20the%20International%20Conference%20on%20New%20Interfaces%20for%20Musical%20Expression%3C%5C%2Fi%3E%2C%20601–604.%20%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3DUWN5QI4C%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Use%20of%20Body%20Motion%20to%20Enhance%20Traditional%20Musical%20Instruments%3A%20A%20Multimodal%20Embodied%20Approach%20to%20Gesture%20Mapping%20%2C%20Composition%20and%20Performance%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Rodrigo%22%2C%22lastName%22%3A%22Schramm%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222014%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%20International%20Conference%20on%20New%20Interfaces%20for%20Musical%20Expression%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22http%3A%5C%2F%5C%2Fnime2014.org%5C%2Fproceedings%5C%2Fpapers%5C%2F460%7B%5C%5C_%7Dpaper.pdf%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%22GAVIU5QP%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3
E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Williams%2C%20D.%2C%20Dothel%2C%20G.%2C%20%26amp%3B%20Miranda%2C%20E.%20%282014%29.%20Musification%20of%20ALS%20Pathophysiology%3A%20Notes%20on%20Timbre%20and%20Spatialisation%20in%20Unfolding%20%7C%20Clusters.%20%3Ci%3EProceedings%20of%20the%209th%20Conference%20on%20Interdisciplinary%20Musicology%20-%20CIM14%3C%5C%2Fi%3E%2C%201%26%23×2013%3B6.%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3DA336ZQET%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22Musification%20of%20ALS%20Pathophysiology%3A%20Notes%20on%20Timbre%20and%20Spatialisation%20in%20Unfolding%20%7C%20Clusters%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Duncan%22%2C%22lastName%22%3A%22Williams%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Giovanni%22%2C%22lastName%22%3A%22Dothel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222014%22%2C%22proceedingsTitle%22%3A%22Proceedings%20of%20the%209th%20Conference%20on%20Interdisciplinary%20Musicology%20-%20CIM14%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%2C%7B%22key%22%3A%22DXV7ERPI%22%2C%22library%22%3A%7B%22id%22%3A5790233%7D%2C%22meta%22%3A%7B%22creatorSummary%22%3A%22Visi%20et%20al.%22%2C%22parsedDate%22%3A%222014%22%2C%22numChildren%22%3A1%7D%2C%22bib%22%3A%22%3Cdiv%20class%3D%5C%22csl-bib-body%5C%22%20style%3D%5C%22line-height%3A%202%3B%20padding-left%3A%201em%3B%20text-indent%3A-1em%3B%5C%22%3E%5Cn%20%20%3Cdiv%20class%3D%5C%22csl-entry%5C%22%3E%3Cstrong%3EVisi%3C%5C%2Fstrong%3E%2C%20F.%2C%20Williams%2C%20D.%2C%20Dothel%2C%20G.%2C%20%26amp%3B%20Miranda%2C%20E.%20%282014%29.%20An%20Immersive%20Media%20Model%20of%20Amyotrophic%20Lateral%20Sclerosis.%20%3Ci%3EEVA%20London%202014%3A%20Electronic%20Visualisation%20and%20the%20Arts%3C%5C%2Fi%3E.%20%3Ca%20title%3D%27Download%27%20class%3D%27zp-DownloadURL%27%20href%3D%27https%3A%5C%2F%5C%2Fwww.federicovisi.com%5C%2Fwp-content%5C%2Fplugins%5C%2Fzotpress%5C%2Flib%5C%2Frequest%5C%2Frequest.dl.php%3Fapi_user_id%3D5790233%26amp%3Bdlkey%3DEMKR8RIW%26amp%3Bcontent_type%3Dapplication%5C%2Fpdf%27%3EDownload%3C%5C%2Fa%3E%3C%5C%2Fdiv%3E%5Cn%3C%5C%2Fdiv%3E%22%2C%22data%22%3A%7B%22itemType%22%3A%22conferencePaper%22%2C%22title%22%3A%22An%20Immersive%20Media%20Model%20of%20Amyotrophic%20Lateral%20Sclerosis%22%2C%22creators%22%3A%5B%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Federico%22%2C%22lastName%22%3A%22Visi%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Duncan%22%2C%22lastName%22%3A%22Williams%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Giovanni%22%2C%22lastName%22%3A%22Dothel%22%7D%2C%7B%22creatorType%22%3A%22author%22%2C%22firstName%22%3A%22Eduardo%22%2C%22lastName%22%3A%22Miranda%22%7D%5D%2C%22abstractNote%22%3A%22%22%2C%22date%22%3A%222014%22%2C%22proceeding
sTitle%22%3A%22EVA%20London%202014%3A%20Electronic%20Visualisation%20and%20the%20Arts%22%2C%22conferenceName%22%3A%22%22%2C%22language%22%3A%22%22%2C%22DOI%22%3A%22%22%2C%22ISBN%22%3A%22%22%2C%22url%22%3A%22%22%2C%22collections%22%3A%5B%228MWBZ5AR%22%5D%2C%22dateModified%22%3A%222024-12-07T17%3A05%3A22Z%22%7D%7D%5D%7D
Visi, F. (2024). The Sophtar: a networkable feedback string instrument with embedded machine learning. NIME 2024 Proceedings of the International Conference on New Interfaces for Musical Expression.
Visi, F., Schramm, R., Frödin, K., Unander-Scharin, Å., & Östersjö, S. (2024). Empirical Analysis of Gestural Sonic Objects Combining Qualitative and Quantitative Methods. In A. R. Jensenius (Ed.), Sonic Design (pp. 115–137). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-57892-2_7
Ek, R., Östersjö, S., Visi, F. G., & Petersson, M. (2021). The TCP/Indeterminate Place Quartet: a Global Hyperorgan Scenario. International Conference on New Interfaces for Musical Expression (NIME).
Harlow, R., Petersson, M., Ek, R., Visi, F., & Östersjö, S. (2021). Global Hyperorgan: a platform for telematic musicking and research. NIME 2021, 1–15. https://doi.org/10.21428/92fbeb44.d4146b2d
Visi, F. G., & AQAXA. (2020). “You have a new memory.” ICLI 2020 – the Fifth International Conference on Live Interfaces.
Visi, F. G., & Tanaka, A. (2020). Towards Assisted Interactive Machine Learning: Exploring Gesture-Sound Mappings Using Reinforcement Learning. ICLI 2020 – the Fifth International Conference on Live Interfaces, 10–19. https://doi.org/10.5281/zenodo.3928167
Visi, F., & Schramm, R. (2019). Introduction. In F. Visi (Ed.), Music proceedings of the international conference on new interfaces for musical expression (p. 4). UFRGS.
Visi, F. (2018). SloMo study #2. Proceedings of the 5th International Conference on Movement and Computing, 1–2. https://doi.org/10.1145/3212721.3212890
Schramm, R., Visi, F., Brasil, A., & Johann, M. (2018). A polyphonic pitch tracking embedded system for rapid instrument augmentation. In L. Dahl, D. Bowman, & T. Martin (Eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (pp. 120–125). Virginia Tech.
Dahl, L., & Visi, F. (2018). Modosc: A Library of Real-Time Movement Descriptors for Marker-Based Motion Capture. MOCO ’18 Proceedings of the 5th International Conference on Movement and Computing, 1–4. https://doi.org/10.1145/3212721.3212842
Visi, F., & Dahl, L. (2018). Real-Time Motion Capture Analysis and Music Interaction with the Modosc Descriptor Library. Proceedings of the International Conference on New Interfaces for Musical Expression, 144–147. https://doi.org/10.5281/zenodo.1302707
Visi, F., Georgiou, T., Holland, S., Pinzone, O., Donaldson, G., & Tetley, J. (2017). Assessing the Accuracy of an Algorithm for the Estimation of Spatial Gait Parameters Using Inertial Measurement Units. MOCO ’17 Proceedings of the 4th International Conference on Movement Computing, 1–7. https://doi.org/10.1145/3077981.3078034
Visi, F., Caramiaux, B., Mcloughlin, M., & Miranda, E. (2017). A Knowledge-based, Data-driven Method for Action-sound Mapping. Proceedings of the International Conference on New Interfaces for Musical Expression.
Visi, F. (2016). Tuned Constraint. ICLI 2016 – International Conference on Live Interfaces.
Visi, F., & Miranda, E. R. (2016). Instrumental Movements to Physical Models: Mapping Postural and Sonic Topologies through Machine Learning. Porto International Conference on Musical Gesture as Creative Interface.
Visi, F., Coorevits, E., & Miranda, E. R. (2015). A practice-based study on instrumental gestures in music composition and performance: Kineslimina. MuSA 2015 – Sixth International Symposium on Music/Sonic Art: Practices and Theories.
Visi, F., Coorevits, E., Schramm, R., & Miranda, E. (2015). Instrumental Movements of Neophytes: Analysis of Movement Periodicities, Commonalities and Individualities in Mimed Violin Performance. Proceedings of the 11th International Symposium on Computer Music Multidisciplinary Research (CMMR).
Visi, F., Coorevits, E., Miranda, E., & Leman, M. (2014). Effects of different bow stroke styles on body movements of a viola player: an exploratory study. In A. Georgaki & G. Kouroupetroglou (Eds.), Proceedings of the joint ICMC | SMC |2014, 40th International Computer Music Conference, 11th Sound and Music Computing conference (pp. 1368–1374).
Visi, F., Dothel, G., Williams, D., & Miranda, E. (2014). Unfolding | Clusters: A Music and Visual Media Model of ALS Pathophysiology. Proceedings of SoniHED Conference: Sonification of Health and Environmental Data.
Visi, F., Schramm, R., & Miranda, E. (2014). Gesture in performance with traditional musical instruments and electronics. Proceedings of the 2014 International Workshop on Movement and Computing – MOCO ’14, 100–105. https://doi.org/10.1145/2617995.2618013
Visi, F., Schramm, R., & Miranda, E. (2014). Use of Body Motion to Enhance Traditional Musical Instruments: A Multimodal Embodied Approach to Gesture Mapping, Composition and Performance. Proceedings of the International Conference on New Interfaces for Musical Expression, 601–604.
Visi, F., Williams, D., Dothel, G., & Miranda, E. (2014). Musification of ALS Pathophysiology: Notes on Timbre and Spatialisation in Unfolding | Clusters. Proceedings of the 9th Conference on Interdisciplinary Musicology – CIM14, 1–6.
Visi, F., Williams, D., Dothel, G., & Miranda, E. (2014). An Immersive Media Model of Amyotrophic Lateral Sclerosis. EVA London 2014: Electronic Visualisation and the Arts.
Book Chapters
Visi, F. G., & Tanaka, A. (2021). Interactive Machine Learning of Musical Gesture. In Handbook of Artificial Intelligence for Music (pp. 771–798). Springer International Publishing. https://doi.org/10.1007/978-3-030-72116-9_27
Zbyszyński, M., Di Donato, B., Visi, F. G., & Tanaka, A. (2021). Gesture-Timbre Space: Multidimensional Feature Mapping Using Machine Learning and Concatenative Synthesis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics): Vol. 12631 LNCS (pp. 600–622). https://doi.org/10.1007/978-3-030-70210-6_39
Visi, F., & Faasch, F. (2018). Motion Controllers, Sound, and Music in Video Games: State of the Art and Research Perspectives. In D. Williams & N. Lee (Eds.), Emotion in Video Game Soundtracking (1st ed.). Springer International Publishing. https://doi.org/10.1007/978-3-319-72272-6
Visi, F. (2017). Augmenting Instruments and Extending Cultures: on the Overtone Violin. In A. R. Jensenius & M. Lyons (Eds.), A NIME Reader: Fifteen years of New Interfaces for Musical Expression. Springer International Publishing.
Visi, F., Coorevits, E., Schramm, R., & Miranda, E. R. (2016). Analysis of Mimed Violin Performance Movements of Neophytes: Patterns, Periodicities, Commonalities and Individualities. In R. Kronland-Martinet, M. Aramaki, & S. Ystad (Eds.), Music, Mind, and Embodiment: 11th International Symposium on Computer Music Multidisciplinary Research, CMMR 2015, Plymouth, UK, June 16-19, 2015, Revised Selected Papers (Vol. 9617, pp. 88–108). Springer International Publishing. https://doi.org/10.1007/978-3-319-46282-0
PhD Thesis
Visi, F. (2017). Methods and Technologies for the Analysis and Interactive Use of Body Movements in Instrumental Music Performance [PhD Thesis, Plymouth University].
Edited
Visi, F. (Ed.). (2019). Music Proceedings of the International Conference on New Interfaces for Musical Expression. UFRGS.