Encoding force modulation in two electrotactile feedback parameters strengthens sensory integration according to maximum likelihood estimation
Published in | Scientific reports Vol. 13; no. 1; Art. 12461 (13 pages) |
---|---|
Main Authors | Gholinezhad, Shima; Farina, Dario; Dosen, Strahinja; Dideriksen, Jakob |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK, 01.08.2023 |
Abstract | Bidirectional human–machine interfaces involve commands from the central nervous system to an external device and feedback characterizing device state. Such feedback may be elicited by electrical stimulation of somatosensory nerves, where a task-relevant variable is encoded in stimulation amplitude or frequency. Recently, concurrent modulation in amplitude and frequency (multimodal encoding) was proposed. We hypothesized that feedback with multimodal encoding may effectively be processed by the central nervous system as two independent inputs encoded in amplitude and frequency, respectively, thereby increasing state estimate quality in accordance with maximum-likelihood estimation. Using an adaptation paradigm, we tested this hypothesis during a grasp force matching task where subjects received electrotactile feedback encoding instantaneous force in amplitude, frequency, or both, in addition to their natural force feedback. The results showed that adaptations in grasp force with multimodal encoding could be accurately predicted as the integration of three independent inputs according to maximum-likelihood estimation: amplitude modulated electrotactile feedback, frequency modulated electrotactile feedback, and natural force feedback (r² = 0.73). These findings show that multimodal electrotactile feedback carries an intrinsic advantage for state estimation accuracy with respect to single-variable modulation and suggest that this scheme should be the preferred strategy for bidirectional human–machine interfaces with electrotactile feedback. |
---|---|
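The maximum-likelihood integration the abstract refers to reduces, for independent Gaussian-distributed cues, to inverse-variance weighting: each input contributes in proportion to its reliability, and the fused estimate is more reliable than any single cue. The sketch below is illustrative only, not the authors' analysis code, and all numeric values are hypothetical:

```python
# Minimal sketch of maximum-likelihood (minimum-variance) integration of
# independent sensory estimates. Each cue is weighted by its inverse variance;
# the variance of the fused estimate is 1 / (sum of inverse variances), which
# is always lower than the variance of any individual cue.
def ml_integrate(estimates, variances):
    """Fuse independent Gaussian cues given their means and variances."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused_mean = sum(w * x for w, x in zip(weights, estimates)) / total
    fused_var = 1.0 / total
    return fused_mean, fused_var

# Hypothetical means/variances for the three inputs named in the abstract:
# amplitude-coded electrotactile, frequency-coded electrotactile, and
# natural force feedback.
mean, var = ml_integrate([10.0, 12.0, 11.0], [4.0, 2.0, 3.0])
assert var < min(4.0, 2.0, 3.0)  # fused estimate beats every single cue
```

This is the same weighting scheme used in classic visuo-haptic integration studies (e.g. Ernst & Banks, cited in this article's reference list), here applied to three cues instead of two.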
ArticleNumber | 12461 |
Author | Gholinezhad, Shima Dosen, Strahinja Dideriksen, Jakob Farina, Dario |
Author_xml | – sequence: 1; givenname: Shima; surname: Gholinezhad; fullname: Gholinezhad, Shima; organization: Department of Health Science and Technology, Aalborg University
– sequence: 2; givenname: Dario; surname: Farina; fullname: Farina, Dario; organization: Department of Bioengineering, Faculty of Engineering, Imperial College London
– sequence: 3; givenname: Strahinja; surname: Dosen; fullname: Dosen, Strahinja; organization: Department of Health Science and Technology, Aalborg University
– sequence: 4; givenname: Jakob; surname: Dideriksen; fullname: Dideriksen, Jakob; email: jldi@hst.aau.dk; organization: Department of Health Science and Technology, Aalborg University |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/37528160 (view this record in MEDLINE/PubMed) |
CitedBy_id | crossref_primary_10_1109_TNSRE_2024_3443398 |
ContentType | Journal Article |
Copyright | The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
DOI | 10.1038/s41598-023-38753-y |
Discipline | Biology |
EISSN | 2045-2322 |
EndPage | 13 |
ExternalDocumentID | oai_doaj_org_article_e2a1463ced8c4457b70e0df5efd9cebd PMC10393971 37528160 10_1038_s41598_023_38753_y |
Genre | Research Support, Non-U.S. Gov't Journal Article |
GrantInformation_xml | – fundername: Danmarks Frie Forskningsfond grantid: 8022-00243A; 8022-00226B funderid: http://dx.doi.org/10.13039/501100004836 – fundername: ; grantid: 8022-00243A; 8022-00226B |
ISSN | 2045-2322 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Language | English |
License | 2023. The Author(s). Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. |
OpenAccessLink | https://doaj.org/article/e2a1463ced8c4457b70e0df5efd9cebd |
PMID | 37528160 |
PageCount | 13 |
PublicationDate | 2023-08-01 |
PublicationPlace | London |
PublicationTitle | Scientific reports |
PublicationTitleAbbrev | Sci Rep |
PublicationYear | 2023 |
Publisher | Nature Publishing Group UK Nature Publishing Group Nature Portfolio |
SSID | ssj0000529419 |
Score | 2.4057486 |
Snippet | Bidirectional human–machine interfaces involve commands from the central nervous system to an external device and feedback characterizing device state. Such... Bidirectional human-machine interfaces involve commands from the central nervous system to an external device and feedback characterizing device state. Such... Abstract Bidirectional human–machine interfaces involve commands from the central nervous system to an external device and feedback characterizing device... |
SourceID | doaj pubmedcentral proquest pubmed crossref springer |
SourceType | Open Website; Open Access Repository; Aggregation Database; Index Database; Enrichment Source; Publisher |
StartPage | 12461 |
SubjectTerms | 631/378/3917; 639/166/985; Adaptation; Central nervous system; Electric Stimulation; Electrical stimuli; Feedback; Feedback, Sensory - physiology; Hand Strength - physiology; Humanities and Social Sciences; Humans; Interfaces; Likelihood Functions; multidisciplinary; Nervous system; Science; Science (multidisciplinary); Sensory integration; Touch - physiology |
Title | Encoding force modulation in two electrotactile feedback parameters strengthens sensory integration according to maximum likelihood estimation |
URI | https://link.springer.com/article/10.1038/s41598-023-38753-y https://www.ncbi.nlm.nih.gov/pubmed/37528160 https://www.proquest.com/docview/2844437932 https://www.proquest.com/docview/2845103300 https://pubmed.ncbi.nlm.nih.gov/PMC10393971 https://doaj.org/article/e2a1463ced8c4457b70e0df5efd9cebd |
Volume | 13 |
Citation | Gholinezhad, Shima; Farina, Dario; Dosen, Strahinja; Dideriksen, Jakob. Encoding force modulation in two electrotactile feedback parameters strengthens sensory integration according to maximum likelihood estimation. Scientific Reports 13(1), 2023-08-01. ISSN/eISSN 2045-2322. doi:10.1038/s41598-023-38753-y |