This dataset contains de-identified behavioral and electrophysiological data from a study examining interhemispheric somatosensory communication during bimanual movement in pianists and nonmusicians. Data were collected using electrical stimulation, passive robotic finger movement, and perceptual tasks, and include participant-level measures of somatosensory responses, behavioral performance, experimental condition labels, and analysis-ready summary variables. The files are structured to reproduce the principal analyses of directional asymmetry, group differences, and brain-behavior associations reported in the associated manuscript. These data may be reused for research on bimanual coordination, somatosensory processing, hemispheric specialization, expertise-dependent plasticity, and comparative statistical reanalysis. All data are fully de-identified and were collected under institutional ethical approval with informed consent.

Description of data.zip

The dataset is provided as a single compressed archive (data.zip) containing three comma-separated value (CSV) files: fig3_data.csv, fig4_data.csv, and fig5_data.csv. Each file is organized in long format and contains the source data for one main figure in the associated manuscript.

fig3_data.csv

fig3_data.csv contains the data used for Figure 3. The columns are erp, finger, condition, hemisphere, group, and id.
- erp: P1 amplitude of the event-related potential (ERP) elicited by passive finger movement.
- finger: the stimulated/moved finger, coded 1 to 5 (thumb, index, middle, ring, little finger).
- condition: movement context; 0 = unimanual, 1 = bimanual.
- hemisphere: hemisphere from which the ERP was measured; 0 = dominant, 1 = nondominant.
- group: participant group; 0 = pianist, 1 = nonmusician.
- id: unique participant identifier.
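As a loading sketch, the integer codes in fig3_data.csv can be mapped to readable labels using the codebook above. The decoding dictionaries below simply restate that codebook; the function name `decode_fig3` and the synthetic example row are illustrative assumptions, not part of the dataset.

```python
import pandas as pd

# Codebook for fig3_data.csv, restated from the dataset description.
FINGER = {1: "thumb", 2: "index", 3: "middle", 4: "ring", 5: "little"}
CONDITION = {0: "unimanual", 1: "bimanual"}
HEMISPHERE = {0: "dominant", 1: "nondominant"}
GROUP = {0: "pianist", 1: "nonmusician"}

def decode_fig3(df: pd.DataFrame) -> pd.DataFrame:
    """Replace integer codes with labels; erp and id are left unchanged."""
    return df.assign(
        finger=df["finger"].map(FINGER),
        condition=df["condition"].map(CONDITION),
        hemisphere=df["hemisphere"].map(HEMISPHERE),
        group=df["group"].map(GROUP),
    )

# In practice: decoded = decode_fig3(pd.read_csv("fig3_data.csv"))
# Illustrated here on one synthetic row matching the documented schema:
example = pd.DataFrame(
    {"erp": [2.1], "finger": [2], "condition": [1],
     "hemisphere": [0], "group": [0], "id": ["sub-01"]}
)
decoded = decode_fig3(example)
```

Keeping the raw integer codes on disk and decoding at load time preserves compatibility with the published analysis scripts while making exploratory summaries easier to read.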
fig4_data.csv

fig4_data.csv contains the data used for Figure 4. The columns are pp_sep, sp_sep, hemisphere, group, id, and condition.
- pp_sep: amplitude of the somatosensory evoked potential (SEP) elicited by paired-pulse electrical stimulation.
- sp_sep: SEP amplitude elicited by single-pulse stimulation.
- hemisphere: recorded hemisphere; D = dominant, ND = nondominant.
- group: participant group; P = pianist, NM = nonmusician.
- id: unique participant identifier.
- condition: task state; Rest = resting condition, Bimanual = both hands passively moved by the exoskeleton during recording.

fig5_data.csv

fig5_data.csv contains the data used for Figure 5. The columns are pt, id, hand, condition, and group.
- pt: perceptual threshold after log10 transformation.
- id: unique participant identifier.
- hand: hand for which the perceptual threshold was measured; 1 = dominant, 2 = nondominant.
- condition: state of the opposite hand during threshold measurement; 1 = opposite hand at rest, 2 = opposite hand moving randomly.
- group: participant group; 1 = pianist, 2 = nonmusician.

All files are fully de-identified; no personally identifiable information is included. The dataset can be reused to reproduce the main analyses and figures in the manuscript, and may also support secondary analyses of interhemispheric somatosensory processing, bimanual coordination, expertise-related plasticity, and brain-behavior relationships.
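For secondary analyses of fig4_data.csv, a conventional derived measure is the paired-pulse suppression ratio, pp_sep divided by sp_sep. This ratio is a common SEP index but is an assumption of this sketch, not a documented column in the file; the amplitudes below are made up for illustration.

```python
# Hypothetical sketch: paired-pulse suppression ratio for fig4_data.csv rows.
# The ratio pp_sep / sp_sep is a conventional paired-pulse SEP index and is
# an assumption of this example, not a column shipped in the dataset.

def pp_ratio(pp_sep: float, sp_sep: float) -> float:
    """Paired-pulse / single-pulse SEP amplitude ratio; values below 1
    indicate paired-pulse suppression."""
    if sp_sep == 0:
        raise ValueError("single-pulse amplitude must be nonzero")
    return pp_sep / sp_sep

# Example on made-up amplitudes (arbitrary units):
ratio = pp_ratio(1.2, 2.4)  # below 1, i.e. suppression
```

Computed per participant, hemisphere, and condition (Rest vs. Bimanual), such a ratio could then be related to the log10-transformed perceptual thresholds in fig5_data.csv for brain-behavior reanalyses.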