Context. Star formation within filaments may arise from the growth of cores according to linear perturbation theory. This implies a minimum core separation, as shorter-wavelength modes cannot grow. While many observed core separations agree with these theoretical predictions, some observations show star-forming cores lying closer together than the minimum wavelength given by perturbation theory.
Aims. We explore whether non-linear effects during the late stages of core growth can explain this discrepancy between theory and observations.
Methods. We perform 3D hydrodynamical simulations with the RAMSES code to follow the evolution of initial perturbations within filaments, and compare the measured growth rates to the expectations of theoretical models.
Results. Non-linear evolution sets in as soon as the core mass reaches a value at which the gravitational potential is no longer dominated by the cylindrical potential of the filament but by the spherical potential of the Bonnor-Ebert sphere. Consequently, core collapse is triggered not by the loss of hydrostatic stability of the filament but by the loss of hydrostatic stability of the Bonnor-Ebert sphere. Since the core is embedded in the filament, the maximum core mass is set by the pressure within the filament, resulting in a constant line-mass threshold for core collapse.
Conclusions. Because core collapse is triggered as soon as overdensities reach this line-mass threshold, cores that form as large line-mass perturbations during filament formation can collapse directly even if their separations are smaller than predicted by linear perturbation theory. Our result can therefore explain the discrepancy between theory and observations.
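The two stability thresholds contrasted above can be illustrated numerically with the standard textbook expressions: the critical line mass of an isothermal filament, M_line,crit = 2 c_s^2 / G, and the Bonnor-Ebert critical mass for a pressure-bounded sphere, M_BE ≈ 1.18 c_s^4 / (G^{3/2} P_ext^{1/2}). The sketch below evaluates both for fiducial cold-filament conditions; the temperature, mean molecular weight, and ambient density are illustrative assumptions, not values taken from this work.

```python
import math

# Physical constants in cgs units
G = 6.674e-8       # gravitational constant [cm^3 g^-1 s^-2]
k_B = 1.381e-16    # Boltzmann constant [erg K^-1]
m_H = 1.673e-24    # hydrogen atom mass [g]
M_sun = 1.989e33   # solar mass [g]
pc = 3.086e18      # parsec [cm]

# Assumed fiducial conditions (illustrative, not from the abstract):
T = 10.0           # gas temperature [K]
mu = 2.33          # mean molecular weight of molecular gas
n_ext = 1.0e4      # ambient number density bounding the core [cm^-3]

# Isothermal sound speed: c_s = sqrt(k_B T / (mu m_H))
c_s = math.sqrt(k_B * T / (mu * m_H))

# Critical line mass of an isothermal filament (Ostriker 1964):
# above this value the filament loses hydrostatic stability.
M_line_crit = 2.0 * c_s**2 / G                       # [g cm^-1]
M_line_crit_Msun_pc = M_line_crit * pc / M_sun       # [M_sun pc^-1]

# Bonnor-Ebert critical mass for external pressure P_ext = n k_B T:
# above this value the pressure-bounded sphere loses stability.
P_ext = n_ext * k_B * T                              # [erg cm^-3]
M_BE = 1.18 * c_s**4 / (G**1.5 * math.sqrt(P_ext))   # [g]
M_BE_Msun = M_BE / M_sun                             # [M_sun]

print(f"c_s = {c_s / 1e5:.2f} km/s")
print(f"M_line,crit = {M_line_crit_Msun_pc:.1f} M_sun/pc")
print(f"M_BE = {M_BE_Msun:.2f} M_sun")
```

For 10 K gas this gives the familiar critical line mass of roughly 16 M_sun per parsec and a Bonnor-Ebert mass of order one solar mass, illustrating how the pressure inside the filament sets the maximum mass an embedded core can reach before collapsing.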