Purpose. The study aimed to evaluate the potential of artificial intelligence algorithms for detecting latent sexual disorders and to determine the effectiveness of various models for the integrative diagnosis of subclinical psychosexual disorders.

Materials and methods. A systematic analysis of scientific publications from the international databases PubMed, Scopus, and Web of Science was conducted for the period 2014–2025. Fifty-four studies met the inclusion criteria: the use of machine learning, deep learning, or natural language processing (NLP) algorithms to assess psychosexual or behavioral markers of sexual function. Multimodal approaches were considered, including the analysis of speech patterns, digital behavioral signals, and physiological data. Statistical analysis assessed the accuracy, sensitivity, and specificity of the models and compared these metrics across data types.

Results. Artificial intelligence algorithms proved able to detect latent sexual disorders that traditional clinical methods fail to identify. The highest performance was demonstrated by multimodal models integrating language, behavioral, and physiological data, which achieved high sensitivity (0.78–0.88) and specificity (0.76–0.88). NLP models revealed subtle linguistic markers of subclinical impairment, digital phenotypes allowed indirect behavioral indicators to be tracked, and physiological signals provided additional objective assessments of reactivity and arousal. Limitations were also identified: low sample representativeness, methodological heterogeneity, a lack of standardized criteria for defining latent disorders, and ethical risks regarding data confidentiality.

Conclusions. Artificial intelligence algorithms have significant potential for the diagnosis of latent sexual disorders and can become the basis for preventive digital sexopathology.
Further development of the field requires large-scale multimodal research, standardization of algorithms, multicenter validation, and implementation of ethical principles of data processing.
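The sensitivity and specificity figures reported above are standard confusion-matrix quantities. As a minimal illustrative sketch (the function names and sample labels below are hypothetical and not drawn from the reviewed studies), these metrics can be computed from binary classification outputs as follows:

```python
def confusion_counts(y_true, y_pred):
    """Count true/false positives and negatives for binary labels (1 = disorder present)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def sensitivity(tp, fn):
    # Proportion of true cases the model correctly flags (true positive rate)
    return tp / (tp + fn)

def specificity(tn, fp):
    # Proportion of unaffected cases the model correctly clears (true negative rate)
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical labels for 10 screened individuals
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
print(round(sensitivity(tp, fn), 2))   # 0.75
print(round(specificity(tn, fp), 2))   # 0.83
```

Comparing models on both metrics together, as the reviewed studies did, guards against classifiers that trade missed cases for false alarms or vice versa.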