https://con.duikt.edu.ua/index.php/communication/issue/feedConnectivity2025-07-23T14:56:44+00:00Open Journal Systems<p><img src="/public/site/images/coneditor/Обкладинка_Звязок_№_6_(172)3.png"></p> <p><strong>Name of journal</strong>: «Connectivity» (Зв'язок)<br><strong>Founder</strong>: State University of Telecommunications.<br><strong>Year of foundation</strong>: 1995.<br><strong>State certificate of registration</strong>: <a href="http://www.irbis-nbuv.gov.ua/cgi-bin/irbis_nbuv/cgiirbis_64.exe?C21COM=2&I21DBN=UJRN&P21DBN=UJRN&Z21ID=&Image_file_name=IMG%2Fvduikt_s.jpg&IMAGE_FILE_DOWNLOAD=0">КВ № 20996-10796 ПР від 25.09.2014</a>. <br><strong>ISSN</strong>: 2412-9070<br><strong>Subject</strong>: telecommunications, information technologies, computer engineering, education.<br><strong>Periodicity</strong>: six times a year.<br><strong>Address</strong>: Solomyanska Str., 7, Kyiv, 03110, Ukraine.<br><strong>Telephone</strong>: +380 (44) 249 25 42<br><strong>E-mail</strong>: <strong><a href="mailto:kpstorchak@ukr.net">kpstorchak@ukr.net</a></strong><br><strong>Website: </strong><a href="http://www.dut.edu.ua/" target="_blank" rel="noopener">http://www.dut.edu.ua/</a>, <a href="http://con.dut.edu.ua/">http://con.dut.edu.ua/</a></p>https://con.duikt.edu.ua/index.php/communication/article/view/2868Title2025-07-22T20:53:25+00:00<p>Title</p>2025-07-13T22:13:06+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2869Content2025-07-22T20:53:29+00:00<p>Content</p>2025-07-13T22:16:10+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2870Monitoring and use of IoT for industrial process automation2025-07-23T14:50:07+00:00Заячковський А. В. (Zayachkovskyi A. V.)con@duikt.edu.uaЗавацький В. О. (Zavatsky V. O.)con@duikt.edu.uaСторчак К. П. (Storchak K. P.)con@duikt.edu.uaТкаленко О. М. (Tkalenko O.
M.)con@duikt.edu.ua<p>Recent technological advances offer innovative methods for automating industrial processes, which play a critical role in enhancing productivity, efficiency, and overall competitiveness in the manufacturing sector. The Internet of Things (IoT) has emerged as one of the primary technologies driving this transformation. By enabling the connection and communication of devices and machinery, IoT technologies provide a foundation for creating intelligent systems that support the real-time collection, processing, and analysis of data. This capability is particularly valuable in manufacturing environments, where it facilitates informed decision-making, optimizes resource allocation, and enables predictive maintenance, leading to a significant reduction in operational costs. Additionally, IoT-driven automation enhances safety by enabling the early detection of anomalies and potential hazards, reducing human intervention in high-risk tasks. <br>The integration of microprocessor systems with IoT introduces a new level of flexibility and scalability in automated industrial solutions. These systems can be customized to meet the specific needs of various industries, such as automotive, pharmaceuticals, and energy, where production processes demand high precision and reliability. By leveraging the power of IoT and microprocessors, industries can achieve real-time monitoring and adaptive control of machinery, optimizing the manufacturing workflow and ensuring consistent product quality. Furthermore, IoT applications in industrial automation offer a modular approach, allowing manufacturers to expand or reconfigure production systems without significant downtime or resource investment. <br>Implementing IoT solutions in industrial environments requires robust communication protocols and data security measures to ensure the safe and efficient exchange of information between connected devices.
Secure data handling and storage are essential to protect sensitive information and maintain system integrity, especially in large-scale operations. Consequently, industrial IoT solutions often incorporate advanced cybersecurity practices, including encryption and authentication mechanisms, to safeguard against potential breaches or disruptions. <br>Overall, the use of IoT and microprocessor-based systems in industrial process automation represents a paradigm shift in manufacturing, promoting sustainability and operational efficiency. As industries increasingly adopt IoT-enabled automation, they are better positioned to meet the demands of a rapidly changing market, adjust to resource constraints, and minimize environmental impact. By capitalizing on real-time insights and intelligent control systems, IoT technology is paving the way for a future where automated industrial processes are smarter, more adaptive, and economically viable.</p> <p><strong>Keywords:</strong> Internet of Things (IoT), microprocessor systems, industrial process automation, smart systems, real-time data, predictive maintenance, cybersecurity.</p>2025-07-14T20:23:23+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2871Methodology for applying service level agreements (SLA)2025-07-23T14:51:23+00:00Заїка В. Ф. (Zaika V. F.)con@duikt.edu.uaВарфоломеєва О. Г. (Varfolomeeva O. G.)con@duikt.edu.uaКолченко Г. Ф. (Kolchenko G. F.)con@duikt.edu.uaМиронов Д. В. (Mironov D. V.)con@duikt.edu.uaПерепелиця Н. Л. (Perepelitsa N. L.)con@duikt.edu.ua<p>The principles of building next-generation networks are considered. The main technological features that distinguish information and communication services from the services of traditional communication networks are analyzed. 
It is determined that the next generation network should ensure the transmission of all types of media traffic and the distributed provision of an unlimited range of information and communication services with the possibility of scaling. For some information and communication services, the order of packet arrival, packet delay and delay variation (jitter) are critical. It is necessary to guarantee the delivery of information such as speech, video and multimedia in real time with the lowest possible delay. For this purpose, mechanisms must be implemented in the network to guarantee the required quality of service (QoS). To ensure the guaranteed quality of service at the upper levels of the hierarchy, according to the Open Systems Interconnection (OSI) model, a Service Level Agreement (SLA) is proposed. <br>The use of SLAs in the modern electronic communications market stems from the fact that, when choosing a service provider, the subscriber (consumer) is mainly interested in the following: cost (affordability), the performance of the equipment providing the service, and the level of quality of service provision (maintenance). At the same time, the subscriber expects the operator or provider to ensure not only uninterrupted provision of services, but also the rapid introduction of new ones. <br>An SLA is an agreement between an operator or provider and a subscriber (consumer) on the provision/receipt of specified telecommunications services at a defined quality level, which makes it possible to guarantee that level of quality for the services provided.
<br>An SLA is thus a universal mechanism for agreeing with the consumer on the quality level of the information and communication services provided, that is, a mechanism that expresses the quality of telecommunications services from the consumer's point of view.</p> <p><strong>Keywords:</strong> network, management, delay, probability, information and communication service, agreement, QoS, SLA.</p>2025-07-14T21:08:56+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2872A model for controlling virtualized network functions under dynamic load changes2025-07-23T14:56:44+00:00Кульчицький Д. О. (Kulchytskyi D. O.)con@duikt.edu.uaЖуков Є. В. (Zhukov Y. V.)con@duikt.edu.ua<p>The paper addresses the challenge of improving the management efficiency of virtualized network functions (NFV) in software-defined network (SDN) environments under dynamic load changes and varying Quality of Service (QoS) requirements. The increasing scale and complexity of network structures necessitate the introduction of intelligent automation mechanisms for network infrastructure management. <br>A conceptual model of intelligent NFV is proposed, integrating fuzzy production-rule inference ("if-then" rules) into the orchestration process of virtual network functions. This model enables real-time monitoring of network metrics and resource states to automatically make decisions regarding scaling, resource allocation, and routing, thereby adapting network operations to current conditions. <br>It is argued that traditional approaches with rigid threshold settings or static policies cannot provide sufficient flexibility and responsiveness in modern networks. In contrast, the use of fuzzy logic allows for processing imprecise input data, incorporating expert knowledge, and achieving smooth adaptive management without abrupt transitions.
The paper outlines the structure of the proposed model, describes its operating algorithm, and provides examples of fuzzy rules. <br>The application of this approach demonstrates more efficient use of data center resources, maintains stable latency and other QoS metrics even under sudden traffic changes, and reduces the risk of network overload. This is particularly important for critical services requiring continuous operation and low latency (real-time services such as VoIP, video streaming, IoT, etc.).</p> <p><strong>Keywords:</strong> software-defined networks; network function virtualization; fuzzy production-rule inference; orchestration; quality of service (QoS); intelligent control system.</p>2025-07-14T21:40:34+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2873BLE and ZigBee in IoT climate control systems: a comparative overview and application prospects2025-07-23T14:53:40+00:00Коломієць Н. В. (Kolomiiets N. V.)con@duikt.edu.uaВладарчик Ю. В. (Vladarchyk Y. V.)con@duikt.edu.uaЗайченко С. П. (Zaichenko S. P.)con@duikt.edu.uaУнгурян С. В. (Unhuryan S. V.)con@duikt.edu.ua<p>The choice of network technology is key in the development of IoT systems. As the use of smart devices in IoT and IIoT continues to scale, wired device connections are becoming a thing of the past, and wireless networking technologies are covering ever more areas of application. <br>This is driven by wireless technology developers who, responding to global needs, adapt wireless technologies and optimise them specifically for low-power devices. The environmental aspect is also worth noting: installing hundreds or thousands of cables for IoT devices would be more expensive and would cause greater environmental damage through expanded cable production.
<br>The article discusses the main trends of two actively developed, power-efficient wireless network technologies: Bluetooth LE and ZigBee. These two technologies were chosen for their popularity, their similar principles of operation, strong encryption of data transmitted in the network, the same frequency range, low battery drain, and a relatively short operating range (less than 200 metres).<br>A comparison of the protocol stacks of both networks was carried out, and the network operation process and other features of the technologies were outlined. This review of the main characteristics provided a clear view of both network technologies and showed Bluetooth Low Energy to be the stronger technology overall compared to ZigBee. It therefore became clear that for photo transmission and larger projects, Bluetooth LE can meet the greater functional needs of the system. However, this does not preclude the use of ZigBee in small and medium-sized scalable systems: for devices that transmit low-bitrate or non-critical information in real time, ZigBee can fully support their functionality with no significant difference from Bluetooth LE. </p> <p><strong>Keywords:</strong> Bluetooth LE, ZigBee, IoT, power consumption, protocol stack, data rate, range, wireless network, device connection scheme, Mesh.</p>2025-07-15T05:16:35+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2874Research on existing methods for ensuring functional resilience of computer networks2025-07-22T20:53:38+00:00Кравченко Ю. В. (Kravchenko Y. V.)con@duikt.edu.uaФісун О. С. (Fisun O.
S.)con@duikt.edu.ua<p>The article examines existing methods of the theory of functional stability as applied to improving computer networks. The work is devoted to analyzing existing methods of functional stability, comparing their main characteristics and identifying their advantages. <br>It surveys the latest approaches and technologies for keeping computer networks functioning under various operating conditions. <br>Modern methodologies for ensuring the functional stability of computer networks are analyzed, including the evaluation of the effectiveness and advantages of various approaches in operational conditions. The study emphasizes the importance of new approaches to increasing the efficiency of technical systems through the implementation of the principles of functional stability. This is achieved through rational use of resources and redistribution of surplus capacity to minimize the consequences of unforeseen situations. The need for further development of the theory of functional stability for an accurate description of the operation of various systems is emphasized, especially in the context of modern information technologies that contribute to increasing the efficiency of various technical systems.
<br>The studied methods for ensuring the functional stability of computer networks can be used in the educational process at higher education institutions of the relevant profile, as well as for developing and improving the operation of computer networks under different operating and usage scenarios.</p> <p><strong>Keywords:</strong> functional stability; computer networks; work optimization; method; system; methodology; information technology; software; data processing; operation.</p>2025-07-15T21:34:08+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2875Multi-criteria recognition of text-to-topic correspondence based on the TF-IDF algorithm2025-07-22T20:53:41+00:00Данильченко В. М. (Danylchenko V. M.)con@duikt.edu.uaОтрох С. І. (Otrokh S. I.)con@duikt.edu.uaШалигін М. О. (Shaligin M. O.)con@duikt.edu.uaДонець А. Г. (Donets A. G.)con@duikt.edu.ua<p>This research delves into the critical role of statistical methodologies in determining the thematic alignment of textual content with specific user interests. Recognizing the ever-increasing volume of digital information, the need for accurate and efficient document classification has become paramount across numerous applications. This study meticulously examines the potential and subsequently implements a refined algorithmic approach grounded in the Term Frequency-Inverse Document Frequency (TF-IDF) metric. The modifications introduced aim to optimize the algorithm's performance in analyzing and categorizing diverse sets of textual data. <br>A comprehensive account of the program development process is provided, emphasizing the integration of a multi-criteria decision-making framework to achieve a nuanced understanding of text-to-topic relevance. This multi-faceted approach considers various linguistic features and statistical indicators beyond the basic TF-IDF scores, thereby enabling a more robust and accurate classification outcome.
Furthermore, the research addresses the crucial pre-processing steps involved in handling textual data. Detailed attention is given to the methodologies of text normalization, which includes techniques such as stemming, lemmatization, and case conversion, to reduce data dimensionality and improve the consistency of feature representation. <br>The study also rigorously explores the application of effective filtering techniques, particularly the identification and removal of stop words, which are common words that carry minimal semantic weight and can often introduce noise into the classification process. The strategic implementation of these normalization and filtering methods is shown to significantly contribute to the overall precision and recall of the proposed text classification system.<br>The outcome of this research is a highly effective and adaptable solution for the identification of documents that are genuinely relevant to a user's specified thematic focus. The potential applications of this solution span a wide spectrum of information management and retrieval systems. In the context of search engines, the enhanced classification capabilities can lead to more precise and contextually appropriate search results, improving user satisfaction and information discovery. Similarly, in information filtering systems, the proposed approach can facilitate the delivery of tailored content streams, reducing information overload and enhancing user engagement. <br>Ultimately, this research underscores the growing significance of employing innovative and statistically sound methods for the automated filtering and categorization of textual data in the age of big data. The proposed multi-criteria TF-IDF-based algorithm offers a valuable contribution to the field, providing a practical and efficient means of navigating and extracting relevant information from the vast digital landscape.
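For intuition, the core TF-IDF computation together with the normalization and stop-word filtering steps described above can be sketched in a few lines of Python. The stop-word list and weighting scheme here are simplified illustrations, not the article's refined multi-criteria algorithm:

```python
import math
from collections import Counter

STOP_WORDS = {"the", "a", "of", "and", "to", "in", "is"}  # illustrative subset

def tokenize(text):
    # Normalization: case conversion + stop-word filtering
    return [w for w in text.lower().split() if w not in STOP_WORDS]

def tf_idf(docs):
    """Return one {term: tf-idf weight} dict per document."""
    toks = [tokenize(d) for d in docs]
    n = len(docs)
    # Document frequency: in how many documents each term occurs
    df = Counter(t for doc in toks for t in set(doc))
    out = []
    for doc in toks:
        tf = Counter(doc)
        out.append({t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()})
    return out
```

Terms that occur in every document receive weight zero (idf = log 1), while rarer, more topic-specific terms are weighted up, which is exactly the property the classification relies on.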
The findings highlight the potential for significant advancements in various information-intensive applications through the intelligent automation of text analysis and classification processes.</p> <p><strong>Keywords:</strong> processing; data; TF-IDF method; document classification; multi-criteria recognition; stop words; text normalization; data set; deep learning; technology.</p>2025-07-20T13:30:48+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2876A model for evaluating the level of internationalization of a scientific institution's activities2025-07-22T20:53:45+00:00Statyvka Y. I. (Стативка Ю. І.)con@duikt.edu.uaNedashkivskiy O. L. (Недашківський О. Л.)con@duikt.edu.uaMingjun Z. (Мінцзюнь Ч.)con@duikt.edu.ua<p>The article outlines the reasons for the lack of software tools and analytical platforms for evaluating the level of internationalization of scientific institutions and provides arguments for the feasibility of their creation. To eliminate the contradiction between the generally recognized importance of such evaluation and the absence of the specified means, a model of the process for evaluating the level of internationalization of scientific institutions is proposed. The model contains a description of the context of the main process, its two-level decomposition, and a list of intermediate artifacts together with their purpose.
Such a model can serve as the basis for developing a software framework that automates routine research and experimental tasks in the development and testing of methodologies for evaluating the level of internationalization of scientific institutions.</p> <p><strong>Keywords:</strong> internationalization of scientific institutions; evaluation of the level of internationalization; model of the process of evaluating the level of internationalization; software engineering.</p>2025-07-20T13:52:35+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2877Integration of artificial intelligence into confinement ventilation control systems: development of specialized software based on the Mamdani algorithm2025-07-22T20:53:49+00:00Гаврилко Є. В. (Havrylko Y. V.)con@duikt.edu.uaСавко В. Я. (Savko V. Y.)con@duikt.edu.ua<p>The article proposes an innovative approach to controlling ventilation systems of the New Safe Confinement (NSC) of the Chernobyl NPP based on fuzzy logic using the Mamdani algorithm. The aim of the study is to optimize the operation of ventilation systems taking into account the inertia of air flows and dynamic external conditions (wind speed, pressure, humidity), and to minimize radioactive emissions. <br>The developed system is based on a fuzzy inference model, where the input parameters are: pressure difference (∆P), wind speed (V) and air mass inertia (I). <br>Triangular membership functions were used to fuzzify the variables, and the fuzzy rule base was formed based on expert knowledge of physical processes in the NSC. The implementation of the Mamdani algorithm allowed adaptive control of fan power, ensuring pressure stability and energy efficiency. <br>Experiments conducted on real NSC data showed an 18% reduction in energy consumption compared to previous methods (genetic algorithms), a 25% reduction in pressure fluctuations, and a system response time of up to 5 minutes.
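A minimal Python sketch of Mamdani-style inference for this kind of controller (triangular membership functions, min implication, max aggregation, centroid defuzzification) is shown below. The universes, rule base, and numeric ranges are invented for illustration and do not reproduce the NSC system:

```python
def tri(x, a, b, c):
    """Triangular membership value of x for the fuzzy set (a, b, c)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fan_power(dp, wind):
    """Mamdani inference: pressure difference dp (hypothetical range -50..50 Pa)
    and wind speed (m/s) in; fan power (%) out."""
    universe = [i * 0.5 for i in range(201)]            # output universe 0..100 %
    dp_neg  = tri(dp, -50, -25, 0)                      # pressure falling
    dp_pos  = tri(dp, 0, 25, 50)                        # pressure rising
    wind_hi = tri(wind, 5, 15, 25)
    agg = []
    for u in universe:
        # Rule firing strengths clip the output sets (min), then aggregate (max)
        low  = min(tri(u, 0, 25, 50), dp_pos)           # IF dP rising THEN power low
        high = min(tri(u, 50, 75, 101), max(dp_neg, wind_hi))  # falling dP OR high wind -> power high
        agg.append(max(low, high))
    total = sum(agg)
    if total == 0:
        return 50.0                                     # neutral default when no rule fires
    return sum(u * m for u, m in zip(universe, agg)) / total   # centroid
```

Falling pressure or strong wind pushes the centroid toward high fan power, while rising pressure pulls it down; adding more rules only extends the loop body, which is what makes the rule base easy to maintain.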
The system is integrated with a SCADA platform for online monitoring and correction of ventilation operation. <br>These enhancements derive from the Mamdani algorithm’s capacity to process nonlinear relationships and uncertainties without computationally intensive operations — a stark contrast to neural networks reliant on GPU clusters. The system’s modular architecture supports seamless integration of new sensors and rule updates, ensuring adaptability to evolving operational demands. <br>By demonstrating fuzzy logic’s robustness in extreme nuclear environments, this research marks a paradigm shift in containment facility management. Current efforts focus on hybrid neuro-fuzzy models to automate rule generation and refine pressure stability. <br>The results prove the effectiveness of the fuzzy approach for managing complex engineering facilities under uncertainty. Further research is aimed at implementing hybrid models (neural networks + fuzzy logic) for auto-formalization of rules.</p> <p><strong>Keywords:</strong> software engineering; artificial intelligence; neuro-fuzzy systems; fuzzy control; Mamdani algorithm; New Safe Confinement; air flow inertia; energy efficiency; radiation safety.</p>2025-07-20T14:13:06+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2878Analysis of the application of artificial intelligence for 3D scanning data processing2025-07-22T20:53:54+00:00Черевик О. В. (Cherevyk O. V.)con@duikt.edu.uaЛащевська Н. О. (Lashchevska N. O.)con@duikt.edu.ua<p>This article explores the application of artificial intelligence (AI) in the processing of 3D scanning data as a promising direction in the field of digital technologies. The study focuses on a detailed examination of contemporary methods such as noise reduction, segmentation, and reconstruction of 3D objects using deep learning models.
Particular attention is paid to how AI enhances accuracy, automates complex data analysis processes, and reduces the need for manual intervention in 3D workflows. <br>The research highlights the advantages of integrating AI into 3D data processing pipelines, including improvements in speed, precision, and scalability. These benefits are particularly valuable in sectors such as industrial visualization, architecture, medical diagnostics, and cultural heritage preservation. At the same time, the paper discusses existing limitations of AI models—such as their reliance on large annotated datasets, difficulties in generalizing to real-world scenarios, and high computational requirements. <br>A number of practical recommendations are proposed for the effective integration of AI into 3D scanning projects. These include selecting appropriate models based on specific project goals, ensuring data quality and diversity, adopting hybrid workflows that combine traditional algorithms with AI-based methods, and building scalable systems that are adaptable to dynamic environments. The use of both classical algorithms (e.g., Poisson reconstruction, ICP, SOR) and modern AI techniques demonstrates the need for a balanced, context-aware approach to 3D data processing. <br>The findings of this study offer valuable insights for researchers and practitioners aiming to optimize the accuracy and efficiency of 3D scanning technologies. The article concludes with an outlook on future research directions, such as the development of lightweight, domain-adaptive models, self-supervised learning approaches, and comprehensive real-world datasets.
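As an example of the classical algorithms mentioned, statistical outlier removal (SOR) for point clouds can be sketched in pure Python as follows. This is a brute-force O(n²) illustration with hypothetical parameters; production pipelines would use an accelerated neighbour search such as a KD-tree:

```python
import math
import statistics

def sor_filter(points, k=3, std_ratio=1.0):
    """Statistical outlier removal: drop points whose mean distance to their
    k nearest neighbours deviates from the global mean of that statistic by
    more than std_ratio standard deviations."""
    mean_knn = []
    for i, p in enumerate(points):
        ds = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        mean_knn.append(sum(ds[:k]) / k)          # mean distance to k nearest
    mu = statistics.mean(mean_knn)
    sigma = statistics.pstdev(mean_knn)
    thresh = mu + std_ratio * sigma
    return [p for p, d in zip(points, mean_knn) if d <= thresh]
```

A lone point far from a dense cluster gets a large mean-neighbour distance and is filtered out, which is how scan noise is typically suppressed before reconstruction steps such as Poisson surface fitting.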
These efforts are essential for overcoming current challenges and creating robust, generalizable AI systems capable of reliable performance in diverse application areas.</p> <p><strong>Keywords:</strong> artificial intelligence; 3D scanning; 3D data analysis; computer technologies; deep learning; machine learning; neural networks; data processing.</p>2025-07-20T14:43:47+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2879Modern approaches to the automation of task management for small teams using machine learning methods2025-07-22T20:53:57+00:00Коломієць І. Ю. (Kolomiiets I. Y.)con@duikt.edu.uaЗамрій І. В. (Zamrii I. V.)con@duikt.edu.uaКалинюк Б. С. (Kalyniuk B. S.)con@duikt.edu.uaБажан Ю. П. (Bazhan Y. P.)con@duikt.edu.uaДовженко Т. П. (Dovzhenko T. P.)con@duikt.edu.ua<p>Modern project and task management technologies offer numerous tools that provide basic functions for organizing team work. However, most of these solutions are focused on large companies, which often makes them too complex for small teams. The lack of tools for automated forecasting of task completion times and determining their priorities limits the effectiveness of such systems. Small teams need flexible solutions that reduce the burden of work organization, automate routine tasks, and promote rational resource allocation. Therefore, the problem is the lack of automated tools for accurately predicting task completion times and determining their priority, which leads to inefficient resource allocation and work delays. <br>The main approaches to task management automation are analyzed, in particular, methods that can be used to forecast task completion times and determine priorities using machine learning. Special attention is paid to the TF-IDF, Random Forest Regressor, and K-means algorithms.
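A minimal sketch of how these three components might fit together, using scikit-learn (the task descriptions and completion times below are invented toy data, not from the study):

```python
# Hypothetical mini-pipeline: TF-IDF features from task descriptions,
# RandomForestRegressor for completion-time prediction, KMeans for priority groups.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestRegressor
from sklearn.cluster import KMeans

tasks = [
    "fix login bug", "write unit tests", "deploy release",
    "update docs", "fix payment bug", "refactor auth module",
]
hours = [2.0, 4.0, 1.5, 1.0, 3.0, 6.0]   # historical completion times (made up)

X = TfidfVectorizer().fit_transform(tasks).toarray()   # text -> numeric features

reg = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, hours)
est = reg.predict(X)                                   # predicted completion times

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
priority = km.labels_                                  # coarse priority groups
```

In a real system the regressor would be trained on past tasks and queried for new ones, and the cluster labels would be mapped to priority levels by a team-specific rule.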
TF-IDF allows for efficient processing of text descriptions of tasks, converting them into numerical features, which provides an analytical basis for the operation of machine learning models. Random Forest Regressor is used to accurately predict task completion times, which helps teams plan the workflow. The K-means algorithm is used to cluster tasks by their importance and complexity, providing automatic prioritization. <br>Popular tools such as Trello, Asana, Jira, and Wrike provide basic functionality for task management, but do not use machine learning methods for automation. The considered approaches can be integrated into existing systems or implemented as a separate solution for small teams seeking to increase productivity without significant costs for complex platforms. <br>The results of the study emphasize the importance of using machine learning to automate task management. This approach reduces dependence on the human factor, cuts the time spent organizing tasks, and optimizes the planning process. In addition, it helps to increase the efficiency of teamwork, which is especially important for small teams working with limited resources. </p> <p><strong>Keywords:</strong> task management automation; machine learning; time forecasting; clustering; TF-IDF; Random Forest Regressor; K-means.</p>2025-07-21T20:59:33+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2880Determining the factors of tuberculosis: analysis of machine learning and neural network methods2025-07-22T20:54:04+00:00Невінський Д. В. (Nevinskyi D. V.)con@duikt.edu.uaМартьянов Д. І. (Martjanov D. I.)con@duikt.edu.uaГосподарський О. А. (Hospodarskyy O. A.)con@duikt.edu.uaВиклюк Я. І. (Vyklyuk Y. I.)con@duikt.edu.uaСем’янів І. О. (Semianiv I.
O.)con@duikt.edu.ua<p>Tuberculosis (TB) remains one of the most serious infectious diseases globally, particularly in India, where its high incidence poses significant challenges to the healthcare system. This study focuses on analyzing the determinants of TB prevalence in India using machine learning (ML) and neural network (NN) methods. The objective is to identify key factors influencing TB incidence and develop accurate predictive models to support prevention and treatment strategies. Based on statistical data from 2019–2022, encompassing demographic characteristics, social factors, and medical indicators, a comprehensive analysis was conducted. Data processing techniques, including correlation analysis, oversampling (SMOGN) for sample balancing, and modeling with linear regressions (LM, Ridge, Lasso), ML algorithms (Decision Tree, K-Nearest Neighbors, Random Forest), and a deep neural network were employed. Results revealed that linear models exhibited limited accuracy (R² Test up to 0.600), while Random Forest (R² Test = 0.832) and K-Nearest Neighbors (R² Test = 0.865) significantly outperformed them due to their ability to capture nonlinear relationships. <br>The highest accuracy was achieved by the neural network (R² Test = 0.822, RMSE Test = 0.433), highlighting its effectiveness in detecting complex interdependencies. Key factors influencing TB incidence included population size (Population), gender ratio (Gender Ratio), the number of specialized centers (Nodal_DR_TB_Centres_Per_Population), and urban characteristics (City_Encoded). 
These findings underscore the potential of integrating ML and NN into medical research for TB forecasting and control, offering valuable insights for developing personalized therapeutic approaches and improving public health outcomes.</p> <p><strong>Keywords:</strong> machine learning; neural networks; disease prediction; oversampling; SMOGN; linear regression; Random Forest; K-Nearest Neighbors; determinants; integration.</p>2025-07-21T21:44:01+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2881Improving the mechanistic approach using bitcoin market volume data2025-07-22T20:54:08+00:00Цапро І. В. (Tsapro I. V.)con@duikt.edu.uaЗолотухіна О. А. (Zolotukhina O. A.)con@duikt.edu.ua<p>The subject of this study is the improvement of the mechanistic approach to the analysis of market volumes of cryptocurrencies, in particular Bitcoin, using indicators of market purchases, sales and their difference. The purpose of the work is to determine the effectiveness of the improved mechanistic approach by using market volumes of purchases, sales and their difference in predicting market trends and its impact on the profitability of Bitcoin trading strategies. The objectives of the study include: to expand the mechanistic approach by using market volumes (purchases, sales, the difference between them); to test the improved approach on historical BTC/USDT trading data using the moving average strategy; to compare the effectiveness of the new approach with the traditional mechanistic analysis of total volume; to assess the dependence of profitability and win rate on the selected method and time intervals (1 day, 4 hours, 30 minutes). The results obtained indicate that the impulse of market purchases (GMI Volume Buy) demonstrates the highest profitability and accuracy among all analyzed indicators, while the impulse of volume difference (GMI Volume Delta) has the lowest efficiency and higher volatility.
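The moving-average strategy used for the backtests can be illustrated with a simplified Python sketch (long when the fast SMA is above the slow SMA, flat otherwise; fees, win-rate accounting, and the volume-based GMI indicators from the study are omitted, and the window lengths are arbitrary):

```python
def sma(xs, n):
    """Simple moving average; None until the window is full."""
    return [sum(xs[i - n + 1:i + 1]) / n if i >= n - 1 else None
            for i in range(len(xs))]

def backtest(prices, fast=3, slow=5):
    """Hold a long position on each bar where the fast SMA closed above the
    slow SMA on the previous bar; return the strategy's total return."""
    f, s = sma(prices, fast), sma(prices, slow)
    equity = 1.0
    for i in range(1, len(prices)):
        if f[i - 1] is not None and s[i - 1] is not None and f[i - 1] > s[i - 1]:
            equity *= prices[i] / prices[i - 1]   # in the market for this bar
    return equity - 1.0
```

On a candle series, the volume-based variants from the study would replace or gate this price crossover signal; shortening the bar interval multiplies the number of signals, which is one mechanism behind the parameter sensitivity reported above.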
It was found that shortening the time interval reduces the profitability and stability of all methods, and also increases their sensitivity to the choice of parameters. Thus, the study confirms the prospects of the mechanistic approach, especially when taking into account the market volumes of purchases and sales. The results can be used to improve algorithmic trading strategies, as well as in further research related to the application of machine learning algorithms and optimization of mechanistic analysis parameters. </p> <p><strong>Keywords:</strong> mechanistic approach; cryptocurrency; moving average; trading volumes; backtesting; bitcoin; trading strategy; forecasting; dependence; data.</p>2025-07-21T21:50:03+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2882Analysis of modern predictive analytics systems2025-07-22T20:54:12+00:00Яценко Я. Д. (Yatsenko Y. D.)con@duikt.edu.uaЖидка О. В. (Zhydka O. V.)con@duikt.edu.uaКіс О. Я. (Kis O. Y.)con@duikt.edu.ua<p>The article presents a comparative analysis of modern predictive analytics systems (PAS) that employ machine learning methods. The study focuses on evaluating the core capabilities, architectural models, and application areas of widely used platforms, including IBM Watson Studio, Google Vertex AI, Microsoft Azure Machine Learning Studio, Amazon SageMaker, RapidMiner, DataRobot, H2O.ai, and SAS Predictive Analytics. Special attention is given to the classification of PAS by the types of tasks they solve (classification, regression, time series forecasting), levels of automation (AutoML, semi-automated, and custom solutions), and deployment models (on-premise, cloud-based, and hybrid). 
<br>The analysis highlights key criteria for comparison: architectural flexibility and scalability, support for machine learning algorithms and AutoML features, integration with diverse data sources, data preparation and visualization tools, model performance and accuracy, security compliance (GDPR, ISO), and user experience, including interface convenience, documentation, and vendor support. <br>Each system is assessed in terms of its strengths, weaknesses, and suitability for various use cases. Enterprise-level platforms such as IBM Watson Studio, SAS, and Azure ML Studio are identified as optimal for regulated environments and large-scale deployments. In contrast, cloud-native solutions like Google Vertex AI and Amazon SageMaker demonstrate high flexibility and DevOps integration, making them ideal for scalable AI projects. Tools such as RapidMiner and DataRobot are more suitable for rapid prototyping and business users due to their intuitive interfaces and AutoML capabilities. Open-source platforms like H2O.ai are shown to be effective for research, experimentation, and startups thanks to their performance, transparency, and community support. <br>The paper concludes by outlining the current trends in PAS development and identifying promising directions for future research. These include the further integration of explainable AI (XAI), ethical data handling, and the convergence of predictive analytics with real-time data processing in cloud ecosystems. Based on the analysis, practical recommendations are proposed to assist organizations in selecting appropriate PAS tools aligned with their technical needs and strategic objectives. 
</p> <p><strong>Keywords:</strong> predictive analytics, machine learning, cloud platforms, decision support systems, AutoML.</p>2025-07-22T20:36:25+00:00##submission.copyrightStatement##https://con.duikt.edu.ua/index.php/communication/article/view/2883Modelling protein folding using machine learning methods2025-07-22T20:54:16+00:00Дзюба В. В. (Dziuba V. V.)con@duikt.edu.uaКолодюк А. В. (Kolodiuk A. V.)con@duikt.edu.uaОлейніков І. А. (Oleinikov I. A.)con@duikt.edu.uaБугайов Д. М. (Bugayev D. M.)con@duikt.edu.ua<p>The article highlights modern approaches to modelling protein folding using machine learning methods, a rapidly developing field at the intersection of bioinformatics, physics and artificial intelligence. The purpose of the article is to systematise existing approaches to modelling protein folding using machine learning, identify the advantages and limitations of modern techniques, and outline directions for further research in this area. <br>Predicting the three-dimensional structure of proteins based on amino acid sequences remains a challenging task, as the structure of a protein determines its function in the cell, and its misfolding often leads to severe diseases. This paper reviews the most successful deep learning models, including AlphaFold, MSA Transformer, and ultra-deep neural networks, which have demonstrated the ability to accurately predict protein structures based on the analysis of evolutionary relationships and contact maps. <br>Particular attention is paid to the limitations of such methods, in particular their difficulty in handling dynamic processes and their failure to take into account the stochastic nature of protein interactions. In this context, the authors propose an innovative approach: the integration of quantum mechanical models, in particular the mechanism of wave function collapse, into classical machine learning algorithms. 
This approach makes it possible to take into account the probabilistic transitions between protein conformational states and to minimise the free energy of the system. Mathematical formalisations and examples of implementation based on the Monte Carlo method are presented. <br>The proposed integrated model demonstrates increased prediction accuracy (up to 95%) compared to existing solutions. Its application is promising in personalised medicine (analysis of the effect of mutations on protein structure), pharmacology (improvement of drug design), industrial biotechnology (optimisation of enzymes), and in studies of complex protein complexes. The work forms the scientific basis for the creation of new intelligent tools that combine structural prediction with functional activity analysis, which opens up new horizons for the development of bioinformatics and related fields. </p> <p><strong>Keywords:</strong> proteins, protein folding, machine learning, quantum mechanics, deep neural networks, wave function collapse, bioinformatics, protein structure prediction, recurrent neural networks, convolutional neural networks.</p>2025-07-22T20:52:52+00:00##submission.copyrightStatement##
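The Monte Carlo sampling of conformational states mentioned in the protein-folding abstract can be sketched with a toy Metropolis sampler: moves between states are accepted with probability min(1, exp(−ΔE/kT)), so the chain makes probabilistic transitions yet concentrates on low-energy conformations. The discrete energy landscape and temperature below are illustrative assumptions, not the paper's model.

```python
# Toy sketch (illustrative assumptions): Metropolis Monte Carlo over a small
# set of discrete "conformational states" with a hypothetical energy landscape.
import math
import random

random.seed(1)

# Hypothetical energies per state (arbitrary units); state 2 is the minimum.
energies = [3.0, 1.5, 0.2, 2.5, 0.9, 4.0]

def metropolis(n_steps, kT=0.5, start=0):
    """Sample states with the Metropolis rule; returns per-state visit counts."""
    state = start
    visits = [0] * len(energies)
    for _ in range(n_steps):
        proposal = random.randrange(len(energies))  # symmetric uniform proposal
        dE = energies[proposal] - energies[state]
        # Always accept downhill moves; accept uphill moves with exp(-dE/kT).
        if dE <= 0 or random.random() < math.exp(-dE / kT):
            state = proposal
        visits[state] += 1
    return visits

visits = metropolis(50_000)
best = min(range(len(energies)), key=energies.__getitem__)
print("most-visited state:", visits.index(max(visits)), "| lowest-energy state:", best)
```

In the long run the visit frequencies approach the Boltzmann distribution, which is why the most-visited state coincides with the free-energy minimum; a realistic folding model would replace the energy list with a conformation-dependent energy function and a local move set.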