The development, research, and construction of devices that speed up interaction between various modules (for example, telemetry and remote control systems), and more broadly of hybrid digital-city communication systems encompassing the variety of systems used in an Intelligent Building, is an urgent problem. One such device, presented in this article, is the developed optimal multi-frequency modem. In addition to the developed modem, the article presents examples of similar devices and systems developed by both Russian and foreign researchers. The authors show that the proposed modem provides a gain in spectral and energy efficiency compared with its analogues. The proposed approach can be used to organize high-speed data transmission over frequency-limited communication channels based on new wired technologies of the digital subscriber line standard, as well as in wireless systems.
Keywords: telemetry and remote control system, intelligent building, digital city hybrid communications system, modem, multi-frequency modulation, digital subscriber line, optimal finite signal, modulator, demodulator, wireless communication system
This study describes approaches to automating full-text keyword search in the field of patent information. Automating search by keyphrases (n-grams) is a significantly more difficult task than searching by individual words; in addition, it requires morphological and syntactic analysis of the text. To achieve this goal, the following tasks were solved: (a) the full-text search systems Apache Solr, ElasticSearch, and ClickHouse were analyzed; (b) the architectures and basic capabilities of each system were compared; (c) search results in Apache Solr, ElasticSearch, and ClickHouse were obtained on the same dataset. The following conclusions were drawn: (a) all the systems considered support full-text keyword search; (b) Apache Solr shows the highest performance and offers a convenient feature set; (c) ElasticSearch has a fast and powerful architecture; (d) ClickHouse provides high data-processing speed.
Keywords: search, keyphrases, patent, Apache Solr, Elasticsearch, ClickHouse
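As a minimal illustration of why keyphrase (n-gram) search requires more than word-level matching, the sketch below extracts candidate n-grams from a text. The tokenization is deliberately naive (no morphological or syntactic analysis, which the study identifies as necessary), and all names and the sample text are illustrative.

```python
from collections import Counter

def extract_ngrams(text, n=2):
    """Return candidate n-gram keyphrases from a text (naive whitespace tokenization)."""
    tokens = [t.strip(".,;:()").lower() for t in text.split()]
    tokens = [t for t in tokens if t]
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

doc = "Full-text keyword search in patent information systems"
bigrams = extract_ngrams(doc, n=2)
print(Counter(bigrams).most_common(3))
```

A real patent-search pipeline would additionally lemmatize tokens and filter candidates by part-of-speech patterns before indexing them in Solr, ElasticSearch, or ClickHouse.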
The article presents a comprehensive analysis of a systematic approach to the implementation and development of innovative information technologies aimed at preventing offenses committed by foreign citizens. The introduction provides an overview of the growing importance of employing advanced technological solutions in law enforcement, particularly in addressing challenges associated with foreign nationals. The main objectives of the study are to explore how the integration of technologies such as big data processing, artificial intelligence, and geographic information systems can enhance the efficiency of preventive measures. The article details the use of data analysis techniques, machine learning models, and system integration to create a unified information platform. This platform enables the consolidation of data from diverse sources, thereby improving the coordination between different law enforcement units and facilitating faster and more informed decision-making processes. The integration of these technologies also supports process standardization, reducing data inconsistencies and ensuring more reliable operations across various departments. The results highlight the benefits of utilizing big data analytics to process vast amounts of information that would be otherwise impossible to handle efficiently. Artificial intelligence, through predictive models and risk assessment tools, plays a crucial role in identifying potential threats and allocating resources effectively. Geographic information systems contribute by mapping crime hotspots and providing spatial analysis, which aids in targeted intervention strategies. The discussion emphasizes the importance of a unified approach to technology implementation, focusing on the creation of an integrated information system that can adapt to ongoing changes in the social and legal environment. 
The adaptability of the system is critical for maintaining its effectiveness in the face of new challenges and evolving regulatory requirements. The development of standardized data collection and processing protocols further enhances the system's resilience and operational efficiency. In conclusion, the article underscores that a systematic and integrated use of innovative information technologies significantly improves the effectiveness of crime prevention efforts and the overall efficiency of law enforcement agencies. The proposed approach not only facilitates proactive measures but also ensures a high level of responsiveness to emerging security threats, thereby strengthening public safety.
Keywords: systemic approach, innovative information technologies, crime prevention, foreign citizens, big data, artificial intelligence, geoinformation systems, information platform, standardization, law enforcement agencies, efficiency management, data integration
It is shown that specific forms of a simulation game, combined with certain features of training sessions in organizational systems, can result in adaptable simulation models of a business situation. It is recommended to use a cognitive model in the problem analysis of organizational systems, which allows a natural transition from cognitive to simulation models while remaining within visual topological descriptions. The AnyLogic software platform was chosen for developing the model, as it provides ample opportunities for creating an innovative educational environment with elements of game simulation and AI. Cognitive analysis of the game learning process revealed that it should comprise one cycle of a business game with two interactive nodes that introduce a host and a player into the game. It is noted that business games focused on developing management styles in a conflict are in greatest demand. Therefore, a simulation model has been developed to train executives to counteract an organizational conflict by varying authoritarian, democratic, and liberal management styles. The model uses the system dynamics paradigm and is implemented in the AnyLogic notation. To set the rules, the game host, in the initial state or when starting the next game cycle, sets the dynamic characteristics of the process of managing the organizational structure and changes the characteristic values of a pre-conflict situation. In response to the developing conflict, the player manages the situation using the auxiliary services available to him. Notably, the model is not limited to a fixed list of game tasks or a fixed set of player decisions.
Keywords: management diversification, production diversification, financial and economic diversification goals, production and technical goals to ensure production flexibility
The relevance of the pattern recognition problem lies not only in the quality of recognition (classification) of images, but also in the possibility of rapidly restoring them under noisy conditions. Such solutions are useful, for example, for automatic access control systems for a protected area that recognize license plates, or for an on-board computer recognizing license plates in real time. It is shown that a recurrent neural network with the Hopfield architecture copes well with recognizing small, simple monochrome images under noisy conditions. The architecture of the Hopfield neural network is described; its characteristic feature is a small memory capacity, which determines the scope of application of networks of this architecture. The training algorithm for the Hopfield neural network is given. Examples and results of recognizing noisy monochrome images are demonstrated using road signs. The experimental results on noisy images show that an image can be restored when fewer than 40% of its bits are distorted.
Keywords: pattern recognition, recurrent neural network, noisy monochrome image, reference sample, training
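The recall behaviour described above can be sketched with a tiny Hopfield network: Hebbian training stores a bipolar pattern, and iterated thresholding restores it from a noisy copy. This is a generic textbook sketch, not the paper's implementation; the pattern size and the number of flipped bits are illustrative.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian training: W = mean of outer products of bipolar patterns, zero diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / patterns.shape[0]

def recall(W, x, steps=10):
    """Synchronous update until convergence: x <- sign(W x)."""
    for _ in range(steps):
        x_new = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# store one 8-bit pattern and recover it from a noisy copy (2 of 8 bits flipped, i.e. 25%)
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy()
noisy[[0, 3]] *= -1
restored = recall(W, noisy)
print(np.array_equal(restored, pattern))  # True
```

The small memory capacity the article mentions shows up here directly: the number of patterns such a network can store reliably is only a fraction of the number of neurons.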
The relationship between "old" and "new" concepts/metrics for assessing the quality of statistical detection criteria and binary event classification is considered. The independence and consistency of the analyzed metrics with respect to the volume/composition of the initial input data are assessed. Recommendations for using the "new" metrics to assess the quality of detection and binary classification are clarified.
Keywords: Type I and Type II errors, accuracy, recall, specificity, F-score, ROC curve, AUC integral metric
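The metrics listed above can all be computed from a single 2x2 confusion matrix; the sketch below uses illustrative counts and notes how Type I and Type II errors map onto specificity and recall.

```python
def binary_metrics(tp, fp, fn, tn):
    """Classic detection/classification metrics from a 2x2 confusion matrix."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # sensitivity; 1 - Type II error rate
    specificity = tn / (tn + fp)     # 1 - Type I (false alarm) error rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy, "f1": f1}

m = binary_metrics(tp=40, fp=10, fn=20, tn=30)
print(m)
```

The ROC curve and AUC extend this picture by sweeping the decision threshold and plotting recall against (1 - specificity) at each point.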
The article considers the possibility of integrating traditional folk crafts into modern design solutions using 3D modeling technology. Digital tools that make it possible to combine historical color palettes with modern forms are considered. The use of glass-like elements in clothing decoration and the role of cylindrical elements in modern design are described. The resulting ornamental solutions in the Gzhel palette are presented on 3D clothing models.
Keywords: digital tools, 3D models, visualization, glass beads, Gzhel, design solutions, ornament, cultural heritage
The article is devoted to the complexity of developing organizational and administrative documentation, taking into account the industry in which an organization operates and the departments that carry out its main work. Under the impact of the country's changing economy, organizations are constantly subject to changes initiated by the relevant regulators in a particular area and by regulatory documents in the form of standards and laws. The main industries in which organizations operate are identified, along with the number of regulatory documents governing their activities. The organization is analyzed as a system on the basis of systems analysis. The Sagatovsky method was chosen as the approach to the problem; following this methodology, the system was analyzed in seven stages. At each stage, the main components are highlighted and justifications for each are given. Life cycle diagrams of the specified "types of end products" have been compiled, taking into account the focus of each department's work. A scheme of the process of creating organizational and administrative documentation by the organization's employees and departments has been developed. Analyzing the organization from the standpoint of systems analysis will make it possible to further develop criteria for creating a set of organizational and administrative documentation. Such criteria and methods for their assessment will help organizations significantly simplify work with the main regulators in any area and meet the established standards of work, which in the future will help not only to improve operations but also to avoid negative consequences for the enterprise itself.
Keywords: the Sagatovsky method, system analysis, goal setting, information security
The article considers the problem of increasing the safety and efficiency of the green tea drying process. To ensure safe operation of the drying units, an algorithm is proposed that uses a finite state machine to diagnose faults and prevent emergencies. Control schemes and state transition graphs of a Moore machine are presented; the machine analyzes the state of the equipment and initiates shutdown when dangerous conditions are detected. The developed system not only increases the safety of the drying process by preventing accidents, but also improves the efficiency of the equipment. The main advantages of the proposed scheme are timely fault diagnosis and the prevention of critical situations such as overheating or excessive pressure in the chamber.
Keywords: automation, system, drying process, diagnostics, finite state machine
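The Moore-machine idea above can be sketched as a small state-transition table whose output (heater on/off) depends only on the current state, which is what distinguishes a Moore machine from a Mealy machine. The states, events, and thresholds here are hypothetical, not taken from the article.

```python
# Minimal Moore-machine sketch for dryer safety monitoring (hypothetical states/thresholds).
TRANSITIONS = {
    ("RUNNING", "ok"): "RUNNING",
    ("RUNNING", "overheat"): "SHUTDOWN",
    ("RUNNING", "overpressure"): "SHUTDOWN",
    ("SHUTDOWN", "ok"): "SHUTDOWN",   # stays down until a manual reset
}
OUTPUT = {"RUNNING": "heater_on", "SHUTDOWN": "heater_off"}  # output depends on state only

def classify(temp_c, pressure_kpa, t_max=95.0, p_max=120.0):
    """Map sensor readings to a discrete input event."""
    if temp_c > t_max:
        return "overheat"
    if pressure_kpa > p_max:
        return "overpressure"
    return "ok"

def step(state, temp_c, pressure_kpa):
    event = classify(temp_c, pressure_kpa)
    new_state = TRANSITIONS.get((state, event), "SHUTDOWN")  # unknown input -> fail safe
    return new_state, OUTPUT[new_state]

state = "RUNNING"
for reading in [(80, 100), (90, 110), (99, 110)]:   # last reading overheats
    state, action = step(state, *reading)
print(state, action)  # SHUTDOWN heater_off
```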
The paper presents an analysis of existing issues in the subject area, on the basis of which research findings are formed to enhance the quality of the tourist experience through a software and mathematical complex that generates personalized recommendations. The task of managing tourist resources is addressed by framing the decision support process in terms of multi-criteria optimization and fuzzy logic. These mechanisms significantly improve the accuracy of personalized recommendations that meet individual user requests. As a distinct algorithmic contribution, a method is proposed for introducing ontological hierarchical types of connections that capture generic and object-oriented relationships between tourism categories and characterize their influence on tourist objects. In addition, to accommodate flexible, poorly formalized user requests, fuzzy logic mechanisms are introduced via fuzzy evaluation scales. The paper describes the implementation of the recommendation algorithm, which possesses scientific novelty and practical significance. A class diagram details the structure of the ontological hierarchical model presented in the work. The effectiveness of the algorithm is verified through test calculations based on the presented research results.
Keywords: tourism, tourism optimization, recommendation systems, fuzzy logic, multi-criteria optimization, mobile development
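One way to picture the combination of multi-criteria scoring with fuzzy evaluation scales is the sketch below, where a "soft" price preference is modeled by a triangular membership function and combined with a crisp rating criterion. All option names, weights, and numbers are illustrative, not from the paper.

```python
def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def score(option, weights):
    # the price preference is fuzzy; the rating is crisp, normalized to [0, 1]
    mu_price = triangular(option["price"], 0, 50, 150)
    return weights["price"] * mu_price + weights["rating"] * option["rating"] / 5.0

options = [
    {"name": "museum", "price": 40, "rating": 4.5},
    {"name": "excursion", "price": 120, "rating": 5.0},
]
weights = {"price": 0.6, "rating": 0.4}
best = max(options, key=lambda o: score(o, weights))
print(best["name"])  # museum
```

A full recommender would propagate such scores through the ontological hierarchy of tourism categories rather than scoring flat options.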
The article describes the mathematical foundations of time-frequency signal analysis using the Empirical Mode Decomposition (EMD), Intrinsic Time-Scale Decomposition (ITD), and Variational Mode Decomposition (VMD) algorithms. Synthetic and real signals distorted by additive white Gaussian noise with different signal-to-noise ratios are considered. A comprehensive comparison of the EMD, ITD, and VMD algorithms is performed. The applicability of these algorithms to signal denoising and spectral analysis is investigated. Algorithm execution time and computational stability are estimated.
Keywords: time-frequency analysis, denoising, decomposition, mode, Hilbert-Huang transformation, Empirical Mode Decomposition, Intrinsic Time-Scale Decomposition, Variational Mode Decomposition
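The core EMD operation, sifting, can be sketched as follows: interpolate upper and lower envelopes through local extrema and subtract their mean. Real implementations use cubic splines and convergence-based stopping criteria; this sketch uses linear interpolation and a fixed number of iterations purely for illustration.

```python
import numpy as np

def local_extrema(x):
    """Indices of interior local maxima and minima."""
    up = [i for i in range(1, len(x) - 1) if x[i] > x[i - 1] and x[i] > x[i + 1]]
    down = [i for i in range(1, len(x) - 1) if x[i] < x[i - 1] and x[i] < x[i + 1]]
    return up, down

def sift_once(x):
    """One sifting iteration: candidate IMF = signal minus mean envelope."""
    up, down = local_extrema(x)
    t = np.arange(len(x))
    iu = [0] + up + [len(x) - 1]
    il = [0] + down + [len(x) - 1]
    upper = np.interp(t, iu, x[iu])
    lower = np.interp(t, il, x[il])
    return x - (upper + lower) / 2.0

t = np.linspace(0, 1, 400)
x = np.sin(2 * np.pi * 25 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
imf = x.copy()
for _ in range(8):          # a few sifting iterations toward the fast component
    imf = sift_once(imf)
residual = x - imf          # approximates the slow 3 Hz component
```

VMD replaces this recursive sifting with a variational optimization over band-limited modes, which is why it tends to be more robust to noise at the cost of extra parameters.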
Relevance of the research topic. Modern cyberattacks are becoming more complex and diverse, which makes classical anomaly detection methods, such as signature-based and heuristic ones, insufficiently effective. It is therefore necessary to develop more advanced network threat detection systems based on machine learning and artificial intelligence technologies. Problem statement. Existing methods of detecting malicious traffic often suffer from high false-positive rates and insufficient accuracy against real network threats. This reduces the effectiveness of cybersecurity systems and makes it difficult to identify new attacks. The purpose of the study. The purpose of this work is to develop a malicious traffic detection system that increases the number of anomalies detected in network traffic through the introduction of machine learning and AI technologies. Research methods. To achieve this goal, thorough analysis and preprocessing of data obtained from publicly available datasets such as CICIDS2017 and KDD Cup 1999 were carried out.
Keywords: anomaly detection, malicious traffic, cybersecurity, machine learning, artificial intelligence, signature methods
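As a point of contrast with the ML models the study develops, a trivial statistical baseline for flagging anomalous flows might look like the sketch below: a z-score rule on synthetic per-flow features. It illustrates the kind of simple detector that learned models are meant to outperform; all numbers are illustrative.

```python
import numpy as np

# Illustrative baseline, not the paper's model: flag flows whose feature vector
# deviates strongly from the training mean.
rng = np.random.default_rng(1)
normal_traffic = rng.normal(loc=[500, 0.2], scale=[50, 0.05], size=(1000, 2))  # [bytes, inter-arrival s]
mu, sigma = normal_traffic.mean(axis=0), normal_traffic.std(axis=0)

def is_anomalous(flow, threshold=4.0):
    """True if any feature is more than `threshold` standard deviations from the mean."""
    z = np.abs((flow - mu) / sigma)
    return bool(z.max() > threshold)

print(is_anomalous(np.array([480.0, 0.21])))   # typical flow -> False
print(is_anomalous(np.array([5000.0, 0.01])))  # burst -> True
```

Such univariate thresholds miss correlated, low-and-slow attacks, which is exactly the gap that models trained on CICIDS2017/KDD-style feature sets try to close.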
Modern digitalization processes involve the use of intelligent systems at key stages of information processing. Since the data available for intelligent analysis in organizational systems are often fuzzy, the problem arises of comparing the corresponding units of information with each other. Several methods for such a comparison are known. In particular, for random fuzzy variables with known distribution laws, the degree of coincidence of these distributions can serve as a criterion of correspondence between one random variable and another. However, this approach lacks the flexibility required for practical problems. The approach we propose makes it possible to compare fuzzy with fuzzy, fuzzy with crisp, and crisp with crisp data. The paper provides an example illustrating this approach. The material presented in the study was initially focused on managing organizational systems in education; however, its results can be extended to other organizational systems.
Keywords: fuzzy data, weakly structured problems, comparison criteria, hierarchy analysis method, systems analysis, fuzzy benchmarking
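The paper's own comparison criterion is not specified in the abstract, but one standard way to compare fuzzy with fuzzy, fuzzy with crisp, and crisp with crisp data uniformly is the possibility measure, where a crisp value is treated as a singleton fuzzy set. The sketch below is generic, with illustrative membership functions.

```python
def possibility(mu_a, mu_b, universe):
    """Possibility of matching: sup_x min(mu_a(x), mu_b(x)). A crisp value v is
    modeled as a singleton membership mu(x) = 1 if x == v else 0, so the same
    measure covers fuzzy-fuzzy, fuzzy-crisp, and crisp-crisp comparisons."""
    return max(min(mu_a(x), mu_b(x)) for x in universe)

def triangle(a, b, c):
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

def singleton(v):
    return lambda x: 1.0 if x == v else 0.0

grid = [x / 10 for x in range(0, 101)]              # universe 0.0 .. 10.0
about_5 = triangle(3, 5, 7)
about_6 = triangle(4, 6, 8)
print(possibility(about_5, about_6, grid))          # fuzzy vs fuzzy -> 0.75
print(possibility(about_5, singleton(5.0), grid))   # fuzzy vs crisp -> 1.0
```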
The purpose of the research is to increase the granularity of sentiment within the framework of sentiment analysis of Russian-language texts by developing a dataset with an extensive set of emotional categories. The paper discusses the main methods of sentiment analysis and the main emotional models. A software system for decentralized data labeling has been developed and described. The novelty of this work lies in the fact that, for the first time, an emotional model containing more than 8 emotion classes, namely R. Plutchik's model, is used to determine the emotional coloring of Russian-language texts. As a result, a new dataset was developed for the study and analysis of emotions. It consists of 24,435 unique records labeled into 32 emotion classes, making it one of the most diverse and detailed datasets in the field. Using the resulting dataset, a neural network was trained that determines the set of emotions the author expressed when writing a text. The dataset opens opportunities for further research in this area; one promising task is to enhance the efficiency of neural networks trained on it.
Keywords: sentiment analysis, model, Robert Plutchik, emotions, markup, text
The article presents the main stages of and recommendations for developing an information and analytical system (IAS) based on geographic information systems (GIS) for the rational management of forest resources, providing for the processing, storage, and presentation of information on forest wood resources, and describes specific examples of the implementation of individual components and digital technologies. The following stages of IAS development are considered: collecting and structuring data on forest wood resources; justifying the type of software implementation; selecting equipment; developing the data analysis and processing unit; developing the architecture of interaction between IAS blocks; developing the application interface; and testing the IAS. It is proposed to implement the interaction between the client and server parts using Asynchronous JavaScript and XML (AJAX) technology. The open-source Leaflet library is recommended for geodata visualization, and the SQLite database management system is proposed for storing large amounts of data on the server. The proposed approaches can find application in creating an IAS that supports management decisions in the rational management of forest wood resources.
Keywords: geographic information systems, forest resources, methodology, web application, AJAX technology, SQLite, Leaflet, information processing
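The recommended server-side stack can be sketched with Python's standard sqlite3 module: store stand records, then answer a query that the web client could request via AJAX and render as Leaflet markers. The schema, table name, and numbers are hypothetical, not from the article.

```python
import sqlite3

# Hypothetical minimal schema for forest stand records.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE stands (
        id INTEGER PRIMARY KEY,
        lat REAL, lon REAL,
        species TEXT,
        volume_m3 REAL
    )
""")
rows = [(1, 56.1, 40.4, "pine", 320.5), (2, 56.2, 40.5, "spruce", 270.0)]
conn.executemany("INSERT INTO stands VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()

# Server-side selection the client could fetch via AJAX and draw with Leaflet
# (GeoJSON-like dicts; note GeoJSON uses [lon, lat] order).
features = [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [lon, lat]},
     "properties": {"species": s, "volume_m3": v}}
    for (lat, lon, s, v) in conn.execute(
        "SELECT lat, lon, species, volume_m3 FROM stands WHERE volume_m3 > 300")
]
print(len(features))  # 1
```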
With the development of low-orbit satellite Internet systems (LSIS), ensuring effective operation under intentional interference comes to the fore. One solution is to use systems that combine OFDM methods with generators implementing frequency hopping (FH). Obviously, the more complex the algorithm for selecting operating frequencies, the more effective the system's operation. The article proposes using the SPN cipher "Grasshopper" as the generator for selecting operating frequencies. As a result, the FH system acquires high resistance to the computation of operating frequency numbers by electronic warfare systems. However, failures and malfunctions may occur during operation of the satellite system. To mitigate their consequences, it is proposed to implement the SPN cipher using polynomial modular codes of residue classes (PMCC). One of the transformations in "Grasshopper" is the nonlinear substitution operation. Developing a new mathematical model for performing this nonlinear transformation using PMCC will keep the SPN-cipher-based FH generator operational in the presence of failures and malfunctions.
Keywords: low-orbit satellite Internet systems, the Grasshopper SPN cipher, nonlinear transformations, modular codes of residue classes, mathematical model, fault tolerance, frequency hopping, polynomial modular code of residue classes
With the digitalisation of the construction industry and import substitution, increasing attention is being paid to the transition to domestic software. At each stage of construction, dedicated software products are needed, including CAD and BIM systems. The article considers the experience of integrating Russian-made systems for information modeling of transport infrastructure and road construction. Within the framework of this work, the Vitro-CAD CDE and the Topomatic Robur software system were integrated. Joint work of construction project participants was organized in a single information space. Project participants' efficiency increased owing to the elimination of routine operations. The integration experience has shown that the combination of Vitro-CAD and Topomatic Robur makes it possible to manage project data efficiently, store files with version tracking, coordinate documentation, and issue comments on it.
Keywords: common data environment, information space, information model, digital ecosystem, computer-aided design, building information modeling, automation, integration, import substitution, software complex, platform, design documentation, road construction
The present study aims to explore the methodologies employed in practice to ascertain the parameters of processes occurring in supercritical fluid media. A primary focus of this investigation lies in the solubility of key components of the system in supercritical fluid solvents, with a view to understanding the limitations of mathematical models in qualitatively predicting solubility outside the investigated ranges of values. This analysis seeks to elucidate the potential challenges and opportunities in conducting experimental studies in this domain. However, within the domain of supercritical fluid technologies, the optimization of processes and the prediction of their properties is attainable through the utilization of models and machine learning methodologies, leveraging both accumulated experimental and calculated data. The present study is dedicated to the examination of this approach, encompassing the consideration of system input parameters, solvent properties, solute properties, and the designated output parameter, solubility. The findings of the present study demonstrate the efficacy of this approach in predicting the solubility process through machine learning.
Keywords: supercritical fluids, solubility of substances, solubility factors, solubility prediction, machine learning, residue analysis, feature importance analysis
The article is devoted to the development and implementation of a two-stage magnetometer calibration algorithm integrated into the navigation system of a small-class unmanned underwater vehicle. At the first stage, an ellipsoid approximation method is used to compensate for soft-iron and hard-iron distortion, ensuring the correct geometric locus of magnetometer measurements. At the second stage, the rotation between the magnetometer and accelerometer coordinate systems is estimated using quaternions as rotation parameters. Experimental verification of the algorithm demonstrated its effectiveness: after the two-stage calibration was completed, the calibration parameters were determined, and their use showed good consistency between magnetometer readings and the actual magnetic field, confirming the suitability of this technique for magnetometer calibration. The proposed two-stage calibration algorithm does not require laboratory equipment and can be carried out under real-world operating conditions, which makes it possible to integrate it into the onboard software of unmanned underwater vehicles.
Keywords: calibration, magnetometer, accelerometer, MEMS sensor, AHRS, navigation system, unmanned underwater vehicle, ellipsoid approximation, quaternion, magnetic inclination
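The first calibration stage can be illustrated in a simplified, axis-aligned form: estimate per-axis offsets (hard iron) and scales (soft iron) so that distorted measurements return to a sphere. The paper's method fits a full ellipsoid with cross-axis terms; this sketch keeps only the diagonal case and runs on synthetic data.

```python
import numpy as np

def calibrate_axis_aligned(samples):
    """Simplified hard/soft-iron calibration assuming an axis-aligned ellipsoid:
    per-axis offset = (max + min) / 2, per-axis scale equalizes the radii."""
    mx, mn = samples.max(axis=0), samples.min(axis=0)
    offset = (mx + mn) / 2.0
    radii = (mx - mn) / 2.0
    scale = radii.mean() / radii
    return offset, scale

def apply_calibration(samples, offset, scale):
    return (samples - offset) * scale

# synthetic measurements: unit sphere distorted by per-axis scale (soft iron) and bias (hard iron)
rng = np.random.default_rng(0)
v = rng.normal(size=(2000, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
distorted = v * np.array([1.0, 1.5, 0.8]) + np.array([0.2, -0.1, 0.3])

offset, scale = calibrate_axis_aligned(distorted)
corrected = apply_calibration(distorted, offset, scale)
norms = np.linalg.norm(corrected, axis=1)
print(norms.std())  # small spread: points lie near a sphere again
```

The min/max rule is sensitive to outliers; a least-squares ellipsoid fit, as in the article's first stage, is the robust production approach.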
This paper addresses the challenge of assuring data quality in high-frequency Internet-of-Things (IoT) streams while migrating to a hybrid edge–cloud architecture. We demonstrate that moving a subset of data-quality procedures—trust-metric calculation, outlier detection, and data-contract validation—from the cloud to edge devices markedly lowers end-to-end latency and reduces cloud load. After surveying existing cloud-centric and edge-centric quality-control solutions, we reveal their limitations: static placement of analytic modules and lack of support for dynamic workload drift. We introduce the concept of edge-oriented data-quality control, in which validation tasks are continuously re-assigned according to real-time network bandwidth and CPU utilisation. A prototype based on Apache Flink implements the proposed scheduler. Experiments with an industrial testbed (300 000 messages/s) show a 37 % reduction in alert latency and a 46 % decrease in cloud CPU consumption compared with a fully cloud-based pipeline. The paper discusses strengths, weaknesses, applicability boundaries, and security threats, and outlines future work on adaptive model selection at the edge, multimodal stream support, and formalised data-quality contracts.
Keywords: data quality control, streaming processing, Internet of Things, cloud computing, Apache Flink, Apache Kafka, anomaly detection, dynamic offloading
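The continuous re-assignment idea can be sketched as a greedy placement rule: offload a validation task to the edge while CPU and uplink-relief budgets allow, otherwise keep it in the cloud. The task names echo the paper; the costs, thresholds, and the greedy rule itself are illustrative, not the Flink scheduler described in the text.

```python
def assign_tasks(tasks, edge_cpu_budget, uplink_mbps):
    """Greedy placement: most uplink-relieving tasks go to the edge first."""
    placement = {}
    cpu_used = 0.0
    for task in sorted(tasks, key=lambda t: t["saved_mbps"], reverse=True):
        fits = cpu_used + task["cpu"] <= edge_cpu_budget
        worthwhile = task["saved_mbps"] > 0.1 * uplink_mbps  # offload only if it relieves the uplink
        if fits and worthwhile:
            placement[task["name"]] = "edge"
            cpu_used += task["cpu"]
        else:
            placement[task["name"]] = "cloud"
    return placement

tasks = [
    {"name": "outlier_detection", "cpu": 0.3, "saved_mbps": 40.0},
    {"name": "contract_validation", "cpu": 0.2, "saved_mbps": 25.0},
    {"name": "trust_metric", "cpu": 0.6, "saved_mbps": 5.0},
]
print(assign_tasks(tasks, edge_cpu_budget=0.6, uplink_mbps=100.0))
```

Re-running this rule as bandwidth and CPU measurements change captures, in miniature, the dynamic re-assignment that the proposed scheduler performs continuously.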
In the article, based on an estimate of the Euclidean norm of the deviation between the transient and stationary states of a dynamic system, the contraction condition for the generalized projection operator of a dynamic system with constraints is derived. From the contraction mapping principle, taking into account the derived contraction condition for the projection operator, estimates are obtained for a sufficient condition for the stability of the dynamic system stabilizing the equilibrium position and program motions. The obtained estimates generalize previously published results. The stability of the operator of a constrained dynamic system is demonstrated experimentally.
Keywords: sufficient condition for stability, projection operator, stabilization of equilibrium position, stabilization of program motions, SimInTech
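The route from a contraction estimate to a stability guarantee follows the contraction mapping (Banach fixed-point) principle. In generic form, assuming the projection operator P is a contraction (the article derives the specific condition for the constrained system):

```latex
% Banach fixed-point sketch (generic form, not the article's specific estimate)
\|P(x) - P(y)\| \le q\,\|x - y\|, \quad 0 \le q < 1,
\qquad x_{k+1} = P(x_k)
\;\Longrightarrow\;
\|x_k - x^{*}\| \le \frac{q^{k}}{1-q}\,\|x_1 - x_0\|,
```

so the iterates converge geometrically to the unique fixed point \(x^{*}\), which underlies the sufficient stability condition for the stabilized equilibrium and program motions.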
Oil spills require timely measures to eliminate their causes and neutralize their consequences. Case-based reasoning is promising for developing specific technological solutions for oil spill response. It thus becomes important to structure the description of possible situations and the representation of solutions. This paper presents the results of these tasks: a structure for representing oil spill situations based on a situation tree is proposed, the algorithm for situational decision-making using this structure is described, and parameters for describing oil spill situations and representing solutions are proposed. The situation tree makes it possible to form a representation of situations based on the analysis of various source information. This approach allows the parameters to be refined quickly and similar situations to be selected from the knowledge base, whose solutions can be reused in the current undesirable situation.
Keywords: case-based reasoning; decision making; oil spill, oil spill response, decision support, situation tree
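Case retrieval over a structured situation description can be sketched as a weighted similarity over mixed categorical and numeric parameters. The parameters, weights, and stored cases below are hypothetical, not the ones proposed in the paper.

```python
def similarity(a, b, weights):
    """Weighted similarity: exact match for strings, relative closeness for numbers."""
    s = 0.0
    for key, w in weights.items():
        if isinstance(a[key], str):
            s += w * (1.0 if a[key] == b[key] else 0.0)
        else:
            s += w * (1.0 - abs(a[key] - b[key]) / max(abs(a[key]), abs(b[key]), 1e-9))
    return s

case_base = [
    {"product": "diesel", "volume_t": 5.0, "surface": "soil", "solution": "excavate and sorb"},
    {"product": "crude", "volume_t": 50.0, "surface": "water", "solution": "booms and skimmers"},
]
weights = {"product": 0.3, "volume_t": 0.3, "surface": 0.4}
current = {"product": "crude", "volume_t": 40.0, "surface": "water"}
best = max(case_base, key=lambda c: similarity(current, c, weights))
print(best["solution"])  # booms and skimmers
```

A situation tree refines this flat retrieval: matching proceeds level by level, so only the branch consistent with the already-confirmed parameters is searched.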
The article reviews and systematizes works devoted to the application of machine learning to the research, calculation, and design of reinforced concrete structures. It considers currently relevant aspects of calculation, design, and assessment of the technical condition of structures using various approaches that implement machine learning schemes, including deep learning and ensemble algorithms. It is shown that this area is rapidly developing and improving in world construction science: machine learning algorithms solve problems of predicting design parameters and of identifying particular parameters, defects, and damage on the basis of classification algorithms, among others. The materials presented in the article will allow specialists to choose their subject area of research more precisely and to determine directions for adapting and improving their own developments in machine learning.
Keywords: machine learning, reinforced concrete structures, regression equations, identification, approximation, artificial intelligence
The purpose of the article is the software implementation of a module for analyzing site user activity based on a click heat map, compatible with domestic web services, that combines, for example, correlation and regression analysis with visualization as dashboards before and after changes to site elements. All functionality runs directly in the web analytics service. Based on the data obtained for the analyzed site element, a decision is made to adjust its design and/or content to increase the click-through rate. The proposed solution thus extends the functionality of the web analytics service and reduces labor costs. The software module has been successfully tested: after the analysis and the necessary adjustments to the site, the click-through rate increased.
Keywords: user activity, correlation and regression analysis, dashboard, program module, trend line, coefficient of determination
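The correlation-and-regression part of such a module can be illustrated with a trend line and coefficient of determination fitted to daily click counts on a page element; the numbers below are illustrative.

```python
import numpy as np

def fit_trend(x, y):
    """Least-squares trend line and R^2 (coefficient of determination)."""
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return slope, intercept, r2

days = np.arange(1, 8, dtype=float)
clicks = np.array([120, 135, 128, 150, 161, 158, 170], dtype=float)
slope, intercept, r2 = fit_trend(days, clicks)
print(round(slope, 1), round(r2, 2))
```

A dashboard comparing such fits before and after a design change gives a quick visual check of whether the click-through rate trend actually improved.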
When assessing the risk of emergencies, there is often a need to analyze unstructured data. Traditional analysis methods may not account for the ambiguity of information, which makes them insufficiently effective for risk assessment. The article proposes a modified analytic hierarchy process that uses fuzzy logic, which allows uncertainties and subjective assessments to be taken into account more effectively when analyzing emergency risks. In addition, such methods make it possible to consider not only quantitative but also qualitative indicators. This, in turn, can lead to better-informed risk management decisions and increased preparedness for various situations. Integrating technologies for working with unstructured data into emergency risk assessment not only increases forecasting accuracy but also allows management strategies to adapt to changing conditions.
Keywords: artificial intelligent systems, unstructured data, risk assessment, classical hierarchy analysis method, modified hierarchy analysis method, fuzzy logical inference system
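The modified method starts, like the classical analytic hierarchy process, from a pairwise comparison matrix; a common sketch of deriving criterion weights uses the geometric-mean approximation of the principal eigenvector. The Saaty-scale judgments below are illustrative, and the fuzzy modification would replace these crisp entries with fuzzy numbers.

```python
import numpy as np

# Illustrative 3-criteria pairwise comparison matrix (reciprocal, Saaty scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
gm = A.prod(axis=1) ** (1.0 / A.shape[0])   # geometric mean of each row
weights = gm / gm.sum()                      # normalized priority vector
print(weights.round(3))  # most important criterion gets the largest weight
```

In the fuzzy variant, each entry becomes, e.g., a triangular fuzzy number, the same aggregation is carried out with fuzzy arithmetic, and the resulting fuzzy weights are defuzzified before ranking.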