

  • Development of an Event Tree Based on System Goals, Strategies, and Tasks

    The combination of systems analysis and long-term planning is a crucial factor in ensuring sustainable development and enhancing the competitiveness of enterprises. In this context, the Event Tree Analysis (ETA) method plays a key role in assessing the achievement of strategic goals and tasks and in identifying potential risks. This study focuses on the development and application of an event tree for analyzing various aspects of system operations, including goal setting, strategy development, and task execution. The ETA method not only allows possible event scenarios to be modeled but also enables the development of risk mitigation measures, contributing to long-term sustainability and successful system functioning.

    Keywords: event tree, system analysis, strategic planning, risk management, threat minimization, sustainable development, enterprise competitiveness, quantitative analysis, qualitative analysis, dependent events, conditional probabilities, protective mechanisms
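
A minimal sketch of how an event tree propagates an initiating-event frequency through successive branch points; the branch names and probabilities below are hypothetical, not taken from the study.

```python
# Event-tree sketch: each branch point splits every path into a
# success/failure pair, multiplying frequencies along the way.

def event_tree_outcomes(init_freq, branch_points):
    """Enumerate end states of an event tree.

    branch_points: list of (name, p_success) pairs.
    Returns a list of (path_labels, end_state_frequency).
    """
    paths = [([], init_freq)]
    for name, p in branch_points:
        nxt = []
        for labels, freq in paths:
            nxt.append((labels + [(name, "success")], freq * p))
            nxt.append((labels + [(name, "failure")], freq * (1 - p)))
        paths = nxt
    return paths

# hypothetical initiating event (freq 0.1/yr) and two protective barriers
outcomes = event_tree_outcomes(0.1, [("detection", 0.9), ("mitigation", 0.8)])
```

The sum of all end-state frequencies equals the initiating-event frequency, a useful sanity check for larger trees.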

  • Designing a quadcopter for indoor inspection and developing a control system based on the CAD model

    The development of a prototype unmanned aerial vehicle (UAV) and of a control system based on a computer-aided design (CAD) model is considered as part of a project for inspecting construction sites. Special attention is paid to constructing a computer model of a quadcopter. Based on existing methods, energy calculations have been performed, and a process for synthesizing controllers in the orientation and positioning control loops has been proposed that takes into account the sampling rate of the sensors used. The modeling results confirm the suggested controller tuning algorithm. The solution can be used by students and professionals in the development of autonomous UAVs or their computer models.

    Keywords: quadcopter, computer modeling, PD controller synthesis, UAV design, stereo camera, room inspection

  • Methodology for automated evaluation of fire detector response time based on fire simulation results

    One of the key parameters used to assess individual fire risk under the applicable calculation methodology is the evacuation start time. Calculating it requires the time at which a fire detector's threshold value is reached, which can be determined from a fire simulation of the room containing the fire. At the same time, it is necessary to dynamically evaluate the size of the area within which, at the mounting height of the fire detectors, the acting parameter exceeds its threshold value at every point; this is a rather complex task whose solution requires automation. This paper proposes a method for automated assessment of the time to reach the fire detector response threshold based on fire modeling results when determining the calculated values of individual fire risk. A functional model and the basic algorithm of the proposed technology are presented. The developed methodology was tested on the example of modeling a fire in a commercial building with the FDS software kernel for various scenarios. A comparative analysis of estimating the time to reach the detector threshold for various criteria using the proposed technology versus the manual method is presented.

    Keywords: individual fire risk, fire dynamics simulation, field fire model, automation, algorithm, FDS

  • Application of the Fuzzy Set Method in the Information Security Audit Process

    The process of ensuring information security is inextricably linked with assessing compliance with requirements. In the field of information protection, this process is called an information security audit. Currently, there are many international and domestic audit standards describing various processes and methods for assessing compliance. One of their key drawbacks is the use of exclusively qualitative assessment without numerical calculations, which prevents the procedure from being fully objective. Fuzzy logic makes it possible to give the audit process an appropriate quantitative assessment while still operating with understandable linguistic variables. The article analyzes existing standards and presents a conceptual model for applying the fuzzy set method in the information security audit process.

    Keywords: information security, information infrastructure, security audit, risk analysis, fuzzy sets, fuzzy logic
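
The idea of mapping a numeric compliance score onto linguistic terms can be sketched with triangular membership functions; the 0-10 scale and the terms below are illustrative assumptions, not taken from any audit standard.

```python
# Fuzzification sketch: a crisp compliance score becomes degrees of
# membership in linguistic terms ("low", "medium", "high").

def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

TERMS = {  # membership functions over an assumed 0..10 compliance scale
    "low":    lambda x: tri(x, -1, 0, 5),
    "medium": lambda x: tri(x, 2, 5, 8),
    "high":   lambda x: tri(x, 5, 10, 11),
}

def fuzzify(score):
    return {term: mf(score) for term, mf in TERMS.items()}

m = fuzzify(6)  # a score of 6 is partly "medium", slightly "high"
```

A full fuzzy-audit pipeline would then apply rules over such terms and defuzzify the result back into a numeric rating.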

  • The machine stock load optimization model within calendar year

    The paper discusses multi-criteria optimization of technological equipment loading at a machine-building enterprise within a calendar year. Planning and optimizing equipment loading is one of the key tasks of operational scheduling at engineering enterprises. The paper presents a model for optimizing the load of the technological equipment used in the production process. Within the model, three groups of target indicators were identified: the performance of an equipment group within the calendar year; the uniformity of the group's loading within the calendar year; and the amount of losses from the group's downtime within the calendar year. The paper presents results of optimizing the load of the machine-tool fleet of a machining shop. Optimization was carried out for specific equipment groups: lathes, milling machines, and grinding machines. Equipment load was optimized by redistributing the total labor intensity of the work performed for the corresponding equipment groups between periods of the calendar year. The Pareto optimization method was used to determine the optimal loading option for the equipment groups, under the following strategy: minimizing the total losses from equipment downtime. The paper presents Pareto front graphs of the targets for the lathe group. As a result of optimization, total downtime losses for the selected equipment groups decreased by 6.8–10.2%. Thus, the developed model for optimizing machine-tool fleet loading made it possible to increase the efficiency of operational scheduling at machine-building enterprises.

    Keywords: scheduling, multi-criteria optimization, machine stock, targets, losses, process loading
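
The Pareto step described above can be sketched as a simple dominance filter; the candidate load plans below (downtime losses, load unevenness) are made-up numbers for illustration.

```python
# Pareto-front sketch: keep only load plans not dominated on all criteria.
# Each option is (downtime_loss, load_unevenness), both to be minimized.

def pareto_front(options):
    """Return the options not dominated by any other option."""
    front = []
    for i, a in enumerate(options):
        dominated = any(
            all(x <= y for x, y in zip(b, a)) and b != a
            for j, b in enumerate(options) if j != i
        )
        if not dominated:
            front.append(a)
    return front

plans = [(120, 0.30), (100, 0.35), (90, 0.50), (110, 0.32), (130, 0.60)]
front = pareto_front(plans)  # (130, 0.60) is dominated and drops out
```

A planner then picks one point from the front according to the stated strategy, e.g. the one minimizing total downtime losses.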

  • Adaptive pipeline architecture with shared memory and selective ordering for high-performance stream data processing

    This paper presents an adaptive pipeline architecture designed to increase throughput and reduce latency in real-time stream data processing within single- and multi-processor systems. Unlike predominantly conceptual models or narrowly focused algorithms, the practical impact of this architecture is demonstrated by achieving measurable performance gains through reduced redundant data copying and synchronization costs and by providing flexible control over input and output data ordering. The architecture employs shared memory to eliminate buffer duplication, uses data transfer channels that adapt based on the need for order preservation, and supports the replication of processes within or across CPU cores. Experimental results indicate that the proposed architecture delivers both high throughput and low latency while introducing minimal overhead for data transmission and process synchronization. By offering a flexible and scalable foundation, this architecture can be applied to a wide range of real-time applications, from video surveillance and robotics to distributed platforms for processing large data sets. It demonstrates versatility and robustness in adapting to varying computational demands, thereby ensuring both efficiency and reliability in high-performance environments.

    Keywords: parallelism, multiprocessor computing, computational pipeline, performance scaling, queues, shared memory

  • Analysis of decision-making models in ensuring the protection of public order

    This study analyzes decision-making models for ensuring the protection of public order. The object of the study is the process of ensuring the protection of public order. The scientific literature and open sources contain a large number of works describing models and algorithms built on various mathematical tools; analyzing them makes it possible to formulate a new mathematical model of decision-making that optimizes and improves the quality of draft management decisions for protecting public order in the Republic of Tajikistan, with the possibility of simulation. The study revealed that an effective management decision is the basis for improving the safety of citizens during mass events. 1) Based on this, an analysis of decision-making models is presented, the purpose of which is to determine the need to create a decision-making model for protecting public order in the Republic of Tajikistan. 2) A decision-making model for protecting public order in the Republic of Tajikistan is proposed. The model is implemented through a synthesis of mathematical modeling methods, including cluster analysis, the pairwise comparison method, and Petri nets. It divides committed events (crimes) into clusters according to predefined criteria and, at the final stage, simulates each event, thereby predicting the possible development of the event under study. The presented analysis of decision-making models made it possible to formulate a new mathematical model of decision-making for protecting public order in the interests of the Republic of Tajikistan.

    Keywords: public order protection, mathematical model, cluster analysis, pairwise comparison method, expert assessments, Petri nets
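
One building block of the proposed synthesis, the pairwise comparison method, can be sketched as follows; the 3x3 judgment matrix over hypothetical criteria is illustrative, and the weights are computed as normalized geometric row means (an AHP-style approximation).

```python
# Pairwise-comparison sketch: derive criterion weights from a
# reciprocal judgment matrix via normalized geometric row means.
import math

def priority_weights(matrix):
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# hypothetical criteria: severity vs. frequency vs. location;
# M[i][j] = how much more important criterion i is than j
M = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
w = priority_weights(M)  # severity comes out heaviest
```

Such weights could then feed the clustering criteria or the ranking of prepared decision options.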

  • Analysis of corporate network traffic using SMTP protocol to detect malicious traffic

    This article presents an analysis of corporate network traffic over the SMTP protocol to identify malicious traffic. The relevance of the study is driven by the increasing number of email-based attacks, such as the distribution of viruses, spam, and phishing messages. The objective of the work is to develop an algorithm for detecting malicious traffic that combines traditional analysis methods with modern machine learning approaches. The article describes the research stages: data collection, preprocessing, model training, algorithm testing, and effectiveness analysis. The data used were collected with the Wireshark tool and include SMTP logs, message headers, and attachments. The experimental results demonstrated high accuracy in detecting malicious traffic, confirming the potential of the proposed approach.

    Keywords: SMTP, malicious traffic, network traffic analysis, email, machine learning, Wireshark, spam, phishing, classification algorithms
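
A rule-based scoring pass of this kind often complements a trained model; the features, keyword lists, blocklist, and threshold below are illustrative assumptions, not the article's classifier.

```python
# Heuristic SMTP message scoring sketch: combine simple signals
# (attachment type, subject keywords, sender domain) into a score.
SUSPICIOUS_EXT = (".exe", ".js", ".scr", ".vbs")
SPAM_WORDS = {"winner", "urgent", "password", "verify"}
BLOCKED_DOMAINS = {"example-free-mail.test"}  # made-up blocklist

def score_message(sender, subject, attachments):
    score = 0
    if any(a.lower().endswith(SUSPICIOUS_EXT) for a in attachments):
        score += 3  # executable attachments are a strong signal
    score += sum(w in subject.lower() for w in SPAM_WORDS)
    if sender.split("@")[-1] in BLOCKED_DOMAINS:
        score += 2
    return score

def is_malicious(sender, subject, attachments, threshold=3):
    return score_message(sender, subject, attachments) >= threshold
```

In a combined pipeline, such scores can serve as extra input features for the machine learning model rather than a standalone verdict.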

  • An overview of solutions for optimizing the management system of facility protection complex

    Optimization of automated management systems for facility protection complexes remains a relevant task. This paper provides an overview of tools for implementing the separate monitoring processes: device polling, processing of the received data, and transferring data to the graphical user interface. Based on the analysis of the reviewed information, a set of solutions for developing a management system for the complex of technical means is to be formed. The research found that combining a multi-threaded architecture with an adaptive polling algorithm makes large-scale polling feasible; a clustering algorithm and tuned frameworks for processing large datasets can enhance job performance; and the WebSocket protocol has proved efficient for transferring real-time data. The result of the evaluation is a set of tools for implementing a hardware-software complex.

    Keywords: sensor, management system, monitoring, SNMP manager, clustering, Hadoop, MapReduce, Spark, Apache Kafka, WebSocket

  • Current state and prospects of development of high-tech industrial systems based on 5th generation mobile broadband communications

    The paper examines the current state of the industrial Internet of Things market in Russia and around the world, the main areas of its application, as well as the prospects and challenges that businesses and industrial enterprises will face in implementing this technology. Special attention is paid to the advantages of implementing IIoT, such as increased productivity, reduced costs, improved security and transparency of processes. The barriers specific to the Russian market are discussed, including cybersecurity, hardware compatibility, and significant initial costs. Examples of successful implementations of IIoT technologies in various industries such as the oil and gas industry, logistics and chemical production are given. The emphasis is placed on the need for government support and adaptation of the regulatory framework to accelerate implementation. The article highlights the importance of an integrated approach to IIoT implementation, including using international experience and consolidating efforts to develop the digital economy in the face of global and local challenges.

    Keywords: industrial Internet of Things, IIoT, industry 4.0, 5G, production automation, digital transformation

  • Forecasting rare events based on the analysis of interaction graphlets in social networks

    The widespread use of social media platforms has led to the accumulation of vast amounts of stored data, enabling the prediction of rare events based on user interaction analysis. This study presents a method for predicting rare events using graph theory, particularly graphlets. The social network VKontakte, with over 90 million users, serves as the data source. The ORCA algorithm is utilized to identify characteristic graph structures within the data. Throughout the study, user interactions were analyzed to identify precursors of rare events and assess prediction accuracy. The results demonstrate the effectiveness of the proposed method, its potential for threat monitoring, and the possibilities for further refinement of graphlet-based prediction models.

    Keywords: social media, security event, event prediction, graph theory, graphlet, interaction analysis, time series analysis, correlation analysis, data processing, anomalous activity
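
To give a flavor of graphlet counting: the smallest nontrivial graphlet is the triangle, and per-node triangle counts can be computed directly; ORCA itself enumerates all 2-5-node graphlet orbits far more efficiently than this brute-force sketch.

```python
# Triangle-counting sketch on a tiny interaction graph: each node's
# count is the number of triangles (3-cliques) it participates in.
from itertools import combinations

def triangle_counts(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    counts = dict.fromkeys(adj, 0)
    for u, v, w in combinations(sorted(adj), 3):
        if v in adj[u] and w in adj[u] and w in adj[v]:
            for n in (u, v, w):
                counts[n] += 1
    return counts

# hypothetical interactions: users 1-2-3 form a triangle, 4 hangs off 3
counts = triangle_counts([(1, 2), (2, 3), (1, 3), (3, 4)])
```

Vectors of such per-node graphlet counts are what the prediction stage compares when looking for precursors of rare events.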

  • Exploring the possibilities of using blockchain technology to protect data in CRM systems and increase transparency in the process of interacting with customers

    In modern conditions of digital transformation, companies are actively implementing Customer Relationship Management (CRM) systems to manage customer relationships. However, the issues of data protection, confidentiality and transparency of interaction remain critically important. This article explores the possibilities of using blockchain technology to enhance the security of CRM systems and improve trust between businesses and customers. The purpose of the work is to analyze the potential of using blockchain in data protection of CRM systems, as well as to assess its impact on the transparency of customer transactions. The paper examines the main threats to data security in CRM, the principles of blockchain technology and its key advantages in this context, including decentralization, immutability of records and protection from unauthorized access. Based on the analysis, promising areas of blockchain integration into CRM systems have been identified, practical recommendations for its application have been proposed, and the potential effectiveness of this technology has been assessed. The results of the study may be useful to companies interested in strengthening the protection of customer data and increasing the transparency of user interaction processes.

    Keywords: blockchain, CRM-system, security, data protection, transparency, customer interaction

  • Prediction of gas concentrations based on neural network modeling

    The article discusses the use of recurrent neural networks for predicting air pollutants from simulated time-series data. Recurrent models with Long Short-Term Memory (LSTM) are used to build the forecast: unidirectional LSTM (hereinafter simply LSTM) and bidirectional LSTM (Bi-LSTM). Both were applied to temperature, humidity, pollutant concentration, and other parameters, taking into account both seasonal and short-term changes. The Bi-LSTM network showed the best performance and the smallest errors.

    Keywords: environmental monitoring, data analysis, forecasting, recurrent neural networks, long short-term memory, unidirectional, bidirectional
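
For readers unfamiliar with LSTM internals, a single scalar LSTM cell step can be written out in plain Python; the weights here are fixed toy values, whereas real models learn them and stack many such cells (and, in Bi-LSTM, run them over the sequence in both directions).

```python
# One forward step of a scalar LSTM cell: gates decide what to forget,
# what to write into the cell state, and what to expose as output.
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """w maps gate name -> (input weight, hidden weight, bias)."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c

w = {k: (0.5, 0.1, 0.0) for k in "figo"}  # toy weights, not learned
h, c = 0.0, 0.0
for x in [0.2, 0.4, 0.6]:  # a tiny "time series" of observations
    h, c = lstm_step(x, h, c, w)
```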

  • Design and Development of Information System for Automated Processing of Orders for the Production of Abrasive Tools

    The article is devoted to the creation of a highly specialized automated information system for processing orders for the production of abrasive tools. The development of such software products will improve production efficiency through the transition from order-based production to batch production.

    Keywords: automated information system, production order processing system, Rammler-Breich diagram, role-based data access system

  • System analysis of the information system for accounting of human resources in risk management

    The article presents the results of an analytical study of risk management in the creation and modernization of business processes. It proposes risk management methods that use the organization's human resources, and personnel training methods that take labor market trends into account. The effect of implementing the risk management measures and a method for assessing the effectiveness of the implemented training are noted separately.

    Keywords: risk management, human resources, employee training, experts, SWOT analysis

  • Modeling the dynamics of mixing of a two-component mixture by a Markov process

    The article considers simulation modeling of fibrous material mixing processes using Markov processes. The correct combination and redistribution of components in a two-component mixture significantly affects its physical properties, and the developed model makes it possible to optimize this process. The authors propose an algorithm for modeling transitions between mixture states based on Markov processes.

    Keywords: modeling, simulation, mixture, mixing, fibrous materials

  • Integration of Cloud, Fog, and Edge Computing: Opportunities and Challenges in Digital Transformation

    This article explores the opportunities and challenges of integrating cloud, fog, and edge computing in the context of digital transformation. The analysis reveals that the synergy of these technologies enables optimization of big data processing, enhances system adaptability, and ensures information security. Special attention is given to hybrid architectures that combine the advantages of centralized and decentralized approaches. Practical aspects are addressed, such as the use of the ENIGMA simulator for modeling scalable infrastructures and the EC-CC architecture for smart grids and IoT systems. The role of specialized frameworks in optimizing routing and improving infrastructure reliability is also highlighted. The integration of these technologies drives advancements in key industries, including energy, healthcare, and the Internet of Things, despite challenges related to data security.

    Keywords: cloud computing, fog computing, edge computing, hybrid architectures, Internet of Things, digital transformation, big data, decentralized systems, computing integration, distributed computing, data security, resource optimization, data transfer speed

  • Application of neural networks in modern radiography: automated analysis of reflectometry data using machine learning

    This article presents the mlreflect package, written in Python: an optimized data pipeline for automated analysis of reflectometry data using machine learning. The package combines several training and data processing methods. The predictions made by the neural network are accurate and reliable enough to serve as good starting parameters for subsequent data fitting by a least mean squares (LMS) method. On a large dataset of 250 reflectivity curves of various thin films on silicon substrates, it was demonstrated that the analytical pipeline finds a fit minimum very close to the one obtained by a researcher using physical knowledge and carefully selected boundary conditions.

    Keywords: neural network, radiography, thin films, data pipeline, machine learning

  • Substantiation of the effectiveness of using recycling and waste disposal technologies based on the materials management model

    The paper analyzes existing effective technologies of waste recycling and utilization. The authors consider various approaches in the international practice of recycling production and consumption waste. An assessment is given of the possibilities of using effective technologies for waste recycling and disposal and the necessary costs for their implementation in relation to the conditions of an industrial enterprise. The types and volumes of waste that can be recycled and disposed of irrevocably are considered, for which the carbon footprint parameters are calculated using the materials management model. A statistical regression analysis of data on the production, processing, disposal and incineration of polyethylene waste, solid municipal waste and paper was carried out. The principles of building a system for reducing technogenic risks and managing production and consumption waste were determined.

    Keywords: waste processing; waste disposal; carbon footprint; carbon footprint calculation methods; man-made risk management system; hazardous impact factors; industrial waste management

  • Analysis of the influence of data representation accuracy on the quality of wavelet image processing using Winograd method computations

    This paper is devoted to the application of the Winograd method to the wavelet transform in the image compression problem. The method reduces computational complexity and increases computation speed due to group processing of pixels. The paper determines the minimum number of bits at which high quality of processed images is achieved when the discrete wavelet transform is performed in fixed-point format. The experimental results showed that for processing fragments of 2 and 3 pixels without loss of accuracy using the Winograd method, 2 fractional bits are sufficient for the calculations. To obtain a high-quality image when processing groups of 4 and 5 pixels, it is sufficient to use 4 and 7 fractional bits, respectively. Development of hardware accelerators for the proposed image compression method is a promising direction for further research.

    Keywords: wavelet transform, Winograd method, image processing, digital filtering, convolution with step
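
The effect of bit width can be illustrated with a plain Haar wavelet pair computed on a fixed-point grid; the article uses Winograd-structured transforms, so this sketch only shows how the number of fractional bits bounds the reconstruction error.

```python
# Fixed-point sketch: quantize Haar wavelet coefficients to a given
# number of fractional bits, invert, and measure reconstruction error.
import math

R2 = math.sqrt(2)

def quant(x, bits):
    """Round x to the nearest multiple of 2**-bits (fixed-point grid)."""
    scale = 1 << bits
    return round(x * scale) / scale

def haar_pair(a, b, bits):
    s = quant((a + b) / R2, bits)  # approximation coefficient
    d = quant((a - b) / R2, bits)  # detail coefficient
    return (s + d) / R2, (s - d) / R2  # inverse transform

def max_error(pairs, bits):
    e = 0.0
    for a, b in pairs:
        x, y = haar_pair(a, b, bits)
        e = max(e, abs(a - x), abs(b - y))
    return e

pairs = [(101, 54), (37, 200), (15, 16)]  # illustrative pixel pairs
```

With b fractional bits each coefficient error is at most 2**-(b+1), so the reconstruction error per pixel is bounded by 2**-b / sqrt(2), shrinking as bits are added.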

  • Using neural networks to solve computer vision problems

    The article discusses the main approaches to solving computer vision problems using neural networks, focusing on their application to a wide range of tasks. It describes the types of problems addressed by computer vision, such as image classification, object detection, segmentation, and activity recognition. The functioning mechanisms of convolutional neural networks (CNNs) are explained in detail, highlighting key features like convolutional layers, pooling operations, and activation functions. The problem of selecting object detection models, which generalize the more studied problem of object classification, is examined in depth, along with an evaluation of the efficiency of various algorithms using metrics like mAP (mean Average Precision) and IoU (Intersection over Union). Modern approaches to training neural networks are discussed, including the use of pre-trained models, transfer learning methods, and fine-tuning techniques for domain-specific applications. The article describes the advantages and limitations of prominent CNN architectures such as ResNet, VGG, and EfficientNet, offering insights into their suitability for different tasks. Data augmentation methods, aimed at improving the generalization ability of models, are also considered, emphasizing their importance for addressing data scarcity challenges. Practical examples of computer vision applications in areas like facial recognition, autonomous driving, and medical diagnostics are provided to illustrate the real-world relevance of these methods. Additionally, the integration of computer vision algorithms into complex systems and workflows is analyzed, highlighting its transformative potential across industries. Finally, the article discusses the future directions for research in this domain, including advancements in unsupervised learning, real-time processing, and explainable AI in computer vision.

    Keywords: computer vision, architecture, convolutional neural networks, digital image, object classification

  • Programming using the actor model on the Akka platform: concepts, patterns, and implementation examples

    This article discusses the basic concepts and practical aspects of programming using the actor model on the Akka platform. The actor model is a powerful tool for creating parallel and distributed systems, providing high performance, fault tolerance and scalability. The article describes in detail the basic principles of how actors work, their lifecycle, and messaging mechanisms, as well as provides examples of typical patterns such as Master/Worker and Proxy. Special attention is paid to clustering and remote interaction of actors, which makes the article useful for developers working on distributed systems.

    Keywords: actor model, akka, parallel programming, distributed systems, messaging, clustering, fault tolerance, actor lifecycle, programming patterns, master worker, proxy actor, synchronization, asynchrony, scalability, error handling
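
The core actor contract, a mailbox processed sequentially by one logical thread, can be sketched with the Python standard library; this is not Akka itself, which adds supervision, clustering, and remoting on top of this idea.

```python
# Minimal actor sketch: each actor owns a FIFO mailbox drained by a
# single thread, so its handler never runs concurrently with itself.
import queue
import threading

class Actor:
    def __init__(self, handler):
        self._mailbox = queue.Queue()
        self._handler = handler
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def tell(self, msg):
        """Fire-and-forget send, analogous to Akka's tell."""
        self._mailbox.put(msg)

    def stop(self):
        self._mailbox.put(None)  # poison pill ends the message loop
        self._thread.join()

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg is None:
                break
            self._handler(msg)

results = []
worker = Actor(results.append)  # Master/Worker: master tells, worker processes
for task in range(3):
    worker.tell(task)
worker.stop()
```

Because each actor drains its own FIFO mailbox on a single thread, messages to one actor are handled in order without explicit locks, which is the property patterns like Master/Worker and Proxy build on.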

  • Adaptation of the dynamic time warping algorithm for the problem of finding the distance between two time series with periods of low value variability

    The dynamic time warping (DTW) algorithm compares two time series by measuring the distance between them. DTW is widely used in medicine, speech recognition, financial market analysis, and gaze trajectory analysis. A review of the classic version of DTW and its various modifications showed that, when analyzing the distance between gaze trajectories, they cannot correctly account for the duration of gaze fixations on visual stimuli. The problem has attracted little attention so far, although solving it will improve the accuracy and interpretability of the results of many experimental studies, since assessing the time of visual focus on objects is an important factor in visual analysis. Hence the need to adapt DTW to such tasks. The goal of this work is to adapt classic DTW to the problem of finding the distance between two time series with periods of low variability of values. A demonstration of the developed algorithm showed that the chosen minimum threshold of fixation duration significantly affects the result. The proposed adaptation of DTW will improve the quality of visual data analysis and can be applied to understanding the mechanisms of human perception and decision-making in fields such as psychology and marketing, as well as to developing effective methods for testing interfaces.

    Keywords: dynamic time warping algorithm, eye tracking, time series, gaze trajectory, gaze fixation duration
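
For reference, the classic DTW baseline that the article adapts can be implemented in a few lines; the fixation-duration threshold modification itself is not reproduced here.

```python
# Classic dynamic time warping: minimal-cost monotone alignment of two
# series, filled in via dynamic programming over a cumulative cost table.

def dtw(s, t, dist=lambda a, b: abs(a - b)):
    INF = float("inf")
    n, m = len(s), len(t)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i][j] = dist(s[i - 1], t[j - 1]) + min(
                D[i - 1][j],      # stretch s
                D[i][j - 1],      # stretch t
                D[i - 1][j - 1],  # advance both
            )
    return D[n][m]
```

Note that classic DTW counts a repeated value (such as a long fixation) as a free stretch, which is exactly the behavior the proposed adaptation penalizes.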

  • Comparative analysis of ResNet18 and ResNet50 neural network resilience to adversarial attacks on training sets

    This article presents a comparative analysis of the resilience of the ResNet18 and ResNet50 neural networks to adversarial attacks on training sets. The importance of securing training sets is considered in light of the growing range of artificial intelligence applications. The process of conducting an adversarial attack is described using the example of an animal recognition task. The results of two experiments are analyzed. The first experiment identified how the number of epochs required for a successful adversarial attack on the training set depends on the ResNet variant, using ResNet18 and ResNet50 as examples. The second experiment answered the question of how successful attacks on one neural network are when using images modified for the other. The analysis showed that ResNet50 is more resistant to adversarial attacks, but further improvement is still necessary.

    Keywords: artificial intelligence, computer vision, ResNet, ResNet18, ResNet50, adversarial attacks, training set, training set security, neural networks, comparative analysis

  • Language neural networks for matching text descriptions of products

    The article is devoted to the application of language neural networks for matching text descriptions of products. Methods for comparing text descriptions of products are analyzed, with the advantages and disadvantages of each noted. A matching method based on BERT neural networks is considered. Experiments and tests were carried out on datasets of text descriptions of similar goods from different retail chains. Conclusions are drawn about the matching quality of various networks of the BERT architecture.

    Keywords: neural networks, transformers, comparison of text descriptions, text analysis, BERT