To ensure the safety of capital construction facilities, risks must be anticipated and predicted at the planning and design stage. Risk forecasting is carried out in both quantitative and qualitative terms. Accurate calculation requires accounting for a large number of different risks, their causes, possible consequences, and the likelihood of their occurrence. At this scale of input, traditional ways of calculating construction risks are costly in money and labour and can be very time-consuming. Artificial intelligence and machine learning technologies automate the process of risk assessment and calculation. With digital technology, all the factors arising during construction can be taken into account in real time. Despite a number of limitations in the application of this technology, this method is the most promising and is becoming increasingly widespread.
Keywords: construction, capital engineering, safety, risk, risk forecasting, risk assessment, risk management, artificial intelligence, machine learning
This study presents a comparative analysis of machine learning models used for driver classification based on microelectromechanical system (MEMS) sensor data. The research utilizes the "UAH-DriveSet" open dataset, which includes over 500 minutes of driving data with annotations for aggressive driving events, such as sudden braking, sharp turns, and rapid acceleration. The models evaluated in this study include gradient boosting algorithms, a recurrent neural network, and a convolutional neural network. Special attention is given to the impact of data segmentation parameters, specifically window size and overlap, on classification performance using the sliding window method. The effectiveness of each model was assessed using classification metrics such as accuracy, precision, and F1 score. The results show that the gradient boosting model "LightGBM" outperforms the other models in accuracy and F1 score, while the long short-term memory (LSTM) model demonstrates good performance on time-series data but requires larger datasets for better generalization. The convolutional neural network, while effective at identifying short-term patterns, struggled with class imbalance. This research provides valuable insights into selecting the most appropriate machine learning models for driver behavior classification and offers directions for future work on intelligent systems using MEMS sensor data.
Keywords: driver behavior analysis, microelectromechanical system sensors, machine learning, aggressive driving, gradient boosting, recurrent neural networks, convolutional neural networks, sliding window, driver classification
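The sliding-window segmentation whose parameters the study tunes can be sketched in a few lines. This is an illustrative Python sketch, not the authors' code; the window size and overlap values are arbitrary examples.

```python
def sliding_windows(samples, window_size, overlap):
    """Split a sensor time series into fixed-size windows.

    `samples` is a list of per-timestep readings, `window_size` the
    number of samples per window, and `overlap` the fraction (0..1)
    shared by consecutive windows.
    """
    step = max(1, int(window_size * (1 - overlap)))
    windows = []
    for start in range(0, len(samples) - window_size + 1, step):
        windows.append(samples[start:start + window_size])
    return windows

# 10-sample series, windows of 4 with 50% overlap -> step of 2
series = list(range(10))
wins = sliding_windows(series, window_size=4, overlap=0.5)
```

Each window would then be turned into a feature vector (for gradient boosting) or fed as a sequence (for the recurrent and convolutional models).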
The results of fire dynamics simulation based on the FDS software kernel constitute a large amount of data describing the dynamics of various parameters in the space of the studied object. Solving research problems with these data may require quite complex processing that goes beyond the functionality of existing software solutions. The article presents a method for increasing the efficiency of processing numerical fire dynamics simulation results by automating the relevant operations. The functional model of the developed technology and its main stages are described. The proposed method was tested on the problem of forming initial data arrays at high spatial and temporal resolution for the subsequent study of the heating of enclosing tunnel structures in case of fire. Graphs of gas-medium temperature versus coordinate at various points under the roof of the tunnel structure are presented, as well as temperature fields in the vertical section of the investigated structure in the plane passing through the fire source at different times. A comparative analysis showed that automated processing of calculation results is several orders of magnitude faster than methods that use the functionality of existing software designed to view the output of fire dynamics simulation.
Keywords: fire dynamics simulation, automation, data processing, tunnel structures, mathematical model, FDS
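As an illustration of the kind of automation discussed, the sketch below parses a device-output table in the two-header-row CSV layout that FDS device files typically use (a units row, then a column-name row, then numeric rows). This is a hypothetical minimal reader, not the authors' tool; the actual layout should be checked against the FDS version in use.

```python
import csv
import io

def read_devc(text):
    """Parse an FDS-style device CSV: row 1 holds units, row 2 column
    names, the rest numeric values. Returns name -> list of floats."""
    rows = list(csv.reader(io.StringIO(text)))
    units, names = rows[0], rows[1]
    cols = {name.strip(): [] for name in names}
    for row in rows[2:]:
        for name, val in zip(names, row):
            cols[name.strip()].append(float(val))
    return cols

sample = "s,C,C\nTime,T1,T2\n0.0,20.0,20.0\n1.0,25.5,24.0\n"
data = read_devc(sample)
```

Once the columns are in memory, arbitrary post-processing (interpolation, field assembly, plotting) can be scripted instead of done by hand in a viewer.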
The article discusses current issues related to the design of a smart home wireless local area network based on splitter-repeater modules. Special attention in the study is paid to the modules of wired and wireless hubs and switches. The results of the comparative characteristics of PLC and FBT splitter-repeaters are also presented. Particular emphasis is placed on the network topology and its main components.
Keywords: wireless network, topology, data, transmission, power, traffic, packet, failures, adapter, cable, connection
The article explores the actor model as implemented in the Elixir programming language, which builds upon the principles of the Erlang language. The actor model is an approach to parallel programming in which independent entities, called actors, communicate with each other through asynchronous messages. The article details the main concepts of Elixir, such as pattern matching, data immutability, types and collections, and mechanisms for working with actors. Special attention is paid to the practical aspects of creating and managing actors, their interaction and maintenance. This article will be valuable for researchers and developers interested in parallel programming and functional programming languages.
Keywords: actor model, elixir, parallel programming, pattern matching, data immutability, processes, messages, mailbox, state, recursion, asynchrony, distributed systems, functional programming, fault tolerance, scalability
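The actor model described above can be illustrated outside Elixir as well. The sketch below is a minimal Python analogue, assuming one thread per actor with a queue as its mailbox; it only mimics Elixir's process isolation, in that the actor's state is mutated solely by the thread that owns it.

```python
import queue
import threading

class Actor:
    """Minimal actor: a private mailbox drained by one thread, so the
    state is touched only inside the actor itself."""
    def __init__(self):
        self.mailbox = queue.Queue()
        self.state = 0
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)  # asynchronous: the sender never blocks

    def _loop(self):
        while True:
            msg = self.mailbox.get()
            if msg == "stop":
                break
            self.state += msg  # state changes only in the actor's thread

actor = Actor()
for n in (1, 2, 3):
    actor.send(n)
actor.send("stop")
actor.thread.join()
```

In Elixir the equivalent loop would be a recursive `receive` in a spawned process, with the state passed as an argument instead of mutated.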
The transition from scheduled maintenance and repair of equipment to maintenance based on its actual technical state requires new methods of data analysis based on machine learning. Modern data collection systems, such as robotic unmanned complexes, generate large volumes of graphic data in various spectra. The growth in data volume raises the task of automating their processing and analysis to identify defects in high-voltage equipment. This article analyzes the features of applying computer vision algorithms to infrared images of high-voltage equipment of power plants and substations and presents a method for their analysis that can be used to create intelligent decision support systems for technical diagnostics of equipment. The proposed method uses both deterministic algorithms and machine learning. Classical computer vision algorithms are applied for preliminary data processing to extract significant features, and unsupervised machine learning models are applied to recognize graphic images of equipment in a feature space optimized for informativeness. Image segmentation using a density-based spatial clustering algorithm that accounts for outliers makes it possible to detect and group image fragments with statistically similar distributions of line orientations. Such fragments characterize particular structural elements of the equipment. The article describes an algorithm that implements the proposed method using the example of detecting defects in current transformers, and presents a visualization of its intermediate steps.
Keywords: diversification of management, production diversification, financial and economic purposes of a diversification, technological purposes of ensuring flexibility of production
The load on data centers grows many times over each year, driven by the ever-increasing number of users of the Internet. Users access various resources and sources through search engines and services. Installing equipment that processes telecommunications traffic faster requires significant financial costs and can also significantly increase data-center downtime because of possible problems during routine maintenance. It is more expedient to focus resources on improving software rather than equipment hardware. The article provides an algorithm that can reduce the load on telecommunications equipment by searching for information within a specific subject area, as well as by exploiting the features of natural language and the way words, sentences and texts are formed in it. It is proposed to analyze the query by building a prefix tree and clustering, and by calculating the probability of occurrence of the desired word based on the three-sigma rule and Zipf's law.
Keywords: Three Sigma Rule, Zipf's Law, Clusters, Language Analysis, Morphemes, Prefix Tree, Probability Distribution
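A minimal sketch of the prefix-tree part of such a scheme, with completions ranked by frequency (here fed Zipf-like frequencies proportional to 1/rank), might look as follows. The words and frequencies are invented for illustration and are not from the article.

```python
class TrieNode:
    def __init__(self):
        self.children = {}
        self.freq = 0  # > 0 marks the end of a word; value = frequency

def insert(root, word, freq):
    node = root
    for ch in word:
        node = node.children.setdefault(ch, TrieNode())
    node.freq = freq

def complete(root, prefix):
    """Return words starting with `prefix`, most frequent first."""
    node = root
    for ch in prefix:
        if ch not in node.children:
            return []
        node = node.children[ch]
    found = []
    def walk(n, acc):
        if n.freq:
            found.append((acc, n.freq))
        for ch, child in n.children.items():
            walk(child, acc + ch)
    walk(node, prefix)
    return [w for w, _ in sorted(found, key=lambda p: -p[1])]

root = TrieNode()
# Zipf-like frequencies: the rank-k word gets roughly C/k occurrences
for rank, word in enumerate(["the", "then", "theory", "them"], start=1):
    insert(root, word, 1000 // rank)
```

Ranking completions by corpus frequency is where a Zipf-style distribution enters: a few high-rank words cover most queries, so they are proposed first.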
The article presents existing methods of reducing data dimensionality for training machine learning models for natural language. The concepts of text vectorization and word-form embedding are introduced. The text classification task is formulated, the stages of classifier training are described, and a classifying neural network is designed. A series of experiments is conducted to determine the effect of reducing the dimension of word-form embeddings on the quality of text classification. The results of evaluating the trained classifiers are compared.
Keywords: natural language processing, vectorization, word-form embedding, text classification, data dimensionality reduction, classifier
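One simple way to reduce embedding dimensionality, shown here purely as an illustration (the article does not specify this particular method), is a random Gaussian projection in the Johnson-Lindenstrauss spirit:

```python
import random

def random_projection(vectors, k, seed=0):
    """Project d-dimensional embeddings down to k dimensions with a
    fixed random Gaussian matrix (Johnson-Lindenstrauss style sketch)."""
    rng = random.Random(seed)
    d = len(vectors[0])
    # k x d projection matrix with entries scaled by 1/sqrt(k)
    P = [[rng.gauss(0, 1) / k ** 0.5 for _ in range(d)] for _ in range(k)]
    return [[sum(p[j] * v[j] for j in range(d)) for p in P] for v in vectors]

emb = [[0.1] * 50, [0.9] * 50]   # two toy 50-dimensional word-form embeddings
low = random_projection(emb, k=8)
```

The projection is linear, so relative geometry between embeddings is approximately preserved, which is exactly the property a downstream classifier relies on.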
The paper discusses the use of the M/M/n queueing (mass service) model to analyze the performance of cloud storage systems. Simulations are performed to identify the impact of system parameters on average latency, blocking probability, and throughput. The results demonstrate how optimizing the number of servers and the service intensity can improve system performance and minimize latency. The relevance of the study stems from the need to improve the performance of cloud solutions amid growing data volumes and increasing load on storage systems.
Keywords: cloud storage, mass service theory, M/M/n model, Python, modeling, performance analysis
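The steady-state formulas behind an M/M/n analysis can be sketched directly. The snippet below computes the Erlang C waiting probability and the mean queueing delay for a given arrival rate, per-server service rate, and server count; the parameter values are illustrative, not from the paper.

```python
from math import factorial

def mmn_metrics(lam, mu, n):
    """Steady-state metrics of an M/M/n queue: probability that an
    arrival must wait (Erlang C) and mean waiting time in queue.
    Requires lam < n * mu for stability."""
    a = lam / mu                      # offered load in Erlangs
    rho = a / n                       # per-server utilization
    p0 = 1 / (sum(a**k / factorial(k) for k in range(n))
              + a**n / (factorial(n) * (1 - rho)))
    erlang_c = a**n / (factorial(n) * (1 - rho)) * p0   # P(wait > 0)
    wq = erlang_c / (n * mu - lam)    # mean time spent queueing
    return erlang_c, wq

pc, wq = mmn_metrics(lam=8.0, mu=1.0, n=10)   # 80% utilization example
```

Sweeping `n` or `mu` in such a function is the analytic counterpart of the simulations described in the paper.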
This paper considers the problem of detection and classification of surface objects in low-visibility conditions such as rain and fog. The focus is on the application of state-of-the-art deep learning algorithms, in particular the YOLO architecture, to improve detection accuracy and speed. The introduction discusses the limitations imposed by visibility degradation, the change in shape and size of objects depending on the viewing angle, and the lack of training data. The paper also presents the use of the discrete wavelet transform to improve image quality and increase the robustness of the systems to adverse conditions. Experimental results show that the proposed algorithm achieves high accuracy and speed, which makes it suitable for drone video monitoring systems.
Keywords: YOLO, wavelet transform, overwater objects, drones, low visibility condition, Fourier transforms, Haar
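The discrete wavelet transform used for image enhancement can be illustrated on a 1-D signal with the Haar basis (the keywords mention Haar). This is a one-level sketch of the forward and inverse transforms, not the paper's enhancement pipeline; on images the same split is applied along rows and columns.

```python
def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform: split an
    even-length signal into approximation (low-pass) and detail
    (high-pass) coefficients."""
    s = 2 ** -0.5
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of the one-level Haar transform."""
    s = 2 ** -0.5
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

a, d = haar_dwt([4.0, 2.0, 5.0, 5.0])
```

Enhancement schemes typically attenuate or threshold the detail coefficients (where rain and fog noise concentrates) before inverting the transform.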
The paper describes the extreme filtering method and the author's approaches that adapt it to real-time operation: frame-by-frame processing and the method with value loading. Solutions are then presented that can be used to implement the above on real devices. The first solution uses the Multiprocessing library for the Python language. The second approach involves creating a client-server application and sending asynchronous POST requests to implement the frame-by-frame signal processing method. The third is also based on a client-server application, but uses the WebSocket protocol instead of HTTP. The results are then presented, and conclusions are drawn about the suitability of the author's approaches and solutions for real devices. The solution based on the WebSocket protocol is of particular interest: it suits both the frame-by-frame signal processing method and the method with value loading. All the approaches proposed by the author are workable, which is confirmed by the timing measurements and the coincidence of the graphs.
Keywords: extreme filtering, frame-by-frame signal processing method, method with value loading, Multiprocessing, HTTP, WebSocket, REST, JSON, Python, microcontrollers, single-board computers
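The frame-by-frame idea can be sketched as splitting the signal into frames and running one processing stage per frame. The per-frame stage below merely extracts local extrema as a stand-in for an extreme-filtering iteration (the real method is iterative), and the sequential map could be handed to `multiprocessing.Pool.map` as in the first solution.

```python
def process_frame(frame):
    """Placeholder per-frame stage: extract local extrema of the frame,
    standing in for one extreme-filtering pass over a signal segment."""
    return [x for i, x in enumerate(frame)
            if 0 < i < len(frame) - 1
            and (frame[i - 1] < x > frame[i + 1]
                 or frame[i - 1] > x < frame[i + 1])]

frames = [[0, 3, 1, 4, 1], [5, 2, 6, 2, 7]]
results = [process_frame(f) for f in frames]   # parallelizable map
```

In the client-server variants each `frame` would instead be the payload of a POST request or WebSocket message, with the same stage running on the server side.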
The main content of production diversification as an activity of business entities is considered; it manifests itself in the purchase of operating enterprises, the organization of new enterprises, the redistribution of investments in the interests of the organization, and the development of new production on available floor space. The most important organizational and economic targets of diversification management are presented through the innovative activity of an industrial enterprise.
Keywords: software systems, visualization, data, graphic systems, parts, models, diagrams, drawings
The article is devoted to a code designer developed for the Scilab environment, intended to automate the creation of software modules. The program generates code for Scilab through an intuitive interface, providing users with tools for working with variables, loops, graphs, system analysis and user-defined functions. The designer allows users to write programs for Scilab without knowing a programming language.
Keywords: Scilab, code designer, programming automation, code generation, visual programming
Linear feedback shift registers (LFSR) and the pseudo-random sequences of maximum length (m-sequences) they generate are widely used in mathematical modeling, cryptography, radar and communications. This wide use is due to their special properties, such as their correlation properties. An interesting property of these sequences, rarely discussed in the recent scientific literature, is the possibility of forming quasi-orthogonal matrices on their basis. This paper studies methods for generating quasi-orthogonal matrices based on pseudo-random sequences of maximum length (m-sequences). The existing method, based on a cyclic shift of the m-sequence and the addition of a border to the resulting cyclic matrix, is analyzed. An alternative method is proposed based on the relationship between m-sequences and quasi-orthogonal Mersenne and Hadamard matrices, which allows generating cyclic quasi-orthogonal matrices of symmetric structure without a border. A comparative analysis of the correlation properties of the matrices obtained by both methods and of the original m-sequences is performed. It is shown that the proposed method inherits the correlation properties of m-sequences, provides more efficient storage, and is potentially better suited for privacy problems.
Keywords: orthogonal matrices, quasi-orthogonal matrices, Hadamard matrices, m-sequences
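A minimal sketch of the building blocks follows, assuming a Fibonacci LFSR with the primitive polynomial x^3 + x + 1 and the standard 0 → +1, 1 → −1 mapping. Only the cyclic-shift matrix is shown; the bordered and borderless quasi-orthogonal constructions of the paper are not reproduced here.

```python
def lfsr_msequence(taps, state, length):
    """Generate a binary m-sequence with a Fibonacci LFSR.
    `taps` are feedback positions (1-indexed), `state` a non-zero
    initial register filling."""
    reg = list(state)
    out = []
    for _ in range(length):
        out.append(reg[-1])
        fb = 0
        for t in taps:
            fb ^= reg[t - 1]
        reg = [fb] + reg[:-1]
    return out

def circulant(seq):
    """Matrix whose rows are cyclic shifts of seq, with 0/1 mapped to
    +1/-1 as is usual in such constructions."""
    signed = [1 if b == 0 else -1 for b in seq]
    return [signed[i:] + signed[:i] for i in range(len(seq))]

# degree-3 register, primitive polynomial x^3 + x + 1 -> period 7
seq = lfsr_msequence(taps=[2, 3], state=[1, 0, 0], length=7)
M = circulant(seq)
```

The off-peak periodic autocorrelation of a length-7 m-sequence equals −1, which is the property the cyclic matrix inherits.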
The article considers options for visual programming of information support tools in software and information complexes for UAV operator training. The main criteria for systematically organizing the set of components for program code reuse are identified. An example of an unmanned payload carrier in various representative forms of visualization is given. The labor intensity of developing the specified software and information implementations for the same unmanned robotics object is compared with their normative labor intensity. Variants of content filling during the development of the same material part of the considered device are examined for various aspects of training specialists in the management and operation of UAVs. The principle of systematizing components by ordering the complexity of presentation and software implementation is shown.
Keywords: risk forecasting, information support, training of unmanned aircraft systems operators, labor intensity assessment
The paper presents a method for quantitative assessment of zigzag trajectories of vehicles, which makes it possible to identify potentially dangerous driver behavior. The algorithm analyzes changes in direction between trajectory segments and includes data preprocessing steps: merging of closely spaced points and trajectory simplification using a modified Ramer-Douglas-Peucker algorithm. Experiments on a balanced data set (20 trajectories) confirmed the effectiveness of the method: accuracy 0.8, recall 1.0, F1-measure 0.833. The developed approach can be applied in traffic monitoring, accident prevention and hazardous driving detection systems. Further research is aimed at improving accuracy and adapting the method to real-world conditions.
Keywords: trajectory, trajectory analysis, zigzag, trajectory simplification, Ramer-Douglas-Peucker algorithm, YOLO, object detection
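The Ramer-Douglas-Peucker simplification step can be sketched as follows. This is the textbook recursive version, not the authors' modified variant, and the trajectory and tolerance values are invented for illustration.

```python
def rdp(points, epsilon):
    """Ramer-Douglas-Peucker polyline simplification: if the point
    farthest from the first-last chord deviates more than epsilon,
    keep it and recurse on both halves; otherwise keep the chord."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / norm  # point-to-chord
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = rdp(points[: idx + 1], epsilon)
    right = rdp(points[idx:], epsilon)
    return left[:-1] + right

track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
simplified = rdp(track, epsilon=1.0)
```

After simplification, only the direction changes between the surviving segments need to be analyzed for zigzag behavior, which removes sensor jitter from the assessment.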
The article explores the implementation of digital and mathematical technologies in decision support systems (DSS) aimed at enhancing the efficiency of livestock enterprises. In the context of digital transformation and increasing uncertainty in agriculture, the authors emphasize the importance of intelligent DSS capable of processing large datasets and supporting rapid, evidence-based decision-making. The purpose of the study is to identify effective technological and methodological approaches for optimizing livestock management, particularly in the area of animal feeding. Methods include the use of mathematical models, predictive algorithms, automated control systems, and big data analytics. The proposed DSS architecture enables real-time monitoring, adaptive ration formulation, and integration of physiological, environmental, and economic data. The paper provides practical examples of successful DSS applications, such as automated milking systems and health monitoring technologies, and analyzes their impact on productivity and cost reduction. A set of methodological recommendations is formulated to enhance management efficiency, including modular system design, staff training, and integration of IoT and AI technologies. The article concludes that intelligent DSS not only reduce feeding costs but also improve animal health, optimize resource use, and support sustainable agricultural practices. The results are of practical significance for researchers, developers, and farm managers aiming to implement data-driven solutions in livestock production.
Keywords: diversification of management, production diversification, financial and economic purposes of a diversification, technological purposes of ensuring flexibility of production
A class of mathematical methods for code channel division has been developed based on the use of pairs of orthogonal encoding and decoding matrices, the components of which are polynomials and integers. The principles of constructing schemes for implementing code channel combining on the transmitting side and arithmetic code channel division on the receiving side of the communication system and examples of such schemes are presented. The proposed approach will significantly simplify the design of encoding and decoding devices used in space and satellite communication systems.
Keywords: telecommunications systems, telecommunications devices, multiplexing, code division of channels, matrix analysis, encoding matrices, synthesis method, orthogonal matrices, integers
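The general idea of code channel combining and division with orthogonal matrices can be illustrated with integer symbols and a Sylvester-type Hadamard matrix. This illustrates the principle only; it is not the polynomial/integer encoding-decoding matrix pairs proposed in the paper.

```python
def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = [[1]]
    while len(H) < n:
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def multiplex(symbols, H):
    """Combine one symbol per channel into a single chip sequence."""
    n = len(H)
    return [sum(symbols[ch] * H[ch][t] for ch in range(n)) for t in range(n)]

def demultiplex(chips, H, ch):
    """Recover channel `ch` by correlating with its orthogonal code row."""
    n = len(H)
    return sum(chips[t] * H[ch][t] for t in range(n)) // n

H = hadamard(4)
tx = multiplex([3, -1, 2, 5], H)          # combining on the transmit side
rx = [demultiplex(tx, H, ch) for ch in range(4)]  # division on receive
```

Because the rows of H are mutually orthogonal, the correlation at the receiver cancels every channel except the selected one, which is the essence of code channel division.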
A method is proposed for the cascaded connection of encoding and decoding devices to implement code division of channels. It is shown that increasing the number of cascading levels significantly simplifies their implementation and reduces the number of operations performed. In this case, the number of subscriber pairs that can simultaneously exchange information equals the minimum order of the encoding and decoding devices in the system. The proposed approach will significantly simplify the design of encoding and decoding devices used in space and satellite communication systems.
Keywords: telecommunications systems, telecommunications devices, multiplexing, code division of channels, orthogonal matrices, integers, cascaded connection
In this paper, a star sensor tracking method without a star library, based on the angular distance chain algorithm, is proposed to solve the problem that traditional star sensors rely on a fixed star library and require additional configuration to work with multiple units in tracking mode. The method achieves star map matching by dynamically generating angular distance chains, avoiding dependence on a global star library. Experiments show that the recognition time of the algorithm in tracking mode is reduced to milliseconds, and the maximum attitude determination error does not exceed 0.035°, which proves its effectiveness and reliability. The study provides key technical support for the development of low-cost, lightweight star sensors suitable for scenarios such as deep space exploration and near-Earth satellite clusters.
Keywords: angular distance chain algorithm, star sensor without star library, star map recognition, tracking mode, orientation, dynamic matching, deep space exploration
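The basic quantity behind an angular distance chain, the pairwise angle between star direction vectors, can be sketched as follows; the star coordinates are toy values, and the chain here is simply the sequence of angles between consecutive stars.

```python
from math import acos, degrees, sqrt

def unit(v):
    n = sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def angular_distance(v1, v2):
    """Angle in degrees between two star direction vectors; sequences
    of such pairwise angles form the links of an angular distance chain."""
    d = sum(a * b for a, b in zip(unit(v1), unit(v2)))
    return degrees(acos(max(-1.0, min(1.0, d))))   # clamp for rounding

stars = [(1, 0, 0), (0, 1, 0), (1, 1, 0)]   # toy unit-sphere directions
chain = [angular_distance(stars[i], stars[i + 1])
         for i in range(len(stars) - 1)]
```

Because angular distances are invariant to sensor attitude, matching a chain of such angles between frames identifies the same stars without consulting a catalog.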
Regression analysis, based on statistical data and their processing by special methods, is an effective tool for studying and forecasting the number of employees of structural units. In this paper, based on statistical information on 81 regional offices of the Social Fund of Russia, a regression analysis of the staffing of information protection divisions was carried out, taking into account the total area and population of the regions. It is shown that a number of regions are understaffed, while some, on the contrary, are overstaffed.
Keywords: information protection, regression model, adequacy criteria, forecasting, staffing analysis, information protection units
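A regression of staffing on regional characteristics can be illustrated with a one-predictor least-squares fit. The study itself uses several predictors and real data for 81 offices; the numbers below are invented toy values.

```python
def linreg(xs, ys):
    """Ordinary least-squares fit y = a + b*x (single-predictor sketch)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# hypothetical data: division staffing vs. region population (millions)
pop = [0.5, 1.0, 2.0, 3.5, 5.0]
staff = [3.0, 4.0, 6.0, 9.0, 12.0]
a, b = linreg(pop, staff)
predicted = a + b * 2.5   # model's staffing estimate for a 2.5M region
```

Comparing each office's actual headcount with such a model's prediction is what flags a region as understaffed or overstaffed.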
The effectiveness of advanced pavement defect detection algorithms is considered depending on the data collection devices used, such as cameras, ground-penetrating radar (GPR), LiDAR, and the IMU sensors built into smartphones. Rational use of these hardware and software tools will allow road services to identify and eliminate pavement defects in a timely manner, thereby improving road safety.
Keywords: transportation sector, pavement defects, mobile road laboratories, neural network algorithms, computer vision
The article considers the issues of developing a database of wooden architecture objects in Karelia. The database includes 1009 attributed photographs obtained as part of the comprehensive expedition of the Ministry of Culture of Karelia in 1979-1980 and the comprehensive expedition to Syamozerye in 2000-2001. The expedition research was carried out by specialists and students of Petrozavodsk State University under the supervision of Academician Vyacheslav Petrovich Orfinsky. The database was developed in the MySQL database management system. The database schema is provided, the tables are described, and an example of a photograph with attributes is presented. The database allows storing and editing materials from historical and architectural expeditions and research by employees of Petrozavodsk State University, as well as searching for photographs of wooden architecture objects in Karelia by various criteria, including date, location, type of object, and author.
Keywords: wooden architecture, database, expeditions, MySQL DBMS, photographs of wooden architecture objects
The article studies the application of neural networks with long short-term memory (LSTM) for forecasting the deposition of asphalt-resin-paraffin deposits (ARPD) during oil pumping through main oil pipelines. The authors outline the relevance of the problem of ARPD formation in main oil transportation and review modern approaches to mathematical modeling of deposit formation. The aim of the study was to develop a neural network model that constructs a graph of the distribution of ARPD along the length of a model pipeline over time. Taking into account the features of various types of neural networks and the available input data, a corresponding neural network model based on LSTM was developed. The key parameters of the "oil - pipeline - soil" system to be taken into account as initial data were determined. The developed model demonstrates sufficient forecasting accuracy and has prospects for further improvement. The results can be applied by operators of main oil transportation for more accurate forecasting and for determining the most cost-effective period for cleaning the pipeline.
Keywords: recurrent neural network, asphalt-resin-paraffin deposits, neural networks, forecasting, long short-term memory networks, oil trunk pipeline, oil transportation
A Simulink model is considered that makes it possible to calculate the transient processes of objects described by a step response for any type of input action. An algorithm is described for the S-function that performs calculations using the Duhamel integral. It is shown that, owing to the features of the S-function, it can store values from the previous step of the Simulink model calculation. This allows the input signal to be decomposed into step components, storing the time of occurrence and value of each step. For each increment of the input signal, the S-function calculates the response by scaling the step response; at each calculation step, the sum of such responses is then found. The S-function provides a procedure for freeing memory once the end point of the step response is reached, so the amount of memory required for the calculation does not grow above a certain limit and, in general, does not depend on the length of the model time. For calculations, the S-function uses matrix operations rather than loops, so the speed of model calculation is quite high. The article presents the results of calculations, gives recommendations for setting the parameters of the model, and concludes that the model can be used for calculating dynamic modes.
Keywords: simulation modeling, Simulink, step response, step function, S-function, Duhamel integral
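The superposition idea behind the S-function, decomposing the input into step increments and summing scaled step responses, can be sketched as a discrete Duhamel sum. This is an illustrative Python sketch, not the Simulink implementation; the article's loop-free matrix formulation is replaced here by explicit loops for clarity.

```python
from math import exp

def duhamel_response(h_step, u):
    """Discrete Duhamel superposition: the response to input u[k] is the
    sum of step responses h_step, each scaled by the input increment
    du[k] = u[k] - u[k-1] and shifted to its time of occurrence."""
    N = len(u)
    y = [0.0] * N
    prev = 0.0
    for k in range(N):
        du = u[k] - prev
        prev = u[k]
        if du:
            for t in range(k, N):     # the increment acts from time k on
                y[t] += du * h_step[t - k]
    return y

# first-order step response h(t) = 1 - exp(-t), sampled with dt = 1
h = [1 - exp(-t) for t in range(10)]
y = duhamel_response(h, u=[0, 1, 1, 1, 2, 2, 2, 2, 2, 2])
```

The memory-freeing trick described in the article corresponds to dropping an increment once `t - k` passes the end of the stored step response, since its contribution no longer changes.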