
  • Determination of the geometry of the room by the impulse response using convolutional neural networks

    Existing methods for determining the geometry of an enclosed space using echolocation assume that a large amount of additional equipment (sound sources and receivers) is present in the room. This paper investigates a method for determining the geometry of enclosed spaces using sound location that requires no a priori knowledge about the surrounding space. One sound source and one sound receiver were used to create and capture real impulse responses: a microphone served as the receiver, and a finger snap served as the source producing the impulse response. Convolutional neural networks were trained on a large dataset consisting of 48,000 impulse responses and the corresponding room geometry parameters. The trained convolutional neural network was tested on the recorded impulse responses of a real room and achieved an accuracy of 92.2% to 98.7% in estimating room size across various parameters.

    Keywords: convolutional neural networks, room geometry, echolocation, impulse response, robotics, recognition, contactless methods of measuring objects, sonar, geometry prediction, virtual reality
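
The geometric principle behind this kind of acoustic sensing can be illustrated with a heavily simplified sketch (not the paper's CNN approach): the delay between the direct sound and a reflection in an impulse response gives the round-trip distance to a reflector. All names, the sample rate, and the synthetic impulse response below are invented for the example.

```python
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C
SAMPLE_RATE = 48_000     # Hz (illustrative)

def wall_distance(impulse_response, direct_idx):
    """Estimate reflector distance from the delay between the direct sound
    and the strongest later echo in an impulse response."""
    tail = impulse_response[direct_idx + 1:]
    echo_idx = direct_idx + 1 + max(range(len(tail)), key=lambda i: abs(tail[i]))
    delay = (echo_idx - direct_idx) / SAMPLE_RATE
    return SPEED_OF_SOUND * delay / 2    # sound travels out and back

# synthetic IR: direct sound at sample 0, echo at sample 560 (wall ~2 m away)
ir = [0.0] * 1000
ir[0], ir[560] = 1.0, 0.4
d = wall_distance(ir, direct_idx=0)
```

In a real room the echoes overlap and are buried in reverberation, which is precisely why the paper resorts to a learned model instead of peak picking.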

  • A Study and Comparative Analysis of the Computational Performance of AssemblyScript

    In the world of web development, there is a growing need for tools that can provide high performance for client applications. In response to this challenge, WebAssembly technology has been developed to compile various programming languages into a binary format that can be executed in web browsers. The new AssemblyScript programming language provides the ability to create high-performance WebAssembly modules using the TypeScript language syntax familiar to Web developers. This paper investigates WebAssembly and AssemblyScript, and compares the performance of AssemblyScript and JavaScript using four computational algorithms as examples. The test results demonstrate faster execution speed of AssemblyScript in most tasks, as well as more stable performance when executed in different browsers. The study highlights the relevance of using AssemblyScript to optimize computationally intensive operations in web application development.

    Keywords: assemblyscript, webassembly, wasm, javascript, front end, performance, web applications

  • Using statistical edge-based modeling for motion detection in video

    In this article we present a novel algorithm for detecting moving objects using a stationary camera, which is based on statistical background modeling using edge segments. Traditional algorithms that rely on pixel intensity struggle in dynamic environments due to their inability to handle sudden changes in lighting conditions. These algorithms also tend to produce ghosting artifacts when a sudden change occurs in the scene. To address this issue, edge-based features that are robust to intensity variations and noise have been introduced. However, existing methods that rely on individual edge pixels suffer from scattered edge pixels and cannot effectively utilize shape information. Additionally, traditional segment-based methods struggle with variations in edge shape and may miss moving edges that are close to the background edges. In contrast to conventional approaches, our proposed method constructs the background model using regular training frames that may include moving objects. Furthermore, it avoids the generation of ghosting artifacts. Additionally, our method employs an automatic adaptive threshold for each background edge distribution to facilitate matching. This enhances the robustness of our approach to changes in illumination, camera movement, and background motion. Experimental results demonstrate that our method outperforms other techniques and efficiently detects moving edges despite the aforementioned challenges.

    Keywords: motion detection, edges, canny edge detector, gaussian of color, gaussian of gradient magnitude, normal distribution, adaptive thresholds, statistical map

  • Regression modeling of the water level of the Iya River, Irkutsk region

    Analysis of current publications shows that the problem of forecasting river flooding is topical and that floods often pose threats to health and other dangers. This article presents the computation, analysis, and development of a regression model of the water level of the Iya River. The final model corresponds to the real data with an acceptable level of accuracy, which means it could be used for real forecasting to protect people from flooding.

    Keywords: model, simulation, river, water level, flood, emergency, forecast, statistics, monitoring, analysis, iya river, Irkutsk region

  • Algorithm for fragmentation and defragmentation of formal contexts

    A combinatorial problem is considered: finding the set of all formal concepts of a formal context. The computational complexity of the problem lies in the fact that the number of formal concepts depends exponentially on the size of the initial formal context. To solve this problem, the article presents an algorithm for fragmentation and defragmentation of the formal context, based on a method of decomposing the formal context into fragments. The essence of the method is that the original formal context is divided into various fragments. The fragments have different sizes and a non-empty intersection. Each fragment is subsequently considered as a formal context and can again be subject to decomposition. As a result, a finite set of fragments is formed. Formal concepts are then found in each fragment and combined to form the desired set of all formal concepts of the formal context. The method is “non-distorting”: when dividing the context into fragments, no new formal concepts are formed and none of the sought concepts are lost. The results of computational experiments are presented, showing the effectiveness of the developed algorithm.

    Keywords: formal concept analysis, fragmentation algorithm, formal context, object-attribute table, combinatorial problem, the problem of finding the set of all formal concepts
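
The naive baseline that such fragmentation methods improve upon can be sketched in a few lines: enumerate candidate object subsets, close each one, and keep the distinct (extent, intent) pairs. This is a minimal illustrative sketch, not the authors' algorithm; the toy context `ctx` and all names are invented, and the exponential cost this brute force exhibits is exactly what decomposition into fragments is meant to mitigate.

```python
from itertools import combinations

def intent(objects, context, attributes):
    """Attributes shared by every object in the given set."""
    return {a for a in attributes if all(a in context[o] for o in objects)}

def extent(attrs, context):
    """Objects possessing every attribute in the given set."""
    return {o for o, owned in context.items() if attrs <= owned}

def formal_concepts(context):
    """Enumerate all formal concepts (extent, intent) by brute force."""
    attributes = set().union(*context.values())
    objects = list(context)
    concepts = set()
    for r in range(len(objects) + 1):
        for subset in combinations(objects, r):
            b = intent(subset, context, attributes)   # close the subset
            a = extent(b, context)
            concepts.add((frozenset(a), frozenset(b)))
    return concepts

# toy object-attribute table: 3 objects, 3 attributes
ctx = {"o1": {"m1", "m2"}, "o2": {"m2", "m3"}, "o3": {"m1", "m2", "m3"}}
cs = formal_concepts(ctx)
```

For this 3×3 context the enumeration yields 4 concepts; for a realistic context the subset loop alone is infeasible, motivating the fragment-and-merge strategy.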

  • Study of synchronization of almost-proportional and almost-periodic characteristics of time series

    In this study, an analysis of the time series was conducted using a class of shift functions for arithmetic and geometric progressions, along with their synchronization using logarithmic decrement. The closing prices of IBM company stocks were taken as the examined data for each trading day. The shift functions of geometric and arithmetic progressions revealed almost-proportions and almost-periods in the examined data. These detected patterns emphasize the importance of applying shift functions in the analysis of time series, allowing the extraction of internal patterns and periodic fluctuations that might go unnoticed with standard analysis methods. Computing the minima and corresponding values of the geometric progression enabled the identification of almost-periods in the data. These results not only confirmed visual observations but also enhanced our understanding of the internal patterns of the time series. The findings underscore the effectiveness of applying methods for analyzing time series based on almost-proportions and metric techniques. These approaches play a crucial role in uncovering hidden patterns and subtle periodicities in data, providing a fundamental foundation for more accurate analysis and successful forecasting.

    Keywords: nearly-proportionalities, synchronization of geometric progression, empirical data, geometric progression, shift functions
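
The idea of a shift function revealing almost-periods can be illustrated with a minimal sketch (not the authors' exact construction for progressions): compare the series with a shifted copy of itself and look for shifts where the mean discrepancy nearly vanishes. The synthetic series and the name `shift_function` are invented for the example.

```python
import math

def shift_function(series, max_shift):
    """Mean absolute difference between the series and its shifted copy,
    for each shift T; deep minima indicate almost-periods."""
    phi = {}
    for T in range(1, max_shift + 1):
        diffs = [abs(series[t + T] - series[t]) for t in range(len(series) - T)]
        phi[T] = sum(diffs) / len(diffs)
    return phi

# synthetic "price" series with a built-in period of 7
data = [100 + 5 * math.sin(2 * math.pi * t / 7) for t in range(70)]
phi = shift_function(data, 20)
best = min(phi, key=phi.get)   # shift with the deepest minimum
```

On real closing prices the minima are shallow rather than exact zeros, which is why the study speaks of almost-periods and almost-proportions rather than strict ones.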

  • Algorithms for automatic control of geometric parameters of steel ropes in elevator systems

    The paper considers the problem of automatically detecting defects in the geometric parameters of steel ropes of elevator systems using computer vision methods. The features of flaw detection in moving steel ropes based on video sequences are analyzed; these are associated with the fragmentation of the image of some defects across adjacent frames and with the variability of the geometric dimensions of the rope and of the defect characteristics visible to the camera due to vibrations of the rope during movement. Taking these features into account, two algorithms are proposed: one to detect thickening/thinning of the rope diameter and one to detect undulation. The paper presents the results of experimental testing of the algorithms on a special test bench and calculates the reliability of defect detection by the proposed algorithms in the form of the precision and recall of detection of each defect individually, as well as the average precision and recall of detection of both considered defects of the rope's geometric parameters as a whole.

    Keywords: steel rope defects, instrumental control, non-contact flaw detection, computer vision

  • Study of hardware implementation of neural networks when processing information in residue number system

    This article examines models of arithmetic devices for finite ring neural networks of the second and third orders. The arithmetic devices under study were synthesized on an FPGA basis. Estimates of hardware costs and performance were obtained for residue number system moduli of different bit widths. The structure of a finite ring neural network with dynamic connections is proposed, whose efficiency in terms of hardware costs grows with the bit width of the residue number system modulus. The advantage of a finite ring neural network with dynamic connections is established for moduli of 64 bits and wider.

    Keywords: neural networks, residue number system, group of elliptic curve points, FPGA, multiplier, adder
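
The residue number system the networks operate in can be illustrated with a short sketch (generic RNS arithmetic with invented small moduli, not the wide moduli of the study): each number is held as independent residues, so multiplication decomposes into small carry-free parallel channels, which is the property FPGA implementations exploit.

```python
from math import prod

MODULI = (7, 11, 13)          # pairwise-coprime RNS moduli (illustrative)

def to_rns(x):
    """Represent an integer by its residues modulo each modulus."""
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    """Carry-free multiplication: each channel operates independently."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(res):
    """Chinese remainder theorem reconstruction of the integer."""
    M = prod(MODULI)
    x = 0
    for r, m in zip(res, MODULI):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse of Mi mod m
    return x % M

a, b = 25, 37
c = from_rns(rns_mul(to_rns(a), to_rns(b)))   # valid while a*b < 7*11*13
```

Reconstruction (and comparison, sign detection, etc.) is the expensive part of RNS, which is why hardware cost is evaluated per modulus bit width.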

  • Development of a fuzzy logic controller for a process control system for membrane gas separation

    The production of nitrogen from air using membrane gas separation processes is widely used in many industries. The problem of controlling the gas separation process is associated with multi-loop control of several variables. To build a model of a gas separator, a detailed analysis of the gas separation process was carried out in this work. This article proposes a fuzzy logic controller used to coordinate the pressure and air-flow fluctuations of a gas separator. The performance of the proposed controller was evaluated in comparison with traditional controllers. The proposed fuzzy logic controller makes it possible to increase the accuracy of the gas separation control system and reduce the duration of transient processes.

    Keywords: fuzzy logic, controller, gas separation, membrane technology, nitrogen, control system
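
The general shape of a fuzzy logic controller can be sketched in a Mamdani style (illustrative only: the paper's actual rule base and membership functions for the gas separator are not reproduced here, and all names and numbers are invented): map a pressure error through triangular membership functions, fire three hypothetical rules, and defuzzify by a weighted average.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_controller(error):
    """Tiny fuzzy controller: maps a pressure error to a valve correction
    via three rules and weighted-average defuzzification."""
    # rule strengths for "negative" / "zero" / "positive" error
    neg = tri(error, -2.0, -1.0, 0.0)
    zer = tri(error, -1.0, 0.0, 1.0)
    pos = tri(error, 0.0, 1.0, 2.0)
    # output singletons: open valve (-1), hold (0), close valve (+1)
    num = neg * (-1.0) + zer * 0.0 + pos * 1.0
    den = neg + zer + pos
    return num / den if den else 0.0

u = fuzzy_controller(0.5)   # moderate positive error -> moderate correction
```

Unlike a PID loop, the response shape here is set by the membership functions and rules, which is what allows tuning for shorter transients.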

  • A method for automated formation of a training data set for machine learning algorithms for classification of electronic documents

    The article considers a method for the automated formation of a training data set for machine learning algorithms that classify electronic documents. It differs from known approaches by forming training data sets through a synthesis of clustering and data augmentation methods based on calculating distances between objects in multidimensional spaces.

    Keywords: supervised learning, clustering, pattern recognition, machine learning algorithm, electronic document, vectorization, formalized documents

  • Processing of hydroacoustic signals using wavelets

    In this article, an algorithm for processing hydroacoustic signals in the frequency domain using wavelets is considered. Arguments are given in favor of the similarity between the structure of hydroacoustic signals and that of vibration signals. The structure of hydroacoustic signals is described, and the advantage of wavelet analysis over Fourier-transform analysis is emphasized. The algorithm can be applied to estimate the spectral density using the Fourier periodogram and to estimate the energy in different frequency ranges. The hard-threshold method for smoothing coefficients is presented, along with the advantages of this approach over soft thresholding. A step-by-step algorithm for filtering the hydroacoustic signal is described. One application of the algorithm is estimating the parameters of a vibration signal using a parallel implementation of the algorithm.

    Keywords: frequency domain, spectral density, wavelets, vibration signal processing
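
The hard- versus soft-thresholding distinction can be sketched with a one-level Haar transform (an illustrative stand-in for the wavelets actually used; the signal values are invented): hard thresholding zeroes small detail coefficients and keeps the rest untouched, while soft thresholding also shrinks the survivors.

```python
def haar_step(signal):
    """One level of the Haar transform: pairwise averages and details."""
    avg = [(signal[2*i] + signal[2*i + 1]) / 2 for i in range(len(signal) // 2)]
    det = [(signal[2*i] - signal[2*i + 1]) / 2 for i in range(len(signal) // 2)]
    return avg, det

def hard_threshold(coeffs, thr):
    """Hard thresholding: zero coefficients with |c| < thr, keep the rest."""
    return [c if abs(c) >= thr else 0.0 for c in coeffs]

def soft_threshold(coeffs, thr):
    """Soft thresholding: additionally shrink survivors toward zero."""
    return [(abs(c) - thr) * (1 if c > 0 else -1) if abs(c) >= thr else 0.0
            for c in coeffs]

sig = [4.0, 4.1, 4.0, 3.9, 8.0, 8.1, 4.0, 4.0]   # noisy signal with a jump
avg, det = haar_step(sig)
den = hard_threshold(det, 0.1)   # small details treated as noise
```

The advantage of the hard threshold claimed in the article is visible here in miniature: surviving coefficients keep their full magnitude, so sharp features are not attenuated.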

  • Factor analysis of the effect of additives on the technological properties of dry building mixes

    The effect of a thickener and a setting retarder on the technological properties of the GSHS START mix is investigated. One-factor plans of a two-factor model were developed, with minimum (0.1%; 0.005%) and maximum (0.2%; 0.05%) dosage levels of the pore-forming and water-retaining additives, respectively. Regression equations of the output parameters in the form of second-degree polynomials were obtained using regression and correlation analysis of the experimental data. The values of the partial correlation coefficients are analyzed. With an increase in the dose of the water-retaining and pore-forming additives from 0.1% to 0.2% and from 0.005% to 0.05% of the binder, respectively, for all possible combinations of the thickener and setting-retarder dosage, setting time increases by 10 to 72% and sliding by 33 to 80%. The least sensitive to an increase in the water-retaining and pore-forming additives was a mixture in which the amount of thickener is 0.2% (the upper level) and the amount of retarder is 0.04% (the lower level).

    Keywords: technological properties, organizational and technological solutions, dry building mixes, functional additives, thickener, setting retarder, two-factor experiment, coefficient of determination, regression analysis, correlation analysis

  • Modeling the relationship of sediment pollution with external factors on the example of the Silinka river, Komsomolsk-on-Amur

    In this article, using the example of the Silinka River in the city of Komsomolsk-on-Amur, the influence of various factors on the formation and transport of river sediments and dissolved substances, such as the gross form of zinc, is estimated. The paper uses a multiple regression model to identify the influence of external factors on the level of zinc contamination of bottom sediments and presents the results of numerical modeling that allow changes in the "water – bottom sediments" system to be assessed under the influence of various factors. The work is important for understanding ecological processes in rivers and can be used to develop methods for managing and protecting water resources.

    Keywords: multiple regression, urban area, ecological status, mass transfer processes, water resources, bottom sediments, modeling, Silinka River, Komsomolsk-on-Amur city, ecological processes, numerical modeling, water resources management

  • Overview: Advances and Challenges in Analyzing and Diagnosing Product Defects by Digital Methods

    The article provides an overview of the analysis and diagnosis of product surface defects evaluated using digital image processing. The search for scientific publications was carried out mainly in the Scopus and RSCI scientometric databases for the period 2019-2023. The purpose of this article is to determine the best methods for assessing the destruction of materials using digital images. The main methods of processing and analyzing digital images are considered. The promise of unifying segmentation modes by digital image acquisition source, and of combining images from various recording sources to obtain objective information on the nature of material destruction, is shown. To reduce the time needed to assess the degree of destruction of materials, it is proposed to successively apply segmentation and filtering to digital images of defects in metal products, with subsequent evaluation by a neural network.

    Keywords: defect, control, digital image, neural network

  • An algorithm for tracing human movements in a video stream based on clothing recognition technologies

    Currently, tracing the movements of various objects (in particular, people) occupies a central place in video surveillance and video analytics systems. Such tracing localizes a person's position on each frame within the entire video stream and underlies many intelligent computer vision systems. The purpose of this article is to develop a new algorithm for tracing human movements in a video stream with the possibility of extracting motion trajectories. The main stages of the algorithm are: dividing the video into frames at one-second intervals, selecting the person under study in the video stream, digitally processing the frames by recognizing the clothes of the person under study and obtaining their color histogram, and predicting the localization of and recognizing the person under study in all subsequent frames of the video stream using the developed methods for forecasting the direction of the object's movement. The output of the proposed algorithm is used to form and display an overall picture of the movement of a particular person within the entire video stream. The information and materials contained in this article may be of interest to specialists and experts who, in their work, pay special attention to data processing when analyzing fragments of a video stream.

    Keywords: surveillance cameras, U2-Net neural network, rembg library, pattern recognition, clothing recognition, delta E, tracing, direction prediction, object detection, tracking, mathematical statistics, predicted area, RGB pixels

  • Improving methods for constructing extreme control systems

    A new approach to increasing the efficiency of extreme control systems by improving the method of searching for the extremum of the objective function is presented. In its multidimensional nonlinear optimization, instead of a traditional linear search along a once selected direction, an exploratory search is used, the direction of which at each step is adapted to the topology of the objective function. This makes it possible to localize an extremum as quickly as possible and significantly reduce the time of its determination. An algorithm for interpolation search for an extremum in the found interval is proposed. The objective function is modeled by a cubic spline segment based on information about its gradient vector at the boundary points of the interval, as a result of which the number of interpolation search steps is significantly reduced. The possibility of simplified nonsmooth interpolation using first-order splines in the extremum region is considered. The results of a numerical experiment confirm the high efficiency of the new method in solving various problems.

    Keywords: extremal control systems, nonlinear optimization, acceleration of extremum search, quasi-Newton method, polynomial interpolation, non-smooth interpolation
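
The cubic interpolation step described above, modeling the objective on an interval from function values and gradient information at its endpoints, can be sketched with the standard Hermite-cubic minimizer formula from line-search theory (a textbook form, e.g. Nocedal and Wright; whether the paper uses this exact formula is an assumption):

```python
import math

def cubic_min(a, b, fa, fb, ga, gb):
    """Minimizer of the cubic Hermite interpolant of f on [a, b], built from
    function values fa, fb and derivatives ga, gb at the endpoints."""
    d1 = ga + gb - 3 * (fa - fb) / (a - b)
    d2 = math.copysign(1.0, b - a) * math.sqrt(d1 * d1 - ga * gb)
    return b - (b - a) * (gb + d2 - d1) / (gb - ga + 2 * d2)

# example: f(x) = (x - 1)^2 has its minimum at x = 1
f = lambda x: (x - 1) ** 2
g = lambda x: 2 * (x - 1)
x_min = cubic_min(0.0, 2.0, f(0.0), f(2.0), g(0.0), g(2.0))
```

Because the interpolant matches both the values and the slopes at the interval ends, a single evaluation often lands very close to the true extremum, which is the source of the reduction in interpolation-search steps.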

  • Analysis of the possibility of using mechanisms as an intermediate notation when moving to BPMN

    The problem of reducing communication interaction in the chain between a natural language message and a BPMN model is considered. For this purpose, a number of authors have proposed a special notation called a mechanism. The procedure for constructing a mechanism using a given BPMN model is considered. The possibility of building a mechanism only for BPMN models that satisfy certain conditions is shown: the model must contain at least one artifact associated with one of the gateway actions; gateways should not contain more than two choices; the model should not end with a gateway; the model should not contain an AND-OR gateway. The procedure for constructing a BPMN model using a given mechanism is considered. The possibility of such a transformation is shown if the following conditions are met: the presence of a one-to-one correspondence of the elements and functions of the mechanism, the use of a single tool and a single strip in the mechanism. For models that do not satisfy these conditions, the use of the mechanism is problematic: it turns out to be either too cumbersome or too simple, which does not facilitate the simplification of communicative interaction. It is concluded that additional research is necessary in order to either improve the mechanism or use a different notation that does not have the disadvantages of the mechanism.

    Keywords: BPMN, communication, business model, modeling, mechanisms, natural language, translation into BPMN

  • Development of an algorithm for optimizing business processes of an IT company and a model of a decision support information system

    The development and implementation of decision support systems (DSS) based on modern methods of data processing, storage and analysis is an urgent task. As part of this work, an algorithm for optimizing the business processes of an IT company and a model for the functioning of a DSS were developed. The implementation of the proposed methods will improve the efficiency of IT companies.

    Keywords: decision support system, business process, optimization, algorithm, IT company, data analysis, software, program code

  • A new approach to assessing the contamination level with heavy metals in the soil-like fraction from landfills

    The paper presents a new approach to assessing the level of heavy metal contamination of the soil-like fraction from landfills by means of Monte Carlo simulation, using the example of landfills located within the borders of Volgograd. It was found that with a probability of 36.2% the contamination level of the soil-like fraction from the landfill in the Voroshilovsky district corresponds to moderately hazardous, and with a probability of 63.8% to hazardous. It is economically justified to isolate a soil-like fraction with a low level of pollution, detoxify it, and further use it in territory reclamation. For the soil-like fraction from the landfill in the Traktorozavodsky district, the pollution level was determined as extremely hazardous and hazardous with probabilities of 87.1% and 3.1%, respectively. It is shown that a useful and usable part cannot be isolated from this soil-like fraction; it must be neutralized and placed at waste disposal facilities. The presented approach is a useful instrument for assessing the pollution level of a soil-like fraction, which can increase the accuracy of the estimate and the effectiveness of managing the soil-like fraction during landfill development.

    Keywords: landfill, soil-like fraction, heavy metals, pollution level, Monte Carlo method, modeling
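
The Monte Carlo assessment idea can be sketched as follows (illustrative only: the hazard-class thresholds below follow the conventional total contamination index Zc categories, and the mean and spread of the index are invented, not the Volgograd data): sample the uncertain contamination index many times, classify each draw, and report class probabilities.

```python
import random

def classify(zc):
    """Hazard class by total contamination index Zc (conventional thresholds)."""
    if zc < 16:
        return "allowable"
    if zc < 32:
        return "moderately hazardous"
    if zc < 128:
        return "hazardous"
    return "extremely hazardous"

def monte_carlo(mean, sd, n=100_000, seed=1):
    """Estimate the probability of each hazard class when Zc is uncertain,
    here modeled (as an assumption) by a normal distribution."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n):
        cls = classify(rng.gauss(mean, sd))
        counts[cls] = counts.get(cls, 0) + 1
    return {cls: c / n for cls, c in counts.items()}

probs = monte_carlo(mean=30.0, sd=5.0)   # index straddling the 32 boundary
```

When the index sits near a class boundary, as here, a point estimate hides the risk that the simulation makes explicit as a probability split between classes.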

  • Detection of defects in extended products based on two-dimensional scanning results

    The article discusses a method for detecting defects in extended products. To find defects, scanning the product along its entire length is used. The result is a two-dimensional data stream that needs to be analyzed. The problem of detecting a defect is one of the tasks of detecting a “useful” signal against a background of “noise”. The most reliable method is to use a set of statistical criteria. To compare the mean values, the Student's test and two Wilcoxon–Mann–Whitney tests were used; to compare the scattering values, the Fisher test and the Ansari–Bradley test were used. The effectiveness of the algorithm was confirmed using a computer model simulating a two-dimensional homogeneous data stream.

    Keywords: defects, extended products, computer model, simulation, statistical criterion
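
The mean-comparison step with Student's test can be sketched as follows (a generic pooled-variance t statistic on invented scan data; the paper's actual window sizes, thresholds, and the remaining criteria are not reproduced): a scanning window whose mean departs from the background yields a large |t|.

```python
import statistics

def t_statistic(sample, background):
    """Two-sample Student's t statistic (pooled variance) for comparing
    a scanning window against the background."""
    n1, n2 = len(sample), len(background)
    m1, m2 = statistics.fmean(sample), statistics.fmean(background)
    v1, v2 = statistics.variance(sample), statistics.variance(background)
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    return (m1 - m2) / (pooled * (1 / n1 + 1 / n2)) ** 0.5

background = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]
defect_win = [12.1, 12.0, 11.9, 12.2, 12.0, 11.8, 12.1, 12.0]  # shifted mean
clean_win  = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
t_defect = t_statistic(defect_win, background)
t_clean  = t_statistic(clean_win, background)
```

Combining this with rank tests (Wilcoxon-Mann-Whitney) and scatter tests (Fisher, Ansari-Bradley), as the article does, guards against false alarms when the noise is non-normal.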

  • Assessment of the rational directions of the forest industry development of the Russian Arctic border regions based on cluster analysis

    The purpose of this study is to present scenarios for the development of the forestry complex of the Republic of Karelia and the Murmansk region. Using factor analysis and cluster analysis, 27 central forest districts of the study region were divided into 9 clusters according to 20 indicators. The selected indicators took into account the characteristics of wood resources, natural production conditions, and road infrastructure. Based on the cluster profiles, as well as topographic, climatic, soil, and vegetation maps, scenarios for the development of the forestry complex of the study region were drawn up for the resulting clusters. The results of the study show that, moving from south to north, wood resources gradually become poorer. The efforts of the state and business should be aimed at resolving issues of road infrastructure and at involving deciduous, small-sized, and energy wood in production circulation. Given the natural and production conditions, which are largely determined by moist forest soils and the extreme vulnerability of northern ecosystems, particular attention must be paid during logging to minimizing the negative impact of logging operations on the soil cover.

    Keywords: zoning, forest industry, factor analysis, cluster analysis, k-means cluster analysis, logging, forest management
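
The k-means step named in the keywords can be sketched in plain Python (a generic implementation on invented 2-D points, not the 20-indicator data of the study): alternately assign points to the nearest center and move each center to its cluster mean.

```python
def kmeans(points, centers, iters=20):
    """Plain k-means: assign points to the nearest center, recompute means."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centers[j])))
            clusters[i].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),      # one compact group
       (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]      # another, far away
centers, clusters = kmeans(pts, centers=[(0.0, 0.0), (5.0, 5.0)])
```

In the study the same mechanics operate in a 20-dimensional indicator space (typically after standardization), with the factor analysis step reducing correlated indicators first.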

  • Construction and evaluation of the effectiveness of a decision tree model for predicting student performance

    This work solves the problem of increasing the effectiveness of educational activities by predicting student performance based on external and internal factors. To solve this problem, a model for predicting student performance was built using the Python programming language. The initial data for building the decision tree model was taken from the UCI Machine Learning Repository platform and pre-processed using the Deductor Studio Academic analytical platform. The results of the model are presented and a study was conducted to evaluate the effectiveness of predicting student performance.

    Keywords: forecasting, decision tree, student performance, influence of factors, effectiveness assessment

  • Applying DIANA hierarchical clustering to improve text classification quality

    The article presents ways to improve the accuracy of the classification of normative and reference information using hierarchical clustering algorithms.

    Keywords: machine learning, artificial neural network, convolutional neural network, normative reference information, hierarchical clustering, DIANA

  • On the development of an information system for processing the results of sports competitions on the 1C:Enterprise platform

    The article describes the results of developing an information system for processing the results of sports competitions on the 1C:Enterprise platform to provide informational support for the Don Family League sports social project, which involves entire families in sports and physical education. An object model of configuration data is presented, which made it possible to structure the subject area, highlight the main application objects, their attributes, and the relationships between them; this model was later used for algorithmization of the solution and software implementation of complex tools on the 1C:Enterprise platform. The software package was tested as part of the information support of the Don Family League project from July 2022 to July 2023 and showed high efficiency.

    Keywords: All-Russian physical culture and sports complex «Ready for work and Defense», the Don Family League project, processing of results of sports competitions, rating of individual standings, rating of family standings, rating of team standings

  • Comparison of the dependence of the effectiveness of image-resolution-enhancing neural networks on format and size

    Roads have a huge impact on the life of a modern person. One of the key characteristics of a road is the quality of its surface. There are many systems for assessing the quality of the road surface. Such technologies work better with high-resolution images (HRI), because features are easier to identify on them. There are many ways to improve the resolution of photos, including neural networks. However, each neural network has certain characteristics; for example, some neural networks find it quite problematic to work with photos of a large initial size. To understand how effective a particular neural network is, a comparative analysis is needed. In this study, the average time to obtain the HRI is taken as the main indicator of effectiveness. EDSR, ESPCN, ESRGAN, FSRCNN and LapSRN were selected as the neural networks, each of which increases the width and height of the image by 4 times (the number of pixels increases by 16 times). The source material is 5 photos in 5 different sizes (141x141, 200x200, 245x245, 283x283, 316x316) in png, jpg and bmp formats. ESPCN demonstrates the best performance indicators according to the proposed methodology, and the FSRCNN neural network also shows good results. These two are therefore preferable for solving the problem of improving image resolution.

    Keywords: comparison, dependence, effectiveness, neural network, neuronet, resolution improvement, image, photo, format, size, road surface