
  • Comparative analysis of modern database management systems for distributed storage of geospatial data

    The article presents a comparative analysis of modern database management systems (PostgreSQL/PostGIS, Oracle Database, Microsoft SQL Server, and MongoDB) in the context of implementing distributed storage of geospatial information. The aim of the study is to identify the strengths and limitations of different platforms when working with heterogeneous geospatial data and to evaluate their applicability in distributed GIS solutions. The research covers three main types of data: vector, raster, and point cloud. A comprehensive set of experiments was conducted in a test environment close to real operating conditions, including functional testing, performance benchmarking, scalability analysis, and fault tolerance assessment.
    The results demonstrated that PostgreSQL/PostGIS provides the most balanced solution, showing high scalability and stable performance across all data types, which makes it a versatile platform for building GIS applications. Oracle Database exhibited strong results when processing raster data and proved effective under heavy workloads in multi-node architectures, which is especially relevant for corporate environments. Microsoft SQL Server showed reliable performance on vector data, particularly in distributed scenarios, though requiring optimization for binary storage. MongoDB proved suitable for storing raster content and metadata through GridFS, but its scalability is limited compared to traditional relational DBMS.
    In conclusion, PostgreSQL/PostGIS can be recommended as the optimal choice for projects that require universality and high efficiency in distributed storage of geospatial data, while Oracle and Microsoft SQL Server may be preferable in specialized enterprise solutions, and MongoDB can be applied in tasks where flexible metadata management is a priority.

    Keywords: geographic information system, database, postgresql, postgis, oracle database, microsoft sql server, mongodb, vector, raster, point cloud, scalability, performance, fault tolerance
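
    Since the abstract above compares spatial back ends, a minimal sketch may help illustrate the kind of vector workload being benchmarked. The snippet below times a bounding-box query against PostgreSQL/PostGIS from Python; the connection parameters, the roads table, and the query itself are illustrative assumptions, not the test suite used in the study.

    # Minimal sketch: timing a vector bounding-box query against PostgreSQL/PostGIS.
    # Connection parameters and the "roads" table are hypothetical placeholders.
    import time
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="gisdb", user="gis", password="gis")

    QUERY = """
        SELECT id, ST_AsGeoJSON(geom)
        FROM roads
        WHERE ST_Intersects(
            geom,
            ST_MakeEnvelope(%s, %s, %s, %s, 4326)  -- lon/lat bounding box, WGS84
        );
    """

    def timed_bbox_query(bbox):
        """Run the bounding-box query once and return (row_count, elapsed_seconds)."""
        start = time.perf_counter()
        with conn.cursor() as cur:
            cur.execute(QUERY, bbox)
            rows = cur.fetchall()
        return len(rows), time.perf_counter() - start

    if __name__ == "__main__":
        count, elapsed = timed_bbox_query((37.3, 55.5, 37.9, 55.9))
        print(f"{count} features in {elapsed:.3f} s")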

  • Development of a model for optimizing the management of fire and rescue units during a fire using neural networks

    The article is devoted to the development of an innovative neural network decision support system for firefighting under conditions of limited visibility. A comprehensive approach is presented, based on the integration of data from heterogeneous sensors (lidar, an ultrasonic phased array, temperature and humidity sensors). The architecture is a hybrid network that combines three-dimensional convolutional layers with bidirectional LSTM units. To improve processing quality, a cross-modal attention mechanism evaluates the physical nature and reliability of the incoming signals, and a Bayesian treatment of forecast uncertainty is provided via Monte Carlo dropout. Adaptive routing algorithms allow a quick response to changing situations. The solution significantly improves the efficiency of firefighting operations and reduces the risk to personnel.

    Keywords: mathematical model, intelligence, organizational model, gas and smoke protection service, neural networks, limited visibility, fire department, management, intelligent systems, decision support
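
    As an illustration of the kind of architecture the abstract describes (a 3D-convolutional branch fused with a bidirectional LSTM branch, attention-style modality weighting, and Monte Carlo dropout for uncertainty), the sketch below shows one possible PyTorch formulation. All layer sizes, input shapes, and the fusion scheme are assumptions made for the example; this is not the authors' model.

    # Illustrative sketch: hybrid network fusing a 3D-convolutional branch (e.g. a lidar
    # voxel grid) with a bidirectional LSTM branch (e.g. temperature/humidity time series),
    # plus Monte Carlo dropout for uncertainty estimation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class HybridFireNet(nn.Module):
        def __init__(self, series_features=4, hidden=32, n_classes=3):
            super().__init__()
            # 3D convolutional branch over a 16x16x16 occupancy grid
            self.conv = nn.Sequential(
                nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1),
            )
            # Bidirectional LSTM branch over sensor time series
            self.lstm = nn.LSTM(series_features, hidden, batch_first=True, bidirectional=True)
            # Simple attention-style weighting of the two modalities
            self.modality_score = nn.Linear(16 + 2 * hidden, 2)
            self.head = nn.Sequential(nn.Dropout(p=0.3), nn.Linear(16 + 2 * hidden, n_classes))

        def forward(self, voxels, series):
            v = self.conv(voxels).flatten(1)                  # (B, 16)
            s, _ = self.lstm(series)
            s = s[:, -1, :]                                   # (B, 2*hidden), last time step
            fused = torch.cat([v, s], dim=1)
            w = torch.softmax(self.modality_score(fused), 1)  # per-modality weights
            fused = torch.cat([v * w[:, :1], s * w[:, 1:]], dim=1)
            return self.head(fused)

    def mc_dropout_predict(model, voxels, series, n_samples=20):
        """Monte Carlo dropout: keep dropout active at inference and average predictions."""
        model.train()  # keeps nn.Dropout stochastic
        with torch.no_grad():
            probs = torch.stack([F.softmax(model(voxels, series), dim=1) for _ in range(n_samples)])
        return probs.mean(0), probs.std(0)  # predictive mean and a simple uncertainty measure

    if __name__ == "__main__":
        net = HybridFireNet()
        vox = torch.rand(2, 1, 16, 16, 16)
        ts = torch.rand(2, 50, 4)
        mean, std = mc_dropout_predict(net, vox, ts)
        print(mean.shape, std.shape)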

  • Analysis of elliptic curve algorithms and their application in information systems

    Methods of increasing the efficiency of data analysis based on topology and analytical geometry are becoming increasingly popular in modern information systems. However, because topological structures are highly complex, the main tasks of processing and storing information are solved using spatial geometry combined with modular arithmetic and the analytical specification of geometric structures, whose description underlies the development of new methods for solving optimization problems. The practical application of elliptic curve cryptography, including in network protocols, relies on interpolation methods for approximating graphs of functions, since a loss of accuracy may occur when many mathematical operations are performed sequentially. This problem stems from the computing architecture of modern devices. Since such errors can accumulate, data approximation methods must be applied successively as the calculations proceed.

    Keywords: elliptic curve, information system, data analysis, discrete logarithm, point order, scalar, subexponential algorithm
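
    A small worked example of the underlying modular arithmetic may clarify the objects discussed: the sketch below implements point addition and double-and-add scalar multiplication on a short Weierstrass curve over a tiny prime field. The curve parameters are illustrative only and have no cryptographic strength.

    # Elliptic-curve point arithmetic over a prime field: y^2 = x^3 + A*x + B (mod P).
    # The point at infinity is represented as None. Tiny toy parameters, not a real curve.
    P, A, B = 97, 2, 3

    def point_add(p1, p2):
        if p1 is None: return p2
        if p2 is None: return p1
        (x1, y1), (x2, y2) = p1, p2
        if x1 == x2 and (y1 + y2) % P == 0:
            return None                                       # p1 == -p2
        if p1 == p2:
            s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P    # tangent slope
        else:
            s = (y2 - y1) * pow(x2 - x1, -1, P) % P           # chord slope
        x3 = (s * s - x1 - x2) % P
        y3 = (s * (x1 - x3) - y1) % P
        return (x3, y3)

    def scalar_mult(k, point):
        """Double-and-add: computes k*point in O(log k) group operations."""
        result = None
        while k:
            if k & 1:
                result = point_add(result, point)
            point = point_add(point, point)
            k >>= 1
        return result

    if __name__ == "__main__":
        G = (3, 6)  # lies on the curve: 6^2 = 36 and 3^3 + 2*3 + 3 = 36 (mod 97)
        print(scalar_mult(5, G))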

  • Systematization of prospects for the development of firefighting through the prism of the theory of complex organizational systems

    The article discusses the conceptual foundations of transforming the fire extinguishing management system on the basis of the theory of complex organizational systems. The author substantiates the need to move from linear hierarchical models to adaptive, networked structures capable of providing high stability and efficiency of response under uncertainty and the dynamics of emergency situations. The compliance of the fire extinguishing system with the characteristics of a complex organizational system is analyzed, contradictions between its complex nature and primitive control mechanisms are identified, and the causes and consequences of this paradox are examined. Multi-agent digital platforms, digital twins, situation centers, and game theory methods for optimizing resource allocation and decision support are proposed as ways to address the identified problems.

    Keywords: system approach, organizational system, firefighting, network structures, management, digitalization, transformation, game theory, optimization, criteria

  • Comparison of correlation-extreme and neural network methods of aircraft guidance based on digital terrain maps

    The paper provides a comparative analysis of the accuracy of determining aircraft coordinates using the classical correlation-extreme algorithm (CEA) and a machine learning method based on a fully convolutional neural network (FCN), both operating on digital terrain maps. Two-dimensional correlated random functions are used as relief models. It is shown that the CEA is effective with small amounts of data, whereas the FCN demonstrates high noise immunity after training on representative samples. For both methods, the accuracy of coordinate determination depends on the size of the reference area, the number of reference templates, the entropy, and the correlation coefficient of the random relief.

    Keywords: correlation-extreme algorithm, deep learning, convolutional neural network, aircraft guidance, digital terrain model, Fourier filtering, spatial correlation, noise immunity, algorithm comparison, autonomous navigation, hybrid systems, terrain entropy
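
    To make the correlation-extreme idea concrete, the sketch below locates a noisy reference patch inside a synthetic terrain map by maximizing normalized cross-correlation; the map, patch size, and noise level are illustrative assumptions, not the relief models or the FCN used in the paper.

    # Correlation-extreme matching sketch: find the position of a reference patch (the
    # "observed" fragment) inside a stored terrain map by maximizing normalized correlation.
    import numpy as np

    def ncc_match(terrain, template):
        """Return ((row, col), score) for the position maximizing normalized correlation."""
        th, tw = template.shape
        t = template - template.mean()
        t_norm = np.sqrt((t ** 2).sum())
        best, best_pos = -np.inf, (0, 0)
        H, W = terrain.shape
        for i in range(H - th + 1):
            for j in range(W - tw + 1):
                w = terrain[i:i + th, j:j + tw]
                wc = w - w.mean()
                denom = np.sqrt((wc ** 2).sum()) * t_norm
                if denom == 0:
                    continue
                score = (wc * t).sum() / denom
                if score > best:
                    best, best_pos = score, (i, j)
        return best_pos, best

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        terrain = rng.normal(size=(128, 128))   # stands in for a correlated relief model
        true_pos = (40, 70)
        patch = terrain[40:40 + 16, 70:70 + 16] + rng.normal(scale=0.2, size=(16, 16))
        print(ncc_match(terrain, patch), "true:", true_pos)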

  • Using a local approach to hierarchical text classification

    The article formulates the task of hierarchical text classification, describes approaches to hierarchical classification and the metrics used to evaluate them, examines the local approach to hierarchical classification in detail, describes different variants of local hierarchical classification, presents a series of experiments on training local hierarchical classifiers with various vectorization methods, and compares the evaluation results of the trained classifiers.

    Keywords: classification, hierarchical classification, local classification, hierarchical precision, hierarchical recall, hierarchical F-measure, natural language processing, vectorization
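
    The local approach mentioned in the abstract can be illustrated with a "local classifier per parent node" toy example: one classifier chooses the top-level category and a separate classifier inside each category chooses the leaf. The data, hierarchy, and scikit-learn models below are illustrative assumptions, not the experimental setup of the article.

    # Local classifier per parent node: a root classifier plus one classifier per top category.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # (text, top-level label, leaf label) -- toy examples only
    data = [
        ("goalkeeper saved the penalty", "sport", "sport/football"),
        ("the team won the championship match", "sport", "sport/football"),
        ("grandmaster sacrificed the queen", "sport", "sport/chess"),
        ("a new opening was analysed at the tournament", "sport", "sport/chess"),
        ("the processor has eight cores", "tech", "tech/hardware"),
        ("the new gpu consumes less power", "tech", "tech/hardware"),
        ("the library simplifies web development", "tech", "tech/software"),
        ("the framework received a security update", "tech", "tech/software"),
    ]
    texts = [t for t, _, _ in data]

    # Root-level classifier chooses the top-level category
    root_clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    root_clf.fit(texts, [top for _, top, _ in data])

    # One local classifier per parent node chooses the leaf within that category
    local_clfs = {}
    for parent in {top for _, top, _ in data}:
        subset = [(t, leaf) for t, top, leaf in data if top == parent]
        clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        clf.fit([t for t, _ in subset], [leaf for _, leaf in subset])
        local_clfs[parent] = clf

    def predict_hierarchical(text):
        parent = root_clf.predict([text])[0]
        return local_clfs[parent].predict([text])[0]

    print(predict_hierarchical("the goalkeeper saved the match"))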

  • Synthesis of Kalman Filter for Asymmetric Quadcopter Control with Optimization of Covariance Matrix Ratio

    The work is devoted to the application of a linear Kalman filter (KF) for estimating the roll angle of a quadcopter with structural asymmetry, due to which the control input contains a nonzero constant component. This violates the standard assumption of zero mathematical expectation and reduces the efficiency of traditional KF implementations. A filter synthesis method is proposed based on the optimization of the ratio of the covariance matrices using a criterion that accounts for both the mean square error and the transient response time. The effectiveness of the approach is confirmed by simulation and experimental studies conducted on a setup with an IMU-6050 and an Arduino Nano. The results demonstrate that the proposed Kalman filter provides improved accuracy in estimating the angle and angular velocity, thereby simplifying its tuning for asymmetric dynamic systems.

    Keywords: Kalman filter, quadcopter with asymmetry, optimization of covariance matrices, functional with mean square error and process time, complementary filter, roll and pitch control
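
    A generic linear Kalman filter over an [angle, angular rate] state is sketched below to show where the covariance matrices enter; the Q/R values are placeholders, and the article's optimization of their ratio against a combined error/settling-time criterion is not reproduced here.

    # Minimal linear Kalman filter sketch for a roll-angle / roll-rate state.
    import numpy as np

    dt = 0.01                              # 100 Hz sample period (assumption)
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition: angle integrates rate
    H = np.array([[1.0, 0.0], [0.0, 1.0]]) # both angle and rate are measured
    Q = np.diag([1e-5, 1e-3])              # process noise covariance (placeholder)
    R = np.diag([2e-2, 1e-2])              # measurement noise covariance (placeholder)

    x = np.zeros(2)                        # state estimate [angle, angular rate]
    P = np.eye(2)                          # estimate covariance

    def kf_step(x, P, z):
        # predict
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # update
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(2) - K @ H) @ P_pred
        return x_new, P_new

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        true_angle = 0.0
        for k in range(500):
            true_rate = 0.5 + 0.1 * np.sin(0.05 * k)   # nonzero-mean rate: the "asymmetry"
            true_angle += true_rate * dt
            z = np.array([true_angle, true_rate]) + rng.normal(scale=[0.1, 0.05])
            x, P = kf_step(x, P, z)
        print("estimate:", x, "truth:", [true_angle, true_rate])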

  • Application of the Residue Number System in Text Information Processing

    The article explores the application of the residue number system in text information processing. The residue number system, based on the principles of modular arithmetic, represents numbers as sets of residues relative to pairwise coprime moduli. This approach enables parallel computation, potential data compression, and increased noise immunity. The study addresses issues such as character encoding, parallel information processing, error detection and correction, computational advantages in implementing polynomial hash functions, as well as practical limitations of the residue number system.

    Keywords: residue number system, modular arithmetic, text processing, parallel computing, data compression, noise immunity, Chinese remainder theorem, polynomial hashing, error correction, computational linguistics
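
    The encoding idea can be illustrated directly: the sketch below stores each character code as residues modulo pairwise coprime moduli and recovers it with the Chinese remainder theorem. The moduli are illustrative; any pairwise coprime set whose product exceeds the largest code point would work.

    # Residue-number-system encoding of character codes with CRT reconstruction.
    from functools import reduce

    MODULI = (251, 253, 255, 256)   # pairwise coprime; product exceeds any Unicode code point

    def to_rns(n):
        return tuple(n % m for m in MODULI)

    def from_rns(residues):
        """Chinese remainder theorem reconstruction."""
        M = reduce(lambda a, b: a * b, MODULI)
        x = 0
        for r, m in zip(residues, MODULI):
            Mi = M // m
            x += r * Mi * pow(Mi, -1, m)    # Mi^{-1} mod m
        return x % M

    def encode_text(text):
        return [to_rns(ord(ch)) for ch in text]

    def decode_text(encoded):
        return "".join(chr(from_rns(r)) for r in encoded)

    if __name__ == "__main__":
        msg = "модулярная арифметика"
        assert decode_text(encode_text(msg)) == msg
        print(encode_text("A"))   # residues of code point 65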

  • Development of an environmental monitoring portal

    The article focuses on the development of a web portal for monitoring and forecasting atmospheric air quality in the Khabarovsk Territory. The study analyzes existing solutions in the field of environmental monitoring, identifying their key shortcomings, such as the lack of real-time data, limited functionality, and outdated interfaces. The authors propose a modern solution based on the Python/Django and PostgreSQL technology stack, which enables the collection, processing, and visualization of air quality sensor data. Special attention is given to the implementation of harmful gas concentration forecasting using a recurrent neural network, as well as the creation of an intuitive user interface with an interactive map based on OpenStreetMap. The article provides a detailed description of the system architecture, including the backend, database, and frontend implementation, along with the methods used to ensure performance and security. The result of this work is a functional web portal that provides up-to-date information on atmospheric air conditions, forecast data, and user-friendly visualization tools. The developed solution demonstrates high efficiency and can be scaled for use in other regions.

    Keywords: environmental monitoring, air quality, web portal, forecasting, Django, Python, PostgreSQL, neural networks, OpenStreetMap
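
    As a rough illustration of the data layer such a portal might have, the sketch below defines hypothetical Django models for stations and sensor readings; the field names and app structure are assumptions, not the authors' schema, and the code requires a configured Django project to run.

    # Hypothetical Django models for air quality monitoring (illustrative only).
    from django.db import models

    class Station(models.Model):
        name = models.CharField(max_length=100)
        latitude = models.FloatField()
        longitude = models.FloatField()

    class Measurement(models.Model):
        station = models.ForeignKey(Station, on_delete=models.CASCADE, related_name="measurements")
        recorded_at = models.DateTimeField(db_index=True)
        co_ppm = models.FloatField(null=True)      # carbon monoxide concentration
        no2_ppm = models.FloatField(null=True)     # nitrogen dioxide concentration
        pm25 = models.FloatField(null=True)        # fine particulate matter, µg/m3

        class Meta:
            ordering = ["-recorded_at"]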

  • Physics-Informed Neural Network Based on Transformer Architecture for Time Series Forecasting in Engineering Systems

    The study addresses the problem of short-term forecasting of ice temperature in engineering systems with high sensitivity to thermal loads. A transformer-based architecture is proposed, enhanced with a physics-informed loss function derived from the heat balance equation. This approach accounts for the inertial properties of the system and aligns the predicted temperature dynamics with the supplied power and external conditions. The model is tested on data from an ice rink, sampled at one-minute intervals. A comparative analysis is conducted against baseline architectures including LSTM, GRU, and Transformer using MSE, MAE, and MAPE metrics. The results demonstrate a significant improvement in accuracy during transitional regimes, as well as robustness to sharp temperature fluctuations—particularly following ice resurfacing. The proposed method can be integrated into intelligent control loops for engineering systems, providing not only high predictive accuracy but also physical interpretability. The study confirms the effectiveness of incorporating physical knowledge into neural forecasting models.

    Keywords: short-term forecasting, time series analysis, transformer architecture, machine learning, physics-informed modeling, predictive control
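
    The physics-informed term can be illustrated with a generic lumped heat-balance residual added to the data loss; the equation, coefficients, and weighting below are assumptions for the example, since the article's exact formulation is not reproduced here.

    # Sketch of a physics-informed loss: MSE data term plus the residual of a simple lumped
    # heat balance C*dT/dt ~ -P_cool + k*(T_amb - T), evaluated on the predicted trajectory.
    import torch

    def physics_informed_loss(pred_temp, true_temp, cooling_power, ambient_temp,
                              dt=60.0, C=5.0e3, k=1.2, lam=0.1):
        """pred_temp/true_temp: (B, T) temperature sequences; cooling_power, ambient_temp: (B, T)."""
        data_loss = torch.mean((pred_temp - true_temp) ** 2)

        # finite-difference time derivative of the predicted temperature
        dT_dt = (pred_temp[:, 1:] - pred_temp[:, :-1]) / dt
        # heat-balance right-hand side on the predicted trajectory
        rhs = (-cooling_power[:, :-1] + k * (ambient_temp[:, :-1] - pred_temp[:, :-1])) / C
        physics_loss = torch.mean((dT_dt - rhs) ** 2)

        return data_loss + lam * physics_loss

    if __name__ == "__main__":
        B, T = 4, 30
        pred = torch.rand(B, T, requires_grad=True)
        true = torch.rand(B, T)
        power = torch.rand(B, T) * 100
        amb = torch.full((B, T), 18.0)
        loss = physics_informed_loss(pred, true, power, amb)
        loss.backward()
        print(float(loss))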

  • Application of modern language models for automatic transcription and analysis of audio recordings of telephone conversations between sales department employees and clients

    The article is devoted to studying the possibilities of automatic transcription and analysis of audio recordings of telephone conversations between sales department employees and clients. The relevance of the study stems from the growing volume of voice data and the need for its rapid processing in organizations whose activities are closely related to selling products or services to clients. Automatic processing of audio recordings makes it possible to check the quality of work of call center employees and to identify deviations from customer conversation scripts. The proposed software solution is based on the Whisper model for speech recognition, the pyannote.audio library for speaker diarization, and the RapidFuzz library for fuzzy string matching during analysis. An experimental study conducted with the developed software confirmed that modern language models and algorithms allow a high degree of automation of audio recording processing and can serve as a preliminary quality control tool without the involvement of a specialist. The results confirm the practical applicability of the authors' approach to quality control tasks in sales departments and call centers.

    Keywords: call center, audio file, speech recognition, transcription, speaker diarization, utterance classification, audio recording processing, Whisper, pyannote.audio, RapidFuzz
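
    A condensed sketch of such a pipeline is shown below: Whisper produces timed transcript segments, pyannote.audio assigns speakers, and RapidFuzz fuzzily checks mandatory script phrases. The model size, access token, phrase list, and matching threshold are illustrative assumptions.

    # Whisper transcription + pyannote.audio diarization + RapidFuzz script-phrase check.
    import whisper
    from pyannote.audio import Pipeline
    from rapidfuzz import fuzz

    AUDIO = "call.wav"
    SCRIPT_PHRASES = ["добрый день", "чем могу помочь", "хорошего дня"]   # example script

    # 1. Speech recognition
    asr = whisper.load_model("small")
    segments = asr.transcribe(AUDIO, language="ru")["segments"]

    # 2. Speaker diarization (requires a Hugging Face access token)
    diarizer = Pipeline.from_pretrained("pyannote/speaker-diarization", use_auth_token="HF_TOKEN")
    diarization = diarizer(AUDIO)

    def speaker_at(t):
        for turn, _, speaker in diarization.itertracks(yield_label=True):
            if turn.start <= t <= turn.end:
                return speaker
        return "UNKNOWN"

    # 3. Attach a speaker label to every transcribed segment
    transcript = [(speaker_at((s["start"] + s["end"]) / 2), s["text"].strip()) for s in segments]

    # 4. Fuzzy check of mandatory script phrases in the full transcript
    full_text = " ".join(text.lower() for _, text in transcript)
    for phrase in SCRIPT_PHRASES:
        score = fuzz.partial_ratio(phrase, full_text)
        print(f"{phrase!r}: {'found' if score >= 80 else 'missing'} (score {score})")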

  • Semantic integration and data adaptation in heterogeneous corporate information systems

    The article addresses the issues of integrating and processing heterogeneous data within a single company as well as during interaction between various participants of business processes under conditions of digital transformation. Special attention is given to collaboration between equipment manufacturers and industrial enterprises, emphasizing the importance of aligning and transforming data when interacting with heterogeneous information systems. The problem of integrating historical data, challenges arising from transitioning to new infrastructure, and a solution based on principles similar to those used by open standards such as OpenCL are discussed. Particular emphasis is placed on providing complete and consistent datasets, developing effective mechanisms for semantic integration, and using ontological approaches to address difficulties in comparing and interpreting diverse data formats. The article highlights the necessity of continuously updating metadata dictionaries and establishing connections between different data sources to ensure high-quality and reliable integration. The proposed methods aim to create sustainable mechanisms for exchanging information among multiple business entities for making informed management decisions.

    Keywords: digital transformation, heterogeneous systems, erp/mes systems, ontology, semantic integration, metadata, data mapping

  • Calculation of the coefficient of heterogeneity of a mixture when mixing bulk media, the particles of which have different sizes and shapes

    The article discusses the structure and operating principle of an improved centrifugal unit for mixing bulk materials, a special feature of which is the ability to control mixing modes. Owing to its design, selecting a rational position of the deflector makes it possible to create conditions of impact interaction between particle flows under which a high-quality homogeneous mixture is formed from components whose particles differ in size, shape, and other parameters. The coefficient of heterogeneity, whose derivation is based on a probabilistic approach, is used to characterize the resulting mixture. A computational scheme of the rarefied flow formation process is given, and an expression is derived for calculating the coefficient of heterogeneity when mixing bulk media whose particles differ in size, shape, and other parameters. The research presented in the article makes it possible not only to predict the quality of the resulting mixture but also to identify the factors that have the greatest influence on achieving the required uniformity.

    Keywords: aggregate, bulk media, mixing, coefficient of heterogeneity, concentration, design scheme, particle size
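
    For illustration, one commonly used form of the heterogeneity (variation) coefficient, computed from sampled key-component concentrations, is sketched below; the article derives its own probabilistic expression, which is not reproduced here.

    # Heterogeneity (variation) coefficient of a mixture: Vc = 100 * s / c_mean,
    # where c_i are key-component concentrations measured in n samples.
    import math

    def heterogeneity_coefficient(concentrations):
        n = len(concentrations)
        c_mean = sum(concentrations) / n
        s = math.sqrt(sum((c - c_mean) ** 2 for c in concentrations) / (n - 1))  # sample std
        return 100.0 * s / c_mean

    if __name__ == "__main__":
        samples = [0.24, 0.26, 0.25, 0.23, 0.27, 0.25]   # key-component mass fractions
        print(f"Vc = {heterogeneity_coefficient(samples):.2f} %")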

  • Efficiency of using long-span structures in industrial and civil construction

    The article discusses several methods for constructing long-span roofs from precast reinforced concrete elements and prefabricated steel structures. To systematize these design and technological solutions and determine the effectiveness of their application in terms of manufacturability, a comparative analysis was carried out. The construction technologies were compared according to the following parameters: specific and total labor intensity, level of mechanization, total number of elements, average and maximum mass of one element, total mass of the mounted elements, and the equilibrium coefficient. The analysis showed that for reinforced concrete structures, installation in blocks is the most effective: elements are pre-assembled into larger units at ground level and then lifted and installed in the design position. Precast reinforced concrete shells have a higher level of mechanization and degree of balance, which makes it possible to use crane equipment efficiently, but their considerable weight requires supporting structures and high-capacity cranes. Installing prefabricated steel structures as a single unit after preliminary assembly at ground level is the least labor-intensive option, but the need to install a large number of low-mass individual elements reduces manufacturability.

    Keywords: installation of long-span structures, installation of triple-layer rotational shells of double curvature, installation of steel beam structures, installation of a spatial structural roof unit, installation of the entire roof structure as a single unit

  • Development of a software module for automatic code generation based on UML diagrams

    The article discusses a software module developed by the authors for automatic generation of program code based on UML diagrams. The relevance of developing this module is due to the limitations of existing foreign code generation tools related to functionality, ease of use, and support for modern technologies, as well as their unavailability in the Russian Federation. The module analyzes JSON files obtained by exporting UML diagrams from the draw.io online service and converts them into code in a selected programming language (Python, C++, Java) or into DDL scripts for a DBMS (PostgreSQL, Oracle, MySQL). The Python language and the Jinja2 template engine were used as the main development tools. The operation of the software module is demonstrated using the example of a small project, "Library Management System". During the study, a series of tests was conducted on automatic code generation based on the architectures of software information systems developed by students of the Software Engineering bachelor's degree program in the discipline "Design and Architecture of Software Systems". The test results showed that the code generated using the developed module fully complies with the original UML diagrams, including the structure of classes, the relationships between them, and the configuration of the database and infrastructure (Docker Compose). The practical significance of the study is that the proposed concept of generating program code from visual UML models built in the popular online editor draw.io significantly simplifies the development of software information systems and can be used for educational purposes.

    Keywords: code generation, automation, python, jinja2, uml diagram, json, template engine, parsing, class diagram, database, deployment diagram
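
    The generation step can be illustrated with a small Jinja2 example: a class description, as it might be extracted from a draw.io JSON export, is rendered into Python code. The input structure and the template are simplified assumptions, not the module's actual schema.

    # Rendering a parsed class description into Python source with a Jinja2 template.
    from jinja2 import Template

    CLASS_TEMPLATE = Template(
        '''class {{ name }}{% if base %}({{ base }}){% endif %}:
        def __init__(self{% for a in attributes %}, {{ a.name }}: {{ a.type }}{% endfor %}):
    {% for a in attributes %}        self.{{ a.name }} = {{ a.name }}
    {% endfor %}'''
    )

    # Hypothetical result of parsing an exported draw.io diagram
    book_class = {
        "name": "Book",
        "base": None,
        "attributes": [
            {"name": "title", "type": "str"},
            {"name": "author", "type": "str"},
            {"name": "year", "type": "int"},
        ],
    }

    print(CLASS_TEMPLATE.render(**book_class))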