
  • Design of modular supporting structures for an Arctic wind turbine

    This article examines the support structures of a wind turbine designed for operation in the extreme climatic conditions of the Russian High North. The relevance of the study is driven by the strategic objectives of developing the Arctic zone of Russia and the necessity to account for specific environmental and climatic factors in the design of energy infrastructure. A modular structural system is proposed, taking into consideration the transportation and technological constraints associated with Arctic wind turbines. A CAD model of the structural system has been developed, comprising a three-section tubular conical tower and a compound pile cap with a three-point support configuration. CAE-based simulations were conducted to evaluate the load-bearing capacity of the structural system under extreme load combinations. The results demonstrate that the proposed structural configuration meets transportation limitations while ensuring the strength and stability of the Arctic wind turbine under critical load combinations. The proposed design solution simplifies the transportation and on-site assembly of Arctic wind turbines in remote northern energy infrastructure projects.

    Keywords: Arctic wind turbine, modular structures, supporting structures, CAD modeling, CAE simulation, permafrost

  • Evaluation of the effectiveness of a data set expansion method based on deep reinforcement learning

    The article presents the results of a numerical experiment comparing the accuracy of neural-network object recognition in images under different types of data set expansion. It motivates the need for adaptive expansion approaches that minimize the use of image transformations which can reduce recognition accuracy. The author considers the common approaches of random and automatic augmentation, as well as a developed method of adaptive data set expansion based on a reinforcement learning algorithm. The operation of each approach is described, along with its advantages and disadvantages. The developed method, built on the Deep Q-Network algorithm, is described both at the level of the algorithm and at the level of the main module of the software package. Particular attention is paid to reinforcement learning and to the use of a neural network for approximating the Q-function and updating it during training, which underlies the developed method. Experimental results on the SqueezeNet v1.1 classification model show the advantage of data set expansion driven by reinforcement learning. Recognition accuracy was compared across expansion methods using identical classifier parameters, both with and without pre-trained weights. The resulting gain in accuracy over the other methods ranges from 2.91% to 6.635%.

    Keywords: dataset, extension, neural network models, classification, image transformation, data replacement
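
    The selection step of such a reinforcement-learning-driven expansion can be sketched as follows. This is a deliberately simplified, tabular stand-in for the Deep Q-Network described above: the augmentation names and their per-action accuracy gains are invented for illustration, and in the real method the reward would come from retraining the classifier.

```python
import random

# Hypothetical augmentation actions and their (assumed) validation-accuracy
# gains; in the real method the reward comes from training the classifier.
ACTIONS = {"flip": 0.2, "rotate": 0.5, "noise": 0.1}

def train_policy(episodes=500, lr=0.1, eps=0.2, seed=0):
    """Tabular stand-in for the DQN that scores augmentations by reward."""
    rng = random.Random(seed)
    q = {a: 0.0 for a in ACTIONS}
    for _ in range(episodes):
        # epsilon-greedy choice of which transformation to apply
        if rng.random() < eps:
            a = rng.choice(list(ACTIONS))
        else:
            a = max(q, key=q.get)
        r = ACTIONS[a]            # reward: accuracy gain of that choice
        q[a] += lr * (r - q[a])   # one-step Q-update (single-state problem)
    return q

q = train_policy()
best = max(q, key=q.get)
```

    After training, the policy prefers the transformation with the highest simulated accuracy gain, which is the selection behavior the adaptive method exploits.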

  • Modern tools used in the formation of intelligent control systems

    Modern intelligent control systems (ICS) are complex software and hardware systems that use artificial intelligence, machine learning, and big data processing to automate decision-making processes. The article discusses the main tools and technologies used in the development of ICS, such as neural networks, deep learning algorithms, expert systems and decision support systems. Special attention is paid to the role of cloud computing, the Internet of Things and cyber-physical systems in improving the efficiency of intelligent control systems. The prospects for the development of this field are analyzed, as well as challenges related to data security and interpretability of models. Examples of the successful implementation of ICS in industry, medicine and urban management are given.

    Keywords: intelligent control systems, artificial intelligence, machine learning, neural networks, big data, Internet of things, cyber-physical systems, deep learning, expert systems, automation

  • Application of variational principles in problems of development and testing of complex technical systems

    The technology of applying the variational principle to problems of developing and testing complex technical systems is described. Suppose a set of restrictions is imposed on random variables in the form of given statistical moments and/or upper and lower bounds on the range of their possible values. The task is to construct, knowing nothing beyond these restrictions, the probability distribution function of the determining parameter of the complex technical system under development, ultimately for assessing its efficiency. By varying a functional that includes the Shannon entropy and typical restrictions on the distribution density of the determining parameter, the main stages of constructing the density function are described. It is shown that, depending on the type of restriction, the constructed density function can have an analytical form, be expressed through special mathematical functions, or be computed numerically. Examples of applying the variational principle to find the density function are given. It is demonstrated that the variational principle yields both the distribution laws widely used in probability theory and mathematical statistics and specific distributions characteristic of the development and testing of complex technical systems. The presented technology can be used in the model of managing the self-diagnostics process of intelligent control systems with machine consciousness.

    Keywords: variational principle, distribution density function, Shannon entropy, complex technical system
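
    As a standard special case of the variational scheme described above (sketched here for illustration, not reproduced from the article): maximizing the Shannon entropy subject only to normalization and a fixed mean $m$ on $[0,\infty)$ yields the exponential law.

```latex
\max_f\; S[f] = -\int_0^\infty f(x)\ln f(x)\,dx,
\qquad
\int_0^\infty f(x)\,dx = 1,\qquad \int_0^\infty x\,f(x)\,dx = m.

% Stationarity of the Lagrangian functional in f:
\frac{\delta}{\delta f}\Bigl[-f\ln f-\lambda_0 f-\lambda_1 x f\Bigr]
  = -\ln f - 1 - \lambda_0 - \lambda_1 x = 0
\;\Longrightarrow\; f(x) = e^{-1-\lambda_0}\,e^{-\lambda_1 x}.

% Imposing the two constraints fixes the multipliers:
f(x) = \frac{1}{m}\,e^{-x/m}.
```

    Replacing the mean constraint by a fixed second moment on the whole real line yields the normal law in exactly the same way, which illustrates how the choice of restriction determines the resulting distribution family.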

  • Distribution of stresses near underground cylindrical and spherical cavities created by an explosion

    The paper considers the stress state of a rock mass with continuous inhomogeneity, of the kind observed in rock masses with cavities created by an explosion. A dependence was chosen in which the main mechanical characteristics depend on a single coordinate, the radius; this choice also yields relatively simple solution methods. The adopted calculation scheme reduces the problem to a one-dimensional one. For the centrally symmetric case, the governing equation is an ordinary inhomogeneous second-order differential equation with variable coefficients. A substitution of variables leads to a hypergeometric equation, whose solutions are given as hypergeometric series that are known to converge. The stresses are then found by the inverse substitutions. The stress state of the rock mass is determined for different degrees of inhomogeneity, with the results presented as graphs and compared with similar solutions for homogeneous masses. The results lead to the conclusion that, when solving problems on the stress state of rock masses with cavities, it is necessary to account for the inhomogeneity produced when such cavities are created by explosion.

    Keywords: heterogeneity of the medium, rock mass, spherical cavity, stress state

  • The actor model in the Elixir programming language: fundamentals and application

    The article explores the actor model as implemented in the Elixir programming language, which builds upon the principles of the Erlang language. The actor model is an approach to parallel programming in which independent entities, called actors, communicate with each other through asynchronous messages. The article details the main concepts of Elixir, such as pattern matching, data immutability, types and collections, and the mechanisms for working with actors. Special attention is paid to the practical aspects of creating and managing actors, their interaction, and their maintenance. The article will be valuable for researchers and developers interested in parallel programming and functional programming languages.

    Keywords: actor model, elixir, parallel programming, pattern matching, data immutability, processes, messages, mailbox, state, recursion, asynchrony, distributed systems, functional programming, fault tolerance, scalability
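
    For readers unfamiliar with Elixir, the core idea of an actor — a process that owns a mailbox, receives asynchronous messages, and keeps its state inside its receive loop — can be sketched in Python (an illustrative analogue only; it does not reproduce Elixir's lightweight-process semantics):

```python
import threading, queue

class Actor:
    """Minimal actor: a private mailbox drained by a single thread.

    State (here, a running total) lives only inside the loop, mirroring how
    an Elixir process keeps state in its recursive receive loop."""
    def __init__(self):
        self.mailbox = queue.Queue()
        self.result = None
        self._thread = threading.Thread(target=self._loop)
        self._thread.start()

    def _loop(self):
        total = 0
        while True:
            msg = self.mailbox.get()      # blocks, like `receive` in Elixir
            if msg == "stop":
                self.result = total
                return
            total += msg

    def send(self, msg):                  # asynchronous: never blocks sender
        self.mailbox.put(msg)

a = Actor()
for n in (1, 2, 3):
    a.send(n)
a.send("stop")
a._thread.join()
```

    The sender never waits for the receiver, and the actor's state is never shared — both properties carry over directly to Elixir processes.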

  • Stability of the horizontal flight mode in the Zhukovsky problem with constant thrust

    The paper considers the planar motion of a heavy material point in a quasi-static medium under the influence of gravity, aerodynamic forces, and thrust. The problem can be viewed as a continuation of Zhukovsky's problem of modeling the longitudinal flight of an aircraft, assuming a constant angle of attack and taking the effect of thrust into account. The equations of motion are obtained in different coordinate systems, and stationary flight modes are found. The stability of the most basic modes is investigated. A numerical solution of the equations of motion is obtained, and the behavior of the trajectories in various flight modes is examined.

    Keywords: thrust force, material point, Zhukovsky's problem
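
    The horizontal-flight equilibrium discussed above can be checked numerically. The sketch below integrates the point-mass equations in path coordinates, dV/dt = (T − D)/m − g·sin γ and dγ/dt = (L − mg·cos γ)/(mV), with assumed quadratic aerodynamic laws and illustrative coefficients chosen so that V = 10 m/s, γ = 0 is a stationary mode:

```python
import math

# Point mass in longitudinal flight at constant angle of attack; assumed
# quadratic aerodynamic laws L = kl*V^2, D = kd*V^2 with illustrative values.
m, g = 1.0, 9.8
kd, kl = 0.02, 0.098        # chosen so that V* = 10 m/s is an equilibrium
T = kd * 10.0**2            # thrust balancing drag at V*

def simulate(V0, gamma0, dt=1e-3, steps=5000):
    """Explicit Euler integration of the path-frame equations of motion."""
    V, gamma = V0, gamma0
    for _ in range(steps):
        D, L = kd * V**2, kl * V**2
        dV = (T - D) / m - g * math.sin(gamma)
        dgamma = (L - m * g * math.cos(gamma)) / (m * V)
        V += dt * dV
        gamma += dt * dgamma
    return V, gamma

V, gamma = simulate(10.0, 0.0)   # start exactly at the horizontal mode
```

    Starting exactly on the stationary mode, both derivatives vanish and the trajectory stays level, which is the behavior the stability analysis examines under perturbations.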

  • Prediction of wind flow velocity as a climatic factor in accounting for thermal processes in steady-state calculations of electric grids

    Increasing the accuracy of steady-state calculations is possible by taking into account the thermal processes occurring in electrical conductors. The wind flow velocity, in turn, is of significant importance in determining conductor temperature. In this paper, wind speed values over an 11-year period are considered. The time series is analyzed, several prediction models for the target variable are tested, and the prediction results are compared.

    Keywords: power grid mode calculation, thermal processes, wind flow velocity, prediction models, feed forward neural network, ensemble methods
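
    A minimal example of the lag-feature prediction models mentioned above (a sketch on synthetic data, not the article's models): a two-lag least-squares predictor exactly recovers the recurrence of a sinusoidal "wind speed" series, since sin(ωt) satisfies x_t = 2cos(ω)·x_{t−1} − x_{t−2}.

```python
import math

def fit_two_lags(series):
    """Least-squares AR(2) fit via the 2x2 normal equations (Cramer's rule)."""
    y  = series[2:]
    x1 = series[1:-1]
    x2 = series[:-2]
    s11 = sum(v * v for v in x1); s22 = sum(v * v for v in x2)
    s12 = sum(u * v for u, v in zip(x1, x2))
    s1y = sum(u * v for u, v in zip(x1, y))
    s2y = sum(u * v for u, v in zip(x2, y))
    det = s11 * s22 - s12 * s12
    a = (s1y * s22 - s2y * s12) / det
    b = (s11 * s2y - s12 * s1y) / det
    return a, b

# Synthetic "wind speed": pure oscillation obeying x_t = 2cos(w)x_{t-1} - x_{t-2}
w = 0.3
series = [math.sin(w * i) for i in range(200)]
a, b = fit_two_lags(series)
pred = a * series[-1] + b * series[-2]       # one-step-ahead forecast
true = math.sin(w * 200)
```

    Real wind data is of course noisy, so the article compares feed-forward networks and ensemble methods rather than a bare linear model; the lag-matrix construction, however, is the same.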

  • Control algorithm for a mechatronic station for sorting products using a computer vision system

    The paper considers the use of a computer vision system for product quality control within the control algorithm of a mechatronic sorting station, with shoe products chosen as an example. The developed system is based on machine learning methods for image recognition by segmentation. As a result, a neural network model was created and a program was written for identifying and selecting objects with a camera for subsequent sorting of defective products. The program contains three modules: an initialization module that declares all variables, models, classes, and the camera video stream; the main module, containing an inner loop over each segmented object; and a shutdown subroutine. Introducing computer vision into the control algorithm increases the efficiency and flexibility of the quality control system and improves the accuracy of measuring object parameters for subsequent sorting.

    Keywords: mechatronic station, sorting, computer vision, image segmentation, neural network training, control algorithm
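
    The three-module structure described above can be sketched as follows (detections and the defect threshold are simulated here, since the real program reads a camera stream and a trained segmentation model):

```python
# Sketch of the three-module control flow: initialization, per-object main
# loop, and shutdown. Object records stand in for segmentation output.

def initialize():
    """Module 1: declare the state the real program fills from the camera."""
    return {"accepted": [], "rejected": []}

def classify(obj):
    # Stand-in for the segmentation network's verdict (assumed threshold).
    return "defect" if obj["defect_score"] > 0.5 else "ok"

def main_loop(state, frames):
    """Module 2: inner loop over each segmented object in each frame."""
    for frame in frames:
        for obj in frame:
            bin_ = state["rejected"] if classify(obj) == "defect" else state["accepted"]
            bin_.append(obj["id"])

def shutdown(state):
    """Module 3: release resources; here, just report the final counts."""
    return len(state["accepted"]), len(state["rejected"])

frames = [[{"id": 1, "defect_score": 0.9}, {"id": 2, "defect_score": 0.1}],
          [{"id": 3, "defect_score": 0.7}]]
state = initialize()
main_loop(state, frames)
counts = shutdown(state)
```

    In the station itself, appending to the "rejected" bin would correspond to issuing an actuator command to divert the defective product.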

  • Development and Analysis of a Feature Model for Dynamic Handwritten Signature Recognition

    In this work, we present the development and analysis of a feature model for dynamic handwritten signature recognition to improve its effectiveness. The feature model is based on the extraction of both global features (signature length, average angle between signature vectors, range of dynamic characteristics, proportionality coefficient, average input speed) and local features (pen coordinates, pressure, azimuth, and tilt angle). We utilized the method of potentials to generate a signature template that accounts for variations in writing style. Experimental evaluation was conducted using the MCYT_Signature_100 signature database, which contains 2500 genuine and 2500 forged samples. We determined optimal compactness values for each feature, enabling us to accommodate signature writing variability and enhance recognition accuracy. The obtained results confirm the effectiveness of the proposed feature model and its potential for biometric authentication systems, presenting practical interest for information security specialists.

    Keywords: dynamic handwritten signature, signature recognition, biometric authentication, feature model, potential method, MCYT_Signature_100, FRR, FAR
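
    Two of the listed global features — signature length and average input speed — can be computed from sampled pen coordinates as follows (a sketch assuming an on-line signature given as (x, y, t) triples; the full feature model also uses pressure, azimuth, and tilt angle):

```python
import math

def global_features(points):
    """Total signature length and average input speed from (x, y, t) samples.

    Illustrative subset of the paper's global features, not the full model."""
    length = sum(math.hypot(x2 - x1, y2 - y1)
                 for (x1, y1, _), (x2, y2, _) in zip(points, points[1:]))
    duration = points[-1][2] - points[0][2]
    return length, length / duration

# A unit square traced in 4 seconds as a toy "signature"
pts = [(0, 0, 0.0), (1, 0, 1.0), (1, 1, 2.0), (0, 1, 3.0), (0, 0, 4.0)]
length, speed = global_features(pts)
```

    The remaining global features (average inter-vector angle, dynamic ranges, proportionality coefficient) are built from the same sampled series in a similar way.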

  • Development of Software for Calculating Formation Parameters of Functional Coatings with Specified Adhesion (Case Study: Polyisobutylene-Based Coatings)

    This paper presents the results of an investigation into the adhesion properties of release coatings based on polyisobutylene applied to metallic substrates. A software tool was developed in Microsoft Visual Studio using the C++ programming language to compute the composition and effective technological parameters for forming coatings that ensure optimal adhesion to protected surfaces. As a case study, the method of calculating the relationships between composition, temperature, and formation time is demonstrated for coatings achieving the highest adhesion, corresponding to a score of “zero” on the standardized six-point cross-cut adhesion test. It is shown that the application of the developed software enables parameter evaluation within 1–2 seconds. The computational results are experimentally validated. The morphology of the coatings was examined using optical microscopy. It was observed that no delamination occurs at the intersection points of cuts or within the grid pattern.

    Keywords: coating, adhesion, microstructure, cross-cut test, polyisobutylene, optimization

  • Using Game Theory to Model Review Manipulation on Marketplaces

    The article considers the use of game theory to model review manipulation on marketplaces. In the course of the study, a model based on evolutionary game theory was proposed that makes it possible to determine how susceptible buyers and marketplaces are to review manipulation and what benefits each participant in these relationships receives. Special attention is paid to the audit the marketplace conducts with respect to review manipulation by the seller, and to the losses the parties incur if manipulation is detected.

    Keywords: marketplace, reviews, buyer, seller, benefit
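
    The evolutionary flavor of such a model can be illustrated with discrete replicator dynamics over the share of honest sellers (the payoff constants are invented for illustration; in the article they depend on the audit probability and the losses incurred on detection):

```python
def replicator(x, f_honest, f_manip, steps=50):
    """Discrete replicator dynamics for the share x of honest sellers.

    Each step, a strategy's share grows in proportion to its payoff relative
    to the population-average payoff."""
    for _ in range(steps):
        avg = x * f_honest + (1 - x) * f_manip
        x = x * f_honest / avg
    return x

# Here audit penalties are assumed to make manipulation pay less than honesty.
share_honest = replicator(0.5, f_honest=3.0, f_manip=2.0)
```

    With the inequality reversed (weak audits, small penalties), the same dynamics drive the population toward manipulation, which is the regime the article's audit analysis addresses.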

  • The effect of data replacement and expansion using transformations on the recognition accuracy of the deep neural network ResNet-50

    The article examines how replacing the original data with transformed data affects the quality of training of deep neural network models. The author conducts four experiments to assess the impact of data substitution in tasks with small datasets: in the first, the model is trained without changes to the original data set; in the second, all images in the original set are replaced with transformed ones; in the third, the number of original images is reduced and the set is expanded with transformed images; and in the fourth, the data set is expanded so as to balance the number of images across classes.

    Keywords: dataset, extension, neural network models, classification, image transformation, data replacement

  • Development of a power supply system for an organic substrate of an integrated circuit crystal with a high-speed interface at a rate of 28.25 Gbps

    The article focuses on methods for reducing high inductance in power supply circuits using one of the IC substrate topologies with a high-speed interface as an example. The interface in question operates at a speed of 28.25 Gbit/s and imposes strict requirements on the parameters of the power supply inductance. The presented solutions are aimed at ensuring low values of power supply inductance in conditions of high layout density and power integrity requirements for modern data transfer interfaces.

    Keywords: power supply inductor, power system, low noise power supply, power supply impedance, analog power supply, serial interface, high speed interface, organic substrate, IC packaging

  • Calculation of the area of the image of a flat region using mathematical analysis methods

    The paper proposes a method for calculating the area of a flat region from a photograph using methods of mathematical analysis. The area is computed as a line integral of the second kind along the closed contour bounding the region under consideration. Defining the boundary as a Bezier spline reduces the line integral to the calculation of several definite integrals of the Bernstein basis polynomials, for which an explicit form is obtained. For a third-order Bezier spline, a formula is derived that expresses the area of the region in terms of the coordinates of the control points of the Bezier curves.

    Keywords: cubic spline, Bernstein basis polynomials, Bezier curve, Bezier spline, Green's formula, beta function, gamma function
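
    The construction can be checked numerically: by Green's formula the enclosed area equals ½∮(x dy − y dx), which the sketch below approximates with a shoelace sum over points sampled from the cubic Bezier boundary (the article derives the closed-form expression in the control points instead):

```python
def bezier_point(p, t):
    """Point of a cubic Bezier curve via the Bernstein basis polynomials."""
    u = 1 - t
    return tuple(u**3*a + 3*u**2*t*b + 3*u*t**2*c + t**3*d
                 for a, b, c, d in zip(*p))

def spline_area(segments, n=200):
    """Area enclosed by a closed cubic Bezier spline: Green's formula
    approximated by the shoelace sum over sampled boundary points."""
    pts = [bezier_point(seg, i / n) for seg in segments for i in range(n)]
    area = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

# Unit square: four degenerate cubic segments with collinear control points.
def line(p, q):
    return [p, tuple(2/3*a + 1/3*b for a, b in zip(p, q)),
               tuple(1/3*a + 2/3*b for a, b in zip(p, q)), q]

square = [line((0, 0), (1, 0)), line((1, 0), (1, 1)),
          line((1, 1), (0, 1)), line((0, 1), (0, 0))]
```

    Because the sampled points lie exactly on the square's sides, the shoelace sum reproduces the area exactly here; for curved boundaries it converges to the closed-form value as n grows.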

  • Development of a control system in REPEAT to implement S-shaped movement of a robotic snake

    The article considers a control system for implementing the S-shaped movement of a snake robot and evaluates the specifics of its use on a physical prototype. Since the snake robot is a complex composite structure, assessing and assigning a particular type of movement is one of the primary tasks in developing such a control system. The simulation model of the snake robot is implemented by mathematical modeling in the Russian software REPEAT. The simulation showed that the developed control system ensures the operability of the robotic system, the accuracy of movement, and adaptability to changing external conditions.

    Keywords: mathematical modeling, simulation model, REPEAT, snake robot, wave motion control system, torque variation
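
    The S-shaped gait itself is commonly specified by serpenoid joint set-points: each joint runs the same oscillation with a constant phase lag, producing the travelling body wave. The sketch below uses illustrative amplitude and frequency values and is not the REPEAT model:

```python
import math

def serpenoid_angles(n_joints, t, A=0.6, omega=2.0, beta=0.5, bias=0.0):
    """Joint set-points for S-shaped (serpenoid) locomotion.

    A     - oscillation amplitude per joint (rad)
    omega - temporal frequency of the body wave
    beta  - constant phase lag between neighboring joints
    bias  - steering offset (0 for straight-line motion)"""
    return [A * math.sin(omega * t + i * beta) + bias for i in range(n_joints)]

angles = serpenoid_angles(8, t=0.0)
```

    Varying the bias term steers the robot, while omega and beta trade off wave speed against the number of body waves along the snake — the quantities a torque-level control system then tracks.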

  • Adaptive convolutional neural network for detecting safety violations

    The article presents an adaptive convolutional neural network for automated real-time detection of safety violations. Unlike existing solutions based on static models, the proposed approach includes two key innovations. The first is automatic adaptation of model weights by a combination of stochastic and gradient descent methods: the algorithm dynamically adjusts the learning rate and the depth of parameter modification, which preserves previously acquired knowledge during further training on new data without degrading accuracy. The second is an optimized context-processing mechanism: the model analyzes not only objects (for example, the absence of a helmet) but also their relative location (a worker in a dangerous area without personal protective equipment), which reduces the number of false alarms. The developed system integrates computer vision, alert generation, and analytics modules, providing both an instant response to violations and long-term risk analysis. Experiments confirmed a 15% increase in accuracy under changing lighting conditions and shooting angles.

    Keywords: convolutional neural network, information system, industrial accidents, safety, production, model training, neural network, adaptive algorithm

  • Modeling Paid Parking Occupancy: A Regression Analysis Taking into Account Customer Behavior

    The article describes a methodology for constructing a regression model of the occupancy of paid parking zones that takes into account the uneven distribution of sessions during the day and the behavioral characteristics of two groups of clients; the regression model consists of two equations, one for each group. The process of creating the data model, collecting, processing, and analyzing the data, and the distribution of occupancy during the day are also described, along with a methodology for modeling a phenomenon whose distribution is bell-shaped and depends on the time of day. The results can be used by commercial enterprises managing parking lots, by city administrations, and by researchers modeling similar indicators that follow the normal distribution characteristic of many natural processes (customer flow in bank branches, replenishment and withdrawal of funds over the life of replenishable deposits, etc.).

    Keywords: paid parking, occupancy, regression model, customer behavior, behavioral segmentation, model robustness, model, forecast, parking management, distribution
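
    The bell-shaped daily profile mentioned above can be fitted by least squares in log-space, since the logarithm of a Gaussian profile is a quadratic in time. The sketch below runs on synthetic data; the article's model combines two such equations for the two client groups:

```python
import math

def solve3(A, b):
    """Tiny Gaussian elimination with partial pivoting for 3x3 systems."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_bell(hours, occupancy):
    """Fit occupancy(t) = a*exp(-(t-mu)^2/(2*sigma^2)) via a quadratic
    least-squares fit to log(occupancy): log y = c0 + c1*t + c2*t^2."""
    ys = [math.log(y) for y in occupancy]
    A = [[sum(t**(i + j) for t in hours) for j in range(3)] for i in range(3)]
    b = [sum(y * t**i for t, y in zip(hours, ys)) for i in range(3)]
    c0, c1, c2 = solve3(A, b)
    mu = -c1 / (2 * c2)                 # peak hour
    sigma = math.sqrt(-1 / (2 * c2))    # spread of the daily profile
    return mu, sigma

hours = list(range(8, 21))                                      # 08:00..20:00
occ = [100 * math.exp(-(t - 14)**2 / (2 * 4)) for t in hours]   # peak at 14:00
mu, sigma = fit_bell(hours, occ)
```

    On noisy real sessions the recovered peak hour and spread become the regression's shape parameters, one pair per behavioral client group.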

  • Software for calculating the surface characteristics of liquid media

    Software has been developed in the Microsoft Visual Studio environment to evaluate the surface characteristics of liquids, solutions, and suspensions. The module, with a user-friendly interface, requires no special skills from the user and numerically calculates the energy characteristics of a liquid in about one second: adhesion, cohesion, wetting energy, spreading coefficient, and adhesion of the liquid composition to the contact surface. Using distilled water as a test liquid and an initial liquid release lubricant of the Penta-100 series, an example of calculating the wetting of a steel surface by liquid media is demonstrated. Optical microscopy has shown that good lubrication of the steel surface ensures the formation of a homogeneous, defect-free coating. The proposed module allows an express assessment of the compatibility of liquid formulations with the protected surface and is of interest to manufacturers of paint and varnish materials for product quality control.

    Keywords: computer program, C# programming language, wetting, surface, adhesion

  • Modeling the interaction of a single abrasive grain with the surface of a part

    A review of various approaches used to model the contact interaction between the grinding wheel grain and the surface layer of the workpiece during grinding is presented. In addition, the influence of material properties, grinding parameters and grain morphology on the contact process is studied.

    Keywords: grinding, grain, contact zone, modeling, grinding wheel, indenter, micro cutting, cutting depth

  • Methods for forming quasi-orthogonal matrices based on pseudo-random sequences of maximum length

    Linear feedback shift registers (LFSRs) and the maximum-length pseudo-random sequences (m-sequences) they generate are widely used in mathematical modeling, cryptography, radar, and communications. Their popularity is due to their special properties, such as their correlation properties. An interesting property of these sequences, rarely discussed in the recent scientific literature, is the possibility of forming quasi-orthogonal matrices on their basis. In this paper, methods for generating quasi-orthogonal matrices from m-sequences are studied. The existing method, based on a cyclic shift of the m-sequence and the addition of a border to the resulting cyclic matrix, is analyzed. An alternative method is proposed, based on the relationship between m-sequences and quasi-orthogonal Mersenne and Hadamard matrices, which generates cyclic quasi-orthogonal matrices of symmetric structure without a border. A comparative analysis of the correlation properties of the matrices obtained by both methods and of the original m-sequences is performed. It is shown that the proposed method inherits the correlation properties of m-sequences, provides more efficient storage, and is potentially better suited to privacy problems.

    Keywords: orthogonal matrices, quasi-orthogonal matrices, Hadamard matrices, m-sequences
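
    The starting point of both methods — an m-sequence produced by an LFSR — can be sketched as follows. The feedback polynomial x⁴ + x + 1 and the tap convention are illustrative; any primitive polynomial of degree n yields period 2ⁿ − 1, and cyclic shifts of the resulting sequence form the rows of the cyclic matrices discussed above.

```python
def lfsr_m_sequence(taps, state):
    """One full period of an m-sequence from a Fibonacci LFSR.

    `taps` are the feedback stage numbers of a primitive polynomial
    (here x^4 + x + 1), so the output period is 2^n - 1."""
    n = len(state)
    seq = []
    for _ in range(2**n - 1):
        seq.append(state[-1])             # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]            # XOR the tapped stages
        state = [fb] + state[:-1]         # shift, feeding back fb
    return seq

seq = lfsr_m_sequence(taps=(4, 1), state=[1, 0, 0, 0])
```

    The balance property is visible immediately: a period of length 15 contains exactly 8 ones and 7 zeros, which underlies the near-orthogonality of the shifted rows.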

  • Moving from a university data warehouse to a lake: models and methods of big data processing

    The article examines the transition of universities from data warehouses to data lakes, revealing their potential in processing big data. The introduction highlights the main differences between warehouses and lakes, focusing on the difference in data management philosophy. The section "Data Sources Used by the University" describes how universities manage data collected from various departments, including ERP systems and cloud databases. The discussion of data lakes and data warehouses highlights their key differences in data processing and management methods, along with their advantages and disadvantages. The article examines in detail the problems and challenges of the transition to data lakes, including security, scale, and implementation costs. Architectural models of data lakes such as "Raw Data Lake" and "Data Lakehouse" are presented, describing different approaches to managing the data lifecycle and business goals. Big data processing methods in lakes cover the use of the Apache Hadoop platform and current storage formats. Processing technologies are described, including Apache Spark and machine learning tools, and practical examples of data processing and machine learning coordinated through Spark are proposed. In conclusion, the relevance of the transition to data lakes for universities is emphasized, security and management challenges are noted, and the use of cloud technologies is recommended to reduce costs and increase productivity in data management.

    Keywords: data warehouse, data lake, big data, cloud storage, unstructured data, semi-structured data

  • Determination of the zigzag nature of vehicle trajectories

    The paper presents a method for quantitative assessment of the zigzag character of vehicle trajectories, which makes it possible to identify potentially dangerous driver behavior. The algorithm analyzes changes of direction between trajectory segments and includes data preprocessing steps: merging of closely spaced points and trajectory simplification using a modified Ramer-Douglas-Peucker algorithm. Experiments on a balanced data set (20 trajectories) confirmed the effectiveness of the method: accuracy of 0.8, recall of 1.0, and an F1-measure of 0.833. The developed approach can be applied in traffic monitoring, accident prevention, and dangerous driving detection systems. Further research aims to improve accuracy and adapt the method to real-world conditions.

    Keywords: trajectory, trajectory analysis, zigzag, trajectory simplification, Ramer-Douglas-Peucker algorithm, YOLO, object detection
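
    The two core steps can be sketched as follows: a plain Ramer-Douglas-Peucker simplification (the article uses a modified variant) followed by counting sign changes of the turn direction. The tolerance and toy trajectories are illustrative:

```python
import math

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    norm = math.hypot(dx, dy)
    if norm == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / norm

def rdp(points, eps):
    """Classic Ramer-Douglas-Peucker polyline simplification."""
    if len(points) < 3:
        return points
    i, dmax = 0, 0.0
    for k in range(1, len(points) - 1):
        d = point_line_dist(points[k], points[0], points[-1])
        if d > dmax:
            i, dmax = k, d
    if dmax <= eps:
        return [points[0], points[-1]]
    return rdp(points[:i + 1], eps)[:-1] + rdp(points[i:], eps)

def direction_changes(points):
    """Count sign changes of the turn (cross product) along the polyline."""
    signs = []
    for (x1, y1), (x2, y2), (x3, y3) in zip(points, points[1:], points[2:]):
        cross = (x2 - x1) * (y3 - y2) - (y2 - y1) * (x3 - x2)
        if cross != 0:
            signs.append(1 if cross > 0 else -1)
    return sum(1 for s1, s2 in zip(signs, signs[1:]) if s1 != s2)

straight = [(i, 0.02 * (-1)**i) for i in range(10)]   # sensor noise, not zigzag
zigzag   = [(i, 2.0 * (-1)**i) for i in range(10)]    # genuine weaving
```

    Simplification is what separates the two cases: small jitter collapses to a straight segment with zero direction changes, while genuine weaving survives and scores many alternating turns.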

  • Queuing system with mutual assistance between channels and limited dwell time

    In this paper, a new model of an open multichannel queuing system with mutual assistance between channels and limited waiting time for a request in a queue is proposed. General mathematical dependencies for the probabilistic characteristics of such a system are presented.

    Keywords: queuing system, queue, service device, mutual assistance between channels
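
    One concrete special case of such a system can be computed directly from birth-death steady-state equations: full mutual assistance (all n channels jointly serve the head-of-line request at rate n·μ) with each waiting request abandoning at rate ν. This is an assumed special case for illustration, not the article's general dependencies:

```python
def steady_state(lam, mu, n, nu, capacity):
    """Steady-state probabilities p_k of a birth-death chain where state k
    counts requests in the system: arrivals at rate lam, joint service at
    rate n*mu, and each of the k-1 waiting requests reneging at rate nu."""
    probs = [1.0]                       # unnormalized, p_0 taken as 1
    for k in range(1, capacity + 1):
        death = n * mu + (k - 1) * nu   # total departure rate in state k
        probs.append(probs[-1] * lam / death)
    total = sum(probs)
    return [p / total for p in probs]

p = steady_state(lam=3.0, mu=1.0, n=4, nu=0.5, capacity=20)
p_idle = p[0]
mean_in_system = sum(k * pk for k, pk in enumerate(p))
```

    From these probabilities the usual characteristics (mean queue length, abandonment fraction, waiting-time distribution) follow by standard summation, which is the shape the article's general dependencies take.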

  • Tree-based diagnostic classifiers of hump retarders

    The paper addresses the problem of technical diagnostics of hump control devices, such as wagon retarders. Current analytical methods of monitoring and diagnosing the condition of wagon retarders are reviewed. The factors used in existing diagnostic systems are analyzed, and new factors to take into account are suggested, including specific pathway peculiarities, wagon group lengths, braking curve styles, initial wagon group speed, and environmental conditions. The suggested set of factors is characterized from the standpoint of regression analysis, and the replacement of some continuous factors with lexical ones is proposed. Decision-tree classifiers are suggested for classifying hump retarder conditions; they can be built with Data Mining techniques on a training set. An improved method of building decision trees is suggested, and its advantage over existing algorithms is shown on evaluation sets.

    Keywords: hump yard, wagon retarders, regression, decision trees, classification, data mining, multi-factor analysis, soft computations
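
    The core of building such classifiers on a training set is choosing, at each node, the split with the largest information gain. A minimal sketch on a toy data set with hypothetical lexical factors (the factor names and labels are invented for illustration):

```python
import math

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    counts = {}
    for l in labels:
        counts[l] = counts.get(l, 0) + 1
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def info_gain(rows, labels, feature):
    """Information gain of splitting on one lexical factor."""
    n = len(rows)
    gain = entropy(labels)
    for v in set(r[feature] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[feature] == v]
        gain -= len(idx) / n * entropy([labels[i] for i in idx])
    return gain

# Toy training set: hypothetical lexical factors and retarder conditions.
rows = [
    {"braking_curve": "steep", "weather": "dry"},
    {"braking_curve": "steep", "weather": "wet"},
    {"braking_curve": "flat",  "weather": "dry"},
    {"braking_curve": "flat",  "weather": "wet"},
]
labels = ["worn", "worn", "ok", "ok"]
best = max(rows[0], key=lambda f: info_gain(rows, labels, f))
```

    Here the braking-curve factor separates the conditions perfectly (gain of one full bit) while weather carries no information, so the tree would split on it first — the selection step an improved tree-building method refines.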