
  • Methodology of formation and determination of parameters of machine learning algorithms for classification of electronic documents according to the importance of information for officials of organizations

The article proposes a methodology for forming and determining the parameters of machine learning algorithms that classify electronic documents by the importance of their information for officials of an organization. It differs from known approaches in that the structure and number of machine learning algorithms are formed dynamically, through automated determination of the set of the organization's structural divisions and of the keyword sets reflecting their tasks and functions, obtained by automated analysis of the regulations on the organization and on its structural divisions, on the basis of pattern recognition theory.

    Keywords: lemmatization, pattern recognition, machine learning algorithm, electronic document, vectorization, formalized documents
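The keyword-based vectorization step can be illustrated with a minimal sketch; the function name, the sample document, and the division keyword sets below are purely hypothetical, not taken from the article.

```python
# Hypothetical sketch: representing a document as a keyword-count vector,
# one component per structural division of an organization.
def vectorize(tokens, division_keywords):
    """Count, per division, how many of its keywords occur in the document."""
    return [sum(tokens.count(w) for w in kws) for kws in division_keywords]

doc = "budget report quarterly budget audit".split()
divisions = [["budget", "audit"], ["network", "server"]]
print(vectorize(doc, divisions))  # [3, 0]: the document scores on division 1
```

A classifier per division could then be trained on such vectors, which is consistent with the dynamic formation of algorithms described above.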

  • Ethical aspects of the use of artificial intelligence systems

In modern society, problems related to the ethics of artificial intelligence (AI) are increasingly emerging. AI is used everywhere, and the lack of ethical standards and of a code of conduct makes their creation necessary to ensure the safety and comfort of users. The purpose of the work is to analyze approaches to the ethics of artificial intelligence and identify parameters for evaluating these approaches in order to create systems that meet ethical standards and user needs. Approaches to the ethics of artificial intelligence are considered, parameters for evaluating them are highlighted, and the main characteristics of each parameter are described. The parameters described in this paper will help achieve better results when creating standards for the development of safer and more user-friendly systems.

    Keywords: Code, parameters, indicators, characteristics, ethics, artificial intelligence

Comparative analysis of the dependence of the effectiveness of image quality improvement approaches on image format and size

Road surface quality assessment is one of the most widespread tasks worldwide. Many systems exist to solve it, mostly operating on images of the roadway. They work on the basis of both traditional methods (without machine learning) and machine learning algorithms. There are many ways to increase the effectiveness of such systems, including improving image quality; however, each approach has its own characteristics. For example, some of them produce an improved version of the original photo faster. The analyzed image quality improvement methods are noise reduction, histogram equalization, sharpening and smoothing. The main indicator of effectiveness in this study is the average time to obtain an improved image. The source material is 10 different photos of the road surface in 5 sizes (447x447, 632x632, 775x775, 894x894, 1000x1000) in png, jpg and bmp formats. The best performance indicator according to the methodology proposed in the study was demonstrated by the histogram equalization approach, with the sharpening method showing a comparable result.

    Keywords: comparison, analysis, dependence, effectiveness, approach, quality improvement, image, photo, format, size, road surface
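One of the compared methods, histogram equalization, can be sketched without any imaging library; the nested lists below stand in for an 8-bit grayscale image and are illustrative only.

```python
# Minimal sketch of histogram equalization on an 8-bit grayscale image.
def equalize(img, levels=256):
    flat = [p for row in img for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # cumulative distribution function of pixel intensities
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    def remap(p):
        # stretch the CDF so intensities cover the full dynamic range
        return round((cdf[p] - cdf_min) / (n - cdf_min) * (levels - 1))
    return [[remap(p) for p in row] for row in img]

print(equalize([[0, 128], [128, 255]]))  # [[0, 170], [170, 255]]
```

Timing such a function over images of different sizes and formats would mirror the average-time methodology described in the abstract.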

  • Model of configuration of structural and functional characteristics of departmental information systems

This paper considers the conditions and factors affecting the security of information systems functioning under network reconnaissance. The developed model is based on techniques that dynamically change the domain names, network addresses and ports of the information system's network devices and of the false network information objects functioning as part of it. The research problem is formalized. The theoretical basis of the developed model is the theory of probability and random processes. The modeled target system is represented as a semi-Markov process specified by a directed graph. Results are presented for the probabilistic-temporal characteristics of the target system depending on the actions of network reconnaissance; they make it possible to determine the adjustment mode of the developed protection measures and to evaluate the security of the target system under different operating conditions.

    Keywords: departmental information system, network intelligence, structural and functional characterization, false network information object

  • Large data deduplication using databases

Today, a huge amount of heterogeneous information passes through electronic computing systems. There is a critical need to analyze an endless stream of data with limited means, and this in turn requires structuring the information. One of the steps in solving the data ordering problem is deduplication. This article discusses a method of removing duplicates using databases and analyzes test results for various database management systems with different parameter sets.

    Keywords: deduplication, database, field, row, text data, artificial neural network, sets, query, software, unstructured data
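Database-side deduplication of the kind the abstract describes can be sketched with the standard sqlite3 module; the table and column names here are illustrative, not taken from the article.

```python
# Sketch of database-side deduplication: keep the first physical row of each
# duplicate group and delete the rest, entirely inside the database engine.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE docs (body TEXT)")
con.executemany("INSERT INTO docs VALUES (?)",
                [("alpha",), ("beta",), ("alpha",), ("gamma",), ("beta",)])
con.execute("""DELETE FROM docs
               WHERE rowid NOT IN (SELECT MIN(rowid) FROM docs GROUP BY body)""")
rows = [r[0] for r in con.execute("SELECT body FROM docs ORDER BY rowid")]
print(rows)  # ['alpha', 'beta', 'gamma']
```

The same `GROUP BY`-based pattern transfers to other DBMSs, which is what makes cross-DBMS timing comparisons of this operation meaningful.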

  • Artificial intelligence: the danger of inflated expectations

    Currently, digitalization as a technological tool penetrates into the humanitarian sphere of knowledge, linking technocratic and humanitarian industries. An example is legal informatics, in which conceptual devices of quite different – at first glance – areas of human knowledge are interfaced. However, the desire to abstract (formalize) any knowledge is the most important task in the "convergence" of computer technologies and mathematical methods into a non-traditional humanitarian sphere for them. The paper discusses the problems generated by the superficial idea of artificial intelligence. A typical example is the attempt of some authors in jurisprudence to give computer technologies, often referred to as artificial intelligence by humanitarians, an almost sacred meaning and endow it with legal personality.

    Keywords: artificial intelligence, deep learning, machine learning, hybrid intelligence, adaptive behavior, digital economy, digital law, legal personality of artificial intelligence

  • Implementation of a competition for regression models in assessing the amount of social and pension funding

Social and pension provision are key processes in the activity of any state, and forecasting their expenses is among the most important problems in the economy. The task of evaluating the effectiveness of a pension fund has been solved by various methods, including regression analysis. It is particularly difficult because of the large number of factors determining a pension fund's activity, such as the number of recipients of old-age pensions, the number of policyholders, self-employed policyholders, recipients of benefits, insured persons and working pensioners. The method of a model competition was applied as the main research approach. Variants that violated the meaningful interpretation of the variables or did not fully reflect the behavior of the modeled process were excluded from the resulting set of alternative models. The final variant was selected using a multi-criteria selection method. It is revealed that the use of relative variables is important for qualitative modeling of the studied processes. The resulting model shows that an increase in the ratio of the number of employers and self-employed to the number of insured persons leads to a decrease in the cost of financing social and pension provision. The model can be effectively used for short-term forecasting of the total annual volume of financing of a pension fund department under changing social and macroeconomic factors.

    Keywords: pension fund, regression model, model competition, adequacy criteria, forecasting
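The model-competition idea can be sketched in miniature: fit candidate regressions, exclude variants whose coefficient sign contradicts the meaning of the variable, and keep the best of the rest. All data and variable names below are hypothetical, and real competitions use multi-criteria selection rather than the single residual criterion used here.

```python
# Toy sketch of a "model competition" over one-factor linear regressions.
def fit_1d(x, y):
    """Ordinary least squares y = a + b*x; returns (a, b, sse)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return a, b, sse

def competition(y, candidates, expected_signs):
    best = None
    for name, x in candidates.items():
        a, b, sse = fit_1d(x, y)
        if b * expected_signs[name] <= 0:   # meaningless sign: exclude variant
            continue
        if best is None or sse < best[1]:
            best = (name, sse)
    return best[0]

y = [10, 9, 8, 7]                                    # hypothetical funding series
candidates = {"ratio": [1, 2, 3, 4], "noise": [2, 1, 4, 3]}
print(competition(y, candidates, {"ratio": -1, "noise": 1}))  # ratio
```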

Using a segment tree in PostgreSQL

The article considers an approach to optimizing the speed of aggregating queries over a continuous range of rows of a PostgreSQL database table. A program module based on PostgreSQL Extensions is created that builds a segment tree for a table and serves queries against it. Query speed is increased by more than 80 times for a table of 100 million records compared to existing solutions.

Keywords: PostgreSQL, segment tree, query, aggregation, optimization, PostgreSQL Extensions, asymptotics, index, build, get, insert
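For reference, the data structure the extension builds can be sketched in a few lines; this is a generic range-sum segment tree, not the article's PostgreSQL implementation.

```python
# Reference sketch of a segment tree for range-sum queries in O(log n),
# using the iterative bottom-up array layout.
class SegmentTree:
    def __init__(self, data):
        self.n = len(data)
        self.t = [0] * self.n + list(data)
        for i in range(self.n - 1, 0, -1):
            self.t[i] = self.t[2 * i] + self.t[2 * i + 1]

    def query(self, l, r):
        """Sum over the half-open index range [l, r)."""
        res, l, r = 0, l + self.n, r + self.n
        while l < r:
            if l & 1: res += self.t[l]; l += 1
            if r & 1: r -= 1; res += self.t[r]
            l //= 2; r //= 2
        return res

    def update(self, i, value):
        """Point assignment with upward recomputation."""
        i += self.n
        self.t[i] = value
        while i > 1:
            i //= 2
            self.t[i] = self.t[2 * i] + self.t[2 * i + 1]

st = SegmentTree([1, 2, 3, 4, 5])
print(st.query(1, 4))  # 9
```

The logarithmic query cost is what makes the reported speed-up over linear-scan aggregation plausible at 100 million rows.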

  • Modification of the algorithm for correcting errors that occur during the operation of the satellite authentication system

Orthogonal frequency-division multiplexing (OFDM) methods have become the basis of most broadband systems. These methods have also found application in modern low-orbit satellite internet systems (LOSIS). For example, the StarLink system uses OFDM transmission with a signal frame consisting of 52 channels. One way to increase the data rate in OFDM is to replace the Fourier transform (FT) with a faster orthogonal transform. The modified Haar wavelet transform (MWT) was chosen for this role: it reduces the number of arithmetic operations in the orthogonal signal transformation compared to the FT. The use of integer algebraic systems, such as Galois fields and modular residue class codes (MRCC), makes it possible to increase the speed of the computing device that performs the orthogonal transformations. Obviously, the transition to new algebraic systems should lead to changes in the structure of OFDM systems. Therefore, developing structural models of an OFDM transmission system using the Haar MWT in a Galois field and in the MRCC is an urgent task, and the aim of the work is to develop structural models of wireless OFDM systems using a modified integer discrete Haar transform, which can reduce the execution time of the orthogonal signal transformation and, in turn, increase the data transfer rate in LOSIS.

Keywords: orthogonal frequency multiplexing, modification of the Haar wavelet transform, structural models of execution of the Haar MWT, Galois field, modular residue class codes
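The arithmetic advantage of Haar-type transforms comes from their structure: one decomposition level needs only pairwise sums and differences, no multiplications. The sketch below shows a generic integer Haar step, not the article's specific modification.

```python
# One level of an integer Haar-type transform: additions only.
def haar_step(x):
    """Return (sums, differences) of adjacent sample pairs; len(x) even."""
    sums  = [x[i] + x[i + 1] for i in range(0, len(x), 2)]
    diffs = [x[i] - x[i + 1] for i in range(0, len(x), 2)]
    return sums, diffs

def haar_inverse_step(sums, diffs):
    """Exact inverse for integer inputs, since s + d and s - d are even."""
    out = []
    for s, d in zip(sums, diffs):
        out += [(s + d) // 2, (s - d) // 2]
    return out

print(haar_step([4, 2, 5, 5]))  # ([6, 10], [2, 0])
```

Because every operation is an integer addition or subtraction, the step maps directly onto modular arithmetic in a Galois field or a residue number system, which is the combination the abstract pursues.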

  • Estimating the power consumption of wireless sensor network nodes

The article proposes an algorithm for ensuring minimum power consumption of end nodes in a wireless sensor network. A simulation model of the information exchange process in a wireless sensor network, developed in the Matlab Simulink environment, is presented; it allows estimating the total power consumption of all end nodes of the network when transmitting messages during a given time interval.

Keywords: wireless sensor network, LoRaWAN, Internet of Things, IoT, power consumption, simulation model, Simulink, signal attenuation, frame transmission

  • Analysis of floating point calculations on microcontrollers

The article discusses methods for optimizing floating-point calculations on microcontroller devices. Hardware and software methods for accelerating calculations are considered. The Karatsuba and Schönhage-Strassen multiplication algorithms are given. A method for replacing floating-point calculations with integer calculations is proposed, and a way to use fixed-point instead of floating-point representation is described. The option of using cache memory and code optimization is considered. The results of measuring calculations on an AVR microcontroller are presented.

    Keywords: floating point calculations, fixed point calculations, microcontroller, AVR, ARM
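The fixed-point replacement can be sketched in a generic Q16.16 format (a common choice, assumed here rather than taken from the article): real numbers are stored as integers scaled by 2^16, so multiplication and division reduce to integer operations plus shifts.

```python
# Sketch of Q16.16 fixed-point arithmetic.
FRAC_BITS = 16
ONE = 1 << FRAC_BITS          # scaling factor 2**16

def to_fixed(x):  return int(round(x * ONE))
def to_float(q):  return q / ONE
def fx_mul(a, b): return (a * b) >> FRAC_BITS   # rescale after integer multiply
def fx_div(a, b): return (a << FRAC_BITS) // b  # pre-scale before integer divide

a, b = to_fixed(3.5), to_fixed(2.0)
print(to_float(fx_mul(a, b)))   # 7.0
print(to_float(fx_div(a, b)))   # 1.75
```

On an MCU without a floating-point unit, such shifts and integer multiplies are exactly the operations the hardware executes cheaply, which is the source of the speed-up discussed above.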

  • Preprocessing speech data to train a neural network

This article analyzes data processing problems in training a neural network. The first stage of model training, feature extraction, is discussed in detail. The article considers the method of mel-frequency cepstral coefficients. The spectrum of the voice signal was plotted. By multiplying the signal spectrum vectors by the window function, we found the signal energy falling into each of the analysis windows, and then calculated the mel-frequency cepstral coefficients. The mel scale helps in audio analysis tasks and is used in training neural networks to work with speech. The use of mel-cepstral coefficients significantly improved recognition quality by exposing the most informative coefficients, which were then used as input to the neural network. The method of mel-frequency cepstral coefficients made it possible to reduce the input data for training, increase performance, and improve recognition clarity.

Keywords: machine learning, data preprocessing, audio analysis, mel-cepstral coefficients, feature extraction, voice signal spectrum, Fourier transform, Hann window, discrete cosine transform, short-time Fourier transform
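The mel-scale step of the pipeline can be sketched with the standard Hz-to-mel mapping; this is the textbook formula, and the band-edge helper is a generic illustration rather than the article's code.

```python
# Sketch of the mel-scale step used before computing cepstral coefficients.
import math

def hz_to_mel(f):  return 2595.0 * math.log10(1.0 + f / 700.0)
def mel_to_hz(m):  return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_band_edges(f_low, f_high, n_filters):
    """Edges of triangular filters spaced uniformly on the mel scale
    (n_filters + 2 points in Hz)."""
    m_low, m_high = hz_to_mel(f_low), hz_to_mel(f_high)
    step = (m_high - m_low) / (n_filters + 1)
    return [mel_to_hz(m_low + i * step) for i in range(n_filters + 2)]

edges = mel_band_edges(0, 8000, 26)   # 26 filters over a 16 kHz-rate signal
```

Summing the windowed power spectrum inside each triangular band, taking logarithms, and applying a discrete cosine transform then yields the mel-frequency cepstral coefficients described above.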

  • On existing methods for removing noise from an image

This paper considers existing classical and neural network methods for combating noise in computer vision systems. Although neural network classifiers demonstrate high accuracy, stability on noisy data has not been achieved. The paper considers image improvement methods based on a bilateral filter, a histogram of oriented gradients, integration of filters with Retinex, a gamma-normal model, and combinations of a dark channel with various tools, as well as changes to convolutional neural network architectures through modifying or replacing their components, and the applicability of ensembles of neural networks.

    Keywords: image processing, image filtering, machine vision, pattern recognition

  • Information processing using a VGA adapter for an FPGA camera

This article describes the first stage of research work on the development of an FPGA-based camera for vehicle identification tasks, which are widely used at automated weight and size control points. Since an FPGA is an alternative to conventional processors that can perform multiple tasks in parallel, an FPGA-equipped camera will be able to perform vehicle detection and identification at the same time. Thus, the camera will transmit not only the image but also the processing result to problem-oriented systems for control, decision-making and optimization of data flow processing, after which the server will only need to confirm or reject the camera's results, significantly reducing image processing time across all automated weight and size control points. In the course of development, a simple VGA port board, a program displaying a static image on a monitor at 640x480 resolution, and a pixel counter program were implemented. The EP4CE6E22C8 is used as the FPGA; its capacity is more than sufficient to achieve the result.

    Keywords: system analysis methods, optimization, FPGA, VGA adapter, Verilog, recognition camera, board design, information processing, statistics

  • Investigation of the corrective capabilities of the noise-immune code of the system of residual classes

This article investigates the LTE-R group of standards, in which the OFDM system occupies a special place, and considers the possibility of developing new error detection and correction coding methods for high-speed data transmission. The problems associated with transmitting large amounts of information under conditions of high train speeds and a changing environment are considered, as well as the features of interference associated with railway infrastructure. Modular codes are used as the noise-immune codes; unlike classical BCH codes, they are arithmetic codes.

    Keywords: LTE-R standard, OFDM system, modular codes, noise immunity, error hit interval, BCH codes, error packet, error rate
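The residue arithmetic underlying such modular codes can be sketched briefly: a value is represented by its remainders modulo pairwise-coprime bases and recovered via the Chinese Remainder Theorem. The bases below are illustrative, and real noise-immune codes add redundant moduli to detect and correct errors.

```python
# Sketch of a residue number system (RNS) encode/decode round trip.
from math import prod

BASES = (7, 11, 13)                   # pairwise-coprime moduli, product 1001

def encode(x):
    return tuple(x % m for m in BASES)

def decode(residues):
    """Chinese Remainder Theorem reconstruction."""
    M = prod(BASES)
    x = 0
    for r, m in zip(residues, BASES):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)  # pow(..., -1, m): modular inverse
    return x % M

print(decode(encode(123)))  # 123
```

Because each residue channel is independent, arithmetic proceeds in parallel on small numbers, and a corrupted residue reveals itself on reconstruction, which is the corrective capability the title refers to.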

  • Optimization based on mixing methods when solving multicriteria selection problems

The hierarchy analysis method has long been described, studied and applied in practice. In order to reduce the subjectivity inherent in many decisions, we consider a variant of the hierarchy analysis method in which the assessment is carried out not by the decision maker but by a group of independent experts. Thus, we propose a method for solving multicriteria optimization problems based on mixing (combining) two methods: the hierarchy analysis method and the method of expert assessments.

    Keywords: Optimality criteria, alternative, decision maker, optimization, method of expert assessments, method of hierarchy analysis, competence of experts, consistency of expert opinions
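The combination can be sketched in miniature: each expert supplies a pairwise-comparison matrix, criterion weights are derived by row geometric means (one standard way to approximate the principal eigenvector), and the experts' weight vectors are averaged. Equal expert competence is assumed here for simplicity.

```python
# Sketch of group weight derivation in the hierarchy analysis method.
import math

def ahp_weights(pairwise):
    """Normalized row geometric means of a pairwise-comparison matrix."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    s = sum(gm)
    return [g / s for g in gm]

def group_weights(matrices):
    """Average the weight vectors obtained from several experts."""
    per_expert = [ahp_weights(m) for m in matrices]
    n = len(per_expert[0])
    return [sum(w[i] for w in per_expert) / len(per_expert) for i in range(n)]
```

Weighting the average by expert competence and checking consistency ratios, as the keywords suggest, are natural refinements of this sketch.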

Improving the efficiency of Dijkstra's algorithm using parallel computing technologies with the OpenMP library

The purpose of the study is to improve the efficiency of Dijkstra's algorithm by using the shared memory model with the OpenMP library and the principle of parallel execution in the implementation of the algorithm. Using Dijkstra's algorithm to find the shortest path between two nodes in a graph is quite common. However, the time complexity of the algorithm grows with the size of the graph, resulting in longer execution times, so parallel execution is a good option for addressing the time complexity problem. In this research work, we propose a parallel computing method to improve the efficiency of Dijkstra's algorithm for large graphs. The method involves dividing the array of paths in Dijkstra's algorithm among a specified number of processors for parallel execution. We provide an implementation of the parallelized Dijkstra algorithm and assess its performance on real datasets with different numbers of nodes. Our results show that the parallelized algorithm can significantly speed up the process compared to the sequential version, reducing execution time and improving CPU utilization, making it a useful choice for finding shortest paths in large graphs.

    Keywords: Dijkstra algorithm, graph, shortest paths, parallel computing, shared memory model, OpenMP library
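For reference, here is the sequential baseline the parallel variant competes against; this single-threaded heap-based sketch does not attempt the OpenMP parallelization of the distance scan described in the abstract.

```python
# Sequential reference version of Dijkstra's algorithm with a binary heap.
import heapq

def dijkstra(adj, src):
    """adj: {node: [(neighbor, weight), ...]}; returns shortest distances."""
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, skip
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 4), ("c", 1)], "c": [("b", 2)], "b": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 3, 'c': 1}
```

The OpenMP version keeps the same relaxation logic but splits the search for the next minimum-distance vertex across threads, which is where the speed-up on large graphs comes from.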

  • Modelling construction time by discrete Markov chains

Often in practice, construction times are estimated using deterministic methods, for example, based on a network schedule of the construction plan with deterministic durations for specific works. This approach does not reflect the probabilistic nature of risks and leads to a systematic underestimation of the time and, consequently, the cost of construction. The research proposes using a discrete inhomogeneous Markov chain to assess the risks of non-completion of construction on time. The states of the Markov process correspond to the stages of construction of the object. The probabilities of transitions from state to state are proposed to be estimated on the basis of empirical data on previously implemented projects and/or expertly, taking into account the risks characterising construction conditions in dynamics. The dynamic model of construction plan development allows determining such characteristics as: the probability of realising the construction plan within the established terms; the probability that the object will ever be completed; the time to reach the completion stage with a given degree of reliability; and the unconditional probabilities of the system states (construction stages) in a given period relative to the beginning of construction. The model has been tested. It allows estimating the completion time of construction and assessing the risks of failing to meet the established deadlines under the planned conditions of construction, taking into account the dynamics of risks.

Keywords: construction time, risk assessment, Markov model, discrete Markov chain, inhomogeneous random process
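The on-time completion probability such a chain yields can be sketched as follows. The three-stage chain and its transition probabilities below are purely illustrative, and a homogeneous matrix is used for brevity where the article allows period-dependent (inhomogeneous) matrices.

```python
# Sketch: probability of reaching the absorbing "completed" stage in time.
def step(dist, P):
    """One transition of the state distribution dist under matrix P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def completion_probability(P, periods):
    """P(final absorbing stage reached within `periods`), starting in stage 0."""
    dist = [1.0] + [0.0] * (len(P) - 1)
    for _ in range(periods):
        dist = step(dist, P)
    return dist[-1]

# illustrative 3-stage chain: each period a stage completes with probability 0.6
P = [[0.4, 0.6, 0.0],
     [0.0, 0.4, 0.6],
     [0.0, 0.0, 1.0]]
print(completion_probability(P, 2))  # 0.36
```

Replacing `P` with a different matrix for each period gives the inhomogeneous case, which is how dynamically changing risks enter the model.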

  • Modeling and program development for an intelligent system to support personnel management decisions in the electric power industry

In the conditions of the modern economy, where optimal personnel decisions are very important for any organization, especially in the dynamically developing electric power industry, developing an intelligent system to support personnel decisions in this industry becomes relevant. This paper analyzes the existing tools for selecting candidates for vacant positions, including managerial positions and vacancies in the electric power industry. Based on this analysis and earlier research, a competency profile of electric power industry managers is formed. The software product was developed using several programming languages in the Visual Studio development environment. The program implements a dynamic and interactive managerial decision-making process in which users face different scenarios assessing the formed competencies, with the output of a detailed report on their skills, giving employers an objective assessment of a candidate's potential for a vacant managerial position.

Keywords: electric power industry, competences, personnel, optimal personnel management decisions, intelligent system, personnel management, competence assessment, software product

  • The technique of analyzing video files for detecting the presence of persons and attractions, using recognition by key, non-repeating frames

In this paper, we consider a technique for the automatic analysis of video files to detect the presence of persons and attractions, using recognition by key, non-repeating frames based on algorithms for their extraction. Recognizing attractions and faces only in keyframes significantly reduces computational costs and avoids accumulating repetitive information. The effectiveness of the proposed technique is evaluated in terms of accuracy and speed on a set of test videos.

    Keywords: keyframe, recognition, computer vision, algorithm, video

  • Aliasing-grams for express control of the adequacy of the choice of sampling interval of the measured signal

A new mathematical apparatus is proposed for monitoring the adequacy of the choice of the signal sampling interval from the point of view of taking the main high-frequency components into account, and for identifying opportunities to increase it. It is based on the construction of special aliasing-grams from measured signal samples. Aliasing-grams are graphs of the standard deviations between the amplitude spectra of a conditionally reference discrete signal, specified with the highest sampling frequency, and auxiliary discrete signals obtained over the same observation interval but with lower sampling frequencies. By analyzing such graphs, it is easy to identify sampling frequencies that lead to the aliasing effect and, consequently, to distortion of the signal spectrum. To speed up and simplify the construction of aliasing-grams, it is proposed to use, as the auxiliary signals, signals obtained from the reference one by decimation. It has been shown that this apparatus is also effective in the case of the spectrum spreading effect. It can be used in self-learning measuring systems.

    Keywords: sampling interval, aliasing, amplitude spectrum, aliasing-gram, sample decimation, spectrum spreading
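One point of such a graph can be sketched as follows, assuming the decimation-based construction described above: compare the amplitude spectrum of the reference sampling with that of a decimated version over the frequency bins both grids share. The naive DFT below is for illustration only.

```python
# Sketch of one aliasing-gram point: standard deviation between the reference
# and decimated amplitude spectra over their common frequency bins.
import cmath, math

def amp_spectrum(x):
    """One-sided amplitude spectrum via a naive DFT (illustrative, O(n^2))."""
    n = len(x)
    return [abs(sum(x[k] * cmath.exp(-2j * math.pi * j * k / n)
                    for k in range(n))) / n for j in range(n // 2 + 1)]

def aliasing_gram_point(x, factor):
    ref = amp_spectrum(x)
    dec = amp_spectrum(x[::factor])
    # bin j of both signals corresponds to the same physical frequency j*fs/n
    diffs = [ref[j] - dec[j] for j in range(len(dec))]
    m = sum(diffs) / len(diffs)
    return math.sqrt(sum((d - m) ** 2 for d in diffs) / len(diffs))
```

For a signal whose components all lie below the decimated Nyquist frequency this value stays near zero; once a component aliases, the decimated spectrum gains a spurious peak and the deviation jumps, which is the indicator the method reads off the graph.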

  • Models of inclusive learning in foreign language classes

This paper reveals many topical problems related to the modernization of inclusive education in Russia, with an emphasis on the practice of teaching foreign languages in higher educational institutions. The paper also presents models of inclusion of persons with disabilities relevant to the modern educational environment. A brief description of the historical and legal basis of inclusion in Russia is also given. The authors note that in higher education institutions inclusive education is still at the stage of formation and that its successful implementation requires comprehending the problem and creating a methodological basis.

Keywords: inclusive education, persons with disabilities, equal access, quality education, integration, synergy, legal framework, adaptation, transformation

  • Simulation of an autonomous control system for a slitting machine of a paper machine

    The work is aimed at modeling the control system of a slitting machine of a paper machine in order to improve the quality of products and eliminate defects in winding density. The developed automated system implements the functions of controlling the operating modes of the machine, distributing the loads of the bearing shafts, braking the roll and tensioning the paper web.

    Keywords: slitting machine, paper machine, automated control system, rewinder, pressure roller, decoiler, reeler, accelerating shaft, deflecting shaft, cutting section

  • On the issue of reducing the power consumption of wireless sensor nodes

The article presents expressions for calculating the power consumption of end nodes when transmitting a message in a wireless sensor network. Data are obtained on the power consumption of a sensor network end node depending on signal attenuation during transmission over the wireless channel, as well as on the set values of the output power and the spreading factor of the transmitted signals.

    Keywords: internet of Things, sensor network, LoRaWAN, IoT system, end node power consumption, spreading factor, output power

  • Detection of local defect areas during non-destructive testing of extended products

The article discusses a method for detecting local areas with hidden defects in products whose length is several orders of magnitude greater than their other dimensions, by processing information from non-destructive testing of the product. To obtain the necessary information, various introscopy tools and radiation of different natures are used. Processing of the information obtained by scanning control should detect areas with defects and determine their nature. To compare different processing methods and select the optimal one, computer modeling was used to simulate the process of obtaining and processing the information, which simplifies the selection of the most suitable method for detecting a defect. The article describes typical models of the received signal and presents the simulation results.

    Keywords: defects, non-destructive testing, extended products, simulation model, moving averaging, time series
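The moving-averaging step from the keywords can be sketched generically: smooth the scan signal, then flag positions whose deviation from the local mean exceeds a threshold. The scan data and threshold below are illustrative.

```python
# Sketch of moving-average defect flagging on a 1-D scan signal.
def moving_average(x, window):
    """Centered moving average with shrinking windows at the edges."""
    half = window // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def defect_positions(signal, window=5, threshold=2.0):
    """Indices where the signal deviates from its local mean by > threshold."""
    smooth = moving_average(signal, window)
    return [i for i, (s, m) in enumerate(zip(signal, smooth))
            if abs(s - m) > threshold]

scan = [1, 1, 1, 1, 9, 1, 1, 1, 1]   # hypothetical scan with one defect spike
print(defect_positions(scan))        # [4]
```

Comparing such simple estimators against alternatives on simulated signal models is exactly the kind of selection the computer modeling described above supports.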