Many modern information processing and control systems in various fields are built on software and hardware for image processing and analysis. Such systems often need to store and transmit large data sets, including image collections. Data compression technologies are used to reduce the amount of memory required and to increase the speed of transmission. Approaches based on discrete wavelet transforms have been developed and applied for this purpose. The advantage of these transforms is their ability to localize points of brightness change in images. The detail coefficients corresponding to such points make a significant contribution to the energy of the image. This contribution can be quantified as weights, and analyzing these weights determines how the wavelet transform coefficients are quantized in the proposed lossy compression method. The approach described in the paper follows the general image compression scheme and comprises transformation, quantization and encoding stages. It provides good compression performance and can be used in information processing and control systems.
Keywords: image processing, image compression, redundancy in images, general image compression scheme, wavelet transform, compression based on wavelet transform, weight model, significance of detail coefficients, quantization, entropy coding
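As a concrete illustration of the transform-quantize-encode pipeline described above, here is a minimal sketch in Python using PyWavelets. The weight model is an assumption for illustration (the quantization step simply grows with decomposition level); it is not the authors' actual weighting scheme.

```python
# A minimal sketch of level-dependent quantization of DWT detail coefficients.
# The weight rule (step doubling per finer level) is an illustrative assumption.
import numpy as np
import pywt

def compress_dwt(image, wavelet="db2", levels=3, base_step=4.0):
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
    quantized = [coeffs[0]]  # keep approximation coefficients unquantized
    for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        # lvl grows toward finer levels; finer details get a coarser step.
        step = base_step * 2 ** (lvl - 1)
        quantized.append(tuple(np.round(c / step) * step for c in (cH, cV, cD)))
    return pywt.waverec2(quantized, wavelet)

demo = np.random.rand(64, 64) * 255
restored = compress_dwt(demo)
print("max reconstruction error:", np.abs(restored - demo).max())
```

In a real codec the rounded coefficients would then go to the entropy coder; here the sketch reconstructs immediately to show the distortion introduced by quantization alone.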
The paper considers the task of collecting and preparing data that comes from several information systems, using the automation of registrar reporting as an example. The languages OWL, XML and XBRL, as well as semantic networks, can be used to describe the subject area. A set of criteria is proposed for analysing and selecting the most appropriate knowledge representation language for data collection, using financial statements as an example. The results of the service's development are described, and the application of the XBRL format is shown. A multi-agent approach to the modelling and design of information systems was used in developing the service.
Keywords: data mining, subject area model, data formats, XBRL, business process, service, data integration
The paper reviews modern log trucks available under the recently imposed sanctions. The author argues that renewing existing log truck fleets has become an urgent problem for forest transport and logging companies. A wide range of new basic chassis and trucks with 6x4 and 6x6 wheel arrangements, produced by Russian, Belarusian and Chinese factories, is on the market for building log trucks. A large number of trailer links is produced for building log trucks, and used trucks from other companies can also be purchased. As the first stage of a technical and economic analysis and preliminary selection of the optimal type and composition of a log truck, a comparative assessment of log truck effectiveness was carried out. The analysis shows that Russian log trucks with engine power above 400 HP (horsepower) can compete with the best foreign models. Nevertheless, the reliability of Russian, Belarusian and Chinese log trucks needs further research.
Keywords: log trucks, trailer links, productivity, effectiveness
This paper describes approaches to visualizing and comparing semantic trees that reflect the component structure of a patented device and the connections between components, using graph databases. A graph DBMS uses graph structures to store, process and represent data. The main elements of a graph database are nodes and edges, which, within the framework of this task, model entities of three types (SYSTEM, COMPONENT, ATTRIBUTE) and connections of five types (PART-OF, LOCATED-AT, CONNECTED-WITH, ATTRIBUTE-FOR, IN-MANNER-OF). According to the results of the study, Neo4j demonstrates the best graph visualization capabilities; ArangoDB, despite correctly entered queries, produces incomplete visualizations; AllegroGraph proved difficult to work with programmatically and to configure for graph tree visualization. Three algorithms for comparing graph representations of information were tested: graph edit distance, topological comparison and subgraph isomorphism. The algorithms are implemented in Python; the implementation compares two graph trees and displays a visualization and analysis of their common graph structures and differences.
Keywords: semantic tree, component structure, patent, graph databases, Neo4j, AllegroGraph, ArangoDB
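Two of the three comparison algorithms named above (graph edit distance and subgraph isomorphism) are available off the shelf in networkx; the sketch below shows them on toy semantic trees built from the paper's entity and relation labels. The component names are invented for illustration.

```python
# A minimal sketch: compare two small semantic trees with networkx.
import networkx as nx

def build_tree(triples):
    """triples: (head, relation, tail), e.g. ('valve', 'PART-OF', 'pump')."""
    g = nx.DiGraph()
    for head, rel, tail in triples:
        g.add_edge(head, tail, label=rel)
    return g

a = build_tree([("valve", "PART-OF", "pump"), ("seal", "PART-OF", "valve")])
b = build_tree([("valve", "PART-OF", "pump"), ("rod", "CONNECTED-WITH", "valve")])

# Graph edit distance: node/edge insertions, deletions and relabelings
# needed to turn one tree into the other (relation labels must match).
dist = nx.graph_edit_distance(
    a, b, edge_match=lambda e1, e2: e1["label"] == e2["label"]
)
print("graph edit distance:", dist)

# Subgraph isomorphism: is a given fragment contained in tree `b`?
fragment = build_tree([("valve", "PART-OF", "pump")])
matcher = nx.algorithms.isomorphism.DiGraphMatcher(
    b, fragment, edge_match=lambda e1, e2: e1["label"] == e2["label"]
)
print("fragment found in b:", matcher.subgraph_is_isomorphic())
```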
In systems for monitoring, diagnostics and state recognition of various types of objects, an important aspect is reducing the volume of measured signal data for transmission or accumulation in information bases, with the ability to restore the data without significant distortion. A special type of signal in this case is the packet signal, which consists of harmonics at multiple (integer-related) frequencies and is truly periodic with a clearly distinguishable period. Signals of this type are typical for mechanical and electromechanical systems with rotating elements: reducers, gearboxes, electric motors, internal combustion engines, etc. The article considers a number of models for reducing such signals and the cases in which each is preferable. In particular, the following are highlighted: the discrete Fourier transform model with a modified formula for restoring a continuous signal, the proposed model based on decomposition by bordering functions, and the discrete cosine transform model. In the ideal case, the first two models provide exact signal restoration after reduction; the last is a lossy reduction model. The main criteria for evaluating the models are the computational complexity of the implemented transformations, the degree of signal reduction achieved, and the error in restoring the signal from the reduced data. It was found that, for packet signals, each of the listed models can be used, the choice being determined by the priority indicators of the reduction assessment. The considered reduction models can be applied in information-measuring systems for condition monitoring, diagnostics and control of the above-mentioned objects.
Keywords: reduction model, measured packet signal, discrete cosine transform, decomposition into bordering functions, reduction quality assessment, information-measuring system
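The lossy DCT reduction model mentioned above can be illustrated in a few lines: keep only the largest-magnitude DCT coefficients of a toy packet signal and measure the restoration error. The keep ratio and the test signal are illustrative assumptions.

```python
# A minimal sketch of lossy DCT reduction of a packet signal, using SciPy.
import numpy as np
from scipy.fft import dct, idct

def reduce_dct(signal, keep_ratio=0.05):
    c = dct(signal, norm="ortho")
    keep = max(1, int(keep_ratio * len(c)))
    drop = np.argsort(np.abs(c))[:-keep]  # indices of the smallest coefficients
    c[drop] = 0.0                         # reduction: discard them
    return idct(c, norm="ortho")

# Toy packet signal: harmonics at integer multiples of a 5 Hz base frequency.
t = np.linspace(0, 1, 1000, endpoint=False)
x = sum(a * np.sin(2 * np.pi * k * 5 * t) for k, a in [(1, 1.0), (2, 0.5), (3, 0.25)])
x_rec = reduce_dct(x)
print("relative restoration error:", np.linalg.norm(x - x_rec) / np.linalg.norm(x))
```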
Continuous stirred-tank reactors are currently used in many industries, and many control methods exist for this type of reactor. This paper presents a design method for a model predictive controller (MPC) based on a fuzzy model. The controlled plant is modeled by a Takagi-Sugeno fuzzy model, and the optimization problem is solved by a genetic algorithm. The MPC controller implemented with fuzzy models and genetic algorithms achieves better control quality than traditional MPC controllers.
Keywords: method of designing a model predictive controller, fuzzy model, Takagi-Sugeno, genetic algorithms, multiple inputs-multiple outputs
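The core of the scheme above is the genetic algorithm solving the MPC optimization at each step. The sketch below shows that loop on a toy scalar plant standing in for the Takagi-Sugeno fuzzy model of the reactor; the plant equation, cost weights and GA parameters are all illustrative assumptions.

```python
# A minimal sketch of GA-based MPC: evolve a control sequence over the
# horizon, apply only the first move, repeat (receding horizon).
import numpy as np

rng = np.random.default_rng(0)
HORIZON, POP, GENS = 10, 40, 60
SETPOINT = 1.0

def plant(x, u):
    # Stand-in one-step model; a real design would evaluate the T-S fuzzy model.
    return 0.9 * x + 0.1 * u - 0.02 * x ** 3

def cost(x0, u_seq):
    x, j = x0, 0.0
    for u in u_seq:
        x = plant(x, u)
        j += (x - SETPOINT) ** 2 + 0.01 * u ** 2  # tracking + control effort
    return j

def ga_mpc_step(x0):
    pop = rng.uniform(-2.0, 2.0, (POP, HORIZON))
    for _ in range(GENS):
        fit = np.array([cost(x0, ind) for ind in pop])
        parents = pop[np.argsort(fit)[: POP // 2]]            # truncation selection
        pairs = rng.integers(0, len(parents), (POP - len(parents), 2))
        children = (parents[pairs[:, 0]] + parents[pairs[:, 1]]) / 2  # crossover
        children += rng.normal(0.0, 0.1, children.shape)              # mutation
        pop = np.vstack([parents, children])
    best = pop[np.argmin([cost(x0, ind) for ind in pop])]
    return best[0]  # apply only the first control move

x = 0.0
for step in range(5):
    u = ga_mpc_step(x)
    x = plant(x, u)
    print(f"step {step}: u={u:+.3f}, x={x:+.3f}")
```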
In the operational diagnostics and state recognition of complex technical systems, an important task is to identify small, time-determined changes in complex measured diagnostic signals of the controlled object. For this purpose, the signal is mapped to a small-sized image in a diagnostic feature space; as the signal changes, the image moves along trajectories of different shapes depending on the nature and magnitude of the changes. It is important to identify stable and deterministic patterns of change in these complex-shaped diagnostic signals, and such identification largely depends on the principles by which the small-sized feature space is constructed. In the article, the space of decomposition coefficients of the measured signal in an adaptive orthonormal basis of canonical transformations is considered as such a space. The basis is constructed from a representative sample of realizations of the controlled signal for various states of the system using the proposed algorithm. The identified shapes of image trajectories correspond to specific types of deterministic changes in the signal. Analytical functional dependencies were derived linking a specific type of signal change with the shape of the image trajectory in the feature space. The proposed approach simplifies modeling, operational diagnostics and condition monitoring in, for example, low-frequency diagnostics and defectoscopy of structures, vibration diagnostics, and monitoring of the stress state of an object by analyzing the time characteristics of response functions to impact.
Keywords: modeling, functional dependencies, state recognition, diagnostic image, image movement trajectories, small changes in diagnostic signals, canonical decomposition basis, analytical description of image trajectory
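One assumed interpretation of "adaptive orthonormal basis built from a representative sample" is a Karhunen-Loève-style basis obtained by SVD. The sketch below builds such a basis from sample realizations and shows how a deterministic amplitude change moves the signal's image along a straight-line trajectory in the space of the first two coefficients; the signals and the change model are illustrative.

```python
# A minimal sketch: SVD-adapted orthonormal basis and an image trajectory.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 256)
# Representative sample: realizations of the controlled signal in various states.
sample = np.stack([np.sin(2 * np.pi * 5 * t + ph) for ph in rng.uniform(0, 1, 50)])
# Rows of Vt form an orthonormal basis adapted to the sample.
_, _, Vt = np.linalg.svd(sample, full_matrices=False)

# Deterministic change: slowly growing amplitude; the image moves along a ray.
amps = np.linspace(1.0, 1.5, 5)
for amp in amps:
    c1, c2 = Vt[:2] @ (amp * np.sin(2 * np.pi * 5 * t))
    print(f"amplitude {amp:.2f} -> image ({c1:+.3f}, {c2:+.3f})")
```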
The paper considers the development of a system for automatically generating starter site templates to simplify the creation of web applications. Code generation automates the writing of repetitive code, reducing development time and increasing developer efficiency. The system provides a user-friendly interface for selecting and configuring templates, eliminating the need to work with console commands. This speeds up the prototyping and deployment of web applications, which is especially important for projects with many repetitive components.
Keywords: website, content management, code generation, content management system, website template, web applications, framework, server side, client side, optimization
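The generation step described above can be reduced to placeholder substitution over template files. The sketch below uses Python's built-in string.Template; the file layout and fields are invented for illustration, not the system's actual templates.

```python
# A minimal sketch of starter-template generation via placeholder substitution.
from pathlib import Path
from string import Template

PAGE = Template("""<!doctype html>
<html><head><title>$title</title></head>
<body><h1>$title</h1><p>$tagline</p></body></html>
""")

def generate_starter(root, title, tagline):
    site = Path(root)
    site.mkdir(parents=True, exist_ok=True)
    (site / "index.html").write_text(PAGE.substitute(title=title, tagline=tagline))
    (site / "style.css").write_text("body { font-family: sans-serif; }\n")
    return site

print("generated:", generate_starter("demo_site", "My Shop", "Coming soon"))
```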
The article describes approaches to analyzing the information space using low-code platforms in order to identify the factors that form new identities of Azerbaijan and the unique features of the country's information landscape. The article describes the steps taken to identify key themes, collect big data in the form of text corpora from various Internet sources, and analyze the data. The analysis covers text sentiment and the identification of opinion leaders; the article also includes monitoring of key topics, with the results visualized for clear presentation.
Keywords: data analytics, trend monitoring, sentiment analysis, data visualization, low-code, Kribrum, Polyanalyst, big data
The problem of determining the area of defects in the surface layer of bearing parts from eddy current non-destructive testing data is considered. Methods of processing eddy current testing data are given. The use of the robust median polishing method to increase the information content of eddy current data is substantiated. It is proposed to use a sliding window, a standard deviation calculation, and a production rule formed by the Shannon information entropy criterion as tools for localizing defect patterns in the eddy current image of the inspected object. Results are presented from applying the developed localization algorithm to eddy current testing data of bearing parts obtained under real production conditions.
Keywords: eddy current testing, localization, defect, data analysis, recognition, surface layer, intelligent technologies, Shannon entropy, median polishing, classification problem
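The sliding-window part of the pipeline above is easy to sketch: compute a standard deviation map over an eddy current "image" and flag anomalous windows. The flagging rule and all parameters here are illustrative assumptions (the paper forms its production rule from the Shannon entropy criterion; the entropy of the map is computed below only for reference).

```python
# A minimal sketch of sliding-window defect localization on a synthetic field.
import numpy as np

def shannon_entropy(values, bins=16):
    hist, _ = np.histogram(values, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log2(p)).sum()

def localize(image, win=8, std_factor=3.0):
    h, w = image.shape
    std_map = np.zeros((h - win, w - win))
    for i in range(h - win):
        for j in range(w - win):
            std_map[i, j] = image[i:i + win, j:j + win].std()
    # Assumed rule: a window is "defective" if its deviation is anomalously
    # high relative to the background statistics of the whole map.
    threshold = std_map.mean() + std_factor * std_map.std()
    return np.argwhere(std_map > threshold), shannon_entropy(std_map.ravel())

field = np.random.default_rng(2).normal(0, 1, (64, 64))
field[20:26, 30:36] += 6.0  # synthetic defect pattern
hits, h_map = localize(field)
print(f"{len(hits)} windows flagged; entropy of std map: {h_map:.2f} bits")
```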
The problem of optimising the selective assembly of plunger-housing precision joints for the feeders of centralised lubrication systems used in mechanical engineering, metallurgy, mining, etc. is considered. The probability of forming assembly sets of all types is used as the objective function; the controlled variables are the number and sizes of batches of parts, their adjustment centres, and the values of group tolerances. Several variants of solving the problem for different combinations of controlled variables are considered. An example is given of solving the optimisation problem on the basis of previously developed mathematical models with given initial data and constraints, and the advantages and disadvantages of each variant are outlined. Optimisation increases the considered indicator by 5% to 20%.
Keywords: selective assembly, lubrication feeder, precision connection, mathematical model, optimisation
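The objective function above can be estimated by Monte Carlo simulation. The sketch below computes the probability that plunger and housing parts land in matching selective groups; the distributions, adjustment centres, tolerances and group count are all illustrative assumptions, not the paper's data.

```python
# A minimal Monte Carlo sketch of the assembly-set formation probability.
import numpy as np

rng = np.random.default_rng(3)
N, GROUPS = 100_000, 3
# Assumed normal scatter around the adjustment centres (dimensions in mm).
plunger = rng.normal(10.000, 0.004, N)
housing = rng.normal(10.012, 0.004, N)

# Group boundaries: equal-width bands across each part's tolerance zone.
edges_p = np.linspace(9.988, 10.012, GROUPS + 1)
edges_h = np.linspace(9.998, 10.022, GROUPS + 1)
gp = np.digitize(plunger, edges_p)  # group index 1..GROUPS if in tolerance
gh = np.digitize(housing, edges_h)

# A set forms when both parts fall into the same group; each group's yield
# is limited by the scarcer of the two part populations in that group.
p_total = sum(min((gp == g).mean(), (gh == g).mean()) for g in range(1, GROUPS + 1))
print(f"probability of forming assembly sets: {p_total:.3f}")
```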
A combined theoretical and practical study of the burner device parameters has been performed. The flow characteristic of the fuel supply system has been determined. Aerodynamic studies of the burner device characteristics have been conducted, axial velocity fields have been constructed, and critical parameters of the air supply unit design have been identified. The temperatures of in-chamber processes have been experimentally determined. A mathematical model of chemical reactions of the torch has been developed, and the dependence of diesel fuel toxicity on the excess air coefficient has been constructed. The effect of water vapor on the burner device operation has been determined.
Keywords: burner device, axial velocity field, intra-chamber processes, thermochemical parameters, mathematical modeling, toxicity
Automating government processes is a top priority in the digital era. For historical reasons, many systems for registering and storing data about individuals coexist, requiring intermediary IT infrastructures. The article considers the procedure for developing, creating and implementing software for updating and generating data about residents of the city of Astana. It defines the functional capabilities of the information system and its role in automating and monitoring government activities. The authors conducted the study by observing, synthesizing, analyzing, systematizing and classifying the data received, drawing on the scientific works of local and foreign authors on the topic and on open databases. The authors have, for the first time, created the structure and algorithms of the information system "Population Database 'Geonomics'". Specifically, they developed the mechanism and algorithm for the interaction of the "Geonomics" information system with government databases, and identified additional uses of the software by developing an algorithm for planning and placing social facilities with the help of the system. The authors conclude that the developed algorithms make the "Population Database 'Geonomics'" information system a reliable and powerful tool that plays a critical role in optimizing and automating processes related to population accounting and urban infrastructure management. The software contributes to the development of the city and the improvement of its residents' quality of life on the basis of up-to-date and reliable information. In addition, the developed algorithm allows real-time monitoring of current resident data and population density, on the basis of which decisions can be made about the construction and placement of social facilities for the comfortable service and living of city residents.
Keywords: automation, updating, government activities, government agency, information system, database
This paper discusses the Viola-Jones algorithm for face detection and its implementation on an STM32 microcontroller. The advantages of using embedded systems for personal identification are low cost, due to a reduced component base, and low power consumption. An architecture of the hardware and software system for face detection based on a multi-core microcontroller is proposed. The following requirements are put forward for the implemented face detection system: a processing rate of at least 1 frame per second, color output, faces marked with rectangular frames in the output, and no external memory modules. The cascades and features used in the classical version of the Viola-Jones algorithm are described. MB-LBP is chosen as the feature because its integer single-byte results are efficient to compute and store on low-power embedded systems. The file structure of trained OpenCV classifiers is described, and methods are proposed for compressing and converting them for use in 32-bit systems with limited RAM and no floating-point unit. A method for optimizing the integral image using overflow arithmetic is described. A multicriteria optimization problem for selecting optimal parameters of the integral image is formulated and solved using gradient descent. The use of SIMD instructions to parallelize the calculation of the integral image on the STM32 is described. Measurements of the implemented system's running time at different stages are presented, confirming that the stated requirements are met.
Keywords: face detection, microcontroller, embedded systems, Viola-Jones algorithm, MB-LBP features, classifier optimization, integral image optimization, SIMD instructions
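The overflow trick mentioned above can be re-created on a desktop: accumulate the integral image in uint16 and let it wrap around deliberately; any window whose true sum is below 2^16 is still recovered exactly, because modular arithmetic preserves differences. This is an illustrative re-creation in NumPy, not the STM32 code.

```python
# A minimal sketch of an integral image with deliberate uint16 wraparound.
import numpy as np

img = np.random.default_rng(4).integers(0, 256, (120, 160), dtype=np.uint8)

# Prefix sums accumulated in uint16: overflow wraps silently, as on the MCU.
ii = np.cumsum(np.cumsum(img.astype(np.uint16), axis=0, dtype=np.uint16),
               axis=1, dtype=np.uint16)

def window_sum(ii, r0, c0, r1, c1):
    """Sum over img[r0:r1, c0:c1] via four corner look-ups, modulo 2**16."""
    a = int(ii[r1 - 1, c1 - 1])
    b = int(ii[r0 - 1, c1 - 1]) if r0 > 0 else 0
    c = int(ii[r1 - 1, c0 - 1]) if c0 > 0 else 0
    d = int(ii[r0 - 1, c0 - 1]) if r0 > 0 and c0 > 0 else 0
    return (a - b - c + d) % (1 << 16)  # modular arithmetic recovers the sum

# Exact for a 12x12 window: max sum 144*255 = 36720 < 65536.
print(window_sum(ii, 10, 10, 22, 22) == int(img[10:22, 10:22].sum()))
```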
The term "oculography" (eye tracking) describes a technological method used to record eye movements in real time. This technique allows researchers to analyze the focus of subjects' attention on various interface elements. Color is a powerful tool for attracting attention. Understanding which colors first attract attention allows marketers to correctly place accents on visual stimuli, such as advertising materials that feature clothing of different colors, in order to improve the experience of interaction of a potential consumer with this content. The purpose of this work is to determine the effect of the black color of clothing on the priority of human attention. To achieve this goal, experiments were conducted in which the gaze of subjects was tracked using a webcam while they studied an experimental image. The analysis of the final experimental data obtained using the adapted velocity threshold identification algorithm showed a high attention priority for the black color of clothing. In 87.5% of cases, attention was paid to it first, while the gender of the subject did not play a significant role in this perception. The obtained results can help in the development of research aimed at improving the efficiency of information perception.
Keywords: oculography, velocity threshold identification algorithm, eye tracking technology, attention priority, region of interest, time to first fixation, advertising, clothing, color
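The velocity threshold identification algorithm (I-VT) referenced above has a simple core: classify each gaze sample as fixation or saccade by its point-to-point velocity, then merge consecutive fixation samples. The sketch below shows that core; the sampling rate, threshold and toy gaze track are illustrative assumptions, not the authors' adapted variant.

```python
# A minimal sketch of I-VT fixation detection on a toy gaze track.
import numpy as np

def ivt_fixations(x, y, hz=30.0, vmax=50.0):
    """Return (start, end) sample index ranges of fixations; vmax in px/s."""
    vel = np.hypot(np.diff(x), np.diff(y)) * hz      # pixels per second
    is_fix = np.concatenate([[True], vel < vmax])    # label each sample
    fixations, start = [], None
    for i, f in enumerate(is_fix):
        if f and start is None:
            start = i
        elif not f and start is not None:
            fixations.append((start, i))
            start = None
    if start is not None:
        fixations.append((start, len(is_fix)))
    return fixations

# Toy track: fixation, fast saccade, second fixation.
x = np.concatenate([np.full(20, 100.0), np.linspace(100, 400, 5), np.full(20, 400.0)])
y = np.full_like(x, 200.0)
for s, e in ivt_fixations(x, y):
    print(f"fixation: samples {s}..{e - 1}, centroid x={x[s:e].mean():.0f}")
```

Time to first fixation inside a region of interest, the metric named in the keywords, follows directly by checking which fixation centroid first lands inside the region.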
In this paper, the problem of extrapolating a video signal with a quasi-rational spectral density, which significantly generalizes the rational case, is solved explicitly. The spectral characteristic of the video signal extrapolation is constructed using the original method of A.M. Yaglom, a follower of academician A.N. Kolmogorov, who first posed the extrapolation problem for random sequences and processes. The essence of the method is to transfer all studies and calculations of spectral characteristics and densities from the real axis to the complex plane. The paper considers a video signal with a quasi-rational spectral density of a special type that is of practical interest, which, as the author shows using the methods of Chebotarev and Sturm, has all its roots in the open upper half-plane.
Keywords: random process, video signal, prediction, filtering, spectral characteristic, prediction time
A complex dynamic system is defined by a structurally invariant operator. The operator structure makes it possible to formulate problems of stabilizing program motions or equilibrium positions of a complex dynamic system with constraints on state coordinates and control. Solving these problems allows synthesizing a structurally invariant operator of a complex dynamic system with inequality constraints on the vector of locally admissible controls and on the state coordinates. Computational experiments confirming the correctness of the synthesized structurally invariant projection operator are performed.
Keywords: structurally-invariant operator, stabilization of program motions, complex nonlinear dynamic system, projection operator, SimInTech
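As a simple illustration of a projection operator under inequality constraints, the sketch below projects a desired control vector onto box constraints of the form u_min <= u <= u_max. This is an assumed, simplest instance of the constraints on locally admissible controls mentioned above, not the paper's operator.

```python
# A minimal sketch: Euclidean projection of a control vector onto a box.
import numpy as np

def project(u, u_min, u_max):
    """Componentwise clipping is the exact Euclidean projection onto a box."""
    return np.clip(u, u_min, u_max)

u_desired = np.array([1.8, -0.4, 3.1])
u_admissible = project(u_desired, u_min=-1.0, u_max=2.0)
print("projected control:", u_admissible)  # [ 1.8 -0.4  2. ]
```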
Digital holographic microscopy (DHM) is a combination of digital holography and microscopy. It can track transparent objects, such as the organelles of living cells, without fluorescent markers. The main problem in DHM is increasing image spatial resolution while maintaining a wide field of view. The main approaches to this problem are increasing the numerical aperture of the illumination and recording systems, and applying deep learning methods. The numerical aperture of illumination systems is increased by using oblique, structured or speckle illumination; for recording systems, it is increased by hologram extrapolation, synthesis or super-resolution. Deep learning is usually used in combination with other methods to shorten computation time. This article describes the basic principles and features of the above approaches.
Keywords: digital holographic microscopy, spatial resolution, field of view, numerical aperture, sample, light beam, CCD camera, diffraction, imaging system, super-resolution
Blurred frames pose a significant problem in fields such as video surveillance, medical imaging and aerial photography, in tasks such as object detection and identification, image-based disease diagnosis, and the analysis and processing of drone data for mapping and monitoring. This article proposes a method for detecting blurred frames using a neural network model. The model analyzes images represented in the frequency domain in Hough space. To evaluate the effectiveness of the proposed solution, it was compared with existing methods and algorithms applicable to the problem, namely the Laplacian method and the manual sampling method. The results show that the proposed method detects blurred frames with high accuracy and can be used in systems where high accuracy and clarity of visual data are required for decision-making.
Keywords: blurred frames, motion blur, blur, Hough transform, spectral analysis
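The Laplacian baseline that the paper compares against is a few lines of OpenCV: the variance of the Laplacian drops sharply on blurred frames, so a frame below a tuned threshold is flagged. The sketch below demonstrates it on synthetic frames; the threshold is an illustrative assumption and must be tuned per camera and scene.

```python
# A minimal sketch of the variance-of-Laplacian blur detector (the baseline
# method, not the paper's neural model).
import cv2
import numpy as np

def is_blurred(gray, threshold=100.0):
    """Low Laplacian variance indicates little high-frequency detail, i.e. blur."""
    return cv2.Laplacian(gray, cv2.CV_64F).var() < threshold

# Synthetic demo: a sharp random frame vs. the same frame heavily blurred.
sharp = np.random.default_rng(5).integers(0, 256, (240, 320), dtype=np.uint8)
blurry = cv2.GaussianBlur(sharp, (21, 21), 5)
print("sharp flagged:", is_blurred(sharp), "| blurry flagged:", is_blurred(blurry))
```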
In this paper, a new intent and entity recognition model for the subject area of air passenger service, labelled IRERAIR-TWIN, is developed using the 'no code' question-answer development platform 'TWIN'. The advantages of the no-code platform were analysed in terms of the ease of developing an applied question-answer system and the reduced amount of work needed to develop an application model for a narrow subject area. The results show that the 'TWIN' system provides an intuitive web-based user interface and a simpler approach to developing the semantic module of a question-answer system capable of solving application problems of moderate complexity for a narrow subject area. However, this approach has limitations in deep semantic analysis tasks, especially in complex contextual inference and the processing of large text fragments. The paper concludes by emphasising that future research will focus on using ChatGPT-based 'low code' platforms and large language models to further improve the intelligence of the IRERAIR-TWIN model. This extension aims to broaden the scope of supported scenarios.
Keywords: question-answering systems, no-code, low-code, intent recognition, named entity recognition, data annotation, feature engineering, pre-trained model, software development, end-user development
The paper considers the synthesis of controllers for a subordinate control system of a DC electric drive, the development of adaptive current and speed control blocks, and the modeling of adaptive control systems in the Matlab/Simulink visual modeling environment.
Keywords: automatic control system, electric drive, adaptive control, regulator, subordinate control, Matlab, Simulink, DC motor
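Subordinate (cascade) DC drive loops are classically tuned by the modulus optimum rule, so a short sketch of that standard calculation may help; the plant parameters below are illustrative assumptions, and the paper's actual design is done in Matlab/Simulink rather than Python.

```python
# A minimal sketch of modulus optimum PI tuning for a current loop with
# plant K / ((T1*s + 1)*(Tmu*s + 1)) and controller C(s) = Kp*(1 + 1/(Ti*s)).

def modulus_optimum_pi(K, T1, Tmu):
    """Compensate the large time constant T1; Tmu is the small uncompensated lag."""
    Ti = T1                  # integral time cancels the large constant
    Kp = T1 / (2 * K * Tmu)  # standard gain giving ~4.3% overshoot
    return Kp, Ti

# Illustrative armature-circuit values: gain, electromagnetic and converter lags.
Kp, Ti = modulus_optimum_pi(K=2.0, T1=0.05, Tmu=0.005)
print(f"current-loop PI: Kp={Kp:.2f}, Ti={Ti * 1000:.0f} ms")
```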
Image super-resolution is a popular task that aims to convert images from low resolution to high resolution. Convolutional neural networks, which have great advantages in image processing, are often used for this task. Nevertheless, information is often lost during processing, and increasing the depth and width of the network makes further work difficult. To solve this problem, the data are transformed into the frequency domain. In this paper, the image is divided into high-frequency and low-frequency regions, with higher priority given to the former. The method is then analyzed using quality checks and visual evaluation, and conclusions are drawn regarding the algorithm's performance.
Keywords: super-resolution (SR), low-resolution (LR), high-resolution (HR), discrete cosine transform, convolutional neural networks
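The frequency split described above can be sketched with a 2-D DCT: zero out coefficients outside a low-frequency region to get the smooth content, and keep the rest as the high-frequency (edge and texture) part that the network prioritizes. The diagonal split used below is an assumed, simplest variant, not necessarily the paper's partitioning.

```python
# A minimal sketch of a DCT low/high frequency split, using SciPy.
import numpy as np
from scipy.fft import dctn, idctn

def split_frequencies(image, low_frac=0.25):
    c = dctn(image.astype(float), norm="ortho")
    h, w = c.shape
    # Low frequencies sit near the top-left corner of the DCT plane.
    mask = np.add.outer(np.arange(h) / h, np.arange(w) / w) < low_frac
    low = idctn(np.where(mask, c, 0.0), norm="ortho")   # smooth content
    high = idctn(np.where(mask, 0.0, c), norm="ortho")  # edges and texture
    return low, high

img = np.random.rand(32, 32)
low, high = split_frequencies(img)
print("split is lossless:", np.allclose(low + high, img))
```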
The article describes the prerequisites for creating an electronic notification system for students at an educational institution. A use case diagram describes interaction with the system from the point of view of two users: an employee of the educational department and a student. A diagram of the physical database model is presented, and the purpose of the tables is described. The system uses two types of client applications: an administrative client for organizing the work of educational department employees and a Telegram bot for the students' side. A scheme for working with user data when processing chatbot commands is defined in IDEF0 notation. The messenger was chosen as the communication tool because of the popularity of this technology. The administrative client is implemented in C# using Windows Forms; the chatbot is implemented in Python using the "schedule" library for job scheduling, "time" for working with time, and "threading" for multithreading support.
Keywords: chat bot, Telegram bot, messenger, message, mobile device, information system, database, computer program, application
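The combination of the "schedule", "time" and "threading" libraries named above typically follows one pattern: a background thread polls the scheduler while the main thread handles chat commands. The sketch below shows that pattern with a stub in place of the actual Telegram API call; the job time and interval are illustrative.

```python
# A minimal sketch of a scheduled-notification loop for a chatbot.
import schedule
import threading
import time

def send_notifications():
    # Stub: a real bot would push messages to students via the Telegram API.
    print("sending schedule notifications to subscribed students")

schedule.every().day.at("08:00").do(send_notifications)

def scheduler_loop():
    while True:
        schedule.run_pending()  # fire any jobs that are due
        time.sleep(30)

# Run the scheduler in a background thread so the main thread stays free
# to process incoming chat commands.
threading.Thread(target=scheduler_loop, daemon=True).start()
```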
The paper is dedicated to modeling opportunistic behavior in the electric power industry. We considered two setups: an optimal control problem from the point of view of a separate agent, and a Stackelberg game between the controller and several agents. It is assumed that the agents may collude with the controller, understating the reported electric energy consumption in proportion to the amount of the bribe. Principal attention is paid to the numerical investigation of these problems based on the method of qualitatively representative scenarios in simulation modeling. It is shown that a small number of correctly chosen scenarios provides acceptable qualitative precision in forecasting the system dynamics. The numerical results are analyzed, and recommendations on combating corruption are formulated: increasing the penalty coefficient applied when the controller is caught taking "kickbacks", or increasing the controller's official reward, makes kickbacks unprofitable.
Keywords: opportunistic behavior, optimal control problem, simulation modeling, Stackelberg games
The article discusses the sources and types of data used to create a digital student profile, as well as possible ways of using them in educational analytics. A digital profile is a comprehensive description of a student's academic, behavioral and social characteristics collected from various sources. Data coming from educational institutions' information systems, social networks, instant messengers, mobile applications, video content platforms, questionnaires and video cameras are analyzed. The importance of a digital profile lies in its ability to support the personalization of learning and improve the efficiency of educational processes. The article highlights numeric, categorical, binary, ordinal and unstructured data types, as well as metadata and derived data used for analysis in Data Science and machine learning algorithms. Examples include grades, participation in educational events, social activity, preferences, text comments and video recordings. Attention is also paid to possible ways of using these data to predict academic performance, identify learning difficulties, and assess student engagement and motivation.
Keywords: digital student profile, educational analytics, data types, data sources, data analysis, personalization of learning, machine learning in education, Data Science, educational data mining, CRISP-DM, SEMMA