The development, research, and construction of devices that speed up interaction between various modules (for example, telemetry and remote control systems) and, more broadly, within the hybrid communication systems of a digital city, including the variety of systems used in an Intelligent Building, is a pressing problem. One such device, presented in this article, is an optimal multi-frequency modem developed by the authors. Alongside the developed modem, the article surveys examples of similar devices and systems developed by both Russian and foreign researchers. The authors demonstrate that the proposed modem provides a gain in spectral and energy efficiency compared with its analogues. The proposed approach can be used to organize high-speed data transmission over frequency-limited communication channels based both on new wired technologies of the digital subscriber line standard and on wireless systems.
Keywords: telemetry and remote control system, intelligent building, digital city hybrid communications system, modem, multi-frequency modulation, digital subscriber line, optimal finite signal, modulator, demodulator, wireless communication system
The article presents a comprehensive analysis of a systematic approach to the implementation and development of innovative information technologies aimed at preventing offenses committed by foreign citizens. The introduction provides an overview of the growing importance of employing advanced technological solutions in law enforcement, particularly in addressing challenges associated with foreign nationals. The main objectives of the study are to explore how the integration of technologies such as big data processing, artificial intelligence, and geographic information systems can enhance the efficiency of preventive measures. The article details the use of data analysis techniques, machine learning models, and system integration to create a unified information platform. This platform enables the consolidation of data from diverse sources, thereby improving the coordination between different law enforcement units and facilitating faster and more informed decision-making processes. The integration of these technologies also supports process standardization, reducing data inconsistencies and ensuring more reliable operations across various departments. The results highlight the benefits of utilizing big data analytics to process vast amounts of information that would be otherwise impossible to handle efficiently. Artificial intelligence, through predictive models and risk assessment tools, plays a crucial role in identifying potential threats and allocating resources effectively. Geographic information systems contribute by mapping crime hotspots and providing spatial analysis, which aids in targeted intervention strategies. The discussion emphasizes the importance of a unified approach to technology implementation, focusing on the creation of an integrated information system that can adapt to ongoing changes in the social and legal environment. 
The adaptability of the system is critical for maintaining its effectiveness in the face of new challenges and evolving regulatory requirements. The development of standardized data collection and processing protocols further enhances the system's resilience and operational efficiency. In conclusion, the article underscores that a systematic and integrated use of innovative information technologies significantly improves the effectiveness of crime prevention efforts and the overall efficiency of law enforcement agencies. The proposed approach not only facilitates proactive measures but also ensures a high level of responsiveness to emerging security threats, thereby strengthening public safety.
Keywords: systemic approach, innovative information technologies, crime prevention, foreign citizens, big data, artificial intelligence, geoinformation systems, information platform, standardization, law enforcement agencies, efficiency management, data integration
When evaluating student work, the analysis of written assignments, and in particular of source code, is especially relevant. This article discusses an approach to evaluating the dynamics of feature changes in students' source code. Various source-code metrics are analyzed and key ones are identified, including quantitative metrics, control-flow complexity metrics, and the TIOBE quality indicator. A text dataset of program source codes from a website dedicated to practical programming was used to determine threshold values for each metric and to categorize them. The results were applied to analyze students' source code using a developed service that evaluates work by key features, tracks the dynamics of code indicators, and shows a student's position within the group based on the obtained values.
Keywords: machine learning, text data analysis, program code analysis, digital footprint, data visualization
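The abstract names the metric families only in general terms. As a minimal illustration of the kind of quantitative and control-flow metrics involved (not the article's actual service or thresholds), the following Python sketch counts non-empty lines, function definitions, and an approximate McCabe cyclomatic complexity using the standard `ast` module:

```python
import ast

def code_metrics(source: str) -> dict:
    """Compute a few simple, illustrative source-code metrics."""
    tree = ast.parse(source)
    # Quantitative metric: non-empty lines of code.
    loc = len([line for line in source.splitlines() if line.strip()])
    # Control-flow complexity: decision points + 1 approximates
    # McCabe's cyclomatic number V(G).
    decision_nodes = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)
    decisions = sum(isinstance(n, decision_nodes) for n in ast.walk(tree))
    funcs = sum(isinstance(n, ast.FunctionDef) for n in ast.walk(tree))
    return {"loc": loc, "cyclomatic": decisions + 1, "functions": funcs}

sample = """
def classify(x):
    if x > 0:
        return "pos"
    return "non-pos"
"""
print(code_metrics(sample))  # {'loc': 4, 'cyclomatic': 2, 'functions': 1}
```

Tracking such values across a student's submissions over time is what makes the dynamics of code indicators observable.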
Assessing the risk of emergency situations often requires analyzing unstructured data. Traditional analysis methods may not account for the ambiguity of information, which makes them insufficiently effective for risk assessment. The article proposes a modified analytic hierarchy process (AHP) based on fuzzy logic, which allows uncertainties and subjective assessments to be taken into account more effectively when analyzing emergency risks. In addition, such methods can incorporate not only quantitative but also qualitative indicators. This, in turn, can lead to more informed risk-management decisions and increased preparedness for various situations. Integrating technologies for working with unstructured data into emergency risk assessment not only increases forecasting accuracy but also allows management strategies to be adapted to changing conditions.
Keywords: artificial intelligence systems, unstructured data, risk assessment, classical analytic hierarchy process, modified analytic hierarchy process, fuzzy inference system
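The abstract does not give the concrete computation, but one common form of fuzzy AHP (Buckley's geometric-mean method with triangular fuzzy numbers) can be sketched as follows; the pairwise comparison values below are illustrative, not taken from the article:

```python
import numpy as np

# Triangular fuzzy pairwise comparison matrix (l, m, u) for 3 criteria.
# Entry [i][j] expresses how much criterion i dominates criterion j.
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

# Buckley's method: componentwise fuzzy geometric mean per row,
# then centroid defuzzification of the triangular number (l+m+u)/3.
geo = np.prod(M, axis=1) ** (1 / M.shape[0])  # shape (3, 3): (l, m, u) per row
crisp = geo.mean(axis=1)
weights = crisp / crisp.sum()                  # normalized criterion weights
print(weights.round(3))
```

The resulting weights rank the criteria while the fuzzy intervals capture the ambiguity of the expert judgments.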
The article examines the causes of formation of the defects being evaluated. Methods of obtaining information about the condition of corrugated metal pipes are presented, the main defects arising during their operation are shown, and the most effective methods for assessing their condition are determined.
Keywords: corrugated metal pipes, wear, durability, defects, factors, evaluation
In systems for monitoring, diagnostics, and state recognition of various types of objects, an important task is reducing the volume of measured signal data for transmission or storage in information bases, with the ability to restore it without significant distortion. A special class of signals in this context are packet signals: sets of harmonics at multiple frequencies that are strictly periodic with a clearly distinguishable period. Signals of this type are typical of mechanical and electromechanical systems with rotating elements: reducers, gearboxes, electric motors, internal combustion engines, etc. The article considers several models for reducing such signals and the cases in which each is preferable. In particular, the following are highlighted: a discrete Fourier transform model with a modified formula for restoring the continuous signal, a proposed model based on decomposition over bordering functions, and a discrete cosine transform model. The first two models provide, in the ideal case, exact restoration of the signal after reduction; the last is a lossy reduction model. The main criteria for evaluating the models are the computational complexity of the transformations, the degree of signal reduction achieved, and the error in restoring the signal from the reduced data. It was found that, for packet signals, any of the listed models can be used, the choice being determined by which reduction-assessment indicators take priority. The considered reduction models can be applied in information-measuring systems for condition monitoring, diagnostics, and control of the above-mentioned objects.
Keywords: reduction model, measured packet signal, discrete cosine transform, decomposition into bordering functions, reduction quality assessment, information-measuring system
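As a minimal numerical illustration of the reduction idea for packet signals (not a reproduction of the article's models), the sketch below keeps only the dominant spectral lines of a synthetic harmonic signal and restores it. When the harmonics fall on exact DFT bins, as for a strictly periodic packet signal, the restoration is essentially exact:

```python
import numpy as np

# Synthetic packet signal: harmonics at multiples of a base frequency,
# an illustrative stand-in for a measured gearbox/motor signal.
N = 1024
t = np.arange(N) / N
x = (1.0 * np.sin(2 * np.pi * 8 * t)
     + 0.5 * np.sin(2 * np.pi * 16 * t)
     + 0.25 * np.sin(2 * np.pi * 24 * t))

# Reduction: keep only the K spectral lines with the largest magnitude.
X = np.fft.rfft(x)
K = 3
keep = np.argsort(np.abs(X))[-K:]   # indices of the dominant harmonics
X_red = np.zeros_like(X)
X_red[keep] = X[keep]               # reduced representation: K complex values

x_rec = np.fft.irfft(X_red, n=N)    # restoration from the reduced data
err = np.linalg.norm(x - x_rec) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.2e}")
```

Here 1024 samples are reduced to 3 complex coefficients plus their indices, and the restoration error is at machine-precision level; real measured signals with noise or non-multiple components would make the reduction lossy.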
In operational diagnostics and state recognition of complex technical systems, an important task is identifying small time-determined changes in complex measured diagnostic signals of the controlled object. For this purpose, the signal is transformed into a small-dimensional image in a diagnostic feature space, where it moves along trajectories of different shapes depending on the nature and magnitude of the changes. It is important to identify stable, deterministic patterns in the changes of these complex-shaped diagnostic signals, and the ability to do so largely depends on how the small-dimensional feature space is constructed. In the article, the space of decomposition coefficients of the measured signal in an adaptive orthonormal basis of canonical transformations is considered as such a space. The basis is constructed from a representative sample of realizations of the controlled signal for various states of the system using the proposed algorithm. The identified shapes of the image trajectories correspond to specific types of deterministic changes in the signal. Analytical functional dependencies were found linking a specific type of signal change with the shape of the image trajectory in the feature space. The proposed approach simplifies modeling, operational diagnostics, and condition monitoring in applications such as low-frequency diagnostics and defectoscopy of structures, vibration diagnostics, and monitoring of an object's stress state by analyzing the time characteristics of its response to impact.
Keywords: modeling, functional dependencies, state recognition, diagnostic image, image movement trajectories, small changes in diagnostic signals, canonical decomposition basis, analytical description of image trajectory
The article addresses the problem of automated generation of user roles using machine learning methods. To solve it, cluster analysis methods implemented in Python in the Google Colab development environment are used. Based on the results obtained, a method for generating user roles was developed and tested, which reduces the time required to build a role-based access control model.
Keywords: machine learning, role-based access control model, clustering, k-means method, hierarchical clustering, DBSCAN method
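The article's exact procedure is not reproduced here, but the core idea of mining roles by clustering user-permission vectors can be sketched with a toy k-means in Python; the data and the deterministic initialization are illustrative assumptions:

```python
import numpy as np

# Toy user-permission matrix (rows: users, columns: permissions); 1 = granted.
# Two clear groups: "accounting" (perms 0-2) and "engineering" (perms 3-5).
U = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 0, 0, 1],
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)

def kmeans(X, centers, iters=20):
    """Basic k-means with given initial centers (kept if a cluster empties)."""
    centers = centers.copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels, centers

# Deterministic initialization from two dissimilar users, for reproducibility.
labels, centers = kmeans(U, U[[0, 3]])

# A candidate role = the permissions held by most users of a cluster.
roles = (centers >= 0.5).astype(int)
print("cluster labels:", labels)   # [0 0 0 1 1 1]
print("mined roles:\n", roles)
```

Each mined row of `roles` is a candidate role definition, which is what shortens the manual construction of a role-based access control model.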
The article is devoted to the design of a test automation system for the Postgres Pro DBaaS cloud database manager. New approaches, concepts, and definitions in the theory of test automation are formulated and existing ones are updated. Modern tools widely used in commercial software development are analyzed. The features of the system under test were studied, including the specifics of working with cloud computing and the Postgres DBMS. Based on these data, an optimal technology stack planned for use in development was formed, and functional requirements for the test automation system were developed. In practical terms, using this system on a DBaaS project will reduce labor intensity, speed up the testing and development stages, and increase testing efficiency and the quality of the software product.
Keywords: software testing automation, DBaaS, cloud database, Postgres DBMS, GO programming language
This article analyzes the organizational and technological solutions used in the investment and construction sector that are aimed at significantly increasing the efficiency of construction production and optimizing labor productivity. Investment and construction activity is a multifaceted set of strategic measures aimed at attracting capital investment, improving organizational and technological management, and successfully completing project initiatives. The integration of innovative approaches into resource management and labor organization serves as the basis for forming a sustainable competitive advantage and maximizing performance in the dynamically developing construction industry. Organizational and technological schemes for implementing investment and construction projects are considered. Attention is paid to complex turnkey contracts, which involve preparing preliminary contract documentation in two stages: the contractor studies the customer's requirements and then submits proposals to the customer; once these are approved and accepted, the risk associated with structural, organizational, and technological solutions is reduced.
Keywords: organizational and technological solutions, investment and construction processes, facility, organizational management structures, reliability, efficiency
This paper investigates the effectiveness of the distance-field method for rendering 3D graphics in comparison with the traditional polygonal approach. The main attention is paid to the analytical representation of models, which makes it possible to determine the shortest distance to scene objects and provides high speed even on weak hardware. The comparative analysis covers the achievable level of model detail, the applicability of different light sources, reflection mapping, and model transformations. Conclusions are drawn about the promising potential of the distance-field method for 3D graphics, especially in real-time rendering systems, and the relevance of further research and development in this area is emphasized. Within the framework of this work, a universal software implementation of the distance-field method was created.
Keywords: computer graphics, rendering, 3D graphics, ray marching, polygonal graphics, 3D graphics development, modeling, 3D models
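As a minimal sketch of the distance-field idea (sphere tracing, the core of ray marching), assuming a single analytically defined sphere rather than the paper's universal implementation:

```python
import math

def sdf_sphere(p, center=(0.0, 0.0, 5.0), r=1.0):
    """Signed distance from point p to a sphere: the analytic scene model."""
    return math.dist(p, center) - r

def ray_march(origin, direction, sdf, max_steps=128, eps=1e-4, far=100.0):
    """Sphere tracing: advance along the ray by the shortest scene distance."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t      # hit: distance to the surface along the ray
        t += d            # safe step: nothing in the scene is closer than d
        if t > far:
            break
    return None           # miss

hit = ray_march((0, 0, 0), (0, 0, 1), sdf_sphere)
print(hit)  # 4.0: the sphere surface lies at z = 5 - 1
```

Because each step is bounded by the analytic distance to the nearest object, no polygonal mesh is needed, which is what enables compact scene descriptions and fast evaluation on modest hardware.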
The article considers the possibility of measuring the temperature of cable transmission lines using a specially manufactured tapered quartz optical fiber. The technology for manufacturing the tapered optical fiber on a specially designed device is described, along with a laboratory setup for detecting the expansion of the optical fiber under the influence of temperature. The influence of temperature on laser beam deflection in the fabricated single-mode quartz fiber is investigated.
Keywords: optical fiber, fiber optic sensor, temperature measurement, cable power lines
The article describes the methodology for developing a client-server application intended for constructing a virtual museum. The creation of the server part of the application with the functions of processing and executing requests from the client part, as well as the creation of a database and interaction with it, is discussed in detail. The client part is developed using the Angular framework and the TypeScript language; the three-dimensional implementation is based on the three.js library, which is an add-on to WebGL technology. The server part is developed on the ASP.NET Core platform in C#. The database schema is based on a Code-First approach using Entity Framework Core. Microsoft SQL Server is used as the database management system.
Keywords: client-server application, virtual tour designer, virtual museum, three.js library, framework, Angular, ASP.NET Core, Entity Framework Core, Code-First, WebGL
This article presents a study of an approach to developing a medical decision support system (DSS) for selecting formulas to calculate the optical power of intraocular lenses (IOLs) used in the surgical treatment of cataracts. The system is based on methods for building recommendation systems, which makes it possible to automate IOL selection and minimize the risk of human error. Implementing the system in the practice of medical organizations is expected to provide high accuracy and efficiency, significantly reduce decision-making time, and improve the results of surgical interventions.
Keywords: intraocular lens, ophthalmology, formulas for calculating optical power, web application, machine learning, eye parameters, prognostic model, recommendation system, prediction accuracy, medical decision
The article is devoted to making organizational and technological decisions when the information available about the alternatives is either very scarce or changing drastically. Since it is difficult to make an organizational and technological business decision when the outcome is unclear, the use of game theory under uncertainty is proposed. Which decision is made, and which of the available strategies the designer adopts, depends on the economic effect expected from implementing the organizational and technological solution being developed. Game theory is a special class of mathematical models for decision-making under specific conditions. When choosing solutions under uncertainty, game theory formalizes the available knowledge about the participants' courses of action in conflict situations and makes it possible to mathematically substantiate a rational course of action under given conditions. For construction projects in environmentally unfavorable conditions, risk matrices are compared and the best option for the development of the construction project is determined.
Keywords: construction and technical expertise, reliability, organizational and technological solutions, diagnostic methods, efficiency
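The classical decision criteria used in such games against nature (Wald's maximin, Savage's minimax regret, and the Hurwicz criterion) can be sketched on a payoff matrix; the numbers below are hypothetical, not taken from the article:

```python
import numpy as np

# Payoff matrix: rows = design strategies, columns = "states of nature"
# (e.g. environmental conditions); entries are illustrative economic effects.
A = np.array([
    [30, 10, 20],
    [25, 25, 15],
    [40,  5, 10],
])

wald = A.min(axis=1).argmax()              # maximin: best worst-case payoff
regret = A.max(axis=0) - A                 # Savage regret matrix
savage = regret.max(axis=1).argmin()       # minimax regret
alpha = 0.6                                # optimism coefficient (assumed)
hurwicz = (alpha * A.max(axis=1) + (1 - alpha) * A.min(axis=1)).argmax()
print(wald, savage, hurwicz)               # 1 0 2
```

Different criteria can recommend different strategies, which is precisely why comparing them in a risk matrix helps justify the chosen option for a project.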