

  • Application of neural network technologies for user authentication in modern mobile systems

    With the rapid development of mobile technologies and the growing risk of data leakage, reliable user authentication has become one of the key tasks of information security. This paper studies the application of neural network technologies to biometric authentication in modern mobile systems. It provides a comprehensive analysis of existing biometric authentication methods, such as face recognition, voice analysis, and fingerprint analysis, with special attention to how the methods operate, their accuracy, and their resistance to attacks; the main advantages and disadvantages of each method are given. The article concludes with a practical application of the developed neural network authentication algorithm based on fingerprint analysis and integrated into the SIM card. This approach not only raises the security level of mobile devices but also remains convenient for the user. The case study forms the basis for further research presented in this thesis, underscoring the importance of integrating neural network technologies into authentication processes. The results will be useful to both researchers and developers in the field of information security, opening new horizons for improving biometric systems in the mobile environment.

    Keywords: authentication, neural networks, biometrics, mobile systems, information security, deepfake, GDPR, hybrid technologies, SIM card

  • Analysis of network stability and optimization of data exchange in banking systems

    The article presents an analysis of the network stability of modern banking systems from the point of view of graph theory. The use of graph models makes it possible to effectively describe complex network structures, identify bottlenecks, and predict system behavior during failures or attacks. Algorithms based on graph theory, such as Dijkstra's Algorithm, have been proposed to ensure minimal transaction processing time and improve system reliability. A comparative analysis of various optimization methods through modeling on abstract graphs and real banking network data was carried out. As a result of the study, solutions were proposed to protect the banking system, as well as improve its connectivity and fault tolerance.
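
    Dijkstra's algorithm on a latency-weighted graph can be sketched as follows; the four-node network and its weights are hypothetical, not data from the study:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths; `graph` maps a node to (neighbor, weight)
    pairs, with weights standing in for link latencies in a banking network."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # outdated heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical four-node network with latency-weighted links.
net = {
    "A": [("B", 4), ("C", 1)],
    "C": [("B", 2), ("D", 7)],
    "B": [("D", 1)],
    "D": [],
}
shortest = dijkstra(net, "A")
```

    The same distance map also exposes bottlenecks: any node whose removal sharply increases the shortest distances is a candidate single point of failure.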

    Keywords: banking system, graph theory, Dijkstra's algorithm, blockchain, transactions, cyber attack, network stability analysis, banking infrastructure, cyber security, DDoS attack

  • Application of convolutional neural networks and deep learning algorithms for prediction and identification of voice deepfakes

    The purpose of this article is to build a convolutional neural network model for identifying and predicting audio deepfakes by classifying voice content with deep machine learning algorithms and Python libraries. The audio datasets underlying the training process are represented as mel spectrograms; processing these heatmap images of the audio signal forms the knowledge base of the convolutional neural network. Visualizing mel spectrograms as sound frequency plotted against the mel scale reveals the key characteristics of the audio signal and supports comparison between a real voice and artificial speech. Modern speech synthesizers perform complex selection and generate synthetic speech from a recording of a person's voice and a language model. We note the importance of mel spectrograms for speech synthesis models as well, where this type of spectrogram captures the timbre of a voice and encodes the speaker's original speech. Convolutional neural networks make it possible to automate the processing of mel spectrograms and to classify voice content as original or fake. Experiments on test voice sets demonstrated successful training and use of convolutional neural networks on images of MFCC spectral coefficients for classifying and studying audio content, and the applicability of this type of neural network in information security for identifying audio deepfakes.
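
    The mel scale mentioned above maps physical frequency to perceived pitch via m = 2595·log10(1 + f/700). A minimal sketch of the conversion and of placing triangular filter band edges (the basis of a mel spectrogram), independent of any particular audio library:

```python
import math

def hz_to_mel(f):
    """Convert frequency in hertz to mels: m = 2595 * log10(1 + f / 700)."""
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    """Inverse mapping from mels back to hertz."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_band_edges(n_bands, f_min, f_max):
    """Edge frequencies (Hz) of n_bands triangular filters spaced evenly on
    the mel scale; a mel spectrogram averages spectral energy inside them."""
    lo, hi = hz_to_mel(f_min), hz_to_mel(f_max)
    step = (hi - lo) / (n_bands + 1)
    return [mel_to_hz(lo + i * step) for i in range(n_bands + 2)]

edges = mel_band_edges(40, 0.0, 8000.0)
```

    Because the spacing is linear in mels, the resulting hertz edges grow denser at low frequencies, matching how human hearing resolves pitch.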

    Keywords: neural networks, detection of voice deepfakes, information security, speech synthesis models, deep machine learning, categorical cross-entropy, loss function, algorithms for detecting voice deepfakes, convolutional neural networks, mel-spectrograms

  • On the Implementation of Information Security in Distributed Data Storage Systems for Small Businesses

    The article examines the key aspects of, and recommendations for, implementing data protection in distributed data storage systems (DDSS) for small businesses. It explores methods of ensuring information security, including incident monitoring, two-factor authentication, and file encryption. The study includes tests of the fault tolerance of DDSS and of the robustness of authentication mechanisms under simulated DoS and brute-force attacks using fuzzing techniques. Proposed methods include the integration of incident monitoring platforms (MISP, Wazuh) and the use of TOTP for two-factor authentication. Additionally, it discusses data encryption mechanisms and access management using JWT.
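
    TOTP, as standardized in RFC 6238, derives a short code from a shared secret and the current 30-second time step, on top of the HOTP construction of RFC 4226. A minimal stdlib sketch (not the paper's implementation):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian counter, then dynamic
    truncation to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, at_time=None) -> str:
    """RFC 6238 TOTP: HOTP evaluated on the current time step."""
    if at_time is None:
        at_time = time.time()
    return hotp(secret, int(at_time) // period)
```

    The assertions below use the published RFC 4226 test vectors for the secret "12345678901234567890".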

    Keywords: information security, fuzzing, monitoring, WAF, data storage system, data encryption, two-factor authentication, small business, fault tolerance

  • Practical study of the Kerberos protocol: attacks, detection and development of detection rules

    This article examines the practical aspects of using the Kerberos authentication protocol. It provides a brief historical background and describes the main operational principles and characteristics of the protocol that may lead to vulnerabilities. Exploitation of these vulnerabilities can allow an attacker to maintain a persistent presence in a domain environment. In the course of the study, the potential for an attacker to establish domain persistence is investigated through attacks such as Kerberoasting, brute-forcing hashes, and creating Golden Tickets. These activities are carried out using tools designed for penetration testing. Special attention is devoted to assessing the subsequent negative consequences of unauthorized access to domain resources. The study analyzes methods for detecting attacks, including the development of rules to monitor suspicious activity. The article underscores the importance of promptly identifying real-world threat vectors and reinforcing security measures within domain infrastructures. Discussion of the proposed set of attack vectors helps create a more effective penetration testing plan and strengthen monitoring. In particular, the paper examines the process of obtaining Ticket-Granting Tickets, as well as ways to compromise and further exploit static credentials. It is highlighted that the Key Distribution Center (KDC) is a critical component requiring additional oversight and protection. Practical recommendations are provided on monitoring event logs and configuring Intrusion Detection Systems (IDS) for timely detection of anomalous activities. Thus, the central idea of this work is to demonstrate the relevance and significance of Kerberos in corporate networks, while highlighting both its strong points and possible risks. The results emphasize the importance of combining an in-depth understanding of Kerberos functionality with practical security measures to preserve system integrity and reduce risks.
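
    As an illustration of a detection rule of the kind discussed, the sketch below flags accounts that issue many RC4-encrypted service-ticket requests (Windows event 4769 with encryption type 0x17), a common Kerberoasting indicator. The event schema, sample data, and threshold are hypothetical:

```python
from collections import Counter

def kerberoasting_suspects(events, threshold=5):
    """Return accounts with at least `threshold` RC4-encrypted TGS requests,
    a classic Kerberoasting indicator in Windows Security logs."""
    hits = Counter(
        e["account"]
        for e in events
        if e.get("event_id") == 4769 and e.get("enc_type") == 0x17
    )
    return sorted(a for a, n in hits.items() if n >= threshold)

# Hypothetical parsed log entries: five RC4 TGS requests from one service
# account versus AES-encrypted (0x12) requests from a regular user.
sample_events = (
    [{"event_id": 4769, "enc_type": 0x17, "account": "svc_sql"}] * 5
    + [{"event_id": 4769, "enc_type": 0x12, "account": "alice"}] * 10
)
suspects = kerberoasting_suspects(sample_events)
```

    In a real deployment such a rule would run over a sliding time window and feed an IDS or SIEM alert rather than a batch list.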

    Keywords: pinning, authentication protocol, access level, Kerberos, attack detection, attacker, Golden Ticket, Kerberoasting, monitoring

  • Development and Analysis of a Feature Model for Dynamic Handwritten Signature Recognition

    In this work, we present the development and analysis of a feature model for dynamic handwritten signature recognition to improve its effectiveness. The feature model is based on the extraction of both global features (signature length, average angle between signature vectors, range of dynamic characteristics, proportionality coefficient, average input speed) and local features (pen coordinates, pressure, azimuth, and tilt angle). We utilized the method of potentials to generate a signature template that accounts for variations in writing style. Experimental evaluation was conducted using the MCYT_Signature_100 signature database, which contains 2500 genuine and 2500 forged samples. We determined optimal compactness values for each feature, enabling us to accommodate signature writing variability and enhance recognition accuracy. The obtained results confirm the effectiveness of the proposed feature model and its potential for biometric authentication systems, presenting practical interest for information security specialists.
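
    Two of the global features listed above, signature length and average input speed, can be computed directly from a pen trajectory; the (x, y, t) sample format here is an assumption for illustration, not the MCYT format:

```python
import math

def global_features(points):
    """Two global features of a dynamic signature: total path length and
    average input speed. `points` is a pen trajectory of (x, y, t) samples."""
    length = 0.0
    for (x0, y0, _), (x1, y1, _) in zip(points, points[1:]):
        length += math.hypot(x1 - x0, y1 - y0)
    duration = points[-1][2] - points[0][2]
    return {"length": length,
            "avg_speed": length / duration if duration else 0.0}

feats = global_features([(0, 0, 0.0), (3, 4, 1.0), (3, 4, 2.0)])
```

    The remaining global features (average vector angle, dynamic ranges, proportionality coefficient) follow the same pattern of aggregating over consecutive samples.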

    Keywords: dynamic handwritten signature, signature recognition, biometric authentication, feature model, potential method, MCYT_Signature_100, FRR, FAR

  • Development of an integrated method for assessing the security level of an organization's server infrastructure

    Information security management is an important task for the enterprise: the number of threats keeps growing, and protection mechanisms require constant improvement. Enterprise server infrastructure is used to publish corporate services, and the requirements placed on it in terms of performance, reliability, and security are high. This article discusses a developed method for the integral assessment of the security level of enterprise server infrastructure against attacks of various types.

    Keywords: data protection, information technology, comprehensive assessment, systems analysis, information systems, information security

  • Implementation of an algorithm for updating the certificate revocation lists of a certification authority

    The most important problem in using an electronic signature is keeping certificate revocation lists up to date. Currently, there is no single solution for automating this process. This paper presents one solution to the problem, using as an example the integrated use of operating system capabilities, cryptography tools, and standard certificate management libraries.

    Keywords: information security, software, lists of revoked certificates

  • A method of increasing the security of image transmission in messengers using one-time passwords

    The article presents a method for protecting transmitted images in instant messengers using time-based one-time passwords (TOTP). An additional level of protection is offered, based on a combination of image masking using orthogonal matrices and TOTP-based two-factor authentication. A prototype Python application was developed and tested using the gRPC remote procedure call framework to ensure secure data exchange between client and server. Results of applying the proposed method to prevent unauthorized access to confidential images are presented.
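
    The paper's concrete masking matrices are not given here, but a permutation matrix is a simple orthogonal example: applying it scrambles the image rows, and because the inverse of an orthogonal matrix is its transpose, the same permutation undoes the masking exactly. A toy sketch:

```python
def mask_rows(image, perm):
    """Mask: multiply on the left by the permutation matrix that sends row
    perm[i] to position i (permutation matrices are orthogonal)."""
    return [image[p] for p in perm]

def unmask_rows(masked, perm):
    """Unmask with the transpose (= inverse) of the same permutation matrix."""
    original = [None] * len(masked)
    for i, p in enumerate(perm):
        original[p] = masked[i]
    return original

img = [[1, 2], [3, 4], [5, 6]]   # tiny stand-in for pixel rows
masked = mask_rows(img, [2, 0, 1])
```

    Here the permutation plays the role of the shared key; a dense orthogonal matrix would additionally mix pixel values rather than merely reorder rows.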

    Keywords: information security, messenger, messaging, communications, instant messaging systems, one-time password

  • Moving from a university data warehouse to a lake: models and methods of big data processing

    The article examines the transition of universities from data warehouses to data lakes, revealing their potential in processing big data. The introduction highlights the main differences between warehouses and lakes, focusing on the difference in data management philosophy. Data warehouses are typically used for structured data with a relational architecture, while data lakes store data in its raw form, supporting flexibility and scalability. The section "Data Sources Used by the University" describes how universities manage data collected from various departments, including ERP systems and cloud databases. The discussion of data lakes and data warehouses highlights their key differences in data processing and management methods, along with their advantages and disadvantages. The article examines in detail the problems and challenges of the transition to data lakes, including security, scale, and implementation costs. Architectural models of data lakes such as the "Raw Data Lake" and the "Data Lakehouse" are presented, describing various approaches to managing the data lifecycle and business goals. Big data processing methods in lakes cover the use of the Apache Hadoop platform and current storage formats. Processing technologies are described, including the use of Apache Spark and machine learning tools. Practical examples of data processing and of applying machine learning, with work coordinated through Spark, are proposed. In conclusion, the relevance of the transition to data lakes for universities is emphasized, security and management challenges are highlighted, and the use of cloud technologies is recommended to reduce costs and increase productivity in data management.

    Keywords: data warehouse, data lake, big data, cloud storage, unstructured data, semi-structured data

  • Preprocessing of tabular structure data to solve problems of multivalued classification of computer attacks

    The development and application of preprocessing methods for tabular data in multi-label classification of computer attacks is considered. The object of study is a dataset of multi-label records collected with a hardware and software complex developed by the authors. An analysis of the dataset attributes identified the 28 attributes of greatest informational value for classification by machine learning algorithms. The expediency of using autoencoders in information security, in tasks involving datasets whose target attributes are ambiguous, is substantiated. Practical significance: the proposed preprocessing can improve the accuracy of detecting and classifying multi-label computer attacks.
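
    As one simple illustration of attribute ranking (the study's actual informativeness measure is not specified here), columns can be scored by sample variance and the top k retained:

```python
def top_k_by_variance(rows, k):
    """Score each attribute column by sample variance (a crude stand-in for
    informational value) and return the indices of the k highest-scoring
    columns, in ascending index order."""
    cols = list(zip(*rows))

    def variance(col):
        mean = sum(col) / len(col)
        return sum((x - mean) ** 2 for x in col) / len(col)

    ranked = sorted(range(len(cols)), key=lambda i: variance(cols[i]),
                    reverse=True)
    return sorted(ranked[:k])
```

    Constant columns score zero and are dropped first; in practice measures such as mutual information with the target labels are preferred for multi-label data.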

    Keywords: information security, computer attacks, multi-label classification, multivalued classification, dataset analysis, experimental data collection, multivalued data, network attacks

  • Behavioral biometrics of touch screen interaction to identify mobile device users

    Based on an analysis of behavioral characteristics, the main indicators that provide the greatest accuracy in identifying mobile device users are determined. As part of the research, software was written to collect touchscreen data during typical user actions. Identification is implemented on top of machine learning algorithms, and the resulting accuracy is reported. The results obtained in the study can be used to build continuous identification systems.

    Keywords: user behavior, touch screen, continuous identification, biometrics, dataset, classification, deep learning, recurrent neural network, mobile device

  • Construction of encoders and decoders for code division multiplexing

    A class of mathematical methods for code channel division has been developed based on the use of pairs of orthogonal encoding and decoding matrices, the components of which are polynomials and integers. The principles of constructing schemes for implementing code channel combining on the transmitting side and arithmetic code channel division on the receiving side of the communication system and examples of such schemes are presented. The proposed approach will significantly simplify the design of encoding and decoding devices used in space and satellite communication systems.
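
    A classical instance of such orthogonal encoding matrices is the Hadamard family: each channel's symbol is spread by one of the mutually orthogonal rows, the spread signals are summed on the transmit side, and each symbol is recovered by correlation on the receive side. A sketch (the concrete matrices in the paper may differ):

```python
def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two);
    its rows are mutually orthogonal +/-1 spreading codes."""
    h = [[1]]
    while len(h) < n:
        h = ([row + row for row in h]
             + [row + [-x for x in row] for row in h])
    return h

def multiplex(symbols, codes):
    """Transmit side: sum of each channel's symbol spread by its code."""
    n = len(codes[0])
    return [sum(s * c[i] for s, c in zip(symbols, codes)) for i in range(n)]

def demultiplex(signal, code):
    """Receive side: correlate with one code and normalize."""
    return sum(x * c for x, c in zip(signal, code)) // len(code)

codes = hadamard(4)
signal = multiplex([3, -2, 5, 1], codes)
recovered = [demultiplex(signal, c) for c in codes]
```

    Orthogonality makes the recovery exact: cross-channel terms cancel, leaving each symbol scaled by the matrix order.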

    Keywords: telecommunications systems, telecommunications devices, multiplexing, code division of channels, matrix analysis, encoding matrices, synthesis method, orthogonal matrices, integers

  • On the Development of Secure Applications Based on the Integration of the Rust Programming Language and PostgreSQL DBMS

    Currently, key aspects of software development include the security and efficiency of the applications being created, with special attention given to data security and operations involving databases. This article discusses methods and techniques for developing secure applications through the integration of the Rust programming language and the PostgreSQL database management system (DBMS). Rust is a general-purpose programming language that prioritizes safety as its primary objective. The article examines key concepts of Rust, such as strict typing, the RAII (Resource Acquisition Is Initialization) programming idiom, macro definitions, and immutability, and how these features contribute to the development of reliable and high-performance applications when interfacing with databases. The integration with PostgreSQL, shown to be both straightforward and robust, is analyzed, highlighting its capacity for efficient data management at a high level of security and thereby mitigating common errors and vulnerabilities. Rust is currently used less than popular languages such as JavaScript, Python, and Java, owing in part to its steep learning curve, but major companies see its potential: Rust modules are being integrated into operating system kernels (Linux, Windows, Android), Mozilla is developing features for Firefox's Gecko engine, and Stack Overflow surveys show rising usage of Rust. A practical example involving the dispatch of information on class schedules and video content illustrates the advantages of using Rust together with PostgreSQL to build a scheduling management system that ensures data integrity and security.

    Keywords: Rust programming language, memory safety, RAII, metaprogramming, DBMS, PostgreSQL

  • Cascaded code division multiplexing

    A method is proposed for the cascaded connection of encoding and decoding devices to implement code division of channels. It is shown that increasing the number of cascading levels significantly simplifies their implementation and reduces the number of operations performed. In this case, the number of subscriber pairs that can simultaneously exchange information equals the minimum order of the encoding and decoding devices in the system. The proposed approach will significantly simplify the design of encoding and decoding devices used in space and satellite communication systems.

    Keywords: telecommunications systems, telecommunications devices, multiplexing, code division of channels, orthogonal matrices, integers, cascaded connection

  • The method of multiple initial connections as a tool for enhancing information security in peer-to-peer virtual private networks

    The article presents the method of multiple initial connections aimed at enhancing the information security of peer-to-peer virtual private networks. This method ensures the simultaneous establishment of several initial connections through intermediate nodes, which complicates data interception and minimizes the risks of connection compromise. The paper describes the algorithmic foundation of the method and demonstrates its application using a network of four nodes. An analysis of packet routing is conducted, including the stages of packet formation, modification, and transmission. To calculate the number of unique routes and assess data interception risks, a software package registered with the Federal Service for Intellectual Property was developed. The software utilizes matrix and combinatorial methods, providing high calculation accuracy and analysis efficiency. The proposed method has broad application prospects in peer-to-peer networks, Internet of Things systems, and distributed control systems.
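
    One standard matrix method for counting routes, which may differ from the registered software's exact approach: the (i, j) entry of the k-th power of the adjacency matrix counts the walks of k hops from node i to node j. A sketch on a hypothetical four-node network:

```python
def matmul(a, b):
    """Multiply two square matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def count_routes(adj, src, dst, hops):
    """Number of walks of exactly `hops` edges from src to dst: the
    (src, dst) entry of the adjacency matrix raised to the `hops`-th power."""
    power = adj
    for _ in range(hops - 1):
        power = matmul(power, adj)
    return power[src][dst]

# Hypothetical four-node ring topology: 0-1-2-3-0.
ring = [[0, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [1, 0, 1, 0]]
```

    The more distinct routes exist between two peers, the more initial connections can be spread across disjoint intermediaries, which is what raises the cost of interception.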

    Keywords: multiple initial connections, peer-to-peer network, virtual private network, information security, data transmission routes, intermediate nodes, unique routes

  • Development of a secure connection establishment algorithm for peer-to-peer virtual private networks using multi-level cryptographic protection

    The article presents an algorithm for establishing a secure connection in peer-to-peer virtual private networks, aimed at enhancing information security. The algorithm employs modern cryptographic protocols such as IKEv2, RSA, and DH, providing multi-level data protection. The developed algorithm structure includes dynamic generation and destruction of temporary keys, reducing the risk of compromise. The proposed solution is designed for use in corporate network security systems, Internet of Things systems, and distributed systems.
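
    The DH step can be illustrated with the textbook exchange: each side picks an ephemeral exponent, publishes g^a mod p, and both arrive at the same shared secret. The parameters below are deliberately toy-sized for illustration, not the standardized groups IKEv2 actually uses:

```python
import secrets

# Toy parameters for illustration only; real deployments use standardized
# groups (e.g. RFC 3526) with far larger moduli.
P, G = 0xFFFFFFFB, 5  # 2**32 - 5 is prime

def dh_keypair(p=P, g=G):
    """Pick an ephemeral private exponent and derive the public value."""
    priv = secrets.randbelow(p - 3) + 2
    return priv, pow(g, priv, p)

def dh_shared(priv, peer_pub, p=P):
    """Both sides compute the same secret: (g^b)^a = (g^a)^b mod p."""
    return pow(peer_pub, priv, p)

a_priv, a_pub = dh_keypair()
b_priv, b_pub = dh_keypair()
```

    Discarding `a_priv` and `b_priv` after the session is what gives the "dynamic generation and destruction of temporary keys" its forward-secrecy effect.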

    Keywords: virtual Private Network, peer-to-peer network, cryptographic protocols, RSA, Diffie-Hellman, IKEv2, secure connection, multi-layer protection, information security, distributed systems

  • A method for protecting images transmitted via messenger

    This article examines the vulnerability associated with storing image files unencrypted in the cache on a device's hard disk. The nature of this problem and the possible consequences of its exploitation, including leakage of confidential data, abuse of the obtained information, and risks to corporate information systems, are investigated. The main attention is paid to a method of protection against this vulnerability based on masking techniques that use orthogonal matrices. A developed messenger prototype implementing this method is presented: images are transmitted and stored in the file system in masked form, and unmasking is carried out directly in the messenger application itself.

    Keywords: information security, messenger, messaging, communications, instant messaging systems, encryption, orthogonal matrices

  • A review of technologies for deceiving an attacker (traps, decoys, moving target defense, deception platform), their classification and interaction

    The purpose of the article is to review various ways of deceiving attackers in the network and to analyze the applicability and variability of modern deception technologies. The research method is the analysis of existing articles in peer-reviewed Russian and foreign sources, aggregation of research, and formation of conclusions based on the analyzed sources. The review considers attacker-deception technologies: Honeypot traps, Honeytoken decoys, moving target defense (MTD), and Deception platforms. The effectiveness of deception in terms of its impact on a person's mental state is discussed. The article describes different types of Honeypots and classifies them by target, place of introduction, level of interaction, location, type of introduction, homogeneity, and type of activity, as well as their component parts. Different strategies for using traps in the network are discussed: sacrificial lamb, hacker zoo, minefield, proximity traps, redirection screens, and deception ports. A classification of decoys is given, methods of applying them in an organization's network are described, and additional conditions that increase the probability of detecting an attacker with decoys are specified. The basic techniques of the MTD strategy for obfuscating the infrastructure are given, and the interaction of these methods with Honeypot and Honeytoken technologies is described. Research confirming the effectiveness of using MTD in conjunction with traps and decoys is cited, and the difficulties of applying this strategy are pointed out. The Deception platform is described, its distinctive features compared with conventional traps and decoys are outlined, and its possible interaction with MTD is shown. As a result, the main technologies and strategies for deceiving an attacker have been identified and described, their development trends are noted, and their interaction with attackers and counteraction to them are described.

    Keywords: Deception Platform, Honeypot, Honeytoken, Honeynet, MTD

  • Correspondence between old and new quality estimates for binary event classification and statistical detection criteria

    The relationship between "old" and "new" concepts and metrics for assessing the quality of statistical detection criteria and binary event classification is considered. Assessments of the independence and consistency of the analyzed metrics with respect to the volume and composition of the initial input data are provided. Recommendations on using the "new" metrics to assess the quality of detection and binary classification of events are clarified.
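
    The metrics in question follow directly from the confusion matrix; a compact reference implementation:

```python
def binary_metrics(tp, fp, fn, tn):
    """Quality metrics of a binary classifier from its confusion matrix.
    Type I error rate = fp / (fp + tn); Type II error rate = fn / (fn + tp)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # sensitivity, 1 - Type II error rate
    specificity = tn / (tn + fp)     # 1 - Type I error rate
    f_score = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "f_score": f_score}

metrics = binary_metrics(tp=8, fp=2, fn=2, tn=88)
```

    Sweeping the decision threshold and plotting recall against the Type I error rate traces the ROC curve, whose area is the AUC integral metric named in the keywords.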

    Keywords: Type I and Type II errors, accuracy, recall, specificity, F-score, ROC curve, AUC integral metric

  • Analysis of the Structure and Characteristics of Multilayer Autoencoders for Computer Attack Detection

    This study examines the structure and characteristics of multilayer autoencoders (MAEs) used in detecting computer attacks. The potential of MAEs for improving detection capabilities in cybersecurity is analyzed, with a focus on their role in reducing the dimensionality of large datasets involved in identifying computer attacks. The study explores the use of different neuron activation functions within the network and the most commonly applied loss functions that define reconstruction quality of the original data. Additionally, an optimization algorithm for autoencoder parameters is considered, designed to accelerate model training, reduce the likelihood of overfitting, and minimize the loss function.
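
    A minimal forward pass of such an autoencoder, with a sigmoid activation in the encoder and mean squared error as the reconstruction loss (one common choice among the loss functions such studies compare); layer sizes and weights here are illustrative:

```python
import math

def sigmoid(x):
    """Logistic activation, one of the neuron activation functions examined."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_enc, w_dec):
    """Forward pass of a one-hidden-layer autoencoder: sigmoid-activated
    encoding into a lower dimension, then a linear decoding back."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_enc]
    return [sum(w * hi for w, hi in zip(row, h)) for row in w_dec]

def mse(x, x_hat):
    """Mean squared error reconstruction loss minimized during training."""
    return sum((a - b) ** 2 for a, b in zip(x, x_hat)) / len(x)
```

    Attack detection then reduces to thresholding the loss: traffic that the trained model reconstructs poorly is flagged as anomalous.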

    Keywords: neural networks, layers, neurons, loss function, activation function, mobile applications, attacks, hyperparameters, optimization, machine learning

  • System analysis of the process of developing organizational and administrative documentation of the enterprise

    The article is devoted to the complexity of developing organizational and administrative documentation, taking into account the industry in which an organization operates and the departments that carry out its main work. Under the impact of a changing economy, organizations are constantly subject to changes initiated by the relevant regulators in a particular area and by regulatory documents in the form of standards and laws. The main industries in which organizations work are highlighted, along with the number of regulatory documents governing their activities. The organization is analyzed as a system on the basis of systems analysis, with the Sagatovsky method chosen as the approach; following this methodology, the system was analyzed in seven stages, with the main components highlighted and justified at each stage. Life cycle diagrams of the identified "types of end products" have been compiled, taking into account the focus of the departments' work, and a scheme of the process of creating organizational and administrative documentation by the organization's employees and departments has been developed. Analyzing the organization from the standpoint of systems analysis will support the further development of criteria for creating a set of organizational and administrative documentation. Such criteria, and methods for assessing them, will help organizations work with the main regulators in any area far more easily and meet the required standards, which in the future will help not only to improve work but also to avoid negative consequences for the enterprise itself.

    Keywords: the Sagatovsky method, system analysis, goal setting, information security

  • Improving the efficiency of working with databases in PHP based on the use of PDO

    PHP Data Objects (PDO) represents a significant advancement in PHP application development, providing a uniform approach to interacting with database management systems (DBMSs). The article opens with an introduction describing the motivation for PDO, introduced in PHP 5.1, which allows PHP developers to interact with different databases through a single interface, minimising the effort involved in portability and code maintenance. It discusses how PDO improves security by supporting prepared queries, a defence against SQL injection. The main part of the paper analyses the key advantages of PDO: its versatility in connecting to multiple databases (e.g. MySQL, PostgreSQL, SQLite), the ability to use prepared queries to enhance security, improved error handling through exceptions, transaction support for data integrity, and the ease of learning the PDO API even for beginners. Practical examples are provided, including preparing and executing SQL queries, setting attributes via the setAttribute method, and performing operations in transactions, emphasising the flexibility and robustness of PDO. In addition, the paper discusses best practices for using PDO in complex, high-volume projects, such as prepared queries for bulk data insertion, query optimisation, and stream processing for efficient handling of large amounts of data. The conclusion characterises PDO as the preferred tool for modern web applications, offering a combination of security, performance, and code quality. The authors also suggest directions for future research on security test automation and the impact of different data models on application performance.

    Keywords: PHP, PDO, databases, DBMS, security, prepared queries, transactions, programming

  • Development of a malicious traffic detection system to increase the number of detected anomalies

    Relevance of the research topic. Modern cyber attacks are becoming more complex and diverse, which makes classical anomaly detection methods, such as signature-based and heuristic ones, insufficiently effective. It is therefore necessary to develop more advanced network threat detection systems based on machine learning and artificial intelligence technologies. Problem statement. Existing methods of detecting malicious traffic often suffer from high false-positive rates and insufficient accuracy against real network threats. This reduces the effectiveness of cybersecurity systems and makes it difficult to identify new attacks. Purpose of the study. The purpose of this work is to develop a malicious traffic detection system that increases the number of detected anomalies in network traffic through the use of machine learning and AI technologies. Research methods. To achieve this goal, thorough analysis and preprocessing of data from publicly available datasets such as CICIDS2017 and KDD Cup 1999 was carried out.

    Keywords: anomaly detection, malicious traffic, cybersecurity, machine learning, artificial intelligence, signature methods

  • Risk resilience of information security monitoring centers and its modeling

    The purpose of the article is to study the information security of critical parameters of the organization's IT infrastructure processes and its digital infrastructure using security monitoring centers. Risk factors such as adaptability, stability over the medium and long term, and the influence of uncertainties ("white noise") are emphasized. In addition to system analysis and synthesis, the work uses methods of mathematical (simulation, operator) modeling, computational mathematics, and statistics. On this basis, the following main results were obtained: 1) the effects of various attacks on the distributed infrastructure were classified; 2) a scheme, a multiplicative model of the integral interaction of protective measures, and an integral measure of security are proposed; 3) an algorithm was developed to identify the constructed multiplicative model by the least squares criterion, both over the set of factors and over risk classes; 4) an example of an operator equation taking random noise in the system into account is shown. Scientific and practical value: the results can be used to assess system security and to reduce the risks of targeted attacks and the damage they cause. In addition, the proposed schemes will facilitate situational modeling to detect risk situations and assess the damage from their realization.

    Keywords: assessment, sustainability, maturity, information security center, monitoring, risk, management