An empirical study on vulnerability assessment and penetration detection for highly sensitive networks

Abstract: With the advancement of the internet and the emergence of network globalization, security has always been a major concern. As networks are highly vulnerable to denial-of-service attacks, ever more emphasis has been placed on security. Network administrators have long tried their best to improve network security, and attempting penetration testing is among the best ways of verifying system security. With the development of information technology, the security requirements of information systems increase day by day. The use of penetration testing technology enables accurate positioning, accurate detection, and active alarming of security vulnerabilities, and optimizes the monitoring and rectification functions of a network security management and control system. Taking penetration testing technology as one of the core elements of management and control, the risk index model is optimized to make network security management controllable and efficient and to effectively achieve management and control objectives. During its trial operation, the management control platform discussed in this article covered more than 600 network security vulnerabilities in the industry, along with dozens of incidents, which were promptly dealt with and rectified, effectively improving the level of network security management and protection in the industry.


Introduction
The rapid development of information technology makes the application of information systems in various fields increasingly common. At present, information construction in the field of education has achieved rapid development, and modern information technology has been widely applied in the education system.

For example, the use of information systems such as service portals, office automation, educational administration management, financial management, and online teaching has greatly improved work efficiency and service levels [1]. Therefore, it is very important to improve the network security level of each information system. At present, tens of thousands of information systems are involved across the education bureaus of 16 districts, more than 3,000 primary schools and kindergartens, more than 60 universities, more than 30 directly affiliated institutions, and more than 80 secondary vocational schools in Shanghai [2]. Quickly finding security vulnerabilities and repairing them in a timely manner are among the important means of improving the security level of these information systems and ensuring their safe operation [3]. With the rapid evolution of information technology, network security vulnerabilities also continue to appear, so it is crucial to find security vulnerabilities throughout an information system's life cycle and rectify them quickly. Therefore, an effective management and control mechanism that relies on reasonable technical means to achieve accurate location, accurate detection, active alarming, and the monitoring and rectification of information security, supported by a management and control system and technology platform, becomes very necessary.
Penetration testing identifies the exploits and vulnerabilities that exist within an organization, and shows how effective or ineffective the security measures implemented in the IT infrastructure are [4][5][6]. It also justifies additional funding for security controls better than merely pointing to flaws in the operational system. It is very important that a penetration test model a real-world attack [7,8]. A real-world attacker would typically spend many months researching the target, a luxury a penetration tester is rarely afforded [9][10][11]. All penetration tests follow a similar methodology regardless of the actual attack profile being simulated. For target acquisition, the tester gathers information about the target in several ways, such as scanning a website for names, photographs, or contact telephone numbers. The process of internal penetration is shown in Figure 1 and an example application is shown in Figure 2.
This article is structured as follows. Details of state-of-the-art techniques are discussed in Section 2, followed by the contribution of the manuscript. The management control model and its elements are detailed in Section 3. The controllability of explicit channels and the result analysis are presented in Section 4. Section 5 concludes the article.

Literature review
New trends in the information technology era include cloud computing, big data, the internet of things (IoT), and artificial intelligence. Enterprises actively build many service-related information systems to establish fast and convenient connections between customers and the enterprise. The author Ma, W.M. elaborates on information security attack-and-defense exercises for understanding an enterprise's external service information systems. Hydra is a very common website penetration testing tool used by practitioners for assessing vulnerabilities; with it, new investigators may gain practical experience with website vulnerabilities and improve their penetration ability [12,13]. Traditional machine learning algorithms have been widely used in intrusion detection, although scalability, feature engineering effort, and accuracy hinder their adoption in security markets. Deep learning methods can alleviate these shortcomings, as they have been successful in the field of big data: besides eliminating the need for manual feature engineering, deep learning achieves high detection accuracy and can resist mutated attacks. Diro and others propose a long short-term memory (LSTM) network for distributed network attack detection in fog-to-things communication. They identify and analyze key attacks and threats against IoT devices, in particular attacks exploiting wireless communication vulnerabilities. Experiments on two cases show that the deep model is more effective and efficient than traditional machine learning models [14]. As the private cloud spreads, protecting private cloud network security has become the focus of more and more enterprises. An enterprise information security system needs to integrate information security construction into infrastructure construction. Securing an enterprise private cloud network is a systematic, holistic management engineering issue, and any private cloud vulnerability can paralyze the entire network.
Qing and others take basic network security, big data security, and private cloud network security as the starting point, analyze the relevant evaluation indexes, establish the evaluation system model and other key technologies, and expound enterprise private cloud network security [15]. A trust-enhanced distributed authorization architecture is presented by the authors to provide a holistic framework [16]. The model encompasses the notions of "hard" and "soft" trust to determine whether a platform can be trusted for authorization. After detailing the rationale for the overall model, the hybrid model with its "hard" and "soft" trust components is described. The presented architecture is then implemented in the context of authorization for web services. The obtained results demonstrate that the presented model enables better authorization decision making, especially in a distributed environment. To manage federated authorization infrastructures, the authors explore the automatic adaptation of authorization assets (policies and subject access rights). The Self-Adaptive Authorization Framework (SAAF) is designed for managing policy-based federated role/attribute access control authorization infrastructures [17]. The SAAF controller drives a feedback loop to monitor the authorization infrastructure. Potential adaptations for handling malicious behavior are analyzed. A prototype of the SAAF controller is evaluated by simulating malicious behavior, demonstrating the escalation of adaptation [18].
Authorization infrastructures become increasingly difficult to manage as organizations start to federate access to their resources. The authors of this article presented a SAAF to control access to resources through the manipulation of authorization assets, capable of monitoring the usage of resources. They explore the utilization of models for facilitating the autonomic management of federated authorization infrastructures. Classification is required to categorize the behavior exhibited by users, including usage, in order to identify abnormal behavior. SAAF is evaluated by integrating it into an existing authorization infrastructure. Network globalization and the advent of the internet are the main tools for international information exchange [19]. Networks are highly vulnerable to denial-of-service attacks, so much emphasis has been given to security. Network administrators have tried their best by improving their network security; however, attempting penetration testing is the most efficient way to prove whether a system is vulnerable. The internet has brought many good things, but, as with most technological advances, it also brings the criminal hacker [20]. Governments, companies, and private citizens fear that some hacker will break into their web server. The authors of this article detail the skills and attitudes of ethical hackers and how they go about helping their customers. Becoming an ethical hacker, whose work is also called penetration testing, requires mastering many areas, including knowledge of HTML, JavaScript, computer tricks, cracking and breaking, etc. In this article, the authors described hacking techniques and how they operate in the network.

Contribution
In this article, the authors analyze and detail how the use of penetration testing technology enables accurate positioning, accurate detection, and active alarming of security vulnerabilities, and how it optimizes the monitoring and rectification functions of a network security management control system. Taking penetration testing technology as one of the core elements of management and control, the risk index model is optimized to make network security management controllable and efficient and to effectively achieve management and control objectives.

General model of management control
The control function of management is to determine whether implementation conforms to the plan according to the standard, and to ensure the correctness and realization of the plan's objectives by correcting deviations during implementation.

Management control refers to the process in which managers influence other members of an organization to achieve the organizational strategy. The management control process helps achieve a desired goal, e.g., optimized and effective organizational planning and the coordination of normal work order and management system activities. The purpose of management control is to execute strategy (see Figure 3).

Management control elements
The working process of management control, as shown in Figure 4, generally consists of the following three steps: setting control objectives and establishing performance standards; measuring actual work and obtaining deviation information; and analyzing the causes of deviations and taking corrective action (feedback control).
Feedback control is derived from the theory of automatic signal control. According to the feedback signal, it can be divided into "positive feedback" and "negative feedback." Feedback control refers to comparing the actual results of a task after completion against the standard and judging the impact on the next action, so as to play a controlling role [21]. Introducing it into control science, three basic element nodes are set: control targets and standards, deviation information, and corrective measures, which form the key nodes of the whole closed control loop.
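The three-step loop described above can be sketched as a single feedback-control pass. This is an illustrative sketch only; the function name and the tolerance parameter are our own assumptions, not part of the platform discussed in this article.

```python
def control_step(standard: float, actual: float, tolerance: float) -> str:
    """One pass of the feedback-control loop: measure actual work,
    compare it against the standard, and decide whether a corrective
    action is needed (the three element nodes of the closed loop)."""
    deviation = actual - standard
    if abs(deviation) <= tolerance:
        return "within standard: no corrective action"
    direction = "above" if deviation > 0 else "below"
    return f"{direction} standard by {abs(deviation):.2f}: take corrective action"
```

In a real management control system, the "standard" would be a performance target and the corrective action would feed back into the next planning cycle.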

Penetration test

Definition
Before we talk about penetration testing, let us talk about vulnerability scanning. Vulnerability scanning is an examination of information system security, covering systems connected to the internet, applications, and online network equipment components, through the detection of known vulnerabilities and security weaknesses. A typical vulnerability scanner is based on a vulnerability database, which contains information about services, ports, packet types, and other known security issues that pose risks [22,23]. The risk list in the database also contains security recommendations for addressing each vulnerability. Penetration testing of a computer network system simulates hackers' malicious attack methods to evaluate the system's security status. Throughout the evaluation process, the analyst takes the initiative to exploit security vulnerabilities from the position of an attacker, based on active analysis of the system's various weaknesses, technical defects, and vulnerabilities. Applying penetration testing to network security has become an effective technical means of preventing attacks. At the same time, the test must not affect the normal operation of the business system.
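As a toy illustration of the database-driven matching a vulnerability scanner performs, the following sketch looks up discovered services in a vulnerability database. The database entries, service names, and issue strings below are invented purely for illustration; a real scanner consults a maintained database of known issues.

```python
# Hypothetical vulnerability database: (service, version) -> known issues.
# Entries are invented for this sketch; real scanners use maintained feeds.
VULN_DB = {
    ("demo-ftp", "1.0"): ["anonymous login enabled"],
    ("demo-http", "2.2"): ["directory listing exposed"],
}

def scan(discovered_services):
    """Match discovered (service, version) pairs against the database
    and collect the findings, mirroring the lookup a scanner performs."""
    findings = {}
    for name, version in discovered_services:
        issues = VULN_DB.get((name, version))
        if issues:
            findings[(name, version)] = issues
    return findings
```

A production scanner would additionally probe ports and packet types to fingerprint each service before the lookup, and attach the database's remediation advice to each finding.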

Classification
Referring to the classification of software development tests, penetration tests can be divided into the following three categories: (i) Black box tests, in which testers are completely ignorant of the system and obtain information from servers exposed to the public by the company, including DNS, Web, Email, and other internet services. (ii) White box tests, the opposite of black box tests: testers can communicate with non-IT staff face to face to collect effective information, including network topology, employee information, the website, and part of the application code [24,25]. The purpose of this test is to simulate the operations of staff within the enterprise. (iii) Covert tests, which test the inspected unit's ability to monitor, respond to, and recover from information security events. Generally, when a penetration test is executed, the network monitoring personnel will watch for changes in the network. The unit's security management department knows the test time window in advance, but essentially no one else does, so that the test better achieves its purpose.

Model study
To study network information penetration technology and penetration detection technology, we plan to establish a simulation model, which consists of the following parts: two networks with different information security levels, namely network A (high information security level, a known controllable network) and network B (low information security level, an unknown uncontrollable network); n known information channels e1, e2, …, en and m unknown information channels h1, h2, …, hm between network A and network B; x network entities g1, g2, …, gx in network A, including u specific information requesters s1, s2, …, su; and y network entities f1, f2, …, fy in network B, including v specific information publishers o1, o2, …, ov. A specific information requester hopes to obtain specific information from a specific information publisher, the specific information publisher wants to pass the specific information to the specific information requester, and the two transmit specific information through a specific information access flow (SIF, sensitive information flow) [26][27][28][29]. A specific information access flow has three key characteristics: a specific purpose (destination), a specific channel (method), and specific content. The information filtering system is built on e1, e2, …, en. The network information penetration detection system (NIPDS) identifies and blocks the transmission of suspicious information.
Define the following sets: E is divided into two subsets, the explicit controllable information channel subset EC (NIPDS can identify the information channels in EC and can control them) and the explicit uncontrollable information channel subset EU (NIPDS can identify the information channels in EU but cannot control them or identify the specific information), so E = EC ∪ EU. The goal of the infiltrating party (NIPT) is to enable as many requesters in S as possible to obtain the content C, that is, to maximize the number of elements in the SIF set. As the key parameters that determine the size of SIF are the sizes and characteristics of S, O, and E, the infiltrator adopts micro-disordering methods to try to achieve this goal: (iii) turning uncontrollable information channels into controllable information channels, from micro-disorder to macro-order, that is, EU → EC; and (iv) making each csif_i ∈ CSIF go from micro-controllable to macro-controllable.
That is, the information channel can be gradually monitored by the method of step-by-step degradation, and the controllability of information can be improved.
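The step-by-step degradation EU → EC described above can be sketched as simple set bookkeeping. The channel names and helper functions below are illustrative assumptions, not the NIPDS implementation.

```python
def degrade(channels: dict, name: str) -> dict:
    """Step-by-step degradation EU -> EC: once a channel's traffic can be
    identified and filtered, reclassify it as explicitly controllable."""
    if channels.get(name) == "EU":
        channels[name] = "EC"
    return channels

def controllability(channels: dict) -> float:
    """Fraction of explicit channels that are controllable, |EC| / |E|."""
    return sum(1 for state in channels.values() if state == "EC") / len(channels)
```

Each successful degradation raises the controllability ratio, which is the macro-order the infiltrating party's monitoring works toward.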

Monitoring of open information channels
For NIPT, the purpose is to diffuse C to all of S. To achieve the above-mentioned means (1), O must be diffused to S through an open channel, so we can monitor those unknown channels by sampling. Suppose that NIPT randomly publishes through a fraction p of the publishers each time (the release ratio, 0 < p ≤ 1), and that q sampling points are taken. The sampling coverage is then r = f(p, q), 0 < r ≤ 1. According to the principle of probability, we get

r = p + p(1 − p) + p(1 − p)² + … + p(1 − p)^(q−1). (1)

Equation (1) is a geometric series, so by the series-sum formula:

r = 1 − (1 − p)^q. (2)

From equation (2), the formula for the number of sampling points can be obtained:

q = ln(1 − r)/ln(1 − p). (3)

Example: let v = 1,000 and let 3 sources O_i be randomly released each time, so p = 0.003. To obtain 99% coverage, the number of sampling points needs to be q = ln(1 − 0.99)/ln(1 − 0.003) ≈ 1,535, that is, about 1,535 samples are needed; if v = 3,000 (p = 0.001), then q ≈ 4,605. Another important application of equations (2) and (3) is to estimate the size |O| of the information source O. The q curves computed through equation (3) are presented in Figure 5 for various values of p, including v = 1,000 (p = 0.003) and v = 3,000 (p = 0.001).
It can be seen that the curvatures of the two curves differ, and a family of q curves is obtained by setting different p values (as shown in Figure 5). Thus, after a certain amount of sampling, the p value can be estimated by comparing the convergence curve of the published-source coincidence degree in the sampled data against the q curves, thereby obtaining an approximate value of |O|. Many existing curve-similarity calculation methods can be used for this processing, and other distribution models can also be used for estimation.
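Equations (2) and (3) can be checked numerically with a short sketch. The function names are ours; note that the worked figures in the example (1,535 and 4,605) appear to correspond to the small-p approximation ln(1 − p) ≈ −p, while the exact inversion gives slightly smaller counts.

```python
import math

def coverage(p: float, q: int) -> float:
    """Expected coverage after q samples, equation (2): r = 1 - (1 - p)^q."""
    return 1.0 - (1.0 - p) ** q

def samples_needed(p: float, r: float) -> float:
    """Exact inversion, equation (3): q = ln(1 - r) / ln(1 - p)."""
    return math.log(1.0 - r) / math.log(1.0 - p)

def samples_needed_approx(p: float, r: float) -> float:
    """Small-p approximation ln(1 - p) ~ -p, matching the worked figures."""
    return -math.log(1.0 - r) / p
```

For p = 0.003 and r = 0.99 the approximation gives about 1,535 samples, and for p = 0.001 about 4,605, reproducing the example.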

Controllability of explicit channels and the result analysis
The purpose of explicit channel controllability analysis is to handle channels whose transmitted content or protocols cannot be identified. The discussion of the experimentation is provided in this section.
For data transmitted over such channels (such as the content carried on SSL), if the microscopic method is still used for analysis, it will often yield half the result with twice the effort. It is necessary to use higher-level macro-anomaly detection methods over massive samples, such as data stream spectrum analysis and statistical analysis. However, current technical analysis is still developing in the direction of pure technology, and the author believes that, when dealing with these problems, it is necessary to elevate the role of humans and combine humans and machines. The key issue is to establish entities that have both powerful computer system support and a strong team of experts [30][31][32][33], together with a complete system that minimizes dependence on individual people and can continue to operate unaffected by personnel changes in the entity [34].
A key reason why many intelligent human-machine systems established over the years have disappeared is that the system was imperfect, leaving it unable to operate and develop sustainably. The methodology is applied to solve open complex giant systems, such as "the comprehensive integration method from qualitative to quantitative," under which "the controllable process of the explicit channel is not complicated" [35,36]. Its characteristic is that humans participate in the analysis process and the capability of man-machine integration is fully used, as shown in Figure 6.
For example, the analysis of a simple unknown communication protocol can be handled by applying the comprehensive integration method. Proposition: analyze the communication laws of the unknown protocol to make it macroscopically orderly and controllable.
Raw data: A sample of captured communication data.
Knowledge system: Experts' understanding and experience of the basic laws of network communication.
Qualitative analysis by experts: (i) for simple communication protocols, a complex encryption algorithm reduces availability; (ii) for safe transmission, any application-layer communication protocol must contain a signaling part, including message payload length, sequence number, verification fields, etc., even if the protocol is encrypted; (iii) some commonly used simple encryption algorithms are known.
Perform nonlinear analysis on the samples: use the same original data multiple times, observe the resulting communication message samples, and find the changing laws of several fields; then propose a hypothesis for each control field and a qualitative hypothesis about the encryption method.
Quantitative verification: Use a large number of samples to test the coincidence rate of each hypothesis. Overturn hypotheses with counterexamples, based on new evidence. Reanalyze, revise the original hypothesis, and re-verify.
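The quantitative-verification step can be sketched as computing a coincidence rate over captured samples. The length-field hypothesis and the sample bytes below are invented for illustration; in practice the hypotheses come from the experts' qualitative analysis above.

```python
def coincidence_rate(samples, hypothesis):
    """Quantitative verification: the fraction of captured messages that
    satisfy a hypothesised rule about the protocol's control fields."""
    return sum(1 for msg in samples if hypothesis(msg)) / len(samples)

def length_field(msg: bytes) -> bool:
    """Hypothetical rule: the first two bytes are a big-endian length
    field covering the rest of the message (the signaling part)."""
    return int.from_bytes(msg[:2], "big") == len(msg) - 2

# Invented captured samples: two fit the hypothesis, one is a counterexample.
samples = [bytes([0, 3]) + b"abc", bytes([0, 1]) + b"x", bytes([9, 9]) + b"??"]
rate = coincidence_rate(samples, length_field)
```

A low rate (or a single clear counterexample) sends the analyst back to revise the hypothesis and re-verify, as described above.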
Quantitative conclusion: To obtain sufficient macro-communication laws, it is not necessary to fully understand the details of the unknown communication protocol; it is enough to make the communication traffic controllable programmatically.
Four databases are used in this research work, and 1,042 papers were retrieved through submission of the search string. Some papers were selected on the basis of their title and abstract, and some were removed because they had no direct relation to the expected contribution. Table 1 shows the primary studies for each database, which are also graphically represented in Figures 7-9 for drawing better inferences.
Furthermore, the management control platform is also discussed in this article: during the trial operation period it covered more than 600 network security vulnerabilities in the industry, and dozens of incidents were handled and rectified in a timely manner, effectively improving the industry's network security management and protection level. Through the results of the penetration tests and the performance of rectification, this article established a vulnerability risk index to quantify the vulnerability risk of each institution's network system. By establishing a vulnerability security analysis and assessment system, timely notification, rectification, and verification of vulnerabilities can be realized, thereby effectively improving the industry's network security management level. Through the implementation of the project, the work system of network security management and service has been strengthened, and the network security reporting and processing procedures of relevant units have been standardized at the technical and administrative levels, thereby strengthening the standardization of industry network security management.

Figure 8: Selected criteria study and the preceding rate.
Conclusion

During the trial operation, the management control platform discussed in this article covered more than 600 network security vulnerabilities in the industry, along with dozens of incidents, which were promptly dealt with and rectified, effectively improving the level of network security management and protection in the industry. Through the penetration test results, this article establishes a vulnerability risk index and quantifies the vulnerability risk of each institution's network system. The use of penetration testing technology enables accurate positioning, accurate detection, and active alarming of security vulnerabilities, and optimizes the monitoring and rectification functions of the network security management control system. Through the implementation of the project, the work system of network security management and service was strengthened, and the network security reporting and processing procedures of relevant units were standardized at the technical and administrative levels, thus strengthening the standardization of network security management in the industry.

Conflict of interest:
Authors state no conflict of interest.