LightBlog

Thursday 30 May 2019

#5 OWASP Tutorials - Security Tests Integrated in Development and Testing Workflows

OWASP-Tutorials



Security Testing in the Development Workflow 

During the development phase of the SDLC, security testing represents the first opportunity for developers to ensure that the individual software components they have developed are security tested before they are integrated with other components and built into the application.

Software components can include software artifacts such as functions, methods, and classes, as well as application programming interfaces, libraries, and executable files.

For security testing, developers can rely on the results of source code analysis to verify statically that the developed source code does not include potential vulnerabilities and is compliant with the secure coding standards. Security unit tests can further verify dynamically (i.e., at run time) that the components function as expected.

Before integrating both new and existing code changes into the application build, the results of static and dynamic analysis should be reviewed and validated.

The validation of source code before integration into the application build is usually the responsibility of a senior developer.

Such senior developers are the subject matter experts in software security, and their role is to lead the secure code review.

They should decide whether to accept the code to be released in the application build, or to require further changes and testing.

This secure code review workflow can be enforced via formal acceptance, as well as a check in a workflow management tool.

For example, assuming the typical defect management workflow used for functional bugs, security bugs that have been fixed by a developer can be reported on a defect or change management system.

The build master can look at the test results reported by the developers in the tool and grant approval for checking the code changes into the application build.

Security Testing in the Test Workflow

After components and code changes are security tested by developers and checked into the application build, the next step in the software development process workflow is to perform tests on the application as a whole entity.

This level of testing is usually referred to as integrated test and system-level test. When security tests are part of these testing activities, they can be used to validate both the security functionality of the application as a whole, as well as the exposure to application-level vulnerabilities.

These security tests on the application include both white-box testing, such as source code analysis, and black-box testing, such as penetration testing.

Gray-box testing is similar to black-box testing. In a gray-box test, it is assumed that the tester has some partial knowledge of the application's session management, which should help in understanding whether the logout and timeout functions are properly secured.

The target for the security tests is the complete system that could potentially be attacked, and includes both the whole source code and the executable.

One peculiarity of security testing during this phase is that it is possible for security testers to determine whether vulnerabilities can be exploited and expose the application to real risks.

These include common web application vulnerabilities, as well as security issues that have been identified earlier in the SDLC with other activities such as threat modeling, source code analysis, and secure code reviews.

Usually, testing engineers, rather than software developers, perform security tests when the application is in scope for integration system tests.

Such testing engineers have security knowledge of web application vulnerabilities and of black-box and white-box security testing techniques, and they own the validation of security requirements in this phase.

In order to perform such security tests, it is a prerequisite that the security test cases are documented in the security testing guidelines and procedures.

A testing engineer who validates the security of the application in the integrated system environment might release the application for testing in the operational environment (e.g., user acceptance testing).

At this stage of the SDLC (i.e., validation), the application's functional testing is usually a responsibility of QA testers, while white-hat hackers or security consultants are usually responsible for security testing.

Some organizations rely on their own specialized ethical hacking team to perform such tests instead of engaging third-party consultants.


Since these tests are the last resort for fixing vulnerabilities before the application is released to production, it is important that such issues are addressed as recommended by the testing team.

The recommendations can include code, design, or configuration changes.

At this stage, security auditors and information security officers discuss the reported security issues and analyze the potential risks according to information risk management procedures.

Such procedures might require the development team to fix all high-risk vulnerabilities before the application can be deployed, unless such risks are acknowledged and accepted.


Developers’ Security Tests


Security Testing in the Coding Phase: Unit Tests

From the developer's perspective, the main objective of security tests is to validate that code is being developed in compliance with secure coding standards requirements.

Developers should have their coding artifacts (such as functions, methods, classes, APIs, and libraries) functionally validated before they are integrated into the application build.

The security requirements that developers have to follow should be documented in secure coding standards and validated with static and dynamic analysis.

If the unit testing activity follows a secure code review, unit tests can validate that the code changes required by the secure code review have been properly implemented.

Secure code reviews and source code analysis via source code analysis tools help developers identify security issues in source code.

Using unit tests and dynamic analysis (e.g., debugging), developers can validate the security functionality of components, as well as verify that the countermeasures being developed mitigate any security risks previously identified through threat modeling and source code analysis.

A good practice for developers is to build security test cases as a generic security test suite that is part of the existing unit testing framework.

A generic security test suite could be derived from previously defined use and misuse cases to security test functions, methods, and classes.

A generic security test suite might include security test cases to validate both positive and negative requirements for security controls such as:
  • Identity, authentication and access control
  • Input validation and encoding
  • Encryption
  • User and session management
  • Error and exception handling
  • Auditing and logging
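As an illustration, a generic suite of this kind can be organized with a standard unit testing framework. The sketch below uses Python's unittest and a hypothetical `sanitize_username` component; the function name and its validation rule are assumptions for illustration, not part of any specific standard.

```python
import re
import unittest

# Hypothetical component under test: a simple input sanitizer.
def sanitize_username(value):
    """Allow only 3-20 alphanumeric characters; reject everything else."""
    if not isinstance(value, str) or not re.fullmatch(r"[A-Za-z0-9]{3,20}", value):
        raise ValueError("invalid username")
    return value

class InputValidationTests(unittest.TestCase):
    # Positive requirement: well-formed input is accepted unchanged.
    def test_accepts_valid_username(self):
        self.assertEqual(sanitize_username("alice01"), "alice01")

    # Negative requirements: hostile input is rejected, not passed through.
    def test_rejects_sql_metacharacters(self):
        with self.assertRaises(ValueError):
            sanitize_username("x' OR '1'='1")

    def test_rejects_script_tags(self):
        with self.assertRaises(ValueError):
            sanitize_username("<script>alert(1)</script>")

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(InputValidationTests)
    unittest.TextTestRunner(verbosity=2).run(suite)
```

In a full suite, each security control category from the list above would get its own test class, so the security tests can run as one gate alongside the existing functional unit tests.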

Developers empowered with a source code analysis tool integrated into their IDE, secure coding standards, and a security unit testing framework can assess and verify the security of the software components being developed.

Security test cases can be run to identify potential security issues that have root causes in source code: besides input and output validation of parameters entering and exiting the components, these issues include authentication and authorization checks done by the component, protection of the data within the component, secure exception and error handling, and secure auditing and logging.

Unit test frameworks such as JUnit, NUnit, and CUnit can be adapted to verify security test requirements.

In the case of security functional tests, unit-level tests can test the functionality of security controls at the software component level, such as functions, methods, or classes.

For example, a test case could validate input and output validation (e.g., variable sanitization) and boundary checks for variables by asserting the expected functionality of the component.
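To make this concrete, here is a minimal sketch of such a boundary-check test in Python. The `validate_transfer_amount` function and its limits are hypothetical, standing in for whatever component is actually under test.

```python
def validate_transfer_amount(amount):
    """Accept amounts within assumed business limits; reject everything else."""
    if not isinstance(amount, (int, float)) or isinstance(amount, bool):
        raise ValueError("amount must be numeric")
    if amount < 0.01 or amount > 10_000:
        raise ValueError("amount out of range")
    return round(float(amount), 2)

# Boundary probes: exercise both edges of the valid range plus invalid values.
assert validate_transfer_amount(0.01) == 0.01       # lower boundary accepted
assert validate_transfer_amount(10_000) == 10000.0  # upper boundary accepted
for bad in (0, -1, 10_000.01, "100", None):
    try:
        validate_transfer_amount(bad)
        raise AssertionError(f"{bad!r} should have been rejected")
    except ValueError:
        pass  # negative requirement satisfied: out-of-range input is rejected
```

The same pattern, asserting behavior at and just beyond each boundary, applies to any variable whose range is constrained by a security requirement.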

The threat scenarios identified with use and misuse cases can be used to document the procedures for testing software components.

In the case of authentication components, for example, security unit tests can assert the functionality of setting an account lockout, as well as the fact that user input parameters cannot be abused to bypass the account lockout (e.g., by setting the account lockout counter to a negative number).
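A sketch of what such a unit test might look like, assuming a hypothetical `AccountLockout` class; the threshold and the API are illustrative only:

```python
class AccountLockout:
    """Hypothetical lockout policy: lock the account after `threshold` failures."""
    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = 0

    def record_failure(self):
        self.failures += 1

    def reset(self, failures=0):
        # Defensive check: the counter may only be reset to a non-negative
        # integer below the threshold, so it cannot be abused (e.g., set to a
        # large negative number to effectively disable the lockout).
        if not isinstance(failures, int) or failures < 0 or failures >= self.threshold:
            raise ValueError("invalid lockout counter value")
        self.failures = failures

    def is_locked(self):
        return self.failures >= self.threshold

# Positive test: the lockout engages after the configured number of failures.
lockout = AccountLockout(threshold=3)
for _ in range(3):
    lockout.record_failure()
assert lockout.is_locked()

# Negative test: a negative counter value must be rejected, not stored.
try:
    lockout.reset(-999)
    raise AssertionError("negative counter should be rejected")
except ValueError:
    pass
assert lockout.is_locked()  # still locked; the bypass attempt failed
```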

At the component level, security unit tests can validate positive assertions as well as negative assertions, such as errors and exception handling.

Exceptions should be caught without leaving the system in an insecure state, such as a potential denial of service caused by resources not being de-allocated (e.g., connection handles not closed within a final statement block), or a potential elevation of privileges (e.g., higher privileges acquired before the exception is thrown and not re-set to the previous level before exiting the function).

Secure error handling can validate against potential information disclosure via informative error messages and stack traces.
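The concerns above, releasing resources in a finally block, dropping elevated privileges, and hiding internal error detail from the caller, can be sketched as follows. The `run_privileged_query` helper is hypothetical and the `elevated` flag merely stands in for real privilege acquisition; only the sqlite3 usage reflects a real API.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.ERROR)
log = logging.getLogger("app")

def run_privileged_query(connection, query):
    """Hypothetical sketch: run a query and guarantee the cursor is closed and
    privileges are dropped even when the query raises an exception."""
    elevated = True              # stand-in for acquiring higher privileges
    cursor = connection.cursor()
    try:
        return cursor.execute(query).fetchall()
    except Exception as exc:
        # Log the detail internally, but surface only a generic message so
        # stack traces and SQL errors are not disclosed to the user.
        log.error("query failed: %s", exc)
        raise RuntimeError("An internal error occurred") from None
    finally:
        cursor.close()           # resource always de-allocated
        elevated = False         # privileges always dropped before returning

# Usage: a bad query yields only the generic error message, never the SQL error.
conn = sqlite3.connect(":memory:")
try:
    run_privileged_query(conn, "SELECT * FROM no_such_table")
except RuntimeError as err:
    assert str(err) == "An internal error occurred"
```

A security unit test for this component would assert exactly these two behaviors: the generic message reaches the caller, and the resource is released on every path.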

Unit-level security test cases can be developed by a security engineer who is the subject matter expert in software security and is also responsible for validating that the security issues in the source code have been fixed and can be checked into the integrated system build.

Typically, the manager of the application build also makes sure that third-party libraries and executable files are security-assessed for potential vulnerabilities before being integrated into the application build.

Threat scenarios for common vulnerabilities that have root causes in insecure coding can also be documented in the developer's security testing guide.

When a fix is implemented for a coding defect identified with source code analysis, for example, security test cases can verify that the implementation of the code change follows the secure coding requirements documented in the secure coding standards.

Source code analysis and unit tests can validate that the code change mitigates the vulnerability exposed by the previously identified coding defect.

The results of automated secure code analysis can also be used as automatic check-in gates for version control: for example, software artifacts cannot be checked into the build with high or medium severity coding issues.
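A check-in gate of this kind might look like the following sketch. The JSON report format and the `gate` function are assumptions for illustration; real analyzers emit formats such as SARIF, and the gate would typically run as a pre-commit or CI step that exits non-zero to block the check-in.

```python
import json

# Severities that block a check-in, per the policy described above.
BLOCKING_SEVERITIES = {"high", "medium"}

def gate(report_json):
    """Return 0 (allow check-in) or 1 (block) based on finding severities."""
    findings = json.loads(report_json).get("findings", [])
    blocking = [f for f in findings if f.get("severity") in BLOCKING_SEVERITIES]
    for f in blocking:
        print(f"BLOCKED: {f['severity']} issue in {f['file']}: {f['rule']}")
    return 1 if blocking else 0

# Example scanner report with one low and one high severity finding.
report = json.dumps({"findings": [
    {"severity": "low", "file": "util.c", "rule": "unused-variable"},
    {"severity": "high", "file": "auth.c", "rule": "strcpy-buffer-overflow"},
]})
assert gate(report) == 1  # the high-severity finding blocks the check-in
```

In a version control hook, the script would end with `sys.exit(gate(...))` so the non-zero status rejects the commit.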

Functional Testers’ Security Tests 

Security Testing During the Integration and Validation Phase: Integrated System Tests and Operation Tests

The main objective of integrated system tests is to validate the "defense in depth" concept, that is, that the implementation of security controls provides security at different layers.

For example, the lack of input validation when calling a component integrated with the application is often a factor that can be tested with integration testing.

The integration system test environment is also the first environment where testers can simulate real attack scenarios as they could potentially be executed by a malicious external or internal user of the application.

Security testing at this level can validate whether vulnerabilities are real and can be exploited by attackers.

For example, a potential vulnerability found in source code can be rated as high risk because of the exposure to potential malicious users, as well as because of the potential impact (e.g., access to confidential information).

Real attack scenarios can be tested with both manual testing techniques and penetration testing tools. Security tests of this type are also referred to as ethical hacking tests.

From the security testing perspective, these are risk-driven tests whose objective is to test the application build that is representative of the version of the application being deployed into production.

Including security testing in the integration and validation phase is critical for identifying vulnerabilities due to the integration of components, as well as for validating the exposure of such vulnerabilities.

Application security testing requires a specialized set of skills, including both software and security knowledge, that are not typical of security engineers.

As a result, organizations are often required to train their software developers on ethical hacking techniques, security assessment processes, and tools.

A realistic scenario is to develop such resources in-house and document them in security testing guides and procedures that take into account the developer's security testing knowledge.

For example, a so-called "security test cases cheat list or checklist" can provide simple test cases and attack vectors that can be used by testers to validate exposure to common vulnerabilities such as spoofing, information disclosure, buffer overflows, format strings, SQL injection and XSS injection, XML, SOAP, canonicalization issues, denial of service, and managed code and ActiveX controls (e.g., .NET).

A first battery of these tests can be performed manually with a very basic knowledge of software security.

The first objective of the security tests might be the validation of a set of minimum security requirements. These security test cases might consist of manually forcing the application into error and exceptional states and gathering knowledge from the application's behavior.

For example, SQL injection vulnerabilities can be tested manually by injecting attack vectors through user input and by checking whether SQL exceptions are thrown back to the user.

The evidence of a SQL exception error might be a manifestation of a vulnerability that can be exploited.
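The probe described above can be illustrated with a deliberately vulnerable handler. The `vulnerable_lookup` function is a contrived stand-in for an application endpoint; in a real test the tester would submit the quote character through the application's user input and inspect the HTTP response for database error text.

```python
import sqlite3

def vulnerable_lookup(user_input):
    """Deliberately vulnerable handler: concatenates user input into SQL and
    leaks the raw database error back to the caller (the exact behavior the
    manual test is probing for)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    try:
        rows = conn.execute(
            "SELECT name FROM users WHERE name = '" + user_input + "'"
        ).fetchall()
        return "200 OK: " + repr(rows)
    except sqlite3.Error as exc:
        return "500 Error: " + str(exc)   # information disclosure

# The tester's probe: a lone quote breaks the SQL syntax; a database error in
# the response is evidence of a likely injectable parameter.
assert vulnerable_lookup("alice'").startswith("500 Error")
# A benign value behaves normally, confirming the error is input-dependent.
assert vulnerable_lookup("alice").startswith("200 OK")
```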

A more in-depth security test might require the tester's knowledge of specialized testing techniques and tools.

Besides source code analysis and penetration testing, these techniques include, for example, source code and binary fault injection, fault propagation analysis and code coverage, fuzz testing, and reverse engineering.
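As a small illustration of fuzz testing, the sketch below throws random byte strings at a hypothetical parser and records any exception that is not a documented, graceful rejection. Real fuzzers (e.g., coverage-guided ones) are far more sophisticated; this only shows the core idea.

```python
import random

def parse_record(data: bytes):
    """Hypothetical parser under test: expects `key=value` ASCII records."""
    text = data.decode("ascii")        # may raise UnicodeDecodeError
    key, value = text.split("=", 1)    # may raise ValueError if '=' is missing
    if not key:
        raise ValueError("empty key")
    return key, value

def fuzz(parser, iterations=1000, seed=42):
    """Feed random byte strings to the parser and collect unexpected crashes.
    Documented, graceful rejections (ValueError/UnicodeDecodeError) pass."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 64)))
        try:
            parser(blob)
        except (ValueError, UnicodeDecodeError):
            pass                        # graceful rejection is acceptable
        except Exception as exc:        # anything else is a defect to triage
            crashes.append((blob, exc))
    return crashes

# A robust parser should survive the whole run with no unexpected exceptions.
assert fuzz(parse_record) == []
```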

The security testing guide should provide procedures and recommend tools that can be used by security testers to perform such in-depth security assessments.

After integration system tests, the next level of security testing is to perform security tests in the user acceptance environment.

There are unique advantages to performing security tests in the operational environment. The user acceptance test (UAT) environment is the one that is most representative of the release configuration, with the exception of the data (e.g., test data is used in place of real data).

A characteristic of security testing in UAT is testing for security configuration issues. In some cases these vulnerabilities might represent high risks.

For example, the server that hosts the web application might not be configured with minimum privileges, essential services might not be disabled, and the web root directory might not be cleaned of test and administration web pages.
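A configuration probe for such leftover pages can be sketched as a simple path sweep. The path list and the `fetch` stub are illustrative assumptions; in a real UAT check, `fetch` would be an HTTP client returning the status code of each request against the UAT host.

```python
# Well-known test and administration paths that should not be served in UAT.
LEFTOVER_PATHS = ["/test.php", "/admin/", "/phpinfo.php", "/backup.zip", "/console"]

def find_exposed_paths(fetch, paths=LEFTOVER_PATHS):
    """Return the paths the server serves (status 200) instead of hiding."""
    return [p for p in paths if fetch(p) == 200]

# Simulated server responses standing in for real HTTP status codes.
responses = {"/admin/": 200, "/phpinfo.php": 200}
fake_fetch = lambda path: responses.get(path, 404)

exposed = find_exposed_paths(fake_fetch)
assert exposed == ["/admin/", "/phpinfo.php"]  # these should be flagged
```

Any path in the result would be reported as a configuration finding to be removed before release.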



