Discuss the difficulties of staying on top of viruses and security issues surrounding a network implementation.
Current real-world computer security efforts usually focus on external threats and treat the computer system itself as a trusted system. Some researchers consider this a disastrous mistake, pointing out that the distinction is the cause of much of the insecurity of current computer systems: once an attacker has subverted one part of a system that lacks fine-grained internal security, he or she usually gains access to most or all of that system's features. Because computer systems are very complex and cannot be guaranteed to be free of defects, this security stance tends to produce insecure systems.
The 'trusted systems' approach has been predominant in the design of many software products, due to the long-standing policy of emphasizing functionality and 'ease of use' over security. This has had unfortunate security consequences.
Software flaws, especially buffer overflows, are often exploited to gain control of a computer or to cause it to operate in an unexpected manner. Many development methodologies rely on testing to ensure the quality of released code, but this process often fails to discover unusual potential exploits. The term "exploit" generally refers to a small program designed to take advantage of a discovered software flaw, whether remote or local. The code from the exploit ...