Thursday, June 19, 2008

The Insecurity in the Systems Intelligence: A Very Brief Study of the Failures of Information Security Projects

“Best is to know and know you know. Next best is to know that you don’t know. Third best is knowing, but not realizing it. Worst is not to know that you don’t know.”
Ancient Proverb (Business, 2002, p. xxxi)
Introduction

Why do IT projects fail so often? What impacts, if any, do these failures have on the confidentiality, integrity and availability [CIA] of our information systems? A fundamental and direct relationship between these two processes (security and failure) undoubtedly exists. The impact of project failure on security is a certainty, as obvious as it is dangerous for the operational stability of our IT systems; moreover, systems project failure has always been very expensive and has caused much pain and sorrow. Why, then, are we not more careful about developing and implementing these crucial projects? These were some of the questions explored during our last Xterior security team weekly meeting. In this paper, I briefly document my conclusions, based on the literature review I conducted, about the failure of information systems security projects (Courses Material, 2007).

Insecurity Preambles

It is common knowledge, as the old saying states, that "necessity is the mother of invention." I could not find a better phrase with which to epitomize the degree of civilization of any given society: innovations are among the best of our human manifestations, and they stem from our lack of adequate vital resources, from the insecurity of not being free from our most basic physiological needs. The advances of any given culture or historical period are represented precisely by its tools and by the knowledge employed to solve its challenges, which typify the distributed human intelligence expressed through the development of arts, sciences, architecture, engineering, weaponry, industry and governmental institutions. Thus the history of invention describes and enumerates the technologies used from the Stone Age through the contemporary and pervasive, as seemingly intrusive and insecure, Information Age.

Our inventions, i.e., the ability to create tools, objects and ideas for reaching our overarching goals and objectives, together with our creativity, have helped us become organized as nations with distinctive cultures and philosophies (the East and West divide). Thus we have survived and adapted to the inclemency of our environment, and we have triumphed over all known beasts in nature with the exception of only one: man himself. "Homo homini lupus," i.e., "man is a wolf to man," is a famous Roman proverb attributed to Plautus (d. 184 BC) and used by Thomas Hobbes as one of the main ingredients in the authoritarian social contract of his "De Cive" (1651). Here lies the beginning of our tribulations, insecurities and project failures; it appears ingrained in our own needs, instincts, self-interested reasons and imperfect souls. It is no different now than it was in the Stone Age: we need to survive. So, if necessity is the mother of invention, then scarcity is the grandmother of all our insecurities and wrongdoings. Therefore, information security deals not only with technology; it involves other factors that are darker, more deep-rooted and less understood than any form of attack, because scarcity has been, is, and will be the source of all our human maladies.
System Project Failure

Failure is lack of success. To fail is to lose, in a way, by not being able to deliver. If a system is not working as expected, we say "the system has failed"; by the same token, if a project fails, we can say "the project, like the people involved in it, has fallen short of meeting its goals," or "they could not grow or develop fast enough to deliver successfully what was expected or needed at the time" (a scalability problem). At the end of the day, system project failures carry significant costs, not only to the project team's morale but in terms of dollars, time and other resources, including the loss of critical information, which has proven catastrophic in many business cases. Nowadays, when our lives revolve around multiple projects, it is a good idea to refresh our memories and figure out what exactly a project is and why projects fail. As I see it, a project is an organized and systematized activity that employs resources, i.e., people, technology, and processes or operations, within a well-defined start date, milestones or phases, and end date (duration and schedule), with the aim of achieving a relatively unique service or product within an organization (Sommerville, 2000).
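That working definition of a project can be made concrete in a few lines of code. The following is only an illustrative sketch, and the project name, dates, milestones and resources shown are invented for the example:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Project:
    """Minimal model of the definition above: an organized activity that
    employs resources within a well-defined start date, milestones, and
    end date, aimed at a unique service or product."""
    name: str
    deliverable: str                      # the unique service or product
    start: date
    end: date
    milestones: list = field(default_factory=list)  # (date, label) pairs
    resources: list = field(default_factory=list)   # people, technology, processes

    def duration_days(self) -> int:
        """Schedule length implied by the start and end dates."""
        return (self.end - self.start).days

# Hypothetical example project, for illustration only
p = Project("IAM rollout", "single sign-on service",
            date(2008, 1, 7), date(2008, 6, 30),
            milestones=[(date(2008, 3, 1), "pilot")],
            resources=["2 engineers", "LDAP directory"])
print(p.duration_days())  # 175
```

The point of the sketch is that every element of the definition (resources, schedule, deliverable) is explicit; when any of them is left vague in a real project charter, the failure factors discussed below have room to grow.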
Learning from Failure

The study of failure can clear the path to success. What worked some years ago to solve a problem might not work today. To investigate why systems projects usually fail, we need to understand how we plan, administer, monitor and evaluate them; in other words, we are asking what Project Management [PM] is. PM is a life-cycle-based discipline used to achieve desired and projected goals within a determined schedule and specific budgetary constraints (Why, 2007). Many experts agree that one of the main reasons systems or Information Technology [IT] projects fail so often is the IT managers' lack of knowledge, understanding or experience of Systems Engineering [SE] (Sommerville, 2000, p. 9), and of how to apply effectively the PM Body of Knowledge [PMBOK] to their specific situations. For instance, Wimmel & Wisspeintner (2003) showed how important it is for security managers to be familiar with the application of the European "Information Technology Security Evaluation Criteria" [ITSEC] (a successor to the U.S. "Orange Book," the TCSEC) and/or the Common Criteria [CC]; both standards are commonly used for designing and modeling trusted e-commerce systems. Such is the case of Plow of the Sea, Inc., which now wants to translate its retailing business experience to the Internet.

Sten E. Vesterli (2004), based on surveys, reported that only one quarter of systems projects are completed roughly as expected, meeting most of their specified deliverables according to schedule and budget. He added that two quarters of IT projects deliver their final outputs with serious deficiencies in functionality, largely exceeding their price and time tags. Finally, one quarter of all projects deliver nothing at all, yet consume all, and even more than, the money, time and resources assigned to them at inception or during planning (p. 1-7). No wonder IT systems project managers are fired with conspicuous frequency.
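Vesterli's rough breakdown can be tallied in a few lines to make the headline number explicit. The category labels below are my paraphrase of his quarters, not his wording:

```python
# Vesterli's (2004) reported breakdown of IT project outcomes,
# expressed as fractions of all projects surveyed.
outcomes = {
    "roughly on schedule and budget": 0.25,       # "one quarter"
    "delivered with serious deficiencies": 0.50,  # "two quarters"
    "delivered nothing despite full spend": 0.25, # "one quarter"
}

# Sanity check: the quarters must account for every project.
assert abs(sum(outcomes.values()) - 1.0) < 1e-9

# Everything outside the first category is some degree of failure.
failure_rate = 1.0 - outcomes["roughly on schedule and budget"]
print(f"{failure_rate:.0%}")  # 75%
```

In other words, on these figures three out of every four projects fall short in some way, which is the context for the causes discussed next.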

Vesterli (2004) explained that another cause of the aforementioned poor system project outcomes is the excessive enthusiasm created by the new features and rapid deployment of powerful emergent technologies. Most IT staff simply cannot resist them and become too ambitious, including unproven devices or software mechanisms, and so joining the frontlines of what has been called the "bleeding edge." Since much newly released off-the-shelf hardware, firmware and software has not been properly tested or certified, using these products creates tremendous stress in supporting their deployment and maintenance. Prudence is required in designing and developing system projects. On the other hand, Vesterli (idem) pointed out that some IT managers offer the argument "if it works, don't fix it," pretending that legacy systems will keep working as they used to, facing the same attacks as before, within new and dangerous environments full of threats that were in no way known, or even envisioned, when the systems were first deployed. It suffices to recall the case of zero-day attacks to understand that upgrading and patching systems is not only necessary; it should be a continuous and compulsory goal enforced on all IT managers by the top security managers in any organization (p. 1-7). Ted Hendy (2000), the systems security architect and engineer at the Pentagon for the Office of the Secretary of Defense [OSD], stressed the importance of human analysis and the use of security systems engineering processes and techniques to prevent major problems in planning, implementing and managing IT systems. Hendy (idem) summarized very well why systems fail: "Though the technology exists to do great things in this field, without thoughtful analysis and preparation many of those attempts are doomed to fail" (p. 1).
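The "patch continuously" point above amounts to routinely comparing what is deployed against what is current. This is only a toy sketch of that idea; the package names and version numbers are invented, and a real inventory check would consult a vulnerability feed rather than a hard-coded table:

```python
# Hypothetical inventory of installed software versus the latest
# patched releases published by the vendors (both invented here).
installed = {"webserver": "2.0.1", "ssl-lib": "0.9.7"}
latest = {"webserver": "2.0.1", "ssl-lib": "0.9.8"}

def outdated(installed, latest):
    """Return the names of components whose installed version
    no longer matches the latest patched release."""
    return sorted(name for name, version in installed.items()
                  if latest.get(name, version) != version)

print(outdated(installed, latest))  # ['ssl-lib']
```

Run on a schedule, even a check this simple turns patching from an ad hoc reaction into the continuous, enforceable goal the paragraph argues for.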

Moreover, we know all too well that complexity creates havoc and is very difficult to manage. The complexity of Information Systems [IS], compounded by the rapid change in the available technology and by the level of sophistication reached in IS interoperability and component interactions, has created confusion and pressure for security and IT managers. Indeed, system complexity is one of the major causes behind the other factors mentioned above. In reality, studying the causes of systems project failure means looking for answers as to why systems projects fail and what to do to implement more trusted IS (Ivory & Alderman, 2005, p. 1-13).

Sommerville (2000) reminded us that systems are not only inert parts or soulless machines working as a whole with their own emergent characteristics, properties and capabilities; systems also involve people and are built to support them and aid human activity (p. 3). Therefore, we need to zero in on the users and operators too, and on the risks associated with how aware, trained and educated those users and operators are in using the systems infrastructure and accessing critical information (Ivory & Alderman, 2005, p. 1-13). This factor is outlined by Beckley (2007) as inadequate training; he also presents other causes of system project failure, such as radical and unconventional changes; the lack of enterprise-level support and of resources or budget; and communication failures (Slide 53). In addition, Figure 1, below, encapsulates the concepts behind systems project failures and the difference between what the users want and what they finally get:




Conclusion

At the end of the day, some factors of failure appear at the beginning of the project, like the lack of communication among the project team members, or of a shared understanding of the system requirements or of roles and responsibilities. Some factors develop in the last phase of the project, like the lack of proper testing or evaluation practices. However, system projects can fail at any phase of their development process, whether for lack of monitoring or contingency measures, for concurrency, interdependency and tasking issues, or for event- or people-driven deadlocks. Systems project failure poses a great danger given its frequency. For us, as Xterior security managers, it has come to mean prevention instead of containment; it means alertness instead of lamentations. It also means due diligence and a lot of work and patience to estimate the process carefully and not to underestimate anything; accordingly, security system projects need to adhere to systems security engineering standards, such as the SSE-CMM, to reach an adequate level of information assurance and system reliability for Xterior's customers.
A successful project not only meets its goals and objectives on time and within budget, but also implements and translates accurately the business model requirements into the security technology controls that squarely satisfy the security policy of the corporation.

Bibliography

Business - The Ultimate Resource. (2002). London, GBR: Bloomsbury Publishing Plc. Retrieved November 30, 2007, from http://wf2dnvr5.webfeat.org:80/dIEzI117/url=http://site.ebrary.com/lib/cecybrary/Doc?id=10022156&ppg=1.
Hendy, T. (2000, May 9). The art of security engineering. US Army Information Systems Engineering Command. Retrieved November 30, 2007, from http://www.itoc.usma.edu/Workshop/2000/Abstracts/TM2_1.pdf.
Hobbes, T. (1651). De Cive. Retrieved December 1, 2007, from http://socserv2.socsci.mcmaster.ca/~econ/ugcm/3ll3/hobbes/hobbes1.
Ivory, C. & Alderman, N. (2005, September 28). Can Project Management Learn Anything From Studies Of Failure In Complex Systems? Project Management Journal, Vol. 36, Issue 3, p. 5-16. Retrieved November 30, 2007, from http://search.ebscohost.com/login.aspx?direct=true&db=buh&jid=1NR&loginpage=Login.asp&site=ehost-live.
Mitchell, A. (2004, December 14). E-Commerce Missing Link: Software Requirement Specifications. E-Commerce Times. Retrieved December 2, 2007, from http://www.ecommercetimes.com/story/38919.html.
Sommerville, I. (2000). Critical Systems Engineering: Processes and techniques for developing critical systems. PowerPoint presentation. Retrieved December 2, 2007, from http://www.comp.lancs.ac.uk/computing/resources/IanS/Ian/Courses/CritSys-2004/PDF-notes/Introduction.pdf.
Vesterli, S. E. (2004, October 19). Why IT Projects Fail − and how to avoid it. Retrieved December 2, 2007, from http://vesterli.com/papers/why_it_projects_fail.pdf.
Wimmel, G. & Wisspeintner, A. (2003, June 17). Extended Description Techniques For Security Engineering. Institut für Informatik, Technische Universität München, D-80290 München, Germany. Retrieved December 2, 2007, from http://www4.in.tum.de/publ/papers/WW01.pdf.
Why do Projects Fail? (2007). JISC InfoNet. Retrieved December 2, 2007, from http://www.jiscinfonet.ac.uk/InfoKits/project-management/pm-intro-1.2.

Plowed Results | Resultados Arados