Tuesday, September 3, 2013

Homework #5


Nancy Leveson

Nancy Leveson currently researches system safety, software safety, software and system engineering, and human-computer interaction.  She is a Professor of Aeronautics and Astronautics at MIT and also consults with industry on how to prevent accidents.







Faulty Software Articles:

"An Investigation of the Therac-25 Accidents" by Nancy Leveson and Clark Turner is about the well known Therac-25 accidents where many serious injuries and deaths occurred due to massive overdoses of radiation.  This was in large part due to the operator interface that lead the user to believe no dose was given when in fact it had been.  The user would then press the proceed command which resulted in a radiation overdose.

"After Stroke Scans, Patients Face Serious Health Risks" by Walt Bogdanich was also about radiation overdose, but in this case, depending on how the user programmed the machine, an increased dosage would be given when the user intended to lower the dosage.



"FDA: Software Failures Responsible for 24% of all Medical Device Recalls" by Paul Roberts this article was more about the potential for wireless medical devices to be hacked.  I was obsessed with the show "Homeland" last year and in one of the most nail-biting episodes...SPOILER ALERT...the VP's pacemaker is hacked and results in him dying from a heart attack.  The article even mentions that two U.S. Congressmen had their wireless medical devices reviewed due to a presentation on medical device hacking.

"The Role of Software in Spacecraft Accidents" by Nancy G. Leveson talks in depth about the flaws in the safety culture of software this includes:  complacency and discounting of software risk, diffusion of responsibility and authority, limited communication channels and poor information flow, inadequate system and software engineering, inadequate review activities, ineffective system safety engineering, flawed test and simulation environments, and inadequate human factors engineering.  Back to the diffusion of responsibility and authority:  in the article they mention that roles were not clearly allocated and that is where some of the responsibilities/ tasks fall through the cracks. I know I always somehow relate something I read back to nursing, so here I go again: In the healthcare world, during an emergency for example, roles are assigned, responsibilities are dolled out and everyone tries to work together to stabilize the patient.  Your role is clearly understood and you know what your responsibilities are.  Somehow this needs to be more clear when working as a team on software.

"Who Killed the Virtual Case File?" by Harry Goldstein is about the costly VCF debacle in 2005.  The number of communication breakdowns involved in this fiasco amazed me.  

"IG:  FBI's Sentinel program still of track, over budget" by Gautham Nagesh is looking at VCF's predecessor- Sentinel.  This article was written in October of 2010 and at this time, the Sentinel was only in phase two of four and struggling to meet deadlines.  A piece of good advice:  "The IG recommends the Bureau prioritize the development of applications and function that would be most beneficial for agents managing their casework." According to Wikipedia the Sentinel program was showcased in March 2012.

There was one quote that really stood out for me, and it comes from the 2010 Radiation Follies article.  Mr. Heuser, one of the affected people mentioned in the article, says, "When you are in a car and it backs up, it goes beep, beep, beep...if you fill the washing machine up too much, it won't work.  There is no red light that says you are over radiating."  Good point.  Why is this the case?  When I have finally graduated with my degree in Computer Science from the College of Charleston, I will be much more proficient and knowledgeable about coding and the concepts behind it, but I don't know anything about radiation machines and spacecraft.  I don't use a spacecraft to get to work daily, and I definitely have never performed a CT scan on myself.  I do, however, wash laundry in the washing machine quite frequently and drive my car almost daily.  I think this might be one of the big problems when it comes to faulty software:  the software engineers do not have enough knowledge of how the software needs to be used and how the user will potentially use it.

One way to fix this problem is better communication and collaboration between the engineers and the potential users of the product.  Nancy Leveson mentions that "the first and most important step in solving any problem is understanding the problem well enough to create effective solutions."  Referring to the VCF debacle of 2005, Higgins explains, "The customer should be saying, 'This is what we need.' And the contractor should be saying, 'Here's how we're going to deliver it.' And those lines were never clear."  There also needs to be better training of the users: in the article by Bogdanich, it says that "GE trainers never fully explained the automatic feature" that caused the increase in radiation dosage.

Better communication among all involved is essential to get a project done, done safely, and done right.  By better communication I mean:  everyone has a role and knows the responsibilities of that role in regard to writing the software, the software engineers hold frequent meetings with the customer to make sure they are meeting the customer's needs, and once the software is completed, the training that users go through is thorough.
