Stakeholder involvement, motivation, responsibility, communication: How to design usable security in e-Science

e-Science projects face a difficult challenge in providing access to valuable computational resources, data and software to large communities of distributed users. On the one hand, the raison d'être of these projects is to encourage members of their research communities to use the resources provided. On the other hand, the threats these resources face from online attacks require robust and effective security to mitigate the risks. This raises two issues: ensuring that (1) the security mechanisms put in place are usable by the different users of the system, and (2) the security of the overall system satisfies the security needs of all its different stakeholders. A failure to address either of these issues can seriously jeopardise the success of e-Science projects. The aim of this paper is, first, to provide a detailed understanding of how these challenges present themselves in practice in the development of e-Science applications and, second, to examine the steps that projects can take to ensure that security requirements are correctly identified and that security measures are usable by the intended research community. The research presented in this paper is based on four case studies of e-Science projects. Security design traditionally relies on expert analysis of risks to the technology and the deployment of appropriate countermeasures against them. These case studies, however, highlight the importance of involving all stakeholders in the process of identifying security needs and designing secure and usable systems. For each case study, transcripts of the security analysis and design sessions were analysed to gain insight into the issues and factors that surround the design of usable security. The analysis concludes with a model explaining the relationships between the most important factors identified, including a detailed examination of the roles of stakeholder responsibility, motivation and communication in the ongoing process of designing usable, secure socio-technical systems such as e-Science.