Inah Omoronyia

Designing Secured and Privacy Preserving Software Systems, Privacy and Security Requirements

Main Content

Overview

Most of my research revolves around building frameworks, tools and techniques for engineering secure and privacy-preserving software systems, as well as ensuring regulatory compliance.

A while ago, I published an article on privacy-by-design in ITNOW (link). Although the article presents a dystopian view of why baking privacy into software design is hard, it also outlines a bright future where good privacy becomes an inherent feature of secure software.
I believe that better software design (including its process, requirements, implementation and testing) is a pathway to an optimal balance between data-inspired technological innovation, regulatory compliance and the privacy needs of end-users. Achieving this balance is a core motivation for my research.

Completed PhDs

    1. Oluwafemi Samuel Olukoya (Thesis: Privacy Analysis of Mobile Apps)
    2. Saad Abdullah S Alahmari (Thesis: A Model for Describing and Encouraging Cyber Security Knowledge Sharing to Enhance Awareness)
    3. Peter Inglis (Thesis: Privacy Conflict Analysis in Web Interaction Models)

Some interesting research topics

A key challenge to privacy management in frequently changing contexts is that users are often unaware of when, and for what purpose, sensitive information about them is being collected, analysed or disseminated. Traditional theories suggest users should be able to manage their privacy, yet empirical evidence suggests that users often lack sufficient awareness to make privacy-sensitive decisions. This points to the need for more systematic approaches that make privacy awareness an explicit consideration in software systems.
This research aims to contribute approaches that support the explicit consideration of privacy awareness in the engineering of socio-technical systems. Specifically, we investigate the notion of privacy awareness requirements as a novel and systematic means of capturing the privacy objectives of users, and the awareness required to effectively manage those objectives. The target is to address challenges ranging from methods and processes for identifying privacy awareness requirements, to optimal representation and analysis mechanisms.
The aim here is to understand and build privacy models applicable to software systems and suitable for reasoning about information disclosure in a sociotechnical ecosystem. We investigate the evolutionary nature of sociotechnical ecosystems and the type and nature of interactions that threaten privacy, deriving social and technical dimensions of how inherent properties of sociotechnical ecosystems can generate privacy problems and impact users' ability to preserve their privacy objectives.

One technical challenge is developing techniques for modelling, learning and building profiles of adversaries in complex information-flow networks. Such a profile may describe a single entity or a collective of adversaries in the ecosystem.
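As a minimal illustration of the kind of analysis involved, the sketch below models an information-flow network as a directed graph and computes which actors sensitive information can reach from a given source. The actor names and network structure are purely hypothetical, and a real adversary profile would be far richer than simple reachability.

```python
from collections import defaultdict, deque

def build_flow_graph(edges):
    """Adjacency list for a directed information-flow network."""
    graph = defaultdict(list)
    for sender, receiver in edges:
        graph[sender].append(receiver)
    return graph

def exposure_profile(graph, source):
    """All actors that information originating at `source` can reach,
    i.e. the set of potential adversaries along any flow path."""
    reached, frontier = set(), deque([source])
    while frontier:
        node = frontier.popleft()
        for nxt in graph[node]:
            if nxt not in reached:
                reached.add(nxt)
                frontier.append(nxt)
    return reached

# Hypothetical ecosystem: a user's data flows into an app and onwards.
edges = [("user", "app"), ("app", "analytics"),
         ("analytics", "ad_broker"), ("app", "cloud")]
profile = exposure_profile(build_flow_graph(edges), "user")
# profile == {"app", "analytics", "ad_broker", "cloud"}
```

Even this toy profile shows how an actor the user never interacts with directly (the ad broker) can become a recipient of their information along a flow path.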
For numerous scenarios involving the use of socio-technical systems, it is normally difficult for users to keep pace with the impact of frequently changing context on their privacy. Such change may render certain privacy requirements unsatisfiable or, in other cases, irrelevant. It may also be the case that the policies used to ensure the satisfaction of privacy requirements become ineffective. Ensuring the continuous satisfaction of privacy in such environments therefore requires some level of automation.
This research investigates the viability of using software agents to automate user privacy. Key outputs are software models, frameworks and architectures that can be instrumented at design time to help achieve the runtime behaviour of privacy-preserving agents.
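To make the idea of a runtime privacy-preserving agent concrete, here is a deliberately simplified sketch: an agent holds a user's privacy requirements and checks each attempted disclosure against them. The class names, data items and purposes are invented for illustration; actual agents from this research would handle richer context and requirement evolution.

```python
from dataclasses import dataclass

@dataclass
class PrivacyRequirement:
    """A user's requirement: a data item may only be used for these purposes."""
    data_item: str
    allowed_purposes: set

class PrivacyAgent:
    """Runtime monitor that checks disclosures against stated requirements."""
    def __init__(self, requirements):
        self.requirements = {r.data_item: r for r in requirements}

    def on_disclosure(self, data_item, purpose):
        """Return True iff the disclosure satisfies the user's requirement."""
        req = self.requirements.get(data_item)
        if req is None:
            # No stated requirement: conservatively block and raise awareness.
            return False
        return purpose in req.allowed_purposes

agent = PrivacyAgent([PrivacyRequirement("location", {"navigation"})])
agent.on_disclosure("location", "navigation")   # True
agent.on_disclosure("location", "advertising")  # False
```

The design point the sketch illustrates is that the check runs at disclosure time, so a context change (a new purpose appearing) is caught without the user having to monitor it themselves.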
The goal of privacy by design is to take privacy requirements into account throughout the system development process, from the conception of a new IT system up to its realisation. The underlying motivation is that by considering privacy as part of the system development lifecycle, there is an increased likelihood of building more privacy-friendly systems. In part, however, the challenge of achieving privacy by design is the lack of a privacy justificatory framework for reasoning about the satisfaction of privacy requirements. Such a framework can serve as a reference point for software designers to evaluate the extent to which the software being developed will preserve user privacy.
This research is a step towards a privacy justificatory framework for designers of privacy-critical systems. The focus is to discover privacy patterns via a series of empirical studies and prototype implementations on e-learning platforms. We will then investigate how these patterns can be used as guiding principles to support the inclusion of privacy requirements throughout the system development life cycle. One key expected output is a framework to express, study and select privacy design patterns. We also aim to develop tools that support system designers in applying privacy design patterns across development lifecycles and methodologies.
Privacy management in software typically involves scenarios where ownership of information is shared by multiple users, e.g. pictures on Facebook, or personal information in the custody of a dynamic group of users along an information-flow path. The main software engineering challenge is privacy requirements negotiation. Specifically, when others are told or given access to a person's private information, they become co-owners of that information, and co-owners of private information need to negotiate mutually agreeable privacy rules about telling others.

The problem with negotiation is that different owners have different privacy requirements that may conflict. Where conflicts are identified, a resolution is needed on the optimal privacy policy (or disclosure protocol) that best satisfies each individual's privacy requirements. The aim of this research is to investigate how game theory could be used to address this problem. Expected research outputs include heuristics and techniques for the efficient negotiation of privacy requirements.
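One classical game-theoretic formulation that fits this setting is Nash bargaining: each co-owner assigns a utility to every candidate policy, and the negotiated outcome maximises the product of utility gains over a disagreement point. The sketch below is only an illustration of that formulation, not the negotiation technique developed in this research; the policies and utility values are invented.

```python
from math import prod

def negotiate(policies, utilities, disagreement=0.0):
    """Nash bargaining sketch: choose the policy maximising the product of
    each co-owner's utility gain over the disagreement point."""
    def nash_product(policy):
        gains = [u[policy] - disagreement for u in utilities]
        if any(g <= 0 for g in gains):
            return float("-inf")  # unacceptable to at least one co-owner
        return prod(gains)
    return max(policies, key=nash_product)

# Two co-owners of a shared photo with conflicting preferences.
policies = ["friends_only", "friends_of_friends", "public"]
alice = {"friends_only": 0.9, "friends_of_friends": 0.6, "public": 0.1}
bob   = {"friends_only": 0.4, "friends_of_friends": 0.7, "public": 0.9}
negotiate(policies, [alice, bob])  # "friends_of_friends"
```

Note how the product rule selects the compromise policy even though it is neither owner's first choice, which is exactly the behaviour a negotiation mechanism for conflicting privacy requirements must exhibit.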
There is generally limited research on privacy for individual users of assistive technologies who may be deemed vulnerable, for example individuals with different forms of disability. Part of the problem is the unbalanced 'take it or leave it' trade-off commonly used for privacy management in software systems. The privacy of this class of users is frequently ignored, as they often have no real choice but to use these technologies to improve the quality of their day-to-day living.
This research lies on the boundary between computer science and engineering. We seek to give assistive technology users the privacy they require, by design and based on their needs, without compromising the function or safety of the assistive technologies. The focus is on producing a privacy management system for assistive technologies, with an emphasis on people with visual impairments.