Richard Cloete, Chris Norval and Jatinder Singh
Compliant and Accountable Systems Group (http://www.compacctsys.net/)
University of Cambridge
To appear in VRST 2020 (https://vrst.acm.org/vrst2020/)
XR systems (Virtual, Augmented and Mixed Reality) are growing in prominence. With any new technology comes risk, and XR is no exception. Indeed, the novel interaction modalities and the immersive nature (manipulating/impeding senses) of XR present some unique risks. This is particularly important given the adoption of the technology by more risk-averse sectors such as manufacturing, surgery, training, construction and the military, to name a few.
Auditability of XR systems can help in this regard. That is, capturing information about how XR systems are used, how they operate and how various components (internal and external) interact can help to unpack failures/harms, be they due to usage, technical, administrative or social complications. This 'audit data' can also help identify failures before they occur, demonstrate that systems operate/perform in line with regulations and design specifications, and, more generally, can help towards making XR more compliant, transparent and safer -- properties that will undoubtedly affect the technology's adoption.
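To make the notion of 'audit data' more concrete, the sketch below illustrates, purely as an example, one way an XR application might emit structured audit records (event, timestamp, component, details) for later review. The AuditLogger class, field names and event names are our own assumptions for illustration; they do not represent any particular library, format or the practices of our respondents.

```python
import json
import time
import uuid


class AuditLogger:
    """Illustrative sketch only: appends structured audit records to a log file.
    Field and event names are assumptions, not a prescribed or standard format."""

    def __init__(self, path, component):
        self.path = path
        self.component = component            # e.g. "hand-tracking", "renderer"
        self.session_id = str(uuid.uuid4())   # groups records from one session

    def record(self, event, **details):
        entry = {
            "session": self.session_id,
            "component": self.component,
            "event": event,                   # e.g. "session_start", "tracking_lost"
            "timestamp": time.time(),
            "details": details,               # arbitrary context for later audit
        }
        with open(self.path, "a") as f:
            f.write(json.dumps(entry) + "\n")


# Hypothetical usage: events an XR application might record during a session.
log = AuditLogger("audit.log", component="hand-tracking")
log.record("session_start", headset="example-hmd", firmware="1.2.3")
log.record("tracking_lost", duration_ms=420)
log.record("session_end")
```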
We conducted a survey exploring the state of audit practices 'in the wild' for a range of emerging technologies and stakeholders.
Our research proposal for this work was reviewed and approved by our institution's ethics committee (for further information, see: www.research-integrity.admin.cam.ac.uk/research-ethics), and consent was obtained from all participants. All questions were optional and randomised, data was anonymised, and participants gave their explicit consent for us to use their responses.
For this paper, we compiled our results based only on those participants who said that they work with XR (AR/VR/MR) technologies; we did this in order to make three broad and general points, as elaborated in our paper.
The following questions relate to your organisation, covering which technologies you work with, which sectors your organisation targets, etc.
- 1 employee
- 2-10
- 11-50
- 51-250
- More than 250 employees
- Virtual Reality
- Augmented Reality
- Mixed Reality
- Internet of Things
- Artificial Intelligence / Machine Learning
- Robotics
- Other
Q3 - You’ve indicated that you work with technologies not listed above. Please elaborate in the text field below.
- Arts/Culture
- Civil Society (NGOs)
- Construction
- Education
- Finance
- Gaming
- Government
- Manufacturing
- Medical
- Mining/oil/gas
- Military
- Research
- Security
- Other
- Businesses (B2B)
- Consumers (B2C)
- Government/public sector/charities
- Other
Q7 - You’ve indicated that your organisation uses other approaches to selling. Please elaborate in the text field below.
- Developing software products (programming, database management, testing, architecting, etc.)
- Analysing data (data science, user data analysis, etc.)
- Management (leading a software team, overseeing the design of products, etc.)
- Sales
- Other
Q9 - You’ve indicated that you are involved in other/additional roles to the ones listed above. Please elaborate in the text field below.
- Security
- Other
- Manufacturing
- Medical
- Construction
- Education
- Gaming
- Finance
- Military
- Government
- Civil Society (NGOs)
- Arts/Culture
- Research
- Mining/oil/gas
The following questions relate to the recording of data during the design, development and testing stages of products.
- Debug information (e.g. test records, performance information, debug logs, diagnostics, etc.)
- Intermediate/generated data (e.g. interaction events, aggregated data, function outputs, etc.)
- Meta information (versioning information, operating system, hardware, resources consumed, etc.)
- Other
- Provenance data (e.g. records describing the origins of data or events, the flow of data, how data was collected, any pre-processing, etc.)
- Product/system data (e.g. data from sensors, peripheral devices such as mouse or keyboard, etc.)
- Error/crash data (e.g. crash reports, error logs, etc.)
- Input data (e.g. uploaded files, speech, text input, etc.)
Q13 - What are the reasons for recording data during design/development/testing? Check all that apply.
- To verify correct operation (e.g. according to a design specification)
- To improve performance
- To identify bugs/issues/failures
- Other
- To demonstrate compliance (e.g. with regulations such as the General Data Protection Regulation)
- To track design/development/testing progress
- For insurance purposes
The following questions relate to data recorded from products while being used by customers.
Q14 - Is any data recorded while customers are using your products (i.e. during operation)?
Q15 - What types of data are recorded while your products are being used (i.e. are being operated)? Select all that apply.
- Product usage data (e.g. app start and end time, user preferences and settings, etc.)
- Error/crash data (e.g. crash reports, error logs, etc.)
- Meta information (versioning information, operating system, hardware, total resources, etc.)
- Other
- Provenance data (e.g. records describing the origins of data or events, the flow of data, how data was collected, any pre-processing, etc.)
- Product/system data (e.g. data from sensors, peripheral devices such as mouse or keyboard, etc.)
- Intermediate/generated data (e.g. interaction events, aggregated data, function outputs, etc.)
- User input data (e.g. uploaded files, user speech, text input, etc.)
Q17 - What are the main reasons for recording data during product usage? Check all that apply.
- Advertising
- To demonstrate compliance with regulations (e.g. GDPR, CCPA)
- To verify correct operation
- To identify why a product has failed/crashed
- To improve performance
- Detecting potential issues/faults
- Detecting fraud
- Other
- For insurance purposes
The following questions relate to tools and/or libraries used for recording data. Note this is at any stage of the product life cycle, including during design/development/testing and the use of products by customers.
Q19 - Are any 3rd party (i.e. developed/created/owned outside of your organisation) tools and/or libraries used for recording data?
- No
- Yes
Q20 - Are any tools/libraries used to record data at any stage (design/development/testing/use by customers) developed in-house?
- No
- Yes
Q21 - Please elaborate on why 3rd-party tools/libraries are not sufficient. Please also indicate in which phases issues are experienced (e.g. design/development/testing/use).
Q22 - Please elaborate on what aspects of 3rd party tools/libraries are particularly useful for recording data, and in which phases (e.g. design/development/testing/use) the features are most helpful.
Q23 - How are tools/libraries developed in-house better suited to recording data? Check all that apply.
- They have features not available from 3rd party tools/libraries
- In-house tools/libraries are better suited for data aggregation
- In-house tools/libraries are more performant
- In-house tools/libraries allow for more control over data format and representation
- Other
- In-house tools/libraries are better at maintaining consistency