My Research Vision
My main goal is to produce high-quality research in Human Factors that enables a better understanding of human performance and technology. After spending over ten years in industry, I endeavour to use my knowledge of real-world systems and operations to drive research objectives. I believe this will allow me to deliver significant impact for industry in the Human Systems Integration of new and novel technologies. I also think it is important not only to disseminate knowledge to the peer community in relevant journals and conferences, but also to play an active part in steering policy and research focus.
Dale has a background in Cognitive Psychology and Human-Computer Interaction. After completing his PhD scholarship at the University of Wales, Dale joined QinetiQ (formerly the Defence Evaluation and Research Agency) and worked primarily on defence programmes. He is a Chartered Psychologist, a Chartered Scientist and an Associate Fellow of the British Psychological Society.
During his time at QinetiQ, Dale initially worked at MoD Portsdown West, providing Human Factors advice on Royal Navy visual displays. After two years working on maritime systems, he joined the Aerospace Group at Farnborough, where he attained Technical Focus for Human Factors within the QinetiQ Aerospace Division. Here Dale gained experience in applying Human Factors knowledge across programmes ranging from pervasive networks and ubiquitous computing to commercial flight decks and advanced HMI for the Future Offensive Air System (FOAS). For several years Dale was the Human Factors lead for the UK MoD Applied Research Programme (ARP) on Autonomy & Mission Management for unmanned systems. This work eventually led to the successful demonstration of a fast jet controlling multiple unmanned air systems, in which Dale led the design of the displays implemented in the fast jet cockpit (Tornado F2A).
Dale was also the Human Factors lead for QinetiQ on the first two phases of the UK civil Unmanned Air Vehicle programme ASTRAEA (Autonomous Systems Technology Related Airborne Evaluation & Assessment).
Dale has presented many conference papers on Human Factors for unmanned and autonomous systems, and was invited to deliver a talk at MIT in 2006. He has acted as a reviewer for several conferences, including HCI, CHI and AUI; he sits on the Executive Committee of Interaction, the British Computer Society specialist group for HCI, and was recently invited to participate in a NASA/FAA/CAA panel discussing UAS in civil airspace.
- Richards, D., Stedmon, A., Shaikh, S., & Davies, D. (2014) Responding to disaster using autonomous systems. The Ergonomist, No. 534, 8-9.
- Greenbank, C. & Richards, D. (2014) Human-System Integration within a multidisciplinary design team: An inside view. Presented at the Royal Institution of Naval Architects Conference on Marine Design, Coventry, UK.
- Stedmon, A.W., Richards, D., Shaikh, S.A., Huddlestone, J. & Davison, R. (2014) Cyber-specifications: Capturing user requirements for cyber security investigations. In B. Akhgar, A. Staniforth & F. Bosco (Eds), Cyber Crime and Cyber Terrorism: Investigator's Handbook. Syngress, Elsevier: Waltham, MA. 43-58.
- Richards, D. (2014) To delegate or not to delegate: A human factors perspective on autonomous driving. Presented at the European Conference on Human Centred Design for Intelligent Transport Systems, Vienna, Austria.
- Richards, D. (2012) Human Factors and Safety with Unmanned Systems. Unmanned Aerial Systems UK, Cranfield University, Shrivenham, UK, 25-28 June, 2012.
- Richards, D. (2010) Manning the Unmanned. Bristol UAV International Conference: Bristol, UK.
- Baxter, J. & Richards, D. (2010) Whose goal is it anyway? User interaction in an autonomous system. Association for the Advancement of Artificial Intelligence Conference: Atlanta, USA.
- Richards, D. & Howitt, S. (2006) Cognitive Engineering and HMI design of a UAV Ground Control Station. Presented at CERI Human Factors of UAVs, Arizona, May 2006.
- Clark, P., Banbury, S., Richards, D., & Dickson, B. (2004) Methodologies for Evaluating Decision Quality. HPSA II Conference, Human Performance, Situation Awareness, and Automation Technology. Daytona Beach, FL. March 22-25, 2004.
- Richards, D., & McDougall, S. (1999) Road Traffic signs: How implicit category knowledge improves learning. In D. Harris (ed.) Engineering Psychology and Cognitive Ergonomics, 329-336.
- Future Flight Deck: To develop new pilot-centred interface technologies that improve situation awareness and decision making, and increase the availability of aircraft in adverse weather. A further major objective is to develop novel system architectures that will allow the safe and expedient operation of commercial aircraft with a reduced number of crew.
- Growing Autonomous Mission Management Applications: Technology that can enhance decision making is already evident in many everyday applications. When we delegate tasks for a system to perform, we must ensure that the human-system partnership is balanced, so that the goals of the user are understood and carried out by the system. The HSIG has teamed with Serious Games International Ltd (SGIL) to propose the development of an intelligent mission management application that can assist first responders in emergency situations. Building upon existing software architecture provided by SGIL, the Coventry University team provides the autonomous agent capability and the Human-Systems Integration component that will allow an operator to delegate tasks to both human and system assets. The interaction between the human and intelligent software agents will be facilitated through a cognitively engineered Human-Machine Interface (HMI). The HMI will allow the human operator to command and accept decisions from the agents and, importantly, allow a seamless interaction that facilitates decision-making within the human-agent collective.
- Hands-Free Inspection Interface: The proposed project will adapt and augment the consumer-focussed Google Glass for the aerospace inspection and test regime to produce a concept Head Mounted Display System (HMDS). New applications will take existing inspection and test documentation in PDF, MS Word and other formats, and render it so that it can be interrogated by voice and presented to the user in a usable form, making the inspection and test process largely hands-free and facilitating an intuitive interaction between the user and the task. In addition, the HMDS supports the generation of video and voice data, enabling the capture, recording and transmission of information throughout the inspection process and, as estimated by one end-user, could reduce inspection time by between 10% and 20%. Moreover, by building a framework for visual interaction we will pave the way for future developments in advanced context-sensitive inspection, where object recognition by the HMDS would allow inspectors to be automatically presented with the relevant imagery.
- Training for unmanned assets in the maritime environment: Current and future considerations: This is a joint research programme between QinetiQ and Coventry to examine the training requirements and implications of future unmanned maritime systems for the UK Royal Navy. The UK MoD has a requirement to understand the possible risks associated with unmanned/autonomous systems. Unmanned assets can provide great benefits to the Navy above, on and under the surface, such as force multiplication, 24/7 reconnaissance, reduced risk to personnel, endurance, expanded capability reach by using UAS as an extension beyond existing capability, and cost-saving efficiencies. However, an understanding of the training requirements and associated risks is essential for operational effectiveness and maximum benefit, and to build on lessons learnt from early and urgent-operational-requirement unmanned systems. Current training is often stove-piped, platform-focussed and constrained by the lack of a coherent approach to introducing and maintaining this capability in service.