The integration of video conferencing systems (VCS) into the classrooms and administrative practices of higher education institutions has increased significantly. The VCSs discussed in the existing literature can be broadly categorized as desktop systems (e.g. Scopia), WebRTC or Real-Time Communications systems (e.g. Google Hangout, Adobe Connect, Cisco WebEx, and appear.in), and dedicated systems (e.g. Polycom). Empirical studies on the usability evaluation of such interactive systems in educational contexts are scarce. This study identifies usability errors and measures user satisfaction with a dedicated VCS in a Danish university's classrooms. This work contributes (1) a methodological approach that uses mixed methods to collect and analyze data from users as part of a summative evaluation, (2) a demonstration of the methods applied by an independent usability evaluator using a field study approach, and (3) an addition to the usability evaluation literature on empirical evaluation methods. A PACT (people, activity, context, and technology) analysis of participant observation and interview data shows a lack of user guides, training, IT support, and vendor coordination. A Software Usability Measurement Inventory (SUMI) analysis of 12 user responses yields a below-average score. A post-study system test by the vendor identified cabling and setup errors. Applying SUMI followed by qualitative methods might enrich evaluation outcomes.
Proceedings of 19th International Conference on Computer and Information Technology, 2016, p. 184-190
Keywords: usability evaluation; software usability measurement inventory; SUMI; empirical evaluation; video conferencing system