Media Content Interaction & Accessibility

/ Introduction

Tomorrow’s media experiences will combine sensor technology (instrumentation), AI and personal devices (interactivity) to increase engagement and collaboration. Enablers such as haptics, AR/VR, conversational AI, tangible interfaces, wearable sensors, and eyes-free interactions have made clear progress. As a result, media experiences will become more individualised, targeting the preferences and circumstances of each user (adaptation) and making use of a variety of device categories that offer alternative capabilities.

Research into adaptation includes responsive UIs, adaptive streaming, content adaptation and multi-device adaptation. Adaptation is also needed for collaborative and social use. Finally, media experiences must be inclusive and available to all (accessibility). Research into accessibility includes screen readers and AI-based techniques for generating adapted content, such as audio tracks for visually impaired users, video interpreters for deaf users and individualised narration.
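
To make the multi-device case concrete, the sketch below illustrates the shared-motion idea that underlies the timing-object work listed under Publications (Arntzen, Borch et al.): devices exchange a small motion vector (position, velocity, timestamp against a shared clock) and each device derives the current playback position locally. The TypeScript below is a minimal illustration using assumed names and thresholds, not the project's implementation or the W3C Timing Object API.

```typescript
// Minimal sketch of the shared-motion idea behind multi-device media
// synchronization. All names and numbers here are illustrative assumptions,
// not the project's actual API.

interface MotionVector {
  position: number;   // media position (seconds) at `timestamp`
  velocity: number;   // playback rate (1 = normal speed, 0 = paused)
  timestamp: number;  // reading of the shared clock (seconds) when the vector was set
}

// Derive the current media position locally from the shared vector,
// instead of continuously exchanging position updates between devices.
function currentPosition(v: MotionVector, nowSeconds: number): number {
  return v.position + v.velocity * (nowSeconds - v.timestamp);
}

// Nudge a local <video>/<audio> element toward the shared timeline:
// seek on large drift, otherwise adjust playback rate to catch up smoothly.
function syncMediaElement(media: HTMLMediaElement, v: MotionVector, nowSeconds: number): void {
  if (v.velocity === 0) {
    media.pause();
    media.currentTime = v.position;
    return;
  }
  const target = currentPosition(v, nowSeconds);
  const drift = media.currentTime - target;        // positive = ahead of the shared timeline
  if (Math.abs(drift) > 1.0) {
    media.currentTime = target;                    // hard correction for large drift
    media.playbackRate = v.velocity;
  } else {
    media.playbackRate = v.velocity - drift * 0.5; // soft correction for small drift
  }
}
```

In practice a client would receive updated motion vectors over a network channel and call syncMediaElement periodically; see the timing-object publications below for the approaches developed in this line of work.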

This WP will focus on rapid prototyping and experimentation to validate approaches, ideas and user experiences. It is vital that complexity is kept under control, in particular for end users, ensuring that the knowledge produced is relevant to industry partners and society at large.

Objective: The work package will develop methods and technologies for interaction between media content and users, both human and computerised, and will provide personalised, adapted media experiences to all users regardless of their technical aptitude and personal needs.

/ People

Njål Borch
Work Package Leader
NORCE

Morten Fjeld
Work Package Co-Leader

Frode Guribye
Key Researcher

Helwig Hauser
Task Leader

Oskar Juhlin
Work Package Advisor

Ingar Mæhlum Arntzen
Key Researcher
NORCE

Petter Ole Jakobsen
Steering Board Member, WP4 Industry Co-Leader
Vizrt

/ Publications

2021

VXSlate: Exploring Combination of Head Movements and Mobile Touch for Large Virtual Display Interaction Proceeding

Khanh-Duy Le; Tanh Quang Tran; Karol Chlasta; Krzysztof Krejtz; Morten Fjeld; Andreas Kunz

Association for Computing Machinery, New York, NY, USA, 2021, ISBN: 978-1-4503-8476-6.

What Matters in Professional Drone Pilots’ Practice? An Interview Study to Understand the Complexity of Their Work and Inform Human-Drone Interaction Research Conference

Sara Ljungblad; Yemao Man; Mehmet Aydın Baytaş; Mafalda Gamboa; Mohammad Obaid; Morten Fjeld

Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21), Article 159, Association for Computing Machinery, New York, NY, USA, 2021.

VXSlate: Combining Head Movement and Mobile Touch for Large Virtual Display Interaction Conference

Khanh-Duy Le; Tanh Quang Tran; Karol Chlasta; Krzysztof Krejtz; Morten Fjeld; Andreas Kunz

2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), IEEE, 2021.

2020

Unpacking Editorial Agreements in Collaborative Video Production Conference

Pavel Okopnyi; Oskar Juhlin; Frode Guribye

IMX '20: ACM International Conference on Interactive Media Experiences, New York, 2020, (Pre SFI).

“We in the Mojo Community” – Exploring a Global Network of Mobile Journalists Journal Article

Anja Salzmann; Frode Guribye; Astrid Gynnild

Journalism Practice, pp. 1-18, 2020, (Pre SFI).

Learn with Haptics: Improving Vocabulary Recall with Free-form Digital Annotation on Touchscreen Mobiles Journal Article

Morten Fjeld; Smitha Sheshadri; Shendong Zhao; Yang Cheng

CHI 2020 Paper, pp. 1-13, 2020, (Pre SFI).

2019

Participatory Design of VR Scenarios for Exposure Therapy Conference

Eivind Flobak; Jo Dugstad Wake; Joakim Vindenes; Smiti Kahlon; T. Nordgreen; Frode Guribye

Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19), Paper 569, New York, 2019, (Pre SFI).

2018

Mediasync Report 2015: Evaluating timed playback of HTML5 Media Technical Report

Njål Borch; Ingar Mæhlum Arntzen

2018, (Pre SFI).

AdapTable: Extending Reach over Large Tabletops Through Flexible Multi-Display Configuration Proceeding

Y. Kudo; K. Takashima; Morten Fjeld; Y. Kitamura

2018, (Pre SFI).

WristOrigami: Exploring foldable design for multi-display smartwatch Proceeding

K. Zhu; Morten Fjeld; A. Ülüner

2018, (Pre SFI).

Movespace: on-body athletic interaction for running and cycling Journal Article

V. Vechev; A. Dancu; S. Perrault; Q. Roy; Morten Fjeld; S. Zhao

2018, (Pre SFI).

Media Synchronization on the Web. In: MediaSync Book Chapter

Ingar Mæhlum Arntzen; Njål Borch; François Daoust

2018, (Pre SFI).

Multi-device Linear Composition on the Web, Enabling Multi-device Linear Media with HTMLTimingObject and Shared Motion Conference

Ingar Mæhlum Arntzen; Njål Borch; François Daoust; Dominique Hazael-Massieux

Media Synchronization Workshop, Brussels, 2018, (Pre SFI).

2017

Økt samvirke og beslutningsstøtte – Case Salten Brann IKS [Increased cooperation and decision support: the Salten Brann IKS case] Technical Report

Njål Borch

2017, (Pre SFI).

Timing - small step for developers, giant leap for the media industry, IBC 2016 Conference

Njål Borch; François Daoust; Ingar Mæhlum Arntzen

2017, (Pre SFI).

2016

The changing ecology of tools for live news reporting Journal Article

Frode Guribye; Lars Nyre

Journalism Practice, 10 (11), pp. 1216-1230, 2016, ISSN: 1751-2794, (Pre SFI).

Data-independent sequencing with the timing object: a JavaScript sequencer for single-device and multi-device web media. In Proceedings of the 7th International Conference on Multimedia Systems (MMSys '16) Proceeding

Ingar Mæhlum Arntzen; Njål Borch

2016, (Pre SFI).

Hapticolor: Interpolating color information as haptic feedback to assist the colorblind Proceeding

M. G. Carcedo; S. H. Chua; S. Perrault; P. Wozniak; R. Joshi; M. Obaid; Morten Fjeld; S. Zhao

2016, (Pre SFI).

RAMPARTS: Supporting sensemaking with spatially-aware mobile interactions Journal Article

P. Wozniak; N. Goyal; P. Kucharski; L. Lischke; S. Mayer; Morten Fjeld

2016, (Pre SFI).

2015

Mediasync Report 2015: Evaluating timed playback of HTML5 Media Technical Report

Njål Borch; Ingar Mæhlum Arntzen

Norut, 2015, ISBN: 978-82-7492-319-5, (Pre SFI).
