Media Content Interaction & Accessibility

/ Introduction

Tomorrow’s media experiences will combine sensor technology (instrumentation), AI and personal devices (interactivity) to increase engagement and collaboration. Enabling technologies such as haptics, AR/VR, conversational AI, tangible interfaces, wearable sensors and eyes-free interaction have made clear progress. As a result, media experiences will become more individualised, targeting the preferences and circumstances of each user (adaptation) and making use of a variety of device categories with different capabilities.

Research into adaptation includes responsive UIs, adaptive streaming, content adaptation and multi-device adaptation. Adaptation is also needed for collaborative and social use. Finally, media experiences must be inclusive and available to all (accessibility). Research in accessibility includes screen readers and AI-based techniques for generating adapted content, such as audio tracks for the visually impaired, video interpreters for the deaf and individualised narration.
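
To make this kind of accessibility-driven content adaptation concrete, the sketch below shows how a web player could enable caption or description tracks on an HTML5 video from a stored user preference. It is a minimal illustration using only standard HTMLMediaElement and TextTrack APIs; the preference object, element id and function name are hypothetical and not part of any WP4 deliverable.

```typescript
// Minimal sketch: enable accessibility tracks on an HTML5 <video> element
// based on a (hypothetical) user preference object. Uses only standard
// HTMLMediaElement / TextTrack APIs; element ids and preference names are
// illustrative only.

interface AccessibilityPrefs {
  captions: boolean;          // e.g. for deaf or hard-of-hearing users
  audioDescriptions: boolean; // e.g. for visually impaired users
  language: string;           // preferred track language, e.g. "nb" or "en"
}

function applyAccessibilityPrefs(video: HTMLVideoElement, prefs: AccessibilityPrefs): void {
  // TextTrackList is array-like, so iterate by index.
  for (let i = 0; i < video.textTracks.length; i++) {
    const track = video.textTracks[i];
    const wanted =
      (track.kind === "captions" && prefs.captions) ||
      (track.kind === "descriptions" && prefs.audioDescriptions);
    // Show matching tracks in the preferred language, disable the rest.
    track.mode = wanted && track.language === prefs.language ? "showing" : "disabled";
  }
}

// Usage: assumes a <video id="main-video"> with <track> children in the page.
const video = document.getElementById("main-video") as HTMLVideoElement;
applyAccessibilityPrefs(video, { captions: true, audioDescriptions: false, language: "en" });
```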

This WP will focus on rapid prototyping and experimentation to validate approaches, ideas and user experiences. It is vital that complexity is kept under control, in particular for end users, so that the knowledge produced is relevant for industry partners and society at large.

Objective: The work package will develop methods and technologies for interaction between media content and users, both human and computerised, and provide personalised, adapted media experiences to all users regardless of their technical aptitude and personal needs.

/ People

Njål Borch
Work Package Leader
NORCE

Morten Fjeld
Work Package Co-Leader

Frode Guribye
Key Researcher

Helwig Hauser
Task Leader

Oskar Juhlin
Work Package Advisor

Ingar Mæhlum Arntzen
Key Researcher
NORCE

Petter Ole Jakobsen
Steering Board Member, WP4 Industry Co-Leader
Vizrt

Pete Andrews
PhD Candidate

Jonathan Geffen
PhD Candidate

/ Publications

2022

Morten Fjeld; Yukai Hoshikawa; Kazuyuki Fujita; Kazuki Takashima; Yoshifumi Kitamura
RedirectedDoors: Redirection While Opening Doors in Virtual Reality (Conference)
2022.

2021

Khanh-Duy Le; Tanh Quang Tran; Karol Chlasta; Krzysztof Krejtz; Morten Fjeld; Andreas Kunz
VXSlate: Exploring Combination of Head Movements and Mobile Touch for Large Virtual Display Interaction (Proceeding)
Association for Computing Machinery, New York, NY, USA, 2021, ISBN: 978-1-4503-8476-6.

Khanh-Duy Le; Tanh Quang Tran; Karol Chlasta; Krzysztof Krejtz; Morten Fjeld; Andreas Kunz
VXSlate: Combining Head Movement and Mobile Touch for Large Virtual Display Interaction (Conference)
2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), IEEE, 2021.

2020

Pavel Okopnyi; Oskar Juhlin; Frode Guribye
Unpacking Editorial Agreements in Collaborative Video Production (Conference)
IMX '20: ACM International Conference on Interactive Media Experiences, New York, 2020, (Pre SFI).

Anja Salzmann; Frode Guribye; Astrid Gynnild
“We in the Mojo Community” – Exploring a Global Network of Mobile Journalists (Journal Article)
In: Journalism Practice, pp. 1-18, 2020, (Pre SFI).

Morten Fjeld; Smitha Sheshadri; Shengdong Zhao; Yang Cheng
Learn with Haptics: Improving Vocabulary Recall with Free-form Digital Annotation on Touchscreen Mobiles (Journal Article)
In: CHI 2020 Paper, pp. 1-13, 2020, (Pre SFI).

2019

Eivind Flobak; Jo Dugstad Wake; Joakim Vindenes; Smiti Kahlon; T. Nordgreen; Frode Guribye
Participatory Design of VR Scenarios for Exposure Therapy (Conference)
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19), no. Paper 569, New York, 2019, (Pre SFI).

2018

Njål Borch; Ingar Mæhlum Arntzen
Mediasync Report 2015: Evaluating timed playback of HTML5 Media (Technical Report)
2018, (Pre SFI).

Yoshiki Kudo; Kazuki Takashima; Morten Fjeld; Yoshifumi Kitamura
AdapTable: Extending Reach over Large Tabletops Through Flexible Multi-Display Configuration (Proceeding)
2018, (Pre SFI).

Kening Zhu; Morten Fjeld; Ayca Ülüner
WristOrigami: Exploring foldable design for multi-display smartwatch (Proceeding)
2018, (Pre SFI).

Velko Vechev; Alexandru Dancu; Simon T. Perrault; Quentin Roy; Morten Fjeld; Shengdong Zhao
Movespace: on-body athletic interaction for running and cycling (Journal Article)
2018, (Pre SFI).

Ingar Mæhlum Arntzen; Njål Borch; François Daoust
Media Synchronization on the Web (Book Chapter)
In: MediaSync, 2018, (Pre SFI).

Ingar Mæhlum Arntzen; Njål Borch; François Daoust; Dominique Hazael-Massieux
Multi-device Linear Composition on the Web, Enabling Multi-device Linear Media with HTMLTimingObject and Shared Motion (Conference)
Media Synchronization Workshop, Brussels, 2018, (Pre SFI).

2017

Njål Borch
Økt samvirke og beslutningsstøtte – Case Salten Brann IKS [Increased cooperation and decision support – the Salten Brann IKS case] (Technical Report)
2017, (Pre SFI).

Njål Borch; François Daoust; Ingar Mæhlum Arntzen
Timing - small step for developers, giant leap for the media industry (Conference)
IBC 2016, 2017, (Pre SFI).

2016

Frode Guribye; Lars Nyre
The changing ecology of tools for live news reporting (Journal Article)
In: Journalism Practice, vol. 10, no. 11, pp. 1216-1230, 2016, ISSN: 1751-2794, (Pre SFI).

Ingar Mæhlum Arntzen; Njål Borch
Data-independent sequencing with the timing object: a JavaScript sequencer for single-device and multi-device web media (Proceeding)
In: Proceedings of the 7th International Conference on Multimedia Systems (MMSys '16), 2016, (Pre SFI).

Pawel Wozniak; Nitesh Goyal; Przemyslaw Kucharski; Lars Lischke; Sven Mayer; Morten Fjeld
RAMPARTS: Supporting sensemaking with spatially-aware mobile interactions (Journal Article)
2016, (Pre SFI).

Marta G. Carcedo; Soon H. Chua; Simon Perrault; Pawel Wozniak; Raj Joshi; Mohammad Obaid; Morten Fjeld; Shengdong Zhao
Hapticolor: Interpolating color information as haptic feedback to assist the colorblind (Proceeding)
2016, (Pre SFI).

2015

Njål Borch; Ingar Mæhlum Arntzen
Mediasync Report 2015: Evaluating timed playback of HTML5 Media (Journal Article)
In: Norut, 2015, ISBN: 978-82-7492-319-5, (Pre SFI).