Media Content Interaction & Accessibility

/ Introduction

Tomorrow’s media experiences will combine sensor technology (instrumentation), AI, and personal devices (interactivity) to increase engagement and collaboration. Enablers such as haptics, AR/VR, conversational AI, tangible interfaces, wearable sensors, and eyes-free interaction have made clear progress. As a result, media experiences will become more individualised, targeting each user’s preferences and circumstances (adaptation) and making use of a variety of device categories offering alternative capabilities.

Research into adaptation includes responsive UIs, adaptive streaming, content adaptation, and multi-device adaptation. Adaptation is also needed for collaborative and social use. Finally, media experiences must be inclusive and available to all (accessibility). Research in accessibility includes screen readers and AI-based techniques for generating adapted content, such as audio description tracks for visually impaired users, video interpreters for deaf users, and individualised narration.
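One foundation for multi-device adaptation is shared timing: every device computes the same media position from a common reference. As a minimal sketch (illustrative only; the actual timing-object APIs cited under Publications differ), a shared "motion" can be modelled as a vector of position, velocity, and timestamp against a synchronised clock:

```javascript
// Minimal sketch of deterministic multi-device timing (assumed model,
// not the real timingsrc API): a shared motion vector lets each device
// compute the current media offset independently and identically.
function currentPosition(motion, nowSeconds) {
  // Position drifts linearly from the last known vector.
  return motion.position + motion.velocity * (nowSeconds - motion.timestamp);
}

// Example: playback started at media offset 10 s, normal speed (velocity 1),
// vector timestamped at shared clock time 100 s.
const motion = { position: 10, velocity: 1, timestamp: 100 };
console.log(currentPosition(motion, 104)); // 14 — every device agrees
```

Because the vector is tiny and only changes on user actions (play, pause, seek), it is cheap to distribute, and devices stay in sync without continuous streaming of clock updates.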

This WP will focus on rapid prototyping and experimentation to validate approaches, ideas, and user experiences. It is vital that complexity is controlled, particularly for end users, so that the knowledge produced is relevant for industry partners and society at large.

Objective: The work package will develop methods and technologies for interaction between media content and users, both human and computerised, and provide personalised, adapted media experiences to all users regardless of their technical aptitude and personal needs.


/ People

Ingar Mæhlum Arntzen
Work Package Leader, NORCE

Morten Fjeld
Work Package Co-Leader

Frode Guribye
Key Researcher

Helwig Hauser
Task Leader

Oskar Juhlin
Work Package Advisor

Njål Borch
Schibsted

Oda Elise Nordberg
University of Bergen

Pete Andrews
PhD Candidate

Eivind Throndsen
Industry WP Leader, Schibsted

Lubos Seteskal
TV2

Sergej Stoppel
Wolftech

Magnus Helgesen
Industry Partner, BT

Jan Stian Vold
Industry Partner, BT

/ Publications

2024

Ingar M Arntzen; Njål Borch; Anders Andersen

Control-driven Media. A unifying model for consistent, cross-platform multimedia experiences Journal Article

In: FTC 2024 International Journal of Advanced Computer Science and Applications (IJACSA), 2024.


Peter Andrews; Oda Elise Nordberg; Frode Guribye; Morten Fjeld; Njål Borch

Designing for Automated Sports Commentary Systems Conference

IMX'24, 2024.


Peter Andrews; Njål Borch; Morten Fjeld

FootyVision: Multi-Object Tracking, Localisation, and Augmentation of Players and Ball in Football Video Conference

ACM ICMIP, 2024.


Peter Andrews; Oda Elise Nordberg; Frode Guribye; Kazuyuki Fujita; Morten Fjeld; Njål Borch

AiCommentator: A Multimodal Conversational Agent for Embedded Visualization in Football Viewing Conference

Intelligent User Interfaces (IUI), 2024.


2022

Morten Fjeld; Yukai Hoshikawa; Kazuyuki Fujita; Kazuki Takashima; Yoshifumi Kitamura

RedirectedDoors: Redirection While Opening Doors in Virtual Reality Conference

2022.


2021

Khanh-Duy Le; Tanh Quang Tran; Karol Chlasta; Krzysztof Krejtz; Morten Fjeld; Andreas Kunz

VXSlate: Exploring Combination of Head Movements and Mobile Touch for Large Virtual Display Interaction Proceedings

Association for Computing Machinery, New York, NY, USA, 2021, ISBN: 978-1-4503-8476-6.


Khanh-Duy Le; Tanh Quang Tran; Karol Chlasta; Krzysztof Krejtz; Morten Fjeld; Andreas Kunz

VXSlate: Combining Head Movement and Mobile Touch for Large Virtual Display Interaction Conference

2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE The Institute of Electrical and Electronics Engineers, Inc., 2021.


2020

Pavel Okopnyi; Oskar Juhlin; Frode Guribye

Unpacking Editorial Agreements in Collaborative Video Production Conference

IMX '20: ACM International Conference on Interactive Media Experiences, New York, 2020, (Pre SFI).


Anja Salzmann; Frode Guribye; Astrid Gynnild

“We in the Mojo Community” – Exploring a Global Network of Mobile Journalists Journal Article

In: Journalism Practice, pp. 1-18, 2020, (Pre SFI).


Morten Fjeld; Smitha Sheshadri; Shengdong Zhao; Yang Cheng

Learn with Haptics: Improving Vocabulary Recall with Free-form Digital Annotation on Touchscreen Mobiles Journal Article

In: CHI 2020 Paper, pp. 1-13, 2020, (Pre SFI).


2019

Eivind Flobak; Jo Dugstad Wake; Joakim Vindenes; Smiti Kahlon; T. Nordgreen; Frode Guribye

Participatory Design of VR Scenarios for Exposure Therapy Conference

Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19), no. Paper 569, New York, 2019, (Pre SFI).


2018

Njål Borch; Ingar Mæhlum Arntzen

Mediasync Report 2015: Evaluating timed playback of HTML5 Media Technical Report

2018, (Pre SFI).


Yoshiki Kudo; Kazuki Takashima; Morten Fjeld; Yoshifumi Kitamura

AdapTable: Extending Reach over Large Tabletops Through Flexible Multi-Display Configuration. Proceedings

2018, (Pre SFI).


Kening Zhu; Morten Fjeld; Ayca Ülüner

WristOrigami: Exploring foldable design for multi-display smartwatch Proceedings

2018, (Pre SFI).


Velko Vechev; Alexandru Dancu; Simon T. Perrault; Quentin Roy; Morten Fjeld; Shengdong Zhao

Movespace: on-body athletic interaction for running and cycling Journal Article

In: 2018, (Pre SFI).


Ingar Mæhlum Arntzen; Njål Borch; François Daoust

Media Synchronization on the Web Book Chapter

In: MediaSync, 2018, (Pre SFI).


Ingar Mæhlum Arntzen; Njål Borch; François Daoust; Dominique Hazael-Massieux

Multi-device Linear Composition on the Web, Enabling Multi-device Linear Media with HTMLTimingObject and Shared Motion Conference

Media Synchronization Workshop Brussels, 2018, (Pre SFI).


2017

Njål Borch

Økt samvirke og beslutningsstøtte – Case Salten Brann IKS [Increased cooperation and decision support – the Salten Brann IKS case] Technical Report

2017, (Pre SFI).


Njål Borch; François Daoust; Ingar Mæhlum Arntzen

Timing - small step for developers, giant leap for the media industry Conference

IBC 2016, 2017, (Pre SFI).


2016

Frode Guribye; Lars Nyre

The changing ecology of tools for live news reporting Journal Article

In: Journalism Practice, vol. 10, no. 11, pp. 1216-1230, 2016, ISSN: 1751-2794, (Pre SFI).


Ingar Mæhlum Arntzen; Njål Borch

Data-independent sequencing with the timing object: a JavaScript sequencer for single-device and multi-device web media Proceedings

In: Proceedings of the 7th International Conference on Multimedia Systems (MMSys '16), 2016, (Pre SFI).


Marta G. Carcedo; Soon H. Chua; Simon Perrault; Pawel Wozniak; Raj Joshi; Mohammad Obaid; Morten Fjeld; Shengdong Zhao

Hapticolor: Interpolating color information as haptic feedback to assist the colorblind Proceedings

2016, (Pre SFI).


Pawel Wozniak; Nitesh Goyal; Przemyslaw Kucharski; Lars Lischke; Sven Mayer; Morten Fjeld

RAMPARTS: Supporting sensemaking with spatially-aware mobile interactions Journal Article

In: 2016, (Pre SFI).


2015

Njål Borch; Ingar Mæhlum Arntzen

Mediasync Report 2015: Evaluating timed playback of HTML5 Media Journal Article

In: Norut, 2015, ISBN: 978-82-7492-319-5, (Pre SFI).


Find us

Lars Hilles gate 30
5008 Bergen
Norway

Contact us

MediaFutures
Office@mediafutures.no

 

Responsible Editor:
Centre Director Prof. Dr. Christoph Trattner
Christoph.Trattner@uib.no

NEWSLETTER

Subscribe to our monthly newsletter by sending an email to office@mediafutures.no


Copyright © University of Bergen 2024