/ Introduction
Tomorrow’s media experiences will combine sensor technology (instrumentation), AI and personal devices (interactivity) to increase engagement and collaboration. Enablers such as haptics, AR/VR, conversational AI, tangible interfaces, wearable sensors, and eyes-free interactions have made clear progress. Hence, media experiences will become more individualised, targeting the preferences and circumstances of each user (adaptation), making use of a variety of device categories offering alternative capabilities.
Research into adaptation includes responsive UIs, adaptive streaming, content adaptation and multi-device adaptation. Adaptation is also needed for collaborative and social use. Finally, media experiences must be inclusive and available to all (accessibility). Research in accessibility includes screen readers and AI-based techniques for generating adapted content, such as audio tracks for the visually impaired, video interpreters for the deaf, and individualised narration.
This WP will focus on rapid prototyping and experimentation to validate approaches, ideas and user experiences. It is vital that complexity is controlled, in particular for end users, ensuring that produced knowledge is relevant for industry partners and society at large.
Objective: The work package will develop methods and technologies for interaction between media content and users, both human and computerised, and will provide personalised, adapted media experiences to all users regardless of their technical aptitude and personal needs.
/ People
Njål Borch
Schibsted
Oda Elise Nordberg
University of Bergen
Eivind Throndsen
Industry WP leader
Schibsted
Lubos Seteskal
TV2
Sergej Stoppel
Wolftech
Magnus Helgesen
Industry Partner
BT
Jan Stian Vold
Industry Partner
BT
/ Publications
2024
Control-driven Media. A unifying model for consistent, cross-platform multimedia experiences Journal Article
In: FTC 2024 International Journal of Advanced Computer Science and Applications (IJACSA), 2024.
Designing for Automated Sports Commentary Systems Conference
IMX'24, 2024.
FootyVision: Multi-Object Tracking, Localisation, and Augmentation of Players and Ball in Football Video Conference
ACM ICMIP, 2024.
AiCommentator: A Multimodal Conversational Agent for Embedded Visualization in Football Viewing Conference
Intelligent User Interfaces (IUI), 2024.
2022
RedirectedDoors: Redirection While Opening Doors in Virtual Reality Conference
2022.
2021
VXSlate: Exploring Combination of Head Movements and Mobile Touch for Large Virtual Display Interaction Proceedings
Association for Computing Machinery, New York, NY, USA, 2021, ISBN: 978-1-4503-8476-6.
VXSlate: Combining Head Movement and Mobile Touch for Large Virtual Display Interaction Conference
2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE The Institute of Electrical and Electronics Engineers, Inc., 2021.
2020
Unpacking Editorial Agreements in Collaborative Video Production Conference
IMX '20: ACM International Conference on Interactive Media Experiences, New York, 2020, (Pre SFI).
“We in the Mojo Community” – Exploring a Global Network of Mobile Journalists Journal Article
In: Journalism Practice, pp. 1-18, 2020, (Pre SFI).
Learn with Haptics: Improving Vocabulary Recall with Free-form Digital Annotation on Touchscreen Mobiles Journal Article
In: CHI 2020 Paper, pp. 1-13, 2020, (Pre SFI).
2019
Participatory Design of VR Scenarios for Exposure Therapy Conference
Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19), no. Paper 569, New York, 2019, (Pre SFI).
2018
Mediasync Report 2015: Evaluating timed playback of HTML5 Media Technical Report
2018, (Pre SFI).
AdapTable: Extending Reach over Large Tabletops Through Flexible Multi-Display Configuration. Proceedings
2018, (Pre SFI).
WristOrigami: Exploring foldable design for multi-display smartwatch Proceedings
2018, (Pre SFI).
Movespace: on-body athletic interaction for running and cycling Journal Article
In: 2018, (Pre SFI).
Media Synchronization on the Web Book Chapter
In: MediaSync, 2018, (Pre SFI).
Multi-device Linear Composition on the Web, Enabling Multi-device Linear Media with HTMLTimingObject and Shared Motion Conference
Media Synchronization Workshop Brussels, 2018, (Pre SFI).
2017
Økt samvirke og beslutningsstøtte – Case Salten Brann IKS [Increased Cooperation and Decision Support – Case Salten Brann IKS] Technical Report
2017, (Pre SFI).
Timing - small step for developers, giant leap for the media industry, IBC 2016 Conference
2017, (Pre SFI).
2016
The changing ecology of tools for live news reporting Journal Article
In: Journalism Practice, vol. 10, no. 11, pp. 1216-1230, 2016, ISSN: 1751-2794, (Pre SFI).
Data-independent sequencing with the timing object: a JavaScript sequencer for single-device and multi-device web media. In Proceedings of the 7th International Conference on Multimedia Systems (MMSys '16) Proceedings
2016, (Pre SFI).
Hapticolor: Interpolating color information as haptic feedback to assist the colorblind Proceedings
2016, (Pre SFI).
RAMPARTS: Supporting sensemaking with spatially-aware mobile interactions Journal Article
In: 2016, (Pre SFI).
2015
Mediasync Report 2015: Evaluating timed playback of HTML5 Media Journal Article
In: Norut, 2015, ISBN: 978-82-7492-319-5, (Pre SFI).
Find us
Lars Hilles gate 30
5008 Bergen
Norway
Contact us
MediaFutures
Office@mediafutures.no
Responsible Editor:
Centre Director Prof. Dr. Christoph Trattner
Christoph.Trattner@uib.no
Copyright © University of Bergen 2024