VibeShare is a set of experimental interactive technologies for XR live entertainment that shares the non-verbal emotions between performer and viewer that are difficult to convey through conventional live streaming.


Functions Overview

VibeShare provides many functions to enhance the relationship between performer and audience.

An overview of all VibeShare functions.


The base module for non-verbal, low-latency, privacy-free live communication. A QR code brings various users into truly interactive OBS-based live streaming. The system can be used together with various platforms: YouTube, Vimeo, Twitch, Zoom, Virtual Cast, and Mozilla Hubs.

VibeShare: Live enables audiences to share their emotions with the performer and other audience members in various ways. Users only need to scan a QR code with their smartphone to participate.
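The QR-code join flow can be sketched roughly as follows; the base URL, parameter names, and token scheme are assumptions for illustration, since the actual VibeShare endpoint is not published:

```python
import secrets
import urllib.parse

# Hypothetical base URL; the real VibeShare endpoint is not public.
BASE_URL = "https://vibeshare.example/live"

def make_join_url(session_id: str) -> str:
    """Build the URL a QR code would encode for one audience member.

    An anonymous one-time token stands in for a login, so no personal
    data needs to be collected (the privacy-free design described above).
    """
    token = secrets.token_urlsafe(8)
    query = urllib.parse.urlencode({"session": session_id, "t": token})
    return f"{BASE_URL}?{query}"

url = make_join_url("concert-2021")
```

Encoding such a URL into a QR code lets any smartphone camera join the session without installing an app.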

Back to Index


Design audience reactions with animated emoji behaviors. The price and effect logic are also customizable.
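As a rough illustration of customizable price and effect logic, here is a minimal sketch; the emote names, fields, and prices are invented for the example and are not the actual VibeShare schema:

```python
# Hypothetical emote definitions; field names are illustrative only.
EMOTES = {
    "clap":   {"price": 0,   "animation": "rise",     "haptic": None},
    "flower": {"price": 100, "animation": "bloom",    "haptic": "pulse"},
    "money":  {"price": 500, "animation": "confetti", "haptic": "buzz"},
}

def resolve_effect(name: str, balance: int):
    """Return the triggered effect if the user can afford the emote."""
    emote = EMOTES[name]
    if balance < emote["price"]:
        return None  # not enough credit; no effect fires
    return {"animation": emote["animation"], "haptic": emote["haptic"]}
```

Keeping the logic in plain data like this is one way a designer could retune prices and effects per show without code changes.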

VibeShare: Vote used in a Zoom presentation for “Upgrade with Tokyo.” The other speakers and judges joined in real time during the presentation and expressed their feelings.

Back to Index


Improve your live performance based on evidence. A powerful logger and event tracker visualize the session. Audience privacy is guaranteed.

An example of how the analyzer visualizes a session. All sent emotes are stored in a database with timeline information. You can review them after the session and learn how the audience’s emotions were evoked.
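The timeline data behind such a visualization can be sketched as a per-bucket histogram; the log format below is an assumption for the example, since the analyzer’s real schema is not described here:

```python
from collections import Counter

# Hypothetical log format: (seconds_from_start, emote_name).
# A real session would be read from the analyzer's database instead.
log = [(3.2, "clap"), (4.8, "clap"), (62.1, "flower"), (63.0, "clap")]

def timeline_histogram(events, bucket_seconds=60):
    """Count emotes per time bucket, the raw data behind a session graph."""
    counts = Counter()
    for t, name in events:
        counts[(int(t // bucket_seconds), name)] += 1
    return counts

hist = timeline_histogram(log)
```

Plotting the bucket counts over time would show where in the performance each emotion peaked.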

Back to Index


Make your program more dynamic and interactive. A live graph visualizer and tagging involve the audience in the program.

The vote function used in VTech Challenge 2020, held in Mozilla Hubs. The audience watching the live stream on YouTube voted their impressions from their smartphones using the VibeShare: Vote website.
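The aggregation behind such a live vote graph can be sketched as follows; the ballot values are made up for the example, and real votes would arrive through the VibeShare: Vote website rather than a local list:

```python
from collections import Counter

# Hypothetical ballots, one entry per viewer vote.
votes = ["cool", "cute", "cool", "cool"]

def live_graph(ballots):
    """Turn raw ballots into the shares a live bar graph would display."""
    counts = Counter(ballots)
    total = sum(counts.values())
    return {choice: n / total for choice, n in counts.items()}

shares = live_graph(votes)
```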

Back to Index


Give a haptic sense and a “sixth sense” to virtual beings.

The VTuber “invisible gameplay” demo at the VTech Challenge 2019 final.
Two VTubers acquired haptic sensation in a virtual world, enjoying the feeling of shoulder taps, hugs, and an explosion in the movie. Furthermore, they acquired a “sixth sense” that let them feel the presence of an invisible object.

Back to Index


Haptics, sound, LED illumination, and projection mapping share the audience’s passion with the performer.

The drummer at the back smiled upon feeling the haptic sensation triggered by a money emote.
Please see the video version of this scene on YouTube.

Back to Index


“Future Signboard” eliminates the delay of broadcasting platforms, syncing audiences’ applause and reactions with the performer’s timeline.

Thanks to the “Director” function, the audience could send claps right after the performance, despite the delay caused by the broadcasting platform.
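One way to read this delay compensation is as a simple timestamp shift back onto the performer’s clock; the delay value and function names below are assumptions for illustration, not the actual “Director” implementation:

```python
# Hypothetical delay compensation: the broadcast platform adds a known
# latency, so reactions are re-anchored to the performer's timeline.
PLATFORM_DELAY = 12.0  # seconds, measured per platform (assumed value)

def performer_time(viewer_time: float) -> float:
    """Map a reaction's viewer-side timestamp onto the performer's clock."""
    return viewer_time - PLATFORM_DELAY

def align(reactions):
    """Shift every (viewer_time, emote) pair back onto the performance."""
    return [(performer_time(t), name) for t, name in reactions]

aligned = align([(45.0, "clap"), (47.5, "clap")])
```

With this mapping, claps sent after the delayed broadcast still land at the right moment on the performer’s timeline.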
Please see the video version of this scene on YouTube.

Back to Index


The game system enhances the unified experience between player and audience and strengthens the bond between them.

The concept game “Live Haptic Tower Defense” is designed to enhance the unified experience between player and viewers. The system requires “reload” comments from viewers to produce the bullets necessary for defeating enemies.
The player therefore has to ask viewers for comments to clear the game, which strengthens the bond between them.
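The reload mechanic can be sketched as a small game-state fragment; the class and method names are hypothetical, not the actual game’s code:

```python
# Hypothetical game loop fragment: viewer comments replenish ammunition,
# mirroring the "reload" mechanic described above.
class TowerDefense:
    def __init__(self):
        self.bullets = 0

    def on_comment(self, text: str):
        """Each viewer comment containing 'reload' produces one bullet."""
        if "reload" in text.lower():
            self.bullets += 1

    def shoot(self) -> bool:
        """Fire only if ammunition is available, so the player
        literally depends on viewer participation."""
        if self.bullets > 0:
            self.bullets -= 1
            return True
        return False
```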

Back to Index


Bring the existing gifting ecosystem into hybrid live performances. Design your original gifting system and various animated virtual gifts to make payment entertaining.

A flower emote from the audience turned into a real flower that was presented to the singer.

Back to Index


Enrich archived content by sharing audiences’ emotions beyond real time.

The concept of “Pseudo-Real-time Live” supported by . In a traditional real-time live, audiences can only see the reactions of those present. In a Pseudo-Real-time Live, the audience can enjoy the reactions of past participants and feel as if they are participating with them, even if they are the sole participant at that particular time.
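The replay idea can be sketched as merging archived reaction timelines into one feed; the data layout is an assumption for the example:

```python
import heapq

# Hypothetical replay: past sessions' reactions are merged and re-emitted
# at their original offsets, so a lone viewer still sees a crowd.
past_sessions = [
    [(1.0, "clap"), (30.0, "flower")],   # viewer A's recorded reactions
    [(2.5, "clap"), (30.0, "clap")],     # viewer B's recorded reactions
]

def merged_timeline(sessions):
    """Merge every archived session into one chronological reaction feed."""
    return list(heapq.merge(*sessions))

feed = merged_timeline(past_sessions)
```

Emitting each event at its recorded offset during playback would recreate the crowd’s reactions for the current viewer.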


A map-based XR metaverse game system for educational workshops.

Back to Index


We created and demonstrated several specific applications at events, conferences, lectures, and facilities.

Details are here: Page link to Demo (written in Japanese)


Real-Time Live in SIGGRAPH Asia 2019
Talk at CEDEC 2021
A live music performance using the VibeShare system at Real Diva’s, a music bar in Roppongi, Tokyo.
Virtual Beings World 2020

Back to Index

Related Publications

  1. Yusuke Yamazaki (REALITY, Inc. / Tokyo Institute of Technology), Akihiko Shirai, “VibeShare::Performer: Making online music lives bidirectional with emoji, haptics, and sound effects,” The 26th Annual Conference of the Virtual Reality Society of Japan (2021/9/21). {Web} {PDF}
  2. Yusuke Yamazaki, Akihiko Shirai, “VibeShare: Vote: Realizing non-verbal communication between performers and audiences online,” Image Media Expression and Art Science Forum 2021 (Expressive Japan 2021), [Abstract], [Slides], [SlideShare] (2021/3/8)
  3. Akihiko Shirai, “Development of an international bidirectional avatar haptic live performance using Virtual Cast and Hapbeat,” GREE Technical Book Club Journal, Spring 2020 issue
  4. Yusuke Yamazaki, Akihiko Shirai, “Proposal of a navigation system using tactile perception on the neck,” Proceedings of the 24th Annual Conference of the Virtual Reality Society of Japan, 2019-9-11 [PDF]
  5. Yusuke Yamazaki, Shoichi Hasegawa, Hironori Mitake, and Akihiko Shirai. 2019. Neck strap haptics: an algorithm for non-visible VR information using haptic perception on the neck. In ACM SIGGRAPH 2019 Posters (SIGGRAPH ’19). Association for Computing Machinery, New York, NY, USA, Article 60, 1–2. DOI:


Please feel free to contact us via email at