Looking into the future with AI holograms

Coexisting with AI holograms

AI holographic technology has risen to new heights recently, with the Hypervsn SmartV Digital Avatar being released at the start of the year. The AI hologram functions on the SmartV Window Display, a gesture-based 3D display and merchandising system, allowing real-time customer interaction.

Universiti Teknologi Malaysia (UTM) has also developed its first home-grown real-time holo professor, which can project a speech given by a lecturer in another place. With Malaysia breaking boundaries with extended reality (XR) technology, can the next wave of hologram technology be fully AI-powered without constraints?

Having created Malaysia’s first home-grown holo professor, Dr Ajune Wanis Ismail, senior lecturer in computer graphics and computer vision at UTM’s Faculty of Computing, shares that XR hologram systems can be complex to set up and maintain. Technical issues, such as connectivity problems or software glitches, could disrupt lessons.

AI algorithms are used to enhance the accuracy of holographic content, reducing artifacts and improving image quality. Delivering these holographic solutions within extended reality (XR) remains a challenge, however, as the technology is relatively new and is evolving rapidly, with new breakthroughs emerging all the time.

“Building and deploying AI-powered holographic systems can be costly [in terms of hardware and software components].”

Incorporating AI into holograms could place immense demands on computational power. Most existing holograms present non-real-time content produced through a video editing loop, but AI models for holography are computationally intensive, says Ajune.

She emphasises the importance of achieving high-fidelity reconstruction in handling complex dynamic scenes with objects or viewers in motion.

“Researchers are developing more efficient algorithms and leveraging hardware acceleration [such as graphics processing units] to reduce computational demands,” says Ajune, noting that real-time interaction with holographic content demands low latency.

There is no doubt that XR hologram systems are complicated and a challenge to integrate with AI. However, the prospect of replicating environments and enabling real-time global communication without the need for physical presence spurs excitement.

As we advance into the era of digitalisation, people need to start familiarising themselves with this technology and become proficient users, believes Ajune.

Read more >> https://theedgemalaysia.com/node/689700

This article first appeared in Digital Edge, The Edge Malaysia Weekly on November 13, 2023 – November 19, 2023

https://www.theedgesingapore.com/digitaledge/focus/coexisting-ai-holograms

This article first appeared in The Edge Malaysia. It has been edited for clarity and length by The Edge Singapore.

Augmented Reality Real-time Drawing Application

Augmented Reality (AR) is a technology that combines real and virtual objects. Interaction is essential in AR, allowing the user to interact with a virtual object in the natural environment. Various types of gesture interaction can be applied in AR, and a drawing space can also be brought into AR through the gesture interaction it provides. Using gesture interaction in a drawing space has become a popular topic among researchers, who apply the concept to AR by using bare hands to interact directly with the AR scene. However, the visual and spatial complexity of the canvas requires rethinking how to develop a real-time drawing canvas for AR using real hand gestures on a handheld device. Converting from traditional drawing to a digital drawing space and using hand gestures as input are both challenging. Therefore, this paper discusses the design and development process to actualize real-time drawing in handheld AR using the user’s real hand.

Read more : https://ieeexplore.ieee.org/abstract/document/9666439

AR drawing on a handheld device with Leap Motion

Human Teleportation in Extended Reality (xR) application

Extended Reality (xR) encapsulates various computer-altered reality technologies, covering virtual reality (VR), augmented reality (AR), and mixed reality (MR). xR merges virtual elements into the real world with the aim of enhancing reality by immersing virtual content in real-world space. xR has improved over time as an advanced immersive technology that extends the reality we experience, either by combining the virtual and real worlds or by generating a fully immersive experience. Remote collaboration in xR is a challenge, since both parties need to have the same system and their xR environments must be kept in parallel. In a collaborative interface context, users can sense the immersive environment either in remote collaboration or face to face. Human teleportation transfers a human from a local location to a remote location, where a reconstruction of the human appears as a realistic visual representation. However, creating a fully realistic representation of the human figure requires a complex 3D reconstruction method. Therefore, this paper describes human teleportation in the xR environment using advanced RGB-D sensor devices and explains the phases for developing real human teleportation in xR.

Read more : https://iopscience.iop.org/article/10.1088/1757-899X/979/1/012005/pdf

Urban Planning Case Study


MR-MEET: Mixed Reality Collaborative Interface for HMD and Handheld Users

A Mixed Reality (MR) collaborative interface allows remote collaboration between an HMD (head-mounted display) user in Virtual Reality (VR) and a non-HMD user in Augmented Reality (AR). Both users can hold a meeting with each other despite the distance between them. It enables users to interact in a virtual environment that can replace the traditional physical meeting by using appropriate collaborative tools. However, a problem arises when the users are far apart from each other and need an effective way or tool to collaborate. This paper discusses such a collaborative tool and proposes a prototype called MR-Meet as a collaborative MR interface. MR-Meet has been designed with tools that can aid both users from different settings to work together. It presents the designing stage to actualize the MR collaborative interface with two basic tools, sticky notes and a whiteboard, to help distant …

Read more : https://ieeexplore.ieee.org/abstract/document/9938307

ICVRMR 2021

ICVRMR 2021 – International Conference on Virtual and Mixed Reality Interfaces 2021

This virtual conference invites papers from various fields in engineering, technology and computing, focusing on Virtual and Augmented/Mixed Reality Interfaces. All papers will be reviewed and evaluated based on originality, technical quality, and relevance to the conference. Join us on 16 – 17 November 2021. Kindly submit your papers for publication in proceedings indexed by Scopus!

For more details: Click here.

How to submit: https://easychair.org/cfp/ICVRMR2021#

International Conference on Virtual and Mixed Reality Interfaces 2020

ICVRMR 2020 | 16 – 17 November 2020 | Platform Webex

International Conference on Virtual and Mixed Reality Interfaces 2020 : https://vicubelab.utm.my/icvrmr2020/
Due to the COVID-19 pandemic, ICVRMR 2020 moved from physical mode to digital mode.
==========================================================

  • Early Bird: 15 October 2020
  • Full Paper Submission: 30 September 2020
  • Notification of Full Paper Acceptance: 10 October 2020
  • Submission of Camera-Ready Full Paper: 20 October 2020

Indexing: All ICVRMR 2020 presented papers will be published and indexed in Scopus, as well as in EI Compendex and Inspec. One important point to note: this publication is not covered by SCI (proceedings journals are not indexed in SCI; they are indexed in a separate database, the CPCI), which means our proceedings journals are not issued with an Impact Factor.

PAPERS SUBMISSION: We invite submissions of high-quality papers in all areas of Virtual and Mixed Reality Interfaces and their applications. Submissions are handled only through the EasyChair website at: https://easychair.org/cfp/icvrmr2020

Dear Friends and Colleagues,

We cordially invite all academicians and practitioners in the related fields of engineering and computing, focusing on Virtual Reality and Mixed Reality Interfaces, to take part in this virtual conference, which includes keynote talks, paper presentations, and regular and special sessions. All papers will be reviewed and evaluated based on originality, technical quality, and relevance to the conference. We seek original full research papers covering these topics, including but not limited to:

  • Modeling and Simulation
  • Data Visualization
  • Rendering and 3D Reconstruction
  • Artificial Intelligence and Agent Systems
  • Multimedia Systems
  • Human Computer Interaction
  • Motion Capture and Telepresence
  • Image and Speech Processing
  • Other related topics on applied computing in Virtual and Augmented/Mixed Reality Interfaces and/or related domains (Engineering / Computing/Computer Vision/ Computer Graphics/ Visualization/Image Processing etc.)

Authors are kindly invited to submit their formatted full papers of a maximum of 12 pages, including results, tables, figures, and references. All submissions are handled through EasyChair at https://easychair.org/cfp/icvrmr2020

For any query, please email vicubelab@utm.my or visit our website https://vicubelab.utm.my/icvrmr2020/

Ready Player One Game Showcase 2019

Assalamualaikum warahmatullahi wabarakatuh (peace be upon you, and the mercy and blessings of Allah), Salam Sejahtera (greetings),
Prof. / Assoc. Prof. / Dr. / Sir / Madam / All students,

This year brings a new challenge and goal, and we are pleased to invite you to visit our Game Showcase 2019. Date and time: 20 April 2019, 11:00 AM to 5:00 PM. Venue: N28 Lobby, Level 2.

Questions about VR, AR and MR

** These questions were discussed with my students during the Real-time Computer Graphics course.

What is 6DOF VR? How do you develop for a mobile HMD, such as a mobile VR headset with a controller?

  • Rotation gives 3 degrees of freedom: pitch, yaw, and roll; translation gives another 3: forward/back, up/down, and left/right. Thus, 3 types of translation + 3 types of rotation = 6DOF! Within any particular DOF, an object is free to move in 2 ‘directions’.
  • For a mobile HMD using Google Cardboard, make sure your phone’s specifications and requirements are good to go for VR. Consider the resolution, the refresh rate (the higher it is, the smoother the image will be), and the field of view (it should be around 100 degrees).
  • If the controller is external hardware rather than built into the phone, you may need additional methods to integrate it. For example, to send an event (a button press) to your mobile VR app, you may need a network protocol to send and receive the controller’s actions, as sketched below.
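
The sketch below is a hypothetical Python example, not part of any particular SDK: it packs a controller button event as JSON and sends it over UDP to the phone running the VR app. The address, port and message fields are assumptions for illustration only.

```python
import json
import socket

# Hypothetical sketch: forward a button event from an external controller host
# to the mobile VR app over UDP. Address, port and message format are assumptions.
PHONE_ADDR = ("192.168.0.42", 9000)  # assumed IP/port of the phone running the VR app

def send_button_event(sock: socket.socket, button: str, pressed: bool) -> None:
    """Serialise one controller event and push it to the VR app."""
    event = {"type": "button", "name": button, "pressed": pressed}
    sock.sendto(json.dumps(event).encode("utf-8"), PHONE_ADDR)

if __name__ == "__main__":
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        send_button_event(sock, "trigger", True)   # button down
        send_button_event(sock, "trigger", False)  # button up
```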

What is the difference between Mixed Reality (MR) and Extended Reality (XR)?

  • XR is an umbrella term covering VR, AR and MR – technologies for extending and improving our reality. MR is one such technology we can use to extend reality; it reacts and responds to the real world.

Is 360-degree live video considered VR?

  • The basis of the argument is that 360-degree videos viewed with Google Cardboard are really cool, but they are not VR. Looking at photos in Flickr VR is also really cool, but it is not VR either. With a VR headset you feel immersion and presence because the scene runs in real time; a 360-degree video is not the same thing, and you can watch most of these videos and photos on a regular flat 2D screen.

Nowadays, VR headsets often need a cable connected to the PC. If we go wireless, what are the limitations?

  • Freeing users from wires gives them a truly immersive experience; connecting with a wire limits your walking range and the cable can get caught under your feet. To answer the question: until recently it was considered near-impossible to wirelessly stream the vast amounts of data a VR headset needs (a rough estimate of the data rate is sketched below). Existing wireless systems such as Wi-Fi cannot support this transfer rate, and trying to compress the video stream so it fits into the available bandwidth takes a few milliseconds, which can make users feel sick. Range under anything approaching real-world conditions also limits wireless connections. Other attempts put all of the processing into the headset itself.
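
As a rough back-of-the-envelope estimate (the panel figures below are illustrative assumptions, not any specific headset’s specification), the raw data rate of an uncompressed VR video stream is resolution × refresh rate × bits per pixel:

```python
# Rough estimate of the raw (uncompressed) video bandwidth a PC-tethered VR
# headset needs. The numbers are illustrative assumptions, not a product spec.
width, height = 2160, 1200   # combined resolution for both eyes (assumed)
refresh_hz = 90              # frames per second
bits_per_pixel = 24          # 8-bit RGB

raw_bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"Raw stream: {raw_bits_per_second / 1e9:.1f} Gbit/s")  # ~5.6 Gbit/s

# Typical Wi-Fi links deliver well under 1 Gbit/s in practice, so the stream
# must be compressed heavily -- and compression adds latency.
```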

Nowadays VR/MR hardware is expensive, and so on. How long before this technology becomes something affordable and accessible for the general public? (question originally asked in Malay)

  • It is hard to say how long it may take. Perhaps when it enters the mainstream and starts transforming lives it will become cheaper, maybe even given away free with other VR products or packages; once mainstream, it could be as widely owned as smartphones, Bluetooth devices and external drives are today. VR headsets are available in various types, but even at affordable prices there is not yet a great reason for everyone to buy one.

Collision Detection with Visual Feedback using Leap Motion

How can we make penetration of a virtual surface feel more logical and realistic through visual feedback?

To answer this question, we tried three methods: highlighting the boundaries and depths as the hand passes through a mesh, adding color gradients to the fingertips as they approach interactive objects and UI elements, and responsive prompts for unpredictable grabs. But first, let’s look at how Leap Motion’s Interaction Engine handles the interaction between hands and objects.

In a virtual world, this kind of visual clipping occurs whether you touch a stationary surface (such as a wall) or an interactive object. The two core interactions in the Leap Motion libraries – touching and grabbing – almost always involve the user’s hand penetrating the interactive object.

Similarly, when interacting with a physics-based user interface (such as the InteractionButtons of the Leap Motion Interaction Engine, which compress in Z space), the fingertips still pass slightly through the interactive object once the UI element reaches the end of its travel distance.

Experiment #1: Highlighting the boundary and depth while passing through a mesh

In our first experiment, we proposed that when the hand intersects another mesh, the intersection boundary should be visible. The submerged part of the hand remains visible, but its color or transparency changes.

To achieve this, we applied a shader to our hand mesh. We measure the distance of each pixel on the hand from the camera and compare it with the depth of the scene (read from the depth texture). If the two values are relatively close, we make the pixel glow, and the closer they are, the brighter the glow.
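
A minimal sketch of that per-pixel logic is shown below, written in Python for readability; the real implementation lives in a fragment shader, and the glow_range value is an assumed tuning parameter.

```python
def intersection_glow(hand_depth: float, scene_depth: float,
                      glow_range: float = 0.02) -> float:
    """Glow intensity in [0, 1] for one pixel of the hand mesh.

    hand_depth  -- distance of the hand pixel from the camera
    scene_depth -- scene depth read from the depth texture at the same screen position
    glow_range  -- assumed distance (in scene units) within which the glow appears
    """
    difference = abs(hand_depth - scene_depth)
    if difference >= glow_range:
        return 0.0          # far from any surface: no highlight
    # The closer the hand pixel is to the intersected surface, the brighter it glows.
    return 1.0 - difference / glow_range

print(intersection_glow(1.002, 1.000))  # near the boundary -> bright (~0.9)
print(intersection_glow(1.050, 1.000))  # well outside the range -> 0.0
```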

When the glow intensity and depth range are kept low, the effect can be applied universally in an application without appearing particularly eye-catching.

Experiment #2: Adding a color gradient to the fingertips when approaching interactive objects and UI elements

In the second experiment, we decided to change the color of the fingertips to match the surface color of the object we want to interact with. The closer the hand gets to the object, the closer the colors become. This helps the user judge the distance between the fingertip and the object’s surface more easily, while reducing the likelihood that the fingertip will penetrate the surface. In addition, even if the fingertip does penetrate the mesh, the resulting visual clipping is not as jarring, because the fingertip and the object surface are the same color.

Whenever we hover over an InteractionObject, we check the distance from each fingertip to the surface of the object. Then, we use this data to drive a gradient change that affects the color of each fingertip independently.
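
A minimal sketch of this per-fingertip blend is given below, assuming a simple linear gradient and an illustrative hover range; the function name and distances are assumptions, not the Leap Motion API.

```python
def fingertip_color(base, target, distance, max_distance=0.10):
    """Blend a fingertip's color toward the hovered object's surface color.

    base         -- the hand's normal RGB color, e.g. (0.9, 0.75, 0.65)
    target       -- the RGB surface color of the hovered object
    distance     -- fingertip-to-surface distance in metres
    max_distance -- assumed hover range at which blending begins
    """
    # t is 0 when far away and 1 when touching the surface.
    t = max(0.0, min(1.0, 1.0 - distance / max_distance))
    return tuple(b + (s - b) * t for b, s in zip(base, target))

# Each fingertip is evaluated independently, so the finger nearest to contact
# shows the strongest color shift.
skin, cube = (0.9, 0.75, 0.65), (0.2, 0.4, 1.0)
print(fingertip_color(skin, cube, distance=0.02))  # close to contact -> mostly cube color
```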

This experiment really helps us judge the distance between our fingertips and the object’s surface more accurately. In addition, it makes it easier to tell which fingertip is closest to contact. Combined with the effect from Experiment #1, it makes the phases of the interaction (approach, contact, intersection, grasp) clearer.

Experiment #3: Responsive prompts for unpredictable grabs

How do you grab virtual objects in VR? You might make a fist, pinch, or clasp the object. Previously, we tried designing some cues – for example, handles – in the hope that these would guide the user in how to grab objects.

By casting a ray from each finger joint and checking where it hits the InteractionObject, we create a shallow dimple mesh at the ray’s hit point. We align the dimple with the hit-point normal and use the raycast’s hit distance – essentially the depth of the finger inside the object – to drive a blendshape that extends the dimple.
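
The self-contained sketch below illustrates the idea, with a flat horizontal surface standing in for the InteractionObject and the ray reduced to a straight-down projection; the Dimple class, MAX_DEPTH value and all numbers are illustrative assumptions rather than the actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Dimple:
    position: tuple       # where the shallow dimple mesh is spawned
    blend_weight: float   # 0 = flat surface, 1 = fully extended dimple

MAX_DEPTH = 0.03          # assumed penetration depth (metres) for full extension

def dimple_for_joint(joint_pos, surface_height=0.0):
    """Project a finger joint straight down onto a horizontal surface and
    turn its penetration depth into a blendshape weight."""
    x, y, z = joint_pos
    penetration = surface_height - y      # positive once the joint is below the surface
    if penetration <= 0.0:
        return None                       # no contact yet, no dimple
    weight = min(1.0, penetration / MAX_DEPTH)
    return Dimple(position=(x, surface_height, z), blend_weight=weight)

print(dimple_for_joint((0.10, -0.01, 0.20)))  # ~1 cm inside -> weight ~0.33
```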

We then took the concept further. Could we anticipate the proximity of the hand before it touches the object’s surface, and reflect that visually? To do this, we increased the length of the fingertip ray so that a hit is registered before the finger touches the surface. Then, we created a two-part prefab consisting of (1) a circular mesh and (2) a cylindrical mesh with a depth mask that stops rendering any pixels behind it.

We also tried adding a fingertip color gradient again. This time, however, the gradient is driven not by proximity to the object but by the depth of the finger inside the object.

We set the layers so that the depth mask does not affect the InteractionObject’s mesh, but does affect the user’s hand mesh. These effects make grabbing feel more coherent, as if our fingers were being invited to pass through the mesh. Obviously, this approach would need a more complex system to handle objects other than spheres, the palm of the hand, and fingers that are close together.

Augmented Reality Course AUGUST 2018

The AUGMENTED REALITY SHORT COURSE is a two-day professional course that focuses on fundamental concepts and techniques for approaching augmented reality technology (overlaying virtual content on the real world).

Who can Attend

  • AR/VR content creators
  • Those with a keen interest in AR technology
  • Digital marketers
  • Anyone – no coding experience needed

Course Outline : Click here

Venue : School of Computing, Faculty of Engineering, Universiti Teknologi Malaysia, 81310 Skudai, Johor

more details