HyperHaptics
Oscillating between physical and virtual tactility

The introduction of the concept of »the ultimate display« by Ivan Sutherland is regarded as the birth of Virtual Reality, or VR for short. The technological developments of the last 50 years have since produced increasingly realistic experiences that reproduce sensations associated with the physical world. However, these systems primarily address only two of the human senses - our sight and hearing. In comparison, our skin, and with it our sense of touch, is a little-explored sensory interface for VR and AR (Augmented Reality) technologies. The skin forms the basis of all our physical interactions with the outside world. Tactile sensations generate the deepest and most emotional kind of contact between individuals and are therefore essential for immersive virtual experiences.

We understand ›virtual reality‹ not only as a collective term for dedicated technologies, but also as the digital enrichment of physical spaces and objects with multi-sensory levels of information. The aim of the project is to explore the extent to which virtual realities can generate not only equivalent but entirely new ›hyper-realistic‹ experiences by means of synaesthetic stimuli. How can haptics and materiality be transported into the virtual? How can senses be meaningfully coupled to complement and expand physical reality? How can the imagination of users be stimulated while they mentally construct imaginary virtual worlds beyond spatial and temporal boundaries?

›HyperHaptics‹ differs from the ›typical‹ course of a product design project, which usually ends with a single design. In experiments organised across three sprints, students explored the limits of multi-sensory patterns of experience and learned to manipulate them. Each sprint focused on a specific thematic and technological aspect.

The projects shown here are a selection from a large number of prototypes and concepts. In a fourth, final sprint, the students had time to pursue and elaborate one of the approaches they had discovered in the previous sprints. An overview of all sprints is documented in a publication of the same name.

The project is based on the research work of the ›Cutting‹ project at the Cluster of Excellence ›Matters of Activity‹, which deals with the cultural practices of ›cutting‹ and understands them as a process of dividing as well as composing - and, ultimately, deciding.

The course is realised under the patronage of Prof. Carola Zwick by Judith Glaser, lecturer for interaction design (Studio NAND), Felix Rasehorn, Pre-Doctoral Researcher (MoA), and Felix Groll, Head of the eLAB (weissensee school of art and design berlin).
Dance with Somebody
Tal Sznicer
Haptic sensation in an online dance experience
›Letting GO!‹ is a dance workshop that took place online during the COVID-19 pandemic. In times of social distancing, we yearn for human connection. The project examines how an online dance experience can go beyond the limited, everyday use of audio-visual means by incorporating haptic sensations. Although the screen is a window for communication, it also screens out the delicate nuances that we would perceive with our senses when dancing offline. How can an online experience between dancers become more immersive?

The first goal was to transmit the tactile sensation of a dance move from one dancer to the other. It was necessary to define and filter the ›moves‹ from all the motion data captured. To gather movement data, image targets were used to track the dancers' hand movements. Then, by calculating the acceleration, the end position of each dance move was extracted.
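
A minimal sketch of this filtering step, assuming hand positions arrive as (x, y) samples at a fixed frame rate; the function names, threshold and example track are illustrative, not taken from the project:

```python
def accelerations(positions, dt):
    """Finite-difference acceleration magnitudes between position samples."""
    velocities = [
        ((x2 - x1) / dt, (y2 - y1) / dt)
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    ]
    return [
        ((vx2 - vx1) ** 2 + (vy2 - vy1) ** 2) ** 0.5 / dt
        for (vx1, vy1), (vx2, vy2) in zip(velocities, velocities[1:])
    ]

def move_end(positions, dt=1 / 30, threshold=1.0):
    """Return the position where the sharpest deceleration ends a ›move‹."""
    accel = accelerations(positions, dt)
    peak = max(range(len(accel)), key=lambda i: accel[i])
    return positions[peak + 1] if accel[peak] >= threshold else None

# Example: a hand sweeping to the right and stopping abruptly
track = [(0, 0), (1, 0), (3, 0), (6, 0), (6.2, 0), (6.2, 0)]
print(move_end(track))  # -> (6, 0), where the sweep settles
```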



As for the output, the hand proved to be the most appropriate place to receive a ›touch‹ due to its high sensitivity. Vibration motors concealed in a slim band create the feeling of motion on the skin, as if someone were brushing the back of your hand.
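
The travelling sensation could be produced by driving the motors in overlapping pulses, so that the point of strongest vibration moves along the band. A sketch of the timing, with motor count, stroke duration and the triangular falloff all assumed; on hardware, each value would go to a motor driver instead of being printed:

```python
NUM_MOTORS = 4
STROKE_MS = 600   # assumed duration of one ›brush‹ across the hand
STEP_MS = 50      # update interval

def motor_levels(t_ms):
    """Intensity (0-255) per motor: a bump that travels along the band."""
    centre = (t_ms / STROKE_MS) * (NUM_MOTORS - 1)  # current bump position
    return [
        int(255 * max(0.0, 1.0 - abs(m - centre)))  # triangular falloff
        for m in range(NUM_MOTORS)
    ]

for t in range(0, STROKE_MS + 1, STEP_MS):
    print(f"{t:3d} ms  {motor_levels(t)}")
```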

It seems more relevant than ever to try new means that can help us feel a connection with each other - especially when communication in the broadest sense is migrating almost completely to digital spaces, yet the physicality of our human bodies is not.
Individualized AR
Yoorie Kwon
How can personalised augmented reality affect human relationships?
From the findings of experiments conducted with virtual reality in Sprint 2, it became clear that our minds can be shaped by our virtual experiences, not just our real ones. In VR it's easy to manipulate our vision, but by also giving our body the right feedback, we can make our brain believe the virtual experience is real. Through virtual interactions, we gain a feeling of ownership over this virtual body. But if this results in a psychological intervention with long-term consequences for the human experience, what are the ethical considerations designers must take into account?

This design research was inspired by the practical brief set out by the interaction design project HyperHaptics, but also draws on the theory course »Technobodies«, offered in parallel at KHB this semester, which addresses the question of how the human body will relate to technology; there I researched the topic of sense augmentation.

An experiment was designed to test the premise that individualised augmented reality (AR) experiences can affect human relationships. A couple with different preferences enters the same room, but each partner gains an entirely different impression of the space due to an AR layer customised to their individual preferences. A scale model of a room was built that could be ›entered‹ using a small camera, allowing the ›room‹ to be viewed with its personalised AR layer.

Only the furniture which participants would physically interact with was included in the scale model, because in this scenario the decoration - one element of the personalised AR layer - would be solely virtual. The participants ›enter the room‹ and begin to describe and compare the different music, decor and lighting of their individual spaces. The question is: despite their different perceptions of the rooms, do the couple still see this as a shared experience?



Based on the findings of this experiment, the room remains a shared experience as long as each participant communicates their surroundings to the other. But could it affect human relationships in the long term? If personalised AR becomes widely used, we may begin to take our individual perspectives for granted and, as a result, stop sharing information about the experience with our partner. While each participant enjoyed having ›their own room‹ tailored to their taste, they stressed the importance of sharing an actual space to maintain their relationship. Sometimes just sitting together without a word, listening to the same music or seeing the same world, is enough to create a bond.

This small study highlights the importance of involving psychologists and neuroscientists in more rigorous design research processes for the development of personalised AR, because designers need to seriously consider the consequences of applying any technology that can mould our minds and perceptions. 
Jelly, our haptic friend
Minseong Kim
An invitation to call virtual reality into the physical world
If children grow up in a world where the boundaries between virtual and physical reality are increasingly blurred, how will they learn the impact their virtual actions have on the real world? Digital programs aren't affected by real-world forces: a game of ping pong can continue endlessly, with no force dissipating or being lost. But that isn't how things work in the real world. This project builds on experiments from Sprint 3, in which events occurring in a computer program were converted into real, physical outputs using vibration motors.

›Jelly‹ serves as an amplifier for the vibration output. Neither solid nor liquid, ›Jelly‹ is a kind of hybrid. Because the jelly is so fluid and has a large surface area, vibrations spread out and radiate through it. The material also acts as a buffer against force - as a result, its haptic and visual vibration creates a much softer, gentler effect than other materials.



There is no limit to the forms jelly can take. Jellies with different colours and shapes are pleasing to the eye and visualise the flow of vibrations. Patterns drawn on the jelly at regular intervals make the beginning and end of the vibration visible - a force we can now perceive in the physical world. Jelly is a familiar and safe ingredient for children to play with or consume, and it offers kids the possibility to decorate their own haptic toy. ›Jelly‹ is linked to a smartphone, so children can watch actions on the smartphone play out physically in the jelly - virtual reality implemented in the physical world. This enhanced jelly can be a way for children to learn about the boundaries and impacts of behaviour in a virtual world.
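
The smartphone link might translate a virtual event into a physical one along these lines - a sketch assuming each in-app impact maps to a train of fading motor pulses, so the jelly shows a force that begins, dissipates and ends; the function and its parameters are illustrative:

```python
def bounce_to_pulses(impact_strength, decay=0.6, cutoff=10):
    """Map a virtual impact (0.0-1.0) to fading motor amplitudes (0-255)."""
    pulses = []
    amplitude = impact_strength * 255
    while amplitude >= cutoff:
        pulses.append(int(amplitude))
        amplitude *= decay  # each pulse weaker than the last, unlike endless digital ping pong
    return pulses

print(bounce_to_pulses(1.0))  # hard bounce: [255, 153, 91, 55, 33, 19, 11]
print(bounce_to_pulses(0.3))  # soft bounce: [76, 45, 27, 16]
```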
Metron
Johannes Schmidt
Better workout routine through haptics
Working out at home has never been so popular. However, it still can't compare with exercising in the gym and being coached. Virtual fitness is mostly screen-based, and therefore limited to visual and auditory interaction. This slimmed-down digital trainer is missing some key features: users receive no feedback on whether they are carrying out an exercise safely and correctly, and not every exercise position allows users to view the screen. Nor would they wish to - exercise should offer a welcome break from screen time.

While fitness exercises vary a lot in form and effect, they have one common element that appears on different levels: rhythm. We find rhythm in the individual repetitions and the sets and intervals that contain them. Its impact is significant – the right rhythm serves as a guide for the right intensity and execution in terms of tempo and breathing. 

Rhythm is something we feel, so the interaction design should be haptic. User tests revealed that the wrist is a pleasant and effective spot to receive haptic feedback. It's a sensitive area where we're already used to reading information - checking a watch or taking our pulse. It also allows for freedom of movement.



›Metron‹ is an application that guides individual workouts with dynamic haptic feedback. It converts a fitness routine into an oscillating pulse that keeps you in rhythm. The pulse follows a steady sine-wave form: the vibration signal is always on, but its intensity fades in and out in step with the movement. This flowing, dynamic feedback places emphasis on the highs and lows of the exercise.
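
A minimal sketch of such a pulse, assuming a two-second repetition and a 0-255 intensity range (both our choices, not values from ›Metron‹); the signal never drops to zero, but swells and fades with the movement:

```python
import math

REP_SECONDS = 2.0   # assumed duration of one repetition
BASE = 40           # faint baseline so the signal is always on

def intensity(t):
    """Vibration intensity following a sine wave over one repetition."""
    phase = 2 * math.pi * t / REP_SECONDS
    return int(BASE + (255 - BASE) * 0.5 * (1 - math.cos(phase)))

for i in range(11):                  # sample one repetition
    t = i * REP_SECONDS / 10
    print(f"{t:.1f}s  intensity {intensity(t):3d}")
```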

›Metron‹ works with the hardware embedded in a typical smartwatch. Equipped with a tactile engine that can be programmed, most smartwatches would be capable of transmitting ›Metron's‹ dynamic vibration signal. Once the routine is set up, ›Metron‹ sends the signal to the watch, and the workout can begin.
Oscillate.
Tillmann Kayser
Interface for virtual ›room emotions‹
When giving a presentation digitally, it’s very hard to »read the room« and assess how your audience is reacting. In these times, when everyone is working from home, speakers face new challenges. A digital barrier is erected between the presenter and the audience - key emotional feedback is lost, making it more difficult for the speaker to respond to the audience.

To convey the most important emotional feedback through the virtual space, audience responses were condensed into the most significant and frequent types: agitation/quietness and agreement/disagreement. A camera feed serves as input for a machine learning model, which evaluates facial expressions and gestures and interprets them as a ›room emotion‹.
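
The condensing step might look like this - a sketch assuming the machine learning model yields per-participant scores on the two axes; the score ranges and the simple averaging are assumptions:

```python
from statistics import mean

# Hypothetical per-participant scores in [-1, 1], as a model might
# estimate them from facial expressions and gestures:
#   agitation: -1 = quiet, +1 = agitated
#   agreement: -1 = disagreement, +1 = agreement
participants = [
    {"agitation": -0.2, "agreement": 0.8},
    {"agitation": 0.1, "agreement": 0.6},
    {"agitation": 0.4, "agreement": -0.3},
]

def room_emotion(readings):
    """Condense individual readings into one room-level emotion vector."""
    return {
        "agitation": mean(r["agitation"] for r in readings),
        "agreement": mean(r["agreement"] for r in readings),
    }

print(room_emotion(participants))  # this room leans calm and agreeing
```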

Interviews with various users provided important insights that informed the design: all participants agreed that they felt insecure without any feedback from the audience; however, they would also be distracted if too much feedback were given at the wrong time. Haptic feedback in particular was rejected as too disruptive, as was too much ›noise‹. It became clear that users want to decide for themselves when and whether they receive feedback.



The final outcome is ›OSCILLATE‹ - a speculative concept for a non-intrusive interface for virtual conferences that allows speakers to identify the mood in the audience and better assess how their presentation is being received. At its centre, ›OSCILLATE‹ has a clock with a semi-transparent dial, reminiscent of a dandelion clock. The transparency and subtle design are deliberate, so as not to distract the speaker from their primary focus. The different mood states are displayed as a particle cloud. Initially broadly dispersed, the particles begin to gravitate towards each meeting participant. The participants' varying moods cause their particles to cluster at different points on the dial, generating cloud formations. These formations are in constant flux, in line with the changing mood of the audience, and the motion of one ›cloud‹ can also influence the other particles. The end result is a nebulous indicator that gives the speaker a sense of the atmosphere in the virtual meeting ›room‹.
TapTam
Netta Gigi and Maria Soravito De Franceschi
Multi-Sensory interaction game for kids
Very young children explore the world through touch and taste, which aids their development. Today, children are less exposed to tactile, thought-provoking games that engage several senses in parallel. Yet there are many long-term benefits when children are encouraged to explore and be creative at an early age.

›Taptam‹ is a toy for young children that awakens and stimulates as many of their senses as possible, and triggers their curiosity and imagination. By creating a game in which there is no right or wrong, the toy gives children the freedom to use their own imagination and approach things differently.

Knowledge gained from previous experiments with haptic, acoustic and visual feedback made it apparent that the toy should incorporate all of these sensory stimuli - this time adding more tactile stimulation through varied materials and textures.



›Taptam‹ is built on a ›totem principle‹, similar to traditional wooden building blocks that mostly consist of simple geometric shapes with smooth, rounded edges. Each figure represents a specific sensory stimulus, and the blocks click together thanks to concealed magnets. The child is rewarded for playing and interacting with the different blocks, as each provides a different, exciting sensory feedback. Some of the blocks are ›smart‹ - such as a light module that encourages the child's increasing level of interaction: gradually lighting up when lifted, it becomes a colourful light show when connected to the other blocks. Others are based on analogue textural stimuli - such as tunnels lined with varied materials. Alongside a 3D-rendered simulation, two models were created: a 3D-printed technical model that contains the vibration sensors and lights, works with ›Processing‹ and ›Arduino‹ and demonstrates the technical possibilities; and a second model that transposes the ›look & feel‹ of selected materials without the technical components.
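
The light module's escalating feedback could work roughly like this - a sketch with brightness values and connection logic assumed, not taken from the prototype:

```python
def light_level(lifted, connected_blocks):
    """LED brightness (0-255): a glow when lifted, a show when connected."""
    if not lifted and connected_blocks == 0:
        return 0                              # idle block stays dark
    base = 60 if lifted else 0                # gentle glow on pick-up
    bonus = min(connected_blocks, 5) * 39     # brighter with each connection
    return min(255, base + bonus)

for blocks in range(4):
    print(f"lifted, {blocks} connected -> brightness {light_level(True, blocks)}")
```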

While ›Taptam‹ has been designed as a toy, there is also potential for further development for use in therapeutic applications if adapted with specialist input.
The Travel Dice
Alex Ruppert
A neo-analog navigation tool
Besides our physical world, there are spaces we can only explore virtually: VR game worlds, the hypnotising online globe of Google Earth, or lucid dreams - dreams that you can control while asleep. Nevertheless, we humans like the idea of getting to know these ›unreal‹ places. Can a neo-analog tool help us navigate both physical and virtual spaces more intuitively?

The project aim was to create a handy tool, built on research into navigation and how we orientate ourselves within our world. This research covered both the human biophysical features that help us with spatial orientation and the tools and gadgets humankind has developed since ancient times to orientate and navigate: tracking the positions of the sun and stars, the compass, or contemporary GPS coordinates.

The final prototype, called ›Travel Dice‹, is a small sponge dice that fits in your hand and pocket. Its navigational features include position, orientation, and ›magic transport‹. These features can be explored physically via haptic interactions that build on initial technical experiments: to get information about their position, users hold the ›Travel Dice‹ close to the ear and hear a voice that names their current location.



To be guided in a specific direction, the orientation feature serves as a compass: holding the dice in front of the body, the strength of the vibration gives the user feedback about the right direction. For the ›magic transport‹ feature, Google Earth serves as a playground: just as the platform's dice button virtually transports you to another random place on Earth, the ›Travel Dice‹ takes you to another place once you squeeze it in your hand.
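
The orientation feature could map compass readings to vibration along these lines - a sketch assuming a magnetometer heading in degrees and a motor that vibrates harder the closer the dice points towards the target bearing:

```python
def angle_difference(heading, target):
    """Smallest absolute angle between two bearings, in degrees (0-180)."""
    diff = abs(heading - target) % 360
    return min(diff, 360 - diff)

def vibration_strength(heading, target):
    """0 when pointing directly away, 255 when aligned with the target."""
    return int(255 * (1 - angle_difference(heading, target) / 180))

for heading in (0, 45, 90, 180, 270):
    print(f"heading {heading:3d}°  strength {vibration_strength(heading, 90):3d}")
```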

Whereas in our physical world the features of the ›Travel Dice‹ rely on technical possibilities, in lucid dreams its full, unlimited functionality can play out. In lucid dreams anything seems possible, but references are drawn from the familiar: carrying a tool with specific features in the real world will allow users to make use of that tool in a lucid dream. Information about one's position, orientation through the landscape, or jumping to another place by magic transport - the ›Travel Dice‹ will serve many lucid dreamers as the first neo-analog navigation tool.
Wair
Ran Zhang
Feeling what you're breathing
How can we transform invisible data into virtual textures we can perceive?

There is a lot of data in our lives that is invisible but important to us, and we often access it by using tools that turn it into numbers or words. But could we interpret this data as surface textures? Then we could sense the information we want directly with our skin, relying no longer on cold numbers but on our sense of touch.

Air quality was selected as an example for implementing this principle. Air quality is not visible, not touchable, not smellable; under normal circumstances it is difficult for us to perceive directly. Yet it directly impacts our health: air pollution is harmful both psychologically (e.g., tiredness, difficulty concentrating) and physically (e.g., respiratory illnesses, headaches). Indoors or outdoors, air pollution varies from block to block and hour to hour. Air quality is like the texture of air - if only we could find a way to ›feel‹ it.

Experiments in Sprint 3 provided the groundwork for turning these ›textures‹ into vibration patterns. The intensity of the vibration varies as graphic patterns are read - for example, by the size of the overlap between two shapes or, in the case of the final prototype, by the steepness of a curve.



The outcome is a wearable air quality monitor called ›Wair‹. It lets the wearer feel the air quality in real time, like a new sense. Air quality is divided into levels such as fresh air, low pollution, and high pollution, and this data is translated into different curves. These curves are then translated into ripple-like vibrations felt by the user: the worse the air quality, the greater the curvature and the stronger the vibration.
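
A sketch of that two-step translation, with the level names, curvatures and frequencies assumed for illustration: the air quality level selects a curve, and the curve's steepness sets the strength and frequency of the ripple.

```python
import math

# Assumed mapping from air quality level to curve parameters
LEVELS = {
    "fresh air":      {"curvature": 0.2, "frequency_hz": 0.5},
    "low pollution":  {"curvature": 0.6, "frequency_hz": 1.5},
    "high pollution": {"curvature": 1.0, "frequency_hz": 4.0},
}

def ripple(level, t):
    """Vibration intensity (0-255) at time t seconds for a quality level."""
    params = LEVELS[level]
    amplitude = 255 * params["curvature"]   # steeper curve, stronger ripple
    wave = 0.5 * (1 + math.sin(2 * math.pi * params["frequency_hz"] * t))
    return int(amplitude * wave)

for level in LEVELS:
    print(f"{level:15s}{[ripple(level, t / 10) for t in range(5)]}")
```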

When the air is fresh, the user feels a slight, low-frequency vibration - a reminder that this is a safe state. As the air quality worsens, the vibration becomes more intense and higher in frequency. For testing, I used a marker pen to pollute the air, as it evaporates alcohol, and the prototype worked well. ›Wair‹ can work like a skin patch or be integrated into other wearables such as wristbands, masks and so on.