Magnetic fields to send data through the human body

Researchers at the University of California, San Diego, are working on a technology that uses the human body as a communication medium, as a safer alternative to Bluetooth for wearable gadgets. According to Patrick Mercier – assistant professor at UCSD and co-director of its Center for Wearable Sensors – Bluetooth radios struggle to transfer data when there is a body in the way and need more power to compensate, a problem that doesn't apply to magnetic fields.

Read more on MIT Technology Review.

Be My Eyes: an app to lend your eyes to blind people

Be My Eyes by Robocat is a simple app with a very special mission: enabling video calls that connect visually impaired people with sighted volunteers willing to help them.

The whole process is straightforward. According to the official website:

“A blind person requests assistance in the Be My Eyes app. The challenge that he/she needs help with can be anything from knowing the expiry date on the milk to navigating new surroundings. The volunteer helper receives a notification for help and a live video connection is established. From the live video the volunteer can help the blind person by answering the question they need answered.”
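
At its core this is a simple dispatch-and-connect flow: a request goes out, volunteers are notified, and the first one to accept gets a live video session. Here is a minimal Kotlin sketch of that flow; all names (AssistanceRequest, VolunteerPool, VideoSession) and the acceptance step are hypothetical, since the actual Be My Eyes backend is not public.

```kotlin
// Minimal sketch of the request/notification flow described above.
// All names and the acceptance step are hypothetical stand-ins.

data class AssistanceRequest(val userId: String, val question: String)

class VideoSession(val requesterId: String, val volunteerId: String) {
    fun open() = println("Live video established: $requesterId <-> $volunteerId")
}

class VolunteerPool(private val volunteers: MutableList<String>) {
    // Notify available volunteers until one accepts, then connect video.
    fun dispatch(request: AssistanceRequest): VideoSession? {
        for (volunteer in volunteers) {
            println("Notifying $volunteer about: ${request.question}")
            val accepted = true // stand-in for a real push-notification round trip
            if (accepted) return VideoSession(request.userId, volunteer).also { it.open() }
        }
        return null // no volunteer available; the real app keeps trying
    }
}

fun main() {
    val pool = VolunteerPool(mutableListOf("volunteer-42"))
    pool.dispatch(AssistanceRequest("user-7", "What is the expiry date on this milk?"))
}
```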

The app is currently available only on the App Store. You can download it from here.

User Experienced Software Aging: Test Environment, Testing and Improvement Suggestions

In this MA Thesis, Sayed Tenkanen (University of Tampere) offers a framework for the automated analysis of software aging.

Abstract: “Software aging is empirically observed in software systems in a variety of manifestations ranging from slower performance to various failures as reported by users. Unlike hardware aging, where in its lifetime hardware goes through wear and tear resulting in an increased rate of failure after certain stable use conditions, software aging is a result of software bugs. Such bugs are always present in the software but may not make themselves known unless a set of preconditions are met. When activated, software bugs may result in slower performance and contribute to user dissatisfaction.

However, the impact of software bugs on PCs and mobile phones is different as their uses are different. A PC is often turned off or rebooted on an average of every seven days, but a mobile device may continue to be used without a reboot for much longer. The prolonged operation period of mobile devices thus opens up opportunities for software bugs to be activated more often compared to PCs. Therefore, software aging in mobile devices, a considerable challenge to the ultimate user experience, is the focus of this thesis. The study was done in three consecutive phases: firstly, a test environment was set up; secondly, mobile device was tested as a human user would use under ordinary-use circumstances and finally, suggestions were made on future testing implementations. To this end, a LG Nexus 4 was setup in an automated test environment that simulates a regular user’s use conditions and executes a set of human user use cases, and gathers data on consumption of power as well as reaction and response times in the various interactions. The results showed that an operating system agnostic test environment can be constructed with a limited number of equipment that is capable of simulating a regular user’s use cases as a user would interact with a mobile device to measure user experienced software aging”.
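
To make the measurement idea concrete, here is a minimal Kotlin sketch of the kind of loop such a rig runs: scripted use cases are executed over and over without a reboot, and response times are logged so that an upward drift – the aging signal – becomes visible. The runUseCase function is a hypothetical stand-in for driving a real device (for example via UI automation), and the power measurements the thesis also collects are omitted here.

```kotlin
// Illustrative sketch only: repeatedly execute scripted use cases and log
// response times, the kind of data the thesis gathers to detect aging.

import kotlin.system.measureTimeMillis

fun runUseCase(name: String) {
    // In a real rig this would tap through an app on the device;
    // here we just simulate some work.
    Thread.sleep(50)
}

fun main() {
    val useCases = listOf("open_browser", "send_sms", "take_photo")
    val log = mutableListOf<Pair<String, Long>>()

    repeat(1000) { iteration ->                 // prolonged use, no reboot
        for (case in useCases) {
            val elapsed = measureTimeMillis { runUseCase(case) }
            log.add(case to elapsed)
        }
        if (iteration % 100 == 0) {
            // A steady upward trend in response times across iterations,
            // without a reboot, is the signature of software aging.
            val recent = log.takeLast(30).map { it.second }.average()
            println("iteration=$iteration mean response=${recent}ms")
        }
    }
}
```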

Download the MA Thesis.

Cyborg Dating: Google Cardboard VR to enhance future human engagement

From the Cyborg Dating website: “Visitors of the Impakt Festival could go on a “Cyborg Date”. They could take a walk through a virtual forest which was situated at GPS-coordinates in the city centre of Utrecht. Navigating was a joint effort. The person wearing the Google Cardboard could see where to go, but the person guiding could see how to go there, avoiding obstacles, people and bicycles. On the way, they both were informed about the progress of the ‘date’. They were given suggestions on what to say, and at certain interactive spots buttons appeared. A rose could be given to the human guide. The contents of a picnic basket could be shared. And at one point en route, the forest could be switched to nighttime, to subtly nudge the date towards a romantic ending.

The growth of the amount of cyborgs amongst us is irreversible. Curious about the impact this will have on society? Meet the future by using new experimental technology engineered by Veenhof and Frabsnap. An interface to let cyborgs anno now, humans with smartphones, date cyborgs of the future: human beings living in both virtual and physical worlds at the same time. Become an instant cyborg using a simple VR-headset extension for your Android smartphone, or use your smartphone to guide a cyborg around the city during a romantic walk through a virtual forest located in the parallel reality of the city of Utrecht. An ideal way to experience the unique new opportunities of sincere and efficient digital communication between future humans.”
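
The one concrete mechanism in all this is anchoring a virtual scene to real GPS coordinates, so that walking through Utrecht moves you through the forest. Here is a hedged Kotlin sketch of one standard way to do that – a local equirectangular projection around a reference point; the project's actual implementation is not published, and the Utrecht anchor coordinates below are assumed.

```kotlin
// Sketch of anchoring a virtual scene to real GPS coordinates, using a
// standard local approximation (equirectangular projection around a
// reference point). The anchor coordinates are assumed, not the project's.

import kotlin.math.cos
import kotlin.math.PI

const val EARTH_RADIUS_M = 6_371_000.0

data class GeoPoint(val lat: Double, val lon: Double)
data class SceneXZ(val x: Double, val z: Double)   // metres in the virtual forest

// Reference point: roughly the centre of Utrecht (assumed anchor).
val origin = GeoPoint(52.0907, 5.1214)

fun toScene(p: GeoPoint): SceneXZ {
    val dLat = (p.lat - origin.lat) * PI / 180
    val dLon = (p.lon - origin.lon) * PI / 180
    // East-west distance shrinks with latitude; fine over a city-scale walk.
    val x = dLon * cos(origin.lat * PI / 180) * EARTH_RADIUS_M
    val z = dLat * EARTH_RADIUS_M
    return SceneXZ(x, z)
}

fun main() {
    // As the walker's phone reports new GPS fixes, the headset camera
    // moves to the matching spot in the virtual forest.
    val fix = GeoPoint(52.0912, 5.1220)
    println(toScene(fix)) // ~41 m east, ~56 m north of the anchor
}
```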


CREDITS:

Concept & realisation: Sander Veenhof & Rosa Frabsnap
Programming interaction: Sander Veenhof
Virtual forest: Paul Siegmann
Video footage: Remko Dekker & Sander Veenhof


SOURCE:

Cyborg Dating originally appeared on Fast.Co Design.

GripSense: Using Built-In Sensors to Detect Hand Posture and Pressure on Commodity Mobile Phones

In a paper published in the UIST ’12 Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, Mayank Goel, Jacob Wobbrock, and Shwetak Patel – University of Washington – present GripSense, a system that infers how a hand is holding and pressing a mobile device.

The abstract: “We introduce GripSense, a system that leverages mobile device touchscreens and their built-in inertial sensors and vibration motor to infer hand postures including one- or two-handed interaction, use of thumb or index finger, or use on a table. GripSense also senses the amount of pressure a user exerts on the touchscreen despite a lack of direct pressure sensors by observing diminished gyroscope readings when the vibration motor is “pulsed.” In a controlled study with 10 participants, GripSense accurately differentiated device usage on a table vs. in hand with 99.7% accuracy; when in hand, it inferred hand postures with 84.3% accuracy. In addition, GripSense distinguished three levels of pressure with 95.1% accuracy. A usability analysis of GripSense was conducted in three custom applications and showed that pressure input and hand-posture sensing can be useful in a number of scenarios.”
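
The pressure trick is the most striking part: pulse the vibration motor and check how much of the vibration still reaches the gyroscope, since a firmer press damps the device's motion. Below is a minimal Android/Kotlin sketch of that idea; the energy feature and thresholds are illustrative guesses, not the paper's actual classifier.

```kotlin
// Sketch of the pressure-sensing idea from the abstract: pulse the vibration
// motor and measure how strongly the gyroscope still registers it. The
// variance feature and thresholds are illustrative, not from the paper.

import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Vibrator

class PressureSensor(context: Context) : SensorEventListener {
    private val sensors = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    private val samples = mutableListOf<Double>()

    fun startPulse() {
        samples.clear()
        sensors.registerListener(
            this,
            sensors.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
            SensorManager.SENSOR_DELAY_FASTEST
        )
        vibrator.vibrate(100L) // short pulse, as in the paper's "pulsed" motor
    }

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        samples.add((x * x + y * y + z * z).toDouble()) // rotational energy
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}

    // Less vibration energy reaching the gyro means a harder press.
    fun estimatePressure(): String {
        sensors.unregisterListener(this)
        val energy = samples.average()
        return when { // illustrative thresholds only
            energy > 2.0 -> "light"
            energy > 0.5 -> "medium"
            else -> "hard"
        }
    }
}
```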

Some further insight: “A typical computer user is no longer confined to a desk in a relatively consistent and comfortable environment. The world’s typical computer user is now holding a mobile device smaller than his or her hand, is perhaps outdoors, perhaps in motion, and perhaps carrying more things than just a mobile device. A host of assumptions about a user’s environment and capabilities that were tenable in comfortable desktop environments no longer applies to mobile users. This dynamic state of a user’s environment can lead to situational impairments [28], which pose a significant challenge to effective interaction because our current mobile devices do not have much awareness of our environments or how those environments affect users’ abilities [33].

[Figure 1. (left) It is difficult for a user to perform interactions like pinch-to-zoom with one hand. (right) GripSense senses the user’s hand posture and infers pressure exerted on the screen to facilitate new interactions like zoom-in and zoom-out.]

One of the most significant contextual factors affecting mobile device use may be a user’s hand posture with which he or she manipulates a mobile device. Research has shown that hand postures including grip, one or two hands, hand pose, the number of fingers used, and so on significantly affect performance and usage of mobile devices [34]. For example, the pointing performance of index fingers is significantly better than thumbs, as is pointing performance when using two hands versus one hand. Similarly, the performance of a user’s dominant hand is better than that of his or her non-dominant hand. Research has found distinct touch patterns for different hand postures while typing on on-screen keyboards [1]. And yet our devices, for the most part, have no clue how they are being held or manipulated, and therefore cannot respond appropriately with adapted user interfaces better suited to different hand postures.

Researchers have explored various techniques to accommodate some of these interaction challenges, like the change in device orientation due to hand movement [2,15]. But despite prior explorations, there remains a need to develop new techniques for sensing the hand postures with which people use mobile devices in order to adapt to postural and grip changes during use.”
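
As a rough illustration of the posture side, here is a short Kotlin sketch that combines touch geometry with the small rotation a touch induces, in the spirit of the paper; the features, thresholds, and class names are invented for illustration, and the real system fuses several such signals over time.

```kotlin
// Hedged sketch of posture inference: touch geometry plus the device
// rotation a touch induces. All features and thresholds are invented.

import kotlin.math.abs

data class TouchSample(
    val x: Float,            // touch position, px
    val touchMajor: Float,   // contact ellipse size, px
    val rollDelta: Float     // device roll change around touch time, rad/s
)

enum class Posture { THUMB_LEFT_HAND, THUMB_RIGHT_HAND, INDEX_FINGER, ON_TABLE }

fun inferPosture(s: TouchSample, screenWidth: Float): Posture = when {
    // Almost no rotation: the device is resting on a surface.
    abs(s.rollDelta) < 0.01f -> Posture.ON_TABLE
    // Thumbs have large contact areas and rotate the device toward the
    // gripping hand when reaching across the screen.
    s.touchMajor > 40f && s.rollDelta > 0f && s.x < screenWidth / 2 ->
        Posture.THUMB_RIGHT_HAND
    s.touchMajor > 40f && s.rollDelta < 0f && s.x > screenWidth / 2 ->
        Posture.THUMB_LEFT_HAND
    else -> Posture.INDEX_FINGER
}

fun main() {
    val sample = TouchSample(x = 200f, touchMajor = 52f, rollDelta = 0.08f)
    println(inferPosture(sample, screenWidth = 1080f)) // THUMB_RIGHT_HAND
}
```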


Read the full paper.


Thanks to Joshua Tucker for sharing this paper on the Framer Community. You can watch his implementation of the hand-posture research in a nice prototype he made with FramerJS.