Google is quietly experimenting with holographic glasses and smart tattoos – CNET

Posted: July 21, 2020 at 12:13 pm

A simple pair of sunglasses that projects holographic icons. A smartwatch that has a digital screen but analog hands. A temporary tattoo that, when applied to your skin, transforms your body into a living touchpad. A virtual reality controller that lets you pick up objects in digital worlds and feel their weight as you swing them around. Those are some of the projects Google has quietly been developing or funding, according to white papers and demo videos, in an effort to create the next generation of wearable technology devices.

The eyewear and smartwatch projects come from the search giant's Interaction Lab, an initiative aimed at intertwining digital and physical experiences. It's part of Google Research, an arm of the search giant with roots in academia that focuses on technical breakthroughs. The Interaction Lab was created within Google's hardware division in 2015, before it was spun out to join the company's research arm about two years ago, according to the resume of Alex Olwal, the lab's leader. Olwal, a senior Google researcher, previously worked at X, the company's self-described moonshot factory, and ATAP, Google's experimental hardware branch.

The goal of the Interaction Lab is to expand Google's "capabilities for rapid hardware prototyping of wearable concepts and interface technology," Olwal writes. Its initiatives appear to be more science experiment than product roadmap, with the likely goal of proving ideas rather than competing with the Apple Watch or Snapchat Spectacles. But taken together, they provide a glimpse at Google's ambitions for wearable tech.

The other projects were collaborations with researchers from universities around the world. At least two of them -- the VR controller and smart tattoos -- were partly funded through Google Faculty Research Awards, which support academic work related to computer science and engineering. The efforts highlight Google's close ties with the academic community, a bridge to the company's beginnings as a Stanford University grad school project by co-founders Larry Page and Sergey Brin that grew into a global behemoth with deep hooks into our lives.

Google and Olwal confirmed the company had developed or funded the projects.

The experiments could play a critical role in coming years as tech giants open up a new battlefront in wearable tech. Many in the industry see it as the next major computing platform after smartphones. Google, Apple, Amazon, Samsung and Facebook -- through its virtual reality subsidiary Oculus -- have all released wearables, including watches, rings, earbuds and jean jackets. Almost 370 million wearable devices will be shipped this year, forecasts the research firm IDC, growing to more than 525 million in two years.

It isn't just about selling hardware. Getting sensor-packed devices onto consumers could mean a treasure trove of data beyond what people produce on their phones or at their desks. It's an especially valuable haul for Google, which makes more than $160 billion a year, mostly through targeted ads that are informed by the personal data of people who use its services. The gadgets also create inroads to lucrative new businesses for tech giants, like health and fitness, though lawmakers and regulators have privacy concerns about Silicon Valley's ever-expanding scope.

Google has been trying to get a toehold in wearables for years but hasn't quite found its footing. In 2012, the company unveiled Silicon Valley's most notorious foray into wearable technology: Google Glass eyewear. The device was maligned from the start and ultimately flopped. Google has also developed an operating system specifically for smartwatches and other devices, called Wear OS, though it's earned little more than a niche following.

Recently, however, the company has made a more determined push. Last month, it acquired North, a Canadian company that makes smart glasses called Focals, reportedly for $180 million. Google last year announced a $2.1 billion deal to acquire Fitbit, the struggling fitness tracker pioneer, in an attempt to bolster Google's hardware operation. The buyout has sparked alarm among critics worried about Google's ability to strong-arm its way into new industries and buy the health data of millions of people.

Making advancements in new wearable form factors, like smart fabrics, is crucial, says Tuong Nguyen, an analyst at the research firm Gartner. "It's to get ahead of the curve," he says. "By learning about the consumer in different ways that other companies aren't doing yet, even if it's an incomplete picture."

Each project is accompanied by an academic white paper, photos and demo videos, as is customary with work done at Google Research. The videos are intended as a showcase of findings for researchers, instead of the slickly produced marketing clips you'd see on stage at a Google launch event. Olwal and Google are listed as authors on all of the papers, but only the eyewear and hybrid watch projects list an affiliation with the Interaction Lab.

The company has already publicly demoed one of the Interaction Lab's projects. The I/O Braid, which the search giant showed off at an AI event in San Francisco in January, allows people to control a device by interacting with a wire. The Braid lets someone, for example, start, stop and control the volume of music on a phone by twisting or pinching the fabric wire of earbuds.

But other efforts of the lab, as well as other wearable tech projects Olwal has been involved with for Google, haven't previously been given a spotlight. Here are a few of them:

1D Eyewear, a Google smart glasses project, was developed by the Interaction Lab.

When Google unveiled Glass, born out of the company's X moonshot factory, critics mocked it endlessly. People were put off by the device's cyborg-like design. A chunky block of glass sat in front of one eye, and the device's processors were housed inside its thick frame and earpiece. Its geeky design, coupled with a fierce privacy backlash, pushed Google to discontinue the consumer version in 2015. Now it's mostly a tool for warehouse workers and other businesses.

The 1D Eyewear project, from the Interaction Lab, appears designed to succeed where Glass most importantly failed -- getting people to want to wear the tech in the first place. The goal is to make the device minimalistic enough that it can still be stylish (though the prototype appears to have a thick earpiece as well).

"The requirement to fit all the electronics, optics and image-generating components, in addition to batteries of sufficient capacity, greatly affects the possible industrial design options," Olwal and his team write in a white paper describing the device. "The variations of styles that end users may choose from is thus limited by these constraints, with reduced flexibility in wearability and aesthetics."

The Interaction Lab's solution is an understated pair of shades that pairs with an Android device and projects holographic icons and colored lights over a wearer's eyes. For example, when using a navigation app, a blinking yellow light above the left frame tells you to turn left, while a light above the right frame points you right. Other notifications are color-coded: A flashing blue light means you're getting a calendar reminder, yellow is for Gmail, and green is for chat or phone notifications.
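
That cue scheme boils down to a small lookup. The Python sketch below is purely illustrative; the function and dictionary names are invented for this article, not taken from Google's white paper.

```python
# Illustrative only: the 1D Eyewear cue scheme described above, expressed as a
# lookup table. All names here are invented for this article, not Google's API.
NOTIFICATION_COLORS = {
    "calendar": "blue",   # flashing blue: calendar reminder
    "gmail": "yellow",    # yellow: Gmail
    "chat": "green",      # green: chat
    "phone": "green",     # green: phone notifications
}

def navigation_cue(turn_direction: str) -> dict:
    """Blink the light above the frame on the side of the upcoming turn."""
    return {"side": turn_direction, "color": "yellow", "blinking": True}

if __name__ == "__main__":
    print(navigation_cue("left"))
    print({"color": NOTIFICATION_COLORS["gmail"], "blinking": True})
```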

A test of the device's hologram system.

The glasses also display 16 different holograms that are projected using laser beams. The pictures are simple line drawings of "common icons for mobile devices," the white paper explains. One is of a phone, another is of a speaker that looks like a volume control tool. It's unclear how they can be used.

The device's development has apparently touched other teams at Google. After the Glass initiative was shelved, the company said it would reimagine the failed project under a new initiative called Aura. It was placed under Google's Advanced Technology and Projects group, or ATAP. In the 1D Eyewear white paper, its engineers list the Google Glass, Aura and X teams as "collaborators." 1D Eyewear is similar to the Aura project, but a Google spokesman said the two are not related.

Grabity, a VR controller, was a collaborative project with researchers at Stanford.

Virtual reality platforms like Facebook's Oculus or HTC's Vive can transport you to another digital world. But those worlds are only as immersive as your ability to explore the environments they create. A device called Grabity, developed in collaboration with researchers at Stanford, is designed to simulate the feeling of grasping and picking up objects in VR.

The prototype isn't worn like a glove but slips onto your thumb and index finger like a boxy controller strapped to your hand. It positions your fingers as if you're holding a soda can. The device uses gentle vibrations, or haptics, to mimic the sensation of picking up a small item in VR games. The haptics are meant to replicate the skin stretching on your fingertips when you've grasped something. To emit the vibrations to your hands, the device contains two small motors called voice coil actuators. The bottom of the gadget has an arm that swings back and forth, giving you a feeling of inertia as you wave the item around in your hand.
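
The published paper doesn't include control code, but the idea reduces to translating a virtual object's weight and motion into actuator commands. The Python sketch below is a hypothetical illustration of that kind of mapping; the names, constants and scaling are assumptions for this article, not the Stanford team's implementation.

```python
# Hypothetical sketch of how a Grabity-style controller might map a virtual
# object's mass and the hand's motion onto its voice-coil actuators (the
# skin-stretch cue) and its swinging inertia arm. Nothing here comes from
# the Stanford/Google paper; constants are illustrative guesses.
from dataclasses import dataclass

@dataclass
class HapticCommand:
    actuator_amplitude: float  # 0..1, strength of the skin-stretch vibration
    arm_angle_deg: float       # where the weighted arm swings to

def grasp_feedback(object_mass_kg: float, hand_accel_ms2: float) -> HapticCommand:
    # Heavier objects stretch the skin more when first grasped.
    amplitude = min(1.0, object_mass_kg / 2.0)
    # Swing the arm against the direction of motion so the hand feels inertia.
    arm_angle = max(-45.0, min(45.0, -hand_accel_ms2 * object_mass_kg * 5.0))
    return HapticCommand(amplitude, arm_angle)

if __name__ == "__main__":
    # Roughly a full soda can being waved around.
    print(grasp_feedback(object_mass_kg=0.35, hand_accel_ms2=2.0))
```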

"We need to think about how we perceive weight," Inrak Choi, one of the project's researchers, and a Ph.D. student at Stanford's Shape Lab, said during a presentation on Grabity in 2017. "Basically it is the combination of multiple sensory systems on the human body." The project was funded partially through a Google Faculty Research Award, according to a white paper on Grabity from 2017.

Choi didn't respond to a request for comment.

Google has struggled with VR. While Facebook and other companies have invested in powerful platforms that require high-end computing power for their VR products, Google has relied mostly on mobile phones. Meanwhile, Facebook's Oculus Quest, the wireless headset, is having a moment. In May, the company announced that consumers have spent more than $100 million on Quest content.

Google made its first foray into VR in 2014 with Cardboard. As the name suggests, a square of cardboard is used to cradle your phone, converting it into a VR headset. Two years later, the company unveiled Daydream, a more polished version of the concept that required juiced-up processing but was still built around using your phone as the brains of the operation. Google quietly shuttered the platform last year.

The company's work with Grabity, though, suggests Google has thought about more complex VR experiences -- with experimental hardware to go with it.

Google developed smart tattoo prototypes with researchers at Saarland University in Germany.

A project called SkinMarks uses rub-on tattoos to transform your skin into a touchpad.

Here's how it works: The tattoos, which are loaded with sensors, are applied to a part of the body, like the ridge of a person's knuckles or the side of a finger. The sensors can be triggered by traditional touch or swipe gestures, like you'd use on your phone. But there are also a few gestures that are more specific to working on the skin's surface. You could squeeze the area around the tattoo or bend your fingers or limbs to activate the sensors.
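
In software terms, each tattoo behaves like a tiny input device that emits gesture events. The sketch below is an illustrative dispatcher for the gesture types described above; the handler names and the actions they trigger are hypothetical, not drawn from the SkinMarks paper.

```python
# Illustrative dispatcher for the SkinMarks gesture types described above
# (tap, swipe, squeeze, bend). The actions are hypothetical examples.
def on_tap():           print("select")
def on_swipe(direction): print(f"scroll {direction}")
def on_squeeze():       print("dismiss notification")
def on_bend():          print("answer call")

HANDLERS = {"tap": on_tap, "squeeze": on_squeeze, "bend": on_bend}

def dispatch(event: dict) -> None:
    """Route a sensor event from the tattoo to an action."""
    kind = event["gesture"]
    if kind == "swipe":
        on_swipe(event.get("direction", "up"))
    elif kind in HANDLERS:
        HANDLERS[kind]()

dispatch({"gesture": "swipe", "direction": "down"})
dispatch({"gesture": "squeeze"})
```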

The benefit of using your skin as an interface, the researchers write in a 2017 white paper, is tapping into the fine motor skills that human beings naturally have. Being able to bend and squeeze is instinctive, so the movements make it more natural to engage with technology. Interacting with your own skin and limbs also means you can do it without looking.

The tattoos are made by screen printing conductive ink onto tattoo paper. The paper is then thermal-cured so it can be applied to the skin. Some of the prototype tattoos include cartoon drawings or light-up displays. The experiment, led by researchers at Saarland University in Germany, is partly funded through a Google Faculty Research Award.

"Through a vastly reduced tattoo thickness and increased stretchability, a SkinMark is sufficiently thin and flexible to conform to irregular geometry, like flexure lines and protruding bones," the researchers write.

The tattoos can be applied to uneven surfaces, like the ridge of a person's knuckles.

Google isn't the only tech giant that has experimented with skin in moonshot projects. In 2017, Facebook unveiled a project that could let people "hear" and decipher words through vibrations on their skin. The concept is similar to braille, in which tiny bumps represent letters and other elements of language. But instead of running your hand over those bumps, you'd feel frequencies in different patterns on your forearm from a sleeve worn on your wrist.

The initiative was one of the marquee projects of Building 8, Facebook's experimental hardware lab. After major struggles, the lab was shuttered a year later.

SmartSleeve is a high-tech textile project.

Two other projects, called SmartSleeve and StretchEBand, are focused on weaving sensors into fabrics.

The SmartSleeve prototype looks like a shooter sleeve that a basketball player might wear. Pressure-sensitive sensors are threaded into the material. The sleeve can read 22 different types of gestures, including twisting, stretching and folding the fabric. It can also interpret when users bend their arms or push the fabric toward their elbows.

In a demo video, researchers give the example of the tech being used to control video playback. Bending your arm starts and pauses the video. Running your finger up and down the sleeve rewinds and fast-forwards. Twisting the fabric like a knob turns the volume up or down.
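
That demo amounts to a simple mapping from sleeve gestures to playback commands. Here is a minimal Python sketch of that mapping; the Player class and the gesture event names are assumptions made for illustration, not part of the SmartSleeve software.

```python
# Minimal sketch of the playback mapping shown in the SmartSleeve demo video.
# The Player class and gesture names are illustrative assumptions.
class Player:
    def __init__(self):
        self.playing, self.position, self.volume = False, 0.0, 0.5

    def toggle(self):
        self.playing = not self.playing

    def seek(self, seconds):
        self.position = max(0.0, self.position + seconds)

    def change_volume(self, delta):
        self.volume = min(1.0, max(0.0, self.volume + delta))

def handle_sleeve_gesture(player: Player, gesture: str, value: float = 0.0) -> None:
    if gesture == "bend_arm":         # bend the arm: play/pause
        player.toggle()
    elif gesture == "slide_finger":   # run a finger along the sleeve: seek
        player.seek(value)
    elif gesture == "twist_fabric":   # twist the fabric like a knob: volume
        player.change_volume(value)

p = Player()
handle_sleeve_gesture(p, "bend_arm")
handle_sleeve_gesture(p, "twist_fabric", 0.1)
print(p.playing, p.volume)
```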

The goal of the project appears to be similar to that of Google's Jacquard initiative, also aimed at creating smart clothing and accessories. Jacquard, which was announced in 2015, has developed a handful of products with internet-connected fabrics, including a denim jacket made in partnership with Levi's. The jacket lets people control music or get traffic updates by swiping the sleeve cuff. A luxury backpack, unveiled last year with Yves Saint Laurent, has a touch- and tap-enabled strap. Most recently, Google partnered with Adidas and Electronic Arts to make a smart shoe sole.

StretchEBand, the other textile project, weaves sensors into stretchable bands and fabric that can be built into a watch strap, a cellphone case, a stuffed animal or the interior of a car. In one example shown in a demo video, pulling on the strap of a car seat handle can recline or adjust the seat. In another, straps attached to the top and bottom of a phone case are used to scroll up or down.

The SmartSleeve project was developed with researchers at the University of Applied Sciences Upper Austria and at Saarland University. The StretchEBand was developed just with researchers at the Austrian school.

Google's Interaction Lab developed a watch with a digital screen and analog hands.

Another Interaction Lab project meshes the worlds of analog and smart watches. The project, which the lab only refers to as "hybrid watch user interfaces," uses the old-school hour and minute hands you'd find on a traditional watch and repurposes them as cursors to point at different commands.

Behind the watch hands is a digital e-ink screen, like the display on an e-reader. The electromagnetic hands are moved by pushing the buttons on the side of the device -- the ones normally used for setting the time on an analog watch.

"Together, these components enable a unique set of interaction techniques and user interfaces beyond their individual capabilities," says the project's white paper, written by Olwal.

One use for the interface could be answering a text. In a demo video, the wearer gets a message that says, "Hey! Send me photos of your new prototypes!" Underneath the text are three options: archive, reply or delete. Pushing a button on the side of the watch moves the clock hand to point at one of the options.
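
In effect, the side button steps a cursor through a short list of options while a physical hand points at the current choice. The sketch below illustrates that loop; the angles, class and method names are assumptions for this article, not taken from Olwal's paper.

```python
# Hedged sketch of the "hands as cursor" idea: a side-button press advances the
# selection and swings the minute hand to point at it. Angles are illustrative.
OPTIONS = ["archive", "reply", "delete"]
OPTION_ANGLES = {"archive": 240, "reply": 0, "delete": 120}  # degrees on the dial

class HybridWatch:
    def __init__(self):
        self.selected = 0

    def press_side_button(self) -> str:
        """Cycle to the next option and move the hand to point at it."""
        self.selected = (self.selected + 1) % len(OPTIONS)
        choice = OPTIONS[self.selected]
        self.move_minute_hand(OPTION_ANGLES[choice])
        return choice

    def move_minute_hand(self, angle_deg: int) -> None:
        print(f"minute hand -> {angle_deg} degrees")

watch = HybridWatch()
print(watch.press_side_button())  # reply
print(watch.press_side_button())  # delete
```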

The idea has been tried before. Two years ago, LG announced the Watch W7, a device that runs on Wear OS and has physical clock hands that sit on a digital screen. The device got a mostly lukewarm reception.

The lackluster LG release may be instructive for Google. It's unclear whether the search giant will ever try to commercialize something from the Interaction Lab, but whatever Google does come up with will have to be compelling enough to stand out in a crowded market. For all its flaws, Google Glass did one thing right: It got everyone's attention.
