Google wants to build a computer that “understands” you

In the 1990s, computer scientist Mark Weiser proposed the concept of ubiquitous computing in his article “The Computer for the 21st Century”. He predicted that computers would be woven into networks, environments, and everyday life, becoming ubiquitous and invisible.

Just as we don’t stop to think about what a road sign is when we see one, but read the text on it directly, computers should be “frictionless”: people should be able to use them without thinking. For decades, countless people and technology companies have tried to turn this prophecy into reality. Google is one of them, proposing new ways for ambient computers to understand humans and their behavior.

A computer that can “understand” you
Picture playing a cooking video on a tablet while you cook: when you step away to grab ingredients, the video pauses automatically, and when you come back, it resumes on its own. This scene has been staged many times in science fiction films, and it is the future many smart home companies depict. Now Google has actually done it: the latest Soli demo shows almost exactly this scene.

How does it do it?

Google’s answer is to let the computer understand how you move. As early as the Pixel 4, the company shipped gesture interaction: without touching the screen, a light wave of the hand could switch songs or pause playback, and the phone responded accurately. Gesture interaction relies on the Soli radar sensor at the top of the Pixel 4, which Google now builds into hardware such as the Nest Hub, its smart home controller.

The principle behind a computer “understanding” human movement is actually not complicated. The Soli radar scans its surroundings in real time and builds a rough 3D model of the space, in which the human body appears as a small moving point. From the trajectory of that point, combined with analysis of a large body of data, the system works out what your action is and what your intention might be. For example, when you get up from the sofa, the system can infer that you may be about to leave, and the TV automatically pauses playback. Auto-pause and auto-resume is just one of the scenes Google showed. The video also demonstrates various reminders: when you head out, the thermostat reminds you to take an umbrella to avoid the rain; when you get up to leave, the device reminds you of an upcoming video call.
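Google hasn’t published the actual model behind these demos, but the core idea of reading intent from a point’s motion trend can be sketched in a few lines. Below is a minimal Python illustration; the trajectory data, the threshold, and the `infer_intent` function are hypothetical stand-ins for Soli’s trained pipeline, not Google’s implementation:

```python
import numpy as np

# Hypothetical radar samples: (x, y) position of the tracked "dot"
# (the person), in metres relative to the device, one sample per frame.
trajectory = np.array([
    [0.6, 1.0],
    [0.8, 1.3],
    [1.1, 1.7],
    [1.5, 2.2],
])

def infer_intent(trajectory, speed_threshold=0.1):
    """Classify the motion trend of a tracked point as a coarse intent."""
    # Distance from the device at the start and end of the window.
    start_dist = np.linalg.norm(trajectory[0])
    end_dist = np.linalg.norm(trajectory[-1])
    # Average change in distance per frame: positive means moving away.
    radial_speed = (end_dist - start_dist) / (len(trajectory) - 1)
    if radial_speed > speed_threshold:
        return "leaving"       # e.g. pause the video
    if radial_speed < -speed_threshold:
        return "approaching"   # e.g. resume playback
    return "idle"              # keep the current state

print(infer_intent(trajectory))  # -> "leaving"
```

The real system replaces this hand-written rule with models trained on large amounts of motion data, but the division of labor is the same: the radar supplies the moving point, the software decides what the movement means.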

Thanks to the built-in Soli radar sensor, the Nest Hub can capture movement and track sleep: place it on a bedside table at roughly mattress height, and it automatically tracks your movements and reports your sleep time, breathing patterns, and whether you snore.

The Nest Hub also recommends the best time to fall asleep based on your sleep data.

With the Soli radar sensor and the analysis system behind it, the smart home becomes truly “smart”: it can understand some of people’s actions and respond accurately.

In an interview with Fast Company, Leonardo Giusti, the Google ATAP design lead responsible for Soli’s research and development, said:

We are inspired by the way people understand each other.

When you walk up behind someone, they open the door for you. When you reach for something, the person next to you hands it to you. As humans, we often understand each other intuitively, without saying a word.

This intuitive understanding leads to an almost frictionless experience. Movement is the most universal human “language” and one of the best research threads to pull on.

The “frictionless” experience and spatial behavior
Going unnoticed is the highest praise a hardware experience can earn, and Google’s Soli-based products achieve it. Whether the video pauses automatically when you walk away, or the thermostat reminds you to take an umbrella as you head out, the technology blends into the home environment, subtle and natural.

This frictionless experience is closely tied to “spatial behavior”: to understand people’s actions and postures, you must first understand how people interact within a given space. As Giusti, the design lead of the Google ATAP team, put it:

We imagine people and devices having personal space.

These spaces overlap, giving us more insight into types of engagement and social relationships at any given moment.

The scholar Edward Hall put forward the concept of proxemics, or “spatial behavior”, in the 1960s: people construct interpersonal distances that vary with cultural background. From near to far, these are intimate distance, personal distance, social distance, and public distance.

In practice, when someone approaches a Soli-equipped Nest Hub and makes eye contact, the device should adjust its UI to surface information, and perhaps invite further action. It is like making eye contact with a friend at personal distance, signaling the intent to communicate further.

Proximity interaction is the typical case, and also the function that is easiest to productize. Medium and long distances are harder: when you are at public distance, how should the Nest Hub display content? As you move closer, how should the display change, and should it display anything at all?
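Google hasn’t disclosed how the Nest Hub maps distance to display states, but the proxemics framing suggests a simple structure. The sketch below assumes hypothetical zone boundaries loosely based on Hall’s figures and invented display modes, purely to illustrate the idea:

```python
# A minimal sketch of proximity-aware UI. Zone boundaries loosely follow
# Hall's proxemics; the display modes are invented for illustration.
ZONES = [
    (0.45, "intimate"),        # closer than ~0.45 m
    (1.2,  "personal"),        # ~0.45 - 1.2 m
    (3.6,  "social"),          # ~1.2 - 3.6 m
    (float("inf"), "public"),  # beyond ~3.6 m
]

DISPLAY_MODE = {
    "intimate": "full detail + touch controls",
    "personal": "readable summary",
    "social":   "glanceable info, large type",
    "public":   "ambient clock or nothing",
}

def display_mode(distance_m: float) -> str:
    """Map a measured distance to what the screen should show."""
    for boundary, zone in ZONES:
        if distance_m < boundary:
            return DISPLAY_MODE[zone]
    return DISPLAY_MODE["public"]

print(display_mode(0.8))  # -> "readable summary"
print(display_mode(5.0))  # -> "ambient clock or nothing"
```

The hard design questions the ATAP team describes live in the transitions between these zones, not in the lookup itself: when to switch modes, and when to deliberately stay put.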

Smart interactions should happen when needed rather than interrupting constantly, and deciding when to change what is displayed matters as much as deciding when to leave it alone. These are the next questions the Google ATAP team is looking at. Products that enter intimate and personal distance, such as the Nest Hub and other Soli-equipped devices, naturally also have to take privacy seriously: people feel uncomfortable when their intimate space is intruded upon.

Google has always stressed that the Soli radar is different from a camera. It collects data non-intrusively and does not directly capture personally identifying features such as faces: in the system, the human body is a cluster of blurred dots, and motion analysis works on the movement trend of those dots. The Nest Hub processes motion detection data locally, and the data is never uploaded to the cloud, reducing the risk of leakage.

Whether people trust Google, and how much credit its handling of private information deserves, is another matter. A nearly frictionless experience comes from the ATAP team’s attention to people. From surface-level actions to deeper cultural phenomena, electronic products are beginning to understand people, which is in itself a very futuristic thing.

Hardware for the future
Google ATAP stands for Advanced Technology and Projects, and the profile on its official site describes the team’s direction and the products it ships:

We are creating the future of hardware.

How a company known for its software can combine that strength to build hardware can be seen in Google ATAP’s past projects. The modular phone project, Project Ara, is one of the department’s best known. Google designed a frame carrying basic modules such as the battery, processor, and Wi-Fi radio, most of which could be swapped out, meaning users could upgrade or replace aging phone modules at will.

Modular phones were once considered an innovative and promising idea: countless developers could step in to build modules of every kind, breaking the market dominance of a handful of phone manufacturers.

Today, however, only a few companies such as Fairphone are still pursuing modular phones. The performance limits imposed by swappable modules and an immature third-party development ecosystem led Google to shelve the modularization plan, in which it had played only the role of platform. With Project Soli, Google ATAP began to lean on the company’s real advantage: software algorithms.

Google ATAP first optimized the radar system, keeping the data accurate while shrinking the sensor enough to fit mobile devices such as the Pixel and the Nest Hub. Once in place, the Soli radar acts as the data collector, while the phone or Nest Hub uses software algorithms to determine what human action or gesture occurred and respond to it. The logic is the same as today’s popular computational photography: hardware is the foundation, and software algorithms guarantee the experience.
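That hardware/software split can be made concrete with a toy example. Real Soli gesture recognition runs trained machine-learning models over radar range-Doppler features; the sketch below substitutes a hypothetical nearest-centroid classifier on two invented features, just to show where the sensor’s job ends and the algorithm’s begins:

```python
import numpy as np

# Hypothetical features per frame window:
# (mean Doppler velocity, spatial spread of the reflection)
GESTURE_CENTROIDS = {
    "swipe":    np.array([0.8, 0.1]),   # fast lateral motion, tight spread
    "tap":      np.array([0.2, 0.05]),  # short, small motion
    "presence": np.array([0.05, 0.4]),  # slow, spread-out motion (a body)
}

def classify(features: np.ndarray) -> str:
    """Software side: map a radar feature vector to the nearest gesture."""
    return min(GESTURE_CENTROIDS,
               key=lambda g: np.linalg.norm(features - GESTURE_CENTROIDS[g]))

# Hardware side: the radar only hands over a feature vector per window.
frame_features = np.array([0.75, 0.12])
print(classify(frame_features))  # -> "swipe"
```

As with computational photography, improving the centroids (or, in practice, the trained model) improves the experience without touching the sensor at all.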

Beyond the Pixel 4, Google ATAP has partnered with several companies to put its sensing technology into everyday products, such as the denim jacket launched with the apparel brand Levi’s under ATAP’s Jacquard project. A pill-shaped sensor tag is sewn into the sleeve, and the jacket becomes a smart product.

With it you can control your phone by gesture: take photos, switch songs, pause playback, or answer a call with a tap on your wrist. The Konnect-i backpack supports similar functions. These may not be breakthrough changes in experience; they are more like practical gadgets, turning everyday wearables such as jackets and backpacks into controllers for the phone.

We probably shouldn’t expect Soli to spread across computing products anytime soon, but change always starts small, and products like the Nest Hub already show enough “smarts” to understand people. As the Google ATAP team keeps teaching computers to understand more human movements, useful features like auto-pausing a video when you get up will only multiply.
