
Have you ever called a family member or a friend and, after exchanging just a couple of words, known exactly what the two of you were going to talk about? Calling an automated customer service center is a very different experience. When we speak to a computer, the technology transcribes the words we say, and based on those words alone it decides how to respond to our queries.

So when we speak with a computer, we expect the technology to understand human conversation beyond the spoken words. Which words were emphasized to indicate what is important? What about emotions, or whether the speaker is pressed for time? What about noises in the background, and how a virtual assistant should respond to them? Welcome to the exciting world of audio analytics, which can help bridge this gap by learning the characteristics of human speech and the environment in which communication takes place.

Deep Audio Analytics Beyond Earth's Frontiers

In space, no one can hear mechanical equipment scream, unless it operates in a pressurized environment with a microphone array nearby. Bosch and the American space logistics company Astrobotic Technology Inc. will soon send experimental sensor technology into space aboard a roving robot, to test whether the health of mechanical systems can be assessed just by listening to them. The research will take place on board the International Space Station (ISS) and could commence as early as May 2019.

The joint Bosch and Astrobotic research adds a measure of scientific repeatability to that idea. Bosch's SoundSee technology pairs a microphone array with machine learning to study the information contained in the noises emitted by machinery aboard the ISS. SoundSee's analytics will determine whether audio data can be used to refine and upgrade the operation of the space station.

Dr. Samarjit Das, principal researcher at the Bosch Research and Technology Center in Pittsburgh, explains that machines like pumps and motors emit noise signatures while they operate, and that SoundSee's AI algorithm uses machine learning to analyse these subtle acoustic clues and determine whether a machine, or even a single component of a machine, needs replacement or repair. After the testing phase, the equipment will be mounted on NASA's Astrobee robot, an autonomous free-flying vehicle built by Astrobotic and programmed to navigate the internal chambers of the ISS.
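Bosch has not published the details of SoundSee's algorithm, but the underlying idea, comparing a machine's current acoustic signature against a known-healthy baseline, can be illustrated with a minimal sketch. Every function name, signal parameter, and the simulated "pump" data below are invented for illustration; a real system would use far richer acoustic features and learned models.

```python
import numpy as np

def band_energies(signal, n_bands=8):
    """Summarize a mono audio signal as mean power in coarse frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return np.array([band.mean() for band in np.array_split(spectrum, n_bands)])

def anomaly_score(signal, baseline):
    """Distance between a recording's spectral signature and a healthy baseline."""
    features = band_energies(signal)
    # Log-scale so quiet and loud bands contribute comparably to the distance.
    return np.linalg.norm(np.log1p(features) - np.log1p(baseline))

# Simulated "healthy" pump hum: a clean 120 Hz tone with mild sensor noise,
# one second of audio at an assumed 8 kHz sample rate.
rng = np.random.default_rng(0)
t = np.arange(8000) / 8000.0
healthy = np.sin(2 * np.pi * 120 * t) + 0.05 * rng.standard_normal(t.size)
baseline = band_energies(healthy)

# A hypothetical "worn bearing" adds a high-frequency rattle to the same hum.
worn = healthy + 0.5 * np.sin(2 * np.pi * 3100 * t)

print(anomaly_score(healthy, baseline))  # near zero: matches the baseline
print(anomaly_score(worn, baseline))     # large: spectral signature has shifted
```

The design choice here is the simplest possible one: a fixed distance threshold on hand-crafted spectral features. SoundSee's machine-learning approach would instead learn which acoustic patterns correlate with specific faults, but the input (a noise signature) and the output (a health judgment) are the same in spirit.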

Dr. Joseph Szurley, a Bosch research scientist on the project, adds that Bosch has for some time been interested in using audio analytics to monitor critical machines and equipment, such as HVAC systems or car engines. The ISS will allow the team to study how these techniques extend to even more challenging and unique environments.

About the Project

Bosch in North America and Astrobotic Technology Inc. have entered a research partnership aimed at sending experimental sensor technology to the ISS as early as May 2019. Bosch's SoundSee technology is a deep audio analytics capability that deploys a custom microphone array and machine learning to analyse the information contained in emitted noises. Dr. Andrew Horchler, Astrobotic's lead project engineer, adds that support from NASA has been critical, supplying the requirements and feedback that shaped the design and operational plans and helped the team understand the challenges of sending a hardware payload to the ISS.

On the ISS, researchers will collect data and send it back to Earth for Bosch's experts to study. As the research progresses, the team will adjust operational routines and update the software to improve data collection. Jon Macoskey, research engineer at Bosch and lead payload designer for the project, says this data should allow the team to gain insights into the state of the space station. The long-term goal is to show how anomalies in the station's operation can be detected and that intelligence returned to crew members or ground control. The research holds promise for numerous crewed spacecraft and terrestrial applications, including missions to the Moon and Mars.

The SoundSee project has been in development since the Center for the Advancement of Science in Space (CASIS) approved funding for launch costs and astronaut time aboard the space station earlier this year. CASIS is the organization tasked by NASA with managing the ISS U.S. National Laboratory. The SoundSee payload will launch to the ISS as part of NASA's Astrobee robot and is scheduled for delivery on a future commercial resupply services mission.

Source: https://www.analyticsinsight.net/on-an-upcoming-mission-space-robot-to-test-deep-audio-analytics/
