
Future Pixel Buds could get heart-rate tracking thanks to Google breakthrough

In a recent post on its Research blog, Google reveals it has discovered a way to detect a person’s heart rate through ANC (active noise canceling) earbuds.

The method is called Audioplethysmography (say that three times fast), or APG for short. According to the tech giant, the ANC earbuds send a “low-intensity ultrasound probing signal” through their speakers. That signal bounces around the ear canal, and the echo is picked up by the “on-board feedback microphones”. The echoes are modulated by “tiny ear canal skin displacement and heartbeat vibrations”.
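
To make that sensing idea more concrete, here is a minimal, purely illustrative Python sketch of the mechanism described above: a heartbeat-driven change in the ear canal amplitude-modulates an ultrasound tone, and the feedback microphone records the result. The sample rate, probe frequency, modulation depth, and noise level are all invented for the example and are not taken from Google’s paper.

```python
import numpy as np

# Toy model (not Google's implementation): a heartbeat-driven ear-canal
# displacement amplitude-modulates an ultrasound probe tone that the
# feedback microphone records. All numbers below are illustrative guesses.
fs = 96_000              # assumed sample rate of the feedback-mic path, Hz
f_probe = 30_000         # assumed low-intensity ultrasound probe tone, Hz
heart_rate_bpm = 72      # simulated heartbeat
duration_s = 10.0

t = np.arange(0, duration_s, 1 / fs)

# Tiny periodic modulation standing in for ear-canal skin displacement
# and heartbeat vibrations acting on the echo path.
pulse = 1.0 + 0.005 * np.sin(2 * np.pi * (heart_rate_bpm / 60) * t)

# What the on-board feedback microphone "hears": the modulated echo plus
# noise standing in for music playback and outside sound leaking in.
echo_at_mic = pulse * np.sin(2 * np.pi * f_probe * t)
echo_at_mic += 0.003 * np.random.randn(t.size)
```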

From that feedback signal, the company was able to extract both the heart rate and the heart rate variability, i.e. how much the timing between individual beats fluctuates. Google explains in its post that the ear canal is “an ideal location for health sensing” because it is so densely packed with blood vessels.
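
Continuing the sketch above (it reuses `np`, `fs`, and `echo_at_mic` from that block), one generic way to pull a heart rate and a rough variability figure out of the modulated echo is to strip the ultrasound carrier with an envelope detector, band-pass the result to the cardiac frequency range, and then read off the dominant frequency and the beat-to-beat intervals. This is standard demodulation plumbing, not the actual APG algorithm.

```python
from scipy.signal import butter, find_peaks, hilbert, sosfiltfilt

# Strip the ultrasound carrier: the envelope of the echo carries the
# slow, heartbeat-driven modulation.
envelope = np.abs(hilbert(echo_at_mic))

# The envelope varies slowly, so downsample it before filtering.
decim = 960
fs_env = fs // decim                      # 100 Hz
env_ds = envelope[::decim]

# Keep only the cardiac band (~0.7-3.5 Hz, i.e. roughly 40-210 bpm).
sos = butter(2, [0.7, 3.5], btype="band", fs=fs_env, output="sos")
cardiac = sosfiltfilt(sos, env_ds)

# Heart rate: dominant frequency of the cardiac-band signal.
spectrum = np.abs(np.fft.rfft(cardiac))
freqs = np.fft.rfftfreq(cardiac.size, 1 / fs_env)
hr_bpm = 60 * freqs[np.argmax(spectrum)]

# Heart rate variability: spread of the beat-to-beat intervals.
peaks, _ = find_peaks(cardiac, distance=int(0.4 * fs_env))
beat_intervals = np.diff(peaks) / fs_env
hrv_ms = 1000 * np.std(beat_intervals)

print(f"estimated heart rate: {hr_bpm:.0f} bpm, variability: {hrv_ms:.0f} ms")
```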

APG earbuds detecting heart rate

(Image credit: Google Research)

Surprisingly accurate

As part of its research, Google performed two rounds of studies with 153 participants in total. The results show the ANC earbuds were able to detect heart rate accurately, with a low margin of error of about 3.21 percent. Apparently, the devices could accomplish their task even with music playing and outside sound leaking in. What’s more, the technology isn’t affected by differences in skin tone or ear canal size; it works the same for everyone.

APG isn’t perfect, as it “could be heavily disturbed by” body movement, which could greatly limit its implementation in future devices. However, Google remains hopeful, as it believes this tech is better suited to earbuds than a standard electrocardiogram (ECG). The reason the latter hasn’t been added to headphones is that it would add “cost, weight, power consumption, [design] complexity, and form factor challenges”, preventing wide adoption.

A work in progress

Now the question is: will we see APG in the next generation of Pixel Buds, or any earbuds for that matter? Maybe. It certainly has a lot of potential for health-conscious users who want to track their heart rate but don’t want to commit to purchasing a Pixel Watch 2 or a fitness band. Plus, the company claims it was able to make ANC headphones support APG with a “simple software upgrade”. So what’s the hold-up?

Well, it’s still a work in progress. If you read the full paper on Google Research, the next major step is to improve APG’s performance during strenuous exercise, including, but not limited to, hiking, weightlifting, and boxing. The team behind APG also hopes the findings can be used in other experiments.

Google explains that ANC headphones rely on “feedback and feedforward microphones” to function. Those mics have the potential to open up “new opportunities” for other applications in medicine, from monitoring a person’s breathing to diagnosing ear diseases.

Until we learn more, be sure to check out TechRadar’s list of the best fitness trackers for 2023.
