Did you receive this newsletter as a forward? Subscribe here
Why TinyML is still so hard to get excited about
By Stacey Higginbotham
This week, I went to the tinyML Summit in Burlingame, Calif. TinyML, or running small machine learning models on constrained devices, is one of the most exciting technologies I’ve encountered. But it's also the one most likely to put people to sleep when I talk about it.
Using local computing to handle object or even limited face detection, wake word detection, anomaly detection, and more holds the promise of bringing more privacy to the IoT, more sensors to the world, and superpowers to everyday products.
Last year, I was bummed because the conference was heavy on tech and possibilities and light on actual use cases. But this year, the organizers made a big effort to show off users. At the same time, I was struck by just how challenging the technology is to implement — and to get people excited about it.
— A session during the tinyML Summit. Image courtesy of the tinyML Foundation.
Among the various use cases on display, there were two common themes: one, that developing models and running TinyML on hardware wasn't the hard part; and two, that packaging it or making it discoverable was. The other challenge that makes TinyML so hard to talk about is that many of the implemented use cases were hidden or somewhat dull.
While at the conference, I ran into Pete Warden, founder and CEO of Useful Sensors, which I covered last year when it launched an integrated object detection sensor that sells for $10 with the detection model already built in. At the time, he mentioned that the company's next sensor would be a gesture recognition sensor that could be integrated into televisions or other devices. It would recognize a few basic gestures, such as waving a hand to skip to the next image or channel, or putting a finger in front of your lips to mute something.
However, at the conference Warden told me that, while he'd quickly discovered that the model worked, educating people about new gestures was tough. “No one knows that these gestures are available,” he said. This makes sense. If you remember back to the launch of the first iPhone and its touchscreen, the first ads and demonstrations focused on things like taps and pinch-to-zoom. Those weren’t intuitive; they were taught.
So instead, Warden's company is releasing a new sensor that can scan a QR code. The idea behind this $6 sensor is that appliance makers can put it inside their products as an easier way to get devices onto Wi-Fi. A user could simply show their Wi-Fi QR code (I find mine in my router app) to the sensor and get their, say, fridge or washer online. I think it could be neat as a way to transfer a recipe to an oven, or specific washing instructions to a washing machine for particular items of clothing. Unfortunately, clever scenarios like scanning a new shirt so the machine changes its parameters to provide the best wash will be the exception; many of the use cases for TinyML are going to be kind of boring.
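To make the onboarding idea a bit more concrete, here is a minimal Python sketch of what the appliance side of that flow might look like, assuming the sensor hands the firmware the standard WIFI: payload that phones and routers encode in their QR codes. Useful Sensors hasn't published how its module reports the decoded code, so the join_network() stub below is purely hypothetical.

```python
# Minimal sketch of QR-based Wi-Fi onboarding on the appliance side.
# Assumes the sensor delivers the standard payload format, e.g.:
#   WIFI:T:WPA;S:HomeNetwork;P:hunter2;;
# (Escaping of ';' and ':' inside values is ignored to keep the sketch short.)

def parse_wifi_qr(payload: str) -> dict:
    """Split a 'WIFI:' QR payload into its fields (T=auth, S=SSID, P=password)."""
    if not payload.startswith("WIFI:"):
        raise ValueError("not a Wi-Fi QR payload")
    fields = {}
    for part in payload[len("WIFI:"):].split(";"):
        key, sep, value = part.partition(":")
        if sep:  # skip the empty chunks left by the trailing ';;'
            fields[key] = value
    return fields


def join_network(ssid: str, password: str) -> None:
    """Hypothetical stand-in for whatever the appliance's Wi-Fi stack exposes."""
    print(f"Joining {ssid!r}...")


def onboard_appliance(payload: str) -> None:
    creds = parse_wifi_qr(payload)
    join_network(creds.get("S", ""), creds.get("P", ""))


if __name__ == "__main__":
    onboard_appliance("WIFI:T:WPA;S:HomeNetwork;P:hunter2;;")
```

The appeal of the approach is that the appliance never needs a screen, an app, or a Bluetooth pairing flow; the camera-style sensor and a few lines of parsing do all the work.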
Elsewhere at the event, HP showed off two TinyML implementations with ST Micro that are embedded in new laptops. The first TinyML model uses a gyroscope to detect if a laptop has been placed in a bag or taken out of a bag. The idea behind the implementation is that the laptop will start booting up when it’s taken out of a bag in preparation for its owner to use it. If the model detects the laptop has been placed in a bag, it will change heating and cooling parameters to make sure the laptop doesn’t overheat.
The second use case also helps with thermal management. In that use case, the laptop detects when it is on a hard or soft surface. If it's on a soft surface, like a bed or a person’s lap, it will try to run cooler so as to avoid overheating.
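HP and ST Micro haven't published how these models plug into the firmware, but the general pattern is easy to picture: a small classifier labels the laptop's context, and that label gates a thermal policy. Here's a hypothetical Python sketch; the context labels and power numbers are made up purely for illustration.

```python
# Hypothetical sketch of how a bag/surface classifier's output might gate a
# laptop's thermal policy. The Context labels stand in for whatever the
# embedded model actually reports; the power limits are invented numbers.

from enum import Enum, auto


class Context(Enum):
    IN_BAG = auto()
    ON_HARD_SURFACE = auto()
    ON_SOFT_SURFACE = auto()


def thermal_policy(context: Context) -> dict:
    """Map the detected context to (illustrative) thermal settings."""
    if context == Context.IN_BAG:
        # No airflow in a bag, so cap power hard and keep the fans quiet.
        return {"power_limit_w": 5, "fan_curve": "silent"}
    if context == Context.ON_SOFT_SURFACE:
        # Vents may be blocked by a lap or duvet, so run cooler.
        return {"power_limit_w": 15, "fan_curve": "conservative"}
    # Hard surface: full performance.
    return {"power_limit_w": 28, "fan_curve": "default"}


if __name__ == "__main__":
    for ctx in Context:
        print(ctx.name, thermal_policy(ctx))
```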
Which is neat, but not anything you'd write home about. It’s also not a reason someone would buy a laptop, which makes it hard to justify adding TinyML to one. Many of the consumer use cases at the show fit this mold. Using TinyML to track where a person’s face and ears are as part of a sound bar, for example, does help deliver great sound, but it’s also a nice-to-have element, not a need-to-have one.
On the industrial side, things get a little more interesting, but the challenge there is that few companies want to talk about TinyML. As Warden noted to me, industrial users view success with TinyML as a competitive advantage and so are loath to share the details of their success with potential competitors. Having previously been at Google and elsewhere in the tech world, where success in innovation is heavily touted, he found the reluctance to share disheartening and surprising. I found his surprise at this charming.
Another example of how difficult it was to turn a TinyML solution into a product came during a presentation from the founders of a startup called Shoreline IoT. Shoreline IoT makes a ruggedized sensor that can be flashed with different ML models to detect different issues. CEO Kishore Manghnani said that getting useful models running on the computing hardware only solved about 15% of the problem associated with industrial sensing. The other 85% was in packaging the sensor into a form factor that could be deployed by anyone, in rugged environments, with good connectivity (among other things).
Boring use cases, challenges packaging a solution, and customers that don't want to talk are not obstacles solely faced by TinyML. In many ways, these are issues the tech industry will have to increasingly confront as it pushes computing and connectivity into more places. While a computer felt like it was a solution in and of itself after we added the internet and an array of online services (instead of the fancier calculator, word processor, and game player it was in the late 70s and 80s), computing is really just a tool designed to solve existing problems.
In many circles, connectivity and computing are seen as a way to add new services to more devices (and charge for them accordingly), but it may be that all we really need are new ways to solve old problems using better tools. TinyML is one such tool that will allow more information to be processed quickly, privately, and perhaps without consuming much power.
That's nothing to scoff at, but it may mean that those touting the technology have to adjust their expectations accordingly.
What Is a Column Database, and Why Use One? SPONSORED
If you are working with large amounts of data that will primarily be used for analytics, a column database might be a good option.
There are a lot of different options when it comes to choosing a database for your application. A common debate is whether data should be stored in a relational database or in a NoSQL alternative such as a key-value, document, or graph database.
Another option is to flip things entirely by going with a column database. In this article, learn why this new contender might be the best fit for IoT.
Legato Logic is building a chip for battery-powered sensing |
It's clear there is huge demand for low-power, smarter sensors at the edge, and Legato Logic has a chip that it thinks can meet that demand using less than a milliwatt of power. The company has designed a chip to handle small machine learning models using a fraction of the power those models would require if they were running on a traditional microcontroller.
Shahrzad Naraghi, co-founder and CEO, calls the concept "perception" as opposed to "processing." The two-year-old company has taped out a prototype version of its chip to show that its idea works, and is now looking for partners and investors to help it release a second chip to test with potential customers.
— Shahrzad Naraghi, co-founder and CEO of Legato Logic, and Ankush Goel, co-founder and CTO. Image courtesy of S. Higginbotham.
There are two things to like about Legato. One is that it has focused on bringing a piece of silicon to the market that already has machine learning models running on it. In Legato's first chip, the models will detect objects and wake words. The approach resembles the simple sensors that Pete Warden is building with Useful Sensors (see story above): hide the complexity of matching a model to hardware and simply offer a low-power sensor.
The second thing to like is that the sensor can run on harvested energy or battery power for years. That means a city could put a chip like this on a streetlight and use it to count cars, or maybe even to detect people. If this sort of functionality could be built into a light and did a good enough job tracking people, the city could then use that detection to trigger the lights to brighten or flash, for example, thereby alerting drivers when there's a person on the road.
Again, detecting people with an always-on sensor that doesn't send images back to the internet isn't a terribly exciting use case, but using that information to take some sort of action does let us use computing as a tool to solve a real problem in a new, cheap way.
Legato's technology is a specially designed computing engine that uses time-domain techniques to perform calculations at incredibly low voltages, which saves on power. I'm not going to force y'all to focus on that part other than to note that it is technically feasible and the resulting chip can be built using traditional manufacturing processes. The chip does need to be specially programmed, but because it's an integrated model/sensor requiring specialty software, that isn't really a big deal. It's not like we want it to run Windows.
The founders have so far raised about $500,000 in funding and are seeking about $3 million to get the second generation of their chip into the hands of potential customers. The founders have worked together before, at companies including IBM and Cadence, and count Alessandro Piovaccari, the former CTO of Silicon Labs, as an advisor.
I like the concept, and I find myself wondering if power-sipping silicon designed for highly specific use cases that could scale to billions of units (these tiny sensors could end up everywhere) changes the economics of how we build chips and software. We're so used to having a few general-purpose options (Arm vs. x86) that can scale massively, and to building a limited set of software that supports those systems.
But if computing really does end up everywhere, it stands to reason that some chip designs or ideas that would have otherwise been doomed to become expensive ASICs could become incredibly widespread and get their own software ecosystems. I don't think that will necessarily happen here, but it's an idea I've been spending a lot more time thinking about lately.
Podcast 416: What the heck is an IoT hyperscaler?
With this week's show I feel like we're singing the same old tune. Philips Hue maker Signify is delaying its implementation of Matter while it waits for others to implement features it needs. Meanwhile, Eve has started selling plugs that are Matter-ready from the get-go, and will sell new Matter-ready contact and motion sensors starting April 17. In related news, we tout the fact that the Thread Group has now certified 200 devices. We also see a new integrated DIY home security product from Google and ADT, the culmination of the $600 million partnership the two signed three years back. In enterprise news, we discuss Kore's acquisition of Twilio's IoT assets and try to figure out what an IoT hyperscaler is. Amazon has also opened up its Sidewalk Network, a free LPWAN for connecting devices (it's free because it sends your data to AWS). We talk about what I saw with regard to Sidewalk coverage in my travels around Seattle and the Bay Area. We then hear about Kevin's frustrations with HomeKit and the latest Apple iOS upgrades that broke his smart home, and new features from the Home+ app, which Kevin uses to manage his devices. I then review the Homey Bridge, a DIY smart home hub. Finally, we answer a listener question about Shelly products.
— The ADT/Google starter bundle, which retails for $220. Image courtesy of ADT.
Our guest this week is Chuck Sabin, the head of market development for the Bluetooth SIG. He is on the show to discuss the newly launched Bluetooth standard for electronic shelf labels. We discuss what electronic shelf labels will enable for consumers and retailers, as well as the different services and profiles that the SIG has built into the standard. After extolling the potential benefits for Instacart shoppers, we then talk about smart tags and the concept of ambient IoT. You'll be hearing that phrase a lot more often. The SIG is working on a standard around smart tags, as well as updating its networked smart lighting standard. You'll get a good sense of what Bluetooth plans to bring to the IoT, so enjoy the show.
The Homey Bridge is a nice-to-have DIY hub option |
— I have been testing a DIY smart home hub from European company Athom, which just launched in the U.S. The Homey Bridge is only $69 but has Wi-Fi, Zigbee, Z-Wave, Bluetooth, and an IR blaster, and integrations with a lot of popular smart home brands. To find out if this hub is right for you, read my review here. Image courtesy of S. Higginbotham.
News of the Week |
This week's news was compiled and written by Kevin C. Tofel
Wirepas is the newest CSA member: Add this to the list of things I didn’t see coming: Wirepas joined the Connectivity Standards Alliance (CSA) this week. Why does this surprise me? Mainly because Wirepas isn’t a connected device maker, nor does it have products that work with Wi-Fi or Thread, which are used by the CSA’s Matter standard. The main focus of Wirepas of late is the non-cellular 5G DECT NR+ standard. Regardless of the wireless protocol used, Wirepas brings a high level of expertise in decentralized mesh networks, something that could be of huge benefit to the CSA moving forward. (Wirepas)
There’s a new chair in town: Speaking of the CSA, the organization announced that its current chair, Bruno Vulcano of Legrand, is now the Alliance Chair Emeritus. That opened up a key position and no, I didn’t get it. Instead, Musa Unmehopa from Signify will take over the reins. Having a chair from Signify, formerly Philips Lighting, is certainly a good choice. However, it’s also notable that Philips Hue, a Signify company, this week failed to deliver its promised Matter upgrade to the Philips Hue Bridge by March 31. In fact, there’s no new time frame for Matter support with Philips Hue. I guess the CSA members will have something fun to discuss at the next board meeting! (Connectivity Standards Alliance)
GitHub adds free support for one-click SBOMs: We’ve recently discussed SBOMs, or software bills of materials, as they pertain to the IoT. These are used to show a list of open source software and service components, helping to track potential vulnerabilities and software versions. SBOMs are key to understanding and managing the software used by IoT devices. This week, GitHub announced “self-service SBOMs,” a method that provides such a list of software components in a standard format approved by the National Telecommunications and Information Administration. The best part? GitHub creates the SBOM for free with a single click in a code repository, which reduces any friction that might come with building and maintaining them. (GitHub)
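For the automation-minded, the same export appears to be available over GitHub's REST API, so CI pipelines and device-fleet tooling can fetch SBOMs without clicking through the UI. A rough Python sketch follows; the endpoint path and response layout are based on GitHub's REST docs and are worth double-checking there, and the token and repository names are placeholders.

```python
# Rough sketch: pull a repository's SBOM via GitHub's REST API instead of the
# one-click UI export. Verify the endpoint against GitHub's current REST docs;
# the repository and token below are placeholders only.

import json
import urllib.request

OWNER, REPO = "octocat", "hello-world"   # placeholder repository
TOKEN = "ghp_example"                    # placeholder personal access token

req = urllib.request.Request(
    f"https://api.github.com/repos/{OWNER}/{REPO}/dependency-graph/sbom",
    headers={
        "Accept": "application/vnd.github+json",
        "Authorization": f"Bearer {TOKEN}",
    },
)

with urllib.request.urlopen(req) as resp:
    sbom = json.load(resp)

# The export is SPDX JSON; list the package names it contains.
for pkg in sbom.get("sbom", {}).get("packages", []):
    print(pkg.get("name"))
```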
Tiny AI measures vibrations with minuscule data: Plenty of IIoT devices are monitored for vibrations as the data can be useful for predicting breakdowns or non-optimal machinery performance. But while that sounds good, such sensors create a massive amount of data. And much of it is just "noise" in the grand scheme, which means it requires lots of compute power to get at the useful data. Polyn has an answer with its VibroSense Tiny AI chip solution, which it introduced this week. The company says it uses small vibration patterns and Tiny AI to get the relevant data, with the patterns being up to 1,000 times smaller than the data created from traditional vibration sensors. That means less data sent to the cloud and less power required for actionable information. (Polyn)
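Polyn hasn't published how VibroSense compresses the signal, but the general idea is easy to illustrate: turn a torrent of raw samples into a handful of numbers that still capture the machine's condition. Here's a generic Python sketch (not Polyn's method) that reduces a one-second window of accelerometer samples to a few frequency-band energies; the sample rate, bands, and synthetic signal are all made up.

```python
# Generic illustration (not Polyn's actual method) of on-sensor feature
# extraction for vibration monitoring: a window of raw samples is reduced to
# a handful of frequency-band energies before anything leaves the device.

import math


def dft_power(samples):
    """Naive DFT power spectrum (fine for a demo; a real device would use an FFT)."""
    n = len(samples)
    spectrum = []
    for k in range(n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        spectrum.append((re * re + im * im) / n)
    return spectrum


def band_energies(samples, sample_rate_hz, bands):
    """Collapse the spectrum into total energy per (low_hz, high_hz) band."""
    n = len(samples)
    spectrum = dft_power(samples)
    return [
        sum(p for k, p in enumerate(spectrum) if low <= k * sample_rate_hz / n < high)
        for low, high in bands
    ]


if __name__ == "__main__":
    rate = 256  # samples per second (made-up rate to keep the demo fast)
    # Synthetic one-second window: a 50 Hz hum plus a weaker 100 Hz bearing tone.
    window = [
        math.sin(2 * math.pi * 50 * t / rate) + 0.3 * math.sin(2 * math.pi * 100 * t / rate)
        for t in range(rate)
    ]
    features = band_energies(window, rate, bands=[(0, 40), (40, 80), (80, 128)])
    print(f"{len(window)} raw samples -> {len(features)} band-energy features: {features}")
```

Even this toy example cuts 256 samples down to three numbers; do that on every window, on the sensor itself, and the data leaving the device shrinks dramatically.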
You may be talking to Google Bard, not Google Assistant: I’m sure you’ve been hearing a lot about generative AI in the news lately. It seems like we’re in a digital gold rush of artificial intelligence right now as large companies scramble to add this technology into any products they can. To that end, Google recently released Bard, its entry in this space. Surprisingly though, Google this week announced that it was reorganizing its Assistant division with a major focus shift to Google Bard. While I wouldn’t expect Assistant features to simply stop evolving, I do think we’ll see more experimentation as Google attempts to bridge its Bard AI service with the Google Assistant. That’s not necessarily a bad thing as digital assistant smarts have stagnated of late. However, based on my own usage of Bard, Google has a long way to go before it becomes a value-add to the Assistant experience. (CNBC)
Show me your hand and I’ll give you a sandwich: If the idea of scanning your palm to shop at an Amazon Fresh or Whole Foods gives you the willies, you’re not going to like this next development. At least not if you also like Panera Bread. The company has partnered with Amazon to bring the latter’s palm-scanning payment system to two St. Louis Panera locations. More Panera Bread locations will follow suit, although so far there are no details as to where, when, and how many. I’m still not a fan of giving Amazon even more of my data, especially when it comes to biometric data. Guess I’ll just have to visit my local Subway more often. (Amazon)
Ikea HomeKit support is a breath of fresh air: Good news for those who bought, or plan to buy, the Ikea Vindstyrka air quality monitor and use Apple Home. Since its launch, the Ikea sensor has only worked with Ikea's Dirigera hub. A firmware update released this week adds HomeKit support for the Vindstyrka. That means your Apple Home can get temperature, humidity, VOCs, and particulate matter (PM 2.5) data from your Ikea sensor once you upgrade your hub. (HomeKit News)
Tapo adds three new smart home devices, but…: Tapo, a TP-Link smart home brand, has a trio of new devices available, although unlike the company’s newest smart plug, they don’t use Thread. Instead, Tapo is using Wi-Fi and 900 MHz spectrum for connectivity. There’s a $23 smart hub with a chime, and both a new smart motion sensor and contact sensor that retail for $20 each. Tapo says the hub can support up to 64 devices, which is more than enough for those new to the smart home. While these new Tapo products offer an inexpensive way to add some smarts to your home, I’d rather see more Thread and Matter products from the company, even if they cost more. (Tapo)
Maybe the FireTV can earn Alexa more money: It’s been reported that Amazon has lost billions of dollars on its Echo and Alexa products, mainly because the company is finding it hard to monetize them. Perhaps it was focused on the wrong products all along. That’s the premise of this article, which suggests Amazon FireTV is the better option to boost revenues. It does make sense given that “customers who use voice features on Fire TV engage with content nearly twice as much as those who don’t,” according to Amazon. I don’t own a FireTV so I can’t speak to the experience of it as compared to an Echo. But I’d love to hear from those who do own one. Do you engage with it in ways that can add more money to Amazon’s bank account as opposed to using an Echo speaker or display? (ArsTechnica)
Sorry Apple, iOS 16.4 didn’t fix my broken HomeKit home yet: On the podcast this week, I expressed utter frustration with the new Apple HomeKit architecture that’s plagued me since iOS 16.1. That architecture update was pulled, but not before I installed it, and we’ve had nothing but problems with HomeKit ever since. So when I heard that this week’s iOS 16.4 update would finally address the problem, I did my duty and updated all of my iOS devices. And now my wife can’t see any of our smart home devices. I’ve been relegated to receiving her text messages asking me to dim or turn off a light. For many, likely those who didn’t get the architecture update before it was pulled from iOS 16.1, all is well. For others, like me, the problems haven’t been resolved. If you’re also in this boat, please accept my sympathy and this forum of potential fixes. (Reddit)
Want to sponsor this newsletter and the IoT Podcast? Our 2023 media kit is now available. Request a media kit.