I’ve met with Misty’s founders a number of times over the years, often in hotel suites at CES, where the company was showing off its robot’s latest capabilities. The firm started life in mid-2017 as a spinoff of Sphero, which was dealing with growing pains of its own at the time as it moved from Disney IP-powered entertainment robots to a purely education-focused business.
Misty came out of stealth under the Sphero banner, led by Sphero co-founder Ian Bernstein, with an $11.5 million Series A led by Venrock and Foundry Group. The company launched an adorable little wheeled robot designed to serve, in part, as a development platform for applications like hospitality.
Misty’s journey will continue, but things are about to look different for the startup, which was recently acquired by Swedish firm Furhat Robotics. The latter says it will keep a number of senior Misty staff on board as it looks to incorporate the technology into its own products.
“For us, Misty is the piece that was missing. It helps us access a larger market focusing on education, for example,” co-founder and CEO Samer Al Moubayed tells TechCrunch. “The technologies are very similar, but Misty is great at the hardware front, and we are very strong on software. Instead of building a new robot for educational use, Furhat decided to start looking for another solution.”
Misty is, at the very least, immediately less jarring than Furhat’s rear-projected face. The company notes on its About page, “Animated projection allows for smoother facial movements as opposed to mechatronic designs, meaning Furhat falls short of the Uncanny Valley effect that most people feel when seeing a robot that appears almost lifelike, but fails to mimic a realistic human.” Uncanny? Maybe not. Unnerving? A bit, yeah.
Anyone in the industry can tell you the importance of creating a kind of emotional bond with a robot. I’ve heard stories about everyone from Roomba owners to manufacturing floor workers sticking googly eyes on their robots to humanize them. Or take all of those viral Boston Dynamics videos of robots dancing to Motown and Katy Perry. It takes the edge off when these machines come across as more human- or animal-like.
A recent MIT Media Lab paper confirms what we sort of already knew on that front: people respond better to machines when they display some of these characteristics. This particular study pitted smart speakers like the Amazon Echo and Google Nest against the bygone social robot Jibo, and researchers reported greater interaction when the machine does something as simple as turning its head toward the speaker the way a person or animal would.
Says MIT, “[P]articipants interacted with a Jibo robot, Amazon Echo, and Google Home, with no modifications. Most found the Jibo to be far more outgoing, dependable, and sympathetic. Because the users perceived that Jibo had a more humanlike personality, they were more likely to interact with it, [research assistant Anastasia] Ostrowski explains.”
Of course, Amazon has taken a step in that direction with its home robot Astro, but the device’s $1,500 price tag makes it cost-prohibitive for most. And while the company insists it’s taking the category seriously, we still feel a long way away from any sort of mainstream success for an Astro, Jibo or Misty. Crossing that threshold will require a dramatic reduction in cost and an increase in functionality that makes these devices feel more accessible and less like toys. In the meantime, education is a fairly practical path forward for Furhat/Misty.
This week brought a pair of big rounds courtesy of French robotics startups. Exotec raised a beefy $335 million Series D, which puts the firm’s valuation at $2 billion. The company deals in the familiar territory of warehouse automation, but takes a fairly novel approach to the category, courtesy of its “Skypods,” which roll around autonomously, climb up the side of a rack and pull down a bin. It’s an extremely clever solution, particularly in densely packed warehouses, as Romain notes in his piece.
“While the entire logistics sector is fraught with uncertainty, one of the most prevalent challenges is ongoing labor shortages,” CEO Romain Moulin said in a statement. “Exotec pioneers a new path: elegant collaboration between human and robot workers that delivers warehouse productivity in a lasting, far more sustainable way.”
Meanwhile, Parisian company Wandercraft has announced a $45 million Series C. The round comes as the company eyes international expansion to the U.S. for its mobility-focused robotic exoskeleton, Atalante. It’s entering a crowded market that includes the likes of ReWalk Robotics, Ekso, SuitX and Sarcos, though Wandercraft distinguishes itself somewhat by focusing exclusively on healthcare.
“We are super excited to have attracted world-class investors from the USA and Europe to advance the development program of the company,” CEO Matthieu Masselin said in a release. “With the support of patients, medical professionals and the DeepTech community, Wandercraft’s team has created a unique technology that improves rehabilitation care and will soon enable people in wheelchairs to regain autonomy and improve their everyday health.”
This week, Aigen announced a more humble $4 million seed round, though that apparently comes with the much grander ambition of “terraform[ing] the Earth.” That starts with solar-powered weeding robots that use computer vision to distinguish crops from weeds.
“My relatives are farmers in Minnesota, and I’ve been talking with them for quite some time. They’re really experiencing some trouble with traditional agriculture approaches,” CEO Richard Wurden tells TechCrunch. “Even the diehard people that love chemicals, that love tilling the earth and other practices that have been releasing carbon in the atmosphere for thousands of years are starting to realize, hey, maybe we should be open to other ways to do this. Right now, agriculture is about 16% of carbon emissions. In the future, it has the potential to go negative, by reducing diesel emissions, soil compaction, chemical usage and reducing tilling.”
Rounding us out this week is a great piece from Rita breaking down the world of China’s robotaxis. Without the proper computer vision, it can be hard to separate the plants from the weeds here, but she does a fine job highlighting the many successes and challenges in that world.
You don’t need computer vision to sign up for the Actuator newsletter (though you may need to prove you’re not a robot).