
Autobrains nabs $19M, bringing its Series C to $120M, to take on Mobileye in autonomous driving tech


AI has been the backbone of many a technological breakthrough over the years, but one challenge it has yet to solve is self-driving: try as they may, engineers have yet to build a platform that can manage all the practicalities and unexpected eventualities of driving a vehicle as well as or better than a human can, and that has also convinced regulators and the general population of its reliability. We’re still seeing a lot of development, however, and today Autobrains, one of the hopefuls in this space, which believes it has figured out how to fix the 1% margin of error typical in self-driving with a hardware-agnostic “self-learning” approach (more on that below), is announcing yet more funding to continue developing its platform.

The Israeli startup has raised $19 million, rounding out its Series C at $120 million. The first tranche of the round was made public in November 2021, and altogether the investor list includes Temasek, previous strategic backers Continental and BMW i Ventures, and new backers Knorr-Bremse AG and VinFast. As before, the company is not disclosing its valuation, but the crowded space it operates in provides some comparable numbers for context.

Israel’s Mobileye, which Autobrains’ CEO and founder Igal Raichelgauz describes as his company’s biggest competitor, earlier this month filed confidentially for an IPO (owner Intel would retain a stake in the spun-out company should this go ahead). It’s been reported that Mobileye could be valued at around $50 billion if it lists. Wayve, a U.K.-based self-driving startup, raised $200 million in January at a valuation in the region of $1 billion.

Autobrains has to date raised just under $140 million, and it’s taking an approach that it believes will give it more traction in the market because of its flexibility.

A lot of self-driving technology (Mobileye’s being one example) is built around LIDAR sensors, with a few companies (like Wayve) building systems on a lower cost basis, using radar, smartphones and AI to stitch the experience together. Autobrains takes a different approach that might be described as hardware-agnostic: it uses radar, and also LIDAR, but only if the OEM has built it in.

The company’s approach comes from more than a decade of R&D. The startup descends from a company called Cortica AI (which Raichelgauz also founded), which has spent years building AI-based imaging technology applied across a wide variety of use cases (our first coverage of it, in fact, was about developing image recognition for advertising). Autobrains was spun out, initially branded as “Cartica AI,” to realize more of the value of that IP as it pertained to the very specific use case of driving. The company says it has more than 250 patents filed on its technology already.

One of the main barriers to self-driving AI has been the inability of machine learning systems to account for edge cases, since decision-making is based essentially on labelled data sets that have been fed into the algorithms. “It’s a very expensive process involving thousands of people, but still faces the challenge of accuracy because you can’t cover all the edge cases,” Raichelgauz said. In one tragic example, while the safety operator in the Uber self-driving pilot accident in Arizona was charged over the crash, the reason the car didn’t stop on its own was that it didn’t recognize the jaywalker.
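To illustrate the limitation Raichelgauz is describing, here is a minimal, hypothetical sketch (not Autobrains’ or any carmaker’s actual pipeline): a classifier trained only on labelled examples can only ever map new input onto the labels it has already seen, so an out-of-distribution edge case gets forced into a familiar, possibly wrong, category.

# Illustrative toy example only: the feature values and class names are
# hypothetical and stand in for a labelled perception data set.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Two labelled clusters of 2-D "features": vehicles and pedestrians on a crosswalk.
vehicles = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(50, 2))
crossing = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(50, 2))
X = np.vstack([vehicles, crossing])
y = ["vehicle"] * 50 + ["pedestrian_on_crosswalk"] * 50

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# An out-of-distribution observation (say, a jaywalker pushing a bicycle):
# the model has no choice but to assign one of the labels it was trained on.
edge_case = np.array([[1.5, -2.0]])
print(clf.predict(edge_case))  # prints one of the two known labels regardless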

As Raichelgauz describes it, Autobrains does not depend on labelled data, and has been built to work “closer to [the] human way” of learning: keeping the data randomized, letting the platform find the commonalities, then going over the learnings to keep what is relevant to continue learning from (e.g. clothes that are the same color as the background) and disregard details that are not (e.g. the shapes of clouds). What is kept then starts to form clusters of understanding that teach the self-driving platform to react more accurately to related scenarios. Pedestrians, for example, might have up to 100 different classes of behavior being developed on the Autobrains system.
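As a generic sketch of this kind of unsupervised, “self-learning” grouping (not Autobrains’ actual algorithm, whose details aren’t public), here is a toy example in which unlabelled observations are clustered by their commonalities, with no human-provided labels involved; the behavioral interpretations in the comments are hypothetical.

# Illustrative toy example of unsupervised clustering; the data and the
# behavioral interpretations in the comments are hypothetical.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Unlabelled 2-D "observations" drawn from three hidden behavior patterns.
data = np.vstack([
    rng.normal([0.0, 0.0], 0.2, size=(40, 2)),  # e.g. pedestrians waiting at the curb
    rng.normal([2.0, 2.0], 0.2, size=(40, 2)),  # e.g. pedestrians crossing steadily
    rng.normal([4.0, 0.0], 0.2, size=(40, 2)),  # e.g. pedestrians stepping out suddenly
])
rng.shuffle(data)  # order carries no information; the data stay randomized

# The algorithm groups recurring patterns on its own; no labels are fed in.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(data)
print(np.bincount(clusters))  # roughly 40 observations per discovered cluster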

Currently the platform is set up for two levels of self-driving. The first feeds assisted-driving systems aimed at improving human driver safety; it is scheduled to be rolled out commercially in 2023, adding on average $100 to the price of a car. The second is aimed at self-driving at Levels 4 and 5, is “being worked on now” and will use whatever hardware has been built into the vehicle. It’s projected to cost in the “few thousands of dollars” at the moment, and production should start on it in 2024, with the caveat that this could move depending on the market, its customers’ appetite to invest, the progress of the technology and, of course, what consumers realistically will want and use. (The two-level approach, focusing initially on scenarios involving AI-based driver assistance rather than autonomy, is one that other startups in the space are also taking: for example another self-learning startup called Annotell, which also recently raised funding.)

“I think it’s a process, not an immediate target,” Raichelgauz said of the fully autonomous roadmap. “But if we can commit to 2024, [we can do so understanding] it will take time to see how we can scale it safely. The way it will happen is the differentiating factor for us.”

“Autobrains’ technology holds the promise we have all been looking for to create the paradigm shift in the industry to self-learning AI, bridging the gap to fully autonomous driving,” said Thuy Linh Pham, deputy CEO of VinFast, in a statement. “Autobrains captured our attention by applying unsupervised AI software, as opposed to traditional software that is based on manually labeled data, to make self-driving vehicles adaptive to unprecedented behaviors in real-time. We expect that Autobrains will actualize this ambitious goal into a reality in the near future.”


