THE FUTURE IS HERE

Algolux announces Ion, a development platform for autonomous vision systems

Algolux, founded in 2015, isn’t exactly a household name in the already crowded world of automotive computer vision. But the Quebec-based startup has generated interest among investors, raising $13.4 million to date, including a $10 million Series A led by General Motors Ventures last May. Not bad for a company that has remained a virtual unknown until now.

Today, Algolux is unveiling Ion, a platform that gives companies a set of tools and an embedded software stack to help them build their own perception systems. It’s essentially a plug-and-play solution, a departure from today’s common approach, in which companies are confined to siloed systems that often don’t integrate easily with one another.

Algolux’s system brings the company’s machine learning and computer vision technologies to users looking to build an end-to-end solution, one that incorporates regulations from governing bodies and safety features designed to help systems operate in tricky environments.

The company says Ion can be used to build more traditional systems or “radical new designs,” and works with any sensor type, processor type and perception task. Ion relies on Eos, a deep neural network, and Atlas, a set of modules designed for camera tuning, giving developers a mix-and-match approach based on their individual needs.

In a letter to TechCrunch, VP Dave Tokic notes that the key differentiator between the company and its competition is a kind of brand agnosticism that lets companies use different products for different needs while keeping costs down.

“Our Ion Platform consists of tools (Atlas) and embedded software stacks (Eos) to uniquely provide an end-to-end approach to teams building perception systems,” he tells TechCrunch. “This allows the team to optimize and deep learn across both sensing and perception (even up to planning and control) for significantly better performance and to break down today’s design process silos. This capability is applicable to any sensor type, processor type, and perception task.”