visual intelligence at the edge
create smart devices that sense the world the same way you do
The DeepView™ Machine Learning Toolkit and RunTime Inference Engine help developers quickly design, train, and deploy cutting-edge deep learning networks on a wide range of embedded MCUs, CPUs, and GPUs.
Bring Your Own Model (BYOM) and Bring Your Own Data (BYOD) development workflows let you quickly evaluate ML solutions against the challenges of your business.
The DeepViewRT Inference Engine lets you focus on the data science, with the confidence that when you're ready to deploy your models to the embedded target, they will run fully optimized.
The DeepView on Pi Release gives data scientists and embedded developers a vehicle to quickly and cost-effectively explore embedded machine learning concepts and Visual Intelligence for IoT devices, test new ML models, and build functional prototypes on the world's most accessible embedded hardware platform - for FREE.
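The DeepViewRT API itself is not documented on this page, so as an illustration only, here is a minimal sketch of the load → preprocess → infer → act pattern that such a prototype on a Pi typically follows. Every name below (the functions, the model file, the labels) is a hypothetical stand-in, not the DeepViewRT API:

```python
# Hypothetical sketch of an embedded-inference prototype loop:
# load a model, preprocess a camera frame, run inference, read the result.
# All names here are illustrative stand-ins, not DeepViewRT calls.

def load_model(path):
    # Stand-in for a runtime's model loader; here it just carries labels.
    return {"labels": ["person", "vehicle", "background"]}

def preprocess(frame):
    # Real pipelines resize and normalize camera frames; this sketch
    # only scales 8-bit pixel values into [0.0, 1.0].
    return [min(max(p / 255.0, 0.0), 1.0) for p in frame]

def infer(model, tensor):
    # Stand-in for the runtime's invoke step: produce per-label scores
    # from a trivial function of the input so the example is runnable,
    # then return the best label and its score (the argmax).
    scores = [sum(tensor) % 1.0, 0.5, 0.25]
    best = max(range(len(scores)), key=scores.__getitem__)
    return model["labels"][best], scores[best]

model = load_model("model.bin")       # hypothetical model file
frame = [0, 128, 255, 64]             # stand-in for camera pixel data
label, score = infer(model, preprocess(frame))
print(label, round(score, 2))
```

In a real BYOM workflow, the `load_model` and `infer` steps are where the toolkit's optimized runtime replaces these stubs; the surrounding preprocess/postprocess structure stays the same from prototype to deployment.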
The combination of easy-to-use ML tools, a highly optimized inference engine, and a proven hardware platform enables extremely rapid prototyping, customization, and commercial deployment.