Solutions for AI Spatial Perception
Fast-track your AI perception applications for unstructured and harsh environments on the edge.
De-risk your development and get your market-disruptive, AI-enabled product to market much faster.
Create and test your deeply embedded AI visual-spatial perception concept in weeks with Au-Zone’s all-in-one development framework and perception solutions. Instead of spending 12 months or more building from scratch, let Au-Zone’s development tools and embedded spatial perception software get you a trusted edge perception solution now.
Au-Zone technology solutions simplify even the most challenging AI perception applications with a spatial perception fusion engine that delivers the required accuracy in real time while meeting the power, size and cost constraints of embedded edge systems.
Train in the cloud and run at the edge
Au-Zone solutions for data-driven and precision agriculture, commercial construction and other industrial uses include:
Deep View AI-assisted enterprise tool suite for data collection, auto-annotation and AI model deployment to cost-effectively scale from small image and video datasets to datasets with more than a million images or many gigabytes of video.
EdgeFirst rugged form-factor hardware reference designs with industrial-grade, integrated camera and radar modules to accelerate and de-risk development
EdgeFirst high-fidelity sensor fusion inference engine to cost-effectively improve detection performance and safety in extremely harsh and poor-visibility conditions
EdgeFirst SDKs and porting kits to accelerate and de-risk development
EdgeFirst pre-optimized AI models and reference apps to accelerate AI model development for a range of visual-spatial perception use cases, including object detection, segmentation and classification
Deep View AI-Assisted Enterprise Tool Suite
One stop for all your dataset management—store, curate, auto-annotate datasets and train your AI models.
Import visual and radar data with effortless integration and time-saving auto-annotation.
AI-assisted labeling and annotation enables you to audit 4 times faster, save 80% on detection dataset generation costs, and save 95% on segmentation costs.
Enrich, refine and manage datasets iteratively.
Train and optimize your AI models for edge deployment, visualize results and optimize model convergence.
Deploy at the edge with no-code deployment and workflows ready to use with a range of Au-Zone perception development reference modules based on processors from the Raspberry Pi to the NXP i.MX 8 to the NVIDIA Jetson AGX.
No setup, installation or configuration on site. Tailor your tool seat costs and dataset annotation costs with Au-Zone’s software-as-a-service (SaaS) model.
EdgeFirst Rugged Form-Factor Hardware Reference Designs for AI Perception
Purpose-built, ready-to-use AI spatial perception hardware and modular reference designs use production-grade, industrialized hardware and software components for prototyping, field testing and production.
Gather real-world camera and radar data for AI model training and sensor fusion.
AI vision starter kits, ranging from high-performance NPU-based systems (NXP i.MX 8M Plus with optional NPU co-processor) to low-cost MCU-based systems (NXP i.MX RT Crossover MCU), accelerate the development of your proof-of-concept (POC) and rapidly enable low-volume trials.
Reference designs get your next-generation product into production and to market faster.
EdgeFirst High-Fidelity Sensor Fusion Inference Engine
Cutting-edge research shows that perception based on low-level data fusion significantly outperforms the object-level fusion approach used in incumbent automotive ADAS systems. In particular, low-level data fusion yields a higher probability of detection, fewer false alarms and shorter processing delays. Achieving these results requires an AI perception processing framework, AI models trained on low-level fused data (i.e., radar RAD data and camera RGB data) and a development kit. Most importantly, these AI models must be cost-effective to run in real time on edge or embedded processors.
Au-Zone brings the benefits of perception based on low-level fused data and reduces the complexity and cost to implement in these ways:
This field-proven, configurable, multi-purpose sensor fusion and decision engine can be re-trained and re-purposed for real-time embedded AI perception in any industry.
Low-level RAD (radar) and RGB (camera) image data are aligned and calibrated as one stream to preserve rich data for better object detection and decision-making.
Ideal for use in deeply embedded low-power devices in harsh environments, including exposure to water, dust, mud, snow and fog.
Use the same hardware reference design for AI model training as well as real-time decision-making for the highest performance, reliability and accuracy in field deployment.
The small footprint and cost-, power- and size-optimized hardware can be attached to existing equipment or used in standalone devices used in remote locations.
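To make the low-level fusion idea above concrete, here is a minimal sketch in NumPy of aligning a radar range-azimuth-Doppler (RAD) cube with an RGB frame and stacking them into one data stream before any per-sensor object detection. The sensor resolutions, the max-over-Doppler projection and the nearest-neighbour resampling are illustrative assumptions only; the actual calibration, alignment and fusion model are product- and sensor-specific.

```python
import numpy as np

# Hypothetical resolutions for illustration; real values come from the sensor pair.
H, W = 480, 640                      # camera frame size
RAD = np.random.rand(64, 128, 16)    # range x azimuth x Doppler bins

def rad_to_camera_plane(rad_cube, h, w):
    """Collapse the Doppler axis into an energy map and resample the
    range-azimuth map onto the camera pixel grid (nearest neighbour)."""
    ra_map = rad_cube.max(axis=2)                 # range x azimuth energy map
    rows = np.arange(h) * ra_map.shape[0] // h    # nearest source row per pixel
    cols = np.arange(w) * ra_map.shape[1] // w    # nearest source column per pixel
    return ra_map[np.ix_(rows, cols)]             # h x w radar channel

rgb = np.random.rand(H, W, 3)
radar_channel = rad_to_camera_plane(RAD, H, W)

# Low-level fusion: stack the radar channel alongside RGB so one model sees
# both modalities as a single aligned stream, rather than fusing per-sensor
# detections after the fact.
fused = np.concatenate([rgb, radar_channel[..., None]], axis=2)
print(fused.shape)  # (480, 640, 4)
```

The key property this sketch preserves is that the rich radar return data reaches the model intact, instead of being reduced to an object list first.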
EdgeFirst SDKs and Porting Kits
Our software products integrate natively into the NXP eIQ Toolkit development framework and can be used in your machine learning environment. Contact us about software development kits and porting guides for other embedded processors, AI NPU accelerator chips and platforms.
EdgeFirst Pre-Optimized AI Models and Reference Apps
Reference implementations for AI spatial perception tasks provide the building blocks to rapidly and robustly solve your AI perception training challenges. AI models are available for common use cases like these:
Dense detection, segmentation and localization at the leaf level, such as for weed, disease and insect identification
Agricultural row detection and navigation to reduce headlands and increase crop yields
Sparse detection, segmentation and localization, such as to identify plastic debris in the ocean or, at the leaf level, early detection of an insect infestation in a large field
Pedestrian detection, segmentation and ranging, such as to identify workers near construction equipment in operation
Head pose to monitor operator behavior
Reduce program and operational risk
If you are unsure whether you can solve a challenging AI computer vision and perception problem, does it make sense to spend six months working from scratch to find out?
With Au-Zone Deep View and EdgeFirst solutions, you can try out your idea with ready-to-use industrialized, rugged hardware and AI perception software. Get started today.