Toshiba Information Systems Official site

High-precision self-location estimation SLAM solution

Self-localization middleware for mobile platforms. It enables stable self-localization even in factories and other indoor environments!

Are there any challenges related to self-localization and environmental recognition (SLAM) at manufacturing sites or in autonomous systems?

● Self-location cannot be determined in factories and other indoor environments where GPS is unavailable
● Localization accuracy degrades with environmental changes (lighting, layout changes)
● High-precision self-localization is computationally intensive and cannot run in real time
● Integrating multiple sensors such as cameras, IMUs, and odometry is difficult

▼ Leave it to our SLAM solution!

――― Features ――――――――――――――――――――――
◆ Visual SLAM that runs on low resources
◆ Improved accuracy and robustness through sensor fusion
◆ Recognition of the surrounding environment via surrounding-map generation
―――――――――――――――――――――――――――

SLAM is a technology that lets mobile platforms (AGVs and autonomous systems) map their surroundings while moving and determine their own position within that map. It observes features in the environment, estimates self-location from those observations, and builds the environmental map at the same time. We primarily develop camera-based self-localization/SLAM technology.
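The core idea above — estimating one's own position from observed environmental features — can be illustrated with a deliberately small sketch. Here the landmark positions are given in advance and position is solved by trilateration from measured ranges; real SLAM builds the landmark map itself while localizing, and this toy setup is purely illustrative, not the product's algorithm.

```python
import math

# Illustrative only: estimate a 2D position from ranges to three
# landmarks whose map coordinates are already known. SLAM additionally
# estimates the landmark map itself; that part is omitted here.

def trilaterate(landmarks, distances):
    """Solve for (x, y) from three landmark positions and range readings."""
    (x1, y1), (x2, y2), (x3, y3) = landmarks
    r1, r2, r3 = distances
    # Subtracting the circle equations pairwise yields a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

landmarks = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
distances = [math.dist(true_pos, lm) for lm in landmarks]
print(trilaterate(landmarks, distances))  # → (3.0, 4.0)
```

With noisy real sensors the system is solved in a least-squares sense over many landmarks rather than exactly over three.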

Related Link - https://www.tjsys.co.jp/embedded/esb-slam/index_j.…

Basic information

In AGVs and autonomous systems, route planning requires an accurate grasp of both the destination and the vehicle's own position. We customize Toshiba's proprietary self-localization technology for customer systems and optimize it for real-time operation.

◆ Visual SLAM that runs on low resources
This method estimates relative self-position from changes in the distances to multiple feature points visible to the camera. The algorithm is designed for real-time processing and runs fast with a small memory footprint.

◆ Improved accuracy and robustness through sensor fusion
Visual SLAM or an IMU alone faces challenges such as error accumulation over long travel distances and sensitivity to environmental changes. By combining multiple sensors, we address these issues and achieve high-precision, stable self-localization.

◆ Environmental recognition through surrounding-map generation
While estimating self-position, a spatial map of the surroundings can be generated in real time from environmental information. The map can be used for obstacle avoidance and error correction, making the technology applicable to harvesting robots and warehouse picking robots.
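The sensor-fusion point above can be illustrated with a complementary filter, one common way to blend an IMU with a drift-free reference. The page does not state which fusion method the product actually uses, so treat this as a sketch under that assumption: a gyro integrates angular rate smoothly but accumulates drift, while a visual heading estimate is noisier but drift-free, and blending the two keeps the strengths of both.

```python
# Sketch of heading fusion via a complementary filter (illustrative;
# not necessarily the product's fusion method). All numbers are made up.

def fuse_heading(prev_est, gyro_rate, visual_heading, dt, alpha=0.98):
    """Blend an IMU-predicted heading with a visual absolute heading."""
    predicted = prev_est + gyro_rate * dt   # short-term: IMU integration
    return alpha * predicted + (1 - alpha) * visual_heading

# Simulate a stationary robot: the gyro has a constant bias, while the
# camera correctly reports a heading of 0 rad.
est_imu_only, est_fused = 0.0, 0.0
bias = 0.01  # rad/s gyro bias (illustrative)
for _ in range(1000):
    est_imu_only += bias * 0.1             # pure integration drifts
    est_fused = fuse_heading(est_fused, bias, 0.0, dt=0.1)
print(est_imu_only)  # drifts to ~1.0 rad after 100 s
print(est_fused)     # stays bounded near a small steady-state offset
```

Production systems typically use a Kalman-filter-style fusion instead, which weights each sensor by its estimated uncertainty rather than by a fixed alpha.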

Price information

For more details, please contact us.

Delivery Time

Applications / Example use cases

■ Use cases in factories and warehouses
Using SLAM on automated guided vehicles (AGVs) operating in factories and warehouses improves their efficiency, safety, and autonomy.
- Applicable to robots that pick products in warehouses
- Adapts to newly appearing obstacles caused by moving objects, and to environmental changes such as natural light and lighting
- Can incorporate multiple sensors (e.g., GNSS)

■ Use cases in farms and orchards
Autonomous tractors operating outdoors face several challenges. By optimizing the sensor suite and improving the algorithms, the safety and efficiency of autonomous tractors outdoors are enhanced.
- When GPS signals are unstable outdoors, SLAM identifies the tractor's position in real time
- Combines LiDAR, other sensors, and cameras to build environmental maps and localize
- Can avoid obstacles such as trees and rocks, and applies to harvesting robots that recognize fruit and pick it at the right time
- Adapts to terrain variation such as furrows and slopes
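The obstacle-avoidance use cases above rely on the surrounding-map generation described earlier. A minimal sketch of that idea, assuming a simple 2D occupancy grid (the product's actual map representation is not specified on this page): each range reading marks the cells along its beam as free and the cell at the hit point as occupied.

```python
import math

# Hypothetical occupancy-grid update for one range beam. Cell size,
# labels, and the dict-based grid are illustrative choices only.

def update_grid(grid, pose, bearing, rng, cell=1.0):
    """Mark cells along one beam: free along the ray, occupied at the hit."""
    x, y, heading = pose
    steps = int(rng / cell)
    for i in range(1, steps + 1):
        px = x + math.cos(heading + bearing) * i * cell
        py = y + math.sin(heading + bearing) * i * cell
        key = (round(px / cell), round(py / cell))
        # Never downgrade a cell already marked occupied.
        grid[key] = "occupied" if i == steps else grid.get(key, "free")
    return grid

grid = {}
pose = (0.0, 0.0, 0.0)  # x, y, heading
update_grid(grid, pose, bearing=0.0, rng=5.0)  # obstacle 5 m ahead
print(grid[(5, 0)])  # → occupied
print(grid[(2, 0)])  # → free
```

A planner can then route the robot through free cells and around occupied ones, which is how a generated map feeds obstacle avoidance.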



Distributors

We provide optimal solutions that contribute to our customers' businesses, drawing on the extensive experience and track record we have accumulated over many years and on our strong technical capabilities in the fields of embedded systems and LSI design.