Aicobotix demos QiCHECK-2 at WOF Expo

Slovak start-up Aicobotix launched an upgraded version of its QiCHECK Solutions recognition system at the recent WOF Expo in Bratislava. QiCHECK-2 uses a camera for object recognition, and the system learns on the spot from a set of good reference samples.

In this way, product conformity, assembly, type and quality can be checked by accurate comparison with the good samples. If a faulty sample is detected, the operator is alerted immediately.

The camera captures the scene, recognises it and assigns it to the appropriate category. The whole learning process is quick and can be handled by an ordinary operator, which is a valuable feature when production changes.

All recognition results are also available online via a standard web browser. Deployment requires no significant intervention in the existing line, and the production data is presented in an easy-to-understand format. In addition, the entire solution can be rented so that its features can be tested.

At WOF EXPO 2021, Aicobotix used QiCHECK-2 to demonstrate stock recognition. The system recognised three categories: ‘empty place’, ‘pallet in the right place’ and ‘pallet with sweets’. A camera placed above the test area captured the scene, and the system quickly interpreted the visual image of the virtual warehouse.
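The learn-from-good-samples approach described here resembles example-based classification: an operator shows a few labelled reference images per category, and each new camera frame is assigned to the closest-matching category. As a purely illustrative sketch (not Aicobotix's actual algorithm, and with made-up two-dimensional feature vectors standing in for whatever embedding the camera produces), a minimal nearest-centroid classifier could look like:

```python
# Illustrative nearest-centroid classifier: a hypothetical sketch of
# "learning from good samples", NOT QiCHECK-2's real implementation.
import math


class SampleClassifier:
    def __init__(self):
        self.centroids = {}  # category name -> mean feature vector

    def learn(self, category, samples):
        """Store the mean of a few 'good sample' feature vectors."""
        dim = len(samples[0])
        mean = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]
        self.centroids[category] = mean

    def classify(self, features):
        """Return the category whose centroid is closest to the input."""
        return min(self.centroids,
                   key=lambda c: math.dist(features, self.centroids[c]))


# An operator "teaches" the three demo categories from a handful of samples
clf = SampleClassifier()
clf.learn("empty place", [[0.1, 0.0], [0.0, 0.1]])
clf.learn("pallet in the right place", [[1.0, 0.2], [0.9, 0.1]])
clf.learn("pallet with sweets", [[1.0, 1.0], [0.9, 1.1]])

print(clf.classify([0.95, 0.15]))  # closest to "pallet in the right place"
```

Because teaching a new category is just a call to `learn`, this style of system can be retrained on the spot when the production line changes, which matches the quick-relearning behaviour the demo highlighted.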

Here again, the company demonstrated how quickly QiCHECK-2 could relearn what counted as good housekeeping when the desired behaviour was changed.

Used as a smart sensor, the camera forms a vision system that can be implemented extremely quickly, identifying changes in the scene with the support of machine learning. The system is suited to plants with conveyors and can inspect multiple parts at once. Evaluation takes only a moment, and no special knowledge is required to train it on-site.


The post Aicobotix demos QiCHECK-2 at WOF Expo appeared first on Logistics Business® Magazine.