What Is The ACCURACY OF DRAGONFLY?

From Wiki-AUER


Dragonfly is Onit’s cutting-edge computer vision indoor localization technology based on visual SLAM. It provides accurate indoor position and orientation to forklifts, automated guided vehicles (AGVs), autonomous mobile robots (AMRs), drones, and other moving vehicles and assets. Dragonfly enables RTLS solutions for analytics, productivity, and safety in GPS-denied environments such as warehouses, production plants, and factories. Dragonfly delivers the X-Y-Z coordinates and 3D orientation of any moving device with centimeter-level accuracy by analyzing, in real time, the video stream coming from an ordinary wide-angle camera connected to a small computing unit. Dragonfly represents the state of the art in indoor localization technologies for areas where GPS/GNSS cannot be used, and it is highly competitive compared to other indoor localization technologies based on LiDAR, Ultra-Wideband (UWB), Wi-Fi, or Bluetooth RSS. HOW DOES IT WORK? During the system setup phase, the wide-angle camera sends the video feed of its surroundings to the computing unit.
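The setup phase described above can be sketched in a few lines of Python. This is only an illustrative simplification under stated assumptions: Dragonfly's actual mapping pipeline is not public, and the names (`extract_features`, `build_map`) and the synthetic descriptors are hypothetical stand-ins for a real detector/descriptor such as ORB or SIFT.

```python
import numpy as np

D = 32  # descriptor length (e.g. ORB produces 32-byte binary descriptors)

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a real feature detector/descriptor (ORB, SIFT, ...).
    Returns an (N, D) array of descriptors for one video frame."""
    rng = np.random.default_rng(int(frame.sum()) % 2**32)
    return rng.random((50, D))

def build_map(frames: list[np.ndarray]) -> np.ndarray:
    """Setup phase: accumulate descriptors from every frame into one map.
    A real SLAM system would also triangulate a 3D point per feature and
    geo-reference the map (e.g. by aligning it to a DWG floor plan)."""
    return np.vstack([extract_features(f) for f in frames])

# Three dummy 8x8 grayscale frames in place of a live camera feed.
frames = [np.full((8, 8), i, dtype=np.uint8) for i in range(3)]
feature_map = build_map(frames)
print(feature_map.shape)  # (150, 32): 3 frames x 50 descriptors each
```

In a real deployment, each stored descriptor would carry an associated 3D point so the map can later be used for camera pose estimation.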



The computing unit extracts the features of the environment in each frame and creates a 3D map of the surroundings (which is geo-referenced using a DWG file of the site). During use in production, the wide-angle camera sends the real-time video feed of its surroundings to the computing unit. The computing unit extracts the features of the environment in each frame and compares them with those in the previously created 3D map. This process allows Dragonfly to calculate, at more than 30 Hz, the X-Y-Z position and orientation of the camera in 3D space (and thus of the mobile asset on which it is mounted). Dragonfly is an accurate indoor location system based on computer vision. The location is computed in real time using just an onboard camera and a computing unit mounted on the machine to be tracked, thanks to our computer vision algorithm. Computer vision, odometry, and artificial intelligence are combined into an accurate system that delivers a precise location for multiple applications.
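The matching step in the production phase can be sketched as follows. This is a minimal illustration, not Dragonfly's implementation: the map data is synthetic, the L2 nearest-neighbor matcher and the `max_dist` threshold are assumptions, and a real system would feed the resulting 2D-3D correspondences into a PnP solver with RANSAC to recover the full 6-DoF pose.

```python
import numpy as np

rng = np.random.default_rng(0)
map_desc = rng.random((200, 32))          # descriptors stored in the 3D map
map_points = rng.random((200, 3)) * 10.0  # corresponding X-Y-Z map points

def match_to_map(frame_desc: np.ndarray, max_dist: float = 0.9) -> np.ndarray:
    """For each live-frame descriptor, find the nearest map descriptor (L2)
    and keep only confident matches. Returns indices into the map."""
    dists = np.linalg.norm(frame_desc[:, None, :] - map_desc[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)
    good = dists[np.arange(len(frame_desc)), nearest] < max_dist
    return nearest[good]

# Simulate a live frame that re-observes 40 known features with slight noise.
frame_desc = map_desc[:40] + rng.normal(0, 0.01, (40, 32))
matched = match_to_map(frame_desc)
print(len(matched), "2D-3D correspondences available for pose estimation")
```

Each matched index pairs an observed feature with a known 3D map point; with enough such pairs, the camera pose (and thus the vehicle pose) can be solved every frame, which is how a 30+ Hz update rate is achievable.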



It is an excellent solution for precise indoor tracking of forklifts, AGVs, AMRs, robots, and drones (in 3D space). Dragonfly is far more competitive than LiDAR, UWB, and other radio-signal-based technologies, for which an ad-hoc infrastructure must be designed, set up, calibrated, and maintained for each specific venue. No receivers, no RFID tags, no antennas, no nodes, no magnetic stripes: nothing needs to be deployed throughout the venue. You need just a camera and a computing unit onboard your mobile vehicles. No technical skills are required, no complicated instructions, and no need for the error-prone and time-consuming calibration of an ad-hoc UWB infrastructure. SLAM technology is also far more robust to environmental changes than LiDAR, which struggles to maintain accuracy in environments where obstacles change over time. Dragonfly cameras are easier to calibrate and more robust to changes in the environment. Dragonfly's distributed architecture makes the solution reliable by eliminating the mandatory server that would otherwise be a single point of failure (SPOF).



This also means that Dragonfly can scale with the size and growth of your fleet of moving vehicles. Dragonfly can work completely offline on a computing unit onboard a forklift, AGV, AMR, drone, or robot, or on an on-premise server. Dragonfly allows you to optimize your operations, increasing the productivity and effectiveness of the tracked devices. In addition, its competitive price makes the ROI higher than any other technology currently on the market. Improve operations thanks to real-time visibility into the actual usage and paths of your mobile vehicles (such as forklifts), avoiding under- and over-utilization and maximizing the efficiency of the fleet. Know the location of every moving asset in real time to prevent accidents between human-guided mobile vehicles (such as forklifts) inside warehouses and manufacturing facilities, enabling V2V (vehicle-to-vehicle) and V2P (vehicle-to-pedestrian) collision-avoidance applications. Speed up productivity by tracking the location of each moving asset to indirectly know the position of every handling unit on the floor, racks, and shelves.
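The V2V/V2P collision-avoidance idea above can be illustrated with a small sketch. This is a hedged example under assumptions: the asset names, the position feed, and the 3 m safety threshold are all hypothetical; a real system would consume live RTLS positions and account for velocity and heading, not just distance.

```python
import math

SAFE_DISTANCE_M = 3.0  # assumed alert threshold; real systems tune this

def proximity_alerts(positions: dict[str, tuple[float, float, float]]):
    """Return (asset_a, asset_b, distance) for every pair of tracked
    assets closer than SAFE_DISTANCE_M, using their X-Y-Z positions."""
    ids = sorted(positions)
    alerts = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            dist = math.dist(positions[a], positions[b])
            if dist < SAFE_DISTANCE_M:
                alerts.append((a, b, round(dist, 2)))
    return alerts

# Hypothetical snapshot of positions reported by the localization system.
positions = {
    "forklift-1": (0.0, 0.0, 0.0),
    "forklift-2": (2.0, 1.0, 0.0),    # ~2.24 m from forklift-1 -> alert
    "pedestrian-7": (10.0, 4.0, 0.0), # far from both -> no alert
}
print(proximity_alerts(positions))  # [('forklift-1', 'forklift-2', 2.24)]
```

Because every vehicle computes its own pose onboard, such checks can run either locally (V2V broadcasts) or on an on-premise server aggregating the fleet's positions.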