
Sensing

We are developing a precise and reliable sensing system to serve as the 'eyes' of the robot, helping it navigate the narrow furrow and align with the dock.
We use a low-cost camera ($50) during the early growth season and a LiDAR sensor ($1500) during the late growth season to detect the vehicle's location relative to the furrow.

[Figure: relative location of the rover with respect to the furrow]

e is the lateral offset and φ is the heading offset of the rover with respect to the centerline of the furrow.
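As a minimal sketch of this geometry (not the system's actual implementation; the function name, frame convention, and inputs are assumptions), the lateral offset e and heading offset φ can be computed from two points detected on the furrow centerline, expressed in the rover's frame:

```python
import math

def furrow_offsets(p_near, p_far, rover=(0.0, 0.0), rover_heading=math.pi / 2):
    """Lateral offset e and heading offset phi of the rover relative to the
    furrow centerline through p_near and p_far. Rover frame assumed: x to
    the right, y forward, heading measured counter-clockwise from +x.
    Illustrative geometry only; names are hypothetical."""
    dx, dy = p_far[0] - p_near[0], p_far[1] - p_near[1]
    line_heading = math.atan2(dy, dx)
    # Signed perpendicular distance from the rover to the centerline
    # (positive when the centerline lies to the rover's right).
    e = ((p_near[0] - rover[0]) * dy
         - (p_near[1] - rover[1]) * dx) / math.hypot(dx, dy)
    # Heading offset: rover heading minus centerline direction.
    phi = rover_heading - line_heading
    return e, phi
```

For example, a centerline running straight ahead but 1 m to the rover's right gives e = 1 and φ = 0.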

Our solution outperforms most computer vision algorithms currently used for precision agriculture (PA). We have invented a dynamic crop-recognition threshold, which adaptively adjusts its value in response to environmental changes such as ambient light and crop size.
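To illustrate the idea of a per-frame adaptive threshold (this is a hedged sketch, not the published algorithm): one common approach is to compute an excess-green index and re-select the cut-off for every frame with Otsu's method, so the segmentation tracks changes in ambient light and crop size instead of relying on a hard-coded value.

```python
import numpy as np

def dynamic_crop_threshold(rgb):
    """Illustrative adaptive crop segmentation (not the authors' exact
    method): compute the excess-green index ExG = 2g - r - b on
    chromaticity-normalized channels, then choose the threshold per frame
    with Otsu's method. Returns a boolean crop mask."""
    img = rgb.astype(np.float64)
    s = img.sum(axis=2)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = (img[..., i] / s for i in range(3))
    exg = 2.0 * g - r - b  # higher values = greener pixels

    # Otsu's method: maximize between-class variance over a 256-bin histogram.
    hist, edges = np.histogram(exg, bins=256)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                 # class-0 probability up to each bin
    mu = np.cumsum(p * centers)       # cumulative first moment
    mu_t = mu[-1]                     # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    t = centers[np.nanargmax(sigma_b)]
    return exg > t
```

Because the threshold is re-derived from each frame's own ExG distribution, a uniformly darker or brighter scene shifts the cut-off with it, which is the essence of the adaptive-threshold idea described above.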

Our in-field test results show that our novel algorithms approach the accuracy of RTK-GPS in cross-track detection and exceed RTK-GPS in heading detection. To learn more, please read our paper published in an IEEE journal.

Our next step is to explore using a RealSense depth camera and R-CNN-based machine learning algorithms to process the images and videos.


Results

[Figure: in-field test results]