Unsupervised Object Discovery via Interaction

An LLM-aware computer vision algorithm that relies on physically probing the built environment, equipping a quadruped unmanned ground vehicle (UGV) for unsupervised object detection.
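The core idea can be illustrated with a minimal, self-contained sketch (not the repo's actual implementation, which operates on ROS point clouds): after the robot probes the scene, points that moved between the before and after clouds are segmented as a candidate object. All names and thresholds below are illustrative.

```python
import numpy as np

def discover_moved_points(cloud_before, cloud_after, threshold=0.05):
    """Return points of `cloud_after` with no close neighbor in `cloud_before`,
    i.e. geometry that was displaced when the robot probed it."""
    # Pairwise distances between every after-point and every before-point.
    dists = np.linalg.norm(
        cloud_after[:, None, :] - cloud_before[None, :, :], axis=-1)
    # A point "moved" if its nearest before-neighbor is farther than threshold.
    moved_mask = dists.min(axis=1) > threshold
    return cloud_after[moved_mask]

# Synthetic scene: static background plus a small box the robot pushes 0.3 m in x.
rng = np.random.default_rng(0)
background = rng.random((200, 3))
box = np.array([[2.0, 0.0, 0.0], [2.1, 0.0, 0.0], [2.05, 0.1, 0.0]])
before = np.vstack([background, box])
after = np.vstack([background, box + np.array([0.3, 0.0, 0.0])])

moved = discover_moved_points(before, after)
print(moved.shape)  # -> (3, 3): the three box points form the candidate object
```

The background points are unchanged between the two clouds, so only the pushed box survives the displacement test.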

Description

To view final results:


Dependencies

This repo contains the forked versions of the following repos:

Specifically, the public branch of both. This code would not be possible without the amazing work of alonrot and his documentation.

Since both repos have to be in the same ROS workspace to move the robot, our team found it easier to combine them into a single repo.


Before running the experiment, make sure to:

  • copy the files from src/pi_files to the Raspberry Pi
  • copy the files from src/nano_files to the main Jetson Nano (192.168.123.13)

Running on Go1

  1. Connect to the Unitree Wi-Fi and SSH into the robot (password: 123)
ssh [email protected]
  2. Connect to the Raspberry Pi and launch the Relay Node
python camera1node.py
  3. On an external computer, launch the object and goal detection nodes (this assumes the camera used for AR tag tracking is already running)

Note: depending on the camera used for AR tag tracking, parameters in src/perception/launch/ar_track.launch might need to be adjusted
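For reference, assuming the launch file wraps the standard ar_track_alvar tracker (as is typical), the camera-specific arguments look roughly like the fragment below; the topic names and values are assumptions to adapt, not the repo's actual settings.

```xml
<!-- Illustrative fragment only: adjust these for your AR-tracking camera. -->
<launch>
  <arg name="marker_size"          default="5.0" />   <!-- printed tag size, cm -->
  <arg name="max_new_marker_error" default="0.08" />
  <arg name="max_track_error"      default="0.2" />
  <arg name="cam_image_topic"      default="/camera/color/image_raw" />
  <arg name="cam_info_topic"       default="/camera/color/camera_info" />
  <arg name="output_frame"         default="/camera_link" />
</launch>
```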

# Object detection
rosrun perception object_detector_pc.py

# Goal detection
roslaunch perception ar_track.launch

rosrun perception get_goal1_transform.py ar_tag_1

rosrun perception get_goal1_transform.py ar_tag_2

rosrun perception goal_detector.py

  4. Still on the external computer, run the main ROS node
rosrun plannedctrl main_node.py
