multi_ar_drone_controller

This repository contains all the files required to run the testbed. The testbed uses a master camera that is assumed to be at 192.168.200.102. Images are pulled from the camera and processed by applying color filters to detect the locations of the drones and the targets. A Matlab server processes the coordinates of the targets and determines the new location of each drone. Each drone is then commanded to move from its current location to the new one, first along the X-axis and then along the Y-axis. This project is based on the work in Krajník, Tomáš, et al., "AR-Drone as a Platform for Robotic Research and Education," Research and Education in Robotics - EUROBOT 2011, Springer Berlin Heidelberg, 2011, pp. 172-186, and on some OpenCV tutorials. It was extended to allow simultaneous deployment of multiple drones, and video decoding from the drone was made somewhat faster so that decoding video streams from multiple drones is feasible. Most of the older code was written by Tom Krajník.
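
The X-then-Y motion scheme can be illustrated with a short, self-contained sketch. Everything here (the Pose struct, the sendTilt stub, the proportional gain, and the simulated feedback) is an illustrative assumption, not the repository's actual code:

```cpp
// Minimal sketch of the "move along X, then along Y" idea described
// above. The drone interface is a hypothetical stand-in; in the real
// testbed, position feedback comes from the master camera.
#include <cmath>
#include <cstdio>

struct Pose { double x, y; };

// Hypothetical low-level command: tilt the drone for one control step.
void sendTilt(double pitch, double roll) {
    std::printf("pitch=%.2f roll=%.2f\n", pitch, roll);
}

// Close the X error first, then the Y error, using a simple
// proportional controller on each axis in turn.
void goTo(Pose current, Pose goal) {
    const double kP = 0.5, tol = 0.05;
    while (std::fabs(goal.x - current.x) > tol) {   // X-axis phase
        double u = kP * (goal.x - current.x);
        sendTilt(u, 0.0);
        current.x += u;   // stand-in for camera position feedback
    }
    while (std::fabs(goal.y - current.y) > tol) {   // Y-axis phase
        double u = kP * (goal.y - current.y);
        sendTilt(0.0, u);
        current.y += u;
    }
}

int main() {
    goTo({0.0, 0.0}, {1.0, 2.0});   // fly to (1, 2) one axis at a time
}
```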

AR Drone Summary:

To fully control a single drone, three threads are required. The first thread (encapsulated by ATCmd) sends AT commands to the drone to control it. AT commands are generally responsible for controlling the drone's motion (e.g. taking off, landing, and changing its angular velocities); they also allow you to update some of the drone's parameters. If you plan to edit this code significantly, you should read more about AT commands in the AR Drone SDK Developer Guide. The second thread (encapsulated by NavdataC) is responsible for querying the state of the drone and its sensors to fill the navdata_unpacked structure; more details are available in the developer guide and the SDK documentation. The third thread (encapsulated by CImageClient) is responsible for getting, decoding, and recording images from the drone's cameras.

All three threads are wrapped in the CHeli class. An instance of CHeli represents a physical drone, so one instance is needed for each drone we create. The CHeli class has functions such as CHeli::takeoff(), CHeli::land(), and CHeli::setAngles() to control the drone. However, the control algorithm is implemented inside CoordinateController, and it is the only instance the main class needs to see to represent each drone. This separation allows the control algorithm to be updated without touching the lower-level classes representing the drone, and it keeps the drone classes independent of any parameters required by the control algorithm.
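
To make this structure concrete, here is a minimal sketch of how a main program might wire one CoordinateController per drone. CHeli and CoordinateController are real classes in this project, but the stub definitions, constructors, the moveTo() method, and the IP addresses below are made-up placeholders so the sketch compiles on its own; the real signatures differ:

```cpp
// Illustrative wiring only: the stub classes below stand in for the
// project's real CHeli and CoordinateController.
#include <cstdio>
#include <memory>
#include <string>
#include <vector>

// Stand-in for CHeli: one instance per physical drone, internally
// running the ATCmd, NavdataC, and CImageClient threads.
struct CHeli {
    explicit CHeli(std::string ip) { std::printf("connecting to %s\n", ip.c_str()); }
    void takeoff() {}
    void land() {}
    void setAngles(double /*pitch*/, double /*roll*/, double /*yaw*/) {}
};

// Stand-in for CoordinateController: wraps the control algorithm, so
// main never touches CHeli (or the threads underneath) directly.
struct CoordinateController {
    explicit CoordinateController(std::shared_ptr<CHeli> h) : heli(std::move(h)) {}
    void moveTo(double x, double y) {
        heli->takeoff();
        std::printf("moving to (%.1f, %.1f)\n", x, y);  // X phase, then Y phase
        heli->land();
    }
    std::shared_ptr<CHeli> heli;
};

int main() {
    // One drone per IP; main only ever sees the controllers.
    std::vector<std::string> ips = {"192.168.200.10", "192.168.200.11"};
    std::vector<CoordinateController> controllers;
    for (const auto& ip : ips)
        controllers.emplace_back(std::make_shared<CHeli>(ip));
    for (auto& c : controllers) c.moveTo(1.0, 2.0);
}
```

Because the control algorithm lives entirely in CoordinateController, it can be replaced without changing CHeli or the three threads beneath it.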

For a summary of the computer vision code, refer to calibrator2.cpp.

Pay careful attention to the following things while deploying the testbed:

- The drone must always be set at 90 degrees with respect to the view of the master camera.
- Make sure that the starting location of the drone is not close to the border of the master camera's view, so that it does not leave the view when it lifts off (remember that the camera's field of view is cone-shaped).
- Make sure to supply the correct color tags and IPs of the drones to the main file found in the "main" directory.
- At the beginning of each experiment, you will have to log on to each drone and execute a script that makes it drop out of Access Point mode and join your local WiFi network.
- At the beginning of each experiment, you will have to run the calibrator by executing the compile.sh script in the "Calibrator" directory. The calibrator program lets you verify that the master camera has the right view, and lets you adjust the color filters so you get the best colors for your deployment (see the sketch after this list for what such a filter does).
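
To illustrate what one of these color filters does, here is a minimal OpenCV sketch. The HSV thresholds, the input file name, and the centroid step are arbitrary examples, not values from this repository; the actual ranges are tuned interactively with the Calibrator program:

```cpp
// Example of HSV color filtering of the kind the calibrator tunes.
// All threshold values here are arbitrary; the real ranges are set
// interactively for each deployment.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    cv::Mat frame = cv::imread("frame.png");   // a master-camera frame
    if (frame.empty()) return 1;

    cv::Mat hsv, mask;
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    // Keep only pixels inside the tag's HSV range (example: a red-ish tag).
    cv::inRange(hsv, cv::Scalar(0, 120, 80), cv::Scalar(10, 255, 255), mask);

    // The centroid of the surviving pixels gives the tag's image position.
    cv::Moments m = cv::moments(mask, /*binaryImage=*/true);
    if (m.m00 > 0)
        std::printf("tag at (%.1f, %.1f)\n", m.m10 / m.m00, m.m01 / m.m00);
    return 0;
}
```

One such filter per color tag yields the pixel coordinates of each drone and target in the master camera's view, which is what the testbed passes on to the Matlab server.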