README

Dependencies:

  • ROS Indigo (newly migrated): ardrone_autonomy, image_transport, cv_bridge
  • OpenCV 2.4, RailwayImageProcess

Railway Follower Application

Ref to: Páll Előd, Koppány Máthé, Levente Tamás, Lucian Bușoniu, "Railway Track Following with the AR.Drone Using Vanishing Point Detection", IEEE Int. Conf. on Automation, Quality and Testing, Robotics (AQTR), 2014.

This application uses an AR.Drone to autonomously fly above railway lines using only the on-board sensors of the drone.

How to use the railway pkg collection

Place the drone between the rail lines facing the desired direction (in real life or in the simulator). Next, start the detection node and then the flight controller node:

    $ rosrun RailDetect railDetect
    $ rosrun RailFlyControl2 railFly

To build the RailwayImageProcess shared library:

    $ g++ `pkg-config --cflags opencv` -Wall -g -fPIC -c RailwayImageProcess.c -o RailwayImageProcess.o
    $ g++ -shared -o RailwayImageProcess.so RailwayImageProcess.o `pkg-config --libs opencv`
    $ sudo cp RailwayImageProcess.so /usr/local/lib/libRailwayImageProcess.so

  • RAILWAY DETECTOR NODE: this node detects the desired direction in which the drone should fly. It implements both the image processing and the estimation.

  • RAILWAY FOLLOWER NODE: this node controls the drone's velocity using a PID controller, tunable from the launch file.

Corridor Follower Application

Ref to: Előd Páll, Levente Tamás, Lucian Bușoniu, "Vision-Based Quadcopter Navigation in Structured Environments", in Handling Uncertainty and Networked Structure in Robot Control, Springer, Studies in Systems, Decision and Control Series, L. Bușoniu, L. Tamás (editors), pp. 265-290, 2016.

This application uses an AR.Drone to autonomously fly through hallway-like environments using only the on-board sensors of the drone.

How to use the corridor pkg collection

Copy the camera_info folder into ~/.ros/ and launch the ardrone_autonomy driver with myArdroneDriver.launch; this is needed so that the rectified color image is published for corridor detection.
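
The launch file itself is not reproduced here, but a minimal sketch of what myArdroneDriver.launch might contain (the node names and the image_proc rectification step are assumptions, not taken from the repository) is:

```xml
<launch>
  <!-- AR.Drone driver; picks up the calibration copied into ~/.ros/camera_info -->
  <node name="ardrone_driver" pkg="ardrone_autonomy" type="ardrone_driver" output="screen" />
  <!-- Rectify the color camera stream for the corridor detector (assumed step) -->
  <node name="image_proc" pkg="image_proc" type="image_proc" ns="ardrone" />
</launch>
```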

First, place the drone on a landing pad, facing the direction of the corridor. Then start the nodes in the following order: corridor detector, flight controller, and data logger:

    $ roslaunch myArdroneDriver.launch
    $ rosrun Corridor corridorDetect
    $ rosrun CorridorFlyControl corridorFly
    $ rosrun DataLogger dataLog

  • CORRIDOR DETECTOR NODE: this node processes the images from the AR.Drone and estimates the position of the vanishing point (VP) with a linear Kalman filter. To upgrade the estimator to an Extended Kalman Filter, visit: https://sites.google.com/site/timecontroll/home/extended-kalman-filtering-with-opencv

  • FLIGHT CONTROLLER NODE: this node sends velocity commands to the AR.Drone based on the "fly_command" message published by the corridor package.

  • DATA LOGGER NODE: this node saves navigation data (time stamp, Vx, Vy, altitude, yaw rotation) to "nav.txt", and VP observation data (observed VP X & Y, estimated VP X & Y) to "imageProcess.txt".

Contact

Elod Pall

https://sites.google.com/site/timecontroll/