Running Carmen


This document is an aid to begin using the CARMEN robot sensing and navigation software, including (but not limited to) the following programs:

   * logger
     This program stores sensor and odometry data with time stamps into a log file.
   * playback
     This program plays back a log file and sends the stored messages to the other modules (a brief example follows this list).
   * map_editor
     This program allows for creating or editing maps for use by Carmen.
   * navigatorgui
     This program provides a graphic interface which shows the robot's position and destination on the pre-built map and allows setting of the current position and orientation, and selection of destinations.
   * param_edit
     This program enables the user to change parameters as the robot is running. It also makes it easy to save the changes to a ".ini" file.
   * param_daemon
     This program provides other programs with information about the robot being used and the area around it. This can include a pre-built map, sensors on the robot, and typical sensor error.
   * robotgui
     This program provides a simple graphic interface for the robot, allowing direct motion control and a display of current sensor information.
   * processcontrol
     This program controls the different processes, restarts them in case of a crash, etc.
   * simulator
     This program provides simulated data generation from a virtual robot. It requires a previously generated map.
   * base_services
     This program controls the movement of the robot and accepts inputs from the sensors. This program MUST run on the computer attached to the robot hardware.
   * localize
     This program uses the sensor information from base_services to find the robot position in a map provided by param_daemon.
   * navigator
     This program enables autonomous navigation.
   * vasco
     This program creates a map from sensor and odometry data stored in a log file.
   * vasco-tiny
     A command line scan-matcher based on vasco. It reads log files and outputs a (locally) corrected log file. 
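
As a quick illustration of the logger/playback pair (a sketch only; "mylog.log" is a placeholder filename, and the options accepted by your build may differ):

  cd carmen/bin ; ./logger mylog.log
  cd carmen/bin ; ./playback mylog.log

The first command records sensor and odometry messages into mylog.log; the second, run later, replays those messages to the other modules.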

The links above will go to other pages with more complete descriptions and instructions for using these programs.

The "central" program, IPC, enables communication between these other programs. Information on using and coding with IPC can be found at http://www-2.cs.cmu.edu/afs/cs/project/TCA/www/ipc/ipc.html. A simplistic explanation of IPC: various programs "publish" generated information which is then "subscibed to" by other programs. IPC keeps track of what is published and delivers it to the subscribers.

Table Of Contents

  1. Running a Simulated Robot
  2. Teleoperation of a Robot
  3. Building a Map
  4. Autonomous Navigation

The use of CARMEN will be described using four different scenarios: simulating a robot, tele-operating a robot, building a map, and way-point navigation of a robot in a known map. Since CARMEN is modular software, each scenario will involve opening several xterms and running different programs.

Parameters for all of the modules are read from the carmen.ini file. You will need to edit this file, adding your own robot and parameter values. The name of your robot, given in square brackets in the parameter file, will be passed as a command-line argument to the parameter server later in these instructions.
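
For example, a hypothetical carmen.ini fragment for a robot named "myrobot" might look like the following (the section name is what you will later pass with --robot; the parameter names and values shown are purely illustrative):

  [myrobot]
  # parameters specific to this robot; the entries below are examples only
  robot_length             0.52
  robot_width              0.48
  robot_frontlaser_offset  0.15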


Running a Simulated Robot

1. In window 1, start the IPC central server

  cd carmen/bin ; ./central

If you are running programs on multiple machines, then you must specify on the other machines which machine is running central. This is done with the command:

  setenv CENTRALHOST [hostname]

where [hostname] is the name of the machine on which central is running.
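
The setenv form above is for csh-style shells (csh/tcsh). If your shell is bash, the equivalent is:

  export CENTRALHOST=[hostname]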

Note: When using off-robot computers for displays, run the display programs on those computers rather than on the robot; this uses less network bandwidth than piping the X display across the network.

2. In window 2, start the parameter server

  cd carmen/bin ; ./param_daemon --robot pearl ../data/thickwean.map

The command above will start param_daemon such that it serves parameters for the CMU robot "Pearl" from the parameter file "carmen.ini" and the map file "thickwean.map". Other command-line options are:

   * --robot [robotname] directs the server to parameters specific to the robot, which should exist in the specified parameter file.
   * [mapfile.map] directs the server to a map file specific to the location. If you do not specify a mapfile, then no map will be served.
   * [paramfile.ini] specifies a parameter file. If no file is specified, then parameters from carmen.ini will be used. param_daemon will look for carmen.ini first in the current directory, then in the parent directory (../), and then in ../src/.
   * [-port portnum] specifies a port to be used for an ANT server.
   * [-noant] says not to open an ANT server.
   * [-useant] says to use an ANT server.
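
For example, an illustrative invocation that combines these options (the robot name, parameter file, and map file below are placeholders for your own):

  cd carmen/bin ; ./param_daemon myrobot.ini --robot myrobot ../data/mylab.map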

3. In window 3, start the robot simulator

  cd carmen/bin ; ./simulator

4. In window 4, start the navigatorgui program

  cd carmen/bin ; ./navigatorgui

This will open a graphic display. Place the simulated robot in the map by left-clicking the "Place Simulator" button, and then left-clicking once on the desired location on the map. Then move the cursor away from the blue dot that appears, in the direction you wish the robot to face. A blue line appears between the cursor (now two curved arrows) and the simulated robot, indicating the robot's heading. Left-click again to fix that heading.

5. In window 5, start the robotgui (formerly robotgraph) program

  cd carmen/bin ; ./robotgui

See the instructions for using robotgui below to teleoperate the simulated robot. You can also build maps and autonomously navigate the robot as described further below. Maps built in the simulator should look the same as the original map, unless the parameters used to create the new map (in either the paramfile or in the vasco program) differ from those used to create the original. Remember that if you are running programs on multiple machines, you must set the CENTRALHOST environment variable to point to the machine on which you are running central.


Teleoperation of a Robot

1. In window 1, start the IPC central server

  cd carmen/bin ; ./central

2. In window 2, start the parameter server

  cd carmen/bin ; ./param_daemon [paramfile.ini] --robot robotname

3. In window 3, start the base server that corresponds to your robot

  cd carmen/bin ; ./scout

4. In window 4, start the laser server

  cd carmen/bin ; ./laser

5. In window 5, start the robot server

  cd carmen/bin ; ./robot

6. In window 6, start the robotgui program

  cd carmen/bin ; ./robotgui

Note that if you have a Pioneer or Scout robot, you can run the base_services program instead of steps 3, 4, and 5. It does not yet work with other robot types.

  cd carmen/bin ; ./base_services

Here is a bit more detail on the programs to be run.

   * central
     Run this anywhere, but any program being run on a different computer than central needs an environment variable set before you run it: "setenv CENTRALHOST [hostname]". This program must be started first. If you stop this program (with ctrl-c, typically), all the other Carmen programs will quit.
   * param_daemon --robot robotname [paramfile.ini]
     The robotname option directs the server to parameters specific to the robot, which should exist in the file "carmen.ini". If you wish to use a parameter file other than carmen.ini, specify it with the paramfile.ini option.
   * base_services
     Run this program after central and param_daemon have been started. This program connects to the robot and to the various sensors on the robot, so it MUST run on the machine connected to that hardware.
          o Note: This program combines three modules. If the robot hardware is divided among multiple computers, then the modules can be used independently on different computers. This currently only works with the Scout robots from Nomadic Technologies and the Pioneer robots from ActivMedia.
          o scout - This module coordinates the motion of the Scout robot and monitors robot odometry.
          o pioneer - This module coordinates the motion of the Pioneer robot and monitors robot odometry. It is an alternative to scout.
          o laser - This module monitors the SICK LMS and PLS laser sensors.
          o robot - This module takes odometry and sensor readings and combines them for distribution to other programs.
   * robotgui -add_control on
     This opens a display which shows the robot as a circle or rectangle in the center, with a small line designating the front, along with current sensor information. Blue regions are obstacles or unknown space; white regions are clear. Red edges in the laser display indicate obstacles which are close enough to prevent forward motion. Red arcs indicate sonar-detected obstacles.
     To turn the robot, use the left mouse button. Clicking near the robot will turn it right or left. To move the robot, click on the robot and drag the mouse pointer in the direction (relative to the "forward" indicator on the screen) you wish the robot to go. The robot indicator will turn red when you click on it, indicating that it is ready to move. The further from the robot you drag the cursor, the faster the robot will go.
     Or, if you have properly installed your joystick under linux, you can drive the robot with the joystick.
     Alternatively, you can use keyboard commands to move the robot. The following keys will move the robot:
     U I O
     J K L
     M , .
     I moves the robot forward; U moves it forward while turning left; O moves it forward while turning right. J and L turn the robot in place. Some robot parameter files are written so that the robot will not move backwards, because some of our robots have no rear sensing. M and . will move the robot backwards while turning, or just turn, depending on the parameter file; , will move it backwards or not at all. Any other key, including K, stops the robot. Speed cannot be controlled with this method.


In this image, the sensor data displayed is from a SICK PLS laser scanner. The robot is represented by the grey circle, facing the direction indicated by the black line. Dark blue edges are perceived objects, and lighter blue areas are unknown. The red edges are objects which are close enough that obstacle avoidance routines will prevent forward motion.


Building a Map (using teleoperation)

To build a map, you can use either teleoperation or autonomous navigation. However, autonomous navigation requires that you start with a map, so this section describes how to do it with teleoperation. Another way to build a map is with the map editor program.

While the robot is moving, you will need to run the logger program.

Run steps 1 through 6 for teleoperating the robot, then:

7. In window 7, start the logger

  cd carmen/bin ; ./logger [filename].log

[filename] specifies a filename for the sensor log. If the file already exists, the program will ask whether to overwrite it. If an answer other than "Y" or "y" is given, then the logger will quit.

Drive the robot around, covering your environment, then stop the logger by typing CONTROL-C in the logger xterm. Before quitting the other programs (or after restarting them at a later time), run vasco on the logfile you created:

   cd carmen/bin ; ./vasco file.log

This program opens an interface which displays sensor data compiled with raw odometry data, as in this image:

To correct for odometry error, click the "scan match" button on the upper right. After some computation time (progress is indicated on the lower left), the display will change, as in this second image:

This data was generated moving a robot from a lab (center right) into a corridor, up and down the corridor some distance, then back to the lab. The odd "fuzz" in the center of the corridor is actually the legs of people walking by as the data was gathered.

To create the final map to be used with param_daemon, now click the "Make Evidence Grid" button on the upper right. A new display will open, allowing you to adjust parameters for creating the mapfile. At first, you should stick with the defaults.

When you click "Okay," that display will disappear, a new one will open, and the map will form as you watch.

Note that the data which showed the people walking through the image has not affected the final map. To save this as a map, click on the "File" menu and save as a ".map" file.

This map file can then be served by param_daemon to allow localization and autonomous navigation through the area you mapped.
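
For example, assuming the map was saved as mylab.map (a placeholder name), the parameter server for the navigation scenario below could be started with:

  cd carmen/bin ; ./param_daemon [paramfile.ini] --robot robotname mylab.map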

Autonomous Navigation

This begins as if you were teleoperating the robot, but with a change in the param_daemon options to add the map file, and with other programs added.

1. In window 1, start the IPC central server

  cd carmen/bin ; ./central

2. In window 2, start the parameter server

  cd carmen/bin ; ./param_daemon [paramfile.ini] --robot robotname mapfile.map

3. In window 3, start the base server that corresponds to your robot

  cd carmen/bin ; ./scout

4. In window 4, start the laser server

  cd carmen/bin ; ./laser

5. In window 5, start the robot server

  cd carmen/bin ; ./robot

6. In window 6, start the robotgui program

  cd carmen/bin ; ./robotgui

7. In window 7, run the localizer

  cd carmen/bin ; ./localize

8. In window 8, run the path planner

  cd carmen/bin ; ./navigator

9. In window 9, run the path planner graphics module

  cd carmen/bin ; ./navigatorgui

Here is more detail on the new programs.

   * localize
     This program takes sensor data from base_services and compares it to the map to localize the robot on that map.
   * navigator
     This program plans the path from the robot's current location to its destination and tells base_services how to move the robot.
   * navigatorgui
     This program provides a user interface for telling the robot where it is and where to go. To set the robot position on the map display, left-click the "Place Robot" button, then left-click the desired location on the map. The cursor will change into a pair of curved arrows, and a red dot appears to indicate the robot location. A line between the dot and the cursor indicates the robot's orientation. Left-click again to fix the robot orientation. If you are using a simulated robot, this will also place and orient the simulated robot. To separate the simulated robot from its perceived location, you must set the simulator location after setting the robot location.

You can now control the robot using the navigator. Left-click on the map to give the robot a goal. Select the "Go" button to send the robot from its current location to the selected goal. Select the "Stop" button (in the same location as the "Go" button) to make the robot stop at any time.


In this image, the robot's current perceived position is represented by the red circle, and the simulated position is represented by the blue circle. The robot's goal is represented by the yellow circle. The planned path is represented by the blue line. The planned path will change as the robot moves toward the goal and encounters obstacles. The scale of the image can be changed by moving the lower slide-bar.