Software and Datasets

On this page, the Chair of Automatic Control Engineering offers datasets and software used within active research or published along with scientific work.


MATLAB

On October 1, 2011, TUM concluded a campus license agreement with MathWorks.

This means that MATLAB and Simulink licenses, toolboxes, and blocksets are offered free of charge to students and staff.

Framework agreement for MathWorks products: MATLAB, Simulink, toolboxes

The software and the licenses can be obtained here.


Surface Based Segmentation - Dataset and Software

Here we share and make publicly available both real and simulated 3D range data sets from the Chair of Automatic Control Engineering. To make storage and processing efficient, our data sets are provided in a binary format. Files have the extension *.pc (point cloud) and may contain an arbitrary number of frames, each of which contains an ASCII identifier, a time stamp, a view point, and the 3D point coordinates for one scanline.

To this end we provide a Matlab script to read the files. Save it in the same folder as the downloaded data sets, then type "help loadpcfile" in Matlab to learn how to use it. Note: It is not recommended to visualize point clouds in Matlab, as the 3D view slows to a crawl when drawing more than a couple of thousand line/point handles.

When working with the data sets, you should be aware of two things. First, the real data sets may have an incomplete last frame, i.e. the total number of points may be slightly less than the number of scanlines times the number of points per scanline. Second, even for scans recorded from a single view point, the points will usually be loaded with many distinct view points. This is because the sensor pose estimate (in our data sets extracted from the joint encoders and the forward kinematics) differs slightly in the last digits from frame to frame. You can remedy this by manually assigning one view point (for example the column mean of V) to all points in Matlab.

Download Publications

Download Dataset

 

All data sets were recorded with a Hokuyo UTM-30LX using only the front 180° of its field of view (frequency: 40 Hz, angular resolution: 0.25°, 720 points per scanline).

Name                             # Points    # Scanlines    Size

Seat & Armchair
seat_armchair_horizontal         297,360     413            6,991 kB
seat_armchair_vertical           297,360     413            6,991 kB
seat_armchair_rotational         110,880     154            2,608 kB

Seat, Armchair, Table
seat_armchair_table_horizontal   296,640     412            6,975 kB
seat_armchair_table_vertical     297,360     413            6,991 kB
seat_armchair_rotational         158,400     220            3,712 kB

Two Chairs
two_chairs_horizontal            198,720     276            4,671 kB
two_chairs_vertical              295,920     411            6,959 kB
two_chairs_rotational            177,840     247            4,176 kB

 

The software framework that we currently use for simulation and segmentation of 3D range data has many dependencies and has not been cleaned up for a public release. However, to facilitate performance comparisons, we provide here the code at the core of our segmentation algorithm. Although it has undergone (and still undergoes minor) modifications, the core clustering loop is the one used and mentioned in several publications. In an implementation with linked lists, we have found this routine to perform significantly faster than region growing with seed points. Note that the snippets below provide only the relevant excerpts of the full classes and routines. Our code uses Qt containers; however, they can just as well be implemented with equivalent STL containers.

Using a point data structure like:

 

 

struct SmartPoint {
    //! The point
    Point3F p;
    //! The normal
    Point3F n;
    //! The average Euclidean distance to the neighbors
    double ad;
    //! User pointer
    void *up;
};

 

and a linked list point cloud class such as

 

 

class SmartPointCloud : public QLinkedList<SmartPoint*> {

    //! Computes the normal vector of each point from its \a k nearest neighbors.
    void computeNormalVectors(int k = 20);

    //! Segments the points by grouping all points within distance \a rMax of
    //! each other whose normal vector angle difference is below \a aMax degrees.
    //! Removes clusters with fewer than \a nMin points.
    QList<SmartPointCloud*> segmentByNormalVectors(double aMax, double rMax, int nMin);

    //! Appends smart point cloud \a other to this one.
    SmartPointCloud& operator<<(const SmartPointCloud &other);

};

 

the clustering routine can be summarized as follows:

 

 

QList<SmartPointCloud*> SmartPointCloud::segmentByNormalVectors(double aMax, double rMax, int nMin) {
    if (aMax <= 0)
        return QList<SmartPointCloud*>();

    qDebug("[SmartPointCloud] Segmenting smart point cloud by normal vectors...");
    time.start();

    // Create a kd-tree if necessary, allocate space, prepare parameters
    if (!kdTree)
        kdTree = createKdTreeFromPoints();
    ANNpointArray points = kdTree->thePoints();
    const int k = MIN(50, count()-1);
    ANNidxArray indices = new ANNidx[k];
    ANNdistArray distances = new ANNdist[k];
    double aThresh = cos(aMax/180.0*M_PI);

    // Make sure normal vectors have been estimated
    if (!normalVectorsComputed)
        computeNormalVectors(k);

    // Create a temporary array of indices and clean the user pointers
    SmartPoint **pArray = new SmartPoint*[count()];
    int i = 0;
    for (iterator p = begin(); p != end(); p++) {
        pArray[i] = (*p);
        pArray[i]->up = NULL;
        i++;
    }

    int id = 0;
    for (iterator p = begin(); p != end(); p++) {
        // Current point
        SmartPoint *cp = (*p);

        // Get the nearest neighbors
        double r = MIN(rMax, cp->ad);
        kdTree->annkFRSearch(points[id], r*r, k, indices, distances);

        // Now cluster
        for (int j = 0; j < k; j++) {
            if (indices[j] == ANN_NULL_IDX)
                break;
            if (fabs(cp->n.dot(pArray[indices[j]]->n)) < aThresh)
                continue;
            // Neighbor point
            SmartPoint *np = pArray[indices[j]];
            if (cp->up) {
                if (np->up) {
                    // If the points belong to the same cluster, continue
                    if (cp->up == np->up)
                        continue;
                    // Else merge the clusters (append the smaller to the larger)
                    bool g = ((SmartPointCloud*)cp->up)->count() > ((SmartPointCloud*)np->up)->count();
                    SmartPointCloud *pc1 = g ? (SmartPointCloud*)cp->up : (SmartPointCloud*)np->up;
                    SmartPointCloud *pc2 = g ? (SmartPointCloud*)np->up : (SmartPointCloud*)cp->up;
                    for (iterator q = pc2->begin(); q != pc2->end(); q++)
                        (*q)->up = pc1;
                    (*pc1) << (*pc2);
                    delete pc2;
                } else {
                    // Neighbor point doesn't have a cluster, add it to this one
                    np->up = cp->up;
                    ((SmartPointCloud*)cp->up)->append(np);
                }
            } else if (np->up) {
                // Add this point to the neighbor's cluster
                cp->up = np->up;
                ((SmartPointCloud*)np->up)->append(cp);
            } else {
                // Neither point has a cluster -> create a new one
                SmartPointCloud *nc = new SmartPointCloud;
                cp->up = (void*)nc;
                nc->append(cp);
                np->up = (void*)nc;
                nc->append(np);
            }
        }
        id++;
    }

    // Gather the clusters and delete the ones with less than nMin points
    QList<SmartPointCloud*> clusters;
    for (iterator p = begin(); p != end(); p++) {
        SmartPointCloud *cluster = (SmartPointCloud*)(*p)->up;
        if (!cluster)
            continue;
        if (clusters.contains(cluster)) {
            p = --erase(p);
            continue;
        }
        if (cluster->count() < nMin) {
            for (iterator cp2 = cluster->begin(); cp2 != cluster->end(); cp2++)
                (*cp2)->up = NULL;
            delete cluster;
        } else {
            p = --erase(p);
            clusters.append(cluster);
        }
    }

    // The point array and kd-tree are no longer valid for this point cloud
    annDeallocPts(points);
    delete kdTree;
    kdTree = NULL;

    // Clean up and return
    delete[] indices;
    delete[] distances;
    delete[] pArray;

    qDebug("[SmartPointCloud] Done (%d ms). %d cluster(s).", time.elapsed(), clusters.count());
    return clusters;
}

 


Munich Corpus - Human-Human Dialogs when Asking for Directions (German)

Here we share and make publicly available a corpus of transcribed human-human dialogs in German, provided in PDF format. The corpus was created in order to derive linguistic heuristics from human-human communication and adapt them to human-robot dialog in the context of asking for directions. It is intended for research use and is released as free documentation under the terms of the GNU Free Documentation License (GFDL). You may copy, reproduce, and use the corpus for your own research. Scientific publications based on the corpus are requested to acknowledge the source:

The 48 asking-for-directions dialogs were recorded in summer 2009 in the city of Munich by asking passers-by for the way to three different goals on five different routes. In order not to influence the participants and to obtain natural dialogs, the participants were not informed that the conversations were being recorded. Exceptions are dialogs 4, 6, 20 and 34, in which the subjects noticed the recorder while describing the route. Afterwards, all participants agreed that the dialog transcriptions could be published in anonymized form. Accordingly, the dialog partners are marked "F" for the asker and "A" for the responder; when more than one passer-by responds, they are marked "A1, A2 …". Overlapping utterances are not identified within these transcriptions. The starting point and the designated goal location are indicated above each dialog. Additionally, the asker's initial walking orientation is given, as well as the mutual co-orientation of both conversation partners in the course of the dialog.

Download Dataset


Sampling based Path Planning

The software package PP is a simple sampling-based path-planning library written in C++. The library was created as a tool for solving path planning problems arising in the context of robotics applications. It is intended for both research and educational use and provides a framework that is extensible for a number of applications. The library is free software released under the terms of the GNU General Public License (GPLv3). In its current version the library offers the following advantages:

  • Simple: PP provides an intuitive and uncluttered API. Planning problems are quickly composed by specifying a robot, a world, and at least one planning query.
  • Flexible: PP lets you define any kind of rigid tree-structured kinematic linkage. It provides an easy-to-use XML-based interface for creating and defining robots and worlds.
  • Powerful: PP comes with a set of proven, efficient planners, such as PRMs and RDTs, to solve generic path planning problems.
  • Parallel: Most planners come with a parallel implementation, so that you can use your multi-core machines to their full potential.
  • Cross-platform: Because it uses only cross-platform compatible third-party libraries, PP runs on Linux and Windows out of the box (not tested on Mac).

Download PP

Data for the PLOS ONE Article: Towards Assessing the Human Trajectory Planning Horizon

The data provided here is supplementary material to a journal article under review with PLOS ONE: "Towards Assessing the Human Trajectory Planning Horizon".

The data was collected from 41 subjects who used the "Desktop Kinesthetic Feedback Device" (DeKiFeD). Subjects acted on the DeKiFeD with one arm and were asked to move a block-shaped cursor through a virtual environment. The environment changed randomly across 10 levels, with obstacles entering the field and blocking the direct path to the goal. Each higher level increased the complexity, meaning that more moving obstacles had to be avoided. During the experiment, the position, velocity, and gaze data of each subject were recorded. This data is provided here.

Download Dataset