<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<title>Robotika.cz</title>
<link rel='self' type='application/atom+xml' href='http://localhost/feed/en'/>
<link rel='alternate' type='text/html' href='http://localhost/en'/>
<id>http://localhost/en</id>
<updated>2017-03-08T00:00:00Z</updated>
<entry>
	<title>OSGAR</title>
	<link rel='alternate' href="http://localhost/robots/osgar/en"/>
	<id>http://localhost/robots/osgar/en</id>
	<updated>2014-07-28T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> OSGAR is a long-term project which will probably incorporate more than one
robotic platform in the future. The project started in 2014 when we decided to
modify a school garden tractor &lt;b>John Deere X300R&lt;/b> into an autonomous robot. The
project is carried out in cooperation with the Czech University of Life Sciences
Prague. &lt;b>Blog update:&lt;/b> 16/5 &amp;mdash; &lt;a href='/robots/osgar/en#170516'>Robotem rovně and garden tests&lt;/a>
 </summary>
	<content type='html'> 
&lt;h1>John Deere X300R&lt;/h1>

&lt;h2>Team&lt;/h2>

&lt;ul>
&lt;li>Milan Kroulík &amp;mdash; project leader (&lt;a href='http://www.tf.czu.cz/cs/' class='external'>CZU/TF&lt;/a>)&lt;/li>

&lt;li>Stanislav Petrásek &amp;mdash; mechanics (&lt;a href='http://www.tf.czu.cz/cs/' class='external'>CZU/TF&lt;/a>)&lt;/li>

&lt;li>Tomáš Roubíček &amp;mdash; electronics (&lt;a href='http://www.robsys.cz' class='external'>RobSys&lt;/a>)&lt;/li>

&lt;li>Jakub Lev &amp;mdash; software/testing (&lt;a href='http://www.tf.czu.cz/cs/' class='external'>CZU/TF&lt;/a>)&lt;/li>

&lt;li>Martin Dlouhý &amp;mdash; software (&lt;a href='http://robotika.cz' class='external'>robotika.cz&lt;/a>)&lt;/li>
&lt;/ul>

&lt;h2>2014&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/osgar/powered-by-eduro-maxi.jpg'>&lt;img src='/robots/osgar/powered-by-eduro-maxi_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The idea is relatively old &amp;mdash; our garden needs mowing every second week and I
would not mind some „helper”. Yes, in the meantime autonomous lawn mowers have made
serious advances, but ours is not the only garden and there are much larger grass
areas which require regular maintenance (castle gardens for example &lt;span class='wink'>&lt;/span>).&lt;/div>

&lt;div class='p'>In 2014 we also started a school project in which a self-driving tractor should cut
grass and regularly monitor trees in an orchard. You could probably use GPS for
basic navigation, but there are many obstacles (trees) which have to be
avoided. You need sensors for that task, and once you have them you can
use them for navigation as well.&lt;/div>

&lt;div class='p'>This „orchard task” is similar to navigation in the rows of a maize field in the
&lt;a href='/competitions/fieldrobot/en'>Field Robot Event&lt;/a> contest. We already have some
experience there (and prizes) with our robot &lt;a href='/robots/eduro/en'>Eduro&lt;/a>. The
very first experiment was to take Eduro, mount it on top of the tractor
and manually drive it through the orchard to collect some data &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="160514">&lt;/a>&lt;/div>

&lt;h2>14th May, 2016 &amp;mdash; Contest „Robot, go straight!”&lt;/h2>

&lt;div class='p'>The Czech season of outdoor competitions for autonomous robots begins in May in
Písek. It is called &lt;a href='/competitions/robotem-rovne/2016/en'>&lt;span class='cs'>Robot, go straight!&lt;/span>&lt;/a>
(in Czech „Robotem rovně”) and autonomous robots should &lt;i>only&lt;/i> navigate
straight on a paved road in the central park. The track is 314m long and only small
electric vehicles are allowed.&lt;/div>

&lt;div class='p'>In 2015 we asked organizers if it would be possible to make an exception so
that we could participate with our modified garden tractor John Deere X300R.
The permission was granted but we did not use it &amp;mdash; the tractor was not
ready yet.&lt;/div>

&lt;div class='p'>In 2016 it looked much more promising because the school tractor was supposed
to be at an exhibition in Brno one month before the contest. But there were
problems with hydraulic components and this exhibition program had to be
canceled.&lt;/div>

&lt;div class='p'>But we wanted to push the project forward! We joked with Standa and Tomáš that
we could short the safety switch under the seat and use it as an emergency stop, then
just put a brick on the gas pedal and &lt;i>version 0&lt;/i> would be ready. &lt;span class='smile'>&lt;/span> Well, what I
did not know was that this „joke” would come true and that we would actually need
it for the homologation and the first run:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2016/brick-on-gas.jpg'>&lt;img src='/competitions/robotem-rovne/2016/brick-on-gas_t.jpg' alt='homologation' title='homologation' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2016/brick-on-gas.jpg'>homologation&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2016/first-run.jpg'>&lt;img src='/competitions/robotem-rovne/2016/first-run_t.jpg' alt='1st run' title='1st run' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2016/first-run.jpg'>1st run&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2016/jd-4th-run-no-gas.jpg'>&lt;img src='/competitions/robotem-rovne/2016/jd-4th-run-no-gas_t.jpg' alt='4th run: out of gas' title='4th run: out of gas' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2016/jd-4th-run-no-gas.jpg'>4th run: out of gas&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The contest is great. It runs all day (there are four runs), so we had enough
time to keep working on the robot. In the second run the robot was already
driven by Python code and the pedal was controlled via CAN bus. We even collected
several points, and the „good bye” joke came in the 4th run, when the tractor ran
out of gas &lt;span class='smile'>&lt;/span> [we are so used to regularly checking the battery status, but not
the fuel &amp;mdash; a new experience].&lt;/div>
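
&lt;div class='p'>The pedal command itself is just one CAN frame sent from Python. Below is a
minimal sketch assuming the &lt;i>python-can&lt;/i> library on a SocketCAN interface; the CAN ID
and the one-byte encoding are hypothetical and depend on the actual tractor wiring.&lt;/div>

&lt;pre>
import can  # python-can

PEDAL_CAN_ID = 0x201          # hypothetical ID, depends on the tractor wiring

def set_pedal(bus, fraction):
    """Send the pedal position as a single byte 0..255 (assumed encoding)."""
    value = max(0, min(255, int(fraction * 255)))
    msg = can.Message(arbitration_id=PEDAL_CAN_ID, data=[value],
                      is_extended_id=False)
    bus.send(msg)

bus = can.interface.Bus(channel='can0', bustype='socketcan')
set_pedal(bus, 0.3)           # gentle throttle
&lt;/pre>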

&lt;div class='p'>For a more detailed story see the
&lt;a href='/competitions/robotem-rovne/2016/en'>&lt;span class='cs'>Robotem rovně 2016 article&lt;/span>&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="160604">&lt;/a>&lt;/div>

&lt;h2>4th June, 2016 &amp;mdash; Contest „RoboOrienteering”&lt;/h2>

&lt;div class='p'>The contest „Robotem rovně” was great, because it got us started. Finally!
But how to proceed? We were quite exhausted from the contest so we took a break
the next Tuesday (our group meets once a week). And then I realized that
contests are the only way to push our project further, i.e. to complete the
steering mechanism. The only related contest (as far as I know) was
&lt;a href='/competitions/roboorienteering/en'>RoboOrienteering&lt;/a>, but the deadline was
deadly. We had to finish the robot in two weeks (mainly Tomáš and Standa). The
deal was that if I could write the navigation software in two weeks, they would
have the hardware ready. And so we fought again! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>The task for RoboOrienteering 2016 was to navigate to differently valued
orange cones (their positions are known only a few minutes before the start &amp;mdash; each
team receives a USB disk with a configuration text file) and drop golf
balls there. There are two radii: 2.5m for a double and 5m for a single score.&lt;/div>
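
&lt;div class='p'>Scoring a dropped ball is then just a distance check. The sketch below assumes the cone
and ball positions are already converted to local planar coordinates in metres; the
function and values are for illustration only.&lt;/div>

&lt;pre>
import math

def cone_score(ball_xy, cone_xy, value):
    """Double points within 2.5 m of the cone, single points within 5 m."""
    dist = math.hypot(ball_xy[0] - cone_xy[0], ball_xy[1] - cone_xy[1])
    if dist &lt;= 2.5:
        return 2 * value
    if dist &lt;= 5.0:
        return value
    return 0

print(cone_score((1.0, 2.0), (2.0, 3.5), value=1))   # 2 (within 2.5 m)
&lt;/pre>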

&lt;div class='p'>Version 0 &amp;mdash; the simplest thing which could possibly work &amp;mdash; was much
harder here. Even to score a single point it was necessary to have an
integrated GPS and a functional ball dispenser, and the robot also had to be able to
avoid collisions with obstacles (part of the homologation).&lt;/div>

&lt;div class='p'>My part was relatively easy &amp;mdash; if you do not have any feedback data (encoders
or the position of the steering wheel) then your program has to be very simple. My
primary goal was to integrate the new sensor
&lt;a href='http://velodynelidar.com/vlp-16.html' class='external'>Velodyne VLP-16&lt;/a>, so the first tests
were simple collision detection and STOP, something we had wanted to have already
in Písek.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/roboorienteering/2016/velodyne.jpg'>&lt;img src='/competitions/roboorienteering/2016/velodyne_t.jpg' alt='Velodyne Puck VLP-16' title='Velodyne Puck VLP-16' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/roboorienteering/2016/velodyne.jpg'>Velodyne Puck VLP-16&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>You can have a look at the &lt;a href='https://youtu.be/xc4WignNlh0' class='external'>video&lt;/a> from the first
outdoor test. It was not perfect. You can see „gaps” which correspond to
dropped UDP packets, and at one point you can see the tractor jumping back when
an obstacle was detected. There was a non-trivial lag between reality and the
processed data.&lt;/div>

&lt;div class='p'>Yes, there was a bug in (my) software. After the program start it was waiting
for the start button. The Velodyne socket was already opened, but I started to read
it only after the button was in the on position &amp;hellip; well, fixed, not so many things damaged,
ready for the competition.&lt;/div>
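
&lt;div class='p'>The fix boils down to draining the Velodyne socket while waiting for the start button.
A minimal sketch follows; the VLP-16 streams 1206-byte packets over UDP port 2368, but
&lt;i>button_pressed()&lt;/i> is only a placeholder for the real start-button check.&lt;/div>

&lt;pre>
import socket

PORT, PACKET_SIZE = 2368, 1206    # Velodyne VLP-16 data stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', PORT))

def wait_for_start(button_pressed):
    # Keep reading (and discarding) packets while waiting, so stale data
    # does not pile up in the socket buffer and cause lag after the start.
    while not button_pressed():
        sock.recv(PACKET_SIZE)

def read_packet():
    return sock.recv(PACKET_SIZE)
&lt;/pre>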

&lt;div class='p'>The tractor first steered autonomously just a few hours before the contest (we
arrived in Rychnov at 1am and started testing at 5am in the parking lot). So far
so good. Unfortunately none of us is an expert in hydraulics and we did not know
that there was too much pressure in the system. It broke during homologation (it
was a bit uphill so I had to increase the gas) &amp;mdash; but over the day Standa and his
colleagues somehow magically managed to fix it even without proper tools. [now
it is revised and working fine]&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/roboorienteering/2016/jd-ball.jpg'>&lt;img src='/competitions/roboorienteering/2016/jd-ball_t.jpg' alt='Ball dispenser' title='Ball dispenser' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/roboorienteering/2016/jd-ball.jpg'>Ball dispenser&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/roboorienteering/2016/czu-tf-robotika-cz-team.jpg'>&lt;img src='/competitions/roboorienteering/2016/czu-tf-robotika-cz-team_t.jpg' alt='The team' title='The team' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/roboorienteering/2016/czu-tf-robotika-cz-team.jpg'>The team&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/roboorienteering/2016/jd-nav2.jpg'>&lt;img src='/competitions/roboorienteering/2016/jd-nav2_t.jpg' alt='Autonomous collision avoidance' title='Autonomous collision avoidance' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/roboorienteering/2016/jd-nav2.jpg'>Autonomous collision avoidance&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h3>Video from RoboOrienteering 2016&lt;/h3>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/3ARlfJ3d5a0" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;div class='p'>&lt;/center>&lt;/div>

&lt;div class='p'>For more info see &lt;a href='/competitions/roboorienteering/2016/en'>&lt;span class='cs'>the article from RO16&lt;/span>&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="161123">&lt;/a>&lt;/div>

&lt;h2>23rd November, 2016 &amp;mdash; Hydromotor video&lt;/h2>

&lt;div class='p'>Standa sent me a link to a very nice descriptive video of how the transmission works in
our John Deere tractor:&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/KKQnet3IA_s" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="170516">&lt;/a>&lt;/div>

&lt;h2>16th May, 2017 &amp;mdash; „Robotem rovně” (again) and the first garden test&lt;/h2>

&lt;div class='p'>Last weekend we participated in the contest
&lt;a href='/competitions/robotem-rovne/2017/en'>&lt;span class='cs'>Robot go straight!&lt;/span>&lt;/a> in Písek. It was
primarily a show of how far we have moved since last year (a brick on the pedal to pass
homologation). The sensor set was: odometry (encoders on the front wheels plus the
steering angle of the left wheel), GPS used for reference, a SICK LMS100 laser
scanner and a dual IP camera with wide-angle lenses. For the first two runs we
basically tuned the steering and collected real data about the environment. In
the afternoon we added a simple correction from the camera data to avoid green areas
(yes, this is a bit strange for a lawn mower &lt;span class='wink'>&lt;/span> ).&lt;/div>
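
&lt;div class='p'>Such a correction can be as simple as comparing how „green” the left and right halves of
the camera image are and steering towards the less green side. The sketch below uses a
generic excess-green index; the threshold and gain are assumptions, not the values used on
the tractor.&lt;/div>

&lt;pre>
import numpy as np

def green_fraction(rgb, threshold=20):
    """Fraction of green pixels in the left and right image halves."""
    img = rgb.astype(np.int16)
    exg = 2 * img[:, :, 1] - img[:, :, 0] - img[:, :, 2]   # excess green
    green = exg > threshold
    half = green.shape[1] // 2
    return green[:, :half].mean(), green[:, half:].mean()

# left, right = green_fraction(frame)
# steering_correction = GAIN * (left - right)   # steer away from grass
&lt;/pre>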

&lt;div class='p'>The second day we tested our newly developed „cones algorithm” in the garden.
For version 0 it is sufficient to prove that we can repeat a given pattern 10
times (or better infinitely many times, but we do not want to wait that long). You can
see it in the video below. We also tried to turn the cutting mechanism on,
but surprise, surprise: John Deere turns it off whenever the mower is moving
backwards. So we switched to a slightly boring oval and did some experiments in higher
grass.&lt;/div>

&lt;div class='p'>Please forgive the quality of the video &amp;mdash; it was taken with a 12-year-old digital
camera (which is still good for pictures but not for videos).&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/KiDnPsnLmLU" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;h2>Links&lt;/h2>

&lt;ul>
&lt;li>&lt;a href='https://github.com/robotika/osgar' class='external'>OSGAR source code on Github&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>&lt;a href='/robots/osgar/en#email'>Contact Form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Robotour 2017</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2017/en"/>
	<id>http://localhost/competitions/robotour/2017/en</id>
	<updated>2017-03-08T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> The 12th year of Robotour contest for autonomous outdoor robots will take place
in Slovakia/Žilina. There will be several changes in the rules this year: the
concurrent start of all robots is no longer required and new &lt;i>autonomous&lt;/i> and
&lt;i>service&lt;/i> areas are introduced. When and where? &lt;b>Žilina, 16th September
2017&lt;/b>.
 </summary>
	<content type='html'> 
&lt;h1>Rules&lt;/h1>

&lt;div class='p'>The rules for Robotour 2017 are slightly different compared to previous
years. In particular, the concurrent start of all robots at a given time will no
longer be enforced. The exact start location changes to the &lt;i>boundary&lt;/i> of the
&lt;i>autonomous area&lt;/i>, and it will be up to the teams where and when exactly they
start. The task will be to go to the loading zone, where a beer barrel will be
manually or automatically loaded, and then to deliver the payload to the desired
destination (unloading zone). After unloading (again manual or automatic) the
robot has to leave the &lt;i>autonomous zone&lt;/i>.&lt;/div>

&lt;div class='p'>For each sub-task the team gets one point, and there is an extra bonus for autonomous
load/unload. In total 5 points per run/delivery request.&lt;/div>

&lt;div class='p'>The detailed rules are available on GitHub with the tag ROBOTOUR2017:
&lt;a href='https://github.com/robotika/robotour/blob/ROBOTOUR2017/rules/rules.md' class='external'>English&lt;/a>
and
&lt;a href='https://github.com/robotika/robotour/blob/ROBOTOUR2017/rules/pravidla.md' class='external'>Czech&lt;/a>.&lt;/div>

&lt;h2>Examples of roads in Žilina (Ľudovít Štúr park)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-113023.jpg'>&lt;img src='/competitions/robotour/2017/zilina-113023_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-113655.jpg'>&lt;img src='/competitions/robotour/2017/zilina-113655_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-114218.jpg'>&lt;img src='/competitions/robotour/2017/zilina-114218_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-114334.jpg'>&lt;img src='/competitions/robotour/2017/zilina-114334_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-114624.jpg'>&lt;img src='/competitions/robotour/2017/zilina-114624_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-114712.jpg'>&lt;img src='/competitions/robotour/2017/zilina-114712_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-115451.jpg'>&lt;img src='/competitions/robotour/2017/zilina-115451_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2017/zilina-120138.jpg'>&lt;img src='/competitions/robotour/2017/zilina-120138_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Location&lt;/h2>

&lt;div class='p'>The contest will take place primarily in the
&lt;a href='http://www.openstreetmap.org/way/123697463/' class='external'>park of Ľudovít Štúr&lt;/a>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 476px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotour/2017/map.png' alt='overview map' title='overview map' class='border'  width='470' height='703'/>&lt;/span>&lt;br/>
&lt;span>overview map&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 306px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotour/2017/logo-uz.jpg' alt='' title='' class='border'  width='300' height='292'/>&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 306px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotour/2017/logo-icm.jpg' alt='' title='' class='border'  width='300' height='214'/>&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to somehow support this contest or you have some
comments/questions, please use our standard &lt;a href='/competitions/robotour/2017/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>SICK Robot Day 2016</title>
	<link rel='alternate' href="http://localhost/competitions/sick-robot-day/2016/en"/>
	<id>http://localhost/competitions/sick-robot-day/2016/en</id>
	<updated>2016-10-18T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Another „SICK Robot Day” is over. This time two robots at a time had to place cubes
in a field fitted with RFID chips in a 5x5 meter „chessboard” area. Once the
cubes were placed they became obstacles and further collisions were
penalized. So how did it go?

 </summary>
	<content type='html'> 
&lt;h2>Rules in a nutshell&lt;/h2>

&lt;div class='p'>The playing field is approximately 7x13m. Two autonomous robots start
from yellow and pink 3x3m storage areas. Their task is to deliver cubes to the
central 5x5m playing field. The cubes have to be carried one at a time and
robots get points for each covered 1x1m square. The point goes to the robot
which places its cube on a given square first. As soon as a cube is
placed it automatically becomes an obstacle (for both robots) and contact is
penalized by a loss of 0.5 point.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/sick-robot-day/2016/playing-field.png'>&lt;img src='/competitions/sick-robot-day/2016/playing-field_t.png' alt='' title='' class='border'  width='320' height='152'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>A match takes 10 minutes and every team plays two matches. The color and the
opponent are randomly selected. There is an RFID chip at every crossing of the
field to simplify robot navigation. The code encodes the (X, Y) coordinates but
also the „zone type”. The organizer provided the teams with
&lt;a href='https://www.sick.com/media/pdf/2/52/052/dataSheet_RFU620-10100_1062599_de.pdf' class='external'>RFU
620&lt;/a> sensors, and the data are available over Ethernet or CAN bus.&lt;/div>

&lt;ul>
&lt;li>&lt;a href='https://www.sick.com/de/en/robotday-2016/w/robotday2016/' class='external'>main contest page&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://www.sick.com/medias/Reglement-EN-final.pdf?context=bWFzdGVyfHJvb3R8MjEzMTk5fGFwcGxpY2F0aW9uL3BkZnxoMWEvaDNmLzg5MTQ4MDI4MDI3MTgucGRmfGE3N2M1OTYyY2ZhNDI0Y2M4YzVjNWYyYzgxYmMwZTg2NjQzMGJmMzdkNzI4ZTNmNmExN2M5NThmMmRlYjBiMmU
' class='external'>detail rules in English (PDF)&lt;/a>&lt;/li>
&lt;/ul>

&lt;h1>Results&lt;/h1>

&lt;div class='p'>&lt;center>
&lt;table border="1">
	&lt;tr>
		&lt;th>Team&lt;/th>
		&lt;th>Run 1&lt;/th>
		&lt;th>Run 2&lt;/th>
		&lt;th>Best&lt;/th>
		&lt;th>Place&lt;/th>
	&lt;/tr>
	&lt;tr bgcolor="yellow">
		&lt;td>&lt;b>Karlsuni Prag&lt;/b>&lt;/td>
		&lt;td>7.0&lt;/td>
		&lt;td>6.0&lt;/td>
		&lt;td>7.0&lt;/td>
		&lt;td>1&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>Uni FR&lt;/b>&lt;/td>
		&lt;td>-0.5&lt;/td>
		&lt;td>5.0&lt;/td>
		&lt;td>5.0&lt;/td>
		&lt;td>2&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>EDURO&lt;/b>&lt;/td>
		&lt;td>2.0&lt;/td>
		&lt;td>4.5&lt;/td>
		&lt;td>4.5&lt;/td>
		&lt;td>3&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>Kamaro Engineering e.V.&lt;/b>&lt;/td>
		&lt;td>4.0&lt;/td>
		&lt;td>4.0&lt;/td>
		&lt;td>4.0&lt;/td>
		&lt;td>4&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>Osnabrück&lt;/b>&lt;/td>
		&lt;td>1.0&lt;/td>
		&lt;td>2.5&lt;/td>
		&lt;td>2.5&lt;/td>
		&lt;td>6&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>FRED&lt;/b>&lt;/td>
		&lt;td>X&lt;/td>
		&lt;td>1.0&lt;/td>
		&lt;td>1.0&lt;/td>
		&lt;td>6&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>Alpaca&lt;/b>&lt;/td>
		&lt;td>0.0&lt;/td>
		&lt;td>0.0&lt;/td>
		&lt;td>0.0&lt;/td>
		&lt;td>7&lt;/td>
	&lt;/tr>
	&lt;tr>
   &lt;td>&lt;b>WINGmen&lt;/b>&lt;/td>
		&lt;td>X&lt;/td>
		&lt;td>X&lt;/td>
		&lt;td>0.0&lt;/td>
		&lt;td>7&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td>&lt;b>Smelý Zajko&lt;/b>&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
	&lt;/tr>
&lt;/table>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;h1>Cogito MART (1st place)&lt;/h1>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/XGcGBP-hDKI" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>The winner of &lt;i>SICK Robot Day 2016&lt;/i> is team &lt;b>Cogito MART&lt;/b>. You can read the
&lt;a href='https://sites.google.com/site/cogitoteam/sick-robot-day-2016' class='external'>interesting
Cogito diary&lt;/a> about how the robot named &lt;i>Clementine&lt;/i> was modified for the contest
in the 12 days before the competition.&lt;/div>

&lt;hr/>

&lt;h1>CS Freiburg (2nd place)&lt;/h1>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/UegNz-ERm9k" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>&lt;i>author: Andreas Hertle&lt;/i>&lt;/div>

&lt;div class='p'>Our robot is called Zerg, named after a speedy unit (the Zergling) from the strategy
game StarCraft. The robot was built in 2005/2006 for a race competition, thus
high velocity and acceleration are its dominant attributes.&lt;/div>

&lt;div class='p'>As robotic middleware we use ROS running on Ubuntu, which for us has two major
benefits:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/sick-robot-day/2016/zerg.jpg'>&lt;img src='/competitions/sick-robot-day/2016/zerg_t.jpg' alt='Zerg' title='Zerg' class='border'  width='220' height='163'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/sick-robot-day/2016/zerg.jpg'>Zerg&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>1. The ROS community provides a whole range of ready-to-use components, ranging
from drivers for various sensors to path planning and perception libraries.
Thus less time needs to be invested into the development of components not
specific to the task at hand.&lt;/div>

&lt;div class='p'>2. The flexible communication between ROS programs allows for easy
re-configuration of the system. Multiple alternative implementations of
specific components can be prepared in advance and quickly exchanged should one
show significant advantages.&lt;/div>

&lt;div class='p'>This year we upgraded the electronics hardware, replacing the old custom-built
boards with an Arduino Mega. The Arduino program reads the wheel encoders to
estimate the current velocity and feeds it to the PID controller for the
motors. Our robot can reach a linear velocity of 2.3 m/s, however there is
rarely enough free space to utilize the maximum velocity. For turning the robot
relies on skid steering. In theory it can turn 360° in half a second, however
that is much faster than the navigation algorithms can handle. For the
competition the linear velocity was limited to a more manageable 1.3 m/s and the
angular velocity to 180°/s.&lt;/div>
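
&lt;div class='p'>The velocity loop is a textbook PID controller: the encoder-based velocity estimate is
compared with the commanded velocity and the output drives the motor PWM. A minimal Python
sketch of the idea is below (the real controller runs in C on the Arduino and the gains
here are made up).&lt;/div>

&lt;pre>
class PID:
    """Minimal PID velocity controller (sketch only)."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        error = target - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# ctrl = PID(kp=1.2, ki=0.5, kd=0.05)        # made-up gains
# pwm = ctrl.update(target=1.3, measured=wheel_velocity, dt=0.01)
&lt;/pre>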

&lt;h2>Gripper&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/sick-robot-day/2016/cube_gripper_design.png'>&lt;img src='/competitions/sick-robot-day/2016/cube_gripper_design_t.png' alt='' title='' class='border'  width='220' height='133'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>To move the foam cubes, we constructed a specialized gripper. It has a round
shape so that the cubes do not need to be aligned and can be picked up at an
angle. Usually we put our laser range finder at the front of the robot, and to
be able to see the cubes the laser sensor had to be placed less than 0.15m
above the ground. A solid grip can be achieved when holding the cubes at their equator,
so 0.075m above the ground. Furthermore, the gripper should block as few laser
beams as possible in both the lowered and raised state. That is the reason for
the arching shape of the arm: the laser sensor is placed at its center.&lt;/div>

&lt;div class='p'>The gripper was designed in Blender and printed on a MakerBot Replicator
5th generation. We use one Dynamixel AX12-A servo to operate the closing
mechanism and two to operate the lifting mechanism. Finally, we attached a SICK
WT4-2P132 distance sensor to detect whether the gripper is holding a cube.&lt;/div>

&lt;h2>Cube Perception&lt;/h2>

&lt;div class='p'>Since this year’s competition focuses on transporting foam cubes, perceiving
those cubes reliably is a key component. We implemented two different
approaches.&lt;/div>

&lt;div class='p'>The first relies on 3D point cloud data as produced by a Microsoft Kinect
camera. In the first step the ground plane is detected and removed from the
point cloud, and points above cube height are removed as well. The remaining data
is segmented by distance to separate individual cubes. Region growing
algorithms find the visible sides of each cube. With this data the center and
orientation of the cube can be computed. The point cloud also contains color
data, so the color of the cube can be determined. Unfortunately this whole
process is computationally expensive. Even after subsampling the input data the
algorithm requires 1 second to process a point cloud. Furthermore, the small
horizontal angle (48°) of the Kinect camera and its minimum distance of 0.5m
complicate the problem further.&lt;/div>

&lt;div class='p'>Thus we implemented an alternative approach based on data from a laser range
finder. The laser data is first segmented based on the distance between two
neighbouring ranges. Then we determine which point clusters are fully visible
(not occluded by their neighbouring clusters). Finally, the distance between
the first and last point decides whether it is a cube or something else. We
cannot determine the color of the cube, however, the color information does not
influence the behavior algorithms anyway. With this algorithm we can detect
cubes in a 200° cone, between 0.05m and 10m, as fast as the laser sensor
produces the data (40Hz).&lt;/div>
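
&lt;div class='p'>A minimal numpy sketch of this scan segmentation is shown below. The jump threshold, the
cube edge length and the width tolerance are illustrative assumptions, not the team's
actual parameters.&lt;/div>

&lt;pre>
import numpy as np

CUBE_SIZE = 0.15    # assumed cube edge length in metres

def detect_cubes(ranges, angles, jump=0.1, tol=0.05):
    """Return (x, y) centres of scan clusters whose width matches a cube."""
    breaks = np.where(np.abs(np.diff(ranges)) > jump)[0] + 1
    clusters = np.split(np.arange(len(ranges)), breaks)
    cubes = []
    for c in clusters[1:-1]:                       # need neighbours on both sides
        left, right = c[0] - 1, c[-1] + 1
        if ranges[left] &lt; ranges[c[0]] or ranges[right] &lt; ranges[c[-1]]:
            continue                               # occluded by a closer cluster
        x, y = ranges[c] * np.cos(angles[c]), ranges[c] * np.sin(angles[c])
        width = np.hypot(x[-1] - x[0], y[-1] - y[0])
        if abs(width - CUBE_SIZE) &lt; tol:
            cubes.append((x.mean(), y.mean()))
    return cubes
&lt;/pre>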

&lt;h2>RFID Localization&lt;/h2>

&lt;div class='p'>The second half of the challenge for this competition consisted of recognizing
RFID tags in the environment to distinguish the score fields and the home base
field.&lt;/div>

&lt;div class='p'>We modeled the locations of the RFID tags as a second map layer. The first map
is produced by the laser range finder. We estimate the displacement between the
origin of the laser map and the RFID map with a probabilistic particle filter,
similar to what is done in Monte Carlo Localization. The challenging part here was
modeling the RFID sensor: the sensor produces the tag id and signal strength,
which localizes the robot within a radius of up to 0.3m around that specific tag,
so it is not very precise and gives no orientation. Unfortunately we did not
manage to get the particle filter working reliably in time for the competition.&lt;/div>

&lt;div class='p'>Thus, we went for an easier but less reliable method: we rely on the assumption
that the enclosing fence is placed symmetrically around the RFID tags, that is,
the center of the enclosed area coincides with the center of the RFID map. We
take the free cells in the laser map as input and compute their mean and
covariance. With a singular value decomposition we get the main axes and the
center of the environment. Thus, we can estimate the center of the RFID map
based solely on laser data.&lt;/div>
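
&lt;div class='p'>With the free cells given as an Nx2 array of coordinates, this boils down to a few lines
of numpy (a sketch under that assumption):&lt;/div>

&lt;pre>
import numpy as np

def field_frame(free_cells):
    """Centre and main axes of the free area (free_cells is an Nx2 array)."""
    center = free_cells.mean(axis=0)
    cov = np.cov((free_cells - center).T)
    axes, _, _ = np.linalg.svd(cov)    # columns are the principal directions
    return center, axes
&lt;/pre>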

&lt;h2>Behavior&lt;/h2>

&lt;div class='p'>We encoded the behavior of the robot as a deterministic state machine. The
process of placing cubes consists of the following steps:&lt;/div>

&lt;ol>
&lt;li>Navigate to a random pose around the home field if no cube is detected.&lt;/li>

&lt;li>Navigate to closest cube in the home field.&lt;/li>

&lt;li>Lower gripper, approach cube and pick it up.&lt;/li>

&lt;li>Decide where to place the cube.&lt;/li>

&lt;li>Navigate to target field.&lt;/li>

&lt;li>Lower gripper and release cube.&lt;/li>
&lt;/ol>

&lt;div class='p'>Should the cube sensor in the gripper report that no cube is in the gripper,
the algorithm jumps to the appropriate state (1, 2 or 3 depending on the
distance to possibly visible cubes).&lt;/div>
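
&lt;div class='p'>A hypothetical sketch of such a deterministic state machine is shown below; the state
names follow the list above and the boolean inputs stand in for the real sensor checks.&lt;/div>

&lt;pre>
from enum import Enum

class State(Enum):
    SEARCH = 1      # drive to a random pose around the home field
    APPROACH = 2    # navigate to the closest visible cube
    PICK = 3        # lower gripper, approach and grab the cube
    PLAN = 4        # decide where to place the cube
    DELIVER = 5     # navigate to the target field
    RELEASE = 6     # lower gripper, release the cube

def next_state(state, cube_visible, cube_in_gripper, cube_near):
    # Fall back if the gripper sensor reports the cube was lost.
    if state in (State.PLAN, State.DELIVER, State.RELEASE) and not cube_in_gripper:
        if cube_near:
            return State.PICK
        return State.APPROACH if cube_visible else State.SEARCH
    transitions = {
        State.SEARCH:   State.APPROACH if cube_visible else State.SEARCH,
        State.APPROACH: State.PICK,
        State.PICK:     State.PLAN,
        State.PLAN:     State.DELIVER,
        State.DELIVER:  State.RELEASE,
        State.RELEASE:  State.SEARCH,
    }
    return transitions[state]
&lt;/pre>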

&lt;div class='p'>&lt;table class='image_panel right' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/sick-robot-day/2016/screenshot.png'>&lt;img src='/competitions/sick-robot-day/2016/screenshot_t.png' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The strategy is implemented in step 4. We prepared two alternatives: the
aggressive strategy selects target fields closer to the opponent's home field;
the defensive strategy places cubes closer to our own home field. In the
competition our robot was using the defensive strategy.&lt;/div>

&lt;div class='p'>A simulator was crucial during the development of the overall behavior, since
we did not have the room to physically build the whole playing field in our
lab. We used the Stage simulator, which simulates (mostly) in 2d and is
lightweight, so that multiple robots can be simulated. The blue box in the
picture represents the robot and the red boxes the cubes. The field of view of
the laser sensor is displayed in green.&lt;/div>

&lt;h2>Challenges during the competition&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/sick-robot-day/2016/cs-freiburg-team.jpg'>&lt;img src='/competitions/sick-robot-day/2016/cs-freiburg-team_t.jpg' alt='From right to left:
Hanna Stellmach
Andreas Hertle
Jens Schindler
' title='From right to left:
Hanna Stellmach
Andreas Hertle
Jens Schindler
' class='border'  width='220' height='293'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/sick-robot-day/2016/cs-freiburg-team.jpg'>From right to left:
Hanna Stellmach
Andreas Hertle
Jens Schindler
&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>After arriving at the Stadthalle, we soon realized that something was wrong
with the robot’s main sensor, a Hokuyo UTM-30LX laser range finder. The sensor,
which had been working fine in our lab, kept crashing every few minutes. We figured
out that the sensor detected other infrared light sources and decided that
turning itself off was the sensible thing to do. So we spent most of the
preparation time getting a replacement sensor from our lab (home advantage) and
mounting it on the robot. The only sensor we could get to work in the short
time was a Hokuyo URG-04LX (4m max range), which is a huge downgrade from the
UTM-30LX (30m max range). We assume the reduced range is to blame for the failure
in our first run.&lt;/div>

&lt;div class='p'>We were so focused on the problems with the laser sensor that we did not
figure out the second problem before our second run: the carpet. Our robot has
skid steering, but the carpet had high friction, which made fine turns all but
impossible. This can be observed when the robot tries to approach a target field
or approach a cube. We are not sure if we could have done anything about it, maybe
wrap the robot’s rubber wheels in tape. Fortunately, the sharp turn-and-stop
maneuvers shook free the cube the robot was carrying, and the cubes even rolled
onto empty target fields!&lt;/div>

&lt;div class='p'>Additional thanks to the students who did not participate in the competition but
worked on components in previous semesters: Robert Grönsfeld, Natalie Prange
and Hermann Ritzenthaler.&lt;/div>

&lt;hr/>

&lt;h1>EDURO Team (3rd place)&lt;/h1>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/3uaT7xsMJnM?list=PL1slVS532sQBkc6ujGt-5oI5I4hIp3YQA" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>There is an &lt;i>EDURO Team Diary&lt;/i> when you switch this page to Czech. Hopefully
there will be enough energy to create a short English summary later on, but for
now enjoy the videos from David/MART. See his complete
&lt;a href='https://goo.gl/TcBvBu' class='external'>SICK Robot Day 2016 video list&lt;/a>.&lt;/div>

&lt;h3>Eduro "show time"&lt;/h3>

&lt;div class='p'>&amp;hellip; the most cubes were placed in the very last game (which did not count for Eduro)&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/F_SzsSTY5og?list=PL1slVS532sQBkc6ujGt-5oI5I4hIp3YQA" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/competitions/sick-robot-day/2016/en#email'>contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Results</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2016/results/en"/>
	<id>http://localhost/competitions/robotour/2016/results/en</id>
	<updated>2016-09-21T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> The 11th year of the Robotour contest was the wettest one. It rained constantly all
day. This was a challenge not only for the sensors and electronics, but also for
the teams themselves. Nevertheless there were 12 robots at the start
accepting the challenge in heavy rain (and wind too). Istrobotics with more
than 600 points was the clear winner &amp;hellip;
 </summary>
	<content type='html'> 
&lt;h1>Competition&lt;/h1>

&lt;h2>Attendance&lt;/h2>

&lt;div class='p'>There were fourteen teams registered and twelve of them actively participated.
The Czech team MART gave up on Friday and the local team WallE also had some
unrecoverable problems (the robot parts were immediately reused for the „No! This
is Patrick” team).&lt;/div>

&lt;h2>The Place&lt;/h2>

&lt;div class='p'>The &lt;b>Technische Hochschule Deggendorf&lt;/b> has a very nice campus on the bank
of the Donau and it is surrounded by several parks. &lt;i>Stadthallenpark&lt;/i> was a classic
park with a fountain, roses and sandy roads. Then there were several long
straight roads along the creek Bogenbach and the river Donau. Then there was a
section with a metal corridor on the roof of the parking garage &lt;i>Parkdeck
Deichgarten&lt;/i>. And if you would like to make it even harder, there were bridges,
up and down ramps, a railway, steps &amp;hellip; simply a challenge. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 564px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotour/2016/map.png' alt='Map overview' title='Map overview' class='border'  width='558' height='453'/>&lt;/span>&lt;br/>
&lt;span>Map overview&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>The Weather&lt;/h2>

&lt;div class='p'>&amp;hellip; was just horrible. The next level would probably be snow and hail! It
rained all day and the area was covered by many pools (some several centimeters
deep), plus a cold wind &amp;hellip; but what does not kill you makes you stronger, so maybe it
was a good „push” to improve the robots and also the contest itself.&lt;/div>

&lt;h2>Homologation&lt;/h2>

&lt;div class='p'>The Friday homologation felt like paradise compared to the contest weather on
Saturday. It turned out that only two teams were loading the beer: Cogito used
this year's option with a 500ml can and Radioklub Písek a crane with a full
5-liter barrel. The other teams were more rational and skipped these extra bonus
points/troubles (something we want to discourage a little in the coming years).&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/istrobotics.jpg'>&lt;img src='/competitions/robotour/2016/results/istrobotics_t.jpg' alt='Istrobotics' title='Istrobotics' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/istrobotics.jpg'>Istrobotics&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/jecc.jpg'>&lt;img src='/competitions/robotour/2016/results/jecc_t.jpg' alt='JECC - Fesl' title='JECC - Fesl' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/jecc.jpg'>JECC - Fesl&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/raptors.jpg'>&lt;img src='/competitions/robotour/2016/results/raptors_t.jpg' alt='Raptors' title='Raptors' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/raptors.jpg'>Raptors&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>There was an exciting moment worth &lt;i>yellow journalism&lt;/i>: the Quadron
team left their robot parked (was it with the emergency STOP button
pressed?) next to the road (and the university stream), and after
several minutes the robot woke up, turned sharply left and ran for the water! The
team members were not quick enough, but they were lucky that the stream was lined with
bushes which stopped the robot. The batteries were pulled out, but the
robot survived.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/run-away-robot.jpg'>&lt;img src='/competitions/robotour/2016/results/run-away-robot_t.jpg' alt='Runaway robot' title='Runaway robot' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/run-away-robot.jpg'>Runaway robot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/quadrons.jpg'>&lt;img src='/competitions/robotour/2016/results/quadrons_t.jpg' alt='Quadrons' title='Quadrons' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/quadrons.jpg'>Quadrons&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 171px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/lois.jpg'>&lt;img src='/competitions/robotour/2016/results/lois_t.jpg' alt='Lois' title='Lois' class='border'  width='165' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/lois.jpg'>Lois&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Surprisingly something similar happened to Raptors an hour later &amp;hellip; (but not
so dramatically).&lt;/div>

&lt;h2>Runs&lt;/h2>

&lt;div class='p'>There was almost nobody at the 0th run at 9am. Only Aleš from AmBot came and
started a test run. The others hoped for better weather and decided that there was no
point in getting wet if there was no punishment and no points.&lt;/div>

&lt;div class='p'>The official first run started at 10:30 (the event opening was at 10am) in
&lt;i>Stadthallenpark&lt;/i>, near the bridge, with the destination on the other side.
Istrobotics was able to collect the first several hundred points, but AmBot, Smelý
Zajko, JECC, Raptors and Quadrons also moved and got some points.&lt;/div>

&lt;div class='p'>The second run was at 11:30, so there was not much time for rest. It was on the roof of
the parking garage, very windy, very wet. Here the scoring teams already started at
the end of the line (the best starts last), so AmBot tried a side road to avoid
the heavy traffic. Unfortunately the robot's decisions oscillated near the rusty
border and it did not move far. Only Istrobotics managed to get some points in this
run, zigzagging from the last position among the other robots to the far end.
Near the goal the robot decided to turn left, but there were 3 parallel steps.
Surprisingly the owners did not run after their machine, and the small car
loaded with the large barrel managed to get down the first step. On the second
step they stopped it.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/run2.jpg'>&lt;img src='/competitions/robotour/2016/results/run2_t.jpg' alt='2nd run near the river bank' title='2nd run near the river bank' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/run2.jpg'>2nd run near the river bank&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/run2-cogito.jpg'>&lt;img src='/competitions/robotour/2016/results/run2-cogito_t.jpg' alt='Cogito' title='Cogito' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/run2-cogito.jpg'>Cogito&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The 3rd run was probably the most emotional one for me. All twelve robots were there, on
the road parallel to the railway. Here &lt;i>Kamaro Beteigeuze&lt;/i> got their first
points. They wanted to cross the bridge, but turned right too soon. The robot did
not recognize this mistake, so at the next crossing it turned right instead of
left, onto the narrow downhill road under the bridge.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/run3.jpg'>&lt;img src='/competitions/robotour/2016/results/run3_t.jpg' alt='3rd run' title='3rd run' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/run3.jpg'>3rd run&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/run3-back.jpg'>&lt;img src='/competitions/robotour/2016/results/run3-back_t.jpg' alt='3rd run continues' title='3rd run continues' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/run3-back.jpg'>3rd run continues&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/kamaro.jpg'>&lt;img src='/competitions/robotour/2016/results/kamaro_t.jpg' alt='Kamaro Beteigeuze' title='Kamaro Beteigeuze' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/kamaro.jpg'>Kamaro Beteigeuze&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Istrobotics also had a problem with the bridge and their robot refused to cross
it. There were several collisions at the start (Radioklub Písek backed up into
an opponent, for example). Several teams reported anomalies with the compass, so
maybe there is a high voltage cable hidden under the road, like in Písek?&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/run4.jpg'>&lt;img src='/competitions/robotour/2016/results/run4_t.jpg' alt='4th run' title='4th run' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/run4.jpg'>4th run&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The 4th (last) run went from the other side of the bridge back to the school. All
12 robots were there again. Team Cogito changed their strategy: move 25
meters, unload the beer can and return to the start. That way they would get extra
points for automatic loading and unloading and twice the points for the distance
from the start, but &amp;hellip; ARBot started at the same second and was in the blind zone
on the right. It looked like a race, but unfortunately at one point both
robots decided to steer towards the opponent.&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/dB2YhL4bpEY" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>Interestingly, Smelý Zajko and JECC finished again at the same spot (they had left
the road in the same area in the 1st run). Here they hit the concrete block when
entering the bridge. Was it just a coincidence, or did their robot brains, both using
neural networks, interpret the situation the same way in both cases?&lt;/div>

&lt;h2>Total Score&lt;/h2>

&lt;div class='p'>&lt;center>
&lt;table border="1">
	&lt;tr>
		&lt;th>Place&lt;/th>
		&lt;th>Team&lt;/th>
		&lt;th>1st Run&lt;/th>
		&lt;th>2nd Run&lt;/th>
		&lt;th>3rd Run&lt;/th>
		&lt;th>4th Run&lt;/th>
		&lt;th>Total&lt;/th>
	&lt;/tr>
	&lt;tr bgcolor="yellow">
		&lt;td align="center">1st&lt;/td>
		&lt;td>&lt;b>Istrobotics&lt;/b>&lt;/td>
		&lt;td>156&lt;/td>
		&lt;td>272&lt;/td>
		&lt;td>90&lt;/td>
		&lt;td>94&lt;/td>
		&lt;td>612&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">2nd&lt;/td>
		&lt;td>&lt;b>Kamaro Beteigeuze&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>76&lt;/td>
		&lt;td>94&lt;/td>
		&lt;td>170&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">3rd-5th&lt;/td>
		&lt;td>&lt;b>Smelý Zajko&lt;/b>&lt;/td>
		&lt;td>28&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>40&lt;/td>
		&lt;td>68&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">3rd-5th&lt;/td>
		&lt;td>&lt;b>JECC - Fesl&lt;/b>&lt;/td>
		&lt;td>28&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>40&lt;/td>
		&lt;td>68&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">3rd-5th&lt;/td>
		&lt;td>&lt;b>AmBot&lt;/b>&lt;/td>
		&lt;td>66&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>66&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">6th&lt;/td>
		&lt;td>&lt;b>Raptors&lt;/b>&lt;/td>
		&lt;td>20&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>3&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>23&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">7th&lt;/td>
		&lt;td>&lt;b>Cogito&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>1+5&lt;/td>
		&lt;td>6&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">8th-9th&lt;/td>
    &lt;td>&lt;b>Quadrons&lt;/b>&lt;/td>
		&lt;td>1&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>1&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">8th-9th&lt;/td>
		&lt;td>&lt;b>ARBot&lt;/b>&lt;/td>
		&lt;td>1&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>1&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">10th-12th&lt;/td>
		&lt;td>&lt;b>Lois&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">10th-12th&lt;/td>
		&lt;td>&lt;b>No! This is Patrick!&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">10th-12th&lt;/td>
		&lt;td>&lt;b>Radioklub Písek TCVVI&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
&lt;/table>
&lt;/center>&lt;/div>

&lt;h1>Conclusions&lt;/h1>

&lt;div class='p'>The contest was quite exhausting this year, but it was clear that the teams are
highly determined to push it further. Thanks a lot to prof. Kurpris
and his colleagues &amp;mdash; the organization was perfect! Also thanks to the
sponsors and the city of Deggendorf for covering the accommodation and dinner
expenses.&lt;/div>

&lt;div class='p'>And what next? We discussed it at the Sunday workshop (as part of PAIR'16) and
I will leave that for a separate article, but it resonates with the wishes of the teams:
they do not want to enter coordinates into their robots any more (wet
keyboards, unreadable papers, non-working wet notebooks, &amp;hellip;). It is time for
full autonomy. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/polish-teams.jpg'>&lt;img src='/competitions/robotour/2016/results/polish-teams_t.jpg' alt='Polish teams Quadrons and Raptors' title='Polish teams Quadrons and Raptors' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/polish-teams.jpg'>Polish teams Quadrons and Raptors&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/radioklub-pisek-crane.jpg'>&lt;img src='/competitions/robotour/2016/results/radioklub-pisek-crane_t.jpg' alt='Radioklub Písek - crane' title='Radioklub Písek - crane' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/radioklub-pisek-crane.jpg'>Radioklub Písek - crane&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/cogito2.jpg'>&lt;img src='/competitions/robotour/2016/results/cogito2_t.jpg' alt='Cogito' title='Cogito' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/cogito2.jpg'>Cogito&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;h2>ARBot photos (2nd Run)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/ar-arbot.jpg'>&lt;img src='/competitions/robotour/2016/results/ar-arbot_t.jpg' alt='ARBot' title='ARBot' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/ar-arbot.jpg'>ARBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/ar-2nd-run.jpg'>&lt;img src='/competitions/robotour/2016/results/ar-2nd-run_t.jpg' alt='2nd run - parking roof' title='2nd run - parking roof' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/ar-2nd-run.jpg'>2nd run - parking roof&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/ar-ambot-and-istrobotics.jpg'>&lt;img src='/competitions/robotour/2016/results/ar-ambot-and-istrobotics_t.jpg' alt='AmBot and Istrobotics' title='AmBot and Istrobotics' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/ar-ambot-and-istrobotics.jpg'>AmBot and Istrobotics&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/ar-kamaro.jpg'>&lt;img src='/competitions/robotour/2016/results/ar-kamaro_t.jpg' alt='Kamaro Beteigeuze' title='Kamaro Beteigeuze' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/ar-kamaro.jpg'>Kamaro Beteigeuze&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/results/ar-smely-zajko.jpg'>&lt;img src='/competitions/robotour/2016/results/ar-smely-zajko_t.jpg' alt='Smelý Zajko' title='Smelý Zajko' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2016/results/ar-smely-zajko.jpg'>Smelý Zajko&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;h2>Kamaro Betelgeuse Video&lt;/h2>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/IrkcsHbpMHo" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to somehow support this contest or you have some
comment/query, please use our &lt;a href='/competitions/robotour/2016/results/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Introduction of teams</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2016/teams/en"/>
	<id>http://localhost/competitions/robotour/2016/teams/en</id>
	<updated>2016-08-23T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> There are 14 teams registered for Robotour 2016 in Deggendorf, Germany. That is
a nice start to the second contest decade &lt;span class='smile'>&lt;/span>. Also note that Robotour is no
longer a Czech domain &amp;mdash; there are 5 teams from Germany, &lt;i>only&lt;/i> 4 teams from
the Czech Republic, 2 from Poland, 2 from Slovakia and 1 from Switzerland. In
particular I am pleased that there are 5 new teams, and even the old teams have upgraded
their robots, sensors, software &amp;hellip;
 </summary>
	<content type='html'> 
&lt;h1>Teams&lt;/h1>

&lt;h1>&lt;a href='https://www.youtube.com/watch?v=HnTYwTIg1NQ&amp;amp;list=PL2gPpyBs1e20HoM1dX_m1ju1MHXuQQzGB' class='external'>YouTube playlist of all registrations 2016&lt;/a>&lt;/h1>

&lt;h2>&lt;a href='http://ambot6.webnode.cz/' class='external'>AmBot&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/ambot.jpg'>&lt;img src='/competitions/robotour/2016/teams/ambot_t.jpg' alt='' title='' class='border'  width='320' height='181'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/HnTYwTIg1NQ' class='external'>https://youtu.be/HnTYwTIg1NQ&lt;/a>&lt;/div>

&lt;div class='p'>Robot Ferda for Robotour 2016 is a modified children's electric car
("ride-on"). The controller (based on an Arduino with ATmega2560) controls
the motors, uses a magnetometer as a compass, manages three sonars to
detect obstacles, reads data from an external GPS receiver and
communicates with a Bluetooth converter (to respond to commands from the
master system). The master system is an Android smartphone with a special
application (RoboNav) for GPS navigation on a map (derived from
OpenStreetMap), supplemented by visual navigation using the smartphone
camera (for keeping to the road).&lt;/div>
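
&lt;div class='p'>To give an idea of how such a split between an Arduino controller and an Android
master can look, here is a minimal, hypothetical sketch of the low-level loop: it reads one of the
sonars, stops on an obstacle and otherwise applies motor commands received over the Bluetooth
link. The pin numbers, command format and the 40 cm threshold are illustrative assumptions, not
the team's actual firmware.&lt;/div>

&lt;pre>// Hypothetical Arduino (ATmega2560) sketch of the loop described above.
const int TRIG = 22, ECHO = 23;          // one of the three HC-SR04 sonars
const int PWM_LEFT = 4, PWM_RIGHT = 5;   // motor driver PWM inputs

long readSonarCm() {
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG, LOW);
  long us = pulseIn(ECHO, HIGH, 30000UL);   // echo time, ~5 m timeout
  return us / 58;                           // microseconds to centimetres
}

void setup() {
  pinMode(TRIG, OUTPUT); pinMode(ECHO, INPUT);
  pinMode(PWM_LEFT, OUTPUT); pinMode(PWM_RIGHT, OUTPUT);
  Serial1.begin(9600);    // external GPS receiver (NMEA, not parsed here)
  Serial2.begin(9600);    // Bluetooth converter to the RoboNav master
}

void loop() {
  long front = readSonarCm();
  if (front > 0 &amp;amp;&amp;amp; front &lt; 40) {     // obstacle ahead: stop
    analogWrite(PWM_LEFT, 0);
    analogWrite(PWM_RIGHT, 0);
  } else if (Serial2.available()) {          // e.g. "120 120\n" from master
    int l = Serial2.parseInt();
    int r = Serial2.parseInt();
    analogWrite(PWM_LEFT, constrain(l, 0, 255));
    analogWrite(PWM_RIGHT, constrain(r, 0, 255));
  }
}&lt;/pre>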

&lt;h2>&lt;a href='http://www.arbot.cz/' class='external'>ARBot&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/arbot.jpg'>&lt;img src='/competitions/robotour/2016/teams/arbot_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/sW_2RUfSADQ' class='external'>https://youtu.be/sW_2RUfSADQ&lt;/a>&lt;/div>

&lt;div class='p'>The robot is controlled by a ZedBoard with a Xilinx Zynq 7020 SoC.
It has 512 MB RAM and a 32 GB SD card; its heart is a dual-core ARM Cortex-A9 with
NEON at 600 MHz, and it contains a programmable gate array with 85 thousand logic cells.
The ZedBoard runs Linux, and the main programming environment is Mono with C#.&lt;/div>

&lt;div class='p'>The chassis is differentially driven, with a third, passive wheel. The hobby
wheels have a diameter of 17 cm, and the two PG36555126000-50.9K motors with encoders are
controlled by a professional Roboteq SDC2160 unit, which provides the necessary
traction.&lt;/div>
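
&lt;div class='p'>Since the chassis is differentially driven, the usual kinematics apply. The sketch
below converts a desired forward and angular velocity into per-wheel angular speeds using the
17 cm wheel diameter stated above; the track width and the example values are assumptions added
only for illustration.&lt;/div>

&lt;pre>#include &lt;cstdio>

// Differential-drive kinematics sketch for a chassis like the one above.
// Wheel diameter 0.17 m is from the text; the track width is an assumption.
const double WHEEL_RADIUS = 0.085;  // m (17 cm diameter)
const double TRACK_WIDTH  = 0.40;   // m, distance between the wheels (assumed)

struct WheelSpeeds { double left, right; };  // wheel angular speeds [rad/s]

// Convert body velocity (v in m/s, omega in rad/s) to wheel speeds.
WheelSpeeds bodyToWheels(double v, double omega) {
    WheelSpeeds w;
    w.left  = (v - omega * TRACK_WIDTH / 2.0) / WHEEL_RADIUS;
    w.right = (v + omega * TRACK_WIDTH / 2.0) / WHEEL_RADIUS;
    return w;
}

int main() {
    WheelSpeeds w = bodyToWheels(0.5, 0.3);   // 0.5 m/s forward, turning left
    std::printf("left %.2f rad/s, right %.2f rad/s\n", w.left, w.right);
    return 0;
}&lt;/pre>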

&lt;div class='p'>There are two ADNS-3080 optical odometers for further position refinement.
The next sense is „touch”: the robot has two tactile FSR sensors integrated in
the front bumper.&lt;/div>

&lt;div class='p'>A time-tested VectorNav VN-100 AHRS provides information about the robot's orientation.
Global position is handled by a u-blox NEO-7M GPS.&lt;/div>

&lt;div class='p'>The robot has two HC-SR04 sonars, which can be rotated by a model servo motor.&lt;/div>

&lt;div class='p'>The control unit for the model servos is an SSC-32.&lt;/div>

&lt;div class='p'>The robot will use a Logitech C920 camera this year.&lt;/div>

&lt;div class='p'>The energy source is four LiFePO4 cells with a capacity of 14.5 Ah, protected by
SBM.&lt;/div>

&lt;div class='p'>The chassis is built from 2 mm aircraft-model plywood and 7 mm spruce beams,
simply the model-building world. All parts were cut by laser.&lt;/div>

&lt;h2>&lt;a href='https://sites.google.com/site/cogitoteam/robotour-2016' class='external'>Cogito&lt;/a> (CH)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 228px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/cogito.jpg'>&lt;img src='/competitions/robotour/2016/teams/cogito_t.jpg' alt='' title='' class='border'  width='222' height='320'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/sVGSWT-eOoc' class='external'>https://youtu.be/sVGSWT-eOoc&lt;/a>&lt;/div>

&lt;pre>She was nice and she was beauty,
she was smart and she was fine,
just in one word, she was cutie,
and her name was Clementine.&lt;/pre>

&lt;div class='p'>&amp;hellip; all of it with a differentially driven platform, industrial laser 
scanner, home-made laser scanner, stereo camera, GPS and compass, highly 
modular message passing software architecture with computer vision and 
probabilistic localization &amp;hellip; simply irresistible.&lt;/div>

&lt;h2>Istrobotics (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/istrobotics.jpg'>&lt;img src='/competitions/robotour/2016/teams/istrobotics_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/Jt5gbtevDCA' class='external'>https://youtu.be/Jt5gbtevDCA&lt;/a>&lt;/div>

&lt;div class='p'>The base of the robot is a modified TRAXXAS E-MAXX (3903) RC model equipped with a
webcam, GPS, HC-SR04 sonars, an IMU with a 3D compass and a magnetic IRC encoder. This year
we will give RPLIDAR a second chance. Control of the robot and the basic
sensors is handled by an Arduino Mega; image processing and navigation run on an
Odroid XU4. The robot is programmed in C++ with OpenCV.&lt;/div>

&lt;h2>&lt;a href='http://www.jecc.de/' class='external'>JECC&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/jecc.jpg'>&lt;img src='/competitions/robotour/2016/teams/jecc_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://www.youtube.com/watch?v=izHRiv0AWOg' class='external'>https://www.youtube.com/watch?v=izHRiv0AWOg&lt;/a>&lt;/div>

&lt;div class='p'>Hardware:&lt;/div>

&lt;ul>
&lt;li>Industrial COM Express computer with 6th gen. Core i7, 8GB RAM, 64GB 
Flash&lt;/li>

&lt;li>GTX960 Graphics Card processing Deep Neural Networks and stereo vision&lt;/li>
&lt;/ul>

&lt;div class='p'>Custom Software using:&lt;/div>

&lt;ul>
&lt;li>Qt&lt;/li>

&lt;li>OpenCV&lt;/li>

&lt;li>Caffe&lt;/li>
&lt;/ul>

&lt;h2>&lt;a href='http://www.kamaro-engineering.de/' class='external'>Kamaro Beteigeuze&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/kamaro.jpg'>&lt;img src='/competitions/robotour/2016/teams/kamaro_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/ledYvBy0-bM' class='external'>https://youtu.be/ledYvBy0-bM&lt;/a>&lt;/div>

&lt;div class='p'>Four wheels, independently steered axles, off-road suspension.
An x86 PC, ARM microcontrollers, CAN bus and Ethernet.
Lidar (front/rear), GPS, a 9-DOF IMU and cameras.
ROS-based software.&lt;/div>

&lt;h2>&lt;a href='http://www.jecc.de/' class='external'>Lois, JECC&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 245px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/lois-jecc.jpg'>&lt;img src='/competitions/robotour/2016/teams/lois-jecc_t.jpg' alt='' title='' class='border'  width='239' height='320'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://www.youtube.com/watch?v=vqsAUxzY5mM' class='external'>https://www.youtube.com/watch?v=vqsAUxzY5mM&lt;/a>&lt;/div>

&lt;div class='p'>HARDWARE:&lt;/div>

&lt;ul>
&lt;li>Raspberry Pi 3 with Ubuntu MATE 16.04&lt;/li>

&lt;li>Obstacle recognition with SICK PLS Laser-Scanner&lt;/li>

&lt;li>Motor control via an AVR microcontroller&lt;/li>

&lt;li>Optical odometer&lt;/li>

&lt;li>BNO055 sensor&lt;/li>
&lt;/ul>

&lt;div class='p'>SOFTWARE:&lt;/div>

&lt;ul>
&lt;li>Programmed in C/C++&lt;/li>

&lt;li>Navigation via patched Navit-Software&lt;/li>
&lt;/ul>

&lt;h2>MART (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/mart.jpg'>&lt;img src='/competitions/robotour/2016/teams/mart_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://www.youtube.com/watch?v=6yKi9MFpmdo' class='external'>https://www.youtube.com/watch?v=6yKi9MFpmdo&lt;/a>&lt;/div>

&lt;div class='p'>The robot chassis is composed of two independent halves connected by axles,
mainly to handle complex terrain. The wheels are driven by industrial stepper
motors, each pair connected by a toothed belt; the robot is differentially driven.
The motors are controlled over a CAN bus. Power is provided by two gel lead-acid
accumulators from a UPS. The robot is controlled by a pair of BeagleBone Black
boards, which integrate an inertial unit, compass, GPS and sonars. There is a
special trailer designed for barrel transportation.&lt;/div>

&lt;h2>&lt;a href='http://www.th-deg.de/' class='external'>No! This is Patrick!&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/patrick.jpg'>&lt;img src='/competitions/robotour/2016/teams/patrick_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://www.youtube.com/watch?v=AggbYBLsKwI' class='external'>https://www.youtube.com/watch?v=AggbYBLsKwI&lt;/a>&lt;/div>

&lt;div class='p'>The main hardware consists of a Congatec TS180 COM Express module. For
the main processing it communicates with the Xbox One Kinect and the
Raspberry Pi, and displays the crucial information on a Krämer V-800
touchscreen.&lt;/div>

&lt;div class='p'>A Raspberry Pi reads the sensor values from the GPS and the Bosch BNO055
and transmits them to the Congatec module. It receives the steering and
speed values from the Congatec module, which are transmitted to the
Freescale KL25Z microcontroller, which in turn controls the RC car
model (1:5) via PWM signals.&lt;/div>
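
&lt;div class='p'>As a rough illustration of the last step in that chain, the sketch below maps a
normalized steering or speed command onto the 1 to 2 ms servo pulses that RC car electronics
typically expect; the neutral point, the range and the function name are assumptions for
illustration, not the team's code.&lt;/div>

&lt;pre>#include &lt;algorithm>
#include &lt;cstdio>

// Map a normalized command in [-1, 1] to a standard RC servo pulse width:
// 1000 us = full one way, 1500 us = neutral, 2000 us = full the other way.
// A microcontroller such as the KL25Z would generate the matching PWM signal.
int commandToPulseUs(double cmd) {
    cmd = std::max(-1.0, std::min(1.0, cmd));   // clamp to the valid range
    return static_cast&lt;int>(1500.0 + 500.0 * cmd);
}

int main() {
    // Example: steering slightly left, half forward speed.
    std::printf("steering pulse: %d us\n", commandToPulseUs(-0.2)); // 1400
    std::printf("throttle pulse: %d us\n", commandToPulseUs(0.5));  // 1750
    return 0;
}&lt;/pre>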

&lt;h2>&lt;a href='http://skaner.p.lodz.pl/news.php' class='external'>Quadrons&lt;/a> (PL)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/quadrons.jpg'>&lt;img src='/competitions/robotour/2016/teams/quadrons_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/cQRcvG4eTOY' class='external'>https://youtu.be/cQRcvG4eTOY&lt;/a>&lt;/div>

&lt;div class='p'>MECHANICS:&lt;/div>

&lt;div class='p'>Quadron is a quadricycle, (ultimately) autonomous mobile robot. Its
mechanical construction has a car-like kinematic configuration, which
means that there are two steered wheels on the front axle, while the
rigid rear axle propels the vehicle (2WD). The base of the structure is
an electric quad produced by Lifestyle-4-U GmbH.
The goal was to create a universal platform which can be used in
many independent fields and disciplines. That is why we have introduced
a number of electrical and mechanical modifications, especially in the
supporting frame. After these changes our robot gained the ability to
transport fairly large loads, in extreme cases up to 60 kg, at a curb
weight of about 60 kg. Then we adapted the drive systems and steering
mechanisms so that they can be operated by electric motors. A 500 W DC motor
is responsible for the movement of the vehicle, while a DC gear motor with a
coupler system is responsible for the movement of the steering wheel. Each motor
was equipped with a driver, encoders and safety features.&lt;/div>

&lt;div class='p'>ELECTRONICS:&lt;/div>

&lt;div class='p'>The robot is equipped with a number of sensors and modules responsible
for safety, localization and orientation in the field. On board are:&lt;/div>

&lt;ul>
&lt;li>Emergency stop&lt;/li>

&lt;li>Encoders with an IMU sensor to determine the position and orientation&lt;/li>

&lt;li>GPS module which determines the position of the vehicle&lt;/li>

&lt;li>SICK 2D laser scanner and ultrasonic sensors located on the front of
the vehicle, which allow the robot to see what is in front of it&lt;/li>

&lt;li>Vision system to assist control&lt;/li>

&lt;li>Power supply: 3x12 V batteries, in order to achieve the necessary voltage
levels&lt;/li>
&lt;/ul>

&lt;div class='p'>The multi-layer control concept required computers at different levels:
an STM Nucleo board is responsible for the lower layer, while the higher layer
runs on a PC.&lt;/div>

&lt;div class='p'>SOFTWARE:&lt;/div>

&lt;div class='p'>To effectively manage and control the robot, we adopted the concept of
multi-layer software. We decided that the best solution would be to use the
ROS platform (the Robot Operating System). As a result, we gained the ability
to connect all levels: the lowest layer is responsible for motor control, the
middle layer for manual control using a wireless gamepad and simple autonomy,
and the highest layer collects data from sensors and vision, coordinates the
designated route and decides where and how the vehicle moves.&lt;/div>

&lt;div class='p'>Work on the top, fully autonomous layer is still in progress, but we are
at an advanced stage.&lt;/div>
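
&lt;div class='p'>To make the layering a little more concrete, here is a minimal sketch of what the
middle (manual control) layer could look like as a ROS node in C++: it listens to a wireless
gamepad topic and republishes velocity commands for the low-level motor layer. The topic names,
axis indices and scaling factors are assumptions for illustration, not the team's actual
software.&lt;/div>

&lt;pre>#include &lt;ros/ros.h>
#include &lt;geometry_msgs/Twist.h>
#include &lt;sensor_msgs/Joy.h>

// Middle layer sketch: wireless gamepad -> velocity command for the
// low-level (Nucleo) motor layer. Topic names and axis mapping are assumed.
ros::Publisher cmd_pub;

void joyCallback(const sensor_msgs::Joy::ConstPtr&amp;amp; joy) {
  geometry_msgs::Twist cmd;
  cmd.linear.x  = 1.0 * joy->axes[1];   // forward/backward stick
  cmd.angular.z = 0.8 * joy->axes[0];   // steering stick
  cmd_pub.publish(cmd);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "manual_layer");
  ros::NodeHandle nh;
  cmd_pub = nh.advertise&lt;geometry_msgs::Twist>("cmd_vel", 10);
  ros::Subscriber joy_sub = nh.subscribe("joy", 10, joyCallback);
  ros::spin();   // the autonomy layer would publish to cmd_vel as well
  return 0;
}&lt;/pre>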

&lt;h2>&lt;a href='http://www.kufr.cz/' class='external'>Radioklub Písek - TCVVI&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/radioklub-pisek.jpg'>&lt;img src='/competitions/robotour/2016/teams/radioklub-pisek_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/HwBSFMnEKys' class='external'>https://youtu.be/HwBSFMnEKys&lt;/a>&lt;/div>

&lt;div class='p'>&lt;a href='https://www.youtube.com/watch?v=bAk96kyaTgc' class='external'>E-liška&lt;/a> &amp;hellip;  is by now a
traditional robot of Radioklub Písek. There are always some changes, or rather
major vehicle rebuilds. This time Eliška has 4x4 drive, with a new motor in each
wheel. The independent suspension of all wheels has also been reworked, including the
problematic steering (hopefully finally perfect).&lt;/div>

&lt;div class='p'>The main control computer is a notebook with an Intel processor and a solid-state disk.
Secondary computers, controlling various subsystems, use ARM processors. The
energy is stored in gel lead-acid batteries. A novelty is the loading crane on which we
are currently working intensively. Unfortunately the new space arrangement forces
us to make more changes than we originally anticipated &amp;hellip;&lt;/div>

&lt;h2>&lt;a href='http://raptors.p.lodz.pl' class='external'>Raptors&lt;/a> (PL)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/raptors.jpg'>&lt;img src='/competitions/robotour/2016/teams/raptors_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/VuRul6QmGOs' class='external'>https://youtu.be/VuRul6QmGOs&lt;/a>&lt;/div>

&lt;div class='p'>Our rover, called Raptor, consists of about 2000 parts designed by our
team. We used AutoCad Inventor software in the design process. Our
robot consists of a couple of modules that can work quite
independently. The main module is the robot’s body, which houses almost all
of the electronics and batteries. The rover is equipped with a 6-wheel
drive system mounted on a rocker-bogie suspension. The batteries provide
energy for 90 minutes of continuous work and have special compartments with a
quick-change mechanism. Our on-board computer system is based on an SB-RIO
programmed in the LabVIEW environment. We are able to supervise all
useful parameters concerning robot localization and orientation, which
make up the navigation system. The navigation system runs on an independent
software and hardware platform. Dedicated software is written in C++
within the Robot Operating System (ROS). Our system runs on multiple
machines with different hardware architectures, file systems and OS, but
that is not a problem for ROS. Inertial data is provided by a high-quality
Inertial Measurement Unit with six degrees of freedom, containing a
gyroscope and an accelerometer. The sensor’s output is three-axis angular
velocity and linear acceleration. Using specialized filters the system
computes a very accurate orientation. The GPS module outputs position data
with one-meter accuracy. This data passes through the Raspberry Pi 3.
The weight of the Raptors rover differs depending on the configuration,
reaching a maximum of 50 kg in the most complex version.&lt;/div>

&lt;div class='p'>Our robot uses several safety systems. In an emergency you can use a
DIN-style red safety button, which cuts off all active parts of the rover,
or send an emergency message from the base station.
Moreover, we are able to fully separate the batteries from all electric
modules of the robot by switching off the switches located at the back plate of
the chassis.&lt;/div>

&lt;h2>&lt;a href='http://kempelen.ii.fmph.uniba.sk/rg/' class='external'>Smely Zajko&lt;/a> (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/smely-zajko.jpg'>&lt;img src='/competitions/robotour/2016/teams/smely-zajko_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://www.youtube.com/watch?v=PcLq-2B7rR4' class='external'>https://www.youtube.com/watch?v=PcLq-2B7rR4&lt;/a>&lt;/div>

&lt;div class='p'>HARDWARE&lt;/div>

&lt;ul>
&lt;li>Parallax Motor Mount &amp;amp; Wheel Kit (old one) with speed encoders&lt;/li>

&lt;li>2x HB-25 motor controller&lt;/li>

&lt;li>Sbot board (based on AVR ATmega128, low-level control board, hopefully
soon to be replaced with an STM32F103 board)&lt;/li>

&lt;li>Panasonic SDR T-50 camcorder&lt;/li>

&lt;li>Diamond Multimedia One-Touch Video Capture VC500&lt;/li>

&lt;li>Hokuyo laser scanner&lt;/li>

&lt;li>5x SRF-08 ultrasonic sensors&lt;/li>

&lt;li>GPS NaviLock NL-302U USB SiRF III&lt;/li>

&lt;li>Compass with tilt compensation (HMC6343)&lt;/li>

&lt;li>AVR ATmega8 (compass driver)&lt;/li>

&lt;li>usual USB hub&lt;/li>

&lt;li>Power: HAZE HZS 12 V 9 Ah&lt;/li>

&lt;li>handmade wood &amp;amp; aluminium base&lt;/li>

&lt;li>red power switch and power circuitry&lt;/li>

&lt;li>HK6S remote control console + receiver unit for easier transport&lt;/li>

&lt;li>ASUS X552M PC as main controller&lt;/li>
&lt;/ul>

&lt;div class='p'>SOFTWARE&lt;/div>

&lt;ul>
&lt;li>Ubuntu 16.04 LTS&lt;/li>

&lt;li>C++ app developed in NetBeans - https://github.com/Robotics-DAI-FMFI-UK/smely-zajko&lt;/li>

&lt;li>AVRStudio for the Sbot board&lt;/li>

&lt;li>ChibiStudio for the STM32F103 board&lt;/li>

&lt;li>FANN library for training and evaluating neural networks (see the sketch below)&lt;/li>
&lt;/ul>
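
&lt;div class='p'>For readers who have not met FANN before, this is roughly how a network is created,
trained and evaluated with its C API (which is also usable from the C++ application). The layer
sizes, file names and the road/not-road interpretation are assumptions for illustration, not the
team's actual configuration.&lt;/div>

&lt;pre>#include &lt;fann.h>   // FANN C API

int main() {
    // Hypothetical setup: a small net classifying image patches as road / not road.
    struct fann *ann = fann_create_standard(3, 27 /*inputs*/, 10 /*hidden*/, 1 /*output*/);
    fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

    // Train from a FANN-format data file (assumed name) and save the result.
    fann_train_on_file(ann, "road_patches.data", 500 /*max epochs*/,
                       10 /*report every*/, 0.01f /*desired error*/);
    fann_save(ann, "road.net");

    // Evaluate one sample (27 features, placeholder values).
    fann_type input[27] = {0};
    fann_type *output = fann_run(ann, input);
    // output[0] > 0 would be read as "road" in this sketch.

    fann_destroy(ann);
    return 0;
}&lt;/pre>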

&lt;h2>&lt;a href='http://www.th-deg.de/' class='external'>WallE&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/teams/walle.jpg'>&lt;img src='/competitions/robotour/2016/teams/walle_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube &lt;a href='https://youtu.be/v979YcBIVLQ' class='external'>https://youtu.be/v979YcBIVLQ&lt;/a>&lt;/div>

&lt;div class='p'>The main hardware consists of a Microsoft Surface tablet. For the main
processing, it communicates with the Xbox One Kinect and the Raspberry Pi
and displays the crucial information on its main screen.
A Raspberry Pi reads the sensor values from the GPS and the Bosch BNO055
and transmits them to the Microsoft Surface. It receives the steering
and speed values from the Microsoft Surface, which are transmitted to
the Freescale KL25Z microcontroller, which in turn controls the
continuous-track model via H-bridges.&lt;/div>

&lt;div class='p'>This is the second team from Technische Hochschule Deggendorf (th-deg).&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to support this contest in any way or you have
comments or questions, please use our standard &lt;a href='/competitions/robotour/2016/teams/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Robotour 2016</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2016/en"/>
	<id>http://localhost/competitions/robotour/2016/en</id>
	<updated>2016-05-04T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> The 11th year of the Robotour contest of autonomous outdoor robots will take place
in Germany &amp;mdash; the last not-yet-visited neighbour of the contest's country of origin. New
this year are extra bonus points for automatic payload loading at the start position and
unloading at the goal position. Moreover, some contestants may be excited about the
lower payload requirements when automatic loading is used. Note that this is
an optional feature, so robots from previous years can be used without any
modification. The contest will take place on &lt;b>17th September 2016 in
Deggendorf&lt;/b> as part of the „Czech-Bavarian week”.
 </summary>
	<content type='html'> 
&lt;h1>Rules&lt;/h1>

&lt;div class='p'>There are slight changes in the rules this year: the robot can automatically load
its payload and also unload it at the destination, gaining extra bonus points. There is
also an alternative to the beer barrel &amp;mdash; 500 ml cans, if they are loaded
automatically. And a small note: the rule from last year, &lt;i>the robot with the
highest score starts from the last position&lt;/i>, will also be used in Robotour
2016.&lt;/div>

&lt;div class='p'>The detailed rules are available on GitHub under the tag ROBOTOUR2016:
&lt;a href='https://github.com/robotika/robotour/blob/ROBOTOUR2016/rules/rules.md' class='external'>English&lt;/a>,
&lt;a href='https://github.com/robotika/robotour/blob/ROBOTOUR2016/rules/pravidla.md' class='external'>Czech&lt;/a>&lt;/div>

&lt;h2>Examples of roads in Deggendorf&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/deggendorf-7831.jpg'>&lt;img src='/competitions/robotour/2016/deggendorf-7831_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/deggendorf-7836.jpg'>&lt;img src='/competitions/robotour/2016/deggendorf-7836_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/deggendorf-7846.jpg'>&lt;img src='/competitions/robotour/2016/deggendorf-7846_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/deggendorf-7853.jpg'>&lt;img src='/competitions/robotour/2016/deggendorf-7853_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/deggendorf-7855.jpg'>&lt;img src='/competitions/robotour/2016/deggendorf-7855_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2016/deggendorf-7857.jpg'>&lt;img src='/competitions/robotour/2016/deggendorf-7857_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Location&lt;/h2>

&lt;div class='p'>The contest will take place in several neighbouring locations, see the
&lt;a href='http://www.openstreetmap.org/relation/959809#map=16/48.8305/12.9544&amp;amp;layers=D' class='external'>map&lt;/a>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 564px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotour/2016/map.png' alt='map overview' title='map overview' class='border'  width='558' height='453'/>&lt;/span>&lt;br/>
&lt;span>map overview&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The base will be at the
&lt;a href='https://en.wikipedia.org/wiki/Deggendorf_Institute_of_Technology' class='external'>Technische
Hochschule Deggendorf&lt;/a>, part of the contest in the
&lt;a href='https://de.wikipedia.org/wiki/Landesgartenschau_Deggendorf_2014#Donaupark' class='external'>Donaupark&lt;/a>,
and also in the
&lt;a href='https://de.wikipedia.org/wiki/Landesgartenschau_Deggendorf_2014#Stadthallenpark' class='external'>Stadthallenpark&lt;/a>
and finally on the university campus grounds.&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to support this contest in any way or you have
comments or questions, please use our standard &lt;a href='/competitions/robotour/2016/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Tour the Stairs 2015</title>
	<link rel='alternate' href="http://localhost/competitions/tour-the-stairs/2015/en"/>
	<id>http://localhost/competitions/tour-the-stairs/2015/en</id>
	<updated>2015-09-25T00:00:00Z</updated>
	<author><name>Martin Dlouhý &amp; Christian Gjørret</name></author>
	<summary type='html'> The second year of the contest „Tour the Stairs” will take place on Saturday
&lt;b>28th November 2015&lt;/b>. The robots can climb the stairs both up and down this year,
to make it more interesting for last year's participants. The location is the
same: the NTK Gallery in Prague, Dejvice. &lt;b>UPDATE:&lt;/b> 7/1 &amp;mdash;
&lt;a href='/competitions/tour-the-stairs/2015/en#160107'>minivideo&lt;/a>
 </summary>
	<content type='html'> 
&lt;h1>Rules for year 2015&lt;/h1>

&lt;div class='p'>The basic rules remain the same as &lt;a href='/competitions/tour-the-stairs/2014/en'>last
year&lt;/a>. There is an extra extension for the case when you successfully complete
climbing up the stairs: the robot can then climb down and get extra points (1
step = 1 point). Note that if the robot starts to climb down in the middle of the
stairs it will actually lose points!&lt;/div>

&lt;div class='p'>Christian wanted extra „artist bonus points”, and I did not succeed in talking
him out of it. &lt;span class='wink'>&lt;/span> The rule is that if more than 50% of the robot body is made
from &lt;b>wood&lt;/b> and the robot completes at least one step in one of the runs,
it will gain an extra bonus of 25 points. It means that this year you can gain up to
4*2*21+25=193 points.&lt;/div>

&lt;div class='p'>Also note that there is now a dedicated website for this contest:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='http://tour-the-stairs.com/' class='external'>http://tour-the-stairs.com/&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>P.S. Note that the &lt;a href='/competitions/tour-the-stairs/en#registration'>registration
form&lt;/a> will be the same for all years and is thus available on the main page of
the contest.&lt;/div>

&lt;hr/>

&lt;h1>UPDATES&lt;/h1>

&lt;div class='p'>&lt;a id="151101">&lt;/a>&lt;/div>

&lt;h2>1st November 2015 &amp;mdash; Registration video KRA Písek&lt;/h2>

&lt;div class='p'>I have two pieces of news for you: a good one and a bad one. The good news is that the „Club of
Robotics and Automation” (KRA) from Písek fulfilled their promise and
recorded their spider-like robot completing one step:&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/A1QM6_73UWQ?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>The bad news is that the registration deadline was yesterday and at the moment
KRA Písek is the only registered team! If you just overlooked the deadline,
do not worry: fill in the registration form and preferably also send me an email.&lt;/div>

&lt;div class='p'>Christian and I agreed that in a case like this I will prepare an opponent robot
&amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="151109">&lt;/a>&lt;/div>

&lt;h2>9th November 2015 &amp;mdash; Husky won't make it&lt;/h2>

&lt;div class='p'>Last week we gave the robot &lt;a href='/robots/husky/en'>Husky&lt;/a> a try &amp;mdash; to see if it could (without
modification) climb the stairs, &amp;hellip; and it failed:&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/zCQURTvGf6U?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>I talked with Christian and I was quite surprised by one comment: that for
some roboticists the TTS contest is only a mechanical challenge, and no intelligence is
needed. Well, maybe it is time to sketch plans for 2016, when the robot will
compete in a loop: climb one staircase, navigate on the first floor to the second
staircase, climb it down, and then return to the start location. Better? &lt;span class='smile'>&lt;/span> Why not
get ready now and practise already this year?&lt;/div>

&lt;div class='p'>One more note: on Saturday there will also be
&lt;a href='http://cafe-neu-romance.com/program/cnr-2015-lectures' class='external'>lectures&lt;/a> in parallel
to the contest; for example
&lt;a href='http://cafe-neu-romance.com/program/cnr-2015-lectures/cnr-2015-lectures-jean-baptiste-mouret-%28fra%29-robots-that-can-adapt-to-damage-in-minutes' class='external'>
Jean-Baptiste Mouret&lt;/a>, who will talk about robot adaptation to damage, could
be quite interesting &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="151128">&lt;/a>&lt;/div>

&lt;h2>28th November 2015 &amp;mdash; Results and Lamia videos&lt;/h2>

&lt;div class='p'>Two competing teams are not many, but I think it was still an interesting
show for visitors of the NTK Gallery. &lt;span class='smile'>&lt;/span> And it was funny to realize that both
teams ended with the same rank:&lt;/div>

&lt;div class='p'>&lt;table border="1">
	&lt;tr>
		&lt;th>Rank&lt;/th>
		&lt;th>Team&lt;/th>
		&lt;th>1st straight&lt;/th>
		&lt;th>1st spiral&lt;/th>
		&lt;th>2nd straight&lt;/th>
		&lt;th>2nd spiral&lt;/th>
		&lt;th>3rd straight&lt;/th>
		&lt;th>3rd spiral&lt;/th>
		&lt;th>4th straight&lt;/th>
		&lt;th>Total&lt;/th>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">1st/2nd&lt;/td>
		&lt;td>&lt;b>KRA Písek&lt;/b>&lt;/td>
		&lt;td>4&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>5&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>5&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>6&lt;/td>
		&lt;td>26&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">1st/2nd&lt;/td>
		&lt;td>&lt;b>Lamia&lt;/b>&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>3&lt;/td>
		&lt;td>3&lt;/td>
		&lt;td>3&lt;/td>
		&lt;td>10&lt;/td>
		&lt;td>3&lt;/td>
		&lt;td>26&lt;/td>
	&lt;/tr>
&lt;/table>&lt;/div>

&lt;h3>Lamia - Tour the Stairs 2015 (straight)&lt;/h3>

&lt;div class='p'>&lt;center> &lt;iframe width="640" height="360"
src="https://www.youtube.com/embed/q4me_P9ja3M?feature=player_detailpage"
frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;h3>Lamia - Tour the Stairs 2015 (twisted)&lt;/h3>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360"
src="https://www.youtube.com/embed/-a-FW6YXUgc?feature=player_detailpage"
frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="160107">&lt;/a>&lt;/div>

&lt;h2>7th January 2016 &amp;mdash; minivideo&lt;/h2>

&lt;div class='p'>Here are two short videos from Christian:&lt;/div>

&lt;h3>KRA Písek
&lt;center>
&lt;iframe src="https://player.vimeo.com/video/150148266" width="500" height="281" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/h3>

&lt;h3>Lamia
&lt;center>
&lt;iframe src="https://player.vimeo.com/video/150147833" width="500" height="281" frameborder="0" webkitallowfullscreen mozallowfullscreen allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/h3>

&lt;div class='p'>plus one older one &amp;hellip;&lt;/div>

&lt;h3>Czech TV&lt;/h3>

&lt;ul>
&lt;li>&lt;a href='http://www.ceskatelevize.cz/ivysilani/1097206490-udalosti-v-kulture/215411000121128/obsah/437896-prehlidka-robotiky' class='external'>ceskatelevize.cz/ivysilani/1097206490-udalosti-v-kulture&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 362px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/tour-the-stairs/2015/ct-art.jpg'>&lt;img src='/competitions/tour-the-stairs/2015/ct-art_t.jpg' alt='KRA Písek on ČT art' title='KRA Písek on ČT art' class='border'  width='356' height='200'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/tour-the-stairs/2015/ct-art.jpg'>KRA Písek on ČT art&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/competitions/tour-the-stairs/2015/en#email'>contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Results</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2015/results/en"/>
	<id>http://localhost/competitions/robotour/2015/results/en</id>
	<updated>2015-09-10T00:00:00Z</updated>
	<author><name>Martin Dlouhý, photo Zdeněk Kakáč</name></author>
	<summary type='html'> The 10th year of the Robotour contest, which took place in Písek, had several
special features. There were several runs directly in the city, i.e. not only on park
roads.  And the biggest surprise was the victory of a new German team from Karlsruhe!
&lt;span class='smile'>&lt;/span>
 </summary>
	<content type='html'> 
&lt;h1>Competition&lt;/h1>

&lt;h2>Attendance&lt;/h2>

&lt;div class='p'>There were nine teams registered and nine teams also actively participated &lt;span class='smile'>&lt;/span>.
In reality, AmBot had serious troubles the day before the contest and canceled
its participation, and another German team,
&lt;a href='https://www.youtube.com/watch?v=29UI9oZJr1c' class='external'>JECC2&lt;/a>, started instead. There was only one Robotour
novice this year: team &lt;i>Kamaro Engineering&lt;/i>, which we met at the
&lt;a href='/competitions/fieldrobot/2015/en'>&lt;span class='cs'>Field Robot Event 2015&lt;/span>&lt;/a> contest. To be honest, I was
not quite sure whether they would come. But they did, their robot moved, and with a
bit of luck they won the contest!&lt;/div>

&lt;h2>The Place&lt;/h2>

&lt;div class='p'>It would not be the group from Písek if they did not prepare something
special. &lt;span class='wink'>&lt;/span> Quite challenging
sections were prepared, with the excuse that the 10th year deserves something extra. The runs were partially in parks, i.e. on park
roads, but also on normal streets, cobblestones and surfaces of various colors.
There was even a cemetery, or rather a „reverential place”, where I was a
bit uncomfortable, but weddings were also held there and youngsters played
football there, so the robots behaved with comparatively more respect for the deceased.&lt;/div>

&lt;div class='p'>Another tricky place was a pedestrian bridge with a wheelchair-accessible slope to an
island. And if you think that this was it, what about the passageway
through a house, which leads from the square to the park „Palackého sady”? This is the
big park you may know from the &lt;a href='/competitions/robotem-rovne/en'>Robotem rovně&lt;/a>
contest. What pleased me was the fact that no team complained. Just the
opposite: several teams praised this city tour highly. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;h2>0th and 1st Run&lt;/h2>

&lt;div class='p'>The zero run, where we review the start procedure, was surprisingly not as chaotic as in
previous years. The start and goal then remained the same for the first run.&lt;/div>

&lt;div class='p'>We witnessed Murphy's law in practice &amp;mdash; &lt;i>Kamaro Engineering&lt;/i>
completed the straight park road, crossed the street and found the way through
the gate to the cemetery. Then they chose the wrong direction to avoid a tree and
left the road. That was the practice run. In the real run the robot did not
even start.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/kamaro.jpg'>&lt;img src='/competitions/robotour/2015/results/kamaro_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/kamaro2.jpg'>&lt;img src='/competitions/robotour/2015/results/kamaro2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/kamaro-gate.jpg'>&lt;img src='/competitions/robotour/2015/results/kamaro-gate_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;i>JECC2&lt;/i> went on odometry only, without GPS. The robot was only several
hours old so it was nice to see that it worked reasonably well, and the full
barrel was not a problem.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/jecc2.jpg'>&lt;img src='/competitions/robotour/2015/results/jecc2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/jecc2b.jpg'>&lt;img src='/competitions/robotour/2015/results/jecc2b_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/jecc2c.jpg'>&lt;img src='/competitions/robotour/2015/results/jecc2c_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Last year's winner &lt;i>Smelý Zajko&lt;/i> entered
with a new dose of courage loaded into its neural network. There was a new sensor (a Hokuyo laser scanner), which
kept the robot away from the cemetery wall. After several minutes of hesitation,
moving a little forward and a little back, the direction from the camera finally
won, and in the real run &lt;i>Smelý Zajko&lt;/i> managed to enter the gate, go through
the place of rest and leave through a small gate on the other side. Its run was
terminated by another trap: a white car in the parking place. The color was
classified as potential road, but the dark wheels were not. And so the robot oscillated
and never got past the white car. Next year this situation should be solved
by better integration of the new LIDAR.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/smely-zajko3.jpg'>&lt;img src='/competitions/robotour/2015/results/smely-zajko3_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/smely-zajko2.jpg'>&lt;img src='/competitions/robotour/2015/results/smely-zajko2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/zajko-wall.jpg'>&lt;img src='/competitions/robotour/2015/results/zajko-wall_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>I would mention one more luckless fellow: &lt;i>MarS&lt;/i>. Here, too, the camera played
the primary role in navigation, and the concrete columns colored like the road
attracted the robot and caused a collision.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/mars.jpg'>&lt;img src='/competitions/robotour/2015/results/mars_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/mars2.jpg'>&lt;img src='/competitions/robotour/2015/results/mars2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/column.jpg'>&lt;img src='/competitions/robotour/2015/results/column_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>And what about &lt;i>E_liška&lt;/i> from Radioklub Písek? She went fine, but missed the
gate. She flirted for a while with the cemetery wall, but then we witnessed AI
in practice (a carper would probably say it was just luck), when she replanned the route and
chose an alternative around the cemetery. And she almost succeeded (the
failure came near the goal).&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/radioklub-pisek.jpg'>&lt;img src='/competitions/robotour/2015/results/radioklub-pisek_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/radioklub-pisek2.jpg'>&lt;img src='/competitions/robotour/2015/results/radioklub-pisek2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run1.jpg'>&lt;img src='/competitions/robotour/2015/results/run1_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>2nd Run&lt;/h2>

&lt;div class='p'>The second run was recorded from the air by Dan Polák, so you can have a look:&lt;/div>

&lt;div class='p'>&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/U-IakVdhsRg?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;div class='p'>These shots look to me like some mass accident on a robotic highway. It was
supposed to be a simple run, but it was not. The start was on a cycle path
along the river Otava, slightly curved. The direction was south &amp;mdash; so was the sun the
source of the problems? There were stripes painted on the road; was that the problem? I do not
know. In any case the robots had problems and nobody completed the first curve. This means that
also nobody reached the first crossing to turn left onto the bridge, nor the second junction to the
island. Once in a while there was some adrenalin, for example when &lt;i>E_liška&lt;/i> wrongly
evaluated a surface change and turned left. There was a steep slope down to the river.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run2.jpg'>&lt;img src='/competitions/robotour/2015/results/run2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run2b.jpg'>&lt;img src='/competitions/robotour/2015/results/run2b_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run2c.jpg'>&lt;img src='/competitions/robotour/2015/results/run2c_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>3rd Run&lt;/h2>

&lt;div class='p'>After the experience from the first two runs, we decided not to trouble the robots with the
uphill drive to „Putimská brána”.
Instead we moved the start to the beginning
of the pedestrian zone along the river. The teams and the batteries had some time to
recover over the lunch break, and you could tell. Somebody would say
that this was a simple track, somebody else just the opposite.&lt;/div>

&lt;div class='p'>Well, there were not many ways you could turn away: there was a wall on the left
and houses on the right. But a low kerb was a killer for several robots. Also, a wide
dark sewer scared off &lt;i>MarS&lt;/i>, for example, and it turned 180 degrees away from it.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/mars-stop.jpg'>&lt;img src='/competitions/robotour/2015/results/mars-stop_t.jpg' alt='sewer as an obstacle' title='sewer as an obstacle' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2015/results/mars-stop.jpg'>sewer as an obstacle&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/broken-eliska.jpg'>&lt;img src='/competitions/robotour/2015/results/broken-eliska_t.jpg' alt='broken wheel' title='broken wheel' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2015/results/broken-eliska.jpg'>broken wheel&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/ndteam3.jpg'>&lt;img src='/competitions/robotour/2015/results/ndteam3_t.jpg' alt='NDTeam' title='NDTeam' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2015/results/ndteam3.jpg'>NDTeam&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>This run was a tragedy for &lt;i>E_liška&lt;/i> from team &lt;i>Radioklub
Písek&lt;/i>, who had been winning so far. Due to repeated motion back and forth she broke a wheel and did not
participate in the remaining runs.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/kamaro-run3.jpg'>&lt;img src='/competitions/robotour/2015/results/kamaro-run3_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/kamaro-run3b.jpg'>&lt;img src='/competitions/robotour/2015/results/kamaro-run3b_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/kamaro-near-bridge.jpg'>&lt;img src='/competitions/robotour/2015/results/kamaro-near-bridge_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;i>NDTeam&lt;/i> scored in this run, and the most points were gained by &lt;i>Kamaro Engineering&lt;/i>. They had
a problem with their onboard computer, so they mounted a notebook on the robot instead.
And it worked. After the experience of the first runs they removed from their code the procedure
for returning to
the start &amp;hellip; and I believe that they would not have reached the goal if that code had still been
there. &lt;span class='wink'>&lt;/span> So it was the only team which managed to navigate between the houses and
narrow streets along the river all the way to the Stone bridge. They finished 8
meters from the goal because their goal radius was set to 10 m.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run3.jpg'>&lt;img src='/competitions/robotour/2015/results/run3_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run3b.jpg'>&lt;img src='/competitions/robotour/2015/results/run3b_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run3c.jpg'>&lt;img src='/competitions/robotour/2015/results/run3c_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>4th Run&lt;/h2>

&lt;div class='p'>The last run was on a wide street: cobblestones, uphill. These conditions did
not suit &lt;i>MarS&lt;/i> in particular &amp;mdash; the small robot did not have enough power to pull
the carriage with the beer, and its small wheels got stuck in the joints between the tiles. Kamaro
scored again, and a lot. I expected that there were only two ways to enter
the park through the houses, but they found a third one. Unfortunately it was very
narrow, with restaurant tables on both sides, so there was no chance to guess
the correct direction. Even so, the goal was just on the other side of the house, and it
was enough for a convincing victory!&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run4.jpg'>&lt;img src='/competitions/robotour/2015/results/run4_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/run4b.jpg'>&lt;img src='/competitions/robotour/2015/results/run4b_t.jpg' alt='' title='' class='border'  width='220' height='147'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/winners.jpg'>&lt;img src='/competitions/robotour/2015/results/winners_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Total Score&lt;/h2>

&lt;div class='p'>&lt;table border="1">
	&lt;tr>
		&lt;th>Place&lt;/th>
		&lt;th>Team&lt;/th>
		&lt;th>1st Run&lt;/th>
		&lt;th>2nd Run&lt;/th>
		&lt;th>3rd Run&lt;/th>
		&lt;th>4th Run&lt;/th>
		&lt;th>Total&lt;/th>
	&lt;/tr>
	&lt;tr bgcolor="yellow">
		&lt;td align="center">1.&lt;/td>
		&lt;td>&lt;b>Kamaro Engineering&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>12&lt;/td>
		&lt;td>385&lt;/td>
		&lt;td>221&lt;/td>
		&lt;td>618&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">2.&lt;/td>
		&lt;td>&lt;b>Radioklub Písek&lt;/b>&lt;/td>
		&lt;td>148&lt;/td>
		&lt;td>74&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>222&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">3.&lt;/td>
		&lt;td>&lt;b>Smelý Zajko&lt;/b>&lt;/td>
		&lt;td>122&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>3&lt;/td>
		&lt;td>31&lt;/td>
		&lt;td>156&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">4.&lt;/td>
		&lt;td>&lt;b>ARBot&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>28&lt;/td>
		&lt;td>12&lt;/td>
		&lt;td>86&lt;/td>
		&lt;td>126&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">5.&lt;/td>
		&lt;td>&lt;b>NDTeam&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>75&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>77&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">6.&lt;/td>
		&lt;td>&lt;b>MarS&lt;/b>&lt;/td>
		&lt;td>44&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>46&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">7.&lt;/td>
		&lt;td>&lt;b>JECC2&lt;/b>&lt;/td>
		&lt;td>26&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>9&lt;/td>
		&lt;td>35&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">8.&lt;/td>
		&lt;td>&lt;b>Istrobotics&lt;/b>&lt;/td>
		&lt;td>4&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>17&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>23&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">9.&lt;/td>
		&lt;td>&lt;b>JECC&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>1&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
&lt;/table>&lt;/div>

&lt;h1>Workshop and PAIR'15&lt;/h1>

&lt;div class='p'>There was a „classic” workshop on Sunday. It was coordinated this year with the student
conference &lt;a href='http://robotics.fel.cvut.cz/pair15/' class='external'>PAIR&amp;#039;15&lt;/a> (Planning in
Artificial Intelligence and Robotics), organized by ČVUT FEL Praha.
The overlap was useful for both sides: the students learned something from the
experiences of the Robotour participants, and the competitors got some idea about the current
status of scientific research. The invited talk, about Roboauto, was presented by
Honza Najvárek.&lt;/div>

&lt;h1>Technology&lt;/h1>

&lt;div class='p'>There were some interesting technologies used by teams this year:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='https://en.wikipedia.org/wiki/Odroid' class='external'>Odroid&lt;/a> (NDTeam, Istrobotics)&lt;/li>

&lt;li>laser scanner from a robotic vacuum cleaner (Istrobotics, Kamaro Engineering)&lt;/li>

&lt;li>&lt;a href='https://en.wikipedia.org/wiki/Field-programmable_gate_array' class='external'>FPGA&lt;/a> (ARBot)&lt;/li>

&lt;li>modified BLDC in-wheel motors (Radioklub Písek)&lt;/li>
&lt;/ul>

&lt;h1>Roboauto&lt;/h1>

&lt;div class='p'>A nice „diversification” of the competition was the visit of Roboauto from Brno. You
may remember them from previous years, when their robots finished in the leading
places. Now they experiment with a modified Hyundai i40 car, and they presented a
small demo to the robotics groups in Písek. Do not worry if you missed it &amp;mdash;
there will be a chance to see the car at &lt;i>Robotem rovně 2016&lt;/i>, where
the goal is to participate and autonomously navigate through the park &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/roboauto.jpg'>&lt;img src='/competitions/robotour/2015/results/roboauto_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/roboauto2.jpg'>&lt;img src='/competitions/robotour/2015/results/roboauto2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/roboauto-back.jpg'>&lt;img src='/competitions/robotour/2015/results/roboauto-back_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/roboauto-in.jpg'>&lt;img src='/competitions/robotour/2015/results/roboauto-in_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/roboauto-side.jpg'>&lt;img src='/competitions/robotour/2015/results/roboauto-side_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/roboauto-run1.jpg'>&lt;img src='/competitions/robotour/2015/results/roboauto-run1_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h1>Plans for  Robotour 2016&lt;/h1>

&lt;div class='p'>If everything goes fine, the next Robotour is going to be in Germany
(&lt;a href='http://www.openstreetmap.org/relation/959809#map=17/48.83164/12.95056&amp;amp;layers=N' class='external'>Deggendorf&lt;/a>). Note that we also plan some small changes in the rules. In particular there will be the
possibility to load/unload the barrel (50 points if the robot passes the start line).
Also, 500 ml cans will be allowed instead of the big barrel, but the robot has to be
able to load them (5 points per can, max. 10 pieces). We will keep the
reverse order at the start, i.e. the currently best team will start from the last
position.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/hexacopter.jpg'>&lt;img src='/competitions/robotour/2015/results/hexacopter_t.jpg' alt='cameraman' title='cameraman' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2015/results/hexacopter.jpg'>cameraman&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/istrobotics.jpg'>&lt;img src='/competitions/robotour/2015/results/istrobotics_t.jpg' alt='Istrobotics' title='Istrobotics' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2015/results/istrobotics.jpg'>Istrobotics&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/jecc.jpg'>&lt;img src='/competitions/robotour/2015/results/jecc_t.jpg' alt='JECC' title='JECC' class='border'  width='220' height='146'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2015/results/jecc.jpg'>JECC&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/jeccb.jpg'>&lt;img src='/competitions/robotour/2015/results/jeccb_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/jeccc.jpg'>&lt;img src='/competitions/robotour/2015/results/jeccc_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/arbot.jpg'>&lt;img src='/competitions/robotour/2015/results/arbot_t.jpg' alt='' title='' class='border'  width='220' height='147'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/arbot2.jpg'>&lt;img src='/competitions/robotour/2015/results/arbot2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/ndteam.jpg'>&lt;img src='/competitions/robotour/2015/results/ndteam_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/ndteam2.jpg'>&lt;img src='/competitions/robotour/2015/results/ndteam2_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/ndteam-istrobotics.jpg'>&lt;img src='/competitions/robotour/2015/results/ndteam-istrobotics_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/robots.jpg'>&lt;img src='/competitions/robotour/2015/results/robots_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/taking-pictures.jpg'>&lt;img src='/competitions/robotour/2015/results/taking-pictures_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;h2>Thanks&lt;/h2>

&lt;div class='p'>I would like to thank Radioklub Písek very much for the preparation and perfect
realization of the jubilee Robotour. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to support this contest in some way, or you have a
comment or question, please use our &lt;a href='/competitions/robotour/2015/results/en#email'>contact form&lt;/a>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/results/noname.jpg'>&lt;img src='/competitions/robotour/2015/results/noname_t.jpg' alt='' title='' class='border'  width='220' height='146'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>
 </content>
</entry>
<entry>
	<title>Introduction of teams</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2015/teams/en"/>
	<id>http://localhost/competitions/robotour/2015/teams/en</id>
	<updated>2014-08-11T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Welcome to the new team KaMaRo Engineering (known from the Field Robot Event), to the new
machines (ARBot, JECC) and to all the regulars &lt;span class='smile'>&lt;/span>. When and where can you see the
robots?  &lt;b>5th September 2015, Písek city center, Czech Republic&lt;/b>. Concurrent
starts of all robots at &lt;b>10am, 11am, 2pm and 3pm&lt;/b>.
 </summary>
	<content type='html'> 
&lt;h1>Teams&lt;/h1>

&lt;h1>&lt;a href='https://www.youtube.com/playlist?list=PL2gPpyBs1e23JrcQralHGYVZ8UIHfuWcS' class='external'>YouTube playlist of all registrations&lt;/a>&lt;/h1>

&lt;h2>&lt;a href='http://ambot6.webnode.cz/' class='external'>AmBot&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/ambot.jpg'>&lt;img src='/competitions/robotour/2015/teams/ambot_t.jpg' alt='' title='' class='border'  width='320' height='177'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://youtu.be/y_J8i0tHYjw' class='external'>https://youtu.be/y_J8i0tHYjw&lt;/a>&lt;/div>

&lt;div class='p'>Robot Ferda is based on a modified children's electric car ("ride-on") adapted for Robotour 2015.
The controller is based on an Arduino with ATmega2560; it controls
the motors, uses a magnetometer as a compass, manages three sonars to detect
obstacles, reads data from an external GPS receiver and communicates with a
Bluetooth converter (to respond to commands from the master system). The
master system is an Android smartphone with a proprietary application (RoboNav)
for GPS navigation on a map (derived from OpenStreetMap), supplemented by
visual navigation using the smartphone camera (to keep the robot on the road).&lt;/div>

&lt;h2>&lt;a href='http://www.arbot.cz' class='external'>ARBot&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/arbot.jpg'>&lt;img src='/competitions/robotour/2015/teams/arbot_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://youtu.be/DsKR7kcO96o' class='external'>https://youtu.be/DsKR7kcO96o&lt;/a>&lt;/div>

&lt;div class='p'>The robot is controlled by a Zedboard with a Zynq 7020 SoC manufactured by Xilinx.
It has 512 MB RAM and a 32 GB SD card; the heart is a dual-core ARM Cortex A9 +
NEON at 600 MHz and it also contains a programmable gate array with 85 thousand units.
The Zedboard runs Linux and the main programming language is C# on Mono.&lt;/div>

&lt;div class='p'>The chassis is differentially driven with a third passive wheel. The model
wheels have a diameter of 17 cm and two PG36555126000-50.9K motors with encoders are
controlled by a professional Roboteq SDC2160 unit, which provides the necessary
traction.&lt;/div>

&lt;div class='p'>There are two ADNS 3080 optical odometers for further position refinement.
The next sense is „touch”. The robot has two tactile FSR sensors integrated into
the front bumper.&lt;/div>

&lt;div class='p'>A time-tested VN-100 AHRS from VectorNav provides information about the robot's orientation.
Global position is handled by a uBlox NEO 7M GPS.&lt;/div>

&lt;div class='p'>The robot has two HC-SR04 sonars, which can be rotated by a model servo motor.&lt;/div>

&lt;div class='p'>The control unit for the model servos is an SSC-32.&lt;/div>

&lt;div class='p'>The robot has a stereoscopic camera with Aptina MT9V032 chips, which have a global
shutter and can work in HDR mode.  The camera can be moved in two axes via
servos.&lt;/div>

&lt;div class='p'>The energy source is four LiFePo cells with a capacity of 14.5 Ah, protected by an
SBM.&lt;/div>

&lt;div class='p'>The chassis is built from 2 mm thick model-aircraft plywood and 7 mm spruce
beams; simply the model-making domain. All parts were laser-cut.&lt;/div>

&lt;h2>Istrobotics (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/istrobotics.jpg'>&lt;img src='/competitions/robotour/2015/teams/istrobotics_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://youtu.be/rTmW52ScBX0' class='external'>https://youtu.be/rTmW52ScBX0&lt;/a>&lt;/div>

&lt;div class='p'>The robot is based on a modified TRAXXAS E-MAXX (3903) RC car. It is equipped
with a camera, GPS, an HC-SR04 sonar, an IMU with a 3D compass and a magnetic IRC. An Arduino
Mega is used for robot control and sensor reading. An Odroid C1 running Linux
is used for computer vision and GPS reading. The robot is programmed with C++ and
OpenCV.&lt;/div>

&lt;h2>&lt;a href='http://www.jecc.de' class='external'>JECC&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/jecc.jpg'>&lt;img src='/competitions/robotour/2015/teams/jecc_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://youtu.be/0WH-PPUnL6c' class='external'>https://youtu.be/0WH-PPUnL6c&lt;/a>&lt;/div>

&lt;ul>
&lt;li>image recognition with webcam&lt;/li>

&lt;li>obstacle detection with SICK PLS 101&lt;/li>

&lt;li>embedded pc module with Intel Core i7&lt;/li>

&lt;li>DRV8432 motor driver&lt;/li>
&lt;/ul>

&lt;h2>&lt;a href='http://www.kamaro.kit.edu' class='external'>Kamaro Engineering&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/kamaro.jpg'>&lt;img src='/competitions/robotour/2015/teams/kamaro_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://www.youtube.com/watch?v=txRV6iRba_w' class='external'>https://www.youtube.com/watch?v=txRV6iRba_w&lt;/a>&lt;/div>

&lt;div class='p'>Our robot is completely self-developed, from mechanics to software. It
has four wheels, two independently steered axles, one main motor and
weighs about 40 kg. It was mainly developed as an agricultural robot for
the Field Robot Event. The low-level software is written in C++, the
high-level software in C# and Java. It has two LIDARs and a camera as
its main sensors.&lt;/div>

&lt;h2>&lt;a href='http://cyber.felk.cvut.cz/research/theses/detail.phtml?id=484' class='external'>MarS&lt;/a>&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/mars.jpg'>&lt;img src='/competitions/robotour/2015/teams/mars_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://www.youtube.com/watch?v=xDHrKi1u2z0' class='external'>https://www.youtube.com/watch?v=xDHrKi1u2z0&lt;/a>&lt;/div>

&lt;div class='p'>The robot is based on the 4WD chassis of an off-road RC model.&lt;/div>

&lt;div class='p'>The data from GPS and compass are used for grid localization. The candidate
with the highest probability is selected from the grid. Moreover, it has to
outweigh all its neighbors by a given ratio to be accepted.&lt;/div>
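
&lt;div class='p'>For illustration only, here is a minimal Python sketch of such a winner-takes-all check (the grid representation, the ratio value and the function name are mine, not the team's actual code):&lt;/div>

&lt;pre>RATIO = 1.5   # assumed acceptance ratio over the neighbouring cells

def select_position(grid):
    # grid is a 2D list of probabilities; return (row, col) of the best
    # candidate, or None if it does not outweigh all its neighbours
    rows, cols = len(grid), len(grid[0])
    p, r, c = max((grid[y][x], y, x) for y in range(rows) for x in range(cols))
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == dx == 0:
                continue
            ny, nx = r + dy, c + dx
            if 0 &lt;= ny &lt; rows and 0 &lt;= nx &lt; cols and p &lt; RATIO * grid[ny][nx]:
                return None   # not dominant enough, no accepted position fix
    return r, c&lt;/pre>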

&lt;div class='p'>The algorithm for planning a path from the current winning position to the goal is
standard A*.&lt;/div>

&lt;div class='p'>Road navigation is based on image processing from the front camera. Odometry
and the GPS position then estimate the distance to the crossing, which is handled by a
simple wall-following algorithm with compass direction correction.&lt;/div>

&lt;h2>NDTeam (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/ndteam.jpg'>&lt;img src='/competitions/robotour/2015/teams/ndteam_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/2vXgi8JW6GU' class='external'>http://youtu.be/2vXgi8JW6GU&lt;/a>&lt;/div>

&lt;div class='p'>Robot Robík:&lt;/div>

&lt;ul>
&lt;li>weight about 20 kg without barrel&lt;/li>

&lt;li>controlled by homemade control unit based on Cortex M3 (LPC1765)&lt;/li>

&lt;li>9 DOF AHRS, GPS&lt;/li>

&lt;li>power: 8S1P 5Ah LiPol&lt;/li>

&lt;li>Credit card sized Linux computer Odroid U3 for video processing&lt;/li>

&lt;li>OpenCV for road detection&lt;/li>

&lt;li>5x sonar for obstacle detection and avoidance&lt;/li>
&lt;/ul>

&lt;h2>&lt;a href='http://www.kufr.cz' class='external'>Radioklub Písek&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/radioklub-pisek.jpg'>&lt;img src='/competitions/robotour/2015/teams/radioklub-pisek_t.jpg' alt='' title='' class='border'  width='320' height='239'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://youtu.be/Elczrq4C2Rw' class='external'>https://youtu.be/Elczrq4C2Rw&lt;/a>&lt;/div>

&lt;div class='p'>E-liška measures approximately 95x60x48 cm and weighs around 40 kg.  The board
voltage of 24 V is provided by two 12 V/18 Ah gel-lead accumulators. E-liška has a
spring-loaded four-wheel undercarriage with Ackermann steering and all four wheels are
powered, each with its own control unit.  We replaced the two rear motors with
BLDC motors this year. We use a SICK lidar, GPS and a 9 DOF inertial unit. The main
control is provided by a notebook running Debian Linux, and motor control is handled
by our own STM32-based module. Power control is handled by our own
design of H-bridges and 3-phase control of the BLDC motors. The main program
is written in Python.&lt;/div>

&lt;h2>&lt;a href='http://dai.fmph.uniba.sk/projects/smelyzajko/' class='external'>Smely Zajko&lt;/a> (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/teams/smely-zajko.jpg'>&lt;img src='/competitions/robotour/2015/teams/smely-zajko_t.jpg' alt='' title='' class='border'  width='320' height='178'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/aixAzwEohaI' class='external'>http://youtu.be/aixAzwEohaI&lt;/a>&lt;/div>

&lt;ul>
&lt;li>Parallax Motor Mount and Wheel Kit, encoders, 2x HB25 motor drivers&lt;/li>

&lt;li>Sbot board (based on AVR ATmega128, low-level control board)&lt;/li>

&lt;li>PC ASUS UL30V (main control computer)&lt;/li>

&lt;li>5x SRF-08 ultrasonic sensors&lt;/li>

&lt;li>GPS NaviLock NL-302U USB SiRF III&lt;/li>

&lt;li>compass with tilt compensation (HMC6343)&lt;/li>

&lt;li>AVR ATmega8 (compass driver)&lt;/li>

&lt;li>camcorder Panasonic SDR-T50 or webcam&lt;/li>

&lt;li>usual USB hub&lt;/li>

&lt;li>power: HAZE HZS 12V 9Ah&lt;/li>

&lt;li>handmade wood &amp;amp; aluminium base&lt;/li>

&lt;li>red power switch and power circuitry&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;div class='p'>If you would like to support this contest in some way, or you have any
comments or questions, please use our standard &lt;a href='/competitions/robotour/2015/teams/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Robotour 2015</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2015/en"/>
	<id>http://localhost/competitions/robotour/2015/en</id>
	<updated>2015-05-25T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> This year is already the 10th anniversary (!) of the Robotour contest. &lt;span class='smile'>&lt;/span> We decided to
organize it in Písek this time. Radioklub Písek opens the outdoor robotic
season with „Robotem rovně” and this year it will also close it with
„Robotour”. The contest will run in parallel with the „Beer celebrations”. The exact
place in Písek will be specified later. And the date is &lt;b>5th September
2015&lt;/b>. (&lt;b>Update: location details&lt;/b>)
 </summary>
	<content type='html'> 
&lt;h1>Rules&lt;/h1>

&lt;div class='p'>The rules for ground vehicles are the same as the previous year and are available
in PDF format in &lt;a href='/competitions/robotour/2013/Robotour-rules.pdf'>English&lt;/a>
and in &lt;a href='/competitions/robotour/2013/Robotour-pravidla.pdf'>Czech&lt;/a>.&lt;/div>

&lt;div class='p'>There is only one new rule regarding order at the start:&lt;/div>

&lt;ul>
&lt;li>the robot with the highest score starts from the last position&lt;/li>
&lt;/ul>

&lt;h2>Photo of potential roads/crossings&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-001.jpg'>&lt;img src='/competitions/robotour/2015/pisek-001_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-006.jpg'>&lt;img src='/competitions/robotour/2015/pisek-006_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-008.jpg'>&lt;img src='/competitions/robotour/2015/pisek-008_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-009.jpg'>&lt;img src='/competitions/robotour/2015/pisek-009_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-011.jpg'>&lt;img src='/competitions/robotour/2015/pisek-011_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-018.jpg'>&lt;img src='/competitions/robotour/2015/pisek-018_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-019.jpg'>&lt;img src='/competitions/robotour/2015/pisek-019_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-033.jpg'>&lt;img src='/competitions/robotour/2015/pisek-033_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2015/pisek-045.jpg'>&lt;img src='/competitions/robotour/2015/pisek-045_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Location&lt;/h2>

&lt;div class='p'>The contest will take place at several different spots in Písek. The bounding box is
minlat='49.304811' minlon='14.1398335' maxlat='49.3107989' maxlon='14.1518927'.
The first start and the homologation place will be near the street "U Vystaviste" at
49.308258, 14.143379
(&lt;a href='http://www.openstreetmap.org/search?query=49.308258%2C%2014.143379#map=19/49.30826/14.14338&amp;amp;layers=D' class='external'>map&lt;/a>).&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to support this contest in some way, or you have any
comments or questions, please use our standard &lt;a href='/competitions/robotour/2015/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Husky</title>
	<link rel='alternate' href="http://localhost/robots/husky/en"/>
	<id>http://localhost/robots/husky/en</id>
	<updated>2014-07-29T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Husky is a commercial platform manufactured by Clearpath Robotics. It is planned
to be an active helper on a small farm, where it will cut the grass near the
electric fence and assist with material handling. &lt;b>Blog update:&lt;/b> 3/6 &amp;mdash;
&lt;a href='/robots/husky/en#150603'>dnipola.sk/Encoders&lt;/a>
 </summary>
	<content type='html'> 
&lt;div class='p'>Our robot Husky was tested for the first time at the competition
&lt;a href='/competitions/robotem-rovne/2014/en#140519'>&lt;span class='cs'>Robotem rovně&lt;/span>&lt;/a>. Here is a
&lt;a href='http://www.youtube.com/watch?v=XknqC8_kI0E' class='external'>short video&lt;/a> of the robot in motion.
The &lt;a href='https://github.com/robotika/husky/blob/master/rr.py' class='external'>source code&lt;/a> was
very simple (basically set 30% power to both motors), but it was still a success
&amp;mdash; I had the robot for only a couple of hours.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/husky/husky.jpg'>&lt;img src='/robots/husky/husky_t.jpg' alt='Robot Husky' title='Robot Husky' class='border'  width='320' height='212'/>&lt;/a>&lt;br/>
&lt;a href='/robots/husky/husky.jpg'>Robot Husky&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>It looks like the Czech public is not very interested in this project (see the
results of &lt;a href='http://fandorama.cz/projekty/921778164/robot-husky/' class='external'>this fandorama
project&lt;/a>), so I will try to write up my experiences and test results here in
English. If you have any question or comment, please do not hesitate to ask
(the contact form is at the end of this article).&lt;/div>

&lt;h2>Links:&lt;/h2>

&lt;ul>
&lt;li>Website of the manufacturer:
&lt;a href='http://www.clearpathrobotics.com/husky/' class='external'>http://www.clearpathrobotics.com/husky/&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Content&lt;/h1>

&lt;h3>2014&lt;/h3>

&lt;ul>
&lt;li>&lt;a href='/robots/husky/en#140729'>Github&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140731'>IMU&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140801'>End of IMU mystery&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140819'>USB devices&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140820'>ROS without ROS?&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140821'>Subscriber client&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140822'>Hello Robot&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140825'>ROS /imu/data&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140827'>Hello ROS Family&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140829'>node.py&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140902'>replay metalog&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140903'>Heartbeat&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140904'>Joyride&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140907'>Quaternions and loose wheel/axis&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#140911'>Goodbye Husky!&lt;/a>&lt;/li>
&lt;/ul>

&lt;h3>2015&lt;/h3>

&lt;ul>
&lt;li>&lt;a href='/robots/husky/en#150203'>Husky at CZU&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150210'>Simple Depth Viewer&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150211'>To the Wall&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150212'>depth_stream.start()&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150224'>followme.py ver0&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150423'>go.py into sun&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150513'>Husky &amp;amp; Heidi (first test)&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150514'>2 days to RR2015&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150515'>H+H Network (failure)&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150518'>The Contest Robotem Rovně 2015&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150519'>USB Ghost&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150522'>OS Upgrade&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150527'>ssh tunnel&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150528'>Compass calibration&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150529'>imu.py&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/husky/en#150603'>dnipola.sk/Encoders&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Blog&lt;/h1>

&lt;div class='p'>&lt;a id="140729">&lt;/a>&lt;/div>

&lt;h2>29th July 2014 &amp;mdash; Github&lt;/h2>

&lt;div class='p'>For the last couple of years I have preferred to program in Python, so I was very happy that
Husky comes with a Python wrapper.  Unfortunately it looks like the focus of the
company is mainly on ROS (&lt;a href='http://www.ros.org/about-ros/' class='external'>The Robot Operating
System&lt;/a>), which we will probably use in the end too, but somehow I trust
Python more. Yeah, and I do not know ROS much &amp;hellip; &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>The official wrapper (for registered users only) is fine, but I miss
communication logging and a simple way of testing from recorded log files.
Because of that I started an alternative wrapper, which is available on
&lt;a href='https://github.com/robotika/husky' class='external'>github&lt;/a>. I still use the official wrapper
as a reference and it helped me yesterday to solve three bugs in the generation of
command packets:&lt;/div>

&lt;ul>
&lt;li>length byte in message header should be smaller&lt;/li>

&lt;li>there was a mistake with alignment (probably also causing the first mistake)&lt;/li>

&lt;li>I was using protocol version 0 instead of version 1, which is described in 
the documentation&lt;/li>
&lt;/ul>

&lt;div class='p'>The good news is that Husky moves now and all its sensor data are properly
logged and can be replayed any time after the trial. I could already observe
the current rising when the robot was pushing against the wall and I saw readings of a couple of
millimeters from the encoders.&lt;/div>

&lt;div class='p'>Well, I did not mention an important „detail”: the whole platform can be
controlled via a serial line with command packets. You can read sensor data as
responses to query commands, or you can set the frequency at which they should be
reported automatically. It reminds me of the &lt;a href='/robots/eduro/en'>CAN bus on the Eduro robot&lt;/a>,
so I am quite happy with this setup &lt;span class='smile'>&lt;/span>.&lt;/div>
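
&lt;div class='p'>To illustrate the logging idea only, here is a rough pyserial sketch; the port, baud rate and the query bytes below are placeholders, the real packet format is described in the Husky documentation and in the github wrapper:&lt;/div>

&lt;pre>import time, serial, binascii   # pyserial

PORT, BAUD = '/dev/ttyUSB1', 115200   # placeholders, check your own setup
QUERY = b'\x00'                       # placeholder bytes, NOT a real Husky packet

ser = serial.Serial(PORT, BAUD, timeout=0.1)
with open('husky_raw.log', 'w') as log:
    ser.write(QUERY)                  # ask the platform for some data
    while True:
        data = ser.read(256)          # whatever arrived within the timeout
        if data:
            log.write('%.3f %s\n' % (time.time(),
                                     binascii.hexlify(data).decode('ascii')))
            log.flush()&lt;/pre>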

&lt;hr/>

&lt;div class='p'>&lt;a id="140731">&lt;/a>&lt;/div>

&lt;h2>31st July 2014 &amp;mdash; IMU&lt;/h2>

&lt;div class='p'>Odometry with encoders looks fine, so the next step is to get the compass/magnetometer
working. I expected it would be easy and that I would just request the data via the
0x4600 or 0x4606 message (&lt;a href='https://github.com/robotika/husky/commit/5c917687650ead21721aaf5bca27fb48e6dbd55a' class='external'>code
diff&lt;/a>), but &amp;hellip; no, something is wrong &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>While I get a response that MAX_SPEED is set to 1 m/s and MAX_ACCEL to 4 m/s2,
the confirmation of the Magneto messages fails with the response &lt;i>['0x2', '0x0']&lt;/i>, which
means &lt;i>[1] Type not supported - The platform configuration being used does not
support this command or request.&lt;/i>?!&lt;/div>

&lt;div class='p'>An IMU (it looks like a &lt;a href='http://www.pololu.com/product/1255' class='external'>UM6-CHR&lt;/a>) is obviously
mounted inside Husky. I wrote to CPR (Clearpath Robotics) support and got a
response describing how to test the presence of the IMU, but in ROS:&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~$ rostopic list
/clearpath/announce/robots
/diagnostics
/diagnostics_agg
/diagnostics_toplevel_state
/encoder
/husky/cmd_freq
/husky/cmd_vel
/husky/data/differential_output
/husky/data/differential_speed
/husky/data/encoders
/husky/data/power_status
/husky/data/safety_status
/husky/data/system_status
/husky/robot
/imu/data
/imu/mag
/imu/rpy
/imu/temperature
/inertial_ekf/odom
/joint_states
/joy
/plan_cmd_vel
/rosout
/rosout_agg
/tf
/tf_static&lt;/pre>

&lt;div class='p'>So you can see &lt;i>/imu/data&lt;/i> there, and yes, if you query it via &lt;b>rostopic echo
/imu/data&lt;/b> you get readings like:&lt;/div>

&lt;pre>header: 
  seq: 6366
  stamp: 
    secs: 1406748289
    nsecs: 74127
  frame_id: imu_link
orientation: 
  x: 0.5765191582
  y: 0.4495936349
  z: 0.4151515331
  w: 0.5414056704
orientation_covariance: [0.03235951438546181, -0.022549884393811226, 
-0.028850223869085312, -0.02254987321794033, 0.019496535882353783, 
0.02177048847079277, -0.028850214555859566, 0.021770477294921875, 
0.03033282235264778]
angular_velocity: 
  x: -0.00532632599807
  y: -0.013848447595
  z: -0.00213053039923
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration: 
  x: 0.013000455
  y: 1.05505101
  z: -0.095031495
linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]&lt;/pre>

&lt;div class='p'>So the IMU is connected and sends some data. So what is the problem and how
could it be fixed? Now I am going to fall back to the original Python wrapper to see if it
also fails. Here is the modified horizon snippet:&lt;/div>

&lt;pre>def platform_magnetometer(code, payload, timestamp):
  print(payload.print_format())

horizon.add_handler(platform_magnetometer, request = 'platform_magnetometer')
horizon.request_platform_magnetometer(subscription = 1)&lt;/pre>

&lt;div class='p'>Time to power on Husky in bedroom &amp;hellip;&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~/md/hacking$ Traceback (most recent call last):
  File "./mag.py", line 20, in &lt;module>
    horizon.request_platform_magnetometer(subscription = 1)
  File "/home/administrator/md/hacking/clearpath/horizon/__init__.py", line 449,
 in request_platform_magnetometer
    return self._protocol.request('platform_magnetometer', locals())
  File "/home/administrator/md/hacking/clearpath/horizon/protocol.py", line 297,
 in request
    self.send_message(message)
  File "/home/administrator/md/hacking/clearpath/horizon/protocol.py", line 346,
 in send_message
    raise utils.UnsupportedCodeError("Acknowledgment says Bad Code.")
clearpath.utils.UnsupportedCodeError: Acknowledgment says Bad Code.&lt;/pre>

&lt;div class='p'>OK, so it is not necessarily my fault.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140801">&lt;/a>&lt;/div>

&lt;h2>1st August 2014 &amp;mdash; End of IMU mystery&lt;/h2>

&lt;div class='p'>How is it possible that I cannot see the IMU on the serial port when it works in
ROS?! I was going to sniff the USB communication to get the answer &amp;hellip; but I would
not have found it there &lt;span class='wink'>&lt;/span>. Any idea?&lt;/div>

&lt;div class='p'>Hint:&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~$ ls /dev/ttyUSB*
/dev/ttyUSB0  /dev/ttyUSB1&lt;/pre>

&lt;div class='p'>Stupid me &amp;mdash; sometimes, even for a programmer, it is better to open the box
instead of looking for the bug in the software.&lt;/div>

&lt;div class='p'>Thanks to Martin C. from CPR support: 
&lt;i>I have gotten word back from one of our software engineers. He said
that since Husky doesn't have a built in IMU, only the motor and encoder
messages are supported. To interface with the UM6 IMU that you have in your
Husky, you can use this legacy Python module:&lt;/i>
&lt;a href='http://aptima-ros-pkg.googlecode.com/svn/trunk/imu_um6/src/um6/driver.py' class='external'>http://aptima-ros-pkg.googlecode.com/svn/trunk/imu_um6/src/um6/driver.py&lt;/a>&lt;/div>

&lt;div class='p'>So the IMU is a separate module connected to a separate USB port! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>I still did not get nice data from the IMU:&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~/md$ python driver.py
Read Pkt: BAD CHECKSUM
CMD 172 Success: True
CMD 176 Success: True
CMD 175 Success: True
rx pkt
Read Pkt: BAD CHECKSUM
rx pkt
Read Pkt: BAD CHECKSUM
rx pkt
Read Pkt: BAD CHECKSUM
rx pkt&lt;/pre>

&lt;div class='p'>&amp;hellip; but I still consider it a success, for the moment.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140819">&lt;/a>&lt;/div>

&lt;h2>19th August 2014 &amp;mdash; USB devices&lt;/h2>

&lt;div class='p'>There are four USB devices plugged into Husky robot:&lt;/div>

&lt;ul>
&lt;li>USB-serial converter for Husky platform (motors, encoders, &amp;hellip;)&lt;/li>

&lt;li>IMU unit&lt;/li>

&lt;li>Logitech joystick receiver&lt;/li>

&lt;li>Prime Sense Sensor (a'la Kinect)&lt;/li>
&lt;/ul>

&lt;div class='p'>Originally I wanted a unified data source, similar to the CAN bus messages: test and
log each device separately and then integrate them together. The current status is
not very encouraging. I can talk to the Husky platform, but that is the only good
news.&lt;/div>

&lt;div class='p'>I can read data from the joystick via &lt;a href='http://pygame.org/' class='external'>pygame&lt;/a> on Win7 (I
also tried the Eduro
&lt;a href='https://github.com/robotika/eduro/blob/master/joy.py' class='external'>joy.py&lt;/a>, but my
impression is that I am getting random buffered results from
&lt;i>/dev/input/js0&lt;/i>).  There is a possibility to install &lt;i>pygame&lt;/i> on Husky,
but is it the right direction?&lt;/div>
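
&lt;div class='p'>For completeness, a minimal pygame readout looks roughly like this (the axis and button numbering depends on the particular gamepad):&lt;/div>

&lt;pre>import pygame

pygame.init()
pygame.joystick.init()
joy = pygame.joystick.Joystick(0)     # the first detected gamepad
joy.init()
while True:
    pygame.event.pump()               # refresh the internal joystick state
    axes = [joy.get_axis(i) for i in range(joy.get_numaxes())]
    buttons = [joy.get_button(i) for i in range(joy.get_numbuttons())]
    print(axes, buttons)&lt;/pre>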

&lt;div class='p'>I have had no success talking directly to the IMU. It should be on
&lt;i>/dev/clearpath/imu&lt;/i> (a symbolic link to &lt;i>/dev/ttyUSB0&lt;/i>) running at 115200,
but again I am getting rather random bytes and the checksums are not valid.&lt;/div>

&lt;div class='p'>Regarding the Prime Sense sensor &amp;mdash; it works on Win7 with open source drivers and
I have not tested it on Husky yet. That would be the 2nd phase. The first task is to
record a path and then replay it (drive it again).&lt;/div>

&lt;div class='p'>So what to do?! One possibility is to give up low-level access (I am not very
comfortable with that) and use ROS as middleware. ROS seems to be working &amp;mdash; I
see data from the IMU (&lt;i>rostopic echo imu/data&lt;/i>) and from the joystick (&lt;i>rostopic
echo joy&lt;/i>). It can multiplex all the sources.  It can also log them,
probably &amp;hellip; it is just such a monster that I still hesitate to cross that
line. Again, I would like a small "hard coded" client, which does not need the ROS
library and can connect to something, request data, read them and parse them into a Python
class. Is &lt;a href='http://wiki.ros.org/rospy/' class='external'>rospy&lt;/a> then a "must have"?&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140820">&lt;/a>&lt;/div>

&lt;h2>20th August 2014 &amp;mdash; ROS without ROS?&lt;/h2>

&lt;div class='p'>There is a light at the end of the tunnel. Maybe &lt;span class='wink'>&lt;/span>. I am trying to get data
from existing ROS nodes, but I would like to avoid a ROS installation (for
Windows). I started to learn it &amp;mdash; I had some idea, but there are tons of texts
you do not really want to read &amp;hellip; until I found a nice article (short, with
images) on &lt;a href='http://robohub.org/' class='external'>robohub.org&lt;/a>:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='http://robohub.org/ros-101-intro-to-the-robot-operating-system/' class='external'>http://robohub.org/ros-101-intro-to-the-robot-operating-system/&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>And then a simple usage example in&lt;/div>

&lt;ul>
&lt;li>&lt;a href='http://robohub.org/ros-101-a-practical-example/' class='external'>http://robohub.org/ros-101-a-practical-example/&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>I did not fully understand why they provide readers with images for virtual machines
until I tried to install it on an older machine with Ubuntu. There is no &lt;i>general
installation&lt;/i>; it depends even on which particular version of Ubuntu you have.
And the one I have is the last one supported by &lt;i>hydro&lt;/i>.&lt;/div>

&lt;div class='p'>The recommended installation would take more than 2GB(!), which I do not have
available on that old machine. So I went for &lt;b>Bare Bones&lt;/b>, approx. 100MB of
packages, followed the instructions on the
&lt;a href='http://wiki.ros.org/hydro/Installation/Ubuntu' class='external'>ROS wiki&lt;/a> and it went fine.&lt;/div>

&lt;div class='p'>I tried the &lt;a href='http://robohub.org/ros-101-a-practical-example/' class='external'>Hello Robot&lt;/a> example
and it worked (avoid copy and paste, or make sure you have correct quotation
marks):&lt;/div>

&lt;pre>martind@martind-ThinkPad-R60:~$ rostopic pub /hello std_msgs/String “Hello Robot”
ERROR: Too many arguments:
 * Given: [u'\u201cHello', u'Robot\u201d']
 * Expected: ['data']

Args are: [data]&lt;/pre>

&lt;div class='p'>And then I came across the &lt;a href='http://wiki.ros.org/ROS/Technical%20Overview' class='external'>ROS
Technical Overview&lt;/a>. It was rather a "lucky accident", but as soon as I read
&lt;i>Most ROS users do not need to know these details...&lt;/i> I knew this was for me
&lt;span class='smile'>&lt;/span>. This is the piece I was looking for. After a simple test on Ubuntu I could
also connect from Win7. See the „proof”:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 683px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/husky/win-ros-test.png' alt='Python test talking to ROS' title='Python test talking to ROS' class='border'  width='677' height='342'/>&lt;/span>&lt;br/>
&lt;span>Python test talking to ROS&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>So there is a chance &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140821">&lt;/a>&lt;/div>

&lt;h2>21st August 2014 &amp;mdash; Subscriber client&lt;/h2>

&lt;div class='p'>A long time ago I was thinking about how to split the processing power on
&lt;a href='/robots/eduro/en'>robot Eduro&lt;/a>. I asked my robo-friends if it was a good idea to
use &lt;a href='http://en.wikipedia.org/wiki/XML-RPC' class='external'>XMLRPC&lt;/a> (Remote Procedure Call) to
put it all together. Their answer was hard to forget: &lt;i>Well, then you will
have TWO problems&lt;/i>! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>ROS is using XMLRPC. I am trying to create a simple &lt;i>Subscriber client&lt;/i>, i.e. a
piece of code which can receive, for example, IMU data. The beginning was fine &amp;mdash;
the client has to ask the ROS Master via XMLRPC whether there is any
&lt;i>Publisher&lt;/i> providing such information (see the
&lt;a href='http://wiki.ros.org/ROS/Master_API' class='external'>ROS Master API&lt;/a>). In particular it is
necessary to call &lt;i>registerSubscriber(caller_id, topic, topic_type,
caller_api)&lt;/i>.&lt;/div>

&lt;div class='p'>What are the parameters? &lt;i>caller_id&lt;/i> is the string &lt;i>ROS caller ID&lt;/i>. Here I
would guess that it can be any string describing my client node (I picked
&lt;b>"hello_test_client"&lt;/b>). Then there are the &lt;i>topic&lt;/i> and &lt;i>topic_type&lt;/i>.
Because I want to test it on the
&lt;a href='http://robohub.org/ros-101-a-practical-example/' class='external'>Hello Robot example&lt;/a> it
should be &lt;b>"/hello"&lt;/b> and &lt;b>"std_msgs/String"&lt;/b>, I hope. And then the
troublemaker: &lt;i>caller_api&lt;/i>. It is supposed to be the string &lt;i>API URI of
subscriber to register. Will be used for new publisher notifications.&lt;/i> I was
surely coding this wrong, because I am getting:&lt;/div>

&lt;pre>[-1, 'ERROR: parameter [caller_api] is not an XMLRPC URI', 0]&lt;/pre>
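
&lt;div class='p'>For reference, the registration itself is a single XMLRPC call, roughly like this (Python 2 &lt;i>xmlrpclib&lt;/i>; the IP addresses are made up and the &lt;i>caller_api&lt;/i> workaround is described below):&lt;/div>

&lt;pre>import xmlrpclib   # Python 2; xmlrpc.client in Python 3

master = xmlrpclib.ServerProxy('http://192.168.1.10:11311')   # made-up master URI
code, msg, publishers = master.registerSubscriber(
        'hello_test_client',           # caller_id
        '/hello',                      # topic
        'std_msgs/String',             # topic_type
        'http://192.168.1.11:8000')    # caller_api (URI of our own XMLRPC server)
print(code, msg, publishers)           # publishers = list of node XMLRPC URIs&lt;/pre>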

&lt;div class='p'>Do I really have to create a server in order to receive the &lt;i>publisherUpdate&lt;/i>
remote procedure call? I am trying to write a simple client, and in most (all?)
examples I have seen so far the clients subscribe and then wait until there is a
publisher available. Is there no option to drop this part completely? I
would not mind having a &lt;i>blocking&lt;/i> or &lt;i>timeout&lt;/i> parameter so that if
there is no publisher within the given period the call fails. Why does my poor client have
to build an XMLRPC server just for a service it does not care about?&lt;/div>

&lt;div class='p'>OK, solved &lt;span class='smile'>&lt;/span> &amp;hellip; see
&lt;a href='https://github.com/robotika/husky/commit/952d4d1754301254b22debff2183a5c4d8010940' class='external'>github&lt;/a>.
I am sorry for the hard-coded IP addresses &amp;mdash; it is just a test between a PC running
Ubuntu and a Win7 laptop. There is
&lt;a href='https://github.com/robotika/husky/blob/master/ros/hello.py' class='external'>hello.py&lt;/a> which
tries to subscribe to the ROS Master and
&lt;a href='https://github.com/robotika/husky/blob/master/ros/hello_server.py' class='external'>hello_server.py&lt;/a>
which accepts remote calls (copied from the Python documentation). Note that the
server does not call it immediately, but you are not allowed to fill in an
empty string there. On the other hand &lt;b>"http://blablabla:123"&lt;/b> worked too:&lt;/div>

&lt;pre>m:\git\husky\ros>hello.py
Publishers:
['/rosout_agg', ['/rosout']]
['/hello', ['/rostopic_2157_1408558967708']]
[1, 'Subscribed to [/hello]', ['http://martind-ThinkPad-R60:48116/']]&lt;/pre>

&lt;div class='p'>Now it is time to talk to the node &lt;b>'http://martind-ThinkPad-R60:48116/'&lt;/b> and
convince it to open connection for me &amp;hellip;&lt;/div>

&lt;div class='p'>p.s. I did not receive any &lt;i>publisherUpdate&lt;/i> even when I stopped &lt;i>/hello&lt;/i>
and added some &lt;i>sleep()&lt;/i> &amp;hellip; so the &lt;i>hello_server.py&lt;/i> side is not working
properly yet.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140822">&lt;/a>&lt;/div>

&lt;h2>22nd August 2014 &amp;mdash; Hello Robot&lt;/h2>

&lt;div class='p'>Done. I finally convinced the other ROS node to send me (Win7 Python client)
&lt;i>Hello Robot&lt;/i>! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;pre>m:\git\husky\ros>hello.py
Publishers:
['/rosout_agg', ['/rosout']]
['/hello', ['/rostopic_2157_1408640226737']]
[1, 'ready on martind-ThinkPad-R60:49065', ['TCPROS', 'martind-ThinkPad-R60', 49065]]
- - - - - - - 
SIZE 174
message_definition=string data


callerid=/rostopic_2157_1408640226737
latching=1
md5sum=992ce8a1687cec8c8bd883ec73ca41d1
topic=/hello
type=std_msgs/String
- - - - - - - 
SIZE 15
Hello Robot
- - - - - - - 
Traceback (most recent call last):
  File "M:\git\husky\ros\hello.py", line 77, in &lt;module>
    data = soc.recv(4)
socket.timeout: timed out&lt;/pre>

&lt;div class='p'>Here is
&lt;a href='https://github.com/robotika/husky/commit/4c010833f650301fe2036e1258d308efc5058c03' class='external'>the
code diff&lt;/a> &amp;mdash; very ugly, not really parsing what I got from the node but rather
patching patches until I finally got an answer. It was not as trivial as it
may sound now, so let's review where I made mistakes &amp;hellip;&lt;/div>

&lt;div class='p'>The very first problem was in the specification of the communication protocol &amp;mdash;
the last parameter of &lt;i>publisher.requestTopic()&lt;/i>. I was using the
&lt;a href='http://wiki.ros.org/ROS/Slave_API' class='external'>ROS Slave API&lt;/a> as documentation and it is
supposed to be: &lt;i>List of desired protocols for communication in order of
preference.&lt;/i> Somewhere else I read that there are actually only two
(&lt;a href='http://wiki.ros.org/ROS/TCPROS' class='external'>TCPROS&lt;/a> and
&lt;a href='http://wiki.ros.org/ROS/UDPROS' class='external'>UDPROS&lt;/a>), so it took me a while to realize why
it is a list of lists?! Never mind, in the end the documentation was correct,
i.e. the last parameter is &lt;b>[ ["TCPROS"] ]&lt;/b>.&lt;/div>

&lt;div class='p'>Great, after several unsuccessful attempts I finally got &lt;i>[1, 'ready on
martind-ThinkPad-R60:49065', ['TCPROS', 'martind-ThinkPad-R60', 49065]]&lt;/i>, so
my &lt;b>/hello&lt;/b> node can be accessed via port 49065. I opened a socket and connected
to it, but I did not receive any data. I supposed it was because of buffering,
so I played (probably wrongly) with &lt;i>socket.TCP_NODELAY&lt;/i>, timeouts etc. I tried
to get the 4 bytes of message length, guided by this
&lt;a href='http://wiki.ros.org/ROS/Connection%20Header' class='external'>nice descriptive example&lt;/a>.
Nothing.&lt;/div>

&lt;div class='p'>From the &lt;a href='http://wiki.ros.org/ROS/Technical%20Overview' class='external'>Technical Overview 6.1
Example&lt;/a> it looked like the client only has to connect. I should rather have read the
&lt;a href='http://wiki.ros.org/ROS/TCPROS' class='external'>TCPROS page&lt;/a>, where there is an important piece of
information: &lt;i>A TCPROS subscriber is required to send the following
fields...&lt;/i> So the client has to talk to the node first.&lt;/div>

&lt;div class='p'>After a while I realized that there were tons of warnings in the "/hello" node
terminal:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 681px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/husky/hello-warnings.png' alt='Warnings from /hello node' title='Warnings from /hello node' class='border'  width='675' height='424'/>&lt;/span>&lt;br/>
&lt;span>Warnings from /hello node&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>So I added several &lt;i>key=value&lt;/i> pairs with a 4-byte length prefix and a random
md5sum, and then from the warnings I got the correct md5sum &amp;hellip; yeah, a very dirty way. In the
end I received back what I had sent, but still not the data &lt;span class='smile'>&lt;/span>. So the answer from the
node starts with the message description (again) and the second "packet" is the
real data: the &lt;b>Hello Robot&lt;/b> string.&lt;/div>
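
&lt;div class='p'>The packing itself is simple: each field is a little-endian 4-byte length followed by the &lt;i>key=value&lt;/i> string, and the whole header gets one more 4-byte length prefix. A small sketch (the helper name and the exact field set are just an example):&lt;/div>

&lt;pre>import struct

def pack_fields(fields):
    # each field: 4-byte little-endian length + 'key=value'; the whole
    # header is prefixed by one more 4-byte length
    body = b''
    for key, value in fields:
        item = ('%s=%s' % (key, value)).encode('ascii')
        body += struct.pack('&lt;I', len(item)) + item
    return struct.pack('&lt;I', len(body)) + body

header = pack_fields([('callerid', '/hello_test_client'),
                      ('topic', '/hello'),
                      ('type', 'std_msgs/String'),
                      ('md5sum', '992ce8a1687cec8c8bd883ec73ca41d1')])
# header is the first thing sent on the freshly opened TCP connection&lt;/pre>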

&lt;div class='p'>What I currently do not understand is why the client has to specify the message
structure when it first talks to the master, why it has to repeat it when it
talks to the publisher, and finally why the publisher repeats it when it talks back to the
subscriber.&lt;/div>

&lt;div class='p'>I could not resist doing another test: instead of providing &lt;i>std_msgs/String&lt;/i>
I tried to send a number. If you need to know how to specify a number,
&lt;a href='http://wiki.ros.org/std_msgs' class='external'>this std_msgs wiki&lt;/a> should help. Note that it
is case sensitive, so in my case I tried &lt;i>rostopic pub /hello std_msgs/Int32
123&lt;/i>. The strange thing is that when I asked the master which node provides &lt;i>/hello
std_msgs/String&lt;/i> I got both. I stopped the one with the string and this is what I
got from the &lt;b>node&lt;/b>:&lt;/div>

&lt;pre>error=topic types do not match: [std_msgs/String] vs. [std_msgs/Int32]&lt;/pre>

&lt;div class='p'>So I really do not understand why I have to tell the ROS master what type of
message I need &amp;hellip;&lt;/div>

&lt;div class='p'>TIMEOUT i.e. time to run to work &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140825">&lt;/a>&lt;/div>

&lt;h2>25th August 2014 &amp;mdash; ROS /imu/data&lt;/h2>

&lt;div class='p'>As soon as I got &lt;i>Hello Robot&lt;/i> from the &lt;i>/hello&lt;/i> node I tried to receive
&lt;i>/imu/data&lt;/i> from Husky. I
&lt;a href='https://github.com/robotika/husky/commit/009e69483e5b11c3c11e750ad14c4e455a7ea15b' class='external'>hacked
hello.py&lt;/a> and recorded the received data. Now I understand that there is a big
difference between &lt;i>std_msgs/String&lt;/i> and &lt;i>message_definition=string data&lt;/i>.
The first is only a name while the other is the message description. It became quite
clear with the &lt;i>/imu/data&lt;/i> node, which provides the message (topic type)
&lt;i>std_msgs/Imu&lt;/i>, which has many sub-elements.&lt;/div>

&lt;div class='p'>So when you talk to the ROS master you give it the topic and only the topic type (I am
still not sure why), but once you start to talk to a publisher you have to
specify the correct MD5 of the message structure. The good news is that if you do
not know it, the publisher responds with something like &lt;i>your md5 is not
matching&lt;/i> and tells you which md5 it has. So you can start to talk to the
publisher again, this time with its MD5 &amp;hellip; and then you get the complete message
description, including comments! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>For &lt;i>std_msgs/Imu&lt;/i> I received:&lt;/div>

&lt;pre>SIZE 2329
callerid=/um6_driver
latching=0
md5sum=6a62c6daae103f4ff57a132d6f95cec2
message_definition=# This is a message to hold data from an IMU (Inertial Measur
ement Unit)
#
# Accelerations should be in m/s^2 (not in g's), and rotational velocity should
be in rad/sec
#
# If the covariance of the measurement is known, it should be filled in (if all
you know is the variance of each measurement, e.g. from the datasheet, just put
those along the diagonal)
# A covariance matrix of all zeros will be interpreted as "covariance unknown",
and to use the data a covariance will have to be assumed or gotten from some oth
er source
#
# If you have no estimate for one of the data elements (e.g. your IMU doesn't pr
oduce an orientation estimate), please set element 0 of the associated covarianc
e matrix to -1
# If you are interpreting this message, please check for a value of -1 in the fi
rst element of each covariance matrix, and disregard the associated estimate.

Header header

geometry_msgs/Quaternion orientation
float64[9] orientation_covariance # Row major about x, y, z axes

geometry_msgs/Vector3 angular_velocity
float64[9] angular_velocity_covariance # Row major about x, y, z axes

geometry_msgs/Vector3 linear_acceleration
float64[9] linear_acceleration_covariance # Row major x, y z

&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>

MSG: std_msgs/Header
# Standard metadata for higher-level stamped data type
&lt;strike>-&lt;/strike>&lt;strike>-&lt;/strike>&lt;strike>-&lt;/strike>&lt;strike>-&lt;/strike>
SIZE 587869811
s is generally used to communicate timestamped data
# in a particular coordinate frame.
#
# sequence ID: consecutively increasing ID uint32 seq
#Two-integer timestamp that is expressed as:
# * stamp.secs: seconds (stamp_secs) since epoch
# * stamp.nsecs: nanoseconds since stamp_secs
# time-handling sugar is provided by the client library
time stamp
#Frame this data is associated with
# 0: no frame
# 1: global frame
string frame_id

&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>

MSG: geometry_msgs/Quaternion
# This represents an orientation in free space in quaternion form.

float64 x
float64 y
float64 z
float64 w

&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>&lt;tt>=&lt;/tt>

MSG: geometry_msgs/Vector3
# This represents a vector in free space.

float64 x
float64 y
float64 z&lt;/pre>

&lt;div class='p'>What is surely wrong is &lt;i>SIZE 587869811&lt;/i> &amp;hellip; but it is probably my mistake &amp;mdash;
I did not read all of the &lt;i>SIZE 2329&lt;/i> bytes &amp;hellip; I guess. It should be clear from a
replay of the log file.&lt;/div>
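
&lt;div class='p'>The usual fix is to loop until all the announced bytes arrive, because &lt;i>recv()&lt;/i> may return less than requested; something like this helper (the name is mine):&lt;/div>

&lt;pre>def recv_all(soc, size):
    # recv() may return fewer bytes than requested, so loop until 'size' bytes
    buf = b''
    while len(buf) &lt; size:
        chunk = soc.recv(size - len(buf))
        if not chunk:
            raise IOError('connection closed after %d of %d bytes'
                          % (len(buf), size))
        buf += chunk
    return buf&lt;/pre>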

&lt;div class='p'>Confirmed: in the log file it looks OK.
&lt;a href='https://github.com/robotika/husky/commit/ab5ef8e502b03687c0b801d6fa8ba8c662a8eacc' class='external'>Here&lt;/a>
is the parsing of the IMU data (hard coded, for now). And here are some outputs:&lt;/div>

&lt;pre>22227 1408728589 709250449 8
imu_link
(-0.007217399499999999, -0.7315086163, -0.6816246364999999, -0.015408308699999998)

22228 1408728589 760128957 8
imu_link
(-0.0073516766999999995, -0.7315086163, -0.6816246364999999, -0.015475447299999999)

22229 1408728589 811034145 8
imu_link
(-0.007586661799999999, -0.7314414776999999, -0.6816582057999999, -0.015676863099999997)&lt;/pre>

&lt;div class='p'>It is nice that every message contains a sequence number and a timestamp. What I
am not sure about is &lt;i>frame_id&lt;/i>, in particular with the comment &lt;i>0: no frame, 1: global
frame&lt;/i>, when I received the string &lt;b>imu_link&lt;/b> &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140827">&lt;/a>&lt;/div>

&lt;h2>27th August 2014 &amp;mdash; Hello ROS Family&lt;/h2>

&lt;div class='p'>OK, now I have hacked the whole ROS family: subscriber, publisher and XMLRPC server
(for details see
&lt;a href='https://github.com/robotika/husky/commit/2b236a0ec02756a917b3e593893f2a3dc51119b4' class='external'>github&lt;/a>).
It is still quite dirty, but I can receive the hello string and send it. The next
step would be to finish the parsing of the other Husky messages and try to publish a
command to move. To be honest I am not very happy about it &amp;mdash; I am thinking
about a crazy hybrid, where sensors would be taken from ROS while I would talk to
the platform directly, logging all the &lt;i>currently unnecessary data&lt;/i>, like
the currents and temperatures of both motors.&lt;/div>

&lt;div class='p'>p.s. I did not find any overall documentation of the ROS Husky messages &amp;hellip; so
probably the simplest thing is to look for the description files directly on the
platform.&lt;/div>

&lt;div class='p'>p.s.2 &lt;b>Husky moved!&lt;/b> with pseudo-ROS-Python code &lt;span class='smile'>&lt;/span>. I searched for
&lt;i>*.msg&lt;/i> in the &lt;i>/org/ros/hydro&lt;/i> directory. There are plenty of them and the one
I needed was &lt;i>geometry_msgs/Twist&lt;/i>, which contains two vectors: one for linear
and one for angular velocity. I am not sure if it is good to commit another
IP-comment-hacking patch &amp;hellip; but without that it would not be clear what I did
&amp;hellip; so here is another bit of
&lt;a href='https://github.com/robotika/husky/commit/752dcc01f9d778e738234217be7b2d81a14ea3ef' class='external'>tunnel
digging&lt;/a> into the Husky ROS code.&lt;/div>
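
&lt;div class='p'>For the record, the &lt;i>Twist&lt;/i> payload itself is just six little-endian float64 values (linear x, y, z and angular x, y, z) with the usual 4-byte message length prefix; a sketch of the serialization (the helper name is mine):&lt;/div>

&lt;pre>import struct

def pack_twist(forward_speed, angular_speed):
    # linear (x, y, z) and angular (x, y, z) as little-endian float64,
    # prefixed by the total message length
    payload = struct.pack('&lt;6d', forward_speed, 0.0, 0.0, 0.0, 0.0, angular_speed)
    return struct.pack('&lt;I', len(payload)) + payload

# pack_twist(0.1, 0.0) ... slow forward motion, no turning&lt;/pre>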

&lt;hr/>

&lt;div class='p'>&lt;a id="140829">&lt;/a>&lt;/div>

&lt;h2>29th August 2014 &amp;mdash; node.py&lt;/h2>

&lt;div class='p'>I am currently trying to put together all the bits I learned from the &lt;i>/hello&lt;/i> experiments
and create a &lt;b>ROS node&lt;/b> which will subscribe to several topics and
publish at least one (to send the drive command) &amp;mdash; see
&lt;a href='https://github.com/robotika/husky/blob/master/ros/node.py' class='external'>node.py&lt;/a>.
&lt;a href='https://github.com/robotika/husky/commit/5f7c867dadcaf712d4676b61e041c2a2de3eadff' class='external'>At
the moment&lt;/a> it logs a couple of Husky ROS topics, each communication into a
separate file, and it also creates a &lt;i>metalog&lt;/i> recording the order of all received
messages.&lt;/div>

&lt;div class='p'>I decided to use a &lt;i>stupid single-threaded solution&lt;/i> (based on the
&lt;a href='http://en.wikipedia.org/wiki/KISS_principle' class='external'>KISS principle&lt;/a>). Note that
in reality there is a second thread handling the XMLRPC server, but I will ignore it
for the moment. The timeouts are set to 0.0, so it is actively polling all TCP
connections &amp;mdash; not nice, but the machine has to do something anyway. I may
change it later to a couple of milliseconds. If you read the Python documentation you
would hope for a &lt;i>socket.timeout&lt;/i> exception. Well, I was also naive &amp;hellip; you
get &lt;i>socket.error&lt;/i> instead and it is different on Windows and Linux. It is this code:&lt;/div>

&lt;pre>try:
    data = self.readFn( 4096 )
except socket.timeout as e:
    print e # it should contain partial data            
except socket.error as (errno, errStr):
    assert errno in [10035,11], (errno, errStr) 
       # Windows 'A non-blocking socket operation could not be completed immediately'
       # Linux (11, 'Resource temporarily unavailable')
    data = ""&lt;/pre>

&lt;div class='p'>As I see it now, it is surely wrong, because &lt;i>socket.timeout&lt;/i> does not handle the
received data. The timeout exception may occur after a couple of bytes have already been read, and there is
no other way to pass them to &lt;i>data&lt;/i>. But I have not hit the timeout yet (OK,
to be sure I added an assert
&lt;a href='https://github.com/robotika/husky/commit/2b83dfa2a62383129f3ece5a5b51673c6e914eb4' class='external'>there&lt;/a>).
Now I would guess that you either get some bytes, i.e. no timeout, or there is
nothing ready and thus you get &lt;i>socket.error&lt;/i>.&lt;/div>

&lt;div class='p'>Other than that it looks fine, except that there is a partially expected issue with
buffering: when I run &lt;i>node.py&lt;/i> remotely from Win7 the metalog looks
something like this:&lt;/div>

&lt;pre>&amp;hellip;
/husky/data/encoders
/husky/data/encoders
/husky/data/safety_status
/husky/data/safety_status
/imu/data
/imu/data
/imu/data
/imu/data
/husky/data/power_status
/husky/data/encoders
/husky/data/encoders
/husky/data/safety_status
/husky/data/safety_status
/imu/data
/imu/data
&amp;hellip;&lt;/pre>

&lt;div class='p'>but on Husky it is fine:&lt;/div>

&lt;pre>&amp;hellip;
/imu/data
/husky/data/encoders
/imu/data
/husky/data/safety_status
/husky/data/power_status
/imu/data
/husky/data/encoders
/imu/data
/husky/data/safety_status
/imu/data
/husky/data/encoders
/imu/data 
&amp;hellip;&lt;/pre>

&lt;hr/>

&lt;div class='p'>&lt;a id="140902">&lt;/a>&lt;/div>

&lt;h2>2nd September 2014 &amp;mdash; replay metalog&lt;/h2>

&lt;div class='p'>Finally! Last week I made a stupid mistake and could not find it &amp;mdash; I forgot
to call &lt;i>self.master.registerPublisher&lt;/i> in order to get the contacts of the clients.
This is now fixed and &lt;b>node.py&lt;/b> also provides simple topic publishing
(see the
&lt;a href='https://github.com/robotika/husky/commit/55467c94ee96adf9cf1410141d70ec9cc8d4a2d8' class='external'>github
diff&lt;/a>).&lt;/div>

&lt;div class='p'>There is also a
&lt;a href='https://github.com/robotika/husky/commit/f5de25143437301d0e01445352f8fbb498bfcbb2' class='external'>newer
version&lt;/a>, where &lt;i>node.py&lt;/i> accepts the name of a metalog (instead of master and
node IPs) and then you can replay the whole run. This morning it was tested only on
publishing &lt;i>/hello&lt;/i> and subscribing to &lt;i>/hello&lt;/i>, so I am looking forward to trying it
on Husky (with publishing the velocity command).&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140903">&lt;/a>&lt;/div>

&lt;h2>3rd September 2014 &amp;mdash; Heartbeat&lt;/h2>

&lt;div class='p'>Today I moved Husky a couple of centimeters using &lt;i>NodeROS.update()&lt;/i> (see
&lt;a href='https://github.com/robotika/husky/commit/daa18bee91ca0bd76e1c86900b679b79eb48b7fe' class='external'>code diff&lt;/a>).
Sure enough there were some bugs, and one of them broke 100%
reproducibility, i.e. a major bug (later found in the &lt;i>update()&lt;/i> function).&lt;/div>

&lt;div class='p'>The older version of &lt;i>update()&lt;/i> checks all subscribers and repeats that in a
cycle until at least one sends data. The problem is that sometimes only one
subscriber sends data while the next time it can be, for example, three subscribers.
And it is not clear from the metalog which update got which topics
updated. It is now fixed with an extra '&amp;mdash;' separator &amp;hellip; not very scientific,
but it is clear what is going on. The metalog now looks like this:&lt;/div>

&lt;pre>_imu_data140903_064426.log
_husky_data_encoders140903_064426.log
_husky_data_power_status140903_064426.log
_husky_data_safety_status140903_064426.log
_joy140903_064426.log
_husky_data_system_status140903_064427.log
_husky_cmd_vel140903_064427.log
/husky/cmd_vel
/imu/data
/husky/data/encoders
/husky/data/power_status
&amp;mdash;
/husky/cmd_vel
/imu/data
/husky/data/safety_status
/husky/data/encoders
&amp;mdash;
/husky/cmd_vel
/imu/data
/husky/data/safety_status
/husky/data/encoders
&amp;mdash; 
&amp;hellip;&lt;/pre>

&lt;div class='p'>And what is the &lt;b>heartbeat&lt;/b> from today's title? Well, not all topics are
equal and not all of them are frequent and regular. I need one which is
regular and which will &lt;i>dictate the robot's control frequency&lt;/i>. In the case of
Husky it is &lt;i>/husky/data/encoders&lt;/i> &amp;mdash; &lt;i>update()&lt;/i> is not finished until
at least one encoder update is received.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140904">&lt;/a>&lt;/div>

&lt;h2>4th September 2014 &amp;mdash; Joyride&lt;/h2>

&lt;div class='p'>I started a new class
&lt;a href='https://github.com/robotika/husky/blob/master/ros/huskyros.py' class='external'>HuskyROS&lt;/a> to
experiment with how difficult it is to use my „node”. One of the first experiments
was to drive forward while you hold the GREEN button and terminate the program when you
press RED (see
&lt;a href='https://github.com/robotika/husky/commit/e6a6e942bddfd1e8f27f6c51a030b96db6abc131' class='external'>code&lt;/a>).&lt;/div>

&lt;div class='p'>It moved but it was shaking a lot. Why? Well, on boot up there is another
colliding process &lt;i>husky_joystick&lt;/i> &amp;hellip; so the first step was to kill it.&lt;/div>

&lt;div class='p'>Now I was &lt;i>seriously&lt;/i> moving &amp;mdash; instead of a 10cm test in the bedroom it moved 2m
in the entrance hall &lt;span class='smile'>&lt;/span>. But it was not turning &amp;hellip; so I changed the ROS
message&lt;/div>

&lt;pre># This expresses velocity in free space broken into its linear and angular parts.
Vector3  linear
Vector3  angular&lt;/pre>

&lt;div class='p'>to this:&lt;/div>

&lt;pre>return struct.pack("dddddd", speed,0,0, angularSpeed,angularSpeed,angularSpeed)&lt;/pre>

&lt;div class='p'>&lt;span class='smile'>&lt;/span> &amp;hellip; then it turned for sure. The correct solution is on
&lt;a href='https://github.com/robotika/husky/commit/0a843fd2d3996076ee4c6a2f5be5c2c66d895629' class='external'>github&lt;/a>.&lt;/div>
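
&lt;div class='p'>For reference, the message is six doubles, linear (x, y, z) followed by angular (x, y, z),
so for a differential drive only linear.x and angular.z should be non-zero. A minimal sketch of the
packing (see the github diff above for the real fix):&lt;/div>

&lt;pre>import struct

def pack_twist(speed, angular_speed):
    # geometry_msgs/Twist payload: linear (x, y, z) followed by angular (x, y, z)
    return struct.pack("dddddd", speed, 0.0, 0.0, 0.0, 0.0, angular_speed)&lt;/pre>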

&lt;hr/>

&lt;div class='p'>&lt;a id="140907">&lt;/a>&lt;/div>

&lt;h2>7th September 2014 &amp;mdash; Quaternions and loose wheel/axis&lt;/h2>

&lt;div class='p'>What I like about robotics is that it is quite varied. One moment you try to learn
quaternions and the very next moment you have to deal with a broken wheel. It is
not only computer science and it is also not only mechanics &amp;mdash; it is a bit
of everything. &lt;span class='smile'>&lt;/span> So let's start from the beginning &amp;hellip;&lt;/div>

&lt;div class='p'>This weekend I finally took Husky for outdoor testing. I started with two
simple functions, &lt;i>goStraight()&lt;/i> and &lt;i>turn()&lt;/i>. And I ended there too. For
the moment I ignore signs, so it is always going forward and always turning
left. The data from the encoders, when going straight, were reasonable &amp;mdash; I took
the average of the left and right traveled distances, and moved slowly (0.1 m/s) so I
ignored speed ramps &amp;hellip; and maybe the system takes care of that??&lt;/div>

&lt;div class='p'>The troubles came as soon as I started turning. I did not try that much at home
(you know, carpet), and even in the garden it made "nice" holes in the lawn. At
first I tried to turn 90 degrees based on odometry data. I was quite naive. It
is basically the difference of the left and right traveled distances, but to get
the angle you also need the robot width &amp;hellip; or better, the distance between the left and
right steering wheels. Husky has 4 wheels, 2+2 connected with a chain, so they
have to skid. And they do. So once I set &lt;i>WIDTH&lt;/i> to the measured 55cm it turned
only approximately 40 degrees (instead of 90) [it was probably skidding
diagonally so the effective distance could be 1m instead].&lt;/div>
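
&lt;div class='p'>The underlying formula is just the difference of the traveled distances divided by the
wheel base (a quick sketch; for a skid-steer platform the effective &lt;i>WIDTH&lt;/i> is clearly larger
than the measured 55cm):&lt;/div>

&lt;pre>import math

WIDTH = 0.55  # measured distance between the left and right wheels [m]

def heading_change(dist_left, dist_right, width=WIDTH):
    "Heading change in radians estimated from the left/right traveled distances."
    return (dist_right - dist_left) / width

# wheels traveling 0.432m in opposite directions should give a 90 degree turn
print(math.degrees(heading_change(-0.432, 0.432)))  # ~90.0&lt;/pre>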

&lt;div class='p'>Then there was a much more serious problem: the left front wheel looked a bit
(approximately 1cm) pulled off and you could wiggle it a little. It was not so big a
deal when going straight, but it was a stopper for turning. There was no other
way than to open the robot.&lt;/div>

&lt;div class='p'>I must say that the Husky robot is very nicely done. An extra plus was that all I
needed was a set of hexagonal wrenches. And the problem? It looks like the
lock ring was missing on that particular wheel axle and thus it was no longer seated in the
bearing. It is possible that I did something wrong when loading/unloading
the robot to/from the car, but &amp;hellip;  it should be &lt;i>a bit industrial&lt;/i> and
survive.  There was no lock ring lying around, so it is possible that it had been missing
for a long time &amp;hellip;&lt;/div>

&lt;div class='p'>Quaternions &amp;mdash; it has been quite a while since I worked with them, and I have forgotten
everything. Why do I need them? Well, the robot orientation in &lt;i>/imu/data&lt;/i> is
defined by a quaternion. If you do not know what that is then have a look at
wikipedia: &lt;a href='http://en.wikipedia.org/wiki/Quaternion' class='external'>Quaternion&lt;/a> and
&lt;a href='http://en.wikipedia.org/wiki/Conversion_between_quaternions_and_Euler_angles' class='external'>conversion
to Euler angles&lt;/a>. After some messing around (you can tell from the commented
lines in the IMU code) I found the magic formula:&lt;/div>

&lt;pre>self.imu =  math.atan2(2*(q0*q1+q2*q3), 1-2*(q1*q1+q2*q2)) # along X axis&lt;/pre>
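
&lt;div class='p'>For completeness, the standard conversion (with q0=w, q1=x, q2=y, q3=z) gives all
three Euler angles; the first line is the one used above (a generic sketch, not the exact IMU code):&lt;/div>

&lt;pre>import math

def quaternion_to_euler(q0, q1, q2, q3):
    "Convert a quaternion (q0=w, q1=x, q2=y, q3=z) to (roll, pitch, yaw) in radians."
    roll = math.atan2(2*(q0*q1 + q2*q3), 1 - 2*(q1*q1 + q2*q2))    # along X axis
    pitch = math.asin(max(-1.0, min(1.0, 2*(q0*q2 - q3*q1))))      # along Y axis
    yaw = math.atan2(2*(q0*q3 + q1*q2), 1 - 2*(q2*q2 + q3*q3))     # along Z axis
    return roll, pitch, yaw&lt;/pre>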

&lt;div class='p'>After that Husky turned close to 90 degrees once, but like 130 degrees the next
time. It is probably the integration of compass, gyros and accelerometers &amp;hellip; but I
am not sure.&lt;/div>

&lt;div class='p'>My last experiment was with &lt;b>/imu/mag&lt;/b>, which is really the direct source I
want.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 610px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/husky/husky-compass.png' alt='Compass readings for more than one complete rotation' title='Compass readings for more than one complete rotation' class='border'  width='604' height='345'/>&lt;/span>&lt;br/>
&lt;span>Compass readings for more than one complete rotation&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>There is one &lt;i>outlier reading&lt;/i>, but other than that it looks almost circular
and the center is close to (0,0). Some calibration/transformation would improve it,
but it could be good enough for the first record &amp;amp; replay.&lt;/div>
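
&lt;div class='p'>The plan for using it is simple (a sketch only; the offsets below are placeholders for a
proper calibration of the circle center):&lt;/div>

&lt;pre>import math

# center of the circle from the plot above; (0, 0) is just a placeholder guess
OFFSET_X, OFFSET_Y = 0.0, 0.0

def mag_heading(mx, my, offset_x=OFFSET_X, offset_y=OFFSET_Y):
    "Heading in radians computed from the horizontal magnetometer components."
    return math.atan2(my - offset_y, mx - offset_x)&lt;/pre>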

&lt;div class='p'>After that Husky stopped talking to me. The weather was very nice today, so
maybe the processor case was getting too hot?? Maybe there is another ROS topic
describing it??&lt;/div>

&lt;div class='p'>One more note: setting the velocity command to (0,0) actually switches to neutral
and the robot will start to move if it is on a small slope. So is there some
command for active braking?&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/husky/long-axis.jpg'>&lt;img src='/robots/husky/long-axis_t.jpg' alt='The lock ring should be from inside!' title='The lock ring should be from inside!' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/husky/long-axis.jpg'>The lock ring should be from inside!&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/husky/husky-opened.jpg'>&lt;img src='/robots/husky/husky-opened_t.jpg' alt='Opened front of Husky' title='Opened front of Husky' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/husky/husky-opened.jpg'>Opened front of Husky&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/husky/test-field.jpg'>&lt;img src='/robots/husky/test-field_t.jpg' alt='Test field' title='Test field' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/husky/test-field.jpg'>Test field&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140911">&lt;/a>&lt;/div>

&lt;h2>11th September 2014 &amp;mdash; Goodbye Husky!&lt;/h2>

&lt;div class='p'>Husky went home last night, back to France. There are still some
unresolved issues, like the snap ring for the front left wheel, but there was
some progress anyway. I was in touch with Martin from Clearpath Robotics and I
learned that there is a jumper on the motor controller which defines the default
behavior: "coast" mode or "brake" mode. Our Husky had it in "coast" mode, so
when I sent the command (0,0) the robot would roll downhill by itself.&lt;/div>

&lt;div class='p'>It is necessary to open both the front and back cover in order to change the
jumpers. I did that and found out that one connector for temperature
measurements (my guess, a 3-PIN connector with a resistor loop) was loose. I did
not know how to put it back, so I started to open the front cover again (it is 8
screws + 2 big ones for the bumper). In the middle of the operation I realized
that I had taken a picture of the motor controller before I changed the jumper, so I
could copy that:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/husky/motor-controller.jpg'>&lt;img src='/robots/husky/motor-controller_t.jpg' alt='Jumper and 3-PIN cable on motor controller' title='Jumper and 3-PIN cable on motor controller' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/husky/motor-controller.jpg'>Jumper and 3-PIN cable on motor controller&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The bad news is that there is no way to control braking from software.
You have to choose. If you are in terrain, braking is surely the choice &amp;mdash;
I would rather carry Husky uphill, because without the brake it could end in a fatal
disaster. On the other hand, if you just want to push the robot around, like
when you load it into the car, you want "coast" mode. Note that pushing will
generate enough power that the robot will start braking (even if you remove the
battery and it is in E-stop mode).&lt;/div>

&lt;div class='p'>What else? Probably the PrimeSense sensor. Sylvio bought it a year ago, but
then the company was bought by Apple (see
&lt;a href='http://ros-users.122217.n3.nabble.com/Confirmation-on-primesense-td4020570.html' class='external'>the
discussion of upset users&lt;/a>). The websites &lt;b>www.primesense.com&lt;/b> and
&lt;b>openni.org&lt;/b> are no longer working. There is
&lt;a href='https://github.com/PrimeSense/Sensor' class='external'>source code on github&lt;/a>, but it probably
won't be supported any more (click on the title link, for example).&lt;/div>

&lt;div class='p'>ROS was supposed to support Kinect and PrimeSense, but after reading several
websites and almost corrupting the robot installation I gave up. No ROS. An
alternative seems to be the new OpenNI 2.0 driver, available in beta stage:
&lt;a href='http://structure.io/openni' class='external'>http://structure.io/openni&lt;/a>. When I downloaded it for Windows it worked
immediately. Also on Linux I could use NiViewer, but Husky does not have a
monitor so you get &lt;i>freeglut (./NiViewer): failed to open display&lt;/i> &amp;hellip; note
that this was already good news! The others were like
&lt;i>/opt/ros/hydro/lib/openni_camera/openni_node: symbol lookup error:
/opt/ros/hydro/lib/libnodeletlib.so: undefined symbol:
_ZN3ros7console5printEPNS0_10FilterBaseEPvNS0_6levels5LevelEPKciS7_S7_z&lt;/i> or
&lt;i>Open failed: USB interface is not supported!&lt;/i>. One important choice is 32
vs. 64bit installation &amp;mdash; the other one will not work (Husky required 64bit,
Sylvio's notebook 32bit).&lt;/div>

&lt;div class='p'>OK, so theoretically I was able to run some program, but how do I get the images
and depth data? It turned out that there is a small Python wrapper already
available: &lt;a href='https://pypi.python.org/pypi/primesense' class='external'>https://pypi.python.org/pypi/primesense&lt;/a>, and although the short
example had a small mistake (&lt;i>get_sensor_info()&lt;/i> requires some parameter,
probably a sensor index), I got the first Husky depth data &lt;span class='smile'>&lt;/span>. If you need the image use
&lt;i>dev.create_color_stream()&lt;/i>. Now I have two raw arrays as the last memory of
Husky &amp;hellip;&lt;/div>
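
&lt;div class='p'>For the record, reading one depth frame with the wrapper looks roughly like this
(a minimal sketch based on the wrapper's example; exact calls may differ slightly between versions):&lt;/div>

&lt;pre>from primesense import openni2

openni2.initialize()                      # loads OpenNI2.dll / libOpenNI2.so
dev = openni2.Device.open_any()

depth_stream = dev.create_depth_stream()  # dev.create_color_stream() for images
depth_stream.start()

frame = depth_stream.read_frame()
raw = frame.get_buffer_as_uint16()        # depth values in millimeters

depth_stream.stop()
openni2.unload()&lt;/pre>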

&lt;div class='p'>Is this „The End”? No, this is just the beginning! A friend of mine added an ssh
tunnel to the start-up script, so whenever Husky is connected to the internet I
can work on it. [Yes, I did not mention this detail &amp;mdash; you can plug an Ethernet
cable into the WiFi router and then you get both: the old Access Point and the
robot connected to the Internet]. So that was the motivation to get the
PrimeSense working. Now I can get at least some reference pictures of what
is/was going on 1000km away &amp;hellip; could be fun &amp;hellip; or a nightmare &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150203">&lt;/a>&lt;/div>

&lt;h2>3rd February 2015 &amp;mdash; Husky at CZU&lt;/h2>

&lt;div class='p'>Today I finally spent the afternoon playing with Sylvio's robot Husky. It is
now parked at the university (&lt;a href='http://czu.cz/' class='external'>CZU&lt;/a> &amp;mdash; Czech University of
Life Sciences Prague) for the winter (?). The hardest part was to recover my
almost half-a-year-old memories, even with the help of this blog &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>Question: where is the starting point, what was the last used script?&lt;/div>

&lt;div class='p'>Answer: the best version worked with ROS, i.e.
&lt;a href='https://github.com/robotika/husky/blob/master/ros/huskyros.py' class='external'>ros/huskyros.py&lt;/a>.
It has two parameters (IP of the ROS master and ROS client).&lt;/div>

&lt;div class='p'>Question: how to get images/depth from PrimeSense?&lt;/div>

&lt;div class='p'>Answer: this part was not on github but was still left on Husky. There is a
Python wrapper and I was able to get the data with the initial version of
&lt;a href='https://github.com/robotika/husky/blob/master/openni2/getpic.py' class='external'>openni2/getpic.py&lt;/a>.&lt;/div>

&lt;div class='p'>Note that there is a bug in the current &lt;i>getpic.py&lt;/i> script. I wanted to use
gzipped data, but if you write an array with 16bit values, write() works for a
normal file but not for gzip. TODO. Now I have collected data, but the reader rejects
them because the file size does not match the real size:&lt;/div>

&lt;pre>File "C:\Python27\lib\gzip.py", line 349, in _read_eof
    raise IOError, "Incorrect length of data produced"
IOError: Incorrect length of data produced&lt;/pre>

&lt;div class='p'>And how was Husky driving? Well, not much, and very shaky. Why? I realized it
when I was going home &amp;hellip; there was another controller running since boot-up and
I forgot to kill it. Next time, hopefully with processing of the depth data.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150210">&lt;/a>&lt;/div>

&lt;h2>10th February 2015 &amp;mdash; Simple Depth Viewer&lt;/h2>

&lt;div class='p'>Yesterday I completed a simple depth viewer,
&lt;a href='https://github.com/robotika/husky/blob/master/openni2/viewer.py' class='external'>openni2/viewer.py&lt;/a>.
In particular I fixed reading the 16bit distance array. This simple script displays
the raw data from PrimeSense. Whether the file contains depth data or
color data is decided by its size only: a 320x240 array is expected, and depth has 2
bytes per pixel while color has 3 bytes. Some mirroring and color swapping was
necessary. Also note that the depth in millimeters is divided by 20 and the
uint8 value is used in a gray scale image (i.e. resolution 2cm and max distance
5m). Finally, the value 0 probably means missing information, so it is ignored for the
minimal distance.&lt;/div>
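
&lt;div class='p'>In numpy terms the conversion is a one-liner (a sketch, not the exact &lt;i>viewer.py&lt;/i> code):&lt;/div>

&lt;pre>import numpy as np

def depth_to_gray(depth_mm):
    """Convert a uint16 depth array in millimeters to a uint8 grayscale image.

    Division by 20 gives 2cm resolution; everything beyond ~5m saturates at 255.
    """
    return np.clip(depth_mm // 20, 0, 255).astype(np.uint8)&lt;/pre>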

&lt;div class='p'>&lt;table class='image_panel center' style='width: 343px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/husky/depth-programmer.jpg' alt='Depth auto-portrait' title='Depth auto-portrait' class='border'  width='337' height='279'/>&lt;/span>&lt;br/>
&lt;span>Depth auto-portrait&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>There is also a fix for writing gzipped data &amp;mdash; I could reproduce the bug (?) with
a numpy 16bit array (see
&lt;a href='https://github.com/robotika/husky/commit/8991eaded9cb3fb1af0a13db3f6e647eb7ef3251' class='external'>diff&lt;/a>),
and then fix it with an array of bytes
(&lt;a href='https://github.com/robotika/husky/commit/aaa38b7e77ba09ff3398a5863b1ad4f30c93d204' class='external'>diff&lt;/a>).
I hope that this will help today &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>The next step is to integrate this distance sensor with driving. A sketch of
&lt;a href='https://github.com/robotika/husky/blob/master/ros/followme.py' class='external'>followme.py&lt;/a>
is prepared for collecting some real data. This robot behaviour was very popular on
robot &lt;a href='/robots/eduro/en'>Eduro&lt;/a> with its laser scanner &amp;mdash; see the
&lt;a href='https://www.youtube.com/watch?v=c-SXdDgbi0s' class='external'>video&lt;/a> or the source code
&lt;a href='https://github.com/robotika/eduro/blob/master/followme.py' class='external'>eduro/followme.py&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150211">&lt;/a>&lt;/div>

&lt;h2>11th February 2015 &amp;mdash; To the Wall&lt;/h2>

&lt;div class='p'>There was a big contrast between the afternoon and the evening. I exchanged a few
mails with Ruud about a 100-node ROS system and later that day I was trying to
get Husky to move with the PrimeSense sensor (a few ROS nodes). While the OpenNI2
driver with a viewer is working fine and fast on Win7, I was not able to get the
ROS node working half a year ago. I could try an upgrade, but I would not be sure
that it would work at all after that. In the end I was glad for the at least
slowly working sample in Python. Sigh.&lt;/div>

&lt;div class='p'>So where are we now? Husky moves and collects pictures and depth data from the
PrimeSense sensor (see
&lt;a href='https://github.com/robotika/husky/commit/8190af39aa125ce51604d4806741b3318baba125' class='external'>diff&lt;/a>).
If there is an obstacle within 1 meter it stops. The motion is very slow
(0.1m/s) because the data collection is also extremely slow. I hacked the logging
of the two threads' communication (in this sense Ruud is quite right that Python is
not a robotic middleware and I do not have clean ready-to-go classes for
communication, logging, replay etc.) and it takes on average 30 cycles = 3
seconds!? Well, but at least we have started.&lt;/div>

&lt;div class='p'>Surprises? Let me ignore my stupidity with the typo in the Husky IP address at the
beginning (it has to be on the fixed network 192.168.1.x and I wrote 192.162.1.x,
so no wonder it did not want to connect). There was another IP issue: if I ran
HuskyROS with parameters &lt;i>localhost localhost&lt;/i> it worked, but not for
&lt;i>followme.py&lt;/i>!? Desperately I tried &lt;b>127.0.0.1&lt;/b> and that worked in both
cases &amp;mdash; no idea why :-(.&lt;/div>

&lt;div class='p'>Finally when I was driving in the school corridor the distance estimation
sometimes crashed:&lt;/div>

&lt;pre>Exception in thread Thread-2:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 551, in __bootstrap_inner
    self.run()
  File "./followme.py", line 32, in run
    self.minDist = centerArea[mask].min()/1000.0
ValueError: zero-size array to minimum.reduce without identity&lt;/pre>

&lt;div class='p'>This was easy to fix. I used a mask for non-zero numpy elements (I was glad that
numpy is installed on the Husky machine), and if there was too much free space it
was looking for the minimum of an empty array. The same fix was also necessary in the
viewer
(&lt;a href='https://github.com/robotika/husky/commit/be2fb9d224f00356e516322ce32af225523d9c6f' class='external'>diff&lt;/a>).&lt;/div>
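
&lt;div class='p'>The fix boils down to checking that the mask selects anything before calling min()
(a sketch of the idea):&lt;/div>

&lt;pre>import numpy as np

def min_distance(center_area):
    "Minimal distance in meters within the ROI, or None if there is no valid reading."
    mask = center_area > 0                 # zero depth = missing information
    if not mask.any():                     # too much free space, nothing to measure
        return None
    return center_area[mask].min() / 1000.0&lt;/pre>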

&lt;div class='p'>&lt;table class='image_panel left' style='width: 326px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/husky/jakub.jpg' alt='Jakub via PrimeSense' title='Jakub via PrimeSense' class='border'  width='320' height='240'/>&lt;/span>&lt;br/>
&lt;span>Jakub via PrimeSense&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 326px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/husky/jakub-dist.jpg' alt='Distance image' title='Distance image' class='border'  width='320' height='240'/>&lt;/span>&lt;br/>
&lt;span>Distance image&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>TODO list:&lt;/div>

&lt;ul>
&lt;li>sync depth and color data collection&lt;/li>

&lt;li>try Process instead of Thread&lt;/li>

&lt;li>detect human and floor&lt;/li>

&lt;li>navigate to free space&lt;/li>

&lt;li>follow human&lt;/li>

&lt;li>outdoor test&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;div class='p'>&lt;a id="150212">&lt;/a>&lt;/div>

&lt;h2>12th February 2015 &amp;mdash; depth_stream.start()&lt;/h2>

&lt;div class='p'>I feel a little bit stupid now. I was convinced that the 32-bit OpenNI2 driver did
not work on my laptop, but &amp;hellip; did I really try that? Was I not dealing with
Linux on Husky?? Python 2.7 32bit did not work with a 64bit DLL, which is not
surprising. When I ran 64bit Python 2.7, the
&lt;a href='https://github.com/robotika/husky/blob/master/openni2/getpic.py' class='external'>getpic.py&lt;/a>
worked without any modification.&lt;/div>

&lt;div class='p'>Now I have also installed the 32bit version from
&lt;a href='http://structure.io/openni' class='external'>structure.io/openni&lt;/a> and it still did not work
&amp;hellip; for a while. And now it works. Both the 32bit and the 64bit version. I
believe that Windows did not release &lt;b>OpenNI2.dll&lt;/b> from memory
(???), so I was still using the 64bit version even though the file had already been replaced.&lt;/div>

&lt;div class='p'>And it works as slow as on Linux &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>So my apologies to the Linux driver &amp;mdash; it is &lt;b>depth_stream.start()&lt;/b> which is
very slow. Once you start it and only grab data with
&lt;b>depth_stream.read_frame()&lt;/b> then it is much faster (approx 10Hz). Here is the
&lt;a href='https://github.com/robotika/husky/commit/e444701aedca64165562feba52e231b81b1924d6' class='external'>diff&lt;/a>.&lt;/div>

&lt;div class='p'>If you would like to read a bit about the sensor technology have a look
at &lt;a href='http://stomatobot.com/primesense-3dsensor-ir-stream/' class='external'>this older article&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150224">&lt;/a>&lt;/div>

&lt;h2>24th February 2015 &amp;mdash; followme.py ver0&lt;/h2>

&lt;div class='p'>There is version 0 of
&lt;a href='https://github.com/robotika/husky/blob/master/ros/followme.py' class='external'>followme.py&lt;/a>
(&lt;a href='https://github.com/robotika/husky/commit/e81585704e795a5110f9e8c835d695fdaec79a3d' class='external'>hash&lt;/a>):&lt;/div>

&lt;div class='p'>&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/GX3MSfMEuTM?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;div class='p'>The code is relatively simple. It scans the depth image at approximately 3Hz,
selects the nearest point in a given ROI and turns in that direction with speed
proportional to the distance.&lt;/div>
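
&lt;div class='p'>In code the behaviour can be summarized roughly like this (a simplified sketch with
made-up gains and thresholds, assuming the &lt;i>setSpeedPxPa()&lt;/i> interface of &lt;i>HuskyROS&lt;/i>,
not the exact &lt;i>followme.py&lt;/i>):&lt;/div>

&lt;pre>import math
import numpy as np

def follow_step(depth_roi, robot, max_speed=0.3):
    """One control step: turn toward the nearest point in the ROI and drive
    forward with speed proportional to its distance (gains are made up)."""
    depth = depth_roi.astype(float)
    depth[depth == 0] = np.inf              # zero means missing data, ignore it
    if not np.isfinite(depth).any():
        robot.setSpeedPxPa(0.0, 0.0)        # nothing detected, stop
        return
    row, col = np.unravel_index(depth.argmin(), depth.shape)
    dist = depth[row, col] / 1000.0         # millimeters to meters
    # nearest point in the left half of the image -> turn left, otherwise right
    turn = math.radians(10) if col &lt; depth.shape[1] // 2 else math.radians(-10)
    robot.setSpeedPxPa(min(max_speed, 0.2 * dist), turn)&lt;/pre>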

&lt;div class='p'>There is a TODO for version 1 &amp;mdash; narrow the search area, centering it around the previously
detected nearest object (similar to the Eduro version). You can see that the red
circle (the nearest point) sometimes jumps to the wall or some other nearby objects.&lt;/div>

&lt;div class='p'>p.s. this video was created by
&lt;a href='https://github.com/robotika/husky/commit/16cbaff8f2d4a9a65fc5a5160627ac0f3e09b2d7' class='external'>viewer.py&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150423">&lt;/a>&lt;/div>

&lt;h2>23rd April 2015 &amp;mdash; go.py into sun&lt;/h2>

&lt;div class='p'>It has been a while again. There were school lectures and I was playing with
&lt;a href='/robots/katarina/en'>Katarina&lt;/a>, so hardly any progress with Husky :-(. The weather
was great two days ago, so I finally did some outdoor tests with the PrimeSense
sensor. I wrote a very simple
&lt;a href='https://github.com/robotika/husky/blob/master/ros/go.py' class='external'>go.py&lt;/a> script: if
there is an obstacle on the left, turn right. If the obstacle is on the right,
turn left. If it is very close, STOP, and otherwise go slowly forward.
Here is the code core:&lt;/div>

&lt;pre>if prev is None or prev &lt; safeDist:
    robot.setSpeedPxPa( 0, 0 )
elif prev &lt; safeDist*2:
    # turn in place
    if obstacleDir > 0: # i.e. on the left
        robot.setSpeedPxPa( 0.0, math.radians(-10) )
    else:
        robot.setSpeedPxPa( 0.0, math.radians(10) )
else:
    robot.setSpeedPxPa( maxSpeed, 0 )&lt;/pre>

&lt;div class='p'>It was so simple that it worked within 10 minutes (yes, it was mostly copy and
paste from &lt;i>followme.py&lt;/i>), with logging, so I could travel back and forth along
the corridor and then go outside. I expected that it would not work at
all in the sun, but it was not that bad. The sunlight is surely limiting, though,
which you can see from this short video log file:&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/E0x8rEOYMaU?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;div class='p'>The red circle is the nearest obstacle and when approaching the corner you get
nice oscillations. &lt;span class='smile'>&lt;/span> But other than that it moved fine.&lt;/div>

&lt;div class='p'>The logs from outside in the bright sun are mostly black = no depth
information at all. On the other hand, it was enough if somebody walked close and
cast a shadow: in the shadow the distance was measured.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150513">&lt;/a>&lt;/div>

&lt;h2>13th May 2015 &amp;mdash; Husky &amp;amp; Heidi (first test)&lt;/h2>

&lt;div class='p'>I am preparing a demo for the competition &lt;a href='/competitions/robotem-rovne/en'>Robot go
Straight!&lt;/a>, which will take place in Písek this weekend (16th May 2015). This
time I would like to show a combination with the ARDrone2 &lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a>.
The idea is that Heidi would fly approximately 1 meter above the Husky robot
and stream video as navigation data for the ground platform. The plan is to
reuse the code
&lt;a href='https://github.com/robotika/heidi/blob/master/rr_drone.py' class='external'>heidi/rr_drone.py&lt;/a>
from last year.&lt;/div>

&lt;div class='p'>Yesterday I was finally happy with the navigation above an oriented roundel &amp;mdash; see
&lt;a href='https://github.com/robotika/heidi/blob/master/guide/lesson7.py' class='external'>guide/lesson7.py&lt;/a>.
I plan to change the configuration from the down-pointing camera to the front one
(the ARDrone2 API supports detection of the oriented roundel even if you do not use the
down-pointing camera for video streaming). Heidi's part is ready for
„integration testing”.&lt;/div>

&lt;div class='p'>The troubles were with Husky. I repeated the &lt;i>go.py&lt;/i> test (using the emergency STOP
button to pause the robot) and it rebooted 3 times. The battery was probably on
the edge, but &amp;hellip;  I added &lt;i>battery status&lt;/i>
(&lt;a href='https://github.com/robotika/husky/commit/b2fbb84a10484e7c3f7744b1681ede96e6ddf0af' class='external'>diff&lt;/a>)
so now &lt;i>robot.power&lt;/i> holds the fraction of &lt;i>charge&lt;/i> based on the ROS topic
&lt;i>/husky/data/power_status&lt;/i>. This value started at 0.37, with an
occasional drop to 0.25, but the first computer restart happened at 0.30. In
the second run it rebooted at 0.22, where I expected that it was still safe to
use it. At the moment I am not sure if it was a battery issue &amp;mdash; if yes, I may
have troubles, because the battery does not charge quickly; if no, then I do not
know :-(.&lt;/div>

&lt;div class='p'>My next plan is to properly handle the Emergency STOP. It is OK for stopping, but
if you unblock it and there was some former move command, Husky „jumps” &amp;hellip;
yes, that could be another reason for a reboot, even though in the second case I made
sure that there was an obstacle in front of the robot so it would first send a
(0,0) motion command.&lt;/div>

&lt;div class='p'>The first and only video is not very exciting (Husky did not move), so here is at least
one picture from Heidi's down-pointing camera after takeoff:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/husky/husky-as-landing-zone.jpg'>&lt;img src='/robots/husky/husky-as-landing-zone_t.jpg' alt='Husky as landing zone' title='Husky as landing zone' class='border'  width='320' height='184'/>&lt;/a>&lt;br/>
&lt;a href='/robots/husky/husky-as-landing-zone.jpg'>Husky as landing zone&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150514">&lt;/a>&lt;/div>

&lt;h2>14th May 2015 &amp;mdash; 2 days to RR2015&lt;/h2>

&lt;div class='p'>Well, it will be hard.
&lt;a href='http://www.kufr.cz/view.php?nazevclanku=prehled-tymu-roro15&amp;amp;cisloclanku=2015010004' class='external'>Robotem
Rovne 2015&lt;/a> is going to start in 48 hours and I do not have a properly working
version 0. Moreover, Husky was making some bad jokes yesterday: when I started
&lt;i>go.py&lt;/i> with eStop pressed, after unblocking it rebooted the internal PC?!
Repeatedly (3x). It worked when eStop was not pressed, but then it „jumped”
after release. I added a small pause in the code so that a 0 velocity command is sent even
after eStop is released
(&lt;a href='https://github.com/robotika/husky/commit/00475c9cc5ac6f40947b9ad548d3149d7381ee1e' class='external'>diff&lt;/a>),
but it is hard to tell if that helped (I have Husky at home and there is really
no space to move around). I was so desperate that I wrote to CPR support and I
would like to thank Martin C. for his quick response &lt;span class='smile'>&lt;/span>. If nothing else it
was good that there is somebody I can talk to about the issue &amp;hellip;&lt;/div>

&lt;div class='p'>&lt;i>...the commands received during estop are not buffered, and Husky will always
act on the latest commands received. The timeout still applies. This means that
if the commands are stopped before the e-stop is released, the Husky shouldn't
move. If the commands are continued, the Husky will move at the full speed
commanded once the e-stop is released.&lt;/i>&lt;/div>

&lt;div class='p'>&amp;hellip; which is something I am doing now, so we will see during further testing.
It could be related to some other issues &amp;mdash; I am using the landing zone as the lid
of the Husky, maybe the PrimeSense is taking too much power, or the battery??
We will see.&lt;/div>

&lt;div class='p'>This morning I tried to install OpenCV via
&lt;a href='https://github.com/jayrambhia/Install-OpenCV.git' class='external'>Install-OpenCV.git&lt;/a> but it
failed due to:&lt;/div>

&lt;pre>The following packages will be upgraded:
  libglib2.0-0 libjpeg-turbo8 libpixman-1-0
3 upgraded, 137 newly installed, 0 to remove and 243 not upgraded.&lt;/pre>

&lt;div class='p'>&amp;hellip; at least I suppose that this was the problem and not&lt;/div>

&lt;pre>WARNING: The following packages cannot be authenticated!
  libglib2.0-0 libatk1.0-data libatk1.0-0 libavutil51 libgsm1 liborc-0.4-0
&amp;hellip;&lt;/pre>

&lt;div class='p'>So I tried to upgrade, which is surely not a good idea in such a short remaining
time, and hit another „wall” (at least for me, an inexperienced Linux/Ubuntu
user):&lt;/div>

&lt;pre>$ sudo apt-get upgrade
Reading package lists&amp;hellip; Done
Building dependency tree
Reading state information&amp;hellip; Done
You might want to run 'apt-get -f install' to correct these.
The following packages have unmet dependencies:
 ros-hydro-roscpp : 
Depends: 
ros-hydro-cpp-common (>= 0.3.17) but 0.3.16-0precise-20130919-0111-+0000 is installed
Depends: 
ros-hydro-roscpp-traits (>= 0.3.17) but 0.3.16-0precise-20130919-0231-+0000 is installed
E: Unmet dependencies. Try using -f.&lt;/pre>

&lt;div class='p'>Using &lt;i>force&lt;/i> does not sound good to me, so there will be no OpenCV for
RR2015 and I will probably run drone part from laptop.&lt;/div>

&lt;div class='p'>TO BE CONTINUED&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150515">&lt;/a>&lt;/div>

&lt;h2>15th May 2015 &amp;mdash; H+H Network (failure)&lt;/h2>

&lt;div class='p'>Last night and this morning I tried to set up the Heidi+Husky network, and I have not
succeeded yet. According to the document &lt;b>From The Desk of The Robotsmiths:&lt;/b> &lt;i>The
Husky may be connected to the Internet in two ways.  The first is by connecting
the WAN port on the Microhard radio to a live Internet connection.  The second
is by logging in to the Microhard via wifi, and changing its SSID and wifi
password to match that of an existing wifi network in your location.&lt;/i>&lt;/div>

&lt;div class='p'>A year ago I used the first option and it worked, although Sylvio had problems,
probably due to colliding IP addresses. The ARDrone2 talks over Wi-Fi, so this
time I need to use the second option. I found a
&lt;a href='http://outlaw.ca/uploads/Products/VIP4Gv1.1.pdf' class='external'>manual for a similar router&lt;/a>
(in Husky there is a &lt;i>VIP2-2400&lt;/i>) and there I found what I needed, i.e. that I have
to change &lt;i>Access Point&lt;/i> (&lt;i>An Access Point may provide a wireless data
connection to many clients, such as stations, repeaters, or other supported
wireless devices such as laptops etc.&lt;/i>) to &lt;b>Station/Client&lt;/b> (&lt;i>A Station may
sustain one wireless connection, i.e. to an Access Point.&lt;/i>). I could fill in the SSID
of our home Wi-Fi with the password and the connection was established. I could see
nice colors with a signal bar, but &amp;hellip; I could not ping outside (&lt;i>Destination
Net Unreachable&lt;/i>).&lt;/div>

&lt;div class='p'>With the drone it was similar, or rather worse. The router IP &lt;b>192.168.1.1&lt;/b> is
colliding with the drone IP, so first I changed that to 192.168.1.13. I also
tried to change &lt;i>Gateway&lt;/i> to &lt;i>Router&lt;/i>, but in my case it is like a monkey
trying to do some programming :-(. So far no luck.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150518">&lt;/a>&lt;/div>

&lt;h2>18th May 2015 &amp;mdash; The Contest Robotem Rovně 2015&lt;/h2>

&lt;div class='p'>The contest is over. If you want to jump to
&lt;a href='http://www.kufr.cz/fotoalbum.php?idakce=79' class='external'>pictures&lt;/a> or
&lt;a href='http://www.kufr.cz/view.php?nazevclanku=dalsi-rocnik-souteze-robotem-rovne-aneb-auticka-v-parku-skoncil&amp;amp;cisloclanku=2015050001' class='external'>results&lt;/a>,
feel free &amp;hellip;&lt;/div>

&lt;div class='p'>There is an hour for robot homologation (8am-9am) followed by four competition
runs. The task is simple &amp;mdash; just go straight 314 meters without leaving the
park road &lt;span class='wink'>&lt;/span> (see the &lt;a href='https://www.youtube.com/watch?v=hFjiNO0FwbQ' class='external'>interview
about the competition history in English&lt;/a>). This year there were 25 competing
robots.  Moreover, a new „specials” sub-category was created where flying,
walking, omni-directional and other non-standard robots were evaluated
separately.&lt;/div>

&lt;div class='p'>What a pain! We arrived with both robots (Husky + &lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a>)
shortly after the big tent for participants was built. I started &lt;i>go.py&lt;/i>,
which I had already been using for several weeks, and:&lt;/div>

&lt;pre>administratorcpr-sylv-01:~/md/ros$ ./go.py 127.0.0.1 127.0.0.1
Go straight &amp;hellip;
METALOG: logs/meta_150516_020733.log
STARTING /node_test_ros at http://127.0.0.1:8000
Listening on port 8000 &amp;hellip;
Traceback (most recent call last):
  File "./go.py", line 157, in &lt;module>
    go( metalog, assertWrite, ipPair )
  File "./go.py", line 80, in go
    robot = HuskyROS( filename=metalog.getLog("node"), replay=metalog.replay, ipPair=ipPair )
  File "/home/administrator/md/ros/huskyros.py", line 33, in __init__
    filename=filename, replay=replay, assertWrite=assertWrite )
  File "/home/administrator/md/ros/node.py", line 90, in __init__
    logStream = LoggedStream( self.requestTopic( topic ).recv, prefix=topic.replace('/','_') )
  File "/home/administrator/md/ros/node.py", line 111, in requestTopic
    assert len(publishers) == 1, publishers # i.e. fails if publisher is not ready now
AssertionError: []&lt;/pre>

&lt;div class='p'>What?! This had never happened to me, so I started again, and again &amp;hellip; still the
same result. I was a bit in panic mode, rebooted the Husky &amp;hellip; the same again.
When I calmed down I modified the assert
(&lt;a href='https://github.com/robotika/husky/commit/7bf9545b81520ed381001141bb728229121fabd2' class='external'>diff&lt;/a>)
so I would know which publisher is failing.&lt;/div>

&lt;pre>AssertionError: ('/imu/data', [])&lt;/pre>

&lt;div class='p'>OK, I can live without IMU, at least for the homologation. Then I remembered
that I can read ROS data directly from console, so&lt;/div>

&lt;pre>administratorcpr-sylv-01:~/git/husky/ros$ rostopic echo /imu/data
ERROR: Cannot load message class for [std_msgs/Imu]. Are your messages built?&lt;/pre>

&lt;div class='p'>??? The command is maybe wrong, but it does not matter now. It did not help.&lt;/div>

&lt;div class='p'>After reboot I got:&lt;/div>

&lt;pre>administratorcpr-sylv-01:~$ rostopic echo /imu/data
WARNING: topic [/imu/data] does not appear to be published yet

[WARN] [WallTime: 1431759925.748272] Inbound TCP/IP connection failed:
connection from sender terminated before handshake header received. 0 bytes
were received. Please check sender for additional details.  

[WARN] [WallTime: 1431759926.051634] Inbound TCP/IP connection failed:
connection from sender terminated before handshake header received. 0 bytes
were received. Please check sender for additional details.  

[WARN] [WallTime: 1431759926.361188] Inbound TCP/IP connection failed:
connection from sender terminated before handshake header received. 0 bytes
were received. Please check sender for additional details.

&amp;hellip;&lt;/pre>

&lt;div class='p'>Nice. I removed the IMU from my code. Guess what? Another assert with a publisher
failure. This time with the motor controller, and there is no way to drive Husky
without the motor controller. Dead end.&lt;/div>

&lt;div class='p'>I tried simple Python code (without ROS) from last year:&lt;/div>

&lt;pre>administratorcpr-sylv-01:~/md$ ./rr.py
Exception in thread clearpath.horizon.transports.Serial.Receiver:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 551, in __bootstrap_inner
    self.run()
  File "/opt/ros/hydro/lib/python2.7/dist-packages/clearpath/horizon/transports.py", 
line 465, in run
    message = self._get_message()
  File "/opt/ros/hydro/lib/python2.7/dist-packages/clearpath/horizon/transports.py", 
line 536, in _get_message
    chars = self._serial.read(1000)
  File "/usr/lib/python2.7/dist-packages/serial/serialposix.py", line 449, in read
    buf = os.read(self.fd, size-len(read))
OSError: [Errno 11] Resource temporarily unavailable

Traceback (most recent call last):
  File "./rr.py", line 28, in &lt;module>
    horizon.set_differential_output(30, 30)
  File "/opt/ros/hydro/lib/python2.7/dist-packages/clearpath/horizon/__init__.py", 
line 244, in set_differential_output
    self._protocol.command('differential_output', locals())
  File "/opt/ros/hydro/lib/python2.7/dist-packages/clearpath/horizon/protocol.py", 
line 287, in command
    self.send_message(messages.Message.command(name, args, self.timestamp(), 
no_ack=(not self.acks)))
  File "/opt/ros/hydro/lib/python2.7/dist-packages/clearpath/horizon/protocol.py", 
line 372, in send_message
    raise utils.TimeoutError("Message Timeout Occurred!")
clearpath.utils.TimeoutError: Message Timeout Occurred!&lt;/pre>

&lt;div class='p'>I expected that some other running process was using &lt;i>/dev/ttyUSB1&lt;/i> so my
program could not open it, but &amp;hellip; surprise: &lt;b>there was no /dev/ttyUSB1!!!&lt;/b> I had
like 5 minutes remaining for the homologation, so I took Heidi, tried
&lt;a href='https://github.com/robotika/heidi/blob/master/rr_drone.py' class='external'>rr_drone.py&lt;/a> from
last year, left the road and hit the lamp 2 meters from the start.
Nevertheless I was listed at the end, after all homologated robots, so I
could still compete &amp;hellip;&lt;/div>

&lt;div class='p'>So where is &lt;i>/dev/ttyUSB1&lt;/i>?! I unplugged and re-plugged the IMU and motor controller
USB cables and that was the major breakthrough. My luck was back &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;pre>administratorcpr-sylv-01:~$ ls /dev/ttyU*
ls: cannot access /dev/ttyU*: No such file or directory

administratorcpr-sylv-01:~$ ls /dev/ttyU*
/dev/ttyUSB0  /dev/ttyUSB1&lt;/pre>

&lt;div class='p'>&lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>I do not know how to restart all the failing ROS nodes, and after a reboot the USB
devices disappeared again, so I focused on direct COM communication and my former program
(&lt;a href='https://github.com/robotika/husky/commit/b07de16bcc38c0d96443a9676b87a1cd7aeec996' class='external'>diff&lt;/a>),
and again I was getting strange reports:&lt;/div>

&lt;pre>0x8010
Traceback (most recent call last):
  File "./husky.py", line 180, in &lt;module>
    testRR2015( com )
  File "./husky.py", line 154, in testRR2015
    robot.update()
  File "./husky.py", line 83, in update
    timestamp, msgType, data = self.readPacket()
  File "./husky.py", line 57, in readPacket
    length = ord(self.com.read(1))
  File "/home/administrator/md/logit.py", line 32, in read
    s = self._com.read( numChars )
  File "/usr/lib/python2.7/dist-packages/serial/serialposix.py", line 449, in read
    buf = os.read(self.fd, size-len(read))
OSError: [Errno 11] Resource temporarily unavailable&lt;/pre>

&lt;div class='p'>or&lt;/div>

&lt;pre>0x8200
0x8202
0x8800
Traceback (most recent call last):
  File "./husky.py", line 180, in &lt;module>
    testRR2015( com )
  File "./husky.py", line 154, in testRR2015
    robot.update()
  File "./husky.py", line 83, in update
    timestamp, msgType, data = self.readPacket()
  File "./husky.py", line 59, in readPacket
    assert length+notLength == 0xFF, (length, notLength)
AssertionError: (242, 83)&lt;/pre>

&lt;div class='p'>The Clearpath Python wrapper probably hides this, because &lt;i>./rr.py&lt;/i> worked.&lt;/div>

&lt;div class='p'>Husky was announced for the first run. I was so thrown off by the Husky behavior
that I did not give Heidi enough attention. I started the drone from my
notebook, then unblocked eStop to run the old code on Husky, and then the show began. It
was exciting until Heidi landed between the moving Husky wheels! I knew immediately
what it was &amp;mdash; the time was 60s after start &amp;hellip; yes, I still had the testing
version for Heidi
(&lt;a href='https://github.com/robotika/heidi/commit/e9a9884fde9fcb59d901be765095ab6b12fa73b2' class='external'>diff&lt;/a>
also with changes after the 2nd run) with the timeout set to 60s. Pity, but I hit eStop
fast enough so Heidi was still in one piece.&lt;/div>

&lt;div class='p'>In the second run Heidi already had a much longer time for the
&lt;i>hoverAboveRoundel()&lt;/i> task. It only went a bit too high and after approximately
10m (at the first crossing) lost Husky. Yes, there was no coordination between the two
robots (I may describe the dropped functionality list at the end of this).
After that I increased the speed and aggressiveness of the corrections. I also changed
the camera view to the front camera as it may collect more interesting images.&lt;/div>

&lt;div class='p'>In the third round I warned the spectators about the parameter changes (without any
chance to test them), and yes, the drone was much more aggressive. But it was
also much more interesting to watch. As in the previous runs, if the target (landing
platform with the oriented roundel) was not detected the drone moved slowly up.
When detection worked for the last 100 pictures (the bottom camera runs at 60Hz)
then it was moving down to 1 meter above the roundel. As far as I remember, both
robots cooperated in this run and it was terminated due to Husky leaving the
road.&lt;/div>

&lt;div class='p'>For the final round I modified the Husky code. Yeah, it was a big change
(&lt;a href='https://github.com/robotika/husky/commit/fc423e665d8895caddd9ad990c8b65ece5ce8114' class='external'>diff&lt;/a>)
&lt;span class='smile'>&lt;/span> &amp;hellip; instead of &lt;i>horizon.set_differential_output(30, 30)&lt;/i> I called
&lt;i>horizon.set_differential_output(30, 33)&lt;/i> and the robot moved almost
straight. I had to stop it not because of leaving the road but because of a
queue of people in front of one kiosk. I was not complaining, because I had lost
Heidi somewhere in the middle of the way, so 90 meters/points was more than what
I could hope for.&lt;/div>

&lt;div class='p'>Timeout - so here is at least a video from the first run. Note that the drone
starts with the front video and then switches to the bottom view.&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/uHqMT5UyxQk?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150519">&lt;/a>&lt;/div>

&lt;h2>19th May 2015 &amp;mdash; USB Ghost&lt;/h2>

&lt;div class='p'>Last night I tried to read something about USB devices and how they can be
virtually unplugged. The situation was the same as during the contest &amp;mdash;
both &lt;i>/dev/ttyUSB0&lt;/i> and &lt;i>/dev/ttyUSB1&lt;/i> were missing. It turned out that I was
actually lucky, because indexes 0 and 1 depend on the order in which you plug in the
cables, and the motor controller was expected on &lt;i>/dev/ttyUSB1&lt;/i>.&lt;/div>

&lt;div class='p'>In particular I was looking for use cases of &lt;i>/sys/bus/usb/drivers/usb/bind&lt;/i> and &lt;i>/sys/bus/usb/drivers/usb/unbind&lt;/i>.&lt;/div>

&lt;pre>administratorcpr-sylv-01:~$ ls /dev/ttyU*
ls: cannot access /dev/ttyU*: No such file or directory

administratorcpr-sylv-01:~$ lsusb -t
6-1:1.0: No such file or directory
6-2:1.0: No such file or directory
/:  Bus 09.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/2p, 5000M
/:  Bus 08.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/2p, 480M
/:  Bus 07.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/2p, 5000M
/:  Bus 06.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/2p, 480M
    |__ Port 1: Dev 2, If 0, Class=vend., Driver=, 12M
    |__ Port 2: Dev 3, If 0, Class=vend., Driver=, 12M
/:  Bus 05.Port 1: Dev 1, Class=root_hub, Driver=ohci_hcd/2p, 12M
/:  Bus 04.Port 1: Dev 1, Class=root_hub, Driver=ohci_hcd/5p, 12M
/:  Bus 03.Port 1: Dev 1, Class=root_hub, Driver=ohci_hcd/5p, 12M
    |__ Port 5: Dev 3, If 0, Class=vend., Driver=xpad, 12M
/:  Bus 02.Port 1: Dev 1, Class=root_hub, Driver=ehci_hcd/5p, 480M
/:  Bus 01.Port 1: Dev 1, Class=root_hub, Driver=ehci_hcd/5p, 480M

administratorcpr-sylv-01:~$ lsusb
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 006 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 007 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 008 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 009 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 003 Device 003: ID 046d:c21f Logitech, Inc. F710 Wireless Gamepad [XInput Mode]
Bus 006 Device 002: ID 0403:6001 Future Technology Devices International, 
Ltd FT232 USB-Serial (UART) IC
Bus 006 Device 003: ID 067b:2303 Prolific Technology, Inc. PL2303 Serial Port&lt;/pre>

&lt;div class='p'>I also tried to complete the upgrade with &lt;b>-f&lt;/b> option, but failed again on&lt;/div>

&lt;pre>grub-probe: error: cannot find a device for / (is /dev mounted?).
Installation finished. No error reported.
/usr/sbin/grub-probe: error: cannot find a device for / (is /dev mounted?).
dpkg: error processing grub-pc (&amp;ndash;configure):
 subprocess installed post-installation script returned error exit status 1
Processing triggers for libc-bin &amp;hellip;
ldconfig deferred processing now taking place
Processing triggers for initramfs-tools &amp;hellip;
update-initramfs: Generating /boot/initrd.img-3.2.0-54-generic&lt;/pre>

&lt;div class='p'>I am mentioning this in case it is related to the &lt;i>ghost&lt;/i> behavior, because after
a reboot I could see the USB devices:&lt;/div>

&lt;pre>administratorcpr-sylv-01:~$ find /sys/bus/usb/devices/usb*/ -name dev
/sys/bus/usb/devices/usb1/dev
/sys/bus/usb/devices/usb2/dev
/sys/bus/usb/devices/usb3/dev
/sys/bus/usb/devices/usb3/3-5/dev
/sys/bus/usb/devices/usb3/3-5/3-5:1.0/input/input9/event9/dev
/sys/bus/usb/devices/usb3/3-5/3-5:1.0/input/input9/js0/dev
/sys/bus/usb/devices/usb4/dev
/sys/bus/usb/devices/usb5/dev
/sys/bus/usb/devices/usb6/dev
/sys/bus/usb/devices/usb6/6-1/dev
/sys/bus/usb/devices/usb6/6-1/6-1:1.0/ttyUSB0/tty/ttyUSB0/dev
/sys/bus/usb/devices/usb6/6-2/dev
/sys/bus/usb/devices/usb6/6-2/6-2:1.0/ttyUSB1/tty/ttyUSB1/dev
/sys/bus/usb/devices/usb7/dev
/sys/bus/usb/devices/usb8/dev
/sys/bus/usb/devices/usb9/dev&lt;/pre>

&lt;div class='p'>I do not know.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150522">&lt;/a>&lt;/div>

&lt;h2>22nd May 2015 &amp;mdash; OS Upgrade&lt;/h2>

&lt;div class='p'>I decided to go ahead with the upgrade of the Husky OS. Martin from CPR confirmed that:
&lt;i>Our Indigo release is available and stable. There was quite a bit of clean up
and new features, so I would indeed suggest upgrading! Instructions to do so
can be found&lt;/i> &lt;a href='http://wiki.ros.org/Robots/Husky' class='external'>here&lt;/a>&lt;i>. Be sure to back
up your system just in case!&lt;/i> Then it was a kind of &lt;b>when, if not now, and who,
if not me&lt;/b> question (there are three events at the beginning of next month and
I need Husky working for the demo).&lt;/div>

&lt;div class='p'>There were a couple of minor issues, like that I do not have a keyboard or monitor at home
(we use only laptops and „smart” phones), so I borrowed a keyboard at work and used the
TV as a monitor. Yes, Husky is now our second „dog” and lives in our living
room. &lt;span class='smile'>&lt;/span> And yes, you can imagine how happy my wife is about it. &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>The second detail was the USB disk. Based on the suggested link
&lt;a href='http://wiki.ros.org/Robots/Husky' class='external'>http://wiki.ros.org/Robots/Husky&lt;/a> I downloaded a relatively small (29MB) ISO
image and used &lt;i>Universal-USB-Installer-1.9.5.9.exe&lt;/i> to create a bootable USB
drive. That had worked fine for another disk and another system, except that here I
selected the last option, &lt;i>Try Unlisted Linux ISO&lt;/i>. The format of the USB disk failed for
some unknown reason, and maybe that was the detail causing problems later
&amp;mdash; the Husky BIOS did not see it (I could mount it and use it with the system,
but not from the BIOS). So I used another USB disk, formatted it manually and then I
could see two more options in the boot list (F7 option). I did not write it down,
but one started with &lt;b>T&lt;/b> and that worked, while the other started with &lt;b>U&lt;/b> and
did not work.&lt;/div>

&lt;div class='p'>There was another „stopper” before this step &amp;mdash; it is highly recommended
that you back up the whole disk, and the first thing the new installation will do is
wipe your disk! Do I really want to reinstall all the packages I installed over
the year? Will they work in the new system? It is hard to tell, and I am sure I forgot
some details like the automatic ssh-tunnel for remote access &amp;hellip;&lt;/div>

&lt;div class='p'>The instructions were clear enough; I would maybe only stress a little bit that
the internet is a &lt;b>must have&lt;/b> and you should have a rather fast connection to
download all the packages. Actually I only had to fill in the IP, mask, gateway and
DNS server, and the rest was automatic up to changing the computer name. There
was one moment when I was particularly nervous (586/718), when for several
minutes nothing was happening (based on the „progress bar”)?! But now the
system is replaced.&lt;/div>

&lt;div class='p'>I tried &lt;i>./rr.py&lt;/i>, but it did not work, because the Python wrapper
(clearpath.horizon) is not installed. Then I tried my
&lt;a href='https://github.com/robotika/husky/blob/master/husky.py' class='external'>husky.py&lt;/a>, which
worked, but Husky did not move. I suppose that I did not configure sending the
necessary messages, which was formerly done with ROS on boot-up. It was 1am so I
rather went to sleep, and now I am starting again at 5am &amp;hellip; is it really so much fun?
&lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>As my grandfather would say: „it is done, just to finish it”. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150527">&lt;/a>&lt;/div>

&lt;h2>27th May 2015 &amp;mdash; ssh tunnel&lt;/h2>

&lt;div class='p'>This note is rather for myself, in case I ever have to reinstall Husky again. I had
a working &lt;i>ssh tunnel&lt;/i> in a start-up script, so whenever Husky was connected to
the internet I was able to connect to it from a dedicated server. It worked fine
for me. A friend of mine set it up for me and I did not know all the details. And
after the reinstall I lost that.&lt;/div>

&lt;div class='p'>There were two issues: how to always start it on boot, and where the home directory for the
&lt;i>root&lt;/i> user is (for the &lt;i>.ssh&lt;/i> configuration). The first was solved with the
&lt;i>/etc/rc.local&lt;/i> file. It was necessary to add the line &lt;b>before&lt;/b> exit 0 &lt;span class='smile'>&lt;/span>, to
redirect input, output and error output, and to run it with &amp;amp;.&lt;/div>

&lt;div class='p'>The second was sorted with hint from
&lt;a href='http://askubuntu.com/questions/115151/how-to-setup-passwordless-ssh-access-for-root-user' class='external'>askubuntu.com&lt;/a>
in particular line&lt;/div>

&lt;pre>sudo su - root&lt;/pre>

&lt;div class='p'>and then with &lt;i>echo $HOME&lt;/i> I got the searched-for &lt;i>/root&lt;/i>. &lt;span class='smile'>&lt;/span> Yes, like a total
beginner. Now it establishes the tunnel again and I am installing OpenCV (again).
There is still an unsolved issue with the IMU, which requires calibration. This means
turning in place, which I would rather do outside &amp;hellip;&lt;/div>

&lt;div class='p'>p.s.&lt;/div>

&lt;pre>OpenCV 3.0.0-rc1 ready to be used&lt;/pre>

&lt;hr/>

&lt;div class='p'>&lt;a id="150528">&lt;/a>&lt;/div>

&lt;h2>28th May 2015 &amp;mdash; Compass calibration&lt;/h2>

&lt;div class='p'>It is still Wednesday evening, but &amp;hellip; before I finish, it could already be
Thursday. I was a bit lazy to carry Husky outside, so in the end I decided to do the
calibration at home. This was after I read the source code of
&lt;a href='https://github.com/husky/husky_robot/blob/indigo-devel/husky_bringup/scripts/calibrate_compass' class='external'>calibrate_compass&lt;/a>,
where you can see what it actually does:&lt;/div>

&lt;ul>
&lt;li>turn for 60 seconds&lt;/li>

&lt;li>the angular speed is 0.2 rad/s, which corresponds to roughly 12 deg/s&lt;/li>

&lt;li>in total roughly two slow complete spins (the motion is roughly sketched below)&lt;/li>
&lt;/ul>
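
&lt;div class='p'>In other words, the calibration motion amounts to something like this (a sketch only;
the topic and node names are my assumptions, the real script drives it via rostopic and
records with rosbag):&lt;/div>

&lt;pre>#!/usr/bin/env python
# spin in place at 0.2 rad/s for 60 seconds while the IMU topics are being recorded
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('spin_for_calibration')
pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
cmd = Twist()
cmd.angular.z = 0.2               # rad/s, i.e. roughly 12 deg/s
rate = rospy.Rate(10)             # re-send the command at 10 Hz
end = rospy.Time.now() + rospy.Duration(60)
while not rospy.is_shutdown() and rospy.Time.now() &lt; end:
    pub.publish(cmd)
    rate.sleep()
pub.publish(Twist())              # stop&lt;/pre>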

&lt;div class='p'>Result:&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~$ rosrun husky_bringup calibrate_compass
rospack: error while loading shared libraries: librospack.so: cannot open shared
 object file: No such file or directory
Traceback (most recent call last):
  File "/opt/ros/indigo/bin/rostopic", line 35, in &lt;module>
    rostopic.rostopicmain()
  File "/opt/ros/indigo/lib/python2.7/dist-packages/rostopic/__init__.py", line 1753, 
in rostopicmain
    import rosbag
  File "/opt/ros/indigo/lib/python2.7/dist-packages/rosbag/__init__.py", line 33, in &lt;module>
    from .bag import Bag, Compression, ROSBagException, ROSBagFormatException, 
ROSBagUnindexedException
  File "/opt/ros/indigo/lib/python2.7/dist-packages/rosbag/bag.py", line 65, in &lt;module>
    import roslz4
  File "/opt/ros/indigo/lib/python2.7/dist-packages/roslz4/__init__.py", line 33, in &lt;module>
    from ._roslz4 import *
ImportError: libroslz4.so: cannot open shared object file: No such file or directory
ROS appears not to be running. Please start ROS service:
sudo service ros start&lt;/pre>

&lt;div class='p'>Hmm, strange. I found in
&lt;a href='http://answers.ros.org/question/67073/roscd-and-roslaunch-problem/' class='external'>roscd-and-roslaunch-problem&lt;/a>
that this was due to an improperly set &lt;b>LD_LIBRARY_PATH&lt;/b>. After setting it&lt;/div>

&lt;pre>export LD_LIBRARY_PATH=/opt/ros/indigo/lib/&lt;/pre>

&lt;div class='p'>it went through&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~$ rosrun husky_bringup calibrate_compass
Started rosbag record, duration 60 seconds, pid [2483]
Started motion commands, pid [2484]
[ INFO] [1432750402.337673856]: Subscribing to /tf
[ INFO] [1432750402.347888180]: Subscribing to /imu/data
[ INFO] [1432750402.356517429]: Subscribing to /imu/mag
[ INFO] [1432750402.372649791]: Recording to /tmp/calibrate_compass.oXJK/imu_record.bag.
Test underway.
Time remaining: 0
Shutting down motion command publisher.
Waiting for rosbag to shut down.
Computing magnetic calibration.
Calibration generated in /tmp/calibrate_compass.oXJK/mag_config.yaml.
Copy calibration to /opt/ros/indigo/etc/husky_bringup? [Y/n]
[sudo] password for administrator:
Restart ROS service to begin using saved calibration.&lt;/pre>

&lt;div class='p'>OK fine. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>It turned out that the &lt;i>LD_LIBRARY_PATH&lt;/i> was actually set by default:&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~/git/husky/ros$ env | grep LD
LD_LIBRARY_PATH=/opt/ros/indigo/lib:/opt/ros/indigo/lib/x86_64-linux-gnu
OLDPWD=/home/administrator&lt;/pre>

&lt;div class='p'>but the problem was with my favorite tool &lt;i>screen&lt;/i> (I use it in case
the connection breaks), where the variable was apparently not set.&lt;/div>

&lt;div class='p'>On the other hand &lt;i>/imu/rpy&lt;/i> still does not work (it is listed in &lt;i>rostopic
list&lt;/i>), but:&lt;/div>

&lt;pre>administrator@cpr-sylv-01:~$ rostopic echo /imu/rpy
WARNING: topic [/imu/rpy] does not appear to be published yet&lt;/pre>

&lt;div class='p'>The same for &lt;i>/imu/temperature&lt;/i> or &lt;i>/husky/data/encoders&lt;/i>. The
&lt;i>/imu/data&lt;/i> works fine.&lt;/div>

&lt;div class='p'>p.s. In the meantime I started to work on an alternative backup solution. I picked a
several years old board &lt;a href='http://www.pcengines.ch/alix3d2.htm' class='external'>alix3d2&lt;/a> from the
„intelligent home” project, changed the static IP to 192.168.1.12, set the gateway to
192.168.1.1 and dns-nameservers to 8.8.8.8 (roughly the configuration sketched below),
plugged the Ethernet cable into the Husky switch, installed git and python-serial, downloaded
&lt;a href='https://github.com/robotika/husky' class='external'>husky.git&lt;/a>, plugged in the USB devices for motion
control and the IMU, and was almost ready to go. Minimal dependencies, much lower
power consumption (I know that for other projects we may need a stronger PC). I had to
change the code a little
(&lt;a href='https://github.com/robotika/husky/commit/f663584d6767ce9e609e5ccd9c7dec2457026bc0' class='external'>diff&lt;/a>)
to request basic info from the motion controller (originally pre-requested by the ROS
master), and that was it. Husky moved and logged all the data. &lt;span class='smile'>&lt;/span>&lt;/div>
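
&lt;div class='p'>For the record, the static network setup on the board was along these lines (the
netmask is my assumption, the rest is as described above):&lt;/div>

&lt;pre># /etc/network/interfaces on the alix3d2 board
auto eth0
iface eth0 inet static
    address 192.168.1.12
    netmask 255.255.255.0
    gateway 192.168.1.1
    dns-nameservers 8.8.8.8&lt;/pre>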

&lt;hr/>

&lt;div class='p'>&lt;a id="150529">&lt;/a>&lt;/div>

&lt;h2>29th May 2015 &amp;mdash; imu.py&lt;/h2>

&lt;div class='p'>&lt;i>You'll learn to love ROS, just you wait ;)&lt;/i> &amp;hellip; this sentence repeats in
my mind again and again (the author is Martin from CPR). I do not know.  Maybe
if everything works, but it usually does not. It is not ROS's failure; it is
just a complex system under development. The problem can be in a cable, a bad
configuration, a burned capacitor, or who knows what? For me it is still an extra layer
which sometimes hides the problems and makes the whole thing even more complex.&lt;/div>

&lt;div class='p'>Friends at &lt;a href='/competitions/robotem-rovne/2015/en'>&lt;span class='cs'>Robotem Rovne 2015&lt;/span>&lt;/a> said „we
use ROS, we have to, but we do not like it”. They told me stories about
colliding libraries and dedicated systems only for ROS. On the other hand
&lt;a href='https://twitter.com/roboauto' class='external'>roboauto&lt;/a> is using ROS for their autonomous
car, and the commercial mobile platforms from
&lt;a href='http://mobile-industrial-robots.com/en/' class='external'>mir-robotics&lt;/a> are using ROS too. Also an
&lt;a href='http://www8.cs.umu.se/~ringdahl/publications/RHEA_2014.pdf' class='external'>apple harvesting
robot&lt;/a> was using ROS. I do not know.&lt;/div>

&lt;div class='p'>At the moment I do not have a working ROS system. I believe that it is just some
simple configuration somewhere, but somehow I prefer to learn and understand
the foundations on which the ROS nodes are based. I had problems with the IMU a year
ago, so this time I started from scratch (see
&lt;a href='https://github.com/robotika/husky/commits/master/imu.py' class='external'>imu.py today&amp;#039;s
history&lt;/a>). It looks OK, it is
&lt;a href='https://www.chrobotics.com/docs/UM6_datasheet.pdf' class='external'>well documented&lt;/a> and I am
starting to be confident enough to build a demo based on it for next Tuesday
(&lt;a href='http://dnipola.sk/' class='external'>http://dnipola.sk/&lt;/a>). At the moment
&lt;a href='https://github.com/robotika/husky/blob/master/imu.py' class='external'>imu.py&lt;/a> is parsing only
the temperature, but I can also see what messages it is sending:&lt;/div>

&lt;pre>UM6_GYRO_RAW_XY = 0x56
UM6_GYRO_PROC_XY = 0x5e
UM6_MAG_PROC_XY = 0x60
UM6_EULER_PHI_THETA = 0x62
UM6_TEMPERATURE = 0x76&lt;/pre>
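
&lt;div class='p'>For a random reader, this is roughly how one UM6 packet can be decoded (a sketch only,
based on my reading of the datasheet; framing, batch handling and error recovery are
simplified compared to &lt;i>imu.py&lt;/i>):&lt;/div>

&lt;pre>import struct

UM6_TEMPERATURE = 0x76

def parse_um6_packet(packet):
    "packet = one complete frame starting with 'snp'; returns (address, payload)"
    assert packet[:3] == 'snp'
    pt = ord(packet[3])                  # packet type byte
    addr = ord(packet[4])                # register address
    has_data, is_batch = bool(pt &amp; 0x80), bool(pt &amp; 0x40)
    batch_len = (pt >> 2) &amp; 0x0F         # number of 4-byte registers in a batch
    n = (4 * batch_len if is_batch else 4) if has_data else 0
    payload = packet[5:5 + n]
    chksum = struct.unpack('>H', packet[5 + n:7 + n])[0]
    assert chksum == sum(ord(c) for c in packet[:5 + n]) &amp; 0xFFFF
    return addr, payload

# the temperature register carries a single big-endian float
# addr, payload = parse_um6_packet(raw)
# if addr == UM6_TEMPERATURE:
#     print struct.unpack('>f', payload)[0]&lt;/pre>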

&lt;div class='p'>And the driver expects Euler angles and quaternions &amp;hellip; so maybe the
calibration actually broke the IMU node??? But it was not sending the
&lt;i>/imu/rpy&lt;/i> before, so probably not. Note that now I am using the other board,
not the mini PC with ROS, so it would be interesting to know whether the ROS node
changes the UM6 configuration?? I am just running out of time, so „another day”.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150603">&lt;/a>&lt;/div>

&lt;h2>3rd June 2015 &amp;mdash; dnipola.sk/Encoders&lt;/h2>

&lt;div class='p'>It is early in the morning and I am preparing Husky for today's show. Yesterday
it went fine &amp;mdash; I was using only the
&lt;a href='https://github.com/robotika/husky/blob/master/rr.py' class='external'>rr.py&lt;/a> code from
&lt;a href='/competitions/robotem-rovne/2015/en'>&lt;span class='cs'>Robotem rovně&lt;/span>&lt;/a> and I was flying with the
drone above the heliport sign. So far so good. Note that I installed
&lt;i>clearpath_py_r920.tar.gz&lt;/i> on both computer boards in order to run &lt;i>rr.py&lt;/i>.&lt;/div>

&lt;div class='p'>I made two observations:&lt;/div>

&lt;ul>
&lt;li>the battery was really „drained” by the hungry computer &amp;mdash; using the alternative
board solved the problem. Husky was running all day on one battery charge &amp;hellip;
yes, it was mostly off or not moving, but still 100 times better than before, when I
was watching the power bar going to zero. Also there were no overheating
issues, even during the hot day.&lt;/li>

&lt;li>the communication fails once in a while and it looks like Husky temporarily
stops (a very short „blink” of the red light)&lt;/li>
&lt;/ul>

&lt;div class='p'>Today I planned to repeat yesterday's show, maybe with some extra maneuvers,
so that people would also see the drone turning. The simplest solution was to
add extra bits of code into &lt;i>husky.py&lt;/i>
(&lt;a href='https://github.com/robotika/husky/commit/bbe57fc80a9d2e99c027a57390af07df064e826e' class='external'>diff&lt;/a>)
and I was surprised (again) to see the raw data from the encoders together with the
status of the emergency STOP button:&lt;/div>

&lt;pre>True (2, 0, 0, 0, 0)
True (2, 0, 0, 0, 0)
False (2, 0, 0, 0, 0)
False (2, 0, 0, 0, 0)
False (2, 1, 0, 25, 20)
False (2, 3, 3, 34, 32)
False (2, 8, 7, 28, 27)
&amp;hellip;
False (2, 218, 212, 11, 6)
False (2, 218, 212, -23, -17)
True (2, 215, 210, -32, -24)
True (2, 214, 208, -29, -26)
True (2, 214, 207, -29, -24)
True (2, 214, 207, -29, -24)
&amp;hellip;
True (2, 214, 207, -29, -24)
True (2, 214, 207, -29, -24)
False (2, 214, 207, -29, -24)
False (2, 214, 0, 0, 0)
False (2, 216, 2, 35, 49)&lt;/pre>

&lt;div class='p'>So shortly after the eStop was released, the second encoder was reset to zero while
the first encoder continued from its original value?! I am still convinced that
something like this can cause the „jumps” I was observing when using ROS a
couple of weeks ago &amp;hellip; sigh.&lt;/div>

&lt;div class='p'>Just a remark: I would rather not use the eStop for starting/pausing my program, but it is
&lt;b>the only button&lt;/b> available on Husky :-(.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/robots/husky/en#email'>contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Katarina</title>
	<link rel='alternate' href="http://localhost/robots/katarina/en"/>
	<id>http://localhost/robots/katarina/en</id>
	<updated>2015-01-28T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> There is another toy from Parrot &amp;mdash; drone called &lt;b>Bebop&lt;/b>. My task is to
prepare simple tracking of Parrot Hat (old AR Drone 2 could do that, so why not
the follower?). Bebop is more professional at first sight: an extra battery is
part of the basic package, spare propellers, GPS, full HD camera, an 8 times more
powerful computer, a 14-megapixel fisheye camera with user-selectable ROI &amp;hellip;
so I am quite curious how it will work from Python &lt;span class='wink'>&lt;/span>. &lt;b>Blog
update:&lt;/b> 30/4 &amp;mdash; &lt;a href='/robots/katarina/en#150430'>Paparazzi?!&lt;/a>
 </summary>
	<content type='html'> 
&lt;div class='p'>This is a blog about my experience with the &lt;a href='http://www.icornerhightech.cz/' class='external'>Parrot
Bebop drone&lt;/a>. She received the "working name" &lt;i>Katarina&lt;/i> &amp;hellip; I expected that it
would roar and originally wanted to name it
&lt;a href='http://en.wikipedia.org/wiki/Hurricane_Katrina' class='external'>Katrina&lt;/a>, but I changed my
mind, as there was too much destruction behind that name.&lt;/div>

&lt;div class='p'>At the moment I am not sure if I will write this blog in Czech or English &amp;mdash;
we will see if the world is more interested than the
&lt;a href='http://fandorama.cz/projekty/921778164/katarina-bebop/' class='external'>Czech community&lt;/a> &amp;hellip;
at the moment it will be a mix of both.&lt;/div>

&lt;div class='p'>The drone is on loan from the &lt;a href='http://www.icornerhightech.cz/' class='external'>Czech Parrot
distributor&lt;/a> (last time it was the minidrone &lt;a href='/robots/jessica/en'>Jessica&lt;/a>). Thanks
&lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/box.jpg'>&lt;img src='/robots/katarina/box_t.jpg' alt='Box' title='Box' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/box.jpg'>Box&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/packed.jpg'>&lt;img src='/robots/katarina/packed_t.jpg' alt='Packed Katarina' title='Packed Katarina' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/packed.jpg'>Packed Katarina&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/katarina.jpg'>&lt;img src='/robots/katarina/katarina_t.jpg' alt='Size' title='Size' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/katarina.jpg'>Size&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Links&lt;/h2>

&lt;ul>
&lt;li>Czech Parrot distributor
&lt;a href='http://www.icornerhightech.cz/' class='external'>http://www.icornerhightech.cz/&lt;/a>&lt;/li>

&lt;li>&lt;a href='http://www.parrot.com/usa/products/bebop-drone/' class='external'>Parrot Bebop&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://github.com/matthewlloyd/bebop' class='external'>Parrot Bebop Drone Hacking&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://github.com/ARDroneSDK3' class='external'>ARDroneSDK3&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://github.com/robotika/katarina' class='external'>Katarina github&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Content&lt;/h1>

&lt;ul>
&lt;li>&lt;a href='/robots/katarina/en#150131'>First talk to &amp;quot;dragon&amp;quot;&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150202'>Parsing Navdata&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150203'>generateLibARCommands.py&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150204'>Commands required&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150205'>libARNetworkAL&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150215'>PING and PONG&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150216'>ACK of ACK?&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150217'>Low latency video stream&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150218'>video.py&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150220'>Camera Tilt/Pan and Emergency&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150221'>First Crash&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150222'>ManualControlException and FlatTrim&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150223'>Fandorama&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150320'>Status quo&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150322'>PCMD &amp;#64;40Hz&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150408'>Terminated.&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150414'>Remote distributed debugging&lt;/a>&lt;/li>

&lt;li>&lt;a href='/robots/katarina/en#150430'>Paparazzi?!&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Blog&lt;/h1>

&lt;div class='p'>&lt;a id="150131">&lt;/a>&lt;/div>

&lt;h2>31st January 2015 &amp;mdash; First talk to "dragon"&lt;/h2>

&lt;div class='p'>I would like to write some notes in parallel to
&lt;a href='https://github.com/robotika/katarina' class='external'>github development&lt;/a>, similar to
other &lt;a href='https://github.com/robotika' class='external'>projects&lt;/a>. It helps me to remember some
details and it could better explain my steps to a random reader/developer.&lt;/div>

&lt;div class='p'>Today I unpacked Bebop and powered it up. It was quite noisy even though I did not take
off &amp;hellip; there is probably a fan for the processor (?). My first task was to
take a picture, but for now I am happy that the drone started to talk to me &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>I would recommend the document &lt;a href='https://github.com/matthewlloyd/bebop' class='external'>Parrot Bebop
Drone Hacking&lt;/a>. But the hint on how to establish communication with the drone
is from &lt;a href='https://github.com/ARDroneSDK3/ARSDKBuildUtils/issues/5' class='external'>here&lt;/a>. It is
necessary to do a „discovery step” before you can use the ports &lt;i>c2d_port &lt;/i> and
&lt;i>d2c_port&lt;/i>. TCP is used on 192.168.42.1, port 44444, and you first have to
send JSON to the drone with &lt;i>controller_type&lt;/i>, &lt;i>controller_name&lt;/i> and
&lt;i>d2c_port&lt;/i>.  See
&lt;a href='https://github.com/robotika/katarina/commit/24f021d2b28b6c6ac420d3d2e3f1403dbcf421c5' class='external'>diff&lt;/a>
for details.&lt;/div>

&lt;div class='p'>The Python script sent:&lt;/div>

&lt;pre>{"controller_type":"computer", "controller_name":"katarina", "d2c_port":"43210"}&lt;/pre>

&lt;div class='p'>and received:&lt;/div>

&lt;pre>{ "status": 0, "c2d_port": 54321, "arstream_fragment_size": 1000, 
"arstream_fragment_maximum_number": 128, "arstream_max_ack_interval": 0, 
"c2d_update_port": 51, "c2d_user_port": 21 }&lt;/pre>

&lt;div class='p'>To be continued &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150202">&lt;/a>&lt;/div>

&lt;h2>2nd February 2015 &amp;mdash; Parsing Navdata&lt;/h2>

&lt;div class='p'>At first I would like to thank Darryl for his supportive mail &lt;span class='smile'>&lt;/span>. It pushed me
to get some data from &lt;i>d2c_port&lt;/i>. I started with copy and paste from
&lt;a href='https://github.com/robotika/heidi' class='external'>Heidi code&lt;/a> (UDP sockets), so I call them
again &lt;i>NAVDATA_PORT&lt;/i> and &lt;i>COMMAND_PORT&lt;/i>.&lt;/div>

&lt;div class='p'>The code was simple:&lt;/div>

&lt;pre>robot = Bebop() 
for i in xrange(100): 
    robot.update( cmd=None )&lt;/pre>

&lt;div class='p'>and because it already generated logs with date/time, I have tests for 1, 10 and
100 messages (see the related
&lt;a href='https://github.com/robotika/katarina/commit/d112a31654dfb142b9793ca307c614af1d90370d' class='external'>diff&lt;/a>).&lt;/div>

&lt;div class='p'>The structure looks similar to &lt;a href='/robots/jessica/en'>Jessica (Rolling Spider)&lt;/a>.
Code 2, then the type of queue (?), an increasing index and the length of the whole data
packet. There is now
&lt;a href='https://github.com/robotika/katarina/blob/master/navdata.py' class='external'>navdata.py&lt;/a> for
separate parsing, and if you run it on the logged data you will see something
like:&lt;/div>

&lt;pre>m:\git\ARDroneSDK3\katarina>navdata.py navdata_150201_211239.bin
127 1 size 35
127 2 size 23
127 3 size 23
127 4 size 19
127 5 size 23
127 6 size 23
127 7 size 19
127 8 size 23
127 9 size 23
127 10 size 19
0 1 size 15
127 11 size 23
127 12 size 23
127 13 size 19
127 14 size 23
127 15 size 23
127 16 size 19
&amp;hellip;&lt;/pre>

&lt;div class='p'>The counters are separate and so far it looks like the 0-queue contains
only the 15-byte long messages.&lt;/div>

&lt;div class='p'>This is the content:&lt;/div>

&lt;pre>02 00 01 0F 00 00 00 5D 01 00 00 99 56 57 0C
02 00 02 0F 00 00 00 5E 01 00 00 87 C8 93 0D
02 00 03 0F 00 00 00 5F 01 00 00 A7 9E A7 0D
02 00 04 0F 00 00 00 60 01 00 00 CF 39 B7 0D
02 00 05 0F 00 00 00 61 01 00 00 0D EB F7 0E
02 00 06 0F 00 00 00 62 01 00 00 A2 85 0E 0F&lt;/pre>

&lt;div class='p'>So again some counter and checksum or time??&lt;/div>

&lt;div class='p'>In the 127 or 0x7F "queue" the 19-byte long messages look the same all the time
(except for the increasing counter):&lt;/div>

&lt;pre>02 7F 04 13 00 00 00 01 04 08 00 00 00 00 00 00 00 00 00&lt;/pre>

&lt;div class='p'>The 23-byte messages are of two types:&lt;/div>

&lt;pre>02 7F 02 17 00 00 00 01 04 06 00 E8 6D E6 3C F5 7D 20 BC AA 4D 85 3B
02 7F 03 17 00 00 00 01 04 05 00 00 00 00 00 00 00 00 00 00 00 00 00
02 7F 05 17 00 00 00 01 04 06 00 5B 48 E8 3C 08 D0 1E BC 28 F0 84 3B
02 7F 06 17 00 00 00 01 04 05 00 00 00 00 00 00 00 00 00 00 00 00 00&lt;/pre>

&lt;div class='p'>And finally the 35-byte message is so far always the same.&lt;/div>

&lt;pre>02 7F 01 23 00 00 00 01 04 04 00 00 00 00 00 00 40 7F 
   40 00 00 00 00 00 40 7F 40 00 00 00 00 00 40 7F 40
02 7F 11 23 00 00 00 01 04 04 00 00 00 00 00 00 40 7F 
   40 00 00 00 00 00 40 7F 40 00 00 00 00 00 40 7F 40&lt;/pre>

&lt;div class='p'>Well, it may take a while to find the corresponding file in ARDroneSDK3 &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150203">&lt;/a>&lt;/div>

&lt;h2>3rd February 2015 &amp;mdash; generateLibARCommands.py&lt;/h2>

&lt;div class='p'>You will not find the useful files ready-made, you have to generate them first! &lt;span class='smile'>&lt;/span> In
particular you will need
&lt;a href='https://github.com/ARDroneSDK3/libARCommands' class='external'>libARCommands&lt;/a> and probably
also the main &lt;a href='https://github.com/ARDroneSDK3/ARSDKBuildUtils' class='external'>ARSDKBuildUtils&lt;/a>.&lt;/div>

&lt;div class='p'>Then you need to locate
&lt;a href='https://github.com/ARDroneSDK3/libARCommands/blob/master/Xml/generateLibARCommands.py' class='external'>generateLibARCommands.py&lt;/a>
and run it with the parameter &lt;i>-projects ARDrone3&lt;/i> (you will not find the code name
"Bebop").&lt;/div>

&lt;pre>m:\git\ARDroneSDK3\libARCommands\Xml>generateLibARCommands.py -projects ARDrone3&lt;/pre>

&lt;div class='p'>Then you can find the desired files: &lt;i>ARCOMMANDS_Ids.h&lt;/i>, &lt;i>ARCOMMANDS_Filter.c&lt;/i>
and &lt;i>ARCOMMANDS_Decoder.c&lt;/i>. The interesting lines from &lt;i>ARCOMMANDS_Filter.c&lt;/i>
are:&lt;/div>

&lt;pre>commandProject = ARCOMMANDS_ReadWrite_Read8FromBuffer (&amp;hellip;);
commandClass = ARCOMMANDS_ReadWrite_Read8FromBuffer (&amp;hellip;);
commandId = ARCOMMANDS_ReadWrite_Read16FromBuffer (&amp;hellip;);&lt;/pre>

&lt;div class='p'>So if you look for the byte &lt;b>commandProject&lt;/b> = 1
(ARCOMMANDS_ID_PROJECT_ARDRONE3), you will find combinations like 01 04
04 00 in the 35-byte messages in the previous post. &lt;b>commandClass&lt;/b> = 4
(ARCOMMANDS_ID_ARDRONE3_CLASS_PILOTINGSTATE) and &lt;b>commandId&lt;/b> = 4
(ARCOMMANDS_ID_ARDRONE3_PILOTINGSTATE_CMD_POSITIONCHANGED). Now you can look up
the decoding routine in &lt;i>ARCOMMANDS_Decoder.c&lt;/i>:&lt;/div>

&lt;pre>_latitude = ARCOMMANDS_ReadWrite_ReadDoubleFromBuffer (&amp;hellip;);
_longitude = ARCOMMANDS_ReadWrite_ReadDoubleFromBuffer (&amp;hellip;);
_altitude = ARCOMMANDS_ReadWrite_ReadDoubleFromBuffer (&amp;hellip;);&lt;/pre>

&lt;div class='p'>All three 8-byte values are &lt;b>00 00 00 00 00 40 7F 40&lt;/b>, i.e. the little-endian double 500.0 (apparently a placeholder while there is no GPS fix).&lt;/div>

&lt;ul>
&lt;li>ARCOMMANDS_ID_ARDRONE3_PILOTINGSTATE_CMD_SPEEDCHANGED = 5
&amp;hellip; float _speedX, _speedY, _speedZ;&lt;/li>

&lt;li>ARCOMMANDS_ID_ARDRONE3_PILOTINGSTATE_CMD_ATTITUDECHANGED = 6
&amp;hellip; float _roll, _pitch, _yaw;&lt;/li>

&lt;li>ARCOMMANDS_ID_ARDRONE3_PILOTINGSTATE_CMD_ALTITUDECHANGED = 8
&amp;hellip; double _altitude;&lt;/li>
&lt;/ul>

&lt;div class='p'>That could be enough to get started &lt;span class='wink'>&lt;/span>. See related
&lt;a href='https://github.com/robotika/katarina/commit/28ef70baa1ffb80efb558639360a5b60c02a542e' class='external'>diff&lt;/a>.&lt;/div>

&lt;div class='p'>p.s. I expected a copy and paste error, but they really are different: "ATTITUDE" vs.
"ALTITUDE"&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150204">&lt;/a>&lt;/div>

&lt;h2>4th February 2015 &amp;mdash; Commands required&lt;/h2>

&lt;div class='p'>I asked Darryl to repeat my first step and in the meantime I also did it myself
this morning. And it did not work :-(. At least not for 100 messages. When I set
the limit to 10 or even 50 it was OK. So there is some time limit within which you
have to send some command to the drone.&lt;/div>

&lt;div class='p'>I did not mention this, but just for reference I am using &lt;b>Windows 7&lt;/b> and
&lt;b>Python 2.7&lt;/b>. On the other hand it should so far also work on other OSs (at
the moment there are no dependencies, but that will probably change soon and the OpenCV2
and NumPy libraries will be required).&lt;/div>

&lt;div class='p'>So how to send the command and which command would not be too dangerous? It is
surely not a good idea to send &lt;b>takeoff&lt;/b> first &lt;span class='wink'>&lt;/span>. There are commands like
&lt;b>SetHome&lt;/b> and &lt;b>ResetHome&lt;/b> which should generate some drone response and
without flying they should not be harmful.&lt;/div>

&lt;div class='p'>The encoding scheme is similar to decoding:&lt;/div>

&lt;pre>ARCOMMANDS_ReadWrite_AddU8ToBuffer(&amp;hellip;ARCOMMANDS_ID_PROJECT_ARDRONE3&amp;hellip;);
ARCOMMANDS_ReadWrite_AddU8ToBuffer(&amp;hellip;ARCOMMANDS_ID_ARDRONE3_CLASS_GPSSETTINGS&amp;hellip;);
ARCOMMANDS_ReadWrite_AddU16ToBuffer(&amp;hellip;ARCOMMANDS_ID_ARDRONE3_GPSSETTINGS_CMD_RESETHOME&amp;hellip;);&lt;/pre>

&lt;div class='p'>and&lt;/div>

&lt;pre>ARCOMMANDS_ReadWrite_AddU8ToBuffer(&amp;hellip;ARCOMMANDS_ID_PROJECT_ARDRONE3&amp;hellip;);
ARCOMMANDS_ReadWrite_AddU8ToBuffer(&amp;hellip;ARCOMMANDS_ID_ARDRONE3_CLASS_GPSSETTINGS&amp;hellip;);
ARCOMMANDS_ReadWrite_AddU16ToBuffer(&amp;hellip;ARCOMMANDS_ID_ARDRONE3_GPSSETTINGS_CMD_SETHOME&amp;hellip;);
ARCOMMANDS_ReadWrite_AddDoubleToBuffer(&amp;hellip;_latitude&amp;hellip;);
ARCOMMANDS_ReadWrite_AddDoubleToBuffer(&amp;hellip;_longitude&amp;hellip;);
ARCOMMANDS_ReadWrite_AddDoubleToBuffer(&amp;hellip;_altitude&amp;hellip;);&lt;/pre>

&lt;div class='p'>The question is what is necessary to send &lt;i>before&lt;/i> these bytes are encoded??&lt;/div>

&lt;div class='p'>I will at least answer one of Darryl's questions: &lt;i>How could you tell about
counter, checksum, and time??&lt;/i> Well, I was probably wrong about the checksum, and the
time, or better &lt;i>timestamp&lt;/i>, I guessed because the 4-byte number was increasing (and
I may be wrong). But I am pretty sure about the &lt;i>counter&lt;/i>. Parrot used
that in the ARDrone2 (an absolutely increasing uint32) and also for the Rolling Spider
(uint8).  Packets can be lost in UDP communication and one way to
recognize this fact is to index them. The packets also do not have to arrive in
the original order, although that is probably not the case with a direct WiFi
connection.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150205">&lt;/a>&lt;/div>

&lt;h2>5th February 2015 &amp;mdash; libARNetworkAL&lt;/h2>

&lt;div class='p'>Lesson learned: if you want to play with ARDrone3 download &lt;b>ALL 14
repositories!!!&lt;/b> Do not be lazy like me! Otherwise you may omit for example
&lt;i>libARNetworkAL&lt;/i>. It is the place where you will find this structure:&lt;/div>

&lt;pre>typedef struct
{
    uint8_t type;     /* frame type eARNETWORK_FRAME_TYPE */
    uint8_t id;       /* identifier of the buffer sending the frame */
    uint8_t seq;      /* sequence number of the frame */
    uint32_t size;    /* size of the frame */
    uint8_t *dataPtr; /* pointer on the data of the frame */
}&lt;/pre>

&lt;div class='p'>Not interested? Well, review the collected data now &lt;span class='wink'>&lt;/span>. &lt;b>02&lt;/b> is &lt;b>type&lt;/b> and
it corresponds to &lt;i>ARNETWORKAL_FRAME_TYPE_DATA&lt;/i> &amp;mdash; &lt;i>Data type. Main type
for data that does not require an acknowledge&lt;/i>.&lt;/div>

&lt;div class='p'>The next item was &lt;b>00&lt;/b> or &lt;b>7F&lt;/b> and it is the &lt;b>id&lt;/b>. The &lt;b>seq&lt;/b> was guessed
correctly, but &lt;b>size&lt;/b> is not a single byte but a whole &lt;b>uint32&lt;/b>, which explains
the three zeros. And the rest we were already able to decode (see
&lt;a href='https://github.com/robotika/katarina/commit/1b56ae9ba807c3c11376ecb82ebd4f0bec55e2f7' class='external'>diff&lt;/a>).
&lt;span class='smile'>&lt;/span>&lt;/div>
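
&lt;div class='p'>Putting the pieces together, one of the 23-byte attitude messages captured earlier can
be decoded like this (a sketch only; in the real code the frame header is of course parsed
before the command is dispatched):&lt;/div>

&lt;pre>import struct

raw = '02 7F 02 17 00 00 00 01 04 06 00 E8 6D E6 3C F5 7D 20 BC AA 4D 85 3B'
data = ''.join(chr(int(x, 16)) for x in raw.split())

frame_type, frame_id, seq = struct.unpack('&lt;BBB', data[:3])
size = struct.unpack('&lt;I', data[3:7])[0]
assert size == len(data)
project, cmd_class, cmd_id = struct.unpack('&lt;BBH', data[7:11])
if (project, cmd_class, cmd_id) == (1, 4, 6):   # ARDrone3 / PilotingState / AttitudeChanged
    roll, pitch, yaw = struct.unpack('&lt;fff', data[11:])
    print 'attitude [rad]:', roll, pitch, yaw&lt;/pre>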

&lt;div class='p'>So now, what to do with the problem from yesterday? How to &lt;i>encode&lt;/i> whole
command packet? Based on &lt;i>ARNETWORKAL_WifiNetwork.c&lt;/i> and function
&lt;i>ARNETWORKAL_WifiNetwork_PushFrame&lt;/i> it should be the same frame as we just
decoded. I would guess that like for &lt;a href='/robots/jessica/en'>Jessica&lt;/a> the
&lt;b>type=2&lt;/b> but I am not sure about &lt;i>id&lt;/i>. According to
&lt;i>ARNETWORK_IOBufferParam.h&lt;/i> in &lt;i>libARNetwork&lt;/i> &lt;b>ID&lt;/b> &amp;mdash; &lt;i>Identifier used
to find the IOBuffer in a list - Valid range : 10-127&lt;/i>.&lt;/div>

&lt;div class='p'>&lt;a href='https://github.com/robotika/katarina/commit/768076fcb42443c4c679f3fbb360c8d717b6f1e0' class='external'>Here&lt;/a>
is my first attempt to send a command &amp;mdash; unsuccessful :-(. But I received new
messages &lt;span class='wink'>&lt;/span>. In particular&lt;/div>

&lt;pre>04 7E 01 23 00 00 00 01 18 00 00 00 00 00 00 00 40 7F C0 
         00 00 00 00 00 40 7F C0 00 00 00 00 00 40 7F C0&lt;/pre>

&lt;div class='p'>where &lt;b>04&lt;/b> is message with required acknowledgement
(&lt;i>ARNETWORKAL_FRAME_TYPE_DATA_WITH_ACK&lt;/i>), &lt;b>01 18 00 00&lt;/b> is
ARCOMMANDS_ID_PROJECT_ARDRONE3 = 1,
ARCOMMANDS_ID_ARDRONE3_CLASS_GPSSETTINGSSTATE = 24,
ARCOMMANDS_ID_ARDRONE3_GPSSETTINGSSTATE_CMD_HOMECHANGED = 0 &amp;hellip; so maybe I was
successful in the end! &lt;span class='smile'>&lt;/span>. This worked for &lt;b>id=10&lt;/b>, but did not work for 0x7F
and 0x40.&lt;/div>

&lt;div class='p'>Confirmed &lt;span class='smile'>&lt;/span>. So one more small
&lt;a href='https://github.com/robotika/katarina/commit/385e8486622f4235a3c24a890eb0529d35009c77' class='external'>diff&lt;/a>.
Also for &lt;i>frameId=10&lt;/i> I received all 10+100 messages, while for the other two
"random" channels it stopped before 100.&lt;/div>

&lt;div class='p'>OK, so the next step will be confirmation of the received message (type 4).&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150215">&lt;/a>&lt;/div>

&lt;h2>15th February 2015 &amp;mdash; PING and PONG&lt;/h2>

&lt;div class='p'>Well, nothing really exciting yet &lt;span class='wink'>&lt;/span>. I tried to confirm (acknowledge) the message
about &lt;i>GPS home changed&lt;/i>, and I did not succeed. But in that particular piece of
code (see original
&lt;a href='https://github.com/ARDroneSDK3/libARNetwork/blob/master/Sources/ARNETWORK_Receiver.c#L191' class='external'>ARNETWORK_Receiver.c:191&lt;/a>)
there is an extra switch for handling frame.id
&lt;i>ARNETWORK_MANAGER_INTERNAL_BUFFER_ID_PONG&lt;/i>. If you look this ID up you will
find that it is 0. And what is it?&lt;/div>

&lt;div class='p'>It looks like these are the 15-byte long messages starting with the &lt;b>02 00 XX
0F&lt;/b> prefix. The payload is a &lt;i>struct timespec&lt;/i>, which should be 8 bytes
(time_t and long), probably both 4 bytes (seconds, nanoseconds).&lt;/div>

&lt;pre>Time 176.160592961
Time 177.181624819
Time 178.202618948
Time 179.362471135
Time 180.38329366&lt;/pre>
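
&lt;div class='p'>Timestamps like the ones above can be obtained with a decoding along these lines
(assuming both fields are little-endian 32-bit values, which matches the data so far):&lt;/div>

&lt;pre>import struct

def decode_ping(frame):
    "frame = one 15-byte PING message (type 02, id 00); returns seconds as float"
    sec, nsec = struct.unpack('&lt;II', frame[7:15])
    return sec + nsec * 1e-9&lt;/pre>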

&lt;div class='p'>So it looks like a PING every second, and because it is not PONGed, the connection
is lost after a while (I guess).&lt;/div>

&lt;div class='p'>There was a small surprise regarding the UDP packets. One packet can contain
several messages and thus simple logging (see
&lt;a href='https://github.com/robotika/katarina/commit/f3ec23f67452656b87625b9ca560aa2059d73b0b' class='external'>diff&lt;/a>)
is not sufficient. So far I have seen only combinations of standard messages
with these PING messages, but who knows &amp;hellip;&lt;/div>

&lt;div class='p'>&lt;b>Update:&lt;/b>&lt;/div>

&lt;ul>
&lt;li>the UDP packets are now split for reliable record and replay + asserts On/Off
(&lt;a href='https://github.com/robotika/katarina/commit/7106c6c42851cb2aa046ccdc7633ddeabe87210c' class='external'>diff&lt;/a>)&lt;/li>

&lt;li>I got ping-pong working and tested it on 1000 messages 
(&lt;a href='https://github.com/robotika/katarina/commit/7e1e91b7dcd6c31b84b1263a5e02c401e834675d' class='external'>diff&lt;/a>)&lt;/li>

&lt;li>thanks to 
&lt;a href='https://github.com/ARDroneSDK3/Samples/blob/master/Unix/JumpingSumoChangePosture/JumpingSumoChangePosture.c' class='external'>Jumping Sumo sample&lt;/a>
I know what the configuration looks like &amp;hellip; frame.id = 10/127 for normal messages, 11/126 for acknowledged&lt;/li>

&lt;li>I received a new message &lt;b>04 7E 02 0C 00 00 00  00 05 01 00 38&lt;/b> which should mean ARCOMMANDS_ID_PROJECT_COMMON = 0,
ARCOMMANDS_ID_COMMON_CLASS_COMMONSTATE = 5, ARCOMMANDS_ID_COMMON_COMMONSTATE_CMD_BATTERYSTATECHANGED = 1, value 0x38 = 56%
&amp;hellip; so it is probably OK that I did not always receive this&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;div class='p'>&lt;a id="150216">&lt;/a>&lt;/div>

&lt;h2>16th February 2015 &amp;mdash; ACK of ACK?&lt;/h2>

&lt;div class='p'>There was a big mistake in the code yesterday &amp;mdash; yes, I wrote that code. It is
necessary to pack the payload for a normal command, but special cases like ACK or
PING are already prepared to be sent as they are (see
&lt;a href='https://github.com/robotika/katarina/commit/66fbbb2a31d1597387b5e3731fcc0812e6a51242' class='external'>fix&lt;/a>).&lt;/div>

&lt;div class='p'>This means that ping-pong was probably not working; only the communication kept
the drone busy. After the change I received strange 8-byte messages. They had
&lt;i>frameType&lt;/i> 0x1 (ARNETWORKAL_FRAME_TYPE_ACK) and they were coming with
&lt;i>frameId&lt;/i> 0x8B?! In HEX it is more visible: 0x8B = 0x0B + 0x80. Now I would guess that it
was an acknowledgement of my (wrongly composed) acknowledgement of the original &lt;i>home
changed&lt;/i> or &lt;i>battery status changed&lt;/i> from &lt;i>frameId&lt;/i> 11 = 0xB.&lt;/div>

&lt;div class='p'>I am now convinced that &lt;i>frameId&lt;/i> &lt;b>10&lt;/b> is for sending standard messages
without acknowledgement and &lt;b>11&lt;/b> for any (?) messages for which I would like
to get an acknowledgement. I probably did not succeed in sending the ACK, because the
message from the drone kept repeating. The plan is to try (again)
&lt;i>frameType&lt;/i> 0x1 (ARNETWORKAL_FRAME_TYPE_ACK &amp;mdash; &lt;i>Acknowledgment type.
Internal use only&lt;/i> &amp;hellip; but I am probably working on "internal" now), and
&lt;i>frameId&lt;/i> = 0x7E + 0x80 &amp;hellip; I wonder what the chance of success is :-(.&lt;/div>

&lt;div class='p'>p.s. Yesterday I saw the drone die on low battery again. The last message
was something like &lt;i>Battery 32%&lt;/i>, so either the main computer eats a lot or the
auto-shutdown threshold is set relatively high.&lt;/div>

&lt;div class='p'>p.s.2 it looks like it worked
(&lt;a href='https://github.com/robotika/katarina/commit/96e608077107613c11a00e2bfbb8497f7c8e1470' class='external'>diff&lt;/a>)
 &amp;hellip; &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150217">&lt;/a>&lt;/div>

&lt;h2>17th February 2015 &amp;mdash; Low latency video stream&lt;/h2>

&lt;div class='p'>It is time for the next step. I found
&lt;i>ARCOMMANDS_ID_ARDRONE3_MEDIASTREAMING_CMD_VIDEOENABLE&lt;/i> in the list of IDs
and tried to call it (see
&lt;a href='https://github.com/robotika/katarina/commit/b106f20c7fbcf3b6c44b6e02f6680773064ce9af' class='external'>diff&lt;/a>).
There were many new messages almost 1kB long and they looked like:&lt;/div>

&lt;pre>03 7D 7D F4 03 00 00 07 00 01 00 1F 00 00 00 01 27 42 E0 28 &amp;hellip;
03 7D 7E F4 03 00 00 07 00 01 01 1F DE 11 5A 4E 70 5E BE F6 &amp;hellip;
03 7D 7F F4 03 00 00 07 00 01 02 1F 96 C4 44 3E A3 7F 0E 0D &amp;hellip;
03 7D 80 F4 03 00 00 07 00 01 03 1F D1 8B 07 A2 57 2A 86 C7 &amp;hellip;
&amp;hellip;&lt;/pre>

&lt;div class='p'>At first I was happy to see &lt;b>00 00 00 01&lt;/b>, which is used as a start tag in
the H.264 video codec. I cut out that part and tried to replay it with the old
&lt;a href='https://github.com/robotika/heidi/blob/master/rr_drone.py' class='external'>Heidi code&lt;/a>. I saw
patches of a real image, but also many errors.&lt;/div>

&lt;div class='p'>Later I recorded a longer communication (instead of 100 messages it was 1000 and
5000) and I was surprised that not all packets arrived and that they were
repeating. Why? The answer is simple: they have to be acknowledged. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>Note the &lt;b>03&lt;/b> at the beginning &amp;mdash; this is a new type of frame
(ARNETWORKAL_FRAME_TYPE_DATA_LOW_LATENCY). Also note that there is a new
&lt;i>frameId&lt;/i> &lt;b>7D&lt;/b> and according to the
&lt;a href='https://github.com/ARDroneSDK3/Samples/blob/master/Unix/JumpingSumoReceiveStream/JumpingSumoReceiveStream.c' class='external'>Jumping
Sumo Receive Stream sample&lt;/a> the acknowledge &lt;i>frameId&lt;/i> could be equal to
JS_NET_CD_VIDEO_ACK_ID = 13.&lt;/div>

&lt;div class='p'>Both I/O structures look reasonable (see
&lt;a href='https://github.com/ARDroneSDK3/libARStream/blob/master/Sources/ARSTREAM_NetworkHeaders.h' class='external'>libARStream/ARSTREAM_NetworkHeaders.h&lt;/a>):&lt;/div>

&lt;pre>typedef struct {
    uint16_t frameNumber; /** id of the current frame */
    uint8_t frameFlags; /** Infos on the current frame */
    uint8_t fragmentNumber; /** Index of the current fragment in current frame */
    uint8_t fragmentsPerFrame; /** Number of fragments in current frame */
} __attribute__ ((packed)) ARSTREAM_NetworkHeaders_DataHeader_t;&lt;/pre>

&lt;div class='p'>So in my example &lt;i>07 00 01 00 1F&lt;/i> means &lt;i>frameNumber&lt;/i> = 7, &lt;i>frameFlags&lt;/i> is
so far always 1 (FLUSH FRAME), &lt;i>fragmentNumber&lt;/i> is increasing and &lt;b>1F&lt;/b> is
&lt;i>fragmentsPerFrame&lt;/i> (31 fragments).&lt;/div>

&lt;div class='p'>Now it is time to implement video packet acknowledgement:&lt;/div>

&lt;pre>typedef struct {
    uint16_t frameNumber; /** id of the current frame */
    uint64_t highPacketsAck; /** Upper 64 packets bitfield */
    uint64_t lowPacketsAck; /** Lower 64 packets bitfield */
} __attribute__ ((packed)) ARSTREAM_NetworkHeaders_AckPacket_t;&lt;/pre>

&lt;div class='p'>i.e. there are up to 128 frame fragments, each with its own confirmation bit (a sketch of building such an acknowledgement payload is below).&lt;/div>
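
&lt;div class='p'>Assuming the same little-endian packing as for the other frames, the acknowledgement
payload can be built roughly like this:&lt;/div>

&lt;pre>import struct

def build_ack_payload(frame_number, received_fragments):
    "received_fragments = iterable of fragment indices 0..127 seen for this frame"
    bits = 0
    for i in received_fragments:
        bits |= 1 &lt;&lt; i
    high = (bits >> 64) &amp; 0xFFFFFFFFFFFFFFFF
    low = bits &amp; 0xFFFFFFFFFFFFFFFF
    return struct.pack('&lt;HQQ', frame_number, high, low)&lt;/pre>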

&lt;hr/>

&lt;div class='p'>&lt;a id="150218">&lt;/a>&lt;/div>

&lt;h2>18th February 2015 &amp;mdash; video.py&lt;/h2>

&lt;div class='p'>It looks like the video packet confirmation works (see the code
&lt;a href='https://github.com/robotika/katarina/commit/f1c302ccd3f56c23980470d72cd992c0e946415d' class='external'>diff&lt;/a>).
Note that the &lt;i>frame fragments&lt;/i> can arrive in random order and there can be
some duplicates as well. Because I wanted to see the video and I was not
patient enough, I wrote a simple &lt;i>brute force&lt;/i> utility which sorts the
fragments in memory and creates a video file (see
&lt;a href='https://github.com/robotika/katarina/commit/cb7a37f66b15671ff9845064cdcf771fc9e82adb' class='external'>video.py&lt;/a>).
And it works. &lt;span class='smile'>&lt;/span>&lt;/div>
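
&lt;div class='p'>The brute force idea itself is tiny (a sketch; the real &lt;i>video.py&lt;/i> also parses the
log and deals with duplicates):&lt;/div>

&lt;pre># fragments[(frame_number, fragment_number)] = fragment data, filled while parsing the log
def write_video(filename, fragments):
    with open(filename, 'wb') as out:
        for key in sorted(fragments):
            out.write(fragments[key])&lt;/pre>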

&lt;div class='p'>I am using code from Heidi to replay the video file
(&lt;a href='https://github.com/robotika/heidi/blob/master/rr_drone.py' class='external'>rr_drone.py&lt;/a> or
directly
&lt;a href='https://github.com/robotika/heidi/blob/master/airrace.py' class='external'>airrace.py&lt;/a>). But
because the file does not contain &lt;b>PaVE&lt;/b> headers any more, you can directly
load it with OpenCV:&lt;/div>

&lt;pre>import cv2
cap = cv2.VideoCapture( "video.bin" )
ret, frame = cap.read()
cv2.imwrite( "first-video.jpg", frame )&lt;/pre>

&lt;div class='p'>and this is my first result:
&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/first-video.jpg'>&lt;img src='/robots/katarina/first-video_t.jpg' alt='first successfully transfered video frame' title='first successfully transfered video frame' class='border'  width='320' height='184'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/first-video.jpg'>first successfully transfered video frame&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>I know that the picture of the side view of our fridge is not very exciting.
&lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>If you want just to see the frame(s), use:&lt;/div>

&lt;pre>cv2.imshow('image', frame)
cv2.waitKey(0)&lt;/pre>

&lt;div class='p'>with the wait in milliseconds, or 0 to pause until the user presses any key. And finally,
if you want your code to be as it should be, clean it all up at the end:&lt;/div>

&lt;pre>cap.release()
cv2.destroyAllWindows()&lt;/pre>

&lt;div class='p'>Yesterday I finally took off. It was only in the &lt;i>Free Flight 3&lt;/i> application but
it was still exciting. The bad news is that &lt;b>Bebop&lt;/b> has the same problem as the
&lt;b>Rolling Spider&lt;/b> &amp;mdash; you cannot land until you complete the takeoff sequence :-(.
That means trouble and I hope Parrot will change this soon in a future
firmware.&lt;/div>

&lt;div class='p'>The second observation was related to image stabilization &amp;mdash; I did not notice
it before, or there is some setting to turn it on/off, but if you tilt your
drone the image does not tilt &lt;span class='smile'>&lt;/span>. The same if you point it up and down: the image remains
stable. The application had asked for magnetometer calibration before (which I did
not do because I was afraid that it would take off), and yesterday we calibrated
it. So maybe that is why it is working now?&lt;/div>

&lt;div class='p'>What to do next? I plan to try moving the image ROI (region of interest), switching to
HD video resolution (from the picture you can see that it was 640x368), taking a
14 Mpx picture &amp;hellip; and probably finally take off, move and land
autonomously &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150220">&lt;/a>&lt;/div>

&lt;h2>20th February 2015 &amp;mdash; Camera Tilt/Pan and Emergency&lt;/h2>

&lt;div class='p'>Yesterday I experimented with the camera tilt and pan
(&lt;a href='https://github.com/robotika/katarina/commit/6943b766a224f218bb1b2acc9e3050e4a84a2568' class='external'>diff&lt;/a>).
It was fun, because it worked immediately &lt;span class='wink'>&lt;/span>. Yes, except for the detail that the byte
for setting the camera angle is signed, so the packing once failed, but the fix was
trivial.&lt;/div>

&lt;div class='p'>I also implemented the utility
&lt;a href='https://github.com/robotika/katarina/blob/master/play.py' class='external'>play.py&lt;/a> &amp;mdash;
basically a short piece of code for playing the video as I described it in the previous
report.  Now, as soon as you finish your test, you can replay your video
directly from the navdata log file.&lt;/div>

&lt;div class='p'>This morning I tried the first &lt;b>takeoff()&lt;/b>. It is always exciting, a kind of
adrenaline fun, because you do not know what will happen. I was holding Katarina
tight and called &lt;i>robot.takeoff()&lt;/i> followed by &lt;i>robot.emergency()&lt;/i> from a
Python script, i.e. cut the motors as soon as you start them. What would you
expect to happen? The propellers were still turning after the script
terminated!&lt;/div>

&lt;div class='p'>You can turn Bebop upside down and that stops the motors, but still &amp;hellip; it was a
surprise even though I was &lt;i>ready for everything&lt;/i>.
&lt;a href='https://github.com/robotika/katarina/commit/cb02800a6b2a26e78989e36e3fcf7d675db090c5' class='external'>Here&lt;/a>
is the code plus extra lines for parsing new messages: calibration info,
flight number info and piloting state.&lt;/div>

&lt;div class='p'>I suppose that there will be the same problem as with the
&lt;a href='/robots/jessica/en'>Rolling Spider&lt;/a>, i.e. I will have to wait until the takeoff
sequence is completed before I can land :-(. BTW this is one of the reasons I
am a bit afraid to take off in this narrow space.&lt;/div>

&lt;div class='p'>&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/FktDClh8fEk?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150221">&lt;/a>&lt;/div>

&lt;h2>21st February 2015 &amp;mdash; First Crash&lt;/h2>

&lt;div class='p'>„I told you there would be a problem!” &amp;hellip; and yes, there was. I partially
moved the furniture in the living room to have enough space for the first takeoff
test (code
&lt;a href='https://github.com/robotika/katarina/commit/9a02496e4b8e2f78fcf76fb360a4877c5c1dc357' class='external'>diff&lt;/a>).
The first test was over quickly: battery low, battery critical and stop, so the
propellers hardly turned.&lt;/div>

&lt;div class='p'>I swapped the batteries (Bebop has a second battery in the standard package,
which is very nice), took off, and during hover the drone slid a little bit to
the left, hit the clothes drier and landed. The video is not very exciting, so I
picked at least one video frame recorded by the drone, when it is
falling/landing and sees its own protective hull:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/crash-hull.jpg'>&lt;img src='/robots/katarina/crash-hull_t.jpg' alt='Falling down' title='Falling down' class='border'  width='320' height='184'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/crash-hull.jpg'>Falling down&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Later that day I also did an outdoor test with the &lt;i>Free Flight 3&lt;/i> application. And
that looked even more scary. There was a light breeze and at one moment the
drone was completely uncontrollable (WiFi connection lost?). I wonder how the
&lt;i>black box&lt;/i> works and whether any data are stored? I am also curious whether
Flight No XXX was recorded by default?&lt;/div>

&lt;div class='p'>Sigh. Nothing. Based on the &lt;a href='https://github.com/matthewlloyd/bebop' class='external'>Bebop hacking
page&lt;/a> I would have had to enable the Black Box in &lt;i>/etc/debug.conf&lt;/i>. During
the tests in the underground garage we probably turned off automatic video
recording, so there is nothing from yesterday. Sigh. On the other hand the
quality of the recorded garage video is great (full HD, stable image). &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150222">&lt;/a>&lt;/div>

&lt;h2>22nd February 2015 &amp;mdash; ManualControlException and FlatTrim&lt;/h2>

&lt;div class='p'>Katarina is finally slowly learning the same commands as
&lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a>. Today I added &lt;b>ManualControlException&lt;/b>, something
like the red Emergency STOP button on every robot. For the flying drone this means:
land as soon as you hit any key on the keyboard. The bad news is that there
is no platform-independent &lt;i>kbhit&lt;/i>, so at the moment it works only under
Windows.  There was an implementation with &lt;i>pygame&lt;/i> in the Linux version (Isabelle),
but I did not test it yet and it is necessary to create some window and
initialize the &lt;i>pygame&lt;/i> library. An alternative is Joris's Linux version &amp;mdash; TODO
(let me know if you would like to use Katarina's code on Linux or Mac). Here is
the slightly bigger
&lt;a href='https://github.com/robotika/katarina/commit/83642a821d714744f5c4317b7d53c4d1743e6c22' class='external'>diff&lt;/a>.
Note that
&lt;a href='https://github.com/robotika/katarina/blob/master/apyros/sourcelogger.py' class='external'>apyros/sourcelogger.py&lt;/a>
is a copy from the
&lt;a href='https://github.com/robotika/heidi/blob/master/sourcelogger.py' class='external'>Heidi code&lt;/a>.&lt;/div>

&lt;div class='p'>Any exciting moments? &lt;span class='wink'>&lt;/span> &amp;hellip; well, at least two. First of all, the drone was
&lt;b>always&lt;/b> moving to the left as soon as it switched to hover mode. So
yesterday it was not necessarily the breeze, but probably a missing flat trim.
Fixed. Now it stays in place. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>The second funny moment was that I forgot the &lt;i>land()&lt;/i> command in the normal run.
When I hit the emergency button everything was fine, but then I let that be;
actually I hit the button outside the &lt;i>try..except&lt;/i> block and the drone was
still flying and flying. The new lesson is that you can grab the Bebop with the
same trick as the ARDrone2 &amp;mdash; one hand from the top, so you do not disturb the down
pointing camera and sonar, and then the second hand from the bottom, and twist it
up-side-down.&lt;/div>

&lt;div class='p'>The last step I did today was some cleaning (see
&lt;a href='https://github.com/robotika/katarina/commit/b435fbb2bca25c728ff1bd13ace1101db1bd306d' class='external'>diff&lt;/a>),
so now the commands are in a separate file, &lt;i>trim()&lt;/i> finishes as soon as the drone
confirms it, and &lt;i>takeoff()&lt;/i> and &lt;i>land()&lt;/i> are terminated by the flying state
change.&lt;/div>

&lt;div class='p'>What next? Redirect the incoming video frames to an extra worker thread and look
for the two-color cap:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/md-cap.jpg'>&lt;img src='/robots/katarina/md-cap_t.jpg' alt='Sample image with the Parrot Cap' title='Sample image with the Parrot Cap' class='border'  width='320' height='184'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/md-cap.jpg'>Sample image with the Parrot Cap&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150223">&lt;/a>&lt;/div>

&lt;h2>23rd February 2015 &amp;mdash; Fandorama&lt;/h2>

&lt;div class='p'>I will switch this blog to the Czech language for a while. I did not expect it,
but in the end the
&lt;a href='http://fandorama.cz/projekty/921778164/katarina-bebop/' class='external'>fandorama fundraising
project&lt;/a> was successful &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>There will surely be updates on
&lt;a href='https://github.com/robotika/katarina' class='external'>github&lt;/a>, and I will probably write about
major changes/discoveries in both languages, but &amp;hellip; let me know if you find
this article/blog interesting, and whether I should explain or translate some
parts.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150320">&lt;/a>&lt;/div>

&lt;h2>20th March 2015 &amp;mdash; Status quo&lt;/h2>

&lt;div class='p'>It is hard to believe that it is almost a month since I switched to Czech
„blogging”. There is some progress, but it is still far from perfect &lt;span class='wink'>&lt;/span>. In
particular I am now fighting with the streamed video, which &lt;i>sometimes&lt;/i> stops, and then
I have to power down the drone to restart it. No idea why.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/screen.jpg'>&lt;img src='/robots/katarina/screen_t.jpg' alt='Screenshot with cap color detection' title='Screenshot with cap color detection' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/screen.jpg'>Screenshot with cap color detection&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>On the other hand the on-board video recording works fine, so I have an „internal”
reference, but that is useless for autonomous navigation. It is somehow related
to flying &amp;hellip; maybe I have to send the ping messages myself to keep the drone
going??&lt;/div>

&lt;div class='p'>I did some experiments in our garden and it turned out that the grass is not
well suited for landing (it was good in one crash case, but now I mean a
„standard successful automatic landing”). That was an old task for the
&lt;a href='/robots/heidi/en'>&lt;span class='cs'>ARDrone2&lt;/span>&lt;/a>, when the detection was done by Parrot's code. This
is no longer the case, so I have to do it myself:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/katarina/roundel-too-far.jpg'>&lt;img src='/robots/katarina/roundel-too-far_t.jpg' alt='Example of wrong detection' title='Example of wrong detection' class='border'  width='320' height='184'/>&lt;/a>&lt;br/>
&lt;a href='/robots/katarina/roundel-too-far.jpg'>Example of wrong detection&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>There is a new folder
&lt;a href='https://github.com/robotika/katarina/tree/master/behaviors' class='external'>behaviors&lt;/a>, where
you can find the experimental &lt;i>navigation to a box&lt;/i>,
&lt;a href='https://github.com/robotika/katarina/blob/master/behaviors/navbox.py' class='external'>navbox.py&lt;/a>.
Image multiprocessing is prepared but not fully integrated yet. It should be
fun as soon as the video stream is reliable.&lt;/div>

&lt;div class='p'>What else? I did some experiments with &lt;b>navdata&lt;/b> and the &lt;b>black box&lt;/b> according
to the forum thread
&lt;a href='http://forum.parrot.com/usa/viewtopic.php?id=29088' class='external'>Hacking the Bebop&lt;/a>. Both
collect data on-board, including sonar and EKF estimates, but they are not
streamed over the network :-( &amp;hellip; see
&lt;a href='https://github.com/ARDroneSDK3/ARSDKBuildUtils/issues/2' class='external'>Nicolas' note&lt;/a> on
GitHub.&lt;/div>

&lt;div class='p'>There is a new
&lt;a href='https://github.com/robotika/katarina/blob/master/bebop-protocol.md' class='external'>Parrot
Bebop Protocol&lt;/a> document. The motivation to write it down came from a
&lt;a href='http://forum.parrot.com/english/viewtopic.php?id=15772' class='external'>British forum&lt;/a>, but
I have no idea whether it was really useful for anybody &amp;hellip;&lt;/div>

&lt;div class='p'>Yeah, and some fun from an
&lt;a href='http://forum.parrot.com/usa/viewtopic.php?id=30005' class='external'>American forum&lt;/a> &amp;hellip; &lt;i>a
fracking genius&lt;/i> &amp;hellip; &lt;i>can read hex like Neo reads the Matrix!&lt;/i> &amp;hellip; ha ha ha
&lt;span class='smile'>&lt;/span>. No, not really.  But speaking of HEX &amp;hellip; I did some (more) experiments
with H.264 codec decoding for ARDrone3 (in Czech:
&lt;a href='/articles/h264-drone-vision/en'>&lt;span class='cs'>H264 Drone Vision&lt;/span>&lt;/a>). And surprise, surprise &amp;hellip;
the codec is different. While ARDrone2 was sending only 16x16 macroblocks, you
get finer compression on Bebop with sub-blocks down to 4x4. This means that the
&lt;a href='https://github.com/robotika/h264-drone-vision' class='external'>H264 navigation code&lt;/a>
needs some work before it can be ported to the 3rd generation of Parrot's drones.&lt;/div>

&lt;div class='p'>&lt;center>
&lt;iframe width="640" height="360" src="https://www.youtube.com/embed/3pzLRF1-KkM?feature=player_embedded" frameborder="0" allowfullscreen>&lt;/iframe>
&lt;/center>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150322">&lt;/a>&lt;/div>

&lt;h2>22nd March 2015 &amp;mdash; PCMD &amp;#64;40Hz&lt;/h2>

&lt;div class='p'>This one is worth mentioning immediately. Now I know why the video streaming
fails once in a while! The reason is the timing of the PCMD messages &amp;hellip; but let's start
from the beginning.&lt;/div>

&lt;div class='p'>On Saturday I finally created an example where the video streaming fails without the
need to fly (see
&lt;a href='https://github.com/robotika/katarina/commit/63116ec5886c5c28a79ed348f4fecac8e3f8198c' class='external'>diff&lt;/a>).
Later I wrote an even simpler example
(&lt;a href='https://github.com/robotika/katarina/commit/99b0b0b12f4f03e530b4d370b5f6437247c49aab' class='external'>diff&lt;/a>):
as soon as I sent the first PCMD, the video stopped and was not restored
until the next power down and power up.&lt;/div>

&lt;div class='p'>So what to do? I am glad I wrote to
&lt;a href='https://github.com/ARDroneSDK3/libARCommands/issues/4' class='external'>ARDroneSDK3/libARCommands/issues&lt;/a>.
And surprise, surprise, I got an answer today! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;i>When using the PCMDs, you should make sure that you send them with a fixed
frequency (we use 40Hz / 25ms in FreeFlight), even if you're sending the same
command every 25ms.&lt;/i>&lt;/div>

&lt;div class='p'>&lt;i>A change in the PCMD frequency is considered by the Bebop to be a "bad wifi"
indicator. In this case, the bebop will decrease the video bandwidth (up to a
"no video" condition if the indicator is really bad) in order to save the
bandwidth for the piloting commands. When the PCMD frequency become stable
again, the video bandwidth will rise again.&lt;/i>&lt;/div>

&lt;div class='p'>This explains a lot, in particular why it sometimes worked &lt;span class='wink'>&lt;/span>. If it was a
fixed loop, like flying to a requested altitude, it was OK, but as soon as I
switched to another task the timing was broken.&lt;/div>

&lt;div class='p'>&lt;a href='https://github.com/robotika/katarina/commit/a4f81c15abfd4cb4cf97b9c16837011794fbfcab' class='external'>Here&lt;/a>
is a small hack confirming Nicolas's answer &amp;hellip; it works, even for the second run.
But &amp;hellip; yes, there is still a &lt;i>but&lt;/i>: the video works only during the times when the thread
sending PCMD at 40Hz is running, so I am asking now whether it is necessary to send PCMD all
the time, or whether there is a way to say „I do not want to send more
PCMD!”.&lt;/div>
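
&lt;div class='p'>Just to illustrate the idea, here is a minimal sketch of such a sender thread (not the
actual &lt;i>katarina&lt;/i> code; the &lt;i>drone.sendPCMD()&lt;/i> call and the shared command tuple
are made up for the example):&lt;/div>

&lt;pre>import time
import threading

class PCMDSender(threading.Thread):
    "Send the latest PCMD every 25ms (40Hz), even if it did not change."
    def __init__(self, drone):
        threading.Thread.__init__(self)
        self.daemon = True
        self.drone = drone        # hypothetical object with sendPCMD(cmd)
        self.cmd = (0, 0, 0, 0)   # roll, pitch, yaw, gaz
        self.lock = threading.Lock()

    def update(self, cmd):
        with self.lock:
            self.cmd = cmd

    def run(self):
        while True:
            with self.lock:
                cmd = self.cmd
            self.drone.sendPCMD(cmd)
            time.sleep(0.025)     # keep the 40Hz period stable&lt;/pre>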

&lt;hr/>

&lt;div class='p'>&lt;a id="150408">&lt;/a>&lt;/div>

&lt;h2>8th April 2015 &amp;mdash; Terminated.&lt;/h2>

&lt;div class='p'>Well, I knew that The Day would come, but it still struck me unprepared &lt;span class='wink'>&lt;/span>.
The distributor asked me to return the drone because of some presentation. OK,
what can I do. Katarina is not mine. I could buy one, but do I want to??&lt;/div>

&lt;div class='p'>Interestingly enough, it was after the Easter weekend when I decided that if I
want to move forward I will have to write the firmware myself or use/learn from
others. Yesterday I downloaded the
&lt;a href='https://github.com/MegaPirateNG/ardupilot-mpng/tree/master' class='external'>ardupilot-mpng&lt;/a>
and &lt;a href='https://github.com/paparazzi/paparazzi' class='external'>paparazzi&lt;/a> repositories &amp;hellip; and
then I received that mail.&lt;/div>

&lt;div class='p'>Somehow I lost the motivation to practise landing yesterday and to do the cap
tracking with multiprocessing today. I deleted the videos and thumbnails from the
drone and I will return it in two hours &amp;hellip; Goodbye Katarina!&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150414">&lt;/a>&lt;/div>

&lt;h2>14th April 2015 &amp;mdash; Remote distributed debugging&lt;/h2>

&lt;div class='p'>Thanks Charles and others for the mails, queries and issues on GitHub &lt;span class='smile'>&lt;/span>. I am not
giving up yet &lt;span class='wink'>&lt;/span>. I do not have a Bebop now, but that will probably change in
May 2015. So now I am &lt;i>debugging remotely&lt;/i>, for example dropping incomplete
video frames
(&lt;a href='https://github.com/robotika/katarina/commit/81e1e38bf699b6de018b2317ca4c4da73ac6a07e' class='external'>diff&lt;/a>).
This one I did from old log files, but I can reproduce the problems even from your
new log files &lt;span class='smile'>&lt;/span>. Now I am waiting for Charles's confirmation that the fix worked
for him too.&lt;/div>

&lt;div class='p'>At the moment I am studying the &lt;a href='https://github.com/paparazzi/paparazzi' class='external'>paparazzi
code&lt;/a> for ARDrone2, but note that there is also
&lt;a href='https://github.com/paparazzi/paparazzi/tree/master/sw/airborne/boards/bebop' class='external'>Bebop
board code&lt;/a>. The goal is to stream missing navdata, like raw sonar readings &amp;hellip;
and maybe more. We will see.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150430">&lt;/a>&lt;/div>

&lt;h2>30th April 2015 &amp;mdash; Paparazzi?!&lt;/h2>

&lt;div class='p'>I had to smile yesterday when I received a notification from GitHub (regarding the
&lt;a href='https://github.com/ARDroneSDK3/libARCommands/issues/9#issuecomment-97374627' class='external'>Mavlink
seems not to work&lt;/a> issue): &lt;i>If you really need to do it before the next
firmware release (in the coming weeks), you can maybe look at Paparazzi. I
can't provide support on it, but I think they support the bebop. It is not very
easy, so maybe a small wait is better&lt;/i> &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>It was hard to believe that this was Parrot's answer, but yes, it was. So I am
probably on the right track learning
&lt;a href='https://github.com/paparazzi/paparazzi/blob/master/sw/airborne/boards/ardrone/' class='external'>Paparazzi
for ARDrone2&lt;/a> at the moment, and I can confirm that reading navdata
&lt;a href='https://github.com/robotika/heidi/tree/master/onboard' class='external'>onboard Heidi&lt;/a>
works.&lt;/div>

&lt;div class='p'>Regarding &lt;i>Katarina&lt;/i> I merged DEVELOP back to master &amp;mdash; the issue with
navigation Home is not solved (see
&lt;a href='https://github.com/ARDroneSDK3/libARCommands/issues/7' class='external'>Issue #7&lt;/a>) and it is
probably related to the minimum height above the ground. Aldo did some experiments
at 10 meters with a string, but what he would really need is MavLink.
If nothing else, at least the battery status and the include paths on Linux are
fixed, and that was the main reason why I merged it.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/robots/katarina/en#email'>contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>AppleBot</title>
	<link rel='alternate' href="http://localhost/articles/applebot/en"/>
	<id>http://localhost/articles/applebot/en</id>
	<updated>2014-12-10T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Two weeks ago a UK company contacted CZU regarding a &lt;i>Robotic arm for apple
harvesting&lt;/i>. The motivation is that finding cheap labor for picking
apples is getting harder and harder, so the company would like to buy or
develop a technology capable of autonomous apple picking. In this article you
can find our development blog. &lt;b>Blog update:&lt;/b> 30/1 &amp;mdash;
&lt;a href='/articles/applebot/en#150130'>The Others (part1)&lt;/a>
 </summary>
	<content type='html'> 
&lt;h2>Content&lt;/h2>

&lt;ul>
&lt;li>&lt;a href='/articles/applebot/en#141203'>Apple 3D scan&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141204'>Back to 2D&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141205'>cv2.MSER&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141206'>Three apples?!&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141207'>Probabilistic Sphere&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141212'>URCaps&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141215'>PoE IP Camera&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141216'>Dual Sensor&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141217'>2nd 3D-Scanning Experiment&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141218'>applethreshold.py&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141219'>Crazy Ideas&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141221'>Memories and UR5 API&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#141222'>Apple Generators&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150104'>Recording H264 video&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150109'>stereo_match.py&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150111'>matplotlib&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150116'>The First Tests with Universal Robot UR5&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150117'>UR5 Video&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150120'>Applebot UR5 ver0&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150124'>The Hand&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150125'>Standa and UR5&lt;/a>&lt;/li>

&lt;li>&lt;a href='/articles/applebot/en#150130'>The Others (part1)&lt;/a>&lt;/li>
&lt;/ul>

&lt;h2>Source code&lt;/h2>

&lt;ul>
&lt;li>&lt;a href='https://github.com/robotika/applebot' class='external'>https://github.com/robotika/applebot&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Blog&lt;/h1>

&lt;div class='p'>&lt;a id="141203">&lt;/a>&lt;/div>

&lt;h2>3rd December 2014 &amp;mdash; Apple 3D scan&lt;/h2>

&lt;div class='p'>Yesterday we did the first experiment with apple recognition at CZU. We used
already existing tools:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='https://www.mysick.com/saqqara/im0051230.pdf' class='external'>TiM55x&lt;/a> laser scanner&lt;/li>

&lt;li>effector for linear motion (developed for some soil analysis)&lt;/li>

&lt;li>&lt;a href='https://github.com/robotika/eduro/blob/master/laser.py' class='external'>laser.py&lt;/a> from &lt;a href='/robots/eduro/en'>Eduro&lt;/a> project&lt;/li>
&lt;/ul>

&lt;div class='p'>The TiM55x sensor we obtained thanks to our
&lt;a href='/competitions/sick-robot-day/2014/en'>SICK Robot Day 2014&lt;/a> participation, but we
never thoroughly tested it. It should be similar to older versions, but &amp;hellip; there
were two details:&lt;/div>

&lt;ul>
&lt;li>the IP was &lt;i>just crazy&lt;/i>: &lt;b>169.254.225.156&lt;/b> (we had to use an old notebook with
the SICK software to find this out; once this was set correctly, the sensor also responded
to &lt;i>ping&lt;/i> commands)&lt;/li>

&lt;li>if you send the authorization command used for the LMS100, you get only an old scan&lt;/li>
&lt;/ul>

&lt;div class='p'>The simple
&lt;a href='https://github.com/robotika/applebot/blob/master/scan3d.py' class='external'>scanning script&lt;/a>
just collects data as fast as possible (over Ethernet) and stores them in log
file(s) with date-time generated filenames.&lt;/div>
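
&lt;div class='p'>A minimal sketch of such a collection loop could look like this (just an illustration;
the SOPAS port and the CoLa poll telegram are written from memory and are assumptions here,
the real telegrams are handled in &lt;i>laser.py&lt;/i>):&lt;/div>

&lt;pre>import socket
import datetime

HOST, PORT = '169.254.225.156', 2111          # scanner IP from above; SOPAS port (assumption)
POLL = '\x02sRN LMDscandata\x03'              # CoLa-A "read latest scan" telegram (assumption)

filename = datetime.datetime.now().strftime('logs/scan_%y%m%d_%H%M%S.txt')
soc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
soc.connect((HOST, PORT))
with open(filename, 'w') as log:
    for i in range(350):                      # collect a fixed number of scans
        soc.send(POLL)
        log.write(repr(soc.recv(8192)) + '\n')
soc.close()&lt;/pre>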

&lt;div class='p'>The experiment was with a growing citrus tree with apples attached like for
Christmas &lt;span class='smile'>&lt;/span>. And the results?&lt;/div>

&lt;div class='p'>First of all, the &lt;i>TiM 55x&lt;/i> has coarser resolution than the former (and bigger) LMS 100
&amp;mdash; 1 degree instead of 0.5 degree. 350 readings corresponded to 0.777m and it
took 32.28 seconds, i.e. approximately 10Hz and 0.024m/s motion. So in &lt;i>X&lt;/i>
we have a resolution of 2.4mm, and in &lt;i>Y&lt;/i> it depends on the apple size and distance from the
scanner.&lt;/div>

&lt;div class='p'>Would you like to see the results? I would recommend downloading
&lt;a href='http://www.cloudcompare.org/' class='external'>CloudCompare&lt;/a> for 3D point cloud
visualisation. It is necessary to convert the laser scanner data into a point cloud,
and you can use the
&lt;a href='https://github.com/robotika/applebot/blob/master/log2pts.py' class='external'>log2pts.py&lt;/a>
script. Here is also at least one example: &lt;a href='/articles/applebot/scan_141202_183934.zip'>scan_141202_183934.zip&lt;/a>. Note
that you have to unzip it first.&lt;/div>
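
&lt;div class='p'>The conversion itself is just polar to Cartesian coordinates plus the known linear
motion of the scanner. A sketch of the idea (not the actual log2pts.py code; it assumes
1-degree steps, a 2.4mm shift per scan and distances in millimeters):&lt;/div>

&lt;pre>import math

def scans2points(scans, step_mm=2.4):
    "Convert a list of scans (each a list of distances in mm) to 3D points."
    points = []
    for i, scan in enumerate(scans):
        x = i * step_mm                              # position along the linear motion
        for j, dist in enumerate(scan):
            angle = math.radians(j - len(scan) / 2)  # 1 degree per reading, centered
            y = dist * math.sin(angle)
            z = dist * math.cos(angle)
            points.append((x, y, z))
    return points&lt;/pre>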

&lt;div class='p'>And here are the first views in &lt;i>Cloud Compare&lt;/i>:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/apple-3dscan.png'>&lt;img src='/articles/applebot/apple-3dscan_t.png' alt='3D scan' title='3D scan' class='border'  width='320' height='172'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/apple-3dscan.png'>3D scan&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/pot-detail3d.png'>&lt;img src='/articles/applebot/pot-detail3d_t.png' alt='pot detail' title='pot detail' class='border'  width='320' height='172'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/pot-detail3d.png'>pot detail&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>A small disappointment? Well, I have to admit that I did not recognize it either
at first sight. It is necessary to rotate it a little; then you should easily
recognize the railing, and the three planes are the ground, a wall and the ceiling.&lt;/div>

&lt;div class='p'>People probably still prefer camera images, right &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/apple-scanner.jpg'>&lt;img src='/articles/applebot/apple-scanner_t.jpg' alt='reality' title='reality' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/apple-scanner.jpg'>reality&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141204">&lt;/a>&lt;/div>

&lt;h2>4th December 2014 &amp;mdash; Back to 2D&lt;/h2>

&lt;div class='p'>Today I will start with a reference to an article: Juan Feng, Gang Liu,
Shengwei Wang, Lihua Zeng, Wen Ren, &lt;i>A Novel 3D Laser Vision System for
Robotic Apple Harvesting&lt;/i>, 2012 ASABE Annual International Meeting
(&lt;a href='https://elibrary.asabe.org/abstract.asp?aid=42037&amp;amp;t=2&amp;amp;redir=&amp;amp;redirType=' class='external'>direct
link&lt;/a>). They did something very similar to what I wanted to try &amp;mdash; a 3D laser
scan of the tree and then a search for the apples.&lt;/div>

&lt;div class='p'>First of all, here is the
&lt;a href='https://github.com/robotika/applebot/blob/master/log2pgm.py' class='external'>log2pgm.py&lt;/a>
utility, so you can see the scanned &lt;b>distance data&lt;/b> as a &lt;b>2D image&lt;/b>. If you run
it on the data I posted yesterday you will get:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 386px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/distance-image.png' alt='distance image with two apples' title='distance image with two apples' class='border'  width='380' height='274'/>&lt;/span>&lt;br/>
&lt;span>distance image with two apples&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>So it is maybe better to view the 3D data as a 2D image &amp;mdash; it is in reality a
projection of the 3D world onto a cylinder, so we have all the necessary info there.
Note that I used &lt;b>fixed scaling&lt;/b> of distances to grayscale. This way I will
not lose distance information (when compared to the referenced article). I
expect that the scanned tree is not further away than 1 meter (otherwise it would
require a slightly different scale).&lt;/div>

&lt;div class='p'>Do you see the two apples on the citrus tree? I do &lt;span class='smile'>&lt;/span>. The darker one is
closer, so it is larger. The image is flipped in the &lt;i>Y&lt;/i>-coordinate, as we scanned
this particular one from right to left, but that is just a detail for the
moment.&lt;/div>

&lt;div class='p'>So what next? Basically the same as we planned to do in 3D: we expect that the
apple has some size limits. These correspond to a bounding box in 3D. In the
grayscale image this means thresholding with a narrow window and then searching
for objects of a given size.&lt;/div>

&lt;div class='p'>There is another rule we can also add: the apple should not have holes, i.e.
occluded parts have to be closer/darker, otherwise the detected object will be
rejected.&lt;/div>
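
&lt;div class='p'>A sketch of this two-step idea in NumPy/OpenCV (the window limits and the bounding
box sizes below are only illustrative, not the values used later in the project):&lt;/div>

&lt;pre>import cv2
import numpy as np

img = cv2.imread('scan.pgm', 0)                         # distance image from log2pgm.py

lower, upper = 90, 100                                  # narrow distance window (gray values)
mask = ((img >= lower) &amp; (img &lt;= upper)).astype(np.uint8) * 255

# search for blobs of roughly apple size (bounding box limits in pixels)
res = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
contours = res[-2]           # works for both the 2- and 3-value return variants
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    if 10 &lt;= w &lt;= 40 and 10 &lt;= h &lt;= 40:
        print('candidate', x, y, w, h)&lt;/pre>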

&lt;hr/>

&lt;div class='p'>&lt;a id="141205">&lt;/a>&lt;/div>

&lt;h2>5th December 2014 &amp;mdash; cv2.MSER&lt;/h2>

&lt;div class='p'>I wanted to get &lt;i>Version0&lt;/i> running this morning, but I was a bit naive. At
least I prepared the necessary tools for 3D image analysis.&lt;/div>

&lt;div class='p'>The first idea was to use a threshold with lower and upper limits. You can play
with the threshold yourself: all you need is the output from
&lt;a href='https://github.com/robotika/applebot/blob/master/log2pgm.py' class='external'>log2pgm.py&lt;/a> and
&lt;a href='https://github.com/m3d/cv2-bits/blob/master/threshold.py' class='external'>cv2-bits/threshold.py&lt;/a>.
You can track both apples, with the minimum distance in the center and the complete
shape, in 10 steps corresponding to 5cm.&lt;/div>

&lt;div class='p'>Then I was thinking that &lt;i>cv2.MSER&lt;/i> might extract what we need even more easily
(see the code
&lt;a href='https://github.com/robotika/applebot/commit/1305a25706d46daec70b9bfaf93d8f03808a4de2' class='external'>diff&lt;/a>).
MSER worked fine in &lt;a href='/competitions/robotchallenge/2014/en'>AirRace rectangle
detection&lt;/a>, so maybe apple objects are also well separable? Some are and some
are not, see the temporary output:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 386px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/mser.png' alt='output of MSER with restricted BBox size' title='output of MSER with restricted BBox size' class='border'  width='380' height='274'/>&lt;/span>&lt;br/>
&lt;span>output of MSER with restricted BBox size&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>You can see that the darker left apple was not classified as a candidate. It is
clearly the nearby branches spreading the blob. On the other hand, even the
lighter (further away) apple was not obvious. I did not measure them, my fault,
so I expected a size like 10cm and this seems to be 6cm?? Here is the console
input/output if you would like to repeat my steps:&lt;/div>

&lt;pre>m:\git\applebot>finder.py 0.06 logs\scan_141202_183934.txt
(380, 274) uint8
0.0696 (15, 138) (44, 149)
0.0528 (0, 132) (22, 141)
0.0504 (125, 138) (146, 148)
0.0648 (124, 137) (151, 153)
0.0672 (123, 137) (151, 156)
0.0648 (208, 116) (235, 139)
0.06 (171, 136) (196, 146)
0.0576 (75, 149) (99, 159)
None&lt;/pre>
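
&lt;div class='p'>For completeness, a similar MSER detection can be sketched in a few lines (written for
the newer OpenCV API with cv2.MSER_create(); the delta parameter and the size limits are
only illustrative):&lt;/div>

&lt;pre>import cv2

img = cv2.imread('scan.pgm', 0)             # distance image as grayscale
mser = cv2.MSER_create(8)                   # threshold step (delta) = 8
regions, boxes = mser.detectRegions(img)
for x, y, w, h in boxes:
    if 10 &lt;= w &lt;= 40 and 10 &lt;= h &lt;= 40:     # restrict the bounding box size
        print(x, y, w, h)&lt;/pre>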

&lt;div class='p'>I should also correct my statement about the count of „apple points” as a
function of distance to the scanning device. In the &lt;i>X&lt;/i> coordinate the size (bounding
box in pixels) should be the same. It is necessary to take into account that,
due to shaking, the readings corresponding to a nearby object will be much more
reliable, but still, the size will be more or less the same.&lt;/div>

&lt;div class='p'>The difference in the &lt;i>Y&lt;/i> coordinate is, on the other hand, significant. The
resolution is given by the scanner resolution (1 degree), so an apple at 50cm
distance will have double the number of points compared to an apple at 1
meter.&lt;/div>

&lt;div class='p'>What next? It is necessary to filter out obviously wrong detection rectangles:&lt;/div>

&lt;ul>
&lt;li>find median distance value&lt;/li>

&lt;li>verify the aspect ratio of the detected rectangle (a closer apple should be thin
while a further one thick)&lt;/li>

&lt;li>verify compactness of the surface (no holes)&lt;/li>
&lt;/ul>

&lt;div class='p'>I would guess that these simple steps will already give &lt;i>the apple&lt;/i>, but let's
wait for real verification.&lt;/div>

&lt;div class='p'>And what after that? Well, now we have the whole 3D scan in a powerful
&lt;a href='http://wiki.scipy.org/Tentative_NumPy_Tutorial' class='external'>NumPy array&lt;/a>, so once we have a
potential object we can go back to the high resolution (laser readings are in
millimeters) and play, for example, with surface normals. I would be quite
curious whether you can really reliably extract them and then try 3D object matching
like in the &lt;a href='http://ar.in.tum.de/pub/drost2010CVPR/drost2010CVPR.pdf' class='external'>Model
Globally, Match Locally: Efficient and Robust 3D Object Recognition&lt;/a> paper.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141206">&lt;/a>&lt;/div>

&lt;h2>6th December 2014 &amp;mdash; Three apples?!&lt;/h2>

&lt;div class='p'>I must admit that the experiment last Tuesday was a bit spontaneous and now I
have the impression that the very last 3D scan (the one whose
&lt;a href='/articles/applebot/scan_141202_183934.zip'>data I uploaded&lt;/a>) actually contains 3 apples! Some
joker placed an extra apple on the box (next time we will automatically
record it with the Eduro IP camera to have reference pictures with timestamps):&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 386px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/mser-apples.png' alt='candidates detected by MSER' title='candidates detected by MSER' class='border'  width='380' height='274'/>&lt;/span>&lt;br/>
&lt;span>candidates detected by MSER&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>In the meantime I received the dimensions of the scanned apples:&lt;/div>

&lt;ul>
&lt;li>the smaller one is 7cm in diameter and 5.5cm high&lt;/li>

&lt;li>the yellow apple is 7.5cm in diameter and 7cm high&lt;/li>
&lt;/ul>

&lt;div class='p'>I should also mention a
&lt;a href='https://github.com/robotika/applebot/commit/f65a08e14e22f8651119b68365682a4aaa302807' class='external'>bugfix&lt;/a>
of the numpy array conversion. I am still learning it and I must admit it is a very
powerful library. The bug was in my conversion from int32 to uint8 and that
was the source of the horizontal lines. The corrected code looks like this:&lt;/div>

&lt;pre>tmp = np.array( scans ) / 5
mask = tmp &amp;gt; 255
tmp[mask] = 255&lt;/pre>

&lt;div class='p'>So you can define a Boolean mask and then modify all elements where the mask is
True with a single operation (see the hint source at
&lt;a href='http://stackoverflow.com/questions/1623849/fastest-way-to-zero-out-low-values-in-array' class='external'>stackoverflow.com&lt;/a>).
&lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;a href='https://github.com/robotika/applebot/commit/afa527b8ec11e44a82499e66be069e7549acf91f' class='external'>Here&lt;/a>
is the code used for the above picture. The parameters for MSER are slightly modified
(threshold step 8 instead of 1) and there is a new function &lt;i>isItApple()&lt;/i> for
verification of the patch (in this context a submatrix of the original 3D
scan/array). Also the inner (half the size in &lt;i>X&lt;/i> and &lt;i>Y&lt;/i>, so 1/4th of the area)
elements are dumped:&lt;/div>

&lt;pre>m:\git\applebot>finder.py 0.065 logs\scan_141202_183934.txt
(380, 274) uint8
0.0672 (123, 137) (151, 156)
205
[ [102 102 103 255 101 101 102 255 255  50  79 107 103 101]
 [126 126 125 126 125 125 126 125 125 101 117 125 125 111]
 [123 125 125 125 125 125 117 111 123 124 122 125 124 125]
 [ 83 115 124 124 121 109  99  98 110 123 124 124 124 124]
 [ 66  73 122 124 104  98  99  97 111 124 123 123 124 124]
 [ 64  65 117 115  97  98  98  95 119 122 123 123 124 124]
 [ 65  64  96  97  96 100  94  97 121 121 122 121 122 123]
 [ 64  65  87  91  96  98  92 112 121 121 122 121 122 122]
 [ 65  69  91  91  94  95  96 111 115 114 114 115 114 115]
 [ 69  79  88  90  89  89  92  99 100  99 100  99 100 101] ]
0.0672 (88, 165) (116, 179)
94
[ [111 108  58  24  21  21  21  21  22  20  21  22  22  22]
 [ 17  19  20  21  21  20  21  21  21  21  21  22  22  22]
 [ 19  20  20  21  21  22  22  21  22  21  22  22  23  23]
 [ 20  20  20  21  20  22  21  21  21  22  21  22  23  23]
 [ 20  20  20  21  21  22  22  22  21  22  22  22  22  22]
 [ 21  21  21  20  21  21  21  22  21  22  22  22  23  23]
 [ 20  20  21  20  21  21  21  22  21  22  21  22  22  22] ]
0.0624 (170, 133) (196, 146)
9
[ [62 61 58 56 57 55 55 56 56 55 54 56 56]
 [56 56 56 56 56 56 55 56 55 55 54 54 55]
 [55 56 55 55 55 55 56 55 55 54 54 55 56]
 [55 54 56 53 55 53 55 53 55 55 55 54 55]
 [54 56 54 53 54 54 53 55 54 54 54 53 55]
 [54 54 55 54 53 54 53 53 54 54 56 55 54] ]
0.0576 (75, 149) (99, 159)
67
[ [ 41  40  40  39  41  39  40  38  39  39  49  59]
 [ 40  40  40  40  39  39  40  40  40  39  41  39]
 [ 40  40  40  40  39  39  39  38  39  39  39  39]
 [ 41  40  40  39  39  39  39  39  39  39  39  39]
 [105  88  71  58  53  45  44  42  42  40  49  53] ]
[]&lt;/pre>

&lt;div class='p'>Is it clear which one is not an apple? &lt;span class='wink'>&lt;/span> It is not simple, because I believe
that one apple, touching the pot, was quite occluded by leaves. Also, the smooth
surface in the very left rectangle could be a chair in the background.&lt;/div>

&lt;div class='p'>The current plan is to switch back to higher resolution (millimeters) and try
to fit a sphere there.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141207">&lt;/a>&lt;/div>

&lt;h2>7th December 2014 &amp;mdash; Probabilistic Sphere&lt;/h2>

&lt;div class='p'>I did not expect that for &lt;i>version 0&lt;/i> (the simplest thing which can possibly
work) I would need strong weapons like &lt;i>probabilistic matching&lt;/i>. Because
&lt;a href='/guide/umor05/rastrove-mapy.pdf'>my old lecture&lt;/a> is in Czech, I will
briefly review it in English for a &lt;b>sphere&lt;/b>:&lt;/div>

&lt;div class='p'>Suppose you have a set of 3D points where at least 50% belong to the apple
surface. How do you recognize which points belong to the apple and which do not?
If you randomly pick one, there is a 50% chance that you succeed. If you pick two
points the probabilities multiply, so 25%. It is necessary to have four points to
define a sphere in space (the points are on the sphere surface). So the
probability is 1/16 = 6.25%.&lt;/div>

&lt;div class='p'>Now comes the trick. We are living in the world of powerful computing machines
and it is no problem to repeat our try 100 times or more. What is the chance
that after 100 repetitions we find 4 surface points at least once? It is
1-(1-0.0625)^100, which is 99.8%, and that could be enough for the beginning &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>So is it really that easy? Well, sure enough there are always some drawbacks.
The first one is recognizing that your sphere fits the 50% of apple surface.
What you can do is compute the distance of each 3D point to the sphere surface, and
then pick the sphere with the highest number of points within some limit of that
surface.&lt;/div>
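
&lt;div class='p'>In code the whole idea fits into a few lines of NumPy. Here is a sketch of the principle
(not the actual applebot code; points are expected as an Nx3 array in meters, and the radius
limits and the fixed seed anticipate the fixes described below):&lt;/div>

&lt;pre>import numpy as np

def sphere_from_4_points(pts):
    # pts: 4x3 array; solve x^2+y^2+z^2 = 2a*x + 2b*y + 2c*z + k
    A = np.hstack([pts, np.ones((4, 1))])
    b = (pts ** 2).sum(axis=1)
    u = np.linalg.solve(A, b)              # fails for degenerate picks
    center = u[:3] / 2.0
    radius = np.sqrt(u[3] + center.dot(center))
    return center, radius

def fit_apple(points, tries=100, tolerance=0.02, min_r=0.03, max_r=0.15):
    # points: Nx3 numpy array; return the sphere with the most points near its surface
    rng = np.random.RandomState(0)         # fixed seed, as mentioned below
    best = (0, None, None)
    for _ in range(tries):
        sample = points[rng.choice(len(points), 4, replace=False)]
        try:
            center, radius = sphere_from_4_points(sample)
        except np.linalg.LinAlgError:
            continue
        if not (min_r &lt;= radius &lt;= max_r):
            continue
        dist = np.linalg.norm(points - center, axis=1)
        inliers = np.sum(np.abs(dist - radius) &lt; tolerance)
        if inliers > best[0]:
            best = (inliers, center, radius)
    return best&lt;/pre>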

&lt;div class='p'>Do you want to know my first result? It was:&lt;/div>

&lt;pre>center = (0.018495501893939396,-2108163884176161.0, -146098213063779.62) 
radius = 2.11322020869e+15&lt;/pre>

&lt;div class='p'>&amp;hellip; so radius with 15 zeros so something like a quoter of
&lt;a href='http://en.wikipedia.org/wiki/Light-year' class='external'>light year&lt;/a>!? The reason is that we
are computing with limited precision and due to rounding 3D point within few
millimeters will fall into the same &lt;i>double&lt;/i> precision number. So even
there were 99.7% of 3D point on the surface within the limit it was surely not
what we were looking for.&lt;/div>

&lt;div class='p'>I added new parameters &lt;i>minRadius&lt;/i> and &lt;i>maxRadius&lt;/i>, set to 3cm and 15cm
respectively, to fix that. Here is the
&lt;a href='https://github.com/robotika/applebot/commit/61efe400d14f1521abd677048949e7e63291646e' class='external'>code
diff&lt;/a>.&lt;/div>

&lt;div class='p'>There is another drawback with probabilistic algorithms &amp;mdash; they sometimes fail,
and if you keep them purely random you may get different results every time you
call them. For that reason every apple detector has its own random generator
with the random seed set to zero:&lt;/div>

&lt;pre>self.random = random.Random(0).choice&lt;/pre>

&lt;div class='p'>Now how can you get a patch with 50% „apple surface points”? You can use the
already mentioned MSER, or what I learned at the „Computer Vision after 20 years”
lecture &amp;mdash; these days, just brute force. Write some quick algorithms/filters
which will quickly reject non-apple image windows, and then you can try even all
possible windows. Even an average PC typically has 1024 graphics processors which
can do the filtering job. The windows which remain you can then thoroughly
analyse with methods like the one mentioned today. Moreover, if you are in
production you can collect tons of reference data and your program can learn
from them over time &amp;hellip; give up, humans &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>Note that an apple is not a sphere. It won't fit perfectly. If you want to, you
can find some parametrized 3D model, so instead of 4 points you will probably
need a few more, to define it better. But on the other hand we should not
focus on the laser only. If you add a camera we have a pre-selected set of 3D points
and we can classify their color too. So red points are OK, and brown/green are
not (if you have a leaf, unluckily twisted into a sphere shape, you will not
distinguish it from an apple via laser).&lt;/div>

&lt;div class='p'>A „final” note about my 3 apples. I added an extra function &lt;i>saveAsCloud&lt;/i> so you
can analyse just a patch in the &lt;i>CloudCompare&lt;/i> program (see
&lt;a href='https://github.com/robotika/applebot/commit/003b0f5e0b31b5519409f0110422afde3d4defb3' class='external'>diff&lt;/a>).
The nearest "apple" is probably just a leaf touching the scanner &amp;mdash; in the last
experiment we put it very close. So with the version of code available at
&lt;a href='https://github.com/robotika/applebot' class='external'>https://github.com/robotika/applebot&lt;/a> you will see this result:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 386px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/fit-sphere.png' alt='fit sphere with more than 80% of points (2cm tolerance)' title='fit sphere with more than 80% of points (2cm tolerance)' class='border'  width='380' height='274'/>&lt;/span>&lt;br/>
&lt;span>fit sphere with more than 80% of points (2cm tolerance)&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141212">&lt;/a>&lt;/div>

&lt;h2>12th December 2014 &amp;mdash; URCaps&lt;/h2>

&lt;div class='p'>A few days ago I received an advertisement from a Czech PR company working for
&lt;a href='http://www.universal-robots.com/' class='external'>Universal Robots&lt;/a> &amp;mdash; a successful Danish
robotics company. I already knew about their product
&lt;a href='http://www.universal-robots.com/GB/Products.aspx' class='external'>UR5&lt;/a> (a robotic arm capable
of carrying 5kg). You also need an end effector (gripper), and that is something
that was missing in the UR portfolio. They decided to take an interesting step: they
created the &lt;a href='http://www.urcaps.com/URCaps/' class='external'>URCaps&lt;/a> website, where you can find
various &lt;i>U R Capable&lt;/i> add-ons.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/urcaps.png'>&lt;img src='/articles/applebot/urcaps_t.png' alt='URCaps' title='URCaps' class='border'  width='220' height='232'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/urcaps.png'>URCaps&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Why am I mentioning this on the AppleBot blog? Well,
even from the very beginning I was thinking of the UR5 as a potential platform for
picking the apples, but I have not yet got confirmation that the arm will
survive transport between picking stops.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141215">&lt;/a>&lt;/div>

&lt;h2>15th December 2014 &amp;mdash; PoE IP Camera&lt;/h2>

&lt;div class='p'>Last Tuesday we received a new camera for robotic experiments. It is from the
same manufacturer as the camera on &lt;a href='/robots/eduro/en'>robot Eduro&lt;/a>. It is
&lt;i>compact&lt;/i> (half length) and it has two sensors &amp;mdash; see the
&lt;a href='http://www.arecontvision.com/product/MegaVideo+Compact+Series/AV3236DN#KeyFeatures' class='external'>AV3236
specification&lt;/a>. The camera is sold for Day/Night surveillance and it has a
removable IR cut filter.&lt;/div>

&lt;div class='p'>When we unpacked it we wondered: where is the power connector? There is none!
There are 4 pins for input/output, but none for separate power. The reason is
hidden in the label &lt;b>PoE&lt;/b>
(&lt;a href='http://en.wikipedia.org/wiki/Power_over_Ethernet' class='external'>Power over Ethernet&lt;/a>),
i.e. the camera is powered via otherwise unused lines of the Ethernet cable.&lt;/div>

&lt;div class='p'>You need something like a &lt;i>power injector&lt;/i>. The
&lt;a href='http://www.alza.cz/airlive-passive-poe-kit-d332362.htm' class='external'>AirLive passive PoE
Kit&lt;/a> looked good, but when I attached the power (18V, 24W) it did not work
:-(. I supposed that the camera accepts a wide range, 12V-30V(?), like the
previous one, but it does not: &lt;i>PoE and Auxiliary Power: 12-48V DC/24V AC
(Auxillary Power Not Available for Dual Sensor Model)&lt;/i>. You need the &lt;i>PoE
standard voltage&lt;/i>, which means 48V. The camera still works on 32V (that is
the very edge), but 48V is the recommended voltage.&lt;/div>

&lt;div class='p'>I have seen the first H264 video and some pictures from this sensor, but I do not
have the power adapter now to run it and show it to you too &amp;mdash; so at least some
photos:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/camera-box.jpg'>&lt;img src='/articles/applebot/camera-box_t.jpg' alt='Camera in box' title='Camera in box' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/camera-box.jpg'>Camera in box&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/camera-front.jpg'>&lt;img src='/articles/applebot/camera-front_t.jpg' alt='Dual Sensor' title='Dual Sensor' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/camera-front.jpg'>Dual Sensor&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/camera-back.jpg'>&lt;img src='/articles/applebot/camera-back_t.jpg' alt='Back view' title='Back view' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/camera-back.jpg'>Back view&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141216">&lt;/a>&lt;/div>

&lt;h2>16th December 2014 &amp;mdash; Dual Sensor&lt;/h2>

&lt;div class='p'>Now only quick info about the camera status. I bought a 48V power adapter and when
I connected power to the injector and the Ethernet cable to the camera it did not work.
Again. Or to be precise, there is no way of telling whether it is working or not.
The LED for Ethernet will start to blink &lt;b>only&lt;/b> when the Ethernet cable is also
connected on the other side (to a router or laptop). So make sure you have
connected everything, and then it works. Fixed.&lt;/div>

&lt;div class='p'>The second issue was the IP address of the camera. If you have a Windows OS then
probably the simplest way is to install the &lt;i>AV200 Software&lt;/i> and click and click.
The good news is that the old command/URL to get a JPEG image is still working:&lt;/div>

&lt;pre>http://192.168.1.6/img.jpg&lt;/pre>
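
&lt;div class='p'>So grabbing snapshots from a script stays trivial; for example (Python 2 style urllib,
matching the rest of the project):&lt;/div>

&lt;pre>import time
import urllib

# grab a timestamped snapshot from the camera every 10 seconds
for i in range(10):
    name = 'pic_%s.jpg' % time.strftime('%y%m%d_%H%M%S')
    urllib.urlretrieve('http://192.168.1.6/img.jpg', name)
    time.sleep(10)&lt;/pre>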

&lt;div class='p'>Which sensor is used depends on your settings. The default is &lt;i>Automatic&lt;/i> (for
bright light it uses the &lt;i>Day sensor&lt;/i> and for darkness the &lt;i>Night sensor&lt;/i>),
then &lt;i>Day&lt;/i>, &lt;i>Night&lt;/i> and &lt;i>Dual Channel Streaming&lt;/i>. Here are the first pictures
for the day and night manual settings:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/day-img.jpg'>&lt;img src='/articles/applebot/day-img_t.jpg' alt='Day sensor' title='Day sensor' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/day-img.jpg'>Day sensor&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/night-img.jpg'>&lt;img src='/articles/applebot/night-img_t.jpg' alt='Night sensor' title='Night sensor' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/night-img.jpg'>Night sensor&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>So it looks like the left &lt;i>eye&lt;/i> is for night mode. The next step would be
to test streaming of both sensors &amp;mdash; I am curious whether there is a chance
to extract individual images from both sensors. We will see (later today at
school).&lt;/div>

&lt;div class='p'>p.s. note that by default the day sensor has a resolution of 1024x768 and the night sensor
640x480&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141217">&lt;/a>&lt;/div>

&lt;h2>17th December 2014 &amp;mdash; 2nd 3D-Scanning Experiment&lt;/h2>

&lt;div class='p'>Yesterday we repeated the &lt;a href='/articles/applebot/en#141203'>3D scanning experiment&lt;/a>.
This time in combination with the new camera, so we also have reference pictures
with timestamps. Again, there were several new things we learned.&lt;/div>

&lt;div class='p'>First of all, there is the new Dual Sensor Camera AV3236DN. I did not succeed in
automatically recording H.264 video streams from both cameras, so the fallback was a
single image every 10 scans (see
&lt;a href='https://github.com/robotika/applebot/commit/c0af5002bb2a2afc918442afa4286d4c8f2a0403' class='external'>diff&lt;/a>).&lt;/div>

&lt;div class='p'>It was quite hard to find the &lt;b>Camera API&lt;/b>. There are several
manuals, papers and presentations pointing to a non-existing page :-(. In the end I
found it
&lt;a href='http://www.arecontvision.com/forcedown.php?file_name=UWNNTkVCVmcxanQwYk4rZHlIcW94RTV5RWYxV3BsZENkRm1DODV2Y2dQK3I3R0J5WjVsZHEvSWQrNUN4ZUFCMGdkdnF5Q083MXFwSGdCVDFjOE9lRkE9PQ==&amp;amp;file_path=Wkh5Q2FBc2VCSFhEeXFhQWNUTy9aL0ZzL0plaE9sWGd4VUpncUxOMzNPMD0=' class='external'>here&lt;/a>,
but I doubt it is a persistent link. Look for
&lt;i>ArecontVisionAPIandCameraWebPageManual6.23.2014.pdf&lt;/i>. Here are some
interesting hints:&lt;/div>

&lt;pre>http://192.168.1.6/livevideo
http://192.168.1.6/h264stream
http://192.168.1.6/h264f&lt;/pre>

&lt;div class='p'>Once I typed &lt;i>h264stream&lt;/i> in my browser it generated like 100 windows to save
data &amp;mdash; not good. The &lt;i>h264f&lt;/i> is a shortcut for an H264 frame, but I was
receiving only I-frames and no P-frames, even when I was changing the &lt;i>iframe=0&lt;/i>
parameter (off topic: you may be interested in &lt;a href='/articles/h264-drone-vision/en'>&lt;span class='cs'>my
experiments with H264&lt;/span>&lt;/a>, but it is in Czech only at the moment). So it needs
calmer time for investigation, and that was not yesterday.&lt;/div>

&lt;div class='p'>Another issue we had was related to the &lt;i>remission data&lt;/i> from the laser scanner
TiM55. I remember Jirka was trying to get it to work under Linux (see
&lt;a href='https://sites.google.com/site/cogitoteam/robotour-2014/diary/tim551laserscaner-configurationprotocol20140215' class='external'>his
website&lt;/a>). The easiest way was to use the supported software to enable remission
(checkbox RSSI), and then switch the Ethernet cable back to my notebook.&lt;/div>

&lt;div class='p'>There were 3 extra elements in the output array (instead of the expected 580, like on the
older USB laser scanner, there were 583), and now I see on Jirka's page the note &lt;b>
"B not defined"&lt;/b>. Yes:&lt;/div>

&lt;pre>A9', 'B2', 'B2', 'AC', 'AC', 'AC', 'B2', 'AF', 'AC
', 'A9', 'AC', '0', '1', 'B', 'not', 'defined', '0', '0', '0']&lt;/pre>

&lt;div class='p'>&amp;hellip; that's the same &lt;i>problem&lt;/i>.&lt;/div>

&lt;div class='p'>OK, I have to get ready for an early morning meeting, so at least some results.
Robotics should also be fun, so thanks Standa for volunteering &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/standa.png'>&lt;img src='/articles/applebot/standa_t.png' alt='Distance data' title='Distance data' class='border'  width='220' height='157'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/standa.png'>Distance data&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/standa-rssi.png'>&lt;img src='/articles/applebot/standa-rssi_t.png' alt='Remission data' title='Remission data' class='border'  width='220' height='159'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/standa-rssi.png'>Remission data&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/pic_141216_191126_021.jpg'>&lt;img src='/articles/applebot/pic_141216_191126_021_t.jpg' alt='Day camera' title='Day camera' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/pic_141216_191126_021.jpg'>Day camera&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Note, there is a
&lt;a href='https://github.com/robotika/applebot/commit/15e88b35779088ac6ede3875ce272fca74153c1c' class='external'>minor
change&lt;/a> to &lt;i>log2pgm.py&lt;/i> in order to handle both range and remission data.&lt;/div>

&lt;div class='p'>p.s. based on the timestamps of the images taken every 10 laser scans, it looks like
we are really running at 10Hz, but the sensor should support 15Hz, i.e. better
resolution in the &lt;i>X&lt;/i> coordinate &amp;hellip; next time.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141218">&lt;/a>&lt;/div>

&lt;h2>18th December 2014 &amp;mdash; applethreshold.py&lt;/h2>

&lt;div class='p'>There is a new interactive tool on github now:
&lt;a href='https://github.com/robotika/applebot/blob/master/applethreshold.py' class='external'>applethreshold.py&lt;/a>.
It is a modification of
&lt;a href='https://github.com/m3d/cv2-bits/blob/master/threshold.py' class='external'>threshold.py&lt;/a> where
I am interested only in a &lt;b>narrow space slice&lt;/b>. The apple has some expected
size, say 10cm, and in the depth image we can see only its front face. This defines
the slice width as 5cm. All other distances are drawn in white.&lt;/div>

&lt;div class='p'>The laser scanner measures in millimeters and we use division by 5 for the gray
conversion, so the 5cm corresponds to the integer appleSize=10. Here are two space
slices with apples at 91/2 cm and 98/2 cm distance:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/apple-threshold-91.png'>&lt;img src='/articles/applebot/apple-threshold-91_t.png' alt='Slice for 43-48cm' title='Slice for 43-48cm' class='border'  width='220' height='198'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/apple-threshold-91.png'>Slice for 43-48cm&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/apple-threshold-98.png'>&lt;img src='/articles/applebot/apple-threshold-98_t.png' alt='Slice for 46.5-51.5cm' title='Slice for 46.5-51.5cm' class='border'  width='220' height='198'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/apple-threshold-98.png'>Slice for 46.5-51.5cm&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>What is it good for? Well, I was not very happy with the
&lt;a href='https://github.com/robotika/applebot/commit/9d24c6e219e6a5c49f8922c5c218cc0750a67c4e' class='external'>brute
force experiment&lt;/a>, so I am looking for an alternative solution. This is just a
visualisation tool &amp;mdash; the algorithm should look for uniform blobs of a given
size.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141219">&lt;/a>&lt;/div>

&lt;h2>19th December 2014 &amp;mdash; Crazy Ideas&lt;/h2>

&lt;div class='p'>I must be influenced by the book
&lt;a href='http://en.wikipedia.org/wiki/The_Naked_Sun' class='external'>The Naked Sun&lt;/a> I am just reading.
There are plenty of Solarian robots working in apple orchards &lt;span class='wink'>&lt;/span>&amp;hellip; I was
looking for a Christmas present two days ago &amp;mdash; a telescopic tool for picking
apples, and I ended up on this Czech website about
&lt;a href='http://www.cestananovyzeland.cz/clanky/prace/43-prace-v-jablecnem-sadu-a-sbirani-jablek-na-novem-zelandu/' class='external'>work
and travel in New Zealand&lt;/a>. And then I realized that I do not have to wait
till September for a real apple picking test. There is a new world where the
picking season starts at the end of January and continues till April &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>Another piece of the jigsaw puzzle is &lt;a href='http://www.exactec.com/' class='external'>EXACTEC&lt;/a>, the Czech
distributor of &lt;a href='http://www.universal-robots.com/' class='external'>Universal Robots&lt;/a>. There is
a chance that we will have their UR5 robot for testing for a week. The goal is
&lt;i>version0&lt;/i>, i.e. to autonomously find an apple and pick it from the tree
(because of the winter it would probably rather be a Christmas tree). And
yesterday I learned that on the 20th of January 2015 I should give a presentation
about my future research &amp;hellip; so why not join these two and present the
first results on the real machine?! Yes, mission impossible, but we are used to
it, right? Did we ever have more time for any other „competition”? There is one
month remaining (well, half of that is holidays), and it is a strong motivation
to win.&lt;/div>

&lt;div class='p'>So instead of the H264 codec I was looking rather at the UR5 specs and communication
protocol. Based on the
&lt;a href='http://www.tandfonline.com/doi/pdf/10.1080/00288230709510392' class='external'>English manual&lt;/a>
there should be Ethernet, which is now probably the simplest way of integration
(both the laser scanner and the IP camera are connected to the laptop over Ethernet).
Moreover, I already found some Python examples of how to control the robotic arm
(see the
&lt;a href='http://www.zacobria.com/universal-robots-zacobria-forum-hints-tips-how-to/script-via-socket-connection/' class='external'>Zacobria
Robot community forum&lt;/a>), and it looks like fun. Is it crazy enough?&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141221">&lt;/a>&lt;/div>

&lt;h2>21st December 2014 &amp;mdash; Memories and UR5 API&lt;/h2>

&lt;div class='p'>Yesterday I had a &lt;i>no computer day&lt;/i>, so besides packing a few Christmas
presents I re-read my very old book about robots (I bought it when I was in
elementary school). The title is
&lt;a href='http://muj-antikvariat.cz/kniha/roboty-slouzi-cloveku-sury-jiri-1982' class='external'>Roboty
slouží člověku&lt;/a> (Robots Serve Humans), Surý, Rempsa 1982. I remember that I
was a bit disappointed at that time (30+ years ago) and probably did not
understand it very much.&lt;/div>

&lt;div class='p'>If nothing else, it is clear now why the &lt;b>UR5&lt;/b> has
&lt;a href='http://en.wikipedia.org/wiki/Six_degrees_of_freedom' class='external'>6-DOF&lt;/a> (six degrees of
freedom). If you want to be able to hold a tool in a given position (x,y,z) you
need 3-DOF. But if you also need the 3D orientation (roll, pitch, yaw) then you
&lt;i>obviously&lt;/i> need six degrees of freedom &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>My second memory, related to robotic manipulators, is connected with the
&lt;a href='http://en.wikipedia.org/wiki/Probabilistic_roadmap' class='external'>Probabilistic Path Planner
(PPP)&lt;/a>. In particular I remember talking to
&lt;a href='http://en.wikipedia.org/wiki/Mark_Overmars' class='external'>prof. Overmars&lt;/a> in 2000 (Eilat,
&lt;i>Workshop on Computational Geometry&lt;/i>), when he was explaining unnecessary
movement issues with probabilistic planners with a large number of DOFs
(snake-like robots with 20 or even 50 DOFs).&lt;/div>

&lt;div class='p'>The third memory is relatively fresh. Last year I was playing with the hexapod
&lt;a href='/robots/fireant/en'>&lt;span class='cs'>FireAnt&lt;/span>&lt;/a> having 25 servos, i.e. 25 DOFs. Each leg had 3
servos and it was necessary to compute the leg transition from one position to
another &amp;hellip; and the same holds now for the UR5 too. So the distinction between
the commands &lt;b>moveJ&lt;/b> and &lt;b>moveL&lt;/b> is clear now &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;h3>UR5 API&lt;/h3>

&lt;div class='p'>The position of a 6-DOF robot is defined by 6 numbers. In the case of the UR5 these
numbers correspond to the angles of each joint. UR uses the term &lt;i>waypoint&lt;/i>
for this 6-tuple.&lt;/div>

&lt;div class='p'>Next, if you have a motor, the simplest control curve is defined by an acceleration
and a maximal speed. The speed profile then looks like a trapezoid.&lt;/div>
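
&lt;div class='p'>As a small illustration (my own numbers, nothing UR specific): to rotate a joint by
90 degrees with acceleration 60 deg/s2 and maximum speed 30 deg/s, the joint accelerates
for 0.5s (7.5 deg), cruises the remaining 75 deg for 2.5s and decelerates for another 0.5s
(7.5 deg), so the whole move takes 3.5s. A tiny helper computing this:&lt;/div>

&lt;pre>import math

def trapezoid_time(distance, a, v_max):
    "Duration of a trapezoidal (or triangular) speed profile."
    t_acc = v_max / a
    d_acc = 0.5 * a * t_acc ** 2            # distance covered while accelerating
    if 2 * d_acc >= distance:               # v_max is never reached, triangular profile
        return 2 * math.sqrt(distance / a)
    return 2 * t_acc + (distance - 2 * d_acc) / v_max

print(trapezoid_time(90.0, 60.0, 30.0))     # 3.5 (seconds)&lt;/pre>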

&lt;div class='p'>OK, so we have &lt;i>waypoints&lt;/i> and standard control for motor speed. Now it
starts to be a little bit more tricky. What if you combine two or more
waypoints? Say when you want to move from one position to another? See page 117
of
&lt;a href='http://www.tandfonline.com/doi/pdf/10.1080/00288230709510392' class='external'>UR5_User_Manual_GB.pdf&lt;/a>:&lt;/div>

&lt;ul>
&lt;li>&lt;b>moveJ&lt;/b> 
&amp;mdash; will make movements that are calculated in the joint space of the robot arm.
Each joint is controlled to reach the desired end location at the same time.
This movement type results in a curved path for the tool. The shared parameters
that apply to this movement type are the maximum joint speed and joint
acceleration to use for the movement calculations, specified in deg/s and
deg/s2, respectively. If it is desired to have the robot arm move fast between
waypoints, disregarding the path of the tool between those waypoints, this
movement type is the favorable choice.&lt;/li>

&lt;li>&lt;b>moveL&lt;/b>
&amp;mdash; will make the tool move linearly between waypoints. This means that each
joint performs a more complicated motion to keep the tool on a straight line
path. The shared parameters that can be set for this movement type are the
desired tool speed and tool acceleration specified in mm/s and mm/s 2 ,
respectively, and also a feature. The selected feature will determine in which
feature space the tool positions of the waypoints are represented in. Of
specific interest concerning feature spaces are variable features and
variable waypoints.  Variable features can be used when the tool position of a
waypoint need to be determined by the actual value of the variable feature when
the robot program runs.&lt;/li>
&lt;/ul>

&lt;div class='p'>Note that there also exists &lt;b>moveP&lt;/b>, which &lt;i>will move the tool linearly
with constant speed with circular blends, and is intended for some process
operations, like gluing or dispensing.&lt;/i>&lt;/div>

&lt;div class='p'>I suppose that a combination of &lt;i>moveJ&lt;/i> and &lt;i>moveL&lt;/i> should be enough for
the control of &lt;b>applebot-ver0&lt;/b> &amp;hellip;&lt;/div>
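
&lt;div class='p'>A minimal sketch of how that could look from Python, based on the socket examples from
the Zacobria forum (the controller IP, the port 30002 and the joint/pose values are
assumptions for illustration, not something tested on a real UR5):&lt;/div>

&lt;pre>import socket
import time

HOST, PORT = '192.168.1.10', 30002    # robot controller IP (example) and its script interface port

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
# joint move to a start pose (radians), then a linear move of the tool to a Cartesian pose
s.send('movej([0.0, -1.57, 1.57, -1.57, -1.57, 0.0], a=1.2, v=0.25)\n')
time.sleep(5)                         # each line sent is executed as a separate program
s.send('movel(p[0.4, 0.0, 0.3, 0.0, 3.14, 0.0], a=0.5, v=0.1)\n')
time.sleep(5)
s.close()&lt;/pre>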

&lt;hr/>

&lt;div class='p'>&lt;a id="141222">&lt;/a>&lt;/div>

&lt;h2>22nd December 2014 &amp;mdash; Apple Generators&lt;/h2>

&lt;div class='p'>On Sunday evening I did one more experiment. An apple has a shape close to a sphere,
so the central part has to be closer than the points on the edge. That is what
you should see after this
&lt;a href='https://github.com/robotika/applebot/commit/83da2f5ae1f021edf297256eaf3e0366effa6207' class='external'>diff&lt;/a>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 403px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/two-levels.png' alt='two levels threshold' title='two levels threshold' class='border'  width='397' height='310'/>&lt;/span>&lt;br/>
&lt;span>two levels threshold&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Here you can see that the center part of the apple is gray while the area around
it is black. Note that this is also true for the pot below.&lt;/div>

&lt;div class='p'>Then I went back and tried to detect &lt;i>dense areas&lt;/i>, i.e. whether, for a window of
10x10 pixels, most pixels are occupied. Note that all you need to do is a
convolution with a matrix filled with ones (you have to change the image so that apple
pixels are 1 while the rest is 0). With
&lt;a href='https://github.com/robotika/applebot/blob/master/applethreshold.py' class='external'>applethreshold.py&lt;/a>
you can now see both windows (threshold and dense areas).&lt;/div>
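
&lt;div class='p'>The dense-area test itself is then just a few lines with OpenCV (a sketch; the 10x10
window, the threshold limits and the 80% occupancy are only illustrative):&lt;/div>

&lt;pre>import cv2
import numpy as np

img = cv2.imread('scan.pgm', 0)                           # distance image
binary = ((img >= 90) &amp; (img &lt;= 100)).astype(np.float32)  # apple pixels = 1, the rest = 0
kernel = np.ones((10, 10), np.float32)
density = cv2.filter2D(binary, -1, kernel)                # occupied pixels in each 10x10 window
ys, xs = np.where(density >= 0.8 * kernel.size)           # windows with at least 80% occupancy&lt;/pre>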

&lt;div class='p'>The last step was to turn the dense areas into a generator of interesting positions
and integrate it with the apple sphere matching &amp;mdash; see the
&lt;a href='https://github.com/robotika/applebot/commit/f4aea78d67eadcf9a006319c1ea34d2b74448ab0' class='external'>diff&lt;/a>.&lt;/div>

&lt;div class='p'>Does it sound/look good? Well, the result is a bit disappointing &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 386px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/dense-areas.png' alt='dense areas apple detections' title='dense areas apple detections' class='border'  width='380' height='271'/>&lt;/span>&lt;br/>
&lt;span>dense areas apple detections&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>You can see that the pot and the chair also have a 3D surface similar to a sphere. There
will probably be no such smooth large areas in the outdoor tests, but it is still
something which should be solved soon.&lt;/div>

&lt;div class='p'>This is crazy:&lt;/div>

&lt;pre>winSizeY 4 (271, 380, 3)
1.00: (0.034, 0.847, -0.035) 0.085&lt;/pre>

&lt;div class='p'>so this part of the chair fits 100%?! There is a narrow window because of the 0.899m
distance, but &amp;hellip; I can also add an upper limit &amp;mdash; not all points should
fit perfectly. A sphere projects onto a circle inside its bounding rectangle, so the
expected fit is given by the ratio of areas, i.e. for radius=1 the circle area is &lt;i>PI&lt;/i>
and the square is 2x2=4, so math.pi/4. = 0.7853981633974483.&lt;/div>

&lt;div class='p'>Here is result with upper limit:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 386px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/upper-limit.png' alt='upper limit for fitting sphere points' title='upper limit for fitting sphere points' class='border'  width='380' height='271'/>&lt;/span>&lt;br/>
&lt;span>upper limit for fitting sphere points&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>and here is the
&lt;a href='https://github.com/robotika/applebot/commit/d191bb82482149b29103e72c3fcd2f3fae1f1431' class='external'>code
change&lt;/a>. There is still a problem on the border (and there always will be), and the
tolerance to leaves blocking the view of the apple is also lower.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150104">&lt;/a>&lt;/div>

&lt;h2>4th January 2015 &amp;mdash; Recording H264 video&lt;/h2>

&lt;div class='p'>Well, I was &lt;i>a bit&lt;/i> sick the last two weeks &amp;hellip; I was obviously not following
the recommendation: &lt;b>An apple a day keeps the doctor away&lt;/b> &lt;span class='smile'>&lt;/span>. It is still not
very good, but I miss playing with Applebot &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>I will start with an experiment with the new camera today. You can now find
on github a simple script,
&lt;a href='https://github.com/robotika/applebot/blob/master/dualcam.py' class='external'>dualcam.py&lt;/a>,
which records an H264 video stream. I was actually very close to the
working solution. The missing parameter was &lt;b>ssn&lt;/b> &amp;mdash; the &lt;i>stream identifier&lt;/i>.
If you do not specify it you will &lt;i>always&lt;/i> get only I-frames.&lt;/div>

&lt;pre>http://192.168.1.6/h264f?res=half&amp;amp;qp=20&amp;amp;ssn=13
http://192.168.1.6/h264f?res=half&amp;amp;qp=20&amp;amp;ssn=13&amp;amp;iframe=0&lt;/pre>

&lt;div class='p'>[SOLVED]&lt;/div>

&lt;div class='p'>The second task is to enable the day and night sensors. It is necessary to enable the
given input:&lt;/div>

&lt;pre>http://192.168.1.6/get?daynight
http://192.168.1.6/set?daynight=dual&lt;/pre>

&lt;div class='p'>and then you can download separately color or monochrome images:&lt;/div>

&lt;pre>http://192.168.1.6/image?channel=mono
http://192.168.1.6/image?channel=color&lt;/pre>

&lt;div class='p'>&lt;a href='https://github.com/robotika/applebot/commit/2b055e64abea7509d744967f8281433d910a3fc0' class='external'>Here&lt;/a>
is the first attempt to set the mode and record H264 video &amp;hellip; but it does not
work so far. If I use &lt;i>day&lt;/i> or &lt;i>night&lt;/i> mode then the video stream is OK. But
if I select &lt;i>dual&lt;/i> then &lt;i>channel=color&lt;/i> and &lt;i>channel=mono&lt;/i> do not seem
to have any influence. Maybe mix it with &lt;i>ssn&lt;/i>?&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150109">&lt;/a>&lt;/div>

&lt;h2>9th January 2015 &amp;mdash; stereo_match.py&lt;/h2>

&lt;div class='p'>Almost two weeks ago I was playing with the following idea: &lt;i>If I have several
images collected from linear motion, I should be able to get some stereo vision
results?!&lt;/i> &amp;hellip; and sure enough there is an example directly in OpenCV2 (see the hint
from
&lt;a href='http://stackoverflow.com/questions/13456188/opencv2-4-python-stereo-matching-and-disparity-map' class='external'>stackoverflow&lt;/a>).
&lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>It is necessary to slightly modify the &lt;b>stereo_match.py&lt;/b> source code so that it
reads your images instead of the prepared demo (BTW the source was in my case
in &lt;i>c:\opencv\sources\samples\python2\stereo_match.py&lt;/i>), and as a result you
will get a scene of polygonal patches in
&lt;a href='http://paulbourke.net/dataformats/ply/' class='external'>PLY format&lt;/a>.&lt;/div>
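
&lt;div class='p'>The core of the sample is the semi-global block matcher; pointing it at two of your own
images looks roughly like this (a sketch using the newer cv2.StereoSGBM_create API; the file
names and parameters are placeholders):&lt;/div>

&lt;pre>import cv2
import numpy as np

imgL = cv2.imread('left.jpg', 0)       # two frames taken during the linear motion
imgR = cv2.imread('right.jpg', 0)

stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
disp = stereo.compute(imgL, imgR).astype(np.float32) / 16.0   # disparity in pixels
print(disp.min(), disp.max())&lt;/pre>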

&lt;div class='p'>Note that you can open it with the already mentioned
&lt;a href='http://www.cloudcompare.org/' class='external'>CloudCompare&lt;/a> and you will get something like
this:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 486px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/stereo-apple.jpg' alt='Stereo Apple' title='Stereo Apple' class='border'  width='480' height='360'/>&lt;/span>&lt;br/>
&lt;span>Stereo Apple&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&amp;hellip; unfortunately the static image is quite boring. If you view it in 3D you
can easily recognize the shape of the pot, two apples and also a small citrus in front
&lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150111">&lt;/a>&lt;/div>

&lt;h2>11th January 2015 &amp;mdash; matplotlib&lt;/h2>

&lt;div class='p'>The topic today was supposed to be &lt;b>debugging the laser apple&lt;/b>, as I was trying
to find out what is going on in the apple 3D recognition from the laser scans.
There were bits I expected, like &lt;i>blended borders&lt;/i>, where readings on the edge
of the apple are nonsense. They correspond to some averaging of the distance to the
apple and the distance to the background. Other details were not expected, for
example the variation of the laser readings.&lt;/div>

&lt;div class='p'>At first I wrote a simple
&lt;a href='https://github.com/robotika/applebot/blob/master/cutter.py' class='external'>cutting utility&lt;/a>,
which is based on the OpenCV2 sample &lt;i>mouse_and_match.py&lt;/i>. You can click on the
image, select a rectangle and then do something with the cut patch. The current version
opens an extra window with a 16x zoomed image and dumps the gray values for the
displayed image. The latest version also dumps the data of the original laser scan
and its corresponding remission. For more details see the
&lt;a href='https://github.com/robotika/applebot/commits/master/cutter.py' class='external'>cutter.py
history&lt;/a>.&lt;/div>
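
&lt;div class='p'>The mouse handling itself is only a few lines. A rough sketch of the idea (not the
real cutter.py, which does more, e.g. dumping the laser data; the input file name is
just an example):&lt;/div>

&lt;pre># sketch: drag a rectangle with the mouse, show a 16x zoom and dump gray values
import cv2

drag_start = None
selection = None

def on_mouse(event, x, y, flags, param):
    global drag_start, selection
    if event == cv2.EVENT_LBUTTONDOWN:
        drag_start = (x, y)
    elif event == cv2.EVENT_LBUTTONUP and drag_start:
        x0, y0 = drag_start
        selection = (min(x0, x), min(y0, y), max(x0, x), max(y0, y))
        drag_start = None

img = cv2.imread("scan.png", cv2.IMREAD_GRAYSCALE)
cv2.namedWindow("image")
cv2.setMouseCallback("image", on_mouse)
while True:
    cv2.imshow("image", img)
    if selection is not None:
        x0, y0, x1, y1 = selection
        patch = img[y0:y1, x0:x1]
        if patch.size > 0:
            print patch          # gray values of the cut patch
            zoom = cv2.resize(patch, None, fx=16, fy=16,
                              interpolation=cv2.INTER_NEAREST)
            cv2.imshow("zoom 16x", zoom)
        selection = None
    if cv2.waitKey(20) == 27:    # Esc quits
        break&lt;/pre>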

&lt;div class='p'>So now I can see the numbers. There are only a few of them &amp;mdash; even for an
apple 30cm from the scanner I got like 8 readings, which corresponds to
approximately 4cm in height &amp;hellip; strange, I expected more apple-related
readings.&lt;/div>

&lt;div class='p'>The second surprise was how bad the readings were. There were jumps like 2cm on
the surface of the apple! I tried to visualise it and that's the moment when
&lt;a href='http://matplotlib.org/' class='external'>matplotlib&lt;/a> comes into the game. After installation on
Win7 I also had to install &lt;a href='https://pypi.python.org/pypi/six' class='external'>six&lt;/a>,
&lt;a href='https://labix.org/python-dateutil' class='external'>dateutil&lt;/a> and
&lt;a href='https://pypi.python.org/pypi/pyparsing/2.0.3' class='external'>pyparsing&lt;/a>, but that was surely
worth it &lt;span class='smile'>&lt;/span>.&lt;/div>
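
&lt;div class='p'>The plotting itself is then trivial. A sketch (each scan here is just a list of
distances in millimeters, the offset trick is the same one used in the graphs
below):&lt;/div>

&lt;pre># sketch: plot several laser scans with an artificial vertical offset
import matplotlib.pyplot as plt

def plot_scans(scans, offset=10):
    "scans ... list of scans, each scan is a list of distances in mm"
    for i, scan in enumerate(scans):
        plt.plot([dist + i * offset for dist in scan])
    plt.ylabel("distance [mm]")
    plt.show()&lt;/pre>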

&lt;div class='p'>Here is the first plot for the laser scans of the apple:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/apple-detail-readings.png'>&lt;img src='/articles/applebot/apple-detail-readings_t.png' alt='detail laser readings for an apple' title='detail laser readings for an apple' class='border'  width='320' height='161'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/apple-detail-readings.png'>detail laser readings for an apple&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The Y-axis is distance in millimeters and you can see, for example on the red
plot, 1cm oscillations. This is maybe better visible on the graph with an
artificial offset of 1cm for each plot:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/offset10.png'>&lt;img src='/articles/applebot/offset10_t.png' alt='scans with 10mm artificial offset' title='scans with 10mm artificial offset' class='border'  width='320' height='200'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/offset10.png'>scans with 10mm artificial offset&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>And one more graph, where both axes correspond to distances in
millimeters:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/semi-circles.png'>&lt;img src='/articles/applebot/semi-circles_t.png' alt='these plots should be semi circles' title='these plots should be semi circles' class='border'  width='320' height='387'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/semi-circles.png'>these plots should be semi circles&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The artificial offset was set to 5mm here in order to fit all plots in the
image. Do you see the plots as approximations of various semi-circles? Well,
maybe &amp;hellip; &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150116">&lt;/a>&lt;/div>

&lt;h2>16th January 2015 &amp;mdash; The First Tests with Universal Robot UR5&lt;/h2>

&lt;div class='p'>Yes! &lt;span class='smile'>&lt;/span> Believe it or not, we now have the robotic arm UR5 from Universal
Robots.  Just for a week (for the Tuesday Jan 20th show), but still it is
great! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>I am currently exhausted and still sick and I should go home, but I am waiting
for the devices download to finish for
&lt;a href='https://www.mysick.com/eCat.aspx?go=DataSheet&amp;amp;Cat=Gus&amp;amp;At=Fa&amp;amp;Cult=English&amp;amp;Category=Software&amp;amp;ProductID=28665' class='external'>SOPAS&lt;/a>,
which I need only to change the IP address of our laser scanner :-(. We already
did it some time ago, but it was probably not saved, so now it runs on
169.254.225.156, which is not very useful when the camera, the robot UR5 and my laptop
all run on the 192.168.1.x network &amp;hellip; I just have to be patient, which is hard for me
now.&lt;/div>

&lt;div class='p'>So where are we? We have the robotic arm working = it scans, goes to the desired
position, closes the gripper, returns home. Smoothly, nicely &lt;span class='smile'>&lt;/span>. If you would
like to see the code, it is on github under
&lt;a href='https://github.com/robotika/applebot/blob/master/ur5.py' class='external'>ur5.py&lt;/a>.&lt;/div>

&lt;div class='p'>The camera is recording, so there should not be any problem.&lt;/div>

&lt;div class='p'>The laser is an issue now. First the IP, but the mounting also has to be revised.
The cables are very bulky and the first version of the gripper with sensors
did not take into account the wide radius of the arm.&lt;/div>

&lt;div class='p'>Time to go home &amp;hellip;&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/gripper-front.jpg'>&lt;img src='/articles/applebot/gripper-front_t.jpg' alt='gripper - front' title='gripper - front' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/gripper-front.jpg'>gripper - front&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/gripper-side.jpg'>&lt;img src='/articles/applebot/gripper-side_t.jpg' alt='gripper - side' title='gripper - side' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/gripper-side.jpg'>gripper - side&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/ur5-work-in-progress.jpg'>&lt;img src='/articles/applebot/ur5-work-in-progress_t.jpg' alt='UR5, work in progress' title='UR5, work in progress' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/ur5-work-in-progress.jpg'>UR5, work in progress&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&amp;hellip; at home &amp;hellip;&lt;/div>

&lt;div class='p'>On the way home I was thinking about several steps we had already completed but
which I had not mentioned yet. I am more relaxed now, so it should be easier to do the
revision:&lt;/div>

&lt;h3>Inverse kinematics&lt;/h3>

&lt;div class='p'>Well, it was not necessary to do ANY computations &lt;span class='smile'>&lt;/span>. There is a magic &lt;b>p
prefix&lt;/b> for coordinates sent to UR5, so if you send for example:&lt;/div>

&lt;pre>movej( p[0.139, -0.065, 0.869, -1.311, 1.026, -0.869] )&lt;/pre>

&lt;div class='p'>then the parameters are not angles in radians but &lt;b>x, y, z, rx, ry,
rz&lt;/b> instead. And because we need the angles fixed for version 0, there is no need
to worry what exactly &lt;i>rx, ry, rz&lt;/i> mean. You just turn the wrist to the
orientation you need and change only the first three parameters &lt;i>x, y, z&lt;/i>.&lt;/div>

&lt;h3>Server vs. Client&lt;/h3>

&lt;div class='p'>When we visited &lt;a href='http://www.exactec.com/' class='external'>Exactec&lt;/a> I learned that the UR5 is
so universal that it also supports many types of communication, including one
where the robot is only a slave/client. You can open sockets to other servers,
run several threads etc. I am glad that the other way also works, i.e. I use the
UR5 as a server and my notebook as a client sending requests. Port 30002 is
dedicated to this kind of communication. Note that you can change the IP address of
the robot, but it is highly recommended to reboot the robot after this action
&lt;span class='wink'>&lt;/span>.  After the reboot it worked.&lt;/div>
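
&lt;div class='p'>A minimal sketch of this client role (only port 30002 and the p-prefixed movej
above are taken from the robot; the IP address is just an example):&lt;/div>

&lt;pre># sketch: notebook as a client sending URScript to the UR5 server on port 30002
import socket

ROBOT_IP = "192.168.1.5"         # example address

def send_urscript(command):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((ROBOT_IP, 30002))
    s.send(command + "\n")
    s.close()

# x, y, z, rx, ry, rz -- thanks to the "p" prefix no inverse kinematics is needed
send_urscript("movej( p[0.139, -0.065, 0.869, -1.311, 1.026, -0.869] )")&lt;/pre>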

&lt;h3>Status Info&lt;/h3>

&lt;div class='p'>I was wondering how to get regular updates about the current robot joint positions
and it turned out that all you need to do is listen &lt;span class='smile'>&lt;/span>. There is a binary
protocol containing 1247 bytes per packet, where you will find everything from
positions, targets, speeds, Cartesian coordinates, voltages, currents,
temperatures &amp;hellip;&lt;/div>

&lt;div class='p'>At the moment I am using only the absolute speed of all joints as the termination
condition (desired position reached).&lt;/div>
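
&lt;div class='p'>The listening part is then a sketch like this (the 1247-byte packet size is the one
mentioned above; the assumption of fixed-size packets, the IP address and the log file
name are mine, and the parsing of individual fields is not shown):&lt;/div>

&lt;pre># sketch: listen for the binary status packets and log them
import socket
import time

PACKET_SIZE = 1247               # observed size of one status packet

def listen(robot_ip="192.168.1.5", port=30002):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((robot_ip, port))
    log = open("ur5_%d.log" % int(time.time()), "wb")
    buf = ""
    while True:                  # Ctrl+C to stop
        buf += s.recv(1024)
        while len(buf) >= PACKET_SIZE:
            packet, buf = buf[:PACKET_SIZE], buf[PACKET_SIZE:]
            log.write(packet)    # extraction of joint speeds etc. would go here&lt;/pre>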

&lt;h3>Logging&lt;/h3>

&lt;div class='p'>I fixed my mistake from Liberec, where I did not have logging ready. Because we
forgot the power cable there (do not laugh &lt;span class='wink'>&lt;/span>) I was using only the bits remaining in
my console. Now I have almost 100 files from today's testing, so &lt;i>cracking&lt;/i> what
the 64-bit timestamps are should be trivial. And it was (see
&lt;a href='https://github.com/robotika/applebot/commit/dc048b80e5e0216d86999ecf91de0fc6bf05326e' class='external'>diff&lt;/a>).
It is probably the time in milliseconds since reboot. The differences are around
12-13ms, i.e. the update rate is approximately 80Hz (???). I will have to measure
absolute time as a reference tomorrow.&lt;/div>
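
&lt;div class='p'>The cracking itself is only an unpack of 8 bytes (a sketch; the offset of the
timestamp inside the packet and the byte order are assumptions here):&lt;/div>

&lt;pre># sketch: interpret 8 bytes as a 64-bit timestamp and look at the differences
import struct

def timestamp(packet, offset=0):
    return struct.unpack_from(">Q", packet, offset)[0]

def diffs(packets):
    stamps = [timestamp(p) for p in packets]
    return [b - a for a, b in zip(stamps, stamps[1:])]

# differences around 12-13 (milliseconds?) would mean roughly 80Hz updates&lt;/pre>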

&lt;hr/>

&lt;div class='p'>&lt;a id="150117">&lt;/a>&lt;/div>

&lt;h2>17th January 2015 &amp;mdash; UR5 Video&lt;/h2>

&lt;div class='p'>Here is the first video with UR5 with camera and laser scanner in the
„hand”:&lt;/div>

&lt;div class='p'>&lt;iframe width="640" height="360" src="//www.youtube.com/embed/m8Ij3ATyN5s?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;div class='p'>The corresponding demo code commit is
&lt;a href='https://github.com/robotika/applebot/commit/70fb8fae44de046ef1a08e85d6bc55b0ed250429' class='external'>here&lt;/a>.&lt;/div>

&lt;div class='p'>You may notice a couple of details: first of all the laser scanner is rotated by 180
degrees. There is the yellow Merkur part (something like Meccano), which was a
&lt;i>quick and dirty&lt;/i> solution, but it works:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/merkur1.jpg'>&lt;img src='/articles/applebot/merkur1_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/merkur2.jpg'>&lt;img src='/articles/applebot/merkur2_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/merkur3.jpg'>&lt;img src='/articles/applebot/merkur3_t.jpg' alt='' title='' class='border'  width='220' height='165'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>I quite like it for rapid prototyping &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>Second, the hand does not look like in the pictures with Merkur above. I got
&lt;i>protective stop, position deviates from path Wrist1&lt;/i> several times until I
realized that the hand was actually &lt;b>self-colliding!&lt;/b>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 171px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/hand-self-colliding.jpg'>&lt;img src='/articles/applebot/hand-self-colliding_t.jpg' alt='Self colliding robot' title='Self colliding robot' class='border'  width='165' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/hand-self-colliding.jpg'>Self colliding robot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The solution was to change all three joints to this configuration:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/hand-upside-down.jpg'>&lt;img src='/articles/applebot/hand-upside-down_t.jpg' alt='Upside down configuration' title='Upside down configuration' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/hand-upside-down.jpg'>Upside down configuration&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>What was interesting was that &lt;i>X,Y,Z&lt;/i> as well as the rotation angles were the same as
before! Well, I was warned, but you do not quite believe it until you see it
&lt;span class='wink'>&lt;/span>. So letting the robot solve the inverse kinematics is good, but there can be more
than one solution and only the joint space defines the robot pose
unambiguously!&lt;/div>

&lt;div class='p'>What else? You can hear the clicking sounds near the apple, as the robot is
trying to pick it and then to drop it. I repeated the test six times and in
four attempts the gripper did not work?! I am not sure if the command was lost
or what the problem was. So one task is to review the UR5 output stream to check
whether the I/O is confirmed. It is nice to have all the log files (sent commands included)
where you can see that in all tests the gripper was supposed to close and
open.&lt;/div>

&lt;div class='p'>Well, I do not see anything in &lt;i>Tool Data&lt;/i>, except a small current change at
one moment
(&lt;a href='https://github.com/robotika/applebot/commit/050fcbba4b93f395ec68867447a0b204e557aabf' class='external'>diff&lt;/a>).
There are also some &lt;i>Masterboard Data&lt;/i>
(&lt;a href='https://github.com/robotika/applebot/commit/82a6f0331944d9bf256876dde80cbce1e06bde14' class='external'>diff&lt;/a>),
and the I/O bits changed from ['0x20000', '0x0'] to ['0x10000', '0x10000'] &amp;hellip; hmm,
interesting. Yes! I checked the second log file and the bits did not change.
Just to double-check that it worked in the 6th test &amp;hellip; yes &lt;span class='smile'>&lt;/span>. OK, so it is probably
necessary to verify these bits and repeat the command until they change.&lt;/div>

&lt;div class='p'>There was an interesting moment when my Ethernet cable was accidentally
disconnected. UR5 stopped talking to me. It looks like it is necessary to
send some command to UR5 first and then it starts to send the data again.&lt;/div>

&lt;div class='p'>I did not mention that &lt;b>the video is fake!!!&lt;/b> &amp;mdash; if you followed the github
link you would have seen it, but who would do that, right? &lt;span class='wink'>&lt;/span> The data collection
was running, so you can see me recording the video:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/pic_150117_125328_030.jpg'>&lt;img src='/articles/applebot/pic_150117_125328_030_t.jpg' alt='I see you you see me ...' title='I see you you see me ...' class='border'  width='320' height='240'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/pic_150117_125328_030.jpg'>I see you you see me ...&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The laser was running all the time, so the top-down scan corresponds only to part of
this distance map image:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 477px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/ur5-scan.png' alt='Distance image from laser scanner' title='Distance image from laser scanner' class='border'  width='471' height='271'/>&lt;/span>&lt;br/>
&lt;span>Distance image from laser scanner&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Do you see the apple with the cable there? &lt;span class='wink'>&lt;/span> In the last experiment I moved
my notebook away and slowed the scanning down 4 times (the original speed was 0.1m/s). One
pixel corresponded to 1mm (the laser updates at 10Hz) and after the change to 1/4 &amp;hellip;
do you see the difference?&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 404px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/ur5-slow-scan.png' alt='4times slower scan' title='4times slower scan' class='border'  width='398' height='271'/>&lt;/span>&lt;br/>
&lt;span>4times slower scan&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>I would guess that it is only two times slower, but the log file should clear
this up.&lt;/div>

&lt;div class='p'>p.s. &lt;i>timestamp&lt;/i> 14000-8547=5453 corresponds in reality to 43 seconds (images
pic_150117_132421_000.jpg to pic_150117_132504_039.jpg), so 1 step = 8ms???
Strange. Based on the size of the log file the update looks like 20Hz. Maybe some
other documentation will clear this up.&lt;/div>

&lt;div class='p'>p.s.2 now I should uncomment the
&lt;a href='https://github.com/robotika/applebot/blob/70fb8fae44de046ef1a08e85d6bc55b0ed250429/demo.py#L92' class='external'>apples = findApples( APPLE_SIZE, scan )&lt;/a> and see what the real detector would return + do the
transformation to 3D with some offsets &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150120">&lt;/a>&lt;/div>

&lt;h2>20th January 2015 &amp;mdash; Applebot UR5 ver0&lt;/h2>

&lt;div class='p'>It is working now &lt;span class='smile'>&lt;/span> &amp;mdash; see the corresponding
&lt;a href='https://github.com/robotika/applebot/commit/810032d8229c810ff0c71b9e56c9f72742bed924' class='external'>diff&lt;/a>.
It is not as perfect as the fake recorded over the weekend, but it does the work
autonomously:&lt;/div>

&lt;ul>
&lt;li>scan the scene&lt;/li>

&lt;li>search for the apple in recorded 3D data&lt;/li>

&lt;li>compute desired 3D pose&lt;/li>

&lt;li>navigate hand towards the apple&lt;/li>

&lt;li>close gripper&lt;/li>

&lt;li>move towards drop zone&lt;/li>

&lt;li>open gripper&lt;/li>
&lt;/ul>

&lt;div class='p'>I started to record from both cameras and as you can see I had a witness last
night:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/articles/applebot/bw_150119_195237_032.jpg'>&lt;img src='/articles/applebot/bw_150119_195237_032_t.jpg' alt='Milan watching the demo' title='Milan watching the demo' class='border'  width='320' height='240'/>&lt;/a>&lt;br/>
&lt;a href='/articles/applebot/bw_150119_195237_032.jpg'>Milan watching the demo&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Note that it is not perfect yet. In particular it is necessary to calibrate the
rotation around the Z axis. I used only a 3D offset and that is not sufficient.
A slight rotation of the scanner + imperfect mounting is probably the source of
a couple of centimeters of error, which depends on the position on the table (BTW
thanks Milan for the nice „apple holder”).&lt;/div>

&lt;div class='p'>The algorithm is quite simple now. It uses a dual threshold defined by the apple
size and scans with a 5mm step from 20cm to 50cm. It looks only for objects
(contours) of the expected size (area parametrized by motion step and distance) in
a given 3D slice. Only the first occurrence is accepted now (for higher
reliability for today's demo).&lt;/div>
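
&lt;div class='p'>Very roughly the slicing looks like this (a sketch only; the thresholds, the
expected contour area and the distance-image representation are simplified compared
to the real code on github):&lt;/div>

&lt;pre># sketch: slice the distance image in 5mm steps and take the first apple-sized contour
import cv2
import numpy as np

def find_apple(dist_mm, min_area=50, max_area=500):
    "dist_mm ... 2D array of laser distances in millimeters"
    for level in range(200, 500, 5):
        # dual threshold: keep only readings inside the current 3D slice
        mask = np.logical_and(dist_mm >= level, level + 5 > dist_mm)
        mask = mask.astype(np.uint8) * 255
        contours, hierarchy = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                               cv2.CHAIN_APPROX_SIMPLE)
        for cnt in contours:
            area = cv2.contourArea(cnt)
            if area >= min_area and max_area >= area:
                return level, cnt        # only the first occurrence is accepted
    return None&lt;/pre>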

&lt;div class='p'>If you see it working you may find it trivial, but it was a bit of a „jumpy road”
towards this intermediate goal.&lt;/div>

&lt;div class='p'>Over the weekend I was reading about similar apple harvesting projects, and I
read the Chinese article again. Why did they use the 9kg SICK LMS211, with such
bad accuracy (10mm +/- 15mm)?!  Well, it was not so bad &amp;mdash; the TIM551 has a
&lt;i>Systematic error&lt;/i> of +/- 60mm and a &lt;i>Statistical error&lt;/i> of +/- 20mm! So the fact
that I had a problem recognizing a smooth 3D sphere shape in a 3D cloud from a small
patch is expected. :-(&lt;/div>

&lt;div class='p'>I also checked the specs of our first scanner LMS100 (from
&lt;a href='/competitions/sick-robot-day/2010/en'>&lt;span class='cs'>SICK Robot Day 2010&lt;/span>&lt;/a>) and it promises a
&lt;i>Systematic error&lt;/i> of ± 30 mm and a &lt;i>Statistical error&lt;/i> of ± 12 mm. Also it can
scan at a 50Hz scanning frequency with 0.25 degree resolution. The bad news is the
weight of 1.1kg without cables and the operating range starting from 0.5 meter. So we
will probably stay with the smaller TIM551. On the other hand maybe it is just
a typo, as the TIM551 has an operating range from 0.05 meter and we used the LMS100 mostly
in this close range &amp;mdash; see the
&lt;a href='https://www.mysick.com/partnerPortal/ProductCatalog/DataSheet.aspx?ProductID=33753' class='external'>data
sheet&lt;/a>.&lt;/div>

&lt;div class='p'>This reminds me of the issue with the scanning frequency of the TIM551. It should be 15Hz,
but I was getting only 10Hz. Sure enough it is my fault, due to historical
reasons. It was probably not possible to switch the LMS100 to a scanning frequency of
10Hz, so we had to add a sleep there (see the Eduro
&lt;a href='https://github.com/robotika/eduro/blob/master/laser.py#L163' class='external'>laser.py&lt;/a> code).
So by removing this sleep we may now get a 50% faster scanning rate &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>Another eye-opener was with the UR5. Do you remember my comment that I would expect
the apple from the 6th test on Saturday to be much bigger (4 times) than in previous
experiments? Well, there is a global slider named &lt;b>Target Speed Fraction&lt;/b>,
which is probably used mainly in manufacturing, where you first test your
assembly line in slow motion and then increase it to 100%. I was operating the
arm with the touch screen control, so I set it to a slower speed (41%). But this
control also influences external commands! So in reality I was not scanning
at 0.1m/s but rather at 0.041m/s. For the last (6th) test I first rebooted
all machines, just to be sure, so this slider was back at 100% and the slow speed of
0.025m/s was in reality just approximately 2 times slower than in the first
test &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150124">&lt;/a>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 153px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/articles/applebot/robotshadowh_t.jpg' alt='Shadow Robot Hand' title='Shadow Robot Hand' class='border'  width='147' height='220'/>&lt;/span>&lt;br/>
&lt;span>Shadow Robot Hand&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>24th January 2015 &amp;mdash; The Hand&lt;/h2>

&lt;div class='p'>I came across &lt;a href='http://phys.org/news/2014-12-robot-shadow.html' class='external'>this article&lt;/a>
last weekend when I was preparing my presentation. It is about space research,
but look at the picture! And believe it or not, it is not science fiction &lt;span class='smile'>&lt;/span>. You
can contact the &lt;a href='http://www.shadowrobot.com/' class='external'>Shadow Robot Company&lt;/a> and get a quote
for their robotic hand. I was interested in the combination with the UR5 and they even offer
this combination (also with the UR10). The options are a left/right hand but also a
&lt;i>Lite Hand&lt;/i>, which would be the option to combine with the UR5. The "normal" hand
weighs 4.3kg, which is very close to the UR5 payload (5kg), while the "lite"
version is only 2.4kg.&lt;/div>

&lt;div class='p'>As you would expect, it is not a cheap toy. But if you take into account that you
can control all parts of all fingers and you have force feedback &amp;hellip; one day I
would like to test it &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150125">&lt;/a>&lt;/div>

&lt;h2>25th January 2015 &amp;mdash; Standa and UR5&lt;/h2>

&lt;div class='p'>This is a video recorded shortly before we packed the UR5 back into the box. Standa
wanted to try to program it from the control panel and you can see that he was more
successful than me with autonomous navigation &lt;span class='wink'>&lt;/span>. Note that all sensors were
running, in particular you can see the TIM551 scanning, which you would not see
with the naked eye.&lt;/div>

&lt;div class='p'>&lt;iframe width="560" height="315" src="//www.youtube.com/embed/videoseries?list=PL2gPpyBs1e22_WWSH_Cw6fTMFzK_-kYDT" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="150130">&lt;/a>&lt;/div>

&lt;h2>30th January 2015 &amp;mdash; The Others (part1)&lt;/h2>

&lt;div class='p'>According to a nice survey article &lt;i>Harvesting robots for high-value crops:
state-of-the-art review  and  challenges ahead&lt;/i>, Bac, Henten, Hemming, Edan,
&lt;i>Journal of Field Robotics&lt;/i> (2014) &amp;hellip; &lt;i>performance of harvesting robots did
not improve in the last three decades and none of these 50 robots were
commercialized.&lt;/i> (A project was included in the review if and only if a
complete functional system was built and reported in an English written
conference paper or peer-reviewed journal article.)&lt;/div>

&lt;div class='p'>Well, that is a bit discouraging. There were only five (!) projects for apple
harvesting and of them only two robots were autonomous. It is possible that the
situation changed last year, or that some project was not published or the authors
overlooked it, but still &amp;hellip; I expected more projects.&lt;/div>

&lt;div class='p'>The first presented autonomous apple harvesting robot (AAHR) is from Belgium
and you can read about it in a popular article, for example
&lt;a href='http://www.vision-systems.com/articles/print/volume-12/issue-8/features/profile-in-industry-solutions/vision-system-simplifies-robotic-fruit-picking.html' class='external'>here&lt;/a>,
and &lt;a href='https://hal.inria.fr/inria-00194739/document' class='external'>here&lt;/a> is the scientific paper.
I was told that the robot worked well, but the remaining problems were
increasing productivity, ensuring safety (according to standard regulations in the
open field), and the price of bringing it to the market. In the end there was no
company willing to take the risk. At the moment there is a new group continuing
the work in this research area.&lt;/div>

&lt;div class='p'>The second presented AAHR is from China:
&lt;a href='http://www.sciencedirect.com/science/article/pii/S1537511011001206' class='external'>Design and
control of an apple harvesting robot&lt;/a>, but I have not tracked that path yet.&lt;/div>

&lt;div class='p'>There is a promising new 2014 project in New Zealand &amp;mdash; see this
&lt;a href='http://www.nzherald.co.nz/business-around-new-zealand/news/article.cfm?c_id=1503701&amp;amp;objectid=11338529' class='external'>article&lt;/a>.
A new startup company with an already successful kiwi harvesting robot was
supported by the government to commercialize the technology, and the goal is also
to pick apples. The project has just started and it runs for 4 years.&lt;/div>

&lt;div class='p'>Finally, a note for myself &lt;span class='wink'>&lt;/span>: everything we tried to present with our experiment
here I found yesterday in a 15-year-old article:
&lt;a href='http://www.google.cz/url?sa=t&amp;amp;rct=j&amp;amp;q=&amp;amp;esrc=s&amp;amp;source=web&amp;amp;cd=13&amp;amp;ved=0CDkQFjACOAo&amp;amp;url=http%3A%2F%2Fwww.researchgate.net%2Fpublication%2F220464650_A_vision_system_based_on_a_laser_range-finder_applied_to_robotic_fruit_harvesting%2Flinks%2F0046351c175db74a59000000.pdf&amp;amp;ei=CH_KVLWXDsexPJKogYAK&amp;amp;usg=AFQjCNFDeluOvVY-Z3OgbuqBUeDk0P6VIw&amp;amp;bvm=bv.84607526,d.ZWU&amp;amp;cad=rja' class='external'>A
vision system based on a laser range-finder applied to robotic fruit
harvesting&lt;/a>. The laser technology was not so widespread at that time, but they did
very nice work on the recognition of partially occluded spherical fruits. The
inputs were the distance and reflectivity measured by a laser scanner.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/articles/applebot/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Jessica</title>
	<link rel='alternate' href="http://localhost/robots/jessica/en"/>
	<id>http://localhost/robots/jessica/en</id>
	<updated>2014-11-03T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Parrot minidrone (Rolling Spider) is my current toy. The primary motivation is
preparation of a demo for the contest of autonomous robots &lt;b>Tour the Stairs&lt;/b>
(part of robotic festival at the end of November in Prague). Anybody interested
in decoding communication before SDK will be freely available? &lt;b>Blog update:&lt;/b> 4/12 &amp;mdash; &lt;a href='/robots/jessica/en#141204'>Parrot released ARDroneSDK3!&lt;/a>
 </summary>
	<content type='html'> 
&lt;div class='p'>Here are the first pictures:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/jessica/box.jpg'>&lt;img src='/robots/jessica/box_t.jpg' alt='box' title='box' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/jessica/box.jpg'>box&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/jessica/jessica.jpg'>&lt;img src='/robots/jessica/jessica_t.jpg' alt='minidrone' title='minidrone' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/jessica/jessica.jpg'>minidrone&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/jessica/battery.jpg'>&lt;img src='/robots/jessica/battery_t.jpg' alt='battery' title='battery' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/jessica/battery.jpg'>battery&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The drone was lent by the Czech Parrot distributor:
&lt;a href='http://www.icornerhightech.cz/' class='external'>http://www.icornerhightech.cz/&lt;/a>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/jessica/spider.jpg'>&lt;img src='/robots/jessica/spider_t.jpg' alt='Parrot Minidrone' title='Parrot Minidrone' class='border'  width='320' height='229'/>&lt;/a>&lt;br/>
&lt;a href='/robots/jessica/spider.jpg'>Parrot Minidrone&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&amp;hellip; and here is the link where the SDK from Parrot will hopefully appear one day:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='https://devzone.parrot.com/projects/show/oss-rolling-spider' class='external'>https://devzone.parrot.com/projects/show/oss-rolling-spider&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>Currently  you will find there: &lt;i>Note: this is not the Software Development
Kit (SDK) for developing applications to control the drone from a remote
device.  The SDK will be released at a later time.&lt;/i>&lt;/div>

&lt;div class='p'>Unfortunately we have a tight deadline, because
&lt;a href='/competitions/tour-the-stairs/en'>Tour the Stairs&lt;/a> is at the end of November
2014 and the situation with the SDK probably won't change.&lt;/div>

&lt;h2>Bad news&lt;/h2>

&lt;div class='p'>&lt;a href='http://www.icornerhightech.cz/rolling-spider' class='external'>Rolling Spider&lt;/a> uses
&lt;a href='http://en.wikipedia.org/wiki/Bluetooth' class='external'>Bluetooth 4&lt;/a> = BLE, which means
&lt;a href='http://en.wikipedia.org/wiki/Bluetooth_low_energy' class='external'>Bluetooth Low Energy&lt;/a>.
This sounds like an interesting new technology but for example my Windows 7 does
not support it (update: it looks like there are some &lt;a href='http://support.lenovo.com/us/en/downloads/ds029104' class='external'>new drivers&lt;/a>).
 On the other hand the other notebook with Windows 8 behaves
like it could potentially talk to the drone.&lt;/div>

&lt;div class='p'>Rolling Spider comes with the application
&lt;a href='https://play.google.com/store/apps/details?id=com.parrot.freeflight3&amp;amp;hl=en' class='external'>Free
Flight 3&lt;/a>, which is nice, but &amp;hellip; in order to talk to the drone you need not
only BT4, but your device also has to be in a relatively short list of supported
devices:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='http://blog.parrot.com/2014/07/22/devices-compatibility-with-minidrones/' class='external'>http://blog.parrot.com/2014/07/22/devices-compatibility-with-minidrones/&lt;/a>&lt;/li>
&lt;/ul>

&lt;h2>Good news &amp;mdash; btsnoop_hci.log&lt;/h2>

&lt;div class='p'>There is hope and its name is &lt;b>btsnoop_hci.log&lt;/b> &lt;span class='smile'>&lt;/span>. What is it? Since
Android 4.4 there is a new tool for capturing Bluetooth communication (see this
&lt;a href='https://viaforensics.com/articles-presentations/bluetooth-packet-capture-android.html' class='external'>article&lt;/a>).
Just check the option &lt;i>Enable Bluetooth HCI snoop log&lt;/i> in &lt;i>Developer options&lt;/i>
and the whole communication is logged into the file &lt;i>btsnoop_hci.log&lt;/i>.  Cool, isn't
it?!&lt;/div>

&lt;div class='p'>No idea what to do with it? Well, there has to be coded info about why the drone refuses
other phones, the commands to fly, the termination of communication, etc. At the moment
I have a couple of logs from two different phones. If you want an example,
&lt;a href='/robots/jessica/btsnoop_hci.log'>here&lt;/a> is one log from a Samsung S4 (well, I am not sure what
kind of "secret" information is hidden there). It was just on/off and once I
accidentally touched the display and the propellers turned a little &amp;hellip; and it
has 140kB, sigh.&lt;/div>

&lt;div class='p'>The parsing is relatively simple. The log contains timestamps, how many bytes are
transferred and in which direction. Here is the new
&lt;a href='https://github.com/robotika/jessica' class='external'>Jessica&amp;#039;s repository&lt;/a> on github with
a simple &lt;a href='https://github.com/robotika/jessica/blob/master/btsnoop/parse.py' class='external'>parse.py&lt;/a>.&lt;/div>
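
&lt;div class='p'>The file format is straightforward (as far as I understand BTSnoop): a 16-byte file
header and then records, each with a 24-byte record header. A minimal sketch (the
real parse.py does more, e.g. decoding the directions):&lt;/div>

&lt;pre># minimal sketch of reading btsnoop_hci.log records (see parse.py for the real thing)
import struct

def read_btsnoop(filename):
    f = open(filename, "rb")
    assert f.read(8) == "btsnoop\0"
    version, datalink = struct.unpack(">II", f.read(8))
    while True:
        header = f.read(24)
        if len(header) != 24:
            break
        orig_len, incl_len, flags, drops, timestamp = struct.unpack(">IIIIq", header)
        data = f.read(incl_len)
        yield timestamp, flags, data     # flags bit 0 should be the direction

for timestamp, flags, data in read_btsnoop("btsnoop_hci.log"):
    print timestamp, flags, data.encode("hex")&lt;/pre>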

&lt;div class='p'>If you have a phone with Android 4.4 and would like to repeat what I did, here
are two important details:&lt;/div>

&lt;ul>
&lt;li>the default phone menu does not contain &lt;i>Developer options&lt;/i>&lt;/li>

&lt;li>&lt;i>btsnoop_hci.log&lt;/i> is not necessarily at &lt;i>/sdcard/btsnoop_hci.log&lt;/i>&lt;/li>
&lt;/ul>

&lt;div class='p'>The first thing was simple &amp;mdash; the answer is for example
&lt;a href='http://stackoverflow.com/questions/16707137/how-to-find-and-turn-on-usb-debugging-mode-on-nexus-4' class='external'>here&lt;/a>.
It is necessary to press the otherwise gray &lt;i>Build number&lt;/i> in &lt;i>About phone&lt;/i>
seven times and you will see &lt;i>You are a developer!&lt;/i>&lt;/div>

&lt;div class='p'>The hint about the exact absolute path was written in the already mentioned
&lt;a href='https://viaforensics.com/articles-presentations/bluetooth-packet-capture-android.html' class='external'>article&lt;/a>.
In reality it was &lt;i>/sdcard/Android/data/btsnoop_hci.log&lt;/i>, but it can be
different on every system. In particular it is necessary to look in the file
&lt;i>/etc/bluetooth/bt_stack.conf&lt;/i> and there you will find it. If you have a
problem looking into this directory, you can use for example
&lt;a href='https://play.google.com/store/apps/details?id=com.ghisler.android.TotalCommander' class='external'>Total
Commander&lt;/a>.&lt;/div>

&lt;ul>
&lt;li>&lt;a href='https://projects.ardrone.org/boards/1/topics/show/6844' class='external'>AR Drone forum: SDK support for Bebop and Rolling Spider?&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Blog&lt;/h1>

&lt;div class='p'>&lt;a id="141105">&lt;/a>&lt;/div>

&lt;h2>5th November 2014 &amp;mdash; BLE, BLE, BLE&lt;/h2>

&lt;div class='p'>It is too early and there are no suitable „ready to use” tools. I am giving
up on the combination "Bluetooth 4 BLE Python Windows 7". It looks like there is only
one Python wrapper supporting Bluetooth Low Energy for Linux
(&lt;a href='https://github.com/IanHarvey/bluepy' class='external'>bluepy&lt;/a>), but that is basically a
wrapper for &lt;a href='http://www.bluez.org/' class='external'>bluez&lt;/a>. It could be a source of
inspiration though, like the
&lt;a href='https://github.com/IanHarvey/bluepy/blob/master/bluepy/btle.py#L342-L342' class='external'>list
of service UUIDs&lt;/a>.&lt;/div>

&lt;div class='p'>The way to go now is to get the
&lt;a href='http://developer.android.com/samples/BluetoothLeGatt/index.html' class='external'>BluetoothLeGatt
example&lt;/a> for Android 4.3+ running in Java. GATT seems to be an important word &amp;mdash;
the Generic Attribute Profile used for BLE. It would be just too simple to use
the sample: &lt;i>Note: At this time, the downloadable projects are designed for
use with Gradle and Android Studio. Project downloads for Eclipse will be
available soon!&lt;/i> (taken from the
&lt;a href='http://developer.android.com/samples/index.html' class='external'>samples page&lt;/a>).&lt;/div>

&lt;div class='p'>And what am I trying to do? Well, it would be a bit useless to fully understand
&lt;b>btsnoop_hci.log&lt;/b> if I were not able to &lt;b>replay&lt;/b> it. So that is the
plan &amp;mdash; record some minidrone action with the logging facilities and then replay
it on the same phone, where it should hopefully work. And then, in the
next phase, try to understand the command details.&lt;/div>

&lt;div class='p'>p.s. obviously I am not the only one, who is looking for
&lt;a href='http://www.reddit.com/r/Python/comments/2ce703/are_there_really_no_ble_wrapperslibraries/' class='external'>Python
BLE libraries&lt;/a>&amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141107">&lt;/a>&lt;/div>

&lt;h2>7th November 2014 &amp;mdash; Android Sample&lt;/h2>

&lt;div class='p'>I have to write down some notes, because otherwise you would not be able to tell that
there was some progress (especially since there are no new commits on
&lt;a href='https://github.com/robotika/jessica' class='external'>github&lt;/a>).  The good news is that I
managed to get the
&lt;a href='http://developer.android.com/guide/topics/connectivity/bluetooth-le.html' class='external'>Android
Bluetooth LE Sample&lt;/a> working. It was not „click and go” &amp;mdash; instead it was a kind of
blind fight with installation, drivers, etc.&lt;/div>

&lt;div class='p'>In the end I decided to try
&lt;a href='https://developer.android.com/sdk/installing/studio.html' class='external'>Android Studio&lt;/a>. I
am still not sure if I was supposed to use &lt;b>studio.exe&lt;/b> directly or
&lt;b>studio.bat&lt;/b>, but now I have set the path to the Java JDK (the &lt;i>bat&lt;/i> wrote:
&lt;i>ERROR: cannot start Android Studio.  No JDK found. Please validate either
ANDROID_STUDIO_JDK, JDK_HOME or JAVA_HOME points to valid JDK installation.&lt;/i>)
and I also overcame a problem in the studio itself:&lt;/div>

&lt;pre>Error:Unable to start the daemon process.
This problem might be caused by incorrect configuration of the daemon.
For example, an unrecognized jvm option is used.
Please refer to the user guide chapter on the daemon at 
   http://gradle.org/docs/1.12/userguide/gradle_daemon.html
Please read below process output to find out more:
- - -  - - - - - - - -  - - - - - -  - - - - - -
Error occurred during initialization of VM
Could not reserve enough space for object heap
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.&lt;/pre>

&lt;div class='p'>The solution for this was found on &lt;a href='http://stackoverflow.com/questions/23663299/android-studio-gradle-project-sync-failed' class='external'>stack overflow&lt;/a>: &lt;i>to add something like this to your gradle.properties file in the project:&lt;/i>&lt;/div>

&lt;pre>org.gradle.jvmargs=-Xmx512m -XX:MaxPermSize=512m&lt;/pre>

&lt;div class='p'>I did not have any file named &lt;i>gradle.properties&lt;/i>, but details on where such a
file can be placed are in the
&lt;a href='http://www.gradle.org/docs/current/userguide/tutorial_this_and_that.html' class='external'>gradle
documentation&lt;/a>.&lt;/div>

&lt;div class='p'>So what does the Bluetooth sample do? It scans for BLE devices; if you select one
it connects to it and lists the available services, and if you select a service it can
read the given server's (e.g. minidrone) characteristics. I probably mixed up the exact
Bluetooth terms, but what I mean is that you can connect to the drone and ask
what it can do. And surprisingly enough 80% (?) of the initial communication is
&lt;b>identical&lt;/b> to &lt;i>Free Flight 3&lt;/i>!&lt;/div>

&lt;div class='p'>The conclusion is that it is not only about „decoding the Parrot protocol” but rather
about understanding the GATT and Bluetooth LE protocols (which is good and bad news in
one). For me this is a completely new area, but the motivation to learn more is relatively
strong. An interesting online book with a reasonable explanation is
&lt;a href='https://www.safaribooksonline.com/library/view/getting-started-with/9781491900550/ch04.html' class='external'>here&lt;/a>,
Chapter 4. GATT (Services and Characteristics).&lt;/div>

&lt;div class='p'>A funny note at the end of this update &amp;hellip; as I am decoding the communication
bottom up, it is fun to discover messages like these (two different phones):
[2, 64, 32, 17, 0, 13, 0, 4, 0, 9, 11, 22, 0, 71, 97, 108, 97, 120, 121, 32,
83, 52] and [2, 64, 32, 27, 0, 23, 0, 4, 0, 9, 21, 22, 0, 76, 106, 32, 67, 114,
101, 97, 116, 111, 114, 32, 40, 71, 97, 108, 97, 120, 121, 32]. Do you see
some similarity there? And what about [2, 64, 32, LEN=27, 0, LEN=23, 0, 4, 0,
9, LEN=21, 22, 0, 76, 106, 32, 67, 114, 101, 97, 116, 111, 114, 32, 40, 71, 97,
108, 97, 120, 121, 32]? So it is an envelope of an envelope of an envelope for a string.
Almost half of the transmitted bytes are lengths of the sent buffer. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141109">&lt;/a>&lt;/div>

&lt;h2>9th November 2014 &amp;mdash; GATT, Services, Characteristics, &amp;hellip;&lt;/h2>

&lt;div class='p'>It is quite hard for me to read through the Bluetooth LE specification. The bits
slowly fit into the whole picture, after many reading attempts. Everything is
written there, just my brain somehow refuses to understand it &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>Now I would like to say that the best start is probably the
&lt;a href='http://en.wikipedia.org/wiki/Bluetooth_low_energy' class='external'>wikipedia Bluetooth Low
Energy&lt;/a> article. Note that you have to understand the BLE concept if you want
to understand the &lt;i>Parrot minidrone protocol&lt;/i>, so that's why I am trying „so
hard” &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>GATT, Services, Characteristics, &amp;hellip; these are all &lt;i>keywords&lt;/i> used in BLE.
&lt;i>GATT is an acronym for the Generic Attribute Profile, and it defines the way
that two Bluetooth Low Energy devices transfer data back and forth using
concepts called Services and Characteristics. It makes use of a generic data
protocol called the Attribute Protocol (ATT), which is used to store Services,
Characteristics and related data in a simple lookup table using 16-bit IDs for
each entry in the table.&lt;/i>
(&lt;a href='https://learn.adafruit.com/introduction-to-bluetooth-low-energy/gatt' class='external'>source&lt;/a>)
Clear? &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>I cannot find my „eye opener” image/table now, so here is at least a more
complicated one:
&lt;a href='http://www.jiataochina.com/UpLoad/201311/2013111439221329.png' class='external'>source&lt;/a>.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 646px;'>&lt;tr>&lt;td>
&lt;a href='/robots/jessica/gatt-l2cap.jpg'>&lt;img src='/robots/jessica/gatt-l2cap_t.jpg' alt='GATT based on L2CAP' title='GATT based on L2CAP' class='border'  width='640' height='290'/>&lt;/a>&lt;br/>
&lt;a href='/robots/jessica/gatt-l2cap.jpg'>GATT based on L2CAP&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>So GATT is based on the existing L2CAP (Logical Link Control and Adaptation
Protocol), which is something similar to UDP in Internet communication (and it
is not available under the Microsoft Bluetooth Stack, as far as I understand it).
For night reading I would recommend the
&lt;a href='http://people.csail.mit.edu/rudolph/Teaching/Articles/BTBook.pdf' class='external'>Bluetooth
for Programmers&lt;/a> PDF from MIT &amp;mdash; a bit older with many TODOs, but still good
enough. There you will learn about &lt;i>L2CAP&lt;/i>.&lt;/div>

&lt;div class='p'>Does it help? Well, not really. It means that if I were able to send
&lt;i>packets&lt;/i> via &lt;i>L2CAP&lt;/i> I should be able to replay the logged data as-is.
Maybe. But if I want to use the GATT available in Android, for example, then it is
necessary to understand the protocol in more detail and get down to
&lt;i>what each byte means&lt;/i>.&lt;/div>

&lt;div class='p'>I would pick some (for me important) sentences from
&lt;a href='http://en.wikipedia.org/wiki/Bluetooth_low_energy' class='external'>wikipedia&lt;/a>:&lt;/div>

&lt;ul>
&lt;li>Bluetooth Smart is not backward-compatible with the previous Bluetooth
protocol. Bluetooth Smart uses the same 2.4 GHz radio frequencies, but it uses
a simpler modulation system. (i.e. it is not a good idea to try to convince old hardware to
talk BLE)&lt;/li>

&lt;li>Write operations always identify the characteristic by handle, but have a
choice of whether or not a response from the server is required.
(So it is not enough to know a characteristic by its UUID, you need to find &lt;b>the
handle&lt;/b>.)&lt;/li>
&lt;/ul>

&lt;div class='p'>Now back to the minidrone. If you run the
&lt;a href='http://developer.android.com/guide/topics/connectivity/bluetooth-le.html' class='external'>Android
Bluetooth LE&lt;/a> sample, you will find a big list of available characteristics
(32+32+7=71 in total).  The complete list is in the
&lt;a href='https://github.com/robotika/jessica/blob/master/android/SampleGattAttributes.java' class='external'>modified
SampleGattAttributes.java&lt;/a>.  But how to find which UUID corresponds to
the „magical command” for motion control,
&lt;a href='https://github.com/robotika/jessica/commit/e385113396b904c4be8961e94f160f553123f4fa' class='external'>mentioned
earlier&lt;/a>? After some experiments with writing, it looks like
&lt;a href='https://github.com/robotika/jessica/commit/e385113396b904c4be8961e94f160f553123f4fa#diff-086232ecb08dc32632ba1d05ff68759dR50' class='external'>the
assert&lt;/a> with the array [0x12, 0x0, 0x4, 0x0, 0x52, 0x40, 0x0, 0x2] contains the
16-bit handle [0x40, 0x0].&lt;/div>

&lt;div class='p'>How to write characteristics? This part is missing in the example, but you can
find the answer
&lt;a href='http://stackoverflow.com/questions/20043388/working-with-ble-android-4-3-how-to-write-charactaristics' class='external'>here&lt;/a>.&lt;/div>

&lt;div class='p'>This was my hack (modification of DeviceControlActivity.java):&lt;/div>

&lt;pre>//hack mBluetoothLeService.readCharacteristic(characteristic);
// hacking
byte[] value = new byte[5];
value[0] = (byte) (0x11);
value[1] = (byte) (0x22);
value[2] = (byte) (0x33);
value[3] = (byte) (0x44);
value[4] = (byte) (0x55);
characteristic.setValue( value );
mBluetoothLeService.writeCharacteristic(characteristic);
// end of hacking&lt;/pre>

&lt;div class='p'>and the changes to &lt;i>BluetoothLeService.java&lt;/i>:&lt;/div>

&lt;pre>public void writeCharacteristic(BluetoothGattCharacteristic characteristic) {
    if (mBluetoothAdapter == null || mBluetoothGatt == null) {
        Log.w(TAG, "BluetoothAdapter not initialized");
        return;
    }
    mBluetoothGatt.writeCharacteristic(characteristic);
}&lt;/pre>

&lt;div class='p'>If you look in &lt;b>btsnoop_hci.log&lt;/b> you will find there 0x11, 0x22, 0x33, 0x44,
0x55:&lt;/div>

&lt;pre>[0x2, 0x40, 0x20, 0xc, 0x0, 0x8, 0x0, 0x4, 0x0, 0x52, 0x22, 0x1, 0x11, 0x22, 0x33, 0x44, 0x55]
[0x2, 0x40, 0x20, 0xc, 0x0, 0x8, 0x0, 0x4, 0x0, 0x52, 0x25, 0x1, 0x11, 0x22, 0x33, 0x44, 0x55]&lt;/pre>

&lt;div class='p'>I was sending this array at first to characteristics &lt;i>D52&lt;/i> and &lt;i>D53&lt;/i> (the
only difference is 0x22 and 0x25), but then I tried &lt;i>Axx&lt;/i> characteristics
(there are 32 of them) and got&lt;/div>

&lt;pre>[0x2, 0x40, 0x20, 0xc, 0x0, 0x8, 0x0, 0x4, 0x0, 0x52, 0x22, 0x0, 0x11, 0x22, 0x33, 0x44, 0x55]
[0x2, 0x40, 0x20, 0xc, 0x0, 0x8, 0x0, 0x4, 0x0, 0x52, 0x25, 0x0, 0x11, 0x22, 0x33, 0x44, 0x55]
&amp;hellip;&lt;/pre>

&lt;div class='p'>At first I was quite disappointed that I was getting the same numbers as
for &lt;i>D52&lt;/i> and &lt;i>D53&lt;/i> and that the handle is probably dynamically assigned &amp;hellip; but I
had overlooked the 0x1/0x0 difference &lt;span class='smile'>&lt;/span>. And the handle is 16-bit, so the treasure has
&lt;b>UUID=9a66fa0a-0800-9191-11e4-012d1540cb8e&lt;/b>.&lt;/div>

&lt;div class='p'>p.s. do not ask me if it is already flying &amp;mdash; it is not :-( &amp;hellip; not yet.&lt;/div>

&lt;div class='p'>p.s.2 one implementation note which could be useful: &lt;i>the advertisement packet
is composed of a series of variable length blocks, that can appear in any
order.  each block starts with a length byte, followed by a type byte, followed
by the data.  the payload cannot exceed 31 bytes.&lt;/i> &amp;hellip; taken from
&lt;a href='https://github.com/RFduino/RFduino/blob/master/libraries/RFduinoBLE/examples/AdvertisementRaw/AdvertisementRaw.ino' class='external'>RFduinoBLE
github&lt;/a> &amp;hellip; but maybe it is not related.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141111">&lt;/a>&lt;/div>

&lt;h2>11th November 2014 &amp;mdash; Cracking Handles&lt;/h2>

&lt;div class='p'>Thanks to &lt;i>Šimi&lt;/i> &lt;span class='smile'>&lt;/span> I realized why I was not able to find the &lt;i>handles&lt;/i> (16-bit
identification of &lt;i>characteristics&lt;/i>) for the &lt;i>Bxx&lt;/i> services. The reason is the
&lt;i>GATT PROPERTIES&lt;/i>. It struck me when I saw in the debugger that
&lt;i>mProperties&lt;/i> differ:&lt;/div>

&lt;pre>mUuid = {java.util.UUID830047596672}"9a66ffc1-0800-9191-11e4-012d1540cb8e"
mService = {android.bluetooth.BluetoothGattService830047490720}
mProperties = 12
mPermissions = 0
mKeySize = 16
mInstance = 0
mWriteType = 1

mUuid = {java.util.UUID830047598160}"9a66fd22-0800-9191-11e4-012d1540cb8e"
mService = {android.bluetooth.BluetoothGattService830047491960}
mProperties = 30
mPermissions = 0
mKeySize = 16
mInstance = 0
mWriteType = 1&lt;/pre>

&lt;div class='p'>The bit flags are the following:&lt;/div>

&lt;pre>public static final int PROPERTY_BROADCAST = 1;
public static final int PROPERTY_EXTENDED_PROPS = 128;
public static final int PROPERTY_INDICATE = 32;
public static final int PROPERTY_NOTIFY = 16;
public static final int PROPERTY_READ = 2;
public static final int PROPERTY_SIGNED_WRITE = 64;
public static final int PROPERTY_WRITE = 8;
public static final int PROPERTY_WRITE_NO_RESPONSE = 4;&lt;/pre>

&lt;div class='p'>So some characteristics are &lt;i>write only&lt;/i>, some are &lt;i>read only&lt;/i>, etc.
And my piece of code was inside an "if can read" condition:&lt;/div>

&lt;pre>final int charaProp = characteristic.getProperties();
if ((charaProp | BluetoothGattCharacteristic.PROPERTY_READ) > 0) { &amp;hellip; }&lt;/pre>

&lt;div class='p'>(I am not a Java programmer, but this bit operation would be wrong in C &amp;hellip; I
would guess that there should be &lt;b>&amp;amp;&lt;/b> instead &amp;hellip; this is taken from the
Android Sample)&lt;/div>

&lt;div class='p'>What I am up to is to find the &lt;i>handle&lt;/i> for every &lt;i>UUID&lt;/i>, and after reading
&lt;a href='http://e2e.ti.com/support/wireless_connectivity/f/538/t/299434.aspx' class='external'>this
webpage&lt;/a> I was looking for another way: &lt;i>2. The Android BLE developers chose
to use UUID as the characteristic identifier in GATT API's; handles are not
directly exposed to applications. Internally, Android maps UUID's to handles
and uses handles to access characteristics, but apps have no need to know the
handle.&lt;/i> &amp;hellip; so it is not a good idea to search for it in Android Java.&lt;/div>

&lt;div class='p'>The good news is that you can find it in &lt;b>btsnoop_hci.log&lt;/b>. I wanted to
know the handle for &lt;i>9a66fa0a-0800-9191-11e4-012d1540cb8e&lt;/i> (I know this one, it
is 0x40). All you need to do is search for "0a fa 66 9a" (reversed byte order), and you
will find &lt;b>02 40 20 1B 00 17 00 04 00 09 15 3F 00 04 _40 00_ 8E CB 40 15 2D 01
E4 11 91 91 00 08 0A FA 66 9A&lt;/b>&lt;/div>
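
&lt;div class='p'>A little helper sketch, based only on the packet quoted above: the two bytes just
before the reversed UUID are the handle, little-endian:&lt;/div>

&lt;pre># sketch: find the handle for a UUID by searching for its reversed bytes in the raw data
def uuid_to_bytes(uuid):
    pairs = [uuid.replace("-", "")[i:i+2] for i in range(0, 32, 2)]
    return "".join(chr(int(p, 16)) for p in reversed(pairs))

def find_handle(raw_data, uuid):
    i = raw_data.find(uuid_to_bytes(uuid))
    if i >= 2:
        return ord(raw_data[i-2]) + 256 * ord(raw_data[i-1])
    return None

# find_handle(dump, "9a66fa0a-0800-9191-11e4-012d1540cb8e") should return 0x40&lt;/pre>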

&lt;div class='p'>&amp;hellip; so now I have all handles of all characteristics + their properties &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141112">&lt;/a>&lt;/div>

&lt;h2>12th November 2014 &amp;mdash; Missing handles and more UUIDs&lt;/h2>

&lt;div class='p'>OK, now I think that I can finally generate output identical to &lt;i>FreeFlight3&lt;/i>.
But it was not that easy. Where was the problem? Well, I could not get this
output section:&lt;/div>

&lt;pre>02 40 20 09 00 05 00 04 00 12 C0 00 01 00
02 40 20 09 00 05 00 04 00 12 BD 00 01 00
02 40 20 09 00 05 00 04 00 12 E4 00 01 00
02 40 20 09 00 05 00 04 00 12 E7 00 01 00
02 40 20 09 00 05 00 04 00 52 16 01 01 00
02 40 20 09 00 05 00 04 00 52 26 01 01 00&lt;/pre>

&lt;div class='p'>The last two "packets" have 0x52 which is similar to&lt;/div>

&lt;pre>02 40 20 0A 00 06 00 04 00 52 7C 00 01 01 01&lt;/pre>

&lt;div class='p'>&amp;hellip; and this means write &lt;i>01 01 01&lt;/i> to handle 0x7C. So the previous section
should correspond to "write 01 00 to handle 0x116". Well, but there is no
UUID which would correspond to handle 0x116!? :-(. Do you know the
answer?  I did not. The major hint was
&lt;a href='http://processors.wiki.ti.com/images/a/a8/BLE_SensorTag_GATT_Server.pdf' class='external'>BLE_SensorTag_GATT_Server.pdf&lt;/a>
where I found &lt;i>Client Characteristics Configuration --- Write "01:00" to
enable notifications, "00:00" to disable notification&lt;/i>. I supposed that the
Android Sample was already enabling notifications, but it was only for the
special UUID_HEART_RATE_MEASUREMENT.&lt;/div>

&lt;div class='p'>The code was replaced by:&lt;/div>

&lt;pre>for( BluetoothGattDescriptor descriptor : characteristic.getDescriptors() ) {
        descriptor.setValue(BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE);
        mBluetoothGatt.writeDescriptor(descriptor);
}&lt;/pre>

&lt;div class='p'>And then I tried nearby handles, and it worked &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>That left 4 packets with 0x12 instead of 0x52. They belong to the &lt;i>Bxx&lt;/i> section
where only &lt;i>PROPERTY_NOTIFY = 16&lt;/i> is set. I tried that and it worked too,
although the offsets were a bit different. Here is the
&lt;a href='https://github.com/robotika/jessica/commit/bf9edc5eb1650a0c417b357d3a9249ea95380838' class='external'>diff&lt;/a>
for the &lt;i>Bxx notifications&lt;/i>.&lt;/div>

&lt;div class='p'>And one small bonus &amp;mdash; I was tired of watching the minidrone LEDs, so last
night I started &lt;i>FreeFlight3&lt;/i> again and just spun the propellers &lt;span class='wink'>&lt;/span>. And there
was a difference compared to the old logged output:&lt;/div>

&lt;pre>02 40 20 18 00 14 00 04 00 52 43 00 04 01 00 04 01 00 32 30 31 34 2D 31 31 2D 31 31 00
02 40 20 1A 00 16 00 04 00 52 43 00 04 02 00 04 02 00 54 32 32 33 32 33 38 2B 30 31 30 30 00
02 40 20 0D 00 09 00 04 00 52 43 00 04 03 00 02 00 00&lt;/pre>

&lt;div class='p'>Guess what it is? 0x30-0x39 are the ASCII digits &amp;hellip; is &lt;i>2014-11-11&lt;/i> and
&lt;i>T223238+0100&lt;/i> better? So UUID
&lt;a href='https://github.com/robotika/jessica/commit/876eb50506b149734d6527ca2504313ccc618113' class='external'>9a66fa0b-0800-9191-11e4-012d1540cb8e&lt;/a>
is Date/Time.&lt;/div>

&lt;div class='p'>Now it is time to put it all together and replay it with 50ms sleeps (at least
it looks like the messages are sent with 20Hz frequency).&lt;/div>

&lt;div class='p'>p.s. I skipped the bad news &amp;mdash; I bought a
&lt;a href='http://www.alza.cz/asus-usb-bt400-d510631.htm' class='external'>USB BT400&lt;/a> to enable Bluetooth
4.0 on my old laptop with Ubuntu Linux,
&lt;a href='http://www.linux-hardware-guide.com/2014-10-11-asus-usb-bt400-usb-bluetooth-4-0' class='external'>fixed
some issues&lt;/a> with the driver, but the drone refused to connect anyway &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141113">&lt;/a>&lt;/div>

&lt;h2>13th November 2014 &amp;mdash; The First Spin&lt;/h2>

&lt;div class='p'>Yesterday morning I finally managed to spin the propellers &lt;span class='smile'>&lt;/span>. Why it did not
work sooner I am not 100% sure, but it could be due to timing (before, I was
sending messages without a pause) and I had a mistake in the selected increasing byte,
which was maybe in the previous version too.&lt;/div>

&lt;div class='p'>I decided to put even the non-modified source on github
(&lt;a href='https://github.com/robotika/jessica/commit/512c887672b13009375778a82a4822c0de7bc216' class='external'>diff&lt;/a>),
so if you want, you should be able to repeat my steps. Please let me know if you
encounter any problems.&lt;/div>

&lt;div class='p'>I tried to record a demonstration video this morning, and sure enough it did
not work. There is still a big TODO for version 0 (homologation, climb one
step) and it is necessary to be able to stop the robot. There will be a new
application button for "Tour the Stairs", but it is not there at the moment, so
what you need to do is enable notifications via the B01 service, B0E
characteristic &amp;mdash; and that works:&lt;/div>

&lt;div class='p'>&lt;iframe width="640" height="360" src="//www.youtube.com/embed/4jyffZZ1At0?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141114">&lt;/a>&lt;/div>

&lt;h2>14th November 2014 &amp;mdash; Not there yet&lt;/h2>

&lt;div class='p'>This is the first attempt at &lt;b>version 0&lt;/b> of the
&lt;a href='https://github.com/robotika/jessica/commit/911b5a06bf75d999dcd3e7ec8554d45464e9b388' class='external'>Tour
the Stairs code&lt;/a>. And it is not working yet. The drone moves slowly towards
the step (beware, if you would like to repeat my attempt do not forget to
attach the wheels and configure the minidrone to use them), but then it fails to
fly a little bit up. The step friction is too big. You can take off in a free
area but it is a bit scary/dangerous.&lt;/div>

&lt;div class='p'>Note that now two more bytes are correctly interpreted (see
&lt;a href='https://github.com/robotika/jessica/commit/48c1b584b6bfdd23c87dd3e17ba2735946786313' class='external'>diff&lt;/a>).
Instead of a 16-bit value there are two 8-bit values, turn right/left and up/down.
Both values are probably percentages, i.e. 100 is the maximum and you can use -100
for the opposite direction.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141115">&lt;/a>&lt;/div>

&lt;h2>15th November 2014 &amp;mdash; Hints&lt;/h2>

&lt;div class='p'>I am no longer alone in fighting this minidrone-stair-climbing task &lt;span class='smile'>&lt;/span>. And the
other guy knows some great details.&lt;/div>

&lt;div class='p'>For example, I can confirm that the last four bytes in &lt;i>AT*PCMD&lt;/i> should be
interpreted as a float number. I am still not absolutely sure what it does (I
have to test it) &amp;mdash; is it perhaps a multiplier for the other &lt;b>byte&lt;/b> parameters?
Do you remember me writing that the first 16-bit number looks like forward
speed (see &lt;a href='https://projects.ardrone.org/boards/1/topics/show/6844' class='external'>forum&lt;/a>)?
Well, I was wrong. The same command is used also for flying and there would be a
missing parameter for tilt left/right. When you are rolling on the wheels you
will not notice it (yeah, I am just trying to find an excuse &lt;span class='wink'>&lt;/span>). Now I have replayed
all my old &lt;i>FreeFlight3&lt;/i> logs and I can confirm that both
forward/backward and tilt left/right values are within the range -100..100 (see the
fixed
&lt;a href='https://github.com/robotika/jessica/commit/7e5686dcc060480c80caf9fad83db44518112b64' class='external'>parse.py
diff&lt;/a>).&lt;/div>
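
&lt;div class='p'>A small Python sketch of this interpretation (the exact byte offsets within the packet are only my assumption, and the meaning of the float is still untested):&lt;/div>

&lt;pre>import struct

def parse_pcmd_tail(packet):
    # assumed layout of the end of the command: two signed 8-bit values
    # (tilt left/right, forward/backward) in -100..100, followed by four
    # bytes forming a little-endian float
    tilt, forward = struct.unpack('bb', packet[-6:-4])
    multiplier = struct.unpack('&lt;f', packet[-4:])[0]
    return tilt, forward, multiplier&lt;/pre>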

&lt;hr/>

&lt;div class='p'>&lt;a id="141120">&lt;/a>&lt;/div>

&lt;h2>20th November 2014 &amp;mdash; Takeoff and landing&lt;/h2>

&lt;div class='p'>You probably noticed that there was no „homologation video” posted before the
deadline of 17th November 2014 :-(. I overcame some problems like the interpretation
of the first two bytes in the &lt;i>AT*PCMD&lt;/i> command: when I sent 8000 as a 16-bit
integer it moved forward at a reasonable speed, but when I set only the byte part of
it (hex(8000)='0x1f40', i.e. 0x40 and 0x1F) and left the second byte zero, it
did not move at all. The explanation was easy &amp;mdash; the byte for moving forward is not
the first one but the second one. And the tilt byte helped to roll.&lt;/div>
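
&lt;div class='p'>The confusion is easy to reproduce in Python (just a sketch of the byte splitting, not of the real command encoder):&lt;/div>

&lt;pre>import struct
# 8000 as a little-endian 16-bit integer splits into the bytes 0x40 and 0x1F
print(struct.pack('&lt;h', 8000).hex())   # '401f'
# so the second byte (0x1F = 31) is the forward speed and the first one
# (0x40 = 64) is apparently the tilt&lt;/pre>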

&lt;div class='p'>A friend of mine (PetrS) suggested using nicer code via
&lt;a href='https://docs.oracle.com/javase/7/docs/api/java/io/ByteArrayOutputStream.html' class='external'>ByteArrayOutputStream&lt;/a>
and
&lt;a href='https://docs.oracle.com/javase/7/docs/api/java/io/DataOutputStream.html' class='external'>DataOutputStream&lt;/a>
but I did not properly solve the endianness of the &lt;i>float&lt;/i> number (see
&lt;a href='https://github.com/robotika/jessica/commit/9506e0c7438d78299f4c431ded688b519eaf91a4' class='external'>diff&lt;/a>).&lt;/div>

&lt;div class='p'>So what is the problem? I switched to &lt;i>manual control&lt;/i> and used
&lt;i>FreeFlight3&lt;/i>, and I did not manage to climb the step even after several
attempts. Sigh. So it is hard to convince the machine to do it autonomously if
I cannot do it manually. Note that I was only driving on the floor and near
the step I switched to full power up, but it hardly lifted and it looked very
dangerous.&lt;/div>

&lt;div class='p'>My last chance is &lt;b>to fly&lt;/b> to the upper step. So last night I pushed
&lt;i>takeoff&lt;/i> and a second later &lt;i>land&lt;/i>. I was very, very surprised how smoothly
it went up and down compared to the older &lt;a href='/robots/heidi/en'>&lt;span class='cs'>AR Drone 2.0&lt;/span>&lt;/a>.
Again I parsed &lt;i>btsnoop_hci.log&lt;/i> and found these new commands there:&lt;/div>

&lt;pre>02 40 20 0D 00 09 00 04 00 52 43 00 04 05 02 00 01 00
&amp;hellip;
02 40 20 0D 00 09 00 04 00 52 43 00 04 06 02 00 03 00
&amp;hellip;
02 40 20 0D 00 09 00 04 00 52 43 00 04 07 02 00 03 00&lt;/pre>

&lt;div class='p'>The first is &lt;i>takeoff&lt;/i> but I was not sure which one is &lt;i>landing&lt;/i>. Note the
5th number from the right &amp;mdash; it is again increasing, like for &lt;i>AT*PCMD&lt;/i>, and
&lt;b>0x43&lt;/b> is actually the handle for settings. See the &lt;a href='/robots/jessica/en#141112'>older
post&lt;/a> and compare this handle with setting the date and time &lt;span class='smile'>&lt;/span>. So the &lt;i>landing&lt;/i>
command was pressed twice.&lt;/div>
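
&lt;div class='p'>A minimal sketch of how such a simple command could be assembled, based only on my reading of the dumps above (&lt;i>write_handle&lt;/i> is a hypothetical placeholder for the real GATT write):&lt;/div>

&lt;pre># value written to handle 0x43: 0x04, a sequence counter, then the command id
# (02 00 01 00 looks like takeoff, 02 00 03 00 like land)
def simple_cmd(write_handle, seq, cmd_bytes):
    write_handle(0x43, bytes([0x04, seq &amp; 0xFF]) + bytes(cmd_bytes))

# simple_cmd(write_handle, 5, [0x02, 0x00, 0x01, 0x00])   # takeoff
# simple_cmd(write_handle, 7, [0x02, 0x00, 0x03, 0x00])   # land&lt;/pre>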

&lt;div class='p'>Now it is early in the morning so I do not want to try autonomous
takeoff/landing, but I will probably
&lt;a href='https://github.com/robotika/jessica/commit/5775adc37e5c8ad3b7646eadd97ce6f014206066' class='external'>publish
the code&lt;/a>, so if anybody is interested you will know what I am talking
about.&lt;/div>

&lt;div class='p'>The plan is to take off to 20cm, fly forward and land. Unfortunately I did not
find out how to collect data like altitude, absolute position, or even whether the
takeoff/landing is complete &amp;hellip; so it will be controlled only by time. At the
moment it is probably „the simplest thing which could possibly work” &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>p.s. sure enough it flew to 1 meter and then it refused to land &amp;hellip; I am glad
that the old trick of taking the drone by the frame (here the wheels) and turning
it upside down still works &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141121">&lt;/a>&lt;/div>

&lt;h2>21st November 2014 &amp;mdash; Emergency Stop&lt;/h2>

&lt;div class='p'>Last night I confirmed that it was not an accident but a „feature” that the
Rolling Spider did not want to land shortly after takeoff. I tried to repeat landing
during takeoff in FF3 (Free Flight 3), and it does nothing until you finish the
takeoff sequence.&lt;/div>

&lt;div class='p'>So there are two possibilities now:&lt;/div>

&lt;ul>
&lt;li>&lt;i>climb&lt;/i> the stairs with a 1m-high takeoff/land sequence&lt;/li>

&lt;li>use &lt;b>Emergency Stop&lt;/b> to cut all motors even during takeoff&lt;/li>
&lt;/ul>

&lt;div class='p'>The byte-sequence for Emergency Stop is:&lt;/div>

&lt;pre>02 40 20 0D 00 09 00 04 00 52 46 00 04 01 02 00 04 00
&amp;hellip;
02 40 20 0D 00 09 00 04 00 52 46 00 04 02 02 00 04 00&lt;/pre>

&lt;div class='p'>So it is again another &lt;i>handle&lt;/i> (0x46) and it has its own counter (5th byte
from the end) &amp;mdash; very similar to takeoff/land.&lt;/div>
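
&lt;div class='p'>In the sketch above it would be the same helper, only with handle 0x46 and its own counter (again just my reading of the dump):&lt;/div>

&lt;pre>def emergency_stop(write_handle, seq46):
    # same layout as takeoff/land, but written to handle 0x46
    write_handle(0x46, bytes([0x04, seq46 &amp; 0xFF, 0x02, 0x00, 0x04, 0x00]))&lt;/pre>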

&lt;div class='p'>This
&lt;a href='https://github.com/robotika/jessica/commit/eece8c18ede19e1bfa4902e7083139f82983302b' class='external'>code&lt;/a>
managed to „jump” onto the step (the AR Drone 2.0 paper box), but it failed
during the video recording and now the battery needs recharging &amp;hellip; so the next test
is tomorrow.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141123">&lt;/a>&lt;/div>

&lt;h2>23rd November 2014 &amp;mdash; Battery packet&lt;/h2>

&lt;div class='p'>A few days ago I came across these inputs:&lt;/div>

&lt;pre>71.973 1: 02 40 20 0E 00 0A 00 04 00 1B BF 00 02 06 00 05 01 00 5F
72.108 1: 02 40 20 0E 00 0A 00 04 00 1B BF 00 02 07 00 05 01 00 5E
&amp;hellip;
72.940 1: 02 40 20 0E 00 0A 00 04 00 1B BF 00 02 0E 00 05 01 00 57&lt;/pre>

&lt;div class='p'>The log was from an experiment in &lt;i>FreeFlight3&lt;/i> with takeoff and I remembered
that the battery status at the end was 87%. The last number in the packet was
decreasing by one and the final one, 0x57, is 87 &lt;span class='smile'>&lt;/span> &amp;hellip; so I think that now we
have the „battery packet”.&lt;/div>
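
&lt;div class='p'>A one-line sanity check of that reading (just a sketch; the battery percentage seems to be simply the last byte of the notification):&lt;/div>

&lt;pre>hex_line = "72.940 1: 02 40 20 0E 00 0A 00 04 00 1B BF 00 02 0E 00 05 01 00 57"
packet = bytes(int(x, 16) for x in hex_line.split()[2:])
print(packet[-1])   # 87, i.e. the remaining battery percentage&lt;/pre>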

&lt;div class='p'>It was necessary to add more notifications (see
&lt;a href='https://github.com/robotika/jessica/commit/c10503324649b73b3c2871073d5492917d8c9f89' class='external'>code&lt;/a>).
The plan is to enable notifications on all available &lt;i>handles&lt;/i> but time is
getting short
(&lt;a href='http://cafe-neu-romance.com/home/cafe-neu-romance-2014-program' class='external'>the contest&lt;/a>
is on 29th November 2014). At least I finally recorded the
&lt;a href='http://youtu.be/5sct8ynYdu0' class='external'>homologation video&lt;/a> („jump” one step).&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141124">&lt;/a>&lt;/div>

&lt;h2>24th November 2014 &amp;mdash; RC toy?! (5 days)&lt;/h2>

&lt;div class='p'>Bad news &amp;mdash; it looks like the current version of the Rolling Spider does not
support sending speed/position/altitude status :-(. The machine has to know
it, but it is not necessary for the &lt;i>Free Flight 3&lt;/i> application. Experimentally I
enabled all available notifications (see
&lt;a href='https://github.com/robotika/jessica/commit/c02c494a3cc12ad4c6fae8cff917c33b40343f1f' class='external'>diff&lt;/a>)
and I have not seen anything „interesting”.&lt;/div>

&lt;div class='p'>So what can we do without any feedback? I tried &lt;i>takeoff&lt;/i> with the &lt;i>down&lt;/i>
command &amp;hellip; a crazy combination, but it still moves up anyway. What I would like to
try is to &lt;i>takeoff&lt;/i>, wait until it is &lt;i>flying&lt;/i> and then &lt;i>move down&lt;/i>
without &lt;i>landing&lt;/i>. For that we will need to know the &lt;i>flying status&lt;/i>.&lt;/div>

&lt;div class='p'>With some help, these should be the messages:&lt;/div>

&lt;pre>0: 02 40 20 0D 00 09 00 04 00 52 43 00 04 05 02 00 01 00 = takeoff
1: 02 40 20 11 00 0D 00 04 00 1B BC 00 04 15 02 03 01 00 01 00 00 00 = takingoff&lt;/pre>

&lt;pre>0: 02 40 20 0D 00 09 00 04 00 52 43 00 04 07 02 00 03 00 = land
1: 02 40 20 11 00 0D 00 04 00 1B BC 00 04 18 02 03 01 00 04 00 00 00 = landing
1: 02 40 20 11 00 0D 00 04 00 1B BC 00 04 19 02 03 01 00 00 00 00 00 = landed&lt;/pre>

&lt;pre>0: 02 40 20 0D 00 09 00 04 00 52 46 00 04 01 02 00 04 00 = emergency cmd
1: 02 40 20 11 00 0D 00 04 00 1B BC 00 04 02 02 03 01 00 05 00 00 00 = emergency&lt;/pre>
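
&lt;div class='p'>From these pairs it looks like the interesting value is the 4th byte from the end of the notification; a small sketch of how I would decode it (the mapping is only what the dumps above suggest):&lt;/div>

&lt;pre># the flying status seems to be the 4th byte from the end of the notification
FLYING_STATES = {0: "landed", 1: "takingoff", 4: "landing", 5: "emergency"}

def flying_state(notification):
    return FLYING_STATES.get(notification[-4], "unknown")&lt;/pre>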

&lt;hr/>

&lt;div class='p'>&lt;a id="141125">&lt;/a>&lt;/div>

&lt;h2>25th November 2014 &amp;mdash; Takeoff workaround (4 days)&lt;/h2>

&lt;div class='p'>Finally there was more serious testing today. I experimented with a
&lt;a href='https://github.com/robotika/jessica/commit/5da56ba721916021f00064070f027c6030be0813' class='external'>takeoff
workaround&lt;/a>, i.e. a complete &lt;i>takeoff sequence&lt;/i>, a return back to the ground
with the &lt;i>move down&lt;/i> command and then &lt;i>move forward on the ground&lt;/i>. The results
were interesting:&lt;/div>

&lt;ul>
&lt;li>sometimes the notifications failed to turn on (i.e. the drone was infinitely
waiting for the takeoff sequence to complete, or there was no data about low
battery)&lt;/li>

&lt;li>when the drone reached the ground with the &lt;i>move down&lt;/i> command it was
oscillating back and forth for a while&lt;/li>

&lt;li>as the battery got lower and lower the drone finished &lt;i>takeoff&lt;/i> and switched
to &lt;i>hovering&lt;/i> even 20cm above the ground (instead of 1 meter)&lt;/li>
&lt;/ul>

&lt;div class='p'>The main problem was going „against the wall” without any feedback. Jessica
just hit it hard and jumped back. This would mean falling down from the stairs.
It is possible to limit the vertical speed but the minimal value you can set is
0.5m/s. Maybe it is just a limitation of &lt;i>Free Flight 3&lt;/i>, and the speed could
be set lower?? But the battery was dead and I did not want to wait at school for 90
minutes until it was fully charged.&lt;/div>

&lt;div class='p'>p.s. one new byte sequence:&lt;/div>

&lt;pre>1: 02 40 20 11 00 0D 00 04 00 1B BC 00 04 04 02 03 01 00 02 00 00 00 = hovering status&lt;/pre>

&lt;hr/>

&lt;div class='p'>&lt;a id="141126">&lt;/a>&lt;/div>

&lt;h2>26th November 2014 &amp;mdash; Ver0 revised (3 days)&lt;/h2>

&lt;div class='p'>First of all I would like to thank
&lt;a href='http://www.icornerhightech.cz' class='external'>icornerhightech.cz&lt;/a> for lending me another
drone (Jessica B./Blue/Backup) and an extra battery. Now I should survive the 8 rounds
on Saturday with much lower risk (I wanted to write „without any problem” but
we all know that something will „show up”).&lt;/div>

&lt;div class='p'>Second, some issues I saw yesterday were not caused by low battery but by
re-starting the testing application. If I turn the minidrone off and on again
it usually works fine.&lt;/div>

&lt;div class='p'>Third, I wrote &lt;i>usually&lt;/i> because once it did not work fine and I experienced
a very ugly crash. Even the wheels and one propeller fell off :-(. Maybe bad
calibration at the start?? The drone just went up at a 20deg angle instead of a
vertical takeoff, hit the wall and fell on the ground. It still runs fine, so
the mechanics are very robust!&lt;/div>

&lt;div class='p'>I do not know if I already mentioned the
&lt;a href='https://github.com/robotika/jessica/commit/f9e7200e75886044b179747105ddcfb4bbe8f004' class='external'>battery
and status info&lt;/a> in the testing application. Now you can see whether any
updates are coming, whether the status is changing and how much battery is left.&lt;/div>

&lt;div class='p'>Today we were testing in the NTK Gallery and I had to
&lt;a href='https://github.com/robotika/jessica/commit/90ef2999fdac96193c00dbb5d401d610220ea76d' class='external'>tune
ver0&lt;/a> &amp;mdash; with 15 repetitions Jessica jumped to the 2nd step instead of the
1st one. 12 repetitions = 0.6s was mostly just right, but when the battery
was at 58% Jessica was not able to climb the step (sure enough, in front of a TV
camera).&lt;/div>

&lt;div class='p'>Because of the scary experience with one takeoff I will probably continue from
&lt;i>ver0&lt;/i> (see the function &lt;i>ver0ex()&lt;/i>, where I called &lt;i>ver0&lt;/i> three times but
was not able to reach the 3rd step yet). So &lt;i>takeoff&lt;/i> while moving forward
and &lt;i>emergency stop&lt;/i> is the way.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141128">&lt;/a>&lt;/div>

&lt;h2>28th November 2014 &amp;mdash; !!!STOP!!! (1 day)&lt;/h2>

&lt;div class='p'>Tonight I was preparing the „contest version”. It is nothing great, just
a few changes to
&lt;a href='https://github.com/robotika/jessica/commit/79c896e5861bfd0b223ece9ee72a09c4c3e74f80' class='external'>the last
commit&lt;/a>:&lt;/div>

&lt;ul>
&lt;li>the STOP button &lt;b>has to be&lt;/b> separate &amp;mdash; the drone is dangerous and it is very
bad to have to guess whether you switched it on or off&lt;/li>

&lt;li>if you disconnect and connect again it works fine, as if you had reset the drone&lt;/li>

&lt;li>the height of takeoff during the first 12/20 to 18/20 seconds is quite random and
probably depends on the friction of the wheels against the step, battery status, &amp;hellip; ???&lt;/li>

&lt;li>Jessica Blue can be controlled with the same program, but while the Red One
flies almost 20cm the Blue One hardly takes off 5cm &amp;hellip; so I will use it as a
charger only.&lt;/li>

&lt;li>I swapped &lt;i>approachStep&lt;/i> with &lt;i>ver0&lt;/i> so there is at least a chance to
score for the first step&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;div class='p'>&lt;a id="141130">&lt;/a>&lt;/div>

&lt;h2>30th November 2014 &amp;mdash; Tour the Stairs&lt;/h2>

&lt;div class='p'>The competition &lt;a href='/competitions/tour-the-stairs/en'>Tour the Stairs&lt;/a> is over.
Jessica finished in 4th place (i.e. the last place) but I would still consider
it a success. In the best attempts she was able to jump/climb two steps, but the
&lt;i>emergency stop&lt;/i> cutting the power to stop the &lt;i>takeoff&lt;/i> always disturbed it
too much.&lt;/div>

&lt;div class='p'>For the last two runs (straight and spiral staircase) the other team from Písek
convinced me to try the „prohibited” &lt;b>strategy with a rope&lt;/b>. It was attached to
the battery holder and was approximately 1m long. I changed the code (see
&lt;a href='https://github.com/robotika/jessica/commit/32abb3967d3974a13ec8c2383ec73a341824b57f' class='external'>diff&lt;/a>)
to complete the &lt;i>takeoff&lt;/i> and to fly forward with the power set to 10%. Now the
hardest part was to enter the staircase (there were railings on both sides).
Once it succeeded it was really superior to the previous approach &amp;mdash; I cut the
first try short at the staircase rest area, but after the contest, just for video
documentation, Jessica was able to climb the whole staircase &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>So there is another difference between the old &lt;i>AR Drone 2.0&lt;/i> and the &lt;i>Rolling
Spider&lt;/i> (I think this change also happened between the old &lt;i>AR Drone&lt;/i> and the
next generation). As I said, I was flying &lt;b>forward only&lt;/b> but I never hit the
steps. So the altitude is only relative and not absolute to the ground as for the
&lt;i>AR Drone 2.0&lt;/i>. On the other hand it could be caused by sensor confusion
because of the rope (bad sonar or camera readings?), so I am not sure.&lt;/div>

&lt;div class='p'>What next? Well, I hope I will receive some videos of the contest (the official one
will be in January, but something else could come next week). And that is going
to be the end of this blog (both drones will be returned this Tuesday).&lt;/div>

&lt;div class='p'>p.s. I still hope that &lt;a href='http://www.parrot.com/usa/' class='external'>Parrot&lt;/a> will revise the
firmware to send drone data (time, speed, altitude, readings from the sonar and
camera, an estimate of position) and that it will be possible to &lt;i>stream&lt;/i> some
pictures over Bluetooth (now you can save them and then transfer them, but I have
not tried that during flight yet).&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/robots/jessica/Rolling_Spider_2014-11-29T142036_0000_3EB0A851644090E964BFD8B616F7F977.jpg'>&lt;img src='/robots/jessica/Rolling_Spider_2014-11-29T142036_0000_3EB0A851644090E964BFD8B616F7F977_t.jpg' alt='What is the FOV of the camera?' title='What is the FOV of the camera?' class='border'  width='320' height='240'/>&lt;/a>&lt;br/>
&lt;a href='/robots/jessica/Rolling_Spider_2014-11-29T142036_0000_3EB0A851644090E964BFD8B616F7F977.jpg'>What is the FOV of the camera?&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="141204">&lt;/a>&lt;/div>

&lt;h2>4th December 2014 &amp;mdash; Parrot released ARDroneSDK3!&lt;/h2>

&lt;div class='p'>Today I received two interesting e-mails related to the &lt;i>Rolling Spider&lt;/i>. The
first one pointed me to &lt;a href='https://github.com/valentin-bas/RSControl' class='external'>another
github repository&lt;/a> &amp;hellip; so somebody else was trying to modify the Android
BluetoothLE example like me &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>The second mail trumps the first one:&lt;/div>

&lt;h1>&lt;a href='https://github.com/ARDroneSDK3' class='external'>!!! PARROT RELEASED ARDroneSDK3 !!!&lt;/a>&lt;/h1>

&lt;div class='p'>It is probably not publicly announced yet (?), but the github is ready. If you are
lost in these tons of source code I would recommend starting from
&lt;a href='https://github.com/ARDroneSDK3/libARCommands/blob/master/Xml/ARDrone3_commands.xml' class='external'>ARDrone3_commands.xml&lt;/a>,
where all the commands are briefly described.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/robots/jessica/en#email'>contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Results</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2014/results/en"/>
	<id>http://localhost/competitions/robotour/2014/results/en</id>
	<updated>2014-09-23T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> So how was Robotour, the ninth year of the outdoor contest of autonomous robots? I
would almost introduce a new motto: „if you would like to check out your robot,
take it to Robotour!”, because this year it was really hard, especially the
start at 2pm &amp;hellip; but now I am jumping ahead.

 </summary>
	<content type='html'> 
&lt;h2>Robotour 2014 in a nutshell&lt;/h2>

&lt;div class='p'>It rained this year. And it was not just a few drops &amp;mdash; it was a serious
pelting rain, which could easily flood out some robots. So that is probably the
first unforgettable memory of this year's Robotour in Pilsen.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/osm-path.jpg'>&lt;img src='/competitions/robotour/2014/results/osm-path_t.jpg' alt='Do you see the path right down?!' title='Do you see the path right down?!' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/osm-path.jpg'>Do you see the path right down?!&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Another positive memory is that all 12+1 teams arrived (well, the new team
&lt;i>Blade XXII&lt;/i> kept that uncertain till Saturday). There were &lt;b>five new teams&lt;/b> and all
fought bravely, admittedly some with their own robots and software, but
&amp;hellip;  &lt;b>AmBot&lt;/b> finished in the „winner box” and the Polish
&lt;b>TAPAS Team&lt;/b> also had big chances, being the favorite in the first runs (they did not start in the 3rd,
pelting-rain run, which was a wise decision (*)).&lt;/div>

&lt;div class='p'>The scores were very low this year. The paths were narrower than in Lodz and some
were grass ones. The extreme case is the path in the picture on the right, which is mapped in
OSM. So it was harder to judge and there were debates about some not perfectly
mapped pieces of road. Also the 3rd, rainy round was a source of some
disappointment.&lt;/div>

&lt;div class='p'>Although it was quite demanding, I am happy about &lt;i>Robotour 2014&lt;/i> &amp;mdash; by the
way, is that not the reason why we are doing it? &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;h2>A few quick pictures&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/tent.jpg'>&lt;img src='/competitions/robotour/2014/results/tent_t.jpg' alt='Safety tent' title='Safety tent' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/tent.jpg'>Safety tent&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/e-liska.jpg'>&lt;img src='/competitions/robotour/2014/results/e-liska_t.jpg' alt='Naked E-liška' title='Naked E-liška' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/e-liska.jpg'>Naked E-liška&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/cogito.jpg'>&lt;img src='/competitions/robotour/2014/results/cogito_t.jpg' alt='The half of Cogito' title='The half of Cogito' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/cogito.jpg'>The half of Cogito&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/mars.jpg'>&lt;img src='/competitions/robotour/2014/results/mars_t.jpg' alt='MarS' title='MarS' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/mars.jpg'>MarS&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/plecharts.jpg'>&lt;img src='/competitions/robotour/2014/results/plecharts_t.jpg' alt='Plecharts' title='Plecharts' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/plecharts.jpg'>Plecharts&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/warning.jpg'>&lt;img src='/competitions/robotour/2014/results/warning_t.jpg' alt='Warning' title='Warning' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/warning.jpg'>Warning&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Overall score&lt;/h2>

&lt;div class='p'>&lt;table border="1">
	&lt;tr>
		&lt;th>Order&lt;/th>
		&lt;th>Team&lt;/th>
		&lt;th>1st run&lt;/th>
		&lt;th>2nd run&lt;/th>
		&lt;th>3rd run&lt;/th>
		&lt;th>4th run&lt;/th>
		&lt;th>Total&lt;/th>
	&lt;/tr>
	&lt;tr bgcolor="yellow">
		&lt;td align="center">1st&lt;/td>
		&lt;td>&lt;b>Smelý Zajko&lt;/b>&lt;/td>
		&lt;td>46&lt;/td>
		&lt;td>40&lt;/td>
		&lt;td>64&lt;/td>
		&lt;td>80&lt;/td>
		&lt;td>230&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">2nd-3rd&lt;/td>
		&lt;td>&lt;b>Radioklub Písek&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>179&lt;/td>
		&lt;td>181&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">2nd-3rd&lt;/td>
		&lt;td>&lt;b>AmBot&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>16&lt;/td>
		&lt;td>127&lt;/td>
		&lt;td>37&lt;/td>
		&lt;td>180&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">4th-5th&lt;/td>
		&lt;td>&lt;b>TAPAS Team&lt;/b>&lt;/td>
		&lt;td>22&lt;/td>
		&lt;td>53&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>63&lt;/td>
		&lt;td>138&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">4th-5th&lt;/td>
		&lt;td>&lt;b>ARBot&lt;/b>&lt;/td>
		&lt;td>11&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>49&lt;/td>
		&lt;td>73&lt;/td>
		&lt;td>133&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">6th&lt;/td>
		&lt;td>&lt;b>Cogito&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>16&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>13&lt;/td>
		&lt;td>29&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">7th&lt;/td>
		&lt;td>&lt;b>NDTeam&lt;/b>&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>7&lt;/td>
		&lt;td>9&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">8th&lt;/td>
		&lt;td>&lt;b>JECC&lt;/b>&lt;/td>
		&lt;td>2&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>2&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">9th-12th&lt;/td>
		&lt;td>&lt;b>AutoLUT2&lt;/b>&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">9th-12th&lt;/td>
		&lt;td>&lt;b>Istrobotics&lt;/b>&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">9th-12th&lt;/td>
		&lt;td>&lt;b>MarS&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>-&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
	&lt;tr>
		&lt;td align="center">9th-12th&lt;/td>
		&lt;td>&lt;b>Plecharts&lt;/b>&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
		&lt;td>0&lt;/td>
	&lt;/tr>
&lt;/table>&lt;/div>

&lt;div class='p'>The winner, after five years of participation, is &lt;b>Smelý Zajko&lt;/b>!
Congratulations, also for the endurance and patience.&lt;/div>

&lt;h2>Workshop&lt;/h2>

&lt;div class='p'>There were only Czech and Slovak teams at the Saturday workshop, so after a while
it was easier to talk in our native languages. I was relatively tired from
Saturday, but the workshop gave me plenty of energy and optimism.&lt;/div>

&lt;div class='p'>At the beginning &lt;i>Smelý Zajko&lt;/i> described their hard way to the victory (they
have participated since 2010).&lt;/div>

&lt;div class='p'>&lt;i>Radioklub Písek&lt;/i> presented the story of E-liška's burned motors and the „current
magic” on the new controllers.&lt;/div>

&lt;div class='p'>&lt;i>Cogito&lt;/i> finished their presentation with the comment that it is not necessary to
build an autonomous car and that the goals of Robotour can be reached by an
autonomous bike, cart or skateboard. See the last page of their
&lt;a href='https://docs.google.com/presentation/d/13u0Uk4lubRXtuU52T9EAU8EJip3vkpXBY1JJkg8U1M4/edit?usp=sharing' class='external'>presentation&lt;/a>.&lt;/div>

&lt;div class='p'>[&lt;i>The objective of the Robotour contest is to encourage development of robots
capable of transporting you to work in the morning or to deliver the building
material you have just purchased in an online shop.  &lt;/i>]&lt;/div>

&lt;div class='p'>&lt;i>ARBot&lt;/i> showed their first results from stereo vision computed on a gate array.&lt;/div>

&lt;div class='p'>&lt;i>NDTeam&lt;/i>, working in the picosecond-timing business, presented their own robot
with an aluminium construction controlled by a 4-core computer the size of a
credit card.&lt;/div>

&lt;div class='p'>&lt;i>Istrobotics&lt;/i> mentioned the synchronization of PS3 web cameras (?) &amp;hellip; you just
have to solder the proper inputs, but there are some issues with Windows drivers
for multiple cameras etc.&lt;/div>

&lt;div class='p'>Simply all experts! The presentations often started like in alcohol/drug abuse
therapy: „I have been at Robotour since XYZ ...” Well, it is maybe addictive, so
beware! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;h2>Plans for Robotour 2015&lt;/h2>

&lt;div class='p'>Just brief points:&lt;/div>

&lt;ul>
&lt;li>the Friday homologation will not be mandatory
 (as was already the case this year)&lt;/li>

&lt;li>the order of teams in the second and subsequent rounds will be reversed, i.e. teams with
a high score will start at the end of the starting area&lt;/li>

&lt;li>the payload remains mandatory&lt;/li>

&lt;li>the runs take place in all weather conditions&lt;/li>
&lt;/ul>

&lt;div class='p'>Next year is an anniversary, so we are thinking about how to properly celebrate it.
The preference for the next location is still the Czech Republic, but there are other
alternatives like the
&lt;a href='http://de.wikipedia.org/wiki/Gro%C3%9Fer_Garten_%28Dresden%29' class='external'>Großer Garten
(Dresden)&lt;/a>. Smelý Zajko took some pictures on the way home in the park in Hluboká
nad Vltavou and commented: &lt;i>If Pilsen was difficulty level 4, Hluboká is
level 7!&lt;/i> &amp;hellip; so something to look forward to.&lt;/div>

&lt;h2>Links&lt;/h2>

&lt;ul>
&lt;li>&lt;a href='https://github.com/robotika/robotour' class='external'>Robotour GitHub&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://www.facebook.com/media/set/?set=a.333128763527550.1073741935.122209444619484&amp;amp;type=3' class='external'>Facebook of Westernbohemia university in Pilsen - photos&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://docs.google.com/presentation/d/13u0Uk4lubRXtuU52T9EAU8EJip3vkpXBY1JJkg8U1M4/edit?usp=sharing' class='external'>Cogito presentation&lt;/a>&lt;/li>

&lt;li>ARBot video: &lt;a href='http://youtu.be/L2liO9ZL1qU' class='external'>data analysis&lt;/a>, &lt;a href='http://youtu.be/_iPBaoqiFQo' class='external'>outside view&lt;/a>&lt;/li>

&lt;li>&lt;a href='http://youtu.be/GdDPSsLuyYI' class='external'>Greetings from Roboauto&lt;/a>&lt;/li>
&lt;/ul>

&lt;h2>Acknowledgement&lt;/h2>

&lt;div class='p'>I would like to especially thank Petr Weissar and his student for creating
the base. Without the tent we would not have survived the Saturday afternoon. Thanks
to the University of West Bohemia for the support: the dormitories next to the park were
very handy and the conference room in the
&lt;a href='http://www.plzen.eu/obcan/aktuality/z-mesta/zapadoceska-univerzita-otevrela-akademicke-centrum.aspx' class='external'>Academic
center with café íčko&lt;/a> in the city center was a nice alternative for the workshop.
&lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>If you find this contest interesting and would like to participate next year, or you have interesting pictures or videos, please let us know via the
&lt;a href='/competitions/robotour/2014/results/en#email'>contact form&lt;/a>.&lt;/div>

&lt;div class='p'>(*) I was reading the conclusions from last year in Poland and one of them was that a team can
give up one attempt without penalty. On the other hand, the &lt;i>TAPAS Team&lt;/i> would
score similarly, only &lt;i>Smelý Zajko&lt;/i> would have a lower score and the order would
be identical.&lt;/div>

&lt;hr/>

&lt;h1>Photos&lt;/h1>

&lt;div class='p'>authors: &lt;a href='http://fotozdekubik.wz.cz/' class='external'>Zdeněk Kubík&lt;/a> and Petr Weissar&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-cup.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-cup_t.jpg' alt='The Cup' title='The Cup' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-cup.jpg'>The Cup&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ambot_t.jpg' alt='AmBot' title='AmBot' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot.jpg'>AmBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ambot2_t.jpg' alt='AmBot' title='AmBot' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot2.jpg'>AmBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-arbot.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-arbot_t.jpg' alt='ARBot' title='ARBot' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-arbot.jpg'>ARBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-autolut2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-autolut2_t.jpg' alt='AutoLUT2' title='AutoLUT2' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-autolut2.jpg'>AutoLUT2&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-istrobotics.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-istrobotics_t.jpg' alt='Istrobotics' title='Istrobotics' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-istrobotics.jpg'>Istrobotics&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-jecc.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-jecc_t.jpg' alt='JECC' title='JECC' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-jecc.jpg'>JECC&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 153px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team5.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-tapas-team5_t.jpg' alt='TAPAS Team' title='TAPAS Team' class='border'  width='147' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team5.jpg'>TAPAS Team&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ambot3_t.jpg' alt='AmBot' title='AmBot' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot3.jpg'>AmBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot4.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ambot4_t.jpg' alt='AmBot' title='AmBot' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ambot4.jpg'>AmBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-arbot2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-arbot2_t.jpg' alt='ARBot' title='ARBot' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-arbot2.jpg'>ARBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-arbot-support.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-arbot-support_t.jpg' alt='ARBot support' title='ARBot support' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-arbot-support.jpg'>ARBot support&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-bezne-pivo.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-bezne-pivo_t.jpg' alt='Common beer' title='Common beer' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-bezne-pivo.jpg'>Common beer&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-cogito.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-cogito_t.jpg' alt='Cogito' title='Cogito' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-cogito.jpg'>Cogito&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-jecc2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-jecc2_t.jpg' alt='JECC' title='JECC' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-jecc2.jpg'>JECC&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-mars.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-mars_t.jpg' alt='MarS' title='MarS' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-mars.jpg'>MarS&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-mars2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-mars2_t.jpg' alt='MarS' title='MarS' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-mars2.jpg'>MarS&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-mars3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-mars3_t.jpg' alt='MarS' title='MarS' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-mars3.jpg'>MarS&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ndteam.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ndteam_t.jpg' alt='NDTeam' title='NDTeam' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ndteam.jpg'>NDTeam&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ndteam2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ndteam2_t.jpg' alt='NDTeam' title='NDTeam' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ndteam2.jpg'>NDTeam&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-ndteam3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-ndteam3_t.jpg' alt='NDTeam' title='NDTeam' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-ndteam3.jpg'>NDTeam&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-plecharts.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-plecharts_t.jpg' alt='Plecharts' title='Plecharts' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-plecharts.jpg'>Plecharts&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-plecharts2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-plecharts2_t.jpg' alt='Plecharts' title='Plecharts' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-plecharts2.jpg'>Plecharts&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-radioklub-pisek.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-radioklub-pisek_t.jpg' alt='Radioklub Písek' title='Radioklub Písek' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-radioklub-pisek.jpg'>Radioklub Písek&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-radioklub-pisek2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-radioklub-pisek2_t.jpg' alt='Radioklub Písek' title='Radioklub Písek' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-radioklub-pisek2.jpg'>Radioklub Písek&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-radioklub-pisek3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-radioklub-pisek3_t.jpg' alt='Radioklub Písek' title='Radioklub Písek' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-radioklub-pisek3.jpg'>Radioklub Písek&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-rain.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-rain_t.jpg' alt='Rain' title='Rain' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-rain.jpg'>Rain&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-robots.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-robots_t.jpg' alt='Robots' title='Robots' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-robots.jpg'>Robots&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-smely-zajko.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-smely-zajko_t.jpg' alt='Smelý Zajko' title='Smelý Zajko' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-smely-zajko.jpg'>Smelý Zajko&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-smely-zajko2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-smely-zajko2_t.jpg' alt='Smelý Zajko' title='Smelý Zajko' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-smely-zajko2.jpg'>Smelý Zajko&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-smely-zajko3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-smely-zajko3_t.jpg' alt='Smelý Zajko' title='Smelý Zajko' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-smely-zajko3.jpg'>Smelý Zajko&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-start2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-start2_t.jpg' alt='Start 2' title='Start 2' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-start2.jpg'>Start 2&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-start2b.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-start2b_t.jpg' alt='Start 2' title='Start 2' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-start2b.jpg'>Start 2&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-start3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-start3_t.jpg' alt='Start 3' title='Start 3' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-start3.jpg'>Start 3&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-start4.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-start4_t.jpg' alt='Start 4' title='Start 4' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-start4.jpg'>Start 4&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-start4b.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-start4b_t.jpg' alt='Start 4' title='Start 4' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-start4b.jpg'>Start 4&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-tapas-team_t.jpg' alt='TAPAS Team' title='TAPAS Team' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team.jpg'>TAPAS Team&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team2.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-tapas-team2_t.jpg' alt='TAPAS Team' title='TAPAS Team' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team2.jpg'>TAPAS Team&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team3.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-tapas-team3_t.jpg' alt='TAPAS Team' title='TAPAS Team' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team3.jpg'>TAPAS Team&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team4.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-tapas-team4_t.jpg' alt='TAPAS Team' title='TAPAS Team' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-tapas-team4.jpg'>TAPAS Team&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-tent.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-tent_t.jpg' alt='Stan' title='Stan' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-tent.jpg'>Stan&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-e-liska.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-e-liska_t.jpg' alt='E-liška and ZCU' title='E-liška and ZCU' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-e-liska.jpg'>E-liška and ZCU&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-wet-coordinates.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-wet-coordinates_t.jpg' alt='Wet coordinates' title='Wet coordinates' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-wet-coordinates.jpg'>Wet coordinates&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/results/zcu-robo-art.jpg'>&lt;img src='/competitions/robotour/2014/results/zcu-robo-art_t.jpg' alt='Robo-art' title='Robo-art' class='border'  width='220' height='147'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotour/2014/results/zcu-robo-art.jpg'>Robo-art&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>
 </content>
</entry>
<entry>
	<title>Introduction of teams</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2014/teams/en"/>
	<id>http://localhost/competitions/robotour/2014/teams/en</id>
	<updated>2014-08-12T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Nice! Fibonacci would be happy &amp;mdash; Switzerland, Germany, Poland, Slovakia and
the Czech Republic. There are 12 teams registered in total. The distribution is,
when compared to previous years, more international this time &lt;span class='smile'>&lt;/span>. So when and
where can you see the teams compete? &lt;b>20th September 2014, Borský park,
Plzeň/Czech Republic&lt;/b>. Concurrent starts of all robots at &lt;b>10am, 11am, 2pm and
3pm&lt;/b>.
 </summary>
	<content type='html'> 
&lt;h1>Teams&lt;/h1>

&lt;h1>&lt;a href='http://www.youtube.com/playlist?list=PL2gPpyBs1e23NtQPoMnl41gyRO8zNljba' class='external'>YouTube playlist of all registered teams&lt;/a>&lt;/h1>

&lt;h2>&lt;a href='http://ambot6.webnode.cz/' class='external'>AmBot&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/ambot.jpg'>&lt;img src='/competitions/robotour/2014/teams/ambot_t.jpg' alt='' title='' class='border'  width='320' height='191'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/RC306Ex3aVU' class='external'>http://youtu.be/RC306Ex3aVU&lt;/a>&lt;/div>

&lt;div class='p'>Robot Ferda is a modified kids' electric car ("ride-on") for Robotour 2014. The
main control system is Arduino based, with an ATmega2560. It takes care of
motor control and integrates a magnetometer and two sonars for obstacle detection.
It also reads data from an external GPS receiver via a Bluetooth converter. The Arduino
software provides the possibility to define GPS waypoints and the car tries to
navigate by them. It can also accept commands from the Bluetooth converter. The
goal is to extend the system with an Android smartphone running a simple application
with visual navigation (to keep the robot on the road).&lt;/div>

&lt;h2>&lt;a href='http://www.arbot.cz' class='external'>ARBot&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/arbot.jpg'>&lt;img src='/competitions/robotour/2014/teams/arbot_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/ve-CoNReFTw' class='external'>http://youtu.be/ve-CoNReFTw&lt;/a>&lt;/div>

&lt;div class='p'>ARBot is a small robotic vehicle constructed for outdoor competitions of
autonomous robots. The robot has a four-wheel chassis; each wheel is powered and
has an encoder. The robot has a camera, GPS, an AHRS unit and three sonars. The computation
is handled by a DSP BF537 with 1000 MIPS of computing power.&lt;/div>

&lt;h2>AutoLUT2 (PL)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/autolut2.jpg'>&lt;img src='/competitions/robotour/2014/teams/autolut2_t.jpg' alt='' title='' class='border'  width='320' height='181'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://www.youtube.com/watch?v=LBPyjKe679Y' class='external'>https://www.youtube.com/watch?v=LBPyjKe679Y&lt;/a>&lt;/div>

&lt;div class='p'>The vehicle is driven by a 36V DC motor and it can turn using an electric
ram. It is controlled by an Arduino microcontroller and an additional
computer.&lt;/div>

&lt;h2>Blade XXII (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/bladexxii.jpg'>&lt;img src='/competitions/robotour/2014/teams/bladexxii_t.jpg' alt='' title='' class='border'  width='320' height='177'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://www.youtube.com/watch?v=BnWsfl_jUJA' class='external'>https://www.youtube.com/watch?v=BnWsfl_jUJA&lt;/a>&lt;/div>

&lt;div class='p'>Leopard Pro 36 converted to electric power
STM32F103 - motor/servo control, sensors
Radxa Rock - navigation, optical recognition&lt;/div>

&lt;h2>&lt;a href='https://sites.google.com/site/cogitoteam/robotour-2014' class='external'>Cogito&lt;/a> (CH/CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/cogito.jpg'>&lt;img src='/competitions/robotour/2014/teams/cogito_t.jpg' alt='' title='' class='border'  width='320' height='179'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/2H_66EUvTKg' class='external'>http://youtu.be/2H_66EUvTKg&lt;/a>&lt;/div>

&lt;div class='p'>Lots of hardware, lots of software, lots of fun.&lt;/div>

&lt;div class='p'>All the electronics of the robot B-trix are, as in 2012, attached to a 1:5 electro-chassis
and the low-level control is handled by an Arduino Duemilanove. There was a serious upgrade
of the sensoric part - a new laser rangefinder, the compass has inclination compensation,
the number of sonars tripled, the "ordinary" camera was replaced by a stereo camera.
The Xtion remained but nobody expects anything from it. GPS, magnetic encoders,
gyroscopes and accelerometers are common sensors in this contest.&lt;/div>

&lt;div class='p'>High-level control is managed by a mini-ITX board with an Atom processor. There are so
many Ethernet toys that the robot carries its own intranet.  The software is a mixture of
Python, C++, C and bash. Plenty of vision, plenty of planning, but it is
almost impossible to compute it all in time.&lt;/div>

&lt;h2>Istrobotics (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/istrobotics.jpg'>&lt;img src='/competitions/robotour/2014/teams/istrobotics_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/395K47_SHfc' class='external'>http://youtu.be/395K47_SHfc&lt;/a>&lt;/div>

&lt;div class='p'>The base of the robot is a modified RC model TRAXXAS E-MAXX (3903). It is
equipped with a webcam, GPS, HC-SR04 sonars, an IMU with a 3D compass and magnetic
IRC. The basic sensors are handled by an Arduino Mega. Image processing and GPS run
on an 8" tablet with an Intel Atom and Windows 8.  The program is written in C++ and
uses OpenCV.&lt;/div>

&lt;h2>&lt;a href='http://www.jecc.de' class='external'>JECC&lt;/a> (DE)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/jecc.jpg'>&lt;img src='/competitions/robotour/2014/teams/jecc_t.jpg' alt='' title='' class='border'  width='320' height='181'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/Doo27P37gCs' class='external'>http://youtu.be/Doo27P37gCs&lt;/a>&lt;/div>

&lt;div class='p'>4-wheel drive
powered by LiPo 42 5000 mAh
motor controller: DRV8800
Controller: BeagleBone Black&lt;/div>

&lt;h2>NDTeam (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/ndteam.jpg'>&lt;img src='/competitions/robotour/2014/teams/ndteam_t.jpg' alt='' title='' class='border'  width='320' height='179'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/FWOTWIGRihw' class='external'>http://youtu.be/FWOTWIGRihw&lt;/a>&lt;/div>

&lt;div class='p'>Robot Robík is our own construction inspired by the robot Orpheus. It weighs
approximately 15kg and is driven by two DC motors with planetary gearboxes. The wheels
are connected with a toothed belt. The control is our own electronics based on an ARM
Cortex M3 processor. Equipment: GPS + 9 DOF AHRS, sonar, and a camera with OpenCV on an
Odroid U3 platform for road detection.&lt;/div>

&lt;h2>&lt;a href='http://www.plecharts.cz' class='external'>Plecharts&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/plecharts.jpg'>&lt;img src='/competitions/robotour/2014/teams/plecharts_t.jpg' alt='' title='' class='border'  width='320' height='180'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/j4cK0xIAdUU' class='external'>http://youtu.be/j4cK0xIAdUU&lt;/a>&lt;/div>

&lt;div class='p'>The robot is constructed from freely available parts. The skeleton is assembled
from aluminium profiles. It is driven by two electric motors with a maximal
combined power of about 2.6 kW. Power is provided by two 12V 72Ah Pb accumulators
(usually only one is used). The maximal speed is about 0.5 m/s.
All modules (sensors, motor control, etc.) communicate over the TCP/IP protocol.&lt;/div>

&lt;div class='p'>The software is written in pure C++ and runs on an older notebook. Road
recognition uses neural networks. Park navigation may use GPS, maybe even some
map.&lt;/div>

&lt;h2>&lt;a href='http://www.kufr.cz' class='external'>Radioklub Písek&lt;/a> (CZ)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/radioklub-pisek.jpg'>&lt;img src='/competitions/robotour/2014/teams/radioklub-pisek_t.jpg' alt='' title='' class='border'  width='320' height='239'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='http://youtu.be/bAk96kyaTgc' class='external'>http://youtu.be/bAk96kyaTgc&lt;/a>&lt;/div>

&lt;div class='p'>Radioklub Písek is participating in outdoor robotic competitions for the 6th
year already. Last year we completed the new robot E-liška and took it
to ROBOTOUR for the first time, where we reached 3rd place, again. And this year
3rd place at Robotem Rovně and 3rd place at RoboOrienteering &lt;span class='smile'>&lt;/span>. After small
upgrades we count on it for ROBOTOUR 2014. E-liška's dimensions are 95x60x48 cm
and it weighs approximately 40 kg. The on-board voltage is 24V, provided by two
12V/18Ah gel accumulators. E-liška has a spring-loaded four-wheel chassis with
Ackermann steering and all wheels are powered. Each wheel has its own control
unit. We use a SICK lidar, GPS and a 9DOF unit for orientation. The main control
is handled by a notebook, and the motors have their own module with an STM32. Power control
is handled by H-bridges of our own construction. The main program is written in
Python and runs on Linux.&lt;/div>

&lt;h2>&lt;a href='http://dai.fmph.uniba.sk/projects/smelyzajko/' class='external'>Smely Zajko&lt;/a> (SK)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/smely-zajko.jpg'>&lt;img src='/competitions/robotour/2014/teams/smely-zajko_t.jpg' alt='' title='' class='border'  width='320' height='239'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://www.youtube.com/watch?v=dAUn9LSuKm0' class='external'>https://www.youtube.com/watch?v=dAUn9LSuKm0&lt;/a>&lt;/div>

&lt;div class='p'>HARDWARE:
Parallax (Motor Mount and Wheel Kit), encoders, 2xHB25
Sbot board (AVR ATmega128, designed and assembled by David Gustafik)
PC ASUS UL30V
5x SRF-08
GPS NaviLock NL-302U USB SiRF III
Compass with tilt compensation (HMC6343)
AVR ATmega8 (compass driver)
Camcorder Panasonic SDR-T50 (or USB webcam)
video grabber EasyCap DC60 USB 2.0 TV DVD VHS Video Adapter W / Audio
AV Capture TV DVD CVBS-Adapter
usual USB hub
Power: HAZE HZS 12V 9Ah
handmade wood &amp;amp; aluminium base (contributions by Miroslav Nadhajský
and Pavel Petrovič)
red power switch, and power circuitry (contributions by Richard Balogh)&lt;/div>

&lt;div class='p'>SOFTWARE:
Ubuntu 14.04 Desktop LTS
Netbeans
OpenCV
Smelý zajko controller utilizing an Artificial Neural Network (FANN)
(result of Miroslav Nadhajský's master thesis)
SBOT firmware written in C/AVR Studio (David Gustafik) with
modifications (Pavel Petrovič)
Compass driver with serial port interface written in C/AVR Studio
available at &lt;a href='https://code.google.com/p/smely-zajko/' class='external'>https://code.google.com/p/smely-zajko/&lt;/a>&lt;/div>

&lt;h2>&lt;a href='http://cybair.put.poznan.pl/index.php?option=com_content&amp;amp;view=category&amp;amp;layout=blog&amp;amp;id=52&amp;amp;Itemid=125' class='external'>TAPAS Team&lt;/a> (PL)&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotour/2014/teams/tapas-team.jpg'>&lt;img src='/competitions/robotour/2014/teams/tapas-team_t.jpg' alt='' title='' class='border'  width='320' height='181'/>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>youtube        &lt;a href='https://www.youtube.com/watch?v=RYOczhVBujE' class='external'>https://www.youtube.com/watch?v=RYOczhVBujE&lt;/a>&lt;/div>

&lt;div class='p'>The robot's hardware was intended to be simple, modular and reliable. Its main
body is built from aluminium profiles. The motors are rigidly mounted, with the
wheels mounted directly on their shafts. As for electronics, we use prebuilt
motor drivers, a Discovery STM32F4 evaluation board with a cape made by us,
an AHRS sensor, a Hokuyo laser scanner, a GPS receiver and a nettop computer. All
peripherals are connected by USB. Some parts, especially fixings,
were 3D printed.&lt;/div>

&lt;div class='p'>The software is composed of 3 main modules: localization, movement
constraints and navigation. The first one uses an Extended Kalman Filter and
data from the AHRS, encoders and GPS to obtain the global position. The second one
uses the Hokuyo laser scanner and the camera to compute movement constraints. It
consists of two parts: obstacle detection from a point cloud (aggregated from
laser scans) and terrain classification from the combined camera image and
point cloud intensity values. The last module uses a Vector Field
Histogram for local planning and A* for global planning. As a base we use the
Ubuntu operating system. For telemetry and development purposes we use
remote desktop and a dedicated GUI.&lt;/div>

&lt;hr/>

&lt;div class='p'>If you would like to somehow support this contest or you have some
comments/questions, please use our standard &lt;a href='/competitions/robotour/2014/teams/en#email'>contact form&lt;/a>.&lt;/div>
 </content>
</entry>
<entry>
	<title>Robot Challenge 2014</title>
	<link rel='alternate' href="http://localhost/competitions/robotchallenge/2014/en"/>
	<id>http://localhost/competitions/robotchallenge/2014/en</id>
	<updated>2014-02-10T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> Robot Challenge is a well established competition with many categories and many
participating robots. One of them is flying figure eights in a net-protected area
&amp;mdash; AirRace.  There also used to be a semi-autonomous category for the first
two years, where the best competitor, Flying Dutchman, reached 26 rounds. It is
time to beat that score with an autonomous drone now! &lt;b>Blog update:&lt;/b> 11/4 &amp;mdash;
&lt;a href='/competitions/robotchallenge/2014/en#140411'>Conclusion&lt;/a>

 </summary>
	<content type='html'> 
&lt;div class='p'>Note that this is not just an „English translation” of the Czech article. It seems
that the fundraising campaign failed (well, there is still a week to go, but I
doubt it will jump to the required limit), so I decided to „log” my progress at
least in English. Let me know if you find this interesting &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;ul>
&lt;li>&lt;a href='http://www.robotchallenge.org/fileadmin/user_upload/_temp_/RobotChallenge/Reglement/RC-AirRace.pdf' class='external'>AirRace rules (PDF file)&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>We are playing with two robots: &lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a> and Isabele. Both are
&lt;a href='http://ardrone2.parrot.com/' class='external'>AR Drone 2.0 from Parrot&lt;/a>.&lt;/div>

&lt;div class='p'>The source code is available at github:
&lt;a href='https://github.com/robotika/heidi' class='external'>https://github.com/robotika/heidi&lt;/a>&lt;/div>

&lt;div class='p'>Here are some pictures from older Czech blog:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/navigation-line.jpg'>&lt;img src='/competitions/robotchallenge/2014/navigation-line_t.jpg' alt='Navigation line' title='Navigation line' class='border'  width='220' height='124'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/navigation-line.jpg'>Navigation line&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/thresher.jpg'>&lt;img src='/competitions/robotchallenge/2014/thresher_t.jpg' alt='Scary landing area' title='Scary landing area' class='border'  width='220' height='124'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/thresher.jpg'>Scary landing area&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/roundel.jpg'>&lt;img src='/competitions/robotchallenge/2014/roundel_t.jpg' alt='Camera bottom view' title='Camera bottom view' class='border'  width='220' height='124'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/roundel.jpg'>Camera bottom view&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Links:&lt;/h2>

&lt;ul>
&lt;li>Contest website: &lt;a href='http://www.robotchallenge.org/' class='external'>http://www.robotchallenge.org/&lt;/a>&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;hr/>

&lt;h1>Blog&lt;/h1>

&lt;div class='p'>&lt;a id="140210">&lt;/a>&lt;/div>

&lt;h2>February 10th, 2014 - cv2&lt;/h2>

&lt;div class='p'>Yesterday I was fighting with
&lt;a href='http://docs.opencv.org/trunk/doc/py_tutorials/py_setup/py_intro/py_intro.html' class='external'>OpenCV
for Python&lt;/a>, which seems to be just the perfect tool to help us reach the goal. My
problem was that I followed only the
&lt;a href='http://docs.opencv.org/trunk/doc/py_tutorials/py_setup/py_setup_in_windows/py_setup_in_windows.html' class='external'>Python
setup&lt;/a> and it kind of worked (it loaded and displayed an image, for example). But as
soon as I needed to parse video with the H264 codec I was doomed :-(. There are
many comments on the web that integration with
&lt;a href='http://www.ffmpeg.org/' class='external'>ffmpeg&lt;/a> is complicated, but there were already some
DLLs prepared?!&lt;/div>

&lt;div class='p'>Well, next time start with the
&lt;a href='http://docs.opencv.org/doc/tutorials/introduction/windows_install/windows_install.html' class='external'>basic
OpenCV installation&lt;/a> and only then the Python part. The only critical detail is
that you have to add the binaries, the ffmpeg DLLs in particular, to the system path &amp;hellip;
yeah, quite obvious, I know.&lt;/div>

&lt;div class='p'>Here is the first
&lt;a href='https://github.com/robotika/heidi/commit/57c5ff97c29f9dcef80b8db2b4ac7414230fbd24' class='external'>dummy image
processing&lt;/a>, and here is the
&lt;a href='https://github.com/robotika/heidi/commit/b2cbd78a70d2288f7591032b218ff01c3b26798d' class='external'>video
replay code&lt;/a> for a recording made by &lt;a href='https://github.com/tajgr/isabele' class='external'>Isabele&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140212">&lt;/a>&lt;/div>

&lt;h2>February 12th, 2014 - PaVE and cv2.VideoCapture&lt;/h2>

&lt;div class='p'>&lt;a href='http://docs.opencv.org/trunk/doc/py_tutorials/py_gui/py_video_display/py_video_display.html#display-video' class='external'>cv2.VideoCapture&lt;/a>
is a nice function for accessing a video stream. Thanks to the integrated FFMPEG you
can read videos with the H264 codec, which is exactly what the Parrot AR Drone
2.0 uses. The bad news is that it is not a clean H264 stream; there are also
&lt;b>PaVE&lt;/b> (Parrot Video Encapsulation) headers.&lt;/div>

&lt;div class='p'>Does it matter? Well, it does. Yesterday I saw the Linux implementation crash a
couple of times, and on Windows you get several warnings like:&lt;/div>

&lt;pre>[h264 &amp;#64; 03adc9a0] slice type too large (32) at 0 45
[h264 &amp;#64; 03adc9a0] decode_slice_header error&lt;/pre>

&lt;div class='p'>The reason is that the H264 control bytes &lt;b>00 00 00 01&lt;/b> are quite common in the PaVE
header and then the decoder gets confused. Note that most projects ignore this:&lt;/div>

&lt;ul>
&lt;li>&lt;a href='https://github.com/Sanderi44/AR-Drone-Fire-Detection/blob/master/videotest.py' class='external'>AR-Drone-Fire-Detection&lt;/a>&lt;/li>

&lt;li>&lt;a href='https://github.com/puku0x/cvdrone/blob/master/src/ardrone/video.cpp' class='external'>CV Drone (C++)&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>It is OK if you just need „some” images, but once the decoder is confused,
you lose up to 1 second of the video stream until an I-frame is successfully found.&lt;/div>

&lt;div class='p'>So how to work around it? I had two ideas, both crazy and ugly. One is to
create a „TCP proxy”, which would read images from the AR Drone 2.0 and offer them
to OpenCV via a TCP socket. The other possibility is a temporary file, where the &lt;i>ret&lt;/i>
value is sometimes False, but you can replay the video in parallel.&lt;/div>

&lt;pre>cap = cv2.VideoCapture( filename )
while(cap.isOpened()):
    ret, frame = cap.read()&lt;/pre>

&lt;div class='p'>There are probably more complex solutions, like controlling the FFMPEG API directly, but
it would be a pity to lose this otherwise nice and clean code &lt;span class='smile'>&lt;/span>. Any ideas?&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140218">&lt;/a>&lt;/div>

&lt;h2>February 18th, 2014 - I-Frames only&lt;/h2>

&lt;div class='p'>Well, cv2 is still winning. I tried both &amp;mdash; the temporary file as well as the TCP
proxy. It took a bit too long to start a simple server (several seconds), which
is not acceptable for „real-time video processing”. The temporary file looked more
promising, but on Linux it was completely unusable. Hacking around
&lt;b>cv2.VideoCapture&lt;/b> is maybe not the best idea, and a proper solution with
buffered frame-by-frame decoding would be the right way, but &amp;hellip;&lt;/div>

&lt;div class='p'>Today's fallback was to use I-frames only. It does not work if you create a
video from I-frames only, but you can create a new file, write a single
I-frame into it, and then read that with &lt;i>cv2.VideoCapture&lt;/i>. Crazy, I know. But it works
for now, and hopefully in the meantime we will find some more efficient solution.
For details see the
&lt;a href='https://github.com/robotika/heidi/commit/21c7f2cae9ce488bcd6826def7ebafe8397d83fb' class='external'>source
diff&lt;/a>.&lt;/div>

&lt;div class='p'>&lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a> now has new flying code,
&lt;a href='https://github.com/robotika/heidi/blob/master/airrace_drone.py' class='external'>airrace_drone.py&lt;/a>.
It is glued together so as not to interfere with drone Isabele. A simple
&lt;i>processFrame&lt;/i> is then in
&lt;a href='https://github.com/robotika/heidi/blob/master/airrace.py' class='external'>airrace.py&lt;/a>:
adaptive thresholding, detection of contours, and for reasonably sized contours
computation of the bounding rectangle (with rotation).&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/rectangles.jpg'>&lt;img src='/competitions/robotchallenge/2014/rectangles_t.jpg' alt='detected rectangles' title='detected rectangles' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/rectangles.jpg'>detected rectangles&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140219">&lt;/a>&lt;/div>

&lt;h2>February 19th, 2014 - Rotated rectangles&lt;/h2>

&lt;div class='p'>Writing a robo-blog in English is not as much fun as in Czech. First, my
English is much worse, and second, I do not have a clue whether anybody is
reading it. That's something &lt;a href='http://fandorama.cz/' class='external'>Fandorama&lt;/a> was good for.
I got clear feedback on how many people were interested enough that they
would even support it. And almost nobody was interested in this blog in Czech,
so &amp;hellip;  you get the picture. If you find a bit of the information here useful,
let me know. It helps to brighten up my otherwise depressed spirit &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>What I want to do now is a review of the data logged yesterday. The cv2 logs
are normal text files with the results of the image analysis plus the number of updates
between them. The last program was supposed to turn the drone based on the third
parameter of the first rectangle. Just a quick hack so that I can see that there is some
feedback going on and that I can replay already recorded data with an exact
match.&lt;/div>

&lt;pre>NAVI-ON
-26.5650501251
-15.945394516
-45.0
-88.9583740234
NAVI-OFF&lt;/pre>

&lt;div class='p'>This is the output in the console &amp;mdash; the angles used for rotation. It is also my first
TODO: add video timestamps there.&lt;/div>

&lt;pre>182
[((817.0, 597.0), (10.733125686645508, 12.521980285644531), -26.56505012512207),
 ((223.17218017578125,
 440.579833984375), (319.85919189453125, 60.09449768066406), -88.75463104248047)]
198
[((298.0, 508.0), (58.51559829711914, 319.50067138671875), -15.945394515991211)]
198
[((982.2500610351562, 492.25), (29.698482513427734, 17.677669525146484), -45.0),
 ((951.9046020507812, 
507.705078125), (60.70906448364258, 84.2083511352539), -69.02650451660156)]
185
[((741.7918090820312, 247.95079040527344), (43.32011032104492, 72.9879379272461),
 -88.9583740234375),
 ((1035.0001220703125, 14.999971389770508), (16.970561981201172, 
12.727922439575195), -45.0)]
199&lt;/pre>

&lt;div class='p'>My second TODO would be to round/truncate the floating point numbers to integers. It
would be easier to read and remember; my goal is to turn left/right, i.e. to do
something if the mass center is bigger or smaller than width/2 &amp;hellip;&lt;/div>

&lt;pre>182
[((817, 597), (10, 12), -26), ((223, 440), (319, 60), -88)]
198
[((298, 508), (58, 319), -15)]
198
[((982, 492), (29, 17), -45), ((951, 507), (60, 84), -69)]
185
[((741, 247), (43, 72), -88), ((1035, 14), (16, 12), -45)]
199&lt;/pre>

&lt;div class='p'>Better? I think so &lt;span class='smile'>&lt;/span>. Now I would almost guess that it is
((x,y), (width,height), rotation), but the API is so simple that it is worth
testing it:&lt;/div>

&lt;pre>import cv2
import numpy as np
frame = np.zeros( (1200,720), np.uint8)
rect = ((223, 440), (319, 60), -88)
box = cv2.cv.BoxPoints(rect)
box = np.int0(box)
cv2.drawContours( frame,[box],0,(0,0,255),2)
cv2.imshow('image', frame)
cv2.imwrite( "tmp.jpg", frame )
cv2.waitKey(0)
cv2.destroyAllWindows()&lt;/pre>

&lt;div class='p'>Well, you probably know how to make it simpler, but this is a whole working
program, which will display the image and save it as a reference. The width/height of
the image is swapped and I do not see anything in it :-(, so it is not that great.
In general I would recommend the
&lt;a href='http://docs.opencv.org/trunk/doc/py_tutorials/py_tutorials.html' class='external'>OpenCV-Python
Tutorials&lt;/a>. Note that they are for the non-existing (or better said, not so
easily available) version 3.0, and some cv2.cv.* functions are renamed (see
&lt;a href='https://github.com/Itseez/opencv/pull/2319' class='external'>notes&lt;/a>).&lt;/div>

&lt;div class='p'>So how to fix the program? Back to the
&lt;a href='http://docs.opencv.org/trunk/doc/py_tutorials/py_gui/py_drawing_functions/py_drawing_functions.html#drawing-functions' class='external'>Drawing
Functions example&lt;/a>. Great! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;pre># Create a black image
img = np.zeros((512,512,3), np.uint8)&lt;/pre>

&lt;div class='p'>so my fix is&lt;/div>

&lt;pre>frame = np.zeros( (720,1200,3), np.uint8)&lt;/pre>

&lt;div class='p'>and now I see a red rectangle. With the modifications &lt;i>rect = ((223+300, 440), (319,
60), -88)&lt;/i> and &lt;i>rect = ((223+600, 440), (319, 60), 0)&lt;/i> I can see that the first
coordinate is really (x,y), then (width,height) and finally the angle in degrees,
with the Y-coordinate pointing down &amp;hellip; maybe I jumped to a conclusion too
fast. One more test, &lt;i>rect = ((223+600, 440), (319, 60), 45)&lt;/i>, shows that (x,y)
is the &lt;b>center of the rectangle&lt;/b>! And now it is time to
&lt;a href='http://docs.opencv.org/trunk/doc/py_tutorials/py_imgproc/py_contours/py_contour_features/py_contour_features.html' class='external'>read
the manual&lt;/a> &lt;span class='wink'>&lt;/span>. Hmm &amp;hellip; &lt;i>The function used is cv2.minAreaRect(). It returns
a Box2D structure which contains following detals - ( top-left corner(x,y),
(width, height), angle of rotation ).&lt;/i> Test first! Here is my accumulated
image:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 281px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/draw-rect.jpg'>&lt;img src='/competitions/robotchallenge/2014/draw-rect_t.jpg' alt='Drawing rectangles' title='Drawing rectangles' class='border'  width='275' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/draw-rect.jpg'>Drawing rectangles&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>So it was not that boring in the end &lt;span class='wink'>&lt;/span>. No wonder the drone did what she did.
First it navigated to a random square, then to a perpendicular angle (but be aware
&amp;hellip; width is not necessarily bigger than height, see &lt;i>((298, 508), (58, 319),
-15)&lt;/i> &amp;hellip; homework for you: was it correct, for this particular box, to navigate to
angle 0?)&lt;/div>
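
&lt;div class='p'>As a hint for the homework, the usual way to get a consistent heading of the
strip's long side from cv2.minAreaRect() output is sketched below (my code, not the
one used in heidi); for the quoted box it gives roughly 75 degrees, so angle 0 was
probably not the right target:&lt;/div>

&lt;pre>def strip_heading(rect):
    # rect is the ((x, y), (width, height), angle) tuple from cv2.minAreaRect().
    # OpenCV 2.x reports the angle in (-90, 0]; when width &lt; height the long
    # side is perpendicular to the reported angle, so add 90 degrees.
    (x, y), (width, height), angle = rect
    if width &lt; height:
        angle += 90.0
    return angle&lt;/pre>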

&lt;hr/>

&lt;div class='p'>&lt;a id="140221">&lt;/a>&lt;/div>

&lt;h2>February 21st, 2014 - The Plan&lt;/h2>

&lt;div class='p'>Aleš wrote me an e-mail saying that it is not clear (from my description) what the
exact plan is. And he is right! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>The plan is to use a standard notebook (one running Windows 7 for Heidi, another
one running Linux for Isabele) where all image processing will be handled. Navigation data
plus the video stream from the drone are received over a WiFi connection. WiFi is also
used for sending AT commands to the drone.  The code is written in Python, and
OpenCV is used for image processing. The code is publicly available at
&lt;a href='https://github.com/robotika/heidi' class='external'>https://github.com/robotika/heidi&lt;/a>.&lt;/div>

&lt;div class='p'>Aleš is also right regarding the task complexity. According to his estimates it
will be necessary to fly with an average speed of at least 1.5m/s. With a video
sampling rate of 1Hz (the current status), that will be quite hard. It is a challenge
&amp;hellip; just as the name of the competition, „Robot Challenge”, says! See you in the
arena! &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>p.s. if this blog was a sufficient impulse for Aleš to install and start playing
with „OpenCV for Python”, then I think that at least one alternative goal has
already been reached &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140226">&lt;/a>&lt;/div>

&lt;h2>February 26th, 2014 - Version 0 (almost)&lt;/h2>

&lt;div class='p'>Version 0 should be the simplest program solving the given task. In the context of
AirRace this means completing at least one 8-loop. And we are not there yet.
Camera integration always slows down the development &amp;hellip;&lt;/div>

&lt;div class='p'>In the meantime I also checked
&lt;a href='http://www.robotchallenge.org/competition/participants/' class='external'>the other
competitors&lt;/a>. There are (or yesterday were) 12 of them! So it is clear that
while in 2012 one completed round would have been enough to win the competition, in
2013 it was enough for 3rd place (see &lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a>), and in 2014? We
will see.&lt;/div>

&lt;div class='p'>Here is the source code diff for
&lt;a href='https://github.com/robotika/heidi/commit/54e5384ffb7003081d286ab8b7cc0571c855b24f' class='external'>test-14-2-25&lt;/a>.
There were some surprises, as usual &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;h3>Z-coordinate&lt;/h3>

&lt;div class='p'>Probably the biggest surprise was the test recorded in &lt;i>meta_140225_183844.log&lt;/i>.
The drone started aggressively and reached a height of 2 meters instead of the usual 1
meter. That would still be OK, and soon it would have reduced the height to the desired
1.5 meters, but according to the navdata the height was 1.43m! Moreover, the
restriction on the minimum size of detected rectangles was set to 200 pixels, so
no navigation data was recognized/used.&lt;/div>

&lt;h3>Rectangles&lt;/h3>

&lt;div class='p'>One more comment about rectangle detection. Originally there was no limit
on the size of a rectangle, and that was wrong. Here are some numbers so you get
an idea: at 1m the 30cm long strip is detected as a 300 pixel rectangle. This
means that at 2m it should be approximately 150 pixels (the current filter value).
Thanks to Jakub, rejected rectangles are now visible in the debug output:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 366px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/blue-red-rect.jpg'>&lt;img src='/competitions/robotchallenge/2014/blue-red-rect_t.jpg' alt='Blue = rejected rectangles' title='Blue = rejected rectangles' class='border'  width='360' height='203'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/blue-red-rect.jpg'>Blue = rejected rectangles&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>
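
&lt;div class='p'>The filter itself is basically an inverse proportion to the height; a minimal sketch
(the constants are illustrative, not taken from airrace.py):&lt;/div>

&lt;pre>def min_strip_pixels(height_m):
    # A 30cm strip is roughly 300 px long when seen from 1m, so roughly
    # 150 px from 2m; scale the minimum accepted size accordingly.
    REF_HEIGHT = 1.0    # metres
    REF_PIXELS = 300.0  # pixels at REF_HEIGHT
    return REF_PIXELS * REF_HEIGHT / max(height_m, 0.1)&lt;/pre>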

&lt;div class='p'>Now I am browsing through the recorded videos and looking for „false positives”, or
whatever it is called &amp;mdash; rectangles which should not have been selected but were,
and &amp;hellip; here is a bad surprise:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 366px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/incomplete.jpg'>&lt;img src='/competitions/robotchallenge/2014/incomplete_t.jpg' alt='Mixed video frame' title='Mixed video frame' class='border'  width='360' height='203'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/incomplete.jpg'>Mixed video frame&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>We are still processing &lt;b>recorded video&lt;/b>, which should be
complete. So it is possible that a PaVE packet contained only part of the video
frame. Another TODO.&lt;/div>

&lt;h3>Corrections in Y-coordinate&lt;/h3>

&lt;div class='p'>It turned out that it is also necessary to make sideways corrections. Sure
enough, there was a mistake in a sign: the image x-coordinate is the drone's Y coordinate
and it should be positive for x=0. Fixed.&lt;/div>

&lt;h3>Hovering&lt;/h3>

&lt;div class='p'>The most important step was the insertion of hovering, just to get the drone at
least a little bit under control. The video frames are processed at 1Hz, so
setting a proportion between 0.0 and 1.0 splits the time between steering and
hovering. 0.5s was too slow, 0.8s too fast and 0.7s was reasonable at this
stage.  The plan is to trigger hovering only when the error grows above some
predefined limit.&lt;/div>
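
&lt;div class='p'>A hypothetical sketch of this steering/hovering split; moveXYZA() and hover() are
the commands mentioned in this blog, while wait() stands in for whatever timing loop
the real code uses:&lt;/div>

&lt;pre>STEER_FRACTION = 0.7  # portion of each 1s video period spent steering

def one_video_period(drone, sx, sy, period=1.0):
    # Steer for part of the period, then hover for the rest of it.
    drone.moveXYZA(sx, sy, 0, 0)
    drone.wait(period * STEER_FRACTION)
    drone.hover()
    drone.wait(period * (1.0 - STEER_FRACTION))&lt;/pre>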

&lt;h3>Results 14-2-25&lt;/h3>

&lt;div class='p'>The drone is extremely slow at the moment. In one minute it was not able to
complete a semi-circle. It will be necessary to recognize which part of the 8-loop
it is in and steer, instead of using go-forward as the default command.&lt;/div>

&lt;div class='p'>Another detail to think about is going backward instead of forward. The downward
pointing camera is shifted a little bit back, which may simplify navigation in some
cases. Like the test logged as &lt;i>meta_140225_185421.log&lt;/i>, when Isabele flew across
our table (we do not have protective nets), because it had seen the perpendicular
strip in only one frame &amp;hellip; and by then it was too late &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140228">&lt;/a>&lt;/div>

&lt;h2>February 28th, 2014 - Bug in log process&lt;/h2>

&lt;div class='p'>Today I was investigating the timestamps of video vs. navigation data. What scared
me a lot was a &lt;b>five second gap&lt;/b> between the initial frames:&lt;/div>

&lt;pre>1548.280426
0
1549.273254
-16
1550.32669 &amp;lt;&amp;mdash; !!!!!!!!
-18
1555.131744 &amp;lt;&amp;mdash; !!!!!!!!
-9
1557.108245 &amp;lt;&amp;mdash; !!!!!!!!
-14&lt;/pre>

&lt;div class='p'>&amp;hellip; here you see the time in seconds and the detected angle in degrees, just to
find/match the corresponding frame. So what is the problem? Here is the log file
for image processing:&lt;/div>

&lt;pre>&amp;hellip;
[((1009, 643), (74, 48), -18), ((1013, 624), (16, 12), -45), ((464, 482), (46, 266), -7), &amp;hellip;]
178
[]
750
[((172, 209), (15, 8), -9)]
188
[]
184
[((327, 551), (43, 239), -14), ((244, 235), (43, 216), -14)]
190
[((1045, 676), (31, 38), -45), ((268, 483), (43, 236), -16)]
199&lt;/pre>

&lt;div class='p'>The results are interlaced with the number of updates of the drone's internal loop
(running at 200Hz). The problematic number is 750 &amp;hellip; so for that long there
was no result?! This would mean that the drone navigation was based on a video
frame several seconds old!&lt;/div>

&lt;div class='p'>Well, the good news is that in reality it was not so bad and the mistake is in the
logging. It is necessary for all results to be different in order to &lt;i>store
changes&lt;/i>. Here, for several seconds, no rectangles were detected (i.e. an empty
list []) and these results were &lt;i>packed&lt;/i> into one. If there were
timestamps and frame numbers, the results would always be different and then it
should work fine.&lt;/div>

&lt;div class='p'>It is fixed now and ready for a new test (see the
&lt;a href='https://github.com/robotika/heidi/commit/63d9b4b535b5da26c6ccde64142f4d18689f954b' class='external'>github
diff&lt;/a>).&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140303">&lt;/a>&lt;/div>

&lt;h2>March 3rd, 2014 - Z-scaling&lt;/h2>

&lt;div class='p'>A quick observation based on today's tests by Jakub: it is about scaling the strip (x,y)
coordinates based on the drone height (Z-scaling).  The distance is
estimated individually for each strip. I could not see any reason why three
strips in a line:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 299px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/z-scaling.jpg'>&lt;img src='/competitions/robotchallenge/2014/z-scaling_t.jpg' alt='strips in line' title='strips in line' class='border'  width='293' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/z-scaling.jpg'>strips in line&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>are not linear in &lt;a href='https://github.com/robotika/heidi/blob/master/viewer.py' class='external'>the log viewer&lt;/a>:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 354px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/z-scaling-viewer.png'>&lt;img src='/competitions/robotchallenge/2014/z-scaling-viewer_t.png' alt='pink centers of strips' title='pink centers of strips' class='border'  width='348' height='200'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/z-scaling-viewer.png'>pink centers of strips&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>?! I already told you the reason &amp;hellip; the z coordinate is estimated separately
for each strip (based on its length) and the middle one is much shorter (due to a
light reflection), so it is expected to be further away &amp;hellip;&lt;/div>
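
&lt;div class='p'>The per-strip estimate is essentially a pinhole-camera relation; a sketch with an
assumed focal length (the constant is illustrative, not taken from the heidi code):&lt;/div>

&lt;pre>STRIP_LENGTH_M = 0.3   # real length of one black strip
FOCAL_PX = 700.0       # assumed focal length of the bottom camera, in pixels

def strip_distance(pixel_length):
    # The shorter the detected strip, the further away it appears, so a strip
    # clipped by a light reflection gets an exaggerated distance estimate.
    return FOCAL_PX * STRIP_LENGTH_M / pixel_length&lt;/pre>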

&lt;hr/>

&lt;div class='p'>&lt;a id="140304">&lt;/a>&lt;/div>

&lt;h2>March 4th, 2014 - A Brief History of Time&lt;/h2>

&lt;div class='p'>I could not resist using
&lt;a href='http://en.wikipedia.org/wiki/A_Brief_History_of_Time' class='external'>Stephen Hawking's book
title&lt;/a>, sorry &lt;span class='smile'>&lt;/span>. The AR Drone 2.0 has at least two time sequences: NavData and
Video. They are encoded differently (see for example the
&lt;a href='https://projects.ardrone.org/boards/1/topics/show/5379' class='external'>Parrot forum&lt;/a>), but the
good news is that they are probably related to the same &lt;i>beginning of time&lt;/i>. In
this case not the Big Bang, but only the drone boot.&lt;/div>

&lt;div class='p'>The video stream is delayed. Not much for recording, but maybe a lot for fast
drone navigation. The number varies around 0.3s, but you can also see values like
0.44s and probably more. Here you can see the difference between tying a video
frame to the current pose and keeping a
&lt;a href='https://github.com/robotika/heidi/commit/6c5cd6cbbd0be78204a5dba705159f2750250dfd' class='external'>poseHistory&lt;/a>:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 382px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotchallenge/2014/before-time-correction.png' alt='before time correction' title='before time correction' class='border'  width='376' height='376'/>&lt;/span>&lt;br/>
&lt;span>before time correction&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 382px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotchallenge/2014/after-time-correction.png' alt='after time correction' title='after time correction' class='border'  width='376' height='376'/>&lt;/span>&lt;br/>
&lt;span>after time correction&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>
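
&lt;div class='p'>A minimal sketch of the poseHistory lookup, assuming the history is a list of
(timestamp, pose) tuples appended as navdata arrive (names are illustrative, not the
ones used in heidi):&lt;/div>

&lt;pre>import bisect

def pose_for_frame(pose_history, frame_timestamp):
    # The video frame is roughly 0.3s old, so pick the newest pose that is
    # not younger than the frame instead of the current one.
    times = [t for t, _ in pose_history]
    i = bisect.bisect_right(times, frame_timestamp)
    return pose_history[max(i - 1, 0)][1]&lt;/pre>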

&lt;hr/>

&lt;div class='p'>&lt;a id="140305">&lt;/a>&lt;/div>

&lt;h2>March 5th, 2014 - test-14-3-4&lt;/h2>

&lt;div class='p'>Yesterday's tests were quite discouraging :(. We had visitors and the best we
could do was finish half of the 8-loop. After the second turn Isabelle decided to
„turn right” at the crossing and that was it. So we still do not have a working
version 0!!!&lt;/div>

&lt;div class='p'>Now it is time to go through the log files and look for anomalies. Starting
with the last test, I can see a 90 degree jump in the heading. This is probably
&lt;i>normal&lt;/i>, as the internal compass is taken into account and it takes a while before
enough data are collected.&lt;/div>

&lt;div class='p'>Isabelle is not very high (approx. 1m) at the start position. It is a pity that
it won't recognize a partially visible strip. Moreover, the recognized strip is,
due to a light reflection, shorter, so the distance estimate is not very accurate.&lt;/div>

&lt;div class='p'>There is an obvious problem with the Y-coordinate correction. It is „one shot”
only, with a dead zone of 5cm. For a 10cm error it creates beautiful oscillations &lt;span class='smile'>&lt;/span>.
On the other hand, the correction in angle with a dead zone of +/- 15 degrees (!) does
not correct at all during the first three image snapshots &amp;hellip; OK, now it is clear. Who
programmed that?! &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>Another source of problems is the failure to detect the transition from circle to line.
As a result, &lt;i>circle corrections&lt;/i> are used instead of &lt;i>line navigation&lt;/i>.&lt;/div>

&lt;div class='p'>There are only 3 more test evenings to go, so we will have to pick from
the following list:&lt;/div>

&lt;ul>
&lt;li>switch to smaller images, stream the preview video (i.e. not recorded), faster
frame rate&lt;/li>

&lt;li>C/Python binding to FFMPEG and parsing intermediate images (i.e. more than
1Hz video update rate)&lt;/li>

&lt;li>MCL (Monte-Carlo Localisation) based on strip positions&lt;/li>

&lt;li>improve quality of strips image recognition&lt;/li>

&lt;li>big image of the floor from matched and joined mosaic images&lt;/li>

&lt;li>rectangle merging (two small rectangles detected where one is split by a light reflection)&lt;/li>

&lt;li>position driven corrections&lt;/li>
&lt;/ul>

&lt;hr/>

&lt;div class='p'>&lt;a id="140306">&lt;/a>&lt;/div>

&lt;h2>March 6th, 2014 - Cemetery of Bugs&lt;/h2>

&lt;div class='p'>Hopefully I can give this title to
&lt;a href='https://github.com/robotika/heidi/commit/0a50b30977b222c7b7ce8aad878be43b3f84d899' class='external'>today&amp;#039;s
commit&lt;/a> &lt;span class='smile'>&lt;/span>. Sometimes it is hard to believe how many bugs can fit in a few lines
of code. I already mentioned a couple yesterday, so here are a few more today:&lt;/div>

&lt;pre>drone.moveXYZA( sx, -sy/4., 0, 0 )&lt;/pre>

&lt;div class='p'>This was supposed to be a replacement for the hover() function in the Y coordinate. The
idea was that you send the drone the opposite of the speed at which it is flying
now. So what is the problem? &lt;i>sy&lt;/i> is not the drone speed! It is the command,
after image analysis, for tilting right or left. It should be &lt;i>drone.vy&lt;/i>
instead.&lt;/div>
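
&lt;div class='p'>Presumably the corrected call looks something like the line below (the exact
scaling is in the commit):&lt;/div>

&lt;pre>drone.moveXYZA( sx, -drone.vy/4., 0, 0 )&lt;/pre>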

&lt;div class='p'>Another bug, probably not as critical, was a block of code conditioned on whether any
strip was detected. If there was no strip, the history of poses kept growing
(properly, it is cut at the timestamp of the last decoded video frame). Solved by
completely removing the if statement &amp;mdash; it is no longer needed, and I no
longer have to worry when I do not see any rectangle.&lt;/div>

&lt;div class='p'>Then there was a surprise with the &lt;i>ARDrone2.vy&lt;/i> speed itself. I had never needed it
except for the drone pose computation, and so it kept the signs as you get them
from Parrot. Fixed in
&lt;a href='https://github.com/robotika/heidi/blob/0a50b30977b222c7b7ce8aad878be43b3f84d899/ardrone2.py#L398' class='external'>ardrone2.py&lt;/a>.&lt;/div>

&lt;div class='p'>So what next? Hopefully Jakub will test the new code with reference circles today.
Instead of navigation based on old images, Isabelle should now use a circular
path estimated from the image. It should be much better, but will it be? &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140310">&lt;/a>&lt;/div>

&lt;h2>March 10th, 2014 - Shadow&lt;/h2>

&lt;div class='p'>Jakub sent me the results of Friday's tests with navigation along the reference line, with
the comment: &lt;i>we've got problems with shadows&lt;/i> &amp;hellip; and he was absolutely right:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 306px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/shadow.jpg'>&lt;img src='/competitions/robotchallenge/2014/shadow_t.jpg' alt='Shadow --- video_rec_140307_150703.bin:630' title='Shadow --- video_rec_140307_150703.bin:630' class='border'  width='300' height='169'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/shadow.jpg'>Shadow --- video_rec_140307_150703.bin:630&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>This is not the first time that&lt;/div>

&lt;pre>ret, binary = cv2.threshold( gray, 0, 255, cv2.THRESH_OTSU )&lt;/pre>

&lt;div class='p'>fails. On the other hand it is not a surprise &amp;mdash; there are three peaks in the
histogram instead of two.  The worst thing about this particular picture is that
it not only fails to detect any of the three black strips, but it also evaluates garbage
as a perpendicular strip!&lt;/div>

&lt;div class='p'>For a quick test I used just the Irfan histogram to show what I mean:
&lt;table class='image_panel center' style='width: 309px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/competitions/robotchallenge/2014/histogram.png' alt='Image histogram ' title='Image histogram ' class='border'  width='303' height='319'/>&lt;/span>&lt;br/>
&lt;span>Image histogram &lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Note that out of the red/green/blue channels I picked the blue histogram, because
there the three peaks are easiest to see.&lt;/div>
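
&lt;div class='p'>For the record, a minimal way to compute such a histogram with OpenCV
(a sketch, not the code used for the screenshot above):&lt;/div>

&lt;pre>import cv2

def blue_histogram(filename):
    # 256-bin histogram of the blue channel (channel 0 in OpenCV's BGR order),
    # just to reproduce the three peaks that break a single Otsu threshold.
    img = cv2.imread(filename)
    return cv2.calcHist([img], [0], None, [256], [0, 256])&lt;/pre>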

&lt;hr/>

&lt;div class='p'>&lt;a id="140312">&lt;/a>&lt;/div>

&lt;h2>March 12th, 2014 - Finally Version 0!&lt;/h2>

&lt;div class='p'>I can sleep again &lt;span class='smile'>&lt;/span>. Not much (it is 5am now), but still it is much better
&lt;span class='wink'>&lt;/span>.  Yesterday &lt;a href='/robots/heidi/en'>&lt;span class='cs'>Heidi&lt;/span>&lt;/a> finally broke the spell and after
several weeks completed the figure eight loop!&lt;/div>

&lt;div class='p'>Are you asking why Heidi and not her &lt;i>sister&lt;/i> Isabelle? Well, there was a
surprise: Isabelle does not have a working compass! Yesterday I wanted to add
extra direction filtering, basically to make sure that I am following the
proper line. So I integrated the magneto tag into the
&lt;a href='https://github.com/robotika/heidi/commit/8e74037274a8316e2ed53d460c69f32fdab62b4b' class='external'>ARDrone2
variables&lt;/a>. I was using the last 2-minute test from Jakub (OT: that was maybe
another moment for celebration, when Isabelle was able to autonomously fly and
navigate for 2 minutes) and to my surprise there were constant raw data
readings from the compass:&lt;/div>

&lt;pre>compass:	
-55	-57	-191	
193.7109375	187.55859375	671.484375	
-120.508392334	108.018455505	671.484375
314.219329834	79.5401382446	0.0	-121.496063232
0.0	-137.727676392 0x01	2	
402.916381836	-241.082290649	0.0546875&lt;/pre>

&lt;div class='p'>This corresponds to the following named variables:&lt;/div>

&lt;pre>values = dict(zip(['mx', 'my', 'mz', 
  'magneto_raw_x', 'magneto_raw_y', 'magneto_raw_z', 
  'magneto_rectified_x', 'magneto_rectified_y', 'magneto_rectified_z',
  'magneto_offset_x', 'magneto_offset_y', 'magneto_offset_z',
  'heading_unwrapped', 'heading_gyro_unwrapped', 'heading_fusion_unwrapped', 
  'magneto_calibration_ok', 'magneto_state', 'magneto_radius',
  'error_mean', 'error_var'], magneto))&lt;/pre>

&lt;div class='p'>This was kind of magic &amp;mdash; it is almost hard to believe how well the data fusion of
gyro + compass hid this problem. In particular, if you watch the variables
&lt;i>heading_unwrapped&lt;/i>, &lt;i>heading_gyro_unwrapped&lt;/i> and &lt;i>heading_fusion_unwrapped&lt;/i>,
the first one (the compass heading) was constant (as were mx, my, mz and the others).
The second one, the gyro heading, was slowly changing, and in the fusion heading the
compass was originally the winner, but later on it was losing against the gyro &lt;span class='smile'>&lt;/span>. It is
really hard to believe that it worked so well; except on longer flights you
would not notice it.&lt;/div>

&lt;div class='p'>I could not resist and immediately tried Heidi at home in the kitchen. I cut
the motors as soon as they started to turn (it was 6am, so the neighbours would
probably have complained). There were two tests: first with the drone pointing south and then
pointing north. The values were different, so it is an issue specific to Isabelle.&lt;/div>

&lt;div class='p'>Now, what is worse than a sensor that does not work? Any idea? Well, there is an even worse
case &amp;mdash; a &lt;b>sensor that sometimes does not work&lt;/b>! After several test flights with
Heidi yesterday we tried the final working algorithm with Isabelle as well. And it
worked fine! Do you see the &lt;i>persistent&lt;/i> direction of the figure eight?&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 293px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/isabelle-8-loops.png'>&lt;img src='/competitions/robotchallenge/2014/isabelle-8-loops_t.png' alt='Isabelle with working compass' title='Isabelle with working compass' class='border'  width='287' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/isabelle-8-loops.png'>Isabelle with working compass&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Just to get an idea, visible only from the raw data, here is the older 2-minute flight
mentioned earlier:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 293px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/isabelle-broken-compass.png'>&lt;img src='/competitions/robotchallenge/2014/isabelle-broken-compass_t.png' alt='Isabelle with broken compass' title='Isabelle with broken compass' class='border'  width='287' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/isabelle-broken-compass.png'>Isabelle with broken compass&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Note that I had to change the scale so you can see the crazy trajectory &amp;hellip;&lt;/div>

&lt;div class='p'>Fun, isn't it? &lt;span class='wink'>&lt;/span> Now I would guess that communication with the compass module
sometimes fails and the only remedy is to power the drone off and restart
the whole system. Maybe. Another TODO: go through the older log files and check
for this behavior.&lt;/div>

&lt;div class='p'>So that was the bad news &amp;hellip; now the good news &amp;hellip; and the title of today's
post.  We finally managed to complete the 8-loop and repeat it several times.
There were three changes (see the older
&lt;a href='https://github.com/robotika/heidi/commit/11906701c4d90b465e2f4fb27ca2714433f927a7' class='external'>diff&lt;/a>):&lt;/div>

&lt;ul>
&lt;li>decreased speed from 0.5m/s to 0.3m/s&lt;/li>

&lt;li>wait on start if no strip was detected&lt;/li>

&lt;li>fixed bug &amp;mdash; missing abs()&lt;/li>
&lt;/ul>

&lt;div class='p'>The speed is obvious and does not need any explanation. The second point is
related to the elimination of most bad starts &amp;mdash; the takeoff sequence is completed
at about 1m above the ground and a reasonable view from the bottom camera starts at about
1.5m. The algorithm slowly increases the height, but it should not leave the
start area.&lt;/div>

&lt;div class='p'>And the third one is the funny one &lt;span class='smile'>&lt;/span>. That was the reason why Isabelle or
Heidi took the wrong turn at the crossing. It was quite repeatable &amp;hellip; fixed.&lt;/div>

&lt;div class='p'>p.s. I also wanted to post a link to a YouTube video, but my video converter no
longer works &amp;hellip; it looks like there were
&lt;a href='http://www.visionopen.com/forums/topic/ffmpeg-big-changes/' class='external'>some changes in
FFmpeg&lt;/a> &amp;hellip; but maybe it is only a matter of changing avcodec_get_context_defaults2 to
avcodec_get_context_defaults3 and replacing AVMEDIA_TYPE_VIDEO by NULL as in
&lt;a href='http://patches.libav.org/patch/16569/' class='external'>this example&lt;/a>. It looks OK and
&lt;a href='http://youtu.be/f6_kiLwPWE0' class='external'>here&lt;/a> is the video &amp;hellip; note the strange
behaviour near the end. Still plenty of work to do.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140313">&lt;/a>&lt;/div>

&lt;h2>March 13th, 2014 - Frame contour&lt;/h2>

&lt;div class='p'>Did you watch the video of &lt;a href='http://youtu.be/f6_kiLwPWE0' class='external'>Isabelle&amp;#039;s first
successful flight&lt;/a> to the end? Can you guess what happened? If you have no
idea, then frame number 3630 from &lt;i>video_rec_140311_192140.bin&lt;/i> can help you:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 306px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/frame-contour.jpg'>&lt;img src='/competitions/robotchallenge/2014/frame-contour_t.jpg' alt='False strip on the frame contour' title='False strip on the frame contour' class='border'  width='300' height='169'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/frame-contour.jpg'>False strip on the frame contour&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>And here is the report of changes near the end:&lt;/div>

&lt;pre>FRAME 3540 I I
FRAME 3570 I I
TRANS I -> X
FRAME 3600 X I
TRANS I -> L
FRAME 3630 L L
NO dist change! -0.989344682643
TRANS L -> X
FRAME 3660 X I
FRAME 3690 I I
FRAME 3720 ? I
FRAME 3750 ? I
TRANS I -> L
FRAME 3780 L L
FRAME 3810 L L
FRAME 3840 L L
FRAME 3870 L L
NAVI-OFF 120.001953&lt;/pre>

&lt;div class='p'>Isabelle wrongly interpreted this image as &lt;b>turn left&lt;/b> (!) &amp;hellip; well, yeah &amp;hellip;
one pair looks like two consecutive strips on a left-turn circle. And then there is
a transition from circle to crossing, which should never happen, where the
drone picked the wrong line for navigation and started to fly loops in the opposite
direction.&lt;/div>

&lt;div class='p'>p.s. note that Heidi (the older sister, with several injuries from last year)
actually finished the 8-loop sooner, but in her last attempt
(&lt;i>meta_140311_191029.log&lt;/i>) she unexpectedly landed in the middle of the test.
What happened there? Low battery (39% at the start and 0% after landing) &amp;hellip;
OK, so now we know that if the battery is low, the drone will automatically land &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140314">&lt;/a>&lt;/div>

&lt;h2>March 14th, 2014 - cv2.contourArea(cnt, oriented=True)&lt;/h2>

&lt;div class='p'>VlastimilD wrote: &lt;i>the function cv2.contourArea has parameter "oriented",
which is by default False and if set to True, it returns negative area which
means the contour has the opposite orientation and that should be white on
black in this case. So if you don't want those, setting oriented to True should
filter them out in the following "if".&lt;/i> &amp;hellip; and he is absolutely right! Thanks
for the hint &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>For details see
&lt;a href='http://docs.opencv.org/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html#double%20contourArea%28InputArray%20contour,%20bool%20oriented%29' class='external'>OpenCV
documentation&lt;/a>.&lt;/div>

&lt;div class='p'>p.s. I am sorry for the misspelled name in
&lt;a href='https://github.com/robotika/heidi/commit/53710c104bc989729fbaad846524d3fdd2fdc84a' class='external'>the
commit&lt;/a>.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140316">&lt;/a>&lt;/div>

&lt;h2>March 16th, 2014 - MP4_360P_H264_360P_CODEC&lt;/h2>

&lt;div class='p'>We have decided to switch to video recording in MP4_360P_H264_360P_CODEC mode.
The previous mode was MP4_360P_H264_720P_CODEC, i.e. with higher resolution
(1280x720) at 30fps. Our main motivation was to get rid of split images &amp;mdash;
cases where you get a mix of an old and a new image (for details see the
&lt;a href='/competitions/robotchallenge/2014/incomplete.jpg'>mixed video frame&lt;/a>).&lt;/div>

&lt;div class='p'>Another motivation was to reduce the WiFi traffic and the size of the recorded
reference videos. So far so good. And yes, there was at least one surprise &lt;span class='smile'>&lt;/span>.
The 720P video was not only higher resolution, it also ran at 30fps, while the 360P
video runs at 15fps! For us this is actually good news, because the expected
size will not be 4 times smaller but 8 times (in reality the difference won't be that
big, because larger and more frequent pictures are easier to pack).&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140317">&lt;/a>&lt;/div>

&lt;h2>March 17th, 2014 - Scale Blindness&lt;/h2>

&lt;div class='p'>Have you ever experienced the feeling when a mistake is so big, so &lt;i>out of range&lt;/i>,
that you cannot see it? Here is the story: low resolution video does not work.
The navigation algorithm is much, much worse. Why? The error is too big to be
seen &amp;hellip; and it is already present in all the previous images &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>Just for comparison, here is one of the 360P images that do not work:
&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/sample-360p.jpg'>&lt;img src='/competitions/robotchallenge/2014/sample-360p_t.jpg' alt='Example of 360P image' title='Example of 360P image' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/sample-360p.jpg'>Example of 360P image&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Do you see now why
&lt;a href='http://docs.opencv.org/modules/imgproc/doc/miscellaneous_transformations.html' class='external'>
cv2.THRESH_OTSU&lt;/a> used to work fine before, but is not working any more? No?&lt;/div>

&lt;div class='p'>&lt;span class='smile'>&lt;/span> boring, right? The bottom-facing camera has QVGA (320x240) resolution,
according to the Parrot specification. That resolution is compatible neither
with 1280x720 nor with 640x360. But what is maybe not expected is that for the high
resolution the image is scaled 3 times (720=240*3) and the margin is filled with black
pixels, while for 640x360 it is scaled twice (640=320*2) and the &lt;b>overlapping
pixels are cut off!&lt;/b> The histogram of the high resolution video contained
(1280-960)*720 totally black pixels &amp;hellip;&lt;/div>
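
&lt;div class='p'>The arithmetic behind it, just to spell it out:&lt;/div>

&lt;pre># 720p stream: QVGA scaled 3x in height (240*3 = 720), width padded with black
black_pixels_720p = (1280 - 320 * 3) * 720   # 230400 black pixels per frame

# 360p stream: QVGA scaled 2x in width (320*2 = 640), extra rows cropped away
rows_lost_360p = 240 * 2 - 360               # 120 camera rows are cut off&lt;/pre>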

&lt;hr/>

&lt;div class='p'>&lt;a id="140318">&lt;/a>&lt;/div>

&lt;h2>March 18th, 2014 - Version 1 and MSER&lt;/h2>

&lt;div class='p'>It is still quite depressing &amp;mdash; we would like to reach 30 rounds and at the
moment we are happy if we complete 3! &lt;span class='smile'>&lt;/span> Well, it is called &lt;i>progress&lt;/i>, but
up to now Version 0 with its eight completed 8-loops was still better.&lt;/div>

&lt;div class='p'>What is the main difference between
&lt;a href='https://github.com/robotika/heidi/commit/11906701c4d90b465e2f4fb27ca2714433f927a7' class='external'>Version
0&lt;/a> and
&lt;a href='https://github.com/robotika/heidi/commit/aa583a6c4a144fe251d100b6f87271856cbe2e45' class='external'>Version
1&lt;/a>? First of all, it is the video input with the smaller resolution. After
yesterday's discovery we know that the image covers only 3/4 of the reality
visible in the high resolution video. This means that you see two complete strips
less frequently than before. There is a new
&lt;a href='https://github.com/robotika/heidi/blob/master/striploc.py' class='external'>class
StripsLocalisation&lt;/a>, which should compensate for this by also taking into account
the previous image and the movement of the drone between snapshots.&lt;/div>

&lt;div class='p'>There is not much time left &amp;mdash; only one week, so there is almost zero chance that we
will integrate Version 2, which would handle &lt;i>absolute localization&lt;/i> and
compensate for multiple failures &amp;hellip; oh well.&lt;/div>

&lt;div class='p'>There was good news at the end too: it is possible to use the
&lt;a href='http://en.wikipedia.org/wiki/Maximally_stable_extremal_regions' class='external'>MSER&lt;/a>
(Maximally Stable Extremal Regions) implementation in OpenCV for real-time image
processing. Isabelle was able to complete several rounds with this algorithm.
The video delay was comparable to a plain binary threshold set to 5% of the total
pixel count.&lt;/div>
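&lt;div class='p'>For the record, the MSER path is just a few lines on top of OpenCV. A minimal sketch
(assuming the OpenCV 2.4 Python API with the cv2.MSER class used in this project; not the exact
airrace.py code) looks roughly like this:&lt;/div>

&lt;pre># minimal sketch of MSER-based strip candidates, OpenCV 2.4 style (illustration only)
import cv2

g_mser = cv2.MSER( _delta = 10, _min_area=100, _max_area=300*50*2 )

def detect_regions(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return g_mser.detect(gray, None)   # list of point arrays, one per stable region&lt;/pre>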

&lt;hr/>

&lt;div class='p'>&lt;a id="140319">&lt;/a>&lt;/div>

&lt;h2>March 19th, 2014 - MSER and negative area&lt;/h2>

&lt;div class='p'>The trick with negative area does not work for MSER output (!). And by the way,
it is also the function &lt;i>cv2.contourArea(cnt, oriented=True)&lt;/i> that fails on
Linux:&lt;/div>

&lt;pre>OpenCV Error: Unsupported format or combination of formats (The matrix can not
be converted to point sequence because of inappropriate element type) in
cvPointSeqFromMat, file
/build/buildd/opencv-2.4.2+dfsg/modules/imgproc/src/utils.cpp, line 59
0
Traceback (most recent call last):
  File "airrace.py", line 180, in &lt;module>
    testPaVEVideo( filename )
  File "airrace.py", line 155, in testPaVEVideo
    print frameNumber( header )/15,  filterRectangles(processFrame( frame, debug=True ), minWidth=150/2)
  File "airrace.py", line 31, in processFrame
    area = cv2.contourArea(cnt, oriented=True)
cv2.error: /build/buildd/opencv-2.4.2+dfsg/modules/imgproc/src/utils.cpp:59:
error: (-210) The matrix can not be converted to point sequence because of
inappropriate element type in function cvPointSeqFromMat&lt;/pre>

&lt;div class='p'>It probably does not work due to a missing &lt;i>external contour&lt;/i>, but it
sometimes works, so I am not sure. If you want to play, &lt;a href='/competitions/robotchallenge/2014/mser-orig.jpg'>here&lt;/a>
is the original picture; uncomment the g_mser creation
(&lt;a href='https://github.com/robotika/heidi/blob/master/airrace.py#L27' class='external'>airrace.py:27&lt;/a>)
and you can see for yourself:&lt;/div>

&lt;pre>./airrace.py mser-orig.jpg&lt;/pre>

&lt;div class='p'>No, you won't see it! That's even more scary!! Forget about the oriented parameter
for MSER area!!! The stored JPG file is a little bit different, but you get
completely different results than from the original video (see the sign for 1412):&lt;/div>

&lt;pre>m:\hg\robotika.cz\htdocs\competitions\robotchallenge\2014>m:\git\heidi\airrace.py mser-orig.jpg
1412.5
-3237.5
-7165.5
-897.5
-1488.0
-1261.0
492.5
-6224.0&lt;/pre>

&lt;div class='p'>vs.&lt;/div>

&lt;pre>m:\git\heidi>airrace.py logs\video_rec_140318_195005.bin 62
62 -6079.5
-3189.0
-1648.5
-32.5
-1202.0
-4846.5
-369.5
[((406, 49), (124, 23), -61), ((346, 209), (132, 23), -77)]&lt;/pre>

&lt;div class='p'>&amp;hellip; I do not feel comfortable now :-(. For MSER just forget about area. It is
filtered out at the beginning anyway, right?&lt;/div>

&lt;pre>g_mser = cv2.MSER( _delta = 10, _min_area=100, _max_area=300*50*2 )&lt;/pre>

&lt;div class='p'>I should probably learn more about the parameters&amp;hellip;&lt;/div>
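&lt;div class='p'>For the record, this is my current reading of those parameters (standard MSER meanings;
I have not verified them against the OpenCV 2.4 sources, so take it as a note to self):&lt;/div>

&lt;pre># note to self on the MSER parameters (standard meanings, not verified against OpenCV 2.4)
import cv2

g_mser = cv2.MSER(
    _delta = 10,          # intensity step used when testing region stability
    _min_area = 100,      # regions with fewer pixels are dropped
    _max_area = 300*50*2  # regions with more pixels are dropped
)&lt;/pre>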

&lt;hr/>

&lt;div class='p'>&lt;a id="140323">&lt;/a>&lt;/div>

&lt;h2>March 23rd, 2014 - mosaic.py (5 days remaining)&lt;/h2>

&lt;div class='p'>A couple of days ago I wanted to play a little, so instead of reviewing more and
more test results, I tried to write &lt;a href='https://github.com/m3d/cv2-bits/blob/master/mosaic.py' class='external'>mosaic.py&lt;/a> &amp;mdash; a script for
collecting multiple images into one. The first plan was to feed the script with
a list of images only, and the rest would be automatic. Very naive given my
energy and mental level that evening. So I dropped it that night.&lt;/div>

&lt;div class='p'>Later I started to play with MCL (Monte-Carlo Localization) and one crucial
"detail" is: how big is the drone position estimation error? My only reference
is the list of collected images, so a &lt;i>mosaic.py&lt;/i> implementation had its
place even in this tight schedule &lt;span class='smile'>&lt;/span>. This time it was fully manual: press
the arrow keys to shift the new image and the q/a keys to rotate it. It worked.
But it was tedious work even for three samples. And it did not quite solve
the original task &amp;mdash; how big is the error?&lt;/div>

&lt;div class='p'>So I slightly changed the interface and instead of a list of images I am passing
in a CSV file containing filename, x, y, degAngle on every line. The script also
prints this on standard output when you do the correction, so you can use it
next time.&lt;/div>
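&lt;div class='p'>So the input now looks something like this; the loader below is my own sketch of what
the described format implies, not the actual mosaic.py code:&lt;/div>

&lt;pre># sketch of reading the mosaic input: filename, x, y, degAngle on every line
# (illustration only -- column order taken from the description above)
import csv

def load_poses(path):
    poses = []
    for filename, x, y, degAngle in csv.reader(open(path)):
        poses.append((filename, float(x), float(y), float(degAngle)))
    return poses

for filename, x, y, a in load_poses("mosaic.csv"):
    print filename, x, y, a&lt;/pre>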

&lt;div class='p'>Well, so far so good. The struggle came with &lt;a href='https://github.com/robotika/heidi/commit/aeeab3c500943f6bff91453fbc248758fc5abfa0' class='external'>exporting CSV from
airrace_drone.py&lt;/a>. The Y-coordinate on images is upside-down, rotation is not around
the center but around the top left corner, the camera is rotated -90 degrees relative to the
robot/drone pose &amp;hellip; and probably much more, like the image size depending on height or tilt
compensation.&lt;/div>

&lt;div class='p'>The result is still not worth presenting &amp;mdash; it is mainly for me, to write
the notes down, so I will realize where the mistake is &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 306px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/mosaic-bad.jpg'>&lt;img src='/competitions/robotchallenge/2014/mosaic-bad_t.jpg' alt='not very nice mosaic' title='not very nice mosaic' class='border'  width='300' height='200'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/mosaic-bad.jpg'>not very nice mosaic&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The images are added via &lt;i>cv2.add(img1,img2)&lt;/i>, so overlapping parts are
brighter.  The shift in the turn is probably due to scaling, but at the moment I
have no idea why the drone basically stops on the straight segment (white area
at the bottom). Maybe I should manually correct the mosaic, so you get an idea of
how nice/useful it could be. BTW the absolute error in direction is very small
(less than a degree), so the compass is great (&lt;b>if it works at all!&lt;/b> &amp;mdash; we should
get a replacement for Isabelle next week, ha ha ha).&lt;/div>

&lt;div class='p'>So here is the mosaic with shift correction only:
&lt;table class='image_panel center' style='width: 306px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/mosaic-corrected.jpg'>&lt;img src='/competitions/robotchallenge/2014/mosaic-corrected_t.jpg' alt='mosaic with shift correction' title='mosaic with shift correction' class='border'  width='300' height='200'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/mosaic-corrected.jpg'>mosaic with shift correction&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Every image is a single video I-frame, i.e. 1Hz, and approximately 15 seconds of
flight. The black dashed-line segments are 30cm long with 10cm spacing.&lt;/div>

&lt;div class='p'>And here is the mosaic corrected with angle (13 degrees correction for the last
image, i.e. 1 degree per frame):&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 306px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/mosaic-angle.jpg'>&lt;img src='/competitions/robotchallenge/2014/mosaic-angle_t.jpg' alt='mosaic with corrected angle' title='mosaic with corrected angle' class='border'  width='300' height='200'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/mosaic-angle.jpg'>mosaic with corrected angle&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Exercise for you &amp;mdash; how fast was Isabelle flying?&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140324">&lt;/a>&lt;/div>

&lt;h2>March 24th, 2014 - MSER and battery (4 days remaining)&lt;/h2>

&lt;div class='p'>Here are some results of today testing:&lt;/div>

&lt;ol>
&lt;li>MSER is superior to normal thresholding&lt;/li>

&lt;li>there is again a problem, like before with normal contours and negative area:
&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/mser-inv.jpg'>&lt;img src='/competitions/robotchallenge/2014/mser-inv_t.jpg' alt='Inverse MSER region' title='Inverse MSER region' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/mser-inv.jpg'>Inverse MSER region&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/li>

&lt;li>we finally completed 10 loops (see &lt;a href='http://youtu.be/ELtMDPGEXkw' class='external'>video&lt;/a>), but Heidi's battery died after 6 minutes&lt;/li>
&lt;/ol>

&lt;hr/>

&lt;div class='p'>&lt;a id="140325">&lt;/a>&lt;/div>

&lt;h2>March 25th, 2014 - Bermuda Triangle (3 days remaining)&lt;/h2>

&lt;div class='p'>My &lt;a href='http://en.wikipedia.org/wiki/Bermuda_Triangle' class='external'>Bermuda Triangle&lt;/a> is the
center area, where the lines are crossing and drones are losing their
navigation skills. The wrongly detected strip has the following properties:&lt;/div>

&lt;ul>
&lt;li>it is white&lt;/li>

&lt;li>it is on the image border&lt;/li>

&lt;li>it is a triangle&lt;/li>
&lt;/ul>

&lt;div class='p'>At first I wanted to attack the 3rd point, the triangle. The idea was: &lt;i>OK, if it is
a rectangle then the minimum fitting rectangle will have almost the same area, while
a triangle will cover only half of it&lt;/i>. Does it sound reasonable?&lt;/div>

&lt;div class='p'>I verified on the 6 minute video that I still get identical results as in the real
flight yesterday (sometimes these "stupid" tests are the most important ones)
and it was OK. So I added the area computation and print:&lt;/div>

&lt;pre>rect = cv2.minAreaRect(cnt)
area = cv2.contourArea(cnt)
print "AREA %.2f\t%d\t%d" % (area/float(rect[1][0]*rect[1][1]), area, rect[1][0]*rect[1][1])&lt;/pre>

&lt;div class='p'>Well, I expected the percentage of covered area, but &amp;hellip;&lt;/div>

&lt;pre>AREA 0.05       121     2482
AREA 1.60       4936    3084
AREA 1.26       5033    3983
AREA 0.35       253     731
AREA 1.93       1830    950
AREA 0.57       1366    2415
AREA 2.24       6819    3044&lt;/pre>

&lt;div class='p'>&amp;hellip; nice, isn't it &lt;span class='wink'>&lt;/span>. So what the hell is that?!&lt;/div>

&lt;div class='p'>It looks like it is just an unsorted array of points. Now it makes sense to
apply a convex hull, as in one of the MSER examples, before the contour is drawn. So I
am not looking for &lt;i>area&lt;/i>; all I need is &lt;i>len(cnt)&lt;/i>.&lt;/div>
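&lt;div class='p'>For illustration, the hull-based check could look roughly like this (my own sketch on top of
the OpenCV 2.4 API, not the exact airrace.py code):&lt;/div>

&lt;pre># sketch only -- not the exact airrace.py code; assumes the OpenCV 2.4 cv2.MSER API
import cv2

gray = cv2.imread('mser-orig.jpg', 0)
g_mser = cv2.MSER( _delta = 10, _min_area=100, _max_area=300*50*2 )
for cnt in g_mser.detect(gray, None):
    hull = cv2.convexHull(cnt.reshape(-1, 1, 2))   # wrap the unsorted point list first
    rect = cv2.minAreaRect(hull)
    area = cv2.contourArea(hull)
    print "AREA %.2f\t%d\t%d" % (area/float(rect[1][0]*rect[1][1]), area, rect[1][0]*rect[1][1])&lt;/pre>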

&lt;pre>AREA 0.90       2242    2482
AREA 0.91       2806    3084
AREA 0.88       3514    3983
AREA 0.86       626     731
AREA 0.83       787     950
AREA 0.90       2178    2415
AREA 0.93       2845    3044&lt;/pre>

&lt;div class='p'>That looks much better &lt;span class='smile'>&lt;/span>.&lt;/div>

&lt;div class='p'>Do you want to see the numbers for yesterday's image
(&lt;i>video_rec_140324_195834.bin&lt;/i>, frame 17)?&lt;/div>

&lt;pre>AREA 0.58       1438    2475
AREA 0.57       1843    3219
AREA 0.26       4041    15301
AREA 0.32       5284    16650
AREA 1.01       624     620
AREA 0.99       818     825
AREA 0.86       1586    1850
AREA 0.85       2104    2481
AREA 0.87       1320    1519
AREA 0.85       1673    1964
AREA 0.82       2269    2772
AREA 0.91       1602    1767
AREA 0.84       2361    2794
AREA 0.91       3016    3299&lt;/pre>

&lt;div class='p'>The first two are probably the detected triangles at different threshold levels. After
that there is a big crossing area and then reasonable strips. So 70% would maybe be
enough. And here is the result:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/mser-70-area.jpg'>&lt;img src='/competitions/robotchallenge/2014/mser-70-area_t.jpg' alt='Filtering 70% rectangle area' title='Filtering 70% rectangle area' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/mser-70-area.jpg'>Filtering 70% rectangle area&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>And now it is time for repeated video processing with the identity assert &amp;mdash; will
it fail? Yes, even on the very first picture :-(. I had to add a condition to skip the
first 10 frames (seconds) &amp;hellip; during take-off there are many random objects.
Then it went fine up to frame 218 &amp;hellip; hmm, but I do not see anything interesting
there (*). Time to &lt;i>re-log&lt;/i> the whole video and compare the text results.&lt;/div>

&lt;div class='p'>OK, there are 17 different results, frames 5, 920, 1055, 1220, 1835, 2180,
2480, 2810, 3470, 3800, 4460, 5405, 5420, 6035, 6185, 6350, 6530 (divide by
15 to get the I-frame order). Boring? Who told you that this was going to be fun?
&lt;span class='wink'>&lt;/span> Off topic &amp;mdash; I can pass an I-frame index to see/analyze only a particular
frame, but what should I do with these "times 15" numbers? A bad mathematician
and a lazy programmer end up with the following code:&lt;/div>

&lt;pre>testPaVEVideo( filename, onlyFrameNumber=int(eval(sys.argv[2])) )&lt;/pre>

&lt;div class='p'>The result looks OK, actually even better. It filters not only the bad triangles
but also slightly inflated rectangles &amp;hellip; so far so good. We will see during
tonight's tests. Here is
&lt;a href='https://github.com/robotika/heidi/commit/68015896e5fe5bb8616e95a17ec0788f69e1a966' class='external'>the
code diff&lt;/a>.&lt;/div>

&lt;div class='p'>(*) 218 was not a frame number. It was the number of updates between frames 905 and
920, so the first failing frame was in reality 920/15.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140326">&lt;/a>&lt;/div>

&lt;h2>March 26th, 2014 - Isabelle II (2 days remaining)&lt;/h2>

&lt;div class='p'>Believe it or not, we have a new drone now. It came yesterday as a "hot swap" for
Isabelle, whose compass was broken. And she has already flown several rounds &amp;hellip; thanks
Apple Store &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>The second most important progress is that MSER now also works on Jakub's
computer with Linux and OpenCV 2.4.2. The workaround is
&lt;a href='https://github.com/robotika/heidi/commit/9a5bb423411255dc304a99722abc7f1400155270' class='external'>here&lt;/a>
(I am sure there is some faster/proper way to do it directly in
&lt;a href='http://wiki.scipy.org/Tentative_NumPy_Tutorial' class='external'>NumPy&lt;/a>, but there is not much
time left for study/experiments).&lt;/div>

&lt;div class='p'>The first test went as always: Heidi took off, detected a strip, started to turn,
missed the transition to the straight segment and continued in a circle &amp;hellip; damn.
The second test went fine. This scenario has already repeated like 5(?) times
&amp;hellip;  and the magic is &amp;hellip; height. We added extra time to reach the 1.5m height
and since then (knock, knock, knock) it works fine. Here is the next
&lt;a href='https://github.com/robotika/heidi/commit/a6fcf5a26137d8d7460eee18a3a22484323ca9a2' class='external'>changeset&lt;/a>.&lt;/div>

&lt;div class='p'>There is a left/right tilt compensation correction
&lt;a href='https://github.com/robotika/heidi/commit/f6db1246dc64d1a382300c93aceceb92e7327818' class='external'>implemented
now&lt;/a>. Why? We had added strip filtering so that strips not only have to be spaced
by 40cm with a similar angle, but also that the Y-coordinate offset is smaller than 25cm.
But then we saw the drone following the line, and on one image it is tilted to
the right while on the next image (a second later) it is tilted to the left.
The difference was only 6 degrees, but at 1.5 meter height this means 15cm and
that already matters.&lt;/div>
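&lt;div class='p'>A quick sanity check of that number (my own back-of-the-envelope sketch): the apparent
ground shift of a down-looking camera is roughly the height times the tangent of the tilt.&lt;/div>

&lt;pre># back-of-the-envelope check: ground offset caused by drone tilt
import math

height = 1.5            # meters above the ground
tilt = math.radians(6)  # tilt difference between the two snapshots
print "offset: %.2f m" % (height * math.tan(tilt))   # ~0.16 m, roughly the 15 cm mentioned above&lt;/pre>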

&lt;div class='p'>Another "break through" (ha ha ha) came to mine mind on the bus on the way to
&lt;a href='http://czu.cz/' class='external'>CZU&lt;/a>. We should not follow the line exactly but with an
offset! &lt;span class='smile'>&lt;/span> That way we increase the chance to see transition &lt;i>line to
circle&lt;/i>. Here is
&lt;a href='https://github.com/robotika/heidi/commit/a1cc7daa36ff294536989ab986bb7b3541c4fd9a' class='external'>the
code&lt;/a>.&lt;/div>

&lt;div class='p'>OK, if you have read this far &amp;hellip; here is a
&lt;a href='http://youtu.be/_7LWMaFujtk' class='external'>video from testing&lt;/a> (not edited). You can see
there the whole team except me, plus one visitor.&lt;/div>

&lt;h3>Internal notes &amp;hellip;&lt;/h3>

&lt;div class='p'>Now it is time to review yesterday's log files. There are 7 of them + 500MB from
Jakub. The first two logs are without offset, then for speed 0.4m/s, 0.6m/s with
an increasing step of 0.05 per loop, 0.75m/s and 0.7m/s.&lt;/div>

&lt;div class='p'>I am not able to replay &lt;i>meta_140325_192727.log&lt;/i> &amp;hellip; my notes are not very
good &amp;hellip; I see that was the moment when we had speed 0.4m/s, step 0.05, BUT
there was a constant angle correction independent of desiredSpeed. OK, now the replay
works without an assert.&lt;/div>

&lt;div class='p'>The last log ends nicely:&lt;/div>

&lt;pre>BATTERY LOW! 8
!DANGER! - video delay 1.097775&lt;/pre>

&lt;div class='p'>&amp;hellip; and 5 seconds later it hit the column.&lt;/div>

&lt;div class='p'>Now, refactoring &amp;hellip; I promised to delete all unused code, sigh. It must be
done. Now. MCL &amp;mdash; gone. classifyPath &amp;mdash; gone. PATH_UNKNOWN and PATH_CROSSING
&amp;mdash; gone.
The &lt;a href='https://github.com/robotika/heidi/commit/57da8d9b4b635b33194e2f7d7b422df52b51bc8c' class='external'>diff&lt;/a>
now looks a bit scary. Hopefully three reference tests were enough for this
refactoring. There is one more round of tests tonight, so this was the last
chance.&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140327">&lt;/a>&lt;/div>

&lt;h2>March 27th, 2014 - Sign mutation (1 day remaining)&lt;/h2>

&lt;div class='p'>Are you also sometimes too lazy to think through a slightly more complicated
mathematical formula, so you just try it instead? And if it does not work, you
randomly change signs? I would call it &lt;i>genetic programming&lt;/i> (to complete
the evolution you can also add copy and paste to this process). And sometimes
it is not a good idea&amp;hellip;&lt;/div>

&lt;div class='p'>I suppose I had to use it for navigation along the right turn curve. It is
like the left turn, just with opposite signs, right? Some mutants are quite
healthy, and this one was for a relatively long time! Have a look at
&lt;a href='http://youtu.be/_7LWMaFujtk' class='external'>the video&lt;/a> again and closely watch the
right turns (1:01-1:16, 1:41-1:57). It is easier to hear it than to see it.
What is the problem? We expected some magnetic anomalies, but the truth is that
&lt;a href='https://github.com/robotika/heidi/commit/53a184899202f5ce529a776428a1b7c81c3b6223' class='external'>the
signs were not quite optimal&lt;/a>. I am not sure if I should be ashamed or happy
that we found the bug &lt;span class='wink'>&lt;/span>. Simply, &lt;i>the wrong side of the circle&lt;/i> was used
for the right/left angle correction &lt;span class='smile'>&lt;/span> (imagine a circle and its sides &lt;span class='smile'>&lt;/span> &amp;hellip; I
mean there is a place where your error is close to zero and there is also the
opposite place where you are close to the math.pi/-math.pi singularity). It
actually works (as you can see in the video), except that you steer with
maximum left, right, left, right &amp;hellip;&lt;/div>
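&lt;div class='p'>The usual cure for that singularity is to normalize the heading error into (-pi, pi] before
feeding it to the controller; something along these lines (a generic sketch, the helper name is mine,
not the fixed project code):&lt;/div>

&lt;pre># generic sketch: keep the heading error away from the +/-pi jump
import math

def normalizeAnglePIPI(angle):
    "Wrap an angle into the interval (-pi, pi]."
    while angle &lt;= -math.pi:
        angle += 2*math.pi
    while angle > math.pi:
        angle -= 2*math.pi
    return angle

# desired 170 deg, current -175 deg: the error is -15 degrees, not 345
print math.degrees(normalizeAnglePIPI(math.radians(170) - math.radians(-175)))&lt;/pre>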

&lt;div class='p'>OK, that was a good one. There are two more issues, which I would call
&lt;b>nightmares&lt;/b>. I know about them, but I am not quite sure what to do about
them:&lt;/div>

&lt;ol>
&lt;li>losing WiFi connection&lt;/li>

&lt;li>video delay&lt;/li>
&lt;/ol>

&lt;div class='p'>The first one looks like this:&lt;/div>

&lt;pre>Traceback (most recent call last):
  File "M:\git\heidi\airrace_drone.py", line 237, in &lt;module>
    launcher.launch( sys.argv, AirRaceDrone, competeAirRace )
  File "M:\git\heidi\launcher.py", line 82, in launch
    task( robotFactory( replayLog=replayLog, metaLog=metaLog, console=console ))
  File "M:\git\heidi\airrace_drone.py", line 215, in competeAirRace
    drone.moveXYZA( sx, sy, sz, sa )
  File "M:\git\heidi\ardrone2.py", line 633, in moveXYZA
    self.movePCMD( -vy, -vx, vz, -va )
  File "M:\git\heidi\ardrone2.py", line 539, in movePCMD
    self.update("AT*PCMD=%i,1,"+ "%d,%d,%d,%d"%(f2i(leftRight),f2i(frontBack),f2
i(upDown),f2i(turn)) + "\r")
  File "M:\git\heidi\airrace_drone.py", line 79, in update
    ARDrone2.update( self, cmd )
  File "M:\git\heidi\ardrone2.py", line 258, in update
    Pdata = self.io.update( cmd % self.lastSeq )
rocess Process-1:
Traceback (most recent call last):
  File "C:\Python27\lib\multiprocessing\process.py", line 258, in _bootstrap
  File "M:\git\heidi\ardrone2.py", line 113, in update
Process Process-2:
Traceback (most recent call last):
  File "C:\Python27\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.command.sendto(cmd, (HOST, COMMAND_PORT))
socket.error: [Errno 10065] A socket operation was attempted to an unreachable
host
    self.run()
  File "C:\Python27\lib\multiprocessing\process.py", line 114, in run
    self.run()
  File "C:\Python27\lib\multiprocessing\process.py", line 114, in run
     self._target(*self._args, * *self._kwargs)
   self._target(*self._args, * *self._kwargs)
  File "M:\git\heidi\video.py", line 30, in logVideoStream
     data = s.recv(10240)
e File "M:\git\heidi\video.py", line 30, in logVideoStream
rror: [Errno 10054] An existing connection was forcibly closed by the remote host
    data = s.recv(10240)
error: [Errno 10054] An existing connection was forcibly closed by the remote host&lt;/pre>

&lt;div class='p'>I wanted to somehow clean up the dump for you, and only then did I realize that it is
actually &lt;b>two crashing processes&lt;/b> with interlaced error output (*). One process
handles navdata and sends flight commands while the other does the video
processing. Both crashed on a socket operation, where &lt;i>An existing connection
was forcibly closed by the remote host&lt;/i>.&lt;/div>

&lt;div class='p'>This happens after several minutes of flight from my Win7 notebook. The drone
stops and hovers at the last position. The network is temporarily unavailable (in
the worst case the notebook can switch to another WiFi access point, if you are
not cautious enough). So what? Take a break and restart the whole program in
5 seconds? What about details like take-off, trimming the gyros, or the fact that
the first segment is not PATH_TURN_LEFT any more?&lt;/div>

&lt;div class='p'>The second nightmare looks like this:&lt;/div>

&lt;pre>NAVI-ON
!DANGER! - video delay 158.220822
FRAME 20 [-1.6 1.2] TRANS L -> I
I True 5
!DANGER! - video delay 158.392327
FRAME 22 [-1.6 1.2] I True 4
!DANGER! - video delay 158.393941
FRAME 24 [-1.6 1.2] I True 3
!DANGER! - video delay 158.490557
FRAME 26 [-1.6 1.2] SKIPPED2
I True 2
!DANGER! - video delay 158.642409&lt;/pre>

&lt;div class='p'>And here is the story of how it happened: I asked Jakub to read the configuration
files from Heidi and Isabelle II, because for some unknown reason Heidi is
still faster. Maybe more experienced? &lt;span class='wink'>&lt;/span> My old hacked program crashed after
the read, and it almost did not matter &amp;hellip;  except the recorded video was not
transferred, so the drone tried to do that in the next run.&lt;/div>

&lt;div class='p'>You can see similar reports from my tests last year. I.e. it is expected, but
we should be careful about it. It is quite hard to navigate from video that is more than 2
minutes old &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>(*) there are actually three failing processes: navdata, video and recorded video&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140328">&lt;/a>&lt;/div>

&lt;h2>March 28th, 2014 - WiFi Sentinel (0 days remaining)&lt;/h2>

&lt;div class='p'>I cannot sleep again (the clock says &lt;b>2:47am&lt;/b> now) :-(. Yes, the nightmare(s)
again. Yesterday I took a „drone day off”, but Jakub did some more tests with
WiFi failures. We had several iterations of
&lt;a href='https://github.com/robotika/heidi/blob/master/wifisentinel.py' class='external'>wifisentinel.py&lt;/a>,
which is a script for &lt;i>securing the WiFi connection&lt;/i>. You write the program arguments
as usual, except you put this wrapper in first place, for example:&lt;/div>

&lt;pre>python wifisentinel.py python airrace_drone.py test&lt;/pre>

&lt;div class='p'>The sentinel first verifies whether you are connected via one of three allowed IP
addresses. If yes, it starts the drone code; otherwise it waits 3 seconds.
When the drone code crashes, for example due to the &lt;i>Unreachable host&lt;/i> problem,
it again waits until the connection is re-established and runs the code again and
again, and again &amp;hellip;&lt;/div>

&lt;div class='p'>Probably the funniest bug was
&lt;a href='https://github.com/robotika/heidi/commit/6d59b82a06365389be641efd303bf976e26c32b9' class='external'>this
one&lt;/a>, when instead of the drone flying code it was calling itself again. Then there
were some issues with shell=True &amp;hellip; you can always put the shell name in
front, so now it is shell=False when using subprocess.call(). Finally, the
code for the &lt;i>Python ipconfig&lt;/i> is extremely ugly and fits only my and Jakub's
operating system &amp;mdash; I have no idea why he chose Czech Linux &lt;span class='wink'>&lt;/span>, where you
get strange console outputs like:&lt;/div>

&lt;pre>&amp;hellip;
inet adr:10.110.13.102  Všesměr:10.110.15.255 Maska:255.255.248.0
&amp;hellip;
AKTIVOVÁNO VŠESMĚROVÉ_VYSÍLÁNÍ BĚŽÍ MULTICAST  MTU:1500 Metrika:1
&amp;hellip;&lt;/pre>

&lt;div class='p'>If you know how to do this better, both on Linux and Windows, without the need to
install an extra package, let me know. I will change it (not today and not
tomorrow).&lt;/div>

&lt;div class='p'>The test went as follows:&lt;/div>

&lt;ol>
&lt;li>start drone program&lt;/li>

&lt;li>turn WiFi off during flight&lt;/li>

&lt;li>turn WiFi on again (after a couple of seconds)&lt;/li>
&lt;/ol>

&lt;div class='p'>The old code would crash immediately at step 2 and the drone would hover
in place.  Surprisingly, you get the same bad result also with the previous revision
of the new code &lt;span class='wink'>&lt;/span>. Yeah, testing is important. In that case the &lt;i>navdata&lt;/i>
communication channel crashed, but both video processes remained active! At
least I learned that there is a
&lt;a href='https://github.com/robotika/heidi/commit/a0e02554fe96240c02e159a02be2c665aa514cbd' class='external'>daemon
flag for multiprocessing&lt;/a> too. And then it went fine. Jakub repeated several
WiFi failures and Isabelle always recovered. &lt;span class='smile'>&lt;/span>&lt;/div>
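&lt;div class='p'>For reference, the daemon flag just means the child processes die together with the
parent instead of keeping it alive; a generic multiprocessing snippet (not the project diff):&lt;/div>

&lt;pre># generic multiprocessing snippet: daemonic children are killed when the parent exits
import multiprocessing

def worker():
    pass   # e.g. the video logging loop

p = multiprocessing.Process(target=worker)
p.daemon = True   # do not let this process outlive the main one
p.start()&lt;/pre>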

&lt;div class='p'>So what is the problem? There is still the &lt;b>video delay&lt;/b>. Say the drone recovers
within 15 seconds. During that period it is recording the area below it. It is
slightly moving, and it is still recording. Then it starts to fly and the log
file looks like:&lt;/div>

&lt;pre>NAVI-ON
USING VIRTUAL LEFT TURN CIRCLE!
!DANGER! - video delay 20.909436
FRAME 71 [-1.7 0.2] TRANS L -> I
I True 4
!DANGER! - video delay 20.406242
FRAME 72 [-1.7 0.2] I True 3
!DANGER! - video delay 19.874204
FRAME 73 [-1.7 0.2] I True 2
!DANGER! - video delay 19.346308
FRAME 74 [-1.7 0.2] I True 1
!DANGER! - video delay 18.818665
FRAME 75 [-1.7 0.2] I True 0
!DANGER! - video delay 18.260648&lt;/pre>

&lt;div class='p'>i.e. the delay is getting smaller and smaller as the drone downloads the
old video from the period without connection. But it also tries to fly based on
these video frames!!! Believe it or not, it could be OK for a couple of seconds until
you have to switch from line to circle navigation. It is using only the old
video as a reference, but it has no knowledge of how far it is on the 8-loop.&lt;/div>

&lt;div class='p'>So now the dilemma: change otherwise tested code or not? Was it just luck that
it worked fine yesterday, or should the drone rather stop and wait another
couple of seconds? It seems it is capable of downloading 2s of video within 1s,
so it decreases the video delay by a second on every image &amp;hellip; no, stupid me.
These are different times. The time difference per frame cannot be more than 1 second:
if the processing took 0.001s, then since the images are recorded at fixed times, every
second, the delay would decrease by 0.999s per frame, but it would take only 0.001s. So maybe
it is not that bad.&lt;/div>

&lt;div class='p'>Let's review this example: FRAME 71 has a 20.9s delay and FRAME 103 a 1.6s delay. So
32 frames and 19.3 seconds, 0.6s per frame, so it needs approximately 0.4s for
processing on Jakub's older notebook. It also means that it took 0.4*32=12.8s
in real time; at slow speed (0.4m/s) this means more than 5m! OK, I will change
the code NOW!! &lt;span class='wink'>&lt;/span>&lt;/div>
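&lt;div class='p'>Put as a formula (my own recap of the numbers above): with I-frames recorded every second,
each processed frame reduces the delay by one second minus the processing time, so:&lt;/div>

&lt;pre># recap of the catch-up arithmetic above (illustration only)
frames = 103 - 71                # 32 frames between the two log lines
delayDrop = 20.9 - 1.6           # seconds of delay recovered
tProc = 1.0 - delayDrop/frames   # ~0.4 s of processing per frame
realTime = frames * tProc        # ~12.8 s of real flight
print "processing %.2fs/frame, %.1fs real time, %.1fm at 0.4m/s" % (tProc, realTime, realTime*0.4)&lt;/pre>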

&lt;div class='p'>I am looking at the logs in detail and it is scary, see this picture:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/drone-reset.jpg'>&lt;img src='/competitions/robotchallenge/2014/drone-reset_t.jpg' alt='View after reset during flight' title='View after reset during flight' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/drone-reset.jpg'>View after reset during flight&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The default configuration is to look forward, and one of the first steps is to
switch to the down-pointing camera. But it will navigate based on detected strips
for a while &amp;hellip; I am not going to change this part anyway. So you can tell even
from the images that the drone was restarted.&lt;/div>

&lt;div class='p'>Code for
&lt;a href='https://github.com/robotika/heidi/commit/3b0db4781bd2f7ee6af083ff2ee76ddadfd74bd3' class='external'>The
Last Prague Test&lt;/a> is ready and I can go to sleep again &amp;hellip; it is 4:46am, so I can
at least try &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>p.s. Yesterday very important info came from Karim (Robot Challenge
organization). The &lt;a href='http://www.robotchallenge.org/schedule' class='external'>schedule&lt;/a> is
updated and &lt;i>Air Race --- Each robot has to pass the homologation in order to
be allowed to participate in the competition. Please don't wait till the last
minute during the homologation time. In the competition, each robot has
multiple attempts - the best one counts. Please don't wait till the last minute
with your runs.&lt;/i> I am waiting for confirmation, but in this case we could test
the STABLE (current) version and the RISKY (to be prepared during the drive to Vienna)
version &lt;span class='smile'>&lt;/span>. And we have two drones to crash! Now I have a reason to take the
spare indoor protective hull &lt;span class='wink'>&lt;/span>. &lt;b>Držte nám palce&lt;/b> (keep your fingers crossed for us)&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140329">&lt;/a>&lt;/div>

&lt;h2>March 29th, 2014 - Falling Down (11 rounds)&lt;/h2>

&lt;div class='p'>It is over now. Do you know the movie
&lt;a href='http://en.wikipedia.org/wiki/Falling_Down' class='external'>Falling Down&lt;/a>? Somehow it is the
first association I have now &lt;span class='wink'>&lt;/span>. But let's start from the beginning &amp;hellip;&lt;/div>

&lt;div class='p'>&amp;hellip; I just realized why the robot died. As always, an originally good idea
killed it &amp;hellip;&lt;/div>

&lt;div class='p'>Yesterday evening we studied last Friday's tests, including speed
profiles.  Here is the graph for the max desired speed set to 0.8m/s, dropping to
0.5m/s when no strip was detected and slowing down to 0.4m/s for transitions
between turns and straight lines:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/speed-profile.png'>&lt;img src='/competitions/robotchallenge/2014/speed-profile_t.png' alt='Desired and current speed' title='Desired and current speed' class='border'  width='320' height='176'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/speed-profile.png'>Desired and current speed&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The robot at the end failed due to large shadows wrongly detected as a strip. So
we also added a
&lt;a href='https://github.com/robotika/heidi/commit/ee0ed4049bea2996447440742946d8e5dd0d2785' class='external'>maxWidth
limit&lt;/a>. That was all we changed &amp;hellip; we were too afraid and tired to do any
other change.&lt;/div>

&lt;div class='p'>This morning we did some more changes. First of all we set the
&lt;a href='https://github.com/robotika/heidi/commit/b7f3081f19b111c4a4b5df49b4186da21bef43d7' class='external'>radius
to 1.25m&lt;/a>, because it was smaller in Vienna than in Prague. We did the first test
and there were problems with the video delay, the crossing, and Isabelle's height. So
we decided to
&lt;a href='https://github.com/robotika/heidi/commit/d093beef9c2dedeac0710f4d83b4adb77309e6c9' class='external'>increase
desiredHeight to 1.7m, limit the max allowed video delay to 2s and implement
crossing detection&lt;/a>. The crossing at Robot Challenge 2014 looked like this:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/vienna-crossing.jpg'>&lt;img src='/competitions/robotchallenge/2014/vienna-crossing_t.jpg' alt='Crossing in Vienna' title='Crossing in Vienna' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/vienna-crossing.jpg'>Crossing in Vienna&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>And that was it. That was
&lt;a href='https://github.com/robotika/heidi/commit/d093beef9c2dedeac0710f4d83b4adb77309e6c9' class='external'>the
final code&lt;/a>.&lt;/div>

&lt;div class='p'>How it went I will leave for tomorrow morning &amp;hellip; the hint why the robot crashed
after completing 11 loops is already mentioned &amp;hellip; but maybe not, as Jakub just
showed me the latest images.&lt;/div>

&lt;div class='p'>p.s. The &lt;a href='http://youtu.be/brrHXWxjO9k' class='external'>Air Race 2014 - Isabelle CULS video&lt;/a> is
finally uploaded &amp;hellip; again it is without censorship and editor's cuts, so sorry if
you hear some rude words &amp;hellip;&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140331">&lt;/a>&lt;/div>

&lt;h2>March 31st, 2014 - 3rd place&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/results-before.jpg'>&lt;img src='/competitions/robotchallenge/2014/results-before_t.jpg' alt='Results before Isabelle start' title='Results before Isabelle start' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/results-before.jpg'>Results before Isabelle start&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/judge-attact.jpg'>&lt;img src='/competitions/robotchallenge/2014/judge-attact_t.jpg' alt='Judge attack - end of Isabelle attempt' title='Judge attack - end of Isabelle attempt' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/judge-attact.jpg'>Judge attack - end of Isabelle attempt&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/airrace2014-3rd-place.jpg'>&lt;img src='/competitions/robotchallenge/2014/airrace2014-3rd-place_t.jpg' alt='3rd place' title='3rd place' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/airrace2014-3rd-place.jpg'>3rd place&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;iframe width="640" height="360" src="//www.youtube.com/embed/brrHXWxjO9k?feature=player_detailpage" frameborder="0" allowfullscreen>&lt;/iframe>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140401">&lt;/a>&lt;/div>

&lt;h2>April 1st, 2014 - Altitude, Pressure and Temperature&lt;/h2>

&lt;div class='p'>Yesterday I tried to investigate „the falling down” problem, which caused
problems with navigation and Isabelle's crash after the 11th loop was completed. Here
are some data for &lt;i>NAVDATA_ALTITUDE_TAG&lt;/i>:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/altitude.png'>&lt;img src='/competitions/robotchallenge/2014/altitude_t.png' alt='slowly falling down' title='slowly falling down' class='border'  width='320' height='185'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/altitude.png'>slowly falling down&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>It is a little bit hard to see, but the sonar (green) and vision (red) graphs
are almost identical. These are raw values in millimeters, if I remember
correctly. The deadly reference is the yellow one. Finally you can also see the blue
graph of the maximal rectangle size (0 for no rectangle detected); it was slowly
growing but did not reach the limit of 200. So Jakub was right and
we did not break it with the maxWidth filter set to 200.&lt;/div>

&lt;div class='p'>So why is the yellow reference so wrong? Well, first of all, using altitude
(&lt;i>drone.coord[2]&lt;/i>) was not a good idea. It is integrated over time also from
PWM and the pressure sensor. It has to compensate for terrain changes and detected
obstacles (I know that the floor was flat, but Isabelle maybe did not know it).
There was also a small breeze in the hall, which could have caused the first bigger
drop (??).&lt;/div>

&lt;div class='p'>The temperature was slowly increasing from 291 to 424 &amp;mdash; no idea about the
resolution and units, so I would guess it is 1/10th of a degree. Maybe it was
getting hot due to the heavier battery??&lt;/div>

&lt;div class='p'>Anyway &amp;mdash; temperature and pressure data are
&lt;a href='https://github.com/robotika/heidi/commit/a182219b29a52876ea16426bbbd9be293b2a802c' class='external'>parsed
now&lt;/a>, so you can use them in the ARDrone2 class if you want.&lt;/div>

&lt;div class='p'>I just replayed some older test (&lt;i>meta_140326_201259.log&lt;/i>) and the temperature
there was 515, dropping to 510 and then rising to 525 &amp;hellip; so I am not sure if
it means 52 degrees Celsius?! Could it be that if the drone is cold the
estimation is not very reliable, and we usually tested several times so the
drone warmed up?? I am now looking at test &lt;i>meta_140325_180914.log&lt;/i> and there
the reference estimate is lower than the readings from sonar and vision &amp;hellip; so I
would conclude: for Air Race do not use altitude, but only the sonar and/or vision
estimate of the distance from the ground.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 326px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotchallenge/2014/start-view.jpg'>&lt;img src='/competitions/robotchallenge/2014/start-view_t.jpg' alt='Start view' title='Start view' class='border'  width='320' height='180'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotchallenge/2014/start-view.jpg'>Start view&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>p.s. if there is a next time we will have to look forward &amp;mdash; the video
footage would be much more interesting than looking down &lt;span class='wink'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a id="140411">&lt;/a>&lt;/div>

&lt;h2>April 11th, 2014 - Conclusion&lt;/h2>

&lt;div class='p'>Originally, I wanted to write more info about the logs, WiFi failures, competitors
and plans for Air Race 2015. There is no time left and priorities shifted again
&amp;hellip; so I would like to thank the CZU team (mainly Jakub, who did many, many tests
over several weeks &amp;mdash; 9GB of log files).&lt;/div>

&lt;div class='p'>Did we beat last year's best semi-autonomous score? No, we did not. But the
winning team &lt;a href='http://www.youtube.com/watch?v=yAxX7VtpfeI' class='external'>Sirin&lt;/a>, with a
navigation strategy working only above the cross, could have done it. They had times
around 20s per loop, so 30 loops per 10 minutes. But they also had some
problems &amp;hellip; like everybody.&lt;/div>

&lt;div class='p'>And what about Air Race 2015? Two Russian teams mentioned that they plan to
build/use a new drone; in particular Alex mentioned the
&lt;a href='http://pixhawk.org/start' class='external'>PX4 platform&lt;/a>. There is also a plan to modify the
rules &amp;mdash; we will see. I am definitely voting for multiple 10 minute
attempts &lt;span class='smile'>&lt;/span>. That way we will have a chance to test the stable and risky versions,
and to speed up if the other competitors also speed up.&lt;/div>

&lt;div class='p'>p.s. if you would like to compete with your drone in a maze,
&lt;a href='http://www.robots.croc.ru/' class='external'>here&lt;/a> is another contest organized by a Russian
company &amp;hellip; but there is an age limit of 18 &amp;hellip; never mind &amp;hellip; it is the
lower limit &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/competitions/robotchallenge/2014/en#email'>contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Field Robot 2013</title>
	<link rel='alternate' href="http://localhost/competitions/fieldrobot/2013/en"/>
	<id>http://localhost/competitions/fieldrobot/2013/en</id>
	<updated>2012-10-23T00:00:00Z</updated>
	<author><name>Martin Dlouhý</name></author>
	<summary type='html'> The contest Field Robot Event 2013 will take place in Prague/Czech Republic for
the first time. The main organizer will be Czech University of Life Sciences
Prague (ČZU). The contest dates are 27th-29th June 2013, and the place will be
very close to university grounds, i.e. Kamycka 129, Prague 6 - Suchdol.
&lt;b>Registration extended until May 15th!&lt;/b>
 </summary>
	<content type='html'>  </content>
</entry>
<entry>
	<title>Robotour 2013</title>
	<link rel='alternate' href="http://localhost/competitions/robotour/2013/en"/>
	<id>http://localhost/competitions/robotour/2013/en</id>
	<updated>2012-11-08T00:00:00Z</updated>
	<author><name>Martin Dlouhý &amp; Zbyněk Winkler</name></author>
	<summary type='html'> The eighth year of the contest of autonomous robots will travel to Poland/Lodz and
it will take place on &lt;b>21st of September 2013&lt;/b> (!date change!). The basic rules remain the
same. The only changes are an extension of the time limit to 1 hour and, if the
robot reaches the goal (something nobody has achieved yet), then after a
signal it can get extra points for returning back to the start.
 </summary>
	<content type='html'> 
&lt;div class='p'>Rules in PDF format: &lt;a href='/competitions/robotour/2013/Robotour-rules.pdf'>English&lt;/a>, &lt;a href='/competitions/robotour/2013/Robotour-pravidla.pdf'>Czech&lt;/a>.&lt;/div>

&lt;hr/>

&lt;h1>Sponsors&lt;/h1>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 231px;'>&lt;tr>&lt;td>
&lt;a href='http://ni.com/'>&lt;img src='/competitions/robotour/2013/logo-ni.png' alt='' title='' class='border'  width='225' height='64'/>&lt;/a>&lt;br/>
	&lt;a href='http://ni.com/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 186px;'>&lt;tr>&lt;td>
&lt;a href='http://www.ieee.pl/'>&lt;img src='/competitions/robotour/2013/logo-ieee.png' alt='' title='' class='border'  width='180' height='52'/>&lt;/a>&lt;br/>
	&lt;a href='http://www.ieee.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 114px;'>&lt;tr>&lt;td>
&lt;a href='http://en.uml.lodz.pl/'>&lt;img src='/competitions/robotour/2013/logo-city-lodz.png' alt='' title='' class='border'  width='108' height='160'/>&lt;/a>&lt;br/>
	&lt;a href='http://en.uml.lodz.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 218px;'>&lt;tr>&lt;td>
&lt;a href='http://www.ericpol.pl/'>&lt;img src='/competitions/robotour/2013/logo-ericpol.png' alt='' title='' class='border'  width='212' height='64'/>&lt;/a>&lt;br/>
	&lt;a href='http://www.ericpol.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel ' style='width: 206px;'>&lt;tr>&lt;td>
&lt;a href='http://www.robonet.pl/'>&lt;img src='/competitions/robotour/2013/logo-robonet.png' alt='' title='' class='border'  width='200' height='53'/>&lt;/a>&lt;br/>
	&lt;a href='http://www.robonet.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel ' style='width: 243px;'>&lt;tr>&lt;td>
&lt;a href='http://www.kukarobotics.pl/'>&lt;img src='/competitions/robotour/2013/logo-kuka.png' alt='' title='' class='border'  width='237' height='41'/>&lt;/a>&lt;br/>
	&lt;a href='http://www.kukarobotics.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;hr style='clear: both;'/>&lt;/div>

&lt;h1>Patronage&lt;/h1>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 156px;'>&lt;tr>&lt;td>
&lt;a href='http://visegradfund.org/'>&lt;img src='/competitions/robotour/2013/logo-visegrad-fund.png' alt='' title='' class='border'  width='150' height='62'/>&lt;/a>&lt;br/>
	&lt;a href='http://visegradfund.org/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 87px;'>&lt;tr>&lt;td>
&lt;a href='http://p.lodz.pl/'>&lt;img src='/competitions/robotour/2013/logo-tul.png' alt='' title='' class='border'  width='81' height='127'/>&lt;/a>&lt;br/>
	&lt;a href='http://p.lodz.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel ' style='width: 183px;'>&lt;tr>&lt;td>
&lt;a href='http://weeia.p.lodz.pl/'>&lt;img src='/competitions/robotour/2013/logo-weeia.png' alt='' title='' class='border'  width='177' height='77'/>&lt;/a>&lt;br/>
	&lt;a href='http://weeia.p.lodz.pl/'>&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>
 </content>
</entry>
<entry>
	<title>Robotem rovně 2010</title>
	<link rel='alternate' href="http://localhost/competitions/robotem-rovne/2010/en"/>
	<id>http://localhost/competitions/robotem-rovne/2010/en</id>
	<updated>2011-03-09T00:00:00Z</updated>
	<author><name>Martin Dlouhý, translated by Martin Černý</name></author>
	<summary type='html'> The second year of "robotem rovně" (freely translated as go straight ahead
with robots) took place on May 22nd 2010 in Palackého sady, Písek. The event is
organized by Radioklub Písek, aiming to increase robotics awareness among the general
public and to attract youth to technology. Entrants compete in two
categories: robots and cars (robots without any sensors). The goal is to get as
far as you can on a straight path, but it’s not nearly as easy to do as it
appears to be.

 </summary>
	<content type='html'> 
&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/cogito-ambra.jpg'>&lt;img src='/competitions/robotem-rovne/2010/cogito-ambra_t.jpg' alt='Cogito MART, robot Ambra' title='Cogito MART, robot Ambra' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/cogito-ambra.jpg'>Cogito MART, robot Ambra&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h1>Rules&lt;/h1>

&lt;div class='p'>The goal of the competition: “To let the robot drive through the track without
any interaction with its owner or anyone else and to stay on the pathway at all
times. The road is about 3 meters wide and 300 meters long.” That’s just an
excerpt from the rules, which you can find at the
&lt;a href='http://www.kufr.cz/view.php?cisloclanku=2010020002' class='external'>Radioklub Pisek official
website&lt;/a>.&lt;/div>

&lt;div class='p'>The robots competed in three rounds in a given order, which was
determined by their velocity, measured at the homologation (a test of whether the robot can
pass at least a meter-long track). The faster robots went first and the slower ones
later, so they wouldn’t have to needlessly pass each other. The time
difference between the starts was two minutes, so they went relatively quickly
one after another.&lt;/div>

&lt;div class='p'>A point was given for every meter the robot successfully drove, and the total
from all the rounds was the final score.&lt;/div>

&lt;hr/>

&lt;h2>First impressions&lt;/h2>

&lt;div class='p'>When I saw that Radioklub Pisek had this competition last summer, I thought of it as
a rival to Robotour. But I see it way differently now – “Robotem rovně” (RR)
could be an ideal preparation race for Robotour. RR gives a chance to complete
beginners and, mainly, lots of experience that you can use at Robotour. It was
not a coincidence that many teams from Robotour came to this event to dust
off their robots and train.&lt;/div>

&lt;div class='p'>The event was very well organized and you could see that this wasn’t the first
one that Radioklub Pisek has held. Even the relatively fast tempo of the
competition was acceptable – observers weren’t bored, something was going on
all the time, and the competitors also had a good time over those three rounds.&lt;/div>

&lt;h2>Competition difficulty&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/cam100522_095612_116.jpg'>&lt;img src='/competitions/robotem-rovne/2010/cam100522_095612_116_t.jpg' alt='The result of odometry' title='The result of odometry' class='border'  width='220' height='176'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/cam100522_095612_116.jpg'>The result of odometry&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Even though it doesn’t seem so at first sight, the competition difficulty is
pretty high. To be more precise, the further you want to go, the harder it is.
Almost all the cars and robots make it to 10 meters. The width to length ratio of
3m to 10m gives you about a 17 degree range, so it’s no real problem to point
your vehicle that way.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/google-gps.jpg'>&lt;img src='/competitions/robotem-rovne/2010/google-gps_t.jpg' alt='First round from the GPS’s point of view' title='First round from the GPS’s point of view' class='border'  width='220' height='160'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/google-gps.jpg'>First round from the GPS’s point of view&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>If your ambitions are a little bit higher, you need a bit of luck, or some
sensors. If you were thinking about GPS, you can give up already. There are
some beautiful, huge trees in the park and even though you have a signal from up
to 9 satellites, the error isn’t going to be below 5 meters. I think that a
demonstration sample from NMEA on Google maps (see pic.) will give you an idea.
Needless to say, the worst GPS position was at the start (later it at least led
the same way as the road), but you can forget about telling which side of
the road you are on. After the import to OpenStreetMap, you can clearly see the
uncertainty level (circles).&lt;/div>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/josm-gps.jpg'>&lt;img src='/competitions/robotem-rovne/2010/josm-gps_t.jpg' alt='A GPS error in JOSM' title='A GPS error in JOSM' class='border'  width='220' height='160'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/josm-gps.jpg'>A GPS error in JOSM&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>Probably the next sensor that comes to your mind is a compass. That will
propel you way further, but some surprises will show up as well. Firstly, you’ll
most likely have just a 2D compass, because they’re cheaper and more available.
That wouldn’t matter much, but you have to know what tilting does to the
sensor. The road starts sloping downwards and, even if your compass can handle
this, it is, like any other road, slightly tilted to one side because of the rain.&lt;/div>

&lt;div class='p'>Aside from the tilt of the compass, which can be compensated for, pay attention to the
cables and sewers hidden underneath the road; they will turn your compass
into a roulette. So you can use one, but it is best combined with
odometry and gyroscopes.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/cam100522_103808_000.jpg'>&lt;img src='/competitions/robotem-rovne/2010/cam100522_103808_000_t.jpg' alt='A look from the camera at the start' title='A look from the camera at the start' class='border'  width='220' height='176'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/cam100522_103808_000.jpg'>A look from the camera at the start&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The only remaining thing from the commonly used sensors is a camera. The race
consisted of three rounds, and each one of them was unique. The weather at 10
a.m. was cloudy, at 11 a.m. rainy and at noon the sun was shining sharply.
How Radioklub Pisek manages that, so you can train in all possible weather
conditions, I don’t know &lt;span class='wink'>&lt;/span>.&lt;/div>

&lt;div class='p'>About 100 meters from the start, there’s a trap for the camera too. You will arrive
at a small, open atrium, and now search, kiddo. There was a little ambiguity in
the rules about whether the robot can use the side ways: a path is a path, but I guess
the organizers meant for you to stay on the main path. Well, it isn’t simple.&lt;/div>

&lt;div class='p'>But proof that you can reach the end was brought by Roboauto, which got the
maximum possible score. The road was 300m last year, but this time it was cut
down to 200m because of excavation works.&lt;/div>

&lt;h2>Results&lt;/h2>

&lt;div class='p'>The table of results is taken from the article
&lt;a href='http://www.kufr.cz/view.php?nazevclanku=robotem-rovne-2010-uspesne-za-nami&amp;amp;cisloclanku=2010050003' class='external'>robotem-rovne-2010-uspesne-za-nami&lt;/a>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 315px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/results.png'>&lt;img src='/competitions/robotem-rovne/2010/results_t.png' alt='Results' title='Results' class='border'  width='309' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/results.png'>Results&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;h1>R1 &amp;mdash; Quido&lt;/h1>

&lt;h3>Tomáš Ondráček/Roboauto&lt;/h3>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/roboauto.jpg'>&lt;img src='/competitions/robotem-rovne/2010/roboauto_t.jpg' alt='Roboauto' title='Roboauto' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/roboauto.jpg'>Roboauto&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>We took the competition in Písek as an opportunity to test our new platform:
the robot Quido, based on an RC model chassis. It used to be a second car
following a target on a lead, “Karlík”, but we have upgraded it to work
autonomously.&lt;/div>

&lt;div class='p'>Through cooperation with FIT VUT in Brno, our team received a SICK LMS 100 lidar for
the Sick Robot Day 2010 competition. We attached it to Quido not statically,
but on a turning hinge, so that it could turn during the ride and cover most
of the area this way. Besides a camera, we also used an odometer and a
compass. We turned off the GPS for the race, because it wasn’t necessary and it
would probably just complicate everything. For the same reason, the front
sonar was turned off.&lt;/div>

&lt;div class='p'>The driving software was the same as we used at Robotour. The lidar and camera took
care of path detection, and the information was aggregated into a 2D map, in
which the robot planned its way using a reactive algorithm. No global map
was necessary, considering the spirit of the competition.&lt;/div>

&lt;div class='p'>To keep the robot from turning an undesired way at crossroads and on large paved
areas, we added an option to set a preferred stable direction in the path
planner (which otherwise changes dynamically with the position in the RNDF
map).&lt;/div>

&lt;div class='p'>On the Friday night before the competition we arrived at the park in Písek with
the intention to try the road once and to “fine-tune constants” from the
recording the whole night long. To our surprise Quido went there and back the whole
way without any hesitation, so we could go check out the city square and some local
restaurants :). On Saturday the same thing repeated, Quido had no malfunctions,
so we ended up getting the maximum number of points.&lt;/div>

&lt;div class='p'>The competition is, in my opinion, a good preparation for this year’s Robotour
in autumn and can be a nice first public test for beginning roboticists.
Radioklub Pisek did a great job organizing it this year, took care of nice
weather, and a big thanks belongs to them for that.&lt;/div>

&lt;hr/>

&lt;h1>R3 &amp;mdash; Irena&lt;/h1>

&lt;h3>Kamil Řezáč/Sirael&lt;/h3>

&lt;div class='p'>It might sound defeatist, but I didn’t come to Písek to win. I took the competition
primarily as something that would force me to finish the robot in time. That I managed,
with a couple of compromises, but I had no time to put together any advanced sensors or
algorithms. So I decided on a very simple algorithm: control the driving unit (the
steering of the front wheels) according to the compass, basically a P regulator. To
avoid problems with the absolute azimuth value, errors caused by compass tilt and so
on, I sampled and averaged the compass readings right after the start (50 samples) and
drove by that value.&lt;/div>
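
&lt;div class='p'>For illustration, a minimal sketch of such a compass-hold P regulator follows. All
names, units and gain values are made up for the example; the actual robot code was not
published.&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

/* hypothetical hardware hooks, not part of the real robot */
extern int16_t read_compass_deg(void);    /* current azimuth, 0..359 degrees */
extern void    set_steering(int16_t v);   /* front wheel servo command       */

#define SAMPLES 50   /* compass readings averaged right after the start */
#define KP      2    /* proportional gain; lowered after the oscillating first round */

static int16_t target_azimuth;

/* average the first readings after the start and keep the result as the reference */
void capture_reference(void) {
    int32_t sum = 0;
    for (uint8_t i = 0; i != SAMPLES; ++i)
        sum += read_compass_deg();
    target_azimuth = (int16_t)(sum / SAMPLES);
}

/* one control step: steer the front wheels proportionally to the heading error */
void control_step(void) {
    int16_t error = read_compass_deg() - target_azimuth;
    if (error > 180)  error -= 360;       /* wrap the error into -180..179 */
    if (error &amp;lt; -180) error += 360;
    set_steering(-KP * error);
}
&lt;/pre>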

&lt;div class='p'>Originally I planned not to change the algorithm (which had been tested one evening
on the sidewalk in front of my house) at all. That didn’t go very well; the control
loop oscillated more and more. Despite that, I started the first round with the
original program, and the poor result (25 m) matched. So I reduced the gain of the
control loop, the robot calmed down and the distance covered almost doubled (42 m).
Because I saw the robot tending to drift slightly to the right, I biased it slightly to
the left for the last round, and the result was another increase in score (80 m). The
robot could have gone a couple of meters farther, but its run was cut short by a
collision with the 80 m sign&amp;hellip;&lt;/div>

&lt;div class='p'>I think this is about the best (or close to it) that a “blind” robot can do, and it
surpassed my expectations by far (the best result in testing was 42 m).&lt;/div>

&lt;hr/>

&lt;h1>R4 &amp;mdash; Eduro Maxi&lt;/h1>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/eduro-maxi.jpg'>&lt;img src='/competitions/robotem-rovne/2010/eduro-maxi_t.jpg' alt='Eduro Maxi' title='Eduro Maxi' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/eduro-maxi.jpg'>Eduro Maxi&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h3>Martin Dlouhý/Eduro Team&lt;/h3>

&lt;div class='p'>This was the first outdoor competition of the year for “Maxík”. It is still a baby:
it drove its first meter in the kitchen on Monday, a couple more meters after the
bottom cover was closed on Wednesday, and finally on Saturday it went to its first
race.&lt;/div>

&lt;div class='p'>The robot was homologated with version 0: go straight using odometry. Version 1 then
added the compass, and version 2 added data from a camera and GPS. In the end the robot
ran a customized version 2: it drove straight for 1 m, combined the data from the camera
and the compass, turned the right way, and went straight for another meter. Not fast,
but you could easily see when the sensors started to sense that the robot was heading
off the road.&lt;/div>

&lt;div class='p'>The “Robotem rovně” was a great competition. It pushed us to get the robot
working, so it will hopefully have a chance to score some points at
&lt;a href='/competitions/fieldrobot/en'>Field Robot Event&lt;/a>. A week after that is
&lt;a href='http://www.vosrk.cz/roboorienteering' class='external'>RoboOrienteering&lt;/a>, and well,
&lt;a href='/competitions/robotour/en'>Robotour 2010&lt;/a> in Bratislava in autumn and
&lt;a href='http://www.sick.com/group/EN/home/pr/events/robot_day/Pages/Robot_day_2010.aspx' class='external'>SICK
Robot Day&lt;/a> in Germany. Once again, huge thanks belong to the organizers for a
great event.&lt;/div>

&lt;hr/>

&lt;h1>R7 &amp;mdash; ARbot&lt;/h1>

&lt;h3>Aleš Ruda/ARbot&lt;/h3>

&lt;div class='p'>The robot went through a big overhaul after last year’s Robotour: a new, more
accurate GPS, an AHRS instead of a compass, new optics on the camera, parts of the
software rewritten in assembly, a new barrel holder and a lot of small things.
Unfortunately, the MD23 motor controllers started to fail. I eventually found out that
the cause was high voltage spikes on the 5V supply rail. Two months of suffering ended
at the beginning of May, when I upgraded to the MD25. The day before the competition
the robot started moving again, some bugs in the software were fixed, and it even
managed a winding path toward my house, so “Let’s go!” to Písek in the morning.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/arbot.jpg'>&lt;img src='/competitions/robotem-rovne/2010/arbot_t.jpg' alt='A view from the robot' title='A view from the robot' class='border'  width='220' height='176'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/arbot.jpg'>A view from the robot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>With little time, there was no room for heroics. The strategy was that a simple
camera would keep the robot on the path and the sonar would help with obstacles. It
worked at Robotour, after all.&lt;/div>

&lt;div class='p'>A quick test of the robot in the local conditions and off to the start of the first
round. 3, 2, 1, start. The robot set off, but instead of going straight along the path
it weaved “drunkenly” from side to side; still, it made it to the end. Greeeeat. What
was with the weaving, though? Oh, I had forgotten to take off the camera lens
cover.&lt;/div>

&lt;div class='p'>Between rounds I tried to get it going with the camera, but the results weren’t
good; the robot left the road very quickly and I couldn’t fix the road detection
algorithm. I also tried to make the AHRS work, without much success. So, if it worked
once, why not twice more? The robot would drive by sonar. The next rounds: 90 and 184
meters. Good job. I wouldn’t have believed it if someone had told me, but it’s true; I
saw it with my own two eyes. &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;div class='p'>I think this competition was a great start to the outdoor robot season. Thanks to
the organizers for their attitude.&lt;/div>

&lt;hr/>

&lt;h1>R9 &amp;mdash; Brimstone&lt;/h1>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/brimstone.jpg'>&lt;img src='/competitions/robotem-rovne/2010/brimstone_t.jpg' alt='Žlutásek' title='Žlutásek' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/brimstone.jpg'>Žlutásek&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h3>Michal Eliáš and Monika Svědirohová&lt;/h3>

&lt;div class='p'>The “Robotem rovně” competition was our first experience with outdoor navigation.
The chassis of the robot is Monika’s
&lt;a href='/articles/brimstone/en'>Žluťásek&lt;/a>; the algorithms and track
planning were left to me. The robot drove using a convolution principle applied to the
webcam image; no other sensors were used. The principle is to drive through the park
manually while the camera takes pictures at a given rate; the pictures are not stored
whole, only as vertical columns of information computed from each image. At the
competition the robot then takes pictures at the same rate and compares them with the
recorded ones (roughly sketched after this paragraph). The method is very demanding on
memory and processing power; it needs at least an ARM7-class processor. We managed 152
meters, although in testing we had always covered the whole track. The reason for not
finishing was the algorithm’s sensitivity to spectators standing along the track. We
came fifth in the end, which is a great success considering that the first test of the
algorithm took place in Písek.&lt;/div>
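
&lt;div class='p'>The exact implementation was not published, but the idea can be sketched roughly as
follows: during the manual teach drive each frame is reduced to one value per image
column; during the run the live column vector is compared with the recorded one, and
the horizontal shift that matches best suggests a steering correction. Everything below
(names, the column signature, the similarity measure) is an illustrative assumption,
not the team’s code.&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>
#include &amp;lt;stdlib.h>

#define COLS 160   /* number of column signatures kept per frame (assumed) */

/* reduce a grayscale frame to one value per column, here simply the column mean */
void frame_to_columns(const uint8_t *image, int width, int height, uint8_t *cols) {
    for (int c = 0; c != COLS; ++c) {
        long sum = 0;
        int x = c * width / COLS;
        for (int y = 0; y != height; ++y)
            sum += image[y * width + x];
        cols[c] = (uint8_t)(sum / height);
    }
}

/* sum of absolute differences between live and recorded columns for a given shift */
static long sad(const uint8_t *live, const uint8_t *rec, int shift) {
    long s = 0;
    for (int c = 0; c != COLS; ++c) {
        int rc = c + shift;
        if (rc >= 0 &amp;amp;&amp;amp; rc &amp;lt; COLS)
            s += labs((long)live[c] - (long)rec[rc]);
    }
    return s;
}

/* shift (in columns) that best aligns the live frame with the recorded one;
   the sign of the result can then be mapped to a steering correction */
int best_shift(const uint8_t *live, const uint8_t *rec) {
    int  best = 0;
    long best_err = sad(live, rec, 0);
    for (int shift = -20; shift != 21; ++shift) {
        long e = sad(live, rec, shift);
        if (e &amp;lt; best_err) { best_err = e; best = shift; }
    }
    return best;
}
&lt;/pre>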

&lt;hr/>

&lt;h1>R11 &amp;mdash; Ambra, R13 &amp;mdash; UTrooper&lt;/h1>

&lt;h3>Jiří Iša/Cogito MART&lt;/h3>

&lt;div class='p'>Our faculty (MFF UK) bought a new outdoor robotic chassis this year: the UTrooper.
The goal for Písek’s “Robotem rovně” (which we achieved) was to test it in a competition
and to discover some of its qualities and flaws. Flaws made up the larger share, but
that is to be expected with a new, untested chassis.&lt;/div>

&lt;div class='p'>The other robot taking part in the competition was Ambra, a customized RC Hummer
originally built for Robotour 2009. Ambra went through a major hardware change this
year, so this was basically the premiere of a new robot. This robot, too, revealed a
nice flaw after two fairly successful rounds. Altogether, though, Ambra drove pretty
well and was even voted second runner-up by the spectators!&lt;/div>

&lt;div class='p'>We recommend the “Robotem rovně” competition to teams that find Robotour too
difficult, but also to teams with bigger ambitions that have a new robot. Thanks to the
organizers for a well-run event in beautiful surroundings &lt;span class='smile'>&lt;/span>&lt;/div>

&lt;hr/>

&lt;h2>Final words&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/competitions/robotem-rovne/2010/roboti.jpg'>&lt;img src='/competitions/robotem-rovne/2010/roboti_t.jpg' alt='Robots' title='Robots' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/competitions/robotem-rovne/2010/roboti.jpg'>Robots&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>We warmly recommend the “Robotem rovně”, a.k.a. “Autíčka v parku” (Little cars in
the park), competition to everybody. Besides being a good event, it is a great place to
meet robotics and roboticists, and the whole afternoon is free, so there is time to
chat. If you missed this event, the next contest for outdoor autonomous robots is
&lt;a href='http://www.vosrk.cz/roboorienteering' class='external'>RoboOrienteering&lt;/a> in Rychnov nad
Kněžnou. Finally, if you’re from Písek or its surroundings and robots are among your
interests or hobbies, don’t be shy to contact
&lt;a href='http://www.kufr.cz/' class='external'>Radioklub Písek&lt;/a>; they’ll be glad to help you or take
you into their team.&lt;/div>

&lt;h2>References:&lt;/h2>

&lt;ul>
&lt;li>Organizers website: &lt;a href='http://www.kufr.cz/' class='external'>http://www.kufr.cz/&lt;/a>&lt;/li>
&lt;/ul>

&lt;div class='p'>&lt;a href='/competitions/robotem-rovne/2010/en#email'>Contact form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>PocketBot</title>
	<link rel='alternate' href="http://localhost/robots/pocketbot/en"/>
	<id>http://localhost/robots/pocketbot/en</id>
	<updated>2010-05-06T00:00:00Z</updated>
	<author><name>Ondřej Staněk</name></author>
	<summary type='html'> The PocketBot project consists of three parts. The key part is the
robot itself, a tiny line-following vehicle the size of a matchbox. The robot is
supported by a USB communication device and a PC control application. Together, these
three parts form a complete solution to the line-following task. Each part of the
project is described in this article.

 </summary>
	<content type='html'> 
&lt;div class='p'>&lt;table class='image_panel right' style='width: 327px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/top-view.jpg'>&lt;img src='/robots/pocketbot/top-view_t.jpg' alt='PocketBot' title='PocketBot' class='border'  width='321' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/top-view.jpg'>PocketBot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;ul>
&lt;li>dimensions: 48 × 32 × 12 mm&lt;/li>

&lt;li>weight: 19 g (body 13g, cells 6g)&lt;/li>

&lt;li>speed: 0.35 m/s (line following), 0.6 m/s (maximum)&lt;/li>
&lt;/ul>

&lt;div class='p'>The robot was primarily designed to fit into a matchbox. A homemade double-sided
printed circuit board also serves as the robot’s chassis. The robot is powered by two
rechargeable lithium-ion button cells wired in parallel (3.6 V, 40 mAh each). An Atmel
ATmega8 microcontroller runs the robot’s program, which is written in C. An 8-pin
connector provides ISP and UART interfaces for programming and debugging,
respectively.&lt;/div>

&lt;h2>Undercarriage&lt;/h2>

&lt;div class='p'>Two separately driven wheels (8mm diameter) provide differential steering. The
dimensions of the gear mechanism were crucial due to considerable space
constraints. Fortunately, I met Josef Vandělík who designed and manufactured
the wheelframe for my robot. [3] The wheelframe employs a friction gear system
with magnetic pressure. A neodymium magnet in the central tube attracts wheel
axles, pressing each wheel to the motor shaft.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 266px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/pocketbot-top.jpg'>&lt;img src='/robots/pocketbot/pocketbot-top_t.jpg' alt='Top View' title='Top View' class='border'  width='260' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/pocketbot-top.jpg'>Top View&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Line following&lt;/h2>

&lt;div class='p'>The robot is capable of line following; that is, it can follow a black guiding line
marked on a bright surface. There may be line crossings on the track; in that case the
robot continues straight ahead. The robot can also deal with obstacles on the way: if
an obstacle is detected, the robot reverses and continues backwards.&lt;/div>

&lt;div class='p'>Finally, the robot can find a guiding line in an unknown environment. When there is
no line under the robot, it starts searching along a spiral trajectory until it crosses
a guiding line.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 300px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/pocketbot-bottom.jpg'>&lt;img src='/robots/pocketbot/pocketbot-bottom_t.jpg' alt='Bottom View' title='Bottom View' class='border'  width='294' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/pocketbot-bottom.jpg'>Bottom View&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Line sensor module&lt;/h2>

&lt;div class='p'>The guiding line is marked with black PVC insulation tape. This material does not
reflect infrared light, so the line is easy to distinguish from the reflective
surrounding surface.&lt;/div>

&lt;div class='p'>The sensor module consists of 3 detectors and 4 emitters. The emitters (infra-red
LEDs) and detectors (phototransistors) are placed alternately in a row, so that each
phototransistor is surrounded by two IR LEDs. Thanks to this design it is possible to
measure the surface reflectivity at six spots under the sensor module using only three
phototransistors and four IR LEDs. This approach reduces the number of components and
ADC inputs required for a line sensor module, which is desirable given the dimension
constraints.&lt;/div>

&lt;div class='p'>The illustration below shows how this method works:&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 728px;'>&lt;tr>&lt;td>
&lt;span>&lt;img src='/robots/pocketbot/sensor-module.jpg' alt='' title='' class='border'  width='722' height='182'/>&lt;/span>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>LED1 emits infrared light that is reflected onto the phototransistor T1; thus the
light reflectivity at point 1 is measured.&lt;/div>

&lt;div class='p'>Then LED1 is turned off and LED2 starts emitting IR light. The phototransistor
T1 now measures the light reflectivity at point 2.&lt;/div>

&lt;div class='p'>In a real application (especially when a high refresh rate is desired) the
characteristics of the IR components must be taken into account. Due to reaction delays
(latency), the sensor data may be biased if the reflectivity at point 2 is measured
immediately after the reflectivity at point 1. To prevent this, I analysed the sensor
characteristics on an oscilloscope and then modified the scanning sequence so that the
measurements do not interfere with each other.&lt;/div>
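
&lt;div class='p'>A simplified version of such an interleaved scan might look like the code below. The
pin mapping, the ADC helper and the chosen order are illustrative, not the actual
PocketBot firmware; the point is only that two consecutive measurements never share a
phototransistor.&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

/* illustrative hardware hooks (not the actual firmware API) */
extern void     ir_led_on(uint8_t led);        /* LED1..LED4 -> index 0..3   */
extern void     ir_led_off(uint8_t led);
extern uint16_t adc_read(uint8_t channel);     /* T1..T3 -> ADC channel 0..2 */

/* spot s (0..5) is lit by LED (s+1)/2 and read by phototransistor s/2:
   LED1-T1, LED2-T1, LED2-T2, LED3-T2, LED3-T3, LED4-T3 */
static const uint8_t scan_order[6] = {0, 2, 4, 1, 3, 5};  /* never the same
   phototransistor twice in a row, so one spot does not bias the next */

void scan_spots(uint16_t result[6]) {
    for (uint8_t i = 0; i != 6; ++i) {
        uint8_t s   = scan_order[i];
        uint8_t led = (uint8_t)((s + 1) / 2);
        uint8_t pt  = (uint8_t)(s / 2);
        ir_led_on(led);
        result[s] = adc_read(pt);   /* reflectivity with the LED on */
        ir_led_off(led);
    }
}
&lt;/pre>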

&lt;h2>Ambient light suppression&lt;/h2>

&lt;div class='p'>Because light conditions vary with time and place, an ambient light suppression
algorithm is necessary for the sensors to work properly. The method is simple: every
sensor makes two measurements. First it measures the amount of ambient light; then it
turns its infra-red LED on and measures again. By subtracting the two values, the bias
caused by ambient light is suppressed.&lt;/div>
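
&lt;div class='p'>In code the idea reduces to a differential measurement; a sketch with the same
illustrative helper names as above:&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

extern void     ir_led_on(uint8_t led);
extern void     ir_led_off(uint8_t led);
extern uint16_t adc_read(uint8_t channel);

/* measure one spot with ambient light suppressed: sample the ambient level first,
   then the level with the IR LED on, and return the difference so that only the
   reflected IR light remains */
int16_t measure_spot(uint8_t led, uint8_t phototransistor) {
    uint16_t ambient = adc_read(phototransistor);  /* LED off: ambient light only  */
    ir_led_on(led);
    uint16_t lit = adc_read(phototransistor);      /* LED on: ambient + reflection */
    ir_led_off(led);
    return (int16_t)(lit - ambient);
}
&lt;/pre>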

&lt;h2>Sensor calibration&lt;/h2>

&lt;div class='p'>There might be slight differences in characteristics of individual optical
components; therefore the sensor module should be calibrated. The calibration
is done manually in two steps:&lt;/div>

&lt;h3>1. Offset calibration&lt;/h3>

&lt;div class='p'>All sensors are placed above the black guiding line. Once the calibration
command is received, all sensors measure the surface reflexivity and measured
values are stored in memory. Those are the offset calibration values. From now
on, all measured values are automatically corrected with this offset. (Each
time a measurement is made, the offset is simply subtracted from the actual
value). As a result, all sensors return the same value when they are located
above the black line.&lt;/div>

&lt;h3>2. Gain calibration&lt;/h3>

&lt;div class='p'>During the gain calibration all sensors are placed above a white surface. Some
sensors may be more sensitive than others, so the measured values differ from each
other. But because the surface reflectivity under the sensor module is assumed to be
uniform, a gain coefficient for each sensor can easily be calculated. In future
measurements every measured value is corrected (multiplied) by its gain coefficient, so
that all calibrated sensors have similar characteristics.&lt;/div>

&lt;div class='p'>Consequently, the calibrated sensor module will output normalized values.&lt;/div>
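
&lt;div class='p'>Applying the two calibration steps to a raw reading then amounts to something like
the following sketch (the fixed-point format and the scale are assumptions, since the
actual firmware internals are not published):&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

#define NUM_SENSORS 6
#define GAIN_SCALE  256   /* fixed-point gain, 256 = 1.0 (assumed format) */

static uint16_t offset[NUM_SENSORS];   /* readings captured above the black line */
static uint16_t gain[NUM_SENSORS];     /* per-sensor gain computed above white   */

/* gain calibration above a white surface: scale every sensor so that white
   reads the same as on sensor 0 */
void calibrate_gain(const uint16_t white[NUM_SENSORS]) {
    for (uint8_t i = 0; i != NUM_SENSORS; ++i) {
        uint16_t span = (white[i] > offset[i]) ? (uint16_t)(white[i] - offset[i]) : 1;
        uint16_t ref  = (white[0] > offset[0]) ? (uint16_t)(white[0] - offset[0]) : 1;
        gain[i] = (uint16_t)(((uint32_t)ref * GAIN_SCALE) / span);
    }
}

/* normalize one raw ADC reading using the stored calibration */
uint16_t calibrated(uint8_t sensor, uint16_t raw) {
    uint16_t shifted = (raw > offset[sensor]) ? (uint16_t)(raw - offset[sensor]) : 0;
    return (uint16_t)(((uint32_t)shifted * gain[sensor]) / GAIN_SCALE);
}
&lt;/pre>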

&lt;h2>Processing data from sensors, motor control&lt;/h2>

&lt;div class='p'>The optical sensors measure the light reflectivity of the surface, and the acquired
data are processed by the line detection algorithm. The algorithm is designed so that
the line width does not matter. The line detection algorithm outputs a signed integer
value that represents the current deflection of the guiding line: values close to zero
mean the line is located exactly in the middle of the sensor module, positive values
indicate how far the line deflects to the right, and negative values indicate the
deflection to the left.&lt;/div>
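
&lt;div class='p'>One common way to obtain such a width-independent signed deflection is a weighted
average of the sensor positions. The sketch below is only an assumed illustration; the
actual PocketBot algorithm is not published here.&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

#define NUM_SENSORS 6
#define WHITE_LEVEL 1000   /* normalized reading above white (assumed scale) */

/* positions of the six measured spots across the module, centre = 0
   (arbitrary units; the scale only changes the controller gain) */
static const int8_t spot_pos[NUM_SENSORS] = {-25, -15, -5, 5, 15, 25};

/* values[] are calibrated readings (0 above black, WHITE_LEVEL above white);
   returns a signed deflection: 0 = centred, positive = line to the right */
int16_t line_deflection(const uint16_t values[NUM_SENSORS]) {
    int32_t  weighted = 0;
    uint32_t total = 0;
    for (uint8_t i = 0; i != NUM_SENSORS; ++i) {
        /* invert the reading so that the dark line contributes the most weight */
        uint16_t w = (values[i] >= WHITE_LEVEL) ? 0 : (uint16_t)(WHITE_LEVEL - values[i]);
        weighted += (int32_t)spot_pos[i] * (int32_t)w;
        total += w;
    }
    if (total == 0)
        return 0;                      /* no line under the module */
    return (int16_t)(weighted / (int32_t)total);
}
&lt;/pre>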

&lt;div class='p'>This deflection value is then used for proportional-integral-derivative (PID)
control of the line tracking. The PID controller adjusts the motors’ speeds according
to the current line deflection and the previous states. The position of the line is
evaluated 30 times per second.&lt;/div>

&lt;div class='p'>In other words, the PID controller steers the robot so that the line stays centered
under the middle of the sensor module, which results in smooth line following.&lt;/div>
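
&lt;div class='p'>A compact sketch of such a control loop, called at the 30 Hz rate mentioned above
(the gains, the scaling and the motor interface are illustrative):&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

/* illustrative tuning constants and motor interface */
#define KP 6
#define KI 1
#define KD 8
#define BASE_SPEED 200
extern void set_motor_speeds(int16_t left, int16_t right);
extern int16_t read_line_deflection(void);  /* signed output of the line detection:
                                               0 = centred, positive = line to the right */

void pid_step(void) {                        /* called 30 times per second */
    static int16_t prev_error = 0;
    static int32_t integral = 0;

    int16_t error = read_line_deflection();
    integral += error;
    int16_t derivative = error - prev_error;
    prev_error = error;

    int16_t correction = (int16_t)(KP * error + KI * (integral / 32) + KD * derivative);

    /* steer by driving the wheels at different speeds: line to the right means
       turn right, i.e. speed up the left wheel and slow down the right one */
    set_motor_speeds(BASE_SPEED + correction, BASE_SPEED - correction);
}
&lt;/pre>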

&lt;h2>Remote controlling&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/remote.jpg'>&lt;img src='/robots/pocketbot/remote_t.jpg' alt='Remote Control' title='Remote Control' class='border'  width='220' height='108'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/remote.jpg'>Remote Control&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The robot is equipped with an infra-red remote control receiver, so it can be
controlled with a standard remote control or from a PC application. The wireless link
to the robot is used for adjusting parameters (such as speed and PID constants), for
sending sensor calibration commands and also for manual operation. The wireless
communication uses the NEC remote control protocol. [4] This protocol was implemented
both in PocketBot (as a receiver) and in the USBdockStation device (as a
transmitter).&lt;/div>
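
&lt;div class='p'>In the NEC protocol [4] a frame starts with a 9 ms lead pulse and a 4.5 ms space,
followed by 32 bits (address, inverted address, command, inverted command, least
significant bit first); a bit is a short pulse followed by a short space for “0” or a
long space for “1”. A receiver can therefore classify bits simply from the measured
space lengths, roughly as in the sketch below (the timer resolution and thresholds are
illustrative, not the PocketBot firmware):&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

#define NEC_BITS 32

/* spaces[] holds the measured gap after each of the 32 bit pulses, in microseconds;
   returns 0 on success and fills in the decoded address and command */
int nec_decode(const uint16_t spaces[NEC_BITS], uint8_t *address, uint8_t *command) {
    uint32_t data = 0;
    for (uint8_t i = 0; i != NEC_BITS; ++i) {
        data >>= 1;                      /* bits arrive LSB first */
        if (spaces[i] > 1100)            /* long space (~1690 us) means "1" */
            data |= 0x80000000UL;        /* short space (~560 us) means "0" */
    }
    uint8_t cmd  = (uint8_t)(data >> 16);
    uint8_t ncmd = (uint8_t)(data >> 24);
    if ((uint8_t)(cmd ^ ncmd) != 0xFF)   /* the inverted copy must match */
        return -1;
    *address = (uint8_t)data;            /* low byte of the frame */
    *command = cmd;
    return 0;
}
&lt;/pre>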

&lt;div class='p'>&lt;table class='image_panel left' style='width: 213px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/control-panel-sensors.png'>&lt;img src='/robots/pocketbot/control-panel-sensors_t.png' alt='Sensors' title='Sensors' class='border'  width='207' height='267'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/control-panel-sensors.png'>Sensors&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 213px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/control-panel-remote.png'>&lt;img src='/robots/pocketbot/control-panel-remote_t.png' alt='Remote Control' title='Remote Control' class='border'  width='207' height='267'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/control-panel-remote.png'>Remote Control&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h1>USBdockStation&lt;/h1>

&lt;div class='p'>The USB device provides both wired and wireless communication between the computer
and PocketBot. It is based on the AVR-CDC project, a USB to UART converter. [2] Once the
device is connected to a computer, the operating system creates a virtual COM port
which can be accessed from a computer application. PocketBot, the USBdockStation and
the PC application communicate with each other through this UART interface, using a
protocol designed specifically for this purpose.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/dock-station-top.jpg'>&lt;img src='/robots/pocketbot/dock-station-top_t.jpg' alt='Top View' title='Top View' class='border'  width='220' height='101'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/dock-station-top.jpg'>Top View&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>To control PocketBot remotely, I added some extra functionality to the original
AVR-CDC firmware: the USBdockStation can emulate an infra-red remote control. It has a
high-power infra-red LED and can send IR NEC remote control packets just like an
ordinary remote control. Naturally, this wireless communication is only
unidirectional.&lt;/div>

&lt;div class='p'>&lt;table class='image_panel center' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/dock-station-bottom.jpg'>&lt;img src='/robots/pocketbot/dock-station-bottom_t.jpg' alt='Bottom View' title='Bottom View' class='border'  width='220' height='101'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/dock-station-bottom.jpg'>Bottom View&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The firmware is written in C and runs on an ATmega8 with an external 16 MHz
oscillator. The printed circuit board is a custom design made to fit the needs of my
project and was manufactured professionally.&lt;/div>

&lt;h2>Communication diagram&lt;/h2>

&lt;h3>PocketBot Control Panel&lt;/h3>

&lt;div class='p'>The PC application offers sensor diagnostics: it shows a real-time visualization of
the sensor module state, including the estimated line position. The application can
also send wireless commands to the robot, which provides the key feature of adjusting
the PID controller constants remotely. Thus, the PID constants can be tuned on-line
during line following. The wireless communication can be used for manual operation as
well.&lt;/div>

&lt;div class='p'>After startup, the application automatically scans all COM ports for the
USBdockStation. It can then detect whether the robot is connected with a cable or
whether only the wireless communication mode is available. In bi-directional (cable)
mode, the battery state and the calibration data can be accessed.&lt;/div>

&lt;div class='p'>The application is programmed in Borland Delphi and runs on Windows operating
systems.&lt;/div>

&lt;h2>Project timeline in brief&lt;/h2>

&lt;ul>
&lt;li>2007/06	I have started programming microcontrollers.&lt;/li>

&lt;li>2007/10	I designed and put together my first line following robot [9]&lt;/li>

&lt;li>2007/12	The PocketBot project started, inspired by the Desktop Line Following robot [1]. I built a sensor module prototype and designed the line position algorithm. The ambient light suppression method was tested.&lt;/li>

&lt;li>2008/01	Sony remote control protocol decoding, UART debugging, PWM motor control.&lt;/li>

&lt;li>2008/03	Robot’s undercarriage finished&lt;/li>

&lt;li>2008/04	Lithium-ion charger built. I won first place at the Student Scientific Competition with my first line following robot.&lt;/li>

&lt;li>2008/05	PocketBot linefollower finished. Line following with a proportional (P) regulator [5]&lt;/li>

&lt;li>2008/06	I presented paper titled Line following robots [11] at the international conference VIASL [10]&lt;/li>

&lt;li>2008/07	PD regulator allows line following at higher speed [6]&lt;/li>

&lt;li>2009/02	I implemented new NEC remote control protocol into the robot.&lt;/li>

&lt;li>2009/04	USBdockStation finished. Sensor gain and offset calibration. Working on optimized switching sequence for sensors.&lt;/li>

&lt;li>2009/05	PocketBot Control Panel finished. Graduation at high school in Prague.&lt;/li>

&lt;li>2010/01	Remote control protocol analyzer finished [7]&lt;/li>

&lt;li>2010/03	PocketBot won first place in the freestyle category at the RobotChallenge competition in Vienna. [8]&lt;/li>
&lt;/ul>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/robots/pocketbot/in-box.jpg'>&lt;img src='/robots/pocketbot/in-box_t.jpg' alt='In Box' title='In Box' class='border'  width='220' height='189'/>&lt;/a>&lt;br/>
&lt;a href='/robots/pocketbot/in-box.jpg'>In Box&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Author&lt;/h2>

&lt;div class='p'>Ondřej Staněk (20) is a first-year student of computer science at the Faculty of
Mathematics and Physics of Charles University in Prague, Czech Republic. He is
interested in computer programming and electronics. Apart from mobile robotics, the
author participates in the development of a snow measuring device for
hydrometeorology.&lt;/div>

&lt;div class='p'>&lt;a href='http://www.ostan.cz' class='external'>http://www.ostan.cz&lt;/a>, ostan89-at-gmail.com&lt;/div>

&lt;h1>References&lt;/h1>

&lt;div class='p'>[1] ChaN’s Desktop line following robot: &lt;a href='http://elm-chan.org/works/ltc/report.html' class='external'>http://elm-chan.org/works/ltc/report.html&lt;/a>&lt;/div>

&lt;div class='p'>[2] Software USB to UART converter for AVR microcontrollers:  &lt;a href='http://www.recursion.jp/avrcdc/' class='external'>http://www.recursion.jp/avrcdc/&lt;/a>&lt;/div>

&lt;div class='p'>[3] Author of pocketBot’s wheelframe: &lt;a href='http://www.volny.cz/jova3/pocket_bot/pocket_bot.htm' class='external'>http://www.volny.cz/jova3/pocket_bot/pocket_bot.htm&lt;/a>&lt;/div>

&lt;div class='p'>[4] NEC protocol description: &lt;a href='http://www.sbprojects.com/knowledge/ir/nec.htm' class='external'>http://www.sbprojects.com/knowledge/ir/nec.htm&lt;/a>&lt;/div>

&lt;div class='p'>[5] PocketBot video I.: &lt;a href='http://vimeo.com/6394938' class='external'>http://vimeo.com/6394938&lt;/a>&lt;/div>

&lt;div class='p'>[6] PocketBot video II.: &lt;a href='http://www.youtube.com/watch?v=8UCQyGJ5M0E' class='external'>http://www.youtube.com/watch?v=8UCQyGJ5M0E&lt;/a>&lt;/div>

&lt;div class='p'>[7] Remote control protocol decoding application: &lt;a href='http://ostan.cz/IR_protocol_analyzer/' class='external'>http://ostan.cz/IR_protocol_analyzer/&lt;/a>&lt;/div>

&lt;div class='p'>[8] &lt;a href='http://www.robotchallenge.org/scripts/trunk/trainstation/detail.php?hide_controls=true&amp;amp;refresh=99999&amp;amp;competition=126' class='external'>http://www.robotchallenge.org/scripts/trunk/trainstation/detail.php?hide_controls=true&amp;amp;refresh=99999&amp;amp;competition=126&lt;/a>&lt;/div>

&lt;div class='p'>[9] My first line following robot: &lt;a href='http://ostan.cz/robot/ipage00013.htm' class='external'>http://ostan.cz/robot/ipage00013.htm&lt;/a>&lt;/div>

&lt;div class='p'>[10] The VIASL conference: &lt;a href='http://www.ifip2008praha.cz/' class='external'>http://www.ifip2008praha.cz/&lt;/a>&lt;/div>

&lt;div class='p'>[11] Submission paper for the VIASL conference: &lt;a href='http://www.ostan.cz/robot/line_following_robots.doc' class='external'>http://www.ostan.cz/robot/line_following_robots.doc&lt;/a>&lt;/div>

&lt;hr/>

&lt;div class='p'>&lt;a href='/robots/pocketbot/en#email'>Comments and questions form&lt;/a>&lt;/div>
 </content>
</entry>
<entry>
	<title>Brimstone</title>
	<link rel='alternate' href="http://localhost/articles/brimstone/en"/>
	<id>http://localhost/articles/brimstone/en</id>
	<updated>2010-03-02T00:00:00Z</updated>
	<author><name>Monika Svědirohová</name></author>
	<summary type='html'> I took part in the national finals of the Science Expo AMAVET, where I placed
fifth and could therefore take part in the international exhibition in Tunis. Moreover,
I won the High school science and technology project contest in the electronics
category. On these pages I would like to introduce my robot&amp;hellip;

 </summary>
	<content type='html'> 
&lt;h1>Cybernetic car ("Brimstone")&lt;/h1>

&lt;h2>Construction&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel right' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/robot.jpg'>&lt;img src='/articles/brimstone/robot_t.jpg' alt='Robot' title='Robot' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/robot.jpg'>Robot&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>The robot chassis is based on a Lynxmotion kit. It is a 4x4 car with "tank"-like
(skid) steering.&lt;/div>

&lt;div class='p'>Brimstone works in three modes:&lt;/div>

&lt;ul>
&lt;li>Bluetooth remote control, where I command the robot and it detects 
obstacles. If an obstacle is detected, it stops and waits for further commands.&lt;/li>

&lt;li>Fully autonomous mode, where it actively bypasses obstacles.&lt;/li>

&lt;li>Fetching mode, where the robot is remote controlled and picks up objects. Obstacle
detection does not stop the robot but slows it down, so the robot can pick up
the object easily.&lt;/li>
&lt;/ul>

&lt;div class='p'>The Bluetooth remote protocol allows my robot to be controlled from a PC, PDA or
cellphone. The Bluetooth range of approximately 100 m is sufficient for most
environments.&lt;/div>

&lt;h2>Software&lt;/h2>

&lt;h3>Remote control&lt;/h3>

&lt;div class='p'>The program is written in C for an ATmega32. The code implements a "state
machine", which works as follows:&lt;/div>

&lt;ul>
&lt;li>the robot receives a command over the serial line and performs an action based on
the command and its current state. The action can change the current state of the
robot.&lt;/li>

&lt;li>example: the current state is "moving backwards" and the command "go forward" is
received; the resulting action is to stop and then move forward (stopping before
reversing direction protects both the motors and the control boards). A minimal sketch
of such a state machine is shown below the list.&lt;/li>
&lt;/ul>
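
&lt;div class='p'>The particular states, command bytes and motor calls below are illustrative, not the
actual Brimstone source:&lt;/div>

&lt;pre>
#include &amp;lt;stdint.h>

typedef enum { STOPPED, FORWARD, BACKWARD } state_t;

/* illustrative motor interface */
extern void motors_stop(void);
extern void motors_forward(void);
extern void motors_backward(void);

static state_t state = STOPPED;

/* called for every command byte received over the serial line */
void handle_command(uint8_t cmd) {
    switch (cmd) {
    case 'F':                       /* "go forward" */
        if (state == BACKWARD)      /* stop first: protects motors and drivers */
            motors_stop();
        motors_forward();
        state = FORWARD;
        break;
    case 'B':                       /* "go backward" */
        if (state == FORWARD)
            motors_stop();
        motors_backward();
        state = BACKWARD;
        break;
    default:                        /* "stop" or unknown command */
        motors_stop();
        state = STOPPED;
        break;
    }
}
&lt;/pre>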

&lt;h3>Autonomous mode&lt;/h3>

&lt;div class='p'>This mode uses interrupts generated by the A/D converters attached to the robot’s
sensors. As soon as the sensors detect an obstacle, an interrupt is generated and
obstacle avoidance is activated. Further A/D interrupts are inhibited while the robot
avoids the obstacle. The obstacle avoidance routines are based on a state machine too.
Autonomous mode is terminated by an interrupt generated by the serial communication
interface.&lt;/div>
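
&lt;div class='p'>On the ATmega32 this structure might look roughly like the sketch below; the
threshold, the ADC setup and the avoidance routine are illustrative assumptions, not
the actual Brimstone firmware:&lt;/div>

&lt;pre>
#include &amp;lt;avr/io.h>
#include &amp;lt;avr/interrupt.h>
#include &amp;lt;stdint.h>

#define OBSTACLE_THRESHOLD 512     /* illustrative ADC value for "obstacle close" */

extern void start_avoidance(void);       /* state-machine based avoidance, not shown */
extern void leave_autonomous_mode(void);

static volatile uint8_t avoiding = 0;

/* A/D conversion complete: check the distance sensor reading */
ISR(ADC_vect) {
    if (avoiding == 0) {
        if (ADC > OBSTACLE_THRESHOLD) {
            avoiding = 1;
            ADCSRA &amp;amp;= (uint8_t)~_BV(ADIE);   /* inhibit further ADC interrupts */
            start_avoidance();
        }
    }
}

/* a character arrived on the serial line: terminate autonomous mode */
ISR(USART_RXC_vect) {
    leave_autonomous_mode();
}
&lt;/pre>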

&lt;h2>Construction:&lt;/h2>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/soldering.jpg'>&lt;img src='/articles/brimstone/soldering_t.jpg' alt='soldering' title='soldering' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/soldering.jpg'>soldering&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/inside.jpg'>&lt;img src='/articles/brimstone/inside_t.jpg' alt='robot from inside' title='robot from inside' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/inside.jpg'>robot from inside&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/wires.jpg'>&lt;img src='/articles/brimstone/wires_t.jpg' alt='wiring' title='wiring' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/wires.jpg'>wiring&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/top.jpg'>&lt;img src='/articles/brimstone/top_t.jpg' alt='robot from above' title='robot from above' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/top.jpg'>robot from above&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel left' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/hand.jpg'>&lt;img src='/articles/brimstone/hand_t.jpg' alt='gripper' title='gripper' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/hand.jpg'>gripper&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel ' style='width: 226px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/light.jpg'>&lt;img src='/articles/brimstone/light_t.jpg' alt='eyes in the dark' title='eyes in the dark' class='border'  width='220' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/light.jpg'>eyes in the dark&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;h2>Eurobot&lt;/h2>

&lt;div class='p'>After improving the mechanical hand I entered the
&lt;a href='/competitions/eurobot/2009/en'>&lt;span class='cs'>EUROBOT 2009&lt;/span>&lt;/a> contest in the STARTER category, where I
surprised the audience by driving the robot over the playing elements instead of
picking them up. My robot was not able to build any "temple", but I gained
valuable experience by taking part.&lt;/div>

&lt;h2>Links&lt;/h2>

&lt;ul>
&lt;li>&lt;a href='http://www.amavet.cz' class='external'>AMAVET&lt;/a>, fifth rank in national finals of Science Expo AMAVET&lt;/li>

&lt;li>&lt;a href='http://www.tunisie-mailing.com/ESPACE-CLIENT-WP/SitesClients/ESI2009' class='external'>international exhibition in Tunis&lt;/a>&lt;/li>

&lt;li>&lt;a href='http://www.soc.cz' class='external'>SOČ&lt;/a>, national finals - first place&lt;/li>
&lt;/ul>

&lt;h2>Video&lt;/h2>

&lt;div class='p'>&lt;object width="425" height="344">&lt;param name="movie" value="http://www.youtube.com/v/QA3cvvwzarE&amp;amp;hl=en&amp;amp;fs=1">&lt;/param>&lt;param name="allowFullScreen" value="true">&lt;/param>&lt;param name="allowscriptaccess" value="always">&lt;/param>&lt;embed src="http://www.youtube.com/v/QA3cvvwzarE&amp;amp;hl=en&amp;amp;fs=1" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="425" height="344">&lt;/embed>&lt;/object>&lt;/div>

&lt;hr/>

&lt;div class='p'>Created by: Monika Svědirohová&lt;/div>

&lt;div class='p'>&lt;table class='image_panel left' style='width: 171px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/monika.jpg'>&lt;img src='/articles/brimstone/monika_t.jpg' alt='Monika' title='Monika' class='border'  width='165' height='220'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/monika.jpg'>Monika&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>
&lt;table class='image_panel center' style='width: 254px;'>&lt;tr>&lt;td>
&lt;a href='/articles/brimstone/demo.jpg'>&lt;img src='/articles/brimstone/demo_t.jpg' alt='demo' title='demo' class='border'  width='248' height='165'/>&lt;/a>&lt;br/>
&lt;a href='/articles/brimstone/demo.jpg'>demo&lt;/a>
&lt;/td>&lt;/tr>&lt;/table>&lt;/div>

&lt;hr/>

&lt;div class='p'>If you have any questions or comments &amp;ndash;
&lt;a href='/articles/brimstone/en#email'>contact us&lt;/a>.&lt;/div>
 </content>
</entry>
</feed>
