Tuesday, 28 April 2009

28-04-2009

My method of trying to implement the omnicam rangefinder isn't really working. I've tried to implement it in the Agent and in Imageanalysis, but I get stuck all the time, mostly because I do not fully understand the code yet, I guess.

I would also need to create an entire system just for the omnicam rangefinder, which is going to take me a lot of time. I really need to research masking the omnicam output as a LaserRangeData object. Therefore I will now try to find out where LaserRangeData.vb is getting its data from and where it saves its data.
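To make the masking idea concrete, here is a minimal Python sketch of it (the class and field names are my own assumptions for illustration, not the real LaserRangeData.vb interface): wrap per-degree free-space distances from the omnicam in an object that looks like a laser scan, so the rest of the SLAM pipeline cannot tell the difference.

```python
class FakeLaserRangeData:
    """Mimics a laser scan: a fixed number of range readings over a
    field of view. Hypothetical stand-in for LaserRangeData.vb."""

    def __init__(self, ranges_mm, fov_degrees=180):
        self.ranges = list(ranges_mm)                    # one reading per beam, in mm
        self.fov = fov_degrees                           # field of view in degrees
        self.resolution = fov_degrees / len(self.ranges)  # degrees per beam


def omnicam_to_laser(free_space_mm_per_degree):
    """Take 360 per-degree free-space distances from the omnicam and
    wrap the front 180 degrees as a fake laser scan."""
    front = free_space_mm_per_degree[:180]
    return FakeLaserRangeData(front, fov_degrees=180)
```

The point of the wrapper is that whatever consumes LaserRangeData only sees ranges, a field of view, and a resolution; where those numbers came from stays hidden.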

Friday, 24 April 2009

24-04-2009

Patch.vb, #Region " Omnicam Observations ", method ProcessCameraData determines WHEN (e.g. how many times per second) to process the omnicam image.
ManifoldSlam.vb, method ProcessOmnicamData processes the camera data (no images come out, only processing); it also processes the currentPose.
ManifoldSlam.vb, method ProcessSensorUpdate forwards any sensor update for SLAM, so this is effectively the controller method.
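The controller role of ProcessSensorUpdate could be sketched like this (in Python for readability; the handler names are my reading of the code, not verified against the actual ManifoldSlam.vb):

```python
def process_sensor_update(slam, sensor_type, data):
    """Sketch of a controller method: forward each sensor update to the
    matching handler on the SLAM object. Handler names are hypothetical."""
    handlers = {
        "laser": slam.process_laser_range_data,
        "omnicam": slam.process_omnicam_data,
        "ins": slam.process_ins_data,  # hypothetical extra sensor
    }
    handler = handlers.get(sensor_type)
    if handler is None:
        raise ValueError(f"no handler for sensor type {sensor_type!r}")
    return handler(data)
```

If the omnicam is masked as laser range data, it would enter through the "laser" branch instead of needing its own handler; that is the appeal of the masking approach.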

Find out what one LaserRangeData scan is. Is it only 1 point, or really 180 points as one object?

Find out whether the camera data is in meters or something else, because scan matching works in millimeters.

I think I have to replace ScanObservation.vb with something that works for the omnicam; it uses laser range data as input right now. I can also mask the omnicam output as laser range data again, which is actually still the easiest way to tackle this problem. The only problem is that laser range data has a maximum of 180 points, while the omnicam should use at least 360 points (1 radial line per degree). Will this work on the current SLAM system? I haven't checked this yet. I can also divide an image into 2 parts (180 points each) and run them both through the system.
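The split-in-two idea can be sketched as follows (Python; the layout of the data, a plain list with 1 beam per degree starting at 0, is an assumption for illustration):

```python
def split_omnicam_scan(ranges_360):
    """Split a 360-beam omnicam scan (1 beam per degree) into two
    180-beam halves, so each half fits the 180-point laser format."""
    if len(ranges_360) != 360:
        raise ValueError("expected exactly 360 beams (1 per degree)")
    front = ranges_360[:180]   # degrees 0..179
    back = ranges_360[180:]    # degrees 180..359
    return front, back
```

Note that the second half would need its pose rotated by 180 degrees before being fed to the scan matcher, otherwise both halves would be interpreted as covering the same field of view.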

Thursday, 23 April 2009

23-04-2009

I have to find the file calibrate_camera.m from Scaramuzza about the omnicam rangefinder. This file tells me how to calibrate the omnicam.
OmnicamRangeSensor.vb is a file I created; I am going to try to implement the entire omnicam rangefinder in it. I will still need a separate file for the color histogram.

Thursday, 9 April 2009

09-04-2009

There are now 2 robots that I can use to test my rangefinder ("OmniP2AT" and "OmniP2DX"). Arnoud has merged rev. 1323 with all the new techniques from rev. 1799. It is now a kind of experimental/custom revision that has some bugs, but the omnicam part of the program works, which is of course the most important part.

Both sets of MAZE coordinates that I found on 08-04-2009 are correct; they are just different locations in the maze.
I have also found out how to retrieve omnicam images: \Agents\Sensors\Camerasensor.vb is used to retrieve them.

Patch.vb is used a lot throughout the program. It handles the localisation of the robot: it takes a picture at a certain location, and the robot then drives away from that location. While driving, the robot keeps taking new pictures; if the picture match drops below 90%, the robot makes a new patch.
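My understanding of the patch logic, as a hedged sketch (the 90% threshold is from my notes above; the function and variable names are hypothetical, not taken from Patch.vb):

```python
MATCH_THRESHOLD = 0.90  # below this similarity, a new patch is taken


def update_patch(current_patch, new_image, match_score):
    """Keep the stored patch while the new image still matches it well
    enough; below the threshold, the new image becomes the new patch."""
    if match_score < MATCH_THRESHOLD:
        return new_image      # new patch taken at the current location
    return current_patch      # old patch still good, keep it
```

The interesting design choice here is that the robot only pays the cost of storing a new patch when recognition actually degrades, rather than at fixed distance intervals.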

I have also looked at the laser rangefinder part of the program. An omnicam rangefinder will be a kind of copy of the laser rangefinder, which is why it is important to look at the structure of the laser rangefinder. The method "ProcessLaserRangeData" in ManifoldSlam.vb is used for the rangefinder.
I can create my omnicam rangefinder system by following the structure of the "ProcessLaserRangeData" method. I think I know enough to start writing the code. I am going to do this next week.

Wednesday, 8 April 2009

08-04-2009

Arnoud helped me start up USARSim today. The problem was that I had not put in the rotation coordinates. I will also use the "extended image server" program for USARSim. This program can create camera images with a resolution of 640*480.
We will also make the omnicam rangefinder parse 1 image per second, because finding free space in an image can take some time. 1 image per second should give the system sufficient time to keep up.
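A minimal sketch of the 1-image-per-second throttling (Python; the class is hypothetical and not part of the actual code, but this is the kind of check ProcessCameraData would need to make):

```python
import time


class OmnicamThrottle:
    """Accept at most one omnicam image per interval, dropping the rest,
    so free-space extraction has time to finish before the next image."""

    def __init__(self, interval_s=1.0, clock=time.monotonic):
        self.interval = interval_s
        self.clock = clock       # injectable clock, useful for testing
        self._last = None        # time of the last accepted image

    def should_process(self):
        """Return True if enough time has passed since the last
        accepted image; record the acceptance time if so."""
        now = self.clock()
        if self._last is None or now - self._last >= self.interval:
            self._last = now
            return True
        return False
```

Dropping frames instead of queuing them is deliberate: a queued backlog of stale omnicam images would only make the pose estimate lag further behind the robot.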

I also know the coordinates of the "DM-compWorldDay1_250" maze, the environment where I am going to test my rangefinder:
(-35.64, -11.1, -4.0; according to Arnoud), found at this link:

http://staff.science.uva.nl/~arnoud/research/roboresc/March2007.html

Arnoud gave me sroebert revision 1323 to work with. I am going to find the files that are used to get an omnicam image.

Images are stored in the OmnicamObservation.class, and OmniMaplayer.class makes use of the images. OmniMaplayer.class makes a lot of references to Patch.class; I will find out what that class does tomorrow.

I have also found other coordinates for the maze in the sroebert program.

MAZE coordinates (-54.51, -6.97, -4.0; according to SRoebert) from the file compday-maze.cfg from sroebert rev. 1323. I am going to find out what the position of these coordinates is the next time I use USARSim.

Tuesday, 7 April 2009

07-04-2009

I turned in my time schedule yesterday. The time schedule did not really help me, but it gave me a good overview of what to do. I went to the robolab today and tried to start up USARSim. USARSim gave me a "wrong coordinates error", but I knew for sure that I had put in the right starting position coordinates. I will have to ask Arnoud (my supervisor) about this.

Saturday, 4 April 2009

04-04-2009

I still have to get used to USARSim (the program that I am going to use to do my research). Unfortunately I did not have time to work on my research this week. I am going to use this weekend to create a time schedule, which should be done by tomorrow. I am going to use next week to get used to USARSim. I also turned in my project proposal last week.

About this blog

Hi,

I am a student at the UvA, studying 'kunstmatige intelligentie' (artificial intelligence). My bachelor thesis is going to be about the creation and performance testing of an omnicam rangefinder. I will develop and test the rangefinder in a program called USARSim, which is a simulation program of the real world.