Embedded Eye

Give your gizmo the gift of sight

I have been working on a research project for some time now as an undergraduate directed study, and I have found a great deal of success with the Centeye vision chips. I am about ready to take the next step and integrate the sensors onboard a test platform so I can start tweaking and fine-tuning. I would, however, like some advice on the math side of things.


I have been looking at various optical flow (OF) algorithms, and the image interpolation algorithm (IIA), especially the one by Prof. Mandyam Srinivasan, seems to be fairly useful for obstacle avoidance and terrain guidance. I have parsed through the code supplied on the site here, and while I understand the progression of it, I cannot understand how exactly it calculates a flow.


If anyone could explain to me a bit clearer how this OF algorithm works, I'd be very appreciative, as it will help me continue on with my research and development of a working system. 


Like I said, I'm pretty new to the idea of computer vision, so any input would help.



Replies to This Discussion

Hi Michael,

The IIA algorithm is a variant of the classic gradient algorithm and is related to Lucas-Kanade. Essentially you have two sequential images, X1 and X2. You take X1 and shift it left, right, up, and down, then compute the linear combination of these four shifted images that best matches X2 in a least-squares sense. This yields a closed-form solution that the algorithm computes directly. The above is a simplified explanation; the method can be extended to directly handle both curl and divergence. Here is the paper that introduces the IIA algorithm:


Srinivasan, M.V., "An Image Interpolation Technique for the Computation of Optic Flow and Egomotion," Biological Cybernetics, Vol. 71, pp. 401-415, 1994.
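The shift-and-fit step described above can be sketched as follows (a minimal, translation-only version in Python/NumPy; the function name and array handling are my own and not taken from the attached MATLAB code):

```python
import numpy as np

def iia_flow_2d(X1, X2, delta=1):
    """Estimate a global (u, v) translation between frames X1 and X2 by
    fitting X2 as a linear combination of X1 and its four shifted copies,
    in the least-squares sense (Srinivasan's image interpolation idea)."""
    d = delta
    # Crop a border of width d so all shifted views align with X2
    f0 = X1[d:-d, d:-d]        # unshifted reference
    fl = X1[d:-d, :-2 * d]     # sampled d pixels to the left
    fr = X1[d:-d, 2 * d:]      # sampled d pixels to the right
    fu = X1[:-2 * d, d:-d]     # sampled d pixels up
    fd = X1[2 * d:, d:-d]      # sampled d pixels down
    g = X2[d:-d, d:-d]

    # Interpolation model: g ~ f0 + (u/2d)*(fl - fr) + (v/2d)*(fu - fd)
    a = (fl - fr) / (2 * d)
    b = (fu - fd) / (2 * d)
    r = g - f0

    # Closed-form least-squares solution of the 2x2 normal equations
    A = np.array([[np.sum(a * a), np.sum(a * b)],
                  [np.sum(a * b), np.sum(b * b)]])
    rhs = np.array([np.sum(a * r), np.sum(b * r)])
    u, v = np.linalg.solve(A, rhs)
    return u, v
```

For example, feeding it a smooth test pattern and the same pattern shifted one pixel to the right should return u close to 1 and v close to 0.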


I'm also attaching two MATLAB functions that implement IIA: one for linear (one-dimensional) optical flow and one for two-dimensional. Use the value 1 for "delta".


You may also want to take a good look at Lucas-Kanade. I'll try to dig up MATLAB functions for that...




Hey Geof,


I have a couple of questions. We currently have our helicopter flying in a semi-autonomous mode using some sensors a classmate of mine embedded onto a coax copter. It runs fairly well at approximately 4 to 4.5 feet from the ground. My plan for your sensors is to embed one on either side of the helicopter, to be used for telemetry and obstacle avoidance or some kind of wall avoidance.

My biggest challenge right now is the focal length of the Tam2 sensors. I am reading consistent flow out to approximately 1 foot in good lighting conditions, but outside of that range the surface quality seems to deteriorate quite significantly and the OF data fails. Do you have any suggestions to remedy the problem (possibly the addition of a larger lens)?

Eventually I want to be able to stay in the middle of a given path by equalizing the flow on either side of the helicopter, and then set up some sort of case function to override the given algorithm should an obstacle appear in its flight path, but I need a bit more clearance than 8-10 inches before it starts reading significant flow data.





I think the problem you are facing is that with the currently embedded optics, the pixels are large enough angle-wise that when you get too far away from the wall, the smaller features tend to blend together and become invisible. One fix for this is, as you suggest, different optics with a larger focal length. Another "hack" in the meantime is to put larger texture on the wall (Post-it notes, pictures, etc.) so that the sensor has additional features to lock onto. Can you post a typical photograph of one of the walls?

What about rotation estimation in the IIA algorithm? I am working on optical flow using Srinivasan's algorithm, and I would like to estimate rotation (theta) together with x and y.

After estimating the pixel shift, how do I estimate velocity?
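For what it's worth, going from a pixel shift per frame to a velocity is a geometry conversion: the shift times the angular pixel pitch times the frame rate gives an angular rate, and multiplying by the range to the surface gives a translational speed. A small sketch (the function name and parameters are my own assumptions, not from the attached code, and the small-angle approximation is assumed):

```python
import math

def flow_to_velocity(pixel_shift, pixel_pitch_deg, frame_rate_hz, range_m):
    """Convert an inter-frame pixel shift into an angular rate and,
    given the range to the surface, a translational speed."""
    # Angular rate of the image across the sensor, in rad/s
    omega = pixel_shift * math.radians(pixel_pitch_deg) * frame_rate_hz
    # Translational speed relative to the surface, in m/s
    v = omega * range_m
    return omega, v
```

Note that without an independent range measurement (or a known altitude), optical flow alone only gives you the angular rate, not the true speed.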


© 2022   Created by Geoffrey L. Barrows.