Thursday, December 11, 2025

 

FPath: Algorithmic Interlude #1

I have now posted the results of FPath Experiment 011, part of the FPath project. This is not an actual experiment per se; rather, it is documentation of the image recognition and Graphical Stigmergy techniques applied in Experiment 010.

Experiment 010 used an interesting image recognition technique to automate the movement of a probe at the sub-millimeter level. The major benefit of this technique is that it is fast enough to keep up with a sequence of 640x480 images arriving at 30 frames per second.
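To give a flavour of what fits in that budget (this is a generic sketch, not the exact technique from Experiment 010; the template image, camera index and confidence threshold below are placeholders), a simple OpenCV template-matching loop on 640x480 frames would look something like this:

```python
# Minimal sketch of a 30 fps vision loop: find a probe-tip template in each
# 640x480 frame by normalised cross-correlation and report where it is.
# The template image, camera index and confidence threshold are placeholders,
# not values from the FPath experiments.
import time
import cv2

template = cv2.imread("probe_template.png", cv2.IMREAD_GRAYSCALE)
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

while True:
    start = time.perf_counter()
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Normalised cross-correlation is cheap at this resolution, so the whole
    # frame can be searched and still leave room in the ~33 ms budget.
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, (x, y) = cv2.minMaxLoc(result)
    elapsed_ms = (time.perf_counter() - start) * 1000
    if score > 0.7:
        # This is where a probe controller would be told how to move.
        print(f"tip near ({x}, {y}), score {score:.2f}, {elapsed_ms:.1f} ms")

cap.release()
```

The point is simply that at 640x480 the per-frame work has to fit inside roughly 33 ms; anything much heavier than this class of operation starts dropping frames.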

Also included, which may be of some interest, is a discussion of the mechanics of the Graphical Stigmergy algorithm. This discussion shows how complex emergent behaviors can be obtained from a combination of relatively simple actions and the modification of the environment to send signals.
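The video covers the actual mechanics, but the general stigmergy idea can be sketched in a few lines: simple agents write marks into a shared environment and bias their next move toward nearby marks, so trails emerge without any agent talking to another. The grid size, deposit amount and evaporation rate below are made-up illustrative values, not the ones used in FPath:

```python
# Toy stigmergy sketch: agents wander a grid, deposit a mark wherever they go,
# and prefer neighbouring cells that already carry strong marks. Trails emerge
# from these local rules plus modification of the environment; the agents never
# communicate directly. All constants here are illustrative, not FPath's.
import random

W, H = 64, 48
marks = [[0.0] * W for _ in range(H)]
agents = [(random.randrange(W), random.randrange(H)) for _ in range(20)]

DEPOSIT = 1.0       # signal written into the environment on each visit
EVAPORATION = 0.02  # fraction of every mark that decays per step

def step(agents):
    moved = []
    for x, y in agents:
        marks[y][x] += DEPOSIT                      # write to the environment
        neighbours = [((x + dx) % W, (y + dy) % H)
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if (dx, dy) != (0, 0)]
        # Read the environment: weight each move by the mark already there,
        # with a small floor so unmarked cells still get explored.
        weights = [marks[ny][nx] + 0.1 for nx, ny in neighbours]
        moved.append(random.choices(neighbours, weights=weights)[0])
    for row in marks:                               # evaporation fades old signals
        for i in range(W):
            row[i] *= 1.0 - EVAPORATION
    return moved

for _ in range(500):
    agents = step(agents)
```

Even this toy version shows the property the post is getting at: the interesting behaviour lives in the shared marks rather than in any one agent's rules.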

I just thought that some of you might be interested in how all this is done. The video explains all: https://youtu.be/be725uWk4c8

The image below shows a frame from the video - OK, I know, it isn't the most sophisticated graphic you've ever seen. (click on the image to enlarge, watch the video for context)



 



Comments:
Very cool!!
 
Thanks :-)
 
I won't pretend to understand all you're doing here, but in another area I've had much better results creatively using nanobanana masking rather than relying on brittle 'machine learning' models. You may want to check whether you can apply it. Active perception at micro scale is a very interesting idea.
 
Thanks for the tip. I have now had a look at nanobanana - very clever tool. Speed is the problem - whatever I do, I have to do it to 30 frames every second in order to get the realtime feedback. Google are using a massive number of TPU processors to get NB operating as fast as it does and it is still way too slow for FPath needs (although it is impressively fast as an image manipulation tool).

>Active perception at micro scale is a very interesting idea.

I think it shows promise as well, but time will tell if it can deliver. Of course, as one goes further down the scale, optics will not work, so some other feedback technique will be needed if it is to be usable there.
 
Yeah, we're going to need something like an AFM. My head is still kicking around daft ideas ranging from antennae to sensor grid arrays and micron-scale verniers. I've had a hand injury and can't manipulate things well, so I've got time to think. Dangerous pastime.
 
Ah, 30 FPS is the speed at which you have to process incoming images. That didn't even click when I first read the post, because it's such an insane requirement. At that speed it's impressive that any vision algorithm is working at all.
 
Hi Vik. Noticed that you've not posted for a while. How are you doing? Hope your hand gets better soon.
 
Yeah, just about there. I worked the microscope yesterday, hope to get back to it. Will leave etching probes a bit longer - don't particularly want to get anything vaguely caustic on the hand as it's shed a lot of tissue and is pretty pink. Necrotic animal bite, btw, dealt with by our lovely local medical facility.
 
PS Fortunately I have the tendency to heal like a lizard, which is just as well.
 
You seem to be in a very hostile relationship with nature!
 
Yeah, this year was particularly bad. Beginning to feel like a bunch of scar tissue held together by some Vik. Tried to get some dev done yesterday, but ended up making CPAP spare parts. Try again today.
 