Here's an updated color-coded architecture diagram. I've decided to drop the speaker output feature, mainly because I'm not sure what I'd do with it, but I may resurrect the idea.
Compare to my original chart here.
I've got the Pawn scripting working pretty well: some of my low-level functionality is exposed as Pawn APIs, and I've adapted the ChibiOS shell to treat Pawn bytecode files as "executables" that I can optionally launch in a background thread.
I could probably fill 20 pages with all the developments, but I'm not going to; I've only spared myself enough time for a brief post.
Here's a video showing a script running that tries to navigate a square. The compass and encoder readings are queried in the script to detect 90-degree turns and half-meter movements. The compass code is probably beta level, and I'd call the encoder tracking alpha level, so that's why the square is somewhat off. Eventually, I'll have more elaborate geo-spatial reckoning, so that I can simply command the robot to "go to coordinate x,y" and it will figure out how to get there from the current position (avoiding obstacles too, of course). I hope this comes out; Blogger doesn't seem to have an easy way to preview video before publishing a post.
Fun times.
--P
Note to self: 9:16 videos don't look so good embedded in a webpage. I think I had the motors running at about 40% power; they can go quite a bit faster, but I'm currently polling the encoders in the Pawn script, so with debug unoptimized builds and some comms latency, I lose some accuracy when I go too fast. Eventually, the Propeller should be able to watch the encoders and adjust the motors directly.