Saturday 19 December 2015

Magnetic Encoders

Robbie is now fully functional again after his computer problems. The reason the arms missed commands was the controllers resetting; after I supplied power to the USB hubs everything worked as required.
To increase the accuracy of the arm I started replacing the potentiometers with magnetic encoders. Fitting the new encoders required a few modifications to the gearbox: I incorporated a bearing in the top of the gearbox and a mount for the magnet in the drive gear, plus a few extra tweaks to increase the strength of the assembly. Not all modifications will be fitted at the same time; some will wait until the next major rebuild.

Moveit Update
Robbie's MoveIt configuration is working again. Accuracy is 15 cm, which is not very good, but the magnetic encoders will help, plus a better calibration. Obstacle avoidance suffers because planning only just misses the obstacles. Robbie now has a point-at node where he will point to a published target pose.

Face recognition
We are now running the COB face recognition package. It works well in daylight, but the garage is too dark and Robbie makes a few errors; I need to add more lights. The AI will say Hello when he first recognises a face, then for the next 5 minutes he will just say Hi. The name of the recognised face is returned to the chat bot so he knows who he is talking to.
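That greeting logic could be sketched roughly like this; the class, the cooldown constant, and my reading of the 5-minute rule (full greeting on first sight, short "Hi" on repeat sightings within the window) are assumptions, not Robbie's actual code:

```python
import time

GREETING_COOLDOWN_S = 300  # 5 minutes; assumed value

class Greeter:
    """Say 'Hello <name>' on first sight, a short 'Hi' on repeats within the cooldown."""
    def __init__(self, cooldown=GREETING_COOLDOWN_S):
        self.cooldown = cooldown
        self.last_seen = {}  # name -> timestamp of the last greeting

    def greet(self, name, now=None):
        now = time.time() if now is None else now
        previous = self.last_seen.get(name)
        self.last_seen[name] = now
        if previous is None or now - previous > self.cooldown:
            return "Hello %s" % name
        return "Hi"
```

The returned string would then be passed to the chat bot alongside the name so it knows who it is talking to.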

Object recognition
will recognise a pre-programmed object but won't learn a new object, because ECTO requires direct access to the Kinect driver: Freenect uses a different driver, and OpenNI will not work under Indigo.
2D recognition: SIFT and SURF are not included in the OpenCV package, so it's very flaky.

Navigation
Increasing the global costmap's inflation radius will make Robbie plan further away from obstacles.
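For reference, the inflation behaviour lives in the costmap parameter files of the ROS navigation stack; the file name and values below are only illustrative, not Robbie's actual settings:

```yaml
# global_costmap_params.yaml (illustrative values only)
global_costmap:
  inflation_layer:
    inflation_radius: 0.55    # metres; larger keeps plans further from obstacles
    cost_scaling_factor: 5.0  # lower values spread the cost gradient further out
```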

Autonomous operation
The shutdown command will not work when Robbie is started using robot_upstart. Also, depth-registered points from the top Kinect will not always work unless something subscribes to them straight away; the lower Kinect runs the point-cloud-to-laser-scan node and gives no trouble. I will start face recognition on start-up and see if it remains stable. We haven't had any jitters or strange events since we started using the powered hubs for the Arduinos. The current high temperatures are causing a few resets; I need a bigger fan and more vents in the CPU bay.

Robbie's Emotion system
has been turned off for the moment, since he spent most of the time bored and kept quoting a Markov chain built from Sun Tzu. It needs a lot more configuration and thought, but it's fun for a while.

As the design of Robbie matures I'm starting to add covers to hide the wires and keep dust off the electronics, but this has introduced a few extra problems:
  1. Heat build-up
    More fans need to be included in the design
  2. Stripped threads
    Printed PLA and MDF won't hold a thread for very long, so I will now add M3 threaded inserts and M4 rivnuts to the structure




Tuesday 15 September 2015

AI Updates

Robbie's computer is still broken so I was able to catch up on some tasks I never had time for.
The potentiometers were never very accurate, so I have designed magnetic encoders as a replacement. They are more accurate and just plug into the existing structure; they will be fitted on the next rebuild.

The overall control was very hard to maintain and expand. Natural language is not very good for robot control: some verbs are tagged as nouns and thus won't be interpreted as commands. In NLTK you can define which words are verbs or nouns, but maintaining the files is troublesome. I tried pattern_en, but it suffers from the same limitations. I also tried WIT, an online language processor, but the learning curve is too steep and I wanted a local solution. Robbie's chat engine, on the other hand, works well.

I never really looked into PyAIML's capabilities, but it can run system programs with command-line args. For testing I reduced the loaded AIML files to two: one for commands, the other for general responses.
Of course that just puts me back to where I was before, but with a lot more potential. PyAIML will throw an error message for an unknown command, so I made it append the command to an AIML file; I only have to add the meaning later. I could automate this, but for now I want control over it. This sort of gives me a learning robot.
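A minimal sketch of that fallback, assuming the AIML engine returns an empty string for unmatched input; the file name, stub template, and reply wording are my own illustration:

```python
# Append unmatched commands as stub AIML categories for later annotation.
CATEGORY_TEMPLATE = """<category>
  <pattern>{pattern}</pattern>
  <template>TODO: add meaning</template>
</category>
"""

def handle_input(bot_respond, text, unknown_file="unknown_commands.aiml"):
    """bot_respond: callable returning the AIML engine's reply ('' if unmatched)."""
    reply = bot_respond(text)
    if reply:
        return reply
    # Unknown command: log it as a stub category so the meaning can be added later.
    with open(unknown_file, "a") as f:
        f.write(CATEGORY_TEMPLATE.format(pattern=text.upper()))
    return "I don't know how to %s yet" % text
```

The appended file can then be reviewed by hand, which keeps a human in the loop while still building up the robot's vocabulary over time.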
One of the intriguing possibilities is to query Knowrob ontologies.
For now I can add the name of a person from the face recognition node.
Next task is to make a semantic map and name the objects, so when asked his location Robbie will answer “near the desk in the garage”, not x, y, z.
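As a sketch of the naming side, the lookup from map coordinates to a place name could be as simple as labelled bounding boxes; the regions and coordinates here are made up for illustration:

```python
# Hypothetical labelled regions of the map: name -> (x_min, y_min, x_max, y_max).
REGIONS = {
    "near the desk in the garage": (0.0, 0.0, 2.0, 1.5),
    "by the garage door": (2.0, 0.0, 4.0, 1.5),
}

def place_name(x, y, regions=REGIONS):
    """Translate an (x, y) map position into a human-readable place name."""
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "at %.1f, %.1f" % (x, y)  # fall back to raw coordinates
```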

All of Robbie's physical tasks are now controlled through behaviour trees programmed with action servers; any task can be pre-empted and resumed if there is a fault or error condition. The behaviour tree also monitors and controls Robbie's emotions: tasks give pleasure, doing nothing results in boredom, and when boredom reaches a certain level Robbie will do a random task, varying from uttering a quip generated with Markov chains, to moving his head or arms, to driving around in circles.
Using simulators like Rviz and Gazebo has made these tasks much easier.
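The boredom loop could be sketched like this; the threshold, tick model, and task names are all assumptions, not Robbie's actual values:

```python
import random

BOREDOM_THRESHOLD = 10  # assumed trigger level
RANDOM_TASKS = ["utter_quip", "move_head", "wave_arm", "drive_in_circles"]

class EmotionMonitor:
    """Tick-based boredom model: idling raises boredom, doing a task resets it."""
    def __init__(self):
        self.boredom = 0

    def tick(self, doing_task):
        if doing_task:
            self.boredom = 0          # activity gives pleasure, resetting boredom
            return None
        self.boredom += 1
        if self.boredom >= BOREDOM_THRESHOLD:
            self.boredom = 0
            return random.choice(RANDOM_TASKS)  # fire a random behaviour
        return None
```

In a real behaviour tree this would sit in a monitoring branch that can pre-empt the idle state with whichever random task it picks.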

Thursday 18 June 2015

Robbie Upgrade



The original base was causing some problems: with the arms in the relaxed and travel positions, the arms would hit the lower body. I don't want the arms to make Robbie too wide, as he has enough trouble getting through doorways as it is, and I didn't have the time for a total rebuild (moving the wheels further inboard), so I opted for some covers and a redesigned waist joint. I ripped the speakers apart and put them in the space for the Kinect. I have put the Kinect back in the head so I can visualise the work space.




Monday 6 April 2015

Autonomous Update 2

For a robot to be autonomous it must be easy for others to operate; logging in and having to run some launch files is never going to work. The answer is upstart. We tried this a few times before, but it never worked as expected; the version that comes with Indigo works, so now to start Robbie you just have to turn him on. Then you can control Robbie with voice commands or RViz. Of course, with the greater reliance on voice commands we need to expand his capabilities. One of the options is show time, which demonstrates his capabilities in a continuous display, good for showing off to visitors. Other commands will be included over time. The original NLTK is showing its age now; pattern_en shows a lot more promise, but with a steeper learning curve.
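For reference, the robot_upstart package registers a launch file as a boot-time job with its install script; the package and launch-file names below are placeholders, not Robbie's actual ones:

```shell
# Register a launch file to run at boot (run once, then reboot to test).
rosrun robot_upstart install robbie_bringup/launch/robbie.launch --job robbie

# Remove the job again if needed:
rosrun robot_upstart uninstall robbie
```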

Tuesday 31 March 2015

Get Me a beer

This is Robbie's first simulation run of picking up an object off a table, moving it to another table, then returning to his station. The simulation didn't show the object attached to the hand, but the hand opens and closes in the right position.


Wednesday 11 March 2015

ROS MoveIT Update

I have had MoveIt working for a while now, but I could never get pick and place to work; I could move the arm to a location but that was it. After a bit of research I traced the problem to my URDF: I didn't place the coordinate frame at the end of the joint, and the orientation of the frame was wrong. After I adjusted everything it worked; the arms can now reach more places, and the resulting arm position is more natural. The only real problem is that the simple URDF I made for testing works quicker and has a better result with pick and place (fewer fails). Next up is a test with the real robot. I have to change the joint configuration on Robbie to match the URDF; the wrist yaw joint worked better than the wrist pan joint, so now the arm has 7 joints plus the gripper, not the 8 from before.
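For illustration, the fix amounts to putting the joint's `<origin>` at the end of the parent link and getting the axis orientation right; the names and numbers here are invented, not Robbie's actual URDF:

```xml
<!-- Joint frame placed at the END of the parent link (link length 0.30 m here). -->
<joint name="wrist_yaw_joint" type="revolute">
  <parent link="forearm_link"/>
  <child link="wrist_link"/>
  <origin xyz="0 0 0.30" rpy="0 0 0"/>  <!-- at the link tip, not its centre -->
  <axis xyz="0 0 1"/>                   <!-- rotation axis must match the real joint -->
  <limit lower="-1.57" upper="1.57" effort="10" velocity="1.0"/>
</joint>
```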


Wednesday 4 March 2015

Robbie system update

With navigation working as expected, now is a good time to upgrade Robbie to the next stable versions, Ubuntu 14.04 and ROS Indigo, and clean up the directory structure, putting everything under Robbie_ros.

The good news is most things work as before. The only glitches so far: Arduino 1.6 will mess up my messenger commands, so they have to be loaded with Arduino 1.0.5, and face recognition is having a problem with OpenCV.

I calibrated the battery voltage to read the true value. The health monitor will move to the docking-station pose when the battery voltage goes below the set level (12.2 V for now) and start auto dock. I still have to work on the charge-state message.
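A sketch of that threshold check; the 12.2 V level is from the post, while the hysteresis value, class, and callback names are my assumptions:

```python
LOW_BATTERY_V = 12.2   # docking threshold from the health monitor
RECOVERED_V = 13.0     # assumed hysteresis level so we don't re-trigger while charging

class HealthMonitor:
    """Trigger auto-docking once when the battery drops below the threshold."""
    def __init__(self, start_docking):
        self.start_docking = start_docking  # callback that moves to the dock pose
        self.docking = False

    def on_voltage(self, volts):
        if not self.docking and volts < LOW_BATTERY_V:
            self.docking = True
            self.start_docking()
        elif self.docking and volts > RECOVERED_V:
            self.docking = False  # charged again: re-arm the trigger
```

The hysteresis band stops the monitor from firing repeatedly while the voltage hovers around the threshold during docking.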

The most important thing to remember is to run gmapping before starting localisation.


Sunday 8 February 2015

Autonomous Robot Update


The reliability program is ongoing, and I have identified many problems that need correcting. The most surprising is heat: the weather has been very hot of late, causing the computer to reset. On the software side, the old repository had too many programs that were several versions old, so I have replaced it with a more ROS-like structure. I have refreshed the MoveIt package and now control both arms with MoveIt. While it works without tearing itself to pieces, the old L298N motor drivers are not really up to the task: they lack control below 70 PWM, making fine adjustment near impossible. I never had the I2C bus working reliably with the Arduino Due, so I reverted to using an Arduino Uno. I still have the occasional glitch, especially when doing restarts and poking around the boards. That brings us to the subject of wiring: crimp pins (header types) are not the best for a robot, and they have been the biggest problem for reliability, so I will go with aviation-style solder pins and plugs. The new motor drivers will be MC33887 with magnetic encoders; I'm not sure what to do with the I2C bus.
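One common workaround for a driver dead band like that is to rescale commands into the usable PWM range; the 70-count floor is from the post, but the mapping itself is my sketch:

```python
PWM_MIN = 70    # below this the L298N produces no usable motion
PWM_MAX = 255   # full-scale 8-bit PWM

def speed_to_pwm(speed):
    """Map a normalised speed in [-1.0, 1.0] into the driver's usable PWM band.

    Returns a signed integer: magnitude is the PWM duty, sign is direction.
    """
    if abs(speed) < 1e-3:
        return 0  # treat tiny commands as stop rather than stalling the motor
    magnitude = PWM_MIN + (PWM_MAX - PWM_MIN) * min(abs(speed), 1.0)
    return int(round(magnitude)) if speed > 0 else -int(round(magnitude))
```

This only papers over the dead band; drivers like the MC33887, with proper current handling, are the real fix.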

Monday 26 January 2015

Voice Recognition

The last major piece missing from Robbie was voice recognition that worked. Using the Kinect showed promise but never worked all that well and was a pain to set up. Today
I stumbled on an Android app for voice recognition that connects to ROS. It recognises once or continuously, works up to a meter from the phone, and recognises the kids' voices as well.
The app is from JSK Robotics; you need the messages from here: https://github.com/jsk-ros-pkg/jsk_common

Friday 16 January 2015

Autonomous robot


This is what we get from Wikipedia:
A fully autonomous robot can:
  • Gain information about the environment (Rule #1)
  • Work for an extended period without human intervention (Rule #2)
  • Move either all or part of itself throughout its operating environment without human assistance (Rule #3)
  • Avoid situations that are harmful to people, property, or itself unless those are part of its design specifications (Rule #4)
An autonomous robot may also learn or gain new knowledge like adjusting for new methods of accomplishing its tasks or adapting to changing surroundings.

I have been asked how autonomous Robbie is, and whether I let him move on his own.
While in principle he has all the systems, and has demonstrated that they work on their own (and sometimes they all work together), the fact is that in the last 2 years he has been tethered to the battery charger and partially disassembled. Stage 1 is now complete: we have a working robot. What we don't have is trust in him, and reliability. Stage 2 of this build is to address those problems. Trust will come with reliability, but autonomy needs more; below is a list of some tasks the robot should do.

Self-maintenance

  • Charge the battery; this part works using a behaviour tree
  • Monitor the systems, to be part of the above

Sensing the environment

  • Is anyone near me? Face recognition works but needs to be improved
  • Where am I? While localisation will give a map reference, we need a name, i.e. lounge room
  • Day and night: shut down nodes that won't be used at night
  • Short- and long-term memory

Task performance

  • Go to a place; did I achieve my goal?
  • Get something; did I achieve my goal?
  • Locate something; did I achieve my goal?

Indoor navigation

  • Localisation
  • Update the known world: what has changed?
We also need to log activity, successes and failures, to measure performance. In the lab he can go through a door without touching, but in real life? Same for localisation.
I'm sure this list will grow.
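A minimal sketch of that kind of activity log; the task names and the per-task counters are my own invention:

```python
from collections import defaultdict

class ActivityLog:
    """Count successes and failures per task to measure reliability over time."""
    def __init__(self):
        self.results = defaultdict(lambda: [0, 0])  # task -> [successes, failures]

    def record(self, task, success):
        self.results[task][0 if success else 1] += 1

    def success_rate(self, task):
        ok, fail = self.results[task]
        total = ok + fail
        return ok / total if total else None  # None: never attempted
```

Rates like "door transits without touching" could then be compared between the lab and the house to see where trust is actually earned.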