Madgrizzle's Robot

  1. The control loop is too slow? Did you write it yourself? Is it blocking on some input? Control loops usually run the fastest; in a hierarchy, the bottom layer needs to run the fastest, and 100-200 Hz is typical, even on crummy hardware.

  2. Have you looked at rqt_graph, rostopic info, and rosnode info? You can look for missing connections/remaps that way.

  3. If you’re having performance issues, you can use top or (more user-friendly) htop. Then you can see which process is taking too much CPU or memory. See the command sketch after this list.
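
Something along these lines (topic and node names are just examples; substitute your own):

```
# Visualize the node/topic graph to spot missing connections or bad remaps
rqt_graph

# See who publishes/subscribes to a topic, and measure its actual rate
rostopic info /scan
rostopic hz /scan

# Inspect a single node's connections
rosnode info /move_base

# Watch per-process CPU and memory usage
htop
```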

That computer looks pretty good. Maybe it’s a matter of spending less processing on the less time-critical pieces and more processing on the things that are more critical in timing.

No, that message comes from the session running move_base. IIRC, changing the controller_frequency value in the move_base parameters file from 5 to 2 significantly reduced the error messages, but the robot wouldn’t really move consistently after doing so.
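
For reference, this is the parameter I mean; a sketch, not my exact file:

```
# move_base parameters (sketch) -- rate (Hz) at which the local planner's
# control loop runs; I tried dropping this from 5.0 to 2.0
controller_frequency: 5.0

# how long move_base tolerates no valid control before running recoveries
controller_patience: 3.0
```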

I’ve watched everything with rostopic hz and did occasionally see hiccups with the map, but they didn’t seem to correlate with the issues reported by move_base. I’ll check the others… I’m using all very basic static transforms (no URDF). I positioned the laser scanner to be directly above base_footprint/base_link.
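
The transforms are just static_transform_publisher one-liners, roughly like this (the 0.3 m height is a made-up example, not my actual offset):

```
# args: x y z yaw pitch roll parent_frame child_frame period_ms
rosrun tf static_transform_publisher 0 0 0.3 0 0 0 base_link laser 100
```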

Also, not previously mentioned: when I ran RViz, after a period of time I’d start getting laserscan error messages (unknown reason for transform error). Based on some post I read, I ended up modifying the mapper’s SDK code to change time_increment to 0 instead of a computed value. I’m not sure it’s related. Since it happens after a period of time, I wonder if there’s a clock issue, but I can’t change the time on the mapper from what I can tell. I installed chrony on the Celeron and I think it’s working.
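
To check whether chrony is actually keeping the clock in sync:

```
# show sync status and the estimated clock offset
chronyc tracking

# list the configured time sources and their reachability
chronyc sources -v
```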

I did use top but didn’t see any issues. I’ll try htop. Initially I was running NoMachine to remotely access the Celeron machine, but I turned it off and went to SSH sessions to see if it helped… it didn’t.

Weirdness: I had a heck of a time getting the computer to reboot into Ubuntu today… it was as if it was loading off a Commodore 64 floppy disk. It never worked until I booted in recovery mode.

Anyway, I connected the mapper directly to the computer using Ethernet and got none of the errors. Everything seemed to work well. Maybe it was just struggling to send data over Wi-Fi, or the computer was having issues and just needed to be rebooted. Nevertheless, it’s working, so there’s no performance issue with the Celeron.

Made huge progress last night. I got a Kinect attached and working to scan for obstructions, and got it working with the local and global planners. But then I noticed that the global planner wasn’t creating a costmap from the mapper’s map. This goes back to my big concern about trying to use the map from the mapper rather than a static map. However, I found one post where someone was trying to do something similar (use a map from a published map topic as the static layer). I used the same method and it amazingly worked. So now I have costmaps using the map and the Kinect, and the global planner actually produces a path that’s not just a straight line to the goal.
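
The gist of the method is pointing the costmap’s static layer at the published map topic instead of running map_server. A sketch (the topic name is a placeholder for whatever the mapper publishes):

```
# global costmap (sketch): static layer fed from a live map topic
global_costmap:
  plugins:
    - {name: static_layer,    type: "costmap_2d::StaticLayer"}
    - {name: obstacle_layer,  type: "costmap_2d::ObstacleLayer"}
    - {name: inflation_layer, type: "costmap_2d::InflationLayer"}
  static_layer:
    map_topic: /slamware_map      # placeholder topic name
    subscribe_to_updates: true    # keep pulling updates as the map changes
```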

However, the robot still likes to snag corners and then complain about not being able to find a path. I think I will end up spending a fair amount of time optimizing the Kinect and inflation layer parameters to get it to work well enough. I probably should also build a basic URDF of the robot to visualize it in RViz.

edit:
Discovered a typo in my common parameter file: instead of “robot_radius:” I had “robot radius:”. This would explain the snagging on corners.
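
For anyone following along, the corrected file looks something like this (values are examples, not my exact numbers):

```
# costmap common parameters (sketch) -- note the underscore;
# "robot radius:" is valid YAML but an unknown key, so the
# setting never took effect
robot_radius: 0.26        # example: ~10-inch radius for a 20-inch robot
inflation_layer:
  inflation_radius: 0.5   # example value
```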

Had a little failure the night before last where the front caster got stuck in a gap in the floor and the back motors just kept spinning, eventually breaking the bolt axles free from the coupler. So I just went ahead and epoxied the axle and coupler together to be done with it.

Normally the computer runs at around 40% CPU utilization, and I decided to see if I could reduce that by filtering the point cloud data from the Kinect (less data to process should result in lower CPU utilization). To my surprise, the voxel filter doing the reduction consumes 100% CPU and makes the system unusable. It does “work” (i.e., it reduces the point cloud) but does not help the situation at all.
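
The filter was a voxel grid downsampler; something like the stock pcl_ros VoxelGrid nodelet run standalone, roughly like this (topic name and leaf size are examples):

```
# downsample the Kinect cloud with a 5 cm voxel grid (example values);
# the filtered cloud comes out on /voxel_grid/output
rosrun nodelet nodelet standalone pcl/VoxelGrid __name:=voxel_grid \
    /voxel_grid/input:=/camera/depth_registered/points \
    _leaf_size:=0.05
```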

I then worked on saving and reloading the map in Slamtec’s RoboStudio and it seemed to go okay, but I still need to spend more time with it. I think I have to put the robot in localization mode after loading the map so the robot finds its pose and produces correct odom. After loading the map and manually setting the pose, I tried to get the robot to navigate the same way as before, but this time it cut the corner of the doorway a little close and ended up in the lethal zone, and at that point move_base could never get the robot out of it. Perhaps because every move was invalid since it was in a lethal zone? I don’t know if the two events (loading a map and getting stuck in the doorway) are correlated or coincidental.

I discovered the robot acts drunk when the battery is low. I didn’t plug it in overnight, and when I told it to go somewhere in the morning, it went in the general direction and then plowed into a box.

Here’s a recent photo (forgive the electrical tape… it’s temporary).

The laser scanner is on the second level with all the power converters and terminal strips (you can’t see it). I want to redo the support pipes there to minimize their cross-section and reduce the amount of the laserscan that’s lost. I like the laser scanner being there since it’s relatively protected from 8-year-old humans and 1-year-old dogs.

I repurposed an old DD-WRT router and configured it in client mode to function as a local Ethernet switch/Wi-Fi bridge. Now the mapper, the computer, and the soon-to-be-installed Jetson Nano will all talk via Ethernet.

I’m working on getting my LowRider working again (finally getting around to putting the new belt holders on) so I can cut a new platform to relocate the Kinect and install the Jetson and cameras.

Do you think the Xbox One Kinect would be an upgrade for this project? Or is your setup specific to the Xbox 360 Kinect?

I’m asking because I have an Xbox One Kinect I’d give you if you think it would be an upgrade.

I always did enjoy reading about your projects!

Honestly, I don’t know which would be better. The Xbox One version has a wider field of view, which is better, but I’m not sure how the other specs would play in. I worry that too much data will bog down the computer.

One thing I noticed about the TurtleBot (a commercial product that uses a Kinect) is that the Kinect is mounted at the rear, facing forward. I wonder if this is done because the Kinect has a minimum-distance spec. I have mine at the very front, and when I do the new platform for it, I’ll look to see whether mounting it at the rear improves its performance.

But as for the Xbox One Kinect… hold on to it, because I might have a use for it down the road when I start looking at trying to do things with a 6-DOF manipulator. Having a depth camera tracking the arm would be handy…

Cool deal man! It’s yours when you’re ready for it!

I’ve further ‘calibrated’ the position of the Kinect and it lines up with the laser scanner output well. I was off by a few centimeters before, and adjusting the position really helped the robot manage to navigate through a doorway. Being 20 inches in diameter, it’s not easy for a computer with imperfect senses to navigate through a 29-inch opening. That’s 4.5 inches on either side, and the sensor data is noisy. It’s funny what we take for granted.

However, I now have a new problem and need ideas. I have wood floors with a transition strip along the doorway. When the robot is trying to navigate through the doorway, it does so slowly because of the relatively narrow path it has to navigate. The front caster has a small diameter and bumps up against the front of the transition strip, which stops the robot from moving forward (it’s not a smooth transition). If it had speed, it would just run up the bump, but because it’s slowed down so much (and even stops), it’s a problem.

The caster is small (1.5-inch diameter) and directly under the batteries (the zip ties are arranged to secure the two batteries), so there’s a lot of weight forward. I really don’t have a ready solution to this problem that doesn’t involve putting in a larger caster by cutting off the nose of the robot and rebuilding it… and I’m not sure even that would solve the issue.

An idea came to me while shaving (always when they do… I should start shaving 3-4 times a day): I should drive the robot in the other direction, so the big wheels are at the front and the caster is at the rear. That way, the robot will already be through the doorway and moving before the small caster runs up against the transition strip. AND in doing so, it’s likely that the caster will strike the transition strip either straight on or at a slight angle, which should also make it easier to get over. To make this work, I think all I need to do is turn around the laser scanner (though I might be able to just transform it) and the Kinect (which can’t be transformed since it’s not omnidirectional), and change the pin assignments on the motor controller.
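
If transforming the scanner in software works, it should just be a matter of adding a 180-degree yaw to its static transform, something like (the 0.3 m height is still a made-up offset):

```
# same mount point, but yawed pi radians so "forward" is now the caster end
rosrun tf static_transform_publisher 0 0 0.3 3.14159 0 0 base_link laser 100
```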

Switching things around worked out well. The robot can move across the transition strips now.
I had been trying to get the Kinect working with a voxel obstacle layer, and I think it was just taxing the computer too much; it just couldn’t successfully and repeatedly navigate through the doorway. I switched to depthimage_to_laserscan and created a local costmap using the Kinect’s laser scan plus the mapper’s laser scan, and a global costmap using the static map plus the mapper’s laser scan. So far so good. I also took out two of the support posts holding up the top layer. It’s a little flimsy but is holding… now the mapper’s laser scanner has a good view when it passes through the doorway.
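
The conversion itself is just one node; the depth image topic depends on the Kinect driver (this is a typical freenect-style name), and I remap the output scan so it doesn’t collide with the mapper’s:

```
# flatten the Kinect depth image into a planar LaserScan on /kinect_scan
rosrun depthimage_to_laserscan depthimage_to_laserscan \
    image:=/camera/depth/image_raw \
    scan:=/kinect_scan
```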

I’m now playing with my new 3D printer, trying to get it to work so I can make some parts for the robot. I think I need to build an enclosure for it since parts pop off during the print. The bottom edge starts to warp away from the build plate and the part eventually pops off entirely… I can hear the popping sounds as sections start to pull away. I closed off the AC vent in the room, but there’s still lots of cold air blowing in.

I think it’s time to add the robot arm, so it can open your doors and start exploring more of the world.

Baby steps… Need to get camera/face recognition working… then voice recognition/speech synthesis…

Target detection, weapons systems…

Drop your weapon. You have 15 seconds to comply.

If I were to 3D print new bearing mounts for the wheels (replacing the 3/4-inch plywood assemblies), would PLA be too weak? There’s a lot of weight resting on the bearing mounts. I hear ABS is much stronger but gives off nasty fumes when being printed… and it’s hard to print. PETG? I only have experience (minimal at that) with printing PLA.

PLA is the hardest of those three. I’d stick to plywood if you can, though.

I relocated the Kinect (at least temporarily) to the very bottom to give it a better view of obstacles. I had it on top, but when a short obstacle (below the height of the 360-degree laser scanner) passed out of its view (below the Kinect’s vertical field of view), it disappeared from the costmap and the robot ran into it. So I dropped it down to the first level and that solved that problem… the robot would turn to avoid the obstacle. But then when the obstacle passed out of the camera’s horizontal field of view, it disappeared from the costmap and the robot turned back into the obstacle.

I figured out what was happening… the local costmap uses the Kinect and the laser scanner, and both were set to mark obstacles and clear obstacles. Because the short obstacle was below the laser scanner’s level, the laser scanner was clearing it from the costmap as soon as the Kinect stopped marking it. So I changed clearing to false for the laser scanner in the local costmap, kept it true in the global costmap, and it’s working much better. I might relocate the Kinect back to where it was to see if it performs just as well.
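
In costmap terms, the change boils down to the observation source flags (topic names are placeholders for mine):

```
# local costmap obstacle layer (sketch): the Kinect-derived scan marks and
# clears, but the 360-degree scanner only marks, so it can no longer erase
# obstacles that sit below its scan plane
obstacle_layer:
  observation_sources: kinect_scan mapper_scan
  kinect_scan:
    topic: /kinect_scan      # placeholder
    data_type: LaserScan
    marking: true
    clearing: true
  mapper_scan:
    topic: /scan             # placeholder
    data_type: LaserScan
    marking: true
    clearing: false          # don't clear what the Kinect marked
```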