A dumb charger that's just constant current/constant voltage would be fine, but a fancy one is going to sense the current to infer more about the battery, and an extra load is going to hose that up for sure.
I'd say you would need a dumb charger, or a separate supply at a slightly higher voltage with diodes, so that when the DC supply is unplugged (and presumably the battery charger too) the load draws from the battery.
I’m not sure what you are suggesting. This is a rough sketch of how the power is currently hooked up. The charger is an on-board unit and stays connected to the batteries.
My idea is to add a relay on the input to the fuse block that will switch between the battery and an AC/DC power supply. I could put in a manual switch, but I would need to make the switch happen quickly after plugging in the power cord so it switches over before the charger starts its analyze phase; that's why I thought a relay might make the most sense.
If the DC supply is slightly higher voltage than the battery then no current will flow out of the charger/battery while the DC supply is on, so the charger and battery should be essentially isolated and shouldn’t get confused by the presence of the load or the DC supply. Then when power is cut it would automatically switch over to consuming from the battery. Not saying this is a great plan necessarily but clarifying what I was trying to describe.
A relay is also possible, but I'm not sure how fast it would need to be and whether it would produce transients. If you have buck/boost converters in line anyway, they can probably absorb short transients, so it's likely not a big issue.
Thanks, that makes sense now. Simple is best, so I'll give it a shot. Something like a 15V or higher DC supply might work… or is that cutting it too close? The charger hits around 13.2V, iirc. I might have a 15V or higher supply in my hoard. The motor controller is TLE5206 based, so it can take up to 40V. The rest of the connections are isolated with buck/boost converters and seem fairly tolerant of input voltage range. But I'd need some fairly big diodes… I got some 3 amp 1N5404 diodes at one time, if I can find them… in my hoard.
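As a sanity check on those numbers, here's the back-of-envelope math. The ~0.9 V forward drop for a 1N5404 at a few amps is my ballpark assumption, not a measurement:

```python
# Bus voltage after one series OR-ing diode, vs. the charger's peak.
# The 0.9 V forward drop for a 1N5404 at a few amps is an assumption.

def voltage_after_diode(v_supply, v_forward=0.9):
    """Voltage seen at the fuse block after one series diode."""
    return v_supply - v_forward

charger_peak = 13.2  # V, roughly what the charger hits
for v_supply in (15.0, 19.0):
    v_bus = voltage_after_diode(v_supply)
    print(f"{v_supply:.1f} V supply -> {v_bus:.1f} V at the bus "
          f"(margin over charger: {v_bus - charger_peak:+.1f} V)")
```

So even a 15V brick leaves close to a volt of headroom over the charger after the diode drop, and a 19V brick leaves plenty.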
The diodes appear to have worked. I connected up an old Dell 19V power supply with those diodes and the batteries fully charged (green light on the charger this morning). My current sensors arrive tomorrow; I'll hook them up and print out a housing for everything… I've created sparks a few times with errant wrenches. Thanks again for the good idea.
I got my voltage dividers and current sensors installed and connected them to an esp32 that I programmed to output webpages of the values. Here’s a screenshot of the current being monitored right after plugging in the power cord to charge the batteries.
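For anyone wiring up something similar, the conversion math on the esp32 boils down to a couple of lines. Every number below (resistor values, 12-bit ADC, hall-sensor scale) is a placeholder I picked for illustration, not necessarily what's on the robot:

```python
# Sketch of the divider/sensor math the esp32 firmware does.
# All constants here are illustrative placeholders.

def divider_voltage(adc_counts, vref=3.3, resolution=4095,
                    r_top=10_000, r_bottom=2_200):
    """Battery voltage from a raw ADC reading of the divider tap."""
    v_tap = adc_counts / resolution * vref
    return v_tap * (r_top + r_bottom) / r_bottom

def sensor_current(adc_counts, vref=3.3, resolution=4095,
                   zero_v=1.65, volts_per_amp=0.1):
    """Current from a bidirectional hall sensor centered at zero_v."""
    v_out = adc_counts / resolution * vref
    return (v_out - zero_v) / volts_per_amp

print(divider_voltage(2500))   # raw counts -> battery volts
print(sensor_current(2300))    # raw counts -> amps
```

On the actual board this runs as esp32 firmware rather than desktop Python, but the arithmetic is identical.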
What I found interesting was that the right battery is taking on 3A while the left battery is only taking on 1A. Both are paralleled (so you would think they would drain evenly if everything were good) and get put on the charger at exactly the same time. So I'm kinda thinking one battery is sick… lol… but I don't know which! It would explain why I don't get as much battery life as I calculate. The system at steady state draws about 1.3 to 1.5 A, and with two 22 Ah batteries I only get about 10 hours… it doesn't add up. I might have harmed them by letting them get too low and then just sit around. But now, with remote monitoring, I should be able to prevent that… I need to add some email/push notifications to the esp32 program.
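The arithmetic behind the "doesn't add up" complaint:

```python
# Two 22 Ah batteries in parallel at a 1.3-1.5 A steady draw should
# last far longer than the ~10 h observed, even with derating.

capacity_ah = 2 * 22                 # parallel capacities add
for draw_a in (1.3, 1.5):
    print(f"{draw_a} A draw -> {capacity_ah / draw_a:.0f} h ideal runtime")

usable = 0.5 * capacity_ah           # lead-acid derated to ~50 % usable
print(f"Derated to 50 % usable: {usable / 1.5:.0f} h at 1.5 A")
```

Even derating heavily, the expected runtime is well above 10 hours, which supports the sick-battery theory.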
Batteries running parallel need to be pretty well matched. I am guessing the right is doing all the work and so it’s taking more current to charge. The left is dropping voltage quicker, while providing less current, leaving the right to do all the work. It’s like every school team project ever.
When I checked about 30 minutes later, the right current had dropped to 1 A to match the left current. I don’t log that far back in time, so I don’t know when it did drop down, just somewhere between 5 and 30 minutes. It’s something I’ll keep an eye on now that I can see what’s going on. I’ll probably modify the software to track the various charging stages.
I wish I understood the battery charging technology better. The batteries are wired in parallel, so there's electrical continuity between them. It just seems to violate Ohm's law to be able to send specific amounts of current to particular batteries.
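One way to square this with Ohm's law: each battery is roughly an EMF in series with its own internal resistance, and the charger holds the shared bus voltage, so each branch current follows Ohm's law on its own. The charger isn't steering current anywhere; the batteries' differing states of charge do it. A toy example with made-up numbers:

```python
# Each battery modeled as EMF + internal resistance.  With the charger
# fixing the bus voltage, branch currents differ whenever the EMFs
# (i.e., states of charge) differ.  All values are illustrative.

def branch_current(v_bus, emf, r_internal):
    """Charge current into one battery branch: I = (V_bus - EMF) / R."""
    return (v_bus - emf) / r_internal

v_bus = 13.2                                                  # charger output
left  = branch_current(v_bus, emf=12.9, r_internal=0.30)      # fuller battery
right = branch_current(v_bus, emf=12.3, r_internal=0.30)      # more depleted
print(f"left: {left:.1f} A, right: {right:.1f} A")
```

With these made-up numbers you get 1 A into the left battery and 3 A into the right, the same sort of split seen on the sensors; as the right battery's EMF rises during charging, its current falls to match.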
Two big steps forward… one big step back. After a lot of work getting ROS Melodic up and running and everything talking to it properly, it's moving around the room very well with the sonar sensors and the laser scanner. However, my wife is reorganizing rooms upstairs and has packed the hallway with ‘crapple’ (what I call stuff that really should be taken to the garbage, but for some reason she wants to keep), and it's too narrow for the robot to even attempt to go down.
After deciding to wait for the hallway to get cleared-up so I can try to navigate through the doorway, I went ahead and connected the Kinect back up to see how it performed. To keep a long story relatively short, after 16 grueling hours of troubleshooting and deep diving into source code, I discovered that the Atomic Pi, with a single USB 3.0 port (no 2.0 ports), somehow doesn’t have enough ‘bandwidth’ to support the Kinect and ANYTHING else… I can have a Kinect and nothing else, or everything else (two arduinos, esp32) and no Kinect. I’m not sure why the Kinect 360 takes all the bandwidth of a USB 3.0 port as I thought it was a 2.0 device.
Regardless, there is a USB 2.0 port on the motherboard in the form of a 5-pin 2.0mm pitch header (was intended for their webcam). Problem is I haven’t figured out a good way to connect a USB jack to it as they don’t make a cable for that purpose.
Radxa apparently will be shipping their Atom-based Rock Pi X in 3-4 weeks, so that might be the ultimate solution (I can get one with 4 GB of RAM). They offered samples to people who have a cool project to use it with, but they don't seem to be very interested in my project. I'm really not sure why. I'm going to start a thread over there and maybe see if I can convince them to send me one. Would love it if people would drop by and give it a like… and maybe vote in the poll.
Since the Kinect doesn't require an x86, I could put it on an RPi or the Jetson Nano I have, but I'm always thinking about power consumption, and if I can get it done in one box I'd rather keep it that way. I don't have an explicit use for my Jetson Nano now that I backed this kickstarter:
I'm hoping that will replace what I was going to build with the Nano, so maybe I can use the Nano if the Rock Pi X isn't up to doing everything. I might be asking too much of a single Atom processor (navigation + camera-based mapping). I'm looking at using STVL with the Kinect to create the costmap.
It worked, they are sending a sample! Scratch one problem off the list
Unfortunately, add another… Last night I was moving the robot forward using teleop_twist_keyboard and noticed it didn't always move straight at the beginning and end of the move. So I lifted the robot up (my baby is getting heavy these days), put it on some blocks, and was able to spin the wheels when I applied a fair bit of force (hard to describe how much). So once again, the axles are slipping in the coupler under acceleration. I don't recall, but I might not have threadlocked the set screws, so I'll disassemble the lower section and check. I was wondering about using the green (wicking) threadlocker. The idea is to put some down the hole of the set screw, then install the set screw. My thought is that some will wick into the actual shaft/coupler joint and set up there… so it not only locks the set screw, but also the shaft/coupler.
The green threadlocker worked well. The wheels/axles feel very solidly connected to the motor.
I now have the Kinect mounted to the top of the PVC support pipe. It's tilted down ~25 degrees and has good vision of obstacles. I used depth_nav_tools to convert the Kinect's depth image to a laserscan. Once I aligned that laserscan with the mapper's laserscan, I was finally able to get the robot to navigate through the door of my ‘lab’ and out into the hallway. But I had to disable the sonar sensors to do so. Maybe they aren't accurate enough or something. I might move them all to the back and just use them as backup sensors so they have little effect on forward navigation. The whole thing moves a lot better than my previous model… so I think the 3D printed parts for the motor and bearing mounts were definitely the way to go. The narrower design (2 inches cut off each side) seems to help as well.
I'm still waiting on the Radxa Rock Pi X to arrive, so I'm still using the Atomic Pi and have added my Jetson Nano to do the Kinect processing. Please excuse the mess of cables… it's temporary.
I also managed to modify the slamware ROS SDK to save the map and pose (pose is the location and orientation of the robot with respect to the map) to disk and then reload them… both are ‘service’ calls that I can initiate with rosservice… ultimately I might make some kind of webpage or something to trigger the saves and loads (and other things) when needed. This is important so I can name spots on the map (e.g., beer fridge) and tell the robot to go there. If I couldn't save/load the map, I'd have to reenter that data every time it gets power cycled, and it would be easier to just get up and get my own beer.
Unfortunately, after the server node running on the Jetson Nano loads the map and pose and sends them to the mapper, the server node crashes. When I restart the server node after the crash, the loaded map and pose are present on the mapper, so they do get loaded successfully; the server node must crash right after sending them. I'm pretty sure it's a memory thing, as sometimes a “free() invalid pointer” gets logged to the screen (sometimes not), but the PID just dies and there's no indication of where/how/why it died. I don't know enough about debugging tools yet to figure out the culprit. But I can consistently save and load the map, so until I figure out the software glitch, I can work around it with scripts or something to reload the server node.
Is this your code? You can run a node in gdb by setting a specific prefix. I can never remember the syntax, but it is something like prefix=“gdb -c run”.
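If it helps, I believe the attribute in a roslaunch file is `launch-prefix`, something like the following (the package and node names here are placeholders, not from your setup):

```xml
<!-- Run a node under gdb; pkg/type/name are placeholders. -->
<node pkg="my_pkg" type="my_node" name="my_node"
      output="screen"
      launch-prefix="gdb -ex run --args" />
```

With `output="screen"` you get gdb's backtrace in the terminal when the node dies, which should point at the free() culprit.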
Rospy is pretty easy to write, and so is the flask python library. There are also javascript definitions for messages and a way to set up a websocket, but that's not as natural for me, so I haven't tried it. But a flask page with a button that sends a service request should be a very quick (and dare I say fun) project.
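A minimal sketch of that flask-plus-button idea. `call_save_map` is a placeholder: in a real node it would wrap a `rospy.ServiceProxy` for whatever service you exposed; everything else is stock flask:

```python
# Flask page with one button that fires a (placeholder) service call.
from flask import Flask

app = Flask(__name__)

def call_save_map():
    # Placeholder for something like rospy.ServiceProxy('/save_map', Empty)()
    return "map saved"

PAGE = """
<form method="post" action="/save_map">
  <button type="submit">Save map</button>
</form>
"""

@app.route("/")
def index():
    return PAGE

@app.route("/save_map", methods=["POST"])
def save_map():
    return call_save_map()

# app.run(host="0.0.0.0", port=8080)  # uncomment to actually serve it
```

Swap the placeholder body for the real service proxy and add a button/route per service, and that's essentially the whole app.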
Narrowed it down to the pose loading routine. I guess I need to look at how to properly save the pose to a file… I tried something and thought it worked, but clearly not.
It’s working now… found someone on the interweb that accomplished the same thing I was trying to do (save a topic to a file and then reload it).
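For anyone trying the same thing, the core idea is just to flatten the message into plain data, write it to disk, and rebuild it on load. A minimal sketch (field names follow geometry_msgs/Pose; the JSON format and function names are my choices, not the thread's actual code):

```python
# Save/reload a pose as plain data.  Field names mirror
# geometry_msgs/Pose; JSON is just a convenient on-disk format.
import json

def save_pose(pose, path):
    """Write a flattened pose dict to disk."""
    with open(path, "w") as f:
        json.dump(pose, f)

def load_pose(path):
    """Read the flattened pose dict back."""
    with open(path) as f:
        return json.load(f)

pose = {
    "position":    {"x": 1.2, "y": -0.4, "z": 0.0},
    "orientation": {"x": 0.0, "y": 0.0, "z": 0.707, "w": 0.707},
}
```

On the ROS side you'd copy these fields into a fresh Pose message after loading, since the message object itself isn't directly serializable this way.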
I'll probably go the route of using rospy and flask… not sure if I want to use websockets, but I do have experience with them. Socketio is definitely handy for pushing data out to the browser, but I don't really want to create a big robot app or a ‘remote control’ for it, since I want to try to do everything possible through speech recognition. I want to keep the web interface as minimal as possible (graph/chart power levels, maybe a few reset/reboot options, etc.).
Then again… rosbridge_server looks pretty simple to implement and I can use my esp32 that I use to monitor voltages and power levels to serve the html page.
It's navigating really well. I just sent it into the twins' room and back without a hiccup. I think two things were key: first, disabling the sonar sensors, and second, building everything in release mode. I was constantly missing my control loop goal of 5 Hz, and once I built it in release mode (vs. the default, which was debug mode) it never missed a beat and ran MUCH smoother.
I'm now investigating a voice control system, and to my surprise there really aren't many “complete” systems out there. Lots of things cobbled together from different software with varying degrees of success. One looked promising (Picovoice), but though they talk about being open source, they are really pretty much closed source. So I think I've settled on trying Mycroft as the engine. I don't need it to do all that much, and someone is already using it, along with a Seeed Studio ReSpeaker Core v2.0, as the speech interface to their robot (one of those two-wheel balancing things).
Oh man, I wish I would have thought of that. That has bitten me sooo many times. I have a little alias that I use to build, and its main job is to make sure I am configured for RelWithDebInfo. If I'm not, it prints the long command to set it. I use catkin tools to build.
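In case it's useful, the catkin tools incantation is along these lines (from memory, so double-check against `catkin config --help`):

```sh
# Pin the build type once per workspace so debug builds don't sneak in:
catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
# Inspect what's currently configured (look for the CMake args line):
catkin config
```

RelWithDebInfo gives you optimized code that still has symbols, so a gdb backtrace stays useful when something crashes.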
I had a good robot day today too. I can't share the details (prop. stuff), but good results.