I built the Stewart platform using hobby servos, an Arduino, and quaternion mathematics. Now I want to try transferring the project to stepper motors with planetary gearboxes, since the idea came up to use the platform as a robot, but the noise from the servos is very loud.
I found your wonderful project, the Jackpot CNC Controller Board.
If I understand correctly, this is a regular ESP32 and I can program the board as usual? It’s just that the inputs/outputs are predetermined.
It also has a GPIO expander controlling some of the outputs over I2C. Take a close look at the schematic to see which outputs are controlled by that GPIO expander.
The ESP32 isn’t bootloader-locked or anything. You can put whatever code you want on it.
You can also edit the schematic and board layout in EasyEDA and have them build you slightly modified versions.
Please post any details. That project seems awesome.
Thank you, I placed an order and will wait for the parcel. In the meantime, I will order stepper motors with planetary gearboxes. A little later I’ll post my notes on the theoretical part of the platform’s operation using quaternions (there are practically no sines/cosines, so it should run quickly on microcontrollers) - I need to refresh it myself and translate it into English.
I use quaternions at my work to avoid the ambiguity of Euler angles. I don’t really understand them deeply, but I know enough to use them. I still convert to Euler angles when I debug.
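For reference, that debug conversion is usually the standard unit-quaternion to roll/pitch/yaw (ZYX) formula. A minimal pure-Python sketch, assuming (w, x, y, z) component order and a normalized quaternion:

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to roll/pitch/yaw
    (ZYX convention) in radians -- handy for debugging."""
    # roll: rotation about the x-axis
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # pitch: rotation about the y-axis; clamp so floating-point
    # drift can't push asin() out of its [-1, 1] domain
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(s)
    # yaw: rotation about the z-axis
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return roll, pitch, yaw
```

For example, a pure 90° yaw quaternion (w = z = √0.5) comes back as (0, 0, π/2).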
But I code for Linux and have plenty of CPU and GPU power.
Well, I guess I did not take every math class known to mankind… This is something I have never even heard of, let alone touched.
In fact, such a branch of mathematics does not exist, except for linear quaternion algebra…
“Quaternion mathematics” is my own phrase, but it means exactly what it says - using quaternions for an exact position in space, since they do not have the disadvantages of Euler angles, whether the airplane or ship conventions. Although they are hard to picture intuitively, the code and mathematics are very enjoyable to work with.
I use the HTC Vive controller as a 6-axis mouse in Blender (3 translation axes and 3 rotation axes). By default, the position data from the controller appears as if it is turned upside down, and the origin of the coordinate system is inside the controller.
I work with it through a Python script and treat the bottom of the controller as the origin. Through OpenXR I get a matrix, decompose it into a position vector + rotation quaternion, rotate the offset vector by the resulting quaternion, and subtract that vector from the position vector, which gives a normal “picture”. I couldn’t do it with Euler angles, and it is very easy to get confused with matrices.
Yeah. One thing I have learned is you always have to write out transforms. Any time I just try something, it seems right until it isn’t. Then there ends up being a bunch of 90-degree-heading kinds of fixes that patch the other 90-degree-heading bugs I have. It only works if I clearly define the coordinate systems and explicitly write out the transforms.
I mostly use ROS, which has tf and tf2. I also use the Eigen library, which is really good on full computers; I haven’t tried it on microcontrollers. It handles the repetitive math and gives you an API so you can construct a transform matrix by making a Quaternion(x, y, z, w) and a Translation(x, y, z) and just multiplying them together, and then by the matrix or vector you want to transform. It can do much more. My job is usually just to make sure the transform from a sensor to the robot base is correct, then let Eigen or tf2 do the math to transform the data to robot coordinates.