Summary
I’ve been interested in this space for about six months, but this past week I’ve really started to dive in. For the most part I am familiarizing myself with software & dependencies, which has been a headache, but it’s a hump I have to overcome at some point. So far, here is what I’ve learned on the software side:
- Added a UTM VM running Ubuntu 22.04. On this instance I installed ROS Humble Desktop, which is what I will be using in robotics class.
- This was relatively straightforward. I was able to install from binaries, and there are Ubuntu 22.04 images that support ARM virtualization, which means it runs smoothly on my Mac.
- Struggled with ROS Melodic packages and dependencies.
- For my research we are running a UUV (Unmanned Underwater Vehicle) simulator. Unfortunately, this simulator only supports ROS versions up to ROS Melodic, which is still ROS 1. ROS Melodic is only supported on Ubuntu 18.04 and older, and Ubuntu 18.04 predates Apple silicon, so there is no ARM64 build I can run natively. So I explored the following solutions:
- Docker container running Ubuntu 18.04 with an emulated AMD64 architecture, using X11 forwarding to display the GUI on my Mac.
- → Failed: emulation was too slow and X11 forwarding was very glitchy.
- Docker container running Ubuntu 18.04 with an emulated AMD64 architecture, forwarding the screen with VNC.
- → Same problems as X11 forwarding.
- Using my friend’s Windows gaming laptop, installing VirtualBox on it, and running Ubuntu 18.04 from there.
- → This is the most promising solution. I was able to install the packages and libraries I need, and I have root access and a GUI. The only issue is that the VM is not fast enough to run the graphical simulator: it crashes each time I attempt to move the submarine around in its environment. I suspect this is because the VM only has access to the computer’s CPU and cannot use its graphics card. To fix this I would probably need to dual-boot Linux. I will ask if the lab has a computer I can remote into, because that would be easier than all of this.
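For reference, the emulated-Docker attempt looked roughly like this. This is only a sketch of what I ran, assuming Docker Desktop on an Apple-silicon Mac with XQuartz installed and network client connections allowed:

```shell
# Allow local connections to the XQuartz X server
xhost +localhost

# Run Ubuntu 18.04 under AMD64 emulation and forward the display to the Mac
docker run --rm -it \
  --platform linux/amd64 \
  -e DISPLAY=host.docker.internal:0 \
  -v /tmp/.X11-unix:/tmp/.X11-unix \
  ubuntu:18.04 bash
```

The `--platform linux/amd64` flag is what triggers the emulation, and it is also what makes everything so slow on ARM hardware.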
On the math side of things, I started learning about coordinate transformations. The idea is that there is some global reference coordinate system and the robot’s local coordinate system. Ultimately we want to know the robot’s position with respect to the global system, so we need a way to transform between the two.
We can use the coordinate transformation matrix $\begin{bmatrix}
\cos\theta & \sin\theta\\
-\sin\theta & \cos\theta
\end{bmatrix}$ to transform our coordinate system. This matrix is actually the same as the rotation matrix $\begin{bmatrix}
\cos\theta & -\sin\theta\\
\sin\theta & \cos\theta
\end{bmatrix}$ evaluated at $-\theta$. This makes sense because we can either rotate our coordinate system by $\theta$, or rotate our point by $-\theta$, and achieve the same final coordinates.
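A quick NumPy check of this relationship (a sketch; the function names are my own):

```python
import numpy as np

def rotation(theta):
    """Standard 2D rotation matrix R(theta): rotates a point counterclockwise."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def coord_transform(theta):
    """Coordinate transformation matrix: expresses a fixed point in a frame
    rotated counterclockwise by theta."""
    return np.array([[ np.cos(theta), np.sin(theta)],
                     [-np.sin(theta), np.cos(theta)]])

theta = np.pi / 6
# The coordinate transform equals the rotation matrix evaluated at -theta...
print(np.allclose(coord_transform(theta), rotation(-theta)))  # True
# ...which also means it is the inverse of R(theta): undoing one with the
# other gives the identity.
print(np.allclose(rotation(theta) @ coord_transform(theta), np.eye(2)))  # True
```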
In 2D these rotations compose additively: if $v_1 = R(\theta_1)v$ and $v_2 = R(\theta_2)v_1$, then $v_2 = R(\theta_2)R(\theta_1)v = R(\theta_1 + \theta_2)v$. In 3D this does not hold, because rotations about different axes do not commute.
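Both halves of that claim are easy to verify numerically. A small NumPy sketch (angle values chosen arbitrarily):

```python
import numpy as np

def R2(theta):
    """2D rotation by theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def Rx(t):
    """3D rotation about the x-axis."""
    return np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])

def Rz(t):
    """3D rotation about the z-axis."""
    return np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0, 0, 1]])

t1, t2 = 0.4, 1.1
# 2D: composing two rotations just adds the angles, so order doesn't matter.
print(np.allclose(R2(t2) @ R2(t1), R2(t1 + t2)))  # True

# 3D: rotations about different axes do not commute.
print(np.allclose(Rx(t1) @ Rz(t2), Rz(t2) @ Rx(t1)))  # False
```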