It's this fly-away stage that I want to analyze. If I can get the acceleration as it leaves, then maybe I can model its trajectory to see where it would land. Yes, NASA knows exactly where it landed—they even have a picture of its crash site. But it's fun to see if I can do this just from the single rover video.
OK, let's get started. The plan is to use angular size of the descent stage to get the distance from the rover in each frame of the video. But what is angular size, and what does it have to do with position? Here is a quick experiment for you. Take your thumb and hold it at arm's length from your face and close one eye. Yes, really do this. Now find something in the room that your thumb covers up. What happens when you bring your thumb closer to your eye? It looks bigger and covers up even more stuff in the background. The actual size of your thumb didn't change, just its angular size.
Suppose there is some other object—maybe it's a stick of length L—in your field of view. Imagine that you can draw a line from your eye to each end of the stick. It would look like this.
The stick is sort of like a part of a circle with a radius r centered on your eye. This means the length of the stick is approximately equal to the arc length that subtends an angle θ. Assuming the angle is measured in radians, then the following would be true: L = rθ. Turn that around and you get the key relationship—if you know an object's actual size and you measure its angular size, you can solve for its distance: r = L/θ.
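Here's a minimal sketch of that distance calculation in Python. The function name, the 2.7-meter width for the descent stage, and the 0.5-degree angular size are all my own illustrative assumptions, not values from the video analysis:

```python
import math

def distance_from_angular_size(true_size, angular_size_rad):
    """Estimate the distance to an object from its known physical size
    and its measured angular size in radians.

    Uses the arc-length approximation L = r * theta, rearranged to
    r = L / theta. This works well for small angles, where the object
    is nearly indistinguishable from an arc of the circle.
    """
    return true_size / angular_size_rad

# Hypothetical numbers: an object 2.7 meters wide that spans
# 0.5 degrees in a single video frame.
theta = math.radians(0.5)
r = distance_from_angular_size(2.7, theta)
print(f"Estimated distance: {r:.1f} meters")
```

Measure θ in each frame of the video, and you get a distance for each frame—which is exactly what you need to build a position-versus-time plot.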