Instead of deleting the question, I'm posting this as community wiki anyway, as others may find it useful. Originally I made a mistake and forgot the one-half factor inside the sine function, which caused a wrong result and confused me.

Michel Thoby's webpage has the following formula for the equal-area (equisolid-angle) projection:

r = 2 f sin(θ/2)

where r is the distance from the centre of the projection plane to a point that is visible under angle θ. For a 10 mm focal length this gives r = 2 × 10 mm × sin((π/2)/2) ≈ 14 mm for the radius of the 180-degree image circle. This roughly matches the half-diagonal of APS-C sensors (nominally 15 mm), and diagonal fisheye lenses using this projection indeed all have ~10 mm focal length (Nikon 10.5 mm, Sigma 10 mm, Tokina 10-17 mm).

The reverse formula for calculating the field of view is

FoV = 2θ = 4 arcsin(r / (2f))

where r is now the half-diagonal of the sensor. For APS-C with r = 15 mm and f = 17 mm this gives 105 degrees, reasonably close to Tokina's claimed 100 degrees at 17 mm.

We can also plot the field of view versus focal length for equal-area-projection fisheyes and compare with rectilinear lenses. Most current APS-C cameras have slightly smaller sensors, with half-diagonals closer to 14 mm, so in practice the angle of view will be a bit less than such a graph shows. Let's make another plot for the sensor size of an actual camera (Nikon D7100).

Note 1: This is not valid for all fisheye lenses. Some use different projections; the Samyang 8 mm, for example, is said to be closer to stereographic. The page linked above has a lot of information on various projections.

Note 2: Some projection software, such as PanoTools, uses a more general projection formula, r = k1 f sin(θ/k2), where k1 and k2 are found empirically (from measurements) for different lenses. This page shows the results of such measurements.

The physical quantities of interest are the sensor size and the focal length. The latter, in the pinhole camera model, is the distance between the camera center and the image plane. Therefore, if you denote with f the focal length (in mm), with W and H respectively the image sensor width and height (in mm), and assume the focal axis is orthogonal to the image plane, by simple trigonometry it is:

FOV_Horizontal = 2 * atan(W/2/f) = 2 * atan2(W/2, f) radians
FOV_Vertical = 2 * atan(H/2/f) = 2 * atan2(H/2, f) radians
FOV_Diagonal = 2 * atan2(sqrt(W^2 + H^2)/2, f) radians

Note that, if you have the sensor size and the horizontal or vertical FOV, you can solve one of the first two equations for f and plug it into the third one to get the diagonal FOV.

When, as is usual, the focal length is estimated through camera calibration and is expressed in pixels, the above expressions need some adapting. Denote with K the 3x3 camera matrix, with the camera frame having its origin at the camera center (focal point), X axis oriented left-to-right, Y axis top-to-bottom and Z axis toward the scene. Let Wp and Hp respectively be the width and height of the image in pixels.

In the simplest case the focal axis is orthogonal to the image plane (K12 = 0), the pixels are square (K11 = K22), and the principal point is at the image center (K13 = Wp/2, K23 = Hp/2). Then the same equations as above apply, replacing W with Wp, H with Hp and f with K11.

A little more complex is the case just as above, but with the principal point off-center. Then one simply adds the two sides of each FOV angle.

If the pixels are not square, the same expressions apply for FOV_Vertical, but using K22 and Hp. The diagonal is a tad trickier, since you need to "convert" the image height into the same units as the width.
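A short script can check the equisolid-angle arithmetic above. This is a minimal sketch (the function names are mine; the numbers come from the text):

```python
import math

def equal_area_radius(f_mm, theta_rad):
    """Image-circle radius r = 2 f sin(theta/2) for the equal-area projection."""
    return 2.0 * f_mm * math.sin(theta_rad / 2.0)

def equal_area_fov(r_mm, f_mm):
    """Reverse formula: FoV = 2*theta = 4 * arcsin(r / (2 f)), in degrees."""
    return math.degrees(4.0 * math.asin(r_mm / (2.0 * f_mm)))

# 10 mm fisheye; theta = pi/2 is the half-angle of a 180-degree circle:
print(equal_area_radius(10.0, math.pi / 2))   # ~14.1 mm image-circle radius
# APS-C half-diagonal of 15 mm on a 17 mm fisheye:
print(equal_area_fov(15.0, 17.0))             # ~105 degrees
```

Plugging in r = 14 mm (the smaller half-diagonal mentioned above) instead of 15 mm gives correspondingly less than 105 degrees, matching the remark about actual sensor sizes.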
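The three pinhole FOV expressions translate directly into code. A minimal sketch, using as input the Nikon D7100's sensor dimensions (published as approximately 23.5 × 15.6 mm, an assumption not stated in the text):

```python
import math

def pinhole_fov_deg(w_mm, h_mm, f_mm):
    """Horizontal, vertical and diagonal FOV (degrees) of a pinhole camera
    with sensor w_mm x h_mm and focal length f_mm."""
    fov_h = 2.0 * math.atan2(w_mm / 2.0, f_mm)
    fov_v = 2.0 * math.atan2(h_mm / 2.0, f_mm)
    fov_d = 2.0 * math.atan2(math.hypot(w_mm, h_mm) / 2.0, f_mm)
    return tuple(math.degrees(a) for a in (fov_h, fov_v, fov_d))

# APS-C-sized sensor with a 17 mm rectilinear lens:
h, v, d = pinhole_fov_deg(23.5, 15.6, 17.0)   # roughly 69, 49 and 79 degrees
```

Note how much narrower this is than the 105-degree equal-area fisheye at the same 17 mm focal length: the projection, not just the focal length, determines the angle of view.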
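For the calibrated-camera case, here is a sketch that handles the off-center principal point by adding the two sides of each angle, as described. For the diagonal it sidesteps the unit-conversion subtlety by measuring the angle between the rays through opposite image corners directly; that formulation is mine, not spelled out in the text:

```python
import math

def fov_from_K(K, wp, hp):
    """FOV in degrees from a 3x3 camera matrix K (entries in pixels),
    for an image of wp x hp pixels. Assumes zero skew (K[0][1] == 0)."""
    fx, fy = K[0][0], K[1][1]   # K11, K22: focal lengths in pixel units
    cx, cy = K[0][2], K[1][2]   # K13, K23: principal point
    # Off-center principal point: add the two sides of each FOV angle.
    fov_h = math.atan2(cx, fx) + math.atan2(wp - cx, fx)
    fov_v = math.atan2(cy, fy) + math.atan2(hp - cy, fy)
    # Diagonal: angle between the rays through opposite image corners,
    # which also handles non-square pixels (fx != fy) without conversion.
    u = ((0.0 - cx) / fx, (0.0 - cy) / fy, 1.0)
    v = ((wp - cx) / fx, (hp - cy) / fy, 1.0)
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    fov_d = math.acos(max(-1.0, min(1.0, dot / (nu * nv))))
    return tuple(math.degrees(a) for a in (fov_h, fov_v, fov_d))
```

With square pixels and a centered principal point (the made-up values fx = fy = 1000 px, 640 × 480 image, for instance) this reduces to the simple formulas above with W, H, f replaced by Wp, Hp, K11.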