Azimuth

Definition

The azimuth is defined as the angle between a reference direction and the line from the observer to the point of interest, projected onto the same plane as the reference direction; that plane is orthogonal to the zenith, i.e. the horizontal plane.

Etymology

The word derives from the Arabic al-sumūt ("the directions"), the plural of al-samt ("the direction").

Measurement

Azimuth is an angular measurement in the spherical coordinate system. The vector from the origin (the observer) to the point of interest is projected perpendicularly onto the reference plane; the azimuth is the angle between that projected vector and a reference vector on the reference plane.

In the horizontal celestial coordinate system, azimuth gives the horizontal direction of an astronomical body in the sky. Here the point of interest is a star, the reference plane is the horizontal plane around the observer on the Earth's surface, and the reference vector points towards true north; the azimuth is then the angle between the north vector and the star's vector projected onto the horizontal plane. Azimuth is usually measured in degrees. It is used in many applications, chiefly mapping, astronomy, engineering, navigation, and ballistics.
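As a minimal sketch of this projection, assuming the observer-to-target vector is already expressed in local east-north-up (ENU) components (the function name azimuth_enu is illustrative, not taken from any standard library):

    import math

    def azimuth_enu(east, north, up):
        """Azimuth in degrees, measured clockwise from true north, of a
        target whose direction from the observer is given in local
        east-north-up components."""
        # Dropping the vertical ("up") component is the perpendicular
        # projection onto the horizontal reference plane.
        # atan2(east, north) is the angle from the north vector,
        # positive toward the east, i.e. clockwise seen from above.
        return math.degrees(math.atan2(east, north)) % 360.0

    # A target due east of the observer has azimuth 90 degrees,
    # whatever its elevation above the horizon.
    print(azimuth_enu(1.0, 0.0, 0.5))  # 90.0

The result is normalized to the range [0, 360), the usual convention for compass-style azimuths.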

Applications

  • Navigation
In land navigation, azimuth is usually denoted alpha, α, and defined as the horizontal angle measured clockwise from a north base line or meridian.
  • Cartographical azimuth
When the coordinates of two points on a flat plane are known, the cartographical azimuth (in decimal degrees) from the first point to the second can be calculated, as shown in the sketch after this list.
  • Astronomy
In celestial navigation, azimuth is the direction of a celestial body from the observer; in astronomy it is occasionally called a bearing. In modern astronomy it is commonly measured from the north. In the past it was commonly reckoned from the south, since the azimuth was then zero at the same time that a star's hour angle was zero. This assumes, however, that the star culminates (upper culmination) in the south, which is true only if the observer's latitude is greater than the star's declination.
  • Calculating coordinates
When the coordinates X1, Y1 of one point, the distance L between two points, and the azimuth α to the other point are known, the coordinates X2, Y2 of that point can be calculated (see the sketch after this list). This technique is commonly used in triangulation. A better approximation treats the Earth as a slightly squashed sphere (an oblate spheroid); azimuth then has two very slightly different meanings. Normal-section azimuth is the angle measured at our viewpoint by a theodolite whose axis is perpendicular to the surface of the spheroid; geodetic azimuth is the angle between north and the geodesic, that is, the shortest path on the surface of the spheroid from our viewpoint to point 2. The difference is usually negligibly small: if point 2 is less than 100 km away, it is less than 0.03 arc second.
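The two planar calculations above, the cartographical azimuth and its inverse, can be sketched in a few lines of Python; the function names are hypothetical, and x is taken as the easting, y as the northing:

    import math

    def cartographical_azimuth(x1, y1, x2, y2):
        """Azimuth in decimal degrees from point 1 to point 2 on a flat
        plane (x = easting, y = northing), measured clockwise from
        grid north."""
        return math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360.0

    def point_from_azimuth(x1, y1, distance, azimuth_deg):
        """Inverse problem: coordinates of the point reached by
        travelling `distance` from (x1, y1) along `azimuth_deg`."""
        a = math.radians(azimuth_deg)
        return x1 + distance * math.sin(a), y1 + distance * math.cos(a)

    # Round trip: travel 100 units at azimuth 45 degrees, then
    # recover the azimuth from the two coordinate pairs.
    x2, y2 = point_from_azimuth(0.0, 0.0, 100.0, 45.0)
    print(cartographical_azimuth(0.0, 0.0, x2, y2))  # 45.0

Both functions use the plane-surveying convention in which the angle is taken clockwise from the y (north) axis, which is why the arguments of atan2 are swapped relative to the usual mathematical convention.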