Airport codes are one of the most fascinating and frustrating aspects of air travel. They are a necessary evil we all have to deal with if we want to get from one place to another, but they can also be a maddening mess. Some codes are perfectly logical, based on the city’s or airport’s name, but others don’t make a lot of sense. Let’s take a closer look at the alphabet soup of air travel to try to understand airport codes, what they mean, and why they can sometimes drive us all a little bit crazy.

To start, let’s watch this video from CGP Grey titled “The Maddening Mess of Airport Codes!” In the video, he goes through a brief history of airport codes and how they have evolved over time. He also explains how the three-letter codes are not random and why certain letters are used more often than others. But perhaps the most interesting part of the video is when he talks about how confusing and inconsistent airport codes can be, especially when they don’t match the name of the city or airport.

IATA


The International Air Transport Association (IATA) assigns three-letter codes to airports, and these codes can be chosen in different ways. Sometimes they are simply an abbreviation of the city’s name, such as COR for Córdoba, which is why Cork ended up with ORK: COR was already taken. In other cases, they refer to a historical name of the city. They can also be based on the airport’s name, such as CDG for Charles de Gaulle in Paris. And sometimes the letters don’t correspond to any of the above and are essentially a random combination, because every closer match to the city’s name is already in use by another airport, such as MQK for San Matías in Bolivia.

ICAO


In addition to IATA codes, the International Civil Aviation Organization (ICAO) assigns codes to airports for aviation safety and regulatory purposes. ICAO codes consist of four letters, where the first letter denotes the geographical region of the airport. The second letter represents the country, and the remaining two letters are generally assigned sequentially to the specific airport. For instance, L is assigned to southern European countries, E to northern European countries, K to the United States (except for Hawaii and Alaska), and Y to Australia. There are exceptions, though, such as GC for the Canary Islands. In the United States and Canada, however, many airports have ICAO codes that are identical to their three-letter IATA codes with the addition of a geographical prefix. For example, the IATA code JFK becomes the ICAO code KJFK.
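
To make that pattern concrete, here is a tiny sketch (written in Python, with a handful of well-known airports picked purely as illustrations, not an official lookup table):

```python
# Illustrative IATA -> ICAO pairs showing the regional prefix pattern
examples = {
    "JFK": "KJFK",  # New York: US ICAO codes put a K in front of the IATA code
    "LAX": "KLAX",  # Los Angeles: same K-prefix pattern
    "YYZ": "CYYZ",  # Toronto: Canadian ICAO codes add a C prefix
    "CDG": "LFPG",  # Paris Charles de Gaulle: LF = France, no relation to the IATA code
    "LHR": "EGLL",  # London Heathrow: EG = United Kingdom, again unrelated to the IATA code
}

for iata, icao in examples.items():
    is_prefixed = icao.endswith(iata)
    print(f"{iata} -> {icao} ({'prefix + IATA code' if is_prefixed else 'unrelated to IATA code'})")
```

The thing to notice is that the K and C prefixes simply wrap the existing IATA code, whereas the European ICAO codes are built from region and country letters and bear no relation to the IATA code at all.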

Why do US ICAO codes start with K?


The reason why the United States was assigned the letter “K” as its designation within the two-letter code system is not fully understood. However, there are a few theories.

One theory suggests that the use of “K” for the United States was based on the telegraph and radio communication practices of the time. In those days, radio was an important means of transmitting information, and a lone “K” in Morse code was the standard signal for “go ahead, ready to receive.” It’s possible that this association led to “K” being adopted as the prefix for American airport codes.

Another theory suggests that the use of “K” was a nod to the first letter of the company that developed the first commercial radio equipment, the “Kellogg Switchboard and Supply Company.” This company played a key role in the early days of aviation, and it’s possible that its association with “K” led to the adoption of the letter as the prefix for American airport codes.

Regardless of the reason for its adoption, the use of “K” as the prefix for American airport codes has become a widely recognized and accepted practice within the aviation industry.

Can US airports use all letters?


The FAA also restricts which letters US airports are allowed to use in their IATA codes. The letter N, for example, is reserved for the Navy, which is why Nashville has the code BNA instead of something starting with N. Q isn’t allowed either, for a Morse code reason: there is a set of three-letter international radio codes beginning with Q, used for quick communications, and US airport codes can’t start with Q to avoid clashing with them. That’s not all: Z is reserved for air traffic control stations themselves, and Y is mainly reserved for Canadian airports. However, the FAA doesn’t write the laws, so there are some exceptions where airports still ended up with codes starting with W or N.

Why Y for Canadian airports?


Most Canadian airports start with the letter “Y” because of a historical convention that dates back to the early days of aviation in Canada. When the first radio communication systems were developed for aviation, each airport was assigned a two-letter code to identify it. At the time, most of Canada’s airports were operated by the Department of Transport, and the letter “Y” was assigned to them as a prefix. While the radio communication system has since been replaced by more advanced technologies, the “Y” prefix has remained as a distinctive feature of Canadian aviation.

While IATA codes try to keep things organized and easy to understand, there are still many examples of airport codes that are confusing or inconsistent. Let’s start with megacodes for megacities: overarching codes that cover multiple airports in one city. London, for example, has six airports, of which (not very coherently) four start with L and two start with S, and then there’s the megacode LON, which you can use for flights landing in London when you don’t care at which airport. If you think that’s complicated, the airport in Basel has two codes of its own, MLH and BSL, plus the megacode EAP for when it could be either. What? This is all because the airport was a cooperation between France and Switzerland: each country used half of the airport, and hence two codes were needed. Nowadays the airport acts as one anyway, so the two codes don’t really matter much anymore.

The three letters of an IATA code can only form a little over 17,000 combinations, while there are around 40,000(!) airports. That is where ICAO codes come in handy, also covering barely used airstrips besides commercial airports. Four letters make a lot more combinations! I could explain all of the combinations and exceptions for all of these codes, but then I would have to write a book. However, there is one last fun fact I want to tell you: you won’t see J being used as the first letter of ICAO codes. Do you have any idea why? The answer is that it is used for Mars! In the Jezero crater, ICAO gave the iconic landing location the code JZRO. Pretty cool, right?
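
If you want to check that arithmetic yourself, here is a quick back-of-the-envelope calculation in Python (the 40,000 figure is simply the rough airport count quoted above):

```python
# How many codes can each system express?
iata_combinations = 26 ** 3   # three letters -> 17,576 possible codes
icao_combinations = 26 ** 4   # four letters  -> 456,976 possible codes

print(f"IATA: {iata_combinations:,} possible codes")
print(f"ICAO: {icao_combinations:,} possible codes")

# With roughly 40,000 airports worldwide, three letters fall short but four are plenty
print(iata_combinations < 40_000, icao_combinations > 40_000)   # True True
```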

Despite their quirks, airport codes are an important part of air travel, and they help to keep the industry running smoothly. But they can also be a source of frustration and confusion for travelers, especially when they don’t match the name of the city or airport. So the next time you find yourself scratching your head over an airport code, just remember that you’re not alone. Even the experts sometimes find them to be a maddening mess.


What will be the airport code of your next destination?
