Wi-Fi has developed, and will continue to develop, to deliver more bandwidth to your devices, but as the saying goes: build it and they will come.
Bandwidth will quickly be consumed; the more there is, the more users will take. Let’s take a brief look at the history of the WLAN and how the standard has changed over time.
In 1970, the University of Hawaii developed the first wireless network, used to communicate data among the Hawaiian Islands. However, it wasn’t until 1991 that the Institute of Electrical and Electronics Engineers (IEEE) (which I will describe in more detail next week) began to discuss standardizing WLAN technologies. In 1997, the IEEE ratified the original 802.11 standard; “802.11” is simply the technical designation for what we commonly call Wi-Fi.
In 1999, wireless was introduced to the general public as a “nice to have” with the ratification of 802.11a and 802.11b. These standards offered relatively low speeds (up to 54 Mbps and 11 Mbps, respectively), but that was acceptable at the time: there were no handheld mobile phones that used Wi-Fi, and very few laptops did.
By 2003, however, mobile devices that used Wi-Fi were beginning to appear, and portable laptops were becoming standard for both business and personal use. That is when 802.11g was ratified, delivering up to 54 Mbps in the 2.4 GHz band. Moving closer to today, the smartphone era truly began in 2007, and along with it came 802.11n, ratified in 2009.
The “n” standard brought faster data rates of up to 450 Mbps and supported devices in both the 2.4 GHz and 5 GHz bands. Today, smart devices are robust enough to replace specialized, more expensive laptop technologies, so wireless has had to catch up.
This brings us to the current realm of 802.11ac: the new wireless technology that ushers us into the age of Gigabit Wi-Fi.