Gaming is now a $100 billion industry, with new enthusiasts joining in on the action every day. Studies show that nearly two-thirds of American households include someone who plays video games regularly. Although they have been around for decades, video games still feel like a new phenomenon, one that constantly pushes the limits of electronics, graphics, and digital distribution. In this article, we explore the history of gaming to discover how it has evolved from its beginnings to today. Join us as we take a trip down memory lane to see how the games, the community, and the industry have changed.


Academic researchers were the first to experiment with video game technology. In 1958, American physicist William Higinbotham developed Tennis for Two, an electronic tennis game that preceded Pong. The next milestone was the first game system designed for commercial home use: German-born American engineer Ralph Baer and his team developed a prototype for the first multiplayer, multiprogram video game system, nicknamed the “Brown Box.” After being licensed to Magnavox, it was released as the Magnavox Odyssey in 1972, beating Atari’s Pong to market by just a few months. Magnavox sold about 300,000 consoles between August 1972 and 1975, when the Odyssey was discontinued.

Even before the Brown Box was developed, licensed, and released, companies such as Sega and Taito were piquing the public’s interest in arcade gaming. Atari was the first gaming company to experience growth on a grand scale. It developed its games in-house and began marketing Pong, the first commercially successful arcade video game. Arcade cabinets began to show up in bowling alleys, restaurants, malls, and other venues around the world, and many chain restaurants installed them to attract new clientele. Players grew competitive and began recording their best scores.

Players later began competing on separate screens with the release of Empire, a game written for the PLATO system in 1973. According to usage logs, players spent about 300,000 hours playing Empire between 1978 and 1985. By the mid-1970s, companies were manufacturing personal computers and gaming consoles, and Intel’s invention of the world’s first microprocessor opened the door to a wave of new games. Gun Fight, released in 1975, was an early multiplayer combat shooting game and the first arcade game to run on a microprocessor. Players used one joystick to control movement and another to shoot in a chosen direction.

The Atari VCS was released in 1977, but initial sales were underwhelming, and the system was designed to play only simple games. Like the Fairchild Channel F before it, however, the VCS included an external ROM slot for game cartridges, which allowed later releases to outperform the console’s original capabilities. Around the same time, the gaming community blossomed with the release of hobbyist magazines and competitions. Then the market became too saturated to interest the gaming majority, and consumers and manufacturers faced the video game crash of 1983. The industry recovered with the popularity of Nintendo’s Famicom, released in Japan in 1983 and in North America as the Nintendo Entertainment System (NES) in 1985, but home computers also grew ever more popular, because they were more affordable than the game consoles of the time.

The concept of the LAN party was born with the release of Pathways into Darkness in 1993, and gaming enthusiasts united over Marathon on the Macintosh in 1994. Gaming as we know it today grew out of LAN networks and the Internet, which let users on different computers compete and interact. Within the last three decades, consumers have moved from consoles to home computers to mobile devices and beyond. We can only expect this colossal industry to keep growing as companies make advances in cloud-based streaming, virtual reality, and so much more.