What’s The Next Big Step in Technology?

Alex Elchev, Reporter

As companies continue to make innovations in their fields, what’s the next revolutionary improvement?

When Apple released its first personal computer (PC) in 1977, it kicked off the commercial computer industry, a market worth billions of dollars today. Since then, much smaller inventions have been hailed as the next big thing, the most recent example being the MacBook Pro’s Touch Bar. Similarly, television and screen design are reaching a technological plateau with the introduction of 8K Ultra HD displays. The mobile market is historically competitive, with Samsung and Apple pushing each other toward yearly innovations. Most of these changes are incremental, so what is the next major technological breakthrough?

According to History.com, personal and industrial computer performance has been steadily improving since the very first computers were built during World War II. Before 1977, computers were mostly military machines, used for complex calculations and strategic planning. Several key components directly led to the slimming and mass production of computers, the most important of which was the microprocessor, commonly called the CPU. Modern CPUs use small quantities of gold, prized for its high electrical conductivity, to carry out most of a computer’s instructions, and they are found in nearly every phone, laptop, and desktop.

Gold may be an excellent electrical conductor, but it will eventually be phased out of electronics. Experimental carbon technologies such as graphene may replace metals in computing. Currently expensive and complicated to manufacture, graphene is a material consisting of a single layer of carbon atoms. In 2014, IBM built a graphene-based chip that reportedly performed 10,000 times better than comparable processors. It is unclear when graphene chips will be available to the public, but they promise to outperform our current technologies.

Scientists and researchers are also looking into alternatives to graphene. Atom-thick structures made of silicon, boron, or nitrogen could each replace graphene as the premier wonder material, and only time will tell whether any of them will succeed.

Similar to computers, screen technology might also be hitting a developmental plateau. Each pixel is a cluster of three tiny subpixels that emit red, green, and blue light; combined, these three lights produce every color on a screen. As mentioned previously, 8K Ultra HD televisions, which pack a whopping 33 million pixels, are the current peak of commercially available screens, and they may not be far from the optimal product.
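For readers curious how those three lights add up, here is a minimal sketch of the additive color model (the `mix` helper is hypothetical, written just for illustration) that combines three subpixel intensities into a familiar hex color code:

```python
def mix(red, green, blue):
    """Combine three subpixel intensities (0-255) into one hex color code."""
    return f"#{red:02X}{green:02X}{blue:02X}"

print(mix(255, 0, 0))      # only the red subpixel lit -> "#FF0000"
print(mix(255, 255, 0))    # red + green at full brightness -> yellow, "#FFFF00"
print(mix(255, 255, 255))  # all three at full brightness -> white, "#FFFFFF"
```

Every color a screen shows is some combination of these three intensity values per pixel.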

Retina displays, currently employed by Apple and other companies, are already sharp enough to be indistinguishable from higher screen resolutions. In other words, at a certain point, our eyes simply can’t see more pixels. Scaled up, these displays are more detailed than IMAX theater screens.

(Photo: Apple CEO Tim Cook introduces the MacBook Pro with its high-resolution Retina display at the Apple developers conference in San Francisco, June 11, 2012. AP Photo/Marcio Jose Sanchez)

As of December 18, 2016, the Samsung Galaxy S7 has the highest pixel density of any phone, at about 577 pixels per inch (ppi). That nearly doubles the iPhone 7’s 326 ppi, even though the naked eye can barely distinguish between the two at arm’s length.
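Pixel density is simply the screen’s diagonal resolution in pixels divided by its diagonal size in inches. As a rough check, the commonly published panel specs for the two phones (2560×1440 at 5.1 inches for the Galaxy S7, 1334×750 at 4.7 inches for the iPhone 7) reproduce the quoted figures:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density = diagonal resolution (pixels) / diagonal size (inches)."""
    return math.hypot(width_px, height_px) / diagonal_in

# Galaxy S7: computes to ~576, matching Samsung's quoted 577 to within rounding.
print(round(ppi(2560, 1440, 5.1)))
# iPhone 7: computes to 326, matching Apple's quoted figure.
print(round(ppi(1334, 750, 4.7)))
```

The small gap on the Galaxy S7 comes from rounding in the published diagonal size.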

If not in mobile technology or computing, in what field could the next major technology arise? A popular answer is artificial intelligence (AI). Several products already exist that can learn independently of their creators: Amazon’s Echo and Echo Dot, along with Google’s Google Home, are capable of observing their owners’ patterns and adapting their behavior based on past experience. Even though AI has been used in some capacity for decades, it has made significant strides in the past few years.

Artificial intelligence has arguably the most potential of any future technology. In an article by Business Insider, experts argue that its future will greatly benefit humanity. High-functioning learning programs could predict natural disasters before they happen (and help prevent or mitigate climate change), search for cures to diseases in ways no human could, and even be integrated into people to augment reality. Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence, says that when it comes to artificial intelligence, “the sky’s the limit.”

When it comes to technology, no one can predict which big thing will truly make an impact, whether immediately or as a long-term investment. Although new things are made every day, no one can tell whether they’ll be booms or busts.