3 Computing Trends for 2021

With the fourth industrial revolution upon us, speed and security will rule the day. Data analytics, artificial intelligence, machine learning, the Internet of Things, and cloud computing, innovations that are becoming ever more central to our businesses, governments, and daily lives, all rely on the same technological foundations: security, speed/power, and edge computing. Find out more about these trends below:

Security

Halfway through 2021, it’s looking like this could be the year of cybersecurity. At the end of 2020, a supply-chain hack compromised SolarWinds software, which runs in hundreds of departments and agencies of the United States government. Microsoft called it “the largest and most sophisticated attack the world has ever seen.”

The quantity and depth of breaches have only increased this year. Ransomware, in which attackers shut down a network until their financial demands are met, has proven especially popular among cyber criminals. In early June, the world’s largest meat processing company, JBS, paid an $11M ransom to resume its operations. Colonial Pipeline, which supplies 45% of the fuel to the U.S. East Coast, paid hackers a $4.4M ransom.

In early July, a supply-chain attack struck software vendor Kaseya, hitting at least 200 U.S. companies and forcing the Swedish supermarket chain Coop to close roughly half its stores. One cyber security expert calls it “the SolarWinds of ransomware.” The Economist reports that such cyber crimes are increasingly common.

According to European Union figures obtained by CNN, significant cyberattacks against critical targets in Europe, such as hospitals, have doubled in the past year. The European Commission has therefore announced plans to form a Joint Cyber Unit to tackle cyber-attacks.

Speed/Power

It’s a common refrain that the phone in your pocket holds more computing power than the systems that guided Apollo 11 to the moon. It’s hard to describe how far computation has come, and it keeps getting faster and more powerful. Between quantum computing and multi-core processing, the digital world is moving toward a new era of possibilities.

Tech behemoths like Google, IBM, and Microsoft are developing quantum computers, which, theoretically, will push past the limits of even the fastest supercomputer. Google claimed “quantum supremacy” in fall 2019, when its prototype quantum computer performed a calculation in about three minutes that Google estimated would take the world’s most powerful supercomputer 10,000 years (closer to two and a half days, according to IBM). A stable quantum computer could still be years away, but it’s clear that should it arrive, it will represent a difference in kind from current computing capabilities.

For consumers, multi-core processors are beginning to shake up everything from laptops to phones to wearables. Multi-core processing has been hailed for years, and Apple’s new M1 chip may be its breakthrough. The chip’s eight CPU cores in theory let it run faster, quieter, and more efficiently (i.e., using less battery) than comparable Intel chips. Though real-world testing complicates claims of the M1’s supremacy, Apple could yet fulfill multi-core’s potential with further development.
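As a rough illustration of why multiple cores matter (a minimal sketch, not tied to any specific chip), a CPU-bound job split into chunks can run on all cores at once instead of queuing on one. Here each chunk counts primes by trial division, a deliberately slow task:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-bound)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split one big range into chunks, one per worker; on a multi-core
    # chip the chunks are computed simultaneously on separate cores.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # prints 9592, the number of primes below 100,000
```

On a single-core machine the chunks would run one after another; with four or eight cores the wall-clock time drops roughly in proportion, which is the promise multi-core chips are chasing.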

Edge Computing

According to one estimate by Gartner, an IT service management company, at least 80% of companies will move away from traditional data centres by 2025. MIT Technology Review projects that the global edge computing market will grow at a compound annual rate of more than 37% between 2020 and 2027, reaching $43.4B by 2027, up from $3.5B in 2019.

Edge computing is the next stage in cloud computing. Rather than a few dozen centralized data centres doing the computing, edge computing happens at or near the source of the data being generated. The big impact is reduced latency, which makes a considerable difference when vast amounts of data must travel long distances. Edge computing can also increase bandwidth, so that more data can move across a network at any time. It also has the potential to better protect data and privacy; to provide the flexibility to comply with local regulations in an increasingly splintered global Internet; and to ensure continuous operation.
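A back-of-envelope calculation shows why proximity cuts latency (the distances here are illustrative assumptions, not figures from the article): even at the speed of light in optical fiber, every kilometre between a device and the data centre adds delay that no amount of server power can remove.

```python
# Light travels through optical fiber at roughly 200,000 km/s
# (about two-thirds of its speed in a vacuum).
FIBER_KM_PER_MS = 200.0  # ~200 km of fiber per millisecond, one way

def round_trip_ms(distance_km: float) -> float:
    """Best-case propagation delay for one request/response pair."""
    return 2 * distance_km / FIBER_KM_PER_MS

# A centralized data centre 2,000 km away vs. an edge node 20 km away:
print(round_trip_ms(2000))  # 20.0 ms best case, before any processing
print(round_trip_ms(20))    # 0.2 ms
```

Twenty milliseconds per round trip sounds small, but a chatty application making dozens of sequential round trips quickly accumulates user-visible lag, which is exactly the gap edge computing aims to close.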

SOURCES

Bajak, Frank, Matt O’Brien, and Eric Tucker. “Ransomware attack suspected from REvil gang hits at least 200 U.S. companies.” The Associated Press. 2 July 2021.

BBC. “Colonial Pipeline boss ‘deeply sorry’ for cyber attack.” 8 June 2021.

BBC. “Meat giant JBS pays $11m in ransom to resolve cyber-attack.” 10 June 2021.

Brant, Tom. “What is the Apple M1 chip?” PCMag. 22 April 2021.

The Economist. “Ransomware attacks like the one that hit Colonial Pipeline are increasingly common.” 10 May 2021.

Howarth, Josh. “7 Important Computer Science Trends 2021-2025.” Exploding Topics. 18 May 2021.

IEEE Computer Society. “IEEE Computer Society 2022 Report.”

MIT Technology Review. “Computing at the Cutting Edge.” 26 June 2021.

Moore, Susan. “The Data Centre Is (Almost) Dead.” Gartner. 5 August 2019.

Paton Walsh, Nick. “Serious cyberattacks in Europe doubled in the past year, new figures reveal, as criminals exploited the pandemic.” CNN. 10 June 2021.

Reuters. “U.S. government SolarWinds hack was largest, ‘most sophisticated attack’ ever: Microsoft.” 15 February 2021.

Temple, James. “10 Breakthrough Technologies 2020.” MIT Technology Review. 26 February 2020.

Tidy, Joe. “EU wants emergency team for ‘nightmare’ cyber-attacks.” BBC. 23 June 2021.

Tidy, Joe. “Swedish Coop supermarkets shut due to US ransomware cyber-attack.” BBC. 3 July 2021.
