'Fog computing' could be more important than the cloud

Posted Mar 20, 2017 by James Walker
Cloud computing has taken off in recent years, driven by increased demand for artificial intelligence and the Internet of Things. Now a new concept could surpass the cloud, bringing in a 'fog' of files with even greater power.
Hewlett-Packard ProLiant commercial data servers are assembled in Houston.
Donna Carson / Reuters
The term "fog computing" has been floating around for a couple of years. Coined by Cisco, it essentially refers to a strain of cloud computing with a less centralised storage system. Rather than keep everything in one physical facility, the network tries to move its files closer to where the end user – such as your phone – is located.
Distance is one of the biggest issues with current cloud computing. Because your phone doesn't have enough power to run the code behind assistants like Siri or Google Assistant, it talks to a data centre each time you make a voice query. The nearest servers could be hundreds or thousands of miles away, incurring a performance penalty each time you use the system.
Fog computing decentralises the cloud. It uses smaller data centres placed across a wider area to reduce the distance between your device and the servers, so you get a response more quickly. As TechRadar reports, a team at the University of Camerino has now developed the concept further, creating an actual "fog" of files that also increases cloud security.
This form of "fog computing" endlessly redistributes the packets that make up an individual file among the nodes in its network. At no point is the entire file stored in a single location. Even if an attacker gains access to one of the servers, all they can download are garbled sections of a file.
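The idea can be sketched in a few lines of code. The following is a hypothetical illustration, not the Camerino team's actual protocol: a file is split into fragments that are scattered across nodes and periodically reshuffled, so that recovering the file requires access to the whole network rather than any single server. All function and variable names here are invented for the example.

```python
import secrets

def split_file(data: bytes, num_fragments: int) -> list[bytes]:
    """Split raw bytes into roughly equal fragments."""
    size = -(-len(data) // num_fragments)  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

def scatter(fragments: list[bytes], nodes: list[dict]) -> None:
    """Place each indexed fragment on a randomly chosen node."""
    for idx, frag in enumerate(fragments):
        secrets.choice(nodes)[idx] = frag

def redistribute(nodes: list[dict]) -> None:
    """One round of reshuffling; a real system would run this endlessly
    so fragments never sit still on any one server."""
    for node in nodes:
        for idx in list(node):
            frag = node.pop(idx)
            secrets.choice(nodes)[idx] = frag

def reassemble(nodes: list[dict], total: int) -> bytes:
    """Only a client able to query every node recovers the whole file."""
    found = {}
    for node in nodes:
        found.update(node)
    return b"".join(found[i] for i in range(total))

data = b"confidential payload"
fragments = split_file(data, 4)
nodes = [dict() for _ in range(5)]   # each dict stands in for a server
scatter(fragments, nodes)
redistribute(nodes)
assert reassemble(nodes, len(fragments)) == data
```

An attacker who compromises one `node` dict sees at most a few out-of-context byte chunks; the paper's scheme goes further, guaranteeing that at no instant does any location hold the information in its entirety.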
"Our proposal is based on this idea of a service which renders information completely immaterial in the sense that for a given period of time there is no place on earth that contains information complete in its entirety," team members Rosario Culmone and Maria Concetta De Vivo said in the paper.
Combined with the wider decentralised aims of Cisco's fog computing, this solution envisions a "cloud" where data is taken out of the giant server farms and into smaller clusters in your region. The servers remain linked by a "fog" of data that prevents attackers from stealing information while keeping everything decentralised.
Fog computing is set to become increasingly prevalent over the next few years as more consumers demand faster access to emerging cloud-based technologies. By dismantling the monolithic data centre, processing becomes more efficient and the user gets a smoother experience. The concept could go some way towards solving the problems of cloud computing services operating in highly latency-sensitive environments, such as autonomous vehicles, medicine and telecommunications.
The OpenFog Consortium is already working towards widespread fog computing adoption. Its founding members are ARM, Cisco, Dell, Intel, Microsoft and Princeton University, with contributions from firms including Foxconn, Hitachi and Schneider Electric.