Relying on smaller computing resources located closer to users, fog computing avoids large data centers. This approach is particularly well suited to the Internet of Things and to systems that require real-time computation.
In the world of distributed systems, where computing resources such as memory and processing power are spread across different machines, the cloud has become widely adopted over the past fifteen years. However, it is not the only way to store and process data remotely. "A cloud typically consists of one large data center, or several medium-sized ones, but this model is reaching its limits because it is increasingly difficult to concentrate computing resources," explains Guillaume Pierre, professor at the University of Rennes and member of the Institute for Research in Computer Science and Random Systems (Irisa). "We eventually realized that sending data far away for mass processing in large centers wasn't always the best solution."
Data processed and stored nearby
Fog computing ("informatique brumeuse", or misty computing, in French, editor's note) is one answer to these problems. Data is processed on network nodes that are smaller and closer to its source. For connected sensors and objects, it may therefore be enough to send information to a small local server, which forwards it to more powerful machines only when necessary. Dependence on large data centers does have its pitfalls. Because legislation differs between countries, it may be preferable that personal, medical or military data not be stored abroad. Distance can also cause latency, which is not a problem when you just want to check your email, but hurts critical or interactive applications, from video games to remote surgery.
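The article keeps this description at a high level; as a minimal sketch of the pattern (in Python, with invented names such as CLOUD_URL and an arbitrary alert threshold), a fog node could handle routine readings locally and contact the cloud only for alerts and periodic summaries:

```python
from statistics import mean

CLOUD_URL = "https://cloud.example.org/ingest"  # hypothetical remote endpoint
ALERT_THRESHOLD = 90.0                          # illustrative anomaly threshold

local_buffer: list[float] = []  # readings kept on the fog node

def send_to_cloud(payload: dict) -> None:
    # Placeholder for a real upload, e.g. requests.post(CLOUD_URL, json=payload)
    print("->", CLOUD_URL, payload)

def handle_reading(value: float) -> None:
    """Process a sensor reading locally; escalate to the cloud only if abnormal."""
    local_buffer.append(value)
    if value > ALERT_THRESHOLD:
        send_to_cloud({"event": "alert", "value": value})

def hourly_summary() -> None:
    """Forward a compact summary instead of the raw data stream."""
    if local_buffer:
        send_to_cloud({"event": "summary",
                       "mean": mean(local_buffer),
                       "count": len(local_buffer)})
        local_buffer.clear()
```

Only the alerts and the periodic summary cross the wide-area network; the raw readings never leave the local node.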
"The challenge of fog computing is to maintain the good properties of the cloud, such as flexibility and power, while allocating resources more efficiently," says Guillaume Pierre. "However, many models and assumptions designed for the cloud perform poorly, if at all, in fog computing. The architecture of these systems therefore needs to be reinvented."
For example, machines in data centers are physically very close together, which increases the reliability of computations and reduces latency in processing information. Their wired connections are in fact much faster than the wide-area networks that fog computing relies on, such as 5G, ADSL or fiber. Managing dispersed machines also requires specific optimization of data flows and of the network: if data is processed at nodes that are closer than a large data center located overseas, but communications between those nodes are poorly optimized, the data can end up traveling greater distances.
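To make this routing pitfall concrete, here is a hedged sketch (all link latencies are invented for illustration) showing how a poorly optimized route between two fog nodes can nearly cancel the benefit of processing close to the source:

```python
# Illustrative one-way latencies in milliseconds (assumed values).
LINKS = {
    ("device", "fog-a"): 5,    # nearby node on the local network
    ("fog-a", "fog-b"): 60,    # poorly optimized inter-node route
    ("device", "cloud"): 70,   # direct path to a distant data center
}

def path_latency(*hops: str) -> int:
    """Sum the link latencies along a sequence of hops."""
    return sum(LINKS[(a, b)] for a, b in zip(hops, hops[1:]))

# Splitting processing across two fog nodes vs. shipping data to the cloud:
print(path_latency("device", "fog-a", "fog-b"))  # 65 ms via the fog nodes
print(path_latency("device", "cloud"))           # 70 ms straight to the cloud
```

With a badly chosen inter-node route, the fog path barely beats the distant data center, which is exactly the flow-optimization problem described above.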
Experiments within a European project
These lines of research and improvement have been put into practice notably within FogGuru, a project coordinated by Guillaume Pierre. "This European doctoral training project enabled eight PhD students to exercise their talents in fog computing," the researcher explains. "The city of Valencia, in Spain, gave us the keys to the city to experiment with these technologies."
Two use cases were then studied. The first concerned the distribution of drinking water, a precious resource in this semi-arid region. Two hundred thousand smart meters had already been installed in residents' homes to detect leaks, but the system had not yielded satisfactory results. "It took three or four days to pinpoint a problem," explains Guillaume Pierre. "With fog computing, we collect and process data as often as once an hour, so we could notify technicians within half a day. We developed a prototype meter, but it was not possible to deploy it on a large scale because of the pandemic (of Covid-19, editor's note)."
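The article does not detail the detection logic; one plausible minimal sketch, assuming 24 hourly consumption readings per meter, is to flag any meter whose flow never drops to zero during the night:

```python
def likely_leak(hourly_liters: list[float], night_hours=range(2, 5)) -> bool:
    """Flag a meter whose consumption never falls to zero at night.

    A household normally shows zero flow during at least some night hours;
    continuous flow suggests a leak. `hourly_liters` holds 24 readings,
    one per hour of the day (index 0 = midnight).
    """
    return all(hourly_liters[h] > 0 for h in night_hours)

leaky = [3.0] * 24                            # constant flow, day and night
normal = [0.0] * 5 + [12.0] + [5.0] * 18      # quiet night, busy day

print(likely_leak(leaky))   # True  -> a technician can be notified within hours
print(likely_leak(normal))  # False
```

Running such a check on a local fog node every hour, rather than shipping raw readings to a distant center for batch analysis, is what shrinks the detection delay from days to hours.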
Left: a "smart" water meter, here connected directly to a tap to simulate water consumption. Right: one of the sensors installed in the Valencia marina to count pedestrians entering or leaving the area.
The second use case, which was completed, concerned the Valencia marina. Sensors were installed to measure wind and wave strength, take weather readings, and record entrances to and exits from the site, and this data was then used to help manage it. FogGuru organized a hackathon, won by a team that designed a mathematical model of silting in the marina. "Rather than dredging the whole area, which costs one to two million euros a year, this solution should make it possible to limit the work to priority zones covering only 10% of the marina," Guillaume Pierre says with delight. "This processing works without having to send the data, which is only useful to the port of Valencia, to a distant server."
Ways to reduce energy consumption
Fog computing is also attracting interest as an alternative to the enormous energy consumption of data centers. "Energy saving is generally not the main reason to prefer fog computing, because this system is currently less optimized than the cloud," tempers Guillaume Pierre. "But nowadays we can no longer afford to ignore the question. We are therefore studying the energy efficiency of fog computing platforms, but there is still a long way to go."
Prototype of a fog platform produced within the FogGuru project.
This theme is dear to Jean-Marc Pierson, professor at the University of Toulouse, director of the Toulouse Computer Science Research Institute (Irit) and a specialist in the energy efficiency of distributed systems. He is also a member of the CNRS service group for eco-responsible computing (GDS EcoInfo). "The difficulty is that fog computing doesn't replace the cloud, it adds to it," says Jean-Marc Pierson. "Overall consumption therefore increases. Fog computing, however, uses smaller facilities, requires less infrastructure, needs less cooling, and can more easily be powered by renewable energy. Full life-cycle studies of the different data centers would be needed to really know which system is the most frugal."
Among other lines of research on fog computing and distributed systems in general, scheduling, i.e. the optimized ordering of computing tasks, is a very active topic. Reducing data replication is also being studied, for example for large, popular files such as videos and music. "We are also looking at how users accept degraded service," continues Jean-Marc Pierson. "Slower service and lower-quality images yield significant energy savings. It's almost a philosophical question: is it really useful to get these services immediately, whenever you want them?"
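As a rough idea of what such scheduling decisions might look like (all latency figures and task names here are invented for illustration), a placement rule could keep latency-critical tasks on fog nodes and push tolerant ones to the cloud, possibly in a degraded, energy-saving mode:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    max_latency_ms: float  # deadline the application can tolerate
    degradable: bool       # can run in a reduced-quality, cheaper mode

FOG_LATENCY_MS = 10.0     # assumed latency to a nearby fog node
CLOUD_LATENCY_MS = 120.0  # assumed latency to a distant data center

def place(task: Task) -> str:
    """Naive placement: use the fog only when the deadline requires it."""
    if task.max_latency_ms < FOG_LATENCY_MS:
        return "infeasible"  # even the nearby node is too slow
    if task.max_latency_ms < CLOUD_LATENCY_MS:
        return "fog"
    # Loose deadline: the cloud is fine, possibly at reduced quality.
    return "cloud (degraded)" if task.degradable else "cloud"

for t in [Task("remote-surgery-feed", 20, degradable=False),
          Task("video-transcode", 5000, degradable=True),
          Task("nightly-backup", 60000, degradable=False)]:
    print(t.name, "->", place(t))
```

Real schedulers must also weigh node capacity, energy sources and data locality, but the sketch captures the basic trade-off the researchers describe between responsiveness and consumption.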
With its emphasis on less powerful, and therefore less energy-hungry, computer systems, fog computing could support these approaches to energy sobriety. However, existing systems will need further adaptation and optimization before this architecture can be fully deployed.♦