A related toy, the ‘My Friend Cayla’ doll, on a shelf in a London shop.
© LEON NEAL / AFP
The technical minute
The connectivity of toys, electronic gadgets and other high-tech devices increasingly exposes users’ personal data to the risk of cyberattacks and hacking.
Atlantico: Bluetooth connections, augmented reality, companion applications… There is hardly a toy or gadget left without connectivity. What are your tips to keep them from becoming vacuum cleaners for personal data?
Loïc Guézo: The first tip is to identify the connected functions in these objects. From there, depending on the object, you can adjust various settings to minimize the surface exposed to potential data harvesting. The second tip is to research the product and, where possible, avoid handing your data to manufacturers who do not comply with the various data protection regulations. In 2015, I dealt with my first hacking case of this kind, involving a manufacturer of connected toys that had failed to build into its products the security controls needed to properly manage the data they collected. The personal data of the children using the toys leaked and could be used against them.
Another case dates back to 2017, when a “conversational” doll was banned in Germany after researchers found that its Bluetooth connection could be hacked to let strangers listen in on children. Have companies made any progress on security since then? How can we be sure?
It’s hard to tell, because a new wave of smart toys appears every year around the holidays, and they differ from year to year. But overall, the suppliers of these toys, like any supplier of connected items, are under increasing pressure from regulators in the countries where the toys are sold. In France, European regulations are starting to take hold, in particular on the protection of collected data, but also, and above all, on the security by design of these connected objects. In 2023 we are in a transition phase where, at least in Europe, all connected objects must meet security requirements when they are placed on the market and throughout their lifetime. That is good news.
But this problem of misappropriating an object’s functions will always remain, as with the doll you mentioned, which was turned into a spy microphone. In that case we are no longer looking at normal use of the object but at a diversion of its functions. Parents need to consider these unconventional uses with a somewhat paranoid eye when their children are given this type of object, or when they are thinking of giving one themselves. You have to become a bit of a technician, a bit of a screenwriter, and imagine what a diversion of functions could look like: cameras, microphones, geolocation… These scenarios aren’t exactly science fiction. For example, if an object has cameras and a geotagging feature, someone who accesses that data could piece together your living environment. We can imagine scenarios where a would-be burglar makes the child’s doll talk, listens for a response, and infers whether the home is occupied.
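To make the geolocation risk concrete, here is a minimal illustrative sketch; the data, the coordinates and the “overnight” heuristic are all invented, not taken from any real toy. Given timestamped geotags leaked by a connected object, an attacker could guess the owner’s likely home simply by looking at where the device sits at night:

```python
from collections import Counter

def likely_home(geotags):
    """Guess a home location from leaked (hour, lat, lon) geotags:
    the most frequent coarse position recorded overnight."""
    night = [(round(lat, 2), round(lon, 2))   # snap to ~1 km grid cells
             for hour, lat, lon in geotags
             if hour >= 22 or hour < 6]       # keep overnight samples only
    return Counter(night).most_common(1)[0][0] if night else None

# Invented sample: two daytime outings and recurring overnight points.
leaked = [(9, 48.8606, 2.3376),   # daytime
          (23, 48.8530, 2.3499),  # night, same spot
          (2, 48.8531, 2.3501),   # night, same spot
          (14, 48.8462, 2.3371),  # daytime
          (23, 48.8529, 2.3498)]  # night, same spot
print(likely_home(leaked))  # → (48.85, 2.35)
```

Three noisy overnight readings collapse into one grid cell, which is enough to locate a household; this is why the expert treats geotagging as a sensitive function to disable.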
You mentioned security by design. Does it take the form of, say, camera shutters or on/off switches for a microphone? Should you check for this sort of thing when buying?
That is more privacy by design: the object must allow you to disable the functions that collect sensitive data. Classically that is a shutter over a camera, but it can also take the form of a control panel for the various functions of the digital object or application, in which all sensitive functions can be deactivated. It must also be possible to determine what data is collected, and where and how that data is sent, stored, protected, and destroyed after a given retention period. This gives you a true overview of the risk associated with the object.
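As an illustration of such a control panel, here is a minimal sketch; all names, defaults and the 30-day retention period are invented for the example, not a real toy’s API. Sensitive functions default to off, collection requires an explicit opt-in, and every record carries an age so it can be destroyed once the retention period expires:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyPanel:
    """Invented sketch of a privacy-by-design control panel."""
    functions: dict = field(default_factory=lambda: {
        "camera": False, "microphone": False, "geolocation": False})
    retention_days: int = 30                       # destroy data after this
    collected: list = field(default_factory=list)  # (function, payload, age)

    def enable(self, name):
        self.functions[name] = True    # explicit opt-in, per function

    def collect(self, name, payload, age_days=0):
        if not self.functions.get(name):
            return False               # a disabled sensor collects nothing
        self.collected.append((name, payload, age_days))
        return True

    def purge(self):
        """Destroy records older than the retention period."""
        self.collected = [r for r in self.collected
                          if r[2] < self.retention_days]

panel = PrivacyPanel()
panel.collect("microphone", b"audio")         # rejected: mic is off
panel.enable("microphone")
panel.collect("microphone", b"audio", age_days=45)
panel.purge()                                 # 45 > 30: record destroyed
print(len(panel.collected))  # → 0
```

The design choice mirrors the expert’s point: opt-in toggles cover the “deactivate all sensitive functions” requirement, and the purge step covers destruction after a set period.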
So this is a subject that goes beyond connected objects. It should also be checked for applications, right?
This should in fact be the ABC of any user of any application, whether a classic one like a social network or a connected object, which is ultimately just an application with particular sensors and a specific physical form, such as a doll.