The morning started with a message from a friend: “I used your photos to train my local version of Midjourney. I hope you don't mind”, followed by generated pictures of me wearing a flirty steampunk costume.
I did in fact mind. I felt violated. Wouldn't you? I bet Taylor Swift did when deepfakes of her hit the internet. But is the legal status of my face different from the face of a celebrity?
Your facial information is a unique form of sensitive personal information: it can identify you. Intense profiling and mass government surveillance receive much attention. But businesses and individuals are also using tools that collect, store and modify facial information, and we're facing an unexpected wave of photos and videos generated with artificial intelligence (AI) tools.
The development of legal regulation for these uses is lagging. At what levels and in what ways should our facial information be protected?
The Australian Privacy Act treats biometric information (which includes your face) as part of our sensitive personal information. However, the act doesn't define biometric information.
Despite its drawbacks, the act is currently the main legislation in Australia aimed at protecting facial information. It states biometric information cannot be collected without a person's consent.
But the law doesn't specify whether this consent must be express or implied. Express consent is given explicitly, either orally or in writing. Implied consent means consent may reasonably be inferred from the individual's actions in a given context. For example, if you walk into a store displaying a sign that says “facial recognition camera on the premises”, your consent is implied.
But relying on implied consent opens our facial data up to potential exploitation. Bunnings, Kmart and Woolworths have all used easy-to-miss signage stating that facial recognition technology is in use in their stores.
Our facial information has become so valuable, data companies such as Clearview AI and PimEye are mercilessly