Meta has been training its artificial intelligence (AI) systems on public photos uploaded to Facebook and Instagram for years. Now the company is turning the same technology toward photos users have never uploaded from their phones to Facebook or Instagram at all.
The company told the technology outlet The Verge that it is not training AI on unpublished pictures from users' camera rolls. However, Meta gave no clear answer to the question of whether it might do so in the future, nor did it say what rights it would hold over these personal pictures.
Meanwhile, in a report last Friday, TechCrunch said that many users are seeing a pop-up message while using the Facebook Stories feature, asking permission for "cloud processing" that would regularly upload their camera roll to Meta's servers. Users who agree accept Meta AI's terms, under which the AI can analyze the users' pictures and the faces in them. Meta can even collect information about a picture, including the other people or objects in it.
The company says it is a "very early stage" test and will be conducted only with users' consent.
"We are not training on these pictures as part of this test," Ryan Daniels, a Meta public relations manager, told The Verge.
"We are trying to make content sharing on Facebook easier," said Meta communications manager Maria Cubeta. "That is why we are testing a feature that suggests content from the camera roll. It is based on the user's consent, and the pictures are shown only to you unless you share them. You can turn it off at any time if you want."
However, this initiative raises several questions. Under Meta AI's terms, if users consent, Meta may "retain and use" the media from the camera roll and the personal data it contains. Although Meta says it is not using these images for AI training, its terms offer no guarantee about future use.
In this context, Google Photos offers a point of comparison. Google has stated clearly that private pictures in Google Photos are not used for AI training. In Meta's case, that clarity is still missing.
Meta claims it collects at most seven days' worth of camera roll photos with the user's consent. However, TechCrunch noted that, by Meta's own account, in some cases pictures from outside that window can also be used by this feature.
Facebook users who want to stop camera roll cloud processing can do so in the settings. Once the feature is turned off, the unpublished images will begin to be removed from Meta's cloud within five days.
Meta recently acknowledged that it has collected content published on Facebook and Instagram over 25 years and used it to train its generative AI. Although the company said only the public posts of adult users were used, it gave no clear explanation of what counted as "public" or what the criteria were for determining who was an "adult."
A user on Reddit alleged that Meta had restyled their wedding pictures in a "Studio Ghibli" look, even though they had never requested any such feature. In other words, Meta AI had processed the user's photos without their knowledge.
Overall, Meta's initiative has emerged as a new threat to users' privacy. Until now, photos that users never made public were off-limits to the company's AI models. With this move, Meta is using the user's own consent to cross that barrier and reach into the phone's gallery.
