
Why this cute AI trend isn’t worth your child’s privacy

You may have seen the viral doll trend doing the rounds, where you can use AI to create a doll version of yourself, complete with accessories that represent your interests, and then share it on social media.

There was also the Ghibli-style trend, where you could create personalised images in the style of the animation house, and before that, the AI yearbook trend. You can even upload photos of your kids to see what they might look like when they’re older.

Reports suggest ChatGPT, owned by OpenAI, saw record user numbers this year following the launch of its image generator, prompting the tech company’s boss, Sam Altman, to ask people to “please relax” as its graphics processing units struggled to keep up with demand.

But experts have warned against uploading photos of children to the tool.

In a recent reel, Dr Madhumitha Ezhil – who runs a screen-free parenting Instagram account – begins by acknowledging that uploading a child’s photo to an AI tool can seem “harmless”.

But she adds that when we do, we are handing AI companies our kids’ faces to store and learn from.

She added: “They may now be able to accurately predict what a child will look like as they grow up, which is not only impressive but also very dangerous. Their faces could be used to train facial recognition systems, to build realistic deepfakes, and even be sold to unknown third parties.”

“We are the first generation raising children in the AI era – I personally think it’s better to be cautious, because once we upload their data, we can never get it back.”

HuffPost UK contacted OpenAI about Dr Ezhil’s concerns, and the company declined to comment.

Is this actually happening?

ChatGPT users can in fact control how their data is used, despite the suggestion in the title of Dr Ezhil’s video. There are self-service tools for people to access, export or delete their personal information, and you can opt out of having your content used to improve and train AI models.

HuffPost UK also understands that the platform does not actively seek out personal information to train its models, and that publicly available information from the internet is not used to build profiles of people or to sell their data. The same can’t necessarily be said for other AI models, however.

ChatGPT also doesn’t allow the editing of images of children, but people can still upload their pictures to the tool (I was able to upload a stock photo of a baby and ask the tool to generate an image of what they might look like at 25, which it did within minutes).

Dr Francis Rees, a legal lecturer who leads the Children’s Influencer Project, told HuffPost UK: “If you’re considering putting something on ChatGPT, you’re giving it a lot of information – you could be giving it facial recognition data and the identity of the child, but also their context, such as the crest on a school uniform, pets, bedrooms, house numbers, the type of house, and any metadata from the GPS or the phone itself.

“Parents may not understand that that’s what’s happening when they effectively feed it into the machine.”

She added: “Even if the child is old enough to agree, it doesn’t really matter, because the child won’t be aware of the range of risks, both now and in the future.”

“The AI will store that data, and use it to train the machine and other systems to improve themselves,” Dr Rees said.
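Dr Rees’s point about metadata is worth dwelling on: phone photos often carry GPS coordinates and device details in their EXIF data, which travel with the file wherever it is uploaded. The short sketch below is not from the article; it is a minimal, hypothetical example (assuming the Python Pillow library and a placeholder file name) of how a technically inclined parent could check what a photo reveals, and strip that data before sharing it anywhere.

```python
# A minimal sketch (not from the article): inspecting and stripping photo metadata
# before an image is shared or uploaded. Assumes the Pillow library is installed
# (pip install Pillow) and a hypothetical local file called "kid_photo.jpg".
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("kid_photo.jpg")
exif = img.getexif()

# List whatever metadata the file carries (camera model, timestamps, software, ...).
for tag_id, value in exif.items():
    print(TAGS.get(tag_id, tag_id), ":", value)

# 0x8825 is the standard pointer to the GPS block; an empty result means no GPS data.
if exif.get_ifd(0x8825):
    print("This photo contains GPS location data.")

# Strip all metadata by copying only the pixel data into a fresh image.
clean = Image.new(img.mode, img.size)
clean.putdata(list(img.getdata()))
clean.save("kid_photo_clean.jpg")
```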

The case against “sharenting”

Sharenting is where parents share photos, videos and details of their children’s lives online – usually on social media, where they may have many followers they don’t even know.

Children’s images shared online can be used to create explicit deepfakes – fake audio, images and videos that are generated or manipulated using AI, but look and sound like real content.

According to Internet Matters, 13% of teenagers have had an experience with deepfakes – and those are just the ones we know about. Many parents simply won’t know whether their children’s content has been taken and used maliciously.

Images (whether real or fake) can then be used to intimidate or extort victims. There is also the risk of identity fraud further down the line.

A year ago, German telecoms company Telekom shared a powerful campaign featuring an AI deepfake of a girl named Ella, which showed the consequences of sharing photos of children on the internet.

In the clip, Ella (an AI simulation of the young girl’s future self) sits down and explains to her parents how her identity could one day be stolen or used for crimes that could even land her in jail. We also see how innocent photos of her as a kid on the beach end up being shared in sinister ways on the dark web.

At the time, the campaign was praised by viewers, who called it a “wake-up call” and said “everyone needs to see this”.

So, should we stop sharing photos of our children online altogether?

Ultimately, parents are the guardians of their children’s privacy – so the choice is yours. But it’s wise to be aware of the risks.

“I think there’s just no informed consent,” Dr Rees said. “Because the kids, even if they say they’re okay with it, won’t understand the consequences of it.

“So parents, as privacy guardians, have to be aware of this. Think about it: why are you posting? What do you get from it? What harm could it cause?

“Who needs photos of my child? Who needs to see my child’s bedroom? Who needs to know about my child’s pet? Why am I sharing this on a public platform, or feeding it to a machine? I think that’s an important consideration, and parents need to interrogate their own behaviour and ask themselves these questions.”

I’ll be taking that on board.
