With the rise of hyper-realistic AI-generated visual and audio deepfakes on the internet, prominent public figures and celebrities have raised concerns about their likenesses being used without consent to produce content. Loti AI, a deepfake detection firm, entered the scene in 2022 to help protect public figures against AI-generated content and has now expanded its services.
On Wednesday, Loti AI announced that its "human-first" likeness protection technology will be available to all users. Previously offered only to public figures and celebrities, the company will now provide tools to anyone interested in protecting their digital reputation. Deepfakes are AI-generated videos, audio clips, or images in which the person or action depicted is not real, making it difficult to distinguish authentic content from fake.
Loti AI's platform scans the internet daily for deepfakes, adult content, impersonations, unauthorized content, and even authentic images being misused. The firm's likeness protection technology uses a custom-built filtering system to safeguard users' digital identities, removing fake content in what the company describes as "four simple steps."
According to the company, users who opted into its auto-takedown functionality saw a 95% takedown rate within 17 hours. Loti AI is rolling out free and paid membership options. To get started, potential users must sign up for the waitlist at lotiai.com/sign-up or download the Loti AI app on iOS or Android.
"The internet is getting out of hand, and people's digital reputations are at risk like never before," said Loti AI CEO Luke Arrigoni. "From deepfakes to unauthorized illicit content, these threats are no longer limited to celebrities. Whether you're an everyday person or a high-profile individual, you should be able to protect your image and personal data online."
He continued: "Our goal is simple: to help you reach zero images of you online that you haven't approved."