
An AI App Claiming To Detect STIs From Photos Of Genitals Is A Privacy ‘Disaster’

Calmara, an app from startup HeHealth, is encouraging women to submit photos of their sexual partners’ genitalia, and claims its AI “wizardry” can detect the presence of STIs.

“AI-powered” genital scans could be the next big thing in artificial intelligence.

That’s according to Mei Ling Lu and Yudara Kularathne, the founders of HeHealth, who claim their “patented” AI “wizardry” can analyze images of male genitalia for the presence of common sexually transmitted infections. This month, they launched a platform called Calmara that’s marketed as a sexual wellness tool for women, encouraging them to submit “peen checks,” or photos of their partners’ private parts.

In 60 seconds, Calmara claims its algorithm can identify the presence of more than 10 diseases and infections, such as herpes, syphilis and HPV, with an accuracy of up to 96%. “Our whole intention was really to bring about a screening solution in sexual health where, let's say, the other alternative is people Googling,” the company’s cofounder and CEO Mei Ling Lu told Forbes.

But since its launch, Calmara has been met with a deluge of criticism over consent, data privacy and the possibility that child sexual abuse material could wind up on its servers. The backlash has called the company’s credibility into question and prompted its creators to hastily backpedal aspects of their product.

“I don’t think a solution is taking photos of someone else's genitals and sharing them with an app that's not even a medical provider and isn't subject to the same standards a doctor’s office would be,” said Sara Geoghegan, who serves as counsel at the Electronic Privacy Information Center and focuses on issues related to consumer privacy.

AI in healthcare represents a $23 billion market, and many companies are racing to deploy models for diagnostics, patient intake, hospital management and a host of other uses. Calmara is part of an explosion of AI-driven, consumer-facing products, such as therapy chatbots and health coach apps, that invoke AI as a tidy but potentially problematic solution for a wide array of health concerns. Last year, for example, the National Eating Disorders Association took down its own chatbot after it gave harmful advice, like recommending weight loss and measuring body fat.

“The nature of consent here is impossible.”

Sara Geoghegan, Electronic Privacy Information Center counsel

In 2022, Lu, a former management consultant, and Kularathne, a medical doctor who previously practiced in Singapore, cofounded HeHealth, a San Francisco-headquartered healthcare startup that provides the Calmara platform as a service and offers a similar product for men to self-check for STIs. HeHealth has raised $1.1 million in funding from external investors, its founders said. PitchBook data lists Singapore Management University’s Institute of Innovation & Entrepreneurship, California’s Plug and Play Tech Center and Japanese venture capital firm ARKRAY 4U as investors.

Both Calmara and HeHealth’s app for men were built on the same AI model, the founders told Forbes, and HeHealth additionally claims to connect users in the United States, Singapore and India with sexual health experts at its “vetted partnered clinics.” The company claims its AI is powered by a peer-reviewed “proprietary dataset” that’s been tested by more than 30,000 users worldwide.

This dataset contains several thousand images — both original and synthetic — of five penile diseases, with the original images obtained from physicians in India, Sri Lanka, Singapore, Australia, the U.S. and the United Kingdom, according to a March 2024 preprint, published without peer review, that was authored by Kularathne, two paid HeHealth consultants and a medical executive at the company. It’s unclear why Calmara claims to identify the presence of more than 10 diseases when the AI model it uses was trained on only five. Forbes was also unable to find evidence that the model and its efficacy have been independently reviewed.

The use of synthetic data — in this case, images augmented or artificially generated by image models — to train AI has been proposed by some researchers as a solution for filling gaps in datasets and even bypassing privacy restrictions that might block someone’s access to particular data. Last year, researchers at the Massachusetts Institute of Technology showed that synthetic images could be generated using text-to-image models like Stable Diffusion, but warned that reliance on such libraries could also amplify biases. Kularathne said their images were modified to include a wider range of skin tones.
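For a sense of the mechanics, here is a minimal, purely illustrative sketch of how synthetic training images can be generated with an off-the-shelf text-to-image model and varied across skin tones. It is not HeHealth's pipeline; the model checkpoint, prompts and file names are all assumptions.

```python
# Purely illustrative sketch: generating synthetic training images with an
# off-the-shelf text-to-image model. NOT HeHealth's pipeline; the checkpoint,
# prompts and file names are assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed public checkpoint
    torch_dtype=torch.float16,
).to("cuda")

# Varying the prompt is one way to broaden coverage, e.g. across skin tones,
# which is what Kularathne says the training images were modified to do.
prompts = [
    "clinical photograph of a skin lesion, light skin tone",
    "clinical photograph of a skin lesion, dark skin tone",
]
for i, prompt in enumerate(prompts):
    image = pipe(prompt).images[0]    # a PIL.Image
    image.save(f"synthetic_{i}.png")  # would then be added to the training set
```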

In addition, the HeHealth team notes in the paper that it developed “a custom-built web-scraping tool to download images freely available on the internet,” and also “publicly sourced images from the mobile app interface, which was predominantly used in North America.” Kularathne told Forbes that the mobile app images were from a trial that required participants to consent to the use of their information.
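The paper does not describe the scraping tool in any detail, but a basic version of this kind of image scraper can be quite simple. The sketch below is a generic illustration, not the company's tool; the URL and output directory are placeholder assumptions.

```python
# Generic sketch of a "custom-built web-scraping tool" for images, of the
# kind the paper mentions. NOT the company's tool; the URL and output
# directory are placeholders.
import os
import requests
from urllib.parse import urljoin
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/gallery"  # placeholder
OUT_DIR = "scraped_images"
os.makedirs(OUT_DIR, exist_ok=True)

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for i, tag in enumerate(soup.find_all("img")):
    src = tag.get("src")
    if not src:
        continue
    resp = requests.get(urljoin(PAGE_URL, src), timeout=10)  # resolve relative links
    if resp.ok:
        with open(os.path.join(OUT_DIR, f"img_{i}.jpg"), "wb") as f:
            f.write(resp.content)
```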

But the app itself does not appear to take consent as seriously. Forbes tested Calmara with a nondescript image and was only required to affirm “Yes, I have their consent” on a popup that appeared after taking a photo but before submitting it.

Even if a platform like Calmara deletes images shared by users, it is still collecting this material and thus potentially answerable to certain data privacy laws.

“The nature of consent here is impossible,” Geoghegan said. In Calmara’s case, “the person who the app is for is not the one whose genital images are being shared, so the actual subject of this invasive practice is not the one whose consent is sought after.”

Dr. Jeffrey Klausner, a renowned professor at the University of Southern California’s Keck School of Medicine and a paid medical advisor at HeHealth, defended the company in an interview. When asked about consent concerns, he said that even in medical settings, “consent is always an issue in sexual health” and “[Calmara] should be used with the consent of both parties.”

Then there’s the question of what happens to a photo once a user submits it. When Calmara launched, images were stored on its AWS servers and kept for an amount of time that the founders declined to share. But after public backlash, Lu and Kularathne changed the retention policy. Now, images are automatically deleted from Calmara once they’ve been screened for STIs. (The company’s FAQ still says, “Your info's stashed in a digital stronghold, wrapped in layers of encryption, and handled by rules that would make even the strictest dungeon master proud.”)
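Mechanically, the screen-then-delete retention policy Calmara now describes could be implemented along the lines of the following sketch: fetch the uploaded image from object storage, run the screening model, then remove the stored object immediately. The bucket name, key and run_screening() stand-in are illustrative assumptions, not the company's code.

```python
# Hypothetical sketch of a screen-then-delete retention policy on AWS S3.
# The bucket name, key and run_screening() stand-in are illustrative
# assumptions, not Calmara's actual code.
import boto3

s3 = boto3.client("s3")
BUCKET = "calmara-uploads-example"  # placeholder bucket name

def run_screening(image_bytes: bytes) -> dict:
    # Placeholder: a real system would invoke the trained classifier here.
    return {"status": "screened", "bytes_received": len(image_bytes)}

def screen_and_delete(key: str) -> dict:
    # Fetch the uploaded image from object storage.
    obj = s3.get_object(Bucket=BUCKET, Key=key)
    result = run_screening(obj["Body"].read())
    # Delete the object as soon as screening completes, so nothing is retained.
    s3.delete_object(Bucket=BUCKET, Key=key)
    return result
```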

HeHealth, on the other hand, does store user data, including images, Lu and Kularathne said. Since HeHealth’s privacy policy states some user information is shared with “service providers and partners who assist us in operating the services,” Forbes asked if images from the platform are used to refine its AI model. The founders denied that photos are shared, but said that partnering physicians currently review every image submitted by users, and that their “feedback” is used to retrain the model. (They did not disclose who these physicians are or what their employment relationship to the company is. Separately, Forbes confirmed that in California HeHealth is affiliated with CityHealth, a subscription healthcare service with locations around the Bay Area.)
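In machine learning terms, what the founders describe (physicians reviewing each image, with their corrections fed back into training) is a human-in-the-loop retraining loop. A minimal sketch of that pattern, with every model, shape and label invented for illustration, might look like this:

```python
# Minimal sketch of human-in-the-loop retraining, the pattern the founders
# describe: physicians' corrected labels on reviewed images are used to
# fine-tune the classifier. Every model, shape and label here is invented.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in classifier over five classes, matching the five diseases in the preprint.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 5))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Pretend batch of reviewed images plus the physicians' corrected labels.
images = torch.randn(8, 3, 224, 224)
physician_labels = torch.randint(0, 5, (8,))
loader = DataLoader(TensorDataset(images, physician_labels), batch_size=4)

model.train()
for x, y in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)  # penalize disagreement with the reviewers
    loss.backward()
    optimizer.step()
```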

CityHealth’s spokesperson Kathy Chu said it provides medical consultation to patients who use HeHealth. “The idea is to make it convenient for both physician and patient, as the app will scan the patient’s penile area and the certified provider will be able to review the scan,” before deciding whether to diagnose the issue, Chu told Forbes.

Little of this information is apparent in Calmara’s app. “The website describes ‘ghost mode’ or ‘incognito’ but does not give me enough information to say how privacy-protective that is,” Glenn Cohen, a Harvard Law School professor specializing in medical ethics, told Forbes. “I also don’t know whether any of the data, even stripped of identifiers, is inherently reidentifiable.”

Even if a platform like Calmara deletes images shared by users, it is still collecting this material and thus potentially answerable to certain data privacy laws, said Wendell Bartnick, a partner at Reed Smith who advises healthcare companies on cybersecurity and data privacy. Calmara has claimed its services don’t fall under HIPAA (the Health Insurance Portability and Accountability Act) because the data it collects can’t be used to identify anyone.

“There's no way for a company to control what kind of data it receives, and a photo could very well include someone's face, which would likely be considered personal information that's regulated by some state laws,” Bartnick told Forbes.

Like any platform that handles medical images, Calmara and HeHealth traffic in highly sensitive material. “My workers’ computers are filled with penis pictures,” Kularathne told Forbes. Neither Calmara nor HeHealth has ever found or reported child sexual abuse material, Kularathne said.

Following Calmara’s launch, Carey Lening, a privacy researcher and infosec lead at Castlebridge, published a scathing deep-dive review of the company’s privacy practices, calling them a “sheer unmitigated disaster.” In a reaction post on LinkedIn, Lu called the feedback Calmara has received since launch “super two-sided” and added: “So to the guys and other critics — trust me, embracing health innovation might just be the coolest thing you do.” Lu’s post has since been edited to remove this language.

“Who has reviewed the algorithm outside of the company and what assurances for reliability are they able to offer?”

Glenn Cohen, Harvard Law School professor

Lu and Kularathne have been quietly updating their products in the aftermath, including changing Calmara’s data retention policy post-launch. After Forbes contacted Lu for an interview, HeHealth also removed a list of “ongoing elite academic partnerships” from its website, which included the University of Southern California, Harvard’s T.H. Chan School of Public Health and the University of North Carolina at Chapel Hill. All three schools told Forbes they did not have a partnership with HeHealth, Calmara or Kularathne.

Kularathne insisted to Forbes that he, in his capacity as HeHealth’s CEO, does indeed have partnerships with them. When Forbes asked to see his agreements with the universities, Kularathne said he was unable to provide them, citing confidentiality. “Sadly, I cannot [share] the agreements, which I have in my hand.”

When it comes to transparency, some experts say Calmara and HeHealth leave much to be desired. Harvard Law School’s Cohen said patients have a right to know what is done with the images they submit, how the AI is trained and how its accuracy differs across demographics. “Who has reviewed the algorithm outside of the company and what assurances for reliability are they able to offer? What is the company’s plan for instances where there is an error? What liability, if any, will the company face in such cases and have they attempted to disclaim that liability?”

In the meantime, Calmara’s founders appear to be enjoying the attention. The platform was recently mocked on The Tonight Show, which cited an article from TechCrunch. “Thrilled to see Calmara and HeHealth getting a shoutout on Friday,” Kularathne wrote on LinkedIn. The founders told Forbes they would be disclosing their product updates in the coming days, but have yet to do so. Lu claimed they were “in talks” with a dating site about a potential partnership, but declined to share more. Kularathne said he’s undeterred from his goal of saving “a billion people’s lives” with his technology. “I look at some of the people who have written so many articles about me,” he said. “My technologies will be better than them.”
