Regulators strive to keep pace with the speed and complexity of AI therapy apps

In the absence of strong federal regulation, some states have begun regulating AI “therapy” apps as a growing number of people turn to artificial intelligence for mental health advice.

But the laws, all passed this year, do not fully address the fast-changing landscape of AI software development. App developers, policymakers and mental health advocates say the resulting patchwork of state laws is not enough to protect users or hold the makers of harmful technology accountable.

“The reality is millions of people are using these tools and they’re not going back,” said Karin Andrea Stephan, CEO and co-founder of the mental health chatbot app Earkick.

___

Editor’s note: This story includes discussion of suicide. If you or someone you know needs help, the national Suicide and Crisis Lifeline in the U.S. is available by calling or texting 988. There is also an online chat at 988lifeline.org.

___

State laws take different approaches. Illinois and Nevada have banned the use of AI to treat mental health. Utah placed certain limits on therapy chatbots, including requiring them to protect users’ health information and to clearly disclose that the chatbot is not human. Pennsylvania, New Jersey and California are also considering ways to regulate AI therapy.

The impact on users varies. Some apps have blocked access in states with bans. Others say they are making no changes while they wait for more legal clarity.

And many of the laws do not cover generic chatbots like ChatGPT, which are not explicitly marketed for therapy but are used for it by an untold number of people. Those bots have drawn lawsuits in horrific instances where users lost their grip on reality or took their own lives after interacting with them.

Vaile Wright, who oversees health care innovation at the American Psychological Association, agreed the apps could fill a need, noting a nationwide shortage of mental health providers, the high cost of care and uneven access for patients.

Mental health chatbots that are rooted in science, created with expert input and monitored by humans could change the landscape, Wright said.

“This could be something that helps people before they get to crisis,” she said. “That’s not what’s on the commercial market currently.”

That is why federal regulation and oversight are needed, she said.

Earlier this month, the Federal Trade Commission opened inquiries into AI chatbot companies, including the parent companies of Instagram and Facebook and the maker of ChatGPT, and the Food and Drug Administration is convening an advisory committee Nov. 6 to review generative AI-enabled mental health devices.

Federal agencies could consider limits on how chatbots are marketed, require disclosure to users that they are not medical providers, require companies to track and report suicidal thoughts, and offer legal protections for people who report bad practices by companies, Wright said.

Not all apps have blocked access

From “companion apps” to “AI therapists” to “mental wellness” apps, AI’s use in mental health care is varied and hard to define, let alone write laws around.

That has led to different regulatory approaches. Some states, for example, take aim at companion apps that are designed purely for friendship but do not wade into mental health care. The laws in Illinois and Nevada ban products that claim to provide mental health treatment outright, threatening fines of up to $10,000 in Illinois and $15,000 in Nevada.

But even a single app can be difficult to categorize.

Earkick’s Stephan said there is still a lot that is “very muddy” about Illinois’ law, for example, and the company has not blocked access there.

Stephan and her team initially held off on calling their chatbot, which looks like a cartoon panda, a therapist. But when users began using the word in reviews, they embraced the terminology so the app would show up in searches.

Last week, they backed away from therapy and medical terms again. Earkick’s website had described its chatbot as “your AI counselor, equipped to support your mental health journey,” but it is now a “self-care chatbot.”

Still, “we are not diagnosing,” Stephan maintained.

Users can set up a “panic button” to call a trusted loved one if they are in crisis, and the chatbot will nudge users to seek out a therapist if their mental health worsens. But it was never designed as a suicide prevention app, Stephan said, and police would not be called if someone told the bot about thoughts of self-harm.

Stephan said she was happy that people are looking at AI with a critical eye, but worried about states’ ability to keep up with how quickly the technology changes.

“The speed at which everything is evolving is massive,” she said.

Other apps blocked access immediately. When Illinois users download the AI therapy app Ash, a message urges them to email their legislators, arguing that “misguided legislation” has banned apps like Ash “while leaving unregulated chatbots it intended to regulate free to cause harm.”

A spokesperson for Ash did not respond to multiple requests for an interview.

Mario Treto Jr., secretary of the Illinois Department of Financial and Professional Regulation, said the goal of the state’s law was to ensure that licensed therapists are the only ones providing therapy.

“Therapy is more than just an exchange of words,” Treto said. “It requires empathy, it requires clinical judgment, it requires ethical responsibility. Those are things AI truly cannot replicate right now.”

One chatbot company tries to replicate therapy

In March, a Dartmouth University team published the first known randomized clinical trial of a generative AI chatbot for mental health treatment.

The goal was for the chatbot, called Therabot, to treat people diagnosed with anxiety, depression or an eating disorder. It was trained on vignettes and transcripts written by the team to illustrate evidence-based responses.

The study found that users rated Therabot similarly to a therapist and had meaningfully lower symptoms after eight weeks compared with people who did not use it. Every interaction was monitored by a human who intervened if the chatbot’s response was harmful or not evidence-based.

Nicholas Jacobson, a clinical psychologist whose lab is leading the research, said the results showed early promise, but that larger studies are needed to demonstrate whether Therabot works for large numbers of people.

“The space is so incredibly new that I think the field needs to proceed with much greater caution than is happening right now,” he said.

Many AI apps are optimized for engagement and built to affirm everything users say, rather than challenging their thoughts the way therapists do. Many walk the line between companionship and therapy, blurring intimacy boundaries that therapists ethically would not cross.

Therabot’s team sought to avoid those problems.

The app is still in testing and not widely available. But Jacobson worries about what strict bans will mean for developers taking a careful approach. He noted that Illinois offers no clear pathway for providing evidence that an app is safe and effective.

“They want to protect folks, but the traditional system right now is really failing folks,” he said. “So trying to stick with the status quo is really not the thing to do.”

Regulators and advocates of the laws say they are open to adjustments. But today’s chatbots are not a solution to the mental health provider shortage, said Kyle Hillman, who lobbied for the bills in Illinois and Nevada through his affiliation with the National Association of Social Workers.

“Not everybody who’s feeling sad needs a therapist,” he said. But for people with real mental health issues or suicidal thoughts, “telling them, ‘I know there’s a workforce shortage but here’s a bot,’ that is such a privileged position.”

___

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute. The AP is solely responsible for all content.
