A 60-year-old man developed bromism after following ChatGPT's dietary advice and replacing table salt with sodium bromide
As AI tools continue to evolve, many people rely on them not only to streamline their daily tasks but also for deeply personal matters. We have already heard of strange cases, such as people using chatbots to hunt for evidence of infidelity, but a far more disturbing one has now emerged about the risks of trusting AI as a healthcare assistant. A 60-year-old man reportedly developed a rare and dangerous condition after following dietary advice given by the model.
The hidden dangers of AI health advice: bromism and a salt substitute that went too far
We are often warned not to rely too heavily on any AI model, and a recent case report in an American medical journal adds weight to that warning: a man became seriously ill after following ChatGPT's guidance. The 60-year-old developed bromism, also known as bromide poisoning, after acting on the assistant's instructions.
For three months he replaced ordinary table salt (sodium chloride) with sodium bromide. He eventually ended up in the hospital, convinced that a neighbor was trying to poison him, where doctors diagnosed bromism, a condition that has become extremely rare today.
The patient's story also raises the alarm about how cautiously AI should be used, especially in healthcare contexts. Researchers even tested whether ChatGPT would produce such answers and found that it did indeed suggest bromide as an alternative to chloride, without any toxicity warning, which makes the case all the more concerning. Chatbot answers carry no accountability and should therefore never be taken as expert medical advice.
At the same time, AI models should ship with stronger safeguards, especially when sensitive topics such as health are on the line.