When his grandmother died about two years ago, Jebal King's family put him in charge of writing her obituary. But King had never written one before and didn't know where to start. Grief didn't help. "I didn't have a way to do this," said the 31-year-old from Los Angeles.
Around the same time, he had been experimenting with OpenAI's ChatGPT and other artificial intelligence chatbots, tinkering with the technology to create grocery lists and budgeting tools. What if it could help with the obituary? King fed ChatGPT a few details about his grandmother (she loved bowling and had many grandchildren) and asked it to write an obituary.
The result was one of the most personal pieces of writing of his life. King fine-tuned the language, added details, and revised the obituary with his mother's help. Ultimately, King felt ChatGPT had helped him memorialize his grandmother in language that properly expressed his emotions. "It was a beautiful obituary, and it explained her life," says King, who works in video production for a luxury handbag company. "It wasn't a problem that it came from ChatGPT."
Generative AI has drastically changed how people communicate, and how they perceive communication. Early uses were relatively benign: predictive text in iMessage and Gmail offered word-by-word or phrase-by-phrase suggestions. But ChatGPT's public release in late 2022 signaled how far the technology had advanced, and its applications exploded. As the number of chatbots available to experiment with grew, users found AI helpful for punching up email replies, recommendation letters, and dating-app messages. There was a backlash, though: if writing came across as insincere or stilted, recipients were quick to accuse the author of using AI.
Now AI chatbot content creep is getting more personal, with people using the tools to craft wedding vows, eulogies, breakup texts, thank-you notes, and, yes, obituaries. As people apply AI to these most sincere forms of communication, they run the risk of coming across as violating or deeply dishonest if discovered. Still, users say, AI isn't meant to manufacture emotion; it provides a template onto which they can map their feelings.
Anyone who has been asked to give a toast or console a friend can attest that crafting the perfect message is famously difficult, especially for the first time. These communications are intensely personal and meant to evoke a specific reaction, so there's pressure to nail the tone. There's a fine line between an effective note of support and one that leaves the recipient feeling worse.
AI tools are therefore particularly attractive to nervous writers hoping to avoid a social blunder, people who know how they feel but can't fully express it. The tools offer a gut check. "If you want to write an apology letter for some transgression, you can write the apology letter, give it to ChatGPT or Claude, and say, I'm looking for a warm and caring tone here; is this right?" says David Markowitz, an associate professor at Michigan State University. "And it might say, you wrote this well, but that's a bit cold; if I were you, maybe change some words here. It will just improve things."
Of course, generative AI platforms don't have emotions and haven't experienced them; instead, they have learned about them by scraping vast amounts of literature, psychological research, and other personal writing, says Mor Naaman, a professor at Cornell Tech. "The process is similar to learning about a culture without experiencing it," he says. "It's not a direct experience; it's through observing patterns of behavior." The technology doesn't understand emotions, then, but it can compare what you've written to what it has learned about how people typically express them.
Katie Hoffman, a 34-year-old who works in marketing in Philadelphia, has turned to ChatGPT for advice many times, especially when broaching a delicate conversation. In one instance, she used it to draft a text telling a friend she wouldn't be attending her wedding. Another time, Hoffman and her sister prompted the chatbot for a diplomatic response to a friend who had backed out of Hoffman's bachelorette party at the last minute. "It was like, how do you say this without sounding like a jerk?" Hoffman says. "It gives us the message, and we tweak it from there."
Hoffman, who tends to overshare and over-explain, found ChatGPT's scripts more objective and to the point than the detail-stuffed texts she would have written herself. She workshopped and personalized the messages before sending them, and her friends were none the wiser.
Ironically, the more a writer has to run and edit a chatbot's drafts, the more ownership they take of the message. If you haven't altered the output much, you're less likely to feel like you really wrote it. "It may have an effect on yourself. You feel fake. You feel you have been deceiving," Naaman says.
But that hasn't stopped people from trying chatbots for sentimental communication. After a bout of writer's block, 26-year-old Jeanna Torres used ChatGPT to outsource writing the thank-you notes for her graduation party. "I know what to say, but I'm having a hard time thinking about it and writing it down," says the Philadelphia occupational therapist. "I don't want to sound stupid. I don't want to sound ungrateful." She prompted the bot to generate a sincere message commemorating her milestone. On the first attempt, ChatGPT produced a long letter, so she asked for a shorter version she could neatly write on each card.
"People say, ChatGPT has no emotions," says Torres. "That's true, but I have the feelings; it just helps me figure out how to write the message."
Torres's friends and family initially had no inkling that she'd had help writing the notes. That is, until her cousin saw Torres's TikTok post about the workaround. Her cousin was surprised. Torres told her that getting help didn't negate how she felt; she just needed a little nudge.
You might believe in your own ability to spot AI-crafted language, but the average person is pretty bad at discerning whether a message was written by a chatbot. Feeding ChatGPT enough personal information helps it generate convincing text. So does editing the text to include statements using the words "I," "me," "myself," or "my." According to Markowitz, these words are among the biggest markers of linguistic honesty. "They help to indicate a sense of psychological closeness that people feel toward what they're talking about."
However, if recipients suspect the writer has outsourced their sincerity to AI, they don't take it well. "As soon as I suspect that some content is written by AI," Naaman says, "I give [the writer] lower credibility. I don't think the communication is very successful." This was plainly visible in the backlash to Google's ad for its AI platform Gemini during last summer's Olympics: audiences balked at a dad relying on AI to help his daughter write a fan letter to an Olympic athlete. As the technology continues to proliferate, audiences are growing more and more skeptical of content that seems off or manufactured.
The negative reaction to outsourcing writing that people feel is inherently emotional may stem from an overall skepticism of the technology, says Malte Jung, an associate professor at Cornell University who has studied AI's effect on communication. "People still hold more negative perceptions about technology and AI, and that may transfer to the people who use it," he says. (According to a 2023 Pew Research Center survey, more than half of Americans are more concerned than excited about AI.)
Jung says people may see AI-generated communication as "not real, or lacking honesty." If you didn't labor over the words to clarify exactly how you feel, are the sentiments real? Will you even remember how it all felt?
The response was overwhelmingly negative when King, who wrote his grandmother's obituary using ChatGPT, relayed on X how he'd used the AI. "I couldn't believe it," he says. The blowback prompted him to come clean to his mother. "It really made me think for a bit," says King. "I never thought it was a bad thing, but so many people tried to spin it as crazy and evil."
When discussing the ethics of AI communication, intent matters, to an extent. Who hasn't racked their brain for the perfect combination of language and emotion? The desire to come across as warm and genuine may be enough to create an effective message. "The important piece is the effort of people, the sincerity of what they want to write," Jung says. "That may be independent of how it is perceived. You may have used ChatGPT sincerely, but people might still dismiss you regardless."
Because generative AI is so ubiquitous, some people may not care at all.
Chris Harihar, a 39-year-old who works in public relations in New York City, had a specific childhood anecdote he wanted to include in a speech at his sister's wedding but couldn't figure out how to work it in. So he turned to AI for help. He uploaded the speech in its current form, described the story he was aiming to incorporate, and asked the chatbot to connect that story to the theme of lifelong partnership. "It was able to give me these threads that I hadn't thought of before that made complete sense," Harihar said.
Harihar is an early adopter of AI and frequently uses platforms like Claude and ChatGPT in his personal and professional life, so when he told his family he'd used AI to finish the speech, they weren't surprised.
Harihar also uses AI tools to answer the highly particular questions characteristic of his 4-year-old daughter. Recently, she was wondering why people have different skin tones, and he prompted ChatGPT to provide a kid-friendly explanation. The bot offered a diplomatic and age-appropriate breakdown of melanin. Harihar was impressed; he probably wouldn't have thought to break it down that way himself, he says. He doesn't feel like he's losing parenting moments by outsourcing help; he regards the technology as just another resource.
"From a parenting perspective, sometimes you're just trying to survive the day," he says. "It helps to be able to use one of these tools to explain, in another way, something you might otherwise struggle to put into words."