Something Bizarre Is Happening To ChatGPT Power Users

(by NOOR AL-SIBAI) Researchers have found that ChatGPT “power users,” or those who use it the most and for the longest durations, are becoming dependent upon — or even addicted to — the chatbot.

In a new joint study, researchers with OpenAI and the MIT Media Lab found that this small subset of ChatGPT users engaged in more “problematic use,” defined in the paper as “indicators of addiction… including preoccupation, withdrawal symptoms, loss of control, and mood modification.”

To get there, the MIT and OpenAI team surveyed thousands of ChatGPT users to glean not only how they felt about the chatbot, but also what kinds of “affective cues,” defined in a joint summary of the research as “aspects of interactions that indicate empathy, affection, or support,” they used when chatting with it.

Though the vast majority of people surveyed didn’t engage emotionally with ChatGPT, those who used the chatbot for longer periods of time seemed to start considering it to be a “friend.” The survey participants who chatted with ChatGPT the longest tended to be lonelier and get more stressed out over subtle changes in the model’s behavior, too.

Chat Lackeys

Add it all up, and it’s not good. In this study, as in other cases we’ve seen, people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationships with AI — and where that leads could end up being sad, scary, or somewhere entirely unpredictable.

This new research also highlighted unexpected contradictions based on how ChatGPT was used.

For instance, people tended to use more emotional language with text-based ChatGPT than with Advanced Voice Mode, and “voice modes were associated with better well-being when used briefly,” the summary explained.

And those who used ChatGPT for “personal” reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for “non-personal” reasons, like brainstorming or asking for advice.

Perhaps the biggest takeaway, however, was that prolonged usage seemed to exacerbate problematic use across the board. Whether you’re using ChatGPT text or voice, asking it personal questions, or just brainstorming for work, it seems that the longer you use the chatbot, the more likely you are to become emotionally dependent upon it.

https://futurism.com/the-byte/chatgpt-dependence-addiction

***

Government Hires Controversial AI Company to Spy on “Known Populations”

What happens when you combine Palantir — the not-at-all evil AI company named after the spying orbs from “The Lord of the Rings” — with an immigration agency that’s disappearing US residents to a South American internment camp?

The makings of one hell of a surveillance state, that’s what.

Bombshell reporting by 404 Media just revealed a massive contract between US Immigration and Customs Enforcement (ICE) and Palantir, to the tune of tens of millions of dollars.

Per 404, the contract tasks Palantir with tweaking ICE’s Investigative Case Management system (ICM) to allow for the “complete target analysis of known populations.” It also puts Palantir in charge of ongoing maintenance for the massive database, which contains real-time tracking tools, visa records, and data from other agencies including the FBI, CIA, DEA, and ATF.

The ICM allows agents to sort people by “hundreds of highly specific categories,” according to 404, which caught a glimpse of the database last week. These include physical traits like race, eye color, and tattoos; administrative data like social security numbers, employment addresses, and bankruptcy filings; and immigration details like port of entry and resident status, in addition to “hundreds more” criteria.

It’s part of the Department of Homeland Security’s (DHS’s) $96 million contract with Palantir, a five-year agreement signed back in 2022.

The revelation comes as the DHS under Trump — of which ICE is an investigative arm — wages a brutal campaign of disappearances and deportations against green card holders, nonresidents with work and student visas, and foreign tourists.

Some, like permanent resident Mahmoud Khalil or doctoral visa student Rumeysa Öztürk, have been snatched off the street by plainclothes DHS agents for expressing what Secretary of State Marco Rubio calls “beliefs, statements, or associations” deemed to be “at odds” with US foreign policy interests. Hundreds of others have been grabbed and whisked away to an El Salvadorian internment camp, or to ICE detention facilities in the US — leaving families and lawyers in the dark as to their whereabouts.

The detentions and disappearances are speeding up to a nearly industrial scale, a feat made possible thanks to the participation of big tech companies like Google, Amazon, and Palantir. Last month, the ACLU of New Mexico formally challenged the disappearance of 48 New Mexico residents — whose identities, location, conditions, or legal status were not disclosed by ICE.

It’s a dream come true for ICE Director Todd Lyons, who said at the Border Security Expo last week that he wants ICE to run “like [Amazon] Prime, but with human beings.”

Palantir, meanwhile, is living up to that type of rhetoric, rolling out a bus stop ad campaign declaring that a “moment of reckoning has arrived for the West.”

“We built Palantir to ensure America’s future, not to tinker at the margins,” the ad reads. “On the factory floor, in the operating room, across the battlefield — we build to dominate.”

https://futurism.com/government-ice-palantir
