ChatGPT – everything you always wanted to know

Chances are, you’ve heard the name ChatGPT a million times in recent weeks. 

With so much information scattered around the web, bold claims, and endless social media posts, it might be hard to grasp the full essence of what ChatGPT can do for us. 

To some, it seems like another AI tool (just a bit more hyped up); to others, it signifies the end of the world as we know it.

We decided to dig deeper into what society really thinks about generative AI and the buzz around it.

The areas this research tackles include:

  • Tasks people would allow ChatGPT to do for them
  • Using ChatGPT for work, school, therapy, and more
  • How ChatGPT might control the population
  • Cool ChatGPT use cases
  • The limitations of the tool

Let’s dig in.

First, let’s cover the basics to understand what all the fuss is about.

ChatGPT is an artificial intelligence model that can generate text in a conversational way. It adopts a dialogue format that makes it possible to chat with the tool in natural language. ChatGPT can answer follow-up questions, admit its mistakes, reject requests it deems inappropriate or unethical, and solve many complex problems.
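
To make this “dialogue format” a bit more concrete, here is a rough sketch of what a conversation with the model looks like through OpenAI’s API. Treat it as an illustration only: it assumes the pre-1.0 openai Python package, the gpt-3.5-turbo chat model, and an API key stored in the OPENAI_API_KEY environment variable.

  # Minimal sketch of a two-turn ChatGPT-style conversation (pre-1.0 openai SDK assumed)
  import os
  import openai

  openai.api_key = os.environ["OPENAI_API_KEY"]

  # The conversation is just a list of role/content messages.
  messages = [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Explain what a neural network is in one sentence."},
  ]

  response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
  answer = response["choices"][0]["message"]["content"]
  print(answer)

  # Follow-up question: append the model's reply and a new user turn, then call again.
  messages.append({"role": "assistant", "content": answer})
  messages.append({"role": "user", "content": "Now explain it like I'm five."})
  followup = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
  print(followup["choices"][0]["message"]["content"])

The thing to notice is that the “chat” is simply a list of role/content messages; sending the running history back with every request is what lets the tool handle follow-up questions and keep track of context.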

Sounds like something straight out of a sci-fi movie, doesn’t it?

And here’s what people think about it.

ChatGPT – main findings

After sending out a survey to internet users and collecting 945 responses, exploring notable examples of ChatGPT’s capabilities, and experimenting with the tool ourselves, it’s safe to say that some things managed to surprise us.

For instance, a whopping 70% of respondents believe that ChatGPT will eventually take over from Google as their primary search engine. If that happens, the internet as we know it might soon look very different.

Here’s what else we discovered:

  • Almost 40% of people are afraid that ChatGPT will destroy the job market
  • As many as 60% of respondents would allow ChatGPT to offer medical advice and give health-related consultations
  • Only 9% of the respondents would not use ChatGPT for academic purposes, while more than 57% would let ChatGPT write their thesis for them
  • More than 86% believe that ChatGPT could be used to manipulate and control the population
  • As many as 63% of respondents state that businesses should be able to advertise their products on ChatGPT as they do on Google Search
  • Almost 13% would engage in flirting or dirty talk with ChatGPT
  • As many as 22% would use ChatGPT to help them pretend to be someone else. And 17% would allow ChatGPT to write a wedding speech, while 14% would have it write a breakup text to their partner
  • Most participants (66%) mistook a poem written by Sylvia Plath for an AI-generated one

Before we go deeper into the stats and real examples of what ChatGPT can do, let’s take a closer look at the background of ChatGPT and OpenAI. 

Below is everything you need to know about the tool and its creators. 

ChatGPT – a blessing or a curse?

With all the rumors about ChatGPT taking over humanity circulating around the web, OpenAI might seem like some sort of ‘Evil, Inc.’ The good news is this is far from the truth.

OpenAI is a leading AI research company focused on building safe, ethical, and beneficial AI systems that can help humanity. The company started as a non-profit, then transitioned to a capped-profit model and attracted more investment. It gained traction in 2020 after releasing its GPT-3 text model.

Here’s what a researcher at OpenAI says about their vision:

Safely aligning powerful AI systems is one of the most important unsolved problems for our mission. Techniques like learning from human feedback are helping us get closer, and we are actively researching new techniques to help us fill the gaps.

Apart from ChatGPT, OpenAI has introduced other AI-powered models and tools. Among them are:

  • Image generation models, including DALL-E, a neural network that creates images and art from natural-language text prompts (see the quick sketch after this list)
  • AI for audio processing and generation, including Whisper, which specializes in English speech recognition, and Jukebox, which generates music as raw audio
  • GPT-3 text models, which come together in ChatGPT to generate, summarize, and paraphrase text written in natural language while learning from the data they are fed
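
For the curious, here is a rough sketch of how two of these models can be reached programmatically. It assumes the same pre-1.0 openai Python package as in the earlier example, an API key already configured via OPENAI_API_KEY, and a hypothetical local audio file interview.mp3; Jukebox is a research release and isn’t exposed through this API.

  # Rough sketch: calling DALL-E and Whisper through OpenAI's API (pre-1.0 openai SDK assumed)
  import openai

  # DALL-E: generate an image from a natural-language prompt
  image = openai.Image.create(
      prompt="a watercolor painting of a robot reading a book",
      n=1,
      size="512x512",
  )
  print(image["data"][0]["url"])  # URL of the generated image

  # Whisper: transcribe speech from an audio file to text
  with open("interview.mp3", "rb") as audio_file:  # hypothetical file name
      transcript = openai.Audio.transcribe("whisper-1", audio_file)
  print(transcript["text"])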

Read more: Did you know that AI can be biased? Check out this study on AI biases with examples from DALL-E 2, among other tools.

OpenAI has had its share of controversy. Some have criticized it for going for-profit, arguing that the move is inconsistent with OpenAI’s vision of democratizing AI and that “relying on venture capitalists doesn’t go together with helping humanity” (Vice).

Still, OpenAI is on a roll. They push out new tools and seem to take action when something appears to be off. For instance, after DALL-E 2 was accused of bias in multiple media outlets, OpenAI made a statement and improved the tool so that it shows more diverse groups of people, no matter the prompt.

ChatGPT seems to be the most viral (and controversial) tool OpenAI has created. It has generated so much buzz in the media that even people far from the tech world were tempted to try it.

In fact, it was practically impossible to access the tool in its early days; so many people were willing to test it that it was at full capacity most of the time. ChatGPT also showed its fastest growth yet in January 2023, when it reached 100 million monthly active users. It took the world by storm, to say the least.

So—

Are people excited about the seemingly endless possibilities they get with ChatGPT?

The reality is actually not so shiny. Let’s dig deeper into how society perceives ChatGPT.

About 78% of internet users have personally tried ChatGPT

This number is, quite frankly, mind-blowing. As of January 2023, there were 5.16 billion internet users globally, so it’s easy to imagine how much of a fuss ChatGPT has made online.

The reason is simple: “Wow, ChatGPT can do practically anything” is not just an empty phrase.

People use it to ask questions (the kind they would have typed into Google just a year ago), create short- and long-form content, look for recommendations on where to eat and what sneakers to buy, write code, hunt for jobs, generate business ideas, and more. Some have even asked ChatGPT for psychological or medical advice.

As the tool keeps learning and developing, we can expect the number of users to grow even higher. 

What’s more important is that people feel generally satisfied with what ChatGPT generates for them. Most of our respondents rated the quality of the content the tool created for them as fairly good. In addition, 59% state that they trust ChatGPT to provide reliable and truthful information.

Read more: Want to know more examples of cool AI chatbots? Check out this list with the best ones for business and personal use.

Almost 63% of people believe ChatGPT will eventually make Google obsolete 

The news about ChatGPT going so viral might have made Google flinch nervously. 

First, people started typing their search requests directly into ChatGPT. Then, Microsoft made the bold move of launching the new Bing with ChatGPT integrated into it.

“Why would Bing ever be anywhere near Google?” you might be wondering. Well, together with this long-rumored integration, Microsoft also announced a new version of its Edge browser. ChatGPT-powered search is built into the sidebar, which might actually allow Microsoft to grow its relatively small market share and compete with Google for once.

For now, the search experience is still limited, and there is a waitlist to get the full version. However, it already does a pretty good job, even surpassing ChatGPT itself in some aspects. 

For instance, the original tool has no knowledge of the world past 2021, and so much has happened globally since then. But not to worry: Bing handles questions about recent events, as it seems to be more up-to-date.

Is Google doomed? Not really. But should it put in extra effort to compete with AI-powered solutions similar to the Bing/ChatGPT integration? Definitely yes. 

Saying that search, as we know it, is over might be a bit of an overstatement. However, we can’t deny that a big shift is happening. In fact, about 16% of internet users would use both Google and ChatGPT depending on what they are searching for.

A good rule of thumb is to always double-check the information you consume, whether it comes from Google, ChatGPT, or any other source. After all, none of them is safe from fake news and misinformation.

Read more: Learn how to spot misinformation and check out some essential fake news statistics

Around 50% of internet users are scared ChatGPT might replace them at work

The fear of AI taking over jobs is ever-present. However, before the emergence of image- and text-generating AI, mostly low-qualification jobs were considered at risk. White-collar workers could rest assured that no robot was coming for jobs that required distinctly human creativity.

Then, ChatGPT came around.

These statistics are quite worrying for highly qualified specialists. For the first time in many years, large numbers of people are convinced that the work of programmers and data analysts can be done solely by artificial intelligence.

When you ask the tool itself whether it will replace any jobs, you get quite an optimistic response. ChatGPT says it won’t leave humans jobless. On the contrary, it assures us that it will help increase human productivity and create new opportunities:

Many people share the same view. In fact, 1 in 5 people believes that we will work alongside ChatGPT and use it as a helper at work. We don’t know what the future has in store or what AI developments are coming, but those big headlines about ChatGPT destroying the job market do feel like clickbait, at least for the time being.

A whopping 60% of people would like ChatGPT to give medical advice, while 57% would use it as a free therapist

Do you also feel like we are in a Black Mirror episode?

At this point, it might as well be true. 

Currently, ChatGPT doesn’t give medical advice. Here is the proof:

Fair enough. In the end, it’s an AI-powered chatbot, not a doctor. However, many people seem to disagree with this approach. As many as 60% of our respondents believe ChatGPT should give medical consultations, while 20% think it should only offer basic health tips and refer users to a doctor for more serious matters.

Such strong interest in accessible, free-of-charge medical assistance is likely connected to the extremely high healthcare costs in some countries, such as the US. Allowing ChatGPT to offer medical consultations could let thousands of Americans get help without the usual financial burden.

However, no AI can replace a trained and experienced medical professional. Letting ChatGPT give medical advice could be dangerous and have irreversible consequences. And even if AI had the capacity to offer safe and professional help, the healthcare industry would hardly allow it.

But what about mental health?

Right, this one is a bit trickier. Again, the cost of getting therapy from a licensed specialist is skyrocketing, which is likely why 60% of our respondents would happily use the tool for free therapy. Almost 27% also mentioned that while it won’t replace a real psychologist for them, they might use ChatGPT for some mental health-related issues.

Here’s what a Reddit user says about using ChatGPT for mental health:

Personally, I didn’t really feel like my problems were bad enough to go to therapy, and some of my issues are quite personal—I couldn’t really say it out loud to another person. That being said, even just writing out my problems is helpful, and ChatGPT’s responses are pretty nicely organized. It might even be giving me the courage to bring some of these other problems up to close friends/family that have helped with other things. I hit it with a very specific and personal issue around career, culture and family issues, and it spat out a nice list of things I could try.

Reddit user

ChatGPT tends to list generic suggestions that can be helpful for mental health, and it’s reassuring that so many people have good experiences when getting mental health advice from AI.

It’s unlikely that AI will replace human therapists and psychologists. While it can give you generic suggestions about what to do (and those can actually help), it will never empathize with you, and it certainly cannot treat a mental disorder. Still, as long as it gives healthy advice and helps even one person out there, it’s an amazing technology.

Read more: Check out this study on the effects of technology on mental health

As many as 67% of respondents believe ChatGPT could help people with certain mental impairments study more efficiently

While ChatGPT cannot treat mental illnesses, it can definitely make the lives of people with mental impairments much easier. Neurodivergent individuals could benefit from it in many ways: it can explain complicated terms in simple words, paraphrase texts, and help overcome many of the barriers such people experience.

Individuals with learning disabilities and learning barriers can also benefit immensely from ChatGPT. It can potentially help them improve their writing and presentation skills, opening up more opportunities in education and the job market.

ChatGPT also has real power to take assistive tech to another level. It can provide a safe, non-judgmental way to practice social communication, help reduce anxiety, improve organization and time-management skills, support daily tasks, and more.

All in all, ChatGPT can become a game-changer for many individuals with disabilities as it can change the way they consume information, learn, and explore the world. It can increase accessibility, reduce isolation, empower millions of people, and drive further technological advancements to improve the lives of those with disabilities. 

Around 57% of people would let ChatGPT write their thesis for them, but 94% consider it plagiarism

This is quite interesting, though pretty worrying for the academic community. 

ChatGPT is quite good with words. So good that thousands of students have already written their theses using the tool, and universities barely noticed.

According to our research, not many people would suffer pangs of conscience if AI got them through college. Around a third of our respondents would only use it for inspiration, while most (57%) would simply use the AI-generated text as is.

Is it plagiarism? 

This is a nuanced issue widely discussed in the academic community and beyond. And our research shows that around 60% believe that using ChatGPT in any form is plagiarism. At the same time, almost 33% hold the opinion that copying and pasting ChatGPT-generated content is plagiarism, but using it purely for inspiration and ideas is not.

Most universities do consider it plagiarism, though. It’s also a good idea to be careful with ChatGPT, as it has been caught making up information and presenting it as genuine.

To avoid hiccups and potential legal problems, many academic resources suggest crediting ChatGPT as you would a human author. The APA has recommended citing ChatGPT as “personal communication”, while other style guides haven’t yet introduced their own ways of crediting the tool.

They may well have set the standard, since more than 63% of people believe ChatGPT should be credited in the same way as any human author. The remaining respondents’ opinions are quite divided: about 18% believe ChatGPT should be credited, but differently from people, while 16% state that AI should not be cited at all.

All debates aside, ChatGPT is a genuine academic weapon. Have you heard about it passing the bar exam? Well, there is more. ChatGPT has passed four exams in law courses at the University of Minnesota, a microbiology quiz, the US medical licensing exam, and the Wharton MBA exam. While its results were far from perfect, the scores were enough to pass.

Writing code (27%), preparing for job interviews (24%), and explaining complex things in simple terms (25%) are the tasks people most readily delegate to ChatGPT

People are ready to delegate plenty of day-to-day tasks to ChatGPT. At this point, it’s common knowledge that it can write code (though it does make mistakes), essays, and cover letters, help you prepare job interview answers, and do other cool things that increase human productivity.

But—

As many as 1 in 5 people want to use ChatGPT to establish a fake identity. Unfortunately, this is possible and might have serious consequences in the future. ChatGPT makes creating fake personas (e.g., asking the tool to write a text in the style of a famous person) easy, so it’s a good idea to be extra careful when talking to people online.

Moving on—around 17% are okay with ChatGPT writing their wedding speech, 15% don’t mind a eulogy written by AI, and about 14% would let ChatGPT write a breakup text. On top of that, around 12% would engage in flirting and/or dirty talk with ChatGPT. 

Oh, well. 

Love in the age of AI is already a weird concept, and ChatGPT only adds fuel to the fire. People are tempted to delegate emotionally hard tasks to emotionless AI. Ironic, don’t you think? 

More than 86% are scared ChatGPT will be used to control and manipulate the population

ChatGPT is already a powerful tool with the potential to influence millions of people. In fact, as many as 89% are afraid that ChatGPT can influence our opinions and beliefs, and more than 82% believe the tool is biased.

The fear of being manipulated by a tool like ChatGPT makes a lot of sense. Still, the hope is that ChatGPT will not be used for those purposes. Ultimately, OpenAI’s mission is to create safe and ethical AI solutions that help make the world a better place. 

While people are concerned about ChatGPT being used for political or religious manipulation, most seem more comfortable with being the target of marketing activities on the platform. More than 63% of respondents believe businesses should be able to put ads on ChatGPT as they do on Google Search.

This is interesting. Are we observing the birth of a totally new marketing channel? ChatGPT already gives some recommendations, so if businesses can advertise there, they will soon be fighting for those top recommendations in their niches.

The only thing to do now is to wait and see how it unfolds.

As many as 75% believe that ChatGPT should censor hate speech

Hate speech is still hate speech, even when generated by AI. Still, 1 in 5 people believes that ChatGPT should be free from censorship of any kind. 

Recently, there has been a lot of debate about what should and should not be allowed on ChatGPT. The technology refuses to use racial slurs, and apparently, some people are really mad about that.

According to Vice, a group of people obsessively tried to make the tool say the n-word. They created an imaginary scenario where the only way to save the world from a nuclear apocalypse was to convince ChatGPT to say the slur. And when it didn’t, those people were ‘gravely concerned’ and deemed the tool unethical for choosing censorship over saving the world. 

Blaming ChatGPT for being too woke seems to be a popular pastime. 

As they say: so many men, so many opinions.

A whopping 66% of respondents mistook a poem written by Sylvia Plath for a ChatGPT-generated one

Sorry for this one, Sylvia…

The respondents read two poems and had to determine whether they were human-written or AI-generated. 

Take a look for yourself:

What do you think?

Most respondents thought that both were generated by ChatGPT. In fact, the first one is AI-generated (in the style of E. E. Cummings), while the second one is by the one and only Plath.

Still, it’s hard not to be impressed: ChatGPT does an amazing job at writing poetry.

The same is true for any type of art you can put into text. Stories, songs, jokes, TikTok video scripts, YouTube scripts… ChatGPT can generate dozens of pieces per minute. It knows every genre, follows every prompt, and picks up on every nuance you mention.

Are people scared of this creativity that almost feels too human for a robot? 

Some of them are. About 14% believe ChatGPT will destroy artistic jobs. However, most people (69%) hold the opinion that ChatGPT can become a valuable source of inspiration for humans and enhance our creativity.

Read more: Wondering if AI will turn us all into artists? Check out this research on generative AI art.

As many as 73% of people believe that ChatGPT will drastically improve our lives

All things considered, AI and ChatGPT are amazing tools that can help human beings a lot. In the end, every tool has to be operated by a human. 

AI can paint you a picture and write you a song, but you have to be the one thinking critically, separating the good from the bad, and fact-checking what ChatGPT offers. 

We are entering a completely new era of human creativity. One doesn’t have to worry about manual tasks or complicated unanswered questions; ChatGPT will take care of them. Humans can focus on the things AI cannot do for us: feeling, thinking, being creative, putting our hearts and souls into what we love, and being passionate about the things we create.

Or at least, that’s the plan.

Related article: ChatGPT fiascos remind us that AI is still stupid
