People have been living in their own media bubbles, or echo chambers, whatever you want to call them, for quite some time. If you curate your algorithm well enough, you never have to hear an opposing opinion for the rest of your life.
Now, with the sudden boom of AI chatbots, the problem has gotten even worse. Some folks are using these chatty, friendly algorithms as pseudo-therapists that don't tell them what they need to hear, but rather exactly what they want to hear. AI chatbots are becoming highly efficient echo chambers that can quickly ruin someone's life by reinforcing their worst impulses.
These digital yes-men with big vocabularies, a knack for buttering you up, and absolutely no moral compass are often used for emotional support. It's something we've covered quite