[Image: Mannequin taking a butt selfie]

Will AI Kill The Internet?

I'm not usually one to take part in sensationalism or speculation, but hear me out for a moment. Artificial Intelligence (AI) today is a stand-in phrase for deep feed-forward neural networks: convolutional neural networks (CNNs) and, more recently, Transformer networks. See my article on why these networks are definitely not AI; they are pattern recognition and pattern generation systems. The current wave of generative AI websites is quite the diversion and can be entertaining. They essentially glean patterns from the world wide web and reproduce those patterns in occasionally interesting ways. That's fine, so long as we don't call them AI or, god forbid, artificial general intelligence (AGI), and we recognize them for what they are: an interesting advancement of neural network techniques that have been around for decades.

If you haven't played around with ChatGPT or Gemini, you should; just don't spend money on it. One particularly entertaining diversion is to ask it to generate images for you. It is really quite good at the types of images where there are billions of training examples for it to consume, for example, photographs of food. Instagram is a repository of three meals a day for influencers and regular folks alike, so there are probably millions of food photos uploaded and labeled every day. Unsurprisingly, it's pretty good at generating these, so long as you can overlook the finer details being crazy. If you lean back and squint, they look pretty damn passable. And we can extrapolate that these models will generate passable examples of any type of image they have that kind of data stream for. Gemini won't generate images of people unless you pay, and I don't have the readies for that kind of triviality right now. But I'd bet you dollars to doughnuts that it absolutely excels at duck-lip selfies in bathroom mirrors.

Extrapolating further, these networks will be passably good at generating any of the types of images that flood onto the social media platforms they train from each day. Which are also the types of images people love to spend their attention dollars on, in the surveillance-capitalism sense.

But here is the scary part...

If these networks can generate the types of images we most like to swipe on and look at, there will be a huge reward for making bots to generate that content. Imagine a Jane Doe bot account that posts hundreds of butt-selfie yoga-pants videos a day, with a better body than any real woman could possibly have. Or a John Doe account uploading hundreds of photos a day of flexing in the gym, with better biceps than any real human could have. Or another bot account posting food more elaborate and better-looking than the real stuff, 60 meals a day rather than 3. You probably see where I'm going here. Very soon we will all be asking ourselves whether any of the images we see are real or made with pattern generators. You should probably be asking yourself this already. We are coming out of the age of asking how many filters are applied to an image and graduating to asking whether anything in the image is real at all.

So the next, and I think the giga-dollar, question is: will you still want to look at the internet if it is dominated by fake content?
