The Impact of Generative Pre-trained Transformers
Understanding the Influence of GPTs
It is March 29, 2023, and you’re watching The Code Report. Generative pre-trained transformers are transforming the world, as their name implies…
Concerns Surrounding AI Advancements
People are afraid. OpenAI recently released a paper listing out all the occupations that will be affected by large language models. They concluded that up to 49% of workers could have at least half of their job functions enhanced by AI…
Reasons Why AI Falls Short
Questioning the Need for Computer Science Degrees
First of all, it’s forcing many people to question whether they should get a degree in computer science. What’s the point if AI can write and debug its own code or build an app from a design sketched on a napkin? But almost every other degree, like history, math, gender studies, and business, will be affected just as much…
Limitations of ChatGPT in Complex Applications
The reality is that most people don’t end up using what they learned in their degree in their actual work. You need real experience to develop a skill, and ChatGPT is making the system look even more ridiculous because now almost any assignment or quiz can be solved instantaneously…
Unveiling Marketing Hype Surrounding AI
But at the same time, they released a paper talking about how awesome GPT-4 is and how it’s showing sparks of AGI, without exposing any important technical details about how it works, while also arguing that AI regulation is needed ASAP before artificial general intelligence emerges…
The Boring Internet: AI’s Influence
The fourth reason AI kind of sucks is that it’s making the internet boring. Now it’s almost impossible to know whether a social media account or image is human-crafted or not. Like my grandma thought this dripped-out image of the Pope was real…
The Eventual Plateau of AI Development
The algorithms that power LLMs are not even that complex when you get into the details. You can build your own GPT in Python in a day, but doing it well takes a massive amount of computing power and time. The transformer architecture behind these models was first introduced by Google in 2017…
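To make the “not that complex” point concrete, here is a minimal NumPy sketch of causal self-attention, the core operation inside a GPT-style transformer. The function name, dimensions, and random weights are illustrative assumptions for this example only, not code from any real model.

```python
# A minimal sketch of causal (masked) scaled dot-product self-attention,
# the core building block of a GPT-style transformer. Sizes are toy values.
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings; w_*: (d_model, d_head) projections."""
    q = x @ w_q  # queries
    k = x @ w_k  # keys
    v = x @ w_v  # values
    d_head = q.shape[-1]

    # Attention scores: how strongly each token attends to every other token.
    scores = q @ k.T / np.sqrt(d_head)

    # Causal mask: a token may only attend to itself and earlier tokens,
    # which is what lets the model be trained to predict the next token.
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(future, -np.inf, scores)

    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    return weights @ v  # weighted sum of values: (seq_len, d_head)

# Tiny usage example: 4 tokens, 8-dimensional embeddings, one attention head.
rng = np.random.default_rng(0)
seq_len, d_model, d_head = 4, 8, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_head)) for _ in range(3))
print(causal_self_attention(x, w_q, w_k, w_v).shape)  # (4, 8)
```

A full GPT stacks many of these attention layers with feed-forward layers and learned token embeddings; the point of the sketch is just that the math in one layer fits in a few dozen lines, while the expense lies in training at scale.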