GPT-5, OpenAI's long-awaited next-generation LLM, promises the next big leap in artificial intelligence, but its development is running into challenges that make us question the limits of progress in this industry. I can’t help but remember how striking the jump was from GPT-3 to GPT-4, let alone GPT-4o: suddenly the possibilities for understanding, contextualizing and generating text seemed almost magical. Now, however, with “Orion” – the project’s internal name – reports suggest that the improvements, while real, may not be as revolutionary as we expected. Are we witnessing the beginning of a phase in which progress slows down? The possibility leaves me with mixed feelings.
The jump from GPT-3 to GPT-4 marked a before and after. The latter showed a greater ability to understand complex contexts, offer more precise answers, and adapt to more diverse tasks. Repeating progress of that magnitude with GPT-5, however, appears to be a bigger challenge. One of the most prominent problems is the shortage of quality textual data. According to a report by Epoch AI, the supply of this essential resource could run out before 2028, pushing companies like OpenAI to rely increasingly on synthetic data. Although useful, synthetic data does not always achieve the diversity and accuracy needed to train top-tier models.
Added to that are limitations in computing resources. Sam Altman, CEO of OpenAI, recently admitted that the company faces constraints in its technical infrastructure, forcing it to make strategic decisions about how to allocate its capacity. Developing ever larger and more complex models requires enormous computing power, which is not always easy to scale. These kinds of barriers raise an important question: can innovation keep being sustained with current tools, or are we reaching a point where advances become more costly and less meaningful?

Not everyone sees this panorama pessimistically. Figures like Eric Schmidt, former CEO of Google, insist that there is still room within the “scaling laws”, although they recognize that these are not infinite. Kevin Scott, CTO of Microsoft, goes further and says that the industry has not yet seen diminishing returns from scaling up models. This optimism suggests that significant progress could still be made in the coming years, provided companies manage to overcome current challenges with more strategic and innovative approaches.
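For readers unfamiliar with the term, the “scaling laws” these executives refer to are empirical relationships, popularized by Kaplan et al. at OpenAI in 2020, that tie a model’s test loss to its size and training data. The formula below is only a simplified sketch of that published finding, not anything disclosed about GPT-5 or “Orion”:

% Simplified power-law form of the empirical scaling law (Kaplan et al., 2020).
% N is the number of model parameters; N_c and \alpha_N are fitted constants.
% Because \alpha_N is small, each doubling of N buys a smaller absolute drop
% in loss: gains continue, but they shrink, which is the crux of the debate.
L(N) \approx \left(\frac{N_c}{N}\right)^{\alpha_N}

In plain terms, the curve keeps going down as models grow, but ever more slowly, which is why the argument is about whether the remaining headroom is still worth the escalating cost in data and compute.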
As we await the release of GPT-5, I can’t help but feel curious and slightly impatient. Will it surprise us as its predecessors did, or are we entering a new era of slower, less visible progress? Despite the challenges, each step along the way redefines our relationship with technology. I, for one, look forward to seeing how far this journey can go, although the current signs remind me that even the most promising paths have their obstacles. And you? What do you expect from the future of GPT-5?