Is GenAI entropic?
Generative AI (GenAI) has captured the imagination of professionals, hobbyists, and society at large. Its main selling point is the promise of improved productivity through the simplified creation of text, images, and other content. GenAI is stochastic: it uses probabilistic models to learn patterns in large datasets and generates new content by predicting the most likely sequence of outputs from those learned relationships. This process can be framed in Bayesian terms, refining predictions about the likelihood of a particular outcome (e.g., the next word or image pixel) from prior knowledge and evidence in the input context.
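As a rough illustration of this predictive mechanism, the sketch below samples a next word from a toy bigram model. The vocabulary, probabilities, and function name are invented for the example and bear no relation to any real model:

```python
import random

# Toy bigram "language model": learned conditional probabilities
# P(next word | current word). All values here are illustrative.
model = {
    "the": {"cat": 0.5, "dog": 0.3, "end": 0.2},
    "cat": {"sat": 0.6, "ran": 0.4},
    "dog": {"ran": 0.7, "sat": 0.3},
}

def next_word(current, rng):
    """Sample the next word from the learned conditional distribution."""
    candidates = model[current]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

rng = random.Random(42)
print(next_word("the", rng))
```

Real systems condition on far more context than one word, but the principle is the same: the output is a weighted draw from a learned distribution, not a deterministic lookup.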
However, this raises an intriguing question: Is GenAI subject to entropy? To explore this, we must understand the principles of entropy and how they might apply to the systems underpinning GenAI.
Entropy is a concept from the second law of thermodynamics, which describes the tendency of a closed system to move toward greater disorder and randomness. Is GenAI a closed system? It depends: if it operates on a fixed dataset, it is closed; if it is continuously updated with fresh data and inputs, it is open. However, as the use of GenAI becomes more ubiquitous, more of its own output will flow back in as training data, amplifying its own biases and effectively closing the system.
If we conclude from this that GenAI is a closed system, the analogy with entropy suggests that, without "energy" in the form of external, diverse, and high-quality data sources, its outputs will grow increasingly chaotic and random. Without intervention, the information produced might eventually approach a tipping point at which GenAI's output becomes effectively random, losing all coherence and utility.
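The "randomness" in question can be made concrete with Shannon entropy, the information-theoretic counterpart of thermodynamic entropy: a distribution concentrated on a few likely outputs has low entropy, while a distribution spread evenly over everything has maximum entropy. A minimal sketch (the distributions are illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A sharply peaked (coherent) distribution versus a uniform (random) one
peaked = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]

print(shannon_entropy(peaked))   # low entropy: outputs are predictable
print(shannon_entropy(uniform))  # maximum entropy: log2(4) = 2.0 bits
```

In these terms, the tipping point described above is the drift of a model's output distribution from "peaked" toward "uniform".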
How likely is this tipping point? As mentioned, GenAI operates on stochastic principles, leveraging probabilistic models to predict the most likely sequence of words or data points. Its outputs ultimately rest on binary representations, 1s and 0s, which invites an analogy with the Ising model: a lattice of binary elements (spins) that influence their neighbours according to their states. The Ising model shows how macroscopic patterns of coherence or randomness emerge depending on the system's energy and configuration. As the temperature rises, the system passes through a critical threshold, the Curie point, beyond which the state of the whole appears completely random. Near this transition, quantities such as the overall magnetisation follow power-law scaling, where one value varies as a power of another, so order can collapse abruptly rather than fading gradually: a sudden acceleration toward entropy.
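The Ising picture can be sketched with a small Metropolis simulation. The function name, lattice size, and temperatures below are illustrative choices, not part of the original argument:

```python
import math
import random

def ising_magnetisation(n=16, temperature=1.0, steps=100000, seed=0):
    """Metropolis simulation of a 2D Ising lattice of +1/-1 spins.

    Starts fully ordered and returns the absolute mean magnetisation:
    close to 1 if order survives, close to 0 once thermal randomness wins.
    """
    rng = random.Random(seed)
    spins = [[1] * n for _ in range(n)]  # fully ordered starting state
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        # Sum of the four nearest neighbours, with periodic boundaries
        neighbours = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
                      + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        delta_e = 2 * spins[i][j] * neighbours  # energy cost of a flip
        if delta_e <= 0 or rng.random() < math.exp(-delta_e / temperature):
            spins[i][j] = -spins[i][j]
    total = sum(sum(row) for row in spins)
    return abs(total) / (n * n)

# Below the critical (Curie) temperature (about 2.27 in these units)
# order persists; well above it, the lattice looks random.
print(ising_magnetisation(temperature=1.0))  # close to 1: ordered
print(ising_magnetisation(temperature=5.0))  # close to 0: disordered
```

The analogy to GenAI is loose, but the qualitative behaviour is the point: order does not degrade linearly with "temperature" but collapses sharply once the critical threshold is crossed.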
Whether GenAI is entropic depends largely on whether it functions as an open or closed system. In an open system, GenAI continually absorbs new data, ensuring its predictions and outputs remain relevant and well-grounded. This influx of information effectively counters entropy by providing the “energy” necessary to maintain order. Conversely, if the data available to GenAI is finite or stagnant, the system becomes closed. Under these conditions, entropy becomes inevitable. Without fresh inputs, the system could degrade, producing increasingly chaotic and nonsensical outputs over time. At its core, the potential for entropy in GenAI serves as a reminder of the fragility inherent in even the most advanced technologies. As we integrate AI more deeply into our lives, understanding and mitigating these risks will be essential for harnessing its full potential while safeguarding against its limitations.
In conclusion, the entropic nature of GenAI is not a fixed state but a dynamic balance. By treating it as an open system, supplying it with a continual stream of fresh data and refining its algorithms, we can stave off the chaos that entropy implies. Neglecting this principle, however, could push GenAI toward a critical state in which randomness dominates and it loses its value. This underscores the need for ongoing vigilance and innovation in its development.