Behind The Fog Machines

Truth—deceptively simple in definition yet endlessly complex in philosophy—has always carried an unexpected practicality, one that is often mischaracterised or poorly grasped in public discourse. This practical function has shifted across epochs and, in the digital age, is morphing into new shapes faster than ever.
For the casual media consumer, tracing this evolution is a helpful tool—an initial step towards recognising how narratives are shaped, framed, and often quite deliberately distorted. But for anyone stepping into the field of intelligence analysis, particularly within open-source domains, such understanding is essential. Without a conceptual grasp of how truth and information have been operationalised through time, even well-intentioned analytic judgement may overlook obvious distortions, calculated misdirections, or embedded bias—as many historic concepts explained here are still in use today, albeit in subtler forms.
The Value of Information
Information has always had value. Even in the earliest human societies, having correct and timely information often meant better chances of survival—a phenomenon we can still observe in the animal kingdom. For primitive humans, predicting weather patterns and changes, recognising significant natural events, or anticipating hostile encounters would have been crucial aspects of life. A story not too distant from our modern condition, in truth.
Now, pause for a moment and consider why that is the case. Why are we so dependent on information? Without drifting too far into philosophy, I would offer two starting points. First, our very limited power to influence the external world. And second, our very limited knowledge of that world. In other words, if you are trying to escape wildfires, your best chance lies in knowing where the fire is at any given moment, along with the wind speed and direction—because your ability to pass through it unharmed, or to control it by force, is close to none.
Truth is a Gamble
The sad truth about truth is that it can never be fully trusted. No matter how much information one collects, or how carefully it is analysed, there is always the possibility that a new piece of data will emerge—seemingly out of nowhere—reshaping our entire understanding of events. This is a familiar reality in the scientific community: we did not create the universe, nor can we fully control or comprehend it; everything we "know" is, in the end, a theory awaiting its eventual contradiction.
In day-to-day life, believing or disbelieving information has always carried the risk of regret and disappointment. Trust the wrong source, and consequences follow. Dismiss the right signal, and an opportunity may be lost. This double-edged nature makes accepting information inherently risky. It demands not just one, but a set of tools and skills. Those whose assessments come close to the factual truth—though they may never be able to fully realise or prove it—often gain a form of advantage. In this sense, every act of belief is not simply a logical decision, but a strategic gamble, one with its own potential gains and penalties.
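The gamble framing can be made concrete with a toy expected-value calculation. The probabilities and payoffs below are invented purely for illustration: acting on a report pays off if it turns out accurate and costs you if it does not, so the rational choice depends on how those two outcomes weigh against each other, not merely on whether the report is "probably true".

```python
def expected_value(p_true: float, gain_if_true: float, loss_if_false: float) -> float:
    """Expected payoff of acting on a piece of information.

    p_true        -- your estimated probability the information is accurate
    gain_if_true  -- payoff if you act on it and it turns out true
    loss_if_false -- cost if you act on it and it turns out false
    """
    return p_true * gain_if_true - (1 - p_true) * loss_if_false

# A report you judge 70% likely to be accurate: acting on it gains 10
# units if true, but costs 30 units if false.
ev_act = expected_value(0.7, 10, 30)  # 0.7*10 - 0.3*30 = -2.0
```

Even a claim you consider likely to be true can be a bad bet when the downside of being wrong is large enough—which is precisely why belief is a strategic gamble rather than a simple true/false judgement.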
Belief as a Social Uniform
As our societies have grown more sophisticated, the need for cooperation has also increased. This has often meant living with or interacting among unfamiliar faces—frequently in contexts where one might find oneself exposed or vulnerable. To manage this, societies began to add layers of abstraction, and one effective means of doing so was by grouping individuals according to their beliefs.
Belief systems, then, became more than just intellectual positions; they evolved into badges of belonging. What one believes signals not only personal convictions, but also social alignment—indicating which groups one identifies with, and which are to be kept at arm’s length. In workplaces, religious institutions, political associations, and online communities, shared truths function as glue. They form an invisible dress code. In this way, belief can both empower and constrain, strengthening or weakening individuals and the organisations they belong to.
This is one of the earliest and most visible spaces where manufactured truth becomes not only functional, but beneficial—particularly at scale. But not all beliefs carry the same weight. Soldiers die in brutal battles to defend a belief—that the homeland and its borders must be protected at all costs—even if, in the past, they have disagreed with the very society they are now willing to die for.
When Truth Became Currency
As personal wealth and institutional assets began to accumulate, the stakes for protecting them rose significantly. At a certain point in history, it became cheaper to pay for accurate information than to suffer the costs of its absence. That tipping point transformed truth into a form of currency—something that could be exchanged for money or services. The informant, the adviser, the analyst: each emerged as a profession grounded in the economic utility of truth.
The arrangement proved so effective that being trustworthy became indispensable in many professions—and, in some contexts, the sole requirement. For a military general, for example, trustworthiness was not merely a desirable trait but a non-negotiable condition. High-ranking officers deemed untrustworthy were not simply dismissed; they were demoted, exiled, or eliminated. The more critical the role, the higher the threshold of trust that had to be established beforehand and actively upheld throughout the term of service.
The Pre-Analogue World
Before the advent of mass broadcasting, information moved slowly and was geographically constrained. The flow of information was limited to small, local circles—villages, towns, or close-knit communities. People mostly learned news through word of mouth, handwritten letters, or local gatherings. In this era, because everyone shared the same limited sources of information, what people believed about someone could spread quickly—and linger for a long time. Your name—good or bad—travelled with the stories others chose to tell about you.
The limitation of this era was advantageous, but only to a narrow extent: with fewer information channels, it was sometimes easier to trace the origin of a rumour or verify a claim through shared memory or direct witness. Yet, the lack of education and underdevelopment in various sciences meant that people were often more susceptible to believing whatever story arrived—be it from a passing traveller, a soldier, or a religious figure—without any reliable means to question it. Imaginary tales, moral warnings, and wildly exaggerated accounts could spread just as easily as facts, particularly when they aligned with existing fears, superstitions, or desires. The only modest safeguard was time—it could take years, if not centuries, for a falsehood to spread widely across geographic regions.
Analogue Broadcasting Era
The arrival of analogue technologies—radio, film, and television—changed the dynamic. Now, a farmer in India could be exposed to Soviet propaganda or American consumer advertising. National and ideological boundaries still applied, but the distance between sender and receiver was collapsing. Broadcasts could reach across cultures, though they were still curated by powerful gatekeepers such as state broadcasters or major studios.
It was both difficult and expensive to produce and broadcast programmes for the masses, and still limited in reach: the receiver needed a functioning device, tuned to the right station, at the right time. But the revolution was already underway. Large populations could now be influenced more easily than ever before—and crucially, almost in real time. Events unfolding on European war fronts could reach American households within hours or days, not months, and with far greater emotional force. This was made possible by curated audiovisual material—soundtracks, voiceovers, dramatic images—that shaped not just what people knew, but how they felt about it.
The Internet
The digital age further revolutionised how information was created and shared. For the first time, individuals could publish without needing a press, studio, or licence. However, early internet distribution still relied on traditional forms of amplification—advertising, peer recommendations, or word of mouth.
The democratisation of information had begun, but it remained tethered to older models of propagation. At the same time, regimes that viewed this shift as a threat to their stability began introducing new forms of censorship and control, adapting familiar repressive tools to the digital domain. Still, these limitations—technical, structural, and political—would soon pale in comparison to what came next: the emergence of a new paradigm, one defined not by human networks but by algorithmic systems. This marked the beginning of an era we now inhabit, where the logic of information is no longer governed by scarcity or authority, but by visibility, virality, and velocity.
Search Engines
The search engine era introduced a new value system for information. Despite the narratives pushed by tech companies, credibility ceased to be the central measure—and for publishers, reach became the defining currency. Once algorithms, rather than editors or subject-matter experts, began determining what surfaced to the top, the notion of truth was further marginalised. No matter how advanced a search engine's model for evaluating "trustworthiness", it could always be manipulated to favour one source over another, or to grant a monopoly on specific topics to certain actors. And with that came real-world consequences.
For the first time, geography became truly irrelevant. A blog post written in a remote village could, in theory, reach a global audience. However, whether a piece of content would be seen depended not on its factual accuracy or public value, but on whether it could game the system well enough to land on the front page. In effect, the algorithm became the new gatekeeper—opaque, unaccountable, and profoundly influential.
Social Media
Social media platforms amplified this shift. Information could now travel not just far, but instantly. And instead of being filtered by professionals or experts, it was now distributed by networks of casual users, influencers, and bots. What mattered was no longer whether something was true—but whether it was engaging, emotional, or controversial. That was the definition of usefulness in social media terms. The line between truth and entertainment was gone, and the architecture of belief itself began to shift.
The Fog Machines
Once reach became more valuable than credibility, organisations adapted. Search engines decided who was worthy of being on the first page, and those positions were for sale to anyone from anywhere. Subsequently, organisations began prioritising digital territory over factual integrity. PR departments, content teams, and communication strategists became the engineers of the fog machines—producing a constant stream of content not to clarify, but to add to the confusion.
In the race to dominate narratives, the truth became negotiable. What mattered was visibility, not veracity. Experts lost their audience to influencers and bots, and every news or media outlet quickly became another fog machine.
It is important to understand how these fog machines operate—because their sole purpose is to manipulate the field of decision-making in ways that serve their own interests. Every organisation, without exception, engages in this—whether with ill intent or out of fear of becoming irrelevant.
The mechanisms are often subtle, but not hard to spot. One common tactic is sensory overload: by flooding the public with repetitive messaging whilst omitting potentially troublesome facts, they create an illusion of clarity. In reality, they are narrowing the scope of what you see—and therefore, what you can reasonably choose.
Another trick is strategic distraction. Like a magician who performs in plain sight, organisations often shift attention away from core issues without ever asking you to look away. They do not persuade you to ignore the truth—they simply arrange the spectacle so that you miss it entirely.
Then there is the illusion trick. Just as an optical illusion misleads the eye through calculated visual cues, organisations can engineer confusion through improbable but persuasive combinations. Many issues—especially in politics, finance, and foreign policy—are naturally or intentionally complex. Fog machines thrive in these areas: the stakes are high, and there is usually a beneficiary willing to fund a giant fog machine and still come out millions ahead.
Final Note
This article is the first in a series designed for beginner to intermediate intelligence analysts and investigators. Understanding how we arrived at the current information landscape is essential, as in many situations you will encounter a mix of contrasting conditions—for example, a remote village in a developing country with full access to the internet and social media platforms. In such cases, maintaining an awareness of all contextual layers—technological, cultural, and informational—can prove crucial to sound analysis.