
Social Engineering: What It Is and How It Works—by Richard P. Weigand

Introduction

I am very pleased to publish this new paper by Richard P. Weigand on the subject of “social engineering.” I think it is one of the most important articles we’ve published so far, and everyone should read and understand it. Simply put, “social engineering” as a subject is “about changing an opinion in the public mind. Creating it, not leaving it to chance or finding out what the general opinion is and working with that. It’s the action of selling a specific opinion to a market. And that opinion has everything to do with the social structure of the area being marketed to.”[1] The tools of social engineering, which is also known as “psyops,” “brainwashing,” “fifth-generation warfare,” or “black propaganda,” include “propaganda by the redefinition of words,” as well as the invalidation of cultural norms for the perceived benefit of those perpetrating the engineering. As our country, the United States, has been subjected to, and is currently being subjected to, operations of this sort, understanding how it is done is critically important. In this short paper you have the basics. Please read it and share it. MA

__________________

In the modern world, many cultural shifts appear sudden, mysterious, or even inevitable. But behind these shifts, there is often a remarkably simple mechanism at work. It is not always visible. It is rarely debated. But once understood, it is easy to spot. This is the basic engine behind much of social engineering, propaganda, and the redefinition of words:

Step 1: Choose the Word or Idea

Decide what concept you want to promote. It could be a new term entirely (like “gender identity”), or an old word with a fresh coat of ideological paint (like “equity” or “freedom”). Choose a word that carries emotional weight or rhetorical power.

Step 2: Identify the Field

Next, locate the discipline where that word would seem to belong. Is it medicine? Psychology? Law? Education? Public health? The chosen word must have a home—a credible field that lends it legitimacy. Without it, it floats.

Step 3: Find the Authorities

Now locate respected voices in that field. These might be researchers, academics, think tanks, nonprofit organizations, or credentialed influencers. You don’t need all of them. You just need a few—the right ones.

Step 4: Start with the Sympathetic

Among those authorities, find the ones most likely to support or sympathize with your redefinition. Maybe they already hinted at it in past work. Maybe they have personal or ideological alignment. Maybe they can be persuaded, pressured, or paid.

Step 5: Publish and Broadcast

Help that authority publish something—an article, a white paper, an op-ed, a podcast interview. Assist if needed. Fund it if you must. The goal is to get the redefinition out there from a “credible” source. Give it polish. Give it exposure.

Step 6: Repeat the Process

Now do it again. And again. Find a second expert. Then a third. Have them cite each other. Link their papers. Reference their work in media soundbites. This creates the illusion of consensus.

Step 7: Media Amplification

Make sure these redefinitions are broadcast, quoted, and discussed in friendly media outlets. Push headlines. Use phrases like “experts say,” “research finds,” and “studies show.” Public repetition cements the phrase. What gets said often enough begins to feel true.

Step 8: Let the Crowd Finish the Job

Once the term has academic backing and media coverage, regular people will start to adopt it. Some will do so to fit in. Others to sound informed. Some will simply repeat what they heard without realizing it was planted. At this point, the term has entered the culture. It is now accepted. Not because it was debated or proven, but because it was repeated, credentialed, and normalized.

Why This Matters

This method is not inherently evil. But it is ethically very dangerous.

It short-circuits public reasoning. It replaces open discourse with manufactured consensus. It weaponizes the public’s trust in expertise to smuggle in new meanings, and with them, new norms, new laws, and new expectations.

This is how definitions are changed without consent. This is how new truths are installed. Not by vote or debate, but by repetition, reputation, and reach.

Once you understand this method, you begin to see it everywhere: in medicine, education, law, mental health, and politics. You see words take on new meanings overnight; you see objections labeled as ignorance. You see the same phrases show up across headlines, classrooms, and corporate HR manuals. It all feels coordinated, because it is. It is not a mystery. It is a method.

And that is why we must reclaim our language, our reasoning, and our ability to say: “Wait! Who said that? And why should I believe them?”

The information in this article belongs in any serious discussion of propaganda, education, psychology, media manipulation, or the decline of meaning itself [2].

It belongs in any book that seeks to understand how a society can be changed without its people ever fully knowing how it happened.

How Redefinitions Take Hold

How do you change a society’s mind?

Not, at first, by brute force, but by reshaping the way people process information.

In the modern world, particularly in the West, we trust in systems—systems of knowledge, systems of verification, and systems of authority. And, like all systems, these can be exploited.

Step One: Pick the New Idea

Every manipulation begins with a goal. You start by choosing the word, or idea, you want to normalize. Redefine a term, shift a norm, or introduce a new belief.

That word or idea already belongs to a specific domain, whether medicine, psychology, education, law, gender, media, or morality. So, the next move is strategic.

Step Two: Target the Field

You find the field the idea belongs to. You then look for the authorities inside that field. Not the ones most respected, but the ones most persuadable: the ones already adjacent to your goal, or vulnerable to prestige, ideology, or funding.

Then you approach them. You invite them to publish a paper. You offer a platform, an honor, a grant. You don’t need them to lie. You just need them to agree a little. To shift the language. To say, “we might need to rethink,” or “emerging evidence suggests.”

Once that first agreement is on record, it becomes the basis for the next. Soon, others follow. More papers and more interviews, then citations, roundtables, and media coverage. Suddenly the idea is not fringe; it’s “being discussed.” Then it’s “increasingly supported,” and becomes “settled science.” And finally: “everyone knows.”

But, there’s a catch…

Western societies place great trust in two things: the consensus of experts, and the repetition of data across multiple sources.

That is the loophole.

If an idea is repeated by enough credentialed voices, and those voices appear across multiple platforms, it will be received as truth, regardless of whether it is true. Truth, in this model, is a function of perceived agreement. The average person doesn’t have time to trace every source, and so relies on filters like:

“Is this expert reputable?”

“Is this idea repeated in many places?”

“Does it align with what I already hear?”

All of these filters can be gamed. If you choose the right experts—and make sure they’re amplified across media—you can manufacture the illusion of consensus. Not by discovering truth, but by distributing agreement.

It is not a conspiracy. It’s a strategy.

And it works because of how the system processes credibility.

The error in the machine is this: what counts as a fact?

Here is where language plays its greatest trick. We treat facts as foundations. But not all facts are truths. A fact is simply something verifiable. It is a fact that someone said something. It is a fact that an article was published. It is a fact that a study was cited. But whether what was said, published, or cited is true is another question entirely.

This is the error built into every system that counts repetition as validation.

Because once something becomes a fact in this shallow sense—something that can be pointed to—it gains legitimacy. And legitimacy becomes leverage.

What no one asks is: “Was the foundation sound?” “Did the original paper hold up?” “Were the premises tested—or just aligned?”

How Lies Become Structures

This is how language shifts. This is how gender became identity, and how feelings replaced evidence. It’s how science became faith in credentialed consensus, not the testing of hypotheses.

None of it required force, only influence applied in the right place.

If you want to control belief, you don’t need to persuade the public. You just need to persuade a few well-placed voices—and make sure everyone hears them enough.

Because the human mind, when tired or overwhelmed, will mistake repetition for reality.

Copyright©2025

By Richard P. Weigand

All Rights Reserved

About Richard P. Weigand

Richard P. Weigand

Born in 1946 and currently living in rural Virginia, Richard has spent most of his life engaged in troubleshooting of one kind or another. He has been referred to as a business psychologist, a label he does not relish. He’s also a Vietnam War veteran, having served in the United States Navy. From an early age he often found himself being asked to solve problems for others, something he seemed to have a knack for. By the early 1990s the “knack” had turned into a profession, with Richard working in Hollywood consulting for artists, directors, producers, musicians, and actors, from novices on up to and including Academy Award nominees. As that business grew, his client base expanded to all types of professions, from the unemployed to the heads of big businesses around the world—often with spectacular success. Ultimately these wins led to offers from a South American government to help it analyze and clean up the corruption in its police and military, including advising the government on its handling of the drug cartels. A skilled investigator and a published poet, Richard today devotes most of his time to research and writing and has several projects scheduled for future publication.


[1] Quoted from the book “Countering Cultural Destruction” by Roger Westlin, chapter “Social Engineering-The War.” This book is recommended reading on this subject and is linked here:

https://a.co/d/fcNMB4X

[2] Suggesting that the cumulative effect of these manipulations is not just misinformation, but a hollowing out of shared meaning, language itself, and truth.

Insightful Commentary on Today's Battle for Human Rights!

In today's WOKE world, the real message of our basic, intrinsic, and inalienable Human Rights gets perverted and lost. It is my mission to prevent that from happening.

Sign up below for updates on things you won't hear from mainstream media, exclusive news, and sneak peeks at upcoming projects.​

