Joe Biden never won. This is our Real President – 45, 46, 47.

AND our beautiful REALFLOTUS.
This Stormwatch Monday Open Thread remains open – VERY OPEN – a place for everybody to post whatever they feel they would like to tell the White Hats, and the rest of the MAGA/KAG/KMAG world (with KMAG being a bit of both).
And yes, it’s Monday…again.

But we WILL get through it!

We will always remember Wheatie,


Pray for Trump,

Yet have fun,

and HOLD ON when things get crazy!

We will follow the RULES of civility that Wheatie left for us:
Wheatie’s Rules:
- No food fights.
- No running with scissors.
- If you bring snacks, bring enough for everyone.
And while we engage in vigorous free speech, we will remember Wheatie’s advice on civility, non-violence, and site unity:
“We’re on the same side here so let’s not engage in friendly fire.”
“Let’s not give the odious Internet Censors a reason to shut down this precious haven that Wolf has created for us.”
If this site gets shut down, please remember various ways to get back in touch with the rest of the gang:
- Our backup site, The Q Tree 579486807, https://theqtree579486807.wordpress.com/
- Our old alternative site, The U Tree, where civility is not a requirement
- Our Gab Group, which is located at https://gab.com/groups/4178
- Our various sister sites, listed in the Blogroll in the sidebar
Our beloved country is under Occupation by hostile forces.

Daily outrage and epic phuckery abound.
We can give in to despair…or we can be defiant and fight back in any way that we can.
Joe Biden didn’t win.
And we will keep saying Joe Biden didn’t win until we get His Fraudulency out of our White House.

Wolfie’s Wheatie’s Word of the Week:
jigamaree
noun
- a thingamajig or thingamabob
- an object or entity that is poorly or only vaguely described
- a cunning trick or maneuver
- a trivial, ridiculous, frivolous, or worthless fancy
- a gadget or object of uncertain name or purpose
Used in a sentence
Whether it was better described as a jigamaree or a thingamajig, the strange aluminum contraption was supposedly an essential device for our mountaineering adventure.
Shown in a picture

Shown in a video
MUSIC!
This isn’t so much a listenable piece as it is snippets and memories. Check it out! You probably remember a few of these songs.
THE STUFF

This will hurt your brain, but you will undoubtedly learn something. You don’t have to finish it, but if you do stick with it, you will literally understand some cool stuff about “spheres in higher dimensions”.
And YES – this is NOT “AI slop”!
And as a bonus, if you stuck around for it – the formulas for “lower-dimensional spheres” – meaning the point and the line segment! Kinda strange – right?
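For anyone who wants to poke at the formula without sitting through the whole video: the standard n-ball volume formula is V_n(r) = π^(n/2) / Γ(n/2 + 1) · r^n, and it covers the "lower-dimensional spheres" too. A quick Python sketch (my own, not taken from the video):

```python
import math

def ball_volume(n, r=1.0):
    """Volume of the n-dimensional ball of radius r:
    V_n(r) = pi^(n/2) / Gamma(n/2 + 1) * r^n."""
    return math.pi ** (n / 2) / math.gamma(n / 2 + 1) * r ** n

# The "lower-dimensional spheres" mentioned above:
print(ball_volume(0))  # n=0, a point: "volume" 1
print(ball_volume(1))  # n=1, a line segment of radius 1: length 2
print(ball_volume(3))  # n=3, the ordinary ball: 4/3 * pi

# Curiously, the unit-ball volume peaks around n = 5 and then shrinks toward 0:
peak = max(range(20), key=ball_volume)
print(peak)
```

Kinda strange indeed: past five dimensions, the unit ball holds less and less volume, heading to zero as n grows.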
Just sayin’!
And remember…….
Until victory, have faith!

And trust the big plan, too!

And as always….

ENJOY THE SHOW

W











New Schlichter —
https://townhall.com/columnists/kurtschlichter/2026/03/23/no-maga-is-not-falling-apart-because-a-few-podcasters-did-not-get-their-way-n2673218
I like the title!
YUP! Same here!
It’s a good piece about how movements are composed of coalitions.
Hi, pgroup, if you are reading this! 👋 🙏 Hope to see you before long.
Scott457: you follow bitcoin news, right? I just heard the last part of a Michael Brown (the guy George Bush once told: “you’re doing a heck of a job, Brownie”) radio show talking about Iran and Bitcoin. It was very interesting because it suggests Iran (and others) may have been doing mining in a way we were not specifically tracking, or at least not publicly.
Listen here:
https://khow.iheart.com/featured/the-michael-brown-show/content/2026-03-21-1248-the-situation-the-weekend-3-21-26-the-weekend-hour-3-iran-bitcoin/
I will check it out, thanks Barb! 👍
The video about the songs of the ’70s is interesting. I had no idea that Vicki Lawrence had a hit song. I didn’t know she had done any serious recording.
A lot of those songs have either singable melodies, compelling stories, or both, and I think people are drawn to that. It’s why folk songs are popular. Some of them are nonsensical, some are sad, and many describe human experiences. Adults remember the songs learned as children.
I think *they* tried to destroy the cultural music and dancing that draws people together.
This seems like a failure of oversight, not to cast blame, but it should be a wake-up call to get something like DOGE in place to prevent and catch the fraud.
Aubergine was discussing the problem of AI modeling not being based on the best of the best.
Stephen Simmons
@simmonsactual
Mar 22
Model Collapse and You.
Here’s what’s on my mind right now about the trajectory of AI. While this may be old news for some, we need to be collectively aware of risk as we continue to integrate AI into everything we do.
We are in an era of extraordinary AI scaling…models that power everything from personalized recommendations and content generation to logistics optimization, search, and decision support. But a subtle challenge is accelerating beneath the surface, and it has the potential to undermine the progress we’re counting on.
The core issue: AI is eating its own tail (the Ouroboros effect).
Today’s large language models and recommendation systems were trained mostly on human-generated internet data (pre-2023). But generative AI now floods the web with trillions of words, images, code snippets, reviews, and articles, much of it low-quality, hallucinated, or homogenized. When the next generation of AI models scrapes the web for training data, it ingests this “faulty” synthetic content.
The Result? Performance collapses:
– Variation drops (this is not DEI…the AI models output bland, repetitive junk).
– Errors compound (hallucinations become the norm).
– Real-world accuracy plummets.
It’s classic garbage-in/garbage-out on steroids. Research shows that even a small fraction of synthetic data can break scaling laws, resulting in what I can only describe as the ultimate self-licking ice cream cone.
This creates a self-reinforcing feedback cycle…AI validating itself through AI. Small hallucinations, biases, simplifications, and homogenized patterns compound recursively. Variety in outputs narrows dramatically. Errors amplify. Performance degrades in subtle but cumulative ways: recommendations become less useful, creativity erodes, accuracy drifts, and reliability declines.
Researchers have documented this as “model collapse,” where the system breaks the very scaling laws that have driven progress, leading to cascading mediocrity rather than a sudden failure.
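The recursive degradation described above can be shown with a toy simulation. This is my own illustrative sketch, not the researchers’ actual setup: a trivial Gaussian “model” is repeatedly refit on the previous model’s synthetic output, and the spread of the data collapses over the generations.

```python
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

def fit_and_sample(data, n_samples):
    """'Train' a trivial Gaussian model on data, then emit synthetic samples."""
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    return [random.gauss(mu, sigma) for _ in range(n_samples)]

# Generation 0: genuine "human" data from a fixed real-world distribution.
data = [random.gauss(0.0, 1.0) for _ in range(30)]
spreads = [statistics.pstdev(data)]

for _ in range(300):
    # Each new "model" trains ONLY on the previous model's synthetic output.
    data = fit_and_sample(data, 30)
    spreads.append(statistics.pstdev(data))

print(f"spread at gen 0: {spreads[0]:.3f}  vs  gen 300: {spreads[-1]:.4f}")
```

The spread shrinks generation over generation, which is the “variety narrows dramatically” effect in miniature; mixing fresh human data back in at each generation is what prevents it.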
The stakes are high…and this challenge will hit big tech especially hard. The good news is that this isn’t an unsolvable problem. There are some straightforward ways to address it:
– Deliberately anchor machine learning/training with human-informed data — verified, diverse (not DEI – literally differing perspectives), and grounded in real-world human experience, judgment, and variety.
– Implement rigorous provenance tracking to know exactly what fraction of training data is synthetic vs. human-originated.
– Enforce minimum ratios of human-generated content in every major retraining cycle.
– Prioritize ongoing curation, external verification, and human oversight to prevent recursive degradation.
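The “enforce minimum ratios” point above could look something like this in practice. A minimal sketch only; the function name, the provenance tags, and the 30% floor are all illustrative assumptions, not anyone’s actual pipeline:

```python
import random

def build_training_mix(human_pool, synthetic_pool, batch_size, min_human_ratio=0.3):
    """Assemble a training batch with a guaranteed floor of provenance-verified
    human data. The 30% default is an arbitrary illustrative number."""
    n_human = max(1, int(batch_size * min_human_ratio))
    n_synthetic = min(batch_size - n_human, len(synthetic_pool))
    if len(human_pool) < n_human:
        raise ValueError("not enough verified human data to meet the minimum ratio")
    batch = random.sample(human_pool, n_human) + random.sample(synthetic_pool, n_synthetic)
    random.shuffle(batch)
    return batch

# Toy provenance-tagged pools: far more synthetic than human content, as on today's web.
human = [("human", i) for i in range(40)]
synthetic = [("synthetic", i) for i in range(400)]

batch = build_training_mix(human, synthetic, batch_size=100)
human_share = sum(1 for tag, _ in batch if tag == "human") / len(batch)
print(f"human share of batch: {human_share:.2f}")  # at least 0.30 by construction
```

The key design choice is that the human floor is enforced structurally at batch-assembly time, rather than hoped for from whatever the scraper happens to collect, which is exactly the provenance-tracking discipline the thread above calls for.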
These aren’t heavy-handed restrictions on innovation…they are disciplined practices that preserve AI usefulness, trustworthiness, and long-term advancement. When we protect the human input, we ensure these AI models continue to improve rather than echo their own mediocrity.

Mar 22, 2026 · 1:03 AM UTC