Monday, March 9, 2026

Why All the Talk About Ed Tech Integration Is a Bad Idea

Why is all the talk about integrating tech into education a bad idea? Here’s why.

The issue is the idea of “integrating.” To “integrate” means “to combine (one thing) with another so that they become a whole.”

This notion of “integrating” implies that teaching and learning and educating are somehow “incomplete” or not whole, and that the tech to be integrated is somehow AUTOMATICALLY going to bring about that wholeness. Not so, as history has shown us many, many times.

To speak of “integrating” a tech is to assume it is whole and sufficiently able to offer a solution to whatever instructional problem ails the teaching act. Often, these technologies are not whole by themselves and they come bundled with a whole host of unintended and sometimes nasty consequences. (That just means the teacher now has to spend inordinate amounts of time addressing these side effects.)

Instead, the Ed Tech conversation should always be about ADOPTION. This immediately reframes the entire Ed Tech conversation. 

Ed Tech companies would help education even more if they designed their products as a solution to specific problems, instead of wasting time trying to get teachers to find ways to make their products useful and legitimate.

Their products should be solutions to specific educational problems, not solutions in search of educational problems to solve.

The reason the whole Ed Tech goal should be adoption instead of integration is that the “act of adopting” places the teacher as an AGENT in the process. No longer are they subjected to Ed Tech; they choose the tech tools they need.

Educators as “adopters” have the power to investigate technologies, ask the tough questions, and, if they find a technology inadequate as a solution, veto it.

In the ED TECH ADOPTION model, the teacher is empowered to make decisions about the tools they will use or not use.

In the contrast between Ed Tech integration and adoption, it is under adoption that a tech solution is truly evaluated for its usefulness in specific teaching situations.

Thursday, March 5, 2026

It's Time to Rethink the Teacher Shortage Problem and It Does Not Involve Pay

Perhaps the real problem with the shortage of teachers is that fewer and fewer people want to do the work as it has evolved over the past 30 years or so.

When I started teaching in 1989, teachers operated in classrooms that allowed for independent creativity, initiative, and excitement. There were no testing surveillance systems. You could operate without the intrusion of administrative experts and consultants who claimed to know how to teach content better than you. Parents were generally supportive of teachers and did not engage in antagonistic tactics against what you were doing. Usually they came to you if there were problems, and the teacher could work with the parent.

Classrooms have become culture war zones. They are places where the teacher receives less and less professional deference. Instead, there are so many voices out there saying, “No, you need to do it this way, not that.” In a word, teaching has been transformed into a mechanistic scientific management task, where one is surrounded by a troop of experts all telling the teacher how to do the job.

There is no art to teaching anymore, because the administrators and their cadre of experts have transformed the instructional act into a scientific management work task.

It’s no longer rewarding to be a teacher. So the answer, it seems, is to focus on pay. Certainly you can find someone willing to do this work for the right pay, the idea goes. The problem is, apparently, that you can’t pay enough to get someone to do the teaching work today, because fewer and fewer want to do it.

The reality is, teaching has lost what librarian-researcher Fobazi Ettarh calls “vocational awe.” 

Vocational awe is defined as a set of notions that librarians hold about their institutions and themselves. To have vocational awe, workers have to believe in their institution’s goodness and rightness. They also have to believe that their profession, the work they do, is inherently good and sacred. In other words, the worker believes the work is a calling, which means they will endure and persevere in its tasks because of the good, sacred, and worthwhile big picture.

Teaching has lost this vocational awe. Schools are constantly labeled failing by everyone. Even administrators always focus on the negative, in an environment of so-called continuous improvement. In addition, the teacher’s work is no longer seen as sacred or special, because it has been turned into tasks to be carried out scientifically. The teacher’s institution and the teacher’s work are fundamentally degraded by a system obsessed with trying to improve or change, in the worship of constant innovation.

What’s more, administrators and school HR recruiters can no longer capitalize on “vocational awe” to fill teaching positions. That’s because the “awe of teaching” and “being a teacher” is gone. 

The profession of teaching has been destroyed by politicians who want to cut budgets and continuously impose new requirements on teachers. 

It has been decimated by administrators who think they know how to teach so well, they constantly intrude into classrooms with their so-called coaching and feedback, treating teachers as if they don’t know anything. 

The teaching profession has been decimated by a consultant industry made up of experts who say they know teaching better, even though some of them spent less time in the classroom than the teachers they advise, and some spent no time there at all.

The teaching shortage problem will not be solved by pay alone. 

It will certainly not be solved by relying on the vocational awe myth any more because no one is buying it. 

The teaching problem will only be solved if those who have degraded the work of teaching to the point that no one wants to do it, no matter the pay, are convinced to change their ways. 

No one wants to be a teacher anymore because vocational awe no longer exists.

Wednesday, March 4, 2026

Teaching Students About AI or Any Technology Just Might Be Shortsighted and Morally Wrong

Should our schools be focused on training students how to use AI above all else? No. Here’s why…

In the 1990s, I taught at a high school located in an area where three major fiber optic manufacturers had set up shop, and they partnered with our schools to prepare students for the kinds of jobs they had to offer.

I attended multiple PD sessions, guided by district personnel and trainers from these three manufacturers. The goal was to train teachers to teach students the kinds of skills these manufacturers, and others like them, valued in employees.

I went back to my classroom and dutifully and conscientiously taught those skills because it was my job to teach students for the jobs in their future.

Fast forward seven or eight years: the fiber optic industry tanked when demand fell. These manufacturers closed plants, merged and merged again, laid off workers, and shifted jobs to foreign countries. Many lost their jobs, perhaps even some of the students I had dutifully prepared for that future.

The point here is business and manufacturing often live and survive in the short term and the now. They no longer provide lifetime careers. If profits can be made by shifting manufacturing elsewhere, they move. That’s how it is.

As educators, preparing students for jobs that exist currently, or even hypothetically in the future, is likewise shortsighted and potentially morally wrong. The current job situation will change when companies find the grass greener elsewhere, and trying to teach skills for jobs whose existence we can only predict or guess at is gambling with our students’ futures. That is wrong.

The Seers of Silicon Valley have gotten much wrong in the past. I bet their predictions about AI will be wrong as well, or at least far off the mark.

As educators, we need to teach students, not prepare them for theoretical futures. We need to teach them everything that will allow them to live, adapt, cope, and survive in uncertainty, and to be decent, critical human beings.

Obsessively focusing on AI or any technology of the day is as shortsighted as the way most businesses currently operate. Sure, knowing what AI is, its faults, its capabilities, its limitations, and its effects on culture and the environment is all needed, but it should not be placed at the center of all learning.

The point is, we do not need to do Silicon Valley’s bidding and teach students to be dutiful users of AI or any technology; we need to teach way beyond that, to a world where AI has passed into banality and life has moved on to even greater things.

You Don't Have to Believe All Those Predictions About AI Because We've Been Here Before

There is a perfectly rational reason for discounting all the predictions of AI Evangelists and Ed Tech Consultants.

In the mid-90s, the internet zealots promoted the idea that the web was somehow going to “magically” bring us all together. It was what Vincent Mosco called “the Myth of the Death of Distance.” The web was the end of geography. Too bad it did not happen.

Even the economists got it wrong. It was Frances Cairncross, a journalist at “The Economist,” who wrote in her book “The Death of Distance”:

With the web people would be “Free to explore different points of view, on the Internet or on the thousands of television and radio channels that will eventually be available. PEOPLE WILL BECOME LESS SUSCEPTIBLE TO PROPAGANDA from politicians who seek to stir up conflicts.” (CAP EMPHASIS MINE)

What’s more, she added this now laughable prognostication:

“Bonded together by the invisible strands of global communications, HUMANITY MAY FIND THAT PEACE AND PROSPERITY ARE FOSTERED BY THE DEATH OF DISTANCE.”

Boy, did she get it wrong, like so many other Silicon Valley Seers of salvation by technology. The only bonding that has taken place is between social media companies and our personal data.

The web and its demon spawn, social media, manufactured by a Big Tech more interested in getting extremely rich, have only made us more polarized and divided than we have ever been. Their algorithms are designed to shove into our eyeballs that which divides us, not that which brings us together.

As for the wonderful “bonds of community” wrought by the internet and its technologies with the “Death of Distance”? The only thing that has died, among many other things, is what little genuine human connection we had.

So, when the AI Evangelists speak of the promise of not having to do those things we hate, when they boast that AI is the educational tool that is going to transform our profession, and when they claim that AI will someday figure out all our problems, can you understand why one should call them on this nonsense?

The best thing to do is to discount all the prediction nonsense, for no one ever provides the evidence. When they give us a massive list of jobs that will be replaced, consider it nonsense. They never provide any evidence for their assertions.

The last thing educators should do is gamble the lives of their students on the gospel of these AI prognostications. You can’t prepare them for a world that does not exist yet, because no one knows what that world will be like, not even the Silicon Valley CEO Seers or the Ed Tech AI consultants.

Monday, March 2, 2026

Why the Web Has Become a Garbage Dump

Evidence that the internet is now a garbage dump?

As an early user of the web, I used to enjoy "surfing the web." This consisted of typing key words into a search engine (yes, I am old enough to admit I used AltaVista, Yahoo, etc.) and reading through the results, and it was a pleasurable experience. If it was a controversial topic, you often had both sides of the argument for your review.

You could enjoy seeing sites that were interested in conveying INFORMATION, not in gaming the algorithms to get their slop in front of searchers.

Today, surfing the web has become impossible. There's too much poo, garbage, and sewage floating around. To use Cory Doctorow's term, the entire internet has been "enshittified."

The web is a sewer, a big garbage dump where whoever is willing to pay to get their slop in front of eyeballs gets an audience.

The pay-to-get-your-content-viewed model ignores whether such content is worthy of eyeball time at all. No wonder the internet slop problem is so bad.

When the web was transformed entirely into a money-making avenue, that was the death of the old web.

What was once touted the "information highway" has become a massive garbage dispensary.

Too bad. Web surfing is a lost sport.


#EdTech #Internet #Education

Sunday, March 1, 2026

AI Educational Utopian Myths Abound: Be Skeptical

Check to be sure that you have not fallen for the utopian dreams of endless prosperity and freedom offered up by AI Evangelists and Ed Tech consultants. Those will turn out to be empty dreams.

Vincent Mosco wrote in his 2005 book "Digital Sublime: Myth, Power, and Cyberspace":

"American history in particular is replete with visions of technological utopia spun by mythmaking optimists." (p. 36)

Mosco captured in 2005 the same spirit as today's so-called "Age of AI." We still have an abundance of "mythmaking optimists" who peddle their "visions of technological utopia," now powered by AI. It is a myth.

Those optimists are at it again, as the Silicon Valley mob share their mythical visions of utopia. But it is an old story:

First, they brought promises of a utopian community through social media that has resulted in a world of massive polarization and division. False promise number one.

Second, they promised an internet that would provide us with knowledge at our fingertips, but instead they gave us a deformed web where paywalls and data extraction/exploitation must be the ransom paid before you receive that knowledge.

What Silicon Valley ultimately gives us are deformed, mutant versions of its utopian promises.

You can bet Silicon Valley's mythical vision of AI utopia will turn into a mutated version that somehow makes us all worse off.


#AI #EdTech #AIEducation #Education

Friday, February 27, 2026

Are Our Screens and Devices Harming the Very Students We Serve? Perhaps, Here's a Book to Spark Critical Thinking about Device Addiction in Schools

To disrupt the passive, uncritical acceptance of all things technological into schools, I recommend that school leaders and all educators add Jared Cooney Horvath's "The Digital Delusion: How Classroom Technology Harms Our Kids' Learning—And How to Help Them Thrive Again" to their reading lists.

It really isn't about "banning all screens" in schools; it's about not allowing devices and tech to determine what happens in our classrooms and with our students.

Horvath rightfully captures how we as educators have been complicit in turning control of education over to companies who have made big promises that have not panned out. In fact, despite dismissal by the tech evangelical movement, the evidence is growing that this proliferation of technologies causes some actual harm.

Don't forget, the smartphone and its apps, especially social media apps, are designed to be addictive and to "capture eyeballs" and we have invited these into our classrooms with open arms. 

Horvath is correct in his whole premise that we need to wrestle back control of our education system, our schools, our classrooms, and our instruction from devices.

It doesn't mean a complete ban; it means removing tech from its central pedestal on which we have placed it.

I could see using this book as a faculty-wide read with some powerful and lively discussions on the rightful place of technologies in our schools and in our lives.

Horvath even offers many hands-on ideas for implementing an EdTech Detoxification Process in schools, or even in our lives as parents.

If we are going to foster critical examination of EdTech and the constant flow of gadgets from Silicon Valley, this book is a good place to start.




The Label "Smart" Device Might Not Be a Good Thing: Read Jathan Sadowski's "Too Smart"

Here is a book to add to your critical EdTech and critical-thinking-about-technology list, even though it goes back a bit, to 2020.

"Too Smart: How Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World" by Jathan Sadowski

Sadowski takes you through a critical overview of how companies are purposefully making their products "smart" in order to facilitate data extraction for exploitation purposes.

When a device is labeled "smart" you can bet it is gathering data about you and not always for your benefit. 

  • Companies behind free consumer apps sell this data. 
  • Insurance companies use this data against you in their pricing schemes and to manipulate your driving habits.
  • Government entities use it in their surveillance activities.

After reading this book, when a salesperson touts that a TV or a dryer is a "smart" device, you will not automatically see that as a plus. You will know that it is more a tactic of exploitation at best and manipulation at worst.

A lot of money has been spent on convincing us as consumers that the quality of being "smart" is a good thing for our devices. It is not.

Sadowski even suggests ways to disrupt and avoid all this, from turning off these features or anything related to them to purposefully sabotaging the whole smart enterprise.

There is a lot to be said for shading parts of your life out of reach of Big Tech.




Tuesday, February 24, 2026

Why Promises of EdTech Disruption Fail: What Should Educators Do Instead

One thing educators can expect is a continuous barrage of new product pitches that claim disruptive and transformative abilities. And that is happening now, as tech companies churn out their new gadgets.

AI is just the latest iteration of that pitch. This time, the AI evangelists claim, there are finally going to be profound changes in education.

This prediction is wrong.

Schools are conservative institutions. They resist disruptive change because that’s the way they are built, for better or worse. 

If they change, they do so incrementally and slowly, and that is purposeful, because if schools radically changed at the arrival of every technological or pedagogical whim, they would be “fad-surfing institutions.”

Institutions that surf the latest fads don’t ever really fundamentally change in ways beneficial to anybody. Once the hype has died down and the money has been spent on EdTech and AI consultants and technological hardware, the school is still there, and history shows it is mostly no better or worse.

Schools spend millions on these so-called “disruptive and transformative” initiatives, and when the hype dies down and has moved on to the next thing, they are left wondering why things are still the same and where all the money went.

True incremental educational change does not come from adopting new gadgets and paying off EdTech and AI consultants.

True incremental change happens when educators as a community of teachers sit down and do the hard work of examining where they are and working to find solutions.

You don’t start with a solution looking for a problem to solve, which is what AI seems to be. We did that with PCs, the web, social media, and online learning, only to discover that our long-time problems were left behind.

Friday, February 20, 2026

Silicon Valley Big Tech Innovation Model and Ed Tech's Role in It

Here is the Silicon Valley Big Tech innovation model…

Big Tech engages in the “BIG Search.” This is where the companies search for the next tech that will capture, enslave, and addict users.

Discovery of the Next Thing. Big Tech companies find a technology or device that has addiction/enslavement potential. (Variation: sometimes they transform an invention into an addictive technology.)

Marketing for Addiction. Tech companies market their product as: a) a must-have tech, or you will be left behind/irrelevant, or worse, a Luddite; b) something everybody is using or will be using, so you will be left out; c) something you might as well adopt and adapt to, because the tech is already changing the world for the better. (NOTE: This is said even if it is not, or if its negative consequences are substantial.)

Getting the Ed Institutions On Board. Tech companies next get educators and Ed Tech involved by getting them to “integrate” the product or promote its usage to students. This ensures future, sustainable users and markets for the companies. Ed Tech consultants also get a cut of the pie through consultant fees and keynote speaking fees. (NOTE: This is usually done on hearsay and no evidence. Educators who want to do what’s best for students are guilt-tripped until they get on board.)

Maintenance of the Addictive Solution/Technology. Tech companies maintain usage through continued marketing tactics above. They use uncritical acceptance of their product to their advantage. Ed Tech evangelists attack anyone who questions and criticizes. (NOTE: The Luddite Name-Calling Tactic is common.) They market their product as an unequivocal societal good, even as negative consequences stack up.

Big Tech Innovation Cycle Repetition. Tech companies search for more “innovative” addictive tech products. (NOTE: Variation—Big Tech companies buy out other technologies by small new companies and repeat the process above.)

As an educator, what I find most worrisome is the uncritical entanglement of Ed Tech with these companies. This forces educators to subject students to these technologies uncritically.

Educators are expected to sanitize and Tech-wash these Big Tech products for the Educational Establishment.


Thursday, February 12, 2026

Another AI Company CEO Boasts About AI: Educators Need to Beware of a Used Car Salesman Here

Another AI company CEO, Matt Schumer, is promising major disruptions due to his pet technology. His X post hyping up his AI systems is below.

Schumer: Something Big Is Happening (and I stand to make a bundle, so you need to purchase my Hyperwrite product)

Those who are sharing this individual's AI braggadocio: have you even asked critical questions about these claims?

First of all, have you considered that this individual has a vested interest that would lead him to say such things? After all, he wants users to sign up for his product and stands to make a bundle.

Educators, use some critical thinking before you buy into this nonsense. This just continues to fuel the AI bubble, which is going to blow at some point.

These AI CEO Shysters are out for your money and anyone's money and don't really care how their predictions harm others.

Educators should avoid doing anything or subjecting students to any Tech gadgets based on what these CEOs say.

Tuesday, February 10, 2026

EdTech Consultants and Some Educators Suffer from the Borg Complex: They See Resistance to All Technologies as Futile

I think I have found an effective diagnosis of the condition currently suffered by EdTech consultants and evangelists who can't help slobbering over AI: it is called the "Borg Complex."

The Borg Complex is described by L.M. Sacasas in a 2013 article entitled "Borg Complex: A Primer."

These EdTech AI boosters suffer from the Borg Complex because they "explicitly assert or implicitly assume that resistance to technology is futile." The Borg is a cybernetic alien race in the Star Trek universe that tells its victims it will assimilate them, biologically and technologically, and that "Resistance Is Futile."

Our EdTech consultants and boosters tell us educators that we might as well adopt AI because it's here. In other words, "Resistance is futile."

They also exhibit some of the other symptoms Sacasas lists in the article. For example:

Symptom 1: "Makes grandiose, but unsupported claims for technology." How often have we heard that "AI is a gamechanger" or that it is "revolutionizing education" with absolutely NO support? 

Symptom 3: "Pays lip service to, but ultimately dismisses genuine concerns." This is repeatedly done when they are presented with new research that points to cognitive outsourcing issues, or when the environmental costs of all these AI server farms are mentioned.

Symptom 4: "Equates resistance or caution to reactionary nostalgia." If you resist AI, you are simply clinging to old, inefficient, unproductive ways.

And Symptom 8: "Refers to historical antecedents to solely dismiss present concerns." How many times have I heard an AI booster compare resistance to AI to the resistance to calculators when they were introduced?

Borg Complex Rhetoric is designed to short-circuit any critical thought and critical examination of AI. 


Monday, February 9, 2026

Sometimes All That Technology We Buy Fails and We Need to Admit It

I am not sure EdTech has ever found a technology it did not like, or one it labeled a failure.

EdTech gurus often say that when a technology initiative or technology program fails, it’s always due to either:

1-Lack of proper training

2-Lack of fidelity of implementation.

Very rarely will you hear, “Well, that technology use was a flop!” It is never the technology that was the problem.

Gun advocates say something similar: it's the people who use the technology, not the technology. (Notice you can change the word technology to guns here.)

It’s not visionary to hang on to what isn’t working just because it “looks like you’re innovative,” or because technology justifies your existence, your job, or your consulting position.

Cell phones and screens, as well as tech generally, are disruptive all right.

Their engineering for addiction works all too well. They demand the students’ focus and attention because that’s what Big Tech wants, their eyeballs glued to devices.

Saturday, February 7, 2026

No Technology Is Inevitable and to Make That Claim Is to Limit Possibilities

“Educators might as well accept AI because it is here to stay” so goes one of the pet EdTech and AI evangelist arguments. That is not necessarily true, but let’s look at it more closely.

This statement can’t actually be proven right, because no one can see the future. Tech comes and goes, so to say it is here to stay is a prediction, not a fact.

In addition, this statement assumes that educators should accept AI in whatever form it is being offered. Again, not necessarily true. It is possible to demand that AI be safe and that it be subject to regulations that shape it in ways that make it less destructive than it is.

There is also always the consumer’s choice not to be a user. I do not have to have a subscription or an AI account.

The problem with this statement is that it is authoritarian and totalitarian by nature. It tries to remove choice, and that is a way for Silicon Valley to dictate that its products be accepted.

It comes down to this: “If you limit and direct what people can imagine, you set the parameters of possibility.”

Friday, February 6, 2026

Don't Believe the Silicon Valley Marketing Tactic: AI Is Not Inevitable

Silicon Valley tech companies have taken advantage of clever marketing, favorable public opinion, and shiny, magic gadgets to ensnare us with tech designed to be addictive, invasively surveillant, and exploitative.

It is acceptable to question the reality these techno-oligarchs and digital capitalists claim to be making, and to see that they aren’t actually making our world better. They only make themselves richer, which is evident from the homes they buy, the cars they drive, and even the clothes they wear. They are prospering at the expense of all their users.

Here are just some examples of their past promises and what they’ve done instead.

The web was to bring glorious access to content that was free, current, and reliable. Instead, we have an internet garbage dump and sewer of nonsense. Search, and you don’t know what excrement you will get next, and the stench only increases.

Next, social media was supposed to bring us closer together and connect the globe. Instead, we have never been more polarized and divided. Facebook and Twitter have proven to be misinformation machines and BS spreaders. Even LinkedIn is a BS-marketing platform where, if you can package it and sell it to get clicks, you become an “influencer.” TikTok, YouTube, all are platforms that allow you to spread excrement and get paid for it.

Then there were cell phones, which were supposed to provide us constant access to all of this: the web, social media, etc. We could always be connected. Instead, they offer always-on-demand addiction and isolation. They even make us less social; just watch a family sitting in a restaurant, all engaged with their screens instead of each other. There’s connection, but it is to whatever these tech companies want us connected to, so they can sell ads and make money from our addictions and data.

Now it is AI. It is here, with its promises of taking away all the dirty, distasteful work we don’t like doing. It is going to solve all our problems. It promises to make us even more “efficient and free.” What will its “instead” be? Even today there are hints.

Instead of fulfilling its promises, AI will bring us a more polluted world because of its increased demands for power for server farms. Coal plants that were going to be decommissioned are being kept online, further polluting the environment. There is even talk of restarting a nuclear plant on the East Coast that almost turned a big swath of Pennsylvania into the American version of Chernobyl.

In addition, instead of fulfilling its promises, AI is causing tech companies to consume even more scarce fresh water to cool their massive server farms in many areas of the country, at a time when it is becoming harder and harder to provide safe drinking water to populations.

Finally, instead of fulfilling its promises, AI is adding more garbage and sewage to the internet garbage dump with its growing pile of AI slop. The web will become more and more a place of misinformation and nonsense. One can only imagine what it will be in 15 or 20 years!

As these AI companies keep peddling their products as replacements for human workers, we seem to be getting closer to the machine utopia Kurt Vonnegut describes in his novel Player Piano, where people with no purpose in life live in cities with no future and no hope.

Here’s the lesson: NOTHING BIG TECH INVENTS WAS OR IS INEVITABLE. Our purpose in life is not to use their products or to adapt our lives to their products. We can, with leadership and vision, demand they create products that serve our ends, not just theirs.

Educators who are scrambling to “adapt to AI” because they’ve been sold on its inevitability are misguided. There is no evidence that it has to be inevitable in its current form, or in any form. Choices can be made, and we do not have to surrender just to make these products successful.


Friday, January 30, 2026

EdTech AI Promoters Need a New Argument...They Sound Like a Tired Over-Aired Commercial

 LinkedIn AI promoters need new promo tactics.

The tired, worn-out statement used by AI cheerleaders:


AI isn’t going to replace _____ (insert whatever job title the person is peddling AI to, i.e., teacher, programmer, or school leader); it will replace _____ (insert same from above) who do not use AI.


One would wish that AI cheerleaders would at least come up with some original arguments, instead of using old, tired, and unproven statements like this.


If you want to convince someone of the necessary utility of your pet product at least try to make some new, supported, valid arguments.


AI cheerleaders’ posts on here are like seeing that same boring commercial that runs every commercial break while you watch the news. And, I would add, they are about as convincing. One can only wait impatiently until the tired commercial is over.

Perhaps It's Time for a "Screenless" Charter School Focused on Teaching How to Be Human

Perhaps it is time to establish a “screenless” charter school in NC. Its central mission could be to educate without using EdTech as a crutch for learning, and without technology as a means of controlling teachers, students, and learning.

The school could make a commitment to teaching students to be critical, independent, and responsible citizens, not proper consumers of the latest product on offer from Silicon Valley and Big Tech.

The school could still utilize technology, but that technology would be under the control of teachers and parents, not Ed Leaders and EdTech consultants.

The goal of this screenless charter school would be to create a space where teachers connect with students and their parents without the constant mediation of impersonal devices designed to addict and capture attention. A place for people, not machines.

Students would learn the foundations that would make them critical consumers instead of EdTech fad chasers.


Wednesday, January 28, 2026

Do I Really Care If My Dishwasher Is Silicon Valley Smart?

 

Photo by Author: Implements of Progress

“Smart technologies” are all around us. Walk into an electronics store, and there are gadgets everywhere, from light switches to TVs, that proudly wear the label “smart.” But do we really need “smart” devices? Does my overhead light turning on by itself really provide any value? Am I just so lazy and obsessed with efficiency that flipping a light switch won't do? Everyone from tech companies to car companies hopes you fall for smartness. They have worked overtime to make sure the “smart” label is something that sells. In the end, who's the sucker who has really earned the label "dumb"?

What does “smart” mean? I think the simplest definition of this term is offered by Jathan Sadowski in his book Too Smart: How Digital Capitalism Is Extracting Data, Controlling Our Lives, and Taking Over the World. He writes:

“...’smart’ means a thing is embedded with digital technology for data collection, network connectivity, and enhanced control.”

The question is, do I really care whether my TV collects data, is connected to a network, and provides enhanced control? Perhaps I do for some of these. I like being able to stream to my TV, because it is a much easier way of getting programming than antenna or cable, so network connectivity I care about. The other two, well, not so much.

I really do not want my devices collecting data about me and my usage. Someone out there knowing which shows I watch is not something I value at all. I don’t even like the recommendations that pop up in Netflix and would prefer the old-fashioned way of reading descriptions and then deciding what I want to watch. And the “control”? Whose control, I would ask? If it means I have more control, well, I thought I was already in control. If it means someone else's, that's creepy.

I once purchased a dishwasher that declared itself to be “smart.” Admittedly, that feature seemed an added plus at first. Then I realized I had bought a dishwasher to wash the dishes, not to send my usage data back to some company in the cloud so that it could profit from it. I disabled the smartness, and the dishwasher continues to do what I bought it for in the first place: wash the dishes.

"Smartness" has been peddled by companies as innovative and must-have. When Big Tech throws around its “innovative” products and ideas, we would do well to ask: “For whom is this product really innovative?” “Who is really gaining the most from this ‘smartness’ thing?”

Chances are, the answers to those questions are not "me," or at least not entirely me. The whole industry of Big Tech reminds me of Leroy, the used car salesman. He’d tell you the car could fly if it meant you would buy it. Silicon Valley tech companies have earned that same reputation: slimy salesmanship.