
Silicon Valley, and delusions STEMing from AI


Let's take an analogy.

"Apple" in English and in Arabic is a matter of different words. But both are talking about the same thing (an apple).

So to have a conversation, we assume there's a benefit to having an English-Arabic translator, to talk better about apples. The apple itself doesn't change - only the understanding within a domain does.

That is, both parties mutually realize that the subject of discussion is the same thing (apple).

Given that people are not computers, we don't need that level of translation for computers - we start from truth (the apple).

The roots of computer science are in computation, logic, semantics, and electrical execution. That's why people from Arabia, India, Russia and America can talk about OOP concepts for apples, though some of us have to learn more in terms of expression, than others.

Logic doesn't have a language, though it's often expressed in, or as one.

So when I hear use cases for "AI" translating between domains, or between different programming languages, I am concerned that engineers are missing the point of engineering.

An AI that translates A (a SaaS commerce C# application) into B (statistical analysis of weather, in Python) is interesting to think about - and that's all it is.

Occam's razor says: using computer science, make 2 websites, and expose APIs if needed. If someone really wants to code a web app that deals with 2 different websites - let them code it, and pay for 2 APIs.

Now, you might say that it's innovative, creative, or interesting to still have an AI that can do such translations.

I hear you, I'm a writer! I love innovation, creativity, and weird, interesting things.

The problem is that taking the longer route to a destination, however scenic, is the longer route to a destination.

And creating an AI to listen, in English, to what will be compiled to binary and executed in binary, is longer than coding.

It's not the engineer's way.

The engineer takes the shortest route - theory prizes elegance, brevity, clarity, order and method.

For everything else, we have artists. šŸ˜‰

Taking the scenic route is the Artist's way, because:

  1. We can think longer. Artists like art - it's fulfilling to appreciate Mother Nature.
  2. We might see something new. Artists learn from Mother Nature, at the level of praying mantis first principles.
  3. Doing something that's useless helps us understand the merit of taking shorter routes, which counts as growth for us, especially since we have no quantification of 'beauty' or 'elegance'. Our path is both the journey and the destination, especially when we're lacking in teachers of the way.

Engineers can be criticized for reinventing the wheel - we artists are happy to forget, just to rediscover the essence of Kung Fu.

It's like how we encourage kids who disagree with us to keep writing 1+1+1+1... until they self-realize that it's just easier to write 100 - we are teaching a semantic system (base 10) and the law of accumulation.

They will now write 100, instead of 100 1s.

At work, we want those (now grown-up) kids to use calculators. The abstraction is there for a reason - it takes too much paper (and too many kids) to write things out by hand.

A calculator is emergent relative to an educated kid. We don't want them to write 1s by hand.
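The step from tallying 1s to positional notation can be sketched in a few lines (a toy illustration of the point above; the variable names are mine, not anything standard):

```python
# Unary: the number one hundred written as a hundred 1s, summed by hand.
unary = [1] * 100
total = sum(unary)

# Positional (base 10): the same value written with three digits, "100".
positional = int("100")

# Both name the same quantity; one representation is just far shorter.
assert total == positional == 100
print(len(str(total)), "digits instead of", len(unary), "tally marks")
```

The value never changes - only the notation does, which is exactly why the shorter representation (and the calculator that automates it) wins.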

So wanting an English-Arabic translator is a very different thing from wanting an all-in-one, natural-language-processing, 100%-accurate AI.

The former is a need from wanting to do business with different cultures; the latter is an elaborate orchestration to solve a solved problem that most people don't care about.

I wouldn't point this out if AI - its creation, study, research, and pursuit - were seen as art. As an artist, I don't judge.

I am pointing this out as an engineer, because:

"Our analysis suggests that the recent investments in AI-related categories have contributed significantly to the real GDP growth in 2025."


As an artist who believes in creating STEM value based on principles of computation, please realize that creating engineering in the name of art will never work, especially in capitalistic America.

Ask any artist near you.

The STEM industry is helping demonstrate the value of art with AI slop - people are realizing the difference between intentional, soul-driven, authentic work and random guesswork based on the same.

Instead of realizing you're all at the wrong party, and being happy drinking air, you are investing even more into products that don't reflect computation basics.

It's not that I have a problem with disruptive hype - I am just looking out for the STEM people.

This is my argument, here:

Betting against AI: my concerns and questions | Suman J. posted on the topic | LinkedIn
Hi, I have been vocally betting against AI, and some of you might have seen my comments, criticisms, or jokes when I've posted publicly. In that context, what this person said resonates with me, but more importantly, I like that this video has a lot of data, and has some relevant lessons for most IT engineers, especially below mid-level. https://lnkd.in/ddr64CiU

Here are things I've said before, in defense of betting against AI.

1. I believe that the mind-body problem is a hard problem, and I think the Chinese Room argument reveals that being, and mimicking, can give the same results, but that doesn't necessarily mean sentience exists. I don't see how ChatGPT is intelligent, let alone sentient.
2. I believe a lot of resources went into training ChatGPT. I haven't come across a reason to believe that AI is a product of that training, as opposed to more than a few genuine innovations. As stated in my Instagram post - can we train monkeys to write Shakespeare? If your answer was no, how about this experiment with generational selectiveness, human input, and a huge budget?
3. Pokemon Yellow has AI, and so do many fun video games. Even Scythe, a board game, has an artificial system in place. I am concerned that, as far as IT is concerned, engineers are seeing something fancy, and out of a genuine love for technology are conflating their love of something with a fair appraisal of a product. What is different about ChatGPT, compared to World of Warcraft at the speed of Google's servers?
4. I do think it's cool that ChatGPT can write poetry. I don't think it's competition to me, or a good product for any other person interested in buying common art for their loved ones. I have seen publishing companies ban ChatGPT-generated stories.
5. I am concerned that AI can be a legal bottleneck, which can hurt both investors and consumers in the long run. I was under the impression that software companies don't like open source software with tricky permissive clauses, due to losing business advantage and increased legal risk.
6. If everyone can use AI, then AI is a game changer, not an advantage. This is because something anyone can use becomes ubiquitous.
7. I am concerned that the word AI now connotes 'scaled, speedy and robust computation mimicking intelligence' and not science-fiction AI, which I thought was the point of the hype. When Google talks about Project Relate, are we talking about insanely speedy computation, or "AI"?* https://lnkd.in/dkfnnRvy

Fact checking / criticism / sources / counter-arguments are very welcome! My background is mostly in software development, as in 'SDLC' and not 'machine learning', though I am interested in the idea of reinforcement learning, and other CS topics. Thanks! #ArtificialIntelligence

And this is Dijkstra's argument - not from the books of Cracking the Coding Interview, or DSA for Google leetcode prep - but an argument from his archives: "On the foolishness of 'natural language programming'".