Where the danger is, there grows the saving power also! Gestell, as positionality, as enframing, renders us mere points in a graph, mere objects in a set—literally Human Resources. I was very excited to see Wendell Berry appear in here; I’ve always thought that his work called me towards Gelassenheit.
Accurate. If we look at the etymological root of the word “technology”, it means just that: the “systematic treatment of an art, craft, or technique”. At its core, “technology” is not restricted to software; it is, rather, the way one approaches one’s art, craft, or doing of a thing. When we get a clearer picture of what it is we do (as originating from who we are), then we can more clearly put these digital and information technologies (as well as other technologies) in their rightful place as tools which serve greater, likely evolving, ends.
Really enjoyed the Ellul and McLuhan references here. Their work captures how optimization gradually becomes the organizing logic of entire systems. One thing that seems increasingly visible today is how that process can start to erode the contextual structures that give things their meaning. When those structures weaken, people end up having to constantly reinterpret what they are seeing or experiencing. This may explain why so many digital environments feel strangely disorienting lately.
100%. I also think this (“eroding contextual structures”) is why we’ve started to refer to everything as a ‘vibe’. Those meaning structures are gone, so we rely on assuming that everyone is in the same moment, feeling/experiencing the same vibrations.
Reminds me somewhat of how oral cultures didn’t have the objective concept of meaning that we have (or seek) today. This is explored by Walker Percy in Lost in the Cosmos. I’ve also encountered similar themes in Owen Barfield’s Saving the Appearances, where he describes the pre-literate concept of “meaning”:
Compared with us, they felt themselves and the objects around them and the words that expressed those objects, immersed together in something like a clear lake of… ‘meaning’.
You might enjoy this piece I wrote earlier this year which picked up on some similar themes and thinkers… (especially your note on McLuhan’s ideas about “how optimization gradually becomes the organizing logic of entire systems”). https://figureandground.substack.com/p/found-in-the-cosmos
"Software—the industry designed to optimize everything else—is now itself being optimized away. Automation has turned inward. Technique has begun to consume its own infrastructure."
Yet there is a distinction that many miss - although much time in software development is spent making things work, the most important effort is spent determining "what" to do - determining the behavior in detail. When complex human behaviors and critical elements come into play, that work is not trivial. That analysis, and confirming the validity of what was conceived and produced, is the true heart of the discipline currently known as software engineering. All else is friction encountered along the way.
Love this distinction! Our tools still need an intent, a direction. I’ve always said that the real question of software is what we should do more than what we can do.
This also clarifies the purpose of technology to be path-finding in an obstacle-filled space. AI becomes simply another lever and friction reducer.
For well-known problems, AI can rapidly select a known and stable solution and tweak it. For new and unknown problems, unless AI becomes an oracle, mind-reader, and prophet, there will be a role for human explorers, dreamers, analysts, and directors, albeit with more powerful, though reality-limited tools.
In Barry Longyear's sci-fi novel "The Tomorrow Testament", there is a discipline called Talma, used by all disciplines within the alien Drac race to achieve goals -
---
..."What is Talma. Mitzak?"
"It took me months to understand, Nicole."
"Try."
"Nicole, you are in a place. There is a place that you want to be. Your task is to get from the first to the second."
"How?"
"You must know where you are; you must know where you want to go; you must know the limits on the paths between the two.... "
...
"Situation assessment, goal formulation, and path construction and evaluation are not systemized disciplines among humans."
---
This is the abstraction of all constructive and problem-solving disciplines, including software engineering. If we step back and out from the problems we solve, we begin to see the regular shapes of structured thinking. There are echoes in architecture and software - design patterns - though those are child's scribbles compared to the possibilities.
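Mitzak's three-part structure - know where you are, know where you want to be, know the limits on the paths between the two - maps neatly onto graph search. A minimal sketch in Python (the `talma_path` name and the toy room graph are my own illustration, not anything from the novel):

```python
from collections import deque

def talma_path(start, goal, neighbors, allowed):
    """Breadth-first search from a current state to a desired one.

    start     -- where you are
    goal      -- where you want to be
    neighbors -- maps a state to the states reachable in one step
    allowed   -- predicate encoding the limits on the paths between the two
    """
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        here = path[-1]
        if here == goal:
            return path
        for nxt in neighbors.get(here, []):
            if nxt not in seen and allowed(here, nxt):
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # no path exists within the stated limits

# A toy state graph: rooms connected by doors, one door locked.
rooms = {"hall": ["kitchen", "study"], "kitchen": ["garden"], "study": ["garden"]}
locked = {("kitchen", "garden")}
print(talma_path("hall", "garden", rooms, lambda a, b: (a, b) not in locked))
# → ['hall', 'study', 'garden']
```

Situation assessment is `start`, goal formulation is `goal`, and path construction and evaluation happen in the loop - which is exactly the sense in which this little algorithm and a whole engineering project have the same shape.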
This is an interesting read. Thank you for your thoughts!
I’ve often thought that modern generative AI is a clear example of Heidegger’s concept of Enframing, whereby the feedback loop between man and technology gradually converges to a single trajectory. If we completely give up our input to autonomous systems, then we cease to truly generate anything novel. With AI, the words of the Teacher ring true: “There is nothing new under the sun.”
I love eating itself
You sound like my son... you’re more philosophical and he’s more technical.
Agreed with most of it, except the finale.
A bit too human-centric.
Humans are beautiful and we create things. But it’s arrogant to see ourselves as a pinnacle of everything.
If I can draw a parallel here: humans are to technology as the first organisms were to ion channel proteins.
Even in basic bacteria there are ion channels maintained by certain proteins, because electric signals are faster than chemical ones. Cells, and then animals, learned to do many things with them under the strict rules of the evolutionary game. Then some cells became specialized in bioelectrical signals, and we developed neural cells, then nervous systems, then complex brains. The next big jump is evidently optimizing neural principles by extracting them from slow and feeble meat altogether.
As a human, I desire to prosper and have technology serve me. But there is also value in letting it grow on its own, as the next fractal layer of the low-entropy peak that we inhabit.
So, yes, go Team Humans but not *just* humans.