A Camera, Not an Engine
Source: https://studio.ribbonfarm.com/p/a-camera-not-an-engine
Highlights
modern AI doesn’t seem like an invention at all, let alone one embodied by the behavior of a “machine.” The word we’re looking for is discovery.
We’re all stochastic parrots attached to monkeys on typewriters all the way down.
a camera that tricks us into thinking it’s an engine that “generates” rather than “sees” things.
discovery-like qualities
Now, it’s hard to look around and spot things that don’t have fractal structure or computability aspects. We don’t have similar theorization of the phenomenon being uncovered by modern AI yet, but when we do, I think we’ll similarly find an unexpectedly sharp-edged property of reality that we will start seeing everywhere.
When we discover scary new things, we try to think of them as inventions
traumatizing to all humans who think they’re better than parrots.
“Rocks that can think” is not something you’d have expected a few hundred years ago.
With AI, we’re not constructing artificial intelligences so much as we’re discovering the nature of natural intelligence (the only variety we can “see” unaided), and learning to see it in other piles of data besides the accumulated output of our own brains.
it is actually kinda exciting that “intelligence” appears to be a latent property of data, which can be transformed into an explicit property, rather than an attribute of a processing technology.
You can think of turning that process towards datasets that are radically different from the corpus of human language and imagery we’ve been focused on so far, and discovering amazing things.
Once you see intelligence as something embodied by a particular pile of data rather than a particular kind of processing, a very powerful sort of decentering of anthropocentric conceits happens.
You are no more than the sum of data you’ve experienced.
And once you get over the trauma of that realization, you realize something even bigger: any pile of data that has some coherence in its source can be turned into an intelligence that you can relate to, broadening the possibilities of your own existence.
any sufficiently large data set can apparently be digested into a characteristically unique lump of “intelligence.”
perhaps an “intelligence” is merely an optimal (in some sense) hash of a pile of data that locates it in “dataset-space.”
the idea of replacing “bytes” or “compressed bytes” of data or the “Kolmogorov complexity” of an algorithm/input-class with a measure of data based on the kind of AI that can be constructed from it…? That seems very tantalizing.
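The "measure data by what can be built from it" idea above has a well-known computable cousin: normalized compression distance, which approximates Kolmogorov complexity with an ordinary compressor to locate datasets relative to one another. This sketch (my illustration, not from the essay; the texts and thresholds are made up) shows how shared structure alone can place two piles of data near or far apart in "dataset-space":

```python
import zlib

def c(data: bytes) -> int:
    """Compressed size as a crude, computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two piles of data.

    If x and y share structure, compressing them together costs little more
    than compressing the larger one alone, so the distance is near 0;
    unrelated data yields a distance near 1.
    """
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical toy datasets: two similar texts, and structureless byte noise.
english = b"the cat sat on the mat and the dog sat on the log " * 20
similar = b"the cat sat on the mat and the fox sat on the box " * 20
noise = bytes(range(256)) * 4

# The similar texts sit closer together in dataset-space than text and noise.
print(ncd(english, similar) < ncd(english, noise))
```

The measure is defined entirely by the data, not by any particular processing architecture, which is what makes it a natural (if much weaker) analogue of the AI-based measure the highlight speculates about.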
When you hear a phrase like a “7B parameter model,” 7 billion parameters is a measure of the dataset, not the model.
Instruments of discovery measure more than they are measured.
there are a number of ways you can measure a telescope (mirror diameter or focal length for example), but the interesting measuring going on is what the telescope is doing to what it’s turned towards.
it looks like they’re “doing” and “building” like you would with an invention, rather than exploring and mapping like you would with a discovery.
“Intelligence” I think is a word like “phlogiston” or “ether” that reflects our conceptual inability to climb down the currently illegible abstraction hierarchy ladder that’s clearly buried here.
narratives tell archetypes how to evolve, archetypes tell narratives how to curve
Spacetime tells matter how to move; matter tells spacetime how to curve
stories make minds and minds make stories
Information tells matter how to connect across spacetime. Matter tells information how to persist through spacetime.
we’re playing with the most powerful camera ever built, not an engine
I’m starting to think that the importance of being human lies in having a particular sort of “caring field” in space/time/matter.
We care therefore we are.