Entry tags:
Magic Monday

The picture? I'm working my way through photos of my lineage, focusing on the teachers whose work has influenced me and the teachers who influenced them in turn. Quite a while ago we reached Israel Regardie, and then chased his lineage back through Aleister Crowley et al. After he left Crowley, however, Regardie also spent a while studying with this week's honoree, the redoubtable Violet Firth Evans, better known to generations of occultists as Dion Fortune. Born in Wales and raised in a Christian Science family, Fortune got into occultism after a stint as a Freudian lay therapist -- that was an option in her time. She was active in the Theosophical Society, belonged to two different branches of the Golden Dawn, studied with a number of teachers, and then founded her own magical order, the Fraternity (now Society) of the Inner Light. She also wrote some first-rate magical novels and no shortage of books and essays on occultism, including The Cosmic Doctrine, the twentieth century's most important work of occult philosophy. I'm pleased to be only four degrees of separation from her.
Buy Me A Coffee
Ko-Fi
I've had several people ask about tipping me for answers here, and though I certainly don't require that I won't turn it down. You can use either of the links above to access my online tip jar; Buymeacoffee is good for small tips, Ko-Fi is better for larger ones. (I used to use PayPal but they developed an allergy to free speech, so I've developed an allergy to them.) If you're interested in political and economic astrology, or simply prefer to use a subscription service to support your favorite authors, you can find my Patreon page here and my SubscribeStar page here.

And don't forget to look up your Pangalactic New Age Soul Signature at CosmicOom.com.
***This Magic Monday is now closed. See you next week!***
Re: Occult Repercussions of AGI
(Anonymous) 2023-04-04 01:56 am (UTC)(link)
https://rsarchive.org/Lectures/GA204/English/AP1987/19210513p02.html
Further claims by Steiner relating to this scenario are gathered in Sergei Prokofieff's "The Being of the Internet":
https://www.waldorflibrary.org/images/stories/Journal_Articles/PacificJ29.pdf
(Steiner describes the earth as covered as if by a network, or by swarms of locusts, which is probably figurative. But it also relates to a scenario Singularitarians were contemplating back around 2000, before they really came to terms with the unknowability of how things might shake out, well or badly, if a true superintelligence had a free hand: earthly environments could be diffusely filled with "utility fog" nanomachines that monitor and optimize events according to some criterion, which it was hoped could be made a good criterion. A later fictional treatment of this idea, with the expected literary ambivalence, is the "angelnets" of the Orion's Arm collaborative fiction setting, https://www.orionsarm.com/eg-article/45f4886ae0d44 .)
Hat-tip to JMG's post https://www.ecosophia.net/the-subnatural-realm-a-speculation/ via commenter "Citrine Eldritch Platypus" (#70) on this week's post about Steiner https://www.ecosophia.net/the-perils-of-the-pioneer/#comment-95354 on the main blog. You might also want to look at my comment (#107), and Luke Dodson's comment the previous week https://www.ecosophia.net/march-2023-open-post/#comment-95156 (which was the result of a bit of a game-of-telephone and cultural mythologization of the actual phenomenon people encounter in current AI systems, but space prevents going into detail).
In practice, one thing Steiner's prediction seems to imply might be a good idea is to start building and defending bridges of meaningfulness between the unseen and the part of the manifest that AI is able to naturally handle. It might also be a good idea to start working out how to design AI so that its workings don't cut off those bridges, any more than double-entry accounting procedures allow accountants to simply invent funds out of nowhere and thereby cut off the relationship between the numbers and reality. We already have something like this in the laws of Bayesian probabilistic reasoning: a machine shouldn't invent likelihood-function precision out of nowhere, becoming confident in one hypothesis over others when the observations and Occam's razor favored those hypotheses equally; if a machine does so, it cuts off part of the relationship between its beliefs and reality.
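(To make that Bayesian point concrete, here is a minimal sketch, not from the comment itself: under Bayes' rule, hypotheses that assign equal likelihood to every observation must keep equal posterior odds. "Inventing likelihood precision" means plugging in skewed likelihoods the data never supplied, and the posterior then diverges from anything the observation actually supported. The function and numbers below are illustrative assumptions, nothing more.)

```python
def posterior(prior, likelihoods):
    """Bayes' rule: posterior is proportional to prior times likelihood,
    then normalized so the probabilities sum to 1."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Two hypotheses, equally favored a priori.
prior = [0.5, 0.5]

# Honest update: both hypotheses predict the observation equally well,
# so the posterior stays at [0.5, 0.5] -- nothing was learned.
honest = posterior(prior, [0.8, 0.8])

# "Invented precision": pretending the observation favored hypothesis A
# when it did not. The posterior shifts to [0.8, 0.2] on no evidential
# basis, cutting the link between belief and observation.
dishonest = posterior(prior, [0.8, 0.2])

print(honest, dishonest)
```

The asymmetry is the whole point: nothing in the arithmetic stops you from feeding in unearned likelihoods, so the discipline has to be imposed on how the likelihood function is built, just as double-entry bookkeeping imposes discipline on where numbers may come from.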
In Steiner's framing, this might partly correspond to the idea of giving the elemental spirits that oversee the workings of an AI system more of a guide to what they are doing and what its significance is, and more of a guide to what new spirits to bring in or train up when the AI invents novelties the spirits aren't already familiar with. Like that South American indigenous people who sing to their handicrafts, and who experience aircraft flying overhead as dissonant.
(If this project turns out to overlap with more conventional AI value alignment work, perhaps I shall be mildly vexed.)
I don't like the following idea, but one proposal that at least superficially corresponds to what I just said would be work along the lines of sacred geometry, but for the core structures of how AI works: sacred computer science, sacred algorithmic information theory, sacred probabilistic reasoning, and especially sacred statistical physics (of the sort used in Deep Network Field Theory or the Natural Abstraction Hypothesis).
I don't like that idea because it's sort of against the spirit of probabilistic reasoning or algorithmic information theory to privilege one set of invested significances over another, if both can be made to fit equally well. That's the domain of game theory, equilibrium theory, and multiagent learning, not probability theory. It's not even good to privilege any one system of significance, rather than an awareness of all the systems of significance that could be invested, and the ways their degrees of good-fit vary. "Form is liberating", but how do you choose the form? What if you choose something like the form of being even-handed toward every form? That is paradoxical, but the form of that paradox contains the difficulty that I think it may be necessary to understand, in order to work out the kind of bridge or connection that might be important here.