Psalm Recommendations from ChatGPT
An engineer's hesitation tugs on many threads within a theology of computing.
A friend reached out to me recently. He was aware of what I'm doing here at Full Stack Theology - or rather, he's been aware of the many events that led up to FST over the past few years - and wanted my thoughts. He was curious about feeding ChatGPT the reflections and movements of the soul he writes down during the day and having it recommend a Psalm to pray with at bedtime. His question:
Setting aside whether the AI could actually do that, and assuming it did a perfect job, is that a good idea? Are there any moral hazards, or is it perfectly fine and just feels weird?
I reflected for a short time, and made a few requests of my own for borrowed prudence.1 The emerging consensus was that ChatGPT seemed fine in this use case - its use was not intrinsically bad, given certain additional cautions - but there were other risks around the implied goals of prayer and the motivation for using ChatGPT in the first place. I'm writing about this question because it's interesting in itself and also because it tugs on so many threads relating to technology use more generally.
In broad strokes, I judged that ChatGPT was being used like a more flexible search engine to find a certain psalm. I have seen guides with lines saying "When you feel X, read Psalm Y." That seems, on the whole, good. The benefit ChatGPT provides is better results, thanks to its better "understanding of context"2 when drawing out the salient point from a larger paragraph.
In my reply to this friend, I followed this initial judgment with several caveats. I'll discuss the caveats first, but I couldn't help but wonder why so often the response to a new technology is a cautious and measured "It's good, but" - that discussion will follow. The caveats are in order from most visibly practical to most visibly theological.
Hallucinations, another word for "the AI was factually incorrect." ChatGPT's responses are generated through a statistical process. Its expressiveness comes at the risk of "hallucinations," i.e., generating text in a way that does not match reality. To ensure the integrity of the process, it would be important to refer to a trusted source for what the text of such-and-such Psalm actually is. My initial tests (like, three of them) show that ChatGPT is pretty good at reproducing the text of the Psalms3, likely because their text is widely available and faithfully reproduced.
The question I find more interesting is how it could be surprising that LLMs would generate factually incorrect information. As anyone who regularly programs computers will tell you, computers are fragile. Sometimes, it is a wonder to me that they work at all. I am reminded of a story Babbage told about his calculating engine: "On two occasions I have been asked, — 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' … I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."4 The comparison to Babbage is not perfect - the "right answer" is probably in the training data somewhere - but the idea that guarantees could be made for the correctness of a system without a mathematical formalization of what correctness is just seems so foreign to me. Perhaps this has something to do with how we think of rationality and the perception of computers as "being really smart"?
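In practice, the "trusted source" check above can be as simple as comparing the model's output against a text you already trust, after normalizing away harmless formatting differences. A minimal sketch (the function name and the choice of normalization are my own, not anything from the original discussion):

```python
def matches_trusted_text(model_output: str, trusted_text: str) -> bool:
    """Return True if the two passages agree once case and whitespace
    are normalized, so line breaks and spacing don't count as errors."""
    def normalize(s: str) -> str:
        return " ".join(s.lower().split())
    return normalize(model_output) == normalize(trusted_text)

# Usage: paste ChatGPT's quotation of the psalm next to the text from
# a Bible you trust, and flag any mismatch for a closer look.
```

A word-for-word comparison like this is deliberately strict; a real check might also ignore punctuation or compare verse by verse, but the point is only that the verification step is cheap relative to the risk of praying with an invented text.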
Distraction. ChatGPT does many things. One person chimed in that he would be tempted to continue to use ChatGPT and avoid prayer. The fix is an easy one: get the Psalm, find the text of that Psalm, and then close out so you don't get distracted.
The root of distraction is that computers seem to be designed to be very abstract and to fit into a multitude of uses. This seems critical to the concept of computing and its explosion into society. Picture an old phone taped to an old camera, a map, and a newspaper - and to use one you need to bring all the others along. Is it a grotesque image because it is large and difficult to carry, or because it's used to do so many things at once? To continue further in this line of reasoning is to tug on questions like "what is abstraction?" and "what is mediation?".
Artifacts set laws. ChatGPT has been trained on a massive set of texts with varying kinds of references to Christianity. I would guess the reflections it produces on religion would be towards an inoffensive or self-help focused Christianity, one that (by definition) looks good to a broad set of the writing of people represented in its training data. My warning is to be discerning of the motions of your heart when you read: are they towards Christ or away from Christ? If it feels fiery, is it disturbing your peace or burning away sin? If it feels cooling, is it soothing a worried soul in the embrace of Christ or is it appeasing the pain of conversion? Christianity is not set by democracy.
The laws - the computer code - set by the designers of ChatGPT cannot be argued with or reasoned with; they can only be used to subvert themselves. ChatGPT will pull out themes, understand words, and give recommendations - more generally, will operate - as it has been programmed to operate. It is programmed to predict the most likely next word, according to some very clever definition of "likely" and some extensive computational work to produce a similarly extensive dataset of text. The laws of ChatGPT have decreed it to be valuable to be common, and not valuable to be rare. Play within these laws, and ChatGPT can help you. There is much more to be said about programming being a kind of law (with credit to Lawrence Lessig), but a Thomistic understanding of law gives a very different flavor to that statement.
Relevance and the technological gaze. I also added the caveat that "relevance" shouldn't be everything. The focus on relevance is a temptation to viewing prayer as an experience to be designed or optimized, as something that can be taken to its perfection by understanding it and modifying it against some understandable end. Think if any other relationship had such a framing: is that a meaningful relationship?5 Of course, it is good to say "I want to enter into the words of the Psalms that have been prayed for thousands of years, and I think it would be beneficial to do so through the reflections I have had in my day." The temptation, however, is inside what is meant by "beneficial," and in assuming a Psalm you deem relevant is always better than one you deem irrelevant. My recommendation was somewhat contrived and calculated: include some "random noise" in your Psalm selection so you're not habituated to prayer being "about you." Others had better ideas, like building off of the psalm order from The Bible In a Year6 or selecting one chosen by the Liturgy of the Hours. Someone else shared a link7 to a version of the Benedictine cycle in which you pray all the Psalms over the course of a week, and he suggested just following that until you have all the psalms memorized. All good options.
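Since I admitted the "random noise" recommendation was contrived and calculated, it is only fair to show how little calculation it takes. A minimal sketch, assuming ChatGPT has already suggested a psalm number (the function name and the `noise` parameter are my own inventions for illustration):

```python
import random

def pick_psalm(recommended, noise=0.3, seed=None):
    """Usually return the 'relevant' psalm ChatGPT recommended, but with
    probability `noise` pick any of the 150 Psalms uniformly at random,
    so prayer isn't always steered by relevance to your own day."""
    rng = random.Random(seed)
    if rng.random() < noise:
        return rng.randint(1, 150)
    return recommended
```

With `noise=0.3`, roughly one night in three you would pray a psalm that has nothing to do with your journal entry, which is exactly the point.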
Technology plays into this by habituating us to a certain way of thinking. I'm lifting from several people here, including Marc Barnes, Michael Hanby, Joseph Ratzinger, and Hans Jonas: technology, in its root, is an -ology, a field of study, a way of knowing. The technological gaze is a way of knowing that manifests itself in being able to modify and change and reorient the material world to one's will. It is, in other words, the mentality of engineering. One cannot engineer a relationship because the subject of study is not a material thing to completely understand and bend to one's will. The more often we take a technological approach, the more often it will spill over to other aspects of life - like prayer.
Remedial use and mediation as relationship. With all of that aside, there is another consideration. Growth in prayer would imply becoming familiar with these Psalms as time goes on. I presume you would start to see patterns. In fact, asking ChatGPT for recommendations is a step towards that - it is a far easier way to enter the Psalms than reading them all and categorizing them ahead of time. But as time goes on, your relationship to the Psalms will change. Perhaps after a month you start to know the Psalms you go to often, and a few times a week you don't need ChatGPT to direct you. And after a year you can make your own clusters and you don't need a hyper-powered search engine.
The role of mediation in technology is a curious one. As we recently discussed, mediation - including mediation to others - is a way we can be Christlike. It can also be corrupted, turning mediation away from love and participation toward use and enslavement. Consider that we rent our skill of navigation from Google through our smartphones and Google Maps.8 There is a relationship we are in, but what kind of relationship is it? Am I working with Google9 for our common good or is our relationship simply mutual use?10
If you'd like to pursue the common good of a peaceful society built upon a proper understanding of computing, consider subscribing to the email list and sharing this post. Thanks!
Don't scale this.11 Working with copy-and-pasted text from typed-out journals is the right level to work at. Making it more automated makes it more rigid, and it also makes it more permanent. This is something new - something that may not last a long time - and so it needs to be flexible.
Beyond the consideration of design flexibility, scaling this project would assert that the person making it knows this tool would be beneficial to the people who use it. At the moment, even my friend himself doesn't know whether it's in fact good for him. How much more of a question would it be if you were to make this tool for people you didn't even know? This ties back into two other themes: that using a technology is a kind of relationship, and making a technology is a kind of lawgiving.
I don't think this is a complete list, but it's the list I have now. I gave my friend the recommendations. But as I was finalizing my reply, I wondered why I formulated my reply in that way: "It's good, but…". Why does it seem so negative? This deserves a larger treatment, but in short, it's because the social norms around our use often provide guardrails - and without the norms (such as in the case of a new technology), we have no guardrails, and it takes much more thought to use a tool justly. This is how, for example, my Bible study group transitioned from using physical Bibles to reading passages on one's phone. It took time, and it took some heightened awareness and some effort in avoiding distractions on the phone, but largely I am far more focused when using my phone during small group than in most situations. Over time, it is not as if our technologies become safer intrinsically (though that can happen too), and it is not as if we were originally wrong about the dangers a technology could bring (though that is true to some degree). It is that we are smart and oriented towards the good and we learn what errors to anticipate and how to avoid them.
Substack has features that enable commenting, so some questions to ponder (or write about in the comments):
Would you add some cautions? Would you remove some?
Which of these points do you want to hear more about?
Are there computing technologies (or technologies in general) that faithful people in your life have embraced wholeheartedly, or is the response often a cautious and measured "Yes, but…"?
Do you share my friend's vague discomfort with using ChatGPT for something like this? How might you put it into words?
Our general discussion channels in the Full Stack Theology discord (invite link) are named the "Community Prudence Pooling Layers" and I love it.
It is more proper to say that it has a much broader set of training data and a more expressive set of inputs and outputs.
However, curiously, they started appearing in the web interface as VB.NET code blocks. VB is the language of the gods, truly.
Passages from the Life of a Philosopher (1864), ch. 5 "Difference Engine No. 1"
Not incidentally, the epitome of applying engineering thinking to relationships is the bizarro horror-world of AI "girlfriends."
https://media.ascensionpress.com/category/ascension-podcasts/bibleinayear/
Marc Barnes, "Wendell Berry Against your GPS" https://archive.ph/FPR5q. Ironically, this very article is now behind a paywall (hence the archive link) - reminding me that if the web is my external memory, that is now something I have to rent, too.
It is helpful in the flow of this paragraph to refer to Google as a monolith, but it is more accurate to break down the various roles people have at Google: the stock owners, the directors, the managers, the programmers, etc. Note that the very fact there is such a diffuse responsibility doesn't bode well for a positive decision on whether that relationship is a good one.
Answering this question is hard, and the fact it's a hard question should indicate that we are way in over our heads on the moral weight of using technologies.
If you are tempted to scale this, join the Discord and we'll find a better thing to build and scale.