
ChatGPT and the rise of technology

Should Christians fear artificial intelligence?

When I was a teenager, I purchased an early personal computer called a Timex Sinclair ZX-81 with money I earned from my paper route. I was amazed at how computer programs enabled me to build “castles in the air . . . creating by exertion of the imagination.” What started as a hobby later developed into a vocation as I pursued a Ph.D. in the field of robotics and computer vision. At the time (nearly 20 years ago), the field of AI was climbing out of an “AI winter,” and I found myself attracted to newer machine learning methods that were being used for image recognition. I recall being astounded at the profound elegance of “training” a computer with a set of example images and then observing how well it could identify new images that were not part of the original training set. Even those early machine-learning techniques seemed magical.
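
For readers curious what "training" a computer with example images can look like in code, here is a minimal, purely illustrative sketch using the scikit-learn library and its bundled handwritten-digit images (not anything from the research described above): a classifier is fit on a set of example images and then scored on images that were held out from training.

```python
# Minimal illustration of "training" an image classifier and then testing it
# on images that were not part of the training set.
# Assumes the scikit-learn library and its bundled handwritten-digits dataset.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()  # 8x8 grayscale images of handwritten digits

# Hold out a quarter of the images so they play no part in training.
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0
)

classifier = SVC(gamma=0.001)     # a support-vector-machine classifier
classifier.fit(X_train, y_train)  # "train" on the example images

accuracy = classifier.score(X_test, y_test)  # evaluate on the unseen images
print(f"Accuracy on images outside the training set: {accuracy:.1%}")
```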

Technology amplifies both good and harm

Two things became apparent to me in the following years. First, the technology amplified opportunities to do good as well as to do harm. Already as a grad student, I observed many research efforts being directed towards face recognition—an intriguing and challenging technical problem, but one with clear potential for misuse and a myriad of privacy issues. I consciously chose a research direction that I felt was a more redemptive application of machine learning, such as automating the visual sorting of recyclable goods. I later recognized this approach as confirming the theological notion of structure and direction: the possibility of technology is rooted in the structure of God’s good creation, while direction refers to how we unfold technology in either obedience or disobedience to God.

Technology is advancing faster than we thought

The second thing that became apparent to me was that AI was developing faster than many of us would have predicted. As an engineering grad student some 20 years ago, I would have scoffed at the notion of an autonomous car; the computer vision challenges were simply too great in unstructured and unpredictable environments. Within the decade, Google successfully demonstrated a self-driving car. In the words of Yogi Berra, “It’s tough to make predictions, especially about the future”—even for those who are developing technology.

What do we make of ChatGPT?

One of the latest developments to catch widespread attention has been ChatGPT, a chatbot developed by OpenAI. Unlike the modest number of example images I used for training in my graduate work, ChatGPT was trained on 570 gigabytes of example documents. ChatGPT can interact with a user by responding to questions and replying to prompts (you can try it out here). While some of the responses are amusing or simply wrong, the results are frequently astonishing, providing surprisingly coherent and cogent responses to a wide variety of prompts, including composing poems, stories, sermons, and essays. The results have been so remarkable that they have led to speculation that the college essay is dead and to questions about the future of many skilled jobs. Indeed, computer programmers may be programming themselves out of a job: a system called Copilot takes input prompts and generates computer code, leading some to speculate about the end of programming.
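
To make the idea of "prompting" concrete, here is a minimal, illustrative sketch of how a program might send a prompt to a chatbot and print its reply. It assumes the pre-1.0 openai Python package and an API key supplied through an environment variable; the prompt itself is just an example.

```python
# Minimal sketch of prompting a chatbot programmatically.
# Assumes the pre-1.0 "openai" Python package and an API key stored in the
# OPENAI_API_KEY environment variable; the prompt is purely illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Send a single user prompt to the chat model and collect the response.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Compose a short poem about learning to program."}
    ],
)

print(response["choices"][0]["message"]["content"])  # the generated reply
```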

Will ChatGPT encourage cheating in schools and universities?

While rumors of the demise of the essay and of programming are likely exaggerated, there will be definite impacts on higher education. How should we modify our writing assignments and academic integrity policies when students have access to AI-generated text? Could we use AI-generated text for critical assessment exercises that might help students write (and code) better? Although these questions will require wider faculty discussions, what follows are three general guidelines as we discern a Christian response to AI.

First, we need to avoid the pitfalls of viewing technology with either too much optimism or undue pessimism. We must reject a reductionistic worldview that sees all problems as reducible to technical problems that can be solved by technology. A trust in technology, sometimes referred to as technicism, is essentially a form of idolatry. On the other hand, we should not view technological developments with despair, as if they will inevitably threaten humanity. AI is part of the latent potential in creation, and we are called to responsibly unfold its possibilities. Theologian Al Wolters writes that “the Bible is unique in its uncompromising rejection of all attempts . . . to identify part of creation as either the villain or the savior.”

Are computers sentient?

Second, rather than focusing on what AI can do, we need to start with an ontological question: how are people distinct from machines? A common tendency is to anthropomorphize our machines, thereby elevating the status of our machines and, in doing so, reducing the distinctiveness of human beings. Already in the 1960s, the early AI pioneer Joseph Weizenbaum explored the notion of automating psychotherapy with a chatbot named ELIZA. Weizenbaum concluded, “There are limits to what computers ought to be put to do.”
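
To appreciate how modest ELIZA's machinery actually was, here is a toy sketch in the same spirit (not Weizenbaum's original program): a few pattern-matching rules and pronoun reflections, with no understanding behind them.

```python
# A toy sketch in the spirit of ELIZA (not Weizenbaum's original program):
# simple pattern matching and pronoun reflection, with no understanding at all.
import re

REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

def reflect(text: str) -> str:
    # Swap first-person words for second-person ones.
    return " ".join(REFLECTIONS.get(word, word) for word in text.lower().split())

def eliza_reply(statement: str) -> str:
    match = re.match(r"i feel (.*)", statement, re.IGNORECASE)
    if match:
        return f"Why do you feel {reflect(match.group(1))}?"
    match = re.match(r"i am (.*)", statement, re.IGNORECASE)
    if match:
        return f"How long have you been {reflect(match.group(1))}?"
    return "Please tell me more."

print(eliza_reply("I feel anxious about my exams"))
# -> "Why do you feel anxious about your exams?"
```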

In his book Humans Are Underrated, Geoff Colvin suggests asking the following question: “What are the activities that we humans, driven by our deepest nature or by the realities of daily life, will simply insist be performed by other humans, regardless of what computers can do?” An AI chatbot or robot should never substitute for human wisdom, care, or companionship. Without a biblically informed ontological grounding, we will be susceptible to various reductionistic philosophies like physicalism and Gnosticism.

The biblical story is clear that while humans are also creatures, we are uniquely created in the image of God and distinct from machines. The notion of the imago Dei endures, even as our machines become more capable of things that, up to now, only humans have been able to do. The theologian Herman Bavinck argues that “a human being does not bear or have the image of God, but . . . he or she is the image of God.”

Third, we need to discern norms for the responsible use of AI. The creators of ChatGPT bumped up against the “AI alignment” problem—the challenge of aligning an AI system with the goals and values of its designers. The developers had to grapple with bias (including racism) in their training set. Technology is not neutral, and neither are the algorithms and the training data used in AI. Consequently, AI systems can perpetuate injustice, a real threat as big data is employed in a wide variety of fields including insurance, policing, marketing, loans, and politics. We will need to discern creational norms for AI, which include considerations like justice, cultural appropriateness, caring, social norms, stewardship, transparency, and trust.

Wisdom in using AI

Since norms are not simply reducible to algorithms, we will need wisdom to discern the extent to which we ought to replace traditional human roles with machines. Moreover, appropriate norms should point us towards using AI to open up new possibilities for showing love to our neighbor and caring for the earth and its creatures. Already, AI has shown amazing redemptive applications in medicine, drug discovery, environmental monitoring, wildlife preservation, assisting people with disabilities, and enhancing traffic safety. Christian computer scientists and engineers can find common cause and join groups such as AI for Good, AI for Earth, and AI and Faith. Furthermore, computer scientists will need the help of philosophers, theologians, social scientists, and others in the humanities to help direct technologies like AI-generated text in normative ways (in fact, a liberal arts context is an ideal setting for such collaboration).

Fred Brooks, a respected Christian computer scientist, wrote, “It is time to recognize that the original goals of AI were not merely extremely difficult, they were goals that, although glamorous and motivating, sent the discipline off in the wrong direction.” Brooks advocates for IA (Intelligence Amplifying) systems over AI, suggesting that people and machines working together will be able to do far more than AI alone. As an example, one of my colleagues at Calvin University has been exploring the use of AI to help people write better (as opposed to writing for them).

Despite the potential for sinful distortions, AI is part of the exciting possibilities in creation that Christians can help direct in God-honoring ways. Christians will need to join the wider dialogue surrounding these powerful new technologies, bringing insights into what it means to be human and helping to shape public policy with a voice that is both biblical and relevant.

This article first published at the CSR Christ Animating Learning Blog. For the full article, including links and footnotes, please go to https://christianscholars.com/chatgpt-and-the-rise-of-ai/

Photo by ThisIsEngineering: https://www.pexels.com/photo/code-projected-over-woman-3861969/
