
What Does it Mean To Be Human in the Age of AI?

What Genesis teaches about dignity and agency.

Creation Story

The creation story unfolds across the first few chapters of the Bible, reframing for ancient minds how the earth was formed — not out of some battle between gods, but in a way that created structure and order in our world, and gave humans agency and responsibility. God forms the world in seven (really six) days in chapter one of Genesis; in the next chapter he breathes life into the man (adam in Hebrew) and forms the woman (who gets the name Life or Living — Eve).

The first humans are given a mandate to fill the earth, which theologians have interpreted not simply as numerical growth (though we’ve certainly done that), but as filling the earth with good things like skyscrapers and sailboats, and harnessing the earth’s potential by cultivating trees, digging for ore, and bringing the best out of what’s available to us.

In chapter two of the story, the man is told to care for and keep the garden he’s placed in, and this is another aspect of what humans are given: a responsibility over this earth, including each other (we see this two chapters later, as Cain asks if he’s his brother Abel’s keeper — using the same Hebrew word — and the text answers definitively: yes).

The creation story is less focused on the question of “what is a human?” — which may be a more pressing question today — and more focused on “what ought a human to do?”

The answer stems from the shocking responsibility this God gives humanity. In the Ancient Near East, humans were seen as slaves to the gods (whose representative on earth was the king). They were given gifts of agriculture, music, and so on in order to please the gods. The creation story in the Bible answers this very differently: humans have striking agency and incredible responsibility. This is what the image of God may most clearly refer to — our agency and responsibility.

This is the outline of the story, and the narrative raises the dignity of humans — every human — far above what it had been. Part of this dignity is our agency and responsibility over creation and each other (along with the greatest argument for natural “rights” in a political sense, but that’s another topic).

We call these first chapters of Genesis myth because they offer a reason for human existence and an understanding of how creation is ordered. But if we go forward a couple more chapters, the story soon shifts to tragedy.

What do we see in the next scene? The man and woman listen to a snake. However this happened (I presume people of ancient times knew snakes couldn’t talk), we see humans abdicating responsibility to an animal. They were told, remember, to have responsibility over animals.

We have responsibility over what we create.

It’s precisely this dignity and responsibility as humans that the creation story asserts, and that we (sometimes) eagerly abdicate.

Think of corporate-speak, a sentence like: Falling revenues have led to temporary job reductions. The humanity in this sentence — and responsibility — is hidden. What decisions led to falling revenues? Who made those decisions? Was it a failure to act, or an action that didn’t pay off? Who decided now was the time to cut jobs? Who made the decisions of which jobs and why? Corporate-speak hides responsibility through simple linguistic strategies: vagueness, abstraction, and sentences with no human actor.

The next great temptation around abdicating responsibility will be AI.

On an individual level, we’re seeing millions turn to AI for mental health support — a space where people are seeking everything from advice to deep healing. AI surely offers some help, but the responsibility of a therapist is significant, and the creators of the technology are not holding that same responsibility (rather, if it’s a free version of AI, someone’s questions are being used to train the machine itself).

On a societal level, we’re seeing corporations likely reducing hiring due to AI efficiencies, whether real or anticipated. Leaders are making decisions with AI’s input; surely the time is coming when they will blame decisions on AI’s advice.

AI offers an “out” to deny our agency and responsibility and pass the buck. To grasp the creation story in 2026 is less about insisting AI can’t help us create — because it already does, and will continue to — and more about insisting on agency and responsibility. It’s Adam and Eve at the tree with the snake.

This becomes more important when we project the story forward. If AI delivers in the ways some people are prognosticating (greatly reducing hallucinations, offering advice beyond sycophantic praise, “understanding” complex subjects), it will change how we view ourselves. This last happened 500+ years ago with the development of the printing press, as we moved from an oral to a literate culture. In an oral culture, truth was embodied and communal; memory relied on repetition. Humans were necessarily relational — holding truth together as a community. Wisdom came in the form of age (elders) or bards — those who could tell truth in stories or wise sayings.

In a literate culture, humans became interiorized, less relational, and more individual. We prized standardization and accuracy. The wisest were those who could reason and recall. We needed the standardization of printing for reason to become vitally important and displace story and imagination. This culture led to an industrial culture, where education and creativity became earning opportunities, rather than ways to fulfill ourselves and live responsibly in the community.

In an AI culture, humans will no longer have sole ownership of cognition and reason; recall can be offloaded to machines; creativity will become a joint human-machine venture. AI can recognize patterns faster and at a deeper level. This leaves humans without a sole claim on reason, recall, pattern-recognition, or creativity.

Rather, we’ll have a sole claim on conscious embodiment, meaning-making, and responsibility.

As in oral cultures, memory will be shared and external. As in literate cultures, we will be left alone to provide meaning.

Meaning, values, and responsibility will be what makes humans unique, rather than reason. We will feel this shift in 2026 and years (decades!) beyond. The risk with AI is passivity and abdication of responsibility as we give away judgment. In this way, AI can become a “monster” if we allow it: a system or algorithm that degrades us as humans, where we give away our dignity to a machine.

The promise of this technology is to hand off cognitive work around memory and pattern recognition, so we can emphasize embodiment and wisdom that machines cannot provide. To do so, we must insist that our responsibility is a human gift, that we are the decision-makers (not the animals or machines), and that our values (love, care, embodiment, morals) matter especially now.

The creation story reminds us that to create is to be responsible. In 2026, we need people who know this story, who tell it, who use language of responsibility and agency, who insist that we live out our role as those made in God’s image.

About the Author

Gabe is the senior director of content and buzz at Young Life, where he focuses on telling stories to connect people to what God is doing. Young Life allows him to combine his training (MFA in creative writing) and experience as a staff kid, participant, volunteer leader, and staff member to point people to God’s big story, and to help them see themselves as part of it.

Gabe and his wife, Brooke, have two daughters, Ellis and Maci. They live in Colorado Springs, Colorado, and enjoy hiking, skiing, and generally being outdoors.  
