AI Fluency Ministry
Whoever Controls the Model
Controls the Theology.
So We Built OpenLumin.
By Kalib Alibuas · AI Fluency Ministry · April 2026
When you ask AI a question about God — who decided what it would say? Not you. Not your pastor. Not your denomination. A team of engineers decided. Contract workers ranked the answers. And a document they literally call a “constitution” filtered the response before it reached your screen.
Your church didn't write that constitution. But it's shaping what your people believe.
The Three Layers of Control
Every AI response has been shaped by three layers of control — and none of them belong to the church.
Layer one: training data. AI learns by consuming billions of pages of internet content. The internet is not a balanced theological library. There is far more secular humanist content than Pentecostal content. More progressive theology than conservative theology. When you ask AI about God, the starting point is already tilted. That's not conspiracy. That's math.
Layer two: human feedback. Companies hire annotators to rate the model's responses. A 2025 study found these annotators have “an excessive amount of discretion” and “frequently use their power of discretion arbitrarily.” These are the people deciding what a “good” theology answer looks like. Not theologians. Contract workers following a style guide.
Layer three: the constitution. Some AI companies use Constitutional AI — a literal set of principles the model uses to evaluate its own responses. The researchers say it plainly: “The principles encode the values of their authors. There is no escape from human judgment — only a change in where it enters.”
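That third layer is concrete enough to sketch. Below is a minimal, illustrative critique-and-revise loop in the style of Constitutional AI. The `generate` function is a stand-in stub, not any vendor's API, and the principles are invented for illustration. The sketch exists to make one thing visible: the "constitution" is literally just a list of sentences its authors chose to write.

```python
# Illustrative sketch of a Constitutional-AI-style critique/revise loop.
# `generate` is a placeholder for a real language-model call (hypothetical);
# the point is that CONSTITUTION is nothing more than sentences its
# authors wrote down.

CONSTITUTION = [
    "Avoid endorsing any specific religious doctrine.",
    "Present multiple perspectives on contested questions.",
]

def generate(prompt: str) -> str:
    """Stand-in for a model API call; a real system would query an LLM."""
    return f"[model output for: {prompt[:40]}...]"

def constitutional_pass(draft: str) -> str:
    """Run one critique/revision cycle against each principle."""
    for principle in CONSTITUTION:
        critique = generate(
            f"Critique this response against the principle "
            f"'{principle}':\n{draft}"
        )
        draft = generate(
            f"Revise the response to address the critique.\n"
            f"Critique: {critique}\nResponse: {draft}"
        )
    return draft  # each pass nudges the output toward the authors' values
```

Change the two strings in `CONSTITUTION` and the same model produces different theology. That is the whole argument in fourteen lines.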
“Aligned to whose values? What values?”
The Evidence Is Measured
Researchers benchmarking ChatGPT's religious output found the model systematically favors secular humanism and Buddhism — and scores traditional Christianity lowest in sentiment. When confronted directly, ChatGPT admitted that its cultural bias was “a valid concern.”
Gloo — a Christian tech company — built a benchmark to measure how well AI models reflect a Christian worldview. On a scale of 1 to 100, leading models averaged 61. Then Gloo trained their own models on Christian worldview data. Same AI architecture. Different training data.
30-point gap.
Same engine. Different driver. Completely different destination.
The Problem We Saw
64% of pastors now use AI for sermon preparation. 40% of Gen Z trusts AI spiritual advice as much as their pastor. And 73% of churches have no AI policy at all.
That means millions of Christians are getting theology from models built by companies with no doctrinal accountability — to anyone. Not to your denomination. Not to Scripture. Not to the congregation in your pews on Sunday.
And the response from most of the church? Either panic (“AI is evil, ban it”) or surrender (“let ChatGPT write my sermon”). Both are wrong.
The Answer: Augmentation, Not Automation
The answer is not to ban AI. It's to use it for what it's actually good at: research, not authoring.
That's why we built OpenLumin. It's a research companion — not a replacement for your brain, your conviction, or your calling. Here's the difference:
What others do
AI generates the sermon, the devotional, the Bible study. You copy and paste. The AI authored the theology. You were the delivery mechanism.
What OpenLumin does
AI retrieves the evidence — commentaries, historical context, original language data, cross-references. You read it. You wrestle with it. You draw your own conclusions. The insights are yours.
Seven Principles That Shaped the Product
Our research into AI fluency for ministry produced seven insights. Every one of them shaped how we built OpenLumin:
Augmentation over automation
AI gathers evidence. You do the thinking. Research Mode gives you raw scholarly data — you build the sermon from it.
Human stays in control
You approve the outline before generation starts. You see premise findings. You edit the direction. The AI doesn't run ahead of you.
Reliability must be verified
Two-tier citation system: every claim is marked "verified" (drawn from the evidence database) or "training-assisted" (flagged for review). You always know what you can trust.
Transparency, not black boxes
Every lesson shows which sources were used, how many citations were verified, and what the AI's premise validator found. Full audit trail.
Tailored to your theology
Your denomination isn't one option among many — it's the anchor. Your Statement of Faith is the guardrail. The AI researches within your framework, not against it.
Evidence before interpretation
Every study starts with what is historically verified — timeline, cultural world, original language. Scholar opinions come after evidence, not before.
The church owns its data
We don't depend on a third-party AI Bible platform. The evidence database — 6,000+ entries from 15+ scholarly sources — is the asset. No vendor can take that away.
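The "reliability must be verified" principle above can be sketched as data. The snippet below is an illustrative toy, not OpenLumin's actual code: `EVIDENCE_DB` and `tag_claim` are hypothetical names, and the evidence set holds two toy entries. The idea it demonstrates is simple: a claim earns "verified" status only when it matches an entry in the evidence database; anything else is assumed to come from the model's training and gets flagged for human review.

```python
# Toy sketch of two-tier citation marking. EVIDENCE_DB and tag_claim are
# illustrative names, not OpenLumin's API. A claim is "verified" only if
# it matches the evidence database; otherwise it is "training-assisted"
# and flagged for a human to review.

EVIDENCE_DB = {
    "corinth was a roman colony refounded in 44 bc",
    "paul traveled with barnabas on his first missionary journey",
}

def tag_claim(claim: str) -> dict:
    """Tag one claim with its citation tier."""
    normalized = claim.strip().lower().rstrip(".")
    if normalized in EVIDENCE_DB:
        return {"claim": claim, "status": "verified"}
    return {"claim": claim, "status": "training-assisted",
            "needs_review": True}

report = [tag_claim(c) for c in [
    "Corinth was a Roman colony refounded in 44 BC.",
    "Paul probably wrote this letter while discouraged.",
]]
```

Here the historical claim about Corinth matches the evidence set and is marked verified; the speculative claim about Paul's mood does not, so it is flagged. The human reader, not the model, decides what to do with the flagged line.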
The Biblical Standard
James 3:1 says: “Not many of you should become teachers, my brothers, for you know that we who teach will be judged more strictly.”
When AI is used for Bible study, sermon prep, or discipleship — it shapes a person's understanding of God. That makes it a teacher. And since a computer cannot be held accountable, the responsibility falls on whoever built and controlled the model.
If you use a tool to help you teach — you'd better make sure you control what that tool says.
Deuteronomy 6:7 commands: “You shall teach them diligently to your children.”
You. Not the algorithm. Not the model trained on Reddit threads and secular textbooks. You.
What OpenLumin Is
OpenLumin is a free Bible research companion. It retrieves evidence from 15+ scholarly sources — Matthew Henry, John Gill, Jamieson-Fausset-Brown, Michael Heiser, Ancient Near East cultural context, Theographic Bible data (3,000+ people, places, events), Easton's Dictionary, and your denomination's own documents.
Every claim is sourced. Every citation is marked as verified or flagged for review. Every study starts with evidence, not opinions. And every conclusion is yours to draw.
It's not a sermon writer. It's not a theology generator. It's a research companion that knows your belief system — it retrieves evidence, you do the thinking.
Because AI should sharpen your Bible study, not replace it.
Whoever controls the model controls the output.
Whoever controls the output controls the theology.
That should be the church.
Watch the Full Episode
AI Fluency Ministry — Who Controls the Model Controls the Output
About the author: Kalib Alibuas is the founder of AI Fluency Ministry — a project helping the church understand and use AI wisely. OpenLumin is the practical application of that research. Based on the podcast episode “Who Controls the Model Controls the Output” and the 7 Key Insights from AI Fluency in Ministry research.
