Churches in Asia must engage in Christian witness with ethical responsibility, prophetic courage, and theological conviction, observe panelists at CCA study consultation
From left to right: Dr Leonard Chrysostomos Epafras (Indonesia), Very Rev. Fr Philip Thomas Cor Episcopa (Malaysia), Dr Erica M. Larson (Singapore/USA)
Cyberjaya, Malaysia: In a time when digital technologies are redefining the identities of individuals, communities, and authority, the Church in Asia must engage in Christian witness with ethical responsibility, prophetic courage, and theological conviction, observed two distinguished panelists on the closing day of the CCA International Consultation on ‘Artificial Intelligence and Posthumanism: Ethical and Theological Perspectives.’
The insights shared by Dr Erica M. Larson, Cultural Anthropologist at the National University of Singapore, and Dr Leonard Chrysostomos Epafras of the Universitas Gadjah Mada Consortium in Indonesia converged on a common point, offering a reminder that faith, justice, and compassion must remain at the heart of Asia’s digital transformation.
Drawing on social scientific theories and empirical studies across anthropology, media studies, and religious studies, Dr Larson examined how digital technologies, particularly Artificial Intelligence, are reshaping religious life, ethics, and Christian witness in Asia’s pluralistic societies. She emphasised that while the emergence of generative AI introduces new complexities, it must be understood within the broader continuum of digital religion, which “continues to reshape the boundaries of the sacred, the religious subject, and the communal.”
Rejecting technological determinism, Dr Larson stressed that “the adoption, adaptation, and use of technology are fundamentally social processes.”
Dr Larson warned that in designing such technologies, societies must consciously determine the kinds of vulnerabilities they are willing to live with. She explained that this reflects the ‘ethics of ambiguity’ inherent in technology: while innovations help overcome certain human limitations, they simultaneously introduce new existential and moral vulnerabilities.
Reflecting on the COVID-19 pandemic, Dr Larson noted that online worship and virtual gatherings were not mere surrogates for in-person rituals but created new ways of experiencing religion and building community. She highlighted how platforms like Zoom enabled ritual innovation and reshaped worship behaviours, and emphasised that such adaptations reflect the “affordances and limitations” of technology, which both enable and constrain action depending on context.
Presenting findings from her research among Christian and Muslim educators in Manado, Indonesia, Dr Larson highlighted the double-edged nature of smartphones as both moral threats and tools of witness. Educators reported that devices helped them spread religious teachings, coordinate charity, and deepen faith, but also contributed to distraction, moral decay, and exposure to harmful content.
Turning to the broader Asian context, Dr Larson noted that digital spaces both connect and divide communities. Among religious youth on Instagram, online engagement often strengthened inter-religious boundaries while weakening intra-religious ones, illustrating how algorithms and media structures subtly shape religious identity and belonging. She also emphasised that digital platforms hold potential for inter-religious dialogue, but cautioned that these spaces frequently mirror existing divisions rather than bridging them, making the outcomes of online encounters uncertain.
In her concluding reflections, Dr Larson warned that AI technologies are not free from bias, emphasising that algorithms are not neutral and can multiply existing prejudices while appearing objective. She noted that earlier versions of ChatGPT exhibited a strong Islamophobic bias due to the nature of the training data, highlighting the real-world consequences of algorithmic prejudice in religiously plural contexts like Asia. The spread of AI-generated misinformation can have tangible impacts on religious minorities and social cohesion, she added.
Dr Larson called for discernment and responsibility in engaging digital and AI technologies, urging faith communities to move beyond fear or fascination and cultivate critical awareness and media literacy. Her message resonated with the consultation’s wider theme: Christian witness in the digital age requires not withdrawal, but wise participation. She emphasised that in designing and using technology, we must assume responsibility for the kind of world we create and the vulnerabilities we are prepared to accept.
Complementing Dr Larson’s perspective, Dr Leonard Chrysostomos Epafras offered a critical examination of the Indonesian context, addressing majoritarianism, the militarisation of social spaces, shrinking digital democracy, and data privacy.
He noted that while digital technologies have opened new avenues for communication, education, and interfaith exchange, they have simultaneously intensified existing tensions within society, with social media algorithms often amplifying dominant religious and political voices while marginalising minority perspectives.
Dr Epafras further emphasised that digital platforms are increasingly used to propagate divisive narratives, nationalistic fervour, and religious hostility. In this tightening environment, he urged faith communities to reclaim their prophetic role in defending truth, justice, and human dignity. He highlighted that the collection and commercialisation of personal data raise serious ethical concerns, placing human freedom in a vulnerable position.
Dr Epafras stressed that safeguarding privacy and resisting the manipulation of digital systems are moral imperatives grounded in the Christian understanding of human dignity.