Why AI Still Can’t Replace Human Judgment in Courts: Insights from INCOLEM 2025

The INCOLEM 2025 conference at SRM University brought together legal thinkers from all over the world. Everyone came with one big question: Can AI ever replace human judgment in courts?

By the end of the conference, the answer was clear. AI can help, but it cannot replace the human mind, heart, and conscience that justice demands. 

AI Is Helpful, But It Does Not Think Like Humans

Honourable Justice Dinesh Maheshwari opened the discussions with a strong reminder. He said that AI is only a supplement, not an alternative to human knowledge. 

AI does not understand truth. It feels no responsibility. It produces responses based on patterns, even when those responses are wrong. In courts, where lives and rights are at stake, this can be even more dangerous.

Some of the key concerns he raised are listed below:

  • AI has no independent thinking
  • It cannot verify whether its own answers are correct
  • It has no emotional understanding of situations
  • It can make mistakes that humans must correct later

Justice Maheshwari also warned about the confusion around accountability. If AI makes a mistake, we do not know who should be blamed. This uncertainty shows why AI cannot make decisions in legal proceedings.

Human Oversight Is Still Essential

Honourable Justice Mohan Peris from Sri Lanka added another layer to the discussion. He agreed that AI is now part of legal work. Lawyers use it for research, drafting, and writing judgments. It saves time and reduces the overall workload.

He stressed that even with all this help, AI must remain under human control at all times. Humans create the algorithms, humans set the rules, and humans must decide what is right and wrong.

According to him:

  • AI improves speed, not wisdom
  • It supports decisions but cannot make them
  • It cannot understand moral duty or social context
  • Judges must always check and approve what AI produces

AI can be smart with data, but human judgment understands emotion, intention, fear, and hope. That is something no machine can do.

Justice Needs a Human Touch

Professor Upendra Baxi spoke movingly about the human soul of justice. He said something very powerful: technology cannot deliver justice because it does not feel.

Justice is not just about rules. It is about fairness, compassion, and understanding the lived experiences of people.

He reminded everyone that:

  • Technology can be used for harm if not properly monitored
  • Machines cannot carry moral responsibility
  • Human rights impact assessments must be done before releasing new technology
  • Without empathy, justice becomes cold and mechanical

The message he had was simple. Justice needs humanity, and humanity cannot be programmed.


Dharma and the Need for Ethical AI Laws

The Vice-Chancellor of SRM University shared a thoughtful perspective rooted in dharma. He spoke about the need to create AI laws that protect dignity, fairness, and constitutional values. He reminded everyone that technology grows quickly, but ethics must grow with it.

His focus was on the following elements:

  • Digital change must respect human values
  • AI laws should reflect both innovation and responsibility
  • Society needs technologies that help, not harm
  • Dharma offers a moral foundation for future AI regulations

His words reminded everyone that the legal future of India must stay grounded in principles that protect humanity. 

Global Voices, Shared Concerns

The conference brought together experts from the USA, UK, Ireland, Australia, Romania, Singapore, Sri Lanka, Argentina, and many other countries. Their perspectives differed, but their concerns were strikingly similar. They agreed that:

  • AI is spreading fast in the legal system
  • Courts must stay careful and alert while adopting it
  • Ethics and accountability must guide every decision
  • Human judgment cannot be replaced anywhere in the world

This global unity showed how important the topic has become. 

SRM University Delhi-NCR, Sonepat’s Role in Shaping the Future

SRM University played a powerful role by hosting INCOLEM 2025. The university created a global platform where judges, scholars, and professionals could openly discuss the future of AI and law. The event showed SRM University's commitment to preparing students and society for a fast-changing digital world.

The Faculty of Law ensured that the discussions were rich and diverse. Professor VK Singh opened the event with a warm and thoughtful welcome. He focused on the need for collaboration across disciplines and countries. His vision shaped the conference and encouraged open conversation among the participants.

Registrar Professor Dr. V. Samuel Raj closed the event with an important message. He said that academic discussions like these help society understand new challenges. He noted that AI, used responsibly, can reduce the burden on courts, but the focus must always remain on fairness. He thanked the participants from India and abroad for contributing to meaningful conversations that help society move forward.

By hosting such a major international event, SRM University has shown that it is more than an academic institution. It is a place where ideas shape policy, where students learn from global voices, and where the future of law and technology is taken seriously.

Why Human Judgment Still Matters

After two days of discussions, one truth stood firm: AI can assist, but it cannot judge. Courts deal with emotion, intention, suffering, dignity, and hope. These human experiences cannot be reduced to data. AI cannot do the following:

  • It cannot understand pain
  • It cannot judge intention
  • It cannot feel empathy
  • It cannot carry moral responsibility

Human judgment is not replaceable, because justice is not a machine. It is a human promise.