Artificial intelligence expert originates new theory for decision-making


Thu, 09/17/2020

By Jon Niccum, KU News Service


LAWRENCE — How should people make decisions when the outcomes of their choices are uncertain, and the uncertainty is described by probability theory?

That’s the question faced by Prakash Shenoy, the Ronald G. Harper Distinguished Professor of Artificial Intelligence at the University of Kansas School of Business.

His answer can be found in the article “An Interval-Valued Utility Theory for Decision Making with Dempster-Shafer Belief Functions,” which appears in the September issue of the International Journal of Approximate Reasoning.

“People assume that you can always attach probabilities to uncertain events,” Shenoy said.

“But in real life, you never know what the probabilities are. You don’t know if it’s 50 percent or 60 percent. This is the essence of the theory of belief functions that Arthur Dempster and Glenn Shafer formulated in the 1970s.”

His article (co-written with Thierry Denoeux) generalizes the theory of decision-making from probability to belief functions.

“Probability decision theory is used for making any sort of high-stakes choice. Like should I accept a new job or a marriage proposal? Something high stakes. You wouldn’t need it for where to go for lunch,” he said.

“But in general, we never know what’s going to happen. You accept a job, but it may turn out you have a bad boss. There is a lot of uncertainty. You may have two job offers, so you have to decide which one to accept. Then you weigh the pros and cons and attach probabilities to them. Probabilities are fine when you have lots of repetitions. But if it’s a one-time thing, you can’t ‘average your winnings.’”
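
To make the interval idea concrete, here is a minimal Python sketch (the job scenario and the numbers are hypothetical, not taken from Shenoy’s article) of how a Dempster-Shafer belief function assigns weight to sets of outcomes and brackets an event with a belief-plausibility interval rather than a single probability:

```python
# A Dempster-Shafer mass function assigns weight to *sets* of outcomes.
# Hypothetical evidence about whether accepting a job offer works out:
# 50% of the evidence points to "good", 10% to "bad", and 40% is
# uncommitted, so it is assigned to the whole frame {"good", "bad"}.
mass = {
    frozenset({"good"}): 0.5,
    frozenset({"bad"}): 0.1,
    frozenset({"good", "bad"}): 0.4,
}

def belief(event):
    """Lower bound: total mass committed entirely to the event."""
    return sum(m for s, m in mass.items() if s <= event)

def plausibility(event):
    """Upper bound: total mass that does not contradict the event."""
    return sum(m for s, m in mass.items() if s & event)

good = frozenset({"good"})
print(belief(good), plausibility(good))   # 0.5 0.9
```

Instead of committing to “50 percent or 60 percent,” the calculus only says the chance of a good outcome lies somewhere between 0.5 and 0.9.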

One of the earliest answers to this question was provided by John von Neumann and Oskar Morgenstern in their 1947 book “Theory of Games and Economic Behavior,” Shenoy said. In 1961, Daniel Ellsberg showed via experiments that von Neumann and Morgenstern's decision theory was not descriptive of human behavior, especially when there was ambiguity in representing uncertainty by probability theory.
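
The best-known version of Ellsberg’s experiment uses an urn holding 30 red balls and 60 balls that are black or yellow in unknown proportion, with every bet paying the same prize if it wins. Most people prefer betting on red over black, yet also prefer betting on “black or yellow” over “red or yellow.” The short Python check below (a sketch of the standard argument, not code from the article) scans every possible composition of the urn and confirms that no single probability assignment reproduces both preferences at once:

```python
# Ellsberg's urn: 30 red balls and 60 black-or-yellow balls in an unknown mix.
# Since every bet pays the same prize, a probability maximizer simply prefers
# the bet with the higher chance of winning.
p_red = 1 / 3
consistent = []
for n_black in range(61):               # every possible number of black balls
    p_black = n_black / 90
    p_yellow = (60 - n_black) / 90
    prefers_red_to_black = p_red > p_black
    prefers_black_or_yellow = p_black + p_yellow > p_red + p_yellow
    if prefers_red_to_black and prefers_black_or_yellow:
        consistent.append(n_black)

print(consistent)   # [] -- no probability assignment explains both choices
```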

In the late ’60s and mid-’70s, Arthur Dempster and Glenn Shafer (a former KU faculty member in both mathematics and business) formulated an uncertainty calculus called belief functions, a generalization of probability theory better able to represent ambiguity. However, there was no theory for making decisions when uncertainty was described by belief functions.

Shenoy’s article provides the first formulation of a theory for decision-making when uncertainty is described by Dempster-Shafer belief functions, one analogous to the von Neumann-Morgenstern theory. And Shenoy said this theory is better able to explain Ellsberg’s experimental findings for choices under ambiguity.
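
The full axiomatic construction is in the journal article, but its flavor can be sketched on the Ellsberg urn above: with a belief function, each bet gets a lower and an upper expected utility rather than a single number. The Python sketch below uses the standard lower and upper expectations of a belief function (not necessarily the paper’s exact construction), and ranking bets by the lower value reproduces the choices Ellsberg observed:

```python
# Interval-valued expected utility for the Ellsberg urn (illustrative sketch).
# The belief function puts mass 1/3 on {red} and 2/3 on the ambiguous set
# {black, yellow}; each bet pays 100 if it wins and 0 otherwise.
mass = {
    frozenset({"red"}): 1 / 3,
    frozenset({"black", "yellow"}): 2 / 3,
}

bets = {
    "red":             {"red"},
    "black":           {"black"},
    "red or yellow":   {"red", "yellow"},
    "black or yellow": {"black", "yellow"},
}

def payoff(bet, color):
    return 100 if color in bets[bet] else 0

for bet in bets:
    # Each focal set contributes its mass times the worst-case payoff over the
    # colors it contains (lower bound) or the best-case payoff (upper bound).
    lower = sum(m * min(payoff(bet, c) for c in s) for s, m in mass.items())
    upper = sum(m * max(payoff(bet, c) for c in s) for s, m in mass.items())
    print(f"{bet:16s} expected utility in [{lower:5.1f}, {upper:5.1f}]")

# red              [ 33.3,  33.3]     black            [  0.0,  66.7]
# red or yellow    [ 33.3, 100.0]     black or yellow  [ 66.7,  66.7]
# Ranking by the lower bound prefers "red" to "black" but "black or yellow"
# to "red or yellow" -- exactly the pattern Ellsberg observed.
```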

The professor first approached Denoeux about this topic three years ago when both were speaking to doctoral students.

“(Denoeux) went through all the theories of decision-making with belief functions. Afterward I went and told him, ‘All of this you said is unsatisfactory.’ And he agreed with me! I said I’d like to come and work with him on this. So he sent me an invitation.”

Shenoy applied for a sabbatical, then headed to France in spring 2019, where he spent five months collaborating with Denoeux at the Université de Technologie de Compiègne.

“It was culturally very enriching and professionally rewarding,” he said.

Now in his 43rd year at KU, Shenoy remains an expert on uncertain reasoning and its applications to artificial intelligence. He is the inventor of Valuation-Based Systems (VBS), a mathematical architecture for knowledge representation and inference that includes many uncertainty calculi. His VBS architecture is currently used for multisensor fusion in ballistic missiles for the U.S. Department of Defense.

He hopes his latest research can benefit those who rely on belief functions.

“That includes a lot of people in the military, for example,” Shenoy said. “They like belief functions because of their flexibility, and they want to know how you make decisions. And if you’re going to reduce everything to probabilities at the end, why not use probabilities to begin with?”

Top photo: Pixabay
