# Dating Game Explained With Probabilistic Graphical Models And Neural Networks

Neural networks are networks of nodes, inspired by biological neurons in the brain, that process data and extract valuable insights from it. They were introduced with the goal of recreating, or at least approximating, the capabilities of the human brain. A neural network predicts by learning the correlation between its inputs and outputs.
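To make "a network of nodes that learns a correlation between input and output" concrete, here is a minimal forward pass through a tiny two-layer network. The sizes (4 inputs, 3 hidden neurons) and the random weights are invented purely for illustration; an untrained network like this produces arbitrary outputs until its weights are fitted to data.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
# Toy network: 4 inputs -> 3 hidden neurons -> 1 output
W1 = rng.normal(size=(4, 3))   # input-to-hidden weights
W2 = rng.normal(size=(3, 1))   # hidden-to-output weights

def predict(x):
    hidden = sigmoid(x @ W1)       # hidden-layer activations
    return sigmoid(hidden @ W2)    # output, interpretable as a probability

x = np.array([1.0, 0.0, 1.0, 0.0])
y = predict(x)                     # a single value in (0, 1)
```

The sigmoid at the output keeps the prediction in (0, 1), which is what lets it be read as a probability of a binary outcome.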

### What Is A Probabilistic Graphical Model?

A Probabilistic Graphical Model, or PGM, is an amalgamation of classical probabilistic models and graph theory: a graph encodes the dependence structure among a set of random variables.

### How Are Probabilistic Graphical Models And Neural Networks Related?

Both PGMs and NNs are data-driven frameworks, and both can solve prediction problems on their own. The essential difference between them rests on how they use the data to predict an outcome.

One major drawback of an NN is that the uncertainty in its predictions stays implicit and hard to inspect. A PGM makes that uncertainty explicit, which helps to build models that are more faithful to reality; in this respect, a PGM is more capable than an NN.

A PGM can be restricted to provide the functionality of an NN, and at the same time an NN can be exploited to approximate the information that a PGM would provide.

### Problem Statement

We are going to use one of the popular toy examples found on the internet to describe this: asking a girl out on a date.

John Doe likes a girl he knows through N mutual acquaintances and wants to ask her out on a date. He has low self-esteem and wants to ask some subset of their mutual friends to introduce him to her. However, a referral from someone she mistrusts may only have a weak positive influence, while a referral from someone she dislikes might even have a negative impact.

Now suppose n of his friends have already had the same idea and tried this before (assume, for the purposes of this argument, that the young lady in question is unbearably desirable). So he has in his possession a set of n observations, each compactly represented as an N-bit vector [1, 0, 1, 0, …], where 1 means that friend gave a boost and 0 means no contact.

John needs to find the set of friends he should ask to boost him, i.e. the best such vector. Note that this need not be one of the n vectors already tried.
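Because the search space is all 2^N possible vectors, not just the n observed ones, a brute-force sketch makes the problem statement concrete. The observations, outcomes, and similarity-based scoring function below are all invented for illustration; they stand in for whatever model (NN or PGM) is fitted in the next section.

```python
from itertools import product

N = 4  # number of mutual friends (kept small for illustration)

# Past attempts: each N-bit vector (1 = friend i gave a boost),
# paired with the observed outcome (1 = she said yes).
observations = [
    ((1, 0, 1, 0), 1),
    ((0, 1, 0, 0), 0),
    ((1, 1, 1, 0), 1),
    ((0, 0, 0, 1), 0),
]

def score(candidate):
    # Hypothetical score: reward agreement with past successes,
    # penalise agreement with past failures.
    s = 0
    for vec, outcome in observations:
        overlap = sum(a == b for a, b in zip(candidate, vec))
        s += overlap if outcome == 1 else -overlap
    return s

# Enumerate all 2**N candidate vectors; the best one need not be
# among the n vectors already tried.
best = max(product([0, 1], repeat=N), key=score)
```

For realistic N, exhaustive enumeration is infeasible, which is exactly why a learned model (the NN or PGM below) is brought in to generalise from the n observations.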

### Solution

Using a Neural Network: The input layer would have N nodes that receive the appraisal inputs, and the output layer would be the girl's response in binary. We can throw in a hidden layer of M neurons, each with N connections back to the input layer, encoding various quanta of me-friend-girl dynamics; pooled together through M forward connections to the output layer, these determine her interest in him.

When we train this neural network on the n vectors we have, we learn weights on the N×M input-to-hidden connections and the M hidden-to-output connections. Intuitively, combinations of inputs that predict the output well (say, her 3 closest friends) will be repeatedly reinforced across the n observations, and so will the weight of the hidden neuron that receives inputs from specifically that clique. However, there will be no easy way of knowing which of the M hidden neurons contains the 3-closest-friends information. The NN will function as a Delphic oracle: you can ask it about the fate of individual vectors, but not for reasons explaining its prediction.

Using a PGM: We could also treat this problem as one of Bayesian reasoning, where we potentially receive observations of approval from N nodes, which lead to the formation of an impression (a latent random variable), which in turn causes date acceptance (an observable event). In this case, we get to see the likelihood p(approval from friend i | impression), from which we have to estimate the posterior p(impression | vector of all approvals) using Bayes' theorem.

Going from p(approval i | impression) to p(all approvals | impression), though, is hard. Usually, machine learners assume conditional independence across the i approvals, i.e. p(all approvals | impression) = product of all p(approval i | impression). This is simple to compute, but gives up on the possibility of modelling non-trivial correlations between inputs. For example, if hearing good things from either A, B and C together, or from C and F, impresses the girl, but hearing from A and C together does not (assume that the girl's social life is extremely rich), such effects won't show up in such 'naive' Bayesian predictions.
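The naive-Bayes posterior described above can be computed in a few lines. The prior and the per-friend likelihoods p(approval_i | impression) below are invented numbers for illustration (three friends, a binary impression), not estimates from any real data; the point is that every intermediate factor in the product is inspectable, unlike the NN's hidden weights.

```python
# Latent impression I is binary: impressed vs not impressed.
p_impressed = 0.5                 # prior p(I = impressed), assumed

# Assumed likelihoods p(approval from friend i | I), one entry per friend
lik_imp = [0.9, 0.6, 0.8]         # ... given she is impressed
lik_not = [0.3, 0.5, 0.2]         # ... given she is not

def posterior(approvals):
    """p(impressed | approvals) under the conditional-independence
    (naive Bayes) assumption, via Bayes' theorem."""
    num = p_impressed             # numerator: prior x product of likelihoods
    den = 1.0 - p_impressed       # same for the 'not impressed' hypothesis
    for a, pi, pn in zip(approvals, lik_imp, lik_not):
        num *= pi if a else (1 - pi)
        den *= pn if a else (1 - pn)
    return num / (num + den)      # normalise over both hypotheses
```

Each factor in the loop is one of the "detailed intermediate predictions" a PGM exposes; the trade-off, as noted above, is that the product form cannot represent interactions such as "A and C together backfires".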

Summary: The neural network lacks the reasoning behind the predictions it makes, whereas a probabilistic graphical model shows the evidence supporting its prediction. PGMs give you detailed intermediate predictions about how likely each individual input is to generate the effect.

### Outlook

Both neural networks and probabilistic graphical models can describe the correlation between the dependent and independent factors of any problem in which a number of features determine a specific outcome, and both can be used to learn the function the network computes.

A Computer Science Engineer turned Data Scientist who is passionate about AI and all related technologies. Contact: amal.nair@analyticsindiamag.com
