
Blog: How the Avengers used a decision tree to beat Thanos


This post is for absolute beginners in Machine Learning who want to know about decision trees.

Story time

How the Avengers used a decision tree to beat Thanos:

On planet Titan

Iron Man was fighting Thanos.

Iron Man (talking to himself): Something is wrong; this guy is so powerful. I should think smart. If I manage to send my message to Earth, I may have a chance of winning.

As a last resort, Iron Man embeds a message in a piece of nanotech when he wounds Thanos.

Thanos takes the Time Stone from Dr. Strange and goes to Earth for the Mind Stone.

*Iron Man’s secret message reaches Captain America*

*The message says to implement a decision tree*

Captain America: Thanos, stop! Listen to me. We have an idea. We will help you in your quest. Just listen to me once.

Thanos: Speak.

Cap: We will help you eliminate half of the population of Earth. There are many innocent people on Earth. We can make this planet stronger by executing the people who are ruining it instead.

Thanos: How do you do that?

Cap: Stark collected data about every person standing on Earth right now. Using that data, we can decide whether or not to eliminate each person.

Thanos: Explain it to me.

Cap: This is a sample of the data collected.

Sample data set (features: Type of Person, Behaviour, Generosity, Family; result: Execute)

If Execute = “Yes” then that person will be executed.

Thanos: How can we use this data to build a decision tree?

Cap: By calculating the entropy of each feature. If the sample contains all the results as “No”, then entropy = 0. If the sample contains half of the results as “No” and half as “Yes”, then entropy = 1.

Thanos: How do you calculate entropy? Explain with an example.

Cap: Here is the formula for entropy.
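The calculations that follow are consistent with the standard Shannon entropy (log base 2), where p(i) is the fraction of the sample with result value i:

Entropy(S) = - Σ p(i) * log2(p(i))

For a result column with only “Yes” and “No”, this becomes Entropy(S) = - p(Yes) * log2(p(Yes)) - p(No) * log2(p(No)).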

Let’s calculate the entropy of the result (Execute) using the above formula.
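The probabilities used below (0.357 ≈ 5/14, 0.286 ≈ 4/14) suggest a sample of 14 people. Assuming 9 of them have Execute = “No” and 5 have Execute = “Yes”, which is the split that reproduces the 0.940 figure used in the information gain calculations further down:

Entropy (Execute) = - (9/14) * log2(9/14) - (5/14) * log2(5/14) = 0.940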

Let’s calculate the entropy of each feature using the formula below.

Entropy (Execute, Type of Person) = P(Civilian) * E(Civilian) + P(Terrorist) * E(Terrorist) + P(Criminal) * E(Criminal)

Entropy (Execute, Type of Person) = 0.357 * 0.971 + 0.286 * 0 + 0.357 * 0.971 = 0.694

Similarly, the entropy values for the other features are: Entropy (Execute, Behaviour) = 0.911, Entropy (Execute, Generosity) = 0.788, and Entropy (Execute, Family) = 0.892.

As we know, entropy is a measure of randomness.

From the information in the tables above, we have to select the feature with the lowest entropy (randomness).

We calculate the Information Gain (Entropy of system - Entropy of feature) from the tables above.

Information Gain (Type of Person) = 0.940 - 0.694 = 0.246

Information Gain (Behaviour) = 0.940 - 0.911 = 0.029

Information Gain (Generosity) = 0.940 - 0.788 = 0.152

Information Gain (Family) = 0.940 - 0.892 = 0.048

From the above data, Type of Person clearly gives the highest information gain.

So, the root node of the tree is “Type of Person”.
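For readers who want to check these numbers, here is a minimal Python sketch. The per-group label counts below are assumptions chosen to reproduce the figures quoted above (14 people split 9 “No” / 5 “Yes” overall; 5 Civilians, 4 Terrorists and 5 Criminals for Type of Person).

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a label distribution given as raw counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def weighted_entropy(groups):
    """Weighted average entropy after splitting on a feature;
    `groups` holds one list of label counts per feature value."""
    total = sum(sum(g) for g in groups)
    return sum((sum(g) / total) * entropy(g) for g in groups)

# Entropy of the whole sample: assumed 9 "No" and 5 "Yes" out of 14 people,
# which reproduces the 0.940 figure used above.
system_entropy = entropy([9, 5])                      # ~0.940

# Splitting on "Type of Person": 5 Civilians, 4 Terrorists, 5 Criminals.
# The per-group counts are assumptions that reproduce the quoted entropies.
type_of_person = [
    [2, 3],  # Civilian  -> entropy ~0.971
    [4, 0],  # Terrorist -> entropy 0
    [3, 2],  # Criminal  -> entropy ~0.971
]
feature_entropy = weighted_entropy(type_of_person)    # ~0.694

info_gain = system_entropy - feature_entropy          # ~0.246
print(round(system_entropy, 3), round(feature_entropy, 3), round(info_gain, 3))
```

Running this prints approximately 0.94, 0.694 and 0.246, matching the system entropy, the Type of Person entropy and its information gain above.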

Let’s construct a tree from the above information.

Now we have to split the data into three parts according to the tree structure (the feature “Type of Person” is eliminated, as it has been taken as the root).

The Civilian data set is as follows:

The Terrorist data set is as follows:

The Criminal data set is as follows:

Let’s look at the Civilian data set and create the information gain tables.

In the Civilian data set, the Generosity feature has the highest information gain, so the splitting feature will be Generosity.

Let’s look at the Terrorist data set and create the information gain tables.

Clearly, the entropy value from the table is zero, so no further splitting is required.

Let’s look at the Criminal data set and create the information gain tables.

In the Criminal data set, the Guilty feature has the highest information gain, so the splitting feature will be Guilty.

Let’s reconstruct the tree from the above information.

Now we have to split the Generosity data and the Guilty data.

Splitting the Generosity data

Clearly, from the above tables, entropy = 0.

This implies that if Generosity = “Yes”, then the person lives.

Splitting the Guilty data

Clearly, from the above tables, entropy = 0.

This implies that if Guilty = “Yes”, then the person lives.

Let’s reconstruct the final tree from the above information.
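The whole procedure Cap describes (compute information gain, split on the best feature, drop that feature, and recurse until a subset is pure) can be sketched as a generic ID3-style routine in Python. This is only a sketch under the assumption that the data comes as a list of dicts with the features above and an “Execute” label; it is not taken from the post itself.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (base 2) of a list of label values."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def best_feature(rows, features, target):
    """Return the feature with the highest information gain on `rows`."""
    base = entropy([r[target] for r in rows])
    def gain(feature):
        groups = {}
        for r in rows:
            groups.setdefault(r[feature], []).append(r[target])
        remainder = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
        return base - remainder
    return max(features, key=gain)

def build_tree(rows, features, target="Execute"):
    """ID3-style tree: a leaf is a label, an internal node is {feature: {value: subtree}}."""
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1:                 # pure subset: entropy 0, stop splitting
        return labels[0]
    if not features:                          # no features left: take the majority label
        return Counter(labels).most_common(1)[0][0]
    feature = best_feature(rows, features, target)
    remaining = [f for f in features if f != feature]   # drop the feature used at this node
    return {feature: {value: build_tree([r for r in rows if r[feature] == value],
                                        remaining, target)
                      for value in {r[feature] for r in rows}}}

# Hypothetical usage, assuming `data` is Stark's table as a list of dicts:
# tree = build_tree(data, feature_columns)  # e.g. ["Type of Person", "Behaviour", ...]
# For the data described in this post, the result would have "Type of Person" at the
# root, split the Civilian branch on Generosity and the Criminal branch on Guilty,
# and leave the Terrorist branch as a single leaf, as shown above.
```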

Thanos: That’s great. Let’s do this.

Cap: Yeah, Let’s do this.

Captain America gives Thanos all the data about the people presently living on Earth so he can execute the decision tree. Secretly, he includes Thanos’s name in the list.

Thanos runs the decision tree on the data and eliminates himself from the universe.

At last, the Avengers beat Thanos by using a decision tree.

References:

  1. http://www.saedsayad.com/decision_tree.htm
  2. https://bricaud.github.io/personal-blog/entropy-in-decision-trees/
  3. https://medium.com/udacity/shannon-entropy-information-gain-and-picking-balls-from-buckets-5810d35d54b4
  4. https://www.youtube.com/watch?v=2s3aJfRr9gE&list=PLSQl0a2vh4HC9lvrBhVt4UUkhzpp3N5_x

Inspired by:

  1. https://medium.com/@racheltho/why-you-yes-you-should-blog-7d2544ac1045

Thank You.

