Max-Min Algorithm in Grid Computing with code in C
Here you will find a simple explanation of the Max-Min algorithm in grid computing.
Let us first understand why this algorithm is needed.
This is a static task scheduling algorithm used for load balancing.
The Max-Min algorithm first finds the minimum completion time for each task across all machines. It then chooses the task with the maximum of these minimum times and assigns it to the corresponding machine. The same procedure is repeated until all tasks are scheduled.
Why do we call it Max-Min when we actually find the minimum first and then the maximum?
This is because the semantics of Max-Min are that the task with the maximum execution time (the heaviest process) is allocated to the machine that gives it the minimum completion time (the best machine for it).
Let us take an example before moving to the code. Suppose we have two virtual machines and three tasks:
Task 1 executes in 140 ms on VM1 and 100 ms on VM2. Task 2 takes 20 ms on VM1 and 100 ms on VM2. Task 3 takes 60 ms on VM1 and 70 ms on VM2.
What is a Task ?
Let’s take an example. Suppose you have developed a social media app called fakebook. Now one user wants to change her DP (display picture), another user wants to create an account, and a third user wants to like a picture.
All of these are tasks, and each involves manipulating the database. The entity responsible for that change is the backend server. To use the server effectively, you have made virtual partitions of it, leaving some virtual machines suited for light computation and others for heavy computation.
Of course you will not let a weak virtual machine do heavy computation, or a strong machine waste time on a trivial task. That is the optimal condition, and we need a task scheduling algorithm to make this decision for us.
Algorithm to solve the problem
1. Find the minimum completion time for each task. From the data above these are (100 ms, T1, VM2), (20 ms, T2, VM1), (60 ms, T3, VM1).
2. Among these, find the set with the maximum time: (100 ms, T1, VM2).
3. Execute T1 on VM2 for 100 ms.
4. Since 100 ms have elapsed, update the data: mark T1 as INT_MAX (2³¹ − 1) to let the code know this task is over (it stands in for infinity), and add 100 ms to VM2's remaining entries. Now repeat step 1.
5. Find the minimum completion time for each remaining task: (20 ms, T2, VM1), (60 ms, T3, VM1).
6. Find the set with the maximum time: (60 ms, T3, VM1).
7. Execute T3 on VM1 for 60 ms, then update the data again: mark T3 as INT_MAX and add 60 ms to VM1's remaining entries.
8. Find the minimum completion time for the last task: (80 ms, T2, VM1). Execute T2 on VM1.
9. Exit, since all tasks have been scheduled.
Makespan produced by any algorithm for a schedule can be calculated as follows:
makespan = max(CT(ti, mj))
CTij = Rj + ETij
where CTij is the completion time of task i on machine j, ETij is the expected execution time of task i on resource j, and Rj is the ready time (availability time) of resource j after completing its previously assigned jobs.
Here makespan = max(100, 60, 80) = 100 ms, taking the completion times of T1, T3, and T2 respectively.
Finally the Code
#include <stdio.h>
#include <limits.h>

int main() {
    int nT, nM; // number of tasks, number of machines
    printf("\nEnter number of machines and tasks\n");
    scanf("%d%d", &nM, &nT);

    /*
    Declare a 2D array of size nM x nT.
    Data should be in the following format:
          T1    T2    T3
    M1 | 140 |  20 |  60 |
    M2 | 100 | 100 |  70 |
    */
    int maxMin[nM][nT];
    int tmp[nM][nT]; // keeps the original execution times
    int makespan = 0;

    printf("\nFill Data\n");
    for (int i = 0; i < nM; i++)
        for (int j = 0; j < nT; j++) {
            scanf("%d", &maxMin[i][j]);
            tmp[i][j] = maxMin[i][j];
        }

    // visualise data
    printf("\nOriginal Data\n");
    for (int i = 0; i < nM; i++) {
        for (int j = 0; j < nT; j++)
            printf("%d ", maxMin[i][j]);
        printf("\n");
    }

    // These arrays will hold the answer
    int resultTask[nT];
    int resultMachine[nT];
    int resultTime[nT];
    int ptr = -1; // indicates how much of the result set is filled

    while (ptr < nT - 1) {
        // minimum completion time (and its machine) for each task
        int time[nT], machine[nT];
        for (int j = 0; j < nT; j++) {
            int minimum = INT_MAX;
            int pos = -1;
            for (int i = 0; i < nM; i++) {
                if (maxMin[i][j] < minimum) {
                    minimum = maxMin[i][j];
                    pos = i;
                }
            }
            time[j] = minimum;
            machine[j] = pos;
        }

        // Now find the task with the maximum of these minimum times
        int maximum = INT_MIN;
        int pos = -1;
        for (int j = 0; j < nT; j++) {
            if (time[j] > maximum && time[j] != INT_MAX) {
                maximum = time[j];
                pos = j;
            }
        }

        resultTask[++ptr] = pos;
        resultMachine[ptr] = machine[pos];
        resultTime[ptr] = tmp[machine[pos]][pos];
        if (maximum > makespan)
            makespan = maximum;

        // resetting states: mark the scheduled task as done and push back
        // the chosen machine's completion times by the execution time it
        // just consumed (adding `maximum` instead would double-count the
        // machine's previous ready time once it gets a third task)
        int et = resultTime[ptr];
        for (int i = 0; i < nM; i++) {
            for (int j = 0; j < nT; j++) {
                if (j == resultTask[ptr])
                    maxMin[i][j] = INT_MAX; // task is over; INT_MAX stands for infinity
                else if (i == resultMachine[ptr] && maxMin[i][j] != INT_MAX)
                    maxMin[i][j] += et; // machine stays busy for `et` more ms
            }
        }
    }

    // printing the answer
    printf("\nScheduled Tasks are:\n");
    for (int i = 0; i < nT; i++)
        printf("\nTask %d Runs on Machine %d with Time %d units\n",
               resultTask[i] + 1, resultMachine[i] + 1, resultTime[i]);
    printf("\nTotal elapsed time : %d units\n", makespan);
    return 0;
}
Final Words
The Max-Min algorithm allocates the larger task Ti to the resource Rj, giving large tasks higher priority than smaller ones. Therefore, in Max-Min, many short tasks can execute concurrently while the larger one is running.