Min-Min Algorithm in Grid Computing with Code in C
Here you will find the easiest explanation of the Min-Min algorithm in grid computing.
Let us first understand why this algorithm is needed.
Min-Min is a static task-scheduling algorithm used for load balancing.
The Min-Min algorithm first finds the minimum execution time for every task. It then chooses the task with the least execution time among all tasks and assigns it to the resource that produces the minimum completion time. Min-Min repeats the same procedure until all tasks are scheduled.
Let us take an example before moving to the code
Here Task 1 will execute in 140 ms on VM1 and in 100 ms on VM2. Task 2 will execute in 20 ms on VM1 and in 100 ms on VM2. Task 3 will run for 60 ms on VM1 and for 70 ms on VM2:

      T1    T2    T3
VM1 | 140 |  20 | 60 |
VM2 | 100 | 100 | 70 |
What is a Task?
Let's take an example. Suppose you have developed a social media app called fakebook. Now one user wants to change her DP (display picture), another user wants to create an account, and a third user wants to like a picture.
All of these are tasks, and each involves a change in the database. The entity responsible for that change is the backend server. To use the server effectively, you have partitioned it into virtual machines, making some virtual machines good for light computation and some for heavy computation.
Of course, you would not want a weak virtual machine to do heavy computation or a strong machine to do a trivial task. But achieving this optimal assignment by hand is impractical, so we need a task-scheduling algorithm to make the decision for us.
Algorithm to solve the problem
1. Find the minimum completion time for each task. With the data above, these are (100 ms, T1, VM2), (20 ms, T2, VM1) and (60 ms, T3, VM1).
2. Now find the set with the minimum time, and that is (20 ms, T2, VM1).
3. Execute T2 on VM1 for 20 ms.
4. Since 20 ms have elapsed, we need to update our data. The new data is:

         T1        T2       T3
   VM1 | 160 | INT_MAX | 80 |
   VM2 | 100 | INT_MAX | 70 |

5. Here we set T2's entries to INT_MAX (2³¹ − 1) to let our code know that this task is over; it denotes infinity. Now repeat step 1.
6. Find the minimum completion time for each remaining task: (100 ms, T1, VM2) and (70 ms, T3, VM2).
7. Now find the set with the minimum time, and that is (70 ms, T3, VM2).
8. Execute T3 on VM2; it completes at 70 ms.
9. Since 70 ms have elapsed, we need to update our data. The new data is:

         T1        T2            T3
   VM1 | 160 | INT_MAX | INT_MAX |
   VM2 | 170 | INT_MAX | INT_MAX |

10. Find the minimum completion time for the remaining task: (160 ms, T1, VM1).
11. Exit, since all tasks have been scheduled.
The makespan produced by any algorithm for a schedule can be calculated as follows:

makespan = max(CT(ti, mj))
CTij = Rj + ETij

where CTij is the completion time of task i on machine j, ETij is the expected execution time of task i on resource j, and Rj is the ready time (availability time) of resource j after completing its previously assigned jobs.

Here makespan = max(20, 70, 160) = 160 ms.
Finally, the Code
Input data should be in the following format (machines as rows, tasks as columns):

      T1    T2    T3
M1 | 140 |  20 | 60 |
M2 | 100 | 100 | 70 |

#include <stdio.h>
#include <limits.h>

int main(void) {
    int nT, nM; // number of tasks, number of machines
    printf("\nEnter number of machines and tasks\n");
    scanf("%d %d", &nM, &nT);

    int minMin[nM][nT]; // minMin[i][j] = completion time of task j on machine i
    for (int i = 0; i < nM; i++)
        for (int j = 0; j < nT; j++)
            scanf("%d", &minMin[i][j]);

    int resultTask[nT], resultMachine[nT], resultTime[nT]; // this will hold the answer
    int ready[nM]; // time at which each machine becomes free
    for (int i = 0; i < nM; i++) ready[i] = 0;
    int ptr = -1; // indicates how much of the result set is filled
    int makespan = 0;

    while (ptr < nT - 1) {
        int time[nT], machine[nT]; // stores minimum time w.r.t. machine of each task
        int minimum = INT_MAX, task = -1;

        for (int j = 0; j < nT; j++) { // find the best machine for each task
            time[j] = INT_MAX;
            for (int i = 0; i < nM; i++)
                if (minMin[i][j] < time[j]) { time[j] = minMin[i][j]; machine[j] = i; }
        }
        // now we find the task with minimum time
        for (int j = 0; j < nT; j++)
            if (time[j] < minimum) { minimum = time[j]; task = j; }

        ptr++;
        resultTask[ptr] = task;
        resultMachine[ptr] = machine[task];
        resultTime[ptr] = minimum;
        if (minimum > makespan) makespan = minimum;

        // resetting states: the scheduled task becomes INT_MAX, and every
        // remaining task on the chosen machine is delayed by the elapsed time
        int elapsed = minimum - ready[machine[task]];
        ready[machine[task]] = minimum;
        for (int i = 0; i < nM; i++)
            for (int j = 0; j < nT; j++) {
                if (j == task) minMin[i][j] = INT_MAX;
                else if (i == resultMachine[ptr] && minMin[i][j] != INT_MAX)
                    minMin[i][j] += elapsed;
            }
    }

    printf("\nScheduled Tasks are :\n");
    for (int k = 0; k <= ptr; k++)
        printf("\nTask %d Runs on Machine %d with Time %d units\n",
               resultTask[k] + 1, resultMachine[k] + 1, resultTime[k]);
    printf("\nMakespan : %d units\n", makespan);
    return 0;
}
The Min-Min algorithm is simple and fast, and at the same time it produces a good makespan. However, because it considers the shortest jobs first, it fails to utilize the resources efficiently, which leads to a load imbalance.