Thursday, March 15, 2012

Coin denomination problem

Problem Statement: You are given n types of coin denominations of values d1 < d2 < ... < dn (all integers), with an unlimited supply of coins of each denomination. Give an algorithm that makes change for an amount of money C using as few coins as possible.

Solution #1: Let min[i] represent the minimum number of coins required to make the change 'i'. Then min[i] = 1 + (minimum of min[i - dj] over all denominations dj with dj <= i), with min[0] = 0. Note that simply taking the largest coin that fits is not optimal in general: with denominations {1, 3, 4} and C = 6, that greedy choice gives 4 + 1 + 1 (3 coins) while 3 + 3 needs only 2, which is why dynamic programming is used here. Below is the code with comments,

#include <stdio.h>

#define INFINITY 99999

//min[i] holds the fewest coins needed to make change i
int min[1000] = {0, };


int coins(int* arr, int len, int c)
{
    int i,j;

    //initialize entries 0..c with INFINITY, meaning
    //no way has been found yet to make change i
    for(i=0; i<=c; i++)
        min[i] = INFINITY;

    //0 coins are needed to get 0 change!
    min[0] = 0;

    //1 coin is needed to get the change equal to
    //any of the available coins (skip coins larger than c)
    for(i=0; i < len; i++)
        if(arr[i] <= c)
            min[arr[i]] = 1;

    //for every amount i, try ending with each coin arr[j]
    //and keep whichever choice uses the fewest coins
    for(i=0; i<=c; i++)
    {
        for(j=0; j < len; j++)
        {
            int index = i-arr[j];
            if(index < 0)
                continue;

            if(min[index]+1 < min[i])
                min[i] = min[index]+1;
        }
    }

    //min[c] stays INFINITY if the amount cannot be
    //formed from the given denominations
    return min[c];
}

int main()
{
    int arr[] = {4, 5, 7};
    printf("COINS: %d\n", coins(arr, 3, 166));

    return 0;
}
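
The routine above only reports how many coins are needed. To also list which coins make up the change, the same DP can record, for each amount, the coin that was taken last, and then walk backwards from c. The sketch below is not part of the original post; the names coins_used, best and last are illustrative, and it assumes 0 <= c < 1000 as in the code above.

#include <stdio.h>

#define INF 99999

//Sketch: same DP as coins() above, but last[i] remembers the coin
//taken last for amount i, so the actual coins can be printed.
void coins_used(int* arr, int len, int c)
{
    int best[1000];   //best[i] = fewest coins for amount i
    int last[1000];   //last[i] = coin used last to reach amount i
    int i, j;

    for(i = 0; i <= c; i++)
    {
        best[i] = INF;
        last[i] = -1;
    }
    best[0] = 0;

    for(i = 1; i <= c; i++)
    {
        for(j = 0; j < len; j++)
        {
            if(arr[j] <= i && best[i - arr[j]] + 1 < best[i])
            {
                best[i] = best[i - arr[j]] + 1;
                last[i] = arr[j];   //remember which coin achieved the optimum
            }
        }
    }

    if(best[c] == INF)
    {
        printf("no change possible for %d\n", c);
        return;
    }

    printf("%d coins:", best[c]);
    while(c > 0)   //follow the recorded choices back down to 0
    {
        printf(" %d", last[c]);
        c -= last[c];
    }
    printf("\n");
}

Calling coins_used(arr, 3, 166) from the main above should print a 24-coin solution (one 5 and twenty-three 7s) along with the count.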
