Intro to Algorithm Analysis with some simple Loop Analysis

  • Consider the following program fragment that processes an input array of n elements:

       double sum = 0;
    
       for ( int i = 0; i < n; i++ )
           sum += array[i];  

  • How many times is the loop body executed?

    • ???

Intro to Algorithm Analysis with some simple Loop Analysis

  • Consider the following program fragment that processes an input array of n elements:

       double sum = 0;
    
       for ( int i = 0; i < n; i++ )
           sum += array[i];  

  • How many times is the loop body executed?

    • n

Intro to Algorithm Analysis with some simple Loop Analysis

  • Consider the following program fragment that processes an input array of n elements:

       double sum = 0;
    
       for ( int i = 0; i < n; i = i + 2 )
           sum += array[i];  

  • How many times is the loop body executed?

    • ???

Intro to Algorithm Analysis with some simple Loop Analysis

  • Consider the following program fragment that processes an input array of n elements:

       double sum = 0;
    
       for ( int i = 0; i < n; i = i + 2 )
           sum += array[i];  

  • How many times is the loop body executed?

    • ⌈n/2⌉ (exactly n/2 when n is even)

Intro to Algorithm Analysis with some simple Loop Analysis

  • Consider the following program fragment that processes an input array of n elements:

       double sum = 0;
    
       for ( int i = 0; i < n; i++ )
           for ( int j = 0; j < n; j++ )
               sum += array[i]*array[j];  

  • How many times is the loop body executed?

    • ???

Intro to Algorithm Analysis with some simple Loop Analysis

  • Consider the following program fragment that processes an input array of n elements:

       double sum = 0;
    
       for ( int i = 0; i < n; i++ )
           for ( int j = 0; j < n; j++ )
               sum += array[i]*array[j];  

  • How many times is the loop body executed?

    • n × n = n²

Characterizing the running time (complexity) of algorithms

  • The running time of an algorithm is a function of the input size

  • We saw in the previous examples that the relationship (= function) between running time and input size (n) can be of these forms:

      Example 1:     n
      Example 2:     n/2
      Example 3:     n²

  • The running time in terms of input size n can be a general mathematical function

    E.g.:

        3n² + 20n + 5 

  • However, we are mostly interested in the:

    • "Order" of the growth function (and not the exact formula)

An approximate definition of the order of growth

  • Definition:

     Given two functions f(n) and g(n):
    
                             f(n)
          f(n) ~ g(n)   if   ---- ---> 1   when n ---> ∞
                             g(n)
    

  • Examples:

        2n + 10 ~ 2n                 
    
            Try n = 100000:    2n + 10 = 200010
                               2n      = 200000
    
        3n³ + 20n² + 5 ~ 3n³ 
    
            Try n = 100000:    3n³ + 20n² + 5 = 3000200000000005
                               3n³            = 3000000000000000 

  • In run time analysis, we can ignore the less significant terms

Mathematical definition of Order of growth

  • Definition:

    • Given functions f(n) and g(n).

      The function f(n) is O(g(n)) (order of g(n)) if:

          ∃ c > 0 and n₀ ≥ 1:   f(n) ≤ c·g(n)   for all n ≥ n₀
      

  • We call O(g(n))

    • "Big-Oh" of g

  • So a function f(n) is Big-Oh of g(n) if:

    • f(n) ≤ c·g(n) (i.e. f(n) is "dominated" by some constant multiple of g(n)) for all sufficiently large n

Mathematical definition of Order of growth

  • Example 1:

        f(n) = 2n + 10     is   O(n)

    Graphs of 3n, f(n) = 2n + 10 and g(n) = n:


    For n ≥ 10: 2n + 10 ≤ 3n
    Therefore, we can find c = 3 and n₀ = 10 for which f(n) ≤ c·g(n) when n ≥ n₀

Mathematical definition of Order of growth

  • Example 2:

        f(n) = n²     is NOT  O(n)

    Graphs of f(n) = n², 100n, 10n and g(n) = n:


    To bound n² ≤ c·n, we would need: n ≤ c for all n.
    No constant c can satisfy this (e.g.: picking n = c + 1 gives n > c)

Big-Oh and Growth rate

  • The Big-Oh notation gives an upper bound on the growth rate of a function f(n) (that represents the run time complexity of some algorithm)

  • If f(n) is O(g(n)) then:

    • The growth rate of f(n) is no more than the growth rate of g(n)

  • In Algorithm Analysis, we use O(g(n)) to rank (= categorize) functions by their growth rate

    Example:

      2n + 4, 7n + 9, 10000n + 999  are all O(n)

    I.e.: all these functions grow at the same rate

Summary: How to perform algorithm analysis

  • Determine the frequency (= how often it is executed) of each primitive operation

    • Primitive operation corresponds to a low-level (basic) computation with a constant execution time (= without a loop)

      Examples:

      • Evaluating an expression,
      • Assigning a value to a variable,
      • Combination of the above operations

  • Characterize it as a function of the input size (n)

    • Retain only the fastest-growing term

  • Consider the worst case behavior and if possible, the average behavior

Summary: how to obtain the Big Oh notation

  1. Derive the cost function f(n) of your algorithm

     Example:    3n² + 5n + 10
    

  2. Throw away all lower-order terms [tilde notation]

     Example:    ~ 3n²
    

  3. Drop the constant factor [Big-Oh notation]

     Example:    O(n²)
    

  • Common running times:

    • O(1): constant time
    • O(log(n)): logarithmic
    • O(n): linear
    • O(n log(n)): log-linear
    • O(n²): quadratic