
Computer logic problem

Tags:

logic

Consider a multilevel computer in which all the levels are different. Each level has instructions that are m times as powerful as those of the level below it; that is, one level r instruction can do the work of m level r - 1 instructions. If a level 1 program requires k seconds to run, how long would equivalent programs take at levels 2, 3, and 4, assuming n level r instructions are required to interpret a single level r + 1 instruction?

This is the solution I came up with. Can anyone verify or comment?

Level (r)    Level-1 instructions per level-r instruction    Time
4            m^3                                             t(q) = (m^3*q + n + n*m + n*m^2) * (k/q)
3            m^2                                             t(q) = (m^2*q + n + n*m) * (k/q)
2            m                                               t(q) = (m*q + n) * (k/q)
1            1                                               t(q) = k

To calculate the runtime t(q) of a program containing q level-1 instructions, we must account for two things: the exponentially increasing number of level-1 instructions that each level r instruction represents (shown as m^(r-1)), and the additional level-1 instructions required for interpretation at each layer on which the program is executed (shown as nm^(r-1)). For r > 2, the level-1 instructions used for interpretation by the lower levels must also be added into the final equation. Finally, for each equation we obtain the number of seconds the program takes to run by multiplying the total number of level-1 instructions used by the execution time of a single level-1 instruction, which is k/q.
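If it helps to sanity-check these expressions numerically, here is a minimal Python sketch of the formulas in the table above. The function name runtime and the values of m, n, q, and k are placeholders of my own choosing, purely for illustration; they are not part of the original problem.

    # Minimal sketch of the formulas in the table above.
    # m, n, q, k below are placeholder values for illustration only.
    def runtime(r, m, n, q, k):
        """Runtime at level r per the table: total level-1 instructions
        (work plus interpretation overhead) times the cost k/q of one
        level-1 instruction."""
        work = (m ** (r - 1)) * q                         # m^(r-1) * q
        overhead = sum(n * m ** i for i in range(r - 1))  # n + n*m + ... + n*m^(r-2)
        return (work + overhead) * (k / q)

    m, n, q, k = 2, 4, 100, 10.0
    for r in (1, 2, 3, 4):
        print("level", r, "->", runtime(r, m, n, q, k), "seconds")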

Disclaimer: This IS homework, but the assignment has already been handed in. I simply cannot get the semantics of this problem, and I would really like to understand it.

asked Sep 10 '11 by MarathonStudios


1 Answer

I think you are all making this too complicated. The problem statement says, in other words, that each level runs m times faster than the level below it. Hence a level 2 program completes in 1/m the time, a level 3 program in 1/m * 1/m, and so on. So for a program at level r the final equation is just:

t(r) = k / m^(r-1)
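As a quick numeric illustration of this simplified model (note that the interpretation overhead n does not appear in it), here is a short Python check; the values of k and m are placeholders chosen only for illustration:

    k, m = 10.0, 2                # placeholder values for illustration
    for r in (1, 2, 3, 4):
        # under this model, a level r program runs in k / m^(r-1) seconds
        print("level", r, "->", k / m ** (r - 1), "seconds")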

answered Oct 23 '22 by Gene Olson