
Throwing the fattest people off of an overloaded airplane.

This won't help for your proxy problem, however:

For 1,000,000 passengers to drop 3000 pounds of weight, each passenger must lose (3000/1000000) = 0.003 lbs. That could be achieved by jettisoning everyone's shirt, or shoes, or probably even fingernail clippings, saving everyone. This assumes the collection and jettison happen efficiently, before the required weight loss grows as the plane burns more fuel.

Actually, they don't allow fingernail clippers on board anymore, so that's out.


One way would be to use a min-heap (std::priority_queue in C++, with a suitable comparer). Here's how you'd do it, assuming you had a MinHeap class. (Yes, my example is in C#. I think you get the idea.)

int targetTotal = 3000;
int totalWeight = 0;
// this creates an empty heap!
var myHeap = new MinHeap<Passenger>(/* need comparer here to order by weight */);
foreach (var pass in passengers)
{
    if (totalWeight < targetTotal)
    {
        // unconditionally add this passenger
        myHeap.Add(pass);
        totalWeight += pass.Weight;
    }
    else if (pass.Weight > myHeap.Peek().Weight)
    {
        // If this passenger is heavier than the lightest
        // passenger already on the heap,
        // then remove the lightest passenger and add this one
        var oldPass = myHeap.RemoveFirst();
        totalWeight -= oldPass.Weight;
        myHeap.Add(pass);
        totalWeight += pass.Weight;
    }
}

// At this point, the heaviest people are on the heap,
// but there might be too many of them.
// Remove the lighter people until we have the minimum necessary
while ((totalWeight - myHeap.Peek().Weight) > targetTotal)
{
    var oldPass = myHeap.RemoveFirst();
    totalWeight -= oldPass.Weight; 
}
// The heap now contains the passengers who will be thrown overboard.
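
(For anyone who wants the C++ flavor hinted at above: a rough sketch of the same two-phase selection using std::priority_queue as a min-heap. The Passenger struct, the function name, and the double weights are assumptions made for illustration, not part of the original question.)

#include <queue>
#include <vector>

struct Passenger { double weight; };

// Keep the heaviest passengers whose combined weight meets targetTotal.
std::vector<Passenger> selectHeaviest(const std::vector<Passenger> &passengers,
                                      double targetTotal)
{
    // Comparer that makes the priority_queue a min-heap by weight,
    // so top() is the lightest passenger kept so far.
    auto cmp = [](const Passenger &a, const Passenger &b) {
        return a.weight > b.weight;
    };
    std::priority_queue<Passenger, std::vector<Passenger>, decltype(cmp)> heap(cmp);

    double totalWeight = 0.0;
    for (const auto &p : passengers) {
        if (totalWeight < targetTotal) {
            heap.push(p);                       // below the threshold: keep everyone
            totalWeight += p.weight;
        } else if (p.weight > heap.top().weight) {
            totalWeight -= heap.top().weight;   // swap out the lightest kept passenger
            heap.pop();
            heap.push(p);
            totalWeight += p.weight;
        }
    }
    // Trim: drop the lightest passengers while the rest still meet the target.
    while (!heap.empty() && totalWeight - heap.top().weight > targetTotal) {
        totalWeight -= heap.top().weight;
        heap.pop();
    }

    std::vector<Passenger> result;
    while (!heap.empty()) {
        result.push_back(heap.top());
        heap.pop();
    }
    return result;
}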

According to the standard references, running time should be proportional to n log k, where n is the number of passengers and k is the maximum number of items on the heap. If we assume that passengers' weights will typically be 100 lbs or more, then it's unlikely that the heap will contain more than 30 items at any time.

The worst case would be if the passengers are presented in order from lowest weight to highest. That would require that every passenger be added to the heap, and every passenger be removed from the heap. Still, with a million passengers and assuming that the lightest weighs 100 lbs, the n log k works out to about 1,000,000 × log2(30) ≈ 5 million heap operations, which is a reasonably small number.

If you get the passengers' weights randomly, performance is much better. I use something quite like this for a recommendation engine (I select the top 200 items from a list of several million). I typically end up with only 50,000 or 70,000 items actually added to the heap.

I suspect that you'll see something quite similar: the majority of your candidates will be rejected because they're lighter than the lightest person already on the heap. And Peek is an O(1) operation.

For more information about the performance of heap select and quick select, see When theory meets practice. Short version: if you're selecting fewer than 1% of the total number of items, then heap select is a clear winner over quick select. More than 1%, then use quick select or a variant like Introselect.
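
(As a rough illustration of the quick select side, here's a sketch that selects the k heaviest weights with std::nth_element; k and the weights vector are made-up inputs, not anything from the question.)

#include <algorithm>
#include <functional>
#include <vector>

// Partition-based selection (quick select) of the k heaviest weights.
// Expected O(n), but it reorders the input copy it works on.
std::vector<double> kHeaviest(std::vector<double> weights, std::size_t k)
{
    if (k >= weights.size())
        return weights;
    std::nth_element(weights.begin(), weights.begin() + k, weights.end(),
                     std::greater<double>());   // the k heaviest end up first
    weights.resize(k);
    return weights;
}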


Below is a rather simple implementation of the straightforward solution. I don't think there is a faster way that is 100% correct.

#include <set>

size_t total = 0;
std::set<passenger> dead;   // assumes passenger is ordered by weight, lightest first
for (const auto &p : passengers) {
    if (dead.empty()) {
        dead.insert(p);
        total += p.weight;
        continue;
    }
    if (total < threshold || p.weight > dead.begin()->weight)
    {
        dead.insert(p);
        total += p.weight;
        // "Save" the lightest dead people while the rest still meet the threshold.
        while (total > threshold)
        {
            if (total - dead.begin()->weight < threshold)
                break;
            total -= dead.begin()->weight;
            dead.erase(dead.begin());
        }
    }
}

This works by filling up the set of "dead people" until it meets the threshold. Once the threshold is met, we keep going through the list of passengers, looking for any who are heavier than the lightest dead person. When we find one, we add them to the set and then start "saving" the lightest people off the set until we can't save any more.

In the worst case, this will perform about the same as a sort of the entire list. But in the best case (the "dead list" is filled up properly by the first X people), it will run in O(n).
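
For the snippet above to compile, passenger needs a strict weak ordering by weight so that dead.begin() is always the lightest. A minimal sketch of one possible definition (the field names are assumptions), with an id as tie-breaker so two passengers of equal weight don't collapse into a single std::set entry:

struct passenger {
    int id;           // tie-breaker: distinct passengers with equal weight stay distinct keys
    size_t weight;

    bool operator<(const passenger &other) const {
        if (weight != other.weight)
            return weight < other.weight;   // order by weight, lightest first
        return id < other.id;
    }
};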


Assuming all passengers will cooperate: Use a parallel sorting network. (see also this)

Here is a live demonstration

Update: Alternative video (jump to 1:00)

Asking pairs of people to compare-exchange - you can't get faster than this.
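
(For intuition, here's a sketch of one simple sorting network, odd-even transposition sort. Within each round the compare-exchange pairs are disjoint, so with cooperative passengers every pair in a round can act simultaneously; after sorting by weight you would take people from the heavy end until 3000 lbs are gone. The function name and inputs are just illustrative.)

#include <algorithm>
#include <cstddef>
#include <vector>

// Odd-even transposition sort: n rounds of disjoint compare-exchanges.
// All pairs within one round are independent, so each round is one "step"
// no matter how many passengers there are.
void oddEvenTranspositionSort(std::vector<double> &weights)
{
    const std::size_t n = weights.size();
    for (std::size_t round = 0; round < n; ++round) {
        // Even rounds pair (0,1), (2,3), ...; odd rounds pair (1,2), (3,4), ...
        for (std::size_t i = round % 2; i + 1 < n; i += 2) {
            if (weights[i] > weights[i + 1])    // compare-exchange
                std::swap(weights[i], weights[i + 1]);
        }
    }
}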


@Blastfurnace was on the right track. You use quickselect where the pivots are weight thresholds. Each partition splits one set of people into two sets, and returns the total weight for each set. You continue splitting the appropriate bucket until the buckets corresponding to the highest-weight people total over 3000 pounds, and the lowest bucket in that group has 1 person (that is, it can't be split any further).

This algorithm runs in expected linear time, but quadratic worst case. I think it is the only linear-time algorithm.


Here's a Python solution that illustrates this algorithm:

#!/usr/bin/env python
import math
import numpy as np
import random

OVERWEIGHT = 3000.0
# 100 random passenger weights (gamma-distributed, rounded down to 0.1 lb)
in_trouble = [math.floor(x * 10) / 10
              for x in np.random.standard_gamma(16.0, 100) * 8.0]
dead = []
spared = []

dead_weight = 0.0

while in_trouble:
    # Pick a pivot: the median of up to five distinct sampled weights.
    m = np.median(list(set(random.sample(in_trouble, min(len(in_trouble), 5)))))
    print("Partitioning with pivot:", m)
    lighter_partition = []
    heavier_partition = []
    heavier_partition_weight = 0.0
    in_trouble_is_indivisible = True
    for p in in_trouble:
        if p < m:
            lighter_partition.append(p)
        else:
            heavier_partition.append(p)
            heavier_partition_weight += p
        if p != m:
            in_trouble_is_indivisible = False
    if heavier_partition_weight + dead_weight >= OVERWEIGHT and not in_trouble_is_indivisible:
        # The heavier bucket alone is enough: spare the lighter bucket and
        # keep splitting the heavier one so as few people as possible die.
        spared += lighter_partition
        in_trouble = heavier_partition
    else:
        # The heavier bucket isn't enough (or can't be split further):
        # they all go overboard, and we keep working on the lighter bucket.
        dead += heavier_partition
        dead_weight += heavier_partition_weight
        in_trouble = lighter_partition

print("weight of dead people: {}; spared people: {}".format(
    dead_weight, sum(spared)))
print("Dead: ", dead)
print("Spared: ", spared)

Output:

Partitioning with pivot: 121.2
Partitioning with pivot: 158.9
Partitioning with pivot: 168.8
Partitioning with pivot: 161.5
Partitioning with pivot: 159.7
Partitioning with pivot: 158.9
weight of dead people: 3051.7; spared people: 9551.7
Dead:  [179.1, 182.5, 179.2, 171.6, 169.9, 179.9, 168.8, 172.2, 169.9, 179.6, 164.4, 164.8, 161.5, 163.1, 165.7, 160.9, 159.7, 158.9]
Spared:  [82.2, 91.9, 94.7, 116.5, 108.2, 78.9, 83.1, 114.6, 87.7, 103.0, 106.0, 102.3, 104.9, 117.0, 96.7, 109.2, 98.0, 108.4, 99.0, 96.8, 90.7, 79.4, 101.7, 119.3, 87.2, 114.7, 90.0, 84.7, 83.5, 84.7, 111.0, 118.1, 112.1, 92.5, 100.9, 114.1, 114.7, 114.1, 113.7, 99.4, 79.3, 100.1, 82.6, 108.9, 103.5, 89.5, 121.8, 156.1, 121.4, 130.3, 157.4, 138.9, 143.0, 145.1, 125.1, 138.5, 143.8, 146.8, 140.1, 136.9, 123.1, 140.2, 153.6, 138.6, 146.5, 143.6, 130.8, 155.7, 128.9, 143.8, 124.0, 134.0, 145.0, 136.0, 121.2, 133.4, 144.0, 126.3, 127.0, 148.3, 144.9, 128.1]