Robert Eisele
Engineer, Systems Architect and DBA

Codingame: Apple tree

Original Problem


There are N apples on the tree. Every apple is a sphere with position (x, y, z) and radius r.
Then the i-th apple begins to fall straight down and can collide with others. When a static apple gets hit by a falling one, it begins to fall too, and the falling apple continues to fall straight down.
Your task is to determine how many apples will remain on the tree.

"Down" direction is vector (0, 0, -1), i.e. apple with position (0,0,10) is higher than (0,0,5)
Line 1 Two integers N and i – the number of apples and index of the falling apple
Next N lines Four space-separated integers XYZ and R – position and radius of the apple
A single integer – number of remaining apples
1 ≤ N < 100
0 ≤ i < N
-1000000 ≤ xyz ≤ 1000000
1 ≤ r < 1000000


This problem can easily be made computationally expensive, with a high-polynomial or even exponential runtime. The best algorithm I was able to come up with runs in \(O(n^2)\) and goes like this: we push all apples to a list for simple lookup by index, add all indices except the initially falling apple to a maybe set \(M\), and add the initial index to a working set \(W\).

For each \(w\in W\) we hit-test every \(m\in M\). Every hit lets us remove the element from the maybe set and add it to the working set. Once all elements of the working set, including the ones added along the way, have been processed, the elements still in the maybe set are exactly the apples that remain on the tree. This algorithm can be implemented like this:

var inputs = readline().split(' ');
var N = +inputs[0];
var I = +inputs[1];

var apples = [];

var M = new Set; // maybe set
var W = new Set; // working set

for (var i = 0; i < N; i++) {
  var inputs = readline().split(' ').map(Number);

  apples.push({
    x: inputs[0],
    y: inputs[1],
    z: inputs[2],
    r: inputs[3]
  });

  if (I !== i) M.add(i);
  else         W.add(i);
}

W.forEach(w => {
  M.forEach(m => {
    if (hits(apples[w], apples[m])) {
      M.delete(m);
      W.add(m); // the newly falling apple is visited later in this same loop
    }
  });
});

console.log(M.size);
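This relies on a guaranteed detail of the ECMAScript spec: `Set.prototype.forEach` visits values that are added to the set during iteration, in insertion order. So apples knocked loose while processing \(w\) are themselves processed by the same outer loop, with no explicit queue needed. A minimal demonstration:

```javascript
// Values added to a Set while forEach runs are still visited, in insertion order.
var s = new Set([1]);
var visited = [];
s.forEach(v => {
  visited.push(v);
  if (v < 3) s.add(v + 1); // extend the set mid-iteration
});
console.log(visited); // [1, 2, 3]
```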

The remaining problem is the hit test when an apple falls vertically toward the ground. When we look at the x-y plane from the top, we only see circles. So the horizontal distance between the centers of the falling apple and another apple must be less than the sum of their radii for a collision to be possible at all; in 3D this can be pictured as two intersecting vertical tubes. But there is more: since we work in 3D, we also need to take the z-axis into account. We know the direction vector \(\mathbf{d}=[0, 0, -1]^T\) and the vector \(\mathbf{c}\) between the center points of the apples. The dot product states that \(\mathbf{c}\cdot\mathbf{d} = |\mathbf{c}|\cdot|\mathbf{d}|\cos\theta\), from which follows that the angle is \(\theta = \cos^{-1}\left(-\frac{\Delta z}{|\mathbf{c}|}\right)\), where \(\Delta z\) is the height difference of the apples. Since we require the angle \(\theta\) to be \(<90^{\circ}\), it follows that \(\Delta z < 0\), which makes sense: it just says that the z-coordinate of the apple being tested must be below the falling apple for a straight fall to affect it. Implemented, we get:

function hits(a, b) {

  var dx = b.x - a.x;
  var dy = b.y - a.y;
  var pr = b.r + a.r;
  // touch horizontally?
  if ((dx * dx + dy * dy) > (pr * pr)) {
    return false;
  }
  // b must be below the falling apple a
  return b.z < a.z;
}
