# Open Algorithmic Problems

## Minimum-area triangles

Given a set of n points in the plane, we can find the minimum-area triangle (the smallest-area triangle whose vertices are input points) in quadratic time, but the best known lower bound is only Omega(n log n). Find a subquadratic algorithm, or prove a quadratic lower bound in any model of computation.
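As a concrete baseline, here is the obvious cubic-time algorithm in Python, trying every triple and evaluating areas with a cross product; the quadratic-time algorithm is considerably more involved. This is only an illustrative sketch, not the quadratic algorithm itself.

```python
from itertools import combinations

def triangle_area2(p, q, r):
    # Twice the signed area: the cross product of (q - p) and (r - p).
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def min_area_triangle(points):
    # Cubic-time brute force over all triples of input points.
    best = None
    for p, q, r in combinations(points, 3):
        area = abs(triangle_area2(p, q, r)) / 2.0
        if best is None or area < best:
            best = area
    return best
```

Note that the minimum is 0 exactly when some three points are collinear, which is why the problem is at least as hard as detecting collinear triples.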

This problem is clearly 3SUM-hard, since it's harder than finding collinear triples in the plane. Thus, a subquadratic algorithm seems unlikely. I have established quadratic lower bounds for the collinear points problem in several (slightly) different models of computation, but the minimum-area triangle problem cannot even be solved in the models in which my lower bounds hold.
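The reduction behind this hardness claim is easy to make concrete: lifting each 3SUM input x to the point (x, x^3) on a cubic curve turns "a + b + c = 0" into a collinearity, since for distinct a, b, c the lifted points are collinear exactly when a + b + c = 0. A small Python sketch (brute-force, for illustration only; it assumes the inputs are distinct):

```python
def lift(x):
    # Place each 3SUM input number on the cubic curve y = x^3.
    return (x, x ** 3)

def collinear(p, q, r):
    # Three points are collinear iff the cross product of (q - p) and
    # (r - p) vanishes.
    return (q[0]-p[0]) * (r[1]-p[1]) - (q[1]-p[1]) * (r[0]-p[0]) == 0

def has_3sum(nums):
    # Does some triple of distinct inputs satisfy a + b + c = 0?
    # Phrased as a collinearity test on the lifted points.
    pts = [lift(x) for x in nums]
    n = len(pts)
    return any(collinear(pts[i], pts[j], pts[k])
               for i in range(n)
               for j in range(i + 1, n)
               for k in range(j + 1, n))
```

The slope between (a, a^3) and (b, b^3) is a^2 + ab + b^2, and equating the slopes of two chords factors as (a - c)(a + b + c) = 0, which gives the equivalence.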

In d dimensions, the minimum-volume simplex can be found in O(n^d) time and O(n^2) space, but again, the best known lower bound is only Omega(n log n).
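For reference, evaluating the volume of a single d-simplex is the easy part: it is |det(v1 - v0, ..., vd - v0)| / d!. A Python sketch using exact rational arithmetic (the difficulty of the problem is in searching over the candidate simplices, not in this evaluation):

```python
from fractions import Fraction
from math import factorial

def simplex_volume(vertices):
    # `vertices` is a list of d+1 points in R^d with rational coordinates.
    # Returns |det(v1 - v0, ..., vd - v0)| / d!, computed exactly by
    # Gaussian elimination over the rationals.
    d = len(vertices) - 1
    m = [[Fraction(vertices[i + 1][j] - vertices[0][j]) for j in range(d)]
         for i in range(d)]
    det = Fraction(1)
    for col in range(d):
        pivot = next((r for r in range(col, d) if m[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)          # degenerate (flat) simplex
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            det = -det                   # row swap flips the sign
        det *= m[col][col]
        for r in range(col + 1, d):
            f = m[r][col] / m[col][col]
            for c in range(col, d):
                m[r][c] -= f * m[col][c]
    return abs(det) / factorial(d)
```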

Leonid Khachiyan [J. Complexity 11:138-163 (1995)] showed that, if the dimension is not fixed, finding the minimum-volume simplex, or even approximating its volume to within a factor of 2^poly(n,d), is NP-hard.

[This is the problem that first led me to study lower bounds. I'm not much closer to a solution than when I started!]

## Complex collinearities [David Eppstein]

Given a set of n points in the complex plane C^2, how quickly can we determine whether any three lie on a complex line? (Recall that C^2 is isomorphic to R^4, and a complex line is isomorphic to a 2-flat. Every triple of points in R^4 lies on a 2-flat, but not all of these flats are isomorphic to complex lines.) There is a simple algorithm that runs in time O(n^2 log n). The challenge is to get rid of the log factor in the running time. There is a quadratic-time algorithm for the corresponding problem over the reals, but it does not generalize to the complex case.
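The underlying predicate is a 2x2 determinant over C: points p, q, r of C^2 lie on a common complex line exactly when det(q - p, r - p) = 0. Here is a brute-force cubic-time Python sketch using the built-in complex type, just to pin down the predicate (the faster algorithms, of course, avoid examining every triple):

```python
def complex_collinear(p, q, r):
    # p, q, r are pairs of Python complex numbers, i.e. points of C^2.
    # They lie on a common complex line iff the 2x2 complex determinant
    # of the vectors (q - p) and (r - p) vanishes.
    det = (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])
    return det == 0

def any_complex_collinear(points):
    # Cubic-time baseline over all triples.
    n = len(points)
    return any(complex_collinear(points[i], points[j], points[k])
               for i in range(n)
               for j in range(i + 1, n)
               for k in range(j + 1, n))
```

The second test below shows three points that span a 2-flat of R^4 but not a complex line, matching the parenthetical remark above.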

On the other hand, the best known lower bound for this problem is only Omega(n log n). The challenge here is to increase the lower bound to Omega(n^2) in any reasonable model of computation. The problem is clearly 3SUM-hard, since it's harder than the corresponding problem over the reals. I have established a quadratic lower bound for the real problem, but the complex problem cannot even be solved in the models in which my lower bound holds.

## Extreme points

Given a set of n points in R^d, for some d > 3, determine whether or not every point is a vertex of the set's convex hull. The best known algorithm for this problem, developed by Timothy Chan using results of Jiří Matoušek, runs in time

O( n^(2⌊d/2⌋ / (⌊d/2⌋ + 1)) polylog n ).

Except for the polylog factor, this algorithm is almost certainly optimal, but the best known lower bound is only Omega(n log n). Prove a better lower bound in any model of computation, or derive a significantly faster algorithm.
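For intuition, here is the planar version of the decision problem: in the plane, every point is extreme exactly when the convex hull has n vertices. A Python sketch using Andrew's monotone chain (the interesting case d > 3 needs halfspace machinery instead, which is where the bound above comes from):

```python
def cross(o, a, b):
    # Cross product of (a - o) and (b - o); positive for a left turn.
    return (a[0]-o[0]) * (b[1]-o[1]) - (a[1]-o[1]) * (b[0]-o[0])

def convex_hull(points):
    # Andrew's monotone chain. Returns hull *vertices* in ccw order;
    # points in the interior of a hull edge are excluded.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = half(pts), half(reversed(pts))
    return lower[:-1] + upper[:-1]

def all_extreme(points):
    # Every point is extreme iff the hull has as many vertices as there
    # are distinct input points.
    return len(convex_hull(points)) == len(set(points))
```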

This question is (probably) related to the polytope questions on the other page.

## A dynamic programming problem [David Eppstein]

Given a sorted list of n real numbers, find the smallest interval containing exactly k elements, for all values of k between 1 and n. There is a simple dynamic programming algorithm that solves this problem in quadratic time, but the best known lower bound is only linear. Either describe a faster algorithm, or prove a bigger lower bound in some reasonable model of computation. ("Dynamic programming" is not a reasonable model of computation!!)
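One simple quadratic-time solution (not necessarily the dynamic program meant above) exploits the fact that an optimal interval may as well have both endpoints at input points: for each k, the answer is the minimum of a[i+k-1] - a[i] over all i, which is O(n) work per value of k. A Python sketch, assuming the inputs are distinct:

```python
def smallest_k_intervals(a):
    # `a` is a sorted list of distinct reals. For each k in 1..n, return
    # the length of the smallest interval containing exactly k elements:
    # the best window of k consecutive elements. Theta(n^2) total.
    n = len(a)
    return {k: min(a[i + k - 1] - a[i] for i in range(n - k + 1))
            for k in range(1, n + 1)}
```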

## Shortest paths in line arrangements [Marc van Kreveld]

Given a set of n lines in the plane, let s and t be two vertices of its arrangement. How quickly can we find the shortest path from s to t along the edges of the arrangement? There is an easy O(n^2 log n)-time algorithm, but the best lower bound is only Omega(n log n), by reduction from convex hull size verification. Is a subquadratic algorithm possible?

The problem is open even in the special case where the lines form two pencils, and the source and target points are corners of the distorted grid they form. See the figure. A straightforward dynamic programming algorithm runs in quadratic time. Either describe a faster algorithm, or prove a bigger lower bound in some reasonable model of computation. (Again, "dynamic programming" is not a reasonable model of computation!!)
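Once the arrangement has been built explicitly, the easy upper bound is just Dijkstra's algorithm on its O(n^2) vertices and edges. A Python sketch over a precomputed arrangement graph (constructing the arrangement itself, the other quadratic-time ingredient, is assumed done elsewhere):

```python
import heapq
from math import dist, inf

def shortest_path_length(vertices, edges, s, t):
    # `vertices` maps a vertex id to its (x, y) coordinates; `edges` is a
    # list of id pairs (arrangement edges); weights are Euclidean lengths.
    adj = {v: [] for v in vertices}
    for u, v in edges:
        w = dist(vertices[u], vertices[v])
        adj[u].append((v, w))
        adj[v].append((u, w))
    best = {s: 0.0}
    pq = [(0.0, s)]                      # (distance so far, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:
            return d
        if d > best.get(u, inf):
            continue                     # stale queue entry
        for v, w in adj[u]:
            nd = d + w
            if nd < best.get(v, inf):
                best[v] = nd
                heapq.heappush(pq, (nd, v))
    return inf                           # t unreachable from s
```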

Bose, Evans, Kirkpatrick, McAllister, and Snoeyink have an approximation algorithm that runs in O(n log n) time and finds a path at most twice as long as the shortest path.

Of course, similar questions can be asked about arrangements of line segments. David Eppstein and David Hart recently derived an algorithm that finds shortest paths in an arrangement of axis-parallel segments in O(n^(3/2)) time. See "An efficient algorithm for shortest paths in vertical and horizontal segments", Proc. 5th WADS (Lecture Notes in Comput. Sci. 1272), pp. 234-247, 1997.

Open Problems - Jeff Erickson (jeffe@cs.duke.edu) 05 Sep 97