Introduction

In two dimensions, given two lines, each defined by two points, you can usually find the point where they intersect. Wikipedia gives the coordinates¹ in terms of determinants, but not a derivation.

Derivation

Suppose \(\textbf{a}\) and \(\textbf{b}\) lie on the first line, and \(\textbf{c}\) and \(\textbf{d}\) on the second. Assuming the lines aren't parallel, the direction vectors \(\textbf{b} - \textbf{a}\) and \(\textbf{d} - \textbf{c}\) are linearly independent and span the plane, so we can write a general point as

\[ \textbf{x} = \mu \, (\textbf{b} - \textbf{a}) + \nu \, (\textbf{d} - \textbf{c}). \]

So to find \(\textbf{x}\) we just need to find \(\mu\) and \(\nu\). We can get rid of \(\nu\) by taking the dot-product with \(\textbf{d} - \textbf{c}\) rotated by \(\pi / 2\), but the notation gets a bit messy. Equivalently, we can embed the 2D vectors in the \(z = 0\) plane of a 3D space and take the cross-product with \(\textbf{d} - \textbf{c}\):

\[ (\textbf{d} - \textbf{c}) \times \textbf{x} = \mu \, (\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a}). \]
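The \(\nu\) term has dropped out because \((\textbf{d} - \textbf{c}) \times (\textbf{d} - \textbf{c}) = 0\). For 2D vectors embedded at \(z = 0\) like this, the cross-product has only a \(z\)-component:

\[ \textbf{u} \times \textbf{v} = \left(0,\; 0,\; u_x v_y - u_y v_x\right). \]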

Now, \(\textbf{x}\) lies on the line through \(\textbf{c}\) and \(\textbf{d}\), so

\[ \begin{align} \left(\textbf{d} - \textbf{c}\right) \times \left(\textbf{x} - \textbf{d}\right) &= 0,\\ \left(\textbf{d} - \textbf{c}\right) \times \textbf{x} - \left(\textbf{d} - \textbf{c}\right) \times \textbf{d} &= 0,\\ \left(\textbf{d} - \textbf{c}\right) \times \textbf{x} &= \left(\textbf{d} - \textbf{c}\right) \times \textbf{d},\\ \left(\textbf{d} - \textbf{c}\right) \times \textbf{x} &= \textbf{d} \times \textbf{c}. \end{align} \]

Substituting this into the earlier equation gives

\[ \mu \, (\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a}) = \textbf{d} \times \textbf{c}. \]

Both cross-products point along \(\textbf{z}\), so we can divide one by the other with the understanding that we're really dividing their \(z\)-components,

\[ \mu = \frac{\textbf{d} \times \textbf{c}}{(\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a})}. \]
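Written out in coordinates, treating each 2D cross-product as the \(z\)-component above, this is

\[ \mu = \frac{x_d \, y_c - y_d \, x_c}{(x_d - x_c)(y_b - y_a) - (y_d - y_c)(x_b - x_a)}. \]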

By a similar analysis we also have,

\[ \nu = \frac{\textbf{b} \times \textbf{a}}{(\textbf{b} - \textbf{a}) \times (\textbf{d} - \textbf{c})}. \]
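Note that the denominator here is the reverse of the one in \(\mu\); since the cross-product is antisymmetric, reversing it just flips the sign:

\[ \nu = -\frac{\textbf{b} \times \textbf{a}}{(\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a})}. \]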

And thus,

\[ \textbf{x} = \frac{\textbf{d} \times \textbf{c}}{(\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a})} \, (\textbf{b} - \textbf{a})- \frac{\textbf{b} \times \textbf{a}}{(\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a})} \, (\textbf{d} - \textbf{c}). \]
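As a quick sanity check, take \(\textbf{a} = (0,0)\), \(\textbf{b} = (1,1)\), \(\textbf{c} = (0,1)\) and \(\textbf{d} = (1,0)\), i.e. the lines \(y = x\) and \(y = 1 - x\). Then \(\textbf{d} \times \textbf{c} = 1\), \(\textbf{b} \times \textbf{a} = 0\) and \((\textbf{d} - \textbf{c}) \times (\textbf{b} - \textbf{a}) = 2\), so

\[ \textbf{x} = \tfrac{1}{2} \, (1, 1) - 0 \, (1, -1) = \left(\tfrac{1}{2}, \tfrac{1}{2}\right), \]

as expected.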

Assumptions

The main assumption is that the cross-product in the denominator doesn’t vanish, i.e.,

\[ (\textbf{b} - \textbf{a}) \times (\textbf{d} - \textbf{c}) \neq 0, \]

which is equivalent to saying that the lines aren't parallel. With finite-precision arithmetic, though, one also needs to beware of the case where the lines are almost parallel: the cross-product is then tiny, and dividing by it amplifies any rounding error.
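One reasonable guard (the tolerance \(\varepsilon\) here is an arbitrary choice, not something fixed by the mathematics) is to accept the intersection only when the cross-product is large relative to the lengths of the direction vectors,

\[ \left| (\textbf{b} - \textbf{a}) \times (\textbf{d} - \textbf{c}) \right| > \varepsilon \, \lVert \textbf{b} - \textbf{a} \rVert \, \lVert \textbf{d} - \textbf{c} \rVert; \]

a sketch of this appears after the implementation below.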

Implementation

Here’s a trivial implementation in stand-alone Haskell:

-- Our vector
type V2 = (Double,Double)

-- Subtraction
(.-.) :: V2 -> V2 -> V2
(xa,ya) .-. (xb,yb) = (xa - xb, ya - yb)

-- Scalar multiplication
(*.) :: Double -> V2 -> V2
a *. (x,y) = (a * x, a * y)

-- z-component of cross-product
(.^.) :: V2 -> V2 -> Double
(xa,ya) .^. (xb,yb) = xa * yb - xb * ya
									
-- Intersection of the line through a and b with the line through c and d
intersect :: V2 -> V2 -> V2 -> V2 -> V2
intersect a b c d = (s *. ba) .-. (t *. dc)
            where ba = b .-. a
                  dc = d .-. c
                  z  = dc .^. ba
                  s  = d .^. c / z
                  t  = b .^. a / z
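
Finally, here's a sketch of the near-parallel guard from the Assumptions section. The name intersectMaybe and the tolerance 1e-12 are arbitrary choices, and the code assumes the definitions above are in scope:

-- Guarded variant: Nothing when the directions are degenerate or
-- (nearly) parallel, Just the intersection point otherwise.
intersectMaybe :: V2 -> V2 -> V2 -> V2 -> Maybe V2
intersectMaybe a b c d
    | abs z <= eps * norm ba * norm dc = Nothing
    | otherwise                        = Just ((s *. ba) .-. (t *. dc))
    where ba  = b .-. a
          dc  = d .-. c
          z   = dc .^. ba
          s   = d .^. c / z
          t   = b .^. a / z
          eps = 1e-12
          norm (x,y) = sqrt (x * x + y * y)

As a quick check in GHCi, intersect (0,0) (1,1) (0,1) (1,0) evaluates to (0.5,0.5), matching the worked example above.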