Frequently Asked Questions

When is a square root irrational?

The answer is very simple: if the square root of a natural number is not an integer, then it is irrational.

From y² = n we get y = √n (taking the positive root).
Alternatively, (y²)^½ = n^½, so y = n^½.
Therefore √n = n^½.
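The claim above can be checked numerically. Here is a minimal Python sketch (the function name `sqrt_is_irrational` is my own) that uses exact integer arithmetic via `math.isqrt`, so there is no floating-point error:

```python
from math import isqrt

def sqrt_is_irrational(n: int) -> bool:
    """By the claim above: √n is irrational iff it is not an integer."""
    r = isqrt(n)           # floor of the square root, computed exactly
    return r * r != n      # non-integer square root => irrational

# √2 and √10 are irrational; √4 and √144 are integers.
```

Because `isqrt` works on arbitrary-precision integers, this test is reliable even for very large n, unlike `n ** 0.5`.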

In general, n^(1/q) is the qth root of n, and from this follows a remarkable and general result, which we shall now prove.

If n and q are natural numbers and n^(1/q) (the qth root of n) is non-integer, then it is irrational.

Given that n^(1/q) is non-integer and n, q are natural numbers, suppose for contradiction that it is rational.

Let n^(1/q) = a/b, where a and b are natural numbers with no common factors; clearly b cannot equal 1, since n^(1/q) is not an integer.

Raising both sides to the qth power, (n^(1/q))^q = (a/b)^q, which gives n = a^q/b^q.

As the LHS is an integer, the RHS must also be an integer. But a and b have no common factors, so b^q cannot divide a^q (see note) unless b^q = 1, which is a contradiction. Hence there is no ratio of natural numbers equal to n^(1/q); it cannot be rational, and so it must be irrational.
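The theorem reduces "is n^(1/q) irrational?" to "is the qth root of n an integer?", which can be decided exactly. A minimal sketch (the helper names `iroot` and `qth_root_is_irrational` are my own) using integer binary search, so no floating-point rounding is involved:

```python
def iroot(n: int, q: int) -> int:
    """Floor of the qth root of n, by exact integer binary search."""
    lo, hi = 0, max(n, 1)
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mid ** q <= n:
            lo = mid
        else:
            hi = mid - 1
    return lo

def qth_root_is_irrational(n: int, q: int) -> bool:
    """By the theorem: n^(1/q) is irrational iff it is not an integer."""
    r = iroot(n, q)
    return r ** q != n

# The cube root of 2 is irrational; the cube root of 27 is the integer 3.
```

Binary search keeps the check exact for arbitrarily large n, whereas `n ** (1 / q)` would suffer rounding error.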

NOTE: The statement that b^q cannot divide a^q is based on the Fundamental Theorem of Arithmetic. For example, if a = 5 and b = 2 (which have no common factors), then no amount of multiplying 2 by itself (giving 2^q) will ever produce the factor of 5 needed, and so 2^q will never divide 5^q.

If n, p, q are natural numbers and (n^p)^(1/q) is non-integer, then it must be irrational (by the result just proved). As (n^p)^(1/q) = n^(p/q), it follows that if the result of raising n to any rational power p/q is non-integer, it must be irrational.
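This generalization can be tested the same way: compute m = n^p exactly, then ask whether m has an integer qth root. A minimal sketch (the helper names `qth_power_floor` and `rational_power_is_irrational` are my own; the root search is a simple linear scan, adequate for small demonstrations):

```python
def qth_power_floor(m: int, q: int) -> int:
    """Largest integer r with r**q <= m (simple linear scan)."""
    r = 0
    while (r + 1) ** q <= m:
        r += 1
    return r

def rational_power_is_irrational(n: int, p: int, q: int) -> bool:
    """n^(p/q) is irrational iff the qth root of n**p is non-integer."""
    m = n ** p
    r = qth_power_floor(m, q)
    return r ** q != m

# 2^(3/2) = √8 is irrational, but 4^(3/2) = √64 = 8 is an integer.
```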