Why do we need a formal limit definition? You may say we already have a definition. Isn't that what we learned in limit of a function?

In fact, that was just the informal definition. Now, we'll try to arrive at a more formal one.

Why do we need a formal definition? Simply because it is precise. It allows us to clear up any misconceptions or doubts.


Let's start with the informal limit definition. In limit of a function, we said that the limit of a function f(x) as x approaches a value "a" is simply the value f(x) approaches.

Now, what does "approaches" even mean? I know, I know... We have an intuitive idea of what that is. For example, just by looking at this graph we can see that the function is approaching L near point a:

But now I have another question for you... Can you describe step by step how you came to the conclusion that f(x) approaches L as x approaches a? If a computer had to "decide" whether or not L is the limit of f(x), could you write the program?
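In fact, we can try exactly that experiment. The following Python sketch (the function f and the point a are made-up examples, not taken from the graph) samples x closer and closer to a and watches what f(x) does:

```python
# Naive numeric probe: sample x closer and closer to a and
# watch what f(x) approaches. (f and a are made-up examples.)
def f(x):
    return 2 * x + 1

a = 3
for h in [0.1, 0.01, 0.001, 0.0001]:
    print(f"x = {a + h}  ->  f(x) = {f(a + h)}")
# f(x) gets closer and closer to 7, but "closer and closer"
# is still not a precise, programmable criterion.
```

The table of values suggests a limit, but it doesn't decide one: how close is close enough? That is exactly the gap the formal definition fills.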

It turns out that sometimes it is hard to define precisely an intuitive idea, but let's give it a try.

First of all, we need a concept of distance. The distance between two numbers is relatively easy to define in mathematics. For example, we could define the distance between numbers "a" and "b" as the difference between them:

d(a, b) = a - b

So, for example, the distance between 3 and 2 is:

d(3, 2) = 3 - 2 = 1

We still need to improve this definition of distance, though. To see why we need to improve it, let's represent the set of all real numbers as a line:

We can see that the distance d2 is equal to distance d1. However, what would happen if we applied our definition of distance to numbers -4 and -3? We'll get that the distance is:

d(-4, -3) = -4 - (-3) = -1

We don't get the same distance as with numbers 3 and 2, but we should. We solve this by using not the difference, but its absolute value instead.

So, the distance between numbers a and b is defined as the number:

d(a, b) = |a - b|

Using this definition with numbers -4 and -3 we get:

d(-4, -3) = |-4 - (-3)| = |-1| = 1

And that's correct, because d1=d2.
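As a quick sanity check, this definition is one line of Python (the helper name `distance` is just illustrative):

```python
def distance(a, b):
    # Distance between two real numbers: the absolute value of their difference
    return abs(a - b)

print(distance(3, 2))    # 1
print(distance(-4, -3))  # 1, the same as distance(3, 2), as expected
```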

Now we have the necessary tools. Let's approach the limit definition as a kind of game. Let's consider again the function:

Let's say I'm telling you that the limit of this function as "x" approaches "a" is "L", but you don't believe me.

So, I challenge you: "Give me any distance from L, as small as you want".

You tell me: "Okay, 0.0001".

"Alright, I guarantee you that if you take any x at a distance smaller than δ (delta) from "a", but not "a" itself, then f(x) will be at a distance smaller than 0.0001 from L".

We could play this game over and over, with any distance from L you choose. The number δ I give you depends on the distance you give me.

What the limit definition says is that if the statement about the limit is true, I would win every time.

The distance from L you give me could be called the margin of "error", so we use the letter ε (epsilon) to denote it. The letter δ (delta) denotes the distance from "a" within which I must keep x.

Now, how can we write the limit definition using these symbols and the concept of distance? I'll now present the epsilon-delta limit definition (drum roll...):

This statement:

lim(x→a) f(x) = L

means the following:

Given a number ε > 0 (the error), there exists a number δ > 0 (a distance from "a"), such that:

if 0 < |x - a| < δ, then |f(x) - L| < ε

We can represent all these numbers in a graph:

Graphically, the definition says:

- Given a margin of error around L (an interval)
- There exists an interval around "a" such that for all x's in that interval that are not equal to "a", f(x) will fall between the two green lines.
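We can even turn the game into a (partial) program. This sketch, with a made-up example function, checks by sampling whether a candidate δ keeps f(x) within ε of L; sampling can't prove a limit, but it can expose a δ that fails:

```python
def delta_works(f, a, L, eps, delta, samples=1000):
    """Check (by sampling) that every x with 0 < |x - a| < delta
    gives |f(x) - L| < eps. This cannot prove the limit, but it
    can show that a particular delta loses the game."""
    for i in range(1, samples + 1):
        h = delta * i / (samples + 1)   # 0 < h < delta
        for x in (a - h, a + h):        # points on both sides of a
            if abs(f(x) - L) >= eps:
                return False            # this x escapes the error margin
    return True

# Made-up example: f(x) = 2x + 1 near a = 3, where L = 7.
f = lambda x: 2 * x + 1
print(delta_works(f, 3, 7, eps=0.0001, delta=0.00005))  # a winning delta
print(delta_works(f, 3, 7, eps=0.0001, delta=0.1))      # a losing delta
```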

This definition is a really ingenious way of explaining in exact words an intuitive idea. It is usually covered up by notation that can confuse you, so I hope this helped to clear things up.

This definition is essential: it is the foundation of all of calculus. Now, I want to show you an example of using this definition to prove a limit.

You can't usually use this limit definition to "find" limits. We'll just use it to prove that the limit we've found is indeed the correct one or that it really exists.

One of the easiest limits to prove is:

lim(x→a) x = a

According to the definition, this statement means:

Given any number ε > 0, there exists a δ > 0 such that:

if 0 < |x - a| < δ, then |x - a| < ε

(here f(x) = x and L = a, so |f(x) - L| is just |x - a|)

Let's remember that this could be thought of as a game. So, you give me a margin of error ε to shoot at L. In this case, the limit L=a.

Now, it is my turn. I must choose a distance δ from "a". The delta I would choose in this case is:

δ = ε

This δ is valid; there are no rules against choosing the same number. So, let's check the definition: we need that whenever 0 < |x - a| < δ, we also have |x - a| < ε.

And that's it, I won, because:

|x - a| < δ = ε

So,

|f(x) - L| = |x - a| < ε

And I would always win, because for any ε you choose, I can always choose δ equal to that number. The definition is satisfied, and we can confidently say that:

lim(x→a) x = a
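To connect this back to the game, here is a small sketch of the winning strategy for f(x) = x (names are illustrative):

```python
# Winning strategy for lim(x -> a) x = a: given eps, answer delta = eps.
def choose_delta(eps):
    return eps

a = 5.0  # made-up example point
for eps in [0.1, 0.001, 1e-8]:
    delta = choose_delta(eps)
    x = a + delta / 2            # any x with 0 < |x - a| < delta
    # f(x) = x, so |f(x) - a| = |x - a| < delta = eps
    assert abs(x - a) < eps
print("delta = eps wins for every eps tested")
```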


## Conclusion

- We need a formal definition of limit because it is precise and allows us to clear any doubt or misconception.
- The formal definition is just a very clever way of expressing an intuitive idea. You just need to get used to the notation.
- Remember that you can think of the definition as a game.

