Radius and Interval of Convergence of Power Series - AP Calculus BC
Card 1 of 30
Find the interval of convergence of the series.
Answer:
Using the root test, the limit of the n-th root of the absolute value of the terms is 0 for every value of x. Because 0 is always less than 1, the root test shows that the series converges for any value of x. Therefore, the interval of convergence is (−∞, ∞).
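The card's series was not reproduced here, but the root-test computation can be illustrated with an assumed series of the same type, for example ∑ xⁿ/nⁿ (an illustrative stand-in, not necessarily the card's series):

```latex
% Root test on the assumed example \sum_{n=1}^{\infty} x^n / n^n:
\sqrt[n]{\left|\frac{x^n}{n^n}\right|} \;=\; \frac{|x|}{n}
\;\xrightarrow[n\to\infty]{}\; 0 \;<\; 1
\quad\text{for every real } x .
```

Because the limit is 0 < 1 regardless of x, such a series converges for all real x, giving the interval of convergence (−∞, ∞).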
Find the interval of convergence of the Taylor series.
Answer:
Using the root test, the resulting limit is less than 1 only when the expression being tested equals zero. This occurs when x = 5. Therefore, the series converges only at x = 5.
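The original series is missing from the card, but the behavior described (convergence only at x = 5) can be illustrated with an assumed series such as ∑ nⁿ(x − 5)ⁿ:

```latex
% Root test on the assumed example \sum_{n=1}^{\infty} n^n (x-5)^n:
\sqrt[n]{\,\bigl| n^n (x-5)^n \bigr|\,} \;=\; n\,|x-5|
\;\xrightarrow[n\to\infty]{}\;
\begin{cases}
0, & x = 5,\\[2pt]
\infty, & x \neq 5.
\end{cases}
```

The limit stays below 1 only when x − 5 = 0, so the interval of convergence collapses to the single point x = 5 (radius of convergence R = 0).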
Which of the following intervals of convergence cannot exist?
Answer:
Such an interval cannot exist, because a theorem states that the radius of convergence of a power series must be either zero, a positive finite number, or infinite; an infinite radius would give the interval of convergence (−∞, ∞). Thus the proposed interval can never be an interval of convergence.
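The theorem behind this answer can be stated precisely. For a power series ∑ cₙ(x − a)ⁿ centered at a, the radius of convergence R satisfies 0 ≤ R ≤ ∞, and the interval of convergence takes exactly one of these forms:

```latex
% Possible intervals of convergence for \sum_{n=0}^{\infty} c_n (x-a)^n:
\begin{aligned}
R = 0 &: \quad \{a\} \ \text{(a single point)},\\
0 < R < \infty &: \quad (a-R,\,a+R),\ [a-R,\,a+R),\ (a-R,\,a+R],\ \text{or}\ [a-R,\,a+R],\\
R = \infty &: \quad (-\infty,\,\infty).
\end{aligned}
```

None of these forms is unbounded in only one direction, so a half-infinite interval such as [a, ∞) can never be an interval of convergence.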