Calculus: Existence of Global Maximum and Global Minimum for $f(x)=e^x\cdot x^3$

HINT.- $f'(x)=e^x x^2(x+3)=0 \Rightarrow x=0,\,-3$ (and $f'(x)\to 0$ as $x\to-\infty$), and $f'(x)>0$ for $x>0$. You can deduce from this that the global minimum is attained at $x=-3$ and that there is no global maximum (or it is $\infty$ if you want).
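The hint can be checked numerically. Below is our own sketch (the sampling window $[-20,5]$ is an arbitrary choice, wide enough to capture the behaviour as $x\to-\infty$):

```python
import numpy as np

# f(x) = e^x * x^3 from the question above
f = lambda x: np.exp(x) * x**3

xs = np.linspace(-20, 5, 200001)   # wide sample of the real line
i = np.argmin(f(xs))

print(xs[i], f(xs[i]))             # minimiser near -3, value near -27*e^{-3}
assert abs(xs[i] + 3) < 1e-3
assert abs(f(xs[i]) - (-27 * np.exp(-3))) < 1e-6
```

The sampled minimum sits at $x=-3$ with value $f(-3)=-27e^{-3}\approx-1.344$, while $f\to 0$ from below as $x\to-\infty$ and $f\to\infty$ as $x\to\infty$, matching the hint.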

1. How to calculate $\iint_S \mathbf{F} \cdot \mathbf{n}\, \mathrm{d}S$ for the following.

You should note that for your parameterisation $\vec r(\alpha, \beta)$, we have that
$$ \iint \mathbf{F} \cdot \hat{\mathbf{n}} \,\mathrm{d}S = \iint \mathbf{F} \cdot \underbrace{\dfrac{\left( \dfrac{\partial \mathbf r}{\partial \alpha} \times \dfrac{\partial \mathbf r}{\partial \beta} \right)}{\left| \dfrac{\partial \mathbf r}{\partial \alpha} \times \dfrac{\partial \mathbf r}{\partial \beta} \right|}}_{=\,\hat{\mathbf{n}}} \overbrace{\left| \dfrac{\partial \mathbf r}{\partial \alpha} \times \dfrac{\partial \mathbf r}{\partial \beta} \right| \,\mathrm{d}\alpha \,\mathrm{d}\beta}^{=\,\mathrm{d}S} $$
so the two norm factors cancel, and the flux reduces to $\iint \mathbf{F} \cdot \left( \dfrac{\partial \mathbf r}{\partial \alpha} \times \dfrac{\partial \mathbf r}{\partial \beta} \right) \mathrm{d}\alpha \,\mathrm{d}\beta$.
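To make the formula concrete, here is a sketch on an example of our own choosing (not from the original question): the flux of $\mathbf F(x,y,z)=(x,y,z)$ through the unit sphere, whose exact value is $4\pi$.

```python
import numpy as np

# Midpoint rule in alpha (polar angle), uniform periodic grid in beta (azimuth)
n = 400
a = (np.arange(n) + 0.5) * np.pi / n
b = np.arange(n) * 2 * np.pi / n
A, B = np.meshgrid(a, b, indexing="ij")

# parameterisation r(alpha, beta) of the unit sphere and its partial derivatives
r   = np.stack([np.sin(A)*np.cos(B), np.sin(A)*np.sin(B),  np.cos(A)])
r_a = np.stack([np.cos(A)*np.cos(B), np.cos(A)*np.sin(B), -np.sin(A)])
r_b = np.stack([-np.sin(A)*np.sin(B), np.sin(A)*np.cos(B), np.zeros_like(A)])

# F . (r_a x r_b): the |r_a x r_b| factors cancel, so no normalisation is needed
cross = np.cross(r_a, r_b, axisa=0, axisb=0, axisc=0)
F = r                                  # F is the position vector on the sphere
integrand = np.sum(F * cross, axis=0)

flux = integrand.sum() * (np.pi / n) * (2 * np.pi / n)
print(flux)                            # ≈ 12.566 = 4*pi
assert abs(flux - 4 * np.pi) < 1e-3
```

Here $\mathbf r_\alpha \times \mathbf r_\beta = \sin\alpha \,\mathbf r$ points outward, so the integrand is just $\sin\alpha$ and the double sum converges to $4\pi$.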

2. Prove that $A_T= \frac12 a \cdot b \cdot \sin(\gamma) = \frac12 a \cdot c \cdot \sin(\beta) = \frac12 b \cdot c \cdot \sin(\alpha)$

Hint: note that:
- $b\sin\gamma$ is the height with respect to the side $a$;
- $a\sin\beta$ is the height with respect to the side $c$;
- $c\sin\alpha$ is the height with respect to the side $b$.

From the figure: if $a$ is the base, then $AD$ is the corresponding height, the triangle $ADC$ is right-angled at $D$, and $AD = b\sin\gamma$, so the area of $ABC$ is $\text{Area}=\frac12 ab\sin\gamma$. You can do the same using the other sides as the base.
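A quick numerical sanity check of the three formulas, on a triangle of our own choosing, with Heron's formula as an independent reference:

```python
import math

a, b, c = 5.0, 6.0, 7.0                      # side lengths (our example)

# angles opposite each side, via the law of cosines
alpha = math.acos((b*b + c*c - a*a) / (2*b*c))
beta  = math.acos((a*a + c*c - b*b) / (2*a*c))
gamma = math.acos((a*a + b*b - c*c) / (2*a*b))

areas = [0.5*a*b*math.sin(gamma), 0.5*a*c*math.sin(beta), 0.5*b*c*math.sin(alpha)]

# Heron's formula for the same triangle
s = (a + b + c) / 2
heron = math.sqrt(s*(s-a)*(s-b)*(s-c))

print(areas, heron)
assert all(abs(x - heron) < 1e-9 for x in areas)
```

All three expressions agree with each other and with Heron's formula, as the hint predicts.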

3. Is there a general result that groups of order $2^n\cdot 3$ are solvable?

Let $G$ be a group of order $2^n\cdot 3$. If $G$ has a normal Sylow 2-subgroup $P$, you are done (can you see why? $P$ is a 2-group, hence solvable, and $G/P$ has order 3). Otherwise, $G$ has 3 Sylow 2-subgroups, and that gives a non-trivial homomorphism of $G$ to $S_3$ via the conjugation action on these Sylow 2-subgroups. The kernel is a 2-group, and the image is solvable (it is a subgroup of $S_3$), so again you are done: an extension of a solvable group by a solvable group is solvable.
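As a spot-check of the claim, one can test concrete groups with SymPy's permutation-group machinery (our own illustration, not part of the original answer):

```python
from sympy.combinatorics.named_groups import SymmetricGroup, AlternatingGroup

# S_4 has order 24 = 2^3 * 3, so the argument above says it must be solvable.
G = SymmetricGroup(4)
assert G.order() == 24
print(G.is_solvable)                     # True

# By contrast, A_5 has order 60 = 2^2 * 3 * 5, not of the form 2^n * 3,
# and indeed it is not solvable.
print(AlternatingGroup(5).is_solvable)   # False
```

The same check works for any small group you can realise as a permutation group.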

4. Does $k\cdot x$ …

Oh, well, I got to this counter-example: $k = 2,\ x = -2,\ y = -2$.

5. Solve $A\cdot x = b$ by naïve Gaussian elimination

The matrix $A$ is lower triangular and invertible, because all coefficients on the diagonal are nonzero. So when you do the $LU$ decomposition, you have $L=A$ and $U$ is the identity matrix. If you properly do Gaussian elimination, the first step (eliminating under the first pivot) will give the matrix
$$ \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 3 & 1 & 0 \\ 0 & -2 & 1 & 1 \end{bmatrix} $$
and so on for the other columns.

Since the original matrix has been changed, here's what can be said: after elimination according to the Gauss-Doolittle method (no reduction of pivots), we find, for the augmented matrix, the reduced form
$$ [\,U\mid c\,]=\left[\begin{array}{cccc|c} 4 & 2 & -5 & 1 & -1 \\ 0 & 4 & -1 & 9 & -3 \\ 0 & 0 & 6 & -1 & 4 \\ 0 & 0 & 0 & 5 & 10 \end{array}\right] $$
and the matrix $L$ such that $L[\,U\mid c\,]=[\,A\mid b\,]$ is
$$ L=\begin{bmatrix} 1 & 0 & 0 & 0 \\ -2 & 1 & 0 & 0 \\ -6 & -3 & 1 & 0 \\ 8 & 2 & -1 & 1 \end{bmatrix}. $$
The question "How is the $U$ matrix related to the reduced terms in the augmented matrix before back-substitution?" is not really clear. What I can say is that the system
$$ Ux=c, $$
where $c$ denotes the last column in the reduced augmented matrix, is equivalent to the original linear system $Ax=b$. In particular, the form of $U$ tells you that the system has a unique solution. The fact that $L[\,U\mid c\,]=[\,A\mid b\,]$ implies that $Lc=b$, so $c=L^{-1}b$. What can be done now is to multiply the last row by $1/5$ and do "backwards elimination", reducing the pivots: we find
$$ \left[\begin{array}{cccc|c} 1 & 0 & 0 & 0 & 3 \\ 0 & 1 & 0 & 0 & -5 \\ 0 & 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 1 & 2 \end{array}\right] $$
which shows the unique solution $x=(3,-5,1,2)^T$.
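The back-substitution step can be verified numerically from the $U$, $c$, and $L$ given above (a sketch with NumPy; $A$ and $b$ are reconstructed as $A=LU$, $b=Lc$):

```python
import numpy as np

# U and c read off the reduced augmented matrix [U | c] above
U = np.array([[4, 2, -5, 1],
              [0, 4, -1, 9],
              [0, 0, 6, -1],
              [0, 0, 0, 5]], dtype=float)
c = np.array([-1, -3, 4, 10], dtype=float)
L = np.array([[ 1,  0,  0, 0],
              [-2,  1,  0, 0],
              [-6, -3,  1, 0],
              [ 8,  2, -1, 1]], dtype=float)

A = L @ U          # reconstruct the original matrix
b = L @ c          # and the original right-hand side (since Lc = b)

# back-substitution on U x = c
x = np.zeros(4)
for i in range(3, -1, -1):
    x[i] = (c[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]

print(x)           # → [ 3. -5.  1.  2.]
assert np.allclose(A @ x, b)
```

Solving $Ux=c$ gives the same $x$ as solving $Ax=b$ directly, which is exactly the equivalence claimed above.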

6. Show that $(\ell^1, \|\cdot\|_1)$ is complete

Consider a Cauchy sequence $(x^n)$ in $\ell^1$, where $x^n = (x_1^n, x_2^n, \dots)$. For any $\epsilon >0$ there exists $k_1 \in \mathbb{N}$ s.t. $\|x^p - x^q\|_1 < \epsilon$ for all $p, q > k_1$. So $\sum_{i=1}^{\infty} |x_i^p - x_i^q| < \epsilon$ …
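The answer breaks off at this point; the standard continuation of this argument (our sketch, not the original author's text) runs as follows:

```latex
\text{For each fixed } i,\ |x_i^p - x_i^q| \le \|x^p - x^q\|_1,
\text{ so } (x_i^n)_n \text{ is Cauchy in } \mathbb{R};\ \text{set } x_i = \lim_n x_i^n
\text{ and } x = (x_1, x_2, \dots).
\text{For every } N \text{ and all } p, q > k_1,\quad
\sum_{i=1}^{N} |x_i^p - x_i^q| < \epsilon
\;\xrightarrow{\;q\to\infty\;}\;
\sum_{i=1}^{N} |x_i^p - x_i| \le \epsilon .
\text{Since } N \text{ is arbitrary, } \|x^p - x\|_1 \le \epsilon
\text{ for all } p > k_1; \text{ hence } x = x^p + (x - x^p) \in \ell^1
\text{ and } x^n \to x \text{ in } \|\cdot\|_1,\ \text{so } \ell^1 \text{ is complete.}
```

The key trick is to take the limit in $q$ over a *finite* partial sum first, then let $N\to\infty$.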
