
Analytic Geometry 2
Australian National University

Norm-Aware Embedding for Efficient Person Search
(Chen et al., CVPR 2020)
• Person search (Zheng et al., CVPR 2017; Xiao et al., CVPR 2017)
Nanjing University of Science and Technology; Max Planck Institute for Informatics
What is the meaning of norm and angle?
The norm can differentiate a person from the background; the angle can differentiate between different persons.

3.8 Orthogonal Projections
• In high-dimensional data, often only a few dimensions contain most of the information.
• When we compress or visualize high-dimensional data, we lose information.
• To minimize this compression loss, we want to find the most informative dimensions in the data.
• Orthogonal projections of high-dimensional data retain as much information as possible.
Figure: orthogonal projection (orange dots) of a two-dimensional dataset (blue dots) onto a one-dimensional subspace (straight line).

3.8 Orthogonal Projections
• Let $V$ be a vector space and $U \subseteq V$ a subspace of $V$. A linear mapping $\pi: V \to V$ is called a projection if $\pi^2 = \pi \circ \pi = \pi$.
• Linear mappings can be expressed by transformation matrices.
• A projection matrix $P_\pi$ has the property $P_\pi^2 = P_\pi$.
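As a quick numerical illustration (a minimal NumPy sketch; the matrix below is our own toy example, not from the slides), idempotence $P^2 = P$ can be checked directly:

```python
import numpy as np

# Toy projection onto the x-axis in R^2: keeps the first coordinate,
# zeroes out the second.
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Defining property of a projection matrix: applying it twice
# is the same as applying it once.
assert np.allclose(P @ P, P)

x = np.array([3.0, 5.0])
print(P @ x)        # [3. 0.]
print(P @ P @ x)    # [3. 0.] -- a second application changes nothing
```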

3.8.1 Projection onto One-Dimensional Subspaces (Lines)
• Assume we are given a line (a one-dimensional subspace) through the origin with basis vector $b \in \mathbb{R}^n$.
• When we project $x \in \mathbb{R}^n$ onto $U$, we seek the vector $\pi_U(x) \in U$ that is closest to $x$.

• The projection $\pi_U(x)$ should be closest to $x$: the distance $\|x - \pi_U(x)\|$ is minimal.
• $\pi_U(x) - x$ is orthogonal to $U$, which is spanned by $b$: $\langle \pi_U(x) - x, b \rangle = 0$.
• $\pi_U(x)$ is an element of $U$, which is spanned by $b$: $\pi_U(x) = \lambda b$ for some $\lambda \in \mathbb{R}$.
How do we determine $\lambda$, $\pi_U(x)$, and the projection matrix $P_\pi$?

1. Finding the coordinate $\lambda$
• The orthogonality condition, with $\pi_U(x) = \lambda b$:
$$\langle x - \pi_U(x), b \rangle = 0 \iff \langle x - \lambda b, b \rangle = 0$$
• We use the bilinearity of the inner product (and the fact that inner products are symmetric):
$$\langle x, b \rangle - \lambda \langle b, b \rangle = 0 \iff \lambda = \frac{\langle x, b \rangle}{\langle b, b \rangle} = \frac{\langle b, x \rangle}{\|b\|^2}$$
• If we choose $\langle \cdot, \cdot \rangle$ to be the dot product, we obtain
$$\lambda = \frac{b^\top x}{b^\top b} = \frac{b^\top x}{\|b\|^2}$$
• If $\|b\| = 1$, then $\lambda$ is simply given by $b^\top x$.
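A minimal NumPy sketch of this formula (the helper name coordinate_lambda and the example vectors are ours, chosen for illustration):

```python
import numpy as np

def coordinate_lambda(x: np.ndarray, b: np.ndarray) -> float:
    """lambda = b^T x / ||b||^2, using the dot product."""
    return (b @ x) / (b @ b)

b = np.array([1.0, 2.0])
x = np.array([3.0, 4.0])
print(coordinate_lambda(x, b))   # (3 + 8) / (1 + 4) = 2.2

# If ||b|| = 1, lambda reduces to b^T x.
b_unit = b / np.linalg.norm(b)
assert np.isclose(coordinate_lambda(x, b_unit), b_unit @ x)
```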

2. Finding the projection point $\pi_U(x) \in U$
• Since $\pi_U(x) = \lambda b$, we immediately obtain (assuming the dot product)
$$\pi_U(x) = \lambda b = \frac{\langle x, b \rangle}{\|b\|^2}\, b = \frac{b^\top x}{\|b\|^2}\, b$$
• We can also compute the length of $\pi_U(x)$ as
$$\|\pi_U(x)\| = \|\lambda b\| = |\lambda|\,\|b\|$$
Hence, our projection has length $|\lambda|$ times the length of $b$.
• Using the dot product as the inner product, we get
$$\|\pi_U(x)\| = \frac{|b^\top x|}{\|b\|^2}\,\|b\| = |\cos\omega|\,\|x\|\,\|b\|\,\frac{\|b\|}{\|b\|^2} = |\cos\omega|\,\|x\|$$
where $\omega$ is the angle between $x$ and $b$. This equation should be familiar from trigonometry.
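Both length identities can be verified numerically; a small sketch with example vectors of our own choosing:

```python
import numpy as np

b = np.array([1.0, 2.0])
x = np.array([3.0, 4.0])

lam = (b @ x) / (b @ b)
proj = lam * b

# ||pi_U(x)|| = |lambda| * ||b|| ...
assert np.isclose(np.linalg.norm(proj), abs(lam) * np.linalg.norm(b))

# ... = |cos(omega)| * ||x||, where omega is the angle between x and b
cos_omega = (b @ x) / (np.linalg.norm(b) * np.linalg.norm(x))
assert np.isclose(np.linalg.norm(proj), abs(cos_omega) * np.linalg.norm(x))
```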

3. Finding the projection matrix $P_\pi$
• A projection is a linear mapping.
• Hence there exists a projection matrix $P_\pi$ such that $\pi_U(x) = P_\pi x$.
• With the dot product as the inner product and
$$\pi_U(x) = \lambda b = b \lambda = b\,\frac{b^\top x}{\|b\|^2} = \frac{b b^\top}{\|b\|^2}\, x,$$
we immediately see that
$$P_\pi = \frac{b b^\top}{\|b\|^2}$$
• Note that $b b^\top$ (and, consequently, $P_\pi$) is a symmetric matrix (of rank 1), and $\|b\|^2 = \langle b, b \rangle$ is a scalar.
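A short sketch constructing $P_\pi$ from $b$ and checking the stated properties (symmetry, idempotence, rank 1); the function name and example vector are ours:

```python
import numpy as np

def projection_matrix_1d(b: np.ndarray) -> np.ndarray:
    """P = b b^T / ||b||^2, projecting onto span[b]."""
    return np.outer(b, b) / (b @ b)

b = np.array([1.0, 2.0, 2.0])
P = projection_matrix_1d(b)

assert np.allclose(P, P.T)              # symmetric
assert np.allclose(P @ P, P)            # idempotent: P^2 = P
assert np.linalg.matrix_rank(P) == 1    # rank 1
```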

Example (Projection onto a Line)
• Find the projection matrix $P_\pi$ onto the line through the origin spanned by $b = (1, -1)^\top$:
$$P_\pi = \frac{b b^\top}{\|b\|^2} = \frac{1}{2}\begin{pmatrix} 1 \\ -1 \end{pmatrix}\begin{pmatrix} 1 & -1 \end{pmatrix} = \frac{1}{2}\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}$$
• We choose a particular $x$ and check whether its projection lies in the subspace spanned by $b$. For $x = (3, 5)^\top$, the projection is
$$\pi_U(x) = P_\pi x = \frac{1}{2}\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}\begin{pmatrix} 3 \\ 5 \end{pmatrix} = \frac{1}{2}\begin{pmatrix} -2 \\ 2 \end{pmatrix} \in \mathrm{span}\!\left[\begin{pmatrix} 1 \\ -1 \end{pmatrix}\right]$$
• Further application of $P_\pi$ to $\pi_U(x)$ does not change anything, i.e., $P_\pi \pi_U(x) = \pi_U(x)$. This is expected, because by the definition of a projection the matrix $P_\pi$ satisfies $P_\pi^2 x = P_\pi x$ for all $x$.
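The same example, reproduced numerically as a sanity check:

```python
import numpy as np

b = np.array([1.0, -1.0])
P = np.outer(b, b) / (b @ b)   # = 1/2 * [[1, -1], [-1, 1]]

x = np.array([3.0, 5.0])
proj = P @ x
print(proj)                    # [-1.  1.], i.e. 1/2 * (-2, 2)

# proj lies in span[b]: the 2x2 determinant of (proj, b) vanishes
assert np.isclose(proj[0] * b[1] - proj[1] * b[0], 0.0)

# applying P again changes nothing
assert np.allclose(P @ proj, proj)
```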

3.8.2 Projection onto General Subspaces
• We look at orthogonal projections of vectors $x \in \mathbb{R}^n$ onto lower-dimensional subspaces $U \subseteq \mathbb{R}^n$ with $\dim U = m \geq 1$.
Figure: projecting $x \in \mathbb{R}^3$ onto a two-dimensional subspace.
• Assume $(b_1, \ldots, b_m)$ is a basis of $U$.
• The projection $\pi_U(x)$ is an element of $U$, so it can be written as a linear combination of the basis vectors:
$$\pi_U(x) = \sum_{i=1}^{m} \lambda_i b_i$$
• How do we determine $\lambda_i$, $\pi_U(x)$, and $P_\pi$?

1. Find the coordinates $\lambda_1, \ldots, \lambda_m$
• The linear combination
$$\pi_U(x) = \sum_{i=1}^{m} \lambda_i b_i = B\lambda, \quad B = (b_1, \ldots, b_m) \in \mathbb{R}^{n \times m}, \quad \lambda = (\lambda_1, \ldots, \lambda_m)^\top \in \mathbb{R}^m$$
should be closest to $x \in \mathbb{R}^n$, so the vector connecting $\pi_U(x) \in U$ and $x \in \mathbb{R}^n$ must be orthogonal to all basis vectors of $U$.
• We obtain $m$ simultaneous conditions (using the dot product):
$$\langle b_1, x - \pi_U(x) \rangle = b_1^\top (x - \pi_U(x)) = 0$$
$$\vdots$$
$$\langle b_m, x - \pi_U(x) \rangle = b_m^\top (x - \pi_U(x)) = 0$$
• With $\pi_U(x) = B\lambda$, we rewrite the above as
$$b_1^\top (x - B\lambda) = 0, \;\ldots,\; b_m^\top (x - B\lambda) = 0$$
such that we obtain a homogeneous linear equation system
$$\begin{pmatrix} b_1^\top \\ \vdots \\ b_m^\top \end{pmatrix}(x - B\lambda) = 0 \iff B^\top (x - B\lambda) = 0 \iff B^\top B \lambda = B^\top x.$$

1. Find the coordinates $\lambda_1, \ldots, \lambda_m$ (continued)
$$B^\top B \lambda = B^\top x$$
• This is the normal equation. Since $b_1, \ldots, b_m$ are a basis of $U$, they are linearly independent, so $B^\top B$ is invertible:
$$\mathrm{rk}(B^\top B) = \mathrm{rk}(B) = m$$
This allows us to solve for $\lambda$:
$$\lambda = (B^\top B)^{-1} B^\top x$$
• The matrix $(B^\top B)^{-1} B^\top$ is also called the pseudo-inverse of $B$.
2. Find the projection $\pi_U(x) \in U$. We already established that $\pi_U(x) = B\lambda$. Therefore, we calculate $\pi_U(x)$ as
$$\pi_U(x) = B (B^\top B)^{-1} B^\top x$$
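A sketch of steps 1 and 2 in NumPy. The helper name project_onto_subspace and the random test data are ours; we solve the normal equation with np.linalg.solve rather than forming the inverse explicitly, which is numerically preferable:

```python
import numpy as np

def project_onto_subspace(x: np.ndarray, B: np.ndarray):
    """Project x onto the column space of B; returns (lambda, pi_U(x)).

    Solves the normal equation B^T B lam = B^T x directly instead of
    forming (B^T B)^{-1}, which is cheaper and more stable.
    """
    lam = np.linalg.solve(B.T @ B, B.T @ x)
    return lam, B @ lam

# Random example: the residual must be orthogonal to every basis vector.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 2))
x = rng.standard_normal(5)
lam, proj = project_onto_subspace(x, B)
assert np.allclose(B.T @ (x - proj), 0.0)
```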

3. Find the projection matrix $P_\pi$
• We want $P_\pi x = \pi_U(x)$.
• From step 2, we have
$$\pi_U(x) = B (B^\top B)^{-1} B^\top x$$
• We can immediately see that
$$P_\pi = B (B^\top B)^{-1} B^\top$$
• If $\dim U = 1$, i.e., when projecting onto a one-dimensional subspace, $B^\top B = b^\top b$ is a scalar, and we can rewrite $P_\pi = B (B^\top B)^{-1} B^\top$ as
$$P_\pi = \frac{b b^\top}{\|b\|^2},$$
which is exactly the projection matrix in the 1-D case.
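A small numerical check that the general formula collapses to the 1-D projection matrix when $B$ has a single column (helper name and example vector are ours):

```python
import numpy as np

def projection_matrix(B: np.ndarray) -> np.ndarray:
    """P = B (B^T B)^{-1} B^T, projecting onto the column space of B."""
    return B @ np.linalg.inv(B.T @ B) @ B.T

# With a single column, B^T B is the 1x1 matrix (||b||^2), so the
# general formula collapses to b b^T / ||b||^2 from the 1-D case.
b = np.array([1.0, -1.0])
P_general = projection_matrix(b.reshape(-1, 1))
P_1d = np.outer(b, b) / (b @ b)
assert np.allclose(P_general, P_1d)
```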

Example – Projection onto a Two-dimensional Subspace
• For a subspace $U = \mathrm{span}\!\left[\begin{pmatrix}1\\1\\1\end{pmatrix}, \begin{pmatrix}0\\1\\2\end{pmatrix}\right] \subseteq \mathbb{R}^3$ and $x = \begin{pmatrix}6\\0\\0\end{pmatrix} \in \mathbb{R}^3$, find the coordinates $\lambda$ of $x$ in terms of $U$, the projection point $\pi_U(x)$, and the projection matrix $P_\pi$.
• Solution
• First, the generating set of $U$ is a basis (linear independence); we write the basis vectors of $U$ into a matrix
$$B = \begin{pmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{pmatrix}.$$
• Second, we compute the matrix $B^\top B$ and the vector $B^\top x$ as
$$B^\top B = \begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 2 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{pmatrix} = \begin{pmatrix} 3 & 3 \\ 3 & 5 \end{pmatrix}, \quad B^\top x = \begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 2 \end{pmatrix}\begin{pmatrix} 6 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 6 \\ 0 \end{pmatrix}$$
• Third, we solve the normal equation $B^\top B \lambda = B^\top x$ to find $\lambda$:
$$\begin{pmatrix} 3 & 3 \\ 3 & 5 \end{pmatrix}\begin{pmatrix} \lambda_1 \\ \lambda_2 \end{pmatrix} = \begin{pmatrix} 6 \\ 0 \end{pmatrix} \iff \lambda = \begin{pmatrix} 5 \\ -3 \end{pmatrix}$$

Example – Projection onto a Two-dimensional Subspace
• Fourth, the projection point $\pi_U(x)$ of $x$ onto $U$, i.e., into the column space of $B$, can be directly computed via
$$\pi_U(x) = B\lambda = \begin{pmatrix} 5 \\ 2 \\ -1 \end{pmatrix}$$
• The corresponding projection error is the norm of the difference between the original vector and its projection onto $U$, i.e.,
$$\|x - \pi_U(x)\| = \left\| \begin{pmatrix} 1 & -2 & 1 \end{pmatrix}^\top \right\| = \sqrt{6}$$
• Fifth, the projection matrix (for any $x \in \mathbb{R}^3$) is given by
$$P_\pi = B (B^\top B)^{-1} B^\top = \frac{1}{6}\begin{pmatrix} 5 & 2 & -1 \\ 2 & 2 & 2 \\ -1 & 2 & 5 \end{pmatrix}$$

Things to note
• $\pi_U(x)$ is still a vector in $\mathbb{R}^3$, although it lies in a two-dimensional subspace $U \subseteq \mathbb{R}^3$.
• We can find approximate solutions to unsolvable linear equation systems $Ax = b$ using projections.
• The idea is to find the vector in the subspace spanned by the columns of $A$ that is closest to $b$, i.e., we compute the orthogonal projection of $b$ onto the subspace spanned by the columns of $A$. This is the least-squares solution (see the sketch after this list).
• If the columns of $B$ form an orthonormal basis (ONB), i.e., $B^\top B = I$ (so $\langle b_i, b_j \rangle = 0$ for $i \neq j$ and $\langle b_i, b_i \rangle = 1$), we have
$$\pi_U(x) = B B^\top x, \qquad \lambda = B^\top x$$
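A sketch of both remarks: the least-squares connection (np.linalg.lstsq solves the same normal equation) and the ONB shortcut. Here we obtain an ONB of the column space via QR, which is one standard way; the matrices reuse the running example:

```python
import numpy as np

# Least squares: for an unsolvable A x = b, project b onto the
# column space of A. np.linalg.lstsq returns the same solution
# as the normal equation A^T A x = A^T b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_normal, x_lstsq)

# ONB case: with orthonormal columns (obtained here via QR),
# B^T B = I, so the projection matrix simplifies to B B^T.
Q, _ = np.linalg.qr(A)
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q @ np.linalg.inv(Q.T @ Q) @ Q.T, Q @ Q.T)
```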

3.8.3 Gram-Schmidt Orthogonalization
• Constructively transform a basis $(b_1, \ldots, b_n)$ of an $n$-dimensional vector space $V$ into an orthogonal/orthonormal basis $(u_1, \ldots, u_n)$ of $V$, where
$$\mathrm{span}[b_1, \ldots, b_n] = \mathrm{span}[u_1, \ldots, u_n]$$
• The process iterates as follows:
$$u_1 := b_1, \qquad u_k := b_k - \pi_{\mathrm{span}[u_1, \ldots, u_{k-1}]}(b_k), \quad k = 2, \ldots, n$$
• The $k$-th basis vector $b_k$ is projected onto the subspace spanned by the first $k-1$ constructed orthogonal vectors $u_1, \ldots, u_{k-1}$.
• This projection is then subtracted from $b_k$, yielding a vector $u_k$ that is orthogonal to the $(k-1)$-dimensional subspace spanned by $u_1, \ldots, u_{k-1}$.
• If we normalize $u_k$, we obtain an ONB where $\|u_k\| = 1$ for $k = 1, \ldots, n$.
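A minimal NumPy implementation of this iteration (classical Gram-Schmidt; the function name is ours), demonstrated on the basis from the next slide:

```python
import numpy as np

def gram_schmidt(B: np.ndarray) -> np.ndarray:
    """Orthogonalize the columns of B (assumed linearly independent)."""
    U = np.zeros_like(B, dtype=float)
    for k in range(B.shape[1]):
        u = B[:, k].astype(float).copy()
        # subtract the projection of b_k onto each previous u_i
        for i in range(k):
            u -= (U[:, i] @ B[:, k]) / (U[:, i] @ U[:, i]) * U[:, i]
        U[:, k] = u
    return U

# Basis from the example on the next slide: b1 = (2, 0), b2 = (1, 1)
B = np.array([[2.0, 1.0],
              [0.0, 1.0]])
U = gram_schmidt(B)
print(U)                                   # columns (2, 0) and (0, 1)
assert np.isclose(U[:, 0] @ U[:, 1], 0.0)  # orthogonal

# Normalizing the columns yields an ONB.
ONB = U / np.linalg.norm(U, axis=0)
```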

Example – Gram-Schmidt Orthogonalization
• Consider a basis of $\mathbb{R}^2$:
$$b_1 = \begin{pmatrix} 2 \\ 0 \end{pmatrix}, \qquad b_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$
• Using the Gram-Schmidt method, we construct an orthogonal basis $(u_1, u_2)$ of $\mathbb{R}^2$ as follows (using the dot product):
$$u_1 := b_1 = \begin{pmatrix} 2 \\ 0 \end{pmatrix}$$
$$u_2 := b_2 - \pi_{\mathrm{span}[u_1]}(b_2) = b_2 - \frac{u_1 u_1^\top}{\|u_1\|^2}\, b_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix} - \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$$
• We immediately see that $u_1$ and $u_2$ are orthogonal, i.e., $u_1^\top u_2 = 0$.

Check your understanding
(A) Orthogonal projections are linear projections.
(B) When applying an orthogonal projection multiple times (more than once), the projection result does not change anymore.
(C) Given a subspace to project onto, orthogonal projection gives the minimum information loss ($\ell_2$).
(D) Gram-Schmidt outputs the same number of basis vectors as the input.
(E) Projections allow us to better visualize and understand high-dimensional data.
