SRQ seminar – October 17th, 2018
Notes from a talk in the SRQ series.
INI Seminar 20181017 Giuliani
Interacting dimer models (5/6)
We recall some notation for the multiscale decomposition of the
partition function
\(\displaystyle Z_{L, \lambda} (A) = \int \mathrm{D} \psi e^{- \sum_e
E_e e^{A_e} + V (\psi,
A)} .\)
The free propagator is
\(\displaystyle g (x, y) = \int \frac{\mathrm{d} k}{(2 \pi)^2}
\frac{e^{i k \cdot (x -
y)}}{\mu (k)} .\)
The finite size \(L\) provides an IR cutoff: momenta stay away from zero,
\(| \delta k | \geqslant \pi / L\). Recall that
\(\displaystyle \mu (k) = 1 + ie^{ik_1} - e^{i (k_1 + k_2)} - i e^{i
k_2}\)
with Fermi points: \(\mu (k) = 0\) iff \(k = p^+ = (0, 0)\) or \(k = p^-
= (\pi, \pi)\). Near these points
\(\displaystyle \mu (k + p^{\omega}) \approx (- i - \omega) k_1 + (- i +
\omega) k_2, \qquad
\omega = \pm 1.\)
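As a quick sanity check (an illustration, not part of the notes), one can verify numerically that \(\mu\) vanishes at the two Fermi points and that the linearization above matches \(\mu (k + p^{\omega})\) up to second-order errors:

```python
import cmath

# Dispersion relation as written in the notes.
def mu(k1, k2):
    return (1 + 1j * cmath.exp(1j * k1)
            - cmath.exp(1j * (k1 + k2))
            - 1j * cmath.exp(1j * k2))

pi = cmath.pi

# mu vanishes at the two Fermi points p^+ = (0, 0) and p^- = (pi, pi).
assert abs(mu(0, 0)) < 1e-12
assert abs(mu(pi, pi)) < 1e-12

# Near p^omega: mu(k + p^omega) ≈ (-i - omega) k1 + (-i + omega) k2.
eps = 1e-6
for omega, (p1, p2) in [(+1, (0, 0)), (-1, (pi, pi))]:
    exact = mu(p1 + eps, p2 + 2 * eps)
    approx = (-1j - omega) * eps + (-1j + omega) * (2 * eps)
    # agreement up to second order in eps
    assert abs(exact - approx) < 1e-10
```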
We rewrote
\(\displaystyle g (x, y) = \sum_{\omega = \pm 1} e^{- i p^{\omega} (x -
y)}
g_{\omega}^{(\leqslant 0)} (x, y)\)
where
\(\displaystyle g_{\omega}^{(\leqslant 0)} (x, y) = \int
\frac{\mathrm{d} k}{(2 \pi)^2}
\frac{e^{i k \cdot (x - y)}}{\mu (k + p^{\omega})}
\underbrace{\chi (k)}_{\approx
\mathbb{I} (| k | \lesssim \pi)}\)
and used the addition principle to rewrite the objects of interest
split over scales
\(\displaystyle \frac{Z_{\lambda}}{Z_0} = \int P_{\leqslant 0}
(\mathrm{d}
\psi_{\omega}^{(\leqslant 0)}) e^{V^{(0)} (\psi^{(\leqslant
0)}_{\omega})} =
\int P_{\leqslant - 1} (\mathrm{d} \psi^{(\leqslant -
1)}) \underbrace{\int
P_0 (\mathrm{d} \psi^{(0)}) e^{V^{(0)} (\psi^{(0)}
+ \psi^{(\leqslant -
1)})}}_{=: \exp [L^2 E^{(- 1)} + V^{(- 1)}
(\psi^{(\leqslant - 1)})]}\)
where we have
\(\displaystyle \psi^{(\leqslant 0), \pm}_{x, \omega} = \psi^{(\leqslant -
1), \pm}_{x, \omega} +
\psi^{(0), \pm}_{x, \omega}\)
with propagators
\(\displaystyle g_{\omega}^{(0)} (x, y) = \int \frac{\mathrm{d} k}{(2
\pi)^2} \frac{e^{i k
\cdot (x - y)}}{\mu (k + p^{\omega})} \underbrace{(\chi (k) -
\chi (2 k))}_{f_0 (k)}
\approx C e^{- c | x - y |^{1 / 2}}\)
and
\(\displaystyle g_{\omega}^{(\leqslant - 1)} (x, y) = \int
\frac{\mathrm{d} k}{(2 \pi)^2}
\frac{e^{i k \cdot (x - y)}}{\mu (k + p^{\omega})}
\chi (2 k) \approx 2^{- 1}
g_{\omega}^{(\leqslant 0)} (2^{- 1} x, 2^{-
1} y) .\)
The first decays nicely, while the second is approximately a rescaling
of the original propagator.
In the last lecture we described the effect of the integration of the
\(\psi^{(0)}\) fields.
\(\displaystyle L^2 E^{(- 1)} + V^{(- 1)} (\psi^{(\leqslant - 1)}) =
\sum_{n \geqslant 1}
\frac{1}{n!} \mathcal{E}_0 (\underbrace{V^{(0)}
(\psi^{(0)} + \psi^{(\leqslant
- 1)}) ; \cdots ; V^{(0)} (\psi^{(0)} +
\psi^{(\leqslant - 1)})}_{
\text{$n$-times}})\)
\(\displaystyle = L^2 E^{(- 1)} + \sum_{\ell \geqslant 2, \text{$\ell$
even}} \sum_{\underline{x}, \underline{\omega}} \psi^+_{x_1,
\omega_1} \psi^-_{x_2, \omega_2} \cdots
\psi^+_{x_{\ell - 1}, \omega_{\ell -
1}} \psi^-_{x_{\ell},
\omega_{\ell}} W^{(- 1)}_{\ell, \underline{\omega}}
(x_1, \ldots,
x_{\ell})\)
where we have the formula
\(\displaystyle W^{(- 1)}_{\ell, \underline{\omega}} (\underline{x}) =
\sum_{n \geqslant 1} \frac{1}{n!}
\sum_{Q_1 \subset P_1, \ldots, Q_n \subset P_n}^{(1, \ldots, \ell)}
\sigma_{\underline{P}, \underline{Q}}
\sum_{\text{\scriptsize{$\begin{array}{c}
\underline{x} (P_1), \ldots, \underline{x} (P_n)\\
\underline{\omega} (P_1), \ldots, \underline{\omega} (P_n)
\end{array}$}}}^{(\underline{x}, \underline{\omega})}
\mathcal{E}_0 (\Psi_{P_1 \backslash Q_1} ; \cdots ;
\Psi_{P_n \backslash Q_n}) \left[ \prod_{j = 1}^n v
(\underline{x} (P_j)) \right]\)
where we introduced field monomials \(\Psi_P\) such that, for example,
the initial interaction can be written as:
\(\displaystyle V^{(0)} (\psi) = - 2 \alpha \sum_x \psi^+_x \psi^-_x
\psi^+_{x + e_2}
\psi^-_{x - e_1} =: \sum_{\underline{x} (P),
\underline{\omega} (P)}
v_{\underline{\omega} (P)} (\underline{x} (P))
\underbrace{\Psi_P}_{\prod_{f
\in P} \psi^{\varepsilon (f)}_{x (f),
\omega (f)}} .\)
In this notation the \(Q_i\) denote the “external” lines, with
the constraint that the total number of external lines is \(\ell\) (this
is indicated by the upper index \((1, \ldots, \ell)\) in the summation).
Moreover there are constraints on the sums over positions and \(\omega\)
indices, since on the external lines these must coincide with the
prescribed values; this is denoted by the upper index
\((\underline{x}, \underline{\omega})\) in the summations.
We used the \(\operatorname{BBFKAR}\) formula:
\(\displaystyle \mathcal{E}_0 (\Psi_{P_1 \backslash Q_1} ; \cdots ;
\Psi_{P_n \backslash Q_n})
= \sum_{\text{$T$ spanning trees}} \sigma_T
\prod_{\ell \in T} g^{(0)}_{\ell}
\int \mu_T (\mathrm{d} \underline{t})
\det [ (g^{(0)} (\underline{t} ; x_i,
x_{i'}))_{i, i' = 1, \ldots,
N} ]\)
\(\displaystyle
\text{\raisebox{-0.5\height}{\includegraphics[width=14.8741473173291cm,height=8.92447199265381cm]{image-1.pdf}}}\)
where \(T\) ranges over the spanning trees of the graph on the vertices
entering the connected expectation, each vertex having its \(| P_i
\backslash Q_i |\) contracted legs graphically “exiting” from
it, and \(N\) is the number of propagators outside the spanning tree,
namely
\(\displaystyle N = \frac{(4 n - \ell)}{2} - (n - 1) .\)
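The counting behind this formula can be checked with a small helper (an illustrative sketch, not from the lecture): each vertex carries \(4\) fields, \(\ell\) of them stay external, the rest pair into \((4 n - \ell) / 2\) propagators, of which \(n - 1\) sit on the spanning tree.

```python
# Propagator counting for the quartic interaction: 4 fields per vertex,
# ell external legs, n - 1 propagators on the spanning tree, N left in
# the determinant.
def det_propagators(n, ell):
    total_fields = 4 * n
    contracted = (total_fields - ell) // 2  # total number of propagators
    tree = n - 1                            # spanning-tree propagators
    return contracted - tree                # propagators in the determinant

assert det_propagators(1, 4) == 0  # one vertex, all 4 legs external
assert det_propagators(2, 4) == 1  # two vertices, 4 external legs
assert det_propagators(2, 2) == 2
```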
Here \(\underline{t}\) are interpolation parameters,
\(\underline{t} = \{ t_{j, j'} \in [0, 1] : 1 \leqslant j, j' \leqslant
n \}\), and
\(\displaystyle g^{(0)}_{\omega} (\underline{t} ; x_i, x_{i'}) = t_{j,
j'} g^{(0)}_{\omega}
(x_i, x_{i'})\)
where \(x_i\) and \(x_{i'}\) belong to the \(j\)-th and \(j'\)-th
vertices respectively. Then
\(\displaystyle \frac{1}{L^2} \| W^{(- 1)}_{\ell, \underline{\omega}}
\|_{L^1} = \frac{1}{L^2}
\sum_{x_1, \ldots, x_{\ell}} | W^{(- 1)}_{\ell,
\underline{\omega}}
(\underline{x}) |\)
\(\displaystyle \leqslant \sum_{n \geqslant 1} \frac{1}{n!} C^n | \alpha
|^n \times
\underbrace{n!}_{\text{sum over the spanning trees}} \times \|
g^{(0)}_{\omega}
\|^{n - 1}_{L^1} \times \underbrace{\|
g^{(0)}_{\omega}
\|_{L^{\infty}}^N}_{\text{\scriptsize{$\begin{array}{c}
\text{not quite, here some}\\
\text{details are hiding}
\end{array}$}}}\)
The bound on the determinant has to be done via a Gram representation as
a scalar product of two vectors (more on that by David tomorrow).
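The Gram–Hadamard inequality in question states \(| \det (\langle f_i, g_j \rangle)_{i j} | \leqslant \prod_i \| f_i \| \, \| g_i \|\); here is a quick numerical illustration (not from the lecture) with random real vectors:

```python
import random, math

random.seed(0)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det(m):
    # Laplace expansion along the first row; fine for tiny matrices.
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

# Gram matrix M[i][j] = <f_i, g_j> built from random vectors.
n, d = 4, 6
f = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
g = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
M = [[dot(fi, gj) for gj in g] for fi in f]

# Gram-Hadamard bound: |det M| <= prod_i |f_i| |g_i|.
bound = math.prod(math.sqrt(dot(fi, fi)) * math.sqrt(dot(gi, gi))
                  for fi, gi in zip(f, g))
assert abs(det(M)) <= bound
```

In the actual renormalization-group bound one writes the single-scale propagator entries of the determinant as scalar products of this form, so that the determinant is bounded without expanding it into \(N!\) terms.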
We can also allow an exponential weight in the \(L^1\) norm by using the
exponential decay for the propagators in the spanning tree.
Now we proceed by integrating out the \((- 1)\) layer.
\(\displaystyle g_{\omega}^{(\leqslant - 1)} (x, y) =
g_{\omega}^{(\leqslant - 2)} (x, y) +
g_{\omega}^{(- 1)} (x, y)\)
where
\(\displaystyle g_{\omega}^{(- 1)} (x, y) := \int \frac{\mathrm{d} k}{(2
\pi)^2} \frac{e^{i k
\cdot (x - y)}}{\mu (k + p^{\omega})} \underbrace{[\chi (2 k) -
\chi (2^2 k)]}_{f_{- 1}
(k)} .\)
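The single-scale cutoffs \(f_h (k) = \chi (2^{- h} k) - \chi (2^{- h + 1} k)\) telescope back to \(\chi\). A numerical sketch, assuming a concrete smooth choice of \(\chi\) (the particular bump below is an assumption for illustration only):

```python
import math

# A concrete cutoff function chi (assumed for illustration):
# chi(k) = 1 for |k| <= pi/2, 0 for |k| >= pi, smooth in between.
def chi(k):
    a, b = math.pi / 2, math.pi
    if abs(k) <= a:
        return 1.0
    if abs(k) >= b:
        return 0.0
    t = (abs(k) - a) / (b - a)
    return 0.5 * (1 + math.cos(math.pi * t))

# Single-scale cutoffs f_h(k) = chi(2^{-h} k) - chi(2^{-h+1} k), h <= 0,
# so f_0 = chi(k) - chi(2k) and f_{-1} = chi(2k) - chi(4k) as above.
def f(h, k):
    return chi(2 ** (-h) * k) - chi(2 ** (-h + 1) * k)

# Telescoping: sum_{h=-M}^{0} f_h(k) = chi(k) - chi(2^{M+1} k) -> chi(k).
M = 20
for k in [0.01, 0.3, 1.0, 2.5]:
    total = sum(f(h, k) for h in range(-M, 1))
    assert abs(total - (chi(k) - chi(2 ** (M + 1) * k))) < 1e-12
    assert abs(total - chi(k)) < 1e-12  # 2^{M+1} k is outside the support
```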
In order to carry out the resummations we will eventually need to
classify the various contributions. For the moment we continue the
multiscale procedure without resummations; we will fix this later, once
the basic procedure has been shown.
Now the kernels of the potential \(V^{(- 2)}\) have the following
expression:
\(\displaystyle W^{(- 2)}_{\ell, \underline{\omega}} (\underline{x}) =
\sum_{s \geqslant 1}
\frac{1}{s!} \sum_{P_1, \ldots, P_s} \sum_{Q_1
\subset P_1, \ldots, Q_s
\subset P_s}^{(1, \ldots, \ell)}
\sigma_{\underline{P}, \underline{Q}}
\sum_{\text{\scriptsize{$\begin{array}{c}
\underline{x} (P_1), \ldots, \underline{x} (P_s)\\
\underline{\omega} (P_1), \ldots, \underline{\omega} (P_s)
\end{array}$}}}^{(\underline{x}, \underline{\omega})}\)
\(\displaystyle \times \mathcal{E}_{- 1} (\Psi_{P_1 \backslash Q_1} ;
\cdots ; \Psi_{P_s
\backslash Q_s}) \left[ \prod_{j = 1}^s W^{(- 1)}_{|
P_j |, \underline{\omega}
(P_j)} (\underline{x} (P_j)) \right]\)
where each of the \(W^{(- 1)}_{| P_j |, \underline{\omega} (P_j)}
(\underline{x} (P_j))\) is given by a similar expression on the previous
scale.
This recursive structure of kernels inside kernels is graphically
represented as
\(\displaystyle W^{(- 2)}_{\ell, \underline{\omega}} (\underline{x}) =
\sum_{s \geqslant 1}
\sum_{s_1, \ldots, s_s \geqslant
1}
\raisebox{-0.5\height}{\includegraphics[width=14.8825429620884cm,height=8.92447199265381cm]{image-1.pdf}}\)
where the hollow dots represent the kernels at the level of \(V^{(0)}\),
and the two layers correspond to the two scales of integration. We can
then bound the kernels as
\(\displaystyle \frac{1}{L^2} \| W^{(- 2)}_{\ell, \underline{\omega}}
\|_{L^1} \leqslant
\sum_{n \geqslant 1}
\sum_{\text{\scriptsize{$\begin{array}{c}
\text{trees $\tau$ with}\\
\text{$n$ endpoints}
\end{array}$}}}^{(\ell)}\)
\(\displaystyle \times \frac{C^{s_{v_0}} C^{s_{v_1} + \cdots +
s_{v_{s_{v_0}}}}}{s_{v_0} ! \, s_{v_1} !
\cdots s_{v_{s_{v_0}}} !} (s_{v_0} !) \|
g_{\omega}^{(- 1)} \|^{s_{v_0} - 1}_{L^1} \|
g_{\omega}^{(- 1)}
\|_{L^{\infty}}^{\frac{| P_{v_1} | + \cdots + \left|
P_{v_{s_{v_0}}}
\right| - | P_{v_0} |}{2} - (s_{v_0} - 1)}\)
\(\displaystyle \times \prod_{j = 1}^{s_{v_0}} \left[ \|
g_{\omega}^{(0)}
\|_{L^1}^{s_{v_j} - 1} \| g_{\omega}^{(0)}
\|_{L^{\infty}}^{\frac{4
s_{v_j} - | P_{v_j} |}{2} - (s_{v_j} - 1)}
(s_{v_j} !) \right]\)
where we sum over trees \(\tau\) with \(n = s_1 + \cdots + s_s\)
endpoints.
Now, in general for \(h \leqslant 0\) we have
\(\displaystyle \frac{Z_{\lambda}}{Z_0} = e^{L^2 E^{(h)}} \int
P_{\leqslant h} (\mathrm{d}
\psi^{(\leqslant h)}) e^{V^{(h)}
(\psi^{(\leqslant h)}_{\omega})}\)
where \(V^{(h)}\) has kernels bounded by
\(\displaystyle \frac{1}{L^2} \| W^{(h)}_{\ell, \underline{\omega}}
\|_{L^1} \leqslant
\sum_{n \geqslant 1} \sum_{\text{trees $\tau$ with $n$
endpoints}} \sum_{\{ P_v \} :
| P_{v_0} | = \ell} \times\)
\(\displaystyle \times \prod_{v \text{ not e.p.}} \left[ C^{s_v} \|
g_{\omega}^{(h_v)} \|^{s_v
- 1}_{L^1} \| g_{\omega}^{(h_v)}
\|_{L^{\infty}}^{\frac{\sum_{i = 1}^{s_v} |
P_{v_i} | - | P_v |}{2} -
(s_v - 1)} \right]\)
and now since
\(\displaystyle \| g_{\omega}^{(h_v)} \|_{L^1} \approx 2^{- h_v}, \qquad
\| g_{\omega}^{(h_v)}
\|_{L^{\infty}} \approx 2^{h_v}\)
we obtain
\(\displaystyle \frac{1}{L^2} \| W^{(h)}_{\ell, \underline{\omega}}
(\underline{x}) \|_{L^1}
\leqslant \sum_{n \geqslant 1} | \alpha |^n
\sum_{\tau \in J_{h, n}} \prod_{v
\text{ not e.p.}} \left[ C^{s_v} 2^{-
(h_v - h + h) (s_v - 1)} 2^{(h_v - h +
h) \frac{\sum_{i = 1}^{s_v} |
P_{v_i} | - | P_v |}{2} - (s_v - 1)} \right]\)
\(\displaystyle \leqslant \sum_{n \geqslant 1} | \alpha |^n \sum_{\tau
\in J_{h, n}} 2^{h
\sum_{v \geqslant v_0} \left[ - 2 (s_v - 1) +
\frac{1}{2} \sum_{i = 1}^{s_v} |
P_{v_i} | - \frac{1}{2} | P_v |
\right]}\)
\(\displaystyle \times \prod_{v \text{ not e.p.}} \left[ C^{s_v}
2^{- (h_v - h) (s_v -
1)} 2^{(h_v - h) \frac{\sum_{i = 1}^{s_v} |
P_{v_i} | - | P_v |}{2} - (s_v -
1)} \right]\)
and now, using \(\sum_{v \geqslant v_0} (s_v - 1) = n - 1\) and the
telescoping identity \(\sum_{v \geqslant v_0} ( \sum_{i = 1}^{s_v} |
P_{v_i} | - | P_v | ) = 4 n - \ell\),
\(\displaystyle 2^{h \sum_{v \geqslant v_0} \left[ - 2 (s_v - 1) +
\frac{1}{2} \sum_{i =
1}^{s_v} | P_{v_i} | - \frac{1}{2} | P_v |
\right]} = 2^{h \left[ - 2 (n - 1)
+ \frac{4 n - \ell}{2} \right]} =
2^{h (2 - \ell / 2)}\)
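The collapse of the exponent to \(2 - \ell / 2\), independently of \(n\), is elementary arithmetic; a quick check over a range of values:

```python
from fractions import Fraction

# Check that -2(n - 1) + (4n - ell)/2 = 2 - ell/2 for every n, i.e. the
# number of vertices n drops out of the scaling exponent.
for n in range(1, 10):
    for ell in range(2, 4 * n + 1, 2):  # ell even, at most 4n external legs
        lhs = -2 * (n - 1) + Fraction(4 * n - ell, 2)
        assert lhs == 2 - Fraction(ell, 2)
```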
and the fact that \(n\) drops out is the signal that the theory is just
renormalizable (but not super renormalizable).
This computation signals that \((2 - \ell / 2)\) is the scaling
dimension of \(W^{(h)}_{\ell}\). We have now
\(\displaystyle \frac{1}{L^2} \| W^{(h)}_{\ell, \underline{\omega}}
(\underline{x}) \|_{L^1}
\leqslant 2^{h (2 - \ell / 2)} \sum_{n
\geqslant 1} | \alpha |^n \sum_{\tau
\in J_{h, n}}\)
\(\displaystyle \times \prod_{v \text{ not e.p.}} \left[ C^{s_v} 2^{-
(h_v - h) (s_v - 1)}
2^{(h_v - h) \frac{\sum_{i = 1}^{s_v} | P_{v_i} | -
| P_v |}{2} - (s_v - 1)}
\right]\)
We now rewrite the scale jump at \(v\) as a telescopic sum along the
path \([v_0, v]\) on the tree:
\(\displaystyle h_v - h = \sum_{v_0 \leqslant w \leqslant v} (h_w -
h_{w'})\)
where \(w'\) is the parent of \(w\) along this path, with the convention
that the parent of \(v_0\) is at scale \(h\). So we have
\(\displaystyle \sum_{v \geqslant v_0} (s_v - 1) (h_v - h) = \sum_{v
\geqslant v_0} \sum_{v_0
\leqslant w \leqslant v} (h_w - h_{w'}) (s_v -
1) = \sum_{w \geqslant v_0}
(h_w - h_{w'}) \sum_{v \geqslant w} (s_v -
1)\)
\(\displaystyle = \sum_{w \geqslant v_0} (h_w - h_{w'})
(\underbrace{n_w}_{\# \text{ of e.p.
following $w$ on $\tau$}} - 1)\)
and reasoning in this way we obtain
\(\displaystyle \prod_{\text{$w$ not e.p.}} 2^{(h_w - h_{w'}) \left[
\left( - 2 (n_w - 1) +
\frac{4 n_w - | P_w |}{2} \right) \right]} =
\prod_{\text{$v$ not e.p.}}
2^{(h_v - h_{v'}) (2 - | P_v | / 2)}\)
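The interchange of sums above can be verified on a small explicit tree (a hypothetical example, not from the lecture): vertices are pairs `(scale, children)`, leaves are endpoints, and we compare the two sides of \(\sum_{v} (s_v - 1) (h_v - h) = \sum_{w} (h_w - h_{w'}) (n_w - 1)\).

```python
h = -3  # root scale

# v0 at scale h+1 with two children: one endpoint, and one internal
# vertex at scale h+2 branching into three endpoints at scale h+3.
tree = (h + 1, [(h + 2, []),
                (h + 2, [(h + 3, []), (h + 3, []), (h + 3, [])])])

def internal_vertices(tree, root_parent_scale):
    """Collect (h_v, h_{v'}, s_v, n_v) for every non-endpoint vertex v."""
    out = []
    def walk(v, parent_scale):
        hv, children = v
        if not children:          # endpoint: contributes one e.p.
            return 1
        n_ep = sum(walk(c, hv) for c in children)
        out.append((hv, parent_scale, len(children), n_ep))
        return n_ep
    walk(tree, root_parent_scale)
    return out

data = internal_vertices(tree, h)
lhs = sum((s - 1) * (hv - h) for hv, hp, s, n in data)
rhs = sum((hv - hp) * (n - 1) for hv, hp, s, n in data)
assert lhs == rhs == 5
```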
and therefore we obtain
\(\displaystyle \frac{1}{L^2} \| W^{(h)}_{\ell, \underline{\omega}}
(\underline{x}) \|_{L^1}
\leqslant 2^{h (2 - \ell / 2)} \sum_{n
\geqslant 1} C^n | \alpha |^n
\sum_{\tau \in J_{h, n}} \prod_{\text{$v$
not e.p.}} 2^{\overbrace{(h_v -
h_{v'})}^{> 0} \left( 2 - \frac{1}{2} |
P_v | \right)}\)
From this formula we see that we lose memory of all the \(v\) for which
\(2 - | P_v | / 2 < 0\): for all \(v\) such that \(| P_v | \geqslant 6\)
we are fine, since we can resum the expression above and get a uniform
bound. But \(2 - | P_v | / 2 = 0\) for \(| P_v | = 4\) and \(2 - | P_v |
/ 2 = 1\) for \(| P_v | = 2\), so the terms with effective vertices with
\(| P_v | = 2, 4\) cause problems if we need to iterate the formula many
times.
We will see next time how to cure this.