$E_1$-degeneration and $d'd''$-lemma
Main Authors | , , |
---|---|
Format | Journal Article |
Language | English |
Published | 21.06.2015 |
Summary: | For a double complex $(A, d', d'')$, we show that if it satisfies the $d'd''$-lemma and the spectral sequence $\{E^{p, q}_r\}$ induced by $A$ does not degenerate at $E_0$, then it degenerates at $E_1$. We apply this result to prove the degeneration at $E_1$ of a Hodge-de Rham spectral sequence on compact bi-generalized Hermitian manifolds that satisfy a version of the $d'd''$-lemma. |
---|---|
DOI: | 10.48550/arxiv.1506.06451 |
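For context (not stated in the record itself): one common formulation of the $d'd''$-lemma for a bounded double complex $(A, d', d'')$ with total differential $d = d' + d''$ is
$$\ker d' \cap \ker d'' \cap \operatorname{im} d \;=\; \operatorname{im}(d'd''),$$
and degeneration of the induced spectral sequence at $E_1$ means that every differential $d_r \colon E_r^{p,q} \to E_r^{p+r,\,q-r+1}$ vanishes for $r \ge 1$, so that $E_1^{p,q} \cong E_\infty^{p,q}$ for all $p, q$. This is only the standard reference formulation; the paper's version for compact bi-generalized Hermitian manifolds may differ in its conventions.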