$E_1$-degeneration and $d'd''$-lemma

Bibliographic Details
Main Authors: Chen, Tai-Wei; Ho, Chung-I; Teh, Jyh-Haur
Format: Journal Article
Language: English
Published: 21.06.2015

More Information
Summary: For a double complex $(A, d', d'')$, we show that if it satisfies the $d'd''$-lemma and the spectral sequence $\{E^{p, q}_r\}$ induced by $A$ does not degenerate at $E_0$, then it degenerates at $E_1$. We apply this result to prove the degeneration at $E_1$ of a Hodge-de Rham spectral sequence on compact bi-generalized Hermitian manifolds that satisfy a version of the $d'd''$-lemma.
DOI:10.48550/arxiv.1506.06451
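Note: For context, the two conditions named in the summary are sketched below in standard form, using the notation of the abstract. These are the usual textbook formulations, not text quoted from the paper, which may phrase the conditions in an equivalent but different way.

The $d'd''$-lemma for a double complex $(A, d', d'')$ asserts that any element that is both $d'$- and $d''$-closed and lies in the image of $d'$ or of $d''$ is already in the image of $d'd''$:
\[
  \ker d' \cap \ker d'' \cap \bigl( \operatorname{im} d' + \operatorname{im} d'' \bigr)
  = \operatorname{im} (d' d'').
\]

Degeneration at $E_1$ of the induced spectral sequence $\{E^{p,q}_r\}$ means that every differential $d_r$ vanishes for $r \ge 1$, so the first page already computes the limit:
\[
  d_r = 0 \ \text{for all } r \ge 1,
  \qquad \text{equivalently} \qquad
  E_1^{p,q} \cong E_\infty^{p,q} \ \text{for all } p, q.
\]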