fixup! Add option to preserve comments when parsing templates
pawamoy committed Oct 10, 2024
1 parent a3db7ab commit 35681ec
Showing 3 changed files with 12 additions and 1 deletion.
1 change: 1 addition & 0 deletions CHANGES.rst
@@ -9,6 +9,7 @@ Unreleased
 - Use modern packaging metadata with ``pyproject.toml`` instead of ``setup.cfg``.
   :pr:`1793`
 - Use ``flit_core`` instead of ``setuptools`` as build backend.
+- Add the ``preserve_comments`` parameter to ``Environment.parse`` to preserve comments in template ASTs. :pr:`2037`


 Version 3.1.5
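The changelog entry above describes the new API surface. A minimal usage sketch follows; the template string and variable name are illustrative, and the exact shape of the returned AST is an assumption, not shown in this commit:

    from jinja2 import Environment

    env = Environment()
    # Before this change, {# ... #} comments were dropped during parsing.
    # With preserve_comments=True, Environment.parse keeps them in the AST.
    tree = env.parse("{# greet the user #}Hello {{ name }}!", preserve_comments=True)
    print(tree)  # the template AST, expected to include a node for the comment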
3 changes: 3 additions & 0 deletions src/jinja2/environment.py
@@ -609,6 +609,9 @@ def parse(
         If you are :ref:`developing Jinja extensions <writing-extensions>`
         this gives you a good overview of the node tree generated.
+
+        .. versionchanged:: 3.2
+            Added `preserve_comments` parameter.
         """
         try:
             return self._parse(source, name, filename, preserve_comments)
9 changes: 8 additions & 1 deletion src/jinja2/lexer.py
@@ -614,7 +614,11 @@ def tokenize(
         state: t.Optional[str] = None,
         preserve_comments: bool = False,
     ) -> TokenStream:
-        """Calls tokeniter + tokenize and wraps it in a token stream."""
+        """Calls tokeniter + tokenize and wraps it in a token stream.
+
+        .. versionchanged:: 3.2
+            Added `preserve_comments` parameter.
+        """
         stream = self.tokeniter(source, name, filename, state)
         return TokenStream(
             self.wrap(stream, name, filename, preserve_comments), name, filename
@@ -629,6 +633,9 @@ def wrap(
     ) -> t.Iterator[Token]:
         """This is called with the stream as returned by `tokenize` and wraps
         every token in a :class:`Token` and converts the value.
+
+        .. versionchanged:: 3.2
+            Added `preserve_comments` parameter.
         """
         ignored = ignored_tokens
         if preserve_comments:
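The body of the ``if preserve_comments:`` branch is cut off in the hunk above. A plausible sketch of the filtering it implies, reusing the token constants and the ``ignored_tokens`` set that already exist in ``jinja2.lexer``; the exact set operation is an assumption, not taken from this diff:

    from jinja2.lexer import (
        TOKEN_COMMENT,
        TOKEN_COMMENT_BEGIN,
        TOKEN_COMMENT_END,
        ignored_tokens,
    )

    preserve_comments = True
    ignored = ignored_tokens
    if preserve_comments:
        # Keep comment tokens in the stream instead of skipping them,
        # so the parser can turn them into AST nodes.
        ignored = ignored - frozenset(
            (TOKEN_COMMENT_BEGIN, TOKEN_COMMENT, TOKEN_COMMENT_END)
        )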
