| Commit message | Author | Age | Files | Lines |

* Update CHANGES.
* Fix missing `versionadded` declarations.
* Fix a few linter issues.
Check both ways now: whether a token that should be whitespace is not
marked as such, and whether a token was incorrectly marked as whitespace.
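A minimal sketch of such a two-way check (all names hypothetical; this assumes tokens arrive as pairs of a dotted token-type name and a string value, which may differ from the real script's input format):

```python
def check_whitespace_both_ways(tokens):
    """Flag tokens whose type and value disagree about whitespace.

    tokens: iterable of (ttype, value) pairs, where ttype is a dotted
    token-type name such as 'Token.Text.Whitespace'.
    """
    problems = []
    for ttype, value in tokens:
        marked_ws = 'Whitespace' in ttype
        is_ws = value != '' and value.strip() == ''
        if is_ws and not marked_ws:
            # direction 1: the value is whitespace but the type says otherwise
            problems.append((ttype, value, 'should be whitespace'))
        elif marked_ws and not is_ws:
            # direction 2: the type claims whitespace but the value is not
            problems.append((ttype, value, 'incorrectly marked as whitespace'))
    return problems
```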
Porting notes:
- tox handles Python environments automatically. Remove a bit of PYTHONPATH
manipulation (that was using Python 2 code which always failed!)
- No `clean` target: `git clean -xdf` should fit the bill.
- No `reindent` target: the `reindent.py` script it was using does not
exist (anymore?).
- No equivalent of tox-test-coverage, which was an artifact of the past
  using nose. Instead, only the test-coverage target is ported; it uses
  pytest and works.
Some of these are probably unnecessary (ASCII-only content), but it's
easier not to think about it.
Skip the `---tokens---` line when parsing a snippet.
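In outline, the parsing step might look like this (a hedged sketch; the actual snippet-file format in the repository may differ):

```python
def split_snippet(lines):
    """Split a snippet file into (input_text, token_lines), skipping
    the '---tokens---' separator line itself."""
    text, tokens = [], []
    target = text
    for line in lines:
        if line.strip() == '---tokens---':
            target = tokens  # switch to the token section; drop the marker
            continue
        target.append(line)
    return ''.join(text), tokens
```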
Scan snippet files in addition to token output streams.
Add a script which checks for whitespace tokens, similar to the script
checking for repeated tokens. Also move some functionality shared between
them into a utility file, and make check_repeated_token PEP 8 compliant.
Also fix a broken link and decode as UTF8 in count_token_references.py.
pathlib.Path entries in sys.path are actually ignored. See
https://github.com/python/cpython/issues/96482
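The fix, sketched (the upstream issue is real; the directory used here is illustrative):

```python
import sys
from pathlib import Path

# Path objects on sys.path are silently ignored by the import system
# (see https://github.com/python/cpython/issues/96482), so convert to
# str before inserting.
scripts_dir = Path('.').resolve()  # illustrative directory
sys.path.insert(0, str(scripts_dir))
```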
Use a unified script, to reduce code duplication and in preparation
for doing a similar thing with styles and filters. The new script
also uses a bit more modern Python APIs (e.g., pathlib).
Unlike the previous scripts, it does not replace CRLF with LF,
because Git should do that itself.
This change also adds a script to regenerate the list of CSS properties
from the W3C source if needed.
and use it in the "duplicate filenames" checker.
Use -m build instead of setup.py.
Web accessibility is important. Unfortunately, many Pygments styles
currently have rules with poor contrast. This commit introduces a test
case that fails if the minimum contrast of a style gets worse, e.g.:
E AssertionError: contrast degradation for style 'borland'
E The following rules have a contrast lower than the required 2.9:
E
E * 1.90 Token.Text.Whitespace
E * 2.80 Token.Generic.Heading
E * 2.30 Token.Generic.Subheading
E
E assert not 1.9 < 2.9
This is accomplished by storing the current minimum contrasts in
./tests/contrast/min_contrasts.json.
When you improve a minimum contrast the test fails with:
E AssertionError: congrats, you improved a contrast! please run ./scripts/update_contrasts.py
E assert not 1.9 > 0.9
Running the script as instructed updates the JSON file, making the test pass.
New styles are required to meet the WCAG AA contrast minimum of 4.5.
First commit to address #1718.
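The contrast values above come from the WCAG 2.x formula. A self-contained sketch of the computation (the test's actual helper may be organized differently):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color, channels in 0-255."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1.0 to 21.0."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)
```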
Improve checks.
* Fix lots of small errors.
* Remove the line length check.
* Add an option to skip lexers with no aliases.
* Run checks in make check.
* Add a new CI target.
PR #1819 provides a tool to identify unique token types. This PR
aims to remove the most obvious cases of unicorn styles, which are
used in only a single lexer.
and a few other things
Co-authored-by: Georg Brandl <georg@python.org>
pyupgrade is a tool to automatically upgrade syntax for newer versions
of the Python language.
The project has been Python 3 only since
35544e2fc6eed0ce4a27ec7285aac71ff0ddc473, allowing for several cleanups:
- Remove unnecessary "-*- coding: utf-8 -*-" cookie. Python 3 reads all
source files as utf-8 by default.
- Replace IOError/EnvironmentError with OSError. Python 3 unified these
exceptions. The old names are aliases only.
- Use the Python 3 shorter super() syntax.
- Remove "utf8" argument from encode/decode. In Python 3, this value is
  the default.
- Remove "r" from open() calls. In Python 3, this value is the default.
- Remove u prefix from Unicode strings. In Python 3, all strings are
Unicode.
- Replace io.open() with builtin open(). In Python 3, these functions
are functionally equivalent.
Co-authored-by: Matthäus G. Chajdas <Anteru@users.noreply.github.com>
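Taken together, the cleanups look roughly like this (a schematic before/after, not actual Pygments code):

```python
# Before (Python 2 compatible):
#   # -*- coding: utf-8 -*-
#   class Sub(Base):
#       def greet(self):
#           return super(Sub, self).greet()
#   marker = u'\u2713'.encode('utf8')
#   data = io.open('file.txt', 'r').read()

# After (Python 3 only):
class Base:
    def greet(self):
        return 'hi'

class Sub(Base):
    def greet(self):
        return super().greet()       # shorter super() syntax

marker = '\u2713'.encode()           # utf-8 is the default codec
assert marker.decode() == '\u2713'   # all strings are Unicode, no u prefix

try:
    open('no-such-file-here.txt')    # builtin open; 'r' is the default mode
except OSError:                      # IOError is now just an alias of OSError
    missing = True
```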
(useful for fuzzer testcases)
* Add a check for CR/LF in files.
This can occur when checking out things on Windows, and it breaks the
tarball. This adds a script to check for the presence of CR/LF which
exits early if anything gets found.
* Improve error checking.
* Include the external folder and check that.
* Include .bashcomp files.
* Use the correct CR/LF on the checker itself.
* Address review feedback.
* Remove || true.
* Fix docs.
* Print the first offending file name.
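The core of such a checker can be sketched in a few lines (hypothetical names; the real script adds the error checking and folder selection described above):

```python
from pathlib import Path

def first_crlf_file(root):
    """Return the first file under root containing CR/LF, or None."""
    for path in sorted(Path(root).rglob('*')):
        if path.is_file() and b'\r\n' in path.read_bytes():
            return path  # exit early on the first offending file
    return None
```

A caller would print that first offending file name and exit with a nonzero status.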
Windows doesn't support symlinks out of the box, and there doesn't
seem to be any use of this symlink, so let's remove it.
* all: remove "u" string prefix
* util: remove unirange
Since Python 3.3, all builds are wide-Unicode compatible.
* unistring: remove support for narrow-Unicode builds,
which stopped being relevant with Python 3.3
* Remove Python 2 compatibility.
* Remove 2/3 shims in pygments.util.
* Update setup.py metadata.
* Remove unneeded object inheritance.
* Remove unneeded future imports.