A recent commit [1] broke Windows support by using shlex to
unquote paths.
The issue with shlex.split is that it doesn't work with Windows
paths, treating backslashes as escape characters.
We'll just replace backslashes with slashes before using shlex.split,
converting them back afterwards.
Closes-Bug: #1831242
[1] Id2cc32e4e40c1f834b19756e922118d8526358d3
Change-Id: Icb3abca004a35ab9760db8116fedfa96d012d0d0
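
A minimal sketch of the workaround described above (the function name is illustrative, not pbr's actual code):

```python
import shlex

def split_windows_safe(value):
    # shlex.split treats backslashes as escape characters, which
    # mangles Windows paths such as C:\Program Files\x. Swap them
    # for forward slashes before splitting, restore them afterwards.
    converted = value.replace("\\", "/")
    return [token.replace("/", "\\") for token in shlex.split(converted)]
```

Note that genuine forward slashes are also rewritten to backslashes on the way back, which is acceptable on Windows where both separators work.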
Previously, when using data_files with a glob that matched a file with
whitespace in the name, pip would error with a message that the file
does not exist, e.g.:
error: can't copy 'ansible/roles/yatesr.timezone/templates/timezone-Arch': doesn't exist or not a regular file
The problem was that ansible/roles/yatesr.timezone/templates/timezone-Arch
was a truncated form of the actual filename:
ansible/roles/yatesr.timezone/templates/timezone-Arch Linux.j2
Note the space in the filename and that it has been split on this space.
This change allows you to use a glob that matches files with whitespace
in the name. It does this by quoting the path.
Change-Id: Id2cc32e4e40c1f834b19756e922118d8526358d3
Fixes-Bug: 1810934
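
The effect can be seen with shlex directly; quoting preserves the embedded space (a standalone illustration, not the patch itself):

```python
import shlex

path = "ansible/roles/yatesr.timezone/templates/timezone-Arch Linux.j2"

# Unquoted, the whitespace splits the filename into two tokens:
assert shlex.split(path) == [
    "ansible/roles/yatesr.timezone/templates/timezone-Arch",
    "Linux.j2",
]

# Quoted, the full path survives the round trip:
assert shlex.split('"%s"' % path) == [path]
```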
If a subdirectory contained the source prefix in its name, it was
replaced globally, e.g. using share/ansible = ansible/*, with the following
directory structure:
ansible/roles/kolla-ansible/test
the files would be installed as follows:
share/ansible/roles/kolla-share/test
whereas we expected:
share/ansible/roles/kolla-ansible/test
This patch changes the behavior so that only the first occurrence is
replaced.
Change-Id: I0aab845315dab0aaccd5f67725d2ebcf0fd08aef
Fixes-Bug: 1810804
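
A sketch of the fix with a hypothetical helper; Python's str.replace takes a count argument limiting the number of substitutions:

```python
def remap_prefix(path, source, target):
    # Replace only the leading occurrence of the source prefix;
    # later matches are part of real directory names and must
    # be left alone.
    return path.replace(source, target, 1)

remap_prefix("ansible/roles/kolla-ansible/test", "ansible", "share/ansible")
# → "share/ansible/roles/kolla-ansible/test"
```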
codesearch.o.o shows a single, long-dead project using this [1]. Let's
just remove it and push people to set 'builders' instead if they really
want LaTeX.
[1] http://codesearch.openstack.org/?q=build_sphinx_latex
Change-Id: I820d9c540ae81717d7b33bbb4d2a4031b529b52c
It seems like we currently only expose the rpm
version translation command, when the deb version
translation command is equally useful (for those
that package deb things).
Change-Id: I0df175e5206d9d3a806bf33c486765ad1aa8aa6b
Intended usage: getting an rpm-*compatible* version number from pbr
from a pbr-using python package. Typically this will then get
put into an rpm spec file (what is used to build an rpm) and used to
build an rpm using `rpmbuild` (the thing used to translate rpm spec
files into rpms).
For example, this allows [1][2] (which I am the primary maintainer of
and godaddy, cray, and y! are users of) to get out of the wacky
and/or flaky rpm version number calculation/conversion process (which
it previously worked around by saying only to use tags).
[1] https://github.com/stackforge/anvil
[2] https://anvil.readthedocs.org/en/latest/topics/summary.html
Change-Id: I26cd1d6e8aef69d52e8a4f36bd264c690e712ba3
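
As a loose illustration of why translation is needed at all (this is not pbr's algorithm; the tilde convention assumes a modern rpm):

```python
import re

def to_rpm_version(version):
    # rpm sorts "~" segments before the bare version, so a Python
    # pre-release like 2.0.0rc1 can map to 2.0.0~rc1, which rpm
    # then correctly orders before the final 2.0.0 release.
    return re.sub(r'(a|b|rc)(\d+)$', r'~\1\2', version)

to_rpm_version("2.0.0rc1")  # "2.0.0~rc1"
```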
readthedocs uses 'setup.py install' to prepare trees for doc creation,
but ChangeLog is not currently created there, and doing so would be
nice. This won't affect develop invocations AFAICT, and even if it
did, the overhead is ~10% of the time to run 0 tests in Nova today -
i.e. quite tolerable.
Change-Id: I7bc18fc9ca2dbe852598cc79b2ad6273fc53557d
Setuptools doesn't [yet] support markers in tests_require parameters
to setup(). While we need to get that fixed, we also need to support
syncing marker-constrained requirements from global-requirements.
Since version constraints are needed to successfully port to Python3,
this won't be a long-term viable solution: but few of our test-only
dependencies require version constraints, so we can do this in the
interim. The ones that do - such as python-mysql - we're either
moving away from, or centralising them into optional dependencies
on other packages where the tests_require limitation shouldn't apply.
Change-Id: I31adaf35c8d7b72fe3f8c9242cc356fe34d537e8
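
Marker-constrained requirement lines pair a requirement with an environment condition after a semicolon; a minimal, hypothetical parser for the syncing step might look like:

```python
def split_marker(line):
    # Split "requirement;marker" into its two halves; lines
    # without a marker come back with None.
    if ';' in line:
        req, marker = line.split(';', 1)
        return req.strip(), marker.strip()
    return line.strip(), None

split_marker("MySQL-python;python_version=='2.7'")
# → ("MySQL-python", "python_version=='2.7'")
```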
Treat each of them separately when searching for packages in there.
Change-Id: Icb2e2cf2a9cac3d15141aeb8ffad0e38f76cf2e7
Closes-Bug: #1426449
setuptools > 12 has a new flow for writing out generated script text.
It's nicer, actually, because it means we can just subclass and extend
one method instead of monkeypatching.
Change-Id: I56e7bea60df8a59d859575d426ce93c45ffee314
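
The pattern in the abstract (class and method names here are illustrative, not setuptools' real API):

```python
class ScriptWriter(object):
    def get_script_text(self, name):
        return "# default launcher for %s\n" % name

class PbrScriptWriter(ScriptWriter):
    # Extending one well-defined method keeps the rest of the
    # parent's behavior intact, unlike patching functions in place.
    def get_script_text(self, name):
        text = super(PbrScriptWriter, self).get_script_text(name)
        return text + "import %s\n" % name
```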
Stop including git sha in version strings
We include it in pbr.json now. Including it is contentious in the world
of python, and it's up for debate as to whether or not it provides value.
Write and read more complex git sha info
Instead of encoding the git sha into the version string, add it to
a metadata file. This will allow us to get out of the business of
arguing with pip and setuptools about version info. In order to make
this really nice, provide a command line utility called "pbr" that has
subcommands to print out the metadata that we're now including in the
egg-info dir.
Only import sphinx during hook processing
When pbr is imported to handle writing the egg_info file because of
the entry point, it's causing sphinx to get imported. This has a
cascading effect when docutils is being installed on a system
with pbr installed. If some of the imports fail along the way, allow
pbr to continue usefully but without the Sphinx extensions
available. Eventually, when everything is installed, those
extensions will work again when the commands for build_sphinx, etc.
are run separately.
Also slip in a change to reorder the default list of environments
run by tox so the testr database is created using a dbm format
available to all python versions.
Integration test PBR commits
Make sure that if a PBR commit is being tested then we install and
use that source rather than the latest PBR release.
Change-Id: Ie121e795be2eef30822daaa5fe8ab1c2315577ae
(cherry picked from commit 65f4fafd907a16ea1952ab7072676db2e9e0c51d)
(cherry picked from commit cd7da23937b66fea3ec42fa2f5a128f363a97e7e)
Closes-Bug: #1403510
Co-Authored-By: Clark Boylan <clark.boylan@gmail.com>
Co-Authored-By: Doug Hellmann <doug@doughellmann.com>
Co-Authored-By: Jeremy Stanley <fungi@yuggoth.org>
Change-Id: I9af2d95d19e170a30f872112b4571196dc98d393
Partial-Bug: #1229324
Implement a local egg_info command that only re-generates the
SOURCES.txt file when we need to. That is:
- If there is no SOURCES.txt, make one
- If we have run the sdist command, make one
Otherwise, leave well enough alone.
Also, skip doing any git processing if SKIP_GIT_SDIST is specified.
This should mean that consumers of our tarballs should not get screwed
by the need to inject git processing into the sdist.
Change-Id: I163b1c153d030e79b120600a2890edeb49e1fa90
Similar to the work in the packages argument, allow the specification
of a directory to recursively include as part of the install.
Change-Id: Ife0414af468e7fcd4fc419eafc3e19e29efcfc7b
Setting the value directly in kwargs is not doing what we want it to do.
This is largely because of how the hook system works. When the hooks
process backwards_compat, they do a config.get('backwards_compat',
dict()), which is then written back to the config dict - which means
that blank override values are being fed in.
Later we can go through and just re-engineer how that works. For now,
unbreak nova.
Change-Id: I0c6055253cbc89b6884553e6f2fbfe8a7bbd1953
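
The failure mode can be reduced to a tiny demonstration (names per the message; the surrounding hook machinery is omitted):

```python
def run_hook(config):
    # The hook reads the section with a default and writes the
    # result straight back - so a missing section becomes a blank
    # override in the config.
    section = config.get('backwards_compat', dict())
    config['backwards_compat'] = section
    return config

cfg = run_hook({})
assert cfg['backwards_compat'] == {}  # blank value injected
```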
During python setup.py develop, easy_install is called via a different
code path to install the scripts. We need to inject into that place as
well.
Change-Id: Iab319f3771529c6d57f6a36ec717fb3278839f78
The console scripts generated by entry_points are too complex for
our needs and make things run slowly in service of a multi-version
install that we don't use.
Instead, install a simple script which just does a direct import.
Change-Id: I1ec46837cea07db514f2fb6338c7bced34a83c4a
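
The shape of such a launcher, sketched (module and function names are placeholders):

```python
# Template for the simple replacement script: no pkg_resources,
# no multi-version resolution - just import and call.
SCRIPT_TEMPLATE = """\
#!/usr/bin/env python
import sys
from %(module)s import %(func)s

if __name__ == "__main__":
    sys.exit(%(func)s())
"""

script = SCRIPT_TEMPLATE % {"module": "mypkg.cmd", "func": "main"}
```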
Change-Id: Iaa327862fc9e43aade2dd1ed66aa27f335f12ac7
As a step towards a common interface to running tests even when we're
not in the virtualenv - make python setup.py test actually be useful.
Change-Id: I8e4bc9bb78be37b4d13f8d6c2edfe2d67554ad78
If we're running the installation of dependencies for installation
ourselves anyway - just use pip and be done with it. This means
that our requirements will get installed consistently regardless of
whether we're using pip, python setup.py install or a tox environment.
Change-Id: If92557a33a76553ad36bd136fa87780857a894b1
It was long and rambly. It's also in need of unit testing. So split
it into some classes so that we can test the inputs and outputs
more sensibly.
Change-Id: I3d28f5771e38b819f98a9af06aeb06529be7b302