Commit messages
Fix rst link formatting for guideline links
See merge request BuildStream/buildstream!811
_frontend/status.py: Completely remove the blessings dependency from BuildStream
See merge request BuildStream/buildstream!808
This actually improves the reliability of the status bar, because we
now disable it completely when not all of the required terminal
escape sequences are supported by the given terminal.
This replaces the few functions we were using, to move the cursor
up one line, move it to the beginning of the line, and to clear a
line, with low level functions provided by the curses module in
the standard library.
This change makes it easier for downstream distro package maintainers
to package BuildStream, particularly on Fedora.
Aside from changing _frontend/status.py, this commit includes the
following changes:
* _frontend/app.py: Use Python's isatty() function to determine if
                    we are connected to a tty, instead of relying
                    on blessings.
* setup.py: Remove the dependency on blessings.
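The replacement described above can be sketched with the standard library's curses capability database. This is an illustrative sketch only, not BuildStream's actual status.py code; the helper name `term_caps()` and the capability set are assumptions:

```python
import curses
import sys

def term_caps():
    # Hypothetical helper: look up the escape sequences the status bar
    # needs, or return None to disable the status bar entirely.
    if not sys.stdout.isatty():
        return None
    try:
        curses.setupterm()
    except curses.error:
        return None
    caps = {
        'move_up': curses.tigetstr('cuu1'),    # cursor up one line
        'move_start': curses.tigetstr('cr'),   # to beginning of line
        'clear_eol': curses.tigetstr('el'),    # clear to end of line
    }
    # If any sequence is unsupported on this terminal, disable the
    # status bar completely rather than emit broken output.
    if any(seq is None for seq in caps.values()):
        return None
    return caps
```

Returning None for any shortfall mirrors the behaviour the commit describes: the status bar is either fully supported or fully disabled.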
Add SkipJob for indicating a skipped activity
See merge request BuildStream/buildstream!765
INFO messages for both pulling and pushing are now status messages.
Calls to the messaging API through `self.context.message()` have now
been switched to `element.info`.
Pulled/Pushed messages will no longer be produced from within
element.py, instead they will be produced during CasCache push() and
pull() appropriately.
Message consistency has also been improved.
This removes the timed_activity for an element _push action. This is
unnecessary as the job is already being timed elsewhere.
Adds a test to ensure that BuildStream alerts the user of a skipped push
when the remote already has the artifact cached.
The SKIPPED message type is now used to indicate the end of a task which
was successful without having to perform the given task.
This overhauls the use of `Queue.done()` and therefore queues do not
need to provide a processed/skipped return value from `done()`. Instead
this is replaced with the action of raising a `SkipJob` exception from
within `Queue.process()`.
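The pattern described above can be sketched as follows. The class and attribute names (`PushQueue`, `cached_remotely`, `run_job`) are illustrative stand-ins, not BuildStream's actual scheduler API:

```python
class SkipJob(Exception):
    """Raised from within Queue.process() to mark the job as skipped."""

class PushQueue:
    # Hypothetical queue: the remote-cache check is invented for
    # illustration.
    def process(self, element):
        if element.cached_remotely:
            raise SkipJob("remote already has the artifact")
        element.pushed = True

def run_job(queue, element):
    # The scheduler no longer inspects a processed/skipped return value
    # from done(); it reacts to the SkipJob exception instead.
    try:
        queue.process(element)
    except SkipJob:
        return "SKIPPED"
    return "SUCCESS"
```

Raising an exception keeps `done()` free of per-queue bookkeeping: a skip is signalled at the point it is detected.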
Incorrect error when malformed project.conf
Closes #642
See merge request BuildStream/buildstream!792
bst build returned "missing project.conf" when a project.conf was invalid.
This resulted in an existing project with malformed YAML being dismissed
and an attempt to create a new project.
Added a new exception for this case.
fix chroot sandbox devices
See merge request BuildStream/buildstream!781
This is needed to permit access to the device nodes added to /dev
on Linux when FUSE is used as root.
The chroot sandbox only works with all privileges,
so there's no explicit check for being root
or having the appropriate capabilities.
A check for whether it's running as root isn't needed on Linux with bubblewrap
because /dev or its devices are mounted on top of the FUSE layer,
so device nodes are accessed directly rather than through the FUSE layer.
This fixes all devices being mapped to the non-existent device 0,
which prevents being able to use even safe devices like /dev/null
through the hardlinks FUSE layer.
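The fix amounts to carrying each device's real major/minor numbers through to the sandbox rather than leaving them at zero. A sketch using the standard os module; the helper name `device_numbers()` is hypothetical:

```python
import os
import stat

def device_numbers(path):
    # Hypothetical helper: read the real device numbers of an existing
    # node such as /dev/null, instead of defaulting to device 0.
    st = os.stat(path)
    if not (stat.S_ISCHR(st.st_mode) or stat.S_ISBLK(st.st_mode)):
        raise ValueError(f"{path} is not a device node")
    return os.major(st.st_rdev), os.minor(st.st_rdev)

# Recreating the node inside the sandbox would then use os.makedev():
#   os.mknod(target, mode, device=os.makedev(major, minor))  # needs root
```

With device 0, even a safe node like /dev/null points at nothing; preserving the real numbers lets it behave normally through the FUSE layer.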
Fix artifact config crash
Closes #625
See merge request BuildStream/buildstream!804
Test that we get the expected error when configuring a client-cert
without client-key, or the inverse.
mal-specified
When configuring a push remote and specifying either the client-cert
or the client-key, then both must be specified. This ensures we
get an informative error instead of a stack trace and BUG.
Fixes issue #625
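The validation can be sketched like this. The option keys mirror the user configuration named above; the function name and the use of ValueError are illustrative, not BuildStream's actual error handling:

```python
def check_remote_auth(config):
    # Illustrative check: either both client-cert and client-key are
    # given, or neither is. Anything else is a configuration error.
    cert = config.get('client-cert')
    key = config.get('client-key')
    if bool(cert) != bool(key):
        present, missing = (
            ('client-cert', 'client-key') if cert
            else ('client-key', 'client-cert'))
        raise ValueError(
            f"{present} was specified without {missing}; "
            f"both must be given together")
    return cert, key
```

Failing early with a named error is what turns the previous stack trace and BUG into an informative message.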
sandbox/_sandboxremote.py: Acquire cache via Platform
See merge request BuildStream/buildstream!797
The SandboxRemote used to construct its own CASCache which was
considered dangerous. This patch replaces that with acquisition of
the cache via the Platform singleton, hopefully eliminating issues
from having more than one artifact cache object in a single process.
Signed-off-by: Daniel Silverstone <daniel.silverstone@codethink.co.uk>
The initialization of remotes is done by ArtifactCache.setup_remotes()
and as such it was wrong for these tests to be calling
CASCache.initialize_remotes() a second time.
Signed-off-by: Daniel Silverstone <daniel.silverstone@codethink.co.uk>
Since ArtifactCache.setup_remotes() can be expensive and should only
happen once, this commit will assert() if it is called a second time
on an artifact cache instance.
Signed-off-by: Daniel Silverstone <daniel.silverstone@codethink.co.uk>
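A minimal sketch of such a one-shot guard; the class and method names follow the commit message, but the body is illustrative rather than BuildStream's real implementation:

```python
class ArtifactCache:
    def __init__(self):
        self._remotes_setup = False
        self._remotes = []

    def setup_remotes(self, remotes):
        # Initializing remotes is expensive and must only happen once,
        # so assert if this runs twice on the same cache instance.
        assert not self._remotes_setup, \
            "setup_remotes() was called more than once"
        self._remotes_setup = True
        self._remotes = list(remotes)
```

An assert (rather than a user-facing error) is appropriate here because a second call indicates a programming mistake, not bad user input.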
Fix override options
Closes #658
See merge request BuildStream/buildstream!802
This is a regression test for issue #658
This ensures that option expressions are resolved in the project-level
overrides before attempting to composite them onto the instantiated
elements. This seems to be a regression from introducing the include
directive.
This fixes issue #658
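The ordering bug can be illustrated with a toy resolver. This is a deliberately simplified stand-in for BuildStream's option processor: the `'(?)'` conditional shape is borrowed from project.conf syntax, but the `eval`-based evaluation and both function names are inventions for this sketch:

```python
def resolve_options(node, options):
    # Toy resolver: '(?)' entries hold a list of single-key dicts
    # mapping an option expression to a branch to merge in.
    if not isinstance(node, dict):
        return node
    out = {}
    for key, value in node.items():
        if key == '(?)':
            for conditional in value:
                (expr, branch), = conditional.items()
                if eval(expr, {}, dict(options)):  # toy evaluation only
                    out.update(resolve_options(branch, options))
        else:
            out[key] = resolve_options(value, options)
    return out

def composite(base, overrides):
    # Shallow composition: override keys win over base keys.
    merged = dict(base)
    merged.update(overrides)
    return merged
```

The fix is purely about ordering: `resolve_options()` must run on the overrides before `composite()` applies them to an element, so no unresolved `'(?)'` entries leak onto instantiated elements.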
Update contributing guide
See merge request BuildStream/buildstream!801
Address post-merge review of Ensure PWD is set in process environment
See merge request BuildStream/buildstream!788
The current directory isn't always in the Python module search path,
so we have to ensure it is for the script to work.
Strictly speaking, the user may already have a modified PYTHONPATH
at which point PYTHONPATH=".${PYTHONPATH+:$PYTHONPATH}" is necessary,
but it's probably premature to overcomplicate the documentation like that
before we discover it's a problem.
Since we now set PWD in the environment of builds, existing builds
may behave differently and so must cache differently now.
Somehow I missed this when originally forking the file from the click
library, now noticing that we should have followed what was written
in: https://github.com/pallets/click/blob/master/LICENSE
Bunch of cleanups
See merge request BuildStream/buildstream!798
Remove unneeded cruft.
* Rename tree to dir_digest to make it clear this is a Digest object,
and not a Tree object.
* Add documentation
* Rename it to _commit_directory() because… it is what it does; and
also for symmetry with _fetch_directory().
* Rename digest to dir_digest to make it clear this is a digest for a
  directory. A following commit will also reuse the same variable name.
* Document method.
Tristan Maat created the original file, so he is added as the author.
We want to check if some file is already cached here, not the parent
directory.
Don't delete required artifacts when tracking is enabled
See merge request BuildStream/buildstream!793
Same test as test_never_delete_required(), except that this test ensures
that we never delete required artifacts when their cache keys are
discovered dynamically during the build.
* create_element_size()
Now uses a git Repo object instead of a local source, and
returns the repo.
* update_element_size()
Added this function which can now resize the expected output
of an element generated with create_element_size(), useful
to allow testing sized elements with the tracking feature.
This allows one to modify a file in an existing git repo,
as opposed to adding a new one.
These tests were not checking that we fail for the expected reasons.
Added `res.assert_task_error(ErrorDomain.ARTIFACT, 'cache-too-full')`
where we expect to fail because the cache is too full.