| Commit message | Author | Age | Files | Lines |
This commit enriches the metadata we store on artifacts in the
new detached low/high diversity metadata files:
* The SandboxConfig is now stored in the artifact, allowing
one to perform activities such as launching sandboxes on
artifacts downloaded via artifact name (without backing
project data).
* The environment variables are now stored in the artifact,
similarly allowing one to shell into a downloaded artifact
which is unrelated to a loaded project.
* The element variables are now stored in the artifact, allowing
more flexibility in what the core can do with a downloaded
ArtifactElement.
* The element's strict key is now stored in the artifact.
All of these can, of course, additionally enhance traceability
in the UI with commands such as `bst artifact show`.
Summary of changes:
* _artifact.py:
- Store new data in the new proto digests.
- Added new accessors to extract these new aspects from loaded artifacts.
- Bump the proto version number for compatibility
* _artifactcache.py: Adjusted to push and pull the new blobs and digests.
* element.py:
- Call Artifact.cache() with new parameters
- Expect the strict key from Artifact.get_meta_keys()
- Always specify the strict key when constructing an Artifact
instance which will later be used to cache the artifact
(i.e. the self.__artifact Artifact).
* _versions.py: Bump the global artifact version number, as this breaks
the artifact format.
* tests/cachekey: Updated cache key test for new keys.
Added low_diversity_meta and high_diversity_meta to artifact.proto.
These two fields represent detached metadata files in which we can
store artifact metadata which does not pertain directly to the artifact's
cache key and as such should not be stored directly on the proto itself.
The low diversity data is meant to maximize deduplication of data
in the CAS store, while data which is expected to diverge more should
be stored in the high diversity metadata.
Also added the strict key to the proto.
The strict key is interesting to have when instantiating an Element
from a downloaded artifact, and also provides some context as to
whether the downloaded artifact's strong key matches its strict key.
This commit changes SandboxConfig such that it now has a simple constructor
and a new SandboxConfig.new_from_node() classmethod to load it from a YAML
configuration node. The new version of SandboxConfig now uses type annotations.
SandboxConfig also now sports a to_dict() method to help with serialization
in artifacts. This replaces SandboxConfig.get_unique_key(), since it does
exactly the same thing but uses the same names as expected in the YAML
configuration to achieve it.
The element.py code has been updated to use the classmethod, and to
use the to_dict() method when constructing cache keys.
This refactor is meant to allow instantiating a SandboxConfig without
any MappingNode, such that we can later load a SandboxConfig from an
Artifact instead of from a parsed Element.
This commit also updates the cache keys in the cache key test, as
the cache key format is slightly changed by the to_dict() method.
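A minimal sketch of the shape this refactor gives SandboxConfig. The field names `build-uid` and `build-os` are illustrative assumptions, not necessarily the real schema, and a plain dict stands in for the MappingNode:

```python
from __future__ import annotations
from typing import Any, Dict, Optional

class SandboxConfig:
    # Simple constructor: no YAML node required, so an instance can
    # later be rebuilt from artifact data instead of a parsed Element.
    def __init__(self, build_uid: Optional[int] = None, build_os: Optional[str] = None):
        self.build_uid = build_uid
        self.build_os = build_os

    @classmethod
    def new_from_node(cls, node: Dict[str, Any]) -> "SandboxConfig":
        # In BuildStream this would take a MappingNode; a dict stands in here.
        return cls(build_uid=node.get("build-uid"), build_os=node.get("build-os"))

    def to_dict(self) -> Dict[str, Any]:
        # Uses the same key names as the YAML configuration, so the
        # serialized form can double as the cache-key input.
        data: Dict[str, Any] = {}
        if self.build_uid is not None:
            data["build-uid"] = self.build_uid
        if self.build_os is not None:
            data["build-os"] = self.build_os
        return data
```

The round trip through `new_from_node()` and `to_dict()` is what lets the same representation serve both YAML loading and artifact serialization.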
In order to use types from the Node family with mypy, we need to
have the methods we use on those types defined.
First part of the parent-child job separation cleanup
See merge request BuildStream/buildstream!2111
Now that the only messages that go through are messages for the
messenger, we can remove the envelope and only ever handle the
messenger's messages.
Since we run in a single process, we do not need this distinction
anymore
This is not needed now that jobs run in the same process; we can just
return the value from the method.
This is currently only used by the ElementJob to send back information
about the workspace, which we can now get directly since we run in the
same process.
* elementjob.py: Remove the returning of the workspace dict. This is
directly available in the main thread.
* queue.py: Use the workspace from the element directly instead of going
through child data.
Rework the scheduler to use threads instead of processes
Closes #911, #93, and #810
See merge request BuildStream/buildstream!1982
This reduces a race condition where a SIGINT received shortly after
restarting the scheduler would cause the scheduler to crash.
This moves it to tests with a simplified usage, since we don't use it
anywhere else
This changes how the scheduler works and adapts all the code that needs
adapting in order to be able to run in threads instead of in
subprocesses, which helps with Windows support, and will allow some
simplifications in the main pipeline.
This addresses the following issues:
* Fix #810: All CAS calls are now made in the master process, and thus
share the same connection to the cas server.
* Fix #93: We don't start as many child processes anymore, so the risk
of starving the machine is much lower.
* Fix #911: We now use `forkserver` for starting processes. We also
don't use subprocesses for jobs, so we should be starting fewer
subprocesses.
And the following high-level changes were made:
* cascache.py: Run the CasCacheUsageMonitor in a thread instead of a
subprocess.
* casdprocessmanager.py: Ensure start and stop of the process are thread
safe.
* job.py: Run the child in a thread instead of a process, and adapt how
we stop a thread, since we can't use signals anymore.
* _multiprocessing.py: Not needed anymore, since we are not using `fork()`.
* scheduler.py: Run the scheduler with a threadpool to run the child
jobs in. Also adapt how our signal handling is done, since we are not
receiving signals from our children anymore and can't kill them the
same way.
* sandbox: Stop using blocking signals to wait on the process, and use
timeouts all the time.
* messenger.py: Use a thread-local context for the handler, to allow for
multiple parameters in the same process.
* _remote.py: Ensure the start of the connection is thread safe.
* _signal.py: Allow blocking entry into the signal context managers by
setting an event. This is to ensure no thread runs long-running
code while we have asked the scheduler to pause. This also ensures all
the signal handlers are thread safe.
* source.py: Change the check around saving the source's ref. We are now
running in the same process, and thus the ref will already have been
changed.
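The core idea of the rework can be sketched as follows. This is illustrative only, not BuildStream's actual classes: jobs run on a `ThreadPoolExecutor` instead of forked child processes, and a `threading.local()` slot gives each worker thread its own context, in the spirit of what messenger.py does for handlers:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

# Thread-local storage: each worker thread sees its own value,
# so per-job context no longer needs a separate process.
_local = threading.local()

def run_job(name: str) -> str:
    _local.job_name = name  # visible only to the thread that set it
    # The result is returned directly to the caller; no pipe/envelope
    # back to a parent process is needed anymore.
    return f"{_local.job_name}: ok"

# The scheduler-style pool: child "jobs" are just threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = sorted(pool.map(run_job, ["build-a", "build-b"]))
```

One consequence visible even in this toy: because workers are threads, they share the main process's connections (the #810 fix) and cannot be killed with signals, which is why the real change reworks signal handling.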
This ensures that we can cleanly clean up processes and threads on
termination of BuildStream.
Plugins should use this helper whenever there is a risk of them being
blocked on a syscall for an indefinite amount of time.
* downloadablefilesource.py: Use this new helper to do the actual
download, which prevents the process from completely blocking if
we have a badly behaving upstream.
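The idea behind such a helper might look like this; a hedged sketch, where the helper name `run_interruptible` and its signature are assumptions, not the real BuildStream API:

```python
import threading
from typing import Any, Callable

def run_interruptible(func: Callable[[], Any], poll_interval: float = 0.1) -> Any:
    """Run a potentially-blocking call in a daemon thread.

    The daemon flag means the thread cannot keep the interpreter alive,
    and the short, repeated join keeps the calling thread responsive to
    termination requests instead of blocking indefinitely on a syscall.
    """
    result = {}

    def target():
        result["value"] = func()

    thread = threading.Thread(target=target, daemon=True)
    thread.start()
    while thread.is_alive():
        # Control returns here periodically, so the main thread can
        # react to a termination request between iterations.
        thread.join(timeout=poll_interval)
    return result.get("value")
```

A source plugin would then wrap its download call, e.g. `run_interruptible(lambda: fetch(url))`, instead of calling it directly.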
This is required when we run this in the main process, with the threaded
scheduler rework.
Otherwise the state is kept between tests
This does not behave as we would expect, as it is not always
consistent, and doesn't have any impact in most cases.
We should revisit our handling of permissions and umasks separately;
in the meantime, this is required in order to fix building with a
threaded scheduler, as it would otherwise introduce concurrency errors.
* This modifies the signal terminator so that it can be called from any
thread.
This checks that either:
- The signal handler is already in place
- Or the caller is in the main thread, allowing it to set the signal
handler.
This also removes the exact callback that was added, instead of removing
the last one, and fixes `suspend_handler` to do the same.
This is required, as we don't know which interleaving of calls will be
done, and we can't guarantee that the last one is the right one to
remove.
This ensures that, if we were to receive signals or other things while
we are on this blocking call, we would be able to process them instead
of waiting for the end of the process
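The polling pattern this describes might be sketched as follows (assumed, not the exact BuildStream code): wait on the child with a timeout in a loop, so Python gets a chance to deliver pending signal handlers between iterations instead of sitting in one indefinite `wait()`:

```python
import subprocess

def wait_responsive(proc: subprocess.Popen, poll_interval: float = 0.1) -> int:
    """Wait for a child process without blocking indefinitely."""
    while True:
        try:
            return proc.wait(timeout=poll_interval)
        except subprocess.TimeoutExpired:
            # Control returns here periodically; any pending Python
            # signal handlers run at this point before we wait again.
            continue
```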
Fix missing cert tests to expect the error for the correct reason
See merge request BuildStream/buildstream!2105
min-version
This test was broken, as it was failing for the wrong reason, even though
in both cases it was a missing yaml key. Fix the test so that it fails
because the required cert specified in the cache config is missing.
This test was broken, as it was failing for the wrong reason, even though
in both cases it was a missing yaml key. Fix the test so that it fails
because the required cert specified in the cache config is missing.
This test was broken, as it was failing for the wrong reason, even though
in both cases it was a missing yaml key. Fix the test so that it fails
because the required cert specified in the cache config is missing.
setup.py: Ensure we have a version number
Closes #1383
See merge request BuildStream/buildstream!2101
BuildStream requires a version number at runtime, but it builds fine
even when no tags can be found. So, make it an error at build time if
we don't have a valid version number.
Fixes #1383.
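One way such a build-time guard could look; the function name and the "no tag" sentinel value are assumptions for illustration, not the actual setup.py logic:

```python
def require_version(detected: str) -> str:
    """Fail the build loudly if version detection produced nothing usable."""
    if not detected or detected == "0.0.0":  # assumed "no tag found" sentinel
        raise SystemExit(
            "Failed to detect a valid version number; "
            "check out the repository with tags before building."
        )
    return detected
```

Raising at packaging time turns a confusing runtime failure into an immediate, actionable build error.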
tests/frontend/push.py: Skip expiry test without subsecond mtime
See merge request BuildStream/buildstream!2104
Skip an artifact expiry test in the case we don't have subsecond mtime
precision.
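One way a test could probe for subsecond mtime support; this detection approach is an assumption for illustration, not necessarily the real test helper:

```python
import os
import tempfile

def have_subsecond_mtime(directory: str) -> bool:
    """Check whether the filesystem holding `directory` keeps subsecond mtimes."""
    fd, path = tempfile.mkstemp(dir=directory)
    os.close(fd)
    try:
        # Set a known fractional mtime (0.123456789s past the epoch)...
        os.utime(path, ns=(0, 123456789))
        # ...and see whether the filesystem preserved the fraction.
        return os.stat(path).st_mtime_ns % 1_000_000_000 != 0
    finally:
        os.remove(path)
```

A test would then call something like `pytest.skip(...)` when this returns False, rather than failing on filesystems such as FAT that truncate timestamps.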
Fix glob handling in the CLI
Closes #959
See merge request BuildStream/buildstream!2102
This tests a few glob patterns through `bst artifact show` and also
asserts that globs which match both elements and artifacts will produce
an error.
We should not have different globbing behavior on the command line than
we have for split rules.
This should also make artifact globbing slightly more performant, as
the regular expression under the hood need not be recompiled for each
file being checked.
This commit also updates tests/frontend/artifact_list_contents.py to
use a double star `**` (globstar syntax) in order to match path
separators as well as all other characters in the list contents command.
Don't use fnmatch(), as it behaves differently from utils.glob(), which
is closer to what we expect from a shell (`*` matches everything except
path separators, while `**` also matches path separators), and is also
consistent with other places where BuildStream handles globbing, like
when considering split rules.
We should not have different globbing behavior on the command line than
we have for split rules.
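The distinction between `*` and `**` can be illustrated with a small reimplementation. This is not utils.glob()'s actual code, just a sketch of the convention, with the pattern compiled once up front rather than re-evaluated per file:

```python
import re

def glob_to_regex(pattern: str) -> re.Pattern:
    """Compile a shell-style glob where `*` stops at '/' and `**` crosses it."""
    out = []
    i = 0
    while i < len(pattern):
        c = pattern[i]
        if c == "*":
            if pattern[i:i + 2] == "**":
                out.append(".*")     # globstar: match across path separators
                i += 2
                continue
            out.append("[^/]*")      # plain star: stop at path separators
        elif c == "?":
            out.append("[^/]")
        else:
            out.append(re.escape(c))
        i += 1
    return re.compile("^" + "".join(out) + "$")
```

For example, `usr/**/*.so` matches `usr/lib/x86/libc.so`, whereas `usr/*.so` does not, because the plain star refuses to cross the `/` in `lib/`.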
The patch does the following things:
* Ensure that we only ever try to match artifacts to user-provided
glob patterns if we are performing a command which tries to load
artifacts.
* Stop being selective about glob patterns: if the user provides
a pattern which does not end in ".bst", we still try to match it
against elements.
* Provide a warning when the provided globs did not match anything;
previously, this code only provided the warning when artifacts were
not matched by globs, but not elements.
* tests/frontend/artifact_delete.py, tests/frontend/push.py,
tests/frontend/buildcheckout.py:
Fixed tests to not try to determine success by examining the
wording of a user-facing message; use the machine-readable errors
instead.
Fixes #959
src/buildstream/element.py: __use_remote_execution() reword desc
See merge request BuildStream/buildstream!2097
Element plugins now require the use of virtual directories; as such,
they do not influence RE support. Reword for the config hierarchy.
Add test environment for Python 3.9
See merge request BuildStream/buildstream!2098
_stream.py: Make `_enqueue_plan` a timed activity
See merge request BuildStream/buildstream!1840
This enqueue_plan can take a long time, as in some cases it triggers
verification of the 'cached' state for sources.
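A timed activity in this spirit can be sketched as a context manager; this is hypothetical, and BuildStream's real `timed_activity()` API differs:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed_activity(name, log):
    """Record how long a named phase took, even if it raises."""
    start = time.monotonic()
    try:
        yield
    finally:
        log.append((name, time.monotonic() - start))

log = []
with timed_activity("Enqueueing elements", log):
    time.sleep(0.01)  # stand-in for the slow cached-state verification
```

Wrapping the slow phase this way makes its cost visible in the log instead of appearing as an unexplained pause.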
Support overrides semantic for elements
See merge request BuildStream/buildstream!2094
* Test that we can override an element in a subproject with a local element,
where the local element has a dependency on another element in the subproject
through the same junction.
* Test that we can override the dependency in the subproject, proving
that reverse dependencies in that subproject are built against the
overridden element.
* Test that we can override a subproject element using a local link
to another element in the same subproject.
* Test that we can declare an override of a subproject element using
a link in that subproject, and it will be effective even if that
link is not traversed by the actual dependency chain.
* Check that the same element being overridden multiple times in
a subproject is overridden by the highest-level project, which should
have the highest priority in the overrides.
Using the same semantics used to override junctions in subprojects, allow
overriding of elements.
As discussed in this proposal: https://lists.apache.org/thread.html/r34c8e94f024aae3d5afd260554dac594e82751ca60dea28880f520d5%40%3Cdev.buildstream.apache.org%3E
Notably, this also adds the "fully loaded" flag to LoadElement, and
separates into a single function the logic for loading a single file
with override and link redirections taken into consideration.
testutils/platform: Refactor to be compatible with Python 3.9
See merge request BuildStream/buildstream!2092
Starting from Python 3.9, it seems like the `_replace()` method no
longer works on `platform.uname_result` objects, that are returned by
`platform.uname()`. This causes some of our tests to fail on Python 3.9.
See https://bugs.python.org/issue42163 for upstream issue.
Fix it by slightly changing the way we override the values of the
`platform.uname()` function, such that it works on both Python 3.9 and
3.8 (and below).
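The workaround might look roughly like this; the details are assumed, and it is shown as a standalone monkeypatch rather than the project's actual test fixture. Instead of calling `_replace()` on `platform.uname()`'s result (which broke on 3.9's reworked `uname_result`), build a plain namedtuple with the overridden fields:

```python
import platform
from collections import namedtuple

# Our own result type: independent of the stdlib's uname_result internals.
_Uname = namedtuple("uname_result", "system node release version machine")

_real_uname = platform.uname  # capture before patching to avoid recursion

def fake_uname(system="Linux", machine="x86_64"):
    real = _real_uname()
    # Keep the real node/release/version, override only what the test needs.
    return _Uname(system, real.node, real.release, real.version, machine)

platform.uname = fake_uname  # a real test would restore this afterwards
```

Because the override never touches `_replace()`, it behaves the same on Python 3.8 and 3.9.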
into 'master'
Get release and snapshot badge images from docs website
See merge request BuildStream/buildstream!2086
Get the release badge image and the snapshot badge image from:
docs.buildstream.build/master/_static/release.svg, and
docs.buildstream.build/master/_static/snapshot.svg
Corner case fixes for the loader code
See merge request BuildStream/buildstream!2093
These appear to be popping up randomly, as we squash them in our
game of whack-a-mole...