requirements: Update all requirements
See merge request BuildStream/buildstream!2110
* .pylintrc: Disable the new `raise-missing-from` check. We might want to
  enable it later, but it currently fails in many places; let's not mix
  both changes here.
* pluginoriginpip.py: Catch the newer exception thrown by pkg_resources.
  The previous one still exists, so we should be good keeping the same
  compatibility as before.
Fix operations which interact with artifacts
Closes #1410
See merge request BuildStream/buildstream!2108
This new test checks that environment variables are preserved in generated
artifacts, and that artifact data is observed rather than irrelevant
local state when integrating an artifact checked out by its artifact name.
When instantiating an ArtifactElement, use an ArtifactProject to ensure
that the Element does not accidentally have access to any incidentally
existing project loaded from the current working directory.
Also pass the Artifact along to the Element's initializer directly, and
conditionally instantiate the element based on its artifact instead of
based on loading YAML configuration.
Fixes #1410
Summary of changes:
* _artifactelement.py:
  - Now load the Artifact and pass it along to the Element constructor
  - Now use an ArtifactProject for the element's project
  - Remove overrides of Element methods; instead of behaving differently,
    we now just fill in all the blanks for an Element to behave more
    naturally when loaded from an artifact.
  - Avoid redundantly loading the same artifact twice; if the artifact
    was cached then we will load only one artifact.
* element.py:
  - Conditionally instantiate from the passed Artifact instead of
    considering any YAML loading (see the sketch after this list).
  - Error out early in _prepare_sandbox() in case we are trying to
    instantiate a sandbox for an uncached artifact, in which case we
    don't have any SandboxConfig at hand to do so.
* _stream.py:
  - Clear the ArtifactProject cache after loading artifacts
  - Ensure we load a list of unique artifacts without any duplicates
* tests/frontend/buildcheckout.py: Expect a different error when trying
  to check out an uncached artifact
* tests/frontend/push.py, tests/frontend/artifact_show.py: No longer expect
  duplicates to show up with wildcard patterns which would capture multiple
  versions of the same artifact (this changes because of #1410 being fixed)
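A rough sketch of the conditional instantiation described above, using hypothetical names (SketchArtifact, SketchElement) rather than BuildStream's actual classes:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SketchArtifact:
    cached: bool = False
    environment: dict = field(default_factory=dict)
    variables: dict = field(default_factory=dict)
    sandbox_config: dict = field(default_factory=dict)


class SketchElement:
    def __init__(self, artifact: Optional[SketchArtifact] = None,
                 yaml_config: Optional[dict] = None):
        self._artifact = artifact
        if artifact is not None:
            # All state comes from the artifact, never from a project that
            # happens to exist in the current working directory.
            self._environment = artifact.environment
            self._variables = artifact.variables
            self._sandbox_config = artifact.sandbox_config
        else:
            # Normal path: state comes from the element's parsed YAML.
            yaml_config = yaml_config or {}
            self._environment = yaml_config.get("environment", {})
            self._variables = yaml_config.get("variables", {})
            self._sandbox_config = yaml_config.get("sandbox", {})

    def prepare_sandbox(self):
        # An uncached artifact carries no SandboxConfig, so fail early.
        if self._artifact is not None and not self._artifact.cached:
            raise RuntimeError("artifact is not cached, cannot stage a sandbox")
```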
These properties allow easy addressing of the different cache keys,
and report a cached value if the artifact was cached, otherwise report
the value assigned to the Artifact at instantiation time.
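The property pattern described here can be sketched as follows; the class and attribute names are placeholders, not the actual _artifact.py code:

```python
class ArtifactKeySketch:
    """Illustrative only: prefer the key recorded in the cached artifact,
    falling back to the key assigned at instantiation time."""

    def __init__(self, strong_key=None):
        self._init_strong_key = strong_key    # assigned at instantiation
        self._cached_strong_key = None        # read from the cached artifact
        self._cached = False

    def mark_cached(self, strong_key):
        # Stand-in for loading the artifact proto from the local cache.
        self._cached_strong_key = strong_key
        self._cached = True

    @property
    def strong_key(self):
        return self._cached_strong_key if self._cached else self._init_strong_key
```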
The Project class initializer is now refactored such that loading
a project.conf is optional. The initializer is now well sorted,
with public members showing up before private members, followed by
the initialization body. PEP 484 type hints are now employed aggressively
for all Project instance members.
The new ArtifactProject is added to serve as the data model counterpart
of the ArtifactElement, ensuring that we never mistakenly use locally
loaded project data in ArtifactElement instances.
Consequently, the Project.sandbox and Project.splits variables are
properly made public by this commit, as these are simply loaded from
the project config and accessed elsewhere by Element; Element is updated
to access these public members by their new public names.
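A condensed sketch of that shape — optional project.conf loading, public members before private ones, PEP 484 hints — with hypothetical names (SketchProject, SketchArtifactProject):

```python
from typing import Dict, List, Optional


class SketchProject:
    """Illustrative only: a project whose config loading is optional."""

    def __init__(self, directory: Optional[str] = None, *,
                 load_project_conf: bool = True) -> None:
        # Public members first.
        self.name: Optional[str] = None
        self.sandbox: Dict[str, object] = {}
        self.splits: Dict[str, List[str]] = {}

        # Private members.
        self._directory: Optional[str] = directory

        # Initialization body: only touch project.conf when asked to.
        if load_project_conf and directory is not None:
            self._load_project_conf(directory)

    def _load_project_conf(self, directory: str) -> None:
        # The real code parses project.conf here; omitted in this sketch.
        ...


class SketchArtifactProject(SketchProject):
    """Counterpart of ArtifactElement: never reads a local project.conf."""

    def __init__(self) -> None:
        super().__init__(directory=None, load_project_conf=False)
```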
Instead of delegating this job to a Project instance, implement
the application state cleanup directly on the Stream object.
This commit also:
* Removes Project.cleanup()
* Rewords incorrect API documentation for Element._reset_load_state()
Instead of having _pipeline.py implement load_artifacts() by calling
_project.py's other implementation of load_artifacts(), just
implement _load_artifacts() directly in _stream.py.
This of course removes the load_artifacts() implementations from
_pipeline.py and _project.py.
This recently regressed to logging a DisplayKey object and was not
fixed to display DisplayKey.brief, as had already been done
for the other log messages.
This commit enriches the metadata we store on artifacts in the
new detached low/high diversity metadata files:
* The SandboxConfig is now stored in the artifact, allowing
  one to perform activities such as launching sandboxes on
  artifacts downloaded via artifact name (without backing
  project data).
* The environment variables are now stored in the artifact,
  similarly allowing one to shell into a downloaded artifact
  which is unrelated to a loaded project.
* The element variables are now stored in the artifact, allowing
  more flexibility in what the core can do with a downloaded
  ArtifactElement.
* The element's strict key is now stored as well.
All of these can of course additionally enhance traceability
in the UI with commands such as `bst artifact show`.
Summary of changes:
* _artifact.py:
  - Store the new data in the new proto digests (a rough sketch of this
    detached-metadata mechanism follows the list below).
  - Added new accessors to extract these new aspects from loaded artifacts.
  - Bump the proto version number for compatibility.
* _artifactcache.py: Adjusted to push and pull the new blobs and digests.
* element.py:
  - Call Artifact.cache() with the new parameters.
  - Expect the strict key from Artifact.get_meta_keys().
  - Always specify the strict key when constructing an Artifact
    instance which will later be used to cache the artifact
    (i.e. the self.__artifact Artifact).
* _versions.py: Bump the global artifact version number, as this breaks
  the artifact format.
* tests/cachekey: Updated cache key test for new keys.
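A loose illustration of the detached-metadata mechanism: two separate blobs are written to CAS and only their digests are recorded on the artifact. The helper names and the exact split of fields between the low- and high-diversity files are assumptions here, not the actual _artifact.py layout:

```python
import hashlib
import json
import os


def _store_blob(cas_dir: str, data: bytes) -> str:
    # Stand-in for storing a blob in CAS and returning its digest.
    digest = hashlib.sha256(data).hexdigest()
    with open(os.path.join(cas_dir, digest), "wb") as blob:
        blob.write(data)
    return digest


def cache_detached_metadata(cas_dir: str, *, sandbox_config, environment,
                            variables, strict_key):
    # Low-diversity metadata: values shared by many artifacts, grouped to
    # maximize deduplication in the CAS store.
    low = {"sandbox-config": sandbox_config, "environment": environment}
    # High-diversity metadata: values expected to diverge between artifacts.
    high = {"variables": variables, "strict-key": strict_key}

    return {
        "low_diversity_meta": _store_blob(cas_dir, json.dumps(low, sort_keys=True).encode()),
        "high_diversity_meta": _store_blob(cas_dir, json.dumps(high, sort_keys=True).encode()),
    }
```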
Added low_diversity_meta and high_diversity_meta to artifact.proto.
These two fields represent detached metadata files in which we can
store artifact metadata which does not pertain directly to the artifact's
cache key and as such should not be stored directly on the proto itself.
The low diversity data is meant to maximize deduplication of data
in the CAS store, while data which is expected to diverge more should
be stored in the high diversity metadata.
Also added the strict key to the proto.
The strict key is interesting to have when instantiating an Element
from a downloaded artifact, and also provides some context as to
whether the downloaded artifact's strong key matches its strict key.
This commit changes SandboxConfig such that it now has a simple constructor
and a new SandboxConfig.new_from_node() classmethod to load it from a YAML
configuration node. The new version of SandboxConfig now uses type annotations.
SandboxConfig also now sports a to_dict() method to help with serialization
in artifacts; this replaces SandboxConfig.get_unique_key(), which did exactly
the same thing, except that to_dict() uses the same names as expected in the
YAML configuration.
The element.py code has been updated to use the classmethod, and to
use the to_dict() method when constructing cache keys.
This refactor is meant to allow instantiating a SandboxConfig without
any MappingNode, such that we can later load a SandboxConfig from an
Artifact instead of from a parsed Element.
This commit also updates the cache keys in the cache key test, as
the cache key format is slightly changed by the to_dict() method.
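A condensed sketch of that shape — plain constructor, a new_from_node()-style classmethod, and to_dict() reusing the YAML key names. The field names are illustrative, and a plain dict stands in for the parsed MappingNode:

```python
from typing import Dict, Union


class SketchSandboxConfig:
    def __init__(self, build_uid: int, build_gid: int,
                 build_os: str, build_arch: str) -> None:
        self.build_uid = build_uid
        self.build_gid = build_gid
        self.build_os = build_os
        self.build_arch = build_arch

    @classmethod
    def new_from_node(cls, node: Dict[str, Union[int, str]]) -> "SketchSandboxConfig":
        # The real classmethod reads a parsed YAML MappingNode; a plain dict
        # keeps this sketch self-contained.
        return cls(
            build_uid=int(node.get("build-uid", 0)),
            build_gid=int(node.get("build-gid", 0)),
            build_os=str(node.get("build-os", "")),
            build_arch=str(node.get("build-arch", "")),
        )

    def to_dict(self) -> Dict[str, Union[int, str]]:
        # Uses the same key names as the YAML configuration, so one dictionary
        # can feed both artifact serialization and cache key calculation.
        return {
            "build-uid": self.build_uid,
            "build-gid": self.build_gid,
            "build-os": self.build_os,
            "build-arch": self.build_arch,
        }
```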
In order to use types from the Node family with mypy, we need to
have the methods we use on those types defined.
First part of the parent-child job separation cleanup
See merge request BuildStream/buildstream!2111
Now that the only type of message that goes through is messages for the
messenger, we can remove the envelope and only ever handle the
messenger's messages.
Since we run in a single process, we do not need this distinction
anymore
This is not needed now that jobs run in the same process; we can just
return the value from the method.
This is currently only used by the ElementJob to send back information
about the workspace, which we can now get directly since we run in the
same process.
* elementjob.py: Remove the returning of the workspace dict. This is
  directly available in the main thread.
* queue.py: Use the workspace from the element directly instead of going
  through child data.
Rework the scheduler to use threads instead of processes
Closes #911, #93, and #810
See merge request BuildStream/buildstream!1982
This reduces a race condition where a SIGINT received shortly after
restarting the scheduler would cause the scheduler to crash.
This moves it to tests with a simplified usage, since we don't use it
anywhere else
This changes how the scheduler works and adapts all the code that needs
adapting in order to be able to run in threads instead of in
subprocesses, which helps with Windows support, and will allow some
simplifications in the main pipeline.
This addresses the following issues:
* Fix #810: All CAS calls are now made in the master process, and thus
  share the same connection to the CAS server.
* Fix #93: We don't start as many child processes anymore, so the risk
  of starving the machine is much lower.
* Fix #911: We now use `forkserver` for starting processes. We also
  don't use subprocesses for jobs, so we should be starting fewer
  subprocesses.
And the following high-level changes were made:
* cascache.py: Run the CasCacheUsageMonitor in a thread instead of a
  subprocess.
* casdprocessmanager.py: Ensure start and stop of the process are
  thread-safe.
* job.py: Run the child in a thread instead of a process, and adapt how we
  stop a thread, since we can't use signals anymore.
* _multiprocessing.py: Not needed anymore; we are not using `fork()`.
* scheduler.py: Run the scheduler with a thread pool to run the child
  jobs in (see the sketch after this list). Also adapt how our signal
  handling is done, since we are not receiving signals from our children
  anymore, and can't kill them the same way.
* sandbox: Stop using blocking signals to wait on the process, and use
  timeouts all the time.
* messenger.py: Use a thread-local context for the handler, to allow for
  multiple parameters in the same process.
* _remote.py: Ensure the start of the connection is thread-safe.
* _signal.py: Allow blocking entry into the signal context managers
  by setting an event. This is to ensure no thread runs long-running
  code while we have asked the scheduler to pause. This also ensures all
  the signal handlers are thread-safe.
* source.py: Change the check around saving the source's ref. We are now
  running in the same process, and thus the ref will already have been
  changed.
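The core mechanism — jobs running on a thread pool in the same process, with results returned through a callback rather than pickled over a pipe — can be sketched independently of BuildStream; all names here are hypothetical:

```python
import threading
from concurrent.futures import ThreadPoolExecutor


class SketchScheduler:
    """Run jobs as threads in the current process instead of forking."""

    def __init__(self, max_jobs: int = 4) -> None:
        self._executor = ThreadPoolExecutor(max_workers=max_jobs)
        self._lock = threading.Lock()
        self._results = {}

    def submit(self, name: str, job, *args) -> None:
        future = self._executor.submit(job, *args)
        # The callback runs when the job finishes; since everything lives in
        # one process, the return value is available directly (no pickling,
        # no pipe back from a child process).
        future.add_done_callback(lambda fut, name=name: self._job_done(name, fut))

    def _job_done(self, name: str, future) -> None:
        with self._lock:
            self._results[name] = future.result()

    def results(self) -> dict:
        with self._lock:
            return dict(self._results)

    def shutdown(self) -> None:
        self._executor.shutdown(wait=True)


if __name__ == "__main__":
    sched = SketchScheduler(max_jobs=2)
    sched.submit("square", lambda x: x * x, 7)
    sched.shutdown()
    print(sched.results())  # {'square': 49}
```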
This ensures that we can cleanly clean up processes and threads on
termination of BuildStream.
Plugins should use this helper whenever there is a risk of them being
blocked on a syscall for an indefinite amount of time.
* downloadablefilesource.py: Use this new helper to do the actual
  download, which prevents the process from blocking completely if
  we have a badly behaving upstream.
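One way such a helper can work — run the blocking call on a daemon thread and wait in short, interruptible timeouts — is sketched here with hypothetical names; the signature of BuildStream's actual utility may differ:

```python
import threading
import urllib.request


def blocking_call_with_termination(func, *args, should_terminate, poll_interval=0.2):
    """Run func(*args) on a helper thread, polling so termination is honoured."""
    result = {}

    def runner():
        result["value"] = func(*args)

    thread = threading.Thread(target=runner, daemon=True)
    thread.start()
    while thread.is_alive():
        thread.join(timeout=poll_interval)
        if should_terminate() and thread.is_alive():
            # We cannot forcibly kill the thread, but as a daemon thread it
            # will not keep the process alive once we stop waiting on it.
            raise RuntimeError("terminated while waiting on a blocking call")
    return result.get("value")


# Hypothetical usage with a download that might hang on a bad upstream:
# response = blocking_call_with_termination(
#     urllib.request.urlopen, "https://example.com/tarball.tar.gz",
#     should_terminate=lambda: False)
```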
This is required when we run this in the main process, with the threaded
scheduler rework.
Otherwise the state is kept between tests
This does not behave as we would expect, as it is not always
consistent, and doesn't have any impact in most cases.
We should revisit our handling of permissions and umasks separately;
in the meantime, this is required in order to fix building with a threaded
scheduler, as it would otherwise introduce concurrency errors.
This modifies the signal terminator so that it can be called from any
thread.
It checks that either:
- The signal handler is already in place, or
- The caller is in the main thread, which allows setting the signal
  handler.
This also removes the exact callback that was added, instead of removing
the last one, and fixes `suspend_handler` to do the same.
This is required because we don't know which interleaving of calls will
happen, and we can't guarantee that the last callback is the right one to
remove.
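A simplified sketch of that behaviour, reduced to termination handling only; the real _signal.py is more involved:

```python
import signal
import threading

_terminator_stack = []


def terminator_handler(signum, frame):
    # Run the registered callbacks, most recent first.
    for callback in reversed(_terminator_stack):
        callback()


def add_terminator(callback):
    if not _terminator_stack:
        if threading.current_thread() is threading.main_thread():
            # Only the main thread may install a signal handler.
            signal.signal(signal.SIGTERM, terminator_handler)
        else:
            # A non-main thread may only join in if the handler is already set.
            assert signal.getsignal(signal.SIGTERM) is terminator_handler
    _terminator_stack.append(callback)


def remove_terminator(callback):
    # Remove exactly the callback that was added, not simply the last one:
    # with threads we cannot assume callers unwind in strict LIFO order.
    _terminator_stack.remove(callback)
```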
This ensures that, if we receive signals or other events while we are
in this blocking call, we are able to process them instead of waiting
for the end of the process.
Fix missing cert tests to expect the error for the correct reason
See merge request BuildStream/buildstream!2105
min-version
This test was broken, as it was failing for the wrong reason, even though
in both cases it was a missing YAML key. Fix the test to fail because the
required cert specified in the cache config is missing.
This test was broken, as it was failing for the wrong reason, even though
in both cases it was a missing YAML key. Fix the test to fail because the
required cert specified in the cache config is missing.
This test was broken, as it was failing for the wrong reason, even though
in both cases it was a missing YAML key. Fix the test to fail because the
required cert specified in the cache config is missing.
setup.py: Ensure we have a version number
Closes #1383
See merge request BuildStream/buildstream!2101
BuildStream requires a version number at runtime, but it builds fine even
when no tags can be found. So, make it an error at build time if we don't
have a valid version number.
Fixes #1383.
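The intent can be illustrated with a minimal setup.py fragment; the package name, the placeholder version string, and how the version is obtained are all assumptions here, not BuildStream's actual build machinery:

```python
# Illustrative fragment for a setup.py: refuse to build without a real version.
from setuptools import setup

version = "0.0.0+unknown"  # e.g. what an scm-based version tool reports with no tags

if version.startswith("0.0.0") or "unknown" in version:
    raise SystemExit(
        "Could not determine a valid version number.\n"
        "Building from a git checkout requires the release tags to be present; "
        "try fetching tags with `git fetch --tags`."
    )

setup(name="example-package", version=version)
```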
tests/frontend/push.py: Skip expiry test without subsecond mtime
See merge request BuildStream/buildstream!2104
Skip an artifact expiry test when we don't have subsecond mtime
precision.
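A self-contained way to make such a skip decision — probe whether the filesystem preserves subsecond mtimes, then pytest.skip() — using an assumed helper name rather than BuildStream's actual test utility:

```python
import os
import tempfile

import pytest


def have_subsecond_mtime(directory: str) -> bool:
    # Write a file, force a non-zero nanosecond mtime, and see if it survives.
    fd, path = tempfile.mkstemp(dir=directory)
    try:
        os.close(fd)
        os.utime(path, ns=(1234567890123456789, 1234567890123456789))
        return os.stat(path).st_mtime_ns % 1_000_000_000 != 0
    finally:
        os.remove(path)


def test_artifact_expires(tmp_path):
    if not have_subsecond_mtime(str(tmp_path)):
        pytest.skip("filesystem does not store subsecond mtimes")
    ...  # the expiry assertions would go here
```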
Fix glob handling in the CLI
Closes #959
See merge request BuildStream/buildstream!2102
This tests a few glob patterns through `bst artifact show` and also
asserts that globs which match both elements and artifacts will produce
an error.
We should not have a different globbing behavior for the command line
than for split rules.
This should also make artifact globbing slightly more performant, as
the regular expression under the hood need not be recompiled for each
file being checked.
This commit also updates tests/frontend/artifact_list_contents.py to
use a double star `**` (globstar syntax) in order to match path
separators as well as all other characters in the list contents command.
Don't use fnmatch(), as this has a different behavior from utils.glob(),
which is a bit closer to what we expect from a shell (* matches everything
except path separators, while ** matches path separators), and also
consistent with other places where BuildStream handles globbing, like
when considering split rules.
We should not have a different globbing behavior than split rules for
the command line.
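The difference in semantics can be shown with a small translator where `*` stops at path separators and `**` crosses them, compiled once and reused; this is a sketch of the behaviour, not the utils.glob() implementation:

```python
import re


def glob_to_regex(pattern: str) -> "re.Pattern[str]":
    # Translate a shell-style glob where `*` does not cross `/` but `**` does.
    out = []
    i = 0
    while i < len(pattern):
        if pattern[i] == "*":
            if pattern[i:i + 2] == "**":
                out.append(".*")      # globstar: matches path separators too
                i += 2
            else:
                out.append("[^/]*")   # single star: stops at path separators
                i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    # Compile once and reuse, instead of re-translating for every file checked.
    return re.compile("^" + "".join(out) + "$")


matcher = glob_to_regex("usr/**/*.so")
print(bool(matcher.match("usr/lib/libfoo.so")))                     # True: `**` crosses '/'
print(bool(glob_to_regex("usr/*.so").match("usr/lib/libfoo.so")))   # False: `*` does not
```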
The patch does the following things:
* Ensure that we only ever try to match artifacts to user-provided
  glob patterns if we are performing a command which tries to load
  artifacts.
* Stop being selective about glob patterns: if the user provides
  a pattern which does not end in ".bst", we still try to match it
  against elements.
* Provide a warning when the provided globs did not match anything;
  previously this code only provided this warning when artifacts were
  not matched to globs, but not elements.
* tests/frontend/artifact_delete.py, tests/frontend/push.py,
  tests/frontend/buildcheckout.py:
  Fixed tests to not try to determine success by examining the
  wording of a user-facing message; use the machine-readable errors
  instead.
Fixes #959
src/buildstream/element.py: __use_remote_execution() reword desc
See merge request BuildStream/buildstream!2097
Element plugins now require the use of virtual directories; as such,
they do not influence remote execution (RE) support. Reword for the
config hierarchy.
Add test environment for Python 3.9
See merge request BuildStream/buildstream!2098
_stream.py: Make `_enqueue_plan` a timed activity
See merge request BuildStream/buildstream!1840