| Commit message | Author | Age | Files | Lines |
|
These tests were hard to follow and did not cover the error
cases thoroughly. Instead, use test parametrization to implement
more succinct tests which cover the API surface more thoroughly.
Due to the parametrization and discrete testing of various use cases,
this was going to be very expensive, so this patch introduces some
module scoped pytest fixtures.
This allows multiple discrete tests to be run against a prebuilt
ArtifactShare with specific artifacts already built and available
in the share, keeping this discrete testing well optimized.
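A minimal sketch of the pattern described above, with hypothetical names (create_prebuilt_share and the artifact names are illustrative, not BuildStream test APIs): a module scoped fixture builds an ArtifactShare-like object once, and every parametrized test in the module reuses it instead of rebuilding it.

```python
import pytest

def create_prebuilt_share():
    # Stand-in for setting up an ArtifactShare with artifacts
    # already built and pushed into it.
    return {"artifacts": {"target.bst", "dep.bst"}}

@pytest.fixture(scope="module")
def prebuilt_share():
    # Built once per test module, shared by all tests in it
    return create_prebuilt_share()

@pytest.mark.parametrize("element", ["target.bst", "dep.bst"])
def test_artifact_is_available(prebuilt_share, element):
    # Each parametrized case reads from the same prebuilt share
    assert element in prebuilt_share["artifacts"]
```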
|
This allows our tests to use the ArtifactShare() object in
custom fixtures.
|
This allows third party developers to use the Cli() object
in their own test fixtures.
|
Instead of requiring every fixture to do it separately.
|
In errors pertaining to failing to launch a shell with a buildtree.
Other related updates:
- _frontend/cli.py: Propagate machine readable error codes in `bst shell`.
This command prefixes a reported error, rewrapping it into an AppError,
which needs to propagate the originating machine readable error.
- tests/integration/shell.py, tests/integration/shellbuildtrees.py:
Updated to use the new machine readable errors
|
.gitlab-ci.yml: No need for special runner for cached overnight test
See merge request BuildStream/buildstream!2107
|
_stream.py, _frontend/widget.py: Fix weird hack
See merge request BuildStream/buildstream!2117
|
When stream is asked for the list of artifacts to show for
the purpose of `bst artifact show`, it was squashing the element
name with the artifact name before it got displayed in the
frontend.
Instead, do the special casing in the frontend.
|
Require all stack dependencies to be both build & runtime dependencies
Closes #1075
See merge request BuildStream/buildstream!2113
|
Assert that errors are raised when stack dependencies are declared as
build-only or runtime-only dependencies.
|
Stack elements cannot be build-only dependencies, as this would defeat
the purpose of using stack elements in order to directly build-depend on
them.
Stack element dependencies must all be built in order to build depend
on them, and as such we gain no build parallelism by allowing runtime-only
dependencies on stack elements. Declaring a runtime-only dependency on
a stack element as a whole might still be useful, but still requires the
entire stack to be built at the time we need that stack.
Instead, it is more useful to ensure that a stack element is a logical
group of all dependencies, including runtime dependencies, such that we
can guarantee cache key alignment with all stack dependencies.
This allows for stronger reliability in commands such as
`bst artifact checkout`, which can now reliably download and checkout
a fully built stack as a result, without any uncertainty about possible
runtime-only dependencies which might exist in the project where that
artifact was created.
This consequently closes #1075
This also fixes the following tests so that they no longer
require build-depends or runtime-depends to work in stack elements:
* tests/frontend/default_target.py: It was not necessary to check the
results of show; these stacks were set to runtime-depends so that they
would have the same buildable state as their dependencies when shown.
* tests/format/dependencies.py, tests/frontend/pull.py, test/frontend/show.py,
tests/integration/compose.py:
These tests were using specific build/runtime dependencies in stacks, but
for no particular reason.
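The rule described above can be sketched as a small validation pass: every dependency of a stack element must be both a build and a runtime dependency, and anything else is an error. The names here (Dependency, StackError, validate_stack_dependencies) are illustrative stand-ins, not BuildStream's actual internals.

```python
class StackError(Exception):
    """Raised when a stack dependency is not both build & runtime."""

class Dependency:
    def __init__(self, name, build=True, runtime=True):
        self.name = name
        self.build = build
        self.runtime = runtime

def validate_stack_dependencies(deps):
    # Stacks gain nothing from build-only or runtime-only deps, so
    # reject any dependency that is not both at once.
    for dep in deps:
        if not (dep.build and dep.runtime):
            raise StackError(
                "Stack dependency '{}' must be both a build "
                "and a runtime dependency".format(dep.name)
            )
```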
|
Added a description about artifact names at the beginning of the
artifact commands section, along with a new glossary entry for
artifact names which refers to the description.
|
* When documenting dependency types, add references to the shorthand
convenience lists `build-depends` and `runtime-depends`.
* When documenting the `type` attribute of dependencies, correct the
language referring to the convenience lists to specify `build-depends`
and `runtime-depends` instead of the incorrectly worded
`Build-Depends` and `Runtime-Depends`, which would be invalid keys.
|
Refactor State object
See merge request BuildStream/buildstream!2115
|
This improves overall documentation comments on the State object,
adds full pep484 type hinting, and renames Task.set_render_cb()
to Task.set_task_changed_callback() to be more consistently named.
This also adds the missing frontend facing API for the group changed
status notifications; even though the frontend does not currently
use these, it makes better sense to have them than to remove the
entire codepaths and callback lists.
This also reorders the classes in this file so that Task and TaskGroup
are both defined before State, which helps a bit with undefined references
in the type hinting information.
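The renamed callback registration pattern with pep484 hints might look roughly like this (a simplified sketch, not the actual State/Task API; the attribute names are assumptions): the frontend registers a callback, and the core invokes it when the task changes.

```python
from typing import Callable, List

class Task:
    def __init__(self, name: str) -> None:
        self.name = name
        # Default no-op callback until the frontend registers one
        self._task_changed_cb: Callable[[], None] = lambda: None

    def set_task_changed_callback(self, callback: Callable[[], None]) -> None:
        # Frontend-facing: only the frontend registers callbacks here
        self._task_changed_cb = callback

    def notify_changed(self) -> None:
        # Core-facing: invoked by the core when task state changes
        self._task_changed_cb()

class State:
    def __init__(self) -> None:
        self.tasks: List[Task] = []
```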
|
We created the State object in the core for the purpose of advertising
state to the frontend; the frontend can register callbacks and get
updates on state changes (implicit invocation in the frontend). State
always belongs to the core, and the frontend can only read state.
When the frontend asks the core to do something, this should always
be done with an explicit function call, and preferably not via the
State object, as that confuses the use of State, which is only a readonly
state advertising desk.
This was broken (implemented backwards) for job retries: the frontend
was telling State "It has been requested that this job be retried!",
and the core was registering callbacks for that frontend request. This
direction of implicit invocation should not happen (in fact, the core
should never have to register callbacks on the State object at all).
Summary of changes:
* _stream.py: Change _failure_retry(), which was for some reason
private albeit called from the frontend, to an explicit function
call named "retry_job()".
Instead of calling into the State object and causing core-side
callbacks to be triggered, later to be handled by the Scheduler,
implement the retry directly from the Stream. Since this implementation
deals only with Queues and State, which already directly belong to
the Stream object, there is no reason to trouble the Scheduler
with this.
* _scheduler.py: Remove the callback handling the State "task retry"
event.
* _state.py: Remove the task retry callback chain completely.
* _frontend/app.py: Call stream.retry_job() instead of
stream.failure_retry(), now passing along the task's action name
rather than the task's ID.
This API now assumes that Stream.retry_job() can only be called on
a task which originates from a scheduler Queue, and expects to be
given the action name of the queue in which the given element has
failed and should be retried.
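The explicit call direction described above can be sketched as follows (hypothetical, heavily simplified classes, not BuildStream's real Stream/Queue): the frontend calls retry_job() directly, and the Stream resolves the queue by action name, with no State callbacks involved.

```python
class Queue:
    def __init__(self, action_name):
        self.action_name = action_name
        self.pending = []

    def enqueue(self, element):
        self.pending.append(element)

class Stream:
    def __init__(self, queues):
        self._queues = queues

    def retry_job(self, action_name, unique_id):
        # Explicit frontend -> core call: find the queue in which
        # the element failed, and re-enqueue it there directly.
        queue = next(q for q in self._queues
                     if q.action_name == action_name)
        queue.enqueue(unique_id)
```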
|
The Task object is not internal to the State object, it is clearly
given to the frontend and passed around.
|
CASCache improvements
See merge request BuildStream/buildstream!2112
|
The implementation can be reused to replace `local_missing_blobs()` and
simplify `contains_files()`.
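The kind of reuse described above might look like this (a simplified sketch with assumed names, not the real CASCache): one missing-blobs query backs both operations.

```python
class CASCacheSketch:
    def __init__(self, stored):
        self._stored = set(stored)

    def missing_blobs(self, digests):
        # The single shared implementation
        return [d for d in digests if d not in self._stored]

    def contains_files(self, digests):
        # Simplified: everything is present iff nothing is missing
        return not self.missing_blobs(digests)
```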
|
This allows adding multiple objects in a single batch, avoiding extra
gRPC round trips to buildbox-casd.
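The shape of that optimization, illustratively (a fake in-process casd stands in for the gRPC service; names are assumptions): one batched request replaces one request per object.

```python
class FakeCasd:
    """Stand-in for buildbox-casd; counts simulated round trips."""
    def __init__(self):
        self.requests = 0
        self.blobs = []

    def capture(self, paths):
        # One (simulated) round trip for the entire batch
        self.requests += 1
        self.blobs.extend(paths)

def add_objects(casd, paths):
    # Batch all objects into a single request instead of looping
    # with one request per object
    casd.capture(list(paths))
```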
|
It's only used by `_fetch_tree()` and can be replaced by a single
additional line.
|
This eliminates code duplication in `ArtifactCache`, `SourceCache` and
`ElementSourcesCache`.
|
This simplifies the code, delegating the logic to buildbox-casd.
|
It's not used outside testutils.
|
.gitlab-ci.yml: Disable parallel testing on centos-7.7 and ubuntu-18.04
See merge request BuildStream/buildstream!2114
|
With Python 3.6, CI frequently fails with `INTERNALERROR` when parallel
testing is enabled.
|
_context.py: don't delete bst1 extract directory
See merge request BuildStream/buildstream!2080
|
requirements: Update all requirements
See merge request BuildStream/buildstream!2110
|
* .pylintrc: Disable new `raise-missing-from` check. We might want to
enable that later, but it fails in many places. Let's not merge both
changes here.
* pluginoriginpip.py: Catch the newer thrown exception from
pkg_resources. The previous one still exists, so we should be good
keeping the same compatibility as before
|
Fix operations which interact with artifacts
Closes #1410
See merge request BuildStream/buildstream!2108
|
This new test verifies that environment variables are preserved in generated
artifacts, and that artifact data is observed rather than irrelevant
local state when integrating an artifact checked out by its artifact name.
|
When instantiating an ArtifactElement, use an ArtifactProject to ensure
that the Element does not accidentally have access to any incidentally
existing project loaded from the current working directory.
Also pass along the Artifact to the Element's initializer directly, and
conditionally instantiate the element based on its artifact instead of
based on loading YAML configuration.
Fixes #1410
Summary of changes:
* _artifactelement.py:
- Now load the Artifact and pass it along to the Element constructor
- Now use an ArtifactProject for the element's project
- Remove overrides of Element methods; instead of behaving differently,
we now just fill in all the blanks for an Element to behave more
naturally when loaded from an artifact.
- Avoid redundantly loading the same artifact twice; if the artifact
was cached then we will load only one artifact.
* element.py:
- Conditionally instantiate from the passed Artifact instead of
considering any YAML loading.
- Error out early in _prepare_sandbox() in case we are trying
to instantiate a sandbox for an uncached artifact, in which case
we don't have any SandboxConfig at hand to do so.
* _stream.py:
- Clear the ArtifactProject cache after loading artifacts
- Ensure we load a list of unique artifacts without any duplicates
* tests/frontend/buildcheckout.py: Expect a different error when trying
to checkout an uncached artifact
* tests/frontend/push.py, tests/frontend/artifact_show.py: No longer expect
duplicates to show up with wild card statements which would capture multiple
versions of the same artifact (this changes because of #1410 being fixed)
|
These properties allow easy addressing of the different cache keys;
they report the cached value if the artifact was cached, and otherwise
report the value assigned to the Artifact at instantiation time.
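That fall-back pattern can be sketched with a property (a hypothetical, minimal stand-in for the real Artifact class and its key names):

```python
class ArtifactSketch:
    def __init__(self, strong_key, cached_strong_key=None):
        # Key assigned at instantiation time
        self._strong_key = strong_key
        # Key recovered from the cached artifact, if it was cached
        self._cached_strong_key = cached_strong_key

    @property
    def strong_key(self):
        # Prefer the cached value, fall back to the assigned one
        if self._cached_strong_key is not None:
            return self._cached_strong_key
        return self._strong_key
```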
|
The Project's class initializer is now refactored such that loading
a project.conf is made optional. The initializer is now well sorted
with public members showing up before private members, followed by
the initialization body. pep484 type hints are now employed aggressively
for all project instance members.
An ArtifactProject is added to serve as the data model counterpart
of the ArtifactElement, ensuring that we never mistakenly use locally
loaded project data in ArtifactElement instances.
Consequently, the Project.sandbox and Project.splits variables are
properly made public by this commit, as these are simply loaded from
the project config and accessed elsewhere by Element; Element is updated
to access these public members by their new public names.
|
Instead of delegating this job to a Project instance, implement
the application state cleanup directly on the Stream object.
This commit also:
* Removes Project.cleanup()
* Rewords the incorrect API documentation for Element._reset_load_state()
|
Instead of having _pipeline.py implement load_artifacts() by calling
_project.py's other implementation of load_artifacts(), just
implement _load_artifacts() directly in _stream.py.
This of course removes the load_artifacts() implementations from
_pipeline.py and _project.py.
|
This recently regressed to logging a DisplayKey object, and was not
fixed to display DisplayKey.brief as had already been done
in other log messages.
|
This commit enriches the metadata we store on artifacts in the
new detached low/high diversity metadata files:
* The SandboxConfig is now stored in the artifact, allowing
one to perform activities such as launching sandboxes on
artifacts downloaded via artifact name (without backing
project data).
* The environment variables are now stored in the artifact,
similarly allowing one to shell into downloaded artifacts
which are unrelated to a loaded project.
* The element variables are now stored in the artifact, allowing
more flexibility in what the core can do with a downloaded
ArtifactElement.
* The element's strict key is now stored as well.
All of these can additionally enhance traceability
in the UI with commands such as `bst artifact show`.
Summary of changes:
* _artifact.py:
- Store new data in the new proto digests.
- Added new accessors to extract these new aspects from loaded artifacts.
- Bump the proto version number for compatibility
* _artifactcache.py: Adjusted to push and pull the new blobs and digests.
* element.py:
- Call Artifact.cache() with new parameters
- Expect the strict key from Artifact.get_meta_keys()
- Always specify the strict key when constructing an Artifact
instance which will later be used to cache the artifact
(i.e. the self.__artifact Artifact).
* _versions.py: Bump the global artifact version number, as this breaks
the artifact format.
* tests/cachekey: Updated cache key test for new keys.
|
Added low_diversity_meta and high_diversity_meta to artifact.proto.
These two fields represent detached metadata files in which we can
store artifact metadata which does not pertain directly to the artifact's
cache key, and as such should not be stored directly on the proto itself.
The low diversity data is meant to maximize deduplication of data
in the CAS store, while data which is expected to diverge more should
be stored in the high diversity metadata.
Also added the strict key to the proto.
The strict key is interesting to have when instantiating an Element
from a downloaded artifact, and also provides some context as to
whether the downloaded artifact's strong key matches its strict key.
|
This commit changes SandboxConfig such that it now has a simple constructor
and a new SandboxConfig.new_from_node() classmethod to load it from a YAML
configuration node. The new version of SandboxConfig now uses type annotations.
SandboxConfig also now sports a to_dict() method to help with serialization
in artifacts; this replaces SandboxConfig.get_unique_key(), since it does
exactly the same thing but uses the same names as expected in the YAML
configuration to achieve it.
The element.py code has been updated to use the classmethod, and to
use the to_dict() method when constructing cache keys.
This refactor is meant to allow instantiating a SandboxConfig without
any MappingNode, such that we can later load a SandboxConfig from an
Artifact instead of from a parsed Element.
This commit also updates the cache keys in the cache key test, as
the cache key format is slightly changed by the to_dict() method.
|
In order to use types from the Node family with mypy, we need to
have the methods we use on those types defined.
|