Commit log. Each entry: commit message (author, date, files changed, lines removed/added).
* Move artifact cache query to pull job (Jürg Billeter, 2020-12-14, 4 files, -107/+67)
* Always schedule pull job (Jürg Billeter, 2020-12-14, 3 files, -58/+64)

  This is preparation to perform artifact cache query as part of the same
  job as artifact pulling.
* _artifact.py: Add cached and strong_key parameters to set_cached() (Jürg Billeter, 2020-12-14, 1 file, -4/+14)
* Move source cache query to fetch job (Jürg Billeter, 2020-12-14, 11 files, -29/+91)
* _stream.py: Add query_cache() method (Jürg Billeter, 2020-12-14, 2 files, -0/+20)

  This is preparation for explicit cache query.
* _elementsources.py: Make cache query explicit (Jürg Billeter, 2020-12-14, 1 file, -5/+16)

  Cache query can be fairly expensive as it checks the presence of all
  blobs. Make this more explicit with a `query_cache()` method, instead of
  implicitly querying the cache on the first call to `cached()`.
* _artifact.py: Make cache query explicit (Jürg Billeter, 2020-12-14, 3 files, -22/+25)

  Cache query can be fairly expensive as it checks the presence of all
  artifact blobs. Make this more explicit with a `query_cache()` method,
  instead of implicitly querying the cache on the first call to `cached()`.
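  The explicit-query pattern described in these two commits can be sketched
  roughly as follows. The class and method names follow the commit messages,
  but everything else is a hypothetical illustration: the blob check is a
  stub, whereas the real query asks the local CAS for every blob.

  ```python
  class Artifact:
      """Toy sketch of the explicit cache-query pattern (assumed shapes)."""

      def __init__(self, blobs):
          self._blobs = blobs
          self._cached = None  # unknown until explicitly queried

      def query_cache(self):
          # The expensive step: check the presence of every artifact blob.
          self._cached = all(self._blob_present(b) for b in self._blobs)
          return self._cached

      def cached(self):
          # No implicit query any more; callers must query_cache() first.
          if self._cached is None:
              raise RuntimeError("query_cache() has not been called")
          return self._cached

      def _blob_present(self, blob):
          # Stub: the real implementation asks the CAS for the blob.
          return True
  ```

  The point of the split is that callers (and the scheduler) can now decide
  when to pay for the expensive query, rather than having it happen as a
  side effect of the first `cached()` call.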
* _pipeline.py: Drop the optimization for cached elements in the planner (Jürg Billeter, 2020-12-14, 1 file, -17/+9)

  The overhead of planning already cached elements and unneeded build-only
  dependencies should be fairly small, as unneeded jobs can still be
  skipped. This optimization was also already disabled for non-strict build
  plans with a remote artifact cache. This change is necessary in
  preparation for parallelizing cache queries.
* fetchqueue.py: Don't skip elements with a cached failure (Jürg Billeter, 2020-12-14, 1 file, -1/+1)

  The build queue requires the sources to be available for all elements
  where `_cached_success()` returns `False`. This includes elements with a
  cached failure.
* tests/internals/pluginloading: Add missing get_ref() to FooSource (Jürg Billeter, 2020-12-14, 1 file, -0/+3)
* tests/frontend/push.py: Allow pushing of dependencies (Jürg Billeter, 2020-12-14, 4 files, -4/+4)

  The assertions in `test_push_after_pull` are too strict. Pushing
  dependencies to the second (empty) artifact server should not cause a
  test failure.
* Merge branch 'tristan/fix-artifact-name-hack' into 'master' (bst-marge-bot, 2020-12-11, 2 files, -7/+12)

  _stream.py, _frontend/widget.py: Fix weird hack

  See merge request BuildStream/buildstream!2117
  * _stream.py, _frontend/widget.py: Fix weird hack (tristan/fix-artifact-name-hack, Tristan van Berkom, 2020-12-11, 2 files, -7/+12)

    When stream is asked for a list of artifacts to show for the purpose of
    `bst artifact show`, it was squashing the element name with the
    artifact name before it gets displayed in the frontend. Instead, do the
    special casing in the frontend.
* Merge branch 'tristan/stack-require-depends-all' into 'master' (Tristan Van Berkom, 2020-12-10, 19 files, -35/+200)

  Require all stack dependencies to be both build & runtime dependencies

  Closes #1075

  See merge request BuildStream/buildstream!2113
  * tests/format/stack.py: Test stack dependency types (tristan/stack-require-depends-all, Tristan van Berkom, 2020-12-10, 5 files, -0/+36)

    Assert that errors are raised when stack dependencies are declared as
    build-only or runtime-only dependencies.
  * NEWS: Document breaking change, stack element dependencies must be build + run (Tristan van Berkom, 2020-12-10, 1 file, -0/+3)
  * plugins/elements/stack.py: Require all dependencies be build & run (Tristan van Berkom, 2020-12-10, 10 files, -33/+126)

    Stack elements cannot be build-only dependencies, as this would defeat
    the purpose of using stack elements in order to directly build-depend
    on them. Stack element dependencies must all be built in order to
    build-depend on them, and as such we gain no build parallelism by
    allowing runtime-only dependencies on stack elements.

    Declaring a runtime-only dependency on a stack element as a whole might
    still be useful, but still requires the entire stack to be built at the
    time we need that stack. Instead, it is more useful to ensure that a
    stack element is a logical group of all dependencies, including runtime
    dependencies, such that we can guarantee cache key alignment with all
    stack dependencies.

    This allows for stronger reliability in commands such as `bst artifact
    checkout`, which can now reliably download and check out a fully built
    stack, without any uncertainty about possible runtime-only dependencies
    which might exist in the project where that artifact was created.

    This consequently closes #1075.

    This also fixes the following tests such that they no longer require
    build-depends or runtime-depends to work in stack elements:

    * tests/frontend/default_target.py: It was not necessary to check the
      results of show; these stacks were set to runtime-depends so that
      they would have the same buildable state as their dependencies when
      shown.

    * tests/format/dependencies.py, tests/frontend/pull.py,
      tests/frontend/show.py, tests/integration/compose.py: These tests
      were using specific build/runtime dependencies in stacks, but for no
      particular reason.
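  The rule this commit enforces can be sketched as a small validation pass.
  This is a hypothetical illustration, not the stack plugin's actual code;
  the function name, error type, and tuple shape are all assumptions.

  ```python
  class StackDependencyError(Exception):
      """Raised when a stack dependency is not both build and runtime."""


  def validate_stack_dependencies(dependencies):
      # dependencies: iterable of (name, is_build_dep, is_runtime_dep)
      # tuples; a stack dependency must be declared as both, per the
      # commit message above.
      for name, is_build, is_runtime in dependencies:
          if not (is_build and is_runtime):
              raise StackDependencyError(
                  "{}: stack dependencies must be declared as both "
                  "build and runtime dependencies".format(name)
              )
  ```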
  * doc: Documenting "artifact names" (Tristan van Berkom, 2020-12-10, 2 files, -0/+27)

    Added a description of artifact names at the beginning of the artifact
    commands section, along with a new glossary entry for artifact names
    which refers to the description.
  * doc/source/format_declaring.rst: Minor corrections and added references (Tristan van Berkom, 2020-12-10, 1 file, -2/+8)

    * When documenting dependency types, add references to the shorthand
      convenience lists `build-depends` and `runtime-depends`.

    * When documenting the `type` attribute of dependencies, correct the
      language referring to the convenience lists to specify
      `build-depends` and `runtime-depends` instead of the incorrectly
      worded `Build-Depends` and `Runtime-Depends`, which would be invalid
      keys.
* Merge branch 'tristan/refactor-retry-task' into 'master' (Tristan Van Berkom, 2020-12-10, 6 files, -185/+239)

  Refactor State object

  See merge request BuildStream/buildstream!2115
  * _state.py: Full type hinting (tristan/refactor-retry-task, Tristan van Berkom, 2020-12-10, 2 files, -131/+215)

    This improves the overall documentation comments on the State object,
    adds full PEP 484 type hinting, and renames Task.set_render_cb() to
    Task.set_task_changed_callback() to be more consistently named.

    This also adds missing frontend-facing API for the group changed status
    notifications; even though the frontend does not currently use these,
    it makes better sense to have them than to remove the codepaths and
    callback lists entirely.

    This also reorders the classes in this file so that Task and TaskGroup
    are both defined before State, which helps a bit with undefined
    references in type hinting information.
  * Refactor: Use explicit invocation for retrying jobs (Tristan van Berkom, 2020-12-10, 4 files, -58/+28)

    We created the State object in the core for the purpose of advertising
    state to the frontend; the frontend can register callbacks and get
    updates on state changes (implicit invocation in the frontend). State
    always belongs to the core, and the frontend can only read it.

    When the frontend asks the core to do something, this should always be
    done with an explicit function call, and preferably not via the State
    object, as this confuses the use of state, which is only a read-only
    state advertising desk.

    This was broken (implemented backwards) for job retries: instead we had
    the frontend telling State "It has been requested that this job be
    retried!", and then we had the core registering callbacks on that
    frontend request. This direction of implicit invocation should not
    happen (in fact, the core should never have to register callbacks on
    the State object at all).

    Summary of changes:

    * _stream.py: Change _failure_retry(), which was for some reason
      private albeit called from the frontend, to an explicit function call
      named retry_job(). Instead of calling into the State object and
      causing core-side callbacks to be triggered, later to be handled by
      the Scheduler, implement the retry directly from the Stream. Since
      this implementation deals only with Queues and State, which already
      directly belong to the Stream object, there is no reason to trouble
      the Scheduler with this.

    * _scheduler.py: Remove the callback handling the State "task retry"
      event.

    * _state.py: Remove the task retry callback chain completely.

    * _frontend/app.py: Call stream.retry_job() instead of
      stream.failure_retry(), now passing along the task's action name
      rather than the task's ID.

    This API now assumes that Stream.retry_job() can only be called on a
    task which originates from a scheduler Queue, and expects to be given
    the action name of the queue in which the given element has failed and
    should be retried.
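  The direction of the fix can be sketched roughly as follows. The class
  shapes here are toy assumptions for illustration only, not the real
  Stream/Queue API: the point is that the frontend invokes an explicit
  `retry_job()` keyed by the queue's action name, rather than signalling a
  retry through State callbacks.

  ```python
  class Queue:
      """Toy scheduler queue, identified by its action name."""

      def __init__(self, action_name):
          self.action_name = action_name
          self.failed = []
          self.ready = []

      def fail(self, element):
          self.failed.append(element)

      def retry(self, element):
          # Move a failed element back into the ready list.
          self.failed.remove(element)
          self.ready.append(element)


  class Stream:
      """Toy stream owning the queues, as the real Stream does."""

      def __init__(self, queues):
          self.queues = queues

      def retry_job(self, action_name, element_name):
          # Explicit invocation: locate the queue by its action name and
          # retry there directly, with no State callbacks in between.
          for queue in self.queues:
              if queue.action_name == action_name:
                  queue.retry(element_name)
                  return
          raise ValueError("no queue for action {!r}".format(action_name))
  ```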
  * _state.py: Rename _Task -> Task (Tristan van Berkom, 2020-12-10, 2 files, -5/+5)

    The Task object is not internal to the State object; it is clearly
    given to the frontend and passed around.
* Merge branch 'juerg/cas' into 'master' (bst-marge-bot, 2020-12-09, 8 files, -206/+142)

  CASCache improvements

  See merge request BuildStream/buildstream!2112
  * Use CASCache.open() (juerg/cas, Jürg Billeter, 2020-12-09, 5 files, -8/+8)
  * cascache.py: Add open() method (Jürg Billeter, 2020-12-09, 1 file, -0/+16)
  * cascache.py: Generalize remote_missing_blobs() into missing_blobs() (Jürg Billeter, 2020-12-09, 3 files, -35/+16)

    The implementation can be reused to replace `local_missing_blobs()` and
    simplify `contains_files()`.
  * cascache.py: Optimize _fetch_tree() using add_objects() (Jürg Billeter, 2020-12-09, 1 file, -5/+6)
  * cascache.py: Add add_objects() method (Jürg Billeter, 2020-12-09, 1 file, -16/+48)

    This allows adding multiple objects in a single batch, avoiding extra
    gRPC round trips to buildbox-casd.
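  The round-trip saving can be illustrated with a toy client. This is a
  hypothetical sketch, not the CASCache implementation: in the real code
  each upload is a gRPC request to buildbox-casd, which the counter below
  merely stands in for.

  ```python
  class CASClient:
      """Toy CAS client where every upload call costs one round trip."""

      def __init__(self):
          self.round_trips = 0
          self.objects = {}

      def _batch_upload(self, blobs):
          self.round_trips += 1  # one (pretend) gRPC call per batch
          for digest, data in blobs:
              self.objects[digest] = data

      def add_object(self, digest, data):
          # One object per call: N objects cost N round trips.
          self._batch_upload([(digest, data)])

      def add_objects(self, blobs):
          # The whole batch in a single call, as the new method allows.
          self._batch_upload(list(blobs))
  ```

  Adding three objects via `add_objects()` costs one round trip instead of
  three, which is exactly the saving `_fetch_tree()` exploits in the commit
  above.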
  * cascache.py: Remove unused parameters from add_object() (Jürg Billeter, 2020-12-09, 3 files, -12/+6)
  * cascache.py: Remove _ensure_blob() method (Jürg Billeter, 2020-12-09, 1 file, -24/+2)

    It's only used by `_fetch_tree()` and can be replaced by a single
    additional line.
  * cascache.py: Also fetch file blobs in _fetch_directory() (Jürg Billeter, 2020-12-09, 4 files, -22/+7)

    This eliminates code duplication in `ArtifactCache`, `SourceCache` and
    `ElementSourcesCache`.
  * cascache.py: Reimplement _fetch_directory() with FetchTree() (Jürg Billeter, 2020-12-09, 1 file, -55/+17)

    This simplifies the code, delegating the logic to buildbox-casd.
  * Move _reachable_refs_dir() method from cascache.py to testutils (Jürg Billeter, 2020-12-09, 2 files, -33/+20)

    It's not used outside testutils.
* Merge branch 'juerg/ci' into 'master' (Jürg Billeter, 2020-12-09, 1 file, -1/+7)

  .gitlab-ci.yml: Disable parallel testing on centos-7.7 and ubuntu-18.04

  See merge request BuildStream/buildstream!2114
  * .gitlab-ci.yml: Disable parallel testing on centos-7.7 and ubuntu-18.04 (juerg/ci, Jürg Billeter, 2020-12-09, 1 file, -1/+7)

    With Python 3.6, CI frequently fails with `INTERNALERROR` when parallel
    testing is enabled.
* Merge branch 'abderrahim/no-delete-extract' into 'master' (bst-marge-bot, 2020-12-08, 1 file, -4/+3)

  _context.py: don't delete bst1 extract directory

  See merge request BuildStream/buildstream!2080
  * _context.py: don't delete bst1 extract directory (Abderrahim Kitouni, 2020-12-08, 1 file, -4/+3)
* Merge branch 'bschubert/coverage' into 'master' (bst-marge-bot, 2020-12-07, 5 files, -31/+32)

  requirements: Update all requirements

  See merge request BuildStream/buildstream!2110
  * requirements: Update all requirements (Benjamin Schubert, 2020-12-07, 5 files, -31/+32)

    * .pylintrc: Disable the new `raise-missing-from` check. We might want
      to enable it later, but it fails in many places; let's not merge both
      changes here.

    * pluginoriginpip.py: Catch the newer exception thrown by
      pkg_resources. The previous one still exists, so we keep the same
      compatibility as before.
* Merge branch 'tristan/refactor-artifact-elements' into 'master' (Tristan Van Berkom, 2020-12-07, 75 files, -720/+1529)

  Fix operations which interact with artifacts

  Closes #1410

  See merge request BuildStream/buildstream!2108
  * tests/integration/artifact.py: Test preservation of environment variables (tristan/refactor-artifact-elements, Tristan van Berkom, 2020-12-07, 4 files, -0/+90)

    This new test checks that environment variables are preserved in
    generated artifacts, and that artifact data is observed rather than
    irrelevant local state when integrating an artifact checked out by its
    artifact name.
  * Refactor ArtifactElement instantiation (Tristan van Berkom, 2020-12-07, 6 files, -113/+169)

    When instantiating an ArtifactElement, use an ArtifactProject to ensure
    that the Element does not accidentally have access to any incidentally
    existing project loaded from the current working directory.

    Also pass along the Artifact to the Element's initializer directly, and
    conditionally instantiate the element based on its artifact instead of
    based on loading YAML configuration.

    Fixes #1410

    Summary of changes:

    * _artifactelement.py:
      - Now load the Artifact and pass it along to the Element constructor.
      - Now use an ArtifactProject for the element's project.
      - Remove overrides of Element methods; instead of behaving
        differently, we now just fill in all the blanks for an Element to
        behave more naturally when loaded from an artifact.
      - Avoid redundantly loading the same artifact twice; if the artifact
        was cached then we will load only one artifact.

    * element.py:
      - Conditionally instantiate from the passed Artifact instead of
        considering any YAML loading.
      - Error out early in _prepare_sandbox() in case we are trying to
        instantiate a sandbox for an uncached artifact, in which case we
        don't have any SandboxConfig at hand to do so.

    * _stream.py:
      - Clear the ArtifactProject cache after loading artifacts.
      - Ensure we load a list of unique artifacts without any duplicates.

    * tests/frontend/buildcheckout.py: Expect a different error when trying
      to checkout an uncached artifact.

    * tests/frontend/push.py, tests/frontend/artifact_show.py: No longer
      expect duplicates to show up with wildcard statements which would
      capture multiple versions of the same artifact (this changes because
      of #1410 being fixed).
  * _artifact.py: Added properties for cache key access (Tristan van Berkom, 2020-12-07, 1 file, -0/+45)

    These properties allow easy addressing of the different cache keys, and
    report a cached value if the artifact was cached, otherwise report the
    value assigned to the Artifact at instantiation time.
  * _project.py, _artifactproject.py: Adding ArtifactProject (Tristan van Berkom, 2020-12-07, 3 files, -67/+181)

    The Project class initializer is now refactored such that loading a
    project.conf is made optional. The initializer is now well sorted, with
    public members showing up before private members, followed by the
    initialization body. PEP 484 type hints are now employed aggressively
    for all project instance members.

    The ArtifactProject is added to serve as the data model counterpart of
    the ArtifactElement, ensuring that we never mistakenly use locally
    loaded project data in ArtifactElement instances.

    Consequently, the Project.sandbox and Project.splits variables are
    properly made public by this commit, as these are simply loaded from
    the project config and accessed elsewhere by Element; Element is
    updated to access these public members by their new public names.
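  The isolation idea can be sketched as follows. This is a minimal,
  hypothetical sketch: the `load_config` flag, the stubbed loader, and the
  member names are assumptions for illustration, while the real Project
  initializer is far richer.

  ```python
  class Project:
      """Toy project model; loading project.conf is optional."""

      def __init__(self, name, load_config=True):
          self.name = name
          self.sandbox = None
          self.splits = None
          if load_config:
              self._load_project_conf()

      def _load_project_conf(self):
          # Stub: the real code parses project.conf from disk.
          self.sandbox = {"build-uid": 0}
          self.splits = {}


  class ArtifactProject(Project):
      """Backing project for ArtifactElement instances."""

      def __init__(self, name):
          # Skip project.conf loading entirely, so no locally loaded
          # project data can leak in; everything comes from the artifact.
          super().__init__(name, load_config=False)
  ```

  The design point is that an ArtifactElement backed by an ArtifactProject
  can never observe configuration from whatever project happens to exist in
  the current working directory.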
  * _stream.py: Centralize application state cleanup (Tristan van Berkom, 2020-12-07, 3 files, -12/+3)

    Instead of delegating this job to a Project instance, implement the
    application state cleanup directly on the Stream object.

    This commit also:

    * Removes Project.cleanup()
    * Rewords incorrect API documentation for Element._reset_load_state()
  * _stream.py: Add _load_artifacts() here (Tristan van Berkom, 2020-12-07, 3 files, -37/+20)

    Instead of having _pipeline.py implement load_artifacts() by calling
    _project.py's other implementation of load_artifacts(), just implement
    _load_artifacts() directly in _stream.py. This of course removes the
    load_artifacts() implementations from _pipeline.py and _project.py.
  * element.py: Added missing api doc comment for _walk_artifact_files() (Tristan van Berkom, 2020-12-07, 1 file, -0/+8)
  * _artifactcache.py: Fixed logging messages to display cache keys (Tristan van Berkom, 2020-12-07, 1 file, -2/+2)

    These messages recently regressed to logging a DisplayKey object, and
    were not updated to display DisplayKey.brief as other log messages
    already had been.
  * _artifact.py: Store additional metadata on the artifact (Tristan van Berkom, 2020-12-07, 33 files, -47/+174)

    This commit enriches the metadata we store on artifacts in the new
    detached low/high diversity metadata files:

    * The SandboxConfig is now stored in the artifact, allowing one to
      perform activities such as launching sandboxes on artifacts
      downloaded via artifact name (without backing project data).

    * The environment variables are now stored in the artifact, similarly
      allowing one to shell into a downloaded artifact which is unrelated
      to a loaded project.

    * The element variables are now stored in the artifact, allowing more
      flexibility in what the core can do with a downloaded
      ArtifactElement.

    * The element's strict key is also stored.

    All of these can additionally enhance traceability in the UI with
    commands such as `bst artifact show`.

    Summary of changes:

    * _artifact.py:
      - Store new data in the new proto digests.
      - Added new accessors to extract these new aspects from loaded
        artifacts.
      - Bump the proto version number for compatibility.

    * _artifactcache.py: Adjusted to push and pull the new blobs and
      digests.

    * element.py:
      - Call Artifact.cache() with new parameters.
      - Expect the strict key from Artifact.get_meta_keys().
      - Always specify the strict key when constructing an Artifact
        instance which will later be used to cache the artifact (i.e. the
        self.__artifact Artifact).

    * _versions.py: Bump the global artifact version number, as this breaks
      the artifact format.

    * tests/cachekey: Updated cache key test for new keys.