| Commit message | Author | Age | Files | Lines |
|
junctions
We already test the behavior without workspaces; let's just augment
these tests to also run with a workspaced junction and check the results.
This guards against regressions of #1030.
|
actually build
This is a regression test for issue #1014
|
This adds a check to test_build_track() to ensure that the target
is cached as a result of building with tracking of selected elements.
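A minimal sketch of the kind of assertion added, assuming the usual `cli` test fixture and a hypothetical `target.bst` element (the exact invocation is illustrative):

    # Build with tracking of selected elements, then verify the target
    # really ended up in the local artifact cache.
    result = cli.run(project=project, args=['build', '--track', 'target.bst', 'target.bst'])
    result.assert_success()
    assert cli.get_element_state(project, 'target.bst') == 'cached'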
|
artifacts are deleted
This is a regression test for #1017
|
subdirs.
The artifact directories are based on the element's normal_name, which
substitutes any path separators with dashes.
Fix the helper function to use the correct path.
|
This is a regression test for issue #990
|
This needs to be added along with the status messages added
to the artifact cache, and this detail diverges from master.
This is because we have not bothered to backport !1071 which
refactors the CASCache to be a delegate object of the ArtifactCache
instead of a derived class - backporting !1071 would allow us to
remove these message handlers because the CAS server and test
fixture only use the CASCache object directly, not the business
logic in the ArtifactCache.
|
This is not an error related to loading data, like a parse error
in the quota specification is, but a problem raised by the artifact
cache - this allows us to assert more specific machine-readable
errors in test cases (instead of checking the string in stderr, which
this patch also fixes).
This also fixes a typo in the error message for said error.
* tests/artifactcache/cache_size.py: Updated the test case to expect
the artifact error, which consequently changes the test case to
properly assert a machine-readable error instead of asserting text
in stderr (which is the real, secret motivation behind this patch).
* tests/artifactcache/expiry.py: Reworked test_invalid_cache_quota()
to now expect the artifact error for the tests which check
configurations that create caches too large to fit on the disk.
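For illustration, the sort of assertion the tests move toward, assuming the usual `cli` fixture (the quota value and element name are made up, and the specific error reason string is deliberately not shown):

    from buildstream._exceptions import ErrorDomain

    # Configure a quota that cannot possibly fit on the volume, then assert
    # the machine-readable error domain instead of grepping stderr.
    cli.configure({'cache': {'quota': '2000PB'}})
    result = cli.run(project=project, args=['build', 'element.bst'])
    result.assert_main_error(ErrorDomain.ARTIFACT, None)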
|
This will benefit from a better UtilError being raised, and
turns the artifact cache's local function into a one-liner.
The loop which finds the first existing directory in the
given path has been removed, being meaningless due to the
call to os.makedirs() in ArtifactCache.__init__().
The local function was renamed to _get_cache_volume_size() and
no longer takes any arguments, which is more suitable for the
function as it serves as a testing override surface for
unittest.mock().
The following test cases which use the function to override
the ArtifactCache behavior have been updated to use the new
overridable function name:
tests/artifactcache/cache_size.py
tests/artifactcache/expiry.py
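A rough sketch of how a test can now override the volume size via unittest.mock(); the patch target path is an assumption, and the size is made up:

    from unittest import mock

    # Pretend the cache volume only has 10kB of space, without touching
    # the real filesystem.
    patch_target = 'buildstream._artifactcache.artifactcache.ArtifactCache._get_cache_volume_size'
    with mock.patch(patch_target, return_value=10000):
        result = cli.run(project=project, args=['build', 'target.bst'])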
|
This causes multiple source instances to interact with the same
backing data store at the same time, increasing the likelihood
of triggering issues around concurrent access.
This more reliably triggers issue #868
|
With get_element_state(), you need to invoke BuildStream once
for every element state you want to observe in a pipeline.
The new get_element_states() returns a dictionary of element
states keyed by element name, and is better suited when you need
to observe the state of more than one element.
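A small usage sketch (the exact signature and the element names are assumptions):

    # One `bst show` invocation reports the state of the whole pipeline,
    # rather than one invocation per element.
    states = cli.get_element_states(project, 'target.bst')
    assert states['dep.bst'] == 'cached'
    assert states['target.bst'] == 'waiting'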
|
- Tests that the target is still built even when a workspace is open
on a runtime dependency of a build-only dependency.
- Tests that the target is still built even when a workspace is open
on a runtime dependency of a runtime dependency of a build-only
dependency.
This adds the regression test provided by Matthew Yates for issue #919.
This test differs from the one committed in master as:
- We have an orthogonal bug in 1.2.x where buildable elements show up
in a waiting state instead of a buildable state.
- The test case added in master uses some new APIs; it has been
adjusted here to use only the 1.2 APIs.
|
boundaries
These include assertions for the expected provenance in the errors,
protecting against regressions of #947
|
These also assert that the provenance of the references to missing
files are reported, guarding for regressions of issue #947
|
This scaffolding needs to manually tell coverage to clean up
when SIGTERM is delivered to the process, in order to collect
the coverage report; but we need to do this conditionally,
in case we are running the tests without coverage support.
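A minimal sketch of the conditional hookup, assuming the scaffolding relies on pytest-cov's subprocess support:

    try:
        # pytest-cov can flush coverage data when SIGTERM arrives.
        from pytest_cov.embed import cleanup_on_sigterm
    except ImportError:
        cleanup_on_sigterm = None   # running the tests without coverage support

    if cleanup_on_sigterm is not None:
        # Install a SIGTERM handler which saves the coverage report
        # before the process exits.
        cleanup_on_sigterm()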
|
This test ensures the overlap failure vs warning policy in one
project only ever affects the artifacts created for the project
which declares it, and does not force its policy onto another
consuming project.
A regression test against issue #926.
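For context, a rough sketch of the two project configurations involved, written as the dictionaries a test might dump into each project.conf (the project names are made up and the `fail-on-overlap` key is an assumption about the 1.2-era format):

    import yaml

    # The sub-project declares the stricter policy...
    subproject_conf = {
        'name': 'subproject',
        'fail-on-overlap': True,    # assumed 1.2-era option
    }
    # ...while the consuming project declares nothing, and must not have
    # the failure policy forced onto its own artifacts.
    project_conf = {
        'name': 'project',
    }

    with open('subproject/project.conf', 'w') as f:
        yaml.safe_dump(subproject_conf, f)
    with open('project/project.conf', 'w') as f:
        yaml.safe_dump(project_conf, f)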
|
This was broken by 9252a18180ce79d70c193768293baa0f0eff9981.
|
Also bump the element's version so cached artifacts would be
invalidated.
Fixes #883
(cherry picked from commit 03111d39e093b11ffc6589071f2f5040d5f61ab4)
|
When using aliases, there are multiple remotes used in the cache
repository. When fetching, tags are not updated if they were previously
fetched from a different remote, and commits that are not in a branch
and are only tagged do not get fetched if the tag is not fetched.
Fixes #812
|
When there is less than 2GB left, it cleans up to have 10GB available.
These values are configurable.
|
This also removes references when some objects are missing. This is in
preparation for the move from reference to object garbage collection.
|
The root directory was marked as a non-artifact mount, so it was not
using SafeHardLink. However, integration commands were executed with
write access to the root directory.
Fixes #749
|
This is a backport of !819.
---
Currently, the `bst checkout --deps none` command always produces empty
output. Fix this issue and add a regression test for the same.
Fixes #670.
|
The issue happens on Silverblue where /home is a symlink to /var/home.
With this, the element-path is something like
/var/home/user/project/elements, while the project path is
/home/user/project. Comparing canonical paths solves the issue.
Fixes #673
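A minimal sketch of the comparison that resolves this, using only the standard library (the helper name is hypothetical):

    import os

    def element_path_inside_project(project_path, element_path):
        # Resolve symlinks (e.g. /home -> /var/home on Silverblue) so that
        # equivalent paths compare equal before checking containment.
        real_project = os.path.realpath(project_path)
        real_element = os.path.realpath(element_path)
        return real_element == real_project or \
            real_element.startswith(real_project + os.sep)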
|
Pulled/Pushed messages will no longer be produced from within
element.py; instead, they will be produced during CasCache push() and
pull() as appropriate.
Message consistency has also been improved.
|
Adds a test to ensure that BuildStream alerts the user of a skipped push
when the remote already has the artifact cached.
|
The SKIPPED message type is now used to indicate the end of a task which
completed successfully without actually having to perform its work.
This overhauls the use of `Queue.done()`, so queues no longer need to
provide a processed/skipped return value from `done()`; instead, they
raise a `SkipJob` exception from within `Queue.process()`.
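An illustrative sketch of the new pattern (the queue class, exception and element handling here are stand-ins, not the actual BuildStream classes):

    class SkipJob(Exception):
        """Stand-in for the scheduler's SkipJob exception."""

    class PushQueue:
        action_name = "Push"

        def process(self, element):
            # Rather than returning a processed/skipped flag from done(),
            # raise SkipJob when there is nothing to do; the scheduler then
            # reports the task with the SKIPPED message type.
            if element.get('already-in-remote'):
                raise SkipJob(self.action_name)
            # ... otherwise perform the actual push here ...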
|
Test that we get the expected error when configuring a client-cert
without client-key, or the inverse.
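A sketch of the misconfiguration being exercised, assuming the usual `cli` fixture and an example cache URL:

    # A client certificate without the matching key (swap the two for the
    # inverse case); loading this configuration should produce a clear,
    # machine-readable error rather than an obscure failure later on.
    cli.configure({
        'artifacts': {
            'url': 'https://cache.example.com:11002',
            'client-cert': 'client.crt',
            # 'client-key' deliberately omitted
        },
    })
    result = cli.run(project=project, args=['build', 'element.bst'])
    # ...then assert the expected machine-readable error on `result`.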
|
This is a regression test for issue #658
|
Setting "max-jobs" won't be allowed anymore in a following commit.
|
Same test as test_never_delete_required(), except that this test ensures
that we never delete required artifacts when their cache keys are
discovered dynamically during the build.
|
* create_element_size(): Now uses a git Repo object instead of a
local source, and returns the repo.
* update_element_size(): Added this function, which can resize the
expected output of an element generated with create_element_size();
useful for testing sized elements with the tracking feature.
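A rough sketch of how the two helpers compose in a test; the element name, sizes and exact argument lists are assumptions:

    # Create an element whose output is ~2MB, backed by a git repo...
    repo = create_element_size('target.bst', project, element_path, [], 2000000)
    # ...then grow the expected output and re-track so the build sees it.
    update_element_size('target.bst', project, repo, 4000000)
    result = cli.run(project=project, args=['build', '--track', 'target.bst', 'target.bst'])
    result.assert_success()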
|
This allows one to modify a file in an existing git repo,
as opposed to adding a new one.
|
These tests were not checking that we fail for the expected reasons.
Added `res.assert_task_error(ErrorDomain.ARTIFACT, 'cache-too-full')`
where we expect to fail because the cache is too full.
|
This commit renames test_never_delete_dependencies() to
test_never_delete_required(), renders the test more readable by renaming
some elements and reordering some statements, and makes the comments
more straightforward and accurate.
|
The Source must now mention whether the marked or translated
URL is "primary" or not. Even when a Source may have multiple
URLs, the auxiliary URLs are derived from the primary one; not
only is this true for git, but it is mandated by our tracking
API, which assumes there is a primary URL.
This adjusts the `git` source and the test `fetch_source.py` source
to behave properly and advertise their primary URL.
This is a part of #620
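An illustrative fragment of what a source's configure step might look like under this rule; the `primary` keyword argument is an assumption, and only the general shape is intended:

    def configure(self, node):
        self.original_url = self.node_get_member(node, str, 'url')
        # Advertise this as the primary URL; auxiliary URLs (such as git
        # submodule URLs) are derived from it when fetching and tracking.
        self.mark_download_url(self.original_url, primary=True)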
|
Failures to write files when tracking were not reported.
Fixes #533.
|
Note: This modifies the docker containers used for testing to supply the
pytest-timeout package.
|
This test was skipped because of issue #538, but #538 was fixed
and the test was still not re-enabled.
|
There is no reason that the filter element codepaths would behave
differently depending on the Source implementation used in the test,
as the Source implementation does not have any filter-specific
virtual methods.
Remove the redundant tests and just perform these tests with the
git source.
|
Removed redundant tests from the recently merged !740; this new
test does not need to run for every different source kind.
|
This adds a regression test for #461.
|
This makes the integration tests use the same 'alpine' alias for the
tests as we use in the examples; this avoids a redundant download
of an extra alpine tarball in the integration tests.
This is a part of #603
|