author     Sam Doran <sdoran@redhat.com>   2021-03-19 15:09:18 -0400
committer  GitHub <noreply@github.com>     2021-03-19 12:09:18 -0700
commit     abacf6a108b038571a0c3daeae63da0897c8fcb6 (patch)
tree       c9da1813642dde72ff13f89ac03e4fee0e043f39 /lib/ansible/module_utils/common
parent     089d0a0508a470799d099d95fc371e66756a00b3 (diff)
download   ansible-abacf6a108b038571a0c3daeae63da0897c8fcb6.tar.gz
Use ArgumentSpecValidator in AnsibleModule (#73703)
* Begin using ArgumentSpecValidator in AnsibleModule
* Add check parameters to ArgumentSpecValidator
Add additional parameters for specifying required and mutually exclusive parameters.
Add code to the .validate() method that runs these additional checks.
* Make errors related to unsupported parameters match existing behavior
Update the punctuation in the message slightly to make it more readable.
Add a property to ArgumentSpecValidator to hold valid parameter names.
* Set default values after performing checks
* Fix sanity test failure
* Use correct parameters when checking sub options
* Use a dict when iterating over check functions
Referencing by key names makes things a bit more readable IMO.
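The dict-driven dispatch described in this and the previous bullet can be sketched as follows. This is a simplified, self-contained stand-in for the real `_ADDITIONAL_CHECKS` table in `module_utils/common/parameters.py`; the toy `check_required_together` here only approximates the real validation function:

```python
def check_required_together(terms, parameters):
    """Raise TypeError if only part of a required-together group is present."""
    for group in terms or []:
        present = [t for t in group if t in parameters]
        if present and len(present) != len(group):
            raise TypeError("parameters are required together: %s" % ", ".join(group))


class RequiredTogetherError(Exception):
    pass


# Each entry names the check function, the validator attribute holding its
# terms, and the error class to wrap failures in.
_ADDITIONAL_CHECKS = (
    {'func': check_required_together, 'attr': 'required_together', 'err': RequiredTogetherError},
)


class Validator:
    def __init__(self, required_together=None):
        self._required_together = required_together

    def validate(self, parameters):
        errors = []
        for check in _ADDITIONAL_CHECKS:
            try:
                # The same string keys the check table and names the private
                # attribute, so getattr() can look up the terms to check.
                check['func'](getattr(self, '_%s' % check['attr']), parameters)
            except TypeError as te:
                errors.append(check['err'](str(te)))
        return errors
```

Looking up check functions by key rather than iterating a positional tuple keeps the dispatch loop readable, and adding a new check is a one-line table entry.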
* Fix bug in comparison for sub options evaluation
* Add options_context to check functions
This allows the parent parameter to be added to the error message if a validation
error occurs in a sub option.
* Fix bug in apply_defaults behavior of sub spec validation
* Accept options_context in get_unsupported_parameters()
If options_context is supplied, a tuple of the parent key names of the unsupported parameter will be
created. This allows the full "path" to the unsupported parameter to be reported.
* Build path to the unsupported parameter for error messages.
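The tuple-to-dotted-path flattening these two bullets describe mirrors the reporting loop in `arg_spec.py`: nested unsupported parameters arrive as tuples of parent keys, top-level ones as plain strings. A minimal sketch:

```python
def flatten_unsupported(unsupported):
    """Join tuple "paths" with dots so nested unsupported parameters
    are reported as e.g. 'opts.bad' rather than ('opts', 'bad')."""
    names = []
    for item in unsupported:
        if isinstance(item, tuple):
            names.append(".".join(item))
        else:
            names.append(item)
    return sorted(names)
```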
* Remove unused import
* Update recursive finder test
* Skip if running in check mode
This was done in the _check_arguments() method. That was moved to a function that has no
way of calling fail_json(), so it must be done outside of validation.
This is a slight change in behavior, but I believe the correct one.
Previously, only unsupported parameters would cause a failure. All other checks would not be executed
if the module did not support check mode. This would hide validation failures in check mode.
* The great purge
Remove all methods related to argument spec validation from AnsibleModule
* Keep _name and kind in the caller and out of the validator
This seems a bit awkward, since it means the caller could end up with {name} and {kind} in
the error message if they don't run the messages through the .format() method
with name and kind parameters.
* Double moustaches work
I wasn't sure if they get stripped or not. Looks like they do. Neat trick.
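The trick in question is plain `str.format()` behavior: doubled braces collapse to literal single braces, so a first formatting pass can leave `{name}` and `{kind}` placeholders behind for the caller to fill in later. A small demonstration (the message text is illustrative, not the exact string in the source):

```python
# First pass fills {0} and strips the double moustaches down to single ones.
template = "Unsupported parameters for ({{name}}) {{kind}}: {0}"
partial = template.format("foo, bar")
# partial is now "Unsupported parameters for ({name}) {kind}: foo, bar"

# Second pass, done by the caller, fills in name and kind.
final = partial.format(name="my_module", kind="module")
```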
* Add changelog
* Update unsupported parameter test
The error message changed to include name and kind.
* Remove unused import
* Add better documentation for ArgumentSpecValidator class
* Fix example
* Few more docs fixes
* Mark required and mutually exclusive attributes as private
* Mark validate functions as private
* Reorganize functions in validation.py
* Remove unused imports in basic.py related to argument spec validation
* Create errors in module_utils
We have errors in lib/ansible/errors/ but those cannot be used by modules.
* Update recursive finder test
* Move errors to file rather than __init__.py
* Change ArgumentSpecValidator.validate() interface
Raise AnsibleValidationErrorMultiple on validation error which contains all AnsibleValidationError
exceptions for validation failures.
Return the validated parameters if validation is successful rather than True/False.
Update docs and tests.
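A minimal sketch of the aggregate-exception interface this bullet describes — collecting every validation failure into one exception instead of stopping at the first. Names mirror the commit; this is not the real `module_utils/errors.py` implementation:

```python
class AnsibleValidationError(Exception):
    """A single validation failure."""

    def __init__(self, message):
        # Populate args on the base Exception, as the commit describes.
        super().__init__(message)
        self.error_message = message


class AnsibleValidationErrorMultiple(AnsibleValidationError):
    """Raised on validation failure; holds every individual error."""

    def __init__(self, errors=None):
        self.errors = list(errors) if errors else []

    def __getitem__(self, key):
        # Allows result.errors[0] style access to individual error objects.
        return self.errors[key]

    def append(self, error):
        self.errors.append(error)

    @property
    def messages(self):
        return [err.error_message for err in self.errors]
```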
* Get attribute in loop so that the attribute name can also be used as a parameter
* Shorten line
* Update calling code in AnsibleModule for new validator interface
* Update calling code in validate_argument_spec based on the new validation interface
* Base custom exception class off of Exception
* Call the __init__ method of the base Exception class to populate args
* Ensure no_log values are always updated
* Make custom exceptions more hierarchical
This redefines AnsibleError from lib/ansible/errors with a different signature since that cannot
be used by modules. This may be a bad idea. Maybe lib/ansible/errors should be moved to
module_utils, or AnsibleError defined in this commit should use the same signature as the original.
* Just go back to basing off Exception
* Return ValidationResult object on successful validation
Create a ValidationResult class.
Return a ValidationResult from ArgumentSpecValidator.validate() when validation is successful.
Update class and method docs.
Update unit tests based on interface change.
* Make it easier to get error objects from AnsibleValidationErrorMultiple
This makes the interface cleaner when getting individual error objects contained in a single
AnsibleValidationErrorMultiple instance.
* Define custom exception for each type of validation failure
These errors indicate where a validation error occurred. Currently they are empty but could
contain specific data for each exception type in the future.
* Update tests based on (yet another) interface change
* Mark several more functions as private
These are all doing rather "internal" things. The ArgumentSpecValidator class is the preferred
public interface.
* Move warnings and deprecations to result object
Rather than calling deprecate() and warn() directly, store them on the result object so the
caller can decide what to do with them.
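The division of labor this bullet describes can be sketched as follows (hypothetical names; in the real code the caller is `ModuleArgumentSpecValidator.validate()`, which feeds the stored dicts to the global `deprecate()` and `warn()` functions):

```python
class ValidationResult:
    """The validator only records structured data; it never emits anything."""

    def __init__(self):
        self._warnings = []      # e.g. {'option': ..., 'alias': ...}
        self._deprecations = []  # e.g. {'name': ..., 'version': ...}


def emit(result, warn, deprecate):
    """Caller-side helper: turn stored dicts into actual warnings."""
    for d in result._deprecations:
        deprecate("Alias '{name}' is deprecated.".format(name=d['name']),
                  version=d.get('version'))
    for w in result._warnings:
        warn("Both option {option} and its alias {alias} are set.".format(**w))
```

Keeping the result object free of side effects is what lets the same validator serve both AnsibleModule (which wants global warnings) and other callers (which may want to handle them differently).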
* Use subclass for module arg spec validation
The subclass uses the global warnings and deprecations feature
* Fix up docs
* Remove legal_inputs munging from _handle_aliases()
This is done in AnsibleModule by the _set_internal_properties() method. It only makes sense
to do that for an AnsibleModule instance (it should update the parameters before performing
validation) and shouldn't be done by the validator.
Create a private function just for getting legal inputs since that is done in a couple of places.
It may make sense to store that on the ValidationResult object.
* Increase test coverage
* Remove unnecessary conditional
ci_complete
* Mark warnings and deprecations as private in the ValidationResult
They can be made public once we come up with a way to make them more generally useful,
probably by creating custom objects to store the data in a more structured way.
* Mark valid_parameter_names as private and populate it during initialization
* Use a global for storing the list of additional checks to perform
This list is used by the main validate method as well as the sub spec validation.
Diffstat (limited to 'lib/ansible/module_utils/common')
-rw-r--r--  lib/ansible/module_utils/common/arg_spec.py    | 261
-rw-r--r--  lib/ansible/module_utils/common/parameters.py  | 810
-rw-r--r--  lib/ansible/module_utils/common/validation.py  |  80
3 files changed, 660 insertions, 491 deletions
diff --git a/lib/ansible/module_utils/common/arg_spec.py b/lib/ansible/module_utils/common/arg_spec.py index 54bf80a587..c4d4a247ed 100644 --- a/lib/ansible/module_utils/common/arg_spec.py +++ b/lib/ansible/module_utils/common/arg_spec.py @@ -5,71 +5,146 @@ from __future__ import absolute_import, division, print_function __metaclass__ = type - from copy import deepcopy -from ansible.module_utils.common._collections_compat import ( - Sequence, -) - from ansible.module_utils.common.parameters import ( - get_unsupported_parameters, - handle_aliases, - list_no_log_values, - remove_values, - set_defaults, + _ADDITIONAL_CHECKS, + _get_legal_inputs, + _get_unsupported_parameters, + _handle_aliases, + _list_no_log_values, + _set_defaults, + _validate_argument_types, + _validate_argument_values, + _validate_sub_spec, set_fallbacks, - validate_argument_types, - validate_argument_values, - validate_sub_spec, ) from ansible.module_utils.common.text.converters import to_native from ansible.module_utils.common.warnings import deprecate, warn + from ansible.module_utils.common.validation import ( + check_mutually_exclusive, check_required_arguments, + check_required_by, + check_required_if, + check_required_one_of, + check_required_together, +) + +from ansible.module_utils.errors import ( + AliasError, + AnsibleValidationErrorMultiple, + MutuallyExclusiveError, + NoLogError, + RequiredByError, + RequiredDefaultError, + RequiredError, + RequiredIfError, + RequiredOneOfError, + RequiredTogetherError, + UnsupportedError, ) -from ansible.module_utils.six import string_types +class ValidationResult: + """Result of argument spec validation. -class ArgumentSpecValidator(): - """Argument spec validation class""" + :param parameters: Terms to be validated and coerced to the correct type. 
+ :type parameters: dict - def __init__(self, argument_spec, parameters): - self._error_messages = [] + """ + + def __init__(self, parameters): self._no_log_values = set() - self.argument_spec = argument_spec - # Make a copy of the original parameters to avoid changing them - self._validated_parameters = deepcopy(parameters) self._unsupported_parameters = set() - - @property - def error_messages(self): - return self._error_messages + self._validated_parameters = deepcopy(parameters) + self._deprecations = [] + self._warnings = [] + self.errors = AnsibleValidationErrorMultiple() @property def validated_parameters(self): return self._validated_parameters - def _add_error(self, error): - if isinstance(error, string_types): - self._error_messages.append(error) - elif isinstance(error, Sequence): - self._error_messages.extend(error) - else: - raise ValueError('Error messages must be a string or sequence not a %s' % type(error)) + @property + def unsupported_parameters(self): + return self._unsupported_parameters + + @property + def error_messages(self): + return self.errors.messages + + +class ArgumentSpecValidator: + """Argument spec validation class + + Creates a validator based on the ``argument_spec`` that can be used to + validate a number of parameters using the ``validate()`` method. + + :param argument_spec: Specification of valid parameters and their type. May + include nested argument specs. + :type argument_spec: dict + + :param mutually_exclusive: List or list of lists of terms that should not + be provided together. + :type mutually_exclusive: list, optional + + :param required_together: List of lists of terms that are required together. + :type required_together: list, optional - def _sanitize_error_messages(self): - self._error_messages = remove_values(self._error_messages, self._no_log_values) + :param required_one_of: List of lists of terms, one of which in each list + is required. 
+ :type required_one_of: list, optional - def validate(self, *args, **kwargs): - """Validate module parameters against argument spec. + :param required_if: List of lists of ``[parameter, value, [parameters]]`` where + one of [parameters] is required if ``parameter`` == ``value``. + :type required_if: list, optional + + :param required_by: Dictionary of parameter names that contain a list of + parameters required by each key in the dictionary. + :type required_by: dict, optional + """ + + def __init__(self, argument_spec, + mutually_exclusive=None, + required_together=None, + required_one_of=None, + required_if=None, + required_by=None, + ): + + self._mutually_exclusive = mutually_exclusive + self._required_together = required_together + self._required_one_of = required_one_of + self._required_if = required_if + self._required_by = required_by + self._valid_parameter_names = set() + self.argument_spec = argument_spec + + for key in sorted(self.argument_spec.keys()): + aliases = self.argument_spec[key].get('aliases') + if aliases: + self._valid_parameter_names.update(["{key} ({aliases})".format(key=key, aliases=", ".join(sorted(aliases)))]) + else: + self._valid_parameter_names.update([key]) + + def validate(self, parameters, *args, **kwargs): + """Validate module parameters against argument spec. Returns a + ValidationResult object. + + Error messages in the ValidationResult may contain no_log values and should be + sanitized before logging or displaying. 
:Example: - validator = ArgumentSpecValidator(argument_spec, parameters) - passeded = validator.validate() + validator = ArgumentSpecValidator(argument_spec) + result = validator.validate(parameters) + + if result.error_messages: + sys.exit("Validation failed: {0}".format(", ".join(result.error_messages)) + + valid_params = result.validated_parameters :param argument_spec: Specification of parameters, type, and valid values :type argument_spec: dict @@ -77,58 +152,104 @@ class ArgumentSpecValidator(): :param parameters: Parameters provided to the role :type parameters: dict - :returns: True if no errors were encountered, False if any errors were encountered. - :rtype: bool + :return: Object containing validated parameters. + :rtype: ValidationResult """ - self._no_log_values.update(set_fallbacks(self.argument_spec, self._validated_parameters)) + result = ValidationResult(parameters) + + result._no_log_values.update(set_fallbacks(self.argument_spec, result._validated_parameters)) alias_warnings = [] alias_deprecations = [] try: - alias_results, legal_inputs = handle_aliases(self.argument_spec, self._validated_parameters, alias_warnings, alias_deprecations) + aliases = _handle_aliases(self.argument_spec, result._validated_parameters, alias_warnings, alias_deprecations) except (TypeError, ValueError) as e: - alias_results = {} - legal_inputs = None - self._add_error(to_native(e)) + aliases = {} + result.errors.append(AliasError(to_native(e))) + + legal_inputs = _get_legal_inputs(self.argument_spec, result._validated_parameters, aliases) for option, alias in alias_warnings: - warn('Both option %s and its alias %s are set.' % (option, alias)) + result._warnings.append({'option': option, 'alias': alias}) for deprecation in alias_deprecations: - deprecate("Alias '%s' is deprecated. 
See the module docs for more information" % deprecation['name'], - version=deprecation.get('version'), date=deprecation.get('date'), - collection_name=deprecation.get('collection_name')) + result._deprecations.append({ + 'name': deprecation['name'], + 'version': deprecation.get('version'), + 'date': deprecation.get('date'), + 'collection_name': deprecation.get('collection_name'), + }) - self._no_log_values.update(list_no_log_values(self.argument_spec, self._validated_parameters)) + try: + result._no_log_values.update(_list_no_log_values(self.argument_spec, result._validated_parameters)) + except TypeError as te: + result.errors.append(NoLogError(to_native(te))) + + try: + result._unsupported_parameters.update(_get_unsupported_parameters(self.argument_spec, result._validated_parameters, legal_inputs)) + except TypeError as te: + result.errors.append(RequiredDefaultError(to_native(te))) + except ValueError as ve: + result.errors.append(AliasError(to_native(ve))) - if legal_inputs is None: - legal_inputs = list(alias_results.keys()) + list(self.argument_spec.keys()) - self._unsupported_parameters.update(get_unsupported_parameters(self.argument_spec, self._validated_parameters, legal_inputs)) + try: + check_mutually_exclusive(self._mutually_exclusive, result._validated_parameters) + except TypeError as te: + result.errors.append(MutuallyExclusiveError(to_native(te))) - self._no_log_values.update(set_defaults(self.argument_spec, self._validated_parameters, False)) + result._no_log_values.update(_set_defaults(self.argument_spec, result._validated_parameters, False)) try: - check_required_arguments(self.argument_spec, self._validated_parameters) + check_required_arguments(self.argument_spec, result._validated_parameters) except TypeError as e: - self._add_error(to_native(e)) + result.errors.append(RequiredError(to_native(e))) + + _validate_argument_types(self.argument_spec, result._validated_parameters, errors=result.errors) + _validate_argument_values(self.argument_spec, 
result._validated_parameters, errors=result.errors) + + for check in _ADDITIONAL_CHECKS: + try: + check['func'](getattr(self, "_{attr}".format(attr=check['attr'])), result._validated_parameters) + except TypeError as te: + result.errors.append(check['err'](to_native(te))) + + result._no_log_values.update(_set_defaults(self.argument_spec, result._validated_parameters)) + + _validate_sub_spec(self.argument_spec, result._validated_parameters, + errors=result.errors, + no_log_values=result._no_log_values, + unsupported_parameters=result._unsupported_parameters) + + if result._unsupported_parameters: + flattened_names = [] + for item in result._unsupported_parameters: + if isinstance(item, tuple): + flattened_names.append(".".join(item)) + else: + flattened_names.append(item) + + unsupported_string = ", ".join(sorted(list(flattened_names))) + supported_string = ", ".join(self._valid_parameter_names) + result.errors.append( + UnsupportedError("{0}. Supported parameters include: {1}.".format(unsupported_string, supported_string))) + + return result - validate_argument_types(self.argument_spec, self._validated_parameters, errors=self._error_messages) - validate_argument_values(self.argument_spec, self._validated_parameters, errors=self._error_messages) - self._no_log_values.update(set_defaults(self.argument_spec, self._validated_parameters)) +class ModuleArgumentSpecValidator(ArgumentSpecValidator): + def __init__(self, *args, **kwargs): + super(ModuleArgumentSpecValidator, self).__init__(*args, **kwargs) - validate_sub_spec(self.argument_spec, self._validated_parameters, - errors=self._error_messages, - no_log_values=self._no_log_values, - unsupported_parameters=self._unsupported_parameters) + def validate(self, parameters): + result = super(ModuleArgumentSpecValidator, self).validate(parameters) - if self._unsupported_parameters: - self._add_error('Unsupported parameters: %s' % ', '.join(sorted(list(self._unsupported_parameters)))) + for d in result._deprecations: + 
deprecate("Alias '{name}' is deprecated. See the module docs for more information".format(name=d['name']), + version=d.get('version'), date=d.get('date'), + collection_name=d.get('collection_name')) - self._sanitize_error_messages() + for w in result._warnings: + warn('Both option {option} and its alias {alias} are set.'.format(option=w['option'], alias=w['alias'])) - if self.error_messages: - return False - else: - return True + return result diff --git a/lib/ansible/module_utils/common/parameters.py b/lib/ansible/module_utils/common/parameters.py index 4fa5dab84c..e297573410 100644 --- a/lib/ansible/module_utils/common/parameters.py +++ b/lib/ansible/module_utils/common/parameters.py @@ -15,6 +15,22 @@ from ansible.module_utils.common.collections import is_iterable from ansible.module_utils.common.text.converters import to_bytes, to_native, to_text from ansible.module_utils.common.text.formatters import lenient_lowercase from ansible.module_utils.common.warnings import warn +from ansible.module_utils.errors import ( + AliasError, + AnsibleFallbackNotFound, + AnsibleValidationErrorMultiple, + ArgumentTypeError, + ArgumentValueError, + ElementError, + MutuallyExclusiveError, + NoLogError, + RequiredByError, + RequiredError, + RequiredIfError, + RequiredOneOfError, + RequiredTogetherError, + SubParameterTypeError, +) from ansible.module_utils.parsing.convert_bool import BOOLEANS_FALSE, BOOLEANS_TRUE from ansible.module_utils.common._collections_compat import ( @@ -59,6 +75,13 @@ from ansible.module_utils.common.validation import ( # Python2 & 3 way to get NoneType NoneType = type(None) +_ADDITIONAL_CHECKS = ( + {'func': check_required_together, 'attr': 'required_together', 'err': RequiredTogetherError}, + {'func': check_required_one_of, 'attr': 'required_one_of', 'err': RequiredOneOfError}, + {'func': check_required_if, 'attr': 'required_if', 'err': RequiredIfError}, + {'func': check_required_by, 'attr': 'required_by', 'err': RequiredByError}, +) + # if adding 
boolean attribute, also add to PASS_BOOL # some of this dupes defaults from controller config PASS_VARS = { @@ -97,8 +120,221 @@ DEFAULT_TYPE_VALIDATORS = { } -class AnsibleFallbackNotFound(Exception): - pass +def _get_type_validator(wanted): + """Returns the callable used to validate a wanted type and the type name. + + :arg wanted: String or callable. If a string, get the corresponding + validation function from DEFAULT_TYPE_VALIDATORS. If callable, + get the name of the custom callable and return that for the type_checker. + + :returns: Tuple of callable function or None, and a string that is the name + of the wanted type. + """ + + # Use one our our builtin validators. + if not callable(wanted): + if wanted is None: + # Default type for parameters + wanted = 'str' + + type_checker = DEFAULT_TYPE_VALIDATORS.get(wanted) + + # Use the custom callable for validation. + else: + type_checker = wanted + wanted = getattr(wanted, '__name__', to_native(type(wanted))) + + return type_checker, wanted + + +def _get_legal_inputs(argument_spec, parameters, aliases=None): + if aliases is None: + aliases = _handle_aliases(argument_spec, parameters) + + return list(aliases.keys()) + list(argument_spec.keys()) + + +def _get_unsupported_parameters(argument_spec, parameters, legal_inputs=None, options_context=None): + """Check keys in parameters against those provided in legal_inputs + to ensure they contain legal values. If legal_inputs are not supplied, + they will be generated using the argument_spec. + + :arg argument_spec: Dictionary of parameters, their type, and valid values. + :arg parameters: Dictionary of parameters. + :arg legal_inputs: List of valid key names property names. Overrides values + in argument_spec. + :arg options_context: List of parent keys for tracking the context of where + a parameter is defined. + + :returns: Set of unsupported parameters. Empty set if no unsupported parameters + are found. 
+ """ + + if legal_inputs is None: + legal_inputs = _get_legal_inputs(argument_spec, parameters) + + unsupported_parameters = set() + for k in parameters.keys(): + if k not in legal_inputs: + context = k + if options_context: + context = tuple(options_context + [k]) + + unsupported_parameters.add(context) + + return unsupported_parameters + + +def _handle_aliases(argument_spec, parameters, alias_warnings=None, alias_deprecations=None): + """Process aliases from an argument_spec including warnings and deprecations. + + Modify ``parameters`` by adding a new key for each alias with the supplied + value from ``parameters``. + + If a list is provided to the alias_warnings parameter, it will be filled with tuples + (option, alias) in every case where both an option and its alias are specified. + + If a list is provided to alias_deprecations, it will be populated with dictionaries, + each containing deprecation information for each alias found in argument_spec. + + :param argument_spec: Dictionary of parameters, their type, and valid values. + :type argument_spec: dict + + :param parameters: Dictionary of parameters. 
+ :type parameters: dict + + :param alias_warnings: + :type alias_warnings: list + + :param alias_deprecations: + :type alias_deprecations: list + """ + + aliases_results = {} # alias:canon + + for (k, v) in argument_spec.items(): + aliases = v.get('aliases', None) + default = v.get('default', None) + required = v.get('required', False) + + if alias_deprecations is not None: + for alias in argument_spec[k].get('deprecated_aliases', []): + if alias.get('name') in parameters: + alias_deprecations.append(alias) + + if default is not None and required: + # not alias specific but this is a good place to check this + raise ValueError("internal error: required and default are mutually exclusive for %s" % k) + + if aliases is None: + continue + + if not is_iterable(aliases) or isinstance(aliases, (binary_type, text_type)): + raise TypeError('internal error: aliases must be a list or tuple') + + for alias in aliases: + aliases_results[alias] = k + if alias in parameters: + if k in parameters and alias_warnings is not None: + alias_warnings.append((k, alias)) + parameters[k] = parameters[alias] + + return aliases_results + + +def _list_deprecations(argument_spec, parameters, prefix=''): + """Return a list of deprecations + + :arg argument_spec: An argument spec dictionary + :arg parameters: Dictionary of parameters + + :returns: List of dictionaries containing a message and version in which + the deprecated parameter will be removed, or an empty list:: + + [{'msg': "Param 'deptest' is deprecated. See the module docs for more information", 'version': '2.9'}] + """ + + deprecations = [] + for arg_name, arg_opts in argument_spec.items(): + if arg_name in parameters: + if prefix: + sub_prefix = '%s["%s"]' % (prefix, arg_name) + else: + sub_prefix = arg_name + if arg_opts.get('removed_at_date') is not None: + deprecations.append({ + 'msg': "Param '%s' is deprecated. 
See the module docs for more information" % sub_prefix, + 'date': arg_opts.get('removed_at_date'), + 'collection_name': arg_opts.get('removed_from_collection'), + }) + elif arg_opts.get('removed_in_version') is not None: + deprecations.append({ + 'msg': "Param '%s' is deprecated. See the module docs for more information" % sub_prefix, + 'version': arg_opts.get('removed_in_version'), + 'collection_name': arg_opts.get('removed_from_collection'), + }) + # Check sub-argument spec + sub_argument_spec = arg_opts.get('options') + if sub_argument_spec is not None: + sub_arguments = parameters[arg_name] + if isinstance(sub_arguments, Mapping): + sub_arguments = [sub_arguments] + if isinstance(sub_arguments, list): + for sub_params in sub_arguments: + if isinstance(sub_params, Mapping): + deprecations.extend(_list_deprecations(sub_argument_spec, sub_params, prefix=sub_prefix)) + + return deprecations + + +def _list_no_log_values(argument_spec, params): + """Return set of no log values + + :arg argument_spec: An argument spec dictionary + :arg params: Dictionary of all parameters + + :returns: Set of strings that should be hidden from output:: + + {'secret_dict_value', 'secret_list_item_one', 'secret_list_item_two', 'secret_string'} + """ + + no_log_values = set() + for arg_name, arg_opts in argument_spec.items(): + if arg_opts.get('no_log', False): + # Find the value for the no_log'd param + no_log_object = params.get(arg_name, None) + + if no_log_object: + try: + no_log_values.update(_return_datastructure_name(no_log_object)) + except TypeError as e: + raise TypeError('Failed to convert "%s": %s' % (arg_name, to_native(e))) + + # Get no_log values from suboptions + sub_argument_spec = arg_opts.get('options') + if sub_argument_spec is not None: + wanted_type = arg_opts.get('type') + sub_parameters = params.get(arg_name) + + if sub_parameters is not None: + if wanted_type == 'dict' or (wanted_type == 'list' and arg_opts.get('elements', '') == 'dict'): + # Sub parameters can 
be a dict or list of dicts. Ensure parameters are always a list. + if not isinstance(sub_parameters, list): + sub_parameters = [sub_parameters] + + for sub_param in sub_parameters: + # Validate dict fields in case they came in as strings + + if isinstance(sub_param, string_types): + sub_param = check_type_dict(sub_param) + + if not isinstance(sub_param, Mapping): + raise TypeError("Value '{1}' in the sub parameter field '{0}' must by a {2}, " + "not '{1.__class__.__name__}'".format(arg_name, sub_param, wanted_type)) + + no_log_values.update(_list_no_log_values(sub_argument_spec, sub_param)) + + return no_log_values def _return_datastructure_name(obj): @@ -217,79 +453,7 @@ def _remove_values_conditions(value, no_log_strings, deferred_removals): return value -def _sanitize_keys_conditions(value, no_log_strings, ignore_keys, deferred_removals): - """ Helper method to sanitize_keys() to build deferred_removals and avoid deep recursion. """ - if isinstance(value, (text_type, binary_type)): - return value - - if isinstance(value, Sequence): - if isinstance(value, MutableSequence): - new_value = type(value)() - else: - new_value = [] # Need a mutable value - deferred_removals.append((value, new_value)) - return new_value - - if isinstance(value, Set): - if isinstance(value, MutableSet): - new_value = type(value)() - else: - new_value = set() # Need a mutable value - deferred_removals.append((value, new_value)) - return new_value - - if isinstance(value, Mapping): - if isinstance(value, MutableMapping): - new_value = type(value)() - else: - new_value = {} # Need a mutable value - deferred_removals.append((value, new_value)) - return new_value - - if isinstance(value, tuple(chain(integer_types, (float, bool, NoneType)))): - return value - - if isinstance(value, (datetime.datetime, datetime.date)): - return value - - raise TypeError('Value of unknown type: %s, %s' % (type(value), value)) - - -def env_fallback(*args, **kwargs): - """Load value from environment variable""" - - 
for arg in args: - if arg in os.environ: - return os.environ[arg] - raise AnsibleFallbackNotFound - - -def set_fallbacks(argument_spec, parameters): - no_log_values = set() - for param, value in argument_spec.items(): - fallback = value.get('fallback', (None,)) - fallback_strategy = fallback[0] - fallback_args = [] - fallback_kwargs = {} - if param not in parameters and fallback_strategy is not None: - for item in fallback[1:]: - if isinstance(item, dict): - fallback_kwargs = item - else: - fallback_args = item - try: - fallback_value = fallback_strategy(*fallback_args, **fallback_kwargs) - except AnsibleFallbackNotFound: - continue - else: - if value.get('no_log', False) and fallback_value: - no_log_values.add(fallback_value) - parameters[param] = fallback_value - - return no_log_values - - -def set_defaults(argument_spec, parameters, set_default=True): +def _set_defaults(argument_spec, parameters, set_default=True): """Set default values for parameters when no value is supplied. Modifies parameters directly. 
@@ -326,284 +490,50 @@ def set_defaults(argument_spec, parameters, set_default=True): return no_log_values -def list_no_log_values(argument_spec, params): - """Return set of no log values - - :arg argument_spec: An argument spec dictionary from a module - :arg params: Dictionary of all parameters - - :returns: Set of strings that should be hidden from output:: - - {'secret_dict_value', 'secret_list_item_one', 'secret_list_item_two', 'secret_string'} - """ - - no_log_values = set() - for arg_name, arg_opts in argument_spec.items(): - if arg_opts.get('no_log', False): - # Find the value for the no_log'd param - no_log_object = params.get(arg_name, None) - - if no_log_object: - try: - no_log_values.update(_return_datastructure_name(no_log_object)) - except TypeError as e: - raise TypeError('Failed to convert "%s": %s' % (arg_name, to_native(e))) - - # Get no_log values from suboptions - sub_argument_spec = arg_opts.get('options') - if sub_argument_spec is not None: - wanted_type = arg_opts.get('type') - sub_parameters = params.get(arg_name) - - if sub_parameters is not None: - if wanted_type == 'dict' or (wanted_type == 'list' and arg_opts.get('elements', '') == 'dict'): - # Sub parameters can be a dict or list of dicts. Ensure parameters are always a list. 
- if not isinstance(sub_parameters, list): - sub_parameters = [sub_parameters] - - for sub_param in sub_parameters: - # Validate dict fields in case they came in as strings - - if isinstance(sub_param, string_types): - sub_param = check_type_dict(sub_param) - - if not isinstance(sub_param, Mapping): - raise TypeError("Value '{1}' in the sub parameter field '{0}' must by a {2}, " - "not '{1.__class__.__name__}'".format(arg_name, sub_param, wanted_type)) - - no_log_values.update(list_no_log_values(sub_argument_spec, sub_param)) - - return no_log_values - - -def list_deprecations(argument_spec, parameters, prefix=''): - """Return a list of deprecations - - :arg argument_spec: An argument spec dictionary from a module - :arg parameters: Dictionary of parameters - - :returns: List of dictionaries containing a message and version in which - the deprecated parameter will be removed, or an empty list:: - - [{'msg': "Param 'deptest' is deprecated. See the module docs for more information", 'version': '2.9'}] - """ - - deprecations = [] - for arg_name, arg_opts in argument_spec.items(): - if arg_name in parameters: - if prefix: - sub_prefix = '%s["%s"]' % (prefix, arg_name) - else: - sub_prefix = arg_name - if arg_opts.get('removed_at_date') is not None: - deprecations.append({ - 'msg': "Param '%s' is deprecated. See the module docs for more information" % sub_prefix, - 'date': arg_opts.get('removed_at_date'), - 'collection_name': arg_opts.get('removed_from_collection'), - }) - elif arg_opts.get('removed_in_version') is not None: - deprecations.append({ - 'msg': "Param '%s' is deprecated. 
See the module docs for more information" % sub_prefix, - 'version': arg_opts.get('removed_in_version'), - 'collection_name': arg_opts.get('removed_from_collection'), - }) - # Check sub-argument spec - sub_argument_spec = arg_opts.get('options') - if sub_argument_spec is not None: - sub_arguments = parameters[arg_name] - if isinstance(sub_arguments, Mapping): - sub_arguments = [sub_arguments] - if isinstance(sub_arguments, list): - for sub_params in sub_arguments: - if isinstance(sub_params, Mapping): - deprecations.extend(list_deprecations(sub_argument_spec, sub_params, prefix=sub_prefix)) - - return deprecations - - -def sanitize_keys(obj, no_log_strings, ignore_keys=frozenset()): - """ Sanitize the keys in a container object by removing no_log values from key names. - - This is a companion function to the `remove_values()` function. Similar to that function, - we make use of deferred_removals to avoid hitting maximum recursion depth in cases of - large data structures. - - :param obj: The container object to sanitize. Non-container objects are returned unmodified. - :param no_log_strings: A set of string values we do not want logged. - :param ignore_keys: A set of string values of keys to not sanitize. - - :returns: An object with sanitized keys. - """ - - deferred_removals = deque() - - no_log_strings = [to_native(s, errors='surrogate_or_strict') for s in no_log_strings] - new_value = _sanitize_keys_conditions(obj, no_log_strings, ignore_keys, deferred_removals) - - while deferred_removals: - old_data, new_data = deferred_removals.popleft() +def _sanitize_keys_conditions(value, no_log_strings, ignore_keys, deferred_removals): + """ Helper method to sanitize_keys() to build deferred_removals and avoid deep recursion. 
""" + if isinstance(value, (text_type, binary_type)): + return value - if isinstance(new_data, Mapping): - for old_key, old_elem in old_data.items(): - if old_key in ignore_keys or old_key.startswith('_ansible'): - new_data[old_key] = _sanitize_keys_conditions(old_elem, no_log_strings, ignore_keys, deferred_removals) - else: - # Sanitize the old key. We take advantage of the sanitizing code in - # _remove_values_conditions() rather than recreating it here. - new_key = _remove_values_conditions(old_key, no_log_strings, None) - new_data[new_key] = _sanitize_keys_conditions(old_elem, no_log_strings, ignore_keys, deferred_removals) + if isinstance(value, Sequence): + if isinstance(value, MutableSequence): + new_value = type(value)() else: - for elem in old_data: - new_elem = _sanitize_keys_conditions(elem, no_log_strings, ignore_keys, deferred_removals) - if isinstance(new_data, MutableSequence): - new_data.append(new_elem) - elif isinstance(new_data, MutableSet): - new_data.add(new_elem) - else: - raise TypeError('Unknown container type encountered when removing private values from keys') - - return new_value - - -def remove_values(value, no_log_strings): - """ Remove strings in no_log_strings from value. If value is a container - type, then remove a lot more. - - Use of deferred_removals exists, rather than a pure recursive solution, - because of the potential to hit the maximum recursion depth when dealing with - large amounts of data (see issue #24560). 
- """ - - deferred_removals = deque() - - no_log_strings = [to_native(s, errors='surrogate_or_strict') for s in no_log_strings] - new_value = _remove_values_conditions(value, no_log_strings, deferred_removals) + new_value = [] # Need a mutable value + deferred_removals.append((value, new_value)) + return new_value - while deferred_removals: - old_data, new_data = deferred_removals.popleft() - if isinstance(new_data, Mapping): - for old_key, old_elem in old_data.items(): - new_elem = _remove_values_conditions(old_elem, no_log_strings, deferred_removals) - new_data[old_key] = new_elem + if isinstance(value, Set): + if isinstance(value, MutableSet): + new_value = type(value)() else: - for elem in old_data: - new_elem = _remove_values_conditions(elem, no_log_strings, deferred_removals) - if isinstance(new_data, MutableSequence): - new_data.append(new_elem) - elif isinstance(new_data, MutableSet): - new_data.add(new_elem) - else: - raise TypeError('Unknown container type encountered when removing private values from output') - - return new_value - - -def handle_aliases(argument_spec, parameters, alias_warnings=None, alias_deprecations=None): - """Return a two item tuple. The first is a dictionary of aliases, the second is - a list of legal inputs. - - Modify supplied parameters by adding a new key for each alias. - - If a list is provided to the alias_warnings parameter, it will be filled with tuples - (option, alias) in every case where both an option and its alias are specified. - - If a list is provided to alias_deprecations, it will be populated with dictionaries, - each containing deprecation information for each alias found in argument_spec. 
- """ - - legal_inputs = ['_ansible_%s' % k for k in PASS_VARS] - aliases_results = {} # alias:canon - - for (k, v) in argument_spec.items(): - legal_inputs.append(k) - aliases = v.get('aliases', None) - default = v.get('default', None) - required = v.get('required', False) - - if alias_deprecations is not None: - for alias in argument_spec[k].get('deprecated_aliases', []): - if alias.get('name') in parameters: - alias_deprecations.append(alias) - - if default is not None and required: - # not alias specific but this is a good place to check this - raise ValueError("internal error: required and default are mutually exclusive for %s" % k) - - if aliases is None: - continue - - if not is_iterable(aliases) or isinstance(aliases, (binary_type, text_type)): - raise TypeError('internal error: aliases must be a list or tuple') - - for alias in aliases: - legal_inputs.append(alias) - aliases_results[alias] = k - if alias in parameters: - if k in parameters and alias_warnings is not None: - alias_warnings.append((k, alias)) - parameters[k] = parameters[alias] - - return aliases_results, legal_inputs - - -def get_unsupported_parameters(argument_spec, parameters, legal_inputs=None): - """Check keys in parameters against those provided in legal_inputs - to ensure they contain legal values. If legal_inputs are not supplied, - they will be generated using the argument_spec. - - :arg argument_spec: Dictionary of parameters, their type, and valid values. - :arg parameters: Dictionary of parameters. - :arg legal_inputs: List of valid key names property names. Overrides values - in argument_spec. - - :returns: Set of unsupported parameters. Empty set if no unsupported parameters - are found. 
- """ - - if legal_inputs is None: - aliases, legal_inputs = handle_aliases(argument_spec, parameters) - - unsupported_parameters = set() - for k in parameters.keys(): - if k not in legal_inputs: - unsupported_parameters.add(k) - - return unsupported_parameters - - -def get_type_validator(wanted): - """Returns the callable used to validate a wanted type and the type name. - - :arg wanted: String or callable. If a string, get the corresponding - validation function from DEFAULT_TYPE_VALIDATORS. If callable, - get the name of the custom callable and return that for the type_checker. - - :returns: Tuple of callable function or None, and a string that is the name - of the wanted type. - """ + new_value = set() # Need a mutable value + deferred_removals.append((value, new_value)) + return new_value - # Use one our our builtin validators. - if not callable(wanted): - if wanted is None: - # Default type for parameters - wanted = 'str' + if isinstance(value, Mapping): + if isinstance(value, MutableMapping): + new_value = type(value)() + else: + new_value = {} # Need a mutable value + deferred_removals.append((value, new_value)) + return new_value - type_checker = DEFAULT_TYPE_VALIDATORS.get(wanted) + if isinstance(value, tuple(chain(integer_types, (float, bool, NoneType)))): + return value - # Use the custom callable for validation. 
- else: - type_checker = wanted - wanted = getattr(wanted, '__name__', to_native(type(wanted))) + if isinstance(value, (datetime.datetime, datetime.date)): + return value - return type_checker, wanted + raise TypeError('Value of unknown type: %s, %s' % (type(value), value)) -def validate_elements(wanted_type, parameter, values, options_context=None, errors=None): +def _validate_elements(wanted_type, parameter, values, options_context=None, errors=None): if errors is None: - errors = [] + errors = AnsibleValidationErrorMultiple() - type_checker, wanted_element_type = get_type_validator(wanted_type) + type_checker, wanted_element_type = _get_type_validator(wanted_type) validated_parameters = [] # Get param name for strings so we can later display this value in a useful error message if needed # Only pass 'kwargs' to our checkers and ignore custom callable checkers @@ -622,11 +552,11 @@ def validate_elements(wanted_type, parameter, values, options_context=None, erro if options_context: msg += " found in '%s'" % " -> ".join(options_context) msg += " is of type %s and we were unable to convert to %s: %s" % (type(value), wanted_element_type, to_native(e)) - errors.append(msg) + errors.append(ElementError(msg)) return validated_parameters -def validate_argument_types(argument_spec, parameters, prefix='', options_context=None, errors=None): +def _validate_argument_types(argument_spec, parameters, prefix='', options_context=None, errors=None): """Validate that parameter types match the type in the argument spec. Determine the appropriate type checker function and run each @@ -637,7 +567,7 @@ def validate_argument_types(argument_spec, parameters, prefix='', options_contex :param argument_spec: Argument spec :type argument_spec: dict - :param parameters: Parameters passed to module + :param parameters: Parameters :type parameters: dict :param prefix: Name of the parent key that contains the spec. 
Used in the error message @@ -653,7 +583,7 @@ def validate_argument_types(argument_spec, parameters, prefix='', options_contex """ if errors is None: - errors = [] + errors = AnsibleValidationErrorMultiple() for param, spec in argument_spec.items(): if param not in parameters: @@ -664,7 +594,7 @@ def validate_argument_types(argument_spec, parameters, prefix='', options_contex continue wanted_type = spec.get('type') - type_checker, wanted_name = get_type_validator(wanted_type) + type_checker, wanted_name = _get_type_validator(wanted_type) # Get param name for strings so we can later display this value in a useful error message if needed # Only pass 'kwargs' to our checkers and ignore custom callable checkers kwargs = {} @@ -685,22 +615,22 @@ def validate_argument_types(argument_spec, parameters, prefix='', options_contex if options_context: msg += " found in '%s'." % " -> ".join(options_context) msg += ", elements value check is supported only with 'list' type" - errors.append(msg) - parameters[param] = validate_elements(elements_wanted_type, param, elements, options_context, errors) + errors.append(ArgumentTypeError(msg)) + parameters[param] = _validate_elements(elements_wanted_type, param, elements, options_context, errors) except (TypeError, ValueError) as e: msg = "argument '%s' is of type %s" % (param, type(value)) if options_context: msg += " found in '%s'." 
% " -> ".join(options_context) msg += " and we were unable to convert to %s: %s" % (wanted_name, to_native(e)) - errors.append(msg) + errors.append(ArgumentTypeError(msg)) -def validate_argument_values(argument_spec, parameters, options_context=None, errors=None): +def _validate_argument_values(argument_spec, parameters, options_context=None, errors=None): """Ensure all arguments have the requested values, and there are no stray arguments""" if errors is None: - errors = [] + errors = AnsibleValidationErrorMultiple() for param, spec in argument_spec.items(): choices = spec.get('choices') @@ -716,8 +646,8 @@ def validate_argument_values(argument_spec, parameters, options_context=None, er choices_str = ", ".join([to_native(c) for c in choices]) msg = "value of %s must be one or more of: %s. Got no match for: %s" % (param, choices_str, diff_list) if options_context: - msg += " found in %s" % " -> ".join(options_context) - errors.append(msg) + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) + errors.append(ArgumentValueError(msg)) elif parameters[param] not in choices: # PyYaml converts certain strings to bools. If we can unambiguously convert back, do so before checking # the value. If we can't figure this out, module author is responsible. 
@@ -740,23 +670,23 @@ def validate_argument_values(argument_spec, parameters, options_context=None, er choices_str = ", ".join([to_native(c) for c in choices]) msg = "value of %s must be one of: %s, got: %s" % (param, choices_str, parameters[param]) if options_context: - msg += " found in %s" % " -> ".join(options_context) - errors.append(msg) + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) + errors.append(ArgumentValueError(msg)) else: msg = "internal error: choices for argument %s are not iterable: %s" % (param, choices) if options_context: - msg += " found in %s" % " -> ".join(options_context) - errors.append(msg) + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) + errors.append(ArgumentTypeError(msg)) -def validate_sub_spec(argument_spec, parameters, prefix='', options_context=None, errors=None, no_log_values=None, unsupported_parameters=None): +def _validate_sub_spec(argument_spec, parameters, prefix='', options_context=None, errors=None, no_log_values=None, unsupported_parameters=None): """Validate sub argument spec. 
This function is recursive.""" if options_context is None: options_context = [] if errors is None: - errors = [] + errors = AnsibleValidationErrorMultiple() if no_log_values is None: no_log_values = set() @@ -766,11 +696,11 @@ def validate_sub_spec(argument_spec, parameters, prefix='', options_context=None for param, value in argument_spec.items(): wanted = value.get('type') - if wanted == 'dict' or (wanted == 'list' and value.get('elements', '') == dict): + if wanted == 'dict' or (wanted == 'list' and value.get('elements', '') == 'dict'): sub_spec = value.get('options') if value.get('apply_defaults', False): if sub_spec is not None: - if parameters.get(value) is None: + if parameters.get(param) is None: parameters[param] = {} else: continue @@ -788,7 +718,7 @@ def validate_sub_spec(argument_spec, parameters, prefix='', options_context=None for idx, sub_parameters in enumerate(elements): if not isinstance(sub_parameters, dict): - errors.append("value of '%s' must be of type dict or list of dicts" % param) + errors.append(SubParameterTypeError("value of '%s' must be of type dict or list of dicts" % param)) # Set prefix for warning messages new_prefix = prefix + param @@ -799,53 +729,159 @@ def validate_sub_spec(argument_spec, parameters, prefix='', options_context=None no_log_values.update(set_fallbacks(sub_spec, sub_parameters)) alias_warnings = [] + alias_deprecations = [] try: - options_aliases, legal_inputs = handle_aliases(sub_spec, sub_parameters, alias_warnings) + options_aliases = _handle_aliases(sub_spec, sub_parameters, alias_warnings, alias_deprecations) except (TypeError, ValueError) as e: options_aliases = {} - legal_inputs = None - errors.append(to_native(e)) + errors.append(AliasError(to_native(e))) for option, alias in alias_warnings: warn('Both option %s and its alias %s are set.' 
% (option, alias)) - no_log_values.update(list_no_log_values(sub_spec, sub_parameters)) + try: + no_log_values.update(_list_no_log_values(sub_spec, sub_parameters)) + except TypeError as te: + errors.append(NoLogError(to_native(te))) - if legal_inputs is None: - legal_inputs = list(options_aliases.keys()) + list(sub_spec.keys()) - unsupported_parameters.update(get_unsupported_parameters(sub_spec, sub_parameters, legal_inputs)) + legal_inputs = _get_legal_inputs(sub_spec, sub_parameters, options_aliases) + unsupported_parameters.update(_get_unsupported_parameters(sub_spec, sub_parameters, legal_inputs, options_context)) try: - check_mutually_exclusive(value.get('mutually_exclusive'), sub_parameters) + check_mutually_exclusive(value.get('mutually_exclusive'), sub_parameters, options_context) except TypeError as e: - errors.append(to_native(e)) + errors.append(MutuallyExclusiveError(to_native(e))) - no_log_values.update(set_defaults(sub_spec, sub_parameters, False)) + no_log_values.update(_set_defaults(sub_spec, sub_parameters, False)) try: - check_required_arguments(sub_spec, sub_parameters) + check_required_arguments(sub_spec, sub_parameters, options_context) except TypeError as e: - errors.append(to_native(e)) + errors.append(RequiredError(to_native(e))) - validate_argument_types(sub_spec, sub_parameters, new_prefix, options_context, errors=errors) - validate_argument_values(sub_spec, sub_parameters, options_context, errors=errors) + _validate_argument_types(sub_spec, sub_parameters, new_prefix, options_context, errors=errors) + _validate_argument_values(sub_spec, sub_parameters, options_context, errors=errors) - checks = [ - (check_required_together, 'required_together'), - (check_required_one_of, 'required_one_of'), - (check_required_if, 'required_if'), - (check_required_by, 'required_by'), - ] - - for check in checks: + for check in _ADDITIONAL_CHECKS: try: - check[0](value.get(check[1]), parameters) + check['func'](value.get(check['attr']), sub_parameters, 
options_context) except TypeError as e: - errors.append(to_native(e)) + errors.append(check['err'](to_native(e))) - no_log_values.update(set_defaults(sub_spec, sub_parameters)) + no_log_values.update(_set_defaults(sub_spec, sub_parameters)) # Handle nested specs - validate_sub_spec(sub_spec, sub_parameters, new_prefix, options_context, errors, no_log_values, unsupported_parameters) + _validate_sub_spec(sub_spec, sub_parameters, new_prefix, options_context, errors, no_log_values, unsupported_parameters) options_context.pop() + + +def env_fallback(*args, **kwargs): + """Load value from environment variable""" + + for arg in args: + if arg in os.environ: + return os.environ[arg] + raise AnsibleFallbackNotFound + + +def set_fallbacks(argument_spec, parameters): + no_log_values = set() + for param, value in argument_spec.items(): + fallback = value.get('fallback', (None,)) + fallback_strategy = fallback[0] + fallback_args = [] + fallback_kwargs = {} + if param not in parameters and fallback_strategy is not None: + for item in fallback[1:]: + if isinstance(item, dict): + fallback_kwargs = item + else: + fallback_args = item + try: + fallback_value = fallback_strategy(*fallback_args, **fallback_kwargs) + except AnsibleFallbackNotFound: + continue + else: + if value.get('no_log', False) and fallback_value: + no_log_values.add(fallback_value) + parameters[param] = fallback_value + + return no_log_values + + +def sanitize_keys(obj, no_log_strings, ignore_keys=frozenset()): + """ Sanitize the keys in a container object by removing no_log values from key names. + + This is a companion function to the `remove_values()` function. Similar to that function, + we make use of deferred_removals to avoid hitting maximum recursion depth in cases of + large data structures. + + :param obj: The container object to sanitize. Non-container objects are returned unmodified. + :param no_log_strings: A set of string values we do not want logged. 
+ :param ignore_keys: A set of string values of keys to not sanitize. + + :returns: An object with sanitized keys. + """ + + deferred_removals = deque() + + no_log_strings = [to_native(s, errors='surrogate_or_strict') for s in no_log_strings] + new_value = _sanitize_keys_conditions(obj, no_log_strings, ignore_keys, deferred_removals) + + while deferred_removals: + old_data, new_data = deferred_removals.popleft() + + if isinstance(new_data, Mapping): + for old_key, old_elem in old_data.items(): + if old_key in ignore_keys or old_key.startswith('_ansible'): + new_data[old_key] = _sanitize_keys_conditions(old_elem, no_log_strings, ignore_keys, deferred_removals) + else: + # Sanitize the old key. We take advantage of the sanitizing code in + # _remove_values_conditions() rather than recreating it here. + new_key = _remove_values_conditions(old_key, no_log_strings, None) + new_data[new_key] = _sanitize_keys_conditions(old_elem, no_log_strings, ignore_keys, deferred_removals) + else: + for elem in old_data: + new_elem = _sanitize_keys_conditions(elem, no_log_strings, ignore_keys, deferred_removals) + if isinstance(new_data, MutableSequence): + new_data.append(new_elem) + elif isinstance(new_data, MutableSet): + new_data.add(new_elem) + else: + raise TypeError('Unknown container type encountered when removing private values from keys') + + return new_value + + +def remove_values(value, no_log_strings): + """ Remove strings in no_log_strings from value. If value is a container + type, then remove a lot more. + + Use of deferred_removals exists, rather than a pure recursive solution, + because of the potential to hit the maximum recursion depth when dealing with + large amounts of data (see issue #24560). 
+ """ + + deferred_removals = deque() + + no_log_strings = [to_native(s, errors='surrogate_or_strict') for s in no_log_strings] + new_value = _remove_values_conditions(value, no_log_strings, deferred_removals) + + while deferred_removals: + old_data, new_data = deferred_removals.popleft() + if isinstance(new_data, Mapping): + for old_key, old_elem in old_data.items(): + new_elem = _remove_values_conditions(old_elem, no_log_strings, deferred_removals) + new_data[old_key] = new_elem + else: + for elem in old_data: + new_elem = _remove_values_conditions(elem, no_log_strings, deferred_removals) + if isinstance(new_data, MutableSequence): + new_data.append(new_elem) + elif isinstance(new_data, MutableSet): + new_data.add(new_elem) + else: + raise TypeError('Unknown container type encountered when removing private values from output') + + return new_value diff --git a/lib/ansible/module_utils/common/validation.py b/lib/ansible/module_utils/common/validation.py index df40905987..d8c74e0232 100644 --- a/lib/ansible/module_utils/common/validation.py +++ b/lib/ansible/module_utils/common/validation.py @@ -39,7 +39,35 @@ def count_terms(terms, parameters): return len(set(terms).intersection(parameters)) -def check_mutually_exclusive(terms, parameters): +def safe_eval(value, locals=None, include_exceptions=False): + # do not allow method calls to modules + if not isinstance(value, string_types): + # already templated to a datavaluestructure, perhaps? 
+ if include_exceptions: + return (value, None) + return value + if re.search(r'\w\.\w+\(', value): + if include_exceptions: + return (value, None) + return value + # do not allow imports + if re.search(r'import \w+', value): + if include_exceptions: + return (value, None) + return value + try: + result = literal_eval(value) + if include_exceptions: + return (result, None) + else: + return result + except Exception as e: + if include_exceptions: + return (value, e) + return value + + +def check_mutually_exclusive(terms, parameters, options_context=None): """Check mutually exclusive terms against argument parameters Accepts a single list or list of lists that are groups of terms that should be @@ -63,12 +91,14 @@ def check_mutually_exclusive(terms, parameters): if results: full_list = ['|'.join(check) for check in results] msg = "parameters are mutually exclusive: %s" % ', '.join(full_list) + if options_context: + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) raise TypeError(to_native(msg)) return results -def check_required_one_of(terms, parameters): +def check_required_one_of(terms, parameters, options_context=None): """Check each list of terms to ensure at least one exists in the given module parameters @@ -93,12 +123,14 @@ def check_required_one_of(terms, parameters): if results: for term in results: msg = "one of the following is required: %s" % ', '.join(term) + if options_context: + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) raise TypeError(to_native(msg)) return results -def check_required_together(terms, parameters): +def check_required_together(terms, parameters, options_context=None): """Check each list of terms to ensure every parameter in each list exists in the given parameters @@ -125,12 +157,14 @@ def check_required_together(terms, parameters): if results: for term in results: msg = "parameters are required together: %s" % ', '.join(term) + if options_context: + msg = "{0} found in {1}".format(msg, " -> 
".join(options_context)) raise TypeError(to_native(msg)) return results -def check_required_by(requirements, parameters): +def check_required_by(requirements, parameters, options_context=None): """For each key in requirements, check the corresponding list to see if they exist in parameters @@ -161,12 +195,14 @@ def check_required_by(requirements, parameters): for key, missing in result.items(): if len(missing) > 0: msg = "missing parameter(s) required by '%s': %s" % (key, ', '.join(missing)) + if options_context: + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) raise TypeError(to_native(msg)) return result -def check_required_arguments(argument_spec, parameters): +def check_required_arguments(argument_spec, parameters, options_context=None): """Check all paramaters in argument_spec and return a list of parameters that are required but not present in parameters @@ -190,12 +226,14 @@ def check_required_arguments(argument_spec, parameters): if missing: msg = "missing required arguments: %s" % ", ".join(sorted(missing)) + if options_context: + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) raise TypeError(to_native(msg)) return missing -def check_required_if(requirements, parameters): +def check_required_if(requirements, parameters, options_context=None): """Check parameters that are conditionally required Raises TypeError if the check fails @@ -272,6 +310,8 @@ def check_required_if(requirements, parameters): for missing in results: msg = "%s is %s but %s of the following are missing: %s" % ( missing['parameter'], missing['value'], missing['requires'], ', '.join(missing['missing'])) + if options_context: + msg = "{0} found in {1}".format(msg, " -> ".join(options_context)) raise TypeError(to_native(msg)) return results @@ -304,34 +344,6 @@ def check_missing_parameters(parameters, required_parameters=None): return missing_params -def safe_eval(value, locals=None, include_exceptions=False): - # do not allow method calls to modules - 
if not isinstance(value, string_types): - # already templated to a datavaluestructure, perhaps? - if include_exceptions: - return (value, None) - return value - if re.search(r'\w\.\w+\(', value): - if include_exceptions: - return (value, None) - return value - # do not allow imports - if re.search(r'import \w+', value): - if include_exceptions: - return (value, None) - return value - try: - result = literal_eval(value) - if include_exceptions: - return (result, None) - else: - return result - except Exception as e: - if include_exceptions: - return (value, e) - return value - - - # FIXME: The param and prefix parameters here are coming from AnsibleModule._check_type_string() # which is using those for the warning messaged based on string conversion warning settings. # Not sure how to deal with that here since we don't have config state to query.
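Both `remove_values()` and `sanitize_keys()` moved in this patch rely on a `deferred_removals` deque rather than direct recursion, so very large nested results cannot hit Python's recursion limit (see issue #24560 cited in the docstring). The idea, reduced to a censoring walk over plain dicts and lists (a simplified sketch, not the module's implementation — the real code also handles sets, tuples, and byte strings):

```python
from collections import deque

def censor(value, secrets):
    """Iteratively replace secret strings in a nested structure of
    dicts, lists, and strings.

    Containers are shallow-copied and queued on a deque instead of
    recursed into, so arbitrarily deep data cannot exhaust the call stack.
    """
    def convert(item, queue):
        if isinstance(item, str):
            for secret in secrets:
                item = item.replace(secret, "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER")
            return item
        if isinstance(item, dict):
            new = {}
            queue.append((item, new))  # fill in later, breadth-first
            return new
        if isinstance(item, list):
            new = []
            queue.append((item, new))
            return new
        return item  # ints, floats, None, ... pass through unchanged

    queue = deque()
    result = convert(value, queue)
    while queue:
        old, new = queue.popleft()
        if isinstance(new, dict):
            for key, elem in old.items():
                new[key] = convert(elem, queue)
        else:
            for elem in old:
                new.append(convert(elem, queue))
    return result
```

For example, `censor({"user": "bob", "token": "s3cret"}, {"s3cret"})` masks only the token value, and a list nested thousands of levels deep is processed without a `RecursionError`.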
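The commit message notes that iterating over check functions now uses a dict per check, "referencing by key names makes things a bit more readable" — the diff calls `check['func'](value.get(check['attr']), sub_parameters, options_context)` and wraps failures with `check['err'](...)`. The shape of that registry, with stand-in check functions and error classes (illustrative only; the real module uses its `check_required_*` functions and `AnsibleValidationError` subclasses):

```python
class RequiredTogetherError(Exception):
    pass

class RequiredOneOfError(Exception):
    pass

def check_required_together(terms, parameters):
    """Fail if some but not all members of a group are present."""
    for group in terms or []:
        present = [t for t in group if t in parameters]
        if present and len(present) != len(group):
            raise TypeError("parameters are required together: %s" % ", ".join(group))

def check_required_one_of(terms, parameters):
    """Fail if no member of a group is present."""
    for group in terms or []:
        if not any(t in parameters for t in group):
            raise TypeError("one of the following is required: %s" % ", ".join(group))

# Each entry names the check function ('func'), the spec attribute that
# configures it ('attr'), and the error class that wraps a failure ('err').
_ADDITIONAL_CHECKS = (
    {'func': check_required_together, 'attr': 'required_together', 'err': RequiredTogetherError},
    {'func': check_required_one_of, 'attr': 'required_one_of', 'err': RequiredOneOfError},
)

def run_checks(spec, parameters):
    """Run every registered check, collecting wrapped errors."""
    errors = []
    for check in _ADDITIONAL_CHECKS:
        try:
            check['func'](spec.get(check['attr']), parameters)
        except TypeError as e:
            errors.append(check['err'](str(e)))
    return errors
```

Collecting typed errors instead of raising immediately is what lets the new `AnsibleValidationErrorMultiple` report every problem in one pass.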
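The comment in `_validate_argument_values()` ("PyYaml converts certain strings to bools. If we can unambiguously convert back, do so before checking the value.") describes a round-trip: YAML turns a bare `yes`/`true` into a Python `bool` before the module sees it, so when `choices` are strings the check tries to map the bool back before rejecting it. A sketch of that logic under simplifying assumptions (lowercase string choices only; the real code also handles the reverse direction and gives the module author responsibility for ambiguous cases):

```python
def match_choice(value, choices):
    """Return the choice that `value` matches, undoing PyYAML's implicit
    bool conversion when the mapping back is unambiguous, else None."""
    if value in choices:
        return value
    if isinstance(value, bool):
        # YAML resolved a bare yes/no/true/false token into a bool. If
        # exactly one string spelling of that bool is a valid choice,
        # treat the value as that spelling.
        spellings = {'yes', 'on', 'true', '1'} if value else {'no', 'off', 'false', '0'}
        matches = spellings.intersection(
            c for c in choices if isinstance(c, str)
        )
        if len(matches) == 1:
            return matches.pop()
    return None  # no match, or ambiguous (e.g. both 'yes' and 'true' allowed)
```

So a play that writes `mode: yes` still validates against `choices: ['yes', 'no']`, while `choices: ['true', 'yes']` is ambiguous and fails.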