54 files changed, 2510 insertions, 0 deletions
@@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + +TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + +1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. 
For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + +2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + +3. Grant of Patent License. 
Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + +4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the 
following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + +5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + +6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + +7. Disclaimer of Warranty. 
Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + +8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + +9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. 
+
+END OF TERMS AND CONDITIONS
+
+APPENDIX: How to apply the Apache License to your work.
+
+   To apply the Apache License to your work, attach the following
+   boilerplate notice, with the fields enclosed by brackets "[]"
+   replaced with your own identifying information. (Don't include
+   the brackets!)  The text should be enclosed in the appropriate
+   comment syntax for the file format. We also recommend that a
+   file or class name and description of purpose be included on the
+   same "printed page" as the copyright notice for easier
+   identification within third-party archives.
+
+Copyright [yyyy] [name of copyright owner]
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
diff --git a/README.rdoc b/README.rdoc
new file mode 100644
index 0000000..4456659
--- /dev/null
+++ b/README.rdoc
@@ -0,0 +1,10 @@
+== Chef Zero
+
+= DESCRIPTION:
+
+chef-zero is a simple, easy-install, in-memory Chef server that can be useful
+for Chef client testing and chef-solo-like tasks that require a full Chef
+server. It IS intended to be simple, fully Chef 11 compliant, easy to
+run and fast to start. It is NOT intended to be highly scalable, performant
+or persistent.
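One behavior introduced by this commit that is easy to miss in the diff: the DataNormalizer coerces bare run-list entries into fully-qualified `recipe[...]` form, passes explicit `recipe[...]` and `role[...]` entries through untouched, and drops duplicates. A standalone sketch of that logic, extracted here for illustration outside the DataNormalizer class:

```ruby
# Sketch of chef-zero's run-list normalization: bare names are assumed
# to be recipes, fully-qualified entries pass through, duplicates drop.
def normalize_run_list(run_list)
  run_list.map { |item|
    case item
    when /^recipe\[.*\]$/ then item    # already an explicit recipe
    when /^role\[.*\]$/   then item    # already an explicit role
    else "recipe[#{item}]"             # bare name: wrap it as a recipe
    end
  }.uniq
end

p normalize_run_list(['apache2', 'recipe[mysql]', 'role[web]', 'apache2'])
# => ["recipe[apache2]", "recipe[mysql]", "role[web]"]
```

Note that `uniq` runs after the mapping, so a bare `apache2` and an explicit `recipe[apache2]` collapse into one entry.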
+
diff --git a/Rakefile b/Rakefile
new file mode 100644
index 0000000..0adbc31
--- /dev/null
+++ b/Rakefile
@@ -0,0 +1,15 @@
+require 'bundler'
+require 'rubygems'
+require 'rubygems/package_task'
+require 'rdoc/task'
+
+Bundler::GemHelper.install_tasks
+
+gem_spec = eval(File.read("chef-zero.gemspec"))
+
+RDoc::Task.new do |rdoc|
+  rdoc.rdoc_dir = 'rdoc'
+  rdoc.title = "chef-zero #{gem_spec.version}"
+  rdoc.rdoc_files.include('README*')
+  rdoc.rdoc_files.include('lib/**/*.rb')
+end
diff --git a/bin/chef-zero b/bin/chef-zero
new file mode 100755
index 0000000..590d670
--- /dev/null
+++ b/bin/chef-zero
@@ -0,0 +1,8 @@
+#!/usr/bin/env ruby
+
+require 'rubygems'
+$:.unshift(File.expand_path(File.join(File.dirname(__FILE__), "..", "lib")))
+require 'chef_zero/server'
+
+server = ChefZero::Server.new(:Port => 8889)
+server.start
diff --git a/chef-zero.gemspec b/chef-zero.gemspec
new file mode 100644
index 0000000..5c316aa
--- /dev/null
+++ b/chef-zero.gemspec
@@ -0,0 +1,23 @@
+$:.unshift(File.dirname(__FILE__) + '/lib')
+require 'chef_zero/version'
+
+Gem::Specification.new do |s|
+  s.name = "chef-zero"
+  s.version = ChefZero::VERSION
+  s.platform = Gem::Platform::RUBY
+  s.has_rdoc = true
+  s.extra_rdoc_files = ["README.rdoc", "LICENSE"]
+  s.summary = "Self-contained, easy-setup, fast-start in-memory Chef server for testing and solo setup purposes"
+  s.description = s.summary
+  s.author = "John Keiser"
+  s.email = "jkeiser@opscode.com"
+  s.homepage = "http://www.opscode.com"
+
+  s.add_dependency 'chef' # For version, version constraint and deep merge
+
+  s.bindir = "bin"
+  s.executables = %w( chef-zero )
+  s.require_path = 'lib'
+  s.files = %w(LICENSE README.rdoc Rakefile) + Dir.glob("{lib,spec}/**/*")
+end
+
diff --git a/lib/chef_zero.rb b/lib/chef_zero.rb
new file mode 100644
index 0000000..2e91115
--- /dev/null
+++ b/lib/chef_zero.rb
@@ -0,0 +1,5 @@
+module ChefZero
+  CERTIFICATE = "-----BEGIN
CERTIFICATE-----\nMIIDMzCCApygAwIBAgIBATANBgkqhkiG9w0BAQUFADCBnjELMAkGA1UEBhMCVVMx\nEzARBgNVBAgMCldhc2hpbmd0b24xEDAOBgNVBAcMB1NlYXR0bGUxFjAUBgNVBAoM\nDU9wc2NvZGUsIEluYy4xHDAaBgNVBAsME0NlcnRpZmljYXRlIFNlcnZpY2UxMjAw\nBgNVBAMMKW9wc2NvZGUuY29tL2VtYWlsQWRkcmVzcz1hdXRoQG9wc2NvZGUuY29t\nMB4XDTEyMTEyMTAwMzQyMVoXDTIyMTExOTAwMzQyMVowgZsxEDAOBgNVBAcTB1Nl\nYXR0bGUxEzARBgNVBAgTCldhc2hpbmd0b24xCzAJBgNVBAYTAlVTMRwwGgYDVQQL\nExNDZXJ0aWZpY2F0ZSBTZXJ2aWNlMRYwFAYDVQQKEw1PcHNjb2RlLCBJbmMuMS8w\nLQYDVQQDFCZVUkk6aHR0cDovL29wc2NvZGUuY29tL0dVSURTL3VzZXJfZ3VpZDCC\nASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBANLDmPbR71bS2esZlZh/HfC6\n0azXFjl2677wq2ovk9xrUb0Ui4ZLC66TqQ9C/RBzOjXU4TRf3hgPTqvlCgHusl0d\nIcLCrsSl6kPEhJpYWWfRoroIAwf82A9yLQekhqXZEXu5EKkwoUMqyF6m0ZCasaE1\ny8niQxdLAsk3ady/CGQlFqHTPKFfU5UASR2LRtYC1MCIvJHDFRKAp9kPJbQo9P37\nZ8IU7cDudkZFgNLmDixlWsh7C0ghX8fgAlj1P6FgsFufygam973k79GhIP54dELB\nc0S6E8ekkRSOXU9jX/IoiXuFglBvFihAdhvED58bMXzj2AwXUyeAlxItnvs+NVUC\nAwEAATANBgkqhkiG9w0BAQUFAAOBgQBkFZRbMoywK3hb0/X7MXmPYa7nlfnd5UXq\nr2n32ettzZNmEPaI2d1j+//nL5qqhOlrWPS88eKEPnBOX/jZpUWOuAAddnrvFzgw\nrp/C2H7oMT+29F+5ezeViLKbzoFYb4yECHBoi66IFXNae13yj7taMboBeUmE664G\nTB/MZpRr8g==\n-----END CERTIFICATE-----\n" + PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----\nMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA0sOY9tHvVtLZ6xmVmH8d\n8LrRrNcWOXbrvvCrai+T3GtRvRSLhksLrpOpD0L9EHM6NdThNF/eGA9Oq+UKAe6y\nXR0hwsKuxKXqQ8SEmlhZZ9GiuggDB/zYD3ItB6SGpdkRe7kQqTChQyrIXqbRkJqx\noTXLyeJDF0sCyTdp3L8IZCUWodM8oV9TlQBJHYtG1gLUwIi8kcMVEoCn2Q8ltCj0\n/ftnwhTtwO52RkWA0uYOLGVayHsLSCFfx+ACWPU/oWCwW5/KBqb3veTv0aEg/nh0\nQsFzRLoTx6SRFI5dT2Nf8iiJe4WCUG8WKEB2G8QPnxsxfOPYDBdTJ4CXEi2e+z41\nVQIDAQAB\n-----END PUBLIC KEY-----\n" + PRIVATE_KEY = "-----BEGIN RSA PRIVATE 
KEY-----\nMIIEpAIBAAKCAQEA0sOY9tHvVtLZ6xmVmH8d8LrRrNcWOXbrvvCrai+T3GtRvRSL\nhksLrpOpD0L9EHM6NdThNF/eGA9Oq+UKAe6yXR0hwsKuxKXqQ8SEmlhZZ9GiuggD\nB/zYD3ItB6SGpdkRe7kQqTChQyrIXqbRkJqxoTXLyeJDF0sCyTdp3L8IZCUWodM8\noV9TlQBJHYtG1gLUwIi8kcMVEoCn2Q8ltCj0/ftnwhTtwO52RkWA0uYOLGVayHsL\nSCFfx+ACWPU/oWCwW5/KBqb3veTv0aEg/nh0QsFzRLoTx6SRFI5dT2Nf8iiJe4WC\nUG8WKEB2G8QPnxsxfOPYDBdTJ4CXEi2e+z41VQIDAQABAoIBAALhqbW2KQ+G0nPk\nZacwFbi01SkHx8YBWjfCEpXhEKRy0ytCnKW5YO+CFU2gHNWcva7+uhV9OgwaKXkw\nKHLeUJH1VADVqI4Htqw2g5mYm6BPvWnNsjzpuAp+BR+VoEGkNhj67r9hatMAQr0I\nitTvSH5rvd2EumYXIHKfz1K1SegUk1u1EL1RcMzRmZe4gDb6eNBs9Sg4im4ybTG6\npPIytA8vBQVWhjuAR2Tm+wZHiy0Az6Vu7c2mS07FSX6FO4E8SxWf8idaK9ijMGSq\nFvIS04mrY6XCPUPUC4qm1qNnhDPpOr7CpI2OO98SqGanStS5NFlSFXeXPpM280/u\nfZUA0AECgYEA+x7QUnffDrt7LK2cX6wbvn4mRnFxet7bJjrfWIHf+Rm0URikaNma\nh0/wNKpKBwIH+eHK/LslgzcplrqPytGGHLOG97Gyo5tGAzyLHUWBmsNkRksY2sPL\nuHq6pYWJNkqhnWGnIbmqCr0EWih82x/y4qxbJYpYqXMrit0wVf7yAgkCgYEA1twI\ngFaXqesetTPoEHSQSgC8S4D5/NkdriUXCYb06REcvo9IpFMuiOkVUYNN5d3MDNTP\nIdBicfmvfNELvBtXDomEUD8ls1UuoTIXRNGZ0VsZXu7OErXCK0JKNNyqRmOwcvYL\nJRqLfnlei5Ndo1lu286yL74c5rdTLs/nI2p4e+0CgYB079ZmcLeILrmfBoFI8+Y/\ngJLmPrFvXBOE6+lRV7kqUFPtZ6I3yQzyccETZTDvrnx0WjaiFavUPH27WMjY01S2\nTMtO0Iq1MPsbSrglO1as8MvjB9ldFcvp7gy4Q0Sv6XT0yqJ/S+vo8Df0m+H4UBpU\nf5o6EwBSd/UQxwtZIE0lsQKBgQCswfjX8Eg8KL/lJNpIOOE3j4XXE9ptksmJl2sB\njxDnQYoiMqVO808saHVquC/vTrpd6tKtNpehWwjeTFuqITWLi8jmmQ+gNTKsC9Gn\n1Pxf2Gb67PqnEpwQGln+TRtgQ5HBrdHiQIi+5am+gnw89pDrjjO5rZwhanAo6KPJ\n1zcPNQKBgQDxFu8v4frDmRNCVaZS4f1B6wTrcMrnibIDlnzrK9GG6Hz1U7dDv8s8\nNf4UmeMzDXjlPWZVOvS5+9HKJPdPj7/onv8B2m18+lcgTTDJBkza7R1mjL1Cje/Z\nKcVGsryKN6cjE7yCDasnA7R2rVBV/7NWeJV77bmzT5O//rW4yIfUIg==\n-----END RSA PRIVATE KEY-----\n" +end diff --git a/lib/chef_zero/data_normalizer.rb b/lib/chef_zero/data_normalizer.rb new file mode 100644 index 0000000..1dfa618 --- /dev/null +++ b/lib/chef_zero/data_normalizer.rb @@ -0,0 +1,129 @@ +require 'chef_zero' +require 'chef_zero/rest_base' + +module ChefZero + class DataNormalizer + def 
self.normalize_client(client, name)
+      client['name'] ||= name
+      client['admin'] ||= false
+      client['public_key'] ||= PUBLIC_KEY
+      client['validator'] ||= false
+      client['json_class'] ||= "Chef::ApiClient"
+      client['chef_type'] ||= "client"
+      client
+    end
+
+    def self.normalize_user(user, name)
+      user['name'] ||= name
+      user['admin'] ||= false
+      user['public_key'] ||= PUBLIC_KEY
+      user
+    end
+
+    def self.normalize_data_bag_item(data_bag_item, data_bag_name, id, method)
+      if method == 'DELETE'
+        # TODO SERIOUSLY, WHO DOES THIS MANY EXCEPTIONS IN THEIR INTERFACE
+        if !(data_bag_item['json_class'] == 'Chef::DataBagItem' && data_bag_item['raw_data'])
+          data_bag_item['id'] ||= id
+          data_bag_item = { 'raw_data' => data_bag_item }
+          data_bag_item['chef_type'] ||= 'data_bag_item'
+          data_bag_item['json_class'] ||= 'Chef::DataBagItem'
+          data_bag_item['data_bag'] ||= data_bag_name
+          data_bag_item['name'] ||= "data_bag_item_#{data_bag_name}_#{id}"
+        end
+      else
+        # If it's not already wrapped with raw_data, wrap it.
+        if data_bag_item['json_class'] == 'Chef::DataBagItem' && data_bag_item['raw_data']
+          data_bag_item = data_bag_item['raw_data']
+        end
+        # Argh. We don't do this on GET, but we do on PUT and POST????
+        if %w(PUT POST).include?(method)
+          data_bag_item['chef_type'] ||= 'data_bag_item'
+          data_bag_item['data_bag'] ||= data_bag_name
+        end
+        data_bag_item['id'] ||= id
+      end
+      data_bag_item
+    end
+
+    def self.normalize_environment(environment, name)
+      environment['name'] ||= name
+      environment['description'] ||= ''
+      environment['cookbook_versions'] ||= {}
+      environment['json_class'] ||= "Chef::Environment"
+      environment['chef_type'] ||= "environment"
+      environment['default_attributes'] ||= {}
+      environment['override_attributes'] ||= {}
+      environment
+    end
+
+    def self.normalize_cookbook(cookbook, name, version, base_uri, method)
+      # TODO I feel dirty
+      if method != 'PUT'
+        cookbook.each_pair do |key, value|
+          if value.is_a?(Array)
+            value.each do |file|
+              if file.is_a?(Hash) && file.has_key?('checksum')
+                file['url'] ||= RestBase::build_uri(base_uri, ['file_store', file['checksum']])
+              end
+            end
+          end
+        end
+      end
+      cookbook['name'] ||= "#{name}-#{version}"
+      # TODO this feels wrong, but the real chef server doesn't expand this default
+#      cookbook['version'] ||= version
+      cookbook['cookbook_name'] ||= name
+      cookbook['json_class'] ||= 'Chef::CookbookVersion'
+      cookbook['chef_type'] ||= 'cookbook_version'
+      cookbook['frozen?'] ||= false
+      cookbook['metadata'] ||= {}
+      cookbook['metadata']['version'] ||= version
+      cookbook['metadata']['name'] ||= name
+      cookbook
+    end
+
+    def self.normalize_node(node, name)
+      node['name'] ||= name
+      node['json_class'] ||= 'Chef::Node'
+      node['chef_type'] ||= 'node'
+      node['chef_environment'] ||= '_default'
+      node['override'] ||= {}
+      node['normal'] ||= {}
+      node['default'] ||= {}
+      node['automatic'] ||= {}
+      node['run_list'] ||= []
+      node['run_list'] = normalize_run_list(node['run_list'])
+      node
+    end
+
+    def self.normalize_role(role, name)
+      role['name'] ||= name
+      role['description'] ||= ''
+      role['json_class'] ||= 'Chef::Role'
+      role['chef_type'] ||= 'role'
+      role['default_attributes'] ||= {}
+      role['override_attributes'] ||= {}
+      role['run_list'] ||= []
+      role['run_list'] = normalize_run_list(role['run_list'])
+      role['env_run_lists'] ||= {}
+      role['env_run_lists'].each_pair do |env, run_list|
+        role['env_run_lists'][env] = normalize_run_list(run_list)
+      end
+      role
+    end
+
+    def self.normalize_run_list(run_list)
+      run_list.map{|item|
+        case item
+        when /^recipe\[.*\]$/
+          item # explicit recipe
+        when /^role\[.*\]$/
+          item # explicit role
+        else
+          "recipe[#{item}]"
+        end
+      }.uniq
+    end
+  end
+end
diff --git a/lib/chef_zero/endpoints/actor_endpoint.rb b/lib/chef_zero/endpoints/actor_endpoint.rb
new file mode 100644
index 0000000..467422b
--- /dev/null
+++ b/lib/chef_zero/endpoints/actor_endpoint.rb
@@ -0,0 +1,68 @@
+require 'json'
+require 'chef_zero/endpoints/rest_object_endpoint'
+require 'chef_zero/data_normalizer'
+
+module ChefZero
+  module Endpoints
+    # /clients/* and /users/*
+    class ActorEndpoint < RestObjectEndpoint
+      def put(request)
+        # Find out if we're updating the public key.
+        request_body = JSON.parse(request.body, :create_additions => false)
+        if request_body['public_key'].nil?
+          # If public_key is null, then don't overwrite it. Weird patchiness.
+          body_modified = true
+          request_body.delete('public_key')
+        else
+          updating_public_key = true
+        end
+
+        # Generate private_key if requested.
+        if request_body.has_key?('private_key')
+          body_modified = true
+          if request_body['private_key']
+            private_key, public_key = server.gen_key_pair
+            updating_public_key = true
+            request_body['public_key'] = public_key
+          end
+          request_body.delete('private_key')
+        end
+
+        # Save request
+        request.body = JSON.pretty_generate(request_body) if body_modified
+
+        # PUT /clients is patchy
+        request.body = patch_request_body(request)
+
+        result = super(request)
+
+        # Inject private_key into response, delete public_key/password if applicable
+        if result[0] == 200
+          response = JSON.parse(result[2], :create_additions => false)
+          response['private_key'] = private_key if private_key
+          response.delete('public_key') if !updating_public_key && request.rest_path[0] == 'users'
+          response.delete('password')
+          # For PUT /clients, a rename returns 201.
+          if request_body['name'] && request.rest_path[1] != request_body['name']
+            json_response(201, response)
+          else
+            json_response(200, response)
+          end
+        else
+          result
+        end
+      end
+
+      def populate_defaults(request, response_json)
+        response = JSON.parse(response_json, :create_additions => false)
+        if request.rest_path[0] == 'clients'
+          response = DataNormalizer.normalize_client(response, request.rest_path[1])
+        else
+          response = DataNormalizer.normalize_user(response, request.rest_path[1])
+        end
+        JSON.pretty_generate(response)
+      end
+    end
+  end
+end
+
diff --git a/lib/chef_zero/endpoints/actors_endpoint.rb b/lib/chef_zero/endpoints/actors_endpoint.rb
new file mode 100644
index 0000000..52908d2
--- /dev/null
+++ b/lib/chef_zero/endpoints/actors_endpoint.rb
@@ -0,0 +1,32 @@
+require 'json'
+require 'chef_zero/endpoints/rest_list_endpoint'
+
+module ChefZero
+  module Endpoints
+    # /clients or /users
+    class ActorsEndpoint < RestListEndpoint
+      def post(request)
+        # First, find out if the user actually posted a public key. If not, make
+        # one.
+ request_body = JSON.parse(request.body, :create_additions => false) + public_key = request_body['public_key'] + if !public_key + private_key, public_key = server.gen_key_pair + request_body['public_key'] = public_key + request.body = JSON.pretty_generate(request_body) + end + + result = super(request) + if result[0] == 201 + # If we generated a key, stuff it in the response. + response = JSON.parse(result[2], :create_additions => false) + response['private_key'] = private_key if private_key + response['public_key'] = public_key + json_response(201, response) + else + result + end + end + end + end +end diff --git a/lib/chef_zero/endpoints/authenticate_user_endpoint.rb b/lib/chef_zero/endpoints/authenticate_user_endpoint.rb new file mode 100644 index 0000000..ce044c7 --- /dev/null +++ b/lib/chef_zero/endpoints/authenticate_user_endpoint.rb @@ -0,0 +1,21 @@ +require 'json' +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /authenticate_user + class AuthenticateUserEndpoint < RestBase + def post(request) + request_json = JSON.parse(request.body, :create_additions => false) + name = request_json['name'] + password = request_json['password'] + user = data['users'][name] + verified = user && JSON.parse(user, :create_additions => false)['password'] == password + json_response(200, { + 'name' => name, + 'verified' => !!verified + }) + end + end + end +end diff --git a/lib/chef_zero/endpoints/cookbook_endpoint.rb b/lib/chef_zero/endpoints/cookbook_endpoint.rb new file mode 100644 index 0000000..9334af7 --- /dev/null +++ b/lib/chef_zero/endpoints/cookbook_endpoint.rb @@ -0,0 +1,39 @@ +require 'chef_zero/endpoints/cookbooks_base' + +module ChefZero + module Endpoints + # /cookbooks/NAME + class CookbookEndpoint < CookbooksBase + def get(request) + filter = request.rest_path[1] + case filter + when '_latest' + result = {} + filter_cookbooks(data['cookbooks'], {}, 1) do |name, versions| + if versions.size > 0 + result[name] = build_uri(request.base_uri, 
['cookbooks', name, versions[0]]) + end + end + json_response(200, result) + when '_recipes' + result = [] + filter_cookbooks(data['cookbooks'], {}, 1) do |name, versions| + if versions.size > 0 + cookbook = JSON.parse(data['cookbooks'][name][versions[0]], :create_additions => false) + result += recipe_names(name, cookbook) + end + end + json_response(200, result.sort) + else + cookbook_list = { filter => get_data(request, request.rest_path) } + json_response(200, format_cookbooks_list(request, cookbook_list)) + end + end + + def latest_version(versions) + sorted = versions.sort_by { |version| Chef::Version.new(version) } + sorted[-1] + end + end + end +end diff --git a/lib/chef_zero/endpoints/cookbook_version_endpoint.rb b/lib/chef_zero/endpoints/cookbook_version_endpoint.rb new file mode 100644 index 0000000..1ff74e8 --- /dev/null +++ b/lib/chef_zero/endpoints/cookbook_version_endpoint.rb @@ -0,0 +1,102 @@ +require 'json' +require 'chef_zero/endpoints/rest_object_endpoint' +require 'chef_zero/rest_error_response' +require 'chef_zero/data_normalizer' + +module ChefZero + module Endpoints + # /cookbooks/NAME/VERSION + class CookbookVersionEndpoint < RestObjectEndpoint + def get(request) + if request.rest_path[2] == "_latest" + request.rest_path[2] = latest_version(get_data(request, request.rest_path[0..1]).keys) + end + super(request) + end + + def put(request) + name = request.rest_path[1] + version = request.rest_path[2] + data['cookbooks'][name] = {} if !data['cookbooks'][name] + existing_cookbook = data['cookbooks'][name][version] + + # Honor frozen + if existing_cookbook + existing_cookbook_json = JSON.parse(existing_cookbook, :create_additions => false) + if existing_cookbook_json['frozen?'] + if request.query_params['force'] != "true" + raise RestErrorResponse.new(409, "The cookbook #{name} at version #{version} is frozen. Use the 'force' option to override.") + end + # For some reason, you are forever unable to modify "frozen?" on a frozen cookbook. 
+ request_body = JSON.parse(request.body, :create_additions => false) + if !request_body['frozen?'] + request_body['frozen?'] = true + request.body = JSON.pretty_generate(request_body) + end + end + end + + # Set the cookbook + data['cookbooks'][name][version] = request.body + + # If the cookbook was updated, check for deleted files and clean them up + if existing_cookbook + missing_checksums = get_checksums(existing_cookbook) - get_checksums(request.body) + if missing_checksums.size > 0 + hoover_unused_checksums(missing_checksums) + end + end + + already_json_response(existing_cookbook ? 200 : 201, populate_defaults(request, data['cookbooks'][name][version])) + end + + def delete(request) + deleted_cookbook = get_data(request, request.rest_path) + response = super(request) + cookbook_name = request.rest_path[1] + data['cookbooks'].delete(cookbook_name) if data['cookbooks'][cookbook_name].size == 0 + + # Hoover deleted files, if they exist + hoover_unused_checksums(get_checksums(deleted_cookbook)) + response + end + + def get_checksums(cookbook) + result = [] + JSON.parse(cookbook, :create_additions => false).each_pair do |key, value| + if value.is_a?(Array) + value.each do |file| + if file.is_a?(Hash) && file.has_key?('checksum') + result << file['checksum'] + end + end + end + end + result + end + + def hoover_unused_checksums(deleted_checksums) + data['cookbooks'].each_pair do |cookbook_name, versions| + versions.each_pair do |cookbook_version, cookbook| + deleted_checksums = deleted_checksums - get_checksums(cookbook) + end + end + deleted_checksums.each do |checksum| + data['file_store'].delete(checksum) + end + end + + def populate_defaults(request, response_json) + # Inject URIs into each cookbook file + cookbook = JSON.parse(response_json, :create_additions => false) + cookbook = DataNormalizer.normalize_cookbook(cookbook, request.rest_path[1], request.rest_path[2], request.base_uri, request.method) + JSON.pretty_generate(cookbook) + end + + def 
latest_version(versions) + sorted = versions.sort_by { |version| Chef::Version.new(version) } + sorted[-1] + end + end + end +end diff --git a/lib/chef_zero/endpoints/cookbooks_base.rb b/lib/chef_zero/endpoints/cookbooks_base.rb new file mode 100644 index 0000000..53f7945 --- /dev/null +++ b/lib/chef_zero/endpoints/cookbooks_base.rb @@ -0,0 +1,59 @@ +require 'json' +require 'chef/exceptions' # Needed so Chef::Version/VersionConstraint load +require 'chef/version_class' +require 'chef/version_constraint' +require 'chef_zero/rest_base' +require 'chef_zero/data_normalizer' + +module ChefZero + module Endpoints + # Common code for endpoints that return cookbook lists + class CookbooksBase < RestBase + def format_cookbooks_list(request, cookbooks_list, constraints = {}, num_versions = nil) + results = {} + filter_cookbooks(cookbooks_list, constraints, num_versions) do |name, versions| + versions_list = versions.map do |version| + { + 'url' => build_uri(request.base_uri, ['cookbooks', name, version]), + 'version' => version + } + end + results[name] = { + 'url' => build_uri(request.base_uri, ['cookbooks', name]), + 'versions' => versions_list + } + end + results + end + + def filter_cookbooks(cookbooks_list, constraints = {}, num_versions = nil) + cookbooks_list.keys.sort.each do |name| + constraint = Chef::VersionConstraint.new(constraints[name]) + versions = [] + cookbooks_list[name].keys.sort_by { |version| Chef::Version.new(version) }.reverse.each do |version| + break if num_versions && versions.size >= num_versions + if constraint.include?(version) + versions << version + end + end + yield [name, versions] + end + end + + def recipe_names(cookbook_name, cookbook) + result = [] + if cookbook['recipes'] + cookbook['recipes'].each do |recipe| + if recipe['path'] == "recipes/#{recipe['name']}" && recipe['name'][-3..-1] == '.rb' + if recipe['name'] == 'default.rb' + result << cookbook_name + end + result << "#{cookbook_name}::#{recipe['name'][0..-4]}" + end + end + end + 
result + end + end + end +end diff --git a/lib/chef_zero/endpoints/cookbooks_endpoint.rb b/lib/chef_zero/endpoints/cookbooks_endpoint.rb new file mode 100644 index 0000000..a595718 --- /dev/null +++ b/lib/chef_zero/endpoints/cookbooks_endpoint.rb @@ -0,0 +1,12 @@ +require 'chef_zero/endpoints/cookbooks_base' + +module ChefZero + module Endpoints + # /cookbooks + class CookbooksEndpoint < CookbooksBase + def get(request) + json_response(200, format_cookbooks_list(request, data['cookbooks'])) + end + end + end +end diff --git a/lib/chef_zero/endpoints/data_bag_endpoint.rb b/lib/chef_zero/endpoints/data_bag_endpoint.rb new file mode 100644 index 0000000..6f3d204 --- /dev/null +++ b/lib/chef_zero/endpoints/data_bag_endpoint.rb @@ -0,0 +1,50 @@ +require 'json' +require 'chef_zero/endpoints/rest_list_endpoint' +require 'chef_zero/endpoints/data_bag_item_endpoint' +require 'chef_zero/rest_error_response' + +module ChefZero + module Endpoints + # /data/NAME + class DataBagEndpoint < RestListEndpoint + def initialize(server) + super(server, 'id') + end + + def post(request) + key = JSON.parse(request.body, :create_additions => false)[identity_key] + response = super(request) + if response[0] == 201 + already_json_response(201, DataBagItemEndpoint::populate_defaults(request, request.body, request.rest_path[1], key)) + else + response + end + end + + def get_key(contents) + data_bag_item = JSON.parse(contents, :create_additions => false) + if data_bag_item['json_class'] == 'Chef::DataBagItem' && data_bag_item['raw_data'] + data_bag_item['raw_data']['id'] + else + data_bag_item['id'] + end + end + + def delete(request) + key = request.rest_path[1] + container = data['data'] + if !container.has_key?(key) + raise RestErrorResponse.new(404, "Object not found: #{build_uri(request.base_uri, request.rest_path)}") + end + result = container[key] + container.delete(key) + json_response(200, { + 'chef_type' => 'data_bag', + 'json_class' => 'Chef::DataBag', + 'name' => key + }) + end + 
end + end +end + diff --git a/lib/chef_zero/endpoints/data_bag_item_endpoint.rb b/lib/chef_zero/endpoints/data_bag_item_endpoint.rb new file mode 100644 index 0000000..9c084a3 --- /dev/null +++ b/lib/chef_zero/endpoints/data_bag_item_endpoint.rb @@ -0,0 +1,24 @@ +require 'json' +require 'chef_zero/endpoints/rest_object_endpoint' +require 'chef_zero/data_normalizer' + +module ChefZero + module Endpoints + # /data/NAME/NAME + class DataBagItemEndpoint < RestObjectEndpoint + def initialize(server) + super(server, 'id') + end + + def populate_defaults(request, response_json) + DataBagItemEndpoint::populate_defaults(request, response_json, request.rest_path[1], request.rest_path[2]) + end + + def self.populate_defaults(request, response_json, data_bag, data_bag_item) + response = JSON.parse(response_json, :create_additions => false) + response = DataNormalizer.normalize_data_bag_item(response, data_bag, data_bag_item, request.method) + JSON.pretty_generate(response) + end + end + end +end diff --git a/lib/chef_zero/endpoints/data_bags_endpoint.rb b/lib/chef_zero/endpoints/data_bags_endpoint.rb new file mode 100644 index 0000000..8cf015b --- /dev/null +++ b/lib/chef_zero/endpoints/data_bags_endpoint.rb @@ -0,0 +1,21 @@ +require 'json' +require 'chef_zero/endpoints/rest_list_endpoint' + +module ChefZero + module Endpoints + # /data + class DataBagsEndpoint < RestListEndpoint + def post(request) + container = get_data(request) + contents = request.body + name = JSON.parse(contents, :create_additions => false)[identity_key] + if container[name] + error(409, "Object already exists") + else + container[name] = {} + json_response(201, {"uri" => "#{build_uri(request.base_uri, request.rest_path + [name])}"}) + end + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_cookbook_endpoint.rb b/lib/chef_zero/endpoints/environment_cookbook_endpoint.rb new file mode 100644 index 0000000..3360cd5 --- /dev/null +++
b/lib/chef_zero/endpoints/environment_cookbook_endpoint.rb @@ -0,0 +1,24 @@ +require 'json' +require 'chef_zero/endpoints/cookbooks_base' + +module ChefZero + module Endpoints + # /environments/NAME/cookbooks/NAME + class EnvironmentCookbookEndpoint < CookbooksBase + def get(request) + cookbook_name = request.rest_path[3] + environment = JSON.parse(get_data(request, request.rest_path[0..1]), :create_additions => false) + constraints = environment['cookbook_versions'] || {} + cookbook = get_data(request, request.rest_path[2..3]) + if request.query_params['num_versions'] == 'all' + num_versions = nil + elsif request.query_params['num_versions'] + num_versions = request.query_params['num_versions'].to_i + else + num_versions = nil + end + json_response(200, format_cookbooks_list(request, { cookbook_name => cookbook }, constraints, num_versions)) + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_cookbook_versions_endpoint.rb b/lib/chef_zero/endpoints/environment_cookbook_versions_endpoint.rb new file mode 100644 index 0000000..0d601ce --- /dev/null +++ b/lib/chef_zero/endpoints/environment_cookbook_versions_endpoint.rb @@ -0,0 +1,114 @@ +require 'json' +require 'chef/exceptions' # Needed so Chef::Version/VersionConstraint load +require 'chef/version_class' +require 'chef/version_constraint' +require 'chef_zero/rest_base' +require 'chef_zero/rest_error_response' + +module ChefZero + module Endpoints + # /environments/NAME/cookbook_versions + class EnvironmentCookbookVersionsEndpoint < RestBase + def cookbooks + data['cookbooks'] + end + + def environments + data['environments'] + end + + def post(request) + # Get the list of cookbooks and versions desired by the runlist + desired_versions = {} + run_list = JSON.parse(request.body, :create_additions => false)['run_list'] + run_list.each do |run_list_entry| + if run_list_entry =~ /(.+)\@(.+)/ + raise RestErrorResponse.new(412, "No such cookbook: #{$1}") if !cookbooks[$1] + raise 
RestErrorResponse.new(412, "No such cookbook version for cookbook #{$1}: #{$2}") if !cookbooks[$1][$2] + desired_versions[$1] = [ $2 ] + else + raise RestErrorResponse.new(412, "No such cookbook: #{run_list_entry}") if !cookbooks[run_list_entry] + desired_versions[run_list_entry] = cookbooks[run_list_entry].keys + end + end + + # Filter by environment constraints + environment = JSON.parse(get_data(request, request.rest_path[0..1]), :create_additions => false) + environment_constraints = environment['cookbook_versions'] || {} + + desired_versions.each_key do |name| + desired_versions = filter_by_constraint(desired_versions, name, environment_constraints[name]) + end + + # Depsolve! + solved = depsolve(desired_versions.keys, desired_versions, environment_constraints) + if !solved + raise RestErrorResponse.new(412, "Unsolvable versions!") + end + + result = {} + solved.each_pair do |name, versions| + result[name] = JSON.parse(data['cookbooks'][name][versions[0]], :create_additions => false) + end + json_response(200, result) + end + + def depsolve(unsolved, desired_versions, environment_constraints) + return nil if desired_versions.values.any? { |versions| versions.empty?
} + + # If everything is already solved, we are done + solve_for = unsolved[0] + return desired_versions if !solve_for + + # Go through each desired version of this cookbook, starting with the latest, + # until we find one we can solve successfully with + sort_versions(desired_versions[solve_for]).each do |desired_version| + new_desired_versions = desired_versions.clone + new_desired_versions[solve_for] = [ desired_version ] + new_unsolved = unsolved[1..-1] + + # Pick this cookbook, and add dependencies + cookbook_obj = JSON.parse(cookbooks[solve_for][desired_version], :create_additions => false) + dep_not_found = false + cookbook_obj['metadata']['dependencies'].each_pair do |dep_name, dep_constraint| + # If the dep is not already in the list, add it to the list to solve + # and bring in all environment-allowed cookbook versions to desired_versions + if !new_desired_versions.has_key?(dep_name) + new_unsolved = new_unsolved + [dep_name] + # If the dep is missing, we will try other versions of the cookbook that might not have the bad dep. + if !cookbooks[dep_name] + dep_not_found = true + break + end + new_desired_versions[dep_name] = cookbooks[dep_name].keys + new_desired_versions = filter_by_constraint(new_desired_versions, dep_name, environment_constraints[dep_name]) + end + new_desired_versions = filter_by_constraint(new_desired_versions, dep_name, dep_constraint) + end + + next if dep_not_found + + # Depsolve children with this desired version! First solution wins.
+ result = depsolve(new_unsolved, new_desired_versions, environment_constraints) + return result if result + end + return nil + end + + def sort_versions(versions) + result = versions.sort_by { |version| Chef::Version.new(version) } + result.reverse + end + + def filter_by_constraint(versions, cookbook_name, constraint) + return versions if !constraint + constraint = Chef::VersionConstraint.new(constraint) + new_versions = versions[cookbook_name] + new_versions = new_versions.select { |version| constraint.include?(version) } + result = versions.clone + result[cookbook_name] = new_versions + result + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_cookbooks_endpoint.rb b/lib/chef_zero/endpoints/environment_cookbooks_endpoint.rb new file mode 100644 index 0000000..591f43f --- /dev/null +++ b/lib/chef_zero/endpoints/environment_cookbooks_endpoint.rb @@ -0,0 +1,22 @@ +require 'json' +require 'chef_zero/endpoints/cookbooks_base' + +module ChefZero + module Endpoints + # /environments/NAME/cookbooks + class EnvironmentCookbooksEndpoint < CookbooksBase + def get(request) + environment = JSON.parse(get_data(request, request.rest_path[0..1]), :create_additions => false) + constraints = environment['cookbook_versions'] || {} + if request.query_params['num_versions'] == 'all' + num_versions = nil + elsif request.query_params['num_versions'] + num_versions = request.query_params['num_versions'].to_i + else + num_versions = 1 + end + json_response(200, format_cookbooks_list(request, data['cookbooks'], constraints, num_versions)) + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_endpoint.rb b/lib/chef_zero/endpoints/environment_endpoint.rb new file mode 100644 index 0000000..a418e78 --- /dev/null +++ b/lib/chef_zero/endpoints/environment_endpoint.rb @@ -0,0 +1,33 @@ +require 'json' +require 'chef_zero/endpoints/rest_object_endpoint' +require 'chef_zero/data_normalizer' + +module ChefZero + module Endpoints + # /environments/NAME + 
class EnvironmentEndpoint < RestObjectEndpoint + def delete(request) + if request.rest_path[1] == "_default" + # 405, really? + error(405, "The '_default' environment cannot be modified.") + else + super(request) + end + end + + def put(request) + if request.rest_path[1] == "_default" + error(405, "The '_default' environment cannot be modified.") + else + super(request) + end + end + + def populate_defaults(request, response_json) + response = JSON.parse(response_json, :create_additions => false) + response = DataNormalizer.normalize_environment(response, request.rest_path[1]) + JSON.pretty_generate(response) + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_nodes_endpoint.rb b/lib/chef_zero/endpoints/environment_nodes_endpoint.rb new file mode 100644 index 0000000..31a4044 --- /dev/null +++ b/lib/chef_zero/endpoints/environment_nodes_endpoint.rb @@ -0,0 +1,23 @@ +require 'json' +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /environments/NAME/nodes + class EnvironmentNodesEndpoint < RestBase + def get(request) + # 404 if environment does not exist + get_data(request, request.rest_path[0..1]) + + result = {} + data['nodes'].each_pair do |name, node| + node_json = JSON.parse(node, :create_additions => false) + if node_json['chef_environment'] == request.rest_path[1] + result[name] = build_uri(request.base_uri, [ 'nodes', name ]) + end + end + json_response(200, result) + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_recipes_endpoint.rb b/lib/chef_zero/endpoints/environment_recipes_endpoint.rb new file mode 100644 index 0000000..0bbaa8b --- /dev/null +++ b/lib/chef_zero/endpoints/environment_recipes_endpoint.rb @@ -0,0 +1,22 @@ +require 'json' +require 'chef_zero/endpoints/cookbooks_base' + +module ChefZero + module Endpoints + # /environments/NAME/recipes + class EnvironmentRecipesEndpoint < CookbooksBase + def get(request) + environment = JSON.parse(get_data(request, request.rest_path[0..1]),
:create_additions => false) + constraints = environment['cookbook_versions'] || {} + result = [] + filter_cookbooks(data['cookbooks'], constraints, 1) do |name, versions| + if versions.size > 0 + cookbook = JSON.parse(data['cookbooks'][name][versions[0]], :create_additions => false) + result += recipe_names(name, cookbook) + end + end + json_response(200, result.sort) + end + end + end +end diff --git a/lib/chef_zero/endpoints/environment_role_endpoint.rb b/lib/chef_zero/endpoints/environment_role_endpoint.rb new file mode 100644 index 0000000..94be3ab --- /dev/null +++ b/lib/chef_zero/endpoints/environment_role_endpoint.rb @@ -0,0 +1,35 @@ +require 'json' +require 'chef_zero/endpoints/cookbooks_base' + +module ChefZero + module Endpoints + # /environments/NAME/roles/NAME + # /roles/NAME/environments/NAME + class EnvironmentRoleEndpoint < CookbooksBase + def get(request) + # 404 if environment does not exist + if request.rest_path[0] == 'environments' + environment_path = request.rest_path[0..1] + role_path = request.rest_path[2..3] + else + environment_path = request.rest_path[2..3] + role_path = request.rest_path[0..1] + end + get_data(request, environment_path) + + role = JSON.parse(get_data(request, role_path), :create_additions => false) + environment_name = environment_path[1] + if environment_name == '_default' + run_list = role['run_list'] + else + if role['env_run_lists'] + run_list = role['env_run_lists'][environment_name] + else + run_list = nil + end + end + json_response(200, { 'run_list' => run_list }) + end + end + end +end diff --git a/lib/chef_zero/endpoints/file_store_file_endpoint.rb b/lib/chef_zero/endpoints/file_store_file_endpoint.rb new file mode 100644 index 0000000..98cea4d --- /dev/null +++ b/lib/chef_zero/endpoints/file_store_file_endpoint.rb @@ -0,0 +1,22 @@ +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # The minimum amount of S3 necessary to support cookbook upload/download + # /file_store/FILE + class 
FileStoreFileEndpoint < RestBase + def json_only + false + end + + def get(request) + [200, {"Content-Type" => 'application/x-binary'}, get_data(request) ] + end + + def put(request) + data['file_store'][request.rest_path[1]] = request.body + json_response(200, {}) + end + end + end +end diff --git a/lib/chef_zero/endpoints/node_endpoint.rb b/lib/chef_zero/endpoints/node_endpoint.rb new file mode 100644 index 0000000..980008f --- /dev/null +++ b/lib/chef_zero/endpoints/node_endpoint.rb @@ -0,0 +1,17 @@ +require 'json' +require 'chef_zero/endpoints/rest_object_endpoint' +require 'chef_zero/data_normalizer' + +module ChefZero + module Endpoints + # /nodes/ID + class NodeEndpoint < RestObjectEndpoint + def populate_defaults(request, response_json) + node = JSON.parse(response_json, :create_additions => false) + node = DataNormalizer.normalize_node(node, request.rest_path[1]) + JSON.pretty_generate(node) + end + end + end +end + diff --git a/lib/chef_zero/endpoints/not_found_endpoint.rb b/lib/chef_zero/endpoints/not_found_endpoint.rb new file mode 100644 index 0000000..5625c63 --- /dev/null +++ b/lib/chef_zero/endpoints/not_found_endpoint.rb @@ -0,0 +1,9 @@ +module ChefZero + module Endpoints + class NotFoundEndpoint + def call(env) + return [404, {"Content-Type" => "application/json"}, "Object not found: #{env['REQUEST_PATH']}"] + end + end + end +end diff --git a/lib/chef_zero/endpoints/principal_endpoint.rb b/lib/chef_zero/endpoints/principal_endpoint.rb new file mode 100644 index 0000000..1833592 --- /dev/null +++ b/lib/chef_zero/endpoints/principal_endpoint.rb @@ -0,0 +1,30 @@ +require 'json' +require 'chef_zero' +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /principals/NAME + class PrincipalEndpoint < RestBase + def get(request) + name = request.rest_path[-1] + json = data['users'][name] + if json + type = 'user' + else + json = data['clients'][name] + type = 'client' + end + if json + json_response(200, { + 'name' => name, + 'type' => 
type, + 'public_key' => JSON.parse(json)['public_key'] || PUBLIC_KEY + }) + else + error(404, 'Principal not found') + end + end + end + end +end diff --git a/lib/chef_zero/endpoints/rest_list_endpoint.rb b/lib/chef_zero/endpoints/rest_list_endpoint.rb new file mode 100644 index 0000000..46f8f88 --- /dev/null +++ b/lib/chef_zero/endpoints/rest_list_endpoint.rb @@ -0,0 +1,41 @@ +require 'json' +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # Typical REST list endpoint (/roles or /data/BAG) + class RestListEndpoint < RestBase + def initialize(server, identity_key = 'name') + super(server) + @identity_key = identity_key + end + + attr_reader :identity_key + + def get(request) + # Get the result + result_hash = {} + get_data(request).keys.sort.each do |name| + result_hash[name] = "#{build_uri(request.base_uri, request.rest_path + [name])}" + end + json_response(200, result_hash) + end + + def post(request) + container = get_data(request) + contents = request.body + key = get_key(contents) + if container[key] + error(409, 'Object already exists') + else + container[key] = contents + json_response(201, {'uri' => "#{build_uri(request.base_uri, request.rest_path + [key])}"}) + end + end + + def get_key(contents) + JSON.parse(contents, :create_additions => false)[identity_key] + end + end + end +end diff --git a/lib/chef_zero/endpoints/rest_object_endpoint.rb b/lib/chef_zero/endpoints/rest_object_endpoint.rb new file mode 100644 index 0000000..bd45afe --- /dev/null +++ b/lib/chef_zero/endpoints/rest_object_endpoint.rb @@ -0,0 +1,65 @@ +require 'json' +require 'chef_zero/rest_base' +require 'chef_zero/rest_error_response' + +module ChefZero + module Endpoints + # Typical REST leaf endpoint (/roles/NAME or /data/BAG/NAME) + class RestObjectEndpoint < RestBase + def initialize(server, identity_key = 'name') + super(server) + @identity_key = identity_key + end + + attr_reader :identity_key + + def get(request) + already_json_response(200, 
populate_defaults(request, get_data(request))) + end + + def put(request) + # We grab the old body to trigger a 404 if it doesn't exist + old_body = get_data(request) + request_json = JSON.parse(request.body, :create_additions => false) + key = request_json[identity_key] || request.rest_path[-1] + container = get_data(request, request.rest_path[0..-2]) + # If it's a rename, check for conflict and delete the old value + rename = key != request.rest_path[-1] + if rename + if container.has_key?(key) + return error(409, "Cannot rename '#{request.rest_path[-1]}' to '#{key}': '#{key}' already exists") + end + container.delete(request.rest_path[-1]) + end + container[key] = request.body + already_json_response(200, populate_defaults(request, request.body)) + end + + def delete(request) + key = request.rest_path[-1] + container = get_data(request, request.rest_path[0..-2]) + if !container.has_key?(key) + raise RestErrorResponse.new(404, "Object not found: #{build_uri(request.base_uri, request.rest_path)}") + end + result = container[key] + container.delete(key) + already_json_response(200, populate_defaults(request, result)) + end + + def patch_request_body(request) + container = get_data(request, request.rest_path[0..-2]) + existing_value = container[request.rest_path[-1]] + if existing_value + request_json = JSON.parse(request.body, :create_additions => false) + existing_json = JSON.parse(existing_value, :create_additions => false) + merged_json = existing_json.merge(request_json) + if merged_json.size > request_json.size + return JSON.pretty_generate(merged_json) + end + end + request.body + end + end + end +end + diff --git a/lib/chef_zero/endpoints/role_endpoint.rb b/lib/chef_zero/endpoints/role_endpoint.rb new file mode 100644 index 0000000..6a4cfd4 --- /dev/null +++ b/lib/chef_zero/endpoints/role_endpoint.rb @@ -0,0 +1,16 @@ +require 'json' +require 'chef_zero/endpoints/rest_object_endpoint' +require 'chef_zero/data_normalizer' + +module ChefZero + module Endpoints 
+ # /roles/NAME + class RoleEndpoint < RestObjectEndpoint + def populate_defaults(request, response_json) + role = JSON.parse(response_json, :create_additions => false) + role = DataNormalizer.normalize_role(role, request.rest_path[1]) + JSON.pretty_generate(role) + end + end + end +end diff --git a/lib/chef_zero/endpoints/role_environments_endpoint.rb b/lib/chef_zero/endpoints/role_environments_endpoint.rb new file mode 100644 index 0000000..327602e --- /dev/null +++ b/lib/chef_zero/endpoints/role_environments_endpoint.rb @@ -0,0 +1,14 @@ +require 'json' +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /roles/NAME/environments + class RoleEnvironmentsEndpoint < RestBase + def get(request) + role = JSON.parse(get_data(request, request.rest_path[0..1]), :create_additions => false) + json_response(200, [ '_default' ] + (role['env_run_lists'] || {}).keys) + end + end + end +end diff --git a/lib/chef_zero/endpoints/sandbox_endpoint.rb b/lib/chef_zero/endpoints/sandbox_endpoint.rb new file mode 100644 index 0000000..09cb180 --- /dev/null +++ b/lib/chef_zero/endpoints/sandbox_endpoint.rb @@ -0,0 +1,22 @@ +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /sandboxes/ID + class SandboxEndpoint < RestBase + def put(request) + existing_sandbox = get_data(request, request.rest_path) + data['sandboxes'].delete(request.rest_path[1]) + time_str = existing_sandbox[:create_time].strftime('%Y-%m-%dT%H:%M:%S%z') + time_str = "#{time_str[0..21]}:#{time_str[22..23]}" + json_response(200, { + :guid => request.rest_path[1], + :name => request.rest_path[1], + :checksums => existing_sandbox[:checksums], + :create_time => time_str, + :is_completed => true + }) + end + end + end +end diff --git a/lib/chef_zero/endpoints/sandboxes_endpoint.rb b/lib/chef_zero/endpoints/sandboxes_endpoint.rb new file mode 100644 index 0000000..698564f --- /dev/null +++ b/lib/chef_zero/endpoints/sandboxes_endpoint.rb @@ -0,0 +1,44 @@ +require 'json' +require
'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /sandboxes + class SandboxesEndpoint < RestBase + def initialize(server) + super(server) + @next_id = 1 + end + + def post(request) + sandbox_checksums = [] + + needed_checksums = JSON.parse(request.body, :create_additions => false)['checksums'] + result_checksums = {} + needed_checksums.keys.each do |needed_checksum| + if data['file_store'].has_key?(needed_checksum) + result_checksums[needed_checksum] = { :needs_upload => false } + else + result_checksums[needed_checksum] = { + :needs_upload => true, + :url => build_uri(request.base_uri, ['file_store', needed_checksum]) + } + sandbox_checksums << needed_checksum + end + end + + id = @next_id.to_s + @next_id+=1 + + data['sandboxes'][id] = { :create_time => Time.now.utc, :checksums => sandbox_checksums } + + json_response(201, { + :uri => build_uri(request.base_uri, request.rest_path + [id.to_s]), + :checksums => result_checksums, + :sandbox_id => id + }) + end + end + end +end + diff --git a/lib/chef_zero/endpoints/search_endpoint.rb b/lib/chef_zero/endpoints/search_endpoint.rb new file mode 100644 index 0000000..8a66d9a --- /dev/null +++ b/lib/chef_zero/endpoints/search_endpoint.rb @@ -0,0 +1,139 @@ +require 'json' +require 'chef/mixin/deep_merge' +require 'chef_zero/endpoints/rest_object_endpoint' +require 'chef_zero/data_normalizer' +require 'chef_zero/rest_error_response' +require 'chef_zero/solr/solr_parser' +require 'chef_zero/solr/solr_doc' + +module ChefZero + module Endpoints + # /search/INDEX + class SearchEndpoint < RestBase + def get(request) + results = search(request) + results['rows'] = results['rows'].map { |name,uri,value,search_value| value } + json_response(200, results) + end + + def post(request) + full_results = search(request) + keys = JSON.parse(request.body, :create_additions => false) + partial_results = full_results['rows'].map do |name, uri, doc, search_value| + data = {} + keys.each_pair do |key, path| + if path.size > 0 + 
value = search_value + path.each do |path_part| + value = value[path_part] if !value.nil? + end + data[key] = value + else + data[key] = nil + end + end + { + 'url' => uri, + 'data' => data + } + end + json_response(200, { + 'rows' => partial_results, + 'start' => full_results['start'], + 'total' => full_results['total'] + }) + end + + private + + def search_container(request, index) + case index + when 'client' + [ data['clients'], Proc.new { |client, name| DataNormalizer.normalize_client(client, name) }, build_uri(request.base_uri, [ 'clients' ]) ] + when 'node' + [ data['nodes'], Proc.new { |node, name| DataNormalizer.normalize_node(node, name) }, build_uri(request.base_uri, [ 'nodes' ]) ] + when 'environment' + [ data['environments'], Proc.new { |environment, name| DataNormalizer.normalize_environment(environment, name) }, build_uri(request.base_uri, [ 'environments' ]) ] + when 'role' + [ data['roles'], Proc.new { |role, name| DataNormalizer.normalize_role(role, name) }, build_uri(request.base_uri, [ 'roles' ]) ] + else + [ data['data'][index], Proc.new { |data_bag_item, id| DataNormalizer.normalize_data_bag_item(data_bag_item, index, id, 'DELETE') }, build_uri(request.base_uri, [ 'data', index ]) ] + end + end + + def expand_for_indexing(value, index, id) + if index == 'node' + result = {} + Chef::Mixin::DeepMerge.deep_merge!(value['default'] || {}, result) + Chef::Mixin::DeepMerge.deep_merge!(value['normal'] || {}, result) + Chef::Mixin::DeepMerge.deep_merge!(value['override'] || {}, result) + Chef::Mixin::DeepMerge.deep_merge!(value['automatic'] || {}, result) + result['recipe'] = [] + result['role'] = [] + if value['run_list'] + value['run_list'].each do |run_list_entry| + if run_list_entry =~ /^(recipe|role)\[(.*)\]/ + result[$1] << $2 + end + end + end + value.each_pair do |key, value| + result[key] = value unless %w(default normal override automatic).include?(key) + end + result + + elsif !%w(client environment role).include?(index) + 
DataNormalizer.normalize_data_bag_item(value, index, id, 'GET') + else + value + end + end + + def search(request) + # Extract parameters + index = request.rest_path[1] + query_string = request.query_params['q'] || '*:*' + solr_query = ChefZero::Solr::SolrParser.new(query_string).parse + sort_string = request.query_params['sort'] + start = request.query_params['start'] + start = start.to_i if start + rows = request.query_params['rows'] + rows = rows.to_i if rows + + # Get the search container + container, expander, base_uri = search_container(request, index) + if container.nil? + raise RestErrorResponse.new(404, "Object not found: #{build_uri(request.base_uri, request.rest_path)}") + end + + # Search! + result = [] + container.each_pair do |name,value| + expanded = expander.call(JSON.parse(value, :create_additions => false), name) + result << [ name, build_uri(base_uri, [name]), expanded, expand_for_indexing(expanded, index, name) ] + end + result = result.select do |name, uri, value, search_value| + solr_query.matches_doc?(ChefZero::Solr::SolrDoc.new(search_value, name)) + end + total = result.size + + # Sort + if sort_string + sort_key, sort_order = sort_string.split(/\s+/, 2) + result = result.sort_by { |name,uri,value,search_value| ChefZero::Solr::SolrDoc.new(search_value, name)[sort_key] } + result = result.reverse if sort_order == "DESC" + end + + # Paginate: return at most `rows` results starting at `start` + if start + result = result[start, rows || result.size] + end + { + 'rows' => result, + 'start' => start || 0, + 'total' => total + } + end + end + end +end diff --git a/lib/chef_zero/endpoints/searches_endpoint.rb b/lib/chef_zero/endpoints/searches_endpoint.rb new file mode 100644 index 0000000..d7ab451 --- /dev/null +++ b/lib/chef_zero/endpoints/searches_endpoint.rb @@ -0,0 +1,18 @@ +require 'chef_zero/rest_base' + +module ChefZero + module Endpoints + # /search + class SearchesEndpoint < RestBase + def get(request) + # Get the result + result_hash = {} + indices = (%w(client environment node role) +
data['data'].keys).sort + indices.each do |index| + result_hash[index] = build_uri(request.base_uri, request.rest_path + [index]) + end + json_response(200, result_hash) + end + end + end +end diff --git a/lib/chef_zero/rest_base.rb b/lib/chef_zero/rest_base.rb new file mode 100644 index 0000000..068a55d --- /dev/null +++ b/lib/chef_zero/rest_base.rb @@ -0,0 +1,82 @@ +require 'chef_zero/rest_request' +require 'chef_zero/rest_error_response' + +module ChefZero + class RestBase + def initialize(server) + @server = server + end + + attr_reader :server + + def data + server.data + end + + def call(env) + begin + rest_path = env['PATH_INFO'].split('/').select { |part| part != "" } + method = env['REQUEST_METHOD'].downcase.to_sym + if !self.respond_to?(method) + accept_methods = [:get, :put, :post, :delete].select { |m| self.respond_to?(m) } + accept_methods_str = accept_methods.map { |m| m.to_s.upcase }.join(', ') + return [405, {"Content-Type" => "text/plain", "Allow" => accept_methods_str}, "Bad request method for '#{env['REQUEST_PATH']}': #{env['REQUEST_METHOD']}"] + end + if json_only && !env['HTTP_ACCEPT'].split(';').include?('application/json') + return [406, {"Content-Type" => "text/plain"}, "Must accept application/json"] + end + # Dispatch to get()/post()/put()/delete() + begin + self.send(method, RestRequest.new(env)) + rescue RestErrorResponse => e + error(e.response_code, e.error) + end + rescue + puts $!.inspect + puts $!.backtrace + raise + end + end + + def json_only + true + end + + def get_data(request, rest_path=nil) + rest_path ||= request.rest_path + # Grab the value we're looking for + value = data + rest_path.each do |path_part| + if !value.has_key?(path_part) + raise RestErrorResponse.new(404, "Object not found: #{build_uri(request.base_uri, rest_path)}") + end + value = value[path_part] + end + value + end + + def error(response_code, error) + json_response(response_code, {"error" => [error]}) + end + + def json_response(response_code, json) + 
already_json_response(response_code, JSON.pretty_generate(json)) + end + + def already_json_response(response_code, json_text) + [response_code, {"Content-Type" => "application/json"}, json_text] + end + + def build_uri(base_uri, rest_path) + RestBase::build_uri(base_uri, rest_path) + end + + def self.build_uri(base_uri, rest_path) + "#{base_uri}/#{rest_path.join('/')}" + end + + def populate_defaults(request, response) + response + end + end +end diff --git a/lib/chef_zero/rest_error_response.rb b/lib/chef_zero/rest_error_response.rb new file mode 100644 index 0000000..2edca25 --- /dev/null +++ b/lib/chef_zero/rest_error_response.rb @@ -0,0 +1,11 @@ +module ChefZero + class RestErrorResponse < Exception + def initialize(response_code, error) + @response_code = response_code + @error = error + end + + attr_reader :response_code + attr_reader :error + end +end diff --git a/lib/chef_zero/rest_request.rb b/lib/chef_zero/rest_request.rb new file mode 100644 index 0000000..5df3d0f --- /dev/null +++ b/lib/chef_zero/rest_request.rb @@ -0,0 +1,42 @@ +require 'rack/request' + +module ChefZero + class RestRequest + def initialize(env) + @env = env + end + + attr_reader :env + + def base_uri + @base_uri ||= "#{env['rack.url_scheme']}://#{env['HTTP_HOST']}#{env['SCRIPT_NAME']}" + end + + def method + @env['REQUEST_METHOD'] + end + + def rest_path + @rest_path ||= env['PATH_INFO'].split('/').select { |part| part != "" } + end + + def body=(body) + @body = body + end + + def body + @body ||= env['rack.input'].read + end + + def query_params + @query_params ||= begin + params = Rack::Request.new(env).GET + params.keys.each do |key| + params[key] = URI.unescape(params[key]) + end + params + end + end + end +end + diff --git a/lib/chef_zero/router.rb b/lib/chef_zero/router.rb new file mode 100644 index 0000000..0389c8a --- /dev/null +++ b/lib/chef_zero/router.rb @@ -0,0 +1,24 @@ +module ChefZero + class Router + def initialize(routes) + @routes = routes.map do |route, endpoint| + 
pattern = Regexp.new("^#{route.gsub('*', '[^/]*')}$") + [ pattern, endpoint ] + end + end + + attr_reader :routes + attr_accessor :not_found + + def call(env) + puts "#{env['REQUEST_METHOD']} #{env['PATH_INFO']}#{env['QUERY_STRING'] != '' ? "?" + env['QUERY_STRING'] : ''}" + clean_path = "/" + env['PATH_INFO'].split('/').select { |part| part != "" }.join("/") + routes.each do |route, endpoint| + if route.match(clean_path) + return endpoint.call(env) + end + end + not_found.call(env) + end + end +end diff --git a/lib/chef_zero/server.rb b/lib/chef_zero/server.rb new file mode 100644 index 0000000..7f0b780 --- /dev/null +++ b/lib/chef_zero/server.rb @@ -0,0 +1,140 @@ +# +# Author:: John Keiser (<jkeiser@opscode.com>) +# Copyright:: Copyright (c) 2012 Opscode, Inc. +# License:: Apache License, Version 2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# + +require 'rubygems' +require 'webrick' +require 'rack' +require 'openssl' +require 'chef_zero' +require 'chef_zero/router' + +require 'chef_zero/endpoints/authenticate_user_endpoint' +require 'chef_zero/endpoints/actors_endpoint' +require 'chef_zero/endpoints/actor_endpoint' +require 'chef_zero/endpoints/cookbooks_endpoint' +require 'chef_zero/endpoints/cookbook_endpoint' +require 'chef_zero/endpoints/cookbook_version_endpoint' +require 'chef_zero/endpoints/data_bags_endpoint' +require 'chef_zero/endpoints/data_bag_endpoint' +require 'chef_zero/endpoints/data_bag_item_endpoint' +require 'chef_zero/endpoints/rest_list_endpoint' +require 'chef_zero/endpoints/environment_endpoint' +require 'chef_zero/endpoints/environment_cookbooks_endpoint' +require 'chef_zero/endpoints/environment_cookbook_endpoint' +require 'chef_zero/endpoints/environment_cookbook_versions_endpoint' +require 'chef_zero/endpoints/environment_nodes_endpoint' +require 'chef_zero/endpoints/environment_recipes_endpoint' +require 'chef_zero/endpoints/environment_role_endpoint' +require 'chef_zero/endpoints/node_endpoint' +require 'chef_zero/endpoints/principal_endpoint' +require 'chef_zero/endpoints/role_endpoint' +require 'chef_zero/endpoints/role_environments_endpoint' +require 'chef_zero/endpoints/sandboxes_endpoint' +require 'chef_zero/endpoints/sandbox_endpoint' +require 'chef_zero/endpoints/searches_endpoint' +require 'chef_zero/endpoints/search_endpoint' +require 'chef_zero/endpoints/file_store_file_endpoint' +require 'chef_zero/endpoints/not_found_endpoint' + +module ChefZero + class Server < Rack::Server + def initialize(options) + options[:host] ||= "localhost" # TODO 0.0.0.0? 
+ options[:port] ||= 80 + options[:generate_real_keys] = true if !options.has_key?(:generate_real_keys) + super(options) + @generate_real_keys = options[:generate_real_keys] + @data = { + 'clients' => { + 'chef-validator' => '{ "validator": true }', + 'chef-webui' => '{ "admin": true }' + }, + 'cookbooks' => {}, + 'data' => {}, + 'environments' => { + '_default' => '{ "description": "The default Chef environment" }' + }, + 'file_store' => {}, + 'nodes' => {}, + 'roles' => {}, + 'sandboxes' => {}, + 'users' => { + 'admin' => '{ "admin": true }' + } + } + end + + attr_reader :data + attr_reader :generate_real_keys + + include ChefZero::Endpoints + + def app + @app ||= begin + router = Router.new([ + [ '/authenticate_user', AuthenticateUserEndpoint.new(self) ], + [ '/clients', ActorsEndpoint.new(self) ], + [ '/clients/*', ActorEndpoint.new(self) ], + [ '/cookbooks', CookbooksEndpoint.new(self) ], + [ '/cookbooks/*', CookbookEndpoint.new(self) ], + [ '/cookbooks/*/*', CookbookVersionEndpoint.new(self) ], + [ '/data', DataBagsEndpoint.new(self) ], + [ '/data/*', DataBagEndpoint.new(self) ], + [ '/data/*/*', DataBagItemEndpoint.new(self) ], + [ '/environments', RestListEndpoint.new(self) ], + [ '/environments/*', EnvironmentEndpoint.new(self) ], + [ '/environments/*/cookbooks', EnvironmentCookbooksEndpoint.new(self) ], + [ '/environments/*/cookbooks/*', EnvironmentCookbookEndpoint.new(self) ], + [ '/environments/*/cookbook_versions', EnvironmentCookbookVersionsEndpoint.new(self) ], + [ '/environments/*/nodes', EnvironmentNodesEndpoint.new(self) ], + [ '/environments/*/recipes', EnvironmentRecipesEndpoint.new(self) ], + [ '/environments/*/roles/*', EnvironmentRoleEndpoint.new(self) ], + [ '/nodes', RestListEndpoint.new(self) ], + [ '/nodes/*', NodeEndpoint.new(self) ], + [ '/principals/*', PrincipalEndpoint.new(self) ], + [ '/roles', RestListEndpoint.new(self) ], + [ '/roles/*', RoleEndpoint.new(self) ], + [ '/roles/*/environments', RoleEnvironmentsEndpoint.new(self) ], + 
[ '/roles/*/environments/*', EnvironmentRoleEndpoint.new(self) ], + [ '/sandboxes', SandboxesEndpoint.new(self) ], + [ '/sandboxes/*', SandboxEndpoint.new(self) ], + [ '/search', SearchesEndpoint.new(self) ], + [ '/search/*', SearchEndpoint.new(self) ], + [ '/users', ActorsEndpoint.new(self) ], + [ '/users/*', ActorEndpoint.new(self) ], + + [ '/file_store/*', FileStoreFileEndpoint.new(self) ], + ]) + router.not_found = NotFoundEndpoint.new + router + end + end + + def gen_key_pair + if generate_real_keys + private_key = OpenSSL::PKey::RSA.new(2048) + public_key = private_key.public_key.to_s + public_key.sub!(/^-----BEGIN RSA PUBLIC KEY-----/, '-----BEGIN PUBLIC KEY-----') + public_key.sub!(/-----END RSA PUBLIC KEY-----(\s+)$/, '-----END PUBLIC KEY-----\1') + [private_key.to_s, public_key] + else + [PRIVATE_KEY, PUBLIC_KEY] + end + end + end +end diff --git a/lib/chef_zero/solr/query/binary_operator.rb b/lib/chef_zero/solr/query/binary_operator.rb new file mode 100644 index 0000000..bd8fa4c --- /dev/null +++ b/lib/chef_zero/solr/query/binary_operator.rb @@ -0,0 +1,53 @@ +module ChefZero + module Solr + module Query + class BinaryOperator + def initialize(left, operator, right) + @left = left + @operator = operator + @right = right + end + + def to_s + "(#{left} #{operator} #{right})" + end + + attr_reader :left + attr_reader :operator + attr_reader :right + + def matches_doc?(doc) + case @operator + when 'AND' + left.matches_doc?(doc) && right.matches_doc?(doc) + when 'OR' + left.matches_doc?(doc) || right.matches_doc?(doc) + when '^' + left.matches_doc?(doc) + when ':' + if left.respond_to?(:literal_string) && left.literal_string + value = doc[left.literal_string] + right.matches_values?([value]) + else + values = doc.matching_values { |key| left.matches_values?([key]) } + right.matches_values?(values) + end + end + end + + def matches_values?(values) + case @operator + when 'AND' + left.matches_values?(values) && right.matches_values?(values) + when 'OR' + 
left.matches_values?(values) || right.matches_values?(values) + when '^' + left.matches_values?(values) + when ':' + raise ": does not work inside a : or term" + end + end + end + end + end +end diff --git a/lib/chef_zero/solr/query/phrase.rb b/lib/chef_zero/solr/query/phrase.rb new file mode 100644 index 0000000..f229da9 --- /dev/null +++ b/lib/chef_zero/solr/query/phrase.rb @@ -0,0 +1,23 @@ +require 'chef_zero/solr/query/regexpable_query' + +module ChefZero + module Solr + module Query + class Phrase < RegexpableQuery + def initialize(terms) + # Phrase is terms separated by whitespace + if terms.size == 1 && terms[0].literal_string + literal_string = terms[0].literal_string + else + literal_string = nil + end + super(terms.map { |term| term.regexp_string }.join("#{NON_WORD_CHARACTER}+"), literal_string) + end + + def to_s + "Phrase(\"#{@regexp_string}\")" + end + end + end + end +end diff --git a/lib/chef_zero/solr/query/range_query.rb b/lib/chef_zero/solr/query/range_query.rb new file mode 100644 index 0000000..db92548 --- /dev/null +++ b/lib/chef_zero/solr/query/range_query.rb @@ -0,0 +1,34 @@ +module ChefZero + module Solr + module Query + class RangeQuery + def initialize(from, to, from_inclusive, to_inclusive) + @from = from + @to = to + @from_inclusive = from_inclusive + @to_inclusive = to_inclusive + end + + def to_s + "#{@from_inclusive ? '[' : '{'}#{@from} TO #{@to}#{@to_inclusive ?
']' : '}'}" + end + + def matches?(key, value) + case @from <=> value + when 1 + return false + when 0 + return false if !@from_inclusive + end + case @to <=> value + when -1 + return false + when 0 + return false if !@to_inclusive + end + return true + end + end + end + end +end diff --git a/lib/chef_zero/solr/query/regexpable_query.rb b/lib/chef_zero/solr/query/regexpable_query.rb new file mode 100644 index 0000000..5166309 --- /dev/null +++ b/lib/chef_zero/solr/query/regexpable_query.rb @@ -0,0 +1,29 @@ +module ChefZero + module Solr + module Query + class RegexpableQuery + def initialize(regexp_string, literal_string) + @regexp_string = regexp_string + # Surround the regexp with word boundaries + @regexp = Regexp.new("(^|#{NON_WORD_CHARACTER})#{regexp_string}($|#{NON_WORD_CHARACTER})", true) + @literal_string = literal_string + end + + attr_reader :literal_string + attr_reader :regexp_string + attr_reader :regexp + + def matches_doc?(doc) + value = doc[DEFAULT_FIELD] + return value ? matches_values?([value]) : false + end + def matches_values?(values) + values.any? { |value| !@regexp.match(value).nil?
} + end + + WORD_CHARACTER = "[A-Za-z0-9@._':]" + NON_WORD_CHARACTER = "[^A-Za-z0-9@._':]" + end + end + end +end diff --git a/lib/chef_zero/solr/query/subquery.rb b/lib/chef_zero/solr/query/subquery.rb new file mode 100644 index 0000000..3727a20 --- /dev/null +++ b/lib/chef_zero/solr/query/subquery.rb @@ -0,0 +1,35 @@ +module ChefZero + module Solr + module Query + class Subquery + def initialize(subquery) + @subquery = subquery + end + + def to_s + "(#{@subquery})" + end + + def literal_string + subquery.literal_string + end + + def regexp + subquery.regexp + end + + def regexp_string + subquery.regexp_string + end + + def matches_doc?(doc) + subquery.matches_doc?(doc) + end + + def matches_values?(values) + subquery.matches_values?(values) + end + end + end + end +end diff --git a/lib/chef_zero/solr/query/term.rb b/lib/chef_zero/solr/query/term.rb new file mode 100644 index 0000000..23f4a72 --- /dev/null +++ b/lib/chef_zero/solr/query/term.rb @@ -0,0 +1,45 @@ +require 'chef_zero/solr/query/regexpable_query' + +module ChefZero + module Solr + module Query + class Term < RegexpableQuery + def initialize(term) + # Get rid of escape characters, turn * and ? into .* and . for regex, and + # escape everything that needs escaping + literal_string = "" + regexp_string = "" + index = 0 + while index < term.length + if term[index] == '*' + regexp_string << "#{WORD_CHARACTER}*" + literal_string = nil + index += 1 + elsif term[index] == '?' 
+ regexp_string << WORD_CHARACTER + literal_string = nil + index += 1 + elsif term[index] == '~' + raise "~ unsupported" + else + if term[index] == '\\' + index = index+1 + if index >= term.length + raise "Backslash at end of string '#{term}'" + end + end + literal_string << term[index] if literal_string + regexp_string << Regexp.escape(term[index]) + index += 1 + end + end + super(regexp_string, literal_string) + end + + def to_s + "Term(#{regexp_string})" + end + end + end + end +end diff --git a/lib/chef_zero/solr/query/unary_operator.rb b/lib/chef_zero/solr/query/unary_operator.rb new file mode 100644 index 0000000..fc46c0d --- /dev/null +++ b/lib/chef_zero/solr/query/unary_operator.rb @@ -0,0 +1,41 @@ +module ChefZero + module Solr + module Query + class UnaryOperator + def initialize(operator, operand) + @operator = operator + @operand = operand + end + + def to_s + "#{operator} #{operand}" + end + + attr_reader :operator + attr_reader :operand + + def matches_doc?(doc) + case @operator + when '-', 'NOT' + !operand.matches_doc?(doc) + when '+' + # TODO This operator uses relevance to eliminate other, unrelated + # expressions. +a OR b means "if it has b but not a, don't return it" + raise "+ not supported yet, because it is hard." + end + end + + def matches_values?(values) + case @operator + when '-', 'NOT' + !operand.matches_values?(values) + when '+' + # TODO This operator uses relevance to eliminate other, unrelated + # expressions. +a OR b means "if it has b but not a, don't return it" + raise "+ not supported yet, because it is hard." + end + end + end + end + end +end diff --git a/lib/chef_zero/solr/solr_doc.rb b/lib/chef_zero/solr/solr_doc.rb new file mode 100644 index 0000000..d61a7d5 --- /dev/null +++ b/lib/chef_zero/solr/solr_doc.rb @@ -0,0 +1,62 @@ +module ChefZero + module Solr + # This does what expander does, flattening the json doc into keys and values + # so that solr can search them.
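The flattening that comment describes can be sketched outside the class; `flatten_doc` is our hypothetical stand-in mirroring `SolrDoc#key_values` below:

```ruby
# Hypothetical standalone sketch of the flattening: every nested hash
# contributes both its immediate child keys and underscore-joined
# compound keys, so a search on either 'b:c' or 'a_b:c' can match the
# same document.
def flatten_doc(key_so_far, value, &block)
  if value.is_a?(Hash)
    value.each_pair do |k, v|
      block.call(k, v.to_s)
      if key_so_far
        flatten_doc("#{key_so_far}_#{k}", v, &block)
      elsif v.is_a?(Hash) || v.is_a?(Array)
        flatten_doc(k, v, &block)
      end
    end
  elsif value.is_a?(Array)
    value.each { |v| flatten_doc(key_so_far, v, &block) }
  else
    block.call(key_so_far || 'text', value.to_s)
  end
end

pairs = []
flatten_doc(nil, 'a' => { 'b' => 'c' }) { |k, v| pairs << [k, v] }
# pairs includes the leaf ['b', 'c'] and the compound ['a_b', 'c']
```

Scalars that never pass through a hash fall into the catch-all `'text'` field, which is also the parser's default search field.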
+ class SolrDoc + def initialize(json, id) + @json = json + @id = id + end + + def [](key) + values = matching_values { |match_key| match_key == key } + values[0] + end + + def matching_values(&block) + result = {} + key_values(nil, @json) do |key, value| + if block.call(key) + if result.has_key?(key) + result[key] << value.to_s + else + result[key] = value.to_s.clone + end + end + end + # Handle manufactured value(s) + if block.call('X_CHEF_id_CHEF_X') + if result.has_key?('X_CHEF_id_CHEF_X') + result['X_CHEF_id_CHEF_X'] << @id.to_s + else + result['X_CHEF_id_CHEF_X'] = @id.to_s.clone + end + end + + result.values + end + + private + + def key_values(key_so_far, value, &block) + if value.is_a?(Hash) + value.each_pair do |child_key, child_value| + block.call(child_key, child_value.to_s) + if key_so_far + new_key = "#{key_so_far}_#{child_key}" + key_values(new_key, child_value, &block) + else + key_values(child_key, child_value, &block) if child_value.is_a?(Hash) || child_value.is_a?(Array) + end + end + elsif value.is_a?(Array) + value.each do |child_value| + key_values(key_so_far, child_value, &block) + end + else + block.call(key_so_far || 'text', value.to_s) + end + end + end + end +end diff --git a/lib/chef_zero/solr/solr_parser.rb b/lib/chef_zero/solr/solr_parser.rb new file mode 100644 index 0000000..589b78f --- /dev/null +++ b/lib/chef_zero/solr/solr_parser.rb @@ -0,0 +1,194 @@ +require 'chef_zero/solr/query/binary_operator' +require 'chef_zero/solr/query/unary_operator' +require 'chef_zero/solr/query/term' +require 'chef_zero/solr/query/phrase' +require 'chef_zero/solr/query/range_query' +require 'chef_zero/solr/query/subquery' + +module ChefZero + module Solr + class SolrParser + def initialize(query_string) + @query_string = query_string + @index = 0 + end + + def parse + read_expression + end + + # + # Tokenization + # + def peek_token + @next_token ||= parse_token + end + + def next_token + result = peek_token + @next_token = nil + result + end + + def 
parse_token + # Skip whitespace + skip_whitespace + return nil if eof? + + # Operators + operator = peek_operator_token + if operator + @index+=operator.length + operator + else + # Everything that isn't whitespace or an operator, is part of a term + # (characters plus backslashed escaped characters) + start_index = @index + begin + if @query_string[@index] == '\\' + @index+=1 + end + @index+=1 if !eof? + end until eof? || @query_string[@index] =~ /\s/ || peek_operator_token + @query_string[start_index..@index-1] + end + end + + def skip_whitespace + if @query_string[@index] =~ /\s/ + whitespace = /\s+/.match(@query_string, @index) + @index += whitespace[0].length + end + end + + def peek_operator_token + if ['"', '+', '-', '!', '(', ')', '{', '}', '[', ']', '^', ':'].include?(@query_string[@index]) + return @query_string[@index] + else + result = @query_string[@index..@index+1] + if ['&&', '||'].include?(result) + return result + end + end + nil + end + + def eof? + !@next_token && @index >= @query_string.length + end + + # Parse tree creation + def read_expression + result = read_single_expression + # Expression is over when we hit a close paren or eof + # (peek_token has the side effect of skipping whitespace for us, so we + # really know if we're at eof or not) + until peek_token == ')' || eof? 
+ operator = peek_token + if binary_operator?(operator) + next_token + else + # If 2 terms are next to each other, the default operator is OR + operator = 'OR' + end + next_expression = read_single_expression + + # Build the operator, taking precedence into account + if result.is_a?(Query::BinaryOperator) && + binary_operator_precedence(operator) > binary_operator_precedence(result.operator) + # a+b*c -> a+(b*c) + new_right = Query::BinaryOperator.new(result.right, operator, next_expression) + result = Query::BinaryOperator.new(result.left, result.operator, new_right) + else + # a*b+c -> (a*b)+c + result = Query::BinaryOperator.new(result, operator, next_expression) + end + end + result + end + + def parse_error(token, str) + error = "Error on token '#{token}' at #{@index} of '#{@query_string}': #{str}" + puts error + raise error + end + + def read_single_expression + token = next_token + # If EOF, we have a problem Houston + if !token + parse_error(nil, "Expected expression!") + + # If it's an unary operand, build that + elsif unary_operator?(token) + operand = read_single_expression + # TODO We rely on all unary operators having higher precedence than all + # binary operators. Check if this is the case. 
+ Query::UnaryOperator.new(token, operand) + + # If it's the start of a phrase, read the terms in the phrase + elsif token == '"' + # Read terms until close " + phrase_terms = [] + until (term = next_token) == '"' + phrase_terms << Query::Term.new(term) + end + Query::Phrase.new(phrase_terms) + + # If it's the start of a range query, build that + elsif token == '{' || token == '[' + left = next_token + parse_error(left, "Expected left term in range query") if !left + to = next_token + parse_error(to, "Expected TO in range query") if to != "TO" + right = next_token + parse_error(right, "Expected right term in range query") if !right + end_range = next_token + parse_error(end_range, "Expected end range '}' or ']'") if !['}', ']'].include?(end_range) + Query::RangeQuery.new(left, right, token == '[', end_range == ']') + + elsif token == '(' + subquery = read_expression + close_paren = next_token + parse_error(close_paren, "Expected ')'") if close_paren != ')' + Query::Subquery.new(subquery) + + # If it's the end of a closure, raise an exception + elsif ['}',']',')'].include?(token) + parse_error(token, "Unexpected end paren") + + # If it's a binary operator, raise an exception + elsif binary_operator?(token) + parse_error(token, "Unexpected binary operator") + + # Otherwise it's a term.
+ else + Query::Term.new(token) + end + end + + def unary_operator?(token) + [ 'NOT', '+', '-' ].include?(token) + end + + def binary_operator?(token) + [ 'AND', 'OR', '^', ':'].include?(token) + end + + def binary_operator_precedence(token) + case token + when '^' + 4 + when ':' + 3 + when 'AND' + 2 + when 'OR' + 1 + end + end + + DEFAULT_FIELD = 'text' + end + end +end diff --git a/lib/chef_zero/version.rb b/lib/chef_zero/version.rb new file mode 100644 index 0000000..9b7d21a --- /dev/null +++ b/lib/chef_zero/version.rb @@ -0,0 +1,3 @@ +module ChefZero + VERSION = '0.9' +end diff --git a/test/chef-zero-pedant-config.rb b/test/chef-zero-pedant-config.rb new file mode 100755 index 0000000..39c6729 --- /dev/null +++ b/test/chef-zero-pedant-config.rb @@ -0,0 +1,115 @@ +# Copyright: Copyright (c) 2012 Opscode, Inc. +# License: Apache License, Version 2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +# This annotated Pedant configuration file details the various +# configuration settings available to you. It is separate from the +# actual Pedant::Config class because not all settings have sane +# defaults, and not all settings are appropriate in all settings. + +################################################################################ +# You MUST specify the address of the server the API requests will be +# sent to. Only specify protocol, hostname, and port. 
+chef_server "http://127.0.0.1:8889" + +# If you are doing development testing, you can specify the address of +# the Solr server. The presence of this parameter will enable tests +# to force commits to Solr, greatly decreasing the amount of time +# needed for testing the search endpoint. This is only an +# optimization for development! If you are testing a "live" Chef +# Server, or otherwise do not have access to the Solr server from your +# testing location, you should not specify a value for this parameter. +# The tests will still run, albeit slower, as they will now need to +# poll for a period to ensure they are querying committed results. +#search_server "http://localhost:8983" + +# Related to the 'search_server' parameter, this specifies the maximum +# amount of time (in seconds) that search endpoint requests should be +# retried before giving up. If not explicitly set, it will default to +# 65 seconds; only set it if you know that your Solr commit interval +# differs significantly from this. +maximum_search_time 0 + +# We're starting to break tests up into groups based on different +# criteria. The proper API tests (the results of which are viewable +# to OPC customers) should be the only ones run by Pedant embedded in +# OPC installs. There are other specs that help us keep track of API +# cruft that we want to come back and fix later; these shouldn't be +# viewable to customers, but we should be able to run them in +# development and CI environments. If this parameter is missing or +# explicitly `false`, only the customer-friendly tests will be run. +# +# This is mainly here for documentation purposes, since the +# command-line `opscode-pedant` utility ultimately determines this +# value. +include_internal false + +# Test users. The five users specified below are required; their +# names (:user, :non_org_user, etc.) are indicative of their role +# within the tests. All users must have a ':name' key.
If they have +# a ':create_me' key, Pedant will create these users for you. If you +# are using pre-existing users, you must supply a ':key_file' key, +# which should be the fully-qualified path /on the machine Pedant is +# running on/ to a private key for that user. + +superuser_name 'admin' +superuser_key '../stickywicket.pem' +webui_key '../stickywicket.pem' + + +requestors({ + :clients => { + # The admin user, for the purposes of getting things rolling + :admin => { + :name => "pedant_admin_client", + :create_me => true, + :create_knife => true, + :admin => true + }, + :non_admin => { + :name => 'pedant_client', + :create_me => true, + :create_knife => true + }, + :bad => { + :name => 'bad_client', + :bogus => true + } + }, + :users => { + :admin => { + :name => "admin", + :key_file => "../stickywicket.pem", + :create_me => false, + :create_knife => false, + :admin => true + }, + :non_admin => { + :name => "pedant_non_admin_user", + :create_me => true, + :create_knife => true, + :admin => false + }, + # A user for Knife tests.
A knife.rb and key files will be set up + # for this user + :knife_user => { + :name => "knifey", + :create_me => true, + :create_knife => true + } + } + }) + +self[:tags] = [:validation, :authentication, :authorization] +verify_error_messages false diff --git a/test/run-pedant.rb b/test/run-pedant.rb new file mode 100644 index 0000000..2551d38 --- /dev/null +++ b/test/run-pedant.rb @@ -0,0 +1,18 @@ +#!/usr/bin/env ruby + +require 'rubygems' +$:.unshift(File.expand_path(File.join(File.dirname(__FILE__), "..", "lib"))) +require 'chef_zero/server' + +thread = Thread.new do + server = ChefZero::Server.new(:Port => 8889) + server.start +end + +system('git clone git://github.com/opscode/chef-pedant.git') +#system('cd chef-pedant && git pull') +system('cd chef-pedant && git reset --hard 458a3eed89915ff54913040f0001fd2ccd75511b') +system('cd chef-pedant && bundle install') +result = system('cd chef-pedant && bin/chef-pedant -c ../chef-zero-pedant-config.rb --skip-validation --skip-authentication --skip-authorization') +thread.kill +exit(result) diff --git a/test/stickywicket.pem b/test/stickywicket.pem new file mode 100644 index 0000000..ff09e73 --- /dev/null +++ b/test/stickywicket.pem @@ -0,0 +1,27 @@ +-----BEGIN RSA PRIVATE KEY----- +MIIEpQIBAAKCAQEApNCkX2k+lFGDWRVhX4uClaVQrumG9XXvk6X7M2izrIg7RzMP +Dk4thhZkpx5gr22By7PZQdMEjWC/Zo8MBjtoJ0GV0jw8npefbU1MGKs2dtpYgo0N +Fq8fX8MdFPu4h2W3g0dMEdhT8icc2H4EjhZmdeUhUn3RIEt2duCgp3YDYnUUZx3j +N7MHcTIdzD58ikr6zQrZzHOv+OOI86Xk9EpyEEQizOLoQxkICNrhqN7ElQDuvXaX +BSBrYDRKH2umBMMcXzvsR/SvkmqxoEESSpIlW8zeKAWQ+znNjDC0tmTg7jZmgSP7 +siKrwo4t4ebjcmjpIoi/JKww/nGN3Uhz1ZOZuwIDAQABAoIBAQCaJQD2s0nyEeKU +uKhfYe155Cl3zbWJcQnmv4AXbr9MiAVY6+oS6Q8ur1bn7kNjDzoruENjiuZhC7E3 +TGZklb8tp+tluyy+7vQOmBKpp8fClSfewekR5CultqhGbb8B8yIVR+NfdUHd4rLZ +z9KWyWB+txPZQQ8L80gSmrfmpzs3IuT7oPvmtBU1Wq9QapC4n/rUohHUpUV1du4G +0wCIF4zQTg6cbYW2YXozwVQvw+P7P3RVEqZt+aZlbVcy0fNr6jNao0hi1KFC9OH2 +VjjU+PioreoA/NU3aZPIUzmJpWtsu31yuOZxXmytAkYooCZgiEQNEHnJlNPv0RmC 
+6BPMzVoBAoGBAM7yZoSNJpzdP/q1/4+H3zyy7o4I0VTW9u/GqUzhnbjm5poK30X9 +YXh/7WOVV0OoVqdO6ljRKygP3Oggf41ZEbi1C6bbsO57pksBWgx9bD9V35XscZ0J +F1ERe//kMHwVQy74R8/cIuRwm75haLSBj5/fwGbLeeVDglJkCVqPjtuBAoGBAMvh +qsAGG5k9u6voTcXlFwS+B5YjULhK4NSxdJ2BnOxzYzxQ3IYQZMlb2xt8yZYx/ZZK +wjkr9rcAPEQIQZ2A6NUbGq6qCD7sSmg6UAi0CgiqTokQ/Wtag0UDvFMzwerdg/On +37uxffpxpte8z1jYi/MxRaoTYueuc1UVnqofVIM7AoGBALZJzwPzUY/bVAADUJmd +lYZiFsAGBF42/E05MOgH1GaK/ZWy/fkouDLsfK67XaK7JZk6ajLSDLG9R1kxRym6 +y2FoGFtiKPfo8xIenrNhx3gCrG/jVjB9UYyXWiKNXifukr9M8/SkdBfFGWsZYqGd +fmXVMiVaFoVcce8hLxwWWEABAoGBAKcyhKX/HEj6YFqlIoqkydDAylXs1jicZ27l +rF2yum8KXZpMMdzbutuKsdAD8Ql0K6NB4a+jByuiTMn5/11cJxUEqkgM9sArZQW+ +tH2+r+/VQpyTS0/rpXVGj/2nl2K1kI2T4R36e/aTl6CanWweAf9JK/lC9rxKyxg+ +p6SaFuObAoGACP6TKCkp2oymXlKgdUUgPrnsaz2VAw8jD5QHtx10U4wty0C8gxsk +MLe00h09iLPyFmvJpD+MgbxV/r6RrZeVdsKdU/5LG52YgiVSTaizyy+ciEfW7xoQ +CL5EtZd8Cn5OKinBEzzFpELqunlqepIKCIDOcLKz/cjR+3a+E6Zx5Wo= +-----END RSA PRIVATE KEY-----
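The operator-precedence repair performed by `read_expression` in solr_parser.rb above can be illustrated with a standalone sketch; `Node`, `PREC`, and `combine` are our hypothetical stand-ins for `Query::BinaryOperator` and `binary_operator_precedence`:

```ruby
# Standalone sketch of read_expression's precedence repair: when the
# incoming operator binds tighter than the operator at the root of the
# tree built so far, it steals the root's right operand
# (a OP1 b OP2 c with OP2 tighter than OP1 becomes a OP1 (b OP2 c)).
Node = Struct.new(:left, :op, :right)
PREC = { 'OR' => 1, 'AND' => 2, ':' => 3, '^' => 4 }

def combine(result, op, expr)
  if result.is_a?(Node) && PREC[op] > PREC[result.op]
    Node.new(result.left, result.op, Node.new(result.right, op, expr))
  else
    Node.new(result, op, expr)
  end
end

tree = combine(combine('a', 'OR', 'b'), 'AND', 'c')
# AND outranks OR, so the tree groups as a OR (b AND c)
```

At equal precedence nothing is stolen, so a left-to-right token stream stays left-associative, matching the `a*b+c -> (a*b)+c` comment in the parser.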