File bazel-rules-cc-20190722.obscpio of Package bazel-rules-cc
07070100000000000041ED000003E800000064000000025D359B4200000000000000000000000000000000000000000000002100000000bazel-rules-cc-20190722/.bazelci07070100000001000081A4000003E800000064000000015D359B420000032E000000000000000000000000000000000000002F00000000bazel-rules-cc-20190722/.bazelci/presubmit.yml--- platforms: ubuntu1604: run_targets: build_targets: - "..." test_flags: - "--test_timeout=300" test_targets: - "..." ubuntu1804: run_targets: build_targets: - "..." test_flags: - "--test_timeout=300" test_targets: - "..." ubuntu1804_nojava: run_targets: build_flags: - "--javabase=@openjdk11_linux_archive//:runtime" build_targets: - "..." test_flags: - "--test_timeout=300" - "--javabase=@openjdk11_linux_archive//:runtime" test_targets: - "..." macos: run_targets: build_targets: - "..." test_flags: - "--test_timeout=300" test_targets: - "..." windows: run_targets: build_targets: - "..." test_flags: - "--test_timeout=300" test_targets: - "..." 07070100000002000081A4000003E800000064000000015D359B4200000068000000000000000000000000000000000000001E00000000bazel-rules-cc-20190722/BUILDpackage(default_visibility = ["//visibility:public"]) licenses(["notice"]) exports_files(["LICENSE"]) 07070100000003000081A4000003E800000064000000015D359B420000001E000000000000000000000000000000000000002300000000bazel-rules-cc-20190722/CODEOWNERS* @hlopko @scentini @oquenchil07070100000004000081A4000003E800000064000000015D359B420000044D000000000000000000000000000000000000002800000000bazel-rules-cc-20190722/CONTRIBUTING.md# How to Contribute We'd love to accept your patches and contributions to this project. There are just a few small guidelines you need to follow. ## Contributor License Agreement Contributions to this project must be accompanied by a Contributor License Agreement. You (or your employer) retain the copyright to your contribution; this simply gives us permission to use and redistribute your contributions as part of the project. Head over to <https://cla.developers.google.com/> to see your current agreements on file or to sign a new one. You generally only need to submit a CLA once, so if you've already submitted one (even if it was for a different project), you probably don't need to do it again. ## Code reviews All submissions, including submissions by project members, require review. We use GitHub pull requests for this purpose. Consult [GitHub Help](https://help.github.com/articles/about-pull-requests/) for more information on using pull requests. ## Community Guidelines This project follows [Google's Open Source Community Guidelines](https://opensource.google.com/conduct/). 07070100000005000081A4000003E800000064000000015D359B4200000802000000000000000000000000000000000000002A00000000bazel-rules-cc-20190722/ISSUE_TEMPLATE.md > ATTENTION! Please read and follow: > - if this is a _question_ about how to build / test / query / deploy using Bazel, ask it on StackOverflow instead: https://stackoverflow.com/questions/tagged/bazel > - if this is a _discussion starter_, send it to bazel-discuss@googlegroups.com or cc-bazel-discuss@googlegroups.com > - if this is a _bug_ or _feature request_, fill the form below as best as you can. ### Description of the problem / feature request: > Replace this line with your answer. ### Feature requests: what underlying problem are you trying to solve with this feature? > Replace this line with your answer. ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. > Replace this line with your answer. 
### What operating system are you running Bazel on? > Replace this line with your answer. ### What's the output of `bazel info release`? > Replace this line with your answer. ### If `bazel info release` returns "development version" or "(@non-git)", tell us how you built Bazel. > Replace this line with your answer. ### What version of rules_cc do you use? Can you paste the workspace rule used to fetch rules_cc? What other relevant dependencies does your project have? > Replace this line with your answer. ### What Bazel options do you use to trigger the issue? What C++ toolchain do you use? > Replace this line with your answer. ### Have you found anything relevant by searching the web? > Replace these lines with your answer. > > Places to look: > * StackOverflow: http://stackoverflow.com/questions/tagged/bazel > * GitHub issues: > * https://github.com/bazelbuild/rules_cc/issues > * https://github.com/bazelbuild/bazel/issues > * email threads: > * https://groups.google.com/forum/#!forum/bazel-discuss > * https://groups.google.com/forum/#!forum/cc-bazel-discuss ### Any other information, logs, or outputs that you want to share? > Replace these lines with your answer. > > If the files are large, upload as attachment or provide link. 07070100000006000081A4000003E800000064000000015D359B4200002C5D000000000000000000000000000000000000002000000000bazel-rules-cc-20190722/LICENSE Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.07070100000007000081A4000003E800000064000000015D359B420000086A000000000000000000000000000000000000002200000000bazel-rules-cc-20190722/README.md# C++ rules for Bazel [![Build status](https://badge.buildkite.com/f03592ae2d7d25a2abc2a2ba776e704823fa17fd3e061f5103.svg?branch=master)](https://buildkite.com/bazel/rules-cc) This repository contains Starlark implementation of C++ rules in Bazel. The rules are being incrementally converted from their native implementations in the [Bazel source tree](https://source.bazel.build/bazel/+/master:src/main/java/com/google/devtools/build/lib/rules/cpp/). For the list of C++ rules, see the Bazel [documentation](https://docs.bazel.build/versions/master/be/overview.html). # Getting Started There is no need to use rules from this repository just yet. 
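As a preview of the setup described below, here is a minimal sketch of a BUILD file that consumes the Starlark wrappers from this repository; the target and source names (`greet`, `hello`, `greet.cc`, `greet.h`, `hello.cc`) are hypothetical placeholders, not files shipped in this repo:

```
load("@rules_cc//cc:defs.bzl", "cc_binary", "cc_library")

# Hypothetical library target; the defs.bzl wrappers behave like the native
# rules but also attach the rules_cc migration tag to each target.
cc_library(
    name = "greet",
    srcs = ["greet.cc"],
    hdrs = ["greet.h"],
)

# Hypothetical binary depending on the library above.
cc_binary(
    name = "hello",
    srcs = ["hello.cc"],
    deps = [":greet"],
)
```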
If you want to use rules\_cc anyway, add the following to your WORKSPACE file:

```
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "rules_cc",
    urls = ["https://github.com/bazelbuild/rules_cc/archive/TODO"],
    sha256 = "TODO",
)
```

Then, in your BUILD files, import and use the rules:

```
load("@rules_cc//cc:defs.bzl", "cc_library")

cc_library(
    ...
)
```

# Migration Tools

This repository also contains migration tools that can be used to migrate your project for Bazel incompatible changes.

## Legacy fields migrator

Script that migrates legacy crosstool fields into features ([incompatible flag](https://github.com/bazelbuild/bazel/issues/6861), [tracking issue](https://github.com/bazelbuild/bazel/issues/5883)).

TLDR:

    bazel run @rules_cc//tools/migration:legacy_fields_migrator -- \
      --input=my_toolchain/CROSSTOOL \
      --inline

# Contributing

Bazel and rules_cc are the work of many contributors. We appreciate your help! To contribute, please read the contribution guidelines: [CONTRIBUTING.md](https://github.com/bazelbuild/rules_cc/blob/master/CONTRIBUTING.md).

Note that rules_cc uses the GitHub issue tracker for bug reports and feature requests only. For asking questions see:

* [Stack Overflow](https://stackoverflow.com/questions/tagged/bazel)
* [rules_cc mailing list](https://groups.google.com/forum/#!forum/cc-bazel-discuss)
* Slack channel `#cc` on [slack.bazel.build](https://slack.bazel.build)

bazel-rules-cc-20190722/WORKSPACE

workspace(name = "rules_cc")

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "six_archive",
    urls = [
        "https://mirror.bazel.build/pypi.python.org/packages/source/s/six/six-1.10.0.tar.gz",
        "https://pypi.python.org/packages/source/s/six/six-1.10.0.tar.gz",
    ],
    sha256 = "105f8d68616f8248e24bf0e9372ef04d3cc10104f1980f54d57b2ce73a5ad56a",
    strip_prefix = "six-1.10.0",
    build_file = "@//third_party:six.BUILD",
)

bind(
    name = "six",
    actual = "@six_archive//:six",
)

http_archive(
    name = "com_google_protobuf",
    sha256 = "3e933375ecc58d01e52705479b82f155aea2d02cc55d833f8773213e74f88363",
    strip_prefix = "protobuf-3.7.0",
    urls = [
        "https://mirror.bazel.build/github.com/protocolbuffers/protobuf/archive/protobuf-all-3.7.0.tar.gz",
        "https://github.com/protocolbuffers/protobuf/releases/download/v3.7.0/protobuf-all-3.7.0.tar.gz",
    ],
)

http_archive(
    name = "io_abseil_py",
    sha256 = "74a2203a9b4681851f4f1dfc17f2832e0a16bae0369b288b18b431cea63f0ee9",
    strip_prefix = "abseil-py-pypi-v0.6.1",
    urls = [
        "https://mirror.bazel.build/github.com/abseil/abseil-py/archive/pypi-v0.6.1.zip",
        "https://github.com/abseil/abseil-py/archive/pypi-v0.6.1.zip",
    ],
)

http_archive(
    name = "py_mock",
    sha256 = "b839dd2d9c117c701430c149956918a423a9863b48b09c90e30a6013e7d2f44f",
    urls = [
        "https://mirror.bazel.build/pypi.python.org/packages/source/m/mock/mock-1.0.1.tar.gz",
        "https://pypi.python.org/packages/source/m/mock/mock-1.0.1.tar.gz",
    ],
    strip_prefix = "mock-1.0.1",
    patch_cmds = [
        "mkdir -p py/mock",
        "mv mock.py py/mock/__init__.py",
        """echo 'licenses(["notice"])' > BUILD""",
        "touch py/BUILD",
        """echo 'py_library(name = "mock", srcs = ["__init__.py"], visibility = ["//visibility:public"],)' > py/mock/BUILD""",
    ],
)

# TODO(https://github.com/protocolbuffers/protobuf/issues/5918): Remove when protobuf releases protobuf_deps.bzl.
http_archive(
    name = "net_zlib",
    build_file = "@com_google_protobuf//examples:third_party/zlib.BUILD",
sha256 = "c3e5e9fdd5004dcb542feda5ee4f0ff0744628baf8ed2dd5d66f8ca1197cb1a1", strip_prefix = "zlib-1.2.11", urls = ["https://zlib.net/zlib-1.2.11.tar.gz"], ) bind( name = "zlib", actual = "@net_zlib//:zlib" ) # Go rules and proto support http_archive( name = "io_bazel_rules_go", urls = [ "https://mirror.bazel.build/github.com/bazelbuild/rules_go/releases/download/0.18.6/rules_go-0.18.6.tar.gz", "https://github.com/bazelbuild/rules_go/releases/download/0.18.6/rules_go-0.18.6.tar.gz", ], sha256 = "f04d2373bcaf8aa09bccb08a98a57e721306c8f6043a2a0ee610fd6853dcde3d", ) load("@io_bazel_rules_go//go:deps.bzl", "go_rules_dependencies", "go_register_toolchains") go_rules_dependencies() go_register_toolchains() load("//cc:repositories.bzl", "rules_cc_dependencies") rules_cc_dependencies() 07070100000009000041ED000003E800000064000000025D359B4200000000000000000000000000000000000000000000001B00000000bazel-rules-cc-20190722/cc0707010000000A000081A4000003E800000064000000015D359B420000009E000000000000000000000000000000000000002100000000bazel-rules-cc-20190722/cc/BUILDlicenses(["notice"]) # Apache 2.0 alias( name = "toolchain_type", actual = "@bazel_tools//tools/cpp:toolchain_type", ) exports_files(["defs.bzl"]) 0707010000000B000081A4000003E800000064000000015D359B4200000F43000000000000000000000000000000000000002400000000bazel-rules-cc-20190722/cc/defs.bzl# Copyright 2018 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Starlark rules for building C++ projects.""" load("@bazel_tools//tools/cpp:cc_flags_supplier.bzl", _cc_flags_supplier = "cc_flags_supplier") load("@bazel_tools//tools/cpp:compiler_flag.bzl", _compiler_flag = "compiler_flag") _MIGRATION_TAG = "__CC_RULES_MIGRATION_DO_NOT_USE_WILL_BREAK__" def _add_tags(attrs): if "tags" in attrs and attrs["tags"] != None: attrs["tags"] += [_MIGRATION_TAG] else: attrs["tags"] = [_MIGRATION_TAG] return attrs def cc_binary(**attrs): """Bazel cc_binary rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_binary Args: **attrs: Rule attributes """ native.cc_binary(**_add_tags(attrs)) def cc_test(**attrs): """Bazel cc_test rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_test Args: **attrs: Rule attributes """ native.cc_test(**_add_tags(attrs)) def cc_library(**attrs): """Bazel cc_library rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_library Args: **attrs: Rule attributes """ native.cc_library(**_add_tags(attrs)) def cc_import(**attrs): """Bazel cc_import rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_import Args: **attrs: Rule attributes """ native.cc_import(**_add_tags(attrs)) def cc_proto_library(**attrs): """Bazel cc_proto_library rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_proto_library Args: **attrs: Rule attributes """ native.cc_proto_library(**_add_tags(attrs)) def fdo_prefetch_hints(**attrs): """Bazel fdo_prefetch_hints rule. 
https://docs.bazel.build/versions/master/be/c-cpp.html#fdo_prefetch_hints Args: **attrs: Rule attributes """ native.fdo_prefetch_hints(**_add_tags(attrs)) def fdo_profile(**attrs): """Bazel fdo_profile rule. https://docs.bazel.build/versions/master/be/c-cpp.html#fdo_profile Args: **attrs: Rule attributes """ native.fdo_profile(**_add_tags(attrs)) def cc_toolchain(**attrs): """Bazel cc_toolchain rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_toolchain Args: **attrs: Rule attributes """ native.cc_toolchain(**_add_tags(attrs)) def cc_toolchain_suite(**attrs): """Bazel cc_toolchain_suite rule. https://docs.bazel.build/versions/master/be/c-cpp.html#cc_toolchain_suite Args: **attrs: Rule attributes """ native.cc_toolchain_suite(**_add_tags(attrs)) def objc_library(**attrs): """Bazel objc_library rule. https://docs.bazel.build/versions/master/be/objective-c.html#objc_library Args: **attrs: Rule attributes """ native.objc_library(**_add_tags(attrs)) def objc_import(**attrs): """Bazel objc_import rule. https://docs.bazel.build/versions/master/be/objective-c.html#objc_import Args: **attrs: Rule attributes """ native.objc_import(**_add_tags(attrs)) def cc_flags_supplier(**attrs): """Bazel cc_flags_supplier rule. Args: **attrs: Rule attributes """ _cc_flags_supplier(**_add_tags(attrs)) def compiler_flag(**attrs): """Bazel compiler_flag rule. Args: **attrs: Rule attributes """ _compiler_flag(**_add_tags(attrs)) 0707010000000C000081A4000003E800000064000000015D359B4200000AD0000000000000000000000000000000000000003100000000bazel-rules-cc-20190722/cc/find_cc_toolchain.bzl# pylint: disable=g-bad-file-header # Copyright 2016 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """ Returns the current `CcToolchainInfo`. * When https://github.com/bazelbuild/bazel/issues/7260 is **not** flipped, current C++ toolchain is selected using the legacy mechanism (`--crosstool_top`, `--cpu`, `--compiler`). For that to work the rule needs to declare an `_cc_toolchain` attribute, e.g. foo = rule( implementation = _foo_impl, attrs = { "_cc_toolchain": attr.label(default = Label("@bazel_tools//tools/cpp:current_cc_toolchain")), }, ) * When https://github.com/bazelbuild/bazel/issues/7260 **is** flipped, current C++ toolchain is selected using the toolchain resolution mechanism (`--platforms`). For that to work the rule needs to declare a dependency on C++ toolchain type: foo = rule( implementation = _foo_impl, toolchains = ["@rules_cc//cc:toolchain_type"], ) We advise to depend on both `_cc_toolchain` attr and `@rules_cc//cc:toolchain_type` for the duration of the migration. After https://github.com/bazelbuild/bazel/issues/7260 is flipped (and support for old Bazel version is not needed), it's enough to only keep the `@rules_cc//cc:toolchain_type`. """ def find_cc_toolchain(ctx): """ Returns the current `CcToolchainInfo`. Args: ctx: The rule context for which to find a toolchain. Returns: A CcToolchainInfo. """ # Check the incompatible flag for toolchain resolution. 
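    # When the incompatible flag is flipped, Bazel exposes the selected C++
    # toolchain via toolchain resolution (ctx.toolchains); before the flip, rules
    # are expected to carry the legacy implicit "_cc_toolchain" attribute. The
    # code below tries the resolution-based lookup first and then falls back to
    # the attribute, matching the migration advice in the module docstring above.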
    if hasattr(cc_common, "is_cc_toolchain_resolution_enabled_do_not_use") and cc_common.is_cc_toolchain_resolution_enabled_do_not_use(ctx = ctx):
        if "@rules_cc//cc:toolchain_type" in ctx.toolchains:
            return ctx.toolchains["@rules_cc//cc:toolchain_type"]
        fail("In order to use find_cc_toolchain, you must include the '@rules_cc//cc:toolchain_type' in the toolchains argument to your rule.")

    # Fall back to the legacy implicit attribute lookup.
    if hasattr(ctx.attr, "_cc_toolchain"):
        return ctx.attr._cc_toolchain[cc_common.CcToolchainInfo]

    # We didn't find anything.
    fail("In order to use find_cc_toolchain, you must define the '_cc_toolchain' attribute on your rule or aspect.")

bazel-rules-cc-20190722/cc/repositories.bzl

"""Repository rules entry point module for rules_cc."""

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

def rules_cc_dependencies():
    _maybe(
        http_archive,
        name = "bazel_skylib",
        sha256 = "2ea8a5ed2b448baf4a6855d3ce049c4c452a6470b1efd1504fdb7c1c134d220a",
        strip_prefix = "bazel-skylib-0.8.0",
        urls = [
            "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/archive/0.8.0.tar.gz",
            "https://github.com/bazelbuild/bazel-skylib/archive/0.8.0.tar.gz",
        ],
    )

def _maybe(repo_rule, name, **kwargs):
    if not native.existing_rule(name):
        repo_rule(name = name, **kwargs)

bazel-rules-cc-20190722/examples/

bazel-rules-cc-20190722/examples/BUILD

# Copyright 2019 The Bazel Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# A collection of examples showing the usage of rules_cc

licenses(["notice"])

bazel-rules-cc-20190722/examples/my_c_archive/

bazel-rules-cc-20190722/examples/my_c_archive/BUILD

# Copyright 2019 The Bazel Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
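# The targets below chain three rules together: my_c_compile turns foo.c into an
# object file, my_c_archive packs that object into a static archive and exposes
# it through CcInfo, and the cc_binary "main" links against the archive like any
# other C++ dependency; the ordinary cc_library "bar" is pulled in through the
# archive rule's deps.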
# Example showing how to create a custom Starlark rule that rules_cc can depend on licenses(["notice"]) load("//examples/my_c_compile:my_c_compile.bzl", "my_c_compile") load("//examples/my_c_archive:my_c_archive.bzl", "my_c_archive") load("@rules_cc//cc:defs.bzl", "cc_binary") cc_binary( name = "main", srcs = ["main.c"], deps = [":archive"], ) my_c_archive( name = "archive", object = ":object", deps = [":bar"], ) my_c_compile( name = "object", src = "foo.c", ) cc_library( name = "bar", srcs = ["bar.c"], ) 07070100000012000081A4000003E800000064000000015D359B420000001A000000000000000000000000000000000000003400000000bazel-rules-cc-20190722/examples/my_c_archive/bar.cint bar() { return -42; } 07070100000013000081A4000003E800000064000000015D359B4200000281000000000000000000000000000000000000003400000000bazel-rules-cc-20190722/examples/my_c_archive/foo.c// Copyright 2019 The Bazel Authors. All rights reserved. // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. int foo() { return 42; } 07070100000014000081A4000003E800000064000000015D359B420000003B000000000000000000000000000000000000003500000000bazel-rules-cc-20190722/examples/my_c_archive/main.cint foo(); int bar(); int main() { return foo() + bar(); } 07070100000015000081A4000003E800000064000000015D359B4200000DE3000000000000000000000000000000000000003F00000000bazel-rules-cc-20190722/examples/my_c_archive/my_c_archive.bzl# Copyright 2019 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
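# The rule below demonstrates the cc_common pipeline for producing a static
# archive that other C++ rules can consume: configure_features(), then
# create_library_to_link() and create_linking_context() to describe the output,
# get_tool_for_action() and create_link_variables() plus
# get_memory_inefficient_command_line() to assemble the archiver command line,
# ctx.actions.run() to execute the archiver, and finally merge_cc_infos() to
# combine the resulting CcInfo with the CcInfo of the rule's deps.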
"""Example showing how to create a rule that rules_cc can depend on.""" load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain") load("@bazel_tools//tools/build_defs/cc:action_names.bzl", "CPP_LINK_STATIC_LIBRARY_ACTION_NAME") load("//examples/my_c_compile:my_c_compile.bzl", "MyCCompileInfo") def _my_c_archive_impl(ctx): cc_toolchain = find_cpp_toolchain(ctx) object_file = ctx.attr.object[MyCCompileInfo].object output_file = ctx.actions.declare_file(ctx.label.name + ".a") feature_configuration = cc_common.configure_features( ctx = ctx, cc_toolchain = cc_toolchain, requested_features = ctx.features, unsupported_features = ctx.disabled_features, ) library_to_link = cc_common.create_library_to_link( actions = ctx.actions, feature_configuration = feature_configuration, cc_toolchain = cc_toolchain, static_library = output_file, ) compilation_context = cc_common.create_compilation_context() linking_context = cc_common.create_linking_context(libraries_to_link = [library_to_link]) archiver_path = cc_common.get_tool_for_action( feature_configuration = feature_configuration, action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME, ) archiver_variables = cc_common.create_link_variables( feature_configuration = feature_configuration, cc_toolchain = cc_toolchain, output_file = output_file.path, is_using_linker = False, ) command_line = cc_common.get_memory_inefficient_command_line( feature_configuration = feature_configuration, action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME, variables = archiver_variables, ) args = ctx.actions.args() args.add_all(command_line) args.add(object_file) env = cc_common.get_environment_variables( feature_configuration = feature_configuration, action_name = CPP_LINK_STATIC_LIBRARY_ACTION_NAME, variables = archiver_variables, ) ctx.actions.run( executable = archiver_path, arguments = [args], env = env, inputs = depset( direct = [object_file], transitive = [ cc_toolchain.all_files, ], ), outputs = [output_file], ) cc_info = cc_common.merge_cc_infos(cc_infos = [ CcInfo(compilation_context = compilation_context, linking_context = linking_context), ] + [dep[CcInfo] for dep in ctx.attr.deps]) return [cc_info] my_c_archive = rule( implementation = _my_c_archive_impl, attrs = { "object": attr.label(mandatory = True, providers = [MyCCompileInfo]), "deps": attr.label_list(providers = [CcInfo]), "_cc_toolchain": attr.label(default = Label("@bazel_tools//tools/cpp:current_cc_toolchain")), }, fragments = ["cpp"], toolchains = ["@bazel_tools//tools/cpp:toolchain_type"], ) 07070100000016000041ED000003E800000064000000025D359B4200000000000000000000000000000000000000000000002E00000000bazel-rules-cc-20190722/examples/my_c_compile07070100000017000081A4000003E800000064000000015D359B420000033C000000000000000000000000000000000000003400000000bazel-rules-cc-20190722/examples/my_c_compile/BUILD# Copyright 2019 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
# Example showing how to create a custom Starlark rule that just compiles C sources licenses(["notice"]) load("//examples/my_c_compile:my_c_compile.bzl", "my_c_compile") my_c_compile( name = "foo", src = "foo.c", ) 07070100000018000081A4000003E800000064000000015D359B4200000281000000000000000000000000000000000000003400000000bazel-rules-cc-20190722/examples/my_c_compile/foo.c// Copyright 2019 The Bazel Authors. All rights reserved. // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. int foo() { return 42; } 07070100000019000081A4000003E800000064000000015D359B4200000BC8000000000000000000000000000000000000003F00000000bazel-rules-cc-20190722/examples/my_c_compile/my_c_compile.bzl# Copyright 2019 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
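# The rule below follows the usual cc_common pipeline for a custom compile
# action: configure_features(), get_tool_for_action() to locate the C compiler,
# create_compile_variables() and get_memory_inefficient_command_line() to build
# the argument list, get_environment_variables() for the action environment, and
# ctx.actions.run() to invoke the compiler on the single source file.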
"""Example showing how to create a rule that just compiles C sources.""" load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain") load("@bazel_tools//tools/build_defs/cc:action_names.bzl", "C_COMPILE_ACTION_NAME") MyCCompileInfo = provider(doc = "", fields = ["object"]) DISABLED_FEATURES = [ # "module_maps", # copybara-comment-this-out-please ] def _my_c_compile_impl(ctx): cc_toolchain = find_cpp_toolchain(ctx) source_file = ctx.file.src output_file = ctx.actions.declare_file(ctx.label.name + ".o") feature_configuration = cc_common.configure_features( ctx = ctx, cc_toolchain = cc_toolchain, requested_features = ctx.features, unsupported_features = DISABLED_FEATURES + ctx.disabled_features, ) c_compiler_path = cc_common.get_tool_for_action( feature_configuration = feature_configuration, action_name = C_COMPILE_ACTION_NAME, ) c_compile_variables = cc_common.create_compile_variables( feature_configuration = feature_configuration, cc_toolchain = cc_toolchain, user_compile_flags = ctx.fragments.cpp.copts + ctx.fragments.cpp.conlyopts, source_file = source_file.path, output_file = output_file.path, ) command_line = cc_common.get_memory_inefficient_command_line( feature_configuration = feature_configuration, action_name = C_COMPILE_ACTION_NAME, variables = c_compile_variables, ) env = cc_common.get_environment_variables( feature_configuration = feature_configuration, action_name = C_COMPILE_ACTION_NAME, variables = c_compile_variables, ) ctx.actions.run( executable = c_compiler_path, arguments = command_line, env = env, inputs = depset( items = [source_file], transitive = [cc_toolchain.all_files], ), outputs = [output_file], ) return [ DefaultInfo(files = depset(items = [output_file])), MyCCompileInfo(object = output_file), ] my_c_compile = rule( implementation = _my_c_compile_impl, attrs = { "src": attr.label(mandatory = True, allow_single_file = True), "_cc_toolchain": attr.label(default = Label("@bazel_tools//tools/cpp:current_cc_toolchain")), }, toolchains = ["@bazel_tools//tools/cpp:toolchain_type"], fragments = ["cpp"], ) 0707010000001A000041ED000003E800000064000000025D359B4200000000000000000000000000000000000000000000003800000000bazel-rules-cc-20190722/examples/write_cc_toolchain_cpu0707010000001B000081A4000003E800000064000000015D359B420000034A000000000000000000000000000000000000003E00000000bazel-rules-cc-20190722/examples/write_cc_toolchain_cpu/BUILD# Copyright 2019 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # Example showing how to get CcToolchainInfo in a custom starlark rule licenses(["notice"]) load("//examples/write_cc_toolchain_cpu:write_cc_toolchain_cpu.bzl", "write_cc_toolchain_cpu") write_cc_toolchain_cpu(name = "write_me_the_cpu") 0707010000001C000081A4000003E800000064000000015D359B420000054B000000000000000000000000000000000000005300000000bazel-rules-cc-20190722/examples/write_cc_toolchain_cpu/write_cc_toolchain_cpu.bzl# Copyright 2019 The Bazel Authors. All rights reserved. 
# # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Example showing how to get CcToolchainInfo in a custom rule.""" load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain") def _write_cc_toolchain_cpu_impl(ctx): cc_toolchain = find_cpp_toolchain(ctx) output = ctx.actions.declare_file(ctx.label.name + "_cpu") ctx.actions.write(output, cc_toolchain.cpu) return [DefaultInfo(files = depset([output]))] # This rule does nothing, just writes the target_cpu from the cc_toolchain used for this build. write_cc_toolchain_cpu = rule( implementation = _write_cc_toolchain_cpu_impl, attrs = { "_cc_toolchain": attr.label(default = Label("@bazel_tools//tools/cpp:current_cc_toolchain")), }, toolchains = ["@bazel_tools//tools/cpp:toolchain_type"], ) 0707010000001D000081A4000003E800000064000000015D359B4200000029000000000000000000000000000000000000002600000000bazel-rules-cc-20190722/renovate.json{ "extends": [ "config:base" ] } 0707010000001E000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000002400000000bazel-rules-cc-20190722/third_party0707010000001F000081A4000003E800000064000000015D359B4200000042000000000000000000000000000000000000002A00000000bazel-rules-cc-20190722/third_party/BUILD# Intentionally empty, only there to make //third_party a package.07070100000020000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000002800000000bazel-rules-cc-20190722/third_party/com07070100000021000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000002F00000000bazel-rules-cc-20190722/third_party/com/github07070100000022000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000003A00000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild07070100000023000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000004000000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild/bazel07070100000024000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000004400000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild/bazel/src07070100000025000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000004900000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild/bazel/src/main07070100000026000041ED000003E800000064000000025D359B4200000000000000000000000000000000000000000000005200000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild/bazel/src/main/protobuf07070100000027000081A4000003E800000064000000015D359B42000002E7000000000000000000000000000000000000005800000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild/bazel/src/main/protobuf/BUILDlicenses(["notice"]) # Apache 2.0 load("@com_google_protobuf//:protobuf.bzl", "py_proto_library") load("@io_bazel_rules_go//proto:def.bzl", "go_proto_library") py_proto_library( name = "crosstool_config_py_pb2", srcs = ["crosstool_config.proto"], visibility = [ "//tools/migration:__pkg__", ], ) proto_library( name = "crosstool_config_pb2", srcs = 
["crosstool_config.proto"], visibility = [ "//tools/migration:__pkg__", ], ) go_proto_library( name = "crosstool_config_go_proto", importpath = "third_party/com/github/bazelbuild/bazel/src/main/protobuf/crosstool_config_go_proto", proto = ":crosstool_config_pb2", visibility = [ "//tools/migration:__pkg__", ], ) 07070100000028000081A4000003E800000064000000015D359B4200005306000000000000000000000000000000000000006900000000bazel-rules-cc-20190722/third_party/com/github/bazelbuild/bazel/src/main/protobuf/crosstool_config.proto// Copyright 2014 The Bazel Authors. All rights reserved. // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. // You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. // // File format for Blaze to configure Crosstool releases. syntax = "proto2"; // option java_api_version = 2; // copybara-comment-this-out-please option java_package = "com.google.devtools.build.lib.view.config.crosstool"; package com.google.devtools.build.lib.view.config.crosstool; // A description of a toolchain, which includes all the tools generally expected // to be available for building C/C++ targets, based on the GNU C compiler. // // System and cpu names are two overlapping concepts, which need to be both // supported at this time. The cpu name is the blaze command-line name for the // target system. The most common values are 'k8' and 'piii'. The system name is // a more generic identification of the executable system, based on the names // used by the GNU C compiler. // // Typically, the system name contains an identifier for the cpu (e.g. x86_64 or // alpha), an identifier for the machine (e.g. pc, or unknown), and an // identifier for the operating system (e.g. cygwin or linux-gnu). Typical // examples are 'x86_64-unknown-linux-gnu' and 'i686-unknown-cygwin'. // // The system name is used to determine if a given machine can execute a given // executable. In particular, it is used to check if the compilation products of // a toolchain can run on the host machine. message CToolchain { // A group of correlated flags. Supports parametrization via variable // expansion. // // To expand a variable of list type, flag_group has to be annotated with // `iterate_over` message. Then all nested flags or flag_groups will be // expanded repeatedly for each element of the list. // // For example: // flag_group { // iterate_over: 'include_path' // flag: '-I' // flag: '%{include_path}' // } // ... will get expanded to -I /to/path1 -I /to/path2 ... for each // include_path /to/pathN. // // To expand a variable of structure type, use dot-notation, e.g.: // flag_group { // iterate_over: "libraries_to_link" // flag_group { // iterate_over: "libraries_to_link.libraries" // flag: "-L%{libraries_to_link.libraries.directory}" // } // } // // Flag groups can be nested; if they are, the flag group must only contain // other flag groups (no flags) so the order is unambiguously specified. // In order to expand a variable of nested lists, 'iterate_over' can be used. 
// // For example: // flag_group { // iterate_over: 'object_files' // flag_group { flag: '--start-lib' } // flag_group { // iterate_over: 'object_files' // flag: '%{object_files}' // } // flag_group { flag: '--end-lib' } // } // ... will get expanded to // --start-lib a1.o a2.o ... --end-lib --start-lib b1.o b2.o .. --end-lib // with %{object_files} being a variable of nested list type // [['a1.o', 'a2.o', ...], ['b1.o', 'b2.o', ...], ...]. // // TODO(bazel-team): Write more elaborate documentation and add a link to it. message FlagGroup { repeated string flag = 1; repeated FlagGroup flag_group = 2; optional string iterate_over = 3; repeated string expand_if_all_available = 4; repeated string expand_if_none_available = 5; optional string expand_if_true = 6; optional string expand_if_false = 7; optional VariableWithValue expand_if_equal = 8; } message VariableWithValue { required string variable = 1; required string value = 2; } // A key/value pair to be added as an environment variable. The value of // this pair is expanded in the same way as is described in FlagGroup. // The key remains an unexpanded string literal. message EnvEntry { required string key = 1; required string value = 2; } // A set of features; used to support logical 'and' when specifying feature // requirements in Feature. message FeatureSet { repeated string feature = 1; } // A set of positive and negative features. This stanza will // evaluate to true when every 'feature' is enabled, and every // 'not_feature' is not enabled. message WithFeatureSet { repeated string feature = 1; repeated string not_feature = 2; } // A set of flags that are expanded in the command line for specific actions. message FlagSet { // The actions this flag set applies to; each flag set must specify at // least one action. repeated string action = 1; // The flags applied via this flag set. repeated FlagGroup flag_group = 2; // A list of feature sets defining when this flag set gets applied. The // flag set will be applied when any one of the feature sets evaluate to // true. (That is, when when every 'feature' is enabled, and every // 'not_feature' is not enabled.) // // If 'with_feature' is omitted, the flag set will be applied // unconditionally for every action specified. repeated WithFeatureSet with_feature = 3; // Deprecated (https://github.com/bazelbuild/bazel/issues/7008) - use // expand_if_all_available in flag_group // // A list of build variables that this feature set needs, but which are // allowed to not be set. If any of the build variables listed is not // set, the feature set will not be expanded. // // NOTE: Consider alternatives before using this; usually tools should // consistently create the same set of files, even if empty; use this // only for backwards compatibility with already existing behavior in tools // that are currently not worth changing. repeated string expand_if_all_available = 4; } // A set of environment variables that are expanded in the command line for // specific actions. message EnvSet { // The actions this env set applies to; each env set must specify at // least one action. repeated string action = 1; // The environment variables applied via this env set. repeated EnvEntry env_entry = 2; // A list of feature sets defining when this env set gets applied. The // env set will be applied when any one of the feature sets evaluate to // true. (That is, when when every 'feature' is enabled, and every // 'not_feature' is not enabled.) 
// // If 'with_feature' is omitted, the env set will be applied // unconditionally for every action specified. repeated WithFeatureSet with_feature = 3; } // Contains all flag specifications for one feature. // Next ID: 8 message Feature { // The feature's name. Feature names are generally defined by Bazel; it is // possible to introduce a feature without a change to Bazel by adding a // 'feature' section to the toolchain and adding the corresponding string as // feature in the BUILD file. optional string name = 1; // If 'true', this feature is enabled unless a rule type explicitly marks it // as unsupported. Such features cannot be turned off from within a BUILD // file or the command line. optional bool enabled = 7; // If the given feature is enabled, the flag sets will be applied for the // actions in the modes that they are specified for. repeated FlagSet flag_set = 2; // If the given feature is enabled, the env sets will be applied for the // actions in the modes that they are specified for. repeated EnvSet env_set = 6; // A list of feature sets defining when this feature is supported by the // toolchain. The feature is supported if any of the feature sets fully // apply, that is, when all features of a feature set are enabled. // // If 'requires' is omitted, the feature is supported independently of which // other features are enabled. // // Use this for example to filter flags depending on the build mode // enabled (opt / fastbuild / dbg). repeated FeatureSet requires = 3; // A list of features or action configs that are automatically enabled when // this feature is enabled. If any of the implied features or action configs // cannot be enabled, this feature will (silently) not be enabled either. repeated string implies = 4; // A list of names this feature conflicts with. // A feature cannot be enabled if: // - 'provides' contains the name of a different feature or action config // that we want to enable. // - 'provides' contains the same value as a 'provides' in a different // feature or action config that we want to enable. // // Use this in order to ensure that incompatible features cannot be // accidentally activated at the same time, leading to hard to diagnose // compiler errors. repeated string provides = 5; } // Describes a tool associated with a crosstool action config. message Tool { // Path to the tool, relative to the location of the crosstool. required string tool_path = 1; // A list of feature sets defining when this tool is applicable. The tool // will used when any one of the feature sets evaluate to true. (That is, // when when every 'feature' is enabled, and every 'not_feature' is not // enabled.) // // If 'with_feature' is omitted, the tool will apply for any feature // configuration. repeated WithFeatureSet with_feature = 2; // Requirements on the execution environment for the execution of this tool, // to be passed as out-of-band "hints" to the execution backend. // Ex. "requires-darwin" repeated string execution_requirement = 3; } // The name for an artifact of a given category of input or output artifacts // to an action. message ArtifactNamePattern { // The category of artifacts that this selection applies to. This field // is compared against a list of categories defined in bazel. Example // categories include "linked_output" or "debug_symbols". An error is thrown // if no category is matched. required string category_name = 1; // The prefix and extension for creating the artifact for this selection. 
// They are used to create an artifact name based on the target name. required string prefix = 2; required string extension = 3; } // An action config corresponds to a blaze action, and allows selection of // a tool based on activated features. Action configs come in two varieties: // automatic (the blaze action will exist whether or not the action config // is activated) and attachable (the blaze action will be added to the // action graph only if the action config is activated). // // Action config activation occurs by the same semantics as features: a // feature can 'require' or 'imply' an action config in the same way that it // would another feature. // Next ID: 9 message ActionConfig { // The name other features will use to activate this action config. Can // be the same as action_name. required string config_name = 1; // The name of the blaze action that this config applies to, ex. 'c-compile' // or 'c-module-compile'. required string action_name = 2; // If 'true', this feature is enabled unless a rule type explicitly marks it // as unsupported. Such action_configs cannot be turned off from within a // BUILD file or the command line. optional bool enabled = 8; // The tool applied to the action will be the first Tool with a feature // set that matches the feature configuration. An error will be thrown // if no tool matches a provided feature configuration - for that reason, // it's a good idea to provide a default tool with an empty feature set. repeated Tool tool = 3; // If the given action config is enabled, the flag sets will be applied // to the corresponding action. repeated FlagSet flag_set = 4; // If the given action config is enabled, the env sets will be applied // to the corresponding action. repeated EnvSet env_set = 5; // A list of feature sets defining when this action config // is supported by the toolchain. The action config is supported if any of // the feature sets fully apply, that is, when all features of a // feature set are enabled. // // If 'requires' is omitted, the action config is supported independently // of which other features are enabled. // // Use this for example to filter actions depending on the build // mode enabled (opt / fastbuild / dbg). repeated FeatureSet requires = 6; // A list of features or action configs that are automatically enabled when // this action config is enabled. If any of the implied features or action // configs cannot be enabled, this action config will (silently) // not be enabled either. repeated string implies = 7; } repeated Feature feature = 50; repeated ActionConfig action_config = 53; repeated ArtifactNamePattern artifact_name_pattern = 54; // The unique identifier of the toolchain within the crosstool release. It // must be possible to use this as a directory name in a path. // It has to match the following regex: [a-zA-Z_][\.\- \w]* required string toolchain_identifier = 1; // A basic toolchain description. required string host_system_name = 2; required string target_system_name = 3; required string target_cpu = 4; required string target_libc = 5; required string compiler = 6; required string abi_version = 7; required string abi_libc_version = 8; // Tool locations. Relative paths are resolved relative to the configuration // file directory. // NOTE: DEPRECATED. Prefer specifying an ActionConfig for the action that // needs the tool. // TODO(b/27903698) migrate to ActionConfig. repeated ToolPath tool_path = 9; // Feature flags. // TODO(bazel-team): Sink those into 'Feature' instances. // Legacy field, ignored by Bazel. 
optional bool supports_gold_linker = 10 [default = false]; // Legacy field, ignored by Bazel. optional bool supports_thin_archives = 11 [default = false]; // Legacy field, use 'supports_start_end_lib' feature instead. optional bool supports_start_end_lib = 28 [default = false]; // Legacy field, use 'supports_interface_shared_libraries' instead. optional bool supports_interface_shared_objects = 32 [default = false]; // Legacy field, use 'static_link_cpp_runtimes' feature instead. optional bool supports_embedded_runtimes = 40 [default = false]; // If specified, Blaze finds statically linked / dynamically linked runtime // libraries in the declared crosstool filegroup. Otherwise, Blaze // looks in "[static|dynamic]-runtime-libs-$TARGET_CPU". // Deprecated, see https://github.com/bazelbuild/bazel/issues/6942 optional string static_runtimes_filegroup = 45; // Deprecated, see https://github.com/bazelbuild/bazel/issues/6942 optional string dynamic_runtimes_filegroup = 46; // Legacy field, ignored by Bazel. optional bool supports_incremental_linker = 41 [default = false]; // Legacy field, ignored by Bazel. optional bool supports_normalizing_ar = 26 [default = false]; // Legacy field, use 'per_object_debug_info' feature instead. optional bool supports_fission = 43 [default = false]; // Legacy field, ignored by Bazel. optional bool supports_dsym = 51 [default = false]; // Legacy field, use 'supports_pic' feature instead optional bool needsPic = 12 [default = false]; // Compiler flags for C/C++/Asm compilation. repeated string compiler_flag = 13; // Additional compiler flags for C++ compilation. repeated string cxx_flag = 14; // Additional unfiltered compiler flags for C/C++/Asm compilation. // These are not subject to nocopt filtering in cc_* rules. // Note: These flags are *not* applied to objc/objc++ compiles. repeated string unfiltered_cxx_flag = 25; // Linker flags. repeated string linker_flag = 15; // Additional linker flags when linking dynamic libraries. repeated string dynamic_library_linker_flag = 27; // Additional test-only linker flags. repeated string test_only_linker_flag = 49; // Objcopy flags for embedding files into binaries. repeated string objcopy_embed_flag = 16; // Ld flags for embedding files into binaries. This is used by filewrapper // since it calls ld directly and needs to know what -m flag to pass. repeated string ld_embed_flag = 23; // Ar flags for combining object files into archives. If this is not set, it // defaults to "rcsD". // TODO(b/37271982): Remove after blaze with ar action_config release repeated string ar_flag = 47; // Legacy field, ignored by Bazel. repeated string ar_thin_archives_flag = 48; // Legacy field, ignored by Bazel. repeated string gcc_plugin_compiler_flag = 34; // Additional compiler and linker flags depending on the compilation mode. repeated CompilationModeFlags compilation_mode_flags = 17; // Additional linker flags depending on the linking mode. repeated LinkingModeFlags linking_mode_flags = 18; // Legacy field, ignored by Bazel. repeated string gcc_plugin_header_directory = 19; // Legacy field, ignored by Bazel. repeated string mao_plugin_header_directory = 20; // Make variables that are made accessible to rules. repeated MakeVariable make_variable = 21; // Built-in include directories for C++ compilation. These should be the exact // paths used by the compiler, and are generally relative to the exec root. // The paths used by the compiler can be determined by 'gcc -Wp,-v some.c'. 
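// For example (an illustrative gcc installation path; the actual directories
// are entirely toolchain specific):
//
//   cxx_builtin_include_directory: "/usr/lib/gcc/x86_64-linux-gnu/7/include"
//   cxx_builtin_include_directory: "/usr/include"
//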
// We currently use the C++ paths also for C compilation, which is safe as // long as there are no name clashes between C++ and C header files. // // Relative paths are resolved relative to the configuration file directory. // // If the compiler has --sysroot support, then these paths should use // %sysroot% rather than the include path, and specify the sysroot attribute // in order to give blaze the information necessary to make the correct // replacements. repeated string cxx_builtin_include_directory = 22; // The built-in sysroot. If this attribute is not present, blaze does not // allow using a different sysroot, i.e. through the --grte_top option. Also // see the documentation above. optional string builtin_sysroot = 24; // Legacy field, ignored by Bazel. optional string default_python_top = 29; // Legacy field, ignored by Bazel. optional string default_python_version = 30; // Legacy field, ignored by Bazel. optional bool python_preload_swigdeps = 42; // The default GRTE to use. This should be a label, and gets the same // treatment from Blaze as the --grte_top option. This setting is only used in // the absence of an explicit --grte_top option. If unset, Blaze will not pass // -sysroot by default. The local part must be 'everything', i.e., // '//some/label:everything'. There can only be one GRTE library per package, // because the compiler expects the directory as a parameter of the -sysroot // option. // This may only be set to a non-empty value if builtin_sysroot is also set! optional string default_grte_top = 31; // Legacy field, ignored by Bazel. repeated string debian_extra_requires = 33; // Legacy field, ignored by Bazel. Only there for compatibility with // things internal to Google. optional string cc_target_os = 55; // Next free id: 56 } message ToolPath { required string name = 1; required string path = 2; } enum CompilationMode { FASTBUILD = 1; DBG = 2; OPT = 3; // This value is ignored and should not be used in new files. COVERAGE = 4; } message CompilationModeFlags { required CompilationMode mode = 1; repeated string compiler_flag = 2; repeated string cxx_flag = 3; // Linker flags that are added when compiling in a certain mode. repeated string linker_flag = 4; } enum LinkingMode { FULLY_STATIC = 1; MOSTLY_STATIC = 2; DYNAMIC = 3; MOSTLY_STATIC_LIBRARIES = 4; } message LinkingModeFlags { required LinkingMode mode = 1; repeated string linker_flag = 2; } message MakeVariable { required string name = 1; required string value = 2; } message DefaultCpuToolchain { required string cpu = 1; required string toolchain_identifier = 2; } // An entire crosstool release, containing the version number, and a set of // toolchains. message CrosstoolRelease { // The major and minor version of the crosstool release. required string major_version = 1; required string minor_version = 2; // Legacy field, ignored by Bazel. optional string default_target_cpu = 3; // Legacy field, ignored by Bazel. repeated DefaultCpuToolchain default_toolchain = 4; // All the toolchains in this release. repeated CToolchain toolchain = 5; } 07070100000029000081A4000003E800000064000000015D359B420000012B000000000000000000000000000000000000002E00000000bazel-rules-cc-20190722/third_party/six.BUILD# Description: # Six provides simple utilities for wrapping over differences between Python 2 # and Python 3. 
licenses(["notice"]) # MIT exports_files(["LICENSE"]) py_library( name = "six", srcs = ["six.py"], srcs_version = "PY2AND3", visibility = ["//visibility:public"], ) 0707010000002A000041ED000003E800000064000000035D359B4200000000000000000000000000000000000000000000001E00000000bazel-rules-cc-20190722/tools0707010000002B000041ED000003E800000064000000025D359B4200000000000000000000000000000000000000000000002800000000bazel-rules-cc-20190722/tools/migration0707010000002C000081A4000003E800000064000000015D359B4200000F12000000000000000000000000000000000000002E00000000bazel-rules-cc-20190722/tools/migration/BUILD# Copyright 2018 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. package(default_visibility = ["//visibility:public"]) # Go rules load("@io_bazel_rules_go//go:def.bzl", "go_binary", "go_library", "go_test") licenses(["notice"]) # Apache 2.0 py_binary( name = "legacy_fields_migrator", srcs = ["legacy_fields_migrator.py"], python_version = "PY2", deps = [ ":legacy_fields_migration_lib", "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", "@io_abseil_py//absl:app", "@io_abseil_py//absl/flags", ], ) py_library( name = "legacy_fields_migration_lib", srcs = ["legacy_fields_migration_lib.py"], deps = [ "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", ], ) py_test( name = "legacy_fields_migration_lib_test", srcs = ["legacy_fields_migration_lib_test.py"], python_version = "PY2", deps = [ ":legacy_fields_migration_lib", "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", ], ) py_binary( name = "crosstool_query", srcs = ["crosstool_query.py"], python_version = "PY2", deps = [ "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", "@io_abseil_py//absl:app", "@io_abseil_py//absl/flags", ], ) py_binary( name = "ctoolchain_comparator", srcs = ["ctoolchain_comparator.py"], python_version = "PY2", deps = [ ":ctoolchain_comparator_lib", "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", "@io_abseil_py//absl:app", "@io_abseil_py//absl/flags", ], ) py_library( name = "ctoolchain_comparator_lib", srcs = ["ctoolchain_comparator_lib.py"], deps = [ "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", ], ) py_test( name = "ctoolchain_comparator_lib_test", srcs = ["ctoolchain_comparator_lib_test.py"], python_version = "PY2", deps = [ ":ctoolchain_comparator_lib", "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_py_pb2", "@py_mock//py/mock", ], ) go_binary( name = "convert_crosstool_to_starlark", srcs = ["convert_crosstool_to_starlark.go"], deps = [ ":crosstooltostarlarklib", "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_go_proto", "@com_github_golang_protobuf//proto:go_default_library", ], ) go_library( name = "crosstooltostarlarklib", srcs = ["crosstool_to_starlark_lib.go"], importpath = "tools/migration/crosstooltostarlarklib", 
deps = ["//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_go_proto"], ) go_test( name = "crosstooltostarlarklib_test", size = "small", srcs = ["crosstool_to_starlark_lib_test.go"], embed = [":crosstooltostarlarklib"], deps = [ "//third_party/com/github/bazelbuild/bazel/src/main/protobuf:crosstool_config_go_proto", "@com_github_golang_protobuf//proto:go_default_library", ], ) filegroup( name = "bazel_osx_p4deps", srcs = [ "BUILD", "ctoolchain_compare.bzl", ], ) 0707010000002D000081A4000003E800000064000000015D359B420000098A000000000000000000000000000000000000004900000000bazel-rules-cc-20190722/tools/migration/convert_crosstool_to_starlark.go/* The convert_crosstool_to_starlark script takes in a CROSSTOOL file and generates a Starlark rule. See https://github.com/bazelbuild/bazel/issues/5380 Example usage: bazel run \ @rules_cc//tools/migration:convert_crosstool_to_starlark -- \ --crosstool=/path/to/CROSSTOOL \ --output_location=/path/to/cc_config.bzl */ package main import ( "flag" "fmt" "io/ioutil" "os" "os/user" "path" "strings" // Google internal base/go package, commented out by copybara "log" "github.com/golang/protobuf/proto" crosstoolpb "third_party/com/github/bazelbuild/bazel/src/main/protobuf/crosstool_config_go_proto" "tools/migration/crosstooltostarlarklib" ) var ( crosstoolLocation = flag.String( "crosstool", "", "Location of the CROSSTOOL file") outputLocation = flag.String( "output_location", "", "Location of the output .bzl file") ) func toAbsolutePath(pathString string) (string, error) { usr, err := user.Current() if err != nil { return "", err } homeDir := usr.HomeDir if strings.HasPrefix(pathString, "~") { return path.Join(homeDir, pathString[1:]), nil } if path.IsAbs(pathString) { return pathString, nil } workingDirectory := os.Getenv("BUILD_WORKING_DIRECTORY") return path.Join(workingDirectory, pathString), nil } func main() { flag.Parse() if *crosstoolLocation == "" { log.Fatalf("Missing mandatory argument 'crosstool'") } crosstoolPath, err := toAbsolutePath(*crosstoolLocation) if err != nil { log.Fatalf("Error while resolving CROSSTOOL location:", err) } if *outputLocation == "" { log.Fatalf("Missing mandatory argument 'output_location'") } outputPath, err := toAbsolutePath(*outputLocation) if err != nil { log.Fatalf("Error resolving output location:", err) } in, err := ioutil.ReadFile(crosstoolPath) if err != nil { log.Fatalf("Error reading CROSSTOOL file:", err) } crosstool := &crosstoolpb.CrosstoolRelease{} if err := proto.UnmarshalText(string(in), crosstool); err != nil { log.Fatalf("Failed to parse CROSSTOOL:", err) } file, err := os.Create(outputPath) if err != nil { log.Fatalf("Error creating output file:", err) } defer file.Close() rule, err := crosstooltostarlarklib.Transform(crosstool) if err != nil { log.Fatalf("Error converting CROSSTOOL to a Starlark rule:", err) } if _, err := file.WriteString(rule); err != nil { log.Fatalf("Error converting CROSSTOOL to a Starlark rule:", err) } fmt.Println("Success!") } 0707010000002E000081A4000003E800000064000000015D359B4200000685000000000000000000000000000000000000003B00000000bazel-rules-cc-20190722/tools/migration/crosstool_query.py"""Script to make automated CROSSTOOL refactorings easier. This script reads the CROSSTOOL file and allows for querying of its fields. 
""" from absl import app from absl import flags from google.protobuf import text_format from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2 flags.DEFINE_string("crosstool", None, "CROSSTOOL file path to be queried") flags.DEFINE_string("identifier", None, "Toolchain identifier to specify toolchain.") flags.DEFINE_string("print_field", None, "Field to be printed to stdout.") def main(unused_argv): crosstool = crosstool_config_pb2.CrosstoolRelease() crosstool_filename = flags.FLAGS.crosstool identifier = flags.FLAGS.identifier print_field = flags.FLAGS.print_field if not crosstool_filename: raise app.UsageError("ERROR crosstool unspecified") if not identifier: raise app.UsageError("ERROR identifier unspecified") if not print_field: raise app.UsageError("ERROR print_field unspecified") with open(crosstool_filename, "r") as f: text = f.read() text_format.Merge(text, crosstool) toolchain_found = False for toolchain in crosstool.toolchain: if toolchain.toolchain_identifier == identifier: toolchain_found = True if not print_field: continue for field, value in toolchain.ListFields(): if print_field == field.name: print value if not toolchain_found: print "toolchain_identifier %s not found, valid values are:" % identifier for toolchain in crosstool.toolchain: print " " + toolchain.toolchain_identifier if __name__ == "__main__": app.run(main) 0707010000002F000081A4000003E800000064000000015D359B420000AB1B000000000000000000000000000000000000004500000000bazel-rules-cc-20190722/tools/migration/crosstool_to_starlark_lib.go/* Package crosstooltostarlarklib provides the Transform method for conversion of a CROSSTOOL file to a Starlark rule. https://github.com/bazelbuild/bazel/issues/5380 */ package crosstooltostarlarklib import ( "bytes" "errors" "fmt" "sort" "strings" crosstoolpb "third_party/com/github/bazelbuild/bazel/src/main/protobuf/crosstool_config_go_proto" ) // CToolchainIdentifier is what we'll use to differ between CToolchains // If a CToolchain can be distinguished from the other CToolchains // by only one of the fields (eg if cpu is different for each CToolchain // then only that field will be set. 
type CToolchainIdentifier struct { cpu string compiler string } // Writes the load statement for the cc_toolchain_config_lib func getCcToolchainConfigHeader() string { return `load("@bazel_tools//tools/cpp:cc_toolchain_config_lib.bzl", "action_config", "artifact_name_pattern", "env_entry", "env_set", "feature", "feature_set", "flag_group", "flag_set", "make_variable", "tool", "tool_path", "variable_with_value", "with_feature_set", ) ` } var allCompileActions = []string{ "c-compile", "c++-compile", "linkstamp-compile", "assemble", "preprocess-assemble", "c++-header-parsing", "c++-module-compile", "c++-module-codegen", "clif-match", "lto-backend", } var allCppCompileActions = []string{ "c++-compile", "linkstamp-compile", "c++-header-parsing", "c++-module-compile", "c++-module-codegen", "clif-match", } var preprocessorCompileActions = []string{ "c-compile", "c++-compile", "linkstamp-compile", "preprocess-assemble", "c++-header-parsing", "c++-module-compile", "clif-match", } var codegenCompileActions = []string{ "c-compile", "c++-compile", "linkstamp-compile", "assemble", "preprocess-assemble", "c++-module-codegen", "lto-backend", } var allLinkActions = []string{ "c++-link-executable", "c++-link-dynamic-library", "c++-link-nodeps-dynamic-library", } var actionNames = map[string]string{ "c-compile": "ACTION_NAMES.c_compile", "c++-compile": "ACTION_NAMES.cpp_compile", "linkstamp-compile": "ACTION_NAMES.linkstamp_compile", "cc-flags-make-variable": "ACTION_NAMES.cc_flags_make_variable", "c++-module-codegen": "ACTION_NAMES.cpp_module_codegen", "c++-header-parsing": "ACTION_NAMES.cpp_header_parsing", "c++-module-compile": "ACTION_NAMES.cpp_module_compile", "assemble": "ACTION_NAMES.assemble", "preprocess-assemble": "ACTION_NAMES.preprocess_assemble", "lto-indexing": "ACTION_NAMES.lto_indexing", "lto-backend": "ACTION_NAMES.lto_backend", "c++-link-executable": "ACTION_NAMES.cpp_link_executable", "c++-link-dynamic-library": "ACTION_NAMES.cpp_link_dynamic_library", "c++-link-nodeps-dynamic-library": "ACTION_NAMES.cpp_link_nodeps_dynamic_library", "c++-link-static-library": "ACTION_NAMES.cpp_link_static_library", "strip": "ACTION_NAMES.strip", "objc-compile": "ACTION_NAMES.objc_compile", "objc++-compile": "ACTION_NAMES.objcpp_compile", "clif-match": "ACTION_NAMES.clif_match", // "objcopy_embed_data": "ACTION_NAMES.objcopy_embed_data", // copybara-comment-this-out-please // "ld_embed_data": "ACTION_NAMES.ld_embed_data", // copybara-comment-this-out-please } func getLoadActionsStmt() string { return "load(\"@bazel_tools//tools/build_defs/cc:action_names.bzl\", \"ACTION_NAMES\")\n\n" } // Returns a map {toolchain_identifier : CToolchainIdentifier} func toolchainToCToolchainIdentifier( crosstool *crosstoolpb.CrosstoolRelease) map[string]CToolchainIdentifier { cpuToCompiler := make(map[string][]string) compilerToCPU := make(map[string][]string) var cpus []string var compilers []string var identifiers []string res := make(map[string]CToolchainIdentifier) for _, cToolchain := range crosstool.GetToolchain() { cpu := cToolchain.GetTargetCpu() compiler := cToolchain.GetCompiler() cpuToCompiler[cpu] = append(cpuToCompiler[cpu], compiler) compilerToCPU[compiler] = append(compilerToCPU[compiler], cpu) cpus = append(cpus, cToolchain.GetTargetCpu()) compilers = append(compilers, cToolchain.GetCompiler()) identifiers = append(identifiers, cToolchain.GetToolchainIdentifier()) } for i := range cpus { if len(cpuToCompiler[cpus[i]]) == 1 { // if cpu is unique among CToolchains, we don't need the compiler field 
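			// and the generated rule only needs to match on ctx.attr.cpu.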
res[identifiers[i]] = CToolchainIdentifier{cpu: cpus[i], compiler: ""} } else { res[identifiers[i]] = CToolchainIdentifier{ cpu: cpus[i], compiler: compilers[i], } } } return res } func getConditionStatementForCToolchainIdentifier(identifier CToolchainIdentifier) string { if identifier.compiler != "" { return fmt.Sprintf( "ctx.attr.cpu == \"%s\" and ctx.attr.compiler == \"%s\"", identifier.cpu, identifier.compiler) } return fmt.Sprintf("ctx.attr.cpu == \"%s\"", identifier.cpu) } func isArrayPrefix(prefix []string, arr []string) bool { if len(prefix) > len(arr) { return false } for i := 0; i < len(prefix); i++ { if arr[i] != prefix[i] { return false } } return true } func isAllCompileActions(actions []string) (bool, []string) { if isArrayPrefix(allCompileActions, actions) { return true, actions[len(allCompileActions):] } return false, actions } func isAllCppCompileActions(actions []string) (bool, []string) { if isArrayPrefix(allCppCompileActions, actions) { return true, actions[len(allCppCompileActions):] } return false, actions } func isPreprocessorCompileActions(actions []string) (bool, []string) { if isArrayPrefix(preprocessorCompileActions, actions) { return true, actions[len(preprocessorCompileActions):] } return false, actions } func isCodegenCompileActions(actions []string) (bool, []string) { if isArrayPrefix(codegenCompileActions, actions) { return true, actions[len(codegenCompileActions):] } return false, actions } func isAllLinkActions(actions []string) (bool, []string) { if isArrayPrefix(allLinkActions, actions) { return true, actions[len(allLinkActions):] } return false, actions } func getActionNames(actions []string) []string { var res []string for _, el := range actions { if name, ok := actionNames[el]; ok { res = append(res, name) } else { res = append(res, "\""+el+"\"") } } return res } func getListOfActions(name string, depth int) string { var res []string if name == "all_compile_actions" { res = getActionNames(allCompileActions) } else if name == "all_cpp_compile_actions" { res = getActionNames(allCppCompileActions) } else if name == "preprocessor_compile_actions" { res = getActionNames(preprocessorCompileActions) } else if name == "codegen_compile_actions" { res = getActionNames(codegenCompileActions) } else if name == "all_link_actions" { res = getActionNames(allLinkActions) } stmt := fmt.Sprintf("%s%s = %s\n\n", getTabs(depth), name, makeStringArr(res, depth /* isPlainString= */, false)) return stmt } func processActions(actions []string, depth int) []string { var res []string var ok bool initLen := len(actions) if ok, actions = isAllCompileActions(actions); ok { res = append(res, "all_compile_actions") } if ok, actions = isAllCppCompileActions(actions); ok { res = append(res, "all_cpp_compile_actions") } if ok, actions = isPreprocessorCompileActions(actions); ok { res = append(res, "preprocessor_compile_actions") } if ok, actions = isCodegenCompileActions(actions); ok { res = append(res, "codegen_actions") } if ok, actions = isAllLinkActions(actions); ok { res = append(res, "all_link_actions") } if len(actions) != 0 { actions = getActionNames(actions) newDepth := depth + 1 if len(actions) != initLen { newDepth++ } res = append(res, makeStringArr(actions, newDepth /* isPlainString= */, false)) } return res } func getUniqueValues(arr []string) []string { valuesSet := make(map[string]bool) for _, val := range arr { valuesSet[val] = true } var uniques []string for val, _ := range valuesSet { uniques = append(uniques, val) } sort.Strings(uniques) return uniques } func 
getRule(cToolchainIdentifiers map[string]CToolchainIdentifier, allowedCompilers []string) string { cpus := make(map[string]bool) shouldUseCompilerAttribute := false for _, val := range cToolchainIdentifiers { cpus[val.cpu] = true if val.compiler != "" { shouldUseCompilerAttribute = true } } var cpuValues []string for cpu := range cpus { cpuValues = append(cpuValues, cpu) } var args []string sort.Strings(cpuValues) args = append(args, fmt.Sprintf( `"cpu": attr.string(mandatory=True, values=["%s"]),`, strings.Join(cpuValues, "\", \""))) if shouldUseCompilerAttribute { // If there are two CToolchains that share the cpu we need the compiler attribute // for our cc_toolchain_config rule. allowedCompilers = getUniqueValues(allowedCompilers) args = append(args, fmt.Sprintf(`"compiler": attr.string(mandatory=True, values=["%s"]),`, strings.Join(allowedCompilers, "\", \""))) } return fmt.Sprintf(`cc_toolchain_config = rule( implementation = _impl, attrs = { %s }, provides = [CcToolchainConfigInfo], executable = True, ) `, strings.Join(args, "\n ")) } func getImplHeader() string { return "def _impl(ctx):\n" } func getStringStatement(crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, field string, depth int) string { identifiers := getToolchainIdentifiers(crosstool) var fieldValues []string if field == "toolchain_identifier" { fieldValues = getToolchainIdentifiers(crosstool) } else if field == "host_system_name" { fieldValues = getHostSystemNames(crosstool) } else if field == "target_system_name" { fieldValues = getTargetSystemNames(crosstool) } else if field == "target_cpu" { fieldValues = getTargetCpus(crosstool) } else if field == "target_libc" { fieldValues = getTargetLibcs(crosstool) } else if field == "compiler" { fieldValues = getCompilers(crosstool) } else if field == "abi_version" { fieldValues = getAbiVersions(crosstool) } else if field == "abi_libc_version" { fieldValues = getAbiLibcVersions(crosstool) } else if field == "cc_target_os" { fieldValues = getCcTargetOss(crosstool) } else if field == "builtin_sysroot" { fieldValues = getBuiltinSysroots(crosstool) } mappedValuesToIds := getMappedStringValuesToIdentifiers(identifiers, fieldValues) return getAssignmentStatement(field, mappedValuesToIds, crosstool, cToolchainIdentifiers, depth /* isPlainString= */, true /* shouldFail= */, true) } func getFeatures(crosstool *crosstoolpb.CrosstoolRelease) ( map[string][]string, map[string]map[string][]string, error) { featureNameToFeature := make(map[string]map[string][]string) toolchainToFeatures := make(map[string][]string) for _, toolchain := range crosstool.GetToolchain() { id := toolchain.GetToolchainIdentifier() if len(toolchain.GetFeature()) == 0 { toolchainToFeatures[id] = []string{} } for _, feature := range toolchain.GetFeature() { featureName := strings.ToLower(feature.GetName()) + "_feature" featureName = strings.Replace(featureName, "+", "p", -1) featureName = strings.Replace(featureName, ".", "_", -1) featureName = strings.Replace(featureName, "-", "_", -1) stringFeature, err := parseFeature(feature, 1) if err != nil { return nil, nil, fmt.Errorf( "Error in feature '%s': %v", feature.GetName(), err) } if _, ok := featureNameToFeature[featureName]; !ok { featureNameToFeature[featureName] = make(map[string][]string) } featureNameToFeature[featureName][stringFeature] = append( featureNameToFeature[featureName][stringFeature], id) toolchainToFeatures[id] = append(toolchainToFeatures[id], featureName) } } return toolchainToFeatures, 
featureNameToFeature, nil } func getFeaturesDeclaration(crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, featureNameToFeature map[string]map[string][]string, depth int) string { var res []string for featureName, featureStringToID := range featureNameToFeature { res = append(res, getAssignmentStatement( featureName, featureStringToID, crosstool, cToolchainIdentifiers, depth, /* isPlainString= */ false, /* shouldFail= */ false)) } return strings.Join(res, "") } func getFeaturesStmt(cToolchainIdentifiers map[string]CToolchainIdentifier, toolchainToFeatures map[string][]string, depth int) string { var res []string arrToIdentifier := make(map[string][]string) for id, features := range toolchainToFeatures { arrayString := strings.Join(features, "{arrayFieldDelimiter}") arrToIdentifier[arrayString] = append(arrToIdentifier[arrayString], id) } res = append(res, getStringArrStatement( "features", arrToIdentifier, cToolchainIdentifiers, depth, /* isPlainString= */ false)) return strings.Join(res, "\n") } func getActions(crosstool *crosstoolpb.CrosstoolRelease) ( map[string][]string, map[string]map[string][]string, error) { actionNameToAction := make(map[string]map[string][]string) toolchainToActions := make(map[string][]string) for _, toolchain := range crosstool.GetToolchain() { id := toolchain.GetToolchainIdentifier() var actionName string if len(toolchain.GetActionConfig()) == 0 { toolchainToActions[id] = []string{} } for _, action := range toolchain.GetActionConfig() { if aName, ok := actionNames[action.GetActionName()]; ok { actionName = aName } else { actionName = strings.ToLower(action.GetActionName()) actionName = strings.Replace(actionName, "+", "p", -1) actionName = strings.Replace(actionName, ".", "_", -1) actionName = strings.Replace(actionName, "-", "_", -1) } stringAction, err := parseAction(action, 1) if err != nil { return nil, nil, fmt.Errorf( "Error in action_config '%s': %v", action.GetActionName(), err) } if _, ok := actionNameToAction[actionName]; !ok { actionNameToAction[actionName] = make(map[string][]string) } actionNameToAction[actionName][stringAction] = append( actionNameToAction[actionName][stringAction], id) toolchainToActions[id] = append( toolchainToActions[id], strings.TrimPrefix(strings.ToLower(actionName), "action_names.")+"_action") } } return toolchainToActions, actionNameToAction, nil } func getActionConfigsDeclaration( crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, actionNameToAction map[string]map[string][]string, depth int) string { var res []string for actionName, actionStringToID := range actionNameToAction { variableName := strings.TrimPrefix(strings.ToLower(actionName), "action_names.") + "_action" res = append(res, getAssignmentStatement( variableName, actionStringToID, crosstool, cToolchainIdentifiers, depth, /* isPlainString= */ false, /* shouldFail= */ false)) } return strings.Join(res, "") } func getActionConfigsStmt( cToolchainIdentifiers map[string]CToolchainIdentifier, toolchainToActions map[string][]string, depth int) string { var res []string arrToIdentifier := make(map[string][]string) for id, actions := range toolchainToActions { var arrayString string arrayString = strings.Join(actions, "{arrayFieldDelimiter}") arrToIdentifier[arrayString] = append(arrToIdentifier[arrayString], id) } res = append(res, getStringArrStatement( "action_configs", arrToIdentifier, cToolchainIdentifiers, depth, /* isPlainString= */ false)) return strings.Join(res, "\n") } 
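// parseAction renders one CToolchain.action_config as a Starlark
// action_config(...) call. A sketch of the generated text, with purely
// hypothetical field values:
//
//	action_config(
//	    action_name = ACTION_NAMES.cpp_link_executable,
//	    enabled = True,
//	    tools = [tool(path = "bin/ld")],
//	)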
func parseAction(action *crosstoolpb.CToolchain_ActionConfig, depth int) (string, error) { actionName := action.GetActionName() aName := "" if val, ok := actionNames[actionName]; ok { aName = val } else { aName = "\"" + action.GetActionName() + "\"" } name := fmt.Sprintf("action_name = %s", aName) fields := []string{name} if action.GetEnabled() { fields = append(fields, "enabled = True") } if len(action.GetFlagSet()) != 0 { flagSets, err := parseFlagSets(action.GetFlagSet(), depth+1) if err != nil { return "", err } fields = append(fields, "flag_sets = "+flagSets) } if len(action.GetImplies()) != 0 { implies := "implies = " + makeStringArr(action.GetImplies(), depth+1 /* isPlainString= */, true) fields = append(fields, implies) } if len(action.GetTool()) != 0 { tools := "tools = " + parseTools(action.GetTool(), depth+1) fields = append(fields, tools) } return createObject("action_config", fields, depth), nil } func getStringArrStatement(attr string, arrValToIds map[string][]string, cToolchainIdentifiers map[string]CToolchainIdentifier, depth int, plainString bool) string { var b bytes.Buffer if len(arrValToIds) == 0 { b.WriteString(fmt.Sprintf("%s%s = []\n", getTabs(depth), attr)) } else if len(arrValToIds) == 1 { for value := range arrValToIds { var arr []string if value == "" { arr = []string{} } else if value == "None" { b.WriteString(fmt.Sprintf("%s%s = None\n", getTabs(depth), attr)) break } else { arr = strings.Split(value, "{arrayFieldDelimiter}") } b.WriteString( fmt.Sprintf( "%s%s = %s\n", getTabs(depth), attr, makeStringArr(arr, depth+1, plainString))) break } } else { first := true var keys []string for k := range arrValToIds { keys = append(keys, k) } sort.Strings(keys) for _, value := range keys { ids := arrValToIds[value] branch := "elif" if first { branch = "if" } first = false var arr []string if value == "" { arr = []string{} } else if value == "None" { b.WriteString( getIfStatement( branch, ids, attr, "None", cToolchainIdentifiers, depth /* isPlainString= */, true)) continue } else { arr = strings.Split(value, "{arrayFieldDelimiter}") } b.WriteString( getIfStatement(branch, ids, attr, makeStringArr(arr, depth+1, plainString), cToolchainIdentifiers, depth /* isPlainString= */, false)) } b.WriteString(fmt.Sprintf("%selse:\n%sfail(\"Unreachable\")\n", getTabs(depth), getTabs(depth+1))) } b.WriteString("\n") return b.String() } func getStringArr(crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, attr string, depth int) string { var res []string arrToIdentifier := make(map[string][]string) for _, toolchain := range crosstool.GetToolchain() { id := toolchain.GetToolchainIdentifier() arrayString := strings.Join(getArrField(attr, toolchain), "{arrayFieldDelimiter}") arrToIdentifier[arrayString] = append(arrToIdentifier[arrayString], id) } statement := getStringArrStatement(attr, arrToIdentifier, cToolchainIdentifiers, depth /* isPlainString= */, true) res = append(res, statement) return strings.Join(res, "\n") } func getArrField(attr string, toolchain *crosstoolpb.CToolchain) []string { var arr []string if attr == "cxx_builtin_include_directories" { arr = toolchain.GetCxxBuiltinIncludeDirectory() } return arr } func getTabs(depth int) string { var res string for i := 0; i < depth; i++ { res = res + " " } return res } func createObject(objtype string, fields []string, depth int) string { if len(fields) == 0 { return objtype + "()" } singleLine := objtype + "(" + strings.Join(fields, ", ") + ")" if len(singleLine) < 60 { return 
singleLine } return objtype + "(\n" + getTabs(depth+1) + strings.Join(fields, ",\n"+getTabs(depth+1)) + ",\n" + getTabs(depth) + ")" } func getArtifactNamePatterns(crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, depth int) string { var res []string artifactToIds := make(map[string][]string) for _, toolchain := range crosstool.GetToolchain() { artifactNamePatterns := parseArtifactNamePatterns( toolchain.GetArtifactNamePattern(), depth) artifactToIds[artifactNamePatterns] = append( artifactToIds[artifactNamePatterns], toolchain.GetToolchainIdentifier()) } res = append(res, getAssignmentStatement( "artifact_name_patterns", artifactToIds, crosstool, cToolchainIdentifiers, depth, /* isPlainString= */ false, /* shouldFail= */ true)) return strings.Join(res, "\n") } func parseArtifactNamePatterns( artifactNamePatterns []*crosstoolpb.CToolchain_ArtifactNamePattern, depth int) string { var res []string for _, pattern := range artifactNamePatterns { res = append(res, parseArtifactNamePattern(pattern, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseArtifactNamePattern( artifactNamePattern *crosstoolpb.CToolchain_ArtifactNamePattern, depth int) string { categoryName := fmt.Sprintf("category_name = \"%s\"", artifactNamePattern.GetCategoryName()) prefix := fmt.Sprintf("prefix = \"%s\"", artifactNamePattern.GetPrefix()) extension := fmt.Sprintf("extension = \"%s\"", artifactNamePattern.GetExtension()) fields := []string{categoryName, prefix, extension} return createObject("artifact_name_pattern", fields, depth) } func parseFeature(feature *crosstoolpb.CToolchain_Feature, depth int) (string, error) { name := fmt.Sprintf("name = \"%s\"", feature.GetName()) fields := []string{name} if feature.GetEnabled() { fields = append(fields, "enabled = True") } if len(feature.GetFlagSet()) > 0 { flagSets, err := parseFlagSets(feature.GetFlagSet(), depth+1) if err != nil { return "", err } fields = append(fields, "flag_sets = "+flagSets) } if len(feature.GetEnvSet()) > 0 { envSets := "env_sets = " + parseEnvSets(feature.GetEnvSet(), depth+1) fields = append(fields, envSets) } if len(feature.GetRequires()) > 0 { requires := "requires = " + parseFeatureSets(feature.GetRequires(), depth+1) fields = append(fields, requires) } if len(feature.GetImplies()) > 0 { implies := "implies = " + makeStringArr(feature.GetImplies(), depth+1 /* isPlainString= */, true) fields = append(fields, implies) } if len(feature.GetProvides()) > 0 { provides := "provides = " + makeStringArr(feature.GetProvides(), depth+1 /* isPlainString= */, true) fields = append(fields, provides) } return createObject("feature", fields, depth), nil } func parseFlagSets(flagSets []*crosstoolpb.CToolchain_FlagSet, depth int) (string, error) { var res []string for _, flagSet := range flagSets { parsedFlagset, err := parseFlagSet(flagSet, depth+1) if err != nil { return "", err } res = append(res, parsedFlagset) } return makeStringArr(res, depth /* isPlainString= */, false), nil } func parseFlagSet(flagSet *crosstoolpb.CToolchain_FlagSet, depth int) (string, error) { var fields []string if len(flagSet.GetAction()) > 0 { actionArr := processActions(flagSet.GetAction(), depth) actions := "actions = " + strings.Join(actionArr, " +\n"+getTabs(depth+2)) fields = append(fields, actions) } if len(flagSet.GetFlagGroup()) > 0 { flagGroups, err := parseFlagGroups(flagSet.GetFlagGroup(), depth+1) if err != nil { return "", err } fields = append(fields, "flag_groups = "+flagGroups) } if 
len(flagSet.GetWithFeature()) > 0 { withFeatures := "with_features = " + parseWithFeatureSets(flagSet.GetWithFeature(), depth+1) fields = append(fields, withFeatures) } return createObject("flag_set", fields, depth), nil } func parseFlagGroups(flagGroups []*crosstoolpb.CToolchain_FlagGroup, depth int) (string, error) { var res []string for _, flagGroup := range flagGroups { flagGroupString, err := parseFlagGroup(flagGroup, depth+1) if err != nil { return "", err } res = append(res, flagGroupString) } return makeStringArr(res, depth /* isPlainString= */, false), nil } func parseFlagGroup(flagGroup *crosstoolpb.CToolchain_FlagGroup, depth int) (string, error) { var res []string if len(flagGroup.GetFlag()) != 0 { res = append(res, "flags = "+makeStringArr(flagGroup.GetFlag(), depth+1, true)) } if flagGroup.GetIterateOver() != "" { res = append(res, fmt.Sprintf("iterate_over = \"%s\"", flagGroup.GetIterateOver())) } if len(flagGroup.GetFlagGroup()) != 0 { flagGroupString, err := parseFlagGroups(flagGroup.GetFlagGroup(), depth+1) if err != nil { return "", err } res = append(res, "flag_groups = "+flagGroupString) } if len(flagGroup.GetExpandIfAllAvailable()) > 1 { return "", errors.New("Flag group must not have more than one 'expand_if_all_available' field") } if len(flagGroup.GetExpandIfAllAvailable()) != 0 { res = append(res, fmt.Sprintf( "expand_if_available = \"%s\"", flagGroup.GetExpandIfAllAvailable()[0])) } if len(flagGroup.GetExpandIfNoneAvailable()) > 1 { return "", errors.New("Flag group must not have more than one 'expand_if_none_available' field") } if len(flagGroup.GetExpandIfNoneAvailable()) != 0 { res = append(res, fmt.Sprintf( "expand_if_not_available = \"%s\"", flagGroup.GetExpandIfNoneAvailable()[0])) } if flagGroup.GetExpandIfTrue() != "" { res = append(res, fmt.Sprintf("expand_if_true = \"%s\"", flagGroup.GetExpandIfTrue())) } if flagGroup.GetExpandIfFalse() != "" { res = append(res, fmt.Sprintf("expand_if_false = \"%s\"", flagGroup.GetExpandIfFalse())) } if flagGroup.GetExpandIfEqual() != nil { res = append(res, "expand_if_equal = "+parseVariableWithValue( flagGroup.GetExpandIfEqual(), depth+1)) } return createObject("flag_group", res, depth), nil } func parseVariableWithValue(variable *crosstoolpb.CToolchain_VariableWithValue, depth int) string { variableName := fmt.Sprintf("name = \"%s\"", variable.GetVariable()) value := fmt.Sprintf("value = \"%s\"", variable.GetValue()) return createObject("variable_with_value", []string{variableName, value}, depth) } func getToolPaths(crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, depth int) string { var res []string toolPathsToIds := make(map[string][]string) for _, toolchain := range crosstool.GetToolchain() { toolPaths := parseToolPaths(toolchain.GetToolPath(), depth) toolPathsToIds[toolPaths] = append( toolPathsToIds[toolPaths], toolchain.GetToolchainIdentifier()) } res = append(res, getAssignmentStatement( "tool_paths", toolPathsToIds, crosstool, cToolchainIdentifiers, depth, /* isPlainString= */ false, /* shouldFail= */ true)) return strings.Join(res, "\n") } func parseToolPaths(toolPaths []*crosstoolpb.ToolPath, depth int) string { var res []string for _, toolPath := range toolPaths { res = append(res, parseToolPath(toolPath, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseToolPath(toolPath *crosstoolpb.ToolPath, depth int) string { name := fmt.Sprintf("name = \"%s\"", toolPath.GetName()) path := toolPath.GetPath() if path == "" { path = 
"NOT_USED" } path = fmt.Sprintf("path = \"%s\"", path) return createObject("tool_path", []string{name, path}, depth) } func getMakeVariables(crosstool *crosstoolpb.CrosstoolRelease, cToolchainIdentifiers map[string]CToolchainIdentifier, depth int) string { var res []string makeVariablesToIds := make(map[string][]string) for _, toolchain := range crosstool.GetToolchain() { makeVariables := parseMakeVariables(toolchain.GetMakeVariable(), depth) makeVariablesToIds[makeVariables] = append( makeVariablesToIds[makeVariables], toolchain.GetToolchainIdentifier()) } res = append(res, getAssignmentStatement( "make_variables", makeVariablesToIds, crosstool, cToolchainIdentifiers, depth, /* isPlainString= */ false, /* shouldFail= */ true)) return strings.Join(res, "\n") } func parseMakeVariables(makeVariables []*crosstoolpb.MakeVariable, depth int) string { var res []string for _, makeVariable := range makeVariables { res = append(res, parseMakeVariable(makeVariable, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseMakeVariable(makeVariable *crosstoolpb.MakeVariable, depth int) string { name := fmt.Sprintf("name = \"%s\"", makeVariable.GetName()) value := fmt.Sprintf("value = \"%s\"", makeVariable.GetValue()) return createObject("make_variable", []string{name, value}, depth) } func parseTools(tools []*crosstoolpb.CToolchain_Tool, depth int) string { var res []string for _, tool := range tools { res = append(res, parseTool(tool, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseTool(tool *crosstoolpb.CToolchain_Tool, depth int) string { toolPath := "path = \"NOT_USED\"" if tool.GetToolPath() != "" { toolPath = fmt.Sprintf("path = \"%s\"", tool.GetToolPath()) } fields := []string{toolPath} if len(tool.GetWithFeature()) != 0 { withFeatures := "with_features = " + parseWithFeatureSets(tool.GetWithFeature(), depth+1) fields = append(fields, withFeatures) } if len(tool.GetExecutionRequirement()) != 0 { executionRequirements := "execution_requirements = " + makeStringArr(tool.GetExecutionRequirement(), depth+1 /* isPlainString= */, true) fields = append(fields, executionRequirements) } return createObject("tool", fields, depth) } func parseEnvEntries(envEntries []*crosstoolpb.CToolchain_EnvEntry, depth int) string { var res []string for _, envEntry := range envEntries { res = append(res, parseEnvEntry(envEntry, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseEnvEntry(envEntry *crosstoolpb.CToolchain_EnvEntry, depth int) string { key := fmt.Sprintf("key = \"%s\"", envEntry.GetKey()) value := fmt.Sprintf("value = \"%s\"", envEntry.GetValue()) return createObject("env_entry", []string{key, value}, depth) } func parseWithFeatureSets(withFeatureSets []*crosstoolpb.CToolchain_WithFeatureSet, depth int) string { var res []string for _, withFeature := range withFeatureSets { res = append(res, parseWithFeatureSet(withFeature, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseWithFeatureSet(withFeature *crosstoolpb.CToolchain_WithFeatureSet, depth int) string { var fields []string if len(withFeature.GetFeature()) != 0 { features := "features = " + makeStringArr(withFeature.GetFeature(), depth+1 /* isPlainString= */, true) fields = append(fields, features) } if len(withFeature.GetNotFeature()) != 0 { notFeatures := "not_features = " + makeStringArr(withFeature.GetNotFeature(), depth+1 /* isPlainString= */, true) fields = append(fields, notFeatures) } return 
createObject("with_feature_set", fields, depth) } func parseEnvSets(envSets []*crosstoolpb.CToolchain_EnvSet, depth int) string { var res []string for _, envSet := range envSets { envSetString := parseEnvSet(envSet, depth+1) res = append(res, envSetString) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseEnvSet(envSet *crosstoolpb.CToolchain_EnvSet, depth int) string { actionsStatement := processActions(envSet.GetAction(), depth) actions := "actions = " + strings.Join(actionsStatement, " +\n"+getTabs(depth+2)) fields := []string{actions} if len(envSet.GetEnvEntry()) != 0 { envEntries := "env_entries = " + parseEnvEntries(envSet.GetEnvEntry(), depth+1) fields = append(fields, envEntries) } if len(envSet.GetWithFeature()) != 0 { withFeatures := "with_features = " + parseWithFeatureSets(envSet.GetWithFeature(), depth+1) fields = append(fields, withFeatures) } return createObject("env_set", fields, depth) } func parseFeatureSets(featureSets []*crosstoolpb.CToolchain_FeatureSet, depth int) string { var res []string for _, featureSet := range featureSets { res = append(res, parseFeatureSet(featureSet, depth+1)) } return makeStringArr(res, depth /* isPlainString= */, false) } func parseFeatureSet(featureSet *crosstoolpb.CToolchain_FeatureSet, depth int) string { features := "features = " + makeStringArr(featureSet.GetFeature(), depth+1 /* isPlainString= */, true) return createObject("feature_set", []string{features}, depth) } // Takes in a list of string elements and returns a string that represents // an array : // [ // "element1", // "element2", // ] // The isPlainString argument tells us whether the input elements should be // treated as string (eg, flags), or not (eg, variable names) func makeStringArr(arr []string, depth int, isPlainString bool) string { if len(arr) == 0 { return "[]" } var escapedArr []string for _, el := range arr { if isPlainString { escapedArr = append(escapedArr, strings.Replace(el, "\"", "\\\"", -1)) } else { escapedArr = append(escapedArr, el) } } addQuote := "" if isPlainString { addQuote = "\"" } singleLine := "[" + addQuote + strings.Join(escapedArr, addQuote+", "+addQuote) + addQuote + "]" if len(singleLine) < 60 { return singleLine } return "[\n" + getTabs(depth+1) + addQuote + strings.Join(escapedArr, addQuote+",\n"+getTabs(depth+1)+addQuote) + addQuote + ",\n" + getTabs(depth) + "]" } // Returns a string that represents a value assignment // (eg if ctx.attr.cpu == "linux": // compiler = "llvm" // elif ctx.attr.cpu == "windows": // compiler = "mingw" // else: // fail("Unreachable") func getAssignmentStatement(field string, valToIds map[string][]string, crosstool *crosstoolpb.CrosstoolRelease, toCToolchainIdentifier map[string]CToolchainIdentifier, depth int, isPlainString, shouldFail bool) string { var b bytes.Buffer if len(valToIds) <= 1 { // if there is only one possible value for this field, we don't need if statements for val := range valToIds { if val != "None" && isPlainString { val = "\"" + val + "\"" } b.WriteString(fmt.Sprintf("%s%s = %s\n", getTabs(depth), field, val)) break } } else { first := true var keys []string for k := range valToIds { keys = append(keys, k) } sort.Strings(keys) for _, value := range keys { ids := valToIds[value] branch := "elif" if first { branch = "if" } b.WriteString( getIfStatement(branch, ids, field, value, toCToolchainIdentifier, depth, isPlainString)) first = false } if shouldFail { b.WriteString( fmt.Sprintf( "%selse:\n%sfail(\"Unreachable\")\n", getTabs(depth), getTabs(depth+1))) } else { 
b.WriteString( fmt.Sprintf( "%selse:\n%s%s = None\n", getTabs(depth), getTabs(depth+1), field)) } } b.WriteString("\n") return b.String() } func getCPUToCompilers(identifiers []CToolchainIdentifier) map[string][]string { res := make(map[string][]string) for _, identifier := range identifiers { if identifier.compiler != "" { res[identifier.cpu] = append(res[identifier.cpu], identifier.compiler) } } return res } func getIfStatement(ifOrElseIf string, identifiers []string, field, val string, toCToolchainIdentifier map[string]CToolchainIdentifier, depth int, isPlainString bool) string { usedStmts := make(map[string]bool) if val != "None" && isPlainString { val = "\"" + val + "\"" } var cToolchainIdentifiers []CToolchainIdentifier for _, value := range toCToolchainIdentifier { cToolchainIdentifiers = append(cToolchainIdentifiers, value) } cpuToCompilers := getCPUToCompilers(cToolchainIdentifiers) countCpus := make(map[string]int) var conditions []string for _, id := range identifiers { identifier := toCToolchainIdentifier[id] stmt := getConditionStatementForCToolchainIdentifier(identifier) if _, ok := usedStmts[stmt]; !ok { conditions = append(conditions, stmt) usedStmts[stmt] = true if identifier.compiler != "" { countCpus[identifier.cpu]++ } } } var compressedConditions []string usedStmtsOptimized := make(map[string]bool) for _, id := range identifiers { identifier := toCToolchainIdentifier[id] var stmt string if _, ok := countCpus[identifier.cpu]; ok { if countCpus[identifier.cpu] == len(cpuToCompilers[identifier.cpu]) { stmt = getConditionStatementForCToolchainIdentifier( CToolchainIdentifier{cpu: identifier.cpu, compiler: ""}) } else { stmt = getConditionStatementForCToolchainIdentifier(identifier) } } else { stmt = getConditionStatementForCToolchainIdentifier(identifier) } if _, ok := usedStmtsOptimized[stmt]; !ok { compressedConditions = append(compressedConditions, stmt) usedStmtsOptimized[stmt] = true } } sort.Strings(compressedConditions) val = strings.Join(strings.Split(val, "\n"+getTabs(depth)), "\n"+getTabs(depth+1)) return fmt.Sprintf(`%s%s %s: %s%s = %s `, getTabs(depth), ifOrElseIf, "("+strings.Join(compressedConditions, "\n"+getTabs(depth+1)+"or ")+")", getTabs(depth+1), field, val) } func getToolchainIdentifiers(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetToolchainIdentifier()) } return res } func getHostSystemNames(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetHostSystemName()) } return res } func getTargetSystemNames(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetTargetSystemName()) } return res } func getTargetCpus(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetTargetCpu()) } return res } func getTargetLibcs(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetTargetLibc()) } return res } func getCompilers(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetCompiler()) } return res } func getAbiVersions(crosstool *crosstoolpb.CrosstoolRelease) []string { 
var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetAbiVersion()) } return res } func getAbiLibcVersions(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { res = append(res, toolchain.GetAbiLibcVersion()) } return res } func getCcTargetOss(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { targetOS := "None" if toolchain.GetCcTargetOs() != "" { targetOS = toolchain.GetCcTargetOs() } res = append(res, targetOS) } return res } func getBuiltinSysroots(crosstool *crosstoolpb.CrosstoolRelease) []string { var res []string for _, toolchain := range crosstool.GetToolchain() { sysroot := "None" if toolchain.GetBuiltinSysroot() != "" { sysroot = toolchain.GetBuiltinSysroot() } res = append(res, sysroot) } return res } func getMappedStringValuesToIdentifiers(identifiers, fields []string) map[string][]string { res := make(map[string][]string) for i := range identifiers { res[fields[i]] = append(res[fields[i]], identifiers[i]) } return res } func getReturnStatement() string { return ` out = ctx.actions.declare_file(ctx.label.name) ctx.actions.write(out, "Fake executable") return [ cc_common.create_cc_toolchain_config_info( ctx = ctx, features = features, action_configs = action_configs, artifact_name_patterns = artifact_name_patterns, cxx_builtin_include_directories = cxx_builtin_include_directories, toolchain_identifier = toolchain_identifier, host_system_name = host_system_name, target_system_name = target_system_name, target_cpu = target_cpu, target_libc = target_libc, compiler = compiler, abi_version = abi_version, abi_libc_version = abi_libc_version, tool_paths = tool_paths, make_variables = make_variables, builtin_sysroot = builtin_sysroot, cc_target_os = cc_target_os ), DefaultInfo( executable = out, ), ] ` } // Transform writes a cc_toolchain_config rule functionally equivalent to the // CROSSTOOL file. 
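// A minimal caller sketch (error handling elided; crosstoolText is a
// hypothetical variable holding the CROSSTOOL text proto, read the same way
// convert_crosstool_to_starlark.go reads it):
//
//	crosstool := &crosstoolpb.CrosstoolRelease{}
//	proto.UnmarshalText(crosstoolText, crosstool)
//	starlarkRule, err := Transform(crosstool)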
func Transform(crosstool *crosstoolpb.CrosstoolRelease) (string, error) { var b bytes.Buffer cToolchainIdentifiers := toolchainToCToolchainIdentifier(crosstool) toolchainToFeatures, featureNameToFeature, err := getFeatures(crosstool) if err != nil { return "", err } toolchainToActions, actionNameToAction, err := getActions(crosstool) if err != nil { return "", err } header := getCcToolchainConfigHeader() if _, err := b.WriteString(header); err != nil { return "", err } loadActionsStmt := getLoadActionsStmt() if _, err := b.WriteString(loadActionsStmt); err != nil { return "", err } implHeader := getImplHeader() if _, err := b.WriteString(implHeader); err != nil { return "", err } stringFields := []string{ "toolchain_identifier", "host_system_name", "target_system_name", "target_cpu", "target_libc", "compiler", "abi_version", "abi_libc_version", "cc_target_os", "builtin_sysroot", } for _, stringField := range stringFields { stmt := getStringStatement(crosstool, cToolchainIdentifiers, stringField, 1) if _, err := b.WriteString(stmt); err != nil { return "", err } } listsOfActions := []string{ "all_compile_actions", "all_cpp_compile_actions", "preprocessor_compile_actions", "codegen_compile_actions", "all_link_actions", } for _, listOfActions := range listsOfActions { actions := getListOfActions(listOfActions, 1) if _, err := b.WriteString(actions); err != nil { return "", err } } actionConfigDeclaration := getActionConfigsDeclaration( crosstool, cToolchainIdentifiers, actionNameToAction, 1) if _, err := b.WriteString(actionConfigDeclaration); err != nil { return "", err } actionConfigStatement := getActionConfigsStmt( cToolchainIdentifiers, toolchainToActions, 1) if _, err := b.WriteString(actionConfigStatement); err != nil { return "", err } featureDeclaration := getFeaturesDeclaration( crosstool, cToolchainIdentifiers, featureNameToFeature, 1) if _, err := b.WriteString(featureDeclaration); err != nil { return "", err } featuresStatement := getFeaturesStmt( cToolchainIdentifiers, toolchainToFeatures, 1) if _, err := b.WriteString(featuresStatement); err != nil { return "", err } includeDirectories := getStringArr( crosstool, cToolchainIdentifiers, "cxx_builtin_include_directories", 1) if _, err := b.WriteString(includeDirectories); err != nil { return "", err } artifactNamePatterns := getArtifactNamePatterns( crosstool, cToolchainIdentifiers, 1) if _, err := b.WriteString(artifactNamePatterns); err != nil { return "", err } makeVariables := getMakeVariables(crosstool, cToolchainIdentifiers, 1) if _, err := b.WriteString(makeVariables); err != nil { return "", err } toolPaths := getToolPaths(crosstool, cToolchainIdentifiers, 1) if _, err := b.WriteString(toolPaths); err != nil { return "", err } if _, err := b.WriteString(getReturnStatement()); err != nil { return "", err } rule := getRule(cToolchainIdentifiers, getCompilers(crosstool)) if _, err := b.WriteString(rule); err != nil { return "", err } return b.String(), nil } 07070100000030000081A4000003E800000064000000015D359B420000CCAD000000000000000000000000000000000000004A00000000bazel-rules-cc-20190722/tools/migration/crosstool_to_starlark_lib_test.gopackage crosstooltostarlarklib import ( "fmt" "strings" "testing" "log" "github.com/golang/protobuf/proto" crosstoolpb "third_party/com/github/bazelbuild/bazel/src/main/protobuf/crosstool_config_go_proto" ) func makeCToolchainString(lines []string) string { return fmt.Sprintf(`toolchain { %s }`, strings.Join(lines, "\n ")) } func makeCrosstool(CToolchains []string) 
*crosstoolpb.CrosstoolRelease { crosstool := &crosstoolpb.CrosstoolRelease{} requiredFields := []string{ "major_version: '0'", "minor_version: '0'", "default_target_cpu: 'cpu'", } CToolchains = append(CToolchains, requiredFields...) if err := proto.UnmarshalText(strings.Join(CToolchains, "\n"), crosstool); err != nil { log.Fatalf("Failed to parse CROSSTOOL: %v", err) } return crosstool } func getSimpleCToolchain(id string) string { lines := []string{ "toolchain_identifier: 'id-" + id + "'", "host_system_name: 'host-" + id + "'", "target_system_name: 'target-" + id + "'", "target_cpu: 'cpu-" + id + "'", "compiler: 'compiler-" + id + "'", "target_libc: 'libc-" + id + "'", "abi_version: 'version-" + id + "'", "abi_libc_version: 'libc_version-" + id + "'", } return makeCToolchainString(lines) } func getCToolchain(id, cpu, compiler string, extraLines []string) string { lines := []string{ "toolchain_identifier: '" + id + "'", "host_system_name: 'host'", "target_system_name: 'target'", "target_cpu: '" + cpu + "'", "compiler: '" + compiler + "'", "target_libc: 'libc'", "abi_version: 'version'", "abi_libc_version: 'libc_version'", } lines = append(lines, extraLines...) return makeCToolchainString(lines) } func TestStringFieldsConditionStatement(t *testing.T) { toolchain1 := getSimpleCToolchain("1") toolchain2 := getSimpleCToolchain("2") toolchains := []string{toolchain1, toolchain2} crosstool := makeCrosstool(toolchains) testCases := []struct { field string expectedText string }{ {field: "toolchain_identifier", expectedText: ` if (ctx.attr.cpu == "cpu-1"): toolchain_identifier = "id-1" elif (ctx.attr.cpu == "cpu-2"): toolchain_identifier = "id-2" else: fail("Unreachable")`}, {field: "host_system_name", expectedText: ` if (ctx.attr.cpu == "cpu-1"): host_system_name = "host-1" elif (ctx.attr.cpu == "cpu-2"): host_system_name = "host-2" else: fail("Unreachable")`}, {field: "target_system_name", expectedText: ` if (ctx.attr.cpu == "cpu-1"): target_system_name = "target-1" elif (ctx.attr.cpu == "cpu-2"): target_system_name = "target-2" else: fail("Unreachable")`}, {field: "target_cpu", expectedText: ` if (ctx.attr.cpu == "cpu-1"): target_cpu = "cpu-1" elif (ctx.attr.cpu == "cpu-2"): target_cpu = "cpu-2" else: fail("Unreachable")`}, {field: "target_libc", expectedText: ` if (ctx.attr.cpu == "cpu-1"): target_libc = "libc-1" elif (ctx.attr.cpu == "cpu-2"): target_libc = "libc-2" else: fail("Unreachable")`}, {field: "compiler", expectedText: ` if (ctx.attr.cpu == "cpu-1"): compiler = "compiler-1" elif (ctx.attr.cpu == "cpu-2"): compiler = "compiler-2" else: fail("Unreachable")`}, {field: "abi_version", expectedText: ` if (ctx.attr.cpu == "cpu-1"): abi_version = "version-1" elif (ctx.attr.cpu == "cpu-2"): abi_version = "version-2" else: fail("Unreachable")`}, {field: "abi_libc_version", expectedText: ` if (ctx.attr.cpu == "cpu-1"): abi_libc_version = "libc_version-1" elif (ctx.attr.cpu == "cpu-2"): abi_libc_version = "libc_version-2" else: fail("Unreachable")`}} got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } failed := false for _, tc := range testCases { if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) failed = true } } if failed { t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(toolchains, "\n"), got) } } func TestConditionsSameCpu(t *testing.T) { toolchainAA := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainAB :=
getCToolchain("2", "cpuA", "compilerB", []string{}) toolchains := []string{toolchainAA, toolchainAB} crosstool := makeCrosstool(toolchains) testCases := []struct { field string expectedText string }{ {field: "toolchain_identifier", expectedText: ` if (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerA"): toolchain_identifier = "1" elif (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerB"): toolchain_identifier = "2" else: fail("Unreachable")`}, {field: "host_system_name", expectedText: ` host_system_name = "host"`}, {field: "target_system_name", expectedText: ` target_system_name = "target"`}, {field: "target_cpu", expectedText: ` target_cpu = "cpuA"`}, {field: "target_libc", expectedText: ` target_libc = "libc"`}, {field: "compiler", expectedText: ` if (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerA"): compiler = "compilerA" elif (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerB"): compiler = "compilerB" else: fail("Unreachable")`}, {field: "abi_version", expectedText: ` abi_version = "version"`}, {field: "abi_libc_version", expectedText: ` abi_libc_version = "libc_version"`}} got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } failed := false for _, tc := range testCases { if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) failed = true } } if failed { t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(toolchains, "\n"), got) } } func TestConditionsSameCompiler(t *testing.T) { toolchainAA := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainBA := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchains := []string{toolchainAA, toolchainBA} crosstool := makeCrosstool(toolchains) testCases := []struct { field string expectedText string }{ {field: "toolchain_identifier", expectedText: ` if (ctx.attr.cpu == "cpuA"): toolchain_identifier = "1" elif (ctx.attr.cpu == "cpuB"): toolchain_identifier = "2" else: fail("Unreachable")`}, {field: "target_cpu", expectedText: ` if (ctx.attr.cpu == "cpuA"): target_cpu = "cpuA" elif (ctx.attr.cpu == "cpuB"): target_cpu = "cpuB" else: fail("Unreachable")`}, {field: "compiler", expectedText: ` compiler = "compilerA"`}} got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } failed := false for _, tc := range testCases { if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) failed = true } } if failed { t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(toolchains, "\n"), got) } } func TestNonMandatoryStrings(t *testing.T) { toolchainAA := getCToolchain("1", "cpuA", "compilerA", []string{"cc_target_os: 'osA'"}) toolchainBB := getCToolchain("2", "cpuB", "compilerB", []string{}) toolchains := []string{toolchainAA, toolchainBB} crosstool := makeCrosstool(toolchains) testCases := []struct { field string expectedText string }{ {field: "cc_target_os", expectedText: ` if (ctx.attr.cpu == "cpuB"): cc_target_os = None elif (ctx.attr.cpu == "cpuA"): cc_target_os = "osA" else: fail("Unreachable")`}, {field: "builtin_sysroot", expectedText: ` builtin_sysroot = None`}} got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } failed := false for _, tc := range testCases { if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly 
convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) failed = true } } if failed { t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(toolchains, "\n"), got) } } func TestBuiltinIncludeDirectories(t *testing.T) { toolchainAA := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainBA := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainCA := getCToolchain("3", "cpuC", "compilerA", []string{"cxx_builtin_include_directory: 'dirC'"}) toolchainCB := getCToolchain("4", "cpuC", "compilerB", []string{"cxx_builtin_include_directory: 'dirC'", "cxx_builtin_include_directory: 'dirB'"}) toolchainDA := getCToolchain("5", "cpuD", "compilerA", []string{"cxx_builtin_include_directory: 'dirC'"}) toolchainsEmpty := []string{toolchainAA, toolchainBA} toolchainsOneNonempty := []string{toolchainAA, toolchainBA, toolchainCA} toolchainsSameNonempty := []string{toolchainCA, toolchainDA} allToolchains := []string{toolchainAA, toolchainBA, toolchainCA, toolchainCB, toolchainDA} testCases := []struct { field string toolchains []string expectedText string }{ {field: "cxx_builtin_include_directories", toolchains: toolchainsEmpty, expectedText: ` cxx_builtin_include_directories = []`}, {field: "cxx_builtin_include_directories", toolchains: toolchainsOneNonempty, expectedText: ` if (ctx.attr.cpu == "cpuA" or ctx.attr.cpu == "cpuB"): cxx_builtin_include_directories = [] elif (ctx.attr.cpu == "cpuC"): cxx_builtin_include_directories = ["dirC"] else: fail("Unreachable")`}, {field: "cxx_builtin_include_directories", toolchains: toolchainsSameNonempty, expectedText: ` cxx_builtin_include_directories = ["dirC"]`}, {field: "cxx_builtin_include_directories", toolchains: allToolchains, expectedText: ` if (ctx.attr.cpu == "cpuA" or ctx.attr.cpu == "cpuB"): cxx_builtin_include_directories = [] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA" or ctx.attr.cpu == "cpuD"): cxx_builtin_include_directories = ["dirC"] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): cxx_builtin_include_directories = ["dirC", "dirB"]`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func TestMakeVariables(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainA1 := getCToolchain("3", "cpuC", "compilerA", []string{"make_variable {name: 'A', value: 'a/b/c'}"}) toolchainA2 := getCToolchain("4", "cpuC", "compilerB", []string{"make_variable {name: 'A', value: 'a/b/c'}"}) toolchainAB := getCToolchain("5", "cpuC", "compilerC", []string{"make_variable {name: 'A', value: 'a/b/c'}", "make_variable {name: 'B', value: 'a/b/c'}"}) toolchainBA := getCToolchain("6", "cpuD", "compilerA", []string{"make_variable {name: 'B', value: 'a/b/c'}", "make_variable {name: 'A', value: 'a b c'}"}) toolchainsEmpty := []string{toolchainEmpty1, toolchainEmpty2} toolchainsOneNonempty := []string{toolchainEmpty1, toolchainA1} toolchainsSameNonempty := []string{toolchainA1, toolchainA2} toolchainsDifferentOrder := []string{toolchainAB, toolchainBA} allToolchains := []string{ toolchainEmpty1, toolchainEmpty2, 
toolchainA1, toolchainA2, toolchainAB, toolchainBA, } testCases := []struct { field string toolchains []string expectedText string }{ {field: "make_variables", toolchains: toolchainsEmpty, expectedText: ` make_variables = []`}, {field: "make_variables", toolchains: toolchainsOneNonempty, expectedText: ` if (ctx.attr.cpu == "cpuA"): make_variables = [] elif (ctx.attr.cpu == "cpuC"): make_variables = [make_variable(name = "A", value = "a/b/c")] else: fail("Unreachable")`}, {field: "make_variables", toolchains: toolchainsSameNonempty, expectedText: ` make_variables = [make_variable(name = "A", value = "a/b/c")]`}, {field: "make_variables", toolchains: toolchainsDifferentOrder, expectedText: ` if (ctx.attr.cpu == "cpuC"): make_variables = [ make_variable(name = "A", value = "a/b/c"), make_variable(name = "B", value = "a/b/c"), ] elif (ctx.attr.cpu == "cpuD"): make_variables = [ make_variable(name = "B", value = "a/b/c"), make_variable(name = "A", value = "a b c"), ] else: fail("Unreachable")`}, {field: "make_variables", toolchains: allToolchains, expectedText: ` if (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerC"): make_variables = [ make_variable(name = "A", value = "a/b/c"), make_variable(name = "B", value = "a/b/c"), ] elif (ctx.attr.cpu == "cpuD"): make_variables = [ make_variable(name = "B", value = "a/b/c"), make_variable(name = "A", value = "a b c"), ] elif (ctx.attr.cpu == "cpuA" or ctx.attr.cpu == "cpuB"): make_variables = [] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA" or ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): make_variables = [make_variable(name = "A", value = "a/b/c")] else: fail("Unreachable")`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func TestToolPaths(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainA1 := getCToolchain("3", "cpuC", "compilerA", []string{"tool_path {name: 'A', path: 'a/b/c'}"}) toolchainA2 := getCToolchain("4", "cpuC", "compilerB", []string{"tool_path {name: 'A', path: 'a/b/c'}"}) toolchainAB := getCToolchain("5", "cpuC", "compilerC", []string{"tool_path {name: 'A', path: 'a/b/c'}", "tool_path {name: 'B', path: 'a/b/c'}"}) toolchainBA := getCToolchain("6", "cpuD", "compilerA", []string{"tool_path {name: 'B', path: 'a/b/c'}", "tool_path {name: 'A', path: 'a/b/c'}"}) toolchainsEmpty := []string{toolchainEmpty1, toolchainEmpty2} toolchainsOneNonempty := []string{toolchainEmpty1, toolchainA1} toolchainsSameNonempty := []string{toolchainA1, toolchainA2} toolchainsDifferentOrder := []string{toolchainAB, toolchainBA} allToolchains := []string{ toolchainEmpty1, toolchainEmpty2, toolchainA1, toolchainA2, toolchainAB, toolchainBA, } testCases := []struct { field string toolchains []string expectedText string }{ {field: "tool_paths", toolchains: toolchainsEmpty, expectedText: ` tool_paths = []`}, {field: "tool_paths", toolchains: toolchainsOneNonempty, expectedText: ` if (ctx.attr.cpu == "cpuA"): tool_paths = [] elif (ctx.attr.cpu == "cpuC"): tool_paths = [tool_path(name = "A", path = "a/b/c")] else: 
fail("Unreachable")`}, {field: "tool_paths", toolchains: toolchainsSameNonempty, expectedText: ` tool_paths = [tool_path(name = "A", path = "a/b/c")]`}, {field: "tool_paths", toolchains: toolchainsDifferentOrder, expectedText: ` if (ctx.attr.cpu == "cpuC"): tool_paths = [ tool_path(name = "A", path = "a/b/c"), tool_path(name = "B", path = "a/b/c"), ] elif (ctx.attr.cpu == "cpuD"): tool_paths = [ tool_path(name = "B", path = "a/b/c"), tool_path(name = "A", path = "a/b/c"), ] else: fail("Unreachable")`}, {field: "tool_paths", toolchains: allToolchains, expectedText: ` if (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerC"): tool_paths = [ tool_path(name = "A", path = "a/b/c"), tool_path(name = "B", path = "a/b/c"), ] elif (ctx.attr.cpu == "cpuD"): tool_paths = [ tool_path(name = "B", path = "a/b/c"), tool_path(name = "A", path = "a/b/c"), ] elif (ctx.attr.cpu == "cpuA" or ctx.attr.cpu == "cpuB"): tool_paths = [] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA" or ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): tool_paths = [tool_path(name = "A", path = "a/b/c")] else: fail("Unreachable")`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func getArtifactNamePattern(lines []string) string { return fmt.Sprintf(`artifact_name_pattern { %s }`, strings.Join(lines, "\n ")) } func TestArtifactNamePatterns(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainA1 := getCToolchain("3", "cpuC", "compilerA", []string{ getArtifactNamePattern([]string{ "category_name: 'A'", "prefix: 'p'", "extension: '.exe'"}), }, ) toolchainA2 := getCToolchain("4", "cpuC", "compilerB", []string{ getArtifactNamePattern([]string{ "category_name: 'A'", "prefix: 'p'", "extension: '.exe'"}), }, ) toolchainAB := getCToolchain("5", "cpuC", "compilerC", []string{ getArtifactNamePattern([]string{ "category_name: 'A'", "prefix: 'p'", "extension: '.exe'"}), getArtifactNamePattern([]string{ "category_name: 'B'", "prefix: 'p'", "extension: '.exe'"}), }, ) toolchainBA := getCToolchain("6", "cpuD", "compilerA", []string{ getArtifactNamePattern([]string{ "category_name: 'B'", "prefix: 'p'", "extension: '.exe'"}), getArtifactNamePattern([]string{ "category_name: 'A'", "prefix: 'p'", "extension: '.exe'"}), }, ) toolchainsEmpty := []string{toolchainEmpty1, toolchainEmpty2} toolchainsOneNonempty := []string{toolchainEmpty1, toolchainA1} toolchainsSameNonempty := []string{toolchainA1, toolchainA2} toolchainsDifferentOrder := []string{toolchainAB, toolchainBA} allToolchains := []string{ toolchainEmpty1, toolchainEmpty2, toolchainA1, toolchainA2, toolchainAB, toolchainBA, } testCases := []struct { field string toolchains []string expectedText string }{ {field: "artifact_name_patterns", toolchains: toolchainsEmpty, expectedText: ` artifact_name_patterns = []`}, {field: "artifact_name_patterns", toolchains: toolchainsOneNonempty, expectedText: ` if (ctx.attr.cpu == "cpuC"): artifact_name_patterns = [ artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), ] elif (ctx.attr.cpu == "cpuA"): 
artifact_name_patterns = [] else: fail("Unreachable")`}, {field: "artifact_name_patterns", toolchains: toolchainsSameNonempty, expectedText: ` artifact_name_patterns = [ artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), ]`}, {field: "artifact_name_patterns", toolchains: toolchainsDifferentOrder, expectedText: ` if (ctx.attr.cpu == "cpuC"): artifact_name_patterns = [ artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), artifact_name_pattern( category_name = "B", prefix = "p", extension = ".exe", ), ] elif (ctx.attr.cpu == "cpuD"): artifact_name_patterns = [ artifact_name_pattern( category_name = "B", prefix = "p", extension = ".exe", ), artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), ] else: fail("Unreachable")`}, {field: "artifact_name_patterns", toolchains: allToolchains, expectedText: ` if (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerC"): artifact_name_patterns = [ artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), artifact_name_pattern( category_name = "B", prefix = "p", extension = ".exe", ), ] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA" or ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): artifact_name_patterns = [ artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), ] elif (ctx.attr.cpu == "cpuD"): artifact_name_patterns = [ artifact_name_pattern( category_name = "B", prefix = "p", extension = ".exe", ), artifact_name_pattern( category_name = "A", prefix = "p", extension = ".exe", ), ] elif (ctx.attr.cpu == "cpuA" or ctx.attr.cpu == "cpuB"): artifact_name_patterns = [] else: fail("Unreachable")`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func getFeature(lines []string) string { return fmt.Sprintf(`feature { %s }`, strings.Join(lines, "\n ")) } func TestFeatureListAssignment(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainA1 := getCToolchain("3", "cpuC", "compilerA", []string{getFeature([]string{"name: 'A'"})}, ) toolchainA2 := getCToolchain("4", "cpuC", "compilerB", []string{getFeature([]string{"name: 'A'"})}, ) toolchainAB := getCToolchain("5", "cpuC", "compilerC", []string{ getFeature([]string{"name: 'A'"}), getFeature([]string{"name: 'B'"}), }, ) toolchainBA := getCToolchain("6", "cpuD", "compilerA", []string{ getFeature([]string{"name: 'B'"}), getFeature([]string{"name: 'A'"}), }, ) toolchainsEmpty := []string{toolchainEmpty1, toolchainEmpty2} toolchainsOneNonempty := []string{toolchainEmpty1, toolchainA1} toolchainsSameNonempty := []string{toolchainA1, toolchainA2} toolchainsDifferentOrder := []string{toolchainAB, toolchainBA} allToolchains := []string{ toolchainEmpty1, toolchainEmpty2, toolchainA1, toolchainA2, toolchainAB, toolchainBA, } testCases := []struct { field string toolchains []string expectedText string }{ {field: "features", toolchains: toolchainsEmpty, expectedText: ` features = []`}, {field: "features", toolchains: toolchainsOneNonempty, expectedText: ` if (ctx.attr.cpu == 
"cpuA"): features = [] elif (ctx.attr.cpu == "cpuC"): features = [a_feature] else: fail("Unreachable")`}, {field: "features", toolchains: toolchainsSameNonempty, expectedText: ` features = [a_feature]`}, {field: "features", toolchains: toolchainsDifferentOrder, expectedText: ` if (ctx.attr.cpu == "cpuC"): features = [a_feature, b_feature] elif (ctx.attr.cpu == "cpuD"): features = [b_feature, a_feature] else: fail("Unreachable")`}, {field: "features", toolchains: allToolchains, expectedText: ` if (ctx.attr.cpu == "cpuA" or ctx.attr.cpu == "cpuB"): features = [] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA" or ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): features = [a_feature] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerC"): features = [a_feature, b_feature] elif (ctx.attr.cpu == "cpuD"): features = [b_feature, a_feature] else: fail("Unreachable")`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func getActionConfig(lines []string) string { return fmt.Sprintf(`action_config { %s }`, strings.Join(lines, "\n ")) } func TestActionConfigListAssignment(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainA1 := getCToolchain("3", "cpuC", "compilerA", []string{ getActionConfig([]string{"action_name: 'A'", "config_name: 'A'"}), }, ) toolchainA2 := getCToolchain("4", "cpuC", "compilerB", []string{ getActionConfig([]string{"action_name: 'A'", "config_name: 'A'"}), }, ) toolchainAB := getCToolchain("5", "cpuC", "compilerC", []string{ getActionConfig([]string{"action_name: 'A'", "config_name: 'A'"}), getActionConfig([]string{"action_name: 'B'", "config_name: 'B'"}), }, ) toolchainBA := getCToolchain("6", "cpuD", "compilerA", []string{ getActionConfig([]string{"action_name: 'B'", "config_name: 'B'"}), getActionConfig([]string{"action_name: 'A'", "config_name: 'A'"}), }, ) toolchainsEmpty := []string{toolchainEmpty1, toolchainEmpty2} toolchainsOneNonempty := []string{toolchainEmpty1, toolchainA1} toolchainsSameNonempty := []string{toolchainA1, toolchainA2} toolchainsDifferentOrder := []string{toolchainAB, toolchainBA} allToolchains := []string{ toolchainEmpty1, toolchainEmpty2, toolchainA1, toolchainA2, toolchainAB, toolchainBA, } testCases := []struct { field string toolchains []string expectedText string }{ {field: "action_configs", toolchains: toolchainsEmpty, expectedText: ` action_configs = []`}, {field: "action_configs", toolchains: toolchainsOneNonempty, expectedText: ` if (ctx.attr.cpu == "cpuA"): action_configs = [] elif (ctx.attr.cpu == "cpuC"): action_configs = [a_action] else: fail("Unreachable")`}, {field: "action_configs", toolchains: toolchainsSameNonempty, expectedText: ` action_configs = [a_action]`}, {field: "action_configs", toolchains: toolchainsDifferentOrder, expectedText: ` if (ctx.attr.cpu == "cpuC"): action_configs = [a_action, b_action] elif (ctx.attr.cpu == "cpuD"): action_configs = [b_action, a_action] else: fail("Unreachable")`}, {field: "action_configs", toolchains: allToolchains, expectedText: ` if (ctx.attr.cpu == "cpuA" or 
ctx.attr.cpu == "cpuB"): action_configs = [] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA" or ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): action_configs = [a_action] elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerC"): action_configs = [a_action, b_action] elif (ctx.attr.cpu == "cpuD"): action_configs = [b_action, a_action] else: fail("Unreachable")`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func TestAllAndNoneAvailableErrorsWhenMoreThanOneElement(t *testing.T) { toolchainFeatureAllAvailable := getCToolchain("1", "cpu", "compiler", []string{getFeature([]string{ "name: 'A'", "flag_set {", " action: 'A'", " flag_group {", " flag: 'f'", " expand_if_all_available: 'e1'", " expand_if_all_available: 'e2'", " }", "}", })}, ) toolchainFeatureNoneAvailable := getCToolchain("1", "cpu", "compiler", []string{getFeature([]string{ "name: 'A'", "flag_set {", " action: 'A'", " flag_group {", " flag: 'f'", " expand_if_none_available: 'e1'", " expand_if_none_available: 'e2'", " }", "}", })}, ) toolchainActionConfigAllAvailable := getCToolchain("1", "cpu", "compiler", []string{getActionConfig([]string{ "config_name: 'A'", "action_name: 'A'", "flag_set {", " action: 'A'", " flag_group {", " flag: 'f'", " expand_if_all_available: 'e1'", " expand_if_all_available: 'e2'", " }", "}", })}, ) toolchainActionConfigNoneAvailable := getCToolchain("1", "cpu", "compiler", []string{getActionConfig([]string{ "config_name: 'A'", "action_name: 'A'", "flag_set {", " action: 'A'", " flag_group {", " flag: 'f'", " expand_if_none_available: 'e1'", " expand_if_none_available: 'e2'", " }", "}", })}, ) testCases := []struct { field string toolchain string expectedText string }{ {field: "features", toolchain: toolchainFeatureAllAvailable, expectedText: "Error in feature 'A': Flag group must not have more " + "than one 'expand_if_all_available' field"}, {field: "features", toolchain: toolchainFeatureNoneAvailable, expectedText: "Error in feature 'A': Flag group must not have more " + "than one 'expand_if_none_available' field"}, {field: "action_configs", toolchain: toolchainActionConfigAllAvailable, expectedText: "Error in action_config 'A': Flag group must not have more " + "than one 'expand_if_all_available' field"}, {field: "action_configs", toolchain: toolchainActionConfigNoneAvailable, expectedText: "Error in action_config 'A': Flag group must not have more " + "than one 'expand_if_none_available' field"}, } for _, tc := range testCases { crosstool := makeCrosstool([]string{tc.toolchain}) _, err := Transform(crosstool) if err == nil || !strings.Contains(err.Error(), tc.expectedText) { t.Errorf("Expected error: %s, got: %v", tc.expectedText, err) } } } func TestFeaturesAndActionConfigsSetToNoneWhenAllOptionsAreExausted(t *testing.T) { toolchainFeatureAEnabled := getCToolchain("1", "cpuA", "compilerA", []string{getFeature([]string{"name: 'A'", "enabled: true"})}, ) toolchainFeatureADisabled := getCToolchain("2", "cpuA", "compilerB", []string{getFeature([]string{"name: 'A'", "enabled: false"})}, ) toolchainWithoutFeatureA := getCToolchain("3", "cpuA", "compilerC", []string{}) 
toolchainActionConfigAEnabled := getCToolchain("4", "cpuA", "compilerD", []string{getActionConfig([]string{ "config_name: 'A'", "action_name: 'A'", "enabled: true", })}) toolchainActionConfigADisabled := getCToolchain("5", "cpuA", "compilerE", []string{getActionConfig([]string{ "config_name: 'A'", "action_name: 'A'", })}) toolchainWithoutActionConfigA := getCToolchain("6", "cpuA", "compilerF", []string{}) testCases := []struct { field string toolchains []string expectedText string }{ {field: "features", toolchains: []string{ toolchainFeatureAEnabled, toolchainFeatureADisabled, toolchainWithoutFeatureA}, expectedText: ` if (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerB"): a_feature = feature(name = "A") elif (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerA"): a_feature = feature(name = "A", enabled = True) else: a_feature = None `}, {field: "action_config", toolchains: []string{ toolchainActionConfigAEnabled, toolchainActionConfigADisabled, toolchainWithoutActionConfigA}, expectedText: ` if (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerE"): a_action = action_config(action_name = "A") elif (ctx.attr.cpu == "cpuA" and ctx.attr.compiler == "compilerD"): a_action = action_config(action_name = "A", enabled = True) else: a_action = None `}, } for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly convert '%s' field, expected to contain:\n%v\n", tc.field, tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func TestActionConfigDeclaration(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainNameNotInDict := getCToolchain("3", "cpBC", "compilerB", []string{ getActionConfig([]string{"action_name: 'A-B.C'", "config_name: 'A-B.C'"}), }, ) toolchainNameInDictA := getCToolchain("4", "cpuC", "compilerA", []string{ getActionConfig([]string{"action_name: 'c++-compile'", "config_name: 'c++-compile'"}), }, ) toolchainNameInDictB := getCToolchain("5", "cpuC", "compilerB", []string{ getActionConfig([]string{ "action_name: 'c++-compile'", "config_name: 'c++-compile'", "tool {", " tool_path: '/a/b/c'", "}", }), }, ) toolchainComplexActionConfig := getCToolchain("6", "cpuC", "compilerC", []string{ getActionConfig([]string{ "action_name: 'action-complex'", "config_name: 'action-complex'", "enabled: true", "tool {", " tool_path: '/a/b/c'", " with_feature {", " feature: 'a'", " feature: 'b'", " not_feature: 'c'", " not_feature: 'd'", " }", " with_feature{", " feature: 'e'", " }", " execution_requirement: 'a'", "}", "tool {", " tool_path: ''", "}", "flag_set {", " flag_group {", " flag: 'a'", " flag: '%b'", " iterate_over: 'c'", " expand_if_all_available: 'd'", " expand_if_none_available: 'e'", " expand_if_true: 'f'", " expand_if_false: 'g'", " expand_if_equal {", " variable: 'var'", " value: 'val'", " }", " }", " flag_group {", " flag_group {", " flag: 'a'", " }", " }", "}", "flag_set {", " with_feature {", " feature: 'a'", " feature: 'b'", " not_feature: 'c'", " not_feature: 'd'", " }", "}", "env_set {", " action: 'a'", " env_entry {", " key: 'k'", " value: 'v'", " }", " with_feature {", " feature: 'a'", " }", "}", "requires {", " feature: 'a'", " feature: 'b'", "}", "implies: 'a'", "implies: 'b'", }), }, ) 
testCases := []struct { toolchains []string expectedText string }{ { toolchains: []string{toolchainEmpty1, toolchainEmpty2}, expectedText: ` action_configs = []`}, { toolchains: []string{toolchainEmpty1, toolchainNameNotInDict}, expectedText: ` a_b_c_action = action_config(action_name = "A-B.C")`}, { toolchains: []string{toolchainNameInDictA, toolchainNameInDictB}, expectedText: ` if (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerB"): cpp_compile_action = action_config( action_name = ACTION_NAMES.cpp_compile, tools = [tool(path = "/a/b/c")], ) elif (ctx.attr.cpu == "cpuC" and ctx.attr.compiler == "compilerA"): cpp_compile_action = action_config(action_name = ACTION_NAMES.cpp_compile)`}, { toolchains: []string{toolchainComplexActionConfig}, expectedText: ` action_complex_action = action_config( action_name = "action-complex", enabled = True, flag_sets = [ flag_set( flag_groups = [ flag_group( flags = ["a", "%b"], iterate_over = "c", expand_if_available = "d", expand_if_not_available = "e", expand_if_true = "f", expand_if_false = "g", expand_if_equal = variable_with_value(name = "var", value = "val"), ), flag_group(flag_groups = [flag_group(flags = ["a"])]), ], ), flag_set( with_features = [ with_feature_set( features = ["a", "b"], not_features = ["c", "d"], ), ], ), ], implies = ["a", "b"], tools = [ tool( path = "/a/b/c", with_features = [ with_feature_set( features = ["a", "b"], not_features = ["c", "d"], ), with_feature_set(features = ["e"]), ], execution_requirements = ["a"], ), tool(path = "NOT_USED"), ], )`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly declare an action_config, expected to contain:\n%v\n", tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func TestFeatureDeclaration(t *testing.T) { toolchainEmpty1 := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainEmpty2 := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainSimpleFeatureA1 := getCToolchain("3", "cpuB", "compilerB", []string{ getFeature([]string{"name: 'Feature-c++.a'", "enabled: true"}), }, ) toolchainSimpleFeatureA2 := getCToolchain("4", "cpuC", "compilerA", []string{ getFeature([]string{"name: 'Feature-c++.a'"}), }, ) toolchainComplexFeature := getCToolchain("5", "cpuC", "compilerC", []string{ getFeature([]string{ "name: 'complex-feature'", "enabled: true", "flag_set {", " action: 'c++-compile'", // in ACTION_NAMES " action: 'something-else'", // not in ACTION_NAMES " flag_group {", " flag: 'a'", " flag: '%b'", " iterate_over: 'c'", " expand_if_all_available: 'd'", " expand_if_none_available: 'e'", " expand_if_true: 'f'", " expand_if_false: 'g'", " expand_if_equal {", " variable: 'var'", " value: 'val'", " }", " }", " flag_group {", " flag_group {", " flag: 'a'", " }", " }", "}", "flag_set {", // all_compile_actions " action: 'c-compile'", " action: 'c++-compile'", " action: 'linkstamp-compile'", " action: 'assemble'", " action: 'preprocess-assemble'", " action: 'c++-header-parsing'", " action: 'c++-module-compile'", " action: 'c++-module-codegen'", " action: 'clif-match'", " action: 'lto-backend'", "}", "flag_set {", // all_cpp_compile_actions " action: 'c++-compile'", " action: 'linkstamp-compile'", " action: 'c++-header-parsing'", " action: 'c++-module-compile'", " action: 'c++-module-codegen'", " action: 
'clif-match'", "}", "flag_set {", // all_link_actions " action: 'c++-link-executable'", " action: 'c++-link-dynamic-library'", " action: 'c++-link-nodeps-dynamic-library'", "}", "flag_set {", // all_cpp_compile_actions + all_link_actions " action: 'c++-compile'", " action: 'linkstamp-compile'", " action: 'c++-header-parsing'", " action: 'c++-module-compile'", " action: 'c++-module-codegen'", " action: 'clif-match'", " action: 'c++-link-executable'", " action: 'c++-link-dynamic-library'", " action: 'c++-link-nodeps-dynamic-library'", "}", "flag_set {", // all_link_actions + something else " action: 'c++-link-executable'", " action: 'c++-link-dynamic-library'", " action: 'c++-link-nodeps-dynamic-library'", " action: 'some.unknown-c++.action'", "}", "env_set {", " action: 'a'", " env_entry {", " key: 'k'", " value: 'v'", " }", " with_feature {", " feature: 'a'", " }", "}", "env_set {", " action: 'c-compile'", "}", "env_set {", // all_compile_actions " action: 'c-compile'", " action: 'c++-compile'", " action: 'linkstamp-compile'", " action: 'assemble'", " action: 'preprocess-assemble'", " action: 'c++-header-parsing'", " action: 'c++-module-compile'", " action: 'c++-module-codegen'", " action: 'clif-match'", " action: 'lto-backend'", "}", "requires {", " feature: 'a'", " feature: 'b'", "}", "implies: 'a'", "implies: 'b'", "provides: 'c'", "provides: 'd'", }), }, ) testCases := []struct { toolchains []string expectedText string }{ { toolchains: []string{toolchainEmpty1, toolchainEmpty2}, expectedText: ` features = [] `}, { toolchains: []string{toolchainEmpty1, toolchainSimpleFeatureA1}, expectedText: ` feature_cpp_a_feature = feature(name = "Feature-c++.a", enabled = True)`}, { toolchains: []string{toolchainSimpleFeatureA1, toolchainSimpleFeatureA2}, expectedText: ` if (ctx.attr.cpu == "cpuC"): feature_cpp_a_feature = feature(name = "Feature-c++.a") elif (ctx.attr.cpu == "cpuB"): feature_cpp_a_feature = feature(name = "Feature-c++.a", enabled = True)`}, { toolchains: []string{toolchainComplexFeature}, expectedText: ` complex_feature_feature = feature( name = "complex-feature", enabled = True, flag_sets = [ flag_set( actions = [ACTION_NAMES.cpp_compile, "something-else"], flag_groups = [ flag_group( flags = ["a", "%b"], iterate_over = "c", expand_if_available = "d", expand_if_not_available = "e", expand_if_true = "f", expand_if_false = "g", expand_if_equal = variable_with_value(name = "var", value = "val"), ), flag_group(flag_groups = [flag_group(flags = ["a"])]), ], ), flag_set(actions = all_compile_actions), flag_set(actions = all_cpp_compile_actions), flag_set(actions = all_link_actions), flag_set( actions = all_cpp_compile_actions + all_link_actions, ), flag_set( actions = all_link_actions + ["some.unknown-c++.action"], ), ], env_sets = [ env_set( actions = ["a"], env_entries = [env_entry(key = "k", value = "v")], with_features = [with_feature_set(features = ["a"])], ), env_set(actions = [ACTION_NAMES.c_compile]), env_set(actions = all_compile_actions), ], requires = [feature_set(features = ["a", "b"])], implies = ["a", "b"], provides = ["c", "d"], )`}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly declare a feature, expected to contain:\n%v\n", tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } func TestRule(t *testing.T) 
{ simpleToolchain := getSimpleCToolchain("simple") expected := `load("@bazel_tools//tools/cpp:cc_toolchain_config_lib.bzl", "action_config", "artifact_name_pattern", "env_entry", "env_set", "feature", "feature_set", "flag_group", "flag_set", "make_variable", "tool", "tool_path", "variable_with_value", "with_feature_set", ) load("@bazel_tools//tools/build_defs/cc:action_names.bzl", "ACTION_NAMES") def _impl(ctx): toolchain_identifier = "id-simple" host_system_name = "host-simple" target_system_name = "target-simple" target_cpu = "cpu-simple" target_libc = "libc-simple" compiler = "compiler-simple" abi_version = "version-simple" abi_libc_version = "libc_version-simple" cc_target_os = None builtin_sysroot = None all_compile_actions = [ ACTION_NAMES.c_compile, ACTION_NAMES.cpp_compile, ACTION_NAMES.linkstamp_compile, ACTION_NAMES.assemble, ACTION_NAMES.preprocess_assemble, ACTION_NAMES.cpp_header_parsing, ACTION_NAMES.cpp_module_compile, ACTION_NAMES.cpp_module_codegen, ACTION_NAMES.clif_match, ACTION_NAMES.lto_backend, ] all_cpp_compile_actions = [ ACTION_NAMES.cpp_compile, ACTION_NAMES.linkstamp_compile, ACTION_NAMES.cpp_header_parsing, ACTION_NAMES.cpp_module_compile, ACTION_NAMES.cpp_module_codegen, ACTION_NAMES.clif_match, ] preprocessor_compile_actions = [ ACTION_NAMES.c_compile, ACTION_NAMES.cpp_compile, ACTION_NAMES.linkstamp_compile, ACTION_NAMES.preprocess_assemble, ACTION_NAMES.cpp_header_parsing, ACTION_NAMES.cpp_module_compile, ACTION_NAMES.clif_match, ] codegen_compile_actions = [ ACTION_NAMES.c_compile, ACTION_NAMES.cpp_compile, ACTION_NAMES.linkstamp_compile, ACTION_NAMES.assemble, ACTION_NAMES.preprocess_assemble, ACTION_NAMES.cpp_module_codegen, ACTION_NAMES.lto_backend, ] all_link_actions = [ ACTION_NAMES.cpp_link_executable, ACTION_NAMES.cpp_link_dynamic_library, ACTION_NAMES.cpp_link_nodeps_dynamic_library, ] action_configs = [] features = [] cxx_builtin_include_directories = [] artifact_name_patterns = [] make_variables = [] tool_paths = [] out = ctx.actions.declare_file(ctx.label.name) ctx.actions.write(out, "Fake executable") return [ cc_common.create_cc_toolchain_config_info( ctx = ctx, features = features, action_configs = action_configs, artifact_name_patterns = artifact_name_patterns, cxx_builtin_include_directories = cxx_builtin_include_directories, toolchain_identifier = toolchain_identifier, host_system_name = host_system_name, target_system_name = target_system_name, target_cpu = target_cpu, target_libc = target_libc, compiler = compiler, abi_version = abi_version, abi_libc_version = abi_libc_version, tool_paths = tool_paths, make_variables = make_variables, builtin_sysroot = builtin_sysroot, cc_target_os = cc_target_os ), DefaultInfo( executable = out, ), ] cc_toolchain_config = rule( implementation = _impl, attrs = { "cpu": attr.string(mandatory=True, values=["cpu-simple"]), }, provides = [CcToolchainConfigInfo], executable = True, ) ` crosstool := makeCrosstool([]string{simpleToolchain}) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if got != expected { t.Fatalf("Expected:\n%v\nGot:\n%v\nTested CROSSTOOL:\n%v", expected, got, simpleToolchain) } } func TestAllowedCompilerValues(t *testing.T) { toolchainAA := getCToolchain("1", "cpuA", "compilerA", []string{}) toolchainBA := getCToolchain("2", "cpuB", "compilerA", []string{}) toolchainBB := getCToolchain("3", "cpuB", "compilerB", []string{}) toolchainCC := getCToolchain("4", "cpuC", "compilerC", []string{}) testCases := []struct { toolchains []string 
expectedText string }{ { toolchains: []string{toolchainAA, toolchainBA}, expectedText: ` cc_toolchain_config = rule( implementation = _impl, attrs = { "cpu": attr.string(mandatory=True, values=["cpuA", "cpuB"]), }, provides = [CcToolchainConfigInfo], executable = True, ) `}, { toolchains: []string{toolchainBA, toolchainBB}, expectedText: ` cc_toolchain_config = rule( implementation = _impl, attrs = { "cpu": attr.string(mandatory=True, values=["cpuB"]), "compiler": attr.string(mandatory=True, values=["compilerA", "compilerB"]), }, provides = [CcToolchainConfigInfo], executable = True, ) `}, { toolchains: []string{toolchainAA, toolchainBA, toolchainBB}, expectedText: ` cc_toolchain_config = rule( implementation = _impl, attrs = { "cpu": attr.string(mandatory=True, values=["cpuA", "cpuB"]), "compiler": attr.string(mandatory=True, values=["compilerA", "compilerB"]), }, provides = [CcToolchainConfigInfo], executable = True, ) `}, { toolchains: []string{toolchainAA, toolchainBA, toolchainBB, toolchainCC}, expectedText: ` cc_toolchain_config = rule( implementation = _impl, attrs = { "cpu": attr.string(mandatory=True, values=["cpuA", "cpuB", "cpuC"]), "compiler": attr.string(mandatory=True, values=["compilerA", "compilerB", "compilerC"]), }, provides = [CcToolchainConfigInfo], executable = True, ) `}} for _, tc := range testCases { crosstool := makeCrosstool(tc.toolchains) got, err := Transform(crosstool) if err != nil { t.Fatalf("CROSSTOOL conversion failed: %v", err) } if !strings.Contains(got, tc.expectedText) { t.Errorf("Failed to correctly declare the rule, expected to contain:\n%v\n", tc.expectedText) t.Fatalf("Tested CROSSTOOL:\n%v\n\nGenerated rule:\n%v\n", strings.Join(tc.toolchains, "\n"), got) } } } 07070100000031000081A4000003E800000064000000015D359B4200001261000000000000000000000000000000000000004100000000bazel-rules-cc-20190722/tools/migration/ctoolchain_comparator.py# Copyright 2018 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. r"""A script that compares 2 CToolchains from proto format. This script accepts two files in either a CROSSTOOL proto text format or a CToolchain proto text format. It then locates the CToolchains with the given toolchain_identifier and checks if the resulting CToolchain objects in Java are the same. 
Example usage: bazel run \ @rules_cc//tools/migration:ctoolchain_comparator -- \ --before=/path/to/CROSSTOOL1 \ --after=/path/to/CROSSTOOL2 \ --toolchain_identifier=id """ import os from absl import app from absl import flags from google.protobuf import text_format from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2 from tools.migration.ctoolchain_comparator_lib import compare_ctoolchains flags.DEFINE_string( "before", None, ("A text proto file containing the relevant CToolchain before the change, " "either a CROSSTOOL file or a single CToolchain proto text")) flags.DEFINE_string( "after", None, ("A text proto file containing the relevant CToolchain after the change, " "either a CROSSTOOL file or a single CToolchain proto text")) flags.DEFINE_string("toolchain_identifier", None, "The identifier of the CToolchain that is being compared.") flags.mark_flag_as_required("before") flags.mark_flag_as_required("after") flags.mark_flag_as_required("toolchain_identifier") def _to_absolute_path(path): path = os.path.expanduser(path) if os.path.isabs(path): return path else: if "BUILD_WORKING_DIRECTORY" in os.environ: return os.path.join(os.environ["BUILD_WORKING_DIRECTORY"], path) else: return path def _find_toolchain(crosstool, toolchain_identifier): for toolchain in crosstool.toolchain: if toolchain.toolchain_identifier == toolchain_identifier: return toolchain return None def _read_crosstool_or_ctoolchain_proto(input_file, toolchain_identifier): """Reads a proto file and finds the CToolchain with the given identifier.""" with open(input_file, "r") as f: text = f.read() crosstool_release = crosstool_config_pb2.CrosstoolRelease() c_toolchain = crosstool_config_pb2.CToolchain() try: text_format.Merge(text, crosstool_release) toolchain = _find_toolchain(crosstool_release, toolchain_identifier) if toolchain is None: print(("Cannot find a CToolchain with an identifier '%s' in CROSSTOOL " "file") % toolchain_identifier) return None return toolchain except text_format.ParseError as crosstool_error: try: text_format.Merge(text, c_toolchain) if c_toolchain.toolchain_identifier != toolchain_identifier: print(("Expected CToolchain with identifier '%s', got CToolchain with " "identifier '%s'" % (toolchain_identifier, c_toolchain.toolchain_identifier))) return None return c_toolchain except text_format.ParseError as toolchain_error: print(("Error parsing file '%s':" % input_file)) # pylint: disable=superfluous-parens print("Attempt to parse it as a CROSSTOOL proto:") # pylint: disable=superfluous-parens print(crosstool_error) # pylint: disable=superfluous-parens print("Attempt to parse it as a CToolchain proto:") # pylint: disable=superfluous-parens print(toolchain_error) # pylint: disable=superfluous-parens return None def main(unused_argv): before_file = _to_absolute_path(flags.FLAGS.before) after_file = _to_absolute_path(flags.FLAGS.after) toolchain_identifier = flags.FLAGS.toolchain_identifier toolchain_before = _read_crosstool_or_ctoolchain_proto( before_file, toolchain_identifier) toolchain_after = _read_crosstool_or_ctoolchain_proto(after_file, toolchain_identifier) if not toolchain_before or not toolchain_after: print("There was an error getting the required toolchains.") exit(1) found_difference = compare_ctoolchains(toolchain_before, toolchain_after) if found_difference: exit(1) if __name__ == "__main__": app.run(main)
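# A minimal sketch (not part of the original script) of driving the same
# comparison without the CLI wrapper, reusing the helpers defined above;
# the file paths and the toolchain identifier below are placeholders:
#
#   before = _read_crosstool_or_ctoolchain_proto("/tmp/CROSSTOOL.before", "local")
#   after = _read_crosstool_or_ctoolchain_proto("/tmp/CROSSTOOL.after", "local")
#   if before and after and compare_ctoolchains(before, after):
#       print("The two CToolchains differ")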
07070100000032000081A4000003E800000064000000015D359B42000058CA000000000000000000000000000000000000004500000000bazel-rules-cc-20190722/tools/migration/ctoolchain_comparator_lib.py# Copyright 2018 The Bazel Authors. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. """Module providing compare_ctoolchains function. compare_ctoolchains takes in two parsed CToolchains and compares them """ def _print_difference(field_name, before_value, after_value): if not before_value and after_value: print(("Difference in '%s' field:\nValue before change is not set\n" "Value after change is set to '%s'") % (field_name, after_value)) elif before_value and not after_value: print(("Difference in '%s' field:\nValue before change is set to '%s'\n" "Value after change is not set") % (field_name, before_value)) else: print(("Difference in '%s' field:\nValue before change:\t'%s'\n" "Value after change:\t'%s'\n") % (field_name, before_value, after_value)) def _array_to_string(arr, ordered=False): if not arr: return "[]" elif len(arr) == 1: return "[" + list(arr)[0] + "]" if not ordered: return "[\n\t%s\n]" % "\n\t".join(arr) else: return "[\n\t%s\n]" % "\n\t".join(sorted(list(arr))) def _check_with_feature_set_equivalence(before, after): before_set = set() after_set = set() for el in before: before_set.add((str(set(el.feature)), str(set(el.not_feature)))) for el in after: after_set.add((str(set(el.feature)), str(set(el.not_feature)))) return before_set == after_set def _check_tool_equivalence(before, after): """Compares two "CToolchain.Tool"s.""" if before.tool_path == "NOT_USED": before.tool_path = "" if after.tool_path == "NOT_USED": after.tool_path = "" if before.tool_path != after.tool_path: return False if set(before.execution_requirement) != set(after.execution_requirement): return False if not _check_with_feature_set_equivalence(before.with_feature, after.with_feature): return False return True def _check_flag_group_equivalence(before, after): """Compares two "CToolchain.FlagGroup"s.""" if before.flag != after.flag: return False if before.expand_if_true != after.expand_if_true: return False if before.expand_if_false != after.expand_if_false: return False if set(before.expand_if_all_available) != set(after.expand_if_all_available): return False if set(before.expand_if_none_available) != set( after.expand_if_none_available): return False if before.iterate_over != after.iterate_over: return False if before.expand_if_equal != after.expand_if_equal: return False if len(before.flag_group) != len(after.flag_group): return False for (flag_group_before, flag_group_after) in zip(before.flag_group, after.flag_group): if not _check_flag_group_equivalence(flag_group_before, flag_group_after): return False return True def _check_flag_set_equivalence(before, after, in_action_config=False): """Compares two "CToolchain.FlagSet"s.""" # ActionConfigs in proto format do not have a 'FlagSet.action' field set. # Instead, when construction the Java ActionConfig object, we set the # flag_set.action field to the action name. 
  # This currently causes the
  # CcToolchainConfigInfo.proto to generate a CToolchain.ActionConfig that still
  # has the action name in the FlagSet.action field, therefore we don't compare
  # the FlagSet.action field when comparing flag_sets that belong to an
  # ActionConfig.
  if not in_action_config and set(before.action) != set(after.action):
    return False
  if not _check_with_feature_set_equivalence(before.with_feature,
                                             after.with_feature):
    return False
  if len(before.flag_group) != len(after.flag_group):
    return False
  for (flag_group_before, flag_group_after) in zip(before.flag_group,
                                                   after.flag_group):
    if not _check_flag_group_equivalence(flag_group_before, flag_group_after):
      return False
  return True


def _check_action_config_equivalence(before, after):
  """Compares two "CToolchain.ActionConfig"s."""
  if before.config_name != after.config_name:
    return False
  if before.action_name != after.action_name:
    return False
  if before.enabled != after.enabled:
    return False
  if len(before.tool) != len(after.tool):
    return False
  for (tool_before, tool_after) in zip(before.tool, after.tool):
    if not _check_tool_equivalence(tool_before, tool_after):
      return False
  if before.implies != after.implies:
    return False
  if len(before.flag_set) != len(after.flag_set):
    return False
  for (flag_set_before, flag_set_after) in zip(before.flag_set,
                                               after.flag_set):
    if not _check_flag_set_equivalence(flag_set_before, flag_set_after, True):
      return False
  return True


def _check_env_set_equivalence(before, after):
  """Compares two "CToolchain.EnvSet"s."""
  if set(before.action) != set(after.action):
    return False
  if not _check_with_feature_set_equivalence(before.with_feature,
                                             after.with_feature):
    return False
  if before.env_entry != after.env_entry:
    return False
  return True


def _check_feature_equivalence(before, after):
  """Compares two "CToolchain.Feature"s."""
  if before.name != after.name:
    return False
  if before.enabled != after.enabled:
    return False
  if len(before.flag_set) != len(after.flag_set):
    return False
  for (flag_set_before, flag_set_after) in zip(before.flag_set,
                                               after.flag_set):
    if not _check_flag_set_equivalence(flag_set_before, flag_set_after):
      return False
  if len(before.env_set) != len(after.env_set):
    return False
  for (env_set_before, env_set_after) in zip(before.env_set, after.env_set):
    if not _check_env_set_equivalence(env_set_before, env_set_after):
      return False
  if len(before.requires) != len(after.requires):
    return False
  for (requires_before, requires_after) in zip(before.requires,
                                               after.requires):
    if set(requires_before.feature) != set(requires_after.feature):
      return False
  if before.implies != after.implies:
    return False
  if before.provides != after.provides:
    return False
  return True


def _compare_features(features_before, features_after):
  """Compares two "CToolchain.Feature" lists."""
  feature_name_to_feature_before = {}
  feature_name_to_feature_after = {}
  for feature in features_before:
    feature_name_to_feature_before[feature.name] = feature
  for feature in features_after:
    feature_name_to_feature_after[feature.name] = feature

  feature_names_before = set(feature_name_to_feature_before.keys())
  feature_names_after = set(feature_name_to_feature_after.keys())
  before_after_diff = feature_names_before - feature_names_after
  after_before_diff = feature_names_after - feature_names_before

  diff_string = "Difference in 'feature' field:"
  found_difference = False
  if before_after_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List before change contains entries for the following features "
           "that the list after the change doesn't:\n%s") % _array_to_string(
               before_after_diff, ordered=True))
  if after_before_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List after change contains entries for the following features "
           "that the list before the change doesn't:\n%s") % _array_to_string(
               after_before_diff, ordered=True))

  names_before = [feature.name for feature in features_before]
  names_after = [feature.name for feature in features_after]
  if names_before != names_after:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("Features not in right order:\n"
           "* List of features before change:\t%s\n"
           "* List of features after change:\t%s") %
          (_array_to_string(names_before), _array_to_string(names_after)))

  for name in feature_name_to_feature_before:
    feature_before = feature_name_to_feature_before[name]
    feature_after = feature_name_to_feature_after.get(name, None)
    if feature_after and not _check_feature_equivalence(feature_before,
                                                        feature_after):
      if not found_difference:
        print(diff_string)  # pylint: disable=superfluous-parens
        found_difference = True
      print(("* Feature '%s' differs before and after the change:\n"
             "Value before change:\n%s\n"
             "Value after change:\n%s") % (name, str(feature_before),
                                           str(feature_after)))
  if found_difference:
    print("")  # pylint: disable=superfluous-parens
  return found_difference


def _compare_action_configs(action_configs_before, action_configs_after):
  """Compares two "CToolchain.ActionConfig" lists."""
  action_name_to_action_before = {}
  action_name_to_action_after = {}
  for action_config in action_configs_before:
    action_name_to_action_before[action_config.config_name] = action_config
  for action_config in action_configs_after:
    action_name_to_action_after[action_config.config_name] = action_config

  config_names_before = set(action_name_to_action_before.keys())
  config_names_after = set(action_name_to_action_after.keys())
  before_after_diff = config_names_before - config_names_after
  after_before_diff = config_names_after - config_names_before

  diff_string = "Difference in 'action_config' field:"
  found_difference = False
  if before_after_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List before change contains entries for the following "
           "action_configs that the list after the change doesn't:\n%s") %
          _array_to_string(before_after_diff, ordered=True))
  if after_before_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List after change contains entries for the following "
           "action_configs that the list before the change doesn't:\n%s") %
          _array_to_string(after_before_diff, ordered=True))

  names_before = [config.config_name for config in action_configs_before]
  names_after = [config.config_name for config in action_configs_after]
  if names_before != names_after:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("Action configs not in right order:\n"
           "* List of action configs before change:\t%s\n"
           "* List of action configs after change:\t%s") %
          (_array_to_string(names_before), _array_to_string(names_after)))

  for name in config_names_before:
    action_config_before = action_name_to_action_before[name]
    action_config_after = action_name_to_action_after.get(name, None)
    if action_config_after and not _check_action_config_equivalence(
        action_config_before, action_config_after):
      if not found_difference:
        print(diff_string)  # pylint: disable=superfluous-parens
        found_difference = True
      print(("* Action config '%s' differs before and after the change:\n"
             "Value before change:\n%s\n"
             "Value after change:\n%s") % (name, str(action_config_before),
                                           str(action_config_after)))
  if found_difference:
    print("")  # pylint: disable=superfluous-parens
  return found_difference


def _compare_tool_paths(tool_paths_before, tool_paths_after):
  """Compares two "CToolchain.ToolPath" lists."""
  tool_to_path_before = {}
  tool_to_path_after = {}
  for tool_path in tool_paths_before:
    tool_to_path_before[tool_path.name] = (
        tool_path.path if tool_path.path != "NOT_USED" else "")
  for tool_path in tool_paths_after:
    tool_to_path_after[tool_path.name] = (
        tool_path.path if tool_path.path != "NOT_USED" else "")

  tool_names_before = set(tool_to_path_before.keys())
  tool_names_after = set(tool_to_path_after.keys())
  before_after_diff = tool_names_before - tool_names_after
  after_before_diff = tool_names_after - tool_names_before

  diff_string = "Difference in 'tool_path' field:"
  found_difference = False
  if before_after_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List before change contains entries for the following tools "
           "that the list after the change doesn't:\n%s") % _array_to_string(
               before_after_diff, ordered=True))
  if after_before_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List after change contains entries for the following tools that "
           "the list before the change doesn't:\n%s") % _array_to_string(
               after_before_diff, ordered=True))

  for tool in tool_to_path_before:
    path_before = tool_to_path_before[tool]
    path_after = tool_to_path_after.get(tool, None)
    if path_after and path_after != path_before:
      if not found_difference:
        print(diff_string)  # pylint: disable=superfluous-parens
        found_difference = True
      print(("* Path for tool '%s' differs before and after the change:\n"
             "Value before change:\t'%s'\n"
             "Value after change:\t'%s'") % (tool, path_before, path_after))
  if found_difference:
    print("")  # pylint: disable=superfluous-parens
  return found_difference


def _compare_make_variables(make_variables_before, make_variables_after):
  """Compares two "CToolchain.MakeVariable" lists."""
  name_to_variable_before = {}
  name_to_variable_after = {}
  for variable in make_variables_before:
    name_to_variable_before[variable.name] = variable.value
  for variable in make_variables_after:
    name_to_variable_after[variable.name] = variable.value

  variable_names_before = set(name_to_variable_before.keys())
  variable_names_after = set(name_to_variable_after.keys())
  before_after_diff = variable_names_before - variable_names_after
  after_before_diff = variable_names_after - variable_names_before

  diff_string = "Difference in 'make_variable' field:"
  found_difference = False
  if before_after_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List before change contains entries for the following variables "
           "that the list after the change doesn't:\n%s") % _array_to_string(
               before_after_diff, ordered=True))
  if after_before_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List after change contains entries for the following variables "
           "that the list before the change doesn't:\n%s") % _array_to_string(
               after_before_diff, ordered=True))

  for variable in name_to_variable_before:
    value_before = name_to_variable_before[variable]
    value_after = name_to_variable_after.get(variable, None)
    if value_after and value_after != value_before:
      if not found_difference:
        print(diff_string)  # pylint: disable=superfluous-parens
        found_difference = True
      print(
          ("* Value for variable '%s' differs before and after the change:\n"
           "Value before change:\t'%s'\n"
           "Value after change:\t'%s'") % (variable, value_before, value_after))
  if found_difference:
    print("")  # pylint: disable=superfluous-parens
  return found_difference


def _compare_cxx_builtin_include_directories(directories_before,
                                             directories_after):
  if directories_before != directories_after:
    print(("Difference in 'cxx_builtin_include_directory' field:\n"
           "List of elements before change:\n%s\n"
           "List of elements after change:\n%s\n") %
          (_array_to_string(directories_before),
           _array_to_string(directories_after)))
    return True
  return False


def _compare_artifact_name_patterns(artifact_name_patterns_before,
                                    artifact_name_patterns_after):
  """Compares two "CToolchain.ArtifactNamePattern" lists."""
  category_to_values_before = {}
  category_to_values_after = {}
  for name_pattern in artifact_name_patterns_before:
    category_to_values_before[name_pattern.category_name] = (
        name_pattern.prefix, name_pattern.extension)
  for name_pattern in artifact_name_patterns_after:
    category_to_values_after[name_pattern.category_name] = (
        name_pattern.prefix, name_pattern.extension)

  category_names_before = set(category_to_values_before.keys())
  category_names_after = set(category_to_values_after.keys())
  before_after_diff = category_names_before - category_names_after
  after_before_diff = category_names_after - category_names_before

  diff_string = "Difference in 'artifact_name_pattern' field:"
  found_difference = False
  if before_after_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List before change contains entries for the following categories "
           "that the list after the change doesn't:\n%s") % _array_to_string(
               before_after_diff, ordered=True))
  if after_before_diff:
    if not found_difference:
      print(diff_string)  # pylint: disable=superfluous-parens
      found_difference = True
    print(("* List after change contains entries for the following categories "
           "that the list before the change doesn't:\n%s") % _array_to_string(
               after_before_diff, ordered=True))

  for category in category_to_values_before:
    value_before = category_to_values_before[category]
    value_after = category_to_values_after.get(category, None)
    if value_after and value_after != value_before:
      if not found_difference:
        print(diff_string)  # pylint: disable=superfluous-parens
        found_difference = True
      print(("* Value for category '%s' differs before and after the change:\n"
             "Value before change:\tprefix:'%s'\textension:'%s'\n"
             "Value after change:\tprefix:'%s'\textension:'%s'") %
            (category, value_before[0], value_before[1], value_after[0],
             value_after[1]))
  if found_difference:
    print("")  # pylint: disable=superfluous-parens
  return found_difference


def compare_ctoolchains(toolchain_before, toolchain_after):
  """Compares two CToolchains."""
  found_difference = False
  if (toolchain_before.toolchain_identifier !=
      toolchain_after.toolchain_identifier):
    _print_difference("toolchain_identifier",
                      toolchain_before.toolchain_identifier,
                      toolchain_after.toolchain_identifier)
    found_difference = True
  if toolchain_before.host_system_name != toolchain_after.host_system_name:
    _print_difference("host_system_name", toolchain_before.host_system_name,
                      toolchain_after.host_system_name)
    found_difference = True
  if toolchain_before.target_system_name != toolchain_after.target_system_name:
    _print_difference("target_system_name",
                      toolchain_before.target_system_name,
                      toolchain_after.target_system_name)
    found_difference = True
  if toolchain_before.target_cpu != toolchain_after.target_cpu:
    _print_difference("target_cpu", toolchain_before.target_cpu,
                      toolchain_after.target_cpu)
    found_difference = True
  if toolchain_before.target_libc != toolchain_after.target_libc:
    _print_difference("target_libc", toolchain_before.target_libc,
                      toolchain_after.target_libc)
    found_difference = True
  if toolchain_before.compiler != toolchain_after.compiler:
    _print_difference("compiler", toolchain_before.compiler,
                      toolchain_after.compiler)
    found_difference = True
  if toolchain_before.abi_version != toolchain_after.abi_version:
    _print_difference("abi_version", toolchain_before.abi_version,
                      toolchain_after.abi_version)
    found_difference = True
  if toolchain_before.abi_libc_version != toolchain_after.abi_libc_version:
    _print_difference("abi_libc_version", toolchain_before.abi_libc_version,
                      toolchain_after.abi_libc_version)
    found_difference = True
  if toolchain_before.cc_target_os != toolchain_after.cc_target_os:
    _print_difference("cc_target_os", toolchain_before.cc_target_os,
                      toolchain_after.cc_target_os)
    found_difference = True
  if toolchain_before.builtin_sysroot != toolchain_after.builtin_sysroot:
    _print_difference("builtin_sysroot", toolchain_before.builtin_sysroot,
                      toolchain_after.builtin_sysroot)
    found_difference = True
  found_difference = _compare_features(
      toolchain_before.feature, toolchain_after.feature) or found_difference
  found_difference = _compare_action_configs(
      toolchain_before.action_config,
      toolchain_after.action_config) or found_difference
  found_difference = _compare_tool_paths(
      toolchain_before.tool_path,
      toolchain_after.tool_path) or found_difference
  found_difference = _compare_cxx_builtin_include_directories(
      toolchain_before.cxx_builtin_include_directory,
      toolchain_after.cxx_builtin_include_directory) or found_difference
  found_difference = _compare_make_variables(
      toolchain_before.make_variable,
      toolchain_after.make_variable) or found_difference
  found_difference = _compare_artifact_name_patterns(
      toolchain_before.artifact_name_pattern,
      toolchain_after.artifact_name_pattern) or found_difference
  if not found_difference:
    print("No difference")  # pylint: disable=superfluous-parens
  return found_difference
07070100000033000081A4000003E800000064000000015D359B420000BF91000000000000000000000000000000000000004A00000000bazel-rules-cc-20190722/tools/migration/ctoolchain_comparator_lib_test.py
# Copyright 2018 The Bazel Authors. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
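# A minimal usage sketch of the library under test (identifiers such as
# 'before-id' are only examples): build two CToolchain messages the same way
# the make_toolchain() helper below does, then diff them.
#
#   toolchain_before = crosstool_config_pb2.CToolchain()
#   text_format.Merge("toolchain_identifier: 'before-id'", toolchain_before)
#   toolchain_after = crosstool_config_pb2.CToolchain()
#   text_format.Merge("toolchain_identifier: 'after-id'", toolchain_after)
#   # Prints the per-field differences and returns True when any were found.
#   compare_ctoolchains(toolchain_before, toolchain_after)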
import unittest from google.protobuf import text_format from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2 from tools.migration.ctoolchain_comparator_lib import compare_ctoolchains from py import mock try: # Python 2 from cStringIO import StringIO except ImportError: # Python 3 from io import StringIO def make_toolchain(toolchain_proto): toolchain = crosstool_config_pb2.CToolchain() text_format.Merge(toolchain_proto, toolchain) return toolchain class CtoolchainComparatorLibTest(unittest.TestCase): def test_string_fields(self): first = make_toolchain(""" toolchain_identifier: "first-id" host_system_name: "first-host" target_system_name: "first-target" target_cpu: "first-cpu" target_libc: "first-libc" compiler: "first-compiler" abi_version: "first-abi" abi_libc_version: "first-abi-libc" builtin_sysroot: "sysroot" """) second = make_toolchain(""" toolchain_identifier: "second-id" host_system_name: "second-host" target_system_name: "second-target" target_cpu: "second-cpu" target_libc: "second-libc" compiler: "second-compiler" abi_version: "second-abi" abi_libc_version: "second-abi-libc" cc_target_os: "os" """) error_toolchain_identifier = ( "Difference in 'toolchain_identifier' field:\n" "Value before change:\t'first-id'\n" "Value after change:\t'second-id'\n") error_host_system_name = ("Difference in 'host_system_name' field:\n" "Value before change:\t'first-host'\n" "Value after change:\t'second-host'\n") error_target_system_name = ("Difference in 'target_system_name' field:\n" "Value before change:\t'first-target'\n" "Value after change:\t'second-target'\n") error_target_cpu = ("Difference in 'target_cpu' field:\n" "Value before change:\t'first-cpu'\n" "Value after change:\t'second-cpu'\n") error_target_libc = ("Difference in 'target_libc' field:\n" "Value before change:\t'first-libc'\n" "Value after change:\t'second-libc'\n") error_compiler = ("Difference in 'compiler' field:\n" "Value before change:\t'first-compiler'\n" "Value after change:\t'second-compiler'\n") error_abi_version = ("Difference in 'abi_version' field:\n" "Value before change:\t'first-abi'\n" "Value after change:\t'second-abi'\n") error_abi_libc_version = ("Difference in 'abi_libc_version' field:\n" "Value before change:\t'first-abi-libc'\n" "Value after change:\t'second-abi-libc'\n") error_builtin_sysroot = ("Difference in 'builtin_sysroot' field:\n" "Value before change is set to 'sysroot'\n" "Value after change is not set\n") error_cc_target_os = ("Difference in 'cc_target_os' field:\n" "Value before change is not set\n" "Value after change is set to 'os'\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(error_toolchain_identifier, mock_stdout.getvalue()) self.assertIn(error_host_system_name, mock_stdout.getvalue()) self.assertIn(error_target_system_name, mock_stdout.getvalue()) self.assertIn(error_target_cpu, mock_stdout.getvalue()) self.assertIn(error_target_libc, mock_stdout.getvalue()) self.assertIn(error_compiler, mock_stdout.getvalue()) self.assertIn(error_abi_version, mock_stdout.getvalue()) self.assertIn(error_abi_libc_version, mock_stdout.getvalue()) self.assertIn(error_builtin_sysroot, mock_stdout.getvalue()) self.assertIn(error_cc_target_os, mock_stdout.getvalue()) def test_tool_path(self): first = make_toolchain(""" tool_path { name: "only_first" path: "/a/b/c" } tool_path { name: "paths_differ" path: "/path/first" } """) second = make_toolchain(""" tool_path { name: "paths_differ" path: 
"/path/second" } tool_path { name: "only_second_1" path: "/a/b/c" } tool_path { name: "only_second_2" path: "/a/b/c" } """) error_only_first = ("* List before change contains entries for the " "following tools that the list after the change " "doesn't:\n[only_first]\n") error_only_second = ("* List after change contains entries for the " "following tools that the list before the change " "doesn't:\n" "[\n" "\tonly_second_1\n" "\tonly_second_2\n" "]\n") error_paths_differ = ("* Path for tool 'paths_differ' differs before and " "after the change:\n" "Value before change:\t'/path/first'\n" "Value after change:\t'/path/second'\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(error_only_first, mock_stdout.getvalue()) self.assertIn(error_only_second, mock_stdout.getvalue()) self.assertIn(error_paths_differ, mock_stdout.getvalue()) def test_make_variable(self): first = make_toolchain(""" make_variable { name: "only_first" value: "val" } make_variable { name: "value_differs" value: "first_value" } """) second = make_toolchain(""" make_variable { name: "value_differs" value: "second_value" } make_variable { name: "only_second_1" value: "val" } make_variable { name: "only_second_2" value: "val" } """) error_only_first = ("* List before change contains entries for the " "following variables that the list after the " "change doesn't:\n[only_first]\n") error_only_second = ("* List after change contains entries for the " "following variables that the list before the " "change doesn't:\n" "[\n" "\tonly_second_1\n" "\tonly_second_2\n" "]\n") error_value_differs = ("* Value for variable 'value_differs' differs before" " and after the change:\n" "Value before change:\t'first_value'\n" "Value after change:\t'second_value'\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(error_only_first, mock_stdout.getvalue()) self.assertIn(error_only_second, mock_stdout.getvalue()) self.assertIn(error_value_differs, mock_stdout.getvalue()) def test_cxx_builtin_include_directories(self): first = make_toolchain(""" cxx_builtin_include_directory: "a/b/c" cxx_builtin_include_directory: "d/e/f" """) second = make_toolchain(""" cxx_builtin_include_directory: "d/e/f" cxx_builtin_include_directory: "a/b/c" """) expect_error = ("Difference in 'cxx_builtin_include_directory' field:\n" "List of elements before change:\n" "[\n" "\ta/b/c\n" "\td/e/f\n" "]\n" "List of elements after change:\n" "[\n" "\td/e/f\n" "\ta/b/c\n" "]\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(expect_error, mock_stdout.getvalue()) def test_artifact_name_pattern(self): first = make_toolchain(""" artifact_name_pattern { category_name: 'object_file' prefix: '' extension: '.obj1' } artifact_name_pattern { category_name: 'executable' prefix: 'first' extension: '.exe' } artifact_name_pattern { category_name: 'dynamic_library' prefix: '' extension: '.dll' } """) second = make_toolchain(""" artifact_name_pattern { category_name: 'object_file' prefix: '' extension: '.obj2' } artifact_name_pattern { category_name: 'static_library' prefix: '' extension: '.lib' } artifact_name_pattern { category_name: 'executable' prefix: 'second' extension: '.exe' } artifact_name_pattern { category_name: 'interface_library' prefix: '' extension: '.if.lib' } """) error_only_first = ("* List before change contains entries for the " "following categories that the list 
after the " "change doesn't:\n[dynamic_library]\n") error_only_second = ("* List after change contains entries for the " "following categories that the list before the " "change doesn't:\n" "[\n" "\tinterface_library\n" "\tstatic_library\n" "]\n") error_extension_differs = ("* Value for category 'object_file' differs " "before and after the change:\n" "Value before change:" "\tprefix:''" "\textension:'.obj1'\n" "Value after change:" "\tprefix:''" "\textension:'.obj2'\n") error_prefix_differs = ("* Value for category 'executable' differs " "before and after the change:\n" "Value before change:" "\tprefix:'first'" "\textension:'.exe'\n" "Value after change:" "\tprefix:'second'" "\textension:'.exe'\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(error_only_first, mock_stdout.getvalue()) self.assertIn(error_only_second, mock_stdout.getvalue()) self.assertIn(error_extension_differs, mock_stdout.getvalue()) self.assertIn(error_prefix_differs, mock_stdout.getvalue()) def test_features_not_ordered(self): first = make_toolchain(""" feature { name: 'feature1' } feature { name: 'feature2' } """) second = make_toolchain(""" feature { name: 'feature2' } feature { name: 'feature1' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("Features not in right order", mock_stdout.getvalue()) def test_features_missing(self): first = make_toolchain(""" feature { name: 'feature1' } """) second = make_toolchain(""" feature { name: 'feature2' } """) error_only_first = ("* List before change contains entries for the " "following features that the list after the " "change doesn't:\n[feature1]\n") error_only_second = ("* List after change contains entries for the " "following features that the list before the " "change doesn't:\n[feature2]\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(error_only_first, mock_stdout.getvalue()) self.assertIn(error_only_second, mock_stdout.getvalue()) def test_feature_enabled(self): first = make_toolchain(""" feature { name: 'feature' enabled: true } """) second = make_toolchain(""" feature { name: 'feature' enabled: false } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) def test_feature_provides(self): first = make_toolchain(""" feature { name: 'feature' provides: 'a' } """) second = make_toolchain(""" feature { name: 'feature' provides: 'b' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_feature_provides_preserves_order(self): first = make_toolchain(""" feature { name: 'feature' provides: 'a' provides: 'b' } """) second = make_toolchain(""" feature { name: 'feature' provides: 'b' provides: 'a' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_feature_implies(self): first = make_toolchain(""" feature { name: 'feature' implies: 'a' } """) second = make_toolchain(""" feature { name: 'feature' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): 
compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_feature_implies_preserves_order(self): first = make_toolchain(""" feature { name: 'feature' implies: 'a' implies: 'b' } """) second = make_toolchain(""" feature { name: 'feature' implies: 'b' implies: 'a' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_feature_requires_preserves_list_order(self): first = make_toolchain(""" feature { name: 'feature' requires: { feature: 'feature1' } requires: { feature: 'feature2' } } """) second = make_toolchain(""" feature { name: 'feature' requires: { feature: 'feature2' } requires: { feature: 'feature1' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_feature_requires_ignores_required_features_order(self): first = make_toolchain(""" feature { name: 'feature' requires: { feature: 'feature1' feature: 'feature2' } } """) second = make_toolchain(""" feature { name: 'feature' requires: { feature: 'feature2' feature: 'feature1' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_feature_requires_differs(self): first = make_toolchain(""" feature { name: 'feature' requires: { feature: 'feature1' } } """) second = make_toolchain(""" feature { name: 'feature' requires: { feature: 'feature2' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_action_config_ignores_requires(self): first = make_toolchain(""" action_config { config_name: 'config' requires: { feature: 'feature1' } } """) second = make_toolchain(""" action_config { config_name: 'config' requires: { feature: 'feature2' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_env_set_actions_differ(self): first = make_toolchain(""" feature { name: 'feature' env_set { action: 'a1' } } """) second = make_toolchain(""" feature { name: 'feature' env_set: { action: 'a1' action: 'a2' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_env_set_ignores_actions_order(self): first = make_toolchain(""" feature { name: 'feature' env_set { action: 'a2' action: 'a1' } } """) second = make_toolchain(""" feature { name: 'feature' env_set: { action: 'a1' action: 'a2' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_env_set_env_entries_not_ordered(self): first = make_toolchain(""" feature { name: 'feature' env_set { env_entry { key: 'k1' value: 'v1' } env_entry { key: 'k2' value: 'v2' } } } """) second = make_toolchain(""" feature { name: 'feature' env_set { env_entry { key: 'k2' value: 'v2' } env_entry { key: 'k1' value: 'v1' } } } """) mock_stdout = 
StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_env_set_env_entries_differ(self): first = make_toolchain(""" feature { name: 'feature' env_set { env_entry { key: 'k1' value: 'value_first' } } } """) second = make_toolchain(""" feature { name: 'feature' env_set { env_entry { key: 'k1' value: 'value_second' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_feature_preserves_env_set_order(self): first = make_toolchain(""" feature { name: 'feature' env_set { env_entry { key: 'first' value: 'first' } } env_set { env_entry { key: 'second' value: 'second' } } } """) second = make_toolchain(""" feature { name: 'feature' env_set { env_entry { key: 'second' value: 'second' } } env_set { env_entry { key: 'first' value: 'first' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after the change:", mock_stdout.getvalue()) def test_action_config_ignores_env_set(self): first = make_toolchain(""" action_config { config_name: 'config' env_set { env_entry { key: 'k1' value: 'value_first' } } } """) second = make_toolchain(""" action_config { config_name: 'config' env_set { env_entry { key: 'k1' value: 'value_second' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_env_set_ignores_with_feature_set_order(self): first = make_toolchain(""" feature { name: 'feature' env_set{ with_feature { feature: 'feature1' } with_feature { not_feature: 'feature2' } } } """) second = make_toolchain(""" feature { name: 'feature' env_set { with_feature { not_feature: 'feature2' } with_feature { feature: 'feature1' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_env_set_ignores_with_feature_set_lists_order(self): first = make_toolchain(""" feature { name: 'feature' env_set{ with_feature { feature: 'feature1' feature: 'feature2' not_feature: 'not_feature1' not_feature: 'not_feature2' } } } """) second = make_toolchain(""" feature { name: 'feature' env_set{ with_feature { feature: 'feature2' feature: 'feature1' not_feature: 'not_feature2' not_feature: 'not_feature1' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_flag_set_ignores_actions_order(self): first = make_toolchain(""" feature { name: 'feature' flag_set { action: 'a1' action: 'a2' } } """) second = make_toolchain(""" feature { name: 'feature' flag_set { action: 'a2' action: 'a1' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_action_config_flag_set_actions_ignored(self): first = make_toolchain(""" action_config { config_name: 'config' flag_set { action: 'a1' } } """) second = make_toolchain(""" action_config { config_name: 'config' flag_set { action: 'a2' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): 
compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_flag_set_ignores_with_feature_set_order(self): first = make_toolchain(""" feature { name: 'feature' flag_set { with_feature { feature: 'feature1' } with_feature { not_feature: 'feature2' } } } action_config { config_name: 'config' flag_set { with_feature { feature: 'feature1' } with_feature { not_feature: 'feature2' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set { with_feature { not_feature: 'feature2' } with_feature { feature: 'feature1' } } } action_config { config_name: 'config' flag_set { with_feature { not_feature: 'feature2' } with_feature { feature: 'feature1' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_flag_set_ignores_with_feature_set_lists_order(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ with_feature { feature: 'feature1' feature: 'feature2' not_feature: 'not_feature1' not_feature: 'not_feature2' } } } action_config { config_name: 'config' flag_set{ with_feature { feature: 'feature1' feature: 'feature2' not_feature: 'not_feature1' not_feature: 'not_feature2' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ with_feature { feature: 'feature2' feature: 'feature1' not_feature: 'not_feature2' not_feature: 'not_feature1' } } } action_config { config_name: 'config' flag_set{ with_feature { feature: 'feature2' feature: 'feature1' not_feature: 'not_feature2' not_feature: 'not_feature1' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_flag_set_preserves_flag_group_order(self): first = make_toolchain(""" feature { name: 'feature' flag_set { flag_group { flag: 'a' } flag_group { flag: 'b' } } } action_config { config_name: 'config' flag_set { flag_group { flag: 'a' } flag_group { flag: 'b' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set { flag_group { flag: 'b' } flag_group { flag: 'a' } } } action_config { config_name: 'config' flag_set { flag_group { flag: 'b' } flag_group { flag: 'a' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_preserves_flags_order(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { flag: 'flag1' flag: 'flag2' } } } action_config { config_name: 'config' flag_set{ flag_group { flag: 'flag1' flag: 'flag2' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { flag: 'flag2' flag: 'flag1' } } } action_config { config_name: 'config' flag_set{ flag_group { flag: 'flag2' flag: 'flag1' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_iterate_over_differs(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { iterate_over: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { iterate_over: 
'a' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { iterate_over: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { iterate_over: 'b' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_expand_if_true_differs(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_true: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_true: 'a' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_true: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_true: 'b' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_expand_if_false_differs(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_false: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_false: 'a' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_false: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_false: 'b' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_expand_if_all_available_differs(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_all_available: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_all_available: 'a' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_all_available: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_all_available: 'b' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_expand_if_none_available_differs(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_none_available: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_none_available: 'a' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_none_available: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_none_available: 'b' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_expand_if_all_available_ignores_order(self): first = 
make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_all_available: 'a' expand_if_all_available: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_all_available: 'a' expand_if_all_available: 'b' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_all_available: 'b' expand_if_all_available: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_all_available: 'b' expand_if_all_available: 'a' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_flag_group_expand_if_none_available_ignores_order(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_none_available: 'a' expand_if_none_available: 'b' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_none_available: 'a' expand_if_none_available: 'b' } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_none_available: 'b' expand_if_none_available: 'a' } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_none_available: 'b' expand_if_none_available: 'a' } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_flag_group_expand_if_equal_differs(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_equal { variable: 'first' value: 'val' } } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_equal { variable: 'first' value: 'val' } } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { expand_if_equal { variable: 'second' value: 'val' } } } } action_config { config_name: 'config' flag_set{ flag_group { expand_if_equal { variable: 'second' value: 'val' } } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_flag_group_flag_groups_differ(self): first = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { flag_group { flag: 'a' flag: 'b' } } } } action_config { config_name: 'config' flag_set{ flag_group { flag_group { flag: 'a' flag: 'b' } } } } """) second = make_toolchain(""" feature { name: 'feature' flag_set{ flag_group { flag_group { flag: 'b' flag: 'a' } } } } action_config { config_name: 'config' flag_set{ flag_group { flag_group { flag: 'b' flag: 'a' } } } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Feature 'feature' differs before and after", mock_stdout.getvalue()) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_action_configs_not_ordered(self): first = make_toolchain(""" action_config { config_name: 'action1' } action_config { config_name: 'action2' } """) second = make_toolchain(""" action_config { config_name: 'action2' } action_config { config_name: 'action1' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("Action configs not in right order", mock_stdout.getvalue()) def 
test_action_configs_missing(self): first = make_toolchain(""" action_config { config_name: 'action1' } """) second = make_toolchain(""" action_config { config_name: 'action2' } """) error_only_first = ("* List before change contains entries for the " "following action_configs that the list after the " "change doesn't:\n[action1]\n") error_only_second = ("* List after change contains entries for the " "following action_configs that the list before the " "change doesn't:\n[action2]\n") mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn(error_only_first, mock_stdout.getvalue()) self.assertIn(error_only_second, mock_stdout.getvalue()) def test_action_config_enabled(self): first = make_toolchain(""" action_config { config_name: 'config' enabled: true } """) second = make_toolchain(""" action_config { config_name: 'config' enabled: false } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_action_config_action_name(self): first = make_toolchain(""" action_config { config_name: 'config' action_name: 'config1' } """) second = make_toolchain(""" action_config { config_name: 'config' action_name: 'config2' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_action_config_tool_tool_path_differs(self): first = make_toolchain(""" action_config { config_name: 'config' tool { tool_path: 'path1' } } """) second = make_toolchain(""" action_config { config_name: 'config' tool { tool_path: 'path2' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_action_config_tool_execution_requirements_differ(self): first = make_toolchain(""" action_config { config_name: 'config' tool { execution_requirement: 'a' } } """) second = make_toolchain(""" action_config { config_name: 'config' tool { execution_requirement: 'b' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_action_config_tool_execution_requirements_ignores_order(self): first = make_toolchain(""" action_config { config_name: 'config' tool { execution_requirement: 'a' execution_requirement: 'b' } } """) second = make_toolchain(""" action_config { config_name: 'config' tool { execution_requirement: 'b' execution_requirement: 'a' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_action_config_implies_differs(self): first = make_toolchain(""" action_config { config_name: 'config' implies: 'a' } """) second = make_toolchain(""" action_config { config_name: 'config' implies: 'b' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_action_config_implies_preserves_order(self): first = make_toolchain(""" action_config { config_name: 'config' implies: 'a' implies: 'b' } """) second = 
make_toolchain(""" action_config { config_name: 'config' implies: 'b' implies: 'a' } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("* Action config 'config' differs before and after", mock_stdout.getvalue()) def test_unused_tool_path(self): first = make_toolchain(""" tool_path { name: "empty" path: "" } """) second = make_toolchain(""" tool_path { name: "empty" path: "NOT_USED" } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) def test_unused_tool_path_in_tool(self): first = make_toolchain(""" action_config { config_name: 'config' tool { tool_path: '' } } """) second = make_toolchain(""" action_config { config_name: 'config' tool { tool_path: 'NOT_USED' } } """) mock_stdout = StringIO() with mock.patch("sys.stdout", mock_stdout): compare_ctoolchains(first, second) self.assertIn("No difference", mock_stdout.getvalue()) if __name__ == "__main__": unittest.main() 07070100000034000081A4000003E800000064000000015D359B4200000700000000000000000000000000000000000000003F00000000bazel-rules-cc-20190722/tools/migration/ctoolchain_compare.bzl"""A test rule that compares two CToolchains in proto format.""" def _impl(ctx): toolchain_config_proto = ctx.actions.declare_file(ctx.label.name + "_toolchain_config.proto") ctx.actions.write( toolchain_config_proto, ctx.attr.toolchain_config[CcToolchainConfigInfo].proto, ) script = ("%s --before='%s' --after='%s' --toolchain_identifier='%s'" % ( ctx.executable._comparator.short_path, ctx.file.crosstool.short_path, toolchain_config_proto.short_path, ctx.attr.toolchain_identifier, )) test_executable = ctx.actions.declare_file(ctx.label.name) ctx.actions.write(test_executable, script, is_executable = True) runfiles = ctx.runfiles(files = [toolchain_config_proto, ctx.file.crosstool]) runfiles = runfiles.merge(ctx.attr._comparator[DefaultInfo].default_runfiles) return DefaultInfo(runfiles = runfiles, executable = test_executable) cc_toolchains_compare_test = rule( implementation = _impl, attrs = { "crosstool": attr.label( mandatory = True, allow_single_file = True, doc = "Location of the CROSSTOOL file", ), "toolchain_config": attr.label( mandatory = True, providers = [CcToolchainConfigInfo], doc = ("Starlark rule that replaces the CROSSTOOL file functionality " + "for the CToolchain with the given identifier"), ), "toolchain_identifier": attr.string( mandatory = True, doc = "identifier of the CToolchain that is being compared", ), "_comparator": attr.label( default = ":ctoolchain_comparator", executable = True, cfg = "host", ), }, test = True, ) 07070100000035000081A4000003E800000064000000015D359B42000050E3000000000000000000000000000000000000004700000000bazel-rules-cc-20190722/tools/migration/legacy_fields_migration_lib.py"""Module providing migrate_legacy_fields function. migrate_legacy_fields takes parsed CROSSTOOL proto and migrates it (inplace) to use only the features. Tracking issue: https://github.com/bazelbuild/bazel/issues/5187 Since C++ rules team is working on migrating CROSSTOOL from text proto into Starlark, we advise CROSSTOOL owners to wait for the CROSSTOOL -> Starlark migrator before they invest too much time into fixing their pipeline. Tracking issue for the Starlark effort is https://github.com/bazelbuild/bazel/issues/5380. 
""" from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2 ALL_CC_COMPILE_ACTIONS = [ "assemble", "preprocess-assemble", "linkstamp-compile", "c-compile", "c++-compile", "c++-header-parsing", "c++-module-compile", "c++-module-codegen", "lto-backend", "clif-match" ] ALL_OBJC_COMPILE_ACTIONS = [ "objc-compile", "objc++-compile" ] ALL_CXX_COMPILE_ACTIONS = [ action for action in ALL_CC_COMPILE_ACTIONS if action not in ["c-compile", "preprocess-assemble", "assemble"] ] ALL_CC_LINK_ACTIONS = [ "c++-link-executable", "c++-link-dynamic-library", "c++-link-nodeps-dynamic-library" ] ALL_OBJC_LINK_ACTIONS = [ "objc-executable", "objc++-executable", ] DYNAMIC_LIBRARY_LINK_ACTIONS = [ "c++-link-dynamic-library", "c++-link-nodeps-dynamic-library" ] NODEPS_DYNAMIC_LIBRARY_LINK_ACTIONS = ["c++-link-nodeps-dynamic-library"] TRANSITIVE_DYNAMIC_LIBRARY_LINK_ACTIONS = ["c++-link-dynamic-library"] TRANSITIVE_LINK_ACTIONS = ["c++-link-executable", "c++-link-dynamic-library"] CC_LINK_EXECUTABLE = ["c++-link-executable"] def compile_actions(toolchain): """Returns compile actions for cc or objc rules.""" if _is_objc_toolchain(toolchain): return ALL_CC_COMPILE_ACTIONS + ALL_OBJC_COMPILE_ACTIONS else: return ALL_CC_COMPILE_ACTIONS def link_actions(toolchain): """Returns link actions for cc or objc rules.""" if _is_objc_toolchain(toolchain): return ALL_CC_LINK_ACTIONS + ALL_OBJC_LINK_ACTIONS else: return ALL_CC_LINK_ACTIONS def executable_link_actions(toolchain): """Returns transitive link actions for cc or objc rules.""" if _is_objc_toolchain(toolchain): return CC_LINK_EXECUTABLE + ALL_OBJC_LINK_ACTIONS else: return CC_LINK_EXECUTABLE def _is_objc_toolchain(toolchain): return any(ac.action_name == "objc-compile" for ac in toolchain.action_config) # Map converting from LinkingMode to corresponding feature name LINKING_MODE_TO_FEATURE_NAME = { "FULLY_STATIC": "fully_static_link", "MOSTLY_STATIC": "static_linking_mode", "DYNAMIC": "dynamic_linking_mode", "MOSTLY_STATIC_LIBRARIES": "static_linking_mode_nodeps_library", } def migrate_legacy_fields(crosstool): """Migrates parsed crosstool (inplace) to not use legacy fields.""" crosstool.ClearField("default_toolchain") for toolchain in crosstool.toolchain: _ = [_migrate_expand_if_all_available(f) for f in toolchain.feature] _ = [_migrate_expand_if_all_available(ac) for ac in toolchain.action_config] _ = [_migrate_repeated_expands(f) for f in toolchain.feature] _ = [_migrate_repeated_expands(ac) for ac in toolchain.action_config] if (toolchain.dynamic_library_linker_flag or _contains_dynamic_flags(toolchain)) and not _get_feature( toolchain, "supports_dynamic_linker"): feature = toolchain.feature.add() feature.name = "supports_dynamic_linker" feature.enabled = True if toolchain.supports_start_end_lib and not _get_feature( toolchain, "supports_start_end_lib"): feature = toolchain.feature.add() feature.name = "supports_start_end_lib" feature.enabled = True if toolchain.supports_interface_shared_objects and not _get_feature( toolchain, "supports_interface_shared_libraries"): feature = toolchain.feature.add() feature.name = "supports_interface_shared_libraries" feature.enabled = True if toolchain.supports_embedded_runtimes and not _get_feature( toolchain, "static_link_cpp_runtimes"): feature = toolchain.feature.add() feature.name = "static_link_cpp_runtimes" feature.enabled = True if toolchain.needsPic and not _get_feature(toolchain, "supports_pic"): feature = toolchain.feature.add() feature.name = "supports_pic" feature.enabled = True if 
toolchain.supports_fission and not _get_feature( toolchain, "per_object_debug_info"): # feature { # name: "per_object_debug_info" # enabled: true # flag_set { # action: "assemble" # action: "preprocess-assemble" # action: "c-compile" # action: "c++-compile" # action: "c++-module-codegen" # action: "lto-backend" # flag_group { # expand_if_all_available: 'is_using_fission'", # flag: "-gsplit-dwarf" # } # } # } feature = toolchain.feature.add() feature.name = "per_object_debug_info" feature.enabled = True flag_set = feature.flag_set.add() flag_set.action[:] = [ "c-compile", "c++-compile", "c++-module-codegen", "assemble", "preprocess-assemble", "lto-backend" ] flag_group = flag_set.flag_group.add() flag_group.expand_if_all_available[:] = ["is_using_fission"] flag_group.flag[:] = ["-gsplit-dwarf"] if toolchain.objcopy_embed_flag and not _get_feature( toolchain, "objcopy_embed_flags"): feature = toolchain.feature.add() feature.name = "objcopy_embed_flags" feature.enabled = True flag_set = feature.flag_set.add() flag_set.action[:] = ["objcopy_embed_data"] flag_group = flag_set.flag_group.add() flag_group.flag[:] = toolchain.objcopy_embed_flag action_config = toolchain.action_config.add() action_config.action_name = "objcopy_embed_data" action_config.config_name = "objcopy_embed_data" action_config.enabled = True tool = action_config.tool.add() tool.tool_path = _find_tool_path(toolchain, "objcopy") if toolchain.ld_embed_flag and not _get_feature( toolchain, "ld_embed_flags"): feature = toolchain.feature.add() feature.name = "ld_embed_flags" feature.enabled = True flag_set = feature.flag_set.add() flag_set.action[:] = ["ld_embed_data"] flag_group = flag_set.flag_group.add() flag_group.flag[:] = toolchain.ld_embed_flag action_config = toolchain.action_config.add() action_config.action_name = "ld_embed_data" action_config.config_name = "ld_embed_data" action_config.enabled = True tool = action_config.tool.add() tool.tool_path = _find_tool_path(toolchain, "ld") # Create default_link_flags feature for linker_flag flag_sets = _extract_legacy_link_flag_sets_for(toolchain) if flag_sets: if _get_feature(toolchain, "default_link_flags"): continue if _get_feature(toolchain, "legacy_link_flags"): for f in toolchain.feature: if f.name == "legacy_link_flags": f.ClearField("flag_set") feature = f _rename_feature_in_toolchain(toolchain, "legacy_link_flags", "default_link_flags") break else: feature = _prepend_feature(toolchain) feature.name = "default_link_flags" feature.enabled = True _add_flag_sets(feature, flag_sets) # Create default_compile_flags feature for compiler_flag, cxx_flag flag_sets = _extract_legacy_compile_flag_sets_for(toolchain) if flag_sets and not _get_feature(toolchain, "default_compile_flags"): if _get_feature(toolchain, "legacy_compile_flags"): for f in toolchain.feature: if f.name == "legacy_compile_flags": f.ClearField("flag_set") feature = f _rename_feature_in_toolchain(toolchain, "legacy_compile_flags", "default_compile_flags") break else: feature = _prepend_feature(toolchain) feature.enabled = True feature.name = "default_compile_flags" _add_flag_sets(feature, flag_sets) # Unfiltered cxx flags have to have their own special feature. # "unfiltered_compile_flags" is a well-known (by Bazel) feature name that is # excluded from nocopts filtering. 
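    # For illustration (the flag value is only an example): a legacy entry
    #   unfiltered_cxx_flag: "-no-canonical-prefixes"
    # is rewritten below, when no unfiltered_compile_flags feature exists yet,
    # into roughly
    #   feature {
    #     name: "unfiltered_compile_flags"
    #     enabled: true
    #     flag_set {
    #       action: ...  # all compile actions of this toolchain
    #       flag_group { flag: "-no-canonical-prefixes" }
    #     }
    #   }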
    if toolchain.unfiltered_cxx_flag:
      # If there already is a feature named unfiltered_compile_flags, the
      # crosstool is already migrated for unfiltered_compile_flags
      if _get_feature(toolchain, "unfiltered_compile_flags"):
        for f in toolchain.feature:
          if f.name == "unfiltered_compile_flags":
            for flag_set in f.flag_set:
              for flag_group in flag_set.flag_group:
                if flag_group.iterate_over == "unfiltered_compile_flags":
                  flag_group.ClearField("iterate_over")
                  flag_group.ClearField("expand_if_all_available")
                  flag_group.ClearField("flag")
                  flag_group.flag[:] = toolchain.unfiltered_cxx_flag
      else:
        if not _get_feature(toolchain, "user_compile_flags"):
          feature = toolchain.feature.add()
          feature.name = "user_compile_flags"
          feature.enabled = True
          flag_set = feature.flag_set.add()
          flag_set.action[:] = compile_actions(toolchain)
          flag_group = flag_set.flag_group.add()
          flag_group.expand_if_all_available[:] = ["user_compile_flags"]
          flag_group.iterate_over = "user_compile_flags"
          flag_group.flag[:] = ["%{user_compile_flags}"]

        if not _get_feature(toolchain, "sysroot"):
          sysroot_actions = compile_actions(toolchain) + link_actions(toolchain)
          sysroot_actions.remove("assemble")
          feature = toolchain.feature.add()
          feature.name = "sysroot"
          feature.enabled = True
          flag_set = feature.flag_set.add()
          flag_set.action[:] = sysroot_actions
          flag_group = flag_set.flag_group.add()
          flag_group.expand_if_all_available[:] = ["sysroot"]
          flag_group.flag[:] = ["--sysroot=%{sysroot}"]

        feature = toolchain.feature.add()
        feature.name = "unfiltered_compile_flags"
        feature.enabled = True
        flag_set = feature.flag_set.add()
        flag_set.action[:] = compile_actions(toolchain)
        flag_group = flag_set.flag_group.add()
        flag_group.flag[:] = toolchain.unfiltered_cxx_flag

    # clear fields
    toolchain.ClearField("debian_extra_requires")
    toolchain.ClearField("gcc_plugin_compiler_flag")
    toolchain.ClearField("ar_flag")
    toolchain.ClearField("ar_thin_archives_flag")
    toolchain.ClearField("gcc_plugin_header_directory")
    toolchain.ClearField("mao_plugin_header_directory")
    toolchain.ClearField("supports_normalizing_ar")
    toolchain.ClearField("supports_thin_archives")
    toolchain.ClearField("supports_incremental_linker")
    toolchain.ClearField("supports_dsym")
    toolchain.ClearField("supports_gold_linker")
    toolchain.ClearField("default_python_top")
    toolchain.ClearField("default_python_version")
    toolchain.ClearField("python_preload_swigdeps")
    toolchain.ClearField("needsPic")
    toolchain.ClearField("compilation_mode_flags")
    toolchain.ClearField("linking_mode_flags")
    toolchain.ClearField("unfiltered_cxx_flag")
    toolchain.ClearField("ld_embed_flag")
    toolchain.ClearField("objcopy_embed_flag")
    toolchain.ClearField("supports_start_end_lib")
    toolchain.ClearField("supports_interface_shared_objects")
    toolchain.ClearField("supports_fission")
    toolchain.ClearField("supports_embedded_runtimes")
    toolchain.ClearField("compiler_flag")
    toolchain.ClearField("cxx_flag")
    toolchain.ClearField("linker_flag")
    toolchain.ClearField("dynamic_library_linker_flag")
    toolchain.ClearField("static_runtimes_filegroup")
    toolchain.ClearField("dynamic_runtimes_filegroup")

    # Enable features that were previously enabled by Bazel
    default_features = [
        "dependency_file", "random_seed", "module_maps", "module_map_home_cwd",
        "header_module_compile", "include_paths", "pic", "preprocessor_define"
    ]
    for feature_name in default_features:
      feature = _get_feature(toolchain, feature_name)
      if feature:
        feature.enabled = True


def _find_tool_path(toolchain, tool_name):
  """Returns the tool path of the tool with the given name."""
  for tool in toolchain.tool_path:
    if tool.name == tool_name:
      return tool.path
  return None


def _add_flag_sets(feature, flag_sets):
  """Add flag sets into a feature."""
  for flag_set in flag_sets:
    with_feature = flag_set[0]
    actions = flag_set[1]
    flags = flag_set[2]
    expand_if_all_available = flag_set[3]
    not_feature = None
    if len(flag_set) >= 5:
      not_feature = flag_set[4]
    flag_set = feature.flag_set.add()
    if with_feature is not None:
      flag_set.with_feature.add().feature[:] = [with_feature]
    if not_feature is not None:
      flag_set.with_feature.add().not_feature[:] = [not_feature]
    flag_set.action[:] = actions
    flag_group = flag_set.flag_group.add()
    flag_group.expand_if_all_available[:] = expand_if_all_available
    flag_group.flag[:] = flags
  return feature


def _extract_legacy_compile_flag_sets_for(toolchain):
  """Get flag sets for default_compile_flags feature."""
  result = []
  if toolchain.compiler_flag:
    result.append(
        [None, compile_actions(toolchain), toolchain.compiler_flag, []])

  # Migrate compiler_flag from compilation_mode_flags
  for cmf in toolchain.compilation_mode_flags:
    mode = crosstool_config_pb2.CompilationMode.Name(cmf.mode).lower()
    # coverage mode has been a no-op for a while
    if mode == "coverage":
      continue
    if (cmf.compiler_flag or
        cmf.cxx_flag) and not _get_feature(toolchain, mode):
      feature = toolchain.feature.add()
      feature.name = mode
    if cmf.compiler_flag:
      result.append([mode, compile_actions(toolchain), cmf.compiler_flag, []])

  if toolchain.cxx_flag:
    result.append([None, ALL_CXX_COMPILE_ACTIONS, toolchain.cxx_flag, []])

  # Migrate compiler_flag/cxx_flag from compilation_mode_flags
  for cmf in toolchain.compilation_mode_flags:
    mode = crosstool_config_pb2.CompilationMode.Name(cmf.mode).lower()
    # coverage mode has been a no-op for a while
    if mode == "coverage":
      continue
    if cmf.cxx_flag:
      result.append([mode, ALL_CXX_COMPILE_ACTIONS, cmf.cxx_flag, []])

  return result


def _extract_legacy_link_flag_sets_for(toolchain):
  """Get flag sets for default_link_flags feature."""
  result = []

  # Migrate linker_flag
  if toolchain.linker_flag:
    result.append([None, link_actions(toolchain), toolchain.linker_flag, []])

  # Migrate linker_flags from compilation_mode_flags
  for cmf in toolchain.compilation_mode_flags:
    mode = crosstool_config_pb2.CompilationMode.Name(cmf.mode).lower()
    # coverage mode has been a no-op for a while
    if mode == "coverage":
      continue
    if cmf.linker_flag and not _get_feature(toolchain, mode):
      feature = toolchain.feature.add()
      feature.name = mode
    if cmf.linker_flag:
      result.append([mode, link_actions(toolchain), cmf.linker_flag, []])

  # Migrate linker_flags from linking_mode_flags
  for lmf in toolchain.linking_mode_flags:
    mode = crosstool_config_pb2.LinkingMode.Name(lmf.mode)
    feature_name = LINKING_MODE_TO_FEATURE_NAME.get(mode)
    # If the feature is already there, we don't migrate; lmf is not used.
    if _get_feature(toolchain, feature_name):
      continue
    if lmf.linker_flag:
      feature = toolchain.feature.add()
      feature.name = feature_name
      if mode == "DYNAMIC":
        result.append(
            [None, NODEPS_DYNAMIC_LIBRARY_LINK_ACTIONS, lmf.linker_flag, []])
        result.append([
            None,
            TRANSITIVE_DYNAMIC_LIBRARY_LINK_ACTIONS,
            lmf.linker_flag,
            [],
            "static_link_cpp_runtimes",
        ])
        result.append([
            feature_name,
            executable_link_actions(toolchain), lmf.linker_flag, []
        ])
      elif mode == "MOSTLY_STATIC":
        result.append(
            [feature_name, CC_LINK_EXECUTABLE, lmf.linker_flag, []])
      else:
        result.append(
            [feature_name, link_actions(toolchain), lmf.linker_flag, []])

  if toolchain.dynamic_library_linker_flag:
    result.append([
        None, DYNAMIC_LIBRARY_LINK_ACTIONS,
        toolchain.dynamic_library_linker_flag, []
    ])

  if toolchain.test_only_linker_flag:
    result.append([
        None,
        link_actions(toolchain), toolchain.test_only_linker_flag,
        ["is_cc_test"]
    ])

  return result


def _prepend_feature(toolchain):
  """Create a new feature and make it the first one in the toolchain."""
  features = toolchain.feature
  toolchain.ClearField("feature")
  new_feature = toolchain.feature.add()
  toolchain.feature.extend(features)
  return new_feature


def _get_feature(toolchain, name):
  """Returns feature with a given name or None."""
  for feature in toolchain.feature:
    if feature.name == name:
      return feature
  return None


def _migrate_expand_if_all_available(message):
  """Move expand_if_all_available field to flag_groups."""
  for flag_set in message.flag_set:
    if flag_set.expand_if_all_available:
      for flag_group in flag_set.flag_group:
        new_vars = (
            flag_group.expand_if_all_available[:] +
            flag_set.expand_if_all_available[:])
        flag_group.expand_if_all_available[:] = new_vars
      flag_set.ClearField("expand_if_all_available")


def _migrate_repeated_expands(message):
  """Replace repeated legacy fields with nesting."""
  todo_queue = []
  for flag_set in message.flag_set:
    todo_queue.extend(flag_set.flag_group)
  while todo_queue:
    flag_group = todo_queue.pop()
    todo_queue.extend(flag_group.flag_group)
    if len(flag_group.expand_if_all_available) <= 1 and len(
        flag_group.expand_if_none_available) <= 1:
      continue

    current_children = flag_group.flag_group
    current_flags = flag_group.flag
    flag_group.ClearField("flag_group")
    flag_group.ClearField("flag")

    new_flag_group = flag_group.flag_group.add()
    new_flag_group.flag_group.extend(current_children)
    new_flag_group.flag.extend(current_flags)

    if len(flag_group.expand_if_all_available) > 1:
      expands_to_move = flag_group.expand_if_all_available[1:]
      flag_group.expand_if_all_available[:] = [
          flag_group.expand_if_all_available[0]
      ]
      new_flag_group.expand_if_all_available.extend(expands_to_move)

    if len(flag_group.expand_if_none_available) > 1:
      expands_to_move = flag_group.expand_if_none_available[1:]
      flag_group.expand_if_none_available[:] = [
          flag_group.expand_if_none_available[0]
      ]
      new_flag_group.expand_if_none_available.extend(expands_to_move)

    todo_queue.append(new_flag_group)
    todo_queue.append(flag_group)


def _contains_dynamic_flags(toolchain):
  for lmf in toolchain.linking_mode_flags:
    mode = crosstool_config_pb2.LinkingMode.Name(lmf.mode)
    if mode == "DYNAMIC":
      return True
  return False


def _rename_feature_in_toolchain(toolchain, from_name, to_name):
  for f in toolchain.feature:
    _rename_feature_in(f, from_name, to_name)
  for a in toolchain.action_config:
    _rename_feature_in(a, from_name, to_name)


def _rename_feature_in(msg, from_name, to_name):
  if from_name in msg.implies:
    msg.implies.remove(from_name)
  for requires in msg.requires:
    if from_name in requires.feature:
      requires.feature.remove(from_name)
      requires.feature.extend([to_name])
  for flag_set in msg.flag_set:
    for with_feature in flag_set.with_feature:
      if from_name in with_feature.feature:
        with_feature.feature.remove(from_name)
        with_feature.feature.extend([to_name])
      if from_name in with_feature.not_feature:
        with_feature.not_feature.remove(from_name)
        with_feature.not_feature.extend([to_name])
  for env_set in msg.env_set:
    for with_feature in env_set.with_feature:
      if from_name in with_feature.feature:
        with_feature.feature.remove(from_name)
        with_feature.feature.extend([to_name])
      if from_name in with_feature.not_feature:
        with_feature.not_feature.remove(from_name)
        with_feature.not_feature.extend([to_name])
07070100000036000081A4000003E800000064000000015D359B420000C822000000000000000000000000000000000000004C00000000bazel-rules-cc-20190722/tools/migration/legacy_fields_migration_lib_test.pyimport unittest from google.protobuf import text_format from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2 from tools.migration.legacy_fields_migration_lib import ALL_CC_COMPILE_ACTIONS from tools.migration.legacy_fields_migration_lib import ALL_OBJC_COMPILE_ACTIONS from tools.migration.legacy_fields_migration_lib import ALL_CXX_COMPILE_ACTIONS from tools.migration.legacy_fields_migration_lib import ALL_CC_LINK_ACTIONS from tools.migration.legacy_fields_migration_lib import ALL_OBJC_LINK_ACTIONS from tools.migration.legacy_fields_migration_lib import DYNAMIC_LIBRARY_LINK_ACTIONS from tools.migration.legacy_fields_migration_lib import NODEPS_DYNAMIC_LIBRARY_LINK_ACTIONS from tools.migration.legacy_fields_migration_lib import TRANSITIVE_LINK_ACTIONS from tools.migration.legacy_fields_migration_lib import TRANSITIVE_DYNAMIC_LIBRARY_LINK_ACTIONS from tools.migration.legacy_fields_migration_lib import CC_LINK_EXECUTABLE from tools.migration.legacy_fields_migration_lib import migrate_legacy_fields def assert_has_feature(self, toolchain, name): self.assertTrue(any(feature.name == name for feature in toolchain.feature)) def make_crosstool(string): crosstool = crosstool_config_pb2.CrosstoolRelease() text_format.Merge("major_version: '123' minor_version: '456'", crosstool) toolchain = crosstool.toolchain.add() text_format.Merge(string, toolchain) return crosstool def migrate_to_string(crosstool): migrate_legacy_fields(crosstool) return to_string(crosstool) def to_string(crosstool): return text_format.MessageToString(crosstool) class LegacyFieldsMigrationLibTest(unittest.TestCase): def test_deletes_fields(self): crosstool = make_crosstool(""" debian_extra_requires: 'debian-1' gcc_plugin_compiler_flag: 'gcc_plugin_compiler_flag-1' ar_flag: 'ar_flag-1' ar_thin_archives_flag: 'ar_thin_archives_flag-1' gcc_plugin_header_directory: 'gcc_plugin_header_directory-1' mao_plugin_header_directory: 'mao_plugin_header_directory-1' default_python_top: 'default_python_top-1' default_python_version: 'default_python_version-1' python_preload_swigdeps: false supports_normalizing_ar: false supports_thin_archives: false supports_incremental_linker: false supports_dsym: false supports_gold_linker: false needsPic: false supports_start_end_lib: false supports_interface_shared_objects: false supports_fission: false supports_embedded_runtimes: false static_runtimes_filegroup: 'yolo' dynamic_runtimes_filegroup: 'yolo' """) output = migrate_to_string(crosstool) self.assertNotIn("debian_extra_requires", output) self.assertNotIn("gcc_plugin_compiler_flag", output) self.assertNotIn("ar_flag", output) self.assertNotIn("ar_thin_archives_flag", output) self.assertNotIn("gcc_plugin_header_directory", output) self.assertNotIn("mao_plugin_header_directory", output) self.assertNotIn("supports_normalizing_ar", output) self.assertNotIn("supports_thin_archives", output) self.assertNotIn("supports_incremental_linker", output) self.assertNotIn("supports_dsym", output) self.assertNotIn("default_python_top", output) self.assertNotIn("default_python_version", output) self.assertNotIn("python_preload_swigdeps", output) self.assertNotIn("supports_gold_linker", output) self.assertNotIn("needsPic", output) self.assertNotIn("supports_start_end_lib", output) self.assertNotIn("supports_interface_shared_objects", output) 
self.assertNotIn("supports_fission", output) self.assertNotIn("supports_embedded_runtimes", output) self.assertNotIn("static_runtimes_filegroup", output) self.assertNotIn("dynamic_runtimes_filegroup", output) def test_deletes_default_toolchains(self): crosstool = make_crosstool("") crosstool.default_toolchain.add() self.assertEqual(len(crosstool.default_toolchain), 1) migrate_legacy_fields(crosstool) self.assertEqual(len(crosstool.default_toolchain), 0) def test_replace_legacy_compile_flags(self): crosstool = make_crosstool(""" feature { name: 'foo' } feature { name: 'legacy_compile_flags' } compiler_flag: 'clang-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.compiler_flag), 0) self.assertEqual(output.feature[0].name, "foo") self.assertEqual(output.feature[1].name, "default_compile_flags") self.assertEqual(output.feature[1].flag_set[0].action, ALL_CC_COMPILE_ACTIONS) self.assertEqual(output.feature[1].flag_set[0].flag_group[0].flag, ["clang-flag-1"]) def test_replace_legacy_compile_flags_in_action_configs(self): crosstool = make_crosstool(""" feature { name: 'foo' implies: 'legacy_compile_flags' requires: { feature: 'legacy_compile_flags' } flag_set { with_feature { feature: 'legacy_compile_flags' } with_feature { not_feature: 'legacy_compile_flags' } } env_set { with_feature { feature: 'legacy_compile_flags' } with_feature { not_feature: 'legacy_compile_flags' } } } feature { name: 'legacy_compile_flags' } action_config { action_name: 'foo' config_name: 'foo' implies: 'legacy_compile_flags' requires: { feature: 'legacy_compile_flags' } flag_set { with_feature { feature: 'legacy_compile_flags' } with_feature { not_feature: 'legacy_compile_flags' } } env_set { with_feature { feature: 'legacy_compile_flags' } with_feature { not_feature: 'legacy_compile_flags' } } } compiler_flag: 'clang-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.action_config[0].action_name, "foo") self.assertEqual(output.action_config[0].implies, []) self.assertEqual(output.action_config[0].requires[0].feature, ["default_compile_flags"]) self.assertEqual( output.action_config[0].flag_set[0].with_feature[0].feature, ["default_compile_flags"]) self.assertEqual( output.action_config[0].flag_set[0].with_feature[1].not_feature, ["default_compile_flags"]) self.assertEqual(output.action_config[0].env_set[0].with_feature[0].feature, ["default_compile_flags"]) self.assertEqual( output.action_config[0].env_set[0].with_feature[1].not_feature, ["default_compile_flags"]) self.assertEqual(output.feature[0].name, "foo") self.assertEqual(output.feature[0].implies, []) self.assertEqual(output.feature[0].requires[0].feature, ["default_compile_flags"]) self.assertEqual(output.feature[0].flag_set[0].with_feature[0].feature, ["default_compile_flags"]) self.assertEqual(output.feature[0].flag_set[0].with_feature[1].not_feature, ["default_compile_flags"]) self.assertEqual(output.feature[0].env_set[0].with_feature[0].feature, ["default_compile_flags"]) self.assertEqual(output.feature[0].env_set[0].with_feature[1].not_feature, ["default_compile_flags"]) def test_replace_legacy_link_flags(self): crosstool = make_crosstool(""" feature { name: 'foo' } feature { name: 'legacy_link_flags' } linker_flag: 'ld-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.compiler_flag), 0) self.assertEqual(output.feature[0].name, "foo") self.assertEqual(output.feature[1].name, "default_link_flags") 
self.assertEqual(output.feature[1].flag_set[0].action, ALL_CC_LINK_ACTIONS) self.assertEqual(output.feature[1].flag_set[0].flag_group[0].flag, ["ld-flag-1"]) def test_replace_legacy_link_flags_in_action_configs(self): crosstool = make_crosstool(""" feature { name: 'foo' implies: 'legacy_link_flags' requires: { feature: 'legacy_link_flags' } flag_set { with_feature { feature: 'legacy_link_flags' } with_feature { not_feature: 'legacy_link_flags' } } env_set { with_feature { feature: 'legacy_link_flags' } with_feature { not_feature: 'legacy_link_flags' } } } feature { name: 'legacy_link_flags' } action_config { action_name: 'foo' config_name: 'foo' implies: 'legacy_link_flags' requires: { feature: 'legacy_link_flags' } flag_set { with_feature { feature: 'legacy_link_flags' } with_feature { not_feature: 'legacy_link_flags' } } env_set { with_feature { feature: 'legacy_link_flags' } with_feature { not_feature: 'legacy_link_flags' } } } linker_flag: 'clang-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.action_config[0].action_name, "foo") self.assertEqual(output.action_config[0].implies, []) self.assertEqual(output.action_config[0].requires[0].feature, ["default_link_flags"]) self.assertEqual( output.action_config[0].flag_set[0].with_feature[0].feature, ["default_link_flags"]) self.assertEqual( output.action_config[0].flag_set[0].with_feature[1].not_feature, ["default_link_flags"]) self.assertEqual(output.action_config[0].env_set[0].with_feature[0].feature, ["default_link_flags"]) self.assertEqual( output.action_config[0].env_set[0].with_feature[1].not_feature, ["default_link_flags"]) self.assertEqual(output.feature[0].name, "foo") self.assertEqual(output.feature[0].implies, []) self.assertEqual(output.feature[0].requires[0].feature, ["default_link_flags"]) self.assertEqual(output.feature[0].flag_set[0].with_feature[0].feature, ["default_link_flags"]) self.assertEqual(output.feature[0].flag_set[0].with_feature[1].not_feature, ["default_link_flags"]) self.assertEqual(output.feature[0].env_set[0].with_feature[0].feature, ["default_link_flags"]) self.assertEqual(output.feature[0].env_set[0].with_feature[1].not_feature, ["default_link_flags"]) def test_migrate_compiler_flags(self): crosstool = make_crosstool(""" compiler_flag: 'clang-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.compiler_flag), 0) self.assertEqual(output.feature[0].name, "default_compile_flags") self.assertEqual(output.feature[0].flag_set[0].action, ALL_CC_COMPILE_ACTIONS) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["clang-flag-1"]) def test_migrate_compiler_flags_for_objc(self): crosstool = make_crosstool(""" action_config { action_name: "objc-compile" } compiler_flag: 'clang-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.compiler_flag), 0) self.assertEqual(output.feature[0].name, "default_compile_flags") self.assertEqual(output.feature[0].flag_set[0].action, ALL_CC_COMPILE_ACTIONS + ALL_OBJC_COMPILE_ACTIONS) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["clang-flag-1"]) def test_migrate_cxx_flags(self): crosstool = make_crosstool(""" cxx_flag: 'clang-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.cxx_flag), 0) self.assertEqual(output.feature[0].name, "default_compile_flags") self.assertEqual(output.feature[0].flag_set[0].action, ALL_CXX_COMPILE_ACTIONS) 
self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["clang-flag-1"]) def test_compiler_flag_come_before_cxx_flags(self): crosstool = make_crosstool(""" compiler_flag: 'clang-flag-1' cxx_flag: 'clang-flag-2' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "default_compile_flags") self.assertEqual(output.feature[0].flag_set[0].action, ALL_CC_COMPILE_ACTIONS) self.assertEqual(output.feature[0].flag_set[1].action, ALL_CXX_COMPILE_ACTIONS) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["clang-flag-1"]) self.assertEqual(output.feature[0].flag_set[1].flag_group[0].flag, ["clang-flag-2"]) def test_migrate_linker_flags(self): crosstool = make_crosstool(""" linker_flag: 'linker-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.linker_flag), 0) self.assertEqual(output.feature[0].name, "default_link_flags") self.assertEqual(output.feature[0].flag_set[0].action, ALL_CC_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["linker-flag-1"]) def test_migrate_dynamic_library_linker_flags(self): crosstool = make_crosstool(""" dynamic_library_linker_flag: 'linker-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.dynamic_library_linker_flag), 0) self.assertEqual(output.feature[0].name, "default_link_flags") self.assertEqual(output.feature[0].flag_set[0].action, DYNAMIC_LIBRARY_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["linker-flag-1"]) def test_compilation_mode_flags(self): crosstool = make_crosstool(""" compiler_flag: "compile-flag-1" cxx_flag: "cxx-flag-1" linker_flag: "linker-flag-1" compilation_mode_flags { mode: OPT compiler_flag: "opt-flag-1" cxx_flag: "opt-flag-2" linker_flag: "opt-flag-3" } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.compilation_mode_flags), 0) assert_has_feature(self, output, "opt") self.assertEqual(output.feature[0].name, "default_compile_flags") self.assertEqual(output.feature[1].name, "default_link_flags") # flag set for compiler_flag fields self.assertEqual(len(output.feature[0].flag_set[0].with_feature), 0) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["compile-flag-1"]) # flag set for compiler_flag from compilation_mode_flags self.assertEqual(len(output.feature[0].flag_set[1].with_feature), 1) self.assertEqual(output.feature[0].flag_set[1].with_feature[0].feature[0], "opt") self.assertEqual(output.feature[0].flag_set[1].flag_group[0].flag, ["opt-flag-1"]) # flag set for cxx_flag fields self.assertEqual(len(output.feature[0].flag_set[2].with_feature), 0) self.assertEqual(output.feature[0].flag_set[2].flag_group[0].flag, ["cxx-flag-1"]) # flag set for cxx_flag from compilation_mode_flags self.assertEqual(len(output.feature[0].flag_set[3].with_feature), 1) self.assertEqual(output.feature[0].flag_set[3].with_feature[0].feature[0], "opt") self.assertEqual(output.feature[0].flag_set[3].flag_group[0].flag, ["opt-flag-2"]) # default_link_flags, flag set for linker_flag self.assertEqual(len(output.feature[1].flag_set[0].with_feature), 0) self.assertEqual(output.feature[1].flag_set[0].flag_group[0].flag, ["linker-flag-1"]) # default_link_flags, flag set for linker_flag from # compilation_mode_flags self.assertEqual(len(output.feature[1].flag_set[1].with_feature), 1) 
self.assertEqual(output.feature[1].flag_set[1].with_feature[0].feature[0], "opt") self.assertEqual(output.feature[1].flag_set[1].flag_group[0].flag, ["opt-flag-3"]) def test_linking_mode_flags(self): crosstool = make_crosstool(""" linker_flag: "linker-flag-1" compilation_mode_flags { mode: DBG linker_flag: "dbg-flag-1" } linking_mode_flags { mode: MOSTLY_STATIC linker_flag: "mostly-static-flag-1" } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.compilation_mode_flags), 0) self.assertEqual(len(output.linking_mode_flags), 0) # flag set for linker_flag self.assertEqual(len(output.feature[0].flag_set[0].with_feature), 0) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["linker-flag-1"]) # flag set for compilation_mode_flags self.assertEqual(len(output.feature[0].flag_set[1].with_feature), 1) self.assertEqual(output.feature[0].flag_set[1].with_feature[0].feature[0], "dbg") self.assertEqual(output.feature[0].flag_set[1].flag_group[0].flag, ["dbg-flag-1"]) # flag set for linking_mode_flags self.assertEqual(len(output.feature[0].flag_set[2].with_feature), 1) self.assertEqual(output.feature[0].flag_set[2].action, CC_LINK_EXECUTABLE) self.assertEqual(output.feature[0].flag_set[2].with_feature[0].feature[0], "static_linking_mode") self.assertEqual(output.feature[0].flag_set[2].flag_group[0].flag, ["mostly-static-flag-1"]) def test_coverage_compilation_mode_ignored(self): crosstool = make_crosstool(""" compilation_mode_flags { mode: COVERAGE compiler_flag: "coverage-flag-1" cxx_flag: "coverage-flag-2" linker_flag: "coverage-flag-3" } """) output = migrate_to_string(crosstool) self.assertNotIn("compilation_mode_flags", output) self.assertNotIn("coverage-flag-1", output) self.assertNotIn("coverage-flag-2", output) self.assertNotIn("coverage-flag-3", output) self.assertNotIn("COVERAGE", output) def test_supports_dynamic_linker_when_dynamic_library_linker_flag_is_used( self): crosstool = make_crosstool(""" dynamic_library_linker_flag: "foo" """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "default_link_flags") self.assertEqual(output.feature[1].name, "supports_dynamic_linker") self.assertEqual(output.feature[1].enabled, True) def test_supports_dynamic_linker_is_added_when_DYNAMIC_present(self): crosstool = make_crosstool(""" linking_mode_flags { mode: DYNAMIC } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_dynamic_linker") self.assertEqual(output.feature[0].enabled, True) def test_supports_dynamic_linker_is_not_added_when_present(self): crosstool = make_crosstool(""" feature { name: "supports_dynamic_linker" enabled: false } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_dynamic_linker") self.assertEqual(output.feature[0].enabled, False) def test_all_linker_flag_ordering(self): crosstool = make_crosstool(""" linker_flag: 'linker-flag-1' compilation_mode_flags { mode: OPT linker_flag: 'cmf-flag-2' } linking_mode_flags { mode: MOSTLY_STATIC linker_flag: 'lmf-flag-3' } linking_mode_flags { mode: DYNAMIC linker_flag: 'lmf-dynamic-flag-4' } dynamic_library_linker_flag: 'dl-flag-5' test_only_linker_flag: 'to-flag-6' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "default_link_flags") self.assertEqual(output.feature[0].enabled, True) 
self.assertEqual(output.feature[0].flag_set[0].action[:], ALL_CC_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag[:], ["linker-flag-1"]) self.assertEqual(output.feature[0].flag_set[1].action[:], ALL_CC_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[1].with_feature[0].feature[0], "opt") self.assertEqual(output.feature[0].flag_set[1].flag_group[0].flag, ["cmf-flag-2"]) self.assertEqual(output.feature[0].flag_set[2].action, CC_LINK_EXECUTABLE) self.assertEqual(output.feature[0].flag_set[2].with_feature[0].feature[0], "static_linking_mode") self.assertEqual(output.feature[0].flag_set[2].flag_group[0].flag, ["lmf-flag-3"]) self.assertEqual(len(output.feature[0].flag_set[3].with_feature), 0) self.assertEqual(output.feature[0].flag_set[3].flag_group[0].flag, ["lmf-dynamic-flag-4"]) self.assertEqual(output.feature[0].flag_set[3].action, NODEPS_DYNAMIC_LIBRARY_LINK_ACTIONS) self.assertEqual( output.feature[0].flag_set[4].with_feature[0].not_feature[0], "static_link_cpp_runtimes") self.assertEqual(output.feature[0].flag_set[4].flag_group[0].flag, ["lmf-dynamic-flag-4"]) self.assertEqual(output.feature[0].flag_set[4].action, TRANSITIVE_DYNAMIC_LIBRARY_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[5].with_feature[0].feature[0], "dynamic_linking_mode") self.assertEqual(output.feature[0].flag_set[5].flag_group[0].flag, ["lmf-dynamic-flag-4"]) self.assertEqual(output.feature[0].flag_set[5].action, CC_LINK_EXECUTABLE) self.assertEqual(output.feature[0].flag_set[6].flag_group[0].flag, ["dl-flag-5"]) self.assertEqual(output.feature[0].flag_set[6].action, DYNAMIC_LIBRARY_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[7].flag_group[0].flag, ["to-flag-6"]) self.assertEqual(output.feature[0].flag_set[7].action, ALL_CC_LINK_ACTIONS) self.assertEqual( output.feature[0].flag_set[7].flag_group[0].expand_if_all_available, ["is_cc_test"]) def test_all_linker_flag_objc_actions(self): crosstool = make_crosstool(""" action_config { action_name: "objc-compile" } linker_flag: 'linker-flag-1' compilation_mode_flags { mode: OPT linker_flag: 'cmf-flag-2' } linking_mode_flags { mode: MOSTLY_STATIC linker_flag: 'lmf-flag-3' } dynamic_library_linker_flag: 'dl-flag-5' test_only_linker_flag: 'to-flag-6' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "default_link_flags") self.assertEqual(output.feature[0].flag_set[0].action[:], ALL_CC_LINK_ACTIONS + ALL_OBJC_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[1].action[:], ALL_CC_LINK_ACTIONS + ALL_OBJC_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[2].action[:], CC_LINK_EXECUTABLE) self.assertEqual(output.feature[0].flag_set[3].action[:], DYNAMIC_LIBRARY_LINK_ACTIONS) self.assertEqual(output.feature[0].flag_set[4].action[:], ALL_CC_LINK_ACTIONS + ALL_OBJC_LINK_ACTIONS) def test_linking_mode_features_are_not_added_when_present(self): crosstool = make_crosstool(""" linking_mode_flags { mode: DYNAMIC linker_flag: 'dynamic-flag' } linking_mode_flags { mode: FULLY_STATIC linker_flag: 'fully-static-flag' } linking_mode_flags { mode: MOSTLY_STATIC linker_flag: 'mostly-static-flag' } linking_mode_flags { mode: MOSTLY_STATIC_LIBRARIES linker_flag: 'mostly-static-libraries-flag' } feature { name: "static_linking_mode" } feature { name: "dynamic_linking_mode" } feature { name: "static_linking_mode_nodeps_library" } feature { name: "fully_static_link" } """) output = migrate_to_string(crosstool) self.assertNotIn("linking_mode_flags", output) 
self.assertNotIn("DYNAMIC", output) self.assertNotIn("MOSTLY_STATIC", output) self.assertNotIn("MOSTLY_STATIC_LIBRARIES", output) self.assertNotIn("MOSTLY_STATIC_LIBRARIES", output) self.assertNotIn("dynamic-flag", output) self.assertNotIn("fully-static-flag", output) self.assertNotIn("mostly-static-flag", output) self.assertNotIn("mostly-static-libraries-flag", output) def test_unfiltered_require_user_compile_flags_and_sysroot(self): crosstool = make_crosstool(""" feature { name: 'preexisting_feature' } unfiltered_cxx_flag: 'unfiltered-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] # all these features are added after features that are already present in # the crosstool self.assertEqual(output.feature[0].name, "preexisting_feature") self.assertEqual(output.feature[1].name, "user_compile_flags") self.assertEqual(output.feature[2].name, "sysroot") self.assertEqual(output.feature[3].name, "unfiltered_compile_flags") def test_user_compile_flags_not_migrated_when_present(self): crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' feature { name: 'user_compile_flags' } feature { name: 'preexisting_feature' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "user_compile_flags") self.assertEqual(output.feature[1].name, "preexisting_feature") self.assertEqual(output.feature[2].name, "sysroot") self.assertEqual(output.feature[3].name, "unfiltered_compile_flags") def test_sysroot_not_migrated_when_present(self): crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' feature { name: 'sysroot' } feature { name: 'preexisting_feature' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "sysroot") self.assertEqual(output.feature[1].name, "preexisting_feature") self.assertEqual(output.feature[2].name, "user_compile_flags") self.assertEqual(output.feature[3].name, "unfiltered_compile_flags") def test_user_compile_flags(self): crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "user_compile_flags") self.assertEqual(output.feature[0].enabled, True) self.assertEqual(output.feature[0].flag_set[0].action, ALL_CC_COMPILE_ACTIONS) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].expand_if_all_available, ["user_compile_flags"]) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].iterate_over, "user_compile_flags") self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["%{user_compile_flags}"]) def test_sysroot(self): sysroot_actions = ALL_CC_COMPILE_ACTIONS + ALL_CC_LINK_ACTIONS sysroot_actions.remove("assemble") self.assertTrue("assemble" not in sysroot_actions) crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[1].name, "sysroot") self.assertEqual(output.feature[1].enabled, True) self.assertEqual(output.feature[1].flag_set[0].action, sysroot_actions) self.assertEqual( output.feature[1].flag_set[0].flag_group[0].expand_if_all_available, ["sysroot"]) self.assertEqual(output.feature[1].flag_set[0].flag_group[0].flag, ["--sysroot=%{sysroot}"]) def test_unfiltered_compile_flags_is_not_added_when_already_present(self): crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' feature { name: 'something_else' } feature { name: 
'unfiltered_compile_flags' } feature { name: 'something_else_2' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(output.feature[1].name, "unfiltered_compile_flags") self.assertEqual(len(output.feature[1].flag_set), 0) self.assertEqual(output.feature[2].name, "something_else_2") def test_unfiltered_compile_flags_is_not_edited_if_old_variant_present(self): crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' feature { name: 'unfiltered_compile_flags' flag_set { action: 'c-compile' flag_group { flag: 'foo-flag-1' } } } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "unfiltered_compile_flags") self.assertEqual(len(output.feature[0].flag_set), 1) self.assertEqual(output.feature[0].flag_set[0].action, ["c-compile"]) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["foo-flag-1"]) def test_use_of_unfiltered_compile_flags_var_is_removed_and_replaced(self): crosstool = make_crosstool(""" unfiltered_cxx_flag: 'unfiltered-flag-1' feature { name: 'unfiltered_compile_flags' flag_set { action: 'c-compile' flag_group { flag: 'foo-flag-1' } } flag_set { action: 'c++-compile' flag_group { flag: 'bar-flag-1' } flag_group { expand_if_all_available: 'unfiltered_compile_flags' iterate_over: 'unfiltered_compile_flags' flag: '%{unfiltered_compile_flags}' } flag_group { flag: 'bar-flag-2' } } flag_set { action: 'c-compile' flag_group { flag: 'foo-flag-2' } } } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "unfiltered_compile_flags") self.assertEqual(output.feature[0].flag_set[0].action, ["c-compile"]) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["foo-flag-1"]) self.assertEqual(output.feature[0].flag_set[1].action, ["c++-compile"]) self.assertEqual(output.feature[0].flag_set[1].flag_group[0].flag, ["bar-flag-1"]) self.assertEqual(output.feature[0].flag_set[1].flag_group[1].flag, ["unfiltered-flag-1"]) self.assertEqual(output.feature[0].flag_set[1].flag_group[2].flag, ["bar-flag-2"]) self.assertEqual(output.feature[0].flag_set[2].action, ["c-compile"]) self.assertEqual(output.feature[0].flag_set[2].flag_group[0].flag, ["foo-flag-2"]) def test_unfiltered_compile_flags_is_added_at_the_end(self): crosstool = make_crosstool(""" feature { name: 'something_else' } unfiltered_cxx_flag: 'unfiltered-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(output.feature[1].name, "user_compile_flags") self.assertEqual(output.feature[2].name, "sysroot") self.assertEqual(output.feature[3].name, "unfiltered_compile_flags") self.assertEqual(output.feature[3].flag_set[0].action, ALL_CC_COMPILE_ACTIONS) self.assertEqual(output.feature[3].flag_set[0].flag_group[0].flag, ["unfiltered-flag-1"]) def test_unfiltered_compile_flags_are_not_added_for_objc(self): crosstool = make_crosstool(""" action_config { action_name: "obc-compile" } feature { name: 'something_else' } unfiltered_cxx_flag: 'unfiltered-flag-1' """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[3].name, "unfiltered_compile_flags") self.assertEqual(output.feature[3].flag_set[0].action, ALL_CC_COMPILE_ACTIONS) self.assertEqual(output.feature[3].flag_set[0].flag_group[0].flag, ["unfiltered-flag-1"]) def 
test_default_link_flags_is_added_first(self): crosstool = make_crosstool(""" linker_flag: 'linker-flag-1' feature { name: 'something_else' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "default_link_flags") self.assertEqual(output.feature[0].enabled, True) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["linker-flag-1"]) def test_default_link_flags_is_not_added_when_already_present(self): crosstool = make_crosstool(""" linker_flag: 'linker-flag-1' feature { name: 'something_else' } feature { name: 'default_link_flags' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(output.feature[1].name, "default_link_flags") def test_default_compile_flags_is_not_added_when_no_reason_to(self): crosstool = make_crosstool(""" feature { name: 'something_else' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(len(output.feature), 1) def test_default_compile_flags_is_first(self): crosstool = make_crosstool(""" compiler_flag: 'compiler-flag-1' feature { name: 'something_else' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "default_compile_flags") self.assertEqual(output.feature[0].enabled, True) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag, ["compiler-flag-1"]) def test_default_compile_flags_not_added_when_present(self): crosstool = make_crosstool(""" compiler_flag: 'compiler-flag-1' feature { name: 'something_else' } feature { name: 'default_compile_flags' } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(output.feature[1].name, "default_compile_flags") self.assertEqual(len(output.feature[1].flag_set), 0) def test_supports_start_end_lib_migrated(self): crosstool = make_crosstool("supports_start_end_lib: true") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_start_end_lib") self.assertEqual(output.feature[0].enabled, True) def test_supports_start_end_lib_not_migrated_on_false(self): crosstool = make_crosstool("supports_start_end_lib: false") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.feature), 0) def test_supports_start_end_lib_not_migrated_when_already_present(self): crosstool = make_crosstool(""" supports_start_end_lib: true feature { name: "supports_start_end_lib" enabled: false } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_start_end_lib") self.assertEqual(output.feature[0].enabled, False) def test_supports_interface_shared_libraries_migrated(self): crosstool = make_crosstool("supports_interface_shared_objects: true") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_interface_shared_libraries") self.assertEqual(output.feature[0].enabled, True) def test_supports_interface_shared_libraries_not_migrated_on_false(self): crosstool = make_crosstool("supports_interface_shared_objects: false") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.feature), 0) def test_supports_interface_shared_libraries_not_migrated_when_present(self): crosstool = 
make_crosstool(""" supports_interface_shared_objects: true feature { name: "supports_interface_shared_libraries" enabled: false } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_interface_shared_libraries") self.assertEqual(output.feature[0].enabled, False) def test_supports_embedded_runtimes_migrated(self): crosstool = make_crosstool("supports_embedded_runtimes: true") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "static_link_cpp_runtimes") self.assertEqual(output.feature[0].enabled, True) def test_supports_embedded_runtimes_not_migrated_on_false(self): crosstool = make_crosstool("supports_embedded_runtimes: false") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.feature), 0) def test_supports_embedded_runtimes_not_migrated_when_already_present(self): crosstool = make_crosstool(""" supports_embedded_runtimes: true feature { name: "static_link_cpp_runtimes" enabled: false } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "static_link_cpp_runtimes") self.assertEqual(output.feature[0].enabled, False) def test_needs_pic_migrated(self): crosstool = make_crosstool("needsPic: true") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_pic") self.assertEqual(output.feature[0].enabled, True) def test_needs_pic_not_migrated_on_false(self): crosstool = make_crosstool("needsPic: false") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.feature), 0) def test_needs_pic_not_migrated_when_already_present(self): crosstool = make_crosstool(""" needsPic: true feature { name: "supports_pic" enabled: false } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "supports_pic") self.assertEqual(output.feature[0].enabled, False) def test_supports_fission_migrated(self): crosstool = make_crosstool("supports_fission: true") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "per_object_debug_info") self.assertEqual(output.feature[0].enabled, True) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].expand_if_all_available, ["is_using_fission"]) def test_supports_fission_not_migrated_on_false(self): crosstool = make_crosstool("supports_fission: false") migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(len(output.feature), 0) def test_supports_fission_not_migrated_when_already_present(self): crosstool = make_crosstool(""" supports_fission: true feature { name: "per_object_debug_info" enabled: false } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "per_object_debug_info") self.assertEqual(output.feature[0].enabled, False) def test_migrating_objcopy_embed_flag(self): crosstool = make_crosstool(""" tool_path { name: "objcopy" path: "foo/objcopy" } objcopy_embed_flag: "a" objcopy_embed_flag: "b" """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "objcopy_embed_flags") self.assertEqual(output.feature[0].enabled, True) self.assertEqual(output.feature[0].flag_set[0].action[:], ["objcopy_embed_data"]) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag[:], ["a", "b"]) 
self.assertEqual(len(output.objcopy_embed_flag), 0) self.assertEqual(output.action_config[0].action_name, "objcopy_embed_data") self.assertEqual(output.action_config[0].tool[0].tool_path, "foo/objcopy") def test_not_migrating_objcopy_embed_flag_when_feature_present(self): crosstool = make_crosstool(""" objcopy_embed_flag: "a" objcopy_embed_flag: "b" feature { name: "objcopy_embed_flags" } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "objcopy_embed_flags") self.assertEqual(output.feature[0].enabled, False) def test_migrating_ld_embed_flag(self): crosstool = make_crosstool(""" tool_path { name: "ld" path: "foo/ld" } ld_embed_flag: "a" ld_embed_flag: "b" """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "ld_embed_flags") self.assertEqual(output.feature[0].enabled, True) self.assertEqual(output.feature[0].flag_set[0].action[:], ["ld_embed_data"]) self.assertEqual(output.feature[0].flag_set[0].flag_group[0].flag[:], ["a", "b"]) self.assertEqual(len(output.ld_embed_flag), 0) self.assertEqual(output.action_config[0].action_name, "ld_embed_data") self.assertEqual(output.action_config[0].tool[0].tool_path, "foo/ld") def test_not_migrating_objcopy_embed_flag_when_feature_present(self): crosstool = make_crosstool(""" objcopy_embed_flag: "a" objcopy_embed_flag: "b" feature { name: "objcopy_embed_flags" } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "objcopy_embed_flags") self.assertEqual(output.feature[0].enabled, False) def test_migrate_expand_if_all_available_from_flag_sets(self): crosstool = make_crosstool(""" action_config { action_name: 'something' config_name: 'something' flag_set { expand_if_all_available: 'foo' flag_group { flag: '%{foo}' } flag_group { flag: 'bar' } } } feature { name: 'something_else' flag_set { action: 'c-compile' expand_if_all_available: 'foo' flag_group { flag: '%{foo}' } flag_group { flag: 'bar' } } } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.action_config[0].action_name, "something") self.assertEqual(len(output.action_config[0].flag_set), 1) self.assertEqual( len(output.action_config[0].flag_set[0].expand_if_all_available), 0) self.assertEqual(len(output.action_config[0].flag_set[0].flag_group), 2) self.assertEqual( output.action_config[0].flag_set[0].flag_group[0] .expand_if_all_available, ["foo"]) self.assertEqual( output.action_config[0].flag_set[0].flag_group[1] .expand_if_all_available, ["foo"]) self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(len(output.feature[0].flag_set), 1) self.assertEqual( len(output.feature[0].flag_set[0].expand_if_all_available), 0) self.assertEqual(len(output.feature[0].flag_set[0].flag_group), 2) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].expand_if_all_available, ["foo"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].expand_if_all_available, ["foo"]) def test_enable_previously_default_features(self): default_features = [ "dependency_file", "random_seed", "module_maps", "module_map_home_cwd", "header_module_compile", "include_paths", "pic", "preprocessor_define" ] crosstool = make_crosstool(""" feature { name: "dependency_file" } feature { name: "random_seed" } feature { name: "module_maps" } feature { name: "module_map_home_cwd" } feature { name: "header_module_compile" } feature { name: "include_paths" } feature { name: "pic" } feature 
{ name: "preprocessor_define" } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] for i in range(0, 8): self.assertEqual(output.feature[i].name, default_features[i]) self.assertTrue(output.feature[i].enabled) def test_migrate_repeated_expand_if_all_available_from_flag_groups(self): crosstool = make_crosstool(""" action_config { action_name: 'something' config_name: 'something' flag_set { flag_group { expand_if_all_available: 'foo' expand_if_all_available: 'bar' flag: '%{foo}' } flag_group { expand_if_none_available: 'foo' expand_if_none_available: 'bar' flag: 'bar' } } } feature { name: 'something_else' flag_set { action: 'c-compile' flag_group { expand_if_all_available: 'foo' expand_if_all_available: 'bar' flag: '%{foo}' } flag_group { expand_if_none_available: 'foo' expand_if_none_available: 'bar' flag: 'bar' } } } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.action_config[0].action_name, "something") self.assertEqual(len(output.action_config[0].flag_set), 1) self.assertEqual( len(output.action_config[0].flag_set[0].expand_if_all_available), 0) self.assertEqual(len(output.action_config[0].flag_set[0].flag_group), 2) self.assertEqual( output.action_config[0].flag_set[0].flag_group[0] .expand_if_all_available, ["foo"]) self.assertEqual( output.action_config[0].flag_set[0].flag_group[0].flag_group[0] .expand_if_all_available, ["bar"]) self.assertEqual( output.action_config[0].flag_set[0].flag_group[1] .expand_if_none_available, ["foo"]) self.assertEqual( output.action_config[0].flag_set[0].flag_group[1].flag_group[0] .expand_if_none_available, ["bar"]) self.assertEqual(output.feature[0].name, "something_else") self.assertEqual(len(output.feature[0].flag_set), 1) self.assertEqual( len(output.feature[0].flag_set[0].expand_if_all_available), 0) self.assertEqual(len(output.feature[0].flag_set[0].flag_group), 2) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].expand_if_all_available, ["foo"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].flag_group[0] .expand_if_all_available, ["bar"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].expand_if_none_available, ["foo"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].flag_group[0] .expand_if_none_available, ["bar"]) def test_migrate_repeated_expands_from_nested_flag_groups(self): crosstool = make_crosstool(""" feature { name: 'something' flag_set { action: 'c-compile' flag_group { flag_group { expand_if_all_available: 'foo' expand_if_all_available: 'bar' flag: '%{foo}' } } flag_group { flag_group { expand_if_all_available: 'foo' expand_if_all_available: 'bar' expand_if_none_available: 'foo' expand_if_none_available: 'bar' flag: '%{foo}' } } } } """) migrate_legacy_fields(crosstool) output = crosstool.toolchain[0] self.assertEqual(output.feature[0].name, "something") self.assertEqual(len(output.feature[0].flag_set[0].flag_group), 2) self.assertEqual( len(output.feature[0].flag_set[0].flag_group[0].expand_if_all_available ), 0) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].flag_group[0] .expand_if_all_available, ["foo"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].flag_group[0].flag_group[0] .expand_if_all_available, ["bar"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[0].flag_group[0].flag_group[0] .flag, ["%{foo}"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].flag_group[0] .expand_if_all_available, ["foo"]) self.assertEqual( 
output.feature[0].flag_set[0].flag_group[1].flag_group[0] .expand_if_none_available, ["foo"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].flag_group[0].flag_group[0] .expand_if_none_available, ["bar"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].flag_group[0].flag_group[0] .expand_if_all_available, ["bar"]) self.assertEqual( output.feature[0].flag_set[0].flag_group[1].flag_group[0].flag_group[0] .flag, ["%{foo}"]) if __name__ == "__main__": unittest.main() 07070100000037000081A4000003E800000064000000015D359B4200000977000000000000000000000000000000000000004200000000bazel-rules-cc-20190722/tools/migration/legacy_fields_migrator.py"""Script migrating legacy CROSSTOOL fields into features. This script migrates the CROSSTOOL to use only the features to describe C++ command lines. It is intended to be added as a last step of CROSSTOOL generation pipeline. Since it doesn't retain comments, we assume CROSSTOOL owners will want to migrate their pipeline manually. """ # Tracking issue: https://github.com/bazelbuild/bazel/issues/5187 # # Since C++ rules team is working on migrating CROSSTOOL from text proto into # Starlark, we advise CROSSTOOL owners to wait for the CROSSTOOL -> Starlark # migrator before they invest too much time into fixing their pipeline. Tracking # issue for the Starlark effort is # https://github.com/bazelbuild/bazel/issues/5380. from absl import app from absl import flags from google.protobuf import text_format from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2 from tools.migration.legacy_fields_migration_lib import migrate_legacy_fields import os flags.DEFINE_string("input", None, "Input CROSSTOOL file to be migrated") flags.DEFINE_string("output", None, "Output path where to write migrated CROSSTOOL.") flags.DEFINE_boolean("inline", None, "Overwrite --input file") def main(unused_argv): crosstool = crosstool_config_pb2.CrosstoolRelease() input_filename = flags.FLAGS.input output_filename = flags.FLAGS.output inline = flags.FLAGS.inline if not input_filename: raise app.UsageError("ERROR --input unspecified") if not output_filename and not inline: raise app.UsageError("ERROR --output unspecified and --inline not passed") if output_filename and inline: raise app.UsageError("ERROR both --output and --inline passed") with open(to_absolute_path(input_filename), "r") as f: input_text = f.read() text_format.Merge(input_text, crosstool) migrate_legacy_fields(crosstool) output_text = text_format.MessageToString(crosstool) resolved_output_filename = to_absolute_path( input_filename if inline else output_filename) with open(resolved_output_filename, "w") as f: f.write(output_text) def to_absolute_path(path): path = os.path.expanduser(path) if os.path.isabs(path): return path else: if "BUILD_WORKING_DIRECTORY" in os.environ: return os.path.join(os.environ["BUILD_WORKING_DIRECTORY"], path) else: return path if __name__ == "__main__": app.run(main) 07070100000000000000000000000000000000000000010000000000000000000000000000000000000000000000000000000B00000000TRAILER!!!644 blocks
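For readers who want to drive the migration without the absl command-line wrapper in legacy_fields_migrator.py, the snippet below is a minimal sketch of the same parse/migrate/serialize flow built only from calls that appear in the sources above. It assumes the proto module and tools.migration package are importable under the same paths used in the repository; "crosstool.txt" and "crosstool.migrated.txt" are hypothetical file names used purely for illustration.

# Minimal sketch: apply migrate_legacy_fields to a CROSSTOOL text proto.
# Assumes crosstool_config_pb2 and tools.migration are on PYTHONPATH;
# the input/output file names are placeholders, not part of the repository.
from google.protobuf import text_format
from third_party.com.github.bazelbuild.bazel.src.main.protobuf import crosstool_config_pb2
from tools.migration.legacy_fields_migration_lib import migrate_legacy_fields

crosstool = crosstool_config_pb2.CrosstoolRelease()
with open("crosstool.txt", "r") as f:
  # Parse the legacy CROSSTOOL text proto.
  text_format.Merge(f.read(), crosstool)

# Rewrite legacy fields (compiler_flag, linker_flag, ...) into features in place.
migrate_legacy_fields(crosstool)

with open("crosstool.migrated.txt", "w") as f:
  f.write(text_format.MessageToString(crosstool))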