add shadow, jflex, jacc plugins, update to gradle 7.4

This commit is contained in:
Jörg Prante 2022-02-19 23:05:50 +01:00
parent 6e1e4831c4
commit f246d2882c
202 changed files with 20381 additions and 102 deletions

202
LICENSE.txt Normal file
View file

@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

0
NOTICE.txt Normal file
View file

View file

@ -1,6 +1,6 @@
 wrapper {
-    gradleVersion = "${project.property('gradle.wrapper.version')}"
+    gradleVersion = libs.versions.gradle.get()
     distributionType = Wrapper.DistributionType.ALL
 }
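
The `libs.versions.gradle.get()` and `alias(libs.plugins.publish)` accessors used throughout this commit come from a Gradle version catalog, normally declared in a `gradle/libs.versions.toml` file that is not shown in this excerpt. Below is a minimal sketch of an equivalent catalog declared in `settings.gradle`; the alias names are inferred from the build scripts and every version number is purely illustrative:

```groovy
// settings.gradle; illustrative sketch only, the real catalog of this commit is not shown here
dependencyResolutionManagement {
    versionCatalogs {
        libs {
            version('gradle', '7.4')  // read via libs.versions.gradle.get()
            library('asciidoctorj', 'org.asciidoctor', 'asciidoctorj').version('2.5.3')
            library('spock-core', 'org.spockframework', 'spock-core').version('2.0-groovy-3.0')
            plugin('publish', 'com.gradle.plugin-publish').version('0.18.0')
        }
    }
}
```

Hyphenated aliases such as `spock-core` appear in build scripts as dotted accessors (`libs.spock.core`).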

View file

@ -0,0 +1,10 @@
This work is based on an old (before June 2018) 1.5.6 version of
https://github.com/asciidoctor/asciidoctor-gradle-plugin
You can find remnants in the official code under org.asciidoctor.gradle.compat, which was created with
https://github.com/ysb33r/asciidoctor-gradle-plugin/commit/1ad29b2eba915f7bce3cb004e3e9582578c8ac72
License: Apache 2.0

View file

@ -1,6 +1,6 @@
 plugins {
     id 'java-gradle-plugin'
-    id 'com.gradle.plugin-publish' version '0.18.0'
+    alias(libs.plugins.publish)
 }
 apply plugin: 'java-gradle-plugin'
@ -10,10 +10,10 @@ apply from: rootProject.file('gradle/compile/groovy.gradle')
 dependencies {
     api gradleApi()
-    implementation "org.asciidoctor:asciidoctorj:${project.property('asciidoctorj.version')}"
-    implementation "org.jruby:jruby:${project.property('jruby.version')}"
-    testImplementation "org.spockframework:spock-core:${project.property('spock.version')}"
-    testImplementation "org.jsoup:jsoup:${project.property('jsoup.version')}"
+    implementation libs.asciidoctorj
+    implementation libs.jruby
+    testImplementation libs.spock.core
+    testImplementation libs.jsoup
 }
 gradlePlugin {

View file

@ -1 +1 @@
-version = 2.5.2.0
+version = 2.5.2.1

View file

@ -4,7 +4,7 @@ import org.gradle.api.Project
 class AsciidoctorExtension {
-    String version = '2.5.3'
+    String version = '2.5.2'
     boolean addDefaultRepositories = true

View file

@ -1,28 +1,18 @@
 plugins {
     id 'java-gradle-plugin'
-    id 'groovy'
-    id 'com.gradle.plugin-publish' version '0.18.0'
+    alias(libs.plugins.publish)
 }
-apply plugin: 'groovy'
 apply plugin: 'java-gradle-plugin'
 apply plugin: 'com.gradle.plugin-publish'
+apply from: rootProject.file('gradle/compile/groovy.gradle')
 dependencies {
     api gradleApi()
     testImplementation gradleTestKit()
 }
-compileGroovy {
-    sourceCompatibility = JavaVersion.VERSION_11
-    targetCompatibility = JavaVersion.VERSION_11
-}
-compileTestGroovy {
-    sourceCompatibility = JavaVersion.VERSION_11
-    targetCompatibility = JavaVersion.VERSION_11
-}
 gradlePlugin {
     plugins {
         dockerPlugin {

View file

@ -0,0 +1,202 @@
[Apache License, Version 2.0; full text identical to LICENSE.txt above]

View file

@ -1,29 +1,19 @@
 plugins {
     id 'java-gradle-plugin'
-    id 'groovy'
-    id 'com.gradle.plugin-publish' version '0.18.0'
+    alias(libs.plugins.publish)
 }
-apply plugin: 'groovy'
 apply plugin: 'java-gradle-plugin'
 apply plugin: 'com.gradle.plugin-publish'
+apply from: rootProject.file('gradle/compile/groovy.gradle')
 dependencies {
     api gradleApi()
-    api "org.xbib.groovy:groovy-git:${project.property('groovy-git.version')}"
+    api libs.groovy.git
     testImplementation gradleTestKit()
-    testImplementation "org.spockframework:spock-core:${project.property('spock.version')}"
-    testImplementation "junit:junit:${project.property('junit4.version')}"
-}
-compileGroovy {
-    sourceCompatibility = JavaVersion.VERSION_11
-    targetCompatibility = JavaVersion.VERSION_11
-}
-compileTestGroovy {
-    sourceCompatibility = JavaVersion.VERSION_11
-    targetCompatibility = JavaVersion.VERSION_11
+    testImplementation libs.spock.core
+    testImplementation libs.junit4
 }
 gradlePlugin {

View file

@ -0,0 +1,202 @@
[Apache License, Version 2.0; full text identical to LICENSE.txt above]

52
gradle-plugin-jacc/README.md Executable file
View file

@ -0,0 +1,52 @@
# gradle-plugin-jacc
A Gradle plugin for [Jacc](http://web.cecs.pdx.edu/~mpj/jacc/)
## Usage
    plugins {
        id 'org.xbib.gradle.plugin.jacc'
    }

or, with the legacy syntax:

    apply plugin: 'org.xbib.gradle.plugin.jacc'

Gradle will look for your Jacc files in the source sets you specify.
By default, it looks for files matching the pattern `**/*.jacc` under `src/main/jacc`
and `src/test/jacc`.

You can set up the source sets like this:

    sourceSets {
        main {
            jacc {
                srcDir "src/main/jacc"
            }
            java {
                srcDir "build/my-generated-sources/jacc"
            }
        }
    }

The last `java` `srcDir` definition will be used as the base for the Jacc target path.
If it is not given, the Jacc target path for generated Java sources follows the pattern
`${project.buildDir}/generated/sources/jacc`.

The Jacc target path is added automatically to the Java compile task's source directories
of the source set.
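
Based on the task-naming logic in `JaccPlugin` later in this commit (`sourceSet.getTaskName('generate', 'jacc')`), the generated task for the `main` source set should be named `generateJacc`, and the plugin makes `compileJava` depend on it. A minimal consuming build might look like the following sketch (the plugin version is taken from this commit's `gradle.properties`; the repository block is an assumption):

```groovy
// build.gradle of a consuming project; illustrative sketch, not part of this commit
plugins {
    id 'org.xbib.gradle.plugin.jacc' version '1.4.0'
}

repositories {
    mavenCentral()
}

// Grammars under src/main/jacc/**/*.jacc are picked up automatically;
// running ./gradlew compileJava (or generateJacc directly) regenerates
// the parser sources before compilation.
```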
# License
Copyright (C) 2015-2020 Jörg Prante
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

41
gradle-plugin-jacc/build.gradle Executable file
View file

@ -0,0 +1,41 @@
plugins {
    id 'java-gradle-plugin'
    alias(libs.plugins.publish)
}

apply plugin: 'java-gradle-plugin'
apply plugin: 'com.gradle.plugin-publish'
apply from: rootProject.file('gradle/compile/groovy.gradle')
apply from: rootProject.file('gradle/test/junit5.gradle')

dependencies {
    api gradleApi()
    implementation libs.jacc
    testImplementation gradleTestKit()
}

gradlePlugin {
    plugins {
        jaccPlugin {
            id = 'org.xbib.gradle.plugin.jacc'
            implementationClass = 'org.xbib.gradle.plugin.jacc.JaccPlugin'
        }
    }
}

if (project.hasProperty('gradle.publish.key')) {
    pluginBundle {
        website = scmUrl
        vcsUrl = scmUrl
        plugins {
            jaccPlugin {
                id = 'org.xbib.gradle.plugin.jacc'
                version = project.version
                description = 'Gradle Jacc plugin'
                displayName = 'Gradle Jacc plugin'
                tags = ['jacc']
            }
        }
    }
}

View file

@ -0,0 +1 @@
version = 1.4.0

View file

@ -0,0 +1,80 @@
package org.xbib.gradle.plugin.jacc

import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.file.SourceDirectorySet
import org.gradle.api.logging.Logger
import org.gradle.api.logging.Logging
import org.gradle.api.tasks.SourceSet
import org.gradle.api.tasks.TaskProvider

class JaccPlugin implements Plugin<Project> {

    private static final Logger logger = Logging.getLogger(JaccPlugin)

    @Override
    void apply(Project project) {
        project.with {
            apply plugin: 'java-library'
            addSourceSetExtensions(project)
        }
        project.afterEvaluate {
            addTasks(project)
        }
    }

    private static void addSourceSetExtensions(Project project) {
        project.sourceSets.all { SourceSet sourceSet ->
            createSourceSetExtension(project, sourceSet)
            createConfiguration(project, sourceSet)
        }
    }

    private static void createSourceSetExtension(Project project, SourceSet sourceSet) {
        String name = sourceSet.name
        SourceDirectorySet sourceDirectorySet = project.objects.sourceDirectorySet(name, "${name} Jacc source")
        sourceSet.extensions.add('jacc', sourceDirectorySet)
        sourceDirectorySet.srcDir("src/${name}/jacc")
        sourceDirectorySet.include("**/*.jacc")
    }

    private static void createConfiguration(Project project, SourceSet sourceSet) {
        String configName = sourceSet.name + 'Jacc'
        if (project.configurations.findByName(configName) == null) {
            logger.info "create configuration ${configName}"
            project.configurations.create(configName) {
                visible = false
                transitive = true
                extendsFrom = []
            }
        }
    }

    private static void addTasks(Project project) {
        project.sourceSets.all { SourceSet sourceSet ->
            addTaskForSourceSet(project, sourceSet)
        }
    }

    private static void addTaskForSourceSet(Project project, SourceSet sourceSet) {
        String taskName = sourceSet.getTaskName('generate', 'jacc')
        SourceDirectorySet sourceDirectorySet = sourceSet.extensions.getByName('jacc') as SourceDirectorySet
        File targetFile = sourceSet.java && sourceSet.java.srcDirs ? sourceSet.java.srcDirs.last() :
                project.file("${project.buildDir}/generated/sources/${sourceSet.name}")
        if (sourceDirectorySet.asList()) {
            TaskProvider<JaccTask> taskProvider = project.tasks.register(taskName, JaccTask) {
                group = 'jacc'
                description = 'Generates code from Jacc files in ' + sourceSet.name
                source = sourceDirectorySet.asList()
                target = targetFile
            }
            logger.info "created ${taskName} for sources ${sourceDirectorySet.asList()} and target ${targetFile}"
            project.tasks.findByName(sourceSet.compileJavaTaskName).dependsOn taskProvider
            if (sourceSet.java && sourceSet.java.srcDirs) {
                sourceSet.java.srcDirs += targetFile
            }
        }
    }
}

View file

@ -0,0 +1,33 @@
package org.xbib.gradle.plugin.jacc

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.OutputDirectory
import org.gradle.api.tasks.TaskAction
import org.xbib.jacc.Jacc

class JaccTask extends DefaultTask {

    @InputFiles
    Iterable<File> source

    @OutputDirectory
    File target

    @TaskAction
    void generateAndTransformJacc() throws Exception {
        source.each { file ->
            String pkg = getPackageName(file)
            File fullTarget = new File(target, pkg.replace('.', '/'))
            project.mkdir(fullTarget)
            Jacc.main([file.absolutePath, '-d', fullTarget] as String[])
        }
    }

    // extracts the package from a grammar line of the form "package x.y.z;"
    // (substring(8) skips "package ", the dropped final character is the ';')
    static String getPackageName(File file) {
        String string = file.readLines().find { line ->
            line.startsWith('package')
        }
        return string == null ? '' : string.substring(8, string.length() - 1)
    }
}
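
A quick worked example of the package handling in `JaccTask`, under the assumption that a grammar file begins with a line like `package org.example;`:

```groovy
// Worked example of the package extraction above (illustrative, not part of this commit)
def line = 'package org.example;'
assert line.substring(8, line.length() - 1) == 'org.example'  // what getPackageName() returns
// the generated sources are then written below new File(target, 'org/example')
```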

View file

@ -0,0 +1,65 @@
package org.xbib.gradle.plugin.jacc

import org.gradle.testkit.runner.BuildResult
import org.gradle.testkit.runner.GradleRunner
import org.gradle.testkit.runner.TaskOutcome
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.io.TempDir

import static org.junit.jupiter.api.Assertions.*

class JaccPluginTest {

    private File projectDir

    private File settingsFile

    private File buildFile

    @BeforeEach
    void setup(@TempDir File testProjectDir) throws IOException {
        this.projectDir = testProjectDir
        this.settingsFile = new File(testProjectDir, "settings.gradle")
        this.buildFile = new File(testProjectDir, "build.gradle")
    }

    @Test
    void testJacc() {
        String settingsFileContent = '''
            rootProject.name = 'jacc-test'
        '''
        settingsFile.write(settingsFileContent)
        String buildFileContent = '''
            plugins {
                id 'org.xbib.gradle.plugin.jacc'
            }
            sourceSets {
                test {
                    jacc {
                        srcDir "${System.getProperty('user.dir')}/src/test/jacc"
                    }
                    java {
                        srcDir "${System.getProperty('user.dir')}/build/my-generated-sources/jacc"
                    }
                }
            }
        '''
        buildFile.write(buildFileContent)
        BuildResult result = GradleRunner.create()
                .withProjectDir(projectDir)
                .withArguments(":build", "--info")
                .withPluginClasspath()
                .forwardOutput()
                .build()
        assertEquals(TaskOutcome.SUCCESS, result.task(":build").getOutcome())
        File file = new File("${System.getProperty('user.dir')}/build/my-generated-sources/jacc")
        if (file.exists()) {
            List<File> list = Arrays.asList(file.listFiles())
            assertEquals(2, list.size())
        }
    }
}

View file

@ -0,0 +1,104 @@
// To compile and run this program using jacc and Sun's JDK:
//
// jacc simpleCalc.jacc
// javac Calc.java CalcTokens.java
// java Calc
// ... enter arithmetic expressions ... hit EOF to terminate
//
%class Calc
%interface CalcTokens
%semantic int : yylval
%get token
%next yylex()
%token '+' '-' '*' '/' '(' ')' ';' INTEGER
%left '+' '-'
%left '*' '/'
%%
prog : prog ';' expr { System.out.println($3); }
| expr { System.out.println($1); }
;
expr : expr '+' expr { $$ = $1 + $3; }
| expr '-' expr { $$ = $1 - $3; }
| expr '*' expr { $$ = $1 * $3; }
| expr '/' expr { $$ = $1 / $3; }
| '(' expr ')' { $$ = $2; }
| INTEGER { $$ = $1; }
;
%%
private void yyerror(String msg) {
System.out.println("ERROR: " + msg);
System.exit(1);
}
private int c;
/** Read a single input character from standard input.
*/
private void nextChar() {
if (c>=0) {
try {
c = System.in.read();
} catch (Exception e) {
c = (-1);
}
}
}
int token;
int yylval;
/** Read the next token and return the
* corresponding integer code.
*/
int yylex() {
for (;;) {
// Skip whitespace
while (c==' ' || c=='\n' || c=='\t' || c=='\r') {
nextChar();
}
if (c<0) {
return (token=ENDINPUT);
}
switch (c) {
case '+' : nextChar();
return token='+';
case '-' : nextChar();
return token='-';
case '*' : nextChar();
return token='*';
case '/' : nextChar();
return token='/';
case '(' : nextChar();
return token='(';
case ')' : nextChar();
return token=')';
case ';' : nextChar();
return token=';';
default : if (Character.isDigit((char)c)) {
int n = 0;
do {
n = 10*n + (c - '0');
nextChar();
} while (Character.isDigit((char)c));
yylval = n;
return token=INTEGER;
} else {
yyerror("Illegal character "+c);
nextChar();
}
}
}
}
public static void main(String[] args) {
Calc calc = new Calc();
calc.nextChar(); // prime the character input stream
calc.yylex(); // prime the token input stream
calc.parse(); // parse the input
}

View file

@ -0,0 +1,202 @@
[Apache License, Version 2.0; full text identical to LICENSE.txt above]

79
gradle-plugin-jflex/README.md Executable file
View file

@ -0,0 +1,79 @@
# gradle-plugin-jflex
A Gradle plugin for [JFlex](http://jflex.de)
## Usage
    plugins {
        id "org.xbib.gradle.plugin.jflex" version "1.4.0"
    }

Gradle will look for your JFlex files in the source sets you specify.
By default, it looks for files matching the pattern `**/*.jflex` under `src/main/jflex`
and `src/test/jflex`.

You can set up the source sets like this:

    sourceSets {
        main {
            jflex {
                srcDir "src/main/jflex"
            }
            java {
                srcDir "$buildDir/my-generated-sources/jflex"
            }
        }
    }

The last `java` `srcDir` definition will be used as the base for the JFlex target path.
If it is not given, the JFlex target path for generated Java sources follows the pattern
`${project.buildDir}/generated/sources/jflex`.

The JFlex target path is added automatically to the Java compile task's source directories
of the source set.
## Parameter support
The following parameters can be set in a global `jflex` extension in the Gradle build script
(see also https://jflex.de/manual.html); a configuration sketch follows the table.
| Name | Description |
| ------- | ---------- |
| encoding | the file encoding |
| rootDirectory | the root directory used by JFlex (modification is discouraged since the directories are derived from the Gradle source set) |
| skel | uses external skeleton <file> in UTF-8 encoding. This is mainly for JFlex maintenance and special low level customisations. Use only when you know what you are doing! JFlex comes with a skeleton file in the src directory that reflects exactly the internal, pre-compiled skeleton and can be used with the skel option. |
| verbose | display generation progress messages (disabled by default) |
| jlex | tries even harder to comply with the JLex interpretation of specs |
| no_minimize | skip the DFA minimisation step during scanner generation |
| no_backup | don't write backup files if this is true |
| unused_warning | warn about unused macros (by default false) |
| progress | progress dots will be printed (by default false) |
| dot | If true, jflex will write graphviz .dot files for generated automata (by default, false) |
| time | If true, jflex will print time statistics about the generation process (by default false) |
| dump | If true, you will be flooded with information, e.g. dfa tables (by default, false) |
| legacy_dot | dot (.) meta character matches [^\n] instead of [^\n\r\u000B\u000C\u0085\u2028\u2029] |
| statistics | print output statistics (by default, false) |
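
A sketch of how these parameters could be set, using the field names from `JFlexExtension` as added in this commit (the values are purely illustrative):

```groovy
// build.gradle; illustrative configuration of the global jflex extension
jflex {
    encoding = 'UTF-8'
    verbose = true
    no_backup = true     // don't write backup files
    statistics = true    // print output statistics
}
```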
## Credits
gradle-plugin-jflex is a plugin based on
[gradle-jflex-plugin](https://github.com/thomaslee/gradle-jflex-plugin)
which was written by [Tom Lee](http://tomlee.co).
# License
Copyright (C) 2015-2020 Jörg Prante
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View file

@ -0,0 +1,41 @@
plugins {
id 'java-gradle-plugin'
alias(libs.plugins.publish)
}
apply plugin: 'java-gradle-plugin'
apply plugin: 'com.gradle.plugin-publish'
apply from: rootProject.file('gradle/compile/groovy.gradle')
apply from: rootProject.file('gradle/test/junit5.gradle')
dependencies {
api gradleApi()
implementation libs.jflex
testImplementation gradleTestKit()
}
gradlePlugin {
plugins {
jflexPlugin {
id = 'org.xbib.gradle.plugin.jflex'
implementationClass = 'org.xbib.gradle.plugin.jflex.JFlexPlugin'
}
}
}
if (project.hasProperty('gradle.publish.key')) {
pluginBundle {
website = scmUrl
vcsUrl = scmUrl
plugins {
jflexPlugin {
id = 'org.xbib.gradle.plugin.jflex'
version = project.version
description = 'Gradle JFlex plugin'
displayName = 'Gradle JFlex plugin'
tags = ['jflex']
}
}
}
}

View file

@ -0,0 +1 @@
version = 1.6.0

View file

@ -0,0 +1,67 @@
package org.xbib.gradle.plugin.jflex
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.Optional
class JFlexExtension {
@Input
@Optional
String encoding
@Input
@Optional
File rootDirectory
@Input
@Optional
File skel
@Input
@Optional
Boolean verbose = false
@Input
@Optional
Boolean jlex = false
@Input
@Optional
Boolean no_minimize = false
@Input
@Optional
Boolean no_backup = false
@Input
@Optional
Boolean unused_warning = false
@Input
@Optional
Boolean progress = false
@Input
@Optional
Boolean time = false
@Input
@Optional
Boolean dot = false
@Input
@Optional
Boolean dump = false
@Input
@Optional
Boolean legacy_dot = false
@Input
@Optional
Boolean statistics = false
@Input
@Optional
Boolean writeIntoJavaSrc = false
}

View file

@ -0,0 +1,90 @@
package org.xbib.gradle.plugin.jflex
import org.gradle.api.file.SourceDirectorySet
import org.gradle.api.logging.Logger
import org.gradle.api.logging.Logging
import org.gradle.api.tasks.SourceSet
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.TaskProvider
class JFlexPlugin implements Plugin<Project> {
private static final Logger logger = Logging.getLogger(JFlexPlugin)
@Override
void apply(Project project) {
logger.info "JFlex plugin says hello"
project.with {
apply plugin: 'java-library'
createJflexExtension(project)
addSourceSetExtensions(project)
}
project.afterEvaluate {
addJFlexTasks(project)
}
}
private static void addSourceSetExtensions(Project project) {
project.sourceSets.all { SourceSet sourceSet ->
createSourceSetExtension(project, sourceSet)
createConfiguration(project, sourceSet)
}
}
private static void createSourceSetExtension(Project project, SourceSet sourceSet) {
String name = sourceSet.name
SourceDirectorySet sourceDirectorySet = project.objects.sourceDirectorySet(name, "${name} JFlex source")
sourceSet.extensions.add('jflex', sourceDirectorySet)
sourceDirectorySet.srcDir("src/${name}/jflex")
sourceDirectorySet.include("**/*.jflex")
}
private static void createConfiguration(Project project, SourceSet sourceSet) {
String configName = sourceSet.name + capitalize('jflex' as CharSequence)
if (project.configurations.findByName(configName) == null) {
logger.info "create configuration ${configName}"
project.configurations.create(configName) {
visible = false
transitive = true
extendsFrom = []
}
}
}
private static void createJflexExtension(Project project) {
project.extensions.create ('jflex', JFlexExtension)
}
private static void addJFlexTasks(Project project) {
project.sourceSets.all { SourceSet sourceSet ->
addJFlexTaskForSourceSet(project, sourceSet)
}
}
private static void addJFlexTaskForSourceSet(Project project, SourceSet sourceSet) {
String taskName = sourceSet.getTaskName('generate', 'jflex')
SourceDirectorySet sourceDirectorySet = sourceSet.extensions.getByName('jflex') as SourceDirectorySet
File targetFile = project.file("${project.buildDir}/generated/sources/${sourceSet.name}")
if (sourceDirectorySet.asList()) {
TaskProvider<JFlexTask> taskProvider = project.tasks.register(taskName, JFlexTask) {
group = 'jflex'
description = 'Generates code from JFlex files in ' + sourceSet.name
source = sourceDirectorySet.asList()
target = targetFile
theSourceSet = sourceSet
}
logger.info "created ${taskName} for sources ${sourceDirectorySet.asList()} and target ${targetFile}"
project.tasks.named(sourceSet.compileJavaTaskName).configure({
dependsOn taskProvider
})
if (sourceSet.java && sourceSet.java.srcDirs) {
sourceSet.java.srcDirs += targetFile
}
}
}
private static String capitalize(CharSequence charSequence) {
return charSequence.length() == 0 ? "" : "" + Character.toUpperCase(charSequence.charAt(0)) + charSequence.subSequence(1, charSequence.length())
}
}

View file

@ -0,0 +1,91 @@
package org.xbib.gradle.plugin.jflex
import jflex.exceptions.GeneratorException
import jflex.generator.LexGenerator
import jflex.logging.Out
import jflex.option.Options
import jflex.skeleton.Skeleton
import org.gradle.api.DefaultTask
import org.gradle.api.logging.Logger
import org.gradle.api.logging.Logging
import org.gradle.api.tasks.CacheableTask
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.Internal
import org.gradle.api.tasks.OutputDirectory
import org.gradle.api.tasks.PathSensitive
import org.gradle.api.tasks.PathSensitivity
import org.gradle.api.tasks.SourceSet
import org.gradle.api.tasks.StopActionException
import org.gradle.api.tasks.TaskAction
import java.nio.charset.Charset
@CacheableTask
class JFlexTask extends DefaultTask {
private final Logger logger = Logging.getLogger(JFlexTask)
@InputFiles
@PathSensitive(PathSensitivity.RELATIVE)
Iterable<File> source
@OutputDirectory
File target
@Internal
SourceSet theSourceSet
@TaskAction
void generateAndTransformJflex() throws Exception {
JFlexExtension ext = project.extensions.findByType(JFlexExtension)
Options.setRootDirectory(ext.rootDirectory ? ext.rootDirectory : new File(""))
Skeleton.readDefault()
if (ext.skel) {
Skeleton.readSkelFile(ext.skel)
}
Options.encoding = ext.encoding ? Charset.forName(ext.encoding) : Charset.defaultCharset()
Options.verbose = ext.verbose
Options.progress = ext.progress
Options.unused_warning = ext.unused_warning
Options.jlex = ext.jlex
Options.no_minimize = ext.no_minimize
Options.no_backup = ext.no_backup
Options.time = ext.time
Options.dot = ext.dot
Options.dump = ext.dump
Options.legacy_dot = ext.legacy_dot
// hack for writing directly into java source. Not recommended.
if (ext.writeIntoJavaSrc) {
if (theSourceSet.java && theSourceSet.java.srcDirs) {
logger.info "java sources: ${theSourceSet.java.srcDirs}"
target = theSourceSet.java.srcDirs.first()
logger.info "switching to first java source directory ${target}"
} else {
logger.warn "writing into java source not possible, is empty"
}
}
source.each { file ->
String pkg = getPackageName(file)
File fullTarget = new File(target, pkg.replace('.','/'))
project.mkdir(fullTarget)
Options.directory = fullTarget
logger.info "jflex task: source=${file} pkg=${pkg} dir=${target}"
try {
new LexGenerator(file).generate()
} catch (GeneratorException e) {
Logging.getLogger(JFlexTask).error("JFlex error: ${e.message}", e)
throw new StopActionException('an error occurred during JFlex code generation')
}
}
if (ext.statistics) {
Out.statistics()
}
}
static String getPackageName(File file) {
String string = file.readLines().find { line ->
line.startsWith('package')
}
return string == null ? '' : string.substring(8, string.length() - 1)
}
}

View file

@ -0,0 +1,194 @@
package org.xbib.gradle.plugin.test
import org.gradle.testkit.runner.BuildResult
import org.gradle.testkit.runner.GradleRunner
import org.gradle.testkit.runner.TaskOutcome
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.io.TempDir
import java.nio.file.Files
import static org.junit.jupiter.api.Assertions.*
class JFlexPluginTest {
private File projectDir
private File settingsFile
private File buildFile
@BeforeEach
void setup(@TempDir File testProjectDir) throws IOException {
this.projectDir = testProjectDir
this.settingsFile = new File(testProjectDir, "settings.gradle")
this.buildFile = new File(testProjectDir, "build.gradle")
}
@Test
void testJFlex() {
String settingsFileContent = '''
rootProject.name = 'jflex-test'
'''
settingsFile.write(settingsFileContent)
String buildFileContent = '''
plugins {
id 'org.xbib.gradle.plugin.jflex'
}
sourceSets {
test {
jflex {
// point to our test directory where the jflex file lives
srcDir "${System.getProperty('user.dir')}/src/test/jflex"
}
}
}
jflex {
verbose = true
dump = false
progress = false
}
'''
buildFile.write(buildFileContent)
BuildResult result = GradleRunner.create()
.withProjectDir(projectDir)
.withArguments(":build", "--info")
.withPluginClasspath()
.forwardOutput()
.build()
assertEquals(TaskOutcome.SUCCESS, result.task(":build").getOutcome())
// default output dir
File target = new File(projectDir, "build/generated/sources/test")
boolean found = false
if (target.exists()) {
// check for generated output
assertEquals(1, target.listFiles().length)
target.eachFileRecurse {
if (it.isFile()) {
println "found: ${it}"
found = true
}
}
} else {
fail("directory not found: ${target}")
}
if (!found) {
fail("jflex output not found")
}
}
@Test
void testJFlexWriteIntoJavaSrc() {
Files.createDirectories(projectDir.toPath().resolve('src/main/java'))
Files.createDirectories(projectDir.toPath().resolve('src/test/java'))
String settingsFileContent = '''
rootProject.name = 'jflex-test'
'''
settingsFile.write(settingsFileContent)
String buildFileContent = '''
plugins {
id 'org.xbib.gradle.plugin.jflex'
}
sourceSets {
main {
java {
srcDir "src/main/java"
}
jflex {
// point to our test directory where the jflex file lives
srcDir "${System.getProperty('user.dir')}/src/test/jflex"
}
}
test {
java {
srcDir "src/test/java"
}
}
}
jflex {
verbose = true
dump = false
progress = false
// enable legacy behavior of writing directly into Java source directory. Not recommended.
writeIntoJavaSrc = true
}
'''
buildFile.write(buildFileContent)
BuildResult result = GradleRunner.create()
.withProjectDir(projectDir)
.withArguments(":build", "--info")
.withPluginClasspath()
.forwardOutput()
.build()
assertEquals(TaskOutcome.SUCCESS, result.task(":build").getOutcome())
// search the Java source directory
File target = new File(projectDir, "src/main/java")
boolean found = false
if (target.exists()) {
// check for generated file
assertEquals(1, target.listFiles().length)
target.eachFileRecurse {
if (it.isFile()) {
println "found: ${it}"
found = true
}
}
} else {
fail("directory not found: ${target}")
}
if (!found) {
fail("jflex output not found")
}
}
@Test
void testTaskIsNotStarted() {
String buildFileContent = '''
plugins {
id 'org.xbib.gradle.plugin.jflex'
}
sourceSets {
test {
jflex {
srcDir "${System.getProperty('user.dir')}/src/test/jflex"
}
java {
srcDir "${System.getProperty('user.dir')}/build/my-generated-sources/jflex"
}
}
}
jflex {
verbose = false
dump = false
progress = false
}
def configuredTasks = []
tasks.configureEach {
configuredTasks << it
}
gradle.buildFinished {
def configuredTaskPaths = configuredTasks*.path
assert configuredTaskPaths == [':help',':clean']
}
'''
buildFile.write(buildFileContent)
GradleRunner.create()
.withProjectDir(projectDir)
.withArguments(":help")
.withPluginClasspath()
.forwardOutput()
.build()
}
}

View file

@ -0,0 +1,43 @@
package org.xbib.gradle.plugin.test;
import java.io.IOException;
%%
%class Test
%int
%unicode
%line
%column
%{
int token;
double yylval;
int nextToken() {
try {
return token = yylex();
} catch (IOException e) {
return token = -1;
}
}
int getToken() {
return token;
}
double getSemantic() {
return yylval;
}
%}
ws = [ \t\f]
digit = [0-9]
number = {digit}+(\.{digit}+)?(E[+\-]?{digit}+)?
%%
\r|\n|\r\n { return 0; }
{ws}+ { }
{number} { yylval = Double.parseDouble(yytext()); return 1; }
[+\-*/()=] { return (int)(yytext().charAt(0)); }
"*+" { return 2; }
. { throw new Error(yytext()); }

View file

@ -0,0 +1,8 @@
Very loosely based on a 2018 version of nebula.rpm
https://github.com/nebula-plugins/gradle-ospackage-plugin
but completely revised, using the xbib rpm library, not redline rpm.
License: Apache 2.0

View file

@ -1,6 +1,6 @@
 plugins {
     id 'java-gradle-plugin'
-    id 'com.gradle.plugin-publish' version '0.18.0'
+    alias(libs.plugins.publish)
 }
 apply plugin: 'java-gradle-plugin'
@ -10,20 +10,10 @@ apply from: rootProject.file('gradle/compile/groovy.gradle')
 dependencies {
     api gradleApi()
-    api "org.xbib:rpm-core:${project.property('rpm.version')}"
+    api libs.rpm
     testImplementation gradleTestKit()
 }
-compileGroovy {
-    sourceCompatibility = JavaVersion.VERSION_11
-    targetCompatibility = JavaVersion.VERSION_11
-}
-compileTestGroovy {
-    sourceCompatibility = JavaVersion.VERSION_11
-    targetCompatibility = JavaVersion.VERSION_11
-}
 gradlePlugin {
     plugins {
         rpmPlugin {
View file

@ -1,19 +1,13 @@
 package org.xbib.gradle.plugin
-import org.gradle.api.GradleException
 import org.gradle.api.Plugin
 import org.gradle.api.Project
 import org.gradle.api.plugins.BasePlugin
-import org.gradle.util.GradleVersion
 class RpmPlugin implements Plugin<Project> {
     @Override
     void apply(Project project) {
-        String version = '6.4'
-        if (GradleVersion.current() < GradleVersion.version(version)) {
-            throw new GradleException("need Gradle ${version} or higher")
-        }
         project.plugins.apply(BasePlugin)
         project.ext.Rpm = Rpm.class
     }

View file

@ -0,0 +1,45 @@
plugins {
id 'java-gradle-plugin'
alias(libs.plugins.publish)
}
apply plugin: 'java-gradle-plugin'
apply plugin: 'com.gradle.plugin-publish'
apply from: rootProject.file('gradle/compile/groovy.gradle')
apply from: rootProject.file('gradle/test/junit5.gradle')
dependencies {
api gradleApi()
implementation libs.asm
implementation libs.asm.commons
implementation libs.asm.util
testImplementation gradleTestKit()
testImplementation libs.spock.core
testImplementation libs.spock.junit4
}
gradlePlugin {
plugins {
shadowPlugin {
id = 'org.xbib.gradle.plugin.shadow'
implementationClass = 'org.xbib.gradle.plugin.shadow.ShadowPlugin'
}
}
}
if (project.hasProperty('gradle.publish.key')) {
pluginBundle {
website = scmUrl
vcsUrl = scmUrl
plugins {
shadowPlugin {
id = 'org.xbib.gradle.plugin.shadow'
version = project.version
description = 'Shadow plugin for Gradle'
displayName = 'Shadow plugin for Gradle'
tags = [ 'shadow' ]
}
}
}
}

View file

@ -0,0 +1 @@
version = 2.0.0

View file

@ -0,0 +1,28 @@
:tests: ../../test/groovy/org/xbib/gradle/plugin/shadow
== Introduction
Shadow is a Gradle plugin for combining a project's dependency classes and resources with its own into a single
output Jar.
The combined Jar is often referred to as a __fat-jar__ or __uber-jar__.
Shadow utilizes `JarInputStream` and `JarOutputStream` to efficiently process dependent libraries
into the output jar without incurring the I/O overhead of expanding the jars to disk.
=== Benefits of Shadow
Shadowing a project output has one major use case:
. Bundling and relocating common dependencies in libraries to avoid classpath conflicts
==== Bundling
Dependency bundling and relocation is the main use case for __library__ authors.
The goal of a bundled library is to create a pre-packaged dependency for other libraries or applications to utilize.
Often in these scenarios, a library may contain a dependency that a downstream library or application also uses.
In __some__ cases, different versions of this common dependency can cause an issue in either the upstream library or
the downstream application.
These issues often manifest themselves as binary incompatibilities in either the library or application code.
By utilizing Shadow's ability to __relocate__ the package names for dependencies, a library author can ensure that the
library's dependencies will not conflict with the same dependency being declared by the downstream application.
include::01-getting-started.adoc[]

View file

@ -0,0 +1,96 @@
=== Getting Started
[source,groovy,subs="+attributes"]
----
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'org.xbib.gradle.plugin:gradle-plugin-shadow:{project-version}'
}
}
apply plugin: 'java'
apply plugin: 'org.xbib.gradle.plugin.shadow'
----
Alternatively, the Gradle Plugin syntax can be used:
[source,groovy,subs="+attributes"]
----
plugins {
id 'java'
id 'org.xbib.gradle.plugin.shadow' version '{project-version}'
}
----
Shadow is a reactive plugin.
This means that applying Shadow by itself will perform no configuration on your project.
Instead, Shadow __reacts__ to the application of other plugins to decorate the project.
This means that, for most users, the `java` or `groovy` plugins must be __explicitly__ applied
to have the desired effect.
=== Default Java/Groovy Tasks
In the presence of the `java` or `groovy` plugins, Shadow will automatically configure the
following behavior:
* Adds a `shadowJar` task to the project.
* Adds a `shadow` configuration to the project.
* Configures the `shadowJar` task to include all sources from the project's `main` sourceSet.
* Configures the `shadowJar` task to bundle all dependencies from the `runtime` configuration.
* Configures the __classifier__ attribute of the `shadowJar` task to be `'all'` .
* Configures the `shadowJar` task to generate a `Manifest` with:
** Inheriting all configuration from the standard `jar` task.
** Adds a `Class-Path` attribute to the `Manifest` that appends all dependencies from the `shadow` configuration
* Configures the `shadowJar` task to __exclude__ any JAR index or cryptographic signature files matching the following patterns:
** `META-INF/INDEX.LIST`
** `META-INF/*.SF`
** `META-INF/*.DSA`
** `META-INF/*.RSA`
* Creates and registers the `shadow` component in the project (used for integrating with `maven-publish`).
* Configures the `uploadShadow` task (as part of the `maven` plugin) with the following behavior:
** Removes the `compile` and `runtime` configurations from the `pom.xml` file mapping.
** Adds the `shadow` configuration to the `pom.xml` file as `RUNTIME` scope.
=== Shadowing Gradle Plugins
Shadow is capable of automatically configuring package relocation for your dependencies.
This is useful especially when building Gradle plugins where you want your dependencies to not conflict with versions
provided by the Gradle runtime.
[source,groovy,subs="+attributes"]
----
buildscript {
repositories {
jcenter()
}
dependencies {
classpath 'org.xbib.gradle.plugin:shadow:{project-version}'
}
}
apply plugin: 'org.xbib.gradle.plugin.shadow'
apply plugin: 'java'
----
Alternatively, the Gradle Plugin syntax can be used:
[source,groovy,subs="+attributes"]
----
plugins {
id 'java'
id 'org.xbib.gradle.plugin.plugin-shadow' version '{project-version}'
}
----
Applying the `plugin-shadow` plugin is the same as applying the standard `shadow` plugin with the additional creation
of the `configureRelocationShadowJar` task.
This task runs before the `shadowJar` task and scans the packages present in the dependencies that will be merged into
the final jar and automatically configures relocation for them.
By default, the task relocates all packages to the `shadow.` prefix.
For example, `org.jdom2.JDOMException` becomes `shadow.org.jdom2.JDOMException`.
For more details see the section <<Using Shadow to Package Gradle Plugins>>.

View file

@ -0,0 +1,106 @@
== Configuring Shadow
The link:{api}/tasks/ShadowJar.html[`ShadowJar`] task type extends from Gradle's
https://docs.gradle.org/current/dsl/org.gradle.api.tasks.bundling.Jar.html[`Jar`] type.
This means that all attributes and methods available on `Jar` are also available on
link:{api}/tasks/ShadowJar.html[`ShadowJar`].
Refer to the __Gradle User Guide__ entry for https://docs.gradle.org/current/dsl/org.gradle.api.tasks.bundling.Jar.html[Jar]
for details.
=== Configuring Output Name
Shadow configures the default `shadowJar` task to set the output JAR's `destinationDir`, `baseName`, `appendix`,
`version`, and `extension` to the same default values as Gradle does for all `Jar` tasks.
Additionally, it configures the `classifier` to be `all`.
If working with a Gradle project with the name `myApp` and version `1.0`, the default `shadowJar` task will output a
file at: `build/libs/myApp-1.0-all.jar`
As with all `Jar` tasks in Gradle, these values can be overridden:
.Output to `build/libs/shadow.jar`
[source,groovy,indent=0]
----
include::{tests}/ShadowPluginSpec.groovy[tags=rename]
----
=== Configuring the Runtime Classpath
Each Java JAR file contains a manifest file that provides metadata about the contents of the JAR file itself.
When using a shadowed JAR file as an executable JAR, it is assumed that all necessary runtime classes are contained
within the JAR itself.
There may be situations where the desire is to **not** bundle select dependencies into the shadowed JAR file but
they are still required for runtime execution.
In these scenarios, Shadow creates a `shadow` configuration to declare these dependencies.
Dependencies added to the `shadow` configuration are *not* bundled into the output JAR.
Think of `configurations.shadow` as unmerged, runtime dependencies.
The integration with the `maven` and `maven-publish` plugins will automatically configure dependencies added
to `configurations.shadow` as `RUNTIME` scope dependencies in the resulting POM file.
Additionally, Shadow automatically configures the manifest of the `shadowJar` task to contain a `Class-Path` entry
in the JAR manifest.
The value of the `Class-Path` entry is the name of all dependencies resolved in the `shadow` configuration
for the project.
[source,groovy,indent=0]
----
include::{tests}/ShadowPluginSpec.groovy[tags=shadowConfig]
----
Inspecting the `META-INF/MANIFEST.MF` entry in the JAR file will reveal the following attribute:
[source,property,indent=0]
----
Class-Path: junit-3.8.2.jar
----
When deploying a shadowed JAR as an executable JAR, it is important to note that any non-bundled runtime dependencies
**must** be deployed in the location specified in the `Class-Path` entry in the manifest.
=== Configuring the JAR Manifest
Beyond the automatic configuration of the `Class-Path` entry, the `shadowJar` manifest is configured in a number of ways.
First, the manifest for the `shadowJar` task is configured to __inherit__ from the manifest of the standard `jar` task.
This means that any configuration performed on the `jar` task will propagate to the `shadowJar` task.
[source,groovy,indent=0]
----
include::{tests}/ShadowPluginSpec.groovy[tags=jarManifest]
----
Inspecting the `META-INF/MANIFEST.MF` entry in the JAR file will reveal the following attribute:
[source,property,indent=0]
----
Class-Path: /libs/a.jar
----
If it is desired to inherit a manifest from a JAR task other than the standard `jar` task, the `inheritFrom` methods
on the `shadowJar.manifest` object can be used to configure the upstream.
[source,groovy,indent=0]
----
task testJar(type: Jar) {
manifest {
attributes 'Description': 'This is an application JAR'
}
}
shadowJar {
manifest {
inheritFrom project.tasks.testJar.manifest
}
}
----
include::11-filtering-contents.adoc[]
include::12-controlling-dependencies.adoc[]
include::13-controlling-merging.adoc[]
include::14-package-relocation.adoc[]
include::15-minimizing.adoc[]
include::16-reproducible-builds.adoc[]

View file

@ -0,0 +1,27 @@
=== Filtering Shadow Jar Contents
The final contents of a shadow JAR can be filtered using the `exclude` and `include` methods inherited from Gradle's
`Jar` task type.
Refer to the https://docs.gradle.org/current/dsl/org.gradle.api.tasks.bundling.Jar.html[Jar] documentation for details
on the various versions of the methods and their behavior.
When using `exclude`/`include` with a `ShadowJar` task, the resulting copy specs are applied to the __final__ JAR
contents.
This means that the configuration is applied to the individual files from both the project source set and __any__
of the dependencies to be merged.
.Exclude a file from Shadow Jar
[source,groovy,indent=0]
----
include::{tests}/FilteringSpec.groovy[tags=excludeFile]
----
Excludes and includes can be combined just like a normal `Jar` task, with `excludes` taking precedence over `includes`.
Additionally, ANT style patterns can be used to match multiple files.
.Configuring output using ANT patterns
[source,groovy,indent=0]
----
include::{tests}/FilteringSpec.groovy[tags=excludeOverInclude]
----

View file

@ -0,0 +1,118 @@
=== Configuring Shadowed Dependencies
Shadow configures the default `shadowJar` task to merge all dependencies from the project's `runtime` configuration
into the final JAR.
The configurations from which to source dependencies for the merging can be configured using the `configurations` property
of the link:{api}/tasks/ShadowJar.html[`ShadowJar`] task type.
[source,groovy,indent=0]
----
shadowJar {
configurations = [project.configurations.compile]
}
----
The above code sample would configure the `shadowJar` task to merge dependencies from only the `compile` configuration.
This means any dependency declared in the `runtime` configuration would **not** be included in the final JAR.
[NOTE]
====
Note the literal use of `project.configurations` when setting the `configurations` attribute of a
link:{api}/tasks/ShadowJar.html[`ShadowJar`] task.
This is **required**. It may be tempting to specify `configurations = [configurations.compile]`, but this will not
have the intended effect, as `configurations.compile` will try to delegate to the `configurations` property of
the link:{api}/tasks/ShadowJar.html[`ShadowJar`] task instead of the `project`.
====
=== Embedding Jar Files Inside Your Shadow Jar
Because of the way that Gradle handles dependency configuration, from a plugin perspective, Shadow is unable to
distinguish between a jar file configured as a dependency and a jar file included in the resource folder.
This means that any jar found in a resource directory will be merged into the shadow jar the same as any other dependency.
If your intention is to embed the jar as-is, you must rename it so that its name does not end with `.jar` before the
shadow task begins.
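One possible approach, sketched below under the assumption of a hypothetical `libs/embedded.jar` file, is to copy the
jar under a name with a different extension and add that copy to the shadow JAR as an ordinary resource:

.Embedding a jar file as a plain resource (sketch, hypothetical paths)
[source,groovy,indent=0]
----
// Copy the jar under a name that does not end with .jar so Shadow
// treats it as an ordinary resource instead of merging its contents.
task prepareEmbeddedJar(type: Copy) {
    from 'libs/embedded.jar'
    into "$buildDir/embedded"
    rename { 'embedded.jar_' }
}

shadowJar {
    dependsOn prepareEmbeddedJar
    from "$buildDir/embedded"
}
----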
=== Filtering Dependencies
Individual dependencies can be filtered from the final JAR by using the `dependencies` block of a
link:{api}/tasks/ShadowJar.html[`ShadowJar`] task.
Dependency filtering does **not** apply to transitive dependencies.
That is, excluding a dependency does not exclude any of its dependencies from the final JAR.
The `dependencies` block provides a number of methods for resolving dependencies using the notations familiar from
Gradle's `configurations` block.
.Exclude a Module Dependency
[source,groovy,indent=0]
----
include::{tests}/FilteringSpec.groovy[tags=excludeDep]
----
.Exclude a Project Dependency
[source,groovy,indent=0]
----
include::{tests}/FilteringSpec.groovy[tags=excludeProject]
----
[NOTE]
====
While not being able to filter entire transitive dependency graphs might seem like an oversight, it is necessary
because it would not be possible to intelligently determine the build author's intended results when there is a
common dependency between two 1st level dependencies when one is excluded and the other is not.
====
==== Using Regex Patterns to Filter Dependencies
Dependencies can be filtered using regex patterns.
Coupled with the `<group>:<artifact>:<version>` notation for dependencies, this allows for excluding/including
using any of these individual fields.
.Exclude Any Version of a Dependency
[source,groovy,indent=0]
----
include::{tests}/FilteringSpec.groovy[tags=excludeDepWildcard]
----
Any of the individual fields can be safely absent and will function as though a wildcard was specified.
.Ignore Dependency Version
[source,groovy,indent=0]
----
shadowJar {
dependencies {
exclude(dependency('shadow:d'))
}
}
----
The above code snippet is functionally equivalent to the previous example.
This same pattern can be used for any of the dependency notation fields.
.Ignoring An Artifact Regardless of Group
[source,groovy,indent=0]
----
shadowJar {
dependencies {
exclude(dependency(':d:1.0'))
}
}
----
.Excluding All Artifacts From Group
[source,groovy,indent=0]
----
shadowJar {
dependencies {
exclude(dependency('shadow::1.0'))
}
}
----
==== Programmatically Selecting Dependencies to Filter
If more complex decisions are needed to select the dependencies to be included, the
link:{api}/tasks/ShadowJar.html#dependencies(Action<DependencyFilter>)[`dependencies`] block provides a
method that accepts a `Closure` for selecting dependencies.
.Selecting Dependencies to Filter With a Spec
[source,groovy,indent=0]
----
include::{tests}/FilteringSpec.groovy[tags=excludeSpec]
----

View file

@ -0,0 +1,151 @@
=== Controlling JAR Content Merging
Shadow allows for customizing the process by which the output JAR is generated through the
link:{api}/transformers/Transformer.html[`Transformer`] interface.
This is a concept that has been carried over from the original Maven Shade implementation.
A link:{api}/transformers/Transformer.html[`Transformer`] is invoked for each entry in the JAR before being written to
the final output JAR.
This allows a link:{api}/transformers/Transformer.html[`Transformer`] to determine if it should process a particular
entry and apply any modifications before writing the stream to the output.
.Adding a Transformer
[source,groovy,indent=0]
----
shadowJar {
transform(MyTransformer.class)
}
----
Additionally, a `Transformer` can accept a `Closure` to configure the provided `Transformer`.
.Configuring a Transformer
[source,groovy,indent=0]
----
shadowJar {
transform(MyTransformer.class) {
enable = true
}
}
----
An instantiated instance of a `Transformer` can also be provided.
.Adding a Transformer Instance
[source,groovy,indent=0]
----
shadowJar {
transform(new MyTransformer(enabled: true))
}
----
==== Merging Service Descriptor Files
Java libraries often contain service descriptor files in the `META-INF/services` directory of the JAR.
A service descriptor typically contains a line delimited list of classes that are supported for a particular __service__.
At runtime, this file is read and used to configure library or application behavior.
Multiple dependencies may use the same service descriptor file name.
In this case, it is generally desired to merge the content of each instance of the file into a single output file.
The link:{api}/transformers/ServiceFileTransformer.html[`ServiceFileTransformer`] class is used to perform this merging.
By default, it will merge each copy of a file under `META-INF/services` into a single file in the output JAR.
.Merging Service Files
[source,groovy,indent=0]
----
shadowJar {
mergeServiceFiles()
}
----
The above code snippet is a convenience syntax for calling
link:{api}/tasks/ShadowJar.html#transform(Class++<? extends Transformer>++)[`transform(ServiceFileTransformer.class)`].
[NOTE]
====
Groovy Extension Module descriptor files (located at `META-INF/services/org.codehaus.groovy.runtime.ExtensionModule`)
are ignored by the link:{api}/transformers/ServiceFileTransformer.html[`ServiceFileTransformer`].
This is due to these files having a different syntax than standard service descriptor files.
Use the link:{api}/tasks/ShadowJar.html#mergeGroovyExtensionModules()[`mergeGroovyExtensionModules()`] method to merge
these files if your dependencies contain them.
====
===== Configuring the Location of Service Descriptor Files
By default the link:{api}/transformers/ServiceFileTransformer.html[`ServiceFileTransformer`] is configured to merge
files in `META-INF/services`.
This directory can be overridden to merge descriptor files in a different location.
.Merging Service Files in a Specific Directory
[source,groovy,indent=0]
----
shadowJar {
mergeServiceFiles {
path = 'META-INF/custom'
}
}
----
===== Excluding/Including Specific Service Descriptor Files From Merging
The link:{api}/transformers/ServiceFileTransformer.html[`ServiceFileTransformer`] class supports specifying specific
files to include or exclude from merging.
.Excluding a Service Descriptor From Merging
[source,groovy,indent=0]
----
shadowJar {
mergeServiceFiles {
exclude 'META-INF/services/com.acme.*'
}
}
----
==== Merging Groovy Extension Modules
Shadow provides a specific transformer for dealing with Groovy extension module files.
This is due to their special syntax and how they need to be merged together.
The link:{api}/transformers/GroovyExtensionModuleTransformer.html[`GroovyExtensionModuleTransformer`] will handle these
files.
The link:{api}/tasks/ShadowJar.html[`ShadowJar`] task also provides a short syntax method to add this transformer.
.Merging Groovy Extension Modules
[source,groovy,indent=0]
----
shadowJar {
mergeGroovyExtensionModules()
}
----
==== Appending Text Files
Generic text files can be appended together using the
link:{api}/transformers/AppendingTransformer.html[`AppendingTransformer`].
Each file is appended using new lines to separate content.
The link:{api}/tasks/ShadowJar.html[`ShadowJar`] task provides a short syntax method of
link:{api}/tasks/ShadowJar.html#append(java.lang.String)[`append(String)`] to configure this transformer.
.Appending a Property File
[source,groovy,indent=0]
----
shadowJar {
append 'test.properties'
}
----
==== Appending XML Files
XML files require a special transformer for merging.
The link:{api}/transformers/XmlAppendingTransformer.html[`XmlAppendingTransformer`] reads each XML document and merges
each root element into a single document.
There is no short syntax method for the link:{api}/transformers/XmlAppendingTransformer.html[`XmlAppendingTransformer`].
It must be added using the link:{api}/tasks/ShadowJar.html#transform(++Class<? extends Transformer>++)[`transform`] methods.
.Appending an XML File
[source,groovy,indent=0]
----
shadowJar {
transform(XmlAppendingTransformer.class) {
resource = 'properties.xml'
}
}
----

View file

@ -0,0 +1,72 @@
=== Relocating Packages
Shadow is capable of scanning a project's classes and relocating specific dependencies to a new location.
This is often required when one of the dependencies is susceptible to breaking changes in versions or
to classpath pollution in a downstream project.
[NOTE]
====
Google's Guava and the ASM library are typical cases where package relocation can come in handy.
====
Shadow uses the ASM library to modify class byte code to replace the package name and any import
statements for a class.
Any non-class files that are stored within a package structure are also relocated to the new location.
.Relocating a Package
[source,groovy,indent=0]
----
include::{tests}/RelocationSpec.groovy[tags=relocate]
----
The code snippet will rewrite the location of any class in the `junit.framework` package to `shadow.junit`.
For example, the class `junit.textui.TestRunner` becomes `shadow.junit.TestRunner`.
In the resulting JAR, the class file is relocated from `junit/textui/TestRunner.class` to
`shadow/junit/TestRunner.class`.
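In build-script form, the relocation described here amounts to a single `relocate` call on the `ShadowJar` task; the
sketch below simply mirrors the package names from the example above:

[source,groovy,indent=0]
----
shadowJar {
    // first argument: package prefix to match; second argument: replacement prefix
    relocate 'junit.framework', 'shadow.junit'
}
----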
[NOTE]
====
Relocation operates at a package level.
It is not necessary to specify any patterns for matching, it will operate simply on the prefix
provided.
====
[NOTE]
====
Relocation will be applied globally to all instances of the matched prefix.
That is, it is **not** scoped to __only__ the dependencies being shadowed.
Be as specific as possible when configuring relocation so as to avoid unintended relocations.
====
==== Filtering Relocation
Specific classes or files can be `included`/`excluded` from the relocation operation if necessary.
.Configuring Filtering for Relocation
[source,groovy,indent=0]
----
include::{tests}/RelocationSpec.groovy[tags=relocateFilter]
----
==== Automatically Relocating Dependencies
Shadow ships with a task that can be used to automatically configure all packages from all dependencies to be relocated.
To configure automatic dependency relocation, declare a task of type `ConfigureShadowRelocation` and configure the
`target` parameter to be the `ShadowJar` task you wish to auto-configure. You will also need to declare a task
dependency so the tasks execute in the correct order.
.Configure Auto Relocation
[source,groovy]
----
task relocateShadowJar(type: ConfigureShadowRelocation) {
target = tasks.shadowJar
prefix = "myapp" // Default value is "shadow"
}
tasks.shadowJar.dependsOn tasks.relocateShadowJar
----
[NOTE]
====
Configuring package auto relocation can add significant time to the shadow process as it will process all dependencies
in the configurations declared to be shadowed. By default, this is the `runtime` or `runtimeClasspath` configurations.
Be mindful that some Gradle plugins (such as `java-gradle-plugin`) will automatically add dependencies to your classpath
(e.g. `java-gradle-plugin` automatically adds the full Gradle API to your `compile` configuration). You may need to
remove these dependencies if you do not intend to shadow them into your library.
====

View file

@ -0,0 +1,25 @@
=== Minimizing
Shadow can automatically remove all classes of dependencies that are not used by the project, thereby minimizing the resulting shadowed JAR.
.Minimizing a shadow JAR
[source,groovy,indent=0]
----
shadowJar {
minimize()
}
----
A dependency can be excluded from the minimization process, thereby forcing its inclusion in the shadow JAR.
This is useful when the dependency analyzer cannot find the usage of a class programmatically, for example if the class
is loaded dynamically via `Class.forName(String)`.
.Force a class to be retained during minimization
[source,groovy,indent=0]
----
shadowJar {
minimize {
exclude(dependency('org.scala-lang:.*:.*'))
}
}
----

View file

@ -0,0 +1,16 @@
=== Reproducible Builds
Because JAR files contain the timestamp of the included files, it is often difficult to create reproducible builds
from a source commit that results in a hash identical file.
Gradle supports reproducible JAR creation by setting the timestamps of included files to a consistent value.
Shadow includes support for overriding file timestamps. By default, Shadow will preserve
the file timestamps when creating the Shadow JAR. To set timestamps to a consistent value (1980/1/1 00:00:00),
set the `preserveFileTimestamps` property to `false` on the `ShadowJar` task.
.Reset file timestamps
[source,groovy,indent=0]
----
shadowJar {
preserveFileTimestamps = false
}
----

View file

@ -0,0 +1,20 @@
== Creating a Custom ShadowJar Task
The built-in `shadowJar` task only provides an output for the `main` source set of the project.
It is possible to add arbitrary link:{api}/tasks/ShadowJar.html[`ShadowJar`] tasks to a project.
When doing so, ensure that the `configurations` property is specified to inform Shadow which dependencies to merge
into the output.
.Shadowing Test Sources and Dependencies
[source,groovy,indent=0]
----
task testJar(type: ShadowJar) {
classifier = 'tests'
from sourceSets.test.output
configurations = [project.configurations.testRuntime]
}
----
The code snippet above will generate a shadowed JAR containing both the `main` and `test` sources as well as all `runtime`
and `testRuntime` dependencies.
The file is output to `build/libs/<project>-<version>-tests.jar`.

View file

@ -0,0 +1,69 @@
== Publishing Shadow JARs
=== Publishing with Maven-Publish Plugin
The Shadow plugin will automatically configure the necessary tasks in the presence of Gradle's
`maven-publish` plugin.
The plugin provides the `component` method from the `shadow` extension to configure the
publication with the necessary artifact and dependencies in the POM file.
.Publishing a Shadow JAR with the Maven-Publish Plugin
[source,groovy,indent=0]
----
apply plugin: 'java'
apply plugin: 'maven-publish'
apply plugin: 'org.xbib.gradle.plugin.shadow'
publishing {
publications {
shadow(MavenPublication) { publication ->
project.shadow.component(publication)
}
}
repositories {
maven {
url "http://repo.myorg.com"
}
}
}
----
=== Publishing with Maven Plugin
The Shadow plugin will automatically configure the necessary tasks in the presence of Gradle's
`maven` plugin.
To publish the JAR, simply configure the publish location for the `uploadShadow` task and execute it.
.Publishing a Shadow JAR with the Maven Plugin
[source,groovy,indent=0]
----
apply plugin: 'java'
apply plugin: 'maven'
apply plugin: 'org.xbib.gradle.plugin.shadow'
uploadShadow {
repositories {
mavenDeployer {
repository(url: "http://repo.myorg.com")
}
}
}
----
=== Shadow Configuration and Publishing
The Shadow plugin provides a custom configuration (`configurations.shadow`) to specify
runtime dependencies that are *not* merged into the final JAR file.
When configuring publishing with the Shadow plugin, the dependencies in the `shadow`
configuration are translated to become `RUNTIME` scoped dependencies of the
published artifact.
No other dependencies are automatically configured for inclusion in the POM file.
For example, excluded dependencies are *not* automatically added to the POM file, and
if the configurations to merge are modified by specifying
`shadowJar.configurations = [configurations.myconfiguration]`, there is no automatic
configuration of the POM file.
This automatic configuration occurs _only_ when using the above methods for
configuring publishing. If this behavior is not desirable, then publishing *must*
be manually configured.
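For illustration, a dependency declared in the `shadow` configuration as sketched below (the coordinates are only an
example) is kept out of the merged JAR and, when the publishing integrations above are used, appears as a `RUNTIME`
scoped dependency in the POM:

[source,groovy,indent=0]
----
dependencies {
    // not merged into the shadowed JAR; listed as a RUNTIME dependency in the generated POM
    shadow 'junit:junit:3.8.2'
}
----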

View file

@ -0,0 +1,21 @@
== Using Shadow in Multi-Project Builds
When using Shadow in a multi-project build, project dependencies will be treated the same as
external dependencies.
That is, a project dependency will be merged into the `shadowJar` output of the project that
is applying the Shadow plugin.
=== Depending on the Shadow Jar from Another Project
In a multi-project build there may be one project that applies Shadow and another that
requires the shadowed JAR as a dependency.
In this case, use Gradle's normal dependency declaration mechanism to depend on the `shadow`
configuration of the shadowed project.
.Depending On Shadow Output of Project
[source,groovy,indent=0]
----
dependencies {
compile project(path: 'api', configuration: 'shadow')
}
----

View file

@ -0,0 +1,49 @@
== Using Shadow to Package Gradle Plugins
In some scenarios, writing a Gradle plugin can be problematic because your plugin may depend on a library version that
conflicts with the version provided by the Gradle runtime. If this is the case, then you can utilize Shadow
to relocate your dependencies to a different package name to avoid the collision.
Configuring the relocation has always been possible, but the build author is required to know all the package names
beforehand. Shadow introduces a new task type `ConfigureShadowRelocation`.
Tasks of this type are configured to target an instance of a `ShadowJar` task and run immediately before it.
The `ConfigureShadowRelocation` task, scans the dependencies from the configurations specified on the associated
`ShadowJar` task and collects the package names contained within them. It then configures relocation for these
packages using the specified `prefix` on the associated `ShadowJar` task.
While this is useful for developing Gradle plugins, nothing about the `ConfigureShadowRelocation` task is tied to
Gradle projects. It can be used for standard Java or Groovy projects.
A simple Gradle plugin can use this feature by applying the `shadow` plugin and configuring the dependencies
like so:
[source,groovy,subs="+attributes"]
----
import org.xbib.gradle.plugin.shadow.tasks.ConfigureShadowRelocation
plugins {
id 'java'
id 'org.xbib.gradle.plugin.shadow' version '{project-version}'
}
dependencies {
shadow localGroovy()
shadow gradleApi()
compile 'org.ow2.asm:asm:7.0-beta'
compile 'org.ow2.asm:asm-commons:7.0-beta'
compile 'org.ow2.asm:asm-util:7.0-beta'
}
task relocateShadowJar(type: ConfigureShadowRelocation) {
target = tasks.shadowJar
}
tasks.shadowJar.dependsOn tasks.relocateShadowJar
----
Note that the `localGroovy()` and `gradleApi()` dependencies are added to the `shadow` configuration instead of the
normal `compile` configuration. These dependencies are provided by Gradle to compile your project but are ultimately
provided by the Gradle runtime when executing the plugin. Thus, it is __not__ advisable to bundle these dependencies
with your plugin.

View file

@ -0,0 +1,23 @@
== About This Project
John Engelman about the original Shadow plugin:
____
I started this project in December of 2012. We were working on converting from a monolithic application into the
new hot jazz of "microservices" using Dropwizard.
I had also just started learning about Gradle and I knew that the incremental build system it provided would benefit
our development team greatly.
Unfortunately, the closest thing that Gradle had to Maven's Shade plugin was its ability to create application TARs and
ZIPs.
So, Charlie Knudsen and myself set out to port the existing Shade code into a Gradle plugin.
This port is what existed up until the `0.9` milestone releases for Shadow.
It functioned, but it wasn't idiomatic Gradle by any means.
Starting with 0.9, Shadow was rewritten from the ground up as a standard Gradle plugin and leveraged as much of Gradle's
classes and concepts as possible.
At the same time as the 0.9 release, Gradle was announcing the https://plugins.gradle.org[Gradle Plugin Portal] and
so Shadow was published there.
Shadow has had nearly 900,000 downloads from Bintray and countless more from the Gradle Plugin Portal.
____

View file

@ -0,0 +1,32 @@
:src: ../
:tests: {src}/test/groovy/org/xbib/gradle/plugin/shadow
:api: api/org/xbib/gradle/plugin/shadow
:docinfo1:
ifdef::env-github[]
:note-caption: :information-source:
endif::[]
= Gradle Shadow Plugin User Guide & Examples
:revnumber: {project-version}
NOTE: This documentation was taken from the original Shadow Plugin and is copyrighted by John R. Engelman.
References to older original Shadow Plugin versions have been removed for consistency.
The references to the application mode (runShadow) are no longer supported and have been removed.
To create application images, use Java Platform Module System, especially the `jlink` tool.
link:api/index.html[API Docs]
include::00-intro.adoc[]
include::10-configuring.adoc[]
include::20-custom-tasks.adoc[]
include::40-publishing.adoc[]
include::50-multi-project-builds.adoc[]
include::60-shadowing-plugins.adoc[]
include::99-about.adoc[]

View file

@ -0,0 +1,21 @@
package org.xbib.gradle.plugin.shadow
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.util.GradleVersion
class ShadowBasePlugin implements Plugin<Project> {
static final String EXTENSION_NAME = 'shadow'
static final String CONFIGURATION_NAME = 'shadow'
@Override
void apply(Project project) {
if (GradleVersion.current() < GradleVersion.version('4.10')) {
throw new IllegalArgumentException('shadow requires at least Gradle 4.10')
}
project.extensions.create(EXTENSION_NAME, ShadowExtension, project)
project.configurations.create(CONFIGURATION_NAME)
}
}

View file

@ -0,0 +1,38 @@
package org.xbib.gradle.plugin.shadow
import org.gradle.api.Project
import org.gradle.api.artifacts.SelfResolvingDependency
import org.gradle.api.file.CopySpec
import org.gradle.api.publish.maven.MavenPom
import org.gradle.api.publish.maven.MavenPublication
class ShadowExtension {
CopySpec applicationDistribution
Project project
ShadowExtension(Project project) {
this.project = project
applicationDistribution = project.copySpec {}
}
void component(MavenPublication publication) {
publication.artifact(project.tasks.shadowJar)
publication.pom { MavenPom pom ->
pom.withXml { xml ->
def dependenciesNode = xml.asNode().appendNode('dependencies')
project.configurations.shadow.allDependencies.each {
if (! (it instanceof SelfResolvingDependency)) {
def dependencyNode = dependenciesNode.appendNode('dependency')
dependencyNode.appendNode('groupId', it.group)
dependencyNode.appendNode('artifactId', it.name)
dependencyNode.appendNode('version', it.version)
dependencyNode.appendNode('scope', 'runtime')
}
}
}
}
}
}

View file

@ -0,0 +1,58 @@
package org.xbib.gradle.plugin.shadow
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.plugins.JavaPluginConvention
import org.gradle.configuration.project.ProjectConfigurationActionContainer
import org.xbib.gradle.plugin.shadow.tasks.ShadowJar
import javax.inject.Inject
class ShadowJavaPlugin implements Plugin<Project> {
static final String SHADOW_JAR_TASK_NAME = 'shadowJar'
static final String SHADOW_GROUP = 'Shadow'
private final ProjectConfigurationActionContainer configurationActionContainer;
@Inject
ShadowJavaPlugin(ProjectConfigurationActionContainer configurationActionContainer) {
this.configurationActionContainer = configurationActionContainer
}
@Override
void apply(Project project) {
configureShadowTask(project)
project.configurations.compileClasspath.extendsFrom project.configurations.shadow
}
protected void configureShadowTask(Project project) {
JavaPluginConvention convention = project.convention.getPlugin(JavaPluginConvention)
project.tasks.register(SHADOW_JAR_TASK_NAME, ShadowJar) { shadow ->
shadow.group = SHADOW_GROUP
shadow.description = 'Create a combined JAR of project and runtime dependencies'
shadow.archiveClassifier.set("all")
shadow.manifest.inheritFrom project.tasks.jar.manifest
def libsProvider = project.provider { -> [project.tasks.jar.manifest.attributes.get('Class-Path')] }
def files = project.objects.fileCollection().from { ->
project.configurations.findByName(ShadowBasePlugin.CONFIGURATION_NAME)
}
shadow.doFirst {
if (!files.empty) {
def libs = libsProvider.get()
libs.addAll files.collect { "${it.name}" }
manifest.attributes 'Class-Path': libs.findAll { it }.join(' ')
}
}
shadow.from(convention.sourceSets.main.output)
shadow.configurations = [project.configurations.findByName('runtimeClasspath') ?
project.configurations.runtimeClasspath : project.configurations.runtime]
shadow.exclude('META-INF/INDEX.LIST', 'META-INF/*.SF', 'META-INF/*.DSA', 'META-INF/*.RSA', 'module-info.class')
shadow.dependencies {
exclude(dependency(project.dependencies.gradleApi()))
}
project.artifacts.add(ShadowBasePlugin.CONFIGURATION_NAME, shadow)
}
}
}

View file

@ -0,0 +1,31 @@
package org.xbib.gradle.plugin.shadow
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.plugins.JavaPlugin
import org.xbib.gradle.plugin.shadow.tasks.ShadowJar
class ShadowPlugin implements Plugin<Project> {
@Override
void apply(Project project) {
project.plugins.apply(ShadowBasePlugin)
project.plugins.withType(JavaPlugin) {
project.plugins.apply(ShadowJavaPlugin)
}
def rootProject = project.rootProject
rootProject.plugins.withId('com.gradle.build-scan') {
rootProject.buildScan.buildFinished {
def shadowTasks = project.tasks.withType(ShadowJar)
shadowTasks.each { task ->
if (task.didWork) {
task.stats.buildScanData.each { k, v ->
rootProject.buildScan.value "shadow.${task.path}.${k}", v.toString()
}
rootProject.buildScan.value "shadow.${task.path}.configurations", task.configurations*.name.join(", ")
}
}
}
}
}
}

View file

@ -0,0 +1,81 @@
package org.xbib.gradle.plugin.shadow
import groovy.util.logging.Log
import org.gradle.api.GradleException
@Log
class ShadowStats {
long totalTime
long jarStartTime
long jarEndTime
int jarCount = 1
boolean processingJar
Map<String, String> relocations = [:]
void relocate(String src, String dst) {
relocations[src] = dst
}
String getRelocationString() {
def maxLength = relocations.keySet().collect { it.length() }.max()
relocations.collect { k, v -> "${k} ${separator(k, maxLength)} ${v}"}.sort().join("\n")
}
String separator(String key, int max) {
return "→"
}
void startJar() {
if (processingJar) throw new GradleException("Can only time one entry at a time")
processingJar = true
jarStartTime = System.currentTimeMillis()
}
void finishJar() {
if (processingJar) {
jarEndTime = System.currentTimeMillis()
jarCount++
totalTime += jarTiming
processingJar = false
}
}
void printStats() {
println this
}
long getJarTiming() {
jarEndTime - jarStartTime
}
double getTotalTimeSecs() {
totalTime / 1000
}
double getAverageTimePerJar() {
totalTime / jarCount
}
double getAverageTimeSecsPerJar() {
averageTimePerJar / 1000
}
String toString() {
StringBuilder sb = new StringBuilder()
sb.append "*******************\n"
sb.append "GRADLE SHADOW STATS\n"
sb.append "\n"
sb.append "Total Jars: $jarCount (includes project)\n"
sb.append "Total Time: ${totalTimeSecs}s [${totalTime}ms]\n"
sb.append "Average Time/Jar: ${averageTimeSecsPerJar}s [${averageTimePerJar}ms]\n"
sb.append "*******************"
}
Map<String, String> getBuildScanData() {
[
dependencies: jarCount,
relocations: relocationString
]
}
}

View file

@ -0,0 +1,94 @@
package org.xbib.gradle.plugin.shadow.impl
import org.objectweb.asm.commons.Remapper
import org.xbib.gradle.plugin.shadow.relocation.RelocateClassContext
import org.xbib.gradle.plugin.shadow.relocation.RelocatePathContext
import org.xbib.gradle.plugin.shadow.relocation.Relocator
import org.xbib.gradle.plugin.shadow.tasks.ShadowCopyAction
import org.xbib.gradle.plugin.shadow.ShadowStats
import java.util.regex.Matcher
import java.util.regex.Pattern
class RelocatorRemapper extends Remapper {
private final Pattern classPattern = Pattern.compile("(\\[*)?L(.+)")
List<Relocator> relocators
ShadowStats stats
RelocatorRemapper(List<Relocator> relocators, ShadowStats stats) {
this.relocators = relocators
this.stats = stats
}
boolean hasRelocators() {
return !relocators.empty
}
Object mapValue(Object object) {
if (object instanceof String) {
String name = (String) object
String value = name
String prefix = ""
String suffix = ""
Matcher m = classPattern.matcher(name)
if (m.matches()) {
prefix = m.group(1) + "L"
suffix = ""
name = m.group(2)
}
RelocateClassContext classContext = RelocateClassContext.builder().className(name).stats(stats).build()
RelocatePathContext pathContext = RelocatePathContext.builder().path(name).stats(stats).build()
for (Relocator r : relocators) {
if (r.canRelocateClass(classContext)) {
value = prefix + r.relocateClass(classContext) + suffix
break
} else if (r.canRelocatePath(pathContext)) {
value = prefix + r.relocatePath(pathContext) + suffix
break
}
}
return value
}
return super.mapValue(object)
}
String map(String name) {
String value = name
String prefix = ""
String suffix = ""
Matcher m = classPattern.matcher(name)
if (m.matches()) {
prefix = m.group(1) + "L"
suffix = ""
name = m.group(2)
}
RelocatePathContext pathContext = RelocatePathContext.builder().path(name).stats(stats).build()
for (Relocator r : relocators) {
if (r.canRelocatePath(pathContext)) {
value = prefix + r.relocatePath(pathContext) + suffix
break
}
}
return value
}
String mapPath(String path) {
map(path.substring(0, path.indexOf('.')))
}
String mapPath(ShadowCopyAction.RelativeArchivePath path) {
mapPath(path.pathString)
}
}

View file

@ -0,0 +1,85 @@
package org.xbib.gradle.plugin.shadow.internal
class Clazz implements Comparable<Clazz> {
private final Set<Clazz> dependencies = new HashSet<Clazz>()
private final Set<Clazz> references = new HashSet<Clazz>()
private final Set<ClazzpathUnit> units = new HashSet<ClazzpathUnit>()
private final String name
Clazz( final String pName ) {
name = pName;
}
String getName() {
name
}
void addClazzpathUnit(ClazzpathUnit pUnit ) {
units.add(pUnit)
}
void removeClazzpathUnit( ClazzpathUnit pUnit ) {
units.remove(pUnit)
}
Set<ClazzpathUnit> getClazzpathUnits() {
units
}
void addDependency( final Clazz pClazz ) {
pClazz.references.add(this)
dependencies.add(pClazz)
}
public void removeDependency( final Clazz pClazz ) {
pClazz.references.remove(this)
dependencies.remove(pClazz)
}
Set<Clazz> getDependencies() {
return dependencies
}
Set<Clazz> getReferences() {
return references
}
Set<Clazz> getTransitiveDependencies() {
final Set<Clazz> all = new HashSet<Clazz>();
findTransitiveDependencies(all);
return all;
}
void findTransitiveDependencies( final Set<? super Clazz> pAll ) {
for (Clazz clazz : dependencies) {
if (!pAll.contains(clazz)) {
pAll.add(clazz)
clazz.findTransitiveDependencies(pAll)
}
}
}
boolean equals( final Object pO ) {
if (pO.getClass() != Clazz.class) {
return false
}
Clazz c = (Clazz) pO
name.equals(c.name)
}
int hashCode() {
name.hashCode()
}
int compareTo( final Clazz pO ) {
name.compareTo(pO.name)
}
String toString() {
name
}
}
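A short sketch of the bidirectional bookkeeping Clazz performs when a dependency is recorded:
def a = new Clazz('com.example.A')
def b = new Clazz('com.example.B')
a.addDependency(b)
assert a.dependencies.contains(b)            // A -> B
assert b.references.contains(a)              // B knows it is referenced by A
assert a.transitiveDependencies == [b] as Set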


@ -0,0 +1,228 @@
package org.xbib.gradle.plugin.shadow.internal
import org.objectweb.asm.ClassReader
import java.util.jar.JarEntry
import java.util.jar.JarInputStream
import java.nio.file.Files
import java.nio.file.Path
import java.util.stream.Stream
import java.util.stream.StreamSupport
class Clazzpath {
private final Set<ClazzpathUnit> units
private final Map<String, Clazz> missing
private final Map<String, Clazz> clazzes
Clazzpath() {
units = new HashSet<ClazzpathUnit>()
missing = new HashMap<String, Clazz>()
clazzes = new HashMap<String, Clazz>()
}
boolean removeClazzpathUnit(ClazzpathUnit unit) {
Set<Clazz> unitClazzes = unit.getClazzes()
for (Clazz clazz : unitClazzes) {
clazz.removeClazzpathUnit(unit)
if (clazz.getClazzpathUnits().size() == 0) {
clazzes.remove(clazz.toString())
}
}
units.remove(unit)
}
ClazzpathUnit addClazzpathUnit(File file) throws IOException {
addClazzpathUnit(file.toPath())
}
ClazzpathUnit addClazzpathUnit(File file, String s) throws IOException {
addClazzpathUnit(file.toPath(), s)
}
ClazzpathUnit addClazzpathUnit(Path path) throws IOException {
addClazzpathUnit(path, path.toString())
}
ClazzpathUnit addClazzpathUnit(Path path1, String s) throws IOException {
Path path = path1.toAbsolutePath()
if (Files.isRegularFile(path)) {
return addClazzpathUnit(Files.newInputStream(path), s)
} else if (Files.isDirectory(path)) {
String prefix = Utils.separatorsToUnix(Utils.normalize(path.toString() + File.separatorChar))
List<Resource> list = []
path.traverse { p ->
if (Files.isRegularFile(p) && isValidResourceName(p.getFileName().toString())) {
list << new Resource(p.toString().substring(prefix.length())) {
@Override
InputStream getInputStream() throws IOException {
Files.newInputStream(p)
}
}
}
}
return addClazzpathUnit(list, s, true)
}
throw new IllegalArgumentException("neither file nor directory")
}
ClazzpathUnit addClazzpathUnit(InputStream inputStream, String s) throws IOException {
final JarInputStream jarInputStream = new JarInputStream(inputStream)
try {
Stream stream = toEntryStream(jarInputStream).map { e -> e.getName() }
.filter { name -> isValidResourceName(name) }
.map { name -> new Resource(name) {
@Override
InputStream getInputStream() throws IOException {
jarInputStream
}
}
}
addClazzpathUnit(stream.&iterator, s, false)
} finally {
jarInputStream.close()
}
}
ClazzpathUnit addClazzpathUnit(Iterable<Resource> resources, String s, boolean shouldCloseResourceStream)
throws IOException {
Map<String, Clazz> unitClazzes = new HashMap<String, Clazz>()
Map<String, Clazz> unitDependencies = new HashMap<String, Clazz>()
ClazzpathUnit unit = new ClazzpathUnit(s, unitClazzes, unitDependencies)
for (Resource resource : resources) {
String clazzName = resource.name
Clazz clazz = getClazz(clazzName)
if (clazz == null) {
clazz = missing.get(clazzName)
if (clazz != null) {
// already marked missing
clazz = missing.remove(clazzName)
} else {
clazz = new Clazz(clazzName)
}
}
clazz.addClazzpathUnit(unit)
clazzes.put(clazzName, clazz)
unitClazzes.put(clazzName, clazz)
DependenciesClassRemapper dependenciesClassAdapter = new DependenciesClassRemapper()
InputStream inputStream = resource.getInputStream()
try {
new ClassReader(inputStream.readAllBytes()).accept(dependenciesClassAdapter, ClassReader.EXPAND_FRAMES | ClassReader.SKIP_DEBUG)
} finally {
if (shouldCloseResourceStream) {
inputStream.close()
}
}
Set<String> depNames = dependenciesClassAdapter.getDependencies()
for (String depName : depNames) {
Clazz dep = getClazz(depName)
if (dep == null) {
// there is no such clazz yet
dep = missing.get(depName)
}
if (dep == null) {
// it is also not recorded to be missing
dep = new Clazz(depName)
dep.addClazzpathUnit(unit)
missing.put(depName, dep)
}
if (dep != clazz) {
unitDependencies.put(depName, dep)
clazz.addDependency(dep)
}
}
}
units.add(unit)
unit
}
Set<Clazz> getClazzes() {
new HashSet<Clazz>(clazzes.values())
}
Set<Clazz> getClashedClazzes() {
Set<Clazz> all = new HashSet<Clazz>()
for (Clazz clazz : clazzes.values()) {
if (clazz.getClazzpathUnits().size() > 1) {
all.add(clazz)
}
}
all
}
Set<Clazz> getMissingClazzes() {
new HashSet<Clazz>(missing.values())
}
Clazz getClazz(String clazzName) {
(Clazz) clazzes.get(clazzName)
}
ClazzpathUnit[] getUnits() {
units.toArray(new ClazzpathUnit[units.size()])
}
private Stream<JarEntry> toEntryStream(JarInputStream jarInputStream) {
StreamSupport.stream(Spliterators.spliteratorUnknownSize(new JarEntryIterator(jarInputStream),
Spliterator.IMMUTABLE), false)
}
private class JarEntryIterator implements Iterator<JarEntry> {
JarInputStream jarInputStream
JarEntry entry
JarEntryIterator(JarInputStream jarInputStream) {
this.jarInputStream = jarInputStream
this.entry = null
}
@Override
boolean hasNext() {
try {
if (entry == null) {
entry = jarInputStream.getNextJarEntry()
}
return entry != null
} catch (IOException e) {
throw new RuntimeException(e)
}
}
@Override
JarEntry next() {
try {
JarEntry result = entry != null ? entry : jarInputStream.getNextJarEntry()
entry = null
return result
} catch (IOException e) {
throw new RuntimeException(e)
}
}
}
private static abstract class Resource {
private static int ext = '.class'.length()
String name
Resource(String name) {
// foo/bar/Foo.class -> foo.bar.Foo
this.name = name.substring(0, name.length() - ext).replace('/', '.')
}
abstract InputStream getInputStream() throws IOException
@Override
String toString() {
name
}
}
private static boolean isValidResourceName(String name) {
(name != null) && name.endsWith('.class') && !name.contains('-')
}
}
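A minimal sketch of how a Clazzpath is populated and queried (the jar path below is a placeholder):
def clazzpath = new Clazzpath()
def unit = clazzpath.addClazzpathUnit(new File('/tmp/example.jar'))   // placeholder jar
println "classes in unit: ${unit.clazzes.size()}"
println "clashes across units: ${clazzpath.clashedClazzes*.name}"
println "unresolved references: ${clazzpath.missingClazzes*.name}"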


@ -0,0 +1,40 @@
package org.xbib.gradle.plugin.shadow.internal
class ClazzpathUnit {
private final String s
private final Map<String, Clazz> clazzes
private final Map<String, Clazz> dependencies
ClazzpathUnit(String s, Map<String, Clazz> clazzes, Map<String, Clazz> dependencies) {
this.s = s
this.clazzes = clazzes
this.dependencies = dependencies
}
Set<Clazz> getClazzes() {
new HashSet<Clazz>(clazzes.values())
}
Clazz getClazz( final String pClazzName ) {
clazzes.get(pClazzName)
}
Set<Clazz> getDependencies() {
new HashSet<Clazz>(dependencies.values())
}
Set<Clazz> getTransitiveDependencies() {
Set<Clazz> all = new HashSet<Clazz>()
for (Clazz clazz : clazzes.values()) {
clazz.findTransitiveDependencies(all)
}
all
}
String toString() {
s
}
}


@ -0,0 +1,126 @@
package org.xbib.gradle.plugin.shadow.internal
import org.gradle.api.Project
import org.gradle.api.artifacts.Configuration
import org.gradle.api.artifacts.Dependency
import org.gradle.api.artifacts.ResolvedDependency
import org.gradle.api.file.FileCollection
import org.gradle.api.specs.Spec
import org.gradle.api.specs.Specs
class DefaultDependencyFilter implements DependencyFilter {
private final Project project
private final List<Spec<? super ResolvedDependency>> includeSpecs = []
private final List<Spec<? super ResolvedDependency>> excludeSpecs = []
DefaultDependencyFilter(Project project) {
this.project = project
}
FileCollection resolve(Configuration configuration) {
Set<ResolvedDependency> includedDeps = []
Set<ResolvedDependency> excludedDeps = []
resolve(configuration.resolvedConfiguration.firstLevelModuleDependencies, includedDeps, excludedDeps)
project.files(configuration.files) - project.files(excludedDeps.collect {
it.moduleArtifacts*.file
}.flatten())
}
FileCollection resolve(Collection<Configuration> configurations) {
configurations.collect {
resolve(it)
}.sum() as FileCollection ?: project.files()
}
/**
* Exclude dependencies that match the provided spec.
*
* @param spec
* @return
*/
DependencyFilter exclude(Spec<? super ResolvedDependency> spec) {
excludeSpecs << spec
this
}
/**
* Include dependencies that match the provided spec.
*
* @param spec
* @return
*/
DependencyFilter include(Spec<? super ResolvedDependency> spec) {
includeSpecs << spec
return this
}
/**
* Create a spec that matches the provided project notation on group, name, and version
* @param notation
* @return
*/
Spec<? super ResolvedDependency> project(Map<String, ?> notation) {
dependency(project.dependencies.project(notation))
}
/**
* Create a spec that matches the default configuration for the provided project path on group, name, and version
*
* @param notation
* @return
*/
Spec<? super ResolvedDependency> project(String notation) {
dependency(project.dependencies.project(path: notation, configuration: 'default'))
}
/**
* Create a spec that matches dependencies using the provided notation on group, name, and version
* @param notation
* @return
*/
Spec<? super ResolvedDependency> dependency(Object notation) {
dependency(project.dependencies.create(notation))
}
/**
* Create a spec that matches the provided dependency on group, name, and version
* @param dependency
* @return
*/
Spec<? super ResolvedDependency> dependency(Dependency dependency) {
this.dependency({ ResolvedDependency it ->
(!dependency.group || it.moduleGroup.matches(dependency.group)) &&
(!dependency.name || it.moduleName.matches(dependency.name)) &&
(!dependency.version || it.moduleVersion.matches(dependency.version))
})
}
/**
* Create a spec from the provided closure
* @param spec
* @return
*/
Spec<? super ResolvedDependency> dependency(Closure spec) {
return Specs.<ResolvedDependency>convertClosureToSpec(spec)
}
protected void resolve(Set<ResolvedDependency> dependencies,
Set<ResolvedDependency> includedDependencies,
Set<ResolvedDependency> excludedDependencies) {
dependencies.each {
if (isIncluded(it) ? includedDependencies.add(it) : excludedDependencies.add(it)) {
resolve(it.children, includedDependencies, excludedDependencies)
}
}
}
protected boolean isIncluded(ResolvedDependency dependency) {
boolean include = includeSpecs.empty || includeSpecs.any { it.isSatisfiedBy(dependency) }
boolean exclude = !excludeSpecs.empty && excludeSpecs.any { it.isSatisfiedBy(dependency) }
return include && !exclude
}
}
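A usage sketch from a build script or plugin where a Gradle project is in scope (the coordinates are placeholders; runtimeClasspath assumes the Java plugin is applied):
def filter = new DefaultDependencyFilter(project)
filter.exclude(filter.dependency('com.example:unwanted:1.0'))   // placeholder coordinates
def shadedFiles = filter.resolve(project.configurations.runtimeClasspath)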


@ -0,0 +1,28 @@
package org.xbib.gradle.plugin.shadow.internal
import org.gradle.api.UncheckedIOException
import org.xbib.gradle.plugin.shadow.zip.Zip64Mode
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
class DefaultZipCompressor implements ZipCompressor {
private final int entryCompressionMethod
private final Zip64Mode zip64Mode
DefaultZipCompressor(boolean allowZip64Mode, int entryCompressionMethod) {
this.entryCompressionMethod = entryCompressionMethod
zip64Mode = allowZip64Mode ? Zip64Mode.AsNeeded : Zip64Mode.Never
}
@Override
ZipOutputStream createArchiveOutputStream(File destination) {
try {
ZipOutputStream zipOutputStream = new ZipOutputStream(destination)
zipOutputStream.setUseZip64(zip64Mode)
zipOutputStream.setMethod(entryCompressionMethod)
return zipOutputStream
} catch (Exception e) {
throw new UncheckedIOException("unable to create ZIP output stream for file " + destination, e)
}
}
}
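A small sketch of creating a deflated, Zip64-capable output stream with this compressor (the target file is a placeholder):
import java.util.zip.ZipEntry
def compressor = new DefaultZipCompressor(true, ZipEntry.DEFLATED)
def zipOut = compressor.createArchiveOutputStream(new File('/tmp/example.zip'))   // placeholder
zipOut.close()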


@ -0,0 +1,144 @@
package org.xbib.gradle.plugin.shadow.internal
import org.objectweb.asm.AnnotationVisitor
import org.objectweb.asm.ClassVisitor
import org.objectweb.asm.FieldVisitor
import org.objectweb.asm.Label
import org.objectweb.asm.MethodVisitor
import org.objectweb.asm.Opcodes
import org.objectweb.asm.TypePath
import org.objectweb.asm.commons.ClassRemapper
import org.objectweb.asm.commons.Remapper
class DependenciesClassRemapper extends ClassRemapper {
private static final AnnotationVisitor annotationVisitor = new MyAnnotationVisitor()
private static final MethodVisitor methodVisitor = new MyMethodVisitor()
private static final FieldVisitor fieldVisitor = new MyFieldVisitor()
private static final ClassVisitor classVisitor = new MyClassVisitor()
DependenciesClassRemapper() {
super(classVisitor, new CollectingRemapper())
}
Set<String> getDependencies() {
return ((CollectingRemapper) super.remapper).classes
}
private static class CollectingRemapper extends Remapper {
Set<String> classes = new HashSet<String>()
@Override
String map(String className) {
classes.add(className.replace('/', '.'))
className
}
}
private static class MyAnnotationVisitor extends AnnotationVisitor {
MyAnnotationVisitor() {
super(Opcodes.ASM7)
}
@Override
AnnotationVisitor visitAnnotation(String name, String desc) {
this
}
@Override
AnnotationVisitor visitArray(String name) {
this
}
}
private static class MyMethodVisitor extends MethodVisitor {
MyMethodVisitor() {
super(Opcodes.ASM7)
}
@Override
AnnotationVisitor visitAnnotationDefault() {
annotationVisitor
}
@Override
AnnotationVisitor visitAnnotation(String desc, boolean visible) {
annotationVisitor
}
@Override
AnnotationVisitor visitParameterAnnotation(int parameter, String desc, boolean visible) {
annotationVisitor
}
@Override
AnnotationVisitor visitInsnAnnotation(int typeRef, TypePath typePath, String descriptor, boolean visible) {
annotationVisitor
}
@Override
AnnotationVisitor visitLocalVariableAnnotation(int typeRef, TypePath typePath, Label[] start,
Label[] end, int[] index, String descriptor,
boolean visible) {
annotationVisitor
}
@Override
AnnotationVisitor visitTryCatchAnnotation(int typeRef, TypePath typePath, String descriptor, boolean visible ) {
annotationVisitor
}
@Override
AnnotationVisitor visitTypeAnnotation(int typeRef, TypePath typePath, String descriptor, boolean visible ) {
annotationVisitor
}
}
private static class MyFieldVisitor extends FieldVisitor {
MyFieldVisitor() {
super(Opcodes.ASM7)
}
@Override
AnnotationVisitor visitAnnotation(String desc, boolean visible) {
annotationVisitor
}
@Override
AnnotationVisitor visitTypeAnnotation(int typeRef, TypePath typePath, String descriptor, boolean visible) {
annotationVisitor
}
}
private static class MyClassVisitor extends ClassVisitor {
MyClassVisitor() {
super(Opcodes.ASM7);
}
@Override
AnnotationVisitor visitAnnotation(String desc, boolean visible) {
annotationVisitor
}
@Override
FieldVisitor visitField(int access, String name, String desc, String signature, Object value) {
fieldVisitor
}
@Override
MethodVisitor visitMethod(int access, String name, String desc, String signature, String[] exceptions) {
methodVisitor
}
@Override
AnnotationVisitor visitTypeAnnotation(int typeRef, TypePath typePath, String descriptor, boolean visible ) {
annotationVisitor
}
}
}


@ -0,0 +1,76 @@
package org.xbib.gradle.plugin.shadow.internal
import org.gradle.api.artifacts.Configuration
import org.gradle.api.artifacts.Dependency
import org.gradle.api.artifacts.ResolvedDependency
import org.gradle.api.file.FileCollection
import org.gradle.api.specs.Spec
interface DependencyFilter {
/**
* Resolve a Configuration against the include/exclude rules in the filter
* @param configuration
* @return
*/
FileCollection resolve(Configuration configuration)
/**
* Resolve all Configurations against the include/exclude rules in the filter and combine the results
* @param configurations
* @return
*/
FileCollection resolve(Collection<Configuration> configurations)
/**
* Exclude dependencies that match the provided spec.
*
* @param spec
* @return
*/
DependencyFilter exclude(Spec<? super ResolvedDependency> spec)
/**
* Include dependencies that match the provided spec.
*
* @param spec
* @return
*/
DependencyFilter include(Spec<? super ResolvedDependency> spec)
/**
* Create a spec that matches the provided project notation on group, name, and version
* @param notation
* @return
*/
Spec<? super ResolvedDependency> project(Map<String, ?> notation)
/**
* Create a spec that matches the default configuration for the provided project path on group, name, and version
*
* @param notation
* @return
*/
Spec<? super ResolvedDependency> project(String notation)
/**
* Create a spec that matches dependencies using the provided notation on group, name, and version
* @param notation
* @return
*/
Spec<? super ResolvedDependency> dependency(Object notation)
/**
* Create a spec that matches the provided dependency on group, name, and version
* @param dependency
* @return
*/
Spec<? super ResolvedDependency> dependency(Dependency dependency)
/**
* Create a spec from the provided closure
* @param spec
* @return
*/
Spec<? super ResolvedDependency> dependency(Closure spec)
}


@ -0,0 +1,44 @@
package org.xbib.gradle.plugin.shadow.internal
import org.objectweb.asm.ClassReader
import java.util.jar.JarEntry
import java.util.jar.JarInputStream
class DependencyUtils {
static Collection<String> getDependenciesOfJar(InputStream is ) throws IOException {
Set<String> dependencies = []
JarInputStream inputStream = new JarInputStream(is)
inputStream.withCloseable {
while (true) {
JarEntry entry = inputStream.getNextJarEntry()
if (entry == null) {
break
}
if (entry.isDirectory()) {
inputStream.readAllBytes()
continue
}
if (entry.getName().endsWith('.class')) {
DependenciesClassRemapper dependenciesClassAdapter = new DependenciesClassRemapper()
new ClassReader(inputStream.readAllBytes()).accept(dependenciesClassAdapter, 0)
dependencies.addAll(dependenciesClassAdapter.getDependencies())
} else {
inputStream.readAllBytes()
}
}
}
dependencies
}
static Collection<String> getDependenciesOfClass(InputStream is) throws IOException {
final DependenciesClassRemapper dependenciesClassAdapter = new DependenciesClassRemapper()
new ClassReader(is.readAllBytes()).accept(dependenciesClassAdapter, ClassReader.EXPAND_FRAMES)
dependenciesClassAdapter.getDependencies()
}
static Collection<String> getDependenciesOfClass(Class<?> clazz) throws IOException {
getDependenciesOfClass(clazz.getResourceAsStream('/' + clazz.getName().replace('.', '/') + '.class'))
}
}
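A quick sketch of extracting the classes referenced by a compiled class:
def deps = DependencyUtils.getDependenciesOfClass(String)
assert deps.contains('java.lang.Object')   // every class references its superclass
println deps.sort().take(10)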


@ -0,0 +1,142 @@
package org.xbib.gradle.plugin.shadow.internal
/**
* {@link Writer} implementation that outputs to a {@link StringBuilder}.
* <p>
* <strong>NOTE:</strong> This implementation, as an alternative to
* <code>java.io.StringWriter</code>, provides an <i>un-synchronized</i>
* (i.e. for use in a single thread) implementation for better performance.
* For safe usage with multiple {@link Thread}s then
* <code>java.io.StringWriter</code> should be used.
*/
class StringBuilderWriter extends Writer {
private final StringBuilder builder
/**
* Constructs a new writer backed by a {@link StringBuilder} with default capacity.
*/
StringBuilderWriter() {
this.builder = new StringBuilder()
}
/**
* Constructs a new writer backed by a {@link StringBuilder} with the specified capacity.
*
* @param capacity The initial capacity of the underlying {@link StringBuilder}
*/
StringBuilderWriter(int capacity) {
this.builder = new StringBuilder(capacity)
}
/**
* Constructs a new instance with the specified {@link StringBuilder}.
*
* <p>If {@code builder} is null a new instance with default capacity will be created.</p>
*
* @param builder The String builder. May be null.
*/
StringBuilderWriter(StringBuilder builder) {
this.builder = builder != null ? builder : new StringBuilder()
}
/**
* Appends a single character to this Writer.
*
* @param value The character to append
* @return This writer instance
*/
@Override
Writer append(char value) {
builder.append(value)
this
}
/**
* Appends a character sequence to this Writer.
*
* @param value The character sequence to append
* @return This writer instance
*/
@Override
Writer append(CharSequence value) {
builder.append(value)
this
}
/**
* Appends a portion of a character sequence to the {@link StringBuilder}.
*
* @param value The character sequence to append
* @param start The index of the first character
* @param end The index of the last character + 1
* @return This writer instance
*/
@Override
Writer append(CharSequence value, int start, int end) {
builder.append(value, start, end)
this
}
/**
* Closing this writer has no effect.
*/
@Override
void close() {
// no-op
}
/**
* Flushing this writer has no effect.
*/
@Override
void flush() {
// no-op
}
/**
* Writes a String to the {@link StringBuilder}.
*
* @param value The value to write
*/
@Override
void write(String value) {
if (value != null) {
builder.append(value)
}
}
/**
* Writes a portion of a character array to the {@link StringBuilder}.
*
* @param value The value to write
* @param offset The index of the first character
* @param length The number of characters to write
*/
@Override
void write(char[] value, int offset, int length) {
if (value != null) {
builder.append(value, offset, length)
}
}
/**
* Returns the underlying builder.
*
* @return The underlying builder
*/
StringBuilder getBuilder() {
builder
}
/**
* Returns {@link StringBuilder#toString()}.
*
* @return The contents of the String builder.
*/
@Override
String toString() {
builder.toString()
}
}
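A trivial sketch of the unsynchronized writer:
def writer = new StringBuilderWriter()
writer.write('hello ')
writer.append('world')
assert writer.toString() == 'hello world'
assert writer.builder instanceof StringBuilder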


@ -0,0 +1,44 @@
package org.xbib.gradle.plugin.shadow.internal
import org.gradle.api.Project
import org.gradle.api.file.FileCollection
import org.gradle.api.tasks.SourceSet
class UnusedTracker {
private final FileCollection toMinimize
private final List<ClazzpathUnit> projectUnits
private final Clazzpath clazzPath
private UnusedTracker(List<File> classDirs, FileCollection toMinimize) {
this.toMinimize = toMinimize
this.clazzPath = new Clazzpath()
this.projectUnits = classDirs.collect { clazzPath.addClazzpathUnit(it) }
}
Set<String> findUnused() {
Set<Clazz> unused = clazzPath.clazzes
for (cpu in projectUnits) {
unused.removeAll(cpu.clazzes)
unused.removeAll(cpu.transitiveDependencies)
}
unused.collect { it.name }.toSet()
}
void addDependency(File jarOrDir) {
if (toMinimize.contains(jarOrDir)) {
clazzPath.addClazzpathUnit(jarOrDir)
}
}
static UnusedTracker forProject(Project project, FileCollection toMinimize) {
final List<File> classDirs = new ArrayList<>()
for (SourceSet sourceSet in project.sourceSets) {
Iterable<File> classesDirs = sourceSet.output.classesDirs
classDirs.addAll(classesDirs.findAll { it.isDirectory() })
}
return new UnusedTracker(classDirs, toMinimize)
}
}
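A sketch of how the tracker could be driven from a build where the Java plugin is applied (the configuration name is an assumption):
def toMinimize = project.configurations.runtimeClasspath
def tracker = UnusedTracker.forProject(project, toMinimize)
toMinimize.files.each { tracker.addDependency(it) }
println "candidate unused classes: ${tracker.findUnused().size()}"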


@ -0,0 +1,629 @@
package org.xbib.gradle.plugin.shadow.internal
import java.nio.charset.Charset
import java.nio.file.Files
import java.nio.file.StandardOpenOption
class Utils {
private static final String EMPTY_STRING = ""
private static final Character EXTENSION_SEPARATOR = '.' as Character
private static final Character UNIX_SEPARATOR = '/' as Character
private static final Character WINDOWS_SEPARATOR = '\\' as Character
private static final Character SYSTEM_SEPARATOR = File.separatorChar as Character
private static final int NOT_FOUND = -1
private static final char OTHER_SEPARATOR
static {
if (SYSTEM_SEPARATOR == WINDOWS_SEPARATOR) {
OTHER_SEPARATOR = UNIX_SEPARATOR
} else {
OTHER_SEPARATOR = WINDOWS_SEPARATOR
}
}
static String separatorsToUnix(String path) {
if (path == null || path.indexOf(WINDOWS_SEPARATOR as String) == NOT_FOUND) {
return path
}
path.replace(WINDOWS_SEPARATOR, UNIX_SEPARATOR)
}
static String getExtension(String fileName) throws IllegalArgumentException {
if (fileName == null) {
return null
}
int index = indexOfExtension(fileName)
if (index == NOT_FOUND) {
return EMPTY_STRING
}
return fileName.substring(index + 1)
}
static int indexOfExtension(String fileName) throws IllegalArgumentException {
if (fileName == null) {
return NOT_FOUND
}
if (SYSTEM_SEPARATOR == WINDOWS_SEPARATOR) {
int offset = fileName.indexOf(':' as String, getAdsCriticalOffset(fileName))
if (offset != -1) {
throw new IllegalArgumentException("NTFS ADS separator (':') in file name is forbidden.")
}
}
int extensionPos = fileName.lastIndexOf(EXTENSION_SEPARATOR as String)
int lastSeparator = indexOfLastSeparator(fileName)
lastSeparator > extensionPos ? NOT_FOUND : extensionPos
}
static int getAdsCriticalOffset(String fileName) {
int offset1 = fileName.lastIndexOf(SYSTEM_SEPARATOR as String)
int offset2 = fileName.lastIndexOf(OTHER_SEPARATOR as String)
if (offset1 == -1) {
if (offset2 == -1) {
return 0
}
return offset2 + 1
}
if (offset2 == -1) {
return offset1 + 1
}
return Math.max(offset1, offset2) + 1
}
static int indexOfLastSeparator(String fileName) {
if (fileName == null) {
return NOT_FOUND
}
int lastUnixPos = fileName.lastIndexOf(UNIX_SEPARATOR as String)
int lastWindowsPos = fileName.lastIndexOf(WINDOWS_SEPARATOR as String)
return Math.max(lastUnixPos, lastWindowsPos)
}
static String normalize(String fileName) {
doNormalize(fileName, SYSTEM_SEPARATOR, true)
}
private static String doNormalize(String fileName, char separator, boolean keepSeparator) {
if (fileName == null) {
return null
}
failIfNullBytePresent(fileName)
int size = fileName.length()
if (size == 0) {
return fileName
}
int prefix = getPrefixLength(fileName)
if (prefix < 0) {
return null
}
char[] array = new char[size + 2]
fileName.getChars(0, fileName.length(), array, 0)
char otherSeparator = separator == SYSTEM_SEPARATOR ? OTHER_SEPARATOR : SYSTEM_SEPARATOR
for (int i = 0; i < array.length; i++) {
if (array[i] == otherSeparator) {
array[i] = separator
}
}
boolean lastIsDirectory = true
if (array[size - 1] != separator) {
array[size++] = separator
lastIsDirectory = false
}
for (int i = prefix + 1; i < size; i++) {
if (array[i] == separator && array[i - 1] == separator) {
System.arraycopy(array, i, array, i - 1, size - i)
size--
i--
}
}
char dot = '.' as char
for (int i = prefix + 1; i < size; i++) {
if (array[i] == separator && array[i - 1] == dot &&
(i == prefix + 1 || array[i - 2] == separator)) {
if (i == size - 1) {
lastIsDirectory = true
}
System.arraycopy(array, i + 1, array, i - 1, size - i)
size -= 2
i--
}
}
outer:
for (int i = prefix + 2; i < size; i++) {
if (array[i] == separator && array[i - 1] == dot && array[i - 2] == dot &&
(i == prefix + 2 || array[i - 3] == separator)) {
if (i == prefix + 2) {
return null
}
if (i == size - 1) {
lastIsDirectory = true
}
int j
for (j = i - 4 ; j >= prefix; j--) {
if (array[j] == separator) {
System.arraycopy(array, i + 1, array, j + 1, size - i)
size -= i - j
i = j + 1
continue outer
}
}
System.arraycopy(array, i + 1, array, prefix, size - i)
size -= i + 1 - prefix
i = prefix + 1
}
}
if (size <= 0) {
return EMPTY_STRING
}
if (size <= prefix) {
return new String(array, 0, size)
}
if (lastIsDirectory && keepSeparator) {
return new String(array, 0, size)
}
new String(array, 0, size - 1)
}
static int getPrefixLength(String fileName) {
if (fileName == null) {
return NOT_FOUND
}
int len = fileName.length()
if (len == 0) {
return 0
}
char ch0 = fileName.charAt(0) as char
if (ch0 == ':' as char) {
return NOT_FOUND
}
if (len == 1) {
if (ch0 == '~' as char) {
return 2
}
return isSeparator(ch0) ? 1 : 0
}
if (ch0 == '~' as char) {
int posUnix = fileName.indexOf(UNIX_SEPARATOR as String, 1)
int posWin = fileName.indexOf(WINDOWS_SEPARATOR as String, 1)
if (posUnix == NOT_FOUND && posWin == NOT_FOUND) {
return len + 1
}
posUnix = posUnix == NOT_FOUND ? posWin : posUnix
posWin = posWin == NOT_FOUND ? posUnix : posWin
return Math.min(posUnix, posWin) + 1
}
char ch1 = fileName.charAt(1) as char
if (ch1 == ':' as char) {
ch0 = Character.toUpperCase(ch0)
if (ch0 >= ('A' as char) && ch0 <= ('Z' as char)) {
if (len == 2 || !isSeparator(fileName.charAt(2))) {
return 2
}
return 3
} else if (ch0 == UNIX_SEPARATOR) {
return 1
}
return NOT_FOUND
} else if (isSeparator(ch0) && isSeparator(ch1)) {
int posUnix = fileName.indexOf(UNIX_SEPARATOR as String, 2)
int posWin = fileName.indexOf(WINDOWS_SEPARATOR as String, 2)
if (posUnix == NOT_FOUND && posWin == NOT_FOUND || posUnix == 2 || posWin == 2) {
return NOT_FOUND
}
posUnix = posUnix == NOT_FOUND ? posWin : posUnix
posWin = posWin == NOT_FOUND ? posUnix : posWin
int pos = Math.min(posUnix, posWin) + 1
String hostnamePart = fileName.substring(2, pos - 1)
return isValidHostName(hostnamePart) ? pos : NOT_FOUND
} else {
return isSeparator(ch0) ? 1 : 0
}
}
static boolean isSeparator(char ch) {
return ch == UNIX_SEPARATOR || ch == WINDOWS_SEPARATOR
}
private static void failIfNullBytePresent(String path) {
int len = path.length()
for (int i = 0; i < len; i++) {
if (path.charAt(i) == 0 as char) {
throw new IllegalArgumentException("Null byte present in file/path name. There are no " +
"known legitimate use cases for such data, but several injection attacks may use it")
}
}
}
static String removeExtension(String fileName) {
if (fileName == null) {
return null
}
failIfNullBytePresent(fileName)
int index = indexOfExtension(fileName)
if (index == NOT_FOUND) {
return fileName
}
fileName.substring(0, index)
}
private static final int DEFAULT_BUFFER_SIZE = 1024 * 4
static long copyLarge(Reader input, Writer output) throws IOException {
return copyLarge(input, output, new char[DEFAULT_BUFFER_SIZE])
}
static long copyLarge(Reader input, Writer output, final char[] buffer) throws IOException {
long count = 0
int n
while (-1 != (n = input.read(buffer))) {
output.write(buffer, 0, n)
count += n
}
count
}
static long copyLarge(InputStream input, OutputStream output) throws IOException {
return copy(input, output, DEFAULT_BUFFER_SIZE);
}
static long copy(InputStream input, OutputStream output, int bufferSize) throws IOException {
return copyLarge(input, output, new byte[bufferSize])
}
static long copyLarge(InputStream input, OutputStream output, byte[] buffer) throws IOException {
long count = 0
int n
while (-1 != (n = input.read(buffer))) {
output.write(buffer, 0, n)
count += n
}
return count
}
static void writeStringToFile(File file, String data, Charset encoding) throws IOException {
writeStringToFile(file, data, encoding, false)
}
static void writeStringToFile(File file, String data, Charset encoding, boolean append) throws IOException {
OutputStream out = append ?
Files.newOutputStream(file.toPath(), StandardOpenOption.APPEND) : Files.newOutputStream(file.toPath())
out.withCloseable { outputStream ->
write(data, outputStream, encoding)
}
}
static void write(String data, OutputStream output, Charset encoding) throws IOException {
if (data != null) {
output.write(data.getBytes(encoding))
}
}
static String readFileToString(File file, Charset encoding) throws IOException {
InputStream inputStream = Files.newInputStream(file.toPath())
inputStream.withCloseable { is ->
return toString(is, encoding)
}
}
static String toString(InputStream input, Charset encoding) throws IOException {
StringBuilderWriter sw = new StringBuilderWriter()
sw.withCloseable { writer ->
copyLarge(new InputStreamReader(input, encoding), writer)
return sw.toString()
}
}
static byte[] readFileToByteArray(File file) throws IOException {
InputStream inputStream = Files.newInputStream(file.toPath())
inputStream.withCloseable { is ->
long fileLength = file.length()
return fileLength > 0 ? toByteArray(is, fileLength) : toByteArray(is)
}
}
static byte[] toByteArray(InputStream input, long size) throws IOException {
if (size > Integer.MAX_VALUE) {
throw new IllegalArgumentException("Size cannot be greater than Integer max value: " + size)
}
return toByteArray(input, (int) size)
}
static byte[] toByteArray(InputStream input, int size) throws IOException {
if (size < 0) {
throw new IllegalArgumentException("Size must be equal or greater than zero: " + size)
}
if (size == 0) {
return new byte[0]
}
byte[] data = new byte[size]
int offset = 0
int read = 0
while (offset < size && (read = input.read(data, offset, size - offset)) != -1) {
offset += read
}
if (offset != size) {
throw new IOException("Unexpected read size. current: " + offset + ", expected: " + size)
}
data
}
static byte[] toByteArray(InputStream inputStream) throws IOException {
ByteArrayOutputStream outputStream = new ByteArrayOutputStream()
outputStream.withCloseable {
copyLarge(inputStream, outputStream)
return outputStream.toByteArray()
}
}
private static final String PATTERN_HANDLER_PREFIX = "["
private static final String PATTERN_HANDLER_SUFFIX = "]"
private static final String REGEX_HANDLER_PREFIX = "%regex" + PATTERN_HANDLER_PREFIX
private static final String ANT_HANDLER_PREFIX = "%ant" + PATTERN_HANDLER_PREFIX
static boolean matchPath(String pattern, String str, boolean isCaseSensitive) {
matchPath(pattern, str, File.separator, isCaseSensitive)
}
static boolean matchPath(String pattern, String str, String separator, boolean isCaseSensitive) {
if (isRegexPrefixedPattern(pattern)) {
pattern = pattern.substring(REGEX_HANDLER_PREFIX.length(), pattern.length() - PATTERN_HANDLER_SUFFIX.length())
return str.matches(pattern)
} else {
if (isAntPrefixedPattern(pattern)) {
pattern = pattern.substring(ANT_HANDLER_PREFIX.length(),
pattern.length() - PATTERN_HANDLER_SUFFIX.length())
}
return matchAntPathPattern(pattern, str, separator, isCaseSensitive)
}
}
static boolean isRegexPrefixedPattern(String pattern) {
pattern.length() > (REGEX_HANDLER_PREFIX.length() + PATTERN_HANDLER_SUFFIX.length() + 1) &&
pattern.startsWith(REGEX_HANDLER_PREFIX) && pattern.endsWith( PATTERN_HANDLER_SUFFIX)
}
static boolean isAntPrefixedPattern(String pattern) {
pattern.length() > (ANT_HANDLER_PREFIX.length() + PATTERN_HANDLER_SUFFIX.length() + 1) &&
pattern.startsWith(ANT_HANDLER_PREFIX) && pattern.endsWith(PATTERN_HANDLER_SUFFIX)
}
static boolean matchAntPathPattern(String pattern, String str, String separator, boolean isCaseSensitive) {
if (separatorPatternStartSlashMismatch(pattern, str, separator)) {
return false
}
List<String> patDirs = tokenizePathToString(pattern, separator)
List<String> strDirs = tokenizePathToString(str, separator)
matchAntPathPattern(patDirs, strDirs, isCaseSensitive)
}
static boolean separatorPatternStartSlashMismatch(String pattern, String str, String separator) {
str.startsWith(separator) != pattern.startsWith(separator)
}
static List<String> tokenizePathToString(String path, String separator) {
List<String> ret = []
StringTokenizer st = new StringTokenizer(path, separator)
while (st.hasMoreTokens()) {
ret.add(st.nextToken())
}
ret
}
static boolean matchAntPathPattern(List<String> patDirs, List<String> strDirs, boolean isCaseSensitive) {
int patIdxStart = 0
int patIdxEnd = patDirs.size() - 1
int strIdxStart = 0
int strIdxEnd = strDirs.size() - 1
while (patIdxStart <= patIdxEnd && strIdxStart <= strIdxEnd) {
String patDir = patDirs.get(patIdxStart)
if (patDir.equals("**")) {
break
}
if (!match(patDir, strDirs.get(strIdxStart), isCaseSensitive)) {
return false
}
patIdxStart++
strIdxStart++
}
if (strIdxStart > strIdxEnd) {
for (int i = patIdxStart; i <= patIdxEnd; i++) {
if (!patDirs.get(i).equals( "**")) {
return false
}
}
return true
} else {
if (patIdxStart > patIdxEnd) {
return false
}
}
while (patIdxStart <= patIdxEnd && strIdxStart <= strIdxEnd) {
String patDir = patDirs.get(patIdxEnd)
if (patDir.equals("**")) {
break
}
if (!match(patDir, strDirs.get(strIdxEnd), isCaseSensitive)) {
return false
}
patIdxEnd--
strIdxEnd--
}
if (strIdxStart > strIdxEnd) {
for (int i = patIdxStart; i <= patIdxEnd; i++) {
if (!patDirs.get(i).equals("**")) {
return false
}
}
return true
}
while (patIdxStart != patIdxEnd && strIdxStart <= strIdxEnd) {
int patIdxTmp = -1
for (int i = patIdxStart + 1; i <= patIdxEnd; i++) {
if (patDirs.get(i).equals("**")) {
patIdxTmp = i
break
}
}
if (patIdxTmp == patIdxStart + 1) {
patIdxStart++
continue
}
int patLength = (patIdxTmp - patIdxStart - 1)
int strLength = (strIdxEnd - strIdxStart + 1)
int foundIdx = -1;
strLoop:
for (int i = 0; i <= strLength - patLength; i++) {
for (int j = 0; j < patLength; j++) {
String subPat = patDirs.get(patIdxStart + j + 1)
String subStr = strDirs.get(strIdxStart + i + j)
if (!match(subPat, subStr, isCaseSensitive)) {
continue strLoop
}
}
foundIdx = strIdxStart + i
break
}
if (foundIdx == -1) {
return false
}
patIdxStart = patIdxTmp
strIdxStart = foundIdx + patLength
}
for (int i = patIdxStart; i <= patIdxEnd; i++) {
if (!patDirs.get(i).equals("**")) {
return false
}
}
return true
}
static boolean match(String pattern, String str, boolean isCaseSensitive) {
char[] patArr = pattern.toCharArray()
char[] strArr = str.toCharArray()
match(patArr, strArr, isCaseSensitive)
}
static boolean match(char[] patArr, char[] strArr, boolean isCaseSensitive) {
int patIdxStart = 0
int patIdxEnd = patArr.length - 1
int strIdxStart = 0
int strIdxEnd = strArr.length - 1
char ch
boolean containsStar = false
for (char aPatArr : patArr) {
if (aPatArr == ('*' as char)) {
containsStar = true
break
}
}
if (!containsStar) {
if (patIdxEnd != strIdxEnd) {
return false
}
for (int i = 0; i <= patIdxEnd; i++) {
ch = patArr[i]
if ( ch != ('?' as char) && !equals(ch, strArr[i], isCaseSensitive)) {
return false
}
}
return true
}
if (patIdxEnd == 0) {
return true
}
while ((ch = patArr[patIdxStart]) != ('*' as char) && strIdxStart <= strIdxEnd) {
if ( ch != ('?' as char) && !equals( ch, strArr[strIdxStart], isCaseSensitive)) {
return false
}
patIdxStart++
strIdxStart++
}
if (strIdxStart > strIdxEnd) {
for (int i = patIdxStart; i <= patIdxEnd; i++) {
if ( patArr[i] != ('*' as char)) {
return false
}
}
return true
}
while ((ch = patArr[patIdxEnd] ) != ('*' as char) && strIdxStart <= strIdxEnd) {
if (ch != ('?' as char) && !equals( ch, strArr[strIdxEnd], isCaseSensitive)) {
return false
}
patIdxEnd--
strIdxEnd--
}
if (strIdxStart > strIdxEnd) {
for (int i = patIdxStart; i <= patIdxEnd; i++) {
if ( patArr[i] != ('*' as char)) {
return false
}
}
return true
}
while (patIdxStart != patIdxEnd && strIdxStart <= strIdxEnd) {
int patIdxTmp = -1
for (int i = patIdxStart + 1; i <= patIdxEnd; i++) {
if ( patArr[i] == ('*' as char)) {
patIdxTmp = i
break
}
}
if (patIdxTmp == patIdxStart + 1) {
patIdxStart++
continue
}
int patLength = ( patIdxTmp - patIdxStart - 1 )
int strLength = ( strIdxEnd - strIdxStart + 1 )
int foundIdx = -1
strLoop:
for (int i = 0; i <= strLength - patLength; i++) {
for (int j = 0; j < patLength; j++) {
ch = patArr[patIdxStart + j + 1]
if (ch != ('?' as char) && !equals( ch, strArr[strIdxStart + i + j], isCaseSensitive)) {
continue strLoop
}
}
foundIdx = strIdxStart + i
break
}
if (foundIdx == -1) {
return false
}
patIdxStart = patIdxTmp
strIdxStart = foundIdx + patLength
}
for (int i = patIdxStart; i <= patIdxEnd; i++) {
if (patArr[i] != ('*' as char)) {
return false
}
}
return true
}
static boolean equals(char c1, char c2, boolean isCaseSensitive) {
if (c1 == c2) {
return true
}
if (!isCaseSensitive) {
if (Character.toUpperCase(c1) == Character.toUpperCase(c2)
|| Character.toLowerCase(c1) == Character.toLowerCase(c2)) {
return true
}
}
return false
}
}
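A few sketched assertions showing the Ant-style and %regex[...] matching implemented above:
// '**' spans directory levels, '*' and '?' match within one path segment
assert Utils.matchPath('org/**/*.class', 'org/example/Foo.class', '/', true)
// '%regex[...]' delegates to String.matches()
assert Utils.matchPath('%regex[.*Test\\.class]', 'FooTest.class', '/', true)
// a leading-separator mismatch between pattern and path never matches
assert !Utils.matchPath('org/**', '/org/example', '/', true)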


@ -0,0 +1,11 @@
package org.xbib.gradle.plugin.shadow.internal
import org.gradle.api.internal.file.archive.compression.ArchiveOutputStreamFactory
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
interface ZipCompressor extends ArchiveOutputStreamFactory {
@Override
ZipOutputStream createArchiveOutputStream(File destination)
}


@ -0,0 +1,14 @@
package org.xbib.gradle.plugin.shadow.relocation
import org.xbib.gradle.plugin.shadow.ShadowStats
import groovy.transform.Canonical
import groovy.transform.builder.Builder
@Canonical
@Builder
class RelocateClassContext {
String className
ShadowStats stats
}


@ -0,0 +1,13 @@
package org.xbib.gradle.plugin.shadow.relocation
import org.xbib.gradle.plugin.shadow.ShadowStats
import groovy.transform.Canonical
import groovy.transform.builder.Builder
@Canonical
@Builder
class RelocatePathContext {
String path
ShadowStats stats
}


@ -0,0 +1,17 @@
package org.xbib.gradle.plugin.shadow.relocation
/**
* Modified from org.apache.maven.plugins.shade.relocation.Relocator
*/
interface Relocator {
boolean canRelocatePath(RelocatePathContext context)
String relocatePath(RelocatePathContext context)
boolean canRelocateClass(RelocateClassContext context)
String relocateClass(RelocateClassContext context)
String applyToSourceContent(String sourceContent)
}


@ -0,0 +1,156 @@
package org.xbib.gradle.plugin.shadow.relocation
import org.xbib.gradle.plugin.shadow.internal.Utils
import java.util.regex.Pattern
/**
* Modified from org.apache.maven.plugins.shade.relocation.SimpleRelocator
*/
class SimpleRelocator implements Relocator {
private final String pattern
private final String pathPattern
private final String shadedPattern
private final String shadedPathPattern
private final Set<String> includes
private final Set<String> excludes
private final boolean rawString
SimpleRelocator(String patt, String shadedPattern, List<String> includes, List<String> excludes) {
this(patt, shadedPattern, includes, excludes, false)
}
SimpleRelocator(String patt, String shadedPattern, List<String> includes, List<String> excludes,
boolean rawString) {
this.rawString = rawString
if (rawString) {
this.pathPattern = patt
this.shadedPathPattern = shadedPattern
this.pattern = null // not used for raw string relocator
this.shadedPattern = null // not used for raw string relocator
} else {
if (patt == null) {
this.pattern = ""
this.pathPattern = ""
} else {
this.pattern = patt.replace('/', '.')
this.pathPattern = patt.replace('.', '/')
}
if (shadedPattern != null) {
this.shadedPattern = shadedPattern.replace('/', '.')
this.shadedPathPattern = shadedPattern.replace('.', '/')
} else {
this.shadedPattern = "hidden." + this.pattern
this.shadedPathPattern = "hidden/" + this.pathPattern
}
}
this.includes = normalizePatterns(includes)
this.excludes = normalizePatterns(excludes)
}
SimpleRelocator include(String pattern) {
this.includes.addAll normalizePatterns([pattern])
return this
}
SimpleRelocator exclude(String pattern) {
this.excludes.addAll normalizePatterns([pattern])
return this
}
private static Set<String> normalizePatterns(Collection<String> patterns) {
Set<String> normalized = null
if (patterns != null && !patterns.isEmpty()) {
normalized = new LinkedHashSet<String>()
for (String pattern : patterns) {
String classPattern = pattern.replace('.', '/')
normalized.add(classPattern)
if (classPattern.endsWith("/*")) {
String packagePattern = classPattern.substring(0, classPattern.lastIndexOf('/'))
normalized.add(packagePattern)
}
}
}
return normalized ?: []
}
private boolean isIncluded(String path) {
if (includes != null && !includes.isEmpty()) {
for (String include : includes) {
if (Utils.matchPath(include, path, true)) {
return true
}
}
return false
}
return true
}
private boolean isExcluded(String path) {
if (excludes != null && !excludes.isEmpty()) {
for (String exclude : excludes) {
if (Utils.matchPath(exclude, path, true)) {
return true
}
}
}
return false
}
boolean canRelocatePath(RelocatePathContext context) {
String path = context.path
if (rawString) {
return Pattern.compile(pathPattern).matcher(path).find()
}
if (path.endsWith(".class")) {
path = path.substring(0, path.length() - 6)
}
if (!isIncluded(path) || isExcluded(path)) {
return false
}
// Allow for the annoying option of an extra / on the front of a path.
// See MSHADE-119; this comes from getClass().getResource("/a/b/c.properties").
return path.startsWith(pathPattern) || path.startsWith("/" + pathPattern)
}
boolean canRelocateClass(RelocateClassContext context) {
String clazz = context.className
RelocatePathContext pathContext = RelocatePathContext.builder().path(clazz.replace('.', '/')).stats(context.stats).build()
return !rawString && clazz.indexOf('/') < 0 && canRelocatePath(pathContext)
}
String relocatePath(RelocatePathContext context) {
String path = context.path
context.stats.relocate(pathPattern, shadedPathPattern)
if (rawString) {
return path.replaceAll(pathPattern, shadedPathPattern)
} else {
return path.replaceFirst(pathPattern, shadedPathPattern)
}
}
String relocateClass(RelocateClassContext context) {
String clazz = context.className
context.stats.relocate(pathPattern, shadedPathPattern)
return clazz.replaceFirst(pattern, shadedPattern)
}
String applyToSourceContent(String sourceContent) {
if (rawString) {
return sourceContent
} else {
return sourceContent.replaceAll("\\b" + pattern, shadedPattern)
}
}
}
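A sketch of class relocation with SimpleRelocator (again assuming ShadowStats offers a no-arg constructor):
def stats = new ShadowStats()   // assumption: default constructor
def relocator = new SimpleRelocator('junit.framework', 'shadow.junit', null, null)
def ctx = RelocateClassContext.builder().className('junit.framework.Test').stats(stats).build()
assert relocator.canRelocateClass(ctx)
assert relocator.relocateClass(ctx) == 'shadow.junit.Test'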


@ -0,0 +1,50 @@
package org.xbib.gradle.plugin.shadow.tasks
import org.gradle.api.DefaultTask
import org.gradle.api.Task
import org.gradle.api.artifacts.Configuration
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.Optional
import org.gradle.api.tasks.TaskAction
import java.util.jar.JarFile
class ConfigureShadowRelocation extends DefaultTask {
@Input
ShadowJar target
@Input
String prefix = "shadow"
@InputFiles @Optional
List<Configuration> getConfigurations() {
return target.configurations
}
@TaskAction
void configureRelocation() {
def packages = [] as Set<String>
configurations.each { configuration ->
configuration.files.each { jar ->
JarFile jf = new JarFile(jar)
jf.entries().each { entry ->
if (entry.name.endsWith(".class")) {
packages << entry.name[0..entry.name.lastIndexOf('/')-1].replaceAll('/', '.')
}
}
jf.close()
}
}
packages.each {
target.relocate(it, "${prefix}.${it}")
}
}
static String taskName(Task task) {
return "configureRelocation${task.name.capitalize()}"
}
}
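Wiring in a build script might look like the following sketch, mirroring the upstream shadow plugin's documented pattern (assumes this plugin registers a shadowJar task; the prefix is a placeholder):
import org.xbib.gradle.plugin.shadow.tasks.ConfigureShadowRelocation
task relocateShadowJar(type: ConfigureShadowRelocation) {
    target = tasks.shadowJar
    prefix = 'myapp.shaded'   // placeholder prefix
}
tasks.shadowJar.dependsOn tasks.relocateShadowJar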


@ -0,0 +1,99 @@
package org.xbib.gradle.plugin.shadow.tasks
import org.gradle.api.Action
import org.gradle.api.internal.file.FileResolver
import org.gradle.api.java.archives.Attributes
import org.gradle.api.java.archives.Manifest
import org.gradle.api.java.archives.ManifestException
import org.gradle.api.java.archives.ManifestMergeSpec
import org.gradle.api.java.archives.internal.DefaultManifest
import org.gradle.api.java.archives.internal.DefaultManifestMergeSpec
import org.gradle.util.ConfigureUtil
class DefaultInheritManifest implements InheritManifest {
private List<DefaultManifestMergeSpec> inheritMergeSpecs = []
private final FileResolver fileResolver
private final Manifest internalManifest
DefaultInheritManifest(FileResolver fileResolver) {
this.internalManifest = new DefaultManifest(fileResolver)
this.fileResolver = fileResolver
}
InheritManifest inheritFrom(Object... inheritPaths) {
inheritFrom(inheritPaths, null)
this
}
InheritManifest inheritFrom(Object inheritPaths, Closure closure) {
DefaultManifestMergeSpec mergeSpec = new DefaultManifestMergeSpec()
mergeSpec.from(inheritPaths)
inheritMergeSpecs.add(mergeSpec)
ConfigureUtil.configure(closure, mergeSpec)
this
}
@Override
Attributes getAttributes() {
internalManifest.getAttributes()
}
@Override
Map<String, Attributes> getSections() {
internalManifest.getSections()
}
@Override
Manifest attributes(Map<String, ?> map) throws ManifestException {
internalManifest.attributes(map)
this
}
@Override
Manifest attributes(Map<String, ?> map, String s) throws ManifestException {
internalManifest.attributes(map, s)
this
}
@Override
DefaultManifest getEffectiveManifest() {
DefaultManifest base = new DefaultManifest(fileResolver)
inheritMergeSpecs.each {
base = it.merge(base, fileResolver)
}
base.from internalManifest
base.getEffectiveManifest()
}
Manifest writeTo(Writer writer) {
this.getEffectiveManifest().writeTo((Object) writer)
this
}
@Override
Manifest writeTo(Object o) {
this.getEffectiveManifest().writeTo(o)
this
}
@Override
Manifest from(Object... objects) {
internalManifest.from(objects)
this
}
@Override
Manifest from(Object o, Closure<?> closure) {
internalManifest.from(o, closure)
this
}
@Override
Manifest from(Object o, Action<ManifestMergeSpec> action) {
internalManifest.from(o, action)
this
}
}
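In a build script this typically surfaces as inheritFrom on the shadow jar manifest, sketched below (assumes the ShadowJar task exposes an InheritManifest, as in the upstream plugin; the attribute value is a placeholder):
shadowJar {
    manifest {
        inheritFrom project.tasks.jar.manifest
        attributes 'Main-Class': 'com.example.Main'   // placeholder
    }
}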


@ -0,0 +1,10 @@
package org.xbib.gradle.plugin.shadow.tasks
import org.gradle.api.java.archives.Manifest
interface InheritManifest extends Manifest {
InheritManifest inheritFrom(Object... inheritPaths)
InheritManifest inheritFrom(Object inheritPaths, Closure closure)
}


@ -0,0 +1,503 @@
package org.xbib.gradle.plugin.shadow.tasks
import org.gradle.api.Action
import org.gradle.api.GradleException
import org.gradle.api.UncheckedIOException
import org.gradle.api.file.FileCopyDetails
import org.gradle.api.file.FileTreeElement
import org.gradle.api.file.RelativePath
import org.gradle.api.internal.DocumentationRegistry
import org.gradle.api.internal.file.CopyActionProcessingStreamAction
import org.gradle.api.internal.file.copy.CopyAction
import org.gradle.api.internal.file.copy.CopyActionProcessingStream
import org.gradle.api.internal.file.copy.FileCopyDetailsInternal
import org.gradle.api.logging.Logger
import org.gradle.api.specs.Spec
import org.gradle.api.tasks.WorkResult
import org.gradle.api.tasks.WorkResults
import org.gradle.api.tasks.bundling.Zip
import org.gradle.api.tasks.util.PatternSet
import org.gradle.internal.UncheckedException
import org.objectweb.asm.ClassReader
import org.objectweb.asm.ClassVisitor
import org.objectweb.asm.ClassWriter
import org.objectweb.asm.commons.ClassRemapper
import org.xbib.gradle.plugin.shadow.ShadowStats
import org.xbib.gradle.plugin.shadow.impl.RelocatorRemapper
import org.xbib.gradle.plugin.shadow.internal.UnusedTracker
import org.xbib.gradle.plugin.shadow.internal.Utils
import org.xbib.gradle.plugin.shadow.internal.ZipCompressor
import org.xbib.gradle.plugin.shadow.relocation.Relocator
import org.xbib.gradle.plugin.shadow.transformers.Transformer
import org.xbib.gradle.plugin.shadow.transformers.TransformerContext
import org.xbib.gradle.plugin.shadow.zip.UnixStat
import org.xbib.gradle.plugin.shadow.zip.Zip64RequiredException
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipFile
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
import java.time.LocalDate
import java.time.OffsetDateTime
import java.util.zip.ZipException
class ShadowCopyAction implements CopyAction {
static final long CONSTANT_TIME_FOR_ZIP_ENTRIES = LocalDate.of(1980, 1, 1).atStartOfDay()
.toInstant(OffsetDateTime.now().getOffset()).toEpochMilli()
private final Logger log
private final File zipFile
private final ZipCompressor compressor
private final DocumentationRegistry documentationRegistry
private final List<Transformer> transformers
private final List<Relocator> relocators
private final PatternSet patternSet
private final ShadowStats stats
private final String encoding
private final boolean preserveFileTimestamps
private final boolean minimizeJar
private final UnusedTracker unusedTracker
ShadowCopyAction(Logger log, File zipFile, ZipCompressor compressor, DocumentationRegistry documentationRegistry,
String encoding, List<Transformer> transformers, List<Relocator> relocators,
PatternSet patternSet, ShadowStats stats,
boolean preserveFileTimestamps, boolean minimizeJar, UnusedTracker unusedTracker) {
this.log = log
this.zipFile = zipFile
this.compressor = compressor
this.documentationRegistry = documentationRegistry
this.transformers = transformers
this.relocators = relocators
this.patternSet = patternSet
this.stats = stats
this.encoding = encoding
this.preserveFileTimestamps = preserveFileTimestamps
this.minimizeJar = minimizeJar
this.unusedTracker = unusedTracker
}
@Override
WorkResult execute(CopyActionProcessingStream stream) {
Set<String> unusedClasses
if (minimizeJar) {
stream.process(new BaseStreamAction() {
@Override
void visitFile(FileCopyDetails fileDetails) {
if (isArchive(fileDetails)) {
unusedTracker.addDependency(fileDetails.file)
}
}
})
unusedClasses = unusedTracker.findUnused()
} else {
unusedClasses = Collections.emptySet()
}
try {
ZipOutputStream zipOutStr = compressor.createArchiveOutputStream(zipFile)
withResource(zipOutStr, new Action<ZipOutputStream>() {
void execute(ZipOutputStream outputStream) {
try {
stream.process(new StreamAction(outputStream, encoding, transformers, relocators, patternSet,
unusedClasses, stats))
processTransformers(outputStream)
} catch (Exception e) {
log.error(e.getMessage() as String, e)
throw e
}
}
})
} catch (UncheckedIOException e) {
if (e.cause instanceof Zip64RequiredException) {
throw new Zip64RequiredException(
String.format("%s\n\nTo build this archive, please enable the zip64 extension.\nSee: %s",
e.cause.message, documentationRegistry.getDslRefForProperty(Zip, "zip64"))
)
}
} catch (Exception e) {
throw new GradleException("could not create zip '${zipFile.toString()}'", e)
}
return WorkResults.didWork(true)
}
private void processTransformers(ZipOutputStream stream) {
transformers.each { Transformer transformer ->
if (transformer.hasTransformedResource()) {
transformer.modifyOutputStream(stream, preserveFileTimestamps)
}
}
}
private long getArchiveTimeFor(long timestamp) {
return preserveFileTimestamps ? timestamp : CONSTANT_TIME_FOR_ZIP_ENTRIES
}
private ZipEntry setArchiveTimes(ZipEntry zipEntry) {
if (!preserveFileTimestamps) {
zipEntry.setTime(CONSTANT_TIME_FOR_ZIP_ENTRIES)
}
return zipEntry
}
private static <T extends Closeable> void withResource(T resource, Action<? super T> action) {
try {
action.execute(resource)
} catch (Throwable t) {
try {
resource.close()
} catch (IOException e) {
// Ignored
}
throw UncheckedException.throwAsUncheckedException(t)
}
try {
resource.close()
} catch (IOException e) {
throw new UncheckedIOException(e)
}
}
abstract class BaseStreamAction implements CopyActionProcessingStreamAction {
protected boolean isArchive(FileCopyDetails fileDetails) {
return fileDetails.relativePath.pathString.endsWith('.jar')
}
protected boolean isClass(FileCopyDetails fileDetails) {
return Utils.getExtension(fileDetails.path) == 'class'
}
@Override
void processFile(FileCopyDetailsInternal details) {
if (details.directory) {
visitDir(details)
} else {
visitFile(details)
}
}
protected void visitDir(FileCopyDetails dirDetails) {}
protected abstract void visitFile(FileCopyDetails fileDetails)
}
private class StreamAction extends BaseStreamAction {
private final ZipOutputStream zipOutStr
private final List<Transformer> transformers
private final List<Relocator> relocators
private final RelocatorRemapper remapper
private final PatternSet patternSet
private final Set<String> unused
private final ShadowStats stats
private Set<String> visitedFiles = new HashSet<String>()
StreamAction(ZipOutputStream zipOutStr, String encoding, List<Transformer> transformers,
List<Relocator> relocators, PatternSet patternSet, Set<String> unused,
ShadowStats stats) {
this.zipOutStr = zipOutStr
this.transformers = transformers
this.relocators = relocators
this.remapper = new RelocatorRemapper(relocators, stats)
this.patternSet = patternSet
this.unused = unused
this.stats = stats
if(encoding != null) {
this.zipOutStr.setEncoding(encoding)
}
}
private boolean recordVisit(RelativePath path) {
return visitedFiles.add(path.pathString)
}
@Override
void visitFile(FileCopyDetails fileDetails) {
if (!isArchive(fileDetails)) {
try {
boolean isClass = isClass(fileDetails)
if (!remapper.hasRelocators() || !isClass) {
if (!isTransformable(fileDetails)) {
String mappedPath = remapper.map(fileDetails.relativePath.pathString)
ZipEntry archiveEntry = new ZipEntry(mappedPath)
archiveEntry.setTime(getArchiveTimeFor(fileDetails.lastModified))
archiveEntry.unixMode = (UnixStat.FILE_FLAG | fileDetails.mode)
zipOutStr.putNextEntry(archiveEntry)
fileDetails.copyTo(zipOutStr)
zipOutStr.closeEntry()
} else {
transform(fileDetails)
}
} else if (isClass && !isUnused(fileDetails.path)) {
remapClass(fileDetails)
}
recordVisit(fileDetails.relativePath)
} catch (Exception e) {
throw new GradleException(String.format("Could not add %s to ZIP '%s'.", fileDetails, zipFile), e)
}
} else {
processArchive(fileDetails)
}
}
private void processArchive(FileCopyDetails fileDetails) {
stats.startJar()
ZipFile archive = new ZipFile(fileDetails.file)
List<ArchiveFileTreeElement> archiveElements = archive.entries.collect {
new ArchiveFileTreeElement(new RelativeArchivePath(it, fileDetails))
}
Spec<FileTreeElement> patternSpec = patternSet.getAsSpec()
List<ArchiveFileTreeElement> filteredArchiveElements = archiveElements.findAll { ArchiveFileTreeElement archiveElement ->
patternSpec.isSatisfiedBy(archiveElement)
}
filteredArchiveElements.each { ArchiveFileTreeElement archiveElement ->
if (archiveElement.relativePath.file) {
visitArchiveFile(archiveElement, archive)
}
}
archive.close()
stats.finishJar()
}
private void visitArchiveDirectory(RelativeArchivePath archiveDir) {
if (recordVisit(archiveDir)) {
zipOutStr.putNextEntry(archiveDir.entry)
zipOutStr.closeEntry()
}
}
private void visitArchiveFile(ArchiveFileTreeElement archiveFile, ZipFile archive) {
def archiveFilePath = archiveFile.relativePath
if (archiveFile.classFile || !isTransformable(archiveFile)) {
if (recordVisit(archiveFilePath) && !isUnused(archiveFilePath.entry.name)) {
if (!remapper.hasRelocators() || !archiveFile.classFile) {
copyArchiveEntry(archiveFilePath, archive)
} else {
remapClass(archiveFilePath, archive)
}
}
} else {
transform(archiveFile, archive)
}
}
private void addParentDirectories(RelativeArchivePath file) {
if (file) {
addParentDirectories(file.parent)
if (!file.file) {
visitArchiveDirectory(file)
}
}
}
private boolean isUnused(String classPath) {
final String className = Utils.removeExtension(classPath)
.replace('/' as char, '.' as char)
final boolean result = unused.contains(className)
if (result) {
log.debug("dropping unused class: $className")
}
return result
}
private void remapClass(RelativeArchivePath file, ZipFile archive) {
if (file.classFile) {
ZipEntry zipEntry = setArchiveTimes(new ZipEntry(remapper.mapPath(file) + '.class'))
addParentDirectories(new RelativeArchivePath(zipEntry, null))
InputStream is = archive.getInputStream(file.entry)
try {
remapClass(is, file.pathString, file.entry.time)
} finally {
is.close()
}
}
}
private void remapClass(FileCopyDetails fileCopyDetails) {
if (Utils.getExtension(fileCopyDetails.name) == 'class') {
remapClass(fileCopyDetails.file.newInputStream(), fileCopyDetails.path, fileCopyDetails.lastModified)
}
}
private void remapClass(InputStream classInputStream, String path, long lastModified) {
InputStream is = classInputStream
ClassReader cr = new ClassReader(is)
ClassWriter cw = new ClassWriter(0)
ClassVisitor cv = new ClassRemapper(cw, remapper)
try {
cr.accept(cv, ClassReader.EXPAND_FRAMES)
} catch (Throwable t) {
throw new GradleException("error in ASM processing class " + path, t)
}
byte[] renamedClass = cw.toByteArray()
String mappedName = remapper.mapPath(path)
InputStream bis = new ByteArrayInputStream(renamedClass)
try {
ZipEntry archiveEntry = new ZipEntry(mappedName + ".class")
archiveEntry.setTime(getArchiveTimeFor(lastModified))
zipOutStr.putNextEntry(archiveEntry)
Utils.copyLarge(bis, zipOutStr)
zipOutStr.closeEntry()
} catch (ZipException e) {
log.warn("there is a duplicate " + mappedName + " in source project")
} finally {
bis.close()
}
}
private void copyArchiveEntry(RelativeArchivePath archiveFile, ZipFile archive) {
String mappedPath = remapper.map(archiveFile.entry.name)
ZipEntry entry = new ZipEntry(mappedPath)
entry.setTime(getArchiveTimeFor(archiveFile.entry.time))
RelativeArchivePath mappedFile = new RelativeArchivePath(entry, archiveFile.details)
addParentDirectories(mappedFile)
zipOutStr.putNextEntry(mappedFile.entry)
InputStream is = archive.getInputStream(archiveFile.entry)
try {
Utils.copyLarge(is, zipOutStr)
} finally {
is.close()
}
zipOutStr.closeEntry()
}
@Override
protected void visitDir(FileCopyDetails dirDetails) {
try {
String path = dirDetails.relativePath.pathString + '/'
ZipEntry archiveEntry = new ZipEntry(path)
archiveEntry.setTime(getArchiveTimeFor(dirDetails.lastModified))
archiveEntry.unixMode = (UnixStat.DIR_FLAG | dirDetails.mode)
zipOutStr.putNextEntry(archiveEntry)
zipOutStr.closeEntry()
recordVisit(dirDetails.relativePath)
} catch (Exception e) {
throw new GradleException(String.format("Could not add %s to ZIP '%s'.", dirDetails, zipFile), e)
}
}
private void transform(ArchiveFileTreeElement element, ZipFile archive) {
InputStream is = archive.getInputStream(element.relativePath.entry)
try {
transform(element, is)
} finally {
is.close()
}
}
private void transform(FileCopyDetails details) {
transform(details, details.file.newInputStream())
}
private void transform(FileTreeElement element, InputStream inputStream) {
String mappedPath = remapper.map(element.relativePath.pathString)
transformers.find { it.canTransformResource(element) }.transform(
TransformerContext.builder()
.path(mappedPath)
.inputStream(inputStream)
.relocators(relocators)
.stats(stats)
.build()
)
}
private boolean isTransformable(FileTreeElement element) {
return transformers.any { it.canTransformResource(element) }
}
}
class RelativeArchivePath extends RelativePath {
ZipEntry entry
FileCopyDetails details
RelativeArchivePath(ZipEntry entry, FileCopyDetails fileDetails) {
super(!entry.directory, entry.name.split('/'))
this.entry = entry
this.details = fileDetails
}
boolean isClassFile() {
return lastName.endsWith('.class')
}
RelativeArchivePath getParent() {
if (!segments || segments.length == 1) {
return null
} else {
String path = segments[0..-2].join('/') + '/'
return new RelativeArchivePath(setArchiveTimes(new ZipEntry(path)), null)
}
}
}
class ArchiveFileTreeElement implements FileTreeElement {
private final RelativeArchivePath archivePath
ArchiveFileTreeElement(RelativeArchivePath archivePath) {
this.archivePath = archivePath
}
boolean isClassFile() {
return archivePath.classFile
}
@Override
File getFile() {
return null
}
@Override
boolean isDirectory() {
return archivePath.entry.directory
}
@Override
long getLastModified() {
return archivePath.entry.lastModifiedDate.time
}
@Override
long getSize() {
return archivePath.entry.size
}
@Override
InputStream open() {
return null
}
@Override
void copyTo(OutputStream outputStream) {
}
@Override
boolean copyTo(File file) {
return false
}
@Override
String getName() {
return archivePath.pathString
}
@Override
String getPath() {
return archivePath.lastName
}
@Override
RelativeArchivePath getRelativePath() {
return archivePath
}
@Override
int getMode() {
return archivePath.entry.unixMode
}
}
}

View file

@ -0,0 +1,398 @@
package org.xbib.gradle.plugin.shadow.tasks
import org.gradle.api.file.DuplicatesStrategy
import org.gradle.api.internal.file.copy.CopySpecResolver
import org.gradle.api.logging.LogLevel
import org.gradle.api.tasks.bundling.ZipEntryCompression
import org.gradle.api.Action
import org.gradle.api.artifacts.Configuration
import org.gradle.api.file.FileCollection
import org.gradle.api.internal.DocumentationRegistry
import org.gradle.api.internal.file.FileResolver
import org.gradle.api.internal.file.copy.CopyAction
import org.gradle.api.tasks.InputFiles
import org.gradle.api.tasks.Internal
import org.gradle.api.tasks.Optional
import org.gradle.api.tasks.TaskAction
import org.gradle.api.tasks.bundling.Jar
import org.gradle.api.tasks.util.PatternSet
import org.gradle.internal.Factory
import org.gradle.api.tasks.util.internal.PatternSets
import org.xbib.gradle.plugin.shadow.ShadowStats
import org.xbib.gradle.plugin.shadow.internal.DefaultDependencyFilter
import org.xbib.gradle.plugin.shadow.internal.DefaultZipCompressor
import org.xbib.gradle.plugin.shadow.internal.DependencyFilter
import org.xbib.gradle.plugin.shadow.internal.UnusedTracker
import org.xbib.gradle.plugin.shadow.internal.ZipCompressor
import org.xbib.gradle.plugin.shadow.relocation.Relocator
import org.xbib.gradle.plugin.shadow.relocation.SimpleRelocator
import org.xbib.gradle.plugin.shadow.transformers.AppendingTransformer
import org.xbib.gradle.plugin.shadow.transformers.GroovyExtensionModuleTransformer
import org.xbib.gradle.plugin.shadow.transformers.ServiceFileTransformer
import org.xbib.gradle.plugin.shadow.transformers.Transformer
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
import java.util.concurrent.Callable
class ShadowJar extends Jar implements ShadowSpec {
private List<Transformer> transformers
private List<Relocator> relocators
private List<Configuration> configurations
private DependencyFilter dependencyFilter
private boolean minimizeJar
private DependencyFilter dependencyFilterForMinimize
private ShadowStats shadowStats
ShadowJar() {
super()
dependencyFilter = new DefaultDependencyFilter(getProject())
dependencyFilterForMinimize = new DefaultDependencyFilter(getProject())
setManifest(new DefaultInheritManifest(getServices().get(FileResolver)))
transformers = []
relocators = []
configurations = []
shadowStats = new ShadowStats()
setDuplicatesStrategy(DuplicatesStrategy.INCLUDE)
}
@Override
ShadowJar minimize() {
minimizeJar = true
this
}
@Override
ShadowJar minimize(Action<DependencyFilter> c) {
minimize()
if (c != null) {
c.execute(dependencyFilterForMinimize)
}
return this
}
@Override
@Internal
ShadowStats getStats() {
shadowStats
}
@Override
InheritManifest getManifest() {
(InheritManifest) super.getManifest()
}
@Override
protected CopyAction createCopyAction() {
DocumentationRegistry documentationRegistry = getServices().get(DocumentationRegistry)
FileCollection toMinimize = dependencyFilterForMinimize.resolve(configurations)
UnusedTracker unusedTracker = UnusedTracker.forProject(getProject(), toMinimize)
CopySpecResolver copySpecResolver = mainSpec.buildRootResolver()
Factory<PatternSet> patternSetFactory = PatternSets.getNonCachingPatternSetFactory()
PatternSet patternSet = patternSetFactory.create()
patternSet.setCaseSensitive(copySpecResolver.caseSensitive)
patternSet.include(copySpecResolver.allIncludes)
patternSet.includeSpecs(copySpecResolver.allIncludeSpecs)
patternSet.exclude(copySpecResolver.allExcludes)
patternSet.excludeSpecs(copySpecResolver.allExcludeSpecs)
new ShadowCopyAction(getLogger(), getArchiveFile().get().getAsFile(), getInternalCompressor(), documentationRegistry,
this.getMetadataCharset(), transformers, relocators, patternSet, shadowStats,
isPreserveFileTimestamps(), minimizeJar, unusedTracker)
}
@Internal
protected ZipCompressor getInternalCompressor() {
switch (getEntryCompression()) {
case ZipEntryCompression.DEFLATED:
return new DefaultZipCompressor(isZip64(), ZipOutputStream.DEFLATED)
case ZipEntryCompression.STORED:
return new DefaultZipCompressor(isZip64(), ZipOutputStream.STORED)
default:
throw new IllegalArgumentException(String.format("unknown compression type %s", entryCompression))
}
}
@TaskAction
protected void copy() {
from(getIncludedDependencies())
super.copy()
getLogger().info(shadowStats.toString())
}
@InputFiles
FileCollection getIncludedDependencies() {
getProject().files(new Callable<FileCollection>() {
@Override
FileCollection call() throws Exception {
return dependencyFilter.resolve(configurations)
}
})
}
/**
* Configure inclusion/exclusion of module and project dependencies in the uber jar.
*
* @param c the configuration of the filter
* @return this
*/
@Override
ShadowJar dependencies(Action<DependencyFilter> c) {
if (c != null) {
c.execute(dependencyFilter)
}
this
}
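// A minimal build-script sketch for the dependency filter configured above;
// the exclude/dependency notation is assumed to match the upstream Shadow
// plugin's DependencyFilter API, and the coordinates are placeholders:
//
//   shadowJar {
//       dependencies {
//           exclude(dependency('com.example:unwanted-lib:1.0'))
//       }
//   }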
/**
* Add a Transformer instance for modifying JAR resources and configure.
*
* @param clazz the transformer to add. Must have a no-arg constructor
* @return this
*/
@Override
ShadowJar transform(Class<? extends Transformer> clazz)
throws InstantiationException, IllegalAccessException {
transform(clazz, null)
}
/**
* Add a Transformer instance for modifying JAR resources and configure.
*
* @param clazz the transformer class to add. Must have a no-arg constructor
* @param c the configuration for the transformer
* @return this
*/
@Override
ShadowJar transform(Class<? extends Transformer> clazz, Action<? extends Transformer> c)
throws InstantiationException, IllegalAccessException {
Transformer transformer = clazz.getDeclaredConstructor().newInstance()
if (c != null) {
c.execute(transformer)
}
transformers.add(transformer)
this
}
/**
* Add a preconfigured transformer instance.
*
* @param transformer the transformer instance to add
* @return this
*/
@Override
ShadowJar transform(Transformer transformer) {
transformers.add(transformer)
this
}
/**
* Syntactic sugar for merging service files in JARs.
*
* @return this
*/
@Override
ShadowJar mergeServiceFiles() {
try {
transform(ServiceFileTransformer.class)
} catch (IllegalAccessException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
} catch (InstantiationException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
}
this
}
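// A minimal build-script sketch, assuming this plugin registers a task of
// this type named 'shadowJar':
//
//   shadowJar {
//       mergeServiceFiles()
//   }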
/**
* Syntactic sugar for merging service files in JARs.
*
* @return this
*/
@Override
ShadowJar mergeServiceFiles(String rootPath) {
try {
transform(ServiceFileTransformer.class, new Action<ServiceFileTransformer>() {
@Override
void execute(ServiceFileTransformer serviceFileTransformer) {
serviceFileTransformer.setPath(rootPath)
}
})
} catch (IllegalAccessException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
} catch (InstantiationException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
}
this
}
/**
* Syntactic sugar for merging service files in JARs.
*
* @return this
*/
@Override
ShadowJar mergeServiceFiles(Action<ServiceFileTransformer> configureClosure) {
try {
transform(ServiceFileTransformer.class, configureClosure)
} catch (IllegalAccessException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
} catch (InstantiationException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
}
this
}
/**
* Syntactic sugar for merging Groovy extension module descriptor files in JARs.
*
* @return this
*/
@Override
ShadowJar mergeGroovyExtensionModules() {
try {
transform(GroovyExtensionModuleTransformer.class)
} catch (IllegalAccessException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
} catch (InstantiationException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
}
this
}
/**
* Syntactic sugar for appending the contents of the given resource across JARs.
*
* @param resourcePath the path of the resource whose contents should be appended
* @return this
*/
@Override
ShadowJar append(String resourcePath) {
try {
transform(AppendingTransformer.class, new Action<AppendingTransformer>() {
@Override
void execute(AppendingTransformer transformer) {
transformer.setResource(resourcePath)
}
})
} catch (IllegalAccessException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
} catch (InstantiationException e) {
getLogger().log(LogLevel.ERROR, e.getMessage() as String, e)
}
return this
}
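// Build-script sketch for the shorthand above; the resource path is only an
// example of a file that is commonly concatenated across JARs:
//
//   shadowJar {
//       append('META-INF/spring.factories')
//   }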
/**
* Add a class relocator that maps each class in the pattern to the provided destination.
*
* @param pattern the source pattern to relocate
* @param destination the destination package
* @return this
*/
@Override
ShadowJar relocate(String pattern, String destination) {
relocate(pattern, destination, null)
}
/**
* Add a class relocator that maps each class in the pattern to the provided destination.
*
* @param pattern the source pattern to relocate
* @param destination the destination package
* @param configure the configuration of the relocator
* @return this
*/
@Override
ShadowJar relocate(String pattern, String destination, Action<SimpleRelocator> configure) {
SimpleRelocator relocator = new SimpleRelocator(pattern, destination, new ArrayList<String>(), new ArrayList<String>());
if (configure != null) {
configure.execute(relocator)
}
relocators.add(relocator)
this
}
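// Build-script sketch for the two-argument form above; the package names are
// placeholders. The optional third closure can further configure the
// SimpleRelocator instance:
//
//   shadowJar {
//       relocate 'org.apache.commons.lang3', 'myapp.shaded.commons.lang3'
//   }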
/**
* Add a relocator instance.
*
* @param relocator the relocator instance to add
* @return this
*/
@Override
ShadowJar relocate(Relocator relocator) {
relocators.add(relocator)
this
}
/**
* Add a relocator of the provided class.
*
* @param relocatorClass the relocator class to add. Must have a no-arg constructor.
* @return this
*/
@Override
ShadowJar relocate(Class<? extends Relocator> relocatorClass)
throws InstantiationException, IllegalAccessException {
relocate(relocatorClass, null)
}
/**
* Add a relocator of the provided class and configure.
*
* @param relocatorClass the relocator class to add. Must have a no-arg constructor
* @param configure the configuration for the relocator
* @return this
*/
@Override
ShadowJar relocate(Class<? extends Relocator> relocatorClass, Action<? extends Relocator> configure)
throws InstantiationException, IllegalAccessException {
Relocator relocator = relocatorClass.getDeclaredConstructor().newInstance()
if (configure != null) {
configure.execute(relocator)
}
relocators.add(relocator)
this
}
@Internal
List<Transformer> getTransformers() {
return this.transformers
}
void setTransformers(List<Transformer> transformers) {
this.transformers = transformers
}
@Internal
List<Relocator> getRelocators() {
return this.relocators
}
void setRelocators(List<Relocator> relocators) {
this.relocators = relocators
}
@InputFiles @Optional
List<Configuration> getConfigurations() {
this.configurations
}
void setConfigurations(List<Configuration> configurations) {
this.configurations = configurations
}
@Internal
DependencyFilter getDependencyFilter() {
return this.dependencyFilter
}
void setDependencyFilter(DependencyFilter filter) {
this.dependencyFilter = filter
}
}

View file

@ -0,0 +1,51 @@
package org.xbib.gradle.plugin.shadow.tasks
import org.xbib.gradle.plugin.shadow.ShadowStats
import org.xbib.gradle.plugin.shadow.internal.DependencyFilter
import org.xbib.gradle.plugin.shadow.relocation.Relocator
import org.xbib.gradle.plugin.shadow.relocation.SimpleRelocator
import org.xbib.gradle.plugin.shadow.transformers.ServiceFileTransformer
import org.xbib.gradle.plugin.shadow.transformers.Transformer
import org.gradle.api.Action
import org.gradle.api.file.CopySpec
interface ShadowSpec extends CopySpec {
ShadowSpec minimize()
ShadowSpec minimize(Action<DependencyFilter> configureClosure)
ShadowSpec dependencies(Action<DependencyFilter> configure)
ShadowSpec transform(Class<? extends Transformer> clazz)
throws InstantiationException, IllegalAccessException
ShadowSpec transform(Class<? extends Transformer> clazz, Action<? extends Transformer> configure)
throws InstantiationException, IllegalAccessException
ShadowSpec transform(Transformer transformer)
ShadowSpec mergeServiceFiles()
ShadowSpec mergeServiceFiles(String rootPath)
ShadowSpec mergeServiceFiles(Action<ServiceFileTransformer> configureClosure)
ShadowSpec mergeGroovyExtensionModules()
ShadowSpec append(String resourcePath)
ShadowSpec relocate(String pattern, String destination)
ShadowSpec relocate(String pattern, String destination, Action<SimpleRelocator> configure)
ShadowSpec relocate(Relocator relocator)
ShadowSpec relocate(Class<? extends Relocator> clazz)
throws InstantiationException, IllegalAccessException
ShadowSpec relocate(Class<? extends Relocator> clazz, Action<? extends Relocator> configure)
throws InstantiationException, IllegalAccessException
ShadowStats getStats()
}

View file

@ -0,0 +1,35 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* Prevents duplicate copies of the license.
* Modified from org.apache.maven.plugins.shade.resource.ApacheLicenseResourceTransformer.
*/
class ApacheLicenseResourceTransformer implements Transformer {
private static final String LICENSE_PATH = "META-INF/LICENSE"
private static final String LICENSE_TXT_PATH = "META-INF/LICENSE.txt"
@Override
boolean canTransformResource(FileTreeElement element) {
def path = element.relativePath.pathString
return LICENSE_PATH.equalsIgnoreCase(path) ||
LICENSE_TXT_PATH.regionMatches(true, 0, path, 0, LICENSE_TXT_PATH.length())
}
@Override
void transform(TransformerContext context) {
}
@Override
boolean hasTransformedResource() {
return false
}
@Override
void modifyOutputStream(ZipOutputStream jos, boolean preserveFileTimestamps) {
}
}
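// Minimal build-script sketch, assuming a 'shadowJar' task of the ShadowJar
// type defined earlier in this change:
//
//   shadowJar {
//       transform(ApacheLicenseResourceTransformer)
//   }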

View file

@ -0,0 +1,184 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
import java.text.SimpleDateFormat
/**
* Merges <code>META-INF/NOTICE</code> and <code>META-INF/NOTICE.txt</code> files.
* Modified from org.apache.maven.plugins.shade.resource.ApacheNoticeResourceTransformer
*/
class ApacheNoticeResourceTransformer implements Transformer {
Set<String> entries = new LinkedHashSet<String>()
Map<String, Set<String>> organizationEntries = new LinkedHashMap<String, Set<String>>()
String projectName = ""
boolean addHeader = true
String preamble1 = "// ------------------------------------------------------------------\n" +
"// NOTICE file corresponding to the section 4d of The Apache License,\n" +
"// Version 2.0, in this case for "
String preamble2 = "\n// ------------------------------------------------------------------\n"
String preamble3 = "This product includes software developed at\n"
String organizationName = "The Apache Software Foundation"
String organizationURL = "http://www.apache.org/"
String inceptionYear = "2006"
String copyright
/**
* The file encoding of the <code>NOTICE</code> file.
*/
String encoding
private static final String NOTICE_PATH = "META-INF/NOTICE"
private static final String NOTICE_TXT_PATH = "META-INF/NOTICE.txt"
@Override
boolean canTransformResource(FileTreeElement element) {
def path = element.relativePath.pathString
if (NOTICE_PATH.equalsIgnoreCase(path) || NOTICE_TXT_PATH.equalsIgnoreCase(path)) {
return true
}
return false
}
@Override
void transform(TransformerContext context) {
if (entries.isEmpty()) {
String year = new SimpleDateFormat("yyyy").format(new Date())
if (!inceptionYear.equals(year)) {
year = inceptionYear + "-" + year
}
//add headers
if (addHeader) {
entries.add(preamble1 + projectName + preamble2)
} else {
entries.add("")
}
//fake second entry, we'll look for a real one later
entries.add(projectName + "\nCopyright " + year + " " + organizationName + "\n")
entries.add(preamble3 + organizationName + " (" + organizationURL + ").\n")
}
BufferedReader reader
if (encoding != null && !encoding.isEmpty()) {
reader = new BufferedReader(new InputStreamReader(context.inputStream, encoding))
} else {
reader = new BufferedReader(new InputStreamReader(context.inputStream))
}
String line = reader.readLine()
StringBuffer sb = new StringBuffer()
Set<String> currentOrg = null
int lineCount = 0
while (line != null) {
String trimmedLine = line.trim()
if (!trimmedLine.startsWith("//")) {
if (trimmedLine.length() > 0) {
if (trimmedLine.startsWith("- ")) {
//resource-bundle 1.3 mode
if (lineCount == 1
&& sb.toString().indexOf("This product includes/uses software(s) developed by") != -1) {
currentOrg = organizationEntries.get(sb.toString().trim())
if (currentOrg == null) {
currentOrg = new TreeSet<String>()
organizationEntries.put(sb.toString().trim(), currentOrg)
}
sb = new StringBuffer()
} else if (sb.length() > 0 && currentOrg != null) {
currentOrg.add(sb.toString())
sb = new StringBuffer()
}
}
sb.append(line).append("\n")
lineCount++
} else {
String ent = sb.toString()
if (ent.startsWith(projectName) && ent.indexOf("Copyright ") != -1) {
copyright = ent
}
if (currentOrg == null) {
entries.add(ent)
} else {
currentOrg.add(ent)
}
sb = new StringBuffer()
lineCount = 0
currentOrg = null
}
}
line = reader.readLine()
}
if (sb.length() > 0) {
if (currentOrg == null) {
entries.add(sb.toString())
} else {
currentOrg.add(sb.toString())
}
}
}
@Override
boolean hasTransformedResource() {
return true
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
ZipEntry zipEntry = new ZipEntry(NOTICE_PATH)
zipEntry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, zipEntry.time)
os.putNextEntry(zipEntry)
Writer pow
if (encoding != null && !encoding.isEmpty()) {
pow = new OutputStreamWriter(os, encoding)
} else {
pow = new OutputStreamWriter(os)
}
PrintWriter writer = new PrintWriter(pow)
int count = 0
for (String line : entries) {
++count
if (line.equals(copyright) && count != 2) {
continue
}
if (count == 2 && copyright != null) {
writer.print(copyright)
writer.print('\n')
} else {
writer.print(line)
writer.print('\n')
}
if (count == 3) {
for (Map.Entry<String, Set<String>> entry : organizationEntries.entrySet()) {
writer.print(entry.getKey())
writer.print('\n')
for (String l : entry.getValue()) {
writer.print(l)
}
writer.print('\n')
}
}
}
writer.flush()
entries.clear()
}
}
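// Build-script sketch configuring the properties declared above; all values
// shown are placeholders:
//
//   shadowJar {
//       transform(ApacheNoticeResourceTransformer) {
//           projectName = 'My Project'
//           organizationName = 'Example Org'
//           organizationURL = 'https://example.org/'
//           inceptionYear = '2020'
//       }
//   }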

View file

@ -0,0 +1,43 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.internal.Utils
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* A resource processor that appends content for a resource, separated by a newline.
* Modified from org.apache.maven.plugins.shade.resource.AppendingTransformer.
*/
class AppendingTransformer implements Transformer {
String resource
ByteArrayOutputStream data = new ByteArrayOutputStream()
@Override
boolean canTransformResource(FileTreeElement element) {
def path = element.relativePath.pathString
resource != null && resource.equalsIgnoreCase(path)
}
@Override
void transform(TransformerContext context) {
Utils.copyLarge(context.inputStream, data)
data.write('\n'.bytes)
context.inputStream.close()
}
@Override
boolean hasTransformedResource() {
return data.size() > 0
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
ZipEntry entry = new ZipEntry(resource)
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time)
os.putNextEntry(entry)
Utils.copyLarge(new ByteArrayInputStream(data.toByteArray()), os)
data.reset()
}
}
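// Build-script sketch; this is equivalent to the ShadowJar.append(...)
// shorthand, and the resource path is only an example:
//
//   shadowJar {
//       transform(AppendingTransformer) {
//           resource = 'META-INF/spring.factories'
//       }
//   }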

View file

@ -0,0 +1,37 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* A resource processor that prevents the inclusion of an arbitrary resource into the shaded JAR.
* Modified from org.apache.maven.plugins.shade.resource.DontIncludeResourceTransformer
*/
class DontIncludeResourceTransformer implements Transformer {
String resource
@Override
boolean canTransformResource(FileTreeElement element) {
def path = element.relativePath.pathString
if (path.endsWith(resource)) {
return true
}
return false
}
@Override
void transform(TransformerContext context) {
// no op
}
@Override
boolean hasTransformedResource() {
return false
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
// no op
}
}
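// Build-script sketch; the suffix to drop is a placeholder:
//
//   shadowJar {
//       transform(DontIncludeResourceTransformer) {
//           resource = '.html'
//       }
//   }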

View file

@ -0,0 +1,89 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.internal.Utils
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* Modified from eu.appsatori.gradle.fatjar.tasks.PrepareFiles.groovy
* Resource transformer that merges Groovy extension module descriptor files into a single file. If there are several
* META-INF/services/org.codehaus.groovy.runtime.ExtensionModule resources spread across many JARs, the individual
* entries will all be merged into a single META-INF/services/org.codehaus.groovy.runtime.ExtensionModule resource
* packaged into the resultant JAR produced by the shadowing process.
*/
class GroovyExtensionModuleTransformer implements Transformer {
private static final GROOVY_EXTENSION_MODULE_DESCRIPTOR_PATH =
"META-INF/services/org.codehaus.groovy.runtime.ExtensionModule"
private static final MODULE_NAME_KEY = 'moduleName'
private static final MODULE_VERSION_KEY = 'moduleVersion'
private static final EXTENSION_CLASSES_KEY = 'extensionClasses'
private static final STATIC_EXTENSION_CLASSES_KEY = 'staticExtensionClasses'
private static final MERGED_MODULE_NAME = 'MergedByShadowJar'
private static final MERGED_MODULE_VERSION = '1.0.0'
private final Properties module = new Properties()
@Override
boolean canTransformResource(FileTreeElement element) {
return element.relativePath.pathString == GROOVY_EXTENSION_MODULE_DESCRIPTOR_PATH
}
@Override
void transform(TransformerContext context) {
def props = new Properties()
props.load(context.inputStream)
props.each { String key, String value ->
switch (key) {
case MODULE_NAME_KEY:
handle(key, value) {
module.setProperty(key, MERGED_MODULE_NAME)
}
break
case MODULE_VERSION_KEY:
handle(key, value) {
module.setProperty(key, MERGED_MODULE_VERSION)
}
break
case [EXTENSION_CLASSES_KEY, STATIC_EXTENSION_CLASSES_KEY]:
handle(key, value) { String existingValue ->
def newValue = "${existingValue},${value}"
module.setProperty(key, newValue)
}
break
}
}
}
@Override
boolean hasTransformedResource() {
return module.size() > 0
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
ZipEntry entry = new ZipEntry(GROOVY_EXTENSION_MODULE_DESCRIPTOR_PATH)
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time)
os.putNextEntry(entry)
Utils.copyLarge(toInputStream(module), os)
os.closeEntry()
}
private static InputStream toInputStream(Properties props) {
def baos = new ByteArrayOutputStream()
props.store(baos, null)
return new ByteArrayInputStream(baos.toByteArray())
}
private handle(String key, String value, Closure mergeValue) {
def existingValue = module.getProperty(key)
if (existingValue) {
mergeValue(existingValue)
} else {
module.setProperty(key, value)
}
}
}
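// Build-script sketch; ShadowJar.mergeGroovyExtensionModules(), defined
// earlier in this change, registers this transformer:
//
//   shadowJar {
//       mergeGroovyExtensionModules()
//   }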

View file

@ -0,0 +1,42 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.internal.Utils
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* A resource processor that allows the addition of arbitrary file content to the shaded JAR.
* Modified from org.apache.maven.plugins.shade.resource.IncludeResourceTransformer
*/
class IncludeResourceTransformer implements Transformer {
File file
String resource
@Override
boolean canTransformResource(FileTreeElement element) {
return false
}
@Override
void transform(TransformerContext context) {
// no op
}
@Override
boolean hasTransformedResource() {
return file != null ? file.exists() : false
}
@Override
void modifyOutputStream(ZipOutputStream outputStream, boolean preserveFileTimestamps) {
ZipEntry entry = new ZipEntry(resource)
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time)
outputStream.putNextEntry(entry)
InputStream is = new FileInputStream(file)
Utils.copyLarge(is, outputStream)
is.close()
}
}
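// Build-script sketch; both the target path inside the JAR and the source
// file are placeholders, and file(...) resolves via the build script:
//
//   shadowJar {
//       transform(IncludeResourceTransformer) {
//           resource = 'META-INF/build-info.txt'
//           file = file('build-info.txt')
//       }
//   }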

View file

@ -0,0 +1,78 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
import java.util.jar.*
import java.util.jar.Attributes.Name
/**
* A resource processor that allows the arbitrary addition of attributes to
* the first MANIFEST.MF that is found in the set of JARs being processed, or
* to a newly created manifest for the shaded JAR.
* Modified from org.apache.maven.plugins.shade.resource.ManifestResourceTransformer
*/
class ManifestResourceTransformer implements Transformer {
private String mainClass
private Map<String, Attributes> manifestEntries
private boolean manifestDiscovered
private Manifest manifest
@Override
boolean canTransformResource(FileTreeElement element) {
def path = element.relativePath.pathString
if (JarFile.MANIFEST_NAME.equalsIgnoreCase(path)) {
return true
}
return false
}
@Override
void transform(TransformerContext context) {
if (!manifestDiscovered) {
manifest = new Manifest(context.inputStream)
manifestDiscovered = true
if (context.inputStream) {
context.inputStream.close()
}
}
}
@Override
boolean hasTransformedResource() {
true
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
if (manifest == null) {
manifest = new Manifest()
}
Attributes attributes = manifest.getMainAttributes()
if (mainClass != null) {
attributes.put(Name.MAIN_CLASS, mainClass)
}
if (manifestEntries != null) {
for (Map.Entry<String, Attributes> entry : manifestEntries.entrySet()) {
attributes.put(new Name(entry.getKey()), entry.getValue())
}
}
ZipEntry entry = new ZipEntry(JarFile.MANIFEST_NAME)
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time)
os.putNextEntry(entry)
manifest.write(os)
}
ManifestResourceTransformer attributes(Map<String, ?> attributes) {
if (manifestEntries == null) {
manifestEntries = [:]
}
manifestEntries.putAll(attributes)
this
}
}
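// Build-script sketch; the manifest attribute name and value are
// placeholders:
//
//   shadowJar {
//       transform(ManifestResourceTransformer) {
//           attributes 'X-Custom-Attribute': 'example'
//       }
//   }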

View file

@ -0,0 +1,220 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.internal.Utils
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
import static groovy.lang.Closure.IDENTITY
/**
* Resources transformer that merges Properties files.
*
* <p>The default merge strategy discards duplicate values coming from additional
* resources. This behavior can be changed by setting a value for the <tt>mergeStrategy</tt>
* property, such as 'first' (default), 'latest' or 'append'. If the merge strategy is
* 'latest' then the last value of a matching property entry will be used. If the
* merge strategy is 'append' then the property values will be combined, using a
* merge separator (default value is ','). The merge separator can be changed by
* setting a value for the <tt>mergeSeparator</tt> property.</p>
*
* Say there are two properties files A and B with the
* following entries:
*
* <strong>A</strong>
* <ul>
* <li>key1 = value1</li>
* <li>key2 = value2</li>
* </ul>
*
* <strong>B</strong>
* <ul>
* <li>key2 = balue2</li>
* <li>key3 = value3</li>
* </ul>
*
* With <tt>mergeStrategy = first</tt> you get
*
* <strong>C</strong>
* <ul>
* <li>key1 = value1</li>
* <li>key2 = value2</li>
* <li>key3 = value3</li>
* </ul>
*
* With <tt>mergeStrategy = latest</tt> you get
*
* <strong>C</strong>
* <ul>
* <li>key1 = value1</li>
* <li>key2 = balue2</li>
* <li>key3 = value3</li>
* </ul>
*
* With <tt>mergeStrategy = append</tt> and <tt>mergeSeparator = ;</tt> you get
*
* <strong>C</strong>
* <ul>
* <li>key1 = value1</li>
* <li>key2 = value2;balue2</li>
* <li>key3 = value3</li>
* </ul>
*
* <p>There are three additional properties that can be set: <tt>paths</tt>, <tt>mappings</tt>,
* and <tt>keyTransformer</tt>.
* The first contains a list of strings or regexes that will be used to determine if
* a path should be transformed or not. The merge strategy and merge separator are
* taken from the global settings.</p>
*
* <p>The <tt>mappings</tt> property allows you to define merge strategy and separator per
* path. If either <tt>paths</tt> or <tt>mappings</tt> is defined then no other path
* entries will be merged. <tt>mappings</tt> has precedence over <tt>paths</tt> if both
* are defined.</p>
*
* <p>If you need to transform keys in properties files, e.g. because they contain class
* names about to be relocated, you can set the <tt>keyTransformer</tt> property to a
* closure that receives the original key and returns the key name to be used.</p>
*
* <p>Example:</p>
* <pre>
* import org.xbib.gradle.plugin.shadow.transformers.*
* shadowJar {
* transform(PropertiesFileTransformer) {
* paths = [
* 'META-INF/editors/java.beans.PropertyEditor'
* ]
* keyTransformer = { key ->
* key.replaceAll('^(orig\\.package\\..*)$', 'new.prefix.$1')
* }
* }
* }
* </pre>
*/
class PropertiesFileTransformer implements Transformer {
private static final String PROPERTIES_SUFFIX = '.properties'
Map<String, Properties> propertiesEntries = [:]
List<String> paths = []
Map<String, Map<String, String>> mappings = [:]
String mergeStrategy = 'first'
String mergeSeparator = ','
Closure<String> keyTransformer = IDENTITY
@Override
boolean canTransformResource(FileTreeElement element) {
def path = element.relativePath.pathString
if (mappings.containsKey(path)) {
return true
}
for (key in mappings.keySet()) {
if (path =~ /$key/) {
return true
}
}
if (path in paths) {
return true
}
for (p in paths) {
if (path =~ /$p/) {
return true
}
}
!mappings && !paths && path.endsWith(PROPERTIES_SUFFIX)
}
@Override
void transform(TransformerContext context) {
Properties props = propertiesEntries[context.path]
Properties incoming = loadAndTransformKeys(context.inputStream)
if (props == null) {
propertiesEntries[context.path] = incoming
} else {
incoming.each { key, value ->
if (props.containsKey(key)) {
switch (mergeStrategyFor(context.path).toLowerCase()) {
case 'latest':
props.put(key, value)
break
case 'append':
props.put(key, props.getProperty(key as String) + mergeSeparatorFor(context.path) + value)
break
case 'first':
break
default:
break
}
} else {
props.put(key, value)
}
}
}
}
@Override
boolean hasTransformedResource() {
propertiesEntries.size() > 0
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
propertiesEntries.each { String path, Properties props ->
ZipEntry entry = new ZipEntry(path)
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time)
os.putNextEntry(entry)
Utils.copyLarge(toInputStream(props), os)
os.closeEntry()
}
}
private Properties loadAndTransformKeys(InputStream is) {
Properties props = new Properties()
props.load(is)
transformKeys(props)
}
private Properties transformKeys(Properties properties) {
if (keyTransformer == IDENTITY)
return properties
def result = new Properties()
properties.each { key, value ->
result.put(keyTransformer.call(key), value)
}
result
}
private String mergeStrategyFor(String path) {
if (mappings.containsKey(path)) {
return mappings.get(path).mergeStrategy ?: mergeStrategy
}
for (key in mappings.keySet()) {
if (path =~ /$key/) {
return mappings.get(key).mergeStrategy ?: mergeStrategy
}
}
mergeStrategy
}
private String mergeSeparatorFor(String path) {
if (mappings.containsKey(path)) {
return mappings.get(path).mergeSeparator ?: mergeSeparator
}
for (key in mappings.keySet()) {
if (path =~ /$key/) {
return mappings.get(key).mergeSeparator ?: mergeSeparator
}
}
mergeSeparator
}
private static InputStream toInputStream(Properties props) {
ByteArrayOutputStream baos = new ByteArrayOutputStream()
props.store(baos, '')
new ByteArrayInputStream(baos.toByteArray())
}
}
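// Build-script sketch combining the merge settings documented above; the
// properties file path used in 'mappings' is a placeholder:
//
//   shadowJar {
//       transform(PropertiesFileTransformer) {
//           mergeStrategy = 'append'
//           mergeSeparator = ';'
//           mappings = ['META-INF/custom.properties': [mergeStrategy: 'latest']]
//       }
//   }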

View file

@ -0,0 +1,145 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.xbib.gradle.plugin.shadow.internal.ServiceStream
import org.xbib.gradle.plugin.shadow.internal.Utils
import org.xbib.gradle.plugin.shadow.relocation.RelocateClassContext
import org.gradle.api.file.FileTreeElement
import org.gradle.api.specs.Spec
import org.gradle.api.tasks.util.PatternFilterable
import org.gradle.api.tasks.util.PatternSet
import org.xbib.gradle.plugin.shadow.zip.ZipEntry
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* Modified from org.apache.maven.plugins.shade.resource.ServiceResourceTransformer.java
* Resources transformer that appends entries in META-INF/services resources into
* a single resource. For example, if there are several META-INF/services/org.apache.maven.project.ProjectBuilder
* resources spread across many JARs, the individual entries will all be concatenated into a single
* META-INF/services/org.apache.maven.project.ProjectBuilder resource packaged into the resultant JAR produced
* by the shading process.
*/
class ServiceFileTransformer implements Transformer, PatternFilterable {
private static final String SERVICES_PATTERN = "META-INF/services/**"
private static final String GROOVY_EXTENSION_MODULE_DESCRIPTOR_PATTERN =
"META-INF/services/org.codehaus.groovy.runtime.ExtensionModule"
Map<String, ServiceStream> serviceEntries = [:].withDefault { new ServiceStream() }
private final PatternSet patternSet =
new PatternSet().include(SERVICES_PATTERN).exclude(GROOVY_EXTENSION_MODULE_DESCRIPTOR_PATTERN)
void setPath(String path) {
patternSet.setIncludes(["${path}/**"])
}
@Override
boolean canTransformResource(FileTreeElement element) {
return patternSet.asSpec.isSatisfiedBy(element)
}
@Override
void transform(TransformerContext context) {
def lines = context.inputStream.readLines()
def targetPath = context.path
context.relocators.each {rel ->
if(rel.canRelocateClass(RelocateClassContext.builder().className(new File(targetPath).name).stats(context.stats).build())) {
targetPath = rel.relocateClass(RelocateClassContext.builder().className(targetPath).stats(context.stats).build())
}
lines.eachWithIndex { String line, int i ->
def lineContext = RelocateClassContext.builder().className(line).stats(context.stats).build()
if(rel.canRelocateClass(lineContext)) {
lines[i] = rel.relocateClass(lineContext)
}
}
}
lines.each {line -> serviceEntries[targetPath].append(new ByteArrayInputStream(line.getBytes()))}
}
@Override
boolean hasTransformedResource() {
return serviceEntries.size() > 0
}
@Override
void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
serviceEntries.each { String path, ServiceStream stream ->
ZipEntry entry = new ZipEntry(path)
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time)
os.putNextEntry(entry)
Utils.copyLarge(stream.toInputStream(), os)
os.closeEntry()
}
}
@Override
ServiceFileTransformer include(String... includes) {
patternSet.include(includes)
this
}
@Override
ServiceFileTransformer include(Iterable<String> includes) {
patternSet.include(includes)
this
}
@Override
ServiceFileTransformer include(Spec<FileTreeElement> includeSpec) {
patternSet.include(includeSpec)
this
}
@Override
ServiceFileTransformer include(Closure includeSpec) {
patternSet.include(includeSpec)
this
}
@Override
ServiceFileTransformer exclude(String... excludes) {
patternSet.exclude(excludes)
this
}
@Override
ServiceFileTransformer exclude(Iterable<String> excludes) {
patternSet.exclude(excludes)
this
}
@Override
ServiceFileTransformer exclude(Spec<FileTreeElement> excludeSpec) {
patternSet.exclude(excludeSpec)
this
}
@Override
ServiceFileTransformer exclude(Closure excludeSpec) {
patternSet.exclude(excludeSpec)
this
}
@Override
Set<String> getIncludes() {
patternSet.includes
}
@Override
ServiceFileTransformer setIncludes(Iterable<String> includes) {
patternSet.includes = includes
this
}
@Override
Set<String> getExcludes() {
patternSet.excludes
}
@Override
ServiceFileTransformer setExcludes(Iterable<String> excludes) {
patternSet.excludes = excludes
this
}
}
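// Build-script sketch; mergeServiceFiles() on ShadowJar registers this
// transformer with its default pattern, and the extra exclude pattern shown
// here is a placeholder:
//
//   shadowJar {
//       transform(ServiceFileTransformer) {
//           exclude 'META-INF/services/com.example.internal.*'
//       }
//   }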

View file

@ -0,0 +1,18 @@
package org.xbib.gradle.plugin.shadow.transformers
import org.gradle.api.file.FileTreeElement
import org.xbib.gradle.plugin.shadow.zip.ZipOutputStream
/**
* Modified from org.apache.maven.plugins.shade.resource.ResourceTransformer.
*/
interface Transformer {
boolean canTransformResource(FileTreeElement element)
void transform(TransformerContext context)
boolean hasTransformedResource()
void modifyOutputStream(ZipOutputStream jos, boolean preserveFileTimestamps)
}
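// A minimal sketch of a custom implementation, assuming imports of
// FileTreeElement, TransformerContext, ZipEntry and ZipOutputStream from the
// packages used above; the resource name 'banner.txt' is a placeholder:
//
//   class BannerTransformer implements Transformer {
//       private String banner
//       boolean canTransformResource(FileTreeElement element) {
//           element.relativePath.pathString == 'banner.txt'
//       }
//       void transform(TransformerContext context) {
//           banner = context.inputStream.text.toUpperCase()
//       }
//       boolean hasTransformedResource() {
//           banner != null
//       }
//       void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
//           os.putNextEntry(new ZipEntry('banner.txt'))
//           os.write(banner.bytes)
//           os.closeEntry()
//       }
//   }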

View file

@ -0,0 +1,24 @@
package org.xbib.gradle.plugin.shadow.transformers
import groovy.transform.Canonical
import groovy.transform.builder.Builder
import org.xbib.gradle.plugin.shadow.ShadowStats
import org.xbib.gradle.plugin.shadow.relocation.Relocator
import org.xbib.gradle.plugin.shadow.tasks.ShadowCopyAction
@Canonical
@Builder
class TransformerContext {
String path
InputStream inputStream
List<Relocator> relocators
ShadowStats stats
static long getEntryTimestamp(boolean preserveFileTimestamps, long entryTime) {
preserveFileTimestamps ? entryTime : ShadowCopyAction.CONSTANT_TIME_FOR_ZIP_ENTRIES
}
}

View file

@ -0,0 +1,28 @@
package org.xbib.gradle.plugin.shadow.internal;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
/**
* This class was moved to Java because the Groovy compiler reports "no such property: count".
*/
public class ServiceStream extends ByteArrayOutputStream {
public ServiceStream() {
super(1024);
}
public void append(InputStream is) throws IOException {
if (count > 0 && buf[count - 1] != '\n' && buf[count - 1] != '\r') {
byte[] newline = new byte[] {'\n'};
write(newline, 0, newline.length);
}
is.transferTo(this);
}
public InputStream toInputStream() {
return new ByteArrayInputStream(buf, 0, count);
}
}

View file

@ -0,0 +1,159 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;
import java.util.zip.ZipException;
/**
* A common base class for Unicode extra information extra fields.
*/
public abstract class AbstractUnicodeExtraField implements ZipExtraField {
private long nameCRC32;
private byte[] unicodeName;
private byte[] data;
protected AbstractUnicodeExtraField() {
}
/**
* Assemble as unicode extension from the name/comment and
* encoding of the original zip entry.
*
* @param text The file name or comment.
* @param bytes The encoded filename or comment in the zip
* file.
* @param off The offset of the encoded filename or comment in
* <code>bytes</code>.
* @param len The length of the encoded filename or comment in
* <code>bytes</code>.
*/
protected AbstractUnicodeExtraField(final String text, final byte[] bytes, final int off,
final int len) {
final CRC32 crc32 = new CRC32();
crc32.update(bytes, off, len);
nameCRC32 = crc32.getValue();
unicodeName = text.getBytes(StandardCharsets.UTF_8);
}
/**
* Assemble as unicode extension from the name/comment and
* encoding of the original zip entry.
*
* @param text The file name or comment.
* @param bytes The encoded filename or comment in the zip
* file.
*/
protected AbstractUnicodeExtraField(final String text, final byte[] bytes) {
this(text, bytes, 0, bytes.length);
}
private void assembleData() {
if (unicodeName == null) {
return;
}
data = new byte[5 + unicodeName.length];
// version 1
data[0] = 0x01;
System.arraycopy(ZipLong.getBytes(nameCRC32), 0, data, 1, 4);
System.arraycopy(unicodeName, 0, data, 5, unicodeName.length);
}
/**
* @return The CRC32 checksum of the filename or comment as
* encoded in the central directory of the zip file.
*/
public long getNameCRC32() {
return nameCRC32;
}
/**
* @param nameCRC32 The CRC32 checksum of the filename as encoded
* in the central directory of the zip file to set.
*/
public void setNameCRC32(final long nameCRC32) {
this.nameCRC32 = nameCRC32;
data = null;
}
/**
* @return The utf-8 encoded name.
*/
public byte[] getUnicodeName() {
byte[] b = null;
if (unicodeName != null) {
b = new byte[unicodeName.length];
System.arraycopy(unicodeName, 0, b, 0, b.length);
}
return b;
}
/**
* @param unicodeName The utf-8 encoded name to set.
*/
public void setUnicodeName(final byte[] unicodeName) {
if (unicodeName != null) {
this.unicodeName = new byte[unicodeName.length];
System.arraycopy(unicodeName, 0, this.unicodeName, 0,
unicodeName.length);
} else {
this.unicodeName = null;
}
data = null;
}
/** {@inheritDoc} */
public byte[] getCentralDirectoryData() {
if (data == null) {
this.assembleData();
}
byte[] b = null;
if (data != null) {
b = new byte[data.length];
System.arraycopy(data, 0, b, 0, b.length);
}
return b;
}
/** {@inheritDoc} */
public ZipShort getCentralDirectoryLength() {
if (data == null) {
assembleData();
}
return new ZipShort(data.length);
}
/** {@inheritDoc} */
public byte[] getLocalFileDataData() {
return getCentralDirectoryData();
}
/** {@inheritDoc} */
public ZipShort getLocalFileDataLength() {
return getCentralDirectoryLength();
}
/** {@inheritDoc} */
public void parseFromLocalFileData(final byte[] buffer, final int offset, final int length)
throws ZipException {
if (length < 5) {
throw new ZipException("UniCode path extra data must have at least"
+ " 5 bytes.");
}
final int version = buffer[offset];
if (version != 0x01) {
throw new ZipException("Unsupported version [" + version
+ "] for UniCode path extra data.");
}
nameCRC32 = ZipLong.getValue(buffer, offset + 1);
unicodeName = new byte[length - 5];
System.arraycopy(buffer, offset + 5, unicodeName, 0, length - 5);
data = null;
}
}

View file

@ -0,0 +1,299 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.util.zip.CRC32;
import java.util.zip.ZipException;
/**
* Adds Unix file permission and UID/GID fields as well as symbolic
* link handling.
*
* <p>This class uses the ASi extra field in the format:</p>
* <pre>
* Value Size Description
* ----- ---- -----------
* (Unix3) 0x756e Short tag for this extra block type
* TSize Short total data size for this block
* CRC Long CRC-32 of the remaining data
* Mode Short file permissions
* SizDev Long symlink'd size OR major/minor dev num
* UID Short user ID
* GID Short group ID
* (var.) variable symbolic link filename
* </pre>
* taken from appnote.iz (Info-ZIP note, 981119) found at <a
* href="ftp://ftp.uu.net/pub/archiving/zip/doc/">ftp://ftp.uu.net/pub/archiving/zip/doc/</a>
*
* <p>Short is two bytes and Long is four bytes in big endian byte and
* word order; device numbers are currently not supported.</p>
*
* <p>Since the documentation this class is based upon doesn't mention
* the character encoding of the file name at all, it is assumed that
* it uses the current platform's default encoding.</p>
*/
public class AsiExtraField implements ZipExtraField, UnixStat, Cloneable {
private static final ZipShort HEADER_ID = new ZipShort(0x756E);
private static final int WORD = 4;
/**
* Standard Unix stat(2) file mode.
*/
private int mode = 0;
/**
* User ID.
*/
private int uid = 0;
/**
* Group ID.
*/
private int gid = 0;
/**
* File this entry points to, if it is a symbolic link.
*
* <p>empty string - if entry is not a symbolic link.</p>
*/
private String link = "";
/**
* Is this an entry for a directory?
*/
private boolean dirFlag = false;
/**
* Instance used to calculate checksums.
*/
private CRC32 crc = new CRC32();
/** Constructor for AsiExtraField. */
public AsiExtraField() {
}
/**
* The Header-ID.
* @return the value for the header id for this extrafield
*/
public ZipShort getHeaderId() {
return HEADER_ID;
}
/**
* Length of the extra field in the local file data - without
* Header-ID or length specifier.
* @return a <code>ZipShort</code> for the length of the data of this extra field
*/
public ZipShort getLocalFileDataLength() {
return new ZipShort(WORD // CRC
+ 2 // Mode
+ WORD // SizDev
+ 2 // UID
+ 2 // GID
+ getLinkedFile().getBytes().length);
// Uses default charset - see class Javadoc
}
/**
* Delegate to local file data.
* @return the centralDirectory length
*/
public ZipShort getCentralDirectoryLength() {
return getLocalFileDataLength();
}
/**
* The actual data to put into local file data - without Header-ID
* or length specifier.
* @return get the data
*/
public byte[] getLocalFileDataData() {
// CRC will be added later
byte[] data = new byte[getLocalFileDataLength().getValue() - WORD];
System.arraycopy(ZipShort.getBytes(getMode()), 0, data, 0, 2);
byte[] linkArray = getLinkedFile().getBytes(); // Uses default charset - see class Javadoc
System.arraycopy(ZipLong.getBytes(linkArray.length),
0, data, 2, WORD);
System.arraycopy(ZipShort.getBytes(getUserId()),
0, data, 6, 2);
System.arraycopy(ZipShort.getBytes(getGroupId()),
0, data, 8, 2);
System.arraycopy(linkArray, 0, data, 10, linkArray.length);
crc.reset();
crc.update(data);
long checksum = crc.getValue();
byte[] result = new byte[data.length + WORD];
System.arraycopy(ZipLong.getBytes(checksum), 0, result, 0, WORD);
System.arraycopy(data, 0, result, WORD, data.length);
return result;
}
/**
* Delegate to local file data.
* @return the local file data
*/
public byte[] getCentralDirectoryData() {
return getLocalFileDataData();
}
/**
* Set the user id.
* @param uid the user id
*/
public void setUserId(int uid) {
this.uid = uid;
}
/**
* Get the user id.
* @return the user id
*/
public int getUserId() {
return uid;
}
/**
* Set the group id.
* @param gid the group id
*/
public void setGroupId(int gid) {
this.gid = gid;
}
/**
* Get the group id.
* @return the group id
*/
public int getGroupId() {
return gid;
}
/**
* Indicate that this entry is a symbolic link to the given filename.
*
* @param name Name of the file this entry links to, empty String
* if it is not a symbolic link.
*
*/
public void setLinkedFile(String name) {
link = name;
mode = getMode(mode);
}
/**
* Name of linked file
*
* @return name of the file this entry links to if it is a
* symbolic link, the empty string otherwise.
*/
public String getLinkedFile() {
return link;
}
/**
* Is this entry a symbolic link?
* @return true if this is a symbolic link
*/
public boolean isLink() {
return !getLinkedFile().isEmpty();
}
/**
* File mode of this file.
* @param mode the file mode
*/
public void setMode(int mode) {
this.mode = getMode(mode);
}
/**
* File mode of this file.
* @return the file mode
*/
public int getMode() {
return mode;
}
/**
* Indicate whether this entry is a directory.
* @param dirFlag if true, this entry is a directory
*/
public void setDirectory(boolean dirFlag) {
this.dirFlag = dirFlag;
mode = getMode(mode);
}
/**
* Is this entry a directory?
* @return true if this entry is a directory
*/
public boolean isDirectory() {
return dirFlag && !isLink();
}
/**
* Populate data from this array as if it was in local file data.
* @param data an array of bytes
* @param offset the start offset
* @param length the number of bytes in the array from offset
* @throws ZipException on error
*/
public void parseFromLocalFileData(byte[] data, int offset, int length)
throws ZipException {
long givenChecksum = ZipLong.getValue(data, offset);
byte[] tmp = new byte[length - WORD];
System.arraycopy(data, offset + WORD, tmp, 0, length - WORD);
crc.reset();
crc.update(tmp);
long realChecksum = crc.getValue();
if (givenChecksum != realChecksum) {
throw new ZipException("bad CRC checksum "
+ Long.toHexString(givenChecksum)
+ " instead of "
+ Long.toHexString(realChecksum));
}
int newMode = ZipShort.getValue(tmp, 0);
byte[] linkArray = new byte[(int) ZipLong.getValue(tmp, 2)];
uid = ZipShort.getValue(tmp, 6);
gid = ZipShort.getValue(tmp, 8);
if (linkArray.length == 0) {
link = "";
} else {
System.arraycopy(tmp, 10, linkArray, 0, linkArray.length);
link = new String(linkArray); // Uses default charset - see class Javadoc
}
setDirectory((newMode & DIR_FLAG) != 0);
setMode(newMode);
}
/**
* Get the file mode for given permissions with the correct file type.
* @param mode the mode
* @return the type with the mode
*/
protected int getMode(int mode) {
int type = FILE_FLAG;
if (isLink()) {
type = LINK_FLAG;
} else if (isDirectory()) {
type = DIR_FLAG;
}
return type | (mode & PERM_MASK);
}
@Override
public Object clone() {
try {
AsiExtraField cloned = (AsiExtraField) super.clone();
cloned.crc = new CRC32();
return cloned;
} catch (CloneNotSupportedException cnfe) {
// impossible
throw new RuntimeException(cnfe);
}
}
}
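// A minimal sketch using only the setters defined above; mode, UID and GID
// values are placeholders:
//
//   AsiExtraField asi = new AsiExtraField()
//   asi.setMode(0755)
//   asi.setUserId(1000)
//   asi.setGroupId(1000)
//   byte[] localData = asi.getLocalFileDataData()   // CRC32 + mode + link length + UID/GID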

View file

@ -0,0 +1,20 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.util.zip.ZipException;
/**
* {@link ZipExtraField ZipExtraField} that knows how to parse central
* directory data.
*/
public interface CentralDirectoryParsingZipExtraField extends ZipExtraField {
/**
* Populate data from this array as if it was in central directory data.
* @param data an array of bytes
* @param offset the start offset
* @param length the number of bytes in the array from offset
*
* @throws ZipException on error
*/
void parseFromCentralDirectoryData(byte[] data, int offset, int length)
throws ZipException;
}

View file

@ -0,0 +1,272 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.lang.reflect.InvocationTargetException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.zip.ZipException;
/**
* ZipExtraField related methods.
*/
public class ExtraFieldUtils {
private static final int WORD = 4;
/**
* Static registry of known extra fields.
*/
private static final Map<ZipShort, Class<?>> implementations;
static {
implementations = new ConcurrentHashMap<>();
register(AsiExtraField.class);
register(JarMarker.class);
register(UnicodePathExtraField.class);
register(UnicodeCommentExtraField.class);
register(Zip64ExtendedInformationExtraField.class);
}
/**
* Register a ZipExtraField implementation.
*
* <p>The given class must have a no-arg constructor and implement
* the {@link ZipExtraField ZipExtraField interface}.</p>
* @param c the class to register
*/
public static void register(Class<?> c) {
try {
ZipExtraField ze = (ZipExtraField) c.getDeclaredConstructor().newInstance();
implementations.put(ze.getHeaderId(), c);
} catch (ClassCastException | InstantiationException | IllegalAccessException | InvocationTargetException | NoSuchMethodException e) {
throw new IllegalStateException("unable to initialize zip extra field implementation");
}
}
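// A minimal sketch; MyExtraField stands for a hypothetical ZipExtraField
// implementation with a no-arg constructor, and the header id is a
// placeholder:
//
//   ExtraFieldUtils.register(MyExtraField)
//   ZipExtraField field = ExtraFieldUtils.createExtraField(new ZipShort(0x1234))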
/**
* Create an instance of the appropriate ExtraField, falling back to
* {@link UnrecognizedExtraField UnrecognizedExtraField}.
* @param headerId the header identifier
* @return an instance of the appropriate ExtraField
* @exception InstantiationException if unable to instantiate the class
* @exception IllegalAccessException if not allowed to instantiate the class
*/
public static ZipExtraField createExtraField(ZipShort headerId)
throws InstantiationException, IllegalAccessException, NoSuchMethodException, InvocationTargetException {
Class<?> c = implementations.get(headerId);
if (c != null) {
return (ZipExtraField) c.getDeclaredConstructor().newInstance();
}
UnrecognizedExtraField u = new UnrecognizedExtraField();
u.setHeaderId(headerId);
return u;
}
/**
* Split the array into ExtraFields and populate them with the
* given data as local file data, throwing an exception if the
* data cannot be parsed.
* @param data an array of bytes as it appears in local file data
* @return an array of ExtraFields
* @throws ZipException on error
*/
public static ZipExtraField[] parse(byte[] data) throws ZipException {
return parse(data, true, UnparseableExtraField.THROW);
}
/**
* Split the array into ExtraFields and populate them with the
* given data, throwing an exception if the data cannot be parsed.
* @param data an array of bytes
* @param local whether data originates from the local file data
* or the central directory
* @return an array of ExtraFields
* @throws ZipException on error
*/
public static ZipExtraField[] parse(byte[] data, boolean local)
throws ZipException {
return parse(data, local, UnparseableExtraField.THROW);
}
/**
* Split the array into ExtraFields and populate them with the
* given data.
* @param data an array of bytes
* @param local whether data originates from the local file data
* or the central directory
* @param onUnparseableData what to do if the extra field data
* cannot be parsed.
* @return an array of ExtraFields
* @throws ZipException on error
*/
public static ZipExtraField[] parse(byte[] data, boolean local,
UnparseableExtraField onUnparseableData)
throws ZipException {
List<ZipExtraField> v = new ArrayList<>();
int start = 0;
LOOP:
while (start <= data.length - WORD) {
ZipShort headerId = new ZipShort(data, start);
int length = (new ZipShort(data, start + 2)).getValue();
if (start + WORD + length > data.length) {
switch (onUnparseableData.getKey()) {
case UnparseableExtraField.THROW_KEY:
throw new ZipException("bad extra field starting at "
+ start + ". Block length of " + length
+ " bytes exceeds remaining data of "
+ (data.length - start - WORD) + " bytes.");
case UnparseableExtraField.READ_KEY:
UnparseableExtraFieldData field = new UnparseableExtraFieldData();
if (local) {
field.parseFromLocalFileData(data, start, data.length - start);
} else {
field.parseFromCentralDirectoryData(data, start, data.length - start);
}
v.add(field);
//$FALL-THROUGH$
case UnparseableExtraField.SKIP_KEY:
// since we cannot parse the data we must assume
// the extra field consumes the whole rest of the
// available data
break LOOP;
default:
throw new ZipException("unknown UnparseableExtraField key: "
+ onUnparseableData.getKey());
}
}
try {
ZipExtraField ze = createExtraField(headerId);
if (local || !(ze instanceof CentralDirectoryParsingZipExtraField)) {
ze.parseFromLocalFileData(data, start + WORD, length);
} else {
((CentralDirectoryParsingZipExtraField) ze)
.parseFromCentralDirectoryData(data, start + WORD, length);
}
v.add(ze);
} catch (InstantiationException | IllegalAccessException | NoSuchMethodException | InvocationTargetException ie) {
ZipException z = new ZipException(ie.getMessage());
z.initCause(ie);
throw z;
}
start += (length + WORD);
}
ZipExtraField[] result = new ZipExtraField[v.size()];
return v.toArray(result);
}
/**
* Merges the local file data fields of the given ZipExtraFields.
* @param data an array of ExtraFields
* @return an array of bytes
*/
public static byte[] mergeLocalFileDataData(ZipExtraField[] data) {
final boolean lastIsUnparseableHolder = data.length > 0
&& data[data.length - 1] instanceof UnparseableExtraFieldData;
int regularExtraFieldCount = lastIsUnparseableHolder ? data.length - 1 : data.length;
int sum = WORD * regularExtraFieldCount;
for (ZipExtraField element : data) {
sum += element.getLocalFileDataLength().getValue();
}
byte[] result = new byte[sum];
int start = 0;
for (int i = 0; i < regularExtraFieldCount; i++) {
System.arraycopy(data[i].getHeaderId().getBytes(),
0, result, start, 2);
System.arraycopy(data[i].getLocalFileDataLength().getBytes(),
0, result, start + 2, 2);
byte[] local = data[i].getLocalFileDataData();
System.arraycopy(local, 0, result, start + WORD, local.length);
start += (local.length + WORD);
}
if (lastIsUnparseableHolder) {
byte[] local = data[data.length - 1].getLocalFileDataData();
System.arraycopy(local, 0, result, start, local.length);
}
return result;
}
/**
* Merges the central directory fields of the given ZipExtraFields.
* @param data an array of ExtraFields
* @return an array of bytes
*/
public static byte[] mergeCentralDirectoryData(ZipExtraField[] data) {
final boolean lastIsUnparseableHolder = data.length > 0
&& data[data.length - 1] instanceof UnparseableExtraFieldData;
int regularExtraFieldCount = lastIsUnparseableHolder ? data.length - 1 : data.length;
int sum = WORD * regularExtraFieldCount;
for (ZipExtraField element : data) {
sum += element.getCentralDirectoryLength().getValue();
}
byte[] result = new byte[sum];
int start = 0;
for (int i = 0; i < regularExtraFieldCount; i++) {
System.arraycopy(data[i].getHeaderId().getBytes(),
0, result, start, 2);
System.arraycopy(data[i].getCentralDirectoryLength().getBytes(),
0, result, start + 2, 2);
byte[] local = data[i].getCentralDirectoryData();
System.arraycopy(local, 0, result, start + WORD, local.length);
start += (local.length + WORD);
}
if (lastIsUnparseableHolder) {
byte[] local = data[data.length - 1].getCentralDirectoryData();
System.arraycopy(local, 0, result, start, local.length);
}
return result;
}
/**
* "enum" for the possible actions to take if the extra field
* cannot be parsed.
*/
public static final class UnparseableExtraField {
/**
* Key for "throw an exception" action.
*/
public static final int THROW_KEY = 0;
/**
* Key for "skip" action.
*/
public static final int SKIP_KEY = 1;
/**
* Key for "read" action.
*/
public static final int READ_KEY = 2;
/**
* Throw an exception if field cannot be parsed.
*/
public static final UnparseableExtraField THROW = new UnparseableExtraField(THROW_KEY);
/**
* Skip the extra field entirely and don't make its data
* available - effectively removing the extra field data.
*/
public static final UnparseableExtraField SKIP = new UnparseableExtraField(SKIP_KEY);
/**
* Read the extra field data into an instance of {@link
* UnparseableExtraFieldData UnparseableExtraFieldData}.
*/
public static final UnparseableExtraField READ = new UnparseableExtraField(READ_KEY);
private final int key;
private UnparseableExtraField(int k) {
key = k;
}
/**
* Key of the action to take.
*
* @return int
*/
public int getKey() {
return key;
}
}
}


@ -0,0 +1,72 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.io.IOException;
import java.nio.ByteBuffer;
/**
* A fallback ZipEncoding, which uses a java.io means to encode names.
*
* <p>This implementation is not suitable for encodings other than
* utf-8, because java.io encodes unmappable characters as question
* marks, leading to unreadable ZIP entries on some operating
* systems.</p>
*
* <p>Furthermore this implementation is unable to tell whether a
* given name can be safely encoded or not.</p>
*
* <p>This implementation acts as a last resort implementation, when
* neither {@link Simple8BitZipEncoding} nor {@link NioZipEncoding} is
* available.</p>
*
* <p>The methods of this class are reentrant.</p>
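*
* <p>A minimal usage sketch with an explicitly named charset; both calls
* declare {@link java.io.IOException}, and the charset name is only an
* example:</p>
* <pre>{@code
* ZipEncoding enc = new FallbackZipEncoding("UTF-8");
* java.nio.ByteBuffer encoded = enc.encode("entry-name.txt");
* String decoded = enc.decode(new byte[] { 0x61, 0x62 }); // "ab"
* }</pre>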
*/
class FallbackZipEncoding implements ZipEncoding {
private final String charset;
/**
* Construct a fallback zip encoding, which uses the platform's
* default charset.
*/
public FallbackZipEncoding() {
this.charset = null;
}
/**
* Construct a fallback zip encoding, which uses the given charset.
*
* @param charset The name of the charset or {@code null} for
* the platform's default character set.
*/
public FallbackZipEncoding(final String charset) {
this.charset = charset;
}
/**
* @see ZipEncoding#canEncode(java.lang.String)
*/
public boolean canEncode(final String name) {
return true;
}
/**
* @see ZipEncoding#encode(java.lang.String)
*/
public ByteBuffer encode(final String name) throws IOException {
if (this.charset == null) { // i.e. use default charset, see no-args constructor
return ByteBuffer.wrap(name.getBytes());
} else {
return ByteBuffer.wrap(name.getBytes(this.charset));
}
}
/**
* @see ZipEncoding#decode(byte[])
*/
public String decode(final byte[] data) throws IOException {
if (this.charset == null) { // i.e. use default charset, see no-args constructor
return new String(data);
} else {
return new String(data, this.charset);
}
}
}


@ -0,0 +1,193 @@
package org.xbib.gradle.plugin.shadow.zip;
/**
* Parser/encoder for the "general purpose bit" field in ZIP's local
* file and central directory headers.
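*
* <p>A minimal sketch of parsing the two flag bytes of a header and encoding
* them again; {@code header} and {@code offset} are assumed to point at the
* general purpose bit field:</p>
* <pre>{@code
* GeneralPurposeBit bit = GeneralPurposeBit.parse(header, offset);
* bit.useUTF8ForNames(true);
* byte[] flags = bit.encode(); // the two flag bytes
* }</pre>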
*/
public final class GeneralPurposeBit implements Cloneable {
/**
* Indicates that the file is encrypted.
*/
private static final int ENCRYPTION_FLAG = 1;
/**
* Indicates that a data descriptor stored after the file contents
* will hold CRC and size information.
*/
private static final int DATA_DESCRIPTOR_FLAG = 1 << 3;
/**
* Indicates strong encryption.
*/
private static final int STRONG_ENCRYPTION_FLAG = 1 << 6;
/**
* Indicates that filenames are written in utf-8.
*
* <p>The only reason this is public is that {@link
* ZipOutputStream#EFS_FLAG} was public in several versions of
* Apache Ant and we needed a substitute for it.</p>
*/
public static final int UFT8_NAMES_FLAG = 1 << 11;
private boolean languageEncodingFlag = false;
private boolean dataDescriptorFlag = false;
private boolean encryptionFlag = false;
private boolean strongEncryptionFlag = false;
public GeneralPurposeBit() {
}
/**
* whether the current entry uses UTF8 for file name and comment.
*
* @return boolean
*/
public boolean usesUTF8ForNames() {
return languageEncodingFlag;
}
/**
* whether the current entry will use UTF8 for file name and comment.
*
* @param b boolean
*/
public void useUTF8ForNames(boolean b) {
languageEncodingFlag = b;
}
/**
* whether the current entry uses the data descriptor to store CRC
* and size information
*
* @return boolean
*/
public boolean usesDataDescriptor() {
return dataDescriptorFlag;
}
/**
* whether the current entry will use the data descriptor to store
* CRC and size information
*
* @param b boolean
*/
public void useDataDescriptor(boolean b) {
dataDescriptorFlag = b;
}
/**
* whether the current entry is encrypted
*
* @return boolean
*/
public boolean usesEncryption() {
return encryptionFlag;
}
/**
* whether the current entry will be encrypted
*
* @param b boolean
*/
public void useEncryption(boolean b) {
encryptionFlag = b;
}
/**
* whether the current entry is encrypted using strong encryption
*
* @return boolean
*/
public boolean usesStrongEncryption() {
return encryptionFlag && strongEncryptionFlag;
}
/**
* whether the current entry will be encrypted using strong encryption
*
* @param b boolean
*/
public void useStrongEncryption(boolean b) {
strongEncryptionFlag = b;
if (b) {
useEncryption(true);
}
}
/**
* Encodes the set bits in a form suitable for ZIP archives.
*
* @return byte[]
*/
public byte[] encode() {
byte[] result = new byte[2];
encode(result, 0);
return result;
}
/**
* Encodes the set bits in a form suitable for ZIP archives.
*
* @param buf the output buffer
* @param offset the offset within the output buffer of the first byte to be written;
* must be non-negative and no larger than {@code buf.length - 2}
*/
public void encode(byte[] buf, int offset) {
ZipShort.putShort((dataDescriptorFlag ? DATA_DESCRIPTOR_FLAG : 0)
| (languageEncodingFlag ? UFT8_NAMES_FLAG : 0)
| (encryptionFlag ? ENCRYPTION_FLAG : 0)
| (strongEncryptionFlag ? STRONG_ENCRYPTION_FLAG : 0),
buf, offset);
}
/**
* Parses the supported flags from the given archive data.
*
* @param data local file header or a central directory entry.
* @param offset offset at which the general purpose bit starts
* @return GeneralPurposeBit
*/
public static GeneralPurposeBit parse(final byte[] data, final int offset) {
final int generalPurposeFlag = ZipShort.getValue(data, offset);
GeneralPurposeBit b = new GeneralPurposeBit();
b.useDataDescriptor((generalPurposeFlag & DATA_DESCRIPTOR_FLAG) != 0);
b.useUTF8ForNames((generalPurposeFlag & UFT8_NAMES_FLAG) != 0);
b.useStrongEncryption((generalPurposeFlag & STRONG_ENCRYPTION_FLAG)
!= 0);
b.useEncryption((generalPurposeFlag & ENCRYPTION_FLAG) != 0);
return b;
}
@Override
public int hashCode() {
return 3 * (7 * (13 * (17 * (encryptionFlag ? 1 : 0)
+ (strongEncryptionFlag ? 1 : 0))
+ (languageEncodingFlag ? 1 : 0))
+ (dataDescriptorFlag ? 1 : 0));
}
@Override
public boolean equals(Object o) {
if (o instanceof GeneralPurposeBit) {
GeneralPurposeBit g = (GeneralPurposeBit) o;
return g.encryptionFlag == encryptionFlag
&& g.strongEncryptionFlag == strongEncryptionFlag
&& g.languageEncodingFlag == languageEncodingFlag
&& g.dataDescriptorFlag == dataDescriptorFlag;
}
return false;
}
@Override
public Object clone() {
try {
return super.clone();
} catch (CloneNotSupportedException ex) {
// impossible
throw new RuntimeException("GeneralPurposeBit is not Cloneable?", ex); //NOSONAR
}
}
}


@ -0,0 +1,87 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.util.zip.ZipException;
/**
* If this extra field is added as the very first extra field of the
* archive, Solaris will consider it an executable jar file.
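*
* <p>A minimal sketch; the marker carries no payload, so merging it into an
* entry's extra field data emits only its header id and a zero length:</p>
* <pre>{@code
* ZipExtraField[] fields = { JarMarker.getInstance() };
* byte[] extra = ExtraFieldUtils.mergeLocalFileDataData(fields);
* }</pre>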
*/
public final class JarMarker implements ZipExtraField {
private static final ZipShort ID = new ZipShort(0xCAFE);
private static final ZipShort NULL = new ZipShort(0);
private static final byte[] NO_BYTES = new byte[0];
private static final JarMarker DEFAULT = new JarMarker();
/** No-arg constructor */
public JarMarker() {
// empty
}
/**
* Since JarMarker is stateless, we can always use the same instance.
* @return the DEFAULT JarMarker instance.
*/
public static JarMarker getInstance() {
return DEFAULT;
}
/**
* The Header-ID.
* @return the header id
*/
public ZipShort getHeaderId() {
return ID;
}
/**
* Length of the extra field in the local file data - without
* Header-ID or length specifier.
* @return 0
*/
public ZipShort getLocalFileDataLength() {
return NULL;
}
/**
* Length of the extra field in the central directory - without
* Header-ID or length specifier.
* @return 0
*/
public ZipShort getCentralDirectoryLength() {
return NULL;
}
/**
* The actual data to put into local file data - without Header-ID
* or length specifier.
* @return the data
*/
public byte[] getLocalFileDataData() {
return NO_BYTES;
}
/**
* The actual data to put into the central directory - without Header-ID or
* length specifier.
* @return the data
*/
public byte[] getCentralDirectoryData() {
return NO_BYTES;
}
/**
* Populate data from this array as if it was in local file data.
* @param data an array of bytes
* @param offset the start offset
* @param length the number of bytes in the array from offset
*
* @throws ZipException on error
*/
public void parseFromLocalFileData(byte[] data, int offset, int length)
throws ZipException {
if (length != 0) {
throw new ZipException("JarMarker doesn't expect any data");
}
}
}


@ -0,0 +1,100 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;
import java.nio.charset.CoderResult;
import java.nio.charset.CodingErrorAction;
/**
* A ZipEncoding, which uses a java.nio {@link
* java.nio.charset.Charset Charset} to encode names.
*
* <p>This implementation works for all cases under java-1.5 or
* later. However, in java-1.4, some charsets don't have a java.nio
* implementation, most notably the default ZIP encoding Cp437.</p>
*
* <p>The methods of this class are reentrant.</p>
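*
* <p>A minimal usage sketch wrapping UTF-8; the class is package-private, so
* it is only used from within this package:</p>
* <pre>{@code
* NioZipEncoding enc = new NioZipEncoding(java.nio.charset.StandardCharsets.UTF_8);
* if (enc.canEncode("entry-näme.txt")) {
*     java.nio.ByteBuffer encoded = enc.encode("entry-näme.txt");
* }
* }</pre>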
*/
class NioZipEncoding implements ZipEncoding {
private final Charset charset;
/**
* Construct an NIO based zip encoding, which wraps the given
* charset.
*
* @param charset The NIO charset to wrap.
*/
public NioZipEncoding(final Charset charset) {
this.charset = charset;
}
/**
* @see ZipEncoding#canEncode(java.lang.String)
*/
public boolean canEncode(final String name) {
final CharsetEncoder enc = this.charset.newEncoder();
enc.onMalformedInput(CodingErrorAction.REPORT);
enc.onUnmappableCharacter(CodingErrorAction.REPORT);
return enc.canEncode(name);
}
/**
* @see ZipEncoding#encode(java.lang.String)
*/
public ByteBuffer encode(final String name) {
final CharsetEncoder enc = this.charset.newEncoder();
enc.onMalformedInput(CodingErrorAction.REPORT);
enc.onUnmappableCharacter(CodingErrorAction.REPORT);
final CharBuffer cb = CharBuffer.wrap(name);
ByteBuffer out = ByteBuffer.allocate(name.length()
+ (name.length() + 1) / 2);
while (cb.remaining() > 0) {
final CoderResult res = enc.encode(cb, out, true);
if (res.isUnmappable() || res.isMalformed()) {
// write the unmappable characters in utf-16
// pseudo-URL encoding style to ByteBuffer.
if (res.length() * 6 > out.remaining()) {
out = ZipEncodingHelper.growBuffer(out, out.position()
+ res.length() * 6);
}
for (int i = 0; i < res.length(); ++i) {
ZipEncodingHelper.appendSurrogate(out, cb.get());
}
} else if (res.isOverflow()) {
out = ZipEncodingHelper.growBuffer(out, 0);
} else if (res.isUnderflow()) {
enc.flush(out);
break;
}
}
out.limit(out.position());
out.rewind();
return out;
}
/**
* @see ZipEncoding#decode(byte[])
*/
public String decode(final byte[] data) throws IOException {
return this.charset.newDecoder()
.onMalformedInput(CodingErrorAction.REPORT)
.onUnmappableCharacter(CodingErrorAction.REPORT)
.decode(ByteBuffer.wrap(data)).toString();
}
}


@ -0,0 +1,250 @@
package org.xbib.gradle.plugin.shadow.zip;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
/**
* This ZipEncoding implementation implements a simple 8 bit character
* set, which meets the following restrictions:
*
* <ul>
* <li>Characters 0x0000 to 0x007f are encoded as the corresponding
* byte values 0x00 to 0x7f.</li>
* <li>All byte codes from 0x80 to 0xff are mapped to a unique Unicode
* character in the range 0x0080 to 0x7fff. (No support for
* UTF-16 surrogates)</li>
* </ul>
*
* <p>These restrictions cover the most prominent omissions of the
* Java 1.4 {@link java.nio.charset.Charset Charset} implementation,
* Cp437 and Cp850.</p>
*
* <p>The methods of this class are reentrant.</p>
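*
* <p>A minimal sketch that builds a Latin-1 style table, mapping the byte
* values 0x80..0xff to the code points 0x0080..0x00ff; real code pages such
* as Cp437 use different tables:</p>
* <pre>{@code
* char[] high = new char[128];
* for (int i = 0; i < 128; i++) {
*     high[i] = (char) (0x80 + i);
* }
* ZipEncoding enc = new Simple8BitZipEncoding(high);
* boolean ok = enc.canEncode("café"); // true for this table
* }</pre>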
*/
class Simple8BitZipEncoding implements ZipEncoding {
/**
* A character entity, which is put to the reverse mapping table
* of a simple encoding.
*/
private static final class Simple8BitChar implements Comparable<Simple8BitChar> {
public final char unicode;
public final byte code;
Simple8BitChar(final byte code, final char unicode) {
this.code = code;
this.unicode = unicode;
}
public int compareTo(final Simple8BitChar a) {
return this.unicode - a.unicode;
}
@Override
public String toString() {
return "0x" + Integer.toHexString(0xffff & unicode)
+ "->0x" + Integer.toHexString(0xff & code);
}
@Override
public boolean equals(final Object o) {
if (o instanceof Simple8BitChar) {
final Simple8BitChar other = (Simple8BitChar) o;
return unicode == other.unicode && code == other.code;
}
return false;
}
@Override
public int hashCode() {
return unicode;
}
}
/**
* The characters for byte values of 128 to 255 stored as an array of
* 128 chars.
*/
private final char[] highChars;
/**
* A list of {@link Simple8BitChar} objects sorted by the unicode
* field. This list is used to binary search reverse mapping of
* unicode characters with a character code greater than 127.
*/
private final List<Simple8BitChar> reverseMapping;
/**
* @param highChars The characters for byte values of 128 to 255
* stored as an array of 128 chars.
*/
public Simple8BitZipEncoding(final char[] highChars) {
this.highChars = highChars.clone();
final List<Simple8BitChar> temp =
new ArrayList<>(this.highChars.length);
// the ++code increments below intentionally wrap the byte from 127 to -128,
// yielding the signed equivalents of the unsigned byte values 128..255
byte code = 127;
for (char highChar : this.highChars) {
temp.add(new Simple8BitChar(++code, highChar));
}
Collections.sort(temp);
this.reverseMapping = Collections.unmodifiableList(temp);
}
/**
* Return the character code for a given encoded byte.
*
* @param b The byte to decode.
* @return The associated character value.
*/
public char decodeByte(final byte b) {
// code 0-127
if (b >= 0) {
return (char) b;
}
// byte is signed, so 128 == -128 and 255 == -1
return this.highChars[128 + b];
}
/**
* @param c The character to encode.
* @return Whether the given unicode character is covered by this encoding.
*/
public boolean canEncodeChar(final char c) {
if (c >= 0 && c < 128) {
return true;
}
final Simple8BitChar r = this.encodeHighChar(c);
return r != null;
}
/**
* Pushes the encoded form of the given character to the given byte buffer.
*
* @param bb The byte buffer to write to.
* @param c The character to encode.
* @return Whether the given unicode character is covered by this encoding.
* If {@code false} is returned, nothing is pushed to the
* byte buffer.
*/
public boolean pushEncodedChar(final ByteBuffer bb, final char c) {
if (c >= 0 && c < 128) {
bb.put((byte) c);
return true;
}
final Simple8BitChar r = this.encodeHighChar(c);
if (r == null) {
return false;
}
bb.put(r.code);
return true;
}
/**
* @param c A unicode character in the range from 0x0080 to 0x7f00
* @return A Simple8BitChar, if this character is covered by this encoding.
* A {@code null} value is returned, if this character is not
* covered by this encoding.
*/
private Simple8BitChar encodeHighChar(final char c) {
// for performance and simplicity, yet another reincarnation of
// binary search...
int i0 = 0;
int i1 = this.reverseMapping.size();
while (i1 > i0) {
final int i = i0 + (i1 - i0) / 2;
final Simple8BitChar m = this.reverseMapping.get(i);
if (m.unicode == c) {
return m;
}
if (m.unicode < c) {
i0 = i + 1;
} else {
i1 = i;
}
}
if (i0 >= this.reverseMapping.size()) {
return null;
}
final Simple8BitChar r = this.reverseMapping.get(i0);
if (r.unicode != c) {
return null;
}
return r;
}
/**
* @see ZipEncoding#canEncode(java.lang.String)
*/
public boolean canEncode(final String name) {
for (int i = 0; i < name.length(); ++i) {
final char c = name.charAt(i);
if (!this.canEncodeChar(c)) {
return false;
}
}
return true;
}
/**
* @see ZipEncoding#encode(java.lang.String)
*/
public ByteBuffer encode(final String name) {
ByteBuffer out = ByteBuffer.allocate(name.length()
+ 6 + (name.length() + 1) / 2);
for (int i = 0; i < name.length(); ++i) {
final char c = name.charAt(i);
if (out.remaining() < 6) {
out = ZipEncodingHelper.growBuffer(out, out.position() + 6);
}
if (!this.pushEncodedChar(out, c)) {
ZipEncodingHelper.appendSurrogate(out, c);
}
}
out.limit(out.position());
out.rewind();
return out;
}
/**
* @see ZipEncoding#decode(byte[])
*/
public String decode(final byte[] data) throws IOException {
final char[] ret = new char[data.length];
for (int i = 0; i < data.length; ++i) {
ret[i] = this.decodeByte(data[i]);
}
return new String(ret);
}
}


@ -0,0 +1,51 @@
package org.xbib.gradle.plugin.shadow.zip;
/**
* Info-ZIP Unicode Comment Extra Field (0x6375):
*
* <p>Stores the UTF-8 version of the file comment as stored in the
* central directory header.</p>
*
* <p>See <a href="http://www.pkware.com/documents/casestudies/APPNOTE.TXT">PKWARE's
* APPNOTE.TXT, section 4.6.8</a>.</p>
*
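* <p>A minimal sketch; {@code commentBytes} is assumed to hold the comment
* exactly as it was written to the archive:</p>
* <pre>{@code
* UnicodeCommentExtraField field =
*     new UnicodeCommentExtraField("entry comment", commentBytes);
* }</pre>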
*/
public class UnicodeCommentExtraField extends AbstractUnicodeExtraField {
public static final ZipShort UCOM_ID = new ZipShort(0x6375);
public UnicodeCommentExtraField() {
}
/**
* Assemble as unicode comment extension from the comment given as
* text as well as the encoded bytes actually written to the archive.
*
* @param text The file comment
* @param bytes the bytes actually written to the archive
* @param off The offset of the encoded comment in <code>bytes</code>.
* @param len The length of the encoded comment in
* <code>bytes</code>.
*/
public UnicodeCommentExtraField(final String text, final byte[] bytes, final int off,
final int len) {
super(text, bytes, off, len);
}
/**
* Assemble as unicode comment extension from the comment given as
* text as well as the bytes actually written to the archive.
*
* @param comment The file comment
* @param bytes the bytes actually written to the archive
*/
public UnicodeCommentExtraField(final String comment, final byte[] bytes) {
super(comment, bytes);
}
/** {@inheritDoc} */
public ZipShort getHeaderId() {
return UCOM_ID;
}
}


@ -0,0 +1,48 @@
package org.xbib.gradle.plugin.shadow.zip;
/**
* Info-ZIP Unicode Path Extra Field (0x7075):
*
* <p>Stores the UTF-8 version of the file name field as stored in the
* local header and central directory header.</p>
*
* <p>See <a href="http://www.pkware.com/documents/casestudies/APPNOTE.TXT">PKWARE's
* APPNOTE.TXT, section 4.6.9</a>.</p>
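*
* <p>A minimal sketch; {@code nameBytes} is assumed to hold the file name
* exactly as it was written to the local header:</p>
* <pre>{@code
* UnicodePathExtraField field =
*     new UnicodePathExtraField("dir/entry.txt", nameBytes);
* }</pre>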
*/
public class UnicodePathExtraField extends AbstractUnicodeExtraField {
public static final ZipShort UPATH_ID = new ZipShort(0x7075);
public UnicodePathExtraField() {
}
/**
* Assemble as unicode path extension from the name given as
* text as well as the encoded bytes actually written to the archive.
*
* @param text The file name
* @param bytes the bytes actually written to the archive
* @param off The offset of the encoded filename in <code>bytes</code>.
* @param len The length of the encoded filename in
* <code>bytes</code>.
*/
public UnicodePathExtraField(final String text, final byte[] bytes, final int off, final int len) {
super(text, bytes, off, len);
}
/**
* Assemble as unicode path extension from the name given as
* text as well as the encoded bytes actually written to the archive.
*
* @param name The file name
* @param bytes the bytes actually written to the archive
*/
public UnicodePathExtraField(final String name, final byte[] bytes) {
super(name, bytes);
}
/** {@inheritDoc} */
public ZipShort getHeaderId() {
return UPATH_ID;
}
}

Some files were not shown because too many files have changed in this diff.