Replace godep with vndr
Vndr has a simpler configuration and allows pointing to forked packages. Additionally, other Docker projects are now using vndr, which makes vendoring in distribution more consistent. Updates letsencrypt to use a fork. No longer uses sub-vendored packages.

Signed-off-by: Derek McGowan <derek@mcgstyle.net> (github: dmcgowan)
This commit is contained in:
parent 8f9abbd27f
commit a685e3fc98
265 changed files with 30150 additions and 19449 deletions
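For context on the "simpler configuration" mentioned in the commit message: vndr reads a vendor.conf file at the repository root, one dependency per line in the form "<import path> <revision> [alternate repository URL]", and the optional third field is what lets a dependency be pinned to a fork. The lines below are an illustrative sketch with placeholder revisions and fork owner, not the actual pins introduced by this commit.

# vendor.conf — illustrative placeholders only, not the real pins from this commit
github.com/aws/aws-sdk-go <revision>
github.com/xenolf/lego <revision> https://github.com/<fork-owner>/lego  # fork selected via the alternate repository field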
202  vendor/github.com/aws/aws-sdk-go/LICENSE.txt  generated  vendored
@@ -1,202 +0,0 @@
|
|||
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
3  vendor/github.com/aws/aws-sdk-go/NOTICE.txt  generated  vendored
@@ -1,3 +0,0 @@
AWS SDK for Go
Copyright 2015 Amazon.com, Inc. or its affiliates. All Rights Reserved.
Copyright 2014-2015 Stripe, Inc.
12  vendor/github.com/aws/aws-sdk-go/aws/credentials/example.ini  generated  vendored
@@ -1,12 +0,0 @@
[default]
aws_access_key_id = accessKey
aws_secret_access_key = secret
aws_session_token = token

[no_token]
aws_access_key_id = accessKey
aws_secret_access_key = secret

[with_colon]
aws_access_key_id: accessKey
aws_secret_access_key: secret
75  vendor/github.com/aws/aws-sdk-go/private/endpoints/endpoints.json  generated  vendored
@@ -1,75 +0,0 @@
{
  "version": 2,
  "endpoints": {
    "*/*": {
      "endpoint": "{service}.{region}.amazonaws.com"
    },
    "cn-north-1/*": {
      "endpoint": "{service}.{region}.amazonaws.com.cn",
      "signatureVersion": "v4"
    },
    "cn-north-1/ec2metadata": {
      "endpoint": "http://169.254.169.254/latest"
    },
    "us-gov-west-1/iam": {
      "endpoint": "iam.us-gov.amazonaws.com"
    },
    "us-gov-west-1/sts": {
      "endpoint": "sts.us-gov-west-1.amazonaws.com"
    },
    "us-gov-west-1/s3": {
      "endpoint": "s3-{region}.amazonaws.com"
    },
    "us-gov-west-1/ec2metadata": {
      "endpoint": "http://169.254.169.254/latest"
    },
    "*/cloudfront": {
      "endpoint": "cloudfront.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "*/cloudsearchdomain": {
      "endpoint": "",
      "signingRegion": "us-east-1"
    },
    "*/data.iot": {
      "endpoint": "",
      "signingRegion": "us-east-1"
    },
    "*/ec2metadata": {
      "endpoint": "http://169.254.169.254/latest"
    },
    "*/iam": {
      "endpoint": "iam.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "*/importexport": {
      "endpoint": "importexport.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "*/route53": {
      "endpoint": "route53.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "*/sts": {
      "endpoint": "sts.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "*/waf": {
      "endpoint": "waf.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "us-east-1/sdb": {
      "endpoint": "sdb.amazonaws.com",
      "signingRegion": "us-east-1"
    },
    "*/s3": {
      "endpoint": "s3-{region}.amazonaws.com"
    },
    "us-east-1/s3": {
      "endpoint": "s3.amazonaws.com"
    },
    "eu-central-1/s3": {
      "endpoint": "{service}.{region}.amazonaws.com"
    }
  }
}
4  vendor/github.com/aws/aws-sdk-go/vendor/github.com/go-ini/ini/.gitignore  generated  vendored
@@ -1,4 +0,0 @@
testdata/conf_out.ini
ini.sublime-project
ini.sublime-workspace
testdata/conf_reflect.ini
191  vendor/github.com/aws/aws-sdk-go/vendor/github.com/go-ini/ini/LICENSE  generated  vendored
@@ -1,191 +0,0 @@
|
|||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction, and
|
||||
distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by the copyright
|
||||
owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all other entities
|
||||
that control, are controlled by, or are under common control with that entity.
|
||||
For the purposes of this definition, "control" means (i) the power, direct or
|
||||
indirect, to cause the direction or management of such entity, whether by
|
||||
contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity exercising
|
||||
permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications, including
|
||||
but not limited to software source code, documentation source, and configuration
|
||||
files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical transformation or
|
||||
translation of a Source form, including but not limited to compiled object code,
|
||||
generated documentation, and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or Object form, made
|
||||
available under the License, as indicated by a copyright notice that is included
|
||||
in or attached to the work (an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object form, that
|
||||
is based on (or derived from) the Work and for which the editorial revisions,
|
||||
annotations, elaborations, or other modifications represent, as a whole, an
|
||||
original work of authorship. For the purposes of this License, Derivative Works
|
||||
shall not include works that remain separable from, or merely link (or bind by
|
||||
name) to the interfaces of, the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including the original version
|
||||
of the Work and any modifications or additions to that Work or Derivative Works
|
||||
thereof, that is intentionally submitted to Licensor for inclusion in the Work
|
||||
by the copyright owner or by an individual or Legal Entity authorized to submit
|
||||
on behalf of the copyright owner. For the purposes of this definition,
|
||||
"submitted" means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems, and
|
||||
issue tracking systems that are managed by, or on behalf of, the Licensor for
|
||||
the purpose of discussing and improving the Work, but excluding communication
|
||||
that is conspicuously marked or otherwise designated in writing by the copyright
|
||||
owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf
|
||||
of whom a Contribution has been received by Licensor and subsequently
|
||||
incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License.
|
||||
|
||||
Subject to the terms and conditions of this License, each Contributor hereby
|
||||
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
|
||||
irrevocable copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the Work and such
|
||||
Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License.
|
||||
|
||||
Subject to the terms and conditions of this License, each Contributor hereby
|
||||
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
|
||||
irrevocable (except as stated in this section) patent license to make, have
|
||||
made, use, offer to sell, sell, import, and otherwise transfer the Work, where
|
||||
such license applies only to those patent claims licensable by such Contributor
|
||||
that are necessarily infringed by their Contribution(s) alone or by combination
|
||||
of their Contribution(s) with the Work to which such Contribution(s) was
|
||||
submitted. If You institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work or a
|
||||
Contribution incorporated within the Work constitutes direct or contributory
|
||||
patent infringement, then any patent licenses granted to You under this License
|
||||
for that Work shall terminate as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution.
|
||||
|
||||
You may reproduce and distribute copies of the Work or Derivative Works thereof
|
||||
in any medium, with or without modifications, and in Source or Object form,
|
||||
provided that You meet the following conditions:
|
||||
|
||||
You must give any other recipients of the Work or Derivative Works a copy of
|
||||
this License; and
|
||||
You must cause any modified files to carry prominent notices stating that You
|
||||
changed the files; and
|
||||
You must retain, in the Source form of any Derivative Works that You distribute,
|
||||
all copyright, patent, trademark, and attribution notices from the Source form
|
||||
of the Work, excluding those notices that do not pertain to any part of the
|
||||
Derivative Works; and
|
||||
If the Work includes a "NOTICE" text file as part of its distribution, then any
|
||||
Derivative Works that You distribute must include a readable copy of the
|
||||
attribution notices contained within such NOTICE file, excluding those notices
|
||||
that do not pertain to any part of the Derivative Works, in at least one of the
|
||||
following places: within a NOTICE text file distributed as part of the
|
||||
Derivative Works; within the Source form or documentation, if provided along
|
||||
with the Derivative Works; or, within a display generated by the Derivative
|
||||
Works, if and wherever such third-party notices normally appear. The contents of
|
||||
the NOTICE file are for informational purposes only and do not modify the
|
||||
License. You may add Your own attribution notices within Derivative Works that
|
||||
You distribute, alongside or as an addendum to the NOTICE text from the Work,
|
||||
provided that such additional attribution notices cannot be construed as
|
||||
modifying the License.
|
||||
You may add Your own copyright statement to Your modifications and may provide
|
||||
additional or different license terms and conditions for use, reproduction, or
|
||||
distribution of Your modifications, or for any such Derivative Works as a whole,
|
||||
provided Your use, reproduction, and distribution of the Work otherwise complies
|
||||
with the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions.
|
||||
|
||||
Unless You explicitly state otherwise, any Contribution intentionally submitted
|
||||
for inclusion in the Work by You to the Licensor shall be under the terms and
|
||||
conditions of this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify the terms of
|
||||
any separate license agreement you may have executed with Licensor regarding
|
||||
such Contributions.
|
||||
|
||||
6. Trademarks.
|
||||
|
||||
This License does not grant permission to use the trade names, trademarks,
|
||||
service marks, or product names of the Licensor, except as required for
|
||||
reasonable and customary use in describing the origin of the Work and
|
||||
reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty.
|
||||
|
||||
Unless required by applicable law or agreed to in writing, Licensor provides the
|
||||
Work (and each Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied,
|
||||
including, without limitation, any warranties or conditions of TITLE,
|
||||
NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are
|
||||
solely responsible for determining the appropriateness of using or
|
||||
redistributing the Work and assume any risks associated with Your exercise of
|
||||
permissions under this License.
|
||||
|
||||
8. Limitation of Liability.
|
||||
|
||||
In no event and under no legal theory, whether in tort (including negligence),
|
||||
contract, or otherwise, unless required by applicable law (such as deliberate
|
||||
and grossly negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special, incidental,
|
||||
or consequential damages of any character arising as a result of this License or
|
||||
out of the use or inability to use the Work (including but not limited to
|
||||
damages for loss of goodwill, work stoppage, computer failure or malfunction, or
|
||||
any and all other commercial damages or losses), even if such Contributor has
|
||||
been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability.
|
||||
|
||||
While redistributing the Work or Derivative Works thereof, You may choose to
|
||||
offer, and charge a fee for, acceptance of support, warranty, indemnity, or
|
||||
other liability obligations and/or rights consistent with this License. However,
|
||||
in accepting such obligations, You may act only on Your own behalf and on Your
|
||||
sole responsibility, not on behalf of any other Contributor, and only if You
|
||||
agree to indemnify, defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason of your
|
||||
accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work
|
||||
|
||||
To apply the Apache License to your work, attach the following boilerplate
|
||||
notice, with the fields enclosed by brackets "[]" replaced with your own
|
||||
identifying information. (Don't include the brackets!) The text should be
|
||||
enclosed in the appropriate comment syntax for the file format. We also
|
||||
recommend that a file or class name and description of purpose be included on
|
||||
the same "printed page" as the copyright notice for easier identification within
|
||||
third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
560  vendor/github.com/aws/aws-sdk-go/vendor/github.com/go-ini/ini/README.md  generated  vendored
@@ -1,560 +0,0 @@
|
|||
ini [](https://drone.io/github.com/go-ini/ini/latest) [](http://gocover.io/github.com/go-ini/ini)
|
||||
===
|
||||
|
||||

|
||||
|
||||
Package ini provides INI file read and write functionality in Go.
|
||||
|
||||
[简体中文](README_ZH.md)
|
||||
|
||||
## Feature
|
||||
|
||||
- Load multiple data sources(`[]byte` or file) with overwrites.
|
||||
- Read with recursion values.
|
||||
- Read with parent-child sections.
|
||||
- Read with auto-increment key names.
|
||||
- Read with multiple-line values.
|
||||
- Read with tons of helper methods.
|
||||
- Read and convert values to Go types.
|
||||
- Read and **WRITE** comments of sections and keys.
|
||||
- Manipulate sections, keys and comments with ease.
|
||||
- Keep sections and keys in order as you parse and save.
|
||||
|
||||
## Installation
|
||||
|
||||
go get gopkg.in/ini.v1
|
||||
|
||||
## Getting Started
|
||||
|
||||
### Loading from data sources
|
||||
|
||||
A **Data Source** is either raw data in type `[]byte` or a file name with type `string` and you can load **as many as** data sources you want. Passing other types will simply return an error.
|
||||
|
||||
```go
|
||||
cfg, err := ini.Load([]byte("raw data"), "filename")
|
||||
```
|
||||
|
||||
Or start with an empty object:
|
||||
|
||||
```go
|
||||
cfg := ini.Empty()
|
||||
```
|
||||
|
||||
When you cannot decide how many data sources to load at the beginning, you still able to **Append()** them later.
|
||||
|
||||
```go
|
||||
err := cfg.Append("other file", []byte("other raw data"))
|
||||
```
|
||||
|
||||
### Working with sections
|
||||
|
||||
To get a section, you would need to:
|
||||
|
||||
```go
|
||||
section, err := cfg.GetSection("section name")
|
||||
```
|
||||
|
||||
For a shortcut for default section, just give an empty string as name:
|
||||
|
||||
```go
|
||||
section, err := cfg.GetSection("")
|
||||
```
|
||||
|
||||
When you're pretty sure the section exists, following code could make your life easier:
|
||||
|
||||
```go
|
||||
section := cfg.Section("")
|
||||
```
|
||||
|
||||
What happens when the section somehow does not exist? Don't panic, it automatically creates and returns a new section to you.
|
||||
|
||||
To create a new section:
|
||||
|
||||
```go
|
||||
err := cfg.NewSection("new section")
|
||||
```
|
||||
|
||||
To get a list of sections or section names:
|
||||
|
||||
```go
|
||||
sections := cfg.Sections()
|
||||
names := cfg.SectionStrings()
|
||||
```
|
||||
|
||||
### Working with keys
|
||||
|
||||
To get a key under a section:
|
||||
|
||||
```go
|
||||
key, err := cfg.Section("").GetKey("key name")
|
||||
```
|
||||
|
||||
Same rule applies to key operations:
|
||||
|
||||
```go
|
||||
key := cfg.Section("").Key("key name")
|
||||
```
|
||||
|
||||
To create a new key:
|
||||
|
||||
```go
|
||||
err := cfg.Section("").NewKey("name", "value")
|
||||
```
|
||||
|
||||
To get a list of keys or key names:
|
||||
|
||||
```go
|
||||
keys := cfg.Section("").Keys()
|
||||
names := cfg.Section("").KeyStrings()
|
||||
```
|
||||
|
||||
To get a clone hash of keys and corresponding values:
|
||||
|
||||
```go
|
||||
hash := cfg.GetSection("").KeysHash()
|
||||
```
|
||||
|
||||
### Working with values
|
||||
|
||||
To get a string value:
|
||||
|
||||
```go
|
||||
val := cfg.Section("").Key("key name").String()
|
||||
```
|
||||
|
||||
To validate key value on the fly:
|
||||
|
||||
```go
|
||||
val := cfg.Section("").Key("key name").Validate(func(in string) string {
|
||||
if len(in) == 0 {
|
||||
return "default"
|
||||
}
|
||||
return in
|
||||
})
|
||||
```
|
||||
|
||||
To get value with types:
|
||||
|
||||
```go
|
||||
// For boolean values:
|
||||
// true when value is: 1, t, T, TRUE, true, True, YES, yes, Yes, ON, on, On
|
||||
// false when value is: 0, f, F, FALSE, false, False, NO, no, No, OFF, off, Off
|
||||
v, err = cfg.Section("").Key("BOOL").Bool()
|
||||
v, err = cfg.Section("").Key("FLOAT64").Float64()
|
||||
v, err = cfg.Section("").Key("INT").Int()
|
||||
v, err = cfg.Section("").Key("INT64").Int64()
|
||||
v, err = cfg.Section("").Key("UINT").Uint()
|
||||
v, err = cfg.Section("").Key("UINT64").Uint64()
|
||||
v, err = cfg.Section("").Key("TIME").TimeFormat(time.RFC3339)
|
||||
v, err = cfg.Section("").Key("TIME").Time() // RFC3339
|
||||
|
||||
v = cfg.Section("").Key("BOOL").MustBool()
|
||||
v = cfg.Section("").Key("FLOAT64").MustFloat64()
|
||||
v = cfg.Section("").Key("INT").MustInt()
|
||||
v = cfg.Section("").Key("INT64").MustInt64()
|
||||
v = cfg.Section("").Key("UINT").MustUint()
|
||||
v = cfg.Section("").Key("UINT64").MustUint64()
|
||||
v = cfg.Section("").Key("TIME").MustTimeFormat(time.RFC3339)
|
||||
v = cfg.Section("").Key("TIME").MustTime() // RFC3339
|
||||
|
||||
// Methods start with Must also accept one argument for default value
|
||||
// when key not found or fail to parse value to given type.
|
||||
// Except method MustString, which you have to pass a default value.
|
||||
|
||||
v = cfg.Section("").Key("String").MustString("default")
|
||||
v = cfg.Section("").Key("BOOL").MustBool(true)
|
||||
v = cfg.Section("").Key("FLOAT64").MustFloat64(1.25)
|
||||
v = cfg.Section("").Key("INT").MustInt(10)
|
||||
v = cfg.Section("").Key("INT64").MustInt64(99)
|
||||
v = cfg.Section("").Key("UINT").MustUint(3)
|
||||
v = cfg.Section("").Key("UINT64").MustUint64(6)
|
||||
v = cfg.Section("").Key("TIME").MustTimeFormat(time.RFC3339, time.Now())
|
||||
v = cfg.Section("").Key("TIME").MustTime(time.Now()) // RFC3339
|
||||
```
|
||||
|
||||
What if my value is three-line long?
|
||||
|
||||
```ini
|
||||
[advance]
|
||||
ADDRESS = """404 road,
|
||||
NotFound, State, 5000
|
||||
Earth"""
|
||||
```
|
||||
|
||||
Not a problem!
|
||||
|
||||
```go
|
||||
cfg.Section("advance").Key("ADDRESS").String()
|
||||
|
||||
/* --- start ---
|
||||
404 road,
|
||||
NotFound, State, 5000
|
||||
Earth
|
||||
------ end --- */
|
||||
```
|
||||
|
||||
That's cool, how about continuation lines?
|
||||
|
||||
```ini
|
||||
[advance]
|
||||
two_lines = how about \
|
||||
continuation lines?
|
||||
lots_of_lines = 1 \
|
||||
2 \
|
||||
3 \
|
||||
4
|
||||
```
|
||||
|
||||
Piece of cake!
|
||||
|
||||
```go
|
||||
cfg.Section("advance").Key("two_lines").String() // how about continuation lines?
|
||||
cfg.Section("advance").Key("lots_of_lines").String() // 1 2 3 4
|
||||
```
|
||||
|
||||
Note that single quotes around values will be stripped:
|
||||
|
||||
```ini
|
||||
foo = "some value" // foo: some value
|
||||
bar = 'some value' // bar: some value
|
||||
```
|
||||
|
||||
That's all? Hmm, no.
|
||||
|
||||
#### Helper methods of working with values
|
||||
|
||||
To get value with given candidates:
|
||||
|
||||
```go
|
||||
v = cfg.Section("").Key("STRING").In("default", []string{"str", "arr", "types"})
|
||||
v = cfg.Section("").Key("FLOAT64").InFloat64(1.1, []float64{1.25, 2.5, 3.75})
|
||||
v = cfg.Section("").Key("INT").InInt(5, []int{10, 20, 30})
|
||||
v = cfg.Section("").Key("INT64").InInt64(10, []int64{10, 20, 30})
|
||||
v = cfg.Section("").Key("UINT").InUint(4, []int{3, 6, 9})
|
||||
v = cfg.Section("").Key("UINT64").InUint64(8, []int64{3, 6, 9})
|
||||
v = cfg.Section("").Key("TIME").InTimeFormat(time.RFC3339, time.Now(), []time.Time{time1, time2, time3})
|
||||
v = cfg.Section("").Key("TIME").InTime(time.Now(), []time.Time{time1, time2, time3}) // RFC3339
|
||||
```
|
||||
|
||||
Default value will be presented if value of key is not in candidates you given, and default value does not need be one of candidates.
|
||||
|
||||
To validate value in a given range:
|
||||
|
||||
```go
|
||||
vals = cfg.Section("").Key("FLOAT64").RangeFloat64(0.0, 1.1, 2.2)
|
||||
vals = cfg.Section("").Key("INT").RangeInt(0, 10, 20)
|
||||
vals = cfg.Section("").Key("INT64").RangeInt64(0, 10, 20)
|
||||
vals = cfg.Section("").Key("UINT").RangeUint(0, 3, 9)
|
||||
vals = cfg.Section("").Key("UINT64").RangeUint64(0, 3, 9)
|
||||
vals = cfg.Section("").Key("TIME").RangeTimeFormat(time.RFC3339, time.Now(), minTime, maxTime)
|
||||
vals = cfg.Section("").Key("TIME").RangeTime(time.Now(), minTime, maxTime) // RFC3339
|
||||
```
|
||||
|
||||
To auto-split value into slice:
|
||||
|
||||
```go
|
||||
vals = cfg.Section("").Key("STRINGS").Strings(",")
|
||||
vals = cfg.Section("").Key("FLOAT64S").Float64s(",")
|
||||
vals = cfg.Section("").Key("INTS").Ints(",")
|
||||
vals = cfg.Section("").Key("INT64S").Int64s(",")
|
||||
vals = cfg.Section("").Key("UINTS").Uints(",")
|
||||
vals = cfg.Section("").Key("UINT64S").Uint64s(",")
|
||||
vals = cfg.Section("").Key("TIMES").Times(",")
|
||||
```
|
||||
|
||||
### Save your configuration
|
||||
|
||||
Finally, it's time to save your configuration to somewhere.
|
||||
|
||||
A typical way to save configuration is writing it to a file:
|
||||
|
||||
```go
|
||||
// ...
|
||||
err = cfg.SaveTo("my.ini")
|
||||
err = cfg.SaveToIndent("my.ini", "\t")
|
||||
```
|
||||
|
||||
Another way to save is writing to a `io.Writer` interface:
|
||||
|
||||
```go
|
||||
// ...
|
||||
cfg.WriteTo(writer)
|
||||
cfg.WriteToIndent(writer, "\t")
|
||||
```
|
||||
|
||||
## Advanced Usage
|
||||
|
||||
### Recursive Values
|
||||
|
||||
For all value of keys, there is a special syntax `%(<name>)s`, where `<name>` is the key name in same section or default section, and `%(<name>)s` will be replaced by corresponding value(empty string if key not found). You can use this syntax at most 99 level of recursions.
|
||||
|
||||
```ini
|
||||
NAME = ini
|
||||
|
||||
[author]
|
||||
NAME = Unknwon
|
||||
GITHUB = https://github.com/%(NAME)s
|
||||
|
||||
[package]
|
||||
FULL_NAME = github.com/go-ini/%(NAME)s
|
||||
```
|
||||
|
||||
```go
|
||||
cfg.Section("author").Key("GITHUB").String() // https://github.com/Unknwon
|
||||
cfg.Section("package").Key("FULL_NAME").String() // github.com/go-ini/ini
|
||||
```
|
||||
|
||||
### Parent-child Sections
|
||||
|
||||
You can use `.` in section name to indicate parent-child relationship between two or more sections. If the key not found in the child section, library will try again on its parent section until there is no parent section.
|
||||
|
||||
```ini
|
||||
NAME = ini
|
||||
VERSION = v1
|
||||
IMPORT_PATH = gopkg.in/%(NAME)s.%(VERSION)s
|
||||
|
||||
[package]
|
||||
CLONE_URL = https://%(IMPORT_PATH)s
|
||||
|
||||
[package.sub]
|
||||
```
|
||||
|
||||
```go
|
||||
cfg.Section("package.sub").Key("CLONE_URL").String() // https://gopkg.in/ini.v1
|
||||
```
|
||||
|
||||
### Auto-increment Key Names
|
||||
|
||||
If key name is `-` in data source, then it would be seen as special syntax for auto-increment key name start from 1, and every section is independent on counter.
|
||||
|
||||
```ini
|
||||
[features]
|
||||
-: Support read/write comments of keys and sections
|
||||
-: Support auto-increment of key names
|
||||
-: Support load multiple files to overwrite key values
|
||||
```
|
||||
|
||||
```go
|
||||
cfg.Section("features").KeyStrings() // []{"#1", "#2", "#3"}
|
||||
```
|
||||
|
||||
### Map To Struct
|
||||
|
||||
Want more objective way to play with INI? Cool.
|
||||
|
||||
```ini
|
||||
Name = Unknwon
|
||||
age = 21
|
||||
Male = true
|
||||
Born = 1993-01-01T20:17:05Z
|
||||
|
||||
[Note]
|
||||
Content = Hi is a good man!
|
||||
Cities = HangZhou, Boston
|
||||
```
|
||||
|
||||
```go
|
||||
type Note struct {
|
||||
Content string
|
||||
Cities []string
|
||||
}
|
||||
|
||||
type Person struct {
|
||||
Name string
|
||||
Age int `ini:"age"`
|
||||
Male bool
|
||||
Born time.Time
|
||||
Note
|
||||
Created time.Time `ini:"-"`
|
||||
}
|
||||
|
||||
func main() {
|
||||
cfg, err := ini.Load("path/to/ini")
|
||||
// ...
|
||||
p := new(Person)
|
||||
err = cfg.MapTo(p)
|
||||
// ...
|
||||
|
||||
// Things can be simpler.
|
||||
err = ini.MapTo(p, "path/to/ini")
|
||||
// ...
|
||||
|
||||
// Just map a section? Fine.
|
||||
n := new(Note)
|
||||
err = cfg.Section("Note").MapTo(n)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
Can I have default value for field? Absolutely.
|
||||
|
||||
Assign it before you map to struct. It will keep the value as it is if the key is not presented or got wrong type.
|
||||
|
||||
```go
|
||||
// ...
|
||||
p := &Person{
|
||||
Name: "Joe",
|
||||
}
|
||||
// ...
|
||||
```
|
||||
|
||||
It's really cool, but what's the point if you can't give me my file back from struct?
|
||||
|
||||
### Reflect From Struct
|
||||
|
||||
Why not?
|
||||
|
||||
```go
|
||||
type Embeded struct {
|
||||
Dates []time.Time `delim:"|"`
|
||||
Places []string
|
||||
None []int
|
||||
}
|
||||
|
||||
type Author struct {
|
||||
Name string `ini:"NAME"`
|
||||
Male bool
|
||||
Age int
|
||||
GPA float64
|
||||
NeverMind string `ini:"-"`
|
||||
*Embeded
|
||||
}
|
||||
|
||||
func main() {
|
||||
a := &Author{"Unknwon", true, 21, 2.8, "",
|
||||
&Embeded{
|
||||
[]time.Time{time.Now(), time.Now()},
|
||||
[]string{"HangZhou", "Boston"},
|
||||
[]int{},
|
||||
}}
|
||||
cfg := ini.Empty()
|
||||
err = ini.ReflectFrom(cfg, a)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
So, what do I get?
|
||||
|
||||
```ini
|
||||
NAME = Unknwon
|
||||
Male = true
|
||||
Age = 21
|
||||
GPA = 2.8
|
||||
|
||||
[Embeded]
|
||||
Dates = 2015-08-07T22:14:22+08:00|2015-08-07T22:14:22+08:00
|
||||
Places = HangZhou,Boston
|
||||
None =
|
||||
```
|
||||
|
||||
#### Name Mapper
|
||||
|
||||
To save your time and make your code cleaner, this library supports [`NameMapper`](https://gowalker.org/gopkg.in/ini.v1#NameMapper) between struct field and actual section and key name.
|
||||
|
||||
There are 2 built-in name mappers:
|
||||
|
||||
- `AllCapsUnderscore`: it converts to format `ALL_CAPS_UNDERSCORE` then match section or key.
|
||||
- `TitleUnderscore`: it converts to format `title_underscore` then match section or key.
|
||||
|
||||
To use them:
|
||||
|
||||
```go
|
||||
type Info struct {
|
||||
PackageName string
|
||||
}
|
||||
|
||||
func main() {
|
||||
err = ini.MapToWithMapper(&Info{}, ini.TitleUnderscore, []byte("packag_name=ini"))
|
||||
// ...
|
||||
|
||||
cfg, err := ini.Load([]byte("PACKAGE_NAME=ini"))
|
||||
// ...
|
||||
info := new(Info)
|
||||
cfg.NameMapper = ini.AllCapsUnderscore
|
||||
err = cfg.MapTo(info)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
Same rules of name mapper apply to `ini.ReflectFromWithMapper` function.
|
||||
|
||||
#### Other Notes On Map/Reflect
|
||||
|
||||
Any embedded struct is treated as a section by default, and there is no automatic parent-child relations in map/reflect feature:
|
||||
|
||||
```go
|
||||
type Child struct {
|
||||
Age string
|
||||
}
|
||||
|
||||
type Parent struct {
|
||||
Name string
|
||||
Child
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
City string
|
||||
Parent
|
||||
}
|
||||
```
|
||||
|
||||
Example configuration:
|
||||
|
||||
```ini
|
||||
City = Boston
|
||||
|
||||
[Parent]
|
||||
Name = Unknwon
|
||||
|
||||
[Child]
|
||||
Age = 21
|
||||
```
|
||||
|
||||
What if, yes, I'm paranoid, I want embedded struct to be in the same section. Well, all roads lead to Rome.
|
||||
|
||||
```go
|
||||
type Child struct {
|
||||
Age string
|
||||
}
|
||||
|
||||
type Parent struct {
|
||||
Name string
|
||||
Child `ini:"Parent"`
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
City string
|
||||
Parent
|
||||
}
|
||||
```
|
||||
|
||||
Example configuration:
|
||||
|
||||
```ini
|
||||
City = Boston
|
||||
|
||||
[Parent]
|
||||
Name = Unknwon
|
||||
Age = 21
|
||||
```
|
||||
|
||||
## Getting Help
|
||||
|
||||
- [API Documentation](https://gowalker.org/gopkg.in/ini.v1)
|
||||
- [File An Issue](https://github.com/go-ini/ini/issues/new)
|
||||
|
||||
## FAQs
|
||||
|
||||
### What does `BlockMode` field do?
|
||||
|
||||
By default, library lets you read and write values so we need a locker to make sure your data is safe. But in cases that you are very sure about only reading data through the library, you can set `cfg.BlockMode = false` to speed up read operations about **50-70%** faster.
|
||||
|
||||
### Why another INI library?
|
||||
|
||||
Many people are using my another INI library [goconfig](https://github.com/Unknwon/goconfig), so the reason for this one is I would like to make more Go style code. Also when you set `cfg.BlockMode = false`, this one is about **10-30%** faster.
|
||||
|
||||
To make those changes I have to confirm API broken, so it's safer to keep it in another place and start using `gopkg.in` to version my package at this time.(PS: shorter import path)
|
||||
|
||||
## License
|
||||
|
||||
This project is under Apache v2 License. See the [LICENSE](LICENSE) file for the full license text.
|
547  vendor/github.com/aws/aws-sdk-go/vendor/github.com/go-ini/ini/README_ZH.md  generated  vendored
@@ -1,547 +0,0 @@
|
|||
本包提供了 Go 语言中读写 INI 文件的功能。
|
||||
|
||||
## 功能特性
|
||||
|
||||
- 支持覆盖加载多个数据源(`[]byte` 或文件)
|
||||
- 支持递归读取键值
|
||||
- 支持读取父子分区
|
||||
- 支持读取自增键名
|
||||
- 支持读取多行的键值
|
||||
- 支持大量辅助方法
|
||||
- 支持在读取时直接转换为 Go 语言类型
|
||||
- 支持读取和 **写入** 分区和键的注释
|
||||
- 轻松操作分区、键值和注释
|
||||
- 在保存文件时分区和键值会保持原有的顺序
|
||||
|
||||
## 下载安装
|
||||
|
||||
go get gopkg.in/ini.v1
|
||||
|
||||
## 开始使用
|
||||
|
||||
### 从数据源加载
|
||||
|
||||
一个 **数据源** 可以是 `[]byte` 类型的原始数据,或 `string` 类型的文件路径。您可以加载 **任意多个** 数据源。如果您传递其它类型的数据源,则会直接返回错误。
|
||||
|
||||
```go
|
||||
cfg, err := ini.Load([]byte("raw data"), "filename")
|
||||
```
|
||||
|
||||
或者从一个空白的文件开始:
|
||||
|
||||
```go
|
||||
cfg := ini.Empty()
|
||||
```
|
||||
|
||||
当您在一开始无法决定需要加载哪些数据源时,仍可以使用 **Append()** 在需要的时候加载它们。
|
||||
|
||||
```go
|
||||
err := cfg.Append("other file", []byte("other raw data"))
|
||||
```
|
||||
|
||||
### 操作分区(Section)
|
||||
|
||||
获取指定分区:
|
||||
|
||||
```go
|
||||
section, err := cfg.GetSection("section name")
|
||||
```
|
||||
|
||||
如果您想要获取默认分区,则可以用空字符串代替分区名:
|
||||
|
||||
```go
|
||||
section, err := cfg.GetSection("")
|
||||
```
|
||||
|
||||
当您非常确定某个分区是存在的,可以使用以下简便方法:
|
||||
|
||||
```go
|
||||
section := cfg.Section("")
|
||||
```
|
||||
|
||||
如果不小心判断错了,要获取的分区其实是不存在的,那会发生什么呢?没事的,它会自动创建并返回一个对应的分区对象给您。
|
||||
|
||||
创建一个分区:
|
||||
|
||||
```go
|
||||
err := cfg.NewSection("new section")
|
||||
```
|
||||
|
||||
获取所有分区对象或名称:
|
||||
|
||||
```go
|
||||
sections := cfg.Sections()
|
||||
names := cfg.SectionStrings()
|
||||
```
|
||||
|
||||
### 操作键(Key)
|
||||
|
||||
获取某个分区下的键:
|
||||
|
||||
```go
|
||||
key, err := cfg.Section("").GetKey("key name")
|
||||
```
|
||||
|
||||
和分区一样,您也可以直接获取键而忽略错误处理:
|
||||
|
||||
```go
|
||||
key := cfg.Section("").Key("key name")
|
||||
```
|
||||
|
||||
创建一个新的键:
|
||||
|
||||
```go
|
||||
err := cfg.Section("").NewKey("name", "value")
|
||||
```
|
||||
|
||||
获取分区下的所有键或键名:
|
||||
|
||||
```go
|
||||
keys := cfg.Section("").Keys()
|
||||
names := cfg.Section("").KeyStrings()
|
||||
```
|
||||
|
||||
获取分区下的所有键值对的克隆:
|
||||
|
||||
```go
|
||||
hash := cfg.GetSection("").KeysHash()
|
||||
```
|
||||
|
||||
### 操作键值(Value)
|
||||
|
||||
获取一个类型为字符串(string)的值:
|
||||
|
||||
```go
|
||||
val := cfg.Section("").Key("key name").String()
|
||||
```
|
||||
|
||||
获取值的同时通过自定义函数进行处理验证:
|
||||
|
||||
```go
|
||||
val := cfg.Section("").Key("key name").Validate(func(in string) string {
|
||||
if len(in) == 0 {
|
||||
return "default"
|
||||
}
|
||||
return in
|
||||
})
|
||||
```
|
||||
|
||||
获取其它类型的值:
|
||||
|
||||
```go
|
||||
// 布尔值的规则:
|
||||
// true 当值为:1, t, T, TRUE, true, True, YES, yes, Yes, ON, on, On
|
||||
// false 当值为:0, f, F, FALSE, false, False, NO, no, No, OFF, off, Off
|
||||
v, err = cfg.Section("").Key("BOOL").Bool()
|
||||
v, err = cfg.Section("").Key("FLOAT64").Float64()
|
||||
v, err = cfg.Section("").Key("INT").Int()
|
||||
v, err = cfg.Section("").Key("INT64").Int64()
|
||||
v, err = cfg.Section("").Key("UINT").Uint()
|
||||
v, err = cfg.Section("").Key("UINT64").Uint64()
|
||||
v, err = cfg.Section("").Key("TIME").TimeFormat(time.RFC3339)
|
||||
v, err = cfg.Section("").Key("TIME").Time() // RFC3339
|
||||
|
||||
v = cfg.Section("").Key("BOOL").MustBool()
|
||||
v = cfg.Section("").Key("FLOAT64").MustFloat64()
|
||||
v = cfg.Section("").Key("INT").MustInt()
|
||||
v = cfg.Section("").Key("INT64").MustInt64()
|
||||
v = cfg.Section("").Key("UINT").MustUint()
|
||||
v = cfg.Section("").Key("UINT64").MustUint64()
|
||||
v = cfg.Section("").Key("TIME").MustTimeFormat(time.RFC3339)
|
||||
v = cfg.Section("").Key("TIME").MustTime() // RFC3339
|
||||
|
||||
// 由 Must 开头的方法名允许接收一个相同类型的参数来作为默认值,
|
||||
// 当键不存在或者转换失败时,则会直接返回该默认值。
|
||||
// 但是,MustString 方法必须传递一个默认值。
|
||||
|
||||
v = cfg.Seciont("").Key("String").MustString("default")
|
||||
v = cfg.Section("").Key("BOOL").MustBool(true)
|
||||
v = cfg.Section("").Key("FLOAT64").MustFloat64(1.25)
|
||||
v = cfg.Section("").Key("INT").MustInt(10)
|
||||
v = cfg.Section("").Key("INT64").MustInt64(99)
|
||||
v = cfg.Section("").Key("UINT").MustUint(3)
|
||||
v = cfg.Section("").Key("UINT64").MustUint64(6)
|
||||
v = cfg.Section("").Key("TIME").MustTimeFormat(time.RFC3339, time.Now())
|
||||
v = cfg.Section("").Key("TIME").MustTime(time.Now()) // RFC3339
|
||||
```
|
||||
|
||||
如果我的值有好多行怎么办?
|
||||
|
||||
```ini
|
||||
[advance]
|
||||
ADDRESS = """404 road,
|
||||
NotFound, State, 5000
|
||||
Earth"""
|
||||
```
|
||||
|
||||
嗯哼?小 case!
|
||||
|
||||
```go
|
||||
cfg.Section("advance").Key("ADDRESS").String()
|
||||
|
||||
/* --- start ---
|
||||
404 road,
|
||||
NotFound, State, 5000
|
||||
Earth
|
||||
------ end --- */
|
||||
```
|
||||
|
||||
赞爆了!那要是我属于一行的内容写不下想要写到第二行怎么办?
|
||||
|
||||
```ini
|
||||
[advance]
|
||||
two_lines = how about \
|
||||
continuation lines?
|
||||
lots_of_lines = 1 \
|
||||
2 \
|
||||
3 \
|
||||
4
|
||||
```
|
||||
|
||||
简直是小菜一碟!
|
||||
|
||||
```go
|
||||
cfg.Section("advance").Key("two_lines").String() // how about continuation lines?
|
||||
cfg.Section("advance").Key("lots_of_lines").String() // 1 2 3 4
|
||||
```
|
||||
|
||||
需要注意的是,值两侧的单引号会被自动剔除:
|
||||
|
||||
```ini
|
||||
foo = "some value" // foo: some value
|
||||
bar = 'some value' // bar: some value
|
||||
```
|
||||
|
||||
这就是全部了?哈哈,当然不是。
|
||||
|
||||
#### 操作键值的辅助方法
|
||||
|
||||
获取键值时设定候选值:
|
||||
|
||||
```go
|
||||
v = cfg.Section("").Key("STRING").In("default", []string{"str", "arr", "types"})
|
||||
v = cfg.Section("").Key("FLOAT64").InFloat64(1.1, []float64{1.25, 2.5, 3.75})
|
||||
v = cfg.Section("").Key("INT").InInt(5, []int{10, 20, 30})
|
||||
v = cfg.Section("").Key("INT64").InInt64(10, []int64{10, 20, 30})
|
||||
v = cfg.Section("").Key("UINT").InUint(4, []int{3, 6, 9})
|
||||
v = cfg.Section("").Key("UINT64").InUint64(8, []int64{3, 6, 9})
|
||||
v = cfg.Section("").Key("TIME").InTimeFormat(time.RFC3339, time.Now(), []time.Time{time1, time2, time3})
|
||||
v = cfg.Section("").Key("TIME").InTime(time.Now(), []time.Time{time1, time2, time3}) // RFC3339
|
||||
```
|
||||
|
||||
如果获取到的值不是候选值的任意一个,则会返回默认值,而默认值不需要是候选值中的一员。
|
||||
|
||||
验证获取的值是否在指定范围内:
|
||||
|
||||
```go
|
||||
vals = cfg.Section("").Key("FLOAT64").RangeFloat64(0.0, 1.1, 2.2)
|
||||
vals = cfg.Section("").Key("INT").RangeInt(0, 10, 20)
|
||||
vals = cfg.Section("").Key("INT64").RangeInt64(0, 10, 20)
|
||||
vals = cfg.Section("").Key("UINT").RangeUint(0, 3, 9)
|
||||
vals = cfg.Section("").Key("UINT64").RangeUint64(0, 3, 9)
|
||||
vals = cfg.Section("").Key("TIME").RangeTimeFormat(time.RFC3339, time.Now(), minTime, maxTime)
|
||||
vals = cfg.Section("").Key("TIME").RangeTime(time.Now(), minTime, maxTime) // RFC3339
|
||||
```
|
||||
|
||||
自动分割键值为切片(slice):
|
||||
|
||||
```go
|
||||
vals = cfg.Section("").Key("STRINGS").Strings(",")
|
||||
vals = cfg.Section("").Key("FLOAT64S").Float64s(",")
|
||||
vals = cfg.Section("").Key("INTS").Ints(",")
|
||||
vals = cfg.Section("").Key("INT64S").Int64s(",")
|
||||
vals = cfg.Section("").Key("UINTS").Uints(",")
|
||||
vals = cfg.Section("").Key("UINT64S").Uint64s(",")
|
||||
vals = cfg.Section("").Key("TIMES").Times(",")
|
||||
```
|
||||
|
||||
### 保存配置
|
||||
|
||||
终于到了这个时刻,是时候保存一下配置了。
|
||||
|
||||
比较原始的做法是输出配置到某个文件:
|
||||
|
||||
```go
|
||||
// ...
|
||||
err = cfg.SaveTo("my.ini")
|
||||
err = cfg.SaveToIndent("my.ini", "\t")
|
||||
```
|
||||
|
||||
另一个比较高级的做法是写入到任何实现 `io.Writer` 接口的对象中:
|
||||
|
||||
```go
|
||||
// ...
|
||||
cfg.WriteTo(writer)
|
||||
cfg.WriteToIndent(writer, "\t")
|
||||
```
|
||||
|
||||
### 高级用法
|
||||
|
||||
#### 递归读取键值
|
||||
|
||||
在获取所有键值的过程中,特殊语法 `%(<name>)s` 会被应用,其中 `<name>` 可以是相同分区或者默认分区下的键名。字符串 `%(<name>)s` 会被相应的键值所替代,如果指定的键不存在,则会用空字符串替代。您可以最多使用 99 层的递归嵌套。
|
||||
|
||||
```ini
|
||||
NAME = ini
|
||||
|
||||
[author]
|
||||
NAME = Unknwon
|
||||
GITHUB = https://github.com/%(NAME)s
|
||||
|
||||
[package]
|
||||
FULL_NAME = github.com/go-ini/%(NAME)s
|
||||
```
|
||||
|
||||
```go
|
||||
cfg.Section("author").Key("GITHUB").String() // https://github.com/Unknwon
|
||||
cfg.Section("package").Key("FULL_NAME").String() // github.com/go-ini/ini
|
||||
```
|
||||
|
||||
#### 读取父子分区
|
||||
|
||||
您可以在分区名称中使用 `.` 来表示两个或多个分区之间的父子关系。如果某个键在子分区中不存在,则会去它的父分区中再次寻找,直到没有父分区为止。
|
||||
|
||||
```ini
|
||||
NAME = ini
|
||||
VERSION = v1
|
||||
IMPORT_PATH = gopkg.in/%(NAME)s.%(VERSION)s
|
||||
|
||||
[package]
|
||||
CLONE_URL = https://%(IMPORT_PATH)s
|
||||
|
||||
[package.sub]
|
||||
```
|
||||
|
||||
```go
|
||||
cfg.Section("package.sub").Key("CLONE_URL").String() // https://gopkg.in/ini.v1
|
||||
```
|
||||
|
||||
#### 读取自增键名
|
||||
|
||||
如果数据源中的键名为 `-`,则认为该键使用了自增键名的特殊语法。计数器从 1 开始,并且分区之间是相互独立的。
|
||||
|
||||
```ini
|
||||
[features]
|
||||
-: Support read/write comments of keys and sections
|
||||
-: Support auto-increment of key names
|
||||
-: Support load multiple files to overwrite key values
|
||||
```
|
||||
|
||||
```go
|
||||
cfg.Section("features").KeyStrings() // []{"#1", "#2", "#3"}
|
||||
```
|
||||
|
||||
### 映射到结构
|
||||
|
||||
想要使用更加面向对象的方式玩转 INI 吗?好主意。
|
||||
|
||||
```ini
|
||||
Name = Unknwon
|
||||
age = 21
|
||||
Male = true
|
||||
Born = 1993-01-01T20:17:05Z
|
||||
|
||||
[Note]
|
||||
Content = Hi is a good man!
|
||||
Cities = HangZhou, Boston
|
||||
```
|
||||
|
||||
```go
|
||||
type Note struct {
|
||||
Content string
|
||||
Cities []string
|
||||
}
|
||||
|
||||
type Person struct {
|
||||
Name string
|
||||
Age int `ini:"age"`
|
||||
Male bool
|
||||
Born time.Time
|
||||
Note
|
||||
Created time.Time `ini:"-"`
|
||||
}
|
||||
|
||||
func main() {
|
||||
cfg, err := ini.Load("path/to/ini")
|
||||
// ...
|
||||
p := new(Person)
|
||||
err = cfg.MapTo(p)
|
||||
// ...
|
||||
|
||||
// 一切竟可以如此的简单。
|
||||
err = ini.MapTo(p, "path/to/ini")
|
||||
// ...
|
||||
|
||||
// 嗯哼?只需要映射一个分区吗?
|
||||
n := new(Note)
|
||||
err = cfg.Section("Note").MapTo(n)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
结构的字段怎么设置默认值呢?很简单,只要在映射之前对指定字段进行赋值就可以了。如果键未找到或者类型错误,该值不会发生改变。
|
||||
|
||||
```go
|
||||
// ...
|
||||
p := &Person{
|
||||
Name: "Joe",
|
||||
}
|
||||
// ...
|
||||
```
|
||||
|
||||
这样玩 INI 真的好酷啊!然而,如果不能还给我原来的配置文件,有什么卵用?
|
||||
|
||||
### 从结构反射
|
||||
|
||||
可是,我有说不能吗?
|
||||
|
||||
```go
|
||||
type Embeded struct {
|
||||
Dates []time.Time `delim:"|"`
|
||||
Places []string
|
||||
None []int
|
||||
}
|
||||
|
||||
type Author struct {
|
||||
Name string `ini:"NAME"`
|
||||
Male bool
|
||||
Age int
|
||||
GPA float64
|
||||
NeverMind string `ini:"-"`
|
||||
*Embeded
|
||||
}
|
||||
|
||||
func main() {
|
||||
a := &Author{"Unknwon", true, 21, 2.8, "",
|
||||
&Embeded{
|
||||
[]time.Time{time.Now(), time.Now()},
|
||||
[]string{"HangZhou", "Boston"},
|
||||
[]int{},
|
||||
}}
|
||||
cfg := ini.Empty()
|
||||
err = ini.ReflectFrom(cfg, a)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
瞧瞧,奇迹发生了。
|
||||
|
||||
```ini
|
||||
NAME = Unknwon
|
||||
Male = true
|
||||
Age = 21
|
||||
GPA = 2.8
|
||||
|
||||
[Embeded]
|
||||
Dates = 2015-08-07T22:14:22+08:00|2015-08-07T22:14:22+08:00
|
||||
Places = HangZhou,Boston
|
||||
None =
|
||||
```
|
||||
|
||||
#### 名称映射器(Name Mapper)
|
||||
|
||||
为了节省您的时间并简化代码,本库支持类型为 [`NameMapper`](https://gowalker.org/gopkg.in/ini.v1#NameMapper) 的名称映射器,该映射器负责结构字段名与分区名和键名之间的映射。
|
||||
|
||||
目前有 2 款内置的映射器:
|
||||
|
||||
- `AllCapsUnderscore`:该映射器将字段名转换至格式 `ALL_CAPS_UNDERSCORE` 后再去匹配分区名和键名。
|
||||
- `TitleUnderscore`:该映射器将字段名转换至格式 `title_underscore` 后再去匹配分区名和键名。
|
||||
|
||||
使用方法:
|
||||
|
||||
```go
|
||||
type Info struct{
|
||||
PackageName string
|
||||
}
|
||||
|
||||
func main() {
|
||||
err = ini.MapToWithMapper(&Info{}, ini.TitleUnderscore, []byte("packag_name=ini"))
|
||||
// ...
|
||||
|
||||
cfg, err := ini.Load([]byte("PACKAGE_NAME=ini"))
|
||||
// ...
|
||||
info := new(Info)
|
||||
cfg.NameMapper = ini.AllCapsUnderscore
|
||||
err = cfg.MapTo(info)
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
使用函数 `ini.ReflectFromWithMapper` 时也可应用相同的规则。
|
||||
|
||||
#### 映射/反射的其它说明
|
||||
|
||||
任何嵌入的结构都会被默认认作一个不同的分区,并且不会自动产生所谓的父子分区关联:
|
||||
|
||||
```go
|
||||
type Child struct {
|
||||
Age string
|
||||
}
|
||||
|
||||
type Parent struct {
|
||||
Name string
|
||||
Child
|
||||
}
|
||||
|
||||
type Config struct {
|
||||
City string
|
||||
Parent
|
||||
}
|
||||
```
|
||||
|
||||
示例配置文件:
|
||||
|
||||
```ini
|
||||
City = Boston
|
||||
|
||||
[Parent]
|
||||
Name = Unknwon
|
||||
|
||||
[Child]
|
||||
Age = 21
|
||||
```
|
||||
|
||||
很好,但是,我就是要嵌入结构也在同一个分区。好吧,你爹是李刚!
|
||||
|
||||
```go
type Child struct {
	Age string
}

type Parent struct {
	Name string
	Child `ini:"Parent"`
}

type Config struct {
	City string
	Parent
}
```

Example configuration file:

```ini
City = Boston

[Parent]
Name = Unknwon
Age = 21
```

## Getting Help

- [API Documentation](https://gowalker.org/gopkg.in/ini.v1)
- [File an Issue](https://github.com/go-ini/ini/issues/new)

## FAQ

### What is the `BlockMode` field?

By default, this library locks around read and write operations to keep the data safe. But in some cases you are absolutely sure you will only ever read. Then you can set `cfg.BlockMode = false` to speed up read operations by roughly **50-70%**.

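A minimal sketch of that read-only fast path (editor's illustration; the file name, section, and key are hypothetical):

```go
cfg, err := ini.Load("app.ini")
// ...

// From here on the config is only read, so the lock can be dropped.
cfg.BlockMode = false
timeout := cfg.Section("server").Key("timeout").MustInt(30)
// ... use timeout ...
```
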
### Why write another INI parsing library?

Many people use my [goconfig](https://github.com/Unknwon/goconfig) to work with INI files, but I wanted code that feels more like idiomatic Go. Also, when you set `cfg.BlockMode = false`, reads get roughly **10-30%** faster.

To make these changes I had to break the API, so starting a new repository was the safest choice. Besides, this library is released in versioned form directly through `gopkg.in`. (The real reason is that the import path is shorter.)

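So the versioned import looks like this:

```go
import "gopkg.in/ini.v1"
```
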
1226 vendor/github.com/aws/aws-sdk-go/vendor/github.com/go-ini/ini/ini.go generated vendored
File diff suppressed because it is too large.
350 vendor/github.com/aws/aws-sdk-go/vendor/github.com/go-ini/ini/struct.go generated vendored
@@ -1,350 +0,0 @@
// Copyright 2014 Unknwon
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License"): you may
|
||||
// not use this file except in compliance with the License. You may obtain
|
||||
// a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
|
||||
// WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
|
||||
// License for the specific language governing permissions and limitations
|
||||
// under the License.
|
||||
|
||||
package ini
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"errors"
|
||||
"fmt"
|
||||
"reflect"
|
||||
"time"
|
||||
"unicode"
|
||||
)
|
||||
|
||||
// NameMapper represents an ini tag name mapper.
|
||||
type NameMapper func(string) string
|
||||
|
||||
// Built-in name getters.
|
||||
var (
|
||||
// AllCapsUnderscore converts to format ALL_CAPS_UNDERSCORE.
|
||||
AllCapsUnderscore NameMapper = func(raw string) string {
|
||||
newstr := make([]rune, 0, len(raw))
|
||||
for i, chr := range raw {
|
||||
if isUpper := 'A' <= chr && chr <= 'Z'; isUpper {
|
||||
if i > 0 {
|
||||
newstr = append(newstr, '_')
|
||||
}
|
||||
}
|
||||
newstr = append(newstr, unicode.ToUpper(chr))
|
||||
}
|
||||
return string(newstr)
|
||||
}
|
||||
// TitleUnderscore converts to format title_underscore.
|
||||
TitleUnderscore NameMapper = func(raw string) string {
|
||||
newstr := make([]rune, 0, len(raw))
|
||||
for i, chr := range raw {
|
||||
if isUpper := 'A' <= chr && chr <= 'Z'; isUpper {
|
||||
if i > 0 {
|
||||
newstr = append(newstr, '_')
|
||||
}
|
||||
chr -= ('A' - 'a')
|
||||
}
|
||||
newstr = append(newstr, chr)
|
||||
}
|
||||
return string(newstr)
|
||||
}
|
||||
)
|
||||
|
||||
func (s *Section) parseFieldName(raw, actual string) string {
|
||||
if len(actual) > 0 {
|
||||
return actual
|
||||
}
|
||||
if s.f.NameMapper != nil {
|
||||
return s.f.NameMapper(raw)
|
||||
}
|
||||
return raw
|
||||
}
|
||||
|
||||
func parseDelim(actual string) string {
|
||||
if len(actual) > 0 {
|
||||
return actual
|
||||
}
|
||||
return ","
|
||||
}
|
||||
|
||||
var reflectTime = reflect.TypeOf(time.Now()).Kind()
|
||||
|
||||
// setWithProperType sets proper value to field based on its type,
|
||||
// but it does not return error for failing parsing,
|
||||
// because we want to use the default value that is already assigned to the struct.
|
||||
func setWithProperType(t reflect.Type, key *Key, field reflect.Value, delim string) error {
|
||||
switch t.Kind() {
|
||||
case reflect.String:
|
||||
if len(key.String()) == 0 {
|
||||
return nil
|
||||
}
|
||||
field.SetString(key.String())
|
||||
case reflect.Bool:
|
||||
boolVal, err := key.Bool()
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
field.SetBool(boolVal)
|
||||
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
|
||||
durationVal, err := key.Duration()
|
||||
if err == nil {
|
||||
field.Set(reflect.ValueOf(durationVal))
|
||||
return nil
|
||||
}
|
||||
|
||||
intVal, err := key.Int64()
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
field.SetInt(intVal)
|
||||
// byte is an alias for uint8, so supporting uint8 breaks support for byte
|
||||
case reflect.Uint, reflect.Uint16, reflect.Uint32, reflect.Uint64:
|
||||
durationVal, err := key.Duration()
|
||||
if err == nil {
|
||||
field.Set(reflect.ValueOf(durationVal))
|
||||
return nil
|
||||
}
|
||||
|
||||
uintVal, err := key.Uint64()
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
field.SetUint(uintVal)
|
||||
|
||||
case reflect.Float64:
|
||||
floatVal, err := key.Float64()
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
field.SetFloat(floatVal)
|
||||
case reflectTime:
|
||||
timeVal, err := key.Time()
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
field.Set(reflect.ValueOf(timeVal))
|
||||
case reflect.Slice:
|
||||
vals := key.Strings(delim)
|
||||
numVals := len(vals)
|
||||
if numVals == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
sliceOf := field.Type().Elem().Kind()
|
||||
|
||||
var times []time.Time
|
||||
if sliceOf == reflectTime {
|
||||
times = key.Times(delim)
|
||||
}
|
||||
|
||||
slice := reflect.MakeSlice(field.Type(), numVals, numVals)
|
||||
for i := 0; i < numVals; i++ {
|
||||
switch sliceOf {
|
||||
case reflectTime:
|
||||
slice.Index(i).Set(reflect.ValueOf(times[i]))
|
||||
default:
|
||||
slice.Index(i).Set(reflect.ValueOf(vals[i]))
|
||||
}
|
||||
}
|
||||
field.Set(slice)
|
||||
default:
|
||||
return fmt.Errorf("unsupported type '%s'", t)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (s *Section) mapTo(val reflect.Value) error {
|
||||
if val.Kind() == reflect.Ptr {
|
||||
val = val.Elem()
|
||||
}
|
||||
typ := val.Type()
|
||||
|
||||
for i := 0; i < typ.NumField(); i++ {
|
||||
field := val.Field(i)
|
||||
tpField := typ.Field(i)
|
||||
|
||||
tag := tpField.Tag.Get("ini")
|
||||
if tag == "-" {
|
||||
continue
|
||||
}
|
||||
|
||||
fieldName := s.parseFieldName(tpField.Name, tag)
|
||||
if len(fieldName) == 0 || !field.CanSet() {
|
||||
continue
|
||||
}
|
||||
|
||||
isAnonymous := tpField.Type.Kind() == reflect.Ptr && tpField.Anonymous
|
||||
isStruct := tpField.Type.Kind() == reflect.Struct
|
||||
if isAnonymous {
|
||||
field.Set(reflect.New(tpField.Type.Elem()))
|
||||
}
|
||||
|
||||
if isAnonymous || isStruct {
|
||||
if sec, err := s.f.GetSection(fieldName); err == nil {
|
||||
if err = sec.mapTo(field); err != nil {
|
||||
return fmt.Errorf("error mapping field(%s): %v", fieldName, err)
|
||||
}
|
||||
continue
|
||||
}
|
||||
}
|
||||
|
||||
if key, err := s.GetKey(fieldName); err == nil {
|
||||
if err = setWithProperType(tpField.Type, key, field, parseDelim(tpField.Tag.Get("delim"))); err != nil {
|
||||
return fmt.Errorf("error mapping field(%s): %v", fieldName, err)
|
||||
}
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// MapTo maps section to given struct.
|
||||
func (s *Section) MapTo(v interface{}) error {
|
||||
typ := reflect.TypeOf(v)
|
||||
val := reflect.ValueOf(v)
|
||||
if typ.Kind() == reflect.Ptr {
|
||||
typ = typ.Elem()
|
||||
val = val.Elem()
|
||||
} else {
|
||||
return errors.New("cannot map to non-pointer struct")
|
||||
}
|
||||
|
||||
return s.mapTo(val)
|
||||
}
|
||||
|
||||
// MapTo maps file to given struct.
|
||||
func (f *File) MapTo(v interface{}) error {
|
||||
return f.Section("").MapTo(v)
|
||||
}
|
||||
|
||||
// MapTo maps data sources to given struct with name mapper.
|
||||
func MapToWithMapper(v interface{}, mapper NameMapper, source interface{}, others ...interface{}) error {
|
||||
cfg, err := Load(source, others...)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
cfg.NameMapper = mapper
|
||||
return cfg.MapTo(v)
|
||||
}
|
||||
|
||||
// MapTo maps data sources to given struct.
|
||||
func MapTo(v, source interface{}, others ...interface{}) error {
|
||||
return MapToWithMapper(v, nil, source, others...)
|
||||
}
|
||||
|
||||
// reflectWithProperType does the opposite of setWithProperType.
|
||||
func reflectWithProperType(t reflect.Type, key *Key, field reflect.Value, delim string) error {
|
||||
switch t.Kind() {
|
||||
case reflect.String:
|
||||
key.SetValue(field.String())
|
||||
case reflect.Bool,
|
||||
reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64,
|
||||
reflect.Uint, reflect.Uint8, reflect.Uint16, reflect.Uint32, reflect.Uint64,
|
||||
reflect.Float64,
|
||||
reflectTime:
|
||||
key.SetValue(fmt.Sprint(field))
|
||||
case reflect.Slice:
|
||||
vals := field.Slice(0, field.Len())
|
||||
if field.Len() == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
var buf bytes.Buffer
|
||||
isTime := fmt.Sprint(field.Type()) == "[]time.Time"
|
||||
for i := 0; i < field.Len(); i++ {
|
||||
if isTime {
|
||||
buf.WriteString(vals.Index(i).Interface().(time.Time).Format(time.RFC3339))
|
||||
} else {
|
||||
buf.WriteString(fmt.Sprint(vals.Index(i)))
|
||||
}
|
||||
buf.WriteString(delim)
|
||||
}
|
||||
key.SetValue(buf.String()[:buf.Len()-1])
|
||||
default:
|
||||
return fmt.Errorf("unsupported type '%s'", t)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func (s *Section) reflectFrom(val reflect.Value) error {
|
||||
if val.Kind() == reflect.Ptr {
|
||||
val = val.Elem()
|
||||
}
|
||||
typ := val.Type()
|
||||
|
||||
for i := 0; i < typ.NumField(); i++ {
|
||||
field := val.Field(i)
|
||||
tpField := typ.Field(i)
|
||||
|
||||
tag := tpField.Tag.Get("ini")
|
||||
if tag == "-" {
|
||||
continue
|
||||
}
|
||||
|
||||
fieldName := s.parseFieldName(tpField.Name, tag)
|
||||
if len(fieldName) == 0 || !field.CanSet() {
|
||||
continue
|
||||
}
|
||||
|
||||
if (tpField.Type.Kind() == reflect.Ptr && tpField.Anonymous) ||
|
||||
(tpField.Type.Kind() == reflect.Struct) {
|
||||
// Note: The only error here is section doesn't exist.
|
||||
sec, err := s.f.GetSection(fieldName)
|
||||
if err != nil {
|
||||
// Note: fieldName can never be empty here, ignore error.
|
||||
sec, _ = s.f.NewSection(fieldName)
|
||||
}
|
||||
if err = sec.reflectFrom(field); err != nil {
|
||||
return fmt.Errorf("error reflecting field(%s): %v", fieldName, err)
|
||||
}
|
||||
continue
|
||||
}
|
||||
|
||||
// Note: Same reason as section.
|
||||
key, err := s.GetKey(fieldName)
|
||||
if err != nil {
|
||||
key, _ = s.NewKey(fieldName, "")
|
||||
}
|
||||
if err = reflectWithProperType(tpField.Type, key, field, parseDelim(tpField.Tag.Get("delim"))); err != nil {
|
||||
return fmt.Errorf("error reflecting field(%s): %v", fieldName, err)
|
||||
}
|
||||
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// ReflectFrom reflects section from given struct.
|
||||
func (s *Section) ReflectFrom(v interface{}) error {
|
||||
typ := reflect.TypeOf(v)
|
||||
val := reflect.ValueOf(v)
|
||||
if typ.Kind() == reflect.Ptr {
|
||||
typ = typ.Elem()
|
||||
val = val.Elem()
|
||||
} else {
|
||||
return errors.New("cannot reflect from non-pointer struct")
|
||||
}
|
||||
|
||||
return s.reflectFrom(val)
|
||||
}
|
||||
|
||||
// ReflectFrom reflects file from given struct.
|
||||
func (f *File) ReflectFrom(v interface{}) error {
|
||||
return f.Section("").ReflectFrom(v)
|
||||
}
|
||||
|
||||
// ReflectFrom reflects data sources from given struct with name mapper.
|
||||
func ReflectFromWithMapper(cfg *File, v interface{}, mapper NameMapper) error {
|
||||
cfg.NameMapper = mapper
|
||||
return cfg.ReflectFrom(v)
|
||||
}
|
||||
|
||||
// ReflectFrom reflects data sources from given struct.
|
||||
func ReflectFrom(cfg *File, v interface{}) error {
|
||||
return ReflectFromWithMapper(cfg, v, nil)
|
||||
}
|
4 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/.gitignore generated vendored
@@ -1,4 +0,0 @@
jpgo
jmespath-fuzz.zip
cpu.out
go-jmespath.test
9 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/.travis.yml generated vendored
@@ -1,9 +0,0 @@
language: go

sudo: false

go:
- 1.4

install: go get -v -t ./...
script: make test
13 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/LICENSE generated vendored
@@ -1,13 +0,0 @@
Copyright 2015 James Saryerwinnie

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
44 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/Makefile generated vendored
@@ -1,44 +0,0 @@
CMD = jpgo

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  test to run all the tests"
	@echo "  build to build the library and jp executable"
	@echo "  generate to run codegen"


generate:
	go generate ./...

build:
	rm -f $(CMD)
	go build ./...
	rm -f cmd/$(CMD)/$(CMD) && cd cmd/$(CMD)/ && go build ./...
	mv cmd/$(CMD)/$(CMD) .

test:
	go test -v ./...

check:
	go vet ./...
	@echo "golint ./..."
	@lint=`golint ./...`; \
	lint=`echo "$$lint" | grep -v "astnodetype_string.go" | grep -v "toktype_string.go"`; \
	echo "$$lint"; \
	if [ "$$lint" != "" ]; then exit 1; fi

htmlc:
	go test -coverprofile="/tmp/jpcov" && go tool cover -html="/tmp/jpcov" && unlink /tmp/jpcov

buildfuzz:
	go-fuzz-build github.com/jmespath/go-jmespath/fuzz

fuzz: buildfuzz
	go-fuzz -bin=./jmespath-fuzz.zip -workdir=fuzz/testdata

bench:
	go test -bench . -cpuprofile cpu.out

pprof-cpu:
	go tool pprof ./go-jmespath.test ./cpu.out
7 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/README.md generated vendored
@@ -1,7 +0,0 @@
# go-jmespath - A JMESPath implementation in Go

[](https://travis-ci.org/jmespath/go-jmespath)

See http://jmespath.org for more info.
49 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/api.go generated vendored
@@ -1,49 +0,0 @@
package jmespath
|
||||
|
||||
import "strconv"
|
||||
|
||||
// JMESPath is the representation of a compiled JMES path query. A JMESPath is
|
||||
// safe for concurrent use by multiple goroutines.
|
||||
type JMESPath struct {
|
||||
ast ASTNode
|
||||
intr *treeInterpreter
|
||||
}
|
||||
|
||||
// Compile parses a JMESPath expression and returns, if successful, a JMESPath
|
||||
// object that can be used to match against data.
|
||||
func Compile(expression string) (*JMESPath, error) {
|
||||
parser := NewParser()
|
||||
ast, err := parser.Parse(expression)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
jmespath := &JMESPath{ast: ast, intr: newInterpreter()}
|
||||
return jmespath, nil
|
||||
}
|
||||
|
||||
// MustCompile is like Compile but panics if the expression cannot be parsed.
|
||||
// It simplifies safe initialization of global variables holding compiled
|
||||
// JMESPaths.
|
||||
func MustCompile(expression string) *JMESPath {
|
||||
jmespath, err := Compile(expression)
|
||||
if err != nil {
|
||||
panic(`jmespath: Compile(` + strconv.Quote(expression) + `): ` + err.Error())
|
||||
}
|
||||
return jmespath
|
||||
}
|
||||
|
||||
// Search evaluates a JMESPath expression against input data and returns the result.
|
||||
func (jp *JMESPath) Search(data interface{}) (interface{}, error) {
|
||||
return jp.intr.Execute(jp.ast, data)
|
||||
}
|
||||
|
||||
// Search evaluates a JMESPath expression against input data and returns the result.
|
||||
func Search(expression string, data interface{}) (interface{}, error) {
|
||||
intr := newInterpreter()
|
||||
parser := NewParser()
|
||||
ast, err := parser.Parse(expression)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return intr.Execute(ast, data)
|
||||
}
|
@@ -1,16 +0,0 @@
// generated by stringer -type astNodeType; DO NOT EDIT

package jmespath

import "fmt"

const _astNodeType_name = "ASTEmptyASTComparatorASTCurrentNodeASTExpRefASTFunctionExpressionASTFieldASTFilterProjectionASTFlattenASTIdentityASTIndexASTIndexExpressionASTKeyValPairASTLiteralASTMultiSelectHashASTMultiSelectListASTOrExpressionASTAndExpressionASTNotExpressionASTPipeASTProjectionASTSubexpressionASTSliceASTValueProjection"

var _astNodeType_index = [...]uint16{0, 8, 21, 35, 44, 65, 73, 92, 102, 113, 121, 139, 152, 162, 180, 198, 213, 229, 245, 252, 265, 281, 289, 307}

func (i astNodeType) String() string {
	if i < 0 || i >= astNodeType(len(_astNodeType_index)-1) {
		return fmt.Sprintf("astNodeType(%d)", i)
	}
	return _astNodeType_name[_astNodeType_index[i]:_astNodeType_index[i+1]]
}
842 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/functions.go generated vendored
@@ -1,842 +0,0 @@
package jmespath
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"math"
|
||||
"reflect"
|
||||
"sort"
|
||||
"strconv"
|
||||
"strings"
|
||||
"unicode/utf8"
|
||||
)
|
||||
|
||||
type jpFunction func(arguments []interface{}) (interface{}, error)
|
||||
|
||||
type jpType string
|
||||
|
||||
const (
|
||||
jpUnknown jpType = "unknown"
|
||||
jpNumber jpType = "number"
|
||||
jpString jpType = "string"
|
||||
jpArray jpType = "array"
|
||||
jpObject jpType = "object"
|
||||
jpArrayNumber jpType = "array[number]"
|
||||
jpArrayString jpType = "array[string]"
|
||||
jpExpref jpType = "expref"
|
||||
jpAny jpType = "any"
|
||||
)
|
||||
|
||||
type functionEntry struct {
|
||||
name string
|
||||
arguments []argSpec
|
||||
handler jpFunction
|
||||
hasExpRef bool
|
||||
}
|
||||
|
||||
type argSpec struct {
|
||||
types []jpType
|
||||
variadic bool
|
||||
}
|
||||
|
||||
type byExprString struct {
|
||||
intr *treeInterpreter
|
||||
node ASTNode
|
||||
items []interface{}
|
||||
hasError bool
|
||||
}
|
||||
|
||||
func (a *byExprString) Len() int {
|
||||
return len(a.items)
|
||||
}
|
||||
func (a *byExprString) Swap(i, j int) {
|
||||
a.items[i], a.items[j] = a.items[j], a.items[i]
|
||||
}
|
||||
func (a *byExprString) Less(i, j int) bool {
|
||||
first, err := a.intr.Execute(a.node, a.items[i])
|
||||
if err != nil {
|
||||
a.hasError = true
|
||||
// Return a dummy value.
|
||||
return true
|
||||
}
|
||||
ith, ok := first.(string)
|
||||
if !ok {
|
||||
a.hasError = true
|
||||
return true
|
||||
}
|
||||
second, err := a.intr.Execute(a.node, a.items[j])
|
||||
if err != nil {
|
||||
a.hasError = true
|
||||
// Return a dummy value.
|
||||
return true
|
||||
}
|
||||
jth, ok := second.(string)
|
||||
if !ok {
|
||||
a.hasError = true
|
||||
return true
|
||||
}
|
||||
return ith < jth
|
||||
}
|
||||
|
||||
type byExprFloat struct {
|
||||
intr *treeInterpreter
|
||||
node ASTNode
|
||||
items []interface{}
|
||||
hasError bool
|
||||
}
|
||||
|
||||
func (a *byExprFloat) Len() int {
|
||||
return len(a.items)
|
||||
}
|
||||
func (a *byExprFloat) Swap(i, j int) {
|
||||
a.items[i], a.items[j] = a.items[j], a.items[i]
|
||||
}
|
||||
func (a *byExprFloat) Less(i, j int) bool {
|
||||
first, err := a.intr.Execute(a.node, a.items[i])
|
||||
if err != nil {
|
||||
a.hasError = true
|
||||
// Return a dummy value.
|
||||
return true
|
||||
}
|
||||
ith, ok := first.(float64)
|
||||
if !ok {
|
||||
a.hasError = true
|
||||
return true
|
||||
}
|
||||
second, err := a.intr.Execute(a.node, a.items[j])
|
||||
if err != nil {
|
||||
a.hasError = true
|
||||
// Return a dummy value.
|
||||
return true
|
||||
}
|
||||
jth, ok := second.(float64)
|
||||
if !ok {
|
||||
a.hasError = true
|
||||
return true
|
||||
}
|
||||
return ith < jth
|
||||
}
|
||||
|
||||
type functionCaller struct {
|
||||
functionTable map[string]functionEntry
|
||||
}
|
||||
|
||||
func newFunctionCaller() *functionCaller {
|
||||
caller := &functionCaller{}
|
||||
caller.functionTable = map[string]functionEntry{
|
||||
"length": {
|
||||
name: "length",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpString, jpArray, jpObject}},
|
||||
},
|
||||
handler: jpfLength,
|
||||
},
|
||||
"starts_with": {
|
||||
name: "starts_with",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpString}},
|
||||
{types: []jpType{jpString}},
|
||||
},
|
||||
handler: jpfStartsWith,
|
||||
},
|
||||
"abs": {
|
||||
name: "abs",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpNumber}},
|
||||
},
|
||||
handler: jpfAbs,
|
||||
},
|
||||
"avg": {
|
||||
name: "avg",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArrayNumber}},
|
||||
},
|
||||
handler: jpfAvg,
|
||||
},
|
||||
"ceil": {
|
||||
name: "ceil",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpNumber}},
|
||||
},
|
||||
handler: jpfCeil,
|
||||
},
|
||||
"contains": {
|
||||
name: "contains",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArray, jpString}},
|
||||
{types: []jpType{jpAny}},
|
||||
},
|
||||
handler: jpfContains,
|
||||
},
|
||||
"ends_with": {
|
||||
name: "ends_with",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpString}},
|
||||
{types: []jpType{jpString}},
|
||||
},
|
||||
handler: jpfEndsWith,
|
||||
},
|
||||
"floor": {
|
||||
name: "floor",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpNumber}},
|
||||
},
|
||||
handler: jpfFloor,
|
||||
},
|
||||
"map": {
|
||||
name:      "map",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpExpref}},
|
||||
{types: []jpType{jpArray}},
|
||||
},
|
||||
handler: jpfMap,
|
||||
hasExpRef: true,
|
||||
},
|
||||
"max": {
|
||||
name: "max",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArrayNumber, jpArrayString}},
|
||||
},
|
||||
handler: jpfMax,
|
||||
},
|
||||
"merge": {
|
||||
name: "merge",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpObject}, variadic: true},
|
||||
},
|
||||
handler: jpfMerge,
|
||||
},
|
||||
"max_by": {
|
||||
name: "max_by",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArray}},
|
||||
{types: []jpType{jpExpref}},
|
||||
},
|
||||
handler: jpfMaxBy,
|
||||
hasExpRef: true,
|
||||
},
|
||||
"sum": {
|
||||
name: "sum",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArrayNumber}},
|
||||
},
|
||||
handler: jpfSum,
|
||||
},
|
||||
"min": {
|
||||
name: "min",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArrayNumber, jpArrayString}},
|
||||
},
|
||||
handler: jpfMin,
|
||||
},
|
||||
"min_by": {
|
||||
name: "min_by",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArray}},
|
||||
{types: []jpType{jpExpref}},
|
||||
},
|
||||
handler: jpfMinBy,
|
||||
hasExpRef: true,
|
||||
},
|
||||
"type": {
|
||||
name: "type",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpAny}},
|
||||
},
|
||||
handler: jpfType,
|
||||
},
|
||||
"keys": {
|
||||
name: "keys",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpObject}},
|
||||
},
|
||||
handler: jpfKeys,
|
||||
},
|
||||
"values": {
|
||||
name: "values",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpObject}},
|
||||
},
|
||||
handler: jpfValues,
|
||||
},
|
||||
"sort": {
|
||||
name: "sort",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArrayString, jpArrayNumber}},
|
||||
},
|
||||
handler: jpfSort,
|
||||
},
|
||||
"sort_by": {
|
||||
name: "sort_by",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArray}},
|
||||
{types: []jpType{jpExpref}},
|
||||
},
|
||||
handler: jpfSortBy,
|
||||
hasExpRef: true,
|
||||
},
|
||||
"join": {
|
||||
name: "join",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpString}},
|
||||
{types: []jpType{jpArrayString}},
|
||||
},
|
||||
handler: jpfJoin,
|
||||
},
|
||||
"reverse": {
|
||||
name: "reverse",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpArray, jpString}},
|
||||
},
|
||||
handler: jpfReverse,
|
||||
},
|
||||
"to_array": {
|
||||
name: "to_array",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpAny}},
|
||||
},
|
||||
handler: jpfToArray,
|
||||
},
|
||||
"to_string": {
|
||||
name: "to_string",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpAny}},
|
||||
},
|
||||
handler: jpfToString,
|
||||
},
|
||||
"to_number": {
|
||||
name: "to_number",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpAny}},
|
||||
},
|
||||
handler: jpfToNumber,
|
||||
},
|
||||
"not_null": {
|
||||
name: "not_null",
|
||||
arguments: []argSpec{
|
||||
{types: []jpType{jpAny}, variadic: true},
|
||||
},
|
||||
handler: jpfNotNull,
|
||||
},
|
||||
}
|
||||
return caller
|
||||
}
|
||||
|
||||
func (e *functionEntry) resolveArgs(arguments []interface{}) ([]interface{}, error) {
|
||||
if len(e.arguments) == 0 {
|
||||
return arguments, nil
|
||||
}
|
||||
if !e.arguments[len(e.arguments)-1].variadic {
|
||||
if len(e.arguments) != len(arguments) {
|
||||
return nil, errors.New("incorrect number of args")
|
||||
}
|
||||
for i, spec := range e.arguments {
|
||||
userArg := arguments[i]
|
||||
err := spec.typeCheck(userArg)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
return arguments, nil
|
||||
}
|
||||
if len(arguments) < len(e.arguments) {
|
||||
return nil, errors.New("Invalid arity.")
|
||||
}
|
||||
return arguments, nil
|
||||
}
|
||||
|
||||
func (a *argSpec) typeCheck(arg interface{}) error {
|
||||
for _, t := range a.types {
|
||||
switch t {
|
||||
case jpNumber:
|
||||
if _, ok := arg.(float64); ok {
|
||||
return nil
|
||||
}
|
||||
case jpString:
|
||||
if _, ok := arg.(string); ok {
|
||||
return nil
|
||||
}
|
||||
case jpArray:
|
||||
if isSliceType(arg) {
|
||||
return nil
|
||||
}
|
||||
case jpObject:
|
||||
if _, ok := arg.(map[string]interface{}); ok {
|
||||
return nil
|
||||
}
|
||||
case jpArrayNumber:
|
||||
if _, ok := toArrayNum(arg); ok {
|
||||
return nil
|
||||
}
|
||||
case jpArrayString:
|
||||
if _, ok := toArrayStr(arg); ok {
|
||||
return nil
|
||||
}
|
||||
case jpAny:
|
||||
return nil
|
||||
case jpExpref:
|
||||
if _, ok := arg.(expRef); ok {
|
||||
return nil
|
||||
}
|
||||
}
|
||||
}
|
||||
return fmt.Errorf("Invalid type for: %v, expected: %#v", arg, a.types)
|
||||
}
|
||||
|
||||
func (f *functionCaller) CallFunction(name string, arguments []interface{}, intr *treeInterpreter) (interface{}, error) {
|
||||
entry, ok := f.functionTable[name]
|
||||
if !ok {
|
||||
return nil, errors.New("unknown function: " + name)
|
||||
}
|
||||
resolvedArgs, err := entry.resolveArgs(arguments)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if entry.hasExpRef {
|
||||
var extra []interface{}
|
||||
extra = append(extra, intr)
|
||||
resolvedArgs = append(extra, resolvedArgs...)
|
||||
}
|
||||
return entry.handler(resolvedArgs)
|
||||
}
|
||||
|
||||
func jpfAbs(arguments []interface{}) (interface{}, error) {
|
||||
num := arguments[0].(float64)
|
||||
return math.Abs(num), nil
|
||||
}
|
||||
|
||||
func jpfLength(arguments []interface{}) (interface{}, error) {
|
||||
arg := arguments[0]
|
||||
if c, ok := arg.(string); ok {
|
||||
return float64(utf8.RuneCountInString(c)), nil
|
||||
} else if isSliceType(arg) {
|
||||
v := reflect.ValueOf(arg)
|
||||
return float64(v.Len()), nil
|
||||
} else if c, ok := arg.(map[string]interface{}); ok {
|
||||
return float64(len(c)), nil
|
||||
}
|
||||
return nil, errors.New("could not compute length()")
|
||||
}
|
||||
|
||||
func jpfStartsWith(arguments []interface{}) (interface{}, error) {
|
||||
search := arguments[0].(string)
|
||||
prefix := arguments[1].(string)
|
||||
return strings.HasPrefix(search, prefix), nil
|
||||
}
|
||||
|
||||
func jpfAvg(arguments []interface{}) (interface{}, error) {
|
||||
// We've already type checked the value so we can safely use
|
||||
// type assertions.
|
||||
args := arguments[0].([]interface{})
|
||||
length := float64(len(args))
|
||||
numerator := 0.0
|
||||
for _, n := range args {
|
||||
numerator += n.(float64)
|
||||
}
|
||||
return numerator / length, nil
|
||||
}
|
||||
func jpfCeil(arguments []interface{}) (interface{}, error) {
|
||||
val := arguments[0].(float64)
|
||||
return math.Ceil(val), nil
|
||||
}
|
||||
func jpfContains(arguments []interface{}) (interface{}, error) {
|
||||
search := arguments[0]
|
||||
el := arguments[1]
|
||||
if searchStr, ok := search.(string); ok {
|
||||
if elStr, ok := el.(string); ok {
|
||||
return strings.Index(searchStr, elStr) != -1, nil
|
||||
}
|
||||
return false, nil
|
||||
}
|
||||
// Otherwise this is a generic contains for []interface{}
|
||||
general := search.([]interface{})
|
||||
for _, item := range general {
|
||||
if item == el {
|
||||
return true, nil
|
||||
}
|
||||
}
|
||||
return false, nil
|
||||
}
|
||||
func jpfEndsWith(arguments []interface{}) (interface{}, error) {
|
||||
search := arguments[0].(string)
|
||||
suffix := arguments[1].(string)
|
||||
return strings.HasSuffix(search, suffix), nil
|
||||
}
|
||||
func jpfFloor(arguments []interface{}) (interface{}, error) {
|
||||
val := arguments[0].(float64)
|
||||
return math.Floor(val), nil
|
||||
}
|
||||
func jpfMap(arguments []interface{}) (interface{}, error) {
|
||||
intr := arguments[0].(*treeInterpreter)
|
||||
exp := arguments[1].(expRef)
|
||||
node := exp.ref
|
||||
arr := arguments[2].([]interface{})
|
||||
mapped := make([]interface{}, 0, len(arr))
|
||||
for _, value := range arr {
|
||||
current, err := intr.Execute(node, value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
mapped = append(mapped, current)
|
||||
}
|
||||
return mapped, nil
|
||||
}
|
||||
func jpfMax(arguments []interface{}) (interface{}, error) {
|
||||
if items, ok := toArrayNum(arguments[0]); ok {
|
||||
if len(items) == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
if len(items) == 1 {
|
||||
return items[0], nil
|
||||
}
|
||||
best := items[0]
|
||||
for _, item := range items[1:] {
|
||||
if item > best {
|
||||
best = item
|
||||
}
|
||||
}
|
||||
return best, nil
|
||||
}
|
||||
// Otherwise we're dealing with a max() of strings.
|
||||
items, _ := toArrayStr(arguments[0])
|
||||
if len(items) == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
if len(items) == 1 {
|
||||
return items[0], nil
|
||||
}
|
||||
best := items[0]
|
||||
for _, item := range items[1:] {
|
||||
if item > best {
|
||||
best = item
|
||||
}
|
||||
}
|
||||
return best, nil
|
||||
}
|
||||
func jpfMerge(arguments []interface{}) (interface{}, error) {
|
||||
final := make(map[string]interface{})
|
||||
for _, m := range arguments {
|
||||
mapped := m.(map[string]interface{})
|
||||
for key, value := range mapped {
|
||||
final[key] = value
|
||||
}
|
||||
}
|
||||
return final, nil
|
||||
}
|
||||
func jpfMaxBy(arguments []interface{}) (interface{}, error) {
|
||||
intr := arguments[0].(*treeInterpreter)
|
||||
arr := arguments[1].([]interface{})
|
||||
exp := arguments[2].(expRef)
|
||||
node := exp.ref
|
||||
if len(arr) == 0 {
|
||||
return nil, nil
|
||||
} else if len(arr) == 1 {
|
||||
return arr[0], nil
|
||||
}
|
||||
start, err := intr.Execute(node, arr[0])
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
switch t := start.(type) {
|
||||
case float64:
|
||||
bestVal := t
|
||||
bestItem := arr[0]
|
||||
for _, item := range arr[1:] {
|
||||
result, err := intr.Execute(node, item)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
current, ok := result.(float64)
|
||||
if !ok {
|
||||
return nil, errors.New("invalid type, must be number")
|
||||
}
|
||||
if current > bestVal {
|
||||
bestVal = current
|
||||
bestItem = item
|
||||
}
|
||||
}
|
||||
return bestItem, nil
|
||||
case string:
|
||||
bestVal := t
|
||||
bestItem := arr[0]
|
||||
for _, item := range arr[1:] {
|
||||
result, err := intr.Execute(node, item)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
current, ok := result.(string)
|
||||
if !ok {
|
||||
return nil, errors.New("invalid type, must be string")
|
||||
}
|
||||
if current > bestVal {
|
||||
bestVal = current
|
||||
bestItem = item
|
||||
}
|
||||
}
|
||||
return bestItem, nil
|
||||
default:
|
||||
return nil, errors.New("invalid type, must be number of string")
|
||||
}
|
||||
}
|
||||
func jpfSum(arguments []interface{}) (interface{}, error) {
|
||||
items, _ := toArrayNum(arguments[0])
|
||||
sum := 0.0
|
||||
for _, item := range items {
|
||||
sum += item
|
||||
}
|
||||
return sum, nil
|
||||
}
|
||||
|
||||
func jpfMin(arguments []interface{}) (interface{}, error) {
|
||||
if items, ok := toArrayNum(arguments[0]); ok {
|
||||
if len(items) == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
if len(items) == 1 {
|
||||
return items[0], nil
|
||||
}
|
||||
best := items[0]
|
||||
for _, item := range items[1:] {
|
||||
if item < best {
|
||||
best = item
|
||||
}
|
||||
}
|
||||
return best, nil
|
||||
}
|
||||
items, _ := toArrayStr(arguments[0])
|
||||
if len(items) == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
if len(items) == 1 {
|
||||
return items[0], nil
|
||||
}
|
||||
best := items[0]
|
||||
for _, item := range items[1:] {
|
||||
if item < best {
|
||||
best = item
|
||||
}
|
||||
}
|
||||
return best, nil
|
||||
}
|
||||
|
||||
func jpfMinBy(arguments []interface{}) (interface{}, error) {
|
||||
intr := arguments[0].(*treeInterpreter)
|
||||
arr := arguments[1].([]interface{})
|
||||
exp := arguments[2].(expRef)
|
||||
node := exp.ref
|
||||
if len(arr) == 0 {
|
||||
return nil, nil
|
||||
} else if len(arr) == 1 {
|
||||
return arr[0], nil
|
||||
}
|
||||
start, err := intr.Execute(node, arr[0])
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if t, ok := start.(float64); ok {
|
||||
bestVal := t
|
||||
bestItem := arr[0]
|
||||
for _, item := range arr[1:] {
|
||||
result, err := intr.Execute(node, item)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
current, ok := result.(float64)
|
||||
if !ok {
|
||||
return nil, errors.New("invalid type, must be number")
|
||||
}
|
||||
if current < bestVal {
|
||||
bestVal = current
|
||||
bestItem = item
|
||||
}
|
||||
}
|
||||
return bestItem, nil
|
||||
} else if t, ok := start.(string); ok {
|
||||
bestVal := t
|
||||
bestItem := arr[0]
|
||||
for _, item := range arr[1:] {
|
||||
result, err := intr.Execute(node, item)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
current, ok := result.(string)
|
||||
if !ok {
|
||||
return nil, errors.New("invalid type, must be string")
|
||||
}
|
||||
if current < bestVal {
|
||||
bestVal = current
|
||||
bestItem = item
|
||||
}
|
||||
}
|
||||
return bestItem, nil
|
||||
} else {
|
||||
return nil, errors.New("invalid type, must be number of string")
|
||||
}
|
||||
}
|
||||
func jpfType(arguments []interface{}) (interface{}, error) {
|
||||
arg := arguments[0]
|
||||
if _, ok := arg.(float64); ok {
|
||||
return "number", nil
|
||||
}
|
||||
if _, ok := arg.(string); ok {
|
||||
return "string", nil
|
||||
}
|
||||
if _, ok := arg.([]interface{}); ok {
|
||||
return "array", nil
|
||||
}
|
||||
if _, ok := arg.(map[string]interface{}); ok {
|
||||
return "object", nil
|
||||
}
|
||||
if arg == nil {
|
||||
return "null", nil
|
||||
}
|
||||
if arg == true || arg == false {
|
||||
return "boolean", nil
|
||||
}
|
||||
return nil, errors.New("unknown type")
|
||||
}
|
||||
func jpfKeys(arguments []interface{}) (interface{}, error) {
|
||||
arg := arguments[0].(map[string]interface{})
|
||||
collected := make([]interface{}, 0, len(arg))
|
||||
for key := range arg {
|
||||
collected = append(collected, key)
|
||||
}
|
||||
return collected, nil
|
||||
}
|
||||
func jpfValues(arguments []interface{}) (interface{}, error) {
|
||||
arg := arguments[0].(map[string]interface{})
|
||||
collected := make([]interface{}, 0, len(arg))
|
||||
for _, value := range arg {
|
||||
collected = append(collected, value)
|
||||
}
|
||||
return collected, nil
|
||||
}
|
||||
func jpfSort(arguments []interface{}) (interface{}, error) {
|
||||
if items, ok := toArrayNum(arguments[0]); ok {
|
||||
d := sort.Float64Slice(items)
|
||||
sort.Stable(d)
|
||||
final := make([]interface{}, len(d))
|
||||
for i, val := range d {
|
||||
final[i] = val
|
||||
}
|
||||
return final, nil
|
||||
}
|
||||
// Otherwise we're dealing with sort()'ing strings.
|
||||
items, _ := toArrayStr(arguments[0])
|
||||
d := sort.StringSlice(items)
|
||||
sort.Stable(d)
|
||||
final := make([]interface{}, len(d))
|
||||
for i, val := range d {
|
||||
final[i] = val
|
||||
}
|
||||
return final, nil
|
||||
}
|
||||
func jpfSortBy(arguments []interface{}) (interface{}, error) {
|
||||
intr := arguments[0].(*treeInterpreter)
|
||||
arr := arguments[1].([]interface{})
|
||||
exp := arguments[2].(expRef)
|
||||
node := exp.ref
|
||||
if len(arr) == 0 {
|
||||
return arr, nil
|
||||
} else if len(arr) == 1 {
|
||||
return arr, nil
|
||||
}
|
||||
start, err := intr.Execute(node, arr[0])
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if _, ok := start.(float64); ok {
|
||||
sortable := &byExprFloat{intr, node, arr, false}
|
||||
sort.Stable(sortable)
|
||||
if sortable.hasError {
|
||||
return nil, errors.New("error in sort_by comparison")
|
||||
}
|
||||
return arr, nil
|
||||
} else if _, ok := start.(string); ok {
|
||||
sortable := &byExprString{intr, node, arr, false}
|
||||
sort.Stable(sortable)
|
||||
if sortable.hasError {
|
||||
return nil, errors.New("error in sort_by comparison")
|
||||
}
|
||||
return arr, nil
|
||||
} else {
|
||||
return nil, errors.New("invalid type, must be number of string")
|
||||
}
|
||||
}
|
||||
func jpfJoin(arguments []interface{}) (interface{}, error) {
|
||||
sep := arguments[0].(string)
|
||||
// We can't just do arguments[1].([]string), we have to
|
||||
// manually convert each item to a string.
|
||||
arrayStr := []string{}
|
||||
for _, item := range arguments[1].([]interface{}) {
|
||||
arrayStr = append(arrayStr, item.(string))
|
||||
}
|
||||
return strings.Join(arrayStr, sep), nil
|
||||
}
|
||||
func jpfReverse(arguments []interface{}) (interface{}, error) {
|
||||
if s, ok := arguments[0].(string); ok {
|
||||
r := []rune(s)
|
||||
for i, j := 0, len(r)-1; i < len(r)/2; i, j = i+1, j-1 {
|
||||
r[i], r[j] = r[j], r[i]
|
||||
}
|
||||
return string(r), nil
|
||||
}
|
||||
items := arguments[0].([]interface{})
|
||||
length := len(items)
|
||||
reversed := make([]interface{}, length)
|
||||
for i, item := range items {
|
||||
reversed[length-(i+1)] = item
|
||||
}
|
||||
return reversed, nil
|
||||
}
|
||||
func jpfToArray(arguments []interface{}) (interface{}, error) {
|
||||
if _, ok := arguments[0].([]interface{}); ok {
|
||||
return arguments[0], nil
|
||||
}
|
||||
return arguments[:1:1], nil
|
||||
}
|
||||
func jpfToString(arguments []interface{}) (interface{}, error) {
|
||||
if v, ok := arguments[0].(string); ok {
|
||||
return v, nil
|
||||
}
|
||||
result, err := json.Marshal(arguments[0])
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return string(result), nil
|
||||
}
|
||||
func jpfToNumber(arguments []interface{}) (interface{}, error) {
|
||||
arg := arguments[0]
|
||||
if v, ok := arg.(float64); ok {
|
||||
return v, nil
|
||||
}
|
||||
if v, ok := arg.(string); ok {
|
||||
conv, err := strconv.ParseFloat(v, 64)
|
||||
if err != nil {
|
||||
return nil, nil
|
||||
}
|
||||
return conv, nil
|
||||
}
|
||||
if _, ok := arg.([]interface{}); ok {
|
||||
return nil, nil
|
||||
}
|
||||
if _, ok := arg.(map[string]interface{}); ok {
|
||||
return nil, nil
|
||||
}
|
||||
if arg == nil {
|
||||
return nil, nil
|
||||
}
|
||||
if arg == true || arg == false {
|
||||
return nil, nil
|
||||
}
|
||||
return nil, errors.New("unknown type")
|
||||
}
|
||||
func jpfNotNull(arguments []interface{}) (interface{}, error) {
|
||||
for _, arg := range arguments {
|
||||
if arg != nil {
|
||||
return arg, nil
|
||||
}
|
||||
}
|
||||
return nil, nil
|
||||
}
|
418 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/interpreter.go generated vendored
@@ -1,418 +0,0 @@
package jmespath
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"reflect"
|
||||
"unicode"
|
||||
"unicode/utf8"
|
||||
)
|
||||
|
||||
/* This is a tree based interpreter. It walks the AST and directly
|
||||
interprets the AST to search through a JSON document.
|
||||
*/
|
||||
|
||||
type treeInterpreter struct {
|
||||
fCall *functionCaller
|
||||
}
|
||||
|
||||
func newInterpreter() *treeInterpreter {
|
||||
interpreter := treeInterpreter{}
|
||||
interpreter.fCall = newFunctionCaller()
|
||||
return &interpreter
|
||||
}
|
||||
|
||||
type expRef struct {
|
||||
ref ASTNode
|
||||
}
|
||||
|
||||
// Execute takes an ASTNode and input data and interprets the AST directly.
|
||||
// It will produce the result of applying the JMESPath expression associated
|
||||
// with the ASTNode to the input data "value".
|
||||
func (intr *treeInterpreter) Execute(node ASTNode, value interface{}) (interface{}, error) {
|
||||
switch node.nodeType {
|
||||
case ASTComparator:
|
||||
left, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
right, err := intr.Execute(node.children[1], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
switch node.value {
|
||||
case tEQ:
|
||||
return objsEqual(left, right), nil
|
||||
case tNE:
|
||||
return !objsEqual(left, right), nil
|
||||
}
|
||||
leftNum, ok := left.(float64)
|
||||
if !ok {
|
||||
return nil, nil
|
||||
}
|
||||
rightNum, ok := right.(float64)
|
||||
if !ok {
|
||||
return nil, nil
|
||||
}
|
||||
switch node.value {
|
||||
case tGT:
|
||||
return leftNum > rightNum, nil
|
||||
case tGTE:
|
||||
return leftNum >= rightNum, nil
|
||||
case tLT:
|
||||
return leftNum < rightNum, nil
|
||||
case tLTE:
|
||||
return leftNum <= rightNum, nil
|
||||
}
|
||||
case ASTExpRef:
|
||||
return expRef{ref: node.children[0]}, nil
|
||||
case ASTFunctionExpression:
|
||||
resolvedArgs := []interface{}{}
|
||||
for _, arg := range node.children {
|
||||
current, err := intr.Execute(arg, value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
resolvedArgs = append(resolvedArgs, current)
|
||||
}
|
||||
return intr.fCall.CallFunction(node.value.(string), resolvedArgs, intr)
|
||||
case ASTField:
|
||||
if m, ok := value.(map[string]interface{}); ok {
|
||||
key := node.value.(string)
|
||||
return m[key], nil
|
||||
}
|
||||
return intr.fieldFromStruct(node.value.(string), value)
|
||||
case ASTFilterProjection:
|
||||
left, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, nil
|
||||
}
|
||||
sliceType, ok := left.([]interface{})
|
||||
if !ok {
|
||||
if isSliceType(left) {
|
||||
return intr.filterProjectionWithReflection(node, left)
|
||||
}
|
||||
return nil, nil
|
||||
}
|
||||
compareNode := node.children[2]
|
||||
collected := []interface{}{}
|
||||
for _, element := range sliceType {
|
||||
result, err := intr.Execute(compareNode, element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if !isFalse(result) {
|
||||
current, err := intr.Execute(node.children[1], element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if current != nil {
|
||||
collected = append(collected, current)
|
||||
}
|
||||
}
|
||||
}
|
||||
return collected, nil
|
||||
case ASTFlatten:
|
||||
left, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, nil
|
||||
}
|
||||
sliceType, ok := left.([]interface{})
|
||||
if !ok {
|
||||
// If we can't type convert to []interface{}, there's
|
||||
// a chance this could still work via reflection if we're
|
||||
// dealing with user provided types.
|
||||
if isSliceType(left) {
|
||||
return intr.flattenWithReflection(left)
|
||||
}
|
||||
return nil, nil
|
||||
}
|
||||
flattened := []interface{}{}
|
||||
for _, element := range sliceType {
|
||||
if elementSlice, ok := element.([]interface{}); ok {
|
||||
flattened = append(flattened, elementSlice...)
|
||||
} else if isSliceType(element) {
|
||||
reflectFlat := []interface{}{}
|
||||
v := reflect.ValueOf(element)
|
||||
for i := 0; i < v.Len(); i++ {
|
||||
reflectFlat = append(reflectFlat, v.Index(i).Interface())
|
||||
}
|
||||
flattened = append(flattened, reflectFlat...)
|
||||
} else {
|
||||
flattened = append(flattened, element)
|
||||
}
|
||||
}
|
||||
return flattened, nil
|
||||
case ASTIdentity, ASTCurrentNode:
|
||||
return value, nil
|
||||
case ASTIndex:
|
||||
if sliceType, ok := value.([]interface{}); ok {
|
||||
index := node.value.(int)
|
||||
if index < 0 {
|
||||
index += len(sliceType)
|
||||
}
|
||||
if index < len(sliceType) && index >= 0 {
|
||||
return sliceType[index], nil
|
||||
}
|
||||
return nil, nil
|
||||
}
|
||||
// Otherwise try via reflection.
|
||||
rv := reflect.ValueOf(value)
|
||||
if rv.Kind() == reflect.Slice {
|
||||
index := node.value.(int)
|
||||
if index < 0 {
|
||||
index += rv.Len()
|
||||
}
|
||||
if index < rv.Len() && index >= 0 {
|
||||
v := rv.Index(index)
|
||||
return v.Interface(), nil
|
||||
}
|
||||
}
|
||||
return nil, nil
|
||||
case ASTKeyValPair:
|
||||
return intr.Execute(node.children[0], value)
|
||||
case ASTLiteral:
|
||||
return node.value, nil
|
||||
case ASTMultiSelectHash:
|
||||
if value == nil {
|
||||
return nil, nil
|
||||
}
|
||||
collected := make(map[string]interface{})
|
||||
for _, child := range node.children {
|
||||
current, err := intr.Execute(child, value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
key := child.value.(string)
|
||||
collected[key] = current
|
||||
}
|
||||
return collected, nil
|
||||
case ASTMultiSelectList:
|
||||
if value == nil {
|
||||
return nil, nil
|
||||
}
|
||||
collected := []interface{}{}
|
||||
for _, child := range node.children {
|
||||
current, err := intr.Execute(child, value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
collected = append(collected, current)
|
||||
}
|
||||
return collected, nil
|
||||
case ASTOrExpression:
|
||||
matched, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if isFalse(matched) {
|
||||
matched, err = intr.Execute(node.children[1], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
return matched, nil
|
||||
case ASTAndExpression:
|
||||
matched, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if isFalse(matched) {
|
||||
return matched, nil
|
||||
}
|
||||
return intr.Execute(node.children[1], value)
|
||||
case ASTNotExpression:
|
||||
matched, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if isFalse(matched) {
|
||||
return true, nil
|
||||
}
|
||||
return false, nil
|
||||
case ASTPipe:
|
||||
result := value
|
||||
var err error
|
||||
for _, child := range node.children {
|
||||
result, err = intr.Execute(child, result)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
return result, nil
|
||||
case ASTProjection:
|
||||
left, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
sliceType, ok := left.([]interface{})
|
||||
if !ok {
|
||||
if isSliceType(left) {
|
||||
return intr.projectWithReflection(node, left)
|
||||
}
|
||||
return nil, nil
|
||||
}
|
||||
collected := []interface{}{}
|
||||
var current interface{}
|
||||
for _, element := range sliceType {
|
||||
current, err = intr.Execute(node.children[1], element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if current != nil {
|
||||
collected = append(collected, current)
|
||||
}
|
||||
}
|
||||
return collected, nil
|
||||
case ASTSubexpression, ASTIndexExpression:
|
||||
left, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return intr.Execute(node.children[1], left)
|
||||
case ASTSlice:
|
||||
sliceType, ok := value.([]interface{})
|
||||
if !ok {
|
||||
if isSliceType(value) {
|
||||
return intr.sliceWithReflection(node, value)
|
||||
}
|
||||
return nil, nil
|
||||
}
|
||||
parts := node.value.([]*int)
|
||||
sliceParams := make([]sliceParam, 3)
|
||||
for i, part := range parts {
|
||||
if part != nil {
|
||||
sliceParams[i].Specified = true
|
||||
sliceParams[i].N = *part
|
||||
}
|
||||
}
|
||||
return slice(sliceType, sliceParams)
|
||||
case ASTValueProjection:
|
||||
left, err := intr.Execute(node.children[0], value)
|
||||
if err != nil {
|
||||
return nil, nil
|
||||
}
|
||||
mapType, ok := left.(map[string]interface{})
|
||||
if !ok {
|
||||
return nil, nil
|
||||
}
|
||||
values := make([]interface{}, len(mapType))
|
||||
for _, value := range mapType {
|
||||
values = append(values, value)
|
||||
}
|
||||
collected := []interface{}{}
|
||||
for _, element := range values {
|
||||
current, err := intr.Execute(node.children[1], element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if current != nil {
|
||||
collected = append(collected, current)
|
||||
}
|
||||
}
|
||||
return collected, nil
|
||||
}
|
||||
return nil, errors.New("Unknown AST node: " + node.nodeType.String())
|
||||
}
|
||||
|
||||
func (intr *treeInterpreter) fieldFromStruct(key string, value interface{}) (interface{}, error) {
|
||||
rv := reflect.ValueOf(value)
|
||||
first, n := utf8.DecodeRuneInString(key)
|
||||
fieldName := string(unicode.ToUpper(first)) + key[n:]
|
||||
if rv.Kind() == reflect.Struct {
|
||||
v := rv.FieldByName(fieldName)
|
||||
if !v.IsValid() {
|
||||
return nil, nil
|
||||
}
|
||||
return v.Interface(), nil
|
||||
} else if rv.Kind() == reflect.Ptr {
|
||||
// Handle multiple levels of indirection?
|
||||
if rv.IsNil() {
|
||||
return nil, nil
|
||||
}
|
||||
rv = rv.Elem()
|
||||
v := rv.FieldByName(fieldName)
|
||||
if !v.IsValid() {
|
||||
return nil, nil
|
||||
}
|
||||
return v.Interface(), nil
|
||||
}
|
||||
return nil, nil
|
||||
}
|
||||
|
||||
func (intr *treeInterpreter) flattenWithReflection(value interface{}) (interface{}, error) {
|
||||
v := reflect.ValueOf(value)
|
||||
flattened := []interface{}{}
|
||||
for i := 0; i < v.Len(); i++ {
|
||||
element := v.Index(i).Interface()
|
||||
if reflect.TypeOf(element).Kind() == reflect.Slice {
|
||||
// Then insert the contents of the element
|
||||
// slice into the flattened slice,
|
||||
// i.e flattened = append(flattened, mySlice...)
|
||||
elementV := reflect.ValueOf(element)
|
||||
for j := 0; j < elementV.Len(); j++ {
|
||||
flattened = append(
|
||||
flattened, elementV.Index(j).Interface())
|
||||
}
|
||||
} else {
|
||||
flattened = append(flattened, element)
|
||||
}
|
||||
}
|
||||
return flattened, nil
|
||||
}
|
||||
|
||||
func (intr *treeInterpreter) sliceWithReflection(node ASTNode, value interface{}) (interface{}, error) {
|
||||
v := reflect.ValueOf(value)
|
||||
parts := node.value.([]*int)
|
||||
sliceParams := make([]sliceParam, 3)
|
||||
for i, part := range parts {
|
||||
if part != nil {
|
||||
sliceParams[i].Specified = true
|
||||
sliceParams[i].N = *part
|
||||
}
|
||||
}
|
||||
final := []interface{}{}
|
||||
for i := 0; i < v.Len(); i++ {
|
||||
element := v.Index(i).Interface()
|
||||
final = append(final, element)
|
||||
}
|
||||
return slice(final, sliceParams)
|
||||
}
|
||||
|
||||
func (intr *treeInterpreter) filterProjectionWithReflection(node ASTNode, value interface{}) (interface{}, error) {
|
||||
compareNode := node.children[2]
|
||||
collected := []interface{}{}
|
||||
v := reflect.ValueOf(value)
|
||||
for i := 0; i < v.Len(); i++ {
|
||||
element := v.Index(i).Interface()
|
||||
result, err := intr.Execute(compareNode, element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if !isFalse(result) {
|
||||
current, err := intr.Execute(node.children[1], element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if current != nil {
|
||||
collected = append(collected, current)
|
||||
}
|
||||
}
|
||||
}
|
||||
return collected, nil
|
||||
}
|
||||
|
||||
func (intr *treeInterpreter) projectWithReflection(node ASTNode, value interface{}) (interface{}, error) {
|
||||
collected := []interface{}{}
|
||||
v := reflect.ValueOf(value)
|
||||
for i := 0; i < v.Len(); i++ {
|
||||
element := v.Index(i).Interface()
|
||||
result, err := intr.Execute(node.children[1], element)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if result != nil {
|
||||
collected = append(collected, result)
|
||||
}
|
||||
}
|
||||
return collected, nil
|
||||
}
|
420 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/lexer.go generated vendored
@@ -1,420 +0,0 @@
package jmespath
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"strconv"
|
||||
"strings"
|
||||
"unicode/utf8"
|
||||
)
|
||||
|
||||
type token struct {
|
||||
tokenType tokType
|
||||
value string
|
||||
position int
|
||||
length int
|
||||
}
|
||||
|
||||
type tokType int
|
||||
|
||||
const eof = -1
|
||||
|
||||
// Lexer contains information about the expression being tokenized.
|
||||
type Lexer struct {
|
||||
expression string // The expression provided by the user.
|
||||
currentPos int // The current position in the string.
|
||||
lastWidth int // The width of the current rune. This
|
||||
buf bytes.Buffer // Internal buffer used for building up values.
|
||||
}
|
||||
|
||||
// SyntaxError is the main error used whenever a lexing or parsing error occurs.
|
||||
type SyntaxError struct {
|
||||
msg string // Error message displayed to user
|
||||
Expression string // Expression that generated a SyntaxError
|
||||
Offset int // The location in the string where the error occurred
|
||||
}
|
||||
|
||||
func (e SyntaxError) Error() string {
|
||||
// In the future, it would be good to underline the specific
|
||||
// location where the error occurred.
|
||||
return "SyntaxError: " + e.msg
|
||||
}
|
||||
|
||||
// HighlightLocation will show where the syntax error occurred.
|
||||
// It will place a "^" character on a line below the expression
|
||||
// at the point where the syntax error occurred.
|
||||
func (e SyntaxError) HighlightLocation() string {
|
||||
return e.Expression + "\n" + strings.Repeat(" ", e.Offset) + "^"
|
||||
}
|
||||
|
||||
//go:generate stringer -type=tokType
|
||||
const (
|
||||
tUnknown tokType = iota
|
||||
tStar
|
||||
tDot
|
||||
tFilter
|
||||
tFlatten
|
||||
tLparen
|
||||
tRparen
|
||||
tLbracket
|
||||
tRbracket
|
||||
tLbrace
|
||||
tRbrace
|
||||
tOr
|
||||
tPipe
|
||||
tNumber
|
||||
tUnquotedIdentifier
|
||||
tQuotedIdentifier
|
||||
tComma
|
||||
tColon
|
||||
tLT
|
||||
tLTE
|
||||
tGT
|
||||
tGTE
|
||||
tEQ
|
||||
tNE
|
||||
tJSONLiteral
|
||||
tStringLiteral
|
||||
tCurrent
|
||||
tExpref
|
||||
tAnd
|
||||
tNot
|
||||
tEOF
|
||||
)
|
||||
|
||||
var basicTokens = map[rune]tokType{
|
||||
'.': tDot,
|
||||
'*': tStar,
|
||||
',': tComma,
|
||||
':': tColon,
|
||||
'{': tLbrace,
|
||||
'}': tRbrace,
|
||||
']': tRbracket, // tLbracket not included because it could be "[]"
|
||||
'(': tLparen,
|
||||
')': tRparen,
|
||||
'@': tCurrent,
|
||||
}
|
||||
|
||||
// Bit mask for [a-zA-Z_] shifted down 64 bits to fit in a single uint64.
|
||||
// When using this bitmask just be sure to shift the rune down 64 bits
|
||||
// before checking against identifierStartBits.
|
||||
const identifierStartBits uint64 = 576460745995190270
|
||||
|
||||
// Bit mask for [a-zA-Z0-9], 128 bits -> 2 uint64s.
|
||||
var identifierTrailingBits = [2]uint64{287948901175001088, 576460745995190270}
|
||||
|
||||
var whiteSpace = map[rune]bool{
|
||||
' ': true, '\t': true, '\n': true, '\r': true,
|
||||
}
|
||||
|
||||
func (t token) String() string {
|
||||
return fmt.Sprintf("Token{%+v, %s, %d, %d}",
|
||||
t.tokenType, t.value, t.position, t.length)
|
||||
}
|
||||
|
||||
// NewLexer creates a new JMESPath lexer.
|
||||
func NewLexer() *Lexer {
|
||||
lexer := Lexer{}
|
||||
return &lexer
|
||||
}
|
||||
|
||||
func (lexer *Lexer) next() rune {
|
||||
if lexer.currentPos >= len(lexer.expression) {
|
||||
lexer.lastWidth = 0
|
||||
return eof
|
||||
}
|
||||
r, w := utf8.DecodeRuneInString(lexer.expression[lexer.currentPos:])
|
||||
lexer.lastWidth = w
|
||||
lexer.currentPos += w
|
||||
return r
|
||||
}
|
||||
|
||||
func (lexer *Lexer) back() {
|
||||
lexer.currentPos -= lexer.lastWidth
|
||||
}
|
||||
|
||||
func (lexer *Lexer) peek() rune {
|
||||
t := lexer.next()
|
||||
lexer.back()
|
||||
return t
|
||||
}
|
||||
|
||||
// tokenize takes an expression and returns corresponding tokens.
|
||||
func (lexer *Lexer) tokenize(expression string) ([]token, error) {
|
||||
var tokens []token
|
||||
lexer.expression = expression
|
||||
lexer.currentPos = 0
|
||||
lexer.lastWidth = 0
|
||||
loop:
|
||||
for {
|
||||
r := lexer.next()
|
||||
if identifierStartBits&(1<<(uint64(r)-64)) > 0 {
|
||||
t := lexer.consumeUnquotedIdentifier()
|
||||
tokens = append(tokens, t)
|
||||
} else if val, ok := basicTokens[r]; ok {
|
||||
// Basic single char token.
|
||||
t := token{
|
||||
tokenType: val,
|
||||
value: string(r),
|
||||
position: lexer.currentPos - lexer.lastWidth,
|
||||
length: 1,
|
||||
}
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '-' || (r >= '0' && r <= '9') {
|
||||
t := lexer.consumeNumber()
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '[' {
|
||||
t := lexer.consumeLBracket()
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '"' {
|
||||
t, err := lexer.consumeQuotedIdentifier()
|
||||
if err != nil {
|
||||
return tokens, err
|
||||
}
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '\'' {
|
||||
t, err := lexer.consumeRawStringLiteral()
|
||||
if err != nil {
|
||||
return tokens, err
|
||||
}
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '`' {
|
||||
t, err := lexer.consumeLiteral()
|
||||
if err != nil {
|
||||
return tokens, err
|
||||
}
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '|' {
|
||||
t := lexer.matchOrElse(r, '|', tOr, tPipe)
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '<' {
|
||||
t := lexer.matchOrElse(r, '=', tLTE, tLT)
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '>' {
|
||||
t := lexer.matchOrElse(r, '=', tGTE, tGT)
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '!' {
|
||||
t := lexer.matchOrElse(r, '=', tNE, tNot)
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '=' {
|
||||
t := lexer.matchOrElse(r, '=', tEQ, tUnknown)
|
||||
tokens = append(tokens, t)
|
||||
} else if r == '&' {
|
||||
t := lexer.matchOrElse(r, '&', tAnd, tExpref)
|
||||
tokens = append(tokens, t)
|
||||
} else if r == eof {
|
||||
break loop
|
||||
} else if _, ok := whiteSpace[r]; ok {
|
||||
// Ignore whitespace
|
||||
} else {
|
||||
return tokens, lexer.syntaxError(fmt.Sprintf("Unknown char: %s", strconv.QuoteRuneToASCII(r)))
|
||||
}
|
||||
}
|
||||
tokens = append(tokens, token{tEOF, "", len(lexer.expression), 0})
|
||||
return tokens, nil
|
||||
}
|
||||
|
||||
// Consume characters until the ending rune "r" is reached.
|
||||
// If the end of the expression is reached before seeing the
|
||||
// terminating rune "r", then an error is returned.
|
||||
// If no error occurs then the matching substring is returned.
|
||||
// The returned string will not include the ending rune.
|
||||
func (lexer *Lexer) consumeUntil(end rune) (string, error) {
|
||||
start := lexer.currentPos
|
||||
current := lexer.next()
|
||||
for current != end && current != eof {
|
||||
if current == '\\' && lexer.peek() != eof {
|
||||
lexer.next()
|
||||
}
|
||||
current = lexer.next()
|
||||
}
|
||||
if lexer.lastWidth == 0 {
|
||||
// Then we hit an EOF so we never reached the closing
|
||||
// delimiter.
|
||||
return "", SyntaxError{
|
||||
msg: "Unclosed delimiter: " + string(end),
|
||||
Expression: lexer.expression,
|
||||
Offset: len(lexer.expression),
|
||||
}
|
||||
}
|
||||
return lexer.expression[start : lexer.currentPos-lexer.lastWidth], nil
|
||||
}
|
||||
|
||||
func (lexer *Lexer) consumeLiteral() (token, error) {
|
||||
start := lexer.currentPos
|
||||
value, err := lexer.consumeUntil('`')
|
||||
if err != nil {
|
||||
return token{}, err
|
||||
}
|
||||
value = strings.Replace(value, "\\`", "`", -1)
|
||||
return token{
|
||||
tokenType: tJSONLiteral,
|
||||
value: value,
|
||||
position: start,
|
||||
length: len(value),
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (lexer *Lexer) consumeRawStringLiteral() (token, error) {
|
||||
start := lexer.currentPos
|
||||
currentIndex := start
|
||||
current := lexer.next()
|
||||
for current != '\'' && lexer.peek() != eof {
|
||||
if current == '\\' && lexer.peek() == '\'' {
|
||||
chunk := lexer.expression[currentIndex : lexer.currentPos-1]
|
||||
lexer.buf.WriteString(chunk)
|
||||
lexer.buf.WriteString("'")
|
||||
lexer.next()
|
||||
currentIndex = lexer.currentPos
|
||||
}
|
||||
current = lexer.next()
|
||||
}
|
||||
if lexer.lastWidth == 0 {
|
||||
// Then we hit an EOF so we never reached the closing
|
||||
// delimiter.
|
||||
return token{}, SyntaxError{
|
||||
msg: "Unclosed delimiter: '",
|
||||
Expression: lexer.expression,
|
||||
Offset: len(lexer.expression),
|
||||
}
|
||||
}
|
||||
if currentIndex < lexer.currentPos {
|
||||
lexer.buf.WriteString(lexer.expression[currentIndex : lexer.currentPos-1])
|
||||
}
|
||||
value := lexer.buf.String()
|
||||
// Reset the buffer so it can reused again.
|
||||
lexer.buf.Reset()
|
||||
return token{
|
||||
tokenType: tStringLiteral,
|
||||
value: value,
|
||||
position: start,
|
||||
length: len(value),
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (lexer *Lexer) syntaxError(msg string) SyntaxError {
|
||||
return SyntaxError{
|
||||
msg: msg,
|
||||
Expression: lexer.expression,
|
||||
Offset: lexer.currentPos - 1,
|
||||
}
|
||||
}
|
||||
|
||||
// Checks for a two char token, otherwise matches a single character
|
||||
// token. This is used whenever a two char token overlaps a single
|
||||
// char token, e.g. "||" -> tPipe, "|" -> tOr.
|
||||
func (lexer *Lexer) matchOrElse(first rune, second rune, matchedType tokType, singleCharType tokType) token {
|
||||
start := lexer.currentPos - lexer.lastWidth
|
||||
nextRune := lexer.next()
|
||||
var t token
|
||||
if nextRune == second {
|
||||
t = token{
|
||||
tokenType: matchedType,
|
||||
value: string(first) + string(second),
|
||||
position: start,
|
||||
length: 2,
|
||||
}
|
||||
} else {
|
||||
lexer.back()
|
||||
t = token{
|
||||
tokenType: singleCharType,
|
||||
value: string(first),
|
||||
position: start,
|
||||
length: 1,
|
||||
}
|
||||
}
|
||||
return t
|
||||
}
|
||||
|
||||
func (lexer *Lexer) consumeLBracket() token {
|
||||
// There's three options here:
|
||||
// 1. A filter expression "[?"
|
||||
// 2. A flatten operator "[]"
|
||||
// 3. A bare rbracket "["
|
||||
start := lexer.currentPos - lexer.lastWidth
|
||||
nextRune := lexer.next()
|
||||
var t token
|
||||
if nextRune == '?' {
|
||||
t = token{
|
||||
tokenType: tFilter,
|
||||
value: "[?",
|
||||
position: start,
|
||||
length: 2,
|
||||
}
|
||||
} else if nextRune == ']' {
|
||||
t = token{
|
||||
tokenType: tFlatten,
|
||||
value: "[]",
|
||||
position: start,
|
||||
length: 2,
|
||||
}
|
||||
} else {
|
||||
t = token{
|
||||
tokenType: tLbracket,
|
||||
value: "[",
|
||||
position: start,
|
||||
length: 1,
|
||||
}
|
||||
lexer.back()
|
||||
}
|
||||
return t
|
||||
}
|
||||
|
||||
func (lexer *Lexer) consumeQuotedIdentifier() (token, error) {
|
||||
start := lexer.currentPos
|
||||
value, err := lexer.consumeUntil('"')
|
||||
if err != nil {
|
||||
return token{}, err
|
||||
}
|
||||
var decoded string
|
||||
asJSON := []byte("\"" + value + "\"")
|
||||
if err := json.Unmarshal([]byte(asJSON), &decoded); err != nil {
|
||||
return token{}, err
|
||||
}
|
||||
return token{
|
||||
tokenType: tQuotedIdentifier,
|
||||
value: decoded,
|
||||
position: start - 1,
|
||||
length: len(decoded),
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (lexer *Lexer) consumeUnquotedIdentifier() token {
|
||||
// Consume runes until we reach the end of an unquoted
|
||||
// identifier.
|
||||
start := lexer.currentPos - lexer.lastWidth
|
||||
for {
|
||||
r := lexer.next()
|
||||
if r < 0 || r > 128 || identifierTrailingBits[uint64(r)/64]&(1<<(uint64(r)%64)) == 0 {
|
||||
lexer.back()
|
||||
break
|
||||
}
|
||||
}
|
||||
value := lexer.expression[start:lexer.currentPos]
|
||||
return token{
|
||||
tokenType: tUnquotedIdentifier,
|
||||
value: value,
|
||||
position: start,
|
||||
length: lexer.currentPos - start,
|
||||
}
|
||||
}
|
||||
|
||||
func (lexer *Lexer) consumeNumber() token {
|
||||
// Consume runes until we reach something that's not a number.
|
||||
start := lexer.currentPos - lexer.lastWidth
|
||||
for {
|
||||
r := lexer.next()
|
||||
if r < '0' || r > '9' {
|
||||
lexer.back()
|
||||
break
|
||||
}
|
||||
}
|
||||
value := lexer.expression[start:lexer.currentPos]
|
||||
return token{
|
||||
tokenType: tNumber,
|
||||
value: value,
|
||||
position: start,
|
||||
length: lexer.currentPos - start,
|
||||
}
|
||||
}
|
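For context (not part of the diff): a minimal sketch of how the tokenize step above splits an expression into the token types it defines. tokenize is unexported, so this is written as a hypothetical in-package test; the test name, expression, and expected count are illustrative assumptions.

// Hypothetical in-package test sketch for the lexer shown above.
package jmespath

import "testing"

func TestTokenizeSketch(t *testing.T) {
	lexer := NewLexer()
	tokens, err := lexer.tokenize("foo.bar[0]")
	if err != nil {
		t.Fatal(err)
	}
	// Expected sequence: tUnquotedIdentifier("foo"), tDot, tUnquotedIdentifier("bar"),
	// tLbracket, tNumber("0"), tRbracket, tEOF.
	if len(tokens) != 7 {
		t.Fatalf("got %d tokens, want 7", len(tokens))
	}
}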
603 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/parser.go generated vendored
@@ -1,603 +0,0 @@
package jmespath

import (
	"encoding/json"
	"fmt"
	"strconv"
	"strings"
)

type astNodeType int

//go:generate stringer -type astNodeType
const (
	ASTEmpty astNodeType = iota
	ASTComparator
	ASTCurrentNode
	ASTExpRef
	ASTFunctionExpression
	ASTField
	ASTFilterProjection
	ASTFlatten
	ASTIdentity
	ASTIndex
	ASTIndexExpression
	ASTKeyValPair
	ASTLiteral
	ASTMultiSelectHash
	ASTMultiSelectList
	ASTOrExpression
	ASTAndExpression
	ASTNotExpression
	ASTPipe
	ASTProjection
	ASTSubexpression
	ASTSlice
	ASTValueProjection
)

// ASTNode represents the abstract syntax tree of a JMESPath expression.
type ASTNode struct {
	nodeType astNodeType
	value    interface{}
	children []ASTNode
}

func (node ASTNode) String() string {
	return node.PrettyPrint(0)
}

// PrettyPrint will pretty print the parsed AST.
// The AST is an implementation detail and this pretty print
// function is provided as a convenience method to help with
// debugging. You should not rely on its output as the internal
// structure of the AST may change at any time.
func (node ASTNode) PrettyPrint(indent int) string {
	spaces := strings.Repeat(" ", indent)
	output := fmt.Sprintf("%s%s {\n", spaces, node.nodeType)
	nextIndent := indent + 2
	if node.value != nil {
		if converted, ok := node.value.(fmt.Stringer); ok {
			// Account for things like comparator nodes
			// that are enums with a String() method.
			output += fmt.Sprintf("%svalue: %s\n", strings.Repeat(" ", nextIndent), converted.String())
		} else {
			output += fmt.Sprintf("%svalue: %#v\n", strings.Repeat(" ", nextIndent), node.value)
		}
	}
	lastIndex := len(node.children)
	if lastIndex > 0 {
		output += fmt.Sprintf("%schildren: {\n", strings.Repeat(" ", nextIndent))
		childIndent := nextIndent + 2
		for _, elem := range node.children {
			output += elem.PrettyPrint(childIndent)
		}
	}
	output += fmt.Sprintf("%s}\n", spaces)
	return output
}

var bindingPowers = map[tokType]int{
	tEOF:                0,
	tUnquotedIdentifier: 0,
	tQuotedIdentifier:   0,
	tRbracket:           0,
	tRparen:             0,
	tComma:              0,
	tRbrace:             0,
	tNumber:             0,
	tCurrent:            0,
	tExpref:             0,
	tColon:              0,
	tPipe:               1,
	tOr:                 2,
	tAnd:                3,
	tEQ:                 5,
	tLT:                 5,
	tLTE:                5,
	tGT:                 5,
	tGTE:                5,
	tNE:                 5,
	tFlatten:            9,
	tStar:               20,
	tFilter:             21,
	tDot:                40,
	tNot:                45,
	tLbrace:             50,
	tLbracket:           55,
	tLparen:             60,
}

// Parser holds state about the current expression being parsed.
type Parser struct {
	expression string
	tokens     []token
	index      int
}

// NewParser creates a new JMESPath parser.
func NewParser() *Parser {
	p := Parser{}
	return &p
}

// Parse will compile a JMESPath expression.
func (p *Parser) Parse(expression string) (ASTNode, error) {
	lexer := NewLexer()
	p.expression = expression
	p.index = 0
	tokens, err := lexer.tokenize(expression)
	if err != nil {
		return ASTNode{}, err
	}
	p.tokens = tokens
	parsed, err := p.parseExpression(0)
	if err != nil {
		return ASTNode{}, err
	}
	if p.current() != tEOF {
		return ASTNode{}, p.syntaxError(fmt.Sprintf(
			"Unexpected token at the end of the expression: %s", p.current()))
	}
	return parsed, nil
}

func (p *Parser) parseExpression(bindingPower int) (ASTNode, error) {
	var err error
	leftToken := p.lookaheadToken(0)
	p.advance()
	leftNode, err := p.nud(leftToken)
	if err != nil {
		return ASTNode{}, err
	}
	currentToken := p.current()
	for bindingPower < bindingPowers[currentToken] {
		p.advance()
		leftNode, err = p.led(currentToken, leftNode)
		if err != nil {
			return ASTNode{}, err
		}
		currentToken = p.current()
	}
	return leftNode, nil
}

func (p *Parser) parseIndexExpression() (ASTNode, error) {
	if p.lookahead(0) == tColon || p.lookahead(1) == tColon {
		return p.parseSliceExpression()
	}
	indexStr := p.lookaheadToken(0).value
	parsedInt, err := strconv.Atoi(indexStr)
	if err != nil {
		return ASTNode{}, err
	}
	indexNode := ASTNode{nodeType: ASTIndex, value: parsedInt}
	p.advance()
	if err := p.match(tRbracket); err != nil {
		return ASTNode{}, err
	}
	return indexNode, nil
}

func (p *Parser) parseSliceExpression() (ASTNode, error) {
	parts := []*int{nil, nil, nil}
	index := 0
	current := p.current()
	for current != tRbracket && index < 3 {
		if current == tColon {
			index++
			p.advance()
		} else if current == tNumber {
			parsedInt, err := strconv.Atoi(p.lookaheadToken(0).value)
			if err != nil {
				return ASTNode{}, err
			}
			parts[index] = &parsedInt
			p.advance()
		} else {
			return ASTNode{}, p.syntaxError(
				"Expected tColon or tNumber" + ", received: " + p.current().String())
		}
		current = p.current()
	}
	if err := p.match(tRbracket); err != nil {
		return ASTNode{}, err
	}
	return ASTNode{
		nodeType: ASTSlice,
		value:    parts,
	}, nil
}

func (p *Parser) match(tokenType tokType) error {
	if p.current() == tokenType {
		p.advance()
		return nil
	}
	return p.syntaxError("Expected " + tokenType.String() + ", received: " + p.current().String())
}

func (p *Parser) led(tokenType tokType, node ASTNode) (ASTNode, error) {
	switch tokenType {
	case tDot:
		if p.current() != tStar {
			right, err := p.parseDotRHS(bindingPowers[tDot])
			return ASTNode{
				nodeType: ASTSubexpression,
				children: []ASTNode{node, right},
			}, err
		}
		p.advance()
		right, err := p.parseProjectionRHS(bindingPowers[tDot])
		return ASTNode{
			nodeType: ASTValueProjection,
			children: []ASTNode{node, right},
		}, err
	case tPipe:
		right, err := p.parseExpression(bindingPowers[tPipe])
		return ASTNode{nodeType: ASTPipe, children: []ASTNode{node, right}}, err
	case tOr:
		right, err := p.parseExpression(bindingPowers[tOr])
		return ASTNode{nodeType: ASTOrExpression, children: []ASTNode{node, right}}, err
	case tAnd:
		right, err := p.parseExpression(bindingPowers[tAnd])
		return ASTNode{nodeType: ASTAndExpression, children: []ASTNode{node, right}}, err
	case tLparen:
		name := node.value
		var args []ASTNode
		for p.current() != tRparen {
			expression, err := p.parseExpression(0)
			if err != nil {
				return ASTNode{}, err
			}
			if p.current() == tComma {
				if err := p.match(tComma); err != nil {
					return ASTNode{}, err
				}
			}
			args = append(args, expression)
		}
		if err := p.match(tRparen); err != nil {
			return ASTNode{}, err
		}
		return ASTNode{
			nodeType: ASTFunctionExpression,
			value:    name,
			children: args,
		}, nil
	case tFilter:
		return p.parseFilter(node)
	case tFlatten:
		left := ASTNode{nodeType: ASTFlatten, children: []ASTNode{node}}
		right, err := p.parseProjectionRHS(bindingPowers[tFlatten])
		return ASTNode{
			nodeType: ASTProjection,
			children: []ASTNode{left, right},
		}, err
	case tEQ, tNE, tGT, tGTE, tLT, tLTE:
		right, err := p.parseExpression(bindingPowers[tokenType])
		if err != nil {
			return ASTNode{}, err
		}
		return ASTNode{
			nodeType: ASTComparator,
			value:    tokenType,
			children: []ASTNode{node, right},
		}, nil
	case tLbracket:
		tokenType := p.current()
		var right ASTNode
		var err error
		if tokenType == tNumber || tokenType == tColon {
			right, err = p.parseIndexExpression()
			if err != nil {
				return ASTNode{}, err
			}
			return p.projectIfSlice(node, right)
		}
		// Otherwise this is a projection.
		if err := p.match(tStar); err != nil {
			return ASTNode{}, err
		}
		if err := p.match(tRbracket); err != nil {
			return ASTNode{}, err
		}
		right, err = p.parseProjectionRHS(bindingPowers[tStar])
		if err != nil {
			return ASTNode{}, err
		}
		return ASTNode{
			nodeType: ASTProjection,
			children: []ASTNode{node, right},
		}, nil
	}
	return ASTNode{}, p.syntaxError("Unexpected token: " + tokenType.String())
}

func (p *Parser) nud(token token) (ASTNode, error) {
	switch token.tokenType {
	case tJSONLiteral:
		var parsed interface{}
		err := json.Unmarshal([]byte(token.value), &parsed)
		if err != nil {
			return ASTNode{}, err
		}
		return ASTNode{nodeType: ASTLiteral, value: parsed}, nil
	case tStringLiteral:
		return ASTNode{nodeType: ASTLiteral, value: token.value}, nil
	case tUnquotedIdentifier:
		return ASTNode{
			nodeType: ASTField,
			value:    token.value,
		}, nil
	case tQuotedIdentifier:
		node := ASTNode{nodeType: ASTField, value: token.value}
		if p.current() == tLparen {
			return ASTNode{}, p.syntaxErrorToken("Can't have quoted identifier as function name.", token)
		}
		return node, nil
	case tStar:
		left := ASTNode{nodeType: ASTIdentity}
		var right ASTNode
		var err error
		if p.current() == tRbracket {
			right = ASTNode{nodeType: ASTIdentity}
		} else {
			right, err = p.parseProjectionRHS(bindingPowers[tStar])
		}
		return ASTNode{nodeType: ASTValueProjection, children: []ASTNode{left, right}}, err
	case tFilter:
		return p.parseFilter(ASTNode{nodeType: ASTIdentity})
	case tLbrace:
		return p.parseMultiSelectHash()
	case tFlatten:
		left := ASTNode{
			nodeType: ASTFlatten,
			children: []ASTNode{{nodeType: ASTIdentity}},
		}
		right, err := p.parseProjectionRHS(bindingPowers[tFlatten])
		if err != nil {
			return ASTNode{}, err
		}
		return ASTNode{nodeType: ASTProjection, children: []ASTNode{left, right}}, nil
	case tLbracket:
		tokenType := p.current()
		if tokenType == tNumber || tokenType == tColon {
			right, err := p.parseIndexExpression()
			if err != nil {
				return ASTNode{}, err
			}
			return p.projectIfSlice(ASTNode{nodeType: ASTIdentity}, right)
		} else if tokenType == tStar && p.lookahead(1) == tRbracket {
			p.advance()
			p.advance()
			right, err := p.parseProjectionRHS(bindingPowers[tStar])
			if err != nil {
				return ASTNode{}, err
			}
			return ASTNode{
				nodeType: ASTProjection,
				children: []ASTNode{{nodeType: ASTIdentity}, right},
			}, nil
		} else {
			return p.parseMultiSelectList()
		}
	case tCurrent:
		return ASTNode{nodeType: ASTCurrentNode}, nil
	case tExpref:
		expression, err := p.parseExpression(bindingPowers[tExpref])
		if err != nil {
			return ASTNode{}, err
		}
		return ASTNode{nodeType: ASTExpRef, children: []ASTNode{expression}}, nil
	case tNot:
		expression, err := p.parseExpression(bindingPowers[tNot])
		if err != nil {
			return ASTNode{}, err
		}
		return ASTNode{nodeType: ASTNotExpression, children: []ASTNode{expression}}, nil
	case tLparen:
		expression, err := p.parseExpression(0)
		if err != nil {
			return ASTNode{}, err
		}
		if err := p.match(tRparen); err != nil {
			return ASTNode{}, err
		}
		return expression, nil
	case tEOF:
		return ASTNode{}, p.syntaxErrorToken("Incomplete expression", token)
	}

	return ASTNode{}, p.syntaxErrorToken("Invalid token: "+token.tokenType.String(), token)
}

func (p *Parser) parseMultiSelectList() (ASTNode, error) {
	var expressions []ASTNode
	for {
		expression, err := p.parseExpression(0)
		if err != nil {
			return ASTNode{}, err
		}
		expressions = append(expressions, expression)
		if p.current() == tRbracket {
			break
		}
		err = p.match(tComma)
		if err != nil {
			return ASTNode{}, err
		}
	}
	err := p.match(tRbracket)
	if err != nil {
		return ASTNode{}, err
	}
	return ASTNode{
		nodeType: ASTMultiSelectList,
		children: expressions,
	}, nil
}

func (p *Parser) parseMultiSelectHash() (ASTNode, error) {
	var children []ASTNode
	for {
		keyToken := p.lookaheadToken(0)
		if err := p.match(tUnquotedIdentifier); err != nil {
			if err := p.match(tQuotedIdentifier); err != nil {
				return ASTNode{}, p.syntaxError("Expected tQuotedIdentifier or tUnquotedIdentifier")
			}
		}
		keyName := keyToken.value
		err := p.match(tColon)
		if err != nil {
			return ASTNode{}, err
		}
		value, err := p.parseExpression(0)
		if err != nil {
			return ASTNode{}, err
		}
		node := ASTNode{
			nodeType: ASTKeyValPair,
			value:    keyName,
			children: []ASTNode{value},
		}
		children = append(children, node)
		if p.current() == tComma {
			err := p.match(tComma)
			if err != nil {
				return ASTNode{}, err
			}
		} else if p.current() == tRbrace {
			err := p.match(tRbrace)
			if err != nil {
				return ASTNode{}, err
			}
			break
		}
	}
	return ASTNode{
		nodeType: ASTMultiSelectHash,
		children: children,
	}, nil
}

func (p *Parser) projectIfSlice(left ASTNode, right ASTNode) (ASTNode, error) {
	indexExpr := ASTNode{
		nodeType: ASTIndexExpression,
		children: []ASTNode{left, right},
	}
	if right.nodeType == ASTSlice {
		right, err := p.parseProjectionRHS(bindingPowers[tStar])
		return ASTNode{
			nodeType: ASTProjection,
			children: []ASTNode{indexExpr, right},
		}, err
	}
	return indexExpr, nil
}

func (p *Parser) parseFilter(node ASTNode) (ASTNode, error) {
	var right, condition ASTNode
	var err error
	condition, err = p.parseExpression(0)
	if err != nil {
		return ASTNode{}, err
	}
	if err := p.match(tRbracket); err != nil {
		return ASTNode{}, err
	}
	if p.current() == tFlatten {
		right = ASTNode{nodeType: ASTIdentity}
	} else {
		right, err = p.parseProjectionRHS(bindingPowers[tFilter])
		if err != nil {
			return ASTNode{}, err
		}
	}

	return ASTNode{
		nodeType: ASTFilterProjection,
		children: []ASTNode{node, right, condition},
	}, nil
}

func (p *Parser) parseDotRHS(bindingPower int) (ASTNode, error) {
	lookahead := p.current()
	if tokensOneOf([]tokType{tQuotedIdentifier, tUnquotedIdentifier, tStar}, lookahead) {
		return p.parseExpression(bindingPower)
	} else if lookahead == tLbracket {
		if err := p.match(tLbracket); err != nil {
			return ASTNode{}, err
		}
		return p.parseMultiSelectList()
	} else if lookahead == tLbrace {
		if err := p.match(tLbrace); err != nil {
			return ASTNode{}, err
		}
		return p.parseMultiSelectHash()
	}
	return ASTNode{}, p.syntaxError("Expected identifier, lbracket, or lbrace")
}

func (p *Parser) parseProjectionRHS(bindingPower int) (ASTNode, error) {
	current := p.current()
	if bindingPowers[current] < 10 {
		return ASTNode{nodeType: ASTIdentity}, nil
	} else if current == tLbracket {
		return p.parseExpression(bindingPower)
	} else if current == tFilter {
		return p.parseExpression(bindingPower)
	} else if current == tDot {
		err := p.match(tDot)
		if err != nil {
			return ASTNode{}, err
		}
		return p.parseDotRHS(bindingPower)
	} else {
		return ASTNode{}, p.syntaxError("Error")
	}
}

func (p *Parser) lookahead(number int) tokType {
	return p.lookaheadToken(number).tokenType
}

func (p *Parser) current() tokType {
	return p.lookahead(0)
}

func (p *Parser) lookaheadToken(number int) token {
	return p.tokens[p.index+number]
}

func (p *Parser) advance() {
	p.index++
}

func tokensOneOf(elements []tokType, token tokType) bool {
	for _, elem := range elements {
		if elem == token {
			return true
		}
	}
	return false
}

func (p *Parser) syntaxError(msg string) SyntaxError {
	return SyntaxError{
		msg:        msg,
		Expression: p.expression,
		Offset:     p.lookaheadToken(0).position,
	}
}

// Create a SyntaxError based on the provided token.
// This differs from syntaxError() which creates a SyntaxError
// based on the current lookahead token.
func (p *Parser) syntaxErrorToken(msg string, t token) SyntaxError {
	return SyntaxError{
		msg:        msg,
		Expression: p.expression,
		Offset:     t.position,
	}
}
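For context (not part of the diff): the Parser, Parse, ASTNode.PrettyPrint, and SyntaxError.HighlightLocation shown above are all exported, so a caller can parse and inspect an expression directly. A minimal sketch follows; the import alias and sample expression are illustrative assumptions.

// Minimal usage sketch for the Pratt parser shown above.
package main

import (
	"fmt"

	jmespath "github.com/jmespath/go-jmespath"
)

func main() {
	parser := jmespath.NewParser()
	ast, err := parser.Parse("foo[*].bar | [0]")
	if err != nil {
		if syntaxErr, ok := err.(jmespath.SyntaxError); ok {
			// HighlightLocation points a caret at the offending offset.
			fmt.Println(syntaxErr.HighlightLocation())
		}
		return
	}
	fmt.Println(ast) // ASTNode.String() delegates to PrettyPrint(0)
}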
16 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/toktype_string.go generated vendored
@@ -1,16 +0,0 @@
// generated by stringer -type=tokType; DO NOT EDIT

package jmespath

import "fmt"

const _tokType_name = "tUnknowntStartDottFiltertFlattentLparentRparentLbrackettRbrackettLbracetRbracetOrtPipetNumbertUnquotedIdentifiertQuotedIdentifiertCommatColontLTtLTEtGTtGTEtEQtNEtJSONLiteraltStringLiteraltCurrenttExpreftAndtNottEOF"

var _tokType_index = [...]uint8{0, 8, 13, 17, 24, 32, 39, 46, 55, 64, 71, 78, 81, 86, 93, 112, 129, 135, 141, 144, 148, 151, 155, 158, 161, 173, 187, 195, 202, 206, 210, 214}

func (i tokType) String() string {
	if i < 0 || i >= tokType(len(_tokType_index)-1) {
		return fmt.Sprintf("tokType(%d)", i)
	}
	return _tokType_name[_tokType_index[i]:_tokType_index[i+1]]
}
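For context (not part of the diff): the stringer output above just slices _tokType_name using _tokType_index, so each tokType value maps back to its constant name. The helper name below is hypothetical and only illustrates that mapping from inside the package.

// Hypothetical in-package sketch of the generated String() behaviour.
package jmespath

import "fmt"

func exampleTokTypeString() {
	// tDot has value 2, and _tokType_name[13:17] == "tDot".
	fmt.Println(tDot.String()) // "tDot"
}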
185 vendor/github.com/aws/aws-sdk-go/vendor/github.com/jmespath/go-jmespath/util.go generated vendored
@@ -1,185 +0,0 @@
package jmespath

import (
	"errors"
	"reflect"
)

// IsFalse determines if an object is false based on the JMESPath spec.
// JMESPath defines false values to be any of:
// - An empty string, array, or hash.
// - The boolean value false.
// - nil
func isFalse(value interface{}) bool {
	switch v := value.(type) {
	case bool:
		return !v
	case []interface{}:
		return len(v) == 0
	case map[string]interface{}:
		return len(v) == 0
	case string:
		return len(v) == 0
	case nil:
		return true
	}
	// Try the reflection cases before returning false.
	rv := reflect.ValueOf(value)
	switch rv.Kind() {
	case reflect.Struct:
		// A struct type will never be false, even if
		// all of its values are the zero type.
		return false
	case reflect.Slice, reflect.Map:
		return rv.Len() == 0
	case reflect.Ptr:
		if rv.IsNil() {
			return true
		}
		// If it's a pointer type, we'll try to deref the pointer
		// and evaluate the pointer value for isFalse.
		element := rv.Elem()
		return isFalse(element.Interface())
	}
	return false
}

// ObjsEqual is a generic object equality check.
// It will take two arbitrary objects and recursively determine
// if they are equal.
func objsEqual(left interface{}, right interface{}) bool {
	return reflect.DeepEqual(left, right)
}

// SliceParam refers to a single part of a slice.
// A slice consists of a start, a stop, and a step, similar to
// python slices.
type sliceParam struct {
	N         int
	Specified bool
}

// Slice supports [start:stop:step] style slicing that's supported in JMESPath.
func slice(slice []interface{}, parts []sliceParam) ([]interface{}, error) {
	computed, err := computeSliceParams(len(slice), parts)
	if err != nil {
		return nil, err
	}
	start, stop, step := computed[0], computed[1], computed[2]
	result := []interface{}{}
	if step > 0 {
		for i := start; i < stop; i += step {
			result = append(result, slice[i])
		}
	} else {
		for i := start; i > stop; i += step {
			result = append(result, slice[i])
		}
	}
	return result, nil
}

func computeSliceParams(length int, parts []sliceParam) ([]int, error) {
	var start, stop, step int
	if !parts[2].Specified {
		step = 1
	} else if parts[2].N == 0 {
		return nil, errors.New("Invalid slice, step cannot be 0")
	} else {
		step = parts[2].N
	}
	var stepValueNegative bool
	if step < 0 {
		stepValueNegative = true
	} else {
		stepValueNegative = false
	}

	if !parts[0].Specified {
		if stepValueNegative {
			start = length - 1
		} else {
			start = 0
		}
	} else {
		start = capSlice(length, parts[0].N, step)
	}

	if !parts[1].Specified {
		if stepValueNegative {
			stop = -1
		} else {
			stop = length
		}
	} else {
		stop = capSlice(length, parts[1].N, step)
	}
	return []int{start, stop, step}, nil
}

func capSlice(length int, actual int, step int) int {
	if actual < 0 {
		actual += length
		if actual < 0 {
			if step < 0 {
				actual = -1
			} else {
				actual = 0
			}
		}
	} else if actual >= length {
		if step < 0 {
			actual = length - 1
		} else {
			actual = length
		}
	}
	return actual
}

// ToArrayNum converts an empty interface type to a slice of float64.
// If any element in the array cannot be converted, then nil is returned
// along with a second value of false.
func toArrayNum(data interface{}) ([]float64, bool) {
	// Is there a better way to do this with reflect?
	if d, ok := data.([]interface{}); ok {
		result := make([]float64, len(d))
		for i, el := range d {
			item, ok := el.(float64)
			if !ok {
				return nil, false
			}
			result[i] = item
		}
		return result, true
	}
	return nil, false
}

// ToArrayStr converts an empty interface type to a slice of strings.
// If any element in the array cannot be converted, then nil is returned
// along with a second value of false. If the input data could be entirely
// converted, then the converted data, along with a second value of true,
// will be returned.
func toArrayStr(data interface{}) ([]string, bool) {
	// Is there a better way to do this with reflect?
	if d, ok := data.([]interface{}); ok {
		result := make([]string, len(d))
		for i, el := range d {
			item, ok := el.(string)
			if !ok {
				return nil, false
			}
			result[i] = item
		}
		return result, true
	}
	return nil, false
}

func isSliceType(v interface{}) bool {
	if v == nil {
		return false
	}
	return reflect.TypeOf(v).Kind() == reflect.Slice
}
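For context (not part of the diff): slice() and computeSliceParams() above implement Python-style [start:stop:step] semantics, with unspecified bounds defaulting according to the sign of the step. The sketch below exercises that behaviour through the public Search helper; the import alias and sample data are illustrative assumptions.

// Minimal sketch of the slice semantics implemented above.
package main

import (
	"fmt"

	jmespath "github.com/jmespath/go-jmespath"
)

func main() {
	var data interface{} = []interface{}{0.0, 1.0, 2.0, 3.0, 4.0}
	// A step of -1 walks the list backwards: start defaults to len-1, stop to -1.
	out, err := jmespath.Search("[::-1]", data)
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // [4 3 2 1 0]
}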