structure saas with tools

This commit is contained in:
Davidson Gomes
2025-04-25 15:30:54 -03:00
commit 1aef473937
16434 changed files with 6584257 additions and 0 deletions

@@ -0,0 +1,202 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

@@ -0,0 +1,840 @@
Metadata-Version: 2.1
Name: google-cloud-aiplatform
Version: 1.90.0
Summary: Vertex AI API client library
Home-page: https://github.com/googleapis/python-aiplatform
Author: Google LLC
Author-email: googleapis-packages@google.com
License: Apache 2.0
Platform: Posix; MacOS X; Windows
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Internet
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.8
License-File: LICENSE
Requires-Dist: google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,!=2.5.*,!=2.6.*,!=2.7.*,<3.0.0,>=1.34.1
Requires-Dist: google-auth<3.0.0,>=2.14.1
Requires-Dist: proto-plus<2.0.0,>=1.22.3
Requires-Dist: protobuf!=4.21.0,!=4.21.1,!=4.21.2,!=4.21.3,!=4.21.4,!=4.21.5,<7.0.0,>=3.20.2
Requires-Dist: packaging>=14.3
Requires-Dist: google-cloud-storage<3.0.0,>=1.32.0
Requires-Dist: google-cloud-bigquery!=3.20.0,<4.0.0,>=1.15.0
Requires-Dist: google-cloud-resource-manager<3.0.0,>=1.3.3
Requires-Dist: shapely<3.0.0
Requires-Dist: pydantic<3
Requires-Dist: typing-extensions
Requires-Dist: docstring-parser<1
Provides-Extra: adk
Requires-Dist: google-adk>=0.0.2; extra == "adk"
Provides-Extra: ag2
Requires-Dist: ag2[gemini]; extra == "ag2"
Requires-Dist: openinference-instrumentation-autogen<0.2,>=0.1.6; extra == "ag2"
Provides-Extra: ag2_testing
Requires-Dist: opentelemetry-sdk<2; extra == "ag2-testing"
Requires-Dist: absl-py; extra == "ag2-testing"
Requires-Dist: google-cloud-trace<2; extra == "ag2-testing"
Requires-Dist: pydantic<3,>=2.11.1; extra == "ag2-testing"
Requires-Dist: cloudpickle<4.0,>=3.0; extra == "ag2-testing"
Requires-Dist: opentelemetry-exporter-gcp-trace<2; extra == "ag2-testing"
Requires-Dist: typing-extensions; extra == "ag2-testing"
Requires-Dist: pytest-xdist; extra == "ag2-testing"
Requires-Dist: ag2[gemini]; extra == "ag2-testing"
Requires-Dist: openinference-instrumentation-autogen<0.2,>=0.1.6; extra == "ag2-testing"
Provides-Extra: agent_engines
Requires-Dist: packaging>=24.0; extra == "agent-engines"
Requires-Dist: cloudpickle<4.0,>=3.0; extra == "agent-engines"
Requires-Dist: google-cloud-trace<2; extra == "agent-engines"
Requires-Dist: google-cloud-logging<4; extra == "agent-engines"
Requires-Dist: opentelemetry-sdk<2; extra == "agent-engines"
Requires-Dist: opentelemetry-exporter-gcp-trace<2; extra == "agent-engines"
Requires-Dist: pydantic<3,>=2.11.1; extra == "agent-engines"
Requires-Dist: typing-extensions; extra == "agent-engines"
Provides-Extra: autologging
Requires-Dist: mlflow<=2.16.0,>=1.27.0; extra == "autologging"
Provides-Extra: cloud_profiler
Requires-Dist: tensorboard-plugin-profile<2.18.0,>=2.4.0; extra == "cloud-profiler"
Requires-Dist: werkzeug<4.0.0,>=2.0.0; extra == "cloud-profiler"
Requires-Dist: tensorflow<3.0.0,>=2.4.0; extra == "cloud-profiler"
Provides-Extra: datasets
Requires-Dist: pyarrow<8.0.0,>=3.0.0; python_version < "3.11" and extra == "datasets"
Requires-Dist: pyarrow>=10.0.1; python_version == "3.11" and extra == "datasets"
Requires-Dist: pyarrow>=14.0.0; python_version >= "3.12" and extra == "datasets"
Provides-Extra: endpoint
Requires-Dist: requests>=2.28.1; extra == "endpoint"
Requires-Dist: requests-toolbelt<=1.0.0; extra == "endpoint"
Provides-Extra: evaluation
Requires-Dist: pandas>=1.0.0; extra == "evaluation"
Requires-Dist: tqdm>=4.23.0; extra == "evaluation"
Requires-Dist: jsonschema; extra == "evaluation"
Requires-Dist: ruamel.yaml; extra == "evaluation"
Requires-Dist: scikit-learn<1.6.0; python_version <= "3.10" and extra == "evaluation"
Requires-Dist: scikit-learn; python_version > "3.10" and extra == "evaluation"
Provides-Extra: full
Requires-Dist: pandas>=1.0.0; extra == "full"
Requires-Dist: requests-toolbelt<=1.0.0; extra == "full"
Requires-Dist: starlette>=0.17.1; extra == "full"
Requires-Dist: google-vizier>=0.1.6; extra == "full"
Requires-Dist: google-cloud-bigquery-storage; extra == "full"
Requires-Dist: pyyaml<7,>=5.3.1; extra == "full"
Requires-Dist: lit-nlp==0.4.0; extra == "full"
Requires-Dist: fastapi<=0.114.0,>=0.71.0; extra == "full"
Requires-Dist: tensorflow<3.0.0,>=2.4.0; extra == "full"
Requires-Dist: numpy>=1.15.0; extra == "full"
Requires-Dist: docker>=5.0.3; extra == "full"
Requires-Dist: immutabledict; extra == "full"
Requires-Dist: werkzeug<4.0.0,>=2.0.0; extra == "full"
Requires-Dist: explainable-ai-sdk>=1.0.0; extra == "full"
Requires-Dist: pyarrow>=6.0.1; extra == "full"
Requires-Dist: tqdm>=4.23.0; extra == "full"
Requires-Dist: tensorflow<3.0.0,>=2.3.0; extra == "full"
Requires-Dist: tensorboard-plugin-profile<2.18.0,>=2.4.0; extra == "full"
Requires-Dist: urllib3<1.27,>=1.21.1; extra == "full"
Requires-Dist: ruamel.yaml; extra == "full"
Requires-Dist: setuptools<70.0.0; extra == "full"
Requires-Dist: uvicorn[standard]>=0.16.0; extra == "full"
Requires-Dist: google-cloud-bigquery; extra == "full"
Requires-Dist: requests>=2.28.1; extra == "full"
Requires-Dist: jsonschema; extra == "full"
Requires-Dist: mlflow<=2.16.0,>=1.27.0; extra == "full"
Requires-Dist: httpx<0.25.0,>=0.23.0; extra == "full"
Requires-Dist: pyarrow<8.0.0,>=3.0.0; python_version < "3.11" and extra == "full"
Requires-Dist: ray[default]!=2.10.*,!=2.11.*,!=2.12.*,!=2.13.*,!=2.14.*,!=2.15.*,!=2.16.*,!=2.17.*,!=2.18.*,!=2.19.*,!=2.20.*,!=2.21.*,!=2.22.*,!=2.23.*,!=2.24.*,!=2.25.*,!=2.26.*,!=2.27.*,!=2.28.*,!=2.29.*,!=2.30.*,!=2.31.*,!=2.32.*,!=2.34.*,!=2.35.*,!=2.36.*,!=2.37.*,!=2.38.*,!=2.39.*,!=2.40.*,!=2.41.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.0,!=2.9.1,!=2.9.2,<=2.42.0,>=2.4; python_version < "3.11" and extra == "full"
Requires-Dist: scikit-learn<1.6.0; python_version <= "3.10" and extra == "full"
Requires-Dist: tensorflow<3.0.0,>=2.3.0; python_version <= "3.11" and extra == "full"
Requires-Dist: pyarrow>=10.0.1; python_version == "3.11" and extra == "full"
Requires-Dist: ray[default]<=2.42.0,>=2.5; python_version == "3.11" and extra == "full"
Requires-Dist: scikit-learn; python_version > "3.10" and extra == "full"
Requires-Dist: pyarrow>=14.0.0; python_version >= "3.12" and extra == "full"
Provides-Extra: langchain
Requires-Dist: langchain<0.4,>=0.3; extra == "langchain"
Requires-Dist: langchain-core<0.4,>=0.3; extra == "langchain"
Requires-Dist: langchain-google-vertexai<3,>=2; extra == "langchain"
Requires-Dist: langgraph<0.3,>=0.2.45; extra == "langchain"
Requires-Dist: openinference-instrumentation-langchain<0.2,>=0.1.19; extra == "langchain"
Provides-Extra: langchain_testing
Requires-Dist: opentelemetry-sdk<2; extra == "langchain-testing"
Requires-Dist: openinference-instrumentation-langchain<0.2,>=0.1.19; extra == "langchain-testing"
Requires-Dist: absl-py; extra == "langchain-testing"
Requires-Dist: google-cloud-trace<2; extra == "langchain-testing"
Requires-Dist: langchain-core<0.4,>=0.3; extra == "langchain-testing"
Requires-Dist: pydantic<3,>=2.11.1; extra == "langchain-testing"
Requires-Dist: cloudpickle<4.0,>=3.0; extra == "langchain-testing"
Requires-Dist: opentelemetry-exporter-gcp-trace<2; extra == "langchain-testing"
Requires-Dist: langchain<0.4,>=0.3; extra == "langchain-testing"
Requires-Dist: typing-extensions; extra == "langchain-testing"
Requires-Dist: langgraph<0.3,>=0.2.45; extra == "langchain-testing"
Requires-Dist: pytest-xdist; extra == "langchain-testing"
Requires-Dist: langchain-google-vertexai<3,>=2; extra == "langchain-testing"
Provides-Extra: lit
Requires-Dist: tensorflow<3.0.0,>=2.3.0; extra == "lit"
Requires-Dist: pandas>=1.0.0; extra == "lit"
Requires-Dist: lit-nlp==0.4.0; extra == "lit"
Requires-Dist: explainable-ai-sdk>=1.0.0; extra == "lit"
Provides-Extra: llama_index
Requires-Dist: llama-index; extra == "llama-index"
Requires-Dist: llama-index-llms-google-genai; extra == "llama-index"
Requires-Dist: openinference-instrumentation-llama-index<4.0,>=3.0; extra == "llama-index"
Provides-Extra: llama_index_testing
Requires-Dist: opentelemetry-sdk<2; extra == "llama-index-testing"
Requires-Dist: openinference-instrumentation-llama-index<4.0,>=3.0; extra == "llama-index-testing"
Requires-Dist: absl-py; extra == "llama-index-testing"
Requires-Dist: google-cloud-trace<2; extra == "llama-index-testing"
Requires-Dist: pydantic<3,>=2.11.1; extra == "llama-index-testing"
Requires-Dist: cloudpickle<4.0,>=3.0; extra == "llama-index-testing"
Requires-Dist: opentelemetry-exporter-gcp-trace<2; extra == "llama-index-testing"
Requires-Dist: llama-index-llms-google-genai; extra == "llama-index-testing"
Requires-Dist: typing-extensions; extra == "llama-index-testing"
Requires-Dist: pytest-xdist; extra == "llama-index-testing"
Requires-Dist: llama-index; extra == "llama-index-testing"
Provides-Extra: metadata
Requires-Dist: pandas>=1.0.0; extra == "metadata"
Requires-Dist: numpy>=1.15.0; extra == "metadata"
Provides-Extra: pipelines
Requires-Dist: pyyaml<7,>=5.3.1; extra == "pipelines"
Provides-Extra: prediction
Requires-Dist: docker>=5.0.3; extra == "prediction"
Requires-Dist: fastapi<=0.114.0,>=0.71.0; extra == "prediction"
Requires-Dist: httpx<0.25.0,>=0.23.0; extra == "prediction"
Requires-Dist: starlette>=0.17.1; extra == "prediction"
Requires-Dist: uvicorn[standard]>=0.16.0; extra == "prediction"
Provides-Extra: preview
Provides-Extra: private_endpoints
Requires-Dist: urllib3<1.27,>=1.21.1; extra == "private-endpoints"
Requires-Dist: requests>=2.28.1; extra == "private-endpoints"
Provides-Extra: ray
Requires-Dist: setuptools<70.0.0; extra == "ray"
Requires-Dist: google-cloud-bigquery-storage; extra == "ray"
Requires-Dist: google-cloud-bigquery; extra == "ray"
Requires-Dist: pandas>=1.0.0; extra == "ray"
Requires-Dist: pyarrow>=6.0.1; extra == "ray"
Requires-Dist: immutabledict; extra == "ray"
Requires-Dist: ray[default]!=2.10.*,!=2.11.*,!=2.12.*,!=2.13.*,!=2.14.*,!=2.15.*,!=2.16.*,!=2.17.*,!=2.18.*,!=2.19.*,!=2.20.*,!=2.21.*,!=2.22.*,!=2.23.*,!=2.24.*,!=2.25.*,!=2.26.*,!=2.27.*,!=2.28.*,!=2.29.*,!=2.30.*,!=2.31.*,!=2.32.*,!=2.34.*,!=2.35.*,!=2.36.*,!=2.37.*,!=2.38.*,!=2.39.*,!=2.40.*,!=2.41.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.0,!=2.9.1,!=2.9.2,<=2.42.0,>=2.4; python_version < "3.11" and extra == "ray"
Requires-Dist: ray[default]<=2.42.0,>=2.5; python_version == "3.11" and extra == "ray"
Provides-Extra: ray_testing
Requires-Dist: setuptools<70.0.0; extra == "ray-testing"
Requires-Dist: google-cloud-bigquery-storage; extra == "ray-testing"
Requires-Dist: google-cloud-bigquery; extra == "ray-testing"
Requires-Dist: pandas>=1.0.0; extra == "ray-testing"
Requires-Dist: pyarrow>=6.0.1; extra == "ray-testing"
Requires-Dist: immutabledict; extra == "ray-testing"
Requires-Dist: pytest-xdist; extra == "ray-testing"
Requires-Dist: ray[train]; extra == "ray-testing"
Requires-Dist: scikit-learn<1.6.0; extra == "ray-testing"
Requires-Dist: tensorflow; extra == "ray-testing"
Requires-Dist: torch<2.1.0,>=2.0.0; extra == "ray-testing"
Requires-Dist: xgboost; extra == "ray-testing"
Requires-Dist: xgboost-ray; extra == "ray-testing"
Requires-Dist: ray[default]!=2.10.*,!=2.11.*,!=2.12.*,!=2.13.*,!=2.14.*,!=2.15.*,!=2.16.*,!=2.17.*,!=2.18.*,!=2.19.*,!=2.20.*,!=2.21.*,!=2.22.*,!=2.23.*,!=2.24.*,!=2.25.*,!=2.26.*,!=2.27.*,!=2.28.*,!=2.29.*,!=2.30.*,!=2.31.*,!=2.32.*,!=2.34.*,!=2.35.*,!=2.36.*,!=2.37.*,!=2.38.*,!=2.39.*,!=2.40.*,!=2.41.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.0,!=2.9.1,!=2.9.2,<=2.42.0,>=2.4; python_version < "3.11" and extra == "ray_testing"
Requires-Dist: ray[default]<=2.42.0,>=2.5; python_version == "3.11" and extra == "ray_testing"
Provides-Extra: reasoningengine
Requires-Dist: cloudpickle<4.0,>=3.0; extra == "reasoningengine"
Requires-Dist: google-cloud-trace<2; extra == "reasoningengine"
Requires-Dist: opentelemetry-sdk<2; extra == "reasoningengine"
Requires-Dist: opentelemetry-exporter-gcp-trace<2; extra == "reasoningengine"
Requires-Dist: pydantic<3,>=2.11.1; extra == "reasoningengine"
Requires-Dist: typing-extensions; extra == "reasoningengine"
Provides-Extra: tensorboard
Requires-Dist: tensorboard-plugin-profile<2.18.0,>=2.4.0; extra == "tensorboard"
Requires-Dist: werkzeug<4.0.0,>=2.0.0; extra == "tensorboard"
Requires-Dist: tensorflow<3.0.0,>=2.4.0; extra == "tensorboard"
Requires-Dist: tensorflow<3.0.0,>=2.3.0; python_version <= "3.11" and extra == "tensorboard"
Provides-Extra: testing
Requires-Dist: pandas>=1.0.0; extra == "testing"
Requires-Dist: requests-toolbelt<=1.0.0; extra == "testing"
Requires-Dist: starlette>=0.17.1; extra == "testing"
Requires-Dist: google-vizier>=0.1.6; extra == "testing"
Requires-Dist: google-cloud-bigquery-storage; extra == "testing"
Requires-Dist: pyyaml<7,>=5.3.1; extra == "testing"
Requires-Dist: lit-nlp==0.4.0; extra == "testing"
Requires-Dist: fastapi<=0.114.0,>=0.71.0; extra == "testing"
Requires-Dist: tensorflow<3.0.0,>=2.4.0; extra == "testing"
Requires-Dist: numpy>=1.15.0; extra == "testing"
Requires-Dist: docker>=5.0.3; extra == "testing"
Requires-Dist: immutabledict; extra == "testing"
Requires-Dist: werkzeug<4.0.0,>=2.0.0; extra == "testing"
Requires-Dist: explainable-ai-sdk>=1.0.0; extra == "testing"
Requires-Dist: pyarrow>=6.0.1; extra == "testing"
Requires-Dist: tqdm>=4.23.0; extra == "testing"
Requires-Dist: tensorflow<3.0.0,>=2.3.0; extra == "testing"
Requires-Dist: tensorboard-plugin-profile<2.18.0,>=2.4.0; extra == "testing"
Requires-Dist: urllib3<1.27,>=1.21.1; extra == "testing"
Requires-Dist: ruamel.yaml; extra == "testing"
Requires-Dist: setuptools<70.0.0; extra == "testing"
Requires-Dist: uvicorn[standard]>=0.16.0; extra == "testing"
Requires-Dist: google-cloud-bigquery; extra == "testing"
Requires-Dist: requests>=2.28.1; extra == "testing"
Requires-Dist: jsonschema; extra == "testing"
Requires-Dist: mlflow<=2.16.0,>=1.27.0; extra == "testing"
Requires-Dist: httpx<0.25.0,>=0.23.0; extra == "testing"
Requires-Dist: sentencepiece>=0.2.0; extra == "testing"
Requires-Dist: nltk; extra == "testing"
Requires-Dist: aiohttp; extra == "testing"
Requires-Dist: google-api-core<3.0.0,>=2.11; extra == "testing"
Requires-Dist: grpcio-testing; extra == "testing"
Requires-Dist: ipython; extra == "testing"
Requires-Dist: kfp<3.0.0,>=2.6.0; extra == "testing"
Requires-Dist: pytest-asyncio; extra == "testing"
Requires-Dist: pytest-xdist; extra == "testing"
Requires-Dist: xgboost; extra == "testing"
Requires-Dist: pyarrow<8.0.0,>=3.0.0; python_version < "3.11" and extra == "testing"
Requires-Dist: ray[default]!=2.10.*,!=2.11.*,!=2.12.*,!=2.13.*,!=2.14.*,!=2.15.*,!=2.16.*,!=2.17.*,!=2.18.*,!=2.19.*,!=2.20.*,!=2.21.*,!=2.22.*,!=2.23.*,!=2.24.*,!=2.25.*,!=2.26.*,!=2.27.*,!=2.28.*,!=2.29.*,!=2.30.*,!=2.31.*,!=2.32.*,!=2.34.*,!=2.35.*,!=2.36.*,!=2.37.*,!=2.38.*,!=2.39.*,!=2.40.*,!=2.41.*,!=2.5.*,!=2.6.*,!=2.7.*,!=2.8.*,!=2.9.0,!=2.9.1,!=2.9.2,<=2.42.0,>=2.4; python_version < "3.11" and extra == "testing"
Requires-Dist: scikit-learn<1.6.0; python_version <= "3.10" and extra == "testing"
Requires-Dist: tensorflow<3.0.0,>=2.3.0; python_version <= "3.11" and extra == "testing"
Requires-Dist: tensorflow==2.13.0; python_version <= "3.11" and extra == "testing"
Requires-Dist: torch<2.1.0,>=2.0.0; python_version <= "3.11" and extra == "testing"
Requires-Dist: pyarrow>=10.0.1; python_version == "3.11" and extra == "testing"
Requires-Dist: ray[default]<=2.42.0,>=2.5; python_version == "3.11" and extra == "testing"
Requires-Dist: scikit-learn; python_version > "3.10" and extra == "testing"
Requires-Dist: tensorflow==2.16.1; python_version > "3.11" and extra == "testing"
Requires-Dist: torch>=2.2.0; python_version > "3.11" and extra == "testing"
Requires-Dist: bigframes; python_version >= "3.10" and extra == "testing"
Requires-Dist: pyarrow>=14.0.0; python_version >= "3.12" and extra == "testing"
Provides-Extra: tokenization
Requires-Dist: sentencepiece>=0.2.0; extra == "tokenization"
Provides-Extra: vizier
Requires-Dist: google-vizier>=0.1.6; extra == "vizier"
Provides-Extra: xai
Requires-Dist: tensorflow<3.0.0,>=2.3.0; extra == "xai"
Vertex AI SDK for Python
=================================================
Gemini API and Generative AI on Vertex AI
-----------------------------------------
.. note::

   For Gemini API and Generative AI on Vertex AI, please reference `Vertex Generative AI SDK for Python`_

.. _Vertex Generative AI SDK for Python: https://cloud.google.com/vertex-ai/generative-ai/docs/reference/python/latest
-----------------------------------------
|GA| |pypi| |versions| |unit-tests| |system-tests| |sample-tests|
`Vertex AI`_: Google Vertex AI is an integrated suite of machine learning tools and services for building and using ML models with AutoML or custom code. It offers both novices and experts the best workbench for the entire machine learning development lifecycle.
- `Client Library Documentation`_
- `Product Documentation`_
.. |GA| image:: https://img.shields.io/badge/support-ga-gold.svg
   :target: https://github.com/googleapis/google-cloud-python/blob/main/README.rst#general-availability
.. |pypi| image:: https://img.shields.io/pypi/v/google-cloud-aiplatform.svg
   :target: https://pypi.org/project/google-cloud-aiplatform/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-aiplatform.svg
   :target: https://pypi.org/project/google-cloud-aiplatform/
.. |unit-tests| image:: https://storage.googleapis.com/cloud-devrel-public/python-aiplatform/badges/sdk-unit-tests.svg
   :target: https://storage.googleapis.com/cloud-devrel-public/python-aiplatform/badges/sdk-unit-tests.html
.. |system-tests| image:: https://storage.googleapis.com/cloud-devrel-public/python-aiplatform/badges/sdk-system-tests.svg
   :target: https://storage.googleapis.com/cloud-devrel-public/python-aiplatform/badges/sdk-system-tests.html
.. |sample-tests| image:: https://storage.googleapis.com/cloud-devrel-public/python-aiplatform/badges/sdk-sample-tests.svg
   :target: https://storage.googleapis.com/cloud-devrel-public/python-aiplatform/badges/sdk-sample-tests.html
.. _Vertex AI: https://cloud.google.com/vertex-ai/docs
.. _Client Library Documentation: https://cloud.google.com/python/docs/reference/aiplatform/latest
.. _Product Documentation: https://cloud.google.com/vertex-ai/docs
Quick Start
-----------
To use this library, you first need to go through the following steps:
1. `Select or create a Cloud Platform project.`_
2. `Enable billing for your project.`_
3. `Enable the Vertex AI API.`_
4. `Setup Authentication.`_
.. _Select or create a Cloud Platform project.: https://console.cloud.google.com/project
.. _Enable billing for your project.: https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project
.. _Enable the Vertex AI API.: https://cloud.google.com/vertex-ai/docs/start/use-vertex-ai-python-sdk
.. _Setup Authentication.: https://googleapis.dev/python/google-api-core/latest/auth.html
Installation
~~~~~~~~~~~~
Install this library in a `virtualenv`_ using pip. `virtualenv`_ is a tool to
create isolated Python environments. The basic problem it addresses is one of
dependencies and versions, and indirectly permissions.
With `virtualenv`_, it's possible to install this library without needing system
install permissions, and without clashing with the installed system
dependencies.
.. _virtualenv: https://virtualenv.pypa.io/en/latest/
Mac/Linux
^^^^^^^^^

.. code-block:: console

    pip install virtualenv
    virtualenv <your-env>
    source <your-env>/bin/activate
    <your-env>/bin/pip install google-cloud-aiplatform

Windows
^^^^^^^

.. code-block:: console

    pip install virtualenv
    virtualenv <your-env>
    <your-env>\Scripts\activate
    <your-env>\Scripts\pip.exe install google-cloud-aiplatform

Supported Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^
Python >= 3.8
Deprecated Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^^
Python <= 3.7.
The last version of this library compatible with Python 3.6 is google-cloud-aiplatform==1.12.1.
Overview
~~~~~~~~
This section provides a brief overview of the Vertex AI SDK for Python. You can also reference the notebooks in `vertex-ai-samples`_ for examples.
.. _vertex-ai-samples: https://github.com/GoogleCloudPlatform/vertex-ai-samples/tree/main/notebooks/community/sdk
All publicly available SDK features can be found in the :code:`google/cloud/aiplatform` directory.
Under the hood, the Vertex SDK builds on top of GAPIC, which stands for Google API CodeGen.
The GAPIC library code sits in :code:`google/cloud/aiplatform_v1` and :code:`google/cloud/aiplatform_v1beta1`,
and it is auto-generated from Google's service proto files.
Most developers can follow these steps to figure out which libraries to import:

1. Look through :code:`google/cloud/aiplatform` first -- the Vertex SDK's APIs will almost always be easier to use and more concise compared with GAPIC.
2. If the feature you are looking for cannot be found there, look through :code:`aiplatform_v1` to see if it's available in GAPIC.
3. If the feature is still in beta, it will be available in :code:`aiplatform_v1beta1`.

If none of the above helps you find the right tools for your task, please feel free to open a GitHub issue with a feature request.
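The lookup order above can be sketched as a small decision helper. Note that :code:`resolve_namespace` is a hypothetical illustration, not part of the SDK; it simply encodes which namespace a feature resolves to.

```python
def resolve_namespace(in_sdk: bool, in_v1: bool) -> str:
    """Return the namespace to import for a given feature.

    Hypothetical helper illustrating the three-step lookup order;
    it is not part of the Vertex AI SDK itself.
    """
    if in_sdk:
        return "google.cloud.aiplatform"      # high-level Vertex SDK first
    if in_v1:
        return "google.cloud.aiplatform_v1"   # stable GAPIC layer
    return "google.cloud.aiplatform_v1beta1"  # beta GAPIC layer


# A feature wrapped by the high-level SDK resolves to the top layer.
print(resolve_namespace(in_sdk=True, in_v1=True))    # google.cloud.aiplatform
# A feature still in beta falls through to v1beta1.
print(resolve_namespace(in_sdk=False, in_v1=False))  # google.cloud.aiplatform_v1beta1
```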
Importing
^^^^^^^^^
Vertex AI SDK resource-based functionality can be used by importing the following namespace:

.. code-block:: Python

    from google.cloud import aiplatform


Initialization
^^^^^^^^^^^^^^

Initialize the SDK to store common configurations that you use with the SDK.

.. code-block:: Python

    aiplatform.init(
        # your Google Cloud Project ID or number
        # environment default used if not set
        project='my-project',

        # the Vertex AI region you will use
        # defaults to us-central1
        location='us-central1',

        # Google Cloud Storage bucket in same region as location
        # used to stage artifacts
        staging_bucket='gs://my_staging_bucket',

        # custom google.auth.credentials.Credentials
        # environment default credentials used if not set
        credentials=my_credentials,

        # customer-managed encryption key resource name
        # will be applied to all Vertex AI resources if set
        encryption_spec_key_name=my_encryption_key_name,

        # the name of the experiment to use to track
        # logged metrics and parameters
        experiment='my-experiment',

        # description of the experiment above
        experiment_description='my experiment description'
    )

Datasets
^^^^^^^^

Vertex AI provides managed tabular, text, image, and video datasets. In the SDK, datasets can be used downstream to
train models.

To create a tabular dataset:

.. code-block:: Python

    my_dataset = aiplatform.TabularDataset.create(
        display_name="my-dataset", gcs_source=['gs://path/to/my/dataset.csv'])

You can also create and import a dataset in separate steps:

.. code-block:: Python

    from google.cloud import aiplatform

    my_dataset = aiplatform.TextDataset.create(
        display_name="my-dataset")

    my_dataset.import_data(
        gcs_source=['gs://path/to/my/dataset.csv'],
        import_schema_uri=aiplatform.schema.dataset.ioformat.text.multi_label_classification
    )

To get a previously created dataset:

.. code-block:: Python

    dataset = aiplatform.ImageDataset('projects/my-project/locations/us-central1/datasets/{DATASET_ID}')

Vertex AI supports a variety of dataset schemas. References to these schemas are available under the
:code:`aiplatform.schema.dataset` namespace. For more information on the supported dataset schemas, please refer to the
`Preparing data docs`_.

.. _Preparing data docs: https://cloud.google.com/ai-platform-unified/docs/datasets/prepare

Training
^^^^^^^^

The Vertex AI SDK for Python allows you to train Custom and AutoML Models.

You can train custom models using a custom Python script, custom Python package, or container.

**Preparing Your Custom Code**

Vertex AI custom training enables you to train on Vertex AI datasets and produce Vertex AI models. To do so, your
script must adhere to the following contract:

It must read datasets from the environment variables populated by the training service:

.. code-block:: Python

    os.environ['AIP_DATA_FORMAT']         # provides format of data
    os.environ['AIP_TRAINING_DATA_URI']   # uri to training split
    os.environ['AIP_VALIDATION_DATA_URI'] # uri to validation split
    os.environ['AIP_TEST_DATA_URI']       # uri to test split

Please visit `Using a managed dataset in a custom training application`_ for a detailed overview.

.. _Using a managed dataset in a custom training application: https://cloud.google.com/vertex-ai/docs/training/using-managed-datasets

It must write the model artifact to the environment variable populated by the training service:

.. code-block:: Python

    os.environ['AIP_MODEL_DIR']
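
Putting the contract together, a minimal training-script skeleton might look like the following. This is a local, standard-library-only sketch: the "model" and the training step are placeholders, and in a real job ``AIP_MODEL_DIR`` is a Cloud Storage URI that you would write to with a GCS-aware library (for example ``tf.io`` or ``gcsfs``) rather than the local filesystem.

.. code-block:: Python

    import os
    import pickle

    def train_and_save():
        # Read the splits populated by the training service
        # (the defaults here are only for running this sketch locally)
        data_format = os.environ.get("AIP_DATA_FORMAT", "csv")
        training_uri = os.environ.get("AIP_TRAINING_DATA_URI", "")

        # ... train a model on the training split (placeholder artifact below) ...
        model = {"format": data_format, "source": training_uri}

        # Write the artifact where Vertex AI expects to find it
        model_dir = os.environ.get("AIP_MODEL_DIR", "/tmp/model")
        os.makedirs(model_dir, exist_ok=True)
        with open(os.path.join(model_dir, "model.pkl"), "wb") as f:
            pickle.dump(model, f)
        return model_dir

    if __name__ == "__main__":
        train_and_save()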

**Running Training**

.. code-block:: Python

    job = aiplatform.CustomTrainingJob(
        display_name="my-training-job",
        script_path="training_script.py",
        container_uri="us-docker.pkg.dev/vertex-ai/training/tf-cpu.2-2:latest",
        requirements=["gcsfs==0.7.1"],
        model_serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-2:latest",
    )

    model = job.run(my_dataset,
                    replica_count=1,
                    machine_type="n1-standard-4",
                    accelerator_type='NVIDIA_TESLA_K80',
                    accelerator_count=1)

In the code block above, `my_dataset` is the managed dataset created in the `Datasets` section above. The `model` variable is a managed Vertex AI model that can be deployed or exported.

AutoMLs
-------

The Vertex AI SDK for Python supports AutoML tabular, image, text, video, and forecasting.

To train an AutoML tabular model:

.. code-block:: Python

    dataset = aiplatform.TabularDataset('projects/my-project/locations/us-central1/datasets/{DATASET_ID}')

    job = aiplatform.AutoMLTabularTrainingJob(
        display_name="train-automl",
        optimization_prediction_type="regression",
        optimization_objective="minimize-rmse",
    )

    model = job.run(
        dataset=dataset,
        target_column="target_column_name",
        training_fraction_split=0.6,
        validation_fraction_split=0.2,
        test_fraction_split=0.2,
        budget_milli_node_hours=1000,
        model_display_name="my-automl-model",
        disable_early_stopping=False,
    )

Models
------

To get a model:

.. code-block:: Python

    model = aiplatform.Model('projects/my-project/locations/us-central1/models/{MODEL_ID}')

To upload a model:

.. code-block:: Python

    model = aiplatform.Model.upload(
        display_name='my-model',
        artifact_uri="gs://path/to/my/model/dir",
        serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-2:latest",
    )

To deploy a model:

.. code-block:: Python

    endpoint = model.deploy(machine_type='n1-standard-4',
                            min_replica_count=1,
                            max_replica_count=5,
                            accelerator_type='NVIDIA_TESLA_K80',
                            accelerator_count=1)

Please visit `Importing models to Vertex AI`_ for a detailed overview.

.. _Importing models to Vertex AI: https://cloud.google.com/vertex-ai/docs/general/import-model

Model Evaluation
----------------

The Vertex AI SDK for Python currently supports getting model evaluation metrics for all AutoML models.

To list all model evaluations for a model:

.. code-block:: Python

    model = aiplatform.Model('projects/my-project/locations/us-central1/models/{MODEL_ID}')

    evaluations = model.list_model_evaluations()

To get the model evaluation resource for a given model:

.. code-block:: Python

    model = aiplatform.Model('projects/my-project/locations/us-central1/models/{MODEL_ID}')

    # returns the first evaluation when called with no arguments;
    # you can also pass the evaluation ID
    evaluation = model.get_model_evaluation()

    eval_metrics = evaluation.metrics

You can also create a reference to your model evaluation directly by passing in the resource name of the model evaluation:

.. code-block:: Python

    evaluation = aiplatform.ModelEvaluation(
        evaluation_name='projects/my-project/locations/us-central1/models/{MODEL_ID}/evaluations/{EVALUATION_ID}')

Alternatively, you can create a reference to your evaluation by passing in the model and evaluation IDs:

.. code-block:: Python

    evaluation = aiplatform.ModelEvaluation(
        evaluation_name={EVALUATION_ID},
        model_id={MODEL_ID})

Batch Prediction
----------------

To create a batch prediction job:

.. code-block:: Python

    model = aiplatform.Model('projects/my-project/locations/us-central1/models/{MODEL_ID}')

    batch_prediction_job = model.batch_predict(
        job_display_name='my-batch-prediction-job',
        instances_format='csv',
        machine_type='n1-standard-4',
        gcs_source=['gs://path/to/my/file.csv'],
        gcs_destination_prefix='gs://path/to/my/batch_prediction/results/',
        service_account='my-sa@my-project.iam.gserviceaccount.com'
    )

You can also create a batch prediction job asynchronously by including the `sync=False` argument:

.. code-block:: Python

    batch_prediction_job = model.batch_predict(..., sync=False)

    # wait for the resource to be created
    batch_prediction_job.wait_for_resource_creation()

    # get the state
    batch_prediction_job.state

    # block until the job is complete
    batch_prediction_job.wait()

Endpoints
---------

To create an endpoint:

.. code-block:: Python

    endpoint = aiplatform.Endpoint.create(display_name='my-endpoint')

To deploy a model to a created endpoint:

.. code-block:: Python

    model = aiplatform.Model('projects/my-project/locations/us-central1/models/{MODEL_ID}')

    endpoint.deploy(model,
                    min_replica_count=1,
                    max_replica_count=5,
                    machine_type='n1-standard-4',
                    accelerator_type='NVIDIA_TESLA_K80',
                    accelerator_count=1)

To get predictions from endpoints:

.. code-block:: Python

    endpoint.predict(instances=[[6.7, 3.1, 4.7, 1.5], [4.6, 3.1, 1.5, 0.2]])

To undeploy models from an endpoint:

.. code-block:: Python

    endpoint.undeploy_all()

To delete an endpoint:

.. code-block:: Python

    endpoint.delete()

Pipelines
---------

To create a Vertex AI Pipeline run and monitor until completion:

.. code-block:: Python

    # Instantiate PipelineJob object
    pl = PipelineJob(
        display_name="My first pipeline",

        # Whether or not to enable caching
        # True = always cache pipeline step result
        # False = never cache pipeline step result
        # None = defer to cache option for each pipeline component in the pipeline definition
        enable_caching=False,

        # Local or GCS path to a compiled pipeline definition
        template_path="pipeline.json",

        # Dictionary containing input parameters for your pipeline
        parameter_values=parameter_values,

        # GCS path to act as the pipeline root
        pipeline_root=pipeline_root,
    )

    # Execute pipeline in Vertex AI and monitor until completion
    pl.run(
        # Email address of service account to use for the pipeline run
        # You must have iam.serviceAccounts.actAs permission on the service account to use it
        service_account=service_account,

        # Whether this function call should be synchronous (wait for pipeline run to finish before terminating)
        # or asynchronous (return immediately)
        sync=True
    )

To create a Vertex AI Pipeline without monitoring until completion, use `submit` instead of `run`:

.. code-block:: Python

    # Instantiate PipelineJob object
    pl = PipelineJob(
        display_name="My first pipeline",

        # Whether or not to enable caching
        # True = always cache pipeline step result
        # False = never cache pipeline step result
        # None = defer to cache option for each pipeline component in the pipeline definition
        enable_caching=False,

        # Local or GCS path to a compiled pipeline definition
        template_path="pipeline.json",

        # Dictionary containing input parameters for your pipeline
        parameter_values=parameter_values,

        # GCS path to act as the pipeline root
        pipeline_root=pipeline_root,
    )

    # Submit the Pipeline to Vertex AI
    pl.submit(
        # Email address of service account to use for the pipeline run
        # You must have iam.serviceAccounts.actAs permission on the service account to use it
        service_account=service_account,
    )

Explainable AI: Get Metadata
----------------------------

To get metadata in dictionary format from TensorFlow 1 models:

.. code-block:: Python

    from google.cloud.aiplatform.explain.metadata.tf.v1 import saved_model_metadata_builder

    builder = saved_model_metadata_builder.SavedModelMetadataBuilder(
        'gs://path/to/my/model/dir', tags=[tf.saved_model.tag_constants.SERVING]
    )
    generated_md = builder.get_metadata()

To get metadata in dictionary format from TensorFlow 2 models:

.. code-block:: Python

    from google.cloud.aiplatform.explain.metadata.tf.v2 import saved_model_metadata_builder

    builder = saved_model_metadata_builder.SavedModelMetadataBuilder('gs://path/to/my/model/dir')
    generated_md = builder.get_metadata()

To use Explanation Metadata in endpoint deployment and model upload:

.. code-block:: Python

    explanation_metadata = builder.get_metadata_protobuf()

    # To deploy a model to an endpoint with explanation
    model.deploy(..., explanation_metadata=explanation_metadata)

    # To deploy a model to a created endpoint with explanation
    endpoint.deploy(..., explanation_metadata=explanation_metadata)

    # To upload a model with explanation
    aiplatform.Model.upload(..., explanation_metadata=explanation_metadata)

Cloud Profiler
--------------

Cloud Profiler allows you to profile your remote Vertex AI Training jobs on demand and visualize the results in Vertex AI Tensorboard.

To start using the profiler with TensorFlow, update your training script to include the following:

.. code-block:: Python

    from google.cloud.aiplatform.training_utils import cloud_profiler

    ...

    cloud_profiler.init()

Next, run the job with a Vertex AI TensorBoard instance. For full details on how to do this, visit https://cloud.google.com/vertex-ai/docs/experiments/tensorboard-overview.

Finally, visit your TensorBoard in your Google Cloud Console, navigate to the "Profile" tab, and click the `Capture Profile` button. This allows you to capture profiling statistics for the running jobs.

Next Steps
~~~~~~~~~~

- Read the `Client Library Documentation`_ for the Vertex AI
  API to see other available methods on the client.
- Read the `Vertex AI API Product documentation`_ to learn
  more about the product and see How-to Guides.
- View this `README`_ to see the full list of Cloud
  APIs that we cover.

.. _Client Library Documentation: https://cloud.google.com/python/docs/reference/aiplatform/latest
.. _Vertex AI API Product documentation: https://cloud.google.com/vertex-ai/docs
.. _README: https://github.com/googleapis/google-cloud-python/blob/main/README.rst
