
Commit 1586e1e

chore(ppai/timeseries-classification): add global fishing watch sample (GoogleCloudPlatform#6557)
Add Global Fishing Watch sample for the People & Planet AI Series. Staged:

* [Notebook](https://colab.research.google.com/github/davidcavazos/python-docs-samples/blob/ppai-timeseries-classification/people-and-planet-ai/timeseries-classification/README.ipynb) -- this serves as the README

  > ⚠️ When cloning the repo, it doesn't exist yet in the main branch, so replace with:
  >
  > ```
  > !git clone --branch ppai-timeseries-classification https://github.com/davidcavazos/python-docs-samples.git ~/python-docs-samples
  > ```
  >
  > The links to source files in the notebook will also be broken since the files don't exist yet in the main branch.
1 parent 4c6fc2d commit 1586e1e

22 files changed: +3,084 −0 lines changed

.github/CODEOWNERS

Lines changed: 1 addition & 0 deletions
```diff
@@ -58,6 +58,7 @@
 /monitoring/prometheus @yuriatgoogle @GoogleCloudPlatform/python-samples-owners
 /notebooks/**/*.py @alixhami @GoogleCloudPlatform/python-samples-owners
 /opencensus/**/*.py @GoogleCloudPlatform/python-samples-owners
+/people-and-planet-ai/**/*.py @davidcavazos @GoogleCloudPlatform/python-samples-owners
 /profiler/**/*.py @kalyanac @GoogleCloudPlatform/python-samples-owners
 /pubsub/**/*.py @anguillanneuf @hongalex @GoogleCloudPlatform/python-samples-owners
 /run/**/*.py @GoogleCloudPlatform/cdpe-serverless @GoogleCloudPlatform/python-samples-owners
```

people-and-planet-ai/README.md

Lines changed: 28 additions & 0 deletions
@@ -0,0 +1,28 @@
```markdown
# 🌍 People & Planet AI

## 🦏 [Wildlife Insights -- _image-classification_](image-classification)

This model is trained to recognize animal species from
[camera trap](https://en.wikipedia.org/wiki/Camera_trap)
pictures.

* **Creating datasets**: [Apache Beam] in [Dataflow]
* **Training the model**: [AutoML] in [Vertex AI]
* **Getting predictions**: [Vertex AI]

## 🗺 [Global Fishing Watch -- _timeseries-classification_](timeseries-classification)

This model is trained to categorize, for every hour, whether or not a ship is fishing from its
[_Maritime Mobile Service Identity_ (MMSI)](https://en.wikipedia.org/wiki/Maritime_Mobile_Service_Identity)
location data.

* **Creating datasets**: [Apache Beam] in [Dataflow]
* **Training the model**: [Keras] in [Vertex AI]
* **Getting predictions**: [Keras] in [Cloud Run]

[Apache Beam]: https://beam.apache.org
[AutoML]: https://cloud.google.com/vertex-ai/docs/beginner/beginners-guide
[Cloud Run]: https://cloud.google.com/run
[Dataflow]: https://cloud.google.com/dataflow
[Keras]: https://keras.io
[Vertex AI]: https://cloud.google.com/vertex-ai
```
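The Global Fishing Watch entry above describes an hourly fishing / not-fishing classifier built with Keras. For orientation only, here is a minimal sketch of that kind of model, assuming hourly windows of per-vessel location features; the window length, feature names, and layer sizes are illustrative assumptions, not values taken from the sample code.

```python
# Minimal sketch of an hourly fishing / not-fishing classifier in Keras.
# The input shape and feature set below are illustrative assumptions,
# not taken from the actual sample.
import tensorflow as tf

HOURS_PER_WINDOW = 24   # hypothetical window length
NUM_FEATURES = 4        # e.g. latitude, longitude, speed, course

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(HOURS_PER_WINDOW, NUM_FEATURES)),
    tf.keras.layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    tf.keras.layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
    # One probability per hour: is the vessel fishing during that hour?
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
model.summary()  # output shape is (batch, HOURS_PER_WINDOW, 1): one label per hour
```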
Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
```
# Ignore everything except the source files.
**/*
!Dockerfile
!constraints.txt
!requirements.txt
!*.py
```
Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
```dockerfile
# Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

# Each version of TensorFlow requires a specific CUDA/cuDNN version:
#   https://www.tensorflow.org/install/source#gpu
# For a list of all the nvidia images:
#   https://ngc.nvidia.com/catalog/containers/nvidia:cuda/tags
FROM nvcr.io/nvidia/cuda:11.3.1-cudnn8-runtime-ubuntu20.04

WORKDIR /app

# Copy the pipeline source files and the requirements file.
COPY *.py ./
COPY requirements.txt ./
COPY constraints.txt ./

# Install Python 3 and other required dependencies.
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl g++ python3.8-dev python3-distutils \
    && rm -rf /var/lib/apt/lists/* \
    && update-alternatives --install /usr/bin/python python /usr/bin/python3.8 10 \
    && curl https://bootstrap.pypa.io/get-pip.py | python \
    # Install the pipeline requirements and check that there are no conflicts.
    # Since the image already has all the dependencies installed,
    # there's no need to run with the --requirements_file option.
    && pip install --no-cache-dir -r requirements.txt -c constraints.txt \
    && pip check

# Set the entrypoint to the Apache Beam SDK worker launcher.
COPY --from=apache/beam_python3.8_sdk:2.32.0 /opt/apache/beam /opt/apache/beam
ENTRYPOINT [ "/opt/apache/beam/boot" ]
```
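The Dockerfile bakes the pipeline's Python dependencies into the worker image, which is why its comment notes that the `--requirements_file` option is not needed at launch time. As a hedged sketch of how such an image might be used (not the sample's actual launcher; the project, region, bucket, and image names are placeholders, and wiring the image in via `sdk_container_image` is an assumption about the setup):

```python
# Minimal sketch: launch a Beam pipeline on Dataflow with a custom SDK
# container image built from a Dockerfile like the one above.
# All resource names below are placeholders, not values from the sample.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                  # placeholder
    region="us-central1",                  # placeholder
    temp_location="gs://my-bucket/temp",   # placeholder
    # Workers run this image, so dependencies are already installed and
    # no --requirements_file option is needed.
    sdk_container_image="gcr.io/my-project/beam-worker:latest",  # placeholder
    experiments=["use_runner_v2"],
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["hello"])
        | "Print" >> beam.Map(print)
    )
```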
