| { |
| "cells": [ |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "cellView": "form", |
| "id": "paYiulysGrwR" |
| }, |
| "outputs": [], |
| "source": [ |
| "# @title ###### Licensed to the Apache Software Foundation (ASF), Version 2.0 (the \"License\")\n", |
| "\n", |
| "# Licensed to the Apache Software Foundation (ASF) under one\n", |
| "# or more contributor license agreements. See the NOTICE file\n", |
| "# distributed with this work for additional information\n", |
| "# regarding copyright ownership. The ASF licenses this file\n", |
| "# to you under the Apache License, Version 2.0 (the\n", |
| "# \"License\"); you may not use this file except in compliance\n", |
| "# with the License. You may obtain a copy of the License at\n", |
| "#\n", |
| "# http://www.apache.org/licenses/LICENSE-2.0\n", |
| "#\n", |
| "# Unless required by applicable law or agreed to in writing,\n", |
| "# software distributed under the License is distributed on an\n", |
| "# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n", |
| "# KIND, either express or implied. See the License for the\n", |
| "# specific language governing permissions and limitations\n", |
| "# under the License" |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "0UGzzndTBPWQ" |
| }, |
| "source": [ |
| "# Remote inference in Apache Beam\n", |
| "\n", |
| "<table align=\"left\">\n", |
| " <td>\n", |
| " <a target=\"_blank\" href=\"https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/custom_remote_inference.ipynb\"><img src=\"https://raw.githubusercontent.com/google/or-tools/main/tools/colab_32px.png\" />Run in Google Colab</a>\n", |
| " </td>\n", |
| " <td>\n", |
| " <a target=\"_blank\" href=\"https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/custom_remote_inference.ipynb\"><img src=\"https://raw.githubusercontent.com/google/or-tools/main/tools/github_32px.png\" />View source on GitHub</a>\n", |
| " </td>\n", |
| "</table>\n" |
| ] |
| }, |
| { |
| "attachments": {}, |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "GNbarEZsalS2" |
| }, |
| "source": [ |
| "This example demonstrates how to implement a custom inference call in Apache Beam using the Google Cloud Vision API.\n", |
| "\n", |
| "The prefered way to run inference in Apache Beam is by using the [RunInference API](https://beam.apache.org/documentation/sdks/python-machine-learning/). \n", |
| "The RunInference API enables you to run models as part of your pipeline in a way that is optimized for machine learning inference. \n", |
| "To reduce the number of steps that you need to take, RunInference supports features like batching. For more infomation about the RunInference API, review the [RunInference API](https://beam.apache.org/releases/pydoc/current/apache_beam.ml.inference.html#apache_beam.ml.inference.RunInference), \n", |
| "which demonstrates how to implement model inference in PyTorch, scikit-learn, and TensorFlow.\n", |
| "\n", |
| "Currently, the RunInference API doesn't support making remote inference calls using the Natural Language API, Cloud Vision API, and so on. \n", |
| "Therefore, to use these remote APIs with Apache Beam, you need to write custom inference calls.\n", |
| "\n", |
| "**Note:** all images are licensed CC-BY, creators are listed in the [LICENSE.txt](https://storage.googleapis.com/apache-beam-samples/image_captioning/LICENSE.txt) file." |
| ] |
| }, |
| { |
| "attachments": {}, |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "GNbarEZsalS1" |
| }, |
| "source": [ |
| "## Run the Cloud Vision API\n", |
| "\n", |
| "You can use the Cloud Vision API to retrieve labels that describe an image.\n", |
| "For example, the following image shows a cat with possible labels." |
| ] |
| }, |
| { |
| "attachments": {}, |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "q-jVQn3maZ81" |
| }, |
| "source": [ |
| "" |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "4io1vzkzF683" |
| }, |
| "source": [ |
| "We want to run the Google Cloud Vision API on a large set of images, and Apache Beam is the ideal tool to handle this workflow.\n", |
| "This example demonstates how to retrieve image labels with this API on a small set of images.\n", |
| "\n", |
| "The example follows these steps to implement this workflow:\n", |
| "* Read the images.\n", |
| "* Batch the images together to optimize the model call.\n", |
| "* Send the images to an external API to run inference.\n", |
| "* Postprocess the results of your API.\n", |
| "\n", |
| "**Caution:** Be aware of API quotas and the heavy load you might incur on your external API. Verify that your pipeline and API are configured correctly for your use case.\n", |
| "\n", |
| "To optimize the calls to the external API, limit the parallel calls to the external remote API by configuring [PipelineOptions](https://beam.apache.org/documentation/programming-guide/#configuring-pipeline-options).\n", |
| "In Apache Beam, different runners provide options to handle the parallelism, for example:\n", |
| "* With the [Direct Runner](https://beam.apache.org/documentation/runners/direct/), use the `direct_num_workers` pipeline option.\n", |
| "* With the [Google Cloud Dataflow Runner](https://beam.apache.org/documentation/runners/dataflow/), use the `max_num_workers` pipeline option.\n", |
| "\n", |
| "For information about other runners, see the [Beam capability matrix](https://beam.apache.org/documentation/runners/capability-matrix/) " |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "FAawWOaiIYaS" |
| }, |
| "source": [ |
| "## Before you begin\n", |
| "\n", |
| "This section provides installation steps." |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "XhpKOxINrIqz" |
| }, |
| "source": [ |
| "First, download and install the dependencies." |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "id": "bA7MLR8OptJw" |
| }, |
| "outputs": [], |
| "source": [ |
| "!pip install --upgrade pip\n", |
| "!pip install protobuf==3.19.4\n", |
| "!pip install apache-beam[interactive,gcp]>=2.40.0\n", |
| "!pip install google-cloud-vision==3.1.1\n", |
| "!pip install requests\n", |
| "\n", |
| "# To use the newly installed version, restart the runtime.\n", |
| "exit() " |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "C-RVR2eprc0r" |
| }, |
| "source": [ |
| "To use the Cloud Vision API, authenticate with Google Cloud." |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "id": "qGDJCbxgTprh" |
| }, |
| "outputs": [], |
| "source": [ |
| "# Follow the steps to configure your Google Cloup setup.\n", |
| "!gcloud init --console-only" |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "id": "74acX7AlT91N" |
| }, |
| "outputs": [], |
| "source": [ |
| "\n", |
| "!gcloud auth application-default login" |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "mL4MaHm_XOVd" |
| }, |
| "source": [ |
| "## Run remote inference on Cloud Vision API\n", |
| "\n", |
| "This section demonstates the steps to run remote inference on the Cloud Vision API.\n", |
| "\n", |
| "Download and install Apache Beam and the required modules." |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "id": "gE0go8CpnTy3" |
| }, |
| "outputs": [], |
| "source": [ |
| "from typing import List\n", |
| "import io\n", |
| "import os\n", |
| "import requests\n", |
| "\n", |
| "from google.cloud import vision\n", |
| "from google.cloud.vision_v1.types import Feature\n", |
| "import apache_beam as beam" |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "09k08IYlLmON" |
| }, |
| "source": [ |
| "This example uses images from the [MSCoco dataset](https://cocodataset.org/#explore) as a list of image URLs.\n", |
| "This data is used as the pipeline input." |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "id": "_89eN_1QeYEd" |
| }, |
| "outputs": [], |
| "source": [ |
| "image_urls = [\n", |
| " \"http://farm3.staticflickr.com/2824/10213933686_6936eb402b_z.jpg\",\n", |
| " \"http://farm8.staticflickr.com/7026/6388965173_92664a0d78_z.jpg\",\n", |
| " \"http://farm8.staticflickr.com/7003/6528937031_10e1ce0960_z.jpg\",\n", |
| " \"http://farm6.staticflickr.com/5207/5304302785_7b5f763190_z.jpg\",\n", |
| " \"http://farm6.staticflickr.com/5207/5304302785_7b5f763190_z.jpg\",\n", |
| " \"http://farm8.staticflickr.com/7026/6388965173_92664a0d78_z.jpg\",\n", |
| " \"http://farm8.staticflickr.com/7026/6388965173_92664a0d78_z.jpg\",\n", |
| "]\n", |
| "\n", |
| "def read_image(image_url):\n", |
| " \"\"\"Read image from url and return image_url, image bytes\"\"\"\n", |
| " response = requests.get(image_url)\n", |
| " image_bytes = io.BytesIO(response.content).read()\n", |
| " return image_url, image_bytes " |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "HLy7VKJhLrmT" |
| }, |
| "source": [ |
| "### Create a custom DoFn\n", |
| "\n", |
| "In order to implement remote inference, create a DoFn class. This class sends a batch of images to the Cloud vision API.\n", |
| "\n", |
| "The custom DoFn makes it possible to initialize the API. In case of a custom model, a model can also be loaded in the `setup` function. \n", |
| "\n", |
| "The `process` function is the most interesting part. In this function, we implement the model call and return its results.\n", |
| "\n", |
| "When running remote inference, prepare to encounter, identify, and handle failure as gracefully as possible. We recommend using the following techniques: \n", |
| "\n", |
| "* **Exponential backoff:** Retry failed remote calls with exponentially growing pauses between retries. Using exponential backoff ensures that failures don't lead to an overwhelming number of retries in quick succession. \n", |
| "\n", |
| "* **Dead-letter queues:** Route failed inferences to a separate `PCollection` without failing the whole transform. You can continue execution without failing the job (batch jobs' default behavior) or retrying indefinitely (streaming jobs' default behavior).\n", |
| "You can then run custom pipeline logic on the dead-letter queue (unprocessed messages queue) to log the failure, alert, and push the failed message to temporary storage so that it can eventually be reprocessed." |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "id": "LnaisJ_JiY_Q" |
| }, |
| "outputs": [], |
| "source": [ |
| "class RemoteBatchInference(beam.DoFn):\n", |
| " \"\"\"DoFn that accepts a batch of images as bytearray\n", |
| " and sends that batch to the Cloud vision API for remote inference.\"\"\"\n", |
| " def setup(self):\n", |
| " \"\"\"Init the Google Vision API client.\"\"\"\n", |
| " self._client = vision.ImageAnnotatorClient()\n", |
| " \n", |
| " def process(self, images_batch):\n", |
| " feature = Feature()\n", |
| " feature.type_ = Feature.Type.LABEL_DETECTION\n", |
| "\n", |
| " # The list of image_urls\n", |
| " image_urls = [image_url for (image_url, image_bytes) in images_batch]\n", |
| "\n", |
| " # Create a batch request for all images in the batch.\n", |
| " images = [vision.Image(content=image_bytes) for (image_url, image_bytes) in images_batch]\n", |
| " image_requests = [vision.AnnotateImageRequest(image=image, features=[feature]) for image in images]\n", |
| " batch_image_request = vision.BatchAnnotateImagesRequest(requests=image_requests)\n", |
| "\n", |
| " # Send the batch request to the remote endpoint.\n", |
| " responses = self._client.batch_annotate_images(request=batch_image_request).responses\n", |
| " \n", |
| " return list(zip(image_urls, responses))\n" |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "lHJuyHhvL0-a" |
| }, |
| "source": [ |
| "### Manage batching\n", |
| "\n", |
| "Before we can chain together the pipeline steps, we need to understand batching.\n", |
| "When running inference with your model, either in Apache Beam or in an external API, you can batch your input to increase the efficiency of the model execution.\n", |
| "When using a custom DoFn, as in this example, you need to manage the batching.\n", |
| "\n", |
| "To manage the batching in this pipeline, include a `BatchElements` transform to group elements together and form a batch of the desired size.\n", |
| "\n", |
| "* If you have a streaming pipeline, consider using [GroupIntoBatches](https://beam.apache.org/documentation/transforms/python/aggregation/groupintobatches/),\n", |
| "because `BatchElements` doesn't batch items across bundles. `GroupIntoBatches` requires choosing a key within which items are batched.\n", |
| "\n", |
| "* When batching, make sure that the input batch matches the maximum payload of the external API. \n", |
| "\n", |
| "* If you are designing your own API endpoint, make sure that it can handle batches. \n", |
| "\n", |
| " " |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "4sXHwZk9Url2" |
| }, |
| "source": [ |
| "### Create the pipeline\n", |
| "\n", |
| "This section demonstrates how to chain the steps together to do the following:\n", |
| "\n", |
| "* Read data.\n", |
| "\n", |
| "* Transform the data to fit the model input.\n", |
| "\n", |
| "* Run remote inference.\n", |
| "\n", |
| "* Process and display the results." |
| ] |
| }, |
| { |
| "cell_type": "code", |
| "execution_count": null, |
| "metadata": { |
| "colab": { |
| "base_uri": "https://localhost:8080/" |
| }, |
| "id": "LLg0OTvNkqo4", |
| "outputId": "7250b11d-a805-436a-990b-0a864404a536" |
| }, |
| "outputs": [ |
| { |
| "name": "stdout", |
| "output_type": "stream", |
| "text": [ |
| "('http://farm3.staticflickr.com/2824/10213933686_6936eb402b_z.jpg', label_annotations {\n", |
| " mid: \"/m/083wq\"\n", |
| " description: \"Wheel\"\n", |
| " score: 0.9790800213813782\n", |
| " topicality: 0.9790800213813782\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0h9mv\"\n", |
| " description: \"Tire\"\n", |
| " score: 0.9781236052513123\n", |
| " topicality: 0.9781236052513123\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/043g5f\"\n", |
| " description: \"Fuel tank\"\n", |
| " score: 0.9584090113639832\n", |
| " topicality: 0.9584090113639832\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/05s2s\"\n", |
| " description: \"Plant\"\n", |
| " score: 0.956047534942627\n", |
| " topicality: 0.956047534942627\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0h8lk_j\"\n", |
| " description: \"Automotive fuel system\"\n", |
| " score: 0.9403533339500427\n", |
| " topicality: 0.9403533339500427\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/07yv9\"\n", |
| " description: \"Vehicle\"\n", |
| " score: 0.9362041354179382\n", |
| " topicality: 0.9362041354179382\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02qwkrn\"\n", |
| " description: \"Vehicle brake\"\n", |
| " score: 0.9050074815750122\n", |
| " topicality: 0.9050074815750122\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0h8pb3l\"\n", |
| " description: \"Automotive tire\"\n", |
| " score: 0.8968825936317444\n", |
| " topicality: 0.8968825936317444\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0768fx\"\n", |
| " description: \"Automotive lighting\"\n", |
| " score: 0.8944322466850281\n", |
| " topicality: 0.8944322466850281\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04tkfx\"\n", |
| " description: \"Tread\"\n", |
| " score: 0.878828227519989\n", |
| " topicality: 0.878828227519989\n", |
| "}\n", |
| ")\n", |
| "('http://farm8.staticflickr.com/7026/6388965173_92664a0d78_z.jpg', label_annotations {\n", |
| " mid: \"/m/054_l\"\n", |
| " description: \"Mirror\"\n", |
| " score: 0.9682560563087463\n", |
| " topicality: 0.9682560563087463\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02jz0l\"\n", |
| " description: \"Tap\"\n", |
| " score: 0.9611372947692871\n", |
| " topicality: 0.9611372947692871\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0130jx\"\n", |
| " description: \"Sink\"\n", |
| " score: 0.9328749775886536\n", |
| " topicality: 0.9328749775886536\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0h8lr5r\"\n", |
| " description: \"Bathroom sink\"\n", |
| " score: 0.9324912428855896\n", |
| " topicality: 0.9324912428855896\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02pkr5\"\n", |
| " description: \"Plumbing fixture\"\n", |
| " score: 0.9191171526908875\n", |
| " topicality: 0.9191171526908875\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02dgv\"\n", |
| " description: \"Door\"\n", |
| " score: 0.8910166621208191\n", |
| " topicality: 0.8910166621208191\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/09ggk\"\n", |
| " description: \"Purple\"\n", |
| " score: 0.8799519538879395\n", |
| " topicality: 0.8799519538879395\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/01j2bj\"\n", |
| " description: \"Bathroom\"\n", |
| " score: 0.8725592494010925\n", |
| " topicality: 0.8725592494010925\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04wnmd\"\n", |
| " description: \"Fixture\"\n", |
| " score: 0.8603869080543518\n", |
| " topicality: 0.8603869080543518\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04y4h8h\"\n", |
| " description: \"Bathroom cabinet\"\n", |
| " score: 0.80011385679245\n", |
| " topicality: 0.80011385679245\n", |
| "}\n", |
| ")\n", |
| "('http://farm8.staticflickr.com/7003/6528937031_10e1ce0960_z.jpg', error {\n", |
| " code: 3\n", |
| " message: \"Bad image data.\"\n", |
| "}\n", |
| ")\n", |
| "('http://farm6.staticflickr.com/5207/5304302785_7b5f763190_z.jpg', error {\n", |
| " code: 3\n", |
| " message: \"Bad image data.\"\n", |
| "}\n", |
| ")\n", |
| "('http://farm6.staticflickr.com/5207/5304302785_7b5f763190_z.jpg', error {\n", |
| " code: 3\n", |
| " message: \"Bad image data.\"\n", |
| "}\n", |
| ")\n", |
| "('http://farm8.staticflickr.com/7026/6388965173_92664a0d78_z.jpg', label_annotations {\n", |
| " mid: \"/m/054_l\"\n", |
| " description: \"Mirror\"\n", |
| " score: 0.9682560563087463\n", |
| " topicality: 0.9682560563087463\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02jz0l\"\n", |
| " description: \"Tap\"\n", |
| " score: 0.9611372947692871\n", |
| " topicality: 0.9611372947692871\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0130jx\"\n", |
| " description: \"Sink\"\n", |
| " score: 0.9328749775886536\n", |
| " topicality: 0.9328749775886536\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0h8lr5r\"\n", |
| " description: \"Bathroom sink\"\n", |
| " score: 0.9324912428855896\n", |
| " topicality: 0.9324912428855896\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02pkr5\"\n", |
| " description: \"Plumbing fixture\"\n", |
| " score: 0.9191171526908875\n", |
| " topicality: 0.9191171526908875\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02dgv\"\n", |
| " description: \"Door\"\n", |
| " score: 0.8910166621208191\n", |
| " topicality: 0.8910166621208191\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/09ggk\"\n", |
| " description: \"Purple\"\n", |
| " score: 0.8799519538879395\n", |
| " topicality: 0.8799519538879395\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/01j2bj\"\n", |
| " description: \"Bathroom\"\n", |
| " score: 0.8725592494010925\n", |
| " topicality: 0.8725592494010925\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04wnmd\"\n", |
| " description: \"Fixture\"\n", |
| " score: 0.8603869080543518\n", |
| " topicality: 0.8603869080543518\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04y4h8h\"\n", |
| " description: \"Bathroom cabinet\"\n", |
| " score: 0.80011385679245\n", |
| " topicality: 0.80011385679245\n", |
| "}\n", |
| ")\n", |
| "('http://farm8.staticflickr.com/7026/6388965173_92664a0d78_z.jpg', label_annotations {\n", |
| " mid: \"/m/054_l\"\n", |
| " description: \"Mirror\"\n", |
| " score: 0.9682560563087463\n", |
| " topicality: 0.9682560563087463\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02jz0l\"\n", |
| " description: \"Tap\"\n", |
| " score: 0.9611372947692871\n", |
| " topicality: 0.9611372947692871\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0130jx\"\n", |
| " description: \"Sink\"\n", |
| " score: 0.9328749775886536\n", |
| " topicality: 0.9328749775886536\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/0h8lr5r\"\n", |
| " description: \"Bathroom sink\"\n", |
| " score: 0.9324912428855896\n", |
| " topicality: 0.9324912428855896\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02pkr5\"\n", |
| " description: \"Plumbing fixture\"\n", |
| " score: 0.9191171526908875\n", |
| " topicality: 0.9191171526908875\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/02dgv\"\n", |
| " description: \"Door\"\n", |
| " score: 0.8910166621208191\n", |
| " topicality: 0.8910166621208191\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/09ggk\"\n", |
| " description: \"Purple\"\n", |
| " score: 0.8799519538879395\n", |
| " topicality: 0.8799519538879395\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/01j2bj\"\n", |
| " description: \"Bathroom\"\n", |
| " score: 0.8725592494010925\n", |
| " topicality: 0.8725592494010925\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04wnmd\"\n", |
| " description: \"Fixture\"\n", |
| " score: 0.8603869080543518\n", |
| " topicality: 0.8603869080543518\n", |
| "}\n", |
| "label_annotations {\n", |
| " mid: \"/m/04y4h8h\"\n", |
| " description: \"Bathroom cabinet\"\n", |
| " score: 0.80011385679245\n", |
| " topicality: 0.80011385679245\n", |
| "}\n", |
| ")\n" |
| ] |
| } |
| ], |
| "source": [ |
| "with beam.Pipeline() as pipeline:\n", |
| " _ = (pipeline | \"Create inputs\" >> beam.Create(image_urls)\n", |
| " | \"Read images\" >> beam.Map(read_image)\n", |
| " | \"Batch images\" >> beam.BatchElements(min_batch_size=2, max_batch_size=4)\n", |
| " | \"Inference\" >> beam.ParDo(RemoteBatchInference())\n", |
| " | \"Print image_url and annotation\" >> beam.Map(print)\n", |
| " )" |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": { |
| "id": "7gwn5bF1XaDm" |
| }, |
| "source": [ |
| "## Monitor the pipeline\n", |
| "\n", |
| "Because monitoring can provide insight into the status and health of the application, consider monitoring and measuring pipeline performance.\n", |
| "For information about the available tracking metrics, see [RunInference Metrics](https://beam.apache.org/documentation/ml/runinference-metrics/)." |
| ] |
| }, |
| { |
| "cell_type": "markdown", |
| "metadata": {}, |
| "source": [] |
| } |
| ], |
| "metadata": { |
| "colab": { |
| "collapsed_sections": [], |
| "provenance": [] |
| }, |
| "kernelspec": { |
| "display_name": "Python 3", |
| "language": "python", |
| "name": "python3" |
| }, |
| "language_info": { |
| "name": "python", |
| "version": "3.10.7 (main, Dec 7 2022, 13:34:16) [Clang 14.0.0 (clang-1400.0.29.102)]" |
| }, |
| "vscode": { |
| "interpreter": { |
| "hash": "40c55305dca37c951f6b497e2e996ca59c449c4502b9f8a4515c118ec923845d" |
| } |
| } |
| }, |
| "nbformat": 4, |
| "nbformat_minor": 0 |
| } |