#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
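# GitHub issue form definition for reporting failing tests.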
name: Failing Test
description: Report a failing test
title: "[Failing Test]: "
labels: ["bug", "awaiting triage", "failing test"]
assignees:
  - octocat
body:
  - type: markdown
    attributes:
      value: |
        Thanks for taking the time to fill out this failing test report! Once you've created an issue, you can self-assign by commenting `.take-issue`, self-unassign by commenting `.free-issue`, and close by commenting `.close-issue`.
        Anyone can reopen a closed issue by commenting `.reopen-issue`.
        You can also add/remove labels by commenting `.add-labels label1,label2,'label 3 with spaces'` or `.remove-labels label1,label2,'label 3 with spaces'`,
        or with `.set-labels label1,label2,'label 3 with spaces'` (which removes any labels not in that set).
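  # Free-form description of which test is failing and any initial debugging done so far.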
  - type: textarea
    id: what-happened
    attributes:
      label: What happened?
      description: Please name which test is failing.
      placeholder: |
        If possible, include: when the test started failing, a Jenkins link, and any initial debugging you've done.
        If this isn't a bug and you have a question or support request, please email user@beam.apache.org with a description of the problem instead of opening this issue.
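  # Required dropdown recording whether the test is flaky or continually failing.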
  - type: dropdown
    id: failure
    attributes:
      label: Issue Failure
      options:
        - "Failure: Test is flaky"
        - "Failure: Test is continually failing"
    validations:
      required: true
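  # Required dropdown for issue priority. `default: 1` preselects the second
  # (zero-indexed) option, i.e. P1, matching the guidance in the description.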
  - type: dropdown
    id: priority
    attributes:
      label: Issue Priority
      description: What priority is this bug? A permanently failing test should be marked P1. See https://beam.apache.org/contribute/issue-priorities for the meaning and expectations around issue priorities.
      options:
        - "Priority: 2 (backlog / disabled test but we think the product is healthy)"
        - "Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)"
        - "Priority: 0 (outage / failing precommit test impacting development)"
      default: 1
    validations:
      required: true
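  # Checkboxes for the affected SDKs, runners, and other components (check all that apply).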
  - type: checkboxes
    id: component
    attributes:
      label: Issue Components
      description: Which languages, SDKs, or features are related to your report? (check all that apply)
      options:
        - label: "Component: Python SDK"
        - label: "Component: Java SDK"
        - label: "Component: Go SDK"
        - label: "Component: Typescript SDK"
        - label: "Component: IO connector"
        - label: "Component: Beam YAML"
        - label: "Component: Beam examples"
        - label: "Component: Beam playground"
        - label: "Component: Beam katas"
        - label: "Component: Website"
        - label: "Component: Infrastructure"
        - label: "Component: Spark Runner"
        - label: "Component: Flink Runner"
        - label: "Component: Samza Runner"
        - label: "Component: Twister2 Runner"
        - label: "Component: Hazelcast Jet Runner"
        - label: "Component: Google Cloud Dataflow Runner"