Merge "Display AI-powered checks with dedicated styling."
diff --git a/Documentation/dev-processes.txt b/Documentation/dev-processes.txt
index 2e87150..52fa75a 100644
--- a/Documentation/dev-processes.txt
+++ b/Documentation/dev-processes.txt
@@ -52,8 +52,8 @@
[[steering-committee-election]]
=== Election of non-Google steering committee members
-The election of the non-Google steering committee members happens once
-a year in June. Non-Google link:dev-roles.html#maintainer[maintainers]
+The election of the non-Google steering committee members and the community managers
+happens once a year in June. Non-Google link:dev-roles.html#maintainer[maintainers]
can nominate themselves by posting an informal application on the
non-public mailto:gerritcodereview-community-managers@googlegroups.com[
community manager mailing list] when the call for nominations is sent to
diff --git a/contrib/git-exproll.sh b/contrib/git-exproll.sh
deleted file mode 100644
index 9ad7a85..0000000
--- a/contrib/git-exproll.sh
+++ /dev/null
@@ -1,566 +0,0 @@
-#!/usr/bin/env bash
-# Copyright (c) 2012, Code Aurora Forum. All rights reserved.
-#
-# Redistribution and use in source and binary forms, with or without
-# modification, are permitted provided that the following conditions are
-# met:
-# # Redistributions of source code must retain the above copyright
-# notice, this list of conditions and the following disclaimer.
-# # Redistributions in binary form must reproduce the above
-# copyright notice, this list of conditions and the following
-# disclaimer in the documentation and/or other materials provided
-# with the distribution.
-# # Neither the name of Code Aurora Forum, Inc. nor the names of its
-# contributors may be used to endorse or promote products derived
-# from this software without specific prior written permission.
-#
-# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY EXPRESS OR IMPLIED
-# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
-# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT
-# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS
-# BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
-# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
-# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR
-# BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
-# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
-# OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN
-# IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-
-usage() { # error_message
-
- cat <<-EOF
- usage: $(basename $0) [-unvt] [--norefs] [--noloose] [-r|--ratio number]
- [git gc option...] git.repo
-
- -u|-h usage/help
- -v verbose
- -n dry-run don't actually repack anything
- -t touch treat repo as if it had been touched
- --norefs avoid extra ref packing timestamp checking
- --noloose do not run just because there are loose object dirs
- (repacking may still run if they are referenced)
- -r ratio <number> packfile ratio to aim for (default 10)
-
- git gc option will be passed as args to git gc
-
- git.repo to run gc against
-
- Garbage collect using a pseudo logarithmic packfile maintenance
- approach. This approach attempts to minimize packfile churn
- by keeping several generations of varying sized packfiles around
- and only consolidating packfiles (or loose objects) which are
- either new packfiles, or packfiles close to the same size as
- another packfile.
-
- An estimate is used to predict when rollups (one consolidation
- would cause another consolidation) would occur so that this
- rollup can be done all at once via a single repack. This reduces
- both the runtime and the pack file churn in rollup cases.
-
- Approach: plan each consolidation by creating a table like this:
-
- Id Keep Size Sha1(or consolidation list) Actions(repack down up note)
- 1 - 11356 9052edfb7392646cd4e5f362b953675985f01f96 y - - New
- 2 - 429088 010904d5c11cd26a79fda91b01ab454d1001b402 y - - New
- c1 - 440444 [1,2] - - -
-
- Id: numbers preceded by a c are estimated "c pack" files
- Keep: - none, k private keep, o our keep
- Size: in disk blocks (default du output)
- Sha1: of packfile, or consolidation list of packfile ids
- Actions
- repack: - n no, y yes
- down: - noop, ^ consolidate with a file above
- up: - noop, v consolidate with a file below
- note: Human description of script decisions:
- New (file is a new packfile)
- Consolidate with:<list of packfile ids>
- (too far from:<list of packfile ids>)
-
- On the first pass, always consolidate any new packfiles along
- with loose objects and along with any packfiles which are within
- the ratio size of their predecessors (note, the list is ordered
- by increasing size). After each consolidation, insert a fake
- consolidation, or "c pack", to naively represent the size and
- ordered positioning of the anticipated new consolidated pack.
- Every time a new pack is planned, rescan the list in case the
- new "c pack" would cause more consolidation...
-
- Once the packfiles which need consolidation are determined, the
- packfiles which will not be consolidated are marked with a .keep
- file, and those which will be consolidated will have their .keep
- removed if they have one. Thus, the packfiles with a .keep will
- not get repacked.
-
- Packfile consolidation is determined by the --ratio parameter
- (default is 10). This ratio is somewhat of a tradeoff. The
- smaller the number, the more packfiles will be kept on average;
- this increases disk utilization somewhat. However, a larger
- ratio causes greater churn and may increase disk utilization due
- to deleted packfiles not being reclaimed since they may still be
- kept open by long running applications such as Gerrit. Sane
- ratio values are probably between 2 and 10. Since most
- consolidations actually end up smaller than the estimated
- consolidated packfile size (due to compression), the true ratio
- achieved will likely be 1 to 2 greater than the target ratio.
- The smaller the target ratio, the greater this discrepancy.
-
- Finally, attempt to skip garbage collection entirely on untouched
- repos. In order to determine if a repo has been touched, use the
- timestamp on the script's keep files; if any relevant file/dir
- is newer than a keep marker file, assume that the repo has been
- touched and gc needs to run. Also assume gc needs to run whenever
- there are loose object dirs since they may contain untouched
- unreferenced loose objects which need to be pruned (once they
- expire).
-
- In order to allow the keep files to be an effective timestamp
- marker to detect relevant changes in a repo since the last run,
- all relevant files and directories which may be modified during a
- gc run (even during a noop gc run), must have their timestamps
- reset to the same time as the keep files or gc will always run
- even on untouched repos. The relevant files/dirs are all those
- files and directories which garbage collection, object packing,
- ref packing and pruning might change during noop actions.
-EOF
-
- [ -n "$1" ] && info "ERROR $1"
-
- exit 128
-}
-
-debug() { [ -n "$SW_V" ] && info "$1" ; }
-info() { echo "$1" >&2 ; }
-
-array_copy() { #v2 # array_src array_dst
- local src=$1 dst=$2
- local s i=0
- eval s=\${#$src[@]}
- while [ $i -lt $s ] ; do
- eval $dst[$i]=\"\${$src[$i]}\"
- i=$(($i + 1))
- done
-}
-
-array_equals() { #v2 # array_name [vals...]
- local a=$1 ; shift
- local s=0 t=() val
- array_copy "$a" t
- for s in "${!t[@]}" ; do s=$((s+1)) ; done
- [ "$s" -ne "$#" ] && return 1
- for val in "${t[@]}" ; do
- [ "$val" = "$1" ] || return 2
- shift
- done
- return 0
-}
-
-packs_sizes() { # git.repo > "size pack"...
- du -s "$1"/objects/pack/pack-$SHA1.pack | sort -n 2> /dev/null
-}
-
-is_ourkeep() { grep -q "$KEEP" "$1" 2> /dev/null ; } # keep
-has_ourkeep() { is_ourkeep "$(keep_for "$1")" ; } # pack
-has_keep() { [ -f "$(keep_for "$1")" ] ; } # pack
-is_repo() { [ -d "$1/objects" ] && [ -d "$1/refs/heads" ] ; } # git.repo
-
-keep() { # pack # returns true if we added our keep
- keep=$(keep_for "$1")
- [ -f "$keep" ] && return 1
- echo "$KEEP" > "$keep"
- return 0
-}
-
-keep_for() { # packfile > keepfile
- local keep=$(echo "$1" | sed -es'/\.pack$/.keep/')
- [ "${keep/.keep}" = "$keep" ] && return 1
- echo "$keep"
-}
-
-idx_for() { # packfile > idxfile
- local idx=$(echo "$1" | sed -es'/\.pack$/.idx/')
- [ "${idx/.idx}" = "$idx" ] && return 1
- echo "$idx"
-}
-
-# pack_or_keep_file > sha
-sha_for() { echo "$1" | sed -es'|\(.*/\)*pack-\([^.]*\)\..*$|\2|' ; }
-
-private_keeps() { # git.repo -> sets pkeeps
- local repo=$1 ary=$2
- local keep keeps=("$repo"/objects/pack/pack-$SHA1.keep)
- pkeeps=()
- for keep in "${keeps[@]}" ; do
- is_ourkeep "$keep" || pkeeps=("${pkeeps[@]}" "$keep")
- done
-}
-
-is_tooclose() { [ "$(($1 * $RATIO))" -gt "$2" ] ; } # smaller larger
-
-unique() { # [args...] > unique_words
- local lines=$(while [ $# -gt 0 ] ; do echo "$1" ; shift ; done)
- lines=$(echo "$lines" | sort -u)
- echo $lines # as words
-}
-
-outfs() { # fs [args...] > argfs...
- local fs=$1 ; shift
- [ $# -gt 0 ] && echo -n "$1" ; shift
- while [ $# -gt 0 ] ; do echo -n "$fs$1" ; shift ; done
-}
-
-sort_list() { # < list > formatted_list
- # n has_keep size sha repack down up note
- awk '{ note=$8; for(i=8;i<NF;i++) note=note " "$(i+1)
- printf("%-5s %s %-14s %-40s %s %s %s %s\n", \
- $1,$2, $3, $4, $5,$6,$7,note)}' |\
- sort -k 3,3n -k 1,1n
-}
-
-is_touched() { # git.repo
- local repo=$1
- local loose keep ours newer
- [ -n "$SW_T" ] && { debug "$SW_T -> treat as touched" ; return 0 ; }
-
- if [ -z "$SW_LOOSE" ] ; then
- # If there are loose objects, they may need to be pruned,
- # run even if nothing has really been touched.
- loose=$(find "$repo/objects" -type d \
- -wholename "$repo/objects/[0-9][0-9]" \
- -print -quit 2>/dev/null)
- [ -n "$loose" ] && { info "There are loose object directories" ; return 0 ; }
- fi
-
- # If we don't have a keep, the current packfiles may not have been
- # compressed with the current gc policy (gc may never have been run),
- # so run at least once to repack everything. Also, we need a marker
- # file for timestamp tracking (a dir needs to detect changes within
- # it, so it cannot be a marker) and our keeps are something we control,
- # use them.
- for keep in "$repo"/objects/pack/pack-$SHA1.keep ; do
- is_ourkeep "$keep" && { ours=$keep ; break ; }
- done
- [ -z "$ours" ] && { info 'We have no keep (we have never run?): run' ; return 0 ; }
-
- debug "Our timestamp keep: $ours"
- # The wholename stuff seems to get touched by a noop git gc
- newer=$(find "$repo/objects" "$repo/refs" "$repo/packed-refs" \
- '!' -wholename "$repo/objects/info" \
- '!' -wholename "$repo/objects/info/*" \
- -newer "$ours" \
- -print -quit 2>/dev/null)
- [ -z "$newer" ] && return 1
-
- info "Touched since last run: $newer"
- return 0
-}
-
-touch_refs() { # git.repo start_date refs
- local repo=$1 start_date=$2 refs=$3
- (
- debug "Setting start date($start_date) on unpacked refs:"
- debug "$refs"
- cd "$repo/refs" || return
- # safe to assume no newlines in a ref name
- echo "$refs" | xargs -d '\n' -n 1 touch -c -d "$start_date"
- )
-}
-
-set_start_date() { # git.repo start_date refs refdirs packedrefs [packs]
- local repo=$1 start_date=$2 refs=$3 refdirs=$4 packedrefs=$5 ; shift 5
- local pack keep idx repacked
-
- # This stuff is touched during object packs
- while [ $# -gt 0 ] ; do
- pack=$1 ; shift
- keep="$(keep_for "$pack")"
- idx="$(idx_for "$pack")"
- touch -c -d "$start_date" "$pack" "$keep" "$idx"
- debug "Setting start date on: $pack $keep $idx"
- done
- # This will prevent us from detecting any deletes in the pack dir
- # since gc ran, except for private keeps which we are checking
- # manually. But there really shouldn't be any other relevant deletes
- # in this dir which should cause us to rerun next time, deleting a
- # pack or index file by anything but gc would be bad!
- debug "Setting start date on pack dir: $start_date"
- touch -c -d "$start_date" "$repo/objects/pack"
-
-
- if [ -z "$SW_REFS" ] ; then
- repacked=$(find "$repo/packed-refs" -newer "$repo/objects/pack" \
- -print -quit 2>/dev/null)
- if [ -n "$repacked" ] ; then
- # The ref dirs and packed-ref files seem to get touched even on
- # a noop refpacking
- debug "Setting start date on packed-refs"
- touch -c -d "$start_date" "$repo/packed-refs"
- touch_refs "$repo" "$start_date" "$refdirs"
-
- # A ref repack does not imply a ref change, but since it is
- # hard to tell, simply assume so
- if [ "$refs" != "$(cd "$repo/refs" ; find -depth)" ] || \
- [ "$packedrefs" != "$(<"$repo/packed-refs")" ] ; then
- # We retouch if needed (instead of simply checking then
- # touching) to avoid a race between the check and the set.
- debug " but refs actually got packed, so retouch packed-refs"
- touch -c "$repo/packed-refs"
- fi
- fi
- fi
-}
-
-note_consolidate() { # note entry > note (no duplicated consolidated entries)
- local note=$1 entry=$2
- local entries=() ifs=$IFS
- if echo "$note" | grep -q 'Consolidate with:[0-9,c]' ; then
- IFS=,
- entries=( $(echo "$note" | sed -es'/^.*Consolidate with:\([0-9,c]*\).*$/\1/') )
- note=( $(echo "$note" | sed -es'/Consolidate with:[0-9,c]*//') )
- IFS=$ifs
- fi
- entries=( $(unique "${entries[@]}" "$entry") )
- echo "$note Consolidate with:$(outfs , "${entries[@]}")"
-}
-
-note_toofar() { # note entry > note (no duplicated "too far" entries)
- local note=$1 entry=$2
- local entries=() ifs=$IFS
- if echo "$note" | grep -q '(too far from:[0-9,c]*)' ; then
- IFS=,
- entries=( $(echo "$note" | sed -es'/^.*(too far from:\([0-9,c]*\)).*$/\1/') )
- note=( $(echo "$note" | sed -es'/(too far from:[0-9,c]*)//') )
- IFS=$ifs
- fi
- entries=( $(unique "${entries[@]}" "$entry") )
- echo "$note (too far from:$(outfs , "${entries[@]}"))"
-}
-
-last_entry() { # isRepack pline repackline > last_rows_entry
- local size_hit=$1 pline=$2 repackline=$3
- if [ -n "$pline" ] ; then
- if [ -n "$size_hit" ] ; then
- echo "$repackline"
- else
- echo "$pline"
- fi
- fi
-}
-
-init_list() { # git.repo > shortlist
- local repo=$1
- local file
- local n has_keep size sha repack
-
- packs_sizes "$1" | {
- while read size file ; do
- n=$((n+1))
- repack=n
- has_keep=-
- if has_keep "$file" ; then
- has_keep=k
- has_ourkeep "$file" && has_keep=o
- fi
- sha=$(sha_for "$file")
- echo "$n $has_keep $size $sha $repack"
- done
- } | sort_list
-}
-
-consolidate_list() { # run < list > list
- local run=$1
- local sum=0 psize=0 sum_size=0 size_hit pn clist pline repackline
- local n has_keep size sha repack down up note
-
- {
- while read n has_keep size sha repack down up note; do
- [ -z "$up" ] && up='-'
- [ -z "$down" ] && down="-"
-
- if [ "$has_keep" = "k" ] ; then
- echo "$n $has_keep $size $sha $repack - - Private"
- continue
- fi
-
- if [ "$repack" = "n" ] ; then
- if is_tooclose $psize $size ; then
- size_hit=y
- repack=y
- sum=$(($sum + $sum_size + $size))
- sum_size=0 # Prevents double summing this entry
- clist=($(unique "${clist[@]}" $pn $n))
- down="^"
- [ "$has_keep" = "-" ] && note="$note New +"
- note=$(note_consolidate "$note" "$pn")
- elif [ "$has_keep" = "-" ] ; then
- repack=y
- sum=$(($sum + $size))
- sum_size=0 # Prevents double summing this entry
- clist=($(unique "${clist[@]}" $n))
- note="$note New"
- elif [ $psize -ne 0 ] ; then
- sum_size=$size
- down="!"
- note=$(note_toofar "$note" "$pn")
- else
- sum_size=$size
- fi
- else
- sum_size=$size
- fi
-
- # By preventing "c files" (consolidated) from being marked
- # "repack" they won't get keeps
- repack2=y
- [ "${n/c}" != "$n" ] && { repack=- ; repack2=- ; }
-
- last_entry "$size_hit" "$pline" "$repack_line"
- # Delay the printout until we know whether we are
- # being consolidated with the entry following us
- # (we won't know until the next iteration).
- # size_hit is used to determine which of the lines
- # below will actually get printed above on the next
- # iteration.
- pline="$n $has_keep $size $sha $repack $down $up $note"
- repack_line="$n $has_keep $size $sha $repack2 $down v $note"
-
- pn=$n ; psize=$size # previous entry data
- size_hit='' # will not be consolidated up
-
- done
- last_entry "$size_hit" "$pline" "$repack_line"
-
- [ $sum -gt 0 ] && echo "c$run - $sum [$(outfs , "${clist[@]}")] - - -"
-
- } | sort_list
-}
-
-process_list() { # git.repo > list
- local list=$(init_list "$1") plist run=0
-
- while true ; do
- plist=$list
- run=$((run +1))
- list=$(echo "$list" | consolidate_list "$run")
- if [ "$plist" != "$list" ] ; then
- debug "------------------------------------------------------------------------------------"
- debug "$HEADER"
- debug "$list"
- else
- break
- fi
- done
- debug "------------------------------------------------------------------------------------"
- echo "$list"
-}
-
-repack_list() { # git.repo < list
- local repo=$1
- local start_date newpacks=0 pkeeps keeps=1 refs refdirs rtn
- local packedrefs=$(<"$repo/packed-refs")
-
- # so they don't appear touched after a noop refpacking
- if [ -z "$SW_REFS" ] ; then
- refs=$(cd "$repo/refs" ; find -depth)
- refdirs=$(cd "$repo/refs" ; find -type d -depth)
- debug "Before refs:"
- debug "$refs"
- fi
-
- # Find a private keep snapshot which has not changed from
- # before our start_date so private keep deletions during gc
- # can be detected
- while ! array_equals pkeeps "${keeps[@]}" ; do
- debug "Getting a private keep snapshot"
- private_keeps "$repo"
- keeps=("${pkeeps[@]}")
- debug "before keeps: ${keeps[*]}"
- start_date=$(date)
- private_keeps "$repo"
- debug "after keeps: ${pkeeps[*]}"
- done
-
- while read n has_keep size sha repack down up note; do
- if [ "$repack" = "y" ] ; then
- keep="$repo/objects/pack/pack-$sha.keep"
- info "Repacking $repo/objects/pack/pack-$sha.pack"
- [ -f "$keep" ] && rm -f "$keep"
- fi
- done
-
- ( cd "$repo" && git gc "${GC_OPTS[@]}" ) ; rtn=$?
-
- # Mark any files without a .keep with our .keep
- packs=("$repo"/objects/pack/pack-$SHA1.pack)
- for pack in "${packs[@]}" ; do
- if keep "$pack" ; then
- info "New pack: $pack"
- newpacks=$((newpacks+1))
- fi
- done
-
- # Record start_time. If there is more than 1 new packfile, we
- # don't want to risk touching it with an older date since that
- # would prevent consolidation on the next run. If the private
- # keeps have changed, then we should run next time no matter what.
- if [ $newpacks -le 1 ] || ! array_equals pkeeps "${keeps[@]}" ; then
- set_start_date "$repo" "$start_date" "$refs" "$refdirs" "$packedrefs" "${packs[@]}"
- fi
-
- return $rtn # we really only care about the gc error code
-}
-
-git_gc() { # git.repo
- local list=$(process_list "$1")
- if [ -z "$SW_V" ] ; then
- info "Running $PROG on $1. git gc options: ${GC_OPTS[@]}"
- echo "$HEADER" >&2
- echo "$list" >&2 ;
- fi
- echo "$list" | repack_list "$1"
-}
-
-
-PROG=$(basename "$0")
-HEADER="Id Keep Size Sha1(or consolidation list) Actions(repack down up note)"
-KEEP=git-exproll
-HEX='[0-9a-f]'
-HEX10=$HEX$HEX$HEX$HEX$HEX$HEX$HEX$HEX$HEX$HEX
-SHA1=$HEX10$HEX10$HEX10$HEX10
-
-RATIO=10
-SW_N='' ; SW_V='' ; SW_T='' ; SW_REFS='' ; SW_LOOSE='' ; GC_OPTS=()
-while [ $# -gt 0 ] ; do
- case "$1" in
- -u|-h) usage ;;
- -n) SW_N="$1" ;;
- -v) SW_V="$1" ;;
-
- -t) SW_T="$1" ;;
- --norefs) SW_REFS="$1" ;;
- --noloose) SW_LOOSE="$1" ;;
-
- -r|--ratio) shift ; RATIO="$1" ;;
-
- *) [ $# -le 1 ] && break
- GC_OPTS=( "${GC_OPTS[@]}" "$1" )
- ;;
- esac
- shift
-done
-
-
-REPO="$1"
-if ! is_repo "$REPO" ; then
- REPO=$REPO/.git
- is_repo "$REPO" || usage "($1) is not likely a git repo"
-fi
-
-
-if [ -z "$SW_N" ] ; then
- is_touched "$REPO" || { info "Repo untouched since last run" ; exit ; }
- git_gc "$REPO"
-else
- is_touched "$REPO" || info "Repo untouched since last run, analyze anyway."
- process_list "$REPO" >&2
-fi
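The consolidation heuristic described in the removed script's usage text can be sketched outside the diff. This is a simplified model: it applies the `is_tooclose` rule (smaller size times ratio exceeds the larger size) to a size list sorted ascending, grouping each pack with its predecessor when they are "too close", and it deliberately ignores the script's "c pack" rollup estimation. All names here are illustrative, not part of the original script.

```python
def plan_consolidation(sizes, ratio=10):
    """Group ascending pack sizes into consolidation candidates.

    A pack joins its predecessor's group when prev * ratio > size,
    mirroring the script's is_tooclose() check; otherwise it starts
    a new generation of packfile.
    """
    groups = []
    current = [sizes[0]]
    for prev, size in zip(sizes, sizes[1:]):
        if prev * ratio > size:
            current.append(size)   # close enough: consolidate
        else:
            groups.append(current)  # too far: keep as its own generation
            current = [size]
    groups.append(current)
    return groups
```

With the default ratio of 10, a 100-block and a 500-block pack would be repacked together, while a 20000-block pack is left alone, which matches the usage text's goal of keeping several generations of varying-sized packfiles.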
diff --git a/java/com/google/gerrit/server/account/Accounts.java b/java/com/google/gerrit/server/account/Accounts.java
index 55192e9..ec781e0 100644
--- a/java/com/google/gerrit/server/account/Accounts.java
+++ b/java/com/google/gerrit/server/account/Accounts.java
@@ -14,11 +14,17 @@
package com.google.gerrit.server.account;
+import static com.google.common.collect.ImmutableList.toImmutableList;
+
+import com.google.common.collect.ImmutableList;
import com.google.gerrit.entities.Account;
import java.io.IOException;
+import java.util.ArrayList;
import java.util.Collection;
+import java.util.Collections;
import java.util.List;
import java.util.Optional;
+import java.util.Random;
import java.util.Set;
import org.eclipse.jgit.errors.ConfigInvalidException;
@@ -62,6 +68,19 @@
List<Account.Id> firstNIds(int n) throws IOException;
/**
+ * Returns n random account IDs.
+ *
+ * @param n the number of account IDs that should be returned
+ * @param seed seed that should be used to randomize the order
+ * @return n random account IDs
+ */
+ default ImmutableList<Account.Id> randomNIds(int n, long seed) throws IOException {
+ List<Account.Id> allIds = new ArrayList<>(allIds());
+ Collections.shuffle(allIds, new Random(seed));
+ return allIds.stream().limit(n).collect(toImmutableList());
+ }
+
+ /**
* Checks if any account exists.
*
* @return {@code true} if at least one account exists, otherwise {@code false}.
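For illustration, the behavior of the new `randomNIds` default method (shuffle a copy of all IDs with a seeded RNG, then take the first n) can be modeled like this. This is a sketch, not the Java code: Python's `random.Random` stands in for `java.util.Random`, so the two produce different orderings for the same seed, but the determinism and subset properties are the same.

```python
import random

def random_n_ids(all_ids, n, seed):
    # Shuffle a copy with a seeded RNG, then take the first n elements,
    # mirroring Collections.shuffle(allIds, new Random(seed)) + limit(n).
    ids = list(all_ids)
    random.Random(seed).shuffle(ids)
    return ids[:n]
```

The seed makes the selection reproducible: two calls with the same seed return the same IDs, and when n exceeds the number of accounts the full (shuffled) list is returned.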
diff --git a/java/com/google/gerrit/server/account/storage/notedb/AccountsNoteDbImpl.java b/java/com/google/gerrit/server/account/storage/notedb/AccountsNoteDbImpl.java
index 28d3a43..02ad518 100644
--- a/java/com/google/gerrit/server/account/storage/notedb/AccountsNoteDbImpl.java
+++ b/java/com/google/gerrit/server/account/storage/notedb/AccountsNoteDbImpl.java
@@ -51,6 +51,7 @@
import org.eclipse.jgit.lib.ObjectId;
import org.eclipse.jgit.lib.Repository;
+/** NoteDb-based implementation of {@link Accounts}. */
@Singleton
public class AccountsNoteDbImpl implements Accounts {
private static final FluentLogger logger = FluentLogger.forEnclosingClass();
diff --git a/java/com/google/gerrit/server/experiments/ExperimentFeaturesConstants.java b/java/com/google/gerrit/server/experiments/ExperimentFeaturesConstants.java
index 0a64db7..a495882 100644
--- a/java/com/google/gerrit/server/experiments/ExperimentFeaturesConstants.java
+++ b/java/com/google/gerrit/server/experiments/ExperimentFeaturesConstants.java
@@ -75,4 +75,8 @@
/** Whether AI chat/review features are enabled in the UI. */
public static final String ENABLE_AI_CHAT = "UiFeature__enable_ai_chat";
+
+ /** Whether we restrict the creation of branch permissions. */
+ public static final String GERRIT_BACKEND_FEATURE_RESTRICT_BRANCH_PERMISSIONS =
+ "GerritBackendFeature__restrict_branch_permissions";
}
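Experiment constants like the one added above are typically consulted at runtime through a feature-flag lookup. The following is a minimal, hypothetical model of that gating; the class and method names are illustrative, not Gerrit's actual API.

```python
RESTRICT_BRANCH_PERMISSIONS = "GerritBackendFeature__restrict_branch_permissions"

class ExperimentFeatures:
    """Illustrative flag registry; a real server would read configuration."""

    def __init__(self, enabled=()):
        self._enabled = set(enabled)

    def is_feature_enabled(self, name):
        # Feature is on only if explicitly listed as enabled.
        return name in self._enabled

features = ExperimentFeatures([RESTRICT_BRANCH_PERMISSIONS])
```

Code guarded by such a flag can then branch on `features.is_feature_enabled(RESTRICT_BRANCH_PERMISSIONS)` and fall back to the old behavior when the experiment is off.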
diff --git a/javatests/com/google/gerrit/acceptance/api/accounts/AccountIT.java b/javatests/com/google/gerrit/acceptance/api/accounts/AccountIT.java
index 4272a88..67f33e8 100644
--- a/javatests/com/google/gerrit/acceptance/api/accounts/AccountIT.java
+++ b/javatests/com/google/gerrit/acceptance/api/accounts/AccountIT.java
@@ -300,7 +300,7 @@
if (repo.getRefDatabase().exactRef(ref) != null) {
RefUpdate ru = repo.updateRef(ref);
ru.setForceUpdate(true);
- assertWithMessage("Failed to delete " + ref)
+ assertWithMessage("Failed to delete %s", ref)
.that(ru.delete())
.isEqualTo(RefUpdate.Result.FORCED);
}
@@ -569,6 +569,20 @@
}
@Test
+ public void randomNIds() throws Exception {
+ accountOperations.newAccount().create();
+ accountOperations.newAccount().create();
+ accountOperations.newAccount().create();
+
+ ImmutableList<Account.Id> result = accounts.randomNIds(2, 12345L);
+
+ assertThat(result).hasSize(2);
+ for (Account.Id id : result) {
+ assertThat(gApi.accounts().id(id.get()).get()).isNotNull();
+ }
+ }
+
+ @Test
public void get() throws Exception {
AccountIndexedCounter accountIndexedCounter = getAccountIndexedCounter();
try (Registration registration =
diff --git a/javatests/com/google/gerrit/acceptance/api/change/ChangeIT.java b/javatests/com/google/gerrit/acceptance/api/change/ChangeIT.java
index d716006..09b4933 100644
--- a/javatests/com/google/gerrit/acceptance/api/change/ChangeIT.java
+++ b/javatests/com/google/gerrit/acceptance/api/change/ChangeIT.java
@@ -1220,8 +1220,7 @@
@Test
public void deleteAllForProjectNotifiesListeners() {
- TestDeleteAllForProjectListener deleteAllForProjectsListener =
- new TestDeleteAllForProjectListener();
+ TestDeleteForProjectListener deleteAllForProjectsListener = new TestDeleteForProjectListener();
try (Registration ignore =
extensionRegistry.newRegistration().add(deleteAllForProjectsListener)) {
@@ -1229,7 +1228,26 @@
indexer.deleteAllForProject(project);
}
- assertThat(deleteAllForProjectsListener.getFiredCount()).isEqualTo(1);
+ assertThat(deleteAllForProjectsListener.getAllChangesPerProjectDeletedFiredCount())
+ .isEqualTo(1);
+ }
+
+ @SuppressWarnings("FutureReturnValueIgnored")
+ @Test
+ public void deleteChangeFromIndexNotifiesListenersWithoutProjectName() {
+ TestChange change = changeOperations.newChange().createAndGet();
+ TestDeleteForProjectListener deleteAllForProjectsListener = new TestDeleteForProjectListener();
+ String projectName = "my-test-project";
+
+ try (Registration ignore =
+ extensionRegistry.newRegistration().add(deleteAllForProjectsListener)) {
+
+ var unused =
+ indexer.deleteAsync(Project.NameKey.parse(projectName), change.numericChangeId());
+ }
+
+ assertThat(deleteAllForProjectsListener.getSingleChangeDeletedFiredCount()).isEqualTo(1);
+ assertThat(deleteAllForProjectsListener.getReceivedProjectName()).isNull();
}
@Test
@@ -5417,22 +5435,34 @@
}
}
- public static class TestDeleteAllForProjectListener implements ChangeIndexedListener {
- private final AtomicInteger firedCount = new AtomicInteger(0);
+ public static class TestDeleteForProjectListener implements ChangeIndexedListener {
+ private final AtomicInteger allChangesPerProjectDeletedFiredCount = new AtomicInteger(0);
+ private final AtomicInteger singleChangeDeletedFiredCount = new AtomicInteger(0);
+ private String receivedProjectName = null;
@Override
public void onChangeIndexed(String projectName, int id) {}
@Override
- public void onChangeDeleted(int id) {}
+ public void onChangeDeleted(int id) {
+ singleChangeDeletedFiredCount.incrementAndGet();
+ }
@Override
public void onAllChangesDeletedForProject(String projectName) {
- firedCount.incrementAndGet();
+ allChangesPerProjectDeletedFiredCount.incrementAndGet();
}
- public int getFiredCount() {
- return firedCount.get();
+ public int getAllChangesPerProjectDeletedFiredCount() {
+ return allChangesPerProjectDeletedFiredCount.get();
+ }
+
+ public int getSingleChangeDeletedFiredCount() {
+ return singleChangeDeletedFiredCount.get();
+ }
+
+ public String getReceivedProjectName() {
+ return receivedProjectName;
}
}
diff --git a/modules/java-prettify b/modules/java-prettify
index 32fa081..1c0ef60 160000
--- a/modules/java-prettify
+++ b/modules/java-prettify
@@ -1 +1 @@
-Subproject commit 32fa081a797a97beaf77a4f2efca26c39168e72f
+Subproject commit 1c0ef60424995a24452ce5ccd54f60c7eb9a5051
diff --git a/plugins/commit-message-length-validator b/plugins/commit-message-length-validator
index 4be14bd..1e08c1e 160000
--- a/plugins/commit-message-length-validator
+++ b/plugins/commit-message-length-validator
@@ -1 +1 @@
-Subproject commit 4be14bd0ec01f9b39e4565e262487c73aaf485ba
+Subproject commit 1e08c1ef59b4c812ee046746869e0debc7569d9e
diff --git a/plugins/replication b/plugins/replication
index 9b78af8..87a975d 160000
--- a/plugins/replication
+++ b/plugins/replication
@@ -1 +1 @@
-Subproject commit 9b78af806be9cc7131a856f81051ee78d4f8c0e9
+Subproject commit 87a975dd6a62d383fc2d3914162365976123e1c2
diff --git a/polygerrit-ui/app/constants/reporting.ts b/polygerrit-ui/app/constants/reporting.ts
index 5502a73..1b3c5c0 100644
--- a/polygerrit-ui/app/constants/reporting.ts
+++ b/polygerrit-ui/app/constants/reporting.ts
@@ -103,6 +103,8 @@
COPY_TO_CLIPBOARD = 'CopyToClipboard',
// Time to autocomplete a comment
COMMENT_COMPLETION = 'CommentCompletion',
+ // Time for AI chat requests to complete
+ AI_CHAT_REQUEST = 'AiChatRequest',
}
export enum Interaction {
@@ -184,6 +186,8 @@
FLOWS_TAB_RENDERED = 'flows-tab-rendered',
CREATE_FLOW_DIALOG_OPENED = 'create-flow-dialog-opened',
FLOW_CREATED = 'flow-created',
+ // AI Chat interaction request failures
+ AI_CHAT_FAILURE = 'ai-chat-failure',
}
/**
diff --git a/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions.ts b/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions.ts
index 2e76f7c..530cbcd 100644
--- a/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions.ts
+++ b/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions.ts
@@ -175,6 +175,7 @@
export const QUICK_APPROVE_ACTION: QuickApproveUIActionInfo = {
__key: 'review',
__type: ActionType.CHANGE,
+ __primary: true,
enabled: true,
key: 'review',
label: 'Quick approve',
@@ -881,6 +882,7 @@
}
private renderUIAction(action: UIActionInfo) {
+ const disabled = this.calculateDisabled(action);
return html`
<gr-tooltip-content
title=${ifDefined(action.title)}
@@ -888,11 +890,12 @@
?position-below=${true}
>
<gr-button
- link
+ ?link=${!action.__primary || disabled}
class=${action.__key}
data-action-key=${action.__key}
data-label=${action.label}
- ?disabled=${this.calculateDisabled(action)}
+ ?primary=${action.__primary}
+ ?disabled=${disabled}
@click=${(e: MouseEvent) =>
this.handleActionTap(e, action.__key, action.__type)}
>
diff --git a/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions_test.ts b/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions_test.ts
index 202939b..e8aa31f 100644
--- a/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions_test.ts
+++ b/polygerrit-ui/app/elements/change/gr-change-actions/gr-change-actions_test.ts
@@ -80,6 +80,12 @@
>;
let chatModel: ChatModel;
+ teardown(() => {
+ for (const el of document.body.querySelectorAll('gr-tooltip')) {
+ el.remove();
+ }
+ });
+
suite('basic tests', () => {
setup(async () => {
stubRestApi('getChangeRevisionActions').returns(
@@ -170,7 +176,7 @@
class="submit"
data-action-key="submit"
data-label="Submit"
- link=""
+ primary=""
role="button"
tabindex="0"
>
@@ -2191,10 +2197,10 @@
assert.isTrue(element._hideQuickApproveAction);
});
- test('is first in list of secondary actions', () => {
+ test('is first in list of primary actions', () => {
const approveButton = queryAndAssert<HTMLElement>(
element,
- '#secondaryActions'
+ '#primaryActions'
).querySelector('gr-button');
assert.equal(approveButton!.getAttribute('data-label'), 'foo+1');
});
diff --git a/polygerrit-ui/app/elements/change/gr-change-view/gr-change-view_screenshot_test.ts b/polygerrit-ui/app/elements/change/gr-change-view/gr-change-view_screenshot_test.ts
index 713a1ea..97568af 100644
--- a/polygerrit-ui/app/elements/change/gr-change-view/gr-change-view_screenshot_test.ts
+++ b/polygerrit-ui/app/elements/change/gr-change-view/gr-change-view_screenshot_test.ts
@@ -323,6 +323,15 @@
revert_of: 12345 as NumericChangeId,
submittable: true,
subject: 'Reland "Add initial jj support to `gclient sync`."',
+ actions: {
+ ...element.change?.actions,
+ submit: {
+ method: HttpMethod.POST,
+ label: 'Submit',
+ title: 'Submit the change',
+ enabled: true,
+ },
+ },
});
await element.updateComplete;
@@ -348,6 +357,10 @@
);
await visualDiff(container, 'gr-change-view-wrapped-statuses-801px');
+ await visualDiffDarkTheme(
+ container,
+ 'gr-change-view-wrapped-statuses-801px'
+ );
} finally {
document.body.removeChild(container);
}
diff --git a/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents.ts b/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents.ts
index b185ca2..9b21725 100644
--- a/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents.ts
+++ b/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents.ts
@@ -5,6 +5,7 @@
*/
import '../gr-avatar/gr-avatar';
import '../gr-button/gr-button';
+import '../gr-copy-clipboard/gr-copy-clipboard';
import '../gr-icon/gr-icon';
import '../../plugins/gr-endpoint-decorator/gr-endpoint-decorator';
import '../../plugins/gr-endpoint-param/gr-endpoint-param';
@@ -137,6 +138,12 @@
.email {
color: var(--deemphasized-text-color);
}
+ .email {
+ display: flex;
+ align-items: center;
+ gap: var(--spacing-xs);
+ color: var(--deemphasized-text-color);
+ }
.action {
border-top: 1px solid var(--border-color);
padding: var(--spacing-s) var(--spacing-l);
@@ -189,7 +196,18 @@
</div>
<div class="account">
<h3 class="name heading-3">${this.account.name}</h3>
- <div class="email">${this.account.email}</div>
+ <div class="email">
+ <span>${this.account.email}</span>
+ ${this.account.email
+ ? html`<gr-copy-clipboard
+ hideInput
+ .text=${this.account.email}
+ copyTargetName="Email"
+ hasTooltip
+ buttonTitle="Copy email to clipboard"
+ ></gr-copy-clipboard>`
+ : nothing}
+ </div>
</div>
</div>
${this.renderAccountStatusPlugins()} ${this.renderAccountStatus()}
diff --git a/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents_test.ts b/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents_test.ts
index 6f3fe36..925ec17 100644
--- a/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents_test.ts
+++ b/polygerrit-ui/app/elements/shared/gr-hovercard-account/gr-hovercard-account-contents_test.ts
@@ -68,7 +68,16 @@
</div>
<div class="account">
<h3 class="heading-3 name">Kermit The Frog</h3>
- <div class="email">kermit@gmail.com</div>
+ <div class="email">
+ <span>kermit@gmail.com</span>
+ <gr-copy-clipboard
+ buttontitle="Copy email to clipboard"
+ copytargetname="Email"
+ hastooltip=""
+ hideinput=""
+ >
+ </gr-copy-clipboard>
+ </div>
</div>
</div>
<gr-endpoint-decorator name="hovercard-status">
@@ -110,7 +119,16 @@
</div>
<div class="account">
<h3 class="heading-3 name">Kermit The Frog</h3>
- <div class="email">kermit@gmail.com</div>
+ <div class="email">
+ <span>kermit@gmail.com</span>
+ <gr-copy-clipboard
+ buttontitle="Copy email to clipboard"
+ copytargetname="Email"
+ hastooltip=""
+ hideinput=""
+ >
+ </gr-copy-clipboard>
+ </div>
</div>
</div>
<gr-endpoint-decorator name="hovercard-status">
@@ -213,6 +231,17 @@
assert.equal(voteableEl.innerText, 'Bar: +1');
});
+ test('copy email clipboard is shown when email exists', () => {
+ const copyClipboard = queryAndAssert(element, 'gr-copy-clipboard');
+ assert.isOk(copyClipboard);
+ });
+
+ test('copy email clipboard is not shown without email', async () => {
+ element.account = {...ACCOUNT, email: undefined};
+ await element.updateComplete;
+ assert.isUndefined(query(element, 'gr-copy-clipboard'));
+ });
+
test('remove reviewer', async () => {
element.change = {
...createChange(),
diff --git a/polygerrit-ui/app/models/chat/chat-model.ts b/polygerrit-ui/app/models/chat/chat-model.ts
index 619424b..2711edb 100644
--- a/polygerrit-ui/app/models/chat/chat-model.ts
+++ b/polygerrit-ui/app/models/chat/chat-model.ts
@@ -37,6 +37,8 @@
import {contextItemEquals} from './context-item-util';
import {FilesModel, NormalizedFileInfo} from '../change/files-model';
import {isMagicPath} from '../../utils/path-list-util';
+import {getAppContext} from '../../services/app-context';
+import {Interaction, Timing} from '../../constants/reporting';
/** The available display modes in the chat panel. */
export enum ChatPanelMode {
@@ -616,6 +618,19 @@
});
},
emitError: (errorMessage: string) => {
+ getAppContext().reportingService.timeEnd(Timing.AI_CHAT_REQUEST, {
+ modelName: request.model_name,
+ actionId: action.id,
+ error: errorMessage,
+ });
+ getAppContext().reportingService.reportInteraction(
+ Interaction.AI_CHAT_FAILURE,
+ {
+ modelName: request.model_name,
+ actionId: action.id,
+ error: errorMessage,
+ }
+ );
const state = this.getState();
if (state.id !== conversationId) return;
const turns: readonly Turn[] = state.turns;
@@ -630,6 +645,10 @@
});
},
done: () => {
+ getAppContext().reportingService.timeEnd(Timing.AI_CHAT_REQUEST, {
+ modelName: request.model_name,
+ actionId: action.id,
+ });
const state = this.getState();
if (state.id !== conversationId) return;
assert(turnIndex < state.turns.length, 'turn index out of bounds');
@@ -642,6 +661,7 @@
});
},
};
+ getAppContext().reportingService.time(Timing.AI_CHAT_REQUEST);
this.plugin?.chat?.(request, listener);
}
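The chat-model hunk above brackets the plugin call with a timer: `time(Timing.AI_CHAT_REQUEST)` fires just before `plugin.chat(request, listener)`, and whichever listener callback ends the request (`done` or `emitError`) calls `timeEnd` with the model/action context, with `emitError` additionally logging an `AI_CHAT_FAILURE` interaction. A minimal standalone sketch of that pairing, using a hypothetical in-memory `FakeReporting` stand-in (not the real `reportingService`) and generic string event names:

```typescript
// Hypothetical in-memory stand-in for the reporting service, used only
// to illustrate how the chat model pairs time()/timeEnd() around a request.
class FakeReporting {
  readonly events: Array<{type: string; details?: object}> = [];
  time(name: string) {
    this.events.push({type: `time:${name}`});
  }
  timeEnd(name: string, details?: object) {
    this.events.push({type: `timeEnd:${name}`, details});
  }
  reportInteraction(name: string, details?: object) {
    this.events.push({type: `interaction:${name}`, details});
  }
}

// Sketch of the request flow: start the timer before handing the listener
// to the plugin, then stop it from whichever callback fires.
function runChatRequest(
  reporting: FakeReporting,
  chat: (listener: {done: () => void; emitError: (msg: string) => void}) => void,
  context: {modelName: string; actionId: string}
) {
  reporting.time('AI_CHAT_REQUEST');
  chat({
    done: () => reporting.timeEnd('AI_CHAT_REQUEST', context),
    emitError: (error: string) => {
      reporting.timeEnd('AI_CHAT_REQUEST', {...context, error});
      reporting.reportInteraction('AI_CHAT_FAILURE', {...context, error});
    },
  });
}
```

Note that, as in the diff, the timer is stopped on both paths, so a request that errors out still contributes a latency sample; only the failure path emits the extra interaction event.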
diff --git a/polygerrit-ui/app/models/chat/chat-model_test.ts b/polygerrit-ui/app/models/chat/chat-model_test.ts
index 9f70830..d244bd4 100644
--- a/polygerrit-ui/app/models/chat/chat-model_test.ts
+++ b/polygerrit-ui/app/models/chat/chat-model_test.ts
@@ -16,6 +16,8 @@
import sinon from 'sinon';
import {ParsedChangeInfo} from '../../types/types';
+import {getAppContext} from '../../services/app-context';
+import {Interaction, Timing} from '../../constants/reporting';
suite('chat-model tests', () => {
let model: ChatModel;
@@ -345,4 +347,99 @@
const state = model.getState();
assert.equal(state.turns[0].geminiMessage.regenerationIndex, 0);
});
+
+ suite('telemetry reporting', () => {
+ let timeStub: sinon.SinonStub;
+ let timeEndStub: sinon.SinonStub;
+ let reportInteractionStub: sinon.SinonStub;
+
+ setup(() => {
+ timeStub = sinon.stub(getAppContext().reportingService, 'time');
+ timeEndStub = sinon.stub(getAppContext().reportingService, 'timeEnd');
+ reportInteractionStub = sinon.stub(
+ getAppContext().reportingService,
+ 'reportInteraction'
+ );
+
+ // Set up a change, models, and actions
+ const models = {
+ models: [
+ {
+ model_id: 'test-model',
+ full_display_text: 'Test Model',
+ short_text: 'Test',
+ },
+ ],
+ default_model_id: 'test-model',
+ };
+ const actions = {
+ actions: [
+ {
+ id: 'test-action',
+ display_text: 'Test Action',
+ initial_user_prompt: 'Test Prompt',
+ },
+ ],
+ default_action_id: 'test-action',
+ };
+ (provider.getActions as sinon.SinonStub).resolves(actions);
+ (provider.getModels as sinon.SinonStub).resolves(models);
+
+ changeModel.updateStateChange(createParsedChange());
+ });
+
+ test('chat request starts a timer', async () => {
+ await new Promise(resolve => setTimeout(resolve, 0));
+
+ model.updateUserInput('hello');
+ model.chat('hello', 'test-action', 0);
+
+ assert.isTrue(timeStub.calledOnceWith(Timing.AI_CHAT_REQUEST));
+ });
+
+ test('chat request success stops the timer', async () => {
+ await new Promise(resolve => setTimeout(resolve, 0));
+
+ (provider.chat as sinon.SinonStub).callsFake((_, listener) => {
+ listener.done();
+ });
+
+ model.updateUserInput('hello');
+ model.chat('hello', 'test-action', 0);
+
+ assert.isTrue(
+ timeEndStub.calledOnceWith(Timing.AI_CHAT_REQUEST, {
+ modelName: 'test-model',
+ actionId: 'test-action',
+ })
+ );
+ });
+
+ test('chat request failure stops the timer and logs interaction', async () => {
+ await new Promise(resolve => setTimeout(resolve, 0));
+
+ (provider.chat as sinon.SinonStub).callsFake((_, listener) => {
+ listener.emitError('some error');
+ });
+
+ model.updateUserInput('hello');
+ model.chat('hello', 'test-action', 0);
+
+ assert.isTrue(
+ timeEndStub.calledOnceWith(Timing.AI_CHAT_REQUEST, {
+ modelName: 'test-model',
+ actionId: 'test-action',
+ error: 'some error',
+ })
+ );
+
+ assert.isTrue(
+ reportInteractionStub.calledOnceWith(Interaction.AI_CHAT_FAILURE, {
+ modelName: 'test-model',
+ actionId: 'test-action',
+ error: 'some error',
+ })
+ );
+ });
+ });
});
diff --git a/polygerrit-ui/app/models/checks/checks-util.ts b/polygerrit-ui/app/models/checks/checks-util.ts
index dffa415..ee2eb91 100644
--- a/polygerrit-ui/app/models/checks/checks-util.ts
+++ b/polygerrit-ui/app/models/checks/checks-util.ts
@@ -95,10 +95,12 @@
}
}
-function pleaseFixMessage(result: RunResult) {
- return `Please fix this ${result.category} reported by ${result.checkName}: ${result.summary}
-
-${result.message}`;
+export function pleaseFixMessage(result: RunResult) {
+ const message =
+ result.summary === result.message
+ ? result.message
+ : `${result.summary}\n\n${result.message}`;
+ return `Please fix this ${result.category} reported by ${result.checkName}: ${message}`;
}
/**
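The rewritten `pleaseFixMessage` above deduplicates the prompt when a check result's `summary` and `message` carry identical text. A standalone sketch of that logic (with a simplified `ResultLike` shape standing in for the real `RunResult`), matching the behavior the new tests below assert:

```typescript
// Simplified stand-in for RunResult: only the fields the prompt uses.
interface ResultLike {
  category: string;
  checkName: string;
  summary: string;
  message: string;
}

// Mirrors the deduplication in checks-util.ts: identical summary/message
// text appears once; otherwise both are joined with a blank line.
function pleaseFixMessage(result: ResultLike): string {
  const message =
    result.summary === result.message
      ? result.message
      : `${result.summary}\n\n${result.message}`;
  return `Please fix this ${result.category} reported by ${result.checkName}: ${message}`;
}
```

Before this change the prompt always concatenated both fields, so providers that set `message` equal to `summary` produced a doubled text in the AI fix request.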
diff --git a/polygerrit-ui/app/models/checks/checks-util_test.ts b/polygerrit-ui/app/models/checks/checks-util_test.ts
index 2473d45..7226528 100644
--- a/polygerrit-ui/app/models/checks/checks-util_test.ts
+++ b/polygerrit-ui/app/models/checks/checks-util_test.ts
@@ -11,6 +11,7 @@
AttemptChoice,
computeIsExpandable,
LATEST_ATTEMPT,
+ pleaseFixMessage,
rectifyFix,
reportAiAgentCommentDraft,
reportAiAgentGetAIFix,
@@ -19,7 +20,7 @@
toComment,
} from './checks-util';
import {Interaction} from '../../constants/reporting';
-import {Fix, Replacement} from '../../api/checks';
+import {Category, Fix, Replacement} from '../../api/checks';
import {PROVIDED_FIX_ID} from '../../utils/comment-util';
import {CommentRange, RevisionPatchSetNum} from '../../api/rest-api';
import {ReportingService} from '../../services/gr-reporting/gr-reporting';
@@ -353,4 +354,34 @@
assert.isFalse(reportInteractionStub.called);
});
});
+
+ suite('pleaseFixMessage', () => {
+ test('when summary and message are the same', () => {
+ const result: RunResult = {
+ ...createRunResult(),
+ category: Category.WARNING,
+ checkName: 'test-check-name',
+ summary: 'this is the warning text',
+ message: 'this is the warning text',
+ };
+ assert.equal(
+ pleaseFixMessage(result),
+ 'Please fix this WARNING reported by test-check-name: this is the warning text'
+ );
+ });
+
+ test('when summary and message are not the same', () => {
+ const result: RunResult = {
+ ...createRunResult(),
+ category: Category.ERROR,
+ checkName: 'test-check-name',
+ summary: 'this is the summary text',
+ message: 'this is the message body text',
+ };
+ assert.equal(
+ pleaseFixMessage(result),
+ 'Please fix this ERROR reported by test-check-name: this is the summary text\n\nthis is the message body text'
+ );
+ });
+ });
});
diff --git a/polygerrit-ui/app/node_modules_licenses/licenses.ts b/polygerrit-ui/app/node_modules_licenses/licenses.ts
index 3189bcb..9dcf5a9 100644
--- a/polygerrit-ui/app/node_modules_licenses/licenses.ts
+++ b/polygerrit-ui/app/node_modules_licenses/licenses.ts
@@ -332,7 +332,7 @@
license: {
name: 'marked',
type: LicenseTypes.Mit,
- packageLicenseFile: 'LICENSE.md',
+ packageLicenseFile: 'LICENSE',
},
},
{
diff --git a/polygerrit-ui/app/package.json b/polygerrit-ui/app/package.json
index cb179ce..a726721 100644
--- a/polygerrit-ui/app/package.json
+++ b/polygerrit-ui/app/package.json
@@ -18,7 +18,7 @@
"highlightjs-vue": "https://github.com/paladox/highlightjs-vue#44eed074ea0110d1ad03d2cbd77d27027cf7bb04",
"immer": "^9.0.21",
"lit": "^3.3.1",
- "marked": "^17.0.1",
+ "marked": "^18.0.3",
"polymer-bridges": "file:../../polymer-bridges",
"polymer-resin": "^2.0.1",
"resemblejs": "rsmbl/Resemble.js#66a55c5bfc3bda2303ad632ee8ce3c727b415917",
@@ -39,4 +39,4 @@
},
"license": "Apache-2.0",
"private": true
-}
\ No newline at end of file
+}
diff --git a/polygerrit-ui/app/yarn.lock b/polygerrit-ui/app/yarn.lock
index 7779083..cbdd3f2 100644
--- a/polygerrit-ui/app/yarn.lock
+++ b/polygerrit-ui/app/yarn.lock
@@ -223,10 +223,10 @@
lit-element "^4.2.0"
lit-html "^3.3.0"
-marked@^17.0.1:
- version "17.0.1"
- resolved "https://registry.yarnpkg.com/marked/-/marked-17.0.1.tgz#9db34197ac145e5929572ee49ef701e37ee9b2e6"
- integrity sha512-boeBdiS0ghpWcSwoNm/jJBwdpFaMnZWRzjA6SkUMYb40SVaN1x7mmfGKp0jvexGcx+7y2La5zRZsYFZI6Qpypg==
+marked@^18.0.3:
+ version "18.0.3"
+ resolved "https://registry.yarnpkg.com/marked/-/marked-18.0.3.tgz#278b5ba89f1c7ccbaf0422f3ee8955928489220b"
+ integrity sha512-7VT90JOkDeaRWpfjOReRGPEKn0ecdARBkDGL+tT1wZY0efPPqkUxLUSmzy/C7TIylQYJC9STISEsCHrqb/7VIA==
mimic-response@^3.1.0:
version "3.1.0"
diff --git a/polygerrit-ui/resultdb-reporter.mjs b/polygerrit-ui/resultdb-reporter.mjs
new file mode 100644
index 0000000..125481b
--- /dev/null
+++ b/polygerrit-ui/resultdb-reporter.mjs
@@ -0,0 +1,111 @@
+import fs from 'fs';
+import path from 'path';
+
+export function resultDbReporter() {
+ let sinkCtx = null;
+
+ // 1. Read LUCI_CONTEXT to get the Result Sink address and token
+ const luciCtxFile = process.env['LUCI_CONTEXT'];
+ if (luciCtxFile) {
+ try {
+ const luciCtx = JSON.parse(fs.readFileSync(luciCtxFile, 'utf8'));
+ sinkCtx = luciCtx.result_sink;
+ } catch (e) {
+ console.error('Failed to read LUCI_CONTEXT', e);
+ }
+ }
+
+ async function uploadToResultSink(testResult) {
+ if (!sinkCtx) return;
+
+ const url = `http://${sinkCtx.address}/prpc/luci.resultsink.v1.Sink/ReportTestResults`;
+ const headers = {
+ 'Content-Type': 'application/json',
+ 'Authorization': `ResultSink ${sinkCtx.auth_token}`,
+ };
+
+ const body = JSON.stringify({
+ testResults: [testResult],
+ });
+
+ try {
+ const res = await fetch(url, { method: 'POST', headers, body });
+ if (!res.ok) {
+ console.error('Failed to report to ResultDB', await res.text());
+ }
+ } catch (e) {
+ console.error('Error reporting to ResultDB', e);
+ }
+ }
+
+ return {
+ async reportTestFileResults({ testFile, sessions }) {
+ if (!sinkCtx) return;
+
+ for (const session of sessions) {
+ if (!session.testResults) continue;
+
+ // 2. Flatten the nested Mocha suites/tests
+ const tests = [];
+ function collectTests(suite, parentName = '') {
+ const name = suite.name ? (parentName ? `${parentName} > ${suite.name}` : suite.name) : parentName;
+ if (suite.tests) {
+ for (const t of suite.tests) {
+ tests.push({ ...t, suiteName: name });
+ }
+ }
+ if (suite.suites) {
+ for (const s of suite.suites) {
+ collectTests(s, name);
+ }
+ }
+ }
+ collectTests(session.testResults);
+
+ for (const test of tests) {
+ const testName = test.suiteName ? `${test.suiteName} > ${test.name}` : test.name;
+ const testId = `gerrit > polygerrit-ui > ${testFile} > ${testName}`;
+
+ const status = test.passed ? 'PASS' : 'FAIL';
+ const expected = test.passed;
+
+ let summaryHtml = `<pre>${test.error?.message || ''}\n${test.error?.stack || ''}</pre>`;
+ const artifacts = {};
+
+ // 3. If visual diff failed, extract the diff image and upload it
+ if (!test.passed && test.error && test.error.message.includes('Visual diff failed')) {
+ const match = test.error.message.match(/See diff for details: (\S+)/);
+ if (match && match[1]) {
+ const diffPath = match[1];
+ try {
+ if (fs.existsSync(diffPath)) {
+ const content = fs.readFileSync(diffPath);
+ artifacts['visual_diff'] = {
+ contents: content.toString('base64'), // Bytes must be base64 encoded for JSON pRPC
+ contentType: 'image/png',
+ };
+ // Inline the artifact directly in the summary!
+ summaryHtml += `<br><b>Visual Diff:</b><br><img src="artifact://visual_diff">`;
+ }
+ } catch (e) {
+ console.error('Failed to read visual diff artifact', e);
+ }
+ }
+ }
+
+ await uploadToResultSink({
+ testId,
+ status,
+ expected,
+ summaryHtml,
+ artifacts: Object.keys(artifacts).length > 0 ? artifacts : undefined,
+ duration: test.duration ? {
+ seconds: String(Math.floor(test.duration / 1000)),
+ nanos: (test.duration % 1000) * 1000000,
+ } : undefined,
+ });
+ }
+ }
+ }
+ };
+}
diff --git a/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px-dark.png b/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px-dark.png
new file mode 100644
index 0000000..a379b77
--- /dev/null
+++ b/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px-dark.png
Binary files differ
diff --git a/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px.png b/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px.png
index bf2f836..b6e6dd9 100644
--- a/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px.png
+++ b/polygerrit-ui/screenshots/Chromium/baseline/gr-change-view-wrapped-statuses-801px.png
Binary files differ
diff --git a/polygerrit-ui/web-test-runner.config.mjs b/polygerrit-ui/web-test-runner.config.mjs
index 893d0fb..692bbdc 100644
--- a/polygerrit-ui/web-test-runner.config.mjs
+++ b/polygerrit-ui/web-test-runner.config.mjs
@@ -2,6 +2,7 @@
import fs from 'fs';
import { esbuildPlugin } from '@web/dev-server-esbuild';
import { defaultReporter, summaryReporter } from '@web/test-runner';
+import { resultDbReporter } from './resultdb-reporter.mjs';
import { visualRegressionPlugin } from '@web/test-runner-visual-regression/plugin';
import pixelmatch from 'pixelmatch';
import { PNG } from 'pngjs';
@@ -188,7 +189,11 @@
// /lib/fonts/ for screenshots tests, see middleware.
rootDir: runUnderBazel ? rootDir : '..',
- reporters: [defaultReporter(), summaryReporter()],
+ reporters: [
+ defaultReporter(),
+ summaryReporter(),
+ resultDbReporter(),
+ ],
middleware: [
// Fonts are in /lib/fonts/, but css tries to load from