Merge "Add event interface to Gerrit" into stable-2.16
diff --git a/.bazelversion b/.bazelversion
index 9084fa2..26aaba0 100644
--- a/.bazelversion
+++ b/.bazelversion
@@ -1 +1 @@
-1.1.0
+1.2.0
diff --git a/Documentation/backup.txt b/Documentation/backup.txt
new file mode 100644
index 0000000..ed044ba
--- /dev/null
+++ b/Documentation/backup.txt
@@ -0,0 +1,270 @@
+= Gerrit Code Review - Backup
+
+A Gerrit Code Review site contains data that needs to be backed up regularly.
+This document describes best practices for backing up review data.
+
+[#mand-backup]
+== Data which must be backed up
+
+[#mand-backup-git]
+Git repositories::
++
+The bare Git repositories managed by Gerrit are typically stored in the
+`${SITE}/git` directory. However, the locations can be customized in
+`${SITE}/etc/gerrit.config`. They contain the history of the respective
+projects and, if you are using _NoteDb_ (optional since 2.15, mandatory for
+3.0 and newer), also change and review metadata, user accounts and groups.
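++
+For illustration, the repository location is configured via `gerrit.basePath`
+in `${SITE}/etc/gerrit.config`; the default value is shown:
++
+----
+  [gerrit]
+    basePath = git
+----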
++
+
+[#mand-backup-db]
+SQL database::
++
+Gerrit releases in the 2.x series store some data in the database you
+have chosen when installing Gerrit. If you are using 2.16 and have
+migrated to _NoteDb_, only the schema version is stored in the database.
++
+If you are using h2 you need to back up the `.db` files in the folder
+`${SITE}/db`.
++
+For all other database types refer to their backup documentation.
++
+Gerrit releases 3.0 and newer store all primary data in _NoteDb_ inside
+the git repositories of the Gerrit site. Only the review flag, which marks
+in the UI whether you have reviewed a changed file, is stored in a relational
+database. If you are using h2, this database is named
+`account_patch_reviews.h2.db`.
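++
+A minimal sketch of copying the h2 database files while the server is stopped
+or read-only (the target path is an example only):
++
+----
+  cp ${SITE}/db/*.db /backup/gerrit-db/
+----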
+
+[#optional-backup]
+== Data which may optionally be backed up
+
+[#data-optional-backup-index]
+Search index::
++
+The _Lucene_ search index is stored in the `${SITE}/index` folder.
+It can be recomputed from the primary data in the git repositories, but
+reindexing may take a long time, hence backing up the index makes sense
+for production installations.
++
+If you have chosen to use _Elasticsearch_ for indexing,
+refer to its
+link:https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-snapshots.html[backup documentation].
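++
+If the _Lucene_ index is lost or corrupt it can typically be rebuilt offline.
+A minimal sketch, assuming `${SITE}` is the site directory and `gerrit.war`
+was copied to `${SITE}/bin` during init:
++
+----
+  java -jar ${SITE}/bin/gerrit.war reindex -d ${SITE}
+----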
+
+[#optional-backup-cache]
+Caches::
++
+Gerrit uses many caches which are populated automatically. Some of the caches
+are persisted in the directory `${SITE}/cache` to retain the cached data
+across restarts. Since repopulating persistent caches takes time and server
+resources, it makes sense to include them in backups to avoid unnecessarily
+high load and degraded performance after a Gerrit site has been restored
+from backup and the caches need to be repopulated.
+
+[#optional-backup-config]
+Configuration::
++
+Gerrit configuration files are located in the directory `${SITE}/etc`
+and should be backed up or versioned in a git repository. The `etc`
+directory also contains secrets which should be handled separately:
++
+* `secure.config` contains passwords and `auth.registerEmailPrivateKey`
+* public and private SSH host keys
++
+You may consider using the
+link:https://gerrit.googlesource.com/plugins/secure-config/[secure-config plugin]
+to encrypt these secrets.
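++
+One way to version the non-secret configuration is to keep `${SITE}/etc` in a
+local git repository. A minimal sketch; excluding the secrets via `.gitignore`
+is an assumption you must adapt to your setup:
++
+----
+  cd ${SITE}/etc
+  git init
+  echo secure.config > .gitignore
+  echo 'ssh_host_*' >> .gitignore
+  git add .
+  git commit -m "Snapshot of Gerrit configuration"
+----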
+
+[#optional-backup-plugin-data]
+Plugin Data::
++
+The `${SITE}/data/` directory is used by plugins to store data, e.g. by
+the delete-project and the replication plugin.
+
+[#optional-backup-libs]
+Libraries::
++
+The `${SITE}/lib/` directory contains libraries which are used as statically
+loaded plugins or which provide additional dependencies needed by Gerrit plugins.
+
+[#optional-backup-plugins]
+Plugins::
++
+The `${SITE}/plugins/` directory contains the installed Gerrit plugins.
+
+[#optional-backup-static]
+Static Resources::
++
+The `${SITE}/static/` directory contains static resources used to customize the
+Gerrit UI and email templates.
+
+[#optional-backup-logs]
+Logs::
++
+The `${SITE}/logs/` directory contains Gerrit server log files. Logs can still
+be written when the server is in read-only mode.
+
+[#cons-backup]
+== Consistent backups
+
+There are several ways to ensure consistency when backing up primary data.
+
+[#cons-backup-snapshot]
+=== Filesystem snapshots
+
+Gerrit 3.0 or newer::
++
+* All primary data is stored in git.
+* Use a file system supporting snapshots, like lvm, zfs, btrfs or nfs.
+Create a snapshot and then archive it.
+
+Gerrit 2.x::
++
+Gerrit 2.16 can use _NoteDb_ to store almost all of this data, which
+simplifies creating backups since consistency between the database and the
+git repositories is no longer critical. If you migrated to _NoteDb_ you can
+follow the backup procedure for 3.0 and higher and additionally take
+a backup of the database. In this case the database only contains the
+schema version, which only changes during upgrades, so consistency between
+git and database is not an issue. If you didn't migrate
+to _NoteDb_ then follow the backup procedure for older 2.x Gerrit versions.
++
+Older 2.x Gerrit versions store change metadata, review comments, votes,
+accounts and group information in a SQL database. Since there is no
+integrated transaction handling between the git repositories and the SQL
+database, creating backups where both are consistent with each other
+requires turning the server read-only or shutting it down while the backup
+is created. Cron jobs which affect the repositories (e.g. repacking
+repositories) may also need to be suspended.
+Use a file system supporting snapshots to keep the period where the Gerrit
+server is read-only or down as short as possible.
+
+[#cons-backup-read-only]
+=== Turn master read-only for backup
+
+Make the server read-only before taking the backup. This means read-access
+is still available during backup, because only write operations have to be
+stopped to ensure consistency. This can be implemented using the
+link:https://gerrit.googlesource.com/plugins/readonly/[_readonly_] plugin.
+
+[#cons-backup-replicate]
+=== Replicate data for backup
+
+Replicating the git repositories can back up the most critical repository data
+but does not back up repository metadata such as the project description
+file, ref-logs, git configs, and alternate configs.
+
+Replicate all git repositories to another file system using
+`git clone --mirror`,
+or the
+link:https://gerrit.googlesource.com/plugins/replication[replication plugin]
+or the
+link:https://gerrit.googlesource.com/plugins/pull-replication[pull-replication plugin].
+It is best to use a filesystem supporting snapshots to create a backup archive
+of such a replica.
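+
+A minimal sketch of mirroring a single repository to a backup filesystem and
+of updating that mirror later (host name and paths are examples only):
+
+----
+  git clone --mirror https://gerrit.example.com/a/my-project /backup/git/my-project.git
+  git -C /backup/git/my-project.git fetch --prune
+----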
+
+For 2.x Gerrit versions also set up a database slave for the data stored in the
+SQL database. If you are using 2.16 and migrated to _NoteDb_ you may consider
+skipping the database slave and instead take a backup of the database, which in
+this case only contains the current schema version.
+In addition you need to ensure that no write operations are in flight before you
+take the replica offline. Otherwise the database backup might be inconsistent
+with the backup of the git repositories.
+
+Do not skip backing up the replica; the replica alone IS NOT a backup.
+Imagine someone deleting a project by mistake and this deletion being replicated.
+Replication of repository deletions can be switched off using the
+link:https://gerrit.googlesource.com/plugins/replication/+/refs/heads/master/src/main/resources/Documentation/config.md[server option]
+`remote.NAME.replicateProjectDeletions`.
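+
+For illustration, this option lives in the replication plugin's configuration,
+e.g. `${SITE}/etc/replication.config`; only the relevant option is shown and
+the remote name is an example:
+
+----
+  [remote "backup"]
+    replicateProjectDeletions = false
+----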
+
+If you are using Gerrit slaves to offload read traffic you can use one of these
+slaves for creating backups.
+
+[#cons-backup-offline]
+=== Take master offline for backup
+
+Shut down the server before taking a backup. This is simple but means downtime
+for the users. Cron jobs which affect the repositories (e.g. repacking
+repositories) may also need to be suspended.
+
+[#backup-methods]
+== Backup methods
+
+[#backup-methods-snapshots]
+=== Filesystem snapshots
+
+Filesystems supporting copy-on-write snapshots::
++
+Use a file system supporting copy-on-write snapshots like
+link:https://btrfs.wiki.kernel.org/index.php/SysadminGuide#Snapshots[btrfs]
+or
+https://wiki.debian.org/ZFS#Snapshots[zfs].
+
+
+Other filesystems supporting snapshots::
+https://wiki.archlinux.org/index.php/LVM#Snapshots[lvm] or nfs.
++
+Create a snapshot and then archive the snapshot to another storage.
++
+While snapshots are great for creating high quality backups quickly, they are
+not ideal as a format for storing backup data. Snapshots typically depend and
+reside on the same storage infrastructure as the original disk images.
+Therefore, it’s crucial that you archive these snapshots and store them
+elsewhere.
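++
+A minimal sketch of creating, archiving, and removing a snapshot, assuming the
+site lives on a btrfs subvolume mounted at `/gerrit` (paths and names are
+examples only):
++
+----
+  btrfs subvolume snapshot -r /gerrit /gerrit/.snapshots/backup
+  tar -C /gerrit/.snapshots -czf /archive/gerrit-backup.tar.gz backup
+  btrfs subvolume delete /gerrit/.snapshots/backup
+----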
+
+3.0 or newer::
+Snapshot the complete site directory.
+
+2.x::
+Similar, but the database data should be stored on the very same volume
+on the same machine, so that the snapshot is taken atomically over both
+the git data and the database data. Because everything should be ACID, the
+server can safely crash-recover - as if the power had been pulled and the
+server booted up again. (It is actually safer than that, because the
+filesystem knows about the snapshot being taken and can sync pending writes.)
+
+In addition to that, using filesystem snapshots allows you to:
+
+* roll back easily and quickly without having to access remote backup data
+(e.g. to restore an accidental `rm -rf git/` in seconds)
+* transfer consistent snapshots incrementally
+* save a lot of storage while still keeping multiple "known consistent states"
+
+[#backup-methods-other]
+=== Other backup methods
+
+To ensure consistent backups these backup methods require turning the server
+read-only while a backup is running (a short sketch follows the list):
+
+* create an archive like `tar.gz` to back up the site
+* `rsync`
+* plain old `cp`
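+
+A minimal sketch of the archive and rsync variants, assuming `${SITE}` is the
+site directory and the server has been made read-only or stopped:
+
+----
+  tar -czf /backup/gerrit-site.tar.gz ${SITE}
+  rsync -a --delete ${SITE}/ /backup/gerrit-site/
+----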
+
+[#backup-methods-test]
+== Test backups
+
+Test backups and run fire drills restoring them to ensure the backups aren't
+corrupt or incomplete and that you can restore a backup quickly.
+
+[#backup-dr]
+== Disaster recovery
+
+[#backup-dr-repl]
+=== Replicate backup archives
+
+To enable disaster recovery, at least replicate backup archives to another data
+center, and run fire drills restoring a new site from these backups.
+
+[#backup-dr-multi-site]
+=== Multi-site setup
+
+Use the https://gerrit.googlesource.com/plugins/multi-site[multi-site plugin]
+to run Gerrit with multiple sites installed in different datacenters
+across different regions. This ensures that in case of a severe problem with
+one of the sites, the other sites can still serve your repositories.
+
+GERRIT
+------
+Part of link:index.html[Gerrit Code Review]
+
+SEARCHBOX
+---------
diff --git a/Documentation/install.txt b/Documentation/install.txt
index dbca368..b6a2954 100644
--- a/Documentation/install.txt
+++ b/Documentation/install.txt
@@ -260,6 +260,11 @@
 * http://www.kernel.org/pub/software/scm/git/docs/git-daemon.html[git-daemon]
 
 
+[[backup]]
+== Backup
+
+See the link:backup.html[backup documentation].
+
 GERRIT
 ------
 Part of link:index.html[Gerrit Code Review]
diff --git a/Documentation/note-db.txt b/Documentation/note-db.txt
index fd2bef37..555120a 100644
--- a/Documentation/note-db.txt
+++ b/Documentation/note-db.txt
@@ -192,3 +192,10 @@
   of all changes in NoteDb is accurate, and so is only safe once all changes are
   NoteDb primary. Otherwise, reading changes only from NoteDb might result in
   inaccurate results, and writing to NoteDb would compound the problem. +
+
+== NoteDb to ReviewDb rollback
+
+In case of a rollback from NoteDb to ReviewDb, all the meta refs and the
+sequence ref need to be removed.
+The link:https://gerrit.googlesource.com/gerrit/+/refs/heads/master/contrib/remove-notedb-refs.sh[remove-notedb-refs.sh]
+script has been written to automate this process.
diff --git a/Documentation/rest-api-accounts.txt b/Documentation/rest-api-accounts.txt
index 8247212..615accc 100644
--- a/Documentation/rest-api-accounts.txt
+++ b/Documentation/rest-api-accounts.txt
@@ -733,8 +733,8 @@
   [
     {
       "seq": 1,
-      "ssh_public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d john.doe@example.com",
-      "encoded_key": "AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d",
+      "ssh_public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw== john.doe@example.com",
+      "encoded_key": "AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw==",
       "algorithm": "ssh-rsa",
       "comment": "john.doe@example.com",
       "valid": true
@@ -767,8 +767,8 @@
   )]}'
   {
     "seq": 1,
-    "ssh_public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d john.doe@example.com",
-    "encoded_key": "AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d",
+    "ssh_public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw== john.doe@example.com",
+    "encoded_key": "AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw==",
     "algorithm": "ssh-rsa",
     "comment": "john.doe@example.com",
     "valid": true
@@ -791,9 +791,9 @@
 .Request
 ----
   POST /accounts/self/sshkeys HTTP/1.0
-  Content-Type: plain/text
+  Content-Type: text/plain
 
-  AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d
+  ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw== john.doe@example.com
 ----
 
 As response an link:#ssh-key-info[SshKeyInfo] entity is returned that
@@ -808,8 +808,8 @@
   )]}'
   {
     "seq": 2,
-    "ssh_public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d john.doe@example.com",
-    "encoded_key": "AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw\u003d\u003d",
+    "ssh_public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw== john.doe@example.com",
+    "encoded_key": "AAAAB3NzaC1yc2EAAAABIwAAAQEA0T...YImydZAw==",
     "algorithm": "ssh-rsa",
     "comment": "john.doe@example.com",
     "valid": true
diff --git a/Jenkinsfile b/Jenkinsfile
index 29a3d15..dff9932 100644
--- a/Jenkinsfile
+++ b/Jenkinsfile
@@ -88,6 +88,10 @@
     }
 }
 
+def hasChangeNumber() {
+    env.GERRIT_CHANGE_NUMBER?.trim()
+}
+
 def postCheck(check) {
     def gerritPostUrl = Globals.gerritUrl +
         "a/changes/${check.changeNum}/revisions/${check.sha1}/checks"
@@ -154,18 +158,19 @@
 }
 
 def prepareBuildsForMode(buildName, mode="reviewdb", retryTimes = 1) {
-    def propagate = retryTimes == 1 ? false : true
     return {
         stage("${buildName}/${mode}") {
-            catchError{
-                retry(retryTimes){
-                    def slaveBuild = build job: "${buildName}", parameters: [
+            def slaveBuild = null
+            for (int i = 1; i <= retryTimes; i++) {
+                try {
+                    slaveBuild = build job: "${buildName}", parameters: [
                         string(name: 'REFSPEC', value: Change.ref),
                         string(name: 'BRANCH', value: Change.sha1),
                         string(name: 'CHANGE_URL', value: Change.url),
                         string(name: 'MODE', value: mode),
                         string(name: 'TARGET_BRANCH', value: Change.branch)
-                    ], propagate: propagate
+                    ], propagate: false
+                } finally {
                     if (buildName == "Gerrit-codestyle"){
                         Builds.codeStyle = new Build(
                             slaveBuild.getAbsoluteUrl(), slaveBuild.getResult())
@@ -173,6 +178,9 @@
                         Builds.verification[mode] = new Build(
                             slaveBuild.getAbsoluteUrl(), slaveBuild.getResult())
                     }
+                    if (slaveBuild.getResult() == "SUCCESS") {
+                        break
+                    }
                 }
             }
         }
@@ -181,9 +189,17 @@
 
 def collectBuilds() {
     def builds = [:]
-    builds["Gerrit-codestyle"] = prepareBuildsForMode("Gerrit-codestyle")
-    Builds.modes.each {
-        builds["Gerrit-verification(${it})"] = prepareBuildsForMode("Gerrit-verifier-bazel", it)
+    if (hasChangeNumber()) {
+       builds["Gerrit-codestyle"] = prepareBuildsForMode("Gerrit-codestyle")
+       Builds.modes.each {
+          builds["Gerrit-verification(${it})"] = prepareBuildsForMode("Gerrit-verifier-bazel", it)
+       }
+    } else {
+       builds["java8"] = { -> build "Gerrit-bazel-${env.BRANCH_NAME}" }
+
+       if (env.BRANCH_NAME == "master") {
+          builds["java11"] = { -> build "Gerrit-bazel-java11-${env.BRANCH_NAME}" }
+       }
     }
     return builds
 }
@@ -270,44 +286,48 @@
 
 node ('master') {
 
-    stage('Preparing'){
-        gerritReview labels: ['Verified': 0, 'Code-Style': 0]
+    if (hasChangeNumber()) {
+        stage('Preparing'){
+            gerritReview labels: ['Verified': 0, 'Code-Style': 0]
 
-        getChangeMetaData()
-        collectBuildModes()
+            getChangeMetaData()
+            collectBuildModes()
+        }
     }
 
     parallel(collectBuilds())
 
-    stage('Retry Flaky Builds'){
-        def flakyBuildsModes = findFlakyBuilds()
-        if (flakyBuildsModes.size() > 0){
-            parallel flakyBuildsModes.collectEntries {
-                ["Gerrit-verification(${it})" :
-                    prepareBuildsForMode("Gerrit-verifier-bazel", it, 3)]
+    if (hasChangeNumber()) {
+        stage('Retry Flaky Builds'){
+            def flakyBuildsModes = findFlakyBuilds()
+            if (flakyBuildsModes.size() > 0){
+                parallel flakyBuildsModes.collectEntries {
+                    ["Gerrit-verification(${it})" :
+                        prepareBuildsForMode("Gerrit-verifier-bazel", it, 3)]
+                }
             }
         }
-    }
 
-    stage('Report to Gerrit'){
-        resCodeStyle = getLabelValue(1, Builds.codeStyle.result)
-        gerritReview(
-            labels: ['Code-Style': resCodeStyle],
-            message: createCodeStyleMsgBody(Builds.codeStyle, resCodeStyle))
-        postCheck(new GerritCheck("codestyle", Change.number, Change.sha1, Builds.codeStyle))
+        stage('Report to Gerrit'){
+            resCodeStyle = getLabelValue(1, Builds.codeStyle.result)
+            gerritReview(
+                labels: ['Code-Style': resCodeStyle],
+                message: createCodeStyleMsgBody(Builds.codeStyle, resCodeStyle))
+            postCheck(new GerritCheck("codestyle", Change.number, Change.sha1, Builds.codeStyle))
 
-        def verificationResults = Builds.verification.collect { k, v -> v }
-        def resVerify = verificationResults.inject(1) {
-            acc, build -> getLabelValue(acc, build.result)
+            def verificationResults = Builds.verification.collect { k, v -> v }
+            def resVerify = verificationResults.inject(1) {
+                acc, build -> getLabelValue(acc, build.result)
+            }
+            gerritReview(
+                labels: ['Verified': resVerify],
+                message: createVerifyMsgBody(Builds.verification))
+
+            Builds.verification.each { type, build -> postCheck(
+                new GerritCheck(type, Change.number, Change.sha1, build)
+            )}
+
+            setResult(resVerify, resCodeStyle)
         }
-        gerritReview(
-            labels: ['Verified': resVerify],
-            message: createVerifyMsgBody(Builds.verification))
-
-        Builds.verification.each { type, build -> postCheck(
-            new GerritCheck(type, Change.number, Change.sha1, build)
-        )}
-
-        setResult(resVerify, resCodeStyle)
     }
 }
diff --git a/WORKSPACE b/WORKSPACE
index 9b17514..6495faa 100644
--- a/WORKSPACE
+++ b/WORKSPACE
@@ -32,9 +32,9 @@
 
 http_archive(
     name = "io_bazel_rules_closure",
-    sha256 = "0409f8bd2a8b6fd1db289cdc0acb394dafd69f60a86d0169bc6495e648e01587",
-    strip_prefix = "rules_closure-18f8acf24ae0d03a9c3ee872ff91dcfbf383d69e",
-    urls = ["https://github.com/bazelbuild/rules_closure/archive/18f8acf24ae0d03a9c3ee872ff91dcfbf383d69e.tar.gz"],
+    sha256 = "03c3b16f205085817fd89cfdcb2220a0138647ee7992be9cef291b069dd90301",
+    strip_prefix = "rules_closure-196a45f0ede2faec11dcc6c60fbc5e7471f4bd58",
+    urls = ["https://github.com/bazelbuild/rules_closure/archive/196a45f0ede2faec11dcc6c60fbc5e7471f4bd58.tar.gz"],
 )
 
 # File is specific to Polymer and copied from the Closure Github -- should be
@@ -292,6 +292,31 @@
     sha1 = GUAVA_BIN_SHA1,
 )
 
+CAFFEINE_VERS = "2.8.0"
+
+maven_jar(
+    name = "caffeine",
+    artifact = "com.github.ben-manes.caffeine:caffeine:" + CAFFEINE_VERS,
+    sha1 = "6000774d7f8412ced005a704188ced78beeed2bb",
+)
+
+# TODO(davido): Rename guava.jar to caffeine-guava.jar on fetch to prevent potential
+# naming collision between the caffeine guava adapter and the guava library itself.
+# Remove this renaming procedure, once this upstream issue is fixed:
+# https://github.com/ben-manes/caffeine/issues/364.
+http_file(
+    name = "caffeine-guava-renamed",
+    downloaded_file_path = "caffeine-guava-" + CAFFEINE_VERS + ".jar",
+    sha256 = "3a66ee3ec70971dee0bae6e56bda7b8742bc4bedd7489161bfbbaaf7137d89e1",
+    urls = [
+        "https://repo1.maven.org/maven2/com/github/ben-manes/caffeine/guava/" +
+        CAFFEINE_VERS +
+        "/guava-" +
+        CAFFEINE_VERS +
+        ".jar",
+    ],
+)
+
 maven_jar(
     name = "jsch",
     artifact = "com.jcraft:jsch:0.1.54",
@@ -743,10 +768,10 @@
 
 maven_jar(
     name = "blame-cache",
-    artifact = "com/google/gitiles:blame-cache:0.2-7",
+    artifact = "com/google/gitiles:blame-cache:0.2-7.1",
     attach_source = False,
     repository = GERRIT,
-    sha1 = "8170f33b8b1db6f55e41d7069fa050a4d102a62b",
+    sha1 = "73915991bb7472a730102ab01ca68776a52466fd",
 )
 
 # Keep this version of Soy synchronized with the version used in Gitiles.
diff --git a/java/com/google/gerrit/elasticsearch/ElasticVersion.java b/java/com/google/gerrit/elasticsearch/ElasticVersion.java
index 309ee3e5..574a226 100644
--- a/java/com/google/gerrit/elasticsearch/ElasticVersion.java
+++ b/java/com/google/gerrit/elasticsearch/ElasticVersion.java
@@ -30,7 +30,8 @@
   V7_1("7.1.*"),
   V7_2("7.2.*"),
   V7_3("7.3.*"),
-  V7_4("7.4.*");
+  V7_4("7.4.*"),
+  V7_5("7.5.*");
 
   private final String version;
   private final Pattern pattern;
diff --git a/java/com/google/gerrit/pgm/Daemon.java b/java/com/google/gerrit/pgm/Daemon.java
index b08842e..5912973 100644
--- a/java/com/google/gerrit/pgm/Daemon.java
+++ b/java/com/google/gerrit/pgm/Daemon.java
@@ -579,14 +579,14 @@
 
   private Injector createWebInjector() {
     final List<Module> modules = new ArrayList<>();
-    if (sshd) {
-      modules.add(new ProjectQoSFilter.Module());
-    }
     modules.add(RequestContextFilter.module());
     modules.add(RequestMetricsFilter.module());
     modules.add(H2CacheBasedWebSession.module());
     modules.add(sysInjector.getInstance(GerritAuthModule.class));
     modules.add(sysInjector.getInstance(GitOverHttpModule.class));
+    if (sshd) {
+      modules.add(new ProjectQoSFilter.Module());
+    }
     modules.add(AllRequestFilter.module());
     modules.add(sysInjector.getInstance(WebModule.class));
     modules.add(sysInjector.getInstance(RequireSslFilter.Module.class));
diff --git a/java/com/google/gerrit/server/cache/CacheBackend.java b/java/com/google/gerrit/server/cache/CacheBackend.java
new file mode 100644
index 0000000..ec9876f
--- /dev/null
+++ b/java/com/google/gerrit/server/cache/CacheBackend.java
@@ -0,0 +1,25 @@
+// Copyright (C) 2019 The Android Open Source Project
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package com.google.gerrit.server.cache;
+
+/** Caffeine is used as the default cache backend, but it can be overridden with the Guava backend. */
+public enum CacheBackend {
+  CAFFEINE,
+  GUAVA;
+
+  public boolean isLegacyBackend() {
+    return this == GUAVA;
+  }
+}
diff --git a/java/com/google/gerrit/server/cache/CacheModule.java b/java/com/google/gerrit/server/cache/CacheModule.java
index 2878624..0fdc6f5 100644
--- a/java/com/google/gerrit/server/cache/CacheModule.java
+++ b/java/com/google/gerrit/server/cache/CacheModule.java
@@ -68,7 +68,8 @@
    */
   protected <K, V> CacheBinding<K, V> cache(
       String name, TypeLiteral<K> keyType, TypeLiteral<V> valType) {
-    CacheProvider<K, V> m = new CacheProvider<>(this, name, keyType, valType);
+    CacheProvider<K, V> m =
+        new CacheProvider<>(this, name, keyType, valType, CacheBackend.CAFFEINE);
     bindCache(m, name, keyType, valType);
     return m;
   }
@@ -123,7 +124,20 @@
    */
   protected <K, V> PersistentCacheBinding<K, V> persist(
       String name, Class<K> keyType, Class<V> valType) {
-    return persist(name, TypeLiteral.get(keyType), TypeLiteral.get(valType));
+    return persist(name, TypeLiteral.get(keyType), TypeLiteral.get(valType), CacheBackend.CAFFEINE);
+  }
+
+  /**
+   * Declare a named in-memory/on-disk cache.
+   *
+   * @param <K> type of key used to lookup entries.
+   * @param <V> type of value stored by the cache.
+   * @param backend cache backend.
+   * @return binding to describe the cache.
+   */
+  protected <K, V> PersistentCacheBinding<K, V> persist(
+      String name, Class<K> keyType, Class<V> valType, CacheBackend backend) {
+    return persist(name, TypeLiteral.get(keyType), TypeLiteral.get(valType), backend);
   }
 
   /**
@@ -135,7 +149,7 @@
    */
   protected <K, V> PersistentCacheBinding<K, V> persist(
       String name, Class<K> keyType, TypeLiteral<V> valType) {
-    return persist(name, TypeLiteral.get(keyType), valType);
+    return persist(name, TypeLiteral.get(keyType), valType, CacheBackend.CAFFEINE);
   }
 
   /**
@@ -146,8 +160,9 @@
    * @return binding to describe the cache.
    */
   protected <K, V> PersistentCacheBinding<K, V> persist(
-      String name, TypeLiteral<K> keyType, TypeLiteral<V> valType) {
-    PersistentCacheProvider<K, V> m = new PersistentCacheProvider<>(this, name, keyType, valType);
+      String name, TypeLiteral<K> keyType, TypeLiteral<V> valType, CacheBackend backend) {
+    PersistentCacheProvider<K, V> m =
+        new PersistentCacheProvider<>(this, name, keyType, valType, backend);
     bindCache(m, name, keyType, valType);
 
     Type cacheDefType =
diff --git a/java/com/google/gerrit/server/cache/CacheProvider.java b/java/com/google/gerrit/server/cache/CacheProvider.java
index b1a9b91..fe4244c 100644
--- a/java/com/google/gerrit/server/cache/CacheProvider.java
+++ b/java/com/google/gerrit/server/cache/CacheProvider.java
@@ -30,6 +30,7 @@
 
 class CacheProvider<K, V> implements Provider<Cache<K, V>>, CacheBinding<K, V>, CacheDef<K, V> {
   private final CacheModule module;
+  private final CacheBackend backend;
   final String name;
   private final TypeLiteral<K> keyType;
   private final TypeLiteral<V> valType;
@@ -44,11 +45,17 @@
   private MemoryCacheFactory memoryCacheFactory;
   private boolean frozen;
 
-  CacheProvider(CacheModule module, String name, TypeLiteral<K> keyType, TypeLiteral<V> valType) {
+  CacheProvider(
+      CacheModule module,
+      String name,
+      TypeLiteral<K> keyType,
+      TypeLiteral<V> valType,
+      CacheBackend backend) {
     this.module = module;
     this.name = name;
     this.keyType = keyType;
     this.valType = valType;
+    this.backend = backend;
   }
 
   @Inject(optional = true)
@@ -159,7 +166,9 @@
   public Cache<K, V> get() {
     freeze();
     CacheLoader<K, V> ldr = loader();
-    return ldr != null ? memoryCacheFactory.build(this, ldr) : memoryCacheFactory.build(this);
+    return ldr != null
+        ? memoryCacheFactory.build(this, ldr, backend)
+        : memoryCacheFactory.build(this, backend);
   }
 
   protected void checkNotFrozen() {
diff --git a/java/com/google/gerrit/server/cache/MemoryCacheFactory.java b/java/com/google/gerrit/server/cache/MemoryCacheFactory.java
index fc55753..558380d 100644
--- a/java/com/google/gerrit/server/cache/MemoryCacheFactory.java
+++ b/java/com/google/gerrit/server/cache/MemoryCacheFactory.java
@@ -19,7 +19,8 @@
 import com.google.common.cache.LoadingCache;
 
 public interface MemoryCacheFactory {
-  <K, V> Cache<K, V> build(CacheDef<K, V> def);
+  <K, V> Cache<K, V> build(CacheDef<K, V> def, CacheBackend backend);
 
-  <K, V> LoadingCache<K, V> build(CacheDef<K, V> def, CacheLoader<K, V> loader);
+  <K, V> LoadingCache<K, V> build(
+      CacheDef<K, V> def, CacheLoader<K, V> loader, CacheBackend backend);
 }
diff --git a/java/com/google/gerrit/server/cache/PersistentCacheFactory.java b/java/com/google/gerrit/server/cache/PersistentCacheFactory.java
index 27fa9ca..93f91ef 100644
--- a/java/com/google/gerrit/server/cache/PersistentCacheFactory.java
+++ b/java/com/google/gerrit/server/cache/PersistentCacheFactory.java
@@ -19,9 +19,10 @@
 import com.google.common.cache.LoadingCache;
 
 public interface PersistentCacheFactory {
-  <K, V> Cache<K, V> build(PersistentCacheDef<K, V> def);
+  <K, V> Cache<K, V> build(PersistentCacheDef<K, V> def, CacheBackend backend);
 
-  <K, V> LoadingCache<K, V> build(PersistentCacheDef<K, V> def, CacheLoader<K, V> loader);
+  <K, V> LoadingCache<K, V> build(
+      PersistentCacheDef<K, V> def, CacheLoader<K, V> loader, CacheBackend backend);
 
   void onStop(String plugin);
 }
diff --git a/java/com/google/gerrit/server/cache/PersistentCacheProvider.java b/java/com/google/gerrit/server/cache/PersistentCacheProvider.java
index 59d66e3..4fc107f 100644
--- a/java/com/google/gerrit/server/cache/PersistentCacheProvider.java
+++ b/java/com/google/gerrit/server/cache/PersistentCacheProvider.java
@@ -30,6 +30,7 @@
 
 class PersistentCacheProvider<K, V> extends CacheProvider<K, V>
     implements Provider<Cache<K, V>>, PersistentCacheBinding<K, V>, PersistentCacheDef<K, V> {
+  private final CacheBackend backend;
   private int version;
   private long diskLimit;
   private CacheSerializer<K> keySerializer;
@@ -39,9 +40,19 @@
 
   PersistentCacheProvider(
       CacheModule module, String name, TypeLiteral<K> keyType, TypeLiteral<V> valType) {
-    super(module, name, keyType, valType);
+    this(module, name, keyType, valType, CacheBackend.CAFFEINE);
+  }
+
+  PersistentCacheProvider(
+      CacheModule module,
+      String name,
+      TypeLiteral<K> keyType,
+      TypeLiteral<V> valType,
+      CacheBackend backend) {
+    super(module, name, keyType, valType, backend);
     version = -1;
     diskLimit = 128 << 20;
+    this.backend = backend;
   }
 
   @Inject(optional = true)
@@ -130,8 +141,8 @@
     freeze();
     CacheLoader<K, V> ldr = loader();
     return ldr != null
-        ? persistentCacheFactory.build(this, ldr)
-        : persistentCacheFactory.build(this);
+        ? persistentCacheFactory.build(this, ldr, backend)
+        : persistentCacheFactory.build(this, backend);
   }
 
   private static <T> void checkSerializer(
diff --git a/java/com/google/gerrit/server/cache/h2/H2CacheFactory.java b/java/com/google/gerrit/server/cache/h2/H2CacheFactory.java
index af1228d..2b068aa 100644
--- a/java/com/google/gerrit/server/cache/h2/H2CacheFactory.java
+++ b/java/com/google/gerrit/server/cache/h2/H2CacheFactory.java
@@ -21,6 +21,7 @@
 import com.google.common.util.concurrent.ThreadFactoryBuilder;
 import com.google.gerrit.extensions.events.LifecycleListener;
 import com.google.gerrit.extensions.registration.DynamicMap;
+import com.google.gerrit.server.cache.CacheBackend;
 import com.google.gerrit.server.cache.MemoryCacheFactory;
 import com.google.gerrit.server.cache.PersistentCacheDef;
 import com.google.gerrit.server.cache.PersistentCacheFactory;
@@ -156,18 +157,21 @@
 
   @SuppressWarnings({"unchecked"})
   @Override
-  public <K, V> Cache<K, V> build(PersistentCacheDef<K, V> in) {
+  public <K, V> Cache<K, V> build(PersistentCacheDef<K, V> in, CacheBackend backend) {
     long limit = config.getLong("cache", in.configKey(), "diskLimit", in.diskLimit());
 
     if (cacheDir == null || limit <= 0) {
-      return memCacheFactory.build(in);
+      return memCacheFactory.build(in, backend);
     }
 
     H2CacheDefProxy<K, V> def = new H2CacheDefProxy<>(in);
     SqlStore<K, V> store = newSqlStore(def, limit);
     H2CacheImpl<K, V> cache =
         new H2CacheImpl<>(
-            executor, store, def.keyType(), (Cache<K, ValueHolder<V>>) memCacheFactory.build(def));
+            executor,
+            store,
+            def.keyType(),
+            (Cache<K, ValueHolder<V>>) memCacheFactory.build(def, backend));
     synchronized (caches) {
       caches.add(cache);
     }
@@ -176,11 +180,12 @@
 
   @SuppressWarnings("unchecked")
   @Override
-  public <K, V> LoadingCache<K, V> build(PersistentCacheDef<K, V> in, CacheLoader<K, V> loader) {
+  public <K, V> LoadingCache<K, V> build(
+      PersistentCacheDef<K, V> in, CacheLoader<K, V> loader, CacheBackend backend) {
     long limit = config.getLong("cache", in.configKey(), "diskLimit", in.diskLimit());
 
     if (cacheDir == null || limit <= 0) {
-      return memCacheFactory.build(in, loader);
+      return memCacheFactory.build(in, loader, backend);
     }
 
     H2CacheDefProxy<K, V> def = new H2CacheDefProxy<>(in);
@@ -188,7 +193,9 @@
     Cache<K, ValueHolder<V>> mem =
         (Cache<K, ValueHolder<V>>)
             memCacheFactory.build(
-                def, (CacheLoader<K, V>) new H2CacheImpl.Loader<>(executor, store, loader));
+                def,
+                (CacheLoader<K, V>) new H2CacheImpl.Loader<>(executor, store, loader),
+                backend);
     H2CacheImpl<K, V> cache = new H2CacheImpl<>(executor, store, def.keyType(), mem);
     synchronized (caches) {
       caches.add(cache);
diff --git a/java/com/google/gerrit/server/cache/mem/BUILD b/java/com/google/gerrit/server/cache/mem/BUILD
index eb0695e..bc5b66a 100644
--- a/java/com/google/gerrit/server/cache/mem/BUILD
+++ b/java/com/google/gerrit/server/cache/mem/BUILD
@@ -8,6 +8,8 @@
         "//java/com/google/gerrit/common:annotations",
         "//java/com/google/gerrit/extensions:api",
         "//java/com/google/gerrit/server",
+        "//lib:caffeine",
+        "//lib:caffeine-guava",
         "//lib:guava",
         "//lib/guice",
         "//lib/jgit/org.eclipse.jgit:jgit",
diff --git a/java/com/google/gerrit/server/cache/mem/DefaultMemoryCacheFactory.java b/java/com/google/gerrit/server/cache/mem/DefaultMemoryCacheFactory.java
index ad1d396..9906b3d 100644
--- a/java/com/google/gerrit/server/cache/mem/DefaultMemoryCacheFactory.java
+++ b/java/com/google/gerrit/server/cache/mem/DefaultMemoryCacheFactory.java
@@ -17,13 +17,18 @@
 import static java.util.concurrent.TimeUnit.NANOSECONDS;
 import static java.util.concurrent.TimeUnit.SECONDS;
 
+import com.github.benmanes.caffeine.cache.Caffeine;
+import com.github.benmanes.caffeine.cache.RemovalListener;
+import com.github.benmanes.caffeine.cache.Weigher;
+import com.github.benmanes.caffeine.guava.CaffeinatedGuava;
 import com.google.common.base.Strings;
 import com.google.common.cache.Cache;
 import com.google.common.cache.CacheBuilder;
 import com.google.common.cache.CacheLoader;
 import com.google.common.cache.LoadingCache;
-import com.google.common.cache.Weigher;
+import com.google.common.cache.RemovalNotification;
 import com.google.gerrit.common.Nullable;
+import com.google.gerrit.server.cache.CacheBackend;
 import com.google.gerrit.server.cache.CacheDef;
 import com.google.gerrit.server.cache.ForwardingRemovalListener;
 import com.google.gerrit.server.cache.MemoryCacheFactory;
@@ -46,25 +51,30 @@
   }
 
   @Override
-  public <K, V> Cache<K, V> build(CacheDef<K, V> def) {
-    return create(def).build();
+  public <K, V> Cache<K, V> build(CacheDef<K, V> def, CacheBackend backend) {
+    return backend.isLegacyBackend()
+        ? createLegacy(def).build()
+        : CaffeinatedGuava.build(create(def));
   }
 
   @Override
-  public <K, V> LoadingCache<K, V> build(CacheDef<K, V> def, CacheLoader<K, V> loader) {
-    return create(def).build(loader);
+  public <K, V> LoadingCache<K, V> build(
+      CacheDef<K, V> def, CacheLoader<K, V> loader, CacheBackend backend) {
+    return backend.isLegacyBackend()
+        ? createLegacy(def).build(loader)
+        : CaffeinatedGuava.build(create(def), loader);
   }
 
   @SuppressWarnings("unchecked")
-  private <K, V> CacheBuilder<K, V> create(CacheDef<K, V> def) {
-    CacheBuilder<K, V> builder = newCacheBuilder();
+  private <K, V> CacheBuilder<K, V> createLegacy(CacheDef<K, V> def) {
+    CacheBuilder<K, V> builder = newLegacyCacheBuilder();
     builder.recordStats();
     builder.maximumWeight(
         cfg.getLong("cache", def.configKey(), "memoryLimit", def.maximumWeight()));
 
     builder = builder.removalListener(forwardingRemovalListenerFactory.create(def.name()));
 
-    Weigher<K, V> weigher = def.weigher();
+    com.google.common.cache.Weigher<K, V> weigher = def.weigher();
     if (weigher == null) {
       weigher = unitWeight();
     }
@@ -98,6 +108,42 @@
     return builder;
   }
 
+  private <K, V> Caffeine<K, V> create(CacheDef<K, V> def) {
+    Caffeine<K, V> builder = newCacheBuilder();
+    builder.recordStats();
+    builder.maximumWeight(
+        cfg.getLong("cache", def.configKey(), "memoryLimit", def.maximumWeight()));
+    builder = builder.removalListener(newRemovalListener(def.name()));
+    builder.weigher(newWeigher(def.weigher()));
+
+    Duration expireAfterWrite = def.expireAfterWrite();
+    if (has(def.configKey(), "maxAge")) {
+      builder.expireAfterWrite(
+          ConfigUtil.getTimeUnit(
+              cfg, "cache", def.configKey(), "maxAge", toSeconds(expireAfterWrite), SECONDS),
+          SECONDS);
+    } else if (expireAfterWrite != null) {
+      builder.expireAfterWrite(expireAfterWrite.toNanos(), NANOSECONDS);
+    }
+
+    Duration expireAfterAccess = def.expireFromMemoryAfterAccess();
+    if (has(def.configKey(), "expireFromMemoryAfterAccess")) {
+      builder.expireAfterAccess(
+          ConfigUtil.getTimeUnit(
+              cfg,
+              "cache",
+              def.configKey(),
+              "expireFromMemoryAfterAccess",
+              toSeconds(expireAfterAccess),
+              SECONDS),
+          SECONDS);
+    } else if (expireAfterAccess != null) {
+      builder.expireAfterAccess(expireAfterAccess.toNanos(), NANOSECONDS);
+    }
+
+    return builder;
+  }
+
   private static long toSeconds(@Nullable Duration duration) {
     return duration != null ? duration.getSeconds() : 0;
   }
@@ -107,16 +153,31 @@
   }
 
   @SuppressWarnings("unchecked")
-  private static <K, V> CacheBuilder<K, V> newCacheBuilder() {
+  private static <K, V> CacheBuilder<K, V> newLegacyCacheBuilder() {
     return (CacheBuilder<K, V>) CacheBuilder.newBuilder();
   }
 
-  private static <K, V> Weigher<K, V> unitWeight() {
-    return new Weigher<K, V>() {
-      @Override
-      public int weigh(K key, V value) {
-        return 1;
-      }
-    };
+  private static <K, V> com.google.common.cache.Weigher<K, V> unitWeight() {
+    return (key, value) -> 1;
+  }
+
+  @SuppressWarnings("unchecked")
+  private static <K, V> Caffeine<K, V> newCacheBuilder() {
+    return (Caffeine<K, V>) Caffeine.newBuilder();
+  }
+
+  @SuppressWarnings("unchecked")
+  private <V, K> RemovalListener<K, V> newRemovalListener(String cacheName) {
+    return (k, v, cause) ->
+        forwardingRemovalListenerFactory
+            .create(cacheName)
+            .onRemoval(
+                RemovalNotification.create(
+                    k, v, com.google.common.cache.RemovalCause.valueOf(cause.name())));
+  }
+
+  private static <K, V> Weigher<K, V> newWeigher(
+      com.google.common.cache.Weigher<K, V> guavaWeigher) {
+    return guavaWeigher == null ? Weigher.singletonWeigher() : (k, v) -> guavaWeigher.weigh(k, v);
   }
 }
diff --git a/java/com/google/gerrit/server/mail/send/ChangeEmail.java b/java/com/google/gerrit/server/mail/send/ChangeEmail.java
index 8b8f6f9..a1b20af 100644
--- a/java/com/google/gerrit/server/mail/send/ChangeEmail.java
+++ b/java/com/google/gerrit/server/mail/send/ChangeEmail.java
@@ -181,6 +181,7 @@
     setChangeSubjectHeader();
     setHeader(MailHeader.CHANGE_ID.fieldName(), "" + change.getKey().get());
     setHeader(MailHeader.CHANGE_NUMBER.fieldName(), "" + change.getChangeId());
+    setHeader(MailHeader.PROJECT.fieldName(), "" + change.getProject());
     setChangeUrlHeader();
     setCommitIdHeader();
 
diff --git a/java/com/google/gerrit/server/mail/send/NotificationEmail.java b/java/com/google/gerrit/server/mail/send/NotificationEmail.java
index a60dfea..be593ff 100644
--- a/java/com/google/gerrit/server/mail/send/NotificationEmail.java
+++ b/java/com/google/gerrit/server/mail/send/NotificationEmail.java
@@ -123,7 +123,7 @@
     soyContext.put("branch", branchData);
 
     footers.add(MailHeader.PROJECT.withDelimiter() + branch.getParentKey().get());
-    footers.add("Gerrit-Branch: " + branch.getShortName());
+    footers.add(MailHeader.BRANCH.withDelimiter() + branch.getShortName());
   }
 
   @VisibleForTesting
diff --git a/java/com/google/gerrit/server/mail/send/OutgoingEmail.java b/java/com/google/gerrit/server/mail/send/OutgoingEmail.java
index c1bef07..400bc4f 100644
--- a/java/com/google/gerrit/server/mail/send/OutgoingEmail.java
+++ b/java/com/google/gerrit/server/mail/send/OutgoingEmail.java
@@ -204,12 +204,12 @@
         }
       }
 
-      Set<Address> intersection = Sets.intersection(smtpRcptTo, smtpRcptToPlaintextOnly);
+      Set<Address> intersection = Sets.intersection(va.smtpRcptTo, smtpRcptToPlaintextOnly);
       if (!intersection.isEmpty()) {
         logger.atSevere().log("Email '%s' will be sent twice to %s", messageClass, intersection);
       }
 
-      if (!smtpRcptTo.isEmpty()) {
+      if (!va.smtpRcptTo.isEmpty()) {
         // Send multipart message
         logger.atFine().log("Sending multipart '%s'", messageClass);
         args.emailSender.send(va.smtpFromAddress, va.smtpRcptTo, va.headers, va.body, va.htmlBody);
diff --git a/java/com/google/gerrit/server/notedb/ChangeDraftUpdate.java b/java/com/google/gerrit/server/notedb/ChangeDraftUpdate.java
index 169fd2e..d8390b7 100644
--- a/java/com/google/gerrit/server/notedb/ChangeDraftUpdate.java
+++ b/java/com/google/gerrit/server/notedb/ChangeDraftUpdate.java
@@ -174,7 +174,6 @@
 
     Map<RevId, RevisionNoteBuilder> builders = cache.getBuilders();
     boolean touchedAnyRevs = false;
-    boolean hasComments = false;
     for (Map.Entry<RevId, RevisionNoteBuilder> e : builders.entrySet()) {
       updatedRevs.add(e.getKey());
       ObjectId id = ObjectId.fromString(e.getKey().get());
@@ -185,7 +184,6 @@
       if (data.length == 0) {
         rnm.noteMap.remove(id);
       } else {
-        hasComments = true;
         ObjectId dataBlob = ins.insert(OBJ_BLOB, data);
         rnm.noteMap.set(id, dataBlob);
       }
@@ -198,10 +196,9 @@
       return NO_OP_UPDATE;
     }
 
-    // If we touched every revision and there are no comments left, tell the
+    // If there are no comments left, tell the
     // caller to delete the entire ref.
-    boolean touchedAllRevs = updatedRevs.equals(rnm.revisionNotes.keySet());
-    if (touchedAllRevs && !hasComments) {
+    if (!rnm.noteMap.iterator().hasNext()) {
       return null;
     }
 
diff --git a/java/com/google/gerrit/server/patch/PatchListCacheImpl.java b/java/com/google/gerrit/server/patch/PatchListCacheImpl.java
index 6039fff..20f0f72 100644
--- a/java/com/google/gerrit/server/patch/PatchListCacheImpl.java
+++ b/java/com/google/gerrit/server/patch/PatchListCacheImpl.java
@@ -22,6 +22,7 @@
 import com.google.gerrit.reviewdb.client.Change;
 import com.google.gerrit.reviewdb.client.PatchSet;
 import com.google.gerrit.reviewdb.client.Project;
+import com.google.gerrit.server.cache.CacheBackend;
 import com.google.gerrit.server.cache.CacheModule;
 import com.google.gerrit.server.config.GerritServerConfig;
 import com.google.inject.Inject;
@@ -45,7 +46,9 @@
       @Override
       protected void configure() {
         factory(PatchListLoader.Factory.class);
-        persist(FILE_NAME, PatchListKey.class, PatchList.class)
+        // TODO(davido): Switch off using legacy cache backend, after fixing PatchListLoader
+        // to be recursion free.
+        persist(FILE_NAME, PatchListKey.class, PatchList.class, CacheBackend.GUAVA)
             .maximumWeight(10 << 20)
             .weigher(PatchListWeigher.class);
 
diff --git a/java/com/google/gerrit/server/project/RefUtil.java b/java/com/google/gerrit/server/project/RefUtil.java
index 9f1fa4a..4e08137 100644
--- a/java/com/google/gerrit/server/project/RefUtil.java
+++ b/java/com/google/gerrit/server/project/RefUtil.java
@@ -19,6 +19,7 @@
 
 import com.google.common.collect.Iterables;
 import com.google.common.flogger.FluentLogger;
+import com.google.gerrit.common.Nullable;
 import com.google.gerrit.extensions.restapi.BadRequestException;
 import com.google.gerrit.reviewdb.client.Project;
 import com.google.gerrit.reviewdb.client.RefNames;
@@ -46,16 +47,15 @@
     try {
       ObjectId revid = repo.resolve(baseRevision);
       if (revid == null) {
-        throw new InvalidRevisionException();
+        throw new InvalidRevisionException(baseRevision);
       }
       return revid;
     } catch (IOException err) {
       logger.atSevere().withCause(err).log(
           "Cannot resolve \"%s\" in project \"%s\"", baseRevision, projectName.get());
-      throw new InvalidRevisionException();
+      throw new InvalidRevisionException(baseRevision);
     } catch (RevisionSyntaxException err) {
-      logger.atSevere().withCause(err).log("Invalid revision syntax \"%s\"", baseRevision);
-      throw new InvalidRevisionException();
+      throw new InvalidRevisionException(baseRevision);
     }
   }
 
@@ -66,7 +66,7 @@
       try {
         rw.markStart(rw.parseCommit(revid));
       } catch (IncorrectObjectTypeException err) {
-        throw new InvalidRevisionException();
+        throw new InvalidRevisionException(revid.name());
       }
       RefDatabase refDb = repo.getRefDatabase();
       Iterable<Ref> refs =
@@ -86,11 +86,11 @@
       rw.checkConnectivity();
       return rw;
     } catch (IncorrectObjectTypeException | MissingObjectException err) {
-      throw new InvalidRevisionException();
+      throw new InvalidRevisionException(revid.name());
     } catch (IOException err) {
       logger.atSevere().withCause(err).log(
           "Repository \"%s\" may be corrupt; suggest running git fsck", repo.getDirectory());
-      throw new InvalidRevisionException();
+      throw new InvalidRevisionException(revid.name());
     }
   }
 
@@ -125,8 +125,8 @@
 
     public static final String MESSAGE = "Invalid Revision";
 
-    InvalidRevisionException() {
-      super(MESSAGE);
+    InvalidRevisionException(@Nullable String invalidRevision) {
+      super(MESSAGE + ": " + invalidRevision);
     }
   }
 }
diff --git a/java/com/google/gerrit/server/restapi/project/CreateBranch.java b/java/com/google/gerrit/server/restapi/project/CreateBranch.java
index 62106e8..ec1f56e 100644
--- a/java/com/google/gerrit/server/restapi/project/CreateBranch.java
+++ b/java/com/google/gerrit/server/restapi/project/CreateBranch.java
@@ -16,6 +16,7 @@
 
 import static com.google.gerrit.reviewdb.client.RefNames.isConfigRef;
 
+import com.google.common.base.Strings;
 import com.google.common.flogger.FluentLogger;
 import com.google.gerrit.extensions.api.projects.BranchInfo;
 import com.google.gerrit.extensions.api.projects.BranchInput;
@@ -91,7 +92,10 @@
     if (input.ref != null && !ref.equals(input.ref)) {
       throw new BadRequestException("ref must match URL");
     }
-    if (input.revision == null) {
+    if (input.revision != null) {
+      input.revision = input.revision.trim();
+    }
+    if (Strings.isNullOrEmpty(input.revision)) {
       input.revision = Constants.HEAD;
     }
     while (ref.startsWith("/")) {
diff --git a/java/com/google/gerrit/server/restapi/project/CreateTag.java b/java/com/google/gerrit/server/restapi/project/CreateTag.java
index e72deaf..855a7c7 100644
--- a/java/com/google/gerrit/server/restapi/project/CreateTag.java
+++ b/java/com/google/gerrit/server/restapi/project/CreateTag.java
@@ -87,7 +87,10 @@
     if (input.ref != null && !ref.equals(input.ref)) {
       throw new BadRequestException("ref must match URL");
     }
-    if (input.revision == null) {
+    if (input.revision != null) {
+      input.revision = input.revision.trim();
+    }
+    if (Strings.isNullOrEmpty(input.revision)) {
       input.revision = Constants.HEAD;
     }
 
diff --git a/java/com/google/gerrit/sshd/commands/SetAccountCommand.java b/java/com/google/gerrit/sshd/commands/SetAccountCommand.java
index 9bcb103..0ff19f7 100644
--- a/java/com/google/gerrit/sshd/commands/SetAccountCommand.java
+++ b/java/com/google/gerrit/sshd/commands/SetAccountCommand.java
@@ -277,7 +277,7 @@
           PermissionBackendException {
     for (String sshKey : sshKeys) {
       SshKeyInput in = new SshKeyInput();
-      in.raw = RawInputUtil.create(sshKey.getBytes(UTF_8), "plain/text");
+      in.raw = RawInputUtil.create(sshKey.getBytes(UTF_8), "text/plain");
       addSshKey.apply(rsrc, in);
     }
   }
diff --git a/javatests/com/google/gerrit/acceptance/api/group/BUILD b/javatests/com/google/gerrit/acceptance/api/group/BUILD
index 2bcbc73..f65db0d 100644
--- a/javatests/com/google/gerrit/acceptance/api/group/BUILD
+++ b/javatests/com/google/gerrit/acceptance/api/group/BUILD
@@ -20,7 +20,6 @@
     srcs = ["GroupAssert.java"],
     deps = [
         "//java/com/google/gerrit/extensions:api",
-        "//java/com/google/gerrit/reviewdb:server",
         "//java/com/google/gerrit/server",
         "//lib:gwtorm",
         "//lib/truth",
diff --git a/javatests/com/google/gerrit/acceptance/pgm/ElasticReindexIT.java b/javatests/com/google/gerrit/acceptance/pgm/ElasticReindexIT.java
index c0f27bd..a5ac3a9 100644
--- a/javatests/com/google/gerrit/acceptance/pgm/ElasticReindexIT.java
+++ b/javatests/com/google/gerrit/acceptance/pgm/ElasticReindexIT.java
@@ -37,7 +37,7 @@
 
   @ConfigSuite.Config
   public static Config elasticsearchV7() {
-    return getConfig(ElasticVersion.V7_4);
+    return getConfig(ElasticVersion.V7_5);
   }
 
   @Override
diff --git a/javatests/com/google/gerrit/acceptance/rest/project/CreateBranchIT.java b/javatests/com/google/gerrit/acceptance/rest/project/CreateBranchIT.java
index df89686..c49a62b 100644
--- a/javatests/com/google/gerrit/acceptance/rest/project/CreateBranchIT.java
+++ b/javatests/com/google/gerrit/acceptance/rest/project/CreateBranchIT.java
@@ -28,12 +28,14 @@
 import com.google.gerrit.extensions.api.projects.BranchInfo;
 import com.google.gerrit.extensions.api.projects.BranchInput;
 import com.google.gerrit.extensions.restapi.AuthException;
+import com.google.gerrit.extensions.restapi.BadRequestException;
 import com.google.gerrit.extensions.restapi.ResourceConflictException;
 import com.google.gerrit.extensions.restapi.RestApiException;
 import com.google.gerrit.reviewdb.client.Account;
 import com.google.gerrit.reviewdb.client.AccountGroup;
 import com.google.gerrit.reviewdb.client.Branch;
 import com.google.gerrit.reviewdb.client.RefNames;
+import org.eclipse.jgit.revwalk.RevCommit;
 import org.junit.Before;
 import org.junit.Test;
 
@@ -127,6 +129,112 @@
         "Not allowed to create group branch.");
   }
 
+  @Test
+  public void createWithRevision() throws Exception {
+    RevCommit revision = getRemoteHead(project, "master");
+
+    // Update master so that it points to a different revision than the one on
+    // which we create the new branch.
+    pushTo("refs/heads/master");
+    assertThat(getRemoteHead(project, "master")).isNotEqualTo(revision);
+
+    BranchInput input = new BranchInput();
+    input.revision = revision.name();
+    BranchInfo created = branch(testBranch).create(input).get();
+    assertThat(created.ref).isEqualTo(testBranch.get());
+    assertThat(created.revision).isEqualTo(revision.name());
+    assertThat(getRemoteHead(project, testBranch.getShortName())).isEqualTo(revision);
+  }
+
+  @Test
+  public void createWithoutSpecifyingRevision() throws Exception {
+    // If revision is not specified, the branch is created based on HEAD, which points to master.
+    RevCommit expectedRevision = getRemoteHead(project, "master");
+
+    BranchInput input = new BranchInput();
+    input.revision = null;
+    BranchInfo created = branch(testBranch).create(input).get();
+    assertThat(created.ref).isEqualTo(testBranch.get());
+    assertThat(created.revision).isEqualTo(expectedRevision.name());
+    assertThat(getRemoteHead(project, testBranch.getShortName())).isEqualTo(expectedRevision);
+  }
+
+  @Test
+  public void createWithEmptyRevision() throws Exception {
+    // If revision is not specified, the branch is created based on HEAD, which points to master.
+    RevCommit expectedRevision = getRemoteHead(project, "master");
+
+    BranchInput input = new BranchInput();
+    input.revision = "";
+    BranchInfo created = branch(testBranch).create(input).get();
+    assertThat(created.ref).isEqualTo(testBranch.get());
+    assertThat(created.revision).isEqualTo(expectedRevision.name());
+    assertThat(getRemoteHead(project, testBranch.getShortName())).isEqualTo(expectedRevision);
+  }
+
+  @Test
+  public void createRevisionIsTrimmed() throws Exception {
+    RevCommit revision = getRemoteHead(project, "master");
+
+    BranchInput input = new BranchInput();
+    input.revision = "\t" + revision.name();
+    BranchInfo created = branch(testBranch).create(input).get();
+    assertThat(created.ref).isEqualTo(testBranch.get());
+    assertThat(created.revision).isEqualTo(revision.name());
+    assertThat(getRemoteHead(project, testBranch.getShortName())).isEqualTo(revision);
+  }
+
+  @Test
+  public void createWithBranchNameAsRevision() throws Exception {
+    RevCommit expectedRevision = getRemoteHead(project, "master");
+
+    BranchInput input = new BranchInput();
+    input.revision = "master";
+    BranchInfo created = branch(testBranch).create(input).get();
+    assertThat(created.ref).isEqualTo(testBranch.get());
+    assertThat(created.revision).isEqualTo(expectedRevision.name());
+    assertThat(getRemoteHead(project, testBranch.getShortName())).isEqualTo(expectedRevision);
+  }
+
+  @Test
+  public void createWithFullBranchNameAsRevision() throws Exception {
+    RevCommit expectedRevision = getRemoteHead(project, "master");
+
+    BranchInput input = new BranchInput();
+    input.revision = "refs/heads/master";
+    BranchInfo created = branch(testBranch).create(input).get();
+    assertThat(created.ref).isEqualTo(testBranch.get());
+    assertThat(created.revision).isEqualTo(expectedRevision.name());
+    assertThat(getRemoteHead(project, testBranch.getShortName())).isEqualTo(expectedRevision);
+  }
+
+  @Test
+  public void cannotCreateWithNonExistingBranchNameAsRevision() throws Exception {
+    assertCreateFails(
+        testBranch,
+        "refs/heads/non-existing",
+        BadRequestException.class,
+        "invalid revision \"refs/heads/non-existing\"");
+  }
+
+  @Test
+  public void cannotCreateWithNonExistingRevision() throws Exception {
+    assertCreateFails(
+        testBranch,
+        "deadbeefdeadbeefdeadbeefdeadbeefdeadbeef",
+        BadRequestException.class,
+        "invalid revision \"deadbeefdeadbeefdeadbeefdeadbeefdeadbeef\"");
+  }
+
+  @Test
+  public void cannotCreateWithInvalidRevision() throws Exception {
+    assertCreateFails(
+        testBranch,
+        "invalid\trevision",
+        BadRequestException.class,
+        "invalid revision \"invalid\trevision\"");
+  }
+
   private void blockCreateReference() throws Exception {
     block("refs/*", Permission.CREATE, ANONYMOUS_USERS);
   }
diff --git a/javatests/com/google/gerrit/acceptance/rest/project/TagsIT.java b/javatests/com/google/gerrit/acceptance/rest/project/TagsIT.java
index d4edc0d..610ce92 100644
--- a/javatests/com/google/gerrit/acceptance/rest/project/TagsIT.java
+++ b/javatests/com/google/gerrit/acceptance/rest/project/TagsIT.java
@@ -35,6 +35,7 @@
 import com.google.gerrit.extensions.restapi.ResourceNotFoundException;
 import java.sql.Timestamp;
 import java.util.List;
+import org.eclipse.jgit.revwalk.RevCommit;
 import org.junit.Test;
 
 @NoHttpd
@@ -326,6 +327,53 @@
     tag(input.ref).create(input);
   }
 
+  @Test
+  public void noBaseRevision() throws Exception {
+    grantTagPermissions();
+
+    // If revision is not specified, the tag is created based on HEAD, which points to master.
+    RevCommit expectedRevision = getRemoteHead(project, "master");
+
+    TagInput input = new TagInput();
+    input.ref = "test";
+    input.revision = null;
+
+    TagInfo result = tag(input.ref).create(input).get();
+    assertThat(result.ref).isEqualTo(R_TAGS + input.ref);
+    assertThat(result.revision).isEqualTo(expectedRevision.name());
+  }
+
+  @Test
+  public void emptyBaseRevision() throws Exception {
+    grantTagPermissions();
+
+    // If revision is empty, it is treated as unspecified and the tag is created based on HEAD, which points to master.
+    RevCommit expectedRevision = getRemoteHead(project, "master");
+
+    TagInput input = new TagInput();
+    input.ref = "test";
+    input.revision = "";
+
+    TagInfo result = tag(input.ref).create(input).get();
+    assertThat(result.ref).isEqualTo(R_TAGS + input.ref);
+    assertThat(result.revision).isEqualTo(expectedRevision.name());
+  }
+
+  @Test
+  public void baseRevisionIsTrimmed() throws Exception {
+    grantTagPermissions();
+
+    RevCommit revision = getRemoteHead(project, "master");
+
+    TagInput input = new TagInput();
+    input.ref = "test";
+    input.revision = "\t" + revision.name();
+
+    TagInfo result = tag(input.ref).create(input).get();
+    assertThat(result.ref).isEqualTo(R_TAGS + input.ref);
+    assertThat(result.revision).isEqualTo(revision.name());
+  }
+
   private void assertTagList(FluentIterable<String> expected, List<TagInfo> actual)
       throws Exception {
     assertThat(actual).hasSize(expected.size());
diff --git a/javatests/com/google/gerrit/acceptance/server/change/CommentsIT.java b/javatests/com/google/gerrit/acceptance/server/change/CommentsIT.java
index 7455419..fb2441b 100644
--- a/javatests/com/google/gerrit/acceptance/server/change/CommentsIT.java
+++ b/javatests/com/google/gerrit/acceptance/server/change/CommentsIT.java
@@ -47,6 +47,7 @@
 import com.google.gerrit.extensions.restapi.TopLevelResource;
 import com.google.gerrit.reviewdb.client.Change;
 import com.google.gerrit.reviewdb.client.Patch;
+import com.google.gerrit.reviewdb.client.RefNames;
 import com.google.gerrit.server.change.ChangeResource;
 import com.google.gerrit.server.change.RevisionResource;
 import com.google.gerrit.server.notedb.ChangeNoteUtil;
@@ -69,6 +70,7 @@
 import java.util.regex.Matcher;
 import java.util.regex.Pattern;
 import org.eclipse.jgit.lib.ObjectReader;
+import org.eclipse.jgit.lib.Ref;
 import org.eclipse.jgit.lib.Repository;
 import org.eclipse.jgit.notes.NoteMap;
 import org.eclipse.jgit.revwalk.RevCommit;
@@ -288,6 +290,45 @@
   }
 
   @Test
+  public void postCommentsUnreachableData() throws Exception {
+    String file = "file";
+    PushOneCommit push =
+        pushFactory.create(db, admin.getIdent(), testRepo, "first subject", file, "l1\nl2\n");
+
+    String dest = "refs/for/master";
+    PushOneCommit.Result r1 = push.to(dest);
+    r1.assertOkStatus();
+    String changeId = r1.getChangeId();
+    String revId = r1.getCommit().getName();
+
+    PushOneCommit.Result r2 = amendChange(r1.getChangeId());
+    r2.assertOkStatus();
+
+    String draftRefName = RefNames.refsDraftComments(r1.getChange().getId(), admin.getId());
+
+    DraftInput draft = newDraft(file, Side.REVISION, 1, "comment");
+    addDraft(changeId, "1", draft);
+    ReviewInput reviewInput = new ReviewInput();
+    reviewInput.drafts = DraftHandling.PUBLISH;
+    reviewInput.message = "foo";
+    gApi.changes().id(r1.getChangeId()).revision(1).review(reviewInput);
+
+    addDraft(changeId, "2", newDraft(file, Side.REVISION, 2, "comment2"));
+    reviewInput = new ReviewInput();
+    reviewInput.drafts = DraftHandling.PUBLISH_ALL_REVISIONS;
+    reviewInput.message = "bar";
+    gApi.changes().id(r1.getChangeId()).revision(2).review(reviewInput);
+
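+    // After publishing, no drafts should remain and the user's draft ref in All-Users should be deleted.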
+    Map<String, List<CommentInfo>> drafts = getDraftComments(changeId, revId);
+    assertThat(drafts.isEmpty()).isTrue();
+
+    try (Repository repo = repoManager.openRepository(allUsers)) {
+      Ref ref = repo.exactRef(draftRefName);
+      assertThat(ref).isNull();
+    }
+  }
+
+  @Test
   public void listComments() throws Exception {
     String file = "file";
     PushOneCommit push =
diff --git a/javatests/com/google/gerrit/acceptance/server/change/PatchListCacheIT.java b/javatests/com/google/gerrit/acceptance/server/change/PatchListCacheIT.java
index 1892566..424dbac 100644
--- a/javatests/com/google/gerrit/acceptance/server/change/PatchListCacheIT.java
+++ b/javatests/com/google/gerrit/acceptance/server/change/PatchListCacheIT.java
@@ -39,7 +39,9 @@
 import com.google.gerrit.server.patch.Text;
 import com.google.inject.Inject;
 import com.google.inject.name.Named;
+import java.lang.reflect.Field;
 import java.util.ArrayList;
+import java.util.Arrays;
 import java.util.HashSet;
 import java.util.List;
 import java.util.Optional;
@@ -66,6 +68,25 @@
   private Cache<PatchListKey, PatchList> abstractPatchListCache;
 
   @Test
+  public void ensureLegacyBackendIsUsedForFileCacheBackend() throws Exception {
+    Field fileCacheField = patchListCache.getClass().getDeclaredField("fileCache");
+    fileCacheField.setAccessible(true);
+    // Use reflection to access the "localCache" field, which is only present in the Guava backend.
+    assertThat(
+            Arrays.stream(fileCacheField.get(patchListCache).getClass().getDeclaredFields())
+                .anyMatch(f -> f.getName().equals("localCache")))
+        .isTrue();
+
+    // intraCache (and all other cache backends) should use the Caffeine backend.
+    Field intraCacheField = patchListCache.getClass().getDeclaredField("intraCache");
+    intraCacheField.setAccessible(true);
+    assertThat(
+            Arrays.stream(intraCacheField.get(patchListCache).getClass().getDeclaredFields())
+                .noneMatch(f -> f.getName().equals("localCache")))
+        .isTrue();
+  }
+
+  @Test
   public void listPatchesAgainstBase() throws Exception {
     commitBuilder().add(FILE_D, "4").message(SUBJECT_1).create();
     pushHead(testRepo, "refs/heads/master", false);
diff --git a/javatests/com/google/gerrit/acceptance/ssh/ElasticIndexIT.java b/javatests/com/google/gerrit/acceptance/ssh/ElasticIndexIT.java
index 61a490b..5348c53 100644
--- a/javatests/com/google/gerrit/acceptance/ssh/ElasticIndexIT.java
+++ b/javatests/com/google/gerrit/acceptance/ssh/ElasticIndexIT.java
@@ -36,7 +36,7 @@
 
   @ConfigSuite.Config
   public static Config elasticsearchV7() {
-    return getConfig(ElasticVersion.V7_4);
+    return getConfig(ElasticVersion.V7_5);
   }
 
   @Override
diff --git a/javatests/com/google/gerrit/elasticsearch/BUILD b/javatests/com/google/gerrit/elasticsearch/BUILD
index 328c791..b5378a4 100644
--- a/javatests/com/google/gerrit/elasticsearch/BUILD
+++ b/javatests/com/google/gerrit/elasticsearch/BUILD
@@ -31,6 +31,11 @@
     "//lib/jgit/org.eclipse.jgit:jgit",
 ]
 
+HTTP_TEST_DEPS = [
+    "//lib/httpcomponents:httpasyncclient",
+    "//lib/httpcomponents:httpclient",
+]
+
 QUERY_TESTS_DEP = "//javatests/com/google/gerrit/server/query/%s:abstract_query_tests"
 
 TYPES = [
@@ -67,7 +72,7 @@
     size = "large",
     srcs = [src],
     tags = ELASTICSEARCH_TAGS,
-    deps = ELASTICSEARCH_DEPS + [QUERY_TESTS_DEP % name],
+    deps = ELASTICSEARCH_DEPS + [QUERY_TESTS_DEP % name] + HTTP_TEST_DEPS,
 ) for name, src in ELASTICSEARCH_TESTS_V6.items()]
 
 [junit_tests(
@@ -75,10 +80,7 @@
     size = "large",
     srcs = [src],
     tags = ELASTICSEARCH_TAGS,
-    deps = ELASTICSEARCH_DEPS + [QUERY_TESTS_DEP % name] + [
-        "//lib/httpcomponents:httpasyncclient",
-        "//lib/httpcomponents:httpclient",
-    ],
+    deps = ELASTICSEARCH_DEPS + [QUERY_TESTS_DEP % name] + HTTP_TEST_DEPS,
 ) for name, src in ELASTICSEARCH_TESTS_V7.items()]
 
 junit_tests(
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticContainer.java b/javatests/com/google/gerrit/elasticsearch/ElasticContainer.java
index 73e7eca..c692a3b 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticContainer.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticContainer.java
@@ -51,7 +51,7 @@
       case V6_7:
         return "blacktop/elasticsearch:6.7.2";
       case V6_8:
-        return "blacktop/elasticsearch:6.8.4";
+        return "blacktop/elasticsearch:6.8.5";
       case V7_0:
         return "blacktop/elasticsearch:7.0.1";
       case V7_1:
@@ -62,6 +62,8 @@
         return "blacktop/elasticsearch:7.3.2";
       case V7_4:
         return "blacktop/elasticsearch:7.4.2";
+      case V7_5:
+        return "blacktop/elasticsearch:7.5.0";
     }
     throw new IllegalStateException("No tests for version: " + version.name());
   }
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticV6QueryChangesTest.java b/javatests/com/google/gerrit/elasticsearch/ElasticV6QueryChangesTest.java
index 08f4839..94c5a04 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticV6QueryChangesTest.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticV6QueryChangesTest.java
@@ -21,7 +21,12 @@
 import com.google.gerrit.testing.IndexConfig;
 import com.google.inject.Guice;
 import com.google.inject.Injector;
+import org.apache.http.client.methods.HttpPost;
+import org.apache.http.client.protocol.HttpClientContext;
+import org.apache.http.impl.nio.client.CloseableHttpAsyncClient;
+import org.apache.http.impl.nio.client.HttpAsyncClients;
 import org.eclipse.jgit.lib.Config;
+import org.junit.After;
 import org.junit.AfterClass;
 import org.junit.BeforeClass;
 
@@ -33,6 +38,7 @@
 
   private static ElasticNodeInfo nodeInfo;
   private static ElasticContainer container;
+  private static CloseableHttpAsyncClient client;
 
   @BeforeClass
   public static void startIndexService() {
@@ -43,6 +49,8 @@
 
     container = ElasticContainer.createAndStart(ElasticVersion.V6_8);
     nodeInfo = new ElasticNodeInfo(container.getHttpHost().getPort());
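+    // Shared async HTTP client, used to close indices after each test.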
+    client = HttpAsyncClients.createDefault();
+    client.start();
   }
 
   @AfterClass
@@ -52,6 +60,16 @@
     }
   }
 
+  @After
+  public void closeIndex() {
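+    // Asynchronously close all indices whose names start with the sanitized test method name.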
+    client.execute(
+        new HttpPost(
+            String.format(
+                "http://localhost:%d/%s*/_close", nodeInfo.port, getSanitizedMethodName())),
+        HttpClientContext.create(),
+        null);
+  }
+
   @Override
   protected void initAfterLifecycleStart() throws Exception {
     super.initAfterLifecycleStart();
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryAccountsTest.java b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryAccountsTest.java
index 069ca20..5e533fb 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryAccountsTest.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryAccountsTest.java
@@ -41,7 +41,7 @@
       return;
     }
 
-    container = ElasticContainer.createAndStart(ElasticVersion.V7_4);
+    container = ElasticContainer.createAndStart(ElasticVersion.V7_5);
     nodeInfo = new ElasticNodeInfo(container.getHttpHost().getPort());
   }
 
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryChangesTest.java b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryChangesTest.java
index 0290c7a..3c70ed5 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryChangesTest.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryChangesTest.java
@@ -47,7 +47,7 @@
       return;
     }
 
-    container = ElasticContainer.createAndStart(ElasticVersion.V7_4);
+    container = ElasticContainer.createAndStart(ElasticVersion.V7_5);
     nodeInfo = new ElasticNodeInfo(container.getHttpHost().getPort());
     client = HttpAsyncClients.createDefault();
     client.start();
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryGroupsTest.java b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryGroupsTest.java
index a60068c..f2dcdaa 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryGroupsTest.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryGroupsTest.java
@@ -41,7 +41,7 @@
       return;
     }
 
-    container = ElasticContainer.createAndStart(ElasticVersion.V7_4);
+    container = ElasticContainer.createAndStart(ElasticVersion.V7_5);
     nodeInfo = new ElasticNodeInfo(container.getHttpHost().getPort());
   }
 
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryProjectsTest.java b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryProjectsTest.java
index cf93f6f..e2de7fe 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryProjectsTest.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticV7QueryProjectsTest.java
@@ -41,7 +41,7 @@
       return;
     }
 
-    container = ElasticContainer.createAndStart(ElasticVersion.V7_4);
+    container = ElasticContainer.createAndStart(ElasticVersion.V7_5);
     nodeInfo = new ElasticNodeInfo(container.getHttpHost().getPort());
   }
 
diff --git a/javatests/com/google/gerrit/elasticsearch/ElasticVersionTest.java b/javatests/com/google/gerrit/elasticsearch/ElasticVersionTest.java
index 8198ced..b5c455d 100644
--- a/javatests/com/google/gerrit/elasticsearch/ElasticVersionTest.java
+++ b/javatests/com/google/gerrit/elasticsearch/ElasticVersionTest.java
@@ -63,6 +63,9 @@
 
     assertThat(ElasticVersion.forVersion("7.4.0")).isEqualTo(ElasticVersion.V7_4);
     assertThat(ElasticVersion.forVersion("7.4.1")).isEqualTo(ElasticVersion.V7_4);
+
+    assertThat(ElasticVersion.forVersion("7.5.0")).isEqualTo(ElasticVersion.V7_5);
+    assertThat(ElasticVersion.forVersion("7.5.1")).isEqualTo(ElasticVersion.V7_5);
   }
 
   @Test
@@ -88,6 +91,7 @@
     assertThat(ElasticVersion.V7_2.isAtLeastMinorVersion(ElasticVersion.V6_7)).isFalse();
     assertThat(ElasticVersion.V7_3.isAtLeastMinorVersion(ElasticVersion.V6_7)).isFalse();
     assertThat(ElasticVersion.V7_4.isAtLeastMinorVersion(ElasticVersion.V6_7)).isFalse();
+    assertThat(ElasticVersion.V7_5.isAtLeastMinorVersion(ElasticVersion.V6_7)).isFalse();
   }
 
   @Test
@@ -105,6 +109,7 @@
     assertThat(ElasticVersion.V7_2.isV6OrLater()).isTrue();
     assertThat(ElasticVersion.V7_3.isV6OrLater()).isTrue();
     assertThat(ElasticVersion.V7_4.isV6OrLater()).isTrue();
+    assertThat(ElasticVersion.V7_5.isV6OrLater()).isTrue();
   }
 
   @Test
@@ -122,5 +127,6 @@
     assertThat(ElasticVersion.V7_2.isV7OrLater()).isTrue();
     assertThat(ElasticVersion.V7_3.isV7OrLater()).isTrue();
     assertThat(ElasticVersion.V7_4.isV7OrLater()).isTrue();
+    assertThat(ElasticVersion.V7_5.isV7OrLater()).isTrue();
   }
 }
diff --git a/lib/BUILD b/lib/BUILD
index 0474171..77b2fcd 100644
--- a/lib/BUILD
+++ b/lib/BUILD
@@ -1,4 +1,4 @@
-load("@rules_java//java:defs.bzl", "java_library")
+load("@rules_java//java:defs.bzl", "java_import", "java_library")
 
 exports_files(glob([
     "LICENSE-*",
@@ -99,6 +99,29 @@
 )
 
 java_library(
+    name = "caffeine",
+    data = ["//lib:LICENSE-Apache2.0"],
+    visibility = [
+        "//java/com/google/gerrit/server/cache/mem:__pkg__",
+    ],
+    exports = ["@caffeine//jar"],
+)
+
+java_import(
+    name = "caffeine-guava-renamed",
+    jars = ["@caffeine-guava-renamed//file"],
+)
+
+java_library(
+    name = "caffeine-guava",
+    data = ["//lib:LICENSE-Apache2.0"],
+    visibility = [
+        "//java/com/google/gerrit/server/cache/mem:__pkg__",
+    ],
+    exports = [":caffeine-guava-renamed"],
+)
+
+java_library(
     name = "jsch",
     data = ["//lib:LICENSE-jsch"],
     visibility = ["//visibility:public"],
diff --git a/lib/js/bower_archives.bzl b/lib/js/bower_archives.bzl
index 75c8277..d84a2b6 100644
--- a/lib/js/bower_archives.bzl
+++ b/lib/js/bower_archives.bzl
@@ -26,10 +26,10 @@
         sha1 = "849ad3ee7c77506548b7b5db603a4e150b9431aa",
     )
     bower_archive(
-        name = "font-roboto",
-        package = "PolymerElements/font-roboto",
+        name = "font-roboto-local",
+        package = "PolymerElements/font-roboto-local",
         version = "1.1.0",
-        sha1 = "ab4218d87b9ce569d6282b01f7642e551879c3d5",
+        sha1 = "de651abf9b1b2d0935f7b264d48131677196412f",
     )
     bower_archive(
         name = "iron-a11y-announcer",
@@ -136,8 +136,10 @@
     bower_archive(
         name = "paper-styles",
         package = "PolymerElements/paper-styles",
-        version = "1.3.1",
-        sha1 = "4ee9c692366949a754e0e39f8031aa60ce66f24d",
+        # Basically 1.3.1 but with
+        # https://github.com/PolymerElements/paper-styles/pull/164 applied
+        version = "dd0b13e186b9690d5e74a93f6e51e0835ea60495",
+        sha1 = "f859a8dee403fbb724e8d0cf009db79c6dd61b47",
     )
     bower_archive(
         name = "sinon-chai",
diff --git a/lib/js/bower_components.bzl b/lib/js/bower_components.bzl
index a540828..64ab611 100644
--- a/lib/js/bower_components.bzl
+++ b/lib/js/bower_components.bzl
@@ -30,7 +30,7 @@
         seed = True,
     )
     bower_component(
-        name = "font-roboto",
+        name = "font-roboto-local",
         license = "//lib:LICENSE-polymer",
     )
     bower_component(
@@ -286,7 +286,7 @@
         name = "paper-styles",
         license = "//lib:LICENSE-polymer",
         deps = [
-            ":font-roboto",
+            ":font-roboto-local",
             ":iron-flex-layout",
             ":polymer",
         ],
diff --git a/plugins/hooks b/plugins/hooks
index e912780..57c8e62 160000
--- a/plugins/hooks
+++ b/plugins/hooks
@@ -1 +1 @@
-Subproject commit e9127800c4efe8ad9601c7b1152783cd12a55bc2
+Subproject commit 57c8e6261016e1cc233ea41f19fa5bdf187df8a7
diff --git a/polygerrit-ui/app/elements/change-list/gr-change-list-item/gr-change-list-item.html b/polygerrit-ui/app/elements/change-list/gr-change-list-item/gr-change-list-item.html
index 568fe12..2374677 100644
--- a/polygerrit-ui/app/elements/change-list/gr-change-list-item/gr-change-list-item.html
+++ b/polygerrit-ui/app/elements/change-list/gr-change-list-item/gr-change-list-item.html
@@ -90,7 +90,6 @@
       a {
         color: inherit;
         cursor: pointer;
-        display: inline-block;
         text-decoration: none;
       }
       a:hover {
diff --git a/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.html b/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.html
index d5f1dc3..4a939e6 100644
--- a/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.html
+++ b/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.html
@@ -77,10 +77,10 @@
               data-name$="[[item.name]]"
               data-url$="[[item.url]]"
               on-tap="_handleShowAgreement"
-              disabled$="[[_disableAggreements(item, _groups, _signedAgreements)]]">
+              disabled$="[[_disableAgreements(item, _groups, _signedAgreements)]]">
           <label id="claNewAgreementsLabel">[[item.name]]</label>
         </span>
-        <div class$="alreadySubmittedText [[_hideAggreements(item, _groups, _signedAgreements)]]">
+        <div class$="alreadySubmittedText [[_hideAgreements(item, _groups, _signedAgreements)]]">
           Agreement already submitted.
         </div>
         <div class="agreementsUrl">
diff --git a/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.js b/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.js
index c771332..493b179 100644
--- a/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.js
+++ b/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view.js
@@ -107,7 +107,8 @@
       return agreements ? 'show' : '';
     },
 
-    _disableAggreements(item, groups, signedAgreements) {
+    _disableAgreements(item, groups, signedAgreements) {
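+      // Groups may not be loaded yet; in that case do not disable the agreement.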
+      if (!groups) return false;
       for (const group of groups) {
         if ((item && item.auto_verify_group &&
             item.auto_verify_group.id === group.id) ||
@@ -118,8 +119,8 @@
       return false;
     },
 
-    _hideAggreements(item, groups, signedAgreements) {
-      return this._disableAggreements(item, groups, signedAgreements) ?
+    _hideAgreements(item, groups, signedAgreements) {
+      return this._disableAgreements(item, groups, signedAgreements) ?
           '' : 'hide';
     },
 
@@ -131,6 +132,7 @@
     // if specified it returns 'hideAgreementsTextBox' which
     // then hides the text box and submit button.
     _computeHideAgreementClass(name, config) {
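+      // Config may not be loaded yet; in that case do not hide the agreement text box.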
+      if (!config) return '';
       for (const key in config) {
         if (!config.hasOwnProperty(key)) { continue; }
         for (const prop in config[key]) {
diff --git a/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view_test.html b/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view_test.html
index 2304d15..aec52cb 100644
--- a/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view_test.html
+++ b/polygerrit-ui/app/elements/settings/gr-cla-view/gr-cla-view_test.html
@@ -140,28 +140,31 @@
           'none');
     });
 
-    test('_disableAggreements', () => {
+    test('_disableAgreements', () => {
       // In the auto verify group and have not yet signed agreement
       assert.isTrue(
-          element._disableAggreements(auth, groups, signedAgreements));
+          element._disableAgreements(auth, groups, signedAgreements));
       // Not in the auto verify group and have not yet signed agreement
       assert.isFalse(
-          element._disableAggreements(auth2, groups, signedAgreements));
+          element._disableAgreements(auth2, groups, signedAgreements));
       // Not in the auto verify group, have signed agreement
       assert.isTrue(
-          element._disableAggreements(auth3, groups, signedAgreements));
+          element._disableAgreements(auth3, groups, signedAgreements));
+      // Make sure the undefined check works
+      assert.isFalse(
+          element._disableAgreements(auth, undefined, signedAgreements));
     });
 
-    test('_hideAggreements', () => {
+    test('_hideAgreements', () => {
       // Not in the auto verify group and have not yet signed agreement
       assert.equal(
-          element._hideAggreements(auth, groups, signedAgreements), '');
+          element._hideAgreements(auth, groups, signedAgreements), '');
       // In the auto verify group
       assert.equal(
-          element._hideAggreements(auth2, groups, signedAgreements), 'hide');
+          element._hideAgreements(auth2, groups, signedAgreements), 'hide');
       // Not in the auto verify group, have signed agreement
       assert.equal(
-          element._hideAggreements(auth3, groups, signedAgreements), '');
+          element._hideAgreements(auth3, groups, signedAgreements), '');
     });
 
     test('_disableAgreementsText', () => {
diff --git a/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.html b/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.html
index f4b120a..42b6f94 100644
--- a/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.html
+++ b/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.html
@@ -147,7 +147,7 @@
         id="dropdown"
         vertical-align="top"
         allow-outside-scroll="true"
-        on-tap="_handleDropdownTap">
+        on-click="_handleDropdownClick">
       <paper-listbox
           class="dropdown-content"
           slot="dropdown-content"
diff --git a/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.js b/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.js
index 40d8811..c8f4f63 100644
--- a/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.js
+++ b/polygerrit-ui/app/elements/shared/gr-dropdown-list/gr-dropdown-list.js
@@ -75,7 +75,7 @@
      * Handle a click on the iron-dropdown element.
      * @param {!Event} e
      */
-    _handleDropdownTap(e) {
+    _handleDropdownClick(e) {
       // async is needed so that the click event is fired before the
       // dropdown closes (This was a bug for touch devices).
       this.async(() => {
diff --git a/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.html b/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.html
index 47ab03a..373e77d 100644
--- a/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.html
+++ b/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.html
@@ -109,7 +109,7 @@
         vertical-offset="[[verticalOffset]]"
         allow-outside-scroll="true"
         horizontal-align="[[horizontalAlign]]"
-        on-tap="_handleDropdownTap">
+        on-click="_handleDropdownClick">
       <div class="dropdown-content" slot="dropdown-content">
         <ul>
           <template is="dom-if" if="[[topContent]]">
diff --git a/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.js b/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.js
index 974f179..c06ffc9 100644
--- a/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.js
+++ b/polygerrit-ui/app/elements/shared/gr-dropdown/gr-dropdown.js
@@ -152,7 +152,7 @@
      * Handle a click on the iron-dropdown element.
      * @param {!Event} e
      */
-    _handleDropdownTap(e) {
+    _handleDropdownClick(e) {
       this._close();
     },
 
diff --git a/polygerrit-ui/app/elements/shared/gr-rest-api-interface/gr-rest-api-interface.js b/polygerrit-ui/app/elements/shared/gr-rest-api-interface/gr-rest-api-interface.js
index dd4f04f..79e3fa8 100644
--- a/polygerrit-ui/app/elements/shared/gr-rest-api-interface/gr-rest-api-interface.js
+++ b/polygerrit-ui/app/elements/shared/gr-rest-api-interface/gr-rest-api-interface.js
@@ -2668,7 +2668,7 @@
         method: 'POST',
         url: '/accounts/self/sshkeys',
         body: key,
-        contentType: 'plain/text',
+        contentType: 'text/plain',
         reportUrlAsIs: true,
       };
       return this._send(req)
diff --git a/polygerrit-ui/app/rules.bzl b/polygerrit-ui/app/rules.bzl
index 3012f7f..ca8c402 100644
--- a/polygerrit-ui/app/rules.bzl
+++ b/polygerrit-ui/app/rules.bzl
@@ -90,6 +90,8 @@
             # we extract from the zip, but depend on the component for license checking.
             "@webcomponentsjs//:zipfile",
             "//lib/js:webcomponentsjs",
+            "@font-roboto-local//:zipfile",
+            "//lib/js:font-roboto-local",
         ],
         outs = outs,
         cmd = " && ".join([
@@ -101,6 +103,7 @@
             "for f in $(locations " + name + "_theme_sources); do cp $$f $$TMP/polygerrit_ui/styles/themes; done",
             "for f in $(locations //lib/js:highlightjs_files); do cp $$f $$TMP/polygerrit_ui/bower_components/highlightjs/ ; done",
             "unzip -qd $$TMP/polygerrit_ui/bower_components $(location @webcomponentsjs//:zipfile) webcomponentsjs/webcomponents-lite.js",
+            "unzip -qd $$TMP/polygerrit_ui/bower_components $(location @font-roboto-local//:zipfile) font-roboto-local/fonts/\*/\*.ttf",
             "cd $$TMP",
             "find . -exec touch -t 198001010000 '{}' ';'",
             "zip -qr $$ROOT/$@ *",
diff --git a/tools/js/bower2bazel.py b/tools/js/bower2bazel.py
index 7b24524..7d00c0d 100755
--- a/tools/js/bower2bazel.py
+++ b/tools/js/bower2bazel.py
@@ -38,7 +38,7 @@
     "codemirror-minified": "codemirror-minified",
     "es6-promise": "es6-promise",
     "fetch": "fetch",
-    "font-roboto": "polymer",
+    "font-roboto-local": "polymer",
     "iron-a11y-announcer": "polymer",
     "iron-a11y-keys-behavior": "polymer",
     "iron-autogrow-textarea": "polymer",
@@ -77,7 +77,7 @@
     "paper-behaviors": "polymer",
     "paper-ripple": "polymer",
     "iron-checked-element-behavior": "polymer",
-    "font-roboto": "polymer",
+    "font-roboto-local": "polymer",
 }
 
 
diff --git a/tools/maven/package.bzl b/tools/maven/package.bzl
index 11e569d..b25656d 100644
--- a/tools/maven/package.bzl
+++ b/tools/maven/package.bzl
@@ -17,6 +17,14 @@
     "echo \"# this script should run from the root of your workspace.\" >> $@",
     "echo \"set -e\" >> $@",
     "echo \"\" >> $@",
+    "echo 'function bazel_cmd() {' >> $@",
+    "echo '  if [[ `which bazelisk` ]]; then' >> $@",
+    "echo '    bazelisk \"$$@\"' >> $@",
+    "echo '  else' >> $@",
+    "echo '    bazel \"$$@\"' >> $@",
+    "echo '  fi' >> $@",
+    "echo '}' >> $@",
+    "echo \"\" >> $@",
     "echo 'if [[ \"$$VERBOSE\" ]]; then set -x ; fi' >> $@",
     "echo \"\" >> $@",
     "echo %s >> $@",
@@ -32,7 +40,7 @@
         src = {},
         doc = {},
         war = {}):
-    build_cmd = ["bazel", "build"]
+    build_cmd = ["bazel_cmd", "build"]
     mvn_cmd = ["python", "tools/maven/mvn.py", "-v", version]
     api_cmd = mvn_cmd[:]
     api_targets = []
diff --git a/tools/nongoogle.bzl b/tools/nongoogle.bzl
index 48671e3..1552595 100644
--- a/tools/nongoogle.bzl
+++ b/tools/nongoogle.bzl
@@ -61,8 +61,8 @@
     # elasticsearch-rest-client explicitly depends on this version
     maven_jar(
         name = "httpcore-nio",
-        artifact = "org.apache.httpcomponents:httpcore-nio:4.4.11",
-        sha1 = "7d0a97d01d39cff9aa3e6db81f21fddb2435f4e6",
+        artifact = "org.apache.httpcomponents:httpcore-nio:4.4.12",
+        sha1 = "84cd29eca842f31db02987cfedea245af020198b",
     )
 
     maven_jar(
@@ -94,8 +94,8 @@
     # and httpasyncclient as necessary.
     maven_jar(
         name = "elasticsearch-rest-client",
-        artifact = "org.elasticsearch.client:elasticsearch-rest-client:7.4.2",
-        sha1 = "f48725523c0b3402f869214433602f8d3f4c737c",
+        artifact = "org.elasticsearch.client:elasticsearch-rest-client:7.5.0",
+        sha1 = "62535b6fc3a4e943e88e7640eac22e29f03a696d",
     )
 
     maven_jar(
@@ -162,18 +162,18 @@
         sha1 = "3e83394258ae2089be7219b971ec21a8288528ad",
     )
 
-    TESTCONTAINERS_VERSION = "1.12.3"
+    TESTCONTAINERS_VERSION = "1.12.4"
 
     maven_jar(
         name = "testcontainers",
         artifact = "org.testcontainers:testcontainers:" + TESTCONTAINERS_VERSION,
-        sha1 = "e424a4549640e120acceac641ac909fcda58bf62",
+        sha1 = "456b6facac12c4b67130d9056a43c011679e9f0c",
     )
 
     maven_jar(
         name = "testcontainers-elasticsearch",
         artifact = "org.testcontainers:elasticsearch:" + TESTCONTAINERS_VERSION,
-        sha1 = "c0796de5032070b8768ce78c78949b48f13c30db",
+        sha1 = "9e210c277a35a95a76d03a79e2812575bd07391c",
     )
 
     maven_jar(