Merge history of v1.13.11.

For older versions of repo, this makes it easier to perform a self-update
by making this release a fast-forward from the following tags:

  v1.13.9.2, v1.13.9.3, v1.13.9.4, v1.13.10, v1.13.11

Change-Id: Ia75776312eaf802a150db8bd7c0a6dce57914580
diff --git a/.flake8 b/.flake8
index 45ab656..6b824e9 100644
--- a/.flake8
+++ b/.flake8
@@ -1,3 +1,15 @@
 [flake8]
-max-line-length=80
-ignore=E111,E114,E402
+max-line-length=100
+ignore=
+    # E111: Indentation is not a multiple of four
+    E111,
+    # E114: Indentation is not a multiple of four (comment)
+    E114,
+    # E402: Module level import not at top of file
+    E402,
+    # E731: do not assign a lambda expression, use a def
+    E731,
+    # W503: Line break before binary operator
+    W503,
+    # W504: Line break after binary operator
+    W504
diff --git a/.github/workflows/test-ci.yml b/.github/workflows/test-ci.yml
new file mode 100644
index 0000000..1988185
--- /dev/null
+++ b/.github/workflows/test-ci.yml
@@ -0,0 +1,31 @@
+# GitHub actions workflow.
+# https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions
+
+name: Test CI
+
+on:
+  push:
+    branches: [main, repo-1, stable, maint]
+    tags: [v*]
+
+jobs:
+  test:
+    strategy:
+      fail-fast: false
+      matrix:
+        os: [ubuntu-latest, macos-latest, windows-latest]
+        python-version: [3.6, 3.7, 3.8, 3.9]
+    runs-on: ${{ matrix.os }}
+
+    steps:
+    - uses: actions/checkout@v2
+    - name: Set up Python ${{ matrix.python-version }}
+      uses: actions/setup-python@v1
+      with:
+        python-version: ${{ matrix.python-version }}
+    - name: Install dependencies
+      run: |
+        python -m pip install --upgrade pip
+        pip install tox tox-gh-actions
+    - name: Test with tox
+      run: tox
diff --git a/.gitignore b/.gitignore
index 3796244..4e91be9 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,3 +1,4 @@
+*.asc
 *.egg-info/
 *.log
 *.pyc
@@ -6,6 +7,7 @@
 .repopickle_*
 /repoc
 /.tox
+/.venv
 
 # PyCharm related
 /.idea/
diff --git a/.mailmap b/.mailmap
index 905139d..caf1c98 100644
--- a/.mailmap
+++ b/.mailmap
@@ -4,6 +4,7 @@
 Hu Xiuyun <xiuyun.hu@hisilicon.com>           Hu Xiuyun <clouds08@qq.com>
 Jelly Chen <chenguodong@huawei.com>           chenguodong <chenguodong@huawei.com>
 Jia Bi <bijia@xiaomi.com>                     bijia <bijia@xiaomi.com>
+Jiri Tyr <jiri.tyr@gmail.com>                 Jiri tyr <jiri.tyr@gmail.com>
 JoonCheol Park <jooncheol@gmail.com>          Jooncheol Park <jooncheol@gmail.com>
 Sergii Pylypenko <x.pelya.x@gmail.com>        pelya <x.pelya.x@gmail.com>
 Shawn Pearce <sop@google.com>                 Shawn O. Pearce <sop@google.com>
diff --git a/README.md b/README.md
index 5c88635..5519e9a 100644
--- a/README.md
+++ b/README.md
@@ -6,15 +6,29 @@
 easier to work with Git.  The repo command is an executable Python script
 that you can put anywhere in your path.
 
-* Homepage: https://gerrit.googlesource.com/git-repo/
-* Bug reports: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
-* Source: https://gerrit.googlesource.com/git-repo/
-* Overview: https://source.android.com/source/developing.html
-* Docs: https://source.android.com/source/using-repo.html
+* Homepage: <https://gerrit.googlesource.com/git-repo/>
+* Mailing list: [repo-discuss on Google Groups][repo-discuss]
+* Bug reports: <https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo>
+* Source: <https://gerrit.googlesource.com/git-repo/>
+* Overview: <https://source.android.com/source/developing.html>
+* Docs: <https://source.android.com/source/using-repo.html>
 * [repo Manifest Format](./docs/manifest-format.md)
 * [repo Hooks](./docs/repo-hooks.md)
 * [Submitting patches](./SUBMITTING_PATCHES.md)
 * Running Repo in [Microsoft Windows](./docs/windows.md)
+* GitHub mirror: <https://github.com/GerritCodeReview/git-repo>
+* Postsubmit tests: <https://github.com/GerritCodeReview/git-repo/actions>
+
+## Contact
+
+Please use the [repo-discuss] mailing list or [issue tracker] for questions.
+
+You can [file a new bug report][new-bug] under the "repo" component.
+
+Please do not e-mail individual developers for support.
+They do not have the bandwidth for it, and often the questions have already
+been asked on [repo-discuss] or filed in the [issue tracker].
+So please search those sites first.
 
 ## Install
 
@@ -34,3 +48,8 @@
 $ curl https://storage.googleapis.com/git-repo-downloads/repo > ~/.bin/repo
 $ chmod a+rx ~/.bin/repo
 ```
+
+
+[new-bug]: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
+[issue tracker]: https://bugs.chromium.org/p/gerrit/issues/list?q=component:repo
+[repo-discuss]: https://groups.google.com/forum/#!forum/repo-discuss
diff --git a/SUBMITTING_PATCHES.md b/SUBMITTING_PATCHES.md
index 5021e7e..0c18924 100644
--- a/SUBMITTING_PATCHES.md
+++ b/SUBMITTING_PATCHES.md
@@ -4,13 +4,13 @@
 
  - Make small logical changes.
  - Provide a meaningful commit message.
- - Check for coding errors and style nits with pyflakes and flake8
+ - Check for coding errors and style nits with flake8.
  - Make sure all code is under the Apache License, 2.0.
  - Publish your changes for review.
  - Make corrections if requested.
  - Verify your changes on gerrit so they can be submitted.
 
-   `git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/master`
+   `git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/main`
 
 
 # Long Version
@@ -38,34 +38,30 @@
 probably need to split up your commit to finer grained pieces.
 
 
-## Check for coding errors and style nits with pyflakes and flake8
+## Check for coding errors and style violations with flake8
 
-### Coding errors
-
-Run `pyflakes` on changed modules:
-
-    pyflakes file.py
-
-Ideally there should be no new errors or warnings introduced.
-
-### Style violations
-
-Run `flake8` on changes modules:
+Run `flake8` on changed modules:
 
     flake8 file.py
 
-Note that repo generally follows [Google's python style guide] rather than
-[PEP 8], so it's possible that the output of `flake8` will be quite noisy.
-It's not mandatory to avoid all warnings, but at least the maximum line
-length should be followed.
+Note that repo generally follows [Google's Python Style Guide] rather than
+[PEP 8], with a couple of notable exceptions:
 
-If there are many occurrences of the same warning that cannot be
-avoided without going against the Google style guide, these may be
-suppressed in the included `.flake8` file.
+* Indentation is at 2 columns rather than 4
+* The maximum line length is 100 columns rather than 80
 
-[Google's python style guide]: https://google.github.io/styleguide/pyguide.html
+There should be no new errors or warnings introduced.
+
+Warnings that cannot be avoided without going against the Google Style Guide
+may be suppressed inline individually using a `# noqa` comment as described
+in the [flake8 documentation].
+
+If there are many occurrences of the same warning, these may be suppressed for
+the entire project in the included `.flake8` file.
+
+[Google's Python Style Guide]: https://google.github.io/styleguide/pyguide.html
 [PEP 8]: https://www.python.org/dev/peps/pep-0008/
-
+[flake8 documentation]: https://flake8.pycqa.org/en/3.1.1/user/ignoring-errors.html#in-line-ignoring-errors
 
 ## Running tests
 
@@ -154,7 +150,7 @@
 a remembered remote to make this easier in the future:
 
     git config remote.review.url https://gerrit-review.googlesource.com/git-repo
-    git config remote.review.push HEAD:refs/for/master
+    git config remote.review.push HEAD:refs/for/main
 
     git push review
 
diff --git a/color.py b/color.py
index 5b3a282..fdd7253 100644
--- a/color.py
+++ b/color.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -84,6 +82,7 @@
     code = ''
   return code
 
+
 DEFAULT = None
 
 
diff --git a/command.py b/command.py
index 9e113f1..b972a0b 100644
--- a/command.py
+++ b/command.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,25 +12,65 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+import multiprocessing
 import os
 import optparse
-import platform
 import re
 import sys
 
 from event_log import EventLog
 from error import NoSuchProjectError
 from error import InvalidProjectGroupsError
+import progress
+
+
+# Are we generating man-pages?
+GENERATE_MANPAGES = os.environ.get('_REPO_GENERATE_MANPAGES_') == ' indeed! '
+
+
+# Number of projects to submit to a single worker process at a time.
+# This number represents a tradeoff between the overhead of IPC and finer
+# grained opportunity for parallelism. This particular value was chosen by
+# iterating through powers of two until the overall performance no longer
+# improved. The performance of this batch size is not a function of the
+# number of cores on the system.
+WORKER_BATCH_SIZE = 32
+
+
+# How many jobs to run in parallel by default?  This assumes the jobs are
+# largely I/O bound and do not hit the network.
+DEFAULT_LOCAL_JOBS = min(os.cpu_count(), 8)
 
 
 class Command(object):
   """Base class for any command line action in repo.
   """
 
-  common = False
+  # Singleton for all commands to track overall repo command execution and
+  # provide event summary to callers.  Only used by sync subcommand currently.
+  #
+  # NB: This is being replaced by git trace2 events.  See git_trace2_event_log.
   event_log = EventLog()
-  manifest = None
-  _optparse = None
+
+  # Whether this command is a "common" one, i.e. whether the user would commonly
+  # use it or it's a more uncommon command.  This is used by the help command to
+  # show short-vs-full summaries.
+  COMMON = False
+
+  # Whether this command supports running in parallel.  If greater than 0,
+  # it is the number of parallel jobs to default to.
+  PARALLEL_JOBS = None
+
+  def __init__(self, repodir=None, client=None, manifest=None, gitc_manifest=None,
+               git_event_log=None):
+    self.repodir = repodir
+    self.client = client
+    self.manifest = manifest
+    self.gitc_manifest = gitc_manifest
+    self.git_event_log = git_event_log
+
+    # Cache for the OptionParser property.
+    self._optparse = None
 
   def WantPager(self, _opt):
     return False
@@ -66,13 +104,39 @@
         usage = self.helpUsage.strip().replace('%prog', me)
       except AttributeError:
         usage = 'repo %s' % self.NAME
-      self._optparse = optparse.OptionParser(usage=usage)
+      epilog = 'Run `repo help %s` to view the detailed manual.' % self.NAME
+      self._optparse = optparse.OptionParser(usage=usage, epilog=epilog)
+      self._CommonOptions(self._optparse)
       self._Options(self._optparse)
     return self._optparse
 
-  def _Options(self, p):
-    """Initialize the option parser.
+  def _CommonOptions(self, p, opt_v=True):
+    """Initialize the option parser with common options.
+
+    These will show up for *all* subcommands, so use sparingly.
+    NB: Keep in sync with repo:InitParser().
     """
+    g = p.add_option_group('Logging options')
+    opts = ['-v'] if opt_v else []
+    g.add_option(*opts, '--verbose',
+                 dest='output_mode', action='store_true',
+                 help='show all output')
+    g.add_option('-q', '--quiet',
+                 dest='output_mode', action='store_false',
+                 help='only show errors')
+
+    if self.PARALLEL_JOBS is not None:
+      default = 'based on number of CPU cores'
+      if not GENERATE_MANPAGES:
+        # Only include active cpu count if we aren't generating man pages.
+        default = f'%default; {default}'
+      p.add_option(
+          '-j', '--jobs',
+          type=int, default=self.PARALLEL_JOBS,
+          help=f'number of jobs to run in parallel (default: {default})')
+
+  def _Options(self, p):
+    """Initialize the option parser with subcommand-specific options."""
 
   def _RegisteredEnvironmentOptions(self):
     """Get options that can be set from environment variables.
@@ -98,6 +162,11 @@
     self.OptionParser.print_usage()
     sys.exit(1)
 
+  def CommonValidateOptions(self, opt, args):
+    """Validate common options."""
+    opt.quiet = opt.output_mode is False
+    opt.verbose = opt.output_mode is True
+
   def ValidateOptions(self, opt, args):
     """Validate the user options & arguments before executing.
 
@@ -113,6 +182,44 @@
     """
     raise NotImplementedError
 
+  @staticmethod
+  def ExecuteInParallel(jobs, func, inputs, callback, output=None, ordered=False):
+    """Helper for managing parallel execution boiler plate.
+
+    For subcommands that can easily split their work up.
+
+    Args:
+      jobs: How many parallel processes to use.
+      func: The function to apply to each of the |inputs|.  Usually a
+          functools.partial for wrapping additional arguments.  It will be run
+          in a separate process, so it must be picklable; nested functions
+          won't work.  Methods on the subcommand Command class should work.
+      inputs: The list of items to process.  Must be a list.
+      callback: The function to pass the results to for processing.  It will be
+          executed in the main thread and process the results of |func| as they
+          become available.  Thus it may be a local nested function.  Its return
+          value is passed back directly.  It takes three arguments:
+          - The processing pool (or None with one job).
+          - The |output| argument.
+          - An iterator for the results.
+      output: An output manager.  May be progress.Progress or color.Coloring.
+      ordered: Whether the jobs should be processed in order.
+
+    Returns:
+      The |callback| function's results are returned.
+    """
+    try:
+      # NB: Multiprocessing is heavy, so don't spin it up for one job.
+      if len(inputs) == 1 or jobs == 1:
+        return callback(None, output, (func(x) for x in inputs))
+      else:
+        with multiprocessing.Pool(jobs) as pool:
+          submit = pool.imap if ordered else pool.imap_unordered
+          return callback(pool, output, submit(func, inputs, chunksize=WORKER_BATCH_SIZE))
+    finally:
+      if isinstance(output, progress.Progress):
+        output.end()
+
   def _ResetPathToProjectMap(self, projects):
     self._by_path = dict((p.worktree, p) for p in projects)
 
@@ -123,9 +230,9 @@
     project = None
     if os.path.exists(path):
       oldpath = None
-      while path and \
-            path != oldpath and \
-            path != manifest.topdir:
+      while (path and
+             path != oldpath and
+             path != manifest.topdir):
         try:
           project = self._by_path[path]
           break
@@ -156,9 +263,7 @@
     mp = manifest.manifestProject
 
     if not groups:
-      groups = mp.config.GetString('manifest.groups')
-    if not groups:
-      groups = 'default,platform-' + platform.system().lower()
+      groups = manifest.GetGroupsStr()
     groups = [x for x in re.split(r'[,\s]+', groups) if x]
 
     if not args:
@@ -236,6 +341,7 @@
   """Command which requires user interaction on the tty and
      must not run within a pager, even if the user asks to.
   """
+
   def WantPager(self, _opt):
     return False
 
@@ -244,6 +350,7 @@
   """Command which defaults to output in a pager, as its
      display tends to be larger than one screen full.
   """
+
   def WantPager(self, _opt):
     return True
 
diff --git a/completion.bash b/completion.bash
new file mode 100644
index 0000000..09291d5
--- /dev/null
+++ b/completion.bash
@@ -0,0 +1,156 @@
+# Copyright 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# Programmable bash completion.  https://github.com/scop/bash-completion
+
+# TODO: Handle interspersed options.  We handle `repo h<tab>`, but not
+# `repo --time h<tab>`.
+
+# Complete the list of repo subcommands.
+__complete_repo_list_commands() {
+  local repo=${COMP_WORDS[0]}
+  (
+  # Handle completions if running outside of a checkout.
+  if ! "${repo}" help --all 2>/dev/null; then
+    repo help 2>/dev/null
+  fi
+  ) | sed -n '/^  /{s/  \([^ ]\+\) .\+/\1/;p}'
+}
+
+# Complete list of all branches available in all projects in the repo client
+# checkout.
+__complete_repo_list_branches() {
+  local repo=${COMP_WORDS[0]}
+  "${repo}" branches 2>/dev/null | \
+    sed -n '/|/{s/[ *][Pp ] *\([^ ]\+\) .*/\1/;p}'
+}
+
+# Complete list of all projects available in the repo client checkout.
+__complete_repo_list_projects() {
+  local repo=${COMP_WORDS[0]}
+  "${repo}" list -n 2>/dev/null
+  "${repo}" list -p --relative-to=. 2>/dev/null
+}
+
+# Complete the repo <command> argument.
+__complete_repo_command() {
+  if [[ ${COMP_CWORD} -ne 1 ]]; then
+    return 1
+  fi
+
+  local command=${COMP_WORDS[1]}
+  COMPREPLY=($(compgen -W "$(__complete_repo_list_commands)" -- "${command}"))
+  return 0
+}
+
+# Complete repo subcommands that take <branch> <projects>.
+__complete_repo_command_branch_projects() {
+  local current=$1
+  if [[ ${COMP_CWORD} -eq 2 ]]; then
+    COMPREPLY=($(compgen -W "$(__complete_repo_list_branches)" -- "${current}"))
+  else
+    COMPREPLY=($(compgen -W "$(__complete_repo_list_projects)" -- "${current}"))
+  fi
+}
+
+# Complete repo subcommands that take only <projects>.
+__complete_repo_command_projects() {
+  local current=$1
+  COMPREPLY=($(compgen -W "$(__complete_repo_list_projects)" -- "${current}"))
+}
+
+# Complete `repo help`.
+__complete_repo_command_help() {
+  local current=$1
+  # CWORD=1 is "help".
+  # CWORD=2 is the <subcommand> which we complete here.
+  if [[ ${COMP_CWORD} -eq 2 ]]; then
+    COMPREPLY=(
+      $(compgen -W "$(__complete_repo_list_commands)" -- "${current}")
+    )
+  fi
+}
+
+# Complete `repo forall`.
+__complete_repo_command_forall() {
+  local current=$1
+  # CWORD=1 is "forall".
+  # CWORD=2+ are <projects> *until* we hit the -c option.
+  local i
+  for (( i = 0; i < COMP_CWORD; ++i )); do
+    if [[ "${COMP_WORDS[i]}" == "-c" ]]; then
+      return 0
+    fi
+  done
+
+  COMPREPLY=(
+    $(compgen -W "$(__complete_repo_list_projects)" -- "${current}")
+  )
+}
+
+# Complete `repo start`.
+__complete_repo_command_start() {
+  local current=$1
+  # CWORD=1 is "start".
+  # CWORD=2 is the <branch> which we don't complete.
+  # CWORD=3+ are <projects> which we complete here.
+  if [[ ${COMP_CWORD} -gt 2 ]]; then
+    COMPREPLY=(
+      $(compgen -W "$(__complete_repo_list_projects)" -- "${current}")
+    )
+  fi
+}
+
+# Complete the repo subcommand arguments.
+__complete_repo_arg() {
+  if [[ ${COMP_CWORD} -le 1 ]]; then
+    return 1
+  fi
+
+  local command=${COMP_WORDS[1]}
+  local current=${COMP_WORDS[COMP_CWORD]}
+  case ${command} in
+  abandon|checkout)
+    __complete_repo_command_branch_projects "${current}"
+    return 0
+    ;;
+
+  branch|branches|diff|info|list|overview|prune|rebase|smartsync|stage|status|\
+  sync|upload)
+    __complete_repo_command_projects "${current}"
+    return 0
+    ;;
+
+  help|start|forall)
+    __complete_repo_command_${command} "${current}"
+    return 0
+    ;;
+
+  *)
+    return 1
+    ;;
+  esac
+}
+
+# Complete the repo arguments.
+__complete_repo() {
+  COMPREPLY=()
+  __complete_repo_command && return 0
+  __complete_repo_arg && return 0
+  return 0
+}
+
+# Fallback to the default complete methods if we aren't able to provide anything
+# useful.  This will allow e.g. local paths to be used when it makes sense.
+complete -F __complete_repo -o bashdefault -o default repo
diff --git a/docs/internal-fs-layout.md b/docs/internal-fs-layout.md
new file mode 100644
index 0000000..af6a452
--- /dev/null
+++ b/docs/internal-fs-layout.md
@@ -0,0 +1,263 @@
+# Repo internal filesystem layout
+
+A reference to the `.repo/` tree in repo client checkouts.
+Hopefully it's complete & up-to-date, but who knows!
+
+*** note
+**Warning**:
+This is meant for developers of the repo project itself as a quick reference.
+**Nothing** in here must be construed as ABI, or that repo itself will never
+change its internals in backwards incompatible ways.
+***
+
+[TOC]
+
+## .repo/ layout
+
+All content under `.repo/` is managed by `repo` itself with few exceptions.
+
+In general, you should not make manual changes in here.
+If a setting was initialized using an option to `repo init`, you should use that
+command to change the setting later on.
+It is always safe to re-run `repo init` in existing repo client checkouts.
+For example, if you want to change the manifest branch, you can simply run
+`repo init --manifest-branch=<new name>` and repo will take care of the rest.
+
+*   `config`: Per-repo client checkout settings using [git-config] file format.
+*   `.repo_config.json`: JSON cache of the `config` file for repo to
+    read/process quickly.
+
+### repo/ state
+
+*   `repo/`: A git checkout of the repo project.  This is how `repo` re-execs
+    itself to get the latest released version.
+
+    It tracks the git repository at `REPO_URL` using the `REPO_REV` branch.
+    Those are specified at `repo init` time using the `--repo-url=<REPO_URL>`
+    and `--repo-rev=<REPO_REV>` options.
+
+    Any changes made to this directory will usually be automatically discarded
+    by repo itself when it checks for updates.  If you want to update to the
+    latest version of repo, use `repo selfupdate` instead.  If you want to
+    change the git URL/branch that this tracks, re-run `repo init` with the new
+    settings.
+
+*   `.repo_fetchtimes.json`: Used by `repo sync` to record stats when syncing
+    the various projects.
+
+### Manifests
+
+For more documentation on the manifest format, including the local_manifests
+support, see the [manifest-format.md] file.
+
+*   `manifests/`: A git checkout of the manifest project.  Its `.git/` state
+    points to the `manifest.git` bare checkout (see below).  It tracks the git
+    branch specified at `repo init` time via `--manifest-branch`.
+
+    The local branch name is always `default` regardless of the remote tracking
+    branch.  Do not get confused if the remote branch is not `default`, or if
+    there is a remote `default` that is completely different!
+
+    No manual changes should be made in here as it will just confuse repo,
+    and it won't automatically recover, so no new changes will be picked up.
+
+*   `manifests.git/`: A bare checkout of the manifest project.  It tracks the
+    git repository specified at `repo init` time via `--manifest-url`.
+
+    No manual changes should be made in here as it will just confuse repo.
+    If you want to switch the tracking settings, re-run `repo init` with the
+    new settings.
+
+*   `manifest.xml`: The manifest that repo uses.  It is generated at `repo init`
+    and uses the `--manifest-name` to determine what manifest file to load next
+    out of `manifests/`.
+
+    Do not try to modify this to load other manifests as it will confuse repo.
+    If you want to switch manifest files, re-run `repo init` with the new
+    setting.
+
+    Older versions of repo managed this with symlinks.
+
+*   `manifest.xml -> manifests/<manifest-name>.xml`: A symlink to the manifest
+    that the user wishes to sync.  It is specified at `repo init` time via
+    `--manifest-name`.
+
+
+*   `manifests.git/.repo_config.json`: JSON cache of the `manifests.git/config`
+    file for repo to read/process quickly.
+
+*   `local_manifest.xml` (*Deprecated*): User-authored tweaks to the manifest
+    used to sync.  See [local manifests] for more details.
+*   `local_manifests/`: Directory of user-authored manifest fragments to tweak
+    the manifest used to sync.  See [local manifests] for more details.
+
+### Project objects
+
+*** note
+**Warning**: Please do not use repo's approach to projects/ & project-objects/
+layouts as a model for other tools to implement similar approaches.
+It has a number of known downsides like:
+*   [Symlinks do not work well under Windows](./windows.md).
+*   Git sometimes replaces symlinks under .git/ with real files (under unknown
+    circumstances), and then the internal state gets out of sync, and data loss
+    may ensue.
+*   When sharing project-objects between multiple project checkouts, Git might
+    automatically run `gc` or `prune` which may lead to data loss or corruption
+    (since those operate on leaf projects and miss refs in other leaves).  See
+    https://gerrit-review.googlesource.com/c/git-repo/+/254392 for more details.
+
+Instead, you should use standard Git workflows like [git worktree] or
+[gitsubmodules] with [superprojects].
+***
+
+*   `copy-link-files.json`: Tracking file used by `repo sync` to determine when
+    copyfile or linkfile are added or removed and need corresponding updates.
+*   `project.list`: Tracking file used by `repo sync` to determine when projects
+    are added or removed and need corresponding updates in the checkout.
+*   `projects/`: Bare checkouts of every project synced by the manifest.  The
+    filesystem layout matches the `<project path=...` setting in the manifest
+    (i.e. where it's checked out in the repo client source tree).  Those
+    checkouts will symlink their `.git/` state to paths under here.
+
+    Some git state is further split out under `project-objects/`.
+*   `project-objects/`: Git objects that are safe to share across multiple
+    git checkouts.  The filesystem layout matches the `<project name=...`
+    setting in the manifest (i.e. the path on the remote server) with a `.git`
+    suffix.  This allows for multiple checkouts of the same remote git repo to
+    share their objects.  For example, you could have different branches of
+    `foo/bar.git` checked out to `foo/bar-main`, `foo/bar-release`, etc...
+    There will be multiple trees under `projects/` for each one, but only one
+    under `project-objects/`.
+
+    This layout is designed to allow people to sync against different remotes
+    (e.g. a local mirror & a public review server) while avoiding duplicating
+    the content.  However, this can run into problems if different remotes use
+    the same path on their respective servers.  Best to avoid that.
+*   `subprojects/`: Like `projects/`, but for git submodules.
+*   `subproject-objects/`: Like `project-objects/`, but for git submodules.
+*   `worktrees/`: Bare checkouts of every project synced by the manifest.  The
+    filesystem layout matches the `<project name=...` setting in the manifest
+    (i.e. the path on the remote server) with a `.git` suffix.  This has the
+    same advantages as the `project-objects/` layout above.
+
+    This is used when [git worktree]'s are enabled.
+
+### Global settings
+
+The `.repo/manifests.git/config` file is used to track settings for the entire
+repo client checkout.
+
+Most settings use the `[repo]` section to avoid conflicts with git.
+
+Everything under `[repo.syncstate.*]` is used to keep track of sync details for logging
+purposes.
+
+User controlled settings are initialized when running `repo init`.
+
+| Setting                  | `repo init` Option        | Use/Meaning |
+|--------------------------|---------------------------|-------------|
+| manifest.groups          | `--groups` & `--platform` | The manifest groups to sync |
+| manifest.standalone      | `--standalone-manifest`   | Download manifest as static file instead of creating checkout |
+| repo.archive             | `--archive`               | Use `git archive` for checkouts |
+| repo.clonebundle         | `--clone-bundle`          | Whether the initial sync used clone.bundle explicitly |
+| repo.clonefilter         | `--clone-filter`          | Filter setting when using [partial git clones] |
+| repo.depth               | `--depth`                 | Create shallow checkouts when cloning |
+| repo.dissociate          | `--dissociate`            | Dissociate from any reference/mirrors after initial clone |
+| repo.mirror              | `--mirror`                | Checkout is a repo mirror |
+| repo.partialclone        | `--partial-clone`         | Create [partial git clones] |
+| repo.partialcloneexclude | `--partial-clone-exclude` | Comma-delimited list of project names (not paths) to exclude while using [partial git clones] |
+| repo.reference           | `--reference`             | Reference repo client checkout |
+| repo.submodules          | `--submodules`            | Sync git submodules |
+| repo.superproject        | `--use-superproject`      | Sync [superproject] |
+| repo.worktree            | `--worktree`              | Use [git worktree] for checkouts |
+| user.email               | `--config-name`           | User's e-mail address; Copied into `.git/config` when checking out a new project |
+| user.name                | `--config-name`           | User's name; Copied into `.git/config` when checking out a new project |
+
+[partial git clones]: https://git-scm.com/docs/gitrepository-layout#_code_partialclone_code
+[superproject]: https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
+
+### Repo hooks settings
+
+For more details on this feature, see the [repo-hooks docs](./repo-hooks.md).
+We'll just discuss the internal configuration settings.
+These are stored in the registered `<repo-hooks>` project itself, so if the
+manifest switches to a different project, the settings will not be copied.
+
+| Setting                              | Use/Meaning |
+|--------------------------------------|-------------|
+| repo.hooks.\<hook\>.approvedmanifest | User approval for secure manifest sources (e.g. https://) |
+| repo.hooks.\<hook\>.approvedhash     | User approval for insecure manifest sources (e.g. http://) |
+
+
+For example, if our manifest had the following entries, we would store settings
+under `.repo/projects/src/repohooks.git/config` (which would be reachable via
+`git --git-dir=src/repohooks/.git config`).
+```xml
+  <project path="src/repohooks" name="chromiumos/repohooks" ... />
+  <repo-hooks in-project="chromiumos/repohooks" ... />
+```
+
+If `<hook>` is `pre-upload`, the `.git/config` setting might be:
+```ini
+[repo "hooks.pre-upload"]
+	approvedmanifest = https://chromium.googlesource.com/chromiumos/manifest
+```
+
+## Per-project settings
+
+These settings are somewhat meant to be tweaked by the user on a per-project
+basis (e.g. `git config` in a checked out source repo).
+
+Where possible, we re-use standard git settings to avoid confusion, and we
+refrain from documenting those, so see [git-config] documentation instead.
+
+See `repo help upload` for documentation on `[review]` settings.
+
+The `[remote]` settings are automatically populated/updated from the manifest.
+
+The `[branch]` settings are updated by `repo start` and `git branch`.
+
+| Setting                       | Subcommands   | Use/Meaning |
+|-------------------------------|---------------|-------------|
+| review.\<url\>.autocopy       | upload        | Automatically add to `--cc=<value>` |
+| review.\<url\>.autoreviewer   | upload        | Automatically add to `--reviewers=<value>` |
+| review.\<url\>.autoupload     | upload        | Automatically answer "yes" or "no" to all prompts |
+| review.\<url\>.uploadhashtags | upload        | Automatically add to `--hashtag=<value>` |
+| review.\<url\>.uploadlabels   | upload        | Automatically add to `--label=<value>` |
+| review.\<url\>.uploadnotify   | upload        | [Notify setting][upload-notify] to use |
+| review.\<url\>.uploadtopic    | upload        | Default [topic] to use |
+| review.\<url\>.username       | upload        | Override username with `ssh://` review URIs |
+| remote.\<remote\>.fetch       | sync          | Set of refs to fetch |
+| remote.\<remote\>.projectname | \<network\>   | The name of the project as it exists in Gerrit review |
+| remote.\<remote\>.pushurl     | upload        | The base URI for pushing CLs |
+| remote.\<remote\>.review      | upload        | The URI of the Gerrit review server |
+| remote.\<remote\>.url         | sync & upload | The URI of the git project to fetch |
+| branch.\<branch\>.merge       | sync & upload | The branch to merge & upload & track |
+| branch.\<branch\>.remote      | sync & upload | The remote to track |
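+
+For example, to automatically CC a teammate and skip the upload prompts for a
+particular Gerrit server, a user could add something like the following to a
+project's `.git/config` (the host & address here are purely hypothetical):
+```ini
+[review "https://gerrit.example.com/"]
+	autocopy = teammate@example.com
+	autoupload = yes
+```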
+
+## ~/ dotconfig layout
+
+Repo will create & maintain a few files in the user's home directory.
+
+*   `.repoconfig/`: Repo's per-user directory for all random config files/state.
+*   `.repoconfig/config`: Per-user settings using [git-config] file format.
+*   `.repoconfig/keyring-version`: Cache file for checking if the gnupg subdir
+    has all the same keys as the repo launcher.  Used to avoid running gpg
+    constantly as that can be quite slow.
+*   `.repoconfig/gnupg/`: GnuPG's internal state directory used when repo needs
+    to run `gpg`.  This provides isolation from the user's normal `~/.gnupg/`.
+
+*   `.repoconfig/.repo_config.json`: JSON cache of the `.repoconfig/config`
+    file for repo to read/process quickly.
+*   `.repo_.gitconfig.json`: JSON cache of the `.gitconfig` file for repo to
+    read/process quickly.
+
+
+[git-config]: https://git-scm.com/docs/git-config
+[git worktree]: https://git-scm.com/docs/git-worktree
+[gitsubmodules]: https://git-scm.com/docs/gitsubmodules
+[manifest-format.md]: ./manifest-format.md
+[local manifests]: ./manifest-format.md#Local-Manifests
+[superprojects]: https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
+[topic]: https://gerrit-review.googlesource.com/Documentation/intro-user.html#topics
+[upload-notify]: https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
diff --git a/docs/manifest-format.md b/docs/manifest-format.md
index 93d9b96..8e0049b 100644
--- a/docs/manifest-format.md
+++ b/docs/manifest-format.md
@@ -21,6 +21,7 @@
 
 ```xml
 <!DOCTYPE manifest [
+
   <!ELEMENT manifest (notice?,
                       remote*,
                       default?,
@@ -29,11 +30,13 @@
                       project*,
                       extend-project*,
                       repo-hooks?,
+                      superproject?,
+                      contactinfo?,
                       include*)>
 
   <!ELEMENT notice (#PCDATA)>
 
-  <!ELEMENT remote EMPTY>
+  <!ELEMENT remote (annotation*)>
   <!ATTLIST remote name         ID    #REQUIRED>
   <!ATTLIST remote alias        CDATA #IMPLIED>
   <!ATTLIST remote fetch        CDATA #REQUIRED>
@@ -87,21 +90,39 @@
   <!ELEMENT extend-project EMPTY>
   <!ATTLIST extend-project name CDATA #REQUIRED>
   <!ATTLIST extend-project path CDATA #IMPLIED>
+  <!ATTLIST extend-project dest-path CDATA #IMPLIED>
   <!ATTLIST extend-project groups CDATA #IMPLIED>
   <!ATTLIST extend-project revision CDATA #IMPLIED>
+  <!ATTLIST extend-project remote CDATA #IMPLIED>
 
   <!ELEMENT remove-project EMPTY>
   <!ATTLIST remove-project name  CDATA #REQUIRED>
+  <!ATTLIST remove-project optional  CDATA #IMPLIED>
 
   <!ELEMENT repo-hooks EMPTY>
   <!ATTLIST repo-hooks in-project CDATA #REQUIRED>
   <!ATTLIST repo-hooks enabled-list CDATA #REQUIRED>
 
+  <!ELEMENT superproject EMPTY>
+  <!ATTLIST superproject name     CDATA #REQUIRED>
+  <!ATTLIST superproject remote   IDREF #IMPLIED>
+  <!ATTLIST superproject revision CDATA #IMPLIED>
+
+  <!ELEMENT contactinfo EMPTY>
+  <!ATTLIST contactinfo bugurl  CDATA #REQUIRED>
+
   <!ELEMENT include EMPTY>
-  <!ATTLIST include name CDATA #REQUIRED>
+  <!ATTLIST include name   CDATA #REQUIRED>
+  <!ATTLIST include groups CDATA #IMPLIED>
 ]>
 ```
 
+For compatibility purposes across repo releases, all unknown elements are
+silently ignored.  However, repo reserves all possible names for itself for
+future use.  If you want to use custom elements, the `x-*` namespace is
+reserved for that purpose, and repo guarantees to never allocate any
+corresponding names.
+
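+For example, a hypothetical custom element in the `x-*` namespace would be
+silently ignored by repo while remaining available to other tools:
+```xml
+<manifest>
+  <x-mytool-metadata channel="beta"/>
+</manifest>
+```
+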
 A description of the elements and their attributes follows.
 
 
@@ -109,6 +130,10 @@
 
 The root element of the file.
 
+### Element notice
+
+Arbitrary text that is displayed to users whenever `repo sync` finishes.
+The content is simply passed through as it exists in the manifest.
 
 ### Element remote
 
@@ -141,8 +166,8 @@
 are uploaded to by `repo upload`.  This attribute is optional;
 if not specified then `repo upload` will not function.
 
-Attribute `revision`: Name of a Git branch (e.g. `master` or
-`refs/heads/master`). Remotes with their own revision will override
+Attribute `revision`: Name of a Git branch (e.g. `main` or
+`refs/heads/main`). Remotes with their own revision will override
 the default revision.
 
 ### Element default
@@ -155,11 +180,11 @@
 Project elements lacking a remote attribute of their own will use
 this remote.
 
-Attribute `revision`: Name of a Git branch (e.g. `master` or
-`refs/heads/master`).  Project elements lacking their own
+Attribute `revision`: Name of a Git branch (e.g. `main` or
+`refs/heads/main`).  Project elements lacking their own
 revision attribute will use this revision.
 
-Attribute `dest-branch`: Name of a Git branch (e.g. `master`).
+Attribute `dest-branch`: Name of a Git branch (e.g. `main`).
 Project elements not setting their own `dest-branch` will inherit
 this value. If this value is not set, projects will use `revision`
 by default instead.
@@ -235,24 +260,37 @@
 The project name must match the name Gerrit knows, if Gerrit is
 being used for code reviews.
 
+"name" must not be empty, and may not be an absolute path or use "." or ".."
+path components.  It is always interpreted relative to the remote's fetch
+settings, so if a different base path is needed, declare a different remote
+with the new settings needed.
+These restrictions are not enforced for [Local Manifests].
+
 Attribute `path`: An optional path relative to the top directory
 of the repo client where the Git working directory for this project
-should be placed.  If not supplied the project name is used.
+should be placed.  If not supplied the project "name" is used.
 If the project has a parent element, its path will be prefixed
 by the parent's.
 
+"path" may not be an absolute path or use "." or ".." path components.
+These restrictions are not enforced for [Local Manifests].
+
+If you want to place files into the root of the checkout (e.g. a README or
+Makefile or another build script), use the [copyfile] or [linkfile] elements
+instead.
+
 Attribute `remote`: Name of a previously defined remote element.
 If not supplied the remote given by the default element is used.
 
 Attribute `revision`: Name of the Git branch the manifest wants
 to track for this project.  Names can be relative to refs/heads
-(e.g. just "master") or absolute (e.g. "refs/heads/master").
+(e.g. just "main") or absolute (e.g. "refs/heads/main").
 Tags and/or explicit SHA-1s should work in theory, but have not
 been extensively tested.  If not supplied the revision given by
 the remote element is used if applicable, else the default
 element is used.
 
-Attribute `dest-branch`: Name of a Git branch (e.g. `master`).
+Attribute `dest-branch`: Name of a Git branch (e.g. `main`).
 When using `repo upload`, changes will be submitted for code
 review on this branch. If unspecified both here and in the
 default element, `revision` is used instead.
@@ -261,7 +299,7 @@
 whitespace or comma separated.  All projects belong to the group
 "all", and each project automatically belongs to a group of
 its name:`name` and path:`path`.  E.g. for
-<project name="monkeys" path="barrel-of"/>, that project
+`<project name="monkeys" path="barrel-of"/>`, that project
 definition is implicitly in the following manifest groups:
 default, name:monkeys, and path:barrel-of.  If you place a project in the
 group "notdefault", it will not be automatically downloaded by repo.
@@ -300,21 +338,29 @@
 Attribute `path`: If specified, limit the change to projects checked out
 at the specified path, rather than all projects with the given name.
 
+Attribute `dest-path`: If specified, a path relative to the top directory
+of the repo client where the Git working directory for this project
+should be placed.  This is used to move a project in the checkout by
+overriding the existing `path` setting.
+
 Attribute `groups`: List of additional groups to which this project
 belongs.  Same syntax as the corresponding element of `project`.
 
 Attribute `revision`: If specified, overrides the revision of the original
 project.  Same syntax as the corresponding element of `project`.
 
+Attribute `remote`: If specified, overrides the remote of the original
+project.  Same syntax as the corresponding element of `project`.
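+
+For example, this (hypothetical) snippet moves a previously defined project to
+a new path and pins it to a different revision:
+```xml
+<extend-project name="platform/build" dest-path="build/make"
+                revision="refs/heads/stable"/>
+```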
+
 ### Element annotation
 
 Zero or more annotation elements may be specified as children of a
-project element. Each element describes a name-value pair that will be
-exported into each project's environment during a 'forall' command,
-prefixed with REPO__.  In addition, there is an optional attribute
-"keep" which accepts the case insensitive values "true" (default) or
-"false".  This attribute determines whether or not the annotation will
-be kept when exported with the manifest subcommand.
+project or remote element. Each element describes a name-value pair.
+For projects, this name-value pair will be exported into each project's
+environment during a 'forall' command, prefixed with `REPO__`.  In addition,
+there is an optional attribute "keep" which accepts the case insensitive values
+"true" (default) or "false".  This attribute determines whether or not the
+annotation will be kept when exported with the manifest subcommand.
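+
+For example, with this (hypothetical) annotation, `repo forall` would export
+`REPO__BUILD_TYPE=debug` into each command's environment:
+```xml
+<project path="build" name="platform/build">
+  <annotation name="BUILD_TYPE" value="debug"/>
+</project>
+```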
 
 ### Element copyfile
 
@@ -338,7 +384,7 @@
 instead of copying it creates a symlink.
 
 The symlink is created at "dest" (relative to the top of the tree) and
-points to the path specified by "src".
+points to the path specified by "src" which is a path in the project.
 
 Parent directories of "dest" will be automatically created if missing.
 
@@ -355,6 +401,62 @@
 the user can remove a project, and possibly replace it with their
 own definition.
 
+Attribute `optional`: Set to true to ignore remove-project elements with no
+matching `project` element.
+
+### Element repo-hooks
+
+NB: See the [practical documentation](./repo-hooks.md) for using repo hooks.
+
+Only one repo-hooks element may be specified at a time.
+Attempting to redefine it will fail to parse.
+
+Attribute `in-project`: The project where the hooks are defined.  The value
+must match the `name` attribute (**not** the `path` attribute) of a previously
+defined `project` element.
+
+Attribute `enabled-list`: List of hooks to use, whitespace or comma separated.
+
+### Element superproject
+
+***
+*Note*: This is currently a WIP.
+***
+
+NB: See the [git superprojects documentation](
+https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects) for background
+information.
+
+This element is used to specify the URL of the superproject. It has "name",
+"remote", and "revision" as attributes. Only "name" is required while the
+others have reasonable defaults. At most one superproject may be specified.
+Attempting to redefine it will fail to parse.
+
+Attribute `name`: A unique name for the superproject. This attribute has the
+same meaning as project's name attribute. See the
+[element project](#element-project) for more information.
+
+Attribute `remote`: Name of a previously defined remote element.
+If not supplied the remote given by the default element is used.
+
+Attribute `revision`: Name of the Git branch the manifest wants
+to track for this superproject. If not supplied the revision given
+by the remote element is used if applicable, else the default
+element is used.
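+
+A minimal declaration might look like (names here are hypothetical):
+```xml
+<superproject name="platform/superproject" remote="aosp"/>
+```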
+
+### Element contactinfo
+
+***
+*Note*: This is currently a WIP.
+***
+
+This element is used to let manifest authors self-register contact info.
+It has "bugurl" as a required attribute. This element can be repeated,
+and any later entries will clobber earlier ones. This allows manifest
+authors who extend manifests to specify their own contact info.
+
+Attribute `bugurl`: The URL to file a bug against the manifest owner.
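+
+For example (the URL is hypothetical):
+```xml
+<contactinfo bugurl="https://example.com/new-bug"/>
+```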
+
 ### Element include
 
 This element provides the capability of including another manifest
@@ -364,8 +466,15 @@
 Attribute `name`: the manifest to include, specified relative to
 the manifest repository's root.
 
+"name" may not be an absolute path or use "." or ".." path components.
+These restrictions are not enforced for [Local Manifests].
 
-## Local Manifests
+Attribute `groups`: List of additional groups to which all projects
+in the included manifest belong. This appends and recurses, meaning
+all projects in sub-manifests carry all parent include groups.
+Same syntax as the corresponding element of `project`.
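+
+For example, every project defined in this (hypothetical) included manifest
+would also carry the group `vendor`:
+```xml
+<include name="vendor.xml" groups="vendor"/>
+```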
+
+## Local Manifests {#local-manifests}
 
 Additional remotes and projects may be added through local manifest
 files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
@@ -392,10 +501,12 @@
 Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will
 be loaded in alphabetical order.
 
-Additional remotes and projects may also be added through a local
-manifest, stored in `$TOP_DIR/.repo/local_manifest.xml`. This method
-is deprecated in favor of using multiple manifest files as mentioned
-above.
+Projects from local manifest files are added to the
+`local::<local manifest filename>` group.
 
-If `$TOP_DIR/.repo/local_manifest.xml` exists, it will be loaded before
-any manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml`.
+The legacy `$TOP_DIR/.repo/local_manifest.xml` path is no longer supported.
+
+
+[copyfile]: #Element-copyfile
+[linkfile]: #Element-linkfile
+[Local Manifests]: #local-manifests
diff --git a/docs/python-support.md b/docs/python-support.md
index a5c490a..3eaaba3 100644
--- a/docs/python-support.md
+++ b/docs/python-support.md
@@ -18,13 +18,13 @@
 no new features will be added, nor is support guaranteed.
 
 Users can select this during `repo init` time via the [repo launcher].
-Otherwise the default branches (e.g. stable & master) will be used which will
+Otherwise the default branches (e.g. stable & main) will be used which will
 require Python 3.
 
 This means the [repo launcher] needs to support both Python 2 & Python 3, but
 since it doesn't import any other repo code, this shouldn't be too problematic.
 
-The master branch will require Python 3.6 at a minimum.
+The main branch will require Python 3.6 at a minimum.
 If the system has an older version of Python 3, then users will have to select
 the legacy Python 2 branch instead.
 
diff --git a/docs/release-process.md b/docs/release-process.md
index 22c2fd1..f71a411 100644
--- a/docs/release-process.md
+++ b/docs/release-process.md
@@ -5,6 +5,37 @@
 
 [TOC]
 
+## Schedule
+
+There is no specific schedule for when releases are made.
+Usually it's more along the lines of "enough minor changes have been merged",
+or "there's a known issue the maintainers know should get fixed".
+If you find a fix has been merged for an issue important to you, but hasn't been
+released after a week or so, feel free to [contact] us to request a new release.
+
+### Release Freezes {#freeze}
+
+We try to observe a regular schedule for when **not** to release.
+If something goes wrong, staff need to be active in order to respond quickly &
+effectively.
+We also don't want to disrupt non-Google organizations if possible.
+
+We generally follow the rules:
+
+* Release during Mon - Thu, 9:00 - 14:00 [US PT]
+* Avoid holidays
+  * All regular [US holidays]
+  * Large international ones if possible
+  * All the various [New Years]
+    * Jan 1 in Gregorian calendar is the most obvious
+    * Check for large Lunar New Years too
+* Follow the normal [Google production freeze schedule]
+
+[US holidays]: https://en.wikipedia.org/wiki/Federal_holidays_in_the_United_States
+[US PT]: https://en.wikipedia.org/wiki/Pacific_Time_Zone
+[New Years]: https://en.wikipedia.org/wiki/New_Year
+[Google production freeze schedule]: http://goto.google.com/prod-freeze
+
 ## Launcher script
 
 The main repo script serves as a standalone program and is often referred to as
@@ -49,11 +80,12 @@
 
 *   `--repo-url`: This tells repo where to clone the full repo project itself.
     It defaults to the official project (`REPO_URL` in the launcher script).
-*   `--repo-branch`: This tells repo which branch to use for the full project.
+*   `--repo-rev`: This tells repo which branch to use for the full project.
     It defaults to the `stable` branch (`REPO_REV` in the launcher script).
 
-Whenever `repo sync` is run, repo will check to see if an update is available.
-It fetches the latest repo-branch from the repo-url.
+Whenever `repo sync` is run, repo will, once every 24 hours, see if an update
+is available.
+It fetches the latest repo-rev from the repo-url.
 Then it verifies that the latest commit in the branch has a valid signed tag
 using `git tag -v` (which uses gpg).
 If the tag is valid, then repo will update its internal checkout to it.
@@ -64,9 +96,14 @@
 
 If that tag cannot be verified, it gives up and forces the user to resolve.
 
+### Force an update
+
+The `repo selfupdate` command can be used to force an immediate update.
+It is not subject to the 24 hour limitation.
+
 ## Branch management
 
-All development happens on the `master` branch and should generally be stable.
+All development happens on the `main` branch and should generally be stable.
 
 Since the repo launcher defaults to tracking the `stable` branch, it is not
 normally updated until a new release is available.
@@ -81,7 +118,7 @@
 branch will be updated from `v1.9.x` to `v1.10.x`.
 
 We don't have parallel release branches/series.
-Typically all tags are made against the `master` branch and then pushed to the
+Typically all tags are made against the `main` branch and then pushed to the
 `stable` branch to make it available to the rest of the world.
 Since repo doesn't typically see a lot of changes, this tends to be OK.
 
@@ -89,10 +126,10 @@
 
 When you want to create a new release, you'll need to select a good version and
 create a signed tag using a key registered in repo itself.
-Typically we just tag the latest version of the `master` branch.
+Typically we just tag the latest version of the `main` branch.
 The tag could be pushed now, but it won't be used by clients normally (since the
-default `repo-branch` setting is `stable`).
-This would allow some early testing on systems who explicitly select `master`.
+default `repo-rev` setting is `stable`).
+This would allow some early testing on systems that explicitly select `main`.
 
 ### Creating a signed tag
 
@@ -113,7 +150,7 @@
 $ gpg -K
 
 # Pick whatever branch or commit you want to tag.
-$ r=master
+$ r=main
 
 # Pick the new version.
 $ t=1.12.10
@@ -161,7 +198,144 @@
 $ git log --format="%h (%aN) %s" --no-merges origin/stable..$r
 ```
 
+## Project References
 
+Here's a table showing the relationship of major tools, their EOL dates, and
+their status in Ubuntu & Debian.
+Those distros tend to be good indicators of how long we need to support things.
+
+Things in bold indicate stuff to take note of, but do not guarantee that we
+still support them.
+Things in italics are things we used to care about but probably don't anymore.
+
+|   Date   |     EOL      | [Git][rel-g] | [Python][rel-p] | [SSH][rel-o] | [Ubuntu][rel-u] / [Debian][rel-d] | Git | Python | SSH |
+|:--------:|:------------:|:------------:|:---------------:|:------------:|-----------------------------------|-----|--------|-----|
+| Apr 2008 |              |              |                 | 5.0          |
+| Jun 2008 |              |              |                 | 5.1          |
+| Oct 2008 | *Oct 2013*   |              | 2.6.0           |              | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
+| Dec 2008 | *Feb 2009*   |              | 3.0.0           |
+| Feb 2009 |              |              |                 | 5.2          |
+| Feb 2009 | *Mar 2012*   |              |                 |              | Debian 5 Lenny       | 1.5.6.5 | 2.5.2 |
+| Jun 2009 | *Jun 2016*   |              | 3.1.0           |              | *10.04 Lucid* - 10.10 Maverick / *Squeeze* |
+| Sep 2009 |              |              |                 | 5.3          | *10.04 Lucid* |
+| Feb 2010 | *Oct 2012*   | 1.7.0        |                 |              | *10.04 Lucid* - *12.04 Precise* - 12.10 Quantal |
+| Mar 2010 |              |              |                 | 5.4          |
+| Apr 2010 |              |              |                 | 5.5          | 10.10 Maverick |
+| Apr 2010 | *Apr 2015*   |              |                 |              | *10.04 Lucid*        | 1.7.0.4  | 2.6.5 3.1.2  | 5.3 |
+| Jul 2010 | *Dec 2019*   |              | *2.7.0*         |              | 11.04 Natty - *<current>* |
+| Aug 2010 |              |              |                 | 5.6          |
+| Oct 2010 |              |              |                 |              | 10.10 Maverick       | 1.7.1    | 2.6.6 3.1.3  | 5.5 |
+| Jan 2011 |              |              |                 | 5.7          |
+| Feb 2011 |              |              |                 | 5.8          | 11.04 Natty |
+| Feb 2011 | *Feb 2016*   |              |                 |              | Debian 6 Squeeze     | 1.7.2.5  | 2.6.6 3.1.3  |
+| Apr 2011 |              |              |                 |              | 11.04 Natty          | 1.7.4    | 2.7.1 3.2.0  | 5.8 |
+| Sep 2011 |              |              |                 | 5.9          | *12.04 Precise* |
+| Oct 2011 | *Feb 2016*   |              | 3.2.0           |              | 11.04 Natty - 12.10 Quantal |
+| Oct 2011 |              |              |                 |              | 11.10 Ocelot         | 1.7.5.4  | 2.7.2 3.2.2  | 5.8 |
+| Apr 2012 |              |              |                 | 6.0          | 12.10 Quantal |
+| Apr 2012 | *Apr 2019*   |              |                 |              | *12.04 Precise*      | 1.7.9.5  | 2.7.3 3.2.3  | 5.9 |
+| Aug 2012 |              |              |                 | 6.1          | 13.04 Raring |
+| Sep 2012 | *Sep 2017*   |              | 3.3.0           |              | 13.04 Raring - 13.10 Saucy |
+| Oct 2012 | *Dec 2014*   | 1.8.0        |                 |              | 13.04 Raring - 13.10 Saucy |
+| Oct 2012 |              |              |                 |              | 12.10 Quantal        | 1.7.10.4 | 2.7.3 3.2.3  | 6.0 |
+| Mar 2013 |              |              |                 | 6.2          | 13.10 Saucy |
+| Apr 2013 |              |              |                 |              | 13.04 Raring         | 1.8.1.2  | 2.7.4 3.3.1  | 6.1 |
+| May 2013 | *May 2018*   |              |                 |              | Debian 7 Wheezy      | 1.7.10.4 | 2.7.3 3.2.3  |
+| Sep 2013 |              |              |                 | 6.3          |
+| Oct 2013 |              |              |                 |              | 13.10 Saucy          | 1.8.3.2  | 2.7.5 3.3.2  | 6.2 |
+| Nov 2013 |              |              |                 | 6.4          |
+| Jan 2014 |              |              |                 | 6.5          |
+| Feb 2014 | *Dec 2014*   | **1.9.0**    |                 |              | *14.04 Trusty* |
+| Mar 2014 | *Mar 2019*   |              | *3.4.0*         |              | *14.04 Trusty* - 15.10 Wily / *Jessie* |
+| Mar 2014 |              |              |                 | 6.6          | *14.04 Trusty* - 14.10 Utopic |
+| Apr 2014 | *Apr 2022*   |              |                 |              | *14.04 Trusty*       | 1.9.1    | 2.7.5 3.4.0  | 6.6 |
+| May 2014 | *Dec 2014*   | 2.0.0        |
+| Aug 2014 | *Dec 2014*   | *2.1.0*      |                 |              | 14.10 Utopic - 15.04 Vivid / *Jessie* |
+| Oct 2014 |              |              |                 | 6.7          | 15.04 Vivid |
+| Oct 2014 |              |              |                 |              | 14.10 Utopic         | 2.1.0    | 2.7.8 3.4.2  | 6.6 |
+| Nov 2014 | *Sep 2015*   | 2.2.0        |
+| Feb 2015 | *Sep 2015*   | 2.3.0        |
+| Mar 2015 |              |              |                 | 6.8          |
+| Apr 2015 | *May 2017*   | 2.4.0        |
+| Apr 2015 | *Jun 2020*   |              |                 |              | *Debian 8 Jessie*    | 2.1.4    | 2.7.9 3.4.2  |
+| Apr 2015 |              |              |                 |              | 15.04 Vivid          | 2.1.4    | 2.7.9 3.4.3  | 6.7 |
+| Jul 2015 | *May 2017*   | 2.5.0        |                 |              | 15.10 Wily |
+| Jul 2015 |              |              |                 | 6.9          | 15.10 Wily |
+| Aug 2015 |              |              |                 | 7.0          |
+| Aug 2015 |              |              |                 | 7.1          |
+| Sep 2015 | *May 2017*   | 2.6.0        |
+| Sep 2015 | *Sep 2020*   |              | *3.5.0*         |              | *16.04 Xenial* - 17.04 Zesty / *Stretch* |
+| Oct 2015 |              |              |                 |              | 15.10 Wily           | 2.5.0    | 2.7.9 3.4.3  | 6.9 |
+| Jan 2016 | *Jul 2017*   | *2.7.0*      |                 |              | *16.04 Xenial* |
+| Feb 2016 |              |              |                 | 7.2          | *16.04 Xenial* |
+| Mar 2016 | *Jul 2017*   | 2.8.0        |
+| Apr 2016 | *Apr 2024*   |              |                 |              | *16.04 Xenial*       | 2.7.4    | 2.7.11 3.5.1 | 7.2 |
+| Jun 2016 | *Jul 2017*   | 2.9.0        |                 |              | 16.10 Yakkety |
+| Jul 2016 |              |              |                 | 7.3          | 16.10 Yakkety |
+| Sep 2016 | *Sep 2017*   | 2.10.0       |
+| Oct 2016 |              |              |                 |              | 16.10 Yakkety        | 2.9.3    | 2.7.11 3.5.1 | 7.3 |
+| Nov 2016 | *Sep 2017*   | *2.11.0*     |                 |              | 17.04 Zesty / *Stretch* |
+| Dec 2016 | **Dec 2021** |              | **3.6.0**       |              | 17.10 Artful - **18.04 Bionic** - 18.10 Cosmic |
+| Dec 2016 |              |              |                 | 7.4          | 17.04 Zesty / *Debian 9 Stretch* |
+| Feb 2017 | *Sep 2017*   | 2.12.0       |
+| Mar 2017 |              |              |                 | 7.5          | 17.10 Artful |
+| Apr 2017 |              |              |                 |              | 17.04 Zesty          | 2.11.0   | 2.7.13 3.5.3 | 7.4 |
+| May 2017 | *May 2018*   | 2.13.0       |
+| Jun 2017 | *Jun 2022*   |              |                 |              | *Debian 9 Stretch*   | 2.11.0   | 2.7.13 3.5.3 | 7.4 |
+| Aug 2017 | *Dec 2019*   | 2.14.0       |                 |              | 17.10 Artful |
+| Oct 2017 | *Dec 2019*   | 2.15.0       |
+| Oct 2017 |              |              |                 | 7.6          | **18.04 Bionic** |
+| Oct 2017 |              |              |                 |              | 17.10 Artful         | 2.14.1   | 2.7.14 3.6.3 | 7.5 |
+| Jan 2018 | *Dec 2019*   | 2.16.0       |
+| Apr 2018 | *Mar 2021*   | **2.17.0**   |                 |              | **18.04 Bionic**     |
+| Apr 2018 |              |              |                 | 7.7          | 18.10 Cosmic |
+| Apr 2018 | **Apr 2028** |              |                 |              | **18.04 Bionic**     | 2.17.0   | 2.7.15 3.6.5 | 7.6 |
+| Jun 2018 | *Mar 2021*   | 2.18.0       |
+| Jun 2018 | **Jun 2023** |              | 3.7.0           |              | 19.04 Disco - **20.04 Focal** / **Buster** |
+| Aug 2018 |              |              |                 | 7.8          |
+| Sep 2018 | *Mar 2021*   | 2.19.0       |                 |              | 18.10 Cosmic |
+| Oct 2018 |              |              |                 | 7.9          | 19.04 Disco / **Buster** |
+| Oct 2018 |              |              |                 |              | 18.10 Cosmic         | 2.19.1   | 2.7.15 3.6.6 | 7.7 |
+| Dec 2018 | *Mar 2021*   | **2.20.0**   |                 |              | 19.04 Disco - 19.10 Eoan / **Buster** |
+| Feb 2019 | *Mar 2021*   | 2.21.0       |
+| Apr 2019 |              |              |                 | 8.0          | 19.10 Eoan |
+| Apr 2019 |              |              |                 |              | 19.04 Disco          | 2.20.1   | 2.7.16 3.7.3 | 7.9 |
+| Jun 2019 |              | 2.22.0       |
+| Jul 2019 | **Jul 2024** |              |                 |              | **Debian 10 Buster** | 2.20.1   | 2.7.16 3.7.3 | 7.9 |
+| Aug 2019 | *Mar 2021*   | 2.23.0       |
+| Oct 2019 | **Oct 2024** |              | 3.8.0           |              | **20.04 Focal** - 20.10 Groovy |
+| Oct 2019 |              |              |                 | 8.1          |
+| Oct 2019 |              |              |                 |              | 19.10 Eoan           | 2.20.1   | 2.7.17 3.7.5 | 8.0 |
+| Nov 2019 | *Mar 2021*   | 2.24.0       |
+| Jan 2020 | *Mar 2021*   | 2.25.0       |                 |              | **20.04 Focal** |
+| Feb 2020 |              |              |                 | 8.2          | **20.04 Focal** |
+| Mar 2020 | *Mar 2021*   | 2.26.0       |
+| Apr 2020 | **Apr 2030** |              |                 |              | **20.04 Focal**      | 2.25.1   | 2.7.17 3.8.2 | 8.2 |
+| May 2020 | *Mar 2021*   | 2.27.0       |                 |              | 20.10 Groovy |
+| May 2020 |              |              |                 | 8.3          |
+| Jul 2020 | *Mar 2021*   | 2.28.0       |
+| Sep 2020 |              |              |                 | 8.4          | 21.04 Hirsute / **Bullseye** |
+| Oct 2020 | *Mar 2021*   | 2.29.0       |
+| Oct 2020 |              |              |                 |              | 20.10 Groovy         | 2.27.0   | 2.7.18 3.8.6 | 8.3 |
+| Oct 2020 | **Oct 2025** |              | 3.9.0           |              | 21.04 Hirsute / **Bullseye** |
+| Dec 2020 | *Mar 2021*   | 2.30.0       |                 |              | 21.04 Hirsute / **Bullseye** |
+| Mar 2021 |              | 2.31.0       |
+| Mar 2021 |              |              |                 | 8.5          |
+| Apr 2021 |              |              |                 | 8.6          |
+| Apr 2021 | *Jan 2022*   |              |                 |              | 21.04 Hirsute        | 2.30.2   | 2.7.18 3.9.4 | 8.4 |
+| Jun 2021 |              | 2.32.0       |
+| Aug 2021 |              | 2.33.0       |
+| Aug 2021 |              |              |                 | 8.7          |
+| Aug 2021 | **Aug 2026** |              |                 |              | **Debian 11 Bullseye** | 2.30.2 | 2.7.18 3.9.2 | 8.4 |
+| **Date** |   **EOL**    | **[Git][rel-g]** | **[Python][rel-p]** | **[SSH][rel-o]** | **[Ubuntu][rel-u] / [Debian][rel-d]** | **Git** | **Python** | **SSH** |
+
+
+[contact]: ../README.md#contact
+[rel-d]: https://en.wikipedia.org/wiki/Debian_version_history
+[rel-g]: https://en.wikipedia.org/wiki/Git#Releases
+[rel-o]: https://www.openssh.com/releasenotes.html
+[rel-p]: https://en.wikipedia.org/wiki/History_of_Python#Table_of_versions
+[rel-u]: https://en.wikipedia.org/wiki/Ubuntu_version_history#Table_of_versions
 [example announcement]: https://groups.google.com/d/topic/repo-discuss/UGBNismWo1M/discussion
 [repo-discuss@googlegroups.com]: https://groups.google.com/forum/#!forum/repo-discuss
 [go/repo-release]: https://goto.google.com/repo-release
diff --git a/docs/repo-hooks.md b/docs/repo-hooks.md
index 7c37c30..cbb1aac 100644
--- a/docs/repo-hooks.md
+++ b/docs/repo-hooks.md
@@ -27,7 +27,7 @@
 For the full syntax, see the [repo manifest format](./manifest-format.md).
 
 Here's a short example from
-[Android](https://android.googlesource.com/platform/manifest/+/master/default.xml).
+[Android](https://android.googlesource.com/platform/manifest/+/HEAD/default.xml).
 The `<project>` line checks out the repohooks git repo to the local
 `tools/repohooks/` path.  The `<repo-hooks>` line says to look in the project
 with the name `platform/tools/repohooks` for hooks to run during the
diff --git a/docs/windows.md b/docs/windows.md
index 8091296..4282beb 100644
--- a/docs/windows.md
+++ b/docs/windows.md
@@ -19,7 +19,33 @@
 We will never add code specific to older versions of Windows.
 It might work, but it most likely won't, so please don't bother asking.
 
-## Symlinks
+## Git worktrees
+
+*** note
+**Warning**: Repo's support for Git worktrees is new & experimental.
+Please report any bugs and be sure to maintain backups!
+***
+
+The Repo 2.4 release introduced support for [Git worktrees][git-worktree].
+You don't need to understand the details of this feature, so don't worry
+if that section of the Git manual seems impenetrable.
+
+The salient point is that Git worktrees allow Repo to create repo client
+checkouts that do not require symlinks at all under Windows.
+This means users no longer need Administrator access to sync code.
+
+Simply use `--worktree` when running `repo init` to opt in.
+
+This does not affect specific Git repositories that use symlinks themselves.
+
+[git-worktree]: https://git-scm.com/docs/git-worktree
+
+## Symlinks by default
+
+*** note
+**NB**: This section applies to the default Repo behavior which does not use
+Git worktrees (see the previous section for more info).
+***
 
 Repo will use symlinks heavily internally.
 On *NIX platforms, this isn't an issue, but Windows makes it a bit difficult.
@@ -62,9 +88,8 @@
 
 ## Python
 
-You should make sure to be running Python 3.6 or newer under Windows.
-Python 2 might work, but due to already limited platform testing, you should
-only run newer Python versions.
+Python 3.6 or newer is required.
+Python 2 is known to be broken when running under Windows.
 See our [Python Support](./python-support.md) document for more details.
 
 You can grab the latest Windows installer here:<br>
diff --git a/editor.py b/editor.py
index fcf1638..b84a42d 100644
--- a/editor.py
+++ b/editor.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import os
 import re
 import sys
@@ -24,6 +21,7 @@
 from error import EditorError
 import platform_utils
 
+
 class Editor(object):
   """Manages the user's preferred text editor."""
 
@@ -57,7 +55,7 @@
 
     if os.getenv('TERM') == 'dumb':
       print(
-"""No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
+          """No editor specified in GIT_EDITOR, core.editor, VISUAL or EDITOR.
 Tried to fall back to vi but terminal is dumb.  Please configure at
 least one of these before using this command.""", file=sys.stderr)
       sys.exit(1)
@@ -104,10 +102,10 @@
         rc = subprocess.Popen(args, shell=shell).wait()
       except OSError as e:
         raise EditorError('editor failed, %s: %s %s'
-          % (str(e), editor, path))
+                          % (str(e), editor, path))
       if rc != 0:
         raise EditorError('editor failed with exit status %d: %s %s'
-          % (rc, editor, path))
+                          % (rc, editor, path))
 
       with open(path, mode='rb') as fd2:
         return fd2.read().decode('utf-8')
diff --git a/error.py b/error.py
index 5bfe3a6..cbefcb7 100644
--- a/error.py
+++ b/error.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,70 +12,89 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+
 class ManifestParseError(Exception):
   """Failed to parse the manifest file.
   """
 
-class ManifestInvalidRevisionError(Exception):
+
+class ManifestInvalidRevisionError(ManifestParseError):
   """The revision value in a project is incorrect.
   """
 
+
+class ManifestInvalidPathError(ManifestParseError):
+  """A path used in <copyfile> or <linkfile> is incorrect.
+  """
+
+
 class NoManifestException(Exception):
   """The required manifest does not exist.
   """
+
   def __init__(self, path, reason):
-    super(NoManifestException, self).__init__()
+    super().__init__(path, reason)
     self.path = path
     self.reason = reason
 
   def __str__(self):
     return self.reason
 
+
 class EditorError(Exception):
   """Unspecified error from the user's text editor.
   """
+
   def __init__(self, reason):
-    super(EditorError, self).__init__()
+    super().__init__(reason)
     self.reason = reason
 
   def __str__(self):
     return self.reason
 
+
 class GitError(Exception):
   """Unspecified internal error from git.
   """
+
   def __init__(self, command):
-    super(GitError, self).__init__()
+    super().__init__(command)
     self.command = command
 
   def __str__(self):
     return self.command
 
+
 class UploadError(Exception):
   """A bundle upload to Gerrit did not succeed.
   """
+
   def __init__(self, reason):
-    super(UploadError, self).__init__()
+    super().__init__(reason)
     self.reason = reason
 
   def __str__(self):
     return self.reason
 
+
 class DownloadError(Exception):
   """Cannot download a repository.
   """
+
   def __init__(self, reason):
-    super(DownloadError, self).__init__()
+    super().__init__(reason)
     self.reason = reason
 
   def __str__(self):
     return self.reason
 
+
 class NoSuchProjectError(Exception):
   """A specified project does not exist in the work tree.
   """
+
   def __init__(self, name=None):
-    super(NoSuchProjectError, self).__init__()
+    super().__init__(name)
     self.name = name
 
   def __str__(self):
@@ -89,8 +106,9 @@
 class InvalidProjectGroupsError(Exception):
   """A specified project is not suitable for the specified groups
   """
+
   def __init__(self, name=None):
-    super(InvalidProjectGroupsError, self).__init__()
+    super().__init__(name)
     self.name = name
 
   def __str__(self):
@@ -98,15 +116,18 @@
       return 'in current directory'
     return self.name
 
+
 class RepoChangedException(Exception):
   """Thrown if 'repo sync' results in repo updating its internal
      repo or manifest repositories.  In this special case we must
      use exec to re-execute repo with the new code and manifest.
   """
+
   def __init__(self, extra_args=None):
-    super(RepoChangedException, self).__init__()
+    super().__init__(extra_args)
     self.extra_args = extra_args or []
 
+
 class HookError(Exception):
   """Thrown if a 'repo-hook' could not be run.
 
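The `super().__init__(...)` changes in error.py above are more than style: forwarding the constructor arguments populates `Exception.args`, which is what lets these exceptions survive a pickle round-trip (e.g. when raised inside a multiprocessing worker). A minimal sketch with illustrative class names, not ones from this codebase:

```python
import pickle


class BadError(Exception):
  """Swallows the reason: Exception.args stays empty."""

  def __init__(self, reason):
    super().__init__()
    self.reason = reason


class GoodError(Exception):
  """Forwards the reason, mirroring the pattern in the diff above."""

  def __init__(self, reason):
    super().__init__(reason)
    self.reason = reason


# GoodError round-trips; BadError cannot be reconstructed, because
# unpickling calls cls(*args) and BadError's args tuple is empty.
err = pickle.loads(pickle.dumps(GoodError('editor failed')))
```

Unpickling a `BadError` raises `TypeError` (its required `reason` argument was never recorded), which is exactly the failure mode the diff removes.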
diff --git a/event_log.py b/event_log.py
index 315d752..c77c564 100644
--- a/event_log.py
+++ b/event_log.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2017 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,8 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
-
 import json
 import multiprocessing
 
@@ -23,6 +19,7 @@
 TASK_SYNC_NETWORK = 'sync-network'
 TASK_SYNC_LOCAL = 'sync-local'
 
+
 class EventLog(object):
   """Event log that records events that occurred during a repo invocation.
 
@@ -138,7 +135,7 @@
     Returns:
       A dictionary of the event added to the log.
     """
-    event['status'] =  self.GetStatusString(success)
+    event['status'] = self.GetStatusString(success)
     event['finish_time'] = finish
     return event
 
@@ -165,6 +162,7 @@
 # An integer id that is unique across this invocation of the program.
 _EVENT_ID = multiprocessing.Value('i', 1)
 
+
 def _NextEventId():
   """Helper function for grabbing the next unique id.
 
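The `_EVENT_ID` value above is the standard cross-process counter pattern: a `multiprocessing.Value` whose built-in lock guards the read-and-increment so two workers can never receive the same id. A standalone sketch (the function name here is illustrative, not the module's):

```python
import multiprocessing

# 'i' = C int; initial value 1, shared across forked workers.
_EVENT_ID = multiprocessing.Value('i', 1)


def next_event_id():
  """Return the current id and atomically advance the shared counter."""
  with _EVENT_ID.get_lock():
    val = _EVENT_ID.value
    _EVENT_ID.value += 1
  return val
```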
diff --git a/fetch.py b/fetch.py
new file mode 100644
index 0000000..91d40cd
--- /dev/null
+++ b/fetch.py
@@ -0,0 +1,41 @@
+# Copyright (C) 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""This module contains functions used to fetch files from various sources."""
+
+import subprocess
+import sys
+from urllib.parse import urlparse
+
+def fetch_file(url):
+  """Fetch a file from the specified source using the appropriate protocol.
+
+  Returns:
+    The contents of the file as bytes.
+  """
+  scheme = urlparse(url).scheme
+  if scheme == 'gs':
+    cmd = ['gsutil', 'cat', url]
+    try:
+      result = subprocess.run(
+          cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, check=True)
+      return result.stdout
+    except subprocess.CalledProcessError as e:
+      print('fatal: error running "gsutil": %s' % e.output,
+            file=sys.stderr)
+    sys.exit(1)
+  if scheme == 'file':
+    with open(url[len('file://'):], 'rb') as f:
+      return f.read()
+  raise ValueError('unsupported url %s' % url)
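The `file://` branch of `fetch_file()` above can be exercised without network access or `gsutil`. A minimal sketch; the helper simply re-implements that branch so it runs standalone, and the temp-file contents are arbitrary:

```python
import os
import tempfile
from urllib.parse import urlparse


def fetch_local(url):
  """The file:// dispatch from fetch_file(), isolated for a standalone demo."""
  if urlparse(url).scheme == 'file':
    # Strip the scheme prefix and read the local path directly.
    with open(url[len('file://'):], 'rb') as f:
      return f.read()
  raise ValueError('unsupported url %s' % url)


with tempfile.NamedTemporaryFile(delete=False) as tmp:
  tmp.write(b'manifest data')
data = fetch_local('file://' + tmp.name)
os.unlink(tmp.name)
```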
diff --git a/git_command.py b/git_command.py
index dc542c3..95db91f 100644
--- a/git_command.py
+++ b/git_command.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,12 +12,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import functools
 import os
 import sys
 import subprocess
-import tempfile
-from signal import SIGTERM
 
 from error import GitError
 from git_refs import HEAD
@@ -28,75 +24,42 @@
 from wrapper import Wrapper
 
 GIT = 'git'
-MIN_GIT_VERSION = (1, 5, 4)
+# NB: These do not need to be kept in sync with the repo launcher script.
+# These may be much newer as it allows the repo launcher to roll between
+# different repo releases while source versions might require a newer git.
+#
+# The soft version is when we start warning users that the version is old and
+# we'll be dropping support for it.  We'll refuse to work with versions older
+# than the hard version.
+#
+# git-1.7 is in (EOL) Ubuntu Precise.  git-1.9 is in Ubuntu Trusty.
+MIN_GIT_VERSION_SOFT = (1, 9, 1)
+MIN_GIT_VERSION_HARD = (1, 7, 2)
 GIT_DIR = 'GIT_DIR'
 
 LAST_GITDIR = None
 LAST_CWD = None
 
-_ssh_proxy_path = None
-_ssh_sock_path = None
-_ssh_clients = []
-
-def ssh_sock(create=True):
-  global _ssh_sock_path
-  if _ssh_sock_path is None:
-    if not create:
-      return None
-    tmp_dir = '/tmp'
-    if not os.path.exists(tmp_dir):
-      tmp_dir = tempfile.gettempdir()
-    _ssh_sock_path = os.path.join(
-      tempfile.mkdtemp('', 'ssh-', tmp_dir),
-      'master-%r@%h:%p')
-  return _ssh_sock_path
-
-def _ssh_proxy():
-  global _ssh_proxy_path
-  if _ssh_proxy_path is None:
-    _ssh_proxy_path = os.path.join(
-      os.path.dirname(__file__),
-      'git_ssh')
-  return _ssh_proxy_path
-
-def _add_ssh_client(p):
-  _ssh_clients.append(p)
-
-def _remove_ssh_client(p):
-  try:
-    _ssh_clients.remove(p)
-  except ValueError:
-    pass
-
-def terminate_ssh_clients():
-  global _ssh_clients
-  for p in _ssh_clients:
-    try:
-      os.kill(p.pid, SIGTERM)
-      p.wait()
-    except OSError:
-      pass
-  _ssh_clients = []
-
-_git_version = None
 
 class _GitCall(object):
+  @functools.lru_cache(maxsize=None)
   def version_tuple(self):
-    global _git_version
-    if _git_version is None:
-      _git_version = Wrapper().ParseGitVersion()
-      if _git_version is None:
-        print('fatal: unable to detect git version', file=sys.stderr)
-        sys.exit(1)
-    return _git_version
+    ret = Wrapper().ParseGitVersion()
+    if ret is None:
+      print('fatal: unable to detect git version', file=sys.stderr)
+      sys.exit(1)
+    return ret
 
   def __getattr__(self, name):
-    name = name.replace('_','-')
+    name = name.replace('_', '-')
+
     def fun(*cmdv):
       command = [name]
       command.extend(cmdv)
       return GitCommand(None, command).Wait() == 0
     return fun
+
+
 git = _GitCall()
 
 
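The `@functools.lru_cache` on `version_tuple()` above replaces the old `_git_version` global: the first call computes the value and every later call replays it. A sketch of the same memoization; the class and counter are illustrative stand-ins for the git probe:

```python
import functools

probe_calls = []


class VersionProbe:
  @functools.lru_cache(maxsize=None)
  def version_tuple(self):
    probe_calls.append(1)  # the expensive git invocation would happen here
    return (2, 30, 2)


probe = VersionProbe()
probe.version_tuple()
probe.version_tuple()  # served from the cache; the body does not run again
```

Caching on a method keys the cache by `self`, which is safe here because `_GitCall` is used as a singleton (`git = _GitCall()`).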
@@ -111,11 +74,11 @@
 
     proj = os.path.dirname(os.path.abspath(__file__))
     env[GIT_DIR] = os.path.join(proj, '.git')
-
-    p = subprocess.Popen([GIT, 'describe', HEAD], stdout=subprocess.PIPE,
-                         env=env)
-    if p.wait() == 0:
-      ver = p.stdout.read().strip().decode('utf-8')
+    result = subprocess.run([GIT, 'describe', HEAD], stdout=subprocess.PIPE,
+                            stderr=subprocess.DEVNULL, encoding='utf-8',
+                            env=env, check=False)
+    if result.returncode == 0:
+      ver = result.stdout.strip()
       if ver.startswith('v'):
         ver = ver[1:]
     else:
@@ -177,8 +140,10 @@
 
     return self._git_ua
 
+
 user_agent = UserAgent()
 
+
 def git_require(min_version, fail=False, msg=''):
   git_version = git.version_tuple()
   if min_version <= git_version:
@@ -191,42 +156,38 @@
     sys.exit(1)
   return False
 
-def _setenv(env, name, value):
-  env[name] = value.encode()
 
 class GitCommand(object):
   def __init__(self,
                project,
                cmdv,
-               bare = False,
-               provide_stdin = False,
-               capture_stdout = False,
-               capture_stderr = False,
-               disable_editor = False,
-               ssh_proxy = False,
-               cwd = None,
-               gitdir = None):
+               bare=False,
+               input=None,
+               capture_stdout=False,
+               capture_stderr=False,
+               merge_output=False,
+               disable_editor=False,
+               ssh_proxy=None,
+               cwd=None,
+               gitdir=None):
     env = self._GetBasicEnv()
 
-    # If we are not capturing std* then need to print it.
-    self.tee = {'stdout': not capture_stdout, 'stderr': not capture_stderr}
-
     if disable_editor:
-      _setenv(env, 'GIT_EDITOR', ':')
+      env['GIT_EDITOR'] = ':'
     if ssh_proxy:
-      _setenv(env, 'REPO_SSH_SOCK', ssh_sock())
-      _setenv(env, 'GIT_SSH', _ssh_proxy())
-      _setenv(env, 'GIT_SSH_VARIANT', 'ssh')
+      env['REPO_SSH_SOCK'] = ssh_proxy.sock()
+      env['GIT_SSH'] = ssh_proxy.proxy
+      env['GIT_SSH_VARIANT'] = 'ssh'
     if 'http_proxy' in env and 'darwin' == sys.platform:
       s = "'http.proxy=%s'" % (env['http_proxy'],)
       p = env.get('GIT_CONFIG_PARAMETERS')
       if p is not None:
         s = p + ' ' + s
-      _setenv(env, 'GIT_CONFIG_PARAMETERS', s)
+      env['GIT_CONFIG_PARAMETERS'] = s
     if 'GIT_ALLOW_PROTOCOL' not in env:
-      _setenv(env, 'GIT_ALLOW_PROTOCOL',
-              'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
-    _setenv(env, 'GIT_HTTP_USER_AGENT', user_agent.git)
+      env['GIT_ALLOW_PROTOCOL'] = (
+          'file:git:http:https:ssh:persistent-http:persistent-https:sso:rpc')
+    env['GIT_HTTP_USER_AGENT'] = user_agent.git
 
     if project:
       if not cwd:
@@ -237,7 +198,10 @@
     command = [GIT]
     if bare:
       if gitdir:
-        _setenv(env, GIT_DIR, gitdir)
+        # Git on Windows wants its paths only using / for reliability.
+        if platform_utils.isWindows():
+          gitdir = gitdir.replace('\\', '/')
+        env[GIT_DIR] = gitdir
       cwd = None
     command.append(cmdv[0])
     # Need to use the --progress flag for fetch/clone so output will be
@@ -247,13 +211,10 @@
         command.append('--progress')
     command.extend(cmdv[1:])
 
-    if provide_stdin:
-      stdin = subprocess.PIPE
-    else:
-      stdin = None
-
-    stdout = subprocess.PIPE
-    stderr = subprocess.PIPE
+    stdin = subprocess.PIPE if input else None
+    stdout = subprocess.PIPE if capture_stdout else None
+    stderr = (subprocess.STDOUT if merge_output else
+              (subprocess.PIPE if capture_stderr else None))
 
     if IsTrace():
       global LAST_CWD
@@ -281,23 +242,38 @@
         dbg += ' 1>|'
       if stderr == subprocess.PIPE:
         dbg += ' 2>|'
+      elif stderr == subprocess.STDOUT:
+        dbg += ' 2>&1'
       Trace('%s', dbg)
 
     try:
       p = subprocess.Popen(command,
-                           cwd = cwd,
-                           env = env,
-                           stdin = stdin,
-                           stdout = stdout,
-                           stderr = stderr)
+                           cwd=cwd,
+                           env=env,
+                           encoding='utf-8',
+                           errors='backslashreplace',
+                           stdin=stdin,
+                           stdout=stdout,
+                           stderr=stderr)
     except Exception as e:
       raise GitError('%s: %s' % (command[1], e))
 
     if ssh_proxy:
-      _add_ssh_client(p)
+      ssh_proxy.add_client(p)
 
     self.process = p
-    self.stdin = p.stdin
+    if input:
+      if isinstance(input, str):
+        input = input.encode('utf-8')
+      p.stdin.write(input)
+      p.stdin.close()
+
+    try:
+      self.stdout, self.stderr = p.communicate()
+    finally:
+      if ssh_proxy:
+        ssh_proxy.remove_client(p)
+    self.rc = p.wait()
 
   @staticmethod
   def _GetBasicEnv():
@@ -317,35 +293,4 @@
     return env
 
   def Wait(self):
-    try:
-      p = self.process
-      rc = self._CaptureOutput()
-    finally:
-      _remove_ssh_client(p)
-    return rc
-
-  def _CaptureOutput(self):
-    p = self.process
-    s_in = platform_utils.FileDescriptorStreams.create()
-    s_in.add(p.stdout, sys.stdout, 'stdout')
-    s_in.add(p.stderr, sys.stderr, 'stderr')
-    self.stdout = ''
-    self.stderr = ''
-
-    while not s_in.is_done:
-      in_ready = s_in.select()
-      for s in in_ready:
-        buf = s.read()
-        if not buf:
-          s_in.remove(s)
-          continue
-        if not hasattr(buf, 'encode'):
-          buf = buf.decode()
-        if s.std_name == 'stdout':
-          self.stdout += buf
-        else:
-          self.stderr += buf
-        if self.tee[s.std_name]:
-          s.dest.write(buf)
-          s.dest.flush()
-    return p.wait()
+    return self.rc
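The rewritten `GitCommand` above captures output at construction time: text-mode pipes with `errors='backslashreplace'` guarantee decoding never raises, and `communicate()` drains both streams without the deadlock risk of reading them sequentially. The same pattern against an arbitrary command, using `python --version` purely as a stand-in for git:

```python
import subprocess
import sys

p = subprocess.Popen([sys.executable, '--version'],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     encoding='utf-8',            # text mode: str, not bytes
                     errors='backslashreplace')   # undecodable bytes escaped
stdout, stderr = p.communicate()  # drains both pipes concurrently
rc = p.wait()
```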
diff --git a/git_config.py b/git_config.py
index 8de3200..3cd0939 100644
--- a/git_config.py
+++ b/git_config.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,84 +12,83 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
-
 import contextlib
+import datetime
 import errno
+from http.client import HTTPException
 import json
 import os
 import re
 import ssl
 import subprocess
 import sys
-try:
-  import threading as _threading
-except ImportError:
-  import dummy_threading as _threading
-import time
+import urllib.error
+import urllib.request
 
-from pyversion import is_python3
-if is_python3():
-  import urllib.request
-  import urllib.error
-else:
-  import urllib2
-  import imp
-  urllib = imp.new_module('urllib')
-  urllib.request = urllib2
-  urllib.error = urllib2
-
-from signal import SIGTERM
 from error import GitError, UploadError
 import platform_utils
 from repo_trace import Trace
-if is_python3():
-  from http.client import HTTPException
-else:
-  from httplib import HTTPException
-
 from git_command import GitCommand
-from git_command import ssh_sock
-from git_command import terminate_ssh_clients
 from git_refs import R_CHANGES, R_HEADS, R_TAGS
 
+# Prefix that is prepended to all the keys of SyncAnalysisState's data
+# that is saved in the config.
+SYNC_STATE_PREFIX = 'repo.syncstate.'
+
 ID_RE = re.compile(r'^[0-9a-f]{40}$')
 
 REVIEW_CACHE = dict()
 
+
 def IsChange(rev):
   return rev.startswith(R_CHANGES)
 
+
 def IsId(rev):
   return ID_RE.match(rev)
 
+
 def IsTag(rev):
   return rev.startswith(R_TAGS)
 
+
 def IsImmutable(rev):
     return IsChange(rev) or IsId(rev) or IsTag(rev)
 
+
 def _key(name):
   parts = name.split('.')
   if len(parts) < 2:
     return name.lower()
-  parts[ 0] = parts[ 0].lower()
+  parts[0] = parts[0].lower()
   parts[-1] = parts[-1].lower()
   return '.'.join(parts)
 
+
 class GitConfig(object):
   _ForUser = None
 
+  _USER_CONFIG = '~/.gitconfig'
+
+  _ForSystem = None
+  _SYSTEM_CONFIG = '/etc/gitconfig'
+
+  @classmethod
+  def ForSystem(cls):
+    if cls._ForSystem is None:
+      cls._ForSystem = cls(configfile=cls._SYSTEM_CONFIG)
+    return cls._ForSystem
+
   @classmethod
   def ForUser(cls):
     if cls._ForUser is None:
-      cls._ForUser = cls(configfile = os.path.expanduser('~/.gitconfig'))
+      cls._ForUser = cls(configfile=os.path.expanduser(cls._USER_CONFIG))
     return cls._ForUser
 
   @classmethod
   def ForRepository(cls, gitdir, defaults=None):
-    return cls(configfile = os.path.join(gitdir, 'config'),
-               defaults = defaults)
+    return cls(configfile=os.path.join(gitdir, 'config'),
+               defaults=defaults)
 
   def __init__(self, configfile, defaults=None, jsonFile=None):
     self.file = configfile
@@ -104,18 +101,74 @@
     self._json = jsonFile
     if self._json is None:
       self._json = os.path.join(
-        os.path.dirname(self.file),
-        '.repo_' + os.path.basename(self.file) + '.json')
+          os.path.dirname(self.file),
+          '.repo_' + os.path.basename(self.file) + '.json')
 
-  def Has(self, name, include_defaults = True):
+  def ClearCache(self):
+    """Clear the in-memory cache of config."""
+    self._cache_dict = None
+
+  def Has(self, name, include_defaults=True):
     """Return true if this configuration file has the key.
     """
     if _key(name) in self._cache:
       return True
     if include_defaults and self.defaults:
-      return self.defaults.Has(name, include_defaults = True)
+      return self.defaults.Has(name, include_defaults=True)
     return False
 
+  def GetInt(self, name):
+    """Returns an integer from the configuration file.
+
+    This follows the git config syntax.
+
+    Args:
+      name: The key to lookup.
+
+    Returns:
+      None if the value was not defined, or is not a boolean.
+      Otherwise, the number itself.
+    """
+    v = self.GetString(name)
+    if v is None:
+      return None
+    v = v.strip()
+
+    mult = 1
+    if v.endswith('k'):
+      v = v[:-1]
+      mult = 1024
+    elif v.endswith('m'):
+      v = v[:-1]
+      mult = 1024 * 1024
+    elif v.endswith('g'):
+      v = v[:-1]
+      mult = 1024 * 1024 * 1024
+
+    base = 10
+    if v.startswith('0x'):
+      base = 16
+
+    try:
+      return int(v, base=base) * mult
+    except ValueError:
+      return None
+
+  def DumpConfigDict(self):
+    """Returns the current configuration dict.
+
+    Configuration data is information only (e.g. logging) and
+    should not be considered a stable data-source.
+
+    Returns:
+      dict of {<key>, <value>} for git configuration cache.
+      <value> are strings converted by GetString.
+    """
+    config_dict = {}
+    for key in self._cache:
+      config_dict[key] = self.GetString(key)
+    return config_dict
+
   def GetBoolean(self, name):
     """Returns a boolean from the configuration file.
        None : The value was not defined, or is not a boolean.
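The `GetInt()` suffix handling added in the hunk above follows git's own integer syntax: optional `k`/`m`/`g` multipliers and a `0x` hex prefix, with `None` for anything unparseable. Restated as a standalone function (the name is illustrative):

```python
def parse_git_int(v):
  """Parse an integer the way git config does: k/m/g suffixes, 0x hex."""
  v = v.strip()
  mult = 1
  if v.endswith('k'):
    v, mult = v[:-1], 1024
  elif v.endswith('m'):
    v, mult = v[:-1], 1024 * 1024
  elif v.endswith('g'):
    v, mult = v[:-1], 1024 * 1024 * 1024
  base = 16 if v.startswith('0x') else 10
  try:
    return int(v, base=base) * mult
  except ValueError:
    return None
```

So `'10k'` parses to `10240` and `'0x10'` to `16`, while malformed input yields `None` rather than raising.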
@@ -132,6 +185,12 @@
       return False
     return None
 
+  def SetBoolean(self, name, value):
+    """Set the truthy value for a key."""
+    if value is not None:
+      value = 'true' if value else 'false'
+    self.SetString(name, value)
+
   def GetString(self, name, all_keys=False):
     """Get the first value for a key, or None if it is not defined.
 
@@ -142,7 +201,7 @@
       v = self._cache[_key(name)]
     except KeyError:
       if self.defaults:
-        return self.defaults.GetString(name, all_keys = all_keys)
+        return self.defaults.GetString(name, all_keys=all_keys)
       v = []
 
     if not all_keys:
@@ -153,7 +212,7 @@
     r = []
     r.extend(v)
     if self.defaults:
-      r.extend(self.defaults.GetString(name, all_keys = True))
+      r.extend(self.defaults.GetString(name, all_keys=True))
     return r
 
   def SetString(self, name, value):
@@ -212,12 +271,28 @@
       self._branches[b.name] = b
     return b
 
+  def GetSyncAnalysisStateData(self):
+    """Returns data to be logged for the analysis of sync performance."""
+    return {k: v for k, v in self.DumpConfigDict().items() if k.startswith(SYNC_STATE_PREFIX)}
+
+  def UpdateSyncAnalysisState(self, options, superproject_logging_data):
+    """Update Config's SYNC_STATE_PREFIX* data with the latest sync data.
+
+    Args:
+      options: Options passed to sync returned from optparse. See _Options().
+      superproject_logging_data: A dictionary of superproject data that is to be logged.
+
+    Returns:
+      SyncAnalysisState object.
+    """
+    return SyncAnalysisState(self, options, superproject_logging_data)
+
   def GetSubSections(self, section):
     """List all subsection names matching $section.*.*
     """
     return self._sections.get(section, set())
 
-  def HasSection(self, section, subsection = ''):
+  def HasSection(self, section, subsection=''):
     """Does at least one key in section.subsection exist?
     """
     try:
@@ -268,8 +343,7 @@
 
   def _ReadJson(self):
     try:
-      if os.path.getmtime(self._json) \
-      <= os.path.getmtime(self.file):
+      if os.path.getmtime(self._json) <= os.path.getmtime(self.file):
         platform_utils.remove(self._json)
         return None
     except OSError:
@@ -278,8 +352,8 @@
       Trace(': parsing %s', self.file)
       with open(self._json) as fd:
         return json.load(fd)
-    except (IOError, ValueError):
-      platform_utils.remove(self._json)
+    except (IOError, ValueError):
+      platform_utils.remove(self._json, missing_ok=True)
       return None
 
   def _SaveJson(self, cache):
@@ -287,8 +361,7 @@
       with open(self._json, 'w') as fd:
         json.dump(cache, fd, indent=2)
     except (IOError, TypeError):
-      if os.path.exists(self._json):
-        platform_utils.remove(self._json)
+      platform_utils.remove(self._json, missing_ok=True)
 
   def _ReadGit(self):
     """
@@ -298,11 +371,10 @@
 
     """
     c = {}
-    d = self._do('--null', '--list')
-    if d is None:
+    if not os.path.exists(self.file):
       return c
-    if not is_python3():
-      d = d.decode('utf-8')
+
+    d = self._do('--null', '--list')
     for line in d.rstrip('\0').split('\0'):
       if '\n' in line:
         key, val = line.split('\n', 1)
@@ -318,17 +390,26 @@
     return c
 
   def _do(self, *args):
-    command = ['config', '--file', self.file]
+    if self.file == self._SYSTEM_CONFIG:
+      command = ['config', '--system', '--includes']
+    else:
+      command = ['config', '--file', self.file, '--includes']
     command.extend(args)
 
     p = GitCommand(None,
                    command,
-                   capture_stdout = True,
-                   capture_stderr = True)
+                   capture_stdout=True,
+                   capture_stderr=True)
     if p.Wait() == 0:
       return p.stdout
     else:
-      GitError('git config %s: %s' % (str(args), p.stderr))
+      raise GitError('git config %s: %s' % (str(args), p.stderr))
+
+
+class RepoConfig(GitConfig):
+  """User settings for repo itself."""
+
+  _USER_CONFIG = '~/.repoconfig/config'
 
 
 class RefSpec(object):
@@ -387,133 +468,16 @@
     return s
 
 
-_master_processes = []
-_master_keys = set()
-_ssh_master = True
-_master_keys_lock = None
-
-def init_ssh():
-  """Should be called once at the start of repo to init ssh master handling.
-
-  At the moment, all we do is to create our lock.
-  """
-  global _master_keys_lock
-  assert _master_keys_lock is None, "Should only call init_ssh once"
-  _master_keys_lock = _threading.Lock()
-
-def _open_ssh(host, port=None):
-  global _ssh_master
-
-  # Acquire the lock.  This is needed to prevent opening multiple masters for
-  # the same host when we're running "repo sync -jN" (for N > 1) _and_ the
-  # manifest <remote fetch="ssh://xyz"> specifies a different host from the
-  # one that was passed to repo init.
-  _master_keys_lock.acquire()
-  try:
-
-    # Check to see whether we already think that the master is running; if we
-    # think it's already running, return right away.
-    if port is not None:
-      key = '%s:%s' % (host, port)
-    else:
-      key = host
-
-    if key in _master_keys:
-      return True
-
-    if not _ssh_master \
-    or 'GIT_SSH' in os.environ \
-    or sys.platform in ('win32', 'cygwin'):
-      # failed earlier, or cygwin ssh can't do this
-      #
-      return False
-
-    # We will make two calls to ssh; this is the common part of both calls.
-    command_base = ['ssh',
-                     '-o','ControlPath %s' % ssh_sock(),
-                     host]
-    if port is not None:
-      command_base[1:1] = ['-p', str(port)]
-
-    # Since the key wasn't in _master_keys, we think that master isn't running.
-    # ...but before actually starting a master, we'll double-check.  This can
-    # be important because we can't tell that that 'git@myhost.com' is the same
-    # as 'myhost.com' where "User git" is setup in the user's ~/.ssh/config file.
-    check_command = command_base + ['-O','check']
-    try:
-      Trace(': %s', ' '.join(check_command))
-      check_process = subprocess.Popen(check_command,
-                                       stdout=subprocess.PIPE,
-                                       stderr=subprocess.PIPE)
-      check_process.communicate() # read output, but ignore it...
-      isnt_running = check_process.wait()
-
-      if not isnt_running:
-        # Our double-check found that the master _was_ infact running.  Add to
-        # the list of keys.
-        _master_keys.add(key)
-        return True
-    except Exception:
-      # Ignore excpetions.  We we will fall back to the normal command and print
-      # to the log there.
-      pass
-
-    command = command_base[:1] + \
-              ['-M', '-N'] + \
-              command_base[1:]
-    try:
-      Trace(': %s', ' '.join(command))
-      p = subprocess.Popen(command)
-    except Exception as e:
-      _ssh_master = False
-      print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
-             % (host,port, str(e)), file=sys.stderr)
-      return False
-
-    time.sleep(1)
-    ssh_died = (p.poll() is not None)
-    if ssh_died:
-      return False
-
-    _master_processes.append(p)
-    _master_keys.add(key)
-    return True
-  finally:
-    _master_keys_lock.release()
-
-def close_ssh():
-  global _master_keys_lock
-
-  terminate_ssh_clients()
-
-  for p in _master_processes:
-    try:
-      os.kill(p.pid, SIGTERM)
-      p.wait()
-    except OSError:
-      pass
-  del _master_processes[:]
-  _master_keys.clear()
-
-  d = ssh_sock(create=False)
-  if d:
-    try:
-      platform_utils.rmdir(os.path.dirname(d))
-    except OSError:
-      pass
-
-  # We're done with the lock, so we can delete it.
-  _master_keys_lock = None
-
-URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
 URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
 
+
 def GetSchemeFromUrl(url):
   m = URI_ALL.match(url)
   if m:
     return m.group(1)
   return None
 
+
 @contextlib.contextmanager
 def GetUrlCookieFile(url, quiet):
   if url.startswith('persistent-'):
@@ -554,29 +518,11 @@
     cookiefile = os.path.expanduser(cookiefile)
   yield cookiefile, None
 
-def _preconnect(url):
-  m = URI_ALL.match(url)
-  if m:
-    scheme = m.group(1)
-    host = m.group(2)
-    if ':' in host:
-      host, port = host.split(':')
-    else:
-      port = None
-    if scheme in ('ssh', 'git+ssh', 'ssh+git'):
-      return _open_ssh(host, port)
-    return False
-
-  m = URI_SCP.match(url)
-  if m:
-    host = m.group(1)
-    return _open_ssh(host)
-
-  return False
 
 class Remote(object):
   """Configuration options related to a remote.
   """
+
   def __init__(self, config, name):
     self._config = config
     self.name = name
@@ -585,7 +531,7 @@
     self.review = self._Get('review')
     self.projectname = self._Get('projectname')
     self.fetch = list(map(RefSpec.FromString,
-                      self._Get('fetch', all_keys=True)))
+                          self._Get('fetch', all_keys=True)))
     self._review_url = None
 
   def _InsteadOf(self):
@@ -599,8 +545,8 @@
       insteadOfList = globCfg.GetString(key, all_keys=True)
 
       for insteadOf in insteadOfList:
-        if self.url.startswith(insteadOf) \
-        and len(insteadOf) > len(longest):
+        if (self.url.startswith(insteadOf)
+                and len(insteadOf) > len(longest)):
           longest = insteadOf
           longestUrl = url
 
@@ -609,9 +555,23 @@
 
     return self.url.replace(longest, longestUrl, 1)
 
-  def PreConnectFetch(self):
+  def PreConnectFetch(self, ssh_proxy):
+    """Run any setup for this remote before we connect to it.
+
+    In practice, if the remote is using SSH, we'll attempt to create a new
+    SSH master session to it for reuse across projects.
+
+    Args:
+      ssh_proxy: The SSH settings for managing master sessions.
+
+    Returns:
+      Whether the preconnect phase for this remote was successful.
+    """
+    if not ssh_proxy:
+      return True
+
     connectionUrl = self._InsteadOf()
-    return _preconnect(connectionUrl)
+    return ssh_proxy.preconnect(connectionUrl)
 
   def ReviewUrl(self, userEmail, validate_certs):
     if self._review_url is None:
@@ -731,12 +691,13 @@
 
   def _Get(self, key, all_keys=False):
     key = 'remote.%s.%s' % (self.name, key)
-    return self._config.GetString(key, all_keys = all_keys)
+    return self._config.GetString(key, all_keys=all_keys)
 
 
 class Branch(object):
   """Configuration options related to a single branch.
   """
+
   def __init__(self, config, name):
     self._config = config
     self.name = name
@@ -780,4 +741,71 @@
 
   def _Get(self, key, all_keys=False):
     key = 'branch.%s.%s' % (self.name, key)
-    return self._config.GetString(key, all_keys = all_keys)
+    return self._config.GetString(key, all_keys=all_keys)
+
+
+class SyncAnalysisState:
+  """Configuration options related to logging of sync state for analysis.
+
+  This object is versioned.
+  """
+  def __init__(self, config, options, superproject_logging_data):
+    """Initializes SyncAnalysisState.
+
+    Saves the following data into the |config| object.
+    - sys.argv, options, superproject's logging data.
+    - repo.*, branch.* and remote.* parameters from config object.
+    - Current time as synctime.
+    - Version number of the object.
+
+    All the keys saved by this object are prepended with SYNC_STATE_PREFIX.
+
+    Args:
+      config: GitConfig object to store all options.
+      options: Options passed to sync returned from optparse. See _Options().
+      superproject_logging_data: A dictionary of superproject data that is to be logged.
+    """
+    self._config = config
+    now = datetime.datetime.utcnow()
+    self._Set('main.synctime', now.isoformat() + 'Z')
+    self._Set('main.version', '1')
+    self._Set('sys.argv', sys.argv)
+    for key, value in superproject_logging_data.items():
+      self._Set(f'superproject.{key}', value)
+    for key, value in options.__dict__.items():
+      self._Set(f'options.{key}', value)
+    config_items = config.DumpConfigDict().items()
+    EXTRACT_NAMESPACES = {'repo', 'branch', 'remote'}
+    self._SetDictionary({k: v for k, v in config_items
+                         if not k.startswith(SYNC_STATE_PREFIX) and
+                         k.split('.', 1)[0] in EXTRACT_NAMESPACES})
+
+  def _SetDictionary(self, data):
+    """Save all key/value pairs of |data| dictionary.
+
+    Args:
+      data: A dictionary whose key/value are to be saved.
+    """
+    for key, value in data.items():
+      self._Set(key, value)
+
+  def _Set(self, key, value):
+    """Set the |value| for a |key| in the |_config| member.
+
+    |key| is prepended with the value of SYNC_STATE_PREFIX constant.
+
+    Args:
+      key: Name of the key.
+      value: |value| could be of any type. If it is 'bool', it will be saved
+             as a Boolean and for all other types, it will be saved as a String.
+    """
+    if value is None:
+      return
+    sync_key = f'{SYNC_STATE_PREFIX}{key}'
+    sync_key = sync_key.replace('_', '')
+    if isinstance(value, str):
+      self._config.SetString(sync_key, value)
+    elif isinstance(value, bool):
+      self._config.SetBoolean(sync_key, value)
+    else:
+      self._config.SetString(sync_key, str(value))
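The key normalization done by `SyncAnalysisState._Set` above (prefixing, underscore stripping, and type coercion) can be sketched in isolation. The `SYNC_STATE_PREFIX` value and the `StubConfig` class below are assumptions for illustration, not the real constant or GitConfig API:

```python
SYNC_STATE_PREFIX = 'repo.syncstate.'  # assumed value for illustration


class StubConfig:
  """Minimal stand-in for GitConfig: records typed key/value pairs."""

  def __init__(self):
    self.data = {}

  def SetString(self, key, value):
    self.data[key] = value

  def SetBoolean(self, key, value):
    self.data[key] = value


def set_sync_key(config, key, value):
  """Mirror of _Set: drop None, prefix the key, strip underscores
  (git config key names disallow them), and coerce non-bool values
  to strings."""
  if value is None:
    return
  sync_key = f'{SYNC_STATE_PREFIX}{key}'.replace('_', '')
  if isinstance(value, str):
    config.SetString(sync_key, value)
  elif isinstance(value, bool):
    config.SetBoolean(sync_key, value)
  else:
    config.SetString(sync_key, str(value))


cfg = StubConfig()
set_sync_key(cfg, 'options.current_branch_only', True)
set_sync_key(cfg, 'main.version', '1')
set_sync_key(cfg, 'options.jobs', None)  # None values are dropped
print(cfg.data)
```

Note how `options.current_branch_only` is flattened to `options.currentbranchonly` before storage; any later analysis of the logged keys has to account for that.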
diff --git a/git_refs.py b/git_refs.py
index debd4cb..2d4a809 100644
--- a/git_refs.py
+++ b/git_refs.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -18,12 +16,14 @@
 from repo_trace import Trace
 import platform_utils
 
-HEAD      = 'HEAD'
+HEAD = 'HEAD'
 R_CHANGES = 'refs/changes/'
-R_HEADS   = 'refs/heads/'
-R_TAGS    = 'refs/tags/'
-R_PUB     = 'refs/published/'
-R_M       = 'refs/remotes/m/'
+R_HEADS = 'refs/heads/'
+R_TAGS = 'refs/tags/'
+R_PUB = 'refs/published/'
+R_WORKTREE = 'refs/worktree/'
+R_WORKTREE_M = R_WORKTREE + 'm/'
+R_M = 'refs/remotes/m/'
 
 
 class GitRefs(object):
@@ -131,11 +131,14 @@
     base = os.path.join(self._gitdir, prefix)
     for name in platform_utils.listdir(base):
       p = os.path.join(base, name)
-      if platform_utils.isdir(p):
+      # We don't implement the full ref validation algorithm, just the simple
+      # rules that would show up in local filesystems.
+      # https://git-scm.com/docs/git-check-ref-format
+      if name.startswith('.') or name.endswith('.lock'):
+        pass
+      elif platform_utils.isdir(p):
         self._mtime[prefix] = os.path.getmtime(base)
         self._ReadLoose(prefix + name + '/')
-      elif name.endswith('.lock'):
-        pass
       else:
         self._ReadLoose1(p, prefix + name)
 
@@ -144,7 +147,7 @@
       with open(path) as fd:
         mtime = os.path.getmtime(path)
         ref_id = fd.readline()
-    except (IOError, OSError):
+    except (OSError, UnicodeError):
       return
 
     try:
diff --git a/git_superproject.py b/git_superproject.py
new file mode 100644
index 0000000..4ca84a5
--- /dev/null
+++ b/git_superproject.py
@@ -0,0 +1,415 @@
+# Copyright (C) 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Provide functionality to get all projects and their commit ids from Superproject.
+
+For more information on superproject, check out:
+https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
+
+Examples:
+  superproject = Superproject()
+  UpdateProjectsResult = superproject.UpdateProjectsRevisionId(projects)
+"""
+
+import hashlib
+import functools
+import os
+import sys
+import time
+from typing import NamedTuple
+
+from git_command import git_require, GitCommand
+from git_config import RepoConfig
+from git_refs import R_HEADS
+from manifest_xml import LOCAL_MANIFEST_GROUP_PREFIX
+
+_SUPERPROJECT_GIT_NAME = 'superproject.git'
+_SUPERPROJECT_MANIFEST_NAME = 'superproject_override.xml'
+
+
+class SyncResult(NamedTuple):
+  """Return the status of sync and whether caller should exit."""
+
+  # Whether the superproject sync was successful.
+  success: bool
+  # Whether the caller should exit.
+  fatal: bool
+
+
+class CommitIdsResult(NamedTuple):
+  """Return the commit ids and whether caller should exit."""
+
+  # A dictionary with the projects/commit ids on success, otherwise None.
+  commit_ids: dict
+  # Whether the caller should exit.
+  fatal: bool
+
+
+class UpdateProjectsResult(NamedTuple):
+  """Return the overriding manifest file and whether caller should exit."""
+
+  # Path name of the overriding manifest file if successful, otherwise None.
+  manifest_path: str
+  # Whether the caller should exit.
+  fatal: bool
+
+
+class Superproject(object):
+  """Get commit ids from superproject.
+
+  Initializes a local copy of a superproject for the manifest. This allows
+  lookup of commit ids for all projects. It contains _project_commit_ids which
+  is a dictionary with project/commit id entries.
+  """
+  def __init__(self, manifest, repodir, git_event_log,
+               superproject_dir='exp-superproject', quiet=False, print_messages=False):
+    """Initializes superproject.
+
+    Args:
+      manifest: A Manifest object that is to be written to a file.
+      repodir: Path to the .repo/ dir for holding all internal checkout state.
+          It must be in the top directory of the repo client checkout.
+      git_event_log: A git trace2 event log to log events.
+      superproject_dir: Relative path under |repodir| to checkout superproject.
+      quiet: If True, only print progress messages.
+      print_messages: If True, print error/warning messages.
+    """
+    self._project_commit_ids = None
+    self._manifest = manifest
+    self._git_event_log = git_event_log
+    self._quiet = quiet
+    self._print_messages = print_messages
+    self._branch = manifest.branch
+    self._repodir = os.path.abspath(repodir)
+    self._superproject_dir = superproject_dir
+    self._superproject_path = os.path.join(self._repodir, superproject_dir)
+    self._manifest_path = os.path.join(self._superproject_path,
+                                       _SUPERPROJECT_MANIFEST_NAME)
+    git_name = ''
+    if self._manifest.superproject:
+      remote = self._manifest.superproject['remote']
+      git_name = hashlib.md5(remote.name.encode('utf8')).hexdigest() + '-'
+      self._branch = self._manifest.superproject['revision']
+      self._remote_url = remote.url
+    else:
+      self._remote_url = None
+    self._work_git_name = git_name + _SUPERPROJECT_GIT_NAME
+    self._work_git = os.path.join(self._superproject_path, self._work_git_name)
+
+  @property
+  def project_commit_ids(self):
+    """Returns a dictionary of projects and their commit ids."""
+    return self._project_commit_ids
+
+  @property
+  def manifest_path(self):
+    """Returns the manifest path if the path exists or None."""
+    return self._manifest_path if os.path.exists(self._manifest_path) else None
+
+  def _LogMessage(self, message):
+    """Logs message to stderr and _git_event_log."""
+    if self._print_messages:
+      print(message, file=sys.stderr)
+    self._git_event_log.ErrorEvent(message, f'{message}')
+
+  def _LogMessagePrefix(self):
+    """Returns the prefix string to be logged in each log message."""
+    return f'repo superproject branch: {self._branch} url: {self._remote_url}'
+
+  def _LogError(self, message):
+    """Logs error message to stderr and _git_event_log."""
+    self._LogMessage(f'{self._LogMessagePrefix()} error: {message}')
+
+  def _LogWarning(self, message):
+    """Logs warning message to stderr and _git_event_log."""
+    self._LogMessage(f'{self._LogMessagePrefix()} warning: {message}')
+
+  def _Init(self):
+    """Sets up a local Git repository to get a copy of a superproject.
+
+    Returns:
+      True if initialization is successful, or False.
+    """
+    if not os.path.exists(self._superproject_path):
+      os.mkdir(self._superproject_path)
+    if not self._quiet and not os.path.exists(self._work_git):
+      print('%s: Performing initial setup for superproject; this might take '
+            'several minutes.' % self._work_git)
+    cmd = ['init', '--bare', self._work_git_name]
+    p = GitCommand(None,
+                   cmd,
+                   cwd=self._superproject_path,
+                   capture_stdout=True,
+                   capture_stderr=True)
+    retval = p.Wait()
+    if retval:
+      self._LogWarning(f'git init call failed, command: git {cmd}, '
+                       f'return code: {retval}, stderr: {p.stderr}')
+      return False
+    return True
+
+  def _Fetch(self):
+    """Fetches a local copy of a superproject for the manifest based on |_remote_url|.
+
+    Returns:
+      True if fetch is successful, or False.
+    """
+    if not os.path.exists(self._work_git):
+      self._LogWarning(f'git fetch missing directory: {self._work_git}')
+      return False
+    if not git_require((2, 28, 0)):
+      self._LogWarning('superproject requires a git version 2.28 or later')
+      return False
+    cmd = ['fetch', self._remote_url, '--depth', '1', '--force', '--no-tags',
+           '--filter', 'blob:none']
+    if self._branch:
+      cmd += [self._branch + ':' + self._branch]
+    p = GitCommand(None,
+                   cmd,
+                   cwd=self._work_git,
+                   capture_stdout=True,
+                   capture_stderr=True)
+    retval = p.Wait()
+    if retval:
+      self._LogWarning(f'git fetch call failed, command: git {cmd}, '
+                       f'return code: {retval}, stderr: {p.stderr}')
+      return False
+    return True
+
+  def _LsTree(self):
+    """Gets the commit ids for all projects.
+
+    Works only in git repositories.
+
+    Returns:
+      data: data returned from 'git ls-tree ...', or None on error.
+    """
+    if not os.path.exists(self._work_git):
+      self._LogWarning(f'git ls-tree missing directory: {self._work_git}')
+      return None
+    data = None
+    branch = 'HEAD' if not self._branch else self._branch
+    cmd = ['ls-tree', '-z', '-r', branch]
+
+    p = GitCommand(None,
+                   cmd,
+                   cwd=self._work_git,
+                   capture_stdout=True,
+                   capture_stderr=True)
+    retval = p.Wait()
+    if retval == 0:
+      data = p.stdout
+    else:
+      self._LogWarning(f'git ls-tree call failed, command: git {cmd}, '
+                       f'return code: {retval}, stderr: {p.stderr}')
+    return data
+
+  def Sync(self):
+    """Gets a local copy of a superproject for the manifest.
+
+    Returns:
+      SyncResult
+    """
+    if not self._manifest.superproject:
+      self._LogWarning(f'superproject tag is not defined in manifest: '
+                       f'{self._manifest.manifestFile}')
+      return SyncResult(False, False)
+
+    print('NOTICE: --use-superproject is in beta; report any issues to the '
+          'address described in `repo version`', file=sys.stderr)
+    should_exit = True
+    if not self._remote_url:
+      self._LogWarning(f'superproject URL is not defined in manifest: '
+                       f'{self._manifest.manifestFile}')
+      return SyncResult(False, should_exit)
+
+    if not self._Init():
+      return SyncResult(False, should_exit)
+    if not self._Fetch():
+      return SyncResult(False, should_exit)
+    if not self._quiet:
+      print('%s: Initial setup for superproject completed.' % self._work_git)
+    return SyncResult(True, False)
+
+  def _GetAllProjectsCommitIds(self):
+    """Get commit ids for all projects from superproject and save them in _project_commit_ids.
+
+    Returns:
+      CommitIdsResult
+    """
+    sync_result = self.Sync()
+    if not sync_result.success:
+      return CommitIdsResult(None, sync_result.fatal)
+
+    data = self._LsTree()
+    if not data:
+      self._LogWarning(f'git ls-tree failed to return data for manifest: '
+                       f'{self._manifest.manifestFile}')
+      return CommitIdsResult(None, True)
+
+    # Parse lines like the following to select lines starting with '160000' and
+    # build a dictionary with project path (last element) and its commit id (3rd element).
+    #
+    # 160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00
+    # 120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00
+    commit_ids = {}
+    for line in data.split('\x00'):
+      ls_data = line.split(None, 3)
+      if not ls_data:
+        break
+      if ls_data[0] == '160000':
+        commit_ids[ls_data[3]] = ls_data[2]
+
+    self._project_commit_ids = commit_ids
+    return CommitIdsResult(commit_ids, False)
+
+  def _WriteManifestFile(self):
+    """Writes manifest to a file.
+
+    Returns:
+      manifest_path: Path name of the file the manifest was written to, or None on error.
+    """
+    if not os.path.exists(self._superproject_path):
+      self._LogWarning(f'missing superproject directory: {self._superproject_path}')
+      return None
+    manifest_str = self._manifest.ToXml(groups=self._manifest.GetGroupsStr()).toxml()
+    manifest_path = self._manifest_path
+    try:
+      with open(manifest_path, 'w', encoding='utf-8') as fp:
+        fp.write(manifest_str)
+    except IOError as e:
+      self._LogError(f'cannot write manifest to {manifest_path}: {e}')
+      return None
+    return manifest_path
+
+  def _SkipUpdatingProjectRevisionId(self, project):
+    """Checks if a project's revision id needs to be updated or not.
+
+    Revision id for projects from local manifest will not be updated.
+
+    Args:
+      project: project whose revision id is being updated.
+
+    Returns:
+      True if a project's revision id should not be updated, or False.
+    """
+    path = project.relpath
+    if not path:
+      return True
+    # Skip the project with revisionId.
+    if project.revisionId:
+      return True
+    # Skip the project if it comes from the local manifest.
+    return any(s.startswith(LOCAL_MANIFEST_GROUP_PREFIX) for s in project.groups)
+
+  def UpdateProjectsRevisionId(self, projects):
+    """Update revisionId of every project in projects with the commit id.
+
+    Args:
+      projects: List of projects whose revisionId needs to be updated.
+
+    Returns:
+      UpdateProjectsResult
+    """
+    commit_ids_result = self._GetAllProjectsCommitIds()
+    commit_ids = commit_ids_result.commit_ids
+    if not commit_ids:
+      return UpdateProjectsResult(None, commit_ids_result.fatal)
+
+    projects_missing_commit_ids = []
+    for project in projects:
+      if self._SkipUpdatingProjectRevisionId(project):
+        continue
+      path = project.relpath
+      commit_id = commit_ids.get(path)
+      if not commit_id:
+        projects_missing_commit_ids.append(path)
+
+    # If superproject doesn't have a commit id for a project, then report an
+    # error event and continue as if --no-use-superproject were specified.
+    if projects_missing_commit_ids:
+      self._LogWarning(f'please file a bug using {self._manifest.contactinfo.bugurl} '
+                       f'to report missing commit_ids for: {projects_missing_commit_ids}')
+      return UpdateProjectsResult(None, False)
+
+    for project in projects:
+      if not self._SkipUpdatingProjectRevisionId(project):
+        project.SetRevisionId(commit_ids.get(project.relpath))
+
+    manifest_path = self._WriteManifestFile()
+    return UpdateProjectsResult(manifest_path, False)
+
+
+@functools.lru_cache(maxsize=None)
+def _UseSuperprojectFromConfiguration():
+  """Returns the user choice of whether to use superproject."""
+  user_cfg = RepoConfig.ForUser()
+  time_now = int(time.time())
+
+  user_value = user_cfg.GetBoolean('repo.superprojectChoice')
+  if user_value is not None:
+    user_expiration = user_cfg.GetInt('repo.superprojectChoiceExpire')
+    if user_expiration is None or user_expiration <= 0 or user_expiration >= time_now:
+      # TODO(b/190688390) - Remove prompt when we are comfortable with the new
+      # default value.
+      if user_value:
+        print(('You are currently enrolled in Git submodules experiment '
+               '(go/android-submodules-quickstart).  Use --no-use-superproject '
+               'to override.\n'), file=sys.stderr)
+      else:
+        print(('You are not currently enrolled in Git submodules experiment '
+               '(go/android-submodules-quickstart).  Use --use-superproject '
+               'to override.\n'), file=sys.stderr)
+      return user_value
+
+  # We don't have an unexpired choice, ask for one.
+  system_cfg = RepoConfig.ForSystem()
+  system_value = system_cfg.GetBoolean('repo.superprojectChoice')
+  if system_value:
+    # The system configuration is proposing that we should enable the
+    # use of superproject. Treat the user as enrolled for two weeks.
+    #
+    # TODO(b/190688390) - Remove prompt when we are comfortable with the new
+    # default value.
+    userchoice = True
+    time_choiceexpire = time_now + (86400 * 14)
+    user_cfg.SetString('repo.superprojectChoiceExpire', str(time_choiceexpire))
+    user_cfg.SetBoolean('repo.superprojectChoice', userchoice)
+    print('You are automatically enrolled in Git submodules experiment '
+          '(go/android-submodules-quickstart) for another two weeks.\n',
+          file=sys.stderr)
+    return True
+
+  # For all other cases, we would not use superproject by default.
+  return False
+
+
+def PrintMessages(opt, manifest):
+  """Returns whether error/warning messages should be printed."""
+  return opt.use_superproject is not None or manifest.superproject
+
+
+def UseSuperproject(opt, manifest):
+  """Returns whether the use-superproject option is enabled."""
+
+  if opt.use_superproject is not None:
+    return opt.use_superproject
+  else:
+    client_value = manifest.manifestProject.config.GetBoolean('repo.superproject')
+    if client_value is not None:
+      return client_value
+    else:
+      if not manifest.superproject:
+        return False
+      return _UseSuperprojectFromConfiguration()
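The gitlink parsing in `_GetAllProjectsCommitIds` above can be exercised standalone. This sketch copies the loop's logic; the sample data is shaped like the comment's example (NUL-terminated records, tab before the path), not real repository output:

```python
def parse_ls_tree(data):
  """Extract {path: commit_id} for gitlink (mode 160000) entries from
  `git ls-tree -z -r` output, as in _GetAllProjectsCommitIds above."""
  commit_ids = {}
  for line in data.split('\x00'):
    # Each record: '<mode> <type> <id>\t<path>'.  Split at most 3 times
    # so a path containing spaces survives intact in ls_data[3].
    ls_data = line.split(None, 3)
    if not ls_data:
      break
    if ls_data[0] == '160000':
      commit_ids[ls_data[3]] = ls_data[2]
  return commit_ids


sample = (
    '160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
    '120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00'
)
print(parse_ls_tree(sample))
```

Only mode `160000` (submodule/gitlink) entries are collected; blobs and trees in the superproject are ignored.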
diff --git a/git_trace2_event_log.py b/git_trace2_event_log.py
new file mode 100644
index 0000000..0e5e908
--- /dev/null
+++ b/git_trace2_event_log.py
@@ -0,0 +1,273 @@
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Provide event logging in the git trace2 EVENT format.
+
+The git trace2 EVENT format is defined at:
+https://www.kernel.org/pub/software/scm/git/docs/technical/api-trace2.html#_event_format
+https://git-scm.com/docs/api-trace2#_the_event_format_target
+
+  Usage:
+
+  git_trace_log = EventLog()
+  git_trace_log.StartEvent()
+  ...
+  git_trace_log.ExitEvent()
+  git_trace_log.Write()
+"""
+
+
+import datetime
+import json
+import os
+import sys
+import tempfile
+import threading
+
+from git_command import GitCommand, RepoSourceVersion
+
+
+class EventLog(object):
+  """Event log that records events that occurred during a repo invocation.
+
+  Events are written to the log as consecutive JSON entries, one per line.
+  Entries follow the git trace2 EVENT format.
+
+  Each entry contains the following common keys:
+  - event: The event name
+  - sid: session-id - Unique string to allow process instance to be identified.
+  - thread: The thread name.
+  - time: The UTC time of the event.
+
+  Valid 'event' names and event specific fields are documented here:
+  https://git-scm.com/docs/api-trace2#_event_format
+  """
+
+  def __init__(self, env=None):
+    """Initializes the event log."""
+    self._log = []
+    # Try to get session-id (sid) from environment (setup in repo launcher).
+    KEY = 'GIT_TRACE2_PARENT_SID'
+    if env is None:
+      env = os.environ
+
+    now = datetime.datetime.utcnow()
+
+    # Save both our sid component and the complete sid.
+    # We use our sid component (self._sid) as the unique filename prefix and
+    # the full sid (self._full_sid) in the log itself.
+    self._sid = 'repo-%s-P%08x' % (now.strftime('%Y%m%dT%H%M%SZ'), os.getpid())
+    parent_sid = env.get(KEY)
+    # Append our sid component to the parent sid (if it exists).
+    if parent_sid is not None:
+      self._full_sid = parent_sid + '/' + self._sid
+    else:
+      self._full_sid = self._sid
+
+    # Set/update the environment variable.
+    # Environment handling across systems is messy.
+    try:
+      env[KEY] = self._full_sid
+    except UnicodeEncodeError:
+      env[KEY] = self._full_sid.encode()
+
+    # Add a version event to front of the log.
+    self._AddVersionEvent()
+
+  @property
+  def full_sid(self):
+    return self._full_sid
+
+  def _AddVersionEvent(self):
+    """Adds a 'version' event at the beginning of current log."""
+    version_event = self._CreateEventDict('version')
+    version_event['evt'] = "2"
+    version_event['exe'] = RepoSourceVersion()
+    self._log.insert(0, version_event)
+
+  def _CreateEventDict(self, event_name):
+    """Returns a dictionary with the common keys/values for git trace2 events.
+
+    Args:
+      event_name: The event name.
+
+    Returns:
+      Dictionary with the common event fields populated.
+    """
+    return {
+        'event': event_name,
+        'sid': self._full_sid,
+        'thread': threading.currentThread().getName(),
+        'time': datetime.datetime.utcnow().isoformat() + 'Z',
+    }
+
+  def StartEvent(self):
+    """Append a 'start' event to the current log."""
+    start_event = self._CreateEventDict('start')
+    start_event['argv'] = sys.argv
+    self._log.append(start_event)
+
+  def ExitEvent(self, result):
+    """Append an 'exit' event to the current log.
+
+    Args:
+      result: Exit code of the event.
+    """
+    exit_event = self._CreateEventDict('exit')
+
+    # Consider 'None' success (consistent with event_log result handling).
+    if result is None:
+      result = 0
+    exit_event['code'] = result
+    self._log.append(exit_event)
+
+  def CommandEvent(self, name, subcommands):
+    """Append a 'command' event to the current log.
+
+    Args:
+      name: Name of the primary command (ex: repo, git)
+      subcommands: List of the sub-commands (ex: version, init, sync)
+    """
+    command_event = self._CreateEventDict('command')
+    command_event['name'] = name
+    command_event['subcommands'] = subcommands
+    self._log.append(command_event)
+
+  def LogConfigEvents(self, config, event_dict_name):
+    """Append a |event_dict_name| event for each config key in |config|.
+
+    Args:
+      config: Configuration dictionary.
+      event_dict_name: Name of the event dictionary for items to be logged under.
+    """
+    for param, value in config.items():
+      event = self._CreateEventDict(event_dict_name)
+      event['param'] = param
+      event['value'] = value
+      self._log.append(event)
+
+  def DefParamRepoEvents(self, config):
+    """Append a 'def_param' event for each repo.* config key to the current log.
+
+    Args:
+      config: Repo configuration dictionary
+    """
+    # Only output the repo.* config parameters.
+    repo_config = {k: v for k, v in config.items() if k.startswith('repo.')}
+    self.LogConfigEvents(repo_config, 'def_param')
+
+  def GetDataEventName(self, value):
+    """Returns 'data-json' if the value is an array else returns 'data'."""
+    return 'data-json' if value[0] == '[' and value[-1] == ']' else 'data'
+
+  def LogDataConfigEvents(self, config, prefix):
+    """Append a 'data' event for each config key/value in |config| to the current log.
+
+    For each keyX/valueX pair in |config|, the event's "key" field is
+    '|prefix|/keyX' and its "value" field is valueX.
+
+    Args:
+      config: Configuration dictionary.
+      prefix: Prefix for each key that is logged.
+    """
+    for key, value in config.items():
+      event = self._CreateEventDict(self.GetDataEventName(value))
+      event['key'] = f'{prefix}/{key}'
+      event['value'] = value
+      self._log.append(event)
+
+  def ErrorEvent(self, msg, fmt):
+    """Append an 'error' event to the current log."""
+    error_event = self._CreateEventDict('error')
+    error_event['msg'] = msg
+    error_event['fmt'] = fmt
+    self._log.append(error_event)
+
+  def _GetEventTargetPath(self):
+    """Get the 'trace2.eventtarget' path from git configuration.
+
+    Returns:
+      path: git config's 'trace2.eventtarget' path if it exists, or None
+    """
+    path = None
+    cmd = ['config', '--get', 'trace2.eventtarget']
+    # TODO(https://crbug.com/gerrit/13706): Use GitConfig when it supports
+    # system git config variables.
+    p = GitCommand(None, cmd, capture_stdout=True, capture_stderr=True,
+                   bare=True)
+    retval = p.Wait()
+    if retval == 0:
+      # Strip trailing newline from path.
+      path = p.stdout.rstrip('\n')
+    elif retval != 1:
+      # `git config --get` is documented to produce an exit status of `1` if
+      # the requested variable is not present in the configuration. Report any
+      # other return value as an error.
+      print("repo: error: 'git config --get' call failed with return code: %r, stderr: %r" % (
+          retval, p.stderr), file=sys.stderr)
+    return path
+
+  def Write(self, path=None):
+    """Writes the log out to a file.
+
+    Log is only written if 'path' or 'git config --get trace2.eventtarget'
+    provide a valid path to write logs to.
+
+    Logging filename format follows the git trace2 style of being a unique
+    (exclusive writable) file.
+
+    Args:
+      path: Path to where logs should be written.
+
+    Returns:
+      log_path: Path to the log file if log is written, otherwise None
+    """
+    log_path = None
+    # If no logging path is specified, get the path from 'trace2.eventtarget'.
+    if path is None:
+      path = self._GetEventTargetPath()
+
+    # If no logging path is specified, exit.
+    if path is None:
+      return None
+
+    if isinstance(path, str):
+      # Get absolute path.
+      path = os.path.abspath(os.path.expanduser(path))
+    else:
+      raise TypeError('path: str required but got %s.' % type(path))
+
+    # Git trace2 requires a directory to write log to.
+
+    # TODO(https://crbug.com/gerrit/13706): Support file (append) mode also.
+    if not os.path.isdir(path):
+      return None
+    # Use NamedTemporaryFile to generate a unique filename as required by git trace2.
+    try:
+      with tempfile.NamedTemporaryFile(mode='x', prefix=self._sid, dir=path,
+                                       delete=False) as f:
+        # TODO(https://crbug.com/gerrit/13706): Support writing events as they
+        # occur.
+        for e in self._log:
+          # Dump in compact encoding mode.
+          # See 'Compact encoding' in Python docs:
+          # https://docs.python.org/3/library/json.html#module-json
+          json.dump(e, f, indent=None, separators=(',', ':'))
+          f.write('\n')
+        log_path = f.name
+    except FileExistsError as err:
+      print('repo: warning: git trace2 logging failed: %r' % err,
+            file=sys.stderr)
+      return None
+    return log_path
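For reference, the compact JSON encoding used by the `Write` method above produces one self-contained event per line with no padding whitespace, matching git's trace2 event-target style. A minimal sketch — the event dicts here are illustrative placeholders, not the exact trace2 schema:

```python
import json

# Hypothetical events; real trace2 events carry fields like sid, time, etc.
events = [
    {'event': 'version', 'sid': 'repo-20210101-P000abc'},
    {'event': 'start', 'argv': ['repo', 'sync']},
]
# Compact encoding: indent=None plus separators=(',', ':') drops all
# inter-token whitespace, giving one dense JSON object per line.
lines = [json.dumps(e, indent=None, separators=(',', ':')) for e in events]
print(lines[0])  # {"event":"version","sid":"repo-20210101-P000abc"}
```

Each line round-trips through `json.loads`, so downstream consumers can stream the log one event at a time.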
diff --git a/gitc_utils.py b/gitc_utils.py
index b47e181..486bbeb 100644
--- a/gitc_utils.py
+++ b/gitc_utils.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2015 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,8 +12,8 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import os
+import multiprocessing
 import platform
 import re
 import sys
@@ -29,12 +27,24 @@
 
 NUM_BATCH_RETRIEVE_REVISIONID = 32
 
+
 def get_gitc_manifest_dir():
   return wrapper.Wrapper().get_gitc_manifest_dir()
 
+
 def parse_clientdir(gitc_fs_path):
   return wrapper.Wrapper().gitc_parse_clientdir(gitc_fs_path)
 
+
+def _get_project_revision(args):
+  """Worker for _set_project_revisions to lookup one project remote."""
+  (i, url, expr) = args
+  gitcmd = git_command.GitCommand(
+      None, ['ls-remote', url, expr], capture_stdout=True, cwd='/tmp')
+  rc = gitcmd.Wait()
+  return (i, rc, gitcmd.stdout.split('\t', 1)[0])
+
+
 def _set_project_revisions(projects):
   """Sets the revisionExpr for a list of projects.
 
@@ -42,47 +52,38 @@
   should not be overly large. Recommend calling this function multiple times
   with each call not exceeding NUM_BATCH_RETRIEVE_REVISIONID projects.
 
-  @param projects: List of project objects to set the revionExpr for.
+  Args:
+    projects: List of project objects to set the revisionExpr for.
   """
  # Retrieve the commit id for each project based on its current
  # revisionExpr, if it is not already a commit id.
-  project_gitcmds = [(
-      project, git_command.GitCommand(None,
-                                      ['ls-remote',
-                                       project.remote.url,
-                                       project.revisionExpr],
-                                      capture_stdout=True, cwd='/tmp'))
-      for project in projects if not git_config.IsId(project.revisionExpr)]
-  for proj, gitcmd in project_gitcmds:
-    if gitcmd.Wait():
-      print('FATAL: Failed to retrieve revisionExpr for %s' % proj)
-      sys.exit(1)
-    revisionExpr = gitcmd.stdout.split('\t')[0]
-    if not revisionExpr:
-      raise ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
-                               (proj.remote.url, proj.revisionExpr))
-    proj.revisionExpr = revisionExpr
+  with multiprocessing.Pool(NUM_BATCH_RETRIEVE_REVISIONID) as pool:
+    results_iter = pool.imap_unordered(
+        _get_project_revision,
+        ((i, project.remote.url, project.revisionExpr)
+         for i, project in enumerate(projects)
+         if not git_config.IsId(project.revisionExpr)),
+        chunksize=8)
+    for (i, rc, revisionExpr) in results_iter:
+      project = projects[i]
+      if rc:
+        print('FATAL: Failed to retrieve revisionExpr for %s' % project.name)
+        pool.terminate()
+        sys.exit(1)
+      if not revisionExpr:
+        pool.terminate()
+        raise ManifestParseError('Invalid SHA-1 revision for project %s (%s)' %
+                                 (project.remote.url, project.revisionExpr))
+      project.revisionExpr = revisionExpr
 
-def _manifest_groups(manifest):
-  """Returns the manifest group string that should be synced
-
-  This is the same logic used by Command.GetProjects(), which is used during
-  repo sync
-
-  @param manifest: The XmlManifest object
-  """
-  mp = manifest.manifestProject
-  groups = mp.config.GetString('manifest.groups')
-  if not groups:
-    groups = 'default,platform-' + platform.system().lower()
-  return groups
 
 def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
   """Generate a manifest for shafsd to use for this GITC client.
 
-  @param gitc_manifest: Current gitc manifest, or None if there isn't one yet.
-  @param manifest: A GitcManifest object loaded with the current repo manifest.
-  @param paths: List of project paths we want to update.
+  Args:
+    gitc_manifest: Current gitc manifest, or None if there isn't one yet.
+    manifest: A GitcManifest object loaded with the current repo manifest.
+    paths: List of project paths we want to update.
   """
 
   print('Generating GITC Manifest by fetching revision SHAs for each '
@@ -90,7 +91,7 @@
   if paths is None:
     paths = list(manifest.paths.keys())
 
-  groups = [x for x in re.split(r'[,\s]+', _manifest_groups(manifest)) if x]
+  groups = [x for x in re.split(r'[,\s]+', manifest.GetGroupsStr()) if x]
 
   # Convert the paths to projects, and filter them to the matched groups.
   projects = [manifest.paths[p] for p in paths]
@@ -104,11 +105,11 @@
       if not proj.upstream and not git_config.IsId(proj.revisionExpr):
         proj.upstream = proj.revisionExpr
 
-      if not path in gitc_manifest.paths:
+      if path not in gitc_manifest.paths:
         # Any new projects need their first revision, even if we weren't asked
         # for them.
         projects.append(proj)
-      elif not path in paths:
+      elif path not in paths:
         # And copy revisions from the previous manifest if we're not updating
         # them now.
         gitc_proj = gitc_manifest.paths[path]
@@ -118,11 +119,7 @@
         else:
           proj.revisionExpr = gitc_proj.revisionExpr
 
-  index = 0
-  while index < len(projects):
-    _set_project_revisions(
-        projects[index:(index+NUM_BATCH_RETRIEVE_REVISIONID)])
-    index += NUM_BATCH_RETRIEVE_REVISIONID
+  _set_project_revisions(projects)
 
   if gitc_manifest is not None:
     for path, proj in gitc_manifest.paths.items():
@@ -140,16 +137,20 @@
   # Save the manifest.
   save_manifest(manifest)
 
+
 def save_manifest(manifest, client_dir=None):
   """Save the manifest file in the client_dir.
 
-  @param client_dir: Client directory to save the manifest in.
-  @param manifest: Manifest object to save.
+  Args:
+    manifest: Manifest object to save.
+    client_dir: Client directory to save the manifest in.
   """
   if not client_dir:
-    client_dir = manifest.gitc_client_dir
-  with open(os.path.join(client_dir, '.manifest'), 'w') as f:
-    manifest.Save(f, groups=_manifest_groups(manifest))
+    manifest_file = manifest.manifestFile
+  else:
+    manifest_file = os.path.join(client_dir, '.manifest')
+  with open(manifest_file, 'w') as f:
+    manifest.Save(f, groups=manifest.GetGroupsStr())
   # TODO(sbasi/jorg): Come up with a solution to remove the sleep below.
   # Give the GITC filesystem time to register the manifest changes.
   time.sleep(3)
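The refactor above replaces the manual batching loop with a `multiprocessing.Pool` and `imap_unordered`, tagging each work item with its index so out-of-order results can be matched back to their project. A minimal sketch of that pattern — using `multiprocessing.dummy`, the thread-backed twin of the `Pool` API, so the sketch is self-contained; the worker and values are illustrative, not part of the change:

```python
from multiprocessing.dummy import Pool  # thread-backed twin of multiprocessing.Pool

def _square(args):
    # The worker mirrors _get_project_revision's shape: it is handed an index
    # so out-of-order results can be matched back to their inputs.
    i, n = args
    return (i, n * n)

def parallel_squares(values):
    results = [None] * len(values)
    # imap_unordered yields results as workers finish, not in submission
    # order, which is why each result carries its original index.
    with Pool(4) as pool:
        for i, sq in pool.imap_unordered(_square, enumerate(values), chunksize=2):
            results[i] = sq
    return results

print(parallel_squares([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

The same index-tagging idiom is what lets `_set_project_revisions` call `projects[i]` on each result despite the unordered delivery.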
diff --git a/hooks.py b/hooks.py
new file mode 100644
index 0000000..67c21a2
--- /dev/null
+++ b/hooks.py
@@ -0,0 +1,509 @@
+# Copyright (C) 2008 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import errno
+import json
+import os
+import re
+import subprocess
+import sys
+import traceback
+import urllib.parse
+
+from error import HookError
+from git_refs import HEAD
+
+
+class RepoHook(object):
+  """A RepoHook contains information about a script to run as a hook.
+
+  Hooks are used to run a python script before running an upload (for instance,
+  to run presubmit checks).  Eventually, we may have hooks for other actions.
+
+  This shouldn't be confused with files in the 'repo/hooks' directory.  Those
+  files are copied into each '.git/hooks' folder for each project.  Repo-level
+  hooks are associated instead with repo actions.
+
+  Hooks are always python.  When a hook is run, we will load the hook into the
+  interpreter and execute its main() function.
+
+  Combinations of hook option flags:
+  - no-verify=False, verify=False (DEFAULT):
+    If stdout is a tty, can prompt about running hooks if needed.
+    If user denies running hooks, the action is cancelled. If stdout is
+    not a tty and we would need to prompt about hooks, action is
+    cancelled.
+  - no-verify=False, verify=True:
+    Always run hooks with no prompt.
+  - no-verify=True, verify=False:
+    Never run hooks, but run action anyway (AKA bypass hooks).
+  - no-verify=True, verify=True:
+    Invalid
+  """
+
+  def __init__(self,
+               hook_type,
+               hooks_project,
+               repo_topdir,
+               manifest_url,
+               bypass_hooks=False,
+               allow_all_hooks=False,
+               ignore_hooks=False,
+               abort_if_user_denies=False):
+    """RepoHook constructor.
+
+    Args:
+      hook_type: A string representing the type of hook.  This is also used
+          to figure out the name of the file containing the hook.  For
+          example: 'pre-upload'.
+      hooks_project: The project containing the repo hooks.
+          If you have a manifest, this is manifest.repo_hooks_project.
+          OK if this is None, which will make the hook a no-op.
+      repo_topdir: The top directory of the repo client checkout.
+          This is the one containing the .repo directory. Scripts will
+          run with CWD as this directory.
+          If you have a manifest, this is manifest.topdir.
+      manifest_url: The URL to the manifest git repo.
+      bypass_hooks: If True, then 'Do not run the hook'.
+      allow_all_hooks: If True, then 'Run the hook without prompting'.
+      ignore_hooks: If True, then 'Do not abort action if hooks fail'.
+      abort_if_user_denies: If True, we'll abort running the hook if the user
+          doesn't allow us to run the hook.
+    """
+    self._hook_type = hook_type
+    self._hooks_project = hooks_project
+    self._repo_topdir = repo_topdir
+    self._manifest_url = manifest_url
+    self._bypass_hooks = bypass_hooks
+    self._allow_all_hooks = allow_all_hooks
+    self._ignore_hooks = ignore_hooks
+    self._abort_if_user_denies = abort_if_user_denies
+
+    # Store the full path to the script for convenience.
+    if self._hooks_project:
+      self._script_fullpath = os.path.join(self._hooks_project.worktree,
+                                           self._hook_type + '.py')
+    else:
+      self._script_fullpath = None
+
+  def _GetHash(self):
+    """Return a hash of the contents of the hooks directory.
+
+    We'll just use git to do this.  This hash has the property that if anything
+    changes in the directory we will return a different hash.
+
+    SECURITY CONSIDERATION:
+      This hash only represents the contents of files in the hook directory, not
+      any other files imported or called by hooks.  Changes to imported files
+      can change the script behavior without affecting the hash.
+
+    Returns:
+      A string representing the hash.  This will always be ASCII so that it can
+      be printed to the user easily.
+    """
+    assert self._hooks_project, "Must have hooks to calculate their hash."
+
+    # We will use the work_git object rather than just calling GetRevisionId().
+    # That gives us a hash of the latest checked in version of the files that
+    # the user will actually be executing.  Specifically, GetRevisionId()
+    # doesn't appear to change even if a user checks out a different version
+    # of the hooks repo (via git checkout) nor if a user commits their own revs.
+    #
+    # NOTE: Local (non-committed) changes will not be factored into this hash.
+    # I think this is OK, since we're really only worried about warning the user
+    # about upstream changes.
+    return self._hooks_project.work_git.rev_parse(HEAD)
+
+  def _GetMustVerb(self):
+    """Return 'must' if the hook is required; 'should' if not."""
+    if self._abort_if_user_denies:
+      return 'must'
+    else:
+      return 'should'
+
+  def _CheckForHookApproval(self):
+    """Check to see whether this hook has been approved.
+
+    We'll accept approval of manifest URLs if they're using secure transports.
+    This way the user can say they trust the manifest host.  For insecure
+    hosts, we fall back to checking the hash of the hooks repo.
+
+    Note that we ask permission for each individual hook even though we use
+    the hash of all hooks when detecting changes.  We'd like the user to be
+    able to approve / deny each hook individually.  We only use the hash of all
+    hooks because there is no other easy way to detect changes to local imports.
+
+    Returns:
+      True if this hook is approved to run; False otherwise.
+
+    Raises:
+      HookError: Raised if the user doesn't approve and abort_if_user_denies
+          was passed to the constructor.
+    """
+    if self._ManifestUrlHasSecureScheme():
+      return self._CheckForHookApprovalManifest()
+    else:
+      return self._CheckForHookApprovalHash()
+
+  def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt,
+                                  changed_prompt):
+    """Check for approval for a particular attribute and hook.
+
+    Args:
+      subkey: The git config key under [repo.hooks.<hook_type>] to store the
+          last approved string.
+      new_val: The new value to compare against the last approved one.
+      main_prompt: Message to display to the user to ask for approval.
+      changed_prompt: Message explaining why we're re-asking for approval.
+
+    Returns:
+      True if this hook is approved to run; False otherwise.
+
+    Raises:
+      HookError: Raised if the user doesn't approve and abort_if_user_denies
+          was passed to the constructor.
+    """
+    hooks_config = self._hooks_project.config
+    git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey)
+
+    # Get the last value that the user approved for this hook; may be None.
+    old_val = hooks_config.GetString(git_approval_key)
+
+    if old_val is not None:
+      # User previously approved hook and asked not to be prompted again.
+      if new_val == old_val:
+        # Approval matched.  We're done.
+        return True
+      else:
+        # Give the user a reason why we're prompting, since they last told
+        # us to "never ask again".
+        prompt = 'WARNING: %s\n\n' % (changed_prompt,)
+    else:
+      prompt = ''
+
+    # Prompt the user if we're on a tty; otherwise we'll assume "no".
+    if sys.stdout.isatty():
+      prompt += main_prompt + ' (yes/always/NO)? '
+      response = input(prompt).lower()
+      print()
+
+      # User is doing a one-time approval.
+      if response in ('y', 'yes'):
+        return True
+      elif response == 'always':
+        hooks_config.SetString(git_approval_key, new_val)
+        return True
+
+    # For anything else, we'll assume no approval.
+    if self._abort_if_user_denies:
+      raise HookError('You must allow the %s hook or use --no-verify.' %
+                      self._hook_type)
+
+    return False
+
+  def _ManifestUrlHasSecureScheme(self):
+    """Check if the URI for the manifest is a secure transport."""
+    secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc')
+    parse_results = urllib.parse.urlparse(self._manifest_url)
+    return parse_results.scheme in secure_schemes
+
+  def _CheckForHookApprovalManifest(self):
+    """Check whether the user has approved this manifest host.
+
+    Returns:
+      True if this hook is approved to run; False otherwise.
+    """
+    return self._CheckForHookApprovalHelper(
+        'approvedmanifest',
+        self._manifest_url,
+        'Run hook scripts from %s' % (self._manifest_url,),
+        'Manifest URL has changed since %s was allowed.' % (self._hook_type,))
+
+  def _CheckForHookApprovalHash(self):
+    """Check whether the user has approved the hooks repo.
+
+    Returns:
+      True if this hook is approved to run; False otherwise.
+    """
+    prompt = ('Repo %s run the script:\n'
+              '  %s\n'
+              '\n'
+              'Do you want to allow this script to run')
+    return self._CheckForHookApprovalHelper(
+        'approvedhash',
+        self._GetHash(),
+        prompt % (self._GetMustVerb(), self._script_fullpath),
+        'Scripts have changed since %s was allowed.' % (self._hook_type,))
+
+  @staticmethod
+  def _ExtractInterpFromShebang(data):
+    """Extract the interpreter used in the shebang.
+
+    Try to locate the interpreter the script is using (ignoring `env`).
+
+    Args:
+      data: The file content of the script.
+
+    Returns:
+      The basename of the main script interpreter, or None if a shebang is not
+      used or could not be parsed out.
+    """
+    firstline = data.splitlines()[:1]
+    if not firstline:
+      return None
+
+    # The format here can be tricky.
+    shebang = firstline[0].strip()
+    m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
+    if not m:
+      return None
+
+    # If using `env`, find the target program.
+    interp = m.group(1)
+    if os.path.basename(interp) == 'env':
+      interp = m.group(2)
+
+    return interp
+
+  def _ExecuteHookViaReexec(self, interp, context, **kwargs):
+    """Execute the hook script through |interp|.
+
+    Note: Support for this feature should be dropped ~Jun 2021.
+
+    Args:
+      interp: The Python program to run.
+      context: Basic Python context to execute the hook inside.
+      kwargs: Arbitrary arguments to pass to the hook script.
+
+    Raises:
+      HookError: When the hooks failed for any reason.
+    """
+    # This logic needs to be kept in sync with _ExecuteHookViaImport below.
+    script = """
+import json, os, sys
+path = '''%(path)s'''
+kwargs = json.loads('''%(kwargs)s''')
+context = json.loads('''%(context)s''')
+sys.path.insert(0, os.path.dirname(path))
+data = open(path).read()
+exec(compile(data, path, 'exec'), context)
+context['main'](**kwargs)
+""" % {
+        'path': self._script_fullpath,
+        'kwargs': json.dumps(kwargs),
+        'context': json.dumps(context),
+    }
+
+    # We pass the script via stdin to avoid OS argv limits.  It also makes
+    # unhandled exception tracebacks less verbose/confusing for users.
+    cmd = [interp, '-c', 'import sys; exec(sys.stdin.read())']
+    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
+    proc.communicate(input=script.encode('utf-8'))
+    if proc.returncode:
+      raise HookError('Failed to run %s hook.' % (self._hook_type,))
+
+  def _ExecuteHookViaImport(self, data, context, **kwargs):
+    """Execute the hook code in |data| directly.
+
+    Args:
+      data: The code of the hook to execute.
+      context: Basic Python context to execute the hook inside.
+      kwargs: Arbitrary arguments to pass to the hook script.
+
+    Raises:
+      HookError: When the hooks failed for any reason.
+    """
+    # Exec, storing global context in the context dict.  We catch exceptions
+    # and convert to a HookError w/ just the failing traceback.
+    try:
+      exec(compile(data, self._script_fullpath, 'exec'), context)
+    except Exception:
+      raise HookError('%s\nFailed to import %s hook; see traceback above.' %
+                      (traceback.format_exc(), self._hook_type))
+
+    # Running the script should have defined a main() function.
+    if 'main' not in context:
+      raise HookError('Missing main() in: "%s"' % self._script_fullpath)
+
+    # Call the main function in the hook.  If the hook should cause the
+    # build to fail, it will raise an Exception.  We'll catch that convert
+    # to a HookError w/ just the failing traceback.
+    try:
+      context['main'](**kwargs)
+    except Exception:
+      raise HookError('%s\nFailed to run main() for %s hook; see traceback '
+                      'above.' % (traceback.format_exc(), self._hook_type))
+
+  def _ExecuteHook(self, **kwargs):
+    """Actually execute the given hook.
+
+    This will run the hook's 'main' function in our python interpreter.
+
+    Args:
+      kwargs: Keyword arguments to pass to the hook.  These are often specific
+          to the hook type.  For instance, pre-upload hooks will contain
+          a project_list.
+    """
+    # Keep sys.path and CWD stashed away so that we can always restore them
+    # upon function exit.
+    orig_path = os.getcwd()
+    orig_syspath = sys.path
+
+    try:
+      # Always run hooks with CWD as topdir.
+      os.chdir(self._repo_topdir)
+
+      # Put the hook dir as the first item of sys.path so hooks can do
+      # relative imports.  We want to replace the repo dir as [0] so
+      # hooks can't import repo files.
+      sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
+
+      # Initial global context for the hook to run within.
+      context = {'__file__': self._script_fullpath}
+
+      # Add 'hook_should_take_kwargs' to the arguments to be passed to main.
+      # We don't actually want hooks to define their main with this argument--
+      # it's there to remind them that their hook should always take **kwargs.
+      # For instance, a pre-upload hook should be defined like:
+      #   def main(project_list, **kwargs):
+      #
+      # This allows us to later expand the API without breaking old hooks.
+      kwargs = kwargs.copy()
+      kwargs['hook_should_take_kwargs'] = True
+
+      # See what version of python the hook has been written against.
+      data = open(self._script_fullpath).read()
+      interp = self._ExtractInterpFromShebang(data)
+      reexec = False
+      if interp:
+        prog = os.path.basename(interp)
+        if prog.startswith('python2') and sys.version_info.major != 2:
+          reexec = True
+        elif prog.startswith('python3') and sys.version_info.major == 2:
+          reexec = True
+
+      # Attempt to execute the hooks through the requested version of Python.
+      if reexec:
+        try:
+          self._ExecuteHookViaReexec(interp, context, **kwargs)
+        except OSError as e:
+          if e.errno == errno.ENOENT:
+            # We couldn't find the interpreter, so fallback to importing.
+            reexec = False
+          else:
+            raise
+
+      # Run the hook by importing directly.
+      if not reexec:
+        self._ExecuteHookViaImport(data, context, **kwargs)
+    finally:
+      # Restore sys.path and CWD.
+      sys.path = orig_syspath
+      os.chdir(orig_path)
+
+  def _CheckHook(self):
+    # Bail with a nice error if we can't find the hook.
+    if not os.path.isfile(self._script_fullpath):
+      raise HookError('Couldn\'t find repo hook: %s' % self._script_fullpath)
+
+  def Run(self, **kwargs):
+    """Run the hook.
+
+    If the hook doesn't exist (because there is no hooks project or because
+    this particular hook is not enabled), this is a no-op.
+
+    Args:
+      kwargs: Keyword arguments to pass to the hook.  These are often specific
+          to the hook type.  For instance, pre-upload hooks will contain
+          a project_list.
+
+    Returns:
+      True: On success, or when hook failures are ignored by user request.
+      False: The hook failed; the caller should abort the action.
+        Some examples in which False is returned:
+        * Finding the hook failed while it was enabled, or
+        * the user declined to run a required hook (from _CheckForHookApproval)
+        In all these cases the user did not pass the proper arguments to
+        ignore the result through the option combinations as listed in
+        AddHookOptionGroup().
+    """
+    # Do not do anything in case bypass_hooks is set, or
+    # no-op if there is no hooks project or if hook is disabled.
+    if (self._bypass_hooks or
+        not self._hooks_project or
+        self._hook_type not in self._hooks_project.enabled_repo_hooks):
+      return True
+
+    passed = True
+    try:
+      self._CheckHook()
+
+      # Make sure the user is OK with running the hook.
+      if self._allow_all_hooks or self._CheckForHookApproval():
+        # Run the hook with the same version of python we're using.
+        self._ExecuteHook(**kwargs)
+    except SystemExit as e:
+      passed = False
+      print('ERROR: %s hooks exited with exit code: %s' % (self._hook_type, str(e)),
+            file=sys.stderr)
+    except HookError as e:
+      passed = False
+      print('ERROR: %s' % str(e), file=sys.stderr)
+
+    if not passed and self._ignore_hooks:
+      print('\nWARNING: %s hooks failed, but continuing anyway.' % self._hook_type,
+            file=sys.stderr)
+      passed = True
+
+    return passed
+
+  @classmethod
+  def FromSubcmd(cls, manifest, opt, *args, **kwargs):
+    """Method to construct the repo hook class
+
+    Args:
+      manifest: The current active manifest for this command from which we
+          extract a couple of fields.
+      opt: Contains the commandline options for the action of this hook.
+          It should contain the options added by AddHookOptionGroup() in which
+          we are interested in RepoHook execution.
+    """
+    for key in ('bypass_hooks', 'allow_all_hooks', 'ignore_hooks'):
+      kwargs.setdefault(key, getattr(opt, key))
+    kwargs.update({
+        'hooks_project': manifest.repo_hooks_project,
+        'repo_topdir': manifest.topdir,
+        'manifest_url': manifest.manifestProject.GetRemote('origin').url,
+    })
+    return cls(*args, **kwargs)
+
+  @staticmethod
+  def AddOptionGroup(parser, name):
+    """Help options relating to the various hooks."""
+
+    # Note that verify and no-verify are NOT opposites of each other, which
+    # is why they store to different locations. We are using them to match
+    # 'git commit' syntax.
+    group = parser.add_option_group(name + ' hooks')
+    group.add_option('--no-verify',
+                     dest='bypass_hooks', action='store_true',
+                     help='Do not run the %s hook.' % name)
+    group.add_option('--verify',
+                     dest='allow_all_hooks', action='store_true',
+                     help='Run the %s hook without prompting.' % name)
+    group.add_option('--ignore-hooks',
+                     action='store_true',
+                     help='Do not abort if %s hooks fail.' % name)
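The shebang handling in `_ExtractInterpFromShebang` above can be exercised standalone. This sketch reuses the same regex to show the `env` indirection being resolved (the function name is a stand-in, not part of the change):

```python
import os
import re

def extract_interp(data):
    # Minimal re-implementation of the shebang parsing above, using the
    # same regex: capture the interpreter, and an optional first argument
    # that matters only when the interpreter is `env`.
    firstline = data.splitlines()[:1]
    if not firstline:
        return None
    m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', firstline[0].strip())
    if not m:
        return None
    interp = m.group(1)
    if os.path.basename(interp) == 'env':
        interp = m.group(2)
    return interp

print(extract_interp('#!/usr/bin/env python3\nprint("hi")'))  # python3
print(extract_interp('#!/usr/bin/python2.7\n'))  # /usr/bin/python2.7
```

A script with no shebang (or an empty file) yields `None`, which `_ExecuteHook` treats as "run in the current interpreter".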
diff --git a/hooks/commit-msg b/hooks/commit-msg
index d9bb188..70d67ea 100755
--- a/hooks/commit-msg
+++ b/hooks/commit-msg
@@ -1,5 +1,5 @@
 #!/bin/sh
-# From Gerrit Code Review 2.14.6
+# From Gerrit Code Review 3.1.3
 #
 # Part of Gerrit Code Review (https://www.gerritcodereview.com/)
 #
@@ -16,176 +16,48 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-#
 
-unset GREP_OPTIONS
+# avoid [[ which is not POSIX sh.
+if test "$#" != 1 ; then
+  echo "$0 requires an argument."
+  exit 1
+fi
 
-CHANGE_ID_AFTER="Bug|Depends-On|Issue|Test|Feature|Fixes|Fixed"
-MSG="$1"
+if test ! -f "$1" ; then
+  echo "file does not exist: $1"
+  exit 1
+fi
 
-# Check for, and add if missing, a unique Change-Id
-#
-add_ChangeId() {
-	clean_message=`sed -e '
-		/^diff --git .*/{
-			s///
-			q
-		}
-		/^Signed-off-by:/d
-		/^#/d
-	' "$MSG" | git stripspace`
-	if test -z "$clean_message"
-	then
-		return
-	fi
+# Do not create a change id if requested
+if test "false" = "`git config --bool --get gerrit.createChangeId`" ; then
+  exit 0
+fi
 
-	# Do not add Change-Id to temp commits
-	if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!'
-	then
-		return
-	fi
+# $RANDOM will be undefined if not using bash, so don't use set -u
+random=$( (whoami ; hostname ; date; cat $1 ; echo $RANDOM) | git hash-object --stdin)
+dest="$1.tmp.${random}"
 
-	if test "false" = "`git config --bool --get gerrit.createChangeId`"
-	then
-		return
-	fi
+trap 'rm -f "${dest}"' EXIT
 
-	# Does Change-Id: already exist? if so, exit (no change).
-	if grep -i '^Change-Id:' "$MSG" >/dev/null
-	then
-		return
-	fi
+if ! git stripspace --strip-comments < "$1" > "${dest}" ; then
+   echo "cannot strip comments from $1"
+   exit 1
+fi
 
-	id=`_gen_ChangeId`
-	T="$MSG.tmp.$$"
-	AWK=awk
-	if [ -x /usr/xpg4/bin/awk ]; then
-		# Solaris AWK is just too broken
-		AWK=/usr/xpg4/bin/awk
-	fi
+if test ! -s "${dest}" ; then
+  echo "file is empty: $1"
+  exit 1
+fi
 
-	# Get core.commentChar from git config or use default symbol
-	commentChar=`git config --get core.commentChar`
-	commentChar=${commentChar:-#}
+# Avoid the --in-place option which only appeared in Git 2.8
+# Avoid the --if-exists option which only appeared in Git 2.15
+if ! git -c trailer.ifexists=doNothing interpret-trailers \
+      --trailer "Change-Id: I${random}" < "$1" > "${dest}" ; then
+  echo "cannot insert change-id line in $1"
+  exit 1
+fi
 
-	# How this works:
-	# - parse the commit message as (textLine+ blankLine*)*
-	# - assume textLine+ to be a footer until proven otherwise
-	# - exception: the first block is not footer (as it is the title)
-	# - read textLine+ into a variable
-	# - then count blankLines
-	# - once the next textLine appears, print textLine+ blankLine* as these
-	#   aren't footer
-	# - in END, the last textLine+ block is available for footer parsing
-	$AWK '
-	BEGIN {
-		# while we start with the assumption that textLine+
-		# is a footer, the first block is not.
-		isFooter = 0
-		footerComment = 0
-		blankLines = 0
-	}
-
-	# Skip lines starting with commentChar without any spaces before it.
-	/^'"$commentChar"'/ { next }
-
-	# Skip the line starting with the diff command and everything after it,
-	# up to the end of the file, assuming it is only patch data.
-	# If more than one line before the diff was empty, strip all but one.
-	/^diff --git / {
-		blankLines = 0
-		while (getline) { }
-		next
-	}
-
-	# Count blank lines outside footer comments
-	/^$/ && (footerComment == 0) {
-		blankLines++
-		next
-	}
-
-	# Catch footer comment
-	/^\[[a-zA-Z0-9-]+:/ && (isFooter == 1) {
-		footerComment = 1
-	}
-
-	/]$/ && (footerComment == 1) {
-		footerComment = 2
-	}
-
-	# We have a non-blank line after blank lines. Handle this.
-	(blankLines > 0) {
-		print lines
-		for (i = 0; i < blankLines; i++) {
-			print ""
-		}
-
-		lines = ""
-		blankLines = 0
-		isFooter = 1
-		footerComment = 0
-	}
-
-	# Detect that the current block is not the footer
-	(footerComment == 0) && (!/^\[?[a-zA-Z0-9-]+:/ || /^[a-zA-Z0-9-]+:\/\//) {
-		isFooter = 0
-	}
-
-	{
-		# We need this information about the current last comment line
-		if (footerComment == 2) {
-			footerComment = 0
-		}
-		if (lines != "") {
-			lines = lines "\n";
-		}
-		lines = lines $0
-	}
-
-	# Footer handling:
-	# If the last block is considered a footer, splice in the Change-Id at the
-	# right place.
-	# Look for the right place to inject Change-Id by considering
-	# CHANGE_ID_AFTER. Keys listed in it (case insensitive) come first,
-	# then Change-Id, then everything else (eg. Signed-off-by:).
-	#
-	# Otherwise just print the last block, a new line and the Change-Id as a
-	# block of its own.
-	END {
-		unprinted = 1
-		if (isFooter == 0) {
-			print lines "\n"
-			lines = ""
-		}
-		changeIdAfter = "^(" tolower("'"$CHANGE_ID_AFTER"'") "):"
-		numlines = split(lines, footer, "\n")
-		for (line = 1; line <= numlines; line++) {
-			if (unprinted && match(tolower(footer[line]), changeIdAfter) != 1) {
-				unprinted = 0
-				print "Change-Id: I'"$id"'"
-			}
-			print footer[line]
-		}
-		if (unprinted) {
-			print "Change-Id: I'"$id"'"
-		}
-	}' "$MSG" > "$T" && mv "$T" "$MSG" || rm -f "$T"
-}
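The END block of the deleted awk script splices the Change-Id into the footer after any keys listed in CHANGE_ID_AFTER. A minimal Python sketch of that ordering rule (the default key list here is illustrative, not the hook's actual CHANGE_ID_AFTER value):

```python
def splice_change_id(footer_lines, change_id, after_keys=('bug', 'issue')):
    """Insert 'Change-Id: <id>' after any leading after_keys footers,
    before everything else (e.g. Signed-off-by:)."""
    out, inserted = [], False
    for line in footer_lines:
        key = line.split(':', 1)[0].lower()
        if not inserted and key not in after_keys:
            out.append('Change-Id: %s' % change_id)
            inserted = True
        out.append(line)
    if not inserted:
        out.append('Change-Id: %s' % change_id)
    return out

print(splice_change_id(['Bug: 123', 'Signed-off-by: a@b'], 'I123'))
# ['Bug: 123', 'Change-Id: I123', 'Signed-off-by: a@b']
```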
-_gen_ChangeIdInput() {
-	echo "tree `git write-tree`"
-	if parent=`git rev-parse "HEAD^0" 2>/dev/null`
-	then
-		echo "parent $parent"
-	fi
-	echo "author `git var GIT_AUTHOR_IDENT`"
-	echo "committer `git var GIT_COMMITTER_IDENT`"
-	echo
-	printf '%s' "$clean_message"
-}
-_gen_ChangeId() {
-	_gen_ChangeIdInput |
-	git hash-object -t commit --stdin
-}
-
-
-add_ChangeId
+if ! mv "${dest}" "$1" ; then
+  echo "cannot mv ${dest} to $1"
+  exit 1
+fi
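The removed `_gen_ChangeId` helper derived the Change-Id by piping a synthetic commit object through `git hash-object -t commit --stdin`. That command is SHA-1 over the body prefixed with a `commit <size>\0` header; a standalone sketch of the same computation (the tree SHA below is git's well-known empty tree, used only as sample input):

```python
import hashlib

def git_commit_hash(body: bytes) -> str:
    """Hash bytes the way `git hash-object -t commit --stdin` does:
    sha1 of b'commit <len>\\0' + body."""
    header = b'commit %d\x00' % len(body)
    return hashlib.sha1(header + body).hexdigest()

body = b'tree 4b825dc642cb6eb9a060e54bf8d69288fbee4904\n\nexample message\n'
change_id = 'I' + git_commit_hash(body)
print(change_id)
```

Because the input includes the tree, parent, author, and committer idents, the resulting Change-Id is effectively unique per commit attempt.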
diff --git a/main.py b/main.py
index 16db144..2050cab 100755
--- a/main.py
+++ b/main.py
@@ -1,5 +1,4 @@
-#!/usr/bin/env python
-# -*- coding:utf-8 -*-
+#!/usr/bin/env python3
 #
 # Copyright (C) 2008 The Android Open Source Project
 #
@@ -21,23 +20,15 @@
 which takes care of execing this entry point.
 """
 
-from __future__ import print_function
 import getpass
 import netrc
 import optparse
 import os
+import shlex
 import sys
 import textwrap
 import time
-
-from pyversion import is_python3
-if is_python3():
-  import urllib.request
-else:
-  import imp
-  import urllib2
-  urllib = imp.new_module('urllib')
-  urllib.request = urllib2
+import urllib.request
 
 try:
   import kerberos
@@ -47,8 +38,9 @@
 from color import SetDefaultColoring
 import event_log
 from repo_trace import SetTrace
-from git_command import git, GitCommand, user_agent
-from git_config import init_ssh, close_ssh
+from git_command import user_agent
+from git_config import RepoConfig
+from git_trace2_event_log import EventLog
 from command import InteractiveCommand
 from command import MirrorSafeCommand
 from command import GitcAvailableCommand, GitcClientCommand
@@ -62,25 +54,54 @@
 from error import NoSuchProjectError
 from error import RepoChangedException
 import gitc_utils
-from manifest_xml import GitcManifest, XmlManifest
+from manifest_xml import GitcClient, RepoClient
 from pager import RunPager, TerminatePager
 from wrapper import WrapperPath, Wrapper
 
 from subcmds import all_commands
 
-if not is_python3():
-  input = raw_input
+
+# NB: These do not need to be kept in sync with the repo launcher script.
+# These may be much newer as it allows the repo launcher to roll between
+# different repo releases while source versions might require a newer python.
+#
+# The soft version is when we start warning users that the version is old and
+# we'll be dropping support for it.  We'll refuse to work with versions older
+# than the hard version.
+#
+# python-3.6 is in Ubuntu Bionic.
+MIN_PYTHON_VERSION_SOFT = (3, 6)
+MIN_PYTHON_VERSION_HARD = (3, 6)
+
+if sys.version_info.major < 3:
+  print('repo: error: Python 2 is no longer supported; '
+        'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
+        file=sys.stderr)
+  sys.exit(1)
+else:
+  if sys.version_info < MIN_PYTHON_VERSION_HARD:
+    print('repo: error: Python 3 version is too old; '
+          'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
+          file=sys.stderr)
+    sys.exit(1)
+  elif sys.version_info < MIN_PYTHON_VERSION_SOFT:
+    print('repo: warning: your Python 3 version is no longer supported; '
+          'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_SOFT),
+          file=sys.stderr)
+
 
 global_options = optparse.OptionParser(
     usage='repo [-p|--paginate|--no-pager] COMMAND [ARGS]',
     add_help_option=False)
 global_options.add_option('-h', '--help', action='store_true',
                           help='show this help message and exit')
+global_options.add_option('--help-all', action='store_true',
+                          help='show this help message with all subcommands and exit')
 global_options.add_option('-p', '--paginate',
                           dest='pager', action='store_true',
                           help='display command output in the pager')
 global_options.add_option('--no-pager',
-                          dest='no_pager', action='store_true',
+                          dest='pager', action='store_false',
                           help='disable the pager')
 global_options.add_option('--color',
                           choices=('auto', 'always', 'never'), default=None,
@@ -97,76 +118,125 @@
 global_options.add_option('--version',
                           dest='show_version', action='store_true',
                           help='display this version of repo')
+global_options.add_option('--show-toplevel',
+                          action='store_true',
+                          help='display the path of the top-level directory of '
+                               'the repo client checkout')
 global_options.add_option('--event-log',
                           dest='event_log', action='store',
                           help='filename of event log to append timeline to')
+global_options.add_option('--git-trace2-event-log', action='store',
+                          help='directory to write git trace2 event log to')
+
 
 class _Repo(object):
   def __init__(self, repodir):
     self.repodir = repodir
     self.commands = all_commands
-    # add 'branch' as an alias for 'branches'
-    all_commands['branch'] = all_commands['branches']
+
+  def _PrintHelp(self, short: bool = False, all_commands: bool = False):
+    """Show --help screen."""
+    global_options.print_help()
+    print()
+    if short:
+      commands = ' '.join(sorted(self.commands))
+      wrapped_commands = textwrap.wrap(commands, width=77)
+      print('Available commands:\n  %s' % ('\n  '.join(wrapped_commands),))
+      print('\nRun `repo help <command>` for command-specific details.')
+      print('Bug reports:', Wrapper().BUG_URL)
+    else:
+      cmd = self.commands['help']()
+      if all_commands:
+        cmd.PrintAllCommandsBody()
+      else:
+        cmd.PrintCommonCommandsBody()
 
   def _ParseArgs(self, argv):
     """Parse the main `repo` command line options."""
-    name = None
-    glob = []
-
-    for i in range(len(argv)):
-      if not argv[i].startswith('-'):
-        name = argv[i]
-        if i > 0:
-          glob = argv[:i]
+    for i, arg in enumerate(argv):
+      if not arg.startswith('-'):
+        name = arg
+        glob = argv[:i]
         argv = argv[i + 1:]
         break
-    if not name:
+    else:
+      name = None
       glob = argv
-      name = 'help'
       argv = []
     gopts, _gargs = global_options.parse_args(glob)
 
-    if gopts.help:
-      global_options.print_help()
-      commands = ' '.join(sorted(self.commands))
-      wrapped_commands = textwrap.wrap(commands, width=77)
-      print('\nAvailable commands:\n  %s' % ('\n  '.join(wrapped_commands),))
-      print('\nRun `repo help <command>` for command-specific details.')
-      global_options.exit()
+    if name:
+      name, alias_args = self._ExpandAlias(name)
+      argv = alias_args + argv
 
     return (name, gopts, argv)
 
+  def _ExpandAlias(self, name):
+    """Look up user registered aliases."""
+    # We don't resolve aliases for existing subcommands.  This matches git.
+    if name in self.commands:
+      return name, []
+
+    key = 'alias.%s' % (name,)
+    alias = RepoConfig.ForRepository(self.repodir).GetString(key)
+    if alias is None:
+      alias = RepoConfig.ForUser().GetString(key)
+    if alias is None:
+      return name, []
+
+    args = alias.strip().split(' ', 1)
+    name = args[0]
+    if len(args) == 2:
+      args = shlex.split(args[1])
+    else:
+      args = []
+    return name, args
+
   def _Run(self, name, gopts, argv):
     """Execute the requested subcommand."""
     result = 0
 
     if gopts.trace:
       SetTrace()
-    if gopts.show_version:
-      if name == 'help':
-        name = 'version'
-      else:
-        print('fatal: invalid usage of --version', file=sys.stderr)
-        return 1
+
+    # Handle options that terminate quickly first.
+    if gopts.help or gopts.help_all:
+      self._PrintHelp(short=False, all_commands=gopts.help_all)
+      return 0
+    elif gopts.show_version:
+      # Always allow global --version regardless of subcommand validity.
+      name = 'version'
+    elif gopts.show_toplevel:
+      print(os.path.dirname(self.repodir))
+      return 0
+    elif not name:
+      # No subcommand specified, so show the help/subcommand.
+      self._PrintHelp(short=True)
+      return 1
 
     SetDefaultColoring(gopts.color)
 
+    git_trace2_event_log = EventLog()
+    repo_client = RepoClient(self.repodir)
+    gitc_manifest = None
+    gitc_client_name = gitc_utils.parse_clientdir(os.getcwd())
+    if gitc_client_name:
+      gitc_manifest = GitcClient(self.repodir, gitc_client_name)
+      repo_client.isGitcClient = True
+
     try:
-      cmd = self.commands[name]
+      cmd = self.commands[name](
+          repodir=self.repodir,
+          client=repo_client,
+          manifest=repo_client.manifest,
+          gitc_manifest=gitc_manifest,
+          git_event_log=git_trace2_event_log)
     except KeyError:
       print("repo: '%s' is not a repo command.  See 'repo help'." % name,
             file=sys.stderr)
       return 1
 
-    cmd.repodir = self.repodir
-    cmd.manifest = XmlManifest(cmd.repodir)
-    cmd.gitc_manifest = None
-    gitc_client_name = gitc_utils.parse_clientdir(os.getcwd())
-    if gitc_client_name:
-      cmd.gitc_manifest = GitcManifest(cmd.repodir, gitc_client_name)
-      cmd.manifest.isGitcClient = True
-
-    Editor.globalConfig = cmd.manifest.globalConfig
+    Editor.globalConfig = cmd.client.globalConfig
 
     if not isinstance(cmd, MirrorSafeCommand) and cmd.manifest.IsMirror:
       print("fatal: '%s' requires a working directory" % name,
@@ -188,13 +258,13 @@
       copts = cmd.ReadEnvironmentOptions(copts)
     except NoManifestException as e:
       print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
-        file=sys.stderr)
+            file=sys.stderr)
       print('error: manifest missing or unreadable -- please run init',
             file=sys.stderr)
       return 1
 
-    if not gopts.no_pager and not isinstance(cmd, InteractiveCommand):
-      config = cmd.manifest.globalConfig
+    if gopts.pager is not False and not isinstance(cmd, InteractiveCommand):
+      config = cmd.client.globalConfig
       if gopts.pager:
         use_pager = True
       else:
@@ -207,13 +277,17 @@
     start = time.time()
     cmd_event = cmd.event_log.Add(name, event_log.TASK_COMMAND, start)
     cmd.event_log.SetParent(cmd_event)
+    git_trace2_event_log.StartEvent()
+    git_trace2_event_log.CommandEvent(name='repo', subcommands=[name])
+
     try:
+      cmd.CommonValidateOptions(copts, cargs)
       cmd.ValidateOptions(copts, cargs)
       result = cmd.Execute(copts, cargs)
     except (DownloadError, ManifestInvalidRevisionError,
-        NoManifestException) as e:
+            NoManifestException) as e:
       print('error: in `%s`: %s' % (' '.join([name] + argv), str(e)),
-        file=sys.stderr)
+            file=sys.stderr)
       if isinstance(e, NoManifestException):
         print('error: manifest missing or unreadable -- please run init',
               file=sys.stderr)
@@ -228,7 +302,8 @@
       if e.name:
         print('error: project group must be enabled for project %s' % e.name, file=sys.stderr)
       else:
-        print('error: project group must be enabled for the project in the current directory', file=sys.stderr)
+        print('error: project group must be enabled for the project in the current directory',
+              file=sys.stderr)
       result = 1
     except SystemExit as e:
       if e.code:
@@ -248,49 +323,78 @@
 
       cmd.event_log.FinishEvent(cmd_event, finish,
                                 result is None or result == 0)
+      git_trace2_event_log.DefParamRepoEvents(
+          cmd.manifest.manifestProject.config.DumpConfigDict())
+      git_trace2_event_log.ExitEvent(result)
+
       if gopts.event_log:
         cmd.event_log.Write(os.path.abspath(
                             os.path.expanduser(gopts.event_log)))
 
+      git_trace2_event_log.Write(gopts.git_trace2_event_log)
     return result
 
 
-def _CheckWrapperVersion(ver, repo_path):
+def _CheckWrapperVersion(ver_str, repo_path):
+  """Verify the repo launcher is new enough for this checkout.
+
+  Args:
+    ver_str: The version string passed from the repo launcher when it ran us.
+    repo_path: The path to the repo launcher that loaded us.
+  """
+  # Refuse to work with really old wrapper versions.  We don't test these,
+  # so might as well require a somewhat recent sane version.
+  # v1.15 of the repo launcher was released in ~Mar 2012.
+  MIN_REPO_VERSION = (1, 15)
+  min_str = '.'.join(str(x) for x in MIN_REPO_VERSION)
+
   if not repo_path:
     repo_path = '~/bin/repo'
 
-  if not ver:
+  if not ver_str:
     print('no --wrapper-version argument', file=sys.stderr)
     sys.exit(1)
 
+  # Pull out the version of the repo launcher we know about to compare.
   exp = Wrapper().VERSION
-  ver = tuple(map(int, ver.split('.')))
-  if len(ver) == 1:
-    ver = (0, ver[0])
+  ver = tuple(map(int, ver_str.split('.')))
 
   exp_str = '.'.join(map(str, exp))
-  if exp[0] > ver[0] or ver < (0, 4):
+  if ver < MIN_REPO_VERSION:
     print("""
-!!! A new repo command (%5s) is available.    !!!
-!!! You must upgrade before you can continue:   !!!
+repo: error:
+!!! Your version of repo %s is too old.
+!!! We need at least version %s.
+!!! A new version of repo (%s) is available.
+!!! You must upgrade before you can continue:
 
     cp %s %s
-""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
+""" % (ver_str, min_str, exp_str, WrapperPath(), repo_path), file=sys.stderr)
     sys.exit(1)
 
   if exp > ver:
-    print("""
-... A new repo command (%5s) is available.
+    print('\n... A new version of repo (%s) is available.' % (exp_str,),
+          file=sys.stderr)
+    if os.access(repo_path, os.W_OK):
+      print("""\
 ... You should upgrade soon:
-
     cp %s %s
-""" % (exp_str, WrapperPath(), repo_path), file=sys.stderr)
+""" % (WrapperPath(), repo_path), file=sys.stderr)
+    else:
+      print("""\
+... New version is available at: %s
+... The launcher is run from: %s
+!!! The launcher is not writable.  Please talk to your sysadmin or distro
+!!! to get an update installed.
+""" % (WrapperPath(), repo_path), file=sys.stderr)
+
 
 def _CheckRepoDir(repo_dir):
   if not repo_dir:
     print('no --repo-dir argument', file=sys.stderr)
     sys.exit(1)
 
+
 def _PruneOptions(argv, opt):
   i = 0
   while i < len(argv):
@@ -306,6 +410,7 @@
       continue
     i += 1
 
+
 class _UserAgentHandler(urllib.request.BaseHandler):
   def http_request(self, req):
     req.add_header('User-Agent', user_agent.repo)
@@ -315,6 +420,7 @@
     req.add_header('User-Agent', user_agent.repo)
     return req
 
+
 def _AddPasswordFromUserInput(handler, msg, req):
   # If repo could not find auth info from netrc, try to get it from user input
   url = req.get_full_url()
@@ -328,22 +434,24 @@
       return
     handler.passwd.add_password(None, url, user, password)
 
+
 class _BasicAuthHandler(urllib.request.HTTPBasicAuthHandler):
   def http_error_401(self, req, fp, code, msg, headers):
     _AddPasswordFromUserInput(self, msg, req)
     return urllib.request.HTTPBasicAuthHandler.http_error_401(
-      self, req, fp, code, msg, headers)
+        self, req, fp, code, msg, headers)
 
   def http_error_auth_reqed(self, authreq, host, req, headers):
     try:
       old_add_header = req.add_header
+
       def _add_header(name, val):
         val = val.replace('\n', '')
         old_add_header(name, val)
       req.add_header = _add_header
       return urllib.request.AbstractBasicAuthHandler.http_error_auth_reqed(
-        self, authreq, host, req, headers)
-    except:
+          self, authreq, host, req, headers)
+    except Exception:
       reset = getattr(self, 'reset_retry_count', None)
       if reset is not None:
         reset()
@@ -351,22 +459,24 @@
         self.retried = 0
       raise
 
+
 class _DigestAuthHandler(urllib.request.HTTPDigestAuthHandler):
   def http_error_401(self, req, fp, code, msg, headers):
     _AddPasswordFromUserInput(self, msg, req)
     return urllib.request.HTTPDigestAuthHandler.http_error_401(
-      self, req, fp, code, msg, headers)
+        self, req, fp, code, msg, headers)
 
   def http_error_auth_reqed(self, auth_header, host, req, headers):
     try:
       old_add_header = req.add_header
+
       def _add_header(name, val):
         val = val.replace('\n', '')
         old_add_header(name, val)
       req.add_header = _add_header
       return urllib.request.AbstractDigestAuthHandler.http_error_auth_reqed(
-        self, auth_header, host, req, headers)
-    except:
+          self, auth_header, host, req, headers)
+    except Exception:
       reset = getattr(self, 'reset_retry_count', None)
       if reset is not None:
         reset()
@@ -374,6 +484,7 @@
         self.retried = 0
       raise
 
+
 class _KerberosAuthHandler(urllib.request.BaseHandler):
   def __init__(self):
     self.retried = 0
@@ -392,7 +503,7 @@
 
       if self.retried > 3:
         raise urllib.request.HTTPError(req.get_full_url(), 401,
-          "Negotiate auth failed", headers, None)
+                                       "Negotiate auth failed", headers, None)
       else:
         self.retried += 1
 
@@ -408,7 +519,7 @@
         return response
     except kerberos.GSSError:
       return None
-    except:
+    except Exception:
       self.reset_retry_count()
       raise
     finally:
@@ -454,6 +565,7 @@
       kerberos.authGSSClientClean(self.context)
       self.context = None
 
+
 def init_http():
   handlers = [_UserAgentHandler()]
 
@@ -462,7 +574,7 @@
     n = netrc.netrc()
     for host in n.hosts:
       p = n.hosts[host]
-      mgr.add_password(p[1], 'http://%s/'  % host, p[0], p[2])
+      mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
       mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2])
   except netrc.NetrcParseError:
     pass
@@ -481,6 +593,7 @@
     handlers.append(urllib.request.HTTPSHandler(debuglevel=1))
   urllib.request.install_opener(urllib.request.build_opener(*handlers))
 
+
 def _Main(argv):
   result = 0
 
@@ -502,20 +615,16 @@
 
   repo = _Repo(opt.repodir)
   try:
-    try:
-      init_ssh()
-      init_http()
-      name, gopts, argv = repo._ParseArgs(argv)
-      run = lambda: repo._Run(name, gopts, argv) or 0
-      if gopts.trace_python:
-        import trace
-        tracer = trace.Trace(count=False, trace=True, timing=True,
-                             ignoredirs=set(sys.path[1:]))
-        result = tracer.runfunc(run)
-      else:
-        result = run()
-    finally:
-      close_ssh()
+    init_http()
+    name, gopts, argv = repo._ParseArgs(argv)
+    run = lambda: repo._Run(name, gopts, argv) or 0
+    if gopts.trace_python:
+      import trace
+      tracer = trace.Trace(count=False, trace=True, timing=True,
+                           ignoredirs=set(sys.path[1:]))
+      result = tracer.runfunc(run)
+    else:
+      result = run()
   except KeyboardInterrupt:
     print('aborted by user', file=sys.stderr)
     result = 1
@@ -528,7 +637,7 @@
     argv = list(sys.argv)
     argv.extend(rce.extra_args)
     try:
-      os.execv(__file__, argv)
+      os.execv(sys.executable, [__file__] + argv)
     except OSError as e:
       print('fatal: cannot restart repo after upgrade', file=sys.stderr)
       print('fatal: %s' % e, file=sys.stderr)
@@ -537,5 +646,6 @@
   TerminatePager()
   sys.exit(result)
 
+
 if __name__ == '__main__':
   _Main(sys.argv[1:])
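The new `_ExpandAlias` in the main.py diff splits an alias value into a head token plus shlex-split arguments, and never resolves aliases for built-in subcommands (matching git). A standalone sketch with the config lookup replaced by a plain dict (the dict stands in for `RepoConfig`; real repo reads `alias.<name>` from git config):

```python
import shlex

def expand_alias(name, commands, aliases):
    """Expand a user alias into (subcommand, extra_args)."""
    # Built-in subcommands are never aliased, matching git's behavior.
    if name in commands:
        return name, []
    alias = aliases.get('alias.%s' % name)
    if alias is None:
        return name, []
    head, _, rest = alias.strip().partition(' ')
    return head, shlex.split(rest) if rest else []

print(expand_alias('st', {'sync'}, {'alias.st': 'status -j 4'}))
# ('status', ['-j', '4'])
```

Using `shlex.split` on the remainder means quoted arguments in the alias value survive as single arguments, which a bare `str.split` would break apart.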
diff --git a/man/repo-abandon.1 b/man/repo-abandon.1
new file mode 100644
index 0000000..b3c0422
--- /dev/null
+++ b/man/repo-abandon.1
@@ -0,0 +1,36 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo abandon" "Repo Manual"
+.SH NAME
+repo \- repo abandon - manual page for repo abandon
+.SH SYNOPSIS
+.B repo
+\fI\,abandon \/\fR[\fI\,--all | <branchname>\/\fR] [\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Permanently abandon a development branch
+.PP
+This subcommand permanently abandons a development branch by
+deleting it (and all its history) from your local repository.
+.PP
+It is equivalent to "git branch \fB\-D\fR <branchname>".
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-\-all\fR
+delete all branches in all projects
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help abandon` to view the detailed manual.
diff --git a/man/repo-branch.1 b/man/repo-branch.1
new file mode 100644
index 0000000..854ee98
--- /dev/null
+++ b/man/repo-branch.1
@@ -0,0 +1 @@
+.so man1/repo-branches.1
\ No newline at end of file
diff --git a/man/repo-branches.1 b/man/repo-branches.1
new file mode 100644
index 0000000..7fe0b02
--- /dev/null
+++ b/man/repo-branches.1
@@ -0,0 +1,59 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo branches" "Repo Manual"
+.SH NAME
+repo \- repo branches - manual page for repo branches
+.SH SYNOPSIS
+.B repo
+\fI\,branches \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+View current topic branches
+.PP
+Summarizes the currently available topic branches.
+.PP
+# Branch Display
+.PP
+The branch display output by this command is organized into four
+columns of information; for example:
+.TP
+*P nocolor
+| in repo
+.TP
+repo2
+|
+.PP
+The first column contains a * if the branch is the currently
+checked out branch in any of the specified projects, or a blank
+if no project has the branch checked out.
+.PP
+The second column contains either blank, p or P, depending upon
+the upload status of the branch.
+.IP
+(blank): branch not yet published by repo upload
+.IP
+P: all commits were published by repo upload
+p: only some commits were published by repo upload
+.PP
+The third column contains the branch name.
+.PP
+The fourth column (after the | separator) lists the projects that
+the branch appears in, or does not appear in.  If no project list
+is shown, then the branch appears in all projects.
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help branches` to view the detailed manual.
diff --git a/man/repo-checkout.1 b/man/repo-checkout.1
new file mode 100644
index 0000000..6dd3e6c
--- /dev/null
+++ b/man/repo-checkout.1
@@ -0,0 +1,36 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo checkout" "Repo Manual"
+.SH NAME
+repo \- repo checkout - manual page for repo checkout
+.SH SYNOPSIS
+.B repo
+\fI\,checkout <branchname> \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Checkout a branch for development
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help checkout` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo checkout' command checks out an existing branch that was previously
+created by 'repo start'.
+.PP
+The command is equivalent to:
+.IP
+repo forall [<project>...] \fB\-c\fR git checkout <branchname>
diff --git a/man/repo-cherry-pick.1 b/man/repo-cherry-pick.1
new file mode 100644
index 0000000..e7716c5
--- /dev/null
+++ b/man/repo-cherry-pick.1
@@ -0,0 +1,28 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo cherry-pick" "Repo Manual"
+.SH NAME
+repo \- repo cherry-pick - manual page for repo cherry-pick
+.SH SYNOPSIS
+.B repo
+\fI\,cherry-pick <sha1>\/\fR
+.SH DESCRIPTION
+Summary
+.PP
+Cherry\-pick a change.
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help cherry\-pick` to view the detailed manual.
+.SH DETAILS
+.PP
+\&'repo cherry\-pick' cherry\-picks a change from one branch to another. The change
+id will be updated, and a reference to the old change id will be added.
diff --git a/man/repo-diff.1 b/man/repo-diff.1
new file mode 100644
index 0000000..890f8d2
--- /dev/null
+++ b/man/repo-diff.1
@@ -0,0 +1,35 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo diff" "Repo Manual"
+.SH NAME
+repo \- repo diff - manual page for repo diff
+.SH SYNOPSIS
+.B repo
+\fI\,diff \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Show changes between commit and working tree
+.PP
+The \fB\-u\fR option causes 'repo diff' to generate diff output with file paths
+relative to the repository root, so the output can be applied
+to the Unix 'patch' command.
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-u\fR, \fB\-\-absolute\fR
+paths are relative to the repository root
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help diff` to view the detailed manual.
diff --git a/man/repo-diffmanifests.1 b/man/repo-diffmanifests.1
new file mode 100644
index 0000000..add50f1
--- /dev/null
+++ b/man/repo-diffmanifests.1
@@ -0,0 +1,61 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo diffmanifests" "Repo Manual"
+.SH NAME
+repo \- repo diffmanifests - manual page for repo diffmanifests
+.SH SYNOPSIS
+.B repo
+\fI\,diffmanifests manifest1.xml \/\fR[\fI\,manifest2.xml\/\fR] [\fI\,options\/\fR]
+.SH DESCRIPTION
+Summary
+.PP
+Manifest diff utility
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-\-raw\fR
+display raw diff
+.TP
+\fB\-\-no\-color\fR
+does not display the diff in color
+.TP
+\fB\-\-pretty\-format=\fR<FORMAT>
+print the log using a custom git pretty format string
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help diffmanifests` to view the detailed manual.
+.SH DETAILS
+.PP
+The repo diffmanifests command shows differences between the project revisions
+of manifest1 and manifest2. If manifest2 is not specified, the current
+manifest.xml will be used instead. Both absolute and relative paths may be used
+for manifests. Relative paths start from the project's ".repo/manifests" folder.
+.PP
+The \fB\-\-raw\fR option displays the diff in a way that facilitates parsing: the
+project pattern will be <status> <path> <revision from> [<revision to>] and the
+commit pattern will be <status> <onelined log>, with status values respectively:
+.IP
+A = Added project
+R = Removed project
+C = Changed project
+U = Project with unreachable revision(s) (revision(s) not found)
+.PP
+for project, and
+.IP
+A = Added commit
+R = Removed commit
+.PP
+for a commit.
+.PP
+Only changed projects may contain commits. Commit status lines always start with
+a space and are part of the last printed project. Unreachable revisions may occur
+if a project is not up to date or if repo has not been initialized with all the
+groups, in which case some projects won't be synced and their revisions won't be
+found.
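The `--raw` project pattern documented above (`<status> <path> <revision from> [<revision to>]`) is straightforward to consume with whitespace splitting. A minimal parsing sketch under that assumption (commit lines, which start with a space, are not handled here):

```python
def parse_raw_project_line(line):
    """Parse a `repo diffmanifests --raw` project line:
    <status> <path> <revision from> [<revision to>], status one of A/R/C/U."""
    parts = line.split()
    status, path = parts[0], parts[1]
    rev_from = parts[2] if len(parts) > 2 else None
    rev_to = parts[3] if len(parts) > 3 else None
    return status, path, rev_from, rev_to

print(parse_raw_project_line('C external/foo 1a2b3c 4d5e6f'))
# ('C', 'external/foo', '1a2b3c', '4d5e6f')
```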
diff --git a/man/repo-download.1 b/man/repo-download.1
new file mode 100644
index 0000000..cf7f767
--- /dev/null
+++ b/man/repo-download.1
@@ -0,0 +1,44 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo download" "Repo Manual"
+.SH NAME
+repo \- repo download - manual page for repo download
+.SH SYNOPSIS
+.B repo
+\fI\,download {\/\fR[\fI\,project\/\fR] \fI\,change\/\fR[\fI\,/patchset\/\fR]\fI\,}\/\fR...
+.SH DESCRIPTION
+Summary
+.PP
+Download and checkout a change
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-b\fR BRANCH, \fB\-\-branch\fR=\fI\,BRANCH\/\fR
+create a new branch first
+.TP
+\fB\-c\fR, \fB\-\-cherry\-pick\fR
+cherry\-pick instead of checkout
+.TP
+\fB\-x\fR, \fB\-\-record\-origin\fR
+pass \fB\-x\fR when cherry\-picking
+.TP
+\fB\-r\fR, \fB\-\-revert\fR
+revert instead of checkout
+.TP
+\fB\-f\fR, \fB\-\-ff\-only\fR
+force fast\-forward merge
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help download` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo download' command downloads a change from the review system and makes
+it available in your project's local working directory. If no project is
+specified, the current directory is used as the project.
diff --git a/man/repo-forall.1 b/man/repo-forall.1
new file mode 100644
index 0000000..eb2ad57
--- /dev/null
+++ b/man/repo-forall.1
@@ -0,0 +1,128 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo forall" "Repo Manual"
+.SH NAME
+repo \- repo forall - manual page for repo forall
+.SH SYNOPSIS
+.B repo
+\fI\,forall \/\fR[\fI\,<project>\/\fR...] \fI\,-c <command> \/\fR[\fI\,<arg>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Run a shell command in each project
+.PP
+repo forall \fB\-r\fR str1 [str2] ... \fB\-c\fR <command> [<arg>...]
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-r\fR, \fB\-\-regex\fR
+execute the command only on projects matching regex or
+wildcard expression
+.TP
+\fB\-i\fR, \fB\-\-inverse\-regex\fR
+execute the command only on projects not matching
+regex or wildcard expression
+.TP
+\fB\-g\fR GROUPS, \fB\-\-groups\fR=\fI\,GROUPS\/\fR
+execute the command only on projects matching the
+specified groups
+.TP
+\fB\-c\fR, \fB\-\-command\fR
+command (and arguments) to execute
+.TP
+\fB\-e\fR, \fB\-\-abort\-on\-errors\fR
+abort if a command exits unsuccessfully
+.TP
+\fB\-\-ignore\-missing\fR
+silently skip & do not exit non\-zero due to
+missing checkouts
+.TP
+\fB\-\-interactive\fR
+force interactive usage
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.TP
+\fB\-p\fR
+show project headers before output
+.PP
+Run `repo help forall` to view the detailed manual.
+.SH DETAILS
+.PP
+Executes the same shell command in each project.
+.PP
+The \fB\-r\fR option allows running the command only on projects matching regex or
+wildcard expression.
+.PP
+By default, projects are processed non\-interactively in parallel. If you want to
+run interactive commands, make sure to pass \fB\-\-interactive\fR to force \fB\-\-jobs\fR 1.
+While the processing order of projects is not guaranteed, the order of project
+output is stable.
+.PP
+Output Formatting
+.PP
+The \fB\-p\fR option causes 'repo forall' to bind pipes to the command's stdin, stdout
+and stderr streams, and pipe all output into a continuous stream that is
+displayed in a single pager session. Project headings are inserted before the
+output of each command is displayed. If the command produces no output in a
+project, no heading is displayed.
+.PP
+The formatting convention used by \fB\-p\fR is very suitable for some types of
+searching, e.g. `repo forall \fB\-p\fR \fB\-c\fR git log \fB\-SFoo\fR` will print all commits that
+add or remove references to Foo.
+.PP
+The \fB\-v\fR option causes 'repo forall' to display stderr messages if a command
+produces output only on stderr. Normally the \fB\-p\fR option causes command output to
+be suppressed until the command produces at least one byte of output on stdout.
+.PP
+Environment
+.PP
+pwd is the project's working directory. If the current client is a mirror
+client, then pwd is the Git repository.
+.PP
+REPO_PROJECT is set to the unique name of the project.
+.PP
+REPO_PATH is the path relative to the root of the client.
+.PP
+REPO_REMOTE is the name of the remote system from the manifest.
+.PP
+REPO_LREV is the name of the revision from the manifest, translated to a local
+tracking branch. If you need to pass the manifest revision to a locally executed
+git command, use REPO_LREV.
+.PP
+REPO_RREV is the name of the revision from the manifest, exactly as written in
+the manifest.
+.PP
+REPO_COUNT is the total number of projects being iterated.
+.PP
+REPO_I is the current (1\-based) iteration count. Can be used in conjunction with
+REPO_COUNT to add a simple progress indicator to your command.
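+.PP
+For example, a minimal progress indicator (illustrative only):
+.IP
+repo forall \fB\-c\fR 'echo "($REPO_I/$REPO_COUNT) $REPO_PROJECT"'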
+.PP
+REPO__* are any extra environment variables, specified by the "annotation"
+element under any project element. This can be useful for differentiating trees
+based on user\-specific criteria, or simply annotating tree details.
+.PP
+shell positional arguments ($1, $2, .., $#) are set to any arguments following
+<command>.
+.PP
+Example: to list projects:
+.IP
+repo forall \fB\-c\fR 'echo $REPO_PROJECT'
+.PP
+Notice that $REPO_PROJECT is quoted to ensure it is expanded in the context of
+running <command> instead of in the calling shell.
+.PP
+Unless \fB\-p\fR is used, stdin, stdout, stderr are inherited from the terminal and are
+not redirected.
+.PP
+If \fB\-e\fR is used, when a command exits unsuccessfully, 'repo forall' will abort
+without iterating through the remaining projects.
diff --git a/man/repo-gitc-delete.1 b/man/repo-gitc-delete.1
new file mode 100644
index 0000000..c84c6e4
--- /dev/null
+++ b/man/repo-gitc-delete.1
@@ -0,0 +1,31 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo gitc-delete" "Repo Manual"
+.SH NAME
+repo \- repo gitc-delete - manual page for repo gitc-delete
+.SH SYNOPSIS
+.B repo
+\fI\,gitc-delete\/\fR
+.SH DESCRIPTION
+Summary
+.PP
+Delete a GITC Client.
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-f\fR, \fB\-\-force\fR
+force the deletion (no prompt)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help gitc\-delete` to view the detailed manual.
+.SH DETAILS
+.PP
+This subcommand deletes the current GITC client, deleting the GITC manifest and
+all locally downloaded sources.
diff --git a/man/repo-gitc-init.1 b/man/repo-gitc-init.1
new file mode 100644
index 0000000..9b61866
--- /dev/null
+++ b/man/repo-gitc-init.1
@@ -0,0 +1,150 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "September 2021" "repo gitc-init" "Repo Manual"
+.SH NAME
+repo \- repo gitc-init - manual page for repo gitc-init
+.SH SYNOPSIS
+.B repo
+\fI\,gitc-init \/\fR[\fI\,options\/\fR] [\fI\,client name\/\fR]
+.SH DESCRIPTION
+Summary
+.PP
+Initialize a GITC Client.
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS Manifest options:
+.TP
+\fB\-u\fR URL, \fB\-\-manifest\-url\fR=\fI\,URL\/\fR
+manifest repository location
+.TP
+\fB\-b\fR REVISION, \fB\-\-manifest\-branch\fR=\fI\,REVISION\/\fR
+manifest branch or revision (use HEAD for default)
+.TP
+\fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
+initial manifest file
+.TP
+\fB\-\-standalone\-manifest\fR
+download the manifest as a static file rather than
+create a git checkout of the manifest repo
+.TP
+\fB\-g\fR GROUP, \fB\-\-groups\fR=\fI\,GROUP\/\fR
+restrict manifest projects to ones with specified
+group(s) [default|all|G1,G2,G3|G4,\-G5,\-G6]
+.TP
+\fB\-p\fR PLATFORM, \fB\-\-platform\fR=\fI\,PLATFORM\/\fR
+restrict manifest projects to ones with a specified
+platform group [auto|all|none|linux|darwin|...]
+.TP
+\fB\-\-submodules\fR
+sync any submodules associated with the manifest repo
+.SS Manifest (only) checkout options:
+.TP
+\fB\-\-current\-branch\fR
+fetch only current manifest branch from server
+.TP
+\fB\-\-no\-current\-branch\fR
+fetch all manifest branches from server
+.TP
+\fB\-\-tags\fR
+fetch tags in the manifest
+.TP
+\fB\-\-no\-tags\fR
+don't fetch tags in the manifest
+.SS Checkout modes:
+.TP
+\fB\-\-mirror\fR
+create a replica of the remote repositories rather
+than a client working directory
+.TP
+\fB\-\-archive\fR
+checkout an archive instead of a git repository for
+each project. See git archive.
+.TP
+\fB\-\-worktree\fR
+use git\-worktree to manage projects
+.SS Project checkout optimizations:
+.TP
+\fB\-\-reference\fR=\fI\,DIR\/\fR
+location of mirror directory
+.TP
+\fB\-\-dissociate\fR
+dissociate from reference mirrors after clone
+.TP
+\fB\-\-depth\fR=\fI\,DEPTH\/\fR
+create a shallow clone with given depth; see git clone
+.TP
+\fB\-\-partial\-clone\fR
+perform partial clone (https://git\-scm.com/docs/gitrepository\-layout#_code_partialclone_code)
+.TP
+\fB\-\-no\-partial\-clone\fR
+disable use of partial clone (https://git\-scm.com/docs/gitrepository\-layout#_code_partialclone_code)
+.TP
+\fB\-\-partial\-clone\-exclude\fR=\fI\,PARTIAL_CLONE_EXCLUDE\/\fR
+exclude the specified projects (a comma\-delimited list
+of project names) from partial clone (https://git\-scm.com/docs/gitrepository\-layout#_code_partialclone_code)
+.TP
+\fB\-\-clone\-filter\fR=\fI\,CLONE_FILTER\/\fR
+filter for use with \fB\-\-partial\-clone\fR [default:
+blob:none]
+.TP
+\fB\-\-use\-superproject\fR
+use the manifest superproject to sync projects
+.TP
+\fB\-\-no\-use\-superproject\fR
+disable use of manifest superprojects
+.TP
+\fB\-\-clone\-bundle\fR
+enable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
+not \fB\-\-partial\-clone\fR)
+.TP
+\fB\-\-no\-clone\-bundle\fR
+disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
+\fB\-\-partial\-clone\fR)
+.SS repo Version options:
+.TP
+\fB\-\-repo\-url\fR=\fI\,URL\/\fR
+repo repository location ($REPO_URL)
+.TP
+\fB\-\-repo\-rev\fR=\fI\,REV\/\fR
+repo branch or revision ($REPO_REV)
+.TP
+\fB\-\-no\-repo\-verify\fR
+do not verify repo source code
+.SS Other options:
+.TP
+\fB\-\-config\-name\fR
+Always prompt for name/e\-mail
+.SS GITC options:
+.TP
+\fB\-f\fR MANIFEST_FILE, \fB\-\-manifest\-file\fR=\fI\,MANIFEST_FILE\/\fR
+Optional manifest file to use for this GITC client.
+.TP
+\fB\-c\fR GITC_CLIENT, \fB\-\-gitc\-client\fR=\fI\,GITC_CLIENT\/\fR
+Name of the gitc_client instance to create or modify.
+.PP
+Run `repo help gitc\-init` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo gitc\-init' command is run to initialize a new GITC client for use with
+the GITC file system.
+.PP
+This command will set up the client directory, initialize repo just like 'repo
+init' does, and then download the manifest collection and install it in the
+\&.repo/ directory of the GITC client.
+.PP
+Once this is done, a GITC manifest is generated by pulling the HEAD SHA for each
+project, written out as a properly formatted XML file, and installed as
+\&.manifest in the GITC client directory.
+.PP
+The \fB\-c\fR argument is required to specify the GITC client name.
+.PP
+The optional \fB\-f\fR argument can be used to specify the manifest file to use for
+this GITC client.
diff --git a/man/repo-grep.1 b/man/repo-grep.1
new file mode 100644
index 0000000..be41058
--- /dev/null
+++ b/man/repo-grep.1
@@ -0,0 +1,119 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo grep" "Repo Manual"
+.SH NAME
+repo \- repo grep - manual page for repo grep
+.SH SYNOPSIS
+.B repo
+\fI\,grep {pattern | -e pattern} \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Print lines matching a pattern
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.SS Logging options:
+.TP
+\fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS Sources:
+.TP
+\fB\-\-cached\fR
+Search the index, instead of the work tree
+.TP
+\fB\-r\fR TREEish, \fB\-\-revision\fR=\fI\,TREEish\/\fR
+Search TREEish, instead of the work tree
+.SS Pattern:
+.TP
+\fB\-e\fR PATTERN
+Pattern to search for
+.TP
+\fB\-i\fR, \fB\-\-ignore\-case\fR
+Ignore case differences
+.TP
+\fB\-a\fR, \fB\-\-text\fR
+Process binary files as if they were text
+.TP
+\fB\-I\fR
+Don't match the pattern in binary files
+.TP
+\fB\-w\fR, \fB\-\-word\-regexp\fR
+Match the pattern only at word boundaries
+.TP
+\fB\-v\fR, \fB\-\-invert\-match\fR
+Select non\-matching lines
+.TP
+\fB\-G\fR, \fB\-\-basic\-regexp\fR
+Use POSIX basic regexp for patterns (default)
+.TP
+\fB\-E\fR, \fB\-\-extended\-regexp\fR
+Use POSIX extended regexp for patterns
+.TP
+\fB\-F\fR, \fB\-\-fixed\-strings\fR
+Use fixed strings (not regexp) for pattern
+.SS Pattern Grouping:
+.TP
+\fB\-\-all\-match\fR
+Limit match to lines that have all patterns
+.TP
+\fB\-\-and\fR, \fB\-\-or\fR, \fB\-\-not\fR
+Boolean operators to combine patterns
+.TP
+\-(, \-)
+Boolean operator grouping
+.SS Output:
+.TP
+\fB\-n\fR
+Prefix the line number to matching lines
+.TP
+\fB\-C\fR CONTEXT
+Show CONTEXT lines around match
+.TP
+\fB\-B\fR CONTEXT
+Show CONTEXT lines before match
+.TP
+\fB\-A\fR CONTEXT
+Show CONTEXT lines after match
+.TP
+\fB\-l\fR, \fB\-\-name\-only\fR, \fB\-\-files\-with\-matches\fR
+Show only file names containing matching lines
+.TP
+\fB\-L\fR, \fB\-\-files\-without\-match\fR
+Show only file names not containing matching lines
+.PP
+Run `repo help grep` to view the detailed manual.
+.SH DETAILS
+.PP
+Search for the specified patterns in all project files.
+.PP
+Boolean Options
+.PP
+The following options can appear as often as necessary to express the pattern to
+locate:
+.HP
+\fB\-e\fR PATTERN
+.HP
+\fB\-\-and\fR, \fB\-\-or\fR, \fB\-\-not\fR, \-(, \-)
+.PP
+Further, the \fB\-r\fR/\-\-revision option may be specified multiple times in order to
+scan multiple trees. If the same file matches in more than one tree, only the
+first result is reported, prefixed by the revision name it was found under.
+.PP
+Examples
+.PP
+Look for a line that has '#define' and either 'MAX_PATH' or 'PATH_MAX':
+.IP
+repo grep \fB\-e\fR '#define' \fB\-\-and\fR \-\e( \fB\-e\fR MAX_PATH \fB\-e\fR PATH_MAX \e)
+.PP
+Look for a line that has 'NODE' or 'Unexpected' in files that contain a line
+that matches both expressions:
+.IP
+repo grep \fB\-\-all\-match\fR \fB\-e\fR NODE \fB\-e\fR Unexpected
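+.PP
+Search multiple revisions by repeating \fB\-r\fR/\-\-revision (the tag names here
+are illustrative):
+.IP
+repo grep \fB\-r\fR v1.0 \fB\-r\fR v2.0 \fB\-e\fR PATTERN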
diff --git a/man/repo-help.1 b/man/repo-help.1
new file mode 100644
index 0000000..d6da3c5
--- /dev/null
+++ b/man/repo-help.1
@@ -0,0 +1,33 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo help" "Repo Manual"
+.SH NAME
+repo \- repo help - manual page for repo help
+.SH SYNOPSIS
+.B repo
+\fI\,help \/\fR[\fI\,--all|command\/\fR]
+.SH DESCRIPTION
+Summary
+.PP
+Display detailed help on a command
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-a\fR, \fB\-\-all\fR
+show the complete list of commands
+.TP
+\fB\-\-help\-all\fR
+show the \fB\-\-help\fR of all commands
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help help` to view the detailed manual.
+.SH DETAILS
+.PP
+Displays detailed usage information about a command.
diff --git a/man/repo-info.1 b/man/repo-info.1
new file mode 100644
index 0000000..cf7c17b
--- /dev/null
+++ b/man/repo-info.1
@@ -0,0 +1,40 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo info" "Repo Manual"
+.SH NAME
+repo \- repo info - manual page for repo info
+.SH SYNOPSIS
+.B repo
+\fI\,info \/\fR[\fI\,-dl\/\fR] [\fI\,-o \/\fR[\fI\,-c\/\fR]] [\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Get info on the manifest branch, current branch or unmerged branches
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-d\fR, \fB\-\-diff\fR
+show full info and commit diff including remote
+branches
+.TP
+\fB\-o\fR, \fB\-\-overview\fR
+show overview of all local commits
+.TP
+\fB\-c\fR, \fB\-\-current\-branch\fR
+consider only checked out branches
+.TP
+\fB\-\-no\-current\-branch\fR
+consider all local branches
+.TP
+\fB\-l\fR, \fB\-\-local\-only\fR
+disable all remote operations
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help info` to view the detailed manual.
diff --git a/man/repo-init.1 b/man/repo-init.1
new file mode 100644
index 0000000..9957b64
--- /dev/null
+++ b/man/repo-init.1
@@ -0,0 +1,170 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "September 2021" "repo init" "Repo Manual"
+.SH NAME
+repo \- repo init - manual page for repo init
+.SH SYNOPSIS
+.B repo
+\fI\,init \/\fR[\fI\,options\/\fR] [\fI\,manifest url\/\fR]
+.SH DESCRIPTION
+Summary
+.PP
+Initialize a repo client checkout in the current directory
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS Manifest options:
+.TP
+\fB\-u\fR URL, \fB\-\-manifest\-url\fR=\fI\,URL\/\fR
+manifest repository location
+.TP
+\fB\-b\fR REVISION, \fB\-\-manifest\-branch\fR=\fI\,REVISION\/\fR
+manifest branch or revision (use HEAD for default)
+.TP
+\fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
+initial manifest file
+.TP
+\fB\-\-standalone\-manifest\fR
+download the manifest as a static file rather than
+create a git checkout of the manifest repo
+.TP
+\fB\-g\fR GROUP, \fB\-\-groups\fR=\fI\,GROUP\/\fR
+restrict manifest projects to ones with specified
+group(s) [default|all|G1,G2,G3|G4,\-G5,\-G6]
+.TP
+\fB\-p\fR PLATFORM, \fB\-\-platform\fR=\fI\,PLATFORM\/\fR
+restrict manifest projects to ones with a specified
+platform group [auto|all|none|linux|darwin|...]
+.TP
+\fB\-\-submodules\fR
+sync any submodules associated with the manifest repo
+.SS Manifest (only) checkout options:
+.TP
+\fB\-c\fR, \fB\-\-current\-branch\fR
+fetch only current manifest branch from server
+.TP
+\fB\-\-no\-current\-branch\fR
+fetch all manifest branches from server
+.TP
+\fB\-\-tags\fR
+fetch tags in the manifest
+.TP
+\fB\-\-no\-tags\fR
+don't fetch tags in the manifest
+.SS Checkout modes:
+.TP
+\fB\-\-mirror\fR
+create a replica of the remote repositories rather
+than a client working directory
+.TP
+\fB\-\-archive\fR
+checkout an archive instead of a git repository for
+each project. See git archive.
+.TP
+\fB\-\-worktree\fR
+use git\-worktree to manage projects
+.SS Project checkout optimizations:
+.TP
+\fB\-\-reference\fR=\fI\,DIR\/\fR
+location of mirror directory
+.TP
+\fB\-\-dissociate\fR
+dissociate from reference mirrors after clone
+.TP
+\fB\-\-depth\fR=\fI\,DEPTH\/\fR
+create a shallow clone with given depth; see git clone
+.TP
+\fB\-\-partial\-clone\fR
+perform partial clone (https://git\-scm.com/docs/gitrepository\-layout#_code_partialclone_code)
+.TP
+\fB\-\-no\-partial\-clone\fR
+disable use of partial clone (https://git\-scm.com/docs/gitrepository\-layout#_code_partialclone_code)
+.TP
+\fB\-\-partial\-clone\-exclude\fR=\fI\,PARTIAL_CLONE_EXCLUDE\/\fR
+exclude the specified projects (a comma\-delimited list
+of project names) from partial clone (https://git\-scm.com/docs/gitrepository\-layout#_code_partialclone_code)
+.TP
+\fB\-\-clone\-filter\fR=\fI\,CLONE_FILTER\/\fR
+filter for use with \fB\-\-partial\-clone\fR [default:
+blob:none]
+.TP
+\fB\-\-use\-superproject\fR
+use the manifest superproject to sync projects
+.TP
+\fB\-\-no\-use\-superproject\fR
+disable use of manifest superprojects
+.TP
+\fB\-\-clone\-bundle\fR
+enable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
+not \fB\-\-partial\-clone\fR)
+.TP
+\fB\-\-no\-clone\-bundle\fR
+disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS (default if
+\fB\-\-partial\-clone\fR)
+.SS repo Version options:
+.TP
+\fB\-\-repo\-url\fR=\fI\,URL\/\fR
+repo repository location ($REPO_URL)
+.TP
+\fB\-\-repo\-rev\fR=\fI\,REV\/\fR
+repo branch or revision ($REPO_REV)
+.TP
+\fB\-\-no\-repo\-verify\fR
+do not verify repo source code
+.SS Other options:
+.TP
+\fB\-\-config\-name\fR
+Always prompt for name/e\-mail
+.PP
+Run `repo help init` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo init' command is run once to install and initialize repo. The latest
+repo source code and manifest collection are downloaded from the server and
+installed in the .repo/ directory in the current working directory.
+.PP
+When creating a new checkout, the manifest URL is the only required setting. It
+may be specified using the \fB\-\-manifest\-url\fR option, or as the first optional
+argument.
+.PP
+The optional \fB\-b\fR argument can be used to select the manifest branch to checkout
+and use. If no branch is specified, the remote's default branch is used. This is
+equivalent to using \fB\-b\fR HEAD.
+.PP
+The optional \fB\-m\fR argument can be used to specify an alternate manifest to be
+used. If no manifest is specified, the manifest default.xml will be used.
+.PP
+If the \fB\-\-standalone\-manifest\fR argument is set, the manifest will be downloaded
+directly from the specified \fB\-\-manifest\-url\fR as a static file (rather than setting
+up a manifest git checkout). With \fB\-\-standalone\-manifest\fR, the manifest will be
+fully static and will not be re\-downloaded during subsequent `repo init` and
+`repo sync` calls.
+.PP
+The \fB\-\-reference\fR option can be used to point to a directory that has the content
+of a \fB\-\-mirror\fR sync. This will make the working directory use as much data as
+possible from the local reference directory when fetching from the server. This
+will make the sync go a lot faster by reducing data traffic on the network.
+.PP
+The \fB\-\-dissociate\fR option can be used to borrow the objects from the directory
+specified with the \fB\-\-reference\fR option only to reduce network transfer, and stop
+borrowing from them after a first clone is made by making necessary local copies
+of borrowed objects.
+.PP
+The \fB\-\-no\-clone\-bundle\fR option disables any attempt to use \fI\,$URL/clone.bundle\/\fP to
+bootstrap a new Git repository from a resumable bundle file on a content
+delivery network. This may be necessary if there are problems with the local
+Python HTTP client or proxy configuration, but the Git binary works.
+.PP
+Switching Manifest Branches
+.PP
+To switch to another manifest branch, `repo init \fB\-b\fR otherbranch` may be used in
+an existing client. However, as this only updates the manifest, a subsequent
+`repo sync` (or `repo sync \fB\-d\fR`) is necessary to update the working directory
+files.
diff --git a/man/repo-list.1 b/man/repo-list.1
new file mode 100644
index 0000000..7f85e61
--- /dev/null
+++ b/man/repo-list.1
@@ -0,0 +1,61 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo list" "Repo Manual"
+.SH NAME
+repo \- repo list - manual page for repo list
+.SH SYNOPSIS
+.B repo
+\fI\,list \/\fR[\fI\,-f\/\fR] [\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+List projects and their associated directories
+.PP
+repo list [\-f] \fB\-r\fR str1 [str2]...
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-r\fR, \fB\-\-regex\fR
+filter the project list based on regex or wildcard
+matching of strings
+.TP
+\fB\-g\fR GROUPS, \fB\-\-groups\fR=\fI\,GROUPS\/\fR
+filter the project list based on the groups the
+project is in
+.TP
+\fB\-a\fR, \fB\-\-all\fR
+show projects regardless of checkout state
+.TP
+\fB\-n\fR, \fB\-\-name\-only\fR
+display only the name of the repository
+.TP
+\fB\-p\fR, \fB\-\-path\-only\fR
+display only the path of the repository
+.TP
+\fB\-f\fR, \fB\-\-fullpath\fR
+display the full work tree path instead of the
+relative path
+.TP
+\fB\-\-relative\-to\fR=\fI\,PATH\/\fR
+display paths relative to this one (default: top of
+repo client checkout)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help list` to view the detailed manual.
+.SH DETAILS
+.PP
+List all projects; pass '.' to list the project for the cwd.
+.PP
+By default, only projects that currently exist in the checkout are shown. If you
+want to list all projects (using the specified filter settings), use the \fB\-\-all\fR
+option. If you want to show all projects regardless of the manifest groups, then
+also pass \fB\-\-groups\fR all.
+.PP
+This is similar to running: repo forall \fB\-c\fR 'echo "$REPO_PATH : $REPO_PROJECT"'.
diff --git a/man/repo-manifest.1 b/man/repo-manifest.1
new file mode 100644
index 0000000..be46760
--- /dev/null
+++ b/man/repo-manifest.1
@@ -0,0 +1,548 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo manifest" "Repo Manual"
+.SH NAME
+repo \- repo manifest - manual page for repo manifest
+.SH SYNOPSIS
+.B repo
+\fI\,manifest \/\fR[\fI\,-o {-|NAME.xml}\/\fR] [\fI\,-m MANIFEST.xml\/\fR] [\fI\,-r\/\fR]
+.SH DESCRIPTION
+Summary
+.PP
+Manifest inspection utility
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-r\fR, \fB\-\-revision\-as\-HEAD\fR
+save revisions as current HEAD
+.TP
+\fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
+temporary manifest to use for this sync
+.TP
+\fB\-\-suppress\-upstream\-revision\fR
+if in \fB\-r\fR mode, do not write the upstream field (only
+of use if the branch names for a sha1 manifest are
+sensitive)
+.TP
+\fB\-\-suppress\-dest\-branch\fR
+if in \fB\-r\fR mode, do not write the dest\-branch field
+(only of use if the branch names for a sha1 manifest
+are sensitive)
+.TP
+\fB\-\-json\fR
+output manifest in JSON format (experimental)
+.TP
+\fB\-\-pretty\fR
+format output for humans to read
+.TP
+\fB\-\-no\-local\-manifests\fR
+ignore local manifests
+.TP
+\fB\-o\fR \-|NAME.xml, \fB\-\-output\-file\fR=\fI\,\-\/\fR|NAME.xml
+file to save the manifest to
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help manifest` to view the detailed manual.
+.SH DETAILS
+.PP
+With the \fB\-o\fR option, exports the current manifest for inspection. The manifest
+and (if present) local_manifests/ are combined together to produce a single
+manifest file. This file can be stored in a Git repository for use during future
+\&'repo init' invocations.
+.PP
+The \fB\-r\fR option can be used to generate a manifest file with project revisions set
+to the current commit hash. These are known as "revision locked manifests", as
+they don't follow a particular branch. In this case, the 'upstream' attribute is
+set to the ref we were on when the manifest was generated. The 'dest\-branch'
+attribute is set to indicate the remote ref to push changes to via 'repo
+upload'.
+.PP
+repo Manifest Format
+.PP
+A repo manifest describes the structure of a repo client; that is, the
+directories that are visible and where they should be obtained from with git.
+.PP
+The basic structure of a manifest is a bare Git repository holding a single
+`default.xml` XML file in the top level directory.
+.PP
+Manifests are inherently version controlled, since they are kept within a Git
+repository. Updates to manifests are automatically obtained by clients during
+`repo sync`.
+.PP
+XML File Format
+.PP
+A manifest XML file (e.g. `default.xml`) roughly conforms to the following DTD:
+.PP
+```xml <!DOCTYPE manifest [
+.TP
+<!ELEMENT manifest (notice?,
+remote*,
+default?,
+manifest\-server?,
+remove\-project*,
+project*,
+extend\-project*,
+repo\-hooks?,
+superproject?,
+contactinfo?,
+include*)>
+.IP
+<!ELEMENT notice (#PCDATA)>
+.IP
+<!ELEMENT remote (annotation*)>
+<!ATTLIST remote name         ID    #REQUIRED>
+<!ATTLIST remote alias        CDATA #IMPLIED>
+<!ATTLIST remote fetch        CDATA #REQUIRED>
+<!ATTLIST remote pushurl      CDATA #IMPLIED>
+<!ATTLIST remote review       CDATA #IMPLIED>
+<!ATTLIST remote revision     CDATA #IMPLIED>
+.IP
+<!ELEMENT default EMPTY>
+<!ATTLIST default remote      IDREF #IMPLIED>
+<!ATTLIST default revision    CDATA #IMPLIED>
+<!ATTLIST default dest\-branch CDATA #IMPLIED>
+<!ATTLIST default upstream    CDATA #IMPLIED>
+<!ATTLIST default sync\-j      CDATA #IMPLIED>
+<!ATTLIST default sync\-c      CDATA #IMPLIED>
+<!ATTLIST default sync\-s      CDATA #IMPLIED>
+<!ATTLIST default sync\-tags   CDATA #IMPLIED>
+.IP
+<!ELEMENT manifest\-server EMPTY>
+<!ATTLIST manifest\-server url CDATA #REQUIRED>
+.TP
+<!ELEMENT project (annotation*,
+project*,
+copyfile*,
+linkfile*)>
+.IP
+<!ATTLIST project name        CDATA #REQUIRED>
+<!ATTLIST project path        CDATA #IMPLIED>
+<!ATTLIST project remote      IDREF #IMPLIED>
+<!ATTLIST project revision    CDATA #IMPLIED>
+<!ATTLIST project dest\-branch CDATA #IMPLIED>
+<!ATTLIST project groups      CDATA #IMPLIED>
+<!ATTLIST project sync\-c      CDATA #IMPLIED>
+<!ATTLIST project sync\-s      CDATA #IMPLIED>
+<!ATTLIST project sync\-tags   CDATA #IMPLIED>
+<!ATTLIST project upstream CDATA #IMPLIED>
+<!ATTLIST project clone\-depth CDATA #IMPLIED>
+<!ATTLIST project force\-path CDATA #IMPLIED>
+.IP
+<!ELEMENT annotation EMPTY>
+<!ATTLIST annotation name  CDATA #REQUIRED>
+<!ATTLIST annotation value CDATA #REQUIRED>
+<!ATTLIST annotation keep  CDATA "true">
+.IP
+<!ELEMENT copyfile EMPTY>
+<!ATTLIST copyfile src  CDATA #REQUIRED>
+<!ATTLIST copyfile dest CDATA #REQUIRED>
+.IP
+<!ELEMENT linkfile EMPTY>
+<!ATTLIST linkfile src CDATA #REQUIRED>
+<!ATTLIST linkfile dest CDATA #REQUIRED>
+.IP
+<!ELEMENT extend\-project EMPTY>
+<!ATTLIST extend\-project name CDATA #REQUIRED>
+<!ATTLIST extend\-project path CDATA #IMPLIED>
+<!ATTLIST extend\-project groups CDATA #IMPLIED>
+<!ATTLIST extend\-project revision CDATA #IMPLIED>
+<!ATTLIST extend\-project remote CDATA #IMPLIED>
+.IP
+<!ELEMENT remove\-project EMPTY>
+<!ATTLIST remove\-project name  CDATA #REQUIRED>
+<!ATTLIST remove\-project optional  CDATA #IMPLIED>
+.IP
+<!ELEMENT repo\-hooks EMPTY>
+<!ATTLIST repo\-hooks in\-project CDATA #REQUIRED>
+<!ATTLIST repo\-hooks enabled\-list CDATA #REQUIRED>
+.IP
+<!ELEMENT superproject EMPTY>
+<!ATTLIST superproject name    CDATA #REQUIRED>
+<!ATTLIST superproject remote  IDREF #IMPLIED>
+.IP
+<!ELEMENT contactinfo EMPTY>
+<!ATTLIST contactinfo bugurl  CDATA #REQUIRED>
+.IP
+<!ELEMENT include EMPTY>
+<!ATTLIST include name   CDATA #REQUIRED>
+<!ATTLIST include groups CDATA #IMPLIED>
+.PP
+]>
+```
+.PP
+For compatibility purposes across repo releases, all unknown elements are
+silently ignored. However, repo reserves all possible names for itself for
+future use. If you want to use custom elements, the `x\-*` namespace is reserved
+for that purpose, and repo guarantees to never allocate any corresponding names.
+.PP
+A description of the elements and their attributes follows.
+.PP
+Element manifest
+.PP
+The root element of the file.
+.PP
+Element notice
+.PP
+Arbitrary text that is displayed to users whenever `repo sync` finishes. The
+content is simply passed through as it exists in the manifest.
+.PP
+Element remote
+.PP
+One or more remote elements may be specified. Each remote element specifies a
+Git URL shared by one or more projects and (optionally) the Gerrit review server
+those projects upload changes through.
+.PP
+Attribute `name`: A short name unique to this manifest file. The name specified
+here is used as the remote name in each project's .git/config, and is therefore
+automatically available to commands like `git fetch`, `git remote`, `git pull`
+and `git push`.
+.PP
+Attribute `alias`: The alias, if specified, is used instead of `name` as the
+remote name in each project's .git/config. Its value may be duplicated across
+remotes, while `name` must be unique within the manifest file. This lets
+projects use the same remote name even when it points to different remote URLs.
+.PP
+Attribute `fetch`: The Git URL prefix for all projects which use this remote.
+Each project's name is appended to this prefix to form the actual URL used to
+clone the project.
+.PP
+Attribute `pushurl`: The Git "push" URL prefix for all projects which use this
+remote. Each project's name is appended to this prefix to form the actual URL
+used to "git push" the project. This attribute is optional; if not specified
+then "git push" will use the same URL as the `fetch` attribute.
+.PP
+Attribute `review`: Hostname of the Gerrit server where reviews are uploaded to
+by `repo upload`. This attribute is optional; if not specified then `repo
+upload` will not function.
+.PP
+Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`).
+Remotes with their own revision will override the default revision.
+.PP
+Element default
+.PP
+At most one default element may be specified. Its remote and revision attributes
+are used when a project element does not specify its own remote or revision
+attribute.
+.PP
+Attribute `remote`: Name of a previously defined remote element. Project
+elements lacking a remote attribute of their own will use this remote.
+.PP
+Attribute `revision`: Name of a Git branch (e.g. `main` or `refs/heads/main`).
+Project elements lacking their own revision attribute will use this revision.
+.PP
+Attribute `dest\-branch`: Name of a Git branch (e.g. `main`). Project elements
+not setting their own `dest\-branch` will inherit this value. If this value is
+not set, projects will use `revision` by default instead.
+.PP
+Attribute `upstream`: Name of the Git ref in which a sha1 can be found. Used
+when syncing a revision locked manifest in \fB\-c\fR mode to avoid having to sync the
+entire ref space. Project elements not setting their own `upstream` will inherit
+this value.
+.PP
+Attribute `sync\-j`: Number of parallel jobs to use when syncing.
+.PP
+Attribute `sync\-c`: Set to true to only sync the given Git branch (specified in
+the `revision` attribute) rather than the whole ref space. Project elements
+lacking a sync\-c attribute of their own will use this value.
+.PP
+Attribute `sync\-s`: Set to true to also sync sub\-projects.
+.PP
+Attribute `sync\-tags`: Set to false to only sync the given Git branch (specified
+in the `revision` attribute) rather than the other ref tags.
+.PP
+Element manifest\-server
+.PP
+At most one manifest\-server may be specified. The url attribute is used to
+specify the URL of a manifest server, which is an XML RPC service.
+.PP
+The manifest server should implement the following RPC methods:
+.IP
+GetApprovedManifest(branch, target)
+.PP
+Return a manifest in which each project is pegged to a known good revision for
+the current branch and target. This is used by repo sync when the \fB\-\-smart\-sync\fR
+option is given.
+.PP
+The target to use is defined by environment variables TARGET_PRODUCT and
+TARGET_BUILD_VARIANT. These variables are used to create a string of the form
+$TARGET_PRODUCT\-$TARGET_BUILD_VARIANT, e.g. passion\-userdebug. If one or both
+of those variables are not present, the program will call GetApprovedManifest
+without the target parameter and the manifest server should choose a reasonable
+default target.
+.IP
+GetManifest(tag)
+.PP
+Return a manifest in which each project is pegged to the revision at the
+specified tag. This is used by repo sync when the \fB\-\-smart\-tag\fR option is given.
+.PP
+Element project
+.PP
+One or more project elements may be specified. Each element describes a single
+Git repository to be cloned into the repo client workspace. You may specify
+Git\-submodules by creating a nested project. Git\-submodules will be
+automatically recognized and inherit their parent's attributes, but those may be
+overridden by an explicitly specified project element.
+.PP
+Attribute `name`: A unique name for this project. The project's name is appended
+onto its remote's fetch URL to generate the actual URL to configure the Git
+remote with. The URL gets formed as:
+.IP
+${remote_fetch}/${project_name}.git
+.PP
+where ${remote_fetch} is the remote's fetch attribute and ${project_name} is the
+project's name attribute. The suffix ".git" is always appended as repo assumes
+the upstream is a forest of bare Git repositories. If the project has a parent
+element, its name will be prefixed by the parent's.
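+.PP
+For example, given a remote whose fetch attribute is
+\f(CWhttps://example.com/git\fR and a project named \f(CWplatform/build\fR
+(both illustrative), repo would configure the Git remote with the URL
+\f(CWhttps://example.com/git/platform/build.git\fR.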
+.PP
+The project name must match the name Gerrit knows, if Gerrit is being used for
+code reviews.
+.PP
+"name" must not be empty, and may not be an absolute path or use "." or ".."
+path components. It is always interpreted relative to the remote's fetch
+settings, so if a different base path is needed, declare a different remote with
+the new settings needed. These restrictions are not enforced for [Local
+Manifests].
+.PP
+Attribute `path`: An optional path relative to the top directory of the repo
+client where the Git working directory for this project should be placed. If not
+supplied the project "name" is used. If the project has a parent element, its
+path will be prefixed by the parent's.
+.PP
+"path" may not be an absolute path or use "." or ".." path components. These
+restrictions are not enforced for [Local Manifests].
+.PP
+If you want to place files into the root of the checkout (e.g. a README or
+Makefile or another build script), use the [copyfile] or [linkfile] elements
+instead.
+.PP
+Attribute `remote`: Name of a previously defined remote element. If not supplied
+the remote given by the default element is used.
+.PP
+Attribute `revision`: Name of the Git branch the manifest wants to track for
+this project. Names can be relative to refs/heads (e.g. just "main") or absolute
+(e.g. "refs/heads/main"). Tags and/or explicit SHA\-1s should work in theory, but
+have not been extensively tested. If not supplied the revision given by the
+remote element is used if applicable, else the default element is used.
+.PP
+Attribute `dest\-branch`: Name of a Git branch (e.g. `main`). When using `repo
+upload`, changes will be submitted for code review on this branch. If
+unspecified both here and in the default element, `revision` is used instead.
+.PP
+Attribute `groups`: List of groups to which this project belongs, whitespace or
+comma separated. All projects belong to the group "all", and each project
+automatically belongs to the groups name:`name` and path:`path`. E.g. for
+`<project name="monkeys" path="barrel\-of"/>`, that project definition is
+implicitly in the following manifest groups: default, name:monkeys, and
+path:barrel\-of. If you place a project in the group "notdefault", it will not be
+automatically downloaded by repo. If the project has a parent element, the
+`name` and `path` here are the prefixed ones.
+.PP
+Attribute `sync\-c`: Set to true to only sync the given Git branch (specified in
+the `revision` attribute) rather than the whole ref space.
+.PP
+Attribute `sync\-s`: Set to true to also sync sub\-projects.
+.PP
+Attribute `upstream`: Name of the Git ref in which a sha1 can be found. Used
+when syncing a revision locked manifest in \fB\-c\fR mode to avoid having to sync the
+entire ref space.
+.PP
+Attribute `clone\-depth`: Set the depth to use when fetching this project. If
+specified, this value will override any value given to repo init with the
+\fB\-\-depth\fR option on the command line.
+.PP
+Attribute `force\-path`: Set to true to force this project to create the local
+mirror repository according to its `path` attribute (if supplied) rather than
+the `name` attribute. This attribute only applies when syncing local mirrors;
+it is ignored when syncing projects in a client working directory.
+.PP
+Element extend\-project
+.PP
+Modify the attributes of the named project.
+.PP
+This element is mostly useful in a local manifest file, to modify the attributes
+of an existing project without completely replacing the existing project
+definition. This makes the local manifest more robust against changes to the
+original manifest.
+.PP
+Attribute `path`: If specified, limit the change to projects checked out at the
+specified path, rather than all projects with the given name.
+.PP
+Attribute `groups`: List of additional groups to which this project belongs.
+Same syntax as the corresponding attribute of `project`.
+.PP
+Attribute `revision`: If specified, overrides the revision of the original
+project. Same syntax as the corresponding attribute of `project`.
+.PP
+Attribute `remote`: If specified, overrides the remote of the original project.
+Same syntax as the corresponding attribute of `project`.
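+.PP
+For example, a local manifest might use extend\-project to pin one project to a
+different revision without redefining it (the project and branch names here are
+illustrative):
+.IP
+\f(CW<extend\-project name="platform/build" revision="refs/heads/stable" />\fR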
+.PP
+Element annotation
+.PP
+Zero or more annotation elements may be specified as children of a project or
+remote element. Each element describes a name\-value pair. For projects, this
+name\-value pair will be exported into each project's environment during a
+\&'forall' command, prefixed with `REPO__`. In addition, there is an optional
+attribute "keep" which accepts the case\-insensitive values "true" (default) or
+"false". This attribute determines whether or not the annotation will be kept
+when exported with the manifest subcommand.
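+.PP
+For example (the annotation name and value here are illustrative):
+.IP
+\f(CW<annotation name="branch" value="main" />\fR
+.PP
+During a 'forall' command, this pair would appear in the project's environment
+as \f(CWREPO__branch=main\fR.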
+.PP
+Element copyfile
+.PP
+Zero or more copyfile elements may be specified as children of a project
+element. Each element describes a src\-dest pair of files; the "src" file will be
+copied to the "dest" place during `repo sync` command.
+.PP
+"src" is project relative, "dest" is relative to the top of the tree. Copying
+from paths outside of the project or to paths outside of the repo client is not
+allowed.
+.PP
+"src" and "dest" must be files. Directories or symlinks are not allowed.
+Intermediate paths must not be symlinks either.
+.PP
+Parent directories of "dest" will be automatically created if missing.
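+.PP
+For example, to copy a project's top\-level Makefile into the root of the
+checkout (the paths are illustrative):
+.IP
+\f(CW<copyfile src="Makefile" dest="Makefile" />\fR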
+.PP
+Element linkfile
+.PP
+It is just like copyfile and runs at the same time as copyfile, but instead of
+copying it creates a symlink.
+.PP
+The symlink is created at "dest" (relative to the top of the tree) and points to
+the path specified by "src" which is a path in the project.
+.PP
+Parent directories of "dest" will be automatically created if missing.
+.PP
+The symlink target may be a file or directory, but it may not point outside of
+the repo client.
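+.PP
+For example, to expose a project's setup script at the top of the tree as a
+symlink (the paths are illustrative):
+.IP
+\f(CW<linkfile src="build/envsetup.sh" dest="envsetup.sh" />\fR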
+.PP
+Element remove\-project
+.PP
+Deletes the named project from the internal manifest table, possibly allowing a
+subsequent project element in the same manifest file to replace the project with
+a different source.
+.PP
+This element is mostly useful in a local manifest file, where the user can
+remove a project, and possibly replace it with their own definition.
+.PP
+Attribute `optional`: Set to true to ignore remove\-project elements with no
+matching `project` element.
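+.PP
+For example, a local manifest could replace a project with a fork (the project
+and remote names here are illustrative):
+.IP
+.nf
+\f(CW<remove\-project name="platform/build" />
+<project name="fork/build" path="build" remote="myfork" />\fR
+.fi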
+.PP
+Element repo\-hooks
+.PP
+NB: See the [practical documentation](./repo\-hooks.md) for using repo hooks.
+.PP
+Only one repo\-hooks element may be specified at a time. Attempting to redefine
+it will fail to parse.
+.PP
+Attribute `in\-project`: The project where the hooks are defined. The value must
+match the `name` attribute (**not** the `path` attribute) of a previously
+defined `project` element.
+.PP
+Attribute `enabled\-list`: List of hooks to use, whitespace or comma separated.
+.PP
+Element superproject
+.PP
+Note: This is currently a work in progress.
+.PP
+NB: See the [git superprojects documentation](
+https://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects) for background
+information.
+.PP
+This element is used to specify the URL of the superproject. It has "name" and
+"remote" as attributes. Only "name" is required, while "remote" has a
+reasonable default. At most one superproject may be specified. Attempting to
+redefine it will fail to parse.
+.PP
+Attribute `name`: A unique name for the superproject. This attribute has the
+same meaning as project's name attribute. See the [element
+project](#element\-project) for more information.
+.PP
+Attribute `remote`: Name of a previously defined remote element. If not supplied
+the remote given by the default element is used.
+.PP
+Element contactinfo
+.PP
+Note: This is currently a work in progress.
+.PP
+This element is used to let manifest authors self\-register contact info. It
+has "bugurl" as a required attribute. This element can be repeated, and any
+later entries will clobber earlier ones. This allows manifest authors who
+extend manifests to specify their own contact info.
+.PP
+Attribute `bugurl`: The URL to file a bug against the manifest owner.
+.PP
+Element include
+.PP
+This element provides the capability of including another manifest file into the
+originating manifest. Normal rules apply for the target manifest to include \- it
+must be a usable manifest on its own.
+.PP
+Attribute `name`: the manifest to include, specified relative to the manifest
+repository's root.
+.PP
+"name" may not be an absolute path or use "." or ".." path components. These
+restrictions are not enforced for [Local Manifests].
+.PP
+Attribute `groups`: List of additional groups to which all projects in the
+included manifest belong. This appends and recurses, meaning all projects in
+sub\-manifests carry all parent include groups. Same syntax as the
+corresponding attribute of `project`.
+.PP
+Local Manifests
+.PP
+Additional remotes and projects may be added through local manifest files stored
+in `$TOP_DIR/.repo/local_manifests/*.xml`.
+.PP
+For example:
+.IP
+\f(CW$ ls .repo/local_manifests\fR
+.IP
+local_manifest.xml
+another_local_manifest.xml
+.IP
+\f(CW$ cat .repo/local_manifests/local_manifest.xml\fR
+.IP
+.nf
+\f(CW<?xml version="1.0" encoding="UTF\-8"?>
+<manifest>
+  <project path="manifest"
+           name="tools/manifest" />
+  <project path="platform\-manifest"
+           name="platform/manifest" />
+</manifest>\fR
+.fi
+.PP
+Users may add projects to the local manifest(s) prior to a `repo sync`
+invocation, instructing repo to automatically download and manage these extra
+projects.
+.PP
+Manifest files stored in `$TOP_DIR/.repo/local_manifests/*.xml` will be loaded
+in alphabetical order.
+.PP
+Projects from local manifest files are added to the group local::<local
+manifest filename>.
+.PP
+The legacy `$TOP_DIR/.repo/local_manifest.xml` path is no longer supported.
+.PP
+[copyfile]: #Element\-copyfile
+[linkfile]: #Element\-linkfile
+[Local Manifests]: #local\-manifests
diff --git a/man/repo-overview.1 b/man/repo-overview.1
new file mode 100644
index 0000000..a12c764
--- /dev/null
+++ b/man/repo-overview.1
@@ -0,0 +1,39 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo overview" "Repo Manual"
+.SH NAME
+repo \- repo overview - manual page for repo overview
+.SH SYNOPSIS
+.B repo
+\fI\,overview \/\fR[\fI\,--current-branch\/\fR] [\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Display overview of unmerged project branches
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-c\fR, \fB\-\-current\-branch\fR
+consider only checked out branches
+.TP
+\fB\-\-no\-current\-branch\fR
+consider all local branches
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help overview` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo overview' command is used to display an overview of the projects'
+branches, and list any local commits that have not yet been merged into the
+project.
+.PP
+The \fB\-c\fR/\-\-current\-branch option can be used to restrict the output to only
+branches currently checked out in each project. By default, all branches are
+displayed.
diff --git a/man/repo-prune.1 b/man/repo-prune.1
new file mode 100644
index 0000000..bd68a37
--- /dev/null
+++ b/man/repo-prune.1
@@ -0,0 +1,28 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo prune" "Repo Manual"
+.SH NAME
+repo \- repo prune - manual page for repo prune
+.SH SYNOPSIS
+.B repo
+\fI\,prune \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Prune (delete) already merged topics
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help prune` to view the detailed manual.
diff --git a/man/repo-rebase.1 b/man/repo-rebase.1
new file mode 100644
index 0000000..aa26103
--- /dev/null
+++ b/man/repo-rebase.1
@@ -0,0 +1,55 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo rebase" "Repo Manual"
+.SH NAME
+repo \- repo rebase - manual page for repo rebase
+.SH SYNOPSIS
+.B repo
+\fI\,rebase {\/\fR[\fI\,<project>\/\fR...] \fI\,| -i <project>\/\fR...\fI\,}\/\fR
+.SH DESCRIPTION
+Summary
+.PP
+Rebase local branches on upstream branch
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-\-fail\-fast\fR
+stop rebasing after first error is hit
+.TP
+\fB\-f\fR, \fB\-\-force\-rebase\fR
+pass \fB\-\-force\-rebase\fR to git rebase
+.TP
+\fB\-\-no\-ff\fR
+pass \fB\-\-no\-ff\fR to git rebase
+.TP
+\fB\-\-autosquash\fR
+pass \fB\-\-autosquash\fR to git rebase
+.TP
+\fB\-\-whitespace\fR=\fI\,WS\/\fR
+pass \fB\-\-whitespace\fR to git rebase
+.TP
+\fB\-\-auto\-stash\fR
+stash local modifications before starting
+.TP
+\fB\-m\fR, \fB\-\-onto\-manifest\fR
+rebase onto the manifest version instead of upstream
+HEAD (this helps to make sure the local tree stays
+consistent if you previously synced to a manifest)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.TP
+\fB\-i\fR, \fB\-\-interactive\fR
+interactive rebase (single project only)
+.PP
+Run `repo help rebase` to view the detailed manual.
+.SH DETAILS
+.PP
+\&'repo rebase' uses git rebase to move local changes in the current topic branch
+to the HEAD of the upstream history, useful when you have made commits in a
+topic branch but need to incorporate new upstream changes "underneath" them.
diff --git a/man/repo-selfupdate.1 b/man/repo-selfupdate.1
new file mode 100644
index 0000000..70c855a
--- /dev/null
+++ b/man/repo-selfupdate.1
@@ -0,0 +1,35 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo selfupdate" "Repo Manual"
+.SH NAME
+repo \- repo selfupdate - manual page for repo selfupdate
+.SH SYNOPSIS
+.B repo
+\fI\,selfupdate\/\fR
+.SH DESCRIPTION
+Summary
+.PP
+Update repo to the latest version
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS repo Version options:
+.TP
+\fB\-\-no\-repo\-verify\fR
+do not verify repo source code
+.PP
+Run `repo help selfupdate` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo selfupdate' command upgrades repo to the latest version, if a newer
+version is available.
+.PP
+Normally this is done automatically by 'repo sync' and does not need to be
+performed by an end\-user.
diff --git a/man/repo-smartsync.1 b/man/repo-smartsync.1
new file mode 100644
index 0000000..5d93911
--- /dev/null
+++ b/man/repo-smartsync.1
@@ -0,0 +1,118 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo smartsync" "Repo Manual"
+.SH NAME
+repo \- repo smartsync - manual page for repo smartsync
+.SH SYNOPSIS
+.B repo
+\fI\,smartsync \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Update working tree to the latest known good revision
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
+number of network jobs to run in parallel (defaults to
+\fB\-\-jobs\fR)
+.TP
+\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
+number of local checkout jobs to run in parallel
+(defaults to \fB\-\-jobs\fR)
+.TP
+\fB\-f\fR, \fB\-\-force\-broken\fR
+obsolete option (to be deleted in the future)
+.TP
+\fB\-\-fail\-fast\fR
+stop syncing after first error is hit
+.TP
+\fB\-\-force\-sync\fR
+overwrite an existing git directory if it needs to
+point to a different object directory. WARNING: this
+may cause loss of data
+.TP
+\fB\-\-force\-remove\-dirty\fR
+force remove projects with uncommitted modifications
+if projects no longer exist in the manifest. WARNING:
+this may cause loss of data
+.TP
+\fB\-l\fR, \fB\-\-local\-only\fR
+only update working tree, don't fetch
+.TP
+\fB\-\-no\-manifest\-update\fR, \fB\-\-nmu\fR
+use the existing manifest checkout as\-is. (do not
+update to the latest revision)
+.TP
+\fB\-n\fR, \fB\-\-network\-only\fR
+fetch only, don't update working tree
+.TP
+\fB\-d\fR, \fB\-\-detach\fR
+detach projects back to manifest revision
+.TP
+\fB\-c\fR, \fB\-\-current\-branch\fR
+fetch only current branch from server
+.TP
+\fB\-\-no\-current\-branch\fR
+fetch all branches from server
+.TP
+\fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
+temporary manifest to use for this sync
+.TP
+\fB\-\-clone\-bundle\fR
+enable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS
+.TP
+\fB\-\-no\-clone\-bundle\fR
+disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS
+.TP
+\fB\-u\fR MANIFEST_SERVER_USERNAME, \fB\-\-manifest\-server\-username\fR=\fI\,MANIFEST_SERVER_USERNAME\/\fR
+username to authenticate with the manifest server
+.TP
+\fB\-p\fR MANIFEST_SERVER_PASSWORD, \fB\-\-manifest\-server\-password\fR=\fI\,MANIFEST_SERVER_PASSWORD\/\fR
+password to authenticate with the manifest server
+.TP
+\fB\-\-fetch\-submodules\fR
+fetch submodules from server
+.TP
+\fB\-\-use\-superproject\fR
+use the manifest superproject to sync projects
+.TP
+\fB\-\-no\-use\-superproject\fR
+disable use of manifest superprojects
+.TP
+\fB\-\-tags\fR
+fetch tags
+.TP
+\fB\-\-no\-tags\fR
+don't fetch tags
+.TP
+\fB\-\-optimized\-fetch\fR
+only fetch projects fixed to sha1 if revision does not
+exist locally
+.TP
+\fB\-\-retry\-fetches\fR=\fI\,RETRY_FETCHES\/\fR
+number of times to retry fetches on transient errors
+.TP
+\fB\-\-prune\fR
+delete refs that no longer exist on the remote
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS repo Version options:
+.TP
+\fB\-\-no\-repo\-verify\fR
+do not verify repo source code
+.PP
+Run `repo help smartsync` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo smartsync' command is a shortcut for sync \fB\-s\fR.
diff --git a/man/repo-stage.1 b/man/repo-stage.1
new file mode 100644
index 0000000..07e1cac
--- /dev/null
+++ b/man/repo-stage.1
@@ -0,0 +1,30 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo stage" "Repo Manual"
+.SH NAME
+repo \- repo stage - manual page for repo stage
+.SH SYNOPSIS
+.B repo
+\fI\,stage -i \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Stage file(s) for commit
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.TP
+\fB\-i\fR, \fB\-\-interactive\fR
+use interactive staging
+.PP
+Run `repo help stage` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo stage' command stages files to prepare the next commit.
diff --git a/man/repo-start.1 b/man/repo-start.1
new file mode 100644
index 0000000..b00a31f
--- /dev/null
+++ b/man/repo-start.1
@@ -0,0 +1,41 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo start" "Repo Manual"
+.SH NAME
+repo \- repo start - manual page for repo start
+.SH SYNOPSIS
+.B repo
+\fI\,start <newbranchname> \/\fR[\fI\,--all | <project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Start a new branch for development
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-\-all\fR
+begin branch in all projects
+.TP
+\fB\-r\fR REVISION, \fB\-\-rev\fR=\fI\,REVISION\/\fR, \fB\-\-revision\fR=\fI\,REVISION\/\fR
+point branch at this revision instead of upstream
+.TP
+\fB\-\-head\fR, \fB\-\-HEAD\fR
+abbreviation for \fB\-\-rev\fR HEAD
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help start` to view the detailed manual.
+.SH DETAILS
+.PP
+\&'repo start' begins a new branch of development, starting from the revision
+specified in the manifest.
diff --git a/man/repo-status.1 b/man/repo-status.1
new file mode 100644
index 0000000..fbae2c5
--- /dev/null
+++ b/man/repo-status.1
@@ -0,0 +1,98 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo status" "Repo Manual"
+.SH NAME
+repo \- repo status - manual page for repo status
+.SH SYNOPSIS
+.B repo
+\fI\,status \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Show the working tree status
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-o\fR, \fB\-\-orphans\fR
+include objects in working directory outside of repo
+projects
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help status` to view the detailed manual.
+.SH DETAILS
+.PP
+\&'repo status' compares the working tree to the staging area (aka index), and the
+most recent commit on this branch (HEAD), in each project specified. A summary
+is displayed, one line per file where there is a difference between these three
+states.
+.PP
+The \fB\-j\fR/\-\-jobs option can be used to run multiple status queries in parallel.
+.PP
+The \fB\-o\fR/\-\-orphans option can be used to show objects that are in the working
+directory, but not associated with a repo project. This includes unmanaged
+top\-level files and directories, but also includes deeper items. For example, if
+dir/subdir/proj1 and dir/subdir/proj2 are repo projects, dir/subdir/proj3 will
+be shown if it is not known to repo.
+.PP
+Status Display
+.PP
+The status display is organized into three columns of information, for example
+if the file 'subcmds/status.py' is modified in the project 'repo' on branch
+\&'devwork':
+.TP
+project repo/
+branch devwork
+.TP
+\fB\-m\fR
+subcmds/status.py
+.PP
+The first column explains how the staging area (index) differs from the last
+commit (HEAD). Its values are always displayed in upper case and have the
+following meanings:
+.TP
+\-:
+no difference
+.TP
+A:
+added         (not in HEAD,     in index                     )
+.TP
+M:
+modified      (    in HEAD,     in index, different content  )
+.TP
+D:
+deleted       (    in HEAD, not in index                     )
+.TP
+R:
+renamed       (not in HEAD,     in index, path changed       )
+.TP
+C:
+copied        (not in HEAD,     in index, copied from another)
+.TP
+T:
+mode changed  (    in HEAD,     in index, same content       )
+.TP
+U:
+unmerged; conflict resolution required
+.PP
+The second column explains how the working directory differs from the index. Its
+values are always displayed in lower case and have the following meanings:
+.TP
+\-:
+new / unknown (not in index,     in work tree                )
+.TP
+m:
+modified      (    in index,     in work tree, modified      )
+.TP
+d:
+deleted       (    in index, not in work tree                )
diff --git a/man/repo-sync.1 b/man/repo-sync.1
new file mode 100644
index 0000000..c87c970
--- /dev/null
+++ b/man/repo-sync.1
@@ -0,0 +1,209 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo sync" "Repo Manual"
+.SH NAME
+repo \- repo sync - manual page for repo sync
+.SH SYNOPSIS
+.B repo
+\fI\,sync \/\fR[\fI\,<project>\/\fR...]
+.SH DESCRIPTION
+Summary
+.PP
+Update working tree to the latest revision
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-\-jobs\-network\fR=\fI\,JOBS\/\fR
+number of network jobs to run in parallel (defaults to
+\fB\-\-jobs\fR)
+.TP
+\fB\-\-jobs\-checkout\fR=\fI\,JOBS\/\fR
+number of local checkout jobs to run in parallel
+(defaults to \fB\-\-jobs\fR)
+.TP
+\fB\-f\fR, \fB\-\-force\-broken\fR
+obsolete option (to be deleted in the future)
+.TP
+\fB\-\-fail\-fast\fR
+stop syncing after first error is hit
+.TP
+\fB\-\-force\-sync\fR
+overwrite an existing git directory if it needs to
+point to a different object directory. WARNING: this
+may cause loss of data
+.TP
+\fB\-\-force\-remove\-dirty\fR
+force remove projects with uncommitted modifications
+if projects no longer exist in the manifest. WARNING:
+this may cause loss of data
+.TP
+\fB\-l\fR, \fB\-\-local\-only\fR
+only update working tree, don't fetch
+.TP
+\fB\-\-no\-manifest\-update\fR, \fB\-\-nmu\fR
+use the existing manifest checkout as\-is. (do not
+update to the latest revision)
+.TP
+\fB\-n\fR, \fB\-\-network\-only\fR
+fetch only, don't update working tree
+.TP
+\fB\-d\fR, \fB\-\-detach\fR
+detach projects back to manifest revision
+.TP
+\fB\-c\fR, \fB\-\-current\-branch\fR
+fetch only current branch from server
+.TP
+\fB\-\-no\-current\-branch\fR
+fetch all branches from server
+.TP
+\fB\-m\fR NAME.xml, \fB\-\-manifest\-name\fR=\fI\,NAME\/\fR.xml
+temporary manifest to use for this sync
+.TP
+\fB\-\-clone\-bundle\fR
+enable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS
+.TP
+\fB\-\-no\-clone\-bundle\fR
+disable use of \fI\,/clone.bundle\/\fP on HTTP/HTTPS
+.TP
+\fB\-u\fR MANIFEST_SERVER_USERNAME, \fB\-\-manifest\-server\-username\fR=\fI\,MANIFEST_SERVER_USERNAME\/\fR
+username to authenticate with the manifest server
+.TP
+\fB\-p\fR MANIFEST_SERVER_PASSWORD, \fB\-\-manifest\-server\-password\fR=\fI\,MANIFEST_SERVER_PASSWORD\/\fR
+password to authenticate with the manifest server
+.TP
+\fB\-\-fetch\-submodules\fR
+fetch submodules from server
+.TP
+\fB\-\-use\-superproject\fR
+use the manifest superproject to sync projects
+.TP
+\fB\-\-no\-use\-superproject\fR
+disable use of manifest superprojects
+.TP
+\fB\-\-tags\fR
+fetch tags
+.TP
+\fB\-\-no\-tags\fR
+don't fetch tags
+.TP
+\fB\-\-optimized\-fetch\fR
+only fetch projects fixed to sha1 if revision does not
+exist locally
+.TP
+\fB\-\-retry\-fetches\fR=\fI\,RETRY_FETCHES\/\fR
+number of times to retry fetches on transient errors
+.TP
+\fB\-\-prune\fR
+delete refs that no longer exist on the remote
+.TP
+\fB\-s\fR, \fB\-\-smart\-sync\fR
+smart sync using manifest from the latest known good
+build
+.TP
+\fB\-t\fR SMART_TAG, \fB\-\-smart\-tag\fR=\fI\,SMART_TAG\/\fR
+smart sync using manifest from a known tag
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS repo Version options:
+.TP
+\fB\-\-no\-repo\-verify\fR
+do not verify repo source code
+.PP
+Run `repo help sync` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo sync' command synchronizes local project directories with the remote
+repositories specified in the manifest. If a local project does not yet exist,
+it will clone a new local directory from the remote repository and set up
+tracking branches as specified in the manifest. If the local project already
+exists, 'repo sync' will update the remote branches and rebase any new local
+changes on top of the new remote changes.
+.PP
+\&'repo sync' will synchronize all projects listed at the command line. Projects
+can be specified either by name, or by a relative or absolute path to the
+project's local directory. If no projects are specified, 'repo sync' will
+synchronize all projects listed in the manifest.
+.PP
+The \fB\-d\fR/\-\-detach option can be used to switch specified projects back to the
+manifest revision. This option is especially helpful if the project is currently
+on a topic branch, but the manifest revision is temporarily needed.
+.PP
+The \fB\-s\fR/\-\-smart\-sync option can be used to sync to a known good build as
+specified by the manifest\-server element in the current manifest. The
+\fB\-t\fR/\-\-smart\-tag option is similar and allows you to specify a custom tag/label.
+.PP
+The \fB\-u\fR/\-\-manifest\-server\-username and \fB\-p\fR/\-\-manifest\-server\-password options can
+be used to specify a username and password to authenticate with the manifest
+server when using the \fB\-s\fR or \fB\-t\fR option.
+.PP
+If \fB\-u\fR and \fB\-p\fR are not specified when using the \fB\-s\fR or \fB\-t\fR option, 'repo sync' will
+attempt to read authentication credentials for the manifest server from the
+user's .netrc file.
+.PP
+\&'repo sync' will not use authentication credentials from \fB\-u\fR/\-p or .netrc if the
+manifest server specified in the manifest file already includes credentials.
+.PP
+By default, all projects will be synced. The \fB\-\-fail\-fast\fR option can be used to
+halt syncing as soon as possible when the first project fails to sync.
+.PP
+The \fB\-\-force\-sync\fR option can be used to overwrite existing git directories if
+they have previously been linked to a different object directory. WARNING: This
+may cause data to be lost since refs may be removed when overwriting.
+.PP
+The \fB\-\-force\-remove\-dirty\fR option can be used to remove previously used projects
+with uncommitted changes. WARNING: This may cause data to be lost since
+uncommitted changes may be removed with projects that no longer exist in the
+manifest.
+.PP
+The \fB\-\-no\-clone\-bundle\fR option disables any attempt to use \fI\,$URL/clone.bundle\/\fP to
+bootstrap a new Git repository from a resumeable bundle file on a content
+delivery network. This may be necessary if there are problems with the local
+Python HTTP client or proxy configuration, but the Git binary works.
+.PP
+The \fB\-\-fetch\-submodules\fR option enables fetching Git submodules of a project from
+server.
+.PP
+The \fB\-c\fR/\-\-current\-branch option can be used to only fetch objects that are on the
+branch specified by a project's revision.
+.PP
+The \fB\-\-optimized\-fetch\fR option can be used to only fetch projects that are fixed
+to a sha1 revision if the sha1 revision does not already exist locally.
+.PP
+The \fB\-\-prune\fR option can be used to remove any refs that no longer exist on the
+remote.
+.PP
+SSH Connections
+.PP
+If at least one project remote URL uses an SSH connection (ssh://, git+ssh://,
+or user@host:path syntax), repo will automatically enable the SSH ControlMaster
+option when connecting to that host. This feature permits other projects in the
+same 'repo sync' session to reuse the same SSH tunnel, saving connection setup
+overheads.
+.PP
+To disable this behavior on UNIX platforms, set the GIT_SSH environment variable
+to 'ssh'. For example:
+.IP
+export GIT_SSH=ssh
+repo sync
+.PP
+Compatibility
+.PP
+This feature is automatically disabled on Windows, due to the lack of UNIX
+domain socket support.
+.PP
+This feature is not compatible with url.insteadof rewrites in the user's
+~/.gitconfig. 'repo sync' is currently not able to perform the rewrite early
+enough to establish the ControlMaster tunnel.
+.PP
+If the remote SSH daemon is Gerrit Code Review, version 2.0.10 or later is
+required to fix a server side protocol bug.
diff --git a/man/repo-upload.1 b/man/repo-upload.1
new file mode 100644
index 0000000..36a0dac
--- /dev/null
+++ b/man/repo-upload.1
@@ -0,0 +1,175 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo upload" "Repo Manual"
+.SH NAME
+repo \- repo upload - manual page for repo upload
+.SH SYNOPSIS
+.B repo
+\fI\,upload \/\fR[\fI\,--re --cc\/\fR] [\fI\,<project>\/\fR]...
+.SH DESCRIPTION
+Summary
+.PP
+Upload changes for code review
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-j\fR JOBS, \fB\-\-jobs\fR=\fI\,JOBS\/\fR
+number of jobs to run in parallel (default: based on
+number of CPU cores)
+.TP
+\fB\-t\fR
+send local branch name to Gerrit Code Review
+.TP
+\fB\-\-hashtag\fR=\fI\,HASHTAGS\/\fR, \fB\-\-ht\fR=\fI\,HASHTAGS\/\fR
+add hashtags (comma delimited) to the review
+.TP
+\fB\-\-hashtag\-branch\fR, \fB\-\-htb\fR
+add local branch name as a hashtag
+.TP
+\fB\-l\fR LABELS, \fB\-\-label\fR=\fI\,LABELS\/\fR
+add a label when uploading
+.TP
+\fB\-\-re\fR=\fI\,REVIEWERS\/\fR, \fB\-\-reviewers\fR=\fI\,REVIEWERS\/\fR
+request reviews from these people
+.TP
+\fB\-\-cc\fR=\fI\,CC\/\fR
+also send email to these email addresses
+.TP
+\fB\-\-br\fR=\fI\,BRANCH\/\fR, \fB\-\-branch\fR=\fI\,BRANCH\/\fR
+(local) branch to upload
+.TP
+\fB\-c\fR, \fB\-\-current\-branch\fR
+upload current git branch
+.TP
+\fB\-\-no\-current\-branch\fR
+upload all git branches
+.TP
+\fB\-\-ne\fR, \fB\-\-no\-emails\fR
+do not send e\-mails on upload
+.TP
+\fB\-p\fR, \fB\-\-private\fR
+upload as a private change (deprecated; use \fB\-\-wip\fR)
+.TP
+\fB\-w\fR, \fB\-\-wip\fR
+upload as a work\-in\-progress change
+.TP
+\fB\-o\fR PUSH_OPTIONS, \fB\-\-push\-option\fR=\fI\,PUSH_OPTIONS\/\fR
+additional push options to transmit
+.TP
+\fB\-D\fR BRANCH, \fB\-\-destination\fR=\fI\,BRANCH\/\fR, \fB\-\-dest\fR=\fI\,BRANCH\/\fR
+submit for review on this target branch
+.TP
+\fB\-n\fR, \fB\-\-dry\-run\fR
+do everything except actually upload the CL
+.TP
+\fB\-y\fR, \fB\-\-yes\fR
+answer yes to all safe prompts
+.TP
+\fB\-\-no\-cert\-checks\fR
+disable verifying ssl certs (unsafe)
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.SS pre\-upload hooks:
+.TP
+\fB\-\-no\-verify\fR
+Do not run the pre\-upload hook.
+.TP
+\fB\-\-verify\fR
+Run the pre\-upload hook without prompting.
+.TP
+\fB\-\-ignore\-hooks\fR
+Do not abort if pre\-upload hooks fail.
+.PP
+Run `repo help upload` to view the detailed manual.
+.SH DETAILS
+.PP
+The 'repo upload' command is used to send changes to the Gerrit Code Review
+system. It searches for topic branches in local projects that have not yet been
+published for review. If multiple topic branches are found, 'repo upload' opens
+an editor to allow the user to select which branches to upload.
+.PP
+\&'repo upload' searches for uploadable changes in all projects listed at the
+command line. Projects can be specified either by name, or by a relative or
+absolute path to the project's local directory. If no projects are specified,
+\&'repo upload' will search for uploadable changes in all projects listed in the
+manifest.
+.PP
+If the \fB\-\-reviewers\fR or \fB\-\-cc\fR options are passed, those emails are added to the
+respective list of users, and emails are sent to any new users. Users passed as
+\fB\-\-reviewers\fR must already be registered with the code review system, or the
+upload will fail.
+.PP
+Configuration
+.PP
+review.URL.autoupload:
+.PP
+To disable the "Upload ... (y/N)?" prompt, you can set a per\-project or global
+Git configuration option. If review.URL.autoupload is set to "true" then repo
+will assume you always answer "y" at the prompt, and will not prompt you
+further. If it is set to "false" then repo will assume you always answer "n",
+and will abort.
+.PP
+review.URL.autoreviewer:
+.PP
+To automatically append a user or mailing list to reviews, you can set a
+per\-project or global Git option to do so.
+.PP
+review.URL.autocopy:
+.PP
+To automatically copy a user or mailing list to all uploaded reviews, you can
+set a per\-project or global Git option to do so. Specifically,
+review.URL.autocopy can be set to a comma separated list of reviewers who you
+always want copied on all uploads with a non\-empty \fB\-\-re\fR argument.
+.PP
+review.URL.username:
+.PP
+Override the username used to connect to Gerrit Code Review. By default the
+local part of the email address is used.
+.PP
+The URL must match the review URL listed in the manifest XML file, or in the
+\&.git/config within the project. For example:
+.IP
+[remote "origin"]
+.IP
+url = git://git.example.com/project.git
+review = http://review.example.com/
+.IP
+[review "http://review.example.com/"]
+.IP
+autoupload = true
+autocopy = johndoe@company.com,my\-team\-alias@company.com
+.PP
+review.URL.uploadtopic:
+.PP
+To add a topic branch whenever uploading a commit, you can set a per\-project or
+global Git option to do so. If review.URL.uploadtopic is set to "true" then repo
+will assume you always want the equivalent of the \fB\-t\fR option to the repo command.
+If unset or set to "false" then repo will make use of only the command line
+option.
+.PP
+review.URL.uploadhashtags:
+.PP
+To add hashtags whenever uploading a commit, you can set a per\-project or global
+Git option to do so. The value of review.URL.uploadhashtags will be used as
+comma delimited hashtags like the \fB\-\-hashtag\fR option.
+.PP
+review.URL.uploadlabels:
+.PP
+To add labels whenever uploading a commit, you can set a per\-project or global
+Git option to do so. The value of review.URL.uploadlabels will be used as comma
+delimited labels like the \fB\-\-label\fR option.
+.PP
+review.URL.uploadnotify:
+.PP
+Control e\-mail notifications when uploading.
+https://gerrit\-review.googlesource.com/Documentation/user\-upload.html#notify
+.PP
+References
+.PP
+Gerrit Code Review: https://www.gerritcodereview.com/
diff --git a/man/repo-version.1 b/man/repo-version.1
new file mode 100644
index 0000000..cc703f6
--- /dev/null
+++ b/man/repo-version.1
@@ -0,0 +1,24 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo version" "Repo Manual"
+.SH NAME
+repo \- repo version - manual page for repo version
+.SH SYNOPSIS
+.B repo
+\fI\,version\/\fR
+.SH DESCRIPTION
+Summary
+.PP
+Display the version of repo
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.SS Logging options:
+.TP
+\fB\-v\fR, \fB\-\-verbose\fR
+show all output
+.TP
+\fB\-q\fR, \fB\-\-quiet\fR
+only show errors
+.PP
+Run `repo help version` to view the detailed manual.
diff --git a/man/repo.1 b/man/repo.1
new file mode 100644
index 0000000..4aa7638
--- /dev/null
+++ b/man/repo.1
@@ -0,0 +1,133 @@
+.\" DO NOT MODIFY THIS FILE!  It was generated by help2man.
+.TH REPO "1" "July 2021" "repo" "Repo Manual"
+.SH NAME
+repo \- repository management tool built on top of git
+.SH SYNOPSIS
+.B repo
+[\fI\,-p|--paginate|--no-pager\/\fR] \fI\,COMMAND \/\fR[\fI\,ARGS\/\fR]
+.SH OPTIONS
+.TP
+\fB\-h\fR, \fB\-\-help\fR
+show this help message and exit
+.TP
+\fB\-\-help\-all\fR
+show this help message with all subcommands and exit
+.TP
+\fB\-p\fR, \fB\-\-paginate\fR
+display command output in the pager
+.TP
+\fB\-\-no\-pager\fR
+disable the pager
+.TP
+\fB\-\-color\fR=\fI\,COLOR\/\fR
+control color usage: auto, always, never
+.TP
+\fB\-\-trace\fR
+trace git command execution (REPO_TRACE=1)
+.TP
+\fB\-\-trace\-python\fR
+trace python command execution
+.TP
+\fB\-\-time\fR
+time repo command execution
+.TP
+\fB\-\-version\fR
+display this version of repo
+.TP
+\fB\-\-show\-toplevel\fR
+display the path of the top\-level directory of the
+repo client checkout
+.TP
+\fB\-\-event\-log\fR=\fI\,EVENT_LOG\/\fR
+filename of event log to append timeline to
+.TP
+\fB\-\-git\-trace2\-event\-log\fR=\fI\,GIT_TRACE2_EVENT_LOG\/\fR
+directory to write git trace2 event log to
+.SS "The complete list of recognized repo commands is:"
+.TP
+abandon
+Permanently abandon a development branch
+.TP
+branch
+View current topic branches
+.TP
+branches
+View current topic branches
+.TP
+checkout
+Checkout a branch for development
+.TP
+cherry\-pick
+Cherry\-pick a change.
+.TP
+diff
+Show changes between commit and working tree
+.TP
+diffmanifests
+Manifest diff utility
+.TP
+download
+Download and checkout a change
+.TP
+forall
+Run a shell command in each project
+.TP
+gitc\-delete
+Delete a GITC Client.
+.TP
+gitc\-init
+Initialize a GITC Client.
+.TP
+grep
+Print lines matching a pattern
+.TP
+help
+Display detailed help on a command
+.TP
+info
+Get info on the manifest branch, current branch or unmerged branches
+.TP
+init
+Initialize a repo client checkout in the current directory
+.TP
+list
+List projects and their associated directories
+.TP
+manifest
+Manifest inspection utility
+.TP
+overview
+Display overview of unmerged project branches
+.TP
+prune
+Prune (delete) already merged topics
+.TP
+rebase
+Rebase local branches on upstream branch
+.TP
+selfupdate
+Update repo to the latest version
+.TP
+smartsync
+Update working tree to the latest known good revision
+.TP
+stage
+Stage file(s) for commit
+.TP
+start
+Start a new branch for development
+.TP
+status
+Show the working tree status
+.TP
+sync
+Update working tree to the latest revision
+.TP
+upload
+Upload changes for code review
+.TP
+version
+Display the version of repo
+.PP
+See 'repo help <command>' for more information on a specific command.
+Bug reports: https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue
diff --git a/manifest_xml.py b/manifest_xml.py
index 3814a25..68ead53 100644
--- a/manifest_xml.py
+++ b/manifest_xml.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,33 +12,34 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import collections
 import itertools
 import os
+import platform
 import re
 import sys
 import xml.dom.minidom
-
-from pyversion import is_python3
-if is_python3():
-  import urllib.parse
-else:
-  import imp
-  import urlparse
-  urllib = imp.new_module('urllib')
-  urllib.parse = urlparse
+import urllib.parse
 
 import gitc_utils
-from git_config import GitConfig
+from git_config import GitConfig, IsId
 from git_refs import R_HEADS, HEAD
 import platform_utils
-from project import RemoteSpec, Project, MetaProject
-from error import ManifestParseError, ManifestInvalidRevisionError
+from project import Annotation, RemoteSpec, Project, MetaProject
+from error import (ManifestParseError, ManifestInvalidPathError,
+                   ManifestInvalidRevisionError)
+from wrapper import Wrapper
 
 MANIFEST_FILE_NAME = 'manifest.xml'
 LOCAL_MANIFEST_NAME = 'local_manifest.xml'
 LOCAL_MANIFESTS_DIR_NAME = 'local_manifests'
 
+# Add all projects from local manifest into a group.
+LOCAL_MANIFEST_GROUP_PREFIX = 'local:'
+
+# ContactInfo has the self-registered bug url, supplied by the manifest authors.
+ContactInfo = collections.namedtuple('ContactInfo', 'bugurl')
+
 # urljoin gets confused if the scheme is not known.
 urllib.parse.uses_relative.extend([
     'ssh',
@@ -55,6 +54,61 @@
     'sso',
     'rpc'])
 
+
+def XmlBool(node, attr, default=None):
+  """Determine boolean value of |node|'s |attr|.
+
+  Invalid values will issue a non-fatal warning.
+
+  Args:
+    node: XML node whose attributes we access.
+    attr: The attribute to access.
+    default: If the attribute is not set (value is empty), then use this.
+
+  Returns:
+    True if the attribute is a valid string representing true.
+    False if the attribute is a valid string representing false.
+    |default| otherwise.
+  """
+  value = node.getAttribute(attr)
+  s = value.lower()
+  if s == '':
+    return default
+  elif s in {'yes', 'true', '1'}:
+    return True
+  elif s in {'no', 'false', '0'}:
+    return False
+  else:
+    print('warning: manifest: %s="%s": ignoring invalid XML boolean' %
+          (attr, value), file=sys.stderr)
+    return default
+
+
+def XmlInt(node, attr, default=None):
+  """Determine integer value of |node|'s |attr|.
+
+  Args:
+    node: XML node whose attributes we access.
+    attr: The attribute to access.
+    default: If the attribute is not set (value is empty), then use this.
+
+  Returns:
+    The number if the attribute is a valid number.
+
+  Raises:
+    ManifestParseError: The number is invalid.
+  """
+  value = node.getAttribute(attr)
+  if not value:
+    return default
+
+  try:
+    return int(value)
+  except ValueError:
+    raise ManifestParseError('manifest: invalid %s="%s" integer' %
+                             (attr, value))
+
+
 class _Default(object):
   """Project defaults within the manifest."""
 
@@ -68,11 +122,16 @@
   sync_tags = True
 
   def __eq__(self, other):
+    if not isinstance(other, _Default):
+      return False
     return self.__dict__ == other.__dict__
 
   def __ne__(self, other):
+    if not isinstance(other, _Default):
+      return True
     return self.__dict__ != other.__dict__
 
+
 class _XmlRemote(object):
   def __init__(self,
                name,
@@ -90,14 +149,22 @@
     self.reviewUrl = review
     self.revision = revision
     self.resolvedFetchUrl = self._resolveFetchUrl()
+    self.annotations = []
 
   def __eq__(self, other):
-    return self.__dict__ == other.__dict__
+    if not isinstance(other, _XmlRemote):
+      return False
+    return (sorted(self.annotations) == sorted(other.annotations) and
+      self.name == other.name and self.fetchUrl == other.fetchUrl and
+      self.pushUrl == other.pushUrl and self.remoteAlias == other.remoteAlias
+      and self.reviewUrl == other.reviewUrl and self.revision == other.revision)
 
   def __ne__(self, other):
-    return self.__dict__ != other.__dict__
+    return not self.__eq__(other)
 
   def _resolveFetchUrl(self):
+    if self.fetchUrl is None:
+      return ''
     url = self.fetchUrl.rstrip('/')
     manifestUrl = self.manifestUrl.rstrip('/')
     # urljoin will gets confused over quite a few things.  The ones we care
@@ -126,25 +193,48 @@
                       orig_name=self.name,
                       fetchUrl=self.fetchUrl)
 
+  def AddAnnotation(self, name, value, keep):
+    self.annotations.append(Annotation(name, value, keep))
+
+
 class XmlManifest(object):
   """manages the repo configuration file"""
 
-  def __init__(self, repodir):
+  def __init__(self, repodir, manifest_file, local_manifests=None):
+    """Initialize.
+
+    Args:
+      repodir: Path to the .repo/ dir for holding all internal checkout state.
+          It must be in the top directory of the repo client checkout.
+      manifest_file: Full path to the manifest file to parse.  This will usually
+          be |repodir|/|MANIFEST_FILE_NAME|.
+      local_manifests: Full path to the directory of local override manifests.
+          This will usually be |repodir|/|LOCAL_MANIFESTS_DIR_NAME|.
+    """
+    # TODO(vapier): Move this out of this class.
+    self.globalConfig = GitConfig.ForUser()
+
     self.repodir = os.path.abspath(repodir)
     self.topdir = os.path.dirname(self.repodir)
-    self.manifestFile = os.path.join(self.repodir, MANIFEST_FILE_NAME)
-    self.globalConfig = GitConfig.ForUser()
-    self.localManifestWarning = False
-    self.isGitcClient = False
+    self.manifestFile = manifest_file
+    self.local_manifests = local_manifests
     self._load_local_manifests = True
 
     self.repoProject = MetaProject(self, 'repo',
-      gitdir   = os.path.join(repodir, 'repo/.git'),
-      worktree = os.path.join(repodir, 'repo'))
+                                   gitdir=os.path.join(repodir, 'repo/.git'),
+                                   worktree=os.path.join(repodir, 'repo'))
 
-    self.manifestProject = MetaProject(self, 'manifests',
-      gitdir   = os.path.join(repodir, 'manifests.git'),
-      worktree = os.path.join(repodir, 'manifests'))
+    mp = MetaProject(self, 'manifests',
+                     gitdir=os.path.join(repodir, 'manifests.git'),
+                     worktree=os.path.join(repodir, 'manifests'))
+    self.manifestProject = mp
+
+    # This is a bit hacky, but we're in a chicken & egg situation: all the
+    # normal repo settings live in the manifestProject which we just setup
+    # above, so we couldn't easily query before that.  We assume Project()
+    # init doesn't care if this changes afterwards.
+    if os.path.exists(mp.gitdir) and mp.config.GetBoolean('repo.worktree'):
+      mp.use_git_worktrees = True
 
     self._Unload()
 
@@ -179,12 +269,26 @@
     """
     self.Override(name)
 
-    try:
-      if os.path.lexists(self.manifestFile):
-        platform_utils.remove(self.manifestFile)
-      platform_utils.symlink(os.path.join('manifests', name), self.manifestFile)
-    except OSError as e:
-      raise ManifestParseError('cannot link manifest %s: %s' % (name, str(e)))
+    # Old versions of repo would generate symlinks we need to clean up.
+    platform_utils.remove(self.manifestFile, missing_ok=True)
+    # This file is interpreted as if it existed inside the manifest repo.
+    # That allows us to use <include> with the relative file name.
+    with open(self.manifestFile, 'w') as fp:
+      fp.write("""<?xml version="1.0" encoding="UTF-8"?>
+<!--
+DO NOT EDIT THIS FILE!  It is generated by repo and changes will be discarded.
+If you want to use a different manifest, use `repo init -m <file>` instead.
+
+If you want to customize your checkout by overriding manifest settings, use
+the local_manifests/ directory instead.
+
+For more information on repo manifests, check out:
+https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md
+-->
+<manifest>
+  <include name="%s" />
+</manifest>
+""" % (name,))
 
   def _RemoteToXml(self, r, doc, root):
     e = doc.createElement('remote')
@@ -200,18 +304,28 @@
     if r.revision is not None:
       e.setAttribute('revision', r.revision)
 
-  def _ParseGroups(self, groups):
-    return [x for x in re.split(r'[,\s]+', groups) if x]
+    for a in r.annotations:
+      if a.keep == 'true':
+        ae = doc.createElement('annotation')
+        ae.setAttribute('name', a.name)
+        ae.setAttribute('value', a.value)
+        e.appendChild(ae)
 
-  def Save(self, fd, peg_rev=False, peg_rev_upstream=True, groups=None):
-    """Write the current manifest out to the given file descriptor.
+  def _ParseList(self, field):
+    """Parse fields that contain flattened lists.
+
+    These are whitespace & comma separated.  Empty elements will be discarded.
     """
+    return [x for x in re.split(r'[,\s]+', field) if x]
+
+  def ToXml(self, peg_rev=False, peg_rev_upstream=True, peg_rev_dest_branch=True, groups=None):
+    """Return the current manifest XML."""
     mp = self.manifestProject
 
     if groups is None:
       groups = mp.config.GetString('manifest.groups')
     if groups:
-      groups = self._ParseGroups(groups)
+      groups = self._ParseList(groups)
 
     doc = xml.dom.minidom.Document()
     root = doc.createElement('manifest')
@@ -223,7 +337,7 @@
     if self.notice:
       notice_element = root.appendChild(doc.createElement('notice'))
       notice_lines = self.notice.splitlines()
-      indented_notice = ('\n'.join(" "*4 + line for line in notice_lines))[4:]
+      indented_notice = ('\n'.join(" " * 4 + line for line in notice_lines))[4:]
       notice_element.appendChild(doc.createTextNode(indented_notice))
 
     d = self.default
@@ -308,10 +422,19 @@
             # Only save the origin if the origin is not a sha1, and the default
             # isn't our value
             e.setAttribute('upstream', p.revisionExpr)
+
+        if peg_rev_dest_branch:
+          if p.dest_branch:
+            e.setAttribute('dest-branch', p.dest_branch)
+          elif value != p.revisionExpr:
+            e.setAttribute('dest-branch', p.revisionExpr)
+
       else:
         revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr
         if not revision or revision != p.revisionExpr:
           e.setAttribute('revision', p.revisionExpr)
+        elif p.revisionId:
+          e.setAttribute('revision', p.revisionId)
         if (p.upstream and (p.upstream != p.revisionExpr or
                             p.upstream != d.upstreamExpr)):
           e.setAttribute('upstream', p.upstream)
@@ -372,11 +495,84 @@
                      ' '.join(self._repo_hooks_project.enabled_repo_hooks))
       root.appendChild(e)
 
+    if self._superproject:
+      root.appendChild(doc.createTextNode(''))
+      e = doc.createElement('superproject')
+      e.setAttribute('name', self._superproject['name'])
+      remoteName = None
+      if d.remote:
+        remoteName = d.remote.name
+      remote = self._superproject.get('remote')
+      if not d.remote or remote.orig_name != remoteName:
+        remoteName = remote.orig_name
+        e.setAttribute('remote', remoteName)
+      revision = remote.revision or d.revisionExpr
+      if not revision or revision != self._superproject['revision']:
+        e.setAttribute('revision', self._superproject['revision'])
+      root.appendChild(e)
+
+    if self._contactinfo.bugurl != Wrapper().BUG_URL:
+      root.appendChild(doc.createTextNode(''))
+      e = doc.createElement('contactinfo')
+      e.setAttribute('bugurl', self._contactinfo.bugurl)
+      root.appendChild(e)
+
+    return doc
+
+  def ToDict(self, **kwargs):
+    """Return the current manifest as a dictionary."""
+    # Elements that may only appear once.
+    SINGLE_ELEMENTS = {
+        'notice',
+        'default',
+        'manifest-server',
+        'repo-hooks',
+        'superproject',
+        'contactinfo',
+    }
+    # Elements that may be repeated.
+    MULTI_ELEMENTS = {
+        'remote',
+        'remove-project',
+        'project',
+        'extend-project',
+        'include',
+        # These are children of 'project' nodes.
+        'annotation',
+        'project',
+        'copyfile',
+        'linkfile',
+    }
+
+    doc = self.ToXml(**kwargs)
+    ret = {}
+
+    def append_children(ret, node):
+      for child in node.childNodes:
+        if child.nodeType == xml.dom.Node.ELEMENT_NODE:
+          attrs = child.attributes
+          element = dict((attrs.item(i).localName, attrs.item(i).value)
+                         for i in range(attrs.length))
+          if child.nodeName in SINGLE_ELEMENTS:
+            ret[child.nodeName] = element
+          elif child.nodeName in MULTI_ELEMENTS:
+            ret.setdefault(child.nodeName, []).append(element)
+          else:
+            raise ManifestParseError('Unhandled element "%s"' % (child.nodeName,))
+
+          append_children(element, child)
+
+    append_children(ret, doc.firstChild)
+
+    return ret
+
+  def Save(self, fd, **kwargs):
+    """Write the current manifest out to the given file descriptor."""
+    doc = self.ToXml(**kwargs)
     doc.writexml(fd, '', '  ', '\n', 'UTF-8')
 
   def _output_manifest_project_extras(self, p, e):
     """Manifests can modify e if they support extra project attributes."""
-    pass
 
   @property
   def paths(self):
@@ -404,6 +600,16 @@
     return self._repo_hooks_project
 
   @property
+  def superproject(self):
+    self._Load()
+    return self._superproject
+
+  @property
+  def contactinfo(self):
+    self._Load()
+    return self._contactinfo
+
+  @property
   def notice(self):
     self._Load()
     return self._notice
@@ -414,16 +620,45 @@
     return self._manifest_server
 
   @property
+  def CloneBundle(self):
+    clone_bundle = self.manifestProject.config.GetBoolean('repo.clonebundle')
+    if clone_bundle is None:
+      return False if self.manifestProject.config.GetBoolean('repo.partialclone') else True
+    else:
+      return clone_bundle
+
+  @property
   def CloneFilter(self):
     if self.manifestProject.config.GetBoolean('repo.partialclone'):
       return self.manifestProject.config.GetString('repo.clonefilter')
     return None
 
   @property
+  def PartialCloneExclude(self):
+    exclude = self.manifest.manifestProject.config.GetString(
+        'repo.partialcloneexclude') or ''
+    return set(x.strip() for x in exclude.split(','))
+
+  @property
+  def UseLocalManifests(self):
+    return self._load_local_manifests
+
+  def SetUseLocalManifests(self, value):
+    self._load_local_manifests = value
+
+  @property
+  def HasLocalManifests(self):
+    return self._load_local_manifests and self.local_manifests
+
+  @property
   def IsMirror(self):
     return self.manifestProject.config.GetBoolean('repo.mirror')
 
   @property
+  def UseGitWorktrees(self):
+    return self.manifestProject.config.GetBoolean('repo.worktree')
+
+  @property
   def IsArchive(self):
     return self.manifestProject.config.GetBoolean('repo.archive')
 
@@ -431,6 +666,17 @@
   def HasSubmodules(self):
     return self.manifestProject.config.GetBoolean('repo.submodules')
 
+  def GetDefaultGroupsStr(self):
+    """Returns the default group string for the platform."""
+    return 'default,platform-' + platform.system().lower()
+
+  def GetGroupsStr(self):
+    """Returns the manifest group string that should be synced."""
+    groups = self.manifestProject.config.GetString('manifest.groups')
+    if not groups:
+      groups = self.GetDefaultGroupsStr()
+    return groups
+
   def _Unload(self):
     self._loaded = False
     self._projects = {}
@@ -438,6 +684,8 @@
     self._remotes = {}
     self._default = None
     self._repo_hooks_project = None
+    self._superproject = {}
+    self._contactinfo = ContactInfo(Wrapper().BUG_URL)
     self._notice = None
     self.branch = None
     self._manifest_server = None
@@ -450,28 +698,24 @@
         b = b[len(R_HEADS):]
       self.branch = b
 
+      # The manifestFile was specified by the user, which is why we allow include
+      # paths to point anywhere.
       nodes = []
-      nodes.append(self._ParseManifestXml(self.manifestFile,
-                                          self.manifestProject.worktree))
+      nodes.append(self._ParseManifestXml(
+          self.manifestFile, self.manifestProject.worktree,
+          restrict_includes=False))
 
-      if self._load_local_manifests:
-        local = os.path.join(self.repodir, LOCAL_MANIFEST_NAME)
-        if os.path.exists(local):
-          if not self.localManifestWarning:
-            self.localManifestWarning = True
-            print('warning: %s is deprecated; put local manifests '
-                  'in `%s` instead' % (LOCAL_MANIFEST_NAME,
-                  os.path.join(self.repodir, LOCAL_MANIFESTS_DIR_NAME)),
-                  file=sys.stderr)
-          nodes.append(self._ParseManifestXml(local, self.repodir))
-
-        local_dir = os.path.abspath(os.path.join(self.repodir,
-                                    LOCAL_MANIFESTS_DIR_NAME))
+      if self._load_local_manifests and self.local_manifests:
         try:
-          for local_file in sorted(platform_utils.listdir(local_dir)):
+          for local_file in sorted(platform_utils.listdir(self.local_manifests)):
             if local_file.endswith('.xml'):
-              local = os.path.join(local_dir, local_file)
-              nodes.append(self._ParseManifestXml(local, self.repodir))
+              local = os.path.join(self.local_manifests, local_file)
+              # Since local manifests are entirely managed by the user, allow
+              # them to point anywhere the user wants.
+              nodes.append(self._ParseManifestXml(
+                  local, self.repodir,
+                  parent_groups=f'{LOCAL_MANIFEST_GROUP_PREFIX}:{local_file[:-4]}',
+                  restrict_includes=False))
         except OSError:
           pass
 
@@ -489,7 +733,19 @@
 
       self._loaded = True
 
-  def _ParseManifestXml(self, path, include_root):
+  def _ParseManifestXml(self, path, include_root, parent_groups='',
+                        restrict_includes=True):
+    """Parse a manifest XML and return the computed nodes.
+
+    Args:
+      path: The XML file to read & parse.
+      include_root: The path to interpret include "name"s relative to.
+      parent_groups: The groups to apply to these projects.
+      restrict_includes: Whether to constrain the "name" attribute of includes.
+
+    Returns:
+      List of XML nodes.
+    """
     try:
       root = xml.dom.minidom.parse(path)
     except (OSError, xml.parsers.expat.ExpatError) as e:
@@ -508,20 +764,35 @@
     for node in manifest.childNodes:
       if node.nodeName == 'include':
         name = self._reqatt(node, 'name')
+        if restrict_includes:
+          msg = self._CheckLocalPath(name)
+          if msg:
+            raise ManifestInvalidPathError(
+                '<include> invalid "name": %s: %s' % (name, msg))
+        include_groups = ''
+        if parent_groups:
+          include_groups = parent_groups
+        if node.hasAttribute('groups'):
+          include_groups = node.getAttribute('groups') + ',' + include_groups
         fp = os.path.join(include_root, name)
         if not os.path.isfile(fp):
-          raise ManifestParseError("include %s doesn't exist or isn't a file"
-              % (name,))
+          raise ManifestParseError("include [%s/]%s doesn't exist or isn't a file"
+                                   % (include_root, name))
         try:
-          nodes.extend(self._ParseManifestXml(fp, include_root))
+          nodes.extend(self._ParseManifestXml(fp, include_root, include_groups))
         # should isolate this to the exact exception, but that's
         # tricky.  actual parsing implementation may vary.
-        except (KeyboardInterrupt, RuntimeError, SystemExit):
+        except (KeyboardInterrupt, RuntimeError, SystemExit, ManifestParseError):
           raise
         except Exception as e:
           raise ManifestParseError(
               "failed parsing included manifest %s: %s" % (name, e))
       else:
+        if parent_groups and node.nodeName == 'project':
+          nodeGroups = parent_groups
+          if node.hasAttribute('groups'):
+            nodeGroups = node.getAttribute('groups') + ',' + nodeGroups
+          node.setAttribute('groups', nodeGroups)
         nodes.append(node)
     return nodes
 
@@ -541,9 +812,10 @@
     for node in itertools.chain(*node_list):
       if node.nodeName == 'default':
         new_default = self._ParseDefault(node)
+        emptyDefault = not node.hasAttributes() and not node.hasChildNodes()
         if self._default is None:
           self._default = new_default
-        elif new_default != self._default:
+        elif not emptyDefault and new_default != self._default:
           raise ManifestParseError('duplicate default in %s' %
                                    (self.manifestFile))
 
@@ -582,6 +854,8 @@
       for subproject in project.subprojects:
         recursively_add_projects(subproject)
 
+    repo_hooks_project = None
+    enabled_repo_hooks = None
     for node in itertools.chain(*node_list):
       if node.nodeName == 'project':
         project = self._ParseProject(node)
@@ -594,61 +868,108 @@
                                    'project: %s' % name)
 
         path = node.getAttribute('path')
+        dest_path = node.getAttribute('dest-path')
         groups = node.getAttribute('groups')
         if groups:
-          groups = self._ParseGroups(groups)
+          groups = self._ParseList(groups)
         revision = node.getAttribute('revision')
+        remote = node.getAttribute('remote')
+        if remote:
+          remote = self._get_remote(node)
 
+        named_projects = self._projects[name]
+        if dest_path and not path and len(named_projects) > 1:
+          raise ManifestParseError('extend-project cannot use dest-path when '
+                                   'matching multiple projects: %s' % name)
         for p in self._projects[name]:
           if path and p.relpath != path:
             continue
           if groups:
             p.groups.extend(groups)
           if revision:
-            p.revisionExpr = revision
-      if node.nodeName == 'repo-hooks':
-        # Get the name of the project and the (space-separated) list of enabled.
-        repo_hooks_project = self._reqatt(node, 'in-project')
-        enabled_repo_hooks = self._reqatt(node, 'enabled-list').split()
+            p.SetRevision(revision)
 
+          if remote:
+            p.remote = remote.ToRemoteSpec(name)
+
+          if dest_path:
+            del self._paths[p.relpath]
+            relpath, worktree, gitdir, objdir, _ = self.GetProjectPaths(name, dest_path)
+            p.UpdatePaths(relpath, worktree, gitdir, objdir)
+            self._paths[p.relpath] = p
+
+      if node.nodeName == 'repo-hooks':
         # Only one project can be the hooks project
-        if self._repo_hooks_project is not None:
+        if repo_hooks_project is not None:
           raise ManifestParseError(
               'duplicate repo-hooks in %s' %
               (self.manifestFile))
 
-        # Store a reference to the Project.
-        try:
-          repo_hooks_projects = self._projects[repo_hooks_project]
-        except KeyError:
+        # Get the name of the project and the (space-separated) list of enabled hooks.

+        repo_hooks_project = self._reqatt(node, 'in-project')
+        enabled_repo_hooks = self._ParseList(self._reqatt(node, 'enabled-list'))
+      if node.nodeName == 'superproject':
+        name = self._reqatt(node, 'name')
+        # There can only be one superproject.
+        if self._superproject.get('name'):
           raise ManifestParseError(
-              'project %s not found for repo-hooks' %
-              (repo_hooks_project))
-
-        if len(repo_hooks_projects) != 1:
-          raise ManifestParseError(
-              'internal error parsing repo-hooks in %s' %
+              'duplicate superproject in %s' %
               (self.manifestFile))
-        self._repo_hooks_project = repo_hooks_projects[0]
+        self._superproject['name'] = name
+        remote_name = node.getAttribute('remote')
+        if not remote_name:
+          remote = self._default.remote
+        else:
+          remote = self._get_remote(node)
+        if remote is None:
+          raise ManifestParseError("no remote for superproject %s within %s" %
+                                   (name, self.manifestFile))
+        self._superproject['remote'] = remote.ToRemoteSpec(name)
+        revision = node.getAttribute('revision') or remote.revision
+        if not revision:
+          revision = self._default.revisionExpr
+        if not revision:
+          raise ManifestParseError('no revision for superproject %s within %s' %
+                                   (name, self.manifestFile))
+        self._superproject['revision'] = revision
+      if node.nodeName == 'contactinfo':
+        bugurl = self._reqatt(node, 'bugurl')
+        # This element can be repeated, later entries will clobber earlier ones.
+        self._contactinfo = ContactInfo(bugurl)
 
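`ContactInfo` isn't defined in this hunk; given how it is constructed above (a single positional `bugurl`), a namedtuple is one plausible shape — a sketch, not necessarily the real definition:

```python
# Plausible (assumed) shape of the ContactInfo type constructed above.
import collections

ContactInfo = collections.namedtuple('ContactInfo', 'bugurl')
info = ContactInfo('https://example.com/bugs')  # hypothetical URL
```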
-        # Store the enabled hooks in the Project object.
-        self._repo_hooks_project.enabled_repo_hooks = enabled_repo_hooks
       if node.nodeName == 'remove-project':
         name = self._reqatt(node, 'name')
 
-        if name not in self._projects:
+        if name in self._projects:
+          for p in self._projects[name]:
+            del self._paths[p.relpath]
+          del self._projects[name]
+
+          # If the manifest removes the hooks project, treat it as if it deleted
+          # the repo-hooks element too.
+          if repo_hooks_project == name:
+            repo_hooks_project = None
+        elif not XmlBool(node, 'optional', False):
           raise ManifestParseError('remove-project element specifies non-existent '
                                    'project: %s' % name)
 
-        for p in self._projects[name]:
-          del self._paths[p.relpath]
-        del self._projects[name]
+    # Store repo hooks project information.
+    if repo_hooks_project:
+      # Store a reference to the Project.
+      try:
+        repo_hooks_projects = self._projects[repo_hooks_project]
+      except KeyError:
+        raise ManifestParseError(
+            'project %s not found for repo-hooks' %
+            (repo_hooks_project))
 
-        # If the manifest removes the hooks project, treat it as if it deleted
-        # the repo-hooks element too.
-        if self._repo_hooks_project and (self._repo_hooks_project.name == name):
-          self._repo_hooks_project = None
-
+      if len(repo_hooks_projects) != 1:
+        raise ManifestParseError(
+            'internal error parsing repo-hooks in %s' %
+            (self.manifestFile))
+      self._repo_hooks_project = repo_hooks_projects[0]
+      # Store the enabled hooks in the Project object.
+      self._repo_hooks_project.enabled_repo_hooks = enabled_repo_hooks
 
   def _AddMetaProjectMirror(self, m):
     name = None
@@ -676,15 +997,15 @@
     if name not in self._projects:
       m.PreSync()
       gitdir = os.path.join(self.topdir, '%s.git' % name)
-      project = Project(manifest = self,
-                        name = name,
-                        remote = remote.ToRemoteSpec(name),
-                        gitdir = gitdir,
-                        objdir = gitdir,
-                        worktree = None,
-                        relpath = name or None,
-                        revisionExpr = m.revisionExpr,
-                        revisionId = None)
+      project = Project(manifest=self,
+                        name=name,
+                        remote=remote.ToRemoteSpec(name),
+                        gitdir=gitdir,
+                        objdir=gitdir,
+                        worktree=None,
+                        relpath=name or None,
+                        revisionExpr=m.revisionExpr,
+                        revisionId=None)
       self._projects[project.name] = [project]
       self._paths[project.relpath] = project
 
@@ -707,7 +1028,14 @@
     if revision == '':
       revision = None
     manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
-    return _XmlRemote(name, alias, fetch, pushUrl, manifestUrl, review, revision)
+
+    remote = _XmlRemote(name, alias, fetch, pushUrl, manifestUrl, review, revision)
+
+    for n in node.childNodes:
+      if n.nodeName == 'annotation':
+        self._ParseAnnotation(remote, n)
+
+    return remote
 
   def _ParseDefault(self, node):
     """
@@ -722,29 +1050,14 @@
     d.destBranchExpr = node.getAttribute('dest-branch') or None
     d.upstreamExpr = node.getAttribute('upstream') or None
 
-    sync_j = node.getAttribute('sync-j')
-    if sync_j == '' or sync_j is None:
-      d.sync_j = 1
-    else:
-      d.sync_j = int(sync_j)
+    d.sync_j = XmlInt(node, 'sync-j', 1)
+    if d.sync_j <= 0:
+      raise ManifestParseError('%s: sync-j must be greater than 0, not "%s"' %
+                               (self.manifestFile, d.sync_j))
 
-    sync_c = node.getAttribute('sync-c')
-    if not sync_c:
-      d.sync_c = False
-    else:
-      d.sync_c = sync_c.lower() in ("yes", "true", "1")
-
-    sync_s = node.getAttribute('sync-s')
-    if not sync_s:
-      d.sync_s = False
-    else:
-      d.sync_s = sync_s.lower() in ("yes", "true", "1")
-
-    sync_tags = node.getAttribute('sync-tags')
-    if not sync_tags:
-      d.sync_tags = True
-    else:
-      d.sync_tags = sync_tags.lower() in ("yes", "true", "1")
+    d.sync_c = XmlBool(node, 'sync-c', False)
+    d.sync_s = XmlBool(node, 'sync-s', False)
+    d.sync_tags = XmlBool(node, 'sync-tags', True)
     return d
 
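The `XmlBool`/`XmlInt` helpers referenced above are defined outside this hunk. A hedged reconstruction, with signatures inferred purely from the call sites in this diff (the real implementations and their error handling may differ):

```python
# Hedged reconstruction of XmlBool/XmlInt; signatures inferred from the call
# sites above (XmlInt(node, 'sync-j', 1), XmlBool(node, 'sync-c', False), ...).
import xml.dom.minidom


def XmlBool(node, attr, default=None):
  """Parse attribute |attr| of |node| as a boolean, or |default| if unset."""
  value = node.getAttribute(attr)
  if not value:
    return default
  return value.lower() in ('yes', 'true', '1')


def XmlInt(node, attr, default=None):
  """Parse attribute |attr| of |node| as an integer, or |default| if unset."""
  value = node.getAttribute(attr)
  if not value:
    return default
  try:
    return int(value)
  except ValueError:
    raise ValueError('invalid %s: %s' % (attr, value))


node = xml.dom.minidom.parseString(
    '<default sync-j="4" sync-c="yes"/>').documentElement
```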
   def _ParseNotice(self, node):
@@ -792,11 +1105,15 @@
   def _UnjoinName(self, parent_name, name):
     return os.path.relpath(name, parent_name)
 
-  def _ParseProject(self, node, parent = None, **extra_proj_attrs):
+  def _ParseProject(self, node, parent=None, **extra_proj_attrs):
     """
     reads a <project> element from the manifest file
     """
     name = self._reqatt(node, 'name')
+    msg = self._CheckLocalPath(name, dir_ok=True)
+    if msg:
+      raise ManifestInvalidPathError(
+          '<project> invalid "name": %s: %s' % (name, msg))
     if parent:
       name = self._JoinName(parent.name, name)
 
@@ -805,55 +1122,34 @@
       remote = self._default.remote
     if remote is None:
       raise ManifestParseError("no remote for project %s within %s" %
-            (name, self.manifestFile))
+                               (name, self.manifestFile))
 
     revisionExpr = node.getAttribute('revision') or remote.revision
     if not revisionExpr:
       revisionExpr = self._default.revisionExpr
     if not revisionExpr:
       raise ManifestParseError("no revision for project %s within %s" %
-            (name, self.manifestFile))
+                               (name, self.manifestFile))
 
     path = node.getAttribute('path')
     if not path:
       path = name
-    if path.startswith('/'):
-      raise ManifestParseError("project %s path cannot be absolute in %s" %
-            (name, self.manifestFile))
-
-    rebase = node.getAttribute('rebase')
-    if not rebase:
-      rebase = True
     else:
-      rebase = rebase.lower() in ("yes", "true", "1")
+      # NB: The "." project is handled specially in Project.Sync_LocalHalf.
+      msg = self._CheckLocalPath(path, dir_ok=True, cwd_dot_ok=True)
+      if msg:
+        raise ManifestInvalidPathError(
+            '<project> invalid "path": %s: %s' % (path, msg))
 
-    sync_c = node.getAttribute('sync-c')
-    if not sync_c:
-      sync_c = False
-    else:
-      sync_c = sync_c.lower() in ("yes", "true", "1")
+    rebase = XmlBool(node, 'rebase', True)
+    sync_c = XmlBool(node, 'sync-c', False)
+    sync_s = XmlBool(node, 'sync-s', self._default.sync_s)
+    sync_tags = XmlBool(node, 'sync-tags', self._default.sync_tags)
 
-    sync_s = node.getAttribute('sync-s')
-    if not sync_s:
-      sync_s = self._default.sync_s
-    else:
-      sync_s = sync_s.lower() in ("yes", "true", "1")
-
-    sync_tags = node.getAttribute('sync-tags')
-    if not sync_tags:
-      sync_tags = self._default.sync_tags
-    else:
-      sync_tags = sync_tags.lower() in ("yes", "true", "1")
-
-    clone_depth = node.getAttribute('clone-depth')
-    if clone_depth:
-      try:
-        clone_depth = int(clone_depth)
-        if  clone_depth <= 0:
-          raise ValueError()
-      except ValueError:
-        raise ManifestParseError('invalid clone-depth %s in %s' %
-                                 (clone_depth, self.manifestFile))
+    clone_depth = XmlInt(node, 'clone-depth')
+    if clone_depth is not None and clone_depth <= 0:
+      raise ManifestParseError('%s: clone-depth must be greater than 0, not "%s"' %
+                               (self.manifestFile, clone_depth))
 
     dest_branch = node.getAttribute('dest-branch') or self._default.destBranchExpr
 
@@ -862,11 +1158,13 @@
     groups = ''
     if node.hasAttribute('groups'):
       groups = node.getAttribute('groups')
-    groups = self._ParseGroups(groups)
+    groups = self._ParseList(groups)
 
     if parent is None:
-      relpath, worktree, gitdir, objdir = self.GetProjectPaths(name, path)
+      relpath, worktree, gitdir, objdir, use_git_worktrees = \
+          self.GetProjectPaths(name, path)
     else:
+      use_git_worktrees = False
       relpath, worktree, gitdir, objdir = \
           self.GetSubprojectPaths(parent, name, path)
 
@@ -874,27 +1172,28 @@
     groups.extend(set(default_groups).difference(groups))
 
     if self.IsMirror and node.hasAttribute('force-path'):
-      if node.getAttribute('force-path').lower() in ("yes", "true", "1"):
+      if XmlBool(node, 'force-path', False):
         gitdir = os.path.join(self.topdir, '%s.git' % path)
 
-    project = Project(manifest = self,
-                      name = name,
-                      remote = remote.ToRemoteSpec(name),
-                      gitdir = gitdir,
-                      objdir = objdir,
-                      worktree = worktree,
-                      relpath = relpath,
-                      revisionExpr = revisionExpr,
-                      revisionId = None,
-                      rebase = rebase,
-                      groups = groups,
-                      sync_c = sync_c,
-                      sync_s = sync_s,
-                      sync_tags = sync_tags,
-                      clone_depth = clone_depth,
-                      upstream = upstream,
-                      parent = parent,
-                      dest_branch = dest_branch,
+    project = Project(manifest=self,
+                      name=name,
+                      remote=remote.ToRemoteSpec(name),
+                      gitdir=gitdir,
+                      objdir=objdir,
+                      worktree=worktree,
+                      relpath=relpath,
+                      revisionExpr=revisionExpr,
+                      revisionId=None,
+                      rebase=rebase,
+                      groups=groups,
+                      sync_c=sync_c,
+                      sync_s=sync_s,
+                      sync_tags=sync_tags,
+                      clone_depth=clone_depth,
+                      upstream=upstream,
+                      parent=parent,
+                      dest_branch=dest_branch,
+                      use_git_worktrees=use_git_worktrees,
                       **extra_proj_attrs)
 
     for n in node.childNodes:
@@ -905,11 +1204,16 @@
       if n.nodeName == 'annotation':
         self._ParseAnnotation(project, n)
       if n.nodeName == 'project':
-        project.subprojects.append(self._ParseProject(n, parent = project))
+        project.subprojects.append(self._ParseProject(n, parent=project))
 
     return project
 
   def GetProjectPaths(self, name, path):
+    # The manifest entries might have trailing slashes.  Normalize them to avoid
+    # unexpected filesystem behavior since we do string concatenation below.
+    path = path.rstrip('/')
+    name = name.rstrip('/')
+    use_git_worktrees = False
     relpath = path
     if self.IsMirror:
       worktree = None
@@ -918,8 +1222,15 @@
     else:
       worktree = os.path.join(self.topdir, path).replace('\\', '/')
       gitdir = os.path.join(self.repodir, 'projects', '%s.git' % path)
-      objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name)
-    return relpath, worktree, gitdir, objdir
+      # We allow people to mix git worktrees & non-git worktrees for now.
+      # This allows for in situ migration of repo clients.
+      if os.path.exists(gitdir) or not self.UseGitWorktrees:
+        objdir = os.path.join(self.repodir, 'project-objects', '%s.git' % name)
+      else:
+        use_git_worktrees = True
+        gitdir = os.path.join(self.repodir, 'worktrees', '%s.git' % name)
+        objdir = gitdir
+    return relpath, worktree, gitdir, objdir, use_git_worktrees
 
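A toy walkthrough of the classic (non-worktree) layout computed by `GetProjectPaths` above, using hypothetical project names rather than any real manifest:

```python
# Classic layout: worktree under topdir, gitdir keyed by path, objdir keyed by
# name (so projects sharing a name share objects).  Names here are made up.
import os

repodir, topdir = '.repo', '/src/checkout'
name, path = 'platform/build', 'build'

worktree = os.path.join(topdir, path).replace('\\', '/')
gitdir = os.path.join(repodir, 'projects', '%s.git' % path)
objdir = os.path.join(repodir, 'project-objects', '%s.git' % name)
```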
   def GetProjectsWithName(self, name):
     return self._projects.get(name, [])
@@ -934,6 +1245,10 @@
     return os.path.relpath(relpath, parent_relpath)
 
   def GetSubprojectPaths(self, parent, name, path):
+    # The manifest entries might have trailing slashes.  Normalize them to avoid
+    # unexpected filesystem behavior since we do string concatenation below.
+    path = path.rstrip('/')
+    name = name.rstrip('/')
     relpath = self._JoinRelpath(parent.relpath, path)
     gitdir = os.path.join(parent.gitdir, 'subprojects', '%s.git' % path)
     objdir = os.path.join(parent.gitdir, 'subproject-objects', '%s.git' % name)
@@ -943,23 +1258,151 @@
       worktree = os.path.join(parent.worktree, path).replace('\\', '/')
     return relpath, worktree, gitdir, objdir
 
+  @staticmethod
+  def _CheckLocalPath(path, dir_ok=False, cwd_dot_ok=False):
+    """Verify |path| is reasonable for use in filesystem paths.
+
+    Used with <copyfile> & <linkfile> & <project> elements.
+
+    This only validates the |path| in isolation: it does not check against the
+    current filesystem state.  Thus it is suitable as a first-pass in a parser.
+
+    It enforces a number of constraints:
+    * No empty paths.
+    * No "~" in paths.
+    * No Unicode codepoints that filesystems might elide when normalizing.
+    * No relative path components like "." or "..".
+    * No absolute paths.
+    * No ".git" or ".repo*" path components.
+
+    Args:
+      path: The path name to validate.
+      dir_ok: Whether |path| may force a directory (e.g. end in a /).
+      cwd_dot_ok: Whether |path| may be just ".".
+
+    Returns:
+      None if |path| is OK, a failure message otherwise.
+    """
+    if not path:
+      return 'empty paths not allowed'
+
+    if '~' in path:
+      return '~ not allowed (due to 8.3 filenames on Windows filesystems)'
+
+    path_codepoints = set(path)
+
+    # Some filesystems (like Apple's HFS+) try to normalize Unicode codepoints
+    # which means there are alternative names for ".git".  Reject paths with
+    # these in it as there shouldn't be any reasonable need for them here.
+    # The set of codepoints here was cribbed from jgit's implementation:
+    # https://eclipse.googlesource.com/jgit/jgit/+/9110037e3e9461ff4dac22fee84ef3694ed57648/org.eclipse.jgit/src/org/eclipse/jgit/lib/ObjectChecker.java#884
+    BAD_CODEPOINTS = {
+        u'\u200C',  # ZERO WIDTH NON-JOINER
+        u'\u200D',  # ZERO WIDTH JOINER
+        u'\u200E',  # LEFT-TO-RIGHT MARK
+        u'\u200F',  # RIGHT-TO-LEFT MARK
+        u'\u202A',  # LEFT-TO-RIGHT EMBEDDING
+        u'\u202B',  # RIGHT-TO-LEFT EMBEDDING
+        u'\u202C',  # POP DIRECTIONAL FORMATTING
+        u'\u202D',  # LEFT-TO-RIGHT OVERRIDE
+        u'\u202E',  # RIGHT-TO-LEFT OVERRIDE
+        u'\u206A',  # INHIBIT SYMMETRIC SWAPPING
+        u'\u206B',  # ACTIVATE SYMMETRIC SWAPPING
+        u'\u206C',  # INHIBIT ARABIC FORM SHAPING
+        u'\u206D',  # ACTIVATE ARABIC FORM SHAPING
+        u'\u206E',  # NATIONAL DIGIT SHAPES
+        u'\u206F',  # NOMINAL DIGIT SHAPES
+        u'\uFEFF',  # ZERO WIDTH NO-BREAK SPACE
+    }
+    if BAD_CODEPOINTS & path_codepoints:
+      # This message is more expansive than reality, but should be fine.
+      return 'Unicode combining characters not allowed'
+
+    # Reject newlines as there shouldn't be any legitimate use for them; they'll
+    # be confusing to users, and they can easily break tools that expect to be
+    # able to iterate over newline delimited lists.  This even applies to our
+    # own code like .repo/project.list.
+    if {'\r', '\n'} & path_codepoints:
+      return 'Newlines not allowed'
+
+    # Assume paths might be used on case-insensitive filesystems.
+    path = path.lower()
+
+    # Split up the path by its components.  We can't use os.path.sep exclusively
+    # as some platforms (like Windows) will convert / to \ and that bypasses all
+    # our constructed logic here.  Especially since manifest authors only use
+    # / in their paths.
+    resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
+    # Strip off trailing slashes as those only produce '' elements, and we use
+    # parts to look for individual bad components.
+    parts = resep.split(path.rstrip('/'))
+
+    # Some people use src="." to create stable links to projects.  Let's allow
+    # that but reject all other uses of "." to keep things simple.
+    if not cwd_dot_ok or parts != ['.']:
+      for part in set(parts):
+        if part in {'.', '..', '.git'} or part.startswith('.repo'):
+          return 'bad component: %s' % (part,)
+
+    if not dir_ok and resep.match(path[-1]):
+      return 'dirs not allowed'
+
+    # NB: The two abspath checks here are to handle platforms with multiple
+    # filesystem path styles (e.g. Windows).
+    norm = os.path.normpath(path)
+    if (norm == '..' or
+        (len(norm) >= 3 and norm.startswith('..') and resep.match(norm[2])) or
+        os.path.isabs(norm) or
+        norm.startswith('/')):
+      return 'path cannot be outside'
+
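In isolation, the component checks above behave like this condensed, standalone re-creation (a sketch for illustration, not the repo implementation — it omits the codepoint and newline checks):

```python
# Condensed sketch of _CheckLocalPath's component/traversal checks, showing
# which paths get rejected and why.  Not the real implementation.
import os
import re


def check_local_path(path, dir_ok=False, cwd_dot_ok=False):
  """Return None if |path| looks safe, else a failure message."""
  if not path:
    return 'empty paths not allowed'
  # Split on both / and the native separator, as the code above does.
  resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
  parts = resep.split(path.lower().rstrip('/'))
  # Allow src="." only when the caller explicitly opts in.
  if not cwd_dot_ok or parts != ['.']:
    for part in set(parts):
      if part in {'.', '..', '.git'} or part.startswith('.repo'):
        return 'bad component: %s' % (part,)
  if not dir_ok and resep.match(path[-1]):
    return 'dirs not allowed'
  norm = os.path.normpath(path)
  if norm == '..' or os.path.isabs(norm) or norm.startswith('/'):
    return 'path cannot be outside'
  return None
```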
+  @classmethod
+  def _ValidateFilePaths(cls, element, src, dest):
+    """Verify |src| & |dest| are reasonable for <copyfile> & <linkfile>.
+
+    We verify the path independent of any filesystem state as we won't have a
+    checkout available to compare to.  i.e. This is for parsing validation
+    purposes only.
+
+    We'll do full/live sanity checking before we do the actual filesystem
+    modifications in _CopyFile/_LinkFile/etc...
+    """
+    # |dest| is the file we write to or symlink we create.
+    # It is relative to the top of the repo client checkout.
+    msg = cls._CheckLocalPath(dest)
+    if msg:
+      raise ManifestInvalidPathError(
+          '<%s> invalid "dest": %s: %s' % (element, dest, msg))
+
+    # |src| is the file we read from or path we point to for symlinks.
+    # It is relative to the top of the git project checkout.
+    is_linkfile = element == 'linkfile'
+    msg = cls._CheckLocalPath(src, dir_ok=is_linkfile, cwd_dot_ok=is_linkfile)
+    if msg:
+      raise ManifestInvalidPathError(
+          '<%s> invalid "src": %s: %s' % (element, src, msg))
+
   def _ParseCopyFile(self, project, node):
     src = self._reqatt(node, 'src')
     dest = self._reqatt(node, 'dest')
     if not self.IsMirror:
       # src is project relative;
-      # dest is relative to the top of the tree
-      project.AddCopyFile(src, dest, os.path.join(self.topdir, dest))
+      # dest is relative to the top of the tree.
+      # We only validate paths if we actually plan to process them.
+      self._ValidateFilePaths('copyfile', src, dest)
+      project.AddCopyFile(src, dest, self.topdir)
 
   def _ParseLinkFile(self, project, node):
     src = self._reqatt(node, 'src')
     dest = self._reqatt(node, 'dest')
     if not self.IsMirror:
       # src is project relative;
-      # dest is relative to the top of the tree
-      project.AddLinkFile(src, dest, os.path.join(self.topdir, dest))
+      # dest is relative to the top of the tree.
+      # We only validate paths if we actually plan to process them.
+      self._ValidateFilePaths('linkfile', src, dest)
+      project.AddLinkFile(src, dest, self.topdir)
 
-  def _ParseAnnotation(self, project, node):
+  def _ParseAnnotation(self, element, node):
     name = self._reqatt(node, 'name')
     value = self._reqatt(node, 'value')
     try:
@@ -968,8 +1411,8 @@
       keep = "true"
     if keep != "true" and keep != "false":
       raise ManifestParseError('optional "keep" attribute must be '
-            '"true" or "false"')
-    project.AddAnnotation(name, value, keep)
+                               '"true" or "false"')
+    element.AddAnnotation(name, value, keep)
 
   def _get_remote(self, node):
     name = node.getAttribute('remote')
@@ -979,7 +1422,7 @@
     v = self._remotes.get(name)
     if not v:
       raise ManifestParseError("remote %s not defined in %s" %
-            (name, self.manifestFile))
+                               (name, self.manifestFile))
     return v
 
   def _reqatt(self, node, attname):
@@ -989,7 +1432,7 @@
     v = node.getAttribute(attname)
     if not v:
       raise ManifestParseError("no %s in <%s> within %s" %
-            (attname, node.nodeName, self.manifestFile))
+                               (attname, node.nodeName, self.manifestFile))
     return v
 
   def projectsDiff(self, manifest):
@@ -1007,7 +1450,7 @@
     diff = {'added': [], 'removed': [], 'changed': [], 'unreachable': []}
 
     for proj in fromKeys:
-      if not proj in toKeys:
+      if proj not in toKeys:
         diff['removed'].append(fromProjects[proj])
       else:
         fromProj = fromProjects[proj]
@@ -1029,19 +1472,11 @@
 
 
 class GitcManifest(XmlManifest):
+  """Parser for GitC (git-in-the-cloud) manifests."""
 
-  def __init__(self, repodir, gitc_client_name):
-    """Initialize the GitcManifest object."""
-    super(GitcManifest, self).__init__(repodir)
-    self.isGitcClient = True
-    self.gitc_client_name = gitc_client_name
-    self.gitc_client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
-                                        gitc_client_name)
-    self.manifestFile = os.path.join(self.gitc_client_dir, '.manifest')
-
-  def _ParseProject(self, node, parent = None):
+  def _ParseProject(self, node, parent=None):
     """Override _ParseProject and add support for GITC specific attributes."""
-    return super(GitcManifest, self)._ParseProject(
+    return super()._ParseProject(
         node, parent=parent, old_revision=node.getAttribute('old-revision'))
 
   def _output_manifest_project_extras(self, p, e):
@@ -1049,3 +1484,36 @@
     if p.old_revision:
       e.setAttribute('old-revision', str(p.old_revision))
 
+
+class RepoClient(XmlManifest):
+  """Manages a repo client checkout."""
+
+  def __init__(self, repodir, manifest_file=None):
+    self.isGitcClient = False
+
+    if os.path.exists(os.path.join(repodir, LOCAL_MANIFEST_NAME)):
+      print('error: %s is not supported; put local manifests in `%s` instead' %
+            (LOCAL_MANIFEST_NAME, os.path.join(repodir, LOCAL_MANIFESTS_DIR_NAME)),
+            file=sys.stderr)
+      sys.exit(1)
+
+    if manifest_file is None:
+      manifest_file = os.path.join(repodir, MANIFEST_FILE_NAME)
+    local_manifests = os.path.abspath(os.path.join(repodir, LOCAL_MANIFESTS_DIR_NAME))
+    super().__init__(repodir, manifest_file, local_manifests)
+
+    # TODO: Completely separate manifest logic out of the client.
+    self.manifest = self
+
+
+class GitcClient(RepoClient, GitcManifest):
+  """Manages a GitC client checkout."""
+
+  def __init__(self, repodir, gitc_client_name):
+    """Initialize the GitcManifest object."""
+    self.gitc_client_name = gitc_client_name
+    self.gitc_client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
+                                        gitc_client_name)
+
+    super().__init__(repodir, os.path.join(self.gitc_client_dir, '.manifest'))
+    self.isGitcClient = True
diff --git a/pager.py b/pager.py
index 221baf3..352923d 100644
--- a/pager.py
+++ b/pager.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import os
 import select
 import subprocess
@@ -27,6 +24,7 @@
 old_stdout = None
 old_stderr = None
 
+
 def RunPager(globalConfig):
   if not os.isatty(0) or not os.isatty(1):
     return
@@ -35,33 +33,37 @@
     return
 
   if platform_utils.isWindows():
-    _PipePager(pager);
+    _PipePager(pager)
   else:
     _ForkPager(pager)
 
+
 def TerminatePager():
   global pager_process, old_stdout, old_stderr
   if pager_process:
     sys.stdout.flush()
     sys.stderr.flush()
     pager_process.stdin.close()
-    pager_process.wait();
+    pager_process.wait()
     pager_process = None
     # Restore initial stdout/err in case there is more output in this process
     # after shutting down the pager process
     sys.stdout = old_stdout
     sys.stderr = old_stderr
 
+
 def _PipePager(pager):
   global pager_process, old_stdout, old_stderr
   assert pager_process is None, "Only one active pager process at a time"
   # Create pager process, piping stdout/err into its stdin
-  pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout, stderr=sys.stderr)
+  pager_process = subprocess.Popen([pager], stdin=subprocess.PIPE, stdout=sys.stdout,
+                                   stderr=sys.stderr)
   old_stdout = sys.stdout
   old_stderr = sys.stderr
   sys.stdout = pager_process.stdin
   sys.stderr = pager_process.stdin
 
+
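The pipe-pager pattern used by `_PipePager` above — a child process whose stdin receives our output via a pipe — can be demonstrated standalone. `cat` stands in for a real pager such as `less` so the example stays non-interactive:

```python
# Standalone illustration of piping output into a pager-like child process.
# `cat` substitutes for a real pager; the real code also swaps sys.stdout.
import subprocess

proc = subprocess.Popen(['cat'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, _ = proc.communicate(b'hello from the pager pipe\n')
```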
 def _ForkPager(pager):
   global active
   # This process turns into the pager; a child it forks will
@@ -88,6 +90,7 @@
     print("fatal: cannot start pager '%s'" % pager, file=sys.stderr)
     sys.exit(255)
 
+
 def _SelectPager(globalConfig):
   try:
     return os.environ['GIT_PAGER']
@@ -105,6 +108,7 @@
 
   return 'less'
 
+
 def _BecomePager(pager):
   # Delaying execution of the pager until we have output
   # ready works around a long-standing bug in popularly
diff --git a/platform_utils.py b/platform_utils.py
index 06ef9b1..0203249 100644
--- a/platform_utils.py
+++ b/platform_utils.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2016 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -17,18 +15,9 @@
 import errno
 import os
 import platform
-import select
 import shutil
 import stat
 
-from pyversion import is_python3
-if is_python3():
-  from queue import Queue
-else:
-  from Queue import Queue
-
-from threading import Thread
-
 
 def isWindows():
   """ Returns True when running with the native port of Python for Windows,
@@ -39,145 +28,6 @@
   return platform.system() == "Windows"
 
 
-class FileDescriptorStreams(object):
-  """ Platform agnostic abstraction enabling non-blocking I/O over a
-  collection of file descriptors. This abstraction is required because
-  fctnl(os.O_NONBLOCK) is not supported on Windows.
-  """
-  @classmethod
-  def create(cls):
-    """ Factory method: instantiates the concrete class according to the
-    current platform.
-    """
-    if isWindows():
-      return _FileDescriptorStreamsThreads()
-    else:
-      return _FileDescriptorStreamsNonBlocking()
-
-  def __init__(self):
-    self.streams = []
-
-  def add(self, fd, dest, std_name):
-    """ Wraps an existing file descriptor as a stream.
-    """
-    self.streams.append(self._create_stream(fd, dest, std_name))
-
-  def remove(self, stream):
-    """ Removes a stream, when done with it.
-    """
-    self.streams.remove(stream)
-
-  @property
-  def is_done(self):
-    """ Returns True when all streams have been processed.
-    """
-    return len(self.streams) == 0
-
-  def select(self):
-    """ Returns the set of streams that have data available to read.
-    The returned streams each expose a read() and a close() method.
-    When done with a stream, call the remove(stream) method.
-    """
-    raise NotImplementedError
-
-  def _create_stream(self, fd, dest, std_name):
-    """ Creates a new stream wrapping an existing file descriptor.
-    """
-    raise NotImplementedError
-
-
-class _FileDescriptorStreamsNonBlocking(FileDescriptorStreams):
-  """ Implementation of FileDescriptorStreams for platforms that support
-  non blocking I/O.
-  """
-  class Stream(object):
-    """ Encapsulates a file descriptor """
-    def __init__(self, fd, dest, std_name):
-      self.fd = fd
-      self.dest = dest
-      self.std_name = std_name
-      self.set_non_blocking()
-
-    def set_non_blocking(self):
-      import fcntl
-      flags = fcntl.fcntl(self.fd, fcntl.F_GETFL)
-      fcntl.fcntl(self.fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)
-
-    def fileno(self):
-      return self.fd.fileno()
-
-    def read(self):
-      return self.fd.read(4096)
-
-    def close(self):
-      self.fd.close()
-
-  def _create_stream(self, fd, dest, std_name):
-    return self.Stream(fd, dest, std_name)
-
-  def select(self):
-    ready_streams, _, _ = select.select(self.streams, [], [])
-    return ready_streams
-
-
-class _FileDescriptorStreamsThreads(FileDescriptorStreams):
-  """ Implementation of FileDescriptorStreams for platforms that don't support
-  non blocking I/O. This implementation requires creating threads issuing
-  blocking read operations on file descriptors.
-  """
-  def __init__(self):
-    super(_FileDescriptorStreamsThreads, self).__init__()
-    # The queue is shared accross all threads so we can simulate the
-    # behavior of the select() function
-    self.queue = Queue(10)  # Limit incoming data from streams
-
-  def _create_stream(self, fd, dest, std_name):
-    return self.Stream(fd, dest, std_name, self.queue)
-
-  def select(self):
-    # Return only one stream at a time, as it is the most straighforward
-    # thing to do and it is compatible with the select() function.
-    item = self.queue.get()
-    stream = item.stream
-    stream.data = item.data
-    return [stream]
-
-  class QueueItem(object):
-    """ Item put in the shared queue """
-    def __init__(self, stream, data):
-      self.stream = stream
-      self.data = data
-
-  class Stream(object):
-    """ Encapsulates a file descriptor """
-    def __init__(self, fd, dest, std_name, queue):
-      self.fd = fd
-      self.dest = dest
-      self.std_name = std_name
-      self.queue = queue
-      self.data = None
-      self.thread = Thread(target=self.read_to_queue)
-      self.thread.daemon = True
-      self.thread.start()
-
-    def close(self):
-      self.fd.close()
-
-    def read(self):
-      data = self.data
-      self.data = None
-      return data
-
-    def read_to_queue(self):
-      """ The thread function: reads everything from the file descriptor into
-      the shared queue and terminates when reaching EOF.
-      """
-      for line in iter(self.fd.readline, b''):
-        self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, line))
-      self.fd.close()
-      self.queue.put(_FileDescriptorStreamsThreads.QueueItem(self, None))
-
-
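The removed `_FileDescriptorStreamsThreads` class emulated `select()` on platforms without `O_NONBLOCK` by giving each descriptor a daemon thread that performs blocking reads into one shared queue, with a `None` payload as the EOF sentinel. A minimal standalone sketch of that pattern (names are illustrative, not from the original):

```python
import io
import queue
import threading


def read_streams(fds):
    """Read several file-like objects concurrently via a shared queue.

    One daemon thread per descriptor performs blocking reads and pushes
    (stream, line) items; a (stream, None) sentinel marks EOF.  This is the
    same idea the removed _FileDescriptorStreamsThreads used to emulate
    select() on platforms that lack non-blocking file descriptors.
    """
    q = queue.Queue(10)  # Bound the queue to limit buffered data.

    def pump(fd):
        # iter(fd.readline, '') yields lines until readline returns '' at EOF.
        for line in iter(fd.readline, ''):
            q.put((fd, line))
        q.put((fd, None))  # EOF sentinel for this stream.

    active = set(fds)
    for fd in fds:
        threading.Thread(target=pump, args=(fd,), daemon=True).start()

    out = []
    while active:
        fd, line = q.get()
        if line is None:
            active.discard(fd)
        else:
            out.append(line)
    return out
```

Like the removed `select()` override, the consumer sees one ready item at a time, which keeps it drop-in compatible with the non-blocking implementation.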
 def symlink(source, link_name):
   """Creates a symbolic link pointing to source named link_name.
   Note: On Windows, source must exist on disk, as the implementation needs
@@ -274,31 +124,30 @@
       else:
         raise
   else:
-    os.rename(src, dst)
+    shutil.move(src, dst)
 
 
-def remove(path):
+def remove(path, missing_ok=False):
   """Remove (delete) the file path. This is a replacement for os.remove that
   allows deleting read-only files on Windows, with support for long paths and
   for deleting directory symbolic links.
 
   Availability: Unix, Windows."""
-  if isWindows():
-    longpath = _makelongpath(path)
-    try:
-      os.remove(longpath)
-    except OSError as e:
-      if e.errno == errno.EACCES:
-        os.chmod(longpath, stat.S_IWRITE)
-        # Directory symbolic links must be deleted with 'rmdir'.
-        if islink(longpath) and isdir(longpath):
-          os.rmdir(longpath)
-        else:
-          os.remove(longpath)
+  longpath = _makelongpath(path) if isWindows() else path
+  try:
+    os.remove(longpath)
+  except OSError as e:
+    if e.errno == errno.EACCES:
+      os.chmod(longpath, stat.S_IWRITE)
+      # Directory symbolic links must be deleted with 'rmdir'.
+      if islink(longpath) and isdir(longpath):
+        os.rmdir(longpath)
       else:
-        raise
-  else:
-    os.remove(path)
+        os.remove(longpath)
+    elif missing_ok and e.errno == errno.ENOENT:
+      pass
+    else:
+      raise
 
 
 def walk(top, topdown=True, onerror=None, followlinks=False):
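The rework above flattens the Windows and Unix branches of `remove()` into one try/except and adds a `missing_ok` flag that swallows only `ENOENT`, mirroring `pathlib.Path.unlink(missing_ok=...)` from Python 3.8. A simplified cross-platform sketch of just the new error handling (without the Windows long-path and read-only handling):

```python
import errno
import os
import tempfile


def remove(path, missing_ok=False):
    """Delete |path|, optionally tolerating a missing file.

    Only ENOENT is swallowed, and only when the caller opts in; every other
    OSError still propagates, matching the behavior added in the hunk above.
    """
    try:
        os.remove(path)
    except OSError as e:
        if missing_ok and e.errno == errno.ENOENT:
            return False  # Nothing to delete; treated as success.
        raise
    return True
```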
diff --git a/platform_utils_win32.py b/platform_utils_win32.py
index e20755a..bf916d4 100644
--- a/platform_utils_win32.py
+++ b/platform_utils_win32.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2016 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,16 +14,13 @@
 
 import errno
 
-from pyversion import is_python3
 from ctypes import WinDLL, get_last_error, FormatError, WinError, addressof
-from ctypes import c_buffer
-from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE, POINTER, c_ubyte
-from ctypes.wintypes import WCHAR, USHORT, LPVOID, Structure, Union, ULONG
-from ctypes.wintypes import byref
+from ctypes import c_buffer, c_ubyte, Structure, Union, byref
+from ctypes.wintypes import BOOL, BOOLEAN, LPCWSTR, DWORD, HANDLE
+from ctypes.wintypes import WCHAR, USHORT, LPVOID, ULONG, LPDWORD
 
 kernel32 = WinDLL('kernel32', use_last_error=True)
 
-LPDWORD = POINTER(DWORD)
 UCHAR = c_ubyte
 
 # Win32 error codes
@@ -147,7 +142,8 @@
 
 
 def _create_symlink(source, link_name, dwFlags):
-  if not CreateSymbolicLinkW(link_name, source, dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
+  if not CreateSymbolicLinkW(link_name, source,
+                             dwFlags | SYMBOLIC_LINK_FLAG_ALLOW_UNPRIVILEGED_CREATE):
     # See https://github.com/golang/go/pull/24307/files#diff-b87bc12e4da2497308f9ef746086e4f0
     # "the unprivileged create flag is unsupported below Windows 10 (1703, v10.0.14972).
     # retry without it."
@@ -198,26 +194,15 @@
         'Error reading symbolic link \"%s\"'.format(path))
   rdb = REPARSE_DATA_BUFFER.from_buffer(target_buffer)
   if rdb.ReparseTag == IO_REPARSE_TAG_SYMLINK:
-    return _preserve_encoding(path, rdb.SymbolicLinkReparseBuffer.PrintName)
+    return rdb.SymbolicLinkReparseBuffer.PrintName
   elif rdb.ReparseTag == IO_REPARSE_TAG_MOUNT_POINT:
-    return _preserve_encoding(path, rdb.MountPointReparseBuffer.PrintName)
+    return rdb.MountPointReparseBuffer.PrintName
   # Unsupported reparse point type
   _raise_winerror(
       ERROR_NOT_SUPPORTED,
       'Error reading symbolic link \"%s\"'.format(path))
 
 
-def _preserve_encoding(source, target):
-  """Ensures target is the same string type (i.e. unicode or str) as source."""
-
-  if is_python3():
-    return target
-
-  if isinstance(source, unicode):
-    return unicode(target)
-  return str(target)
-
-
 def _raise_winerror(code, error_desc):
   win_error_desc = FormatError(code).strip()
   error_desc = "%s: %s".format(error_desc, win_error_desc)
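The error-message helpers in this file build strings like `"%s: %s".format(...)`, mixing `%`-style placeholders with `str.format`. Since `str.format` only substitutes `{}` fields, that call returns the template unchanged and silently drops its arguments. A small sketch showing the difference:

```python
def format_styles(code, desc):
    # str.format ignores %-style placeholders, so "%s: %s".format(...)
    # returns the literal template unchanged -- the arguments are dropped.
    broken = "%s: %s".format(desc, code)
    # %-interpolation is what the "%s" placeholders actually require.
    fixed = "%s: %s" % (desc, code)
    return broken, fixed
```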
diff --git a/progress.py b/progress.py
index d2ed4ba..43c7ad2 100644
--- a/progress.py
+++ b/progress.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -26,18 +24,53 @@
 # column 0.
 CSI_ERASE_LINE = '\x1b[2K'
 
+
+def duration_str(total):
+  """A less noisy timedelta.__str__.
+
+  The default timedelta stringification contains a lot of leading zeros and
+  uses microsecond resolution.  This makes for noisy output.
+  """
+  hours, rem = divmod(total, 3600)
+  mins, secs = divmod(rem, 60)
+  ret = '%.3fs' % (secs,)
+  if mins:
+    ret = '%im%s' % (mins, ret)
+  if hours:
+    ret = '%ih%s' % (hours, ret)
+  return ret
+
+
 class Progress(object):
-  def __init__(self, title, total=0, units='', print_newline=False,
-               always_print_percentage=False):
+  def __init__(self, title, total=0, units='', print_newline=False, delay=True,
+               quiet=False):
     self._title = title
     self._total = total
     self._done = 0
-    self._lastp = -1
     self._start = time()
-    self._show = False
+    self._show = not delay
     self._units = units
     self._print_newline = print_newline
-    self._always_print_percentage = always_print_percentage
+    # Only show the active jobs section if we run more than one in parallel.
+    self._show_jobs = False
+    self._active = 0
+
+    # When quiet, never show any output.  It's a bit hacky, but reusing the
+    # existing logic that delays initial output keeps the rest of the class
+    # clean.  Basically we set the start time to years in the future.
+    if quiet:
+      self._show = False
+      self._start += 2**32
+
+  def start(self, name):
+    self._active += 1
+    if not self._show_jobs:
+      self._show_jobs = self._active > 1
+    self.update(inc=0, msg='started ' + name)
+
+  def finish(self, name):
+    self.update(msg='finished ' + name)
+    self._active -= 1
 
   def update(self, inc=1, msg=''):
     self._done += inc
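The new `duration_str()` helper is self-contained and easy to exercise in isolation; its logic, reproduced from the hunk above:

```python
def duration_str(total):
    """Format a duration in seconds as a compact [Nh][Nm]S.ssss string.

    Hour and minute fields are only emitted when nonzero, so short durations
    stay short, unlike timedelta.__str__ with its leading zeros.
    """
    hours, rem = divmod(total, 3600)
    mins, secs = divmod(rem, 60)
    ret = '%.3fs' % (secs,)
    if mins:
        ret = '%im%s' % (mins, ret)
    if hours:
        ret = '%ih%s' % (hours, ret)
    return ret
```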
@@ -53,41 +86,46 @@
 
     if self._total <= 0:
       sys.stderr.write('%s\r%s: %d,' % (
-        CSI_ERASE_LINE,
-        self._title,
-        self._done))
+          CSI_ERASE_LINE,
+          self._title,
+          self._done))
       sys.stderr.flush()
     else:
       p = (100 * self._done) / self._total
-
-      if self._lastp != p or self._always_print_percentage:
-        self._lastp = p
-        sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s)%s%s%s' % (
+      if self._show_jobs:
+        jobs = '[%d job%s] ' % (self._active, 's' if self._active > 1 else '')
+      else:
+        jobs = ''
+      sys.stderr.write('%s\r%s: %2d%% %s(%d%s/%d%s)%s%s%s' % (
           CSI_ERASE_LINE,
           self._title,
           p,
+          jobs,
           self._done, self._units,
           self._total, self._units,
           ' ' if msg else '', msg,
-          "\n" if self._print_newline else ""))
-        sys.stderr.flush()
+          '\n' if self._print_newline else ''))
+      sys.stderr.flush()
 
   def end(self):
     if _NOT_TTY or IsTrace() or not self._show:
       return
 
+    duration = duration_str(time() - self._start)
     if self._total <= 0:
-      sys.stderr.write('%s\r%s: %d, done.\n' % (
-        CSI_ERASE_LINE,
-        self._title,
-        self._done))
+      sys.stderr.write('%s\r%s: %d, done in %s\n' % (
+          CSI_ERASE_LINE,
+          self._title,
+          self._done,
+          duration))
       sys.stderr.flush()
     else:
       p = (100 * self._done) / self._total
-      sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done.\n' % (
-        CSI_ERASE_LINE,
-        self._title,
-        p,
-        self._done, self._units,
-        self._total, self._units))
+      sys.stderr.write('%s\r%s: %3d%% (%d%s/%d%s), done in %s\n' % (
+          CSI_ERASE_LINE,
+          self._title,
+          p,
+          self._done, self._units,
+          self._total, self._units,
+          duration))
       sys.stderr.flush()
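The status lines written above rely on `CSI_ERASE_LINE` plus a carriage return to redraw in place. Note that under Python 3, `(100 * done) / total` is a float, which the `%d` conversion truncates back to an integer percentage. A minimal sketch of the rendering (an illustrative standalone function, not the class's API):

```python
# ANSI escape: erase the current terminal line so '\r' can redraw it.
CSI_ERASE_LINE = '\x1b[2K'


def progress_line(title, done, total, units=''):
    """Render one in-place status line in the style of Progress.update().

    (100 * done) / total is a float under Python 3; %3d truncates it to a
    right-aligned integer percentage.
    """
    p = (100 * done) / total
    return '%s\r%s: %3d%% (%d%s/%d%s)' % (
        CSI_ERASE_LINE, title, p, done, units, total, units)
```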
diff --git a/project.py b/project.py
index 8fdacc6..5b26b64 100644
--- a/project.py
+++ b/project.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,11 +12,9 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import errno
 import filecmp
 import glob
-import json
 import os
 import random
 import re
@@ -29,36 +25,33 @@
 import tarfile
 import tempfile
 import time
-import traceback
+import urllib.parse
 
 from color import Coloring
 from git_command import GitCommand, git_require
 from git_config import GitConfig, IsId, GetSchemeFromUrl, GetUrlCookieFile, \
     ID_RE
-from error import GitError, HookError, UploadError, DownloadError
-from error import ManifestInvalidRevisionError
+from error import GitError, UploadError, DownloadError
+from error import ManifestInvalidRevisionError, ManifestInvalidPathError
 from error import NoManifestException
 import platform_utils
 import progress
 from repo_trace import IsTrace, Trace
 
-from git_refs import GitRefs, HEAD, R_HEADS, R_TAGS, R_PUB, R_M
+from git_refs import GitRefs, HEAD, R_HEADS, R_TAGS, R_PUB, R_M, R_WORKTREE_M
 
-from pyversion import is_python3
-if is_python3():
-  import urllib.parse
-else:
-  import imp
-  import urlparse
-  urllib = imp.new_module('urllib')
-  urllib.parse = urlparse
-  input = raw_input
+
+# Maximum sleep time allowed during retries.
+MAXIMUM_RETRY_SLEEP_SEC = 3600.0
+# +-10% random jitter is added to each fetch retry sleep duration.
+RETRY_JITTER_PERCENT = 0.1
 
 
 def _lwrite(path, content):
   lock = '%s.lock' % path
 
-  with open(lock, 'w') as fd:
+  # Maintain Unix line endings on all OS's to match git behavior.
+  with open(lock, 'w', newline='\n') as fd:
     fd.write(content)
 
   try:
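The new `MAXIMUM_RETRY_SLEEP_SEC` and `RETRY_JITTER_PERCENT` constants suggest a capped, jittered backoff for fetch retries. The actual fetch retry loop is outside this hunk, so the following is a hypothetical sketch of how such constants are typically consumed (`next_retry_sleep` is an illustrative name, not from the original):

```python
import random

# Values from the hunk above.
MAXIMUM_RETRY_SLEEP_SEC = 3600.0
RETRY_JITTER_PERCENT = 0.1


def next_retry_sleep(prev_sleep):
    """Double the previous delay, cap it, then apply +-10% random jitter.

    Hypothetical use of the constants above; jitter spreads out retries so
    many clients failing together do not all reconnect at the same instant.
    """
    base = min(prev_sleep * 2, MAXIMUM_RETRY_SLEEP_SEC)
    jitter = base * RETRY_JITTER_PERCENT
    return base + random.uniform(-jitter, jitter)
```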
@@ -85,6 +78,7 @@
 def sq(r):
   return "'" + r.replace("'", "'\''") + "'"
 
+
 _project_hook_list = None
 
 
@@ -197,18 +191,22 @@
     return self._base_exists
 
   def UploadForReview(self, people,
+                      dryrun=False,
                       auto_topic=False,
-                      draft=False,
+                      hashtags=(),
+                      labels=(),
                       private=False,
                       notify=None,
                       wip=False,
                       dest_branch=None,
                       validate_certs=True,
                       push_options=None):
-    self.project.UploadForReview(self.name,
-                                 people,
+    self.project.UploadForReview(branch=self.name,
+                                 people=people,
+                                 dryrun=dryrun,
                                  auto_topic=auto_topic,
-                                 draft=draft,
+                                 hashtags=hashtags,
+                                 labels=labels,
                                  private=private,
                                  notify=notify,
                                  wip=wip,
@@ -234,7 +232,7 @@
 class StatusColoring(Coloring):
 
   def __init__(self, config):
-    Coloring.__init__(self, config, 'status')
+    super().__init__(config, 'status')
     self.project = self.printer('header', attr='bold')
     self.branch = self.printer('header', attr='bold')
     self.nobranch = self.printer('nobranch', fg='red')
@@ -248,30 +246,104 @@
 class DiffColoring(Coloring):
 
   def __init__(self, config):
-    Coloring.__init__(self, config, 'diff')
+    super().__init__(config, 'diff')
     self.project = self.printer('header', attr='bold')
     self.fail = self.printer('fail', fg='red')
 
 
-class _Annotation(object):
+class Annotation(object):
 
   def __init__(self, name, value, keep):
     self.name = name
     self.value = value
     self.keep = keep
 
+  def __eq__(self, other):
+    if not isinstance(other, Annotation):
+      return False
+    return self.__dict__ == other.__dict__
+
+  def __lt__(self, other):
+    # This exists just so that lists of Annotation objects can be sorted, for
+    # use in comparisons.
+    if not isinstance(other, Annotation):
+      raise ValueError('comparison is not between two Annotation objects')
+    if self.name == other.name:
+      if self.value == other.value:
+        return self.keep < other.keep
+      return self.value < other.value
+    return self.name < other.name
+
+
+def _SafeExpandPath(base, subpath, skipfinal=False):
+  """Make sure |subpath| is completely safe under |base|.
+
+  We make sure no intermediate symlinks are traversed, and that the final path
+  is not a special file (e.g. not a socket or fifo).
+
+  NB: We rely on a number of paths already being filtered out while parsing the
+  manifest.  See the validation logic in manifest_xml.py for more details.
+  """
+  # Split up the path by its components.  We can't use os.path.sep exclusively
+  # as some platforms (like Windows) will convert / to \ and that bypasses all
+  # our constructed logic here.  Especially since manifest authors only use
+  # / in their paths.
+  resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
+  components = resep.split(subpath)
+  if skipfinal:
+    # Whether the caller handles the final component itself.
+    finalpart = components.pop()
+
+  path = base
+  for part in components:
+    if part in {'.', '..'}:
+      raise ManifestInvalidPathError(
+          '%s: "%s" not allowed in paths' % (subpath, part))
+
+    path = os.path.join(path, part)
+    if platform_utils.islink(path):
+      raise ManifestInvalidPathError(
+          '%s: traversing symlinks not allowed' % (path,))
+
+    if os.path.exists(path):
+      if not os.path.isfile(path) and not platform_utils.isdir(path):
+        raise ManifestInvalidPathError(
+            '%s: only regular files & directories allowed' % (path,))
+
+  if skipfinal:
+    path = os.path.join(path, finalpart)
+
+  return path
+
 
 class _CopyFile(object):
+  """Container for <copyfile> manifest element."""
 
-  def __init__(self, src, dest, abssrc, absdest):
+  def __init__(self, git_worktree, src, topdir, dest):
+    """Register a <copyfile> request.
+
+    Args:
+      git_worktree: Absolute path to the git project checkout.
+      src: Relative path under |git_worktree| of file to read.
+      topdir: Absolute path to the top of the repo client checkout.
+      dest: Relative path under |topdir| of file to write.
+    """
+    self.git_worktree = git_worktree
+    self.topdir = topdir
     self.src = src
     self.dest = dest
-    self.abs_src = abssrc
-    self.abs_dest = absdest
 
   def _Copy(self):
-    src = self.abs_src
-    dest = self.abs_dest
+    src = _SafeExpandPath(self.git_worktree, self.src)
+    dest = _SafeExpandPath(self.topdir, self.dest)
+
+    if platform_utils.isdir(src):
+      raise ManifestInvalidPathError(
+          '%s: copying from directory not supported' % (self.src,))
+    if platform_utils.isdir(dest):
+      raise ManifestInvalidPathError(
+          '%s: copying to directory not allowed' % (self.dest,))
+
     # copy file if it does not exist or is out of date
     if not os.path.exists(dest) or not filecmp.cmp(src, dest):
       try:
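The new `_SafeExpandPath` walks `subpath` one component at a time, splitting on both `/` and the platform separator so `\` cannot bypass the checks on Windows, and rejecting `.`/`..` components and symlink traversal. A minimal sketch of those checks (using `ValueError` in place of `ManifestInvalidPathError` to stay self-contained):

```python
import os
import re


def safe_expand(base, subpath):
    """Validate |subpath| under |base| component by component.

    Sketch of the _SafeExpandPath checks above: split on '/' and os.path.sep,
    reject '.' and '..', and refuse to traverse intermediate symlinks.
    """
    resep = re.compile(r'[/%s]' % re.escape(os.path.sep))
    path = base
    for part in resep.split(subpath):
        if part in {'.', '..'}:
            raise ValueError('%s: "%s" not allowed in paths' % (subpath, part))
        path = os.path.join(path, part)
        if os.path.islink(path):
            raise ValueError('%s: traversing symlinks not allowed' % (path,))
    return path
```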
@@ -292,13 +364,21 @@
 
 
 class _LinkFile(object):
+  """Container for <linkfile> manifest element."""
 
-  def __init__(self, git_worktree, src, dest, relsrc, absdest):
+  def __init__(self, git_worktree, src, topdir, dest):
+    """Register a <linkfile> request.
+
+    Args:
+      git_worktree: Absolute path to the git project checkout.
+      src: Target of symlink relative to path under |git_worktree|.
+      topdir: Absolute path to the top of the repo client checkout.
+      dest: Relative path under |topdir| of symlink to create.
+    """
     self.git_worktree = git_worktree
+    self.topdir = topdir
     self.src = src
     self.dest = dest
-    self.src_rel_to_dest = relsrc
-    self.abs_dest = absdest
 
   def __linkIt(self, relSrc, absDest):
     # link file if it does not exist or is out of date
@@ -316,35 +396,42 @@
         _error('Cannot link file %s to %s', relSrc, absDest)
 
   def _Link(self):
-    """Link the self.rel_src_to_dest and self.abs_dest. Handles wild cards
-    on the src linking all of the files in the source in to the destination
-    directory.
+    """Link the self.src & self.dest paths.
+
+    Handles wild cards on the src linking all of the files in the source in to
+    the destination directory.
     """
-    # We use the absSrc to handle the situation where the current directory
-    # is not the root of the repo
-    absSrc = os.path.join(self.git_worktree, self.src)
-    if os.path.exists(absSrc):
-      # Entity exists so just a simple one to one link operation
-      self.__linkIt(self.src_rel_to_dest, self.abs_dest)
+    # Some people use src="." to create stable links to projects.  Let's allow
+    # that but reject all other uses of "." to keep things simple.
+    if self.src == '.':
+      src = self.git_worktree
     else:
-      # Entity doesn't exist assume there is a wild card
-      absDestDir = self.abs_dest
-      if os.path.exists(absDestDir) and not platform_utils.isdir(absDestDir):
-        _error('Link error: src with wildcard, %s must be a directory',
-               absDestDir)
+      src = _SafeExpandPath(self.git_worktree, self.src)
+
+    if not glob.has_magic(src):
+      # Entity does not contain a wild card, so this is a simple one-to-one link.
+      dest = _SafeExpandPath(self.topdir, self.dest, skipfinal=True)
+      # dest & src are absolute paths at this point.  Make sure the target of
+      # the symlink is relative in the context of the repo client checkout.
+      relpath = os.path.relpath(src, os.path.dirname(dest))
+      self.__linkIt(relpath, dest)
+    else:
+      dest = _SafeExpandPath(self.topdir, self.dest)
+      # Entity contains a wild card.
+      if os.path.exists(dest) and not platform_utils.isdir(dest):
+        _error('Link error: src with wildcard, %s must be a directory', dest)
       else:
-        absSrcFiles = glob.glob(absSrc)
-        for absSrcFile in absSrcFiles:
+        for absSrcFile in glob.glob(src):
          # Create a relative path from source dir to destination dir
           absSrcDir = os.path.dirname(absSrcFile)
-          relSrcDir = os.path.relpath(absSrcDir, absDestDir)
+          relSrcDir = os.path.relpath(absSrcDir, dest)
 
           # Get the source file name
           srcFile = os.path.basename(absSrcFile)
 
           # Now form the final full paths to srcFile. They will be
          # absolute for the destination and relative for the source.
-          absDest = os.path.join(absDestDir, srcFile)
+          absDest = os.path.join(dest, srcFile)
           relSrc = os.path.join(relSrcDir, srcFile)
           self.__linkIt(relSrc, absDest)
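In the non-wildcard branch of `_Link` above, both `src` and `dest` are absolute by the time the link is made, so the symlink target must be rebased relative to the directory that will contain the link. That keeps the checkout relocatable. The core computation, as a standalone sketch:

```python
import os


def symlink_target(src, dest):
    """Compute a checkout-relative symlink target from two absolute paths.

    As in _Link above: the stored target is made relative to the directory
    containing the link, so moving the whole checkout keeps links valid.
    """
    return os.path.relpath(src, os.path.dirname(dest))
```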
 
@@ -368,405 +455,6 @@
     self.fetchUrl = fetchUrl
 
 
-class RepoHook(object):
-
-  """A RepoHook contains information about a script to run as a hook.
-
-  Hooks are used to run a python script before running an upload (for instance,
-  to run presubmit checks).  Eventually, we may have hooks for other actions.
-
-  This shouldn't be confused with files in the 'repo/hooks' directory.  Those
-  files are copied into each '.git/hooks' folder for each project.  Repo-level
-  hooks are associated instead with repo actions.
-
-  Hooks are always python.  When a hook is run, we will load the hook into the
-  interpreter and execute its main() function.
-  """
-
-  def __init__(self,
-               hook_type,
-               hooks_project,
-               topdir,
-               manifest_url,
-               abort_if_user_denies=False):
-    """RepoHook constructor.
-
-    Params:
-      hook_type: A string representing the type of hook.  This is also used
-          to figure out the name of the file containing the hook.  For
-          example: 'pre-upload'.
-      hooks_project: The project containing the repo hooks.  If you have a
-          manifest, this is manifest.repo_hooks_project.  OK if this is None,
-          which will make the hook a no-op.
-      topdir: Repo's top directory (the one containing the .repo directory).
-          Scripts will run with CWD as this directory.  If you have a manifest,
-          this is manifest.topdir
-      manifest_url: The URL to the manifest git repo.
-      abort_if_user_denies: If True, we'll throw a HookError() if the user
-          doesn't allow us to run the hook.
-    """
-    self._hook_type = hook_type
-    self._hooks_project = hooks_project
-    self._manifest_url = manifest_url
-    self._topdir = topdir
-    self._abort_if_user_denies = abort_if_user_denies
-
-    # Store the full path to the script for convenience.
-    if self._hooks_project:
-      self._script_fullpath = os.path.join(self._hooks_project.worktree,
-                                           self._hook_type + '.py')
-    else:
-      self._script_fullpath = None
-
-  def _GetHash(self):
-    """Return a hash of the contents of the hooks directory.
-
-    We'll just use git to do this.  This hash has the property that if anything
-    changes in the directory we will return a different has.
-
-    SECURITY CONSIDERATION:
-      This hash only represents the contents of files in the hook directory, not
-      any other files imported or called by hooks.  Changes to imported files
-      can change the script behavior without affecting the hash.
-
-    Returns:
-      A string representing the hash.  This will always be ASCII so that it can
-      be printed to the user easily.
-    """
-    assert self._hooks_project, "Must have hooks to calculate their hash."
-
-    # We will use the work_git object rather than just calling GetRevisionId().
-    # That gives us a hash of the latest checked in version of the files that
-    # the user will actually be executing.  Specifically, GetRevisionId()
-    # doesn't appear to change even if a user checks out a different version
-    # of the hooks repo (via git checkout) nor if a user commits their own revs.
-    #
-    # NOTE: Local (non-committed) changes will not be factored into this hash.
-    # I think this is OK, since we're really only worried about warning the user
-    # about upstream changes.
-    return self._hooks_project.work_git.rev_parse('HEAD')
-
-  def _GetMustVerb(self):
-    """Return 'must' if the hook is required; 'should' if not."""
-    if self._abort_if_user_denies:
-      return 'must'
-    else:
-      return 'should'
-
-  def _CheckForHookApproval(self):
-    """Check to see whether this hook has been approved.
-
-    We'll accept approval of manifest URLs if they're using secure transports.
-    This way the user can say they trust the manifest hoster.  For insecure
-    hosts, we fall back to checking the hash of the hooks repo.
-
-    Note that we ask permission for each individual hook even though we use
-    the hash of all hooks when detecting changes.  We'd like the user to be
-    able to approve / deny each hook individually.  We only use the hash of all
-    hooks because there is no other easy way to detect changes to local imports.
-
-    Returns:
-      True if this hook is approved to run; False otherwise.
-
-    Raises:
-      HookError: Raised if the user doesn't approve and abort_if_user_denies
-          was passed to the consturctor.
-    """
-    if self._ManifestUrlHasSecureScheme():
-      return self._CheckForHookApprovalManifest()
-    else:
-      return self._CheckForHookApprovalHash()
-
-  def _CheckForHookApprovalHelper(self, subkey, new_val, main_prompt,
-                                  changed_prompt):
-    """Check for approval for a particular attribute and hook.
-
-    Args:
-      subkey: The git config key under [repo.hooks.<hook_type>] to store the
-          last approved string.
-      new_val: The new value to compare against the last approved one.
-      main_prompt: Message to display to the user to ask for approval.
-      changed_prompt: Message explaining why we're re-asking for approval.
-
-    Returns:
-      True if this hook is approved to run; False otherwise.
-
-    Raises:
-      HookError: Raised if the user doesn't approve and abort_if_user_denies
-          was passed to the consturctor.
-    """
-    hooks_config = self._hooks_project.config
-    git_approval_key = 'repo.hooks.%s.%s' % (self._hook_type, subkey)
-
-    # Get the last value that the user approved for this hook; may be None.
-    old_val = hooks_config.GetString(git_approval_key)
-
-    if old_val is not None:
-      # User previously approved hook and asked not to be prompted again.
-      if new_val == old_val:
-        # Approval matched.  We're done.
-        return True
-      else:
-        # Give the user a reason why we're prompting, since they last told
-        # us to "never ask again".
-        prompt = 'WARNING: %s\n\n' % (changed_prompt,)
-    else:
-      prompt = ''
-
-    # Prompt the user if we're not on a tty; on a tty we'll assume "no".
-    if sys.stdout.isatty():
-      prompt += main_prompt + ' (yes/always/NO)? '
-      response = input(prompt).lower()
-      print()
-
-      # User is doing a one-time approval.
-      if response in ('y', 'yes'):
-        return True
-      elif response == 'always':
-        hooks_config.SetString(git_approval_key, new_val)
-        return True
-
-    # For anything else, we'll assume no approval.
-    if self._abort_if_user_denies:
-      raise HookError('You must allow the %s hook or use --no-verify.' %
-                      self._hook_type)
-
-    return False
-
-  def _ManifestUrlHasSecureScheme(self):
-    """Check if the URI for the manifest is a secure transport."""
-    secure_schemes = ('file', 'https', 'ssh', 'persistent-https', 'sso', 'rpc')
-    parse_results = urllib.parse.urlparse(self._manifest_url)
-    return parse_results.scheme in secure_schemes
-
-  def _CheckForHookApprovalManifest(self):
-    """Check whether the user has approved this manifest host.
-
-    Returns:
-      True if this hook is approved to run; False otherwise.
-    """
-    return self._CheckForHookApprovalHelper(
-        'approvedmanifest',
-        self._manifest_url,
-        'Run hook scripts from %s' % (self._manifest_url,),
-        'Manifest URL has changed since %s was allowed.' % (self._hook_type,))
-
-  def _CheckForHookApprovalHash(self):
-    """Check whether the user has approved the hooks repo.
-
-    Returns:
-      True if this hook is approved to run; False otherwise.
-    """
-    prompt = ('Repo %s run the script:\n'
-              '  %s\n'
-              '\n'
-              'Do you want to allow this script to run')
-    return self._CheckForHookApprovalHelper(
-        'approvedhash',
-        self._GetHash(),
-        prompt % (self._GetMustVerb(), self._script_fullpath),
-        'Scripts have changed since %s was allowed.' % (self._hook_type,))
-
-  @staticmethod
-  def _ExtractInterpFromShebang(data):
-    """Extract the interpreter used in the shebang.
-
-    Try to locate the interpreter the script is using (ignoring `env`).
-
-    Args:
-      data: The file content of the script.
-
-    Returns:
-      The basename of the main script interpreter, or None if a shebang is not
-      used or could not be parsed out.
-    """
-    firstline = data.splitlines()[:1]
-    if not firstline:
-      return None
-
-    # The format here can be tricky.
-    shebang = firstline[0].strip()
-    m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
-    if not m:
-      return None
-
-    # If the using `env`, find the target program.
-    interp = m.group(1)
-    if os.path.basename(interp) == 'env':
-      interp = m.group(2)
-
-    return interp
-
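The removed `_ExtractInterpFromShebang` above parses a script's first line for its interpreter, looking through `env` indirection to the real program. Its logic, reproduced as a standalone function:

```python
import os
import re


def extract_interp_from_shebang(data):
    """Return the interpreter named in |data|'s shebang, or None.

    Same logic as the removed RepoHook._ExtractInterpFromShebang: parse the
    first line, and if the interpreter is `env`, take its first argument as
    the actual program.
    """
    firstline = data.splitlines()[:1]
    if not firstline:
        return None

    shebang = firstline[0].strip()
    m = re.match(r'^#!\s*([^\s]+)(?:\s+([^\s]+))?', shebang)
    if not m:
        return None

    interp = m.group(1)
    if os.path.basename(interp) == 'env':
        interp = m.group(2)
    return interp
```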
-  def _ExecuteHookViaReexec(self, interp, context, **kwargs):
-    """Execute the hook script through |interp|.
-
-    Note: Support for this feature should be dropped ~Jun 2021.
-
-    Args:
-      interp: The Python program to run.
-      context: Basic Python context to execute the hook inside.
-      kwargs: Arbitrary arguments to pass to the hook script.
-
-    Raises:
-      HookError: When the hooks failed for any reason.
-    """
-    # This logic needs to be kept in sync with _ExecuteHookViaImport below.
-    script = """
-import json, os, sys
-path = '''%(path)s'''
-kwargs = json.loads('''%(kwargs)s''')
-context = json.loads('''%(context)s''')
-sys.path.insert(0, os.path.dirname(path))
-data = open(path).read()
-exec(compile(data, path, 'exec'), context)
-context['main'](**kwargs)
-""" % {
-        'path': self._script_fullpath,
-        'kwargs': json.dumps(kwargs),
-        'context': json.dumps(context),
-    }
-
-    # We pass the script via stdin to avoid OS argv limits.  It also makes
-    # unhandled exception tracebacks less verbose/confusing for users.
-    cmd = [interp, '-c', 'import sys; exec(sys.stdin.read())']
-    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
-    proc.communicate(input=script.encode('utf-8'))
-    if proc.returncode:
-      raise HookError('Failed to run %s hook.' % (self._hook_type,))
-
-  def _ExecuteHookViaImport(self, data, context, **kwargs):
-    """Execute the hook code in |data| directly.
-
-    Args:
-      data: The code of the hook to execute.
-      context: Basic Python context to execute the hook inside.
-      kwargs: Arbitrary arguments to pass to the hook script.
-
-    Raises:
-      HookError: When the hooks failed for any reason.
-    """
-    # Exec, storing global context in the context dict.  We catch exceptions
-    # and convert to a HookError w/ just the failing traceback.
-    try:
-      exec(compile(data, self._script_fullpath, 'exec'), context)
-    except Exception:
-      raise HookError('%s\nFailed to import %s hook; see traceback above.' %
-                      (traceback.format_exc(), self._hook_type))
-
-    # Running the script should have defined a main() function.
-    if 'main' not in context:
-      raise HookError('Missing main() in: "%s"' % self._script_fullpath)
-
-    # Call the main function in the hook.  If the hook should cause the
-    # build to fail, it will raise an Exception.  We'll catch that convert
-    # to a HookError w/ just the failing traceback.
-    try:
-      context['main'](**kwargs)
-    except Exception:
-      raise HookError('%s\nFailed to run main() for %s hook; see traceback '
-                      'above.' % (traceback.format_exc(), self._hook_type))
-
-  def _ExecuteHook(self, **kwargs):
-    """Actually execute the given hook.
-
-    This will run the hook's 'main' function in our python interpreter.
-
-    Args:
-      kwargs: Keyword arguments to pass to the hook.  These are often specific
-          to the hook type.  For instance, pre-upload hooks will contain
-          a project_list.
-    """
-    # Keep sys.path and CWD stashed away so that we can always restore them
-    # upon function exit.
-    orig_path = os.getcwd()
-    orig_syspath = sys.path
-
-    try:
-      # Always run hooks with CWD as topdir.
-      os.chdir(self._topdir)
-
-      # Put the hook dir as the first item of sys.path so hooks can do
-      # relative imports.  We want to replace the repo dir as [0] so
-      # hooks can't import repo files.
-      sys.path = [os.path.dirname(self._script_fullpath)] + sys.path[1:]
-
-      # Initial global context for the hook to run within.
-      context = {'__file__': self._script_fullpath}
-
-      # Add 'hook_should_take_kwargs' to the arguments to be passed to main.
-      # We don't actually want hooks to define their main with this argument--
-      # it's there to remind them that their hook should always take **kwargs.
-      # For instance, a pre-upload hook should be defined like:
-      #   def main(project_list, **kwargs):
-      #
-      # This allows us to later expand the API without breaking old hooks.
-      kwargs = kwargs.copy()
-      kwargs['hook_should_take_kwargs'] = True
-
-      # See what version of python the hook has been written against.
-      data = open(self._script_fullpath).read()
-      interp = self._ExtractInterpFromShebang(data)
-      reexec = False
-      if interp:
-        prog = os.path.basename(interp)
-        if prog.startswith('python2') and sys.version_info.major != 2:
-          reexec = True
-        elif prog.startswith('python3') and sys.version_info.major == 2:
-          reexec = True
-
-      # Attempt to execute the hooks through the requested version of Python.
-      if reexec:
-        try:
-          self._ExecuteHookViaReexec(interp, context, **kwargs)
-        except OSError as e:
-          if e.errno == errno.ENOENT:
-            # We couldn't find the interpreter, so fallback to importing.
-            reexec = False
-          else:
-            raise
-
-      # Run the hook by importing directly.
-      if not reexec:
-        self._ExecuteHookViaImport(data, context, **kwargs)
-    finally:
-      # Restore sys.path and CWD.
-      sys.path = orig_syspath
-      os.chdir(orig_path)
-
-  def Run(self, user_allows_all_hooks, **kwargs):
-    """Run the hook.
-
-    If the hook doesn't exist (because there is no hooks project or because
-    this particular hook is not enabled), this is a no-op.
-
-    Args:
-      user_allows_all_hooks: If True, we will never prompt about running the
-          hook--we'll just assume it's OK to run it.
-      kwargs: Keyword arguments to pass to the hook.  These are often specific
-          to the hook type.  For instance, pre-upload hooks will contain
-          a project_list.
-
-    Raises:
-      HookError: If there was a problem finding the hook or the user declined
-          to run a required hook (from _CheckForHookApproval).
-    """
-    # No-op if there is no hooks project or if hook is disabled.
-    if ((not self._hooks_project) or (self._hook_type not in
-                                      self._hooks_project.enabled_repo_hooks)):
-      return
-
-    # Bail with a nice error if we can't find the hook.
-    if not os.path.isfile(self._script_fullpath):
-      raise HookError('Couldn\'t find repo hook: "%s"' % self._script_fullpath)
-
-    # Make sure the user is OK with running the hook.
-    if (not user_allows_all_hooks) and (not self._CheckForHookApproval()):
-      return
-
-    # Run the hook with the same version of python we're using.
-    self._ExecuteHook(**kwargs)
-
-
 class Project(object):
   # These objects can be shared between several working trees.
   shareable_files = ['description', 'info']
@@ -793,9 +481,11 @@
                clone_depth=None,
                upstream=None,
                parent=None,
+               use_git_worktrees=False,
                is_derived=False,
                dest_branch=None,
                optimized_fetch=False,
+               retry_fetches=0,
                old_revision=None):
     """Init a Project object.
 
@@ -816,31 +506,21 @@
       sync_tags: The `sync-tags` attribute of manifest.xml's project element.
       upstream: The `upstream` attribute of manifest.xml's project element.
       parent: The parent Project object.
+      use_git_worktrees: Whether to use `git worktree` for this project.
       is_derived: False if the project was explicitly defined in the manifest;
                   True if the project is a discovered submodule.
       dest_branch: The branch to which to push changes for review by default.
       optimized_fetch: If True, when a project is set to a sha1 revision, only
                        fetch from the remote if the sha1 is not present locally.
+      retry_fetches: Retry remote fetches n times upon receiving a transient
+                     error, with exponential backoff and jitter.
       old_revision: saved git commit id for open GITC projects.
     """
-    self.manifest = manifest
+    self.client = self.manifest = manifest
     self.name = name
     self.remote = remote
-    self.gitdir = gitdir.replace('\\', '/')
-    self.objdir = objdir.replace('\\', '/')
-    if worktree:
-      self.worktree = os.path.normpath(worktree).replace('\\', '/')
-    else:
-      self.worktree = None
-    self.relpath = relpath
-    self.revisionExpr = revisionExpr
-
-    if revisionId is None \
-            and revisionExpr \
-            and IsId(revisionExpr):
-      self.revisionId = revisionExpr
-    else:
-      self.revisionId = revisionId
+    self.UpdatePaths(relpath, worktree, gitdir, objdir)
+    self.SetRevision(revisionExpr, revisionId=revisionId)
 
     self.rebase = rebase
     self.groups = groups
@@ -850,24 +530,19 @@
     self.clone_depth = clone_depth
     self.upstream = upstream
     self.parent = parent
+    # NB: Do not use this setting in __init__ to change behavior so that the
+    # manifest.git checkout can inspect & change it after instantiating.  See
+    # the XmlManifest init code for more info.
+    self.use_git_worktrees = use_git_worktrees
     self.is_derived = is_derived
     self.optimized_fetch = optimized_fetch
+    self.retry_fetches = max(0, retry_fetches)
     self.subprojects = []
 
     self.snapshots = {}
     self.copyfiles = []
     self.linkfiles = []
     self.annotations = []
-    self.config = GitConfig.ForRepository(gitdir=self.gitdir,
-                                          defaults=self.manifest.globalConfig)
-
-    if self.worktree:
-      self.work_git = self._GitGetByExec(self, bare=False, gitdir=gitdir)
-    else:
-      self.work_git = None
-    self.bare_git = self._GitGetByExec(self, bare=True, gitdir=gitdir)
-    self.bare_ref = GitRefs(gitdir)
-    self.bare_objdir = self._GitGetByExec(self, bare=True, gitdir=objdir)
     self.dest_branch = dest_branch
     self.old_revision = old_revision
 
@@ -875,6 +550,35 @@
     # project containing repo hooks.
     self.enabled_repo_hooks = []
 
+  def SetRevision(self, revisionExpr, revisionId=None):
+    """Set revisionId based on revision expression and id"""
+    self.revisionExpr = revisionExpr
+    if revisionId is None and revisionExpr and IsId(revisionExpr):
+      self.revisionId = self.revisionExpr
+    else:
+      self.revisionId = revisionId
+
+  def UpdatePaths(self, relpath, worktree, gitdir, objdir):
+    """Update paths used by this project"""
+    self.gitdir = gitdir.replace('\\', '/')
+    self.objdir = objdir.replace('\\', '/')
+    if worktree:
+      self.worktree = os.path.normpath(worktree).replace('\\', '/')
+    else:
+      self.worktree = None
+    self.relpath = relpath
+
+    self.config = GitConfig.ForRepository(gitdir=self.gitdir,
+                                          defaults=self.manifest.globalConfig)
+
+    if self.worktree:
+      self.work_git = self._GitGetByExec(self, bare=False, gitdir=self.gitdir)
+    else:
+      self.work_git = None
+    self.bare_git = self._GitGetByExec(self, bare=True, gitdir=self.gitdir)
+    self.bare_ref = GitRefs(self.gitdir)
+    self.bare_objdir = self._GitGetByExec(self, bare=True, gitdir=self.objdir)
+
   @property
   def Derived(self):
     return self.is_derived
@@ -902,11 +606,9 @@
     return None
 
   def IsRebaseInProgress(self):
-    w = self.worktree
-    g = os.path.join(w, '.git')
-    return os.path.exists(os.path.join(g, 'rebase-apply')) \
-        or os.path.exists(os.path.join(g, 'rebase-merge')) \
-        or os.path.exists(os.path.join(w, '.dotest'))
+    return (os.path.exists(self.work_git.GetDotgitPath('rebase-apply')) or
+            os.path.exists(self.work_git.GetDotgitPath('rebase-merge')) or
+            os.path.exists(os.path.join(self.worktree, '.dotest')))
 
   def IsDirty(self, consider_untracked=True):
     """Is the working directory modified in some way?
@@ -1152,10 +854,12 @@
 
     return 'DIRTY'
 
-  def PrintWorkTreeDiff(self, absolute_paths=False):
+  def PrintWorkTreeDiff(self, absolute_paths=False, output_redir=None):
     """Prints the status of the repository to stdout.
     """
     out = DiffColoring(self.config)
+    if output_redir:
+      out.redirect(output_redir)
     cmd = ['diff']
     if out.is_on:
       cmd.append('--color')
@@ -1169,6 +873,7 @@
                      cmd,
                      capture_stdout=True,
                      capture_stderr=True)
+      p.Wait()
     except GitError as e:
       out.nl()
       out.project('project %s/' % self.relpath)
@@ -1176,21 +881,14 @@
       out.fail('%s', str(e))
       out.nl()
       return False
-    has_diff = False
-    for line in p.process.stdout:
-      if not hasattr(line, 'encode'):
-        line = line.decode()
-      if not has_diff:
-        out.nl()
-        out.project('project %s/' % self.relpath)
-        out.nl()
-        has_diff = True
-      print(line[:-1])
+    if p.stdout:
+      out.nl()
+      out.project('project %s/' % self.relpath)
+      out.nl()
+      out.write('%s', p.stdout)
     return p.Wait() == 0
 
-
 # Publish / Upload ##
-
   def WasPublished(self, branch, all_refs=None):
     """Was the branch published (uploaded) for code review?
        If so, returns the SHA-1 hash of the last published
@@ -1263,8 +961,10 @@
 
   def UploadForReview(self, branch=None,
                       people=([], []),
+                      dryrun=False,
                       auto_topic=False,
-                      draft=False,
+                      hashtags=(),
+                      labels=(),
                       private=False,
                       notify=None,
                       wip=False,
@@ -1299,6 +999,8 @@
     if url is None:
       raise UploadError('review not configured')
     cmd = ['push']
+    if dryrun:
+      cmd.append('-n')
 
     if url.startswith('ssh://'):
       cmd.append('--receive-pack=gerrit receive-pack')
@@ -1312,15 +1014,12 @@
     if dest_branch.startswith(R_HEADS):
       dest_branch = dest_branch[len(R_HEADS):]
 
-    upload_type = 'for'
-    if draft:
-      upload_type = 'drafts'
-
-    ref_spec = '%s:refs/%s/%s' % (R_HEADS + branch.name, upload_type,
-                                  dest_branch)
+    ref_spec = '%s:refs/for/%s' % (R_HEADS + branch.name, dest_branch)
     opts = []
     if auto_topic:
       opts += ['topic=' + branch.name]
+    opts += ['t=%s' % p for p in hashtags]
+    opts += ['l=%s' % p for p in labels]
 
     opts += ['r=%s' % p for p in people[0]]
     opts += ['cc=%s' % p for p in people[1]]
@@ -1337,14 +1036,13 @@
     if GitCommand(self, cmd, bare=True).Wait() != 0:
       raise UploadError('Upload failed')
 
-    msg = "posted to %s for %s" % (branch.remote.review, dest_branch)
-    self.bare_git.UpdateRef(R_PUB + branch.name,
-                            R_HEADS + branch.name,
-                            message=msg)
-
+    if not dryrun:
+      msg = "posted to %s for %s" % (branch.remote.review, dest_branch)
+      self.bare_git.UpdateRef(R_PUB + branch.name,
+                              R_HEADS + branch.name,
+                              message=msg)
 
 # Sync ##
-
   def _ExtractArchive(self, tarpath, path=None):
     """Extract the given tar on its current location
 
@@ -1362,16 +1060,21 @@
 
   def Sync_NetworkHalf(self,
                        quiet=False,
+                       verbose=False,
+                       output_redir=None,
                        is_new=None,
-                       current_branch_only=False,
+                       current_branch_only=None,
                        force_sync=False,
                        clone_bundle=True,
-                       no_tags=False,
+                       tags=None,
                        archive=False,
                        optimized_fetch=False,
+                       retry_fetches=0,
                        prune=False,
                        submodules=False,
-                       clone_filter=None):
+                       ssh_proxy=None,
+                       clone_filter=None,
+                       partial_clone_exclude=set()):
     """Perform only the network IO portion of the sync process.
        Local working directory/branch state is not affected.
     """
@@ -1402,12 +1105,22 @@
         _warn("Cannot remove archive %s: %s", tarpath, str(e))
       self._CopyAndLinkFiles()
       return True
+
+    # If the shared object dir already exists, don't try to rebootstrap with a
+    # clone bundle download.  We should have the majority of objects already.
+    if clone_bundle and os.path.exists(self.objdir):
+      clone_bundle = False
+
+    if self.name in partial_clone_exclude:
+      clone_bundle = True
+      clone_filter = None
+
     if is_new is None:
       is_new = not self.Exists
     if is_new:
-      self._InitGitDir(force_sync=force_sync)
+      self._InitGitDir(force_sync=force_sync, quiet=quiet)
     else:
-      self._UpdateHooks()
+      self._UpdateHooks(quiet=quiet)
     self._InitRemote()
 
     if is_new:
@@ -1421,12 +1134,12 @@
     else:
       alt_dir = None
 
-    if clone_bundle \
-            and alt_dir is None \
-            and self._ApplyCloneBundle(initial=is_new, quiet=quiet):
+    if (clone_bundle
+            and alt_dir is None
+            and self._ApplyCloneBundle(initial=is_new, quiet=quiet, verbose=verbose)):
       is_new = False
 
-    if not current_branch_only:
+    if current_branch_only is None:
       if self.sync_c:
         current_branch_only = True
       elif not self.manifest._loaded:
@@ -1435,25 +1148,27 @@
       elif self.manifest.default.sync_c:
         current_branch_only = True
 
-    if not no_tags:
-      if not self.sync_tags:
-        no_tags = True
+    if tags is None:
+      tags = self.sync_tags
 
     if self.clone_depth:
       depth = self.clone_depth
     else:
       depth = self.manifest.manifestProject.config.GetString('repo.depth')
 
-    need_to_fetch = not (optimized_fetch and
-                         (ID_RE.match(self.revisionExpr) and
-                          self._CheckForImmutableRevision()))
-    if (need_to_fetch and
-        not self._RemoteFetch(initial=is_new, quiet=quiet, alt_dir=alt_dir,
-                              current_branch_only=current_branch_only,
-                              no_tags=no_tags, prune=prune, depth=depth,
-                              submodules=submodules, force_sync=force_sync,
-                              clone_filter=clone_filter)):
-      return False
+    # See if we can skip the network fetch entirely.
+    if not (optimized_fetch and
+            (ID_RE.match(self.revisionExpr) and
+             self._CheckForImmutableRevision())):
+      if not self._RemoteFetch(
+              initial=is_new,
+              quiet=quiet, verbose=verbose, output_redir=output_redir,
+              alt_dir=alt_dir, current_branch_only=current_branch_only,
+              tags=tags, prune=prune, depth=depth,
+              submodules=submodules, force_sync=force_sync,
+              ssh_proxy=ssh_proxy,
+              clone_filter=clone_filter, retry_fetches=retry_fetches):
+        return False
 
     mp = self.manifest.manifestProject
     dissociate = mp.config.GetBoolean('repo.dissociate')
@@ -1461,7 +1176,11 @@
       alternates_file = os.path.join(self.gitdir, 'objects/info/alternates')
       if os.path.exists(alternates_file):
         cmd = ['repack', '-a', '-d']
-        if GitCommand(self, cmd, bare=True).Wait() != 0:
+        p = GitCommand(self, cmd, bare=True, capture_stdout=bool(output_redir),
+                       merge_output=bool(output_redir))
+        if p.stdout and output_redir:
+          output_redir.write(p.stdout)
+        if p.Wait() != 0:
           return False
         platform_utils.remove(alternates_file)
 
@@ -1469,17 +1188,15 @@
       self._InitMRef()
     else:
       self._InitMirrorHead()
-      try:
-        platform_utils.remove(os.path.join(self.gitdir, 'FETCH_HEAD'))
-      except OSError:
-        pass
+      platform_utils.remove(os.path.join(self.gitdir, 'FETCH_HEAD'),
+                            missing_ok=True)
     return True
 
   def PostRepoUpgrade(self):
     self._InitHooks()
 
   def _CopyAndLinkFiles(self):
-    if self.manifest.isGitcClient:
+    if self.client.isGitcClient:
       return
     for copyfile in self.copyfiles:
       copyfile._Copy()
@@ -1518,6 +1235,12 @@
       raise ManifestInvalidRevisionError('revision %s in %s not found' %
                                          (self.revisionExpr, self.name))
 
+  def SetRevisionId(self, revisionId):
+    if self.revisionExpr:
+      self.upstream = self.revisionExpr
+
+    self.revisionId = revisionId
+
   def Sync_LocalHalf(self, syncbuf, force_sync=False, submodules=False):
     """Perform only the local IO portion of the sync process.
        Network access is not required.
@@ -1534,6 +1257,18 @@
     self.CleanPublishedCache(all_refs)
     revid = self.GetRevisionId(all_refs)
 
+    # Special case the root of the repo client checkout.  Make sure it doesn't
+    # contain files being checked out to dirs we don't allow.
+    if self.relpath == '.':
+      PROTECTED_PATHS = {'.repo'}
+      paths = set(self.work_git.ls_tree('-z', '--name-only', '--', revid).split('\0'))
+      bad_paths = paths & PROTECTED_PATHS
+      if bad_paths:
+        syncbuf.fail(self,
+                     'Refusing to checkout project that writes to protected '
+                     'paths: %s' % (', '.join(bad_paths),))
+        return
+
     def _doff():
       self._FastForward(revid)
       self._CopyAndLinkFiles()
@@ -1712,21 +1447,28 @@
       if submodules:
         syncbuf.later1(self, _dosubmodules)
 
-  def AddCopyFile(self, src, dest, absdest):
-    # dest should already be an absolute path, but src is project relative
-    # make src an absolute path
-    abssrc = os.path.join(self.worktree, src)
-    self.copyfiles.append(_CopyFile(src, dest, abssrc, absdest))
+  def AddCopyFile(self, src, dest, topdir):
+    """Mark |src| for copying to |dest| (relative to |topdir|).
 
-  def AddLinkFile(self, src, dest, absdest):
-    # dest should already be an absolute path, but src is project relative
-    # make src relative path to dest
-    absdestdir = os.path.dirname(absdest)
-    relsrc = os.path.relpath(os.path.join(self.worktree, src), absdestdir)
-    self.linkfiles.append(_LinkFile(self.worktree, src, dest, relsrc, absdest))
+    No filesystem changes occur here.  Actual copying happens later on.
+
+    Paths should have basic validation run on them before being queued.
+    Further checking will be handled when the actual copy happens.
+    """
+    self.copyfiles.append(_CopyFile(self.worktree, src, topdir, dest))
+
+  def AddLinkFile(self, src, dest, topdir):
+    """Mark |dest| to create a symlink (relative to |topdir|) pointing to |src|.
+
+    No filesystem changes occur here.  Actual linking happens later on.
+
+    Paths should have basic validation run on them before being queued.
+    Further checking will be handled when the actual link happens.
+    """
+    self.linkfiles.append(_LinkFile(self.worktree, src, topdir, dest))
 
   def AddAnnotation(self, name, value, keep):
-    self.annotations.append(_Annotation(name, value, keep))
+    self.annotations.append(Annotation(name, value, keep))
 
   def DownloadPatchSet(self, change_id, patch_id):
     """Download a single patch set of a single change to FETCH_HEAD.
@@ -1744,9 +1486,123 @@
                             patch_id,
                             self.bare_git.rev_parse('FETCH_HEAD'))
 
+  def DeleteWorktree(self, quiet=False, force=False):
+    """Delete the source checkout and handle other housekeeping tasks.
+
+    This currently leaves behind the internal .repo/ cache state.  This helps
+    when switching branches or manifest changes get reverted as we don't have
+    to redownload all the git objects.  But we should do some GC at some point.
+
+    Args:
+      quiet: Whether to hide normal messages.
+      force: Always delete tree even if dirty.
+
+    Returns:
+      True if the worktree was completely cleaned out.
+    """
+    if self.IsDirty():
+      if force:
+        print('warning: %s: Removing dirty project: uncommitted changes lost.' %
+              (self.relpath,), file=sys.stderr)
+      else:
+        print('error: %s: Cannot remove project: uncommitted changes are '
+              'present.\n' % (self.relpath,), file=sys.stderr)
+        return False
+
+    if not quiet:
+      print('%s: Deleting obsolete checkout.' % (self.relpath,))
+
+    # Unlock and delink from the main worktree.  We don't use git's worktree
+    # remove because it will recursively delete projects -- we handle that
+    # ourselves below.  https://crbug.com/git/48
+    if self.use_git_worktrees:
+      needle = platform_utils.realpath(self.gitdir)
+      # Find the git worktree commondir under .repo/worktrees/.
+      output = self.bare_git.worktree('list', '--porcelain').splitlines()[0]
+      assert output.startswith('worktree '), output
+      commondir = output[9:]
+      # Walk each of the git worktrees to see where they point.
+      configs = os.path.join(commondir, 'worktrees')
+      for name in os.listdir(configs):
+        gitdir = os.path.join(configs, name, 'gitdir')
+        with open(gitdir) as fp:
+          relpath = fp.read().strip()
+        # Resolve the checkout path and see if it matches this project.
+        fullpath = platform_utils.realpath(os.path.join(configs, name, relpath))
+        if fullpath == needle:
+          platform_utils.rmtree(os.path.join(configs, name))
+
+    # Delete the .git directory first, so we're less likely to have a partially
+    # working git repository around. There shouldn't be any git projects here,
+    # so rmtree works.
+
+    # Try to remove plain files first in case of git worktrees.  If this fails
+    # for any reason, we'll fall back to rmtree, and that'll display errors if
+    # it can't remove things either.
+    try:
+      platform_utils.remove(self.gitdir)
+    except OSError:
+      pass
+    try:
+      platform_utils.rmtree(self.gitdir)
+    except OSError as e:
+      if e.errno != errno.ENOENT:
+        print('error: %s: %s' % (self.gitdir, e), file=sys.stderr)
+        print('error: %s: Failed to delete obsolete checkout; remove manually, '
+              'then run `repo sync -l`.' % (self.relpath,), file=sys.stderr)
+        return False
+
+    # Delete everything under the worktree, except for directories that contain
+    # another git project.
+    dirs_to_remove = []
+    failed = False
+    for root, dirs, files in platform_utils.walk(self.worktree):
+      for f in files:
+        path = os.path.join(root, f)
+        try:
+          platform_utils.remove(path)
+        except OSError as e:
+          if e.errno != errno.ENOENT:
+            print('error: %s: Failed to remove: %s' % (path, e), file=sys.stderr)
+            failed = True
+      dirs[:] = [d for d in dirs
+                 if not os.path.lexists(os.path.join(root, d, '.git'))]
+      dirs_to_remove += [os.path.join(root, d) for d in dirs
+                         if os.path.join(root, d) not in dirs_to_remove]
+    for d in reversed(dirs_to_remove):
+      if platform_utils.islink(d):
+        try:
+          platform_utils.remove(d)
+        except OSError as e:
+          if e.errno != errno.ENOENT:
+            print('error: %s: Failed to remove: %s' % (d, e), file=sys.stderr)
+            failed = True
+      elif not platform_utils.listdir(d):
+        try:
+          platform_utils.rmdir(d)
+        except OSError as e:
+          if e.errno != errno.ENOENT:
+            print('error: %s: Failed to remove: %s' % (d, e), file=sys.stderr)
+            failed = True
+    if failed:
+      print('error: %s: Failed to delete obsolete checkout.' % (self.relpath,),
+            file=sys.stderr)
+      print('       Remove manually, then run `repo sync -l`.', file=sys.stderr)
+      return False
+
+    # Try deleting parent dirs if they are empty.
+    path = self.worktree
+    while path != self.manifest.topdir:
+      try:
+        platform_utils.rmdir(path)
+      except OSError as e:
+        if e.errno != errno.ENOENT:
+          break
+      path = os.path.dirname(path)
+
+    return True
 
 # Branch Management ##
-
   def StartBranch(self, name, branch_merge='', revision=None):
     """Create a new branch off the manifest's revision.
     """
@@ -1780,14 +1636,9 @@
       except KeyError:
         head = None
     if revid and head and revid == head:
-      ref = os.path.join(self.gitdir, R_HEADS + name)
-      try:
-        os.makedirs(os.path.dirname(ref))
-      except OSError:
-        pass
-      _lwrite(ref, '%s\n' % revid)
-      _lwrite(os.path.join(self.worktree, '.git', HEAD),
-              'ref: %s%s\n' % (R_HEADS, name))
+      ref = R_HEADS + name
+      self.work_git.update_ref(ref, revid)
+      self.work_git.symbolic_ref(HEAD, ref)
       branch.Save()
       return True
 
@@ -1834,7 +1685,7 @@
       # Same revision; just update HEAD to point to the new
       # target branch, but otherwise take no other action.
       #
-      _lwrite(os.path.join(self.worktree, '.git', HEAD),
+      _lwrite(self.work_git.GetDotgitPath(subpath=HEAD),
               'ref: %s%s\n' % (R_HEADS, name))
       return True
 
@@ -1868,8 +1719,7 @@
 
       revid = self.GetRevisionId(all_refs)
       if head == revid:
-        _lwrite(os.path.join(self.worktree, '.git', HEAD),
-                '%s\n' % revid)
+        _lwrite(self.work_git.GetDotgitPath(subpath=HEAD), '%s\n' % revid)
       else:
         self._Checkout(revid, quiet=True)
 
@@ -1890,6 +1740,11 @@
         if cb is None or name != cb:
           kill.append(name)
 
+    # Minor optimization: If there's nothing to prune, then don't try to read
+    # any project state.
+    if not kill and not cb:
+      return []
+
     rev = self.GetRevisionId(left)
     if cb is not None \
        and not self._revlist(HEAD + '...' + rev) \
@@ -1935,9 +1790,7 @@
         kept.append(ReviewableBranch(self, branch, base))
     return kept
 
-
 # Submodule Management ##
-
   def GetRegisteredSubprojects(self):
     result = []
 
@@ -2088,13 +1941,57 @@
       result.extend(subproject.GetDerivedSubprojects())
     return result
 
-
 # Direct Git Commands ##
+  def EnableRepositoryExtension(self, key, value='true', version=1):
+    """Enable git repository extension |key| with |value|.
+
+    Args:
+      key: The extension to enable.  Omit the "extensions." prefix.
+      value: The value to use for the extension.
+      version: The minimum git repository version needed.
+    """
+    # Make sure the git repo version is already new enough.
+    found_version = self.config.GetInt('core.repositoryFormatVersion')
+    if found_version is None:
+      found_version = 0
+    if found_version < version:
+      self.config.SetString('core.repositoryFormatVersion', str(version))
+
+    # Enable the extension!
+    self.config.SetString('extensions.%s' % (key,), value)
+
+  def ResolveRemoteHead(self, name=None):
+    """Find out what the default branch (HEAD) points to.
+
+    Normally this points to refs/heads/master, but projects are moving to main.
+    Support whatever the server uses rather than hardcoding "master" ourselves.
+    """
+    if name is None:
+      name = self.remote.name
+
+    # The output will look like (NB: tabs are separators):
+    # ref: refs/heads/master	HEAD
+    # 5f6803b100bb3cd0f534e96e88c91373e8ed1c44	HEAD
+    output = self.bare_git.ls_remote('-q', '--symref', '--exit-code', name, 'HEAD')
+
+    for line in output.splitlines():
+      lhs, rhs = line.split('\t', 1)
+      if rhs == 'HEAD' and lhs.startswith('ref:'):
+        return lhs[4:].strip()
+
+    return None
+
   def _CheckForImmutableRevision(self):
     try:
       # if revision (sha or tag) is not present then following function
       # throws an error.
-      self.bare_git.rev_parse('--verify', '%s^0' % self.revisionExpr)
+      self.bare_git.rev_list('-1', '--missing=allow-any',
+                             '%s^0' % self.revisionExpr, '--')
+      if self.upstream:
+        rev = self.GetRemote(self.remote.name).ToLocal(self.upstream)
+        self.bare_git.rev_list('-1', '--missing=allow-any',
+                               '%s^0' % rev, '--')
+        self.bare_git.merge_base('--is-ancestor', self.revisionExpr, rev)
       return True
     except GitError:
       # There is no such persistent revision. We have to fetch it.
@@ -2117,14 +2014,19 @@
                    current_branch_only=False,
                    initial=False,
                    quiet=False,
+                   verbose=False,
+                   output_redir=None,
                    alt_dir=None,
-                   no_tags=False,
+                   tags=True,
                    prune=False,
                    depth=None,
                    submodules=False,
+                   ssh_proxy=None,
                    force_sync=False,
-                   clone_filter=None):
-
+                   clone_filter=None,
+                   retry_fetches=2,
+                   retry_sleep_initial_sec=4.0,
+                   retry_exp_factor=2.0):
     is_sha1 = False
     tag_name = None
     # The depth should not be used when fetching to a mirror because
@@ -2147,7 +2049,7 @@
 
       if is_sha1 or tag_name is not None:
         if self._CheckForImmutableRevision():
-          if not quiet:
+          if verbose:
             print('Skipped fetching project %s (already have persistent ref)'
                   % self.name)
           return True
@@ -2167,16 +2069,14 @@
     if not name:
       name = self.remote.name
 
-    ssh_proxy = False
     remote = self.GetRemote(name)
-    if remote.PreConnectFetch():
-      ssh_proxy = True
+    if not remote.PreConnectFetch(ssh_proxy):
+      ssh_proxy = None
 
     if initial:
       if alt_dir and 'objects' == os.path.basename(alt_dir):
         ref_dir = os.path.dirname(alt_dir)
         packed_refs = os.path.join(self.gitdir, 'packed-refs')
-        remote = self.GetRemote(name)
 
         all_refs = self.bare_ref.all
         ids = set(all_refs.values())
@@ -2217,7 +2117,7 @@
     if clone_filter:
       git_require((2, 19, 0), fail=True, msg='partial clones')
       cmd.append('--filter=%s' % clone_filter)
-      self.config.SetString('extensions.partialclone', self.remote.name)
+      self.EnableRepositoryExtension('partialclone', self.remote.name)
 
     if depth:
       cmd.append('--depth=%s' % depth)
@@ -2229,8 +2129,10 @@
       if os.path.exists(os.path.join(self.gitdir, 'shallow')):
         cmd.append('--depth=2147483647')
 
-    if quiet:
+    if not verbose:
       cmd.append('--quiet')
+    if not quiet and sys.stdout.isatty():
+      cmd.append('--progress')
     if not self.worktree:
       cmd.append('--update-head-ok')
     cmd.append(name)
@@ -2257,10 +2159,12 @@
     else:
       branch = self.revisionExpr
     if (not self.manifest.IsMirror and is_sha1 and depth
-        and git_require((1, 8, 3))):
+            and git_require((1, 8, 3))):
       # Shallow checkout of a specific commit, fetch from that commit and not
       # the heads only as the commit might be deeper in the history.
       spec.append(branch)
+      if self.upstream:
+        spec.append(self.upstream)
     else:
       if is_sha1:
         branch = self.upstream
@@ -2276,7 +2180,7 @@
 
     # If using depth then we should not get all the tags since they may
     # be outside of the depth.
-    if no_tags or depth:
+    if not tags or depth:
       cmd.append('--no-tags')
     else:
       cmd.append('--tags')
@@ -2284,22 +2188,42 @@
 
     cmd.extend(spec)
 
-    ok = False
-    for _i in range(2):
-      gitcmd = GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy)
+    # Ensure at least one retry so the 'git remote prune' fallback below can run.
+    retry_fetches = max(retry_fetches, 2)
+    retry_cur_sleep = retry_sleep_initial_sec
+    ok = prune_tried = False
+    for try_n in range(retry_fetches):
+      gitcmd = GitCommand(self, cmd, bare=True, ssh_proxy=ssh_proxy,
+                          merge_output=True, capture_stdout=quiet or bool(output_redir))
+      if gitcmd.stdout and not quiet and output_redir:
+        output_redir.write(gitcmd.stdout)
       ret = gitcmd.Wait()
       if ret == 0:
         ok = True
         break
-      # If needed, run the 'git remote prune' the first time through the loop
-      elif (not _i and
-            "error:" in gitcmd.stderr and
-            "git remote prune" in gitcmd.stderr):
+
+      # Retry later due to HTTP 429 Too Many Requests.
+      elif (gitcmd.stdout and
+            'error:' in gitcmd.stdout and
+            'HTTP 429' in gitcmd.stdout):
+        # Fallthru to sleep+retry logic at the bottom.
+        pass
+
+      # Try to prune remote branches once in case there are conflicts.
+      # For example, if the remote had refs/heads/upstream, but deleted that and
+      # now has refs/heads/upstream/foo.
+      elif (gitcmd.stdout and
+            'error:' in gitcmd.stdout and
+            'git remote prune' in gitcmd.stdout and
+            not prune_tried):
+        prune_tried = True
         prunecmd = GitCommand(self, ['remote', 'prune', name], bare=True,
                               ssh_proxy=ssh_proxy)
         ret = prunecmd.Wait()
         if ret:
           break
+        print('retrying fetch after pruning remote branches', file=output_redir)
+        # Continue right away so we don't sleep as we shouldn't need to.
         continue
       elif current_branch_only and is_sha1 and ret == 128:
         # Exit code 128 means "couldn't find the ref you asked for"; if we're
@@ -2309,7 +2233,18 @@
       elif ret < 0:
         # Git died with a signal, exit immediately
         break
-      time.sleep(random.randint(30, 45))
+
+      # Figure out how long to sleep before the next attempt, if there is one.
+      if not verbose and gitcmd.stdout:
+        print('\n%s:\n%s' % (self.name, gitcmd.stdout), end='', file=output_redir)
+      if try_n < retry_fetches - 1:
+        print('%s: sleeping %s seconds before retrying' % (self.name, retry_cur_sleep),
+              file=output_redir)
+        time.sleep(retry_cur_sleep)
+        retry_cur_sleep = min(retry_exp_factor * retry_cur_sleep,
+                              MAXIMUM_RETRY_SLEEP_SEC)
+        retry_cur_sleep *= (1 - random.uniform(-RETRY_JITTER_PERCENT,
+                                               RETRY_JITTER_PERCENT))
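The sleep computation above is capped exponential backoff with symmetric jitter. A minimal standalone sketch, assuming placeholder values for `MAXIMUM_RETRY_SLEEP_SEC` and `RETRY_JITTER_PERCENT` (the real constants are defined elsewhere in the module):

```python
import random

MAXIMUM_RETRY_SLEEP_SEC = 3600.0  # assumed cap; the real constant lives elsewhere
RETRY_JITTER_PERCENT = 0.1        # assumed +/-10% jitter

def next_sleep(cur_sleep, factor=2.0):
    """Grow the delay geometrically, cap it, then jitter it by +/-10% so
    parallel fetchers don't all retry at the same instant."""
    capped = min(factor * cur_sleep, MAXIMUM_RETRY_SLEEP_SEC)
    return capped * (1 - random.uniform(-RETRY_JITTER_PERCENT,
                                        RETRY_JITTER_PERCENT))
```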
 
     if initial:
       if alt_dir:
@@ -2324,21 +2259,17 @@
       # got what we wanted, else trigger a second run of all
       # refs.
       if not self._CheckForImmutableRevision():
-        if current_branch_only and depth:
-          # Sync the current branch only with depth set to None
-          return self._RemoteFetch(name=name,
-                                   current_branch_only=current_branch_only,
-                                   initial=False, quiet=quiet, alt_dir=alt_dir,
-                                   depth=None, clone_filter=clone_filter)
-        else:
-          # Avoid infinite recursion: sync all branches with depth set to None
-          return self._RemoteFetch(name=name, current_branch_only=False,
-                                   initial=False, quiet=quiet, alt_dir=alt_dir,
-                                   depth=None, clone_filter=clone_filter)
+        # Sync the current branch only with depth set to None.
+        # We always pass depth=None down to avoid infinite recursion.
+        return self._RemoteFetch(
+            name=name, quiet=quiet, verbose=verbose, output_redir=output_redir,
+            current_branch_only=current_branch_only and depth,
+            initial=False, alt_dir=alt_dir,
+            depth=None, ssh_proxy=ssh_proxy, clone_filter=clone_filter)
 
     return ok
 
-  def _ApplyCloneBundle(self, initial=False, quiet=False):
+  def _ApplyCloneBundle(self, initial=False, quiet=False, verbose=False):
     if initial and \
         (self.manifest.manifestProject.config.GetString('repo.depth') or
          self.clone_depth):
@@ -2362,13 +2293,16 @@
       return False
 
     if not exist_dst:
-      exist_dst = self._FetchBundle(bundle_url, bundle_tmp, bundle_dst, quiet)
+      exist_dst = self._FetchBundle(bundle_url, bundle_tmp, bundle_dst, quiet,
+                                    verbose)
     if not exist_dst:
       return False
 
     cmd = ['fetch']
-    if quiet:
+    if not verbose:
       cmd.append('--quiet')
+    if not quiet and sys.stdout.isatty():
+      cmd.append('--progress')
     if not self.worktree:
       cmd.append('--update-head-ok')
     cmd.append(bundle_dst)
@@ -2377,19 +2311,16 @@
     cmd.append('+refs/tags/*:refs/tags/*')
 
     ok = GitCommand(self, cmd, bare=True).Wait() == 0
-    if os.path.exists(bundle_dst):
-      platform_utils.remove(bundle_dst)
-    if os.path.exists(bundle_tmp):
-      platform_utils.remove(bundle_tmp)
+    platform_utils.remove(bundle_dst, missing_ok=True)
+    platform_utils.remove(bundle_tmp, missing_ok=True)
     return ok
 
-  def _FetchBundle(self, srcUrl, tmpPath, dstPath, quiet):
-    if os.path.exists(dstPath):
-      platform_utils.remove(dstPath)
+  def _FetchBundle(self, srcUrl, tmpPath, dstPath, quiet, verbose):
+    platform_utils.remove(dstPath, missing_ok=True)
 
     cmd = ['curl', '--fail', '--output', tmpPath, '--netrc', '--location']
     if quiet:
-      cmd += ['--silent']
+      cmd += ['--silent', '--show-error']
     if os.path.exists(tmpPath):
       size = os.stat(tmpPath).st_size
       if size >= 1024:
@@ -2411,22 +2342,30 @@
 
       if IsTrace():
         Trace('%s', ' '.join(cmd))
+      if verbose:
+        print('%s: Downloading bundle: %s' % (self.name, srcUrl))
+      stdout = None if verbose else subprocess.PIPE
+      stderr = None if verbose else subprocess.STDOUT
       try:
-        proc = subprocess.Popen(cmd)
+        proc = subprocess.Popen(cmd, stdout=stdout, stderr=stderr)
       except OSError:
         return False
 
-      curlret = proc.wait()
+      (output, _) = proc.communicate()
+      curlret = proc.returncode
 
       if curlret == 22:
         # From curl man page:
         # 22: HTTP page not retrieved. The requested url was not found or
         # returned another error with the HTTP error code being 400 or above.
         # This return code only appears if -f, --fail is used.
-        if not quiet:
-          print("Server does not provide clone.bundle; ignoring.",
-                file=sys.stderr)
+        if verbose:
+          print('%s: Unable to retrieve clone.bundle; ignoring.' % self.name)
+          if output:
+            print('Curl output:\n%s' % output)
         return False
+      elif curlret and not verbose and output:
+        print('%s' % output, file=sys.stderr)
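With `--fail`, curl exits 22 on any HTTP status of 400 or above, which the code above treats as "the server offers no clone.bundle". A sketch of how the command is assembled (the helper name is hypothetical, and the resume/size flags handled elsewhere are omitted):

```python
def build_curl_cmd(src_url, tmp_path, quiet):
    """Build the curl argv used to fetch a clone.bundle."""
    cmd = ['curl', '--fail', '--output', tmp_path, '--netrc', '--location']
    if quiet:
        # --show-error keeps hard failures visible even in silent mode.
        cmd += ['--silent', '--show-error']
    cmd.append(src_url)
    return cmd
```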
 
     if os.path.exists(tmpPath):
       if curlret == 0 and self._IsValidBundle(tmpPath, quiet):
@@ -2460,8 +2399,12 @@
       if self._allrefs:
         raise GitError('%s checkout %s ' % (self.name, rev))
 
-  def _CherryPick(self, rev):
+  def _CherryPick(self, rev, ffonly=False, record_origin=False):
     cmd = ['cherry-pick']
+    if ffonly:
+      cmd.append('--ff')
+    if record_origin:
+      cmd.append('-x')
     cmd.append(rev)
     cmd.append('--')
     if GitCommand(self, cmd).Wait() != 0:
@@ -2508,13 +2451,13 @@
       raise GitError('%s rebase %s ' % (self.name, upstream))
 
   def _FastForward(self, head, ffonly=False):
-    cmd = ['merge', head]
+    cmd = ['merge', '--no-stat', head]
     if ffonly:
       cmd.append("--ff-only")
     if GitCommand(self, cmd).Wait() != 0:
       raise GitError('%s merge %s ' % (self.name, head))
 
-  def _InitGitDir(self, mirror_git=None, force_sync=False):
+  def _InitGitDir(self, mirror_git=None, force_sync=False, quiet=False):
     init_git_dir = not os.path.exists(self.gitdir)
     init_obj_dir = not os.path.exists(self.objdir)
     try:
@@ -2523,6 +2466,12 @@
         os.makedirs(self.objdir)
         self.bare_objdir.init()
 
+        if self.use_git_worktrees:
+          # Enable per-worktree config file support if possible.  This is more
+          # of a nice-to-have feature for users than a hard requirement.
+          if git_require((2, 20, 0)):
+            self.EnableRepositoryExtension('worktreeConfig')
+
       # If we have a separate directory to hold refs, initialize it as well.
       if self.objdir != self.gitdir:
         if init_git_dir:
@@ -2542,8 +2491,9 @@
               if self.worktree and os.path.exists(platform_utils.realpath
                                                   (self.worktree)):
                 platform_utils.rmtree(platform_utils.realpath(self.worktree))
-              return self._InitGitDir(mirror_git=mirror_git, force_sync=False)
-            except:
+              return self._InitGitDir(mirror_git=mirror_git, force_sync=False,
+                                      quiet=quiet)
+            except Exception:
               raise e
           raise e
 
@@ -2556,13 +2506,15 @@
             mirror_git = os.path.join(ref_dir, self.name + '.git')
           repo_git = os.path.join(ref_dir, '.repo', 'projects',
                                   self.relpath + '.git')
+          worktrees_git = os.path.join(ref_dir, '.repo', 'worktrees',
+                                       self.name + '.git')
 
           if os.path.exists(mirror_git):
             ref_dir = mirror_git
-
           elif os.path.exists(repo_git):
             ref_dir = repo_git
-
+          elif os.path.exists(worktrees_git):
+            ref_dir = worktrees_git
           else:
             ref_dir = None
 
@@ -2574,7 +2526,7 @@
             _lwrite(os.path.join(self.gitdir, 'objects/info/alternates'),
                     os.path.join(ref_dir, 'objects') + '\n')
 
-        self._UpdateHooks()
+        self._UpdateHooks(quiet=quiet)
 
         m = self.manifest.manifestProject.config
         for key in ['user.name', 'user.email']:
@@ -2582,10 +2534,7 @@
             self.config.SetString(key, m.GetString(key))
         self.config.SetString('filter.lfs.smudge', 'git-lfs smudge --skip -- %f')
         self.config.SetString('filter.lfs.process', 'git-lfs filter-process --skip')
-        if self.manifest.IsMirror:
-          self.config.SetString('core.bare', 'true')
-        else:
-          self.config.SetString('core.bare', None)
+        self.config.SetBoolean('core.bare', True if self.manifest.IsMirror else None)
     except Exception:
       if init_obj_dir and os.path.exists(self.objdir):
         platform_utils.rmtree(self.objdir)
@@ -2593,11 +2542,11 @@
         platform_utils.rmtree(self.gitdir)
       raise
 
-  def _UpdateHooks(self):
+  def _UpdateHooks(self, quiet=False):
     if os.path.exists(self.gitdir):
-      self._InitHooks()
+      self._InitHooks(quiet=quiet)
 
-  def _InitHooks(self):
+  def _InitHooks(self, quiet=False):
     hooks = platform_utils.realpath(self._gitdir_path('hooks'))
     if not os.path.exists(hooks):
       os.makedirs(hooks)
@@ -2617,18 +2566,23 @@
       if platform_utils.islink(dst):
         continue
       if os.path.exists(dst):
-        if filecmp.cmp(stock_hook, dst, shallow=False):
-          platform_utils.remove(dst)
-        else:
-          _warn("%s: Not replacing locally modified %s hook",
-                self.relpath, name)
-          continue
+        # If the files are the same, we'll leave it alone.  We create symlinks
+        # below by default but fall back to hardlinks if the OS blocks them.
+        # So if we're here, it's probably because we made a hardlink below.
+        if not filecmp.cmp(stock_hook, dst, shallow=False):
+          if not quiet:
+            _warn("%s: Not replacing locally modified %s hook",
+                  self.relpath, name)
+        continue
       try:
         platform_utils.symlink(
             os.path.relpath(stock_hook, os.path.dirname(dst)), dst)
       except OSError as e:
         if e.errno == errno.EPERM:
-          raise GitError(self._get_symlink_error_message())
+          try:
+            os.link(stock_hook, dst)
+          except OSError:
+            raise GitError(self._get_symlink_error_message())
         else:
           raise
 
@@ -2648,27 +2602,56 @@
 
   def _InitMRef(self):
     if self.manifest.branch:
-      self._InitAnyMRef(R_M + self.manifest.branch)
+      if self.use_git_worktrees:
+        # Set up the m/ space to point to the worktree-specific ref space.
+        # We'll update the worktree-specific ref space on each checkout.
+        ref = R_M + self.manifest.branch
+        if not self.bare_ref.symref(ref):
+          self.bare_git.symbolic_ref(
+              '-m', 'redirecting to worktree scope',
+              ref, R_WORKTREE_M + self.manifest.branch)
+
+        # We can't update this ref with git worktrees until it exists.
+        # We'll wait until the initial checkout to set it.
+        if not os.path.exists(self.worktree):
+          return
+
+        base = R_WORKTREE_M
+        active_git = self.work_git
+
+        self._InitAnyMRef(HEAD, self.bare_git, detach=True)
+      else:
+        base = R_M
+        active_git = self.bare_git
+
+      self._InitAnyMRef(base + self.manifest.branch, active_git)
 
   def _InitMirrorHead(self):
-    self._InitAnyMRef(HEAD)
+    self._InitAnyMRef(HEAD, self.bare_git)
 
-  def _InitAnyMRef(self, ref):
+  def _InitAnyMRef(self, ref, active_git, detach=False):
     cur = self.bare_ref.symref(ref)
 
     if self.revisionId:
       if cur != '' or self.bare_ref.get(ref) != self.revisionId:
         msg = 'manifest set to %s' % self.revisionId
         dst = self.revisionId + '^0'
-        self.bare_git.UpdateRef(ref, dst, message=msg, detach=True)
+        active_git.UpdateRef(ref, dst, message=msg, detach=True)
     else:
       remote = self.GetRemote(self.remote.name)
       dst = remote.ToLocal(self.revisionExpr)
       if cur != dst:
         msg = 'manifest set to %s' % self.revisionExpr
-        self.bare_git.symbolic_ref('-m', msg, ref, dst)
+        if detach:
+          active_git.UpdateRef(ref, dst, message=msg, detach=True)
+        else:
+          active_git.symbolic_ref('-m', msg, ref, dst)
 
   def _CheckDirReference(self, srcdir, destdir, share_refs):
+    # Git worktrees don't use symlinks to share at all.
+    if self.use_git_worktrees:
+      return
+
     symlink_files = self.shareable_files[:]
     symlink_dirs = self.shareable_dirs[:]
     if share_refs:
@@ -2676,9 +2659,31 @@
       symlink_dirs += self.working_tree_dirs
     to_symlink = symlink_files + symlink_dirs
     for name in set(to_symlink):
-      dst = platform_utils.realpath(os.path.join(destdir, name))
+      # Try to self-heal a bit in simple cases.
+      dst_path = os.path.join(destdir, name)
+      src_path = os.path.join(srcdir, name)
+
+      if name in self.working_tree_dirs:
+        # If the dir is missing under .repo/projects/, create it.
+        if not os.path.exists(src_path):
+          os.makedirs(src_path)
+
+      elif name in self.working_tree_files:
+        # If it's a file under the checkout .git/ and the .repo/projects/ has
+        # nothing, move the file under the .repo/projects/ tree.
+        if not os.path.exists(src_path) and os.path.isfile(dst_path):
+          platform_utils.rename(dst_path, src_path)
+
+      # If the path exists under the .repo/projects/ and there's no symlink
+      # under the checkout .git/, recreate the symlink.
+      if name in self.working_tree_dirs or name in self.working_tree_files:
+        if os.path.exists(src_path) and not os.path.exists(dst_path):
+          platform_utils.symlink(
+              os.path.relpath(src_path, os.path.dirname(dst_path)), dst_path)
+
+      dst = platform_utils.realpath(dst_path)
       if os.path.lexists(dst):
-        src = platform_utils.realpath(os.path.join(srcdir, name))
+        src = platform_utils.realpath(src_path)
         # Fail if the links are pointing to the wrong place
         if src != dst:
           _error('%s is different in %s vs %s', name, destdir, srcdir)
@@ -2735,10 +2740,7 @@
         # If the source file doesn't exist, ensure the destination
         # file doesn't either.
         if name in symlink_files and not os.path.lexists(src):
-          try:
-            platform_utils.remove(dst)
-          except OSError:
-            pass
+          platform_utils.remove(dst, missing_ok=True)
 
       except OSError as e:
         if e.errno == errno.EPERM:
@@ -2746,11 +2748,45 @@
         else:
           raise
 
+  def _InitGitWorktree(self):
+    """Init the project using git worktrees."""
+    self.bare_git.worktree('prune')
+    self.bare_git.worktree('add', '-ff', '--checkout', '--detach', '--lock',
+                           self.worktree, self.GetRevisionId())
+
+    # Rewrite the internal state files to use relative paths between the
+    # checkouts & worktrees.
+    dotgit = os.path.join(self.worktree, '.git')
+    with open(dotgit, 'r') as fp:
+      # Figure out the checkout->worktree path.
+      setting = fp.read()
+      assert setting.startswith('gitdir:')
+      git_worktree_path = setting.split(':', 1)[1].strip()
+    # Some platforms (e.g. Windows) won't let us update dotgit in situ because
+    # of file permissions.  Delete it and recreate it from scratch instead.
+    platform_utils.remove(dotgit)
+    # Use relative path from checkout->worktree & maintain Unix line endings
+    # on all OS's to match git behavior.
+    with open(dotgit, 'w', newline='\n') as fp:
+      print('gitdir:', os.path.relpath(git_worktree_path, self.worktree),
+            file=fp)
+    # Use relative path from worktree->checkout & maintain Unix line endings
+    # on all OS's to match git behavior.
+    with open(os.path.join(git_worktree_path, 'gitdir'), 'w', newline='\n') as fp:
+      print(os.path.relpath(dotgit, git_worktree_path), file=fp)
+
+    self._InitMRef()
+
   def _InitWorkTree(self, force_sync=False, submodules=False):
     realdotgit = os.path.join(self.worktree, '.git')
     tmpdotgit = realdotgit + '.tmp'
     init_dotgit = not os.path.exists(realdotgit)
     if init_dotgit:
+      if self.use_git_worktrees:
+        self._InitGitWorktree()
+        self._CopyAndLinkFiles()
+        return
+
       dotgit = tmpdotgit
       platform_utils.rmtree(tmpdotgit, ignore_errors=True)
       os.makedirs(tmpdotgit)
@@ -2766,7 +2802,7 @@
         try:
           platform_utils.rmtree(dotgit)
           return self._InitWorkTree(force_sync=False, submodules=submodules)
-        except:
+        except Exception:
           raise e
       raise e
 
@@ -2857,6 +2893,13 @@
       self._bare = bare
       self._gitdir = gitdir
 
+    # __getstate__ and __setstate__ are required for pickling because __getattr__ exists.
+    def __getstate__(self):
+      return (self._project, self._bare, self._gitdir)
+
+    def __setstate__(self, state):
+      self._project, self._bare, self._gitdir = state
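The explicit `__getstate__`/`__setstate__` pair is needed because a class with `__getattr__` can recurse during unpickling: pickle probes attributes on a bare instance before any state exists, and every miss routes through `__getattr__`. A minimal illustration with a hypothetical proxy class (not the repo code itself):

```python
import pickle

class Proxy(object):
    """Forwards unknown attributes to a wrapped target."""

    def __init__(self, target):
        self._target = target

    def __getattr__(self, name):
        # Called for any attribute not found normally.  Without explicit
        # state methods, pickle's attribute probes land here while
        # self._target doesn't exist yet, recursing forever.
        return getattr(self._target, name)

    def __getstate__(self):
        return self._target

    def __setstate__(self, state):
        self._target = state
```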
+
     def LsOthers(self):
       p = GitCommand(self._project,
                      ['ls-files',
@@ -2885,54 +2928,67 @@
                      bare=False,
                      capture_stdout=True,
                      capture_stderr=True)
-      try:
-        out = p.process.stdout.read()
-        if not hasattr(out, 'encode'):
-          out = out.decode()
-        r = {}
-        if out:
-          out = iter(out[:-1].split('\0'))
-          while out:
-            try:
-              info = next(out)
-              path = next(out)
-            except StopIteration:
-              break
+      p.Wait()
+      r = {}
+      out = p.stdout
+      if out:
+        out = iter(out[:-1].split('\0'))
+        while out:
+          try:
+            info = next(out)
+            path = next(out)
+          except StopIteration:
+            break
 
-            class _Info(object):
+          class _Info(object):
 
-              def __init__(self, path, omode, nmode, oid, nid, state):
-                self.path = path
-                self.src_path = None
-                self.old_mode = omode
-                self.new_mode = nmode
-                self.old_id = oid
-                self.new_id = nid
+            def __init__(self, path, omode, nmode, oid, nid, state):
+              self.path = path
+              self.src_path = None
+              self.old_mode = omode
+              self.new_mode = nmode
+              self.old_id = oid
+              self.new_id = nid
 
-                if len(state) == 1:
-                  self.status = state
-                  self.level = None
-                else:
-                  self.status = state[:1]
-                  self.level = state[1:]
-                  while self.level.startswith('0'):
-                    self.level = self.level[1:]
+              if len(state) == 1:
+                self.status = state
+                self.level = None
+              else:
+                self.status = state[:1]
+                self.level = state[1:]
+                while self.level.startswith('0'):
+                  self.level = self.level[1:]
 
-            info = info[1:].split(' ')
-            info = _Info(path, *info)
-            if info.status in ('R', 'C'):
-              info.src_path = info.path
-              info.path = next(out)
-            r[info.path] = info
-        return r
-      finally:
-        p.Wait()
+          info = info[1:].split(' ')
+          info = _Info(path, *info)
+          if info.status in ('R', 'C'):
+            info.src_path = info.path
+            info.path = next(out)
+          r[info.path] = info
+      return r
+
+    def GetDotgitPath(self, subpath=None):
+      """Return the full path to the .git dir.
+
+      As a convenience, append |subpath| if provided.
+      """
+      if self._bare:
+        dotgit = self._gitdir
+      else:
+        dotgit = os.path.join(self._project.worktree, '.git')
+        if os.path.isfile(dotgit):
+          # Git worktrees use a "gitdir:" syntax to point to the scratch space.
+          with open(dotgit) as fp:
+            setting = fp.read()
+          assert setting.startswith('gitdir:')
+          gitdir = setting.split(':', 1)[1].strip()
+          dotgit = os.path.normpath(os.path.join(self._project.worktree, gitdir))
+
+      return dotgit if subpath is None else os.path.join(dotgit, subpath)
 
     def GetHead(self):
-      if self._bare:
-        path = os.path.join(self._project.gitdir, HEAD)
-      else:
-        path = os.path.join(self._project.worktree, '.git', HEAD)
+      """Return the ref that HEAD points to."""
+      path = self.GetDotgitPath(subpath=HEAD)
       try:
         with open(path) as fd:
           line = fd.readline()
@@ -3027,9 +3083,6 @@
           raise TypeError('%s() got an unexpected keyword argument %r'
                           % (name, k))
         if config is not None:
-          if not git_require((1, 7, 2)):
-            raise ValueError('cannot set config on command line for %s()'
-                             % name)
           for k, v in config.items():
             cmdv.append('-c')
             cmdv.append('%s=%s' % (k, v))
@@ -3109,7 +3162,7 @@
 class _SyncColoring(Coloring):
 
   def __init__(self, config):
-    Coloring.__init__(self, config, 'reposync')
+    super().__init__(config, 'reposync')
     self.project = self.printer('header', attr='bold')
     self.info = self.printer('info')
     self.fail = self.printer('fail', fg='red')
diff --git a/pyversion.py b/pyversion.py
deleted file mode 100644
index f608240..0000000
--- a/pyversion.py
+++ /dev/null
@@ -1,20 +0,0 @@
-# -*- coding:utf-8 -*-
-#
-# Copyright (C) 2013 The Android Open Source Project
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#      http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import sys
-
-def is_python3():
-  return sys.version_info[0] == 3
diff --git a/release/README.md b/release/README.md
new file mode 100644
index 0000000..3b81d53
--- /dev/null
+++ b/release/README.md
@@ -0,0 +1,2 @@
+These are helper tools for managing official releases.
+See the [release process](../docs/release-process.md) document for more details.
diff --git a/release/sign-launcher.py b/release/sign-launcher.py
new file mode 100755
index 0000000..ba5e490
--- /dev/null
+++ b/release/sign-launcher.py
@@ -0,0 +1,114 @@
+#!/usr/bin/env python3
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Helper tool for signing repo launcher scripts correctly.
+
+This is intended to be run only by the official Repo release managers.
+"""
+
+import argparse
+import os
+import subprocess
+import sys
+
+import util
+
+
+def sign(opts):
+  """Sign the launcher!"""
+  output = ''
+  for key in opts.keys:
+    # We use ! at the end of the key so that gpg uses this specific key.
+    # Otherwise it uses the key as a lookup into the overall key and uses the
+    # default signing key.  i.e. It will see that KEYID_RSA is a subkey of
+    # another key, and use the primary key to sign instead of the subkey.
+    cmd = ['gpg', '--homedir', opts.gpgdir, '-u', f'{key}!', '--batch', '--yes',
+           '--armor', '--detach-sign', '--output', '-', opts.launcher]
+    ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
+    output += ret.stdout
+
+  # Save the combined signatures into one file.
+  with open(f'{opts.launcher}.asc', 'w', encoding='utf-8') as fp:
+    fp.write(output)
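As the comment above explains, the `!` suffix pins gpg to the exact (sub)key instead of letting it resolve back to the primary key. A sketch of the argv construction (hypothetical helper mirroring `sign()` above):

```python
def build_sign_cmd(gpgdir, key, launcher):
    """Build a gpg argv producing an armored detached signature on stdout.

    The trailing '!' on the key id forces gpg to sign with that specific
    (sub)key rather than its parent primary key.
    """
    return ['gpg', '--homedir', gpgdir, '-u', f'{key}!', '--batch', '--yes',
            '--armor', '--detach-sign', '--output', '-', launcher]
```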
+
+
+def check(opts):
+  """Check the signature."""
+  util.run(opts, ['gpg', '--verify', f'{opts.launcher}.asc'])
+
+
+def postmsg(opts):
+  """Helpful info to show at the end for release manager."""
+  print(f"""
+Repo launcher bucket:
+  gs://git-repo-downloads/
+
+To upload this launcher directly:
+  gsutil cp -a public-read {opts.launcher} {opts.launcher}.asc gs://git-repo-downloads/
+
+NB: You probably want to upload it with a specific version first, e.g.:
+  gsutil cp -a public-read {opts.launcher} gs://git-repo-downloads/repo-3.0
+  gsutil cp -a public-read {opts.launcher}.asc gs://git-repo-downloads/repo-3.0.asc
+""")
+
+
+def get_parser():
+  """Get a CLI parser."""
+  parser = argparse.ArgumentParser(description=__doc__)
+  parser.add_argument('-n', '--dry-run',
+                      dest='dryrun', action='store_true',
+                      help='show everything that would be done')
+  parser.add_argument('--gpgdir',
+                      default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
+                      help='path to dedicated gpg dir with release keys '
+                           '(default: ~/.gnupg/repo/)')
+  parser.add_argument('--keyid', dest='keys', default=[], action='append',
+                      help='alternative signing keys to use')
+  parser.add_argument('launcher',
+                      default=os.path.join(util.TOPDIR, 'repo'), nargs='?',
+                      help='the launcher script to sign')
+  return parser
+
+
+def main(argv):
+  """The main func!"""
+  parser = get_parser()
+  opts = parser.parse_args(argv)
+
+  if not os.path.exists(opts.gpgdir):
+    parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
+  if not os.path.exists(opts.launcher):
+    parser.error(f'launcher does not exist: {opts.launcher}')
+
+  opts.launcher = os.path.relpath(opts.launcher)
+  print(f'Signing "{opts.launcher}" launcher script and saving to '
+        f'"{opts.launcher}.asc"')
+
+  if opts.keys:
+    print(f'Using custom keys to sign: {" ".join(opts.keys)}')
+  else:
+    print('Using official Repo release keys to sign')
+    opts.keys = [util.KEYID_DSA, util.KEYID_RSA, util.KEYID_ECC]
+    util.import_release_key(opts)
+
+  sign(opts)
+  check(opts)
+  postmsg(opts)
+
+  return 0
+
+
+if __name__ == '__main__':
+  sys.exit(main(sys.argv[1:]))
diff --git a/release/sign-tag.py b/release/sign-tag.py
new file mode 100755
index 0000000..605437c
--- /dev/null
+++ b/release/sign-tag.py
@@ -0,0 +1,140 @@
+#!/usr/bin/env python3
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Helper tool for signing repo release tags correctly.
+
+This is intended to be run only by the official Repo release managers, but it
+could be run by people maintaining their own fork of the project.
+
+NB: Check docs/release-process.md for production freeze information.
+"""
+
+import argparse
+import os
+import re
+import subprocess
+import sys
+
+import util
+
+
+# We currently sign with the old DSA key as it's been around the longest.
+# We should transition to RSA by Jun 2020, and ECC by Jun 2021.
+KEYID = util.KEYID_DSA
+
+# Regular expression to validate tag names.
+RE_VALID_TAG = r'^v([0-9]+[.])+[0-9]+$'
+
+
+def sign(opts):
+  """Tag the commit & sign it!"""
+  # We use ! at the end of the key so that gpg uses this specific key.
+  # Otherwise it uses the key as a lookup into the overall key and uses the
+  # default signing key.  i.e. It will see that KEYID_RSA is a subkey of
+  # another key, and use the primary key to sign instead of the subkey.
+  cmd = ['git', 'tag', '-s', opts.tag, '-u', f'{opts.key}!',
+         '-m', f'repo {opts.tag}', opts.commit]
+
+  key = 'GNUPGHOME'
+  print('+', f'export {key}="{opts.gpgdir}"')
+  oldvalue = os.getenv(key)
+  os.putenv(key, opts.gpgdir)
+  util.run(opts, cmd)
+  if oldvalue is None:
+    os.unsetenv(key)
+  else:
+    os.putenv(key, oldvalue)
+
+
+def check(opts):
+  """Check the signature."""
+  util.run(opts, ['git', 'tag', '--verify', opts.tag])
+
+
+def postmsg(opts):
+  """Helpful info to show at the end for release manager."""
+  cmd = ['git', 'rev-parse', 'remotes/origin/stable']
+  ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
+  current_release = ret.stdout.strip()
+
+  cmd = ['git', 'log', '--format=%h (%aN) %s', '--no-merges',
+         f'remotes/origin/stable..{opts.tag}']
+  ret = util.run(opts, cmd, encoding='utf-8', stdout=subprocess.PIPE)
+  shortlog = ret.stdout.strip()
+
+  print(f"""
+Here's the short log since the last release.
+{shortlog}
+
+To push release to the public:
+  git push origin {opts.commit}:stable {opts.tag} -n
+NB: People will start upgrading to this version immediately.
+
+To roll back a release:
+  git push origin --force {current_release}:stable -n
+""")
+
+
+def get_parser():
+  """Get a CLI parser."""
+  parser = argparse.ArgumentParser(
+      description=__doc__,
+      formatter_class=argparse.RawDescriptionHelpFormatter)
+  parser.add_argument('-n', '--dry-run',
+                      dest='dryrun', action='store_true',
+                      help='show everything that would be done')
+  parser.add_argument('--gpgdir',
+                      default=os.path.join(util.HOMEDIR, '.gnupg', 'repo'),
+                      help='path to dedicated gpg dir with release keys '
+                           '(default: ~/.gnupg/repo/)')
+  parser.add_argument('-f', '--force', action='store_true',
+                      help='force signing of any tag')
+  parser.add_argument('--keyid', dest='key',
+                      help='alternative signing key to use')
+  parser.add_argument('tag',
+                      help='the tag to create (e.g. "v2.0")')
+  parser.add_argument('commit', default='HEAD', nargs='?',
+                      help='the commit to tag')
+  return parser
+
+
+def main(argv):
+  """The main func!"""
+  parser = get_parser()
+  opts = parser.parse_args(argv)
+
+  if not os.path.exists(opts.gpgdir):
+    parser.error(f'--gpgdir does not exist: {opts.gpgdir}')
+
+  if not opts.force and not re.match(RE_VALID_TAG, opts.tag):
+    parser.error(f'tag "{opts.tag}" does not match regex "{RE_VALID_TAG}"; '
+                 'use --force to sign anyway')
+
+  if opts.key:
+    print(f'Using custom key to sign: {opts.key}')
+  else:
+    print('Using official Repo release key to sign')
+    opts.key = KEYID
+    util.import_release_key(opts)
+
+  sign(opts)
+  check(opts)
+  postmsg(opts)
+
+  return 0
+
+
+if __name__ == '__main__':
+  sys.exit(main(sys.argv[1:]))
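The `sign()` function above swaps `GNUPGHOME` to the dedicated release keyring and restores the old value by hand around the `git tag -s` call. The same save/restore dance can be sketched as a context manager; this is a hypothetical helper (not part of the diff), and it uses `os.environ` directly rather than `os.putenv` so the change is visible to the current process as well as to children:

```python
import contextlib
import os


@contextlib.contextmanager
def scoped_env(key, value):
  """Temporarily set |key| in the environment, restoring it on exit."""
  old = os.environ.get(key)
  os.environ[key] = value
  try:
    yield
  finally:
    # Restore the previous state exactly: delete if it was unset before.
    if old is None:
      del os.environ[key]
    else:
      os.environ[key] = old
```

With this, the body of `sign()` could run as `with scoped_env('GNUPGHOME', opts.gpgdir): util.run(opts, cmd)` and restoration happens even if the signing command raises.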
diff --git a/release/update-manpages b/release/update-manpages
new file mode 100755
index 0000000..ddbce0c
--- /dev/null
+++ b/release/update-manpages
@@ -0,0 +1,102 @@
+#!/usr/bin/env python3
+# Copyright (C) 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Helper tool for generating manual pages for all repo commands.
+
+This is intended to be run before every official Repo release.
+"""
+
+from pathlib import Path
+from functools import partial
+import argparse
+import multiprocessing
+import os
+import re
+import shutil
+import subprocess
+import sys
+import tempfile
+
+TOPDIR = Path(__file__).resolve().parent.parent
+MANDIR = TOPDIR.joinpath('man')
+
+# Load repo local modules.
+sys.path.insert(0, str(TOPDIR))
+from git_command import RepoSourceVersion
+import subcmds
+
+def worker(cmd, **kwargs):
+  subprocess.run(cmd, **kwargs)
+
+def main(argv):
+  parser = argparse.ArgumentParser(description=__doc__)
+  opts = parser.parse_args(argv)
+
+  if not shutil.which('help2man'):
+    sys.exit('Please install help2man to continue.')
+
+  # Let repo know we're generating man pages so it can avoid some dynamic
+  # behavior (like probing active number of CPUs).  We use a weird name &
+  # value to make it less likely for users to set this var themselves.
+  os.environ['_REPO_GENERATE_MANPAGES_'] = ' indeed! '
+
+  # "repo branch" is an alias for "repo branches".
+  del subcmds.all_commands['branch']
+  (MANDIR / 'repo-branch.1').write_text('.so man1/repo-branches.1')
+
+  version = RepoSourceVersion()
+  cmdlist = [['help2man', '-N', '-n', f'repo {cmd} - manual page for repo {cmd}',
+    '-S', f'repo {cmd}', '-m', 'Repo Manual', f'--version-string={version}',
+    '-o', MANDIR.joinpath(f'repo-{cmd}.1.tmp'), TOPDIR.joinpath('repo'),
+    '-h', f'help {cmd}'] for cmd in subcmds.all_commands]
+  cmdlist.append(['help2man', '-N', '-n', 'repository management tool built on top of git',
+    '-S', 'repo', '-m', 'Repo Manual', f'--version-string={version}',
+    '-o', MANDIR.joinpath('repo.1.tmp'), TOPDIR.joinpath('repo'),
+    '-h', '--help-all'])
+
+  with tempfile.TemporaryDirectory() as tempdir:
+    repo_dir = Path(tempdir) / '.repo'
+    repo_dir.mkdir()
+    (repo_dir / 'repo').symlink_to(TOPDIR)
+
+    # Run all cmd in parallel, and wait for them to finish.
+    with multiprocessing.Pool() as pool:
+      pool.map(partial(worker, cwd=tempdir, check=True), cmdlist)
+
+  regex = (
+      (r'(It was generated by help2man) [0-9.]+', r'\g<1>.'),
+      (r'^\.IP\n(.*:)\n', '.SS \\g<1>\n'),
+      (r'^\.PP\nDescription', '.SH DETAILS'),
+  )
+  for tmp_path in MANDIR.glob('*.1.tmp'):
+    path = tmp_path.parent / tmp_path.stem
+    old_data = path.read_text() if path.exists() else ''
+
+    data = tmp_path.read_text()
+    tmp_path.unlink()
+
+    for pattern, replacement in regex:
+      data = re.sub(pattern, replacement, data, flags=re.M)
+
+    # If the only thing that changed was the date, don't refresh.  This avoids
+    # a lot of noise when only one file actually updates.
+    old_data = re.sub(r'^(\.TH REPO "1" ")([^"]+)', r'\1', old_data, flags=re.M)
+    new_data = re.sub(r'^(\.TH REPO "1" ")([^"]+)', r'\1', data, flags=re.M)
+    if old_data != new_data:
+      path.write_text(data)
+
+
+if __name__ == '__main__':
+  sys.exit(main(sys.argv[1:]))
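The tail of `update-manpages` avoids churn by stripping the date field out of the `.TH` header before comparing old and new pages, so a regeneration that only bumps the date leaves the file untouched. Isolated as a standalone sketch (the function name is mine; the regex and flags are the ones used above):

```python
import re

# Matches the date field of a man-page title line: .TH REPO "1" "July 2021" ...
_TH_DATE = r'^(\.TH REPO "1" ")([^"]+)'


def changed_ignoring_date(old_data, new_data):
  """Return True if the pages differ in anything besides the .TH date."""
  normalize = lambda data: re.sub(_TH_DATE, r'\1', data, flags=re.M)
  return normalize(old_data) != normalize(new_data)
```

Only when this returns True does the script call `path.write_text(data)`, so `git status` stays quiet after a no-op regeneration.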
diff --git a/release/util.py b/release/util.py
new file mode 100644
index 0000000..9d0eb1d
--- /dev/null
+++ b/release/util.py
@@ -0,0 +1,73 @@
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Random utility code for release tools."""
+
+import os
+import re
+import subprocess
+import sys
+
+
+assert sys.version_info >= (3, 6), 'This module requires Python 3.6+'
+
+
+TOPDIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
+HOMEDIR = os.path.expanduser('~')
+
+
+# These are the release keys we sign with.
+KEYID_DSA = '8BB9AD793E8E6153AF0F9A4416530D5E920F5C65'
+KEYID_RSA = 'A34A13BE8E76BFF46A0C022DA2E75A824AAB9624'
+KEYID_ECC = 'E1F9040D7A3F6DAFAC897CD3D3B95DA243E48A39'
+
+
+def cmdstr(cmd):
+  """Get a nicely quoted shell command."""
+  ret = []
+  for arg in cmd:
+    if not re.match(r'^[a-zA-Z0-9/_.=-]+$', arg):
+      arg = f'"{arg}"'
+    ret.append(arg)
+  return ' '.join(ret)
+
+
+def run(opts, cmd, check=True, **kwargs):
+  """Helper around subprocess.run to include logging."""
+  print('+', cmdstr(cmd))
+  if opts.dryrun:
+    cmd = ['true', '--'] + cmd
+  try:
+    return subprocess.run(cmd, check=check, **kwargs)
+  except subprocess.CalledProcessError as e:
+    print(f'aborting: {e}', file=sys.stderr)
+    sys.exit(1)
+
+
+def import_release_key(opts):
+  """Import the public key of the official release repo signing key."""
+  # Extract the key from our repo launcher.
+  launcher = getattr(opts, 'launcher', os.path.join(TOPDIR, 'repo'))
+  print(f'Importing keys from "{launcher}" launcher script')
+  with open(launcher, encoding='utf-8') as fp:
+    data = fp.read()
+
+  keys = re.findall(
+      r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
+      r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)
+  run(opts, ['gpg', '--import'], input='\n'.join(keys).encode('utf-8'))
+
+  print('Marking keys as fully trusted')
+  run(opts, ['gpg', '--import-ownertrust'],
+      input=f'{KEYID_DSA}:6:\n'.encode('utf-8'))
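`import_release_key()` above pulls the public keys straight out of the launcher script with a `re.findall` over the armored PGP blocks; the `[^-]*` body works because base64 armor never contains a hyphen. The same pattern, exercised on a hypothetical miniature keyring (the real launcher embeds the actual release keys):

```python
import re

# Hypothetical stand-in for the launcher's embedded MAINTAINER_KEYS block.
data = '''#!/usr/bin/env python
MAINTAINER_KEYS = """
-----BEGIN PGP PUBLIC KEY BLOCK-----

mQGiBEj3ugERBACrLJh
=CMiZ
-----END PGP PUBLIC KEY BLOCK-----
"""
'''

# Same regex as import_release_key(): grab each full armored key block.
keys = re.findall(
    r'\n-----BEGIN PGP PUBLIC KEY BLOCK-----\n[^-]*'
    r'\n-----END PGP PUBLIC KEY BLOCK-----\n', data, flags=re.M)
```

The extracted blocks are then piped to `gpg --import` via the `input=` argument of `subprocess.run`, so the signing tools never need a separate copy of the public keys.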
diff --git a/repo b/repo
index 15d2ff8..4cddbf1 100755
--- a/repo
+++ b/repo
@@ -1,26 +1,7 @@
 #!/usr/bin/env python
 # -*- coding:utf-8 -*-
-
-"""Repo launcher.
-
-This is a standalone tool that people may copy to anywhere in their system.
-It is used to get an initial repo client checkout, and after that it runs the
-copy of repo in the checkout.
-"""
-
-from __future__ import print_function
-
-# repo default configuration
 #
-import os
-REPO_URL = os.environ.get('REPO_URL', None)
-if not REPO_URL:
-  REPO_URL = 'https://gerrit.googlesource.com/git-repo'
-REPO_REV = os.environ.get('REPO_REV')
-if not REPO_REV:
-  REPO_REV = 'repo-1'
-
-# Copyright (C) 2008 Google Inc.
+# Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
@@ -34,11 +15,144 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+"""Repo launcher.
+
+This is a standalone tool that people may copy to anywhere in their system.
+It is used to get an initial repo client checkout, and after that it runs the
+copy of repo in the checkout.
+"""
+
+from __future__ import print_function
+
+import datetime
+import os
+import platform
+import shlex
+import subprocess
+import sys
+
+
+# These should never be newer than the main.py version since this needs to be a
+# bit more flexible with older systems.  See that file for more details on the
+# versions we select.
+MIN_PYTHON_VERSION_SOFT = (3, 6)
+MIN_PYTHON_VERSION_HARD = (3, 5)
+
+
+# Keep basic logic in sync with repo_trace.py.
+class Trace(object):
+  """Trace helper logic."""
+
+  REPO_TRACE = 'REPO_TRACE'
+
+  def __init__(self):
+    self.set(os.environ.get(self.REPO_TRACE) == '1')
+
+  def set(self, value):
+    self.enabled = bool(value)
+
+  def print(self, *args, **kwargs):
+    if self.enabled:
+      print(*args, **kwargs)
+
+
+trace = Trace()
+
+
+def exec_command(cmd):
+  """Execute |cmd| or return None on failure."""
+  trace.print(':', ' '.join(cmd))
+  try:
+    if platform.system() == 'Windows':
+      ret = subprocess.call(cmd)
+      sys.exit(ret)
+    else:
+      os.execvp(cmd[0], cmd)
+  except Exception:
+    pass
+
+
+def check_python_version():
+  """Make sure the active Python version is recent enough."""
+  def reexec(prog):
+    exec_command([prog] + sys.argv)
+
+  ver = sys.version_info
+  major = ver.major
+  minor = ver.minor
+
+  # Abort on very old Python 2 versions.
+  if (major, minor) < (2, 7):
+    print('repo: error: Your Python version is too old. '
+          'Please use Python {}.{} or newer instead.'.format(
+              *MIN_PYTHON_VERSION_SOFT), file=sys.stderr)
+    sys.exit(1)
+
+  # Try to re-exec the version specific Python 3 if needed.
+  if (major, minor) < MIN_PYTHON_VERSION_SOFT:
+    # Python makes releases ~once a year, so try our min version +10 to help
+    # bridge the gap.  This is the fallback path anyway, so perf isn't critical.
+    min_major, min_minor = MIN_PYTHON_VERSION_SOFT
+    for inc in range(0, 10):
+      reexec('python{}.{}'.format(min_major, min_minor + inc))
+
+    # Fallback to older versions if possible.
+    for inc in range(MIN_PYTHON_VERSION_SOFT[1] - MIN_PYTHON_VERSION_HARD[1], 0, -1):
+      # Don't downgrade, and don't reexec ourselves (which would loop forever).
+      if (min_major, min_minor - inc) <= (major, minor):
+        break
+      reexec('python{}.{}'.format(min_major, min_minor - inc))
+
+    # Try the generic Python 3 wrapper, but only if it's new enough.  If it
+    # isn't, we want to just give up below and make the user resolve things.
+    try:
+      proc = subprocess.Popen(
+          ['python3', '-c', 'import sys; '
+           'print(sys.version_info.major, sys.version_info.minor)'],
+          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+      (output, _) = proc.communicate()
+      python3_ver = tuple(int(x) for x in output.decode('utf-8').split())
+    except (OSError, subprocess.CalledProcessError):
+      python3_ver = None
+
+    # If the python3 version looks like it's new enough, give it a try.
+    if (python3_ver and python3_ver >= MIN_PYTHON_VERSION_HARD
+            and python3_ver != (major, minor)):
+      reexec('python3')
+
+    # We're still here, so diagnose things for the user.
+    if major < 3:
+      print('repo: error: Python 2 is no longer supported; '
+            'Please upgrade to Python {}.{}+.'.format(*MIN_PYTHON_VERSION_HARD),
+            file=sys.stderr)
+      sys.exit(1)
+    elif (major, minor) < MIN_PYTHON_VERSION_HARD:
+      print('repo: error: Python 3 version is too old; '
+            'Please use Python {}.{} or newer.'.format(*MIN_PYTHON_VERSION_HARD),
+            file=sys.stderr)
+      sys.exit(1)
+
+
+if __name__ == '__main__':
+  check_python_version()
+
+
+# repo default configuration
+#
+REPO_URL = os.environ.get('REPO_URL', None)
+if not REPO_URL:
+  REPO_URL = 'https://gerrit.googlesource.com/git-repo'
+REPO_REV = os.environ.get('REPO_REV')
+if not REPO_REV:
+  REPO_REV = 'stable'
+# URL to file bug reports for repo tool issues.
+BUG_URL = 'https://bugs.chromium.org/p/gerrit/issues/entry?template=Repo+tool+issue'
+
 # increment this whenever we make important changes to this script
-VERSION = (1, 27)
+VERSION = (2, 17)
 
 # increment this if the MAINTAINER_KEYS block is modified
-KEYRING_VERSION = (1, 2)
+KEYRING_VERSION = (2, 3)
 
 # Each individual key entry is created by using:
 # gpg --armor --export keyid
@@ -46,7 +160,6 @@
 
      Repo Maintainer <repo@android.kernel.org>
 -----BEGIN PGP PUBLIC KEY BLOCK-----
-Version: GnuPG v1.4.2.2 (GNU/Linux)
 
 mQGiBEj3ugERBACrLJh/ZPyVSKeClMuznFIrsQ+hpNnmJGw1a9GXKYKk8qHPhAZf
 WKtrBqAVMNRLhL85oSlekRz98u41H5si5zcuv+IXJDF5MJYcB8f22wAy15lUqPWi
@@ -82,63 +195,64 @@
 5xGrFy8tfAaeBMIQ17gvFSp/suc9DYO0ICK2BISzq+F+ZiAKsjMYOBNdH/h0zobQ
 HTHs37+/QLMomGEGKZMWi0dShU2J5mNRQu3Hhxl3hHDVbt5CeJBb26aQcQrFz69W
 zE3GNvmJosh6leayjtI9P2A6iEkEGBECAAkFAkj3uiACGwwACgkQFlMNXpIPXGWp
-TACbBS+Up3RpfYVfd63c1cDdlru13pQAn3NQy/SN858MkxN+zym86UBgOad2
-=CMiZ
------END PGP PUBLIC KEY BLOCK-----
-
-     Conley Owens <cco3@android.com>
------BEGIN PGP PUBLIC KEY BLOCK-----
-Version: GnuPG v1.4.11 (GNU/Linux)
-
-mQENBFHRvc8BCADFg45Xx/y6QDC+T7Y/gGc7vx0ww7qfOwIKlAZ9xG3qKunMxo+S
-hPCnzEl3cq+6I1Ww/ndop/HB3N3toPXRCoN8Vs4/Hc7by+SnaLFnacrm+tV5/OgT
-V37Lzt8lhay1Kl+YfpFwHYYpIEBLFV9knyfRXS/428W2qhdzYfvB15/AasRmwmor
-py4NIzSs8UD/SPr1ihqNCdZM76+MQyN5HMYXW/ALZXUFG0pwluHFA7hrfPG74i8C
-zMiP7qvMWIl/r/jtzHioH1dRKgbod+LZsrDJ8mBaqsZaDmNJMhss9g76XvfMyLra
-9DI9/iFuBpGzeqBv0hwOGQspLRrEoyTeR6n1ABEBAAG0H0NvbmxleSBPd2VucyA8
-Y2NvM0BhbmRyb2lkLmNvbT6JATgEEwECACIFAlHRvc8CGwMGCwkIBwMCBhUIAgkK
-CwQWAgMBAh4BAheAAAoJEGe35EhpKzgsP6AIAJKJmNtn4l7hkYHKHFSo3egb6RjQ
-zEIP3MFTcu8HFX1kF1ZFbrp7xqurLaE53kEkKuAAvjJDAgI8mcZHP1JyplubqjQA
-xvv84gK+OGP3Xk+QK1ZjUQSbjOpjEiSZpRhWcHci3dgOUH4blJfByHw25hlgHowd
-a/2PrNKZVcJ92YienaxxGjcXEUcd0uYEG2+rwllQigFcnMFDhr9B71MfalRHjFKE
-fmdoypqLrri61YBc59P88Rw2/WUpTQjgNubSqa3A2+CKdaRyaRw+2fdF4TdR0h8W
-zbg+lbaPtJHsV+3mJC7fq26MiJDRJa5ZztpMn8su20gbLgi2ShBOaHAYDDi5AQ0E
-UdG9zwEIAMoOBq+QLNozAhxOOl5GL3StTStGRgPRXINfmViTsihrqGCWBBUfXlUE
-OytC0mYcrDUQev/8ToVoyqw+iGSwDkcSXkrEUCKFtHV/GECWtk1keyHgR10YKI1R
-mquSXoubWGqPeG1PAI74XWaRx8UrL8uCXUtmD8Q5J7mDjKR5NpxaXrwlA0bKsf2E
-Gp9tu1kKauuToZhWHMRMqYSOGikQJwWSFYKT1KdNcOXLQF6+bfoJ6sjVYdwfmNQL
-Ixn8QVhoTDedcqClSWB17VDEFDFa7MmqXZz2qtM3X1R/MUMHqPtegQzBGNhRdnI2
-V45+1Nnx/uuCxDbeI4RbHzujnxDiq70AEQEAAYkBHwQYAQIACQUCUdG9zwIbDAAK
-CRBnt+RIaSs4LNVeB/0Y2pZ8I7gAAcEM0Xw8drr4omg2fUoK1J33ozlA/RxeA/lJ
-I3KnyCDTpXuIeBKPGkdL8uMATC9Z8DnBBajRlftNDVZS3Hz4G09G9QpMojvJkFJV
-By+01Flw/X+eeN8NpqSuLV4W+AjEO8at/VvgKr1AFvBRdZ7GkpI1o6DgPe7ZqX+1
-dzQZt3e13W0rVBb/bUgx9iSLoeWP3aq/k+/GRGOR+S6F6BBSl0SQ2EF2+dIywb1x
-JuinEP+AwLAUZ1Bsx9ISC0Agpk2VeHXPL3FGhroEmoMvBzO0kTFGyoeT7PR/BfKv
-+H/g3HsL2LOB9uoIm8/5p2TTU5ttYCXMHhQZ81AY
-=AUp4
+TACbBS+Up3RpfYVfd63c1cDdlru13pQAn3NQy/SN858MkxN+zym86UBgOad2uQIN
+BF5FqOoBEAC8aRtWEtXzeuoQhdFrLTqYs2dy6kl9y+j3DMQYAMs8je582qzUigIO
+ZZxq7T/3WQgghsdw9yPvdzlw9tKdet2TJkR1mtBfSjZQrkKwR0pQP4AD7t/90Whu
+R8Wlu8ysapE2hLxMH5Y2znRQX2LkUYmk0K2ik9AgZEh3AFEg3YLl2pGnSjeSp3ch
+cLX2n/rVZf5LXluZGRG+iov1Ka+8m+UqzohMA1DYNECJW6KPgXsNX++i8/iwZVic
+PWzhRJSQC+QiAZNsKT6HNNKs97YCUVzhjBLnRSxRBPkr0hS/VMWY2V4pbASljWyd
+GYmlDcxheLne0yjes0bJAdvig5rB42FOV0FCM4bDYOVwKfZ7SpzGCYXxtlwe0XNG
+tLW9WA6tICVqNZ/JNiRTBLrsGSkyrEhDPKnIHlHRI5Zux6IHwMVB0lQKHjSop+t6
+oyubqWcPCGGYdz2QGQHNz7huC/Zn0wS4hsoiSwPv6HCq3jNyUkOJ7wZ3ouv60p2I
+kPurgviVaRaPSKTYdKfkcJOtFeqOh1na5IHkXsD9rNctB7tSgfsm0G6qJIVe3ZmJ
+7QAyHBfuLrAWCq5xS8EHDlvxPdAD8EEsa9T32YxcHKIkxr1eSwrUrKb8cPhWq1pp
+Jiylw6G1fZ02VKixqmPC4oFMyg1PO8L2tcQTrnVmZvfFGiaekHKdhQARAQABiQKW
+BBgRAgAgFiEEi7mteT6OYVOvD5pEFlMNXpIPXGUFAl5FqOoCGwICQAkQFlMNXpIP
+XGXBdCAEGQEKAB0WIQSjShO+jna/9GoMAi2i51qCSquWJAUCXkWo6gAKCRCi51qC
+SquWJLzgD/0YEZYS7yKxhP+kk94TcTYMBMSZpU5KFClB77yu4SI1LeXq4ocBT4sp
+EPaOsQiIx//j59J67b7CBe4UeRA6D2n0pw+bCKuc731DFi5X9C1zq3a7E67SQ2yd
+FbYE2fnpVnMqb62g4sTh7JmdxEtXCWBUWL0OEoWouBW1PkFDHx2kYLC7YpZt3+4t
+VtNhSfV8NS6PF8ep3JXHVd2wsC3DQtggeId5GM44o8N0SkwQHNjK8ZD+VZ74ZnhZ
+HeyHskomiOC61LrZWQvxD6VqtfnBQ5GvONO8QuhkiFwMMOnpPVj2k7ngSkd5o27K
+6c53ZESOlR4bAfl0i3RZYC9B5KerGkBE3dTgTzmGjOaahl2eLz4LDPdTwMtS+sAU
+1hPPvZTQeYDdV62bOWUyteMoJu354GgZPQ9eItWYixpNCyOGNcJXl6xk3/OuoP6f
+MciFV8aMxs/7mUR8q1Ei3X9MKu+bbODYj2rC1tMkLj1OaAJkfvRuYrKsQpoUsn4q
+VT9+aciNpU/I7M30watlWo7RfUFI3zaGdMDcMFju1cWt2Un8E3gtscGufzbz1Z5Z
+Gak+tCOWUyuYNWX3noit7Dk6+3JGHGaQettldNu2PLM9SbIXd2EaqK/eEv9BS3dd
+ItkZwzyZXSaQ9UqAceY1AHskJJ5KVXIRLuhP5jBWWo3fnRMyMYt2nwNBAJ9B9TA8
+VlBniwIl5EzCvOFOTGrtewCdHOvr3N3ieypGz1BzyCN9tJMO3G24MwReRal9Fgkr
+BgEEAdpHDwEBB0BhPE/je6OuKgWzJ1mnrUmHhn4IMOHp+58+T5kHU3Oy6YjXBBgR
+AgAgFiEEi7mteT6OYVOvD5pEFlMNXpIPXGUFAl5FqX0CGwIAgQkQFlMNXpIPXGV2
+IAQZFggAHRYhBOH5BA16P22vrIl809O5XaJD5Io5BQJeRal9AAoJENO5XaJD5Io5
+MEkA/3uLmiwANOcgE0zB9zga0T/KkYhYOWFx7zRyDhrTf9spAPwIfSBOAGtwxjLO
+DCce5OaQJl/YuGHvXq2yx5h7T8pdAZ+PAJ4qfIk2LLSidsplTDXOKhOQAuOqUQCf
+cZ7aFsJF4PtcDrfdejyAxbtsSHI=
+=82Tj
 -----END PGP PUBLIC KEY BLOCK-----
 """
 
 GIT = 'git'                      # our git command
+# NB: The version of git that the repo launcher requires may be much older than
+# the version of git that the main repo source tree requires.  Keeping this at
+# an older version also makes it easier for users to upgrade/rollback as needed.
+#
+# git-1.7 is in (EOL) Ubuntu Precise.
 MIN_GIT_VERSION = (1, 7, 2)      # minimum supported git version
 repodir = '.repo'                # name of repo's private directory
 S_repo = 'repo'                  # special repo repository
 S_manifests = 'manifests'        # special manifest repository
 REPO_MAIN = S_repo + '/main.py'  # main script
-MIN_PYTHON_VERSION = (2, 7)      # minimum supported python version
 GITC_CONFIG_FILE = '/gitc/.config'
 GITC_FS_ROOT_DIR = '/gitc/manifest-rw/'
 
 
 import collections
 import errno
+import json
 import optparse
-import platform
 import re
 import shutil
 import stat
-import subprocess
-import sys
 
 if sys.version_info[0] == 3:
   import urllib.request
@@ -151,116 +265,215 @@
   urllib.error = urllib2
 
 
-# Python version check
-ver = sys.version_info
-if (ver[0], ver[1]) < MIN_PYTHON_VERSION:
-  print('error: Python version {} unsupported.\n'
-        'Please use Python {}.{} instead.'.format(
-            sys.version.split(' ')[0],
-            MIN_PYTHON_VERSION[0],
-            MIN_PYTHON_VERSION[1],
-        ), file=sys.stderr)
-  sys.exit(1)
-
 home_dot_repo = os.path.expanduser('~/.repoconfig')
 gpg_dir = os.path.join(home_dot_repo, 'gnupg')
 
-extra_args = []
-init_optparse = optparse.OptionParser(usage="repo init -u url [options]")
 
-# Logging
-group = init_optparse.add_option_group('Logging options')
-group.add_option('-q', '--quiet',
-                 dest="quiet", action="store_true", default=False,
-                 help="be quiet")
+def GetParser(gitc_init=False):
+  """Setup the CLI parser."""
+  if gitc_init:
+    usage = 'repo gitc-init -c client [options] [-u] url'
+  else:
+    usage = 'repo init [options] [-u] url'
 
-# Manifest
-group = init_optparse.add_option_group('Manifest options')
-group.add_option('-u', '--manifest-url',
-                 dest='manifest_url',
-                 help='manifest repository location', metavar='URL')
-group.add_option('-b', '--manifest-branch',
-                 dest='manifest_branch',
-                 help='manifest branch or revision', metavar='REVISION')
-group.add_option('-m', '--manifest-name',
-                 dest='manifest_name',
-                 help='initial manifest file', metavar='NAME.xml')
-group.add_option('--current-branch',
-                 dest='current_branch_only', action='store_true',
-                 help='fetch only current manifest branch from server')
-group.add_option('--mirror',
-                 dest='mirror', action='store_true',
-                 help='create a replica of the remote repositories '
-                      'rather than a client working directory')
-group.add_option('--reference',
-                 dest='reference',
-                 help='location of mirror directory', metavar='DIR')
-group.add_option('--dissociate',
-                 dest='dissociate', action='store_true',
-                 help='dissociate from reference mirrors after clone')
-group.add_option('--depth', type='int', default=None,
-                 dest='depth',
-                 help='create a shallow clone with given depth; see git clone')
-group.add_option('--partial-clone', action='store_true',
-                 dest='partial_clone',
-                 help='perform partial clone (https://git-scm.com/'
-                 'docs/gitrepository-layout#_code_partialclone_code)')
-group.add_option('--clone-filter', action='store', default='blob:none',
-                 dest='clone_filter',
-                 help='filter for use with --partial-clone [default: %default]')
-group.add_option('--archive',
-                 dest='archive', action='store_true',
-                 help='checkout an archive instead of a git repository for '
-                      'each project. See git archive.')
-group.add_option('--submodules',
-                 dest='submodules', action='store_true',
-                 help='sync any submodules associated with the manifest repo')
-group.add_option('-g', '--groups',
-                 dest='groups', default='default',
-                 help='restrict manifest projects to ones with specified '
-                      'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
-                 metavar='GROUP')
-group.add_option('-p', '--platform',
-                 dest='platform', default="auto",
-                 help='restrict manifest projects to ones with a specified '
-                      'platform group [auto|all|none|linux|darwin|...]',
-                 metavar='PLATFORM')
-group.add_option('--no-clone-bundle',
-                 dest='no_clone_bundle', action='store_true',
-                 help='disable use of /clone.bundle on HTTP/HTTPS')
-group.add_option('--no-tags',
-                 dest='no_tags', action='store_true',
-                 help="don't fetch tags in the manifest")
+  parser = optparse.OptionParser(usage=usage)
+  InitParser(parser, gitc_init=gitc_init)
+  return parser
 
 
-# Tool
-group = init_optparse.add_option_group('repo Version options')
-group.add_option('--repo-url',
-                 dest='repo_url',
-                 help='repo repository location ($REPO_URL)', metavar='URL')
-group.add_option('--repo-branch',
-                 dest='repo_branch',
-                 help='repo branch or revision ($REPO_REV)', metavar='REVISION')
-group.add_option('--no-repo-verify',
-                 dest='no_repo_verify', action='store_true',
-                 help='do not verify repo source code')
+def InitParser(parser, gitc_init=False):
+  """Setup the CLI parser."""
+  # NB: Keep in sync with command.py:_CommonOptions().
 
-# Other
-group = init_optparse.add_option_group('Other options')
-group.add_option('--config-name',
-                 dest='config_name', action="store_true", default=False,
-                 help='Always prompt for name/e-mail')
+  # Logging.
+  group = parser.add_option_group('Logging options')
+  group.add_option('-v', '--verbose',
+                   dest='output_mode', action='store_true',
+                   help='show all output')
+  group.add_option('-q', '--quiet',
+                   dest='output_mode', action='store_false',
+                   help='only show errors')
+
+  # Manifest.
+  group = parser.add_option_group('Manifest options')
+  group.add_option('-u', '--manifest-url',
+                   help='manifest repository location', metavar='URL')
+  group.add_option('-b', '--manifest-branch', metavar='REVISION',
+                   help='manifest branch or revision (use HEAD for default)')
+  group.add_option('-m', '--manifest-name', default='default.xml',
+                   help='initial manifest file', metavar='NAME.xml')
+  group.add_option('-g', '--groups', default='default',
+                   help='restrict manifest projects to ones with specified '
+                        'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
+                   metavar='GROUP')
+  group.add_option('-p', '--platform', default='auto',
+                   help='restrict manifest projects to ones with a specified '
+                        'platform group [auto|all|none|linux|darwin|...]',
+                   metavar='PLATFORM')
+  group.add_option('--submodules', action='store_true',
+                   help='sync any submodules associated with the manifest repo')
+  group.add_option('--standalone-manifest', action='store_true',
+                   help='download the manifest as a static file '
+                        'rather than create a git checkout of '
+                        'the manifest repo')
+
+  # Options that only affect manifest project, and not any of the projects
+  # specified in the manifest itself.
+  group = parser.add_option_group('Manifest (only) checkout options')
+  cbr_opts = ['--current-branch']
+  # The gitc-init subcommand allocates -c itself, but a lot of init users
+  # want -c, so try to satisfy both as best we can.
+  if not gitc_init:
+    cbr_opts += ['-c']
+  group.add_option(*cbr_opts,
+                   dest='current_branch_only', action='store_true',
+                   help='fetch only current manifest branch from server')
+  group.add_option('--no-current-branch',
+                   dest='current_branch_only', action='store_false',
+                   help='fetch all manifest branches from server')
+  group.add_option('--tags',
+                   action='store_true',
+                   help='fetch tags in the manifest')
+  group.add_option('--no-tags',
+                   dest='tags', action='store_false',
+                   help="don't fetch tags in the manifest")
+
+  # These are fundamentally different ways of structuring the checkout.
+  group = parser.add_option_group('Checkout modes')
+  group.add_option('--mirror', action='store_true',
+                   help='create a replica of the remote repositories '
+                        'rather than a client working directory')
+  group.add_option('--archive', action='store_true',
+                   help='checkout an archive instead of a git repository for '
+                        'each project. See git archive.')
+  group.add_option('--worktree', action='store_true',
+                   help='use git-worktree to manage projects')
+
+  # These optimize how projects are fetched without changing the checkout layout.
+  group = parser.add_option_group('Project checkout optimizations')
+  group.add_option('--reference',
+                   help='location of mirror directory', metavar='DIR')
+  group.add_option('--dissociate', action='store_true',
+                   help='dissociate from reference mirrors after clone')
+  group.add_option('--depth', type='int', default=None,
+                   help='create a shallow clone with given depth; '
+                        'see git clone')
+  group.add_option('--partial-clone', action='store_true',
+                   help='perform partial clone (https://git-scm.com/'
+                        'docs/gitrepository-layout#_code_partialclone_code)')
+  group.add_option('--no-partial-clone', action='store_false',
+                   help='disable use of partial clone (https://git-scm.com/'
+                        'docs/gitrepository-layout#_code_partialclone_code)')
+  group.add_option('--partial-clone-exclude', action='store',
+                   help='exclude the specified projects (a comma-delimited '
+                        'list of project names) from partial clone (https://git-scm.com'
+                        '/docs/gitrepository-layout#_code_partialclone_code)')
+  group.add_option('--clone-filter', action='store', default='blob:none',
+                   help='filter for use with --partial-clone '
+                        '[default: %default]')
+  group.add_option('--use-superproject', action='store_true', default=None,
+                   help='use the manifest superproject to sync projects')
+  group.add_option('--no-use-superproject', action='store_false',
+                   dest='use_superproject',
+                   help='disable use of manifest superprojects')
+  group.add_option('--clone-bundle', action='store_true',
+                   help='enable use of /clone.bundle on HTTP/HTTPS '
+                        '(default if not --partial-clone)')
+  group.add_option('--no-clone-bundle',
+                   dest='clone_bundle', action='store_false',
+                   help='disable use of /clone.bundle on HTTP/HTTPS (default if --partial-clone)')
+
+  # Tool.
+  group = parser.add_option_group('repo Version options')
+  group.add_option('--repo-url', metavar='URL',
+                   help='repo repository location ($REPO_URL)')
+  group.add_option('--repo-rev', metavar='REV',
+                   help='repo branch or revision ($REPO_REV)')
+  group.add_option('--repo-branch', dest='repo_rev',
+                   help=optparse.SUPPRESS_HELP)
+  group.add_option('--no-repo-verify',
+                   dest='repo_verify', default=True, action='store_false',
+                   help='do not verify repo source code')
+
+  # Other.
+  group = parser.add_option_group('Other options')
+  group.add_option('--config-name',
+                   action='store_true', default=False,
+                   help='Always prompt for name/e-mail')
+
+  # gitc-init specific settings.
+  if gitc_init:
+    group = parser.add_option_group('GITC options')
+    group.add_option('-f', '--manifest-file',
+                     help='Optional manifest file to use for this GITC client.')
+    group.add_option('-c', '--gitc-client',
+                     help='Name of the gitc_client instance to create or modify.')
+
+  return parser
 
 
-def _GitcInitOptions(init_optparse_arg):
-  init_optparse_arg.set_usage("repo gitc-init -u url -c client [options]")
-  g = init_optparse_arg.add_option_group('GITC options')
-  g.add_option('-f', '--manifest-file',
-               dest='manifest_file',
-               help='Optional manifest file to use for this GITC client.')
-  g.add_option('-c', '--gitc-client',
-               dest='gitc_client',
-               help='The name of the gitc_client instance to create or modify.')
+# This is a poor replacement for subprocess.run until we require Python 3.6+.
+RunResult = collections.namedtuple(
+    'RunResult', ('returncode', 'stdout', 'stderr'))
+
+
+class RunError(Exception):
+  """Raised when running a command fails."""
+
+
+def run_command(cmd, **kwargs):
+  """Run |cmd| and return its output."""
+  check = kwargs.pop('check', False)
+  if kwargs.pop('capture_output', False):
+    kwargs.setdefault('stdout', subprocess.PIPE)
+    kwargs.setdefault('stderr', subprocess.PIPE)
+  cmd_input = kwargs.pop('input', None)
+
+  def decode(output):
+    """Decode |output| to text."""
+    if output is None:
+      return output
+    try:
+      return output.decode('utf-8')
+    except UnicodeError:
+      print('repo: warning: Invalid UTF-8 output:\ncmd: %r\n%r' % (cmd, output),
+            file=sys.stderr)
+      # TODO(vapier): Once we require Python 3, use 'backslashreplace'.
+      return output.decode('utf-8', 'replace')
+
+  # Run & package the results.
+  proc = subprocess.Popen(cmd, **kwargs)
+  (stdout, stderr) = proc.communicate(input=cmd_input)
+  dbg = ': ' + ' '.join(cmd)
+  if cmd_input is not None:
+    dbg += ' 0<|'
+  if stdout == subprocess.PIPE:
+    dbg += ' 1>|'
+  if stderr == subprocess.PIPE:
+    dbg += ' 2>|'
+  elif stderr == subprocess.STDOUT:
+    dbg += ' 2>&1'
+  trace.print(dbg)
+  ret = RunResult(proc.returncode, decode(stdout), decode(stderr))
+
+  # If things failed, print useful debugging output.
+  if check and ret.returncode:
+    print('repo: error: "%s" failed with exit status %s' %
+          (cmd[0], ret.returncode), file=sys.stderr)
+    print('  cwd: %s\n  cmd: %r' %
+          (kwargs.get('cwd', os.getcwd()), cmd), file=sys.stderr)
+
+    def _print_output(name, output):
+      if output:
+        print('  %s:\n  >> %s' % (name, '\n  >> '.join(output.splitlines())),
+              file=sys.stderr)
+
+    _print_output('stdout', ret.stdout)
+    _print_output('stderr', ret.stderr)
+    raise RunError(ret)
+
+  return ret
+
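The `run_command` helper in the hunk above stands in for `subprocess.run` on interpreters that predate it. A minimal sketch of the same pattern (simplified, hypothetical names — no tracing, `check`, or `input` handling):

```python
import collections
import subprocess
import sys

# Named result mirroring the fields subprocess.run returns.
RunResult = collections.namedtuple('RunResult', ('returncode', 'stdout', 'stderr'))

def run_command(cmd):
    """Run |cmd|, capture its output, and decode it to text."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    return RunResult(proc.returncode, stdout.decode('utf-8'), stderr.decode('utf-8'))

result = run_command([sys.executable, '-c', 'print("hello")'])
```

The namedtuple keeps call sites source-compatible with `subprocess.run`'s `CompletedProcess` attributes, easing a later migration.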
 
 _gitc_manifest_dir = None
 
@@ -283,9 +496,11 @@
 def gitc_parse_clientdir(gitc_fs_path):
   """Parse a path in the GITC FS and return its client name.
 
-  @param gitc_fs_path: A subdirectory path within the GITC_FS_ROOT_DIR.
+  Args:
+    gitc_fs_path: A subdirectory path within the GITC_FS_ROOT_DIR.
 
-  @returns: The GITC client name
+  Returns:
+    The GITC client name.
   """
   if gitc_fs_path == GITC_FS_ROOT_DIR:
     return None
@@ -309,31 +524,53 @@
   """
 
 
+def check_repo_verify(repo_verify, quiet=False):
+  """Check the --repo-verify state."""
+  if not repo_verify:
+    print('repo: warning: verification of repo code has been disabled;\n'
+          'repo will not be able to verify the integrity of itself.\n',
+          file=sys.stderr)
+    return False
+
+  if NeedSetupGnuPG():
+    return SetupGnuPG(quiet)
+
+  return True
+
+
+def check_repo_rev(dst, rev, repo_verify=True, quiet=False):
+  """Check that |rev| is valid."""
+  do_verify = check_repo_verify(repo_verify, quiet=quiet)
+  remote_ref, local_rev = resolve_repo_rev(dst, rev)
+  if not quiet and not remote_ref.startswith('refs/heads/'):
+    print('warning: repo is not tracking a remote branch, so it will not '
+          'receive updates', file=sys.stderr)
+  if do_verify:
+    rev = verify_rev(dst, remote_ref, local_rev, quiet)
+  else:
+    rev = local_rev
+  return (remote_ref, rev)
+
+
 def _Init(args, gitc_init=False):
   """Installs repo by cloning it over the network.
   """
-  if gitc_init:
-    _GitcInitOptions(init_optparse)
-  opt, args = init_optparse.parse_args(args)
+  parser = GetParser(gitc_init=gitc_init)
+  opt, args = parser.parse_args(args)
   if args:
-    init_optparse.print_usage()
-    sys.exit(1)
+    if not opt.manifest_url:
+      opt.manifest_url = args.pop(0)
+    if args:
+      parser.print_usage()
+      sys.exit(1)
+  opt.quiet = opt.output_mode is False
+  opt.verbose = opt.output_mode is True
 
-  url = opt.repo_url
-  if not url:
-    url = REPO_URL
-    extra_args.append('--repo-url=%s' % url)
+  if opt.clone_bundle is None:
+    opt.clone_bundle = False if opt.partial_clone else True
 
-  branch = opt.repo_branch
-  if not branch:
-    branch = REPO_REV
-    extra_args.append('--repo-branch=%s' % branch)
-
-  if branch.startswith('refs/heads/'):
-    branch = branch[len('refs/heads/'):]
-  if branch.startswith('refs/'):
-    print("fatal: invalid branch name '%s'" % branch, file=sys.stderr)
-    raise CloneFailure()
+  url = opt.repo_url or REPO_URL
+  rev = opt.repo_rev or REPO_REV
 
   try:
     if gitc_init:
@@ -368,23 +605,13 @@
 
   _CheckGitVersion()
   try:
-    if opt.no_repo_verify:
-      do_verify = False
-    else:
-      if NeedSetupGnuPG():
-        do_verify = SetupGnuPG(opt.quiet)
-      else:
-        do_verify = True
-
+    if not opt.quiet:
+      print('Downloading Repo source from', url)
     dst = os.path.abspath(os.path.join(repodir, S_repo))
-    _Clone(url, dst, opt.quiet, not opt.no_clone_bundle)
+    _Clone(url, dst, opt.clone_bundle, opt.quiet, opt.verbose)
 
-    if do_verify:
-      rev = _Verify(dst, branch, opt.quiet)
-    else:
-      rev = 'refs/remotes/origin/%s^0' % branch
-
-    _Checkout(dst, branch, rev, opt.quiet)
+    remote_ref, rev = check_repo_rev(dst, rev, opt.repo_verify, quiet=opt.quiet)
+    _Checkout(dst, remote_ref, rev, opt.quiet)
 
     if not os.path.isfile(os.path.join(dst, 'repo')):
       print("warning: '%s' does not look like a git-repo repository, is "
@@ -397,15 +624,34 @@
     raise
 
 
+def run_git(*args, **kwargs):
+  """Run git and return execution details."""
+  kwargs.setdefault('capture_output', True)
+  kwargs.setdefault('check', True)
+  try:
+    return run_command([GIT] + list(args), **kwargs)
+  except OSError as e:
+    print(file=sys.stderr)
+    print('repo: error: "%s" is not available' % GIT, file=sys.stderr)
+    print('repo: error: %s' % e, file=sys.stderr)
+    print(file=sys.stderr)
+    print('Please make sure %s is installed and in your path.' % GIT,
+          file=sys.stderr)
+    sys.exit(1)
+  except RunError:
+    raise CloneFailure()
+
+
 # The git version info broken down into components for easy analysis.
 # Similar to Python's sys.version_info.
 GitVersion = collections.namedtuple(
     'GitVersion', ('major', 'minor', 'micro', 'full'))
 
+
 def ParseGitVersion(ver_str=None):
   if ver_str is None:
     # Load the version ourselves.
-    ver_str = _GetGitVersion()
+    ver_str = run_git('--version').stdout
 
   if not ver_str.startswith('git version '):
     return None
@@ -422,41 +668,52 @@
   return GitVersion(*to_tuple)
 
 
-def _GetGitVersion():
-  cmd = [GIT, '--version']
-  try:
-    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
-  except OSError as e:
-    print(file=sys.stderr)
-    print("fatal: '%s' is not available" % GIT, file=sys.stderr)
-    print('fatal: %s' % e, file=sys.stderr)
-    print(file=sys.stderr)
-    print('Please make sure %s is installed and in your path.' % GIT,
-          file=sys.stderr)
-    raise
-
-  ver_str = proc.stdout.read().strip()
-  proc.stdout.close()
-  proc.wait()
-  return ver_str.decode('utf-8')
-
-
 def _CheckGitVersion():
-  try:
-    ver_act = ParseGitVersion()
-  except OSError:
-    raise CloneFailure()
-
+  ver_act = ParseGitVersion()
   if ver_act is None:
     print('fatal: unable to detect git version', file=sys.stderr)
     raise CloneFailure()
 
   if ver_act < MIN_GIT_VERSION:
     need = '.'.join(map(str, MIN_GIT_VERSION))
-    print('fatal: git %s or later required' % need, file=sys.stderr)
+    print('fatal: git %s or later required; found %s' % (need, ver_act.full),
+          file=sys.stderr)
     raise CloneFailure()
 
 
+def SetGitTrace2ParentSid(env=None):
+  """Set up GIT_TRACE2_PARENT_SID for git tracing."""
+  # We roughly follow the format git itself uses in trace2/tr2_sid.c.
+  # (1) Be unique (2) be valid filename (3) be fixed length.
+  #
+  # Since we always export this variable, we try to avoid more expensive calls.
+  # e.g. We don't attempt hostname lookups or hashing the results.
+  if env is None:
+    env = os.environ
+
+  KEY = 'GIT_TRACE2_PARENT_SID'
+
+  now = datetime.datetime.utcnow()
+  value = 'repo-%s-P%08x' % (now.strftime('%Y%m%dT%H%M%SZ'), os.getpid())
+
+  # If it's already set, then append ourselves.
+  if KEY in env:
+    value = env[KEY] + '/' + value
+
+  _setenv(KEY, value, env=env)
+
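The SID format above follows git's own `trace2/tr2_sid.c` convention: a timestamp plus PID, chained onto any parent SID with `/`. A self-contained sketch (hypothetical helper name) of just the value construction:

```python
import datetime
import os

def make_trace2_sid(existing=None):
    """Build a GIT_TRACE2_PARENT_SID value, chaining onto |existing| if set."""
    now = datetime.datetime.utcnow()
    value = 'repo-%s-P%08x' % (now.strftime('%Y%m%dT%H%M%SZ'), os.getpid())
    if existing:
        # Nested invocations append themselves so traces show the full chain.
        value = existing + '/' + value
    return value

sid = make_trace2_sid()
chained = make_trace2_sid(sid)
```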
+
+def _setenv(key, value, env=None):
+  """Set |key| in the OS environment |env| to |value|."""
+  if env is None:
+    env = os.environ
+  # Environment handling across systems is messy.
+  try:
+    env[key] = value
+  except UnicodeEncodeError:
+    env[key] = value.encode()
+
+
 def NeedSetupGnuPG():
   if not os.path.isdir(home_dot_repo):
     return True
@@ -492,43 +749,54 @@
             file=sys.stderr)
       sys.exit(1)
 
-  env = os.environ.copy()
+  if not quiet:
+    print('repo: Updating release signing keys to keyset ver %s' %
+          ('.'.join(str(x) for x in KEYRING_VERSION),))
+  # NB: We use --homedir (and cwd below) because some environments (Windows) do
+  # not correctly handle full native paths.  We avoid the issue by changing to
+  # the right dir with cwd=gpg_dir before executing gpg, and then telling gpg to
+  # use the cwd (.) as its homedir which leaves the path resolution logic to it.
+  cmd = ['gpg', '--homedir', '.', '--import']
   try:
-    env['GNUPGHOME'] = gpg_dir
-  except UnicodeEncodeError:
-    env['GNUPGHOME'] = gpg_dir.encode()
-
-  cmd = ['gpg', '--import']
-  try:
-    proc = subprocess.Popen(cmd,
-                            env=env,
-                            stdin=subprocess.PIPE)
-  except OSError as e:
+    # gpg can be pretty chatty.  Always capture the output and if something goes
+    # wrong, the builtin check failure will dump stdout & stderr for debugging.
+    run_command(cmd, stdin=subprocess.PIPE, capture_output=True,
+                cwd=gpg_dir, check=True,
+                input=MAINTAINER_KEYS.encode('utf-8'))
+  except OSError:
     if not quiet:
       print('warning: gpg (GnuPG) is not available.', file=sys.stderr)
       print('warning: Installing it is strongly encouraged.', file=sys.stderr)
       print(file=sys.stderr)
     return False
 
-  proc.stdin.write(MAINTAINER_KEYS.encode('utf-8'))
-  proc.stdin.close()
-
-  if proc.wait() != 0:
-    print('fatal: registering repo maintainer keys failed', file=sys.stderr)
-    sys.exit(1)
-  print()
-
   with open(os.path.join(home_dot_repo, 'keyring-version'), 'w') as fd:
     fd.write('.'.join(map(str, KEYRING_VERSION)) + '\n')
   return True
 
 
-def _SetConfig(local, name, value):
+def _SetConfig(cwd, name, value):
   """Set a git configuration option to the specified value.
   """
-  cmd = [GIT, 'config', name, value]
-  if subprocess.Popen(cmd, cwd=local).wait() != 0:
-    raise CloneFailure()
+  run_git('config', name, value, cwd=cwd)
+
+
+def _GetRepoConfig(name):
+  """Read a repo configuration option."""
+  config = os.path.join(home_dot_repo, 'config')
+  if not os.path.exists(config):
+    return None
+
+  cmd = ['config', '--file', config, '--get', name]
+  ret = run_git(*cmd, check=False)
+  if ret.returncode == 0:
+    return ret.stdout
+  elif ret.returncode == 1:
+    return None
+  else:
+    print('repo: error: git %s failed:\n%s' % (' '.join(cmd), ret.stderr),
+          file=sys.stderr)
+    raise RunError()
 
 
 def _InitHttp():
@@ -542,7 +810,7 @@
       p = n.hosts[host]
       mgr.add_password(p[1], 'http://%s/' % host, p[0], p[2])
       mgr.add_password(p[1], 'https://%s/' % host, p[0], p[2])
-  except:
+  except Exception:
     pass
   handlers.append(urllib.request.HTTPBasicAuthHandler(mgr))
   handlers.append(urllib.request.HTTPDigestAuthHandler(mgr))
@@ -556,39 +824,29 @@
   urllib.request.install_opener(urllib.request.build_opener(*handlers))
 
 
-def _Fetch(url, local, src, quiet):
-  if not quiet:
-    print('Get %s' % url, file=sys.stderr)
-
-  cmd = [GIT, 'fetch']
-  if quiet:
+def _Fetch(url, cwd, src, quiet, verbose):
+  cmd = ['fetch']
+  if not verbose:
     cmd.append('--quiet')
+  err = None
+  if not quiet and sys.stdout.isatty():
+    cmd.append('--progress')
+  elif not verbose:
     err = subprocess.PIPE
-  else:
-    err = None
   cmd.append(src)
   cmd.append('+refs/heads/*:refs/remotes/origin/*')
   cmd.append('+refs/tags/*:refs/tags/*')
-
-  proc = subprocess.Popen(cmd, cwd=local, stderr=err)
-  if err:
-    proc.stderr.read()
-    proc.stderr.close()
-  if proc.wait() != 0:
-    raise CloneFailure()
+  run_git(*cmd, stderr=err, capture_output=False, cwd=cwd)
 
 
-def _DownloadBundle(url, local, quiet):
+def _DownloadBundle(url, cwd, quiet, verbose):
   if not url.endswith('/'):
     url += '/'
   url += 'clone.bundle'
 
-  proc = subprocess.Popen(
-      [GIT, 'config', '--get-regexp', 'url.*.insteadof'],
-      cwd=local,
-      stdout=subprocess.PIPE)
-  for line in proc.stdout:
-    line = line.decode('utf-8')
+  ret = run_git('config', '--get-regexp', 'url.*.insteadof', cwd=cwd,
+                check=False)
+  for line in ret.stdout.splitlines():
     m = re.compile(r'^url\.(.*)\.insteadof (.*)$').match(line)
     if m:
       new_url = m.group(1)
@@ -596,29 +854,26 @@
       if url.startswith(old_url):
         url = new_url + url[len(old_url):]
         break
-  proc.stdout.close()
-  proc.wait()
 
   if not url.startswith('http:') and not url.startswith('https:'):
     return False
 
-  dest = open(os.path.join(local, '.git', 'clone.bundle'), 'w+b')
+  dest = open(os.path.join(cwd, '.git', 'clone.bundle'), 'w+b')
   try:
     try:
       r = urllib.request.urlopen(url)
     except urllib.error.HTTPError as e:
-      if e.code in [401, 403, 404, 501]:
-        return False
-      print('fatal: Cannot get %s' % url, file=sys.stderr)
-      print('fatal: HTTP error %s' % e.code, file=sys.stderr)
-      raise CloneFailure()
+      if e.code not in [400, 401, 403, 404, 501]:
+        print('warning: Cannot get %s' % url, file=sys.stderr)
+        print('warning: HTTP error %s' % e.code, file=sys.stderr)
+      return False
     except urllib.error.URLError as e:
       print('fatal: Cannot get %s' % url, file=sys.stderr)
       print('fatal: error %s' % e.reason, file=sys.stderr)
       raise CloneFailure()
     try:
-      if not quiet:
-        print('Get %s' % url, file=sys.stderr)
+      if verbose:
+        print('Downloading clone bundle %s' % url, file=sys.stderr)
       while True:
         buf = r.read(8192)
         if not buf:
@@ -630,124 +885,139 @@
     dest.close()
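The `insteadof` loop in `_DownloadBundle` above mirrors git's `url.<base>.insteadOf` rewriting so the bundle is fetched from the same mirror git would use. A standalone sketch of that rewrite (hypothetical helper; the list stands in for `git config --get-regexp url.*.insteadof` output):

```python
import re

def apply_insteadof(url, config_lines):
    """Rewrite |url| using `url.<base>.insteadOf` style config entries."""
    for line in config_lines:
        m = re.match(r'^url\.(.*)\.insteadof (.*)$', line)
        if m and url.startswith(m.group(2)):
            # Replace the matched prefix with the configured base URL.
            return m.group(1) + url[len(m.group(2)):]
    return url

lines = ['url.git://mirror/.insteadof https://example.com/']
```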
 
 
-def _ImportBundle(local):
-  path = os.path.join(local, '.git', 'clone.bundle')
+def _ImportBundle(cwd):
+  path = os.path.join(cwd, '.git', 'clone.bundle')
   try:
-    _Fetch(local, local, path, True)
+    _Fetch(cwd, cwd, path, True, False)
   finally:
     os.remove(path)
 
 
-def _Clone(url, local, quiet, clone_bundle):
+def _Clone(url, cwd, clone_bundle, quiet, verbose):
   """Clones a git repository to a new subdirectory of repodir
   """
+  if verbose:
+    print('Cloning git repository', url)
+
   try:
-    os.mkdir(local)
+    os.mkdir(cwd)
   except OSError as e:
-    print('fatal: cannot make %s directory: %s' % (local, e.strerror),
+    print('fatal: cannot make %s directory: %s' % (cwd, e.strerror),
           file=sys.stderr)
     raise CloneFailure()
 
-  cmd = [GIT, 'init', '--quiet']
-  try:
-    proc = subprocess.Popen(cmd, cwd=local)
-  except OSError as e:
-    print(file=sys.stderr)
-    print("fatal: '%s' is not available" % GIT, file=sys.stderr)
-    print('fatal: %s' % e, file=sys.stderr)
-    print(file=sys.stderr)
-    print('Please make sure %s is installed and in your path.' % GIT,
-          file=sys.stderr)
-    raise CloneFailure()
-  if proc.wait() != 0:
-    print('fatal: could not create %s' % local, file=sys.stderr)
-    raise CloneFailure()
+  run_git('init', '--quiet', cwd=cwd)
 
   _InitHttp()
-  _SetConfig(local, 'remote.origin.url', url)
-  _SetConfig(local,
+  _SetConfig(cwd, 'remote.origin.url', url)
+  _SetConfig(cwd,
              'remote.origin.fetch',
              '+refs/heads/*:refs/remotes/origin/*')
-  if clone_bundle and _DownloadBundle(url, local, quiet):
-    _ImportBundle(local)
-  _Fetch(url, local, 'origin', quiet)
+  if clone_bundle and _DownloadBundle(url, cwd, quiet, verbose):
+    _ImportBundle(cwd)
+  _Fetch(url, cwd, 'origin', quiet, verbose)
 
 
-def _Verify(cwd, branch, quiet):
-  """Verify the branch has been signed by a tag.
+def resolve_repo_rev(cwd, committish):
+  """Figure out what REPO_REV represents.
+
+  We support:
+  * refs/heads/xxx: Branch.
+  * refs/tags/xxx: Tag.
+  * xxx: Branch or tag or commit.
+
+  Args:
+    cwd: The git checkout to run in.
+    committish: The REPO_REV argument to resolve.
+
+  Returns:
+    A tuple of (remote ref, commit) as makes sense for the committish.
+    For branches, this will look like ('refs/heads/stable', <revision>).
+    For tags, this will look like ('refs/tags/v1.0', <revision>).
+    For commits, this will be (<revision>, <revision>).
   """
-  cmd = [GIT, 'describe', 'origin/%s' % branch]
-  proc = subprocess.Popen(cmd,
-                          stdout=subprocess.PIPE,
-                          stderr=subprocess.PIPE,
-                          cwd=cwd)
-  cur = proc.stdout.read().strip().decode('utf-8')
-  proc.stdout.close()
+  def resolve(committish):
+    ret = run_git('rev-parse', '--verify', '%s^{commit}' % (committish,),
+                  cwd=cwd, check=False)
+    return None if ret.returncode else ret.stdout.strip()
 
-  proc.stderr.read()
-  proc.stderr.close()
+  # An explicit branch.
+  if committish.startswith('refs/heads/'):
+    remote_ref = committish
+    committish = committish[len('refs/heads/'):]
+    rev = resolve('refs/remotes/origin/%s' % committish)
+    if rev is None:
+      print('repo: error: unknown branch "%s"' % (committish,),
+            file=sys.stderr)
+      raise CloneFailure()
+    return (remote_ref, rev)
 
-  if proc.wait() != 0 or not cur:
-    print(file=sys.stderr)
-    print("fatal: branch '%s' has not been signed" % branch, file=sys.stderr)
-    raise CloneFailure()
+  # An explicit tag.
+  if committish.startswith('refs/tags/'):
+    remote_ref = committish
+    committish = committish[len('refs/tags/'):]
+    rev = resolve(remote_ref)
+    if rev is None:
+      print('repo: error: unknown tag "%s"' % (committish,),
+            file=sys.stderr)
+      raise CloneFailure()
+    return (remote_ref, rev)
+
+  # See if it's a short branch name.
+  rev = resolve('refs/remotes/origin/%s' % committish)
+  if rev:
+    return ('refs/heads/%s' % (committish,), rev)
+
+  # See if it's a tag.
+  rev = resolve('refs/tags/%s' % committish)
+  if rev:
+    return ('refs/tags/%s' % (committish,), rev)
+
+  # See if it's a commit.
+  rev = resolve(committish)
+  if rev and rev.lower().startswith(committish.lower()):
+    return (rev, rev)
+
+  # Give up!
+  print('repo: error: unable to resolve "%s"' % (committish,), file=sys.stderr)
+  raise CloneFailure()
+
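The resolution precedence `resolve_repo_rev` documents (explicit branch ref, explicit tag ref, then short branch, tag, commit) can be sketched with a plain dict standing in for `git rev-parse` (hypothetical, simplified — no commit-prefix fallback or error handling):

```python
def resolve_rev(refs, committish):
    """Resolve |committish| using resolve_repo_rev's documented precedence.

    |refs| stands in for `git rev-parse`: a mapping of ref name -> commit.
    """
    if committish.startswith('refs/heads/'):
        branch = committish[len('refs/heads/'):]
        return (committish, refs['refs/remotes/origin/%s' % branch])
    if committish.startswith('refs/tags/'):
        return (committish, refs[committish])
    # Short name: try remote branch first, then tag.
    for template in ('refs/remotes/origin/%s', 'refs/tags/%s'):
        rev = refs.get(template % committish)
        if rev is not None:
            prefix = 'refs/heads/' if 'origin' in template else 'refs/tags/'
            return (prefix + committish, rev)
    # Otherwise treat it as a commit: remote ref and revision are the same.
    return (committish, committish)

refs = {'refs/remotes/origin/stable': 'aaa111', 'refs/tags/v1.0': 'bbb222'}
```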
+
+def verify_rev(cwd, remote_ref, rev, quiet):
+  """Verify the commit has been signed by a tag."""
+  ret = run_git('describe', rev, cwd=cwd)
+  cur = ret.stdout.strip()
 
   m = re.compile(r'^(.*)-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur)
   if m:
     cur = m.group(1)
     if not quiet:
       print(file=sys.stderr)
-      print("info: Ignoring branch '%s'; using tagged release '%s'"
-            % (branch, cur), file=sys.stderr)
+      print("warning: '%s' is not signed; falling back to signed release '%s'"
+            % (remote_ref, cur), file=sys.stderr)
       print(file=sys.stderr)
 
   env = os.environ.copy()
-  try:
-    env['GNUPGHOME'] = gpg_dir
-  except UnicodeEncodeError:
-    env['GNUPGHOME'] = gpg_dir.encode()
-
-  cmd = [GIT, 'tag', '-v', cur]
-  proc = subprocess.Popen(cmd,
-                          stdout=subprocess.PIPE,
-                          stderr=subprocess.PIPE,
-                          cwd=cwd,
-                          env=env)
-  out = proc.stdout.read().decode('utf-8')
-  proc.stdout.close()
-
-  err = proc.stderr.read().decode('utf-8')
-  proc.stderr.close()
-
-  if proc.wait() != 0:
-    print(file=sys.stderr)
-    print(out, file=sys.stderr)
-    print(err, file=sys.stderr)
-    print(file=sys.stderr)
-    raise CloneFailure()
+  _setenv('GNUPGHOME', gpg_dir, env)
+  run_git('tag', '-v', cur, cwd=cwd, env=env)
   return '%s^0' % cur
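The regex in `verify_rev` trims `git describe` output of the form `<tag>-<count>-g<sha>` back to the nearest signed tag. A runnable sketch of just that step (hypothetical helper name):

```python
import re

def strip_describe(cur):
    """Reduce `git describe` output like v1.0-3-gdeadbee to its base tag."""
    m = re.match(r'^(.*)-[0-9]{1,}-g[0-9a-f]{1,}$', cur)
    return m.group(1) if m else cur
```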
 
 
-def _Checkout(cwd, branch, rev, quiet):
+def _Checkout(cwd, remote_ref, rev, quiet):
   """Checkout an upstream branch into the repository and track it.
   """
-  cmd = [GIT, 'update-ref', 'refs/heads/default', rev]
-  if subprocess.Popen(cmd, cwd=cwd).wait() != 0:
-    raise CloneFailure()
+  run_git('update-ref', 'refs/heads/default', rev, cwd=cwd)
 
   _SetConfig(cwd, 'branch.default.remote', 'origin')
-  _SetConfig(cwd, 'branch.default.merge', 'refs/heads/%s' % branch)
+  _SetConfig(cwd, 'branch.default.merge', remote_ref)
 
-  cmd = [GIT, 'symbolic-ref', 'HEAD', 'refs/heads/default']
-  if subprocess.Popen(cmd, cwd=cwd).wait() != 0:
-    raise CloneFailure()
+  run_git('symbolic-ref', 'HEAD', 'refs/heads/default', cwd=cwd)
 
-  cmd = [GIT, 'read-tree', '--reset', '-u']
+  cmd = ['read-tree', '--reset', '-u']
   if not quiet:
     cmd.append('-v')
   cmd.append('HEAD')
-  if subprocess.Popen(cmd, cwd=cwd).wait() != 0:
-    raise CloneFailure()
+  run_git(*cmd, cwd=cwd)
 
 
 def _FindRepo():
@@ -757,9 +1027,7 @@
   repo = None
 
   olddir = None
-  while curdir != '/' \
-          and curdir != olddir \
-          and not repo:
+  while curdir != olddir and not repo:
     repo = os.path.join(curdir, repodir, REPO_MAIN)
     if not os.path.isfile(repo):
       repo = None
@@ -770,6 +1038,26 @@
 
 class _Options(object):
   help = False
+  version = False
+
+
+def _ExpandAlias(name):
+  """Look up user registered aliases."""
+  # We don't resolve aliases for existing subcommands.  This matches git.
+  if name in {'gitc-init', 'help', 'init'}:
+    return name, []
+
+  alias = _GetRepoConfig('alias.%s' % (name,))
+  if alias is None:
+    return name, []
+
+  args = alias.strip().split(' ', 1)
+  name = args[0]
+  if len(args) == 2:
+    args = shlex.split(args[1])
+  else:
+    args = []
+  return name, args
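The alias expansion above splits the configured value once on whitespace, then parses the remainder shell-style with `shlex`, matching git's alias behavior. A self-contained sketch (hypothetical helper; a dict stands in for the repo config file):

```python
import shlex

def expand_alias(name, config):
    """Expand a user alias into (command, extra args), mirroring _ExpandAlias."""
    # Never shadow builtin subcommands, matching git.
    if name in {'gitc-init', 'help', 'init'}:
        return name, []
    alias = config.get('alias.%s' % name)
    if alias is None:
        return name, []
    parts = alias.strip().split(' ', 1)
    if len(parts) == 2:
        return parts[0], shlex.split(parts[1])
    return parts[0], []
```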
 
 
 def _ParseArguments(args):
@@ -781,7 +1069,10 @@
     a = args[i]
     if a == '-h' or a == '--help':
       opt.help = True
-
+    elif a == '--version':
+      opt.version = True
+    elif a == '--trace':
+      trace.set(True)
     elif not a.startswith('-'):
       cmd = a
       arg = args[i + 1:]
@@ -789,6 +1080,90 @@
   return cmd, opt, arg
 
 
+class Requirements(object):
+  """Helper for checking repo's system requirements."""
+
+  REQUIREMENTS_NAME = 'requirements.json'
+
+  def __init__(self, requirements):
+    """Initialize.
+
+    Args:
+      requirements: A dictionary of settings.
+    """
+    self.requirements = requirements
+
+  @classmethod
+  def from_dir(cls, path):
+    return cls.from_file(os.path.join(path, cls.REQUIREMENTS_NAME))
+
+  @classmethod
+  def from_file(cls, path):
+    try:
+      with open(path, 'rb') as f:
+        data = f.read()
+    except EnvironmentError:
+      # NB: EnvironmentError is used for Python 2 & 3 compatibility.
+      # If we couldn't open the file, assume it's an old source tree.
+      return None
+
+    return cls.from_data(data)
+
+  @classmethod
+  def from_data(cls, data):
+    comment_line = re.compile(br'^ *#')
+    strip_data = b''.join(x for x in data.splitlines() if not comment_line.match(x))
+    try:
+      json_data = json.loads(strip_data)
+    except Exception:  # pylint: disable=broad-except
+      # If we couldn't parse it, assume it's incompatible.
+      return None
+
+    return cls(json_data)
+
+  def _get_soft_ver(self, pkg):
+    """Return the soft version for |pkg| if it exists."""
+    return self.requirements.get(pkg, {}).get('soft', ())
+
+  def _get_hard_ver(self, pkg):
+    """Return the hard version for |pkg| if it exists."""
+    return self.requirements.get(pkg, {}).get('hard', ())
+
+  @staticmethod
+  def _format_ver(ver):
+    """Return a dotted version from |ver|."""
+    return '.'.join(str(x) for x in ver)
+
+  def assert_ver(self, pkg, curr_ver):
+    """Verify |pkg|'s |curr_ver| is new enough."""
+    curr_ver = tuple(curr_ver)
+    soft_ver = tuple(self._get_soft_ver(pkg))
+    hard_ver = tuple(self._get_hard_ver(pkg))
+    if curr_ver < hard_ver:
+      print('repo: error: Your version of "%s" (%s) is unsupported; '
+            'Please upgrade to at least version %s to continue.' %
+            (pkg, self._format_ver(curr_ver), self._format_ver(soft_ver)),
+            file=sys.stderr)
+      sys.exit(1)
+
+    if curr_ver < soft_ver:
+      print('repo: warning: Your version of "%s" (%s) is no longer supported; '
+            'Please upgrade to at least version %s to avoid breakage.' %
+            (pkg, self._format_ver(curr_ver), self._format_ver(soft_ver)),
+            file=sys.stderr)
+
+  def assert_all(self):
+    """Assert all of the requirements are satisfied."""
+    # See if we need a repo launcher upgrade first.
+    self.assert_ver('repo', VERSION)
+
+    # Check python before we try to import the repo code.
+    self.assert_ver('python', sys.version_info)
+
+    # Check git while we're at it.
+    self.assert_ver('git', ParseGitVersion())
+
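`Requirements.from_data` tolerates `#` comment lines in `requirements.json`, and `assert_ver` relies on Python's tuple ordering for version checks. A minimal sketch of both pieces (hypothetical names, example data):

```python
import json
import re

def parse_requirements(data):
    """Strip `#` comment lines from |data| (bytes), then parse the JSON."""
    comment_line = re.compile(br'^ *#')
    stripped = b''.join(x for x in data.splitlines() if not comment_line.match(x))
    return json.loads(stripped)

data = b'# hypothetical requirements blob\n{"python": {"soft": [3, 6], "hard": [3, 5]}}\n'
reqs = parse_requirements(data)

# Tuple comparison does the version check: (3, 4) < (3, 5), etc.
too_old = (3, 4) < tuple(reqs['python']['hard'])
```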
+
 def _Usage():
   gitc_usage = ""
   if get_gitc_manifest_dir():
@@ -807,17 +1182,15 @@
 
 For access to the full online help, install repo ("repo init").
 """)
+  print('Bug reports:', BUG_URL)
   sys.exit(0)
 
 
 def _Help(args):
   if args:
-    if args[0] == 'init':
-      init_optparse.print_help()
-      sys.exit(0)
-    elif args[0] == 'gitc-init':
-      _GitcInitOptions(init_optparse)
-      init_optparse.print_help()
+    if args[0] in {'init', 'gitc-init'}:
+      parser = GetParser(gitc_init=args[0] == 'gitc-init')
+      parser.print_help()
       sys.exit(0)
     else:
       print("error: '%s' is not a bootstrap command.\n"
@@ -828,6 +1201,25 @@
   sys.exit(1)
 
 
+def _Version():
+  """Show version information."""
+  print('<repo not installed>')
+  print('repo launcher version %s' % ('.'.join(str(x) for x in VERSION),))
+  print('       (from %s)' % (__file__,))
+  print('git %s' % (ParseGitVersion().full,))
+  print('Python %s' % sys.version)
+  uname = platform.uname()
+  if sys.version_info.major < 3:
+    # Python 3 returns a named tuple, but Python 2 is simpler.
+    print(uname)
+  else:
+    print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
+    print('CPU %s (%s)' %
+          (uname.machine, uname.processor if uname.processor else 'unknown'))
+  print('Bug reports:', BUG_URL)
+  sys.exit(0)
+
+
 def _NotInstalled():
   print('error: repo is not installed.  Use "repo init" to install it here.',
         file=sys.stderr)
@@ -860,26 +1252,26 @@
   global REPO_REV
 
   REPO_URL = gitdir
-  proc = subprocess.Popen([GIT,
-                           '--git-dir=%s' % gitdir,
-                           'symbolic-ref',
-                           'HEAD'],
-                          stdout=subprocess.PIPE,
-                          stderr=subprocess.PIPE)
-  REPO_REV = proc.stdout.read().strip().decode('utf-8')
-  proc.stdout.close()
+  ret = run_git('--git-dir=%s' % gitdir, 'symbolic-ref', 'HEAD', check=False)
+  if ret.returncode:
+    # If we're not tracking a branch (bisect/etc...), then fall back to commit.
+    print('repo: warning: %s has no current branch; using HEAD' % gitdir,
+          file=sys.stderr)
+    try:
+      ret = run_git('rev-parse', 'HEAD', cwd=gitdir)
+    except CloneFailure:
+      print('fatal: %s has invalid HEAD' % gitdir, file=sys.stderr)
+      sys.exit(1)
 
-  proc.stderr.read()
-  proc.stderr.close()
-
-  if proc.wait() != 0:
-    print('fatal: %s has no current branch' % gitdir, file=sys.stderr)
-    sys.exit(1)
+  REPO_REV = ret.stdout.strip()
 
 
 def main(orig_args):
   cmd, opt, args = _ParseArguments(orig_args)
 
+  # We run this early as we run some git commands ourselves.
+  SetGitTrace2ParentSid()
+
   repo_main, rel_repo_dir = None, None
   # Don't use the local repo copy, make sure to switch to the gitc client first.
   if cmd != 'gitc-init':
@@ -896,10 +1288,17 @@
           file=sys.stderr)
     sys.exit(1)
   if not repo_main:
+    # Only expand aliases here since we'll be parsing the CLI ourselves.
+    # If we had repo_main, alias expansion would happen in main.py.
+    cmd, alias_args = _ExpandAlias(cmd)
+    args = alias_args + args
+
     if opt.help:
       _Usage()
     if cmd == 'help':
       _Help(args)
+    if opt.version or cmd == 'version':
+      _Version()
     if not cmd:
       _NotInstalled()
     if cmd == 'init' or cmd == 'gitc-init':
@@ -920,6 +1319,14 @@
   if my_main:
     repo_main = my_main
 
+  if not repo_main:
+    print("fatal: unable to find repo entry point", file=sys.stderr)
+    sys.exit(1)
+
+  reqs = Requirements.from_dir(os.path.dirname(repo_main))
+  if reqs:
+    reqs.assert_all()
+
   ver_str = '.'.join(map(str, VERSION))
   me = [sys.executable, repo_main,
         '--repo-dir=%s' % rel_repo_dir,
@@ -927,16 +1334,9 @@
         '--wrapper-path=%s' % wrapper_path,
         '--']
   me.extend(orig_args)
-  me.extend(extra_args)
-  try:
-    if platform.system() == "Windows":
-      sys.exit(subprocess.call(me))
-    else:
-      os.execv(sys.executable, me)
-  except OSError as e:
-    print("fatal: unable to start %s" % repo_main, file=sys.stderr)
-    print("fatal: %s" % e, file=sys.stderr)
-    sys.exit(148)
+  exec_command(me)
+  print("fatal: unable to start %s" % repo_main, file=sys.stderr)
+  sys.exit(148)
 
 
 if __name__ == '__main__':
diff --git a/repo_trace.py b/repo_trace.py
index f5bc76d..7be0c04 100644
--- a/repo_trace.py
+++ b/repo_trace.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -19,7 +17,6 @@
 Activated via `repo --trace ...` or `REPO_TRACE=1 repo ...`.
 """
 
-from __future__ import print_function
 import sys
 import os
 
@@ -28,13 +25,16 @@
 
 _TRACE = os.environ.get(REPO_TRACE) == '1'
 
+
 def IsTrace():
   return _TRACE
 
+
 def SetTrace():
   global _TRACE
   _TRACE = True
 
+
 def Trace(fmt, *args):
   if IsTrace():
     print(fmt % args, file=sys.stderr)
diff --git a/requirements.json b/requirements.json
new file mode 100644
index 0000000..cb55cd2
--- /dev/null
+++ b/requirements.json
@@ -0,0 +1,57 @@
+# This file declares various requirements for this version of repo.  The
+# launcher script will load it and check the constraints before trying to run
+# us.  This avoids issues of the launcher using an old version of Python (e.g.
+# 3.5) while the codebase has moved on to requiring something much newer (e.g.
+# 3.8).  If the launcher tried to import us, it would fail with syntax errors.
+
+# This is a JSON file with line-level comments allowed.
+
+# Always keep backwards compatibility in mind.  The launcher script is robust
+# against missing values, but when a field is renamed/removed, it means older
+# versions of the launcher script won't be able to enforce the constraint.
+
+# When requiring versions, always use lists as they are easy to parse & compare
+# in Python.  Strings would require further processing to turn into a list.
+
+# Version constraints should be expressed in pairs: soft & hard.  Soft versions
+# are when we start warning users that their software is too old and we're
+# planning on dropping support for it, so they need to start planning system
+# upgrades.  Hard versions are when we refuse to run the tool.  Users will be
+# shown an error message before we abort entirely.
+
+# When deciding whether to upgrade a version requirement, check out the distro
+# lists to see who will be impacted:
+# https://gerrit.googlesource.com/git-repo/+/HEAD/docs/release-process.md#Project-References
+
+{
+  # The repo launcher itself.  This allows us to force people to upgrade as some
+  # ignore the warnings about it being out of date, or install ancient versions
+  # to start with for whatever reason.
+  #
+  # NB: Repo launchers started checking this file with repo-2.12, so listing
+  # versions older than that won't make a difference.
+  "repo": {
+    "hard": [2, 11],
+    "soft": [2, 11]
+  },
+
+  # Supported Python versions.
+  #
+  # python-3.6 is in Ubuntu Bionic.
+  # python-3.7 is in Debian Buster.
+  "python": {
+    "hard": [3, 6],
+    "soft": [3, 6]
+  },
+
+  # Supported git versions.
+  #
+  # git-1.7.2 is in Debian Squeeze.
+  # git-1.7.9 is in Ubuntu Precise.
+  # git-1.9.1 is in Ubuntu Trusty.
+  # git-1.7.10 is in Debian Wheezy.
+  "git": {
+    "hard": [1, 7, 2],
+    "soft": [1, 9, 1]
+  }
+}
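The comment-tolerant JSON format above is straightforward to consume: drop `#` lines, parse the remainder as regular JSON, and compare versions as lists (which compare element-wise in Python). A minimal sketch; the helper names below are illustrative, not the launcher's actual `Requirements` API:

```python
import json

REQS_TEXT = '''
# Hypothetical requirements blob with line-level comments.
{
  # Minimum supported Python.
  "python": {
    "hard": [3, 6],
    "soft": [3, 6]
  }
}
'''

def parse_requirements(text):
    """Drop comment lines, then hand the rest to the normal JSON parser."""
    lines = [x for x in text.splitlines() if not x.strip().startswith('#')]
    return json.loads('\n'.join(lines))

def meets(actual, constraint):
    """Sequences compare element-wise: [3, 8, 0] >= [3, 6] does the right thing."""
    return list(actual) >= list(constraint)

reqs = parse_requirements(REQS_TEXT)
print(meets((3, 8, 0), reqs['python']['hard']))  # → True
```

This is also why the file insists on lists rather than version strings: no extra parsing is needed before the comparison.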
diff --git a/run_tests b/run_tests
index d7144b3..573dd44 100755
--- a/run_tests
+++ b/run_tests
@@ -1,5 +1,4 @@
-#!/usr/bin/env python
-# -*- coding:utf-8 -*-
+#!/usr/bin/env python3
 # Copyright 2019 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,37 +15,46 @@
 
 """Wrapper to run pytest with the right settings."""
 
-from __future__ import print_function
-
 import errno
 import os
+import shutil
 import subprocess
 import sys
 
 
-def run_pytest(cmd, argv):
-  """Run the unittests via |cmd|."""
-  try:
-    return subprocess.call([cmd] + argv)
-  except OSError as e:
-    if e.errno == errno.ENOENT:
-      print('%s: unable to run `%s`: %s' % (__file__, cmd, e), file=sys.stderr)
-      print('%s: Try installing pytest: sudo apt-get install python-pytest' %
-            (__file__,), file=sys.stderr)
-      return 127
-    else:
-      raise
+def find_pytest():
+  """Try to locate a good version of pytest."""
+  # If we're in a virtualenv, assume that it's provided the right pytest.
+  if 'VIRTUAL_ENV' in os.environ:
+    return 'pytest'
+
+  # Use the Python 3 version if available.
+  ret = shutil.which('pytest-3')
+  if ret:
+    return ret
+
+  # Hopefully this is a Python 3 version.
+  ret = shutil.which('pytest')
+  if ret:
+    return ret
+
+  print('%s: unable to find pytest.' % (__file__,), file=sys.stderr)
+  print('%s: Try installing: sudo apt-get install python3-pytest' % (__file__,),
+        file=sys.stderr)
 
 
 def main(argv):
   """The main entry."""
   # Add the repo tree to PYTHONPATH as the tests expect to be able to import
   # modules directly.
-  topdir = os.path.dirname(os.path.realpath(__file__))
-  pythonpath = os.environ.get('PYTHONPATH', '')
-  os.environ['PYTHONPATH'] = '%s:%s' % (topdir, pythonpath)
+  pythonpath = os.path.dirname(os.path.realpath(__file__))
+  oldpythonpath = os.environ.get('PYTHONPATH', None)
+  if oldpythonpath is not None:
+    pythonpath += os.pathsep + oldpythonpath
+  os.environ['PYTHONPATH'] = pythonpath
 
-  return run_pytest('pytest', argv)
+  pytest = find_pytest()
+  if not pytest:
+    return 127
+  return subprocess.run([pytest] + argv, check=False).returncode
 
 
 if __name__ == '__main__':
diff --git a/setup.py b/setup.py
index f4d7728..17aeae2 100755
--- a/setup.py
+++ b/setup.py
@@ -1,5 +1,4 @@
-#!/usr/bin/env python
-# -*- coding:utf-8 -*-
+#!/usr/bin/env python3
 # Copyright 2019 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,8 +15,6 @@
 
 """Python packaging for repo."""
 
-from __future__ import print_function
-
 import os
 import setuptools
 
@@ -35,7 +32,7 @@
 # https://packaging.python.org/tutorials/packaging-projects/
 setuptools.setup(
     name='repo',
-    version='1.13.8',
+    version='2',
     maintainer='Various',
     maintainer_email='repo-discuss@googlegroups.com',
     description='Repo helps manage many Git repositories',
@@ -55,9 +52,10 @@
         'Operating System :: MacOS :: MacOS X',
         'Operating System :: Microsoft :: Windows :: Windows 10',
         'Operating System :: POSIX :: Linux',
+        'Programming Language :: Python :: 3',
+        'Programming Language :: Python :: 3 :: Only',
         'Topic :: Software Development :: Version Control :: Git',
     ],
-    # We support Python 2.7 and Python 3.6+.
-    python_requires='>=2.7, ' + ', '.join('!=3.%i.*' % x for x in range(0, 6)),
+    python_requires='>=3.6',
     packages=['subcmds'],
 )
diff --git a/ssh.py b/ssh.py
new file mode 100644
index 0000000..0ae8d12
--- /dev/null
+++ b/ssh.py
@@ -0,0 +1,277 @@
+# Copyright (C) 2008 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Common SSH management logic."""
+
+import functools
+import multiprocessing
+import os
+import re
+import signal
+import subprocess
+import sys
+import tempfile
+import time
+
+import platform_utils
+from repo_trace import Trace
+
+
+PROXY_PATH = os.path.join(os.path.dirname(__file__), 'git_ssh')
+
+
+def _run_ssh_version():
+  """run ssh -V to display the version number"""
+  return subprocess.check_output(['ssh', '-V'], stderr=subprocess.STDOUT).decode()
+
+
+def _parse_ssh_version(ver_str=None):
+  """parse a ssh version string into a tuple"""
+  if ver_str is None:
+    ver_str = _run_ssh_version()
+  m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
+  if m:
+    return tuple(int(x) for x in m.group(1).split('.'))
+  else:
+    return ()
+
+
+@functools.lru_cache(maxsize=None)
+def version():
+  """return ssh version as a tuple"""
+  try:
+    return _parse_ssh_version()
+  except subprocess.CalledProcessError:
+    print('fatal: unable to detect ssh version', file=sys.stderr)
+    sys.exit(1)
+
+
+URI_SCP = re.compile(r'^([^@:]*@?[^:/]{1,}):')
+URI_ALL = re.compile(r'^([a-z][a-z+-]*)://([^@/]*@?[^/]*)/')
+
+
+class ProxyManager:
+  """Manage various ssh clients & masters that we spawn.
+
+  This will take care of sharing state between multiprocessing children, and
+  make sure that if we crash, we don't leak any of the ssh sessions.
+
+  The code should work with a single-process scenario too, and not add too much
+  overhead due to the manager.
+  """
+
+  # Path to the ssh program to run which will pass our master settings along.
+  # Set here more as a convenience API.
+  proxy = PROXY_PATH
+
+  def __init__(self, manager):
+    # Protect access to the list of active masters.
+    self._lock = multiprocessing.Lock()
+    # List of active masters (pid).  These will be spawned on demand, and we are
+    # responsible for shutting them all down at the end.
+    self._masters = manager.list()
+    # Set of active masters indexed by "host:port" information.
+    # The value isn't used, but multiprocessing doesn't provide a set class.
+    self._master_keys = manager.dict()
+    # Whether ssh masters are known to be broken, so we give up entirely.
+    self._master_broken = manager.Value('b', False)
+    # List of active ssh sessions.  Clients will be added & removed as
+    # connections finish, so this list is just for safety & cleanup if we crash.
+    self._clients = manager.list()
+    # Path to directory for holding master sockets.
+    self._sock_path = None
+
+  def __enter__(self):
+    """Enter a new context."""
+    return self
+
+  def __exit__(self, exc_type, exc_value, traceback):
+    """Exit a context & clean up all resources."""
+    self.close()
+
+  def add_client(self, proc):
+    """Track a new ssh session."""
+    self._clients.append(proc.pid)
+
+  def remove_client(self, proc):
+    """Remove a completed ssh session."""
+    try:
+      self._clients.remove(proc.pid)
+    except ValueError:
+      pass
+
+  def add_master(self, proc):
+    """Track a new master connection."""
+    self._masters.append(proc.pid)
+
+  def _terminate(self, procs):
+    """Kill all |procs|."""
+    for pid in procs:
+      try:
+        os.kill(pid, signal.SIGTERM)
+        os.waitpid(pid, 0)
+      except OSError:
+        pass
+
+    # The multiprocessing.list() API doesn't provide many standard list()
+    # methods, so we have to manually clear the list.
+    while True:
+      try:
+        procs.pop(0)
+      except IndexError:
+        break
+
+  def close(self):
+    """Close this active ssh session.
+
+    Kill all ssh clients & masters we created, and nuke the socket dir.
+    """
+    self._terminate(self._clients)
+    self._terminate(self._masters)
+
+    d = self.sock(create=False)
+    if d:
+      try:
+        platform_utils.rmdir(os.path.dirname(d))
+      except OSError:
+        pass
+
+  def _open_unlocked(self, host, port=None):
+    """Make sure a ssh master session exists for |host| & |port|.
+
+    If one doesn't exist already, we'll create it.
+
+    We won't grab any locks, so the caller has to do that.  This helps keep the
+    business logic of actually creating the master separate from grabbing locks.
+    """
+    # Check to see whether we already think that the master is running; if we
+    # think it's already running, return right away.
+    if port is not None:
+      key = '%s:%s' % (host, port)
+    else:
+      key = host
+
+    if key in self._master_keys:
+      return True
+
+    if self._master_broken.value or 'GIT_SSH' in os.environ:
+      # Failed earlier, so don't retry.
+      return False
+
+    # We will make two calls to ssh; this is the common part of both calls.
+    command_base = ['ssh', '-o', 'ControlPath %s' % self.sock(), host]
+    if port is not None:
+      command_base[1:1] = ['-p', str(port)]
+
+    # Since the key wasn't in _master_keys, we think that master isn't running.
+    # ...but before actually starting a master, we'll double-check.  This can
+    # be important because we can't tell that 'git@myhost.com' is the same as
+    # 'myhost.com' where "User git" is set up in the user's ~/.ssh/config file.
+    check_command = command_base + ['-O', 'check']
+    try:
+      Trace(': %s', ' '.join(check_command))
+      check_process = subprocess.Popen(check_command,
+                                       stdout=subprocess.PIPE,
+                                       stderr=subprocess.PIPE)
+      check_process.communicate()  # read output, but ignore it...
+      isnt_running = check_process.wait()
+
+      if not isnt_running:
+        # Our double-check found that the master _was_ in fact running.  Add to
+        # the list of keys.
+        self._master_keys[key] = True
+        return True
+    except Exception:
+      # Ignore exceptions.  We will fall back to the normal command and print
+      # to the log there.
+      pass
+
+    command = command_base[:1] + ['-M', '-N'] + command_base[1:]
+    try:
+      Trace(': %s', ' '.join(command))
+      p = subprocess.Popen(command)
+    except Exception as e:
+      self._master_broken.value = True
+      print('\nwarn: cannot enable ssh control master for %s:%s\n%s'
+            % (host, port, str(e)), file=sys.stderr)
+      return False
+
+    time.sleep(1)
+    ssh_died = (p.poll() is not None)
+    if ssh_died:
+      return False
+
+    self.add_master(p)
+    self._master_keys[key] = True
+    return True
+
+  def _open(self, host, port=None):
+    """Make sure a ssh master session exists for |host| & |port|.
+
+    If one doesn't exist already, we'll create it.
+
+    This will obtain any necessary locks to avoid inter-process races.
+    """
+    # Bail before grabbing the lock if we already know that we aren't going to
+    # try creating new masters below.
+    if sys.platform in ('win32', 'cygwin'):
+      return False
+
+    # Acquire the lock.  This is needed to prevent opening multiple masters for
+    # the same host when we're running "repo sync -jN" (for N > 1) _and_ the
+    # manifest <remote fetch="ssh://xyz"> specifies a different host from the
+    # one that was passed to repo init.
+    with self._lock:
+      return self._open_unlocked(host, port)
+
+  def preconnect(self, url):
+    """If |uri| will create a ssh connection, setup the ssh master for it."""
+    m = URI_ALL.match(url)
+    if m:
+      scheme = m.group(1)
+      host = m.group(2)
+      if ':' in host:
+        host, port = host.split(':')
+      else:
+        port = None
+      if scheme in ('ssh', 'git+ssh', 'ssh+git'):
+        return self._open(host, port)
+      return False
+
+    m = URI_SCP.match(url)
+    if m:
+      host = m.group(1)
+      return self._open(host)
+
+    return False
+
+  def sock(self, create=True):
+    """Return the path to the ssh socket dir.
+
+    This has all the master sockets so clients can talk to them.
+    """
+    if self._sock_path is None:
+      if not create:
+        return None
+      tmp_dir = '/tmp'
+      if not os.path.exists(tmp_dir):
+        tmp_dir = tempfile.gettempdir()
+      if version() < (6, 7):
+        tokens = '%r@%h:%p'
+      else:
+        tokens = '%C'  # hash of %l%h%p%r
+      self._sock_path = os.path.join(
+          tempfile.mkdtemp('', 'ssh-', tmp_dir),
+          'master-' + tokens)
+    return self._sock_path
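The `version()` / `sock()` interplay above hinges on parsing the `ssh -V` banner into a tuple so it can be compared against `(6, 7)` (OpenSSH 6.7 introduced the `%C` ControlPath token). A self-contained sketch of the same parsing, for illustration only:

```python
import re

def parse_openssh_version(ver_str):
    """Mirror of ssh.py's banner parsing (e.g. 'OpenSSH_6.6.1p1 Ubuntu ...')."""
    m = re.match(r'^OpenSSH_([0-9.]+)(p[0-9]+)?\s', ver_str)
    return tuple(int(x) for x in m.group(1).split('.')) if m else ()

ver = parse_openssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2, OpenSSL 1.0.1f\n')
# Tuples compare element-wise, so the ControlPath token choice is one test:
token = '%C' if ver >= (6, 7) else '%r@%h:%p'
print(ver, token)  # → (6, 6, 1) %r@%h:%p
```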
diff --git a/subcmds/__init__.py b/subcmds/__init__.py
index 2734103..051dda0 100644
--- a/subcmds/__init__.py
+++ b/subcmds/__init__.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,6 +14,7 @@
 
 import os
 
+# A mapping of the subcommand name to the class that implements it.
 all_commands = {}
 
 my_dir = os.path.dirname(__file__)
@@ -37,14 +36,14 @@
                      ['%s' % name])
     mod = getattr(mod, name)
     try:
-      cmd = getattr(mod, clsn)()
+      cmd = getattr(mod, clsn)
     except AttributeError:
       raise SyntaxError('%s/%s does not define class %s' % (
-                         __name__, py, clsn))
+          __name__, py, clsn))
 
     name = name.replace('_', '-')
     cmd.NAME = name
     all_commands[name] = cmd
 
-if 'help' in all_commands:
-  all_commands['help'].commands = all_commands
+# Add 'branch' as an alias for 'branches'.
+all_commands['branch'] = all_commands['branches']
diff --git a/subcmds/abandon.py b/subcmds/abandon.py
index cd1d0c4..85d85f5 100644
--- a/subcmds/abandon.py
+++ b/subcmds/abandon.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,15 +12,18 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
-import sys
-from command import Command
 from collections import defaultdict
+import functools
+import itertools
+import sys
+
+from command import Command, DEFAULT_LOCAL_JOBS
 from git_command import git
 from progress import Progress
 
+
 class Abandon(Command):
-  common = True
+  COMMON = True
   helpSummary = "Permanently abandon a development branch"
   helpUsage = """
 %prog [--all | <branchname>] [<project>...]
@@ -32,6 +33,8 @@
 
 It is equivalent to "git branch -D <branchname>".
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
+
   def _Options(self, p):
     p.add_option('--all',
                  dest='all', action='store_true',
@@ -48,52 +51,64 @@
     else:
       args.insert(0, "'All local branches'")
 
+  def _ExecuteOne(self, all_branches, nb, project):
+    """Abandon one project."""
+    if all_branches:
+      branches = project.GetBranches()
+    else:
+      branches = [nb]
+
+    ret = {}
+    for name in branches:
+      status = project.AbandonBranch(name)
+      if status is not None:
+        ret[name] = status
+    return (ret, project)
+
   def Execute(self, opt, args):
     nb = args[0]
     err = defaultdict(list)
     success = defaultdict(list)
     all_projects = self.GetProjects(args[1:])
 
-    pm = Progress('Abandon %s' % nb, len(all_projects))
-    for project in all_projects:
-      pm.update()
-
-      if opt.all:
-        branches = list(project.GetBranches().keys())
-      else:
-        branches = [nb]
-
-      for name in branches:
-        status = project.AbandonBranch(name)
-        if status is not None:
+    def _ProcessResults(_pool, pm, states):
+      for (results, project) in states:
+        for branch, status in results.items():
           if status:
-            success[name].append(project)
+            success[branch].append(project)
           else:
-            err[name].append(project)
-    pm.end()
+            err[branch].append(project)
+        pm.update()
 
-    width = 25
-    for name in branches:
-      if width < len(name):
-        width = len(name)
+    self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, opt.all, nb),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Abandon %s' % (nb,), len(all_projects), quiet=opt.quiet))
 
+    width = max(itertools.chain(
+        [25], (len(x) for x in itertools.chain(success, err))))
     if err:
       for br in err.keys():
-        err_msg = "error: cannot abandon %s" %br
+        err_msg = "error: cannot abandon %s" % br
         print(err_msg, file=sys.stderr)
         for proj in err[br]:
-          print(' '*len(err_msg) + " | %s" % proj.relpath, file=sys.stderr)
+          print(' ' * len(err_msg) + " | %s" % proj.relpath, file=sys.stderr)
       sys.exit(1)
     elif not success:
       print('error: no project has local branch(es) : %s' % nb,
             file=sys.stderr)
       sys.exit(1)
     else:
-      print('Abandoned branches:', file=sys.stderr)
+      # Everything below here is displaying status.
+      if opt.quiet:
+        return
+      print('Abandoned branches:')
       for br in success.keys():
         if len(all_projects) > 1 and len(all_projects) == len(success[br]):
           result = "all project"
         else:
           result = "%s" % (
-            ('\n'+' '*width + '| ').join(p.relpath for p in success[br]))
-        print("%s%s| %s\n" % (br,' '*(width-len(br)), result),file=sys.stderr)
+              ('\n' + ' ' * width + '| ').join(p.relpath for p in success[br]))
+        print("%s%s| %s\n" % (br, ' ' * (width - len(br)), result))
diff --git a/subcmds/branches.py b/subcmds/branches.py
index fb60d7d..6d975ed 100644
--- a/subcmds/branches.py
+++ b/subcmds/branches.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,18 +12,21 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import itertools
 import sys
+
 from color import Coloring
-from command import Command
+from command import Command, DEFAULT_LOCAL_JOBS
+
 
 class BranchColoring(Coloring):
   def __init__(self, config):
     Coloring.__init__(self, config, 'branch')
     self.current = self.printer('current', fg='green')
-    self.local   = self.printer('local')
+    self.local = self.printer('local')
     self.notinproject = self.printer('notinproject', fg='red')
 
+
 class BranchInfo(object):
   def __init__(self, name):
     self.name = name
@@ -61,7 +62,7 @@
 
 
 class Branches(Command):
-  common = True
+  COMMON = True
   helpSummary = "View current topic branches"
   helpUsage = """
 %prog [<project>...]
@@ -94,6 +95,7 @@
 is shown, then the branch appears in all projects.
 
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def Execute(self, opt, args):
     projects = self.GetProjects(args)
@@ -101,14 +103,19 @@
     all_branches = {}
     project_cnt = len(projects)
 
-    for project in projects:
-      for name, b in project.GetBranches().items():
-        b.project = project
+    def _ProcessResults(_pool, _output, results):
+      for name, b in itertools.chain.from_iterable(results):
         if name not in all_branches:
           all_branches[name] = BranchInfo(name)
         all_branches[name].add(b)
 
-    names = list(sorted(all_branches))
+    self.ExecuteInParallel(
+        opt.jobs,
+        expand_project_to_branches,
+        projects,
+        callback=_ProcessResults)
+
+    names = sorted(all_branches)
 
     if not names:
       print('   (no branches)', file=sys.stderr)
@@ -158,7 +165,7 @@
           for b in i.projects:
             have.add(b.project)
           for p in projects:
-            if not p in have:
+            if p not in have:
               paths.append(p.relpath)
 
         s = ' %s %s' % (in_type, ', '.join(paths))
@@ -170,11 +177,27 @@
           fmt = out.current if i.IsCurrent else out.write
           for p in paths:
             out.nl()
-            fmt(width*' ' + '          %s' % p)
+            fmt(width * ' ' + '          %s' % p)
           fmt = out.write
           for p in non_cur_paths:
             out.nl()
-            fmt(width*' ' + '          %s' % p)
+            fmt(width * ' ' + '          %s' % p)
       else:
         out.write(' in all projects')
       out.nl()
+
+
+def expand_project_to_branches(project):
+  """Expands a project into a list of branch names & associated information.
+
+  Args:
+    project: project.Project
+
+  Returns:
+    List[Tuple[str, git_config.Branch]]
+  """
+  branches = []
+  for name, b in project.GetBranches().items():
+    b.project = project
+    branches.append((name, b))
+  return branches
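The refactor above moves the per-project branch expansion into a top-level function so it can be pickled and dispatched by `ExecuteInParallel`, with `itertools.chain.from_iterable` flattening the per-project lists back into one stream. A sketch of the aggregation pattern using plain `map` in place of the pool (project objects replaced by hypothetical path/branch pairs):

```python
import itertools

# Hypothetical stand-ins: each "project" maps to the branch names it carries.
projects = {'platform/a': ['main', 'wip-fix'], 'platform/b': ['main']}

def expand_project_to_branches(item):
    # Same shape as the real helper: List[Tuple[branch_name, info]].
    path, branches = item
    return [(name, path) for name in branches]

all_branches = {}
results = map(expand_project_to_branches, projects.items())  # a pool map in repo
for name, info in itertools.chain.from_iterable(results):
    all_branches.setdefault(name, []).append(info)
print(sorted(all_branches))  # → ['main', 'wip-fix']
```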
diff --git a/subcmds/checkout.py b/subcmds/checkout.py
index c8a09a8..9b42948 100644
--- a/subcmds/checkout.py
+++ b/subcmds/checkout.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,13 +12,15 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import functools
 import sys
-from command import Command
+
+from command import Command, DEFAULT_LOCAL_JOBS
 from progress import Progress
 
+
 class Checkout(Command):
-  common = True
+  COMMON = True
   helpSummary = "Checkout a branch for development"
   helpUsage = """
 %prog <branchname> [<project>...]
@@ -33,28 +33,37 @@
 
   repo forall [<project>...] -c git checkout <branchname>
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def ValidateOptions(self, opt, args):
     if not args:
       self.Usage()
 
+  def _ExecuteOne(self, nb, project):
+    """Checkout one project."""
+    return (project.CheckoutBranch(nb), project)
+
   def Execute(self, opt, args):
     nb = args[0]
     err = []
     success = []
     all_projects = self.GetProjects(args[1:])
 
-    pm = Progress('Checkout %s' % nb, len(all_projects))
-    for project in all_projects:
-      pm.update()
+    def _ProcessResults(_pool, pm, results):
+      for status, project in results:
+        if status is not None:
+          if status:
+            success.append(project)
+          else:
+            err.append(project)
+        pm.update()
 
-      status = project.CheckoutBranch(nb)
-      if status is not None:
-        if status:
-          success.append(project)
-        else:
-          err.append(project)
-    pm.end()
+    self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, nb),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Checkout %s' % (nb,), len(all_projects), quiet=opt.quiet))
 
     if err:
       for p in err:
diff --git a/subcmds/cherry_pick.py b/subcmds/cherry_pick.py
index a541a04..7bd858b 100644
--- a/subcmds/cherry_pick.py
+++ b/subcmds/cherry_pick.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2010 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import re
 import sys
 from command import Command
@@ -22,8 +19,9 @@
 
 CHANGE_ID_RE = re.compile(r'^\s*Change-Id: I([0-9a-f]{40})\s*$')
 
+
 class CherryPick(Command):
-  common = True
+  COMMON = True
   helpSummary = "Cherry-pick a change."
   helpUsage = """
 %prog <sha1>
@@ -34,9 +32,6 @@
 change id will be added.
 """
 
-  def _Options(self, p):
-    pass
-
   def ValidateOptions(self, opt, args):
     if len(args) != 1:
       self.Usage()
@@ -46,8 +41,8 @@
 
     p = GitCommand(None,
                    ['rev-parse', '--verify', reference],
-                   capture_stdout = True,
-                   capture_stderr = True)
+                   capture_stdout=True,
+                   capture_stderr=True)
     if p.Wait() != 0:
       print(p.stderr, file=sys.stderr)
       sys.exit(1)
@@ -61,8 +56,8 @@
 
     p = GitCommand(None,
                    ['cherry-pick', sha1],
-                   capture_stdout = True,
-                   capture_stderr = True)
+                   capture_stdout=True,
+                   capture_stderr=True)
     status = p.Wait()
 
     print(p.stdout, file=sys.stdout)
@@ -74,11 +69,9 @@
       new_msg = self._Reformat(old_msg, sha1)
 
       p = GitCommand(None, ['commit', '--amend', '-F', '-'],
-                     provide_stdin = True,
-                     capture_stdout = True,
-                     capture_stderr = True)
-      p.stdin.write(new_msg)
-      p.stdin.close()
+                     input=new_msg,
+                     capture_stdout=True,
+                     capture_stderr=True)
       if p.Wait() != 0:
         print("error: Failed to update commit message", file=sys.stderr)
         sys.exit(1)
@@ -97,7 +90,7 @@
 
   def _StripHeader(self, commit_msg):
     lines = commit_msg.splitlines()
-    return "\n".join(lines[lines.index("")+1:])
+    return "\n".join(lines[lines.index("") + 1:])
 
   def _Reformat(self, old_msg, sha1):
     new_msg = []
diff --git a/subcmds/diff.py b/subcmds/diff.py
index fa41e70..00a7ec2 100644
--- a/subcmds/diff.py
+++ b/subcmds/diff.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,10 +12,14 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from command import PagedCommand
+import functools
+import io
+
+from command import DEFAULT_LOCAL_JOBS, PagedCommand
+
 
 class Diff(PagedCommand):
-  common = True
+  COMMON = True
   helpSummary = "Show changes between commit and working tree"
   helpUsage = """
 %prog [<project>...]
@@ -26,19 +28,42 @@
 relative to the repository root, so the output can be applied
 to the Unix 'patch' command.
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def _Options(self, p):
-    def cmd(option, opt_str, value, parser):
-      setattr(parser.values, option.dest, list(parser.rargs))
-      while parser.rargs:
-        del parser.rargs[0]
     p.add_option('-u', '--absolute',
                  dest='absolute', action='store_true',
-                 help='Paths are relative to the repository root')
+                 help='paths are relative to the repository root')
+
+  def _ExecuteOne(self, absolute, project):
+    """Obtains the diff for a specific project.
+
+    Args:
+      absolute: Paths are relative to the root.
+      project: Project to get status of.
+
+    Returns:
+      The status of the project.
+    """
+    buf = io.StringIO()
+    ret = project.PrintWorkTreeDiff(absolute, output_redir=buf)
+    return (ret, buf.getvalue())
 
   def Execute(self, opt, args):
-    ret = 0
-    for project in self.GetProjects(args):
-      if not project.PrintWorkTreeDiff(opt.absolute):
-        ret = 1
-    return ret
+    all_projects = self.GetProjects(args)
+
+    def _ProcessResults(_pool, _output, results):
+      ret = 0
+      for (state, output) in results:
+        if output:
+          print(output, end='')
+        if not state:
+          ret = 1
+      return ret
+
+    return self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, opt.absolute),
+        all_projects,
+        callback=_ProcessResults,
+        ordered=True)
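The diff.py change shows the recurring shape of these conversions: `functools.partial` bakes the parsed option into the worker so the pool only passes each project, output is captured in an `io.StringIO` instead of written directly, and `ordered=True` keeps results in input order. A sketch of that shape with a thread-backed pool (repo's `ExecuteInParallel` lives in command.py and uses processes; `diff_one` here is a hypothetical stand-in):

```python
import functools
import io
from multiprocessing.dummy import Pool  # thread-backed Pool, for illustration

def diff_one(absolute, project):
    """Stand-in for _ExecuteOne: render per-project output into a buffer."""
    buf = io.StringIO()
    buf.write('--- %s (absolute=%s)\n' % (project, absolute))
    return (True, buf.getvalue())

with Pool(2) as pool:
    # partial() bakes in the option so workers only receive the project.
    results = pool.map(functools.partial(diff_one, False), ['proj/a', 'proj/b'])
for ok, text in results:
    print(text, end='')
```

`pool.map` preserves input order, which is what keeps interleaved worker output deterministic once it is buffered per project.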
diff --git a/subcmds/diffmanifests.py b/subcmds/diffmanifests.py
index b999699..f6cc30a 100644
--- a/subcmds/diffmanifests.py
+++ b/subcmds/diffmanifests.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2014 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,12 +14,14 @@
 
 from color import Coloring
 from command import PagedCommand
-from manifest_xml import XmlManifest
+from manifest_xml import RepoClient
+
 
 class _Coloring(Coloring):
   def __init__(self, config):
     Coloring.__init__(self, config, "status")
 
+
 class Diffmanifests(PagedCommand):
   """ A command to see logs in projects represented by manifests
 
@@ -31,7 +31,7 @@
   deeper level.
   """
 
-  common = True
+  COMMON = True
   helpSummary = "Manifest diff utility"
   helpUsage = """%prog manifest1.xml [manifest2.xml] [options]"""
 
@@ -68,16 +68,16 @@
   def _Options(self, p):
     p.add_option('--raw',
                  dest='raw', action='store_true',
-                 help='Display raw diff.')
+                 help='display raw diff')
     p.add_option('--no-color',
                  dest='color', action='store_false', default=True,
-                 help='does not display the diff in color.')
+                 help='does not display the diff in color')
     p.add_option('--pretty-format',
                  dest='pretty_format', action='store',
                  metavar='<FORMAT>',
                  help='print the log using a custom git pretty format string')
 
-  def _printRawDiff(self, diff):
+  def _printRawDiff(self, diff, pretty_format=None):
     for project in diff['added']:
       self.printText("A %s %s" % (project.relpath, project.revisionExpr))
       self.out.nl()
@@ -90,7 +90,7 @@
       self.printText("C %s %s %s" % (project.relpath, project.revisionExpr,
                                      otherProject.revisionExpr))
       self.out.nl()
-      self._printLogs(project, otherProject, raw=True, color=False)
+      self._printLogs(project, otherProject, raw=True, color=False, pretty_format=pretty_format)
 
     for project, otherProject in diff['unreachable']:
       self.printText("U %s %s %s" % (project.relpath, project.revisionExpr,
@@ -181,26 +181,26 @@
       self.OptionParser.error('missing manifests to diff')
 
   def Execute(self, opt, args):
-    self.out = _Coloring(self.manifest.globalConfig)
+    self.out = _Coloring(self.client.globalConfig)
     self.printText = self.out.nofmt_printer('text')
     if opt.color:
-      self.printProject = self.out.nofmt_printer('project', attr = 'bold')
-      self.printAdded = self.out.nofmt_printer('green', fg = 'green', attr = 'bold')
-      self.printRemoved = self.out.nofmt_printer('red', fg = 'red', attr = 'bold')
-      self.printRevision = self.out.nofmt_printer('revision', fg = 'yellow')
+      self.printProject = self.out.nofmt_printer('project', attr='bold')
+      self.printAdded = self.out.nofmt_printer('green', fg='green', attr='bold')
+      self.printRemoved = self.out.nofmt_printer('red', fg='red', attr='bold')
+      self.printRevision = self.out.nofmt_printer('revision', fg='yellow')
     else:
       self.printProject = self.printAdded = self.printRemoved = self.printRevision = self.printText
 
-    manifest1 = XmlManifest(self.manifest.repodir)
+    manifest1 = RepoClient(self.repodir)
     manifest1.Override(args[0], load_local_manifests=False)
     if len(args) == 1:
       manifest2 = self.manifest
     else:
-      manifest2 = XmlManifest(self.manifest.repodir)
+      manifest2 = RepoClient(self.repodir)
       manifest2.Override(args[1], load_local_manifests=False)
 
     diff = manifest1.projectsDiff(manifest2)
     if opt.raw:
-      self._printRawDiff(diff)
+      self._printRawDiff(diff, pretty_format=opt.pretty_format)
     else:
       self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)
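The `_printRawDiff` change above threads `opt.pretty_format` down into `_printLogs`, so `--pretty-format` is now honored in raw mode too. A sketch of that keyword threading with hypothetical stand-in functions (the `'%h %s'` fallback is an assumption for illustration, not repo's actual default):

```python
def print_logs(project, raw=False, color=True, pretty_format=None):
    # Fall back to a default format when --pretty-format was not given.
    fmt = pretty_format if pretty_format is not None else '%h %s'
    return 'log %s with format %r' % (project, fmt)

def print_raw_diff(changed, pretty_format=None):
    # Mirrors the fix above: previously raw mode silently dropped the option.
    out = []
    for project, other in changed:
        out.append('C %s %s' % (project, other))
        out.append(print_logs(project, raw=True, color=False,
                              pretty_format=pretty_format))
    return out

lines = print_raw_diff([('a', 'b')], pretty_format='%H')
```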
diff --git a/subcmds/download.py b/subcmds/download.py
index f746bc2..523f25e 100644
--- a/subcmds/download.py
+++ b/subcmds/download.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,17 +12,17 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import re
 import sys
 
 from command import Command
-from error import GitError
+from error import GitError, NoSuchProjectError
 
 CHANGE_RE = re.compile(r'^([1-9][0-9]*)(?:[/\.-]([1-9][0-9]*))?$')
 
+
 class Download(Command):
-  common = True
+  COMMON = True
   helpSummary = "Download and checkout a change"
   helpUsage = """
 %prog {[project] change[/patchset]}...
@@ -36,9 +34,13 @@
 """
 
   def _Options(self, p):
+    p.add_option('-b', '--branch',
+                 help='create a new branch first')
     p.add_option('-c', '--cherry-pick',
                  dest='cherrypick', action='store_true',
                  help="cherry-pick instead of checkout")
+    p.add_option('-x', '--record-origin', action='store_true',
+                 help='pass -x when cherry-picking')
     p.add_option('-r', '--revert',
                  dest='revert', action='store_true',
                  help="revert instead of checkout")
@@ -58,6 +60,7 @@
       if m:
         if not project:
           project = self.GetProjects(".")[0]
+          print('Defaulting to cwd project', project.name)
         chg_id = int(m.group(1))
         if m.group(2):
           ps_id = int(m.group(2))
@@ -74,9 +77,33 @@
                 ps_id = max(int(match.group(1)), ps_id)
         to_get.append((project, chg_id, ps_id))
       else:
-        project = self.GetProjects([a])[0]
+        projects = self.GetProjects([a])
+        if len(projects) > 1:
+          # If the cwd is one of the projects, assume they want that.
+          try:
+            project = self.GetProjects('.')[0]
+          except NoSuchProjectError:
+            project = None
+          if project not in projects:
+            print('error: %s matches too many projects; please re-run inside '
+                  'the project checkout.' % (a,), file=sys.stderr)
+            for project in projects:
+              print('  %s/ @ %s' % (project.relpath, project.revisionExpr),
+                    file=sys.stderr)
+            sys.exit(1)
+        else:
+          project = projects[0]
+          print('Defaulting to cwd project', project.name)
     return to_get
 
+  def ValidateOptions(self, opt, args):
+    if opt.record_origin:
+      if not opt.cherrypick:
+        self.OptionParser.error('-x only makes sense with --cherry-pick')
+
+      if opt.ffonly:
+        self.OptionParser.error('-x and --ff are mutually exclusive options')
+
   def Execute(self, opt, args):
     for project, change_id, ps_id in self._ParseChangeIds(args):
       dl = project.DownloadPatchSet(change_id, ps_id)
@@ -93,22 +120,41 @@
         continue
 
       if len(dl.commits) > 1:
-        print('[%s] %d/%d depends on %d unmerged changes:' \
+        print('[%s] %d/%d depends on %d unmerged changes:'
               % (project.name, change_id, ps_id, len(dl.commits)),
               file=sys.stderr)
         for c in dl.commits:
           print('  %s' % (c), file=sys.stderr)
-      if opt.cherrypick:
-        try:
-          project._CherryPick(dl.commit)
-        except GitError:
-          print('[%s] Could not complete the cherry-pick of %s' \
-                % (project.name, dl.commit), file=sys.stderr)
-          sys.exit(1)
 
+      if opt.cherrypick:
+        mode = 'cherry-pick'
       elif opt.revert:
-        project._Revert(dl.commit)
+        mode = 'revert'
       elif opt.ffonly:
-        project._FastForward(dl.commit, ffonly=True)
+        mode = 'fast-forward merge'
       else:
-        project._Checkout(dl.commit)
+        mode = 'checkout'
+
+      # We'll combine the branch+checkout operation, but all the rest need a
+      # dedicated branch start.
+      if opt.branch and mode != 'checkout':
+        project.StartBranch(opt.branch)
+
+      try:
+        if opt.cherrypick:
+          project._CherryPick(dl.commit, ffonly=opt.ffonly,
+                              record_origin=opt.record_origin)
+        elif opt.revert:
+          project._Revert(dl.commit)
+        elif opt.ffonly:
+          project._FastForward(dl.commit, ffonly=True)
+        else:
+          if opt.branch:
+            project.StartBranch(opt.branch, revision=dl.commit)
+          else:
+            project._Checkout(dl.commit)
+
+      except GitError:
+        print('[%s] Could not complete the %s of %s'
+              % (project.name, mode, dl.commit), file=sys.stderr)
+        sys.exit(1)
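The new `ValidateOptions` hook above rejects `-x` without `--cherry-pick` and the `-x`/fast-forward combination. The same optparse checks in isolation (the parser wiring is a trimmed sketch, and the `--ff` flag spelling here is an assumption; the real `_Options` defines more flags):

```python
import optparse

def build_parser():
    p = optparse.OptionParser()
    p.add_option('-c', '--cherry-pick', dest='cherrypick', action='store_true')
    p.add_option('-x', '--record-origin', action='store_true')
    p.add_option('--ff', dest='ffonly', action='store_true')
    return p

def validate(parser, opts):
    # Mirrors Download.ValidateOptions from the hunk above.
    if opts.record_origin:
        if not opts.cherrypick:
            parser.error('-x only makes sense with --cherry-pick')
        if opts.ffonly:
            parser.error('-x and --ff are mutually exclusive options')

parser = build_parser()
opts, _ = parser.parse_args(['--cherry-pick', '--record-origin'])
validate(parser, opts)  # passes: -x together with --cherry-pick
```

`parser.error()` prints the message and exits with status 2, which is why the checks live in a validation hook rather than in `Execute`.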
diff --git a/subcmds/forall.py b/subcmds/forall.py
index 131ba67..7c1dea9 100644
--- a/subcmds/forall.py
+++ b/subcmds/forall.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,8 +12,9 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import errno
+import functools
+import io
 import multiprocessing
 import re
 import os
@@ -24,14 +23,14 @@
 import subprocess
 
 from color import Coloring
-from command import Command, MirrorSafeCommand
-import platform_utils
+from command import DEFAULT_LOCAL_JOBS, Command, MirrorSafeCommand, WORKER_BATCH_SIZE
+from error import ManifestInvalidRevisionError
 
 _CAN_COLOR = [
-  'branch',
-  'diff',
-  'grep',
-  'log',
+    'branch',
+    'diff',
+    'grep',
+    'log',
 ]
 
 
@@ -42,11 +41,11 @@
 
 
 class Forall(Command, MirrorSafeCommand):
-  common = False
+  COMMON = False
   helpSummary = "Run a shell command in each project"
   helpUsage = """
 %prog [<project>...] -c <command> [<arg>...]
-%prog -r str1 [str2] ... -c <command> [<arg>...]"
+%prog -r str1 [str2] ... -c <command> [<arg>...]
 """
   helpDescription = """
 Executes the same shell command in each project.
@@ -54,6 +53,11 @@
 The -r option allows running the command only on projects matching
 regex or wildcard expression.
 
+By default, projects are processed non-interactively in parallel.  If you want
+to run interactive commands, make sure to pass --interactive to force --jobs 1.
+While the processing order of projects is not guaranteed, the order of project
+output is stable.
+
 # Output Formatting
 
 The -p option causes '%prog' to bind pipes to the command's stdin,
@@ -116,70 +120,48 @@
 If -e is used, when a command exits unsuccessfully, '%prog' will abort
 without iterating through the remaining projects.
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
+
+  @staticmethod
+  def _cmd_option(option, _opt_str, _value, parser):
+    setattr(parser.values, option.dest, list(parser.rargs))
+    while parser.rargs:
+      del parser.rargs[0]
 
   def _Options(self, p):
-    def cmd(option, opt_str, value, parser):
-      setattr(parser.values, option.dest, list(parser.rargs))
-      while parser.rargs:
-        del parser.rargs[0]
     p.add_option('-r', '--regex',
                  dest='regex', action='store_true',
-                 help="Execute the command only on projects matching regex or wildcard expression")
+                 help='execute the command only on projects matching regex or wildcard expression')
     p.add_option('-i', '--inverse-regex',
                  dest='inverse_regex', action='store_true',
-                 help="Execute the command only on projects not matching regex or wildcard expression")
+                 help='execute the command only on projects not matching regex or '
+                      'wildcard expression')
     p.add_option('-g', '--groups',
                  dest='groups',
-                 help="Execute the command only on projects matching the specified groups")
+                 help='execute the command only on projects matching the specified groups')
     p.add_option('-c', '--command',
-                 help='Command (and arguments) to execute',
+                 help='command (and arguments) to execute',
                  dest='command',
                  action='callback',
-                 callback=cmd)
+                 callback=self._cmd_option)
     p.add_option('-e', '--abort-on-errors',
                  dest='abort_on_errors', action='store_true',
-                 help='Abort if a command exits unsuccessfully')
+                 help='abort if a command exits unsuccessfully')
     p.add_option('--ignore-missing', action='store_true',
-                 help='Silently skip & do not exit non-zero due missing '
+                 help='silently skip & do not exit non-zero due to missing '
                       'checkouts')
 
-    g = p.add_option_group('Output')
+    g = p.get_option_group('--quiet')
     g.add_option('-p',
                  dest='project_header', action='store_true',
-                 help='Show project headers before output')
-    g.add_option('-v', '--verbose',
-                 dest='verbose', action='store_true',
-                 help='Show command error messages')
-    g.add_option('-j', '--jobs',
-                 dest='jobs', action='store', type='int', default=1,
-                 help='number of commands to execute simultaneously')
+                 help='show project headers before output')
+    p.add_option('--interactive',
+                 action='store_true',
+                 help='force interactive usage')
 
   def WantPager(self, opt):
     return opt.project_header and opt.jobs == 1
 
-  def _SerializeProject(self, project):
-    """ Serialize a project._GitGetByExec instance.
-
-    project._GitGetByExec is not pickle-able. Instead of trying to pass it
-    around between processes, make a dict ourselves containing only the
-    attributes that we need.
-
-    """
-    if not self.manifest.IsMirror:
-      lrev = project.GetRevisionId()
-    else:
-      lrev = None
-    return {
-      'name': project.name,
-      'relpath': project.relpath,
-      'remote_name': project.remote.name,
-      'lrev': lrev,
-      'rrev': project.revisionExpr,
-      'annotations': dict((a.name, a.value) for a in project.annotations),
-      'gitdir': project.gitdir,
-      'worktree': project.worktree,
-    }
-
   def ValidateOptions(self, opt, args):
     if not opt.command:
       self.Usage()
@@ -195,9 +177,14 @@
       cmd.append(cmd[0])
     cmd.extend(opt.command[1:])
 
-    if  opt.project_header \
-    and not shell \
-    and cmd[0] == 'git':
+    # Historically, forall operated interactively and in serial.  If the user
+    # has selected 1 job, then default to interactive mode.
+    if opt.jobs == 1:
+      opt.interactive = True
+
+    if opt.project_header \
+            and not shell \
+            and cmd[0] == 'git':
       # If this is a direct git command that can enable colorized
       # output and the user prefers coloring, add --color into the
       # command line because we are going to wrap the command into
@@ -220,7 +207,7 @@
 
     smart_sync_manifest_name = "smart_sync_override.xml"
     smart_sync_manifest_path = os.path.join(
-      self.manifest.manifestProject.worktree, smart_sync_manifest_name)
+        self.manifest.manifestProject.worktree, smart_sync_manifest_name)
 
     if os.path.isfile(smart_sync_manifest_path):
       self.manifest.Override(smart_sync_manifest_path)
@@ -234,58 +221,50 @@
 
     os.environ['REPO_COUNT'] = str(len(projects))
 
-    pool = multiprocessing.Pool(opt.jobs, InitWorker)
     try:
       config = self.manifest.manifestProject.config
-      results_it = pool.imap(
-         DoWorkWrapper,
-         self.ProjectArgs(projects, mirror, opt, cmd, shell, config))
-      pool.close()
-      for r in results_it:
-        rc = rc or r
-        if r != 0 and opt.abort_on_errors:
-          raise Exception('Aborting due to previous error')
+      with multiprocessing.Pool(opt.jobs, InitWorker) as pool:
+        results_it = pool.imap(
+            functools.partial(DoWorkWrapper, mirror, opt, cmd, shell, config),
+            enumerate(projects),
+            chunksize=WORKER_BATCH_SIZE)
+        first = True
+        for (r, output) in results_it:
+          if output:
+            if first:
+              first = False
+            elif opt.project_header:
+              print()
+            # To simplify the DoWorkWrapper, take care of automatic newlines.
+            end = '\n'
+            if output[-1] == '\n':
+              end = ''
+            print(output, end=end)
+          rc = rc or r
+          if r != 0 and opt.abort_on_errors:
+            raise Exception('Aborting due to previous error')
     except (KeyboardInterrupt, WorkerKeyboardInterrupt):
       # Catch KeyboardInterrupt raised inside and outside of workers
-      print('Interrupted - terminating the pool')
-      pool.terminate()
       rc = rc or errno.EINTR
     except Exception as e:
       # Catch any other exceptions raised
-      print('Got an error, terminating the pool: %s: %s' %
-              (type(e).__name__, e),
+      print('forall: unhandled error, terminating the pool: %s: %s' %
+            (type(e).__name__, e),
             file=sys.stderr)
-      pool.terminate()
       rc = rc or getattr(e, 'errno', 1)
-    finally:
-      pool.join()
     if rc != 0:
       sys.exit(rc)
 
-  def ProjectArgs(self, projects, mirror, opt, cmd, shell, config):
-    for cnt, p in enumerate(projects):
-      try:
-        project = self._SerializeProject(p)
-      except Exception as e:
-        print('Project list error on project %s: %s: %s' %
-                (p.name, type(e).__name__, e),
-              file=sys.stderr)
-        return
-      except KeyboardInterrupt:
-        print('Project list interrupted',
-              file=sys.stderr)
-        return
-      yield [mirror, opt, cmd, shell, cnt, config, project]
 
 class WorkerKeyboardInterrupt(Exception):
   """ Keyboard interrupt exception for worker processes. """
-  pass
 
 
 def InitWorker():
   signal.signal(signal.SIGINT, signal.SIG_IGN)
 
-def DoWorkWrapper(args):
+
+def DoWorkWrapper(mirror, opt, cmd, shell, config, args):
   """ A wrapper around the DoWork() method.
 
   Catch the KeyboardInterrupt exceptions here and re-raise them as a different,
@@ -293,109 +272,81 @@
   and making the parent hang indefinitely.
 
   """
-  project = args.pop()
+  cnt, project = args
   try:
-    return DoWork(project, *args)
+    return DoWork(project, mirror, opt, cmd, shell, cnt, config)
   except KeyboardInterrupt:
-    print('%s: Worker interrupted' % project['name'])
+    print('%s: Worker interrupted' % project.name)
     raise WorkerKeyboardInterrupt()
 
 
 def DoWork(project, mirror, opt, cmd, shell, cnt, config):
   env = os.environ.copy()
+
   def setenv(name, val):
     if val is None:
       val = ''
-    if hasattr(val, 'encode'):
-      val = val.encode()
     env[name] = val
 
-  setenv('REPO_PROJECT', project['name'])
-  setenv('REPO_PATH', project['relpath'])
-  setenv('REPO_REMOTE', project['remote_name'])
-  setenv('REPO_LREV', project['lrev'])
-  setenv('REPO_RREV', project['rrev'])
+  setenv('REPO_PROJECT', project.name)
+  setenv('REPO_PATH', project.relpath)
+  setenv('REPO_REMOTE', project.remote.name)
+  try:
+    # If we aren't in a fully synced state and we don't have the ref the manifest
+    # wants, then this will fail.  Ignore it for the purposes of this code.
+    lrev = '' if mirror else project.GetRevisionId()
+  except ManifestInvalidRevisionError:
+    lrev = ''
+  setenv('REPO_LREV', lrev)
+  setenv('REPO_RREV', project.revisionExpr)
+  setenv('REPO_UPSTREAM', project.upstream)
+  setenv('REPO_DEST_BRANCH', project.dest_branch)
   setenv('REPO_I', str(cnt + 1))
-  for name in project['annotations']:
-    setenv("REPO__%s" % (name), project['annotations'][name])
+  for annotation in project.annotations:
+    setenv("REPO__%s" % (annotation.name), annotation.value)
 
   if mirror:
-    setenv('GIT_DIR', project['gitdir'])
-    cwd = project['gitdir']
+    setenv('GIT_DIR', project.gitdir)
+    cwd = project.gitdir
   else:
-    cwd = project['worktree']
+    cwd = project.worktree
 
   if not os.path.exists(cwd):
     # Allow the user to silently ignore missing checkouts so they can run on
     # partial checkouts (good for infra recovery tools).
     if opt.ignore_missing:
-      return 0
+      return (0, '')
+
+    output = ''
     if ((opt.project_header and opt.verbose)
-        or not opt.project_header):
-      print('skipping %s/' % project['relpath'], file=sys.stderr)
-    return 1
+            or not opt.project_header):
+      output = 'skipping %s/' % project.relpath
+    return (1, output)
 
-  if opt.project_header:
-    stdin = subprocess.PIPE
-    stdout = subprocess.PIPE
-    stderr = subprocess.PIPE
+  if opt.verbose:
+    stderr = subprocess.STDOUT
   else:
-    stdin = None
-    stdout = None
-    stderr = None
+    stderr = subprocess.DEVNULL
 
-  p = subprocess.Popen(cmd,
-                       cwd=cwd,
-                       shell=shell,
-                       env=env,
-                       stdin=stdin,
-                       stdout=stdout,
-                       stderr=stderr)
+  stdin = None if opt.interactive else subprocess.DEVNULL
 
+  result = subprocess.run(
+      cmd, cwd=cwd, shell=shell, env=env, check=False,
+      encoding='utf-8', errors='replace',
+      stdin=stdin, stdout=subprocess.PIPE, stderr=stderr)
+
+  output = result.stdout
   if opt.project_header:
-    out = ForallColoring(config)
-    out.redirect(sys.stdout)
-    empty = True
-    errbuf = ''
-
-    p.stdin.close()
-    s_in = platform_utils.FileDescriptorStreams.create()
-    s_in.add(p.stdout, sys.stdout, 'stdout')
-    s_in.add(p.stderr, sys.stderr, 'stderr')
-
-    while not s_in.is_done:
-      in_ready = s_in.select()
-      for s in in_ready:
-        buf = s.read().decode()
-        if not buf:
-          s.close()
-          s_in.remove(s)
-          continue
-
-        if not opt.verbose:
-          if s.std_name == 'stderr':
-            errbuf += buf
-            continue
-
-        if empty and out:
-          if not cnt == 0:
-            out.nl()
-
-          if mirror:
-            project_header_path = project['name']
-          else:
-            project_header_path = project['relpath']
-          out.project('project %s/', project_header_path)
-          out.nl()
-          out.flush()
-          if errbuf:
-            sys.stderr.write(errbuf)
-            sys.stderr.flush()
-            errbuf = ''
-          empty = False
-
-        s.dest.write(buf)
-        s.dest.flush()
-
-  r = p.wait()
-  return r
+    if output:
+      buf = io.StringIO()
+      out = ForallColoring(config)
+      out.redirect(buf)
+      if mirror:
+        project_header_path = project.name
+      else:
+        project_header_path = project.relpath
+      out.project('project %s/' % project_header_path)
+      out.nl()
+      buf.write(output)
+      output = buf.getvalue()
+  return (result.returncode, output)
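The forall rewrite above replaces the hand-rolled `Popen` plus stream-select loop with a single `subprocess.run` call that captures stdout as text and routes stderr by verbosity. The core of that call as a standalone sketch (the env variable name is illustrative):

```python
import os
import subprocess
import sys

def run_in_project(cmd, cwd, env_extra, interactive=False, verbose=False):
    """Run cmd in cwd and return (returncode, captured stdout text)."""
    env = os.environ.copy()
    env.update(env_extra)
    result = subprocess.run(
        cmd, cwd=cwd, env=env, check=False,
        encoding='utf-8', errors='replace',
        # Detach stdin unless the user forced interactive mode.
        stdin=None if interactive else subprocess.DEVNULL,
        stdout=subprocess.PIPE,
        # Fold stderr into stdout when verbose, otherwise discard it.
        stderr=subprocess.STDOUT if verbose else subprocess.DEVNULL)
    return (result.returncode, result.stdout)

rc, out = run_in_project(
    [sys.executable, '-c', 'import os; print(os.environ["REPO_PROJECT"])'],
    '.', {'REPO_PROJECT': 'demo'})
```

Capturing the whole output as one string is what lets the parent process print each project's output atomically, preserving stable ordering across parallel workers.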
diff --git a/subcmds/gitc_delete.py b/subcmds/gitc_delete.py
index e5214b8..df74946 100644
--- a/subcmds/gitc_delete.py
+++ b/subcmds/gitc_delete.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2015 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,18 +12,14 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import sys
 
 from command import Command, GitcClientCommand
 import platform_utils
 
-from pyversion import is_python3
-if not is_python3():
-  input = raw_input
 
 class GitcDelete(Command, GitcClientCommand):
-  common = True
+  COMMON = True
   visible_everywhere = False
   helpSummary = "Delete a GITC Client."
   helpUsage = """
@@ -39,7 +33,7 @@
   def _Options(self, p):
     p.add_option('-f', '--force',
                  dest='force', action='store_true',
-                 help='Force the deletion (no prompt).')
+                 help='force the deletion (no prompt)')
 
   def Execute(self, opt, args):
     if not opt.force:
diff --git a/subcmds/gitc_init.py b/subcmds/gitc_init.py
index 378f923..e705b61 100644
--- a/subcmds/gitc_init.py
+++ b/subcmds/gitc_init.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2015 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import os
 import sys
 
@@ -26,7 +23,7 @@
 
 
 class GitcInit(init.Init, GitcAvailableCommand):
-  common = True
+  COMMON = True
   helpSummary = "Initialize a GITC Client."
   helpUsage = """
 %prog [options] [client name]
@@ -50,23 +47,17 @@
 """
 
   def _Options(self, p):
-    super(GitcInit, self)._Options(p, gitc_init=True)
-    g = p.add_option_group('GITC options')
-    g.add_option('-f', '--manifest-file',
-                 dest='manifest_file',
-                 help='Optional manifest file to use for this GITC client.')
-    g.add_option('-c', '--gitc-client',
-                 dest='gitc_client',
-                 help='The name of the gitc_client instance to create or modify.')
+    super()._Options(p, gitc_init=True)
 
   def Execute(self, opt, args):
     gitc_client = gitc_utils.parse_clientdir(os.getcwd())
     if not gitc_client or (opt.gitc_client and gitc_client != opt.gitc_client):
-      print('fatal: Please update your repo command. See go/gitc for instructions.', file=sys.stderr)
+      print('fatal: Please update your repo command. See go/gitc for instructions.',
+            file=sys.stderr)
       sys.exit(1)
     self.client_dir = os.path.join(gitc_utils.get_gitc_manifest_dir(),
                                    gitc_client)
-    super(GitcInit, self).Execute(opt, args)
+    super().Execute(opt, args)
 
     manifest_file = self.manifest.manifestFile
     if opt.manifest_file:
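The gitc_init hunk above swaps `super(GitcInit, self)` for the zero-argument `super()` form, which under Python 3 binds the enclosing class and instance automatically. A minimal illustration with stand-in classes:

```python
class Init:
    def _Options(self, p, gitc_init=False):
        # Base options; records whether the gitc variant asked for extras.
        return ['base', 'gitc_init=%s' % gitc_init]

class GitcInit(Init):
    def _Options(self, p):
        # Equivalent to super(GitcInit, self)._Options(p, gitc_init=True),
        # minus the repetition of the class name.
        return super()._Options(p, gitc_init=True)

opts = GitcInit()._Options(None)
```

Dropping the explicit arguments also means the call keeps working if the class is renamed, which is why the cleanup appears throughout this series.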
diff --git a/subcmds/grep.py b/subcmds/grep.py
index 4dd85d5..8ac4ba1 100644
--- a/subcmds/grep.py
+++ b/subcmds/grep.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,14 +12,14 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
-
+import functools
 import sys
 
 from color import Coloring
-from command import PagedCommand
+from command import DEFAULT_LOCAL_JOBS, PagedCommand
 from error import GitError
-from git_command import git_require, GitCommand
+from git_command import GitCommand
+
 
 class GrepColoring(Coloring):
   def __init__(self, config):
@@ -29,8 +27,9 @@
     self.project = self.printer('project', attr='bold')
     self.fail = self.printer('fail', fg='red')
 
+
 class Grep(PagedCommand):
-  common = True
+  COMMON = True
   helpSummary = "Print lines matching a pattern"
   helpUsage = """
 %prog {pattern | -e pattern} [<project>...]
@@ -63,30 +62,33 @@
   repo grep --all-match -e NODE -e Unexpected
 
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
+
+  @staticmethod
+  def _carry_option(_option, opt_str, value, parser):
+    pt = getattr(parser.values, 'cmd_argv', None)
+    if pt is None:
+      pt = []
+      setattr(parser.values, 'cmd_argv', pt)
+
+    if opt_str == '-(':
+      pt.append('(')
+    elif opt_str == '-)':
+      pt.append(')')
+    else:
+      pt.append(opt_str)
+
+    if value is not None:
+      pt.append(value)
+
+  def _CommonOptions(self, p):
+    """Override common options slightly."""
+    super()._CommonOptions(p, opt_v=False)
 
   def _Options(self, p):
-    def carry(option,
-              opt_str,
-              value,
-              parser):
-      pt = getattr(parser.values, 'cmd_argv', None)
-      if pt is None:
-        pt = []
-        setattr(parser.values, 'cmd_argv', pt)
-
-      if opt_str == '-(':
-        pt.append('(')
-      elif opt_str == '-)':
-        pt.append(')')
-      else:
-        pt.append(opt_str)
-
-      if value is not None:
-        pt.append(value)
-
     g = p.add_option_group('Sources')
     g.add_option('--cached',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Search the index, instead of the work tree')
     g.add_option('-r', '--revision',
                  dest='revision', action='append', metavar='TREEish',
@@ -94,74 +96,139 @@
 
     g = p.add_option_group('Pattern')
     g.add_option('-e',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  metavar='PATTERN', type='str',
                  help='Pattern to search for')
     g.add_option('-i', '--ignore-case',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Ignore case differences')
     g.add_option('-a', '--text',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help="Process binary files as if they were text")
     g.add_option('-I',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help="Don't match the pattern in binary files")
     g.add_option('-w', '--word-regexp',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Match the pattern only at word boundaries')
     g.add_option('-v', '--invert-match',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Select non-matching lines')
     g.add_option('-G', '--basic-regexp',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Use POSIX basic regexp for patterns (default)')
     g.add_option('-E', '--extended-regexp',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Use POSIX extended regexp for patterns')
     g.add_option('-F', '--fixed-strings',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Use fixed strings (not regexp) for pattern')
 
     g = p.add_option_group('Pattern Grouping')
     g.add_option('--all-match',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Limit match to lines that have all patterns')
     g.add_option('--and', '--or', '--not',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Boolean operators to combine patterns')
     g.add_option('-(', '-)',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Boolean operator grouping')
 
     g = p.add_option_group('Output')
     g.add_option('-n',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Prefix the line number to matching lines')
     g.add_option('-C',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  metavar='CONTEXT', type='str',
                  help='Show CONTEXT lines around match')
     g.add_option('-B',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  metavar='CONTEXT', type='str',
                  help='Show CONTEXT lines before match')
     g.add_option('-A',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  metavar='CONTEXT', type='str',
                  help='Show CONTEXT lines after match')
     g.add_option('-l', '--name-only', '--files-with-matches',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Show only file names containing matching lines')
     g.add_option('-L', '--files-without-match',
-                 action='callback', callback=carry,
+                 action='callback', callback=self._carry_option,
                  help='Show only file names not containing matching lines')
 
+  def _ExecuteOne(self, cmd_argv, project):
+    """Process one project."""
+    try:
+      p = GitCommand(project,
+                     cmd_argv,
+                     bare=False,
+                     capture_stdout=True,
+                     capture_stderr=True)
+    except GitError as e:
+      return (project, -1, None, str(e))
+
+    return (project, p.Wait(), p.stdout, p.stderr)
+
+  @staticmethod
+  def _ProcessResults(full_name, have_rev, _pool, out, results):
+    git_failed = False
+    bad_rev = False
+    have_match = False
+
+    for project, rc, stdout, stderr in results:
+      if rc < 0:
+        git_failed = True
+        out.project('--- project %s ---' % project.relpath)
+        out.nl()
+        out.fail('%s', stderr)
+        out.nl()
+        continue
+
+      if rc:
+        # no results
+        if stderr:
+          if have_rev and 'fatal: ambiguous argument' in stderr:
+            bad_rev = True
+          else:
+            out.project('--- project %s ---' % project.relpath)
+            out.nl()
+            out.fail('%s', stderr.strip())
+            out.nl()
+        continue
+      have_match = True
+
+      # We cut the last element, to avoid a blank line.
+      r = stdout.split('\n')
+      r = r[0:-1]
+
+      if have_rev and full_name:
+        for line in r:
+          rev, line = line.split(':', 1)
+          out.write("%s", rev)
+          out.write(':')
+          out.project(project.relpath)
+          out.write('/')
+          out.write("%s", line)
+          out.nl()
+      elif full_name:
+        for line in r:
+          out.project(project.relpath)
+          out.write('/')
+          out.write("%s", line)
+          out.nl()
+      else:
+        for line in r:
+          print(line)
+
+    return (git_failed, bad_rev, have_match)
 
   def Execute(self, opt, args):
     out = GrepColoring(self.manifest.manifestProject.config)
 
     cmd_argv = ['grep']
-    if out.is_on and git_require((1, 6, 3)):
+    if out.is_on:
       cmd_argv.append('--color')
     cmd_argv.extend(getattr(opt, 'cmd_argv', []))
 
@@ -188,62 +255,13 @@
       cmd_argv.extend(opt.revision)
     cmd_argv.append('--')
 
-    git_failed = False
-    bad_rev = False
-    have_match = False
-
-    for project in projects:
-      try:
-        p = GitCommand(project,
-                       cmd_argv,
-                       bare=False,
-                       capture_stdout=True,
-                       capture_stderr=True)
-      except GitError as e:
-        git_failed = True
-        out.project('--- project %s ---' % project.relpath)
-        out.nl()
-        out.fail('%s', str(e))
-        out.nl()
-        continue
-
-      if p.Wait() != 0:
-        # no results
-        #
-        if p.stderr:
-          if have_rev and 'fatal: ambiguous argument' in p.stderr:
-            bad_rev = True
-          else:
-            out.project('--- project %s ---' % project.relpath)
-            out.nl()
-            out.fail('%s', p.stderr.strip())
-            out.nl()
-        continue
-      have_match = True
-
-      # We cut the last element, to avoid a blank line.
-      #
-      r = p.stdout.split('\n')
-      r = r[0:-1]
-
-      if have_rev and full_name:
-        for line in r:
-          rev, line = line.split(':', 1)
-          out.write("%s", rev)
-          out.write(':')
-          out.project(project.relpath)
-          out.write('/')
-          out.write("%s", line)
-          out.nl()
-      elif full_name:
-        for line in r:
-          out.project(project.relpath)
-          out.write('/')
-          out.write("%s", line)
-          out.nl()
-      else:
-        for line in r:
-          print(line)
+    git_failed, bad_rev, have_match = self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, cmd_argv),
+        projects,
+        callback=functools.partial(self._ProcessResults, full_name, have_rev),
+        output=out,
+        ordered=True)
 
     if git_failed:
       sys.exit(1)
diff --git a/subcmds/help.py b/subcmds/help.py
index 7893050..1a60ef4 100644
--- a/subcmds/help.py
+++ b/subcmds/help.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,17 +12,19 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import re
 import sys
-from formatter import AbstractFormatter, DumbWriter
+import textwrap
 
+from subcmds import all_commands
 from color import Coloring
 from command import PagedCommand, MirrorSafeCommand, GitcAvailableCommand, GitcClientCommand
 import gitc_utils
+from wrapper import Wrapper
+
 
 class Help(PagedCommand, MirrorSafeCommand):
-  common = False
+  COMMON = False
   helpSummary = "Display detailed help on a command"
   helpUsage = """
 %prog [--all|command]
@@ -41,7 +41,7 @@
     fmt = '  %%-%ds  %%s' % maxlen
 
     for name in commandNames:
-      command = self.commands[name]
+      command = all_commands[name]()
       try:
         summary = command.helpSummary.strip()
       except AttributeError:
@@ -50,20 +50,27 @@
 
   def _PrintAllCommands(self):
     print('usage: repo COMMAND [ARGS]')
+    self.PrintAllCommandsBody()
+
+  def PrintAllCommandsBody(self):
     print('The complete list of recognized repo commands are:')
-    commandNames = list(sorted(self.commands))
+    commandNames = list(sorted(all_commands))
     self._PrintCommands(commandNames)
     print("See 'repo help <command>' for more information on a "
           'specific command.')
+    print('Bug reports:', Wrapper().BUG_URL)
 
   def _PrintCommonCommands(self):
     print('usage: repo COMMAND [ARGS]')
+    self.PrintCommonCommandsBody()
+
+  def PrintCommonCommandsBody(self):
     print('The most commonly used repo commands are:')
 
     def gitc_supported(cmd):
       if not isinstance(cmd, GitcAvailableCommand) and not isinstance(cmd, GitcClientCommand):
         return True
-      if self.manifest.isGitcClient:
+      if self.client.isGitcClient:
         return True
       if isinstance(cmd, GitcClientCommand):
         return False
@@ -72,21 +79,21 @@
       return False
 
     commandNames = list(sorted([name
-                    for name, command in self.commands.items()
-                    if command.common and gitc_supported(command)]))
+                                for name, command in all_commands.items()
+                                if command.COMMON and gitc_supported(command)]))
     self._PrintCommands(commandNames)
 
     print(
-"See 'repo help <command>' for more information on a specific command.\n"
-"See 'repo help --all' for a complete list of recognized commands.")
+        "See 'repo help <command>' for more information on a specific command.\n"
+        "See 'repo help --all' for a complete list of recognized commands.")
+    print('Bug reports:', Wrapper().BUG_URL)
 
   def _PrintCommandHelp(self, cmd, header_prefix=''):
     class _Out(Coloring):
       def __init__(self, gc):
         Coloring.__init__(self, gc, 'help')
         self.heading = self.printer('heading', attr='bold')
-
-        self.wrap = AbstractFormatter(DumbWriter())
+        self._first = True
 
       def _PrintSection(self, heading, bodyAttr):
         try:
@@ -96,7 +103,9 @@
         if body == '' or body is None:
           return
 
-        self.nl()
+        if not self._first:
+          self.nl()
+        self._first = False
 
         self.heading('%s%s', header_prefix, heading)
         self.nl()
@@ -106,7 +115,8 @@
         body = body.strip()
         body = body.replace('%prog', me)
 
-        asciidoc_hdr = re.compile(r'^\n?#+ (.+)$')
+        # Extract the title, but skip any trailing {#anchors}.
+        asciidoc_hdr = re.compile(r'^\n?#+ ([^{]+)(\{#.+\})?$')
         for para in body.split("\n\n"):
           if para.startswith(' '):
             self.write('%s', para)
@@ -121,19 +131,21 @@
             self.nl()
             continue
 
-          self.wrap.add_flowing_data(para)
-          self.wrap.end_paragraph(1)
-        self.wrap.end_paragraph(0)
+          lines = textwrap.wrap(para.replace('  ', ' '), width=80,
+                                break_long_words=False, break_on_hyphens=False)
+          for line in lines:
+            self.write('%s', line)
+            self.nl()
+          self.nl()
 
-    out = _Out(self.manifest.globalConfig)
+    out = _Out(self.client.globalConfig)
     out._PrintSection('Summary', 'helpSummary')
     cmd.OptionParser.print_help()
     out._PrintSection('Description', 'helpDescription')
 
   def _PrintAllCommandHelp(self):
-    for name in sorted(self.commands):
-      cmd = self.commands[name]
-      cmd.manifest = self.manifest
+    for name in sorted(all_commands):
+      cmd = all_commands[name](manifest=self.manifest)
       self._PrintCommandHelp(cmd, header_prefix='[%s] ' % (name,))
 
   def _Options(self, p):
@@ -157,12 +169,11 @@
       name = args[0]
 
       try:
-        cmd = self.commands[name]
+        cmd = all_commands[name](manifest=self.manifest)
       except KeyError:
         print("repo: '%s' is not a repo command." % name, file=sys.stderr)
         sys.exit(1)
 
-      cmd.manifest = self.manifest
       self._PrintCommandHelp(cmd)
 
     else:
diff --git a/subcmds/info.py b/subcmds/info.py
index d62e1e6..6c1246e 100644
--- a/subcmds/info.py
+++ b/subcmds/info.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2012 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,18 +12,22 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
+import optparse
+
 from command import PagedCommand
 from color import Coloring
-from git_refs import R_M
+from git_refs import R_M, R_HEADS
+
 
 class _Coloring(Coloring):
   def __init__(self, config):
     Coloring.__init__(self, config, "status")
 
+
 class Info(PagedCommand):
-  common = True
+  COMMON = True
   helpSummary = "Get info on the manifest branch, current branch or unmerged branches"
-  helpUsage = "%prog [-dl] [-o [-b]] [<project>...]"
+  helpUsage = "%prog [-dl] [-o [-c]] [<project>...]"
 
   def _Options(self, p):
     p.add_option('-d', '--diff',
@@ -34,22 +36,28 @@
     p.add_option('-o', '--overview',
                  dest='overview', action='store_true',
                  help='show overview of all local commits')
-    p.add_option('-b', '--current-branch',
+    p.add_option('-c', '--current-branch',
                  dest="current_branch", action="store_true",
                  help="consider only checked out branches")
+    p.add_option('--no-current-branch',
+                 dest='current_branch', action='store_false',
+                 help='consider all local branches')
+    # Turn this into a warning & remove this someday.
+    p.add_option('-b',
+                 dest='current_branch', action='store_true',
+                 help=optparse.SUPPRESS_HELP)
     p.add_option('-l', '--local-only',
                  dest="local", action="store_true",
-                 help="Disable all remote operations")
-
+                 help="disable all remote operations")
 
   def Execute(self, opt, args):
-    self.out = _Coloring(self.manifest.globalConfig)
-    self.heading = self.out.printer('heading', attr = 'bold')
-    self.headtext = self.out.nofmt_printer('headtext', fg = 'yellow')
-    self.redtext = self.out.printer('redtext', fg = 'red')
-    self.sha = self.out.printer("sha", fg = 'yellow')
+    self.out = _Coloring(self.client.globalConfig)
+    self.heading = self.out.printer('heading', attr='bold')
+    self.headtext = self.out.nofmt_printer('headtext', fg='yellow')
+    self.redtext = self.out.printer('redtext', fg='red')
+    self.sha = self.out.printer("sha", fg='yellow')
     self.text = self.out.nofmt_printer('text')
-    self.dimtext = self.out.printer('dimtext', attr = 'dim')
+    self.dimtext = self.out.printer('dimtext', attr='dim')
 
     self.opt = opt
 
@@ -122,11 +130,14 @@
       self.printSeparator()
 
   def findRemoteLocalDiff(self, project):
-    #Fetch all the latest commits
+    # Fetch all the latest commits.
     if not self.opt.local:
       project.Sync_NetworkHalf(quiet=True, current_branch_only=True)
 
-    logTarget = R_M + self.manifest.manifestProject.config.GetBranch("default").merge
+    branch = self.manifest.manifestProject.config.GetBranch('default').merge
+    if branch.startswith(R_HEADS):
+      branch = branch[len(R_HEADS):]
+    logTarget = R_M + branch
 
     bareTmp = project.bare_git._bare
     project.bare_git._bare = False
@@ -195,16 +206,16 @@
       commits = branch.commits
       date = branch.date
       self.text('%s %-33s (%2d commit%s, %s)' % (
-        branch.name == project.CurrentBranch and '*' or ' ',
-        branch.name,
-        len(commits),
-        len(commits) != 1 and 's' or '',
-        date))
+          branch.name == project.CurrentBranch and '*' or ' ',
+          branch.name,
+          len(commits),
+          len(commits) != 1 and 's' or '',
+          date))
       self.out.nl()
 
       for commit in commits:
         split = commit.split()
-        self.text('{0:38}{1} '.format('','-'))
+        self.text('{0:38}{1} '.format('', '-'))
         self.sha(split[0] + " ")
         self.text(" ".join(split[1:]))
         self.out.nl()
diff --git a/subcmds/init.py b/subcmds/init.py
index 6594a60..9c6b2ad 100644
--- a/subcmds/init.py
+++ b/subcmds/init.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,34 +12,30 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import os
 import platform
 import re
+import subprocess
 import sys
-
-from pyversion import is_python3
-if is_python3():
-  import urllib.parse
-else:
-  import imp
-  import urlparse
-  urllib = imp.new_module('urllib')
-  urllib.parse = urlparse
+import urllib.parse
 
 from color import Coloring
 from command import InteractiveCommand, MirrorSafeCommand
 from error import ManifestParseError
 from project import SyncBuffer
 from git_config import GitConfig
-from git_command import git_require, MIN_GIT_VERSION
+from git_command import git_require, MIN_GIT_VERSION_SOFT, MIN_GIT_VERSION_HARD
+import fetch
+import git_superproject
 import platform_utils
+from wrapper import Wrapper
+
 
 class Init(InteractiveCommand, MirrorSafeCommand):
-  common = True
-  helpSummary = "Initialize repo in the current directory"
+  COMMON = True
+  helpSummary = "Initialize a repo client checkout in the current directory"
   helpUsage = """
-%prog [options]
+%prog [options] [manifest url]
 """
   helpDescription = """
 The '%prog' command is run once to install and initialize repo.
@@ -49,13 +43,24 @@
 from the server and is installed in the .repo/ directory in the
 current working directory.
 
+When creating a new checkout, the manifest URL is the only required setting.
+It may be specified using the --manifest-url option, or as the first optional
+argument.
+
 The optional -b argument can be used to select the manifest branch
-to checkout and use.  If no branch is specified, master is assumed.
+to checkout and use.  If no branch is specified, the remote's default
+branch is used.  This is equivalent to using -b HEAD.
 
 The optional -m argument can be used to specify an alternate manifest
 to be used. If no manifest is specified, the manifest default.xml
 will be used.
 
+If the --standalone-manifest argument is set, the manifest will be downloaded
+directly from the specified --manifest-url as a static file (rather than
+setting up a manifest git checkout). With --standalone-manifest, the manifest
+will be fully static and will not be re-downloaded during subsequent
+`repo init` and `repo sync` calls.
+
 The --reference option can be used to point to a directory that
 has the content of a --mirror sync. This will make the working
 directory use as much data as possible from the local reference
@@ -81,109 +86,64 @@
 to update the working directory files.
 """
 
+  def _CommonOptions(self, p):
+    """Disable due to re-use of Wrapper()."""
+
   def _Options(self, p, gitc_init=False):
-    # Logging
-    g = p.add_option_group('Logging options')
-    g.add_option('-q', '--quiet',
-                 dest="quiet", action="store_true", default=False,
-                 help="be quiet")
-
-    # Manifest
-    g = p.add_option_group('Manifest options')
-    g.add_option('-u', '--manifest-url',
-                 dest='manifest_url',
-                 help='manifest repository location', metavar='URL')
-    g.add_option('-b', '--manifest-branch',
-                 dest='manifest_branch',
-                 help='manifest branch or revision', metavar='REVISION')
-    cbr_opts = ['--current-branch']
-    # The gitc-init subcommand allocates -c itself, but a lot of init users
-    # want -c, so try to satisfy both as best we can.
-    if not gitc_init:
-      cbr_opts += ['-c']
-    g.add_option(*cbr_opts,
-                 dest='current_branch_only', action='store_true',
-                 help='fetch only current manifest branch from server')
-    g.add_option('-m', '--manifest-name',
-                 dest='manifest_name', default='default.xml',
-                 help='initial manifest file', metavar='NAME.xml')
-    g.add_option('--mirror',
-                 dest='mirror', action='store_true',
-                 help='create a replica of the remote repositories '
-                      'rather than a client working directory')
-    g.add_option('--reference',
-                 dest='reference',
-                 help='location of mirror directory', metavar='DIR')
-    g.add_option('--dissociate',
-                 dest='dissociate', action='store_true',
-                 help='dissociate from reference mirrors after clone')
-    g.add_option('--depth', type='int', default=None,
-                 dest='depth',
-                 help='create a shallow clone with given depth; see git clone')
-    g.add_option('--partial-clone', action='store_true',
-                 dest='partial_clone',
-                 help='perform partial clone (https://git-scm.com/'
-                 'docs/gitrepository-layout#_code_partialclone_code)')
-    g.add_option('--clone-filter', action='store', default='blob:none',
-                 dest='clone_filter',
-                 help='filter for use with --partial-clone [default: %default]')
-    g.add_option('--archive',
-                 dest='archive', action='store_true',
-                 help='checkout an archive instead of a git repository for '
-                      'each project. See git archive.')
-    g.add_option('--submodules',
-                 dest='submodules', action='store_true',
-                 help='sync any submodules associated with the manifest repo')
-    g.add_option('-g', '--groups',
-                 dest='groups', default='default',
-                 help='restrict manifest projects to ones with specified '
-                      'group(s) [default|all|G1,G2,G3|G4,-G5,-G6]',
-                 metavar='GROUP')
-    g.add_option('-p', '--platform',
-                 dest='platform', default='auto',
-                 help='restrict manifest projects to ones with a specified '
-                      'platform group [auto|all|none|linux|darwin|...]',
-                 metavar='PLATFORM')
-    g.add_option('--no-clone-bundle',
-                 dest='no_clone_bundle', action='store_true',
-                 help='disable use of /clone.bundle on HTTP/HTTPS')
-    g.add_option('--no-tags',
-                 dest='no_tags', action='store_true',
-                 help="don't fetch tags in the manifest")
-
-    # Tool
-    g = p.add_option_group('repo Version options')
-    g.add_option('--repo-url',
-                 dest='repo_url',
-                 help='repo repository location', metavar='URL')
-    g.add_option('--repo-branch',
-                 dest='repo_branch',
-                 help='repo branch or revision', metavar='REVISION')
-    g.add_option('--no-repo-verify',
-                 dest='no_repo_verify', action='store_true',
-                 help='do not verify repo source code')
-
-    # Other
-    g = p.add_option_group('Other options')
-    g.add_option('--config-name',
-                 dest='config_name', action="store_true", default=False,
-                 help='Always prompt for name/e-mail')
+    Wrapper().InitParser(p, gitc_init=gitc_init)
 
   def _RegisteredEnvironmentOptions(self):
     return {'REPO_MANIFEST_URL': 'manifest_url',
             'REPO_MIRROR_LOCATION': 'reference'}
 
+  def _CloneSuperproject(self, opt):
+    """Clone the superproject based on the superproject's url and branch.
+
+    Args:
+      opt: Program options returned from optparse.  See _Options().
+    """
+    superproject = git_superproject.Superproject(self.manifest,
+                                                 self.repodir,
+                                                 self.git_event_log,
+                                                 quiet=opt.quiet)
+    sync_result = superproject.Sync()
+    if not sync_result.success:
+      print('warning: git update of superproject failed, repo sync will not '
+            'use superproject to fetch source; while this error is not fatal, '
+            'and you can continue to run repo sync, please run repo init with '
+            'the --no-use-superproject option to stop seeing this warning',
+            file=sys.stderr)
+      if sync_result.fatal and opt.use_superproject is not None:
+        sys.exit(1)
+
   def _SyncManifest(self, opt):
     m = self.manifest.manifestProject
     is_new = not m.Exists
 
+    # If repo has already been initialized, we take -u with the absence of
+    # --standalone-manifest to mean "transition to a standard repo set up",
+    # which necessitates starting fresh.
+    # If --standalone-manifest is set, we always tear everything down and start
+    # anew.
+    if not is_new:
+      was_standalone_manifest = m.config.GetString('manifest.standalone')
+      if opt.standalone_manifest or (
+          was_standalone_manifest and opt.manifest_url):
+        m.config.ClearCache()
+        if m.gitdir and os.path.exists(m.gitdir):
+          platform_utils.rmtree(m.gitdir)
+        if m.worktree and os.path.exists(m.worktree):
+          platform_utils.rmtree(m.worktree)
+
+    is_new = not m.Exists
     if is_new:
       if not opt.manifest_url:
-        print('fatal: manifest url (-u) is required.', file=sys.stderr)
+        print('fatal: manifest url is required.', file=sys.stderr)
         sys.exit(1)
 
       if not opt.quiet:
-        print('Get %s' % GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),
+        print('Downloading manifest from %s' %
+              (GitConfig.ForUser().UrlInsteadOf(opt.manifest_url),),
               file=sys.stderr)
 
       # The manifest project object doesn't keep track of the path on the
@@ -200,30 +160,52 @@
 
       m._InitGitDir(mirror_git=mirrored_manifest_git)
 
-      if opt.manifest_branch:
-        m.revisionExpr = opt.manifest_branch
-      else:
-        m.revisionExpr = 'refs/heads/master'
-    else:
-      if opt.manifest_branch:
-        m.revisionExpr = opt.manifest_branch
-      else:
-        m.PreSync()
+    # If standalone_manifest is set, mark the project as "standalone" -- we'll
+    # still do much of the manifests.git set up, but will avoid actual syncs to
+    # a remote.
+    standalone_manifest = False
+    if opt.standalone_manifest:
+      standalone_manifest = True
+    elif not opt.manifest_url:
+      # If -u is set and --standalone-manifest is not, then we're not in
+      # standalone mode. Otherwise, use config to infer what we were in the last
+      # init.
+      standalone_manifest = bool(m.config.GetString('manifest.standalone'))
+    m.config.SetString('manifest.standalone', opt.manifest_url)
 
     self._ConfigureDepth(opt)
 
+    # Set the remote URL before the remote branch as we might need it below.
     if opt.manifest_url:
       r = m.GetRemote(m.remote.name)
       r.url = opt.manifest_url
       r.ResetFetch()
       r.Save()
 
+    if not standalone_manifest:
+      if opt.manifest_branch:
+        if opt.manifest_branch == 'HEAD':
+          opt.manifest_branch = m.ResolveRemoteHead()
+          if opt.manifest_branch is None:
+            print('fatal: unable to resolve HEAD', file=sys.stderr)
+            sys.exit(1)
+        m.revisionExpr = opt.manifest_branch
+      else:
+        if is_new:
+          default_branch = m.ResolveRemoteHead()
+          if default_branch is None:
+            # If the remote doesn't have HEAD configured, default to master.
+            default_branch = 'refs/heads/master'
+          m.revisionExpr = default_branch
+        else:
+          m.PreSync()
+
     groups = re.split(r'[,\s]+', opt.groups)
     all_platforms = ['linux', 'darwin', 'windows']
     platformize = lambda x: 'platform-' + x
     if opt.platform == 'auto':
       if (not opt.mirror and
-          not m.config.GetString('repo.mirror') == 'true'):
+              not m.config.GetString('repo.mirror') == 'true'):
         groups.append(platformize(platform.system().lower()))
     elif opt.platform == 'all':
       groups.extend(map(platformize, all_platforms))
@@ -235,7 +217,7 @@
 
     groups = [x for x in groups if x]
     groupstr = ','.join(groups)
-    if opt.platform == 'auto' and groupstr == 'default,platform-' + platform.system().lower():
+    if opt.platform == 'auto' and groupstr == self.manifest.GetDefaultGroupsStr():
       groupstr = None
     m.config.SetString('manifest.groups', groupstr)
 
@@ -243,11 +225,25 @@
       m.config.SetString('repo.reference', opt.reference)
 
     if opt.dissociate:
-      m.config.SetString('repo.dissociate', 'true')
+      m.config.SetBoolean('repo.dissociate', opt.dissociate)
+
+    if opt.worktree:
+      if opt.mirror:
+        print('fatal: --mirror and --worktree are incompatible',
+              file=sys.stderr)
+        sys.exit(1)
+      if opt.submodules:
+        print('fatal: --submodules and --worktree are incompatible',
+              file=sys.stderr)
+        sys.exit(1)
+      m.config.SetBoolean('repo.worktree', opt.worktree)
+      if is_new:
+        m.use_git_worktrees = True
+      print('warning: --worktree is experimental!', file=sys.stderr)
 
     if opt.archive:
       if is_new:
-        m.config.SetString('repo.archive', 'true')
+        m.config.SetBoolean('repo.archive', opt.archive)
       else:
         print('fatal: --archive is only supported when initializing a new '
               'workspace.', file=sys.stderr)
@@ -257,7 +253,7 @@
 
     if opt.mirror:
       if is_new:
-        m.config.SetString('repo.mirror', 'true')
+        m.config.SetBoolean('repo.mirror', opt.mirror)
       else:
         print('fatal: --mirror is only supported when initializing a new '
               'workspace.', file=sys.stderr)
@@ -265,25 +261,49 @@
               'in another location.', file=sys.stderr)
         sys.exit(1)
 
-    if opt.partial_clone:
+    if opt.partial_clone is not None:
       if opt.mirror:
         print('fatal: --mirror and --partial-clone are mutually exclusive',
               file=sys.stderr)
         sys.exit(1)
-      m.config.SetString('repo.partialclone', 'true')
+      m.config.SetBoolean('repo.partialclone', opt.partial_clone)
       if opt.clone_filter:
         m.config.SetString('repo.clonefilter', opt.clone_filter)
+    elif m.config.GetBoolean('repo.partialclone'):
+      opt.clone_filter = m.config.GetString('repo.clonefilter')
     else:
       opt.clone_filter = None
 
-    if opt.submodules:
-      m.config.SetString('repo.submodules', 'true')
+    if opt.partial_clone_exclude is not None:
+      m.config.SetString('repo.partialcloneexclude', opt.partial_clone_exclude)
 
-    if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet,
-        clone_bundle=not opt.no_clone_bundle,
-        current_branch_only=opt.current_branch_only,
-        no_tags=opt.no_tags, submodules=opt.submodules,
-        clone_filter=opt.clone_filter):
+    if opt.clone_bundle is None:
+      opt.clone_bundle = False if opt.partial_clone else True
+    else:
+      m.config.SetBoolean('repo.clonebundle', opt.clone_bundle)
+
+    if opt.submodules:
+      m.config.SetBoolean('repo.submodules', opt.submodules)
+
+    if opt.use_superproject is not None:
+      m.config.SetBoolean('repo.superproject', opt.use_superproject)
+
+    if standalone_manifest:
+      if is_new:
+        manifest_name = 'default.xml'
+        manifest_data = fetch.fetch_file(opt.manifest_url)
+        dest = os.path.join(m.worktree, manifest_name)
+        os.makedirs(os.path.dirname(dest), exist_ok=True)
+        with open(dest, 'wb') as f:
+          f.write(manifest_data)
+      return
+
+    if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet, verbose=opt.verbose,
+                              clone_bundle=opt.clone_bundle,
+                              current_branch_only=opt.current_branch_only,
+                              tags=opt.tags, submodules=opt.submodules,
+                              clone_filter=opt.clone_filter,
+                              partial_clone_exclude=self.manifest.PartialCloneExclude):
       r = m.GetRemote(m.remote.name)
       print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
 
@@ -326,8 +346,8 @@
       return value
     return a
 
-  def _ShouldConfigureUser(self):
-    gc = self.manifest.globalConfig
+  def _ShouldConfigureUser(self, opt):
+    gc = self.client.globalConfig
     mp = self.manifest.manifestProject
 
     # If we don't have local settings, get from global.
@@ -338,21 +358,24 @@
       mp.config.SetString('user.name', gc.GetString('user.name'))
       mp.config.SetString('user.email', gc.GetString('user.email'))
 
-    print()
-    print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
-                                         mp.config.GetString('user.email')))
-    print('If you want to change this, please re-run \'repo init\' with --config-name')
+    if not opt.quiet:
+      print()
+      print('Your identity is: %s <%s>' % (mp.config.GetString('user.name'),
+                                           mp.config.GetString('user.email')))
+      print("If you want to change this, please re-run 'repo init' with --config-name")
     return False
 
-  def _ConfigureUser(self):
+  def _ConfigureUser(self, opt):
     mp = self.manifest.manifestProject
 
     while True:
-      print()
-      name  = self._Prompt('Your Name', mp.UserName)
+      if not opt.quiet:
+        print()
+      name = self._Prompt('Your Name', mp.UserName)
       email = self._Prompt('Your Email', mp.UserEmail)
 
-      print()
+      if not opt.quiet:
+        print()
       print('Your identity is: %s <%s>' % (name, email))
       print('is this correct [y/N]? ', end='')
       # TODO: When we require Python 3, use flush=True w/print above.
@@ -373,7 +396,7 @@
     return False
 
   def _ConfigureColor(self):
-    gc = self.manifest.globalConfig
+    gc = self.client.globalConfig
     if self._HasColorSet(gc):
       return
 
@@ -424,15 +447,16 @@
       # We store the depth in the main manifest project.
       self.manifest.manifestProject.config.SetString('repo.depth', depth)
 
-  def _DisplayResult(self):
+  def _DisplayResult(self, opt):
     if self.manifest.IsMirror:
       init_type = 'mirror '
     else:
       init_type = ''
 
-    print()
-    print('repo %shas been initialized in %s'
-          % (init_type, self.manifest.topdir))
+    if not opt.quiet:
+      print()
+      print('repo %shas been initialized in %s' %
+            (init_type, self.manifest.topdir))
 
     current_dir = os.getcwd()
     if current_dir != self.manifest.topdir:
@@ -450,15 +474,61 @@
     if opt.archive and opt.mirror:
       self.OptionParser.error('--mirror and --archive cannot be used together.')
 
+    if opt.standalone_manifest and (
+        opt.manifest_branch or opt.manifest_name != 'default.xml'):
+      self.OptionParser.error('--manifest-branch and --manifest-name cannot'
+                              ' be used with --standalone-manifest.')
+
+    if args:
+      if opt.manifest_url:
+        self.OptionParser.error(
+            '--manifest-url option and URL argument both specified: only use '
+            'one to select the manifest URL.')
+
+      opt.manifest_url = args.pop(0)
+
+      if args:
+        self.OptionParser.error('too many arguments to init')
+
   def Execute(self, opt, args):
-    git_require(MIN_GIT_VERSION, fail=True)
+    git_require(MIN_GIT_VERSION_HARD, fail=True)
+    if not git_require(MIN_GIT_VERSION_SOFT):
+      print('repo: warning: git-%s+ will soon be required; please upgrade your '
+            'version of git to maintain support.'
+            % ('.'.join(str(x) for x in MIN_GIT_VERSION_SOFT),),
+            file=sys.stderr)
+
+    rp = self.manifest.repoProject
+
+    # Handle new --repo-url requests.
+    if opt.repo_url:
+      remote = rp.GetRemote('origin')
+      remote.url = opt.repo_url
+      remote.Save()
+
+    # Handle new --repo-rev requests.
+    if opt.repo_rev:
+      wrapper = Wrapper()
+      remote_ref, rev = wrapper.check_repo_rev(
+          rp.gitdir, opt.repo_rev, repo_verify=opt.repo_verify, quiet=opt.quiet)
+      branch = rp.GetBranch('default')
+      branch.merge = remote_ref
+      rp.work_git.reset('--hard', rev)
+      branch.Save()
+
+    if opt.worktree:
+      # Older versions of git supported worktree, but had dangerous gc bugs.
+      git_require((2, 15, 0), fail=True, msg='git gc worktree corruption')
 
     self._SyncManifest(opt)
     self._LinkManifest(opt.manifest_name)
 
+    if self.manifest.manifestProject.config.GetBoolean('repo.superproject'):
+      self._CloneSuperproject(opt)
+
     if os.isatty(0) and os.isatty(1) and not self.manifest.IsMirror:
-      if opt.config_name or self._ShouldConfigureUser():
-        self._ConfigureUser()
+      if opt.config_name or self._ShouldConfigureUser(opt):
+        self._ConfigureUser(opt)
       self._ConfigureColor()
 
-    self._DisplayResult()
+    self._DisplayResult(opt)
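The init.py change above introduces a soft/hard minimum-git-version gate: fail hard below one floor, warn (but continue) below a higher one. A minimal sketch of that logic, with illustrative version tuples rather than repo's actual minimums:

```python
# Hypothetical stand-ins for repo's MIN_GIT_VERSION_HARD / MIN_GIT_VERSION_SOFT.
MIN_GIT_VERSION_HARD = (1, 7, 2)   # refuse to run below this
MIN_GIT_VERSION_SOFT = (1, 9, 1)   # warn below this, but keep going

def check_git_version(version, hard=MIN_GIT_VERSION_HARD, soft=MIN_GIT_VERSION_SOFT):
    """Return (ok, warning) for a parsed `git --version` tuple."""
    if version < hard:
        return (False, 'git-%s+ is required' % '.'.join(map(str, hard)))
    if version < soft:
        return (True, 'git-%s+ will soon be required' % '.'.join(map(str, soft)))
    return (True, None)
```

Tuple comparison handles the version ordering directly, which is why both floors are kept as tuples rather than strings.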
diff --git a/subcmds/list.py b/subcmds/list.py
index 00172f0..6adf85b 100644
--- a/subcmds/list.py
+++ b/subcmds/list.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2011 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,45 +12,59 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
-import sys
+import os
 
 from command import Command, MirrorSafeCommand
 
+
 class List(Command, MirrorSafeCommand):
-  common = True
+  COMMON = True
   helpSummary = "List projects and their associated directories"
   helpUsage = """
 %prog [-f] [<project>...]
-%prog [-f] -r str1 [str2]..."
+%prog [-f] -r str1 [str2]...
 """
   helpDescription = """
 List all projects; pass '.' to list the project for the cwd.
 
+By default, only projects that currently exist in the checkout are shown.  If
+you want to list all projects (using the specified filter settings), use the
+--all option.  If you want to show all projects regardless of the manifest
+groups, then also pass --groups all.
+
 This is similar to running: repo forall -c 'echo "$REPO_PATH : $REPO_PROJECT"'.
 """
 
   def _Options(self, p):
     p.add_option('-r', '--regex',
                  dest='regex', action='store_true',
-                 help="Filter the project list based on regex or wildcard matching of strings")
+                 help='filter the project list based on regex or wildcard matching of strings')
     p.add_option('-g', '--groups',
                  dest='groups',
-                 help="Filter the project list based on the groups the project is in")
-    p.add_option('-f', '--fullpath',
-                 dest='fullpath', action='store_true',
-                 help="Display the full work tree path instead of the relative path")
+                 help='filter the project list based on the groups the project is in')
+    p.add_option('-a', '--all',
+                 action='store_true',
+                 help='show projects regardless of checkout state')
     p.add_option('-n', '--name-only',
                  dest='name_only', action='store_true',
-                 help="Display only the name of the repository")
+                 help='display only the name of the repository')
     p.add_option('-p', '--path-only',
                  dest='path_only', action='store_true',
-                 help="Display only the path of the repository")
+                 help='display only the path of the repository')
+    p.add_option('-f', '--fullpath',
+                 dest='fullpath', action='store_true',
+                 help='display the full work tree path instead of the relative path')
+    p.add_option('--relative-to', metavar='PATH',
+                 help='display paths relative to this one (default: top of repo client checkout)')
 
   def ValidateOptions(self, opt, args):
     if opt.fullpath and opt.name_only:
       self.OptionParser.error('cannot combine -f and -n')
 
+    # Resolve any symlinks so the output is stable.
+    if opt.relative_to:
+      opt.relative_to = os.path.realpath(opt.relative_to)
+
   def Execute(self, opt, args):
     """List all projects and the associated directories.
 
@@ -65,23 +77,26 @@
       args: Positional args.  Can be a list of projects to list, or empty.
     """
     if not opt.regex:
-      projects = self.GetProjects(args, groups=opt.groups)
+      projects = self.GetProjects(args, groups=opt.groups, missing_ok=opt.all)
     else:
       projects = self.FindProjects(args)
 
     def _getpath(x):
       if opt.fullpath:
         return x.worktree
+      if opt.relative_to:
+        return os.path.relpath(x.worktree, opt.relative_to)
       return x.relpath
 
     lines = []
     for project in projects:
       if opt.name_only and not opt.path_only:
-        lines.append("%s" % ( project.name))
+        lines.append("%s" % (project.name))
       elif opt.path_only and not opt.name_only:
         lines.append("%s" % (_getpath(project)))
       else:
         lines.append("%s : %s" % (_getpath(project), project.name))
 
-    lines.sort()
-    print('\n'.join(lines))
+    if lines:
+      lines.sort()
+      print('\n'.join(lines))
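The `_getpath` helper added to list.py picks between three path forms: the absolute worktree, a path relative to `--relative-to` (which is `realpath()`-resolved up front so output is stable), or the manifest-relative path. A standalone sketch of that selection, assuming POSIX-style paths:

```python
import os

def getpath(worktree, relpath, fullpath=False, relative_to=None):
    """Mirror of list.py's _getpath: choose which path form to display."""
    if fullpath:
        return worktree
    if relative_to:
        # Resolve symlinks in the base so the output is stable across runs.
        return os.path.relpath(worktree, os.path.realpath(relative_to))
    return relpath
```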
diff --git a/subcmds/manifest.py b/subcmds/manifest.py
index 9c1b3f0..0fbdeac 100644
--- a/subcmds/manifest.py
+++ b/subcmds/manifest.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,25 +12,32 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import json
 import os
 import sys
 
 from command import PagedCommand
 
+
 class Manifest(PagedCommand):
-  common = False
+  COMMON = False
   helpSummary = "Manifest inspection utility"
   helpUsage = """
-%prog [-o {-|NAME.xml} [-r]]
+%prog [-o {-|NAME.xml}] [-m MANIFEST.xml] [-r]
 """
   _helpDescription = """
 
 With the -o option, exports the current manifest for inspection.
-The manifest and (if present) local_manifest.xml are combined
+The manifest and (if present) local_manifests/ are combined
 together to produce a single manifest file.  This file can be stored
 in a Git repository for use during future 'repo init' invocations.
 
+The -r option can be used to generate a manifest file with project
+revisions set to the current commit hash.  These are known as
+"revision locked manifests", as they don't follow a particular branch.
+In this case, the 'upstream' attribute is set to the ref we were on
+when the manifest was generated.  The 'dest-branch' attribute is set
+to indicate the remote ref to push changes to via 'repo upload'.
 """
 
   @property
@@ -48,26 +53,63 @@
   def _Options(self, p):
     p.add_option('-r', '--revision-as-HEAD',
                  dest='peg_rev', action='store_true',
-                 help='Save revisions as current HEAD')
+                 help='save revisions as current HEAD')
+    p.add_option('-m', '--manifest-name',
+                 help='temporary manifest to use for this sync', metavar='NAME.xml')
     p.add_option('--suppress-upstream-revision', dest='peg_rev_upstream',
                  default=True, action='store_false',
-                 help='If in -r mode, do not write the upstream field.  '
-                 'Only of use if the branch names for a sha1 manifest are '
-                 'sensitive.')
+                 help='if in -r mode, do not write the upstream field '
+                 '(only of use if the branch names for a sha1 manifest are '
+                 'sensitive)')
+    p.add_option('--suppress-dest-branch', dest='peg_rev_dest_branch',
+                 default=True, action='store_false',
+                 help='if in -r mode, do not write the dest-branch field '
+                 '(only of use if the branch names for a sha1 manifest are '
+                 'sensitive)')
+    p.add_option('--json', default=False, action='store_true',
+                 help='output manifest in JSON format (experimental)')
+    p.add_option('--pretty', default=False, action='store_true',
+                 help='format output for humans to read')
+    p.add_option('--no-local-manifests', default=False, action='store_true',
+                 dest='ignore_local_manifests', help='ignore local manifests')
     p.add_option('-o', '--output-file',
                  dest='output_file',
                  default='-',
-                 help='File to save the manifest to',
+                 help='file to save the manifest to',
                  metavar='-|NAME.xml')
 
   def _Output(self, opt):
+    # If alternate manifest is specified, override the manifest file that we're using.
+    if opt.manifest_name:
+      self.manifest.Override(opt.manifest_name, False)
+
     if opt.output_file == '-':
       fd = sys.stdout
     else:
       fd = open(opt.output_file, 'w')
-    self.manifest.Save(fd,
-                       peg_rev = opt.peg_rev,
-                       peg_rev_upstream = opt.peg_rev_upstream)
+
+    self.manifest.SetUseLocalManifests(not opt.ignore_local_manifests)
+
+    if opt.json:
+      print('warning: --json is experimental!', file=sys.stderr)
+      doc = self.manifest.ToDict(peg_rev=opt.peg_rev,
+                                 peg_rev_upstream=opt.peg_rev_upstream,
+                                 peg_rev_dest_branch=opt.peg_rev_dest_branch)
+
+      json_settings = {
+          # JSON style guide says Unicode characters are fully allowed.
+          'ensure_ascii': False,
+          # We use 2 space indent to match JSON style guide.
+          'indent': 2 if opt.pretty else None,
+          'separators': (',', ': ') if opt.pretty else (',', ':'),
+          'sort_keys': True,
+      }
+      fd.write(json.dumps(doc, **json_settings))
+    else:
+      self.manifest.Save(fd,
+                         peg_rev=opt.peg_rev,
+                         peg_rev_upstream=opt.peg_rev_upstream,
+                         peg_rev_dest_branch=opt.peg_rev_dest_branch)
     fd.close()
     if opt.output_file != '-':
       print('Saved manifest to %s' % opt.output_file, file=sys.stderr)
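The `json_settings` dict in the manifest.py hunk above switches between a compact wire format and a human-readable one. A small sketch of the same knobs (the sample `doc` is illustrative, not a real manifest):

```python
import json

def dump_manifest_json(doc, pretty=False):
    """Serialize with the same settings as the --json/--pretty path above."""
    return json.dumps(
        doc,
        ensure_ascii=False,                                 # keep Unicode as-is
        indent=2 if pretty else None,                       # 2-space indent when pretty
        separators=(',', ': ') if pretty else (',', ':'),   # no padding when compact
        sort_keys=True)

doc = {'default': {'remote': 'origin'}, 'projects': [{'name': 'demo'}]}
```

Passing `separators=(',', ':')` alongside `indent=None` is what removes every space from the compact form; `sort_keys=True` keeps the output deterministic.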
diff --git a/subcmds/overview.py b/subcmds/overview.py
index 08b58a6..63f5a79 100644
--- a/subcmds/overview.py
+++ b/subcmds/overview.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2012 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,13 +12,14 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import optparse
+
 from color import Coloring
 from command import PagedCommand
 
 
 class Overview(PagedCommand):
-  common = True
+  COMMON = True
   helpSummary = "Display overview of unmerged project branches"
   helpUsage = """
 %prog [--current-branch] [<project>...]
@@ -29,15 +28,22 @@
 The '%prog' command is used to display an overview of the projects branches,
 and list any local commits that have not yet been merged into the project.
 
-The -b/--current-branch option can be used to restrict the output to only
+The -c/--current-branch option can be used to restrict the output to only
 branches currently checked out in each project.  By default, all branches
 are displayed.
 """
 
   def _Options(self, p):
-    p.add_option('-b', '--current-branch',
+    p.add_option('-c', '--current-branch',
                  dest="current_branch", action="store_true",
-                 help="Consider only checked out branches")
+                 help="consider only checked out branches")
+    p.add_option('--no-current-branch',
+                 dest='current_branch', action='store_false',
+                 help='consider all local branches')
+    # Turn this into a warning & remove this someday.
+    p.add_option('-b',
+                 dest='current_branch', action='store_true',
+                 help=optparse.SUPPRESS_HELP)
 
   def Execute(self, opt, args):
     all_branches = []
diff --git a/subcmds/prune.py b/subcmds/prune.py
index ff2fba1..584ee7e 100644
--- a/subcmds/prune.py
+++ b/subcmds/prune.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,21 +12,38 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import itertools
+
 from color import Coloring
-from command import PagedCommand
+from command import DEFAULT_LOCAL_JOBS, PagedCommand
+
 
 class Prune(PagedCommand):
-  common = True
+  COMMON = True
   helpSummary = "Prune (delete) already merged topics"
   helpUsage = """
 %prog [<project>...]
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
+
+  def _ExecuteOne(self, project):
+    """Process one project."""
+    return project.PruneHeads()
 
   def Execute(self, opt, args):
-    all_branches = []
-    for project in self.GetProjects(args):
-      all_branches.extend(project.PruneHeads())
+    projects = self.GetProjects(args)
+
+    # NB: Should be able to refactor this module to display summary as results
+    # come back from children.
+    def _ProcessResults(_pool, _output, results):
+      return list(itertools.chain.from_iterable(results))
+
+    all_branches = self.ExecuteInParallel(
+        opt.jobs,
+        self._ExecuteOne,
+        projects,
+        callback=_ProcessResults,
+        ordered=True)
 
     if not all_branches:
       return
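The prune.py rewrite above fans one worker call out per project and then flattens the per-project branch lists in a results callback. A thread-backed sketch of that shape (the worker and its return values are hypothetical; repo's real pool and `PruneHeads` differ):

```python
import itertools
from multiprocessing.dummy import Pool  # thread pool stand-in for repo's worker pool

def prune_heads(project):
    """Hypothetical per-project worker; returns a list of pruned branch names."""
    return ['%s/old-topic-%d' % (project, n) for n in range(2)]

def execute_in_parallel(jobs, func, items):
    # Mirrors the _ProcessResults callback: flatten each worker's list
    # into one combined list, preserving input order (ordered=True).
    with Pool(jobs) as pool:
        results = pool.map(func, items)
    return list(itertools.chain.from_iterable(results))
```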
diff --git a/subcmds/rebase.py b/subcmds/rebase.py
index dcb8b2a..7c53eb7 100644
--- a/subcmds/rebase.py
+++ b/subcmds/rebase.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2010 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import sys
 
 from color import Coloring
@@ -30,7 +27,7 @@
 
 
 class Rebase(Command):
-  common = True
+  COMMON = True
   helpSummary = "Rebase local branches on upstream branch"
   helpUsage = """
 %prog {[<project>...] | -i <project>...}
@@ -42,36 +39,34 @@
 """
 
   def _Options(self, p):
-    p.add_option('-i', '--interactive',
-                dest="interactive", action="store_true",
-                help="interactive rebase (single project only)")
+    g = p.get_option_group('--quiet')
+    g.add_option('-i', '--interactive',
+                 dest="interactive", action="store_true",
+                 help="interactive rebase (single project only)")
 
     p.add_option('--fail-fast',
                  dest='fail_fast', action='store_true',
-                 help='Stop rebasing after first error is hit')
+                 help='stop rebasing after first error is hit')
     p.add_option('-f', '--force-rebase',
                  dest='force_rebase', action='store_true',
-                 help='Pass --force-rebase to git rebase')
+                 help='pass --force-rebase to git rebase')
     p.add_option('--no-ff',
-                 dest='no_ff', action='store_true',
-                 help='Pass --no-ff to git rebase')
-    p.add_option('-q', '--quiet',
-                 dest='quiet', action='store_true',
-                 help='Pass --quiet to git rebase')
+                 dest='ff', default=True, action='store_false',
+                 help='pass --no-ff to git rebase')
     p.add_option('--autosquash',
                  dest='autosquash', action='store_true',
-                 help='Pass --autosquash to git rebase')
+                 help='pass --autosquash to git rebase')
     p.add_option('--whitespace',
                  dest='whitespace', action='store', metavar='WS',
-                 help='Pass --whitespace to git rebase')
+                 help='pass --whitespace to git rebase')
     p.add_option('--auto-stash',
                  dest='auto_stash', action='store_true',
-                 help='Stash local modifications before starting')
+                 help='stash local modifications before starting')
     p.add_option('-m', '--onto-manifest',
                  dest='onto_manifest', action='store_true',
-                 help='Rebase onto the manifest version instead of upstream '
-                      'HEAD.  This helps to make sure the local tree stays '
-                      'consistent if you previously synced to a manifest.')
+                 help='rebase onto the manifest version instead of upstream '
+                      'HEAD (this helps to make sure the local tree stays '
+                      'consistent if you previously synced to a manifest)')
 
   def Execute(self, opt, args):
     all_projects = self.GetProjects(args)
@@ -82,7 +77,7 @@
             file=sys.stderr)
       if len(args) == 1:
         print('note: project %s is mapped to more than one path' % (args[0],),
-            file=sys.stderr)
+              file=sys.stderr)
       return 1
 
     # Setup the common git rebase args that we use for all projects.
@@ -93,7 +88,7 @@
       common_args.append('--quiet')
     if opt.force_rebase:
       common_args.append('--force-rebase')
-    if opt.no_ff:
+    if not opt.ff:
       common_args.append('--no-ff')
     if opt.autosquash:
       common_args.append('--autosquash')
diff --git a/subcmds/selfupdate.py b/subcmds/selfupdate.py
index a8a09b6..282f518 100644
--- a/subcmds/selfupdate.py
+++ b/subcmds/selfupdate.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 from optparse import SUPPRESS_HELP
 import sys
 
@@ -22,8 +19,9 @@
 from subcmds.sync import _PostRepoUpgrade
 from subcmds.sync import _PostRepoFetch
 
+
 class Selfupdate(Command, MirrorSafeCommand):
-  common = False
+  COMMON = False
   helpSummary = "Update repo to the latest version"
   helpUsage = """
 %prog
@@ -39,7 +37,7 @@
   def _Options(self, p):
     g = p.add_option_group('repo Version options')
     g.add_option('--no-repo-verify',
-                 dest='no_repo_verify', action='store_true',
+                 dest='repo_verify', default=True, action='store_false',
                  help='do not verify repo source code')
     g.add_option('--repo-upgraded',
                  dest='repo_upgraded', action='store_true',
@@ -59,5 +57,5 @@
 
       rp.bare_git.gc('--auto')
       _PostRepoFetch(rp,
-                     no_repo_verify = opt.no_repo_verify,
-                     verbose = True)
+                     repo_verify=opt.repo_verify,
+                     verbose=True)
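The selfupdate.py change flips `--no-repo-verify` from a negative flag (`no_repo_verify=True`) to storing the positive `repo_verify` directly, which avoids double negatives at every call site. A minimal optparse sketch of the inverted-flag pattern:

```python
import optparse

# The option is spelled --no-repo-verify, but the stored value is the
# positive repo_verify, defaulting to True when the flag is absent.
parser = optparse.OptionParser()
parser.add_option('--no-repo-verify',
                  dest='repo_verify', default=True, action='store_false',
                  help='do not verify repo source code')
```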
diff --git a/subcmds/smartsync.py b/subcmds/smartsync.py
index 675b983..d91d59c 100644
--- a/subcmds/smartsync.py
+++ b/subcmds/smartsync.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2010 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,8 +14,9 @@
 
 from subcmds.sync import Sync
 
+
 class Smartsync(Sync):
-  common = True
+  COMMON = True
   helpSummary = "Update working tree to the latest known good revision"
   helpUsage = """
 %prog [<project>...]
diff --git a/subcmds/stage.py b/subcmds/stage.py
index aeb4951..0389a4f 100644
--- a/subcmds/stage.py
+++ b/subcmds/stage.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,13 +12,13 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import sys
 
 from color import Coloring
 from command import InteractiveCommand
 from git_command import GitCommand
 
+
 class _ProjectList(Coloring):
   def __init__(self, gc):
     Coloring.__init__(self, gc, 'interactive')
@@ -28,8 +26,9 @@
     self.header = self.printer('header', attr='bold')
     self.help = self.printer('help', fg='red', attr='bold')
 
+
 class Stage(InteractiveCommand):
-  common = True
+  COMMON = True
   helpSummary = "Stage file(s) for commit"
   helpUsage = """
 %prog -i [<project>...]
@@ -39,7 +38,8 @@
 """
 
   def _Options(self, p):
-    p.add_option('-i', '--interactive',
+    g = p.get_option_group('--quiet')
+    g.add_option('-i', '--interactive',
                  dest='interactive', action='store_true',
                  help='use interactive staging')
 
@@ -105,6 +105,7 @@
         continue
     print('Bye.')
 
+
 def _AddI(project):
   p = GitCommand(project, ['add', '--interactive'], bare=False)
   p.Wait()
diff --git a/subcmds/start.py b/subcmds/start.py
index 6ec0b2c..2addaf2 100644
--- a/subcmds/start.py
+++ b/subcmds/start.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,19 +12,20 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import functools
 import os
 import sys
 
-from command import Command
+from command import Command, DEFAULT_LOCAL_JOBS
 from git_config import IsImmutable
 from git_command import git
 import gitc_utils
 from progress import Progress
 from project import SyncBuffer
 
+
 class Start(Command):
-  common = True
+  COMMON = True
   helpSummary = "Start a new branch for development"
   helpUsage = """
 %prog <newbranchname> [--all | <project>...]
@@ -35,6 +34,7 @@
 '%prog' begins a new branch of development, starting from the
 revision specified in the manifest.
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def _Options(self, p):
     p.add_option('--all',
@@ -42,7 +42,8 @@
                  help='begin branch in all projects')
     p.add_option('-r', '--rev', '--revision', dest='revision',
                  help='point branch at this revision instead of upstream')
-    p.add_option('--head', dest='revision', action='store_const', const='HEAD',
+    p.add_option('--head', '--HEAD',
+                 dest='revision', action='store_const', const='HEAD',
                  help='abbreviation for --rev HEAD')
 
   def ValidateOptions(self, opt, args):
@@ -53,6 +54,26 @@
     if not git.check_ref_format('heads/%s' % nb):
       self.OptionParser.error("'%s' is not a valid name" % nb)
 
+  def _ExecuteOne(self, revision, nb, project):
+    """Start one project."""
+    # If the current revision is immutable, such as a SHA1, a tag or
+    # a change, then we can't push back to it. Substitute with
+    # dest_branch, if defined; or with manifest default revision instead.
+    branch_merge = ''
+    if IsImmutable(project.revisionExpr):
+      if project.dest_branch:
+        branch_merge = project.dest_branch
+      else:
+        branch_merge = self.manifest.default.revisionExpr
+
+    try:
+      ret = project.StartBranch(
+          nb, branch_merge=branch_merge, revision=revision)
+    except Exception as e:
+      print('error: unable to checkout %s: %s' % (project.name, e), file=sys.stderr)
+      ret = False
+    return (ret, project)
+
   def Execute(self, opt, args):
     nb = args[0]
     err = []
@@ -60,7 +81,7 @@
     if not opt.all:
       projects = args[1:]
       if len(projects) < 1:
-        projects = ['.',]  # start it in the local project by default
+        projects = ['.']  # start it in the local project by default
 
     all_projects = self.GetProjects(projects,
                                     missing_ok=bool(self.gitc_manifest))
@@ -84,11 +105,8 @@
       if not os.path.exists(os.getcwd()):
         os.chdir(self.manifest.topdir)
 
-    pm = Progress('Starting %s' % nb, len(all_projects))
-    for project in all_projects:
-      pm.update()
-
-      if self.gitc_manifest:
+      pm = Progress('Syncing %s' % nb, len(all_projects), quiet=opt.quiet)
+      for project in all_projects:
         gitc_project = self.gitc_manifest.paths[project.relpath]
         # Sync projects that have not been opened.
         if not gitc_project.already_synced:
@@ -101,21 +119,21 @@
           sync_buf = SyncBuffer(self.manifest.manifestProject.config)
           project.Sync_LocalHalf(sync_buf)
           project.revisionId = gitc_project.old_revision
+        pm.update()
+      pm.end()
 
-      # If the current revision is immutable, such as a SHA1, a tag or
-      # a change, then we can't push back to it. Substitute with
-      # dest_branch, if defined; or with manifest default revision instead.
-      branch_merge = ''
-      if IsImmutable(project.revisionExpr):
-        if project.dest_branch:
-          branch_merge = project.dest_branch
-        else:
-          branch_merge = self.manifest.default.revisionExpr
+    def _ProcessResults(_pool, pm, results):
+      for (result, project) in results:
+        if not result:
+          err.append(project)
+        pm.update()
 
-      if not project.StartBranch(
-          nb, branch_merge=branch_merge, revision=opt.revision):
-        err.append(project)
-    pm.end()
+    self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._ExecuteOne, opt.revision, nb),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Starting %s' % (nb,), len(all_projects), quiet=opt.quiet))
 
     if err:
       for p in err:
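The start.py rewrite binds the shared arguments with `functools.partial`, has each worker return an `(ok, project)` tuple, and collects failures in the results callback. A sketch of that round trip, with a hypothetical worker in place of `StartBranch`:

```python
import functools

def start_one(revision, nb, project):
    """Hypothetical worker: returns (ok, project) like _ExecuteOne above."""
    ok = project != 'broken'
    return (ok, project)

def process_results(results):
    # Collect the projects whose branch creation failed, as _ProcessResults does.
    return [project for (ok, project) in results if not ok]

# partial() pins the shared revision and branch name so the pool only
# has to hand each worker its project.
worker = functools.partial(start_one, 'refs/heads/main', 'feature-x')
```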
diff --git a/subcmds/status.py b/subcmds/status.py
index 63972d7..5b66954 100644
--- a/subcmds/status.py
+++ b/subcmds/status.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,25 +12,19 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
-
-from command import PagedCommand
-
-try:
-  import threading as _threading
-except ImportError:
-  import dummy_threading as _threading
-
+import functools
 import glob
-
-import itertools
+import io
 import os
 
+from command import DEFAULT_LOCAL_JOBS, PagedCommand
+
 from color import Coloring
 import platform_utils
 
+
 class Status(PagedCommand):
-  common = True
+  COMMON = True
   helpSummary = "Show the working tree status"
   helpUsage = """
 %prog [<project>...]
@@ -84,36 +76,29 @@
  d:  deleted       (    in index, not in work tree                )
 
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def _Options(self, p):
-    p.add_option('-j', '--jobs',
-                 dest='jobs', action='store', type='int', default=2,
-                 help="number of projects to check simultaneously")
     p.add_option('-o', '--orphans',
                  dest='orphans', action='store_true',
                  help="include objects in working directory outside of repo projects")
-    p.add_option('-q', '--quiet', action='store_true',
-                 help="only print the name of modified projects")
 
-  def _StatusHelper(self, project, clean_counter, sem, quiet):
+  def _StatusHelper(self, quiet, project):
     """Obtains the status for a specific project.
 
     Obtains the status for a project, redirecting the output to
-    the specified object. It will release the semaphore
-    when done.
+    the specified object.
 
     Args:
+      quiet: Whether to only print the name of modified projects.
       project: Project to get status of.
-      clean_counter: Counter for clean projects.
-      sem: Semaphore, will call release() when complete.
-      output: Where to output the status.
+
+    Returns:
+      The status of the project.
     """
-    try:
-      state = project.PrintWorkTreeStatus(quiet=quiet)
-      if state == 'CLEAN':
-        next(clean_counter)
-    finally:
-      sem.release()
+    buf = io.StringIO()
+    ret = project.PrintWorkTreeStatus(quiet=quiet, output_redir=buf)
+    return (ret, buf.getvalue())
 
   def _FindOrphans(self, dirs, proj_dirs, proj_dirs_parents, outstring):
     """find 'dirs' that are present in 'proj_dirs_parents' but not in 'proj_dirs'"""
@@ -126,34 +111,31 @@
         continue
       if item in proj_dirs_parents:
         self._FindOrphans(glob.glob('%s/.*' % item) +
-            glob.glob('%s/*' % item),
-            proj_dirs, proj_dirs_parents, outstring)
+                          glob.glob('%s/*' % item),
+                          proj_dirs, proj_dirs_parents, outstring)
         continue
       outstring.append(''.join([status_header, item, '/']))
 
   def Execute(self, opt, args):
     all_projects = self.GetProjects(args)
-    counter = itertools.count()
 
-    if opt.jobs == 1:
-      for project in all_projects:
-        state = project.PrintWorkTreeStatus(quiet=opt.quiet)
+    def _ProcessResults(_pool, _output, results):
+      ret = 0
+      for (state, output) in results:
+        if output:
+          print(output, end='')
         if state == 'CLEAN':
-          next(counter)
-    else:
-      sem = _threading.Semaphore(opt.jobs)
-      threads = []
-      for project in all_projects:
-        sem.acquire()
+          ret += 1
+      return ret
 
-        t = _threading.Thread(target=self._StatusHelper,
-                              args=(project, counter, sem, opt.quiet))
-        threads.append(t)
-        t.daemon = True
-        t.start()
-      for t in threads:
-        t.join()
-    if not opt.quiet and len(all_projects) == next(counter):
+    counter = self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._StatusHelper, opt.quiet),
+        all_projects,
+        callback=_ProcessResults,
+        ordered=True)
+
+    if not opt.quiet and len(all_projects) == counter:
       print('nothing to commit (working directory clean)')
 
     if opt.orphans:
@@ -170,8 +152,8 @@
       class StatusColoring(Coloring):
         def __init__(self, config):
           Coloring.__init__(self, config, 'status')
-          self.project = self.printer('header', attr = 'bold')
-          self.untracked = self.printer('untracked', fg = 'red')
+          self.project = self.printer('header', attr='bold')
+          self.untracked = self.printer('untracked', fg='red')
 
       orig_path = os.getcwd()
       try:
@@ -179,11 +161,11 @@
 
         outstring = []
         self._FindOrphans(glob.glob('.*') +
-            glob.glob('*'),
-            proj_dirs, proj_dirs_parents, outstring)
+                          glob.glob('*'),
+                          proj_dirs, proj_dirs_parents, outstring)
 
         if outstring:
-          output = StatusColoring(self.manifest.globalConfig)
+          output = StatusColoring(self.client.globalConfig)
           output.project('Objects not within a project (orphans)')
           output.nl()
           for entry in outstring:
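The status.py rewrite replaces the semaphore-and-threads machinery with workers that capture their own output in an `io.StringIO` buffer, so results can be printed in input order by the callback. A sketch of the capture step, with a stand-in for `project.PrintWorkTreeStatus`:

```python
import io

def status_helper(print_status):
    """Run one worker, capturing its output so it can be replayed in order."""
    buf = io.StringIO()
    state = print_status(output=buf)
    return (state, buf.getvalue())

def clean_project(output):
    # Hypothetical stand-in for PrintWorkTreeStatus: clean trees print nothing.
    return 'CLEAN'
```

Buffering per worker is what lets the pool run projects concurrently without interleaving their status text on the terminal.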
diff --git a/subcmds/sync.py b/subcmds/sync.py
index 2973a16..3211cbb 100644
--- a/subcmds/sync.py
+++ b/subcmds/sync.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,37 +12,23 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import errno
+import functools
+import http.cookiejar as cookielib
+import io
 import json
+import multiprocessing
 import netrc
 from optparse import SUPPRESS_HELP
 import os
-import re
 import socket
-import subprocess
 import sys
 import tempfile
 import time
-
-from pyversion import is_python3
-if is_python3():
-  import http.cookiejar as cookielib
-  import urllib.error
-  import urllib.parse
-  import urllib.request
-  import xmlrpc.client
-else:
-  import cookielib
-  import imp
-  import urllib2
-  import urlparse
-  import xmlrpclib
-  urllib = imp.new_module('urllib')
-  urllib.error = urllib2
-  urllib.parse = urlparse
-  urllib.request = urllib2
-  xmlrpc = imp.new_module('xmlrpc')
-  xmlrpc.client = xmlrpclib
+import urllib.error
+import urllib.parse
+import urllib.request
+import xmlrpc.client
 
 try:
   import threading as _threading
@@ -53,44 +37,36 @@
 
 try:
   import resource
+
   def _rlimit_nofile():
     return resource.getrlimit(resource.RLIMIT_NOFILE)
 except ImportError:
   def _rlimit_nofile():
     return (256, 256)
 
-try:
-  import multiprocessing
-except ImportError:
-  multiprocessing = None
-
 import event_log
-from git_command import GIT, git_require
+from git_command import git_require
 from git_config import GetUrlCookieFile
 from git_refs import R_HEADS, HEAD
+import git_superproject
 import gitc_utils
 from project import Project
 from project import RemoteSpec
-from command import Command, MirrorSafeCommand
+from command import Command, MirrorSafeCommand, WORKER_BATCH_SIZE
 from error import RepoChangedException, GitError, ManifestParseError
 import platform_utils
 from project import SyncBuffer
 from progress import Progress
+import ssh
 from wrapper import Wrapper
 from manifest_xml import GitcManifest
 
 _ONE_DAY_S = 24 * 60 * 60
 
-class _FetchError(Exception):
-  """Internal error thrown in _FetchHelper() when we don't want stack trace."""
-  pass
-
-class _CheckoutError(Exception):
-  """Internal error thrown in _CheckoutOne() when we don't want stack trace."""
 
 class Sync(Command, MirrorSafeCommand):
   jobs = 1
-  common = True
+  COMMON = True
   helpSummary = "Update working tree to the latest revision"
   helpUsage = """
 %prog [<project>...]
@@ -133,11 +109,11 @@
 credentials.
 
 By default, all projects will be synced. The --fail-fast option can be used
-to halt syncing as soon as possible when the the first project fails to sync.
+to halt syncing as soon as possible when the first project fails to sync.
 
 The --force-sync option can be used to overwrite existing git
 directories if they have previously been linked to a different
-object direcotry. WARNING: This may cause data to be lost since
+object directory. WARNING: This may cause data to be lost since
 refs may be removed when overwriting.
 
 The --force-remove-dirty option can be used to remove previously used
@@ -191,12 +167,21 @@
 later is required to fix a server side protocol bug.
 
 """
+  PARALLEL_JOBS = 1
+
+  def _CommonOptions(self, p):
+    if self.manifest:
+      try:
+        self.PARALLEL_JOBS = self.manifest.default.sync_j
+      except ManifestParseError:
+        pass
+    super()._CommonOptions(p)
 
   def _Options(self, p, show_smart=True):
-    try:
-      self.jobs = self.manifest.default.sync_j
-    except ManifestParseError:
-      self.jobs = 1
+    p.add_option('--jobs-network', default=None, type=int, metavar='JOBS',
+                 help='number of network jobs to run in parallel (defaults to --jobs)')
+    p.add_option('--jobs-checkout', default=None, type=int, metavar='JOBS',
+                 help='number of local checkout jobs to run in parallel (defaults to --jobs)')
 
     p.add_option('-f', '--force-broken',
                  dest='force_broken', action='store_true',
@@ -217,6 +202,10 @@
     p.add_option('-l', '--local-only',
                  dest='local_only', action='store_true',
                  help="only update working tree, don't fetch")
+    p.add_option('--no-manifest-update', '--nmu',
+                 dest='mp_update', action='store_false', default=True,
+                 help='use the existing manifest checkout as-is. '
+                      '(do not update to the latest revision)')
     p.add_option('-n', '--network-only',
                  dest='network_only', action='store_true',
                  help="fetch only, don't update working tree")
@@ -226,17 +215,15 @@
     p.add_option('-c', '--current-branch',
                  dest='current_branch_only', action='store_true',
                  help='fetch only current branch from server')
-    p.add_option('-q', '--quiet',
-                 dest='quiet', action='store_true',
-                 help='be more quiet')
-    p.add_option('-j', '--jobs',
-                 dest='jobs', action='store', type='int',
-                 help="projects to fetch simultaneously (default %d)" % self.jobs)
+    p.add_option('--no-current-branch',
+                 dest='current_branch_only', action='store_false',
+                 help='fetch all branches from server')
     p.add_option('-m', '--manifest-name',
                  dest='manifest_name',
                  help='temporary manifest to use for this sync', metavar='NAME.xml')
-    p.add_option('--no-clone-bundle',
-                 dest='no_clone_bundle', action='store_true',
+    p.add_option('--clone-bundle', action='store_true',
+                 help='enable use of /clone.bundle on HTTP/HTTPS')
+    p.add_option('--no-clone-bundle', dest='clone_bundle', action='store_false',
                  help='disable use of /clone.bundle on HTTP/HTTPS')
     p.add_option('-u', '--manifest-server-username', action='store',
                  dest='manifest_server_username',
@@ -247,12 +234,23 @@
     p.add_option('--fetch-submodules',
                  dest='fetch_submodules', action='store_true',
                  help='fetch submodules from server')
+    p.add_option('--use-superproject', action='store_true',
+                 help='use the manifest superproject to sync projects')
+    p.add_option('--no-use-superproject', action='store_false',
+                 dest='use_superproject',
+                 help='disable use of manifest superprojects')
+    p.add_option('--tags',
+                 default=True, action='store_true',
+                 help='fetch tags')
     p.add_option('--no-tags',
-                 dest='no_tags', action='store_true',
+                 dest='tags', action='store_false',
                  help="don't fetch tags")
     p.add_option('--optimized-fetch',
                  dest='optimized_fetch', action='store_true',
                  help='only fetch projects fixed to sha1 if revision does not exist locally')
+    p.add_option('--retry-fetches',
+                 default=0, action='store', type='int',
+                 help='number of times to retry fetches on transient errors')
     p.add_option('--prune', dest='prune', action='store_true',
                  help='delete refs that no longer exist on the remote')
     if show_smart:
@@ -265,345 +263,400 @@
 
     g = p.add_option_group('repo Version options')
     g.add_option('--no-repo-verify',
-                 dest='no_repo_verify', action='store_true',
+                 dest='repo_verify', default=True, action='store_false',
                  help='do not verify repo source code')
     g.add_option('--repo-upgraded',
                  dest='repo_upgraded', action='store_true',
                  help=SUPPRESS_HELP)
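Several hunks above convert single negated flags into enable/disable pairs that write the same destination (for example `--clone-bundle`/`--no-clone-bundle`, and `--no-repo-verify` now storing into `repo_verify`). A sketch of that optparse pattern; the explicit `default=True` here is for illustration only:

```python
import optparse

p = optparse.OptionParser()
# Paired flags writing one dest: whichever appears last on the command
# line wins, and the default covers the "neither flag given" case.
p.add_option('--clone-bundle', dest='clone_bundle',
             default=True, action='store_true',
             help='enable use of /clone.bundle on HTTP/HTTPS')
p.add_option('--no-clone-bundle', dest='clone_bundle',
             action='store_false',
             help='disable use of /clone.bundle on HTTP/HTTPS')

opts, _ = p.parse_args(['--no-clone-bundle'])
both, _ = p.parse_args(['--no-clone-bundle', '--clone-bundle'])
```

Because both options share a dest, scripts and users can override an earlier flag later on the same command line, which is handy for wrapper scripts that append their own defaults.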
 
-  def _FetchProjectList(self, opt, projects, sem, *args, **kwargs):
-    """Main function of the fetch threads.
+  def _GetBranch(self):
+    """Returns the branch name for getting the approved manifest."""
+    p = self.manifest.manifestProject
+    b = p.GetBranch(p.CurrentBranch)
+    branch = b.merge
+    if branch.startswith(R_HEADS):
+      branch = branch[len(R_HEADS):]
+    return branch
+
+  def _GetCurrentBranchOnly(self, opt):
+    """Returns True if current-branch or use-superproject options are enabled."""
+    return opt.current_branch_only or git_superproject.UseSuperproject(opt, self.manifest)
+
+  def _UpdateProjectsRevisionId(self, opt, args, load_local_manifests, superproject_logging_data):
+    """Update revisionId of every project with the SHA from superproject.
+
+    This function updates each project's revisionId with SHA from superproject.
+    It writes the updated manifest into a file and reloads the manifest from it.
+
+    Args:
+      opt: Program options returned from optparse.  See _Options().
+      args: Arguments to pass to GetProjects. See the GetProjects
+          docstring for details.
+      load_local_manifests: Whether to load local manifests.
+      superproject_logging_data: A dictionary of superproject data that is to be logged.
+
+    Returns:
+      Path to the overriding manifest file, or None if the update failed.
+    """
+    print_messages = git_superproject.PrintMessages(opt, self.manifest)
+    superproject = git_superproject.Superproject(self.manifest,
+                                                 self.repodir,
+                                                 self.git_event_log,
+                                                 quiet=opt.quiet,
+                                                 print_messages=print_messages)
+    if opt.local_only:
+      manifest_path = superproject.manifest_path
+      if manifest_path:
+        self._ReloadManifest(manifest_path, load_local_manifests)
+      return manifest_path
+
+    all_projects = self.GetProjects(args,
+                                    missing_ok=True,
+                                    submodules_ok=opt.fetch_submodules)
+    update_result = superproject.UpdateProjectsRevisionId(all_projects)
+    manifest_path = update_result.manifest_path
+    superproject_logging_data['updatedrevisionid'] = bool(manifest_path)
+    if manifest_path:
+      self._ReloadManifest(manifest_path, load_local_manifests)
+    else:
+      if print_messages:
+        print('warning: Update of revisionId from superproject has failed, '
+              'repo sync will not use superproject to fetch the source. '
+              'Please resync with the --no-use-superproject option to avoid this repo warning.',
+              file=sys.stderr)
+      if update_result.fatal and opt.use_superproject is not None:
+        sys.exit(1)
+    return manifest_path
+
+  def _FetchProjectList(self, opt, projects):
+    """Main function of the fetch worker.
+
+    The projects we're given share the same underlying git object store, so we
+    have to fetch them in serial.
 
     Delegates most of the work to _FetchHelper.
 
     Args:
       opt: Program options returned from optparse.  See _Options().
       projects: Projects to fetch.
-      sem: We'll release() this semaphore when we exit so that another thread
-          can be started up.
-      *args, **kwargs: Remaining arguments to pass to _FetchHelper. See the
-          _FetchHelper docstring for details.
     """
-    try:
-        for project in projects:
-          success = self._FetchHelper(opt, project, *args, **kwargs)
-          if not success and opt.fail_fast:
-            break
-    finally:
-        sem.release()
+    return [self._FetchOne(opt, x) for x in projects]
 
-  def _FetchHelper(self, opt, project, lock, fetched, pm, err_event,
-                   clone_filter):
+  def _FetchOne(self, opt, project):
     """Fetch git objects for a single project.
 
     Args:
       opt: Program options returned from optparse.  See _Options().
       project: Project object for the project to fetch.
-      lock: Lock for accessing objects that are shared amongst multiple
-          _FetchHelper() threads.
-      fetched: set object that we will add project.gitdir to when we're done
-          (with our lock held).
-      pm: Instance of a Project object.  We will call pm.update() (with our
-          lock held).
-      err_event: We'll set this event in the case of an error (after printing
-          out info about the error).
-      clone_filter: Filter for use in a partial clone.
 
     Returns:
       Whether the fetch was successful.
     """
-    # We'll set to true once we've locked the lock.
-    did_lock = False
-
-    # Encapsulate everything in a try/except/finally so that:
-    # - We always set err_event in the case of an exception.
-    # - We always make sure we unlock the lock if we locked it.
     start = time.time()
     success = False
+    buf = io.StringIO()
     try:
-      try:
-        success = project.Sync_NetworkHalf(
+      success = project.Sync_NetworkHalf(
           quiet=opt.quiet,
-          current_branch_only=opt.current_branch_only,
+          verbose=opt.verbose,
+          output_redir=buf,
+          current_branch_only=self._GetCurrentBranchOnly(opt),
           force_sync=opt.force_sync,
-          clone_bundle=not opt.no_clone_bundle,
-          no_tags=opt.no_tags, archive=self.manifest.IsArchive,
+          clone_bundle=opt.clone_bundle,
+          tags=opt.tags, archive=self.manifest.IsArchive,
           optimized_fetch=opt.optimized_fetch,
+          retry_fetches=opt.retry_fetches,
           prune=opt.prune,
-          clone_filter=clone_filter)
-        self._fetch_times.Set(project, time.time() - start)
+          ssh_proxy=self.ssh_proxy,
+          clone_filter=self.manifest.CloneFilter,
+          partial_clone_exclude=self.manifest.PartialCloneExclude)
 
-        # Lock around all the rest of the code, since printing, updating a set
-        # and Progress.update() are not thread safe.
-        lock.acquire()
-        did_lock = True
+      output = buf.getvalue()
+      if (opt.verbose or not success) and output:
+        print('\n' + output.rstrip())
 
-        if not success:
-          err_event.set()
-          print('error: Cannot fetch %s from %s'
-                % (project.name, project.remote.url),
-                file=sys.stderr)
-          if opt.fail_fast:
-            raise _FetchError()
-
-        fetched.add(project.gitdir)
-        pm.update(msg=project.name)
-      except _FetchError:
-        pass
-      except Exception as e:
-        print('error: Cannot fetch %s (%s: %s)' \
+      if not success:
+        print('error: Cannot fetch %s from %s'
+              % (project.name, project.remote.url),
+              file=sys.stderr)
+    except GitError as e:
+      print('error.GitError: Cannot fetch %s' % str(e), file=sys.stderr)
+    except Exception as e:
+      print('error: Cannot fetch %s (%s: %s)'
             % (project.name, type(e).__name__, str(e)), file=sys.stderr)
-        err_event.set()
-        raise
-    finally:
-      if did_lock:
-        lock.release()
-      finish = time.time()
-      self.event_log.AddSync(project, event_log.TASK_SYNC_NETWORK,
-                             start, finish, success)
+      raise
 
-    return success
+    finish = time.time()
+    return (success, project, start, finish)
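`_FetchOne` above buffers each project's output in an `io.StringIO` and only prints it when verbose or on failure, so parallel workers do not interleave their output. A small sketch of that buffering idea, with hypothetical names (`run_buffered`, `_ok`, `_fail`):

```python
import io


def run_buffered(task, verbose=False):
    # Collect a task's output in a StringIO and only surface it when
    # verbose or when the task failed, mirroring _FetchOne's approach.
    buf = io.StringIO()
    success = task(buf)
    output = buf.getvalue()
    if (verbose or not success) and output:
        return success, output.rstrip()
    return success, ''


def _ok(out):
    out.write('remote: counting objects\n')
    return True


def _fail(out):
    out.write('fatal: connection timed out\n')
    return False


quiet_ok = run_buffered(_ok)
failed = run_buffered(_fail)
```

On success in quiet mode the captured text is simply dropped; only failures pay the cost of printing.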
 
-  def _Fetch(self, projects, opt):
+  @classmethod
+  def _FetchInitChild(cls, ssh_proxy):
+    cls.ssh_proxy = ssh_proxy
+
+  def _Fetch(self, projects, opt, err_event, ssh_proxy):
+    ret = True
+
+    jobs = opt.jobs_network if opt.jobs_network else self.jobs
     fetched = set()
-    lock = _threading.Lock()
-    pm = Progress('Fetching projects', len(projects),
-                  always_print_percentage=opt.quiet)
+    pm = Progress('Fetching', len(projects), delay=False, quiet=opt.quiet)
 
     objdir_project_map = dict()
     for project in projects:
       objdir_project_map.setdefault(project.objdir, []).append(project)
+    projects_list = list(objdir_project_map.values())
 
-    threads = set()
-    sem = _threading.Semaphore(self.jobs)
-    err_event = _threading.Event()
-    for project_list in objdir_project_map.values():
-      # Check for any errors before running any more tasks.
-      # ...we'll let existing threads finish, though.
-      if err_event.isSet() and opt.fail_fast:
-        break
+    def _ProcessResults(results_sets):
+      ret = True
+      for results in results_sets:
+        for (success, project, start, finish) in results:
+          self._fetch_times.Set(project, finish - start)
+          self.event_log.AddSync(project, event_log.TASK_SYNC_NETWORK,
+                                 start, finish, success)
+          # Check for any errors before running any more tasks.
+          # ...we'll let existing jobs finish, though.
+          if not success:
+            ret = False
+          else:
+            fetched.add(project.gitdir)
+          pm.update(msg=project.name)
+        if not ret and opt.fail_fast:
+          break
+      return ret
 
-      sem.acquire()
-      kwargs = dict(opt=opt,
-                    projects=project_list,
-                    sem=sem,
-                    lock=lock,
-                    fetched=fetched,
-                    pm=pm,
-                    err_event=err_event,
-                    clone_filter=self.manifest.CloneFilter)
-      if self.jobs > 1:
-        t = _threading.Thread(target = self._FetchProjectList,
-                              kwargs = kwargs)
-        # Ensure that Ctrl-C will not freeze the repo process.
-        t.daemon = True
-        threads.add(t)
-        t.start()
+    # We pass the ssh proxy settings via the class.  This allows multiprocessing
+    # to pick it up when spawning children.  We can't pass it as an argument
+    # to _FetchProjectList below as multiprocessing is unable to pickle those.
+    Sync.ssh_proxy = None
+
+    # NB: Multiprocessing is heavy, so don't spin it up for one job.
+    if len(projects_list) == 1 or jobs == 1:
+      self._FetchInitChild(ssh_proxy)
+      if not _ProcessResults(self._FetchProjectList(opt, x) for x in projects_list):
+        ret = False
+    else:
+      # Favor throughput over responsiveness when quiet.  It seems that imap()
+      # will yield results in batches relative to chunksize, so even as the
+      # children finish a sync, we won't see the result until one child finishes
+      # ~chunksize jobs.  When using a large --jobs with large chunksize, this
+      # can be jarring as there will be a large initial delay where repo looks
+      # like it isn't doing anything and sits at 0%, but then suddenly completes
+      # a lot of jobs all at once.  Since this code is more network bound, we
+      # can accept a bit more CPU overhead with a smaller chunksize so that the
+      # user sees more immediate & continuous feedback.
+      if opt.quiet:
+        chunksize = WORKER_BATCH_SIZE
       else:
-        self._FetchProjectList(**kwargs)
+        pm.update(inc=0, msg='warming up')
+        chunksize = 4
+      with multiprocessing.Pool(
+          jobs, initializer=self._FetchInitChild, initargs=(ssh_proxy,)) as pool:
+        results = pool.imap_unordered(
+            functools.partial(self._FetchProjectList, opt),
+            projects_list,
+            chunksize=chunksize)
+        if not _ProcessResults(results):
+          ret = False
+          pool.close()
 
-    for t in threads:
-      t.join()
-
-    # If we saw an error, exit with code 1 so that other scripts can check.
-    if err_event.isSet() and opt.fail_fast:
-      print('\nerror: Exited sync due to fetch errors', file=sys.stderr)
-      sys.exit(1)
+    # Cleanup the reference now that we're done with it, and we're going to
+    # release any resources it points to.  If we don't, later multiprocessing
+    # usage (e.g. checkouts) will try to pickle and then crash.
+    del Sync.ssh_proxy
 
     pm.end()
     self._fetch_times.Save()
 
     if not self.manifest.IsArchive:
-      self._GCProjects(projects)
+      self._GCProjects(projects, opt, err_event)
 
-    return fetched
+    return (ret, fetched)
 
-  def _CheckoutWorker(self, opt, sem, project, *args, **kwargs):
-    """Main function of the fetch threads.
-
-    Delegates most of the work to _CheckoutOne.
+  def _FetchMain(self, opt, args, all_projects, err_event, manifest_name,
+                 load_local_manifests, ssh_proxy):
+    """The main network fetch loop.
 
     Args:
       opt: Program options returned from optparse.  See _Options().
-      projects: Projects to fetch.
-      sem: We'll release() this semaphore when we exit so that another thread
-          can be started up.
-      *args, **kwargs: Remaining arguments to pass to _CheckoutOne. See the
-          _CheckoutOne docstring for details.
-    """
-    try:
-      return self._CheckoutOne(opt, project, *args, **kwargs)
-    finally:
-      sem.release()
+      args: Command line args used to filter out projects.
+      all_projects: List of all projects that should be fetched.
+      err_event: Event that gets set if an error is hit while processing.
+      manifest_name: Manifest file to be reloaded.
+      load_local_manifests: Whether to load local manifests.
+      ssh_proxy: SSH manager for clients & masters.
 
-  def _CheckoutOne(self, opt, project, lock, pm, err_event, err_results):
+    Returns:
+      List of all projects that should be checked out.
+    """
+    rp = self.manifest.repoProject
+
+    to_fetch = []
+    now = time.time()
+    if _ONE_DAY_S <= (now - rp.LastFetch):
+      to_fetch.append(rp)
+    to_fetch.extend(all_projects)
+    to_fetch.sort(key=self._fetch_times.Get, reverse=True)
+
+    success, fetched = self._Fetch(to_fetch, opt, err_event, ssh_proxy)
+    if not success:
+      err_event.set()
+
+    _PostRepoFetch(rp, opt.repo_verify)
+    if opt.network_only:
+      # bail out now; the rest touches the working tree
+      if err_event.is_set():
+        print('\nerror: Exited sync due to fetch errors.\n', file=sys.stderr)
+        sys.exit(1)
+      return
+
+    # Iteratively fetch missing and/or nested unregistered submodules
+    previously_missing_set = set()
+    while True:
+      self._ReloadManifest(manifest_name, load_local_manifests)
+      all_projects = self.GetProjects(args,
+                                      missing_ok=True,
+                                      submodules_ok=opt.fetch_submodules)
+      missing = []
+      for project in all_projects:
+        if project.gitdir not in fetched:
+          missing.append(project)
+      if not missing:
+        break
+      # Stop fetching endlessly: if the set of missing repos is unchanged
+      # from the last pass, no further progress can be made, so break.
+      missing_set = set(p.name for p in missing)
+      if previously_missing_set == missing_set:
+        break
+      previously_missing_set = missing_set
+      success, new_fetched = self._Fetch(missing, opt, err_event, ssh_proxy)
+      if not success:
+        err_event.set()
+      fetched.update(new_fetched)
+
+    return all_projects
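The submodule loop in `_FetchMain` above is a fixed-point iteration: keep fetching whatever is still missing, and stop either when nothing is missing or when a pass makes no progress. A sketch of that termination logic with a hypothetical `fetch_until_stable` helper:

```python
def fetch_until_stable(all_names, fetch):
    # Keep fetching missing repos until everything is present, or until
    # a pass leaves the missing set unchanged, which ends retries on
    # repos that can never be fetched.
    fetched = set()
    previously_missing = set()
    while True:
        missing = [n for n in all_names if n not in fetched]
        if not missing:
            break
        missing_set = set(missing)
        if missing_set == previously_missing:
            break
        previously_missing = missing_set
        fetched.update(fetch(missing))
    return fetched


# A fetcher that can never retrieve 'broken': the loop still terminates.
got = fetch_until_stable(['a', 'b', 'broken'],
                         lambda names: {n for n in names if n != 'broken'})
```

Comparing whole missing sets, rather than counting attempts, means newly discovered nested submodules always get at least one fetch pass before the loop can give up.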
+
+  def _CheckoutOne(self, detach_head, force_sync, project):
     """Checkout work tree for one project
 
     Args:
-      opt: Program options returned from optparse.  See _Options().
+      detach_head: Whether to leave a detached HEAD.
+      force_sync: Force checking out of the repo.
       project: Project object for the project to checkout.
-      lock: Lock for accessing objects that are shared amongst multiple
-          _CheckoutWorker() threads.
-      pm: Instance of a Project object.  We will call pm.update() (with our
-          lock held).
-      err_event: We'll set this event in the case of an error (after printing
-          out info about the error).
-      err_results: A list of strings, paths to git repos where checkout
-          failed.
 
     Returns:
       Whether the fetch was successful.
     """
-    # We'll set to true once we've locked the lock.
-    did_lock = False
-
-    # Encapsulate everything in a try/except/finally so that:
-    # - We always set err_event in the case of an exception.
-    # - We always make sure we unlock the lock if we locked it.
     start = time.time()
     syncbuf = SyncBuffer(self.manifest.manifestProject.config,
-                         detach_head=opt.detach_head)
+                         detach_head=detach_head)
     success = False
     try:
-      try:
-        project.Sync_LocalHalf(syncbuf, force_sync=opt.force_sync)
+      project.Sync_LocalHalf(syncbuf, force_sync=force_sync)
+      success = syncbuf.Finish()
+    except GitError as e:
+      print('error.GitError: Cannot checkout %s: %s' %
+            (project.name, str(e)), file=sys.stderr)
+    except Exception as e:
+      print('error: Cannot checkout %s: %s: %s' %
+            (project.name, type(e).__name__, str(e)),
+            file=sys.stderr)
+      raise
 
-        # Lock around all the rest of the code, since printing, updating a set
-        # and Progress.update() are not thread safe.
-        lock.acquire()
-        success = syncbuf.Finish()
-        did_lock = True
+    if not success:
+      print('error: Cannot checkout %s' % (project.name), file=sys.stderr)
+    finish = time.time()
+    return (success, project, start, finish)
 
-        if not success:
-          err_event.set()
-          print('error: Cannot checkout %s' % (project.name),
-                file=sys.stderr)
-          raise _CheckoutError()
-
-        pm.update(msg=project.name)
-      except _CheckoutError:
-        pass
-      except Exception as e:
-        print('error: Cannot checkout %s: %s: %s' %
-              (project.name, type(e).__name__, str(e)),
-              file=sys.stderr)
-        err_event.set()
-        raise
-    finally:
-      if did_lock:
-        if not success:
-          err_results.append(project.relpath)
-        lock.release()
-      finish = time.time()
-      self.event_log.AddSync(project, event_log.TASK_SYNC_LOCAL,
-                             start, finish, success)
-
-    return success
-
-  def _Checkout(self, all_projects, opt):
+  def _Checkout(self, all_projects, opt, err_results):
     """Checkout projects listed in all_projects
 
     Args:
       all_projects: List of all projects that should be checked out.
       opt: Program options returned from optparse.  See _Options().
+      err_results: A list of strings, paths to git repos where checkout failed.
     """
+    # Only checkout projects with worktrees.
+    all_projects = [x for x in all_projects if x.worktree]
 
-    # Perform checkouts in multiple threads when we are using partial clone.
-    # Without partial clone, all needed git objects are already downloaded,
-    # in this situation it's better to use only one process because the checkout
-    # would be mostly disk I/O; with partial clone, the objects are only
-    # downloaded when demanded (at checkout time), which is similar to the
-    # Sync_NetworkHalf case and parallelism would be helpful.
-    if self.manifest.CloneFilter:
-      syncjobs = self.jobs
-    else:
-      syncjobs = 1
+    def _ProcessResults(pool, pm, results):
+      ret = True
+      for (success, project, start, finish) in results:
+        self.event_log.AddSync(project, event_log.TASK_SYNC_LOCAL,
+                               start, finish, success)
+        # Check for any errors before running any more tasks.
+        # ...we'll let existing jobs finish, though.
+        if not success:
+          ret = False
+          err_results.append(project.relpath)
+          if opt.fail_fast:
+            if pool:
+              pool.close()
+            return ret
+        pm.update(msg=project.name)
+      return ret
 
-    lock = _threading.Lock()
-    pm = Progress('Checking out projects', len(all_projects))
+    return self.ExecuteInParallel(
+        opt.jobs_checkout if opt.jobs_checkout else self.jobs,
+        functools.partial(self._CheckoutOne, opt.detach_head, opt.force_sync),
+        all_projects,
+        callback=_ProcessResults,
+        output=Progress('Checking out', len(all_projects), quiet=opt.quiet)) and not err_results
 
-    threads = set()
-    sem = _threading.Semaphore(syncjobs)
-    err_event = _threading.Event()
-    err_results = []
+  def _GCProjects(self, projects, opt, err_event):
+    pm = Progress('Garbage collecting', len(projects), delay=False, quiet=opt.quiet)
+    pm.update(inc=0, msg='prescan')
 
-    for project in all_projects:
-      # Check for any errors before running any more tasks.
-      # ...we'll let existing threads finish, though.
-      if err_event.isSet() and opt.fail_fast:
-        break
-
-      sem.acquire()
-      if project.worktree:
-        kwargs = dict(opt=opt,
-                      sem=sem,
-                      project=project,
-                      lock=lock,
-                      pm=pm,
-                      err_event=err_event,
-                      err_results=err_results)
-        if syncjobs > 1:
-          t = _threading.Thread(target=self._CheckoutWorker,
-                                kwargs=kwargs)
-          # Ensure that Ctrl-C will not freeze the repo process.
-          t.daemon = True
-          threads.add(t)
-          t.start()
-        else:
-          self._CheckoutWorker(**kwargs)
-
-    for t in threads:
-      t.join()
-
-    pm.end()
-    # If we saw an error, exit with code 1 so that other scripts can check.
-    if err_event.isSet():
-      print('\nerror: Exited sync due to checkout errors', file=sys.stderr)
-      if err_results:
-        print('Failing repos:\n%s' % '\n'.join(err_results),
-              file=sys.stderr)
-      sys.exit(1)
-
-  def _GCProjects(self, projects):
     gc_gitdirs = {}
     for project in projects:
-      if len(project.manifest.GetProjectsWithName(project.name)) > 1:
-        print('Shared project %s found, disabling pruning.' % project.name)
-        project.bare_git.config('--replace-all', 'gc.pruneExpire', 'never')
+      # Make sure pruning never kicks in with shared projects.
+      if (not project.use_git_worktrees and
+              len(project.manifest.GetProjectsWithName(project.name)) > 1):
+        if not opt.quiet:
+          print('\r%s: Shared project %s found, disabling pruning.' %
+                (project.relpath, project.name))
+        if git_require((2, 7, 0)):
+          project.EnableRepositoryExtension('preciousObjects')
+        else:
+          # This isn't perfect, but it's the best we can do with old git.
+          print('\r%s: WARNING: shared projects are unreliable when using old '
+                'versions of git; please upgrade to git-2.7.0+.'
+                % (project.relpath,),
+                file=sys.stderr)
+          project.config.SetString('gc.pruneExpire', 'never')
       gc_gitdirs[project.gitdir] = project.bare_git
 
-    has_dash_c = git_require((1, 7, 2))
-    if multiprocessing and has_dash_c:
-      cpu_count = multiprocessing.cpu_count()
-    else:
-      cpu_count = 1
+    pm.update(inc=len(projects) - len(gc_gitdirs), msg='warming up')
+
+    cpu_count = os.cpu_count()
     jobs = min(self.jobs, cpu_count)
 
     if jobs < 2:
       for bare_git in gc_gitdirs.values():
+        pm.update(msg=bare_git._project.name)
         bare_git.gc('--auto')
+      pm.end()
       return
 
     config = {'pack.threads': cpu_count // jobs if cpu_count > jobs else 1}
 
     threads = set()
     sem = _threading.Semaphore(jobs)
-    err_event = _threading.Event()
 
     def GC(bare_git):
+      pm.start(bare_git._project.name)
       try:
         try:
           bare_git.gc('--auto', config=config)
         except GitError:
           err_event.set()
-        except:
+        except Exception:
           err_event.set()
           raise
       finally:
+        pm.finish(bare_git._project.name)
         sem.release()
 
     for bare_git in gc_gitdirs.values():
-      if err_event.isSet():
+      if err_event.is_set() and opt.fail_fast:
         break
       sem.acquire()
       t = _threading.Thread(target=GC, args=(bare_git,))
@@ -613,84 +666,30 @@
 
     for t in threads:
       t.join()
+    pm.end()
 
-    if err_event.isSet():
-      print('\nerror: Exited sync due to gc errors', file=sys.stderr)
-      sys.exit(1)
+  def _ReloadManifest(self, manifest_name=None, load_local_manifests=True):
+    """Reload the manifest from the file specified by |manifest_name|.
 
-  def _ReloadManifest(self, manifest_name=None):
+    It unloads the manifest if |manifest_name| is None.
+
+    Args:
+      manifest_name: Manifest file to be reloaded.
+      load_local_manifests: Whether to load local manifests.
+    """
     if manifest_name:
       # Override calls _Unload already
-      self.manifest.Override(manifest_name)
+      self.manifest.Override(manifest_name, load_local_manifests=load_local_manifests)
     else:
       self.manifest._Unload()
 
-  def _DeleteProject(self, path):
-    print('Deleting obsolete path %s' % path, file=sys.stderr)
-
-    # Delete the .git directory first, so we're less likely to have a partially
-    # working git repository around. There shouldn't be any git projects here,
-    # so rmtree works.
-    try:
-      platform_utils.rmtree(os.path.join(path, '.git'))
-    except OSError as e:
-      print('Failed to remove %s (%s)' % (os.path.join(path, '.git'), str(e)), file=sys.stderr)
-      print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
-      print('       remove manually, then run sync again', file=sys.stderr)
-      return 1
-
-    # Delete everything under the worktree, except for directories that contain
-    # another git project
-    dirs_to_remove = []
-    failed = False
-    for root, dirs, files in platform_utils.walk(path):
-      for f in files:
-        try:
-          platform_utils.remove(os.path.join(root, f))
-        except OSError as e:
-          print('Failed to remove %s (%s)' % (os.path.join(root, f), str(e)), file=sys.stderr)
-          failed = True
-      dirs[:] = [d for d in dirs
-                 if not os.path.lexists(os.path.join(root, d, '.git'))]
-      dirs_to_remove += [os.path.join(root, d) for d in dirs
-                         if os.path.join(root, d) not in dirs_to_remove]
-    for d in reversed(dirs_to_remove):
-      if platform_utils.islink(d):
-        try:
-          platform_utils.remove(d)
-        except OSError as e:
-          print('Failed to remove %s (%s)' % (os.path.join(root, d), str(e)), file=sys.stderr)
-          failed = True
-      elif len(platform_utils.listdir(d)) == 0:
-        try:
-          platform_utils.rmdir(d)
-        except OSError as e:
-          print('Failed to remove %s (%s)' % (os.path.join(root, d), str(e)), file=sys.stderr)
-          failed = True
-          continue
-    if failed:
-      print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
-      print('       remove manually, then run sync again', file=sys.stderr)
-      return 1
-
-    # Try deleting parent dirs if they are empty
-    project_dir = path
-    while project_dir != self.manifest.topdir:
-      if len(platform_utils.listdir(project_dir)) == 0:
-        platform_utils.rmdir(project_dir)
-      else:
-        break
-      project_dir = os.path.dirname(project_dir)
-
-    return 0
-
   def UpdateProjectList(self, opt):
     new_project_paths = []
     for project in self.GetProjects(None, missing_ok=True):
       if project.relpath:
         new_project_paths.append(project.relpath)
     file_name = 'project.list'
-    file_path = os.path.join(self.manifest.repodir, file_name)
+    file_path = os.path.join(self.repodir, file_name)
     old_project_paths = []
 
     if os.path.exists(file_path):
@@ -705,28 +704,20 @@
           gitdir = os.path.join(self.manifest.topdir, path, '.git')
           if os.path.exists(gitdir):
             project = Project(
-                           manifest = self.manifest,
-                           name = path,
-                           remote = RemoteSpec('origin'),
-                           gitdir = gitdir,
-                           objdir = gitdir,
-                           worktree = os.path.join(self.manifest.topdir, path),
-                           relpath = path,
-                           revisionExpr = 'HEAD',
-                           revisionId = None,
-                           groups = None)
-
-            if project.IsDirty() and opt.force_remove_dirty:
-              print('WARNING: Removing dirty project "%s": uncommitted changes '
-                    'erased' % project.relpath, file=sys.stderr)
-              self._DeleteProject(project.worktree)
-            elif project.IsDirty():
-              print('error: Cannot remove project "%s": uncommitted changes '
-                    'are present' % project.relpath, file=sys.stderr)
-              print('       commit changes, then run sync again',
-                    file=sys.stderr)
-              return 1
-            elif self._DeleteProject(project.worktree):
+                manifest=self.manifest,
+                name=path,
+                remote=RemoteSpec('origin'),
+                gitdir=gitdir,
+                objdir=gitdir,
+                use_git_worktrees=os.path.isfile(gitdir),
+                worktree=os.path.join(self.manifest.topdir, path),
+                relpath=path,
+                revisionExpr='HEAD',
+                revisionId=None,
+                groups=None)
+            if not project.DeleteWorktree(
+                    quiet=opt.quiet,
+                    force=opt.force_remove_dirty):
               return 1
 
     new_project_paths.sort()
@@ -735,6 +726,56 @@
       fd.write('\n')
     return 0
 
+  def UpdateCopyLinkfileList(self):
+    """Save all dests of copyfile and linkfile, and update them if needed.
+
+    Returns:
+      Whether update was successful.
+    """
+    new_paths = {}
+    new_linkfile_paths = []
+    new_copyfile_paths = []
+    for project in self.GetProjects(None, missing_ok=True):
+      new_linkfile_paths.extend(x.dest for x in project.linkfiles)
+      new_copyfile_paths.extend(x.dest for x in project.copyfiles)
+
+    new_paths = {
+        'linkfile': new_linkfile_paths,
+        'copyfile': new_copyfile_paths,
+    }
+
+    copylinkfile_name = 'copy-link-files.json'
+    copylinkfile_path = os.path.join(self.manifest.repodir, copylinkfile_name)
+    old_copylinkfile_paths = {}
+
+    if os.path.exists(copylinkfile_path):
+      with open(copylinkfile_path, 'rb') as fp:
+        try:
+          old_copylinkfile_paths = json.load(fp)
+        except ValueError:
+          print('error: %s is not a JSON formatted file.' %
+                copylinkfile_path, file=sys.stderr)
+          platform_utils.remove(copylinkfile_path)
+          return False
+
+      need_remove_files = []
+      need_remove_files.extend(
+          set(old_copylinkfile_paths.get('linkfile', [])) -
+          set(new_linkfile_paths))
+      need_remove_files.extend(
+          set(old_copylinkfile_paths.get('copyfile', [])) -
+          set(new_copyfile_paths))
+
+      for need_remove_file in need_remove_files:
+        # Try to remove the updated copyfile or linkfile.
+        # If the file does not exist, there is nothing to do.
+        platform_utils.remove(need_remove_file, missing_ok=True)
+
+    # Create copy-link-files.json, save dest path of "copyfile" and "linkfile".
+    with open(copylinkfile_path, 'w', encoding='utf-8') as fp:
+      json.dump(new_paths, fp)
+    return True
+
   def _SmartSyncSetup(self, opt, smart_sync_manifest_path):
     if not self.manifest.manifest_server:
       print('error: cannot smart sync: no manifest server defined in '
@@ -745,7 +786,7 @@
     if not opt.quiet:
       print('Using manifest server %s' % manifest_server)
 
-    if not '@' in manifest_server:
+    if '@' not in manifest_server:
       username = None
       password = None
       if opt.manifest_server_username and opt.manifest_server_password:
@@ -782,19 +823,15 @@
     try:
       server = xmlrpc.client.Server(manifest_server, transport=transport)
       if opt.smart_sync:
-        p = self.manifest.manifestProject
-        b = p.GetBranch(p.CurrentBranch)
-        branch = b.merge
-        if branch.startswith(R_HEADS):
-          branch = branch[len(R_HEADS):]
+        branch = self._GetBranch()
 
-        env = os.environ.copy()
-        if 'SYNC_TARGET' in env:
-          target = env['SYNC_TARGET']
+        if 'SYNC_TARGET' in os.environ:
+          target = os.environ['SYNC_TARGET']
           [success, manifest_str] = server.GetApprovedManifest(branch, target)
-        elif 'TARGET_PRODUCT' in env and 'TARGET_BUILD_VARIANT' in env:
-          target = '%s-%s' % (env['TARGET_PRODUCT'],
-                              env['TARGET_BUILD_VARIANT'])
+        elif ('TARGET_PRODUCT' in os.environ and
+              'TARGET_BUILD_VARIANT' in os.environ):
+          target = '%s-%s' % (os.environ['TARGET_PRODUCT'],
+                              os.environ['TARGET_BUILD_VARIANT'])
           [success, manifest_str] = server.GetApprovedManifest(branch, target)
         else:
           [success, manifest_str] = server.GetApprovedManifest(branch)
@@ -833,12 +870,15 @@
     """Fetch & update the local manifest project."""
     if not opt.local_only:
       start = time.time()
-      success = mp.Sync_NetworkHalf(quiet=opt.quiet,
-                                    current_branch_only=opt.current_branch_only,
-                                    no_tags=opt.no_tags,
+      success = mp.Sync_NetworkHalf(quiet=opt.quiet, verbose=opt.verbose,
+                                    current_branch_only=self._GetCurrentBranchOnly(opt),
+                                    force_sync=opt.force_sync,
+                                    tags=opt.tags,
                                     optimized_fetch=opt.optimized_fetch,
+                                    retry_fetches=opt.retry_fetches,
                                     submodules=self.manifest.HasSubmodules,
-                                    clone_filter=self.manifest.CloneFilter)
+                                    clone_filter=self.manifest.CloneFilter,
+                                    partial_clone_exclude=self.manifest.PartialCloneExclude)
       finish = time.time()
       self.event_log.AddSync(mp, event_log.TASK_SYNC_NETWORK,
                              start, finish, success)
@@ -852,7 +892,7 @@
                              start, time.time(), clean)
       if not clean:
         sys.exit(1)
-      self._ReloadManifest(opt.manifest_name)
+      self._ReloadManifest(manifest_name)
       if opt.jobs is None:
         self.jobs = self.manifest.default.sync_j
 
@@ -886,7 +926,10 @@
 
     manifest_name = opt.manifest_name
     smart_sync_manifest_path = os.path.join(
-      self.manifest.manifestProject.worktree, 'smart_sync_override.xml')
+        self.manifest.manifestProject.worktree, 'smart_sync_override.xml')
+
+    if opt.clone_bundle is None:
+      opt.clone_bundle = self.manifest.CloneBundle
 
     if opt.smart_sync or opt.smart_tag:
       manifest_name = self._SmartSyncSetup(opt, smart_sync_manifest_path)
@@ -898,8 +941,17 @@
           print('error: failed to remove existing smart sync override manifest: %s' %
                 e, file=sys.stderr)
 
+    err_event = multiprocessing.Event()
+
     rp = self.manifest.repoProject
     rp.PreSync()
+    cb = rp.CurrentBranch
+    if cb:
+      base = rp.GetBranch(cb).merge
+      if not base or not base.startswith('refs/heads/'):
+        print('warning: repo is not tracking a remote branch, so it will not '
+              'receive updates; run `repo init --repo-rev=stable` to fix.',
+              file=sys.stderr)
 
     mp = self.manifest.manifestProject
     mp.PreSync()
@@ -907,7 +959,21 @@
     if opt.repo_upgraded:
       _PostRepoUpgrade(self.manifest, quiet=opt.quiet)
 
-    self._UpdateManifestProject(opt, mp, manifest_name)
+    if not opt.mp_update:
+      print('Skipping update of local manifest project.')
+    else:
+      self._UpdateManifestProject(opt, mp, manifest_name)
+
+    load_local_manifests = not self.manifest.HasLocalManifests
+    use_superproject = git_superproject.UseSuperproject(opt, self.manifest)
+    superproject_logging_data = {
+        'superproject': use_superproject,
+        'haslocalmanifests': bool(self.manifest.HasLocalManifests),
+        'hassuperprojecttag': bool(self.manifest.superproject),
+    }
+    if use_superproject:
+      manifest_name = self._UpdateProjectsRevisionId(
+          opt, args, load_local_manifests, superproject_logging_data) or opt.manifest_name
 
     if self.gitc_manifest:
       gitc_manifest_projects = self.GetProjects(args,
@@ -948,56 +1014,92 @@
                                     missing_ok=True,
                                     submodules_ok=opt.fetch_submodules)
 
+    err_network_sync = False
+    err_update_projects = False
+
     self._fetch_times = _FetchTimes(self.manifest)
     if not opt.local_only:
-      to_fetch = []
-      now = time.time()
-      if _ONE_DAY_S <= (now - rp.LastFetch):
-        to_fetch.append(rp)
-      to_fetch.extend(all_projects)
-      to_fetch.sort(key=self._fetch_times.Get, reverse=True)
+      with multiprocessing.Manager() as manager:
+        with ssh.ProxyManager(manager) as ssh_proxy:
+          # Initialize the socket dir once in the parent.
+          ssh_proxy.sock()
+          all_projects = self._FetchMain(opt, args, all_projects, err_event,
+                                         manifest_name, load_local_manifests,
+                                         ssh_proxy)
 
-      fetched = self._Fetch(to_fetch, opt)
-      _PostRepoFetch(rp, opt.no_repo_verify)
       if opt.network_only:
-        # bail out now; the rest touches the working tree
         return
 
-      # Iteratively fetch missing and/or nested unregistered submodules
-      previously_missing_set = set()
-      while True:
-        self._ReloadManifest(manifest_name)
-        all_projects = self.GetProjects(args,
-                                        missing_ok=True,
-                                        submodules_ok=opt.fetch_submodules)
-        missing = []
-        for project in all_projects:
-          if project.gitdir not in fetched:
-            missing.append(project)
-        if not missing:
-          break
-        # Stop us from non-stopped fetching actually-missing repos: If set of
-        # missing repos has not been changed from last fetch, we break.
-        missing_set = set(p.name for p in missing)
-        if previously_missing_set == missing_set:
-          break
-        previously_missing_set = missing_set
-        fetched.update(self._Fetch(missing, opt))
+      # If we saw an error, exit with code 1 so that other scripts can check.
+      if err_event.is_set():
+        err_network_sync = True
+        if opt.fail_fast:
+          print('\nerror: Exited sync due to fetch errors.\n'
+                'Local checkouts *not* updated. Resolve network issues & '
+                'retry.\n'
+                '`repo sync -l` will update some local checkouts.',
+                file=sys.stderr)
+          sys.exit(1)
 
     if self.manifest.IsMirror or self.manifest.IsArchive:
       # bail out now, we have no working tree
       return
 
     if self.UpdateProjectList(opt):
-      sys.exit(1)
+      err_event.set()
+      err_update_projects = True
+      if opt.fail_fast:
+        print('\nerror: Local checkouts *not* updated.', file=sys.stderr)
+        sys.exit(1)
 
-    self._Checkout(all_projects, opt)
+    err_update_linkfiles = not self.UpdateCopyLinkfileList()
+    if err_update_linkfiles:
+      err_event.set()
+      if opt.fail_fast:
+        print('\nerror: Local update copyfile or linkfile failed.', file=sys.stderr)
+        sys.exit(1)
+
+    err_results = []
+    # NB: We don't exit here because this is the last step.
+    err_checkout = not self._Checkout(all_projects, opt, err_results)
+    if err_checkout:
+      err_event.set()
 
     # If there's a notice that's supposed to print at the end of the sync, print
     # it now...
     if self.manifest.notice:
       print(self.manifest.notice)
 
+    # If we saw an error, exit with code 1 so that other scripts can check.
+    if err_event.is_set():
+      print('\nerror: Unable to fully sync the tree.', file=sys.stderr)
+      if err_network_sync:
+        print('error: Downloading network changes failed.', file=sys.stderr)
+      if err_update_projects:
+        print('error: Updating local project lists failed.', file=sys.stderr)
+      if err_update_linkfiles:
+        print('error: Updating copyfiles or linkfiles failed.', file=sys.stderr)
+      if err_checkout:
+        print('error: Checking out local projects failed.', file=sys.stderr)
+        if err_results:
+          print('Failing repos:\n%s' % '\n'.join(err_results), file=sys.stderr)
+      print('Try re-running with "-j1 --fail-fast" to exit at the first error.',
+            file=sys.stderr)
+      sys.exit(1)
+
+    # Log the previous sync analysis state from the config.
+    self.git_event_log.LogDataConfigEvents(mp.config.GetSyncAnalysisStateData(),
+                                           'previous_sync_state')
+
+    # Update and log with the new sync analysis state.
+    mp.config.UpdateSyncAnalysisState(opt, superproject_logging_data)
+    self.git_event_log.LogDataConfigEvents(mp.config.GetSyncAnalysisStateData(),
+                                           'current_sync_state')
+
+    if not opt.quiet:
+      print('repo sync has finished successfully.')
+
+
 def _PostRepoUpgrade(manifest, quiet=False):
   wrapper = Wrapper()
   if wrapper.NeedSetupGnuPG():
@@ -1006,15 +1108,29 @@
     if project.Exists:
       project.PostRepoUpgrade()
 
-def _PostRepoFetch(rp, no_repo_verify=False, verbose=False):
+
+def _PostRepoFetch(rp, repo_verify=True, verbose=False):
   if rp.HasChanges:
     print('info: A new version of repo is available', file=sys.stderr)
-    print(file=sys.stderr)
-    if no_repo_verify or _VerifyTag(rp):
-      syncbuf = SyncBuffer(rp.config)
-      rp.Sync_LocalHalf(syncbuf)
-      if not syncbuf.Finish():
-        sys.exit(1)
+    wrapper = Wrapper()
+    try:
+      rev = rp.bare_git.describe(rp.GetRevisionId())
+    except GitError:
+      rev = None
+    _, new_rev = wrapper.check_repo_rev(rp.gitdir, rev, repo_verify=repo_verify)
+    # See if we're held back due to missing signed tag.
+    current_revid = rp.bare_git.rev_parse('HEAD')
+    new_revid = rp.bare_git.rev_parse('--verify', new_rev)
+    if current_revid != new_revid:
+      # We want to switch to the new rev, but also not trash any uncommitted
+      # changes.  This helps with local testing/hacking.
+      # If a local change has been made, we will throw that away.
+      # We also have to make sure this will switch to an older commit if that's
+      # the latest tag in order to support release rollback.
+      try:
+        rp.work_git.reset('--keep', new_rev)
+      except GitError as e:
+        sys.exit(str(e))
       print('info: Restarting repo with latest version', file=sys.stderr)
       raise RepoChangedException(['--repo-upgraded'])
     else:
@@ -1024,53 +1140,6 @@
       print('repo version %s is current' % rp.work_git.describe(HEAD),
             file=sys.stderr)
 
-def _VerifyTag(project):
-  gpg_dir = os.path.expanduser('~/.repoconfig/gnupg')
-  if not os.path.exists(gpg_dir):
-    print('warning: GnuPG was not available during last "repo init"\n'
-          'warning: Cannot automatically authenticate repo."""',
-          file=sys.stderr)
-    return True
-
-  try:
-    cur = project.bare_git.describe(project.GetRevisionId())
-  except GitError:
-    cur = None
-
-  if not cur \
-     or re.compile(r'^.*-[0-9]{1,}-g[0-9a-f]{1,}$').match(cur):
-    rev = project.revisionExpr
-    if rev.startswith(R_HEADS):
-      rev = rev[len(R_HEADS):]
-
-    print(file=sys.stderr)
-    print("warning: project '%s' branch '%s' is not signed"
-          % (project.name, rev), file=sys.stderr)
-    return False
-
-  env = os.environ.copy()
-  env['GIT_DIR'] = project.gitdir.encode()
-  env['GNUPGHOME'] = gpg_dir.encode()
-
-  cmd = [GIT, 'tag', '-v', cur]
-  proc = subprocess.Popen(cmd,
-                          stdout = subprocess.PIPE,
-                          stderr = subprocess.PIPE,
-                          env = env)
-  out = proc.stdout.read()
-  proc.stdout.close()
-
-  err = proc.stderr.read()
-  proc.stderr.close()
-
-  if proc.wait() != 0:
-    print(file=sys.stderr)
-    print(out, file=sys.stderr)
-    print(err, file=sys.stderr)
-    print(file=sys.stderr)
-    return False
-  return True
-
 
 class _FetchTimes(object):
   _ALPHA = 0.5
@@ -1090,7 +1159,7 @@
     old = self._times.get(name, t)
     self._seen.add(name)
     a = self._ALPHA
-    self._times[name] = (a*t) + ((1-a) * old)
+    self._times[name] = (a * t) + ((1 - a) * old)
 
   def _Load(self):
     if self._times is None:
@@ -1098,10 +1167,7 @@
         with open(self._path) as f:
           self._times = json.load(f)
       except (IOError, ValueError):
-        try:
-          platform_utils.remove(self._path)
-        except OSError:
-          pass
+        platform_utils.remove(self._path, missing_ok=True)
         self._times = {}
 
   def Save(self):
@@ -1119,15 +1185,14 @@
       with open(self._path, 'w') as f:
         json.dump(self._times, f, indent=2)
     except (IOError, TypeError):
-      try:
-        platform_utils.remove(self._path)
-      except OSError:
-        pass
+      platform_utils.remove(self._path, missing_ok=True)
 
 # This is a replacement for xmlrpc.client.Transport using urllib2
 # and supporting persistent-http[s]. It cannot change hosts from
 # request to request like the normal transport, the real url
 # is passed during initialization.
+
+
 class PersistentTransport(xmlrpc.client.Transport):
   def __init__(self, orig_host):
     self.orig_host = orig_host
@@ -1138,7 +1203,7 @@
       # Since we're only using them for HTTP, copy the file temporarily,
       # stripping those prefixes away.
       if cookiefile:
-        tmpcookiefile = tempfile.NamedTemporaryFile()
+        tmpcookiefile = tempfile.NamedTemporaryFile(mode='w')
         tmpcookiefile.write("# HTTP Cookie File")
         try:
           with open(cookiefile) as f:
@@ -1162,7 +1227,7 @@
       if proxy:
         proxyhandler = urllib.request.ProxyHandler({
             "http": proxy,
-            "https": proxy })
+            "https": proxy})
 
       opener = urllib.request.build_opener(
           urllib.request.HTTPCookieProcessor(cookiejar),
@@ -1219,4 +1284,3 @@
 
   def close(self):
     pass
-
diff --git a/subcmds/upload.py b/subcmds/upload.py
index 5c12aae..c48deab 100644
--- a/subcmds/upload.py
+++ b/subcmds/upload.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2008 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,25 +12,23 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 import copy
+import functools
+import optparse
 import re
 import sys
 
-from command import InteractiveCommand
+from command import DEFAULT_LOCAL_JOBS, InteractiveCommand
 from editor import Editor
-from error import HookError, UploadError
+from error import UploadError
 from git_command import GitCommand
-from project import RepoHook
+from git_refs import R_HEADS
+from hooks import RepoHook
 
-from pyversion import is_python3
-if not is_python3():
-  input = raw_input
-else:
-  unicode = str
 
 UNUSUAL_COMMIT_THRESHOLD = 5
 
+
 def _ConfirmManyUploads(multiple_branches=False):
   if multiple_branches:
     print('ATTENTION: One or more branches has an unusually high number '
@@ -44,19 +40,22 @@
   answer = input("If you are sure you intend to do this, type 'yes': ").strip()
   return answer == "yes"
 
+
 def _die(fmt, *args):
   msg = fmt % args
   print('error: %s' % msg, file=sys.stderr)
   sys.exit(1)
 
+
 def _SplitEmails(values):
   result = []
   for value in values:
     result.extend([s.strip() for s in value.split(',')])
   return result
 
+
 class Upload(InteractiveCommand):
-  common = True
+  COMMON = True
   helpSummary = "Upload changes for code review"
   helpUsage = """
 %prog [--re --cc] [<project>]...
@@ -126,74 +125,89 @@
 of the -t option to the repo command. If unset or set to "false" then
 repo will make use of only the command line option.
 
+review.URL.uploadhashtags:
+
+To add hashtags whenever uploading a commit, you can set a per-project
+or global Git option to do so. The value of review.URL.uploadhashtags
+will be used as comma delimited hashtags like the --hashtag option.
+
+review.URL.uploadlabels:
+
+To add labels whenever uploading a commit, you can set a per-project
+or global Git option to do so. The value of review.URL.uploadlabels
+will be used as comma delimited labels like the --label option.
+
+review.URL.uploadnotify:
+
+Control e-mail notifications when uploading.
+https://gerrit-review.googlesource.com/Documentation/user-upload.html#notify
+
 # References
 
 Gerrit Code Review:  https://www.gerritcodereview.com/
 
 """
+  PARALLEL_JOBS = DEFAULT_LOCAL_JOBS
 
   def _Options(self, p):
     p.add_option('-t',
                  dest='auto_topic', action='store_true',
-                 help='Send local branch name to Gerrit Code Review')
+                 help='send local branch name to Gerrit Code Review')
+    p.add_option('--hashtag', '--ht',
+                 dest='hashtags', action='append', default=[],
+                 help='add hashtags (comma delimited) to the review')
+    p.add_option('--hashtag-branch', '--htb',
+                 action='store_true',
+                 help='add local branch name as a hashtag')
+    p.add_option('-l', '--label',
+                 dest='labels', action='append', default=[],
+                 help='add a label when uploading')
     p.add_option('--re', '--reviewers',
-                 type='string',  action='append', dest='reviewers',
-                 help='Request reviews from these people.')
+                 type='string', action='append', dest='reviewers',
+                 help='request reviews from these people')
     p.add_option('--cc',
-                 type='string',  action='append', dest='cc',
-                 help='Also send email to these email addresses.')
-    p.add_option('--br',
-                 type='string',  action='store', dest='branch',
-                 help='Branch to upload.')
-    p.add_option('--cbr', '--current-branch',
+                 type='string', action='append', dest='cc',
+                 help='also send email to these email addresses')
+    p.add_option('--br', '--branch',
+                 type='string', action='store', dest='branch',
+                 help='(local) branch to upload')
+    p.add_option('-c', '--current-branch',
                  dest='current_branch', action='store_true',
-                 help='Upload current git branch.')
-    p.add_option('-d', '--draft',
-                 action='store_true', dest='draft', default=False,
-                 help='If specified, upload as a draft.')
+                 help='upload current git branch')
+    p.add_option('--no-current-branch',
+                 dest='current_branch', action='store_false',
+                 help='upload all git branches')
+    # Turn this into a warning & remove this someday.
+    p.add_option('--cbr',
+                 dest='current_branch', action='store_true',
+                 help=optparse.SUPPRESS_HELP)
     p.add_option('--ne', '--no-emails',
                  action='store_false', dest='notify', default=True,
-                 help='If specified, do not send emails on upload.')
+                 help='do not send e-mails on upload')
     p.add_option('-p', '--private',
                  action='store_true', dest='private', default=False,
-                 help='If specified, upload as a private change.')
+                 help='upload as a private change (deprecated; use --wip)')
     p.add_option('-w', '--wip',
                  action='store_true', dest='wip', default=False,
-                 help='If specified, upload as a work-in-progress change.')
+                 help='upload as a work-in-progress change')
     p.add_option('-o', '--push-option',
                  type='string', action='append', dest='push_options',
                  default=[],
-                 help='Additional push options to transmit')
+                 help='additional push options to transmit')
     p.add_option('-D', '--destination', '--dest',
                  type='string', action='store', dest='dest_branch',
                  metavar='BRANCH',
-                 help='Submit for review on this target branch.')
-
-    # Options relating to upload hook.  Note that verify and no-verify are NOT
-    # opposites of each other, which is why they store to different locations.
-    # We are using them to match 'git commit' syntax.
-    #
-    # Combinations:
-    # - no-verify=False, verify=False (DEFAULT):
-    #   If stdout is a tty, can prompt about running upload hooks if needed.
-    #   If user denies running hooks, the upload is cancelled.  If stdout is
-    #   not a tty and we would need to prompt about upload hooks, upload is
-    #   cancelled.
-    # - no-verify=False, verify=True:
-    #   Always run upload hooks with no prompt.
-    # - no-verify=True, verify=False:
-    #   Never run upload hooks, but upload anyway (AKA bypass hooks).
-    # - no-verify=True, verify=True:
-    #   Invalid
+                 help='submit for review on this target branch')
+    p.add_option('-n', '--dry-run',
+                 dest='dryrun', default=False, action='store_true',
+                 help='do everything except actually upload the CL')
+    p.add_option('-y', '--yes',
+                 default=False, action='store_true',
+                 help='answer yes to all safe prompts')
     p.add_option('--no-cert-checks',
                  dest='validate_certs', action='store_false', default=True,
-                 help='Disable verifying ssl certs (unsafe).')
-    p.add_option('--no-verify',
-                 dest='bypass_hooks', action='store_true',
-                 help='Do not run the upload hook.')
-    p.add_option('--verify',
-                 dest='allow_all_hooks', action='store_true',
-                 help='Run the upload hook without prompting.')
+                 help='disable verifying ssl certs (unsafe)')
+    RepoHook.AddOptionGroup(p, 'pre-upload')
 
   def _SingleBranch(self, opt, branch, people):
     project = branch.project
@@ -212,20 +226,24 @@
 
       destination = opt.dest_branch or project.dest_branch or project.revisionExpr
       print('Upload project %s/ to remote branch %s%s:' %
-            (project.relpath, destination, ' (draft)' if opt.draft else ''))
+            (project.relpath, destination, ' (private)' if opt.private else ''))
       print('  branch %s (%2d commit%s, %s):' % (
-                    name,
-                    len(commit_list),
-                    len(commit_list) != 1 and 's' or '',
-                    date))
+          name,
+          len(commit_list),
+          len(commit_list) != 1 and 's' or '',
+          date))
       for commit in commit_list:
         print('         %s' % commit)
 
       print('to %s (y/N)? ' % remote.review, end='')
       # TODO: When we require Python 3, use flush=True w/print above.
       sys.stdout.flush()
-      answer = sys.stdin.readline().strip().lower()
-      answer = answer in ('y', 'yes', '1', 'true', 't')
+      if opt.yes:
+        print('<--yes>')
+        answer = True
+      else:
+        answer = sys.stdin.readline().strip().lower()
+        answer = answer in ('y', 'yes', '1', 'true', 't')
 
     if answer:
       if len(branch.commits) > UNUSUAL_COMMIT_THRESHOLD:
@@ -322,12 +340,12 @@
 
     key = 'review.%s.autoreviewer' % project.GetBranch(name).remote.review
     raw_list = project.config.GetString(key)
-    if not raw_list is None:
+    if raw_list is not None:
       people[0].extend([entry.strip() for entry in raw_list.split(',')])
 
     key = 'review.%s.autocopy' % project.GetBranch(name).remote.review
     raw_list = project.config.GetString(key)
-    if not raw_list is None and len(people[0]) > 0:
+    if raw_list is not None and len(people[0]) > 0:
       people[1].extend([entry.strip() for entry in raw_list.split(',')])
 
   def _FindGerritChange(self, branch):
@@ -364,7 +382,11 @@
             print('Continue uploading? (y/N) ', end='')
             # TODO: When we require Python 3, use flush=True w/print above.
             sys.stdout.flush()
-            a = sys.stdin.readline().strip().lower()
+            if opt.yes:
+              print('<--yes>')
+              a = 'yes'
+            else:
+              a = sys.stdin.readline().strip().lower()
             if a not in ('y', 'yes', 't', 'true', 'on'):
               print("skipping upload", file=sys.stderr)
               branch.uploaded = False
@@ -376,12 +398,51 @@
           key = 'review.%s.uploadtopic' % branch.project.remote.review
           opt.auto_topic = branch.project.config.GetBoolean(key)
 
+        def _ExpandCommaList(value):
+          """Split |value| into comma-delimited entries."""
+          if not value:
+            return
+          for ret in value.split(','):
+            ret = ret.strip()
+            if ret:
+              yield ret
+
+        # Check if hashtags should be included.
+        key = 'review.%s.uploadhashtags' % branch.project.remote.review
+        hashtags = set(_ExpandCommaList(branch.project.config.GetString(key)))
+        for tag in opt.hashtags:
+          hashtags.update(_ExpandCommaList(tag))
+        if opt.hashtag_branch:
+          hashtags.add(branch.name)
+
+        # Check if labels should be included.
+        key = 'review.%s.uploadlabels' % branch.project.remote.review
+        labels = set(_ExpandCommaList(branch.project.config.GetString(key)))
+        for label in opt.labels:
+          labels.update(_ExpandCommaList(label))
+        # Basic sanity check on label syntax.
+        for label in labels:
+          if not re.match(r'^.+[+-][0-9]+$', label):
+            print('repo: error: invalid label syntax "%s": labels use forms '
+                  'like CodeReview+1 or Verified-1' % (label,), file=sys.stderr)
+            sys.exit(1)
+
+        # Handle e-mail notifications.
+        if opt.notify is False:
+          notify = 'NONE'
+        else:
+          key = 'review.%s.uploadnotify' % branch.project.remote.review
+          notify = branch.project.config.GetString(key)
+
         destination = opt.dest_branch or branch.project.dest_branch
 
         # Make sure our local branch is not setup to track a different remote branch
         merge_branch = self._GetMergeBranch(branch.project)
         if destination:
-          full_dest = 'refs/heads/%s' % destination
+          full_dest = destination
+          if not full_dest.startswith(R_HEADS):
+            full_dest = R_HEADS + full_dest
+
           if not opt.dest_branch and merge_branch and merge_branch != full_dest:
             print('merge branch %s does not match destination branch %s'
                   % (merge_branch, full_dest))
@@ -392,10 +453,12 @@
             continue
 
         branch.UploadForReview(people,
+                               dryrun=opt.dryrun,
                                auto_topic=opt.auto_topic,
-                               draft=opt.draft,
+                               hashtags=hashtags,
+                               labels=labels,
                                private=opt.private,
-                               notify=None if opt.notify else 'NONE',
+                               notify=notify,
                                wip=opt.wip,
                                dest_branch=destination,
                                validate_certs=opt.validate_certs,
@@ -418,18 +481,18 @@
           else:
             fmt = '\n       (%s)'
           print(('[FAILED] %-15s %-15s' + fmt) % (
-                 branch.project.relpath + '/', \
-                 branch.name, \
-                 str(branch.error)),
-                 file=sys.stderr)
+              branch.project.relpath + '/',
+              branch.name,
+              str(branch.error)),
+              file=sys.stderr)
       print()
 
     for branch in todo:
       if branch.uploaded:
         print('[OK    ] %-15s %s' % (
-               branch.project.relpath + '/',
-               branch.name),
-               file=sys.stderr)
+            branch.project.relpath + '/',
+            branch.name),
+            file=sys.stderr)
 
     if have_errors:
       sys.exit(1)
@@ -437,68 +500,72 @@
   def _GetMergeBranch(self, project):
     p = GitCommand(project,
                    ['rev-parse', '--abbrev-ref', 'HEAD'],
-                   capture_stdout = True,
-                   capture_stderr = True)
+                   capture_stdout=True,
+                   capture_stderr=True)
     p.Wait()
     local_branch = p.stdout.strip()
     p = GitCommand(project,
                    ['config', '--get', 'branch.%s.merge' % local_branch],
-                   capture_stdout = True,
-                   capture_stderr = True)
+                   capture_stdout=True,
+                   capture_stderr=True)
     p.Wait()
     merge_branch = p.stdout.strip()
     return merge_branch
 
+  @staticmethod
+  def _GatherOne(opt, project):
+    """Figure out the upload status for |project|."""
+    if opt.current_branch:
+      cbr = project.CurrentBranch
+      up_branch = project.GetUploadableBranch(cbr)
+      avail = [up_branch] if up_branch else None
+    else:
+      avail = project.GetUploadableBranches(opt.branch)
+    return (project, avail)
+
   def Execute(self, opt, args):
-    project_list = self.GetProjects(args)
-    pending = []
-    reviewers = []
-    cc = []
-    branch = None
+    projects = self.GetProjects(args)
 
-    if opt.branch:
-      branch = opt.branch
-
-    for project in project_list:
-      if opt.current_branch:
-        cbr = project.CurrentBranch
-        up_branch = project.GetUploadableBranch(cbr)
-        if up_branch:
-          avail = [up_branch]
-        else:
-          avail = None
-          print('ERROR: Current branch (%s) not uploadable. '
-                'You may be able to type '
-                '"git branch --set-upstream-to m/master" to fix '
-                'your branch.' % str(cbr),
+    def _ProcessResults(_pool, _out, results):
+      pending = []
+      for result in results:
+        project, avail = result
+        if avail is None:
+          print('repo: error: %s: Unable to upload branch "%s". '
+                'You might be able to fix the branch by running:\n'
+                '  git branch --set-upstream-to m/%s' %
+                (project.relpath, project.CurrentBranch, self.manifest.branch),
                 file=sys.stderr)
-      else:
-        avail = project.GetUploadableBranches(branch)
-      if avail:
-        pending.append((project, avail))
+        elif avail:
+          pending.append(result)
+      return pending
+
+    pending = self.ExecuteInParallel(
+        opt.jobs,
+        functools.partial(self._GatherOne, opt),
+        projects,
+        callback=_ProcessResults)
 
     if not pending:
-      print("no branches ready for upload", file=sys.stderr)
-      return
+      if opt.branch is None:
+        print('repo: error: no branches ready for upload', file=sys.stderr)
+      else:
+        print('repo: error: no branches named "%s" ready for upload' %
+              (opt.branch,), file=sys.stderr)
+      return 1
 
-    if not opt.bypass_hooks:
-      hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
-                      self.manifest.topdir,
-                      self.manifest.manifestProject.GetRemote('origin').url,
-                      abort_if_user_denies=True)
-      pending_proj_names = [project.name for (project, available) in pending]
-      pending_worktrees = [project.worktree for (project, available) in pending]
-      try:
-        hook.Run(opt.allow_all_hooks, project_list=pending_proj_names,
-                 worktree_list=pending_worktrees)
-      except HookError as e:
-        print("ERROR: %s" % str(e), file=sys.stderr)
-        return
+    pending_proj_names = [project.name for (project, available) in pending]
+    pending_worktrees = [project.worktree for (project, available) in pending]
+    hook = RepoHook.FromSubcmd(
+        hook_type='pre-upload', manifest=self.manifest,
+        opt=opt, abort_if_user_denies=True)
+    if not hook.Run(
+        project_list=pending_proj_names,
+        worktree_list=pending_worktrees):
+      return 1
 
-    if opt.reviewers:
-      reviewers = _SplitEmails(opt.reviewers)
-    if opt.cc:
-      cc = _SplitEmails(opt.cc)
+    reviewers = _SplitEmails(opt.reviewers) if opt.reviewers else []
+    cc = _SplitEmails(opt.cc) if opt.cc else []
     people = (reviewers, cc)
 
     if len(pending) == 1 and len(pending[0][1]) == 1:
diff --git a/subcmds/version.py b/subcmds/version.py
index 761172b..09b053e 100644
--- a/subcmds/version.py
+++ b/subcmds/version.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,17 +12,20 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
+import platform
 import sys
+
 from command import Command, MirrorSafeCommand
 from git_command import git, RepoSourceVersion, user_agent
 from git_refs import HEAD
+from wrapper import Wrapper
+
 
 class Version(Command, MirrorSafeCommand):
   wrapper_version = None
   wrapper_path = None
 
-  common = False
+  COMMON = False
   helpSummary = "Display the version of repo"
   helpUsage = """
 %prog
@@ -33,16 +34,19 @@
   def Execute(self, opt, args):
     rp = self.manifest.repoProject
     rem = rp.GetRemote(rp.remote.name)
+    branch = rp.GetBranch('default')
 
     # These might not be the same.  Report them both.
     src_ver = RepoSourceVersion()
     rp_ver = rp.bare_git.describe(HEAD)
     print('repo version %s' % rp_ver)
     print('       (from %s)' % rem.url)
+    print('       (tracking %s)' % branch.merge)
+    print('       (%s)' % rp.bare_git.log('-1', '--format=%cD', HEAD))
 
-    if Version.wrapper_path is not None:
-      print('repo launcher version %s' % Version.wrapper_version)
-      print('       (from %s)' % Version.wrapper_path)
+    if self.wrapper_path is not None:
+      print('repo launcher version %s' % self.wrapper_version)
+      print('       (from %s)' % self.wrapper_path)
 
       if src_ver != rp_ver:
         print('       (currently at %s)' % src_ver)
@@ -51,3 +55,12 @@
     print('git %s' % git.version_tuple().full)
     print('git User-Agent %s' % user_agent.git)
     print('Python %s' % sys.version)
+    uname = platform.uname()
+    if sys.version_info.major < 3:
+      # Python 3 returns a named tuple, but Python 2 is simpler.
+      print(uname)
+    else:
+      print('OS %s %s (%s)' % (uname.system, uname.release, uname.version))
+      print('CPU %s (%s)' %
+            (uname.machine, uname.processor if uname.processor else 'unknown'))
+    print('Bug reports:', Wrapper().BUG_URL)
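The `version.py` change above extends the output with OS and CPU lines built from `platform.uname()`. A minimal sketch of just that formatting, covering only the Python 3 path (`format_host_info` is a hypothetical helper name, not in the patch):

```python
import platform

def format_host_info():
    """Render OS/CPU lines like the ones the updated `repo version` prints."""
    uname = platform.uname()  # named tuple on Python 3
    os_line = 'OS %s %s (%s)' % (uname.system, uname.release, uname.version)
    # uname.processor may be empty on some platforms, hence the fallback.
    cpu_line = 'CPU %s (%s)' % (
        uname.machine, uname.processor if uname.processor else 'unknown')
    return os_line, cpu_line

os_line, cpu_line = format_host_info()
```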
diff --git a/tests/fixtures/test.gitconfig b/tests/fixtures/test.gitconfig
index 3c573c9..b178cf6 100644
--- a/tests/fixtures/test.gitconfig
+++ b/tests/fixtures/test.gitconfig
@@ -1,3 +1,23 @@
 [section]
 	empty
 	nonempty = true
+	boolinvalid = oops
+	booltrue = true
+	boolfalse = false
+	intinvalid = oops
+	inthex = 0x10
+	inthexk = 0x10k
+	int = 10
+	intk = 10k
+	intm = 10m
+	intg = 10g
+[repo "syncstate.main"]
+	synctime = 2021-09-14T17:23:43.537338Z
+	version = 1
+[repo "syncstate.sys"]
+	argv = ['/usr/bin/pytest-3']
+[repo "syncstate.superproject"]
+	test = false
+[repo "syncstate.options"]
+	verbose = true
+	mpupdate = false
diff --git a/tests/test_editor.py b/tests/test_editor.py
index fbcfcdb..cfd4f5e 100644
--- a/tests/test_editor.py
+++ b/tests/test_editor.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2019 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,8 +14,6 @@
 
 """Unittests for the editor.py module."""
 
-from __future__ import print_function
-
 import unittest
 
 from editor import Editor
diff --git a/tests/test_error.py b/tests/test_error.py
new file mode 100644
index 0000000..82b00c2
--- /dev/null
+++ b/tests/test_error.py
@@ -0,0 +1,53 @@
+# Copyright 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the error.py module."""
+
+import inspect
+import pickle
+import unittest
+
+import error
+
+
+class PickleTests(unittest.TestCase):
+  """Make sure all our custom exceptions can be pickled."""
+
+  def getExceptions(self):
+    """Return all our custom exceptions."""
+    for name in dir(error):
+      cls = getattr(error, name)
+      if isinstance(cls, type) and issubclass(cls, Exception):
+        yield cls
+
+  def testExceptionLookup(self):
+    """Make sure our introspection logic works."""
+    classes = list(self.getExceptions())
+    self.assertIn(error.HookError, classes)
+    # Don't assert the exact number to avoid being a change-detector test.
+    self.assertGreater(len(classes), 10)
+
+  def testPickle(self):
+    """Try to pickle all the exceptions."""
+    for cls in self.getExceptions():
+      args = inspect.getfullargspec(cls.__init__).args[1:]
+      obj = cls(*args)
+      p = pickle.dumps(obj)
+      try:
+        newobj = pickle.loads(p)
+      except Exception as e:  # pylint: disable=broad-except
+        self.fail('Class %s is unable to be pickled: %s\n'
+                  'Incomplete super().__init__(...) call?' % (cls, e))
+      self.assertIsInstance(newobj, cls)
+      self.assertEqual(str(obj), str(newobj))
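The pickle test above exists because an exception whose `__init__` does not forward its arguments to `super().__init__(...)` serializes fine but fails to deserialize: pickle reconstructs it via `cls(*self.args)`, and `args` ends up empty. A minimal illustration of the pitfall (`GoodError`/`BadError` are made-up classes, not from `error.py`):

```python
import pickle

class GoodError(Exception):
    """Pickles cleanly: the argument is forwarded to Exception.__init__."""
    def __init__(self, reason):
        super().__init__(reason)
        self.reason = reason

class BadError(Exception):
    """Typical pitfall: args are NOT forwarded, so unpickling fails."""
    def __init__(self, reason):
        super().__init__()
        self.reason = reason

err = pickle.loads(pickle.dumps(GoodError('oops')))

try:
    # dumps() succeeds, but loads() calls BadError() with no args.
    pickle.loads(pickle.dumps(BadError('oops')))
    bad_roundtrips = True
except TypeError:
    bad_roundtrips = False
```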
diff --git a/tests/test_git_command.py b/tests/test_git_command.py
index 51171a3..93300a6 100644
--- a/tests/test_git_command.py
+++ b/tests/test_git_command.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright 2019 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,12 +14,16 @@
 
 """Unittests for the git_command.py module."""
 
-from __future__ import print_function
-
 import re
 import unittest
 
+try:
+  from unittest import mock
+except ImportError:
+  import mock
+
 import git_command
+import wrapper
 
 
 class GitCallUnitTest(unittest.TestCase):
@@ -35,7 +37,7 @@
     # We don't dive too deep into the values here to avoid having to update
     # whenever git versions change.  We do check relative to this min version
     # as this is what `repo` itself requires via MIN_GIT_VERSION.
-    MIN_GIT_VERSION = (1, 7, 2)
+    MIN_GIT_VERSION = (2, 10, 2)
     self.assertTrue(isinstance(ver.major, int))
     self.assertTrue(isinstance(ver.minor, int))
     self.assertTrue(isinstance(ver.micro, int))
@@ -76,3 +78,45 @@
     # the general form.
     m = re.match(r'^git/[^ ]+ ([^ ]+) git-repo/[^ ]+', ua)
     self.assertIsNotNone(m)
+
+
+class GitRequireTests(unittest.TestCase):
+  """Test the git_require helper."""
+
+  def setUp(self):
+    ver = wrapper.GitVersion(1, 2, 3, 4)
+    mock.patch.object(git_command.git, 'version_tuple', return_value=ver).start()
+
+  def tearDown(self):
+    mock.patch.stopall()
+
+  def test_older_nonfatal(self):
+    """Test non-fatal require calls with old versions."""
+    self.assertFalse(git_command.git_require((2,)))
+    self.assertFalse(git_command.git_require((1, 3)))
+    self.assertFalse(git_command.git_require((1, 2, 4)))
+    self.assertFalse(git_command.git_require((1, 2, 3, 5)))
+
+  def test_newer_nonfatal(self):
+    """Test non-fatal require calls with newer versions."""
+    self.assertTrue(git_command.git_require((0,)))
+    self.assertTrue(git_command.git_require((1, 0)))
+    self.assertTrue(git_command.git_require((1, 2, 0)))
+    self.assertTrue(git_command.git_require((1, 2, 3, 0)))
+
+  def test_equal_nonfatal(self):
+    """Test require calls with equal values."""
+    self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=False))
+    self.assertTrue(git_command.git_require((1, 2, 3, 4), fail=True))
+
+  def test_older_fatal(self):
+    """Test fatal require calls with old versions."""
+    with self.assertRaises(SystemExit) as e:
+      git_command.git_require((2,), fail=True)
+    self.assertNotEqual(0, e.exception.code)
+
+  def test_older_fatal_msg(self):
+    """Test fatal require calls with old versions and message."""
+    with self.assertRaises(SystemExit) as e:
+      git_command.git_require((2,), fail=True, msg='so sad')
+    self.assertNotEqual(0, e.exception.code)
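These tests pin down `git_require`'s semantics: compare only as many components as the caller supplies, and either return False or exit fatally on an old git. A sketch of the tuple-prefix comparison the tests imply, with the version patched the same way `GitRequireTests.setUp` does (the real `git_require` lives in `git_command.py` and may differ in detail):

```python
from unittest import mock

class _Git:
    def version_tuple(self):
        return (2, 30, 1)

git = _Git()

def git_require(min_version, fail=False):
    """Return True if git is at least |min_version| (prefix comparison)."""
    if git.version_tuple()[:len(min_version)] >= tuple(min_version):
        return True
    if fail:
        raise SystemExit(1)
    return False

# Patch the reported version, as the test's setUp does via mock.patch.object.
with mock.patch.object(git, 'version_tuple', return_value=(1, 2, 3, 4)):
    assert not git_require((2,))       # too old in the major component
    assert git_require((1, 2))         # equal prefix passes
```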
diff --git a/tests/test_git_config.py b/tests/test_git_config.py
index b735f27..faf12a2 100644
--- a/tests/test_git_config.py
+++ b/tests/test_git_config.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2009 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,21 +14,22 @@
 
 """Unittests for the git_config.py module."""
 
-from __future__ import print_function
-
 import os
+import tempfile
 import unittest
 
 import git_config
 
+
 def fixture(*paths):
   """Return a path relative to test/fixtures.
   """
   return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
 
-class GitConfigUnitTest(unittest.TestCase):
-  """Tests the GitConfig class.
-  """
+
+class GitConfigReadOnlyTests(unittest.TestCase):
+  """Read-only tests of the GitConfig class."""
+
   def setUp(self):
     """Create a GitConfig object using the test.gitconfig fixture.
     """
@@ -68,5 +67,126 @@
     val = config.GetString('empty')
     self.assertEqual(val, None)
 
+  def test_GetBoolean_undefined(self):
+    """Test GetBoolean on key that doesn't exist."""
+    self.assertIsNone(self.config.GetBoolean('section.missing'))
+
+  def test_GetBoolean_invalid(self):
+    """Test GetBoolean on invalid boolean value."""
+    self.assertIsNone(self.config.GetBoolean('section.boolinvalid'))
+
+  def test_GetBoolean_true(self):
+    """Test GetBoolean on valid true boolean."""
+    self.assertTrue(self.config.GetBoolean('section.booltrue'))
+
+  def test_GetBoolean_false(self):
+    """Test GetBoolean on valid false boolean."""
+    self.assertFalse(self.config.GetBoolean('section.boolfalse'))
+
+  def test_GetInt_undefined(self):
+    """Test GetInt on key that doesn't exist."""
+    self.assertIsNone(self.config.GetInt('section.missing'))
+
+  def test_GetInt_invalid(self):
+    """Test GetInt on invalid integer value."""
+    self.assertIsNone(self.config.GetInt('section.intinvalid'))
+
+  def test_GetInt_valid(self):
+    """Test GetInt on valid integers."""
+    TESTS = (
+        ('inthex', 16),
+        ('inthexk', 16384),
+        ('int', 10),
+        ('intk', 10240),
+        ('intm', 10485760),
+        ('intg', 10737418240),
+    )
+    for key, value in TESTS:
+      self.assertEqual(value, self.config.GetInt('section.%s' % (key,)))
+
+  def test_GetSyncAnalysisStateData(self):
+    """Test config entries with a sync state analysis data."""
+    superproject_logging_data = {}
+    superproject_logging_data['test'] = False
+    options = type('options', (object,), {})()
+    options.verbose = 'true'
+    options.mp_update = 'false'
+    TESTS = (
+        ('superproject.test', 'false'),
+        ('options.verbose', 'true'),
+        ('options.mpupdate', 'false'),
+        ('main.version', '1'),
+    )
+    self.config.UpdateSyncAnalysisState(options, superproject_logging_data)
+    sync_data = self.config.GetSyncAnalysisStateData()
+    for key, value in TESTS:
+      self.assertEqual(sync_data[f'{git_config.SYNC_STATE_PREFIX}{key}'], value)
+    self.assertTrue(sync_data[f'{git_config.SYNC_STATE_PREFIX}main.synctime'])
+
+
+class GitConfigReadWriteTests(unittest.TestCase):
+  """Read/write tests of the GitConfig class."""
+
+  def setUp(self):
+    self.tmpfile = tempfile.NamedTemporaryFile()
+    self.config = self.get_config()
+
+  def get_config(self):
+    """Get a new GitConfig instance."""
+    return git_config.GitConfig(self.tmpfile.name)
+
+  def test_SetString(self):
+    """Test SetString behavior."""
+    # Set a value.
+    self.assertIsNone(self.config.GetString('foo.bar'))
+    self.config.SetString('foo.bar', 'val')
+    self.assertEqual('val', self.config.GetString('foo.bar'))
+
+    # Make sure the value was actually written out.
+    config = self.get_config()
+    self.assertEqual('val', config.GetString('foo.bar'))
+
+    # Update the value.
+    self.config.SetString('foo.bar', 'valll')
+    self.assertEqual('valll', self.config.GetString('foo.bar'))
+    config = self.get_config()
+    self.assertEqual('valll', config.GetString('foo.bar'))
+
+    # Delete the value.
+    self.config.SetString('foo.bar', None)
+    self.assertIsNone(self.config.GetString('foo.bar'))
+    config = self.get_config()
+    self.assertIsNone(config.GetString('foo.bar'))
+
+  def test_SetBoolean(self):
+    """Test SetBoolean behavior."""
+    # Set a true value.
+    self.assertIsNone(self.config.GetBoolean('foo.bar'))
+    for val in (True, 1):
+      self.config.SetBoolean('foo.bar', val)
+      self.assertTrue(self.config.GetBoolean('foo.bar'))
+
+    # Make sure the value was actually written out.
+    config = self.get_config()
+    self.assertTrue(config.GetBoolean('foo.bar'))
+    self.assertEqual('true', config.GetString('foo.bar'))
+
+    # Set a false value.
+    for val in (False, 0):
+      self.config.SetBoolean('foo.bar', val)
+      self.assertFalse(self.config.GetBoolean('foo.bar'))
+
+    # Make sure the value was actually written out.
+    config = self.get_config()
+    self.assertFalse(config.GetBoolean('foo.bar'))
+    self.assertEqual('false', config.GetString('foo.bar'))
+
+    # Delete the value.
+    self.config.SetBoolean('foo.bar', None)
+    self.assertIsNone(self.config.GetBoolean('foo.bar'))
+    config = self.get_config()
+    self.assertIsNone(config.GetBoolean('foo.bar'))
+
+
 if __name__ == '__main__':
   unittest.main()
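The new `GetInt` tests above exercise git-config integer parsing, which accepts hex (`0x10`) and `k`/`m`/`g` scale suffixes, with unparseable values mapping to None. A rough, standalone reimplementation of that parsing under those assumptions (`parse_git_int` is a hypothetical name, not the repo API):

```python
def parse_git_int(value):
    """Parse a git-config integer: optional 0x hex base, k/m/g suffix."""
    value = value.strip().lower()
    mult = 1
    if value.endswith(('k', 'm', 'g')):
        mult = {'k': 1024, 'm': 1024 ** 2, 'g': 1024 ** 3}[value[-1]]
        value = value[:-1]
    base = 16 if value.startswith('0x') else 10
    try:
        return int(value, base) * mult
    except ValueError:
        return None  # mirrors GetInt returning None on invalid values
```

Against the fixture values added in `test.gitconfig`, this yields 16 for `0x10`, 16384 for `0x10k`, and 10737418240 for `10g`.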
diff --git a/tests/test_git_superproject.py b/tests/test_git_superproject.py
new file mode 100644
index 0000000..a24fc7f
--- /dev/null
+++ b/tests/test_git_superproject.py
@@ -0,0 +1,376 @@
+# Copyright (C) 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the git_superproject.py module."""
+
+import json
+import os
+import platform
+import tempfile
+import unittest
+from unittest import mock
+
+import git_superproject
+import git_trace2_event_log
+import manifest_xml
+import platform_utils
+from test_manifest_xml import sort_attributes
+
+
+class SuperprojectTestCase(unittest.TestCase):
+  """TestCase for the Superproject module."""
+
+  PARENT_SID_KEY = 'GIT_TRACE2_PARENT_SID'
+  PARENT_SID_VALUE = 'parent_sid'
+  SELF_SID_REGEX = r'repo-\d+T\d+Z-.*'
+  FULL_SID_REGEX = r'^%s/%s' % (PARENT_SID_VALUE, SELF_SID_REGEX)
+
+  def setUp(self):
+    """Set up superproject every time."""
+    self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
+    self.repodir = os.path.join(self.tempdir, '.repo')
+    self.manifest_file = os.path.join(
+        self.repodir, manifest_xml.MANIFEST_FILE_NAME)
+    os.mkdir(self.repodir)
+    self.platform = platform.system().lower()
+
+    # By default we initialize with the expected case where
+    # repo launches us (so GIT_TRACE2_PARENT_SID is set).
+    env = {
+        self.PARENT_SID_KEY: self.PARENT_SID_VALUE,
+    }
+    self.git_event_log = git_trace2_event_log.EventLog(env=env)
+
+    # The manifest parsing really wants a git repo currently.
+    gitdir = os.path.join(self.repodir, 'manifests.git')
+    os.mkdir(gitdir)
+    with open(os.path.join(gitdir, 'config'), 'w') as fp:
+      fp.write("""[remote "origin"]
+        url = https://localhost:0/manifest
+""")
+
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+  <project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
+  " /></manifest>
+""")
+    self._superproject = git_superproject.Superproject(manifest, self.repodir,
+                                                       self.git_event_log)
+
+  def tearDown(self):
+    """Tear down superproject every time."""
+    platform_utils.rmtree(self.tempdir)
+
+  def getXmlManifest(self, data):
+    """Helper to initialize a manifest for testing."""
+    with open(self.manifest_file, 'w') as fp:
+      fp.write(data)
+    return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
+
+  def verifyCommonKeys(self, log_entry, expected_event_name, full_sid=True):
+    """Helper function to verify common event log keys."""
+    self.assertIn('event', log_entry)
+    self.assertIn('sid', log_entry)
+    self.assertIn('thread', log_entry)
+    self.assertIn('time', log_entry)
+
+    # Do basic data format validation.
+    self.assertEqual(expected_event_name, log_entry['event'])
+    if full_sid:
+      self.assertRegex(log_entry['sid'], self.FULL_SID_REGEX)
+    else:
+      self.assertRegex(log_entry['sid'], self.SELF_SID_REGEX)
+    self.assertRegex(log_entry['time'], r'^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$')
+
+  def readLog(self, log_path):
+    """Helper function to read log data into a list."""
+    log_data = []
+    with open(log_path, mode='rb') as f:
+      for line in f:
+        log_data.append(json.loads(line))
+    return log_data
+
+  def verifyErrorEvent(self):
+    """Helper to verify that an error event is written."""
+
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self.git_event_log.Write(path=tempdir)
+      self.log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self.log_data), 2)
+    error_event = self.log_data[1]
+    self.verifyCommonKeys(self.log_data[0], expected_event_name='version')
+    self.verifyCommonKeys(error_event, expected_event_name='error')
+    # Check for 'error' event specific fields.
+    self.assertIn('msg', error_event)
+    self.assertIn('fmt', error_event)
+
+  def test_superproject_get_superproject_no_superproject(self):
+    """Test with no url."""
+    manifest = self.getXmlManifest("""
+<manifest>
+</manifest>
+""")
+    superproject = git_superproject.Superproject(manifest, self.repodir, self.git_event_log)
+    # Test that exit condition is false when there is no superproject tag.
+    sync_result = superproject.Sync()
+    self.assertFalse(sync_result.success)
+    self.assertFalse(sync_result.fatal)
+    self.verifyErrorEvent()
+
+  def test_superproject_get_superproject_invalid_url(self):
+    """Test with an invalid url."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+</manifest>
+""")
+    superproject = git_superproject.Superproject(manifest, self.repodir, self.git_event_log)
+    sync_result = superproject.Sync()
+    self.assertFalse(sync_result.success)
+    self.assertTrue(sync_result.fatal)
+
+  def test_superproject_get_superproject_invalid_branch(self):
+    """Test with an invalid branch."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+</manifest>
+""")
+    self._superproject = git_superproject.Superproject(manifest, self.repodir,
+                                                       self.git_event_log)
+    with mock.patch.object(self._superproject, '_branch', 'junk'):
+      sync_result = self._superproject.Sync()
+      self.assertFalse(sync_result.success)
+      self.assertTrue(sync_result.fatal)
+
+  def test_superproject_get_superproject_mock_init(self):
+    """Test with _Init failing."""
+    with mock.patch.object(self._superproject, '_Init', return_value=False):
+      sync_result = self._superproject.Sync()
+      self.assertFalse(sync_result.success)
+      self.assertTrue(sync_result.fatal)
+
+  def test_superproject_get_superproject_mock_fetch(self):
+    """Test with _Fetch failing."""
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
+      os.mkdir(self._superproject._superproject_path)
+      with mock.patch.object(self._superproject, '_Fetch', return_value=False):
+        sync_result = self._superproject.Sync()
+        self.assertFalse(sync_result.success)
+        self.assertTrue(sync_result.fatal)
+
+  def test_superproject_get_all_project_commit_ids_mock_ls_tree(self):
+    """Test with LsTree being a mock."""
+    data = ('120000 blob 158258bdf146f159218e2b90f8b699c4d85b5804\tAndroid.bp\x00'
+            '160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
+            '160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00'
+            '120000 blob acc2cbdf438f9d2141f0ae424cec1d8fc4b5d97f\tbootstrap.bash\x00'
+            '160000 commit ade9b7a0d874e25fff4bf2552488825c6f111928\tbuild/bazel\x00')
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
+      with mock.patch.object(self._superproject, '_Fetch', return_value=True):
+        with mock.patch.object(self._superproject, '_LsTree', return_value=data):
+          commit_ids_result = self._superproject._GetAllProjectsCommitIds()
+          self.assertEqual(commit_ids_result.commit_ids, {
+              'art': '2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea',
+              'bootable/recovery': 'e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06',
+              'build/bazel': 'ade9b7a0d874e25fff4bf2552488825c6f111928'
+          })
+          self.assertFalse(commit_ids_result.fatal)
+
+  def test_superproject_write_manifest_file(self):
+    """Test with writing manifest to a file after setting revisionId."""
+    self.assertEqual(len(self._superproject._manifest.projects), 1)
+    project = self._superproject._manifest.projects[0]
+    project.SetRevisionId('ABCDEF')
+    # Create temporary directory so that it can write the file.
+    os.mkdir(self._superproject._superproject_path)
+    manifest_path = self._superproject._WriteManifestFile()
+    self.assertIsNotNone(manifest_path)
+    with open(manifest_path, 'r') as fp:
+      manifest_xml_data = fp.read()
+    self.assertEqual(
+        sort_attributes(manifest_xml_data),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="default-remote"/>'
+        '<default remote="default-remote" revision="refs/heads/main"/>'
+        '<project groups="notdefault,platform-' + self.platform + '" '
+        'name="platform/art" path="art" revision="ABCDEF" upstream="refs/heads/main"/>'
+        '<superproject name="superproject"/>'
+        '</manifest>')
+
+  def test_superproject_update_project_revision_id(self):
+    """Test update of commit ids with LsTree being a mock."""
+    self.assertEqual(len(self._superproject._manifest.projects), 1)
+    projects = self._superproject._manifest.projects
+    data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
+            '160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tbootable/recovery\x00')
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
+      with mock.patch.object(self._superproject, '_Fetch', return_value=True):
+        with mock.patch.object(self._superproject,
+                               '_LsTree',
+                               return_value=data):
+          # Create temporary directory so that it can write the file.
+          os.mkdir(self._superproject._superproject_path)
+          update_result = self._superproject.UpdateProjectsRevisionId(projects)
+          self.assertIsNotNone(update_result.manifest_path)
+          self.assertFalse(update_result.fatal)
+          with open(update_result.manifest_path, 'r') as fp:
+            manifest_xml_data = fp.read()
+          self.assertEqual(
+              sort_attributes(manifest_xml_data),
+              '<?xml version="1.0" ?><manifest>'
+              '<remote fetch="http://localhost" name="default-remote"/>'
+              '<default remote="default-remote" revision="refs/heads/main"/>'
+              '<project groups="notdefault,platform-' + self.platform + '" '
+              'name="platform/art" path="art" '
+              'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
+              '<superproject name="superproject"/>'
+              '</manifest>')
+
+  def test_superproject_update_project_revision_id_no_superproject_tag(self):
+    """Test update of commit ids of a manifest without a superproject tag."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="test-name"/>
+</manifest>
+""")
+    self.maxDiff = None
+    self._superproject = git_superproject.Superproject(manifest, self.repodir,
+                                                       self.git_event_log)
+    self.assertEqual(len(self._superproject._manifest.projects), 1)
+    projects = self._superproject._manifest.projects
+    project = projects[0]
+    project.SetRevisionId('ABCDEF')
+    update_result = self._superproject.UpdateProjectsRevisionId(projects)
+    self.assertIsNone(update_result.manifest_path)
+    self.assertFalse(update_result.fatal)
+    self.verifyErrorEvent()
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="default-remote"/>'
+        '<default remote="default-remote" revision="refs/heads/main"/>'
+        '<project name="test-name" revision="ABCDEF" upstream="refs/heads/main"/>'
+        '</manifest>')
+
+  def test_superproject_update_project_revision_id_from_local_manifest_group(self):
+    """Test update of commit ids of a manifest with a local manifest group and no superproject group."""
+    local_group = manifest_xml.LOCAL_MANIFEST_GROUP_PREFIX + ':local'
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <remote name="goog" fetch="http://localhost2" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+  <project path="vendor/x" name="platform/vendor/x" remote="goog"
+           groups=\"""" + local_group + """
+         " revision="master-with-vendor" clone-depth="1" />
+  <project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
+  " /></manifest>
+""")
+    self.maxDiff = None
+    self._superproject = git_superproject.Superproject(manifest, self.repodir,
+                                                       self.git_event_log)
+    self.assertEqual(len(self._superproject._manifest.projects), 2)
+    projects = self._superproject._manifest.projects
+    data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00')
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
+      with mock.patch.object(self._superproject, '_Fetch', return_value=True):
+        with mock.patch.object(self._superproject,
+                               '_LsTree',
+                               return_value=data):
+          # Create temporary directory so that it can write the file.
+          os.mkdir(self._superproject._superproject_path)
+          update_result = self._superproject.UpdateProjectsRevisionId(projects)
+          self.assertIsNotNone(update_result.manifest_path)
+          self.assertFalse(update_result.fatal)
+          with open(update_result.manifest_path, 'r') as fp:
+            manifest_xml_data = fp.read()
+          # Verify platform/vendor/x's project revision hasn't changed.
+          self.assertEqual(
+              sort_attributes(manifest_xml_data),
+              '<?xml version="1.0" ?><manifest>'
+              '<remote fetch="http://localhost" name="default-remote"/>'
+              '<remote fetch="http://localhost2" name="goog"/>'
+              '<default remote="default-remote" revision="refs/heads/main"/>'
+              '<project groups="notdefault,platform-' + self.platform + '" '
+              'name="platform/art" path="art" '
+              'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
+              '<project clone-depth="1" groups="' + local_group + '" '
+              'name="platform/vendor/x" path="vendor/x" remote="goog" '
+              'revision="master-with-vendor"/>'
+              '<superproject name="superproject"/>'
+              '</manifest>')
+
+  def test_superproject_update_project_revision_id_with_pinned_manifest(self):
+    """Test update of commit ids of a pinned manifest."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+  <project path="vendor/x" name="platform/vendor/x" revision="" />
+  <project path="vendor/y" name="platform/vendor/y"
+           revision="52d3c9f7c107839ece2319d077de0cd922aa9d8f" />
+  <project path="art" name="platform/art" groups="notdefault,platform-""" + self.platform + """
+  " /></manifest>
+""")
+    self.maxDiff = None
+    self._superproject = git_superproject.Superproject(manifest, self.repodir,
+                                                       self.git_event_log)
+    self.assertEqual(len(self._superproject._manifest.projects), 3)
+    projects = self._superproject._manifest.projects
+    data = ('160000 commit 2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea\tart\x00'
+            '160000 commit e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06\tvendor/x\x00')
+    with mock.patch.object(self._superproject, '_Init', return_value=True):
+      with mock.patch.object(self._superproject, '_Fetch', return_value=True):
+        with mock.patch.object(self._superproject,
+                               '_LsTree',
+                               return_value=data):
+          # Create temporary directory so that it can write the file.
+          os.mkdir(self._superproject._superproject_path)
+          update_result = self._superproject.UpdateProjectsRevisionId(projects)
+          self.assertIsNotNone(update_result.manifest_path)
+          self.assertFalse(update_result.fatal)
+          with open(update_result.manifest_path, 'r') as fp:
+            manifest_xml_data = fp.read()
+          # Verify platform/vendor/x's project revision hasn't changed.
+          self.assertEqual(
+              sort_attributes(manifest_xml_data),
+              '<?xml version="1.0" ?><manifest>'
+              '<remote fetch="http://localhost" name="default-remote"/>'
+              '<default remote="default-remote" revision="refs/heads/main"/>'
+              '<project groups="notdefault,platform-' + self.platform + '" '
+              'name="platform/art" path="art" '
+              'revision="2c2724cb36cd5a9cec6c852c681efc3b7c6b86ea" upstream="refs/heads/main"/>'
+              '<project name="platform/vendor/x" path="vendor/x" '
+              'revision="e9d25da64d8d365dbba7c8ee00fe8c4473fe9a06" upstream="refs/heads/main"/>'
+              '<project name="platform/vendor/y" path="vendor/y" '
+              'revision="52d3c9f7c107839ece2319d077de0cd922aa9d8f"/>'
+              '<superproject name="superproject"/>'
+              '</manifest>')
+
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/tests/test_git_trace2_event_log.py b/tests/test_git_trace2_event_log.py
new file mode 100644
index 0000000..89dcfb9
--- /dev/null
+++ b/tests/test_git_trace2_event_log.py
@@ -0,0 +1,329 @@
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the git_trace2_event_log.py module."""
+
+import json
+import os
+import tempfile
+import unittest
+from unittest import mock
+
+import git_trace2_event_log
+
+
+class EventLogTestCase(unittest.TestCase):
+  """TestCase for the EventLog module."""
+
+  PARENT_SID_KEY = 'GIT_TRACE2_PARENT_SID'
+  PARENT_SID_VALUE = 'parent_sid'
+  SELF_SID_REGEX = r'repo-\d+T\d+Z-.*'
+  FULL_SID_REGEX = r'^%s/%s' % (PARENT_SID_VALUE, SELF_SID_REGEX)
+
+  def setUp(self):
+    """Load the event_log module every time."""
+    self._event_log_module = None
+    # By default we initialize with the expected case where
+    # repo launches us (so GIT_TRACE2_PARENT_SID is set).
+    env = {
+        self.PARENT_SID_KEY: self.PARENT_SID_VALUE,
+    }
+    self._event_log_module = git_trace2_event_log.EventLog(env=env)
+    self._log_data = None
+
+  def verifyCommonKeys(self, log_entry, expected_event_name=None, full_sid=True):
+    """Helper function to verify common event log keys."""
+    self.assertIn('event', log_entry)
+    self.assertIn('sid', log_entry)
+    self.assertIn('thread', log_entry)
+    self.assertIn('time', log_entry)
+
+    # Do basic data format validation.
+    if expected_event_name:
+      self.assertEqual(expected_event_name, log_entry['event'])
+    if full_sid:
+      self.assertRegex(log_entry['sid'], self.FULL_SID_REGEX)
+    else:
+      self.assertRegex(log_entry['sid'], self.SELF_SID_REGEX)
+    self.assertRegex(log_entry['time'], r'^\d+-\d+-\d+T\d+:\d+:\d+\.\d+Z$')
+
+  def readLog(self, log_path):
+    """Helper function to read log data into a list."""
+    log_data = []
+    with open(log_path, mode='rb') as f:
+      for line in f:
+        log_data.append(json.loads(line))
+    return log_data
+
+  def remove_prefix(self, s, prefix):
+    """Return |s| with |prefix| removed if present, otherwise the original string."""
+    if s.startswith(prefix):
+      return s[len(prefix):]
+    else:
+      return s
+
+  def test_initial_state_with_parent_sid(self):
+    """Test initial state when 'GIT_TRACE2_PARENT_SID' is set by parent."""
+    self.assertRegex(self._event_log_module.full_sid, self.FULL_SID_REGEX)
+
+  def test_initial_state_no_parent_sid(self):
+    """Test initial state when 'GIT_TRACE2_PARENT_SID' is not set."""
+    # Setup an empty environment dict (no parent sid).
+    self._event_log_module = git_trace2_event_log.EventLog(env={})
+    self.assertRegex(self._event_log_module.full_sid, self.SELF_SID_REGEX)
+
+  def test_version_event(self):
+    """Test 'version' event data is valid.
+
+    Verify that the 'version' event is written even when no other
+    events are added.
+
+    Expected event log:
+    <version event>
+    """
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    # A log with no added events should only have the version entry.
+    self.assertEqual(len(self._log_data), 1)
+    version_event = self._log_data[0]
+    self.verifyCommonKeys(version_event, expected_event_name='version')
+    # Check for 'version' event specific fields.
+    self.assertIn('evt', version_event)
+    self.assertIn('exe', version_event)
+    # Verify "evt" version field is a string.
+    self.assertIsInstance(version_event['evt'], str)
+
+  def test_start_event(self):
+    """Test and validate 'start' event data is valid.
+
+    Expected event log:
+    <version event>
+    <start event>
+    """
+    self._event_log_module.StartEvent()
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 2)
+    start_event = self._log_data[1]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+    self.verifyCommonKeys(start_event, expected_event_name='start')
+    # Check for 'start' event specific fields.
+    self.assertIn('argv', start_event)
+    self.assertIsInstance(start_event['argv'], list)
+
+  def test_exit_event_result_none(self):
+    """Test 'exit' event data is valid when result is None.
+
+    We expect None result to be converted to 0 in the exit event data.
+
+    Expected event log:
+    <version event>
+    <exit event>
+    """
+    self._event_log_module.ExitEvent(None)
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 2)
+    exit_event = self._log_data[1]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+    self.verifyCommonKeys(exit_event, expected_event_name='exit')
+    # Check for 'exit' event specific fields.
+    self.assertIn('code', exit_event)
+    # 'None' result should convert to 0 (successful) return code.
+    self.assertEqual(exit_event['code'], 0)
+
+  def test_exit_event_result_integer(self):
+    """Test 'exit' event data is valid when result is an integer.
+
+    Expected event log:
+    <version event>
+    <exit event>
+    """
+    self._event_log_module.ExitEvent(2)
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 2)
+    exit_event = self._log_data[1]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+    self.verifyCommonKeys(exit_event, expected_event_name='exit')
+    # Check for 'exit' event specific fields.
+    self.assertIn('code', exit_event)
+    self.assertEqual(exit_event['code'], 2)
+
+  def test_command_event(self):
+    """Test and validate 'command' event data is valid.
+
+    Expected event log:
+    <version event>
+    <command event>
+    """
+    name = 'repo'
+    subcommands = ['init', 'this']
+    self._event_log_module.CommandEvent(name=name, subcommands=subcommands)
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 2)
+    command_event = self._log_data[1]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+    self.verifyCommonKeys(command_event, expected_event_name='command')
+    # Check for 'command' event specific fields.
+    self.assertIn('name', command_event)
+    self.assertIn('subcommands', command_event)
+    self.assertEqual(command_event['name'], name)
+    self.assertEqual(command_event['subcommands'], subcommands)
+
+  def test_def_params_event_repo_config(self):
+    """Test 'def_params' event data outputs only repo config keys.
+
+    Expected event log:
+    <version event>
+    <def_param event>
+    <def_param event>
+    """
+    config = {
+        'git.foo': 'bar',
+        'repo.partialclone': 'true',
+        'repo.partialclonefilter': 'blob:none',
+    }
+    self._event_log_module.DefParamRepoEvents(config)
+
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 3)
+    def_param_events = self._log_data[1:]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+
+    for event in def_param_events:
+      self.verifyCommonKeys(event, expected_event_name='def_param')
+      # Check for 'def_param' event specific fields.
+      self.assertIn('param', event)
+      self.assertIn('value', event)
+      self.assertTrue(event['param'].startswith('repo.'))
+
+  def test_def_params_event_no_repo_config(self):
+    """Test 'def_params' event data won't output non-repo config keys.
+
+    Expected event log:
+    <version event>
+    """
+    config = {
+        'git.foo': 'bar',
+        'git.core.foo2': 'baz',
+    }
+    self._event_log_module.DefParamRepoEvents(config)
+
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 1)
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+
+  def test_data_event_config(self):
+    """Test 'data' event data outputs all config keys.
+
+    Expected event log:
+    <version event>
+    <data event>
+    <data event>
+    """
+    config = {
+        'git.foo': 'bar',
+        'repo.partialclone': 'false',
+        'repo.syncstate.superproject.hassuperprojecttag': 'true',
+        'repo.syncstate.superproject.sys.argv': ['--', 'sync', 'protobuf'],
+    }
+    prefix_value = 'prefix'
+    self._event_log_module.LogDataConfigEvents(config, prefix_value)
+
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 5)
+    data_events = self._log_data[1:]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+
+    for event in data_events:
+      self.verifyCommonKeys(event)
+      # Check for 'data' event specific fields.
+      self.assertIn('key', event)
+      self.assertIn('value', event)
+      key = event['key']
+      key = self.remove_prefix(key, f'{prefix_value}/')
+      value = event['value']
+      self.assertEqual(self._event_log_module.GetDataEventName(value), event['event'])
+      self.assertTrue(key in config and value == config[key])
+
+  def test_error_event(self):
+    """Test and validate 'error' event data is valid.
+
+    Expected event log:
+    <version event>
+    <error event>
+    """
+    msg = 'invalid option: --cahced'
+    fmt = 'invalid option: %s'
+    self._event_log_module.ErrorEvent(msg, fmt)
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      log_path = self._event_log_module.Write(path=tempdir)
+      self._log_data = self.readLog(log_path)
+
+    self.assertEqual(len(self._log_data), 2)
+    error_event = self._log_data[1]
+    self.verifyCommonKeys(self._log_data[0], expected_event_name='version')
+    self.verifyCommonKeys(error_event, expected_event_name='error')
+    # Check for 'error' event specific fields.
+    self.assertIn('msg', error_event)
+    self.assertIn('fmt', error_event)
+    self.assertEqual(error_event['msg'], msg)
+    self.assertEqual(error_event['fmt'], fmt)
+
+  def test_write_with_filename(self):
+    """Test that Write() returns None when given a path to a file."""
+    self.assertIsNone(self._event_log_module.Write(path='path/to/file'))
+
+  def test_write_with_git_config(self):
+    """Test that Write() uses the git config path when the 'git config' call succeeds."""
+    with tempfile.TemporaryDirectory(prefix='event_log_tests') as tempdir:
+      with mock.patch.object(self._event_log_module,
+                             '_GetEventTargetPath', return_value=tempdir):
+        self.assertEqual(os.path.dirname(self._event_log_module.Write()), tempdir)
+
+  def test_write_no_git_config(self):
+    """Test that Write() returns None when no git config variable is present."""
+    with mock.patch.object(self._event_log_module,
+                           '_GetEventTargetPath', return_value=None):
+      self.assertIsNone(self._event_log_module.Write())
+
+  def test_write_non_string(self):
+    """Test that Write() raises TypeError when |path| is not a string."""
+    with self.assertRaises(TypeError):
+      self._event_log_module.Write(path=1234)
+
+
+if __name__ == '__main__':
+  unittest.main()
diff --git a/tests/test_hooks.py b/tests/test_hooks.py
new file mode 100644
index 0000000..6632b3e
--- /dev/null
+++ b/tests/test_hooks.py
@@ -0,0 +1,55 @@
+# Copyright (C) 2019 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the hooks.py module."""
+
+import hooks
+import unittest
+
+class RepoHookShebang(unittest.TestCase):
+  """Check shebang parsing in RepoHook."""
+
+  def test_no_shebang(self):
+    """Lines w/out shebangs should be rejected."""
+    DATA = (
+        '',
+        '#\n# foo\n',
+        '# Bad shebang in script\n#!/foo\n'
+    )
+    for data in DATA:
+      self.assertIsNone(hooks.RepoHook._ExtractInterpFromShebang(data))
+
+  def test_direct_interp(self):
+    """Lines whose shebang points directly to the interpreter."""
+    DATA = (
+        ('#!/foo', '/foo'),
+        ('#! /foo', '/foo'),
+        ('#!/bin/foo ', '/bin/foo'),
+        ('#! /usr/foo ', '/usr/foo'),
+        ('#! /usr/foo -args', '/usr/foo'),
+    )
+    for shebang, interp in DATA:
+      self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
+                       interp)
+
+  def test_env_interp(self):
+    """Lines whose shebang launches through `env`."""
+    DATA = (
+        ('#!/usr/bin/env foo', 'foo'),
+        ('#!/bin/env foo', 'foo'),
+        ('#! /bin/env /bin/foo ', '/bin/foo'),
+    )
+    for shebang, interp in DATA:
+      self.assertEqual(hooks.RepoHook._ExtractInterpFromShebang(shebang),
+                       interp)
diff --git a/tests/test_manifest_xml.py b/tests/test_manifest_xml.py
new file mode 100644
index 0000000..cb3eb85
--- /dev/null
+++ b/tests/test_manifest_xml.py
@@ -0,0 +1,845 @@
+# Copyright (C) 2019 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the manifest_xml.py module."""
+
+import os
+import platform
+import re
+import shutil
+import tempfile
+import unittest
+import xml.dom.minidom
+
+import error
+import manifest_xml
+
+
+# Invalid paths that we don't want in the filesystem.
+INVALID_FS_PATHS = (
+    '',
+    '.',
+    '..',
+    '../',
+    './',
+    './/',
+    'foo/',
+    './foo',
+    '../foo',
+    'foo/./bar',
+    'foo/../../bar',
+    '/foo',
+    './../foo',
+    '.git/foo',
+    # Check case folding.
+    '.GIT/foo',
+    'blah/.git/foo',
+    '.repo/foo',
+    '.repoconfig',
+    # Block ~ due to 8.3 filenames on Windows filesystems.
+    '~',
+    'foo~',
+    'blah/foo~',
+    # Block Unicode characters that get normalized out by filesystems.
+    u'foo\u200Cbar',
+    # Block newlines.
+    'f\n/bar',
+    'f\r/bar',
+)
+
+# Make sure platforms that use path separators (e.g. Windows) are also
+# rejected properly.
+if os.path.sep != '/':
+  INVALID_FS_PATHS += tuple(x.replace('/', os.path.sep) for x in INVALID_FS_PATHS)
+
+
+def sort_attributes(manifest):
+  """Sort the attributes of all elements alphabetically.
+
+  This is needed because different versions of the toxml() function from
+  xml.dom.minidom output the attributes of elements in different orders.
+  Before Python 3.8 they were output alphabetically; later versions preserve
+  the order specified by the user.
+
+  Args:
+    manifest: String containing an XML manifest.
+
+  Returns:
+    The XML manifest with the attributes of all elements sorted alphabetically.
+  """
+  new_manifest = ''
+  # This will find every element in the XML manifest, whether they have
+  # attributes or not. This simplifies recreating the manifest below.
+  matches = re.findall(r'(<[/?]?[a-z-]+\s*)((?:\S+?="[^"]+"\s*?)*)(\s*[/?]?>)', manifest)
+  for head, attrs, tail in matches:
+    m = re.findall(r'\S+?="[^"]+"', attrs)
+    new_manifest += head + ' '.join(sorted(m)) + tail
+  return new_manifest
+
+
+class ManifestParseTestCase(unittest.TestCase):
+  """TestCase for parsing manifests."""
+
+  def setUp(self):
+    self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
+    self.repodir = os.path.join(self.tempdir, '.repo')
+    self.manifest_dir = os.path.join(self.repodir, 'manifests')
+    self.manifest_file = os.path.join(
+        self.repodir, manifest_xml.MANIFEST_FILE_NAME)
+    self.local_manifest_dir = os.path.join(
+        self.repodir, manifest_xml.LOCAL_MANIFESTS_DIR_NAME)
+    os.mkdir(self.repodir)
+    os.mkdir(self.manifest_dir)
+
+    # The manifest parsing really wants a git repo currently.
+    gitdir = os.path.join(self.repodir, 'manifests.git')
+    os.mkdir(gitdir)
+    with open(os.path.join(gitdir, 'config'), 'w') as fp:
+      fp.write("""[remote "origin"]
+        url = https://localhost:0/manifest
+""")
+
+  def tearDown(self):
+    shutil.rmtree(self.tempdir, ignore_errors=True)
+
+  def getXmlManifest(self, data):
+    """Helper to initialize a manifest for testing."""
+    with open(self.manifest_file, 'w') as fp:
+      fp.write(data)
+    return manifest_xml.XmlManifest(self.repodir, self.manifest_file)
+
+  @staticmethod
+  def encodeXmlAttr(attr):
+    """Encode |attr| using XML escape rules."""
+    return attr.replace('\r', '&#x000d;').replace('\n', '&#x000a;')
+
+
+class ManifestValidateFilePaths(unittest.TestCase):
+  """Check _ValidateFilePaths helper.
+
+  This doesn't access a real filesystem.
+  """
+
+  def check_both(self, *args):
+    manifest_xml.XmlManifest._ValidateFilePaths('copyfile', *args)
+    manifest_xml.XmlManifest._ValidateFilePaths('linkfile', *args)
+
+  def test_normal_path(self):
+    """Make sure good paths are accepted."""
+    self.check_both('foo', 'bar')
+    self.check_both('foo/bar', 'bar')
+    self.check_both('foo', 'bar/bar')
+    self.check_both('foo/bar', 'bar/bar')
+
+  def test_symlink_targets(self):
+    """Some extra checks for symlinks."""
+    def check(*args):
+      manifest_xml.XmlManifest._ValidateFilePaths('linkfile', *args)
+
+    # We allow symlinks to end in a slash since we allow them to point to dirs
+    # in general.  Technically the slash isn't necessary.
+    check('foo/', 'bar')
+    # We allow a single '.' to get a reference to the project itself.
+    check('.', 'bar')
+
+  def test_bad_paths(self):
+    """Make sure bad paths (src & dest) are rejected."""
+    for path in INVALID_FS_PATHS:
+      self.assertRaises(
+          error.ManifestInvalidPathError, self.check_both, path, 'a')
+      self.assertRaises(
+          error.ManifestInvalidPathError, self.check_both, 'a', path)
+
+
+class ValueTests(unittest.TestCase):
+  """Check utility parsing code."""
+
+  def _get_node(self, text):
+    return xml.dom.minidom.parseString(text).firstChild
+
+  def test_bool_default(self):
+    """Check XmlBool default handling."""
+    node = self._get_node('<node/>')
+    self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
+    self.assertIsNone(manifest_xml.XmlBool(node, 'a', None))
+    self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
+
+    node = self._get_node('<node a=""/>')
+    self.assertIsNone(manifest_xml.XmlBool(node, 'a'))
+
+  def test_bool_invalid(self):
+    """Check XmlBool invalid handling."""
+    node = self._get_node('<node a="moo"/>')
+    self.assertEqual(123, manifest_xml.XmlBool(node, 'a', 123))
+
+  def test_bool_true(self):
+    """Check XmlBool true values."""
+    for value in ('yes', 'true', '1'):
+      node = self._get_node('<node a="%s"/>' % (value,))
+      self.assertTrue(manifest_xml.XmlBool(node, 'a'))
+
+  def test_bool_false(self):
+    """Check XmlBool false values."""
+    for value in ('no', 'false', '0'):
+      node = self._get_node('<node a="%s"/>' % (value,))
+      self.assertFalse(manifest_xml.XmlBool(node, 'a'))
+
+  def test_int_default(self):
+    """Check XmlInt default handling."""
+    node = self._get_node('<node/>')
+    self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
+    self.assertIsNone(manifest_xml.XmlInt(node, 'a', None))
+    self.assertEqual(123, manifest_xml.XmlInt(node, 'a', 123))
+
+    node = self._get_node('<node a=""/>')
+    self.assertIsNone(manifest_xml.XmlInt(node, 'a'))
+
+  def test_int_good(self):
+    """Check XmlInt numeric handling."""
+    for value in (-1, 0, 1, 50000):
+      node = self._get_node('<node a="%s"/>' % (value,))
+      self.assertEqual(value, manifest_xml.XmlInt(node, 'a'))
+
+  def test_int_invalid(self):
+    """Check XmlInt invalid handling."""
+    with self.assertRaises(error.ManifestParseError):
+      node = self._get_node('<node a="xx"/>')
+      manifest_xml.XmlInt(node, 'a')
+
+
+class XmlManifestTests(ManifestParseTestCase):
+  """Check manifest processing."""
+
+  def test_empty(self):
+    """Parse an 'empty' manifest file."""
+    manifest = self.getXmlManifest(
+        '<?xml version="1.0" encoding="UTF-8"?>'
+        '<manifest></manifest>')
+    self.assertEqual(manifest.remotes, {})
+    self.assertEqual(manifest.projects, [])
+
+  def test_link(self):
+    """Verify Link handling with new names."""
+    manifest = manifest_xml.XmlManifest(self.repodir, self.manifest_file)
+    with open(os.path.join(self.manifest_dir, 'foo.xml'), 'w') as fp:
+      fp.write('<manifest></manifest>')
+    manifest.Link('foo.xml')
+    with open(self.manifest_file) as fp:
+      self.assertIn('<include name="foo.xml" />', fp.read())
+
+  def test_toxml_empty(self):
+    """Verify the ToXml() helper."""
+    manifest = self.getXmlManifest(
+        '<?xml version="1.0" encoding="UTF-8"?>'
+        '<manifest></manifest>')
+    self.assertEqual(manifest.ToXml().toxml(), '<?xml version="1.0" ?><manifest/>')
+
+  def test_todict_empty(self):
+    """Verify the ToDict() helper."""
+    manifest = self.getXmlManifest(
+        '<?xml version="1.0" encoding="UTF-8"?>'
+        '<manifest></manifest>')
+    self.assertEqual(manifest.ToDict(), {})
+
+  def test_repo_hooks(self):
+    """Check repo-hooks settings."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <project name="repohooks" path="src/repohooks"/>
+  <repo-hooks in-project="repohooks" enabled-list="a, b"/>
+</manifest>
+""")
+    self.assertEqual(manifest.repo_hooks_project.name, 'repohooks')
+    self.assertEqual(manifest.repo_hooks_project.enabled_repo_hooks, ['a', 'b'])
+
+  def test_repo_hooks_unordered(self):
+    """Check repo-hooks settings work even if the project def comes second."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <repo-hooks in-project="repohooks" enabled-list="a, b"/>
+  <project name="repohooks" path="src/repohooks"/>
+</manifest>
+""")
+    self.assertEqual(manifest.repo_hooks_project.name, 'repohooks')
+    self.assertEqual(manifest.repo_hooks_project.enabled_repo_hooks, ['a', 'b'])
+
+  def test_unknown_tags(self):
+    """Check that unknown (possible future) tags are ignored."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+  <iankaz value="unknown (possible) future tags are ignored"/>
+  <x-custom-tag>X tags are always ignored</x-custom-tag>
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="test-remote"/>'
+        '<default remote="test-remote" revision="refs/heads/main"/>'
+        '<superproject name="superproject"/>'
+        '</manifest>')
+
+  def test_remote_annotations(self):
+    """Check remote settings."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost">
+    <annotation name="foo" value="bar"/>
+  </remote>
+</manifest>
+""")
+    self.assertEqual(manifest.remotes['test-remote'].annotations[0].name, 'foo')
+    self.assertEqual(manifest.remotes['test-remote'].annotations[0].value, 'bar')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="test-remote">'
+        '<annotation name="foo" value="bar"/>'
+        '</remote>'
+        '</manifest>')
+
+
+class IncludeElementTests(ManifestParseTestCase):
+  """Tests for <include>."""
+
+  def test_group_levels(self):
+    root_m = os.path.join(self.manifest_dir, 'root.xml')
+    with open(root_m, 'w') as fp:
+      fp.write("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <include name="level1.xml" groups="level1-group" />
+  <project name="root-name1" path="root-path1" />
+  <project name="root-name2" path="root-path2" groups="r2g1,r2g2" />
+</manifest>
+""")
+    with open(os.path.join(self.manifest_dir, 'level1.xml'), 'w') as fp:
+      fp.write("""
+<manifest>
+  <include name="level2.xml" groups="level2-group" />
+  <project name="level1-name1" path="level1-path1" />
+</manifest>
+""")
+    with open(os.path.join(self.manifest_dir, 'level2.xml'), 'w') as fp:
+      fp.write("""
+<manifest>
+  <project name="level2-name1" path="level2-path1" groups="l2g1,l2g2" />
+</manifest>
+""")
+    include_m = manifest_xml.XmlManifest(self.repodir, root_m)
+    for proj in include_m.projects:
+      if proj.name == 'root-name1':
+        # Check include group not set on root level proj.
+        self.assertNotIn('level1-group', proj.groups)
+      if proj.name == 'root-name2':
+        # Check root proj group not removed.
+        self.assertIn('r2g1', proj.groups)
+      if proj.name == 'level1-name1':
+        # Check level1 proj has inherited group level 1.
+        self.assertIn('level1-group', proj.groups)
+      if proj.name == 'level2-name1':
+        # Check level2 proj has inherited group levels 1 and 2.
+        self.assertIn('level1-group', proj.groups)
+        self.assertIn('level2-group', proj.groups)
+        # Check level2 proj group not removed.
+        self.assertIn('l2g1', proj.groups)
+
+  def test_allow_bad_name_from_user(self):
+    """Check handling of bad name attribute from the user's input."""
+    def parse(name):
+      name = self.encodeXmlAttr(name)
+      manifest = self.getXmlManifest(f"""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <include name="{name}" />
+</manifest>
+""")
+      # Force the manifest to be parsed.
+      manifest.ToXml()
+
+    # Setup target of the include.
+    target = os.path.join(self.tempdir, 'target.xml')
+    with open(target, 'w') as fp:
+      fp.write('<manifest></manifest>')
+
+    # Include with absolute path.
+    parse(os.path.abspath(target))
+
+    # Include with relative path.
+    parse(os.path.relpath(target, self.manifest_dir))
+
+  def test_bad_name_checks(self):
+    """Check handling of bad name attribute."""
+    def parse(name):
+      name = self.encodeXmlAttr(name)
+      # Setup target of the include.
+      with open(os.path.join(self.manifest_dir, 'target.xml'), 'w') as fp:
+        fp.write(f'<manifest><include name="{name}"/></manifest>')
+
+      manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <include name="target.xml" />
+</manifest>
+""")
+      # Force the manifest to be parsed.
+      manifest.ToXml()
+
+    # Handle empty name explicitly because a different codepath rejects it.
+    with self.assertRaises(error.ManifestParseError):
+      parse('')
+
+    for path in INVALID_FS_PATHS:
+      if not path:
+        continue
+
+      with self.assertRaises(error.ManifestInvalidPathError):
+        parse(path)
+
+
+class ProjectElementTests(ManifestParseTestCase):
+  """Tests for <project>."""
+
+  def test_group(self):
+    """Check project group settings."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <project name="test-name" path="test-path"/>
+  <project name="extras" path="path" groups="g1,g2,g1"/>
+</manifest>
+""")
+    self.assertEqual(len(manifest.projects), 2)
+    # Ordering isn't guaranteed.
+    result = {
+        manifest.projects[0].name: manifest.projects[0].groups,
+        manifest.projects[1].name: manifest.projects[1].groups,
+    }
+    self.assertCountEqual(
+        result['test-name'],
+        ['name:test-name', 'all', 'path:test-path'])
+    self.assertCountEqual(
+        result['extras'],
+        ['g1', 'g2', 'g1', 'name:extras', 'all', 'path:path'])
+    groupstr = 'default,platform-' + platform.system().lower()
+    self.assertEqual(groupstr, manifest.GetGroupsStr())
+    groupstr = 'g1,g2,g1'
+    manifest.manifestProject.config.SetString('manifest.groups', groupstr)
+    self.assertEqual(groupstr, manifest.GetGroupsStr())
+
+  def test_set_revision_id(self):
+    """Check setting of project's revisionId."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="test-name"/>
+</manifest>
+""")
+    self.assertEqual(len(manifest.projects), 1)
+    project = manifest.projects[0]
+    project.SetRevisionId('ABCDEF')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="default-remote"/>'
+        '<default remote="default-remote" revision="refs/heads/main"/>'
+        '<project name="test-name" revision="ABCDEF" upstream="refs/heads/main"/>'
+        '</manifest>')
+
+  def test_trailing_slash(self):
+    """Check handling of trailing slashes in attributes."""
+    def parse(name, path):
+      name = self.encodeXmlAttr(name)
+      path = self.encodeXmlAttr(path)
+      return self.getXmlManifest(f"""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="{name}" path="{path}" />
+</manifest>
+""")
+
+    manifest = parse('a/path/', 'foo')
+    self.assertEqual(manifest.projects[0].gitdir,
+                     os.path.join(self.tempdir, '.repo/projects/foo.git'))
+    self.assertEqual(manifest.projects[0].objdir,
+                     os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
+
+    manifest = parse('a/path', 'foo/')
+    self.assertEqual(manifest.projects[0].gitdir,
+                     os.path.join(self.tempdir, '.repo/projects/foo.git'))
+    self.assertEqual(manifest.projects[0].objdir,
+                     os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
+
+    manifest = parse('a/path', 'foo//////')
+    self.assertEqual(manifest.projects[0].gitdir,
+                     os.path.join(self.tempdir, '.repo/projects/foo.git'))
+    self.assertEqual(manifest.projects[0].objdir,
+                     os.path.join(self.tempdir, '.repo/project-objects/a/path.git'))
+
+  def test_toplevel_path(self):
+    """Check the special handling of path=".".
+    def parse(name, path):
+      name = self.encodeXmlAttr(name)
+      path = self.encodeXmlAttr(path)
+      return self.getXmlManifest(f"""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="{name}" path="{path}" />
+</manifest>
+""")
+
+    for path in ('.', './', './/', './//'):
+      manifest = parse('server/path', path)
+      self.assertEqual(manifest.projects[0].gitdir,
+                       os.path.join(self.tempdir, '.repo/projects/..git'))
+
+  def test_bad_path_name_checks(self):
+    """Check handling of bad path & name attributes."""
+    def parse(name, path):
+      name = self.encodeXmlAttr(name)
+      path = self.encodeXmlAttr(path)
+      manifest = self.getXmlManifest(f"""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="{name}" path="{path}" />
+</manifest>
+""")
+      # Force the manifest to be parsed.
+      manifest.ToXml()
+
+    # Verify the parser accepts valid data to avoid buggy tests below.
+    parse('ok', 'ok')
+
+    # Handle empty name explicitly because a different codepath rejects it.
+    # Empty path is OK because it defaults to the name field.
+    with self.assertRaises(error.ManifestParseError):
+      parse('', 'ok')
+
+    for path in INVALID_FS_PATHS:
+      if not path or path.endswith('/'):
+        continue
+
+      with self.assertRaises(error.ManifestInvalidPathError):
+        parse(path, 'ok')
+
+      # We have a dedicated test for path=".".
+      if path not in {'.'}:
+        with self.assertRaises(error.ManifestInvalidPathError):
+          parse('ok', path)
+
+
+class SuperProjectElementTests(ManifestParseTestCase):
+  """Tests for <superproject>."""
+
+  def test_superproject(self):
+    """Check superproject settings."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <superproject name="superproject"/>
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
+    self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
+    self.assertEqual(manifest.superproject['revision'], 'refs/heads/main')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="test-remote"/>'
+        '<default remote="test-remote" revision="refs/heads/main"/>'
+        '<superproject name="superproject"/>'
+        '</manifest>')
+
+  def test_superproject_revision(self):
+    """Check superproject settings with a different revision attribute."""
+    self.maxDiff = None
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/main" />
+  <superproject name="superproject" revision="refs/heads/stable" />
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
+    self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
+    self.assertEqual(manifest.superproject['revision'], 'refs/heads/stable')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="test-remote"/>'
+        '<default remote="test-remote" revision="refs/heads/main"/>'
+        '<superproject name="superproject" revision="refs/heads/stable"/>'
+        '</manifest>')
+
+  def test_superproject_revision_default_negative(self):
+    """Check superproject settings when its revision matches the default revision."""
+    self.maxDiff = None
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" />
+  <default remote="test-remote" revision="refs/heads/stable" />
+  <superproject name="superproject" revision="refs/heads/stable" />
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
+    self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
+    self.assertEqual(manifest.superproject['revision'], 'refs/heads/stable')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="test-remote"/>'
+        '<default remote="test-remote" revision="refs/heads/stable"/>'
+        '<superproject name="superproject"/>'
+        '</manifest>')
+
+  def test_superproject_revision_remote(self):
+    """Check superproject settings when the default revision comes from the remote."""
+    self.maxDiff = None
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="test-remote" fetch="http://localhost" revision="refs/heads/main" />
+  <default remote="test-remote" />
+  <superproject name="superproject" revision="refs/heads/stable" />
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'test-remote')
+    self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/superproject')
+    self.assertEqual(manifest.superproject['revision'], 'refs/heads/stable')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="test-remote" revision="refs/heads/main"/>'
+        '<default remote="test-remote"/>'
+        '<superproject name="superproject" revision="refs/heads/stable"/>'
+        '</manifest>')
+
+  def test_remote(self):
+    """Check superproject settings with a remote."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <remote name="superproject-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <superproject name="platform/superproject" remote="superproject-remote"/>
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'platform/superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'superproject-remote')
+    self.assertEqual(manifest.superproject['remote'].url, 'http://localhost/platform/superproject')
+    self.assertEqual(manifest.superproject['revision'], 'refs/heads/main')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="default-remote"/>'
+        '<remote fetch="http://localhost" name="superproject-remote"/>'
+        '<default remote="default-remote" revision="refs/heads/main"/>'
+        '<superproject name="platform/superproject" remote="superproject-remote"/>'
+        '</manifest>')
+
+  def test_default_remote(self):
+    """Check superproject settings with a default remote."""
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <superproject name="superproject" remote="default-remote"/>
+</manifest>
+""")
+    self.assertEqual(manifest.superproject['name'], 'superproject')
+    self.assertEqual(manifest.superproject['remote'].name, 'default-remote')
+    self.assertEqual(manifest.superproject['revision'], 'refs/heads/main')
+    self.assertEqual(
+        sort_attributes(manifest.ToXml().toxml()),
+        '<?xml version="1.0" ?><manifest>'
+        '<remote fetch="http://localhost" name="default-remote"/>'
+        '<default remote="default-remote" revision="refs/heads/main"/>'
+        '<superproject name="superproject"/>'
+        '</manifest>')
+
+
+class ContactinfoElementTests(ManifestParseTestCase):
+  """Tests for <contactinfo>."""
+
+  def test_contactinfo(self):
+    """Check contactinfo settings."""
+    bugurl = 'http://localhost/contactinfo'
+    manifest = self.getXmlManifest(f"""
+<manifest>
+  <contactinfo bugurl="{bugurl}"/>
+</manifest>
+""")
+    self.assertEqual(manifest.contactinfo.bugurl, bugurl)
+    self.assertEqual(
+        manifest.ToXml().toxml(),
+        '<?xml version="1.0" ?><manifest>'
+        f'<contactinfo bugurl="{bugurl}"/>'
+        '</manifest>')
+
+
+class DefaultElementTests(ManifestParseTestCase):
+  """Tests for <default>."""
+
+  def test_default(self):
+    """Check default settings."""
+    a = manifest_xml._Default()
+    a.revisionExpr = 'foo'
+    a.remote = manifest_xml._XmlRemote(name='remote')
+    b = manifest_xml._Default()
+    b.revisionExpr = 'bar'
+    self.assertEqual(a, a)
+    self.assertNotEqual(a, b)
+    self.assertNotEqual(b, a.remote)
+    self.assertNotEqual(a, 123)
+    self.assertNotEqual(a, None)
+
+
+class RemoteElementTests(ManifestParseTestCase):
+  """Tests for <remote>."""
+
+  def test_remote(self):
+    """Check remote settings."""
+    a = manifest_xml._XmlRemote(name='foo')
+    a.AddAnnotation('key1', 'value1', 'true')
+    b = manifest_xml._XmlRemote(name='foo')
+    b.AddAnnotation('key2', 'value1', 'true')
+    c = manifest_xml._XmlRemote(name='foo')
+    c.AddAnnotation('key1', 'value2', 'true')
+    d = manifest_xml._XmlRemote(name='foo')
+    d.AddAnnotation('key1', 'value1', 'false')
+    self.assertEqual(a, a)
+    self.assertNotEqual(a, b)
+    self.assertNotEqual(a, c)
+    self.assertNotEqual(a, d)
+    self.assertNotEqual(a, manifest_xml._Default())
+    self.assertNotEqual(a, 123)
+    self.assertNotEqual(a, None)
+
+
+class RemoveProjectElementTests(ManifestParseTestCase):
+  """Tests for <remove-project>."""
+
+  def test_remove_one_project(self):
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="myproject" />
+  <remove-project name="myproject" />
+</manifest>
+""")
+    self.assertEqual(manifest.projects, [])
+
+  def test_remove_one_project_one_remains(self):
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="myproject" />
+  <project name="yourproject" />
+  <remove-project name="myproject" />
+</manifest>
+""")
+
+    self.assertEqual(len(manifest.projects), 1)
+    self.assertEqual(manifest.projects[0].name, 'yourproject')
+
+  def test_remove_one_project_doesnt_exist(self):
+    with self.assertRaises(manifest_xml.ManifestParseError):
+      manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <remove-project name="myproject" />
+</manifest>
+""")
+      manifest.projects
+
+  def test_remove_one_optional_project_doesnt_exist(self):
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <remove-project name="myproject" optional="true" />
+</manifest>
+""")
+    self.assertEqual(manifest.projects, [])
+
+
+class ExtendProjectElementTests(ManifestParseTestCase):
+  """Tests for <extend-project>."""
+
+  def test_extend_project_dest_path_single_match(self):
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="myproject" />
+  <extend-project name="myproject" dest-path="bar" />
+</manifest>
+""")
+    self.assertEqual(len(manifest.projects), 1)
+    self.assertEqual(manifest.projects[0].relpath, 'bar')
+
+  def test_extend_project_dest_path_multi_match(self):
+    with self.assertRaises(manifest_xml.ManifestParseError):
+      manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="myproject" path="x" />
+  <project name="myproject" path="y" />
+  <extend-project name="myproject" dest-path="bar" />
+</manifest>
+""")
+      manifest.projects
+
+  def test_extend_project_dest_path_multi_match_path_specified(self):
+    manifest = self.getXmlManifest("""
+<manifest>
+  <remote name="default-remote" fetch="http://localhost" />
+  <default remote="default-remote" revision="refs/heads/main" />
+  <project name="myproject" path="x" />
+  <project name="myproject" path="y" />
+  <extend-project name="myproject" path="x" dest-path="bar" />
+</manifest>
+""")
+    self.assertEqual(len(manifest.projects), 2)
+    if manifest.projects[0].relpath == 'y':
+      self.assertEqual(manifest.projects[1].relpath, 'bar')
+    else:
+      self.assertEqual(manifest.projects[0].relpath, 'bar')
+      self.assertEqual(manifest.projects[1].relpath, 'y')
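For reference, the attribute-parsing semantics exercised by the XmlBool/XmlInt tests at the top of this file can be sketched as a small self-contained snippet. `xml_bool` and `xml_int` here are local stand-ins for the real `manifest_xml` helpers, and `ValueError` stands in for `ManifestParseError`; only the behavior shown in the tests above is modeled:

```python
import xml.dom.minidom


def xml_bool(node, attr, default=None):
    """Stand-in for manifest_xml.XmlBool: missing/empty/invalid -> default."""
    value = node.getAttribute(attr)  # minidom returns '' for missing attrs
    if not value:
        return default
    value = value.lower()
    if value in ('yes', 'true', '1'):
        return True
    if value in ('no', 'false', '0'):
        return False
    return default  # e.g. a="moo" falls back to the default


def xml_int(node, attr, default=None):
    """Stand-in for manifest_xml.XmlInt: missing/empty -> default,
    non-numeric raises (ManifestParseError in the real code)."""
    value = node.getAttribute(attr)
    if not value:
        return default
    try:
        return int(value)
    except ValueError:
        raise ValueError('invalid numeric value: %s' % value)


node = xml.dom.minidom.parseString('<node a="yes" b="50000"/>').documentElement
print(xml_bool(node, 'a'))               # True
print(xml_int(node, 'b'))                # 50000
print(xml_bool(node, 'c', default=123))  # 123 (attribute missing)
```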
diff --git a/tests/test_platform_utils.py b/tests/test_platform_utils.py
new file mode 100644
index 0000000..55b7805
--- /dev/null
+++ b/tests/test_platform_utils.py
@@ -0,0 +1,50 @@
+# Copyright 2021 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the platform_utils.py module."""
+
+import os
+import tempfile
+import unittest
+
+import platform_utils
+
+
+class RemoveTests(unittest.TestCase):
+  """Check remove() helper."""
+
+  def testMissingOk(self):
+    """Check missing_ok handling."""
+    with tempfile.TemporaryDirectory() as tmpdir:
+      path = os.path.join(tmpdir, 'test')
+
+      # Should not fail.
+      platform_utils.remove(path, missing_ok=True)
+
+      # Should fail.
+      self.assertRaises(OSError, platform_utils.remove, path)
+      self.assertRaises(OSError, platform_utils.remove, path, missing_ok=False)
+
+      # Should not fail if it exists.
+      open(path, 'w').close()
+      platform_utils.remove(path, missing_ok=True)
+      self.assertFalse(os.path.exists(path))
+
+      open(path, 'w').close()
+      platform_utils.remove(path)
+      self.assertFalse(os.path.exists(path))
+
+      open(path, 'w').close()
+      platform_utils.remove(path, missing_ok=False)
+      self.assertFalse(os.path.exists(path))
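The `missing_ok` contract these tests pin down can be sketched with a plain-`os` stand-in. `remove_file` below is hypothetical; the real `platform_utils.remove` additionally works around Windows-specific permission and symlink quirks that this sketch omits:

```python
import os
import tempfile


def remove_file(path, missing_ok=False):
    """Stand-in for platform_utils.remove(): delete |path|, optionally
    tolerating an already-missing file."""
    try:
        os.remove(path)
    except FileNotFoundError:
        if not missing_ok:
            raise


with tempfile.TemporaryDirectory() as tmpdir:
    path = os.path.join(tmpdir, 'test')

    remove_file(path, missing_ok=True)  # absent + missing_ok: no error

    open(path, 'w').close()
    remove_file(path)                   # present: removed regardless of flag
    print(os.path.exists(path))         # False
```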
diff --git a/tests/test_project.py b/tests/test_project.py
index 77126df..9b2cc4e 100644
--- a/tests/test_project.py
+++ b/tests/test_project.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2019 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,8 +14,6 @@
 
 """Unittests for the project.py module."""
 
-from __future__ import print_function
-
 import contextlib
 import os
 import shutil
@@ -25,7 +21,10 @@
 import tempfile
 import unittest
 
+import error
+import git_command
 import git_config
+import platform_utils
 import project
 
 
@@ -36,49 +35,22 @@
   # Python 2 support entirely.
   try:
     tempdir = tempfile.mkdtemp(prefix='repo-tests')
-    subprocess.check_call(['git', 'init'], cwd=tempdir)
+
+    # Tests need to assume that main is the default branch at init,
+    # which is not supported via config until git 2.28.
+    cmd = ['git', 'init']
+    if git_command.git_require((2, 28, 0)):
+      cmd += ['--initial-branch=main']
+    else:
+      # Use template dir for init.
+      templatedir = tempfile.mkdtemp(prefix='.test-template')
+      with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
+        fp.write('ref: refs/heads/main\n')
+      cmd += ['--template', templatedir]
+    subprocess.check_call(cmd, cwd=tempdir)
     yield tempdir
   finally:
-    shutil.rmtree(tempdir)
-
-
-class RepoHookShebang(unittest.TestCase):
-  """Check shebang parsing in RepoHook."""
-
-  def test_no_shebang(self):
-    """Lines w/out shebangs should be rejected."""
-    DATA = (
-        '',
-        '# -*- coding:utf-8 -*-\n',
-        '#\n# foo\n',
-        '# Bad shebang in script\n#!/foo\n'
-    )
-    for data in DATA:
-      self.assertIsNone(project.RepoHook._ExtractInterpFromShebang(data))
-
-  def test_direct_interp(self):
-    """Lines whose shebang points directly to the interpreter."""
-    DATA = (
-        ('#!/foo', '/foo'),
-        ('#! /foo', '/foo'),
-        ('#!/bin/foo ', '/bin/foo'),
-        ('#! /usr/foo ', '/usr/foo'),
-        ('#! /usr/foo -args', '/usr/foo'),
-    )
-    for shebang, interp in DATA:
-      self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
-                       interp)
-
-  def test_env_interp(self):
-    """Lines whose shebang launches through `env`."""
-    DATA = (
-        ('#!/usr/bin/env foo', 'foo'),
-        ('#!/bin/env foo', 'foo'),
-        ('#! /bin/env /bin/foo ', '/bin/foo'),
-    )
-    for shebang, interp in DATA:
-      self.assertEqual(project.RepoHook._ExtractInterpFromShebang(shebang),
-                       interp)
+    platform_utils.rmtree(tempdir)
 
 
 class FakeProject(object):
@@ -114,7 +86,7 @@
 
       # Start off with the normal details.
       rb = project.ReviewableBranch(
-          fakeproj, fakeproj.config.GetBranch('work'), 'master')
+          fakeproj, fakeproj.config.GetBranch('work'), 'main')
       self.assertEqual('work', rb.name)
       self.assertEqual(1, len(rb.commits))
       self.assertIn('Del file', rb.commits[0])
@@ -127,10 +99,239 @@
       self.assertTrue(rb.date)
 
       # Now delete the tracking branch!
-      fakeproj.work_git.branch('-D', 'master')
+      fakeproj.work_git.branch('-D', 'main')
       rb = project.ReviewableBranch(
-          fakeproj, fakeproj.config.GetBranch('work'), 'master')
+          fakeproj, fakeproj.config.GetBranch('work'), 'main')
       self.assertEqual(0, len(rb.commits))
       self.assertFalse(rb.base_exists)
       # Hard to assert anything useful about this.
       self.assertTrue(rb.date)
+
+
+class CopyLinkTestCase(unittest.TestCase):
+  """TestCase for stub repo client checkouts.
+
+  It'll have a layout like:
+    tempdir/          # self.tempdir
+      checkout/       # self.topdir
+        git-project/  # self.worktree
+
+  Attributes:
+    tempdir: A dedicated temporary directory.
+    worktree: The top of the repo client checkout.
+    topdir: The top of a project checkout.
+  """
+
+  def setUp(self):
+    self.tempdir = tempfile.mkdtemp(prefix='repo_tests')
+    self.topdir = os.path.join(self.tempdir, 'checkout')
+    self.worktree = os.path.join(self.topdir, 'git-project')
+    os.makedirs(self.topdir)
+    os.makedirs(self.worktree)
+
+  def tearDown(self):
+    shutil.rmtree(self.tempdir, ignore_errors=True)
+
+  @staticmethod
+  def touch(path):
+    with open(path, 'w'):
+      pass
+
+  def assertExists(self, path, msg=None):
+    """Make sure |path| exists."""
+    if os.path.exists(path):
+      return
+
+    if msg is None:
+      msg = ['path is missing: %s' % path]
+      while path != '/':
+        path = os.path.dirname(path)
+        if not path:
+          # If we're given something like "foo", abort once we get to "".
+          break
+        result = os.path.exists(path)
+        msg.append('\tos.path.exists(%s): %s' % (path, result))
+        if result:
+          msg.append('\tcontents: %r' % os.listdir(path))
+          break
+      msg = '\n'.join(msg)
+
+    raise self.failureException(msg)
+
+
+class CopyFile(CopyLinkTestCase):
+  """Check _CopyFile handling."""
+
+  def CopyFile(self, src, dest):
+    return project._CopyFile(self.worktree, src, self.topdir, dest)
+
+  def test_basic(self):
+    """Basic test of copying a file from a project to the toplevel."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    cf = self.CopyFile('foo.txt', 'foo')
+    cf._Copy()
+    self.assertExists(os.path.join(self.topdir, 'foo'))
+
+  def test_src_subdir(self):
+    """Copy a file from a subdir of a project."""
+    src = os.path.join(self.worktree, 'bar', 'foo.txt')
+    os.makedirs(os.path.dirname(src))
+    self.touch(src)
+    cf = self.CopyFile('bar/foo.txt', 'new.txt')
+    cf._Copy()
+    self.assertExists(os.path.join(self.topdir, 'new.txt'))
+
+  def test_dest_subdir(self):
+    """Copy a file to a subdir of a checkout."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    cf = self.CopyFile('foo.txt', 'sub/dir/new.txt')
+    self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
+    cf._Copy()
+    self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'new.txt'))
+
+  def test_update(self):
+    """Make sure changed files get copied again."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    dest = os.path.join(self.topdir, 'bar')
+    with open(src, 'w') as f:
+      f.write('1st')
+    cf = self.CopyFile('foo.txt', 'bar')
+    cf._Copy()
+    self.assertExists(dest)
+    with open(dest) as f:
+      self.assertEqual(f.read(), '1st')
+
+    with open(src, 'w') as f:
+      f.write('2nd!')
+    cf._Copy()
+    with open(dest) as f:
+      self.assertEqual(f.read(), '2nd!')
+
+  def test_src_block_symlink(self):
+    """Do not allow reading from a symlinked path."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    sym = os.path.join(self.worktree, 'sym')
+    self.touch(src)
+    platform_utils.symlink('foo.txt', sym)
+    self.assertExists(sym)
+    cf = self.CopyFile('sym', 'foo')
+    self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
+
+  def test_src_block_symlink_traversal(self):
+    """Do not allow reading through a symlink dir."""
+    realfile = os.path.join(self.tempdir, 'file.txt')
+    self.touch(realfile)
+    src = os.path.join(self.worktree, 'bar', 'file.txt')
+    platform_utils.symlink(self.tempdir, os.path.join(self.worktree, 'bar'))
+    self.assertExists(src)
+    cf = self.CopyFile('bar/file.txt', 'foo')
+    self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
+
+  def test_src_block_copy_from_dir(self):
+    """Do not allow copying from a directory."""
+    src = os.path.join(self.worktree, 'dir')
+    os.makedirs(src)
+    cf = self.CopyFile('dir', 'foo')
+    self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
+
+  def test_dest_block_symlink(self):
+    """Do not allow writing to a symlink."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    platform_utils.symlink('dest', os.path.join(self.topdir, 'sym'))
+    cf = self.CopyFile('foo.txt', 'sym')
+    self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
+
+  def test_dest_block_symlink_traversal(self):
+    """Do not allow writing through a symlink dir."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    platform_utils.symlink(tempfile.gettempdir(),
+                           os.path.join(self.topdir, 'sym'))
+    cf = self.CopyFile('foo.txt', 'sym/foo.txt')
+    self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
+
+  def test_src_block_copy_to_dir(self):
+    """Do not allow copying to a directory."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    os.makedirs(os.path.join(self.topdir, 'dir'))
+    cf = self.CopyFile('foo.txt', 'dir')
+    self.assertRaises(error.ManifestInvalidPathError, cf._Copy)
+
+
+class LinkFile(CopyLinkTestCase):
+  """Check _LinkFile handling."""
+
+  def LinkFile(self, src, dest):
+    return project._LinkFile(self.worktree, src, self.topdir, dest)
+
+  def test_basic(self):
+    """Basic test of linking a file from a project into the toplevel."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    lf = self.LinkFile('foo.txt', 'foo')
+    lf._Link()
+    dest = os.path.join(self.topdir, 'foo')
+    self.assertExists(dest)
+    self.assertTrue(os.path.islink(dest))
+    self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
+
+  def test_src_subdir(self):
+    """Link to a file in a subdir of a project."""
+    src = os.path.join(self.worktree, 'bar', 'foo.txt')
+    os.makedirs(os.path.dirname(src))
+    self.touch(src)
+    lf = self.LinkFile('bar/foo.txt', 'foo')
+    lf._Link()
+    self.assertExists(os.path.join(self.topdir, 'foo'))
+
+  def test_src_self(self):
+    """Link to the project itself."""
+    dest = os.path.join(self.topdir, 'foo', 'bar')
+    lf = self.LinkFile('.', 'foo/bar')
+    lf._Link()
+    self.assertExists(dest)
+    self.assertEqual(os.path.join('..', 'git-project'), os.readlink(dest))
+
+  def test_dest_subdir(self):
+    """Link a file to a subdir of a checkout."""
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    lf = self.LinkFile('foo.txt', 'sub/dir/foo/bar')
+    self.assertFalse(os.path.exists(os.path.join(self.topdir, 'sub')))
+    lf._Link()
+    self.assertExists(os.path.join(self.topdir, 'sub', 'dir', 'foo', 'bar'))
+
+  def test_src_block_relative(self):
+    """Do not allow relative symlinks."""
+    BAD_SOURCES = (
+        './',
+        '..',
+        '../',
+        'foo/.',
+        'foo/./bar',
+        'foo/..',
+        'foo/../foo',
+    )
+    for src in BAD_SOURCES:
+      lf = self.LinkFile(src, 'foo')
+      self.assertRaises(error.ManifestInvalidPathError, lf._Link)
+
+  def test_update(self):
+    """Make sure changed targets get updated."""
+    dest = os.path.join(self.topdir, 'sym')
+
+    src = os.path.join(self.worktree, 'foo.txt')
+    self.touch(src)
+    lf = self.LinkFile('foo.txt', 'sym')
+    lf._Link()
+    self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
+
+    # Point the symlink somewhere else.
+    os.unlink(dest)
+    platform_utils.symlink(self.tempdir, dest)
+    lf._Link()
+    self.assertEqual(os.path.join('git-project', 'foo.txt'), os.readlink(dest))
diff --git a/tests/test_ssh.py b/tests/test_ssh.py
new file mode 100644
index 0000000..ffb5cb9
--- /dev/null
+++ b/tests/test_ssh.py
@@ -0,0 +1,74 @@
+# Copyright 2019 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the ssh.py module."""
+
+import multiprocessing
+import subprocess
+import unittest
+from unittest import mock
+
+import ssh
+
+
+class SshTests(unittest.TestCase):
+  """Tests the ssh functions."""
+
+  def test_parse_ssh_version(self):
+    """Check _parse_ssh_version() handling."""
+    ver = ssh._parse_ssh_version('Unknown\n')
+    self.assertEqual(ver, ())
+    ver = ssh._parse_ssh_version('OpenSSH_1.0\n')
+    self.assertEqual(ver, (1, 0))
+    ver = ssh._parse_ssh_version('OpenSSH_6.6.1p1 Ubuntu-2ubuntu2.13, OpenSSL 1.0.1f 6 Jan 2014\n')
+    self.assertEqual(ver, (6, 6, 1))
+    ver = ssh._parse_ssh_version('OpenSSH_7.6p1 Ubuntu-4ubuntu0.3, OpenSSL 1.0.2n  7 Dec 2017\n')
+    self.assertEqual(ver, (7, 6))
+
+  def test_version(self):
+    """Check version() handling."""
+    with mock.patch('ssh._run_ssh_version', return_value='OpenSSH_1.2\n'):
+      self.assertEqual(ssh.version(), (1, 2))
+
+  def test_context_manager_empty(self):
+    """Verify context manager with no clients works correctly."""
+    with multiprocessing.Manager() as manager:
+      with ssh.ProxyManager(manager):
+        pass
+
+  def test_context_manager_child_cleanup(self):
+    """Verify orphaned clients & masters get cleaned up."""
+    with multiprocessing.Manager() as manager:
+      with ssh.ProxyManager(manager) as ssh_proxy:
+        client = subprocess.Popen(['sleep', '964853320'])
+        ssh_proxy.add_client(client)
+        master = subprocess.Popen(['sleep', '964853321'])
+        ssh_proxy.add_master(master)
+    # If the processes still exist, these will raise timeout errors.
+    client.wait(0)
+    master.wait(0)
+
+  def test_ssh_sock(self):
+    """Check sock() function."""
+    manager = multiprocessing.Manager()
+    proxy = ssh.ProxyManager(manager)
+    with mock.patch('tempfile.mkdtemp', return_value='/tmp/foo'):
+      # old ssh version uses port
+      with mock.patch('ssh.version', return_value=(6, 6)):
+        self.assertTrue(proxy.sock().endswith('%p'))
+
+      proxy._sock_path = None
+      # new ssh version uses hash
+      with mock.patch('ssh.version', return_value=(6, 7)):
+        self.assertTrue(proxy.sock().endswith('%C'))
diff --git a/tests/test_subcmds.py b/tests/test_subcmds.py
new file mode 100644
index 0000000..bc53051
--- /dev/null
+++ b/tests/test_subcmds.py
@@ -0,0 +1,73 @@
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the subcmds module (mostly __init__.py rather than the subcommands)."""
+
+import optparse
+import unittest
+
+import subcmds
+
+
+class AllCommands(unittest.TestCase):
+  """Check registered all_commands."""
+
+  def test_required_basic(self):
+    """Basic checking of registered commands."""
+    # NB: We don't test all subcommands as we want to avoid "change detection"
+    # tests, so we just look for the most common/important ones here that are
+    # unlikely to ever change.
+    for cmd in {'cherry-pick', 'help', 'init', 'start', 'sync', 'upload'}:
+      self.assertIn(cmd, subcmds.all_commands)
+
+  def test_naming(self):
+    """Verify we don't add things that we shouldn't."""
+    for cmd in subcmds.all_commands:
+      # Reject filename suffixes like "help.py".
+      self.assertNotIn('.', cmd)
+
+      # Make sure all '_' were converted to '-'.
+      self.assertNotIn('_', cmd)
+
+      # Reject internal python paths like "__init__".
+      self.assertFalse(cmd.startswith('__'))
+
+  def test_help_desc_style(self):
+    """Force some consistency in option descriptions.
+
+    Python's optparse & argparse have a few default options like --help.  Their
+    option description text uses lowercase sentence fragments, so enforce that
+    our options follow the same style so the UI is consistent.
+
+    We enforce:
+    * Text starts with lowercase.
+    * Text doesn't end with period.
+    """
+    for name, cls in subcmds.all_commands.items():
+      cmd = cls()
+      parser = cmd.OptionParser
+      for option in parser.option_list:
+        if option.help == optparse.SUPPRESS_HELP:
+          continue
+
+        c = option.help[0]
+        self.assertEqual(
+            c.lower(), c,
+            msg=f'subcmds/{name}.py: {option.get_opt_string()}: help text '
+                f'should start with lowercase: "{option.help}"')
+
+        self.assertNotEqual(
+            option.help[-1], '.',
+            msg=f'subcmds/{name}.py: {option.get_opt_string()}: help text '
+                f'should not end in a period: "{option.help}"')
diff --git a/tests/test_subcmds_init.py b/tests/test_subcmds_init.py
new file mode 100644
index 0000000..af4346d
--- /dev/null
+++ b/tests/test_subcmds_init.py
@@ -0,0 +1,49 @@
+# Copyright (C) 2020 The Android Open Source Project
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Unittests for the subcmds/init.py module."""
+
+import unittest
+
+from subcmds import init
+
+
+class InitCommand(unittest.TestCase):
+  """Check the init command."""
+
+  def setUp(self):
+    self.cmd = init.Init()
+
+  def test_cli_parser_good(self):
+    """Check valid command line options."""
+    ARGV = (
+        [],
+    )
+    for argv in ARGV:
+      opts, args = self.cmd.OptionParser.parse_args(argv)
+      self.cmd.ValidateOptions(opts, args)
+
+  def test_cli_parser_bad(self):
+    """Check invalid command line options."""
+    ARGV = (
+        # Too many arguments.
+        ['url', 'asdf'],
+
+        # Conflicting options.
+        ['--mirror', '--archive'],
+    )
+    for argv in ARGV:
+      opts, args = self.cmd.OptionParser.parse_args(argv)
+      with self.assertRaises(SystemExit):
+        self.cmd.ValidateOptions(opts, args)
diff --git a/tests/test_wrapper.py b/tests/test_wrapper.py
index 8ef8d48..e9a1f64 100644
--- a/tests/test_wrapper.py
+++ b/tests/test_wrapper.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2015 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -16,27 +14,87 @@
 
 """Unittests for the wrapper.py module."""
 
-from __future__ import print_function
-
+import contextlib
+from io import StringIO
 import os
+import re
+import shutil
+import sys
+import tempfile
 import unittest
+from unittest import mock
 
+import git_command
+import main
+import platform_utils
 import wrapper
 
+
+@contextlib.contextmanager
+def TemporaryDirectory():
+  """Create a new temporary directory for testing."""
+  # TODO(vapier): Convert this to tempfile.TemporaryDirectory once we drop
+  # Python 2 support entirely.
+  try:
+    tempdir = tempfile.mkdtemp(prefix='repo-tests')
+    yield tempdir
+  finally:
+    platform_utils.rmtree(tempdir)
+
+
 def fixture(*paths):
   """Return a path relative to tests/fixtures.
   """
   return os.path.join(os.path.dirname(__file__), 'fixtures', *paths)
 
-class RepoWrapperUnitTest(unittest.TestCase):
-  """Tests helper functions in the repo wrapper
-  """
+
+class RepoWrapperTestCase(unittest.TestCase):
+  """TestCase for the wrapper module."""
+
   def setUp(self):
-    """Load the wrapper module every time
-    """
+    """Load the wrapper module every time."""
     wrapper._wrapper_module = None
     self.wrapper = wrapper.Wrapper()
 
+
+class RepoWrapperUnitTest(RepoWrapperTestCase):
+  """Tests helper functions in the repo wrapper
+  """
+
+  def test_version(self):
+    """Make sure _Version works."""
+    with self.assertRaises(SystemExit) as e:
+      with mock.patch('sys.stdout', new_callable=StringIO) as stdout:
+        with mock.patch('sys.stderr', new_callable=StringIO) as stderr:
+          self.wrapper._Version()
+    self.assertEqual(0, e.exception.code)
+    self.assertEqual('', stderr.getvalue())
+    self.assertIn('repo launcher version', stdout.getvalue())
+
+  def test_python_constraints(self):
+    """The launcher should never require newer than main.py."""
+    self.assertGreaterEqual(main.MIN_PYTHON_VERSION_HARD,
+                            wrapper.MIN_PYTHON_VERSION_HARD)
+    self.assertGreaterEqual(main.MIN_PYTHON_VERSION_SOFT,
+                            wrapper.MIN_PYTHON_VERSION_SOFT)
+    # Make sure the versions are themselves in sync.
+    self.assertGreaterEqual(wrapper.MIN_PYTHON_VERSION_SOFT,
+                            wrapper.MIN_PYTHON_VERSION_HARD)
+
+  def test_init_parser(self):
+    """Make sure 'init' GetParser works."""
+    parser = self.wrapper.GetParser(gitc_init=False)
+    opts, args = parser.parse_args([])
+    self.assertEqual([], args)
+    self.assertIsNone(opts.manifest_url)
+
+  def test_gitc_init_parser(self):
+    """Make sure 'gitc-init' GetParser works."""
+    parser = self.wrapper.GetParser(gitc_init=True)
+    opts, args = parser.parse_args([])
+    self.assertEqual([], args)
+    self.assertIsNone(opts.manifest_file)
+
   def test_get_gitc_manifest_dir_no_gitc(self):
     """
     Test reading a missing gitc config file
@@ -72,9 +130,442 @@
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/test/extra'), 'test')
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test'), 'test')
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/'), 'test')
-    self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'), 'test')
+    self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/test/extra'),
+                     'test')
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/gitc/manifest-rw/'), None)
     self.assertEqual(self.wrapper.gitc_parse_clientdir('/test/usr/local/google/gitc/'), None)
 
+
+class SetGitTrace2ParentSid(RepoWrapperTestCase):
+  """Check SetGitTrace2ParentSid behavior."""
+
+  KEY = 'GIT_TRACE2_PARENT_SID'
+  VALID_FORMAT = re.compile(r'^repo-[0-9]{8}T[0-9]{6}Z-P[0-9a-f]{8}$')
+
+  def test_first_set(self):
+    """Test env var not yet set."""
+    env = {}
+    self.wrapper.SetGitTrace2ParentSid(env)
+    self.assertIn(self.KEY, env)
+    value = env[self.KEY]
+    self.assertRegex(value, self.VALID_FORMAT)
+
+  def test_append(self):
+    """Test env var is appended."""
+    env = {self.KEY: 'pfx'}
+    self.wrapper.SetGitTrace2ParentSid(env)
+    self.assertIn(self.KEY, env)
+    value = env[self.KEY]
+    self.assertTrue(value.startswith('pfx/'))
+    self.assertRegex(value[4:], self.VALID_FORMAT)
+
+  def test_global_context(self):
+    """Check os.environ gets updated by default."""
+    os.environ.pop(self.KEY, None)
+    self.wrapper.SetGitTrace2ParentSid()
+    self.assertIn(self.KEY, os.environ)
+    value = os.environ[self.KEY]
+    self.assertRegex(value, self.VALID_FORMAT)
+
+
+class RunCommand(RepoWrapperTestCase):
+  """Check run_command behavior."""
+
+  def test_capture(self):
+    """Check capture_output handling."""
+    ret = self.wrapper.run_command(['echo', 'hi'], capture_output=True)
+    self.assertEqual(ret.stdout, 'hi\n')
+
+  def test_check(self):
+    """Check check handling."""
+    self.wrapper.run_command(['true'], check=False)
+    self.wrapper.run_command(['true'], check=True)
+    self.wrapper.run_command(['false'], check=False)
+    with self.assertRaises(self.wrapper.RunError):
+      self.wrapper.run_command(['false'], check=True)
+
+
+class RunGit(RepoWrapperTestCase):
+  """Check run_git behavior."""
+
+  def test_capture(self):
+    """Check capture_output handling."""
+    ret = self.wrapper.run_git('--version')
+    self.assertIn('git', ret.stdout)
+
+  def test_check(self):
+    """Check check handling."""
+    with self.assertRaises(self.wrapper.CloneFailure):
+      self.wrapper.run_git('--version-asdfasdf')
+    self.wrapper.run_git('--version-asdfasdf', check=False)
+
+
+class ParseGitVersion(RepoWrapperTestCase):
+  """Check ParseGitVersion behavior."""
+
+  def test_autoload(self):
+    """Check we can load the version from the live git."""
+    ret = self.wrapper.ParseGitVersion()
+    self.assertIsNotNone(ret)
+
+  def test_bad_ver(self):
+    """Check handling of bad git versions."""
+    ret = self.wrapper.ParseGitVersion(ver_str='asdf')
+    self.assertIsNone(ret)
+
+  def test_normal_ver(self):
+    """Check handling of normal git versions."""
+    ret = self.wrapper.ParseGitVersion(ver_str='git version 2.25.1')
+    self.assertEqual(2, ret.major)
+    self.assertEqual(25, ret.minor)
+    self.assertEqual(1, ret.micro)
+    self.assertEqual('2.25.1', ret.full)
+
+  def test_extended_ver(self):
+    """Check handling of extended distro git versions."""
+    ret = self.wrapper.ParseGitVersion(
+        ver_str='git version 1.30.50.696.g5e7596f4ac-goog')
+    self.assertEqual(1, ret.major)
+    self.assertEqual(30, ret.minor)
+    self.assertEqual(50, ret.micro)
+    self.assertEqual('1.30.50.696.g5e7596f4ac-goog', ret.full)
+
+
+class CheckGitVersion(RepoWrapperTestCase):
+  """Check _CheckGitVersion behavior."""
+
+  def test_unknown(self):
+    """Unknown versions should abort."""
+    with mock.patch.object(self.wrapper, 'ParseGitVersion', return_value=None):
+      with self.assertRaises(self.wrapper.CloneFailure):
+        self.wrapper._CheckGitVersion()
+
+  def test_old(self):
+    """Old versions should abort."""
+    with mock.patch.object(
+        self.wrapper, 'ParseGitVersion',
+        return_value=self.wrapper.GitVersion(1, 0, 0, '1.0.0')):
+      with self.assertRaises(self.wrapper.CloneFailure):
+        self.wrapper._CheckGitVersion()
+
+  def test_new(self):
+    """Newer versions should run fine."""
+    with mock.patch.object(
+        self.wrapper, 'ParseGitVersion',
+        return_value=self.wrapper.GitVersion(100, 0, 0, '100.0.0')):
+      self.wrapper._CheckGitVersion()
+
+
+class Requirements(RepoWrapperTestCase):
+  """Check Requirements handling."""
+
+  def test_missing_file(self):
+    """Don't crash if the file is missing (old version)."""
+    testdir = os.path.dirname(os.path.realpath(__file__))
+    self.assertIsNone(self.wrapper.Requirements.from_dir(testdir))
+    self.assertIsNone(self.wrapper.Requirements.from_file(
+        os.path.join(testdir, 'xxxxxxxxxxxxxxxxxxxxxxxx')))
+
+  def test_corrupt_data(self):
+    """If the file can't be parsed, don't blow up."""
+    self.assertIsNone(self.wrapper.Requirements.from_file(__file__))
+    self.assertIsNone(self.wrapper.Requirements.from_data(b'x'))
+
+  def test_valid_data(self):
+    """Make sure we can parse the file we ship."""
+    self.assertIsNotNone(self.wrapper.Requirements.from_data(b'{}'))
+    rootdir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
+    self.assertIsNotNone(self.wrapper.Requirements.from_dir(rootdir))
+    self.assertIsNotNone(self.wrapper.Requirements.from_file(os.path.join(
+        rootdir, 'requirements.json')))
+
+  def test_format_ver(self):
+    """Check format_ver can format."""
+    self.assertEqual('1.2.3', self.wrapper.Requirements._format_ver((1, 2, 3)))
+    self.assertEqual('1', self.wrapper.Requirements._format_ver([1]))
+
+  def test_assert_all_unknown(self):
+    """Check assert_all works with incompatible file."""
+    reqs = self.wrapper.Requirements({})
+    reqs.assert_all()
+
+  def test_assert_all_new_repo(self):
+    """Check assert_all accepts new enough repo."""
+    reqs = self.wrapper.Requirements({'repo': {'hard': [1, 0]}})
+    reqs.assert_all()
+
+  def test_assert_all_old_repo(self):
+    """Check assert_all rejects old repo."""
+    reqs = self.wrapper.Requirements({'repo': {'hard': [99999, 0]}})
+    with self.assertRaises(SystemExit):
+      reqs.assert_all()
+
+  def test_assert_all_new_python(self):
+    """Check assert_all accepts new enough python."""
+    reqs = self.wrapper.Requirements({'python': {'hard': sys.version_info}})
+    reqs.assert_all()
+
+  def test_assert_all_old_python(self):
+    """Check assert_all rejects old python."""
+    reqs = self.wrapper.Requirements({'python': {'hard': [99999, 0]}})
+    with self.assertRaises(SystemExit):
+      reqs.assert_all()
+
+  def test_assert_ver_unknown(self):
+    """Check assert_ver works with incompatible file."""
+    reqs = self.wrapper.Requirements({})
+    reqs.assert_ver('xxx', (1, 0))
+
+  def test_assert_ver_new(self):
+    """Check assert_ver allows new enough versions."""
+    reqs = self.wrapper.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
+    reqs.assert_ver('git', (1, 0))
+    reqs.assert_ver('git', (1, 5))
+    reqs.assert_ver('git', (2, 0))
+    reqs.assert_ver('git', (2, 5))
+
+  def test_assert_ver_old(self):
+    """Check assert_ver rejects old versions."""
+    reqs = self.wrapper.Requirements({'git': {'hard': [1, 0], 'soft': [2, 0]}})
+    with self.assertRaises(SystemExit):
+      reqs.assert_ver('git', (0, 5))
+
+
+class NeedSetupGnuPG(RepoWrapperTestCase):
+  """Check NeedSetupGnuPG behavior."""
+
+  def test_missing_dir(self):
+    """The ~/.repoconfig tree doesn't exist yet."""
+    with TemporaryDirectory() as tempdir:
+      self.wrapper.home_dot_repo = os.path.join(tempdir, 'foo')
+      self.assertTrue(self.wrapper.NeedSetupGnuPG())
+
+  def test_missing_keyring(self):
+    """The keyring-version file doesn't exist yet."""
+    with TemporaryDirectory() as tempdir:
+      self.wrapper.home_dot_repo = tempdir
+      self.assertTrue(self.wrapper.NeedSetupGnuPG())
+
+  def test_empty_keyring(self):
+    """The keyring-version file exists, but is empty."""
+    with TemporaryDirectory() as tempdir:
+      self.wrapper.home_dot_repo = tempdir
+      with open(os.path.join(tempdir, 'keyring-version'), 'w'):
+        pass
+      self.assertTrue(self.wrapper.NeedSetupGnuPG())
+
+  def test_old_keyring(self):
+    """The keyring-version file exists, but it's old."""
+    with TemporaryDirectory() as tempdir:
+      self.wrapper.home_dot_repo = tempdir
+      with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
+        fp.write('1.0\n')
+      self.assertTrue(self.wrapper.NeedSetupGnuPG())
+
+  def test_new_keyring(self):
+    """The keyring-version file exists, and is up-to-date."""
+    with TemporaryDirectory() as tempdir:
+      self.wrapper.home_dot_repo = tempdir
+      with open(os.path.join(tempdir, 'keyring-version'), 'w') as fp:
+        fp.write('1000.0\n')
+      self.assertFalse(self.wrapper.NeedSetupGnuPG())
+
+
+class SetupGnuPG(RepoWrapperTestCase):
+  """Check SetupGnuPG behavior."""
+
+  def test_full(self):
+    """Make sure it works completely."""
+    with TemporaryDirectory() as tempdir:
+      self.wrapper.home_dot_repo = tempdir
+      self.wrapper.gpg_dir = os.path.join(self.wrapper.home_dot_repo, 'gnupg')
+      self.assertTrue(self.wrapper.SetupGnuPG(True))
+      with open(os.path.join(tempdir, 'keyring-version'), 'r') as fp:
+        data = fp.read()
+      self.assertEqual('.'.join(str(x) for x in self.wrapper.KEYRING_VERSION),
+                       data.strip())
+
+
+class VerifyRev(RepoWrapperTestCase):
+  """Check verify_rev behavior."""
+
+  def test_verify_passes(self):
+    """Check when we have a valid signed tag."""
+    desc_result = self.wrapper.RunResult(0, 'v1.0\n', '')
+    gpg_result = self.wrapper.RunResult(0, '', '')
+    with mock.patch.object(self.wrapper, 'run_git',
+                           side_effect=(desc_result, gpg_result)):
+      ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
+      self.assertEqual('v1.0^0', ret)
+
+  def test_unsigned_commit(self):
+    """Check we fall back to signed tag when we have an unsigned commit."""
+    desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
+    gpg_result = self.wrapper.RunResult(0, '', '')
+    with mock.patch.object(self.wrapper, 'run_git',
+                           side_effect=(desc_result, gpg_result)):
+      ret = self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
+      self.assertEqual('v1.0^0', ret)
+
+  def test_verify_fails(self):
+    """Check we raise when tag verification fails."""
+    desc_result = self.wrapper.RunResult(0, 'v1.0-10-g1234\n', '')
+    gpg_result = Exception
+    with mock.patch.object(self.wrapper, 'run_git',
+                           side_effect=(desc_result, gpg_result)):
+      with self.assertRaises(Exception):
+        self.wrapper.verify_rev('/', 'refs/heads/stable', '1234', True)
+
+
+class GitCheckoutTestCase(RepoWrapperTestCase):
+  """Tests that use a real/small git checkout."""
+
+  GIT_DIR = None
+  REV_LIST = None
+
+  @classmethod
+  def setUpClass(cls):
+    # Create a repo to operate on, but do it once per-class.
+    cls.GIT_DIR = tempfile.mkdtemp(prefix='repo-rev-tests')
+    run_git = wrapper.Wrapper().run_git
+
+    remote = os.path.join(cls.GIT_DIR, 'remote')
+    os.mkdir(remote)
+
+    # Tests need to assume that "main" is the default branch at init,
+    # which is not supported in config until git 2.28.
+    if git_command.git_require((2, 28, 0)):
+      initstr = '--initial-branch=main'
+    else:
+      # Use template dir for init.
+      templatedir = tempfile.mkdtemp(prefix='.test-template')
+      with open(os.path.join(templatedir, 'HEAD'), 'w') as fp:
+        fp.write('ref: refs/heads/main\n')
+      initstr = '--template=' + templatedir
+
+    run_git('init', initstr, cwd=remote)
+    run_git('commit', '--allow-empty', '-minit', cwd=remote)
+    run_git('branch', 'stable', cwd=remote)
+    run_git('tag', 'v1.0', cwd=remote)
+    run_git('commit', '--allow-empty', '-m2nd commit', cwd=remote)
+    cls.REV_LIST = run_git('rev-list', 'HEAD', cwd=remote).stdout.splitlines()
+
+    run_git('init', cwd=cls.GIT_DIR)
+    run_git('fetch', remote, '+refs/heads/*:refs/remotes/origin/*', cwd=cls.GIT_DIR)
+
+  @classmethod
+  def tearDownClass(cls):
+    if not cls.GIT_DIR:
+      return
+
+    shutil.rmtree(cls.GIT_DIR)
+
+
+class ResolveRepoRev(GitCheckoutTestCase):
+  """Check resolve_repo_rev behavior."""
+
+  def test_explicit_branch(self):
+    """Check refs/heads/branch argument."""
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/stable')
+    self.assertEqual('refs/heads/stable', rrev)
+    self.assertEqual(self.REV_LIST[1], lrev)
+
+    with self.assertRaises(wrapper.CloneFailure):
+      self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/heads/unknown')
+
+  def test_explicit_tag(self):
+    """Check refs/tags/tag argument."""
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/v1.0')
+    self.assertEqual('refs/tags/v1.0', rrev)
+    self.assertEqual(self.REV_LIST[1], lrev)
+
+    with self.assertRaises(wrapper.CloneFailure):
+      self.wrapper.resolve_repo_rev(self.GIT_DIR, 'refs/tags/unknown')
+
+  def test_branch_name(self):
+    """Check branch argument."""
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'stable')
+    self.assertEqual('refs/heads/stable', rrev)
+    self.assertEqual(self.REV_LIST[1], lrev)
+
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'main')
+    self.assertEqual('refs/heads/main', rrev)
+    self.assertEqual(self.REV_LIST[0], lrev)
+
+  def test_tag_name(self):
+    """Check tag argument."""
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, 'v1.0')
+    self.assertEqual('refs/tags/v1.0', rrev)
+    self.assertEqual(self.REV_LIST[1], lrev)
+
+  def test_full_commit(self):
+    """Check specific commit argument."""
+    commit = self.REV_LIST[0]
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
+    self.assertEqual(commit, rrev)
+    self.assertEqual(commit, lrev)
+
+  def test_partial_commit(self):
+    """Check specific (partial) commit argument."""
+    commit = self.REV_LIST[0][0:20]
+    rrev, lrev = self.wrapper.resolve_repo_rev(self.GIT_DIR, commit)
+    self.assertEqual(self.REV_LIST[0], rrev)
+    self.assertEqual(self.REV_LIST[0], lrev)
+
+  def test_unknown(self):
+    """Check unknown ref/commit argument."""
+    with self.assertRaises(wrapper.CloneFailure):
+      self.wrapper.resolve_repo_rev(self.GIT_DIR, 'boooooooya')
+
+
+class CheckRepoVerify(RepoWrapperTestCase):
+  """Check check_repo_verify behavior."""
+
+  def test_no_verify(self):
+    """Always fail with --no-repo-verify."""
+    self.assertFalse(self.wrapper.check_repo_verify(False))
+
+  def test_gpg_initialized(self):
+    """Should pass if gpg is setup already."""
+    with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=False):
+      self.assertTrue(self.wrapper.check_repo_verify(True))
+
+  def test_need_gpg_setup(self):
+    """Should pass/fail based on gpg setup."""
+    with mock.patch.object(self.wrapper, 'NeedSetupGnuPG', return_value=True):
+      with mock.patch.object(self.wrapper, 'SetupGnuPG') as m:
+        m.return_value = True
+        self.assertTrue(self.wrapper.check_repo_verify(True))
+
+        m.return_value = False
+        self.assertFalse(self.wrapper.check_repo_verify(True))
+
+
+class CheckRepoRev(GitCheckoutTestCase):
+  """Check check_repo_rev behavior."""
+
+  def test_verify_works(self):
+    """Should pass when verification passes."""
+    with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
+      with mock.patch.object(self.wrapper, 'verify_rev', return_value='12345'):
+        rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
+    self.assertEqual('refs/heads/stable', rrev)
+    self.assertEqual('12345', lrev)
+
+  def test_verify_fails(self):
+    """Should fail when verification fails."""
+    with mock.patch.object(self.wrapper, 'check_repo_verify', return_value=True):
+      with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
+        with self.assertRaises(Exception):
+          self.wrapper.check_repo_rev(self.GIT_DIR, 'stable')
+
+  def test_verify_ignore(self):
+    """Should pass when verification is disabled."""
+    with mock.patch.object(self.wrapper, 'verify_rev', side_effect=Exception):
+      rrev, lrev = self.wrapper.check_repo_rev(self.GIT_DIR, 'stable', repo_verify=False)
+    self.assertEqual('refs/heads/stable', rrev)
+    self.assertEqual(self.REV_LIST[1], lrev)
+
+
 if __name__ == '__main__':
   unittest.main()
diff --git a/tox.ini b/tox.ini
index 02c5647..aa4e297 100644
--- a/tox.ini
+++ b/tox.ini
@@ -15,8 +15,19 @@
 # https://tox.readthedocs.io/
 
 [tox]
-envlist = py27, py36, py37, py38
+envlist = py36, py37, py38, py39
+
+[gh-actions]
+python =
+    3.6: py36
+    3.7: py37
+    3.8: py38
+    3.9: py39
 
 [testenv]
 deps = pytest
-commands = {toxinidir}/run_tests
+commands = {envpython} run_tests
+setenv =
+    GIT_AUTHOR_NAME = Repo test author
+    GIT_COMMITTER_NAME = Repo test committer
+    EMAIL = repo@gerrit.nodomain
diff --git a/wrapper.py b/wrapper.py
index 0ce3250..b1aa4c5 100644
--- a/wrapper.py
+++ b/wrapper.py
@@ -1,5 +1,3 @@
-# -*- coding:utf-8 -*-
-#
 # Copyright (C) 2014 The Android Open Source Project
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +12,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-from __future__ import print_function
 try:
   from importlib.machinery import SourceFileLoader
   _loader = lambda *args: SourceFileLoader(*args).load_module()
@@ -27,7 +24,10 @@
 def WrapperPath():
   return os.path.join(os.path.dirname(__file__), 'repo')
 
+
 _wrapper_module = None
+
+
 def Wrapper():
   global _wrapper_module
   if not _wrapper_module: