Compare commits

...

157 Commits

Author SHA1 Message Date
SteveLauC
4df30c2587 chore: release v16.0.2 (#995) 2024-12-07 15:21:19 +08:00
Andre Toerien
305a5fbcae fix(poetry): skip if not installed with official script (#989)
* fix(poetry): skip if not installed with official script

* feat(poetry): add poetry_force_self_update config option

* docs: give this config a more detailed explanation

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2024-12-07 15:09:52 +08:00
Tulip Blossom
4f4dcbb643 feat: add bootc support to Fedora atomic distros
* feat(bootc): add Bootc support + docs

Co-authored-by: Steve Lau <stevelauc@outlook.com>

* docs(bootc): specify that it'll supersede rpm-ostree if enabled :p

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2024-11-19 11:07:12 +08:00
Laura Demkowicz-Duffy
202897ba35 refactor: disable julia startup file for julia package update (#983)
* refactor(julia): disable julia startup file for julia package update

* feat(julia): add configuration option for julia startup file

* fix: deny unknown fields on JuliaConfig deserialisation

Co-authored-by: SteveLauC <stevelauc@outlook.com>

* doc(julia): clarify startup_file option purpose

---------

Co-authored-by: SteveLauC <stevelauc@outlook.com>
2024-11-19 09:17:51 +08:00
Youn Mélois
444689c899 feat: allow version specification for deno (#970)
* feat: allow version specification for deno

* fix: missing quotes for string in toml file

Co-authored-by: SteveLauC <stevelauc@outlook.com>

* fix: deno upgrade for different executable versions

* fix: tell apart the two cases for v1.x in SkipStep reason

* docs: add comments and documentation on version method for deno

* chore: add explanatory comment on stable channel that does nothing

Co-authored-by: SteveLauC <stevelauc@outlook.com>

---------

Co-authored-by: SteveLauC <stevelauc@outlook.com>
2024-10-29 18:09:47 +08:00
Gudsfile
98ec13f8db i18n(app.yml): new language fr (#969)
Apply suggestions from code review

Co-authored-by: SteveLauC <stevelauc@outlook.com>
2024-10-29 16:34:44 +08:00
Lucas Parzianello
39f76a3a71 uv step: checking self subcommand exists; fixes #942 (#971)
* uv step: checking self subcommand exists; fixes #942

* uv: fixing return behavior

---------

Co-authored-by: Lucas Parzianello <lucaspar@users.noreply.github.com>
2024-10-29 15:40:31 +08:00
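
The step above hinges on whether the installed `uv` build ships the `self` subcommand at all (the standalone installer provides it; package-manager installs may not). A minimal sketch of that kind of probe; the `uv self --help` invocation and the skip behaviour are assumptions for illustration, not topgrade's actual code:

    use std::process::Command;

    // Hypothetical helper: treat a successful `uv self --help` as a sign
    // that `uv self update` is available in this build.
    fn uv_has_self_subcommand(uv: &str) -> bool {
        Command::new(uv)
            .args(["self", "--help"])
            .output()
            .map(|out| out.status.success())
            .unwrap_or(false)
    }

    fn main() {
        if uv_has_self_subcommand("uv") {
            // Only now is it worth invoking the real update.
            let _ = Command::new("uv").args(["self", "update"]).status();
        } else {
            println!("skipping uv self-update: this build has no `self` subcommand");
        }
    }
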
Ricardo Torres
f181a795a6 refactor: flip order of mise upgrade and mise plugins update (#968)
Flip the order of `mise plugins update` and `mise upgrade` so that plugins are updated first.
2024-10-28 09:59:22 +08:00
Andreas02-dev
ea2f3e07e9 feat(microsoft_store): Add Microsoft Store step for Windows (#963)
* feat(microsoft_store): Add Microsoft Store step for Windows

Add Microsoft Store Apps update step for Windows as Winget cannot update all Microsoft Store apps yet.

Closes #912

* style(translation): modify `zh_TW` translation
2024-10-23 08:15:46 +08:00
SteveLauC
8aad6eae0d refactor: add missing i18n for OpenBSD steps (#965) 2024-10-22 08:47:15 +08:00
SteveLauC
e86e5fe3e7 docs: document that we need to translate user-facing texts (#966) 2024-10-22 08:46:59 +08:00
λP.(P izzy)
2c2569c4f8 Improve OpenBSD -CURRENT detection and Dry-run feedback (#954)
* Improve OpenBSD -CURRENT detection and Dry-run feedback

This commit improves the -CURRENT detection by parsing `/etc/motd`. This is more future-proof because, as OpenBSD nears a stable release, `uname` temporarily reports the version as if it were -STABLE.

This commit *also* adds feedback when -CURRENT is found, making this feature easier to debug with `--dry-run` or during a regular run.

* Make OpenBSD step less talky and improve verbiage.

This commit removes the command-flag feedback. It also swaps the output "update" for "upgrade", bringing this step in line with the other steps for consistency.
2024-10-18 08:26:27 +08:00
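
A rough sketch of the `/etc/motd` approach described above, assuming the marker appears on the first motd line; the exact wording and the flag handling are illustrative, not the project's actual implementation:

    use std::fs;

    // Detect an OpenBSD -CURRENT system by inspecting /etc/motd, whose first
    // line normally embeds the version string (an assumption about the format).
    fn is_openbsd_current() -> bool {
        fs::read_to_string("/etc/motd")
            .ok()
            .and_then(|motd| {
                motd.lines()
                    .next()
                    .map(|first| first.to_lowercase().contains("-current"))
            })
            .unwrap_or(false)
    }

    fn main() {
        if is_openbsd_current() {
            println!("-CURRENT detected; snapshot flags would be passed to the update commands");
        } else {
            println!("release system; default flags would be used");
        }
    }
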
Rebecca Turner
9ffdc9649e Add support for Lix (Nix fork) (#952)
Add support for Lix

Lix is a fork of Nix 2.18 focused on maintainability and user
experience. It has a different format for the version, to distinguish it
from CppNix:

    $ nix --version
    nix (Lix, like Nix) 2.91.0

See: <https://lix.systems/>
2024-10-18 08:23:25 +08:00
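
Since the quoted version string is what tells the two implementations apart, here is a hedged sketch of how a step might branch on it; the `nix_flavour` helper is illustrative, not the project's actual API:

    use std::process::Command;

    // Report whether the installed `nix` is Lix or CppNix by parsing the
    // output of `nix --version`, e.g. "nix (Lix, like Nix) 2.91.0".
    fn nix_flavour() -> Option<&'static str> {
        let out = Command::new("nix").arg("--version").output().ok()?;
        let version = String::from_utf8_lossy(&out.stdout);
        Some(if version.contains("Lix") { "Lix" } else { "CppNix" })
    }

    fn main() {
        match nix_flavour() {
            Some(flavour) => println!("detected {flavour}"),
            None => println!("nix is not installed"),
        }
    }
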
Rikiub%
a5d4f2eec9 i18n (app.yml): Add Spanish localization (es) (#955)
* Update app.yml

* "es" localization added

* Grammar fixes

* Fix YAML syntax errors

* Fix YAML syntax errors

* Fix duplicated

* Fix duplicate

* Grammar fix

* Grammar fix

* Fix duplicate

* Improve grammar

* Update locales/app.yml

Co-authored-by: SteveLauC <stevelauc@outlook.com>

* Improve Grammar

* Improve Grammar

* Improve Grammar

* Improve Grammar

* Improve Grammar

---------

Co-authored-by: SteveLauC <stevelauc@outlook.com>
2024-10-17 08:04:49 +08:00
Nils
a5df40e01d Refactor config.rs and vagrant.rs files (#949)
* Refactor config.rs and vagrant.rs files

* Refactor config.rs and vagrant.rs files
2024-10-15 17:56:03 +08:00
SteveLauC
0573fc97c6 docs: update release procedure that SECURITY.md should be updated in major release (#946)
docs: update release procedure that SECURITY.md should be updated in major releases
2024-10-14 17:01:22 +08:00
Nils
1ae95f41a1 Update SECURITY.md (#945) 2024-10-14 16:37:15 +08:00
λP.(P izzy)
8a7af2e14d [FIXES #922] properly check for -CURRENT in OpenBSD steps and pass the correct flags to the respective commands (#923)
* [FIXES #922] properly check for -CURRENT in openbsd steps and pass the correct flags

* un-break ctx.config().dry_run() on OpenBSD Step
2024-10-14 08:29:51 +08:00
Nicolas Lorin
c36da89933 ci: add bin pkg to aur (#944) 2024-10-13 21:14:28 +08:00
SteveLauC
bbb84c2ee7 chore: release v16.0.1 (#940) 2024-10-11 10:46:50 +08:00
Tobias Micheler
36fd4b13c0 fix: pypi-release-action: downgrade upload-artifact and download-artifact to v3 (#938)
Since there are breaking changes between upload-artifact/download-artifact v3 and v4, this workflow was broken and no releases were uploaded to PyPI. A downgrade should make this work again.
2024-10-10 07:26:29 +08:00
SteveLauC
49327000fc fix: tmux unknown cmd: attach-client (#937) 2024-10-08 21:33:53 +08:00
Oliver Tzeng
9c25cd7426 i18n(app.yml): new language zh_TW (#931)
* i18n(app.yml): new language zh_TW

translated topgrade to zh_TW

* Update app.yml

* Update app.yml

* fix(i18n): Fixed "self-upgrade" translation

thanks @SteveLauC
2024-10-08 19:11:11 +08:00
Alexandre Veyrenc
9767e4169c Fix typo in exit_status (#934)
fix: typo in exit_status
2024-10-08 08:52:36 +08:00
SteveLauC
0854f9c559 revert: PR 866 (#927) 2024-10-06 21:39:47 +08:00
SteveLauC
e4a068d808 chore: release v16.0.0 (#925)
* chore: release v16.0.0

* chore: it should be containers.runtime
2024-10-06 21:23:00 +08:00
SteveLauC
4c793b0df8 ci: correct checker binary name (#926)
* ci: correct checker binary name

* ci: correct checker binary name
2024-10-06 21:13:25 +08:00
SteveLauC
a021441135 fix: use single quotes for locale keys with new line char (#920) 2024-10-04 12:33:19 +08:00
Florian Nagel
29c555c394 Add i18n by using rust i18n (#807)
* feat: initial i18n setup

* style: fmt

* feat: i18n support for new steps

* fix: build on Linux

* fix: build on Linux

* refactor: rm unused translation keys

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2024-10-03 18:47:35 +08:00
SteveLauC
c33d396489 docs: --only is no longer experimental (#919) 2024-09-29 09:03:26 +08:00
⑆ Neveda ⑈
f6d2ba4dae feat(brew): Add greedy-auto-updates option to Brew (#914) 2024-09-26 18:29:11 +08:00
SteveLauC
a88574204d docs: don't call execute("bin_name_str") (#916) 2024-09-26 15:05:45 +08:00
Marcelo Duarte Trevisani
9435bc4b7d Add Pixi to topgrade (#915)
* Add Pixi

* make linter happy

* Fix args
2024-09-26 14:19:32 +08:00
tomaszn
27245cbd7b feat(brew): use sudo if Homebrew owned by another user on Linux (#904)
feat(brew): use sudo if Homebrew owned by another user

On Linux, run "brew update" with sudo if the Homebrew installation directory
is owned by a different user. This is typically the user who installed
Homebrew, but can also be a dedicated user account. This change ensures that
Homebrew updates can proceed smoothly even when its directory ownership does
not match the current user's UID. Proper sudo configuration is assumed for
this to work properly.
2024-09-22 21:00:52 +08:00
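
A minimal sketch of the ownership check described above, assuming the `libc` crate is available for `geteuid()`; the Linuxbrew prefix path and the `sudo -u "#uid"` form are illustrative assumptions, not the exact commands topgrade runs:

    use std::fs;
    use std::os::unix::fs::MetadataExt;
    use std::process::Command;

    fn brew_update(prefix: &str) -> std::io::Result<()> {
        // Compare the owner of the Homebrew prefix with the current effective uid.
        let owner_uid = fs::metadata(prefix)?.uid();
        let my_uid = unsafe { libc::geteuid() };

        let status = if owner_uid != my_uid {
            // Homebrew belongs to another user: run brew through sudo as that uid.
            let as_owner = format!("#{owner_uid}");
            Command::new("sudo")
                .args(["-H", "-u", as_owner.as_str(), "brew", "update"])
                .status()?
        } else {
            Command::new("brew").arg("update").status()?
        };
        println!("brew update exited with {status}");
        Ok(())
    }

    fn main() -> std::io::Result<()> {
        // Conventional Linuxbrew prefix; adjust for your installation.
        brew_update("/home/linuxbrew/.linuxbrew")
    }
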
wetfloo
21751aa8a5 feat: tmux session attach mode (#901)
* feat: tmux session attach mode

* feat: example config update

* feat: move the comment down to be relevant

* feat: fix tmux not attaching from non-tmux env when using create_and_switch_client

* feat: make matching on tmux modes as described in suggestions

* feat: make tmux_session_attach_mode private

* feat: remove tmux mode cli option

* feat: wrap default value in quotation marks for tmux session mode

* feat: renames for tmux session management options

* feat: try to make tmux session mode description better
2024-09-17 21:06:39 +08:00
Marcel Coetzee
ad41948450 Remove check for whether conda config contains auto_activate_base (#905)
Signed-off-by: Marcel Coetzee <marcel@mooncoon.com>
2024-09-17 09:14:52 +08:00
SteveLauC
e32246f172 feat: clean scoop cache if cleanup is enabled (#909) 2024-09-16 15:27:01 +08:00
SteveLauC
25d3a816b4 fix: aura since v4.0.6 does not need sudo (#908)
* fix: aura since v4.0.6 does not need sudo

* fix: remove 'aura ' from version str
2024-09-16 13:01:05 +08:00
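
A hedged sketch of the version gate behind this fix; the "aura X.Y.Z" output format and the plain tuple comparison against 4.0.6 are assumptions for illustration:

    use std::process::Command;

    // Parse "aura 4.0.6"-style output into (major, minor, patch).
    fn parse_version(raw: &str) -> Option<(u32, u32, u32)> {
        let ver = raw.trim().trim_start_matches("aura ").trim();
        let mut parts = ver.split('.').map(|p| p.parse::<u32>().ok());
        Some((parts.next()??, parts.next()??, parts.next()??))
    }

    // Aura needs sudo only for versions older than 4.0.6.
    fn aura_needs_sudo() -> bool {
        let out = match Command::new("aura").arg("--version").output() {
            Ok(out) => out,
            Err(_) => return true, // be conservative when the version is unknown
        };
        parse_version(&String::from_utf8_lossy(&out.stdout)).map_or(true, |v| v < (4, 0, 6))
    }

    fn main() {
        println!("run aura with sudo: {}", aura_needs_sudo());
    }
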
SteveLauC
05b1a565e0 chore: pin toolchain to MSRV(1.76) (#900)
* chore: pin toolchain to MSRV(1.76)

* chore: remove more toolchain action & update readme
2024-09-04 21:40:09 +08:00
Kreeblah
7b2623ea3c Add Debian-based system builds (#898)
* Add Debian-based system builds

* Address feedback

* Remove git as a listed dependency for Debian package
2024-09-04 11:50:39 +08:00
SteveLauC
983c5243ba fix: a panic introduced by improper unwrap() (#899)
fix: a panic introduced by improper unwrap()
2024-09-03 15:26:41 +08:00
Lucas Parzianello
1958fe1e5b Containers step: new runtime option to configuration (#896)
* pyenv: fixes #849

* feat: adds `uv` python manager step

* moved new uv step from unix to generic

* containers step: added container runtime option to config

* documented breaking change

---------

Co-authored-by: Lucas Parzianello <lucaspar@users.noreply.github.com>
2024-09-01 15:35:23 +08:00
SteveLauC
ca8558d9b4 feat: support step Bun on Windows (#893) 2024-08-26 22:21:17 +08:00
Lucas Parzianello
1b534800a9 Adds uv step (#890)
* pyenv: fixes #849

* feat: adds `uv` python manager step

* moved new uv step from unix to generic

---------

Co-authored-by: Lucas Parzianello <lucaspar@users.noreply.github.com>
2024-08-25 10:22:27 +08:00
Boris Smidt
e91c00c9c0 Add aqua tool installer cli (#889)
* Add aqua cli

* Move aqua cli to generic.rs

* Add a dry-run support to aqua

* style: format code

---------

Co-authored-by: Steve Lau <stevelauc@outlook.com>
2024-08-20 09:18:27 +08:00
Nils
a2375b4820 chore: update winget-releaser to use main branch (#888)
Update the winget-releaser action in the release_to_winget.yml workflow to use the main branch instead of v2. This ensures that the latest version of the action is being used for publishing.
2024-08-18 10:29:17 +08:00
Patrick J. Roddy
2e0c8e9e17 Fix RubyGems issues for mise regarding sudo (#887) 2024-08-18 10:28:22 +08:00
Nils
dc0ddcf9f0 Update README.md (#882)
* Update README.md

Added Chocolatey

* chore: fix broken Chocolatey link in README.md
2024-08-18 10:22:23 +08:00
Diogo Ribeiro
a1f3c86a39 feat: add volta packages (#883)
add print_info when no packages found

apply review feedback
2024-08-01 18:26:22 +08:00
Daniel Horecki
55f672eff7 Allow Nix unfree packages to be upgraded (#881)
Allow unfree packages to be upgraded

Fixes #611.

Co-authored-by: Daniel Horecki <morr@morr.pl>
2024-08-01 09:52:03 +08:00
Nils
8ece0346d8 chore: improve Windows Update step and add PSWindowsUpdate Module (#842)
* chore: improve Windows Update step and add PSWindowsUpdate Module

Refactor the `windows_update` function in `windows.rs` to improve the Windows Update step. Added a prompt for administrator privileges and updated the warning message. Also, added support for installing the PSWindowsUpdate Module as an alternative to using USOClient for Windows Update.

still see warning:
The installer will request to run as administrator, expect a prompt.
Start-Process : A parameter cannot be found that matches parameter name 'Command'.
At line:1 char:74
+ ... ath powershell -Verb runAs -ArgumentList  -NoProfile -Command Import- ...
+                                                          ~~~~~~~~
    + CategoryInfo          : InvalidArgument: (:) [Start-Process], ParameterBindingException
    + FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.PowerShell.Commands.StartProcessCommand

VERBOSE: MSI-THIN-GF36 (6/30/2024 4:48:48 PM): Connecting to Microsoft Update server. Please wait...
VERBOSE: Found [0] Updates in pre search criteria

but as the verbose output shows, it works

* trying

* fix
2024-08-01 09:50:48 +08:00
Nicolas Lorin
b1fe1d201a ci: fix release_to_aur.yml (#879) 2024-07-29 16:13:33 +08:00
Nils
5010abdc22 Update SECURITY.md (#878) 2024-07-29 10:01:46 +08:00
SteveLauC
e4441d5021 refactor: fix Windows clippy (#880)
Refactor: fix Windows clippy
2024-07-29 09:01:04 +08:00
dashmoho
5af0c6a7e5 Fix nix upgrades (#874)
Nix version 2.21 changed how packages are upgraded.
Fixes #782.

Co-authored-by: Daniel Horecki <morr@morr.pl>
2024-07-24 07:37:22 +08:00
SteveLauC
b8da17106a feat: support ZVM (#777) 2024-07-23 07:26:08 +08:00
Tommaso Melacarne
fdf40dbf43 Fix nightly clippy warnings (#872) 2024-07-22 07:33:42 +08:00
Ryan Zoeller
f3b6530969 feat(os): support NI Linux Real-Time's opkg package manager (#870)
NI Linux Real-Time is a Yocto Linux-based distribution used with
NI's embedded and real-time controllers.

Related links:
- https://www.ni.com/en/shop/linux/introduction-to-ni-linux-real-time.html
- https://github.com/ni/nilrt
- https://github.com/ni/nilrt-docs
2024-07-21 09:09:36 +08:00
Lazerbeak12345
cbc5fc94f9 feat(linux.rs): Add support for Funtoo (#868)
* feat(linux.rs): Add support for Funtoo

* style(linux.rs): fix clippy recommendations

* test(funtoo support): add funtoo test
2024-07-20 11:04:26 +08:00
SteveLauC
dceb697355 feat: don't run reboot with sudo on Linux with systemd (#866) 2024-07-20 10:13:14 +08:00
Lucas Parzianello
07118fa0d2 Fix pyenv: no such command 'update' (#860)
pyenv: fixes #849

Co-authored-by: Lucas Parzianello <lucaspar@users.noreply.github.com>
2024-07-11 07:52:09 +08:00
dependabot[bot]
16e6db0def chore(deps): bump zerovec-derive from 0.10.2 to 0.10.3 (#858)
Bumps [zerovec-derive](https://github.com/unicode-org/icu4x) from 0.10.2 to 0.10.3.
- [Release notes](https://github.com/unicode-org/icu4x/releases)
- [Changelog](https://github.com/unicode-org/icu4x/blob/main/CHANGELOG.md)
- [Commits](https://github.com/unicode-org/icu4x/commits/ind/zerovec-derive@0.10.3)

---
updated-dependencies:
- dependency-name: zerovec-derive
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-10 07:44:46 +08:00
dependabot[bot]
64d8f6d632 chore(deps): bump zerovec from 0.10.2 to 0.10.4 (#856)
Bumps [zerovec](https://github.com/unicode-org/icu4x) from 0.10.2 to 0.10.4.
- [Release notes](https://github.com/unicode-org/icu4x/releases)
- [Changelog](https://github.com/unicode-org/icu4x/blob/main/CHANGELOG.md)
- [Commits](https://github.com/unicode-org/icu4x/commits/ind/zerovec@0.10.4)

---
updated-dependencies:
- dependency-name: zerovec
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-09 07:59:04 +08:00
SteveLauC
180b5cba58 docs: document that maintenance continues (#855) 2024-07-08 09:26:57 +08:00
Steve Lau
bac416e907 docs: document that it is currently unmaintained 2024-07-07 16:23:23 +08:00
NAKASHIMA, Makoto
cb674a1572 fix: traverse symbolic links under $CONFIG_DIR/topgrade.d (#852) (#853) 2024-07-07 13:47:53 +08:00
SteveLauC
960b14fa20 feat: support Poetry (#790) 2024-07-07 10:37:07 +08:00
tranzystorekk
a9f57d4205 Small clap adjustments (#846)
* style(cli): use new clap keywords

* fix(cli): use lowercase command name
2024-07-01 17:06:04 +08:00
SteveLauC
13330b6950 docs: update release procedure (#845) 2024-07-01 10:21:35 +08:00
SteveLauC
1ebcc9beee chore: prepare for v15.0.0 (#843) 2024-07-01 09:45:20 +08:00
SteveLauC
55e1bbf2b9 feat: new step Lensfun's database update (#839)
* feat: new step Lensfun's database update

* refactor: take 1 as a success exit code
2024-06-30 22:41:09 +08:00
SteveLauC
f2dfa1e475 fix: consider TMUX_PLUGIN_MANAGER_PATH when searching tpm binary (#835)
* fix: consider TMUX_PLUGIN_MANAGER_PATH when searching tpm binary

* fix: correct update_plugins path when env var is present
2024-06-30 19:17:30 +08:00
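
A minimal sketch of the lookup order this fix implies: consult `TMUX_PLUGIN_MANAGER_PATH` first, then fall back to the conventional `~/.tmux/plugins` directory. The `tpm/bin/update_plugins` layout is an assumption about tpm rather than a guarantee:

    use std::env;
    use std::path::PathBuf;

    // Locate tpm's update script, preferring TMUX_PLUGIN_MANAGER_PATH.
    fn tpm_update_script() -> Option<PathBuf> {
        let plugins_dir = env::var_os("TMUX_PLUGIN_MANAGER_PATH")
            .map(PathBuf::from)
            .or_else(|| env::var_os("HOME").map(|home| PathBuf::from(home).join(".tmux/plugins")))?;

        let script = plugins_dir.join("tpm").join("bin").join("update_plugins");
        if script.is_file() {
            Some(script)
        } else {
            None
        }
    }

    fn main() {
        match tpm_update_script() {
            Some(script) => println!("would run: {} all", script.display()),
            None => println!("tpm not found; the tmux plugin step would be skipped"),
        }
    }
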
SteveLauC
fcd53e772a chore: collect --dry-run and --yes opts info in feature request template (#838)
chore: collect --dry-run and --yes opts info in feature request template
2024-06-30 14:17:45 +08:00
dependabot[bot]
8b9d7ef8f3 chore(deps): bump curve25519-dalek from 4.1.2 to 4.1.3 (#827)
Bumps [curve25519-dalek](https://github.com/dalek-cryptography/curve25519-dalek) from 4.1.2 to 4.1.3.
- [Release notes](https://github.com/dalek-cryptography/curve25519-dalek/releases)
- [Commits](https://github.com/dalek-cryptography/curve25519-dalek/compare/curve25519-4.1.2...curve25519-4.1.3)

---
updated-dependencies:
- dependency-name: curve25519-dalek
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-06-19 09:49:20 +08:00
SteveLauC
d8406a8cfe style: allow unused ExecutorChild (#829)
* style: allow unused ExecutorChild

* style: remove duplicate cfg on windows
2024-06-19 09:43:26 +08:00
SteveLauC
4a9ef581e5 chore: bump deps (#823) 2024-06-13 09:21:42 +08:00
Tamás Demeter-Haludka
a52db1f261 Run MasonUpdate as part of the vim updates (#821)
feat(vim): add mason update
2024-06-13 09:00:15 +08:00
Yaroslav Markin
8e16174ce7 fix(RubyGems): support no-sudo updating for rbenv and rvm (#820) 2024-06-06 19:37:06 +08:00
huajingyun
c748bb5d7a deps: bump libc from 0.2.153 to 0.2.155 (#818) 2024-05-28 09:23:10 +08:00
lachsdachs
3cc8f0d818 Add linux mint support (#817)
Update linux.rs
2024-05-26 16:26:11 +08:00
SteveLauC
f96eeeda6b chore: build binary for both macOS aarch64 and amd64 (#816) 2024-05-25 20:26:21 +08:00
SteveLauC
d1d8904376 ci: replace deprecated gh actions with alternatives (#814) 2024-05-25 19:29:17 +08:00
SteveLauC
3b329fe687 chore: update PR template (#815) 2024-05-25 17:35:46 +08:00
SteveLauC
9eb1b4ac9f ci: remove code coverage test & uniform file names (#811) 2024-05-24 09:02:05 +08:00
lachsdachs
c4c0bd7383 add upgrade stuff for bedrock linuxmint strata (#813) 2024-05-24 09:01:46 +08:00
alice
1e9de5832d feat: add support for chimera linux (#808)
Since it also uses apk, the update/upgrade procedure is identical to Alpine/Wolfi.
2024-05-19 18:48:51 +08:00
dependabot[bot]
f2b17cdd9d chore(deps): bump mio from 0.8.10 to 0.8.11 (#729)
Bumps [mio](https://github.com/tokio-rs/mio) from 0.8.10 to 0.8.11.
- [Release notes](https://github.com/tokio-rs/mio/releases)
- [Changelog](https://github.com/tokio-rs/mio/blob/master/CHANGELOG.md)
- [Commits](https://github.com/tokio-rs/mio/compare/v0.8.10...v0.8.11)

---
updated-dependencies:
- dependency-name: mio
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-19 09:55:30 +08:00
dependabot[bot]
7bfd6c2439 chore(deps): bump h2 from 0.3.24 to 0.3.26 (#766)
Bumps [h2](https://github.com/hyperium/h2) from 0.3.24 to 0.3.26.
- [Release notes](https://github.com/hyperium/h2/releases)
- [Changelog](https://github.com/hyperium/h2/blob/v0.3.26/CHANGELOG.md)
- [Commits](https://github.com/hyperium/h2/compare/v0.3.24...v0.3.26)

---
updated-dependencies:
- dependency-name: h2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-19 09:55:05 +08:00
dependabot[bot]
0e8d5f0266 chore(deps): bump rustls from 0.21.10 to 0.21.12 (#804)
Bumps [rustls](https://github.com/rustls/rustls) from 0.21.10 to 0.21.12.
- [Release notes](https://github.com/rustls/rustls/releases)
- [Changelog](https://github.com/rustls/rustls/blob/main/CHANGELOG.md)
- [Commits](https://github.com/rustls/rustls/compare/v/0.21.10...v/0.21.12)

---
updated-dependencies:
- dependency-name: rustls
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-19 09:54:27 +08:00
Nils
32add8f046 Dependabot Updates (#802)
* chore(deps): bump actions/checkout from 3 to 4

Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* chore(deps): bump github/codeql-action from 2 to 3

Bumps [github/codeql-action](https://github.com/github/codeql-action) from 2 to 3.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/github/codeql-action/compare/v2...v3)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* chore(deps): bump softprops/action-gh-release from 1 to 2

Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 1 to 2.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](https://github.com/softprops/action-gh-release/compare/v1...v2)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-05-18 19:48:09 +08:00
SteveLauC
f661f00277 feat: support auto-cpufreq (#800) 2024-05-18 16:34:03 +08:00
Alok Singh
2a1999fe20 Add rye support (#799)
Rye is a new cargo-like package manager for python by @mitsuhiko.
2024-05-13 20:52:13 +08:00
SteveLauC
4d66431aad fix: Fedora Sway Atomic should be recognized as FedoraImmutable (#795)
* fix: Fedora Sway Atomic should be recognized as FedoraImmutable

* style: fmt
2024-05-11 11:20:43 +08:00
SteveLauC
767f0d91f4 refactor: 2 clippy warnings (#789) 2024-05-06 20:37:55 +08:00
edi
a3428e3477 Always display windows update step (#781)
* always display windows update step

* remove extra comma

* i guess format wants the comma
2024-05-06 20:24:57 +08:00
David C
614131b7bf fix(os): detect Fedora IoT Edition as immutable Fedora variant (#774)
Without this change, it is detected as a regular Fedora variant and
updating fails because neither `dnf` nor `yum` is found.
2024-04-17 09:05:54 +08:00
Dan Sully
9b0681f3b8 Add config flag to toggle verbose Git repository output. (#763)
* Add config flag to toggle verbose Git repository output.

If `true`: the default, no change.

If `false`: Only show repositories that have been updated or have an error.

Minor tweak to output (removed colon) so that copy and paste for 'cd' is nicer.
2024-04-14 10:28:03 +08:00
Andre Toerien
ecf8fb7a47 fix: better dotnet tool list header parsing (#772)
fix: better dotnet tool list header parsing
2024-04-14 09:10:08 +08:00
Andrew Barchuk
04bfb45a97 Fix local host detection for remotes with user (#755) 2024-04-08 19:43:32 +08:00
SteveLauC
d90ce30452 feat: support update PlatformIO Core (#759) 2024-04-07 11:03:33 +08:00
Ricardo Torres
ab21600ca6 feat: add support for mise (#757)
Add support for mise-en-place (or mise). Mise is a tool like asdf (already supported). https://mise.jdx.dev/
2024-03-30 18:40:16 +08:00
λP.(P izzy)
728ea26204 FIXES #708: add config directive for pkg_* cleanup on OpenBSD (#753)
FIXES #708: add config directive for pkg_* cleanup on OpenBSD
2024-03-26 11:07:39 +08:00
SteveLauC
373cd3b3ae fix: don't use Command::new(bin_name) as it won't work on Windows (#750) 2024-03-24 11:48:17 +08:00
SteveLauC
f4e0258b09 style: fix 2 clippy lint unless_vec & unused_io_amount (#751) 2024-03-24 11:24:39 +08:00
SteveLauC
d50360a69a feat: support update ClamAV databases (#747) 2024-03-19 14:10:47 +08:00
SteveLauC
351922c81f feat: put step logs in a span (#746) 2024-03-16 14:17:19 +08:00
Alok Singh
9518f43866 Add support for Lean 4's elan (#742) 2024-03-16 09:35:47 +08:00
SteveLauC
2c1ce3d4e6 refactor: make GitSteps a dedicated step (#737) 2024-03-09 17:57:33 +08:00
SteveLauC
12116c3261 fix: use env BUN_INSTALL to locate package.json (#734) 2024-03-07 14:12:16 +08:00
Gerald Chen
fbc84e8aa1 fix(pipx): adds --include-injected argument to pipx (#726) 2024-03-01 15:06:23 +08:00
Brent Monning
6dab1e4f37 feat: adds xcodes step (#643) 2024-03-01 07:58:24 +08:00
Lucas Parzianello
650a143602 Adds pyenv step (#724) 2024-02-27 09:25:18 +08:00
Nils
9b6027fe78 Update GitHub Actions workflow for Codecov integration (#718)
- Refine the testing matrix to include only stable and nightly versions of Rust
- Add 'fail_ci_if_error' option to Codecov step for stricter CI checks
- Ensure newline at end of file
2024-02-25 11:19:09 +08:00
Nils
0e30e05ce8 Add GitHub Actions Workflow for Build and Test (#717)
* "Add *.profraw files to .gitignore

*.profraw files are generated by LLVM's Clang compiler when using the -fprofile-instr-generate option for Profile Guided Optimization. These files contain raw profiling data and should not be version controlled."

* Remove redundant import of TryFrom trait

The TryFrom trait was being imported explicitly in src\steps\os\windows.rs, even though it's already part of the Rust prelude and automatically imported into every Rust program. This was causing a compiler warning. This commit comments out the redundant import to resolve the warning.

* Add GitHub Actions workflow for Rust build and test

This commit adds a new GitHub Actions workflow for building and testing the Rust project across multiple operating systems (Ubuntu, Windows, macOS) and Rust versions (stable, beta, nightly). It also includes caching for dependencies and build artifacts, and uploads code coverage reports to Codecov.

* Update Codecov action and add token for coverage report upload

This commit updates the version of the Codecov GitHub Action used to upload coverage reports from v4 to v4.0.1. It also adds a token from the repository secrets to authenticate the upload. This ensures secure and authorized communication with the Codecov service.

* "Fix misuse of --jobs flag in cargo test command"

* "Fix grcov command in GitHub Actions workflow

The grcov command was previously prefixed with './', which caused an error because grcov was not found in the current directory. This commit removes the './' prefix to call grcov from the global path, where it is installed."

* Update GitHub Actions workflow for cross-platform compatibility

This commit modifies the 'build-and-test.yml' GitHub Actions workflow to ensure it works correctly across different operating systems (Ubuntu, Windows, MacOS). The RUSTFLAGS environment variable is now set in a cross-platform compatible way. The workflow will run the build and test process on every pull request and push to the main branch, generate a coverage report, and upload it to Codecov.

* Changed workflow trigger event to 'workflow_run' completion of 'Build and test' workflow

* "Updated GitHub Actions workflow to correctly set environment variables for code coverage"

* Renamed build and test workflow

* Update GitHub Actions workflow trigger

Change the trigger of the 'Test with Code Coverage' workflow to run when the 'build-and-test' workflow is completed. This ensures that code coverage is only calculated after successful build and test runs.

* Update workflow_run trigger in code-coverage.yml

* Fix CODECOV_TOKEN in code-coverage.yml workflow

* Update code-coverage workflow to trigger on pull requests and pushes to main branch

* Update .gitignore file to exclude LLVM profiling output

* Add empty line at the end

* Remove unused import in windows.rs

* Update .github/workflows/build-and-test.yml

Co-authored-by: SteveLauC <stevelauc@outlook.com>

* Update .github/workflows/build-and-test.yml

Co-authored-by: SteveLauC <stevelauc@outlook.com>

* Remove code coverage workflow

---------

Co-authored-by: SteveLauC <stevelauc@outlook.com>
2024-02-25 10:35:56 +08:00
Nils
eea952fa78 Create devskim.yml to enable GitHub code scanning for this repository (#700) 2024-02-24 18:53:10 +08:00
SteveLauC
6071a1ee3b chore: git ignore more (#715) 2024-02-24 13:45:53 +08:00
SteveLauC
a801b7b9f4 chore: bump deps (#714) 2024-02-24 13:14:53 +08:00
SteveLauC
c6e3f0ae0a revert: revert 614 to remove the -p option (#713) 2024-02-24 11:26:41 +08:00
SteveLauC
a43b03d3db feat: also detect Helix step with bin name hx (#710) 2024-02-23 07:39:31 +08:00
Md Isfarul Haque
12b0fa57ad fix: fetch and build Helix grammar as a regular user (#698) 2024-02-23 07:26:08 +08:00
Nils
d9e304f0ef Add .vs to .gitignore (#706)
* Added .vs to .gitignore

* Adjust .vs to .vs/
2024-02-22 09:47:37 +08:00
dependabot[bot]
842b92cca7 chore(deps): bump actions/download-artifact from 3 to 4 (#704)
Bumps [actions/download-artifact](https://github.com/actions/download-artifact) from 3 to 4.
- [Release notes](https://github.com/actions/download-artifact/releases)
- [Commits](https://github.com/actions/download-artifact/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/download-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-22 09:39:05 +08:00
dependabot[bot]
485f0ec9c8 chore(deps): bump EnricoMi/publish-unit-test-result-action from 1 to 2 (#705)
Bumps [EnricoMi/publish-unit-test-result-action](https://github.com/enricomi/publish-unit-test-result-action) from 1 to 2.
- [Release notes](https://github.com/enricomi/publish-unit-test-result-action/releases)
- [Commits](https://github.com/enricomi/publish-unit-test-result-action/compare/v1...v2)

---
updated-dependencies:
- dependency-name: EnricoMi/publish-unit-test-result-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-22 09:38:40 +08:00
λP.(P izzy)
5e3b5fc9a7 Fix OpenBSD Step failing to build with E0599 (#707)
* fix openbsd support failing with error E0599

* clean up a little formatting in src/os/openbsd.rs
2024-02-21 21:10:34 +08:00
SteveLauC
7c63541cad fix: zinit default install location (#625) 2024-02-17 13:15:53 +08:00
SteveLauC
238e089d74 docs: document brew config entries[skip ci] (#696) 2024-02-17 13:14:39 +08:00
luciodaou
8991bc9f62 feat(brew): adds "greedy-latest" option to Brew (#636) 2024-02-17 11:45:57 +08:00
SteveLauC
7a3f3a8905 feat: support waydroid (#687) 2024-02-16 11:57:53 +08:00
dependabot[bot]
e4085e03eb chore(deps): bump actions/checkout from 2 to 4 (#688)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-15 16:19:58 +08:00
dependabot[bot]
4b0c366e5f chore(deps): bump actions/upload-artifact from 3 to 4 (#689)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 3 to 4.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-15 16:19:18 +08:00
dependabot[bot]
ea97240d09 chore(deps): bump actions/cache from 1 to 4 (#690)
Bumps [actions/cache](https://github.com/actions/cache) from 1 to 4.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](https://github.com/actions/cache/compare/v1...v4)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-15 16:18:47 +08:00
dependabot[bot]
12de531abb chore(deps): bump codecov/codecov-action from 1 to 4 (#691)
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 1 to 4.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/codecov/codecov-action/compare/v1...v4)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-15 16:18:24 +08:00
dependabot[bot]
c3876ce3bf chore(deps): bump katyo/publish-crates from 1 to 2 (#692)
Bumps [katyo/publish-crates](https://github.com/katyo/publish-crates) from 1 to 2.
- [Release notes](https://github.com/katyo/publish-crates/releases)
- [Commits](https://github.com/katyo/publish-crates/compare/v1...v2)

---
updated-dependencies:
- dependency-name: katyo/publish-crates
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-15 16:18:00 +08:00
SteveLauC
cbbfc3a114 docs: update install doc with Winget (#693) 2024-02-15 16:17:11 +08:00
Nils
ad2bfc9abd Keeping actions up to date with Dependabot (#685) 2024-02-15 16:04:51 +08:00
Nils
528461412e Publish new releases of topgrade to the Windows Package Manager with WinGet Releaser
Publish new releases of topgrade to the Windows Package Manager with WinGet Releaser (GitHub Action).
2024-02-15 16:04:11 +08:00
SteveLauC
64db679390 ci: add macOS aarch64 check (#680) 2024-02-06 16:28:01 +08:00
Wallunen
77a8b3b7d2 feat: add fetch_head configuration option into brew (#679) 2024-02-06 16:17:27 +08:00
Nils
7007e76ab5 Fix/winget (#670)
* cargo update

* Remove the check for 'winget_enable' set to 'true'. On my Windows 10 and 11 machines, there are no issues with Winget anymore. As far as I remember, it was disabled by default because it was buggy back then.

* remove print_warning

* Revert "cargo update"

This reverts commit 5f4e532bc1.

* Removed the `enable_winget = true` configuration as winget is now enabled by default.

* Removed the #[cfg(windows)] flag.

* Revised as Recommended

* Wrapping at 80
2024-02-03 09:09:47 +08:00
Andy Piper
3c970063a9 fix: correct typos in output (#677)
Corrects a grammatical issue and a typo in two of the step output messages.
2024-01-31 09:07:38 +08:00
SteveLauC
b70830015e docs: fix a wrong preposition[skip ci] (#676) 2024-01-30 11:06:32 +08:00
SteveLauC
b43f2c8b3a ci: run cargo test in ci (#674) 2024-01-29 10:36:30 +08:00
RJ Trujillo
c311da16f3 feat: Add support for Wolfi (#672)
* feat: Add support for Wolfi

This adds support for updating Wolfi via Topgrade

* chore(wolfi): Add os release info and unit test

* chore(wolfi): Don't check ID_LIKE as it is unique
2024-01-29 09:11:53 +08:00
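
For context, distro detection of this kind usually keys off `/etc/os-release`; a hedged sketch that matches only the `ID` field, in line with the note above that `ID_LIKE` is not consulted:

    use std::fs;

    // Recognise Wolfi from /etc/os-release by its ID field alone.
    fn is_wolfi() -> bool {
        fs::read_to_string("/etc/os-release")
            .map(|contents| {
                contents
                    .lines()
                    .any(|line| matches!(line.trim(), "ID=wolfi" | "ID=\"wolfi\""))
            })
            .unwrap_or(false)
    }

    fn main() {
        println!("Wolfi detected: {}", is_wolfi());
    }
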
Nils
37608a338c Fix/usoclient (#669)
* cargo update

* Implementing a check for Windows 11 and, if detected, skipping Windows Update via usoclient.exe. It is suggested to install PSWindowsUpdate.

* Revert "cargo update"

This reverts commit 43a4d321cf.

* Revert "Implementing a check for Windows 11 and, if detected, skipping Windows Update via usoclient.exe. It is suggested to install PSWindowsUpdate."

This reverts commit e1ef2e4bc5.

* Removed the usoclient step and added an error message.

* cargo fmt
2024-01-29 09:02:40 +08:00
Nils
b07288e674 Fix/pswindowsupdate (#671)
* cargo update

* An elevated PowerShell is required to run Install-WindowsUpdate on my system.

* Revert "cargo update"

This reverts commit fb58ce761a.
2024-01-29 09:01:38 +08:00
Nils
707698faab Update Cargo.lock (#673)
cargo update
2024-01-29 09:00:08 +08:00
SteveLauC
2e70d132d0 feat: certbot renew (#665) 2024-01-28 13:03:30 +08:00
Brent Monning
30c5b31e21 fix: softwareupdate under dry run (#668) 2024-01-27 14:57:10 +08:00
SteveLauC
77ff6cb714 feat: support wildcard in ignored_containers (#666) 2024-01-27 10:54:55 +08:00
SteveLauC
ea13c51b7d chore: release v14.0.1 (#662) 2024-01-25 15:40:52 +08:00
Cat Core
3ed763b884 Fix system updates for Nobara (#661)
* Fix system updates for Nobara

* fmt

* Add os-release test for Nobara

* Make requested changes

* cargo fmt
2024-01-24 19:29:20 +08:00
samhanic
10e1e170b7 fix vscode extensions update step (#650)
* fix vscode extensions update using the new update-extensions cli

* fix non-linux compilation
2024-01-24 10:32:00 +08:00
Sandro
ffa62afc66 Follow up to the follow up in #616 (#660) 2024-01-24 10:22:36 +08:00
SteveLauC
f794329913 feat: skip breaking changes notification with env var (#659)
* feat: skip breaking changes notification with env var

* ci: apply that env in ci
2024-01-23 14:50:35 +08:00
SteveLauC
f9a35c7661 docs: add doc on how to do a new release (#658) 2024-01-23 11:58:09 +08:00
SteveLauC
ed496f3462 chore: fix file name typo[skip ci] (#657)
chore: fix file name typo
2024-01-23 11:50:02 +08:00
Rui Chen
6accdae232 workflows(homebrew): replace Homebrew/actions/bump-formulae with Homebrew/actions/bump-packages (#656)
Signed-off-by: Rui Chen <rui@chenrui.dev>
2024-01-23 10:29:48 +08:00
72 changed files with 5191 additions and 2160 deletions


@@ -8,10 +8,14 @@ assignees: ''
---
## I want to suggest a new step
### Which tool is this about? Where is its repository?
### Which operating systems are supported by this tool?
### What should Topgrade do to figure out if the tool needs to be invoked?
### Which exact commands should Topgrade run?
* Which tool is this about? Where is its repository?
* Which operating systems are supported by this tool?
* What should Topgrade do to figure out if the tool needs to be invoked?
* Which exact commands should Topgrade run?
* Does it have a `--dry-run` option? i.e., print what should be done and exit
* Does it need the user to confirm the execution? And does it provide a `--yes`
option to skip this step?
## I want to suggest some general feature
Topgrade should...


@@ -1,14 +1,15 @@
## Standards checklist:
## What does this PR do
- [ ] The PR title is descriptive.
## Standards checklist
- [ ] The PR title is descriptive
- [ ] I have read `CONTRIBUTING.md`
- [ ] The code compiles (`cargo build`)
- [ ] The code passes rustfmt (`cargo fmt`)
- [ ] The code passes clippy (`cargo clippy`)
- [ ] The code passes tests (`cargo test`)
- [ ] *Optional:* I have tested the code myself
- [ ] If this PR introduces new user-facing messages they are translated
## For new steps
- [ ] *Optional:* Topgrade skips this step where needed
- [ ] *Optional:* The `--dry-run` option works with this step
- [ ] *Optional:* The `--yes` option works with this step if it is supported by

.github/dependabot.yml

@@ -0,0 +1,10 @@
# Set update schedule for GitHub Actions
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
# Check for updates to GitHub Actions every week
interval: "weekly"


@@ -1,4 +1,4 @@
name: Test Configuration File Creation
name: Check config file creation if not exists
on:
pull_request:
@@ -12,10 +12,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- run: |
CONFIG_PATH=~/.config/topgrade.toml;
if [ -f "$CONFIG_PATH" ]; then rm $CONFIG_PATH; fi
cargo build;
./target/debug/topgrade --dry-run --only system;
TOPGRADE_SKIP_BRKC_NOTIFY=true ./target/debug/topgrade --dry-run --only system;
stat $CONFIG_PATH;

.github/workflows/check_i18n.yml

@@ -0,0 +1,22 @@
on:
pull_request:
push:
branches:
- main
name: Check i18n
jobs:
check_locale:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Install checker
# Build it with the dev profile as this is faster and the checker still works
run: |
cargo install --git https://github.com/topgrade-rs/topgrade_i18n_locale_checker --profile dev
- name: Run the checker
run: topgrade_i18n_locale_checker --locale-file ./locales/app.yml --rust-src-to-check ./src


@@ -0,0 +1,32 @@
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.
name: Check Security Vulnerability
on:
pull_request:
push:
branches:
- main
jobs:
lint:
name: DevSkim
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
steps:
- name: Checkout code
uses: actions/checkout@v4
- name: Run DevSkim scanner
uses: microsoft/DevSkim-Action@v1
- name: Upload DevSkim scan results to GitHub Security tab
uses: github/codeql-action/upload-sarif@v3
with:
sarif_file: devskim-results.sarif


@@ -8,7 +8,7 @@ jobs:
prepare:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- uses: actions-rs/toolchain@v1
with:
toolchain: nightly-2022-08-03


@@ -7,23 +7,16 @@ on:
name: CI
env:
RUST_VER: 'stable'
CROSS_VER: '0.2.5'
CARGO_NET_RETRY: 3
jobs:
fmt:
name: Rustfmt
runs-on: ubuntu-20.04
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@master
with:
toolchain: '${{ env.RUST_VER }}'
components: rustfmt
uses: actions/checkout@v4
- name: Run cargo fmt
env:
@@ -42,38 +35,36 @@ jobs:
- target: x86_64-linux-android
target_name: Android
use_cross: true
os: ubuntu-20.04
os: ubuntu-latest
- target: x86_64-unknown-freebsd
target_name: FreeBSD
use_cross: true
os: ubuntu-20.04
os: ubuntu-latest
- target: x86_64-unknown-linux-gnu
target_name: Linux
os: ubuntu-20.04
os: ubuntu-latest
- target: x86_64-apple-darwin
target_name: macOS
os: macos-11
target_name: macOS-x86_64
os: macos-13
- target: aarch64-apple-darwin
target_name: macOS-aarch64
os: macos-latest
- target: x86_64-unknown-netbsd
target_name: NetBSD
use_cross: true
os: ubuntu-20.04
os: ubuntu-latest
- target: x86_64-pc-windows-msvc
target_name: Windows
os: windows-2019
os: windows-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Setup Rust
uses: dtolnay/rust-toolchain@master
with:
toolchain: '${{ env.RUST_VER }}'
components: clippy
uses: actions/checkout@v4
- name: Setup Rust Cache
uses: Swatinem/rust-cache@v2
@@ -84,8 +75,13 @@ jobs:
if: matrix.use_cross == true
run: curl -fL --retry 3 https://github.com/cross-rs/cross/releases/download/v${{ env.CROSS_VER }}/cross-x86_64-unknown-linux-musl.tar.gz | tar vxz -C /usr/local/bin
- name: Run cargo check
- name: Run cargo/cross check
run: ${{ matrix.use_cross == true && 'cross' || 'cargo' }} check --locked --target ${{ matrix.target }}
- name: Run cargo clippy
- name: Run cargo/cross clippy
run: ${{ matrix.use_cross == true && 'cross' || 'cargo' }} clippy --locked --target ${{ matrix.target }} --all-features -- -D warnings
- name: Run cargo test
# ONLY run test with cargo
if: matrix.use_cross == false
run: cargo test --locked --target ${{ matrix.target }}


@@ -1,59 +0,0 @@
on:
pull_request:
push:
branches:
- main
env:
CARGO_TERM_COLOR: always
name: Test with Code Coverage
jobs:
test:
name: Test
env:
PROJECT_NAME_UNDERSCORE: topgrade
CARGO_INCREMENTAL: 0
RUSTFLAGS: -Zprofile -Ccodegen-units=1 -Copt-level=0 -Clink-dead-code -Coverflow-checks=off -Zpanic_abort_tests -Cpanic=abort
RUSTDOCFLAGS: -Cpanic=abort
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
profile: minimal
toolchain: nightly
override: true
- name: Cache dependencies
uses: actions/cache@v2
env:
cache-name: cache-dependencies
with:
path: |
~/.cargo/.crates.toml
~/.cargo/.crates2.json
~/.cargo/bin
~/.cargo/registry/index
~/.cargo/registry/cache
target
key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ hashFiles('Cargo.lock') }}
- name: Generate test result and coverage report
run: |
cargo install cargo2junit grcov;
cargo test $CARGO_OPTIONS -- -Z unstable-options --format json | cargo2junit > results.xml;
zip -0 ccov.zip `find . \( -name "$PROJECT_NAME_UNDERSCORE*.gc*" \) -print`;
grcov ccov.zip -s . -t lcov --llvm --ignore-not-existing --ignore "/*" --ignore "tests/*" -o lcov.info;
- name: Upload test results
uses: EnricoMi/publish-unit-test-result-action@v1
with:
check_name: Test Results
github_token: ${{ secrets.GITHUB_TOKEN }}
files: results.xml
- name: Upload to CodeCov
uses: codecov/codecov-action@v1
with:
# required for private repositories:
# token: ${{ secrets.CODECOV_TOKEN }}
files: ./lcov.info
fail_ci_if_error: true


@@ -0,0 +1,88 @@
name: Publish release files for CD native environments
on:
# workflow_run:
# workflows: ["Check SemVer compliance"]
# types:
# - completed
release:
types: [ created ]
jobs:
build:
strategy:
fail-fast: false
matrix:
platform: [ ubuntu-latest, macos-latest, macos-13, windows-latest ]
runs-on: ${{ matrix.platform }}
steps:
- uses: actions/checkout@v4
- name: Install cargo-deb
run: cargo install cargo-deb
if: ${{ matrix.platform == 'ubuntu-latest' }}
shell: bash
- name: Check format
run: cargo fmt --all -- --check
- name: Run clippy
run: cargo clippy --all-targets --locked -- -D warnings
- name: Run clippy (All features)
run: cargo clippy --all-targets --locked --all-features -- -D warnings
- name: Run tests
run: cargo test
- name: Build in Release profile with all features enabled
run: cargo build --release --all-features
- name: Rename Release (Unix)
run: |
cargo install default-target
mkdir -p assets
FILENAME=topgrade-${{github.event.release.tag_name}}-$(default-target)
mv target/release/topgrade assets
cd assets
tar --format=ustar -czf $FILENAME.tar.gz topgrade
rm topgrade
ls .
if: ${{ matrix.platform != 'windows-latest' }}
shell: bash
- name: Build Debian-based system binary and create package
# First remove the binary built by previous steps
# because we don't want the auto-update feature,
# then build the new binary without auto-updating.
run: |
rm -rf target/release
cargo build --release
cargo deb --no-build --no-strip
if: ${{ matrix.platform == 'ubuntu-latest' }}
shell: bash
- name: Move Debian-based system package
run: |
mkdir -p assets
mv target/debian/*.deb assets
if: ${{ matrix.platform == 'ubuntu-latest' }}
shell: bash
- name: Rename Release (Windows)
run: |
cargo install default-target
mkdir assets
FILENAME=topgrade-${{github.event.release.tag_name}}-$(default-target)
mv target/release/topgrade.exe assets/topgrade.exe
cd assets
powershell Compress-Archive -Path * -Destination ${FILENAME}.zip
rm topgrade.exe
ls .
if: ${{ matrix.platform == 'windows-latest' }}
shell: bash
- name: Release
uses: softprops/action-gh-release@v2
with:
files: assets/*


@@ -0,0 +1,91 @@
name: Publish release files for non-cd-native environments
on:
# workflow_run:
# workflows: ["Check SemVer compliance"]
# types:
# - completed
release:
types: [ created ]
jobs:
build:
strategy:
fail-fast: false
matrix:
target: [
"aarch64-unknown-linux-gnu",
"armv7-unknown-linux-gnueabihf",
"x86_64-unknown-linux-musl",
"aarch64-unknown-linux-musl",
"x86_64-unknown-freebsd",
]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Install cargo-deb cross compilation dependencies
run: sudo apt-get install libc6-arm64-cross libgcc-s1-arm64-cross
if: ${{ matrix.target == 'aarch64-unknown-linux-gnu' }}
shell: bash
- name: Install cargo-deb
run: cargo install cargo-deb
if: ${{ matrix.target == 'aarch64-unknown-linux-gnu' }}
shell: bash
- name: install targets
run: rustup target add ${{ matrix.target }}
- name: install cross
uses: taiki-e/install-action@v2
with:
tool: cross@0.2.5
- name: Check format
run: cross fmt --all -- --check
- name: Run clippy
run: cross clippy --all-targets --locked --target ${{matrix.target}} -- -D warnings
- name: Run clippy (All features)
run: cross clippy --locked --all-features --target ${{matrix.target}} -- -D warnings
- name: Run tests
run: cross test --target ${{matrix.target}}
- name: Build in Release profile with all features enabled
run: cross build --release --all-features --target ${{matrix.target}}
- name: Rename Release
run: |
mkdir -p assets
FILENAME=topgrade-${{github.event.release.tag_name}}-${{matrix.target}}
mv target/${{matrix.target}}/release/topgrade assets
cd assets
tar --format=ustar -czf $FILENAME.tar.gz topgrade
rm topgrade
ls .
- name: Build Debian-based system package without autoupdate feature
# First remove the binary built by previous steps
# because we don't want the auto-update feature,
# then build the new binary without auto-updating.
run: |
rm -rf target/${{matrix.target}}
cross build --release --target ${{matrix.target}}
cargo deb --target=${{matrix.target}} --no-build --no-strip
if: ${{ matrix.target == 'aarch64-unknown-linux-gnu' }}
shell: bash
- name: Move Debian-based system package
run: |
mkdir -p assets
mv target/${{matrix.target}}/debian/*.deb assets
if: ${{ matrix.target == 'aarch64-unknown-linux-gnu' }}
shell: bash
- name: Release
uses: softprops/action-gh-release@v2
with:
files: assets/*


@@ -1,70 +0,0 @@
name: Publish release files for non-cd-native environments
on:
# workflow_run:
# workflows: ["Check SemVer compliance"]
# types:
# - completed
release:
types: [ created ]
jobs:
build:
strategy:
fail-fast: false
matrix:
target: [ "aarch64-unknown-linux-gnu", "armv7-unknown-linux-gnueabihf", "x86_64-unknown-linux-musl", "aarch64-unknown-linux-musl", "x86_64-unknown-freebsd", ]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
profile: minimal
default: true
override: true
target: ${{ matrix.target }}
components: rustfmt, clippy
- uses: actions-rs/cargo@v1.0.1
name: Check format
with:
use-cross: true
command: fmt
args: --all -- --check
- uses: actions-rs/cargo@v1.0.1
name: Run clippy
with:
command: clippy
use-cross: true
args: --all-targets --locked --target ${{matrix.target}} -- -D warnings
- uses: actions-rs/cargo@v1.0.1
name: Run clippy (All features)
with:
command: clippy
use-cross: true
args: --locked --all-features --target ${{matrix.target}} -- -D warnings
- uses: actions-rs/cargo@v1.0.1
name: Run tests
with:
command: test
use-cross: true
args: --target ${{matrix.target}}
- uses: actions-rs/cargo@v1.0.1
name: Build
with:
command: build
use-cross: true
args: --release --all-features --target ${{matrix.target}}
- name: Rename Release
run: |
mkdir assets
FILENAME=topgrade-${{github.event.release.tag_name}}-${{matrix.target}}
mv target/${{matrix.target}}/release/topgrade assets
cd assets
tar --format=ustar -czf $FILENAME.tar.gz topgrade
rm topgrade
ls .
- name: Release
uses: softprops/action-gh-release@v1
with:
files: assets/*


@@ -1,77 +0,0 @@
name: Publish release files for CD native environments
on:
# workflow_run:
# workflows: ["Check SemVer compliance"]
# types:
# - completed
release:
types: [ created ]
jobs:
build:
strategy:
fail-fast: false
matrix:
platform: [ ubuntu-latest, macos-latest, windows-latest ]
runs-on: ${{ matrix.platform }}
steps:
- uses: actions/checkout@v2
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
profile: minimal
override: true
components: rustfmt, clippy
- uses: actions-rs/cargo@v1.0.1
name: Check format
with:
command: fmt
args: --all -- --check
- uses: actions-rs/cargo@v1.0.1
name: Run clippy
with:
command: clippy
args: --all-targets --locked -- -D warnings
- uses: actions-rs/cargo@v1.0.1
name: Run clippy (All features)
with:
command: clippy
args: --all-targets --locked --all-features -- -D warnings
- uses: actions-rs/cargo@v1.0.1
name: Run tests
with:
command: test
- uses: actions-rs/cargo@v1.0.1
name: Build
with:
command: build
args: --release --all-features
- name: Rename Release (Unix)
run: |
cargo install default-target
mkdir assets
FILENAME=topgrade-${{github.event.release.tag_name}}-$(default-target)
mv target/release/topgrade assets
cd assets
tar --format=ustar -czf $FILENAME.tar.gz topgrade
rm topgrade
ls .
if: ${{ matrix.platform != 'windows-latest' }}
shell: bash
- name: Rename Release (Windows)
run: |
cargo install default-target
mkdir assets
FILENAME=topgrade-${{github.event.release.tag_name}}-$(default-target)
mv target/release/topgrade.exe assets/topgrade.exe
cd assets
powershell Compress-Archive -Path * -Destination ${FILENAME}.zip
rm topgrade.exe
ls .
if: ${{ matrix.platform == 'windows-latest' }}
shell: bash
- name: Release
uses: softprops/action-gh-release@v1
with:
files: assets/*

.github/workflows/release_to_aur.yml

@@ -0,0 +1,31 @@
name: Publish to AUR
on:
# workflow_run:
# workflows: ["Check SemVer compliance"]
# types:
# - completed
push:
tags:
- "v*"
jobs:
aur-publish:
runs-on: ubuntu-latest
steps:
- name: Publish source AUR package
uses: aksh1618/update-aur-package@v1.0.5
with:
tag_version_prefix: v
package_name: topgrade
commit_username: "Thomas Schönauer"
commit_email: t.schoenauer@hgs-wt.at
ssh_private_key: ${{ secrets.AUR_SSH_PRIVATE_KEY }}
- name: Publish binary AUR package
uses: aksh1618/update-aur-package@v1.0.5
with:
tag_version_prefix: v
package_name: topgrade-bin
commit_username: "Thomas Schönauer"
commit_email: t.schoenauer@hgs-wt.at
ssh_private_key: ${{ secrets.AUR_SSH_PRIVATE_KEY }}


@@ -12,7 +12,7 @@ jobs:
prepare:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- uses: actions-rs/toolchain@v1
with:
toolchain: stable
@@ -21,7 +21,7 @@ jobs:
publish:
runs-on: ubuntu-latest
steps:
- uses: katyo/publish-crates@v1
- uses: katyo/publish-crates@v2
with:
dry-run: true
check-repo: ${{ github.event_name == 'push' }}


@@ -19,7 +19,7 @@ jobs:
uses: Homebrew/actions/setup-homebrew@master
- name: Cache Homebrew Bundler RubyGems
id: cache
uses: actions/cache@v1
uses: actions/cache@v4
with:
path: ${{ steps.set-up-homebrew.outputs.gems-path }}
key: ${{ runner.os }}-rubygems-${{ steps.set-up-homebrew.outputs.gems-hash }}
@@ -29,7 +29,8 @@ jobs:
if: steps.cache.outputs.cache-hit != 'true'
run: brew install-bundler-gems
- name: Bump formulae
uses: Homebrew/actions/bump-formulae@master
uses: Homebrew/actions/bump-packages@master
continue-on-error: true
with:
# Custom GitHub access token with only the 'public_repo' scope enabled
token: ${{secrets.HOMEBREW_ACCESS_TOKEN}}


@@ -14,7 +14,7 @@ jobs:
matrix:
target: [x86_64, x86, aarch64]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Build wheels
uses: PyO3/maturin-action@v1
with:
@@ -34,7 +34,7 @@ jobs:
matrix:
target: [x64, x86]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Build wheels
uses: PyO3/maturin-action@v1
with:
@@ -53,7 +53,7 @@ jobs:
matrix:
target: [x86_64, aarch64]
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Build wheels
uses: PyO3/maturin-action@v1
with:
@@ -69,7 +69,7 @@ jobs:
sdist:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Build sdist
uses: PyO3/maturin-action@v1
with:

.github/workflows/release_to_winget.yml

@@ -0,0 +1,13 @@
name: Publish to WinGet
on:
  release:
    types: [released]
jobs:
  publish:
    runs-on: windows-latest
    steps:
      - uses: vedantmgoyal2009/winget-releaser@main
        with:
          identifier: topgrade-rs.topgrade
          max-versions-to-keep: 5 # keep only latest 5 versions
          token: ${{ secrets.WINGET_TOKEN }}


@@ -1,22 +0,0 @@
name: Publish to AUR
on:
  # workflow_run:
  #   workflows: ["Check SemVer compliance"]
  #   types:
  #     - completed
  push:
    tags:
      - "v*"
jobs:
  aur-publish:
    runs-on: ubuntu-latest
    steps:
      - name: Publish AUR package
        uses: ATiltedTree/create-aur-release@v1
        with:
          package_name: topgrade
          commit_username: "Thomas Schönauer"
          commit_email: t.schoenauer@hgs-wt.at
          ssh_private_key: ${{ secrets.AUR_SSH_PRIVATE_KEY }}

.gitignore

@@ -1,4 +1,20 @@
# JetBrains IDEs
.idea/
/target
# Visual Studio
.vs/
# Visual Studio Code
.vscode/
# Generic build outputs
/build
# Specific for some languages like Rust
/target
# LLVM profiling output
*.profraw
# Backup files for any .rs files in the project
**/*.rs.bk

.vscode/launch.json

@@ -1,38 +0,0 @@
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "type": "lldb",
            "request": "launch",
            "name": "Topgrade",
            "console": "integratedTerminal",
            "cargo": {
                "args": [
                    "build",
                    "--bin=topgrade-rs",
                    "--package=topgrade-rs"
                ],
                "filter": {
                    "name": "topgrade-rs",
                    "kind": "bin"
                }
            },
            "args": [
                "--only",
                "${input:step}",
                "-v"
            ],
            "cwd": "${workspaceFolder}"
        },
    ],
    "inputs": [
        {
            "type": "promptString",
            "id": "step",
            "description": "step name",
        }
    ]
}

.vscode/tasks.json

@@ -1,14 +0,0 @@
{
    "version": "2.0.0",
    "tasks": [
        {
            "type": "cargo",
            "command": "clippy",
            "problemMatcher": [
                "$rustc"
            ],
            "group": "test",
            "label": "rust: cargo clippy"
        }
    ]
}


@@ -1,50 +0,0 @@
{
    // Place your topgrade workspace snippets here. Each snippet is defined under a snippet name and has a scope, prefix, body and
    // description. Add comma separated ids of the languages where the snippet is applicable in the scope field. If scope
    // is left empty or omitted, the snippet gets applied to all languages. The prefix is what is
    // used to trigger the snippet and the body will be expanded and inserted. Possible variables are:
    // $1, $2 for tab stops, $0 for the final cursor position, and ${1:label}, ${2:another} for placeholders.
    // Placeholders with the same ids are connected.
    // Example:
    // "Print to console": {
    //     "scope": "javascript,typescript",
    //     "prefix": "log",
    //     "body": [
    //         "console.log('$1');",
    //         "$2"
    //     ],
    //     "description": "Log output to console"
    // }
    "Skip Step": {
        "scope": "rust",
        "prefix": "skipstep",
        "body": [
            "return Err(SkipStep(format!(\"$1\")).into());"
        ]
    },
    "Step": {
        "scope": "rust",
        "prefix": "step",
        "body": [
            "pub fn $1(ctx: &ExecutionContext) -> Result<()> {",
            "    $0",
            "    Ok(())",
            "}"
        ]
    },
    "Require Binary": {
        "scope": "rust",
        "prefix": "req",
        "description": "Require a binary to be installed",
        "body": [
            "let ${1:binary} = require(\"${1:binary}\")?;"
        ]
    },
    "macos": {
        "scope": "rust",
        "prefix": "macos",
        "body": [
            "#[cfg(target_os = \"macos\")]"
        ]
    }
}


@@ -1,12 +0,0 @@
1. In 13.0.0, we introduced a new feature, pushing git repos, now this feature
has been removed as some users are not satisfied with it.
For configuration entries, the following ones are gone:
```toml
[git]
pull_only_repos = []
push_only_repos = []
pull_arguments = ""
push_arguments = ""
```


@@ -48,7 +48,7 @@ To add a new `step` to `topgrade`:
// Invoke the new step to get things updated!
ctx.run_type()
.execute("xxx")
.execute(xxx)
.arg(/* args required by this step */)
.status_checked()
}
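
For orientation, a complete step in this shape usually combines `require`, `print_separator`, and the `ExecutionContext` runner, as in the editor snippets removed from `.vscode` above. The sketch below is illustrative only: `run_foo`, the `foo` binary, and its `upgrade` argument are made-up placeholders, and the usual crate imports are omitted.

```rust
// Illustrative sketch of a new step. `foo` and its `upgrade` subcommand are
// hypothetical placeholders; the helpers (require, print_separator,
// ExecutionContext, Result) are the ones referenced by the snippets above.
pub fn run_foo(ctx: &ExecutionContext) -> Result<()> {
    // Skip this step cleanly when the tool is not installed.
    let foo = require("foo")?;

    print_separator("Foo");

    // Invoke the new step to get things updated!
    ctx.run_type()
        .execute(foo)
        .arg("upgrade")
        .status_checked()
}
```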
@@ -103,8 +103,8 @@ and have some basic documentations guiding user how to use these options.
## Breaking changes
If your PR introduces a breaking change, document it in `BREAKINGCHANGE_dev.md`,
it should be written in Markdown and wrapped in 80, for example:
If your PR introduces a breaking change, document it in [`BREAKINGCHANGES_dev.md`][bc_dev],
it should be written in Markdown and wrapped at 80, for example:
```md
1. The configuration location has been updated to x.
@@ -114,6 +114,8 @@ it should be written in Markdown and wrapped in 80, for example:
3. ...
```
[bc_dev]: https://github.com/topgrade-rs/topgrade/blob/main/BREAKINGCHANGES_dev.md
## Before you submit your PR
Make sure your patch passes the following tests on your host:
@@ -127,6 +129,24 @@ $ cargo test
Don't worry about other platforms, we have most of them covered in our CI.
## I18n
If your PR introduces user-facing messages, we need to ensure they are translated.
Please add the translations to [`locales/app.yml`][app_yml]. For simple messages
without arguments (e.g., "hello world"), we can simply translate them accordingly
(Tip: ChatGPT or similar LLMs are good at translation). If a message contains
arguments, e.g., "hello <NAME>", please follow this convention:
```yml
"hello {name}": # key
  en: "hello %{name}" # translation
```
Arguments in the key should be in the format `{argument_name}`, and they will have
a preceding `%` when used in translations.
[app_yml]: https://github.com/topgrade-rs/topgrade/blob/main/locales/app.yml
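
For reference, a key declared this way is looked up from the Rust side with rust-i18n's `t!` macro, passing the arguments by name, as in the existing calls elsewhere in this changeset. A minimal sketch (the `hello {name}` key is just the example above, not a real entry in `locales/app.yml`):

```rust
use rust_i18n::t;

fn greet(name: &str) {
    // The key uses `{name}`; the per-language translations use `%{name}`.
    println!("{}", t!("hello {name}", name = name));
}
```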
## Some tips
1. Locale

Cargo.lock

File diff suppressed because it is too large

Cargo.toml

@@ -5,9 +5,10 @@ categories = ["os"]
keywords = ["upgrade", "update"]
license = "GPL-3.0"
repository = "https://github.com/topgrade-rs/topgrade"
version = "14.0.0"
rust-version = "1.76.0"
version = "16.0.2"
authors = ["Roey Darwish Dror <roey.ghost@gmail.com>", "Thomas Schönauer <t.schoenauer@hgs-wt.at>"]
exclude = ["doc/screenshot.gif", "BREAKINGCHNAGES_dev.md"]
exclude = ["doc/screenshot.gif", "BREAKINGCHANGES_dev.md"]
edition = "2021"
readme = "README.md"
@@ -22,24 +23,24 @@ path = "src/main.rs"
[dependencies]
home = "~0.5"
etcetera = "~0.8"
once_cell = "~1.18"
once_cell = "~1.19"
serde = { version = "~1.0", features = ["derive"] }
toml = "0.8"
which_crate = { version = "~4.1", package = "which" }
which_crate = { version = "~6.0", package = "which" }
shellexpand = "~3.1"
clap = { version = "~4.4", features = ["cargo", "derive"] }
clap_complete = "~4.4"
clap = { version = "~4.5", features = ["cargo", "derive"] }
clap_complete = "~4.5"
clap_mangen = "~0.2"
walkdir = "~2.4"
walkdir = "~2.5"
console = "~0.15"
lazy_static = "~1.4"
chrono = "~0.4"
glob = "~0.3"
strum = { version = "~0.24", features = ["derive"] }
strum = { version = "~0.26", features = ["derive"] }
thiserror = "~1.0"
tempfile = "~3.8"
tempfile = "~3.10"
cfg-if = "~1.0"
tokio = { version = "~1.34", features = ["process", "rt-multi-thread"] }
tokio = { version = "~1.38", features = ["process", "rt-multi-thread"] }
futures = "~0.3"
regex = "~1.10"
semver = "~1.0"
@@ -49,7 +50,10 @@ tracing = { version = "~0.1", features = ["attributes", "log"] }
tracing-subscriber = { version = "~0.3", features = ["env-filter", "time"] }
merge = "~0.1"
regex-split = "~0.1"
notify-rust = "~4.10"
notify-rust = "~4.11"
wildmatch = "2.3.0"
rust-i18n = "3.0.1"
sys-locale = "0.3.1"
[package.metadata.generate-rpm]
assets = [{ source = "target/release/topgrade", dest = "/usr/bin/topgrade" }]
@@ -58,15 +62,23 @@ assets = [{ source = "target/release/topgrade", dest = "/usr/bin/topgrade" }]
git = "*"
[package.metadata.deb]
depends = "$auto,git"
name = "topgrade"
maintainer = "Chris Gelatt <kreeblah@gmail.com>"
copyright = "2024, Topgrade Team"
license-file = ["LICENSE", "0"]
depends = "$auto"
extended-description = "Keeping your system up to date usually involves invoking multiple package managers. This results in big, non-portable shell one-liners saved in your shell. To remedy this, Topgrade detects which tools you use and runs the appropriate commands to update them."
section = "utils"
priority = "optional"
default-features = true
[target.'cfg(unix)'.dependencies]
nix = { version = "~0.27", features = ["hostname", "signal", "user"] }
rust-ini = "~0.19"
self_update_crate = { version = "~0.30", default-features = false, optional = true, package = "self_update", features = ["archive-tar", "compression-flate2", "rustls"] }
nix = { version = "~0.29", features = ["hostname", "signal", "user"] }
rust-ini = "~0.21"
self_update_crate = { version = "~0.40", default-features = false, optional = true, package = "self_update", features = ["archive-tar", "compression-flate2", "rustls"] }
[target.'cfg(windows)'.dependencies]
self_update_crate = { version = "~0.30", default-features = false, optional = true, package = "self_update", features = ["archive-zip", "compression-zip-deflate", "rustls"] }
self_update_crate = { version = "~0.40", default-features = false, optional = true, package = "self_update", features = ["archive-zip", "compression-zip-deflate", "rustls"] }
winapi = "~0.3"
parselnk = "~0.1"

README.md

@@ -29,15 +29,16 @@ To remedy this, **Topgrade** detects which tools you use and runs the appropriat
- NixOS: [Nixpkgs](https://search.nixos.org/packages?show=topgrade)
- Void Linux: [XBPS](https://voidlinux.org/packages/?arch=x86_64&q=topgrade)
- macOS: [Homebrew](https://formulae.brew.sh/formula/topgrade) or [MacPorts](https://ports.macports.org/port/topgrade/)
- Windows: [Scoop](https://github.com/ScoopInstaller/Main/blob/master/bucket/topgrade.json)
- Windows: [Chocolatey][choco], [Scoop][scoop] or [Winget][winget]
- PyPi: [pip](https://pypi.org/project/topgrade/)
[choco]: https://community.chocolatey.org/packages/topgrade
[scoop]: https://scoop.sh/#/apps?q=topgrade
[winget]: https://winstall.app/apps/topgrade-rs.topgrade
Users of other systems can either use `cargo install` or the compiled binaries from the release page.
The compiled binaries contain a self-upgrading feature.
> Currently, Topgrade requires Rust 1.65 or above. In general, Topgrade tracks
> the latest stable toolchain.
## Usage
Just run `topgrade`.

RELEASE_PROCEDURE.md

@@ -0,0 +1,69 @@
> This document lists the steps that lead to a successful release of Topgrade.
1. Open a PR that:
> Here is an [Example PR](https://github.com/topgrade-rs/topgrade/pull/652)
> that you can refer to.
1. bumps the version number.
> If there are breaking changes, the major version number should be increased.
2. If the major version number gets bumped, update [SECURITY.md][SECURITY_file_link].
[SECURITY_file_link]: https://github.com/topgrade-rs/topgrade/blob/main/SECURITY.md
3. Overwrite [`BREAKINGCHANGES`][breaking_changes] with
[`BREAKINGCHANGES_dev`][breaking_changes_dev], and create a new dev file:
```sh
$ cd topgrade
$ mv BREAKINGCHANGES_dev.md BREAKINGCHANGES.md
$ touch BREAKINGCHANGES_dev.md
```
[breaking_changes_dev]: https://github.com/topgrade-rs/topgrade/blob/main/BREAKINGCHANGES_dev.md
[breaking_changes]: https://github.com/topgrade-rs/topgrade/blob/main/BREAKINGCHANGES.md
2. Check and merge that PR.
3. Go to the [release](https://github.com/topgrade-rs/topgrade/releases) page
and click the [Draft a new release button](https://github.com/topgrade-rs/topgrade/releases/new)
4. Write the release notes
We usually use GitHub's [Automatically generated release notes][auto_gen_release_notes]
functionality to generate release notes, but you can also write your own instead.
[auto_gen_release_notes]: https://docs.github.com/en/repositories/releasing-projects-on-github/automatically-generated-release-notes
5. Attach binaries
You don't need to do this, as our CI will do it automatically: binaries for
Linux, macOS, and Windows will be created and attached.
The CI will also publish the new release to:
1. AUR
2. PyPi
3. Homebrew (seems that this is not working correctly)
4. Winget
6. Manually release it to Crates.io
> Yeah, this is unfortunate; our CI won't do this for us. We should probably add one.
1. `cd` to the Topgrade directory and make sure it is at the latest version
(i.e., including the PR that bumps the version number).
2. Set up your token with `cargo login`.
3. Dry-run the publish with `cargo publish --dry-run`.
4. If step 3 works, do the final release with `cargo publish`.
> You can also take a look at the official tutorial [Publishing on crates.io][doc]
>
> [doc]: https://doc.rust-lang.org/cargo/reference/publishing.html

SECURITY.md

@@ -6,6 +6,6 @@ We only support the latest major version and each subversion.
| Version | Supported |
| -------- | ------------------ |
| 10.0.x | :white_check_mark: |
| < 10.0 | :x: |
| 16.0.x | :white_check_mark: |
| < 16.0 | :x: |

config.example.toml

@@ -47,6 +47,12 @@
# Run inside tmux (default: false)
# run_in_tmux = true
# Changes the way topgrade interacts with
# the tmux session, creating the session
# and only attaching to it if not inside tmux
# (default: "attach_if_not_in_session", allowed values: "attach_if_not_in_session", "attach_always")
# tmux_session_mode = "attach_if_not_in_session"
# Cleanup temporary or old files (default: false)
# cleanup = true
@@ -97,15 +103,44 @@
# enable_pipupgrade = true ###disabled by default
# pipupgrade_arguments = "-y -u --pip-path pip" ###disabled by default
# For the poetry step, by default, Topgrade skips its update if poetry is not
# installed with the official script. This configuration entry forces Topgrade
# to run the update in this case.
#
# (default: false)
# poetry_force_self_update = true
[composer]
# self_update = true
[brew]
# For the BrewCask step
# If `Repo Cask Upgrade` exists, then use the `-a` option.
# Otherwise, use the `--greedy` option.
# greedy_cask = true
# For the BrewCask step
# If `Repo Cask Upgrade` does not exist, then use the `--greedy_latest` option.
# NOTE: the above entry `greedy_cask` subsumes this entry; you can enable
# both of them and they won't clash with each other.
# greedy_latest = true
# For the BrewCask step
# If `Repo Cask Upgrade` does not exist, then use the `--greedy_auto_updates` option.
# NOTE: the above entry `greedy_cask` subsumes this entry; you can enable
# both of them and they won't clash with each other.
# greedy_auto_updates = true
# For the BrewFormula step
# Execute `brew autoremove` after the step.
# autoremove = true
# For the BrewFormula step
# Upgrade formulae built from the HEAD branch; `brew upgrade --fetch-HEAD`
# fetch_head = true
[linux]
# Arch Package Manager to use.
@@ -144,6 +179,11 @@
# rpm_ostree = false
# For Fedora/CentOS/RHEL Atomic variants, if `bootc` is available and this configuration entry is set to true, use
# it to do the update - will also supersede rpm-ostree if enabled
# (default: false)
# bootc = false
# nix_arguments = "--flake"
# nix_env_arguments = "--prebuilt-only"
@@ -153,6 +193,7 @@
[git]
# How many repos to pull at max in parallel
# max_concurrency = 5
# Additional git repositories to pull
@@ -167,6 +208,7 @@
# Arguments to pass Git when pulling Repositories
# arguments = "--rebase --autostash"
[windows]
# Manually select Windows updates
# accept_all_updates = false
@@ -182,9 +224,6 @@
# manager such as Scoop or Cargo
# self_rename = true
# Enable WinGet upgrade
# enable_winget = true
[npm]
# Use sudo if the NPM directory isn't owned by the current user
@@ -196,6 +235,11 @@
# use_sudo = true
[deno]
# Upgrade deno executable to the given version.
# version = "stable"
[vim]
# For `vim-plug`, execute `PlugUpdate!` instead of `PlugUpdate`
# force_plug_update = true
@@ -227,4 +271,22 @@
# containers = ["archlinux-latest"]
[containers]
# ignored_containers = ["ghcr.io/rancher-sandbox/rancher-desktop/rdx-proxy:latest"]
# Specify the containers to ignore while updating (Wildcard supported)
# ignored_containers = ["ghcr.io/rancher-sandbox/rancher-desktop/rdx-proxy:latest", "docker.io*"]
# Specify the runtime to use for containers (default: "docker", allowed values: "docker", "podman")
# runtime = "podman"
[lensfun]
# If disabled, Topgrade invokes `lensfunupdatedata` without root privilege,
# and the update will only be available to you. Otherwise, `sudo` is required,
# and the update will be installed system-wide, i.e., available to all users.
# (default: false)
# use_sudo = false
[julia]
# If disabled, Topgrade invokes julia with the --startup-file=no CLI option.
#
# This may be desirable to avoid loading outdated packages with "using" directives
# in the startup file, which might cause the update run to fail.
# (default: true)
# startup_file = true

locales/app.yml

@@ -0,0 +1,795 @@
_version: 2
"Current system locale is {system_locale}":
en: "Current system locale is %{system_locale}"
es: "La configuración regional del sistema es %{system_locale}"
fr: "Le paramètre linguistique actuel du système est %{system_locale}"
zh_TW: "目前語言為 %{system_locale}"
"Dry running: {program_name} {arguments}":
en: "Dry running: %{program_name} %{arguments}"
es: "Simulando: %{program_name} %{arguments}"
fr: "Simulation : %{program_name} %{arguments}"
zh_TW: "正在模擬 %{program_name} %{arguments} 的執行過程"
"in {directory}":
en: "in %{directory}"
es: "en %{directory}"
fr: "dans %{directory}"
zh_TW: "在 %{directory}"
"Rebooting...":
en: "Rebooting..."
es: "Reiniciando..."
fr: "Redémarrage..."
zh_TW: "正在重新啟動..."
"Plugins upgraded":
en: "Plugins upgraded"
es: "Plugins actualizados"
fr: "Plugins mis à jour"
zh_TW: "已更新所有擴充功能"
"Would self-update":
en: "Would self-update"
es: "Se actualizara automáticamente"
fr: "Se mettrait à jour lui-même"
zh_TW: "將自行更新"
"Pulling":
en: "Pulling"
es: "Extrayendo"
fr: "Récupération"
zh_TW: "正在拉取"
"No Breaking changes":
en: "No Breaking changes"
es: "Sin Cambios Importantes"
fr: "Pas de changement cassant"
zh_TW: "無重大更改"
"Dropping you to shell. Fix what you need and then exit the shell.":
en: "Dropping you to shell. Fix what you need and then exit the shell."
es: "Cambiando al shell. Arregla lo que necesites y luego sal del shell."
fr: "Ouverture d'un shell. Réparez ce dont vous avez besoin et quittez le shell."
zh_TW: "已切換到終端殼層。修復完畢後請退出殼層以繼續。"
"Topgrade launched in a new tmux session":
en: "Topgrade launched in a new tmux session"
es: "Topgrade lanzado en una nueva sesión tmux"
fr: "Topgrade lancé dans une nouvelle session tmux"
zh_TW: "Topgrade 已啟動新 tmux 程序"
'Topgrade upgraded to {version}:\n':
en: 'Topgrade upgraded to %{version}:\n'
es: 'Topgrade actualizado a %{version}:\n'
fr: 'Topgrade mis à jour vers %{version}:\n'
zh_TW: '已將 Topgrade 更新至 %{version}\n'
"Topgrade is up-to-date":
en: "Topgrade is up-to-date"
es: "Topgrade está actualizado"
fr: "Topgrade est à jour"
zh_TW: "Topgrade 為最新版本"
"Updating modules...":
en: "Updating modules..."
es: "Actualizando módulos..."
fr: "Mise à jour des modules..."
zh_TW: "正在更新模組..."
"Powershell Modules Update":
en: "Powershell Modules Update"
es: "Actualización de módulos Powershell"
fr: "Mise à jour des modules Powershell"
zh_TW: "Powershell 模組更新"
"Powershell is not installed":
en: "Powershell is not installed"
es: "Powershell no está instalado"
fr: "Powershell n'est pas installé"
zh_TW: "未安裝 Powershell"
"Error detecting current distribution: {error}":
en: "Error detecting current distribution: %{error}"
es: "Error al detectar la distribución actual: %{error}"
fr: "Erreur lors de la détection de la distribution acutelle: %{error}"
zh_TW: "無法偵測作業系統:%{error}"
"Error: {error}":
en: "Error: %{error}"
es: "Error: %{error}"
fr: "Erreur: %{error}"
zh_TW: "錯誤:%{error}"
"Failed":
en: "Failed"
es: "Fallido"
fr: "Échec"
zh_TW: "失敗"
"pulling":
en: "pulling"
es: "extracción"
fr: "récupérer"
zh_TW: "正在拉取"
"Changed":
en: "Changed"
es: "Cambiado"
fr: "Modifié"
zh_TW: "已更改"
"Up-to-date":
en: "Up-to-date"
es: "Actualizado"
fr: "À jour"
zh_TW: "已為最新版本"
"Self update":
en: "Self update"
es: "Autoactualización"
fr: "Auto mise à jour"
zh_TW: "自行更新"
# The following 2 strings are used in the same sentence
# i.e. *Only* updated repositories will be shown...
# Note that the text *Only* is highlighted in green
"Only":
en: "Only"
es: "Solo"
fr: "Seulement"
zh_TW: "將僅"
"updated repositories will be shown...":
en: "updated repositories will be shown..."
es: "se mostrarán los repositorios actualizados..."
fr: "les dépôts mis à jour seront affichés..."
zh_TW: "顯示被更新的 git 來源..."
"because it has no remotes":
en: "because it has no remotes"
es: "porque no tiene fuentes remotas"
fr: "parce qu'il n'a aucun dépôt distant"
zh_TW: "因為其沒有遠端來源"
"Skipping":
en: "Skipping"
es: "Omitiendo"
fr: "Ignoré"
zh_TW: "正在略過"
"Aura(<0.4.6) requires sudo installed to work with AUR packages":
en: "Aura(<0.4.6) requires sudo installed to work with AUR packages"
es: "Aura(<0.4.6) requiere tener sudo instalado para funcionar con paquetes AUR"
fr: "Aura(<0.4.6) nécessite sudo pour fonctionner avec les paquets AUR"
zh_TW: "Aura<0.4.6)依賴 sudo 安裝 AUR 套件"
"Pacman backup configuration files found:":
en: "Pacman backup configuration files found:"
es: "Archivos de respaldo de Pacman encontrados:"
fr: "Fichiers de configuration de sauvegarde de Pacman trouvés :"
zh_TW: "找到 Pacman 設定備份檔:"
"The package audit was successful, but vulnerable packages still remain on the system":
en: "The package audit was successful, but vulnerable packages still remain on the system"
es: "La auditoría del paquete fue exitosa, pero aún quedan paquetes vulnerables en el sistema"
fr: "L'audit des paquets a réussi, mais des paquets vulnérables restent toujours sur le système"
zh_TW: "雖然套件檢測成功,但系統仍然包含危險套件"
"Syncing portage":
en: "Syncing portage"
es: "Sincronizando portage"
fr: "Synchronisation du portage"
zh_TW: "正在同步 portage"
"Finding available software":
en: "Finding available software"
es: "Buscando software disponible"
fr: "Recherche de logiciels disponible"
zh_TW: "正在尋找軟體"
"A system update is available. Do you wish to install it?":
en: "A system update is available. Do you wish to install it?"
es: "Hay una actualización del sistema disponible. ¿Desea instalarla?"
fr: "Une mise à jour du système est disponible. Voulez-vous l'installer ?"
zh_TW: "系統更新已就緒。是否現在安裝?"
"No new software available.":
en: "No new software available."
es: "No hay ningún software nuevo disponible."
fr: "Aucun nouveau logiciel disponible."
zh_TW: "沒有新軟體。"
"No Xcode releases installed.":
en: "No Xcode releases installed."
es: "No hay versiones de Xcode instaladas."
fr: "Aucune version de Xcode installée."
zh_TW: "尚未安裝 Xcode 發行版。"
"Would you like to move the former Xcode release to the trash?":
en: "Would you like to move the former Xcode release to the trash?"
es: "¿Le gustaría mover la versión anterior de Xcode a la papelera?"
fr: "Voulez-vous déplacer la précédente version de Xcode à la corbeille ?"
zh_TW: "是否將舊版 Xcode 移至垃圾桶?"
"New Xcode release detected:":
en: "New Xcode release detected:"
es: "Nueva versión de Xcode detectada:"
fr: "Nouvelle version de Xcode détectée :"
zh_TW: "有新的 Xcode 版本:"
"Would you like to install it?":
en: "Would you like to install it?"
es: "¿Le gustaría instalarlo?"
fr: "Voulez-vous l'installer ?"
zh_TW: "是否現在安裝?"
"No global packages installed":
en: "No global packages installed"
es: "No hay paquetes globales instalados"
fr: "Aucun paquet global n'est installé"
zh_TW: "尚未安裝全域套件"
"Remote Topgrade launched in Tmux":
en: "Remote Topgrade launched in Tmux"
es: "Topgrade remoto lanzado en Tmux"
fr: "Topgrade distant lancé dans Tmux"
zh_TW: "已在 Tmux 中啟動遠端 Topgrade 程序"
"Remote Topgrade launched in an external terminal":
en: "Remote Topgrade launched in an external terminal"
es: "Topgrade remoto iniciado en una terminal externa"
fr: "Topgrade distant lancé dans un terminal externe"
zh_TW: "已在新終端機視窗中啟動遠端 Topgrade"
"Collecting Vagrant boxes":
en: "Collecting Vagrant boxes"
es: "Recolectando cajas Vagrant"
fr: "Collecte des boîtes Vagrant"
zh_TW: "正在蒐集 Vagrant 容器"
"No Vagrant directories were specified in the configuration file":
en: "No Vagrant directories were specified in the configuration file"
es: "No se especificaron directorios Vagrant en el archivo de configuración"
fr: "Aucun répertoire Vagrant n'est spécifié dans le fichier de configuration"
zh_TW: "尚未在設定檔中指定 Vagrant 資料夾"
"Vagrant boxes":
en: "Vagrant boxes"
es: "Cajas Vagrant"
fr: "Boîtes Vagrant"
zh_TW: "Vagrant 容器"
"No outdated boxes":
en: "No outdated boxes"
es: "Sin cajas obsoletas"
fr: "Aucune boîte obsolète"
zh_TW: "未有需要更新的容器"
"Summary":
en: "Summary"
es: "Resumen"
fr: "Résumé"
zh_TW: "結果"
"Topgrade finished with errors":
en: "Topgrade finished with errors"
es: "Topgrade finalizado con errores"
fr: "Topgrade terminé avec des erreurs"
zh_TW: "Topgrade 執行部分成功"
"Topgrade finished successfully":
en: "Topgrade finished successfully"
es: "Topgrade finalizó exitosamente"
fr: "Topgrade terminé avec succès"
zh_TW: "Topgrade 執行成功"
"Topgrade {version_str} Breaking Changes":
en: "Topgrade %{version_str} Breaking Changes"
es: "Topgrade %{version_str} Cambios Importantes"
fr: "Topgrade %{version_str} Changements Cassants"
zh_TW: "Topgrade %{version_str} 重大更改"
"Path {path} expanded to {expanded}":
en: "Path %{path} expanded to %{expanded}"
es: "Ruta %{path} expandida a %{expanded}"
fr: "Le chemin %{path} a été transformé en %{expanded}"
zh_TW: "已擴展 %{path} 至 %{expanded}"
"Path {path} doesn't exist":
en: "Path %{path} doesn't exist"
es: "La ruta %{path} no existe"
fr: "Le chemin %{path} n'existe pas"
zh_TW: "路徑 %{path} 不存在"
"Cannot find {binary_name} in PATH":
en: "Cannot find %{binary_name} in PATH"
es: "No se pudo encontrar %{binary_name} en PATH"
fr: "Impossible de trouver %{binary_name} dans le PATH"
zh_TW: "在 $PATH 中找不到 %{binary_name} 執行檔"
"Failed to get a UTF-8 encoded hostname":
en: "Failed to get a UTF-8 encoded hostname"
es: "Error al obtener un nombre de host codificado en UTF-8"
fr: "Échec de l'obtention d'un nom d'hôte encodé en UTF-8"
zh_TW: "無法取得 UTF-8 編碼的主機名稱"
"Failed to get hostname: {err}":
en: "Failed to get hostname: %{err}"
es: "Error al obtener el nombre del host: %{err}"
fr: "Échec de l'obtention d'un nom d'hôte: %{err}"
zh_TW: "無法取得主機名稱:%{err}"
"{python} is a Python 2, skip.":
en: "%{python} is a Python 2, skip."
es: "%{python} es Python 2, omitiendo."
fr: "%{python} est un Python 2, ignoré."
zh_TW: "%{python} 是 Python 2略過。"
"{python} is a Python shim, skip.":
en: "%{python} is a Python shim, skip."
es: "%{python} es una corrección de Python, omitiendo."
fr: "%{python} est un shim Python, ignoré."
zh_TW: "%{python} 是 Python shim略過。"
"{key} failed:":
en: "%{key} failed:"
es: "%{key} ha fallado:"
fr: "%{key} a échoué :"
zh_TW: "%{key} 失敗:"
"{step_name} failed":
en: "%{step_name} failed"
es: "%{step_name} fallido"
fr: "%{step_name} a échoué"
zh_TW: "%{step_name} 失敗"
"DragonFly BSD Packages":
en: "DragonFly BSD Packages"
es: "Paquetes BSD de DragonFly"
fr: "Paquets DragonFly BSD"
zh_TW: "DragonFly BSD 套件"
"DragonFly BSD Audit":
en: "DragonFly BSD Audit"
es: "Auditoría de DragonFly BSD"
fr: "Audit de DragonFly BSD"
zh_TW: "DragonFly BSD 紀錄"
"FreeBSD Update":
en: "FreeBSD Update"
es: "Actualización de FreeBSD"
fr: "Mise à jour de FreeBSD"
zh_TW: "FreeBSD 更新"
"FreeBSD Packages":
en: "FreeBSD Packages"
es: "Paquetes FreeBSD"
fr: "Paquets FreeBSD"
zh_TW: "FreeBSD 套件"
"FreeBSD Audit":
en: "FreeBSD Audit"
es: "Auditoría FreeBSD"
fr: "Audit de FreeBSD"
zh_TW: "FreeBSD 紀錄"
"System update":
en: "System update"
es: "Actualización del sistema"
fr: "Mise à jour du système"
zh_TW: "系統更新"
"needrestart will be ran by the package manager":
en: "needrestart will be ran by the package manager"
es: "needrestart será ejecutado por el administrador de paquetes"
fr: "needrestart sera exécuté par le gestionnaire de paquets"
zh_TW: "needrestart 將被套件管理員執行"
"Check for needed restarts":
en: "Check for needed restarts"
es: "Comprobando si es necesario reiniciar el sistema"
fr: "Vérification des redémarrages nécessaires"
zh_TW: "正在檢查是否需要重新啟動系統"
"Should not run in WSL":
en: "Should not run in WSL"
es: "No se debe ejecutar en WSL"
fr: "Ne doit pas être exécuté dans WSL"
zh_TW: "不該在 WSL 中執行"
"Firmware upgrades":
en: "Firmware upgrades"
es: "Actualizaciones de firmware"
fr: "Mises à jour du firmware"
zh_TW: "韌體更新"
"Flatpak System Packages":
en: "Flatpak System Packages"
es: "Paquetes del sistema Flatpak"
fr: "Paquets système Flatpak"
zh_TW: "Flatpak 系統套件"
"Snapd socket does not exist":
en: "Snapd socket does not exist"
es: "El socket Snapd no existe"
fr: "Le socket Snapd n'existe pas"
zh_TW: "找不到 Snapd 程序"
"You need to specify at least one container":
en: "You need to specify at least one container"
es: "Necesita especificar al menos un contenedor"
fr: "Vous devez spécifier au moins un conteneur"
zh_TW: "必須指定至少一個容器"
"Skipped in --yes":
en: "Skipped in --yes"
es: "Omitido por --yes"
fr: "Ignoré avec --yes"
zh_TW: "指定 --yes略過"
"Configuration update":
en: "Configuration update"
es: "Actualización de configuración"
fr: "Mise à jour de la configuration"
zh_TW: "設定更新"
"Going to execute `waydroid upgrade`, which would STOP the running container, is this ok?":
en: "Going to execute `waydroid upgrade`, which would STOP the running container, is this ok?"
es: "Se ejecutará `waydroid update`, lo que DETENDRÁ el contenedor en ejecución. ¿Está bien?"
fr: "`waydroid upgrade` va s'exécuter, ce qui ARRÊTERA le conteneur en cours d'exécution, est-ce ok ?"
zh_TW: "將略過 `waydroid upgrade`,並且「停止」執行容器。是否繼續?"
"Skip the Waydroid step because the user don't want to proceed":
en: "Skip the Waydroid step because the user don't want to proceed"
es: "Omitiendo el paso de Waydroid debido a que el usuario no quiere continuar"
fr: "Passer l'étape Waydroid car l'utilisateur ne souhaite pas l'exécuter"
zh_TW: "使用者指定略過 Waydroid 程序"
"macOS App Store":
en: "macOS App Store"
es: "Tienda de aplicaciones macOS"
fr: "macOS App Store"
zh_TW: "macOS App Store"
"macOS system update":
en: "macOS system update"
es: "Actualización del sistema macOS"
fr: "Mise à jour du système macOS"
zh_TW: "macOS 系統更新"
"OpenBSD Update":
en: "OpenBSD Update"
es: "Actualización de OpenBSD"
fr: "Mise à jour d'OpenBSD"
zh_TW: "OpenBSD 更新"
"OpenBSD Packages":
en: "OpenBSD Packages"
es: "Paquetes OpenBSD"
fr: "Paquets OpenBSD"
zh_TW: "OpenBSD 套件"
"`fisher` is not defined in `fish`":
en: "`fisher` is not defined in `fish`"
es: "`fisher` no está definido en `fish`"
fr: "`fisher` n'est pas reconnu dans `fish`"
zh_TW: "`fisher` 未在 `fish` 中指定"
"`fish_plugins` path doesn't exist: {err}":
en: "`fish_plugins` path doesn't exist: %{err}"
es: "La ruta `fish_plugins` no existe: %{err}"
fr: "Le chemin `fish_plugins` n'existe pas : %{err}"
zh_TW: "不存在 `fish_plugins` 路徑:%{err}"
"`fish_update_completions` is not available":
en: "`fish_update_completions` is not available"
es: "`fish_update_completions` no está disponible"
fr: "`fish_update_completions` n'est pas disponible"
zh_TW: "無法使用 `fish_update_completions`"
"Desktop doest not appear to be gnome":
en: "Desktop doest not appear to be gnome"
es: "El escritorio no parece ser Gnome"
fr: "Le bureau ne semble pas être Gnome"
zh_TW: "桌面環境不是 Gnome"
"Gnome shell extensions are unregistered in DBus":
en: "Gnome shell extensions are unregistered in DBus"
es: "Las extensiones de Gnome Shell no están registradas en DBus"
fr: "Les extensions de Gnome Shell ne sont pas enregistrées dans DBus"
zh_TW: "Gnome Shell 擴充功能在 DBus 中未被註冊"
"Gnome Shell extensions":
en: "Gnome Shell extensions"
es: "Extensiones de Gnome Shell"
fr: "Extensions de Gnome Shell"
zh_TW: "Gnome Shell 擴充功能"
"Not a custom brew for macOS":
en: "Not a custom brew for macOS"
es: "No es un brew personalizado para macOS"
fr: "Pas une version de brew personnalisée pour macOS"
zh_TW: "不是專門的 macOS brew"
"Guix Pull Failed, Skipping":
en: "Guix Pull Failed, Skipping"
es: "Guix Pull Fallido, omitiendo"
fr: "Échec de Guix Pull, ignoré"
zh_TW: "Guix 拉取失敗,略過"
"Nix-darwin on macOS must be upgraded via darwin-rebuild switch":
en: "Nix-darwin on macOS must be upgraded via darwin-rebuild switch"
es: "Nix-darwin en macOS debe actualizarse mediante el interruptor darwin-rebuild"
fr: "Nix-darwin sur macOS doit être mis à niveau via l'option darwin-rebuild"
zh_TW: "Nix-darwin 在 macOS 上必須使用 darwin-rebuild 更新"
"`nix upgrade-nix` can only be used on macOS or non-NixOS Linux":
en: "`nix upgrade-nix` can only be used on macOS or non-NixOS Linux"
es: "`nix update-nix` solo puede usarse en macOS o Linux que no sea NixOS"
fr: "`nix upgrade-nix` ne peut être utilisée que sur macOS ou Linux non-NixOS"
zh_TW: "`nix upgrade-nix` 僅能在 macOS 或非 NixOS 的 Linux 上使用"
"`nix upgrade-nix` cannot be run when Nix is installed in a profile":
en: "`nix upgrade-nix` cannot be run when Nix is installed in a profile"
es: "`nix Upgrade-nix` no puede ejecutarse cuando Nix está instalado en un perfil"
fr: "`nix upgrade-nix` ne peut pas être exécutée lorsque Nix est installé dans un profil"
zh_TW: "`nix upgrade-nix` 無法在已安裝 Nix 使用者環境的系統上使用"
"Nix (self-upgrade)":
en: "Nix (self-upgrade)"
es: "Nix (autoactualización)"
fr: "Nix (auto mise à niveau)"
zh_TW: "Nix自行更新"
"Pyenv is installed, but $PYENV_ROOT is not set correctly":
en: "Pyenv is installed, but $PYENV_ROOT is not set correctly"
es: "Pyenv está instalado, pero $PYENV_ROOT no está configurado correctamente"
fr: "Pyenv est installé, mais $PYENV_ROOT n'est pas défini correctement"
zh_TW: "已安裝 Pyenv 但尚未正確設定 $PYENV_ROOT"
"pyenv is not a git repository":
en: "pyenv is not a git repository"
es: "pyenv no es un repositorio git"
fr: "pyenv n'est pas un dépôt Git"
zh_TW: "pyenv 不是 git 來源"
"Bun Packages":
en: "Bun Packages"
es: "Paquetes Bun"
fr: "Paquets Bun"
zh_TW: "Bun 套件"
"WSL not installed":
en: "WSL not installed"
es: "WSL no instalado"
fr: "WSL n'est pas installé"
zh_TW: "未安裝 WSL"
"Update WSL":
en: "Update WSL"
es: "Actualizando WSL"
fr: "Mise à jour du WSL"
zh_TW: "更新 WSL"
"Could not find Topgrade installed in WSL":
en: "Could not find Topgrade installed in WSL"
es: "Topgrade no se ha instalado dentro de WSL"
fr: "Impossible de trouver Topgrade installé dans WSL"
zh_TW: "尚未在 WSL 內安裝 Topgrade"
"Consider installing PSWindowsUpdate as the use of Windows Update via USOClient is not supported.":
en: "Consider installing PSWindowsUpdate as the use of Windows Update via USOClient is not supported."
es: "Considere instalar PSWindowsUpdate ya que no se admite el uso de Windows Update a través de USOClient."
fr: "Envisagez d'installer PSWindowsUpdate car l'utilisation de Windows Update via USOClient n'est pas prise en charge."
zh_TW: "目前不支援使用 USOClient 管理 Windows 更新。建議安裝 PSWindowsUpdate。"
"USOClient not supported.":
en: "USOClient not supported."
es: "USOClient no es admitido."
fr: "USOClient n'est pas pris en charge."
zh_TW: "不支援 USOClient。"
"Connecting to {hostname}...":
en: "Connecting to %{hostname}..."
es: "Conectándose a %{hostname}..."
fr: "Connexion à %{hostname}..."
zh_TW: "正在連接 %{hostname}..."
"Skipping powered off box {vagrant_box}":
en: "Skipping powered off box %{vagrant_box}"
es: "Omitiendo el contenedor apagado %{vagrant_box}"
fr: "Ingorer la boîte éteinte %{vagrant_box}"
zh_TW: "正在略過已關機的容器 %{vagrant_box}"
"`{repo_tag}` for `{platform}`":
en: "`%{repo_tag}` for `%{platform}`"
es: "`%{repo_tag}` para `%{platform}`"
fr: "`%{repo_tag}` pour `%{platform}`"
zh_TW: "`%{repo_tag}` 給 `%{platform}`"
"Containers":
en: "Containers"
es: "Contenedores"
fr: "Conteneurs"
zh_TW: "容器"
"Emacs directory does not exist":
en: "Emacs directory does not exist"
es: "El directorio Emacs no existe"
fr: "Le répertoire Emacs n'existe pas"
zh_TW: "找不到 Emacs 資料夾"
"Error getting the composer directory: {error}":
en: "Error getting the composer directory: %{error}"
es: "Error al obtener el directorio de composer: %{error}"
fr: "Erreur lors de la récupération du répertoire de Composer : %{error}"
zh_TW: "無法取得 composer 資料夾:%{error}"
"Composer directory {composer_home} isn't a descendant of the user's home directory":
en: "Composer directory %{composer_home} isn't a descendant of the user's home directory"
es: "El directorio de composer %{composer_home} no es descendiente del directorio de inicio del usuario"
fr: "Le répertoire de Composer %{composer_home} n'est pas un descendant du répertoire home de l'utilisateur"
zh_TW: "Composer 資料夾 %{composer_home} 不在家目錄下"
"Composer":
en: "Composer"
es: "Composer"
fr: "Composer"
zh_TW: "Composer"
"Error running `dotnet tool list`. This is expected when a dotnet runtime is installed but no SDK.":
en: "Error running `dotnet tool list`. This is expected when a dotnet runtime is installed but no SDK."
es: "Error al ejecutar `dotnet tool list`. Esto es lo esperado cuando se instala un entorno de ejecución dotnet pero no un SDK."
fr: "Erreur lors de l'exécution de `dotnet tool list`. Ce comportement est attendu lorsque le runtime dotnet est installé mais pas le SDK."
zh_TW: "執行 `dotnet tool list` 失敗。已安裝 dotnet 執行環境但未安裝 SDK"
"No dotnet global tools installed":
en: "No dotnet global tools installed"
es: "No hay herramientas globales dotnet instaladas"
fr: "Aucun outil global dotnet installé"
zh_TW: "尚未安裝全域 dotnet 工具"
"Racket Package Manager":
en: "Racket Package Manager"
es: "Administrador de paquetes Racket"
fr: "Gestionnaire de paquets Racket"
zh_TW: "Racket 套件管理員"
"GH failed":
en: "GH failed"
es: "GH fallido"
fr: "Échec de GH"
zh_TW: "GH 失敗"
"GitHub CLI Extensions":
en: "GitHub CLI Extensions"
es: "Extensiones de GitHub CLI"
fr: "Extensions de Github CLI"
zh_TW: "GitHub CLI 擴充功能"
"Julia Packages":
en: "Julia Packages"
es: "Paquetes Julia"
fr: "Paquets Julia"
zh_TW: "Julia 套件"
"Update ClamAV Database(FreshClam)":
en: "Update ClamAV Database(FreshClam)"
es: "Actualizando base de datos ClamAV (FreshClam)"
fr: "Mise à jour de la base de données ClamAV (FreshClam)"
zh_TW: "更新 ClamAV 資料庫FreshClam"
"Path {pattern} did not contain any git repositories":
en: "Path %{pattern} did not contain any git repositories"
es: "La ruta %{pattern} no contenía ningún repositorio git"
fr: "Le chemin %{pattern} ne contenait aucun dépôt Git"
zh_TW: "路徑 %{pattern} 中沒有任何 git 來源"
"No repositories to pull":
en: "No repositories to pull"
es: "No hay repositorios que extraer"
fr: "Aucun dépôt à récupérer"
zh_TW: "沒有來源可以拉取"
"Git repositories":
en: "Git repositories"
es: "Repositorios Git"
fr: "Dépôts Git"
zh_TW: "Git 來源"
"Would pull {repo}":
en: "Would pull %{repo}"
es: "Extrayendo %{repo}"
fr: "Tirerait %{repo}"
zh_TW: "拉取 %{repo}"
# aka npm
"Node Package Manager":
en: "Node Package Manager"
es: "Node Package Manager (npm)"
fr: "Gestionnaire de paquets Node (npm)"
zh_TW: "Node 套件管理員npm"
# aka pnpm
"Performant Node Package Manager":
en: "Performant Node Package Manager"
es: "Performant Node Package Manager (pnpm)"
fr: "Gestionnaire de paquets Node performant (pnpm)"
zh_TW: "效能 Node 套件管理員pnpm"
"Yarn Package Manager":
en: "Yarn Package Manager"
es: "Administrador de paquetes Yarn"
fr: "Gestionnaire de paquets Yarn"
zh_TW: "Yarn 套件管理員"
"Deno installed outside of .deno directory":
en: "Deno installed outside of .deno directory"
es: "Deno está instalado fuera del directorio .deno"
fr: "Deno est installé en dehors du répertoire .deno"
zh_TW: "Deno 安裝在 .deno 資料夾外"
"The Ultimate vimrc":
en: "The Ultimate vimrc"
es: "El vimrc definitivo (The Ultimate vimrc)"
fr: "The Ultimate vimrc"
zh_TW: "終極 vimrcThe Ultimate vimrc"
"vim binary might be actually nvim":
en: "vim binary might be actually nvim"
es: "el binario vim puede ser nvim"
fr: "Le binaire vim pourrait être en réalité nvim"
zh_TW: "vim 執行檔可能為 nvim"
"`{process}` failed: {exit_status}":
en: "`%{process}` failed: %{exit_status}"
es: "`%{process}` falló: %{exit_status}"
fr: "`%{process}` a échoué : %{exit_status}"
zh_TW: "`%{process}` 失敗:%{exit_status}"
"`{process}` failed: {exit_status} with {output}":
en: "`%{process}` failed: %{exit_status} with %{output}"
es: "`%{process}` falló: %{exit_status} con %{output}"
fr: "`%{process}` a échoué : %{exit_status} avec %{output}"
zh_TW: "`%{process}` 失敗:%{exit_status} 伴隨 %{output}"
"Unknown Linux Distribution":
en: "Unknown Linux Distribution"
es: "Distribución de Linux desconocida"
fr: "Distribution Linux inconnue"
zh_TW: "未知 Linux"
'File "/etc/os-release" does not exist or is empty':
en: 'File "/etc/os-release" does not exist or is empty'
es: 'El archivo "/etc/os-release" no existe o está vacío'
fr: "Le fichier \"/etc/os-release\" n'existe pas ou est vide"
zh_TW: '「/etc/os-release」不存在或為空'
"Failed getting the system package manager":
en: "Failed getting the system package manager"
es: "Error al obtener el administrador de paquetes del sistema"
fr: "Échec de l'obtention du gestionnaire de paquets système"
zh_TW: "偵測系統套件管理員失敗"
"A step failed":
en: "A step failed"
es: "Un paso fallido"
fr: "Une étape a échouée"
zh_TW: "某步驟執行失敗"
"Dry running":
en: "Dry running"
es: "Simulando"
fr: "Simulation"
zh_TW: "模擬執行"
"Topgrade Upgraded":
en: "Topgrade Upgraded"
es: "Topgrade Actualizado"
fr: "Topgrade mis à jour"
zh_TW: "已更新 Topgrade"
# Summary texts
"OK":
en: "OK"
es: "OK"
fr: "OK"
zh_TW: "成功"
"FAILED":
en: "FAILED"
es: "FALLIDO"
fr: "ÉCHEC"
zh_TW: "失敗"
"IGNORED":
en: "IGNORED"
es: "IGNORADO"
fr: "IGNORÉ"
zh_TW: "忽略"
"SKIPPED":
en: "SKIPPED"
es: "OMITIDO"
fr: "PASSÉ"
zh_TW: "略過"
# 'Y' and 'N' have to stay the same characters. Eg for German the translation
# would look sth like "(Y) Ja / (N) Nein"
"(Y)es/(N)o":
en: "(Y)es/(N)o"
es: "(Y) Si / (N) No"
fr: "(Y) Oui / (N) Non"
zh_TW: "(Y)是/(N)否"
# 'y', 'N', 's', 'q' have to stay the same throughout all translations.
# Eg German would look like "(y) Wiederholen / (N) Nein / (s) Konsole / (q) Beenden"
"Retry? (y)es/(N)o/(s)hell/(q)uit":
en: "Retry? (y)es/(N)o/(s)hell/(q)uit"
es: "¿Reintentar? (y) Si / (N) No / (s) Shell / (q) Salir"
fr: "Réessayer ? (y) Oui / (N) Non / (s) Shell / (q) Quitter"
zh_TW: "再試一次? (y)是/(N)否/(s)殼層/(q)退出"
# 'R', 'S', 'Q' have to stay the same throughout all translations. Eg German would look like "\n(R) Neustarten\n(S) Konsole\n(Q) Beenden"
'\n(R)eboot\n(S)hell\n(Q)uit':
en: '\n(R)eboot\n(S)hell\n(Q)uit'
es: "\n(R) Reiniciar\n(S) Shell\n(Q) Salir"
fr: '\n(R) Redémarrer\n(S) Shell\n(Q) Quitter'
zh_TW: '\n(R)重新啟動\n(S)殼層\n(Q)退出'
"Require sudo or counterpart but not found, skip":
en: "Require sudo or counterpart but not found, skip"
es: "Se requiere sudo o su equivalente pero no ha sido encontrado, omitiendo"
fr: "Nécessite sudo ou un équivalent mais n'a pas été trouvé, passé"
zh_TW: "找不到權限管理程式sudo 等),略過"
"sudo as user '{user}'":
en: "sudo as user '%{user}'"
es: "sudo como usuario '%{user}'"
fr: "sudo en tant qu'utilisateur '%{user}'"
zh_TW: "sudo 以使用者 '%{user}'"
"Updating aqua ...":
en: "Updating aqua ..."
es: "Actualizando aqua..."
fr: "Mise à jour d'aqua..."
zh_TW: "正在更新 aqua..."
"Updating aqua installed cli tools ...":
en: "Updating aqua installed cli tools ..."
es: "Actualizando las herramientas CLI instaladas en aqua ..."
fr: "Mise à jour des outils cli installés d'aqua..."
zh_TW: "正在更新 aqua 安裝的命令行介面工具..."
"Updating Volta packages...":
en: "Updating Volta packages..."
es: "Actualizando paquetes Volta..."
fr: "Mise à jour des paquets Volta..."
zh_TW: "正在更新 Volta 套件..."
"No packages installed with Volta":
en: "No packages installed with Volta"
es: "No hay paquetes instalados con Volta"
fr: "Aucun paquet installé avec Volta"
zh_TW: "沒有任何 Volta 套件"
"pyenv-update plugin is not installed":
en: "pyenv-update plugin is not installed"
es: "El plugin pyenv-update no está instalado"
fr: "Le plugin pyenv-update n'est pas installé"
zh_TW: "尚未安裝 pyenv-update 擴充功能"
"Respawning...":
en: "Respawning..."
es: "Reapareciendo..."
fr: "Relancement..."
zh_TW: "正在重新生成..."
"Could not find Topgrade in any WSL disribution":
en: "Could not find Topgrade in any WSL disribution"
es: "No se pudo encontrar Topgrade en ninguna distribución WSL"
fr: "Impossible de trouver Topgrade dans aucune distribution WSL"
zh_TW: "在所有 WSL 中找不到 Topgrade"
"Windows Update":
en: "Windows Update"
es: "Actualización de Windows"
fr: "Mise à jour de Windows"
zh_TW: "Windows 更新"
"Would check if OpenBSD is -current":
en: "Would check if OpenBSD is -current"
es: "Comprobaría si OpenBSD está en -current"
fr: "Vérifierait si OpenBSD est à -curent"
zh_TW: "會檢查 OpenBSD 是否為 -current"
"Would upgrade the OpenBSD system":
en: "Would upgrade the OpenBSD system"
es: "Actualizaría el sistema OpenBSD"
fr: "Mettrait à jour le système OpenBSD"
zh_TW: "會升級 OpenBSD 系統"
"Would upgrade OpenBSD packages":
en: "Would upgrade OpenBSD packages"
es: "Actualizaría los paquetes de OpenBSD"
fr: "Mettrait à jour les paquets OpenBSD"
zh_TW: "會升級 OpenBSD 套件"
"Microsoft Store":
en: "Microsoft Store"
es: "Tienda de Microsoft"
fr: "Microsoft Store"
zh_TW: "Microsoft Store"
"Scanning for updates...":
en: "Scanning for updates..."
es: "Buscando actualizaciones..."
fr: "Recherche de mises à jour..."
zh_TW: "正在掃描更新..."
"Success, Microsoft Store apps are being updated in the background":
en: "Success, Microsoft Store apps are being updated in the background"
es: "Éxito, las aplicaciones de Microsoft Store se están actualizando en segundo plano"
fr: "Succès, les applications du Microsoft Store sont en cours de mise à jour en arrière plan"
zh_TW: "成功Microsoft Store 應用程式正在後台更新"
"Unable to update Microsoft Store apps, manual intervention is required":
en: "Unable to update Microsoft Store apps, manual intervention is required"
es: "No se pueden actualizar las aplicaciones de Microsoft Store, se requiere intervención manual"
fr: "Impossible de mettre à jour les applications du Microsoft Store, une intervention manuelle est nécessaire"
zh_TW: "無法更新 Microsoft Store 應用,需手動幹預"

rust-toolchain.toml

@@ -0,0 +1,2 @@
[toolchain]
channel = "1.76.0"


@@ -11,7 +11,9 @@ use crate::WINDOWS_DIRS;
use crate::XDG_DIRS;
use color_eyre::eyre::Result;
use etcetera::base_strategy::BaseStrategy;
use rust_i18n::t;
use std::{
env::var,
fs::{read_to_string, OpenOptions},
io::Write,
path::PathBuf,
@@ -44,7 +46,7 @@ impl FromStr for Version {
// They cannot be all 0s
assert!(
!(major == 0 && minor == 0 && patch == 0),
"Version numbers can not be all 0s"
"Version numbers cannot be all 0s"
);
Ok(Self {
@@ -89,6 +91,16 @@ fn keep_file_path() -> PathBuf {
data_dir().join(keep_file)
}
/// If environment variable `TOPGRADE_SKIP_BRKC_NOTIFY` is set to `true`, then
/// we won't notify the user of the breaking changes.
pub(crate) fn should_skip() -> bool {
if let Ok(var) = var("TOPGRADE_SKIP_BRKC_NOTIFY") {
return var.as_str() == "true";
}
false
}
/// True if this is the first execution of a major release.
pub(crate) fn first_run_of_major_release() -> Result<bool> {
let version = VERSION_STR.parse::<Version>().expect("should be a valid version");
@@ -107,12 +119,15 @@ pub(crate) fn first_run_of_major_release() -> Result<bool> {
/// Print breaking changes to the user.
pub(crate) fn print_breaking_changes() {
let header = format!("Topgrade {VERSION_STR} Breaking Changes");
let header = format!(
"{}",
t!("Topgrade {version_str} Breaking Changes", version_str = VERSION_STR)
);
print_separator(header);
let contents = if BREAKINGCHANGES.is_empty() {
"No Breaking changes"
t!("No Breaking changes").to_string()
} else {
BREAKINGCHANGES
BREAKINGCHANGES.to_string()
};
println!("{contents}\n");
}
@@ -148,7 +163,7 @@ mod test {
}
#[test]
#[should_panic(expected = "Version numbers can not be all 0s")]
#[should_panic(expected = "Version numbers cannot be all 0s")]
fn invalid_version() {
let all_0 = "0.0.0";
all_0.parse::<Version>().unwrap();

src/config.rs

@@ -5,7 +5,7 @@ use std::fs::{write, File};
use std::io::Write;
use std::path::{Path, PathBuf};
use std::process::Command;
use std::{env, fs};
use std::{env, fmt, fs};
use clap::{Parser, ValueEnum};
use clap_complete::Shell;
@@ -15,16 +15,18 @@ use etcetera::base_strategy::BaseStrategy;
use merge::Merge;
use regex::Regex;
use regex_split::RegexSplit;
use rust_i18n::t;
use serde::Deserialize;
use strum::{EnumIter, EnumString, EnumVariantNames, IntoEnumIterator};
use strum::{EnumIter, EnumString, IntoEnumIterator, VariantNames};
use which_crate::which;
use super::utils::editor;
use crate::command::CommandExt;
use crate::sudo::SudoKind;
use crate::utils::{hostname, string_prepend_str};
use crate::utils::string_prepend_str;
use tracing::{debug, error};
// TODO: Add i18n to this. Tracking issue: https://github.com/topgrade-rs/topgrade/issues/859
pub static EXAMPLE_CONFIG: &str = include_str!("../config.example.toml");
/// Topgrade's default log level.
@@ -44,7 +46,7 @@ macro_rules! str_value {
pub type Commands = BTreeMap<String, String>;
#[derive(ValueEnum, EnumString, EnumVariantNames, Debug, Clone, PartialEq, Eq, Deserialize, EnumIter, Copy)]
#[derive(ValueEnum, EnumString, VariantNames, Debug, Clone, PartialEq, Eq, Deserialize, EnumIter, Copy)]
#[clap(rename_all = "snake_case")]
#[serde(rename_all = "snake_case")]
#[strum(serialize_all = "snake_case")]
@@ -53,7 +55,9 @@ pub enum Step {
AppMan,
Asdf,
Atom,
Aqua,
Audit,
AutoCpufreq,
Bin,
Bob,
BrewCask,
@@ -61,9 +65,11 @@ pub enum Step {
Bun,
BunPackages,
Cargo,
Certbot,
Chezmoi,
Chocolatey,
Choosenim,
ClamAvDb,
Composer,
Conda,
ConfigUpdate,
@@ -74,6 +80,7 @@ pub enum Step {
Distrobox,
DkpPacman,
Dotnet,
Elan,
Emacs,
Firmware,
Flatpak,
@@ -97,12 +104,15 @@ pub enum Step {
Helix,
Krew,
Lure,
Lensfun,
Macports,
Mamba,
Miktex,
Mas,
Maza,
Micro,
MicrosoftStore,
Mise,
Myrepos,
Nix,
Node,
@@ -115,11 +125,15 @@ pub enum Step {
PipReviewLocal,
Pipupgrade,
Pipx,
Pixi,
Pkg,
Pkgin,
PlatformioCore,
Pnpm,
Poetry,
Powershell,
Protonup,
Pyenv,
Raco,
Rcm,
Remotes,
@@ -127,6 +141,7 @@ pub enum Step {
Rtcl,
RubyGems,
Rustup,
Rye,
Scoop,
Sdkman,
SelfUpdate,
@@ -142,15 +157,20 @@ pub enum Step {
Tlmgr,
Tmux,
Toolbx,
Uv,
Vagrant,
Vcpkg,
Vim,
VoltaPackages,
Vscode,
Waydroid,
Winget,
Wsl,
WslUpdate,
Xcodes,
Yadm,
Yarn,
Zvm,
}
#[derive(Deserialize, Default, Debug, Merge)]
@@ -165,6 +185,7 @@ pub struct Include {
pub struct Containers {
#[merge(strategy = crate::utils::merge_strategies::vec_prepend_opt)]
ignored_containers: Option<Vec<String>>,
runtime: Option<ContainerRuntime>,
}
#[derive(Deserialize, Default, Debug, Merge)]
@@ -197,7 +218,6 @@ pub struct Windows {
accept_all_updates: Option<bool>,
self_rename: Option<bool>,
open_remotes_in_new_terminal: Option<bool>,
enable_winget: Option<bool>,
wsl_update_pre_release: Option<bool>,
wsl_update_use_web_download: Option<bool>,
}
@@ -209,6 +229,7 @@ pub struct Python {
enable_pip_review_local: Option<bool>,
enable_pipupgrade: Option<bool>,
pipupgrade_arguments: Option<String>,
poetry_force_self_update: Option<bool>,
}
#[derive(Deserialize, Default, Debug, Merge)]
@@ -235,6 +256,13 @@ pub struct NPM {
use_sudo: Option<bool>,
}
#[derive(Deserialize, Default, Debug, Merge)]
#[serde(deny_unknown_fields)]
#[allow(clippy::upper_case_acronyms)]
pub struct Deno {
version: Option<String>,
}
#[derive(Deserialize, Default, Debug, Merge)]
#[serde(deny_unknown_fields)]
#[allow(clippy::upper_case_acronyms)]
@@ -253,7 +281,10 @@ pub struct Flatpak {
#[serde(deny_unknown_fields)]
pub struct Brew {
greedy_cask: Option<bool>,
greedy_latest: Option<bool>,
greedy_auto_updates: Option<bool>,
autoremove: Option<bool>,
fetch_head: Option<bool>,
}
#[derive(Debug, Deserialize, Clone, Copy)]
@@ -270,6 +301,22 @@ pub enum ArchPackageManager {
Yay,
}
#[derive(Clone, Copy, Debug, Deserialize)]
#[serde(rename_all = "snake_case")]
pub enum ContainerRuntime {
Docker,
Podman,
}
impl fmt::Display for ContainerRuntime {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self {
ContainerRuntime::Docker => write!(f, "docker"),
ContainerRuntime::Podman => write!(f, "podman"),
}
}
}
#[derive(Deserialize, Default, Debug, Merge)]
#[serde(deny_unknown_fields)]
pub struct Linux {
@@ -312,6 +359,7 @@ pub struct Linux {
redhat_distro_sync: Option<bool>,
suse_dup: Option<bool>,
rpm_ostree: Option<bool>,
bootc: Option<bool>,
#[merge(strategy = crate::utils::merge_strategies::string_append_opt)]
emerge_sync_flags: Option<String>,
@@ -369,6 +417,8 @@ pub struct Misc {
run_in_tmux: Option<bool>,
tmux_session_mode: Option<TmuxSessionMode>,
cleanup: Option<bool>,
notify_each_step: Option<bool>,
@@ -385,6 +435,31 @@ pub struct Misc {
log_filters: Option<Vec<String>>,
}
#[derive(Clone, Copy, Debug, Deserialize, ValueEnum)]
#[clap(rename_all = "snake_case")]
#[serde(rename_all = "snake_case")]
pub enum TmuxSessionMode {
AttachIfNotInSession,
AttachAlways,
}
pub struct TmuxConfig {
pub args: Vec<String>,
pub session_mode: TmuxSessionMode,
}
#[derive(Deserialize, Default, Debug, Merge)]
#[serde(deny_unknown_fields)]
pub struct Lensfun {
use_sudo: Option<bool>,
}
#[derive(Deserialize, Default, Debug, Merge)]
#[serde(deny_unknown_fields)]
pub struct JuliaConfig {
startup_file: Option<bool>,
}
#[derive(Deserialize, Default, Debug, Merge)]
#[serde(deny_unknown_fields)]
/// Configuration file
@@ -431,6 +506,9 @@ pub struct ConfigFile {
#[merge(strategy = crate::utils::merge_strategies::inner_merge_opt)]
yarn: Option<Yarn>,
#[merge(strategy = crate::utils::merge_strategies::inner_merge_opt)]
deno: Option<Deno>,
#[merge(strategy = crate::utils::merge_strategies::inner_merge_opt)]
vim: Option<Vim>,
@@ -445,6 +523,12 @@ pub struct ConfigFile {
#[merge(strategy = crate::utils::merge_strategies::inner_merge_opt)]
distrobox: Option<Distrobox>,
#[merge(strategy = crate::utils::merge_strategies::inner_merge_opt)]
lensfun: Option<Lensfun>,
#[merge(strategy = crate::utils::merge_strategies::inner_merge_opt)]
julia: Option<JuliaConfig>,
}
fn config_directory() -> PathBuf {
@@ -470,7 +554,7 @@ impl ConfigFile {
let config_directory = config_directory();
let possible_config_paths = vec![
let possible_config_paths = [
config_directory.join("topgrade.toml"),
config_directory.join("topgrade/topgrade.toml"),
];
@@ -479,7 +563,7 @@ impl ConfigFile {
for path in possible_config_paths.iter() {
if path.exists() {
debug!("Configuration at {}", path.display());
res.0 = path.clone();
res.0.clone_from(path);
break;
}
}
@@ -488,7 +572,7 @@ impl ConfigFile {
// If no config file exists, create a default one in the config directory
if !res.0.exists() && res.1.is_empty() {
res.0 = possible_config_paths[0].clone();
res.0.clone_from(&possible_config_paths[0]);
debug!("No configuration exists");
write(&res.0, EXAMPLE_CONFIG).map_err(|e| {
debug!(
@@ -511,7 +595,9 @@ impl ConfigFile {
if dir_to_search.exists() {
for entry in fs::read_dir(dir_to_search)? {
let entry = entry?;
if entry.file_type()?.is_file() {
// Use `Path::is_file()` here to traverse symbolic links.
// `DirEntry::file_type()` and `FileType::is_file()` will not traverse symbolic links.
if entry.path().is_file() {
debug!(
"Found additional (directory) configuration file at {}",
entry.path().display()
@@ -544,13 +630,11 @@ impl ConfigFile {
to read the include directory before returning the main config path
*/
for include in dir_include {
let include_contents = fs::read_to_string(&include).map_err(|e| {
let include_contents = fs::read_to_string(&include).inspect_err(|_| {
error!("Unable to read {}", include.display());
e
})?;
let include_contents_parsed = toml::from_str(include_contents.as_str()).map_err(|e| {
let include_contents_parsed = toml::from_str(include_contents.as_str()).inspect_err(|_| {
error!("Failed to deserialize {}", include.display());
e
})?;
result.merge(include_contents_parsed);
@@ -565,9 +649,8 @@ impl ConfigFile {
return Ok(result);
}
let mut contents_non_split = fs::read_to_string(&config_path).map_err(|e| {
let mut contents_non_split = fs::read_to_string(&config_path).inspect_err(|_| {
error!("Unable to read {}", config_path.display());
e
})?;
Self::ensure_misc_is_present(&mut contents_non_split, &config_path);
@@ -578,9 +661,8 @@ impl ConfigFile {
let contents_split = regex_match_include.split_inclusive_left(contents_non_split.as_str());
for contents in contents_split {
let config_file_include_only: ConfigFileIncludeOnly = toml::from_str(contents).map_err(|e| {
let config_file_include_only: ConfigFileIncludeOnly = toml::from_str(contents).inspect_err(|_| {
error!("Failed to deserialize an include section of {}", config_path.display());
e
})?;
if let Some(includes) = &config_file_include_only.include {
@@ -592,14 +674,14 @@ impl ConfigFile {
let include_contents = match fs::read_to_string(&include_path) {
Ok(c) => c,
Err(e) => {
error!("Unable to read {}: {}", include_path.display(), e);
error!("Unable to read {}: {e}", include_path.display(),);
continue;
}
};
match toml::from_str::<Self>(&include_contents) {
Ok(include_parsed) => result.merge(include_parsed),
Err(e) => {
error!("Failed to deserialize {}: {}", include_path.display(), e);
error!("Failed to deserialize {}: {e}", include_path.display(),);
continue;
}
};
@@ -609,14 +691,17 @@ impl ConfigFile {
match toml::from_str::<Self>(contents) {
Ok(contents) => result.merge(contents),
Err(e) => error!("Failed to deserialize {}: {}", config_path.display(), e),
Err(e) => error!("Failed to deserialize {}: {e}", config_path.display(),),
}
}
if let Some(paths) = result.git.as_mut().and_then(|git| git.repos.as_mut()) {
for path in paths.iter_mut() {
let expanded = shellexpand::tilde::<&str>(&path.as_ref()).into_owned();
debug!("Path {} expanded to {}", path, expanded);
debug!(
"{}",
t!("Path {path} expanded to {expanded}", path = path, expanded = expanded)
);
*path = expanded;
}
}
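A self-contained sketch of the tilde expansion applied to each configured repo path above (the path literal is only an example; the `shellexpand` crate is the one already used here):

fn main() {
    let configured = "~/projects/topgrade";
    // Expand a leading `~` to the user's home directory.
    let expanded = shellexpand::tilde(configured).into_owned();
    println!("Path {configured} expanded to {expanded}");
}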
@@ -654,63 +739,65 @@ impl ConfigFile {
}
// Command line arguments
// TODO: i18n of clap currently not easily possible. Waiting for https://github.com/clap-rs/clap/issues/380
// Tracking issue for i18n: https://github.com/topgrade-rs/topgrade/issues/859
#[derive(Parser, Debug)]
#[clap(name = "Topgrade", version)]
#[command(name = "topgrade", version)]
pub struct CommandLineArgs {
/// Edit the configuration file
#[clap(long = "edit-config")]
#[arg(long = "edit-config")]
edit_config: bool,
/// Show config reference
#[clap(long = "config-reference")]
#[arg(long = "config-reference")]
show_config_reference: bool,
/// Run inside tmux
#[clap(short = 't', long = "tmux")]
#[arg(short = 't', long = "tmux")]
run_in_tmux: bool,
/// Cleanup temporary or old files
#[clap(short = 'c', long = "cleanup")]
#[arg(short = 'c', long = "cleanup")]
cleanup: bool,
/// Print what would be done
#[clap(short = 'n', long = "dry-run")]
#[arg(short = 'n', long = "dry-run")]
dry_run: bool,
/// Do not ask to retry failed steps
#[clap(long = "no-retry")]
#[arg(long = "no-retry")]
no_retry: bool,
/// Do not perform upgrades for the given steps
#[clap(long = "disable", value_name = "STEP", value_enum, num_args = 1..)]
#[arg(long = "disable", value_name = "STEP", value_enum, num_args = 1..)]
disable: Vec<Step>,
/// Perform only the specified steps (experimental)
#[clap(long = "only", value_name = "STEP", value_enum, num_args = 1..)]
/// Perform only the specified steps
#[arg(long = "only", value_name = "STEP", value_enum, num_args = 1..)]
only: Vec<Step>,
/// Run only specific custom commands
#[clap(long = "custom-commands", value_name = "NAME", num_args = 1..)]
#[arg(long = "custom-commands", value_name = "NAME", num_args = 1..)]
custom_commands: Vec<String>,
/// Set environment variables
#[clap(long = "env", value_name = "NAME=VALUE", num_args = 1..)]
#[arg(long = "env", value_name = "NAME=VALUE", num_args = 1..)]
env: Vec<String>,
/// Output debug logs. Alias for `--log-filter debug`.
#[clap(short = 'v', long = "verbose")]
#[arg(short = 'v', long = "verbose")]
pub verbose: bool,
/// Prompt for a key before exiting
#[clap(short = 'k', long = "keep")]
#[arg(short = 'k', long = "keep")]
keep_at_end: bool,
/// Skip sending a notification at the end of a run
#[clap(long = "skip-notify")]
#[arg(long = "skip-notify")]
skip_notify: bool,
/// Say yes to package manager's prompt
#[clap(
#[arg(
short = 'y',
long = "yes",
value_name = "STEP",
@@ -720,37 +807,37 @@ pub struct CommandLineArgs {
yes: Option<Vec<Step>>,
/// Don't pull the predefined git repos
#[clap(long = "disable-predefined-git-repos")]
#[arg(long = "disable-predefined-git-repos")]
disable_predefined_git_repos: bool,
/// Alternative configuration file
#[clap(long = "config", value_name = "PATH")]
#[arg(long = "config", value_name = "PATH")]
config: Option<PathBuf>,
/// A regular expression for restricting remote host execution
#[clap(long = "remote-host-limit", value_name = "REGEX")]
#[arg(long = "remote-host-limit", value_name = "REGEX")]
remote_host_limit: Option<Regex>,
/// Show the reason for skipped steps
#[clap(long = "show-skipped")]
#[arg(long = "show-skipped")]
show_skipped: bool,
/// Tracing filter directives.
///
/// See: https://docs.rs/tracing-subscriber/latest/tracing_subscriber/struct.EnvFilter.html
#[clap(long, default_value = DEFAULT_LOG_LEVEL)]
#[arg(long, default_value = DEFAULT_LOG_LEVEL)]
pub log_filter: String,
/// Print completion script for the given shell and exit
#[clap(long, value_enum, hide = true)]
#[arg(long, value_enum, hide = true)]
pub gen_completion: Option<Shell>,
/// Print roff manpage and exit
#[clap(long, hide = true)]
#[arg(long, hide = true)]
pub gen_manpage: bool,
/// Don't update Topgrade
#[clap(long = "no-self-update")]
#[arg(long = "no-self-update")]
pub no_self_update: bool,
}
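For readers unfamiliar with the attribute rename above, a minimal sketch of the clap 4 derive convention (the `demo` binary and its single flag are hypothetical): `#[command(...)]` annotates the struct, `#[arg(...)]` annotates each field.

use clap::Parser;

#[derive(Parser, Debug)]
#[command(name = "demo", version)]
struct Args {
    /// Print what would be done
    #[arg(short = 'n', long = "dry-run")]
    dry_run: bool,
}

fn main() {
    let args = Args::parse();
    println!("dry run: {}", args.dry_run);
}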
@@ -811,7 +898,7 @@ impl Config {
ConfigFile::read(opt.config.clone()).unwrap_or_else(|e| {
// Inform the user about errors when loading the configuration,
// but fallback to the default config to at least attempt to do something
error!("failed to load configuration: {}", e);
error!("failed to load configuration: {e}");
ConfigFile::default()
})
} else {
@@ -861,6 +948,15 @@ impl Config {
.and_then(|containers| containers.ignored_containers.as_ref())
}
/// The preferred runtime for container updates (podman / docker).
pub fn containers_runtime(&self) -> ContainerRuntime {
self.config_file
.containers
.as_ref()
.and_then(|containers| containers.runtime)
.unwrap_or(ContainerRuntime::Docker) // default to Docker when no runtime is configured
}
/// Tell whether the specified step should run.
///
/// If the step appears either in the `--disable` command line argument
@@ -917,6 +1013,15 @@ impl Config {
.unwrap_or(false)
}
/// The preferred way to run the new tmux session.
fn tmux_session_mode(&self) -> TmuxSessionMode {
self.config_file
.misc
.as_ref()
.and_then(|misc| misc.tmux_session_mode)
.unwrap_or(TmuxSessionMode::AttachIfNotInSession)
}
/// Tell whether we should perform cleanup steps.
pub fn cleanup(&self) -> bool {
self.opt.cleanup
@@ -974,8 +1079,16 @@ impl Config {
self.config_file.git.as_ref().and_then(|git| git.arguments.as_ref())
}
pub fn tmux_config(&self) -> Result<TmuxConfig> {
let args = self.tmux_arguments()?;
Ok(TmuxConfig {
args,
session_mode: self.tmux_session_mode(),
})
}
/// Extra Tmux arguments
pub fn tmux_arguments(&self) -> Result<Vec<String>> {
fn tmux_arguments(&self) -> Result<Vec<String>> {
let args = &self
.config_file
.misc
@@ -1087,6 +1200,24 @@ impl Config {
.unwrap_or(false)
}
/// Whether Brew cask should be greedy_latest
pub fn brew_greedy_latest(&self) -> bool {
self.config_file
.brew
.as_ref()
.and_then(|c| c.greedy_latest)
.unwrap_or(false)
}
/// Whether Brew cask should be auto_updates
pub fn brew_greedy_auto_updates(&self) -> bool {
self.config_file
.brew
.as_ref()
.and_then(|c| c.greedy_auto_updates)
.unwrap_or(false)
}
/// Whether Brew should autoremove
pub fn brew_autoremove(&self) -> bool {
self.config_file
@@ -1096,6 +1227,15 @@ impl Config {
.unwrap_or(false)
}
/// Whether Brew should upgrade formulae built from the HEAD branch
pub fn brew_fetch_head(&self) -> bool {
self.config_file
.brew
.as_ref()
.and_then(|c| c.fetch_head)
.unwrap_or(false)
}
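The getters added in these hunks all share one shape; a condensed, hypothetical sketch (structs trimmed down for illustration) of walking an optional config section, then an optional field, then falling back to a default:

#[derive(Default)]
struct Brew {
    fetch_head: Option<bool>,
}

#[derive(Default)]
struct ConfigFile {
    brew: Option<Brew>,
}

impl ConfigFile {
    /// `false` unless the section exists and the field is explicitly `true`.
    fn brew_fetch_head(&self) -> bool {
        self.brew.as_ref().and_then(|b| b.fetch_head).unwrap_or(false)
    }
}

fn main() {
    let cfg = ConfigFile {
        brew: Some(Brew { fetch_head: Some(true) }),
    };
    assert!(cfg.brew_fetch_head());
    assert!(!ConfigFile::default().brew_fetch_head());
}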
/// Whether Composer should update itself
pub fn composer_self_update(&self) -> bool {
self.config_file
@@ -1319,6 +1459,15 @@ impl Config {
.unwrap_or(false)
}
/// Use bootc *when bootc is detected* (default: false)
pub fn bootc(&self) -> bool {
self.config_file
.linux
.as_ref()
.and_then(|linux| linux.bootc)
.unwrap_or(false)
}
/// Determine if we should ignore failures for this step
pub fn ignore_failure(&self, step: Step) -> bool {
self.config_file
@@ -1407,6 +1556,10 @@ impl Config {
.unwrap_or(false)
}
pub fn deno_version(&self) -> Option<&str> {
self.config_file.deno.as_ref().and_then(|deno| deno.version.as_deref())
}
#[cfg(target_os = "linux")]
pub fn firmware_upgrade(&self) -> bool {
self.config_file
@@ -1431,37 +1584,28 @@ impl Config {
#[cfg(target_os = "linux")]
str_value!(linux, emerge_update_flags);
pub fn should_execute_remote(&self, remote: &str) -> bool {
if let Ok(hostname) = hostname() {
if remote == hostname {
pub fn should_execute_remote(&self, hostname: Result<String>, remote: &str) -> bool {
let remote_host = remote.split_once('@').map_or(remote, |(_, host)| host);
if let Ok(hostname) = hostname {
if remote_host == hostname {
return false;
}
}
if let Some(limit) = self.opt.remote_host_limit.as_ref() {
return limit.is_match(remote);
if let Some(limit) = &self.opt.remote_host_limit.as_ref() {
return limit.is_match(remote_host);
}
true
}
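A standalone sketch of the new `user@host` handling (behaviour inferred from the tests further down): strip an optional `user@` prefix before comparing against the local hostname or the `--remote-host-limit` regex.

fn remote_host(remote: &str) -> &str {
    remote.split_once('@').map_or(remote, |(_, host)| host)
}

fn main() {
    assert_eq!(remote_host("alice@server1"), "server1");
    assert_eq!(remote_host("server1"), "server1");
}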
#[cfg(windows)]
pub fn enable_winget(&self) -> bool {
return self
.config_file
.windows
.as_ref()
.and_then(|w| w.enable_winget)
.unwrap_or(false);
}
pub fn enable_pipupgrade(&self) -> bool {
return self
.config_file
self.config_file
.python
.as_ref()
.and_then(|python| python.enable_pipupgrade)
.unwrap_or(false);
.unwrap_or(false)
}
pub fn pipupgrade_arguments(&self) -> &str {
self.config_file
@@ -1471,20 +1615,25 @@ impl Config {
.unwrap_or("")
}
pub fn enable_pip_review(&self) -> bool {
return self
.config_file
self.config_file
.python
.as_ref()
.and_then(|python| python.enable_pip_review)
.unwrap_or(false);
.unwrap_or(false)
}
pub fn enable_pip_review_local(&self) -> bool {
return self
.config_file
self.config_file
.python
.as_ref()
.and_then(|python| python.enable_pip_review_local)
.unwrap_or(false);
.unwrap_or(false)
}
pub fn poetry_force_self_update(&self) -> bool {
self.config_file
.python
.as_ref()
.and_then(|python| python.poetry_force_self_update)
.unwrap_or(false)
}
pub fn display_time(&self) -> bool {
@@ -1502,11 +1651,29 @@ impl Config {
self.opt.custom_commands.iter().any(|s| s == name)
}
pub fn lensfun_use_sudo(&self) -> bool {
self.config_file
.lensfun
.as_ref()
.and_then(|lensfun| lensfun.use_sudo)
.unwrap_or(false)
}
pub fn julia_use_startup_file(&self) -> bool {
self.config_file
.julia
.as_ref()
.and_then(|julia| julia.startup_file)
.unwrap_or(true)
}
}
#[cfg(test)]
mod test {
use crate::config::ConfigFile;
use crate::config::*;
use color_eyre::eyre::eyre;
/// Test that the default configuration in `config.example.toml` is valid.
#[test]
@@ -1515,4 +1682,51 @@ mod test {
assert!(toml::from_str::<ConfigFile>(str).is_ok());
}
fn config() -> Config {
Config {
opt: CommandLineArgs::parse_from::<_, String>([]),
config_file: ConfigFile::default(),
allowed_steps: Vec::new(),
}
}
#[test]
fn test_should_execute_remote_different_hostname() {
assert!(config().should_execute_remote(Ok("hostname".to_string()), "remote_hostname"))
}
#[test]
fn test_should_execute_remote_different_hostname_with_user() {
assert!(config().should_execute_remote(Ok("hostname".to_string()), "user@remote_hostname"))
}
#[test]
fn test_should_execute_remote_unknown_hostname() {
assert!(config().should_execute_remote(Err(eyre!("failed to get hostname")), "remote_hostname"))
}
#[test]
fn test_should_not_execute_remote_same_hostname() {
assert!(!config().should_execute_remote(Ok("hostname".to_string()), "hostname"))
}
#[test]
fn test_should_not_execute_remote_same_hostname_with_user() {
assert!(!config().should_execute_remote(Ok("hostname".to_string()), "user@hostname"))
}
#[test]
fn test_should_execute_remote_matching_limit() {
let mut config = config();
config.opt = CommandLineArgs::parse_from(["topgrade", "--remote-host-limit", "remote_hostname"]);
assert!(config.should_execute_remote(Ok("hostname".to_string()), "user@remote_hostname"))
}
#[test]
fn test_should_not_execute_remote_not_matching_limit() {
let mut config = config();
config.opt = CommandLineArgs::parse_from(["topgrade", "--remote-host-limit", "other_hostname"]);
assert!(!config.should_execute_remote(Ok("hostname".to_string()), "user@remote_hostname"))
}
}

View File

@@ -1,41 +1,98 @@
use std::process::ExitStatus;
use std::{fmt::Display, process::ExitStatus};
use rust_i18n::t;
use thiserror::Error;
#[derive(Error, Debug, PartialEq, Eq)]
pub enum TopgradeError {
#[error("`{0}` failed: {1}")]
ProcessFailed(String, ExitStatus),
#[error("`{0}` failed: {1}")]
ProcessFailedWithOutput(String, ExitStatus, String),
#[error("Unknown Linux Distribution")]
#[cfg(target_os = "linux")]
UnknownLinuxDistribution,
#[error("File \"/etc/os-release\" does not exist or is empty")]
#[cfg(target_os = "linux")]
EmptyOSReleaseFile,
#[error("Failed getting the system package manager")]
#[cfg(target_os = "linux")]
FailedGettingPackageManager,
}
impl Display for TopgradeError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
match self {
TopgradeError::ProcessFailed(process, exit_status) => {
write!(
f,
"{}",
t!(
"`{process}` failed: {exit_status}",
process = process,
exit_status = exit_status
)
)
}
TopgradeError::ProcessFailedWithOutput(process, exit_status, output) => {
write!(
f,
"{}",
t!(
"`{process}` failed: {exit_status} with {output}",
process = process,
exit_status = exit_status,
output = output
)
)
}
#[cfg(target_os = "linux")]
TopgradeError::UnknownLinuxDistribution => write!(f, "{}", t!("Unknown Linux Distribution")),
#[cfg(target_os = "linux")]
TopgradeError::EmptyOSReleaseFile => {
write!(f, "{}", t!("File \"/etc/os-release\" does not exist or is empty"))
}
#[cfg(target_os = "linux")]
TopgradeError::FailedGettingPackageManager => {
write!(f, "{}", t!("Failed getting the system package manager"))
}
}
}
}
#[derive(Error, Debug)]
#[error("A step failed")]
pub struct StepFailed;
#[derive(Error, Debug)]
#[error("Dry running")]
pub struct DryRun();
impl Display for StepFailed {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", t!("A step failed"))
}
}
#[derive(Error, Debug)]
pub struct DryRun();
impl Display for DryRun {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", t!("Dry running"))
}
}
#[derive(Error, Debug)]
#[error("{0}")]
pub struct SkipStep(pub String);
impl Display for SkipStep {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.0)
}
}
#[cfg(all(windows, feature = "self-update"))]
#[derive(Error, Debug)]
#[error("Topgrade Upgraded")]
pub struct Upgraded(pub ExitStatus);
#[cfg(all(windows, feature = "self-update"))]
impl Display for Upgraded {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", t!("Topgrade Upgraded"))
}
}

View File

@@ -1,8 +1,7 @@
#![allow(dead_code)]
use crate::executor::RunType;
use crate::git::Git;
use crate::sudo::Sudo;
use crate::utils::{require_option, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require_option};
use crate::{config::Config, executor::Executor};
use color_eyre::eyre::Result;
use std::env::var;
@@ -12,7 +11,6 @@ use std::sync::Mutex;
pub struct ExecutionContext<'a> {
run_type: RunType,
sudo: Option<Sudo>,
git: &'a Git,
config: &'a Config,
/// Name of a tmux session to execute commands in, if any.
/// This is used in `./steps/remote/ssh.rs`, where we want to run `topgrade` in a new
@@ -23,12 +21,11 @@ pub struct ExecutionContext<'a> {
}
impl<'a> ExecutionContext<'a> {
pub fn new(run_type: RunType, sudo: Option<Sudo>, git: &'a Git, config: &'a Config) -> Self {
pub fn new(run_type: RunType, sudo: Option<Sudo>, config: &'a Config) -> Self {
let under_ssh = var("SSH_CLIENT").is_ok() || var("SSH_TTY").is_ok();
Self {
run_type,
sudo,
git,
config,
tmux_session: Mutex::new(None),
under_ssh,
@@ -36,7 +33,7 @@ impl<'a> ExecutionContext<'a> {
}
pub fn execute_elevated(&self, command: &Path, interactive: bool) -> Result<Executor> {
let sudo = require_option(self.sudo.as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(self.sudo.as_ref(), get_require_sudo_string())?;
Ok(sudo.execute_elevated(self, command, interactive))
}
@@ -44,10 +41,6 @@ impl<'a> ExecutionContext<'a> {
self.run_type
}
pub fn git(&self) -> &Git {
self.git
}
pub fn sudo(&self) -> &Option<Sudo> {
&self.sudo
}

View File

@@ -4,6 +4,7 @@ use std::path::Path;
use std::process::{Child, Command, ExitStatus, Output};
use color_eyre::eyre::Result;
use rust_i18n::t;
use tracing::debug;
use crate::command::CommandExt;
@@ -209,17 +210,20 @@ pub struct DryCommand {
impl DryCommand {
fn dry_run(&self) {
print!(
"Dry running: {} {}",
self.program.to_string_lossy(),
shell_words::join(
self.args
.iter()
.map(|a| String::from(a.to_string_lossy()))
.collect::<Vec<String>>()
"{}",
t!(
"Dry running: {program_name} {arguments}",
program_name = self.program.to_string_lossy(),
arguments = shell_words::join(
self.args
.iter()
.map(|a| String::from(a.to_string_lossy()))
.collect::<Vec<String>>()
)
)
);
match &self.directory {
Some(dir) => println!(" in {}", dir.to_string_lossy()),
Some(dir) => println!(" {}", t!("in {directory}", directory = dir.to_string_lossy())),
None => println!(),
};
}
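A hedged sketch of why `shell_words::join` is used in the dry-run output above (program and argument strings invented): it quotes arguments so the printed command can be pasted back into a shell.

fn main() {
    let args = vec!["install".to_string(), "my package".to_string()];
    println!("Dry running: pkg {}", shell_words::join(&args));
    // Prints: Dry running: pkg install 'my package'
}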
@@ -227,6 +231,7 @@ impl DryCommand {
/// The Result of spawn. Contains an actual `std::process::Child` if executed by a wet command.
pub enum ExecutorChild {
#[allow(unused)] // this variant is currently unused
Wet(Child),
Dry,
}

View File

@@ -6,7 +6,7 @@ use std::path::PathBuf;
use std::process::exit;
use std::time::Duration;
use crate::breaking_changes::{first_run_of_major_release, print_breaking_changes, write_keep_file};
use crate::breaking_changes::{first_run_of_major_release, print_breaking_changes, should_skip, write_keep_file};
use clap::CommandFactory;
use clap::{crate_version, Parser};
use color_eyre::eyre::Context;
@@ -18,6 +18,7 @@ use etcetera::base_strategy::Windows;
#[cfg(unix)]
use etcetera::base_strategy::Xdg;
use once_cell::sync::Lazy;
use rust_i18n::{i18n, t};
use tracing::debug;
use self::config::{CommandLineArgs, Config, Step};
@@ -27,7 +28,7 @@ use self::error::Upgraded;
use self::steps::{remote::*, *};
use self::terminal::*;
use self::utils::{install_color_eyre, install_tracing, update_tracing};
use self::utils::{hostname, install_color_eyre, install_tracing, update_tracing};
mod breaking_changes;
mod command;
@@ -50,9 +51,13 @@ mod utils;
pub(crate) static HOME_DIR: Lazy<PathBuf> = Lazy::new(|| home::home_dir().expect("No home directory"));
#[cfg(unix)]
pub(crate) static XDG_DIRS: Lazy<Xdg> = Lazy::new(|| Xdg::new().expect("No home directory"));
#[cfg(windows)]
pub(crate) static WINDOWS_DIRS: Lazy<Windows> = Lazy::new(|| Windows::new().expect("No home directory"));
// Init and load the i18n files
i18n!("locales", fallback = "en");
fn run() -> Result<()> {
install_color_eyre()?;
ctrlc::set_handler();
@@ -71,6 +76,11 @@ fn run() -> Result<()> {
// and `Config::tracing_filter_directives()`.
let reload_handle = install_tracing(&opt.tracing_filter_directives())?;
// Get current system locale and set it as the default locale
let system_locale = sys_locale::get_locale().unwrap_or("en".to_string());
rust_i18n::set_locale(&system_locale);
debug!("Current system locale is {system_locale}");
if let Some(shell) = opt.gen_completion {
let cmd = &mut CommandLineArgs::command();
clap_complete::generate(shell, cmd, clap::crate_name!(), &mut io::stdout());
@@ -117,13 +127,11 @@ fn run() -> Result<()> {
if config.run_in_tmux() && env::var("TOPGRADE_INSIDE_TMUX").is_err() {
#[cfg(unix)]
{
tmux::run_in_tmux(config.tmux_arguments()?)?;
tmux::run_in_tmux(config.tmux_config()?)?;
return Ok(());
}
}
let git = git::Git::new();
let mut git_repos = git::Repositories::new(&git);
let powershell = powershell::Powershell::new();
let should_run_powershell = powershell.profile().is_some() && config.should_run(Step::Powershell);
let emacs = emacs::Emacs::new();
@@ -132,12 +140,16 @@ fn run() -> Result<()> {
let sudo = config.sudo_command().map_or_else(sudo::Sudo::detect, sudo::Sudo::new);
let run_type = executor::RunType::new(config.dry_run());
let ctx = execution_context::ExecutionContext::new(run_type, sudo, &git, &config);
let ctx = execution_context::ExecutionContext::new(run_type, sudo, &config);
let mut runner = runner::Runner::new(&ctx);
// If this is the first execution of a major release, inform user of breaking
// changes
if first_run_of_major_release()? {
// If
//
// 1. the breaking changes notification should not be skipped
// 2. this is the first execution of a major release
//
// inform the user of breaking changes
if !should_skip() && first_run_of_major_release()? {
print_breaking_changes();
if prompt_yesno("Confirmed?")? {
@@ -179,7 +191,7 @@ fn run() -> Result<()> {
}
if let Some(topgrades) = config.remote_topgrades() {
for remote_topgrade in topgrades.iter().filter(|t| config.should_execute_remote(t)) {
for remote_topgrade in topgrades.iter().filter(|t| config.should_execute_remote(hostname(), t)) {
runner.execute(Step::Remotes, format!("Remote ({remote_topgrade})"), || {
ssh::ssh_step(&ctx, remote_topgrade)
})?;
@@ -194,6 +206,9 @@ fn run() -> Result<()> {
runner.execute(Step::Scoop, "Scoop", || windows::run_scoop(&ctx))?;
runner.execute(Step::Winget, "Winget", || windows::run_winget(&ctx))?;
runner.execute(Step::System, "Windows update", || windows::windows_update(&ctx))?;
runner.execute(Step::MicrosoftStore, "Microsoft Store", || {
windows::microsoft_store(&ctx)
})?;
}
#[cfg(target_os = "linux")]
@@ -207,7 +222,7 @@ fn run() -> Result<()> {
runner.execute(Step::System, "System update", || distribution.upgrade(&ctx))?;
}
Err(e) => {
println!("Error detecting current distribution: {e}");
println!("{}", t!("Error detecting current distribution: {error}", error = e));
}
}
runner.execute(Step::ConfigUpdate, "config-update", || linux::run_config_update(&ctx))?;
@@ -231,6 +246,8 @@ fn run() -> Result<()> {
unix::run_brew_formula(&ctx, unix::BrewVariant::Path)
})?;
runner.execute(Step::Lure, "LURE", || linux::run_lure_update(&ctx))?;
runner.execute(Step::Waydroid, "Waydroid", || linux::run_waydroid(&ctx))?;
runner.execute(Step::AutoCpufreq, "auto-cpufreq", || linux::run_auto_cpufreq(&ctx))?;
}
#[cfg(target_os = "macos")]
@@ -254,6 +271,7 @@ fn run() -> Result<()> {
unix::run_brew_cask(&ctx, unix::BrewVariant::Path)
})?;
runner.execute(Step::Macports, "MacPorts", || macos::run_macports(&ctx))?;
runner.execute(Step::Xcodes, "Xcodes", || macos::update_xcodes(&ctx))?;
runner.execute(Step::Sparkle, "Sparkle", || macos::run_sparkle(&ctx))?;
runner.execute(Step::Mas, "App Store", || macos::run_mas(&ctx))?;
runner.execute(Step::System, "System upgrade", || macos::upgrade_macos(&ctx))?;
@@ -293,8 +311,8 @@ fn run() -> Result<()> {
runner.execute(Step::Guix, "guix", || unix::run_guix(&ctx))?;
runner.execute(Step::HomeManager, "home-manager", || unix::run_home_manager(&ctx))?;
runner.execute(Step::Asdf, "asdf", || unix::run_asdf(&ctx))?;
runner.execute(Step::Mise, "mise", || unix::run_mise(&ctx))?;
runner.execute(Step::Pkgin, "pkgin", || unix::run_pkgin(&ctx))?;
runner.execute(Step::Bun, "bun", || unix::run_bun(&ctx))?;
runner.execute(Step::BunPackages, "bun-packages", || unix::run_bun_packages(&ctx))?;
runner.execute(Step::Shell, "zr", || zsh::run_zr(&ctx))?;
runner.execute(Step::Shell, "antibody", || zsh::run_antibody(&ctx))?;
@@ -319,6 +337,7 @@ fn run() -> Result<()> {
runner.execute(Step::GnomeShellExtensions, "Gnome Shell Extensions", || {
unix::upgrade_gnome_extensions(&ctx)
})?;
runner.execute(Step::Pyenv, "pyenv", || unix::run_pyenv(&ctx))?;
runner.execute(Step::Sdkman, "SDKMAN!", || unix::run_sdkman(&ctx))?;
runner.execute(Step::Rcm, "rcm", || unix::run_rcm(&ctx))?;
runner.execute(Step::Maza, "maza", || unix::run_maza(&ctx))?;
@@ -336,6 +355,8 @@ fn run() -> Result<()> {
// The following update function should be executed on all OSes.
runner.execute(Step::Fossil, "fossil", || generic::run_fossil(&ctx))?;
runner.execute(Step::Elan, "elan", || generic::run_elan(&ctx))?;
runner.execute(Step::Rye, "rye", || generic::run_rye(&ctx))?;
runner.execute(Step::Rustup, "rustup", || generic::run_rustup(&ctx))?;
runner.execute(Step::Juliaup, "juliaup", || generic::run_juliaup(&ctx))?;
runner.execute(Step::Dotnet, ".NET", || generic::run_dotnet_upgrade(&ctx))?;
@@ -349,10 +370,11 @@ fn run() -> Result<()> {
runner.execute(Step::Vcpkg, "vcpkg", || generic::run_vcpkg_update(&ctx))?;
runner.execute(Step::Pipx, "pipx", || generic::run_pipx_update(&ctx))?;
runner.execute(Step::Vscode, "Visual Studio Code extensions", || {
generic::run_vscode_extensions_upgrade(&ctx)
generic::run_vscode_extensions_update(&ctx)
})?;
runner.execute(Step::Conda, "conda", || generic::run_conda_update(&ctx))?;
runner.execute(Step::Mamba, "mamba", || generic::run_mamba_update(&ctx))?;
runner.execute(Step::Pixi, "pixi", || generic::run_pixi_update(&ctx))?;
runner.execute(Step::Miktex, "miktex", || generic::run_miktex_packages_update(&ctx))?;
runner.execute(Step::Pip3, "pip3", || generic::run_pip3_update(&ctx))?;
runner.execute(Step::PipReview, "pip-review", || generic::run_pip_review_update(&ctx))?;
@@ -375,6 +397,9 @@ fn run() -> Result<()> {
runner.execute(Step::Node, "npm", || node::run_npm_upgrade(&ctx))?;
runner.execute(Step::Yarn, "yarn", || node::run_yarn_upgrade(&ctx))?;
runner.execute(Step::Pnpm, "pnpm", || node::run_pnpm_upgrade(&ctx))?;
runner.execute(Step::VoltaPackages, "volta packages", || {
node::run_volta_packages_upgrade(&ctx)
})?;
runner.execute(Step::Containers, "Containers", || containers::run_containers(&ctx))?;
runner.execute(Step::Deno, "deno", || node::deno_upgrade(&ctx))?;
runner.execute(Step::Composer, "composer", || generic::run_composer_update(&ctx))?;
@@ -396,67 +421,20 @@ fn run() -> Result<()> {
generic::run_ghcli_extensions_upgrade(&ctx)
})?;
runner.execute(Step::Bob, "Bob", || generic::run_bob(&ctx))?;
if config.use_predefined_git_repos() {
if config.should_run(Step::Emacs) {
if !emacs.is_doom() {
if let Some(directory) = emacs.directory() {
git_repos.insert_if_repo(directory);
}
}
git_repos.insert_if_repo(HOME_DIR.join(".doom.d"));
}
if config.should_run(Step::Vim) {
git_repos.insert_if_repo(HOME_DIR.join(".vim"));
git_repos.insert_if_repo(HOME_DIR.join(".config/nvim"));
}
git_repos.insert_if_repo(HOME_DIR.join(".ideavimrc"));
git_repos.insert_if_repo(HOME_DIR.join(".intellimacs"));
if config.should_run(Step::Rcm) {
git_repos.insert_if_repo(HOME_DIR.join(".dotfiles"));
}
#[cfg(unix)]
{
git_repos.insert_if_repo(zsh::zshrc());
if config.should_run(Step::Tmux) {
git_repos.insert_if_repo(HOME_DIR.join(".tmux"));
}
git_repos.insert_if_repo(HOME_DIR.join(".config/fish"));
git_repos.insert_if_repo(XDG_DIRS.config_dir().join("openbox"));
git_repos.insert_if_repo(XDG_DIRS.config_dir().join("bspwm"));
git_repos.insert_if_repo(XDG_DIRS.config_dir().join("i3"));
git_repos.insert_if_repo(XDG_DIRS.config_dir().join("sway"));
}
#[cfg(windows)]
git_repos.insert_if_repo(
WINDOWS_DIRS
.cache_dir()
.join("Packages/Microsoft.WindowsTerminal_8wekyb3d8bbwe/LocalState"),
);
#[cfg(windows)]
windows::insert_startup_scripts(&mut git_repos).ok();
if let Some(profile) = powershell.profile() {
git_repos.insert_if_repo(profile);
}
}
if config.should_run(Step::GitRepos) {
if let Some(custom_git_repos) = config.git_repos() {
for git_repo in custom_git_repos {
git_repos.glob_insert(git_repo);
}
}
runner.execute(Step::GitRepos, "Git repositories", || {
git.multi_pull_step(&git_repos, &ctx)
})?;
}
runner.execute(Step::Certbot, "Certbot", || generic::run_certbot(&ctx))?;
runner.execute(Step::GitRepos, "Git Repositories", || git::run_git_pull(&ctx))?;
runner.execute(Step::ClamAvDb, "ClamAV Databases", || generic::run_freshclam(&ctx))?;
runner.execute(Step::PlatformioCore, "PlatformIO Core", || {
generic::run_platform_io(&ctx)
})?;
runner.execute(Step::Lensfun, "Lensfun's database update", || {
generic::run_lensfun_update_data(&ctx)
})?;
runner.execute(Step::Poetry, "Poetry", || generic::run_poetry(&ctx))?;
runner.execute(Step::Uv, "uv", || generic::run_uv(&ctx))?;
runner.execute(Step::Zvm, "ZVM", || generic::run_zvm(&ctx))?;
runner.execute(Step::Aqua, "aqua", || generic::run_aqua(&ctx))?;
runner.execute(Step::Bun, "bun", || generic::run_bun(&ctx))?;
if should_run_powershell {
runner.execute(Step::Powershell, "Powershell Modules Update", || {
@@ -486,7 +464,7 @@ fn run() -> Result<()> {
runner.execute(Step::Vagrant, "Vagrant boxes", || vagrant::upgrade_vagrant_boxes(&ctx))?;
if !runner.report().data().is_empty() {
print_separator("Summary");
print_separator(t!("Summary"));
for (key, result) in runner.report().data() {
print_result(key, result);
@@ -510,7 +488,7 @@ fn run() -> Result<()> {
}
if config.keep_at_end() {
print_info("\n(R)eboot\n(S)hell\n(Q)uit");
print_info(t!("\n(R)eboot\n(S)hell\n(Q)uit"));
loop {
match get_key() {
Ok(Key::Char('s')) | Ok(Key::Char('S')) => {
@@ -532,10 +510,11 @@ fn run() -> Result<()> {
if !config.skip_notify() {
notify_desktop(
format!(
"Topgrade finished {}",
if failed { "with errors" } else { "successfully" }
),
if failed {
t!("Topgrade finished with errors")
} else {
t!("Topgrade finished successfully")
},
Some(Duration::from_secs(10)),
)
}
@@ -570,7 +549,7 @@ fn main() {
// The `Debug` implementation of `eyre::Result` prints a multi-line
// error message that includes all the 'causes' added with
// `.with_context(...)` calls.
println!("Error: {error:?}");
println!("{}", t!("Error: {error}", error = format!("{:?}", error)));
}
exit(1);
}

View File

@@ -34,6 +34,14 @@ impl<'a> Runner<'a> {
let key = key.into();
debug!("Step {:?}", key);
// alter the `func` to put it in a span
let func = || {
let span =
tracing::span!(parent: tracing::Span::none(), tracing::Level::TRACE, "step", step = ?step, key = %key);
let _guard = span.enter();
func()
};
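A self-contained sketch of the wrapper introduced above (simplified; the `tracing_subscriber` setup exists only for the example): running the closure inside a detached span tags every log line it emits with the step's name.

use tracing::{info, span, Level};

fn run_step<F: FnOnce()>(name: &str, func: F) {
    // Detach from any ambient span, then enter a span carrying the step name.
    let span = span!(parent: tracing::Span::none(), Level::INFO, "step", key = %name);
    let _guard = span.enter();
    func();
}

fn main() {
    tracing_subscriber::fmt().with_max_level(Level::INFO).init();
    run_step("demo", || info!("inside the step"));
}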
loop {
match func() {
Ok(()) => {

View File

@@ -1,5 +1,3 @@
#![cfg(windows)]
use color_eyre::eyre::Result;
use std::{env::current_exe, fs, path::PathBuf};
use tracing::{debug, error};

View File

@@ -5,6 +5,7 @@ use std::process::Command;
use crate::config::Step;
use color_eyre::eyre::{bail, Result};
use rust_i18n::t;
use self_update_crate::backends::github::Update;
use self_update_crate::update::UpdateStatus;
@@ -15,10 +16,10 @@ use crate::error::Upgraded;
use crate::execution_context::ExecutionContext;
pub fn self_update(ctx: &ExecutionContext) -> Result<()> {
print_separator("Self update");
print_separator(t!("Self update"));
if ctx.run_type().dry() {
println!("Would self-update");
println!("{}", t!("Would self-update"));
Ok(())
} else {
let assume_yes = ctx.config().yes(Step::SelfUpdate);
@@ -38,17 +39,17 @@ pub fn self_update(ctx: &ExecutionContext) -> Result<()> {
.update_extended()?;
if let UpdateStatus::Updated(release) = &result {
println!("\nTopgrade upgraded to {}:\n", release.version);
println!("{}", t!("Topgrade upgraded to {version}:\n", version = release.version));
if let Some(body) = &release.body {
println!("{body}");
}
} else {
println!("Topgrade is up-to-date");
println!("{}", t!("Topgrade is up-to-date"));
}
{
if result.updated() {
print_info("Respawning...");
print_info(t!("Respawning..."));
let mut command = Command::new(current_exe?);
command.args(env::args().skip(1)).env("TOPGRADE_NO_SELF_UPGRADE", "");

View File

@@ -6,11 +6,13 @@ use color_eyre::eyre::eyre;
use color_eyre::eyre::Context;
use color_eyre::eyre::Result;
use tracing::{debug, error, warn};
use wildmatch::WildMatch;
use crate::command::CommandExt;
use crate::error::{self, TopgradeError};
use crate::terminal::print_separator;
use crate::{execution_context::ExecutionContext, utils::require};
use rust_i18n::t;
// A string found in the output of docker for containers that weren't found in
// the docker registry. We use this to gracefully handle and skip containers
@@ -42,7 +44,15 @@ impl Container {
impl Display for Container {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
// e.g., "`fedora:latest` for `linux/amd64`"
write!(f, "`{}` for `{}`", self.repo_tag, self.platform)
write!(
f,
"{}",
t!(
"`{repo_tag}` for `{platform}`",
repo_tag = self.repo_tag,
platform = self.platform
)
)
}
}
@@ -51,6 +61,13 @@ impl Display for Container {
///
/// Containers specified in `ignored_containers` will be filtered out.
fn list_containers(crt: &Path, ignored_containers: Option<&Vec<String>>) -> Result<Vec<Container>> {
let ignored_containers = ignored_containers.map(|patterns| {
patterns
.iter()
.map(|pattern| WildMatch::new(pattern))
.collect::<Vec<WildMatch>>()
});
debug!(
"Querying '{} image ls --format \"{{{{.Repository}}}}:{{{{.Tag}}}}/{{{{.ID}}}}\"' for containers",
crt.display()
@@ -85,11 +102,8 @@ fn list_containers(crt: &Path, ignored_containers: Option<&Vec<String>>) -> Resu
assert_eq!(split_res.len(), 2);
let (repo_tag, image_id) = (split_res[0], split_res[1]);
if let Some(ignored_containers) = ignored_containers {
if ignored_containers
.iter()
.any(|ignored_container| repo_tag.eq(ignored_container))
{
if let Some(ref ignored_containers) = ignored_containers {
if ignored_containers.iter().any(|pattern| pattern.matches(repo_tag)) {
debug!("Skipping ignored container '{}'", line);
continue;
}
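A hedged, standalone sketch of the new wildcard filtering (the pattern strings are invented examples): entries from `ignored_containers` are compiled into `WildMatch` patterns once, then matched against each repo tag.

use wildmatch::WildMatch;

fn main() {
    let ignored = ["ghcr.io/*", "*:dev"].map(WildMatch::new);
    let tag = "ghcr.io/topgrade-rs/topgrade:latest";
    // Skip the tag if any ignore pattern matches it.
    assert!(ignored.iter().any(|pattern| pattern.matches(tag)));
}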
@@ -115,11 +129,12 @@ fn list_containers(crt: &Path, ignored_containers: Option<&Vec<String>>) -> Resu
}
pub fn run_containers(ctx: &ExecutionContext) -> Result<()> {
// Prefer podman, fall back to docker if not present
let crt = require("podman").or_else(|_| require("docker"))?;
// Check what runtime is specified in the config
let container_runtime = ctx.config().containers_runtime().to_string();
let crt = require(container_runtime)?;
debug!("Using container runtime '{}'", crt.display());
print_separator("Containers");
print_separator(t!("Containers"));
let mut success = true;
let containers =
list_containers(&crt, ctx.config().containers_ignored_tags()).context("Failed to list Docker containers")?;

View File

@@ -4,6 +4,7 @@ use std::path::{Path, PathBuf};
use color_eyre::eyre::Result;
use etcetera::base_strategy::BaseStrategy;
use rust_i18n::t;
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
@@ -74,9 +75,12 @@ impl Emacs {
if let Some(doom) = &self.doom {
Emacs::update_doom(doom, ctx)?;
}
let init_file = require_option(self.directory.as_ref(), String::from("Emacs directory does not exist"))?
.join("init.el")
.require()?;
let init_file = require_option(
self.directory.as_ref(),
t!("Emacs directory does not exist").to_string(),
)?
.join("init.el")
.require()?;
print_separator("Emacs");

View File

@@ -1,5 +1,6 @@
#![allow(unused_imports)]
use std::ffi::OsStr;
use std::path::PathBuf;
use std::process::Command;
use std::{env, path::Path};
@@ -8,6 +9,7 @@ use std::{fs, io::Write};
use color_eyre::eyre::eyre;
use color_eyre::eyre::Context;
use color_eyre::eyre::Result;
use rust_i18n::t;
use semver::Version;
use tempfile::tempfile_in;
use tracing::{debug, error};
@@ -16,7 +18,7 @@ use crate::command::{CommandExt, Utf8Output};
use crate::execution_context::ExecutionContext;
use crate::executor::ExecutorOutput;
use crate::terminal::{print_separator, shell};
use crate::utils::{self, check_is_python_2_or_shim, require, require_option, which, PathExt, REQUIRE_SUDO};
use crate::utils::{self, check_is_python_2_or_shim, get_require_sudo_string, require, require_option, which, PathExt};
use crate::Step;
use crate::HOME_DIR;
use crate::{
@@ -24,6 +26,18 @@ use crate::{
terminal::print_warning,
};
#[cfg(target_os = "linux")]
pub fn is_wsl() -> Result<bool> {
let output = Command::new("uname").arg("-r").output_checked_utf8()?.stdout;
debug!("Uname output: {}", output);
Ok(output.contains("microsoft"))
}
#[cfg(not(target_os = "linux"))]
pub fn is_wsl() -> Result<bool> {
Ok(false)
}
pub fn run_cargo_update(ctx: &ExecutionContext) -> Result<()> {
let cargo_dir = env::var_os("CARGO_HOME")
.map(PathBuf::from)
@@ -108,13 +122,17 @@ pub fn run_rubygems(ctx: &ExecutionContext) -> Result<()> {
print_separator("RubyGems");
let gem_path_str = gem.as_os_str();
if gem_path_str.to_str().unwrap().contains("asdf") {
if gem_path_str.to_str().unwrap().contains("asdf")
|| gem_path_str.to_str().unwrap().contains("mise")
|| gem_path_str.to_str().unwrap().contains(".rbenv")
|| gem_path_str.to_str().unwrap().contains(".rvm")
{
ctx.run_type()
.execute(gem)
.args(["update", "--system"])
.status_checked()?;
} else {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
if !Path::new("/usr/lib/ruby/vendor_ruby/rubygems/defaults/operating_system.rb").exists() {
ctx.run_type()
.execute(sudo)
@@ -143,7 +161,7 @@ pub fn run_haxelib_update(ctx: &ExecutionContext) -> Result<()> {
let mut command = if directory_writable {
ctx.run_type().execute(&haxelib)
} else {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut c = ctx.run_type().execute(sudo);
c.arg(&haxelib);
c
@@ -208,6 +226,20 @@ pub fn run_apm(ctx: &ExecutionContext) -> Result<()> {
.status_checked()
}
pub fn run_aqua(ctx: &ExecutionContext) -> Result<()> {
let aqua = require("aqua")?;
print_separator("Aqua");
if ctx.run_type().dry() {
println!("{}", t!("Updating aqua ..."));
println!("{}", t!("Updating aqua installed cli tools ..."));
Ok(())
} else {
ctx.run_type().execute(&aqua).arg("update-aqua").status_checked()?;
ctx.run_type().execute(&aqua).arg("update").status_checked()
}
}
pub fn run_rustup(ctx: &ExecutionContext) -> Result<()> {
let rustup = require("rustup")?;
@@ -215,6 +247,24 @@ pub fn run_rustup(ctx: &ExecutionContext) -> Result<()> {
ctx.run_type().execute(rustup).arg("update").status_checked()
}
pub fn run_rye(ctx: &ExecutionContext) -> Result<()> {
let rye = require("rye")?;
print_separator("Rye");
ctx.run_type().execute(rye).args(["self", "update"]).status_checked()
}
pub fn run_elan(ctx: &ExecutionContext) -> Result<()> {
let elan = require("elan")?;
print_separator("elan");
ctx.run_type()
.execute(&elan)
.args(["self", "update"])
.status_checked()?;
ctx.run_type().execute(&elan).arg("update").status_checked()
}
pub fn run_juliaup(ctx: &ExecutionContext) -> Result<()> {
let juliaup = require("juliaup")?;
@@ -316,7 +366,7 @@ pub fn run_vcpkg_update(ctx: &ExecutionContext) -> Result<()> {
let mut command = if is_root_install {
ctx.run_type().execute(&vcpkg)
} else {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut c = ctx.run_type().execute(sudo);
c.arg(&vcpkg);
c
@@ -325,39 +375,48 @@ pub fn run_vcpkg_update(ctx: &ExecutionContext) -> Result<()> {
command.args(["upgrade", "--no-dry-run"]).status_checked()
}
pub fn run_vscode_extensions_upgrade(ctx: &ExecutionContext) -> Result<()> {
let vscode = require("code")?;
print_separator("Visual Studio Code extensions");
// Vscode does not have CLI command to upgrade all extensions (see https://github.com/microsoft/vscode/issues/56578)
// Instead we get the list of installed extensions with `code --list-extensions` command (obtain a line-return separated list of installed extensions)
let extensions = Command::new(&vscode)
.arg("--list-extensions")
.output_checked_utf8()?
.stdout;
// Then we construct the upgrade command: `code --force --install-extension [ext0] --install-extension [ext1] ... --install-extension [extN]`
if !extensions.is_empty() {
let mut command_args = vec!["--force"];
for extension in extensions.split_whitespace() {
command_args.extend(["--install-extension", extension]);
}
ctx.run_type().execute(&vscode).args(command_args).status_checked()?;
pub fn run_vscode_extensions_update(ctx: &ExecutionContext) -> Result<()> {
// Calling vscode in WSL may install a server instead of updating extensions (https://github.com/topgrade-rs/topgrade/issues/594#issuecomment-1782157367)
if is_wsl()? {
return Err(SkipStep(String::from("Should not run in WSL")).into());
}
Ok(())
let vscode = require("code")?;
// VS Code has had an update command only since version 1.86 (the "January 2024" release), so disable the update for prior versions
// Use `code --version`, which returns 3 lines: version, git commit, and instruction set; we parse only the first one
let version: Result<Version> = match Command::new(&vscode)
.arg("--version")
.output_checked_utf8()?
.stdout
.lines()
.next()
{
Some(item) => Version::parse(item).map_err(|err| err.into()),
_ => return Err(SkipStep(String::from("Cannot find vscode version")).into()),
};
if !matches!(version, Ok(version) if version >= Version::new(1, 86, 0)) {
return Err(SkipStep(String::from("Too old vscode version to have update extensions command")).into());
}
print_separator("Visual Studio Code extensions");
ctx.run_type()
.execute(vscode)
.arg("--update-extensions")
.status_checked()
}
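A standalone sketch of the version gate above (hedged; the sample `code --version` outputs are fabricated): parse the first line with the `semver` crate and require at least 1.86.0.

use semver::Version;

fn supports_update_extensions(version_output: &str) -> bool {
    version_output
        .lines()
        .next()
        .and_then(|line| Version::parse(line.trim()).ok())
        .is_some_and(|v| v >= Version::new(1, 86, 0))
}

fn main() {
    assert!(supports_update_extensions("1.95.3\n0123abc\nx64"));
    assert!(!supports_update_extensions("1.85.2\n0123abc\nx64"));
}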
pub fn run_pipx_update(ctx: &ExecutionContext) -> Result<()> {
let pipx = require("pipx")?;
print_separator("pipx");
let mut command_args = vec!["upgrade-all"];
let mut command_args = vec!["upgrade-all", "--include-injected"];
// pipx version 1.4.0 introduced a new command argument `pipx upgrade-all --quiet`
// (see https://pipx.pypa.io/stable/docs/#pipx-upgrade-all)
let version_str = Command::new("pipx")
let version_str = Command::new(&pipx)
.args(["--version"])
.output_checked_utf8()
.map(|s| s.stdout.trim().to_owned());
@@ -372,7 +431,7 @@ pub fn run_pipx_update(ctx: &ExecutionContext) -> Result<()> {
pub fn run_conda_update(ctx: &ExecutionContext) -> Result<()> {
let conda = require("conda")?;
let output = Command::new("conda")
let output = Command::new(&conda)
.args(["config", "--show", "auto_activate_base"])
.output_checked_utf8()?;
debug!("Conda output: {}", output.stdout);
@@ -390,17 +449,16 @@ pub fn run_conda_update(ctx: &ExecutionContext) -> Result<()> {
command.status_checked()
}
pub fn run_pixi_update(ctx: &ExecutionContext) -> Result<()> {
let pixi = require("pixi")?;
print_separator("Pixi");
ctx.run_type().execute(pixi).args(["self-update"]).status_checked()
}
pub fn run_mamba_update(ctx: &ExecutionContext) -> Result<()> {
let mamba = require("mamba")?;
let output = Command::new("mamba")
.args(["config", "--show", "auto_activate_base"])
.output_checked_utf8()?;
debug!("Mamba output: {}", output.stdout);
if output.stdout.contains("False") {
return Err(SkipStep("auto_activate_base is set to False".to_string()).into());
}
print_separator("Mamba");
let mut command = ctx.run_type().execute(mamba);
@@ -489,7 +547,7 @@ pub fn run_pip3_update(ctx: &ExecutionContext) -> Result<()> {
print_separator("pip3");
if env::var("VIRTUAL_ENV").is_ok() {
print_warning("This step is will be skipped when running inside a virtual environment");
print_warning("This step is skipped when running inside a virtual environment");
return Err(SkipStep("Does not run inside a virtual environment".to_string()).into());
}
@@ -517,6 +575,7 @@ pub fn run_pip_review_update(ctx: &ExecutionContext) -> Result<()> {
Ok(())
}
pub fn run_pip_review_local_update(ctx: &ExecutionContext) -> Result<()> {
let pip_review = require("pip-review")?;
@@ -536,6 +595,7 @@ pub fn run_pip_review_local_update(ctx: &ExecutionContext) -> Result<()> {
Ok(())
}
pub fn run_pipupgrade_update(ctx: &ExecutionContext) -> Result<()> {
let pipupgrade = require("pipupgrade")?;
@@ -553,6 +613,7 @@ pub fn run_pipupgrade_update(ctx: &ExecutionContext) -> Result<()> {
Ok(())
}
pub fn run_stack_update(ctx: &ExecutionContext) -> Result<()> {
if require("ghcup").is_ok() {
// `ghcup` is present and probably(?) being used to install `stack`.
@@ -606,7 +667,7 @@ pub fn run_tlmgr_update(ctx: &ExecutionContext) -> Result<()> {
let mut command = if directory_writable {
ctx.run_type().execute(&tlmgr)
} else {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut c = ctx.run_type().execute(sudo);
c.arg(&tlmgr);
c
@@ -663,19 +724,22 @@ pub fn run_composer_update(ctx: &ExecutionContext) -> Result<()> {
let composer_home = Command::new(&composer)
.args(["global", "config", "--absolute", "--quiet", "home"])
.output_checked_utf8()
.map_err(|e| (SkipStep(format!("Error getting the composer directory: {e}"))))
.map_err(|e| (SkipStep(t!("Error getting the composer directory: {error}", error = e).to_string())))
.map(|s| PathBuf::from(s.stdout.trim()))?
.require()?;
if !composer_home.is_descendant_of(&HOME_DIR) {
return Err(SkipStep(format!(
"Composer directory {} isn't a decandent of the user's home directory",
composer_home.display()
))
return Err(SkipStep(
t!(
"Composer directory {composer_home} isn't a descendant of the user's home directory",
composer_home = composer_home.display()
)
.to_string(),
)
.into());
}
print_separator("Composer");
print_separator(t!("Composer"));
if ctx.config().composer_self_update() {
cfg_if::cfg_if! {
@@ -687,7 +751,7 @@ pub fn run_composer_update(ctx: &ExecutionContext) -> Result<()> {
};
if has_update {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type()
.execute(sudo)
.arg(&composer)
@@ -731,13 +795,15 @@ pub fn run_dotnet_upgrade(ctx: &ExecutionContext) -> Result<()> {
{
Ok(output) => output,
Err(_) => {
return Err(SkipStep(String::from(
"Error running `dotnet tool list`. This is expected when a dotnet runtime is installed but no SDK.",
))
.into())
return Err(SkipStep(
t!("Error running `dotnet tool list`. This is expected when a dotnet runtime is installed but no SDK.")
.to_string(),
)
.into());
}
};
let mut in_header = true;
let mut packages = output
.stdout
.lines()
@@ -745,16 +811,22 @@ pub fn run_dotnet_upgrade(ctx: &ExecutionContext) -> Result<()> {
//
// Package Id Version Commands
// -------------------------------------
//
// One thing to note is that .NET SDK respect locale, which means this
// header can be printed in languages other than English, do NOT use it
// to do any check.
.skip(2)
.skip_while(|line| {
// The .NET SDK respects locale, so the header can be printed
// in languages other than English. The separator should hopefully
// always be at least 10 -'s long.
if in_header && line.starts_with("----------") {
in_header = false;
true
} else {
in_header
}
})
.filter(|line| !line.is_empty())
.peekable();
if packages.peek().is_none() {
return Err(SkipStep(String::from("No dotnet global tools installed")).into());
return Err(SkipStep(t!("No dotnet global tools installed").to_string()).into());
}
print_separator(".NET");
@@ -765,27 +837,26 @@ pub fn run_dotnet_upgrade(ctx: &ExecutionContext) -> Result<()> {
.execute(&dotnet)
.args(["tool", "update", package_name, "--global"])
.status_checked()
.with_context(|| format!("Failed to update .NET package {package_name}"))?;
.with_context(|| format!("Failed to update .NET package {:?}", package_name))?;
}
Ok(())
}
pub fn run_helix_grammars(ctx: &ExecutionContext) -> Result<()> {
require("helix")?;
let helix = require("helix").or(require("hx"))?;
print_separator("Helix");
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
ctx.run_type()
.execute(sudo)
.args(["helix", "--grammar", "fetch"])
.execute(&helix)
.args(["--grammar", "fetch"])
.status_checked()
.with_context(|| "Failed to download helix grammars!")?;
ctx.run_type()
.execute(sudo)
.args(["helix", "--grammar", "build"])
.execute(&helix)
.args(["--grammar", "build"])
.status_checked()
.with_context(|| "Failed to build helix grammars!")?;
@@ -795,7 +866,7 @@ pub fn run_helix_grammars(ctx: &ExecutionContext) -> Result<()> {
pub fn run_raco_update(ctx: &ExecutionContext) -> Result<()> {
let raco = require("raco")?;
print_separator("Racket Package Manager");
print_separator(t!("Racket Package Manager"));
ctx.run_type()
.execute(raco)
@@ -823,10 +894,10 @@ pub fn run_ghcli_extensions_upgrade(ctx: &ExecutionContext) -> Result<()> {
let result = Command::new(&gh).args(["extensions", "list"]).output_checked_utf8();
if result.is_err() {
debug!("GH result {:?}", result);
return Err(SkipStep(String::from("GH failed")).into());
return Err(SkipStep(t!("GH failed").to_string()).into());
}
print_separator("GitHub CLI Extensions");
print_separator(t!("GitHub CLI Extensions"));
ctx.run_type()
.execute(&gh)
.args(["extension", "upgrade", "--all"])
@@ -836,12 +907,17 @@ pub fn run_ghcli_extensions_upgrade(ctx: &ExecutionContext) -> Result<()> {
pub fn update_julia_packages(ctx: &ExecutionContext) -> Result<()> {
let julia = require("julia")?;
print_separator("Julia Packages");
print_separator(t!("Julia Packages"));
ctx.run_type()
.execute(julia)
.args(["-e", "using Pkg; Pkg.update()"])
.status_checked()
let mut executor = ctx.run_type().execute(julia);
executor.arg(if ctx.config().julia_use_startup_file() {
"--startup-file=yes"
} else {
"--startup-file=no"
});
executor.args(["-e", "using Pkg; Pkg.update()"]).status_checked()
}
pub fn run_helm_repo_update(ctx: &ExecutionContext) -> Result<()> {
@@ -853,7 +929,7 @@ pub fn run_helm_repo_update(ctx: &ExecutionContext) -> Result<()> {
let mut success = true;
let mut exec = ctx.run_type().execute(helm);
if let Err(e) = exec.arg("repo").arg("update").status_checked() {
error!("Updating repositories failed: {}", e);
error!("Updating repositories failed: {e}");
success = match exec.output_checked_utf8() {
Ok(s) => s.stdout.contains(no_repo) || s.stderr.contains(no_repo),
Err(e) => match e.downcast_ref::<TopgradeError>() {
@@ -884,3 +960,198 @@ pub fn run_bob(ctx: &ExecutionContext) -> Result<()> {
ctx.run_type().execute(bob).args(["update", "--all"]).status_checked()
}
pub fn run_certbot(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let certbot = require("certbot")?;
print_separator("Certbot");
let mut cmd = ctx.run_type().execute(sudo);
cmd.arg(certbot);
cmd.arg("renew");
cmd.status_checked()
}
/// Run `$ freshclam` to update the ClamAV signature database
///
/// doc: https://docs.clamav.net/manual/Usage/SignatureManagement.html#freshclam
pub fn run_freshclam(ctx: &ExecutionContext) -> Result<()> {
let freshclam = require("freshclam")?;
print_separator(t!("Update ClamAV Database(FreshClam)"));
ctx.run_type().execute(freshclam).status_checked()
}
/// Invoke `pio upgrade` to update PlatformIO Core.
pub fn run_platform_io(ctx: &ExecutionContext) -> Result<()> {
// We use the full path because by default the binary is not in `PATH`:
// https://github.com/topgrade-rs/topgrade/issues/754#issuecomment-2020537559
#[cfg(unix)]
fn bin_path() -> PathBuf {
HOME_DIR.join(".platformio/penv/bin/pio")
}
#[cfg(windows)]
fn bin_path() -> PathBuf {
HOME_DIR.join(".platformio/penv/Scripts/pio.exe")
}
let bin_path = require(bin_path())?;
print_separator("PlatformIO Core");
ctx.run_type().execute(bin_path).arg("upgrade").status_checked()
}
/// Run `lensfun-update-data` to update the lensfun database.
///
/// `sudo` will be used if `use_sudo` configuration entry is set to true.
pub fn run_lensfun_update_data(ctx: &ExecutionContext) -> Result<()> {
const SEPARATOR: &str = "Lensfun's database update";
let lensfun_update_data = require("lensfun-update-data")?;
const EXIT_CODE_WHEN_NO_UPDATE: i32 = 1;
if ctx.config().lensfun_use_sudo() {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator(SEPARATOR);
ctx.run_type()
.execute(sudo)
.arg(lensfun_update_data)
// `lensfun-update-data` returns 1 when there is no update available
// which should be considered success
.status_checked_with_codes(&[EXIT_CODE_WHEN_NO_UPDATE])
} else {
print_separator(SEPARATOR);
ctx.run_type()
.execute(lensfun_update_data)
.status_checked_with_codes(&[EXIT_CODE_WHEN_NO_UPDATE])
}
}
pub fn run_poetry(ctx: &ExecutionContext) -> Result<()> {
let poetry = require("poetry")?;
#[cfg(unix)]
fn get_interpreter(poetry: &PathBuf) -> Result<PathBuf> {
use std::os::unix::ffi::OsStrExt;
let script = fs::read(poetry)?;
if let Some(r) = script.iter().position(|&b| b == b'\n') {
let first_line = &script[..r];
if first_line.starts_with(b"#!") {
return Ok(OsStr::from_bytes(&first_line[2..]).into());
}
}
Err(eyre!("Could not find shebang"))
}
#[cfg(windows)]
fn get_interpreter(poetry: &PathBuf) -> Result<PathBuf> {
let data = fs::read(poetry)?;
// https://bitbucket.org/vinay.sajip/simple_launcher/src/master/compare_launchers.py
let pos = match data.windows(4).rposition(|b| b == b"PK\x05\x06") {
Some(i) => i,
None => return Err(eyre!("Not a ZIP archive")),
};
let cdr_size = match data.get(pos + 12..pos + 16) {
Some(b) => u32::from_le_bytes(b.try_into().unwrap()) as usize,
None => return Err(eyre!("Invalid CDR size")),
};
let cdr_offset = match data.get(pos + 16..pos + 20) {
Some(b) => u32::from_le_bytes(b.try_into().unwrap()) as usize,
None => return Err(eyre!("Invalid CDR offset")),
};
if pos < cdr_size + cdr_offset {
return Err(eyre!("Invalid ZIP archive"));
}
let arc_pos = pos - cdr_size - cdr_offset;
let shebang = match data[..arc_pos].windows(2).rposition(|b| b == b"#!") {
Some(l) => &data[l + 2..arc_pos - 1],
None => return Err(eyre!("Could not find shebang")),
};
// the shebang line is UTF-8
Ok(std::str::from_utf8(shebang)?.into())
}
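A simplified, hedged sketch of the Unix branch above: read the script's first line and take the interpreter path that follows `#!`. The real code uses `OsStr::from_bytes`; this sketch uses a lossy UTF-8 conversion for brevity.

use std::path::PathBuf;

fn interpreter_from_shebang(script: &[u8]) -> Option<PathBuf> {
    // First line of the launcher script.
    let first_line = script.split(|&b| b == b'\n').next()?;
    // Everything after "#!" names the interpreter.
    let rest = first_line.strip_prefix(b"#!")?;
    Some(PathBuf::from(String::from_utf8_lossy(rest).trim().to_string()))
}

fn main() {
    let script = b"#!/usr/bin/python3\nprint('hi')\n";
    assert_eq!(
        interpreter_from_shebang(script),
        Some(PathBuf::from("/usr/bin/python3"))
    );
}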
if ctx.config().poetry_force_self_update() {
debug!("forcing poetry self update");
} else {
let interpreter = match get_interpreter(&poetry) {
Ok(p) => p,
Err(e) => {
return Err(SkipStep(format!("Could not find interpreter for {}: {}", poetry.display(), e)).into())
}
};
debug!("poetry interpreter: {}", interpreter.display());
let check_official_install_script =
"import sys; from os import path; print('Y') if path.isfile(path.join(sys.prefix, 'poetry_env')) else print('N')";
let output = Command::new(&interpreter)
.args(["-c", check_official_install_script])
.output_checked_utf8()?;
let stdout = output.stdout.trim();
let official_install = match stdout {
"N" => false,
"Y" => true,
_ => unreachable!("unexpected output from `check_official_install_script`"),
};
debug!("poetry is official install: {}", official_install);
if !official_install {
return Err(SkipStep("Not installed with the official script".to_string()).into());
}
}
print_separator("Poetry");
ctx.run_type()
.execute(&poetry)
.args(["self", "update"])
.status_checked()
}
pub fn run_uv(ctx: &ExecutionContext) -> Result<()> {
let uv_exec = require("uv")?;
print_separator("uv");
// try uv self --help first - if it succeeds, we call uv self update
let result = ctx
.run_type()
.execute(&uv_exec)
.args(["self", "--help"])
.output_checked();
if result.is_ok() {
ctx.run_type()
.execute(&uv_exec)
.args(["self", "update"])
.status_checked()
.ok();
}
ctx.run_type()
.execute(&uv_exec)
.args(["tool", "upgrade", "--all"])
.status_checked()
}
/// Invoke `zvm upgrade` to update ZVM
pub fn run_zvm(ctx: &ExecutionContext) -> Result<()> {
let zvm = require("zvm")?;
print_separator("ZVM");
ctx.run_type().execute(zvm).arg("upgrade").status_checked()
}
pub fn run_bun(ctx: &ExecutionContext) -> Result<()> {
let bun = require("bun")?;
print_separator("Bun");
ctx.run_type().execute(bun).arg("upgrade").status_checked()
}

View File

@@ -6,30 +6,123 @@ use std::process::{Command, Output, Stdio};
use color_eyre::eyre::Context;
use color_eyre::eyre::{eyre, Result};
use console::style;
use futures::stream::{iter, FuturesUnordered};
use futures::StreamExt;
use futures::stream::{iter, FuturesUnordered, StreamExt};
use glob::{glob_with, MatchOptions};
use tokio::process::Command as AsyncCommand;
use tokio::runtime;
use tracing::{debug, error};
use crate::command::CommandExt;
use crate::config::Step;
use crate::execution_context::ExecutionContext;
use crate::steps::emacs::Emacs;
use crate::terminal::print_separator;
use crate::utils::{which, PathExt};
use crate::{error::SkipStep, terminal::print_warning};
use crate::utils::{require, PathExt};
use crate::{error::SkipStep, terminal::print_warning, HOME_DIR};
use etcetera::base_strategy::BaseStrategy;
use rust_i18n::t;
#[cfg(unix)]
use crate::XDG_DIRS;
#[cfg(windows)]
use crate::WINDOWS_DIRS;
pub fn run_git_pull(ctx: &ExecutionContext) -> Result<()> {
let mut repos = RepoStep::try_new()?;
let config = ctx.config();
// handle built-in repos
if config.use_predefined_git_repos() {
// should be executed on all the platforms
{
if config.should_run(Step::Emacs) {
let emacs = Emacs::new();
if !emacs.is_doom() {
if let Some(directory) = emacs.directory() {
repos.insert_if_repo(directory);
}
}
repos.insert_if_repo(HOME_DIR.join(".doom.d"));
}
if config.should_run(Step::Vim) {
repos.insert_if_repo(HOME_DIR.join(".vim"));
repos.insert_if_repo(HOME_DIR.join(".config/nvim"));
}
repos.insert_if_repo(HOME_DIR.join(".ideavimrc"));
repos.insert_if_repo(HOME_DIR.join(".intellimacs"));
if config.should_run(Step::Rcm) {
repos.insert_if_repo(HOME_DIR.join(".dotfiles"));
}
let powershell = crate::steps::powershell::Powershell::new();
if let Some(profile) = powershell.profile() {
repos.insert_if_repo(profile);
}
}
#[cfg(unix)]
{
repos.insert_if_repo(crate::steps::zsh::zshrc());
if config.should_run(Step::Tmux) {
repos.insert_if_repo(HOME_DIR.join(".tmux"));
}
repos.insert_if_repo(HOME_DIR.join(".config/fish"));
repos.insert_if_repo(XDG_DIRS.config_dir().join("openbox"));
repos.insert_if_repo(XDG_DIRS.config_dir().join("bspwm"));
repos.insert_if_repo(XDG_DIRS.config_dir().join("i3"));
repos.insert_if_repo(XDG_DIRS.config_dir().join("sway"));
}
#[cfg(windows)]
{
repos.insert_if_repo(
WINDOWS_DIRS
.cache_dir()
.join("Packages/Microsoft.WindowsTerminal_8wekyb3d8bbwe/LocalState"),
);
super::os::windows::insert_startup_scripts(&mut repos).ok();
}
}
// Handle user-defined repos
if let Some(custom_git_repos) = config.git_repos() {
for git_repo in custom_git_repos {
repos.glob_insert(git_repo);
}
}
// Warn the user about the bad patterns.
//
// NOTE: this should be executed **before** skipping the Git step or the
// user won't receive this warning in the cases where all the paths configured
// are bad patterns.
repos.bad_patterns.iter().for_each(|pattern| {
print_warning(t!(
"Path {pattern} did not contain any git repositories",
pattern = pattern
))
});
if repos.is_repos_empty() {
return Err(SkipStep(t!("No repositories to pull").to_string()).into());
}
print_separator(t!("Git repositories"));
repos.pull_repos(ctx)
}
#[cfg(windows)]
static PATH_PREFIX: &str = "\\\\?\\";
#[derive(Debug)]
pub struct Git {
git: Option<PathBuf>,
}
pub struct Repositories<'a> {
git: &'a Git,
repositories: HashSet<String>,
pub struct RepoStep {
git: PathBuf,
repos: HashSet<PathBuf>,
glob_match_options: MatchOptions,
bad_patterns: Vec<String>,
}
@@ -45,100 +138,41 @@ fn output_checked_utf8(output: Output) -> Result<()> {
}
}
async fn pull_repository(repo: String, git: &Path, ctx: &ExecutionContext<'_>) -> Result<()> {
let path = repo.to_string();
let before_revision = get_head_revision(git, &repo);
println!("{} {}", style("Pulling").cyan().bold(), path);
let mut command = AsyncCommand::new(git);
command
.stdin(Stdio::null())
.current_dir(&repo)
.args(["pull", "--ff-only"]);
if let Some(extra_arguments) = ctx.config().git_arguments() {
command.args(extra_arguments.split_whitespace());
}
let pull_output = command.output().await?;
let submodule_output = AsyncCommand::new(git)
.args(["submodule", "update", "--recursive"])
.current_dir(&repo)
.stdin(Stdio::null())
.output()
.await?;
let result = output_checked_utf8(pull_output)
.and_then(|_| output_checked_utf8(submodule_output))
.wrap_err_with(|| format!("Failed to pull {repo}"));
if result.is_err() {
println!("{} pulling {}", style("Failed").red().bold(), &repo);
} else {
let after_revision = get_head_revision(git, &repo);
match (&before_revision, &after_revision) {
(Some(before), Some(after)) if before != after => {
println!("{} {}:", style("Changed").yellow().bold(), &repo);
Command::new(git)
.stdin(Stdio::null())
.current_dir(&repo)
.args([
"--no-pager",
"log",
"--no-decorate",
"--oneline",
&format!("{before}..{after}"),
])
.status_checked()?;
println!();
}
_ => {
println!("{} {}", style("Up-to-date").green().bold(), &repo);
}
}
}
result.map(|_| ())
}
fn get_head_revision(git: &Path, repo: &str) -> Option<String> {
fn get_head_revision<P: AsRef<Path>>(git: &Path, repo: P) -> Option<String> {
Command::new(git)
.stdin(Stdio::null())
.current_dir(repo)
.current_dir(repo.as_ref())
.args(["rev-parse", "HEAD"])
.output_checked_utf8()
.map(|output| output.stdout.trim().to_string())
.map_err(|e| {
error!("Error getting revision for {}: {}", repo, e);
error!("Error getting revision for {}: {e}", repo.as_ref().display(),);
e
})
.ok()
}
fn has_remotes(git: &Path, repo: &str) -> Option<bool> {
Command::new(git)
.stdin(Stdio::null())
.current_dir(repo)
.args(["remote", "show"])
.output_checked_utf8()
.map(|output| output.stdout.lines().count() > 0)
.map_err(|e| {
error!("Error getting remotes for {}: {}", repo, e);
e
})
.ok()
}
impl RepoStep {
/// Try to create a `RepoStep`, fail if `git` is not found.
pub fn try_new() -> Result<Self> {
let git = require("git")?;
let mut glob_match_options = MatchOptions::new();
impl Git {
pub fn new() -> Self {
Self { git: which("git") }
if cfg!(windows) {
glob_match_options.case_sensitive = false;
}
Ok(Self {
git,
repos: HashSet::new(),
bad_patterns: Vec::new(),
glob_match_options,
})
}
pub fn get_repo_root<P: AsRef<Path>>(&self, path: P) -> Option<String> {
/// Try to get the root of the repo specified in `path`.
pub fn get_repo_root<P: AsRef<Path>>(&self, path: P) -> Option<PathBuf> {
match path.as_ref().canonicalize() {
Ok(mut path) => {
debug_assert!(path.exists());
@@ -162,71 +196,210 @@ impl Git {
path_string
};
if let Some(git) = &self.git {
let output = Command::new(git)
.stdin(Stdio::null())
.current_dir(path)
.args(["rev-parse", "--show-toplevel"])
.output_checked_utf8()
.ok()
.map(|output| output.stdout.trim().to_string());
return output;
}
let output = Command::new(&self.git)
.stdin(Stdio::null())
.current_dir(path)
.args(["rev-parse", "--show-toplevel"])
.output_checked_utf8()
.ok()
// trim the last newline char
.map(|output| PathBuf::from(output.stdout.trim()));
return output;
}
Err(e) => match e.kind() {
io::ErrorKind::NotFound => debug!("{} does not exist", path.as_ref().display()),
_ => error!("Error looking for {}: {}", path.as_ref().display(), e),
_ => error!("Error looking for {}: {e}", path.as_ref().display(),),
},
}
None
}
pub fn multi_pull_step(&self, repositories: &Repositories, ctx: &ExecutionContext) -> Result<()> {
// Warn the user about the bad patterns.
//
// NOTE: this should be executed **before** skipping the Git step or the
// user won't receive this warning in the cases where all the paths configured
// are bad patterns.
repositories
.bad_patterns
.iter()
.for_each(|pattern| print_warning(format!("Path {pattern} did not contain any git repositories")));
if repositories.repositories.is_empty() {
return Err(SkipStep(String::from("No repositories to pull")).into());
/// Check if `path` is a git repo; if so, add it to `self.repos`.
///
/// Return the check result.
pub fn insert_if_repo<P: AsRef<Path>>(&mut self, path: P) -> bool {
if let Some(repo) = self.get_repo_root(path) {
self.repos.insert(repo);
true
} else {
false
}
print_separator("Git repositories");
self.multi_pull(repositories, ctx)
}
pub fn multi_pull(&self, repositories: &Repositories, ctx: &ExecutionContext) -> Result<()> {
let git = self.git.as_ref().unwrap();
/// Check if `repo` has a remote.
fn has_remotes<P: AsRef<Path>>(&self, repo: P) -> Option<bool> {
let mut cmd = Command::new(&self.git);
cmd.stdin(Stdio::null())
.current_dir(repo.as_ref())
.args(["remote", "show"]);
let res = cmd.output_checked_utf8();
res.map(|output| output.stdout.lines().count() > 0)
.map_err(|e| {
error!("Error getting remotes for {}: {e}", repo.as_ref().display());
e
})
.ok()
}
/// Similar to `insert_if_repo`, with glob support.
pub fn glob_insert(&mut self, pattern: &str) {
if let Ok(glob) = glob_with(pattern, self.glob_match_options) {
let mut last_git_repo: Option<PathBuf> = None;
for entry in glob {
match entry {
Ok(path) => {
if let Some(last_git_repo) = &last_git_repo {
if path.is_descendant_of(last_git_repo) {
debug!(
"Skipping {} because it's a descendant of last known repo {}",
path.display(),
last_git_repo.display()
);
continue;
}
}
if self.insert_if_repo(&path) {
last_git_repo = Some(path);
}
}
Err(e) => {
error!("Error in path {e}");
}
}
}
if last_git_repo.is_none() {
self.bad_patterns.push(String::from(pattern));
}
} else {
error!("Bad glob pattern: {pattern}");
}
}
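// A minimal standalone sketch of the descendant-skipping idea used in
// `glob_insert` above: once a path has been accepted as a repo root, later
// glob matches that live underneath it are skipped. The helper below is
// illustrative only; it uses `std::path::Path::starts_with` instead of the
// crate's `PathExt::is_descendant_of`, and it checks every accepted root
// rather than relying on glob ordering and the last match as the real step does.
fn dedup_nested_repos(matches: Vec<std::path::PathBuf>) -> Vec<std::path::PathBuf> {
    let mut roots: Vec<std::path::PathBuf> = Vec::new();
    for path in matches {
        if roots.iter().any(|root| path.starts_with(root)) {
            // Already covered by a known repo root, e.g. `~/src/foo/docs`
            // after `~/src/foo` has been inserted.
            continue;
        }
        roots.push(path);
    }
    roots
}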
/// True if `self.repos` is empty.
pub fn is_repos_empty(&self) -> bool {
self.repos.is_empty()
}
/// Remove `path` from `self.repos`.
///
// `cfg(unix)` because it is only used in the oh-my-zsh step.
#[cfg(unix)]
pub fn remove<P: AsRef<Path>>(&mut self, path: P) {
let _removed = self.repos.remove(path.as_ref());
debug_assert!(_removed);
}
/// Try to pull a repo.
async fn pull_repo<P: AsRef<Path>>(&self, ctx: &ExecutionContext<'_>, repo: P) -> Result<()> {
let before_revision = get_head_revision(&self.git, &repo);
if ctx.config().verbose() {
println!("{} {}", style(t!("Pulling")).cyan().bold(), repo.as_ref().display());
}
let mut command = AsyncCommand::new(&self.git);
command
.stdin(Stdio::null())
.current_dir(&repo)
.args(["pull", "--ff-only"]);
if let Some(extra_arguments) = ctx.config().git_arguments() {
command.args(extra_arguments.split_whitespace());
}
let pull_output = command.output().await?;
let submodule_output = AsyncCommand::new(&self.git)
.args(["submodule", "update", "--recursive"])
.current_dir(&repo)
.stdin(Stdio::null())
.output()
.await?;
let result = output_checked_utf8(pull_output)
.and_then(|_| output_checked_utf8(submodule_output))
.wrap_err_with(|| format!("Failed to pull {}", repo.as_ref().display()));
if result.is_err() {
println!(
"{} {} {}",
style(t!("Failed")).red().bold(),
t!("pulling"),
repo.as_ref().display()
);
} else {
let after_revision = get_head_revision(&self.git, repo.as_ref());
match (&before_revision, &after_revision) {
(Some(before), Some(after)) if before != after => {
println!("{} {}", style(t!("Changed")).yellow().bold(), repo.as_ref().display());
Command::new(&self.git)
.stdin(Stdio::null())
.current_dir(&repo)
.args([
"--no-pager",
"log",
"--no-decorate",
"--oneline",
&format!("{before}..{after}"),
])
.status_checked()?;
println!();
}
_ => {
if ctx.config().verbose() {
println!("{} {}", style(t!("Up-to-date")).green().bold(), repo.as_ref().display());
}
}
}
}
result.map(|_| ())
}
/// Pull the repositories specified in `self.repos`.
///
/// # NOTE
/// This function creates an async runtime internally and does the real work
/// there, so the function itself is not async.
fn pull_repos(&self, ctx: &ExecutionContext) -> Result<()> {
if ctx.run_type().dry() {
repositories
.repositories
self.repos
.iter()
.for_each(|repo| println!("Would pull {}", &repo));
.for_each(|repo| println!("{}", t!("Would pull {repo}", repo = repo.display())));
return Ok(());
}
let futures_iterator = repositories
.repositories
if !ctx.config().verbose() {
println!(
"\n{} {}\n",
style(t!("Only")).green().bold(),
t!("updated repositories will be shown...")
);
}
let futures_iterator = self
.repos
.iter()
.filter(|repo| match has_remotes(git, repo) {
.filter(|repo| match self.has_remotes(repo) {
Some(false) => {
println!(
"{} {} because it has no remotes",
style("Skipping").yellow().bold(),
repo
"{} {} {}",
style(t!("Skipping")).yellow().bold(),
repo.display(),
t!("because it has no remotes")
);
false
}
_ => true, // repo has remotes or command to check for remotes has failed. proceed to pull anyway.
})
.map(|repo| pull_repository(repo.clone(), git, ctx));
.map(|repo| self.pull_repo(ctx, repo));
let stream_of_futures = if let Some(limit) = ctx.config().git_concurrency_limit() {
iter(futures_iterator).buffer_unordered(limit).boxed()
@@ -241,76 +414,3 @@ impl Git {
error.unwrap_or(Ok(()))
}
}
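// A compact sketch of the concurrency pattern in `pull_repos` above: the
// per-repo pull futures are driven through `buffer_unordered`, so at most
// `git_concurrency_limit` pulls are in flight at once (the `FuturesUnordered`
// import above suggests an unbounded fallback when no limit is configured).
// The async block below is only a stand-in for `self.pull_repo(ctx, repo)`.
async fn pull_all(repos: Vec<String>, limit: usize) {
    use futures::stream::{iter, StreamExt};
    iter(repos.into_iter().map(|repo| async move {
        // placeholder for `git pull --ff-only` plus the submodule update
        println!("pulling {repo}");
    }))
    .buffer_unordered(limit) // at most `limit` futures polled concurrently
    .collect::<Vec<_>>()
    .await;
}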
impl<'a> Repositories<'a> {
pub fn new(git: &'a Git) -> Self {
let mut glob_match_options = MatchOptions::new();
if cfg!(windows) {
glob_match_options.case_sensitive = false;
}
Self {
git,
repositories: HashSet::new(),
bad_patterns: Vec::new(),
glob_match_options,
}
}
pub fn insert_if_repo<P: AsRef<Path>>(&mut self, path: P) -> bool {
if let Some(repo) = self.git.get_repo_root(path) {
self.repositories.insert(repo);
true
} else {
false
}
}
pub fn glob_insert(&mut self, pattern: &str) {
if let Ok(glob) = glob_with(pattern, self.glob_match_options) {
let mut last_git_repo: Option<PathBuf> = None;
for entry in glob {
match entry {
Ok(path) => {
if let Some(last_git_repo) = &last_git_repo {
if path.is_descendant_of(last_git_repo) {
debug!(
"Skipping {} because it's a decendant of last known repo {}",
path.display(),
last_git_repo.display()
);
continue;
}
}
if self.insert_if_repo(&path) {
last_git_repo = Some(path);
}
}
Err(e) => {
error!("Error in path {}", e);
}
}
}
if last_git_repo.is_none() {
self.bad_patterns.push(String::from(pattern));
}
} else {
error!("Bad glob pattern: {}", pattern);
}
}
#[cfg(unix)]
pub fn is_empty(&self) -> bool {
self.repositories.is_empty()
}
// The following 2 functions are `#[cfg(unix)]` because they are only used in
// the `oh-my-zsh` step, which is UNIX-only.
#[cfg(unix)]
pub fn remove(&mut self, path: &str) {
let _removed = self.repositories.remove(path);
debug_assert!(_removed);
}
}

View File

@@ -1,6 +1,7 @@
use crate::terminal::print_separator;
use crate::utils::require;
use color_eyre::eyre::Result;
use rust_i18n::t;
use crate::execution_context::ExecutionContext;
@@ -17,7 +18,7 @@ pub fn upgrade_kak_plug(ctx: &ExecutionContext) -> Result<()> {
.args(["-ui", "dummy", "-e", UPGRADE_KAK])
.output()?;
println!("Plugins upgraded");
println!("{}", t!("Plugins upgraded"));
Ok(())
}

View File

@@ -4,16 +4,17 @@ use std::os::unix::fs::MetadataExt;
use std::path::PathBuf;
use std::process::Command;
use crate::utils::{require_option, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require_option};
use crate::HOME_DIR;
use color_eyre::eyre::Result;
#[cfg(target_os = "linux")]
use nix::unistd::Uid;
use rust_i18n::t;
use semver::Version;
use tracing::debug;
use crate::command::CommandExt;
use crate::terminal::print_separator;
use crate::terminal::{print_info, print_separator};
use crate::utils::{require, PathExt};
use crate::{error::SkipStep, execution_context::ExecutionContext};
@@ -92,7 +93,7 @@ impl NPM {
fn upgrade(&self, ctx: &ExecutionContext, use_sudo: bool) -> Result<()> {
let args = ["update", self.global_location_arg()];
if use_sudo {
let sudo = require_option(ctx.sudo().clone(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().clone(), get_require_sudo_string())?;
ctx.run_type()
.execute(sudo)
.arg(&self.command)
@@ -156,7 +157,7 @@ impl Yarn {
let args = ["global", "upgrade"];
if use_sudo {
let sudo = require_option(ctx.sudo().clone(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().clone(), get_require_sudo_string())?;
ctx.run_type()
.execute(sudo)
.arg(self.yarn.as_ref().unwrap_or(&self.command))
@@ -183,6 +184,92 @@ impl Yarn {
}
}
struct Deno {
command: PathBuf,
}
impl Deno {
fn new(command: PathBuf) -> Self {
Self { command }
}
fn upgrade(&self, ctx: &ExecutionContext) -> Result<()> {
let mut args = vec![];
let version = ctx.config().deno_version();
if let Some(version) = version {
let bin_version = self.version()?;
if bin_version >= Version::new(2, 0, 0) {
args.push(version);
} else if bin_version >= Version::new(1, 6, 0) {
match version {
"stable" => { /* do nothing, as stable is the default channel to upgrade */ }
"rc" => {
return Err(SkipStep(
"Deno (1.6.0-2.0.0) cannot be upgraded to a release candidate".to_string(),
)
.into());
}
"canary" => args.push("--canary"),
_ => {
if Version::parse(version).is_err() {
return Err(SkipStep("Invalid Deno version".to_string()).into());
}
args.push("--version");
args.push(version);
}
}
} else if bin_version >= Version::new(1, 0, 0) {
match version {
"stable" | "rc" | "canary" => {
// Prior to v1.6.0, `deno upgrade` is not able to fetch the latest tag version.
return Err(
SkipStep("Deno (1.0.0-1.6.0) cannot be upgraded to a named channel".to_string()).into(),
);
}
_ => {
if Version::parse(version).is_err() {
return Err(SkipStep("Invalid Deno version".to_string()).into());
}
args.push("--version");
args.push(version);
}
}
} else {
// v0.x cannot be upgraded with `deno upgrade` to v1.x or v2.x,
// nor can it be upgraded to a specific version.
return Err(SkipStep("Unsupported Deno version".to_string()).into());
}
}
ctx.run_type()
.execute(&self.command)
.arg("upgrade")
.args(args)
.status_checked()?;
Ok(())
}
/// Get the version of Deno.
///
/// This function will return the version of Deno installed on the system.
/// The version is parsed from the output of `deno -V`.
///
/// ```sh
/// deno -V # deno 1.6.0
/// ```
fn version(&self) -> Result<Version> {
let version_str = Command::new(&self.command)
.args(["-V"])
.output_checked_utf8()
.map(|s| s.stdout.trim().to_owned().split_off(5)); // remove "deno " prefix
Version::parse(&version_str?).map_err(|err| err.into())
}
}
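// A small illustrative check of the `deno -V` parsing used above: the 5-byte
// "deno " prefix is split off and the remainder is parsed with
// `semver::Version`. The test module name is hypothetical.
#[cfg(test)]
mod deno_version_parsing {
    use semver::Version;

    #[test]
    fn strips_prefix_and_parses() {
        let mut stdout = String::from("deno 1.6.0");
        let version = stdout.split_off(5); // leaves "deno " behind, returns "1.6.0"
        assert_eq!(Version::parse(&version).unwrap(), Version::new(1, 6, 0));
    }
}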
#[cfg(target_os = "linux")]
fn should_use_sudo(npm: &NPM, ctx: &ExecutionContext) -> Result<bool> {
if npm.should_use_sudo()? {
@@ -214,7 +301,7 @@ fn should_use_sudo_yarn(yarn: &Yarn, ctx: &ExecutionContext) -> Result<bool> {
pub fn run_npm_upgrade(ctx: &ExecutionContext) -> Result<()> {
let npm = require("npm").map(|b| NPM::new(b, NPMVariant::Npm))?;
print_separator("Node Package Manager");
print_separator(t!("Node Package Manager"));
#[cfg(target_os = "linux")]
{
@@ -230,7 +317,7 @@ pub fn run_npm_upgrade(ctx: &ExecutionContext) -> Result<()> {
pub fn run_pnpm_upgrade(ctx: &ExecutionContext) -> Result<()> {
let pnpm = require("pnpm").map(|b| NPM::new(b, NPMVariant::Pnpm))?;
print_separator("Performant Node Package Manager");
print_separator(t!("Performant Node Package Manager"));
#[cfg(target_os = "linux")]
{
@@ -251,7 +338,7 @@ pub fn run_yarn_upgrade(ctx: &ExecutionContext) -> Result<()> {
return Ok(());
}
print_separator("Yarn Package Manager");
print_separator(t!("Yarn Package Manager"));
#[cfg(target_os = "linux")]
{
@@ -265,14 +352,59 @@ pub fn run_yarn_upgrade(ctx: &ExecutionContext) -> Result<()> {
}
pub fn deno_upgrade(ctx: &ExecutionContext) -> Result<()> {
let deno = require("deno")?;
let deno = require("deno").map(Deno::new)?;
let deno_dir = HOME_DIR.join(".deno");
if !deno.canonicalize()?.is_descendant_of(&deno_dir) {
let skip_reason = SkipStep("Deno installed outside of .deno directory".to_string());
if !deno.command.canonicalize()?.is_descendant_of(&deno_dir) {
let skip_reason = SkipStep(t!("Deno installed outside of .deno directory").to_string());
return Err(skip_reason.into());
}
print_separator("Deno");
ctx.run_type().execute(&deno).arg("upgrade").status_checked()
deno.upgrade(ctx)
}
/// There is no `volta upgrade` command, so we need to upgrade each package
pub fn run_volta_packages_upgrade(ctx: &ExecutionContext) -> Result<()> {
let volta = require("volta")?;
print_separator("Volta");
if ctx.run_type().dry() {
print_info(t!("Updating Volta packages..."));
return Ok(());
}
let list_output = ctx
.run_type()
.execute(&volta)
.args(["list", "--format=plain"])
.output_checked_utf8()?
.stdout;
let installed_packages: Vec<&str> = list_output
.lines()
.filter_map(|line| {
// format is 'kind package@version ...'
let mut parts = line.split_whitespace();
parts.next();
let package_part = parts.next()?;
let version_index = package_part.rfind('@').unwrap_or(package_part.len());
Some(package_part[..version_index].trim())
})
.collect();
if installed_packages.is_empty() {
print_info(t!("No packages installed with Volta"));
return Ok(());
}
for package in installed_packages.iter() {
ctx.run_type()
.execute(&volta)
.args(["install", package])
.status_checked()?;
}
Ok(())
}
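// A minimal illustration of the `volta list --format=plain` parsing above:
// the leading kind ("runtime", "package", ...) is dropped and the version
// suffix after the last '@' is trimmed off, which also keeps scoped package
// names such as "@vue/cli" intact. The sample line is hypothetical.
fn volta_package_name(line: &str) -> Option<&str> {
    let mut parts = line.split_whitespace();
    parts.next()?; // skip the kind
    let package_part = parts.next()?;
    let version_index = package_part.rfind('@').unwrap_or(package_part.len());
    Some(package_part[..version_index].trim())
}
// volta_package_name("package typescript@5.4.2 (default)") == Some("typescript")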

View File

@@ -4,12 +4,13 @@ use std::path::{Path, PathBuf};
use color_eyre::eyre;
use color_eyre::eyre::Result;
use rust_i18n::t;
use walkdir::WalkDir;
use crate::command::CommandExt;
use crate::error::TopgradeError;
use crate::execution_context::ExecutionContext;
use crate::sudo::Sudo;
use crate::utils::require_option;
use crate::utils::which;
use crate::{config, Step};
@@ -144,13 +145,13 @@ impl Trizen {
}
pub struct Pacman {
sudo: Sudo,
executable: PathBuf,
}
impl ArchPackageManager for Pacman {
fn upgrade(&self, ctx: &ExecutionContext) -> Result<()> {
let mut command = ctx.run_type().execute(&self.sudo);
let sudo = require_option(ctx.sudo().as_ref(), "sudo is required to run pacman".into())?;
let mut command = ctx.run_type().execute(sudo);
command
.arg(&self.executable)
.arg("-Syu")
@@ -161,7 +162,7 @@ impl ArchPackageManager for Pacman {
command.status_checked()?;
if ctx.config().cleanup() {
let mut command = ctx.run_type().execute(&self.sudo);
let mut command = ctx.run_type().execute(sudo);
command.arg(&self.executable).arg("-Scc");
if ctx.config().yes(Step::System) {
command.arg("--noconfirm");
@@ -174,10 +175,9 @@ impl ArchPackageManager for Pacman {
}
impl Pacman {
pub fn get(ctx: &ExecutionContext) -> Option<Self> {
pub fn get() -> Option<Self> {
Some(Self {
executable: which("powerpill").unwrap_or_else(|| PathBuf::from("pacman")),
sudo: ctx.sudo().to_owned()?,
})
}
}
@@ -263,47 +263,76 @@ impl ArchPackageManager for Pamac {
pub struct Aura {
executable: PathBuf,
sudo: Sudo,
}
impl Aura {
fn get(ctx: &ExecutionContext) -> Option<Self> {
fn get() -> Option<Self> {
Some(Self {
executable: which("aura")?,
sudo: ctx.sudo().to_owned()?,
})
}
}
impl ArchPackageManager for Aura {
fn upgrade(&self, ctx: &ExecutionContext) -> Result<()> {
let sudo = which("sudo").unwrap_or_default();
let mut aur_update = ctx.run_type().execute(&sudo);
use semver::Version;
if sudo.ends_with("sudo") {
aur_update
.arg(&self.executable)
let version_cmd_output = ctx
.run_type()
.execute(&self.executable)
.arg("--version")
.output_checked_utf8()?;
// Output will be something like: "aura x.x.x\n"
let version_cmd_stdout = version_cmd_output.stdout;
let version_str = version_cmd_stdout.trim_start_matches("aura ").trim_end();
let version = Version::parse(version_str).expect("invalid version");
// Aura, since version 4.0.6, no longer needs sudo.
//
// https://github.com/fosskers/aura/releases/tag/v4.0.6
let version_no_sudo = Version::new(4, 0, 6);
if version >= version_no_sudo {
let mut cmd = ctx.run_type().execute(&self.executable);
cmd.arg("-Au")
.args(ctx.config().aura_aur_arguments().split_whitespace());
if ctx.config().yes(Step::System) {
cmd.arg("--noconfirm");
}
cmd.status_checked()?;
let mut cmd = ctx.run_type().execute(&self.executable);
cmd.arg("-Syu")
.args(ctx.config().aura_pacman_arguments().split_whitespace());
if ctx.config().yes(Step::System) {
cmd.arg("--noconfirm");
}
cmd.status_checked()?;
} else {
let sudo = crate::utils::require_option(
ctx.sudo().as_ref(),
t!("Aura(<0.4.6) requires sudo installed to work with AUR packages").to_string(),
)?;
let mut cmd = ctx.run_type().execute(sudo);
cmd.arg(&self.executable)
.arg("-Au")
.args(ctx.config().aura_aur_arguments().split_whitespace());
if ctx.config().yes(Step::System) {
aur_update.arg("--noconfirm");
cmd.arg("--noconfirm");
}
cmd.status_checked()?;
aur_update.status_checked()?;
} else {
println!("Aura requires sudo installed to work with AUR packages")
let mut cmd = ctx.run_type().execute(sudo);
cmd.arg(&self.executable)
.arg("-Syu")
.args(ctx.config().aura_pacman_arguments().split_whitespace());
if ctx.config().yes(Step::System) {
cmd.arg("--noconfirm");
}
cmd.status_checked()?;
}
let mut pacman_update = ctx.run_type().execute(&self.sudo);
pacman_update
.arg(&self.executable)
.arg("-Syu")
.args(ctx.config().aura_pacman_arguments().split_whitespace());
if ctx.config().yes(Step::System) {
pacman_update.arg("--noconfirm");
}
pacman_update.status_checked()?;
Ok(())
}
}
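// A tiny sketch of the version gate above: `aura --version` prints a line
// like "aura 4.0.8", the "aura " prefix is trimmed, and sudo is only required
// for versions below 4.0.6. The fallback here treats an unparsable version as
// "needs sudo", whereas the real step panics with "invalid version".
fn aura_needs_sudo(version_output: &str) -> bool {
    let version_str = version_output.trim_start_matches("aura ").trim_end();
    semver::Version::parse(version_str)
        .map(|v| v < semver::Version::new(4, 0, 6))
        .unwrap_or(true)
}
// aura_needs_sudo("aura 3.2.9\n") == true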
@@ -323,16 +352,16 @@ pub fn get_arch_package_manager(ctx: &ExecutionContext) -> Option<Box<dyn ArchPa
.or_else(|| Trizen::get().map(box_package_manager))
.or_else(|| Pikaur::get().map(box_package_manager))
.or_else(|| Pamac::get().map(box_package_manager))
.or_else(|| Pacman::get(ctx).map(box_package_manager))
.or_else(|| Aura::get(ctx).map(box_package_manager)),
.or_else(|| Pacman::get().map(box_package_manager))
.or_else(|| Aura::get().map(box_package_manager)),
config::ArchPackageManager::GarudaUpdate => GarudaUpdate::get().map(box_package_manager),
config::ArchPackageManager::Trizen => Trizen::get().map(box_package_manager),
config::ArchPackageManager::Paru => YayParu::get("paru", &pacman).map(box_package_manager),
config::ArchPackageManager::Yay => YayParu::get("yay", &pacman).map(box_package_manager),
config::ArchPackageManager::Pacman => Pacman::get(ctx).map(box_package_manager),
config::ArchPackageManager::Pacman => Pacman::get().map(box_package_manager),
config::ArchPackageManager::Pikaur => Pikaur::get().map(box_package_manager),
config::ArchPackageManager::Pamac => Pamac::get().map(box_package_manager),
config::ArchPackageManager::Aura => Aura::get(ctx).map(box_package_manager),
config::ArchPackageManager::Aura => Aura::get().map(box_package_manager),
}
}
@@ -355,7 +384,7 @@ pub fn show_pacnew() {
.peekable();
if iter.peek().is_some() {
println!("\nPacman backup configuration files found:");
println!("\n{}", t!("Pacman backup configuration files found:"));
for entry in iter {
println!("{}", entry.path().display());

View File

@@ -1,14 +1,14 @@
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
use crate::terminal::print_separator;
use crate::utils::{require_option, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require_option};
use crate::Step;
use color_eyre::eyre::Result;
use std::process::Command;
pub fn upgrade_packages(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
print_separator("DragonFly BSD Packages");
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator(t!("DragonFly BSD Packages"));
let mut cmd = ctx.run_type().execute(sudo);
cmd.args(["/usr/local/sbin/pkg", "upgrade"]);
if ctx.config().yes(Step::System) {
@@ -18,9 +18,9 @@ pub fn upgrade_packages(ctx: &ExecutionContext) -> Result<()> {
}
pub fn audit_packages(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator("DragonFly BSD Audit");
print_separator(t!("DragonFly BSD Audit"));
#[allow(clippy::disallowed_methods)]
if !Command::new(sudo)
@@ -28,7 +28,9 @@ pub fn audit_packages(ctx: &ExecutionContext) -> Result<()> {
.status()?
.success()
{
println!("The package audit was successful, but vulnerable packages still remain on the system");
println!(
"{}",
t!("The package audit was successful, but vulnerable packages still remain on the system")
);
}
Ok(())
}

View File

@@ -1,14 +1,15 @@
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
use crate::terminal::print_separator;
use crate::utils::{require_option, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require_option};
use crate::Step;
use color_eyre::eyre::Result;
use rust_i18n::t;
use std::process::Command;
pub fn upgrade_freebsd(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
print_separator("FreeBSD Update");
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator(t!("FreeBSD Update"));
ctx.run_type()
.execute(sudo)
.args(["/usr/sbin/freebsd-update", "fetch", "install"])
@@ -16,8 +17,8 @@ pub fn upgrade_freebsd(ctx: &ExecutionContext) -> Result<()> {
}
pub fn upgrade_packages(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
print_separator("FreeBSD Packages");
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator(t!("FreeBSD Packages"));
let mut command = ctx.run_type().execute(sudo);
@@ -29,9 +30,9 @@ pub fn upgrade_packages(ctx: &ExecutionContext) -> Result<()> {
}
pub fn audit_packages(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator("FreeBSD Audit");
print_separator(t!("FreeBSD Audit"));
Command::new(sudo)
.args(["/usr/sbin/pkg", "audit", "-Fr"])

View File

@@ -3,14 +3,16 @@ use std::process::Command;
use color_eyre::eyre::Result;
use ini::Ini;
use rust_i18n::t;
use tracing::{debug, warn};
use crate::command::CommandExt;
use crate::error::{SkipStep, TopgradeError};
use crate::execution_context::ExecutionContext;
use crate::steps::generic::is_wsl;
use crate::steps::os::archlinux;
use crate::terminal::print_separator;
use crate::utils::{require, require_option, which, PathExt, REQUIRE_SUDO};
use crate::terminal::{print_separator, prompt_yesno};
use crate::utils::{get_require_sudo_string, require, require_option, which, PathExt};
use crate::{Step, HOME_DIR};
static OS_RELEASE_PATH: &str = "/etc/os-release";
@@ -19,14 +21,17 @@ static OS_RELEASE_PATH: &str = "/etc/os-release";
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
pub enum Distribution {
Alpine,
Wolfi,
Arch,
Bedrock,
CentOS,
Chimera,
ClearLinux,
Fedora,
FedoraImmutable,
Debian,
Gentoo,
NILRT,
OpenMandriva,
OpenSuseTumbleweed,
PCLinuxOS,
@@ -38,6 +43,7 @@ pub enum Distribution {
Exherbo,
NixOS,
KDENeon,
Nobara,
}
impl Distribution {
@@ -45,34 +51,35 @@ impl Distribution {
let section = os_release.general_section();
let id = section.get("ID");
let name = section.get("NAME");
let variant: Option<Vec<&str>> = section.get("VARIANT").map(|s| s.split_whitespace().collect());
let variant = section.get("VARIANT");
let id_like: Option<Vec<&str>> = section.get("ID_LIKE").map(|s| s.split_whitespace().collect());
Ok(match id {
Some("alpine") => Distribution::Alpine,
Some("chimera") => Distribution::Chimera,
Some("wolfi") => Distribution::Wolfi,
Some("centos") | Some("rhel") | Some("ol") => Distribution::CentOS,
Some("clear-linux-os") => Distribution::ClearLinux,
Some("fedora") | Some("nobara") => {
Some("fedora") => {
return if let Some(variant) = variant {
if variant.contains(&"Silverblue")
|| variant.contains(&"Kinoite")
|| variant.contains(&"Sericea")
|| variant.contains(&"Onyx")
{
Ok(Distribution::FedoraImmutable)
} else {
Ok(Distribution::Fedora)
match variant {
"Silverblue" | "Kinoite" | "Sericea" | "Onyx" | "IoT Edition" | "Sway Atomic" => {
Ok(Distribution::FedoraImmutable)
}
_ => Ok(Distribution::Fedora),
}
} else {
Ok(Distribution::Fedora)
};
}
Some("nilrt") => Distribution::NILRT,
Some("nobara") => Distribution::Nobara,
Some("void") => Distribution::Void,
Some("debian") | Some("pureos") | Some("Deepin") => Distribution::Debian,
Some("debian") | Some("pureos") | Some("Deepin") | Some("linuxmint") => Distribution::Debian,
Some("arch") | Some("manjaro-arm") | Some("garuda") | Some("artix") => Distribution::Arch,
Some("solus") => Distribution::Solus,
Some("gentoo") => Distribution::Gentoo,
Some("gentoo") | Some("funtoo") => Distribution::Gentoo,
Some("exherbo") => Distribution::Exherbo,
Some("nixos") => Distribution::NixOS,
Some("opensuse-microos") => Distribution::SuseMicro,
@@ -129,10 +136,12 @@ impl Distribution {
}
pub fn upgrade(self, ctx: &ExecutionContext) -> Result<()> {
print_separator("System update");
print_separator(t!("System update"));
match self {
Distribution::Alpine => upgrade_alpine_linux(ctx),
Distribution::Chimera => upgrade_chimera_linux(ctx),
Distribution::Wolfi => upgrade_wolfi_linux(ctx),
Distribution::Arch => archlinux::upgrade_arch_linux(ctx),
Distribution::CentOS | Distribution::Fedora => upgrade_redhat(ctx),
Distribution::FedoraImmutable => upgrade_fedora_immutable(ctx),
@@ -151,6 +160,8 @@ impl Distribution {
Distribution::Bedrock => update_bedrock(ctx),
Distribution::OpenMandriva => upgrade_openmandriva(ctx),
Distribution::PCLinuxOS => upgrade_pclinuxos(ctx),
Distribution::Nobara => upgrade_nobara(ctx),
Distribution::NILRT => upgrade_nilrt(ctx),
}
}
@@ -166,7 +177,7 @@ impl Distribution {
}
fn update_bedrock(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type().execute(sudo).args(["brl", "update"]);
@@ -177,7 +188,7 @@ fn update_bedrock(ctx: &ExecutionContext) -> Result<()> {
debug!("Bedrock distribution {}", distribution);
match distribution {
"arch" => archlinux::upgrade_arch_linux(ctx)?,
"debian" | "ubuntu" => upgrade_debian(ctx)?,
"debian" | "ubuntu" | "linuxmint" => upgrade_debian(ctx)?,
"centos" | "fedora" => upgrade_redhat(ctx)?,
"bedrock" => upgrade_bedrock_strata(ctx)?,
_ => {
@@ -189,22 +200,37 @@ fn update_bedrock(ctx: &ExecutionContext) -> Result<()> {
Ok(())
}
fn is_wsl() -> Result<bool> {
let output = Command::new("uname").arg("-r").output_checked_utf8()?.stdout;
debug!("Uname output: {}", output);
Ok(output.contains("microsoft"))
}
fn upgrade_alpine_linux(ctx: &ExecutionContext) -> Result<()> {
let apk = require("apk")?;
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type().execute(sudo).arg(&apk).arg("update").status_checked()?;
ctx.run_type().execute(sudo).arg(&apk).arg("upgrade").status_checked()
}
fn upgrade_chimera_linux(ctx: &ExecutionContext) -> Result<()> {
let apk = require("apk")?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type().execute(sudo).arg(&apk).arg("update").status_checked()?;
ctx.run_type().execute(sudo).arg(&apk).arg("upgrade").status_checked()
}
fn upgrade_wolfi_linux(ctx: &ExecutionContext) -> Result<()> {
let apk = require("apk")?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type().execute(sudo).arg(&apk).arg("update").status_checked()?;
ctx.run_type().execute(sudo).arg(&apk).arg("upgrade").status_checked()
}
fn upgrade_redhat(ctx: &ExecutionContext) -> Result<()> {
if let Some(ostree) = which("rpm-ostree") {
if let Some(bootc) = which("bootc") {
if ctx.config().bootc() {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
return ctx.run_type().execute(sudo).arg(&bootc).arg("upgrade").status_checked();
}
} else if let Some(ostree) = which("rpm-ostree") {
if ctx.config().rpm_ostree() {
let mut command = ctx.run_type().execute(ostree);
command.arg("upgrade");
@@ -212,7 +238,7 @@ fn upgrade_redhat(ctx: &ExecutionContext) -> Result<()> {
}
};
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut command = ctx.run_type().execute(sudo);
command
.arg(which("dnf").unwrap_or_else(|| Path::new("yum").to_path_buf()))
@@ -234,7 +260,56 @@ fn upgrade_redhat(ctx: &ExecutionContext) -> Result<()> {
Ok(())
}
fn upgrade_nobara(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let pkg_manager = require("dnf")?;
let mut update_command = ctx.run_type().execute(sudo);
update_command.arg(&pkg_manager);
if ctx.config().yes(Step::System) {
update_command.arg("-y");
}
update_command.arg("update");
// See https://nobaraproject.org/docs/upgrade-troubleshooting/how-do-i-update-the-system/
update_command.args([
"rpmfusion-nonfree-release",
"rpmfusion-free-release",
"fedora-repos",
"nobara-repos",
]);
update_command.arg("--refresh").status_checked()?;
let mut upgrade_command = ctx.run_type().execute(sudo);
upgrade_command.arg(&pkg_manager);
if ctx.config().yes(Step::System) {
upgrade_command.arg("-y");
}
upgrade_command.arg("distro-sync");
upgrade_command.status_checked()?;
Ok(())
}
fn upgrade_nilrt(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let opkg = require("opkg")?;
ctx.run_type().execute(sudo).arg(&opkg).arg("update").status_checked()?;
ctx.run_type().execute(sudo).arg(&opkg).arg("upgrade").status_checked()
}
fn upgrade_fedora_immutable(ctx: &ExecutionContext) -> Result<()> {
if let Some(bootc) = which("bootc") {
if ctx.config().bootc() {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
return ctx.run_type().execute(sudo).arg(&bootc).arg("upgrade").status_checked();
}
}
let ostree = require("rpm-ostree")?;
let mut command = ctx.run_type().execute(ostree);
command.arg("upgrade");
@@ -243,14 +318,14 @@ fn upgrade_fedora_immutable(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_bedrock_strata(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type().execute(sudo).args(["brl", "update"]).status_checked()?;
Ok(())
}
fn upgrade_suse(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type()
.execute(sudo)
.args(["zypper", "refresh"])
@@ -273,7 +348,7 @@ fn upgrade_suse(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_opensuse_tumbleweed(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type()
.execute(sudo)
.args(["zypper", "refresh"])
@@ -291,7 +366,7 @@ fn upgrade_opensuse_tumbleweed(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_suse_micro(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut cmd = ctx.run_type().execute(sudo);
cmd.arg("transactional-update");
if ctx.config().yes(Step::System) {
@@ -304,10 +379,10 @@ fn upgrade_suse_micro(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_openmandriva(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut command = ctx.run_type().execute(sudo);
command.arg(&which("dnf").unwrap()).arg("upgrade");
command.arg(which("dnf").unwrap()).arg("upgrade");
if let Some(args) = ctx.config().dnf_arguments() {
command.args(args.split_whitespace());
@@ -323,10 +398,10 @@ fn upgrade_openmandriva(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_pclinuxos(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut command_update = ctx.run_type().execute(sudo);
command_update.arg(&which("apt-get").unwrap()).arg("update");
command_update.arg(which("apt-get").unwrap()).arg("update");
if let Some(args) = ctx.config().dnf_arguments() {
command_update.args(args.split_whitespace());
@@ -339,7 +414,7 @@ fn upgrade_pclinuxos(ctx: &ExecutionContext) -> Result<()> {
command_update.status_checked()?;
let mut cmd = ctx.run_type().execute(sudo);
cmd.arg(&which("apt-get").unwrap());
cmd.arg(which("apt-get").unwrap());
cmd.arg("dist-upgrade");
if ctx.config().yes(Step::System) {
cmd.arg("-y");
@@ -370,7 +445,7 @@ fn upgrade_vanilla(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_void(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut command = ctx.run_type().execute(sudo);
command.args(["xbps-install", "-Su", "xbps"]);
if ctx.config().yes(Step::System) {
@@ -391,7 +466,7 @@ fn upgrade_void(ctx: &ExecutionContext) -> Result<()> {
fn upgrade_gentoo(ctx: &ExecutionContext) -> Result<()> {
let run_type = ctx.run_type();
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
if let Some(layman) = which("layman") {
run_type
.execute(sudo)
@@ -400,17 +475,22 @@ fn upgrade_gentoo(ctx: &ExecutionContext) -> Result<()> {
.status_checked()?;
}
println!("Syncing portage");
run_type
.execute(sudo)
.args(["emerge", "--sync"])
.args(
ctx.config()
.emerge_sync_flags()
.map(|s| s.split_whitespace().collect())
.unwrap_or_else(|| vec!["-q"]),
)
.status_checked()?;
println!("{}", t!("Syncing portage"));
if let Some(ego) = which("ego") {
// The Funtoo team doesn't recommend running both `ego sync` and `emerge --sync`
run_type.execute(sudo).arg(ego).arg("sync").status_checked()?;
} else {
run_type
.execute(sudo)
.args(["emerge", "--sync"])
.args(
ctx.config()
.emerge_sync_flags()
.map(|s| s.split_whitespace().collect())
.unwrap_or_else(|| vec!["-q"]),
)
.status_checked()?;
}
if let Some(eix_update) = which("eix-update") {
run_type.execute(sudo).arg(eix_update).status_checked()?;
@@ -461,7 +541,7 @@ fn upgrade_debian(ctx: &ExecutionContext) -> Result<()> {
return Ok(());
}
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
if !is_nala {
ctx.run_type()
.execute(sudo)
@@ -515,7 +595,7 @@ pub fn run_deb_get(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_solus(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut cmd = ctx.run_type().execute(sudo);
cmd.arg("eopkg");
if ctx.config().yes(Step::System) {
@@ -624,7 +704,7 @@ pub fn run_packer_nu(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_clearlinux(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut cmd = ctx.run_type().execute(sudo);
cmd.args(["swupd", "update"]);
if ctx.config().yes(Step::System) {
@@ -636,7 +716,7 @@ fn upgrade_clearlinux(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_exherbo(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
ctx.run_type().execute(sudo).args(["cave", "sync"]).status_checked()?;
ctx.run_type()
@@ -665,7 +745,7 @@ fn upgrade_exherbo(ctx: &ExecutionContext) -> Result<()> {
}
fn upgrade_nixos(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let mut command = ctx.run_type().execute(sudo);
command.args(["/run/current-system/sw/bin/nixos-rebuild", "switch", "--upgrade"]);
@@ -691,7 +771,7 @@ fn upgrade_neon(ctx: &ExecutionContext) -> Result<()> {
// seems rare
// if that comes up we need to create a Distribution::PackageKit or some such
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let pkcon = which("pkcon").unwrap();
// pkcon ignores update with update and refresh provided together
ctx.run_type()
@@ -720,7 +800,7 @@ fn upgrade_neon(ctx: &ExecutionContext) -> Result<()> {
/// alternative
fn should_skip_needrestart() -> Result<()> {
let distribution = Distribution::detect()?;
let msg = "needrestart will be ran by the package manager";
let msg = t!("needrestart will be ran by the package manager");
if distribution.redhat_based() {
return Err(SkipStep(String::from(msg)).into());
@@ -755,12 +835,12 @@ fn should_skip_needrestart() -> Result<()> {
}
pub fn run_needrestart(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let needrestart = require("needrestart")?;
should_skip_needrestart()?;
print_separator("Check for needed restarts");
print_separator(t!("Check for needed restarts"));
ctx.run_type().execute(sudo).arg(needrestart).status_checked()?;
@@ -771,10 +851,10 @@ pub fn run_fwupdmgr(ctx: &ExecutionContext) -> Result<()> {
let fwupdmgr = require("fwupdmgr")?;
if is_wsl()? {
return Err(SkipStep(String::from("Should not run in WSL")).into());
return Err(SkipStep(t!("Should not run in WSL").to_string()).into());
}
print_separator("Firmware upgrades");
print_separator(t!("Firmware upgrades"));
ctx.run_type()
.execute(&fwupdmgr)
@@ -796,7 +876,7 @@ pub fn run_fwupdmgr(ctx: &ExecutionContext) -> Result<()> {
pub fn run_flatpak(ctx: &ExecutionContext) -> Result<()> {
let flatpak = require("flatpak")?;
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let cleanup = ctx.config().cleanup();
let yes = ctx.config().yes(Step::Flatpak);
let run_type = ctx.run_type();
@@ -816,7 +896,7 @@ pub fn run_flatpak(ctx: &ExecutionContext) -> Result<()> {
run_type.execute(&flatpak).args(&cleanup_args).status_checked()?;
}
print_separator("Flatpak System Packages");
print_separator(t!("Flatpak System Packages"));
if ctx.config().flatpak_use_sudo() || std::env::var("SSH_CLIENT").is_ok() {
let mut update_args = vec!["update", "--system"];
if yes {
@@ -857,11 +937,11 @@ pub fn run_flatpak(ctx: &ExecutionContext) -> Result<()> {
}
pub fn run_snap(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let snap = require("snap")?;
if !PathBuf::from("/var/snapd.socket").exists() && !PathBuf::from("/run/snapd.socket").exists() {
return Err(SkipStep(String::from("Snapd socket does not exist")).into());
return Err(SkipStep(t!("Snapd socket does not exist").to_string()).into());
}
print_separator("snap");
@@ -869,7 +949,7 @@ pub fn run_snap(ctx: &ExecutionContext) -> Result<()> {
}
pub fn run_pihole_update(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let pihole = require("pihole")?;
Path::new("/opt/pihole/update.sh").require()?;
@@ -903,7 +983,7 @@ pub fn run_distrobox_update(ctx: &ExecutionContext) -> Result<()> {
) {
(r, Some(c)) => {
if c.is_empty() {
return Err(SkipStep("You need to specify at least one container".to_string()).into());
return Err(SkipStep(t!("You need to specify at least one container").to_string()).into());
}
r.args(c)
}
@@ -918,7 +998,7 @@ pub fn run_distrobox_update(ctx: &ExecutionContext) -> Result<()> {
}
pub fn run_dkp_pacman_update(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let dkp_pacman = require("dkp-pacman")?;
print_separator("Devkitpro pacman");
@@ -941,20 +1021,20 @@ pub fn run_dkp_pacman_update(ctx: &ExecutionContext) -> Result<()> {
}
pub fn run_config_update(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
if ctx.config().yes(Step::ConfigUpdate) {
return Err(SkipStep("Skipped in --yes".to_string()).into());
return Err(SkipStep(t!("Skipped in --yes").to_string()).into());
}
if let Ok(etc_update) = require("etc-update") {
print_separator("Configuration update");
print_separator(t!("Configuration update"));
ctx.run_type().execute(sudo).arg(etc_update).status_checked()?;
} else if let Ok(pacdiff) = require("pacdiff") {
if std::env::var("DIFFPROG").is_err() {
require("vim")?;
}
print_separator("Configuration update");
print_separator(t!("Configuration update"));
ctx.execute_elevated(&pacdiff, false)?.status_checked()?;
}
@@ -977,6 +1057,67 @@ pub fn run_lure_update(ctx: &ExecutionContext) -> Result<()> {
exe.status_checked()
}
pub fn run_waydroid(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let waydroid = require("waydroid")?;
let status = ctx.run_type().execute(&waydroid).arg("status").output_checked_utf8()?;
// example output of `waydroid status`:
//
// ```sh
// $ waydroid status
// Session: RUNNING
// Container: RUNNING
// Vendor type: MAINLINE
// IP address: 192.168.240.112
// Session user: w568w(1000)
// Wayland display: wayland-0
// ```
//
// ```sh
// $ waydroid status
// Session: STOPPED
// Vendor type: MAINLINE
// ```
let session = status
.stdout
.lines()
.find(|line| line.contains("Session:"))
.unwrap_or_else(|| panic!("the output of `waydroid status` should contain `Session:`"));
let is_container_running = session.contains("RUNNING");
let assume_yes = ctx.config().yes(Step::Waydroid);
print_separator("Waydroid");
if is_container_running && !assume_yes {
let update_allowed = prompt_yesno(&t!(
"Going to execute `waydroid upgrade`, which would STOP the running container, is this ok?"
))?;
if !update_allowed {
return Err(
SkipStep(t!("Skip the Waydroid step because the user don't want to proceed").to_string()).into(),
);
}
}
ctx.run_type()
.execute(sudo)
.arg(&waydroid)
.arg("upgrade")
.status_checked()
}
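// A minimal sketch of the `waydroid status` parsing above: the "Session:"
// line is located in the output and the container counts as running when it
// contains "RUNNING". Unlike the real step, which panics if the line is
// missing, this helper simply reports "not running" in that case.
fn waydroid_session_running(status_stdout: &str) -> bool {
    status_stdout
        .lines()
        .find(|line| line.contains("Session:"))
        .map(|line| line.contains("RUNNING"))
        .unwrap_or(false)
}
// waydroid_session_running("Session:\tRUNNING\nContainer:\tRUNNING\n") == true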
pub fn run_auto_cpufreq(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
let auto_cpu_freq = require("auto-cpufreq")?;
print_separator("auto-cpufreq");
ctx.run_type()
.execute(sudo)
.arg(auto_cpu_freq)
.arg("--update")
.status_checked()
}
#[cfg(test)]
mod tests {
use super::*;
@@ -989,6 +1130,11 @@ mod tests {
);
}
#[test]
fn test_wolfi() {
test_template(include_str!("os_release/wolfi"), Distribution::Wolfi);
}
#[test]
fn test_arch_linux() {
test_template(include_str!("os_release/arch"), Distribution::Arch);
@@ -1049,6 +1195,11 @@ mod tests {
test_template(include_str!("os_release/fedorakinoite"), Distribution::FedoraImmutable);
test_template(include_str!("os_release/fedoraonyx"), Distribution::FedoraImmutable);
test_template(include_str!("os_release/fedorasericea"), Distribution::FedoraImmutable);
test_template(include_str!("os_release/fedoraiot"), Distribution::FedoraImmutable);
test_template(
include_str!("os_release/fedoraswayatomic"),
Distribution::FedoraImmutable,
);
}
#[test]
@@ -1066,6 +1217,11 @@ mod tests {
test_template(include_str!("os_release/gentoo"), Distribution::Gentoo);
}
#[test]
fn test_funtoo() {
test_template(include_str!("os_release/funtoo"), Distribution::Gentoo);
}
#[test]
fn test_exherbo() {
test_template(include_str!("os_release/exherbo"), Distribution::Exherbo);
@@ -1120,4 +1276,14 @@ mod tests {
fn test_solus() {
test_template(include_str!("os_release/solus"), Distribution::Solus);
}
#[test]
fn test_nobara() {
test_template(include_str!("os_release/nobara"), Distribution::Nobara);
}
#[test]
fn test_nilrt() {
test_template(include_str!("os_release/nilrt"), Distribution::NILRT);
}
}

View File

@@ -1,16 +1,18 @@
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
use crate::terminal::{print_separator, prompt_yesno};
use crate::utils::{require_option, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require_option};
use crate::{utils::require, Step};
use color_eyre::eyre::Result;
use rust_i18n::t;
use std::collections::HashSet;
use std::fs;
use std::process::Command;
use tracing::debug;
pub fn run_macports(ctx: &ExecutionContext) -> Result<()> {
require("port")?;
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator("MacPorts");
ctx.run_type()
@@ -33,25 +35,25 @@ pub fn run_macports(ctx: &ExecutionContext) -> Result<()> {
pub fn run_mas(ctx: &ExecutionContext) -> Result<()> {
let mas = require("mas")?;
print_separator("macOS App Store");
print_separator(t!("macOS App Store"));
ctx.run_type().execute(mas).arg("upgrade").status_checked()
}
pub fn upgrade_macos(ctx: &ExecutionContext) -> Result<()> {
print_separator("macOS system update");
print_separator(t!("macOS system update"));
let should_ask = !(ctx.config().yes(Step::System)) || (ctx.config().dry_run());
let should_ask = !(ctx.config().yes(Step::System) || ctx.config().dry_run());
if should_ask {
println!("Finding available software");
println!("{}", t!("Finding available software"));
if system_update_available()? {
let answer = prompt_yesno("A system update is available. Do you wish to install it?")?;
let answer = prompt_yesno(t!("A system update is available. Do you wish to install it?").as_ref())?;
if !answer {
return Ok(());
}
println!();
} else {
println!("No new software available.");
println!("{}", t!("No new software available."));
return Ok(());
}
}
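// A quick check of the operator-grouping fix above: with the old expression
// `!(yes) || dry_run`, a `--dry-run` run still prompted even when `--yes` was
// set; the corrected `!(yes || dry_run)` suppresses the prompt whenever
// either flag is set.
#[cfg(test)]
mod should_ask_grouping {
    #[test]
    fn yes_or_dry_run_suppresses_prompt() {
        let (yes, dry_run) = (true, true);
        assert_eq!(!(yes) || dry_run, true); // old grouping: still asks
        assert_eq!(!(yes || dry_run), false); // new grouping: prompt skipped
    }
}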
@@ -93,3 +95,150 @@ pub fn run_sparkle(ctx: &ExecutionContext) -> Result<()> {
}
Ok(())
}
pub fn update_xcodes(ctx: &ExecutionContext) -> Result<()> {
let xcodes = require("xcodes")?;
print_separator("Xcodes");
let should_ask = !(ctx.config().yes(Step::Xcodes) || ctx.config().dry_run());
let releases = ctx
.run_type()
.execute(&xcodes)
.args(["update"])
.output_checked_utf8()?
.stdout;
let releases_installed: Vec<String> = releases
.lines()
.filter(|r| r.contains("(Installed)"))
.map(String::from)
.collect();
if releases_installed.is_empty() {
println!("{}", t!("No Xcode releases installed."));
return Ok(());
}
let (installed_gm, installed_beta, installed_regular) =
releases_installed
.iter()
.fold((false, false, false), |(gm, beta, regular), release| {
(
gm || release.contains("GM") || release.contains("Release Candidate"),
beta || release.contains("Beta"),
regular
|| !(release.contains("GM")
|| release.contains("Release Candidate")
|| release.contains("Beta")),
)
});
let releases_gm = releases
.lines()
.filter(|&r| r.matches("GM").count() > 0 || r.matches("Release Candidate").count() > 0)
.map(String::from)
.collect();
let releases_beta = releases
.lines()
.filter(|&r| r.matches("Beta").count() > 0)
.map(String::from)
.collect();
let releases_regular = releases
.lines()
.filter(|&r| {
r.matches("GM").count() == 0
&& r.matches("Release Candidate").count() == 0
&& r.matches("Beta").count() == 0
})
.map(String::from)
.collect();
if installed_gm {
process_xcodes_releases(releases_gm, should_ask, ctx)?;
}
if installed_beta {
process_xcodes_releases(releases_beta, should_ask, ctx)?;
}
if installed_regular {
process_xcodes_releases(releases_regular, should_ask, ctx)?;
}
let releases_new = ctx
.run_type()
.execute(&xcodes)
.args(["list"])
.output_checked_utf8()?
.stdout;
let releases_gm_new_installed: HashSet<_> = releases_new
.lines()
.filter(|release| {
release.contains("(Installed)") && (release.contains("GM") || release.contains("Release Candidate"))
})
.collect();
let releases_beta_new_installed: HashSet<_> = releases_new
.lines()
.filter(|release| release.contains("(Installed)") && release.contains("Beta"))
.collect();
let releases_regular_new_installed: HashSet<_> = releases_new
.lines()
.filter(|release| {
release.contains("(Installed)")
&& !(release.contains("GM") || release.contains("Release Candidate") || release.contains("Beta"))
})
.collect();
for releases_new_installed in [
releases_gm_new_installed,
releases_beta_new_installed,
releases_regular_new_installed,
] {
if should_ask && releases_new_installed.len() == 2 {
let answer_uninstall =
prompt_yesno(t!("Would you like to move the former Xcode release to the trash?").as_ref())?;
if answer_uninstall {
let _ = ctx
.run_type()
.execute(&xcodes)
.args([
"uninstall",
releases_new_installed.iter().next().cloned().unwrap_or_default(),
])
.status_checked();
}
}
}
Ok(())
}
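// A condensed sketch of the channel split in `update_xcodes` above: each line
// from `xcodes update` is bucketed as GM/Release Candidate, Beta, or a regular
// release, and only the channels that already have an installed release are
// offered for update. The enum is hypothetical; the real step tracks three
// booleans and three filtered `Vec<String>`s instead.
#[derive(PartialEq)]
enum XcodeChannel {
    GmOrRc,
    Beta,
    Regular,
}

fn classify_release(release: &str) -> XcodeChannel {
    if release.contains("GM") || release.contains("Release Candidate") {
        XcodeChannel::GmOrRc
    } else if release.contains("Beta") {
        XcodeChannel::Beta
    } else {
        XcodeChannel::Regular
    }
}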
pub fn process_xcodes_releases(releases_filtered: Vec<String>, should_ask: bool, ctx: &ExecutionContext) -> Result<()> {
let xcodes = require("xcodes")?;
if releases_filtered
.last()
.map(|s| !s.contains("(Installed)"))
.unwrap_or(true)
&& !releases_filtered.is_empty()
{
println!(
"{} {}",
t!("New Xcode release detected:"),
releases_filtered.last().cloned().unwrap_or_default()
);
if should_ask {
let answer_install = prompt_yesno(t!("Would you like to install it?").as_ref())?;
if answer_install {
let _ = ctx
.run_type()
.execute(xcodes)
.args(["install", &releases_filtered.last().cloned().unwrap_or_default()])
.status_checked();
}
println!();
}
}
Ok(())
}

View File

@@ -1,23 +1,65 @@
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
use crate::terminal::print_separator;
use crate::utils::{require_option, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require_option};
use color_eyre::eyre::Result;
use std::path::PathBuf;
use rust_i18n::t;
use std::fs;
fn is_openbsd_current(ctx: &ExecutionContext) -> Result<bool> {
let motd_content = fs::read_to_string("/etc/motd")?;
let is_current = motd_content.contains("-current");
if ctx.config().dry_run() {
println!("{}", t!("Would check if OpenBSD is -current"));
Ok(is_current)
} else {
Ok(is_current)
}
}
pub fn upgrade_openbsd(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
print_separator("OpenBSD Update");
ctx.run_type()
.execute(sudo)
.args(&["/usr/sbin/sysupgrade", "-n"])
.status_checked()
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator(t!("OpenBSD Update"));
let is_current = is_openbsd_current(ctx)?;
if ctx.config().dry_run() {
println!("{}", t!("Would upgrade the OpenBSD system"));
return Ok(());
}
let mut args = vec!["/usr/sbin/sysupgrade", "-n"];
if is_current {
args.push("-s");
}
ctx.run_type().execute(sudo).args(&args).status_checked()
}
pub fn upgrade_packages(ctx: &ExecutionContext) -> Result<()> {
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
print_separator("OpenBSD Packages");
ctx.run_type()
.execute(sudo)
.args(&["/usr/sbin/pkg_add", "-u"])
.status_checked()
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator(t!("OpenBSD Packages"));
let is_current = is_openbsd_current(ctx)?;
if ctx.config().dry_run() {
println!("{}", t!("Would upgrade OpenBSD packages"));
return Ok(());
}
if ctx.config().cleanup() {
ctx.run_type()
.execute(sudo)
.args(["/usr/sbin/pkg_delete", "-ac"])
.status_checked()?;
}
let mut args = vec!["/usr/sbin/pkg_add", "-u"];
if is_current {
args.push("-Dsnap");
}
ctx.run_type().execute(sudo).args(&args).status_checked()?;
Ok(())
}
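// A compact view of the -current handling above: on a -current system (as
// detected from /etc/motd), `sysupgrade` gains `-s` to follow snapshots and
// `pkg_add` gains `-Dsnap`; on a -release system both run without the extra flag.
fn openbsd_upgrade_args(is_current: bool) -> (Vec<&'static str>, Vec<&'static str>) {
    let mut sysupgrade = vec!["/usr/sbin/sysupgrade", "-n"];
    let mut pkg_add = vec!["/usr/sbin/pkg_add", "-u"];
    if is_current {
        sysupgrade.push("-s");
        pkg_add.push("-Dsnap");
    }
    (sysupgrade, pkg_add)
}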

View File

@@ -0,0 +1,22 @@
NAME="Fedora Linux"
VERSION="39.20240415.0 (IoT Edition)"
ID=fedora
VERSION_ID=39
VERSION_CODENAME=""
PLATFORM_ID="platform:f39"
PRETTY_NAME="Fedora Linux 39.20240415.0 (IoT Edition)"
ANSI_COLOR="0;38;2;60;110;180"
LOGO=fedora-logo-icon
CPE_NAME="cpe:/o:fedoraproject:fedora:39"
HOME_URL="https://fedoraproject.org/"
DOCUMENTATION_URL="https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/"
SUPPORT_URL="https://ask.fedoraproject.org/"
BUG_REPORT_URL="https://bugzilla.redhat.com/"
REDHAT_BUGZILLA_PRODUCT="Fedora"
REDHAT_BUGZILLA_PRODUCT_VERSION=39
REDHAT_SUPPORT_PRODUCT="Fedora"
REDHAT_SUPPORT_PRODUCT_VERSION=39
SUPPORT_END=2024-11-12
VARIANT="IoT Edition"
VARIANT_ID=iot
OSTREE_VERSION='39.20240415.0'

View File

@@ -0,0 +1,23 @@
NAME="Fedora Linux"
VERSION="40.20240426.0 (Sway Atomic)"
ID=fedora
VERSION_ID=40
VERSION_CODENAME=""
PLATFORM_ID="platform:f40"
PRETTY_NAME="Fedora Linux 40.20240426.0 (Sway Atomic)"
ANSI_COLOR="0;38;2;60;110;180"
LOGO=fedora-logo-icon
CPE_NAME="cpe:/o:fedoraproject:fedora:40"
DEFAULT_HOSTNAME="fedora"
HOME_URL="https://fedoraproject.org/atomic-desktops/sway/"
DOCUMENTATION_URL="https://docs.fedoraproject.org/en-US/fedora-sericea/"
SUPPORT_URL="https://ask.fedoraproject.org/"
BUG_REPORT_URL="https://gitlab.com/fedora/sigs/sway/SIG/-/issues"
REDHAT_BUGZILLA_PRODUCT="Fedora"
REDHAT_BUGZILLA_PRODUCT_VERSION=40
REDHAT_SUPPORT_PRODUCT="Fedora"
REDHAT_SUPPORT_PRODUCT_VERSION=40
SUPPORT_END=2025-05-13
VARIANT="Sway Atomic"
VARIANT_ID=sway-atomic
OSTREE_VERSION='40.20240426.0'

View File

@@ -0,0 +1,6 @@
ID="funtoo"
NAME="Funtoo"
PRETTY_NAME="Funtoo Linux"
ANSI_COLOR="0;34"
HOME_URL="https://www.funtoo.org"
BUG_REPORT_URL="https://bugs.funtoo.org"

View File

@@ -0,0 +1,8 @@
ID=nilrt
NAME="NI Linux Real-Time"
VERSION="10.0 (kirkstone)"
VERSION_ID=10.0
PRETTY_NAME="NI Linux Real-Time 10.0 (kirkstone)"
DISTRO_CODENAME="kirkstone"
BUILD_ID="23.8.0f153-x64"
VERSION_CODENAME="kirkstone"

View File

@@ -0,0 +1,23 @@
NAME="Nobara Linux"
VERSION="39 (GNOME Edition)"
ID=nobara
ID_LIKE="rhel centos fedora"
VERSION_ID=39
VERSION_CODENAME=""
PLATFORM_ID="platform:f39"
PRETTY_NAME="Nobara Linux 39 (GNOME Edition)"
ANSI_COLOR="0;38;2;60;110;180"
LOGO=nobara-logo-icon
CPE_NAME="cpe:/o:nobaraproject:nobara:39"
DEFAULT_HOSTNAME="nobara"
HOME_URL="https://nobaraproject.org/"
DOCUMENTATION_URL="https://www.nobaraproject.org/"
SUPPORT_URL="https://www.nobaraproject.org/"
BUG_REPORT_URL="https://gitlab.com/gloriouseggroll/nobara-images"
REDHAT_BUGZILLA_PRODUCT="Nobara"
REDHAT_BUGZILLA_PRODUCT_VERSION=39
REDHAT_SUPPORT_PRODUCT="Nobara"
REDHAT_SUPPORT_PRODUCT_VERSION=39
SUPPORT_END=2024-05-14
VARIANT="GNOME Edition"
VARIANT_ID=gnome

View File

@@ -0,0 +1,5 @@
ID=wolfi
NAME="Wolfi"
PRETTY_NAME="Wolfi"
VERSION_ID="20230201"
HOME_URL="https://wolfi.dev"
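
These os-release fixtures are inputs to distribution detection. As a rough standalone sketch (not the project's detection code), the `ID` and `VARIANT_ID` fields of such a file can be extracted with the standard library alone:

use std::collections::HashMap;

/// Parse KEY=value pairs from an os-release style document, stripping
/// optional single or double quotes around the value.
fn parse_os_release(contents: &str) -> HashMap<String, String> {
    contents
        .lines()
        .filter_map(|line| line.split_once('='))
        .map(|(key, value)| {
            let value = value.trim().trim_matches(|c: char| c == '"' || c == '\'');
            (key.trim().to_string(), value.to_string())
        })
        .collect()
}

fn main() {
    let sample = "ID=fedora\nVARIANT_ID=iot\nOSTREE_VERSION='39.20240415.0'\n";
    let fields = parse_os_release(sample);
    println!("id = {:?}", fields.get("ID"));
    println!("variant = {:?}", fields.get("VARIANT_ID"));
}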

View File

@@ -13,6 +13,12 @@ use color_eyre::eyre::Context;
use color_eyre::eyre::Result;
use home;
use ini::Ini;
use lazy_static::lazy_static;
#[cfg(target_os = "linux")]
use nix::unistd::Uid;
use regex::Regex;
use rust_i18n::t;
use semver::Version;
use tracing::debug;
#[cfg(target_os = "linux")]
@@ -24,7 +30,7 @@ use crate::executor::Executor;
#[cfg(any(target_os = "linux", target_os = "macos"))]
use crate::executor::RunType;
use crate::terminal::print_separator;
use crate::utils::{require, require_option, PathExt, REQUIRE_SUDO};
use crate::utils::{get_require_sudo_string, require, require_option, PathExt};
#[cfg(any(target_os = "linux", target_os = "macos"))]
const INTEL_BREW: &str = "/usr/local/bin/brew";
@@ -98,19 +104,19 @@ pub fn run_fisher(ctx: &ExecutionContext) -> Result<()> {
.args(["-c", "type -t fisher"])
.output_checked_utf8()
.map(|_| ())
.map_err(|_| SkipStep("`fisher` is not defined in `fish`".to_owned()))?;
.map_err(|_| SkipStep(t!("`fisher` is not defined in `fish`").to_string()))?;
Command::new(&fish)
.args(["-c", "echo \"$__fish_config_dir/fish_plugins\""])
.output_checked_utf8()
.and_then(|output| Path::new(&output.stdout.trim()).require().map(|_| ()))
.map_err(|err| SkipStep(format!("`fish_plugins` path doesn't exist: {err}")))?;
.map_err(|err| SkipStep(t!("`fish_plugins` path doesn't exist: {err}", err = err).to_string()))?;
Command::new(&fish)
.args(["-c", "fish_update_completions"])
.output_checked_utf8()
.map(|_| ())
.map_err(|_| SkipStep("`fish_update_completions` is not available".to_owned()))?;
.map_err(|_| SkipStep(t!("`fish_update_completions` is not available").to_string()))?;
print_separator("Fisher");
@@ -177,7 +183,7 @@ pub fn run_oh_my_fish(ctx: &ExecutionContext) -> Result<()> {
pub fn run_pkgin(ctx: &ExecutionContext) -> Result<()> {
let pkgin = require("pkgin")?;
let sudo = require_option(ctx.sudo().as_ref(), REQUIRE_SUDO.to_string())?;
let sudo = require_option(ctx.sudo().as_ref(), get_require_sudo_string())?;
print_separator("Pkgin");
@@ -232,7 +238,7 @@ pub fn upgrade_gnome_extensions(ctx: &ExecutionContext) -> Result<()> {
let gdbus = require("gdbus")?;
require_option(
var("XDG_CURRENT_DESKTOP").ok().filter(|p| p.contains("GNOME")),
"Desktop does not appear to be gnome".to_string(),
t!("Desktop does not appear to be gnome").to_string(),
)?;
let output = Command::new("gdbus")
.args([
@@ -249,10 +255,10 @@ pub fn upgrade_gnome_extensions(ctx: &ExecutionContext) -> Result<()> {
debug!("Checking for gnome extensions: {}", output);
if !output.stdout.contains("org.gnome.Shell.Extensions") {
return Err(SkipStep(String::from("Gnome shell extensions are unregistered in DBus")).into());
return Err(SkipStep(t!("Gnome shell extensions are unregistered in DBus").to_string()).into());
}
print_separator("Gnome Shell extensions");
print_separator(t!("Gnome Shell extensions"));
ctx.run_type()
.execute(gdbus)
@@ -269,6 +275,23 @@ pub fn upgrade_gnome_extensions(ctx: &ExecutionContext) -> Result<()> {
.status_checked()
}
#[cfg(target_os = "linux")]
pub fn brew_linux_sudo_uid() -> Option<u32> {
let linuxbrew_directory = "/home/linuxbrew/.linuxbrew";
if let Ok(metadata) = std::fs::metadata(linuxbrew_directory) {
let owner_id = metadata.uid();
let current_id = Uid::effective();
// log these two values for debugging
debug!("linuxbrew_directory owner_id: {}, current_id: {}", owner_id, current_id);
return if owner_id == current_id.as_raw() {
None // no need for sudo if linuxbrew is owned by the current user
} else {
Some(owner_id) // otherwise use sudo to run brew as the owner
};
}
None
}
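
`brew_linux_sudo_uid` compares the owner of `/home/linuxbrew/.linuxbrew` against the effective UID to decide whether brew must be run through sudo as another user. A standalone sketch of that comparison, reusing the `nix` crate the diff already imports (the helper name and printout are mine):

use std::os::unix::fs::MetadataExt;

use nix::unistd::Uid;

/// If the Linuxbrew prefix is owned by someone else, return that owner's UID
/// so brew can be wrapped in `sudo --set-home --user=<owner>`.
fn brew_owner_needing_sudo(prefix: &str) -> Option<u32> {
    let metadata = std::fs::metadata(prefix).ok()?;
    let owner = metadata.uid();
    if owner == Uid::effective().as_raw() {
        None // the prefix is ours, run brew directly
    } else {
        Some(owner)
    }
}

fn main() {
    match brew_owner_needing_sudo("/home/linuxbrew/.linuxbrew") {
        Some(uid) => println!("brew must be run via sudo as uid {uid}"),
        None => println!("brew can be run directly"),
    }
}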
#[cfg(any(target_os = "linux", target_os = "macos"))]
pub fn run_brew_formula(ctx: &ExecutionContext, variant: BrewVariant) -> Result<()> {
#[allow(unused_variables)]
@@ -277,18 +300,50 @@ pub fn run_brew_formula(ctx: &ExecutionContext, variant: BrewVariant) -> Result<
#[cfg(target_os = "macos")]
{
if variant.is_path() && !BrewVariant::is_macos_custom(binary_name) {
return Err(SkipStep("Not a custom brew for macOS".to_string()).into());
return Err(SkipStep(t!("Not a custom brew for macOS").to_string()).into());
}
}
#[cfg(target_os = "linux")]
{
let sudo_uid = brew_linux_sudo_uid();
// if brew is owned by another user, execute "sudo -Hu <uid> brew update"
if let Some(user_id) = sudo_uid {
let uid = nix::unistd::Uid::from_raw(user_id);
let user = nix::unistd::User::from_uid(uid)
.expect("failed to call getpwuid()")
.expect("this user should exist");
let sudo_as_user = t!("sudo as user '{user}'", user = user.name);
print_separator(format!("{} ({})", variant.step_title(), sudo_as_user));
let sudo = crate::utils::require_option(ctx.sudo().as_ref(), crate::utils::get_require_sudo_string())?;
ctx.run_type()
.execute(sudo)
.current_dir("/tmp") // brew needs a writable current directory
.args([
"--set-home",
&format!("--user={}", user.name),
&format!("{}", binary_name.to_string_lossy()),
"update",
])
.status_checked()?;
return Ok(());
}
}
print_separator(variant.step_title());
let run_type = ctx.run_type();
variant.execute(run_type).arg("update").status_checked()?;
variant
.execute(run_type)
.args(["upgrade", "--formula"])
.status_checked()?;
let mut command = variant.execute(run_type);
command.args(["upgrade", "--formula"]);
if ctx.config().brew_fetch_head() {
command.arg("--fetch-HEAD");
}
command.status_checked()?;
if ctx.config().cleanup() {
variant.execute(run_type).arg("cleanup").status_checked()?;
@@ -305,7 +360,7 @@ pub fn run_brew_formula(ctx: &ExecutionContext, variant: BrewVariant) -> Result<
pub fn run_brew_cask(ctx: &ExecutionContext, variant: BrewVariant) -> Result<()> {
let binary_name = require(variant.binary_name())?;
if variant.is_path() && !BrewVariant::is_macos_custom(binary_name) {
return Err(SkipStep("Not a custom brew for macOS".to_string()).into());
return Err(SkipStep(t!("Not a custom brew for macOS").to_string()).into());
}
print_separator(format!("{} - Cask", variant.step_title()));
let run_type = ctx.run_type();
@@ -328,6 +383,12 @@ pub fn run_brew_cask(ctx: &ExecutionContext, variant: BrewVariant) -> Result<()>
if ctx.config().brew_cask_greedy() {
brew_args.push("--greedy");
}
if ctx.config().brew_greedy_latest() {
brew_args.push("--greedy-latest");
}
if ctx.config().brew_greedy_auto_updates() {
brew_args.push("--greedy-auto-updates");
}
}
variant.execute(run_type).args(&brew_args).status_checked()?;
@@ -354,7 +415,7 @@ pub fn run_guix(ctx: &ExecutionContext) -> Result<()> {
if should_upgrade {
return run_type.execute(&guix).args(["package", "-u"]).status_checked();
}
Err(SkipStep(String::from("Guix Pull Failed, Skipping")).into())
Err(SkipStep(t!("Guix Pull Failed, Skipping").to_string()).into())
}
pub fn run_nix(ctx: &ExecutionContext) -> Result<()> {
@@ -374,23 +435,72 @@ pub fn run_nix(ctx: &ExecutionContext) -> Result<()> {
#[cfg(target_os = "macos")]
{
if require("darwin-rebuild").is_ok() {
return Err(SkipStep(String::from(
"Nix-darwin on macOS must be upgraded via darwin-rebuild switch",
))
.into());
return Err(
SkipStep(t!("Nix-darwin on macOS must be upgraded via darwin-rebuild switch").to_string()).into(),
);
}
}
let run_type = ctx.run_type();
run_type.execute(nix_channel).arg("--update").status_checked()?;
let mut get_version_cmd = ctx.run_type().execute(&nix);
get_version_cmd.arg("--version");
let get_version_cmd_output = get_version_cmd.output_checked_utf8()?;
let get_version_cmd_first_line_stdout = get_version_cmd_output
.stdout
.lines()
.next()
.ok_or_else(|| eyre!("`nix --version` output is empty"))?;
let is_lix = get_version_cmd_first_line_stdout.contains("Lix");
debug!(
output=%get_version_cmd_output,
?is_lix,
"`nix --version` output"
);
lazy_static! {
static ref NIX_VERSION_REGEX: Regex =
Regex::new(r#"^nix \([^)]*\) ([0-9.]+)"#).expect("Nix version regex always compiles");
}
if get_version_cmd_first_line_stdout.is_empty() {
return Err(eyre!("`nix --version` output was empty"));
}
let captures = NIX_VERSION_REGEX.captures(get_version_cmd_first_line_stdout);
let raw_version = match &captures {
None => {
return Err(eyre!(
"`nix --version` output was weird: {get_version_cmd_first_line_stdout:?}\n\
If the `nix --version` output format changed, please file an issue to Topgrade"
));
}
Some(captures) => &captures[1],
};
let version =
Version::parse(raw_version).wrap_err_with(|| format!("Unable to parse Nix version: {raw_version:?}"))?;
debug!("Nix version: {:?}", version);
// Nix since 2.21.0 uses `--all --impure` rather than `.*` to upgrade all packages.
// Lix is based on Nix 2.18, so it doesn't!
let packages = if version >= Version::new(2, 21, 0) && !is_lix {
vec!["--all", "--impure"]
} else {
vec![".*"]
};
if Path::new(&manifest_json_path).exists() {
run_type
.execute(nix)
.args(nix_args())
.arg("profile")
.arg("upgrade")
.arg(".*")
.args(&packages)
.arg("--verbose")
.status_checked()
} else {
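
The version handling above decides between `--all --impure` and the old `.*` wildcard by parsing the first line of `nix --version` and checking for a Lix build. A condensed standalone sketch of that parsing step with the same `regex` and `semver` crates (the sample output string and helper name are invented):

use regex::Regex;
use semver::Version;

/// Extract the version from a `nix --version` first line such as
/// "nix (Nix) 2.21.1" and report whether the binary is a Lix build.
fn parse_nix_version(first_line: &str) -> Option<(Version, bool)> {
    let re = Regex::new(r"^nix \([^)]*\) ([0-9.]+)").ok()?;
    let raw = re.captures(first_line)?.get(1)?.as_str();
    Some((Version::parse(raw).ok()?, first_line.contains("Lix")))
}

fn main() {
    let (version, is_lix) = parse_nix_version("nix (Nix) 2.21.1").expect("unexpected output");
    let packages: Vec<&str> = if version >= Version::new(2, 21, 0) && !is_lix {
        vec!["--all", "--impure"]
    } else {
        vec![".*"]
    };
    println!("nix profile upgrade {}", packages.join(" "));
}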
@@ -419,20 +529,16 @@ pub fn run_nix_self_upgrade(ctx: &ExecutionContext) -> Result<()> {
}
if !should_self_upgrade {
return Err(SkipStep(String::from(
"`nix upgrade-nix` can only be used on macOS or non-NixOS Linux",
))
.into());
return Err(SkipStep(t!("`nix upgrade-nix` can only be used on macOS or non-NixOS Linux").to_string()).into());
}
if nix_profile_dir(&nix)?.is_none() {
return Err(SkipStep(String::from(
"`nix upgrade-nix` cannot be run when Nix is installed in a profile",
))
.into());
return Err(
SkipStep(t!("`nix upgrade-nix` cannot be run when Nix is installed in a profile").to_string()).into(),
);
}
print_separator("Nix (self-upgrade)");
print_separator(t!("Nix (self-upgrade)"));
let multi_user = fs::metadata(&nix)?.uid() == 0;
debug!("Multi user nix: {}", multi_user);
@@ -497,7 +603,6 @@ fn nix_profile_dir(nix: &Path) -> Result<Option<PathBuf>> {
}
debug!("Found Nix profile {profile_dir:?}");
let user_env = profile_dir
.canonicalize()
.wrap_err_with(|| format!("Failed to canonicalize {profile_dir:?}"))?;
@@ -543,6 +648,19 @@ pub fn run_asdf(ctx: &ExecutionContext) -> Result<()> {
.status_checked()
}
pub fn run_mise(ctx: &ExecutionContext) -> Result<()> {
let mise = require("mise")?;
print_separator("mise");
ctx.run_type()
.execute(&mise)
.args(["plugins", "update"])
.status_checked()?;
ctx.run_type().execute(&mise).arg("upgrade").status_checked()
}
pub fn run_home_manager(ctx: &ExecutionContext) -> Result<()> {
let home_manager = require("home-manager")?;
@@ -572,6 +690,29 @@ pub fn run_pearl(ctx: &ExecutionContext) -> Result<()> {
ctx.run_type().execute(pearl).arg("update").status_checked()
}
pub fn run_pyenv(ctx: &ExecutionContext) -> Result<()> {
let pyenv = require("pyenv")?;
print_separator("pyenv");
let pyenv_dir = var("PYENV_ROOT")
.map(PathBuf::from)
.unwrap_or_else(|_| HOME_DIR.join(".pyenv"));
if !pyenv_dir.exists() {
return Err(SkipStep(t!("Pyenv is installed, but $PYENV_ROOT is not set correctly").to_string()).into());
}
if !pyenv_dir.join(".git").exists() {
return Err(SkipStep(t!("pyenv is not a git repository").to_string()).into());
}
if !pyenv_dir.join("plugins").join("pyenv-update").exists() {
return Err(SkipStep(t!("pyenv-update plugin is not installed").to_string()).into());
}
ctx.run_type().execute(pyenv).arg("update").status_checked()
}
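
`run_pyenv` validates the pyenv root before invoking `pyenv update`: the directory must exist, be a git checkout, and contain the pyenv-update plugin. A standalone sketch of those three checks with the standard library only, using a plain `String` as the skip reason instead of the project's `SkipStep`:

use std::env;
use std::path::PathBuf;

/// Locate and validate a pyenv installation, returning its root directory.
fn pyenv_root() -> Result<PathBuf, String> {
    let home = env::var("HOME").map(PathBuf::from).map_err(|e| e.to_string())?;
    let root = env::var("PYENV_ROOT")
        .map(PathBuf::from)
        .unwrap_or_else(|_| home.join(".pyenv"));

    if !root.exists() {
        return Err("pyenv is installed, but $PYENV_ROOT is not set correctly".into());
    }
    if !root.join(".git").exists() {
        return Err("pyenv is not a git repository".into());
    }
    if !root.join("plugins").join("pyenv-update").exists() {
        return Err("pyenv-update plugin is not installed".into());
    }
    Ok(root)
}

fn main() {
    match pyenv_root() {
        Ok(root) => println!("would run: {}/bin/pyenv update", root.display()),
        Err(reason) => println!("skipping pyenv step: {reason}"),
    }
}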
pub fn run_sdkman(ctx: &ExecutionContext) -> Result<()> {
let bash = require("bash")?;
@@ -635,21 +776,18 @@ pub fn run_sdkman(ctx: &ExecutionContext) -> Result<()> {
Ok(())
}
pub fn run_bun(ctx: &ExecutionContext) -> Result<()> {
let bun = require("bun")?;
print_separator("Bun");
ctx.run_type().execute(bun).arg("upgrade").status_checked()
}
pub fn run_bun_packages(ctx: &ExecutionContext) -> Result<()> {
let bun = require("bun")?;
print_separator("Bun Packages");
print_separator(t!("Bun Packages"));
if !HOME_DIR.join(".bun/install/global/package.json").exists() {
println!("No global packages installed");
let mut package_json: PathBuf = var("BUN_INSTALL")
.map(PathBuf::from)
.unwrap_or_else(|_| HOME_DIR.join(".bun"));
package_json.push("install/global/package.json");
if !package_json.exists() {
println!("{}", t!("No global packages installed"));
return Ok(());
}
@@ -674,6 +812,7 @@ pub fn run_maza(ctx: &ExecutionContext) -> Result<()> {
}
pub fn reboot() -> Result<()> {
print!("Rebooting...");
print!("{}", t!("Rebooting..."));
Command::new("sudo").arg("reboot").status_checked()
}

View File

@@ -1,4 +1,3 @@
use std::convert::TryFrom;
use std::path::Path;
use std::{ffi::OsStr, process::Command};
@@ -10,8 +9,9 @@ use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
use crate::terminal::{print_separator, print_warning};
use crate::utils::{require, which};
use crate::{error::SkipStep, steps::git::Repositories};
use crate::{error::SkipStep, steps::git::RepoStep};
use crate::{powershell, Step};
use rust_i18n::t;
pub fn run_chocolatey(ctx: &ExecutionContext) -> Result<()> {
let choco = require("choco")?;
@@ -42,11 +42,6 @@ pub fn run_winget(ctx: &ExecutionContext) -> Result<()> {
print_separator("winget");
if !ctx.config().enable_winget() {
print_warning("Winget is disabled by default. Enable it by setting enable_winget=true in the [windows] section in the configuration.");
return Err(SkipStep(String::from("Winget is disabled by default")).into());
}
ctx.run_type()
.execute(winget)
.args(["upgrade", "--all"])
@@ -63,6 +58,10 @@ pub fn run_scoop(ctx: &ExecutionContext) -> Result<()> {
if ctx.config().cleanup() {
ctx.run_type().execute(&scoop).args(["cleanup", "*"]).status_checked()?;
ctx.run_type()
.execute(&scoop)
.args(["cache", "rm", "-a"])
.status_checked()?
}
Ok(())
@@ -70,12 +69,12 @@ pub fn run_scoop(ctx: &ExecutionContext) -> Result<()> {
pub fn update_wsl(ctx: &ExecutionContext) -> Result<()> {
if !is_wsl_installed()? {
return Err(SkipStep("WSL not installed".to_string()).into());
return Err(SkipStep(t!("WSL not installed").to_string()).into());
}
let wsl = require("wsl")?;
print_separator("Update WSL");
print_separator(t!("Update WSL"));
let mut wsl_command = ctx.run_type().execute(wsl);
wsl_command.args(["--update"]);
@@ -128,7 +127,7 @@ fn upgrade_wsl_distribution(wsl: &Path, dist: &str, ctx: &ExecutionContext) -> R
let topgrade = Command::new(wsl)
.args(["-d", dist, "bash", "-lc", "which topgrade"])
.output_checked_utf8()
.map_err(|_| SkipStep(String::from("Could not find Topgrade installed in WSL")))?
.map_err(|_| SkipStep(t!("Could not find Topgrade installed in WSL").to_string()))?
.stdout // The normal output from `which topgrade` appends a newline, so we trim it here.
.trim_end()
.to_owned();
@@ -177,7 +176,7 @@ fn upgrade_wsl_distribution(wsl: &Path, dist: &str, ctx: &ExecutionContext) -> R
pub fn run_wsl_topgrade(ctx: &ExecutionContext) -> Result<()> {
if !is_wsl_installed()? {
return Err(SkipStep("WSL not installed".to_string()).into());
return Err(SkipStep(t!("WSL not installed").to_string()).into());
}
let wsl = require("wsl")?;
@@ -200,27 +199,34 @@ pub fn run_wsl_topgrade(ctx: &ExecutionContext) -> Result<()> {
if ran {
Ok(())
} else {
Err(SkipStep(String::from("Could not find Topgrade in any WSL distribution")).into())
Err(SkipStep(t!("Could not find Topgrade in any WSL distribution").to_string()).into())
}
}
pub fn windows_update(ctx: &ExecutionContext) -> Result<()> {
let powershell = powershell::Powershell::windows_powershell();
print_separator(t!("Windows Update"));
if powershell.supports_windows_update() {
print_separator("Windows Update");
return powershell.windows_update(ctx);
println!("The installer will request to run as administrator, expect a prompt.");
powershell.windows_update(ctx)
} else {
print_warning(t!(
"Consider installing PSWindowsUpdate as the use of Windows Update via USOClient is not supported."
));
Err(SkipStep(t!("USOClient not supported.").to_string()).into())
}
}
let usoclient = require("UsoClient")?;
pub fn microsoft_store(ctx: &ExecutionContext) -> Result<()> {
let powershell = powershell::Powershell::windows_powershell();
print_separator("Windows Update");
println!("Running Windows Update. Check the control panel for progress.");
ctx.run_type()
.execute(&usoclient)
.arg("ScanInstallWait")
.status_checked()?;
ctx.run_type().execute(&usoclient).arg("StartInstall").status_checked()
print_separator(t!("Microsoft Store"));
powershell.microsoft_store(ctx)
}
pub fn reboot() -> Result<()> {
@@ -229,7 +235,7 @@ pub fn reboot() -> Result<()> {
Command::new("shutdown").args(["/R", "/T", "0"]).status_checked()
}
pub fn insert_startup_scripts(git_repos: &mut Repositories) -> Result<()> {
pub fn insert_startup_scripts(git_repos: &mut RepoStep) -> Result<()> {
let startup_dir = crate::WINDOWS_DIRS
.data_dir()
.join("Microsoft\\Windows\\Start Menu\\Programs\\Startup");
@@ -239,7 +245,7 @@ pub fn insert_startup_scripts(git_repos: &mut Repositories) -> Result<()> {
if let Ok(lnk) = parselnk::Lnk::try_from(Path::new(&path)) {
debug!("Startup link: {:?}", lnk);
if let Some(path) = lnk.relative_path() {
git_repos.insert_if_repo(&startup_dir.join(path));
git_repos.insert_if_repo(startup_dir.join(path));
}
}
}

View File

@@ -4,6 +4,7 @@ use std::path::PathBuf;
use std::process::Command;
use color_eyre::eyre::Result;
use rust_i18n::t;
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
@@ -62,9 +63,9 @@ impl Powershell {
}
pub fn update_modules(&self, ctx: &ExecutionContext) -> Result<()> {
let powershell = require_option(self.path.as_ref(), String::from("Powershell is not installed"))?;
let powershell = require_option(self.path.as_ref(), t!("Powershell is not installed").to_string())?;
print_separator("Powershell Modules Update");
print_separator(t!("Powershell Modules Update"));
let mut cmd = vec!["Update-Module"];
@@ -76,7 +77,7 @@ impl Powershell {
cmd.push("-Force")
}
println!("Updating modules...");
println!("{}", t!("Updating modules..."));
ctx.run_type()
.execute(powershell)
// This probably doesn't need `shell_words::join`.
@@ -94,10 +95,18 @@ impl Powershell {
#[cfg(windows)]
pub fn windows_update(&self, ctx: &ExecutionContext) -> Result<()> {
let powershell = require_option(self.path.as_ref(), String::from("Powershell is not installed"))?;
let powershell = require_option(self.path.as_ref(), t!("Powershell is not installed").to_string())?;
debug_assert!(self.supports_windows_update());
let accept_all = if ctx.config().accept_all_windows_updates() {
"-AcceptAll"
} else {
""
};
let install_windowsupdate_verbose = "Install-WindowsUpdate -Verbose".to_string();
let mut command = if let Some(sudo) = ctx.sudo() {
let mut command = ctx.run_type().execute(sudo);
command.arg(powershell);
@@ -107,18 +116,46 @@ impl Powershell {
};
command
.args([
"-NoProfile",
"-Command",
&format!(
"Import-Module PSWindowsUpdate; Install-WindowsUpdate -MicrosoftUpdate {} -Verbose",
if ctx.config().accept_all_windows_updates() {
"-AcceptAll"
} else {
""
}
),
])
.args(["-NoProfile", &install_windowsupdate_verbose, accept_all])
.status_checked()
}
#[cfg(windows)]
pub fn microsoft_store(&self, ctx: &ExecutionContext) -> Result<()> {
let powershell = require_option(self.path.as_ref(), t!("Powershell is not installed").to_string())?;
let mut command = if let Some(sudo) = ctx.sudo() {
let mut command = ctx.run_type().execute(sudo);
command.arg(powershell);
command
} else {
ctx.run_type().execute(powershell)
};
println!("{}", t!("Scanning for updates..."));
// Scan for updates using the MDM UpdateScanMethod
// This method is also available for non-MDM devices
let update_command = "(Get-CimInstance -Namespace \"Root\\cimv2\\mdm\\dmmap\" -ClassName \"MDM_EnterpriseModernAppManagement_AppManagement01\" | Invoke-CimMethod -MethodName UpdateScanMethod).ReturnValue";
command.args(["-NoProfile", update_command]);
command
.output_checked_with_utf8(|output| {
if output.stdout.trim() == "0" {
println!(
"{}",
t!("Success, Microsoft Store apps are being updated in the background")
);
Ok(())
} else {
println!(
"{}",
t!("Unable to update Microsoft Store apps, manual intervention is required")
);
Err(())
}
})
.map(|_| ())
}
}
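
The Microsoft Store step runs a CIM `UpdateScanMethod` call through PowerShell and treats a printed `ReturnValue` of 0 as success. A minimal standalone sketch of that invocation via `std::process::Command`, calling `powershell.exe` directly rather than going through the project's executor and sudo handling:

use std::process::Command;

/// Trigger a Microsoft Store update scan via the MDM bridge and report
/// whether the ReturnValue printed by PowerShell was 0.
fn microsoft_store_scan() -> std::io::Result<bool> {
    let script = "(Get-CimInstance -Namespace \"Root\\cimv2\\mdm\\dmmap\" -ClassName \"MDM_EnterpriseModernAppManagement_AppManagement01\" | Invoke-CimMethod -MethodName UpdateScanMethod).ReturnValue";
    let output = Command::new("powershell.exe")
        .args(["-NoProfile", "-Command", script])
        .output()?;
    Ok(String::from_utf8_lossy(&output.stdout).trim() == "0")
}

fn main() -> std::io::Result<()> {
    if microsoft_store_scan()? {
        println!("Microsoft Store apps are being updated in the background");
    } else {
        println!("Unable to update Microsoft Store apps, manual intervention is required");
    }
    Ok(())
}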

View File

@@ -1,4 +1,5 @@
use color_eyre::eyre::Result;
use rust_i18n::t;
use crate::{
command::CommandExt, error::SkipStep, execution_context::ExecutionContext, terminal::print_separator, utils,
@@ -27,7 +28,7 @@ pub fn ssh_step(ctx: &ExecutionContext, hostname: &str) -> Result<()> {
{
prepare_async_ssh_command(&mut args);
crate::tmux::run_command(ctx, hostname, &shell_words::join(args))?;
Err(SkipStep(String::from("Remote Topgrade launched in Tmux")).into())
Err(SkipStep(String::from(t!("Remote Topgrade launched in Tmux"))).into())
}
#[cfg(not(unix))]
@@ -35,7 +36,7 @@ pub fn ssh_step(ctx: &ExecutionContext, hostname: &str) -> Result<()> {
} else if ctx.config().open_remotes_in_new_terminal() && !ctx.run_type().dry() && cfg!(windows) {
prepare_async_ssh_command(&mut args);
ctx.run_type().execute("wt").args(&args).spawn()?;
Err(SkipStep(String::from("Remote Topgrade launched in an external terminal")).into())
Err(SkipStep(String::from(t!("Remote Topgrade launched in an external terminal"))).into())
} else {
let mut args = vec!["-t", hostname];
@@ -47,7 +48,7 @@ pub fn ssh_step(ctx: &ExecutionContext, hostname: &str) -> Result<()> {
args.extend(["env", &env, "$SHELL", "-lc", topgrade]);
print_separator(format!("Remote ({hostname})"));
println!("Connecting to {hostname}...");
println!("{}", t!("Connecting to {hostname}...", hostname = hostname));
ctx.run_type().execute(ssh).args(&args).status_checked()
}

View File

@@ -4,6 +4,7 @@ use std::{fmt::Display, rc::Rc, str::FromStr};
use color_eyre::eyre::Result;
use regex::Regex;
use rust_i18n::t;
use strum::EnumString;
use tracing::{debug, error};
@@ -125,7 +126,7 @@ impl<'a> TemporaryPowerOn<'a> {
}
}
impl<'a> Drop for TemporaryPowerOn<'a> {
impl Drop for TemporaryPowerOn<'_> {
fn drop(&mut self) {
let subcommand = if self.ctx.config().vagrant_always_suspend().unwrap_or(false) {
"suspend"
@@ -151,14 +152,14 @@ impl<'a> Drop for TemporaryPowerOn<'a> {
pub fn collect_boxes(ctx: &ExecutionContext) -> Result<Vec<VagrantBox>> {
let directories = utils::require_option(
ctx.config().vagrant_directories(),
String::from("No Vagrant directories were specified in the configuration file"),
String::from(t!("No Vagrant directories were specified in the configuration file")),
)?;
let vagrant = Vagrant {
path: utils::require("vagrant")?,
};
print_separator("Vagrant");
println!("Collecting Vagrant boxes");
println!("{}", t!("Collecting Vagrant boxes"));
let mut result = Vec::new();
@@ -183,7 +184,11 @@ pub fn topgrade_vagrant_box(ctx: &ExecutionContext, vagrant_box: &VagrantBox) ->
let mut _poweron = None;
if !vagrant_box.initial_status.powered_on() {
if !(ctx.config().vagrant_power_on().unwrap_or(true)) {
return Err(SkipStep(format!("Skipping powered off box {vagrant_box}")).into());
return Err(SkipStep(format!(
"{}",
t!("Skipping powered off box {vagrant_box}", vagrant_box = vagrant_box)
))
.into());
} else {
print_separator(seperator);
_poweron = Some(vagrant.temporary_power_on(vagrant_box, ctx)?);
@@ -205,7 +210,7 @@ pub fn topgrade_vagrant_box(ctx: &ExecutionContext, vagrant_box: &VagrantBox) ->
pub fn upgrade_vagrant_boxes(ctx: &ExecutionContext) -> Result<()> {
let vagrant = utils::require("vagrant")?;
print_separator("Vagrant boxes");
print_separator(t!("Vagrant boxes"));
let outdated = Command::new(&vagrant)
.args(["box", "outdated", "--global"])
@@ -227,7 +232,7 @@ pub fn upgrade_vagrant_boxes(ctx: &ExecutionContext) -> Result<()> {
}
if !found {
println!("No outdated boxes")
println!("{}", t!("No outdated boxes"))
} else {
ctx.run_type()
.execute(&vagrant)

View File

@@ -7,6 +7,8 @@ use color_eyre::eyre::Context;
use color_eyre::eyre::Result;
use crate::command::CommandExt;
use crate::config::TmuxConfig;
use crate::config::TmuxSessionMode;
use crate::terminal::print_separator;
use crate::HOME_DIR;
use crate::{
@@ -14,11 +16,19 @@ use crate::{
utils::{which, PathExt},
};
use rust_i18n::t;
#[cfg(unix)]
use std::os::unix::process::CommandExt as _;
pub fn run_tpm(ctx: &ExecutionContext) -> Result<()> {
let tpm = HOME_DIR.join(".tmux/plugins/tpm/bin/update_plugins").require()?;
let tpm = match env::var("TMUX_PLUGIN_MANAGER_PATH") {
// If `TMUX_PLUGIN_MANAGER_PATH` is set, search for
// `$TMUX_PLUGIN_MANAGER_PATH/bin/install_plugins/tpm/bin/update_plugins`
Ok(var) => PathBuf::from(var).join("bin/install_plugins/tpm/bin/update_plugins"),
// Otherwise, use the default location `~/.tmux/plugins/tpm/bin/update_plugins`
Err(_) => HOME_DIR.join(".tmux/plugins/tpm/bin/update_plugins"),
}
.require()?;
print_separator("tmux plugins");
@@ -124,7 +134,7 @@ impl Tmux {
}
}
pub fn run_in_tmux(args: Vec<String>) -> Result<()> {
pub fn run_in_tmux(config: TmuxConfig) -> Result<()> {
let command = {
let mut command = vec![
String::from("env"),
@@ -137,25 +147,39 @@ pub fn run_in_tmux(args: Vec<String>) -> Result<()> {
shell_words::join(command)
};
let tmux = Tmux::new(args);
let tmux = Tmux::new(config.args);
// Find an unused session and run `topgrade` in it with the current command's arguments.
let session_name = "topgrade";
let window_name = "topgrade";
let session = tmux.new_unique_session(session_name, window_name, &command)?;
// Only attach to the newly-created session if we're not currently in a tmux session.
if env::var("TMUX").is_err() {
let err = tmux.build().args(["attach-session", "-t", &session]).exec();
Err(eyre!("{err}")).context("Failed to `execvp(3)` tmux")
} else {
println!("Topgrade launched in a new tmux session");
Ok(())
}
let is_inside_tmux = env::var("TMUX").is_ok();
let err = match config.session_mode {
TmuxSessionMode::AttachIfNotInSession => {
if is_inside_tmux {
// Only attach to the newly-created session if we're not currently in a tmux session.
println!("{}", t!("Topgrade launched in a new tmux session"));
return Ok(());
} else {
tmux.build().args(["attach-session", "-t", &session]).exec()
}
}
TmuxSessionMode::AttachAlways => {
if is_inside_tmux {
tmux.build().args(["switch-client", "-t", &session]).exec()
} else {
tmux.build().args(["attach-session", "-t", &session]).exec()
}
}
};
Err(eyre!("{err}")).context("Failed to `execvp(3)` tmux")
}
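
`run_in_tmux` now chooses between `attach-session` and `switch-client` depending on the configured session mode and whether topgrade already runs inside tmux (`$TMUX` set). A small sketch of just that decision, with the enum mirrored from the diff and the helper function invented for illustration:

/// Mirrors the two session modes introduced in the diff above.
enum TmuxSessionMode {
    AttachIfNotInSession,
    AttachAlways,
}

/// Which tmux subcommand, if any, reaches the freshly created session.
fn tmux_attach_command(mode: TmuxSessionMode, inside_tmux: bool) -> Option<&'static str> {
    match (mode, inside_tmux) {
        // Already inside tmux and not asked to always attach: just report the session.
        (TmuxSessionMode::AttachIfNotInSession, true) => None,
        // Outside tmux: attach normally in both modes.
        (_, false) => Some("attach-session"),
        // Inside tmux with AttachAlways: move the current client to the new session.
        (TmuxSessionMode::AttachAlways, true) => Some("switch-client"),
    }
}

fn main() {
    let inside_tmux = std::env::var("TMUX").is_ok();
    for (name, mode) in [
        ("attach_if_not_in_session", TmuxSessionMode::AttachIfNotInSession),
        ("attach_always", TmuxSessionMode::AttachAlways),
    ] {
        match tmux_attach_command(mode, inside_tmux) {
            Some(cmd) => println!("{name}: would run `tmux {cmd} -t topgrade`"),
            None => println!("{name}: Topgrade launched in a new tmux session"),
        }
    }
}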
pub fn run_command(ctx: &ExecutionContext, window_name: &str, command: &str) -> Result<()> {
let tmux = Tmux::new(ctx.config().tmux_arguments()?);
let tmux = Tmux::new(ctx.config().tmux_config()?.args);
match ctx.get_tmux_session() {
Some(session_name) => {

View File

@@ -9,6 +9,11 @@ if exists(":AstroUpdate")
quitall
endif
if exists(":MasonUpdate")
echo "MasonUpdate"
MasonUpdate
endif
if exists(":NeoBundleUpdate")
echo "NeoBundle"
NeoBundleUpdate

View File

@@ -10,6 +10,7 @@ use crate::{
execution_context::ExecutionContext,
utils::{require, PathExt},
};
use rust_i18n::t;
use std::path::PathBuf;
use std::{
io::{self, Write},
@@ -57,14 +58,14 @@ fn upgrade(command: &mut Executor, ctx: &ExecutionContext) -> Result<()> {
let status = output.status;
if !status.success() || ctx.config().verbose() {
io::stdout().write(&output.stdout).ok();
io::stderr().write(&output.stderr).ok();
io::stdout().write_all(&output.stdout).ok();
io::stderr().write_all(&output.stderr).ok();
}
if !status.success() {
return Err(TopgradeError::ProcessFailed(command.get_program(), status).into());
} else {
println!("Plugins upgraded")
println!("{}", t!("Plugins upgraded"))
}
}
@@ -77,7 +78,7 @@ pub fn upgrade_ultimate_vimrc(ctx: &ExecutionContext) -> Result<()> {
let python = require("python3")?;
let update_plugins = config_dir.join("update_plugins.py").require()?;
print_separator("The Ultimate vimrc");
print_separator(t!("The Ultimate vimrc"));
ctx.run_type()
.execute(&git)
@@ -108,7 +109,7 @@ pub fn upgrade_vim(ctx: &ExecutionContext) -> Result<()> {
let output = Command::new(&vim).arg("--version").output_checked_utf8()?;
if !output.stdout.starts_with("VIM") {
return Err(SkipStep(String::from("vim binary might actually be nvim")).into());
return Err(SkipStep(t!("vim binary might actually be nvim").to_string()).into());
}
let vimrc = vimrc()?;

View File

@@ -8,10 +8,12 @@ use walkdir::WalkDir;
use crate::command::CommandExt;
use crate::execution_context::ExecutionContext;
use crate::git::Repositories;
use crate::git::RepoStep;
use crate::terminal::print_separator;
use crate::utils::{require, PathExt};
use crate::HOME_DIR;
use crate::XDG_DIRS;
use etcetera::base_strategy::BaseStrategy;
pub fn run_zr(ctx: &ExecutionContext) -> Result<()> {
let zsh = require("zsh")?;
@@ -117,15 +119,12 @@ pub fn run_zinit(ctx: &ExecutionContext) -> Result<()> {
env::var("ZINIT_HOME")
.map(PathBuf::from)
.unwrap_or_else(|_| HOME_DIR.join(".zinit"))
.unwrap_or_else(|_| XDG_DIRS.data_dir().join("zinit"))
.require()?;
print_separator("zinit");
let cmd = format!(
"source {} && zinit self-update && zinit update --all -p",
zshrc.display(),
);
let cmd = format!("source {} && zinit self-update && zinit update --all", zshrc.display());
ctx.run_type()
.execute(zsh)
.args(["-i", "-c", cmd.as_str()])
@@ -140,7 +139,7 @@ pub fn run_zi(ctx: &ExecutionContext) -> Result<()> {
print_separator("zi");
let cmd = format!("source {} && zi self-update && zi update --all -p", zshrc.display(),);
let cmd = format!("source {} && zi self-update && zi update --all", zshrc.display());
ctx.run_type().execute(zsh).args(["-i", "-c", &cmd]).status_checked()
}
@@ -211,8 +210,7 @@ pub fn run_oh_my_zsh(ctx: &ExecutionContext) -> Result<()> {
.unwrap_or_else(|e| {
let default_path = oh_my_zsh.join("custom");
debug!(
"Running zsh returned {}. Using default path: {}",
e,
"Running zsh returned {e}. Using default path: {}",
default_path.display()
);
default_path
@@ -220,22 +218,17 @@ pub fn run_oh_my_zsh(ctx: &ExecutionContext) -> Result<()> {
debug!("oh-my-zsh custom dir: {}", custom_dir.display());
let mut custom_repos = Repositories::new(ctx.git());
let mut custom_repos = RepoStep::try_new()?;
for entry in WalkDir::new(custom_dir).max_depth(2) {
let entry = entry?;
custom_repos.insert_if_repo(entry.path());
}
custom_repos.remove(&oh_my_zsh.to_string_lossy());
if !custom_repos.is_empty() {
println!("Pulling custom plugins and themes");
ctx.git().multi_pull(&custom_repos, ctx)?;
}
custom_repos.remove(&oh_my_zsh);
ctx.run_type()
.execute("zsh")
.arg(&oh_my_zsh.join("tools/upgrade.sh"))
.arg(oh_my_zsh.join("tools/upgrade.sh"))
// oh-my-zsh returns 80 when it is already updated and no changes pulled
// in this update.
// See this comment: https://github.com/r-darwish/topgrade/issues/569#issuecomment-736756731
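
The oh-my-zsh step now gathers custom plugin and theme checkouts into a `RepoStep` by walking the custom directory two levels deep. A standalone sketch of that collection with the same `walkdir` crate, reducing "is a repository" to "contains a `.git` entry" (helper and paths are illustrative):

use std::path::{Path, PathBuf};

use walkdir::WalkDir;

/// Collect directories under `custom_dir` (at most two levels deep) that
/// look like git checkouts.
fn collect_custom_repos(custom_dir: &Path) -> Vec<PathBuf> {
    WalkDir::new(custom_dir)
        .max_depth(2)
        .into_iter()
        .filter_map(Result::ok)
        .map(|entry| entry.into_path())
        .filter(|path| path.join(".git").exists())
        .collect()
}

fn main() {
    let home = std::env::var("HOME").unwrap_or_default();
    let custom = PathBuf::from(home).join(".oh-my-zsh/custom");
    for repo in collect_custom_repos(&custom) {
        println!("would pull {}", repo.display());
    }
}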

View File

@@ -11,6 +11,7 @@ use color_eyre::eyre::Context;
use console::{style, Key, Term};
use lazy_static::lazy_static;
use notify_rust::{Notification, Timeout};
use rust_i18n::t;
use tracing::{debug, error};
#[cfg(windows)]
use which_crate::which;
@@ -144,7 +145,7 @@ impl Terminal {
self.term
.write_fmt(format_args!(
"{} {}",
style(format!("{key} failed:")).red().bold(),
style(format!("{}", t!("{key} failed:", key = key))).red().bold(),
message
))
.ok();
@@ -174,10 +175,10 @@ impl Terminal {
"{}: {}\n",
key,
match result {
StepResult::Success => format!("{}", style("OK").bold().green()),
StepResult::Failure => format!("{}", style("FAILED").bold().red()),
StepResult::Ignored => format!("{}", style("IGNORED").bold().yellow()),
StepResult::Skipped(reason) => format!("{}: {}", style("SKIPPED").bold().blue(), reason),
StepResult::Success => format!("{}", style(t!("OK")).bold().green()),
StepResult::Failure => format!("{}", style(t!("FAILED")).bold().red()),
StepResult::Ignored => format!("{}", style(t!("IGNORED")).bold().yellow()),
StepResult::Skipped(reason) => format!("{}: {}", style(t!("SKIPPED")).bold().blue(), reason),
}
))
.ok();
@@ -188,7 +189,7 @@ impl Terminal {
self.term
.write_fmt(format_args!(
"{}",
style(format!("{question} (y)es/(N)o",)).yellow().bold()
style(format!("{question} {}", t!("(Y)es/(N)o"))).yellow().bold()
))
.ok();
@@ -207,14 +208,14 @@ impl Terminal {
}
if self.set_title {
self.term.set_title("Topgrade - Awaiting user");
self.term.set_title(format!("Topgrade - {}", t!("Awaiting user")));
}
if self.desktop_notification {
self.notify_desktop(format!("{step_name} failed"), None);
self.notify_desktop(format!("{}", t!("{step_name} failed", step_name = step_name)), None);
}
let prompt_inner = style(format!("{}Retry? (y)es/(N)o/(s)hell/(q)uit", self.prefix))
let prompt_inner = style(format!("{}{}", self.prefix, t!("Retry? (y)es/(N)o/(s)hell/(q)uit")))
.yellow()
.bold();
@@ -224,7 +225,10 @@ impl Terminal {
match self.term.read_key() {
Ok(Key::Char('y')) | Ok(Key::Char('Y')) => break Ok(true),
Ok(Key::Char('s')) | Ok(Key::Char('S')) => {
println!("\n\nDropping you to shell. Fix what you need and then exit the shell.\n");
println!(
"\n\n{}\n",
t!("Dropping you to shell. Fix what you need and then exit the shell.")
);
if let Err(err) = run_shell().context("Failed to run shell") {
self.term.write_fmt(format_args!("{err:?}\n{prompt_inner}")).ok();
} else {

View File

@@ -5,9 +5,9 @@ use std::path::{Path, PathBuf};
use std::process::Command;
use color_eyre::eyre::Result;
use rust_i18n::t;
use tracing::{debug, error};
use tracing_subscriber::fmt::format::FmtSpan;
use tracing_subscriber::layer::SubscriberExt;
use tracing_subscriber::reload::{Handle, Layer};
use tracing_subscriber::util::SubscriberInitExt;
@@ -52,7 +52,11 @@ where
debug!("Path {:?} exists", self.as_ref());
Ok(self)
} else {
Err(SkipStep(format!("Path {:?} doesn't exist", self.as_ref())).into())
Err(SkipStep(format!(
"{}",
t!("Path {path} doesn't exist", path = format!("{:?}", self.as_ref()))
))
.into())
}
}
}
@@ -93,9 +97,14 @@ pub fn require<T: AsRef<OsStr> + Debug>(binary_name: T) -> Result<PathBuf> {
Ok(path)
}
Err(e) => match e {
which_crate::Error::CannotFindBinaryPath => {
Err(SkipStep(format!("Cannot find {:?} in PATH", &binary_name)).into())
}
which_crate::Error::CannotFindBinaryPath => Err(SkipStep(format!(
"{}",
t!(
"Cannot find {binary_name} in PATH",
binary_name = format!("{:?}", &binary_name)
)
))
.into()),
_ => {
panic!("Detecting {:?} failed: {}", &binary_name, e);
}
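
`require` turns a missing binary into a skip reason instead of a hard error. A standalone sketch of the same pattern with the `which` crate and a plain `String` error in place of the project's `SkipStep`:

use std::path::PathBuf;

/// Resolve a binary on PATH, or explain why the step should be skipped.
fn require(binary_name: &str) -> Result<PathBuf, String> {
    match which::which(binary_name) {
        Ok(path) => Ok(path),
        Err(which::Error::CannotFindBinaryPath) => {
            Err(format!("Cannot find {binary_name:?} in PATH"))
        }
        Err(e) => panic!("Detecting {binary_name:?} failed: {e}"),
    }
}

fn main() {
    match require("cargo") {
        Ok(path) => println!("found at {}", path.display()),
        Err(reason) => println!("skipped: {reason}"),
    }
}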
@@ -124,7 +133,7 @@ pub fn hostname() -> Result<String> {
match nix::unistd::gethostname() {
Ok(os_str) => Ok(os_str
.into_string()
.map_err(|_| SkipStep("Failed to get a UTF-8 encoded hostname".into()))?),
.map_err(|_| SkipStep(t!("Failed to get a UTF-8 encoded hostname").into()))?),
Err(e) => Err(e.into()),
}
}
@@ -133,7 +142,7 @@ pub fn hostname() -> Result<String> {
pub fn hostname() -> Result<String> {
Command::new("hostname")
.output_checked_utf8()
.map_err(|err| SkipStep(format!("Failed to get hostname: {err}")).into())
.map_err(|err| SkipStep(t!("Failed to get hostname: {err}", err = err).to_string()).into())
.map(|output| output.stdout.trim().to_owned())
}
@@ -192,7 +201,9 @@ pub mod merge_strategies {
// Skip causes
// TODO: Put them in a better place when we have more of them
pub const REQUIRE_SUDO: &str = "Require sudo or counterpart but not found, skip";
pub fn get_require_sudo_string() -> String {
t!("Require sudo or counterpart but not found, skip").to_string()
}
/// Return `Err(SkipStep)` if `python` is a Python 2 or shim.
///
@@ -219,11 +230,11 @@ pub fn check_is_python_2_or_shim(python: PathBuf) -> Result<PathBuf> {
.parse::<u32>()
.expect("Major version should be a valid number");
if major_version == 2 {
return Err(SkipStep(format!("{} is a Python 2, skip.", python.display())).into());
return Err(SkipStep(t!("{python} is a Python 2, skip.", python = python.display()).to_string()).into());
}
} else {
// No version number, is a shim
return Err(SkipStep(format!("{} is a Python shim, skip.", python.display())).into());
return Err(SkipStep(t!("{python} is a Python shim, skip.", python = python.display()).to_string()).into());
}
Ok(python)
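
`check_is_python_2_or_shim` skips interpreters that are either Python 2 or a version-less shim. A standalone sketch of the classification alone, operating on the text of `python --version` instead of executing it (enum and sample strings are mine):

enum PythonKind {
    Usable,
    Python2,
    Shim,
}

/// Classify `python --version` output such as "Python 3.12.1".
/// Output with no version number at all is treated as a shim.
fn classify_python(version_output: &str) -> PythonKind {
    let major = version_output
        .split_whitespace()
        .nth(1)
        .and_then(|v| v.split('.').next())
        .and_then(|m| m.parse::<u32>().ok());
    match major {
        Some(2) => PythonKind::Python2,
        Some(_) => PythonKind::Usable,
        None => PythonKind::Shim,
    }
}

fn main() {
    for output in ["Python 3.12.1", "Python 2.7.18", "pyenv: python shim"] {
        let verdict = match classify_python(output) {
            PythonKind::Usable => "usable",
            PythonKind::Python2 => "Python 2, skip",
            PythonKind::Shim => "shim, skip",
        };
        println!("{output:?} -> {verdict}");
    }
}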
@@ -239,10 +250,7 @@ pub fn install_tracing(filter_directives: &str) -> Result<Handle<EnvFilter, Regi
.or_else(|_| EnvFilter::try_from_default_env())
.or_else(|_| EnvFilter::try_new(DEFAULT_LOG_LEVEL))?;
let fmt_layer = fmt::layer()
.with_target(false)
.with_span_events(FmtSpan::NEW | FmtSpan::CLOSE)
.without_time();
let fmt_layer = fmt::layer().with_target(false).without_time();
let (filter, reload_handle) = Layer::new(env_filter);