Compare commits

..

72 Commits

Author SHA1 Message Date
58c682d4d3 Merge pull request 'Improve trash output' (#24) from improve_trash_list into main
Reviewed-on: #24
2025-12-13 18:09:46 +00:00
45d9f5f6c8 Update README, version bump
All checks were successful
Lint & Security / precommit-and-security (pull_request) Successful in 1m2s
2025-12-13 18:06:59 +00:00
659a76f5c9 Make --empty delete dangling files in trash folder not associated with metadata file, edit completer function's name to be reusable 2025-12-13 18:04:55 +00:00
250077c592 Add --inspect flag 2025-12-13 17:34:35 +00:00
631843b3c5 Fix installation instructions 2025-12-09 16:15:13 +00:00
9c653e44a4 Fix release badge link 2025-12-09 15:15:19 +00:00
cdd3ba0cbd Merge pull request 'Update README and pyproject.toml' (#23) from update_resrm_20251209 into main
Reviewed-on: #23
2025-12-09 15:13:46 +00:00
eee00bb6ee Edit badges, update installation instructions, swap github.com entries to git.sysmd.uk
All checks were successful
Lint & Security / precommit-and-security (pull_request) Successful in 48s
2025-12-09 15:11:54 +00:00
f9586bbd0e Merge pull request 'Rename .github folder to .gitea. Use pre-commit directly instead of action' (#22) from rename_github_folder into main
Reviewed-on: #22
2025-12-09 13:23:09 +00:00
51a7001bf2 Rename .github folder to .gitea. Use pre-commit directly instead of action
All checks were successful
Lint & Security / precommit-and-security (pull_request) Successful in 47s
2025-12-09 13:19:46 +00:00
ccf383ebfb Remove .coverage and add that to the .gitignore 2025-12-03 11:58:46 +00:00
Marco D'Aleo
6670c79d47 Merge pull request #21 from guardutils/args_list_fix
Fix list flag to use the long name
2025-12-03 11:57:06 +00:00
3285fbaef4 Fix list flag to use the long name 2025-12-03 11:55:15 +00:00
Marco D'Aleo
c07b7598d0 Merge pull request #20 from guardutils/update_resrm_20251202
Tab completion
2025-12-02 18:02:56 +00:00
9edae1d233 Add tab completion using argcomplete, update README 2025-12-02 18:01:17 +00:00
e36ac044d9 Update badges URLs 2025-11-29 16:43:12 +00:00
Marco D'Aleo
1ad635e37e Merge pull request #19 from guardutils/update_resrm_20251127
Switch ownership from mdaleo404 to guardutils in README and pyproject
2025-11-27 17:56:39 +00:00
649e16c03a Switch ownership from mdaleo404 to guardutils in README and pyproject 2025-11-27 17:55:29 +00:00
Marco D'Aleo
fc02895965 Merge pull request #18 from mdaleo404/add_badges_to_readme
Add badges to README
2025-11-23 07:32:05 +00:00
feb0d313e8 Add badges to README 2025-11-23 07:29:54 +00:00
af6c7a0797 Fix README 2025-11-17 19:05:32 +00:00
ccaa2dcb25 Update README with new installation methods 2025-11-17 18:56:51 +00:00
Marco D'Aleo
5bb1437a49 Merge pull request #17 from mdaleo404/update_resrm_20251117
Rename workflow and make it trigger on pull requests
2025-11-17 15:10:09 +00:00
3c4bbcbc34 Fix typo in workflow's file name 2025-11-17 15:08:51 +00:00
ba29cc590d Rename workflow and make it trigger on pull requests 2025-11-17 15:06:57 +00:00
Marco D'Aleo
2fd6fbb2c2 Merge pull request #16 from mig5/mig/fix-args-force
Use args.force, not args.f
2025-11-17 14:56:00 +00:00
Miguel Jacq
96f7ebf4fc Use args.force, not args.f 2025-11-17 14:11:32 +11:00
7ba20632ab Change Python dependencies version. Remove Black target-version from pyproject.toml. Remove CI pull_request trigger. 2025-11-16 14:53:57 +00:00
f46e699420 Add more checks on pre-commit-config, add CI workflow 2025-11-16 07:04:13 +00:00
Marco D'Aleo
b3aff6d8c5 Merge pull request #15 from mdaleo404/dynamic_version
Add function to fetch package version from pyproject.toml
2025-11-15 18:19:22 +00:00
2013e6b645 Add function to fetch package version from pyproject.toml 2025-11-15 18:16:50 +00:00
Marco D'Aleo
6a73270f23 Merge pull request #14 from mdaleo404/remove_dev_dependencies
Remove bandit and black from pyproject.toml
2025-11-15 16:57:17 +00:00
4f1e5043fd Remove bandit and black from pyproject.toml 2025-11-15 16:56:33 +00:00
877a490b57 Adjust bandit's severity and confidence levels 2025-11-15 08:27:04 +00:00
Marco D'Aleo
bcad54d94e Merge pull request #13 from mdaleo404/update_resrm_20251115
Add pre-commit framework and hooks config
- bandit
- black
- trailing-whitespace
- end-of-file-fixer
2025-11-15 08:06:31 +00:00
80fb24b7c5 Add pre-commit section to README 2025-11-15 08:04:30 +00:00
5f8d6c03d8 Add pre-commit framework and hooks config 2025-11-15 07:53:04 +00:00
Marco D'Aleo
be9867a007 Merge pull request #11 from mdaleo404/resrm_trash_life
Add automatic prune of trash older than $RESRM_TRASH_LIFE
2025-11-12 17:37:52 +00:00
e76c8726a2 Add automatic prune of trash older than RESRM_TRASH_LIFE value, update README, package version bump 2025-11-12 17:36:51 +00:00
657f14d95d Update README 2025-11-11 19:02:30 +00:00
Marco D'Aleo
b6c6fcc14d Merge pull request #10 from mdaleo404/flags_improvement
Flags improvement
2025-11-11 18:54:05 +00:00
e5f029738a Add version flag, add error message for missing operand, add metavar for restore flag, remove unused recursive argument in move_to_trash function 2025-11-11 18:50:51 +00:00
1fbe499177 Flags improvements and package version bump 2025-11-11 18:11:58 +00:00
662166afb8 Add CODEOWNERS file 2025-11-11 17:30:42 +00:00
Marco D'Aleo
7197bfce0c Merge pull request #9 from mig5/mig/readme-sudo
Clarifications about sudo and secure_path
2025-11-11 17:25:38 +00:00
Miguel Jacq
7ee0e0a99e Clarifications about sudo and secure_path 2025-11-11 15:56:30 +11:00
dcd1165066 Remove legacy resrm repository subdirectory 2025-11-10 13:07:04 +00:00
Marco D'Aleo
45ebe97a9a Merge pull request #8 from mdaleo404/folder_restructure
Folder restructure
2025-11-10 13:05:23 +00:00
dfcffb19e4 Folder restructure 2025-11-10 13:04:02 +00:00
b221fa3534 Release bump to 0.2.0 2025-11-09 11:15:42 +00:00
a24cc8f2b9 remove CONTRIBUTING.md 2025-11-09 11:13:42 +00:00
Marco D'Aleo
d7330933bc Merge pull request #7 from mdaleo404/remove_actions
Remove actions/workflows
2025-11-09 11:11:32 +00:00
ac32bae975 Remove actions/workflows 2025-11-09 11:11:05 +00:00
Marco D'Aleo
b181e31a0c Merge pull request #6 from mdaleo404/gitignore_tweak
chore: remove end of the file line in .gitignore
2025-11-09 11:02:33 +00:00
4fd1243472 chore: remove end of the file line in .gitignore 2025-11-09 11:02:11 +00:00
Marco D'Aleo
328eeaca7a chore: force release
chore: fix workflow
2025-11-09 10:55:32 +00:00
19b79b26ff fix workflow 2025-11-09 10:54:53 +00:00
Marco D'Aleo
a67e31c65d chore: re-trigger release
chore: add new line to end of file core.py
2025-11-09 10:45:50 +00:00
9d4608bd34 add new line to end of file core.py 2025-11-09 10:44:48 +00:00
Marco D'Aleo
27468ae0d0 feat: prepare release cycle
feat: prepare release cycle
2025-11-09 10:35:52 +00:00
8b1d9a81a1 minor change to core.py 2025-11-09 10:34:46 +00:00
6eb3f5a210 Minor fix to inline documentation 2025-11-09 10:25:33 +00:00
Marco D'Aleo
9a64bef661 chore: trigger release
docs: remove end-of-the-file line in CONTRIBUTING.md
2025-11-09 10:10:44 +00:00
5b46a3af01 docs: remove end-of-the-file line in CONTRIBUTING.md 2025-11-09 10:09:07 +00:00
Marco D'Aleo
c8cc694e3c Merge pull request #1 from mdaleo404/restore_many
Add support for restoring multiple files, minor changes to READMEs
2025-11-09 09:54:36 +00:00
756a6af4ac docs: minor update to README files 2025-11-09 09:52:01 +00:00
a20bbeb9f8 feat: add support for restoring multiple files with --restore 2025-11-09 09:48:14 +00:00
668d6bbba4 Add CONTRIBUTING.md file 2025-11-09 09:02:11 +00:00
62264ea115 Add release-please-manifest 2025-11-09 08:33:03 +00:00
d38ec538a4 Switch to new release-please config 2025-11-09 08:29:55 +00:00
166f2dfac2 Add GitHub Actions workflows for packaging and release 2025-11-09 08:21:34 +00:00
ac284d29e3 Delete __pycache__, change README files 2025-11-08 09:59:56 +00:00
19 changed files with 967 additions and 404 deletions

1
.gitea/CODEOWNERS Normal file

@@ -0,0 +1 @@
* @mdaleo404


@@ -0,0 +1,29 @@
name: Lint & Security
on:
  pull_request:
jobs:
  precommit-and-security:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.13"
      - name: Install pre-commit
        run: pip install pre-commit
      - name: Run pre-commit hooks
        run: pre-commit run --all-files --color always
      - name: Install pip-audit
        run: pip install pip-audit
      - name: Run pip-audit
        run: pip-audit

1
.gitignore vendored

@@ -1,3 +1,4 @@
__pycache__
.pytest_cache
dist
.coverage

21
.pre-commit-config.yaml Normal file

@@ -0,0 +1,21 @@
repos:
  - repo: https://github.com/PyCQA/bandit
    rev: 1.7.9
    hooks:
      - id: bandit
        files: ^src/resrm/
        args: ["-lll", "-iii", "-s", "B110,B112"]
  - repo: https://github.com/psf/black-pre-commit-mirror
    rev: 25.11.0
    hooks:
      - id: black
        language_version: python3.13
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-toml

135
README.md

@@ -1,33 +1,108 @@
[![Licence](https://img.shields.io/badge/GPL--3.0-orange?label=Licence)](https://git.sysmd.uk/guardutils/resrm/src/branch/main/LICENCE)
[![Gitea Release](https://img.shields.io/gitea/v/release/guardutils/resrm?gitea_url=https%3A%2F%2Fgit.sysmd.uk%2F&style=flat&color=orange&logo=gitea)](https://git.sysmd.uk/guardutils/resrm/releases)
[![pre-commit](https://img.shields.io/badge/pre--commit-enabled-blue?logo=pre-commit&style=flat)](https://git.sysmd.uk/guardutils/resrm/src/branch/main/.pre-commit-config.yaml)
# resrm
**resrm** is a safe, drop-in replacement for the Linux `rm` command with **undo/restore support**.
**resrm** is a safe, drop-in replacement for the Linux `rm` command with **undo/restore support**.
It moves files to a per-user _trash_ instead of permanently deleting them, while still allowing full `sudo` support for root-owned files.
---
## Features
- Move files and directories to a **trash folder** instead of permanent deletion
- Restore deleted files by **short ID or exact basename**
- Empty trash safely
- Supports `-r`, `-f`, `-i`, `--perma` options
- Works with `sudo` for root-owned files
- Move files and directories to a **Trash folder** instead of permanent deletion
- Restore deleted files by **short ID or exact basename**
- Empty trash safely
- Supports `-r`, `-f`, `-i`, `--skip-trash` options
- Works with `sudo` for root-owned files
- Automatically prunes Trash entries older than `$RESRM_TRASH_LIFE` days (default **7**, minimum **1**)
> Note: if you need immediate deletion, use the `--skip-trash` flag.
---
## Installation
Install via Poetry:
### From GuardUtils package repo
This is the preferred method of installation.
### Debian/Ubuntu
#### 1) Import the GPG key
```bash
poetry add resrm
sudo mkdir -p /usr/share/keyrings
curl -fsSL https://repo.sysmd.uk/guardutils/guardutils.gpg | sudo gpg --dearmor -o /usr/share/keyrings/guardutils.gpg
```
Or clone the repo and install locally:
The GPG fingerprint is `0032C71FA6A11EF9567D4434C5C06BD4603C28B1`.
#### 2) Add the APT source
```bash
git clone https://github.com/mdaleo404/resrm.git
cd resrm
echo "deb [arch=amd64 signed-by=/usr/share/keyrings/guardutils.gpg] https://repo.sysmd.uk/guardutils/debian stable main" | sudo tee /etc/apt/sources.list.d/guardutils.list
```
#### 3) Update and install
```
sudo apt update
sudo apt install resrm
```
### Fedora/RHEL
#### 1) Import the GPG key
```
sudo rpm --import https://repo.sysmd.uk/guardutils/guardutils.gpg
```
#### 2) Add the repository configuration
```
sudo tee /etc/yum.repos.d/guardutils.repo > /dev/null << 'EOF'
[guardutils]
name=GuardUtils Repository
baseurl=https://repo.sysmd.uk/guardutils/rpm/$basearch
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://repo.sysmd.uk/guardutils/guardutils.gpg
EOF
```
#### 3) Update and install
```
sudo dnf upgrade --refresh
sudo dnf install resrm
```
### From PyPI
**NOTE:** To use `resrm` with `sudo`, the path to `resrm` must be in the `$PATH` seen by `root`.\
Either:
* install `resrm` as `root`, or
* add the path to `resrm` to the `secure_path` parameter in `/etc/sudoers`. For example, if `resrm` lives in `/home/user/.local/bin`:
``` bash
Defaults secure_path="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/user/.local/bin"
```
Install with:
```bash
pip install resrm
```
### From this repository
```bash
git clone https://git.sysmd.uk/guardutils/resrm.git
cd resrm/
poetry install
```
@@ -47,7 +122,7 @@ resrm -f file
resrm -i file
# Permanent delete (bypass trash)
resrm --perma file
resrm --skip-trash file
# List trash entries
resrm -l
@@ -55,12 +130,44 @@ resrm -l
# Restore a file by ID or basename
resrm --restore <id|name>
# Show full details of trashed item
resrm --inspect <id|name>
# Empty the trash permanently
resrm --empty
```
## Trash Location
Normal users: `~/.local/share/resrm/files`
Root user: `/root/.local/share/resrm/files`
Root user: `/root/.local/share/resrm/files`
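If you want to see what resrm is currently holding, you can look at the trash folder and its metadata file directly. A minimal sketch, assuming the `metadata.json` file that resrm keeps next to the `files/` folder (entries with `id`, `orig_path` and `timestamp` fields):
```bash
# Trashed payloads live under files/, one entry per generated ID
ls ~/.local/share/resrm/files

# metadata.json maps each ID back to its original path and deletion time
python3 -m json.tool ~/.local/share/resrm/metadata.json
```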
## Configuration
To control how long trashed files are kept, add this line to your shell configuration (e.g. `~/.bashrc`):
```bash
export RESRM_TRASH_LIFE=10
```
### TAB completion
Add this to your `.bashrc`
```
eval "$(register-python-argcomplete resrm)"
```
And then
```
source ~/.bashrc
```
## pre-commit
This project uses [**pre-commit**](https://pre-commit.com/) to run automatic formatting and security checks before each commit (Black, Bandit, and various safety checks).
To enable it:
```
poetry install
poetry run pre-commit install
```
This ensures consistent formatting, catches common issues early, and keeps the codebase clean.
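The hooks run automatically on `git commit` once installed; to run the same checks on demand (this mirrors what the CI workflow does, minus `pip-audit`):
```
poetry run pre-commit run --all-files
```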

226
poetry.lock generated Normal file

@@ -0,0 +1,226 @@
# This file is automatically @generated by Poetry 1.8.4 and should not be changed by hand.
[[package]]
name = "argcomplete"
version = "3.6.3"
description = "Bash tab completion for argparse"
optional = false
python-versions = ">=3.8"
files = [
{file = "argcomplete-3.6.3-py3-none-any.whl", hash = "sha256:f5007b3a600ccac5d25bbce33089211dfd49eab4a7718da3f10e3082525a92ce"},
{file = "argcomplete-3.6.3.tar.gz", hash = "sha256:62e8ed4fd6a45864acc8235409461b72c9a28ee785a2011cc5eb78318786c89c"},
]
[package.extras]
test = ["coverage", "mypy", "pexpect", "ruff", "wheel"]
[[package]]
name = "cfgv"
version = "3.4.0"
description = "Validate configuration and produce human readable error messages."
optional = false
python-versions = ">=3.8"
files = [
{file = "cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9"},
{file = "cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560"},
]
[[package]]
name = "distlib"
version = "0.4.0"
description = "Distribution utilities"
optional = false
python-versions = "*"
files = [
{file = "distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16"},
{file = "distlib-0.4.0.tar.gz", hash = "sha256:feec40075be03a04501a973d81f633735b4b69f98b05450592310c0f401a4e0d"},
]
[[package]]
name = "filelock"
version = "3.20.0"
description = "A platform independent file lock."
optional = false
python-versions = ">=3.10"
files = [
{file = "filelock-3.20.0-py3-none-any.whl", hash = "sha256:339b4732ffda5cd79b13f4e2711a31b0365ce445d95d243bb996273d072546a2"},
{file = "filelock-3.20.0.tar.gz", hash = "sha256:711e943b4ec6be42e1d4e6690b48dc175c822967466bb31c0c293f34334c13f4"},
]
[[package]]
name = "identify"
version = "2.6.15"
description = "File identification library for Python"
optional = false
python-versions = ">=3.9"
files = [
{file = "identify-2.6.15-py2.py3-none-any.whl", hash = "sha256:1181ef7608e00704db228516541eb83a88a9f94433a8c80bb9b5bd54b1d81757"},
{file = "identify-2.6.15.tar.gz", hash = "sha256:e4f4864b96c6557ef2a1e1c951771838f4edc9df3a72ec7118b338801b11c7bf"},
]
[package.extras]
license = ["ukkonen"]
[[package]]
name = "nodeenv"
version = "1.9.1"
description = "Node.js virtual environment builder"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
files = [
{file = "nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9"},
{file = "nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f"},
]
[[package]]
name = "platformdirs"
version = "4.5.0"
description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`."
optional = false
python-versions = ">=3.10"
files = [
{file = "platformdirs-4.5.0-py3-none-any.whl", hash = "sha256:e578a81bb873cbb89a41fcc904c7ef523cc18284b7e3b3ccf06aca1403b7ebd3"},
{file = "platformdirs-4.5.0.tar.gz", hash = "sha256:70ddccdd7c99fc5942e9fc25636a8b34d04c24b335100223152c2803e4063312"},
]
[package.extras]
docs = ["furo (>=2025.9.25)", "proselint (>=0.14)", "sphinx (>=8.2.3)", "sphinx-autodoc-typehints (>=3.2)"]
test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=8.4.2)", "pytest-cov (>=7)", "pytest-mock (>=3.15.1)"]
type = ["mypy (>=1.18.2)"]
[[package]]
name = "pre-commit"
version = "3.8.0"
description = "A framework for managing and maintaining multi-language pre-commit hooks."
optional = false
python-versions = ">=3.9"
files = [
{file = "pre_commit-3.8.0-py2.py3-none-any.whl", hash = "sha256:9a90a53bf82fdd8778d58085faf8d83df56e40dfe18f45b19446e26bf1b3a63f"},
{file = "pre_commit-3.8.0.tar.gz", hash = "sha256:8bb6494d4a20423842e198980c9ecf9f96607a07ea29549e180eef9ae80fe7af"},
]
[package.dependencies]
cfgv = ">=2.0.0"
identify = ">=1.0.0"
nodeenv = ">=0.11.1"
pyyaml = ">=5.1"
virtualenv = ">=20.10.0"
[[package]]
name = "pyyaml"
version = "6.0.3"
description = "YAML parser and emitter for Python"
optional = false
python-versions = ">=3.8"
files = [
{file = "PyYAML-6.0.3-cp38-cp38-macosx_10_13_x86_64.whl", hash = "sha256:c2514fceb77bc5e7a2f7adfaa1feb2fb311607c9cb518dbc378688ec73d8292f"},
{file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9c57bb8c96f6d1808c030b1687b9b5fb476abaa47f0db9c0101f5e9f394e97f4"},
{file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:efd7b85f94a6f21e4932043973a7ba2613b059c4a000551892ac9f1d11f5baf3"},
{file = "PyYAML-6.0.3-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:22ba7cfcad58ef3ecddc7ed1db3409af68d023b7f940da23c6c2a1890976eda6"},
{file = "PyYAML-6.0.3-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:6344df0d5755a2c9a276d4473ae6b90647e216ab4757f8426893b5dd2ac3f369"},
{file = "PyYAML-6.0.3-cp38-cp38-win32.whl", hash = "sha256:3ff07ec89bae51176c0549bc4c63aa6202991da2d9a6129d7aef7f1407d3f295"},
{file = "PyYAML-6.0.3-cp38-cp38-win_amd64.whl", hash = "sha256:5cf4e27da7e3fbed4d6c3d8e797387aaad68102272f8f9752883bc32d61cb87b"},
{file = "pyyaml-6.0.3-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b"},
{file = "pyyaml-6.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956"},
{file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8"},
{file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198"},
{file = "pyyaml-6.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b"},
{file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0"},
{file = "pyyaml-6.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69"},
{file = "pyyaml-6.0.3-cp310-cp310-win32.whl", hash = "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e"},
{file = "pyyaml-6.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c"},
{file = "pyyaml-6.0.3-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e"},
{file = "pyyaml-6.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824"},
{file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c"},
{file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00"},
{file = "pyyaml-6.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d"},
{file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a"},
{file = "pyyaml-6.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4"},
{file = "pyyaml-6.0.3-cp311-cp311-win32.whl", hash = "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b"},
{file = "pyyaml-6.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf"},
{file = "pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196"},
{file = "pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0"},
{file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28"},
{file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c"},
{file = "pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc"},
{file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e"},
{file = "pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea"},
{file = "pyyaml-6.0.3-cp312-cp312-win32.whl", hash = "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5"},
{file = "pyyaml-6.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b"},
{file = "pyyaml-6.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd"},
{file = "pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8"},
{file = "pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1"},
{file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c"},
{file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5"},
{file = "pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6"},
{file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6"},
{file = "pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be"},
{file = "pyyaml-6.0.3-cp313-cp313-win32.whl", hash = "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26"},
{file = "pyyaml-6.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c"},
{file = "pyyaml-6.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb"},
{file = "pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac"},
{file = "pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310"},
{file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7"},
{file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788"},
{file = "pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5"},
{file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764"},
{file = "pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35"},
{file = "pyyaml-6.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac"},
{file = "pyyaml-6.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3"},
{file = "pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3"},
{file = "pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba"},
{file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c"},
{file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702"},
{file = "pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c"},
{file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065"},
{file = "pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65"},
{file = "pyyaml-6.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9"},
{file = "pyyaml-6.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b"},
{file = "pyyaml-6.0.3-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:b865addae83924361678b652338317d1bd7e79b1f4596f96b96c77a5a34b34da"},
{file = "pyyaml-6.0.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c3355370a2c156cffb25e876646f149d5d68f5e0a3ce86a5084dd0b64a994917"},
{file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3c5677e12444c15717b902a5798264fa7909e41153cdf9ef7ad571b704a63dd9"},
{file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5ed875a24292240029e4483f9d4a4b8a1ae08843b9c54f43fcc11e404532a8a5"},
{file = "pyyaml-6.0.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0150219816b6a1fa26fb4699fb7daa9caf09eb1999f3b70fb6e786805e80375a"},
{file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:fa160448684b4e94d80416c0fa4aac48967a969efe22931448d853ada8baf926"},
{file = "pyyaml-6.0.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:27c0abcb4a5dac13684a37f76e701e054692a9b2d3064b70f5e4eb54810553d7"},
{file = "pyyaml-6.0.3-cp39-cp39-win32.whl", hash = "sha256:1ebe39cb5fc479422b83de611d14e2c0d3bb2a18bbcb01f229ab3cfbd8fee7a0"},
{file = "pyyaml-6.0.3-cp39-cp39-win_amd64.whl", hash = "sha256:2e71d11abed7344e42a8849600193d15b6def118602c4c176f748e4583246007"},
{file = "pyyaml-6.0.3.tar.gz", hash = "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f"},
]
[[package]]
name = "typing-extensions"
version = "4.15.0"
description = "Backported and Experimental Type Hints for Python 3.9+"
optional = false
python-versions = ">=3.9"
files = [
{file = "typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548"},
{file = "typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466"},
]
[[package]]
name = "virtualenv"
version = "20.35.4"
description = "Virtual Python Environment builder"
optional = false
python-versions = ">=3.8"
files = [
{file = "virtualenv-20.35.4-py3-none-any.whl", hash = "sha256:c21c9cede36c9753eeade68ba7d523529f228a403463376cf821eaae2b650f1b"},
{file = "virtualenv-20.35.4.tar.gz", hash = "sha256:643d3914d73d3eeb0c552cbb12d7e82adf0e504dbf86a3182f8771a153a1971c"},
]
[package.dependencies]
distlib = ">=0.3.7,<1"
filelock = ">=3.12.2,<4"
platformdirs = ">=3.9.1,<5"
typing-extensions = {version = ">=4.13.2", markers = "python_version < \"3.11\""}
[package.extras]
docs = ["furo (>=2023.7.26)", "proselint (>=0.13)", "sphinx (>=7.1.2,!=7.3)", "sphinx-argparse (>=0.4)", "sphinxcontrib-towncrier (>=0.2.1a0)", "towncrier (>=23.6)"]
test = ["covdefaults (>=2.3)", "coverage (>=7.2.7)", "coverage-enable-subprocess (>=1)", "flaky (>=3.7)", "packaging (>=23.1)", "pytest (>=7.4)", "pytest-env (>=0.8.2)", "pytest-freezer (>=0.4.8)", "pytest-mock (>=3.11.1)", "pytest-randomly (>=3.12)", "pytest-timeout (>=2.1)", "setuptools (>=68)", "time-machine (>=2.10)"]
[metadata]
lock-version = "2.0"
python-versions = ">=3.10,<4.0"
content-hash = "7f8ea4efe2d270a676fdd9c882c02f43b4b118bfe7a5fd6da1098ff4ec84ce3d"


@@ -1,20 +1,27 @@
[tool.poetry]
name = "resrm"
version = "0.1.0"
version = "0.4.0"
description = "drop-in replacement for rm with undo/restore built-in."
authors = ["Marco D'Aleo <marco@marcodaleo.com>"]
license = "GPL-3.0-or-later"
readme = "README.md"
homepage = "https://github.com/mdaleo404/resrm"
repository = "https://github.com/mdaleo404/resrm"
homepage = "https://git.sysmd.uk/guardutils/resrm"
repository = "https://git.sysmd.uk/guardutils/resrm"
packages = [{include = "resrm", from = "src"}]
[tool.poetry.dependencies]
python = "^3.13"
python = ">=3.10,<4.0"
argcomplete = ">=2"
[tool.poetry.dev-dependencies]
pre-commit = "^3.8"
[tool.poetry.scripts]
resrm = "resrm.cli:main"
[tool.black]
line-length = 79
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"


@@ -1,66 +0,0 @@
# resrm
**resrm** is a safe, drop-in replacement for the Linux `rm` command with **undo/restore support**.
It moves files to a per-user _trash_ instead of permanently deleting them, while still allowing full `sudo` support for root-owned files.
---
## Features
- Move files and directories to a **trash folder** instead of permanent deletion
- Restore deleted files by **short ID or exact basename**
- Empty trash safely
- Supports `-r`, `-f`, `-i`, `--perma` options
- Works with `sudo` for root-owned files
---
## Installation
Install via Poetry:
```bash
poetry add resrm
```
Or clone the repo and install locally:
```bash
git clone https://github.com/mdaleo404/resrm.git
cd resrm
poetry install
```
## Usage
```bash
# Move files to trash
resrm file1 file2
# Recursive remove of a directory
resrm -r mydir
# Force remove (ignore nonexistent)
resrm -f file
# Interactive remove
resrm -i file
# Permanent delete (bypass trash)
resrm --perma file
# List trash entries
resrm -l
# Restore a file by ID or basename
resrm --restore <id|name>
# Empty the trash permanently
resrm --empty
```
## Trash Location
Normal users: `~/.local/share/resrm/files`
Root user: `/root/.local/share/resrm/files`

7
resrm/poetry.lock generated

@@ -1,7 +0,0 @@
# This file is automatically @generated by Poetry 1.8.4 and should not be changed by hand.
package = []
[metadata]
lock-version = "2.0"
python-versions = "^3.13"
content-hash = "f01b553f3895e558c34b4f10542e05acdef39bf0527c8090bd136d914dc73f94"


@@ -1 +0,0 @@
from resrm.core import main


@@ -1,312 +0,0 @@
#!/usr/bin/env python3
"""
resrm: drop-in replacement for rm with undo/restore built-in (single script)
Basic usage:
resrm file1 file2 # move to trash (safe)
resrm -r dir # recursive remove (moves dir to trash)
resrm -f file # ignore nonexistent, no prompt
resrm -i file # interactive prompt before removal
resrm --perma file # permanent delete (bypass trash)
resrm -l # list trash entries (neat table)
resrm --restore <id|name> # restore by short-id (8 chars) or exact basename
resrm --empty # empty trash entries (permanent)
"""
from __future__ import annotations
import argparse
import json
import os
import shutil
import sys
import uuid
import datetime
import textwrap
from pathlib import Path
from typing import List, Dict, Optional
# Config
def get_trash_base_for_user(uid: int) -> Path:
"""Return the trash base path depending on whether user is root or normal."""
if uid == 0:
return Path("/root/.local/share/resrm")
else:
try:
import pwd
user_info = pwd.getpwuid(uid)
home_dir = Path(user_info.pw_dir)
except Exception:
home_dir = Path.home()
return home_dir / ".local" / "share" / "resrm"
def get_trash_paths() -> tuple[Path, Path]:
"""Return (trash_dir, meta_file) for the current user."""
base = get_trash_base_for_user(os.geteuid())
trash = base / "files"
meta = base / "metadata.json"
trash.mkdir(parents=True, exist_ok=True)
meta.parent.mkdir(parents=True, exist_ok=True)
return trash, meta
TRASH_DIR, META_FILE = get_trash_paths()
DATEFMT = "%Y-%m-%d %H:%M"
def load_meta() -> List[Dict]:
if META_FILE.exists():
try:
with META_FILE.open("r", encoding="utf-8") as f:
return json.load(f)
except Exception:
return []
return []
def save_meta(meta: List[Dict]):
with META_FILE.open("w", encoding="utf-8") as f:
json.dump(meta, f, indent=2, ensure_ascii=False)
meta = load_meta()
def short_id(fullid: str) -> str:
return fullid[:8]
def human_time(ts: str) -> str:
"""
Convert ISO timestamp string from metadata to a human-readable format.
"""
try:
dt = datetime.datetime.fromisoformat(ts)
return dt.strftime(DATEFMT)
except Exception:
# Fallback: just return the raw string
return ts
def entry_display(entry: Dict, width: int = 80) -> str:
id8 = short_id(entry["id"])
ts = human_time(entry["timestamp"])
path = entry["orig_path"]
wrapped = textwrap.fill(path, width=width-32)
return f"{id8:<8} {ts:<19} {wrapped}"
def list_trash():
if not meta:
print("Trash empty.")
return
header = f"{'ID':<8} {'Deleted at':<19} {'Original path'}"
print(header)
print('-' * len(header))
for entry in meta:
id8 = short_id(entry["id"])
ts = human_time(entry["timestamp"])
path = entry["orig_path"]
max_path_len = 80
if len(path) > max_path_len:
path = "" + path[-(max_path_len - 1):]
print(f"{id8:<8} {ts:<19} {path}")
def find_candidates(identifier: str) -> List[Dict]:
# exact basename match first
exact = [m for m in meta if Path(m["orig_path"]).name == identifier]
if exact:
return exact
# then id prefix match
id_matches = [m for m in meta if m["id"].startswith(identifier)]
if id_matches:
return id_matches
def restore_one(entry: Dict) -> bool:
src = TRASH_DIR / entry["id"]
dest = Path(entry["orig_path"])
# If dest exists, restore to current dir with original basename
if dest.exists():
dest = Path.cwd() / dest.name
dest.parent.mkdir(parents=True, exist_ok=True)
try:
shutil.move(str(src), str(dest))
except Exception as e:
print(f"Failed to restore: {e}")
return False
# remove meta entry
try:
meta.remove(entry)
save_meta(meta)
except ValueError:
pass
print(f"Restored to: {dest}")
return True
def restore(identifier: str):
candidates = find_candidates(identifier)
if not candidates:
print(f"No match found for '{identifier}'")
return
if len(candidates) == 1:
restore_one(candidates[0])
return
# multiple candidates -> show list and ask
print("Multiple matches:")
for i, e in enumerate(candidates, start=1):
print(f"{i}) {short_id(e['id'])} {e['orig_path']} ({e['timestamp']})")
try:
choice = input("Choose number to restore (or abort): ").strip()
except KeyboardInterrupt:
print("\nAborted.")
return
if not choice.isdigit():
print("Aborted.")
return
idx = int(choice) - 1
if idx < 0 or idx >= len(candidates):
print("Invalid selection.")
return
restore_one(candidates[idx])
def empty_trash():
"""Permanently remove all trashed files and clear metadata."""
count = 0
for entry in list(meta):
f = TRASH_DIR / entry["id"]
try:
if f.exists():
if f.is_dir():
shutil.rmtree(f, ignore_errors=True)
else:
f.unlink(missing_ok=True)
meta.remove(entry)
count += 1
except Exception as e:
print(f"Failed to remove {f}: {e}")
save_meta(meta)
print(f"Trash emptied ({count} entries removed).")
def move_to_trash(path: Path, interactive: bool, force: bool, recursive: bool, perma: bool):
if not path.exists():
if force:
return
print(f"resrm: cannot remove '{path}': No such file or directory")
return
# Interactive prompt
if interactive:
try:
yn = input(f"remove '{path}'? [y/N] ").strip().lower()
except KeyboardInterrupt:
print()
return
if yn != "y":
return
# Permanent delete path
if perma:
try:
if path.is_dir() and not path.is_symlink():
shutil.rmtree(path)
else:
path.unlink()
except Exception as e:
print(f"Failed permanent delete: {e}")
return
# 🚫 Prevent non-root user deleting root-owned files
try:
st = path.stat()
if st.st_uid == 0 and os.geteuid() != 0:
print(f"resrm: permission denied: '{path}' (root-owned file, try sudo)")
return
except Exception:
pass
# 🧭 Detect which trash to use (based on file owner)
try:
import pwd
owner_uid = path.stat().st_uid
owner_info = pwd.getpwuid(owner_uid)
owner_home = Path(owner_info.pw_dir)
trash_base = owner_home / ".local" / "share" / "resrm"
except Exception:
# fallback if we can't resolve owner
trash_base = TRASH_DIR.parent
trash_dir = trash_base / "files"
meta_file = trash_base / "metadata.json"
trash_dir.mkdir(parents=True, exist_ok=True)
meta_file.parent.mkdir(parents=True, exist_ok=True)
# Try to move file
uid = uuid.uuid4().hex
dest = trash_dir / uid
try:
shutil.move(str(path), str(dest))
except Exception as e:
print(f"Failed to move to trash: {e}")
return
# Update metadata (per-owner)
try:
if meta_file.exists():
with meta_file.open("r", encoding="utf-8") as f:
owner_meta = json.load(f)
else:
owner_meta = []
except Exception:
owner_meta = []
entry = {
"id": uid,
"orig_path": str(path.resolve()),
"timestamp": datetime.datetime.now().isoformat()
}
owner_meta.append(entry)
with meta_file.open("w", encoding="utf-8") as f:
json.dump(owner_meta, f, indent=2, ensure_ascii=False)
print(f"Removed '{path}' -> trash id {short_id(uid)}")
def main(argv: Optional[List[str]] = None):
if argv is None:
argv = sys.argv[1:]
parser = argparse.ArgumentParser(add_help=False)
parser.add_argument("paths", nargs="*", help="files to remove")
parser.add_argument("-r", action="store_true", help="recursive")
parser.add_argument("-f", action="store_true", help="force")
parser.add_argument("-i", action="store_true", help="interactive")
parser.add_argument("--perma", action="store_true", help="permanent delete")
parser.add_argument("--restore", nargs=1, help="restore by id or basename")
parser.add_argument("-l", action="store_true", help="list trash")
parser.add_argument("--empty", action="store_true", help="empty the trash permanently")
parser.add_argument("--help", action="store_true", help="show help")
args = parser.parse_args(argv)
# Always print docstring if --help or no args
if args.help or not argv:
print(__doc__)
return
if args.l:
list_trash()
return
if args.empty:
empty_trash()
return
if args.restore:
restore(args.restore[0])
return
if not args.paths:
parser.print_help()
return
# Process removals
for p in args.paths:
pth = Path(p)
# simplistic recursive handling: if -r not given and it's a directory, mimic rm behavior: error unless -r
if pth.is_dir() and not args.r:
if args.f:
continue
print(f"resrm: cannot remove '{pth}': Is a directory")
continue
move_to_trash(pth, interactive=args.i, force=args.f, recursive=args.r, perma=args.perma)

1
src/resrm/cli.py Normal file

@@ -0,0 +1 @@
from resrm.core import main

556
src/resrm/core.py Normal file

@@ -0,0 +1,556 @@
#!/usr/bin/env python3
"""
resrm: drop-in replacement for rm with undo/restore built-in (single script)
Basic usage:
resrm file1 file2 # move to trash (safe)
resrm -r dir # recursive remove (moves dir to trash)
resrm -f file # ignore nonexistent, no prompt
resrm -i file # interactive prompt before removal
resrm --skip-trash file # permanent delete (bypass trash)
resrm -l|--list # list trash entries (neat table)
resrm --restore <id|name> # restore by short-id (8 chars) or exact basename
resrm --inspect <id|name> # output full detail list of trashed item
resrm --empty # empty trash entries (permanent)
"""
from __future__ import annotations
import argparse
import argcomplete
import json
import os
import shutil
import sys
import uuid
import datetime
import textwrap
import importlib.metadata
from pathlib import Path
from typing import List, Dict, Optional
# Config
def get_version():
try:
return importlib.metadata.version("resrm")
except importlib.metadata.PackageNotFoundError:
return "unknown"
def get_trash_base_for_user(uid: int) -> Path:
"""Return the trash base path depending on whether user is root or normal."""
if uid == 0:
return Path("/root/.local/share/resrm")
else:
try:
import pwd
user_info = pwd.getpwuid(uid)
home_dir = Path(user_info.pw_dir)
except Exception:
home_dir = Path.home()
return home_dir / ".local" / "share" / "resrm"
def get_trash_paths() -> tuple[Path, Path]:
"""Return (trash_dir, meta_file) for the current user."""
base = get_trash_base_for_user(os.geteuid())
trash = base / "files"
meta = base / "metadata.json"
trash.mkdir(parents=True, exist_ok=True)
meta.parent.mkdir(parents=True, exist_ok=True)
return trash, meta
TRASH_DIR, META_FILE = get_trash_paths()
DATEFMT = "%Y-%m-%d %H:%M"
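# On-disk layout (per user; see get_trash_paths above):
#   <base>/files/<uuid-hex>  -> the trashed file or directory itself
#   <base>/metadata.json     -> list of entries shaped like
#                               {"id": ..., "orig_path": ..., "timestamp": ...}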
def prune_old_trash():
"""Remove trash entries older than RESRM_TRASH_LIFE days (default 7)."""
try:
life_days = int(os.environ.get("RESRM_TRASH_LIFE", "7"))
except ValueError:
life_days = 7
if life_days < 1:
life_days = 1
cutoff = datetime.datetime.now() - datetime.timedelta(days=life_days)
removed = 0
for entry in list(meta): # make copy since we'll modify meta
try:
ts = datetime.datetime.fromisoformat(entry["timestamp"])
except Exception:
continue # skip malformed entries
if ts < cutoff:
f = TRASH_DIR / entry["id"]
try:
if f.exists():
if f.is_dir():
shutil.rmtree(f, ignore_errors=True)
else:
f.unlink(missing_ok=True)
meta.remove(entry)
removed += 1
except Exception as e:
print(f"Failed to prune {f}: {e}")
if removed > 0:
save_meta(meta)
print(
f"Pruned {removed} trash entr{'y' if removed == 1 else 'ies'} older than {life_days} da{'y' if life_days == 1 else 'ys'}."
)
def load_meta() -> List[Dict]:
if META_FILE.exists():
try:
with META_FILE.open("r", encoding="utf-8") as f:
return json.load(f)
except Exception:
return []
return []
def save_meta(meta: List[Dict]):
with META_FILE.open("w", encoding="utf-8") as f:
json.dump(meta, f, indent=2, ensure_ascii=False)
meta = load_meta()
def short_id(fullid: str) -> str:
return fullid[:8]
def human_time(ts: str) -> str:
"""
Convert ISO timestamp string from metadata to a human-readable format.
"""
try:
dt = datetime.datetime.fromisoformat(ts)
return dt.strftime(DATEFMT)
except Exception:
# Fallback: just return the raw string
return ts
def entry_display(entry: Dict, width: int = 80) -> str:
id8 = short_id(entry["id"])
ts = human_time(entry["timestamp"])
path = entry["orig_path"]
wrapped = textwrap.fill(path, width=width - 32)
return f"{id8:<8} {ts:<19} {wrapped}"
def list_trash():
if not meta:
print("Trash empty.")
return
header = f"{'ID':<8} {'Deleted at':<19} {'Original path'}"
print(header)
print("-" * len(header))
for entry in meta:
id8 = short_id(entry["id"])
ts = human_time(entry["timestamp"])
path = entry["orig_path"]
max_path_len = 80
if len(path) > max_path_len:
path = "" + path[-(max_path_len - 1) :]
print(f"{id8:<8} {ts:<19} {path}")
def find_candidates(identifier: str) -> List[Dict]:
# exact basename match first
exact = [m for m in meta if Path(m["orig_path"]).name == identifier]
if exact:
return exact
# then id prefix match
id_matches = [m for m in meta if m["id"].startswith(identifier)]
if id_matches:
return id_matches
return []
def restore_many(identifiers: List[str]):
"""Restore multiple files, prompting when needed."""
for identifier in identifiers:
candidates = find_candidates(identifier)
if not candidates:
print(f"No match found for '{identifier}'")
continue
# Only one match - restore immediately
if len(candidates) == 1:
restore_one(candidates[0])
continue
# Multiple matches - prompt user
print(f"Multiple matches for '{identifier}':")
for i, entry in enumerate(candidates, start=1):
print(
f"{i}) {short_id(entry['id'])} {entry['orig_path']} ({entry['timestamp']})"
)
try:
choice = input("Choose number to restore (or skip): ").strip()
except KeyboardInterrupt:
print("\nAborted.")
return
if not choice.isdigit():
print("Skipped.")
continue
idx = int(choice) - 1
if 0 <= idx < len(candidates):
restore_one(candidates[idx])
else:
print("Invalid selection. Skipped.")
def restore_one(entry: Dict) -> bool:
src = TRASH_DIR / entry["id"]
dest = Path(entry["orig_path"])
# If dest exists, restore to current dir with original basename
if dest.exists():
dest = Path.cwd() / dest.name
dest.parent.mkdir(parents=True, exist_ok=True)
try:
shutil.move(str(src), str(dest))
except Exception as e:
print(f"Failed to restore: {e}")
return False
# remove meta entry
try:
meta.remove(entry)
save_meta(meta)
except ValueError:
pass
print(f"Restored to: {dest}")
return True
def restore(identifier: str):
candidates = find_candidates(identifier)
if not candidates:
print(f"No match found for '{identifier}'")
return
if len(candidates) == 1:
restore_one(candidates[0])
return
# multiple candidates -> show list and ask
print("Multiple matches:")
for i, e in enumerate(candidates, start=1):
print(
f"{i}) {short_id(e['id'])} {e['orig_path']} ({e['timestamp']})"
)
try:
choice = input("Choose number to restore (or abort): ").strip()
except KeyboardInterrupt:
print("\nAborted.")
return
if not choice.isdigit():
print("Aborted.")
return
idx = int(choice) - 1
if idx < 0 or idx >= len(candidates):
print("Invalid selection.")
return
restore_one(candidates[idx])
def empty_trash():
"""Permanently remove all trashed files and clear metadata."""
# Remove everything inside the trash directory
count = 0
for item in TRASH_DIR.iterdir():
try:
if item.is_dir():
shutil.rmtree(item, ignore_errors=True)
else:
item.unlink(missing_ok=True)
count += 1
except Exception as e:
print(f"Failed to remove {item}: {e}")
# Clear metadata
meta.clear()
save_meta(meta)
print(f"Trash emptied ({count} entries removed).")
def move_to_trash(
path: Path, interactive: bool, force: bool, skip_trash: bool
):
if not path.exists():
if force:
return
print(f"resrm: cannot remove '{path}': No such file or directory")
return
# Interactive prompt
if interactive and not force:
try:
yn = input(f"remove '{path}'? [y/N] ").strip().lower()
except KeyboardInterrupt:
print()
return
if yn != "y":
return
# Permanent delete path
if skip_trash:
try:
if path.is_dir() and not path.is_symlink():
shutil.rmtree(path)
else:
path.unlink()
except Exception as e:
print(f"Failed permanent delete: {e}")
return
# Prevent non-root user deleting root-owned files
try:
st = path.stat()
if st.st_uid == 0 and os.geteuid() != 0:
print(
f"resrm: permission denied: '{path}' (root-owned file, try sudo)"
)
return
except Exception:
pass
# Detect which trash to use (based on file owner)
try:
import pwd
owner_uid = path.stat().st_uid
owner_info = pwd.getpwuid(owner_uid)
owner_home = Path(owner_info.pw_dir)
trash_base = owner_home / ".local" / "share" / "resrm"
except Exception:
# fallback if we can't resolve owner
trash_base = TRASH_DIR.parent
trash_dir = trash_base / "files"
meta_file = trash_base / "metadata.json"
trash_dir.mkdir(parents=True, exist_ok=True)
meta_file.parent.mkdir(parents=True, exist_ok=True)
# Try to move file
uid = uuid.uuid4().hex
dest = trash_dir / uid
try:
shutil.move(str(path), str(dest))
except Exception as e:
print(f"Failed to move to trash: {e}")
return
# Update metadata (per-owner)
try:
if meta_file.exists():
with meta_file.open("r", encoding="utf-8") as f:
owner_meta = json.load(f)
else:
owner_meta = []
except Exception:
owner_meta = []
entry = {
"id": uid,
"orig_path": str(path.resolve()),
"timestamp": datetime.datetime.now().isoformat(),
}
owner_meta.append(entry)
with meta_file.open("w", encoding="utf-8") as f:
json.dump(owner_meta, f, indent=2, ensure_ascii=False)
print(f"Removed '{path}' -> trash id {short_id(uid)}")
def inspect_entry(identifier: str):
"""Show full information about trash entries matching the identifier."""
candidates = find_candidates(identifier)
if not candidates:
print(f"No match found for '{identifier}'")
return
for entry in candidates:
# Validate entry structure
if not isinstance(entry, dict):
print(f"Invalid metadata entry (not a dict): {entry!r}")
print()
continue
entry_id = entry.get("id")
orig_path = entry.get("orig_path", "?")
timestamp = entry.get("timestamp", "?")
if not entry_id:
print(f"Invalid metadata entry (missing id): {entry}")
continue
trash_path = TRASH_DIR / entry_id
print(f"ID: {short_id(entry_id)}")
print(f"Original: {orig_path}")
print(f"Deleted at: {human_time(timestamp)}")
print(f"Stored at: {trash_path}")
try:
st = trash_path.lstat() # preserves symlink info
import stat, pwd, grp
# Type detection
if stat.S_ISDIR(st.st_mode):
ftype = "directory"
elif stat.S_ISLNK(st.st_mode):
try:
target = os.readlink(trash_path)
ftype = f"symlink → {target}"
except Exception:
ftype = "symlink"
else:
ftype = "file"
# Permissions
perms = stat.filemode(st.st_mode)
# Ownership
try:
user = pwd.getpwuid(st.st_uid).pw_name
except Exception:
user = st.st_uid
try:
group = grp.getgrgid(st.st_gid).gr_name
except Exception:
group = st.st_gid
owner = f"{user}:{group}"
# Size in bytes (st_size; for directories this is not a recursive total)
size = st.st_size
print(f"Type: {ftype}")
print(f"Size: {size} bytes")
print(f"Permissions: {perms}")
print(f"Ownership: {owner}")
except Exception as e:
print(f"Unknown stats for {e}")
def main(argv: Optional[List[str]] = None):
if argv is None:
argv = sys.argv[1:]
prune_old_trash()
parser = argparse.ArgumentParser(add_help=False)
parser.add_argument("paths", nargs="*", help="files to remove")
parser.add_argument("-r", action="store_true", help="recursive")
parser.add_argument("-f", "--force", action="store_true", help="force")
parser.add_argument("-i", action="store_true", help="interactive")
parser.add_argument(
"--skip-trash", action="store_true", help="permanent delete"
)
inspect_arg = parser.add_argument(
"--inspect",
"-I",
nargs="+",
metavar="item",
help="show full metadata and original path for this trash entry",
)
restore_arg = parser.add_argument(
"--restore",
nargs="+",
metavar="item",
help="restore by id or basename",
)
# completer
def id_name_completer(prefix, parsed_args, **kwargs):
return [
short_id(m["id"])
for m in meta
if short_id(m["id"]).startswith(prefix)
] + [
Path(m["orig_path"]).name
for m in meta
if Path(m["orig_path"]).name.startswith(prefix)
]
restore_arg.completer = id_name_completer
inspect_arg.completer = id_name_completer
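# These completers only take effect once argcomplete is hooked into the shell,
# e.g. via `eval "$(register-python-argcomplete resrm)"` in ~/.bashrc
# (see the "TAB completion" section of the README).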
parser.add_argument("-l", "--list", action="store_true", help="list trash")
parser.add_argument(
"--empty", action="store_true", help="empty the trash permanently"
)
parser.add_argument("-h", "--help", action="store_true", help="show help")
parser.add_argument(
"-V", "--version", action="version", version=f"resrm {get_version()}"
)
argcomplete.autocomplete(parser)
args = parser.parse_args(argv)
# Always print docstring if -h or --help
if args.help:
print(__doc__)
return
if not args.paths and not (
args.list or args.empty or args.restore or args.inspect
):
print("resrm: missing operand")
print("Try 'resrm --help' for more information.")
return
if args.list:
list_trash()
return
if args.inspect:
for item in args.inspect:
inspect_entry(item)
return
if args.empty:
empty_trash()
return
if args.restore:
restore_many(args.restore)
return
if not args.paths:
parser.print_help()
return
# Process removals
for p in args.paths:
pth = Path(p)
# simplistic recursive handling: if -r not given and it's a directory, mimic rm behavior: error unless -r
if pth.is_dir() and not args.r:
if args.force:
continue
print(f"resrm: cannot remove '{pth}': Is a directory")
continue
move_to_trash(
pth,
interactive=args.i,
force=args.force,
skip_trash=args.skip_trash,
)