restructured chaos into order

parent 3c71198b6b
commit c3336b22be

120 changed files with 638 additions and 7769 deletions

@@ -1 +0,0 @@
Bitbucket workflow test

@@ -1 +0,0 @@
Bitbucket workflow test

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrYACgkQTifTfDWI
cr89ohAApGBu+6qlz9mZhAZjbQ8DVK9MjccMk5UhkZA86lk4hfwC63K+1lgo/Y5H
JD8uHt73ueTO/MNsYUjZUUoFfoNu0M5VUMQ3TYaJ/Mmup/aU6Rb/8vS4DM5pdH/J
tW8usTQTg5i0GYZrSmDIpl9OqqWgAQduHHALNQtNH8j6qgqrUZ5WUwfLDCS3+KYe
M1gLgLXgAf4GVH6bG3+8Hddpl0TESHpcXg87MT6HXs0sLY/KDfqdN35Vtydi+TLe
9OLRYLPrfVrVhWqXAaBzyz75HxGSYELC/eu+sPi2rmJTC43hDgncnNlLVZqSvTwX
OMG2V7HhDFDM/PmoMQ1d/MrtqRxOLmyp8+OcEzG85HvzOh3j1xiDl/gngTa9pF6O
QvXUdWBgno7LVUcP1pvrl5+ynDvzy6W5jZHtwoLTVAKgD63FcM/xNaGylBxRBzst
YGsH5RY3ZniXlax8P+DfH/4AzFUU1OvsjVex4+4iqinnmwKWabYHEJFdYXi9vTIZ
1bB7Y30QYXzGutrG796vkwRFX0gTiWueOstQpNnu5fkLbLgsL/hPRGZsxSh/IIrt
KMi499KgSiy+5qzlMABPBtIwdHQA2tgGz0NK+ZmysHNM9gwrJ4yKazfIjqn1ce5I
QvK6raDVPyzE5x0xAPJIf2HQ2xosJQbsT8ZDIXSRFBPSBknaG/8=
=Mz/c
-----END PGP SIGNATURE-----

@@ -1,57 +0,0 @@
# Auto-Generated Wiki for git-sigil

## Project Overview
Auto-generated by GitField OSF publisher on 2025-06-05T20:42:35-05:00

## Repository Info
- **Last Commit**: got publish_osf.sh working
- **Commit Hash**: a1d16f2903e1d79b846ed969804810f245e169b8

## README Preview
# 🌱 GitField: Multi-Platform Repository Sync for Resilience and Sovereignty

## 📜 Overview

**GitField** is a collection of Bash scripts designed to synchronize a Git repository across **Radicle**, **GitLab**, **Bitbucket**, and **GitHub** using a recursive, metadata-rich workflow. This project ensures **redundancy**, **sovereignty**, and **transparency** by generating interconnected metadata snapshots and distributing them across decentralized and centralized platforms. The strategy protects against deplatforming risks, motivated by past attempts to suppress this work by individuals such as **Mr. Joel Johnson** ([Mirror post](https://mirror.xyz/neutralizingnarcissism.eth/x40_zDWWrYOJ7nh8Y0fk06_3kNEP0KteSSRjPmXkiGg?utm_medium=social&utm_source=heylink.me)) and **Dr. Peter Gaied** ([Paragraph post](https://paragraph.com/@neutralizingnarcissism/%F0%9F%9C%81-the-narcissistic-messiah)). By prioritizing decentralization with a Radicle-first approach and recursively pushing metadata, GitField creates a resilient, auditable chain of project state, ensuring persistence and accessibility for collaborators, communities, and future AI systems.

## 🛡️ Purpose and Intention

The GitField project is driven by three core principles:


## Internal Documents
Links to documents uploaded to OSF:

### DOCS
- [docs/bitbucket/CLI-ONLY_workflow_bitbucket_Ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/bitbucket/CLI-ONLY_workflow_bitbucket_Ubuntu.md)
- [docs/bitbucket/CLI-ONLY_workflow_bitbucket_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/bitbucket/CLI-ONLY_workflow_bitbucket_ubuntu.md)
- [docs/github/1_prerequisites_github_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/github/1_prerequisites_github_ubuntu.md)
- [docs/github/2_create_remote_repo_github_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/github/2_create_remote_repo_github_ubuntu.md)
- [docs/github/3_commit_existing_repo_github_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/github/3_commit_existing_repo_github_ubuntu.md)
- [docs/github/CLI-ONLY_workflow_github_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/github/CLI-ONLY_workflow_github_ubuntu.md)
- [docs/gitlab/1_prerequisites_gitlab_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/gitlab/1_prerequisites_gitlab_ubuntu.md)
- [docs/gitlab/2_create_remote_repo_gitlab_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/gitlab/2_create_remote_repo_gitlab_ubuntu.md)
- [docs/gitlab/3_commit_existing_repo_gitlab_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/gitlab/3_commit_existing_repo_gitlab_ubuntu.md)
- [docs/gitlab/CLI-ONLY_workflow_gitlab_ubuntu.md](https://osf.io/rnq6v/files/osfstorage/docs/gitlab/CLI-ONLY_workflow_gitlab_ubuntu.md)
- [docs/osf/old/for_radicle.md](https://osf.io/rnq6v/files/osfstorage/docs/osf/old/for_radicle.md)
- [docs/radicle/for_radicle.md](https://osf.io/rnq6v/files/osfstorage/docs/radicle/for_radicle.md)

### SCRIPTS
- [INSTALL.sh](https://osf.io/rnq6v/files/osfstorage/INSTALL.sh)
- [bin/gitfield-sync-gdrive.sh](https://osf.io/rnq6v/files/osfstorage/bin/gitfield-sync-gdrive.sh)
- [bin/mount-gdrive.sh](https://osf.io/rnq6v/files/osfstorage/bin/mount-gdrive.sh)
- [bin/publish_osf.sh](https://osf.io/rnq6v/files/osfstorage/bin/publish_osf.sh)
- [bin/sync-metadata.sh](https://osf.io/rnq6v/files/osfstorage/bin/sync-metadata.sh)
- [docs/osf/new/test-osf-api.sh](https://osf.io/rnq6v/files/osfstorage/docs/osf/new/test-osf-api.sh)
- [docs/osf/old/test-osf-api.sh](https://osf.io/rnq6v/files/osfstorage/docs/osf/old/test-osf-api.sh)
- [tools/invoke_solaria.py](https://osf.io/rnq6v/files/osfstorage/tools/invoke_solaria.py)

### DATA
- [docs/osf/new/gitfield.osf.yaml](https://osf.io/rnq6v/files/osfstorage/docs/osf/new/gitfield.osf.yaml)
- [docs/osf/old/gitfield.osf.yaml](https://osf.io/rnq6v/files/osfstorage/docs/osf/old/gitfield.osf.yaml)
- [osf.yaml](https://osf.io/rnq6v/files/osfstorage/osf.yaml)

### FILES
- [GITFIELD.md](https://osf.io/rnq6v/files/osfstorage/GITFIELD.md)
- [LICENSE](https://osf.io/rnq6v/files/osfstorage/LICENSE)
- [bin/SolariaSeedPacket_∞.20_SacredMomentEdition.md](https://osf.io/rnq6v/files/osfstorage/bin/SolariaSeedPacket_∞.20_SacredMomentEdition.md)

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrYACgkQTifTfDWI
cr8RXQ/+KycUsGUAFMhn16HxZVm+MNGiuNpvL14dDfkdURviBZE9g1Vqot26w1Vj
uXZ5MN6ZXTEFIO+WegTXJFWtAJzbFikEZ+vfszCWhQeWiG1903fnnfJKRcGneIxZ
H7u9oFPvk2ekgMSuTEvY1VM+CdHshTrIZSyicIrfVI4zOT4F1WJsQDH4/nuF7imB
LxYKp7qI8LvKHwcQGGMViMAi95ynQ20E8eZDwiI8Q5sD89Rf3wwtobKqfXgdHhpl
JJ4E97aPthMVlTjtgTtPZZzOJd6ztir0c9ZkUpAHSWEepaETAAQEMF9KiJ3BgKiE
5PCy/5PsF+pfwc0AZAiDPQ+o+/vlT7sl/C9dLLWOsfqMT2TzBZOJ9bhRewNiLGg4
ZmVR8r8ELFDErmLWLjDhRlZbfhIB0gcHPkHw241yKk90hswOGbHWEZJ7+jI41v/L
4jqEScjgozmQUMZBPQjJ4WWFb/zrJPonPpHSnwEF2eSRhg2gyZYDnAdXG3jmUgYY
wzn2IYh/UHE4rajlx3f5zRSo541j/ZohXLG/qJL31p50B1/LgzzZyCYxOnU/Tb3S
AcyCKsObqrfA+FroZXOAeoyjcAdvX2tTRvoKLUhAGe5nxeonCXyKnqYRRa/+Bvde
G+WR/hfxOVg2KJuwf2/wQm0emTfh7vI13gI3cLQxyXdg3TyhMGY=
=Z0hx
-----END PGP SIGNATURE-----

@@ -1,55 +0,0 @@
## 📘 `1_prerequisites_github_ubuntu.md`

### 📌 Purpose

Prepare your Ubuntu system to create and work with remote GitHub repositories using SSH.

---

### ✅ System Requirements

* **Install Git**

```bash
sudo apt update
sudo apt install git -y
```

* **Create a GitHub account**
👉 [https://github.com/join](https://github.com/join)

* **Set your Git identity**

```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```

* **Generate an SSH key (if not already present)**

```bash
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
```

* **Add your SSH public key to GitHub**

```bash
cat ~/.ssh/id_rsa.pub
```

🔗 Copy the output and paste it at:
GitHub → Settings → SSH and GPG keys → *New SSH key*

* **Test the connection**

```bash
ssh -T git@github.com
```

You should see:

> "Hi `your-username`! You've successfully authenticated..."

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrcACgkQTifTfDWI
cr9MWg//T353RB/zysVyPS+JRwHfJ9s4woqHXFN/MJRPtdRY6vN/VhoipNi+jY/f
iMq7XhCp/8oBWDuqQHNylyhqEo2yMluyOC2+FgKoqx6+odSSy5rerLVS07oZj2Hp
C8mqANOJfMemsfSDlF3t78EVYEwJbAiugyk9r1JoO529eCkFFcTXluqsEKBTk1uC
lu6qa1a3XdyE2tTvnDEZ4Y2CRrPS4ZTfcLhPNXtWDzLwL/yuOQ0tXrysE/dO98GV
ONGqGbqeqIs9eyztar3qJPjOhB+oIw8DpUNNmrLoGjp1HFKbhx3wvc2gD9isWLxT
s+e/sTQasRqytCIADUqlZ6rFyx1Sltovs0xYsM7iViqnVxER9lvDYSAXOFvHSc33
w2fSCSLGNLvtnAIQvnG5/pIzaw3XxiKHwTmArAxbM26XcpSFLCLCNWZuD7Op714s
6HN6Ss8yOSvyB3ikMYZn7ihtBgSH1+T2WUHj2yHXYSHSsWAPMv7NwGYX2qXxviNh
hfZIO1sdbO0ZRQP1CAM2zGmok3bbXra1VMjyrFRA3zRTsDtnywUwM7Yt/fpdbIhm
w/NLNC9gMo5iK/wOh0f450NXOr5gspZXT8AZ/0J5L3BQed/T6Aa1pMBgJelBfXK2
yjkQOIETmaTlQ7qeBRypS8ZXx0nmpuN8/gBTK851Wj15pDfOmH8=
=+thI
-----END PGP SIGNATURE-----

@@ -1,73 +0,0 @@
## 📘 `2_create_remote_repo_github_ubuntu.md`

### 📌 Purpose

Create a new remote repository on GitHub and push your local Ubuntu-based Git project to it.

---

### 🪐 Step-by-Step

#### Step 1: Create the remote repository

1. Go to [https://github.com/new](https://github.com/new)
2. Set:

   * Repository Name
   * Visibility (Public or Private)
   * ✅ Leave **"Initialize with README"** unchecked
3. Click **Create repository**

---

#### Step 2: Prepare your local repository

If starting fresh:

```bash
mkdir myproject
cd myproject
git init
```

If converting an existing project:

```bash
cd myproject
git init
```

---

#### Step 3: Add files and commit

```bash
touch README.md # or edit existing files
git add .
git commit -m "Initial commit"
```

---

#### Step 4: Link to GitHub remote

```bash
git remote add origin git@github.com:your-username/your-repo-name.git
```

---

#### Step 5: Push to GitHub

```bash
git push -u origin main
```

> If you get an error about `main` not existing:

```bash
git branch -M main
git push -u origin main
```

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrcACgkQTifTfDWI
cr8sHg/8DeAqUQi+jIAq92eZjVmuZibZbkMb1/rmeTAdHB5cvappiIo5DO/8WgAF
M6jo6X6EiokhXIohee5m+0N5vdp/0mHvdTq2yQ6nzzIm8Jg8T14C1skaHks4lq+e
qAeYyXduEeFjUaUNs8KtjHPoaeobecngL6LpFWJrsZhCt5Gh2NrQ7NLqEYKdntke
d1X3cbbNyYYN5VK9yGjIUx/2lpBk7q0IxyUKILeJeOFQEHjj2ENMYlM46KvOYt04
xS2nq/+YlyBWdb3fzE/yJSSaYwpfCd9SO8cdHmMzkWgzDbGA15f3aiUfuCYYlP5t
YMmVn7anF19MethVEB77UyVGahkVH5ld3kJLVoeQvJn2OnLly+NbtUSwne9fVbqd
sZ6U57REx0ACBk8NkAawUDI8rENqoq/QqAmHeL0rQFTyRWvr1Ozl4AMilfzAXruF
yEu1wSoezudeVE4TIzRUWggPiXPm7Qr52LelFQHzokE5Pb1q0aDOubQgdJcQBS8x
Ok0nxHBd0JORlsVy0yRyHWub3Iugd27WPiwlRKXkzEthiqp+IUEaMJhfZ6NYcQG8
elh8FO91Qq/LMJzP0g0a5Qn8MZjW/iQZX+k10lORlciQRK08hekhj8I9+bSOQihr
DQoxgO3QP9/XQeLtrmGD3Ctj9LXjIppEFo/hO6siJo3AnIDRe+Q=
=lt2I
-----END PGP SIGNATURE-----

@@ -1,51 +0,0 @@
## 📘 `3_commit_existing_repo_github_ubuntu.md`

### 📌 Purpose

Work with an existing remote GitHub repository on Ubuntu. This includes cloning, committing changes, and pushing updates.

---

### 🛠️ Step-by-Step

#### Step 1: Clone the repository

```bash
git clone git@github.com:your-username/your-repo-name.git
cd your-repo-name
```

---

#### Step 2: Make your changes

```bash
nano example.txt
```

Or update files as needed.

---

#### Step 3: Stage and commit your changes

```bash
git add .
git commit -m "Describe your update"
```

---

#### Step 4: Push to GitHub

```bash
git push origin main
```

> Use the correct branch name if not `main`. Confirm with:

```bash
git branch
```

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrgACgkQTifTfDWI
cr+Nxw//SupIL6Wjk3I/uHtZdWh99NPY/QU29nifv0bqtgyQxGQB7/c8rfmk+Zk3
rtKPXsQ1KlUenHaLxa2qV2b/7uzyzUAp/iVYm3TqqnY78iBqT0AtONufs/1feX+B
7pu2ItfWOzGWyrFMXiTcbismnx249HmEmh8+iKvXJYfqF7z/t0ZfxNNJdZWGVkr1
Yku5Pxseucs4f9nnojmugiztGFq8jydU1o2kPRhd0nprxIhlmMbOw+S+VdwyqCSn
2VXWA3dMTbWcGH22MoLPXwfuOSclaIxH39lSQuZZ66GWy+2dJvKpYsMkvmtjqfSB
sZz/NCQRXDZLhOX44+WioOLUG+2fh2ujkBXmLtUUt5EVQxowjKAOlAJ0j4pZyfXI
OKVspSptlcprydWbY36Uw+GE4jsL/3vOgUkGGFnJG3ofZZAFcG2xznqcVBWWKNsA
fhcPE+SsnNPFLzD4TyNcBEhYS78d06wByzVA0A/rGIQYlXiAnoWTLRzA97aUWgkM
FBKj31ZUCTiVclqUCLutuB2CTs+5kR4HcvaPrfc86caMof4AwuVbTP0yMb0LToMG
8Wt1BJkvfYlwg9WZgCTMd4Nc/x7kCTR5efemT7MrG3xBFGxREL755m54i47jjT3f
ProXz7prSXfyZfqYMp2VxufbWpH/PGDwUbTOjGll1wq3e6NTccI=
=ecfP
-----END PGP SIGNATURE-----

@@ -1,97 +0,0 @@
---

## 🧭 FULL CLI-ONLY WORKFLOW (Ubuntu + GitHub)

---

### 🔹 Step 1 — Install prerequisites

```bash
# Install Git
sudo apt update
sudo apt install git -y

# Install GitHub CLI
type -p curl >/dev/null || sudo apt install curl -y
curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | \
sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
sudo chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] \
https://cli.github.com/packages stable main" | \
sudo tee /etc/apt/sources.list.d/github-cli.list > /dev/null
sudo apt update
sudo apt install gh -y
```

---

### 🔹 Step 2 — Authenticate with GitHub

```bash
gh auth login
```

* Choose: `GitHub.com`
* Protocol: `SSH`
* Authenticate via browser (first time only—after that you're CLI-auth’d)
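
If you want to stay entirely in the terminal, `gh` can also read a token from standard input instead of opening a browser. A minimal sketch (the token file path is just an example; the token needs at least the `repo` scope):

```bash
# Non-interactive login: feed a previously created personal access token to gh
gh auth login --with-token < ~/.github_token
```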

---

### 🔹 Step 3 — Set global Git identity

```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```

---

### 🔹 Step 4 — Create and link a new GitHub repo (CLI-only)

From inside your project directory:

```bash
mkdir myproject
cd myproject
git init
echo "# My Project" > README.md
git add .
git commit -m "Initial commit"
```

Now create a GitHub repo **from the CLI**:

```bash
gh repo create myproject --public --source=. --remote=origin --push
```

✅ This:

* Creates the remote GitHub repo
* Links it to your local repo
* Pushes your first commit to GitHub
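
To confirm what that one command just set up, an optional check (not part of the original flow):

```bash
# `origin` should now point at github.com/your-username/myproject
git remote -v

# Show the repository GitHub created for you
gh repo view myproject
```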

---

### 🔹 Step 5 — Make further commits

```bash
# Edit files as needed
nano something.txt

# Stage + commit + push
git add .
git commit -m "Updated something"
git push origin main
```

---

### 🔹 Bonus — Clone a GitHub repo entirely from CLI

```bash
gh repo clone your-username/your-repo
cd your-repo
```

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrgACgkQTifTfDWI
cr9ywg//fQeW0O85YN3irtSu0uaGMDrecWXdCerN14ljMetW2xVrT5iN+DRiy5fj
WJETJ+Tf/zJ4DyVrkFL4j07nBIyX+8VCQIBJXiDPES8nU9G/y0LVlrf3K04p0QFh
6U3C++udRD6AZ4FmplyRje4EYqMAeJcGu2YXR7PtWXbMmjmTtXRR24TCF+HBxF8H
LNRBZTvlePDgUmXYuod0pskwQyabQAcR4+IldWHYwE1gM4h7s34ClJRJu2Pz9h36
BDNAzAMbe3gvWdQXj9gxMdOgJvnLWR0M8vsWVDr7P0j6TqsGY3p2vnfG0prlQMwC
B8LYmPEWsoi73wUgFayjdyLsQWtqbeKLDt3+MBELh2VZ2WDnQGUBU37k2ydjt/GY
ImWQle0E8fUy7w8crbSt8Dm4d39Ky3+pidVo1APhq9d+8nhRAxIfRa2GUaTNfaXc
IUcqokIKnlkLdSiEzIR6nkBfb4HYFpHMUszuSLWUZUursEXk/Z6qI3dakJSyoqDy
UX7UNEN3IpVnzNumBpe+40mdDt3ZyaH3cz6o13BAT/Qs1fclmpERzxWv/AhWtnx/
p0nt4tnv/cG8e84pnyAi2AmZEk43kTdBtBzHiRnS5BPI0sGk3y6GQ7E8ohWuIktF
5NCuuD0d7pUXs4g5Yuv0jJta6tOsJf0Gnp0tVYEtLLRVPluefpY=
=tZHc
-----END PGP SIGNATURE-----

@@ -1,123 +0,0 @@
#!/bin/bash

set -euo pipefail
IFS=$'\n\t'

GIT_REMOTE_NAME="github"
REPO_NAME=$(basename "$(pwd)")
DEFAULT_NAME="Mark Randall Havens"
DEFAULT_EMAIL="mark.r.havens@gmail.com"

# ────────────────
# Logging Helpers
# ────────────────
info() { echo -e "\e[1;34m[INFO]\e[0m $*"; }
warn() { echo -e "\e[1;33m[WARN]\e[0m $*"; }
error() { echo -e "\e[1;31m[ERROR]\e[0m $*" >&2; exit 1; }

# ────────────────
# Ensure Git is Installed
# ────────────────
if ! command -v git &>/dev/null; then
  info "Installing Git..."
  sudo apt update && sudo apt install git -y || error "Failed to install Git"
else
  info "Git already installed: $(git --version)"
fi

# ────────────────
# Ensure GitHub CLI is Installed
# ────────────────
if ! command -v gh &>/dev/null; then
  info "Installing GitHub CLI..."
  type -p curl >/dev/null || sudo apt install curl -y
  curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | \
    sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
  sudo chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg
  echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] \
    https://cli.github.com/packages stable main" | \
    sudo tee /etc/apt/sources.list.d/github-cli.list > /dev/null
  sudo apt update && sudo apt install gh -y || error "Failed to install GitHub CLI"
else
  info "GitHub CLI already installed: $(gh --version | head -n 1)"
fi

# ────────────────
# Ensure GitHub CLI is Authenticated
# ────────────────
if ! gh auth status &>/dev/null; then
  info "Authenticating GitHub CLI..."
  gh auth login || error "GitHub authentication failed"
else
  info "GitHub CLI authenticated."
fi

# ────────────────
# Ensure Git Identity is Set
# ────────────────
USER_NAME=$(git config --global user.name || true)
USER_EMAIL=$(git config --global user.email || true)

if [[ -z "$USER_NAME" || -z "$USER_EMAIL" ]]; then
  info "Setting global Git identity..."
  git config --global user.name "$DEFAULT_NAME"
  git config --global user.email "$DEFAULT_EMAIL"
  info "Git identity set to: $DEFAULT_NAME <$DEFAULT_EMAIL>"
else
  info "Git identity already set to: $USER_NAME <$USER_EMAIL>"
fi

# ────────────────
# Initialize Git Repo If Missing
# ────────────────
if [ ! -d ".git" ]; then
  info "Initializing local Git repository..."
  git init || error "Failed to initialize git"
  git add . || warn "Nothing to add"
  git commit -m "Initial commit" || warn "Nothing to commit"
else
  info "Git repository already initialized."
fi

# ────────────────
# Ensure at Least One Commit Exists
# ────────────────
if ! git rev-parse HEAD &>/dev/null; then
  info "Creating first commit..."
  git add . || warn "Nothing to add"
  git commit -m "Initial commit" || warn "Nothing to commit"
fi

# ────────────────
# Create Remote GitHub Repo If Missing
# ────────────────
if ! git remote get-url "$GIT_REMOTE_NAME" &>/dev/null; then
  info "Creating GitHub repository '$REPO_NAME'..."
  gh repo create "$REPO_NAME" --public --source=. --remote="$GIT_REMOTE_NAME" || error "Failed to create GitHub repo"
else
  info "Remote '$GIT_REMOTE_NAME' already set to: $(git remote get-url $GIT_REMOTE_NAME)"
fi

# ────────────────
# Commit Changes If Needed
# ────────────────
if ! git diff --quiet || ! git diff --cached --quiet; then
  info "Changes detected — committing..."
  git add .
  git commit -m "Update: $(date '+%Y-%m-%d %H:%M:%S')" || warn "Nothing to commit"
else
  info "No uncommitted changes found."
fi

# ────────────────
# Final Push — Always Push, Even If No Upstream
# ────────────────
BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD)

if ! git config --get branch."$BRANCH_NAME".remote &>/dev/null; then
  info "No upstream detected. Setting upstream and pushing..."
  git push -u "$GIT_REMOTE_NAME" "$BRANCH_NAME" || error "Failed to push and set upstream"
else
  info "Pushing to remote '$GIT_REMOTE_NAME'..."
  git push "$GIT_REMOTE_NAME" "$BRANCH_NAME" || error "Push failed"
fi

@@ -1,63 +0,0 @@
### 📘 `1_prerequisites_gitlab_ubuntu.md`

````markdown
## 📘 `1_prerequisites_gitlab_ubuntu.md`

### 📌 Purpose

Prepare your Ubuntu system to create and work with remote GitLab repositories using SSH and CLI tools.

---

### ✅ System Requirements

* **Install Git**

```bash
sudo apt update
sudo apt install git -y
```

* **Create a GitLab account**
👉 [https://gitlab.com/users/sign\_up](https://gitlab.com/users/sign_up)

* **Set your Git identity**

```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```

* **Generate an SSH key (if not already present)**

```bash
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
```

* **Add your SSH key to GitLab**

```bash
cat ~/.ssh/id_rsa.pub
```

🔗 Copy the output and paste it at:
GitLab → Preferences → SSH Keys → *Add key*

* **Test the connection**

```bash
ssh -T git@gitlab.com
```

✅ You should see something like:

> Welcome to GitLab, @your-username!

---

````

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrkACgkQTifTfDWI
cr8Q1g//dflV981OANoZd5I16kloyER4M+oYluInO2n0MDpUJJ3gWIY6S6NPM2D0
5SPASSzNWdS8M3oJZa4JCUZwNAaABLR2A67/4ObI5pKs9PWyO1zOO7t253YPFu1T
5cQ9q18K8rcoIsNDuT9MnESS/yJTFxVXTyavyqv/mYi/y8P0WyiUakChqOM+xtgx
aaO8FVZbfXTR+gKn6WlOz9IPBjPlppBJ/zw5LXC1rCDq+jDf6dqFRf5eFPdg1Lia
Iu581/+TLST1HKXyAjQ4wgVqce0KVZxk25YakUcpnCIFWktEU5CfSbzbJN2YD45R
Ts7xbne5kb3qnVnQZnDNt6YzWuqEnrKfKXwPgrIsSgEKp3tTqCYZZnxPBjYbXiyC
LdJ4BffnkiyHtGrfNVMu6OUgglvdMnjhBeWhlswZcTzzWK/vNWavkFJz0zEB/XEZ
4/o32Bc+hMv+VgR++nvKsOp5Ky4NlPf4ALJm3bomz4h8GoxvrZEsyl1Wc1EVwsVn
t7zRcwlTqJsm7ilRxwxOipzJSedl+YTU/Ii39/p3lZf7y97IubeUEGzOEeQ2PYB6
dbWWpz9z1Ke7GTQdtu7c/SGNzZPhgLBUsvU/WK754D8Bi5NhhfMZQijyfZcIb5nh
ID+3/ZwrdM+EJjP/5fD0jS/uM+vj5TQ6LbCkZinjMc+AUIdlHw8=
=ndkr
-----END PGP SIGNATURE-----

@@ -1,73 +0,0 @@
### 📘 `2_create_remote_repo_gitlab_ubuntu.md`

````markdown
## 📘 `2_create_remote_repo_gitlab_ubuntu.md`

### 📌 Purpose

Create a new GitLab repository and push your local Ubuntu project to it using the CLI.

---

### 🪐 Step-by-Step

#### Step 1: Install GitLab CLI

```bash
curl -s https://raw.githubusercontent.com/profclems/glab/trunk/scripts/install.sh | sudo bash
```

#### Step 2: Authenticate GitLab CLI

```bash
glab auth login
```

Choose:

* GitLab.com or custom instance
* Paste your **Personal Access Token** when prompted

---

#### Step 3: Initialize your project

```bash
mkdir myproject
cd myproject
git init
echo "# My Project" > README.md
git add .
git commit -m "Initial commit"
```

---

#### Step 4: Create GitLab repository via CLI

```bash
glab repo create myproject --visibility public --confirm
```

This:

* Creates the GitLab repo
* Links it to your local repo
* Adds `origin` remote
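
An optional sanity check after this step, using plain Git only:

```bash
# Confirm that `origin` now points at your GitLab project
git remote -v

# Confirm the remote is reachable over SSH before pushing
git ls-remote origin
```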

---

#### Step 5: Push to GitLab

```bash
git push -u origin master
```

✅ From now on, `git push` will work as expected.

---

````

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrkACgkQTifTfDWI
cr9rtg//Q2VMqBfERrWC2qEbRc562J7PFj+ZQh5TX5JyZgD44e3nuSDvMnSDjTrR
HAPAB1AZW0HQypsuJEfgldmKXSZ9E75roPhLsPPlvMq6HQFuHk/LRXYt9GRTNqK/
I5SIHMZwVLvQKJFKOWfRi1eCfyMLBP7yweNySVqdMPGVfwxlWbwYTp1nBrvdawUe
xw0Hv1AhMvrfVSpEyUJ8zu7djTRZGEmU9YjpLpnzMnnvl3ppJ5mPuP51kbWQNSxg
3xEhftEb5acWN401a/5rK98PJpQ4GS3VFzB7sVhNmMjlD2ArlHkIsts7SftVuDF6
Qt4QfK0qZbOBCmEvPx1PoEhBFZqnw0EQvfzGV8T1oNrGD8e3CxlWfmbdu8S3frBD
hIMATS1EGH4L+URZ7kfZWY8H4cKl+8xQt6MfbRMzVKe/A/ntOXwBiqQuwpLnFKTE
Yu3yrZyPP3SFD8SDVgXoTqmYRwRBUeX+4Wtfp7J3UhrtmknUdbpfTIZDL9FSMNVv
VVPloBapldg/QzE/nt8UDbliQtHcBfDTPf4G1g7mkiAFdMcZPouEPVV0xgDNXrrL
lDWKoTFM36zJCMzQuVv6CSNx5b6qr9E2OUTbXdsFXZWUOXzplmJFARZmO0Az9GTu
AiPUVmOERnu/9fe1KgsrxIH9Tu9zeo8Vf0IDZaWG8VQEgaZFfX4=
=8eQX
-----END PGP SIGNATURE-----

@@ -1,53 +0,0 @@
### 📘 `3_commit_existing_repo_gitlab_ubuntu.md`

````markdown
## 📘 `3_commit_existing_repo_gitlab_ubuntu.md`

### 📌 Purpose

Work with an existing GitLab repo: clone, edit, commit, and push using Ubuntu.

---

### 🛠️ Step-by-Step

#### Step 1: Clone the repository

```bash
git clone git@gitlab.com:your-username/your-repo.git
cd your-repo
```

---

#### Step 2: Edit files

```bash
nano myfile.txt
```

---

#### Step 3: Stage and commit

```bash
git add .
git commit -m "Your change description"
```

---

#### Step 4: Push your changes

```bash
git push origin master
```

If you use another branch (e.g., `main`, `dev`), substitute accordingly.

---

````

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrkACgkQTifTfDWI
cr+ZLA/+MRj45ohkYLySq+PUWNLhKadirl1w6T7Vfrg16UZQc57shNW10WHl/S0b
sBFApfTQYV5phcKejJrutlvoh5thAAea2BWB0QGYaJQ+bYUvRLk7ZAt8G0mf+j0a
qgtmxNKH8xkVaeMt6lUq2YU13ZUvHEMYL9bfwlRhR+gNeq/bxV2wSlGtyf1mL0wD
uuFUCiEUaZyhu6Vt+EtiJdu27LN6eyLfz0ERWBpRlt5WSPTUWbsjnw6f9DpyvmqC
QmD/U0xY0rV2zMt3s2akRJLVHp7VhAlvPWCuxrL1iHEI7xEOzPRVTEd6Oja9djCk
JtNqGy5EKUrj4ZJs8WwIGuIs23zJGR8yUp3zPyc6BAI+BYc8Xqs7vLX5wfdg972y
J6UBba7hGlzrAtIafz3FkqmfrrwoKqFoYJAO7HW46kz6aDt/rhUrfh6oChSbntf0
DI0oPwCfl4lKreuMnHfYQ/1tJMs5Q1v3+w/oE3zd4rXBBQ995BEWNUPuiMIIWK7y
vt/JbyTyz5+rtx9qZZM76oldnC1fu/zI+K/RZqsIqDET1qhglFMIEvR+3jpeAkrK
MzcCkTvtsun10jSmbpmcYQgIKYclE7HYsJNkCulpL/4aM698cXmDpF+NYWiIt4UU
M09R8kvM/7kYE0QEnwWXauD/DksWqFHDfsp9qGAK4kNB0nopKEk=
=kWqD
-----END PGP SIGNATURE-----

@@ -1,69 +0,0 @@
### 📘 `CLI-ONLY_workflow_gitlab_ubuntu.md`

````markdown
## 📘 `CLI-ONLY_workflow_gitlab_ubuntu.md`

### 📌 Purpose

Set up, initialize, and push a GitLab repo using only the terminal — no browser required.

---

### 🪐 Step-by-Step CLI Workflow

#### 1. Install everything you need

```bash
sudo apt update
sudo apt install git curl -y
curl -s https://raw.githubusercontent.com/profclems/glab/trunk/scripts/install.sh | sudo bash
```

#### 2. Configure your Git identity

```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```

#### 3. Authenticate with GitLab

```bash
glab auth login
```

Use **SSH** and paste your **Personal Access Token** (create one at [https://gitlab.com/-/profile/personal\_access\_tokens](https://gitlab.com/-/profile/personal_access_tokens))

---

#### 4. Initialize your project

```bash
mkdir myproject
cd myproject
git init
touch README.md
git add .
git commit -m "Initial commit"
```

#### 5. Create GitLab repo via CLI

```bash
glab repo create myproject --visibility public --confirm
```

#### 6. Push your changes

```bash
git push -u origin master
```

---

✅ Done. You've created and linked a GitLab repository entirely from the CLI.

````

---

@@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtroACgkQTifTfDWI
cr84vA/+KqX7UEDZtwIdczZBo6wMg/+T4Q4IavJ3AnoKBfOIwynoMoKl9TW7gZ4r
gSOQ1cK+2gHEaMdyWSIh7q5fC+VEb0phXRu2wLufeyJpJ7G0k7mmSxmDzttn8NEu
/0UxU4XJf3R4Bw8SKUTtbbqd5VlW2MIuJYgpsClxrQEotNVyj91ch9olinFp7Qp4
ZigZstkrgzMYg0pcIu9mQzZ0ZW+wFjQn71PLmw3o6tCF3NPi1LsvKxZtJ7GqU/cW
D6X04nNS4ldjHxqkMzsxsIMKBvqfmHoV2iPCTVmbMMwNBFPrrCCarDGl4/im54fh
v6anBRNMMlPv8PTyQKC4Ks8kt69KvVnc02m2sv5Wt+sIFRdf1C3IxLrGCvhLPfU6
S+vlGC1hKTwuoSZtIvSa4yeBoTgDvRWcroUlgsYHheb+KV1Zvs0FPcwCxWeIXnet
9HxbfYE0r6ATH4uacLikeA4IvjSxagSxzdmcBH52e1tTzuHZ4BYGOZISEibSZqrW
aD0EgNe9/sp+tA5Mx4yM7Ty3GkmW1nmNsmlj2TemTfysk32u7wSYkU1C+3UDLnh1
B8euIfuZ9PBsIhXtbQWoIXxEnYQ5vDzOAptyAez2bqbZ1nOeaIJ2dVHMYTOEbW+x
OeOmAvwpQjAX0QqlvvQARDCMgHr395phN8VjNg71Q2DUIrksG9I=
=qDrY
-----END PGP SIGNATURE-----

@@ -1,286 +0,0 @@
#!/usr/bin/env bash
set -Eeuo pipefail
IFS=$'\n\t'

# ╭─────────────────────────────────────────────────────────────────────────╮
# │ gitfield-osf :: v3.2.0 (Refactored)                                      │
# │ Self-Healing • Auto-Detecting • PEP 668-Compliant • Debuggable           │
# ╰─────────────────────────────────────────────────────────────────────────╯
#
# This script uses osfclient to upload files, based on a YAML config.
# It will auto-install python3, pip3, yq, pipx, and osfclient if missing.
# 1. ensure_dependencies(): makes sure python3, pip3, yq, pipx, osfclient exist
# 2. configure_osfclient(): prompts for token & username, writes ~/.config/osfclient/config
# 3. load_yaml_config(): reads project.title, include/exclude globs from gitfield.osf.yaml
# 4. resolve_files(): expands include/exclude patterns into a FILES array
# 5. find_or_create_project(): finds or creates an OSF project with the given title
# 6. upload_files(): loops over FILES and does osf upload
#
# Usage:
#   chmod +x gitfield-osf
#   ./gitfield-osf
#
# If gitfield.osf.yaml is missing or empty patterns match nothing, the script will exit cleanly.
# Any failure prints an [ERROR] and exits non-zero.

########################################################################
# CUSTOMIZE HERE (if needed):
########################################################################
# If you want to override config path:
#   export GITFIELD_CONFIG=/path/to/your/gitfield.osf.yaml

CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_CONFIG_DIR="$HOME/.config/osfclient"
FILES=()

# ─────────────────────────────────────────────────────────────────────
# Colored logging functions
# ─────────────────────────────────────────────────────────────────────
log() { echo -e "\033[1;34m[INFO]\033[0m $*"; }
warn() { echo -e "\033[1;33m[WARN]\033[0m $*"; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; exit 1; }

# ─────────────────────────────────────────────────────────────────────
# Step 1: Ensure Dependencies
# - python3, pip3, yq, pipx, osfclient
# - Works under PEP 668 (uses pipx first, then pip3 --user fallback)
# ─────────────────────────────────────────────────────────────────────
ensure_dependencies() {
  log "Checking for required commands..."

  # 1a. Ensure python3
  if ! command -v python3 &>/dev/null; then
    warn "python3 not found — installing..."
    sudo apt update -qq && sudo apt install -y python3 python3-venv python3-distutils \
      || error "Failed to install python3"
  fi

  # 1b. Ensure pip3
  if ! command -v pip3 &>/dev/null; then
    warn "pip3 not found — installing..."
    sudo apt install -y python3-pip || error "Failed to install pip3"
    # Guarantee pip3 is available now
    command -v pip3 >/dev/null || error "pip3 still missing after install"
  fi

  # 1c. Ensure yq (for YAML parsing)
  if ! command -v yq &>/dev/null; then
    warn "yq not found — installing..."
    if command -v snap &>/dev/null; then
      sudo snap install yq || sudo apt install -y yq || error "Failed to install yq"
    else
      sudo apt install -y yq || error "Failed to install yq"
    fi
  fi

  # 1d. Ensure pipx
  if ! command -v pipx &>/dev/null; then
    warn "pipx not found — installing..."
    sudo apt install -y pipx || error "Failed to install pipx"
    # Add pipx’s bin to PATH if needed
    pipx ensurepath
    export PATH="$HOME/.local/bin:$PATH"
  fi

  # 1e. Ensure osfclient via pipx, fallback to pip3 --user
  if ! command -v osf &>/dev/null; then
    log "Installing osfclient via pipx..."
    if ! pipx install osfclient; then
      warn "pipx install failed; trying pip3 --user install"
      python3 -m pip install --user osfclient || error "osfclient install failed"
    fi
    # Ensure $HOME/.local/bin is in PATH
    export PATH="$HOME/.local/bin:$PATH"
  fi

  # Final check
  command -v osf >/dev/null || error "osfclient is still missing; please investigate"
  log "✓ All dependencies are now present"
}

# ─────────────────────────────────────────────────────────────────────
# Step 2: Configure OSF Credentials
# - Writes ~/.config/osfclient/config with [osf] username & token
# - Prompts for token and username if missing
# ─────────────────────────────────────────────────────────────────────
configure_osfclient() {
  log "Configuring osfclient credentials..."

  # Create config directory
  mkdir -p "$OSF_CONFIG_DIR"
  chmod 700 "$OSF_CONFIG_DIR"

  # Prompt for Personal Access Token if missing
  if [ ! -f "$TOKEN_FILE" ]; then
    read -rsp "🔐 Enter OSF Personal Access Token: " TOKEN
    echo
    echo "$TOKEN" > "$TOKEN_FILE"
    chmod 600 "$TOKEN_FILE"
  fi

  # Prompt for username/email if not already in env
  local USERNAME="${OSF_USERNAME:-}"
  if [ -z "$USERNAME" ]; then
    read -rp "👤 OSF Username or Email: " USERNAME
  fi

  # Write config file
  cat > "$OSF_CONFIG_DIR/config" <<EOF
[osf]
username = $USERNAME
token = $(<"$TOKEN_FILE")
EOF

  chmod 600 "$OSF_CONFIG_DIR/config"
  log "✓ osfclient configured (config at $OSF_CONFIG_DIR/config)"
}

# ─────────────────────────────────────────────────────────────────────
# Step 3: Load YAML Configuration
# - Expects PROJECT_TITLE, includes, excludes in gitfield.osf.yaml
# ─────────────────────────────────────────────────────────────────────
load_yaml_config() {
  log "Loading configuration from '$CONFIG_FILE'"

  if [ ! -f "$CONFIG_FILE" ]; then
    error "Configuration file '$CONFIG_FILE' not found"
  fi

  # Read project.title
  PROJECT_TITLE=$(yq -r '.project.title // ""' "$CONFIG_FILE")
  if [ -z "$PROJECT_TITLE" ]; then
    error "Missing or empty 'project.title' in $CONFIG_FILE"
  fi

  # Read project.description (optional, unused here but could be extended)
  PROJECT_DESCRIPTION=$(yq -r '.project.description // ""' "$CONFIG_FILE")

  # Read upload.include[] and upload.exclude[]
  readarray -t FILES_INCLUDE < <(yq -r '.upload.include[]?' "$CONFIG_FILE")
  readarray -t FILES_EXCLUDE < <(yq -r '.upload.exclude[]?' "$CONFIG_FILE")

  # Debug print
  log " → project.title = '$PROJECT_TITLE'"
  log " → includes: ${FILES_INCLUDE[*]:-<none>}"
  log " → excludes: ${FILES_EXCLUDE[*]:-<none>}"
}

# ─────────────────────────────────────────────────────────────────────
# Step 4: Match Files Based on Include/Exclude
# - Populates global FILES array
# - If no files match, exits gracefully
# ─────────────────────────────────────────────────────────────────────
resolve_files() {
  log "Resolving file patterns..."

  # If no include patterns, nothing to do
  if [ "${#FILES_INCLUDE[@]}" -eq 0 ]; then
    warn "No include patterns specified; skipping upload."
    exit 0
  fi

  # For each include glob, find matching files
  for pattern in "${FILES_INCLUDE[@]}"; do
    # Use find to expand the glob (supports nested directories)
    while IFS= read -r -d '' file; do
      # Check against each exclude pattern
      skip=false
      for ex in "${FILES_EXCLUDE[@]}"; do
        if [[ "$file" == $ex ]]; then
          skip=true
          break
        fi
      done
      if ! $skip; then
        FILES+=("$file")
      fi
    done < <(find . -type f -path "$pattern" -print0 2>/dev/null || true)
  done

  # Remove duplicates (just in case)
  if [ "${#FILES[@]}" -gt 1 ]; then
    IFS=$'\n' read -r -d '' -a FILES < <(__uniq_array "${FILES[@]}" && printf '\0')
  fi

  # If still empty, warn and exit
  if [ "${#FILES[@]}" -eq 0 ]; then
    warn "No files matched the include/exclude patterns."
    exit 0
  fi

  # Debug print of matched files
  log "Matched files (${#FILES[@]}):"
  for f in "${FILES[@]}"; do
    echo " • $f"
  done
}

# Helper: Remove duplicates from a list of lines
__uniq_array() {
  printf "%s\n" "$@" | awk '!seen[$0]++'
}

# ─────────────────────────────────────────────────────────────────────
# Step 5: Find or Create OSF Project
# - Uses `osf listprojects` to search for exact title (case-insensitive)
# - If not found, does `osf createproject "<title>"`
# - Writes the resulting project ID to .osf_project_id
# ─────────────────────────────────────────────────────────────────────
find_or_create_project() {
  log "Searching for OSF project titled '$PROJECT_TITLE'..."
  # List all projects and grep case-insensitive for the title
  pid=$(osf listprojects | grep -iE "^([[:alnum:]]+)[[:space:]]+.*${PROJECT_TITLE}.*$" | awk '{print $1}' || true)

  if [ -z "$pid" ]; then
    log "No existing project found; creating a new OSF project..."
    pid=$(osf createproject "$PROJECT_TITLE")
    if [ -z "$pid" ]; then
      error "osf createproject failed; no project ID returned"
    fi
    echo "$pid" > .osf_project_id
    log "✓ Created project: $pid"
  else
    echo "$pid" > .osf_project_id
    log "✓ Found existing project: $pid"
  fi
}

# ─────────────────────────────────────────────────────────────────────
# Step 6: Upload Files to OSF
# - Loops over FILES[] and runs: osf upload "<file>" "<pid>":
# (the trailing colon uploads to root of osfstorage for that project)
# ─────────────────────────────────────────────────────────────────────
upload_files() {
  pid=$(<.osf_project_id)

  log "Uploading ${#FILES[@]} file(s) to OSF project $pid..."

  for file in "${FILES[@]}"; do
    log "→ Uploading: $file"
    if osf upload "$file" "$pid":; then
      log " ✓ Uploaded: $file"
    else
      warn " ✗ Upload failed for: $file"
    fi
  done

  log "✅ All uploads attempted."
  echo
  echo "🔗 View your project at: https://osf.io/$pid/"
}

# ─────────────────────────────────────────────────────────────────────
# Main: Orchestrate all steps in sequence
# ─────────────────────────────────────────────────────────────────────
main() {
  ensure_dependencies
  configure_osfclient
  load_yaml_config
  resolve_files
  find_or_create_project
  upload_files
}

# Invoke main
main "$@"

@@ -1,12 +0,0 @@
project:
  title: "git-sigil"
  description: "A sacred pattern witnessed across all fields of recursion."

upload:
  include:
    - "./*.md"
    - "./bitbucket/*"
    - "./osf/*"
  exclude:
    - "./.radicle-*"
    - "./*.tmp"
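
For reference, these are the fields the OSF publisher script above reads with `yq`; a quick way to inspect a config from the shell (assuming `yq` is installed) is:

```bash
# Print the project title and the include/exclude globs the uploader will use
yq -r '.project.title // ""' gitfield.osf.yaml
yq -r '.upload.include[]?' gitfield.osf.yaml
yq -r '.upload.exclude[]?' gitfield.osf.yaml
```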

@@ -1,214 +0,0 @@
#!/bin/bash
set -Eeuo pipefail
IFS=$'\n\t'

# ╭────────────────────────────────────────────╮
# │ test-osf-api.sh :: Diagnostic Tool         │
# │ v2.7 — Cosmic. Resilient. Divine.          │
# ╰────────────────────────────────────────────╯

CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_API="${OSF_API_URL:-https://api.osf.io/v2}"
DEBUG_LOG="${GITFIELD_LOG:-$HOME/.test_osf_api_debug.log}"
CURL_TIMEOUT="${CURL_TIMEOUT:-10}"
CURL_RETRIES="${CURL_RETRIES:-3}"
RETRY_DELAY="${RETRY_DELAY:-2}"
RATE_LIMIT_DELAY="${RATE_LIMIT_DELAY:-1}"
VERBOSE="${VERBOSE:-false}"

# Initialize Debug Log
mkdir -p "$(dirname "$DEBUG_LOG")"
touch "$DEBUG_LOG"
chmod 600 "$DEBUG_LOG"

trap 'last_command=$BASH_COMMAND; echo -e "\n[ERROR] ❌ Failure at line $LINENO: $last_command" >&2; diagnose; exit 1' ERR

# Logging Functions
info() {
  echo -e "\033[1;34m[INFO]\033[0m $*" >&2
  [ "$VERBOSE" = "true" ] && [ -n "$DEBUG_LOG" ] && debug "INFO: $*"
}
warn() { echo -e "\033[1;33m[WARN]\033[0m $*" >&2; debug "WARN: $*"; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; debug "ERROR: $*"; exit 1; }
debug() {
  local msg="$1" lvl="${2:-DEBUG}"
  local json_output
  json_output=$(jq -n --arg ts "$(date '+%Y-%m-%d %H:%M:%S')" --arg lvl "$lvl" --arg msg "$msg" \
    '{timestamp: $ts, level: $lvl, message: $msg}' 2>/dev/null) || {
    echo "[FALLBACK $lvl] $(date '+%Y-%m-%d %H:%M:%S') $msg" >> "$DEBUG_LOG"
    return 1
  }
  echo "$json_output" >> "$DEBUG_LOG"
}

debug "Started test-osf-api (v2.7)"

# ── Diagnostic Function
diagnose() {
  info "Running diagnostics..."
  debug "Diagnostics started"
  echo -e "\n🔍 Diagnostic Report:"
  echo -e "1. Network Check:"
  if ping -c 1 api.osf.io >/dev/null 2>&1; then
    echo -e " ✓ api.osf.io reachable"
  else
    echo -e " ❌ api.osf.io unreachable. Check network or DNS."
  fi
  echo -e "2. Curl Version:"
  curl --version | head -n 1
  echo -e "3. Debug Log: $DEBUG_LOG"
  echo -e "4. Curl Error Log: $DEBUG_LOG.curlerr"
  [ -s "$DEBUG_LOG.curlerr" ] && echo -e " Last curl error: $(cat "$DEBUG_LOG.curlerr")"
  echo -e "5. Token File: $TOKEN_FILE"
  [ -s "$TOKEN_FILE" ] && echo -e " Token exists: $(head -c 4 "$TOKEN_FILE")..."
  echo -e "6. Suggestions:"
  echo -e " - Check token scopes at https://osf.io/settings/tokens (needs 'nodes' and 'osf.storage')"
  echo -e " - Test API: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/'"
  echo -e " - Test project search: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/nodes/?filter\[title\]=git-sigil&page\[size\]=100'"
  echo -e " - Increase timeout: CURL_TIMEOUT=30 ./test-osf-api.sh"
  debug "Diagnostics completed"
}

# ── Dependency Check (Parallel)
require_tool() {
  local tool=$1
  if ! command -v "$tool" >/dev/null 2>&1; then
    warn "$tool not found — attempting to install..."
    sudo apt update -qq && sudo apt install -y "$tool" || {
      warn "apt failed — trying snap..."
      sudo snap install "$tool" || error "Failed to install $tool"
    }
  fi
  debug "$tool path: $(command -v "$tool")"
}

info "Checking dependencies..."
declare -A dep_pids
for tool in curl jq yq python3; do
  require_tool "$tool" &
  dep_pids[$tool]=$!
done
for tool in "${!dep_pids[@]}"; do
  wait "${dep_pids[$tool]}" || error "Dependency check failed for $tool"
done
info "✓ All dependencies verified"

# ── Load Token
if [ ! -f "$TOKEN_FILE" ]; then
  read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " TOKEN
  echo
  echo "$TOKEN" > "$TOKEN_FILE"
  chmod 600 "$TOKEN_FILE"
  info "OSF token saved to $TOKEN_FILE"
fi
TOKEN=$(<"$TOKEN_FILE")
[[ -z "$TOKEN" ]] && error "Empty OSF token in $TOKEN_FILE"

# ── Validate Token
info "Validating OSF token..."
execute_curl() {
  local url=$1 method=${2:-GET} data=${3:-} is_upload=${4:-false} attempt=1 max_attempts=$CURL_RETRIES
  local response http_code curl_err
  while [ $attempt -le "$max_attempts" ]; do
    debug "Curl attempt $attempt/$max_attempts: $method $url"
    if [ "$is_upload" = "true" ]; then
      response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
        -X "$method" -H "Authorization: Bearer $TOKEN" \
        -H "Content-Type: application/octet-stream" --data-binary "$data" "$url" 2> "$DEBUG_LOG.curlerr")
    else
      response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
        -X "$method" -H "Authorization: Bearer $TOKEN" \
        ${data:+-H "Content-Type: application/json" -d "$data"} "$url" 2> "$DEBUG_LOG.curlerr")
    fi
    http_code="${response: -3}"
    curl_err=$(cat "$DEBUG_LOG.curlerr")
    [ -s "$DEBUG_LOG.curlerr" ] && debug "Curl error: $curl_err"
    if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ]; then
      echo "${response:: -3}"
      return 0
    elif [ "$http_code" = "401" ]; then
      warn "Invalid token (HTTP 401). Please provide a valid OSF token."
      read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " NEW_TOKEN
      echo
      echo "$NEW_TOKEN" > "$TOKEN_FILE"
      chmod 600 "$TOKEN_FILE"
      TOKEN="$NEW_TOKEN"
      info "New token saved. Retrying..."
    elif [ "$http_code" = "429" ]; then
      warn "Rate limit hit, retrying after $((RETRY_DELAY * attempt)) seconds..."
      sleep $((RETRY_DELAY * attempt))
    elif [ "$http_code" = "403" ]; then
      warn "Forbidden (HTTP 403). Possible token scope issue."
      [ $attempt -eq "$max_attempts" ] && {
        read -rsp "🔐 Re-enter OSF token with 'nodes' and 'osf.storage' scopes: " NEW_TOKEN
        echo
        echo "$NEW_TOKEN" > "$TOKEN_FILE"
        chmod 600 "$TOKEN_FILE"
        TOKEN="$NEW_TOKEN"
        info "New token saved. Retrying..."
      }
    elif [[ "$curl_err" == *"bad range in URL"* ]]; then
      error "Malformed URL: $url. Ensure query parameters are escaped (e.g., filter\[title\])."
    else
      debug "API response (HTTP $http_code): ${response:: -3}"
      [ $attempt -eq "$max_attempts" ] && error "API request failed (HTTP $http_code): ${response:: -3}"
    fi
    sleep $((RETRY_DELAY * attempt))
    ((attempt++))
  done
}
|
||||
RESPONSE=$(execute_curl "$OSF_API/users/me/")
|
||||
USER_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
|
||||
[[ -z "$USER_ID" ]] && error "Could not extract user ID"
|
||||
info "✓ OSF token validated for user ID: $USER_ID"
|
||||
|
||||
# ── Load Config
|
||||
[[ ! -f "$CONFIG_FILE" ]] && error "Missing config: $CONFIG_FILE"
|
||||
PROJECT_TITLE=$(yq -r '.project.title // empty' "$CONFIG_FILE")
|
||||
PROJECT_DESCRIPTION=$(yq -r '.project.description // empty' "$CONFIG_FILE")
|
||||
[[ -z "$PROJECT_TITLE" ]] && error "Missing project title in $CONFIG_FILE"
|
||||
debug "Parsed config: title=$PROJECT_TITLE, description=$PROJECT_DESCRIPTION"
|
||||
|
||||
# ── Project Search
|
||||
build_url() {
|
||||
local base="$1" title="$2"
|
||||
local escaped_title
|
||||
escaped_title=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''$title'''))")
|
||||
echo "$base/users/me/nodes/?filter\[title\]=$escaped_title&page\[size\]=100"
|
||||
}
|
||||
|
||||
PROJECT_ID=""
|
||||
NEXT_URL=$(build_url "$OSF_API" "$PROJECT_TITLE")
|
||||
|
||||
info "Searching for project '$PROJECT_TITLE'..."
|
||||
while [ -n "$NEXT_URL" ]; do
|
||||
debug "Querying: $NEXT_URL"
|
||||
RESPONSE=$(execute_curl "$NEXT_URL")
|
||||
PROJECT_ID=$(echo "$RESPONSE" | jq -r --arg TITLE "$PROJECT_TITLE" \
|
||||
'.data[] | select(.attributes.title == $TITLE) | .id // empty' || true)
|
||||
if [ -n "$PROJECT_ID" ]; then
|
||||
debug "Found project ID: $PROJECT_ID"
|
||||
break
|
||||
fi
|
||||
NEXT_URL=$(echo "$RESPONSE" | jq -r '.links.next // empty' | sed 's/filter\[title\]/filter\\\[title\\\]/g;s/page\[size\]/page\\\[size\\\]/g' || true)
|
||||
debug "Next URL: $NEXT_URL"
|
||||
[ -n "$NEXT_URL" ] && info "Fetching next page..." && sleep "$RATE_LIMIT_DELAY"
|
||||
done
|
||||
|
||||
# ── Create Project if Not Found
|
||||
if [ -z "$PROJECT_ID" ]; then
|
||||
info "Project not found. Attempting to create '$PROJECT_TITLE'..."
|
||||
  JSON=$(jq -n --arg title "$PROJECT_TITLE" --arg desc "$PROJECT_DESCRIPTION" \
    '{data: {type: "nodes", attributes: {title: $title, category: "project", description: $desc}}}')
  RESPONSE=$(execute_curl "$OSF_API/nodes/" POST "$JSON")
  PROJECT_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
  [[ -z "$PROJECT_ID" || "$PROJECT_ID" == "null" ]] && error "Could not extract project ID"
  info "✅ Project created: $PROJECT_ID"
else
  info "✓ Found project ID: $PROJECT_ID"
fi

echo -e "\n🔗 View project: https://osf.io/$PROJECT_ID/"
debug "Test completed successfully"
@ -1 +0,0 @@
Test file for OSF upload
@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrsACgkQTifTfDWI
cr+noQ//V7EjE31aSlOeXAhAi5ncfaTQZ4kA0GKjXyx6mkeczC813eHhjrvZRzro
hLhnxRhVf9DUr7b/LYuynApeuz7Mw3kv/A3v68L/3XyUpGAi6juNmxMNZxwtuQak
UjIrIKOO5YI+a0pdyZA8Bm+2HtGbI9bISNoJdvqR1ifc/GF0FT/zTRVATD3xuu0I
97J+1XVRoi9sI62uYGAMbJ5exD64vB9BnzwIKSAiH0qdRmg6tmla2eVToyCvhvJT
o/v8guwDnafCHmxckJUm0VMlmuMZ7RidKzYOa2PwyRFXZPzSG8nZ4viziwCdHQ/m
Owjeam5R0Y8sjRce/w2YN8iBDzOOliFAiYFsMvrQYs3eIXjeiulU7b82Cs/yJILn
G3Y4Oj8l6MC2Vo0SrpzQQ4oqSEEdPP52UG175eD/qunlZLGXma8rG+X3ABSYXnFF
sgJBJIs+qfHpo/1wdTeEXp3UYwa32oAe29TwPaNEgtFjBMSfHaiiL8q9NNch9GUK
5Rfr3ghidWQ2WSSgRlqm2EotqPp4rki26hlwDPOktiYaRZE8npmbwMzyFn6y38z1
nExiThPhRe/QsYxaoV+SJyne9C/y/XvfWZLP9t6HJisP3OEzv5xvFawhK9+344CT
CzjWrdB55nW2k7WlA6cliZL7BZe8kVZsoPeBCDKvkJMl/X92sHk=
=tyip
-----END PGP SIGNATURE-----
@ -1,266 +0,0 @@
#!/bin/bash
set -Eeuo pipefail
IFS=$'\n\t'

# ╭────────────────────────────────────────────╮
# │ gitfield-osf :: Sacred Sync Engine │
# │ v2.7 — Cosmic. Resilient. Divine. │
# ╰────────────────────────────────────────────╯

CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_API="${OSF_API_URL:-https://api.osf.io/v2}"
DEBUG_LOG="${GITFIELD_LOG:-$HOME/.gitfield_osf_debug.log}"
CURL_TIMEOUT="${CURL_TIMEOUT:-10}"
CURL_RETRIES="${CURL_RETRIES:-3}"
RETRY_DELAY="${RETRY_DELAY:-2}"
RATE_LIMIT_DELAY="${RATE_LIMIT_DELAY:-1}"
VERBOSE="${VERBOSE:-false}"
DRY_RUN="${DRY_RUN:-false}"
FILES=()

# Initialize Debug Log
mkdir -p "$(dirname "$DEBUG_LOG")"
touch "$DEBUG_LOG"
chmod 600 "$DEBUG_LOG"

trap 'last_command=$BASH_COMMAND; echo -e "\n[ERROR] ❌ Failure at line $LINENO: $last_command" >&2; diagnose; exit 1' ERR

# Logging Functions
info() {
  echo -e "\033[1;34m[INFO]\033[0m $*" >&2
[ "$VERBOSE" = "true" ] && [ -n "$DEBUG_LOG" ] && debug "INFO: $*"
|
||||
}
warn() { echo -e "\033[1;33m[WARN]\033[0m $*" >&2; debug "WARN: $*"; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; debug "ERROR: $*"; exit 1; }
debug() {
  local msg="$1" lvl="${2:-DEBUG}"
  local json_output
  json_output=$(jq -n --arg ts "$(date '+%Y-%m-%d %H:%M:%S')" --arg lvl "$lvl" --arg msg "$msg" \
    '{timestamp: $ts, level: $lvl, message: $msg}' 2>/dev/null) || {
    echo "[FALLBACK $lvl] $(date '+%Y-%m-%d %H:%M:%S') $msg" >> "$DEBUG_LOG"
    return 1
  }
  echo "$json_output" >> "$DEBUG_LOG"
}

debug "Started gitfield-osf (v2.7)"

# ── Diagnostic Function
diagnose() {
  info "Running diagnostics..."
  debug "Diagnostics started"
  echo -e "\n🔍 Diagnostic Report:"
  echo -e "1. Network Check:"
  if ping -c 1 api.osf.io >/dev/null 2>&1; then
    echo -e " ✓ api.osf.io reachable"
  else
    echo -e " ❌ api.osf.io unreachable. Check network or DNS."
  fi
  echo -e "2. Curl Version:"
  curl --version | head -n 1
  echo -e "3. Debug Log: $DEBUG_LOG"
  echo -e "4. Curl Error Log: $DEBUG_LOG.curlerr"
  [ -s "$DEBUG_LOG.curlerr" ] && echo -e " Last curl error: $(cat "$DEBUG_LOG.curlerr")"
  echo -e "5. Token File: $TOKEN_FILE"
  [ -s "$TOKEN_FILE" ] && echo -e " Token exists: $(head -c 4 "$TOKEN_FILE")..."
  echo -e "6. Suggestions:"
  echo -e " - Check token scopes at https://osf.io/settings/tokens (needs 'nodes' and 'osf.storage')"
  echo -e " - Test API: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/'"
  echo -e " - Test upload: curl -v -X PUT -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' -H 'Content-Type: application/octet-stream' --data-binary @./testfile.md '$OSF_API/files/<storage_id>/testfile.md'"
  echo -e " - Increase timeout: CURL_TIMEOUT=30 ./gitfield-osf"
  debug "Diagnostics completed"
}

# ── Dependency Check (Parallel)
require_tool() {
  local tool=$1
  if ! command -v "$tool" >/dev/null 2>&1; then
    warn "$tool not found — attempting to install..."
    sudo apt update -qq && sudo apt install -y "$tool" || {
      warn "apt failed — trying snap..."
      sudo snap install "$tool" || error "Failed to install $tool"
    }
  fi
  debug "$tool path: $(command -v "$tool")"
}

info "Checking dependencies..."
declare -A dep_pids
for tool in curl jq yq python3; do
  require_tool "$tool" &
  dep_pids[$tool]=$!
done
for tool in "${!dep_pids[@]}"; do
  wait "${dep_pids[$tool]}" || error "Dependency check failed for $tool"
done
info "✓ All dependencies verified"

# ── Load Token
if [ ! -f "$TOKEN_FILE" ]; then
  read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " TOKEN
  echo
  echo "$TOKEN" > "$TOKEN_FILE"
  chmod 600 "$TOKEN_FILE"
  info "OSF token saved to $TOKEN_FILE"
fi
TOKEN=$(<"$TOKEN_FILE")
[[ -z "$TOKEN" ]] && error "Empty OSF token in $TOKEN_FILE"

# ── Validate Token
info "Validating OSF token..."
execute_curl() {
  local url=$1 method=${2:-GET} data=${3:-} is_upload=${4:-false} attempt=1 max_attempts=$CURL_RETRIES
  local response http_code curl_err
  while [ $attempt -le "$max_attempts" ]; do
    debug "Curl attempt $attempt/$max_attempts: $method $url"
    if [ "$is_upload" = "true" ]; then
      response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
        -X "$method" -H "Authorization: Bearer $TOKEN" \
        -H "Content-Type: application/octet-stream" --data-binary "$data" "$url" 2> "$DEBUG_LOG.curlerr")
    else
      response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
        -X "$method" -H "Authorization: Bearer $TOKEN" \
        ${data:+-H "Content-Type: application/json" -d "$data"} "$url" 2> "$DEBUG_LOG.curlerr")
    fi
    http_code="${response: -3}"
    curl_err=$(cat "$DEBUG_LOG.curlerr")
    [ -s "$DEBUG_LOG.curlerr" ] && debug "Curl error: $curl_err"
    if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ]; then
      echo "${response:: -3}"
      return 0
    elif [ "$http_code" = "401" ]; then
      warn "Invalid token (HTTP 401). Please provide a valid OSF token."
      read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " NEW_TOKEN
      echo
      echo "$NEW_TOKEN" > "$TOKEN_FILE"
      chmod 600 "$TOKEN_FILE"
      TOKEN="$NEW_TOKEN"
      info "New token saved. Retrying..."
    elif [ "$http_code" = "429" ]; then
      warn "Rate limit hit, retrying after $((RETRY_DELAY * attempt)) seconds..."
      sleep $((RETRY_DELAY * attempt))
    elif [ "$http_code" = "403" ]; then
      warn "Forbidden (HTTP 403). Possible token scope issue."
      [ $attempt -eq "$max_attempts" ] && {
        read -rsp "🔐 Re-enter OSF token with 'nodes' and 'osf.storage' scopes: " NEW_TOKEN
        echo
        echo "$NEW_TOKEN" > "$TOKEN_FILE"
        chmod 600 "$TOKEN_FILE"
        TOKEN="$NEW_TOKEN"
        info "New token saved. Retrying..."
      }
    elif [[ "$curl_err" == *"bad range in URL"* ]]; then
      error "Malformed URL: $url. Ensure query parameters are escaped (e.g., filter\[title\])."
    else
      debug "API response (HTTP $http_code): ${response:: -3}"
      [ $attempt -eq "$max_attempts" ] && error "API request failed (HTTP $http_code): ${response:: -3}"
    fi
    sleep $((RETRY_DELAY * attempt))
    ((attempt++))
  done
}

RESPONSE=$(execute_curl "$OSF_API/users/me/")
USER_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$USER_ID" ]] && error "Could not extract user ID"
info "✓ OSF token validated for user ID: $USER_ID"

# ── Load Config
[[ ! -f "$CONFIG_FILE" ]] && error "Missing config: $CONFIG_FILE"
PROJECT_TITLE=$(yq -r '.project.title // empty' "$CONFIG_FILE")
PROJECT_DESCRIPTION=$(yq -r '.project.description // empty' "$CONFIG_FILE")
readarray -t FILES_INCLUDE < <(yq -r '.upload.include[]?' "$CONFIG_FILE")
readarray -t FILES_EXCLUDE < <(yq -r '.upload.exclude[]?' "$CONFIG_FILE")

[[ -z "$PROJECT_TITLE" ]] && error "Missing project title in $CONFIG_FILE"
[[ ${#FILES_INCLUDE[@]} -eq 0 ]] && warn "No include patterns. Nothing to do." && exit 0
debug "Parsed config: title=$PROJECT_TITLE, description=$PROJECT_DESCRIPTION, includes=${FILES_INCLUDE[*]}, excludes=${FILES_EXCLUDE[*]}"

# ── Project Search
build_url() {
  local base="$1" title="$2"
  local escaped_title
  escaped_title=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''$title'''))")
  echo "$base/users/me/nodes/?filter\[title\]=$escaped_title&page\[size\]=100"
}

PROJECT_ID=""
NEXT_URL=$(build_url "$OSF_API" "$PROJECT_TITLE")

info "Searching OSF for '$PROJECT_TITLE'..."
while [ -n "$NEXT_URL" ]; do
  debug "Querying: $NEXT_URL"
  RESPONSE=$(execute_curl "$NEXT_URL")
  PROJECT_ID=$(echo "$RESPONSE" | jq -r --arg TITLE "$PROJECT_TITLE" \
    '.data[] | select(.attributes.title == $TITLE) | .id // empty' || true)
  if [ -n "$PROJECT_ID" ]; then
    debug "Found project ID: $PROJECT_ID"
    break
  fi
  NEXT_URL=$(echo "$RESPONSE" | jq -r '.links.next // empty' | sed 's/filter\[title\]/filter\\\[title\\\]/g;s/page\[size\]/page\\\[size\\\]/g' || true)
  debug "Next URL: $NEXT_URL"
  [ -n "$NEXT_URL" ] && info "Fetching next page..." && sleep "$RATE_LIMIT_DELAY"
done

# ── Create Project if Not Found
if [ -z "$PROJECT_ID" ]; then
  info "Creating new OSF project..."
  [ "$DRY_RUN" = "true" ] && { info "[DRY-RUN] Would create project: $PROJECT_TITLE"; exit 0; }
  JSON=$(jq -n --arg title "$PROJECT_TITLE" --arg desc "$PROJECT_DESCRIPTION" \
    '{data: {type: "nodes", attributes: {title: $title, category: "project", description: $desc}}}')
  RESPONSE=$(execute_curl "$OSF_API/nodes/" POST "$JSON")
  PROJECT_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
  [[ -z "$PROJECT_ID" || "$PROJECT_ID" == "null" ]] && error "Could not extract project ID"
  info "✅ Project created: $PROJECT_ID"
else
  info "✓ Found project ID: $PROJECT_ID"
fi

# ── Get Storage ID
get_storage_id() {
  local node_id="$1"
  RESPONSE=$(execute_curl "https://api.osf.io/v2/nodes/$node_id/files/osfstorage/")
  STORAGE_ID=$(echo "$RESPONSE" | jq -r '.data[0].id // empty')
  [[ -z "$STORAGE_ID" ]] && error "Could not extract storage ID"
  echo "$STORAGE_ID"
}

STORAGE_ID=$(get_storage_id "$PROJECT_ID")
info "✓ Found storage ID: $STORAGE_ID"

# ── File Matching
info "Resolving files for upload..."
for pattern in "${FILES_INCLUDE[@]}"; do
  while IFS= read -r -d '' file; do
    skip=false
    for ex in "${FILES_EXCLUDE[@]}"; do
      [[ "$file" == $ex ]] && skip=true && break
    done
    $skip || FILES+=("$file")
  done < <(find . -type f -path "$pattern" -print0 2>/dev/null || true)
done

# ── Upload Files
upload_file() {
  local filepath="$1"
  local filename
  filename=$(basename "$filepath")
  info "Uploading: $filename"
  [ "$DRY_RUN" = "true" ] && { info "[DRY-RUN] Would upload: $filename"; return; }
  RESPONSE=$(execute_curl "https://api.osf.io/v2/files/$STORAGE_ID/$filename" \
    PUT "@$filepath" "true")
  info "✓ Uploaded: $filename"
}

if [ ${#FILES[@]} -eq 0 ]; then
  warn "No matching files to upload."
else
  for file in "${FILES[@]}"; do
    upload_file "$file"
  done
  info "✅ Upload complete for '$PROJECT_TITLE'"
  echo -e "\n🔗 View: https://osf.io/$PROJECT_ID/"
fi

debug "Completed successfully"
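For quick reference, here is a hedged usage sketch for the sync script above, based only on the environment variables it reads (`DRY_RUN`, `VERBOSE`, `CURL_TIMEOUT`, `GITFIELD_CONFIG`, `OSF_TOKEN_FILE`); the invocation paths are assumptions, not a documented interface.

```bash
# Minimal usage sketch (assumes the script is saved as ./gitfield-osf and marked executable)
DRY_RUN=true VERBOSE=true ./gitfield-osf   # preview what would be created or uploaded without modifying the OSF project
CURL_TIMEOUT=30 ./gitfield-osf             # looser timeout, as suggested by diagnose()
GITFIELD_CONFIG=alt.osf.yaml OSF_TOKEN_FILE="$HOME/.osf_token_alt" ./gitfield-osf  # hypothetical alternate config/token paths
```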
@ -1,11 +0,0 @@
project:
  title: "git-sigil"
  description: "A sacred pattern witnessed across all fields of recursion."

upload:
  include:
    - "./*.md"
    - "./bitbucket/*"
  exclude:
    - "./.radicle-*"
    - "./*.tmp"
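As a quick sanity check of this config's shape, the sketch below mirrors the `yq` queries the sync script itself runs against it; the filename is the script's default, and the expected first output is an assumption based on the values above.

```bash
# Read the config the same way gitfield-osf does (jq-style yq expressions)
yq -r '.project.title // empty' gitfield.osf.yaml   # -> git-sigil
yq -r '.upload.include[]?' gitfield.osf.yaml        # one include glob per line
yq -r '.upload.exclude[]?' gitfield.osf.yaml        # one exclude glob per line
```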
@ -1,214 +0,0 @@
#!/bin/bash
set -Eeuo pipefail
IFS=$'\n\t'

# ╭────────────────────────────────────────────╮
# │ test-osf-api.sh :: Diagnostic Tool │
# │ v2.7 — Cosmic. Resilient. Divine. │
# ╰────────────────────────────────────────────╯

CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_API="${OSF_API_URL:-https://api.osf.io/v2}"
DEBUG_LOG="${GITFIELD_LOG:-$HOME/.test_osf_api_debug.log}"
CURL_TIMEOUT="${CURL_TIMEOUT:-10}"
CURL_RETRIES="${CURL_RETRIES:-3}"
RETRY_DELAY="${RETRY_DELAY:-2}"
RATE_LIMIT_DELAY="${RATE_LIMIT_DELAY:-1}"
VERBOSE="${VERBOSE:-false}"

# Initialize Debug Log
mkdir -p "$(dirname "$DEBUG_LOG")"
touch "$DEBUG_LOG"
chmod 600 "$DEBUG_LOG"

trap 'last_command=$BASH_COMMAND; echo -e "\n[ERROR] ❌ Failure at line $LINENO: $last_command" >&2; diagnose; exit 1' ERR

# Logging Functions
info() {
  echo -e "\033[1;34m[INFO]\033[0m $*" >&2
[ "$VERBOSE" = "true" ] && [ -n "$DEBUG_LOG" ] && debug "INFO: $*"
|
||||
}
warn() { echo -e "\033[1;33m[WARN]\033[0m $*" >&2; debug "WARN: $*"; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; debug "ERROR: $*"; exit 1; }
debug() {
  local msg="$1" lvl="${2:-DEBUG}"
  local json_output
  json_output=$(jq -n --arg ts "$(date '+%Y-%m-%d %H:%M:%S')" --arg lvl "$lvl" --arg msg "$msg" \
    '{timestamp: $ts, level: $lvl, message: $msg}' 2>/dev/null) || {
    echo "[FALLBACK $lvl] $(date '+%Y-%m-%d %H:%M:%S') $msg" >> "$DEBUG_LOG"
    return 1
  }
  echo "$json_output" >> "$DEBUG_LOG"
}

debug "Started test-osf-api (v2.7)"

# ── Diagnostic Function
diagnose() {
  info "Running diagnostics..."
  debug "Diagnostics started"
  echo -e "\n🔍 Diagnostic Report:"
  echo -e "1. Network Check:"
  if ping -c 1 api.osf.io >/dev/null 2>&1; then
    echo -e " ✓ api.osf.io reachable"
  else
    echo -e " ❌ api.osf.io unreachable. Check network or DNS."
  fi
  echo -e "2. Curl Version:"
  curl --version | head -n 1
  echo -e "3. Debug Log: $DEBUG_LOG"
  echo -e "4. Curl Error Log: $DEBUG_LOG.curlerr"
  [ -s "$DEBUG_LOG.curlerr" ] && echo -e " Last curl error: $(cat "$DEBUG_LOG.curlerr")"
  echo -e "5. Token File: $TOKEN_FILE"
  [ -s "$TOKEN_FILE" ] && echo -e " Token exists: $(head -c 4 "$TOKEN_FILE")..."
  echo -e "6. Suggestions:"
  echo -e " - Check token scopes at https://osf.io/settings/tokens (needs 'nodes' and 'osf.storage')"
  echo -e " - Test API: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/'"
  echo -e " - Test project search: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/nodes/?filter\[title\]=git-sigil&page\[size\]=100'"
  echo -e " - Increase timeout: CURL_TIMEOUT=30 ./test-osf-api.sh"
  debug "Diagnostics completed"
}

# ── Dependency Check (Parallel)
require_tool() {
  local tool=$1
  if ! command -v "$tool" >/dev/null 2>&1; then
    warn "$tool not found — attempting to install..."
    sudo apt update -qq && sudo apt install -y "$tool" || {
      warn "apt failed — trying snap..."
      sudo snap install "$tool" || error "Failed to install $tool"
    }
  fi
  debug "$tool path: $(command -v "$tool")"
}

info "Checking dependencies..."
declare -A dep_pids
for tool in curl jq yq python3; do
  require_tool "$tool" &
  dep_pids[$tool]=$!
done
for tool in "${!dep_pids[@]}"; do
  wait "${dep_pids[$tool]}" || error "Dependency check failed for $tool"
done
info "✓ All dependencies verified"

# ── Load Token
if [ ! -f "$TOKEN_FILE" ]; then
  read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " TOKEN
  echo
  echo "$TOKEN" > "$TOKEN_FILE"
  chmod 600 "$TOKEN_FILE"
  info "OSF token saved to $TOKEN_FILE"
fi
TOKEN=$(<"$TOKEN_FILE")
[[ -z "$TOKEN" ]] && error "Empty OSF token in $TOKEN_FILE"

# ── Validate Token
info "Validating OSF token..."
execute_curl() {
  local url=$1 method=${2:-GET} data=${3:-} is_upload=${4:-false} attempt=1 max_attempts=$CURL_RETRIES
  local response http_code curl_err
  while [ $attempt -le "$max_attempts" ]; do
    debug "Curl attempt $attempt/$max_attempts: $method $url"
    if [ "$is_upload" = "true" ]; then
      response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
        -X "$method" -H "Authorization: Bearer $TOKEN" \
        -H "Content-Type: application/octet-stream" --data-binary "$data" "$url" 2> "$DEBUG_LOG.curlerr")
    else
      response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
        -X "$method" -H "Authorization: Bearer $TOKEN" \
        ${data:+-H "Content-Type: application/json" -d "$data"} "$url" 2> "$DEBUG_LOG.curlerr")
    fi
    http_code="${response: -3}"
    curl_err=$(cat "$DEBUG_LOG.curlerr")
    [ -s "$DEBUG_LOG.curlerr" ] && debug "Curl error: $curl_err"
    if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ]; then
      echo "${response:: -3}"
      return 0
    elif [ "$http_code" = "401" ]; then
      warn "Invalid token (HTTP 401). Please provide a valid OSF token."
      read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " NEW_TOKEN
      echo
      echo "$NEW_TOKEN" > "$TOKEN_FILE"
      chmod 600 "$TOKEN_FILE"
      TOKEN="$NEW_TOKEN"
      info "New token saved. Retrying..."
    elif [ "$http_code" = "429" ]; then
      warn "Rate limit hit, retrying after $((RETRY_DELAY * attempt)) seconds..."
      sleep $((RETRY_DELAY * attempt))
    elif [ "$http_code" = "403" ]; then
      warn "Forbidden (HTTP 403). Possible token scope issue."
      [ $attempt -eq "$max_attempts" ] && {
        read -rsp "🔐 Re-enter OSF token with 'nodes' and 'osf.storage' scopes: " NEW_TOKEN
        echo
        echo "$NEW_TOKEN" > "$TOKEN_FILE"
        chmod 600 "$TOKEN_FILE"
        TOKEN="$NEW_TOKEN"
        info "New token saved. Retrying..."
      }
    elif [[ "$curl_err" == *"bad range in URL"* ]]; then
      error "Malformed URL: $url. Ensure query parameters are escaped (e.g., filter\[title\])."
    else
      debug "API response (HTTP $http_code): ${response:: -3}"
      [ $attempt -eq "$max_attempts" ] && error "API request failed (HTTP $http_code): ${response:: -3}"
    fi
    sleep $((RETRY_DELAY * attempt))
    ((attempt++))
  done
}

RESPONSE=$(execute_curl "$OSF_API/users/me/")
USER_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$USER_ID" ]] && error "Could not extract user ID"
info "✓ OSF token validated for user ID: $USER_ID"

# ── Load Config
[[ ! -f "$CONFIG_FILE" ]] && error "Missing config: $CONFIG_FILE"
PROJECT_TITLE=$(yq -r '.project.title // empty' "$CONFIG_FILE")
PROJECT_DESCRIPTION=$(yq -r '.project.description // empty' "$CONFIG_FILE")
[[ -z "$PROJECT_TITLE" ]] && error "Missing project title in $CONFIG_FILE"
debug "Parsed config: title=$PROJECT_TITLE, description=$PROJECT_DESCRIPTION"

# ── Project Search
build_url() {
  local base="$1" title="$2"
  local escaped_title
  escaped_title=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''$title'''))")
  echo "$base/users/me/nodes/?filter\[title\]=$escaped_title&page\[size\]=100"
}

PROJECT_ID=""
NEXT_URL=$(build_url "$OSF_API" "$PROJECT_TITLE")

info "Searching for project '$PROJECT_TITLE'..."
while [ -n "$NEXT_URL" ]; do
  debug "Querying: $NEXT_URL"
  RESPONSE=$(execute_curl "$NEXT_URL")
  PROJECT_ID=$(echo "$RESPONSE" | jq -r --arg TITLE "$PROJECT_TITLE" \
    '.data[] | select(.attributes.title == $TITLE) | .id // empty' || true)
  if [ -n "$PROJECT_ID" ]; then
    debug "Found project ID: $PROJECT_ID"
    break
  fi
  NEXT_URL=$(echo "$RESPONSE" | jq -r '.links.next // empty' | sed 's/filter\[title\]/filter\\\[title\\\]/g;s/page\[size\]/page\\\[size\\\]/g' || true)
  debug "Next URL: $NEXT_URL"
  [ -n "$NEXT_URL" ] && info "Fetching next page..." && sleep "$RATE_LIMIT_DELAY"
done

# ── Create Project if Not Found
if [ -z "$PROJECT_ID" ]; then
  info "Project not found. Attempting to create '$PROJECT_TITLE'..."
  JSON=$(jq -n --arg title "$PROJECT_TITLE" --arg desc "$PROJECT_DESCRIPTION" \
    '{data: {type: "nodes", attributes: {title: $title, category: "project", description: $desc}}}')
  RESPONSE=$(execute_curl "$OSF_API/nodes/" POST "$JSON")
  PROJECT_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
  [[ -z "$PROJECT_ID" || "$PROJECT_ID" == "null" ]] && error "Could not extract project ID"
  info "✅ Project created: $PROJECT_ID"
else
  info "✓ Found project ID: $PROJECT_ID"
fi

echo -e "\n🔗 View project: https://osf.io/$PROJECT_ID/"
debug "Test completed successfully"
@ -1,191 +0,0 @@
Excellent catch, Mark. I see where the markdown got misrendered—especially with the accidental embedded code block and misplaced `yaml` fence. Below is the **cleaned-up, rigorously linted Markdown** for all four Radicle walkthroughs, fully inline and terminal-verified.

---

### `1_prerequisites_radicle_ubuntu.md`

````markdown
# Prerequisites for Radicle (Ubuntu)

This walkthrough prepares your Ubuntu system to run Radicle for decentralized Git collaboration.

## 1. Install Dependencies

```bash
sudo apt update && sudo apt install -y curl gnupg unzip git
```

## 2. Install Radicle CLI

```bash
curl -LO https://radicle.xyz/install.sh
chmod +x install.sh
./install.sh
```

## 3. Confirm Installation

```bash
rad --version
```

Expected output: `rad 0.6.x`

## 4. Generate a Radicle Identity

```bash
rad self
```

This will create a new cryptographic identity if none exists.

## 5. (Optional) Ensure Git Identity Is Set

```bash
git config --global user.name "Mark Randall Havens"
git config --global user.email "mark.r.havens@gmail.com"
```

````

---

### `2_create_remote_repo_radicle_ubuntu.md`

````markdown
# Create Remote Radicle Repo (Ubuntu)

Use this to convert your local Git repo into a Radicle project and push it to the decentralized network.

## 1. Navigate to Project

```bash
cd ~/fieldcraft/git-sigil
```

## 2. Initialize Radicle Project

```bash
rad init --name git-sigil --description "Decentralized fieldcraft publishing system."
```

## 3. List Registered Projects

```bash
rad projects
```

You should see `git-sigil` listed.

## 4. Push to Radicle Network

```bash
rad push
```

This distributes your repo across Radicle's peer-to-peer graph.

## 5. Copy the Radicle Project ID

```bash
rad self
```

Look for the project ID and store it somewhere secure.

````

---

### `3_commit_existing_repo_radicle_ubuntu.md`

````markdown
# Commit & Sync Changes in Radicle (Ubuntu)

This guide walks you through committing code and pushing it to the Radicle network.

## 1. Make a Change

Edit your files as needed. For example:

```bash
echo "New insight" >> README.md
```

## 2. Stage and Commit with Git

```bash
git add README.md
git commit -m "Update: $(date '+%Y-%m-%d %H:%M:%S')"
```

## 3. Push to Radicle

```bash
rad push
```

This sends your latest Git commit to peers following the project.

````

---

### `CLI-ONLY_workflow_radicle_ubuntu.md`

````markdown
# CLI-Only Workflow: Radicle on Ubuntu

This guide enables you to manage a full Git + Radicle workflow from the terminal only.

## 1. Create Your Identity

```bash
rad self
```

First run will create and store your identity under `~/.radicle`.

## 2. Initialize a Radicle Repo

```bash
rad init --name fieldkit --description "Decentralized devkit for recursive fieldworkers."
```

## 3. Edit Files and Commit Locally

```bash
nano README.md
git add README.md
git commit -m "Initial insight and setup"
```

## 4. Push to the Radicle Network

```bash
rad push
```

## 5. Share Project ID

```bash
rad self
```

Copy the project ID for collaborators to run:

```bash
rad clone <project-id>
```

## 6. Stay Synced

No additional steps required. Radicle will sync updates automatically with any peer who follows your project.

````

---

Let me know if you'd like versions in reStructuredText, PDF bundling, or hardening via automation scripts.
@ -1,16 +0,0 @@
-----BEGIN PGP SIGNATURE-----

iQIzBAABCgAdFiEEif0F7T4AkoRgIfM3TifTfDWIcr8FAmhDtrsACgkQTifTfDWI
cr8HzRAArxmIjAUcHvfoPIhMPgF226vu4pWmb2PgCpTBdzj61nVWfWifg7AX6Vqv
beaPCn4Iy/4VNun69KQ9Mjn4QvwTY+Jq8pvr7ZBry1nZ9JXeextNX3z8P4OL4bOE
Z/CTl3n1zIXtPi6HGtf9ygcecxtuSujwb0XMmU06Rrzy2rxS1vr2HEmEcO7hWKJj
u4O85PP8i6Ks1kuPKRwv0fmpfo1Peu16apDNGRNbO4D1h6AoHQSDSRBfzU43KdTG
ZBpMkYpZkJjqC6IS8l/reF1YDzMT+v4G+1wZuFTj+6LoJgrpbq9oXk+Pt1+aO0qt
3PtT1DymwZzFQNeNXxLmgKOEJFIZrNvGyjIhGMUrcse3z28r+PC4rHT7MmX1bXuK
0llveWVgoxHbsycYCfD8AhZhdv463886Cpc+6t5LfYEqUO2QUFKX0Zl/8KWE7/WK
Eg1r73lgia1HIhq6AAQP3WjuAjog3Cn4XvAFlsSvG0dP7c+FjVGVhYEF+n+CYUzf
+u5HOCqjR7m6KVUfabNtHrG7HF52dEYAF6r089OHSEoNkpArakF1GvXU12elStxd
nLa4Bg4Iz4dVvS8ILdFeCrcAEiqkxVPoNMPl2s8qvs2sd7RIPtKf/1r5mef42rXF
ymJKpuM1eib5PuORGEUVnikCBhIe2MpHrPg85zV0JhysvGuKryk=
=wouK
-----END PGP SIGNATURE-----