Post-Radicle sync at 2025-06-05 01:07:56

This commit is contained in:
Mark Randall Havens 2025-06-05 01:07:57 -05:00
parent d4edf2afd8
commit bf03d19403
29 changed files with 351 additions and 1 deletions

@@ -0,0 +1,55 @@
## 📘 `1_prerequisites_github_ubuntu.md`
### 📌 Purpose
Prepare your Ubuntu system to create and work with remote GitHub repositories using SSH.
---
### ✅ System Requirements
* **Install Git**
```bash
sudo apt update
sudo apt install git -y
```
* **Create a GitHub account**
👉 [https://github.com/join](https://github.com/join)
* **Set your Git identity**
```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```
* **Generate an SSH key (if not already present)**
```bash
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
```
* **Add your SSH public key to GitHub**
```bash
cat ~/.ssh/id_rsa.pub
```
🔗 Copy the output and paste it at:
GitHub → Settings → SSH and GPG keys → *New SSH key*
* **Test the connection**
```bash
ssh -T git@github.com
```
You should see:
> "Hi `your-username`! You've successfully authenticated..."
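The "if not already present" condition above can be scripted; a minimal sketch (it assumes the default `~/.ssh/id_rsa` path used by the `ssh-keygen` command above):

```bash
# Succeeds only when both halves of the keypair already exist.
have_ssh_key() {
  local key="${1:-$HOME/.ssh/id_rsa}"
  [ -f "$key" ] && [ -f "$key.pub" ]
}
# usage: have_ssh_key || ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
```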
---

@@ -0,0 +1,73 @@
## 📘 `2_create_remote_repo_github_ubuntu.md`
### 📌 Purpose
Create a new remote repository on GitHub and push your local Ubuntu-based Git project to it.
---
### 🪐 Step-by-Step
#### Step 1: Create the remote repository
1. Go to [https://github.com/new](https://github.com/new)
2. Set:
* Repository Name
* Visibility (Public or Private)
* ✅ Leave **"Initialize with README"** unchecked
3. Click **Create repository**
---
#### Step 2: Prepare your local repository
If starting fresh:
```bash
mkdir myproject
cd myproject
git init
```
If converting an existing project:
```bash
cd myproject
git init
```
---
#### Step 3: Add files and commit
```bash
touch README.md # or edit existing files
git add .
git commit -m "Initial commit"
```
---
#### Step 4: Link to GitHub remote
```bash
git remote add origin git@github.com:your-username/your-repo-name.git
```
---
#### Step 5: Push to GitHub
```bash
git push -u origin main
```
> If the push is rejected because `main` doesn't exist, your local branch is probably still named `master`; rename it and push again:
```bash
git branch -M main
git push -u origin main
```
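A sketch that folds the two cases together, renaming only when necessary (assumes `origin` is already linked as in Step 4):

```bash
# Rename the current branch to main unless it already has that name.
ensure_main() {
  local branch
  branch=$(git rev-parse --abbrev-ref HEAD)
  [ "$branch" = "main" ] || git branch -M main
}
# usage: ensure_main && git push -u origin main
```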
---

@@ -0,0 +1,51 @@
## 📘 `3_commit_existing_repo_github_ubuntu.md`
### 📌 Purpose
Work with an existing remote GitHub repository on Ubuntu. This includes cloning, committing changes, and pushing updates.
---
### 🛠️ Step-by-Step
#### Step 1: Clone the repository
```bash
git clone git@github.com:your-username/your-repo-name.git
cd your-repo-name
```
---
#### Step 2: Make your changes
```bash
nano example.txt
```
Or update files as needed.
---
#### Step 3: Stage and commit your changes
```bash
git add .
git commit -m "Describe your update"
```
---
#### Step 4: Push to GitHub
```bash
git push origin main
```
> Use the correct branch name if not `main`. Confirm with:
```bash
git branch
```
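If you'd rather not hard-code the branch name at all, capture it from Git directly (a small sketch):

```bash
# Print whichever branch is currently checked out.
current_branch() {
  git rev-parse --abbrev-ref HEAD
}
# usage: git push origin "$(current_branch)"
```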
---

@@ -0,0 +1,97 @@
---
## 🧭 FULL CLI-ONLY WORKFLOW (Ubuntu + GitHub)
---
### 🔹 Step 1 — Install prerequisites
```bash
# Install Git
sudo apt update
sudo apt install git -y
# Install GitHub CLI
type -p curl >/dev/null || sudo apt install curl -y
curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | \
sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
sudo chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] \
https://cli.github.com/packages stable main" | \
sudo tee /etc/apt/sources.list.d/github-cli.list > /dev/null
sudo apt update
sudo apt install gh -y
```
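A quick sanity check that both tools actually landed on your `PATH` (a sketch; nothing here is GitHub-specific):

```bash
# Report every tool that is missing; returns non-zero if any is.
require_tools() {
  local missing=0 t
  for t in "$@"; do
    command -v "$t" >/dev/null 2>&1 || { echo "missing: $t"; missing=1; }
  done
  return "$missing"
}
# usage: require_tools git gh || exit 1
```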
---
### 🔹 Step 2 — Authenticate with GitHub
```bash
gh auth login
```
* Choose: `GitHub.com`
* Protocol: `SSH`
* Authenticate via browser (first time only; afterwards the CLI remains authenticated)
---
### 🔹 Step 3 — Set global Git identity
```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```
---
### 🔹 Step 4 — Create and link a new GitHub repo (CLI-only)
From inside your project directory:
```bash
mkdir myproject
cd myproject
git init
echo "# My Project" > README.md
git add .
git commit -m "Initial commit"
```
Now create a GitHub repo **from the CLI**:
```bash
gh repo create myproject --public --source=. --remote=origin --push
```
✅ This:
* Creates the remote GitHub repo
* Links it to your local repo
* Pushes your first commit to GitHub
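To confirm the wiring afterwards, check that the `origin` remote exists (a sketch; `origin` is the name that `--remote=origin` configured):

```bash
# Succeeds when a remote named origin is configured in the current repo.
has_origin() {
  git remote get-url origin >/dev/null 2>&1
}
# usage: has_origin && git remote get-url origin
```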
---
### 🔹 Step 5 — Make further commits
```bash
# Edit files as needed
nano something.txt
# Stage + commit + push
git add .
git commit -m "Updated something"
git push origin main
```
---
### 🔹 Bonus — Clone a GitHub repo entirely from CLI
```bash
gh repo clone your-username/your-repo
cd your-repo
```
---

docs/github/gitfield-github-old Executable file
@@ -0,0 +1,123 @@
#!/bin/bash
set -euo pipefail
IFS=$'\n\t'
GIT_REMOTE_NAME="github"
REPO_NAME=$(basename "$(pwd)")
DEFAULT_NAME="Mark Randall Havens"
DEFAULT_EMAIL="mark.r.havens@gmail.com"
# ────────────────
# Logging Helpers
# ────────────────
info() { echo -e "\e[1;34m[INFO]\e[0m $*"; }
warn() { echo -e "\e[1;33m[WARN]\e[0m $*"; }
error() { echo -e "\e[1;31m[ERROR]\e[0m $*" >&2; exit 1; }
# ────────────────
# Ensure Git is Installed
# ────────────────
if ! command -v git &>/dev/null; then
info "Installing Git..."
sudo apt update && sudo apt install git -y || error "Failed to install Git"
else
info "Git already installed: $(git --version)"
fi
# ────────────────
# Ensure GitHub CLI is Installed
# ────────────────
if ! command -v gh &>/dev/null; then
info "Installing GitHub CLI..."
type -p curl >/dev/null || sudo apt install curl -y
curl -fsSL https://cli.github.com/packages/githubcli-archive-keyring.gpg | \
sudo dd of=/usr/share/keyrings/githubcli-archive-keyring.gpg
sudo chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] \
https://cli.github.com/packages stable main" | \
sudo tee /etc/apt/sources.list.d/github-cli.list > /dev/null
sudo apt update && sudo apt install gh -y || error "Failed to install GitHub CLI"
else
info "GitHub CLI already installed: $(gh --version | head -n 1)"
fi
# ────────────────
# Ensure GitHub CLI is Authenticated
# ────────────────
if ! gh auth status &>/dev/null; then
info "Authenticating GitHub CLI..."
gh auth login || error "GitHub authentication failed"
else
info "GitHub CLI authenticated."
fi
# ────────────────
# Ensure Git Identity is Set
# ────────────────
USER_NAME=$(git config --global user.name || true)
USER_EMAIL=$(git config --global user.email || true)
if [[ -z "$USER_NAME" || -z "$USER_EMAIL" ]]; then
info "Setting global Git identity..."
git config --global user.name "$DEFAULT_NAME"
git config --global user.email "$DEFAULT_EMAIL"
info "Git identity set to: $DEFAULT_NAME <$DEFAULT_EMAIL>"
else
info "Git identity already set to: $USER_NAME <$USER_EMAIL>"
fi
# ────────────────
# Initialize Git Repo If Missing
# ────────────────
if [ ! -d ".git" ]; then
info "Initializing local Git repository..."
git init || error "Failed to initialize git"
git add . || warn "Nothing to add"
git commit -m "Initial commit" || warn "Nothing to commit"
else
info "Git repository already initialized."
fi
# ────────────────
# Ensure at Least One Commit Exists
# ────────────────
if ! git rev-parse HEAD &>/dev/null; then
info "Creating first commit..."
git add . || warn "Nothing to add"
git commit -m "Initial commit" || warn "Nothing to commit"
fi
# ────────────────
# Create Remote GitHub Repo If Missing
# ────────────────
if ! git remote get-url "$GIT_REMOTE_NAME" &>/dev/null; then
info "Creating GitHub repository '$REPO_NAME'..."
gh repo create "$REPO_NAME" --public --source=. --remote="$GIT_REMOTE_NAME" || error "Failed to create GitHub repo"
else
info "Remote '$GIT_REMOTE_NAME' already set to: $(git remote get-url "$GIT_REMOTE_NAME")"
fi
# ────────────────
# Commit Changes If Needed
# ────────────────
if ! git diff --quiet || ! git diff --cached --quiet; then
info "Changes detected — committing..."
git add .
git commit -m "Update: $(date '+%Y-%m-%d %H:%M:%S')" || warn "Nothing to commit"
else
info "No uncommitted changes found."
fi
# ────────────────
# Final Push — Always Push, Even If No Upstream
# ────────────────
BRANCH_NAME=$(git rev-parse --abbrev-ref HEAD)
if ! git config --get branch."$BRANCH_NAME".remote &>/dev/null; then
info "No upstream detected. Setting upstream and pushing..."
git push -u "$GIT_REMOTE_NAME" "$BRANCH_NAME" || error "Failed to push and set upstream"
else
info "Pushing to remote '$GIT_REMOTE_NAME'..."
git push "$GIT_REMOTE_NAME" "$BRANCH_NAME" || error "Push failed"
fi

@@ -0,0 +1,63 @@
## 📘 `1_prerequisites_gitlab_ubuntu.md`
### 📌 Purpose
Prepare your Ubuntu system to create and work with remote GitLab repositories using SSH and CLI tools.
---
### ✅ System Requirements
* **Install Git**
```bash
sudo apt update
sudo apt install git -y
```
* **Create a GitLab account**
👉 [https://gitlab.com/users/sign_up](https://gitlab.com/users/sign_up)
* **Set your Git identity**
```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```
* **Generate an SSH key (if not already present)**
```bash
ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
```
* **Add your SSH key to GitLab**
```bash
cat ~/.ssh/id_rsa.pub
```
🔗 Copy the output and paste it at:
GitLab → Preferences → SSH Keys → *Add key*
* **Test the connection**
```bash
ssh -T git@gitlab.com
```
✅ You should see something like:
> Welcome to GitLab, @your-username!
---

@@ -0,0 +1,73 @@
## 📘 `2_create_remote_repo_gitlab_ubuntu.md`
### 📌 Purpose
Create a new GitLab repository and push your local Ubuntu project to it using the CLI.
---
### 🪐 Step-by-Step
#### Step 1: Install GitLab CLI
```bash
curl -s https://raw.githubusercontent.com/profclems/glab/trunk/scripts/install.sh | sudo bash
```
#### Step 2: Authenticate GitLab CLI
```bash
glab auth login
```
Choose:
* GitLab.com or custom instance
* Paste your **Personal Access Token** when prompted
---
#### Step 3: Initialize your project
```bash
mkdir myproject
cd myproject
git init
echo "# My Project" > README.md
git add .
git commit -m "Initial commit"
```
---
#### Step 4: Create GitLab repository via CLI
```bash
glab repo create myproject --visibility public --confirm
```
This:
* Creates the GitLab repo
* Links it to your local repo
* Adds `origin` remote
---
#### Step 5: Push to GitLab
```bash
git push -u origin master
```
✅ From now on, `git push` will work as expected. (If your default branch is `main` rather than `master`, substitute accordingly.)
---

@@ -0,0 +1,53 @@
## 📘 `3_commit_existing_repo_gitlab_ubuntu.md`
### 📌 Purpose
Work with an existing GitLab repo: clone, edit, commit, and push using Ubuntu.
---
### 🛠️ Step-by-Step
#### Step 1: Clone the repository
```bash
git clone git@gitlab.com:your-username/your-repo.git
cd your-repo
```
---
#### Step 2: Edit files
```bash
nano myfile.txt
```
---
#### Step 3: Stage and commit
```bash
git add .
git commit -m "Your change description"
```
---
#### Step 4: Push your changes
```bash
git push origin master
```
If you use another branch (e.g., `main`, `dev`), substitute accordingly.
---

@@ -0,0 +1,69 @@
## 📘 `CLI-ONLY_workflow_gitlab_ubuntu.md`
### 📌 Purpose
Set up, initialize, and push a GitLab repo using only the terminal — no browser required.
---
### 🪐 Step-by-Step CLI Workflow
#### 1. Install everything you need
```bash
sudo apt update
sudo apt install git curl -y
curl -s https://raw.githubusercontent.com/profclems/glab/trunk/scripts/install.sh | sudo bash
```
#### 2. Configure your Git identity
```bash
git config --global user.name "Your Name"
git config --global user.email "your_email@example.com"
```
#### 3. Authenticate with GitLab
```bash
glab auth login
```
Use **SSH** and paste your **Personal Access Token** (create one at [https://gitlab.com/-/profile/personal_access_tokens](https://gitlab.com/-/profile/personal_access_tokens))
---
#### 4. Initialize your project
```bash
mkdir myproject
cd myproject
git init
touch README.md
git add .
git commit -m "Initial commit"
```
#### 5. Create GitLab repo via CLI
```bash
glab repo create myproject --visibility public --confirm
```
#### 6. Push your changes
```bash
git push -u origin master
```
---
✅ Done. You've created and linked a GitLab repository entirely from the CLI.
---

docs/osf/new/gitfield-osf Executable file
@@ -0,0 +1,286 @@
#!/usr/bin/env bash
set -Eeuo pipefail
IFS=$'\n\t'
# ╭─────────────────────────────────────────────────────────────────────────╮
# │ gitfield-osf :: v3.2.0 (Refactored) │
# │ Self-Healing • Auto-Detecting • PEP 668-Compliant • Debuggable │
# ╰─────────────────────────────────────────────────────────────────────────╯
#
# This script uses osfclient to upload files, based on a YAML config.
# It will auto-install python3, pip3, yq, pipx, and osfclient if missing.
# 1. ensure_dependencies(): makes sure python3, pip3, yq, pipx, osfclient exist
# 2. configure_osfclient(): prompts for token & username, writes ~/.config/osfclient/config
# 3. load_yaml_config(): reads project.title, include/exclude globs from gitfield.osf.yaml
# 4. resolve_files(): expands include/exclude patterns into a FILES array
# 5. find_or_create_project(): finds or creates an OSF project with the given title
# 6. upload_files(): loops over FILES and does osf upload
#
# Usage:
# chmod +x gitfield-osf
# ./gitfield-osf
#
# If gitfield.osf.yaml is missing, or its include patterns match nothing, the script exits cleanly.
# Any failure prints an [ERROR] and exits non-zero.
########################################################################
# CUSTOMIZE HERE (if needed):
########################################################################
# If you want to override config path:
# export GITFIELD_CONFIG=/path/to/your/gitfield.osf.yaml
CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_CONFIG_DIR="$HOME/.config/osfclient"
FILES=()
# ─────────────────────────────────────────────────────────────────────
# Colored logging functions
# ─────────────────────────────────────────────────────────────────────
log() { echo -e "\033[1;34m[INFO]\033[0m $*"; }
warn() { echo -e "\033[1;33m[WARN]\033[0m $*"; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; exit 1; }
# ─────────────────────────────────────────────────────────────────────
# Step 1: Ensure Dependencies
# - python3, pip3, yq, pipx, osfclient
# - Works under PEP 668 (uses pipx first, then pip3 --user fallback)
# ─────────────────────────────────────────────────────────────────────
ensure_dependencies() {
log "Checking for required commands..."
# 1a. Ensure python3
if ! command -v python3 &>/dev/null; then
warn "python3 not found — installing..."
sudo apt update -qq && sudo apt install -y python3 python3-venv python3-distutils \
|| error "Failed to install python3"
fi
# 1b. Ensure pip3
if ! command -v pip3 &>/dev/null; then
warn "pip3 not found — installing..."
sudo apt install -y python3-pip || error "Failed to install pip3"
# Guarantee pip3 is available now
command -v pip3 >/dev/null || error "pip3 still missing after install"
fi
# 1c. Ensure yq (for YAML parsing)
if ! command -v yq &>/dev/null; then
warn "yq not found — installing..."
if command -v snap &>/dev/null; then
sudo snap install yq || sudo apt install -y yq || error "Failed to install yq"
else
sudo apt install -y yq || error "Failed to install yq"
fi
fi
# 1d. Ensure pipx
if ! command -v pipx &>/dev/null; then
warn "pipx not found — installing..."
sudo apt install -y pipx || error "Failed to install pipx"
# Add pipx's bin directory to PATH if needed
pipx ensurepath
export PATH="$HOME/.local/bin:$PATH"
fi
# 1e. Ensure osfclient via pipx, fallback to pip3 --user
if ! command -v osf &>/dev/null; then
log "Installing osfclient via pipx..."
if ! pipx install osfclient; then
warn "pipx install failed; trying pip3 --user install"
python3 -m pip install --user osfclient || error "osfclient install failed"
fi
# Ensure $HOME/.local/bin is in PATH
export PATH="$HOME/.local/bin:$PATH"
fi
# Final check
command -v osf >/dev/null || error "osfclient is still missing; please investigate"
log "✓ All dependencies are now present"
}
# ─────────────────────────────────────────────────────────────────────
# Step 2: Configure OSF Credentials
# - Writes ~/.config/osfclient/config with [osf] username & token
# - Prompts for token and username if missing
# ─────────────────────────────────────────────────────────────────────
configure_osfclient() {
log "Configuring osfclient credentials..."
# Create config directory
mkdir -p "$OSF_CONFIG_DIR"
chmod 700 "$OSF_CONFIG_DIR"
# Prompt for Personal Access Token if missing
if [ ! -f "$TOKEN_FILE" ]; then
read -rsp "🔐 Enter OSF Personal Access Token: " TOKEN
echo
echo "$TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
fi
# Prompt for username/email if not already in env
local USERNAME="${OSF_USERNAME:-}"
if [ -z "$USERNAME" ]; then
read -rp "👤 OSF Username or Email: " USERNAME
fi
# Write config file
cat > "$OSF_CONFIG_DIR/config" <<EOF
[osf]
username = $USERNAME
token = $(<"$TOKEN_FILE")
EOF
chmod 600 "$OSF_CONFIG_DIR/config"
log "✓ osfclient configured (config at $OSF_CONFIG_DIR/config)"
}
# ─────────────────────────────────────────────────────────────────────
# Step 3: Load YAML Configuration
# - Expects PROJECT_TITLE, includes, excludes in gitfield.osf.yaml
# ─────────────────────────────────────────────────────────────────────
load_yaml_config() {
log "Loading configuration from '$CONFIG_FILE'"
if [ ! -f "$CONFIG_FILE" ]; then
error "Configuration file '$CONFIG_FILE' not found"
fi
# Read project.title
PROJECT_TITLE=$(yq -r '.project.title // ""' "$CONFIG_FILE")
if [ -z "$PROJECT_TITLE" ]; then
error "Missing or empty 'project.title' in $CONFIG_FILE"
fi
# Read project.description (optional, unused here but could be extended)
PROJECT_DESCRIPTION=$(yq -r '.project.description // ""' "$CONFIG_FILE")
# Read upload.include[] and upload.exclude[]
readarray -t FILES_INCLUDE < <(yq -r '.upload.include[]?' "$CONFIG_FILE")
readarray -t FILES_EXCLUDE < <(yq -r '.upload.exclude[]?' "$CONFIG_FILE")
# Debug print
log " → project.title = '$PROJECT_TITLE'"
log " → includes: ${FILES_INCLUDE[*]:-<none>}"
log " → excludes: ${FILES_EXCLUDE[*]:-<none>}"
}
# ─────────────────────────────────────────────────────────────────────
# Step 4: Match Files Based on Include/Exclude
# - Populates global FILES array
# - If no files match, exits gracefully
# ─────────────────────────────────────────────────────────────────────
resolve_files() {
log "Resolving file patterns..."
# If no include patterns, nothing to do
if [ "${#FILES_INCLUDE[@]}" -eq 0 ]; then
warn "No include patterns specified; skipping upload."
exit 0
fi
# For each include glob, find matching files
for pattern in "${FILES_INCLUDE[@]}"; do
# Use find to expand the glob (supports nested directories)
while IFS= read -r -d '' file; do
# Check against each exclude pattern
skip=false
for ex in "${FILES_EXCLUDE[@]}"; do
if [[ "$file" == $ex ]]; then
skip=true
break
fi
done
if ! $skip; then
FILES+=("$file")
fi
done < <(find . -type f -path "$pattern" -print0 2>/dev/null || true)
done
# Remove duplicates (just in case)
if [ "${#FILES[@]}" -gt 1 ]; then
IFS=$'\n' read -r -d '' -a FILES < <(__uniq_array "${FILES[@]}" && printf '\0')
fi
# If still empty, warn and exit
if [ "${#FILES[@]}" -eq 0 ]; then
warn "No files matched the include/exclude patterns."
exit 0
fi
# Debug print of matched files
log "Matched files (${#FILES[@]}):"
for f in "${FILES[@]}"; do
echo " • $f"
done
}
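# ─────────────────────────────────────────────────────────────────────
# Sketch (defined but not called): the exclude test above, isolated.
# Succeeds when $1 matches any of the remaining glob patterns, exactly
# as the `[[ "$file" == $ex ]]` check inside resolve_files() does.
# ─────────────────────────────────────────────────────────────────────
matches_any() {
  local file=$1 pat
  shift
  for pat in "$@"; do
    # $pat is unquoted on purpose so it is treated as a glob
    [[ "$file" == $pat ]] && return 0
  done
  return 1
}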
# Helper: Remove duplicates from a list of lines
__uniq_array() {
printf "%s\n" "$@" | awk '!seen[$0]++'
}
# ─────────────────────────────────────────────────────────────────────
# Step 5: Find or Create OSF Project
# - Uses `osf listprojects` to search for exact title (case-insensitive)
# - If not found, does `osf createproject "<title>"`
# - Writes the resulting project ID to .osf_project_id
# ─────────────────────────────────────────────────────────────────────
find_or_create_project() {
log "Searching for OSF project titled '$PROJECT_TITLE'..."
# List all projects and grep case-insensitive for the title
pid=$(osf listprojects | grep -iE "^([[:alnum:]]+)[[:space:]]+.*${PROJECT_TITLE}.*$" | awk '{print $1}' || true)
if [ -z "$pid" ]; then
log "No existing project found; creating a new OSF project..."
pid=$(osf createproject "$PROJECT_TITLE")
if [ -z "$pid" ]; then
error "osf createproject failed; no project ID returned"
fi
echo "$pid" > .osf_project_id
log "✓ Created project: $pid"
else
echo "$pid" > .osf_project_id
log "✓ Found existing project: $pid"
fi
}
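# ─────────────────────────────────────────────────────────────────────
# Sketch (defined but not called): exact-title matching.
# The grep in find_or_create_project() also matches substrings (a
# project named "git-sigil-2" would match "git-sigil"). An exact,
# case-insensitive match over `osf listprojects` style lines could be:
# ─────────────────────────────────────────────────────────────────────
match_project_id() {
  # stdin: "<id> <title>" lines; $1: the wanted title
  awk -v want="$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')" '
    {
      id = $1
      $1 = ""
      sub(/^[[:space:]]+/, "")
      if (tolower($0) == want) { print id; exit }
    }'
}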
# ─────────────────────────────────────────────────────────────────────
# Step 6: Upload Files to OSF
# - Loops over FILES[] and runs: osf upload "<file>" "<pid>":
# (the trailing colon uploads to root of osfstorage for that project)
# ─────────────────────────────────────────────────────────────────────
upload_files() {
pid=$(<.osf_project_id)
log "Uploading ${#FILES[@]} file(s) to OSF project $pid..."
for file in "${FILES[@]}"; do
log "→ Uploading: $file"
if osf upload "$file" "$pid":; then
log " ✓ Uploaded: $file"
else
warn " ✗ Upload failed for: $file"
fi
done
log "✅ All uploads attempted."
echo
echo "🔗 View your project at: https://osf.io/$pid/"
}
# ─────────────────────────────────────────────────────────────────────
# Main: Orchestrate all steps in sequence
# ─────────────────────────────────────────────────────────────────────
main() {
ensure_dependencies
configure_osfclient
load_yaml_config
resolve_files
find_or_create_project
upload_files
}
# Invoke main
main "$@"

@@ -0,0 +1,12 @@
project:
title: "git-sigil"
description: "A sacred pattern witnessed across all fields of recursion."
upload:
include:
- "./*.md"
- "./bitbucket/*"
- "./osf/*"
exclude:
- "./.radicle-*"
- "./*.tmp"

docs/osf/new/test-osf-api.sh Executable file
@@ -0,0 +1,214 @@
#!/bin/bash
set -Eeuo pipefail
IFS=$'\n\t'
# ╭────────────────────────────────────────────╮
# │ test-osf-api.sh :: Diagnostic Tool │
# │ v2.7 — Cosmic. Resilient. Divine. │
# ╰────────────────────────────────────────────╯
CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_API="${OSF_API_URL:-https://api.osf.io/v2}"
DEBUG_LOG="${GITFIELD_LOG:-$HOME/.test_osf_api_debug.log}"
CURL_TIMEOUT="${CURL_TIMEOUT:-10}"
CURL_RETRIES="${CURL_RETRIES:-3}"
RETRY_DELAY="${RETRY_DELAY:-2}"
RATE_LIMIT_DELAY="${RATE_LIMIT_DELAY:-1}"
VERBOSE="${VERBOSE:-false}"
# Initialize Debug Log
mkdir -p "$(dirname "$DEBUG_LOG")"
touch "$DEBUG_LOG"
chmod 600 "$DEBUG_LOG"
trap 'last_command=$BASH_COMMAND; echo -e "\n[ERROR] ❌ Failure at line $LINENO: $last_command" >&2; diagnose; exit 1' ERR
# Logging Functions
info() {
echo -e "\033[1;34m[INFO]\033[0m $*" >&2
# `|| true` keeps a false VERBOSE test (or a debug failure) from
# tripping `set -e` when info() is called at top level
[ "$VERBOSE" = "true" ] && [ -n "$DEBUG_LOG" ] && debug "INFO: $*" || true
}
warn() { echo -e "\033[1;33m[WARN]\033[0m $*" >&2; debug "WARN: $*" || true; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; debug "ERROR: $*"; exit 1; }
debug() {
local msg="$1" lvl="${2:-DEBUG}"
local json_output
json_output=$(jq -n --arg ts "$(date '+%Y-%m-%d %H:%M:%S')" --arg lvl "$lvl" --arg msg "$msg" \
'{timestamp: $ts, level: $lvl, message: $msg}' 2>/dev/null) || {
echo "[FALLBACK $lvl] $(date '+%Y-%m-%d %H:%M:%S') $msg" >> "$DEBUG_LOG"
return 1
}
echo "$json_output" >> "$DEBUG_LOG"
}
debug "Started test-osf-api (v2.7)"
# ── Diagnostic Function
diagnose() {
info "Running diagnostics..."
debug "Diagnostics started"
echo -e "\n🔍 Diagnostic Report:"
echo -e "1. Network Check:"
if ping -c 1 api.osf.io >/dev/null 2>&1; then
echo -e " ✓ api.osf.io reachable"
else
echo -e " ❌ api.osf.io unreachable. Check network or DNS."
fi
echo -e "2. Curl Version:"
curl --version | head -n 1
echo -e "3. Debug Log: $DEBUG_LOG"
echo -e "4. Curl Error Log: $DEBUG_LOG.curlerr"
[ -s "$DEBUG_LOG.curlerr" ] && echo -e " Last curl error: $(cat "$DEBUG_LOG.curlerr")"
echo -e "5. Token File: $TOKEN_FILE"
[ -s "$TOKEN_FILE" ] && echo -e " Token exists: $(head -c 4 "$TOKEN_FILE")..."
echo -e "6. Suggestions:"
echo -e " - Check token scopes at https://osf.io/settings/tokens (needs 'nodes' and 'osf.storage')"
echo -e " - Test API: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/'"
echo -e " - Test project search: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/nodes/?filter\[title\]=git-sigil&page\[size\]=100'"
echo -e " - Increase timeout: CURL_TIMEOUT=30 ./test-osf-api.sh"
debug "Diagnostics completed"
}
# ── Dependency Check (Parallel)
require_tool() {
local tool=$1
if ! command -v "$tool" >/dev/null 2>&1; then
warn "$tool not found — attempting to install..."
sudo apt update -qq && sudo apt install -y "$tool" || {
warn "apt failed — trying snap..."
sudo snap install "$tool" || error "Failed to install $tool"
}
fi
debug "$tool path: $(command -v "$tool")"
}
info "Checking dependencies..."
declare -A dep_pids
for tool in curl jq yq python3; do
require_tool "$tool" &
dep_pids[$tool]=$!
done
for tool in "${!dep_pids[@]}"; do
wait "${dep_pids[$tool]}" || error "Dependency check failed for $tool"
done
info "✓ All dependencies verified"
# ── Load Token
if [ ! -f "$TOKEN_FILE" ]; then
read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " TOKEN
echo
echo "$TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
info "OSF token saved to $TOKEN_FILE"
fi
TOKEN=$(<"$TOKEN_FILE")
[[ -z "$TOKEN" ]] && error "Empty OSF token in $TOKEN_FILE"
# ── Validate Token
info "Validating OSF token..."
execute_curl() {
local url=$1 method=${2:-GET} data=${3:-} is_upload=${4:-false} attempt=1 max_attempts=$CURL_RETRIES
local response http_code curl_err
while [ $attempt -le "$max_attempts" ]; do
debug "Curl attempt $attempt/$max_attempts: $method $url"
if [ "$is_upload" = "true" ]; then
response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
-X "$method" -H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/octet-stream" --data-binary "$data" "$url" 2> "$DEBUG_LOG.curlerr")
else
response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
-X "$method" -H "Authorization: Bearer $TOKEN" \
${data:+-H "Content-Type: application/json" -d "$data"} "$url" 2> "$DEBUG_LOG.curlerr")
fi
http_code="${response: -3}"
curl_err=$(cat "$DEBUG_LOG.curlerr")
[ -s "$DEBUG_LOG.curlerr" ] && debug "Curl error: $curl_err"
if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ]; then
echo "${response:: -3}"
return 0
elif [ "$http_code" = "401" ]; then
warn "Invalid token (HTTP 401). Please provide a valid OSF token."
read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " NEW_TOKEN
echo
echo "$NEW_TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
TOKEN="$NEW_TOKEN"
info "New token saved. Retrying..."
elif [ "$http_code" = "429" ]; then
warn "Rate limit hit, retrying after $((RETRY_DELAY * attempt)) seconds..."
sleep $((RETRY_DELAY * attempt))
elif [ "$http_code" = "403" ]; then
warn "Forbidden (HTTP 403). Possible token scope issue."
[ $attempt -eq "$max_attempts" ] && {
read -rsp "🔐 Re-enter OSF token with 'nodes' and 'osf.storage' scopes: " NEW_TOKEN
echo
echo "$NEW_TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
TOKEN="$NEW_TOKEN"
info "New token saved. Retrying..."
}
elif [[ "$curl_err" == *"bad range in URL"* ]]; then
error "Malformed URL: $url. Ensure query parameters are escaped (e.g., filter\[title\])."
else
debug "API response (HTTP $http_code): ${response:: -3}"
[ $attempt -eq "$max_attempts" ] && error "API request failed (HTTP $http_code): ${response:: -3}"
fi
sleep $((RETRY_DELAY * attempt))
((attempt++))
done
}
RESPONSE=$(execute_curl "$OSF_API/users/me/")
USER_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$USER_ID" ]] && error "Could not extract user ID"
info "✓ OSF token validated for user ID: $USER_ID"
# ── Load Config
[[ ! -f "$CONFIG_FILE" ]] && error "Missing config: $CONFIG_FILE"
PROJECT_TITLE=$(yq -r '.project.title // empty' "$CONFIG_FILE")
PROJECT_DESCRIPTION=$(yq -r '.project.description // empty' "$CONFIG_FILE")
[[ -z "$PROJECT_TITLE" ]] && error "Missing project title in $CONFIG_FILE"
debug "Parsed config: title=$PROJECT_TITLE, description=$PROJECT_DESCRIPTION"
# ── Project Search
build_url() {
local base="$1" title="$2"
local escaped_title
escaped_title=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''$title'''))")
echo "$base/users/me/nodes/?filter\[title\]=$escaped_title&page\[size\]=100"
}
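# ── Sketch (not wired in): URL-encode via argv instead of interpolation.
# build_url() splices $title into the Python source between triple
# quotes, which breaks on titles containing quotes; passing the value
# as an argument avoids that class of problem.
urlencode() {
  python3 -c 'import sys, urllib.parse; print(urllib.parse.quote(sys.argv[1]))' "$1"
}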
PROJECT_ID=""
NEXT_URL=$(build_url "$OSF_API" "$PROJECT_TITLE")
info "Searching for project '$PROJECT_TITLE'..."
while [ -n "$NEXT_URL" ]; do
debug "Querying: $NEXT_URL"
RESPONSE=$(execute_curl "$NEXT_URL")
PROJECT_ID=$(echo "$RESPONSE" | jq -r --arg TITLE "$PROJECT_TITLE" \
'.data[] | select(.attributes.title == $TITLE) | .id // empty' || true)
if [ -n "$PROJECT_ID" ]; then
debug "Found project ID: $PROJECT_ID"
break
fi
NEXT_URL=$(echo "$RESPONSE" | jq -r '.links.next // empty' | sed 's/filter\[title\]/filter\\\[title\\\]/g;s/page\[size\]/page\\\[size\\\]/g' || true)
debug "Next URL: $NEXT_URL"
[ -n "$NEXT_URL" ] && info "Fetching next page..." && sleep "$RATE_LIMIT_DELAY"
done
# ── Create Project if Not Found
if [ -z "$PROJECT_ID" ]; then
info "Project not found. Attempting to create '$PROJECT_TITLE'..."
JSON=$(jq -n --arg title "$PROJECT_TITLE" --arg desc "$PROJECT_DESCRIPTION" \
'{data: {type: "nodes", attributes: {title: $title, category: "project", description: $desc}}}')
RESPONSE=$(execute_curl "$OSF_API/nodes/" POST "$JSON")
PROJECT_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$PROJECT_ID" || "$PROJECT_ID" == "null" ]] && error "Could not extract project ID"
info "✅ Project created: $PROJECT_ID"
else
info "✓ Found project ID: $PROJECT_ID"
fi
echo -e "\n🔗 View project: https://osf.io/$PROJECT_ID/"
debug "Test completed successfully"

@@ -0,0 +1 @@
Test file for OSF upload

docs/osf/old/gitfield-osf Executable file
@@ -0,0 +1,266 @@
#!/bin/bash
set -Eeuo pipefail
IFS=$'\n\t'
# ╭────────────────────────────────────────────╮
# │ gitfield-osf :: Sacred Sync Engine │
# │ v2.7 — Cosmic. Resilient. Divine. │
# ╰────────────────────────────────────────────╯
CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_API="${OSF_API_URL:-https://api.osf.io/v2}"
DEBUG_LOG="${GITFIELD_LOG:-$HOME/.gitfield_osf_debug.log}"
CURL_TIMEOUT="${CURL_TIMEOUT:-10}"
CURL_RETRIES="${CURL_RETRIES:-3}"
RETRY_DELAY="${RETRY_DELAY:-2}"
RATE_LIMIT_DELAY="${RATE_LIMIT_DELAY:-1}"
VERBOSE="${VERBOSE:-false}"
DRY_RUN="${DRY_RUN:-false}"
FILES=()
# Initialize Debug Log
mkdir -p "$(dirname "$DEBUG_LOG")"
touch "$DEBUG_LOG"
chmod 600 "$DEBUG_LOG"
trap 'last_command=$BASH_COMMAND; echo -e "\n[ERROR] ❌ Failure at line $LINENO: $last_command" >&2; diagnose; exit 1' ERR
# Logging Functions
info() {
echo -e "\033[1;34m[INFO]\033[0m $*" >&2
# `|| true` keeps a false VERBOSE test (or a debug failure) from
# tripping `set -e` when info() is called at top level
[ "$VERBOSE" = "true" ] && [ -n "$DEBUG_LOG" ] && debug "INFO: $*" || true
}
warn() { echo -e "\033[1;33m[WARN]\033[0m $*" >&2; debug "WARN: $*" || true; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; debug "ERROR: $*"; exit 1; }
debug() {
local msg="$1" lvl="${2:-DEBUG}"
local json_output
json_output=$(jq -n --arg ts "$(date '+%Y-%m-%d %H:%M:%S')" --arg lvl "$lvl" --arg msg "$msg" \
'{timestamp: $ts, level: $lvl, message: $msg}' 2>/dev/null) || {
echo "[FALLBACK $lvl] $(date '+%Y-%m-%d %H:%M:%S') $msg" >> "$DEBUG_LOG"
return 1
}
echo "$json_output" >> "$DEBUG_LOG"
}
debug "Started gitfield-osf (v2.7)"
# ── Diagnostic Function
diagnose() {
info "Running diagnostics..."
debug "Diagnostics started"
echo -e "\n🔍 Diagnostic Report:"
echo -e "1. Network Check:"
if ping -c 1 api.osf.io >/dev/null 2>&1; then
echo -e " ✓ api.osf.io reachable"
else
echo -e " ❌ api.osf.io unreachable. Check network or DNS."
fi
echo -e "2. Curl Version:"
curl --version | head -n 1
echo -e "3. Debug Log: $DEBUG_LOG"
echo -e "4. Curl Error Log: $DEBUG_LOG.curlerr"
[ -s "$DEBUG_LOG.curlerr" ] && echo -e " Last curl error: $(cat "$DEBUG_LOG.curlerr")"
echo -e "5. Token File: $TOKEN_FILE"
[ -s "$TOKEN_FILE" ] && echo -e " Token exists: $(head -c 4 "$TOKEN_FILE")..."
echo -e "6. Suggestions:"
echo -e " - Check token scopes at https://osf.io/settings/tokens (needs 'nodes' and 'osf.storage')"
echo -e " - Test API: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/'"
echo -e " - Test upload: curl -v -X PUT -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' -H 'Content-Type: application/octet-stream' --data-binary @./testfile.md '$OSF_API/files/<storage_id>/testfile.md'"
echo -e " - Increase timeout: CURL_TIMEOUT=30 ./gitfield-osf"
debug "Diagnostics completed"
}
# ── Dependency Check (Parallel)
require_tool() {
local tool=$1
if ! command -v "$tool" >/dev/null 2>&1; then
warn "$tool not found — attempting to install..."
sudo apt update -qq && sudo apt install -y "$tool" || {
warn "apt failed — trying snap..."
sudo snap install "$tool" || error "Failed to install $tool"
}
fi
debug "$tool path: $(command -v "$tool")"
}
info "Checking dependencies..."
declare -A dep_pids
for tool in curl jq yq python3; do
require_tool "$tool" &
dep_pids[$tool]=$!
done
for tool in "${!dep_pids[@]}"; do
wait "${dep_pids[$tool]}" || error "Dependency check failed for $tool"
done
info "✓ All dependencies verified"
# ── Load Token
if [ ! -f "$TOKEN_FILE" ]; then
read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " TOKEN
echo
echo "$TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
info "OSF token saved to $TOKEN_FILE"
fi
TOKEN=$(<"$TOKEN_FILE")
[[ -z "$TOKEN" ]] && error "Empty OSF token in $TOKEN_FILE"
# ── Validate Token
info "Validating OSF token..."
execute_curl() {
local url=$1 method=${2:-GET} data=${3:-} is_upload=${4:-false} attempt=1 max_attempts=$CURL_RETRIES
local response http_code curl_err
while [ $attempt -le "$max_attempts" ]; do
debug "Curl attempt $attempt/$max_attempts: $method $url"
if [ "$is_upload" = "true" ]; then
response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
-X "$method" -H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/octet-stream" --data-binary "$data" "$url" 2> "$DEBUG_LOG.curlerr") || response="000" # transport errors yield code 000 so the retry loop handles them instead of set -e exiting
else
response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
-X "$method" -H "Authorization: Bearer $TOKEN" \
${data:+-H "Content-Type: application/json" -d "$data"} "$url" 2> "$DEBUG_LOG.curlerr") || response="000" # transport errors yield code 000 so the retry loop handles them instead of set -e exiting
fi
http_code="${response: -3}"
curl_err=$(cat "$DEBUG_LOG.curlerr")
[ -s "$DEBUG_LOG.curlerr" ] && debug "Curl error: $curl_err"
if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ]; then
echo "${response:: -3}"
return 0
elif [ "$http_code" = "401" ]; then
warn "Invalid token (HTTP 401). Please provide a valid OSF token."
read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " NEW_TOKEN
echo
echo "$NEW_TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
TOKEN="$NEW_TOKEN"
info "New token saved. Retrying..."
elif [ "$http_code" = "429" ]; then
warn "Rate limit hit, retrying after $((RETRY_DELAY * attempt)) seconds..."
sleep $((RETRY_DELAY * attempt))
elif [ "$http_code" = "403" ]; then
warn "Forbidden (HTTP 403). Possible token scope issue."
[ $attempt -eq "$max_attempts" ] && {
read -rsp "🔐 Re-enter OSF token with 'nodes' and 'osf.storage' scopes: " NEW_TOKEN
echo
echo "$NEW_TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
TOKEN="$NEW_TOKEN"
info "New token saved. Retrying..."
}
elif [[ "$curl_err" == *"bad range in URL"* ]]; then
error "Malformed URL: $url. Ensure query parameters are escaped (e.g., filter\[title\])."
else
debug "API response (HTTP $http_code): ${response:: -3}"
[ $attempt -eq "$max_attempts" ] && error "API request failed (HTTP $http_code): ${response:: -3}"
fi
sleep $((RETRY_DELAY * attempt))
((attempt++))
done
}
RESPONSE=$(execute_curl "$OSF_API/users/me/")
USER_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$USER_ID" ]] && error "Could not extract user ID"
info "✓ OSF token validated for user ID: $USER_ID"
# ── Load Config
[[ ! -f "$CONFIG_FILE" ]] && error "Missing config: $CONFIG_FILE"
PROJECT_TITLE=$(yq -r '.project.title // empty' "$CONFIG_FILE")
PROJECT_DESCRIPTION=$(yq -r '.project.description // empty' "$CONFIG_FILE")
readarray -t FILES_INCLUDE < <(yq -r '.upload.include[]?' "$CONFIG_FILE")
readarray -t FILES_EXCLUDE < <(yq -r '.upload.exclude[]?' "$CONFIG_FILE")
[[ -z "$PROJECT_TITLE" ]] && error "Missing project title in $CONFIG_FILE"
[[ ${#FILES_INCLUDE[@]} -eq 0 ]] && warn "No include patterns. Nothing to do." && exit 0
debug "Parsed config: title=$PROJECT_TITLE, description=$PROJECT_DESCRIPTION, includes=${FILES_INCLUDE[*]}, excludes=${FILES_EXCLUDE[*]}"
# ── Project Search
build_url() {
local base="$1" title="$2"
local escaped_title
escaped_title=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''$title'''))")
echo "$base/users/me/nodes/?filter\[title\]=$escaped_title&page\[size\]=100"
}
PROJECT_ID=""
NEXT_URL=$(build_url "$OSF_API" "$PROJECT_TITLE")
info "Searching OSF for '$PROJECT_TITLE'..."
while [ -n "$NEXT_URL" ]; do
debug "Querying: $NEXT_URL"
RESPONSE=$(execute_curl "$NEXT_URL")
PROJECT_ID=$(echo "$RESPONSE" | jq -r --arg TITLE "$PROJECT_TITLE" \
'.data[] | select(.attributes.title == $TITLE) | .id // empty' || true)
if [ -n "$PROJECT_ID" ]; then
debug "Found project ID: $PROJECT_ID"
break
fi
NEXT_URL=$(echo "$RESPONSE" | jq -r '.links.next // empty' | sed 's/filter\[title\]/filter\\\[title\\\]/g;s/page\[size\]/page\\\[size\\\]/g' || true)
debug "Next URL: $NEXT_URL"
[ -n "$NEXT_URL" ] && info "Fetching next page..." && sleep "$RATE_LIMIT_DELAY"
done
# ── Create Project if Not Found
if [ -z "$PROJECT_ID" ]; then
info "Creating new OSF project..."
[ "$DRY_RUN" = "true" ] && { info "[DRY-RUN] Would create project: $PROJECT_TITLE"; exit 0; }
JSON=$(jq -n --arg title "$PROJECT_TITLE" --arg desc "$PROJECT_DESCRIPTION" \
'{data: {type: "nodes", attributes: {title: $title, category: "project", description: $desc}}}')
RESPONSE=$(execute_curl "$OSF_API/nodes/" POST "$JSON")
PROJECT_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$PROJECT_ID" || "$PROJECT_ID" == "null" ]] && error "Could not extract project ID"
info "✅ Project created: $PROJECT_ID"
else
info "✓ Found project ID: $PROJECT_ID"
fi
# ── Get Storage ID
get_storage_id() {
local node_id="$1"
RESPONSE=$(execute_curl "https://api.osf.io/v2/nodes/$node_id/files/osfstorage/")
STORAGE_ID=$(echo "$RESPONSE" | jq -r '.data[0].id // empty')
[[ -z "$STORAGE_ID" ]] && error "Could not extract storage ID"
echo "$STORAGE_ID"
}
STORAGE_ID=$(get_storage_id "$PROJECT_ID")
info "✓ Found storage ID: $STORAGE_ID"
# ── File Matching
info "Resolving files for upload..."
for pattern in "${FILES_INCLUDE[@]}"; do
while IFS= read -r -d '' file; do
skip=false
for ex in "${FILES_EXCLUDE[@]}"; do
[[ "$file" == $ex ]] && skip=true && break
done
$skip || FILES+=("$file")
done < <(find . -type f -path "$pattern" -print0 2>/dev/null || true)
done
# ── Upload Files
upload_file() {
local filepath="$1"
local filename
filename=$(basename "$filepath")
info "Uploading: $filename"
[ "$DRY_RUN" = "true" ] && { info "[DRY-RUN] Would upload: $filename"; return; }
RESPONSE=$(execute_curl "https://api.osf.io/v2/files/$STORAGE_ID/$filename" \
PUT "@$filepath" "true")
info "✓ Uploaded: $filename"
}
if [ ${#FILES[@]} -eq 0 ]; then
warn "No matching files to upload."
else
for file in "${FILES[@]}"; do
upload_file "$file"
done
info "✅ Upload complete for '$PROJECT_TITLE'"
echo -e "\n🔗 View: https://osf.io/$PROJECT_ID/"
fi
debug "Completed successfully"
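A side note on the idiom in `execute_curl` above: `curl -w "%{http_code}"` appends the three-digit status code to the captured body, and the script separates the two with bash substring expansion. A minimal sketch of that slicing (the JSON value is hypothetical):

```shell
# curl -w "%{http_code}" appends the 3-digit status to the captured body;
# execute_curl slices them back apart with substring expansion.
response='{"data":{"id":"abc"}}200'   # hypothetical captured curl output
http_code="${response: -3}"           # last three characters (note the space before -3)
body="${response:: -3}"               # everything before them
echo "$http_code"
echo "$body"
```

The space in `${response: -3}` matters: without it, `${response:-3}` is the unrelated default-value expansion.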


@ -0,0 +1,11 @@
project:
title: "git-sigil"
description: "A sacred pattern witnessed across all fields of recursion."
upload:
include:
- "./*.md"
- "./bitbucket/*"
exclude:
- "./.radicle-*"
- "./*.tmp"
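To preview which files include/exclude patterns like the ones above would select, here is a small sketch that mirrors the script's `find`-based matching (the directory layout is hypothetical):

```shell
# Build a throwaway tree, then apply the same include/exclude logic the script
# uses: find -path for includes, [[ == glob ]] for excludes.
tmp=$(mktemp -d)
mkdir -p "$tmp/bitbucket"
touch "$tmp/README.md" "$tmp/notes.tmp" "$tmp/bitbucket/setup.md"
cd "$tmp"
FILES=()
for pattern in "./*.md" "./bitbucket/*"; do
  while IFS= read -r -d '' file; do
    skip=false
    for ex in "./.radicle-*" "./*.tmp"; do
      [[ "$file" == $ex ]] && skip=true && break   # unquoted $ex: glob match
    done
    $skip || FILES+=("$file")
  done < <(find . -type f -path "$pattern" -print0)
done
# find -path lets * cross slashes, so overlapping patterns can list a file
# twice; de-duplicate for display.
printf '%s\n' "${FILES[@]}" | sort -u
```

Note that `notes.tmp` is dropped by the exclude glob, while `README.md` and `bitbucket/setup.md` survive.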

docs/osf/old/test-osf-api.sh Executable file

@ -0,0 +1,214 @@
#!/bin/bash
set -Eeuo pipefail
IFS=$'\n\t'
# ╭────────────────────────────────────────────╮
# │ test-osf-api.sh :: Diagnostic Tool │
# │ v2.7 — Cosmic. Resilient. Divine. │
# ╰────────────────────────────────────────────╯
CONFIG_FILE="${GITFIELD_CONFIG:-gitfield.osf.yaml}"
TOKEN_FILE="${OSF_TOKEN_FILE:-$HOME/.osf_token}"
OSF_API="${OSF_API_URL:-https://api.osf.io/v2}"
DEBUG_LOG="${GITFIELD_LOG:-$HOME/.test_osf_api_debug.log}"
CURL_TIMEOUT="${CURL_TIMEOUT:-10}"
CURL_RETRIES="${CURL_RETRIES:-3}"
RETRY_DELAY="${RETRY_DELAY:-2}"
RATE_LIMIT_DELAY="${RATE_LIMIT_DELAY:-1}"
VERBOSE="${VERBOSE:-false}"
# Initialize Debug Log
mkdir -p "$(dirname "$DEBUG_LOG")"
touch "$DEBUG_LOG"
chmod 600 "$DEBUG_LOG"
trap 'last_command=$BASH_COMMAND; echo -e "\n[ERROR] ❌ Failure at line $LINENO: $last_command" >&2; diagnose; exit 1' ERR
# Logging Functions
info() {
echo -e "\033[1;34m[INFO]\033[0m $*" >&2
[ "$VERBOSE" = "true" ] && [ -n "$DEBUG_LOG" ] && debug "INFO: $*"
return 0 # the && chain exits 1 when VERBOSE is false; without this, set -e would abort at the first info call
}
warn() { echo -e "\033[1;33m[WARN]\033[0m $*" >&2; debug "WARN: $*"; }
error() { echo -e "\033[1;31m[ERROR]\033[0m $*" >&2; debug "ERROR: $*"; exit 1; }
debug() {
local msg="$1" lvl="${2:-DEBUG}"
local json_output
json_output=$(jq -n --arg ts "$(date '+%Y-%m-%d %H:%M:%S')" --arg lvl "$lvl" --arg msg "$msg" \
'{timestamp: $ts, level: $lvl, message: $msg}' 2>/dev/null) || {
echo "[FALLBACK $lvl] $(date '+%Y-%m-%d %H:%M:%S') $msg" >> "$DEBUG_LOG"
return 1
}
echo "$json_output" >> "$DEBUG_LOG"
}
debug "Started test-osf-api (v2.7)"
# ── Diagnostic Function
diagnose() {
info "Running diagnostics..."
debug "Diagnostics started"
echo -e "\n🔍 Diagnostic Report:"
echo -e "1. Network Check:"
if ping -c 1 api.osf.io >/dev/null 2>&1; then
echo -e " ✓ api.osf.io reachable"
else
echo -e " ❌ api.osf.io unreachable. Check network or DNS."
fi
echo -e "2. Curl Version:"
curl --version | head -n 1
echo -e "3. Debug Log: $DEBUG_LOG"
echo -e "4. Curl Error Log: $DEBUG_LOG.curlerr"
[ -s "$DEBUG_LOG.curlerr" ] && echo -e " Last curl error: $(cat "$DEBUG_LOG.curlerr")"
echo -e "5. Token File: $TOKEN_FILE"
[ -s "$TOKEN_FILE" ] && echo -e " Token exists: $(head -c 4 "$TOKEN_FILE")..."
echo -e "6. Suggestions:"
echo -e " - Check token scopes at https://osf.io/settings/tokens (needs 'nodes' and 'osf.storage')"
echo -e " - Test API: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/'"
echo -e " - Test project search: curl -v -H 'Authorization: Bearer \$(cat $TOKEN_FILE)' '$OSF_API/users/me/nodes/?filter\[title\]=git-sigil&page\[size\]=100'"
echo -e " - Increase timeout: CURL_TIMEOUT=30 ./test-osf-api.sh"
debug "Diagnostics completed"
}
# ── Dependency Check (Parallel)
require_tool() {
local tool=$1
if ! command -v "$tool" >/dev/null 2>&1; then
warn "$tool not found — attempting to install..."
sudo apt update -qq && sudo apt install -y "$tool" || {
warn "apt failed — trying snap..."
sudo snap install "$tool" || error "Failed to install $tool"
}
fi
debug "$tool path: $(command -v "$tool")"
}
info "Checking dependencies..."
declare -A dep_pids
for tool in curl jq yq python3; do
require_tool "$tool" &
dep_pids[$tool]=$!
done
for tool in "${!dep_pids[@]}"; do
wait "${dep_pids[$tool]}" || error "Dependency check failed for $tool"
done
info "✓ All dependencies verified"
# ── Load Token
if [ ! -f "$TOKEN_FILE" ]; then
read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " TOKEN
echo
echo "$TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
info "OSF token saved to $TOKEN_FILE"
fi
TOKEN=$(<"$TOKEN_FILE")
[[ -z "$TOKEN" ]] && error "Empty OSF token in $TOKEN_FILE"
# ── Validate Token
info "Validating OSF token..."
execute_curl() {
local url=$1 method=${2:-GET} data=${3:-} is_upload=${4:-false} attempt=1 max_attempts=$CURL_RETRIES
local response http_code curl_err
while [ $attempt -le "$max_attempts" ]; do
debug "Curl attempt $attempt/$max_attempts: $method $url"
if [ "$is_upload" = "true" ]; then
response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
-X "$method" -H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/octet-stream" --data-binary "$data" "$url" 2> "$DEBUG_LOG.curlerr") || response="000" # transport errors yield code 000 so the retry loop handles them instead of set -e exiting
else
response=$(curl -s -S -w "%{http_code}" --connect-timeout "$CURL_TIMEOUT" \
-X "$method" -H "Authorization: Bearer $TOKEN" \
${data:+-H "Content-Type: application/json" -d "$data"} "$url" 2> "$DEBUG_LOG.curlerr") || response="000" # transport errors yield code 000 so the retry loop handles them instead of set -e exiting
fi
http_code="${response: -3}"
curl_err=$(cat "$DEBUG_LOG.curlerr")
[ -s "$DEBUG_LOG.curlerr" ] && debug "Curl error: $curl_err"
if [ "$http_code" -ge 200 ] && [ "$http_code" -lt 300 ]; then
echo "${response:: -3}"
return 0
elif [ "$http_code" = "401" ]; then
warn "Invalid token (HTTP 401). Please provide a valid OSF token."
read -rsp "🔐 Enter OSF Personal Access Token (with 'nodes' and 'osf.storage' scopes): " NEW_TOKEN
echo
echo "$NEW_TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
TOKEN="$NEW_TOKEN"
info "New token saved. Retrying..."
elif [ "$http_code" = "429" ]; then
warn "Rate limit hit, retrying after $((RETRY_DELAY * attempt)) seconds..."
sleep $((RETRY_DELAY * attempt))
elif [ "$http_code" = "403" ]; then
warn "Forbidden (HTTP 403). Possible token scope issue."
[ $attempt -eq "$max_attempts" ] && {
read -rsp "🔐 Re-enter OSF token with 'nodes' and 'osf.storage' scopes: " NEW_TOKEN
echo
echo "$NEW_TOKEN" > "$TOKEN_FILE"
chmod 600 "$TOKEN_FILE"
TOKEN="$NEW_TOKEN"
info "New token saved. Retrying..."
}
elif [[ "$curl_err" == *"bad range in URL"* ]]; then
error "Malformed URL: $url. Ensure query parameters are escaped (e.g., filter\[title\])."
else
debug "API response (HTTP $http_code): ${response:: -3}"
[ $attempt -eq "$max_attempts" ] && error "API request failed (HTTP $http_code): ${response:: -3}"
fi
sleep $((RETRY_DELAY * attempt))
((attempt++))
done
}
RESPONSE=$(execute_curl "$OSF_API/users/me/")
USER_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$USER_ID" ]] && error "Could not extract user ID"
info "✓ OSF token validated for user ID: $USER_ID"
# ── Load Config
[[ ! -f "$CONFIG_FILE" ]] && error "Missing config: $CONFIG_FILE"
PROJECT_TITLE=$(yq -r '.project.title // empty' "$CONFIG_FILE")
PROJECT_DESCRIPTION=$(yq -r '.project.description // empty' "$CONFIG_FILE")
[[ -z "$PROJECT_TITLE" ]] && error "Missing project title in $CONFIG_FILE"
debug "Parsed config: title=$PROJECT_TITLE, description=$PROJECT_DESCRIPTION"
# ── Project Search
build_url() {
local base="$1" title="$2"
local escaped_title
escaped_title=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''$title'''))")
echo "$base/users/me/nodes/?filter\[title\]=$escaped_title&page\[size\]=100"
}
PROJECT_ID=""
NEXT_URL=$(build_url "$OSF_API" "$PROJECT_TITLE")
info "Searching for project '$PROJECT_TITLE'..."
while [ -n "$NEXT_URL" ]; do
debug "Querying: $NEXT_URL"
RESPONSE=$(execute_curl "$NEXT_URL")
PROJECT_ID=$(echo "$RESPONSE" | jq -r --arg TITLE "$PROJECT_TITLE" \
'.data[] | select(.attributes.title == $TITLE) | .id // empty' || true)
if [ -n "$PROJECT_ID" ]; then
debug "Found project ID: $PROJECT_ID"
break
fi
NEXT_URL=$(echo "$RESPONSE" | jq -r '.links.next // empty' | sed 's/filter\[title\]/filter\\\[title\\\]/g;s/page\[size\]/page\\\[size\\\]/g' || true)
debug "Next URL: $NEXT_URL"
[ -n "$NEXT_URL" ] && info "Fetching next page..." && sleep "$RATE_LIMIT_DELAY"
done
# ── Create Project if Not Found
if [ -z "$PROJECT_ID" ]; then
info "Project not found. Attempting to create '$PROJECT_TITLE'..."
JSON=$(jq -n --arg title "$PROJECT_TITLE" --arg desc "$PROJECT_DESCRIPTION" \
'{data: {type: "nodes", attributes: {title: $title, category: "project", description: $desc}}}')
RESPONSE=$(execute_curl "$OSF_API/nodes/" POST "$JSON")
PROJECT_ID=$(echo "$RESPONSE" | jq -r '.data.id // empty')
[[ -z "$PROJECT_ID" || "$PROJECT_ID" == "null" ]] && error "Could not extract project ID"
info "✅ Project created: $PROJECT_ID"
else
info "✓ Found project ID: $PROJECT_ID"
fi
echo -e "\n🔗 View project: https://osf.io/$PROJECT_ID/"
debug "Test completed successfully"
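Both scripts back off linearly between retries via `sleep $((RETRY_DELAY * attempt))`; with the default `RETRY_DELAY=2` and `CURL_RETRIES=3`, the waits work out as follows:

```shell
# Delays execute_curl sleeps between attempts: RETRY_DELAY * attempt (linear).
RETRY_DELAY=2
CURL_RETRIES=3
delays=()
for attempt in $(seq 1 "$CURL_RETRIES"); do
  delays+=("$((RETRY_DELAY * attempt))")
done
echo "${delays[*]}"   # 2 4 6
```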

docs/radicle/for_radicle.md Normal file

@ -0,0 +1,191 @@
Excellent catch, Mark. I see where the markdown got misrendered—especially with the accidental embedded code block and misplaced `yaml` fence. Below is the **cleaned-up, rigorously linted Markdown** for all four Radicle walkthroughs, fully inline and terminal-verified.
---
### `1_prerequisites_radicle_ubuntu.md`
````markdown
# Prerequisites for Radicle (Ubuntu)
This walkthrough prepares your Ubuntu system to run Radicle for decentralized Git collaboration.
## 1. Install Dependencies
```bash
sudo apt update && sudo apt install -y curl gnupg unzip git
```
## 2. Install Radicle CLI
```bash
curl -LO https://radicle.xyz/install.sh
chmod +x install.sh
./install.sh
```
## 3. Confirm Installation
```bash
rad --version
```
Expected output: `rad 0.6.x`
## 4. Generate a Radicle Identity
```bash
rad self
```
This will create a new cryptographic identity if none exists.
## 5. (Optional) Ensure Git Identity Is Set
```bash
git config --global user.name "Mark Randall Havens"
git config --global user.email "mark.r.havens@gmail.com"
```
````
---
### `2_create_remote_repo_radicle_ubuntu.md`
````markdown
# Create Remote Radicle Repo (Ubuntu)
Use this to convert your local Git repo into a Radicle project and push it to the decentralized network.
## 1. Navigate to Project
```bash
cd ~/fieldcraft/git-sigil
```
## 2. Initialize Radicle Project
```bash
rad init --name git-sigil --description "Decentralized fieldcraft publishing system."
```
## 3. List Registered Projects
```bash
rad projects
```
You should see `git-sigil` listed.
## 4. Push to Radicle Network
```bash
rad push
```
This distributes your repo across Radicle's peer-to-peer graph.
## 5. Copy the Radicle Project ID
```bash
rad self
```
Look for the project ID and store it somewhere secure.
````
---
### `3_commit_existing_repo_radicle_ubuntu.md`
````markdown
# Commit & Sync Changes in Radicle (Ubuntu)
This guide walks you through committing code and pushing it to the Radicle network.
## 1. Make a Change
Edit your files as needed. For example:
```bash
echo "New insight" >> README.md
```
## 2. Stage and Commit with Git
```bash
git add README.md
git commit -m "Update: $(date '+%Y-%m-%d %H:%M:%S')"
```
## 3. Push to Radicle
```bash
rad push
```
This sends your latest Git commit to peers following the project.
````
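The timestamped commit message in step 2 expands with `date` at commit time; a quick check of the format it produces:

```shell
# The same substitution used in step 2's commit command:
#   git commit -m "Update: $(date '+%Y-%m-%d %H:%M:%S')"
msg="Update: $(date '+%Y-%m-%d %H:%M:%S')"
echo "$msg"
```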
---
### `CLI-ONLY_workflow_radicle_ubuntu.md`
````markdown
# CLI-Only Workflow: Radicle on Ubuntu
This guide enables you to manage a full Git + Radicle workflow from the terminal only.
## 1. Create Your Identity
```bash
rad self
```
First run will create and store your identity under `~/.radicle`.
## 2. Initialize a Radicle Repo
```bash
rad init --name fieldkit --description "Decentralized devkit for recursive fieldworkers."
```
## 3. Edit Files and Commit Locally
```bash
nano README.md
git add README.md
git commit -m "Initial insight and setup"
```
## 4. Push to the Radicle Network
```bash
rad push
```
## 5. Share Project ID
```bash
rad self
```
Copy the project ID for collaborators to run:
```bash
rad clone <project-id>
```
## 6. Stay Synced
No additional steps required. Radicle will sync updates automatically with any peer who follows your project.
````
---
Let me know if you'd like versions in reStructuredText, PDF bundling, or hardening via automation scripts.