Collection Management Advanced
Skill Level: Intermediate to Advanced
Master advanced collection management techniques for complex organizational structures and sophisticated automation workflows.
Organize collections in hierarchies for large organizations:
# Enterprise collection structure
enterprise/
├── infrastructure/      # Shared infrastructure tools
├── security/            # Security and compliance tools
├── teams/
│   ├── frontend/        # Frontend team tools
│   ├── backend/         # Backend team tools
│   ├── data/            # Data team tools
│   └── qa/              # QA team tools
└── projects/
    ├── project-a/       # Project-specific tools
    └── project-b/       # Project-specific tools
# Install base infrastructure collection
dr -col add git@company.com/infrastructure.git
# Install team-specific collections
dr -col add git@company.com/frontend-tools.git
dr -col add git@company.com/backend-tools.git
dr -col add git@company.com/qa-tools.git
# Install project-specific collections
dr -col add git@company.com/project-a-tools.git
dr -col add git@company.com/project-b-tools.git
# List all installed collections
dr -col list
Create collections that depend on other collections:
The dotrun.collection.yml file defines collection metadata using YAML format:
# dotrun.collection.yml
name: frontend-tools # Collection identifier (alphanumeric, dashes, underscores)
version: 2.0.0 # Semantic version (X.Y.Z) - must match git tag
description: Frontend development tools and workflows # One-line summary
author: Frontend Team # Creator or organization name
repository: https://github.com/team/frontend-tools # Git repository URL (HTTPS or SSH)
# Optional metadata
license: MIT # SPDX identifier (MIT, Apache-2.0, GPL-3.0, etc.)
homepage: https://docs.team.com/frontend-tools # Documentation or project URL
dependencies: [] # List of required collections (see Dependencies below)
# dotrun.collection.yml
name: frontend-tools
version: 2.0.0
description: Frontend development tools and workflows for React/TypeScript projects
author: Frontend Team <frontend@company.com>
repository: https://github.com/team/frontend-tools
# Optional fields
license: MIT
homepage: https://docs.company.com/frontend-tools
dependencies:
- infrastructure-tools
- security-baseline
name
- Required: Yes
- Format: Alphanumeric characters, dashes, and underscores only
- Pattern: ^[a-zA-Z0-9_-]+$
- Example: frontend-tools, my_collection, DevOps123
- Invalid: my collection (spaces), my.collection (dots)
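The name rules above boil down to a single regex test. A minimal sketch (`valid_collection_name` is an illustrative helper, not a DotRun command):

```shell
#!/usr/bin/env bash
# Sketch: validate a collection name against the documented pattern
# ^[a-zA-Z0-9_-]+$ (alphanumerics, dashes, underscores only).
valid_collection_name() {
  [[ "$1" =~ ^[a-zA-Z0-9_-]+$ ]]
}
```

For example, `valid_collection_name frontend-tools` succeeds, while `valid_collection_name "my collection"` fails because of the space.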
version
- Required: Yes
- Format: Semantic versioning (MAJOR.MINOR.PATCH)
- Pattern: ^\d+\.\d+\.\d+$
- Must match git tag (without 'v' prefix)
- Example: 1.0.0, 2.5.3, 10.0.1
- Git tag: v1.0.0 or 1.0.0 (both valid)
description
- Required: Yes
- Format: Single-line string (no newlines)
- Recommended: 50-100 characters
- Purpose: Brief summary shown in collection lists
author
- Required: Yes
- Format: Free-form string
- Can include email: Name <email@example.com>
- Can be organization: Company DevOps Team
repository
- Required: Yes
- Format: GitHub URL (HTTPS or SSH)
- HTTPS: https://github.com/user/repo.git or https://github.com/user/repo
- SSH: git@github.com:user/repo.git
- Must be accessible (public or authenticated)
license
- Required: No
- Format: SPDX identifier string
- Common values: MIT, Apache-2.0, GPL-3.0, BSD-3-Clause, ISC
- See: https://spdx.org/licenses/
homepage
- Required: No
- Format: Valid URL (HTTP/HTTPS)
- Purpose: Link to documentation, wiki, or project page
dependencies
- Required: No
- Format: Array of collection names (strings)
- Each item must be a valid collection name
- Collections are not auto-installed (manual installation required)
- Example: ['base-tools', 'security-utils']
Critical: The version field in dotrun.collection.yml MUST match your git tag (excluding the optional 'v' prefix):
# Metadata file
version: 2.0.0
# Git tag (either format works)
git tag v2.0.0 # With 'v' prefix
git tag 2.0.0 # Without prefix
# Both tags above match version: 2.0.0
Workflow:
- Update version in dotrun.collection.yml
- Commit changes: git commit -m "Release v2.0.0"
- Create matching tag: git tag v2.0.0
- Push with tags: git push origin main --tags
Validation: DotRun validates the version/tag match during installation and updates.
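The version/tag match can be sanity-checked locally before pushing. A minimal sketch, assuming it is run from the collection root and that a tag is reachable (`check_version_tag` is an illustrative helper; `yq eval '.version' dotrun.collection.yml` works equally well for the extraction step):

```shell
#!/usr/bin/env bash
# Sketch: confirm dotrun.collection.yml's version matches the latest git tag,
# treating the 'v' prefix on the tag as optional.
check_version_tag() {
  local version tag
  # Extract the version field from the metadata file
  version="$(awk -F': *' '$1 == "version" {print $2; exit}' dotrun.collection.yml)"
  # Latest tag reachable from the current commit
  tag="$(git describe --tags --abbrev=0)"
  if [[ "${tag#v}" == "$version" ]]; then
    echo "OK: version $version matches tag $tag"
  else
    echo "MISMATCH: version $version vs tag $tag" >&2
    return 1
  fi
}
```

Running `check_version_tag` as a pre-push step catches mismatches before DotRun's installer would reject them.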
DotRun v3.1.0 uses a copy-based architecture. Resources from collections are copied into your local DotRun config, allowing safe local edits while DotRun tracks whether files diverge from the last imported version via SHA256 hashes.
Advantages:
- Local edits: You can freely modify imported files without impacting the collection's git clone
- Clear update semantics: DotRun detects modified vs. unmodified files using stored hashes
- Predictable behavior: No symlink quirks or permission issues across platforms
Trade-offs:
- Slightly more disk usage due to copies
- Requires a hash index for change detection
~/.config/dotrun/collections/ # Persistent git clones of collections
├── <collection-name>/
│ ├── .git/ # Full git repository
│ ├── dotrun.collection.yml # Collection metadata
│ ├── scripts/ # Collection scripts
│ ├── aliases/ # Collection aliases
│ ├── helpers/ # Collection helpers
│ └── configs/ # Collection configs
~/.local/share/dotrun/collections.conf # Installed collections tracking
# Lines: <collection-name>|<repo-url>|<version>|<last-sync>
~/.config/dotrun/ # Imported resources (copied from collections)
├── scripts/
├── aliases/
├── helpers/
└── configs/
~/.config/dotrun/.dr-hashes/ # SHA256 tracking for imported resources
└── <resource-basename>.sha256 # Stores original hash for modification detection
On import, DotRun copies a file and stores the file's SHA256 in .dr-hashes/ for later comparison.
Import Process (copy_with_hash):
# Copy resource to destination
cp "$source" "$dest"
# Calculate SHA256 hash
hash="$(sha256sum "$dest" | awk '{print $1}')"
# Store hash for modification detection
echo "$hash" >"$DR_CONFIG/.dr-hashes/$(basename "$dest").sha256"
Update Check (check_file_modified):
# Read stored hash from import time
stored_hash="$(cat "$hash_file")"
# Calculate current hash of file
current_hash="$(sha256sum "$file" | awk '{print $1}')"
# Compare to detect modifications
# Compare to detect modifications
if [[ "$stored_hash" == "$current_hash" ]]; then
  : # File is unmodified since import
else
  : # File has been modified locally
fi
Note: Hash files are keyed by the destination file's basename. Resource namespaces (scripts/aliases/helpers/configs) should avoid basename collisions to prevent hash conflicts.
When updating a collection, DotRun handles three scenarios:
1. Unmodified Files (hash matches):
- File hasn't been changed since import
- Options: update to new version, diff vs. incoming, skip
2. Modified Files (hash differs):
- File has been edited locally
- Options: keep local changes, overwrite with incoming, diff, backup then overwrite
3. New Files (in collection, not currently imported):
- File exists in collection but not in local workspace
- Options: import, view/diff, skip
Update Flow:
# Pseudocode for update prompts
if file_new_in_collection && not_imported; then
prompt: [import | view | skip]
elif hash_matches; then
prompt: [update | diff | skip]
else
prompt: [keep | overwrite | diff | backup+overwrite]
fi
A file exists in the updated collection but has never been imported to your workspace.
When This Occurs:
- Collection author added a new script, alias, helper, or config
- You skipped this file during initial collection installation
- File was added in a collection version update
Interactive Prompt Options:
import - Copy the file to your workspace and start tracking with SHA256 hash
# Example workflow
dr -col update my-collection
# Prompt for scripts/new-deploy.sh:
# [NEW] scripts/new-deploy.sh not currently imported
# Options: [i]mport | [v]iew | [s]kip
# Choice: i
# Result: File copied to ~/.config/dotrun/scripts/new-deploy.sh
# Hash stored in ~/.config/dotrun/.dr-hashes/new-deploy.sh.sha256
view - Preview the file content before deciding
# Choice: v
# Result: Displays file content with syntax highlighting
# Then re-prompts: [i]mport | [s]kip
skip - Don't import this file (can import later)
# Choice: s
# Result: File remains in collection only, not imported to workspace
# Future updates will continue offering to import
When to Use Each Option:
- import: File looks useful, want to add it to your workspace
- view: Unsure if file is needed, want to review content first
- skip: File not relevant to your workflow, or will import manually later
A file exists in your workspace and its hash matches the stored hash - you haven't edited it since import.
When This Occurs:
- File was imported from collection and never edited locally
- You restored a file to its original state
- File modification was reverted
Interactive Prompt Options:
update - Replace with new collection version (recommended)
# Example workflow
dr -col update my-collection
# Prompt for scripts/deploy.sh:
# [UNMODIFIED] scripts/deploy.sh (hash matches)
# Collection has newer version (v2.1.0)
# Options: [u]pdate | [d]iff | [s]kip
# Choice: u
# Result: File updated to collection version v2.1.0
# New hash stored in ~/.config/dotrun/.dr-hashes/deploy.sh.sha256
diff - Compare your current version vs. incoming collection version
# Choice: d
# Result: Shows side-by-side diff highlighting changes
# Example output:
# --- Current (v2.0.0)
# +++ Incoming (v2.1.0)
# @@ -15,6 +15,8 @@
# deploy_to_production() {
# echo "Deploying..."
# + # New feature: health check before deploy
# + check_service_health
# }
# Then re-prompts: [u]pdate | [s]kip
skip - Keep current version, don't update
# Choice: s
# Result: File unchanged, remains at previous version
# Future updates will continue offering to update
When to Use Each Option:
- update: Trust collection maintainer, want latest improvements
- diff: Want to review changes before updating
- skip: Current version works, prefer not to update yet
Best Practice: For unmodified files, use diff first to review changes, then update if changes look good.
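The same comparison can also be done outside the interactive prompt, since both copies live on disk (paths follow the layout documented above; the collection and file names are illustrative and `diff_against_collection` is a hypothetical helper):

```shell
#!/usr/bin/env bash
# Sketch: diff an imported resource against the copy in the collection's
# persistent git clone, before choosing update/skip in the prompt.
diff_against_collection() {
  local collection="$1" resource="$2"   # e.g. my-collection scripts/deploy.sh
  diff -u "$HOME/.config/dotrun/$resource" \
          "$HOME/.config/dotrun/collections/$collection/$resource"
}
# Example: diff_against_collection my-collection scripts/deploy.sh
```

A non-empty diff means the incoming collection version differs from your workspace copy; `diff` exits non-zero in that case, which scripts can use as a signal.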
A file exists in your workspace and its hash doesn't match the stored hash - you've edited it locally.
When This Occurs:
- You customized a collection script for your environment
- You fixed a bug or added features locally
- You made temporary changes for testing
Interactive Prompt Options:
keep - Preserve your local changes, don't update from collection (safe)
# Example workflow
dr -col update my-collection
# Prompt for scripts/deploy.sh:
# [MODIFIED] scripts/deploy.sh (hash differs - local edits detected)
# You have local changes
# Collection has newer version (v2.1.0)
# Options: [k]eep | [o]verwrite | [d]iff | [b]ackup+overwrite
# Choice: k
# Result: File unchanged, local edits preserved
# Hash NOT updated (remains original import hash)
# Future updates will continue detecting as modified
overwrite - Replace with collection version, LOSE local changes
# Choice: o
# WARNING: This will permanently discard your local changes!
# Confirm overwrite? [y/N]: y
# Result: File replaced with collection version v2.1.0
# Local changes LOST permanently
# New hash stored
diff - Compare your local changes vs. incoming collection version
# Choice: d
# Result: Three-way diff showing:
# 1. Original imported version
# 2. Your local changes
# 3. Incoming collection changes
# Example output:
# --- Original (v2.0.0)
# +++ Your Local Changes
# @@ -10,3 +10,5 @@
# deploy_command="./deploy.sh"
# +# Custom: Added notification
# +notify "Deployment started"
#
# --- Original (v2.0.0)
# +++ Incoming Collection (v2.1.0)
# @@ -15,6 +15,8 @@
# deploy_to_production() {
# + # Collection: Added health check
# + check_service_health
# }
# Then re-prompts: [k]eep | [o]verwrite | [b]ackup+overwrite
backup+overwrite - Save your local version, then update from collection
# Choice: b
# Result:
# 1. Local version backed up to: scripts/deploy.sh.backup
# 2. File replaced with collection version v2.1.0
# 3. New hash stored
# 4. You can manually merge your changes from .backup file
When to Use Each Option:
- keep: Local changes are intentional customizations, want to preserve
- overwrite: Local changes were temporary, want latest collection version (CAUTION!)
- diff: Want to see exactly what changed before deciding
- backup+overwrite: Want collection update but preserve local changes for manual merge
Data Loss Warning: The overwrite option permanently discards local changes. There is NO undo. Always use backup+overwrite instead unless you're absolutely certain.
Best Practices:
- Always use diff first - Review both your changes and collection changes before deciding
- Use backup+overwrite, not overwrite - Safer approach that preserves your work
- Document custom changes - Add comments in files explaining why you modified them
- Consider contributing back - If your changes fix bugs or add value, submit PR to collection
- Manual merge workflow:
# After backup+overwrite, merge manually:
dr -col update my-collection
# Choose backup+overwrite for modified files
# Review your backed up changes
cat ~/.config/dotrun/scripts/deploy.sh.backup
# Review new collection version
cat ~/.config/dotrun/scripts/deploy.sh
# Manually merge your changes into new version
$EDITOR ~/.config/dotrun/scripts/deploy.sh
# Test merged version
dr deploy --dry-run
# Remove backup after successful merge
rm ~/.config/dotrun/scripts/deploy.sh.backup
Collection Update Started
│
├─ For each file in collection:
│ │
│ ├─ File NOT in workspace?
│ │ └─ → NEW FILE scenario
│ │ ├─ [import] Copy to workspace + track hash
│ │ ├─ [view] Preview → [import | skip]
│ │ └─ [skip] Remain unimported
│ │
│ ├─ File in workspace → Check hash
│ │ │
│ │ ├─ Hash MATCHES stored hash?
│ │ │ └─ → UNMODIFIED FILE scenario
│ │ │ ├─ [update] Replace with collection version
│ │ │ ├─ [diff] Compare → [update | skip]
│ │ │ └─ [skip] Keep current version
│ │ │
│ │ └─ Hash DIFFERS from stored hash?
│ │ └─ → MODIFIED FILE scenario
│ │ ├─ [keep] Preserve local changes (safest)
│ │ ├─ [overwrite] Discard local changes (⚠️ DATA LOSS!)
│ │ ├─ [diff] Compare three-way → [keep | overwrite | backup]
│ │ └─ [backup+overwrite] Save local + update (recommended)
│ │
│ └─ Process next file...
│
└─ Update Complete
└─ Summary: X files updated, Y kept, Z new files imported
Scenario: Collection maintainer fixed a bug in unmodified script
dr -col update team-tools
# File: scripts/test-runner.sh [UNMODIFIED]
# Action: [d]iff → Review fix → [u]pdate
# Result: Bug fix applied, script now works correctly
Scenario: Collection added new helper, you want to use it
dr -col update team-tools
# File: helpers/database.sh [NEW]
# Action: [v]iew → Review functions → [i]mport
# Result: New helper available for your scripts
Scenario: You customized script, collection also updated it
dr -col update team-tools
# File: scripts/deploy.sh [MODIFIED]
# Action: [d]iff → See both changes → [b]ackup+overwrite
# Result: Collection updates applied, your changes in deploy.sh.backup
# Next: Manually merge your custom logic into new version
Scenario: You made temporary debug changes, don't need them
dr -col update team-tools
# File: scripts/debug-helper.sh [MODIFIED]
# Action: [o]verwrite → Confirm [y]
# Result: Temporary changes discarded, clean collection version restored
# Warning: Only use if you're certain changes aren't needed!
Scenario: You intentionally customized file for your environment
dr -col update team-tools
# File: configs/01-paths.config [MODIFIED]
# Action: [k]eep
# Result: Your environment-specific paths preserved
# Note: Will show as modified in every future update
Problem: Lost track of which files I modified
Solution: Review all modified files before updating:
# Check which imported files have local changes
cd ~/.config/dotrun
find scripts aliases helpers configs -type f \( -name "*.sh" -o -name "*.config" -o -name "*.aliases" \) | while read -r file; do
basename=$(basename "$file")
hash_file=".dr-hashes/${basename}.sha256"
if [[ -f "$hash_file" ]]; then
stored_hash=$(cat "$hash_file")
current_hash=$(sha256sum "$file" | awk '{print $1}')
if [[ "$stored_hash" != "$current_hash" ]]; then
echo "MODIFIED: $file"
fi
fi
done
Problem: Accidentally chose overwrite, want to recover changes
Solution: If you haven't edited the file again, check shell history or git history (if versioned):
# Check shell history for recent edits
history | grep -i "$EDITOR.*filename"
# If your dotrun config is git-tracked:
cd ~/.config/dotrun
git log --all --full-history -- scripts/filename.sh
git show <commit-hash>:scripts/filename.sh
Prevention: Always use backup+overwrite instead of overwrite for modified files.
Problem: Collection update shows all files as modified, but I didn't change them
Possible causes:
- .dr-hashes/ directory was deleted or corrupted
- Files were edited by another tool (linter, formatter)
- Line ending changes (CRLF vs LF)
Solution: Re-import collection to reset hashes:
# Remove collection and re-add
dr -col remove my-collection
dr -col add https://github.com/user/my-collection.git
# Re-import only the files you need
Problem: Want to update collection but skip certain files automatically
Solution: Use skip for those files and document in a local note:
# Create skip list
echo "scripts/custom-config.sh" >>~/.config/dotrun/.collection-skip-list
# During updates, always [k]eep or [s]kip files in this list
# Note: DotRun doesn't currently auto-skip files (manual process)
For teams managing many collections, consider these update strategies:
Conservative Strategy (Minimize Risk):
- Run dr -col sync to check for updates
- Update one collection at a time
- For each modified file: diff → backup+overwrite → manual merge
- Test after each collection update
- Document merged changes
Aggressive Strategy (Trust Collection Maintainers):
- Bulk update all collections: dr collection-updater bulk-update
- For unmodified files: update automatically
- For modified files: keep (preserve customizations)
- Manually pull critical collection updates later
Balanced Strategy (Recommended):
- Update critical collections individually with review
- For unmodified files: diff → update if changes make sense
- For modified files: diff → backup+overwrite → manual merge
- For new files: view → import if useful
- Test thoroughly after updates
- Keep backup copies until confident
Location: ~/.local/share/dotrun/collections.conf
Format: Pipe-delimited values
# Format: name|repository|version|last_sync_timestamp
# Example:
dotrun|https://github.com/jvPalma/dotrun.git|3.1.0|1729612800
frontend-tools|git@company.com/frontend-tools.git|2.1.0|1729698400
Fields:
- name: Logical collection name
- repository: Git URL used to clone/update
- version: Semantic version (or ref) of the installed collection
- last_sync_timestamp: Epoch seconds of last successful sync
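Being plain pipe-delimited text, the tracking file is easy to query from scripts. A minimal parsing sketch (`list_tracked_collections` is an illustrative helper, not a DotRun command):

```shell
#!/usr/bin/env bash
# Sketch: parse collections.conf (name|repository|version|last_sync_timestamp)
# and print each tracked collection's name and version.
list_tracked_collections() {
  local conf="${1:-$HOME/.local/share/dotrun/collections.conf}"
  # Skip comment lines; split remaining lines on '|'
  awk -F'|' '!/^#/ && NF >= 3 {printf "%s %s\n", $1, $3}' "$conf"
}
```

For example, against the sample file above this would print `dotrun 3.1.0` and `frontend-tools 2.1.0`.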
Scripts that automatically import required collections:
#!/usr/bin/env bash
### DOC
# Auto-import required collections for project setup
### DOC
set -euo pipefail
ensure_collection() {
local collection_name="$1"
local repository="$2"
local version="${3:-latest}"
if ! dr -col list | grep -q "$collection_name"; then
echo "📦 Installing required collection: $collection_name"
# Note: dr -col add installs from the repository URL; for specific
# versions, tag checkout is handled by DotRun during installation
dr -col add "$repository"
else
echo "✅ Collection already available: $collection_name"
# Check version if specified
if [[ "$version" != "latest" ]]; then
# Check current version from collection metadata
local metadata_file="$DR_CONFIG/collections/$collection_name/dotrun.collection.yml"
if [[ -f "$metadata_file" ]]; then
local current_version
current_version=$(yq eval '.version' "$metadata_file")
if [[ "$current_version" != "$version" ]]; then
echo "⚠️ Warning: $collection_name version mismatch (current: $current_version, required: $version)"
fi
fi
fi
fi
}
load_dependencies_from_config() {
local config_file=".drun-project.yml"
if [[ ! -f "$config_file" ]]; then
echo "❌ Project configuration not found: $config_file"
exit 1
fi
echo "📋 Loading dependencies from $config_file..."
# Parse YAML and import collections
# Iterate dependency names, then look up each entry's repository and version
while IFS= read -r name; do
[[ -z "$name" ]] && continue
local repo version
repo=$(yq eval ".collection_dependencies[] | select(.name == \"$name\") | .repository" "$config_file")
version=$(yq eval ".collection_dependencies[] | select(.name == \"$name\") | .version" "$config_file")
ensure_collection "$name" "$repo" "$version"
done < <(yq eval '.collection_dependencies[].name' "$config_file")
}
setup_project_type() {
local project_type="${1:-fullstack}"
echo "🏗️ Setting up project type: $project_type"
case "$project_type" in
fullstack)
ensure_collection "infrastructure" "git@company.com/infrastructure.git"
ensure_collection "frontend" "git@company.com/frontend-tools.git"
ensure_collection "backend" "git@company.com/backend-tools.git"
ensure_collection "database" "git@company.com/database-tools.git"
;;
microservices)
ensure_collection "infrastructure" "git@company.com/infrastructure.git"
ensure_collection "kubernetes" "git@company.com/k8s-tools.git"
ensure_collection "monitoring" "git@company.com/monitoring-tools.git"
ensure_collection "security" "git@company.com/security-tools.git"
;;
data-science)
ensure_collection "infrastructure" "git@company.com/infrastructure.git"
ensure_collection "ml-tools" "git@company.com/ml-tools.git"
ensure_collection "data-processing" "git@company.com/data-tools.git"
ensure_collection "jupyter" "git@company.com/jupyter-tools.git"
;;
mobile)
ensure_collection "infrastructure" "git@company.com/infrastructure.git"
ensure_collection "mobile-ios" "git@company.com/ios-tools.git"
ensure_collection "mobile-android" "git@company.com/android-tools.git"
ensure_collection "mobile-common" "git@company.com/mobile-common.git"
;;
*)
echo "❌ Unknown project type: $project_type"
echo "Available types: fullstack, microservices, data-science, mobile"
exit 1
;;
esac
echo "✅ All required collections imported for $project_type project"
}
main() {
local action="${1:-}"
case "$action" in
auto-setup)
if [[ -f ".drun-project.yml" ]]; then
load_dependencies_from_config
else
setup_project_type "${2:-fullstack}"
fi
;;
setup-type)
setup_project_type "${2:-fullstack}"
;;
ensure)
ensure_collection "$2" "$3" "${4:-latest}"
;;
*)
echo "Usage: dr collection-manager <auto-setup|setup-type|ensure> [args...]"
echo " auto-setup - Load from .drun-project.yml or default"
echo " setup-type <type> - Setup specific project type"
echo " ensure <name> <repo> [ver] - Ensure specific collection"
exit 1
;;
esac
}
main "$@"
#!/usr/bin/env bash
### DOC
# Collection version management and updates
### DOC
set -euo pipefail
list_collection_versions() {
local collection_name="$1"
local collection_path="$DR_CONFIG/collections/$collection_name"
if [[ ! -d "$collection_path" ]]; then
echo "❌ Collection not found: $collection_name"
exit 1
fi
echo "📋 Available versions for $collection_name:"
cd "$collection_path"
local tags
tags="$(git tag -l | grep -E '^v?[0-9]+\.[0-9]+\.[0-9]+$' | sort -V || true)"
if [[ -n "$tags" ]]; then
echo "$tags"
echo
echo "Current version: $(git describe --tags --abbrev=0 2>/dev/null || echo 'untagged')"
echo "Current branch: $(git branch --show-current)"
else
echo "No semantic version tags found"
fi
}
update_collection() {
local collection_name="$1"
# Use built-in update command with conflict resolution
echo "🔄 Updating $collection_name..."
dr -col update "$collection_name"
}
check_updates() {
echo "🔍 Checking for collection updates..."
# Use dr -col sync for built-in update checking
dr -col sync
}
bulk_update() {
echo "🔄 Bulk updating all collections..."
# Get list of installed collections
local collections
collections=$(dr -col list | awk '{print $1}')
local success_count=0
local total_count=0
while IFS= read -r collection; do
[[ -z "$collection" ]] && continue
total_count=$((total_count + 1)) # avoid ((x++)): it returns 1 when x is 0, killing the script under set -e
echo
echo "📦 Updating $collection..."
# Use built-in update command
if dr -col update "$collection"; then
success_count=$((success_count + 1))
else
echo "❌ Failed to update $collection"
fi
done <<<"$collections"
echo
echo "📊 Update Summary:"
echo " Successfully updated: $success_count/$total_count collections"
if [[ $success_count -eq $total_count ]]; then
echo "✅ All collections updated successfully"
else
echo "⚠️ Some collections failed to update"
exit 1
fi
}
main() {
local action="${1:-}"
case "$action" in
list-versions)
list_collection_versions "$2"
;;
update)
update_collection "$2"
;;
check-updates)
check_updates
;;
bulk-update)
bulk_update
;;
*)
echo "Usage: dr collection-updater <command> [args...]"
echo
echo "Commands:"
echo " list-versions <collection> - List available versions"
echo " update <collection> - Update collection (interactive)"
echo " check-updates - Check for available updates"
echo " bulk-update - Update all collections"
exit 1
;;
esac
}
main "$@"
#!/usr/bin/env bash
### DOC
# Validate collection structure and integrity
### DOC
set -euo pipefail
validate_collection_structure() {
local collection_name="$1"
local collection_path="$DR_CONFIG/collections/$collection_name"
echo "🔍 Validating structure for $collection_name..."
local errors=0
# Check if collection directory exists
if [[ ! -d "$collection_path" ]]; then
echo "❌ Collection directory not found: $collection_path"
return 1
fi
# Check for required files
local required_files=("dotrun.collection.yml" "README.md")
for file in "${required_files[@]}"; do
if [[ ! -f "$collection_path/$file" ]]; then
echo "❌ Missing required file: $file"
errors=$((errors + 1)) # avoid ((x++)): it returns 1 when x is 0, killing the script under set -e
else
echo "✅ Found: $file"
fi
done
# Check bin directory structure
if [[ -d "$collection_path/bin" ]]; then
echo "✅ Found: bin/ directory"
# Validate script files
local script_count=0
while IFS= read -r -d '' script_file; do
script_count=$((script_count + 1))
# Check if file is executable
if [[ ! -x "$script_file" ]]; then
echo "⚠️ Script not executable: $(basename "$script_file")"
errors=$((errors + 1))
fi
# Check for DOC sections
if ! grep -q "### DOC" "$script_file"; then
echo "⚠️ Missing DOC section: $(basename "$script_file")"
fi
done < <(find "$collection_path/bin" -name "*.sh" -print0)
echo "📊 Found $script_count scripts"
else
echo "⚠️ No bin/ directory found"
fi
# Validate metadata
if [[ -f "$collection_path/dotrun.collection.yml" ]]; then
if ! yq eval '.' "$collection_path/dotrun.collection.yml" >/dev/null 2>&1; then
echo "❌ Invalid YAML in dotrun.collection.yml"
errors=$((errors + 1))
fi
fi
return $errors
}
test_collection_scripts() {
local collection_name="$1"
local collection_path="$DR_CONFIG/collections/$collection_name"
echo "🧪 Testing scripts in $collection_name..."
local test_failures=0
local test_count=0
if [[ -d "$collection_path/bin" ]]; then
while IFS= read -r -d '' script_file; do
test_count=$((test_count + 1))
local script_name
script_name=$(basename "$script_file" .sh)
echo "Testing: $collection_name/$script_name"
# Syntax check
if ! bash -n "$script_file"; then
echo "❌ Syntax error in: $script_name"
test_failures=$((test_failures + 1))
continue
fi
# ShellCheck if available
if command -v shellcheck >/dev/null; then
if ! shellcheck "$script_file"; then
echo "⚠️ ShellCheck warnings in: $script_name"
fi
fi
# Test help functionality
if "$script_file" --help >/dev/null 2>&1; then
echo "✅ Help works: $script_name"
else
echo "⚠️ No help available: $script_name"
fi
done < <(find "$collection_path/bin" -name "*.sh" -print0)
fi
echo "📊 Tested $test_count scripts, $test_failures failures"
return $test_failures
}
security_scan_collection() {
local collection_name="$1"
local collection_path="$DR_CONFIG/collections/$collection_name"
echo "🔒 Security scanning $collection_name..."
local security_issues=0
# Check for hardcoded secrets
if grep -r -i -E "(password|secret|key|token).*=" "$collection_path/bin/" 2>/dev/null; then
echo "⚠️ Potential hardcoded secrets found"
security_issues=$((security_issues + 1))
fi
# Check for dangerous commands
if grep -r -E "(rm -rf /|dd if=|mkfs|fdisk)" "$collection_path/bin/" 2>/dev/null; then
echo "⚠️ Potentially dangerous commands found"
security_issues=$((security_issues + 1))
fi
# Check file permissions
while IFS= read -r -d '' file; do
local perms
perms=$(stat -c "%a" "$file")
if [[ "$perms" == "777" ]]; then
echo "⚠️ Overly permissive file: $file"
security_issues=$((security_issues + 1))
fi
done < <(find "$collection_path" -type f -print0)
if [[ $security_issues -eq 0 ]]; then
echo "✅ No security issues found"
else
echo "⚠️ $security_issues potential security issues found"
fi
return $security_issues
}
main() {
local action="${1:-}"
local collection_name="${2:-}"
case "$action" in
structure)
if [[ -z "$collection_name" ]]; then
echo "❌ Collection name required"
exit 1
fi
validate_collection_structure "$collection_name"
;;
scripts)
if [[ -z "$collection_name" ]]; then
echo "❌ Collection name required"
exit 1
fi
test_collection_scripts "$collection_name"
;;
security)
if [[ -z "$collection_name" ]]; then
echo "❌ Collection name required"
exit 1
fi
security_scan_collection "$collection_name"
;;
all)
if [[ -z "$collection_name" ]]; then
echo "❌ Collection name required"
exit 1
fi
local total_errors=0
validate_collection_structure "$collection_name" || total_errors=$((total_errors + 1))
test_collection_scripts "$collection_name" || total_errors=$((total_errors + 1))
security_scan_collection "$collection_name" || total_errors=$((total_errors + 1))
if [[ $total_errors -eq 0 ]]; then
echo "✅ All validation checks passed"
else
echo "❌ Validation completed with $total_errors errors/warnings"
exit 1
fi
;;
*)
echo "Usage: dr collection-validator <structure|scripts|security|all> <collection-name>"
exit 1
;;
esac
}
main "$@"
- Performance Optimization - Optimize collection and script performance
- Security Patterns - Advanced security practices
- Monitoring Integration - Collection usage monitoring
Advanced collection management enables scalable automation across large organizations and complex projects.