What You’ll Learn
Comprehensive reference for Verdent’s built-in tool system, including file operations, search capabilities, command execution, and integration tools.

File Operations
Tools for reading, editing, and creating files:
- file_read
- file_edit
- file_write
file_read
Read file contents with optional line ranges for large files. Works with all text formats. Essential for understanding code before modifications.
| Parameter | Description |
|---|---|
| path | File path to read |
| start_line | Starting line number (optional, for reading specific sections) |
| max_lines | Maximum number of lines to return (optional, for limiting output) |

Use Cases:
- Reading configuration files before editing them
- Understanding existing implementation patterns in a codebase
- Reviewing test files to understand current coverage

Best Practices:
- Use line ranges for files over 500 lines to avoid context overload
- Read only the sections relevant to your current task
- For large files, use grep_content first to identify relevant line numbers before reading
- Files larger than 256KB return only the first 256KB of content
- Very large files (>10,000 lines) should be read in sections to avoid context window exhaustion and slow response times
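As a concrete sketch, the parameters above might be combined as follows. Only the parameter names (path, start_line, max_lines) come from this reference; the object notation and the file paths are illustrative assumptions.

```ts
// Illustrative only: parameter names are from the table above; the call
// shape and the file paths are assumptions, not Verdent's actual API.
const readWholeConfig = {
  path: "src/config/app.config.ts",   // small file: read it in full
};

const readLargeFileSlice = {
  path: "src/services/payments.ts",   // large file: read a targeted slice
  start_line: 120,                    // start where grep_content located matches
  max_lines: 80,                      // cap output to avoid context overload
};
```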
Search & Navigation
Tools for finding files and searching content. No limits on search results.
- glob
- grep_content
- grep_file
- list_dir
glob
Find files matching glob patterns like **/*.ts or src/**/*.js. Supports filtering by directory path, exclude patterns, and result limiting.

| Parameter | Description |
|---|---|
| pattern | Glob pattern to match files (e.g., **/*.ts, src/**/*.js) |
| exclude | Patterns to exclude from results (e.g., **/node_modules/**) |
| max_results | Maximum number of files to return |

Use Cases:
- Finding all components of a specific type in a project
- Locating test files across the codebase
- Identifying configuration files scattered across directories

Best Practices:
- Use specific patterns to narrow scope (src/**/*.ts instead of **/*)
- Exclude large directories like node_modules to improve performance
- Set max_results to prevent overwhelming output on large codebases
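For instance, a scoped search for test files might be expressed roughly like this; the parameter names are from the table above, while the object notation and the specific values are assumptions.

```ts
// Illustrative only: pattern, exclude, and max_results are from the table
// above; the object shape and values are assumptions.
const findTestFiles = {
  pattern: "src/**/*.test.ts",      // scoped pattern instead of **/*
  exclude: "**/node_modules/**",    // skip large vendored directories
  max_results: 50,                  // keep output manageable on big repos
};
```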
Execution & Integration
Tools for running commands and coordinating tasks. No limits on concurrent subagents.
- bash
- spawn_subagent
- todo_update
bash
Execute shell commands with configurable timeout, descriptive summaries, and support for command chaining via &&.

| Parameter | Description |
|---|---|
| command | The shell command to execute |
| timeout | Maximum execution time in milliseconds (hard limit: 120000ms / 2 minutes) |
| summary | Human-readable description of what the command does |

Use Cases:
- Running test suites and build processes
- Installing or updating dependencies
- Executing git operations (commit, push, pull)
- Running database migrations or scripts

Limits:
- Maximum timeout: 120 seconds (2 minutes, hard limit)
- Commands exceeding the timeout are automatically terminated

Best Practices:
- Set explicit timeouts appropriate to expected execution time
- Always provide clear summaries so the command’s purpose is obvious
- Chain dependent commands with && to ensure proper sequencing
- Review destructive commands carefully before execution (rm, drop, truncate)
- Commands execute with your user permissions
- Never run commands from untrusted sources
- Use Plan Mode for review when working in shared codebases
- Avoid commands that might expose credentials or sensitive data

Alternatives for long operations:
- Break the work into smaller, sequential commands
- Run in background and check results separately
- Execute manually in your terminal for full control
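To make this concrete, a chained invocation could look roughly like the following; command, timeout, and summary are the documented parameters, while the object notation and the chosen command are assumptions.

```ts
// Illustrative only: command, timeout, and summary are documented above;
// the object shape and the command itself are assumptions.
const installAndTest = {
  command: "npm ci && npm test",        // chain dependent steps with &&
  timeout: 90000,                       // 90s, below the 120000ms hard limit
  summary: "Install dependencies and run the test suite",
};
```

Choosing a timeout below the 2-minute hard limit and matched to the expected runtime makes it obvious when a hang, rather than a merely slow command, is being terminated.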
Web Access
Tools for searching and fetching web content.
- web_search
- web_fetch
web_search
Query internet search engines with control over result count and freshness filtering to find recent information.
| Parameter | Description |
|---|---|
| query | The search query string |
| num_results | How many search results to return |
| freshness_days | Only return results from the last N days |

Use Cases:
- Finding official documentation for unfamiliar APIs or libraries
- Researching specific error messages to find solutions
- Checking current best practices and up-to-date recommendations
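As an illustration, a freshness-filtered query might look like this; query, num_results, and freshness_days are the documented parameters, and the rest is an assumption.

```ts
// Illustrative only: query, num_results, and freshness_days come from the
// table above; the object shape and values are assumptions.
const searchRecentGuidance = {
  query: "Next.js app router data fetching best practices",
  num_results: 5,          // a small, focused result set
  freshness_days: 90,      // only results from roughly the last three months
};
```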
FAQs
How long can bash commands run?
Maximum timeout: 120 seconds (2 minutes). Commands exceeding 2 minutes will be automatically terminated.
Alternatives for long operations:
- Break into smaller commands
- Run in background and check results separately
- Execute manually in your terminal
What's the difference between grep_file and grep_content?
grep_file returns only file paths that contain matches. Use it to quickly identify which files to examine.
grep_content returns the matching lines with optional context. Use it when you need to see the actual code.
Recommended workflow: Start with grep_file to find relevant files, then use file_read or grep_content for details.
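A rough sketch of that two-step workflow follows. Note that grep_file and grep_content parameters are not documented in this reference, so the name used here (pattern) is a placeholder, not confirmed API.

```ts
// Hypothetical sketch of the recommended workflow; the parameter name below
// is a placeholder, since grep_file and grep_content parameters are not
// documented in this reference.
const step1FindFiles = {
  pattern: "createInvoice",   // grep_file: which files mention this symbol?
};

const step2SeeMatches = {
  pattern: "createInvoice",   // grep_content: pull the matching lines with context
};
```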
When should I use file_edit vs file_write?
file_edit is for targeted changes. It replaces specific text while preserving the rest of the file.
file_write is for complete rewrites. It overwrites the entire file with new content.
Use file_edit when possible; it’s safer and preserves file structure.