The $0 productivity upgrade most developers miss
Your Mac and Linux machines come with grep, find, and cat - tools from the 1970s. Modern alternatives run 10-100x faster, output structured JSON for AI workflows, and take 30 minutes to install. Most developers have no idea they exist.

Key takeaways
- Standard Unix tools are from the 1970s - Modern alternatives like ripgrep and fd run 10-100x faster while providing better output and error messages
- JSON output changes everything for AI - Tools like ripgrep and jq output structured data that LLMs can parse reliably, enabling automation that was impossible with text-based tools
- The Rust revolution matters - Tools written in Rust between 2016 and 2024 are dramatically faster, safer, and more reliable than their C predecessors from decades ago
- Free 30-minute install creates lasting gains - One-time setup with Homebrew provides 10x performance improvements and better developer experience permanently
Your developers are using tools from the early 1970s.
They are searching codebases with grep - a tool created in 1973. They are finding files with find - from 1974. They are viewing files with cat - 1971. These work, technically. They have worked for 50 years.
But between 2016 and 2024, the open-source community rewrote all of them. The new versions run 10-100x faster. They output JSON that AI tools can parse. They have better defaults, clearer error messages, and they just work better.
Most developers have no idea these tools exist. The ones who do often don’t bother installing them because “the old tools work fine.”
They don’t work fine. Not for modern AI workflows.
Why 1970s tools break AI automation
Here is what happens when you try to automate something using standard Unix tools and AI.
You want to search your codebase for a pattern, extract some information, and feed it to Claude or ChatGPT for analysis. You write a script:
grep -r "function.*authenticate" . --include="*.js" |
grep -v node_modules |
cut -d: -f1,2
# now try to get Claude to parse this mess
The output looks like this:
src/auth/login.js:23: function authenticateUser(credentials) {
src/auth/oauth.js:45:function authenticate_oauth(token) {
That’s text. Unstructured text. Different formats per line. No metadata. No context. When you hand this to an LLM, it has to guess at the structure. Sometimes it works. Sometimes it hallucinates. Sometimes it just fails.
Now try the modern equivalent with ripgrep:
rg "function.*authenticate" --type js --jsonThe output:
{"type":"match","data":{"path":{"text":"src/auth/login.js"},"lines":{"text":" function authenticateUser(credentials) {"},"line_number":23,"absolute_offset":1250,"submatches":[{"match":{"text":"function authenticateUser"},"start":2,"end":27}]}}That’s structured data. The LLM can parse it reliably. You can extract exactly what you need. You can build automation that actually works.
This difference compounds. Every script, every automation, every AI-assisted workflow. Text parsing fails unpredictably. Structured data works reliably.
The 10 tools that matter
Not all modern tools are worth installing. Some are marginal improvements. Some solve problems nobody has. But ten tools make a genuine difference.
For code search: ripgrep
Ripgrep (rg) replaces grep. It searches 10x faster. It respects .gitignore automatically. It outputs JSON.
The speed difference is real. On a million-line codebase, grep takes 20 seconds. Ripgrep takes 2 seconds. Multiply that by how many times per day your developers search code.
But the JSON output is why it matters for AI. Every automation you build on top of ripgrep just works because the data is structured.
Install: brew install ripgrep
For file finding: fd
The Unix find command has syntax from hell. Finding all JavaScript files that don’t contain “test” in the path looks like this:
find . -name "*.js" -not -path "*/test/*" -type fWith fd:
fd -e js -E test
Five times faster. One-tenth the cognitive load. Your developers will actually use it instead of giving up and searching manually.
Install: brew install fd
For JSON processing: jq
If you work with APIs or configuration files or logs, you work with JSON. The standard Unix way to process JSON is grep and sed. This is like performing surgery with a hammer.
jq is a proper JSON processor. Query it like a database. Transform it reliably. Extract exactly what you need.
curl api.example.com/users | jq '.data[] | {name: .name, active: (.status == "active")}'
This is the difference between automation that works and automation that fails randomly.
Install: brew install jq (recent versions of macOS ship with jq preinstalled, but it’s worth highlighting)
For data tables: miller
You have a CSV file with 100,000 rows. You need to filter it, join it with another file, calculate statistics, and output the results. The Unix way involves awk scripts that nobody can read and everyone fears to modify.
Miller (mlr) processes CSV, JSON, and other formats like a database. Filter, join, aggregate - all with SQL-like syntax.
mlr --csv filter '$age > 30' then stats1 -a mean,sum -f salary data.csv
For any data analysis or reporting automation, this is essential.
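The joins mentioned above are one verb away. A sketch with hypothetical file names, joining on a shared id column:
# Join two CSVs on their "id" column (file and column names are examples)
mlr --csv join -j id -f users.csv orders.csv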
Install: brew install miller
For interactive selection: fzf
Your automation script needs the user to pick from a list. The old way is to output the list and make them type a number or use arrow keys through a custom menu you spent an hour building.
fzf is a fuzzy finder. Pipe any list into it and the user can search and select interactively. It works with anything - files, directories, git branches, process lists.
# Let user pick a file
vim $(fzf)
# Let user pick a git branch
git checkout $(git branch | fzf)
This transforms clunky scripts into polished tools.
Install: brew install fzf
For syntax highlighting: bat
The cat command dumps file contents. No syntax highlighting. No line numbers. Just text.
bat adds syntax highlighting, git integration, line numbers, and automatic paging. It makes reviewing code files in the terminal actually pleasant.
When you are building automation that needs to show code to users, bat makes it readable.
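A couple of everyday invocations, using bat’s documented flags:
bat src/auth/login.js     # highlighting, line numbers, git markers, automatic paging
git diff | bat -l diff    # highlight any piped text by naming its language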
Install: brew install bat
For better git diffs: delta
Git diffs are hard to read. Lines of red and green. No syntax highlighting. Easy to miss important changes.
delta makes git diffs beautiful. Syntax highlighting, better formatting, side-by-side view. Your developers will actually review changes properly instead of skimming.
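delta reads its options from gitconfig, alongside the pager setting shown later in this article. One documented option worth knowing:
# Optional: two-column diffs
git config --global delta.side-by-side true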
Install: brew install git-delta
For smart navigation: zoxide
Your developers type cd hundreds of times per day. Usually to directories they visit constantly.
zoxide learns which directories they use most and lets them jump there with partial matches. z api jumps to the api-v2 directory they use every day. z cont jumps to the controllers folder.
This saves seconds each time. Across a team, across a year, it adds up to hours.
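What that looks like in practice - the directory names here are just examples:
z api     # jumps to the highest-ranked match, e.g. ~/work/api-v2
z cont    # jumps to that deeply nested controllers folder
zi api    # interactive pick (via fzf) when several directories match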
Install: brew install zoxide
For better prompts: starship
The default terminal prompt shows the current directory. Maybe the git branch if you have it configured.
starship shows context - git branch, current language version, whether the last command failed, how long it took to run. All automatically.
This reduces cognitive load. Your developers don’t need to run separate commands to check this information. It’s just there.
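It works with zero configuration, but everything is adjustable through ~/.config/starship.toml. A small sketch using two documented options:
# ~/.config/starship.toml
[cmd_duration]
min_time = 500         # show duration for commands that take over 500 ms

[git_branch]
symbol = "branch "     # plain-text prefix if you avoid icon fonts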
Install: brew install starship
For quick command help: tldr
Man pages are comprehensive and unreadable. When your developer needs to remember how to use tar, they don’t need 2000 lines of documentation. They need three examples.
tldr provides simplified, example-focused help pages. No theory. Just “here is how you do the common things.”
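For example:
tldr tar    # a handful of copy-paste examples: create, extract, list
man tar     # the thousand-line reference, for when you actually need it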
Install: brew install tldr
Why this matters for AI work specifically
Every organization building with AI hits the same problem. The tools from the 1970s assume text output. AI tools work better with structured data.
GitHub’s research on developer productivity shows AI coding assistants work best when they can understand code context precisely. Tools that output JSON make that possible. Tools that output unstructured text force the AI to guess.
When ripgrep outputs search results as JSON, your AI can know exactly which file, which line, what matched, what the surrounding context is. When grep outputs text, the AI has to parse it heuristically and hope it got it right.
This shows up everywhere:
- Analyzing error logs (structured JSON vs messy text)
- Processing API responses (jq vs grep)
- Building automation (reliable parsing vs fragile text munging)
- Generating reports (miller’s SQL-like operations vs complex awk scripts)
The developers who have modern tools build automation that works. The developers without them build automation that mostly works and fails mysteriously sometimes.
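To make the error-log case above concrete: with JSON-lines logs, a summary that would otherwise be a fragile awk script is one pipeline. The field names here (level, message) are assumptions about your log format:
# Count distinct error messages, most frequent first
jq -r 'select(.level == "error") | .message' app.log | sort | uniq -c | sort -rn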
The Rust revolution
Most of these tools - ripgrep, fd, bat, delta, zoxide, starship - are written in Rust. This is not a coincidence.
Rust is a systems programming language that came out of Mozilla in 2010. It’s as fast as C but memory-safe by design. Between 2016 and 2024, developers rewrote huge swaths of the Unix toolkit in Rust.
The results are faster, more reliable, and have better error messages. When grep hits something unexpected, you get a terse “Binary file matches.” When ripgrep hits a problem, you get a clear explanation of what went wrong and how to fix it.
This matters for developer productivity. Better error messages mean less time debugging tools and more time doing actual work.
What to actually do
If you manage developers, tell them to install these tools. It takes 30 minutes. The one-line install on Mac:
brew install ripgrep fd fzf bat jq git-delta zoxide starship tldr miller
On Linux, the same tools are available through apt, dnf, or whatever package manager the distro uses, though a few package names differ - see the sketch below.
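For example, a Debian/Ubuntu sketch - note the renamed packages, and that starship (and on older releases, delta) may need its upstream installer:
sudo apt install ripgrep fd-find fzf bat jq zoxide miller tldr
# Debian and Ubuntu install fd's binary as "fdfind" and bat's as "batcat";
# most people alias them back to fd and bat in their shell config.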
Then configure the shell integrations:
# Add to ~/.zshrc (bash users: swap zsh for bash in each line)
eval "$(fzf --zsh)"
eval "$(zoxide init zsh)"
eval "$(starship init zsh)"Configure git to use delta:
git config --global core.pager delta
That’s it. The tools are now available. Your developers can use them immediately.
The bigger challenge is getting them to actually use the new tools instead of the old ones. This is habit change. Some developers will switch immediately. Others need to see the benefit first.
The way to do this is to show, not tell. When someone asks “how do I find all the places we call this function,” show them rg instead of grep. When someone needs to filter a CSV, show them mlr instead of awk.
Over time, the team shifts to the better tools. Not because they were mandated, but because they’re obviously better once you try them.
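One low-friction nudge along the way - a common convention, not something the tools require - is a shell alias or two, so the new tool answers to the old habit:
# Optional, in ~/.zshrc or ~/.bashrc
alias cat='bat --paging=never'    # bat falls back to plain output when piped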
The larger pattern
This is not really about command-line tools. It’s about keeping your developers’ environment current.
The Unix tools from the 1970s were revolutionary for their time. But it’s 2024. We have better options now. Tools that are faster, easier to use, and work better with AI.
The developers who adopt modern tools get more done. They build more reliable automation. They work faster. They are less frustrated.
The developers who stick with old tools might not notice they’re being slowed down. The friction is subtle. A few extra seconds here. A failed script there. Automation that works 90% of the time instead of 99%.
But it compounds. Give a developer modern tools and they build things that weren’t possible before. Not because the old tools couldn’t technically do it, but because the new tools make it easy enough that they actually do it.
This shows up in unexpected places. Your developer has a manual task that takes 10 minutes. With old tools, scripting it takes an hour and has edge cases. They just do it manually forever. With modern tools, scripting it takes 10 minutes and works reliably. They script it and save that time forever.
Multiply that across your development team. Multiply it by the number of tasks they handle. The difference is substantial.
The tools are free. The installation takes half an hour. The return on investment starts immediately and lasts as long as those developers are on your team.
Most companies don’t know these tools exist. The ones that do often don’t bother because “the old tools work.” But when you’re building with AI, the structured output and reliable parsing make the difference between automation that works and automation that fails unpredictably.
Your competitors will figure this out eventually. Better if you figure it out first.
About the Author
Amit Kothari is an experienced consultant, advisor, and educator specializing in AI and operations. With 25+ years of experience and as the founder of Tallyfy (raised $3.6m), he helps mid-size companies identify, plan, and implement practical AI solutions that actually work. Originally British and now based in St. Louis, MO, Amit combines deep technical expertise with real-world business understanding.
Disclaimer: The content in this article represents personal opinions based on extensive research and practical experience. While every effort has been made to ensure accuracy through data analysis and source verification, this should not be considered professional advice. Always consult with qualified professionals for decisions specific to your situation.