
Why I Still Love CLI Tools in 2026 (And You Should Too)

📖 7 min read • 1,311 words • Updated Mar 28, 2026

How a “Quick Script” Turned Into an Obsession

A few years ago, I wrote a tiny shell script to rename some log files. It took maybe 10 minutes. That script is still in my dotfiles, and according to my shell history I’ve used it 437 times since 2021.

That was the moment I realized: the command line isn’t just a “power user” thing. It’s a multiplier. Every tiny improvement you make gets reused hundreds (or thousands) of times, quietly paying you back.

I’m the kind of person who keeps comparison spreadsheets for everything. Terminals. Git tools. JSON processors. If it runs in a shell, I’ve probably installed it, broken it, and kept notes.

So let’s talk CLI tools. Not as some old-school rite of passage, but as a very practical way to get more done with less noise. If you’ve been CLI-curious or stuck with just git and npm, this is for you.

Why the CLI Still Matters in 2026

We’re drowning in graphical tools. Browser UIs for everything, Electron apps eating RAM like it’s a sport. And yet, I keep ending up back in a terminal window.

Here’s why the CLI still hits different:

  • Speed: Once muscle memory kicks in, typing commands is simply faster than clicking through nested menus. I can switch git branches, run tests, and open logs without touching the mouse.
  • Composability: Tools are simple, but you can chain them. psql into a database, pipe results to jq, feed that into a script. Suddenly you’ve built a tiny one-off “app” in 30 seconds.
  • Automation: Anything you type today can become tomorrow’s script or alias. That’s how you create those “10 minutes today, 437 uses later” moments.
  • Truth over visuals: Logs, processes, ports, containers—CLI tools usually show you what’s actually happening, not a curated dashboard that hides details.

And yes, the CLI has a learning curve. But once you get a small set of commands under your fingers, the payoff compounds quickly. You don’t need to become a shell wizard; you just need a few well-chosen tools.
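That composability point is easy to see with nothing but stock Unix tools. Here's a minimal sketch (the access.log file and its format are invented for illustration):

```shell
# Build a tiny sample "access log" (hypothetical format: METHOD PATH STATUS).
printf '%s\n' \
  'GET /api/v2/orders 200' \
  'GET /api/v2/orders 500' \
  'POST /api/v2/users 201' \
  'GET /api/v2/orders 200' > access.log

# Which endpoints get hit the most? Chain four small tools:
# extract the path column, sort it, count duplicates, sort by count.
awk '{print $2}' access.log | sort | uniq -c | sort -rn
```

Note the `sort` before `uniq -c`: `uniq` only collapses adjacent duplicates, so the sort is load-bearing. That's the whole philosophy in one pipeline.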

CLI Tools That Actually Earn Their Keep

Let me run through a few command line tools that have survived multiple rounds of purges in my setup. I track installs and usage in a spreadsheet (I told you I was that person); these are the ones that keep showing up.

1. ripgrep (rg): Search that doesn’t make you wait

I uninstalled grep from my muscle memory the day I installed ripgrep. It’s fast, respects .gitignore, and the defaults are sane.

For example, in one monorepo (~250k lines, 2023 numbers from my notes), a search for user_id took:

  • grep -R "user_id" .: ~5.3 seconds
  • rg "user_id": ~0.7 seconds

Doesn’t sound huge, but multiply that by “how often do you search in your codebase per day?” and it adds up quickly.
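Beyond raw speed, a handful of flags cover most of my daily searches. A quick sandboxed demo (the file contents are made up for illustration; the flags themselves are standard rg options):

```shell
# Bail out politely if ripgrep isn't installed.
command -v rg >/dev/null || { echo "ripgrep not installed, skipping demo"; exit 0; }

# A throwaway sandbox with one Python file and one JS file.
mkdir -p rg-demo
printf 'user_id = 1\n# TODO: validate user_id\n' > rg-demo/models.py
printf 'const userId = 1;\n' > rg-demo/app.js

rg "user_id" rg-demo            # plain search (also honors .gitignore)
rg -t py "user_id" rg-demo      # restrict to Python files
rg -l "user_id" rg-demo         # only list matching files, not lines
rg -w "user_id" rg-demo         # whole-word matches only
rg "user_id" rg-demo -g '!*.js' # exclude files by glob
```

`-t` and `-g` are the two I use constantly: language filters and glob excludes replace a whole pile of `--include`/`--exclude` flags from classic grep.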

2. fzf: Fuzzy finding your entire life

fzf is an interactive fuzzy finder. Sounds boring. It is not. It basically turns lists into searchable interfaces on the fly.

I use it to:

  • Jump between git branches
  • Open recent files in my editor
  • Pick Docker containers to shell into

Example: I wired a shortcut so that typing cproj brings up a fuzzy list of my last 50 projects, then drops me into the selected directory. No more digging through folders.
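I won't reproduce my exact setup, but a minimal sketch of that kind of shortcut looks like this, assuming your projects live under ~/projects and that fd and fzf are installed (the cproj and gco names, the directory layout, and the missing "most recent 50" bookkeeping are all simplifications):

```shell
# Fuzzy-pick a project directory and cd into it.
cproj() {
  local dir
  dir=$(fd --type d --max-depth 1 . ~/projects | fzf) || return
  cd "$dir" || return
}

# Same idea for git branches: fuzzy-pick one, then check it out.
gco() {
  local branch
  branch=$(git branch --format='%(refname:short)' | fzf) || return
  git checkout "$branch"
}
```

Drop these in your .bashrc or .zshrc. The pattern is always the same: generate a list, pipe it to fzf, act on the selection.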

3. jq: JSON without tears

If you touch APIs, logs, or anything that spits out JSON, jq is non‑negotiable. It lets you filter, transform, and reshape JSON from the command line.

Example I actually ran last week: I had a log file with 18,000 JSON lines and I needed the distinct user IDs who hit a particular endpoint.

One-liner:

jq -r 'select(.path == "/api/v2/orders") | .user_id' logs.jsonl | sort -u | wc -l

Result: 73 unique users. Took 0.4 seconds. No Python script. No temp files. Just gluing together small tools.
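If you need more than a count, jq's group_by gets you per-user stats with one extra step. A sketch against a tiny stand-in for that log file (the paths and IDs here are invented):

```shell
# Bail out politely if jq isn't installed.
command -v jq >/dev/null || { echo "jq not installed, skipping demo"; exit 0; }

# A four-line stand-in for the 18,000-line log above.
printf '%s\n' \
  '{"path": "/api/v2/orders", "user_id": 17}' \
  '{"path": "/api/v2/orders", "user_id": 42}' \
  '{"path": "/healthz",       "user_id": 17}' \
  '{"path": "/api/v2/orders", "user_id": 17}' > sample.jsonl

# -s (slurp) gathers the JSON lines into one array so group_by can see them all.
jq -s '
  map(select(.path == "/api/v2/orders"))
  | group_by(.user_id)
  | map({user: .[0].user_id, hits: length})
' sample.jsonl
```

One filter goes from "how many users" to "how many hits per user". That incremental refinement is where jq beats reaching for a scripting language.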

4. fd: Finding files without headaches

fd is a nicer find. That’s it. That’s the pitch.

Instead of:

find . -name "*.test.js"

you type:

fd ".test.js"

and it Just Worksℱ: respects .gitignore, has better defaults, and doesn’t make you remember weird flags.
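One caveat worth knowing: fd's pattern is a regular expression, so the quick version above technically lets the dots match any character. A stricter version anchors and escapes them. A short sandboxed tour (file names invented for illustration):

```shell
# Bail out politely if fd isn't installed.
command -v fd >/dev/null || { echo "fd not installed, skipping demo"; exit 0; }

mkdir -p fd-demo/src
touch fd-demo/src/app.test.js fd-demo/src/app.js fd-demo/notes.md

fd '\.test\.js$' fd-demo   # anchored regex: only real .test.js files
fd -t f . fd-demo          # all files (not dirs) under fd-demo
fd -e md . fd-demo         # match by extension instead of regex
```

In practice the loose pattern is almost always fine; the anchored one is there for when "almost always" isn't good enough.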

Building Tiny Workflows That Save You Hours

The real power isn’t any single CLI tool. It’s how you combine them into little workflows that quietly save you hours every month.

Two concrete examples from my own setup:

Workflow 1: One command to inspect an issue

I got tired of doing the same dance when looking at a bug:

  • Check out the branch
  • Pull latest changes
  • Start the dev server
  • Tail logs

Now I have a script called bug that does this:

  • Uses fzf to pick a branch
  • Runs git checkout and git pull
  • Starts docker compose up in the background
  • Uses rg to search for the ticket ID in the code and opens the first match in my editor

It’s not fancy. It’s maybe 25 lines of bash. But it turns a 2‑3 minute ritual into a ~10 second interaction, and I use it multiple times per day.
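My actual script is tuned to one codebase, but a hypothetical reconstruction of the steps above might look like this (the branch-naming convention with embedded ticket IDs, plus fzf, rg, and docker compose, are all assumptions):

```shell
# Sketch of the "bug" helper, written as a function for your .bashrc.
bug() {
  local branch ticket file

  # 1. Fuzzy-pick a branch, then sync it.
  branch=$(git branch --format='%(refname:short)' | fzf) || return
  git checkout "$branch" && git pull

  # 2. Bring the dev environment up in the background.
  docker compose up -d

  # 3. Pull a ticket ID (e.g. TICKET-123) out of the branch name and
  #    open the first file that mentions it.
  ticket=$(printf '%s\n' "$branch" | grep -oE '[A-Z]+-[0-9]+' | head -n1)
  if [ -n "$ticket" ]; then
    file=$(rg -l "$ticket" | head -n1)
    [ -n "$file" ] && "${EDITOR:-vi}" "$file"
  fi
}
```

Nothing here is clever on its own; the value is that four manual steps collapse into one muscle-memory command.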

Workflow 2: Quick API inspector for debugging

I used to open Postman, click around, then copy paste JSON into a formatter just to inspect an endpoint. Now:

api GET /api/v2/orders?limit=20

My api script wraps curl, adds auth headers, and pipes the response through jq for pretty printing. It also logs the call to a ~/.api-history file so I can repeat it later.
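The real script has more error handling, but a minimal sketch of such a wrapper could look like this (API_BASE, API_TOKEN, and the history format are placeholder assumptions, not my exact setup):

```shell
# api METHOD PATH [extra curl args...]
api() {
  local method=$1 path=$2
  shift 2

  # Log the call so it can be replayed later.
  printf '%s %s %s\n' "$(date -u +%FT%TZ)" "$method" "$path" >> ~/.api-history

  curl -sS -X "$method" \
    -H "Authorization: Bearer ${API_TOKEN:-}" \
    -H "Accept: application/json" \
    "$@" \
    "${API_BASE:-http://localhost:3000}$path" | jq .
}
```

Called as `api GET /api/v2/orders?limit=20`, it handles auth, pretty-printing, and history in one go. The `"$@"` passthrough means any extra curl flag (say, `-d` for a body) still works.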

According to my shell history, I used that script 196 times in October 2025 alone. That’s not just convenience; that’s a whole GUI tool removed from my workflow.

Getting Started Without Overwhelming Yourself

If you’re new to CLI tools (or you’ve only used the basics), don’t install 20 things and try to change your whole setup in a weekend. That’s how you end up with a messy shell and no habits.

Here’s how I’d start if I were you:

  • Step 1: Make your terminal nice to use
    Install a good terminal (WezTerm, Alacritty, iTerm2, or the new Windows Terminal), pick a readable font, and enable mouse wheel scroll. You’ll use it more if it doesn’t feel like a punishment.
  • Step 2: Add 2–3 “win” tools
    Install rg, fzf, and jq. These give you fast search, fuzzy navigation, and JSON superpowers. Use them for a week before adding more.
  • Step 3: Turn repeated commands into aliases
    Every time you type a long command twice in one day, consider making an alias or tiny script. That’s how your personal toolkit grows naturally.
  • Step 4: Write your first 10‑line script
    Take something boring you do—like booting your dev environment—and script it. Doesn’t have to be pretty. It just has to work.
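To make Steps 3 and 4 concrete, here's what a first batch of aliases and a tiny boot function might look like (the names, paths, and services are placeholders; swap in your own stack):

```shell
# Step 3: aliases for commands you type every day.
alias gs='git status'
alias gl='git log --oneline -15'
alias dcu='docker compose up -d'

# Step 4: a first tiny "boot my dev environment" script, as a function
# (hypothetical project layout and commands).
dev_up() {
  cd ~/projects/my-app || return
  docker compose up -d
  npm run dev
}
```

That's the whole bar to clear: not elegant, just one command instead of four.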

The goal isn’t to become “the terminal person” on your team. The goal is to quietly shave friction off your day until things feel noticeably smoother.

FAQ

Do I need to switch to Linux or macOS to use CLI tools?

No. Windows has a solid story now with Windows Terminal, WSL2, and PowerShell. Most of the tools I mentioned—rg, fzf, jq, fd—work on all three major platforms. If you’re on Windows, I’d recommend trying WSL2 with Ubuntu as a starting point.

Is it worth learning the CLI if my team uses GUI tools for everything?

Yes, because this is about your personal speed, not group consensus. You can keep using the same tools as your team while quietly improving your own workflow. No one needs to sign off on you using rg instead of your IDE’s search.

How do I remember all the commands?

You don’t. Use your shell history (Ctrl+R reverse search), keep a small ~/notes/cli.md file with your favorite commands, and turn your top 10 long commands into aliases. Over time, the ones you actually need will stick. The rest can stay in your notes.

Written by Jake Chen

Software reviewer and AI tool expert. Independently tests and benchmarks AI products. No sponsored reviews — ever.
