the_information_nexus/tech_docs/primitives.md
2025-08-03 05:00:37 -05:00


You're absolutely right that tools built on solid, fundamental primitives tend to outlast fleeting "modern" alternatives. Here's a reasonable list of primitives (and associated tools) that have stood the test of time, categorized by domain:


1. Text Processing (The Unix Philosophy)

  • Core Primitives: Lines, streams, regular expressions, fields/delimiters.
  • Tools to Master:
    • `grep`: Filtering by pattern.
    • `sed`: Stream editing (substitutions, deletions).
    • `awk`: Field-based text processing (awk is almost a language itself).
    • `cut` / `paste` / `join`: Column and table operations.
    • `sort` / `uniq`: Sorting and deduplication.
    • `tr`: Character-level transformations.
    • `xargs`: Constructing commands from input.

Why? These tools compose endlessly and handle 90% of text wrangling tasks.
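As a small illustration of that composition (the sample words are invented for the demo):

```shell
# Rank words by frequency by chaining single-purpose tools.
# printf stands in for any real text source (a log file, curl output, etc.).
printf 'apple\nbanana\napple\ncherry\napple\n' |
  sort |                 # group identical lines adjacently
  uniq -c |              # collapse each run, prefixed with its count
  sort -rn |             # highest count first
  awk '{print $2, $1}'   # swap columns: word, then count
```

Each stage reads lines on stdin and writes lines on stdout, which is exactly what makes the pipeline composable.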


2. System & Process Control

  • Core Primitives: Files, processes, signals, permissions, pipes.
  • Tools to Master:
    • `find`: File system search with powerful predicates.
    • `ps` / `top` / `htop`: Process inspection.
    • `kill` / `pkill` / `killall`: Process management.
    • `cron` / `systemd` (timers): Scheduling.
    • `lsof`: Listing open files and sockets.
    • `strace` / `perf`: Debugging and profiling.

Why? Understanding processes and system state is forever relevant.
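For example, `find`'s predicates compose left to right, and actions like `-delete` are just another predicate. A sketch in a scratch directory, so it is safe to run:

```shell
# find: filter a tree with predicates, then act on the matches.
dir=$(mktemp -d)                       # scratch directory for the demo
touch "$dir/keep.log" "$dir/junk.tmp"
# -type f restricts to regular files, -name filters by glob,
# and -delete removes whatever survived the earlier predicates.
find "$dir" -type f -name '*.tmp' -delete
ls "$dir"                              # only keep.log remains
rm -r "$dir"
```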


3. Networking

  • Core Primitives: Sockets, ports, packets, DNS, HTTP.
  • Tools to Master:
    • `curl` / `wget`: HTTP/HTTPS interactions.
    • `ssh` / `scp` / `rsync`: Remote access and file transfer.
    • `netstat` / `ss`: Socket inspection.
    • `nc` (netcat): Arbitrary TCP/UDP connections.
    • `tcpdump` / Wireshark: Packet inspection.
    • `dig` / `nslookup`: DNS debugging.
    • `iptables` / `nftables`: Firewalling (still foundational despite newer abstractions).

Why? Networking is universal, and these tools expose the underlying reality.
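For instance, `curl` speaks several protocols, which makes it easy to poke at even without a network; `file://` works offline (a sketch assuming curl is installed):

```shell
# curl fetches a URL and writes the body to stdout.
# -s hides the progress meter; -w prints metadata after the transfer.
f=$(mktemp)
printf 'payload\n' > "$f"
curl -s "file://$f"                               # prints: payload
curl -s -o /dev/null -w '%{size_download}\n' "file://$f"
rm "$f"
```

The same `-s`, `-o`, and `-w` flags behave identically over HTTP, which is why experiments like this transfer directly to real API debugging.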


4. Files & Storage

  • Core Primitives: Inodes, blocks, mounts, symlinks, permissions.
  • Tools to Master:
    • `dd`: Low-level file/device operations.
    • `mount` / `umount` / `df` / `du`: Disk management.
    • `tar` / `gzip` / `xz`: Archiving and compression.
    • `ln`: Hard and symbolic links.
    • `chmod` / `chown` / `setfacl`: Permissions.
    • `rsync`: Efficient file synchronization.

Why? Filesystems don't change as fast as abstractions built atop them.
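The hard/soft link distinction is a direct window onto inodes. A hard link is a second name for the same inode; a symlink is a separate file that stores a path:

```shell
# Hard links share an inode; symlinks point by name and can dangle.
dir=$(mktemp -d) && cd "$dir"
printf 'data\n' > original
ln original hard         # second name for the same inode
ln -s original soft      # new file whose content is the path "original"
rm original
cat hard                 # still prints "data": the inode outlives the name
[ -e soft ] || echo 'soft is now dangling'
cd / && rm -r "$dir"
```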


5. Version Control (The Primitive: Diffs & Patches)

  • Tool to Master: git (but understand its internals: objects, refs, the index).
  • Bonus: Learn diff / patch for low-level changes.

Why? Git won. Knowing it deeply pays dividends.
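Those internals are easy to see directly: git's object store is content-addressed, so the plumbing commands can hash and retrieve raw objects without any porcelain in the way:

```shell
# Every git object id is a hash of the object's content.
repo=$(mktemp -d) && cd "$repo"
git init -q .
id=$(printf 'hello\n' | git hash-object -w --stdin)  # -w stores the blob
git cat-file -t "$id"    # prints the object type: blob
git cat-file -p "$id"    # prints the content back: hello
cd / && rm -rf "$repo"
```

Commits, trees, and tags are just more objects in the same store, and refs are names pointing at them.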


6. Shell Scripting

  • Core Primitives: Variables, loops, conditionals, subshells, exit codes.
  • Tools to Master:
    • `bash` (or `zsh`): The dominant shells.
    • `jq`: JSON processing (for modern APIs).
    • `make`: Dependency-based workflows (not just for C!).

Why? Automation is eternal, and shell is the lingua franca.
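`jq` fits the same mold as `awk`: a small filter language applied to a stream, with exit codes that compose with shell conditionals (the JSON below is invented for the demo):

```shell
# jq filters JSON the way awk filters columns.
json='{"name":"nexus","tags":["docs","unix"],"size":8}'
echo "$json" | jq -r '.name'         # -r prints raw strings: nexus
echo "$json" | jq '.tags | length'   # prints: 2
# -e sets the exit status from the result, so jq works in conditionals:
echo "$json" | jq -e '.size > 5' > /dev/null && echo 'big enough'
```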


7. Performance & Debugging

  • Core Primitives: CPU, memory, I/O, latency.
  • Tools to Master:
    • `perf` / `strace` / `ltrace`: Profiling and tracing.
    • `vmstat` / `iostat` / `sar`: System metrics.
    • `gdb`: Debugging binaries.
    • `time`: Measuring command runtime.

Why? Performance bottlenecks always resurface in new forms.
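Even the humble `time` rewards a closer look. As a bash keyword it can time any pipeline, and `TIMEFORMAT` controls what it reports (a bash-specific sketch; the external `/usr/bin/time` behaves differently):

```shell
# time (the bash keyword) reports real, user, and system time on stderr.
# TIMEFORMAT='%R' narrows the report to wall-clock seconds.
LC_ALL=C            # force a period as the decimal separator
TIMEFORMAT='%R'
elapsed=$( { time sleep 0.2; } 2>&1 )
echo "$elapsed"     # roughly 0.2, plus scheduling overhead
```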


What to Master vs. Know?

  • Master: `grep`, `awk`, `sed`, `find`, `git`, `ssh`, `iptables`, `jq`, `curl`, `bash`.
  • Know Well Enough: `strace`, `tcpdump`, `perf`, `dd`, `make`.

Avoid Overinvesting In:

  • Fad tools that abstract too much (e.g., overly "declarative" systems that hide the underlying primitives).
  • GUI-only tools (unless they're irreplaceable, like Wireshark).

Key Insight:

The tools that last are the ones that expose underlying system realities rather than abstract them away. Learn the primitives they're built on (text streams, processes, files, networks), and you'll adapt to any "modern" tool that comes and goes.

