# Working with `jq` on Debian

## Introduction to `jq`

`jq` is a powerful command-line tool used for parsing, filtering, transforming, and analyzing JSON data. It allows you to manipulate JSON in a similar way to how `sed`, `awk`, and `grep` handle text files. This guide will walk you through installing `jq`, basic usage, practical examples, and common use cases.

## Installation

To install `jq` on a Debian-based system, use the following commands:

```sh
sudo apt-get update
sudo apt-get install jq
```

## JSON Examples for Practice

Here are some sample JSON data structures to practice with:

### Example 1: Simple JSON Object

```json
{
  "name": "John Doe",
  "age": 30,
  "city": "New York"
}
```

### Example 2: JSON Array

```json
[
  {
    "name": "John Doe",
    "age": 30,
    "city": "New York"
  },
  {
    "name": "Jane Smith",
    "age": 25,
    "city": "Los Angeles"
  },
  {
    "name": "Sam Brown",
    "age": 20,
    "city": "Chicago"
  }
]
```

### Example 3: Nested JSON Object

```json
{
  "id": 1,
  "name": "Product Name",
  "price": 29.99,
  "tags": ["electronics", "gadget"],
  "stock": {
    "warehouse": 100,
    "retail": 50
  }
}
```

## Basic `jq` Commands

### Parsing and Pretty-Printing JSON

To pretty-print JSON, you can use the `.` filter:

```sh
cat example1.json | jq .
```

### Extracting a Value

To extract a specific value from a JSON object:

```sh
cat example1.json | jq '.name'
```

For a JSON array, you can extract a specific element by index:

```sh
cat example2.json | jq '.[0].name'
```

### Filtering JSON Arrays

To filter an array based on a condition:

```sh
cat example2.json | jq '.[] | select(.age > 25)'
```

### Modifying JSON

To modify a JSON object and add a new field:

```sh
cat example1.json | jq '. + {"country": "USA"}'
```

### Combining Filters

You can combine multiple filters to achieve more complex queries:

```sh
cat example3.json | jq '.stock | {total_stock: (.warehouse + .retail)}'
```

## Practical Exercises

### Exercise 1: Extract the Age of "Jane Smith"

```sh
cat example2.json | jq '.[] | select(.name == "Jane Smith") | .age'
```

### Exercise 2: List All Names

```sh
cat example2.json | jq '.[].name'
```

### Exercise 3: Calculate Total Stock

```sh
cat example3.json | jq '.stock | .warehouse + .retail'
```

### Exercise 4: Add a New Tag "sale" to the Tags Array

```sh
cat example3.json | jq '.tags += ["sale"]'
```

## Common Uses of `jq`

### Parsing API Responses

When interacting with web APIs, the responses are often in JSON format. `jq` allows you to parse, filter, and extract the necessary data from these responses.

```sh
curl -s https://api.example.com/data | jq '.items[] | {name: .name, id: .id}'
```

### Processing Configuration Files

Many modern applications use JSON for configuration. With `jq`, you can easily modify or extract values from these files.

```sh
jq '.settings.debug = true' config.json > new_config.json
```

### Log Analysis

If your logs are in JSON format, you can use `jq` to search for specific entries, aggregate data, or transform the logs into a more readable format.

```sh
cat logs.json | jq '.[] | select(.level == "error") | {timestamp: .timestamp, message: .message}'
```

### Data Transformation

Transforming JSON data into different structures or formats is straightforward with `jq`. This is useful for data pipelines or ETL (Extract, Transform, Load) processes.

```sh
cat data.json | jq '[.items[] | {name: .name, value: .value}]'
```
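As a further illustration, `group_by` can aggregate while it reshapes. This is only a minimal sketch: it assumes the items in `data.json` also carry a `category` field, which is not part of the example above.

```sh
# Group items by category and count how many fall into each group
cat data.json | jq '[.items[]] | group_by(.category) | map({category: .[0].category, count: length})'
```

Each element of the result is one group, carrying the shared `category` and the number of items in it.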
### Scripting and Automation

In shell scripts, `jq` can be used to parse and manipulate JSON data as part of automation tasks.

```sh
# Extracting a value from JSON in a script
response=$(curl -s https://api.example.com/data)
# Quote the variable so the JSON reaches jq intact, without word splitting
id=$(echo "$response" | jq -r '.items[0].id')
echo "The ID is: $id"
```

### Testing and Debugging

When developing applications that produce or consume JSON, `jq` helps in quickly inspecting the JSON output for correctness.

```sh
cat response.json | jq '.'
```

## Practical Scenarios

### Working with Kubernetes

Kubernetes uses JSON and YAML extensively. You can use `jq` to filter and extract information from the JSON output of `kubectl` commands.

```sh
kubectl get pods -o json | jq '.items[] | {name: .metadata.name, status: .status.phase}'
```

### CI/CD Pipelines

In continuous integration and deployment workflows, `jq` can parse and transform JSON data used in configuration files, reports, or environment variables.

```sh
# GITHUB_EVENT_PATH holds the path to the event payload file, so pass that file to jq
jq '.commits[] | {message: .message, author: .author.name}' "$GITHUB_EVENT_PATH"
```

### Web Development

When dealing with front-end and back-end integration, `jq` helps in simulating API responses or transforming data formats.

```sh
cat mock_response.json | jq '.users[] | {username: .login, email: .email}'
```

### Data Analysis

For quick analysis of JSON data files, `jq` provides a powerful way to query and aggregate data.

```sh
cat data.json | jq '[.records[] | select(.active == true) | .value] | add'
```

### DevOps and Infrastructure Management

Tools like Terraform and the AWS CLI produce JSON output, and `jq` is perfect for extracting and processing this information.

```sh
aws ec2 describe-instances | jq '.Reservations[].Instances[] | {instanceId: .InstanceId, state: .State.Name}'
```

## Conclusion

`jq` is a versatile tool that can be integrated into various workflows to handle JSON data efficiently. Whether you're working with APIs, configuration files, logs, or automation scripts, `jq` helps you parse, filter, and transform JSON data with ease.

Feel free to modify these examples and try different commands. `jq` has a comprehensive manual that you can refer to for more advanced features:

```sh
man jq
```

Happy learning!

---

# Learning `jq` for Command-Line JSON Processing

`jq` is a powerful command-line JSON processor that lets you parse, filter, and transform JSON data. Here's a quick reference to get you started:

## Installation

Most Linux distributions, macOS, and Windows can install it via a package manager:

```bash
# Ubuntu/Debian
sudo apt install jq

# CentOS/RHEL
sudo yum install jq

# macOS (using Homebrew)
brew install jq

# Windows (via Chocolatey)
choco install jq
```

## Basic Usage

```bash
# Basic pretty-printing
jq '.' file.json

# Read from stdin
curl -s https://api.example.com/data | jq '.'
```

## Selecting Data

```bash
# Get a specific field
jq '.field' file.json

# Get nested fields
jq '.parent.child.grandchild' file.json

# Get array elements
jq '.array[0]' file.json    # First element
jq '.array[-1]' file.json   # Last element
jq '.array[2:5]' file.json  # Slice: elements at indices 2, 3, and 4 (end index is exclusive)
```

## Common Operations

```bash
# Get multiple fields
jq '{name: .name, age: .age}' file.json

# Filter arrays
jq '.users[] | select(.age > 30)' file.json

# Map operations
jq '.numbers[] | . * 2' file.json

# Sort
jq '.users | sort_by(.age)' file.json

# Length/count
jq '.array | length' file.json
```
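These filters compose with the pipe, much like shell commands. As a small sketch (still assuming the hypothetical `file.json` with a `users` array used above), chaining `sort_by`, indexing, and field access pulls out the name of the youngest user:

```bash
# Sort users by age, take the first (youngest), and print the name without quotes
jq -r '.users | sort_by(.age) | .[0].name' file.json
```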
## Advanced Features

```bash
# String interpolation
jq '"Hello, \(.name)!"' file.json

# Conditional logic
jq 'if .age > 18 then "Adult" else "Minor" end' file.json

# Variables
jq '. as $item | $item.name' file.json

# Custom functions
jq 'def add(x; y): x + y; add(5; 10)' <<< '{}'
```

## Practical Examples

```bash
# Extract all email addresses from JSON
jq -r '.users[].email' file.json

# Convert JSON to CSV
jq -r '.users[] | [.name, .email, .age] | @csv' file.json

# Sum all values in an array
jq '[.numbers[]] | add' file.json

# Find unique values
jq '.tags[]' file.json | sort | uniq

# Modify JSON structure
jq '{user: .name, contact: {email: .email, phone: .tel}}' file.json
```

## Tips

1. Use `-r` for raw output (no quotes around strings)
2. Combine with `curl` for API responses: `curl -s ... | jq ...`
3. Use `//` for default values: `jq '.name // "Anonymous"'`
4. For large files, use `--stream` for iterative parsing (see the sketch below)
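In streaming mode, `jq` emits `[path, value]` event pairs instead of loading the whole document into memory. The following is only a sketch of one common pattern (the file name `huge.json` and its top-level-array layout are assumptions): reassembling each top-level element of a large array one at a time.

```bash
# --stream turns the input into [path, value] events;
# 1 | truncate_stream(inputs) strips the leading array index from each event;
# fromstream reassembles the remaining events into one value per array element.
# -n is needed so `inputs` consumes the event stream; -c prints compact output.
jq -cn --stream 'fromstream(1 | truncate_stream(inputs))' huge.json
```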