Lesson 36 • Advanced

    JSON & XML Processing

    Data interchange is the backbone of modern applications. JSON dominates REST APIs, while XML persists in enterprise systems and configuration files. Master both formats with Jackson (the industry standard) and know when to reach for streaming parsers for massive files.

    Before You Start

    You should know REST APIs (JSON is the standard API format) and Annotations (Jackson uses @JsonProperty, @JsonIgnore, etc.). Basic understanding of data structures (maps, lists) is assumed.

    What You'll Learn

    • ✅ JSON parsing with Jackson ObjectMapper
    • ✅ Gson as an alternative JSON library
    • ✅ Custom serializers and @JsonProperty annotations
    • ✅ XML processing with JAXB and DOM/SAX parsers
    • ✅ JSON vs XML: when to use each
    • ✅ Streaming parsers for large files

    1️⃣ JSON vs XML — Decision Guide

    Analogy: JSON is like texting — short, efficient, easy to read. XML is like a formal letter — verbose but self-describing with schemas and validation. For new projects, JSON wins 95% of the time. XML appears in SOAP services, Maven POMs, and legacy enterprise systems.

    Feature      | JSON                        | XML
    Syntax       | Lightweight, key-value      | Verbose, tag-based
    Data types   | String, number, bool, array | Everything is text
    Best for     | REST APIs, configs, NoSQL   | Enterprise, SOAP, documents
    Java library | Jackson, Gson               | JAXB, DOM, SAX, StAX
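    The verbosity difference is easy to see in a quick sketch (the XML string below is hand-written for comparison, not produced by a parser):

```javascript
// Same record expressed in both formats
let user = { name: "Alice", age: 30 };

let json = JSON.stringify(user);
let xml = "<user><name>Alice</name><age>30</age></user>";

console.log("JSON (" + json.length + " chars): " + json);
console.log("XML  (" + xml.length + " chars): " + xml);
// JSON carries types (30 stays a number); in XML everything is text until a schema says otherwise
```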

    Try It: Jackson ObjectMapper

    JavaScript
    // 💡 Try modifying this code and see what happens!
    // Simulating Jackson ObjectMapper operations
    
    console.log("=== Jackson ObjectMapper ===\n");
    
    // 1. Serialization (Object → JSON)
    console.log("1. SERIALIZE (writeValueAsString):");
    let user = { name: "Alice", age: 30, email: "alice@test.com", active: true };
    let json = JSON.stringify(user, null, 2);
    console.log(json);
    
    // 2. Deserialization (JSON → Object)
    console.log("\n2. DESERIALIZE (readValue):");
    let jsonStr = '{"name":"Bob","age":25,"email":"bob@test.com","active":false}';
    let bob = JSON.parse(jsonStr);
    console.log("  name: " + bob.name + ", age: " + bob.age + ", active: " + bob.active);

    2️⃣ Nested Objects & Tree Model

    Real-world APIs return deeply nested JSON. Jackson handles this automatically when your POJOs match the structure. When you don't want to create classes, use the tree model (JsonNode) to navigate JSON like a document.

    mapper.readValue(json, Order.class) — POJO mapping (typed)

    mapper.readTree(json) — tree model (dynamic)

    node.get("field").asText() — extract values from tree
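    Those tree-model calls can be sketched in the same browser-JavaScript style as the other examples — a toy `node` wrapper standing in for Jackson's JsonNode (the real class has many more methods; this is just the shape of the API):

```javascript
// Toy stand-in for Jackson's JsonNode: get() descends one level, asText()/asInt() extract
function node(value) {
  return {
    get: (field) => node(value[field]),
    asText: () => String(value),
    asInt: () => Number(value)
  };
}

let root = node(JSON.parse('{"user":{"name":"Alice","age":30}}'));
console.log(root.get("user").get("name").asText()); // Alice
console.log(root.get("user").get("age").asInt() + 1); // 31
```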

    Try It: Nested JSON & Tree Model

    JavaScript
    // 💡 Try modifying this code and see what happens!
    // Working with nested JSON and tree model
    
    console.log("=== Nested JSON & Tree Model ===\n");
    
    // 1. Complex nested JSON (e-commerce order)
    let orderJson = {
      id: "ORD-2024-001",
      customer: {
        name: "Alice Johnson",
        tier: "Gold",
        address: { city: "Portland", state: "OR", zip: "97201" }
      },
      items: [
        { product: "Laptop Pro 15", quantity: 1, price: 1299.99, category: "Electronics" },
        { product: "USB-C Hub", quantity: 2, price: 49.99, category: "Accessories" }
      ]
    };

    // 2. Navigate the nested structure (what node.get(...) does in Jackson's tree model)
    console.log("2. TREE NAVIGATION:");
    console.log("  Customer city: " + orderJson.customer.address.city);

    let total = orderJson.items.reduce((sum, i) => sum + i.price * i.quantity, 0);
    console.log("  Order total: $" + total.toFixed(2));

    Common Mistakes

    • ⚠️ Not handling unknown properties: Configure FAIL_ON_UNKNOWN_PROPERTIES = false to avoid crashes when APIs add new fields
    • ⚠️ Date formatting surprises: Use @JsonFormat(pattern="yyyy-MM-dd'T'HH:mm:ss")
    • ⚠️ Loading huge files into memory: A 2GB JSON file with readValue() will cause OutOfMemoryError — use streaming
    • ⚠️ XXE attacks in XML: Disable external entity processing to prevent security vulnerabilities
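    The first mistake can be simulated in the same style as the Try It blocks: copy only the fields your "POJO" declares and silently drop the rest, which is the behavior FAIL_ON_UNKNOWN_PROPERTIES = false gives you in Jackson (the field names here are illustrative):

```javascript
// Fields our "POJO" knows about
const knownFields = ["name", "age", "email"];

// The API later added "loyaltyTier" — strict mapping would throw on deserialization
let apiResponse = '{"name":"Carol","age":28,"email":"carol@test.com","loyaltyTier":"Gold"}';

let raw = JSON.parse(apiResponse);
let user = {};
for (let key of knownFields) {
  if (key in raw) user[key] = raw[key];
}

console.log(Object.keys(user).join(", ")); // name, age, email — unknown field ignored
```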

    Pro Tips

    • 💡 Jackson is the industry standard — Spring Boot auto-configures it. Learn ObjectMapper well
    • 💡 Use @JsonProperty("user_name") to map snake_case API fields to camelCase Java
    • 💡 mapper.readTree(json) gives you a tree model — perfect for dynamic JSON
    • 💡 For GB-sized files, use Jackson's JsonParser (streaming) — constant memory
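    The @JsonProperty tip can be sketched the same way — a generic snake_case → camelCase rename at parse time (Jackson does this per-field via the annotation, or globally via a PropertyNamingStrategy):

```javascript
// Convert one snake_case key to camelCase
let toCamel = (s) => s.replace(/_([a-z])/g, (_, c) => c.toUpperCase());

let apiJson = '{"user_name":"alice","is_active":true}';
let raw = JSON.parse(apiJson);

// Rename every key while keeping its value
let user = Object.fromEntries(
  Object.entries(raw).map(([k, v]) => [toCamel(k), v])
);

console.log(JSON.stringify(user)); // {"userName":"alice","isActive":true}
```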

    Try It: Streaming Parser for Large Files

    JavaScript
    // 💡 Try modifying this code and see what happens!
    // Streaming JSON parser simulation for large files
    
    console.log("=== Streaming Parser (Large Files) ===\n");
    
    // Generate a "large" dataset
    console.log("1. PROBLEM: Loading large JSON into memory");
    let sampleRecord = { id: 1, name: "User", email: "user@test.com", score: 42 };
    let recordSize = JSON.stringify(sampleRecord).length;
    console.log("  Record size: ~" + recordSize + " bytes");
    console.log("  1M records = ~" + Math.round(recordSize * 1000000 / (1024 * 1024)) + " MB held in memory at once");

    // 2. SOLUTION: stream records one at a time — constant memory
    console.log("\n2. SOLUTION: Streaming (like Jackson's JsonParser):");
    let maxScore = 0;
    for (let i = 1; i <= 1000000; i++) {
      // In Jackson: while (parser.nextToken() != JsonToken.END_ARRAY) { ... }
      let record = { id: i, score: i % 100 };
      if (record.score > maxScore) maxScore = record.score;
      // record is discarded each iteration — only the running aggregate survives
    }
    console.log("  Max score: " + maxScore + " — memory use stayed constant");

    📋 Quick Reference

    Library   | API                                | Use Case
    Jackson   | mapper.readValue(json, T.class)    | JSON ↔ Java objects
    Gson      | new Gson().fromJson(json, T.class) | Simpler alternative
    JsonNode  | mapper.readTree(json)              | Tree model (no POJO)
    JAXB      | Marshaller / Unmarshaller          | XML ↔ Java objects
    Streaming | JsonParser / JsonGenerator         | Large file processing

    🎉 Lesson Complete!

    You can now process JSON and XML like a pro! Next: Unit Testing — writing reliable tests with JUnit 5 and Mockito.


