Using Apple Intelligence’s Default System Model To Build a Coding Assistant Command Line Tool

This tool scans the current directory and all subdirectories for source code files, asks the on-device model to summarize them, and then enters an interactive chat loop for discussing the code.
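
The tool assumes Apple Intelligence is enabled on the Mac it runs on. Before building anything, it can be worth a quick preflight check that the on-device model is actually usable; here is a minimal sketch, assuming FoundationModels' SystemLanguageModel.default and its availability property:

    import FoundationModels

    // Preflight check: is the on-device Apple Intelligence model usable here?
    let model = SystemLanguageModel.default
    switch model.availability {
    case .available:
        print("On-device model is ready.")
    case .unavailable(let reason):
        print("On-device model unavailable: \(reason)") // e.g. Apple Intelligence not enabled
    }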

Package.swift:

    // swift-tools-version: 6.2
    // The swift-tools-version declares the minimum version of Swift required to build this package.

    import PackageDescription

    let package = Package(
        name: "CodingCLI",

        // 1️⃣ Tell SwiftPM we require at least macOS 26 so
        //    `Task.value`, async/await, and FoundationModels are available.
        platforms: [
            .macOS(.v26)
        ],

        products: [
            .executable(name: "CodingCLI", targets: ["CodingCLI"])
        ],

        targets: [
            .executableTarget(
                name: "CodingCLI",

                // 2️⃣ Link the system framework that ships with Xcode 26+
                //    (no external dependency required).
                linkerSettings: [
                    .linkedFramework("FoundationModels")
                ]
            )
        ]
    )

CodingCLI.swift:

    import Foundation
    import FoundationModels
    import Dispatch

    @main
    struct CodingCLI {
        static func main() async throws {
            // ---- 1. Gather candidate source files ----
            let exts = ["swift", "py", "lisp"]
            var blobs: [String] = []

            let enumerator = FileManager.default.enumerator(atPath: ".")!

            while let path = enumerator.nextObject() as? String { // avoids @noasync
                guard let ext = path.split(separator: ".").last,
                      exts.contains(ext.lowercased()) else { continue }

                if let data = FileManager.default.contents(atPath: path),
                   data.count < 8 * 1024 {                           // skip files larger than 8 KB
                    let text = String(decoding: data, as: UTF8.self) // non-optional UTF-8 decode
                    blobs.append("### \(path) ###\n\(text)")
                }
            }

            let doc     = blobs.joined(separator: "\n")
            let summary = try await Self.summarize(doc)
            print("\n=== Project Summary ===\n\(summary)\n")

            // ---- 2. Start interactive chat loop ----
            let session = LanguageModelSession(instructions:
                                               "You are a helpful assistant.")
            let options = GenerationOptions(temperature: 0.2)
            print("Apple-Intelligence chat (streaming, T=0.2).  Type /quit to exit.\n")

            while let prompt = readLine(strippingNewline: true) {
                if prompt.isEmpty || prompt == "/quit" { break }

                var printed = ""
                let task = Task {
                    // Each streamed element is the cumulative response so far;
                    // write only the portion that has not been printed yet.
                    for try await part in session.streamResponse(to: prompt,
                                                                 options: options) {
                        let delta = part.dropFirst(printed.count)
                        if !delta.isEmpty {
                            FileHandle.standardOutput.write(Data(delta.utf8))
                            fflush(stdout)
                            printed = part
                        }
                    }
                    print()
                }

                // Ctrl-C cancels the in-flight response instead of killing the tool.
                signal(SIGINT, SIG_IGN)
                let sig = DispatchSource.makeSignalSource(signal: SIGINT, queue: .main)
                sig.setEventHandler { task.cancel() }
                sig.resume()
                defer { sig.cancel() }

                _ = try await task.value
            }
        }

        // ---- 3. Helper: summarize all code ----
        static func summarize(_ text: String) async throws -> String {
            let session = LanguageModelSession(
                instructions: """
                Summarize the following multi-file project. \
                For each file give one bullet explaining its role, then \
                a two-sentence overall description.
                """
            )
            let prompt = text.prefix(24 * 1024)   // keep the prompt inside the context window
            let resp = try await session.respond(
                to: String(prompt),
                options: GenerationOptions(temperature: 0))
            return resp.content                  // the generated String from Response<String>
        }
    }
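
The summarize helper above returns free-form markdown text from the model. If you would rather get the summary back as typed Swift data, FoundationModels also supports guided generation; the following is a rough sketch, assuming the @Generable and @Guide macros and the respond(to:generating:) overload. ProjectSummary, fileNotes, and overview are illustrative names, not part of the tool above.

    import FoundationModels

    // Illustrative structured-output variant of the summarize helper.
    @Generable
    struct ProjectSummary {
        @Guide(description: "One bullet per file explaining its role")
        var fileNotes: [String]

        @Guide(description: "Two-sentence overall description of the project")
        var overview: String
    }

    func structuredSummary(of text: String) async throws -> ProjectSummary {
        let session = LanguageModelSession(
            instructions: "Summarize the following multi-file project.")
        let response = try await session.respond(
            to: String(text.prefix(24 * 1024)),   // same context-window guard as above
            generating: ProjectSummary.self)
        return response.content                  // a fully typed ProjectSummary
    }

The chat loop would stay exactly the same; only the summary step changes.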

Here is the output from running this tool in its own source directory:

    $ swift run
    Building for debugging...
    [8/8] Applying CodingCLI
    Build of product 'CodingCLI' complete! (3.23s)

    === Project Summary ===
    ### test.py
    - **Role:** This script interacts with Groq to perform a chat completion task.
    - **Description:** It sets up a chat session using Groq's API, sends a specific message, and prints the response, showcasing how to utilize Groq for conversational AI tasks.

    ### Package.swift
    - **Role:** Defines the Swift package configuration for the CodingCLI project.
    - **Description:** This file specifies the project's platform requirements, defines the executable product, and outlines the executable target with necessary dependencies.

    ### Sources/CodingCLI/CodingCLI.swift
    - **Role:** Serves as the entry point for the CodingCLI application, handling file summarization and chat interaction.
    - **Description:** It processes source files to generate a summary, and manages an interactive chat loop using a language model, demonstrating integration of summarization and conversational AI within a Swift package.

    Apple-Intelligence chat (streaming, T=0.2).  Type /quit to exit.