<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Pieces.app</title>
    <description>The latest articles on Forem by Pieces.app (@getpieces).</description>
    <link>https://forem.com/getpieces</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F5051%2Fec213fb0-f4a0-49ca-aedf-d18ed4445858.png</url>
      <title>Forem: Pieces.app</title>
      <link>https://forem.com/getpieces</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/getpieces"/>
    <language>en</language>
    <item>
      <title>How to Build an Agentic Blog Generator</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Tue, 03 Feb 2026 15:22:20 +0000</pubDate>
      <link>https://forem.com/getpieces/build-an-agentic-blog-generator-with-pieces-in-flutter-h3b</link>
      <guid>https://forem.com/getpieces/build-an-agentic-blog-generator-with-pieces-in-flutter-h3b</guid>
      <description>&lt;h2&gt;
  
  
  Building an Agentic Blog Generator With Pieces OS (Flutter)
&lt;/h2&gt;

&lt;p&gt;This project is a Flutter app that generates a technical blog (in Markdown) from real, recent context. The core idea is simple: &lt;strong&gt;use Pieces OS as the source of truth for what you’ve been working on&lt;/strong&gt; (workstream summaries + your own annotations/persona signals), then use an LLM to turn that into a structured, high-quality blog—optionally in &lt;strong&gt;agentic mode&lt;/strong&gt; with MCP tools.&lt;/p&gt;




&lt;h3&gt;
  
  
  Features
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Auto-generate title&lt;/strong&gt;: suggests blog titles from recent Pieces workstream summaries.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw695cfjawevjzjd8zyj3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw695cfjawevjzjd8zyj3.png" alt=" " width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Detect persona&lt;/strong&gt;: pulls recent Pieces user annotations and converts them into “persona signals” for voice/tone.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgx9ivvteojg2595d9uz1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgx9ivvteojg2595d9uz1.png" alt=" " width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Detect style&lt;/strong&gt;: analyzes a sample blog to infer a reusable style object for consistent output.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvn7pkl2dcgg1e8us85wp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvn7pkl2dcgg1e8us85wp.png" alt=" " width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Generate multi-part blog&lt;/strong&gt;: plans part titles + per-part outlines, then generates Markdown one part at a time.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjs35fq7gcx0nledgqkd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnjs35fq7gcx0nledgqkd.png" alt=" " width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Agentic flow&lt;/strong&gt;: connects to the MCP endpoint and uses tools for retrieval/verification (RAG) instead of guessing.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fneknizj6jhtte6scnljv.png" alt=" " width="800" height="500"&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What Pieces OS gives us (and why it matters)
&lt;/h3&gt;

&lt;p&gt;When you ask an LLM to “write a blog about my project”, you usually get one of two outcomes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A generic post that &lt;em&gt;sounds&lt;/em&gt; plausible but doesn’t match what you actually did.
&lt;/li&gt;
&lt;li&gt;A post that misses the details that made the work interesting (trade-offs, decisions, workflow).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Pieces OS helps fix this by providing &lt;strong&gt;grounded context&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Workstream Summaries (LTM)&lt;/strong&gt;: a stream of recent summaries of your work so we can generate content from what &lt;em&gt;actually happened&lt;/em&gt; recently.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Annotations (persona + preferences + voice cues)&lt;/strong&gt;: your own notes/annotations can be pulled and turned into “persona signals” so the blog tone and framing match you.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MCP (Model Context Protocol) tools&lt;/strong&gt;: in agentic mode, the generator can call MCP tools (memory/search/context) instead of guessing—so it can fetch or verify information as it writes. Practically, &lt;strong&gt;connecting via the MCP endpoint gives us RAG (retrieval‑augmented generation)&lt;/strong&gt;: the model retrieves relevant context from Pieces and then writes with that grounded input.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this app, Pieces is not a “nice-to-have integration”—it’s the backbone for &lt;strong&gt;relevance&lt;/strong&gt; and &lt;strong&gt;personalization&lt;/strong&gt;.&lt;/p&gt;




&lt;h3&gt;
  
  
  How this app uses Pieces OS
&lt;/h3&gt;

&lt;p&gt;At a high level, the wizard does three Pieces-backed things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Connect to Pieces OS&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Establishes app connection with Pieces OS (so API calls work).
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pull recent context&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Pulls recent workstream summary IDs over a short-lived WebSocket.
&lt;/li&gt;
&lt;li&gt;Fetches each summary snapshot.
&lt;/li&gt;
&lt;li&gt;Extracts the human-readable “DESCRIPTION” annotation text from each summary.
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pull persona signals&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;Fetches recent user annotations via &lt;code&gt;UserApi.userGetAnnotations()&lt;/code&gt;.
&lt;/li&gt;
&lt;li&gt;Normalizes/truncates them into a prompt-friendly “persona signals” block.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Then, when we run in &lt;strong&gt;agentic mode&lt;/strong&gt;, we also:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect to the Pieces MCP endpoint
&lt;/li&gt;
&lt;li&gt;List the available tools
&lt;/li&gt;
&lt;li&gt;Allow the LLM to call those tools while planning/writing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This combination (summaries + persona + MCP tools) gives the generator a much better shot at producing content that matches reality.&lt;/p&gt;
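&lt;p&gt;As a concrete illustration of that combination, the sketch below merges the two Pieces-derived inputs into a single grounding block for the prompt. This is a hypothetical helper (the function name and section labels are ours, not the app's actual code):&lt;/p&gt;

```dart
// Hypothetical helper: merge workstream summaries and persona signals
// (both plain strings pulled from Pieces OS) into one grounding block
// that can be prepended to a generation prompt.
String buildGroundingBlock(List summaries, List personaSignals) {
  final buffer = StringBuffer();
  buffer.writeln('RECENT WORK (from Pieces workstream summaries):');
  for (final s in summaries) {
    buffer.writeln('- $s');
  }
  if (personaSignals.isNotEmpty) {
    // Persona stays optional: if Pieces returned no annotations,
    // we simply omit the section instead of inventing a voice.
    buffer.writeln('');
    buffer.writeln('PERSONA SIGNALS (from user annotations):');
    for (final p in personaSignals) {
      buffer.writeln('- $p');
    }
  }
  return buffer.toString();
}
```

&lt;p&gt;The key design choice is that persona is additive and optional, mirroring how the service returns an empty &lt;code&gt;PersonaAnnotations&lt;/code&gt; when no annotations exist.&lt;/p&gt;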




&lt;h3&gt;
  
  
  Agentic mode: how we got better output (one-shot → plan-first → structured parts)
&lt;/h3&gt;

&lt;p&gt;We saw a clear quality jump as we changed the prompting approach.&lt;/p&gt;

&lt;h4&gt;
  
  
  Attempt 1: one prompt to generate the whole blog
&lt;/h4&gt;

&lt;p&gt;The naive approach is: “Here’s some context—write the entire blog.”&lt;/p&gt;

&lt;p&gt;In practice, that tends to produce:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;weak structure (rambling, repetitive sections)
&lt;/li&gt;
&lt;li&gt;missing coverage (important modules not mentioned)
&lt;/li&gt;
&lt;li&gt;brittle accuracy (model fills gaps by guessing)&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Attempt 2: plan first, then generate
&lt;/h4&gt;

&lt;p&gt;Next improvement: &lt;strong&gt;separate planning from writing&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Instead of writing immediately, we made the app generate a plan first (what parts, what each part is about). That makes the writing phase far more constrained and coherent:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the model knows the “shape” of the blog before it writes
&lt;/li&gt;
&lt;li&gt;you can review/edit titles before any heavy generation happens&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Attempt 3: even more planning, broken into titled parts
&lt;/h4&gt;

&lt;p&gt;The biggest jump came from adding &lt;em&gt;more&lt;/em&gt; structure:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;generate &lt;strong&gt;part titles&lt;/strong&gt; first (so each part has a clear purpose)
&lt;/li&gt;
&lt;li&gt;generate &lt;strong&gt;an outline for each part&lt;/strong&gt; (so headings and subheadings are defined)
&lt;/li&gt;
&lt;li&gt;then write &lt;strong&gt;one part at a time&lt;/strong&gt;, using the outline as a contract&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That does two things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Focus&lt;/strong&gt;: each part stays on-topic (because it has a title + outline).
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Better coverage&lt;/strong&gt;: parts are explicitly scoped, so important areas are less likely to be skipped.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is exactly why titled parts matter: the model is no longer inventing structure as it writes—it’s executing a plan you’ve already approved.&lt;/p&gt;
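&lt;p&gt;To make “outline as a contract” concrete, here is a minimal sketch of how a planned part (title + outline) can be compiled into a constrained writing prompt. The function name and prompt wording are illustrative, not the app’s exact implementation:&lt;/p&gt;

```dart
// Illustrative sketch: turn one planned part into a constrained prompt.
// The exact wording is hypothetical; the point is that the outline pins
// both the scope and the order of headings before any writing happens.
String buildPartPrompt(String partTitle, List outlineHeadings, String context) {
  final headings = outlineHeadings.map((h) => '- $h').join('\n');
  return [
    'You are writing ONE part of a multi-part technical blog.',
    'Part title: $partTitle',
    'Cover exactly these headings, in this order:',
    headings,
    'Ground every claim in the context below; do not invent details:',
    context,
    'Output Markdown for this part only.',
  ].join('\n\n');
}
```

&lt;p&gt;Because the headings are enumerated up front, the model cannot silently drop a section, and a reviewer can veto the plan before any heavy generation runs.&lt;/p&gt;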




&lt;h3&gt;
  
  
  Tutorial: integrating Pieces OS in a Flutter app
&lt;/h3&gt;

&lt;p&gt;This section breaks down how we integrated Pieces OS in a way that works well for a UI-driven Flutter app. The goal is to make the model’s output &lt;strong&gt;grounded&lt;/strong&gt; (Pieces workstream summaries + persona signals) and optionally &lt;strong&gt;agentic&lt;/strong&gt; (MCP endpoint → RAG).&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 1: configure the Pieces endpoints
&lt;/h4&gt;

&lt;p&gt;In &lt;code&gt;PiecesOSService&lt;/code&gt;, we keep the local Pieces defaults in one place:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;baseUrl&lt;/code&gt;: REST API base (&lt;code&gt;http://localhost:39300&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;websocketUrl&lt;/code&gt;: WebSocket base (&lt;code&gt;ws://localhost:39300&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;defaultMcpEndpoint&lt;/code&gt;: MCP streamable HTTP endpoint (used for tool-based RAG)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here’s the exact code that defines those endpoints (and the imports used by the service):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'dart:async';
import 'dart:developer' as dev;
import 'dart:convert';
import 'dart:io';
import 'package:mcp_dart/mcp_dart.dart' as mcp;
import 'package:pieces_os_client/api.dart';

import '../models/persona_signals.dart';

/// Service to interact with Pieces OS for LTM (Long Term Memory)
class PiecesOSService {
  // Pieces OS configuration
  static const String baseUrl = 'http://localhost:39300';
  static const String websocketUrl = 'ws://localhost:39300';
  static const String defaultMcpEndpoint =
      'http://localhost:39300/model_context_protocol/2025-03-26/mcp';
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 2: connect the app to Pieces OS (REST)
&lt;/h4&gt;

&lt;p&gt;The &lt;code&gt;connectApplication()&lt;/code&gt; method registers/connects this app with Pieces via &lt;code&gt;ConnectorApi.connect(...)&lt;/code&gt; and stores the returned &lt;code&gt;Context&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;initialize()&lt;/code&gt; method is a small guard that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ensures we only connect once per session (&lt;code&gt;_isInitialized&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;makes all later calls safe to run “just-in-time” from the UI&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here are the service fields + constructor + REST connection code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  // API clients
  late final ApiClient client;
  late final ConnectorApi _connectorApi;
  late final WorkstreamSummaryApi _workstreamSummaryApi;
  late final AnnotationApi _annotationApi;
  late final UserApi _userApi;

  // Application context
  Context? _context;

  // Application info
  final ApplicationNameEnum appName = ApplicationNameEnum.OPEN_SOURCE;
  final String appVersion = "0.0.1";
  final PlatformEnum platform = Platform.operatingSystem == "windows"
      ? PlatformEnum.WINDOWS
      : Platform.operatingSystem == "macos"
      ? PlatformEnum.MACOS
      : PlatformEnum.LINUX;

  // Track if service is initialized
  bool _isInitialized = false;

  /// Placeholder persona string that can be injected into
  /// generation prompts later.
  String persona = '';

  PiecesOSService() {
    client = ApiClient(basePath: baseUrl);
    _connectorApi = ConnectorApi(client);
    _workstreamSummaryApi = WorkstreamSummaryApi(client);
    _annotationApi = AnnotationApi(client);
    _userApi = UserApi(client);
  }

  /// Register/Connect the application to Pieces OS
  Future&amp;lt;Application&amp;gt; connectApplication() async {
    if (_context?.application != null) {
      return _context!.application;
    }

    try {
      final seededApp = SeededTrackedApplication(
        name: appName,
        platform: platform,
        version: appVersion,
      );

      final connection = SeededConnectorConnection(application: seededApp);

      _context = await _connectorApi.connect(
        seededConnectorConnection: connection,
      );

      if (_context?.application == null) {
        throw Exception(
          'Failed to connect to Pieces OS: No application returned',
        );
      }

      dev.log(
        'Successfully connected to Pieces OS: ${_context!.application.name}',
        name: 'PiecesOSService',
      );
      return _context!.application;
    } catch (e, st) {
      dev.log(
        'Error connecting to Pieces OS',
        name: 'PiecesOSService',
        error: e,
        stackTrace: st,
      );
      rethrow;
    }
  }

  /// Initialize the service - connects to Pieces OS.
  ///
  /// NOTE: We intentionally do **not** start any WebSocket listeners. This app
  /// fetches workstream summary identifiers on-demand.
  Future&amp;lt;void&amp;gt; initialize() async {
    if (_isInitialized) {
      dev.log('Service already initialized', name: 'PiecesOSService');
      return;
    }

    try {
      // Connect to Pieces OS
      await connectApplication();

      _isInitialized = true;
      dev.log('Initialized successfully', name: 'PiecesOSService');
    } catch (e, st) {
      dev.log(
        'Error initializing',
        name: 'PiecesOSService',
        error: e,
        stackTrace: st,
      );
      rethrow;
    }
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 3: retrieve recent work context (workstream summaries)
&lt;/h4&gt;

&lt;p&gt;To ground generation in “what I actually worked on”, the service does this on-demand:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;fetchLatestWorkstreamSummaryIds()&lt;/code&gt; opens a WebSocket to the identifiers stream, reads &lt;strong&gt;one&lt;/strong&gt; payload, then closes the socket.
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;getLastSummaryContents()&lt;/code&gt; uses those IDs to fetch summary snapshots and extracts the &lt;strong&gt;DESCRIPTION&lt;/strong&gt; annotation text via &lt;code&gt;getSummaryContent()&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Those DESCRIPTION strings are what we feed into the LLM as “recent context”.&lt;/p&gt;

&lt;p&gt;Here’s the exact code for streaming IDs, fetching summaries, and extracting DESCRIPTION text:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  /// Fetch the most recent workstream summary identifiers on-demand.
  ///
  /// This opens a WebSocket connection **once**, reads the first identifiers
  /// payload, then closes the socket. No continuous listener / no caching.
  Future&amp;lt;List&amp;lt;String&amp;gt;&amp;gt; fetchLatestWorkstreamSummaryIds({
    int limit = 10,
    Duration timeout = const Duration(seconds: 8),
  }) async {
    if (limit &amp;lt;= 0) return const [];
    await initialize();

    final wsUrl = '$websocketUrl/workstream_summaries/stream/identifiers';
    WebSocket? socket;
    try {
      socket = await WebSocket.connect(wsUrl).timeout(timeout);
      final first = await socket.first.timeout(timeout);

      String raw;
      if (first is String) {
        raw = first;
      } else if (first is List&amp;lt;int&amp;gt;) {
        raw = utf8.decode(first);
      } else {
        raw = first.toString();
      }

      final decoded = jsonDecode(raw);
      final streamed = StreamedIdentifiers.fromJson(decoded);
      final ids = (streamed?.iterable ?? const [])
          .map((e) =&amp;gt; e.workstreamSummary?.id)
          .whereType&amp;lt;String&amp;gt;()
          .where((s) =&amp;gt; s.trim().isNotEmpty)
          .take(limit)
          .toList(growable: false);
      return ids;
    } finally {
      try {
        await socket?.close();
      } catch (_) {
        // ignore
      }
    }
  }

  /// Get the summary content from a workstream summary's annotations
  Future&amp;lt;String?&amp;gt; getSummaryContent(WorkstreamSummary summary) async {
    try {
      // Loop through annotations to find the DESCRIPTION type
      for (final annotationRef
          in summary.annotations?.indices.keys.toList() ?? []) {
        // Fetch the full annotation using AnnotationApi (singular)
        final annotation = await _annotationApi
            .annotationSpecificAnnotationSnapshot(annotationRef);
        if (annotation == null) {
          continue;
        }

        // Check if this is a DESCRIPTION type annotation
        if (annotation.type == AnnotationTypeEnum.DESCRIPTION) {
          // Return the text content
          return annotation.text;
        }
      }

      return null;
    } catch (e) {
      dev.log(
        'Error fetching annotation content for ${summary.id}: $e',
        name: 'PiecesOSService',
      );
      return null;
    }
  }

  /// Get the last [limit] workstream summary DESCRIPTION texts (most recent first).
  ///
  /// This performs an on-demand identifiers fetch, then retrieves each summary
  /// snapshot and its DESCRIPTION annotation text.
  Future&amp;lt;List&amp;lt;String&amp;gt;&amp;gt; getLastSummaryContents({int limit = 10}) async {
    if (limit &amp;lt;= 0) return const [];
    await initialize();

    final ids = await fetchLatestWorkstreamSummaryIds(limit: limit);
    if (ids.isEmpty) return const [];

    final summaries = await Future.wait(
      ids.map(
        (id) =&amp;gt; _workstreamSummaryApi
            .workstreamSummariesSpecificWorkstreamSummarySnapshot(id),
      ),
    );

    final contents = await Future.wait(
      summaries.whereType&amp;lt;WorkstreamSummary&amp;gt;().map(
        (s) async =&amp;gt; (await getSummaryContent(s))?.trim(),
      ),
    );

    return contents.whereType&amp;lt;String&amp;gt;().where((t) =&amp;gt; t.isNotEmpty).toList();
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 4: retrieve persona signals (user annotations)
&lt;/h4&gt;

&lt;p&gt;To personalize voice and framing, &lt;code&gt;getPersonaFromUserAnnotations()&lt;/code&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;resolves the active user (&lt;code&gt;_resolveUserId()&lt;/code&gt; → &lt;code&gt;UserApi.userSnapshot()&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;fetches recent annotations (&lt;code&gt;UserApi.userGetAnnotations(...)&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;normalizes them into a prompt-friendly &lt;code&gt;PersonaAnnotations&lt;/code&gt; object&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If Pieces returns no annotations, we keep persona optional (so the model doesn’t invent one).&lt;/p&gt;

&lt;p&gt;Here’s the code that resolves the active user and turns annotations into prompt-friendly “persona signals”:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  Future&amp;lt;String&amp;gt; _resolveUserId() async {
    await initialize();

    final snap = await _userApi.userSnapshot();
    final userId = snap?.user?.id;
    if (userId != null &amp;amp;&amp;amp; userId.trim().isNotEmpty) return userId.trim();

    throw StateError(
      'No active user found. Ensure Pieces has an active user session.',
    );
  }

  /// Get a "persona" text derived from the user's annotations.
  ///
  /// Uses `UserApi.userGetAnnotations()` which returns the resolved active Person
  /// and filtered Annotations.
  Future&amp;lt;PersonaAnnotations&amp;gt; getPersonaFromUserAnnotations({
    int limit = 1,
  }) async {
    if (limit &amp;lt;= 0) return const PersonaAnnotations();
    final userId = await _resolveUserId();
    final out = await _userApi.userGetAnnotations(
      userId,
      UserAnnotationsInput(limit: limit),
    );

    final texts = out.annotations.iterable
        .map((a) =&amp;gt; a.text)
        .whereType&amp;lt;String&amp;gt;()
        .map((t) =&amp;gt; t.trim())
        .where((t) =&amp;gt; t.isNotEmpty)
        .toList();
    dev.log(
      'Fetched ${texts.length} annotation texts for persona.',
      name: 'PiecesOSService',
    );

    // Pieces might return an empty annotations list; in that case we return empty
    // and let the caller omit persona entirely.
    if (texts.isEmpty) return const PersonaAnnotations();

    final normalized = &amp;lt;String&amp;gt;[];
    for (final t in texts) {
      final oneLine = t.replaceAll(RegExp(r'\s+'), ' ').trim();
      if (oneLine.isEmpty) continue;
      normalized.add(
        oneLine.length &amp;gt; 180 ? '${oneLine.substring(0, 180)}…' : oneLine,
      );
    }

    return PersonaAnnotations(annotations: normalized);
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 5: enable RAG with MCP (agentic mode)
&lt;/h4&gt;

&lt;p&gt;When we connect via the MCP endpoint, the generator can do &lt;strong&gt;RAG (retrieval‑augmented generation)&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;connectMcp()&lt;/code&gt; connects a streamable HTTP transport (POST + SSE GET)
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;listTools()&lt;/code&gt; fetches and caches the available MCP tools
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;callTool()&lt;/code&gt; lets the agent retrieve/verify relevant context from Pieces during planning/writing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is the “agentic” part: instead of guessing, the model can retrieve what it needs.&lt;/p&gt;

&lt;p&gt;Here’s the MCP client code that enables tool calls (MCP endpoint → RAG):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  // MCP (Streamable HTTP/SSE) client
  mcp.McpClient? _mcpClient;
  mcp.StreamableHttpClientTransport? _mcpTransport;
  Uri? _mcpEndpoint;
  List&amp;lt;mcp.Tool&amp;gt;? _cachedMcpTools;

  bool get isMcpConnected =&amp;gt;
      _mcpClient != null &amp;amp;&amp;amp; _mcpTransport != null;

  /// Connect to Pieces MCP endpoint using Streamable HTTP (POST + SSE GET).
  ///
  /// The default endpoint is [defaultMcpEndpoint].
  Future&amp;lt;void&amp;gt; connectMcp({String? endpoint}) async {
    final ep = Uri.parse((endpoint ?? defaultMcpEndpoint).trim());

    // If already connected to same endpoint, do nothing.
    if (isMcpConnected &amp;amp;&amp;amp; _mcpEndpoint == ep) return;

    // Close any existing transport/client.
    await disconnectMcp();
    _mcpEndpoint = ep;

    final transport = mcp.StreamableHttpClientTransport(ep);
    final client = mcp.McpClient(
      const mcp.Implementation(name: 'blog_generator', version: '0.0.1'),
      options: const mcp.McpClientOptions(
        // Client-side capabilities are for server-initiated requests
        // (sampling, elicitation, tasks, roots). We don't need any for now.
        capabilities: mcp.ClientCapabilities(),
      ),
    );

    transport.onerror = (err) {
      dev.log(
        'MCP transport error: $err',
        name: 'PiecesOSService',
        error: err,
      );
    };
    transport.onclose = () {
      dev.log('MCP transport closed', name: 'PiecesOSService');
      _cachedMcpTools = null;
      _mcpClient = null;
      _mcpTransport = null;
    };

    try {
      await client.connect(transport);
      _mcpTransport = transport;
      _mcpClient = client;
      final server = client.getServerVersion();
      dev.log(
        'MCP connected: ${server?.name ?? 'unknown'} ${server?.version ?? ''}',
        name: 'PiecesOSService',
      );
    } catch (e) {
      await disconnectMcp();
      rethrow;
    }
  }

  Future&amp;lt;void&amp;gt; disconnectMcp() async {
    _cachedMcpTools = null;
    try {
      await _mcpTransport?.terminateSession();
    } catch (_) {
      // Ignore - server may not support.
    }

    try {
      await _mcpClient?.close();
    } catch (_) {
      // Ignore.
    }
    try {
      await _mcpTransport?.close();
    } catch (_) {
      // Ignore.
    }

    _mcpClient = null;
    _mcpTransport = null;
  }

  Future&amp;lt;List&amp;lt;mcp.Tool&amp;gt;&amp;gt; listTools({bool forceRefresh = false}) async {
    final client = _mcpClient;
    if (client == null) {
      throw StateError('MCP client not connected');
    }

    if (!forceRefresh &amp;amp;&amp;amp; _cachedMcpTools != null) return _cachedMcpTools!;

    final res = await client.listTools();
    _cachedMcpTools = res.tools;
    return res.tools;
  }

  Future&amp;lt;mcp.CallToolResult&amp;gt; callTool({
    required String name,
    Map&amp;lt;String, dynamic&amp;gt; arguments = const {},
  }) async {
    final client = _mcpClient;
    if (client == null) {
      throw StateError('MCP client not connected');
    }

    return client.callTool(
      mcp.CallToolRequest(name: name, arguments: arguments),
    );
  }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Step 6: clean up
&lt;/h4&gt;

&lt;p&gt;&lt;code&gt;dispose()&lt;/code&gt; closes the MCP session (best-effort) and clears state so the service doesn’t leak resources across UI lifecycles.&lt;/p&gt;

&lt;p&gt;Here’s the cleanup method from the service (it also closes the class):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  /// Disconnect from Pieces OS and cleanup resources
  void dispose() {
    _isInitialized = false;

    // Close MCP client/transport
    // Fire and forget; dispose is sync.
    unawaited(disconnectMcp());

    // Clear context
    _context = null;

    dev.log('Disposed', name: 'PiecesOSService');
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;
  
  
  How the UI uses this service (quick walkthrough)
&lt;/h3&gt;

&lt;p&gt;In &lt;code&gt;BlogWizardScreen&lt;/code&gt;, the flow looks like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Get grounded context&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;_pieces.getLastSummaryContents(limit: 10)&lt;/code&gt; for recent workstream summaries
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;_pieces.getPersonaFromUserAnnotations(limit: 1)&lt;/code&gt; for persona signals (optional)
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Turn on agentic RAG&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;_pieces.connectMcp(...)&lt;/code&gt; then &lt;code&gt;_pieces.listTools()&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;pass &lt;code&gt;callMcpTool: (...) =&amp;gt; _pieces.callTool(...)&lt;/code&gt; into the agent loop&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;From there, the generator improves quality by moving from one-shot output to a structured workflow:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;plan part titles first
&lt;/li&gt;
&lt;li&gt;generate outlines per part (reviewable)
&lt;/li&gt;
&lt;li&gt;write each part with the outline as a contract&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Takeaway
&lt;/h3&gt;

&lt;p&gt;Pieces OS is what makes this blog generator &lt;em&gt;feel real&lt;/em&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;it anchors generation in your actual recent work (summaries)
&lt;/li&gt;
&lt;li&gt;it shapes tone/voice via your own signals (annotations → persona)
&lt;/li&gt;
&lt;li&gt;it enables agentic correctness when available (MCP tools)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And the prompt strategy matters just as much: moving from &lt;strong&gt;one-shot generation&lt;/strong&gt; to a &lt;strong&gt;plan-first, titled multi-part workflow&lt;/strong&gt; is what consistently turns “okay output” into “publishable output”.&lt;/p&gt;

&lt;p&gt;The rest of the project code (UI, models, generation logic, widgets, etc.) is available in the &lt;a href="https://github.com/bishoy-at-pieces/blog-dart-blog-generator" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>flutter</category>
      <category>beginners</category>
      <category>tutorial</category>
      <category>llm</category>
    </item>
    <item>
      <title>Building a TUI with Pieces SDK - Part 3: Advanced Features</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Mon, 02 Feb 2026 17:57:24 +0000</pubDate>
      <link>https://forem.com/getpieces/building-a-tui-with-pieces-sdk-part-3-advanced-features-3i0o</link>
      <guid>https://forem.com/getpieces/building-a-tui-with-pieces-sdk-part-3-advanced-features-3i0o</guid>
      <description>&lt;h2&gt;
  
  
  Building a Pieces Copilot TUI - Part 3: Advanced Features &amp;amp; Integration
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: This tutorial is part of the &lt;a href="https://github.com/pieces-app/cli-agent" rel="noopener noreferrer"&gt;Pieces CLI project&lt;/a&gt;. We welcome contributions! Feel free to open issues, submit PRs, or suggest improvements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Welcome to Part 3! In &lt;a href="https://dev.to/getpieces/building-a-tui-with-pieces-sdk-part-2-ui-components-59l9"&gt;Part 2&lt;/a&gt;, we built the core UI components. Now we'll add the advanced features to create a fully functional Pieces Copilot TUI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What we built in Part 2:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;✅ Message widget (&lt;code&gt;chat_message.py&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;✅ Chat panel (&lt;code&gt;chat_panel.py&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;✅ Input widget (&lt;code&gt;chat_input.py&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;✅ Basic view (&lt;code&gt;chat_view.py&lt;/code&gt; - simple version)
&lt;/li&gt;
&lt;li&gt;✅ Main application (&lt;code&gt;app.py&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Step 1: Create the Conversations List
&lt;/h2&gt;

&lt;p&gt;Let's build &lt;code&gt;chats_list.py&lt;/code&gt; - the sidebar showing all conversations:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# chats_list.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Conversations list panel widget.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.widgets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Button&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.containers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;VerticalScroll&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Vertical&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.message&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Message&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper.basic_identifier.chat&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;BasicChat&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ChatSelected&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Message emitted when a chat is selected.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;NewChatRequested&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Message emitted when new chat is requested.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;pass&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ChatItem&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Widget representing a single chat in the list.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    ChatItem {
        width: 100%;
        height: auto;
        padding: 1 2;
        margin-bottom: 1;
        background: $surface;
        border-left: solid $primary;
    }

    ChatItem:hover {
        background: $panel;
        border-left: solid $accent;
    }

    ChatItem.active {
        background: $primary 30%;
        border-left: thick $accent;
        text-style: bold;
    }

    ChatItem .chat-title {
        color: $text;
        text-style: bold;
    }

    ChatItem .chat-summary {
        color: $text-muted;
        text-style: italic;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;compose&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Compose the chat item.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Untitled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="n"&gt;summary&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;summary&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;[:&lt;/span&gt;&lt;span class="mi"&gt;47&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;💬 &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;title&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;chat-title&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;chat-summary&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_click&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle click event.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nc"&gt;ChatSelected&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ChatsList&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Vertical&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Panel to display list of conversations.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    ChatsList {
        width: 25%;
        height: 100%;
        border: solid $primary;
        background: $background;
        padding: 1;
    }

    ChatsList:focus-within {
        border: solid $accent;
    }

    ChatsList Button {
        width: 100%;
        margin-bottom: 1;
    }

    ChatsList VerticalScroll {
        height: 1fr;
    }

    ChatsList .empty-state {
        text-align: center;
        color: $text-muted;
        text-style: italic;
        margin: 2;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;border_title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Conversations&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;active_chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;compose&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Compose the chats list panel.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Button&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;➕ New Chat&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;new-chat-btn&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;VerticalScroll&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;chats-container&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;load_chats&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Load chats from the API.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;chats&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

            &lt;span class="n"&gt;container&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query_one&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#chats-container&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;VerticalScroll&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="c1"&gt;# Clear existing items
&lt;/span&gt;            &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove_children&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;clear&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;No chats yet...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;empty-state&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="n"&gt;chat_item&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatItem&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chat_item&lt;/span&gt;
                    &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chat_item&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;container&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query_one&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#chats-container&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;VerticalScroll&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Failed to load chats: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;empty-state&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;set_active_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;]):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Set the active chat and update UI.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="c1"&gt;# Remove active class from all items
&lt;/span&gt;        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;values&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove_class&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;active&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;active_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;

        &lt;span class="c1"&gt;# Add active class to the selected item
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;add_class&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;active&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_new_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Add a new chat to the list.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;container&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query_one&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#chats-container&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;VerticalScroll&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Remove empty state if present
&lt;/span&gt;        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;empty_state&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query_one&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;.empty-state&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;empty_state&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;LookupError&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# No empty state widget present, which is fine
&lt;/span&gt;            &lt;span class="k"&gt;pass&lt;/span&gt;

        &lt;span class="c1"&gt;# Add new chat at the top
&lt;/span&gt;        &lt;span class="n"&gt;chat_item&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatItem&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_chat_items&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chat_item&lt;/span&gt;

        &lt;span class="c1"&gt;# Mount at the beginning
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;children&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chat_item&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;before&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;children&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;container&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chat_item&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_button_pressed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Button&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Pressed&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle button press.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;button&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;new-chat-btn&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nc"&gt;NewChatRequested&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
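
&lt;p&gt;One small detail in &lt;code&gt;ChatItem.compose&lt;/code&gt; worth noting: summaries longer than 50 characters are truncated by hand. The same intent can be expressed with the standard library's &lt;code&gt;textwrap.shorten&lt;/code&gt;, which breaks on word boundaries instead of mid-word (an alternative sketch, not what the tutorial's code uses):&lt;/p&gt;

```python
import textwrap

# A summary well past the 50-character cutoff used in ChatItem.compose.
summary = "A long conversation summary that goes well past fifty characters in total"

# Manual guard, as written in the tutorial's ChatItem.compose:
manual = summary[:47] + "..." if len(summary) > 50 else summary

# Same intent via the standard library: truncates at a word boundary
# and appends the placeholder only when truncation actually happens.
short = textwrap.shorten(summary, width=50, placeholder="...")
```

&lt;p&gt;Either approach works; the manual slice guarantees an exact width, while &lt;code&gt;shorten&lt;/code&gt; avoids cutting a word in half.&lt;/p&gt;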



&lt;p&gt;&lt;strong&gt;Key features:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;ChatItem&lt;/code&gt; - renders a single conversation with its title and a truncated summary
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;ChatSelected&lt;/code&gt; - custom message posted when a chat is clicked
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;NewChatRequested&lt;/code&gt; - custom message posted when the "New Chat" button is pressed
&lt;/li&gt;
&lt;li&gt;Active-chat highlighting via the &lt;code&gt;active&lt;/code&gt; CSS class
&lt;/li&gt;
&lt;li&gt;Hover effects
&lt;/li&gt;
&lt;li&gt;Error handling for API failures&lt;/li&gt;
&lt;/ul&gt;
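
&lt;p&gt;These custom messages bubble up the widget tree, and Textual routes each one to a handler named after the message class (for example, &lt;code&gt;ChatSelected&lt;/code&gt; is handled by an &lt;code&gt;on_chat_selected&lt;/code&gt; method on an ancestor). The following self-contained sketch models that dispatch without depending on &lt;code&gt;textual&lt;/code&gt; itself; the &lt;code&gt;Widget&lt;/code&gt;/&lt;code&gt;App&lt;/code&gt; stand-ins are illustrative, not the tutorial's real classes:&lt;/p&gt;

```python
import re

# Simplified model of how Textual routes custom messages like
# ChatSelected to on_* handlers. The real dispatch lives inside
# textual.message / textual.widget; this only illustrates the
# naming convention and the bubbling behaviour.

class Message:
    """Stand-in for textual.message.Message."""

class ChatSelected(Message):
    def __init__(self, chat):
        self.chat = chat

class Widget:
    def __init__(self, parent=None):
        self.parent = parent

    def post_message(self, message):
        # Bubble up the tree until an ancestor defines a handler named
        # on_<snake_case_class_name>, e.g. on_chat_selected.
        snake = re.sub(r"(?<!^)(?=[A-Z])", "_", type(message).__name__).lower()
        node = self.parent
        while node is not None:
            handler = getattr(node, f"on_{snake}", None)
            if handler is not None:
                handler(message)
                return
            node = node.parent

class App(Widget):
    """Stand-in for the main app; records which chat was selected."""
    def __init__(self):
        super().__init__()
        self.selected_chat = None

    def on_chat_selected(self, message):
        self.selected_chat = message.chat

app = App()
item = Widget(parent=app)      # plays the role of a ChatItem
item.post_message(ChatSelected("my-chat-id"))
```

&lt;p&gt;In the real app, the equivalent is simply defining a method such as &lt;code&gt;def on_chat_selected(self, message: ChatSelected)&lt;/code&gt; on any ancestor widget; we'll wire these handlers up when we extend &lt;code&gt;app.py&lt;/code&gt;.&lt;/p&gt;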




&lt;h2&gt;
  
  
  Step 2: Create the Streaming Handler
&lt;/h2&gt;

&lt;p&gt;Next, create &lt;code&gt;src/pieces_copilot_tui/streaming_handler.py&lt;/code&gt;, which connects Pieces OS streaming responses to our UI:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/streaming_handler.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handler for streaming responses from Pieces OS.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper.basic_identifier.chat&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;BasicChat&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;StreamingHandler&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handles streaming responses from the Pieces Copilot.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;on_thinking_started&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_stream_started&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_stream_chunk&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_stream_completed&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;]],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_stream_error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
        Initialize the streaming handler.

        Args:
            pieces_client: The Pieces client instance
            on_thinking_started: Callback when thinking starts
            on_stream_started: Callback when streaming starts (with initial text)
            on_stream_chunk: Callback for each chunk (with full accumulated text)
            on_stream_completed: Callback when streaming completes (with optional new chat)
            on_stream_error: Callback when an error occurs (with error message)
        &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_thinking_started&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_thinking_started&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_stream_started&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_stream_started&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_stream_chunk&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_stream_chunk&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_stream_completed&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_stream_completed&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_stream_error&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_stream_error&lt;/span&gt;

        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

        &lt;span class="c1"&gt;# Register callback
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_message_callback&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_handle_stream_message&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;ask_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
        Ask a question to the copilot.

        Args:
            query: The question to ask
        &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_status&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_thinking_started&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stream_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_handle_stream_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle streaming messages from the copilot.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status&lt;/span&gt;

            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;IN-PROGRESS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="n"&gt;answers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;answers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;iterable&lt;/span&gt;
                    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;answers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                                &lt;span class="c1"&gt;# First chunk - start streaming
&lt;/span&gt;                                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;
                                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_stream_started&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                            &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                                &lt;span class="c1"&gt;# Subsequent chunks
&lt;/span&gt;                                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;
                                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_stream_chunk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;COMPLETED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="c1"&gt;# Streaming completed
&lt;/span&gt;                &lt;span class="n"&gt;new_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
                &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conversation&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="n"&gt;new_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;conversation&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;

                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_stream_completed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;

            &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;FAILED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;STOPPED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;CANCELED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
                &lt;span class="c1"&gt;# Handle error
&lt;/span&gt;                &lt;span class="n"&gt;error_msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;getattr&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error_message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Unknown error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_stream_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;error_msg&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;

        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;ValueError&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_stream_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 3: Create the Full View Orchestrator
&lt;/h2&gt;

&lt;p&gt;Here is the &lt;strong&gt;complete&lt;/strong&gt; &lt;code&gt;chat_view.py&lt;/code&gt;, which replaces the simple version from Part 2:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# chat_view.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Main copilot view combining all widgets.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.screen&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Screen&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.containers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Horizontal&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Vertical&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.binding&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Binding&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.widgets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Footer&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper.basic_identifier.chat&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;BasicChat&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_panel&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatPanel&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chats_list&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatsList&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;ChatSelected&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;NewChatRequested&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_input&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatInput&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;MessageSubmitted&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.streaming_handler&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;StreamingHandler&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;CopilotView&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Screen&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Main copilot view screen.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;BINDINGS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="nc"&gt;Binding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ctrl+n&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;new_chat&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;New Chat&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="nc"&gt;Binding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ctrl+r&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;rename_chat&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Rename&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="nc"&gt;Binding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ctrl+d&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;delete_chat&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Delete&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="nc"&gt;Binding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ctrl+l&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;toggle_ltm&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Toggle LTM&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
        &lt;span class="nc"&gt;Binding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ctrl+q&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;quit&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Quit&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    CopilotView {
        layout: vertical;
    }

    CopilotView Horizontal {
        height: 1fr;
    }

    CopilotView Vertical.main-content {
        width: 75%;
        layout: vertical;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;ChatPanel&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;ChatsList&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;ChatInput&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_ltm_enabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;streaming_handler&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;StreamingHandler&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;compose&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Compose the view.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nc"&gt;Horizontal&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="c1"&gt;# Left sidebar - conversations list
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatsList&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;

            &lt;span class="c1"&gt;# Right side - chat panel and input
&lt;/span&gt;            &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nc"&gt;Vertical&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;main-content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatPanel&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;

                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatInput&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt;

        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Footer&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Initialize the view.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="c1"&gt;# Load chats
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;load_chats&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Show welcome message
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;show_welcome&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

        &lt;span class="c1"&gt;# Check LTM status
&lt;/span&gt;        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_ltm_enabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_chat_ltm_enabled&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_ltm_enabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;

        &lt;span class="c1"&gt;# Setup streaming handler
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;streaming_handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;StreamingHandler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;on_thinking_started&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_on_thinking_started&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;on_stream_started&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_on_stream_started&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;on_stream_chunk&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_on_stream_chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;on_stream_completed&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_on_stream_completed&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;on_stream_error&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_on_stream_error&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_chat_selected&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;ChatSelected&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle chat selection.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_active_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Load conversation
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;clear_messages&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;border_title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Chat: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;messages&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;role&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;
                &lt;span class="n"&gt;content&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;raw_content&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;ValueError&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;system&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Error loading messages: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_new_chat_requested&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;NewChatRequested&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle new chat request.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;action_new_chat&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_message_submitted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;MessageSubmitted&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle user message submission.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_streaming_active&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;

        &lt;span class="c1"&gt;# Add user message to chat panel
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Send to copilot via streaming handler
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;streaming_handler&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;streaming_handler&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ask_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Streaming handler callbacks
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_on_thinking_started&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Called when copilot starts thinking.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_on_stream_started&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;initial_text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Called when streaming starts with initial text.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_streaming_message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;initial_text&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_on_stream_chunk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;full_text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Called for each streaming chunk with accumulated text.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;update_streaming_message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;full_text&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_on_stream_completed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;BasicChat&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Called when streaming completes.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;finalize_streaming_message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Update chat reference if new chat was created
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;old_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;old_chat&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;old_chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_new_chat&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;set_active_chat&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;border_title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Chat: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;new_chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_on_stream_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;error_msg&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Called when a streaming error occurs.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;system&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Error: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;error_msg&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;action_new_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Create a new chat.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_active_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;clear_messages&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;border_title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Chat: New Conversation&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;show_welcome&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;focus&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;action_rename_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Rename the current chat.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;No chat selected&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;warning&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;

        &lt;span class="c1"&gt;# For now, just show a notification - you can implement a dialog later
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Rename chat - Not implemented yet&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;info&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;action_delete_chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Delete the current chat.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;No chat selected&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;warning&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;

        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;chat_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;current_chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;delete&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Deleted chat: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;chat_name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;success&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

            &lt;span class="c1"&gt;# Clear view and reload chats
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;action_new_chat&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chats_list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;load_chats&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error deleting chat: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;action_toggle_ltm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Toggle LTM (Long Term Memory).&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;is_chat_ltm_enabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_chat_ltm_enabled&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;is_chat_ltm_enabled&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;deactivate_ltm&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🧠 Chat LTM disabled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;info&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_ltm_enabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;
            &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;is_system_ltm_running&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_ltm_running&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

                &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;is_system_ltm_running&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;activate_ltm&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🧠 Chat LTM enabled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;success&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_ltm_enabled&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;
                &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🧠 LTM system not running. Please enable it first.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;warning&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Error toggling LTM: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;🚨 &lt;strong&gt;Critical for Thread Safety&lt;/strong&gt;&lt;br&gt;&lt;br&gt;
Always use &lt;code&gt;call_from_thread()&lt;/code&gt; when updating the UI from a background thread; calling widget methods directly from a worker thread will crash the app. The streaming handler runs in a background thread, so every UI update it triggers must go through this method.&lt;/p&gt;
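To make the marshaling concrete, here is a minimal stdlib-only sketch of the pattern that <code>call_from_thread()</code> implements for you: the worker thread never touches UI state directly, it only enqueues the call, and the UI thread drains the queue and runs it. This is purely illustrative; in the app, Textual's own <code>call_from_thread()</code> does this safely under the hood.

```python
import queue
import threading

ui_calls: "queue.Queue" = queue.Queue()
messages: list = []  # stands in for the chat panel's widget state

def add_message(role: str, text: str) -> None:
    # "UI" update: must only ever run on the UI thread
    messages.append(f"{role}: {text}")

def call_from_thread(fn, *args) -> None:
    # Worker-side API: enqueue the call instead of invoking it directly
    ui_calls.put((fn, args))

def worker() -> None:
    # Background streaming handler: safe, because it only enqueues
    call_from_thread(add_message, "assistant", "Hello from the stream")

t = threading.Thread(target=worker)
t.start()
t.join()

# UI thread drains the queue and performs the actual widget updates
while not ui_calls.empty():
    fn, args = ui_calls.get()
    fn(*args)

print(messages)  # -> ['assistant: Hello from the stream']
```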

&lt;p&gt;&lt;strong&gt;Key features:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Split-pane layout (25% sidebar, 75% chat)
&lt;/li&gt;
&lt;li&gt;Handles all custom messages (&lt;code&gt;ChatSelected&lt;/code&gt;, &lt;code&gt;NewChatRequested&lt;/code&gt;, &lt;code&gt;MessageSubmitted&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;Thread-safe UI updates via &lt;code&gt;call_from_thread()&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Keyboard shortcuts (Ctrl+N, Ctrl+D, Ctrl+L, Ctrl+Q)
&lt;/li&gt;
&lt;li&gt;Conversation management
&lt;/li&gt;
&lt;li&gt;LTM toggle support&lt;/li&gt;
&lt;/ul&gt;
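The keyboard shortcuts above map keys to the <code>action_*</code> methods shown earlier (Textual resolves an action name like <code>"new_chat"</code> to <code>action_new_chat()</code>). In Textual this wiring typically lives in a <code>BINDINGS</code> class variable; the sketch below shows the shape of such a table as plain tuples. The exact key/action/description pairings are assumptions for illustration, not taken from the app's source.

```python
from typing import Optional

# Textual-style BINDINGS table: (key, action_name, description).
# Illustrative only; the app's actual bindings may differ.
BINDINGS = [
    ("ctrl+n", "new_chat", "New chat"),
    ("ctrl+d", "delete_chat", "Delete chat"),
    ("ctrl+l", "toggle_ltm", "Toggle LTM"),
    ("ctrl+q", "quit", "Quit"),
]

def action_for(key: str) -> Optional[str]:
    """Return the action name bound to a key, or None if unbound."""
    for bound_key, action, _description in BINDINGS:
        if bound_key == key:
            return action
    return None

print(action_for("ctrl+n"))  # -> new_chat
```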


&lt;h2&gt;
  
  
  Step 4: Update the Main Application
&lt;/h2&gt;

&lt;p&gt;Now update &lt;code&gt;src/pieces_copilot_tui/app.py&lt;/code&gt; to use the full &lt;code&gt;CopilotView&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/app.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI - A Terminal User Interface for Pieces OS.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.app&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;App&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_view&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;CopilotView&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;PiecesCopilotTUI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;App&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI with full features.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="c1"&gt;# CSS styles for the entire app
&lt;/span&gt;    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Screen {
        background: $background;
        color: $text;
    }

    .error {
        color: $error;
        text-style: bold;
    }

    .success {
        color: $success;
        text-style: bold;
    }

    .warning {
        color: $warning;
        text-style: bold;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot_view&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Initialize the application.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

        &lt;span class="c1"&gt;# Initialize Pieces client
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;connect_websocket&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Failed to connect to Pieces OS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exit&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;

        &lt;span class="c1"&gt;# Create and push the full copilot view
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot_view&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;CopilotView&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push_screen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot_view&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;


&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;run_tui&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Run the Pieces Copilot TUI application.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="c1"&gt;# Initialize Pieces client
&lt;/span&gt;    &lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Check if Pieces OS is running
&lt;/span&gt;    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_pieces_running&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error: Pieces OS is not running. Please start Pieces OS first.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt;

    &lt;span class="n"&gt;app&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesCopilotTUI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;


&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;run_tui&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Changes from Part 2:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Import &lt;code&gt;CopilotView&lt;/code&gt; instead of &lt;code&gt;SimpleChatView&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Added &lt;code&gt;warning&lt;/code&gt; CSS class
&lt;/li&gt;
&lt;li&gt;Now using the full-featured view&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Step 5: Run Your Complete TUI
&lt;/h2&gt;

&lt;p&gt;The &lt;code&gt;__main__.py&lt;/code&gt; file is already set up from Part 2. Run it the Pythonic way:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Navigate to src directory&lt;/span&gt;
&lt;span class="nb"&gt;cd &lt;/span&gt;src

&lt;span class="c"&gt;# Run the module&lt;/span&gt;
python &lt;span class="nt"&gt;-m&lt;/span&gt; pieces_copilot_tui
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it! Simple, clean, and Pythonic. ✨&lt;/p&gt;

&lt;p&gt;🎉 &lt;strong&gt;You now have a fully functional Pieces Copilot TUI!&lt;/strong&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  Add Debug Logging
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# In any widget method
&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Current state: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;some_variable&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;log&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Info message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;log&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;warning&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Warning message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;log&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# View in textual console
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Common Issues
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Problem&lt;/strong&gt;: Streaming doesn't start&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Check WebSocket connection
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ WebSocket not connected!&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;connect_websocket&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Problem&lt;/strong&gt;: UI not updating from streaming&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Make sure you're using call_from_thread
&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;call_from_thread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="c1"&gt;# NOT: self.chat_panel.add_message("assistant", text)
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
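&lt;p&gt;To see why this hand-off matters, here is a minimal, Textual-free sketch of what &lt;code&gt;call_from_thread()&lt;/code&gt; accomplishes: the worker thread never mutates UI state itself, it only queues callables for the main loop to run in order. (The queue here is a stand-in for illustration, not Textual's actual mechanism.)&lt;/p&gt;

```python
# Textual-free sketch of the hand-off call_from_thread() performs:
# the worker thread never touches "UI" state directly, it only queues
# callables that the main thread then runs in order.
import queue
import threading

ui_calls = queue.Queue()
transcript = []  # stands in for the chat panel; main thread only


def add_message(role, text):
    transcript.append(f"{role}: {text}")


def worker():
    # Simulates a streaming callback firing on a background thread.
    for chunk in ("Hello", " from", " the stream"):
        ui_calls.put(lambda c=chunk: add_message("assistant", c))
    ui_calls.put(None)  # sentinel: stream finished


threading.Thread(target=worker).start()

# "Main thread": drain queued UI updates until the sentinel arrives.
while (call := ui_calls.get()) is not None:
    call()

print(transcript)  # chunks applied in order, on the main thread
```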



&lt;p&gt;&lt;strong&gt;Problem&lt;/strong&gt;: Messages not loading&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Add error handling
&lt;/span&gt;&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;messages&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;log&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Failed to load messages: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






</description>
      <category>programming</category>
      <category>ai</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Building a TUI with Pieces SDK - Part 2: UI Components</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Mon, 02 Feb 2026 17:57:15 +0000</pubDate>
      <link>https://forem.com/getpieces/building-a-tui-with-pieces-sdk-part-2-ui-components-59l9</link>
      <guid>https://forem.com/getpieces/building-a-tui-with-pieces-sdk-part-2-ui-components-59l9</guid>
      <description>&lt;h2&gt;
  
  
  Building a Pieces Copilot TUI - Part 2: Basic UI Components
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: This tutorial is part of the &lt;a href="https://github.com/pieces-app/cli-agent" rel="noopener noreferrer"&gt;Pieces CLI project&lt;/a&gt;. We welcome contributions! Feel free to open issues, submit PRs, or suggest improvements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Welcome to Part 2! In &lt;a href="https://dev.to/getpieces/building-a-tui-with-pieces-sdk-5eca"&gt;Part 1&lt;/a&gt;, we learned how to use the Pieces OS SDK. Now, we'll start building the Terminal User Interface (TUI) using Textual.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What we'll build in Part 2:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Understanding Textual fundamentals
&lt;/li&gt;
&lt;li&gt;Message display widget
&lt;/li&gt;
&lt;li&gt;Chat panel for conversations
&lt;/li&gt;
&lt;li&gt;User input widget
&lt;/li&gt;
&lt;li&gt;A working basic chat interface&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;What we'll add in Part 3:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Conversations list sidebar
&lt;/li&gt;
&lt;li&gt;Streaming handler integration
&lt;/li&gt;
&lt;li&gt;Full view orchestration
&lt;/li&gt;
&lt;li&gt;Testing &amp;amp; optimization&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Understanding Textual
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://textual.textualize.io/" rel="noopener noreferrer"&gt;Textual&lt;/a&gt; is a modern Python framework for building TUIs. It uses a reactive, component-based architecture similar to React.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pieces-copilot-tui/
├── venv/                                    # Virtual environment
├── requirements.txt                         # Dependencies
├── run.py                                   # Entry point
└── src/                                     # Source code
    └── pieces_copilot_tui/                  # Main package
        ├── __init__.py                      # Package initialization
        ├── app.py                           # Main TUI application
        ├── chat_message.py                  # Individual message widget (Part 2)
        ├── chat_panel.py                    # Message display panel (Part 2)
        ├── chat_input.py                    # User input widget (Part 2)
        ├── streaming_handler.py             # Pieces SDK streaming logic (Part 3)
        ├── chats_list.py                    # Conversations list (Part 3)
        └── chat_view.py                     # Main view orchestrator (Part 3)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In &lt;strong&gt;Part 2&lt;/strong&gt;, we'll build the core UI components. In &lt;strong&gt;Part 3&lt;/strong&gt;, we'll add conversations management and full integration.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 1: Create the Message Widget
&lt;/h2&gt;

&lt;p&gt;Let's start with the foundation - &lt;code&gt;src/pieces_copilot_tui/chat_message.py&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/chat_message.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Chat message widget for displaying individual messages.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.widgets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Static&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.containers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Vertical&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ChatMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Vertical&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Widget to display a single chat message.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    ChatMessage {
        width: 100%;
        height: auto;
        padding: 1 2;
        margin-bottom: 1;
    }

    ChatMessage.user-message {
        background: $primary 20%;
        border-left: thick $primary;
    }

    ChatMessage.assistant-message {
        background: $accent 20%;
        border-left: thick $accent;
    }

    ChatMessage.system-message {
        background: $warning 20%;
        border-left: thick $warning;
    }

    ChatMessage .message-header {
        color: $text-muted;
        text-style: italic;
        margin-bottom: 1;
    }

    ChatMessage .message-content {
        color: $text;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;role&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;timestamp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;strftime&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Today %I:%M %p&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Set CSS class based on role
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_class&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;-message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;compose&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Compose the message widget.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;role_emoji&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;👤&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🤖&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;system&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;⚙️&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="n"&gt;emoji&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;role_emoji&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;emoji&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;title&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; - &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;timestamp&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;message-header&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;message-content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key concepts:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;Vertical&lt;/code&gt; container stacks children vertically
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;compose()&lt;/code&gt; defines child widgets
&lt;/li&gt;
&lt;li&gt;CSS classes are added dynamically based on message role
&lt;/li&gt;
&lt;li&gt;Emojis make it visually appealing! 🎨&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;CSS Variables:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;$primary&lt;/code&gt;, &lt;code&gt;$accent&lt;/code&gt;, &lt;code&gt;$warning&lt;/code&gt; - Textual's built-in color variables
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;$text&lt;/code&gt;, &lt;code&gt;$text-muted&lt;/code&gt; - Text color variables
&lt;/li&gt;
&lt;li&gt;You can customize these in your app's theme
&lt;/li&gt;
&lt;li&gt;A trailing percentage, as in &lt;code&gt;$primary 20%&lt;/code&gt;, applies the color at that opacity&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Step 2: Create the Chat Panel
&lt;/h2&gt;

&lt;p&gt;Now &lt;code&gt;src/pieces_copilot_tui/chat_panel.py&lt;/code&gt; - the scrollable message display:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/chat_panel.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Chat panel widget for displaying conversation messages.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.widgets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Static&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.containers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;VerticalScroll&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_message&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatMessage&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ChatPanel&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;VerticalScroll&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Panel to display chat messages.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    ChatPanel {
        width: 100%;
        height: 1fr;
        border: solid $primary;
        background: $background;
        padding: 1 2;
    }

    ChatPanel:focus {
        border: solid $accent;
    }

    ChatPanel .thinking-indicator {
        color: $warning;
        text-style: italic bold blink;
        text-align: center;
        background: $surface;
        border: dashed $warning;
        padding: 1;
        margin: 1;
    }

    ChatPanel .streaming-message {
        border-left: thick $accent;
        text-style: bold;
    }

    ChatPanel .welcome-message {
        text-align: center;
        margin: 4 2;
        padding: 3;
        border: dashed $primary;
        background: $primary 10%;
        color: $text;
        width: 100%;
        height: auto;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;border_title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Chat&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_messages&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Add a new message to the chat panel.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_clear_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

        &lt;span class="n"&gt;timestamp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;strftime&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Today %I:%M %p&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;message&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_messages&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scroll_end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;animate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Add a thinking indicator.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_clear_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🤔 Thinking...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;thinking-indicator&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scroll_end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;animate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_streaming_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Add a streaming message with cursor.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_clear_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

        &lt;span class="n"&gt;timestamp&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;datetime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;strftime&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Today %I:%M %p&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; ▌&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_class&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;streaming-message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scroll_end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;animate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;update_streaming_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Update the streaming message content.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Update the content of the last Static widget in the streaming message
&lt;/span&gt;            &lt;span class="n"&gt;content_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;.message-content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;content_widget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;content_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; ▌&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;scroll_end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;animate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;finalize_streaming_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Convert streaming message to permanent message.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;content_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;.message-content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;first&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;content_widget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="c1"&gt;# Remove cursor
&lt;/span&gt;                &lt;span class="n"&gt;current_content&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;content_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;renderable&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; ▌&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;content_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;update&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;current_content&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove_class&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;streaming-message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_messages&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;clear_messages&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Clear all messages from the chat panel.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;_clear_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;is_mounted&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_messages&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;clear&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;show_welcome&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Show welcome message.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;welcome_text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;🎯 Pieces Copilot TUI

Type your message below to start chatting!

Press Ctrl+Q to quit.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

        &lt;span class="n"&gt;welcome&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Static&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;welcome_text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;classes&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;welcome-message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;welcome&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_clear_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Remove thinking indicator if present.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;is_mounted&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;is_streaming_active&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;bool&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Check if streaming or thinking is currently active.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_streaming_widget&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_thinking_widget&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key features:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;VerticalScroll&lt;/code&gt; makes it scrollable
&lt;/li&gt;
&lt;li&gt;Thinking indicator shows "🤔 Thinking..."
&lt;/li&gt;
&lt;li&gt;Streaming shows content + cursor (" ▌")
&lt;/li&gt;
&lt;li&gt;Auto-scrolls to bottom on new messages
&lt;/li&gt;
&lt;li&gt;Welcome message for first-time users&lt;/li&gt;
&lt;/ul&gt;
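&lt;p&gt;The streaming lifecycle (&lt;code&gt;add_streaming_message&lt;/code&gt; → &lt;code&gt;update_streaming_message&lt;/code&gt; → &lt;code&gt;finalize_streaming_message&lt;/code&gt;) can be sketched with a plain-Python stand-in. The &lt;code&gt;StreamingBuffer&lt;/code&gt; class below is purely illustrative; it mirrors only the cursor handling, not the Textual widgets:&lt;/p&gt;

```python
# Plain-Python sketch of the ChatPanel streaming lifecycle.
# StreamingBuffer is an illustrative stand-in, not part of the app.
CURSOR = " ▌"


class StreamingBuffer:
    """Accumulates streamed content and renders it with a trailing cursor."""

    def __init__(self):
        self._content = ""
        self._active = False

    def start(self):
        # Mirrors add_streaming_message(): begin a fresh streamed reply.
        self._content = ""
        self._active = True

    def update(self, content: str):
        # Mirrors update_streaming_message(): replace the visible content.
        self._content = content

    def render(self) -> str:
        # While streaming, show the content plus the cursor glyph.
        return self._content + CURSOR if self._active else self._content

    def finalize(self) -> str:
        # Mirrors finalize_streaming_message(): drop the cursor.
        self._active = False
        return self.render()


buf = StreamingBuffer()
buf.start()
buf.update("Hello")
assert buf.render() == "Hello ▌"
buf.update("Hello, world")
assert buf.finalize() == "Hello, world"
```

The real widget does the same thing, except the content lives in a mounted &lt;code&gt;Static&lt;/code&gt; and the cursor is stripped with &lt;code&gt;str.replace&lt;/code&gt; on finalize.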




&lt;h2&gt;
  
  
  Step 3: Create the Input Widget
&lt;/h2&gt;

&lt;p&gt;This widget is simple but essential. Create &lt;code&gt;src/pieces_copilot_tui/chat_input.py&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/chat_input.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Chat input widget for user message entry.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.widgets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Input&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.message&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Message&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;MessageSubmitted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Message&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Message emitted when user submits input.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ChatInput&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Input&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Input widget for chat messages.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    ChatInput {
        width: 100%;
        height: 3;
        background: $surface;
        border: solid $primary;
        padding: 0 1;
        margin: 0;
        dock: bottom;
    }

    ChatInput:focus {
        border: solid $accent;
        background: $panel;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;placeholder&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Type your message here...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_input_submitted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Input&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Submitted&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle input submission.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;strip&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nc"&gt;MessageSubmitted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;strip&lt;/span&gt;&lt;span class="p"&gt;()))&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
            &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stop&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Focus the input when mounted.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;focus&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Textual messages:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Custom &lt;code&gt;MessageSubmitted&lt;/code&gt; message bubbles up to parent
&lt;/li&gt;
&lt;li&gt;Parent widgets can listen for this message
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;event.stop()&lt;/code&gt; prevents further propagation
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;post_message()&lt;/code&gt; sends the message up the widget tree&lt;/li&gt;
&lt;/ul&gt;
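&lt;p&gt;Textual finds the handler by naming convention: it converts the message class name from CamelCase to snake_case and prefixes &lt;code&gt;on_&lt;/code&gt; (messages nested inside a widget, such as &lt;code&gt;Input.Submitted&lt;/code&gt;, also pick up the widget's name, giving &lt;code&gt;on_input_submitted&lt;/code&gt;). The helper below only illustrates that convention; it is a hypothetical sketch, not part of Textual's API:&lt;/p&gt;

```python
import re


def handler_name(message_class_name: str) -> str:
    """Illustrate Textual's convention: 'on_' plus the message class name
    converted from CamelCase to snake_case. (Hypothetical helper; Textual
    computes the handler name internally.)"""
    snake = re.sub(r"([a-z0-9])([A-Z])", r"\1_\2", message_class_name)
    return "on_" + snake.lower()


# Our custom message maps to the handler defined in Step 4:
assert handler_name("MessageSubmitted") == "on_message_submitted"
```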

&lt;p&gt;&lt;strong&gt;How it works:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;User types message and presses Enter
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;on_input_submitted()&lt;/code&gt; is called
&lt;/li&gt;
&lt;li&gt;Creates &lt;code&gt;MessageSubmitted&lt;/code&gt; message with the text
&lt;/li&gt;
&lt;li&gt;Parent widget receives it via &lt;code&gt;on_message_submitted()&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Input is cleared for next message&lt;/li&gt;
&lt;/ol&gt;
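&lt;p&gt;The five steps above can be walked through with stand-in objects (&lt;code&gt;FakePanel&lt;/code&gt; and &lt;code&gt;FakeInput&lt;/code&gt; are hypothetical stubs, not Textual widgets):&lt;/p&gt;

```python
# Stand-in objects for the submission flow; illustrative only.
class FakePanel:
    """Collects (role, content) pairs like ChatPanel.add_message."""

    def __init__(self):
        self.messages = []

    def add_message(self, role, content):
        self.messages.append((role, content))


class FakeInput:
    """Mimics ChatInput: strip the value, notify the parent, then clear."""

    def __init__(self, on_message_submitted):
        self.value = ""
        self._on_message_submitted = on_message_submitted

    def submit(self):
        text = self.value.strip()
        if text:
            # Steps 3-4: "post" the message and let the parent handle it
            self._on_message_submitted(text)
            # Step 5: clear the input for the next message
            self.value = ""


panel = FakePanel()
chat_input = FakeInput(lambda text: panel.add_message("user", text))

chat_input.value = "  hello pieces  "  # Step 1: user types and presses Enter
chat_input.submit()                    # Step 2: on_input_submitted fires

assert panel.messages == [("user", "hello pieces")]
assert chat_input.value == ""
```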




&lt;h2&gt;
  
  
  Step 4: Create a Simple View (For Testing)
&lt;/h2&gt;

&lt;p&gt;Let's create a basic &lt;code&gt;src/pieces_copilot_tui/chat_view.py&lt;/code&gt; to test our components:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/chat_view.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Simple chat view for testing (will be expanded in Part 3).&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.screen&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Screen&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.containers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Vertical&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.binding&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Binding&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.widgets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Footer&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_panel&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatPanel&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_input&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatInput&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;MessageSubmitted&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SimpleChatView&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Screen&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Simple chat view for testing our components.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="n"&gt;BINDINGS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="nc"&gt;Binding&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ctrl+q&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;quit&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Quit&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;]&lt;/span&gt;

    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    SimpleChatView {
        layout: vertical;
    }

    SimpleChatView Vertical {
        width: 100%;
        height: 100%;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;compose&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Compose the view.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nc"&gt;Vertical&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatPanel&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;

                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ChatInput&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_input&lt;/span&gt;

        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="nc"&gt;Footer&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Initialize the view.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="c1"&gt;# Show welcome message
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;show_welcome&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_message_submitted&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;MessageSubmitted&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handle user message submission.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="c1"&gt;# Add user message
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Show thinking indicator
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_thinking_indicator&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

        &lt;span class="c1"&gt;# In Part 3, we'll connect this to the streaming handler
&lt;/span&gt;        &lt;span class="c1"&gt;# For now, just simulate a response
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;simulate_response&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;simulate_response&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;user_message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Simulate a response (will be replaced with real streaming in Part 3).&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;

        &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_response&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sleep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_panel&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;assistant&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Echo: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;user_message&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s"&gt;(Real AI responses coming in Part 3!)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;asyncio&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create_task&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;add_response&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;💡 &lt;strong&gt;Note&lt;/strong&gt;: This is a simplified version for testing. In Part 3, we'll replace &lt;code&gt;simulate_response()&lt;/code&gt; with real streaming from Pieces OS and add the conversations list sidebar.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 5: Create the Main Application
&lt;/h2&gt;

&lt;p&gt;Now let's create &lt;code&gt;src/pieces_copilot_tui/app.py&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/app.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI - A Terminal User Interface for Pieces OS.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;textual.app&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;App&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.chat_view&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;SimpleChatView&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;PiecesCopilotTUI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;App&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI (Basic version for Part 2).&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="c1"&gt;# CSS styles for the entire app
&lt;/span&gt;    &lt;span class="n"&gt;DEFAULT_CSS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Screen {
        background: $background;
        color: $text;
    }

    .error {
        color: $error;
        text-style: bold;
    }

    .success {
        color: $success;
        text-style: bold;
    }
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="nf"&gt;super&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;**&lt;/span&gt;&lt;span class="n"&gt;kwargs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_view&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Initialize the application.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

        &lt;span class="c1"&gt;# Initialize Pieces client
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;connect_websocket&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;notify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Failed to connect to Pieces OS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;severity&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exit&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;

        &lt;span class="c1"&gt;# Create and push the simple chat view
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_view&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SimpleChatView&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push_screen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat_view&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;


&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;run_tui&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Run the Pieces Copilot TUI application.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="c1"&gt;# Initialize Pieces client
&lt;/span&gt;    &lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="c1"&gt;# Check if Pieces OS is running
&lt;/span&gt;    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_pieces_running&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error: Pieces OS is not running. Please start Pieces OS first.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt;

    &lt;span class="n"&gt;app&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesCopilotTUI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;run&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;


&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;run_tui&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key points:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We inject &lt;code&gt;PiecesClient&lt;/code&gt; for better testability
&lt;/li&gt;
&lt;li&gt;CSS is used for styling (similar to web CSS!)
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;on_mount()&lt;/code&gt; is called when the app starts
&lt;/li&gt;
&lt;li&gt;Error handling for Pieces OS connection&lt;/li&gt;
&lt;/ul&gt;
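&lt;p&gt;The dependency-injection point is easy to exercise. Below is a minimal sketch of a stub client for tests; &lt;code&gt;FakePiecesClient&lt;/code&gt; is a hypothetical name and only mimics the two methods the app calls during startup, not the real SDK surface:&lt;/p&gt;

```python
# Hypothetical test stub: mimics only the two PiecesClient methods that
# PiecesCopilotTUI calls on startup (an assumption based on the code
# above; this class is not part of the real SDK).
class FakePiecesClient:
    def is_pieces_running(self) -> bool:
        # Pretend PiecesOS is always reachable
        return True

    def connect_websocket(self) -> bool:
        # Pretend the websocket handshake always succeeds
        return True

# Because the constructor accepts any client, a test could run:
#   app = PiecesCopilotTUI(pieces_client=FakePiecesClient())
stub = FakePiecesClient()
print(stub.is_pieces_running())  # True
```

&lt;p&gt;This keeps UI tests runnable on machines where PiecesOS isn't installed.&lt;/p&gt;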




&lt;h2&gt;
  
  
  Step 6: Create the Package Initialization
&lt;/h2&gt;

&lt;p&gt;Create &lt;code&gt;src/pieces_copilot_tui/__init__.py&lt;/code&gt; to make it a proper Python package:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/__init__.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Pieces Copilot TUI Package.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="n"&gt;__version__&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;0.1.0&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.app&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesCopilotTUI&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;run_tui&lt;/span&gt;

&lt;span class="n"&gt;__all__&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;PiecesCopilotTUI&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;run_tui&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This allows the relative imports (&lt;code&gt;.chat_message&lt;/code&gt;, &lt;code&gt;.chat_panel&lt;/code&gt;, etc.) to work correctly and exports the main functions.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 7: Create the Module Entry Point
&lt;/h2&gt;

&lt;p&gt;Create &lt;code&gt;src/pieces_copilot_tui/__main__.py&lt;/code&gt; to make the package executable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# src/pieces_copilot_tui/__main__.py
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
Command-line entry point for Pieces Copilot TUI.

This allows the package to be run with: python -m pieces_copilot_tui
&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;.app&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;run_tui&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;run_tui&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the Pythonic way to make a package runnable with &lt;code&gt;python -m package_name&lt;/code&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 8: Run Your Basic TUI
&lt;/h2&gt;

&lt;p&gt;The cleanest way to run the TUI is from the &lt;code&gt;src&lt;/code&gt; directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Navigate to src directory&lt;/span&gt;
&lt;span class="nb"&gt;cd &lt;/span&gt;src

&lt;span class="c"&gt;# Run the module&lt;/span&gt;
python &lt;span class="nt"&gt;-m&lt;/span&gt; pieces_copilot_tui
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it! No complex setup, no PYTHONPATH manipulation, just clean Python!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What you should see:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A welcome message in the center
&lt;/li&gt;
&lt;li&gt;An input box at the bottom
&lt;/li&gt;
&lt;li&gt;You can type messages and see them displayed
&lt;/li&gt;
&lt;li&gt;Simulated responses after 1 second
&lt;/li&gt;
&lt;li&gt;Press Ctrl+Q to quit&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Recap
&lt;/h2&gt;

&lt;p&gt;In Part 2, we built the &lt;strong&gt;foundation of our TUI&lt;/strong&gt;:&lt;/p&gt;

&lt;h3&gt;
  
  
  Files Created
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/chat_message.py&lt;/code&gt; - Individual message widget
&lt;/li&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/chat_panel.py&lt;/code&gt; - Scrollable message display
&lt;/li&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/chat_input.py&lt;/code&gt; - User input widget
&lt;/li&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/chat_view.py&lt;/code&gt; - Simple view (basic version)
&lt;/li&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/app.py&lt;/code&gt; - Main application
&lt;/li&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/__init__.py&lt;/code&gt; - Package definition
&lt;/li&gt;
&lt;li&gt;✅ &lt;code&gt;src/pieces_copilot_tui/__main__.py&lt;/code&gt; - Module entry point&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  What's Next in Part 3?
&lt;/h2&gt;

&lt;p&gt;In &lt;strong&gt;Part 3: Advanced Features &amp;amp; Integration&lt;/strong&gt;, we'll add:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;📋 &lt;strong&gt;Conversations List&lt;/strong&gt; - Sidebar showing all your chats
&lt;/li&gt;
&lt;li&gt;🌊 &lt;strong&gt;Real Streaming&lt;/strong&gt; - Connect to Pieces OS for actual AI responses
&lt;/li&gt;
&lt;li&gt;🎯 &lt;strong&gt;Full Orchestration&lt;/strong&gt; - Complete chat_view.py with all features
&lt;/li&gt;
&lt;li&gt;🧠 &lt;strong&gt;LTM Support&lt;/strong&gt; - Long-Term Memory toggle
&lt;/li&gt;
&lt;li&gt;⚡ &lt;strong&gt;Advanced Features&lt;/strong&gt; - Delete, rename, create conversations
&lt;/li&gt;
&lt;li&gt;🧪 &lt;strong&gt;Testing &amp;amp; Optimization&lt;/strong&gt; - Make it production-ready&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Preview of Part 3 components:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;chats_list.py&lt;/code&gt; - Full conversations management
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;streaming_handler.py&lt;/code&gt; - Real-time streaming from Pieces
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;chat_view.py&lt;/code&gt; - Complete orchestrator (expanded version)
&lt;/li&gt;
&lt;li&gt;Testing, debugging, and performance tips&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Troubleshooting
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Textual Not Found
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;textual&amp;gt;&lt;span class="o"&gt;=&lt;/span&gt;0.47.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  UI Not Updating
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Make sure you're using self.mount() to add widgets
&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;mount&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Not just appending to a list
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Ready for Part 3?
&lt;/h2&gt;

&lt;p&gt;You now have a working TUI foundation! In Part 3, we'll transform this into a fully-featured Pieces Copilot interface with real AI streaming, conversations management, and all the bells and whistles.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Continue to &lt;a href="https://dev.to/getpieces/building-a-tui-with-pieces-sdk-part-3-advanced-features-3i0o"&gt;Part 3&lt;/a&gt;&lt;/strong&gt; when you're ready! 🚀  &lt;/p&gt;

</description>
      <category>ui</category>
      <category>softwaredevelopment</category>
      <category>tutorial</category>
      <category>learning</category>
    </item>
    <item>
      <title>Building a TUI with Pieces SDK</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Mon, 02 Feb 2026 17:57:03 +0000</pubDate>
      <link>https://forem.com/getpieces/building-a-tui-with-pieces-sdk-5eca</link>
      <guid>https://forem.com/getpieces/building-a-tui-with-pieces-sdk-5eca</guid>
      <description>&lt;h2&gt;
  
  
  Building a Pieces Copilot TUI - Part 1: Getting Started with PiecesOS SDK
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Note&lt;/strong&gt;: This tutorial is part of the &lt;a href="https://github.com/pieces-app/cli-agent" rel="noopener noreferrer"&gt;Pieces CLI&lt;/a&gt;. We welcome contributions! Feel free to open issues, submit PRs, or suggest improvements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this two-part tutorial series, we'll build a fully functional Terminal User Interface (TUI) for Pieces Copilot from scratch. In Part 1, we'll explore the PiecesOS SDK and learn how to interact with PiecesOS programmatically. In Part 2, we'll create a beautiful TUI using &lt;a href="https://textual.textualize.io/" rel="noopener noreferrer"&gt;Textual&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What we'll build:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A chat interface with streaming responses
&lt;/li&gt;
&lt;li&gt;Chat management (create, view, delete)
&lt;/li&gt;
&lt;li&gt;Long-Term Memory (LTM) support
&lt;/li&gt;
&lt;li&gt;Real-time UI updates&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Python 3.8+ (check with &lt;code&gt;python --version&lt;/code&gt;)
&lt;/li&gt;
&lt;li&gt;PiecesOS installed and running (&lt;a href="https://docs.pieces.app/products/meet-pieces" rel="noopener noreferrer"&gt;Download here&lt;/a&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 1: Setting Up Your Environment
&lt;/h2&gt;

&lt;p&gt;Open or create a new folder for this project. You can name it whatever you’d like. Inside the folder, follow the steps below:&lt;/p&gt;

&lt;h3&gt;
  
  
  Create a Virtual Environment
&lt;/h3&gt;

&lt;p&gt;First, let's create an isolated Python environment for our project. Open a terminal and run the commands for your operating system, one at a time:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;
&lt;span class="c"&gt;# Create a virtual environment&lt;/span&gt;
python &lt;span class="nt"&gt;-m&lt;/span&gt; venv venv

&lt;span class="c"&gt;# Activate the virtual environment&lt;/span&gt;
&lt;span class="c"&gt;## On macOS/Linux:&lt;/span&gt;
&lt;span class="nb"&gt;source &lt;/span&gt;venv/bin/activate

&lt;span class="c"&gt;## On Windows:&lt;/span&gt;
venv&lt;span class="se"&gt;\S&lt;/span&gt;cripts&lt;span class="se"&gt;\a&lt;/span&gt;ctivate
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Install Dependencies
&lt;/h3&gt;

&lt;p&gt;Still inside your project folder, create a file called &lt;code&gt;requirements.txt&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# PiecesOS Python SDK for interacting with PiecesOS
pieces-os-client&amp;gt;=3.0.0

# Textual TUI framework for building terminal user interfaces
textual[syntax]&amp;gt;=5.3.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From a command terminal, install the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 2: Connecting to PiecesOS
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Initialize the Pieces Client
&lt;/h3&gt;

&lt;p&gt;The &lt;code&gt;PiecesClient&lt;/code&gt; is your gateway to PiecesOS. Let's create a simple script to connect. Create a new file called &lt;code&gt;test_connection.py&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# test_connection.py
&lt;/span&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;

&lt;span class="c1"&gt;# Initialize the client
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="c1"&gt;# Check if PiecesOS is running
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_pieces_running&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Connected to PiecesOS!&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Version: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;version&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ PiecesOS is not running. Please start it first.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Go ahead and test it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python test_connection.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;What's happening here?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;PiecesClient()&lt;/code&gt; automatically discovers your PiecesOS instance port
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;is_pieces_running()&lt;/code&gt; checks if PiecesOS is accessible
&lt;/li&gt;
&lt;li&gt;The client handles port scanning and WebSocket connections&lt;/li&gt;
&lt;/ul&gt;
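&lt;p&gt;Since PiecesOS may still be starting up when your script runs, a small retry loop can make the connection check more forgiving. This helper is a sketch of our own making, not an SDK feature; it only assumes the &lt;code&gt;is_pieces_running()&lt;/code&gt; method shown above:&lt;/p&gt;

```python
import time

def wait_for_pieces(client, attempts: int = 3, delay: float = 2.0) -> bool:
    """Poll PiecesOS a few times before giving up.

    `client` is anything with an is_pieces_running() method, e.g. the
    PiecesClient instance from the snippet above.
    """
    for _ in range(attempts):
        if client.is_pieces_running():
            return True
        # Give PiecesOS a moment to finish starting, then try again
        time.sleep(delay)
    return False
```

&lt;p&gt;Call &lt;code&gt;wait_for_pieces(client)&lt;/code&gt; before proceeding, and print the error message only if it returns &lt;code&gt;False&lt;/code&gt;.&lt;/p&gt;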




&lt;h2&gt;
  
  
  Step 3: Working with Chats
&lt;/h2&gt;

&lt;p&gt;Now let’s work with chats! First we’ll list all of your existing chats, then create a new chat, and finally read back its messages:&lt;/p&gt;

&lt;h3&gt;
  
  
  List All Chats
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Add this section to the inside of 'if client.is_pieces_running()', as this all should only run if Pieces can connect.
&lt;/span&gt;
&lt;span class="c1"&gt;# Get all chats
&lt;/span&gt;&lt;span class="n"&gt;chats&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Found &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; chats:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;  - &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;summary&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Create a New Chat
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Place this right below the code you just added above
&lt;/span&gt;
&lt;span class="c1"&gt;# chats are created automatically when you ask a question
# without setting an active chat
&lt;/span&gt;
&lt;span class="c1"&gt;# Clear current chat to create a new one when we call the stream_question method
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

&lt;span class="c1"&gt;# You can also do this if you want to create one on the spot
# client.copilot.create_chat("My awesome chat")
&lt;/span&gt;

&lt;span class="c1"&gt;# Ask a question - this will create a new chat
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stream_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;What is Python?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Load Chat Messages
&lt;/h3&gt;

&lt;p&gt;To see the conversation on screen, print each message’s &lt;code&gt;raw_content&lt;/code&gt; as it comes back from PiecesOS:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Add this after 'client.copilot.stream_question("What is Python?")'
&lt;/span&gt;
&lt;span class="c1"&gt;# Get all messages in the chat
&lt;/span&gt;&lt;span class="n"&gt;messages&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;role&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;raw_content&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 4: Asking Questions with Streaming
&lt;/h2&gt;

&lt;p&gt;One of the most powerful features is streaming responses. Instead of waiting for the entire response, you get chunks as they're generated, like you’d see in ChatGPT.&lt;/p&gt;

&lt;h3&gt;
  
  
  Basic Streaming Example
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;

&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;handle_stream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Callback function for streaming responses.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;INITIALIZED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🤔 Thinking...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;IN-PROGRESS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="c1"&gt;# Get the text chunks
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;answers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;answers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;iterable&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;end&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;flush&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;COMPLETED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;✅ Done!&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;FAILED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;❌ Error: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;error_message&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Register the callback
&lt;/span&gt;&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_message_callback&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;handle_stream&lt;/span&gt;

&lt;span class="c1"&gt;# Ask a question
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stream_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Explain Python decorators&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;⚠️ &lt;strong&gt;Error Handling Tip&lt;/strong&gt;: In production, always handle WebSocket disconnections gracefully. The SDK will attempt to reconnect, but you should still surface connection status to your users.&lt;/p&gt;
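&lt;p&gt;As a sketch of that advice, here is a small, SDK-agnostic retry helper. Nothing in it is part of the Pieces SDK (the name &lt;code&gt;with_reconnect&lt;/code&gt;, its parameters, and the callback are illustrative), but you could wrap a call such as &lt;code&gt;stream_question&lt;/code&gt; in it to surface connection status to users:&lt;/p&gt;

```python
import time

def with_reconnect(action, retries=3, base_delay=1.0, notify=print):
    """Run `action`, retrying with exponential backoff on ConnectionError.

    `action` stands in for any call whose connection may drop (for example,
    a stream_question call). This helper knows nothing about Pieces itself.
    """
    for attempt in range(1, retries + 1):
        try:
            return action()
        except ConnectionError as exc:
            # Tell the user what happened before retrying
            notify(f"Connection lost ({exc}); retry {attempt}/{retries}")
            time.sleep(base_delay * 2 ** (attempt - 1))
    raise ConnectionError(f"Still failing after {retries} retries")
```

&lt;p&gt;With &lt;code&gt;notify&lt;/code&gt; wired to your UI instead of &lt;code&gt;print&lt;/code&gt;, users see connection status rather than a silent hang.&lt;/p&gt;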

&lt;h3&gt;
  
  
  Understanding the Streaming Flow
&lt;/h3&gt;

&lt;p&gt;The streaming response goes through several states:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;INITIALIZED&lt;/strong&gt;: Copilot is preparing to respond
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;IN-PROGRESS&lt;/strong&gt;: Streaming text chunks
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;COMPLETED&lt;/strong&gt;: Response is complete
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;FAILED/STOPPED/CANCELED&lt;/strong&gt;: Something went wrong&lt;/li&gt;
&lt;/ol&gt;
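&lt;p&gt;One tidy way to organize these states is a dispatch table. Only the status strings below come from the SDK; the handler names are illustrative, and each state's logic lives in its own small function:&lt;/p&gt;

```python
# Map each streaming status to a small handler function.

def on_init(response):
    print("🤔 Thinking...")

def on_progress(response):
    # Extract and print text chunks here, as in the example above
    pass

def on_done(response):
    print("\n✅ Done!")

def on_failure(response):
    print("\n❌ Something went wrong")

DISPATCH = {
    "INITIALIZED": on_init,
    "IN-PROGRESS": on_progress,
    "COMPLETED": on_done,
    "FAILED": on_failure,
    "STOPPED": on_failure,
    "CANCELED": on_failure,
}

def handle_stream(response):
    # Unknown statuses are ignored rather than crashing the callback
    handler = DISPATCH.get(response.status)
    if handler:
        handler(response)
```

&lt;p&gt;This keeps the callback you register with the SDK to a single line per state, which makes it easy to extend when new statuses appear.&lt;/p&gt;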




&lt;h2&gt;
  
  
  Step 5: Handling Streaming Properly
&lt;/h2&gt;

&lt;p&gt;Let's create a more robust streaming handler:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# streaming_handler.py
&lt;/span&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;pieces_os_client.wrapper&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Optional&lt;/span&gt;

&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;StreamingHandler&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Handles streaming responses from Pieces Copilot.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="n"&gt;on_thinking_started&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_text_chunk&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_completed&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
        &lt;span class="n"&gt;on_error&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Callable&lt;/span&gt;&lt;span class="p"&gt;[[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pieces_client&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_thinking_started&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_thinking_started&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_text_chunk&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_text_chunk&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_completed&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_completed&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_error&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;on_error&lt;/span&gt;

        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;

        &lt;span class="c1"&gt;# Register callback
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ask_stream_ws&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;on_message_callback&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_handle_stream&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;ask_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Ask a question and handle streaming.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_thinking_started&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stream_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;_handle_stream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Internal stream handler.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status&lt;/span&gt;

        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;IN-PROGRESS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;answers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;question&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;answers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;iterable&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                            &lt;span class="c1"&gt;# First chunk
&lt;/span&gt;                            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;
                        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                            &lt;span class="c1"&gt;# Subsequent chunks
&lt;/span&gt;                            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;

                        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_text_chunk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;answer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;COMPLETED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_completed&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;

        &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;status&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;FAILED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;STOPPED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;CANCELED&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
            &lt;span class="n"&gt;error_msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;getattr&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error_message&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Unknown error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;error_msg&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;

        &lt;span class="nf"&gt;except &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;AttributeError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;ConnectionError&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;ValueError&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Handle specific streaming errors
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;_current_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;💡 &lt;strong&gt;Best Practice&lt;/strong&gt;: Use specific exception types rather than broad &lt;code&gt;except Exception&lt;/code&gt; handlers. This makes debugging easier and prevents masking unexpected errors.&lt;/p&gt;
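&lt;p&gt;A quick illustration of the difference (the &lt;code&gt;parse_chunk&lt;/code&gt; function and its input shape are made up for this example):&lt;/p&gt;

```python
def parse_chunk(raw):
    """Pull the text out of a chunk-like dict, tolerating malformed input."""
    try:
        return raw["answer"]["text"]
    except (KeyError, TypeError) as exc:
        # Only the failure modes we expect from a malformed chunk are caught;
        # anything unexpected (say, a typo'd name) still surfaces loudly.
        return f"[malformed chunk: {exc}]"

print(parse_chunk({"answer": {"text": "hello"}}))  # hello
print(parse_chunk(None))                           # [malformed chunk: ...]
```

&lt;p&gt;Had we written &lt;code&gt;except Exception&lt;/code&gt;, a genuine bug inside the handler would have been swallowed and reported as a malformed chunk.&lt;/p&gt;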

&lt;h3&gt;
  
  
  Using the StreamingHandler
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Example usage
&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PiecesClient&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_thinking&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🤔 Thinking...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_chunk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;end&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;flush&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_done&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;✅ Done!&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;close&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;on_error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;error&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;❌ Error: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;error&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;handler&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;StreamingHandler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;pieces_client&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;on_thinking_started&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;on_thinking&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;on_text_chunk&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;on_chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;on_completed&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;on_done&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;on_error&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;on_error&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Ask a question
&lt;/span&gt;&lt;span class="n"&gt;handler&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ask_question&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;What are Python generators?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 6: Working with Long-Term Memory (LTM)
&lt;/h2&gt;

&lt;p&gt;LTM allows Pieces to remember context across chats, making responses more personalized.&lt;/p&gt;

&lt;h3&gt;
  
  
  Check LTM Status
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Check if LTM system is running - place after ensure_initialization()
&lt;/span&gt;    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ltm&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_enabled&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ LTM system is running&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ LTM system is not available&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;# client.copilot.ltm.enable()
&lt;/span&gt;
    &lt;span class="c1"&gt;# Check if current chat has LTM enabled - place after LTM system check
&lt;/span&gt;    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ltm&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;is_chat_ltm_enabled&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Chat LTM is enabled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Chat LTM is disabled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Toggle LTM for Chat
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt; &lt;span class="c1"&gt;# Enable LTM for current chat - place after LTM checks
&lt;/span&gt; &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ltm&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;is_enabled&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
     &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;ltm&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;enable&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
     &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Chat LTM enabled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
 &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
     &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ LTM system must be running first&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

 &lt;span class="c1"&gt;# Disable LTM for current chat - place after enable LTM
&lt;/span&gt;    &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;copilot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat_disable_ltm&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Chat LTM disabled&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 7: Managing Chats
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Delete a Chat
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Get a chat to delete - place after getting all chats
&lt;/span&gt;&lt;span class="n"&gt;chat_to_delete&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="c1"&gt;# Delete a chat - place after getting chat to delete
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Deleting: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;chat_to_delete&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;chat_to_delete&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;delete&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Rename a chat
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Get a chat to rename - place after getting all chats
&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;chats&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="c1"&gt;# Rename a chat - place after getting chat
&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;My New Chat Name&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Renamed to: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Common Issues &amp;amp; Troubleshooting
&lt;/h2&gt;

&lt;h3&gt;
  
  
  "Connection refused" Error
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;✅ Ensure PiecesOS is running
&lt;/li&gt;
&lt;li&gt;✅ Check if port 39300 is available&lt;/li&gt;
&lt;/ul&gt;
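&lt;p&gt;If the connection is refused, a quick sanity check is to see whether anything is listening on the PiecesOS port at all. A minimal sketch (port 39300 comes from the checklist above; adjust it if your install uses a different port):&lt;/p&gt;

```python
import socket

def piecesos_reachable(host="localhost", port=39300, timeout=2.0):
    """Return True if something is listening on the PiecesOS port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and DNS failures alike
        return False

if piecesos_reachable():
    print("✅ PiecesOS is reachable")
else:
    print("❌ Nothing is listening on port 39300 - is PiecesOS running?")
```

&lt;p&gt;If this prints the failure message, start PiecesOS first before retrying the SDK connection.&lt;/p&gt;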

&lt;h3&gt;
  
  
  "Module not found" Error
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;✅ Activate your virtual environment
&lt;/li&gt;
&lt;li&gt;✅ Reinstall dependencies: &lt;code&gt;pip install -r requirements.txt&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;✅ Check Python version: &lt;code&gt;python --version&lt;/code&gt; (needs 3.8+)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Streaming Not Working
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;✅ Ensure callback is registered before &lt;code&gt;stream_question()&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;✅ Check that PiecesOS is connected and responsive&lt;/li&gt;
&lt;/ul&gt;
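&lt;p&gt;The "register the callback before &lt;code&gt;stream_question()&lt;/code&gt;" rule is worth internalizing. Here is an illustrative sketch of the ordering; the &lt;code&gt;StreamingClient&lt;/code&gt; class is a stand-in to show the pattern, not the real SDK (only the &lt;code&gt;stream_question&lt;/code&gt; name mirrors this tutorial):&lt;/p&gt;

```python
class StreamingClient:
    """Stand-in client showing why the callback must exist before streaming."""

    def __init__(self):
        self._callback = None

    def on_chunk(self, callback):
        # Step 1: register the handler for incoming chunks
        self._callback = callback

    def stream_question(self, question):
        # Step 2: stream; fail loudly if no handler was registered
        if self._callback is None:
            raise RuntimeError("Register a callback before calling stream_question()")
        for chunk in ("Hello", ", ", "world"):  # simulated token stream
            self._callback(chunk)

received = []
client = StreamingClient()
client.on_chunk(received.append)           # register first...
client.stream_question("What is Pieces?")  # ...then stream
print("".join(received))  # → Hello, world
```

&lt;p&gt;Failing loudly when the order is reversed, instead of silently dropping chunks, makes this mistake easy to spot.&lt;/p&gt;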




&lt;h2&gt;
  
  
  Recap
&lt;/h2&gt;

&lt;p&gt;In Part 1, we covered:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;✅ Setting up a Python virtual environment
&lt;/li&gt;
&lt;li&gt;✅ Installing the PiecesOS SDK
&lt;/li&gt;
&lt;li&gt;✅ Connecting to PiecesOS
&lt;/li&gt;
&lt;li&gt;✅ Working with chats
&lt;/li&gt;
&lt;li&gt;✅ Handling streaming responses
&lt;/li&gt;
&lt;li&gt;✅ Managing Long-Term Memory
&lt;/li&gt;
&lt;li&gt;✅ Creating a robust streaming handler&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;p&gt;In &lt;strong&gt;Part 2&lt;/strong&gt;, we'll use everything we learned to build a beautiful Terminal User Interface (TUI) with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Split-pane layout for chats and messages
&lt;/li&gt;
&lt;li&gt;Real-time streaming chat interface
&lt;/li&gt;
&lt;li&gt;Interactive widgets and keyboard shortcuts
&lt;/li&gt;
&lt;li&gt;Proper state management and UI updates&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Useful Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://docs.pieces.app" rel="noopener noreferrer"&gt;PiecesOS Documentation&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/pieces-app/pieces-os-client-sdk-for-python" rel="noopener noreferrer"&gt;Python SDK Reference&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/pieces-app/cli-agent" rel="noopener noreferrer"&gt;Pieces CLI on GitHub&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://discord.gg/getpieces" rel="noopener noreferrer"&gt;Join our Discord&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;strong&gt;Ready for &lt;a href="https://dev.to/getpieces/building-a-tui-with-pieces-sdk-part-2-ui-components-59l9"&gt;Part 2&lt;/a&gt;?&lt;/strong&gt; Let's build the TUI! 🚀  &lt;/p&gt;

</description>
      <category>softwaredevelopment</category>
      <category>howto</category>
      <category>ai</category>
      <category>pieces</category>
    </item>
    <item>
      <title>International Women’s Day: Spotlighting Our Women in Tech</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Fri, 29 Mar 2024 17:32:04 +0000</pubDate>
      <link>https://forem.com/getpieces/international-womens-day-spotlighting-our-women-in-tech-27dm</link>
      <guid>https://forem.com/getpieces/international-womens-day-spotlighting-our-women-in-tech-27dm</guid>
<description>&lt;p&gt;The 8th of March was International Women’s Day, a day dedicated to celebrating and cherishing the hard work of women across the world. At Pieces, we hosted a roundtable discussion in a &lt;a href="https://twitter.com/i/spaces/1ZkJzjEPNvDJv?s=20"&gt;Twitter Space&lt;/a&gt; on March 21st, where many of the women at our workplace shared their journeys and experiences in the tech space.&lt;/p&gt;

&lt;p&gt;There is a very good reason to celebrate women in tech; women are largely underrepresented in the technology field in general. According to this &lt;a href="https://www.womentech.net/en-us/women-in-tech-stats"&gt;survey&lt;/a&gt;, women make up just 35% of employees in STEM in the US. These numbers are shocking, especially when it comes to who receives funding and promotions in the industry. Thanks to inclusive programs like &lt;a href="https://women-in-tech.org/"&gt;Women In Tech (WIT)&lt;/a&gt;, &lt;a href="https://womenwhocode.com/"&gt;Women Who Code&lt;/a&gt;, &lt;a href="https://womeninstem.org/our-vision"&gt;Women in STEM&lt;/a&gt;, etc., more young girls and women have started embracing STEM as a career path.&lt;/p&gt;

&lt;p&gt;We are grateful to the following women for sharing their experiences:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.linkedin.com/in/bossemel/"&gt;Leonie Bossemeyer,&lt;/a&gt; a Machine Learning Engineer, with an initial background in economics, is now pursuing her PhD in machine learning.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.linkedin.com/in/laurinmcnulty/"&gt;Laurin McNulty&lt;/a&gt;, Head of Design, is in her second year with Pieces, her first role out of college.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.linkedin.com/in/erinfreeman13/"&gt;Erin Freeman&lt;/a&gt;, Growth Intern, graduating this May from Miami University in Communications and Entrepreneurship.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.linkedin.com/in/roselevy/"&gt;Rosie Levy&lt;/a&gt;, Chief Customer Officer, joined Pieces in December after working in software for 9 years in a variety of roles.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.linkedin.com/in/anushka-gupta23/"&gt;Anushka Gupta&lt;/a&gt;, Developer Advocate, is a computer science graduate who will be joining AWS soon as a programming manager.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.linkedin.com/m/in/sophia-iroegbu"&gt;Sophia Iroegbu&lt;/a&gt;, Developer Advocate, hosted this space, focusing on back-end development and technical writing.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Discussion Highlights
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Overcoming Intimidation and Imposter Syndrome
&lt;/h3&gt;

&lt;p&gt;Leonie, initially daunted by the expertise of her peers, reminds us that it's okay not to have all the answers at first. Sophia's transition from agriculture to tech highlights the diversity of backgrounds in the industry. Anushka's experience with coding highs and lows underscores the importance of resilience in the face of challenges, while Laurin, Erin, and Rosie share their battles with Imposter Syndrome, emphasizing the importance of continuous learning and building confidence.&lt;/p&gt;

&lt;h3&gt;
  
  
  Embracing Challenges and Seizing Opportunities
&lt;/h3&gt;

&lt;p&gt;Rosie's journey from engineer to developer to product owner exemplifies the power of saying "yes" to new experiences, even when they seem daunting. Leonie's thinking outside of the box approach, such as her discovery of an older, simpler, yet effective algorithm to solve a problem, showcases the value of humility and perseverance in tech. Additionally, the projects undertaken by Laurin and Erin, from project management to growing developer communities, demonstrate the growth that comes from stepping out of one's comfort zone and embracing new responsibilities. It’s okay to be clueless when you are just getting started.&lt;/p&gt;

&lt;h3&gt;
  
  
  Balancing Work and Life
&lt;/h3&gt;

&lt;p&gt;Beyond the code, these women emphasize the importance of maintaining a healthy work-life balance. From Laurin's dedication to a hobby like painting to Erin's strategic scheduling and self-care practices, each woman emphasizes the need to prioritize personal well-being alongside professional growth. On the other side, Leonie expresses that sometimes work can be so compelling and exciting that it can be both your work and your hobby, as long as you are achieving what you want to be achieving and doing work that makes you happy.&lt;/p&gt;

&lt;h3&gt;
  
  
  Paving the Way for Future Generations
&lt;/h3&gt;

&lt;p&gt;In their commitment to diversity and inclusion, these women speak on the importance of making strides in their own careers but also actively working to create opportunities for others. Anushka's advocacy for open-source programs and Rosie's mentorship of high school interns highlight the importance of nurturing talent from a young age. Similarly, Laurin, Erin, and Leonie stress the significance of representation and allyship in fostering an inclusive tech community. All of the women of Pieces agree that diversity begins with them, in hiring, in advocating for voices that aren’t being heard in the conference room, and championing each other every single day.&lt;/p&gt;

&lt;h3&gt;
  
  
  Words of Wisdom and Encouragement
&lt;/h3&gt;

&lt;p&gt;As the discussion draws to a close, each woman offers their own advice to those navigating their own tech journeys:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Laurin: Embrace change, have confidence in yourself, and never stop learning.&lt;/li&gt;
&lt;li&gt;Anushka: It's okay not to know everything; take your time to learn and grow.&lt;/li&gt;
&lt;li&gt;Leonie: Dive into projects that interest you; they provide the perfect opportunity for learning.&lt;/li&gt;
&lt;li&gt;Rosie: Struggling is part of the journey; remember that others face similar challenges, so keep pushing forward.&lt;/li&gt;
&lt;li&gt;Erin: Step out of your comfort zone, build new skills, and try things that scare you.&lt;/li&gt;
&lt;li&gt;Sophia: Start small, tackle challenges head-on, and remember it's okay to take breaks when needed.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Moving Forward with Confidence
&lt;/h3&gt;

&lt;p&gt;As we reflect on the insights shared by these trailblazing women, one thing becomes clear: the tech industry is richer and more vibrant when diverse voices are heard and celebrated. With their resilience, passion, and unwavering dedication, they inspire us all to embrace challenges, uplift one another, and forge ahead with confidence on our own paths to success.&lt;/p&gt;

&lt;p&gt;Here are some communities you can join to network with other women across the world:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://discord.gg/basementdevs"&gt;https://discord.gg/basementdevs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://discord.gg/he4rt"&gt;https://discord.gg/he4rt&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://dly.to/BpsJDgyxi8U"&gt;dly.to/BpsJDgyxi8U&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;As we drew closer to the end of the session, we discussed a bit more on time management, avoiding regrets, and being happy with who you are and where you are. Our heartfelt wish is that these roundtable discussions inspire you to keep going, keep building, keep growing, keep learning, and keep connecting and uplifting the women you connect with.&lt;/p&gt;

&lt;p&gt;You can listen to the full recorded conversation &lt;a href="https://twitter.com/i/spaces/1ZkJzjEPNvDJv?s=20"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>wecoded</category>
      <category>womenintech</category>
      <category>stem</category>
    </item>
    <item>
      <title>AI Coding: The Ultimate Guide to Enhancing Your Development Workflow</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Wed, 27 Mar 2024 16:07:27 +0000</pubDate>
      <link>https://forem.com/getpieces/ai-coding-the-ultimate-guide-to-enhancing-your-development-workflow-4b1h</link>
      <guid>https://forem.com/getpieces/ai-coding-the-ultimate-guide-to-enhancing-your-development-workflow-4b1h</guid>
<description>&lt;p&gt;As developers, the love of the craft can slip away from us when we have to spend time on mundane tasks: repeating code, searching for bugs, and reading boring documentation. With AI coding tools, there is finally a light at the end of the tunnel. Can we get back to the fun of using our imagination and letting our creative juices flow to solve abstract problems? The answer is yes.&lt;/p&gt;

&lt;p&gt;A &lt;a href="https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/unleashing-developer-productivity-with-generative-ai"&gt;recent report&lt;/a&gt; highlights the staggering efficiency gains and cost reductions companies are experiencing by integrating AI into their development workflows. According to this study, projects that utilize AI coding tools see a reduction in development time by up to 50%, along with a significant decrease in bugs and errors at the initial stages.&lt;/p&gt;

&lt;p&gt;Developers can now focus on creative problem-solving and strategic planning, leaving the repetitive and time-consuming coding tasks to their AI counterparts. Let’s take a close look at how you can make the most of these AI tools for coding and stay ready for what lies ahead.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding AI Coding
&lt;/h2&gt;

&lt;p&gt;To put it simply, AI coding includes the application of machine learning models to understand, generate, and optimize code. This field is massive, ranging from simple code suggestions to complex algorithms that can write and refactor entire segments of code autonomously.&lt;/p&gt;

&lt;p&gt;The types of AI coding tools vary significantly too, each serving distinct purposes. For instance, code completers like IntelliSense or GitHub Copilot help developers by suggesting the next line of code based on context. The result is a significant speed boost in the overall coding process.&lt;/p&gt;

&lt;p&gt;Bug detectors are invaluable in identifying potential errors before they become a problem, saving time in the debugging phase. &lt;a href="https://code.pieces.app/blog/enhancing-ai-code-review-efficiency-with-retrieval-augmented-generation"&gt;AI code reviewers&lt;/a&gt; go a step further by not only spotting errors but also providing suggestions for code quality improvements, ensuring adherence to best practices.&lt;/p&gt;

&lt;p&gt;Question-answering tools are designed to assist developers in finding quick solutions to coding queries, while &lt;a href="https://code.pieces.app/blog/9-best-ai-code-generation-tools"&gt;AI coding assistants and code generation tools&lt;/a&gt;, like Pieces for Developers, offer the ability to generate code snippets from natural language descriptions, thus streamlining the development workflow significantly.&lt;/p&gt;

&lt;p&gt;Machine learning and AI are integral to coding because they can analyze vast amounts of data quickly and learn from it. In coding, these models are trained on large datasets of code to understand syntax, semantics, and even code structure, which enables them to predict and generate code accordingly. Safe to say, this comes with some sweet benefits.&lt;/p&gt;

&lt;h3&gt;
  
  
  Benefits of AI in the Software Development Industry
&lt;/h3&gt;

&lt;p&gt;The benefits of using AI for coding tasks are numerous. We already mentioned how it increases productivity by automating repetitive tasks, improves code quality through consistent application of best practices, and can significantly reduce the time spent on debugging. But how can we put this into numbers?&lt;/p&gt;

&lt;p&gt;Another &lt;a href="https://github.blog/2023-06-27-the-economic-impact-of-the-ai-powered-developer-lifecycle-and-lessons-from-github-copilot/"&gt;recent article&lt;/a&gt; outlines the potential for generative AI coding tools to boost global GDP by over $1.5 trillion, attributing this to a 30% productivity enhancement. These tools could add the equivalent of 15 million "effective developers" to the global workforce by 2030.&lt;/p&gt;

&lt;p&gt;But it isn’t just big companies seeing the benefits of AI in coding. Less experienced developers benefit more from tools like &lt;a href="https://code.pieces.app/blog/pieces-developers-github-copilot"&gt;Pieces and GitHub Copilot&lt;/a&gt;, which help them upskill and become more fluent in their programming languages of choice.​&lt;/p&gt;

&lt;p&gt;Developer burnout and context-switching are no fun. Constantly having to move from one task to another without letting your brain catch up can make it so you don’t even want to get out of bed in the morning and sign into that daily standup meeting. Knowing that you can offload the grunt work to AI-assisted coding tools makes it a little easier to go to work every day and keep on coding.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Technologies Behind AI Coding
&lt;/h2&gt;

&lt;p&gt;The primary engines behind coding with AI are &lt;a href="https://code.pieces.app/whitepapers/getting-started-with-large-language-models-llms"&gt;Large Language Models (LLMs)&lt;/a&gt;, neural networks, and various machine learning algorithms. LLMs, like GPT and Gemini, have revolutionized the way we think about human-computer interaction by processing and generating human-like text.&lt;/p&gt;

&lt;p&gt;Neural networks, particularly deep learning models, mimic the human brain's interconnected neuron structure to parse data and learn from it, making them ideal for pattern recognition and predictive analytics essential in coding tasks.&lt;/p&gt;

&lt;h3&gt;
  
  
  How are These Technologies Applied in Coding?
&lt;/h3&gt;

&lt;p&gt;These technologies come into play in AI-based coding by automating repetitive tasks, suggesting code completion, finding and fixing bugs, or even generating code snippets based on natural language descriptions. The problem, however, often lies in the context. Traditional models could churn out code, but understanding the broader context of that code within a project or a developer's unique coding style was beyond their reach.&lt;/p&gt;

&lt;p&gt;This is where &lt;a href="https://code.pieces.app/blog/retrieval-augmented-generation-for-curation"&gt;Retrieval-Augmented Generation&lt;/a&gt; (RAG) steps in, making a significant impact on AI and coding. RAG enhances the capabilities of LLMs by combining them with an extensive database, such as codebases or internal wikis, which the model can draw upon to provide context-specific information.&lt;/p&gt;

&lt;p&gt;This means that when you use a tool like Pieces, it's not just suggesting code — it's suggesting code that aligns with the context of your entire workflow. It understands the nuances of your project and adapts to the specific environment you're working in, be it an app, complex software, or a web development project.&lt;/p&gt;
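&lt;p&gt;To make the retrieve-then-generate idea concrete, here is a toy sketch of the RAG loop. The keyword-overlap scoring and in-memory snippet list are deliberate simplifications; a production system like the ones described above would use embeddings and a vector index instead:&lt;/p&gt;

```python
def score(query, doc):
    """Crude relevance score: count of shared lowercase words."""
    q = set(query.lower().split())
    return len(q & set(doc.lower().split()))

def retrieve(query, corpus, k=2):
    """Return the k most relevant snippets for the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, corpus):
    """Prepend retrieved context so the LLM answers with project-specific facts."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Toy "codebase + wiki" corpus standing in for a real external data source
corpus = [
    "save_user writes a user record to the db users table",
    "deployments run through the staging cluster first",
    "load_user reads a user record from the db by id",
]
print(build_prompt("how do I save a user to the db", corpus))
```

&lt;p&gt;The augmented prompt grounds the model in your own snippets, which is exactly how RAG curbs hallucinations: the answer is constrained by retrieved facts rather than generated from general training data alone.&lt;/p&gt;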

&lt;h2&gt;
  
  
  AI Coding Tools and Platforms
&lt;/h2&gt;

&lt;p&gt;Let’s take a closer look at Pieces and a few other popular AI coding tools.&lt;/p&gt;

&lt;h3&gt;
  
  
  Pieces for Developers - Revolutionizing AI Coding
&lt;/h3&gt;

&lt;p&gt;Pieces for Developers is an AI productivity tool that's gaining attention for its unique approach to enhancing developer workflows. There are three key components that make Pieces a particularly special AI tool for coding to keep on your radar:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Retrieval Augmented Generation&lt;/li&gt;
&lt;li&gt;Cross-platform LLM utilization&lt;/li&gt;
&lt;li&gt;On-device processing&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;At its core, the application functions as a copilot, persisting across the various stages of software development, from coding to collaboration. This persistence is powered by the previously mentioned RAG technique that ups the performance of generative AI models by incorporating external data sources to provide more accurate and contextually relevant responses.&lt;/p&gt;

&lt;p&gt;What separates this technique from others is that it addresses the issue of "hallucinations" in AI responses by blending code generation with information retrieval. It enhances the accuracy and relevance of AI-generated content, and as a result, is more reliable for enterprise applications where factual accuracy is crucial.&lt;/p&gt;

&lt;p&gt;Cross-platform LLM utilization is one of the key differences between Pieces and other AI-powered coding tools. In practice, this means whether a developer is working in Visual Studio Code, JupyterLab, Chrome, or any other tool, they can expect a consistent level of support from Pieces. LLMs can be leveraged to understand and generate code across a variety of programming languages and frameworks, making the tool versatile and adaptable to different development environments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://code.pieces.app/blog/the-importance-of-on-device-ai-for-developer-productivity"&gt;On-device AI&lt;/a&gt; processing means that the AI's computations are done locally on the developer's machine, rather than on remote servers. This approach is crucial for handling sensitive code because it mitigates the risk of exposing intellectual property or confidential information to external entities. Pieces takes advantage of this to ensure that developers' code remains private and secure while still benefiting from advanced AI-powered coding capabilities.&lt;/p&gt;

&lt;p&gt;Pieces isn't just a single tool — it's meant to be a suite of tools that assist at various stages of software development. We’re talking everything from real-time code completion, bug detection, and automated refactoring suggestions while coding. There is also the integration with version control systems to help manage changes or predict the impact of code modifications.&lt;/p&gt;

&lt;p&gt;Plus, Pieces doesn't just provide raw code snippets but enhances them with additional context or documentation. For instance, when a developer finds a snippet of code, Pieces can automatically annotate it with comments, link to the relevant function's documentation, or even provide visual aids that explain what the code does.&lt;/p&gt;

&lt;h3&gt;
  
  
  GitHub Copilot
&lt;/h3&gt;

&lt;p&gt;Released by GitHub in partnership with OpenAI, &lt;a href="https://github.blog/2021-06-29-introducing-github-copilot-ai-pair-programmer/"&gt;GitHub Copilot&lt;/a&gt; is an AI-powered code completion tool that provides suggestions for whole lines or blocks of code. It's designed to be a sort of pair programmer, and help developers code faster and learn new APIs and languages along the way.&lt;/p&gt;

&lt;p&gt;Similar to Pieces, instead of typing out boring boilerplate, you can just select it from a list of suggestions while you’re coding in your IDE. It’s great at what it does, and even though it charges $10 per month, many devs swear by it.&lt;/p&gt;

&lt;p&gt;GitHub Copilot isn’t the only AI used for coding in town though, and we’ve discussed a few &lt;a href="https://code.pieces.app/blog/best-free-and-paid-github-copilot-alternatives"&gt;alternatives to GitHub Copilot&lt;/a&gt; in a previous article. A few of the standout competitors are from other tech giants that you’ve probably heard of — namely Microsoft and Amazon.&lt;/p&gt;

&lt;h3&gt;
  
  
  Microsoft Copilot
&lt;/h3&gt;

&lt;p&gt;Microsoft Copilot may be newer than GitHub Copilot, but it has been a long time in the making. If you’ve been using VS Code for a few years, you might remember its built-in code completion, IntelliSense. It is a form of AI assistance, but a far more basic one.&lt;/p&gt;

&lt;p&gt;IntelliSense first made its appearance in a mainstream product with the release of Visual Basic 5.0 Control Creation Edition in 1996. By Visual Studio 2005, IntelliSense became more context-aware and proactive, activating by default as the user begins to type.&lt;/p&gt;

&lt;p&gt;Fast forward to 2023, and Microsoft Copilot represents a newer generation of AI integration, leveraging large language models to assist in a broader range of tasks beyond coding, including writing, creating, and summarizing content across Microsoft's suite of products​.&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon CodeGuru
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://aws.amazon.com/codeguru/"&gt;CodeGuru&lt;/a&gt; is a machine learning service by Amazon Web Services that provides automated code reviews and performance recommendations. Amazon CodeGuru leverages machine learning to enhance code quality by providing automated code reviews and performance recommendations.&lt;/p&gt;

&lt;p&gt;It identifies your inefficient and problematic code segments and suggests improvements to both the performance and cost-efficiency of applications. CodeGuru is split into two main components: CodeGuru Reviewer, which conducts automated code reviews for potential issues, and CodeGuru Profiler, which identifies application performance bottlenecks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cloud-based or Local AI?
&lt;/h2&gt;

&lt;p&gt;All of these AI-assisted coding tools can make your head spin, but you shouldn’t jump into any particular one without figuring out how it will work within your environment, especially if you’re aiming to become an AI-powered enterprise. The first question to ask is whether a tool runs in the cloud or locally.&lt;/p&gt;

&lt;p&gt;Choosing the &lt;a href="https://code.pieces.app/blog/best-llm-for-coding-cloud-vs-local"&gt;best LLM between cloud-based or local LLMs&lt;/a&gt; for coding hinges on balancing specific requirements and constraints. Cloud LLMs offer the advantages of scalability, ease of use, and reduced need for upfront hardware investment, making them a compelling choice for those needing rapid deployment and flexibility.&lt;/p&gt;

&lt;p&gt;On the flip side, &lt;a href="https://code.pieces.app/blog/local-large-language-models-lllms-and-copilot-integrations"&gt;local LLMs&lt;/a&gt; provide greater control over data and processing, enhanced privacy, and the potential for customized optimizations, catering to those prioritizing data security and specific performance needs. The decision ultimately rests on aligning the LLM’s capabilities with your specific performance expectations, hardware availability, and privacy concerns.&lt;/p&gt;

&lt;h2&gt;
  
  
  Don’t Forget About Integrations
&lt;/h2&gt;

&lt;p&gt;Integrating AI coding tools with IDEs, browsers, and collaboration platforms is essential for modern software development. Such integration streamlines the workflow, allowing developers to remain within their coding environment while accessing a wealth of information and functionalities.&lt;/p&gt;

&lt;p&gt;For instance, the top AI tools for coding are integrated with IDEs to provide real-time, context-aware coding assistance, which can drastically improve productivity and reduce errors. When these tools are synced with web browsers, they simplify research and problem-solving by bringing intelligent assistance into the research phase, ensuring that insights and code snippets are easily accessible and relevant.&lt;/p&gt;

&lt;p&gt;You need a transition between solitary code writing and team-based activities, such as code reviews or collaborative problem-solving. An AI coding tool that unifies these aspects within the workflow doesn't just save time but also preserves the mental flow of developers, reducing the cognitive load and context switching that often lead to inefficiencies.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best Practices in AI Coding
&lt;/h2&gt;

&lt;p&gt;Effectively &lt;a href="https://code.pieces.app/blog/ai-integration-one-copilot-many-tools"&gt;integrating AI into your workflow&lt;/a&gt; isn’t difficult. But you should keep some AI coding basics and best practices in mind to make the most of it. Here are a few ideas to keep you on your toes:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understand the capabilities and limitations.&lt;/strong&gt; Before integrating the best AI tool for coding, think about what it can and cannot do. AI can offer suggestions, generate code, and automate repetitive tasks, but it may not always grasp the nuanced requirements of a project. Knowing this helps set realistic expectations. At least in the beginning, stick to using it for &lt;a href="https://code.pieces.app/blog/auto-complete-boilerplate-code"&gt;boilerplate code generation&lt;/a&gt;, bug fixes, or help with understanding documentation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Maintain code quality standards.&lt;/strong&gt; AI-generated code should keep to the same quality standards as manually written code (or better). This means you’ll still have to conduct the same old code reviews, maintain proper documentation, and run tests at each stage to ensure the generated code meets your project's standards. Yeah, this might be the boring stuff — but the best AI for coding should take most of the drudgery out of it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prioritize security.&lt;/strong&gt; When experimenting with coding and AI, make sure that the code generated does not introduce any security vulnerabilities. If you’re relying on cloud-based systems, try and move away from them. Remember how we said Pieces can use &lt;a href="https://code.pieces.app/blog/how-developers-are-using-offline-ai-tools-for-air-gapped-security"&gt;offline AI&lt;/a&gt; to run locally on your machine? This is one of the best ways to ensure security.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Incorporate human oversight.&lt;/strong&gt; AI should augment, not replace, human developers. Always have experienced developers oversee the AI's output and keep the necessary checks and balances to catch errors the AI might miss.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Train the AI with your codebase.&lt;/strong&gt; If your AI tool supports it, train the model with your codebase to improve its context-awareness. &lt;a href="https://code.pieces.app/blog/ai-context-making-the-most-out-of-your-llm-context-length"&gt;The more the AI understands your specific environment&lt;/a&gt; (as opposed to just general knowledge), the better its suggestions and code contributions will be.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Future of AI Coding
&lt;/h2&gt;

&lt;p&gt;The impact of AI on the future of software engineering is profound, to say the least. We’re only a year or two into the current AI upswing; a couple of years ago, these AI coding capabilities were unheard of.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.technologyreview.com/2023/12/06/1084457/ai-assistants-copilot-changing-code-software-development-github-openai/"&gt;Researchers at MIT are saying&lt;/a&gt; that as AI coding tools continue to evolve, they are expected to handle more complex aspects of software development, beyond simple code generation. This could include managing large codebases, identifying and fixing bugs autonomously, and optimizing software for performance and security. The &lt;a href="https://code.pieces.app/blog/evolution-of-ai-software-development"&gt;promise of AI in software development&lt;/a&gt; lies not just in augmenting human capabilities but in fundamentally transforming the coding process.&lt;/p&gt;

&lt;p&gt;Additionally, the best AI coding assistants not only alleviate the tedium of certain coding tasks but also enhance developers' creativity and empower them to tackle more complex problems. The &lt;a href="https://github.blog/2023-04-14-how-generative-ai-is-changing-the-way-developers-work/"&gt;folks over at GitHub&lt;/a&gt; are saying AI is completely changing the way developers work, claiming these advancements have led to increased productivity, with developers reporting faster coding times and a reduced sense of frustration during the coding process.&lt;/p&gt;

&lt;p&gt;But even with all of the promising news, there is a gray cloud hovering on the horizon. For example, the use of GitHub Copilot and AI tools like ChatGPT for coding have raised concerns regarding intellectual property and data privacy, leading some companies to restrict their use. Additionally, there are ongoing legal debates around the training of AI models with code that may have been used without explicit consent from the original authors.&lt;/p&gt;

&lt;p&gt;Besides the potential legal issues, there is also the question of the intelligent developer going extinct. That might sound a little dramatic, but let’s not forget that &lt;a href="https://www.techradar.com/pro/nvidia-ceo-predicts-the-death-of-coding-jensen-huang-says-ai-will-do-the-work-so-kids-dont-need-to-learn"&gt;the CEO of Nvidia recently said&lt;/a&gt; that programming is a dying profession, soon to be replaced by AI that solves coding problems.&lt;/p&gt;

&lt;p&gt;And the question remains: if AI does all the work, does the developer learn anything? Do they maintain their skills? For new devs coming into the field, do they need to learn anything at all?&lt;/p&gt;

&lt;p&gt;We like to think on the bright side. As of right now at least, you can’t ask AI to create a full-stack app from scratch and handle all of the nuanced user stories, requirements, and issues with deployment and scaling. We still need developers, and even with a potential &lt;a href="https://code.pieces.app/blog/lack-of-software-developers-what-to-do"&gt;lack of software developers due to AI&lt;/a&gt;, we can leverage the best AI coding tools to create more efficient and intelligent software engineers.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;From dramatically reducing development times and enhancing code quality to bringing about a more satisfying development experience, AI coding tools are reshaping the industry's future. Adopting AI coding tools is increasingly essential for developers aiming to stay ahead. One such AI that can help with coding is Pieces for Developers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.pieces.app/installation-getting-started/what-am-i-installing"&gt;Check out Pieces for yourself&lt;/a&gt; and join a helpful community of developers who are shaping coding's future with AI. Boost your productivity, improve your code quality, or just enjoy coding more — Pieces is your go-to.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>beginners</category>
    </item>
    <item>
      <title>AWS CodeWhisperer vs Copilot: Features and Issues</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Wed, 27 Mar 2024 16:00:24 +0000</pubDate>
      <link>https://forem.com/getpieces/aws-codewhisperer-vs-copilot-features-and-issues-1612</link>
      <guid>https://forem.com/getpieces/aws-codewhisperer-vs-copilot-features-and-issues-1612</guid>
      <description>&lt;p&gt;Given the existence of “&lt;a href="https://www.cognition-labs.com/blog"&gt;the first AI software engineer,&lt;/a&gt;” it is increasingly important that developers use an AI assistant to be more productive. The increase in productivity can come from (1) cleaner, more maintainable code with fewer errors, (2) automating repetitive tasks such as writing boilerplate and code completion, (3) offering suggestions and variations of existing code, (4) retrieving desired information from your previous workflow, and other types of intelligent assistance.&lt;/p&gt;

&lt;p&gt;However, AI output needs to be checked for accuracy and not trusted unless reviewed by a human. For example, various AI systems have &lt;a href="https://thetaclv.com/resource/the-issue-of-ai-hallucination-in-writing-code-for-predictive-models-observations-from-the-theta-data-science-team/"&gt;invented variables that do not exist&lt;/a&gt; in reality, &lt;a href="https://www.forbes.com/sites/marisagarcia/2024/02/19/what-air-canada-lost-in-remarkable-lying-ai-chatbot-case/?sh=6d44bffb696f"&gt;explained hallucinated policies&lt;/a&gt; to customers, and &lt;a href="https://snyk.io/blog/addressing-risks-in-the-owasp-top-10-for-llms/"&gt;opened code to security vulnerabilities&lt;/a&gt;. In the legal system, an AI has &lt;a href="https://hai.stanford.edu/news/hallucinating-law-legal-mistakes-large-language-models-are-pervasive"&gt;supported false facts with imaginary sources&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Consequently, it is crucial to choose your AI assistant carefully. This post discusses three of the best choices for a developer’s &lt;a href="https://code.pieces.app/blog/navigating-the-future-with-ai-copilots-a-comprehensive-guide"&gt;AI copilot&lt;/a&gt; and compares their features and issues: GitHub Copilot, Amazon CodeWhisperer, and a third alternative, Pieces for Developers.&lt;/p&gt;

&lt;h2&gt;
  
  
  GitHub Copilot Features and Issues
&lt;/h2&gt;

&lt;p&gt;After a year of technical preview, GitHub Copilot was released for &lt;a href="https://github.blog/2022-06-21-github-copilot-is-generally-available-to-all-developers/"&gt;individual developers&lt;/a&gt; on a monthly subscription basis in June 2022. It uses OpenAI’s Codex LLM, which was derived from GPT-3 and specialized to translate natural language into code.&lt;/p&gt;

&lt;p&gt;Microsoft bought GitHub for $7.5 billion in stock in June 2018, and &lt;a href="https://analyticsindiamag.com/the-dark-history-behind-github-co-pilots-success/"&gt;Microsoft collaborated with OpenAI&lt;/a&gt; to create GitHub Copilot before the Codex LLM was publicly released. GitHub provided an immense code database for training Copilot’s AI model, and Azure is an immensely scalable cloud environment that supports Copilot.&lt;/p&gt;

&lt;p&gt;In March of last year (2023), GitHub Copilot evolved into GitHub Copilot X, a “readily accessible AI assistant throughout the entire development lifecycle.” This was achieved by integrating it into Visual Studio and VS Code; it also supports Vim, Neovim, the JetBrains suite of IDEs, and Azure Data Studio. Some of GitHub Copilot’s experimental features ended December 15, 2023, while others, such as Workspaces, are ongoing.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://analyticsindiamag.com/github-positions-ai-as-pivotal-in-software-development-journey/"&gt;intent of Workspaces&lt;/a&gt; is to guide a developer from initial idea to production and beyond. It combines GPT-4 with the available relevant codebase to provide better information during the entire development lifecycle process.&lt;/p&gt;

&lt;p&gt;Earlier this year (February 27, 2024), GitHub Copilot Enterprise was &lt;a href="https://github.blog/2024-02-27-github-copilot-enterprise-is-now-generally-available/#a-conversational-and-customized-github-copilot-experience"&gt;introduced&lt;/a&gt; to developers as “a copilot that is customized to their own organization’s code and processes.” The GitHub Blog describes the “three core features” of the GitHub Copilot Enterprise release.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The first core feature is the set of standard copilot features. It generates summaries of code and makes real-time suggestions to explain and improve code. At this time, there is &lt;a href="https://dev.to/rahulbanerjee99/some-experiments-with-github-copilot-4jao"&gt;an unanswered question&lt;/a&gt; about the context used by the Copilot. When a developer had two projects open in JetBrains Webstorm, it used both projects as a basis for its suggestions.&lt;/li&gt;
&lt;li&gt;The second core feature is the direct integration of chat into GitHub.com. This allows developers to query their organization’s codebase in natural language and to be guided to relevant code or documentation that might answer a question.&lt;/li&gt;
&lt;li&gt;The third core feature is faster review and integration of pull request results into code. The Copilot summarizes pull requests and analyzes pull request diffs, which is useful to developers who review pull requests.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The GitHub Blog explicitly states that the vision for evolving GitHub Copilot is focused on AI integration for GitHub. I noticed that GitHub Copilot Enterprise has the integration of Microsoft’s Bing search into chat, currently in beta. It does not mention using a code snippet as an example to find similar snippets, a search that can be done in Pieces.&lt;/p&gt;

&lt;p&gt;The Copilot’s AI model is not trained on any of an organization’s information unless it is introduced by organizational request. For example, GitHub Copilot can use a custom AI model and it can be fine-tuned for esoteric languages, such as &lt;a href="https://analyticsindiamag.com/github-positions-ai-as-pivotal-in-software-development-journey/#:~:text=Now%252C%2520with%2520AI%252C%2520GitHub%2520aims,codebase%252C%25E2%2580%259D%2520he%2520told%2520AIM."&gt;Verilog for hardware design&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;A study of the simpler GitHub Copilot provides &lt;a href="https://github.blog/wp-content/uploads/2024/02/accenture-research.png?w=1600"&gt;strong evidence&lt;/a&gt; for the benefits of using it. A major enterprise surveyed 450 of its Copilot users for their Activity, Productivity, Efficiency, and Satisfaction in using it. All eight data values in the results were highly positive. Five were in the range of 90%-96%. Developers retained 88% of the suggested code, there was an 84% increase in successful builds, and 50% did more builds.&lt;/p&gt;

&lt;p&gt;Thus, it is very clear that developers’ use of even the simplest auto-completion copilot is highly beneficial. The benefits increase with more intelligent copilots, especially when the copilots are designed with the developers’ specific use cases in mind. More advanced copilots understand and make suggestions relevant to regulations, policies, standard practices, and personal preferences.&lt;/p&gt;

&lt;p&gt;For security, data is encrypted; however, it is accessible to Microsoft and GitHub personnel, especially data belonging to individuals. Individuals can turn off retention of prompts and suggestions, and prompts and suggestions are not saved on business plans.&lt;/p&gt;

&lt;p&gt;There is a lot of community support at various levels of expertise. This can be especially useful for students and developers new to coding in a language, who have questions that go beyond interactions with the AI itself.&lt;/p&gt;

&lt;h2&gt;
  
  
  The AWS Copilot Competitor
&lt;/h2&gt;

&lt;p&gt;Amazon launched AWS CodeWhisperer into technical preview in June 2022, within hours of GitHub Copilot’s public launch. The two copilots are now frequently compared as Amazon CodeWhisperer vs GitHub Copilot.&lt;/p&gt;

&lt;p&gt;CodeWhisperer is a direct challenge to Microsoft's GitHub Copilot, especially for developers whose work involves the AWS environment. It was trained on billions of lines of code and, &lt;a href="https://aws.amazon.com/blogs/aws/now-in-preview-amazon-codewhisperer-ml-powered-coding-companion/"&gt;according to Amazon,&lt;/a&gt; it continues to train on “open source repositories, internal Amazon repositories, API documentation, and forums.”&lt;/p&gt;

&lt;p&gt;CodeWhisperer uses the developer’s current context to make suggestions, including four primary sources: (1) the current location of the cursor in a body of code, (2) the code that comes before the cursor, (3) any available comments, and (4) the code it finds in other files in the same project.&lt;/p&gt;

&lt;p&gt;Just as Microsoft focuses GitHub Copilot on GitHub, Amazon focuses CodeWhisperer on AWS. It writes code for accessing AWS services that conforms to AWS best practices. For example, it offers suggestions for APIs such as Amazon EC2, AWS Lambda, and Amazon S3.&lt;/p&gt;

&lt;p&gt;There is also a reference tracker that flags code that may have plagiarism issues with open-source code. There is a filter that can be turned on to keep this code out of the code suggested by the AI.&lt;/p&gt;

&lt;p&gt;Like GitHub Copilot, CodeWhisperer does not store or use business data, but individuals have to opt out of having their data stored.&lt;/p&gt;

&lt;p&gt;The biggest general differentiator in AWS CodeWhisperer vs GitHub Copilot is the support for the AWS environment. This is a tremendous help, especially if you need to access AWS services but not often enough to have the code memorized.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pieces for Developers
&lt;/h2&gt;

&lt;p&gt;The team that created Pieces for Developers has and will continue to have a very different focus than Amazon and Microsoft. The company’s sole focus is identifying and satisfying the needs of developers. It is backed by some of the world’s best investors, and it is secure and continuing to grow.&lt;/p&gt;

&lt;p&gt;In 2022, the Pieces team set out to build the most advanced code snippet management and workflow context platform. It would use AI to augment and streamline a developer’s workflow and have an on-device personal mini-repository that stores the materials written and used by a developer. The goal was (and still is) to save the developer’s time while decreasing stress and helping the developer be more productive.&lt;/p&gt;

&lt;p&gt;The Pieces Suite, which is free to individuals, became a “tool-between-tools” that integrates three major workflow processes: (1) researching and problem-solving in the &lt;em&gt;browser&lt;/em&gt;; (2) writing, reviewing, referencing, and reusing code in the &lt;em&gt;IDE;&lt;/em&gt; and (3) working with colleagues in &lt;em&gt;collaborative environments&lt;/em&gt; such as Microsoft Teams.&lt;/p&gt;

&lt;p&gt;Pieces saves reusable and valuable code from the browser, the IDE, and directly from teammates when shared in Teams. These pieces of code are stored with intelligently enriched titles, explanations, tags, user annotations, and 15+ other enrichments. A developer saves countless hours because it is easy to find the right snippet, understand the appropriate context, and then plug it into the code.&lt;/p&gt;

&lt;p&gt;In summary, Pieces for Developers brings your tools to one place with all necessary capabilities like saving code, searching for it when it matters, reusing it seamlessly, and sharing it with one click! See how Pieces compares as a &lt;a href="https://code.pieces.app/blog/best-free-and-paid-github-copilot-alternatives"&gt;GitHub Copilot alternative&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  GitHub Copilot vs AWS CodeWhisperer vs Pieces
&lt;/h2&gt;

&lt;p&gt;When comparing GitHub Copilot vs CodeWhisperer, both &lt;a href="https://code.pieces.app/blog/9-best-ai-code-generation-tools"&gt;AI code generation tools&lt;/a&gt; are excellent. However, their code-snippet capabilities fall well short of what Pieces already provides through its fundamental focus on workflow.&lt;/p&gt;

&lt;p&gt;Pieces for Developers includes the desktop mini-repository with any, some, or all of Pieces integrations for browsers (Chrome, Edge, Firefox), IDEs (VS Code, JetBrains IDEs, JupyterLab, Azure Data Studio, Obsidian), and collaborative environments (Microsoft Teams). Others are in process and will be released soon.&lt;/p&gt;

&lt;p&gt;Compare the following list of ten feature &lt;u&gt;categories&lt;/u&gt; with the features of any copilot you are considering. Then, if you are an individual developer, remember that all of the features included in Pieces are free.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;em&gt;Manage Your Resources.&lt;/em&gt; Keep track of snippets, screenshots, and workflow context in an &lt;a href="https://code.pieces.app/blog/the-importance-of-on-device-ai-for-developer-productivity"&gt;on-device AI&lt;/a&gt; hub for developer materials.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Instantly Enriched.&lt;/em&gt; Benefit from AI-powered enrichment providing titles, descriptions, tags, documentation links, relevant collaborators, and so much more.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Your Personal Google (Offline &amp;amp; Online).&lt;/em&gt; Find the materials you need with a lightning-fast search experience that lets you query by natural language, code, tags, and other semantics, depending on your preference.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Your Personal Copilot.&lt;/em&gt; Ask the &lt;a href="https://code.pieces.app/blog/introducing-pieces-copilot"&gt;Pieces Copilot&lt;/a&gt; to generate code, connect you with teammates, or summarize what you worked on yesterday. It can run entirely offline and on-device, and it understands text, images, videos, and even entire local directories. It also can access the repository for anything you saved to Pieces while in a browser, IDE, or collaborative environment like Teams.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Easier to Share.&lt;/em&gt; Maintain invaluable context (such as people’s names) for your shared resources when collaborating with teammates, &lt;a href="https://code.pieces.app/blog/5-tips-for-writing-technical-documentation-that-developers-love"&gt;writing technical documentation&lt;/a&gt;, or publishing tutorial videos with custom shareable links.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Smart Transforms.&lt;/em&gt; Transform your snippets in a single click to improve readability, formatting, or runtime performance. You can even translate a snippet to your preferred programming language or convert it to boilerplate.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Workflow Backtracking.&lt;/em&gt; Easily pick up where you left off by revisiting what you searched, copied, saved, shared, referenced, and more because you have a chronological compass capturing the "when" and "where" of your workflow.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Code from Screenshots.&lt;/em&gt; Pieces upgrades screenshots with OCR and its AI extracts code and repairs invalid characters, which results in extremely accurate code extraction and deep metadata enrichment.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Persistent Conversations.&lt;/em&gt; Move seamlessly from browser to IDE to Teams in any sequence without breaking your connection to your AI assistant.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Support and Updates.&lt;/em&gt; Check what support is offered through what channels and how quickly issues are resolved and updates are released.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Conclusion: CodeWhisperer vs Copilot
&lt;/h2&gt;

&lt;p&gt;Other copilots are sometimes included in comparisons, such as &lt;a href="https://code.pieces.app/blog/github-copilot-vs-chatgpt-vs-tabnine-feature-comparison"&gt;GitHub Copilot vs ChatGPT vs Tabnine&lt;/a&gt;. I did include Tabnine in a previous low-level comparison of copilot features, but it was too limited to include in this post. One limitation is that it is not integrated into any browsers, but GitHub Copilot and AWS CodeWhisperer share that limitation.&lt;/p&gt;

&lt;p&gt;When choosing an AI coding assistant, the primary considerations are development environment, development/coding style, preferred language(s), budget, and, if any, your specific needs. If you need a copilot that is tied to the Azure or AWS environment, then your optimal choice will be the copilot designed for that environment.&lt;/p&gt;

&lt;p&gt;When not bound to an environment, Pieces is likely your optimal choice. With its advanced features and focus on the overall workflow, it offers perhaps the greatest increase in productivity, and it is free to individuals. If you’re unsure which to choose, use the list of feature categories in the previous section as a checklist for comparison.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.pieces.app/installation-getting-started/what-am-i-installing"&gt;Get started with Pieces for free today&lt;/a&gt;!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ai</category>
      <category>githubcopilot</category>
    </item>
    <item>
      <title>Top 10 Kotlin Code Snippets to Keep Handy</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Tue, 19 Mar 2024 16:00:53 +0000</pubDate>
      <link>https://forem.com/getpieces/top-10-kotlin-code-snippets-to-keep-handy-2bk2</link>
      <guid>https://forem.com/getpieces/top-10-kotlin-code-snippets-to-keep-handy-2bk2</guid>
      <description>&lt;p&gt;Kotlin is the cross-platform, general-purpose, open-source programming language developed by JetBrains. It is fully interoperable with Java. According to Stack Overflow, Kotlin is the 4th most loved language among the developer’s community, and it is the preferred language for Android development.&lt;/p&gt;

&lt;p&gt;Google announced first-class support for Kotlin on Android in addition to the existing languages – Java and C++ – at Google I/O in 2017. &lt;a href="https://techcrunch.com/2019/05/07/kotlin-is-now-googles-preferred-language-for-android-app-development/"&gt;Google also announced Kotlin as its preferred language for Android development&lt;/a&gt; on May 7, 2019.&lt;/p&gt;

&lt;p&gt;Rather than searching through your notes, codebase, and search history endlessly, you can use Pieces for Developers to organize useful code snippets along with useful links, tags, descriptions, anchors, and titles for each snippet, making it a lightning fast task to retrieve one whenever you need it.&lt;/p&gt;

&lt;p&gt;In this article, we’ll look at a curated list of the best Kotlin code snippets for beginners and advanced developers who want to enhance their productivity and skills with this powerful language.&lt;/p&gt;

&lt;p&gt;Let’s get started!&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Delegated Properties
&lt;/h2&gt;

&lt;p&gt;Delegation is a design pattern in which an object hands a task over to a helper object instead of performing the task itself. Kotlin supports this pattern at the language level through delegated properties.&lt;/p&gt;

&lt;p&gt;This feature lets you implement common property logic once in a delegate class and then reuse it wherever it is needed in the application.&lt;/p&gt;

&lt;p&gt;In the code below, the property &lt;code&gt;name&lt;/code&gt; in our &lt;code&gt;Demo&lt;/code&gt; class delegates its getter and setter logic to a &lt;code&gt;DelegateHelperClass()&lt;/code&gt; instance. Any object after the &lt;code&gt;by&lt;/code&gt; keyword that satisfies the property delegate convention can act as a delegate.&lt;/p&gt;

&lt;p&gt;Let's dive into how delegated properties work with a code example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class Demo {var name: String by DelegateHelperClass()}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=1ed64895be"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Smart casts
&lt;/h2&gt;

&lt;p&gt;Other programming languages, including Java, require an explicit type cast before a variable’s members can be accessed through a more specific type. Kotlin, however, has a feature called smart casting: the compiler tracks the checks performed inside &lt;code&gt;if&lt;/code&gt; expressions.&lt;/p&gt;

&lt;p&gt;If the compiler can prove that a nullable variable is not null at a given point, it lets you use the variable as its non-null type. The &lt;code&gt;is&lt;/code&gt; or &lt;code&gt;!is&lt;/code&gt; operator checks the type of a variable, and within the checked branch the compiler automatically casts it to the target type, as shown in the code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fun main(args: Array&amp;lt; String &amp;gt;) 
{val str1: String? = “Hello World”var str2: String? = null 
// prints String is nullif(str1 is String) 
          {// No Explicit type Casting needed.println(“length of String ${str1.length}”)}
else 
          {println(“String is null”)}}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=3bbe4388ac"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Elvis Operator (?:)
&lt;/h2&gt;

&lt;p&gt;The Elvis operator in Kotlin, represented as &lt;code&gt;?:&lt;/code&gt;, offers a concise way to handle nullability. It returns the expression on the left if it's not null, otherwise, it evaluates and returns the expression on the right. This snippet is handy when dealing with nullable types, preventing null pointer exceptions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;val result = nullableValue ?: defaultValue
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=da3c4f803c"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Use the Elvis operator to provide a default value when dealing with nullable variables, simplifying null checks.&lt;/p&gt;
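&lt;p&gt;As a slightly fuller illustration, consider a small helper that falls back to a default display name; the &lt;code&gt;displayName&lt;/code&gt; function is our own example, not part of any library:&lt;/p&gt;

```kotlin
// Returns the nickname when present, otherwise a default.
fun displayName(nickname: String?): String = nickname ?: "Anonymous"

fun main() {
    println(displayName("kotlin_fan")) // prints "kotlin_fan"
    println(displayName(null))         // prints "Anonymous"
}
```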

&lt;h2&gt;
  
  
  4. Safe Call Operator (?.)
&lt;/h2&gt;

&lt;p&gt;The safe call operator, denoted by &lt;code&gt;?.&lt;/code&gt; in this Kotlin code snippet, allows accessing properties or invoking methods on nullable objects safely. It returns null if the object is null, preventing &lt;code&gt;NullPointerExceptions&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;val length = nullableString?.length
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=a75840a303"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Employ the safe call operator Kotlin snippet when working with nullable objects to access their properties or methods without risking null pointer exceptions.&lt;/p&gt;
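&lt;p&gt;Safe calls are especially useful when chained. A small sketch with hypothetical &lt;code&gt;Customer&lt;/code&gt; and &lt;code&gt;Address&lt;/code&gt; classes (our own example types):&lt;/p&gt;

```kotlin
data class Address(val city: String?)
data class Customer(val address: Address?)

fun main() {
    val customer: Customer? = Customer(Address(null))
    // Each ?. short-circuits to null instead of throwing a NullPointerException,
    // and the Elvis operator supplies a fallback for the whole chain.
    println(customer?.address?.city ?: "Unknown city") // prints "Unknown city"
}
```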

&lt;h2&gt;
  
  
  5. String Interpolation
&lt;/h2&gt;

&lt;p&gt;Kotlin supports string interpolation, enabling the embedding of expressions directly into strings. Enclose the expression in curly braces preceded by a dollar sign (&lt;code&gt;${expression}&lt;/code&gt;).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;val name = "John"
val greeting = "Hello, $name!"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=496042be9b"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Utilize string interpolation for dynamic string construction, enhancing code readability and maintainability.&lt;/p&gt;
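&lt;p&gt;Interpolation is not limited to simple variables; arbitrary expressions can be embedded with &lt;code&gt;${...}&lt;/code&gt;. A brief sketch:&lt;/p&gt;

```kotlin
fun main() {
    val items = listOf("apple", "banana")
    // Whole expressions (property access, function calls) work inside ${...}.
    println("Cart has ${items.size} item(s): ${items.joinToString()}")
    // prints "Cart has 2 item(s): apple, banana"
}
```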

&lt;h2&gt;
  
  
  6. Extension Functions
&lt;/h2&gt;

&lt;p&gt;Extension functions allow adding new functionalities to existing classes without modifying their source code. They enhance code reusability and facilitate a more fluent API design.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fun String.addExclamation(): String {
                             return "$this!"
                             }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=d0c24c8572"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Employ extension functions to extend the functionality of existing classes, promoting &lt;a href="https://code.pieces.app/blog/modern-code-organization-techniques"&gt;code organization&lt;/a&gt; and maintainability.&lt;/p&gt;
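&lt;p&gt;Calling the extension defined above looks just like calling a regular member function on &lt;code&gt;String&lt;/code&gt;:&lt;/p&gt;

```kotlin
// The extension from the section above, in expression-body form.
fun String.addExclamation(): String = "$this!"

fun main() {
    // Invoked with ordinary member-call syntax on the receiver.
    println("Hello".addExclamation()) // prints "Hello!"
}
```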

&lt;h2&gt;
  
  
  7. Data Classes
&lt;/h2&gt;

&lt;p&gt;In Kotlin, data classes are a special type of class that is primarily used to hold data. They are designed to reduce boilerplate code by automatically generating several standard methods such as &lt;code&gt;equals()&lt;/code&gt;, &lt;code&gt;hashCode()&lt;/code&gt;, &lt;code&gt;toString()&lt;/code&gt;, and &lt;code&gt;copy()&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;In the Kotlin code snippet below, we define a data class named &lt;code&gt;User&lt;/code&gt; with two properties: &lt;code&gt;name&lt;/code&gt; of type &lt;code&gt;String&lt;/code&gt; and &lt;code&gt;age&lt;/code&gt; of type &lt;code&gt;Int&lt;/code&gt;. By simply declaring the class with the &lt;code&gt;data&lt;/code&gt; modifier, Kotlin automatically generates the &lt;code&gt;equals()&lt;/code&gt;, &lt;code&gt;hashCode()&lt;/code&gt;, &lt;code&gt;toString()&lt;/code&gt;, and &lt;code&gt;copy()&lt;/code&gt; methods for us.&lt;/p&gt;

&lt;p&gt;Using data classes simplifies working with immutable data. In the example, we create a User object named user with the name "Alice" and age 30. Since &lt;code&gt;User&lt;/code&gt; is a data class, we can easily access its properties, compare objects for equality, generate hash codes, and create copies without writing additional boilerplate code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;data class User(val name: String, val age: Int)
val user = User("Alice", 30)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=2254438828"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Use data classes to represent immutable data, reducing boilerplate code and enhancing readability.&lt;/p&gt;
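&lt;p&gt;A quick sketch of the generated methods in action, continuing the &lt;code&gt;User&lt;/code&gt; example above:&lt;/p&gt;

```kotlin
data class User(val name: String, val age: Int)

fun main() {
    val user = User("Alice", 30)
    val older = user.copy(age = 31)    // copy() changes just one property
    println(older)                     // prints "User(name=Alice, age=31)"
    println(user == User("Alice", 30)) // structural equals(): prints "true"
}
```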

&lt;h2&gt;
  
  
  8. Higher-Order Functions
&lt;/h2&gt;

&lt;p&gt;Higher-order functions accept other functions as parameters or return them as results. In Kotlin, functions are first-class citizens, meaning they can be treated like any other value, such as integers or strings. This enables more concise, expressive, and flexible code and promotes functional programming paradigms.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fun operateOnNumber(number: Int, operation: (Int) -&amp;gt; Int): Int
{
 return operation(number)
}

val squared = operateOnNumber(5) { it * it }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=8a3240a0af"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Leverage higher-order functions to encapsulate common behaviors and promote &lt;a href="https://code.pieces.app/blog/making-code-reuse-and-reference-seamless"&gt;code reuse&lt;/a&gt;.&lt;/p&gt;
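&lt;p&gt;Besides trailing lambdas, you can also pass named functions by reference. A small sketch building on &lt;code&gt;operateOnNumber&lt;/code&gt; from above (the &lt;code&gt;double&lt;/code&gt; helper is our own example):&lt;/p&gt;

```kotlin
fun operateOnNumber(number: Int, operation: (Int) -> Int): Int = operation(number)

fun double(x: Int): Int = x * 2

fun main() {
    println(operateOnNumber(5, ::double))  // function reference: prints "10"
    println(operateOnNumber(5) { it + 1 }) // trailing lambda: prints "6"
}
```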

&lt;h2&gt;
  
  
  9. Null Safety with !! Operator
&lt;/h2&gt;

&lt;p&gt;Although not recommended, the double exclamation mark (&lt;code&gt;!!&lt;/code&gt;) operator can be used to assert that an expression is not null. It throws a &lt;code&gt;NullPointerException&lt;/code&gt; if the expression evaluates to null.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;val length = nullableString!!.length
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=2a06479494"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Exercise caution when using the &lt;code&gt;!!&lt;/code&gt; operator, as it bypasses null safety checks and can lead to runtime exceptions.&lt;/p&gt;
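&lt;p&gt;When you are tempted to reach for &lt;code&gt;!!&lt;/code&gt;, two patterns are usually safer; this is a suggested sketch, not the only option:&lt;/p&gt;

```kotlin
fun main() {
    val nullableString: String? = null

    // Safer than !!: combine the safe call with the Elvis operator.
    val safeLength = nullableString?.length ?: 0
    println(safeLength) // prints "0"

    // requireNotNull fails with a descriptive message instead of a bare NPE.
    try {
        requireNotNull(nullableString) { "nullableString must be set" }
    } catch (e: IllegalArgumentException) {
        println(e.message) // prints "nullableString must be set"
    }
}
```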

&lt;h2&gt;
  
  
  10. Default Arguments
&lt;/h2&gt;

&lt;p&gt;Default arguments in Kotlin allow you to specify default values for function parameters. This means that when you call a function without providing values for certain parameters, those parameters will be initialized with their default values.&lt;/p&gt;

&lt;p&gt;Default arguments provide a convenient way to define functions with a varying number of parameters while letting callers omit the ones they don’t need.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;fun greet(name: String = "World") {
 println("Hello, $name!")
}
greet() // Output: Hello, World!
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=b1494295d7"&gt;Save this code&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Utilize default arguments to define flexible function interfaces and simplify function calls.&lt;/p&gt;
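&lt;p&gt;Default arguments pair naturally with named arguments, which let a caller override a later parameter without restating the earlier ones. A small sketch (the &lt;code&gt;connect&lt;/code&gt; function is hypothetical, not from any library):&lt;/p&gt;

```kotlin
// Hypothetical helper: every parameter has a default value
fun connect(host: String = "localhost", port: Int = 8080, secure: Boolean = false): String {
    val scheme = if (secure) "https" else "http"
    return "$scheme://$host:$port"
}

fun main() {
    println(connect())              // http://localhost:8080 (all defaults)
    println(connect(port = 9090))   // http://localhost:9090 (named argument skips host)
    println(connect(secure = true)) // https://localhost:8080
}
```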

&lt;h2&gt;
  
  
  Bonus Snippet: Inline Functions
&lt;/h2&gt;

&lt;p&gt;Inline functions in Kotlin are marked with the &lt;code&gt;inline&lt;/code&gt; keyword and are expanded at the call site, reducing the overhead of function calls.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;inline fun performOperation(value: Int, operation: (Int) -&amp;gt; Int): Int {
    return operation(value)
}

val result = performOperation(5) { it * it }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://sophyia.pieces.cloud/?p=194747aaf4"&gt;Save this code snippet&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Use Case: Use inline functions to improve performance by eliminating the overhead of function calls for small functions or lambdas.&lt;/p&gt;
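&lt;p&gt;To make the benefit concrete, here is a slightly fuller sketch of an inline higher-order function; &lt;code&gt;firstMatch&lt;/code&gt; is a hypothetical helper, not part of the standard library. Because the function is inlined, the loop and the lambda body are compiled directly into the call site, so no function object is allocated for the lambda:&lt;/p&gt;

```kotlin
// Hypothetical inline helper: returns the first value matching the predicate, or -1
inline fun firstMatch(values: IntArray, predicate: (Int) -> Boolean): Int {
    for (v in values) {
        if (predicate(v)) return v
    }
    return -1
}

fun main() {
    println(firstMatch(intArrayOf(1, 4, 9)) { it % 2 == 0 }) // 4
    println(firstMatch(intArrayOf(1, 3, 9)) { it % 2 == 0 }) // -1
}
```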

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Kotlin's powerful features empower developers to write concise, expressive, and safe code. By leveraging Kotlin snippets like the Elvis operator, extension functions, and higher-order functions, developers can enhance productivity and maintainability.&lt;/p&gt;

&lt;p&gt;Understanding and mastering these features is key to unlocking the full potential of Kotlin and building robust applications efficiently. Experiment with these Kotlin code snippets in your projects, and witness firsthand the elegance and efficiency of Kotlin programming.&lt;/p&gt;

&lt;p&gt;With this we conclude our curated selection of top-notch Kotlin boilerplate snippets — a valuable resource for both experienced Kotlin developers and beginners alike.&lt;/p&gt;

&lt;p&gt;To streamline your coding journey and enhance your learning curve, the &lt;a href="https://docs.pieces.app/extensions-plugins/chrome"&gt;Pieces Web Extension&lt;/a&gt; offers a convenient feature. With just a few clicks, you can effortlessly copy and save any of these Kotlin boilerplate code snippets from our list, ensuring quick access whenever you require them.&lt;/p&gt;

&lt;p&gt;What's more, saving a snippet in Pieces gives you deeper insights into these Kotlin snippets, including tags, context, and relevant links. To try it, first &lt;a href="https://docs.pieces.app/installation-getting-started/what-am-i-installing"&gt;install the Pieces Desktop App&lt;/a&gt;, then click one of the links above to save a snippet, enriched with tags and all the context you need, into a personal, on-device repository.&lt;/p&gt;

&lt;p&gt;We trust that this article has provided you with a solid understanding of useful Kotlin boilerplate snippets and when to apply them in your projects. You can find more useful snippets for other languages with our &lt;a href="https://code.pieces.app/collections"&gt;snippet collections&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>kotlin</category>
      <category>programming</category>
      <category>beginners</category>
    </item>
    <item>
      <title>How to Run an LLM Locally with Pieces</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Mon, 18 Mar 2024 20:31:49 +0000</pubDate>
      <link>https://forem.com/getpieces/how-to-run-an-llm-locally-with-pieces-4290</link>
      <guid>https://forem.com/getpieces/how-to-run-an-llm-locally-with-pieces-4290</guid>
      <description>&lt;p&gt;The demand for local, secure, and efficient machine learning solutions is more prominent than ever, especially for software developers working with sensitive code. At Pieces for Developers, we understand the importance of leveraging &lt;a href="https://code.pieces.app/blog/local-large-language-models-lllms-and-copilot-integrations" rel="noopener noreferrer"&gt;Local Large Language Models (LLLMs)&lt;/a&gt; not just for the enhanced privacy and security they offer, but also for their &lt;a href="https://code.pieces.app/blog/how-developers-are-using-offline-ai-tools-for-air-gapped-security" rel="noopener noreferrer"&gt;offline AI&lt;/a&gt; capabilities. Our commitment to a local-first philosophy has led us to support CPU and GPU versions of popular LLMs like Mistral, Phi-2, and Llama 2, with more on the way.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.pieces.app/product-highlights-and-benefits/privacy-security-data" rel="noopener noreferrer"&gt;Our focus on privacy and security&lt;/a&gt; goes beyond the ability to leverage LLLMs within &lt;a href="https://code.pieces.app/blog/introducing-pieces-copilot" rel="noopener noreferrer"&gt;Pieces Copilot&lt;/a&gt;. In order to ensure all of your data remains on your local machine rather than transmitting it over the internet, we’ve meticulously fine-tuned several &lt;a href="https://code.pieces.app/blog/small-language-models-outshine-large-language-models-enterprise-users" rel="noopener noreferrer"&gt;small language models&lt;/a&gt; so that anytime you &lt;a href="https://code.pieces.app/blog/how-to-store-code-snippets-and-10x-your-developer-productivity" rel="noopener noreferrer"&gt;store code snippets&lt;/a&gt; in Pieces, your code can be automatically enriched with titles, descriptions, tags, related links, and other useful context and metadata, without needing to connect to the cloud.&lt;/p&gt;

&lt;p&gt;While small language models can be supported on almost any machine, running LLMs locally that have 7 billion parameters or more can bring challenges, especially considering the hardware requirements. We still consider our &lt;a href="https://code.pieces.app/blog/the-importance-of-on-device-ai-for-developer-productivity" rel="noopener noreferrer"&gt;on-device AI&lt;/a&gt; “experimental,” since this implementation and the availability of these models are so new that we can’t always guarantee success for our users. This guide aims to demystify how to run an LLM locally within Pieces, outlining the minimum and recommended machine specs, the best GPUs for local LLMs, and how to troubleshoot common issues when using Pieces Copilot in your development workflow.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding Local LLM Hardware Requirements
&lt;/h2&gt;

&lt;p&gt;Navigating the hardware requirements to run an LLM locally is crucial for developers seeking privacy, security, and the convenience of offline access. This section discusses the essential hardware components and configurations needed for efficiently running large language models locally, and how to check your machine’s specs to determine what it can handle.&lt;/p&gt;

&lt;h3&gt;
  
  
  Minimum and Recommended Specifications
&lt;/h3&gt;

&lt;p&gt;Local LLMs, while offering unparalleled privacy and offline access, require significant machine resources. For an optimal experience, we recommend using machines from 2021 or newer with a dedicated GPU boasting more than 6–7GB of available GPU RAM (VRAM). Older machines or those without a dedicated GPU may need to opt for CPU versions, which, while efficient, may not offer the same performance level.&lt;/p&gt;

&lt;p&gt;If you see the error message “&lt;em&gt;I’m sorry, something went wrong with processing…&lt;/em&gt;” or the Pieces application crashes while you are using a local model, you may be running a model that requires more resources than your machine has available, or you may have selected a GPU model without having a dedicated GPU; in either case, you will need to switch to a different model. Unfortunately, if you have an older machine or one with very limited resources, you may have to use a cloud model in order to use the copilot.&lt;/p&gt;

&lt;h3&gt;
  
  
  GPU vs CPU
&lt;/h3&gt;

&lt;p&gt;Central Processing Units (CPUs) and Graphics Processing Units (GPUs) serve distinct functions within computing systems, often collaborating for optimal performance. CPUs, the general-purpose processors, are key for executing a broad range of tasks, excelling in sequential processing with their limited number of cores. They're crucial for running operating systems, applications, and managing system-level tasks.&lt;/p&gt;

&lt;p&gt;Conversely, GPUs specialize in accelerating graphics and data-heavy tasks. Originating in video game graphics, GPUs now handle tasks requiring parallel processing like video editing, scientific simulations, and machine learning, thanks to their thousands of cores. While most computers have both CPUs for general tasks and GPUs for graphics-intensive work, some may only include integrated graphics within the CPU for basic tasks, opting for compact designs over dedicated GPUs.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Best GPUs for Local LLMs
&lt;/h3&gt;

&lt;p&gt;When it comes to running local LLMs, the GPU plays a pivotal role. Dedicated GPUs with high VRAM are preferable, as they can significantly speed up computations required by these models. NVIDIA's GeForce RTX series and AMD's Radeon RX series are excellent choices, offering a balance between performance and power efficiency.&lt;/p&gt;

&lt;p&gt;When it comes to Apple products, the new M-series machines do not use dedicated GPUs, but the integrated GPUs they have are more than sufficient to run local LLMs.&lt;/p&gt;

&lt;h3&gt;
  
  
  Checking Your Machine Specs
&lt;/h3&gt;

&lt;p&gt;In order to select the best on-device LLM for you, you should first check your machine specifications.&lt;/p&gt;

&lt;h4&gt;
  
  
  Windows
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Right-click on the taskbar and select "Task Manager".&lt;/li&gt;
&lt;li&gt;In the Task Manager window, go to the "Performance" tab.&lt;/li&gt;
&lt;li&gt;Here you can see your CPU and GPU details. Click on "GPU" to see GPU information.&lt;/li&gt;
&lt;li&gt;To see detailed GPU information including VRAM, click on "GPU 0" or your GPU's name.&lt;/li&gt;
&lt;/ol&gt;

&lt;h4&gt;
  
  
  Mac
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Click on the Apple logo in the top-left corner of the screen.&lt;/li&gt;
&lt;li&gt;Select "About This Mac".&lt;/li&gt;
&lt;li&gt;In the window that opens, you can see if your machine runs an Intel chip or an Apple Silicon M-series chip.&lt;/li&gt;
&lt;/ol&gt;

&lt;h4&gt;
  
  
  Linux
&lt;/h4&gt;

&lt;ol&gt;
&lt;li&gt;Open a terminal window.&lt;/li&gt;
&lt;li&gt;For CPU information, run the &lt;code&gt;lscpu&lt;/code&gt; command.&lt;/li&gt;
&lt;li&gt;For GPU information, run a command like &lt;code&gt;lspci | grep -i vga&lt;/code&gt; to list GPU devices.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If these commands aren’t available on your distribution, you may need to consult online forums for other options for your Linux machine.&lt;/p&gt;

&lt;h4&gt;
  
  
  Recommendations
&lt;/h4&gt;

&lt;p&gt;If you’re interested in updating your machine to more efficiently run LLMs locally, you would generally want a GPU with a large amount of &lt;a href="https://www.techtarget.com/searchstorage/definition/video-RAM" rel="noopener noreferrer"&gt;VRAM&lt;/a&gt; (Video Random Access Memory) to handle the memory-intensive operations involved in processing these models. GPUs with higher &lt;a href="https://www.wevolver.com/article/understanding-nvidia-cuda-cores-a-comprehensive-guide" rel="noopener noreferrer"&gt;CUDA core counts&lt;/a&gt; and memory bandwidth can also contribute to faster computation.&lt;/p&gt;

&lt;p&gt;In order to ensure your system can handle hefty local LLM hardware requirements, we recommend you double check the available RAM and VRAM based on these specifications:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Llama 2 7B&lt;/strong&gt;, a model trained by Meta AI and optimized for general tasks. Requires a minimum of 5.6GB of RAM for the CPU model and 5.6GB of VRAM for the GPU-accelerated model.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mistral 7B&lt;/strong&gt;, a dense Transformer, fast to deploy and fine-tuned on code datasets. Small, yet powerful for a variety of use cases. Requires a minimum of 6GB of RAM for the CPU model and 6GB of VRAM for the GPU-accelerated model.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Phi-2 2.7B&lt;/strong&gt;, a small language model that demonstrates outstanding reasoning and language understanding capabilities. Requires a minimum of 3.1GB of RAM for the CPU model and 3.1GB of VRAM for the GPU-accelerated model.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Note that as we continue adding support for larger models, such as the 13B version of Llama 2, these local LLM hardware requirements will change.&lt;/p&gt;

&lt;h2&gt;
  
  
  Performance and Troubleshooting
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Deciding Which Local Model To Use
&lt;/h3&gt;

&lt;p&gt;In a previous article, we discussed the &lt;a href="https://code.pieces.app/blog/best-llm-for-coding-cloud-vs-local" rel="noopener noreferrer"&gt;best LLMs for coding&lt;/a&gt;, whether that be cloud vs local LLMs. If you’ve decided you want to stick with a local model within Pieces for increased security and offline capabilities, then you’ll want to first choose which one to use before downloading.&lt;/p&gt;

&lt;p&gt;You will see that all of our local models have a CPU and GPU option. Now that you know a little more about your machine and GPU vs CPU, you can use the chart below to decide whether to use a GPU or a CPU version of a model.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fflowmap_29d62bf9a3d586d77497181d8a605234.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fflowmap_29d62bf9a3d586d77497181d8a605234.jpg" alt="A flowchart showing users whether CPU or GPU models are better for their machines."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you’ve decided on a GPU or CPU version, the choice between specific models is largely a matter of preference, as models tend to excel in different domains (general QA, science, math, coding, etc.).&lt;/p&gt;

&lt;p&gt;We at Pieces like to use a variety of models, to compare the answers we get based on each model’s knowledge base and training, and for different purposes: for example, the Pieces team member writing this article likes Phi-2 for its lightweight speed, but Mistral for its quality of answers. We’ve included some links to model evaluations below, though new information is being released almost daily.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://mistral.ai/news/announcing-mistral-7b/" rel="noopener noreferrer"&gt;Mistral announcing their new 7B model&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.microsoft.com/en-us/research/blog/phi-2-the-surprising-power-of-small-language-models/" rel="noopener noreferrer"&gt;Microsoft announcing their new Phi-2 model&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard" rel="noopener noreferrer"&gt;HuggingFace Chatbot Arena Leaderboard&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Troubleshooting Common Issues
&lt;/h3&gt;

&lt;p&gt;Encountering crashes or performance issues can often be attributed to exceeding your machine's resource capabilities. Checking your system specifications and comparing them against the requirements of your chosen LLM model is a good first step (we outlined how earlier in this article). &lt;/p&gt;

&lt;p&gt;Another common issue on Linux and Windows is a corrupted or outdated Vulkan API, which we use to communicate with your GPU. Vulkan should be bundled with your AMD or NVIDIA drivers; you can check its health by running &lt;code&gt;vulkaninfo&lt;/code&gt; in your terminal and scanning the output for errors or warnings. These could indicate that your GPU drivers need to be updated or that there is an issue with the API itself. Please &lt;a href="https://docs.pieces.app/support" rel="noopener noreferrer"&gt;contact the Pieces support team&lt;/a&gt; if you believe your Vulkan installation is broken.&lt;/p&gt;

&lt;h2&gt;
  
  
  Future-Proofing Your Setup
&lt;/h2&gt;

&lt;p&gt;As technology advances, so do the requirements for running sophisticated models like LLMs. Upgrading to a system with a high-performance GPU and ample RAM can ensure your setup remains capable of handling newer, more demanding large language models locally. Additionally, staying informed about emerging hardware trends can help you make informed decisions about future upgrades.&lt;/p&gt;

&lt;p&gt;While we aimed to make this a complete guide to running local LLM models, things are evolving quickly and you may need to reference current research in order to understand how to run LLM models locally based on your hardware and memory requirements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Join the Discussion
&lt;/h2&gt;

&lt;p&gt;We hope this guide has shed some light on how to run an LLM locally efficiently and effectively within Pieces. We encourage our users to &lt;a href="https://github.com/pieces-app/support/discussions/126" rel="noopener noreferrer"&gt;join the discussion on GitHub&lt;/a&gt;, share their experiences, and provide feedback. Your input is invaluable as we continue to refine our support for running LLMs locally, ensuring a seamless and productive experience for all our users.&lt;/p&gt;

&lt;p&gt;If you’re just &lt;a href="https://code.pieces.app/whitepapers/getting-started-with-large-language-models-llms" rel="noopener noreferrer"&gt;getting started with Large Language Models (LLMs)&lt;/a&gt;, check out our whitepaper in the corresponding link.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>llm</category>
    </item>
    <item>
      <title>Best AI Tools for Students Learning Development and Engineering</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Mon, 18 Mar 2024 16:05:20 +0000</pubDate>
      <link>https://forem.com/getpieces/best-ai-tools-for-students-learning-development-and-engineering-hn4</link>
      <guid>https://forem.com/getpieces/best-ai-tools-for-students-learning-development-and-engineering-hn4</guid>
      <description>&lt;p&gt;Our world is rapidly changing, and AI is a big part of that change. Students in development and engineering (and developers already in their careers) need to become proficient in AI tools. Perhaps not experts, but at least enough to understand how AI might or might not apply to a project.&lt;/p&gt;

&lt;p&gt;These best AI tools for students can enhance your learning process and productivity, either in school or on the job. The first section describes each tool, and the second section describes how they could fit together into a streamlined workflow. The third section discusses the factors that affect your choice of AI tools.&lt;/p&gt;

&lt;h2&gt;
  
  
  10 Useful AI Tools for Students (and Developer Advancement)
&lt;/h2&gt;

&lt;p&gt;These ten free AI tools for students are actually free for everyone. They provide a solid foundation to work on a wide range of artificial intelligence (AI) and machine learning projects. By exploring these tools and their documentation, you can gain hands-on experience, develop your skills, and tackle real-world challenges. They are the best AI tools for students and for developers who want to open new job opportunities or &lt;a href="https://code.pieces.app/blog/code-snippets-coding-interview-prep"&gt;prepare for coding interviews&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The AI tools used by students, such as toolkits, notebooks, libraries, and frameworks, are the same tools used by developers in their careers. Each of the top AI tools for students plays a different but complementary role in software development. Their meanings sometimes overlap, and the specific roles may vary depending on the context and technology used.&lt;/p&gt;

&lt;p&gt;Imagine you are an architect designing houses in a new subdivision. &lt;strong&gt;Frameworks&lt;/strong&gt; are the pre-defined models, like a bungalow or a two-story, that provide a basic structure and guide your planning. You use a &lt;strong&gt;notebook&lt;/strong&gt; to create your blueprints and sketches, where you experiment and plan the house.&lt;/p&gt;

&lt;p&gt;There are two types of components you can use. &lt;strong&gt;Toolkits&lt;/strong&gt; provide prefabricated components such as walls, doors, and windows. They speed up construction, but they may limit design freedom. In contrast, &lt;strong&gt;Libraries&lt;/strong&gt; provide individual building blocks like bricks, pipes, and electrical components. You can use them independently or combine them into larger components.&lt;/p&gt;

&lt;p&gt;Which label applies to a tool sometimes depends on what you do with it. For example, &lt;a href="https://pytorch.org/"&gt;PyTorch&lt;/a&gt; or &lt;a href="https://www.tensorflow.org/"&gt;TensorFlow&lt;/a&gt; can be called a library, a toolkit, or a machine-learning framework.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Jupyter Notebook:&lt;/strong&gt; Jupyter Notebook allows you to create documents called &lt;em&gt;notebooks&lt;/em&gt; that combine code, text, and visualizations in a single interface. It supports multiple programming languages, including Python, R, and Julia. Jupyter Notebook is widely used in data analysis and exploration, machine-learning prototyping, and educational settings. It promotes reproducibility by capturing code, its output, and your textual explanations in a single file that can be shared with others.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;TensorFlow:&lt;/strong&gt; TensorFlow is an open-source machine-learning &lt;em&gt;framework&lt;/em&gt; developed by Google. It provides a wide range of tools and libraries for building, training, and &lt;a href="https://code.pieces.app/blog/the-ultimate-guide-to-ml-model-deployment"&gt;deploying various types of machine learning models&lt;/a&gt;, with a focus on deep learning. TensorFlow offers both high-level and low-level APIs, allowing users to choose between ease of use and flexibility. It supports distributed computing, enabling efficient training on multiple machines or GPUs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;PyTorch:&lt;/strong&gt; PyTorch is an open-source machine-learning &lt;em&gt;framework&lt;/em&gt; primarily developed by Facebook’s AI Research lab. It is known for its dynamic computational graph, which provides flexibility in model development and debugging. PyTorch has gained popularity in the research community due to its simplicity, strong support for neural network architectures, and its ability to seamlessly integrate with Python libraries.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scikit-learn:&lt;/strong&gt; Scikit-learn is a Python &lt;em&gt;library&lt;/em&gt; that provides a robust set of tools for machine learning and data mining. It offers a wide variety of AI algorithms for classification, regression, clustering, and dimensionality reduction, along with utilities for data preprocessing and evaluation. Scikit-learn is designed with a consistent API, making it easy to experiment with different models and compare their performance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Git:&lt;/strong&gt; Git is a &lt;em&gt;distributed version control system&lt;/em&gt; that tracks changes to files and directories over time. It allows multiple developers to collaborate on a project, merging their changes efficiently. Git provides features such as branching and merging, which enable developers to work on separate features or experiment with code without affecting the main codebase. Platforms like GitHub and GitLab host Git repositories and provide additional features like issue tracking and pull requests.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Docker:&lt;/strong&gt; Docker is an open-source containerization platform that allows you to &lt;em&gt;package applications&lt;/em&gt; and their dependencies into lightweight, isolated containers. Containers provide a consistent and reproducible environment, ensuring that your code runs the same way across different systems. Docker allows you to define the dependencies and configurations of your application in a Dockerfile, making it easy to share and deploy your code on different machines or in the cloud.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OpenAI Gym:&lt;/strong&gt; OpenAI Gym is a popular &lt;em&gt;toolkit&lt;/em&gt; for developing and benchmarking reinforcement-learning algorithms. It provides a collection of environments, ranging from simple text-based games to complex control and robotics tasks. OpenAI Gym offers a simple and unified API to interact with these environments, making it easier to develop and compare different reinforcement-learning algorithms. It also includes evaluation metrics and tools for visualizing agent performance.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pieces for Developers:&lt;/strong&gt; Pieces is an AI-powered software tool designed to assist developers throughout their workflow. Its &lt;a href="https://code.pieces.app/blog/navigating-the-future-with-ai-copilots-a-comprehensive-guide"&gt;AI copilot&lt;/a&gt; offers real-time suggestions and AI assistance, ranging from code generation and debugging to exploring different approaches and improving code quality. Its plugins for browsers, IDEs, Obsidian, and other software tools access the same desktop repository. Consequently, Pieces stays context-aware and &lt;a href="https://code.pieces.app/blog/introducing-persisted-copilot-chats"&gt;persists your conversation across all contexts with integrated AI&lt;/a&gt;. Its suggestions include relevant code snippets, code refactoring, alternative approaches, identifying potential errors, and providing explanations for complex concepts. Even if Pieces was treated only as a closed-source or &lt;a href="https://code.pieces.app/blog/top-5-open-source-ai-chatbots-for-developers"&gt;open-source AI chatbot&lt;/a&gt; for students to help write code, its ability to add context to your conversations and leverage many of the &lt;a href="https://code.pieces.app/blog/best-llm-for-coding-cloud-vs-local"&gt;best LLMs for coding&lt;/a&gt; would be very helpful.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gradio:&lt;/strong&gt; Gradio provides a rapid-prototyping visual interface for building interactive web demos for machine learning models. It is easy to share demos with instructors and peers, fostering collaboration and knowledge sharing within the classroom or project teams to facilitate communication and instant feedback exchange.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hugging Face:&lt;/strong&gt; Hugging Face is a community-driven online platform with a large library of pre-trained transformers and powerful deep-learning models for natural language processing (NLP) tasks. Like the other tools in this list, its resources, such as NLP tools, datasets, and tutorials, are freely available for individual use. These include exploring NLP tasks such as text classification, sentiment analysis, summarizing information, and question answering.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Workflow for an Example Project
&lt;/h2&gt;

&lt;p&gt;I asked the Gemini LLM to define a streamlined workflow that combined these best free AI tools for students. The suggested workflow includes Jupyter Notebook, PyTorch, TensorFlow, Git, Scikit-learn, Docker, OpenAI Gym, and Pieces for Developers. Gradio and Hugging Face could also be included.&lt;/p&gt;

&lt;h3&gt;
  
  
  Project Setup:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Version Control:&lt;/strong&gt; Use Git to initialize a repository for your project. This allows you to track changes, collaborate with others, and revert to previous versions if needed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Environment  Management:&lt;/strong&gt; Consider using Docker to create isolated environments for your project. This ensures consistent dependencies and avoids conflicts across different machines. Define Dockerfiles specifying the necessary libraries (PyTorch, TensorFlow, Scikit-learn, OpenAI Gym) and their versions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Experimentation &amp;amp; Documentation:&lt;/strong&gt; Use &lt;a href="https://docs.pieces.app/extensions-plugins/jupyterlab"&gt;Jupyter Notebook with the Pieces plugin&lt;/a&gt; as your primary development environment. It allows you to write, execute, and visualize both hand-written and AI-generated code interactively, and keep track of your results in a clear and organized manner. You can use it to explore ideas and document your work with rich markdown cells. For example, you can rapidly test and prototype functionalities with TensorFlow or PyTorch, and you can visualize and analyze data generated by your agent’s interactions with the Gym environment. It also puts the full power of Pieces at your command as your personal coding assistant.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Development Workflow:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Data Preprocessing &amp;amp; Analysis:&lt;/strong&gt; Use Scikit-learn for data preprocessing tasks like cleaning, scaling, and feature engineering. You can also use it for exploratory data analysis and model evaluation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Model Development &amp;amp; Training:&lt;/strong&gt; Import and use these libraries within your Jupyter Notebook cells to define your model architecture, train it on Gym data, and evaluate its performance. Both offer flexible and powerful tools for neural network creation and optimization. As the core libraries for building your reinforcement learning model, PyTorch might be preferred for its dynamic computational graph and ease of use, while TensorFlow offers scalability and production-ready AI features. If your project involves reinforcement learning, leverage OpenAI Gym to create and interact with simulated environments. It provides various environments for testing and training your reinforcement learning agents.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Version Control &amp;amp; Collaboration:&lt;/strong&gt; After making changes in your notebook or code, commit them to your Git repository regularly. This allows you to track progress, revert to previous versions, and collaborate with others. Your changes are also captured in Pieces, which keeps a history of your workflow so you can revisit earlier work, and which can suggest who to ask about the code they provided.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Continuous Integration &amp;amp; Deployment:&lt;/strong&gt; Consider setting up continuous integration and deployment pipelines to automate testing, building, and deployment of your project. This ensures consistency and streamlines the development process.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  General Tips:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Start small and modular:&lt;/strong&gt; Break down your project into smaller, manageable tasks and modules. This makes the development process more manageable and easier to debug. You can ask Pieces to troubleshoot your code and debug it for you.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Utilize community resources:&lt;/strong&gt; Take advantage of online communities, forums, and tutorials for each tool and library. They offer valuable support and online learning opportunities. Pieces is open to questions in your IDE (such as Jupyter), your browser, and on the desktop so you can get answers and ask it about research while you continue coding, providing &lt;a href="https://code.pieces.app/blog/workflow-integration-with-ai-a-unified-approach-to-development"&gt;workflow integration&lt;/a&gt; across your entire toolchain.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Document your work:&lt;/strong&gt; Use Jupyter notebooks to document your thought processes, code snippets, and results. This will be helpful for you and your collaborators in the future. You can use Pieces’ &lt;a href="https://code.pieces.app/blog/the-importance-of-on-device-ai-for-developer-productivity"&gt;on-device AI&lt;/a&gt; to document your code and add your annotations to its enrichment of the code with its automatic explanations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Test and iterate:&lt;/strong&gt; Regularly test your code and models to identify and fix issues early on. Be prepared to iterate and adapt your approach based on your findings. Pieces records your workflow and can resurface the materials you were using that relate to the code.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Remember, this is just a general guideline, and the specific way you combine these tools will depend on your project’s unique requirements and goals.&lt;/p&gt;

&lt;h2&gt;
  
  
  Factors to Consider when Choosing
&lt;/h2&gt;

&lt;p&gt;Choosing the right AI tools for student use depends on several factors. Here are some key considerations to keep in mind when selecting student AI tools for your projects:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Project Requirements:&lt;/strong&gt; Consider the specific requirements of your project. What are you trying to accomplish? Are you working on a machine-learning task, data analysis, natural language processing, or computer vision? Different AI tools for students specialize in different areas, so choose the ones that align with your project goals.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Learning Curve and Documentation:&lt;/strong&gt; Evaluate the learning curve associated with each tool. Consider how easy it is to get started and whether there are comprehensive documentation and tutorials available. Beginner-friendly tools with extensive community support can help you quickly grasp the concepts and start implementing your ideas.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Programming Language:&lt;/strong&gt; Consider the programming language you are comfortable with or wish to learn. Many AI tools for students are available in Python, which is widely used in the AI community. However, there are also tools available in other languages such as R, Julia, or C++. Choose tools that are compatible with your preferred programming language or those that align with your academic program's requirements when &lt;a href="https://code.pieces.app/blog/pieces-user-stories-learning-new-languages"&gt;learning new languages&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Community and Support:&lt;/strong&gt; Assess the size and activity of the community surrounding the AI tools. Larger communities tend to offer more resources, tutorials, and active forums for seeking help and guidance. Robust community support can be invaluable, especially when you encounter challenges or have specific questions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration and Compatibility:&lt;/strong&gt; Consider how well the AI tools integrate with other libraries and frameworks you may want to use. For example, if you are working with data analysis, check if the tool integrates well with NumPy, Pandas, or SciPy. Compatibility with other tools ensures smooth workflow and enables you to leverage the strengths of multiple libraries.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalability and Performance:&lt;/strong&gt; If you anticipate working on large-scale or computationally intensive projects, evaluate the scalability and performance of the AI-powered tool. Some frameworks offer distributed computing capabilities or support for GPUs, which can significantly speed up training and inference processes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Industry Relevance:&lt;/strong&gt; Consider the relevance of AI capabilities in industry applications and job market demand. Tools that are widely adopted in industry settings can give you valuable skills and enhance your employability. Staying updated with popular tools can also give you insights into current trends and advancements in the field.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Personal Interest and Future Goals:&lt;/strong&gt; Lastly, consider your personal interests and long-term goals. Explore &lt;a href="https://code.pieces.app/blog/future-ai-tools-going-from-unknown-to-unstoppable"&gt;future AI tools&lt;/a&gt; that align with your interests and career aspirations. If you have a specific area of AI you wish to specialize in, choose tools that are commonly used in that domain to gain relevant expertise.&lt;/li&gt;
&lt;/ul&gt;
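&lt;p&gt;To make the integration point above concrete, here is a minimal Python sketch (with made-up sample scores) of the kind of NumPy/Pandas interoperability worth checking for:&lt;/p&gt;

```python
import numpy as np
import pandas as pd

# A NumPy array of made-up exam scores drops straight into a DataFrame...
scores = np.array([[88, 92], [75, 81], [93, 85]])
df = pd.DataFrame(scores, columns=["midterm", "final"])

# ...and Pandas results flow back out as NumPy arrays for further math.
df["average"] = df.mean(axis=1)
averages = df["average"].to_numpy()
best = float(averages.max())
```

&lt;p&gt;When libraries share data structures this cleanly, you can mix and match their strengths without costly conversions.&lt;/p&gt;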

&lt;p&gt;Considering these factors helps you make informed decisions about which &lt;a href="https://code.pieces.app/blog/embracing-emerging-ai-technologies-imperative-large-corporations"&gt;emerging AI technologies&lt;/a&gt; to explore. However, only experimentation and hands-on experience with different tools will ultimately help you determine which of the different AI tools for students work best for you.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Many other tools could have been included among the best AI tools for engineering students, or for any student who writes code. I focused on Python as the common programming language, but you can input code into Pieces and have it translated into any of the 40+ languages it supports.&lt;/p&gt;

&lt;p&gt;Pieces’ ability to translate between languages is one of the main reasons I consider it the best free AI tool for students. Your lesson plans may have different requirements and may use different programming languages. Pieces is your personal intelligent tutor, able to move between languages and explain code in ways the professor or the textbook may not have covered, saving you time and making you more efficient.&lt;/p&gt;

&lt;p&gt;It's worth noting that while these AI tools for college students may have prerequisites for learning, they also provide extensive &lt;a href="https://docs.pieces.app/installation-getting-started/what-am-i-installing"&gt;documentation&lt;/a&gt;, tutorials, and resources to help beginners get started and learn the necessary concepts.&lt;/p&gt;

&lt;p&gt;These resources provide a solid foundation for beginners to understand and apply neural networks and deep learning concepts. They offer a mix of theoretical explanations, practical examples, and hands-on programming exercises to help you gain a deeper understanding of the subject.&lt;/p&gt;

&lt;p&gt;Exploring these resources will give you a strong starting point for your learning about AI tools. You can also reference our &lt;a href="https://code.pieces.app/blog/tips-for-software-engineering-students"&gt;tips for software engineering students&lt;/a&gt;, and explore how Kyle Goben uses Pieces in our &lt;a href="https://code.pieces.app/user-stories/university-student-user-stories-simplifying-coursework-internships-and-hackathons"&gt;University Student User Story&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>webdev</category>
      <category>beginners</category>
      <category>programming</category>
    </item>
    <item>
      <title>How to Use AI for Mobile App Development</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Wed, 13 Mar 2024 16:53:31 +0000</pubDate>
      <link>https://forem.com/getpieces/how-to-use-ai-for-mobile-app-development-ao9</link>
      <guid>https://forem.com/getpieces/how-to-use-ai-for-mobile-app-development-ao9</guid>
      <description>&lt;p&gt;Mobile developers are the creative minds and technical wizards behind the applications we use daily on our smartphones and tablets. In other words, they translate ideas into the mobile apps that control how we communicate, shop, play games, and access information on the go. Pieces, which is free to individuals, is the easy way for developers to use AI in mobile app development.&lt;/p&gt;

&lt;h2&gt;
  
  
  Four Ecosystems for AI Mobile App Development
&lt;/h2&gt;

&lt;p&gt;As mobile technology continues to evolve, the knowledge and skills required for AI mobile app development continue to change. The immediate effect of AI’s rapid rise in popularity is the expectation that its full functionality will be available in AI tools for creating mobile apps. Emerging trends like augmented reality, virtual reality, and foldable devices push developers to adapt and stretch the boundaries of their AI tools for mobile app design.&lt;/p&gt;

&lt;p&gt;Even the definition of a “mobile app developer” depends on the project. A mobile developer may be either front-end or full-stack for Android, iOS, or cross-platform with one codebase for both. There are also mobile game developers with different stacks who write C# or C++ for game engines.&lt;/p&gt;

&lt;p&gt;There are four mobile app ecosystems with different frameworks, rules, guidelines, libraries, and other tools. Pieces can assist with AI-driven mobile app creation in any of them through its browser and desktop modules.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A developer of native Android apps will probably use Android Studio and write code in Kotlin or Java.&lt;/li&gt;
&lt;li&gt;An iOS native-app developer will probably use Xcode and write in Swift or Objective-C.&lt;/li&gt;
&lt;li&gt;A developer who wants to maintain consistency with .Net will probably use Xamarin and write in C#.&lt;/li&gt;
&lt;li&gt;A developer writing for the open-source Web has less clearly defined tooling. For example, the developer might use Qt (pronounced “cute”), but its limited market share and lack of dedicated development tools and resources compared to Android and iOS make it less popular than other frameworks.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Choosing the Right Framework
&lt;/h2&gt;

&lt;p&gt;A cross-platform developer may use React Native, Flutter, Electron, Xamarin, or other frameworks (learn the difference between &lt;a href="https://code.pieces.app/blog/flutter-vs-react-native" rel="noopener noreferrer"&gt;Flutter vs React Native&lt;/a&gt;, and &lt;a href="https://code.pieces.app/blog/flutter-vs-electron-whats-the-difference" rel="noopener noreferrer"&gt;Flutter vs Electron&lt;/a&gt;). The five major frameworks (or Native IDEs) listed in the following table vary in several ways, including how much they are like native code. The most important factors in choosing a framework (or multiple frameworks) are discussed after the table.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage1_7b9111eba2997444cfd89f34342e85e0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage1_7b9111eba2997444cfd89f34342e85e0.png" alt="The five major app development frameworks."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are three very important considerations when choosing a framework or native IDE for mobile app development with AI:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Consider your target platform:&lt;/strong&gt; If you specifically need to target only Android or iOS, using the native framework (Kotlin/Java for Android, Swift for iOS) might be the preferred approach for optimal performance and access to native functionalities.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Evaluate your team's skills:&lt;/strong&gt; If your team is already comfortable with JavaScript and React, React Native could be a good choice. If you prefer a single codebase for both platforms and are open to learning Dart, Flutter could be a viable option (see how Pieces used &lt;a href="https://code.pieces.app/blog/dart-and-flutter-case-study" rel="noopener noreferrer"&gt;Dart and Flutter&lt;/a&gt; to build their cross-platform application). If your team has experience with .NET languages, Xamarin (or Microsoft’s replacement for it) might be a strong contender.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Project requirements:&lt;/strong&gt; If your app requires specific functionalities or needs deep integration with native features, using the native framework might be necessary. For simpler apps, a cross-platform framework could be a more efficient choice.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It's important to research each framework in detail, consider your specific needs and constraints, and explore the resources and communities available for each option before making a decision. Remember, there's no "one size fits all" answer, and the best framework for your mobile app project depends on various factors.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where Pieces Is Especially Useful
&lt;/h2&gt;

&lt;p&gt;The more variations in the types of code snippets used by a developer, the more Pieces for Developers can streamline the workflow. This especially includes moving between frameworks for different projects. You can relaunch onboarding in Pieces at any time to change your persona(s) and languages. This, along with &lt;a href="https://code.pieces.app/blog/how-to-store-code-snippets-and-10x-your-developer-productivity" rel="noopener noreferrer"&gt;storing various code snippets&lt;/a&gt; from your workflow, regrounds the AI in the languages and materials you actually use.&lt;/p&gt;

&lt;p&gt;There are four major stages in a developer’s mobile app workflow:&lt;/p&gt;

&lt;h3&gt;
  
  
  Collaborative Design
&lt;/h3&gt;

&lt;p&gt;This involves understanding user needs, translating them into app features and functionalities, and creating user interfaces (UI) that are intuitive and visually appealing. This often requires working closely with designers, product managers, and other stakeholders to understand the purpose, target audience, and desired features of the app. It also includes ensuring maintainability.&lt;/p&gt;

&lt;p&gt;Pieces’ advanced features, such as workflow history, support easy communication and &lt;a href="https://code.pieces.app/blog/top-5-code-collaboration-tools-for-remote-work" rel="noopener noreferrer"&gt;code collaboration&lt;/a&gt;. For example, if the code for a widget comes from someone else, the source, perhaps even the person's name, is associated with that widget’s code in the developer’s workflow. That information can be retrieved months later to ask a question.&lt;/p&gt;

&lt;h3&gt;
  
  
  Development
&lt;/h3&gt;

&lt;p&gt;Choosing the appropriate programming languages and frameworks to write the code. This involves building the user interface (UI) for the interactions, writing the back-end logic for functionality, and integrating various features like databases, APIs, and security measures.&lt;/p&gt;

&lt;p&gt;Pieces provides all the traditionally expected features of a copilot and more. It can easily translate code in a language for one framework into a different language for a different framework, as described in this &lt;a href="https://code.pieces.app/user-stories/mobile-app-development-user-stories-saving-time-with-code-reusability" rel="noopener noreferrer"&gt;mobile app development user story&lt;/a&gt;. It supports 40+ languages. It can also keep track of different versions of the same functional snippet in different languages, used in different UI versions, or for different projects.&lt;/p&gt;

&lt;h3&gt;
  
  
  Testing and Debugging
&lt;/h3&gt;

&lt;p&gt;Testing the app on various devices and operating systems to ensure it functions as intended, identifying and fixing any bugs or performance issues.&lt;/p&gt;

&lt;p&gt;Pieces can troubleshoot and debug code. It can also write unit tests, and even improve &lt;a href="https://code.pieces.app/blog/enhancing-ai-code-review-efficiency-with-retrieval-augmented-generation" rel="noopener noreferrer"&gt;AI code review&lt;/a&gt; efficiency.&lt;/p&gt;

&lt;h3&gt;
  
  
  Deployment and Maintenance
&lt;/h3&gt;

&lt;p&gt;Deploying the app to app stores like Google Play Store or Apple App Store, and continuing to monitor its performance, address user feedback, and release updates with new features or bug fixes.&lt;/p&gt;

&lt;p&gt;Pieces’ snippet enrichments and the developer’s annotations can be retrieved for any code written by the developer in the past. If a new developer is onboarded to the same repository after the previous developer leaves, the new developer has access to the stored workflow to ask the AI questions and get intelligent answers about the code.&lt;/p&gt;

&lt;p&gt;Also, Pieces provides different views for looking at code. The list view shows more &lt;a href="https://docs.pieces.app/features/auto-enrichment" rel="noopener noreferrer"&gt;auto-enrichment&lt;/a&gt; data and is great for shorter snippets. The gallery view, in contrast, suits longer snippets and makes it easy to move left and right between them.&lt;/p&gt;

&lt;p&gt;There is another view in Pieces called &lt;a href="https://docs.pieces.app/features/workflow-activity" rel="noopener noreferrer"&gt;workflow activity&lt;/a&gt;. You can filter and search through it to see anything you've referenced or opened, including any code snippets you've used. This works with the extensions as well. You can review any snippet you saved, along with when you saved it. Later, you can go back through your workflow and remember how you did something.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pieces Mobile Use Case Interview
&lt;/h2&gt;

&lt;p&gt;Dan O’Leary, an avid Pieces user, was working on an iOS application for tracking aviation fuel at the time of the interview; it is now his fourth app in the Apple App Store. His full-time job is flying as a pilot, so any tool in his stack has to make him more productive. He switches among frameworks depending on the mobile app project.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://code.pieces.app/user-stories/ios-app-development-user-stories-making-work-more-efficient" rel="noopener noreferrer"&gt;iOS app development user story&lt;/a&gt; he discussed in the interview is a recurring theme across his apps: “I seem to always find myself working with dates, date comparisons, whether I want to or not. I have another app on the store that's a running app. And so then that's not really dates, but it's more timers. Date comparisons and working with time is strangely complicated.”&lt;/p&gt;

&lt;p&gt;Before using Pieces, Dan used Snippets Lab for Mac. He said, “It was about half of the functionality or maybe a quarter of the functionality” of Pieces. His repository was “blocks of code that helped me sort dates—or things that I like to use to deal with sorting and comparing time. And so, you know, these are either live in a project or they live in a file that I can reference back because once I switch frameworks or I work on a different project, I'll forget again how to do that.&lt;/p&gt;

&lt;p&gt;Then I go back to falling into the trap of working with dates again – I've forgotten. So I need to go back and open up my toolbox and look for what I need to do to remember how to do that. Because I don't. I don't work on only that all the time. And when you're &lt;a href="https://code.pieces.app/blog/minimizing-the-cost-of-context-switching-for-developers" rel="noopener noreferrer"&gt;context switching&lt;/a&gt; like that, it's too hard to just memorize everything. I don't think it's reasonable to expect a developer to do that.&lt;/p&gt;

&lt;p&gt;So having some sort of a tool in your workflow that you can search, which is great with your app that's got this automatic tagging for the frameworks and what the code actually does. Super handy for me. It just speeds up what I'm looking for and knowing what I have – because maybe I don't have a block of code that does that. I won't waste my time looking for something that I don't have.”&lt;/p&gt;
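&lt;p&gt;Dan’s date-comparison pain point is easy to reproduce in any ecosystem. As an illustration (in Python rather than Swift, purely for brevity), one classic pitfall is mixing naive and timezone-aware values:&lt;/p&gt;

```python
from datetime import datetime, timedelta, timezone

# Subtracting a naive datetime from a timezone-aware one raises TypeError,
# one classic way in which "working with time is strangely complicated".
naive = datetime(2024, 3, 13, 12, 0)
aware = datetime(2024, 3, 13, 12, 0, tzinfo=timezone.utc)

try:
    aware - naive
    mixed_math_fails = False
except TypeError:
    mixed_math_fails = True

# The fix: attach a timezone first, then arithmetic and sorting are safe.
utc = naive.replace(tzinfo=timezone.utc)
delta = aware - utc                # zero: same instant
later = aware + timedelta(hours=2)
ordered = sorted([later, aware])   # earliest first
```

&lt;p&gt;Resurfacing a saved, known-good snippet for exactly this kind of recurring gotcha is the workflow Dan describes.&lt;/p&gt;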

&lt;h2&gt;
  
  
  What Else the User Said
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Conversational Interface:&lt;/strong&gt; “I played around with the copilot and I love that very familiar ChatGPT style. You know, kind of ask a question and I really like that it actually gave me an answer on how I was working on something. It spit out an answer that I needed, so that is very cool.”&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Searchable tags:&lt;/strong&gt; “I can't think of any other platform I've used that I've had my searchable tags for my code. … The fact that it auto-generates this stuff is, to me, that blew me away.”&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security and Privacy:&lt;/strong&gt;  “[I]t's so cool that it’s on-device … for people working in a security-sensitive or intellectual-property kind of environment, ... it's like, oh, that's very cool that you can't just send your code out to the world for storage or reference or anything.” Some people save API keys because they reuse them, and Pieces can identify that sensitive information: the actual key gets flagged with a small warning pop-up before you share it with someone or use it in your IDE.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Each of the mobile-device ecosystems includes many different devices with different screen sizes, pixel densities, and OS versions. The complexity these differences add to development, testing, and maintenance is a strong justification for including Pieces as a tool for &lt;a href="https://code.pieces.app/blog/workflow-integration-with-ai-a-unified-approach-to-development" rel="noopener noreferrer"&gt;workflow integration with AI&lt;/a&gt;. Its intelligent management of snippets identifies and organizes which snippets are relevant to what devices and when and where they were used.&lt;/p&gt;

&lt;p&gt;The following quote from a user confirms the statements made in this post. Pieces’ advanced features give it a special position among the numerous AI tools for mobile app development.&lt;/p&gt;

&lt;p&gt;“Pieces is the most useful piece of AI, machine learning, you know, whatever you want to call it, to me so far… because one thing in researching AI, as I've been doing the last few months, is I want to use this to make my work more efficient, … I guess my answer is, what I've seen you guys do is the best integration to my daily life that I have yet to see from any sort of machine learning and artificial intelligence [tool]—to my software development life.”&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>mobile</category>
      <category>flutter</category>
      <category>ai</category>
    </item>
    <item>
      <title>Navigating the Future with AI Copilots: A Comprehensive Guide</title>
      <dc:creator>Pieces 🌟</dc:creator>
      <pubDate>Wed, 13 Mar 2024 16:00:31 +0000</pubDate>
      <link>https://forem.com/getpieces/navigating-the-future-with-ai-copilots-a-comprehensive-guide-ob7</link>
      <guid>https://forem.com/getpieces/navigating-the-future-with-ai-copilots-a-comprehensive-guide-ob7</guid>
      <description>&lt;p&gt;AI copilots are popping up everywhere, changing how we work. They do the boring stuff so we can focus on the big ideas.&lt;/p&gt;

&lt;p&gt;For an overworked developer, they are a long-awaited miracle. Work's got too much going on. Too many emails, too much data, too little time. It's tough to keep up. You're stuck doing tasks that feel like a waste of your skills.&lt;/p&gt;

&lt;p&gt;Having an AI copilot to handle the repetitive tasks and &lt;a href="https://code.pieces.app/blog/modern-enterprise-ai-solutions-for-software-development" rel="noopener noreferrer"&gt;help make better decisions faster&lt;/a&gt; lets you get more done and have time for creative work. With the recent AI explosion, it is no surprise that there are a ton of options out there. So, in today’s guide, we’re going to explore a few of the best AI copilots and how to use them to their full potential.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding AI Copilots
&lt;/h2&gt;

&lt;p&gt;In simple terms, an AI copilot is a supportive tool that utilizes artificial intelligence to augment human capabilities. It doesn’t take away your entire workload but rather helps you focus on things besides hammering out lines of code.&lt;/p&gt;

&lt;p&gt;The technology behind AI copilots is both fascinating and complex. At the core of most copilots is a suite of machine learning algorithms and models that enable them to process and understand large datasets. These models, particularly &lt;a href="https://code.pieces.app/whitepapers/getting-started-with-large-language-models-llms" rel="noopener noreferrer"&gt;Large Language Models (LLMs)&lt;/a&gt;, are trained on a wide array of data sources to recognize patterns, predict outcomes, and generate human-like responses.&lt;/p&gt;
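&lt;p&gt;As a toy sketch of the core framing behind those models (not how any production LLM is implemented), next-token prediction can be reduced to counting which token tends to follow which in training text:&lt;/p&gt;

```python
from collections import Counter, defaultdict

# Toy corpus; a real model trains on vastly more data.
corpus = "the cat sat on the mat and the cat ran".split()

# Count which word follows which.
successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def predict_next(word):
    # Predict the most frequently observed successor of `word`.
    return successors[word].most_common(1)[0][0]
```

&lt;p&gt;Real LLMs replace these raw counts with billions of learned parameters, which is what lets them generalize to patterns they have never seen verbatim.&lt;/p&gt;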

&lt;p&gt;The impact of AI copilots is far-reaching, with a presence in sectors such as software development, healthcare, finance, and customer service. With companies like &lt;a href="https://openai.com/chatgpt" rel="noopener noreferrer"&gt;OpenAI&lt;/a&gt;, &lt;a href="https://blog.google/technology/ai/google-gemini-ai/" rel="noopener noreferrer"&gt;Google&lt;/a&gt;, and &lt;a href="https://www.microsoft.com/en-us/microsoft-copilot" rel="noopener noreferrer"&gt;Microsoft&lt;/a&gt; all racing to develop the best possible solutions, we can enjoy the benefits of this rapidly improving technology for a wide range of use cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Benefits of Using AI Copilots
&lt;/h2&gt;

&lt;p&gt;The benefits of using an AI coding copilot are substantial, particularly when it comes to software development. There are a lot of areas in coding that take up valuable time. Things like typing out boilerplate, templates, and frequently used code blocks — all take away from the part of software development that requires us to use our brains. A good copilot AI tool can help out in all of these areas.&lt;/p&gt;

&lt;h3&gt;
  
  
  Increased Developer Productivity
&lt;/h3&gt;

&lt;p&gt;One of the primary benefits of an AI-powered copilot is its ability to significantly boost efficiency and productivity.&lt;/p&gt;

&lt;h3&gt;
  
  
  Enhanced Accuracy and Decision-making
&lt;/h3&gt;

&lt;p&gt;Another key advantage is the enhancement of accuracy and decision-making. AI copilots are capable of suggesting entire blocks of code and solutions to problems that may arise, which can help reduce errors and improve the quality of the final product.&lt;/p&gt;

&lt;h3&gt;
  
  
  Customization and Learning from User Interactions
&lt;/h3&gt;

&lt;p&gt;A modern copilot AI tool is designed to learn from user interactions and adapt over time, offering personalized support that becomes more accurate and helpful with each use. For the longest time, LLMs were only good at making semi-usable suggestions. But with the increased innovation, evidenced by recent ChatGPT claims to &lt;a href="https://www.wired.com/story/chatgpt-memory-openai/" rel="noopener noreferrer"&gt;remember user conversations&lt;/a&gt;, we can offload some of our brain-power to an AI coding assistant copilot.&lt;/p&gt;

&lt;h3&gt;
  
  
  Reduction in Repetitive Task Load
&lt;/h3&gt;

&lt;p&gt;As developers, we find that typing the same code over and over takes the wind out of our sails. Besides making the job boring, repetitive tasks drain our energy away from more important areas like creativity and strategic thinking. Copilot-assisted coding reduces this repetitive task load, and tools like Pieces can even help you &lt;a href="https://code.pieces.app/blog/how-to-store-code-snippets-and-10x-your-developer-productivity" rel="noopener noreferrer"&gt;store code snippets&lt;/a&gt; you frequently reuse, so you don’t have to go back and look for that perfect snippet you researched a month ago.&lt;/p&gt;

&lt;p&gt;AI copilots take on the burden of mundane tasks, freeing up developers to focus on more complex, creative, and strategic aspects of their projects. This shift in focus can lead to more innovative solutions and advancements in the field.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best AI Copilots on the Market
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Pieces
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage1_fb98c0d4ba6c255e179739bc96683c62.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage1_fb98c0d4ba6c255e179739bc96683c62.jpg" alt="The Pieces for Developers VS Code extension."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://pieces.app/" rel="noopener noreferrer"&gt;Pieces&lt;/a&gt; is one of the more powerful AI copilots out there that enhances developer productivity by offering personalized workflow assistance. It's particularly good at capturing and enriching materials, streamlining collaboration, and solving complex problems through a thorough understanding of your unique workflow.&lt;/p&gt;

&lt;p&gt;The great thing about Pieces is that it operates at the system level, &lt;a href="https://code.pieces.app/blog/workflow-integration-with-ai-a-unified-approach-to-development" rel="noopener noreferrer"&gt;integrating your workflow&lt;/a&gt; across the browser, IDE and collaboration software through &lt;a href="https://code.pieces.app/blog/retrieval-augmented-generation-for-curation" rel="noopener noreferrer"&gt;retrieval augmented generation&lt;/a&gt;, which allows it to make highly contextual suggestions based on the things you're referencing and interacting with. It's available on macOS, Windows, and even Linux.&lt;/p&gt;

&lt;p&gt;Pieces is a huge leg up for anyone looking to &lt;a href="https://code.pieces.app/blog/how-to-measure-developer-productivity-a-complete-guide" rel="noopener noreferrer"&gt;increase developer productivity&lt;/a&gt; on their team. But besides easing developer fatigue, security is also a major advantage. Unlike strictly cloud-based solutions, data security is a top priority with Pieces. Your information stays on your machine, and all large and &lt;a href="https://code.pieces.app/blog/small-language-models-outshine-large-language-models-enterprise-users" rel="noopener noreferrer"&gt;small language models&lt;/a&gt; run on-device, ensuring that your work remains private and secure. And for those times when &lt;a href="https://code.pieces.app/blog/how-developers-are-using-offline-ai-tools-for-air-gapped-security" rel="noopener noreferrer"&gt;offline AI&lt;/a&gt; isn't an option, cloud capabilities are available as an opt-in feature.&lt;/p&gt;

&lt;p&gt;There are more than a handful of ways to implement Pieces. You can find an extension or plugin for everything from IDEs like VS Code and JetBrains to knowledge management apps like Obsidian. Just locate the Pieces extension or plugin you need from the website, install it, and follow any onboarding instructions provided. The process is designed to be user-friendly, often requiring only a few clicks to get started.&lt;/p&gt;

&lt;h3&gt;
  
  
  GitHub Copilot
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage2_adbf6b66153f08460f28b1c8d7226743.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage2_adbf6b66153f08460f28b1c8d7226743.jpg" alt="The GitHub Copilot homepage."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you’ve been working in software development for any length of time, you have likely heard of this one. &lt;a href="https://github.com/features/copilot" rel="noopener noreferrer"&gt;GitHub Copilot&lt;/a&gt; is powered by OpenAI's Codex, a descendant of the GPT-3 family fine-tuned specifically for code completion. It learns coding patterns, styles, and best practices from the vast repositories of code available on GitHub, then uses context such as comments, function names, and the surrounding code to generate syntactically correct and logically fitting suggestions.&lt;/p&gt;

&lt;p&gt;Like Pieces, GitHub Copilot prides itself on easing the dreaded developer burnout that comes with spending hours on mundane code-monkey tasks. According to GitHub, developers who use Copilot find that it helps them code faster and more efficiently. For instance, in a survey conducted by GitHub, it was found that developers who use Copilot &lt;a href="https://github.blog/2022-09-07-research-quantifying-github-copilots-impact-on-developer-productivity-and-happiness/" rel="noopener noreferrer"&gt;can write code up to 55% faster&lt;/a&gt;, reducing the time spent searching for code snippets and solutions online.&lt;/p&gt;

&lt;p&gt;Learn what the &lt;a href="https://code.pieces.app/blog/best-free-and-paid-github-copilot-alternatives" rel="noopener noreferrer"&gt;best free and paid GitHub Copilot alternatives&lt;/a&gt; are in our latest post.&lt;/p&gt;

&lt;h3&gt;
  
  
  Microsoft Copilot
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage5_b929a18f6d51320f56e8ed18c76ab310.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage5_b929a18f6d51320f56e8ed18c76ab310.jpg" alt="The Microsoft Copilot homepage."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Given how much Microsoft has reshaped technology over the past few decades, it’s honestly surprising that we didn’t get a copilot AI assistant from them sooner. But now that it’s finally here, it does not disappoint.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://copilot.microsoft.com/" rel="noopener noreferrer"&gt;Microsoft Copilot&lt;/a&gt; is part of a broader vision to embed AI into the fabric of software development and beyond. While GitHub Copilot focuses specifically on code suggestions, Microsoft's Copilot aims to be a comprehensive assistant across various Microsoft products and services, including development tools.&lt;/p&gt;

&lt;h3&gt;
  
  
  CodeWP
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage4_9956c962d9a52630e86b29368bcb3f4c.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage4_9956c962d9a52630e86b29368bcb3f4c.jpg" alt="The CodeWP Homepage."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This one is a little different from the others in that it is targeted at a niche developer audience: WordPress developers. There are plenty of broad generative AI copilots out there, so having one tailored to a specific tech stack can come in handy. Since &lt;a href="https://w3techs.com/technologies/details/cm-wordpress" rel="noopener noreferrer"&gt;WordPress powers 43.1% of all websites&lt;/a&gt;, this specialized AI assistant is still useful to a broad audience.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://codewp.ai/" rel="noopener noreferrer"&gt;CodeWP&lt;/a&gt; can help you with all sorts of WordPress tasks such as generating PHP, JavaScript, and CSS code based on natural language descriptions. For WordPress developers, this means less time googling code snippets and more time focusing on creating custom, high-quality themes and plugins.&lt;/p&gt;

&lt;h3&gt;
  
  
  SQLAI.ai
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage3_41c0b76fcedda9301552ab978d8f3300.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fd37oebn0w9ir6a.cloudfront.net%2Faccount_32099%2Fimage3_41c0b76fcedda9301552ab978d8f3300.jpg" alt="The SQL AI homepage."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Database programming languages are a completely different beast compared to things like JavaScript, Ruby, or Python. But the fact is that almost every developer needs to work with a database at some point, even if it isn’t fun. Having an AI-powered copilot to help you out can be a massive timesaver.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.sqlai.ai/" rel="noopener noreferrer"&gt;SQLAI.ai&lt;/a&gt; focuses entirely on generating SQL queries for you. Unless you’re a SQL developer with a passion for the language who enjoys every second of typing out the perfect query, you’ll love this one.&lt;/p&gt;

&lt;p&gt;This AI code copilot helps in a few ways. First, you can describe what you want a SQL query to do in everyday language, which is much faster than looking up syntax and double-checking the documentation. You can also optimize or fix existing SQL with AI: paste your query into the box and watch the AI copilot work its magic in a few seconds.&lt;/p&gt;
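&lt;p&gt;For a feel of what that looks like in practice, here is a hand-written illustration (not actual SQLAI.ai output; the table and column names are invented) of the kind of natural-language-to-SQL translation such a tool performs.&lt;/p&gt;

```python
# Illustrative only: a plain-English request and the sort of SQL a
# natural-language-to-SQL assistant would generate for it.
description = "top 5 customers by total order value in 2023"

generated_sql = """
SELECT c.name, SUM(o.total) AS total_value
FROM customers AS c
JOIN orders AS o ON o.customer_id = c.id
WHERE o.ordered_at BETWEEN '2023-01-01' AND '2023-12-31'
GROUP BY c.name
ORDER BY total_value DESC
LIMIT 5;
"""

print(description)
print(generated_sql.strip())
```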

&lt;p&gt;You won’t find a vast extension marketplace like some of the other AI copilots, but the pricing and ease of use can’t be argued with. You can pick plans from $4 per month, or even less if you pay by the year. Another drawback is that this is primarily an AI copilot for the web. So, you won’t be able to download it or use it within your IDE.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Choose the Right AI Copilot for Your Needs
&lt;/h2&gt;

&lt;p&gt;When it comes to choosing an AI copilot, the decision boils down to understanding your needs, the unique offerings of each AI copilot, and how they align with your development goals. Here’s how you should approach this selection process.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Consider your industry focus and project requirements.&lt;/strong&gt; If you need something for a specific use case such as WordPress or SQL, then you could consider CodeWP or SQLAI.ai. On the other hand, if you need a more versatile solution, consider something like &lt;a href="https://code.pieces.app/blog/pieces-developers-github-copilot" rel="noopener noreferrer"&gt;Pieces for Developers or GitHub Copilot&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Evaluate the ease of integration and compatibility.&lt;/strong&gt; If you want to stick with your favorite IDE or software interface, make sure you pick an AI copilot that works across the web, your editor, and your other tools, not one locked to a single app.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Project complexity and tech stack.&lt;/strong&gt; If your project is simple and limited to one app, such as WordPress, then you might be fine with a more limited AI copilot like CodeWP. Conversely, if your project incorporates multiple programming languages and apps, you might want to consider something with more diverse capabilities.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Core functionality.&lt;/strong&gt; Some AI assistants, like SQLAI.ai, are deliberately narrow and only do one or two things well. Others, such as Microsoft Copilot, can handle a wide range of tasks across your entire operating system.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Developer support.&lt;/strong&gt; You’ll want to make sure you pick an AI coding assistant that has extensive documentation and a library of tutorials to help you get started.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost-effectiveness.&lt;/strong&gt; Many generative AI copilots are free, &lt;a href="https://docs.pieces.app/installation-getting-started/what-am-i-installing" rel="noopener noreferrer"&gt;such as Pieces&lt;/a&gt;. But other AI copilots charge a monthly subscription, like GitHub Copilot. Think about your budget when deciding which one is best for you.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Implementing AI Copilots in Your Workflow
&lt;/h2&gt;

&lt;p&gt;Integrating an AI assistant copilot into your daily tasks can significantly boost your team's productivity and streamline operations. Here is a quick step-by-step for integrating an AI-powered copilot into your workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Identify opportunities&lt;/strong&gt;. Look at your daily operations to find areas where an AI copilot can make a difference. This could be anything from data entry and &lt;a href="https://www.scheduler.ai/post/scheduler-ai-outpaces-chatgpt-and-google-bard-in-scheduling-capability" rel="noopener noreferrer"&gt;scheduling&lt;/a&gt; to more complex problem-solving tasks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Select the right tool.&lt;/strong&gt; Not all AI copilots are created equal. Choose one that fits well with your team's needs and the specific tasks you want to automate.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Start small.&lt;/strong&gt; Begin with a &lt;a href="https://getpieces.typeform.com/enterprise" rel="noopener noreferrer"&gt;pilot program&lt;/a&gt;. Implement the AI copilot in a small, controlled environment to gauge its effectiveness and identify any adjustments needed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Educate your team&lt;/strong&gt;. Make sure your team knows how to work with the AI. This might include training sessions or workshops to get everyone up to speed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Evaluate and scale&lt;/strong&gt;. Continuously assess the AI copilot’s impact on your workflow. If it proves beneficial, consider expanding its role within your organization.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  What about Enterprise AI Copilots?
&lt;/h3&gt;

&lt;p&gt;It seems every company out there is adding AI to their toolbox in some way or another. And this is all for a very good reason. AI is a game-changer when it comes to addressing the rapid pace of technological change and the growing complexity of software development.&lt;/p&gt;

&lt;p&gt;Enterprises face challenges such as &lt;a href="https://code.pieces.app/blog/lack-of-software-developers-what-to-do" rel="noopener noreferrer"&gt;high developer turnover&lt;/a&gt; and the need for &lt;a href="https://docs.pieces.app/use-cases/onboard-new-developers" rel="noopener noreferrer"&gt;faster onboarding&lt;/a&gt; and upskilling of new team members. AI copilots can help with these challenges by offering real-time assistance, automating routine tasks, and enabling more efficient knowledge transfer and collaboration among team members.&lt;/p&gt;

&lt;p&gt;As developers, we’re often told to consult the docs when we have a problem. But how often is &lt;a href="https://code.pieces.app/blog/art-of-writing-documentation-and-technical-content" rel="noopener noreferrer"&gt;documentation far longer and more convoluted&lt;/a&gt; than it needs to be? It isn’t the best part of anyone’s day to sift through pages of unfamiliar documents just to find the solution to a niche issue or bug. A good copilot AI tool can be a huge help here, reducing the cognitive load on developers and freeing them to focus on the more creative and strategic aspects of their work.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://code.pieces.app/blog/becoming-an-ai-powered-enterprise" rel="noopener noreferrer"&gt;Becoming an AI-powered enterprise&lt;/a&gt; isn’t all that difficult either, and the leap is often more than worth it.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to Build an AI Copilot
&lt;/h3&gt;

&lt;p&gt;Building your own free AI copilot is a great way to both learn about the underlying tech and customize it to your heart’s content. We’ll give you a quick rundown of how to do this with Pieces. Start by understanding the &lt;a href="https://github.com/pieces-app/pieces-copilot-vanilla-typescript-example" rel="noopener noreferrer"&gt;Pieces OS Client&lt;/a&gt;, which is essentially the heart of your custom copilot: a local database that comes with &lt;a href="https://code.pieces.app/blog/how-to-build-a-copilot-using-local-llms-with-pieces-client" rel="noopener noreferrer"&gt;built-in LLMs&lt;/a&gt; and the ability to set custom context for your conversations.&lt;/p&gt;

&lt;p&gt;When you're setting up, you'll want to choose an SDK that fits your language preference. Pieces has made this rather convenient by offering a variety of SDKs, including TypeScript, Python, Kotlin, and Dart. This means you can work in a language you're comfortable with and still leverage the powerful features of Pieces OS.&lt;/p&gt;

&lt;p&gt;Once you've downloaded the desired SDK, such as the &lt;a href="https://github.com/pieces-app/pieces-os-client-sdk-for-typescript" rel="noopener noreferrer"&gt;TypeScript SDK&lt;/a&gt;, and installed &lt;a href="https://docs.pieces.app/installation-getting-started/pieces-os" rel="noopener noreferrer"&gt;Pieces OS&lt;/a&gt; on your machine, you're ready to start crafting your copilot. The beauty of this arrangement is that you can start making requests to your copilot almost immediately, which is great for getting up and running quickly.&lt;/p&gt;
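&lt;p&gt;As a rough sketch of the moving parts (shown in Python for brevity rather than TypeScript; the local base URL and the question payload shape are assumptions made for illustration, not the documented Pieces OS API), a copilot front end really only needs to check that Pieces OS is up and then send it questions with optional context attached:&lt;/p&gt;

```python
import json
import urllib.request

# Assumed base URL for a locally running Pieces OS instance; the
# payload shape below is likewise an illustrative assumption.
PIECES_OS = "http://localhost:1000"

def health_url(base=PIECES_OS):
    # Health-check path used to see whether Pieces OS is reachable.
    return f"{base}/.well-known/health"

def build_question(text, context=None):
    # Assemble an illustrative question payload; attaching context
    # snippets is how answers get tailored without retraining a model.
    payload = {"question": text}
    if context:
        payload["context"] = context
    return payload

def is_running(timeout=1.0):
    # Returns True only if a local Pieces OS instance responds.
    try:
        with urllib.request.urlopen(health_url(), timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    question = build_question(
        "Explain what this function does",
        context=["def add(a, b): return a + b"],
    )
    print(json.dumps(question))
    print("Pieces OS running:", is_running())
```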

&lt;p&gt;Depending on your requirements, you can manage and use various cloud-based and &lt;a href="https://code.pieces.app/blog/local-large-language-models-lllms-and-copilot-integrations" rel="noopener noreferrer"&gt;local LLMs&lt;/a&gt;. Download them locally with the help of &lt;code&gt;modelsProgressController.tsx&lt;/code&gt; for offline use, or connect to cloud-based models if you prefer.&lt;/p&gt;

&lt;p&gt;If you want, you can enhance your copilot’s responses by adding context to your queries. Unlike many environments where you might have to retrain models with your data to get relevant answers, Pieces allows you to attach context to your questions, so you can get more accurate and tailored responses without the wait.&lt;/p&gt;

&lt;p&gt;You can check out our in-depth guide on &lt;a href="https://code.pieces.app/blog/build-your-own-open-source-copilot-with-pieces" rel="noopener noreferrer"&gt;building your own copilot AI here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Future of AI Copilots
&lt;/h2&gt;

&lt;p&gt;Industry leaders at &lt;a href="https://a16z.com/ai-copilots-and-the-future-of-knowledge-work/" rel="noopener noreferrer"&gt;Andreessen Horowitz say&lt;/a&gt; the era we're entering could be akin to a new Industrial Revolution for cognitive tasks, suggesting a significant boost in productivity for knowledge workers. AI copilots serve as early examples of how AI can make knowledge work more efficient and enjoyable, pointing towards a future where the collaboration between humans and AI leads to unprecedented levels of productivity and innovation.&lt;/p&gt;

&lt;p&gt;Pieces is at the forefront of this revolution, directly enhancing the efficiency and productivity of developers by integrating AI-powered solutions into their workflows. The ability to create personalized suggestions and automate parts of the coding process, grounded in the unique context of each developer's work, embodies the promise of AI copilots in transforming cognitive tasks.&lt;/p&gt;

&lt;p&gt;All of this technology was a pipe dream just a few years ago. But now it is a reality. If you want to see how, just &lt;a href="https://docs.pieces.app/installation-getting-started/what-am-i-installing" rel="noopener noreferrer"&gt;download Pieces and try it out for yourself.&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>beginners</category>
      <category>llm</category>
    </item>
  </channel>
</rss>
