Enhance your Vue.js application by integrating chat capabilities with Chart.js and LLMs such as OpenAI's models and DeepSeek-R1. This post walks through adding a chat node to the Micronaut-Optimizer workflow, enabling dynamic interactions with optimization results. Learn how to configure environment variables, connect workflow nodes, and send Chart.js data to LLMs. See it in action with sample inputs and responses, and explore running DeepSeek-R1 locally with Ollama.
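To give a feel for the kind of request such a chat node might issue, here is a minimal sketch that posts Chart.js dataset values to an OpenAI-compatible chat endpoint (Ollama exposes one at localhost:11434 by default). The model name, prompt wording, and chart payload are illustrative assumptions, not the post's actual code.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: send Chart.js-style data to an OpenAI-compatible chat endpoint.
// Endpoint, model name, and prompt are assumptions for illustration only.
public class ChartChatSketch {

    public static void main(String[] args) throws Exception {
        // Chart.js data serialized as a compact string; in a real workflow
        // this would come from the chart node's dataset.
        String chartData =
            "{\"labels\":[\"run1\",\"run2\"],\"datasets\":[{\"label\":\"tour length\",\"data\":[412,389]}]}";

        String body = """
            {
              "model": "deepseek-r1",
              "messages": [
                {"role": "user",
                 "content": "Summarize this optimization result chart: %s"}
              ]
            }
            """.formatted(chartData.replace("\"", "\\\""));

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:11434/v1/chat/completions"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.body());
    }
}
```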
JDK Mission Control (JMC) is a powerful tool for low-overhead Java application profiling and performance analysis. In this post, I explore JMC’s capabilities while optimizing inference performance for Micronaut-Llama3 with DeepSeek-R1. I walk through setup, profiling with Flight Recorder, and identifying bottlenecks using flame graphs. Key optimizations, such as refining ByteVector operations, significantly enhance performance. A comparison with VisualVM highlights JMC’s advantages, making it the go-to tool for in-depth Java profiling. If you’re looking to fine-tune your Java applications, JMC provides essential insights for optimization.
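For readers who prefer to drive Flight Recorder from code rather than JVM flags or the JMC UI, a recording can also be captured programmatically with the jdk.jfr API and then opened in JMC. The sketch below is a generic example; the event selection and the runInference() placeholder are assumptions, not the post's actual benchmark.

```java
import jdk.jfr.Recording;
import java.nio.file.Path;
import java.time.Duration;

// Sketch: capture a Flight Recorder recording around a workload so the
// resulting .jfr file can be analyzed in JDK Mission Control.
public class JfrProfilingSketch {

    public static void main(String[] args) throws Exception {
        try (Recording recording = new Recording()) {
            recording.enable("jdk.ExecutionSample")         // method sampling events
                     .withPeriod(Duration.ofMillis(10));
            recording.enable("jdk.ObjectAllocationSample"); // allocation sampling (JDK 16+)
            recording.start();

            runInference(); // placeholder for the code under test

            recording.stop();
            recording.dump(Path.of("inference.jfr")); // open this file in JMC
        }
    }

    private static void runInference() {
        // placeholder workload
        for (int i = 0; i < 1_000_000; i++) {
            Math.sqrt(i);
        }
    }
}
```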
In this post, I’ll walk you through the development of a Vue.js frontend application designed to complement my previous work on a flexible optimizer framework built with Micronaut. This frontend provides a visual interface for designing, managing, and optimizing workflows, with a focus on solving combinatorial optimization problems like the Traveling Salesman Problem (TSP). The app features a drag-and-drop UI, enabling users to define optimization problems graphically without relying on tools like Postman. By connecting inputs, transformations, and outputs through Workflow Nodes, users can visualize and compare the performance of backend optimization algorithms across different datasets.
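As a rough mental model of what such a graph might look like once it leaves the drag-and-drop canvas, the sketch below shows one hypothetical way to represent nodes and connections for a small TSP workflow. The type names, fields, and node kinds are illustrative assumptions, not the project's actual schema.

```java
import java.util.List;

// Hypothetical model of a workflow as it might be serialized for the
// Micronaut-Optimizer backend: input, transformation, optimizer, and
// output nodes connected by edges. Names and fields are illustrative.
public class WorkflowSketch {

    enum NodeType { INPUT, TRANSFORM, OPTIMIZER, OUTPUT }

    record Node(String id, NodeType type, String config) {}

    record Edge(String fromNodeId, String toNodeId) {}

    record Workflow(List<Node> nodes, List<Edge> edges) {}

    public static void main(String[] args) {
        // A tiny TSP workflow: dataset -> distance matrix -> solver -> chart
        Workflow tsp = new Workflow(
            List.of(
                new Node("cities", NodeType.INPUT, "cities-50.csv"),
                new Node("distances", NodeType.TRANSFORM, "euclidean"),
                new Node("solver", NodeType.OPTIMIZER, "simulated-annealing"),
                new Node("chart", NodeType.OUTPUT, "chartjs-line")
            ),
            List.of(
                new Edge("cities", "distances"),
                new Edge("distances", "solver"),
                new Edge("solver", "chart")
            )
        );
        System.out.println(tsp);
    }
}
```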